
    Any SU render engines that render distorted textures?

    • tomasz

      @chris_at_twilight said:

      As for detecting when textures are distorted... I agree that you will very often see 'z' or 'q' (UVQ) values that are not 1.0, but I think you can get false negatives that way. Meaning, you can have distorted textures where the Q is nonetheless exactly 1.0. (This is speculation only.)

      I have never had a false negative when checking against 1.0 with enough precision. The posted TheaExporter sample uses the TextureWriter and UVHelper; they work in tandem. The texture writer will create an additional ('virtual', unique) texture only when necessary. Sometimes the TW can create just a single 'virtual' unique texture for two faces that are parallel to each other and share the same distorted texture.
      I have spent many hours understanding how the TW works. There is no point in writing every single face to the TW, because it takes a lot of time. I first check whether a texture is distorted (Q != 1.0):

      if ((uvq.z.to_f)*100000).round != 100000
      

      If it is, the face goes to the TW, which usually leads to the creation of a unique texture. If it doesn't, one has to read the 'handle' returned by the TW; it points to the position of a texture already in the TW 'stack'.
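
      For example, here is a minimal sketch of that detect-and-load flow (not the actual TheaExporter code; the iteration and variable names are just illustrative):

      model = Sketchup.active_model
      tw    = Sketchup.create_texture_writer

      model.active_entities.grep(Sketchup::Face).each do |face|
        next unless face.material && face.material.texture
        uvh = face.get_UVHelper(true, false, tw)
        uvq = uvh.get_front_UVQ(face.vertices.first.position)
        if ((uvq.z.to_f) * 100000).round != 100000
          # distorted: let the TW build a unique, pre-distorted texture;
          # the handle identifies its entry in the TW 'stack'
          handle = tw.load(face, true)
          puts "face #{face.entityID}: distorted, TW handle #{handle}"
        else
          # undistorted: the material's original texture file can be reused
          puts "face #{face.entityID}: plain texture #{face.material.texture.filename}"
        end
      end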

      It would be interesting to learn how the SU developers use the additional Q coordinate to add perspective distortion to a texture. I have done a search and haven't found any other software using this type of projection. I hope the Google staff will share the trick with the authors of rendering engines, so that the use of the TW, which creates additional textures == materials in a renderer, won't be necessary.

      I have also implemented a method without UVHelper, using uv=[uvq.x/uvq.z, uvq.y/uvq.z], but the key is actually to learn how SU applies the projection. As far as I am aware, no rendering software uses UVQ.
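
      That alternative is just the usual perspective divide; as a hedged sketch (uvq being the Geom::Point3d returned by get_front_UVQ, with Q stored in its z slot):

      # convert a UVQ point to a plain UV pair by dividing through by Q
      def uvq_to_uv(uvq)
        q = uvq.z.to_f
        [uvq.x / q, uvq.y / q]
      end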

      @frederik said:

      Tomasz - are you sure that the distorted texture isn't being exported as a separate *(unique)* texture...?? 😕

      The method uses the TW, which creates 'a second texture' == an additional 'Thea material'. I have not found a model so far that gives me a wrong projection. Unfortunately, the method forces the creation of multiple Thea materials.

      EDIT: I modified Thomas' test model. I have added two additional things - a group with a face painted with the default material, and a second face with a distorted texture which will have the same handle in the TextureWriter as the one right below it.


      UVMappingTest+.skp

      Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

      • thomthom

        @unknownuser said:

        why are these working?

        Maybe your curved surface is subdivided into so many smaller faces that the deviation isn't visually noticeable?

        Thomas Thomassen β€” SketchUp Monkey & Coding addict
        List of my plugins and link to the CookieWare fund

        • thomthom

          Tomasz: When you discover a distorted texture (from the Q value) and use the TW to write out a unique material:

          • What UV values do you send to your renderer?
          • If the render material has a bumpmap applied to it - how is that handled? Won't you end up with a mismatching bump?

          Thomas Thomassen β€” SketchUp Monkey & Coding addict
          List of my plugins and link to the CookieWare fund

          • Frederik

            @unknownuser said:

            The method uses the TW, which creates 'a second texture' == an additional 'Thea material'.

            Yeah - that's what I thought...
            This is - again - how everyone else is able to handle it...
            However, I believe thomthom is trying to find out if this can be handled differently...

            @thomthom said:

            Yea, Make Unique generates a new texture and gives the face new UV co-ordinates.

            What I'm trying to find out, though, is whether there are any renderers that actually handle this. More importantly, I'm interested in how they handle it.

            IMHO, I hardly ever see distorted textures used, so it has never been a great issue - certainly not a dealbreaker...

            Cheers
            Kim Frederik

            • tomasz

              @thomthom said:

              • What UV values do you send to your renderer?

              When all textures have been loaded into the TW, I use:

              uvHelper = face.get_UVHelper(true, false, tw)
              point_pos = mesh.point_at(p).transform!(trans.inverse) # trans = 'nested transformation' of the parent
              # thanks to Stefan Jaensch for figuring out that this trans has to be inverted and applied here
              uv = uvHelper.get_front_UVQ(point_pos)
              

              @thomthom said:

              • If the render material has a bumpmap applied to it - how is that handled? Won't you end up with a mismatching bump?

              I use the same UVs. I assume that the bumpmap is the same size as the main texture.

              Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

              • thomthom

                @unknownuser said:

                @thomthom said:

                • What UV values do you send to your renderer?

                When all textures have been loaded into the TW, I use:

                uvHelper = face.get_UVHelper(true, false, tw)
                point_pos = mesh.point_at(p).transform!(trans.inverse) # trans = 'nested transformation' of the parent
                # thanks to Stefan Jaensch for figuring out that this trans has to be inverted and applied here
                uv = uvHelper.get_front_UVQ(point_pos)
                

                @thomthom said:

                • If the render material has a bumpmap applied to it - how is that handled? Won't you end up with a mismatching bump?

                I use the same UVs. I assume that the bumpmap is the same size as the main texture.

                Where does the p variable come from?

                Thomas Thomassen β€” SketchUp Monkey & Coding addict
                List of my plugins and link to the CookieWare fund

                • tomasz

                  @thomthom said:

                  Where does the p variable come from?

                  It is an index (starting from 1) of a point in a PolygonMesh. You get the PolygonMesh from face.mesh.
                  You can probably get the point position in another way.
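
                  For illustration, a hedged sketch (not the exporter code) of walking a face's mesh by that index; the flag passed to face.mesh asks for the UVQs to be included:

                  face = Sketchup.active_model.selection.grep(Sketchup::Face).first
                  mesh = face.mesh(1 | 2)  # 1 = front UVQs, 2 = back UVQs

                  (1..mesh.count_points).each do |p|   # PolygonMesh indices start at 1
                    pt  = mesh.point_at(p)             # Geom::Point3d of the mesh point
                    uvq = mesh.uv_at(p, true)          # front UVQ stored alongside it
                    puts "#{p}: #{pt} -> #{uvq}"
                  end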

                  Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                  • thomthom

                    Also, why don't you use uv = mesh.uv_at(p, true)? It appears it would give the same result.

                    Thomas Thomassen β€” SketchUp Monkey & Coding addict
                    List of my plugins and link to the CookieWare fund

                    • thomthom

                      If you use my Probes plugin: http://forums.sketchucation.com/viewtopic.php?f=323&t=21472#p180592

                      Press Tab to see raw UVQ data
                      By default it will use UVHelper to get the UV data
                      Press F2 to make it sample the data from the PolygonMesh

                      From my testing, the data never differs.

                      UVHelper seems to be made for sampling UV data from a Face.
                      But if you have a PolygonMesh of the Face, it includes the UV data (provided you asked for it when you called face.mesh).
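
                      A hedged sketch of what that comparison amounts to (with a blank TextureWriter, both paths should report the same values):

                      face = Sketchup.active_model.selection.grep(Sketchup::Face).first
                      tw   = Sketchup.create_texture_writer          # blank TW
                      uvh  = face.get_UVHelper(true, false, tw)
                      mesh = face.mesh(1 | 2)                        # include front and back UVQs

                      (1..mesh.count_points).each do |p|
                        pt = mesh.point_at(p)
                        puts "helper: #{uvh.get_front_UVQ(pt)}  mesh: #{mesh.uv_at(p, true)}"
                      end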

                      Thomas Thomassen β€” SketchUp Monkey & Coding addict
                      List of my plugins and link to the CookieWare fund

                      • tomasz

                        @thomthom said:

                        Also, why don't you use uv = mesh.uv_at(p, true)? It appears it would give the same result.

                        I simply use UVHelper because it seems to be designed for this task. What would be the purpose of it otherwise? 👊
                        I am not so sure it would give the same result for two distorted faces sharing the same distorted texture (the last pair in the modified .skp test scene).

                        BTW, I have found that there is some memory leak when using UVHelper; at least those objects are not being collected properly (garbage collector). In my exporter I stay away from UVHelper as long as I can; in all other scenarios I read the UV coordinates using uv_at.

                        Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                        • tomasz

                          @thomthom said:

                          If you use my Probes plugin: http://forums.sketchucation.com/viewtopic.php?f=323&t=21472#p180592

                          Press Tab to see raw UVQ data
                          By default it will use UVHelper to get the UV data
                          Press F2 to make it sample the data from the PolygonMesh

                          From my testing, the data never differs.

                          UVHelper seems to be made for sampling UV data from a Face.
                          But if you have a PolygonMesh of the Face, it includes the UV data (provided you asked for it when you called face.mesh).

                          OK, try this: load a face with a distorted texture into the TW, then you will see the difference if you use the same TW to get UVs from the UVHelper.
                          What you have written is true as long as you use a blank TW. That gives coordinates relative to the original 'picture', but when you load a face into the TW it 'makes it unique inside the TW', and therefore the UVs change.
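
                          A hedged illustration of that difference, assuming a selected face with a distorted texture:

                          face = Sketchup.active_model.selection.grep(Sketchup::Face).first

                          blank_tw  = Sketchup.create_texture_writer
                          loaded_tw = Sketchup.create_texture_writer
                          loaded_tw.load(face, true)   # 'makes the face unique' inside this TW

                          pt = face.vertices.first.position
                          puts "blank TW:  #{face.get_UVHelper(true, false, blank_tw).get_front_UVQ(pt)}"
                          puts "loaded TW: #{face.get_UVHelper(true, false, loaded_tw).get_front_UVQ(pt)}"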

                          It looks like this thread would better fit in the new developers section πŸ˜„

                          Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                          • Chris_at_Twilight

                            @unknownuser said:

                            I have never had a false negative when checking against 1.0 with enough precision.

                            Really? That's good to know. That's a much faster test.

                            As far as why some projected textures aren't distorted, I think Al and thomthom are both right, given the respective circumstances. It could be that the texture is not truly 'distorted', as Al suggests; it could be that the level of subdivision is high enough that you just don't see the distortion, as thomthom said.

                            And Tomasz hit it right on the head as far as using the TW and UVHelper: you have to make sure you are using a TW 'loaded' with the face in question before using it in the UVHelper.

                            http://www.TwilightRender.com

                            • thomthom

                              @unknownuser said:

                              if you use the same TW to get UVs from the UVHelper.

                              Use a TextureWriter to get UV data? ❓

                              Thomas Thomassen β€” SketchUp Monkey & Coding addict
                              List of my plugins and link to the CookieWare fund

                              • tomasz

                                @thomthom said:

                                @unknownuser said:

                                if you use the same TW to get UVs from the UVHelper.

                                Use a TextureWriter to get UV data? ❓

                                It is the required parameter: uvHelper=face.get_UVHelper(true, false, **tw**)

                                Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                                • thomthom

                                  Aaah!!! That's why it asks for a TW. Duh!
                                  That fills a big hole in the API docs.

                                  Thanks Tomasz!

                                  Thomas Thomassen β€” SketchUp Monkey & Coding addict
                                  List of my plugins and link to the CookieWare fund

                                  • thomthom

                                     Here's a simple test I did (it doesn't take nested groups etc. into account), but it's a rough Ruby version of Make Unique.
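
                                     In outline (a hedged sketch, not the attached uv_make_unique.rb; the output path and material name are made up), the idea is: write the pre-distorted bitmap out via the TextureWriter, build a new material from it, and re-position that material on the face with the UVs the loaded TW reports:

                                     require 'tmpdir'

                                     def make_unique_sketch(face)
                                       model = face.model
                                       tw = Sketchup.create_texture_writer
                                       tw.load(face, true)

                                       # write the pre-distorted bitmap the TW generates for this face
                                       path = File.join(Dir.tmpdir, "unique_#{face.entityID}.png")
                                       tw.write(face, true, path)

                                       # build a new material from that bitmap
                                       mat = model.materials.add("Unique #{face.entityID}")
                                       mat.texture = path

                                       # a UVHelper built with the *loaded* TW returns UVs relative to the unique texture
                                       uvh = face.get_UVHelper(true, false, tw)
                                       mapping = []
                                       face.outer_loop.vertices.first(4).each do |v|
                                         uvq = uvh.get_front_UVQ(v.position)
                                         mapping << v.position << Geom::Point3d.new(uvq.x / uvq.z, uvq.y / uvq.z, 1)
                                       end
                                       face.position_material(mat, mapping, true)
                                     end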


                                    uv_make_unique.rb


                                    distort test.skp

                                    Thomas Thomassen β€” SketchUp Monkey & Coding addict
                                    List of my plugins and link to the CookieWare fund

                                    • AdamB

                                      Just a quick word as to what you're seeing.

                                      The texture coordinates in SU are mostly 2D, but there is support for projected textures, in which the 3rd element of the texture coordinate is a projection; i.e. by dividing through by the 3rd element you get a 2D texture coordinate.

                                      You can get an OK approximation by doing the projection (the divide) at the vertices and then interpolating across the polygon in 2D (this is what LightUp does), or you can interpolate the 3-element texture coordinate and do the divide at each pixel, which is slower but gives you the right answer. I'm not sure there are many renderers that do that.
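
                                      As a hedged sketch of the difference (plain Ruby, a single edge parameterised by t in 0..1, UVQs given as [u, v, q] arrays):

                                      def lerp(a, b, t); a + (b - a) * t; end

                                      # divide at the vertices first, then interpolate in 2D (the approximation)
                                      def uv_vertex_divide(uvq_a, uvq_b, t)
                                        ua = [uvq_a[0] / uvq_a[2], uvq_a[1] / uvq_a[2]]
                                        ub = [uvq_b[0] / uvq_b[2], uvq_b[1] / uvq_b[2]]
                                        [lerp(ua[0], ub[0], t), lerp(ua[1], ub[1], t)]
                                      end

                                      # interpolate the full 3-element coordinate, then divide (perspective-correct)
                                      def uv_pixel_divide(uvq_a, uvq_b, t)
                                        u = lerp(uvq_a[0], uvq_b[0], t)
                                        v = lerp(uvq_a[1], uvq_b[1], t)
                                        q = lerp(uvq_a[2], uvq_b[2], t)
                                        [u / q, v / q]
                                      end

                                      # the two only agree when Q is constant across the edge (no distortion)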

                                      Adam

                                      Developer of LightUp Click for website

                                      • AdamB

                                        Oh, and the corollary of this is that if you subdivide the faces into smaller pieces, you'll get a progressively more accurate rendering. So try cutting the distorted quad into 16 quads.

                                        Adam

                                        Developer of LightUp Click for website

                                        • tomasz

                                          @adamb said:

                                          You can get an OK approximation by doing the projection (the divide) at the vertices and then interpolating across the polygon in 2D (this is what LightUp does), or you can interpolate the 3-element texture coordinate and do the divide at each pixel, which is slower but gives you the right answer. I'm not sure there are many renderers that do that.

                                          SU uses OpenGL. Does that mean that SU, for its own 'rendering', does the same interpolation and sends the 'unique' texture to OpenGL?

                                          Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                                          • Al Hart

                                            @unknownuser said:

                                            SU uses OpenGL. Does that mean that SU, for its own 'rendering', does the same interpolation and sends the 'unique' texture to OpenGL?

                                            I suspect so - it is a lot faster to create the distorted image as a bitmap and then send it to OpenGL, rather than starting to subdivide faces.

                                            Also, that would explain why SketchUp makes the pre-distorted image available to 3DS output and to us.

                                            Al Hart

                                            IRender nXt from Render Plus
