    Any SU render engines that render distorted textures?

    • tomasz

      @thomthom said:

      where does the p variable come from?

      It is an index (starting from 1) of a point in a PolygonMesh. You get the PolygonMesh from face.mesh.
      You can probably get the point position another way.
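
      A minimal sketch of what that looks like (assuming `face` is a Sketchup::Face with a front texture; the flag value and variable names are just illustrative):

      ```ruby
      # 7 = UVQFront | UVQBack | Normals; the 3D points are always included.
      mesh = face.mesh(7)
      (1..mesh.count_points).each do |p|   # PolygonMesh indices start at 1
        pos = mesh.point_at(p)             # Geom::Point3d in model space
        uvq = mesh.uv_at(p, true)          # front-side UV; the z component holds q
        puts "#{p}: #{pos} -> #{uvq}"
      end
      ```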

      Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

      • thomthom

        Also, why don't you use uv = mesh.uv_at(p, true)? Appears it would give the same result.

        Thomas Thomassen — SketchUp Monkey & Coding addict
        List of my plugins and link to the CookieWare fund

        • thomthom

          If you use my Probes plugin: http://forums.sketchucation.com/viewtopic.php?f=323&t=21472#p180592

          Press Tab to see raw UVQ data
          By default it will use UVHelper to get the UV data
          Press F2 to make it sample the data from the PolygonMesh

          From my testing, the data never differs.

          UVHelper seems to be made to sample UV data from a Face.
          But if you have a PolygonMesh of the Face, it includes the UV data (provided you asked for it when you called face.mesh).
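
          A small comparison sketch along those lines (assuming `face` has a front texture and nothing has been loaded into the TextureWriter yet; all names are illustrative):

          ```ruby
          tw   = Sketchup.create_texture_writer
          uvh  = face.get_UVHelper(true, false, tw)   # front side only
          mesh = face.mesh(1)                         # 1 = include front UVQ data

          (1..mesh.count_points).each do |p|
            pt        = mesh.point_at(p)
            from_mesh = mesh.uv_at(p, true)           # UV from the PolygonMesh
            from_uvh  = uvh.get_front_UVQ(pt)         # UV from the UVHelper
            puts "#{p}: mesh=#{from_mesh}  uvhelper=#{from_uvh}"
          end
          ```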

          Thomas Thomassen — SketchUp Monkey & Coding addict
          List of my plugins and link to the CookieWare fund

          • tomasz

            @thomthom said:

            Also, why don't you use uv = mesh.uv_at(p, true)? Appears it would give the same result.

            I simply use UVHelper because it seems to be designed for this task. What would be the purpose of it otherwise? 👊
            I am not so sure it would give the same result for two distorted faces sharing the same distorted texture (the last pair in the modified .skp test scene).

            BTW, I have found that there is some memory leak when using UVHelper; at least those objects are not being garbage-collected properly. In my exporter I stay away from UVHelper as long as I can, and read UV coordinates in all other scenarios using uv_at.

            Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

            • tomasz

              @thomthom said:

              If you use my Probes plugin: http://forums.sketchucation.com/viewtopic.php?f=323&t=21472#p180592

              Press Tab to see raw UVQ data
              By default it will use UVHelper to get the UV data
              Press F2 to make it sample the data from the PolygonMesh

              From my testing, the data never differs.

              UVHelper seems to be made to sample UV data from a Face.
              But if you have a PolygonMesh of the Face, it includes the UV data (provided you asked for it when you called face.mesh).

              OK, try this: just load a face with a distorted texture into the TW, then you will see the difference if you use the same TW to get UVs from UVHelper.
              What you have written is true as long as you use a blank TW. It gives coordinates in relation to the original 'picture', but when you load a face into the TW it 'makes it unique inside the TW', therefore the UVs change.
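
              A hedged illustration of that difference (assuming `face` carries a distorted texture; variable names are made up):

              ```ruby
              pt = face.outer_loop.vertices.first.position

              # Blank TW: UVQs relate to the original, undistorted image.
              tw_blank = Sketchup.create_texture_writer
              uv_blank = face.get_UVHelper(true, false, tw_blank).get_front_UVQ(pt)

              # TW with the face loaded: the texture is made unique inside the TW,
              # so the same UVHelper call returns different UVQs.
              tw_loaded = Sketchup.create_texture_writer
              tw_loaded.load(face, true)
              uv_loaded = face.get_UVHelper(true, false, tw_loaded).get_front_UVQ(pt)

              puts "blank TW:  #{uv_blank}"
              puts "loaded TW: #{uv_loaded}"
              ```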

              It looks like this thread would better fit in the new developers section 😄

              Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

              • Chris_at_Twilight

                @unknownuser said:

                I have never received false negatives by checking the 1.0 with enough precision.

                Really? That's good to know. That's a much faster test.

                As far as why some projected textures aren't distorted, I think Al and thomthom are both right, given the respective circumstances. It could be that the texture is not truly 'distorted', as Al suggests; or it could be that the level of subdivision is high enough that you just don't see the distortion, as thomthom said.

                And Tomasz hit it right on the head as far as using the TW and UVHelper goes; you have to make sure you are using a TW 'loaded' with the face in question before using it in the UVHelper.

                http://www.TwilightRender.com

                • thomthom

                  @unknownuser said:

                  if you use the same TW to get UVs from UVHelper.

                  Use a TextureWriter to get UV data? ❓

                  Thomas Thomassen — SketchUp Monkey & Coding addict
                  List of my plugins and link to the CookieWare fund

                  • tomasz

                    @thomthom said:

                    @unknownuser said:

                    if you use the same TW to get UVs from UVHelper.

                    Use a TextureWriter to get UV data? ❓

                    It is the required parameter: uvHelper=face.get_UVHelper(true, false, **tw**)

                    Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                    • thomthom

                      Aaah!!! That's why it asks for a TW. Duh!
                      That fills a big hole in the API docs.

                      Thanks Tomasz!

                      Thomas Thomassen — SketchUp Monkey & Coding addict
                      List of my plugins and link to the CookieWare fund

                      • thomthom

                        Here's a simple test I did (it doesn't take nested groups etc. into account),
                        but it's a rough Ruby version of Make Unique.


                        uv_make_unique.rb


                        distort test.skp
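
                        The attached script isn't reproduced in the thread, but a rough sketch of what such a 'make unique' routine could look like is below (assuming a selected face with a distorted front texture; file names, material names and the corner mapping are purely illustrative and, like the original, it ignores nested groups):

                        ```ruby
                        require 'tmpdir'

                        def uv_make_unique(face)
                          model = Sketchup.active_model
                          tw = Sketchup.create_texture_writer
                          tw.load(face, true)                        # TW bakes the distortion into a unique image
                          img = File.join(Dir.tmpdir, "unique_#{face.entityID}.png")
                          tw.write(face, true, img)                  # write the pre-distorted bitmap

                          mat = model.materials.add("Unique_#{face.entityID}")
                          mat.texture = img

                          # Re-map the baked image with plain 0..1 UVs; assumes a simple quad.
                          pts = face.outer_loop.vertices.first(4).map(&:position)
                          uvs = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]].map { |a| Geom::Point3d.new(*a) }
                          face.position_material(mat, pts.zip(uvs).flatten, true)
                        end

                        uv_make_unique(Sketchup.active_model.selection.grep(Sketchup::Face).first)
                        ```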

                        Thomas Thomassen — SketchUp Monkey & Coding addict
                        List of my plugins and link to the CookieWare fund

                        • AdamB

                          Just a quick word as to what you're seeing.

                          The texture coordinates in SU are mostly 2D, but there is support for projected textures, in which the 3rd element of the texture coordinate is a projection; i.e. by dividing through by the 3rd element you get a 2D texture coordinate.

                          You can get an OK approximation by doing the projection at the vertices and then interpolating across the polygon in 2D (this is what LightUp does), or you can interpolate the 3-element texture coordinate and do the divide at each pixel, which is slower but gives the right answer. Not sure there are many renderers that do that.
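
                          In code form the two options look roughly like this (a sketch only; `uvq`, `a` and `b` are assumed to be Geom::Point3d UVQ triples):

                          ```ruby
                          # Per-vertex approximation: divide by q at each vertex and let the
                          # 2D result be interpolated linearly across the polygon.
                          def uvq_to_uv(uvq)
                            [uvq.x / uvq.z, uvq.y / uvq.z]   # q lives in the 3rd component
                          end

                          # Exact version: interpolate the full (u, v, q) triple across the
                          # polygon and divide at each sample (t is a blend factor between
                          # two vertices a and b).
                          def exact_uv(a, b, t)
                            u = a.x + (b.x - a.x) * t
                            v = a.y + (b.y - a.y) * t
                            q = a.z + (b.z - a.z) * t
                            [u / q, v / q]
                          end
                          ```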

                          Adam

                          Developer of LightUp Click for website

                          • AdamB

                            Oh, and the corollary of this is that if you subdivide the faces into smaller pieces, you'll get a progressively more accurate rendering. So try cutting the distorted quad into 16 quads.

                            Adam

                            Developer of LightUp Click for website

                            • tomasz

                              @adamb said:

                              You can get an OK approximation by doing the projection at the vertices and then interpolating across the polygon in 2D (this is what LightUp does), or you can interpolate the 3-element texture coordinate and do the divide at each pixel, which is slower but gives the right answer. Not sure there are many renderers that do that.

                              SU is using OpenGL. Does that mean that SU, for its own 'rendering', does the same interpolation and sends the 'unique' texture to OpenGL?

                              Author of [Thea Render for SketchUp](http://www.thearender.com/sketchup)

                              • Al Hart

                                @unknownuser said:

                                SU is using OpenGL. Does that mean that SU, for its own 'rendering', does the same interpolation and sends the 'unique' texture to OpenGL?

                                I suspect so - it is a lot faster to create the distorted image as a bitmap, and then send it to OpenGL - rather than starting to sub-divide faces.

                                Also, that would explain why SketchUp makes the pre-distorted image available to 3DS output and to us.

                                Al Hart

                                IRender nXt from Render Plus

                                • AdamB

                                  @unknownuser said:

                                  @adamb said:

                                  You can get an OK approximation by doing the projection at the vertices and then interpolating across the polygon in 2D (this is what LightUp does), or you can interpolate the 3-element texture coordinate and do the divide at each pixel, which is slower but gives the right answer. Not sure there are many renderers that do that.

                                  SU is using OpenGL. Does that mean that SU, for its own 'rendering', does the same interpolation and sends the 'unique' texture to OpenGL?

                                  No, it means SU sends 3-component texture coordinates to OpenGL and allows OpenGL to interpolate and divide at each pixel.

                                  But as Al says, given you can 'bake in' your projection to a unique texture once you're happy, it's not a big deal.

                                  Adam

                                  Developer of LightUp Click for website

                                  • Chris_at_Twilight

                                    @adamb said:

                                    No, it means SU sends 3-component texture coordinates to OpenGL and allows OpenGL to interpolate and divide at each pixel.

                                    Just as Adam says, OpenGL provides methods not just for 2 coordinates (UV), but for 3 and even 4: glTexCoord3f, glTexCoord4f.

                                    http://www.TwilightRender.com

                                    • thomthom

                                      I'm curious: for those render engines that render distorted textures, what's the result if you add a bump map like the one attached? The bump is exaggerated so it's clearly visible.


                                      grid_bump.png

                                      Thomas Thomassen — SketchUp Monkey & Coding addict
                                      List of my plugins and link to the CookieWare fund

                                      • Al Hart

                                        @thomthom said:

                                        I'm curious: for those render engines that render distorted textures, what's the result if you add a bump map like the one attached? The bump is exaggerated so it's clearly visible.

                                        "True" bump maps, where you assign the bump as a second texture, do not work, because SketchUp gives you a pre-distorted texture which then works with UVs of (0,0) and (1,1) for the main texture. But SketchUp does not give you information on how to distort the second texture.

                                        Here the carpet texture is distorted properly, but the bump map is not distorted.
                                        distorted.jpg

                                        "Auto-bump", where a single texture is used for both the color and the bump map, does work, because the bump map is distorted as well.

                                        Here the bump map effect matches the distortion of the image in SketchUp.
                                        (The distorted SketchUp image was used as the bump map.)
                                        auto-distorted.jpg

                                        Distorted SketchUp Material
                                        distorted sketchup image.jpg

                                        This could be solved for multiple texture bump maps - but you would have to let SketchUp distort the main texture and then let SketchUp distort the bump map texture as well.

                                        Al Hart

                                        IRender nXt from Render Plus

                                        • thomthom

                                          @al hart said:

                                          "True" bump maps, where you assign the bump as a second texture, do not work, because SketchUp gives you a pre-distorted texture which then works with UVs of (0,0) and (1,1) for the main texture. But SketchUp does not give you information on how to distort the second texture.

                                          That's what I suspected. Which is a problem.

                                          Thomas Thomassen — SketchUp Monkey & Coding addict
                                          List of my plugins and link to the CookieWare fund

                                          • Al Hart

                                            @thomthom said:

                                            That's what I suspected. Which is a problem.

                                            If you are an IRender nXt user, we can probably write a routine to distort the bump maps for you.

                                            If not, you can still do it in Ruby: write a script that uses the texture writer to replace the diffuse SketchUp texture with the bump map texture, then use the texture writer to save the distorted bump map, then restore the original texture. Ditto for any specular texture.
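
                                            A rough sketch of that swap-and-bake idea (assuming `face` has a distorted diffuse texture and `bump_path` points at the bump image; all names are illustrative, and edge cases such as differing image sizes are ignored):

                                            ```ruby
                                            require 'tmpdir'

                                            def write_distorted_bump(face, bump_path)
                                              material = face.material
                                              original = material.texture.filename   # remember the diffuse image

                                              material.texture = bump_path           # temporarily swap in the bump map
                                              tw = Sketchup.create_texture_writer
                                              tw.load(face, true)
                                              out = File.join(Dir.tmpdir, "distorted_bump.png")
                                              tw.write(face, true, out)              # TW bakes the bump with the same distortion

                                              material.texture = original            # restore the diffuse texture
                                              out
                                            end
                                            ```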

                                            Al Hart

                                            IRender nXt from Render Plus
