Any SU render engines that render distorted textures?
-
@thomthom said:
So question is: how can SU's data be converted into a format that most renderers and external programs use?
Given that all of SU's built-in export methods use the "pre-distorted", unique-texture method, my guess is that it can't be done (unless the renderer itself supports the same interpolation technique).
-
That's my fear.
Would be interesting to know what texture technique Google uses. Might try to give some of the Googleheads on this forum a prod.
-
http://en.wikipedia.org/wiki/Barycentric_coordinates_%28mathematics%29
Maybe this will spark some thoughts. Barycentric coordinates on a triangle are a very common application. But check out barycentric coordinates on a tetrahedron (barycentric in 3D). Maybe that's how SU uses it?
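For anyone curious, here's a minimal sketch of what barycentric interpolation of UVs over a triangle looks like (plain Ruby with made-up coordinates; it only illustrates the triangle case and is not a claim about what SU actually does internally):

```ruby
# Plain-Ruby sketch of barycentric UV interpolation over a triangle.
# p1..p3 are [x, y] corners, uv1..uv3 their texture coordinates.
def barycentric_uv(p, p1, p2, p3, uv1, uv2, uv3)
  # Signed-area weights (the 0.5 factors cancel out).
  area = (p2[0] - p1[0]) * (p3[1] - p1[1]) - (p3[0] - p1[0]) * (p2[1] - p1[1])
  w1 = ((p2[0] - p[0]) * (p3[1] - p[1]) - (p3[0] - p[0]) * (p2[1] - p[1])) / area
  w2 = ((p3[0] - p[0]) * (p1[1] - p[1]) - (p1[0] - p[0]) * (p3[1] - p[1])) / area
  w3 = 1.0 - w1 - w2
  # The UV at p is simply the weighted sum of the corner UVs.
  [w1 * uv1[0] + w2 * uv2[0] + w3 * uv3[0],
   w1 * uv1[1] + w2 * uv2[1] + w3 * uv3[1]]
end

# The centroid of the triangle gets the average of the three corner UVs.
puts barycentric_uv([1.0, 1.0], [0.0, 0.0], [3.0, 0.0], [0.0, 3.0],
                    [0.0, 0.0], [1.0, 0.0], [0.0, 1.0]).inspect
# => [0.333..., 0.333...]
```

The catch is that this is an affine scheme: three corner UVs can't encode the perspective-style stretching of a pinned/distorted texture, which seems to be why the discussion below keeps coming back to four anchor points and UVQ coordinates.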
-
@chris_at_twilight said:
Huh. The .3ds format only supports triangles (I think), so there is a way to represent the texture interpolation used by SU using only 3 UV coordinates (and the .3ds exporter knows it!)
Not sure if this will trigger any thoughts with you guys (and really, I don't understand much of what you're talking about), but...
If everything is triangles in SU prior to messing with the UV mapping, then it will render correctly.
-
IRender nXt renders it straight in SketchUp - you'll have to ask Al how.
-
@Jeff: Maybe because everything is already triangles, the UV coordinates that SU generates are 'compatible' with the traditional triangular UV techniques employed by other applications. So what you see in SU, is what you get in the render. Of course, it could just be how UVTools does the sphere mapping that makes it more compatible?
@richcat said:
IRender nXt renders it straight in SketchUp - you'll have to ask Al how.
You are more than welcome to ask. I wouldn't be surprised if they use the same "pre-distorted" image technique that the SU exporters use.
-
Yup, can confirm - IRender nXt renders it correctly.
-
I tried the latest version of IRender nXt, but had a lot of problems - slow interface response, everything sometimes going white when adjusting textures, SketchUp crashing when I tried to load a plant from the library, etc.
-
The distorted texture is a nice feature of SketchUp - the ability to take a photo of a house, for instance, from an angle, and then distort it onto the actual house so that it looks correct. SketchUp provides tools to get these distorted textures and UV coordinates out properly, but it did take a lot of time and effort to figure it out and get it to work.
SketchUp made this distortion even easier to use (without knowing that you are doing it) with their Photo Match techniques.
Here is a 3D Warehouse model where the house image was placed using Photo Match. Some renderers process it correctly and some do not recognize or process the distorted texture.
-
It looks OK, right?
So what's IRender doing so differently than the other renderers? Or does it export a bunch of textures prior to rendering?
-
@unknownuser said:
So what's IRender doing so differently than the other renderers? Or does it export a bunch of textures prior to rendering?
My guess is that it does export pre-distorted textures (like the KMZ export does). So for Thomas' scene you will get two different textures when rendered. Sorry, I have no IRender, so I cannot test. I wonder if it saves temp textures to disk before rendering...
-
I didn't see an actual answer in Al's post. Maybe I just didn't understand it?
-
@chris_at_twilight said:
@Jeff: Maybe because everything is already triangles, the UV coordinates that SU generates are 'compatible' with the traditional triangular UV techniques employed by other applications. So what you see in SU, is what you get in the render. Of course, it could just be how UVTools does the sphere mapping that makes it more compatible?
@richcat said:
IRender nXt renders it straight in SketchUp - you'll have to ask Al how.
You are more than welcome to ask. I wouldn't be surprised if they use the same "pre-distorted" image technique that the SU exporters use.
VfSU, for instance (I snooped in the .rb files), uses the PolygonMesh of each face to sample the UV data. And the PolygonMesh consists of only triangular polygons - but it does not render correctly. Even if you make a triangle and apply a distorted texture, it'll render incorrectly.
So it doesn't seem to be the case. Though, it is weird what Jeff describes - seems to do the trick for him in Indigo. ...maybe Whaat knows something on this...
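For reference, sampling UVs from a face's PolygonMesh looks roughly like this (a minimal sketch of the general approach an exporter might take - not VfSU's actual code):

```ruby
# Minimal sketch: read per-triangle UVQs from a face's PolygonMesh.
# Flag 1 = include front-face UVQ data in the mesh.
face = Sketchup.active_model.selection.grep(Sketchup::Face).first
mesh = face.mesh(1)

mesh.polygons.each do |poly|
  poly.each do |index|
    i   = index.abs               # negative indices just flag hidden edges
    pt  = mesh.point_at(i)        # 3D position of the vertex
    uvq = mesh.uv_at(i, true)     # front-side UVQ, returned as a Point3d
    # Most exporters pass on u/q and v/q as plain UVs - which is exactly
    # where the distortion information (the Q) gets thrown away.
    puts "#{pt} -> u=#{uvq.x / uvq.z}, v=#{uvq.y / uvq.z}"
  end
end
```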
-
@al hart said:
Here is a 3D Warehouse model where the house image was placed using Photo Match. Some renderers process it correctly and some renderers do not recognize and process the distorted texture.
Any insight on how to process distorted textures correctly?
-
@chris_at_twilight said:
An export to Collada for the entire scene produces 1 undistorted texture and 1 distorted texture.
How are the unique textures created? No API methods AFAIK that do that.
Something they generate themselves in some external C code? Out of reach for us mere mortals who only use Ruby?
-
@thomthom said:
How are the unique textures created? No API methods AFAIK that do that.
Something they generate themselves in some external C code? Out of reach for us mere mortals who only use Ruby?
The TextureWriter class will write pre-distorted textures. It is a lot of work to implement this in a rendering engine because you have to first determine which faces are distorted and create a unique material for each of them. As we have seen, this can result in LOTS of materials. I will probably add support to SkIndigo some day, but I have not considered it a high priority. Very few users even mention it as a serious problem.
It comes up most often when a user downloads a model from the 3DW that has been photomatched and then they try to render it.
SkIndigo has had a workaround option for over a year (well before SketchUp added it as a standard feature). You can right-click on a face and go to 'Explode Distorted Texture'. It does the same thing as 'Make Unique Texture'
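For anyone who wants to poke at this from Ruby, the pre-distorted write is roughly as follows (a bare sketch that assumes the selected faces have textured front materials; the output path is just an example):

```ruby
# Rough sketch: have the TextureWriter bake each face's (possibly distorted)
# texture to a unique image on disk. The output path is just an example.
model = Sketchup.active_model
tw    = Sketchup.create_texture_writer

model.selection.grep(Sketchup::Face).each_with_index do |face, i|
  next unless face.material && face.material.texture
  tw.load(face, true)                                    # true = front side
  tw.write(face, true, "C:/temp/face_texture_#{i}.png")  # baked per-face image
end
```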
-
@thomthom said:
So it doesn't seem to be the case. Though, it is weird what Jeff describes - seems to do the trick for him in Indigo. ...maybe Whaat knows something on this...
Very interesting thread. Thanks Thomthom for starting this discussion. So far, it has confirmed most of my theories.
Unless the rendering engine can interpolate UV mapping using four anchor points, it can't be done without exporting a pre-distorted texture.
The method of triangulating the mesh for the spherical mapping probably has something to do with how UVTools works. I believe that for faces with greater than three vertices, I call the position_material method with four points (which probably results in a very slightly distorted texture, but enough to mess up the rendering engines). For faces with three vertices, I call the position_material method with only three points, so the texture is not distorted (at least not enough to notice).
It is strange. Someone should test to see if the textures are distorted even for triangles after applying the spherical mapping. Maybe I'm wrong...
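To make the three-point vs. four-point difference concrete, here's a rough sketch (the UV values are made up purely for illustration, and it assumes the selected face already has a textured material applied):

```ruby
# Rough sketch of the difference: position_material takes alternating pairs of
# [3D point on the face, UV point]. The UVs below are made-up example values.
face = Sketchup.active_model.selection.grep(Sketchup::Face).first
mat  = face.material   # assumes a textured material is already applied

# Three pairs: an affine mapping - no distortion.
pts3 = [
  face.vertices[0].position, Geom::Point3d.new(0, 0, 0),
  face.vertices[1].position, Geom::Point3d.new(1, 0, 0),
  face.vertices[2].position, Geom::Point3d.new(1, 1, 0)
]
face.position_material(mat, pts3, true)

# Four pairs: SketchUp solves a projective mapping, and any inconsistency
# between the four pins shows up as a (possibly very slight) distortion.
if face.vertices.length >= 4
  pts4 = pts3 + [face.vertices[3].position, Geom::Point3d.new(0, 1, 0)]
  face.position_material(mat, pts4, true)
end
```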
-
@whaat said:
Unless the rendering engine can interpolate UV mapping using four anchor points, it can't be done without exporting a pre-distorted texture.
Is that possible? Is it at all possible to define a distorted texture with only three points?
@whaat said:
The TextureWriter class will write pre-distorted textures. It is a lot of work to implement this in a rendering engine because you have to first determine which faces are distorted and create a unique material for each of them.
And work out new UV data for the material you create.
But this method won't make new materials in the SU file, right?
@whaat said:
Very few users even mention it as a serious problem.
It comes up most often when a user downloads a model from the 3DW that has been photomatched and then they try to render it.
Could that not be because people are avoiding them due to this problem?
Also - recently there's been more stir about UV mapping here on the forum. As you saw with your UVTools 0.2 thread.
I've got some UV plugins I'd like to make - I've been looking into it - but since I render in V-Ray I try to avoid distorting textures.
-
@whaat said:
SkIndigo has had a workaround option for over a year (well before SketchUp added it as a standard feature). You can right-click on a face and go to 'Explode Distorted Texture'. It does the same thing as 'Make Unique Texture'
I've never been a fan of such workarounds - it makes the organisation of the model difficult.
-
@thomthom said:
Any insight on how to process distorted textures correctly?
As I recall, we originally tried to figure out whether the image was distorted, but in the end we wound up having SketchUp save an image for everything and having SketchUp give us the UVQ coordinates, and it finally just worked.
I just looked through the code and couldn't find anything special that we are doing. (Except to let SketchUp save a texture for all textured materials, and to let SketchUp give us the UVQs)
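In API terms, that combination looks roughly like this (a hedged sketch of the general recipe, not IRender's actual code; the path is just an example):

```ruby
# Hedged sketch of the "let SketchUp do the work" recipe: bake the texture with
# a TextureWriter, then ask a UVHelper (created with that same writer) for the
# matching UVQ coordinates.
model = Sketchup.active_model
tw    = Sketchup.create_texture_writer
face  = model.selection.grep(Sketchup::Face).first

tw.load(face, true)
tw.write(face, true, "C:/temp/baked.png")     # example output path

uvh = face.get_UVHelper(true, false, tw)      # front UVs, tied to this writer
face.outer_loop.vertices.each do |v|
  uvq = uvh.get_front_UVQ(v.position)         # Point3d: x = U, y = V, z = Q
  puts "u=#{uvq.x / uvq.z}, v=#{uvq.y / uvq.z}"
end
```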