Trouble with .position_material method
-
Not sure what you're getting at. My q values are very close to 1.0 as it is. I think they're ignored anyhow.
I think the way the texture mapping works is you specify up to 4 vertex pairs where each pair has a vertex in the model and a vertex on the texture. If you give 4 pairs, it will stretch, rotate, shear and tweak the texture until all pairs line up. You don't need q-values on the texture to accomplish that, just u,v values.
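In Ruby API terms, that looks roughly like this (a minimal sketch, assuming `face` and `material` already exist in the model; the array alternates model-space points with their texture UV points):
pts = [
  Geom::Point3d.new(0, 0, 0),  Geom::Point3d.new(0, 0, 0),  # model point 1, UV point 1
  Geom::Point3d.new(10, 0, 0), Geom::Point3d.new(1, 0, 0)   # model point 2, UV point 2
]
face.position_material(material, pts, true)  # true = apply to the front side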
-
Sure wish I could figure this out. Anyone have any ideas why I'm experiencing the problem with this texture and face?
-
I mentioned this to the Google dev team. I asked what the Q value did - as the manual claims it's unused.
@unknownuser said:
Hi Thomas,
I looked at the code, and the Q value (or Z attribute of the Point3d returned) is explicitly set to a value of 1.0 in the Point3d object returned. However, the method also does the transform to get the UV values (which is stored in a 3x3 matrix) on the returned Point3d after it initializes Q/Z to 1.0. So for transform matrices such as skew, which operate on the Z coordinate, you will end up getting a value that does not equal 1.0 (this probably also explains why UVHelper returns Point3d objects instead of a simple UV array, so that the matrix math lines up -- still doesn't excuse it though). So while the value may not always be 1.0, it is not getting the value from any internal representation of the model.
Matt
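For anyone who wants to see those returned Point3d values for themselves, here's a rough sketch (it assumes the current selection contains a face whose front has a textured material):
# Print the UVQ values UVHelper returns for each outer vertex of a face.
model = Sketchup.active_model
face  = model.selection.grep(Sketchup::Face).first
tw    = Sketchup.create_texture_writer
uvh   = face.get_UVHelper(true, false, tw)
face.outer_loop.vertices.each do |v|
  uvq = uvh.get_front_UVQ(v.position)
  puts "UVQ: #{uvq.x}, #{uvq.y}, #{uvq.z}"  # z holds the Q value
end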
-
# One model point paired with one UV point; the texture's size supplies the scale.
pts = [[479.0, 0.0, 95.5], [0, 0, 0]]
material.texture.size = [height, length]  # height/length: desired texture dimensions
face.position_material material, pts, true
-
Appreciate the posts everyone. Not sure if I have a solution though. I'm trying to understand why the UVhelper script is giving me point data that doesn't work when I use it. Thanks for any insight you can provide.
-
@sahi: I think you're on to something there with the texture.size method. I notice that the texture doesn't have the right height and width. In fact, I never really noticed those parameters before, I just left them at whatever they were and then stretched the texture out with the pins.
Since the texture has a width of about 2.5' and the wall itself is 40' long, the texture has to be stretched to 16x its width in order to fit. That would explain some of those large values in the UVhelper script.
Unfortunately, I have to put this down for awhile but I'll report back later what else I figure out.
-
hm...
The material size should only be relevant if there's no UV mapping. It specifies the default size of the texture. When you map 3D model points to 2D UV points it shouldn't matter.
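So something like this (a sketch, assuming `material` already has a texture) would rely on the texture size alone:
material.texture.size = [40.feet, 8.5.feet]  # default width, height of the texture
face.material = material                     # no point pairs: default mapping uses that size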
-
Interesting point. That makes sense. Sahi's method still has some application here as he just specifies one point in the real world and one in the texture. That allows the width and height parameters of the texture to take control.
I imagine if you gave it more point pairs, the width and height parameters would not apply anymore.
Guess I'm still stuck.
Do you think this is a bug? Should I file it?
-
I asked this at the beta forum - the best answer was what I quoted earlier. I'm not sure what to make of it. I'll try to prod again. But you can try as well - just to make some more noise about this matter from multiple sources. SU UV mapping is a bit of a mystery and a black art to master.
-
Well, I changed the scale of the texture to be a width of 40' and a height of 8.5'
I then remapped the texture by hand using the pins. When I then looked at the UV values, I got the same geometry positions but the uv coordinates had changed to unit values:
[479", 0", 95.5"],
[1",1",1"],
[0", 0", 95.5"],
[0",1",1"],
[0",0",0"],
[-0", -0", 1"],
[479", 0", 0"],
[1", -0", 1"]I made a script that uses these values and the script does work. Not sure why as I tend to think you're right: the UV coordinate mappings should overide size and width - otherwise, why have more than one pair of coordinates?
Me still confused.
-
Also thinking out loud again. Even though the UVhelper script returns uv values in whatever unit the model is in (in my case, inches), they really are unit-less values.
Say the value is 0.5....
I wonder if that refers to half the size of the picture dimension or half the size of the geometry that it's being applied to.
-
XYZ: 0,0,0 - UV: 0,0
XYZ: 0,10,0 - UV: 0,0.5
XYZ: 10,0,0 - UV: 0.5,0
XYZ: 10,10,0 - UV: 0.5,0.5
That would scale the texture one half time across the face.
XYZ: 0,0,0 - UV: 0,0
XYZ: 0,10,0 - UV: 0,2
XYZ: 10,0,0 - UV: 2,0
XYZ: 10,10,0 - UV: 2,2
That would scale the texture two times across the face.
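As a sketch, the second case would look something like this with position_material (assuming `face` is a 10x10 face in the XY plane and `material` carries the texture):
pts = [
  Geom::Point3d.new(0, 0, 0),   Geom::Point3d.new(0, 0, 0),
  Geom::Point3d.new(10, 0, 0),  Geom::Point3d.new(2, 0, 0),
  Geom::Point3d.new(0, 10, 0),  Geom::Point3d.new(0, 2, 0),
  Geom::Point3d.new(10, 10, 0), Geom::Point3d.new(2, 2, 0)
]
face.position_material(material, pts, true)  # texture tiles twice in each direction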
-
Understood. Thanks for the tip.
I'm not sure, knowing that, why the uv values changed when I changed the size and fit of the texture and repositioned it so it had the same look.
-
Keep in mind that that q value is not used. This does not mean that it will always be 0.
-
@sangahn said:
Keep in mind that that q value is not used. This does not mean that it will always be 0.
I completely don't get it. It looks like the SU developers have no clue what SU is actually doing!!!
I have recently discovered that the Q value indicates precisely whether a texture is distorted or not!

I have asked at least twice: 'How can I recognize if the texture is photo-matched or not?'. I haven't received an answer. I have had to stay with my slow method of writing to a TextureWriter and checking the handle it returns.
My recent discovery was so simple: THE Q VALUE IS BEING USED! It defaults to 1.0 inch. If it has any other value, the texture is photo-matched. It's a fast and simple method, and it's valid both for the uv_at(index) and the UVHelper solutions. It can take values close to 1.000000 (say 0.99999995), which still means the texture is not distorted.
I don't exactly understand how UVQ projection works, but I hope this casts some light on your problem.
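If that observation holds, a quick test could look like this (just a sketch; `distorted_texture?` is an illustrative name, not part of the API, and `face` is assumed to exist):
# Flag a face whose front texture has Q values away from 1.0.
def distorted_texture?(face, texture_writer)
  uvh = face.get_UVHelper(true, false, texture_writer)
  face.outer_loop.vertices.any? do |v|
    q = uvh.get_front_UVQ(v.position).z
    (q - 1.0).abs > 1.0e-6  # Q differing from 1.0 suggests distortion
  end
end

tw = Sketchup.create_texture_writer
puts distorted_texture?(face, tw)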
-
It doesn't have to be photo-matched. A skewed/distorted texture also changes the Q value.
See screenshot. (I've also attached the sample model. I used my Probes plugin to check the UVQ values: http://forums.sketchucation.com/viewtopic.php?f=180&t=21472)
-
When I use my UV mirror plugin to copy the UV mapping from the front side to the back side - it fails to do so correctly for the distorted texture. I think this is the source of the problem for the original poster of the linked thread.
-
Making some progress with this. Talking with some Google heads and "jeff99", it seems that the UVQ data returned by the UVHelper is the UV data transformed by a 3x3 matrix. Jeff said it seemed like Homogeneous coordinates - where the Q value is Height. The UV values should be divided by the H(Q) value - but it appears that SU multiplies instead.
We have done some testing on each of our projects and this seems to be working. I use this code to get UV values which I can use with position_material:
def self.flattenUVQ(uvq)
  # Divide out the homogeneous Q (stored in z) to get plain UV values.
  uvq.x = uvq.x / uvq.z
  uvq.y = uvq.y / uvq.z
  uvq.z = 1.0
  return uvq
end
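For instance, pulling usable UV pairs off an existing face might look like this (a sketch; it assumes `face` and `material` exist and that flattenUVQ is in scope):
# Gather flattened UV pairs from a face's vertices and re-apply them.
tw  = Sketchup.create_texture_writer
uvh = face.get_UVHelper(true, false, tw)
pts = []
face.outer_loop.vertices.first(4).each do |v|
  pts << v.position                                  # model point
  pts << flattenUVQ(uvh.get_front_UVQ(v.position))   # flattened UV point
end
face.position_material(material, pts, true)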
-
More feedback. I wasn't fully correct in my previous summary:
@unknownuser said:
Well, mostly. They did not multiply by H, they just did nothing. If they had multiplied by H, you would have had to divide by H twice to get the right answer. Once to undo their multiply, and once more to do the required division.
Your code is correct, but the description that they multiplied instead of dividing is not correct.
-
Hey Tom.
This is "jeff99" from the Beta forums. I followed your link to this thread and was reviewing the discussions here. I wish I had seen this a few days ago!
Thanks again for your observations regarding this. You are very tenacious at attacking a problem!
By the way, the "H" is for "Homogeneous", not "Height". Although it can be thought of as a scale factor.
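In other words, the actual texture coordinates come out as (u/q, v/q) - exactly the division the flattenUVQ method above performs.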