OK, I think I can help. First off, don't get hung up on the flipped thing. It's not a 'magic' operation - how many times have I multiplied the following number by -1?
123
You can't know. An even number of multiplies (aka flips) gets you back to where you started. It's the same when you multiply an axis by -1.
OK, some background. The basic building block of almost all this stuff is the dot product. A dot product is really projecting one vector onto another to see how far along it lies. The transforms in SU are made of 4 things: an xaxis, yaxis, zaxis and a position, stored in a Geom::Transformation. When you do "object.transform aMatrix" what you're actually doing is calculating how far along those 3 axes "object" lies. If you look at a vector-matrix multiply, it is composed of 3 dot products, because all you're doing is projecting the vector onto those 3 axes in turn to give the 3 numbers that make up your 3d coordinate. (It's very simple, but I'm amazed at the number of engineers I've interviewed for jobs who haven't understood this!)
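If it helps, here's a rough sketch in SU Ruby of "how far along each axis does this point lie" - the method name is mine, and it assumes the frame's axes are unit length. Each coordinate is just a dot product:

def local_coords(point, frame_origin, xaxis, yaxis, zaxis)
  rel = point - frame_origin                         # Vector3d from the frame origin to the point
  [rel.dot(xaxis), rel.dot(yaxis), rel.dot(zaxis)]   # 3 dot products = the 3d coordinate in that frame
end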
The scale of an SU transform is in the xaxis, yaxis and zaxis of the Transformation. So the lengths of the xaxis, yaxis and zaxis tell you how much scaling there is in each direction.
object.transformation.to_a[0..2] is the xaxis
object.transformation.to_a[4..6] is the yaxis
object.transformation.to_a[8..10] is the zaxis
(Rather unhelpfully, if you do object.transformation.xaxis you get a unit-length vector...)
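So if you want the actual scale you have to pull it out of to_a yourself, something like this (a quick sketch, method name is mine):

def axis_scales(transformation)
  a = transformation.to_a                      # 16 floats
  x = Geom::Vector3d.new(a[0], a[1], a[2])
  y = Geom::Vector3d.new(a[4], a[5], a[6])
  z = Geom::Vector3d.new(a[8], a[9], a[10])
  [x.length, y.length, z.length]               # scale along each direction
end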
A normal matrix has axes of unit length (aka no scaling)
An orthogonal matrix has axes at right-angles to each other.
An orthonormal matrix is both unit length AND at right-angles.
Physics systems often want orthonormal matrices. This means you'll need to record the length of each axis, then normalize each axis, then make them orthogonal by ensuring z = x * y and x = y * z.
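Roughly, in SU Ruby that could look like this (a sketch only - the method name is mine, it assumes the axes aren't degenerate, and * on a Vector3d is cross product, see the note below):

def orthonormalize(transformation)
  a = transformation.to_a
  x = Geom::Vector3d.new(a[0], a[1], a[2])
  y = Geom::Vector3d.new(a[4], a[5], a[6])
  z = Geom::Vector3d.new(a[8], a[9], a[10])
  lengths = [x.length, y.length, z.length]     # record the scale before we throw it away
  x.normalize!
  y.normalize!
  z = (x * y).normalize                        # force z perpendicular to x and y, back to unit length
  x = (y * z).normalize                        # then force x perpendicular to y and z
  [Geom::Transformation.axes(transformation.origin, x, y, z), lengths]
end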
A note on cross product:
a * b gives a vector perpendicular to a and b, with length |a||b|sin(theta), where theta is the angle between a and b. What this means is that if a and b are unit length but NOT perpendicular, you'll get a vector that is not unit length, and it needs re-normalizing.
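For example (a quick sketch to try in the Ruby console):

a = Geom::Vector3d.new(1, 0, 0)
b = Geom::Vector3d.new(1, 1, 0).normalize    # unit length, but 45 degrees from a
c = a * b                                    # cross product
c.length                                     # ~0.707 - NOT unit length
c.normalize                                  # so re-normalize before using it as an axis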
You can see that orthonormalization of matrices is expensive, but the quality of your results depends on it.
That's something to be going on with...
Adam