Driving joints from external data?
-
The raycast in Newton 1.5 (SP3 uses 1.5) is bugged, so I won't be able to add it until SPIV. In the short term, you could probably use SketchUp's "raytest" function.
http://download.sketchup.com/OnlineDoc/gsu6_ruby/Docs/ruby-model.html#raytest
-
Has anyone brought up the idea of using an external programmable controller to control a large number of SP pistons or servos? Basically, a factory automation simulation within SU with SP. I'm doing a machine control job now with two large industrial robots, conveyors, custom Cartesian robots, lots of pneumatic cylinders, and lots of sensors. I've been stuck having to program large parts of it without the physical items hooked up; only when it is installed at the factory will we be able to test and debug some of this control code.
I am using some PLC control software that anyone can download and try out on a PC, and though the full version isn't free, it isn't extremely expensive either. It is used in both factories and in home automation, on WinCE-based controllers, and on desktop PCs.
I've written a VB.NET program that communicates between some PC-based machine vision software and this PLC software, so I would think it would be possible to link the PLC variables to Ruby variables in SU, though I haven't looked into that side of things yet.
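To make that concrete, here's a minimal sketch of one way the PLC-to-Ruby link could look, assuming the PLC software (or a small relay program like the VB.NET one) exposes its variables over a plain TCP socket with a made-up "GET name" text protocol. The protocol, names, and port are all invented for illustration; nothing here comes from any real PLC product:

```ruby
require 'socket'

# Hypothetical bridge: the PLC side (or a small relay program like the
# VB.NET one) listens on a TCP port and answers "GET <name>" requests
# with the variable's current value as a single line of text.
def read_plc_var(host, port, name)
  TCPSocket.open(host, port) do |sock|
    sock.puts("GET #{name}")
    sock.gets.to_s.strip
  end
end

# A SketchyPhysics script could then poll this each frame, e.g.:
#   extend_cyl = read_plc_var("127.0.0.1", 5000, "Cylinder1.Extend") == "1"
```

An SP script polling this once per frame could drive a piston's controller value from the result, which is essentially the link described above.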
If possible, then I would think I could control a SP machine with control logic in my PLC.
Again, I would need SU sensors to detect the position of objects in the machine as the cylinders or servos are actuated. From what you're saying, I take it that a separate stand-alone sensor .rb plug-in could be created that could be used in SP scripting or in conjunction with my controller?
This would allow me to create a physical sim of my factory automation workcell which could be used to debug my code. (Or for pre-sales visualization, system design visualization, etc.) That would be pretty cool.
-
Basically, any Ruby script can work from within the scripting fields of SP, but I guess if it was very long or complicated you could run it externally. Chris, can you tell the distance with SU's raytest... or anything other than the name of whatever it hits?
-
As soon as I can get some time, I'll really delve into SP and SU scripting to see what's possible. I didn't see anyone bite on the idea of using a hidden or very transparent object as the sensor beam or detection field (as long or wide as necessary, or very short), and then scripting a trigger from a detected collision. Obviously, there should be no physical effect from the collision.
If this sounds like an interesting idea for future SP development, then I would also suggest some sort of return codes to enable extra sensing features, such as returning the object name (or an ID of some sort) and/or the object material (I know, future SP version). The idea is that a sensor trigger would ultimately detect only a certain material, such as a metal, and pass/ignore other items. This could also be used to simulate detecting shiny vs. dull objects by the reflection of light from an optical sensor. Even more advanced sensing could be simulated if an object name or ID was returned: bar-code scanners and machine vision systems can identify unique objects, and simulating sophisticated robots that identify objects would require this.
Finally, if you go this far, then returning the distance from the sensor body to the detected object would be the ultimate, since many sensors now have time-of-flight built in and can give you the real distance to the object detected. (Laser, ultrasonic, radar, and coded-infrared sensors all exist for use in industrial automation. I just saw a 'smart camera' sensor demonstrated that sends out infrared and builds a 2D image or 3D heightfield of the field of view in 'real time'. It lets you do a vision inspection where the 2D image appears as normal, but you also have a height value for every pixel in the image.)
Knowing the distance would also let you create a sophisticated sensor that ignores a detected object that is the wrong material or object type for that range: a 'small' or non-reflective object would have to be closer to the sensor to be successfully detected. Just some ideas I want to float...
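The material-plus-distance filtering can be sketched in plain Ruby (no SP API involved; the Hit structure, materials, and ranges are all invented for illustration). A hit only counts if the material matches and the object is within that material's detection range, so a dull object has to be closer than a shiny one:

```ruby
# Toy model of the proposed sensor return codes: name, material, distance.
# All names and thresholds here are invented for illustration.
Hit = Struct.new(:name, :material, :distance)

RANGE_BY_MATERIAL = {
  "metal"   => 2.0,   # shiny/reflective: detectable from farther away
  "plastic" => 0.5    # dull: must be much closer to trigger
}

# Returns the detected object's name, or nil if the hit is ignored.
def sensor_detect(hit, wanted_material)
  return nil unless hit.material == wanted_material
  hit.distance <= RANGE_BY_MATERIAL.fetch(hit.material, 0.0) ? hit.name : nil
end
```

With this, a metal part 1.5 units out would trigger the sensor, while a plastic part at the same distance would be passed over.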
-
@wacov said:
Basically, any Ruby script can work from within the scripting fields of SP, but I guess if it was very long or complicated you could run it externally. Chris, can you tell the distance with SU's raytest... or anything other than the name of whatever it hits?
Yeah. I haven't tested it, but it would be something like this:
point = [0, 0, 0]
up = [0, 0, 1]
contact = Sketchup.active_model.raytest([point, up])
dist = contact[0].distance(point) if contact
-
Ok, thanks... how efficient do you think the raytest is? Do you think it's possible to run more than one each frame?
-
It would depend on how complex the model is.
-
I guess you mean the actual geometry? Well, I've made a basic 'ray gun', which uses thrusters in place of actual physical collision detection. There are workarounds, and I'm not feeling well, so sorry if it's poor quality.
-
How does it fire with the keyboard?
-
Very cool script. I still don't understand how it actually works.
-
Jay, there's a version on the warehouse at http://sketchup.google.com/3dwarehouse/details?mid=8b02334bbd1681ac6cd8ad826e9017b8&prevstart=0 that can be fired with space.
Basically, I used two objects making up the arrow to get the vector for the ray. The start point uses the hidden sphere (I realise now that wasn't needed). There's a hash variable, @@thruster... so object [1] has @@thruster[1] tied to it by name. When the ray hits an object, it retrieves the object's name (so, [1]) and activates @@thruster[1]. Inside each object is a small thruster that always faces the gun; this is what pushes the object when @@thruster is activated. There's also an @@obj variable, which doesn't do anything (I can't remember why I thought I needed it).
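As a rough illustration of that name-to-thruster lookup (not the actual warehouse script), here's a plain-Ruby sketch where a lambda stands in for the raytest call, returning just the name of whatever the ray hits; the object names and the stand-in helpers are invented:

```ruby
# thruster maps object names to an "activated" flag (false by default),
# mirroring the @@thruster hash described above.
thruster = Hash.new(false)

# Stand-in for the raytest: returns the hit object's name, or nil on a miss.
fake_raytest = lambda { |scene, aim| scene.include?(aim) ? aim : nil }

fire = lambda do |scene, aim|
  name = fake_raytest.call(scene, aim)
  thruster[name] = true if name  # the thruster inside that object reads this
end

scene = ["1", "2", "3"]  # objects named by number, as in the ray gun model
fire.call(scene, "2")    # thruster["2"] is now true
```

The small thruster parented inside each object then only pushes while its entry in the hash is true, which is why the objects fly away from the gun when hit.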