    • Following 0
    • Followers 0
    • Topics 8
    • Posts 221
    • Groups 1

    Posts

    • RE: Why Are You Still Using Windows XP?

      There is a very easy way to keep XP up to date with the latest security fixes - just get your wallet out!
      The UK National Health Service has done just that - paid Micro$oft millions for ongoing security updates for another year or two. i.e. I am paying my taxes for universal health care (a good thing IMHO) only to see it lining Bill Gates' pocket rather than treating patients. If BG wants to be considered a 'charitable benefactor' then hows about not taking the money from such causes in the first place!

      So yes, planned obsolescence for sure - there is no technical reason that XP cannot continue to be upgraded to keep it "secure", because it is being done as we speak!

      As far as "the weakest link" is concerned - that has little to do with your OS, IMHO. The idea of "hackers"** as genius-level super-programmer 007-villains is mostly a distortion to make people wet their pants and get their wallets out - the "weakest link" is usually the bag of flesh and blood sat in front of the monitor, and most of the biggest security breaches come from social-engineering rather than any kind of clever programming.

      And I strongly agree with Krisidious - having almost every app' and background task these days punching holes in your firewall, often without you even knowing it, is a massive source of possible 'exploits' for the genuinely technical security breaches. Very little software is upgraded so regularly that it needs a permanent connection, and I am perfectly able to spend a few hours on my PC without checking how "popular" I am, or needing to see a silly cat video! (must 'fess, I do find them rather addictive though!).

      Call me old-fashioned, but if I'm not browsing/downloading, the router gets turned off at the wall. If an app' won't play nicely with that, it goes in the bin - simple as that!

      (**PS - apologies to any real hackers out there. As any true hacker knows, a hacker is someone who just loves finding new ways to make computers do the 'impossible'.
      There's another word for the 'naughty people' we're talking about here - "criminals".)

      posted in Corner Bar
      Trogluddite
    • RE: Why Are You Still Using Windows XP?

      Why do I use XP?

      My main use for a PC is for amateur music production and plugin development, and the audio hardware I use is rather more specialised than a MoBo sound-chip! The gear is all well over a decade old, but is in perfect working order, still sounds fantastic and I have a decade of experience getting it to do what I want. BUT, there are no PC drivers or configuration software for anything beyond XP, and there's no form of VM or "compatibility mode" that can get it working (oh, how I've tried!).
      To replace my studio PC with a Win7/8 machine - that's an expense I can afford. To replace the audio hardware - $1000s, and there isn't even anything on the market any more that duplicates all the functionality that it gives me!

      I now have a Win7 laptop that I use for most other things, but without getting a re-mortgage...erm, oh, hang on, this place is just rented...

      posted in Corner Bar
      Trogluddite
    • RE: Let's build a new 3D software!

      Maybe there's no will for it here Oli - I suppose there will always be a certain loyalty here to SU - but maybe you are simply appealing to the wrong audience?

      I agree with you 100%, there's a huge gap in the market for something with SU's ease of use, but the power and scalability of Blender etc. - more so every day as 3D printing etc. become more available.

      Maybe approaching the Blender community would be more beneficial...

      • It is all open source, with a large, enthusiastic and committed developer base.
      • Has APIs for (IIRC) Python and C++.
      • Has a mature, fast and reliable 3D engine behind the scenes.
      • Easily extendable with plugins, C extensions etc.
        Really, the only part of the brief that it doesn't match is the SU-like handling of regular solid shapes, push/pull etc. and the inferencing engine. But are those not something that would appeal to a large number of Blender users? - I think they'd all lap it up! I don't know any of the other alternatives well enough to comment, but I can't see any software community wanting to turn down something that makes what they do faster, simpler and more intuitive!

      Obviously, there may be issues of intellectual property (Trimble considered SU valuable for some reason!) - but it seems to me that 'skinning' Blender to make it more intuitive would be less of a task than creating a whole new engine from scratch, or waiting for Trimble to do this to SU.

      posted in Corner Bar
      Trogluddite
    • RE: SketchUp Problems

      Welcome to the forum Renaissance Man,
      Unfortunately, I can't answer all your questions, as it sounds like you are using SU for a very different purpose than I do. However here are some pointers for a couple of them.

      Not being able to unhide your objects may be a matter of scope (you specifically mention groups). "Unhide All" is not properly named IMHO - "Unhide all in current context" would be more accurate. That is, if you hide an object which is within a group or component, you can only reveal it again if you re-open the group/component for editing. Unfortunately, even the outliner does not highlight which groups/components currently contain hidden objects - it WILL indicate whole groups/components that are hidden, but not ungrouped geometry.
      I may be wrong, but I have a feeling there is a plugin, or some kind of console command that can be used to do a proper recursive 'unhide all', so that may be worth searching out.
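      In case it helps while searching, the recursive version isn't many lines of console Ruby. This is a hypothetical sketch, not the plugin I'm half-remembering - it assumes only that hideable entities respond to 'hidden=', and that groups and component instances expose their inner entities the usual API way ('entities' / 'definition.entities'):

```ruby
# Hypothetical sketch - walk every nesting level, unhiding as we go.
def unhide_all(entities)
  entities.each do |e|
    e.hidden = false if e.respond_to?(:hidden=)
    if e.respond_to?(:entities)          # e.g. a Group
      unhide_all(e.entities)
    elsif e.respond_to?(:definition)     # e.g. a ComponentInstance
      unhide_all(e.definition.entities)
    end
  end
end

# In the SketchUp Ruby console, something like:
# unhide_all(Sketchup.active_model.entities)
```

      No warranty implied - test on a throwaway model first!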

      As to your problems with the broken contours - I don't know the answer, but I would recommend a quick post to say what the source of the data is. I say this because your problems sound very similar to those experienced by many people who import data from non-SU sources, where the kinds of entities in the file may not directly correspond to the kinds which Sketchup uses. Knowing where the data is from will ensure that you get the most appropriate answer.

      @renaissance man said:

      Applications should not require users to make these kinds of concessions just so the program works correctly.

      Have to agree with you there. Sketchup's user interface is a gem, so much simpler than most other CAD packages - but I find myself using it less and less because of the need to constantly work around this stuff. I mostly design very small objects for circuit boards, and as I work in the UK, use metric measures for everything (as do all but the USA, and two other countries in the entire world!!) - I don't believe you can honestly stick the label 'professional' on any design software that finds this so hard to handle!!

      posted in SketchUp Discussions
      Trogluddite
    • RE: About NameError: wrong constant name

      Hi pingpink.

      I think the problem is coming from how 'Class.to_s' works, and what namespace the HTMLRenderer class is defined in.
      The original code is looking for a Class that is inside the Kernel namespace, using a String so that the Class used can be changed on-the-fly - the end result of the lookup would be something like "Kernel::MyClass".
      The problem with your change is likely one of two things...

      1. The class HTMLRenderer is NOT in the Kernel namespace - possibly just plain-old 'HTMLRenderer' in the main 'global' namespace. In this case "Kernel" would be the wrong place to look for the constant.

      2. The class IS inside Kernel. In this case "HTMLRenderer.to_s" would actually return the full name of the Class, including its enclosing namespace - which would be "Kernel::HTMLRenderer".
        But 'Kernel' is already specified as the receiver of the 'const_get' method - leaving us looking for the constant "Kernel::Kernel::HTMLRenderer". As there's no module "Kernel::Kernel", this won't be found, and you get a NameError.

      Whichever it is, the code is more complex than it needs to be...
      The original code is using 'const_get' so that it can choose a class at run-time - 'format string' is choosing what kind of object to make. If you now always want an instance of HTMLRenderer, there is no need to look up the constant this way, you can just write...

      renderer = Kernel::HTMLRenderer.new(...etc...)
      

      NB) This assumes that the HTMLRenderer class really is inside the namespace of Kernel. If not, the bit before the '::' may need to be different; or possibly it is not needed at all if the class is in the global namespace.
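      To illustrate the two lookup styles (using a made-up 'MyPlugin' namespace for the demo, since I don't know where your class actually lives):

```ruby
# Made-up namespace for illustration - substitute wherever your
# HTMLRenderer really lives.
module MyPlugin
  class HTMLRenderer
    def render
      "<p>rendered</p>"
    end
  end
end

# Dynamic lookup - the original code's intent; the string can be
# chosen at run-time:
format_string = "HTMLRenderer"
renderer = MyPlugin.const_get(format_string).new
puts renderer.render   # "<p>rendered</p>"

# Static reference - simpler when the class never changes:
renderer = MyPlugin::HTMLRenderer.new
```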

      posted in Developers' Forum
      Trogluddite
    • RE: Puts inconsistency & machine epsilon management

      In case it might be useful to anyone experiencing similar issues with numeric precision, I've tarted up a little module that I've used for a long time to peek inside number values.
      IEEE754.rb
      There are methods to take any Float, Integer or numeric String (including Hex/Binary), trim precision to either 32 or 64 bits, and then show:

      • The Float value represented by those 32/64 bits.
      • Signed and unsigned integer represented by the same bit pattern.
      • Raw bytes that make up the value in hex and binary.
      • Breakdown of the IEEE754 construction of the float (IEEE754 is the international standard for such things - the Wikipedia entry is a good start for understanding how Floats are built.).
      • Shows infinities, NaNs and denormal values.

      I designed it with another API in mind, where Ruby is interfacing to external code that can only handle 32bit floats - but it's turned out to be quite a useful tool whenever rounding errors, infinities, 'Not a Number' etc. are causing woe - and also when interfacing to external DLLs to check that all that horrible String packing/unpacking is working as expected.

      Probably not so useful for the OP's problems, as for 64bit double-precision floats (Ruby standard), decimal values cannot be displayed any more accurately than usual (though it may hint at the case where the binary has an infinitely repeating fraction)
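      The module itself is linked above; just as a taste of the idea, the raw bits of any Float can be pulled out with nothing but core Ruby's pack/unpack (a minimal sketch, not the module's actual code):

```ruby
# Show the 64 raw bits of an IEEE754 double.
# 'G' packs a double in big-endian byte order; 'B64' reads the same
# bytes back as a 64-character bit string.
def float_bits(f)
  bits = [f].pack('G').unpack1('B64')
  # sign(1) | exponent(11) | mantissa(52)
  [bits[0], bits[1, 11], bits[12, 52]].join(' | ')
end

puts float_bits(1.0)
# => 0 | 01111111111 | 0000000000000000000000000000000000000000000000000000
puts float_bits(0.1)
# note the recurring '1001' pattern in the mantissa - 1/10 never terminates
```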

      No restrictions on use, distribution etc. - nor is there any kind of warranty (caveat emptor!!)

      posted in Developers' Forum
      Trogluddite
    • RE: Puts inconsistency & machine epsilon management

      The 'sprintf' method used may or may not be more precise than the standard 'to_s' method, I'm not sure - but it will still never completely circumvent the problem of decimal values that binary cannot represent.

      For example, your value '4.000...003' is just a String as far as the Ruby parser is concerned, and it gets converted to binary to make a Float. When you print it, it converts back to decimal. At no point can you know whether the Float is really holding that exact number, or whether it's just that the conversion routines are exactly reversible.

      You can't even do a simple test whether the stored value is precise or not - Ruby would always tell you that it is, because you can only compare with another Float that has also been ultimately converted from a decimal string.

      Whether or not to round or format depends a lot on what you want to use the number for. For end users, 'decimalised' values are much more intuitive to deal with, and they are the 'norm' - but for a developer trying to locate rounding errors, this can be very deceptive.

      Rounding the Float value itself won't always help either - the Float#round method rounds to a fixed number of decimal places - but since even 0.1 can't be exactly represented in binary, you will often still end up with an approximation.

      I have seen some debuggers/editors that do have a setting to always print the actual binary number that is ultimately held in memory - as there is a fixed number of bits in a Float, and 2 is a factor of 10, it IS always POSSIBLE to translate the exact binary value to decimal within a bounded number of digits. When I've used these, it has been quite shocking to see what the CPU does with the values that we type in!

      Very few applications do this because it's counter-intuitive to type in a value and then be told it's something different on the display. Scientific and financial applications often use deviously complex routines for what seem like simple calculations so that they can emulate the maths being done in an 'all decimal' number space, to prevent this kind of issue.
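      Ruby itself ships one of those 'all decimal' emulations in its standard library - BigDecimal. A quick sketch of the difference:

```ruby
require 'bigdecimal'

# Binary doubles drift when 0.1 (not exactly representable) is summed:
sum = 0.0
10.times { sum += 0.1 }
puts sum == 1.0              # false
puts format('%.17g', sum)    # shows the accumulated error

# BigDecimal keeps the arithmetic in decimal, so there is no drift:
dec = BigDecimal("0")
10.times { dec += BigDecimal("0.1") }
puts dec == BigDecimal("1")  # true
```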

      Although often misunderstood and complained about, I have to say that I think the handling of rounding errors by the SU API is very well done - I think there would be many more threads like this from frustrated developers if we had to manage 'epsilon' for ourselves!

      posted in Developers' Forum
      Trogluddite
    • RE: Puts inconsistency & machine epsilon management

      @abakobo said:

      how can i "puts" the real float that's in memory and not a rounded one?

      To answer this final part of your original question...
      The only utterly reliable way to do this is to print the value in binary!

      Each number base has values that it can exactly represent, and those that it can't. We're all familiar with this in base ten, for example...
      1.0 / 3.0 = 0.3333333...etc...
      There is simply no exact representation of 1/3 no matter how many decimal places you use.

      The same is true of base 2 (binary) - for example, a float cannot exactly represent the decimal value 0.1. The fraction 1/10 is infinitely recurring in binary...
      0b0001 / 0b1010 = 0b0.0001100110011001100...etc...
      In effect, there is no such number available to a float, regardless of the number of bits - and it has to make do with the nearest approximation.

      Therefore, any time you change the number base, there will be a rounding error unless BOTH number bases can exactly represent the value. It's important to realise that this is a fundamental property of the way that numbers are represented - greater precision yields closer approximations, but will never solve the problem entirely.

      So in your example case, where apparently 4.0.to_i == 3, what may be happening is that the 'true' binary value is ever so slightly less than 4.0, but is a value with no exact decimal representation. The nearest 'non-recurring' decimal just happens to be the integer 4 - so this is what 'puts' displays.
      This kind of thing happens all the time, it's just that the 'integer/not-integer' case tends to make the problem more visible.
      (NB - this is also the reason that Ruby has the Rational class of number; so that fractions with recurring decimal expansions can be represented precisely).
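      Both points can be seen straight from the Ruby console, using only standard methods:

```ruby
# The parser's nearest double to '0.1', shown past the usual rounding:
puts format('%.20f', 0.1)          # 0.10000000000000000555
# The familiar symptom of that approximation:
puts 0.1 + 0.2 == 0.3              # false
# Rational stores the fraction itself, so no approximation occurs:
puts Rational(1, 10) + Rational(2, 10) == Rational(3, 10)  # true
```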

      Until we become a race of technology enhanced cyborgs that can all natively do maths in binary, this problem is always going to come up when we use float values anywhere in a user interface - the very act of displaying the value can change the value being displayed!

      posted in Developers' Forum
      Trogluddite
    • RE: [Talk] Ruby Debugger for SketchUp 14+

      @avariant said:

      When displaying data about a variable, does it volunteer variable names or does it just respond with a value?

      I would be very disappointed to find out it did

      The debugger is exposing nothing new here, the same thing could be done anyway through any of the SU consoles by any savvy Rubyist - using nothing but standard Kernel method calls. Setting trace functions and using the returned bindings to read local and instance variables, get the current method and its call signature, read the call stack etc... can all be done from even the most basic install of Ruby.

      You can't actually read the code lines unless they are in a 'plain text' source file (the scrambler presumably 'evals' the unscrambled code, which precludes seeing the source) - but there's not much that can be done to secure the other information without losing the consoles - which would make life very difficult for the amateur developers that we all rely on for our toys!
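      For anyone curious, here's roughly what that looks like with nothing but core Ruby (TracePoint in 2.0+; set_trace_func on older interpreters does much the same). The method being spied on is made up for the demo:

```ruby
# Capture a method's name and its local variables via a trace - the
# same core facility a debugger builds on.
captured = {}

spy = TracePoint.new(:return) do |tp|
  b = tp.binding
  captured[:method] = tp.method_id
  captured[:locals] = b.local_variables.map { |v| [v, b.local_variable_get(v)] }.to_h
end

def secret_recipe(ingredient)
  blend = "#{ingredient} + love"
  blend.upcase
end

spy.enable { secret_recipe("cheese") }

p captured
# {:method=>:secret_recipe, :locals=>{:ingredient=>"cheese", :blend=>"cheese + love"}}
```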

      posted in Developers' Forum
      Trogluddite
    • RE: Increasing Sketchup's Performance?

      @mrjumpmanv2 said:

      something tells me that there's something going on with sketchup that prevents it from utilizing all of my hardware

      You got that exactly right...
      Although your shiny new i7 CPU has (I guess) four cores, the way that Sketchup is built only allows it to use one of them for modelling. Other cores might still get used for 'low priority' threads such as dealing with the file system etc., so there will be some "sharing" of the CPU load. However, the really CPU intensive stuff is not able to utilise any 'surplus' power that other CPU cores might have available. This will be clearly visible if you check out the 'Performance' graphs in the Windows Task Manager - no matter how hard you make Sketchup work, things will hit a 'glass ceiling'.

      The mention of 'emulation mode' is because Sketchup is still coded as a 32bit application - so when you use it on a 64bit machine, Windows employs some trickery to make it compatible. Very often, this will be engaged automatically, so you won't be aware of it.
      That shouldn't affect 'number-crunching' performance too badly, though it can restrict how much of your 16GB RAM Sketchup is allowed to use (32bit numbers aren't big enough to hold very large memory addresses). For a very large model that won't fit into the allowed space, this could potentially slow things down due to using the HDD as 'virtual memory'.
      If high virtual memory usage is causing a bottleneck, moving the virtual memory to a super-fast solid-state drive may help - but I don't use files anything like as large as yours, so I'm unable to say whether that would be a definite improvement in your case.

      posted in SketchUp Discussions
      Trogluddite
    • RE: Applying Styles to Scenes without refreshing screen

      You might find my Render Favourites plugin useful.

      Using this, you can set up your display any way you like and store the settings as a preset, independent of the Styles window. Each preset includes more than just the Style - all of the shadow and fog settings, and the status of the 'Hide components' options, are stored and recalled.
      You can store up to eight of these setups and assign function keys for quick switching.
      Once they are set up, you can export a folder of images containing a bitmap export for each of the stored styles with a single click. There are global options where you can set the file format and resolution.

      NB) The plugin has not been tested with SU2013 - possibly I will update it in the future, but I currently don't have the time to give my SU plugins any TLC.

      posted in SketchUp Discussions
      Trogluddite
    • RE: [Extension][$] CompoScene v2.0.2 (Updated)

      @thomthom said:

      They don't appear to be doing the same thing...

      Certainly very different approaches - I built 'Render Favourites' with the specific intent of avoiding the creation of scenes (personal preference, I like them to be only for camera viewpoints). I have never rendered or post-processed an SU model in my life, so was quite stunned by the positive reaction of 'renderers'!

      Guess it just depends how folks like to go about doing their rendering - though I guess my plugin must take a bit of setting up for it, as it doesn't have the ready-made 'presets' for the render layers (nor enough 'favourite' slots to provide all of the ones that Renderiza lists at the same time).

      Anyhow - since both are free plugins, I just thought I'd drop in to say that if any of the code from my plugin looks handy for making a "best of both worlds" version, then please feel free to take and use anything that looks useful.
      Sadly, I don't have the time lately for much plugin work - but I'll happily try to answer any questions that might arise from trying to adapt the code.

      posted in Plugins
      Trogluddite
    • RE: Adobe ditches Creative Suite

      @pbacot said:

      It occurs to me that one outcome is that the developers may make "updates" that are unpopular and no one can do anything about it. Users can't hold back using a better version because the only version they'll have access to is the updated one.

      "updates that are unpopular" - now what sort of a company would do a thing like that!? πŸ˜‰ 🀣

      Also makes me think of possible 'legacy' and 'migration' issues.
      For example, our company once purchased the assets of a failing business - a range of products that we felt we could modernise and make commercially viable. Most of the CAD files, BOM databases etc. were in ancient formats that our in-house software did not support.
      But, fortunately, the installation disks for the original software were among the assets we purchased - the starting point for re-formatting the data for an automated migration. If the 'cloud' model becomes the standard for software distribution, such things will no longer be possible - or maybe only provided by "legacy software hosts" who would charge a small fortune for their specialist services.

      Given the recent revelations about the NSA's 'data snooping' capabilities, one also has to wonder about the safety of the data being transferred.
      The assurances of the big software companies that they wouldn't collaborate with such snooping are surely disingenuous given their previous track record - e.g. after the way Google bent over backwards to enable Chinese internet censorship, why should we believe that they wouldn't co-operate equally enthusiastically with any other 'friendly' security agency?
      And the US laws governing the human rights of US citizens do not even apply to the data of foreign citizens who happen to be accessing US servers (no doubt likewise for other jurisdictions).
      The 'human rights' aspects of this are only one part. There have been many times in history where security agencies have indulged in industrial espionage to appropriate technology - and once everybody's 'work' is floating in the cloud, the door is opened for the exploitation of all kinds of research and development for purposes for which they were not intended.

      posted in Corner Bar
      Trogluddite
    • RE: How many forum members does it take to change a light bulb?

      🀣

      • 12, posts reminding people to update their profile to say whether they are Edison Screw or Bayonet Cap users, and whether they run a 110V or 220V system, so that we can give them the right advice.

      • 25, flame war over whether Swann or Edison was the true inventor of the lightbulb!

      • 6, level-headed users making a plea for everyone to calm down a bit, the important thing is that light comes out, not which fitting you use.

      +100, β˜€ πŸ˜„ πŸ˜• πŸ˜’ 😑 😎 πŸ˜„ πŸ˜„ πŸ˜„ πŸ˜„ πŸ‘ πŸ‘Ž 🀒 πŸ˜’ ...etc...

      +8, "rose tinted" posts about how things were so much better in the 'gas lamp' days.

      +5, bitchin' that we really shouldn't need to be 'plugging in' lightbulbs, illumination should be part of the core system, so changing bulbs should be the electricity company's responsibility.

      +1, user who long departed the upgrade bandwagon requesting the file in "candle" format!

      posted in Corner Bar
      Trogluddite
    • RE: Ruby loaded file vs eval

      Yes, you have it right. A method definition is turned into 'byte-code' at definition time, and then just called, whereas 'eval' has to invoke the parser and work through the string every single time.

      The 'eval' way can sometimes be handy for making 'dynamic' code, where the method definition is changed in 'real-time' according to the circumstances - but doing it every frame does seem very wasteful. Similar things can often be achieved using more efficient methods, such as building a lookup table of 'Proc' objects; but without knowing what the SP code is doing, it's hard to say if that option would be viable.
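      A sketch of the lookup-table idea - the behaviour names and formulas here are invented (I don't know what the SP code actually evaluates):

```ruby
# Each behaviour is parsed once into a Proc; per-frame dispatch is
# then just a Hash fetch and a call - no re-parsing of strings.
BEHAVIOURS = {
  "spin"   => ->(frame) { (frame * 6.0) % 360 },
  "bounce" => ->(frame) { ((frame % 20) - 10).abs },
}

def run_behaviour(name, frame)
  BEHAVIOURS.fetch(name).call(frame)
end

puts run_behaviour("spin", 10)    # 60.0
puts run_behaviour("bounce", 3)   # 7
```

      Swapping a behaviour at run-time is then just replacing a Hash entry, which keeps the 'dynamic' flavour of eval without paying the parser cost every frame.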

      posted in Developers' Forum
      Trogluddite
    • RE: [Plugin] Rendering Favourites (v1.1)

      Hi Dave,

      Glad to hear you find the plugin useful.
      I don't think that your request should be too tricky to implement - though I'll need to try and find a way not to break any old 'favourites' that anyone might have stored.

      I don't do any rendering myself, so I wonder what you think the best way to implement this would be. I was thinking that maybe keeping the 'global' size option, and then allowing each favourite a 'scale factor' might be a nice approach, to save folks having to mess about with a calculator to ensure the same aspect ratio - but would that impose any limitations on your ability to combine the images the way you would like?

      Naturally, I want to try and sort out any SU2013 issues before adding any new features, so it might take a little while before I can implement new scaling options. But don't be afraid to give me a kick if I appear to have forgotten - my project management goes to pieces the minute I'm not at work with the boss peeking over my shoulder!
      (Note to self - must finish 'Quick Lathe' plugin!).

      posted in Plugins
      Trogluddite
    • RE: [Plugin] Rendering Favourites (v1.1)

      Hi Dan,
      Thanks for the heads up.
      To be honest I haven't even tried 2013 yet - I'm on a short deadline for a crucial project, and don't want to risk upsetting my system until that's out of the way (got to arm-twist the bean-counters into paying for the upgrade too!)
      However, I'll see if I can find some time to play with the 'Make' version at home over the weekend, and give all my plugins a once over for compatibility. From the posts over the last few days, I would guess the problem is most likely to do with the new toolbar system, so fingers crossed, there won't be too much code to change!

      posted in Plugins
      Trogluddite
    • RE: Should Trimble Buy More Plugins?

      @tim said:

      Whilst it may well be true that Ruby is slow...

      Which leads to another point - OK, so Ruby will never beat compiled 'C', BUT why are we still stuck with Ruby 1.8.x?
      I use 1.9.3 on a regular basis, which has a whole new byte-code interpreter (YARV), which truly is much faster - operations can take as little as 25% of the time of 1.8.6.
      And far more of the Ruby standard libraries are 'C' coded too, rather than being native Ruby files.
      The only objection I can see to upgrading Ruby, is that there are certain syntax changes etc. that would involve developers having to make minor alterations to their code.

      What better time to do this than at the release of the new 'Extensions Warehouse'? Many developers will have to tweak their code anyway to meet the new guidelines, and a system is in place to, at least in some small way, QC the code before admission.

      This seems like yet another missed opportunity.
      I personally rather like the way that plugin dev's currently are independent, so prefer to see the EW as a way for dev's to monetize their hard work - they would surely lose some ability to be so innovative if they were 'bought out' by a large corporation like Trimble (which has yet to show any sign of real 'innovation').
      If the EW is so important to Trimble, what better way to show their commitment than to say to the dev's "hey, we just made all your plugins four times faster, and we'll help you to adapt your code to the changes".
      The lack of this makes me suspect that in the long-run, the EW won't be used to benefit independent developers very much at all - rather, it will be used by Trimble to promote their own 'C' extensions, and charge users extra for features that should be incorporated into the main program.
      That might be OK if the building blocks of a "roll your own" system were cheap enough, but this release's hike in licence costs, for so little benefit, does not bode well for that model.

      It also raises the possibility that the EW could be to the detriment of small developers - Trimble now have a way to easily track which plugins are most popular. This makes it very easy for them to pull the rug out from under a successful plugin by implementing their own version. I mean no disrespect to the amazing work done by the Ruby dev's here - but the guys who sign my purchase orders at work are much more easily convinced to spend cash on 'official' software than something they might perceive as 'amateur' ("What? PayPal, you must be kidding!")

      My own Ruby efforts are meagre compared to the likes of ThomThom, Fredo, TIG etc. - but even if I thought they were good enough to monetize in some way, I would stick with our own Plugin Store - because I know that the guys who run this place really do have the best interests of the user base at heart.

      posted in SketchUp Discussions
      Trogluddite
    • RE: SketchUp 2013 ;)

      Phew - that all took some reading!
      I've held off commenting thus far because I haven't really put 2013 through its paces - and as they say "the proof of the pudding is in the eating."
      But I would like to make the observation, that whatever the merits of SU 2013, I think there is something seriously lacking - clear, concise and thorough communication from Trimble. To take just a couple of quick examples...

      • Why do we need to ask what the policy is regarding "skipped" upgrades? While the programmers were busy refactoring their code, why was the marketing team not laying out a clear policy, determining price points, and creating documentation that would make this issue perfectly clear right from day one of their "new baby"? Simply saying that any "buy in" after a version skip will be "reasonable" is not good enough - for many of us, I'm sure, that just has the ring of the weasel words that we hear every day from spineless politicians.
        (EDIT: NB - I'd just like to make clear, this is not a dig at the Trimble guys who have been brave enough to post here - such things are, of course, a collective responsibility, and there will be corporate rules about what any individual is allowed to say).

      • "Many bug fixes and performance improvement made" - that's great, it's not always brand new features that bring the biggest workflow enhancements - software that works faster and crashes less is, for sure, a valuable thing in its own right.
        But, where do I look to see these improvements - what exactly can I expect to be more stable? - roughly how much faster can I expect feature "X" to perform?
        The coders will surely have tested these things in great detail before the release - so why are the release notes so vague? Trimble must surely have the requisite information to hand.

      Much as I applaud ThomThom et al. doing some profiling etc., so that the less "tech-savvy" members of the forum can better understand the differences, why are end users having to do this? If these improvements are significant, Trimble should be shouting about them - and maybe the lack-lustre list of new features would not seem quite so underwhelming.

      • If there are changes to the Ruby API, however small - why is the documentation for them not already in place? When I purchase a new gadget, I don't expect the instruction manual to follow along weeks or months later - I expect it right there from day one so that I can immediately reap the benefits of the new toys I have at my disposal.

      • There may be differences under the hood to "re-build the foundations" - but, if so, what? I would be perfectly happy to accept this explanation for the lack of "visible" enhancements, were I given some indication of a "road-map".
        Just some little sign of the anticipated future improvements that are intended to be built on the "new foundations" would go a long way to convince me that my upgrade fee is buying into a platform that has a fantastic future ahead of it.

      I don't intend to imply that the lack of info means that we are somehow being ripped off - but come on, Trimble, you want my boss to sign off my purchase order - convince me!!

      posted in SketchUp Discussions
      Trogluddite
    • RE: What does SketchUp 2013 do for developers?

      @adamb said:

      What concretely do you think Ruby 2.x is going to give SketchUp?

      I'd happily settle even for 1.9.x - just for pure speed, if nothing else.
      Might also stop me from writing so many syntax errors while I'm trying to get my head around the SU API! πŸ˜‰

      posted in Developers' Forum
      Trogluddite