Who said SketchUp doesn't need to be 64 bit?
-
@jeff hammond said:
the main problem with that is that there is no development roadmap.. not publicly available at least.. they're entirely too secretive about the future of sketchup. (imo)
We might not know what is on the roadmap, but Jim Bacus' comments seem to clearly indicate some things that definitely are not.
As Rich says, our needs have evolved over time and I think most of us here at SCF (and in the 3D community as a whole) want more than just a user-friendly programme that only gets used for sketching things out in 3D.
It's one thing to claim that SU doesn't need to be 64-bit - I can appreciate the difficulties and arguments for not doing so - but it's almost unforgivable for Bacus to seemingly bury his head in the sand and ignore how the usage of SketchUp has evolved.
-
@driven said:
just quoting...
heh.. shoulda put the release date of 10.6 so those unfamiliar with osx would have a time reference..
10.6 was released in 2009 (also shouldn't have snipped the quote so soon.. the part about making applications 64bit so users can enjoy thumbnails with graphic displays of what's in the file)
this part is fictional.. the quote doesn't really include that.
-
"this community is only a drop in the ocean compared to the multiple millions of active SketchUp users in the world, and I very much doubt that anywhere near even 5% of world-wide Pro users would see tangible gains from adding 64-bit support"
Andrew
Andrew, I would disagree on several points here. First, while the active users from here may be 'a drop in the bucket', consider that many in the rest of the ocean are people who have the free version installed and are modeling things for fun from their basements. For many of us here, SketchUp is our bread and butter. It is a tool in our business plan. The type of users represented here are your biggest evangelists, scripters, salesmen, and customers - though small by percentage. By ignoring them for the vast ocean, you may slowly lose some of your greatest assets. Furthermore, if you really are aspiring to compete with the 'big boys' as it relates to BIM and IFC, you simply cannot ignore the 64-bit issue. Not developing 64-bit sounds like a dollars and cents decision, which may, in the long run, prove to be short-sighted. Not going to 64-bit may also be tipping the hand as to who the intended user base is, and it seems that may not be many of us here.
Second, I seriously doubt only 5% of users would realize the performance improvement. Anyone working on a large architectural model is familiar with the performance issues, especially with terrain. If this isn't an issue, why does almost every Basecamp have sessions on working with large complex models?
-
@jdagen said:
Second, I seriously doubt only 5% of users would realize the performance improvement. Anyone working on a large architectural model is familiar with the performance issues, especially with terrain. If this isn't an issue, why does almost every Basecamp have sessions on working with large complex models?
And when it comes down to it, they aren't actually that large or complex. Just more complex than a few cubes. Other 3D apps can munch through high-poly models but SketchUp still can't. I don't understand why and don't claim to be a computer guru, but it's getting to the point where many of us are exhausting the possibilities of SketchUp and don't care for it any more. I really can't see what significant improvements have been made in terms of poly count. Same old SketchUp; you open up a few trees and it freezes!
-
@otb designworks said:
... And the iPad viewer they released? Laughably useless. So much for that getting much better any time in the future... And no real face-making real-time sections? It's not like we are asking for the earth here...
...
A $9.99 viewer at that.
Yet Autodesk continues to develop and refine their mobile 3D app, FormIt. Sure, its target user is the Revit group, but there's nothing stopping a non-Revit user from using it. I believe OBJ and STL exports are coming soon.
-
I don't believe that the inference engine can be used as an excuse -- formZ has a very similar inference engine (which by the way allows you to choose which parts it will snap to, or not, anytime you want).
I'm not sure there is an excuse beyond the desire to make SketchUp cater to the lowest common denominator users (hardware-wise)... and I don't view that as a desirable thing either, but I think Bacus does.
-
@olishea said:
Other 3D apps can munch through high poly models but sketchup still does not. I don't understand why...
Maybe one reason is the inference engine, because this is the main thing that separates SketchUp from other 3D apps. This would be the only point I can think of for why SU is still at the level where other apps were maybe 15 years ago. But even then there should have been some kind of progress over the last years... maybe at least doubling the performance (whereas other apps can maybe hold 10-100 times more today - depending on the RAM, video card and display mode)
But the inference engine can't be an excuse for the slowest saving function I have ever worked with.
Btw, I really don't understand why SU didn't get at least a bounding-box display mode for objects a long time ago...
And concerning the "drop in the ocean" comparison, I really would like to see some statistics as proof. Where are the numbers from? Counting the downloads of the free version? Sold licences?
I think almost every serious SU user will sooner or later find his way to this forum (there are already 261,151 accounts). And I think the majority of the other users will be free users. And like others said, maybe these people would want the same as all these shameless power users - if they ever knew it was possible. And then there are the people who never used SU because of the lack of high-poly support...
@rich o brien said:
We've just morphed into a different type of user since then, one that has an insatiable appetite for more, more, more.
I would say that 3D as a whole has evolved a lot within the last 10 years - but SU...
-
@jeff hammond said:
@hieru said:
Presumably if users don't like the development roadmap, they should just stop using Sketchup - is that the gist of things?
the main problem with that is that there is no development roadmap.. not publicly available at least.. they're entirely too secretive about the future of sketchup. (imo)
Compare that to the Rhino forums where the McNeel people respond to feature requests on the forum, indicating if they plan to include the request in a future release (generally it's a 'no promises, but we'll try') or not. They don't put out a 'roadmap' per se but if you cruise the forums regularly, you get a clear idea of what's being worked on. And they're very up front when they think a requested feature isn't feasible at this time.
What's not to like about 'transparency'?
-
You should see the Modo forums. Very much in the trenches with the users picking their brains.
-
The worrying part to me, or I guess to anyone who is using SketchUp (& LayOut) daily for their profession, is that it isn't clear what the (near) future of SketchUp is going to look like.
Are we trying to use it to make models that are too complex, with poly counts the program was never designed for and never will be?! Will SketchUp scale better with better hardware soon?
Having to resort to tricks to make SketchUp still work with complex 3D models now is fine if you know the company is dedicated to solving that problem for SketchUp in the very near future.
If that's never going to be the case because there's a different vision for SketchUp, I'm wasting valuable time (building models & libraries) and money (licenses & plugins) due to the increase in size of projects and demands by customers.
-
I fear nobody has the courage to confirm/refute your last sentence.
-
@kaas said:
Will SketchUp scale better with better hardware soon?
The future is multithreaded. So I think the answer is... NO!
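To put "the future is multithreaded" in concrete terms: extra cores only help code that is written to use them, which is a separate question from 32-bit vs 64-bit. A minimal Python sketch (the busy-work function is just a stand-in, nothing to do with SketchUp's actual code):

import time
from concurrent.futures import ProcessPoolExecutor

def spin(n):
    # CPU-bound busy work standing in for one heavy modelling operation
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [5_000_000] * 8                      # eight chunks of work
    t0 = time.perf_counter()
    serial = [spin(n) for n in jobs]            # one core, one after another
    t1 = time.perf_counter()
    with ProcessPoolExecutor() as pool:         # spread across available cores
        parallel = list(pool.map(spin, jobs))
    t2 = time.perf_counter()
    print(f"serial:   {t1 - t0:.2f} s")
    print(f"parallel: {t2 - t1:.2f} s  (gain comes from cores, not from 64-bit)")

On a quad-core machine the second number is roughly a quarter of the first; a single-threaded application sees no such gain no matter how many bits it's compiled for.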
-
Just my personal experience: I've been using Blender for about 3 years, and for some reasons I've had to use the 32-bit version since last year. I don't notice any difference besides simulations and rendering, which I end up outsourcing to my home laptop; there is no difference in viewport performance between 32-bit and 64-bit, and I can work with projects ranging from 100 to 300 MB with no problems.
-
I have a somewhat childish idea (before Bacus finally buries SU):
Let's organize some kind of petition* with our vision of the 64-bit agenda and send it to the Trimble Executive Team.
*moderators could send a PM with a plea for signing to all 200K SCF members.
What do you think?
-
@jaceguay said:
Just my personal experience: I've been using Blender for about 3 years, and for some reasons I've had to use the 32-bit version since last year. I don't notice any difference besides simulations and rendering, which I end up outsourcing to my home laptop; there is no difference in viewport performance between 32-bit and 64-bit, and I can work with projects ranging from 100 to 300 MB with no problems.
I suppose you haven't read the entire thread...
As far as I've seen so far, no-one is claiming that 64-bit will make SU run any faster...
The primary reason users want to see a 64-bit version is the use of e.g. 3rd-party integrated render applications, where SU - being 32-bit - is the culprit because of the RAM limitation...
But it's also a request to make SU compatible with future computer systems, as well as a hope that it will be able to handle high-poly models better than it can today...
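For anyone who wants to see that RAM ceiling first-hand, here's a rough Python sketch; a 32-bit Python build stands in for any 32-bit host application, and the chunk size and safety cap are arbitrary choices:

import struct

CHUNK_MB = 256                     # arbitrary allocation step
CAP_MB = 8192                      # safety cap; lower it on a small-RAM machine,
                                   # since a 64-bit interpreter will allocate it all

print(f"This Python build uses {struct.calcsize('P') * 8}-bit pointers")

chunks, allocated = [], 0
try:
    while allocated < CAP_MB:
        chunks.append(bytearray(CHUNK_MB * 1024 * 1024))   # grab another 256 MB
        allocated += CHUNK_MB
except MemoryError:
    pass

print(f"Allocated roughly {allocated} MB before stopping")

Run under a 32-bit interpreter it gives up somewhere around 2-3 GB no matter how much RAM the machine has; that's the wall an integrated renderer hits inside a 32-bit SU process.
-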
@frederik said:
@jaceguay said:
Just my personal experience: I've been using Blender for about 3 years, and for some reasons I've had to use the 32-bit version since last year. I don't notice any difference besides simulations and rendering, which I end up outsourcing to my home laptop; there is no difference in viewport performance between 32-bit and 64-bit, and I can work with projects ranging from 100 to 300 MB with no problems.
I suppose you haven't read the entire thread...
As far as I've seen so far, no-one is claiming that 64-bit will make SU run any faster...
The primary reason users want to see a 64-bit version is the use of e.g. 3rd-party integrated render applications, where SU - being 32-bit - is the culprit because of the RAM limitation...
But it's also a request to make SU compatible with future computer systems, as well as a hope that it will be able to handle high-poly models better than it can today...
Yes, you're right, it is not just about the viewport; it's almost heartbreaking to see so many incredible plugins that are exclusive to SU locked in a single-threaded, 32-bit environment.
Besides, the Blender viewport is highly selective about what is shown. For example, by default you won't see any edges until you enter edit mode for a specific object, many of the modifier values that would drastically increase RAM usage are only computed at render time, and you can also set different display modes for parts of your model, like wireframe or bounding box. You don't lose any functionality by doing this because it automatically switches back to the complete style when you enter edit mode for that object.
It's a good idea for managing large scenes, but you lose the view of the whole; GPU renderers should be making up for this.
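The "bounding box unless you're editing it" idea is simple enough to sketch. This is only the concept in Python, not Blender's or SketchUp's actual code, and the triangle threshold is made up:

from dataclasses import dataclass

HEAVY_THRESHOLD = 200_000          # made-up cutoff in triangles

@dataclass
class SceneObject:
    name: str
    triangle_count: int
    in_edit_mode: bool = False

def display_mode(obj: SceneObject) -> str:
    # Full geometry while editing, a cheap proxy for heavy objects otherwise
    if obj.in_edit_mode:
        return "full"
    if obj.triangle_count > HEAVY_THRESHOLD:
        return "bounding_box"      # 12 triangles instead of hundreds of thousands
    return "full"

if __name__ == "__main__":
    for obj in (SceneObject("tree_07", 480_000), SceneObject("wall_A", 1_200)):
        print(f"{obj.name}: {display_mode(obj)}")
-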
@pixero said:
Wow, this thread has really taken off.
A worrying thing is that it seems they haven't even started converting to 64-bit to future-proof SU. If 64-bit isn't the bottleneck, then please tell us what it is and fix it.
By the way, how many developers are there? Does anyone know?
Those of you who were at Basecamp maybe have a clue.
I don't know, but looking at some of the info they posted at Basecamp and the new release of Scan Explorer, I can only conclude their main focus has been supporting other efforts.
Is there any way to run some diagnostics in the background so that, if SU runs out of memory, it makes a dump of all running processes and the memory used at that point?
When these crashes appeared I didn't get a BugSplat; I simply got that popup and another with "The application has unexpectedly closed", and it was gone.
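Not a crash dump, but a rough sketch of the "log memory in the background" idea in Python; psutil is a third-party package, and the process name, poll interval and log path are assumptions to adjust:

import time
import psutil                      # third-party: pip install psutil

LOG_PATH = "su_memory_log.txt"     # hypothetical output file
POLL_SECONDS = 10                  # arbitrary poll interval
PROCESS_NAME = "SketchUp.exe"      # adjust to match your installed version

def log_memory():
    with open(LOG_PATH, "a") as log:
        while True:
            for proc in psutil.process_iter(["name", "memory_info"]):
                if proc.info["name"] == PROCESS_NAME:
                    rss_mb = proc.info["memory_info"].rss / (1024 * 1024)
                    log.write(f"{time.strftime('%H:%M:%S')}  {rss_mb:.0f} MB\n")
                    log.flush()
            time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    log_memory()   # leave it running; the last lines written before a crash show
                   # how close the process was to the 32-bit ceiling

-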
@solo said:
May I ask if SU works better on a 32-bit machine than on a 64-bit machine, as a 64-bit machine needs to emulate 32-bit, which in essence slows it down?
All else being equal between two CPUs where one is 32-bit and one is 64-bit, a 32-bit application will run at the same level of performance on both. Therefore, the underlying architecture of the CPU is irrelevant for a 32-bit SketchUp executable.
Here's where it gets fun.
We are absolutely certain that if we were to compile identical SketchUp code for 32-bit and 64-bit, and then benchmark the performance of the two on the same 64-bit computer, the 64-bit version would be slower.
-- "How much slower?"
We don't exactly know. Making that determination involves some very complicated analysis and a great deal of investigation that we don't think is worthwhile. After all, since I mentioned that we're somewhat resource-constrained, it would make more sense just to spend that effort in a real migration instead of making a prediction about it. The quick glance we did at this suggests there could be a slowdown of as much as 25%, or as little as 5%. Regardless, all else being equal, many of the CPU-based activities in 64-bit SketchUp would be slower than their 32-bit counterparts.
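For a back-of-envelope sense of where that kind of slowdown can come from: in a 64-bit build every pointer doubles from 4 to 8 bytes, so pointer-heavy geometry records get bigger and put more pressure on CPU caches even though the work is the same. The "edge record" layout below is purely hypothetical, not SketchUp's actual data structure:

import struct

EDGES = 1_000_000
POINTERS_PER_EDGE = 5              # e.g. start/end vertex, two faces, next edge (invented)
PAYLOAD_BYTES = 24                 # flags, IDs, etc. (invented)

def footprint_mb(pointer_bytes):
    # bytes per record times record count, reported in MB
    return EDGES * (POINTERS_PER_EDGE * pointer_bytes + PAYLOAD_BYTES) / 1e6

for bits, ptr in ((32, 4), (64, 8)):
    print(f"{bits}-bit build: ~{footprint_mb(ptr):.0f} MB for {EDGES:,} edges")

print(f"(this interpreter's own pointer size: {struct.calcsize('P')} bytes)")

Same million edges, roughly 45% more bytes to drag through the cache - one plausible source of the 5-25% range mentioned above.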
We hold in very high esteem the fact that we have continually improved SketchUp's performance with every single release. We are not opposed to releasing a 64-bit version of SketchUp in general, but in light of this, I think you can understand that we would be incredibly averse to releasing a new version of SketchUp that performs more poorly than its predecessor, no matter whether there's a mitigating reason. I can't say we'd never release something with a backwards step in performance, but it'd be a hard pill for our team to swallow.
Therefore, you can assume that the production of a 64-bit SketchUp would precipitate the need to invest even more resources in performance improvement than we normally would, in order to try to make up for the added deficit imposed by 64-bit compilation. This would of course increase my previous manpower and time estimates.
My point in explaining all of this is simply that, while those of us on the SketchUp team really do understand the frustrations of our users who run into problems due to the current 32-bit memory limitation, we must also be keenly aware of the massive investment required to produce that change, including all of the other desired fixes and features that would have to be sidelined in order to make it happen.
Andrew
-
@jeff hammond said:
can't ask a mac user.. the last two OS releases have been 64bit only.. ... also of note (maybe) is that even apple's lesser hardware (phones and tablets) are 64bit now... obviously, i'm not a developer but from a user pov, they're sending a pretty clear message.. quit making 32bit applications.
Jeff,
I think the perceived message of "quit making 32-bit applications" is just a side-effect of Apple's circumstances.
Apple's switch from PowerPC to Intel CPUs in 2006 marked a perfect opportunity to simplify development substantially by beginning the universal adoption of 64-bit CPUs. The funny thing though was that even they didn't make the transition quickly. Although there were already 64-bit PowerPC CPUs available, they weren't universally so, making Apple continually drag their 32-bit stuff along in the OS until they felt comfortable dropping PowerPC support. They also didn't release their new hardware with full 64-bit support as they could have. For example, our "Apple Xserve1,1" build servers from that era had 64-bit CPUs (as have all Intel Macs ever released), but not the 64-bit EFI necessary to utilize them. As such, they only supported the 32-bit kernel!
Finally abandoning 32-bit platforms altogether required Apple to consciously drop support for all of the old PowerPC architecture, which we saw them do a few years after switching to Intel, but then also had to abandon several of the first generation of Intel-CPU-based Macs in order to get to a complete level of universal 64-bit kernel support across the board. That only just happened within the last 18 months, IIRC.
Even so, this transition was lightning fast in comparison to what's happening in the Windows world. 15 years ago, people hoped Alpha and then Itanium CPU platforms for Windows would signal the end of 32-bit Intel domination in the PC market, but those both fizzled. Intel still reigns supreme and the fact that they still manufacture 32-bit CPUs, coupled with the open architecture of the Windows platform means that unlike Apple, Microsoft doesn't get to dictate the death of the 32-bit platform, whether on desktops or on less powerful embedded systems like ATM machines or tablets. As such, as long as the hardware is out there, it's in their interest to keep making software for it. This symbiosis means there's still a very large market of 32-bit machines being sold (I think it's around 10-20% of new PCs).
I think we see the same thing echoed in iOS vs. Windows tablets. Apple has the luxury of dictating 64-bit across the board, which just serves to simplify their lives massively, so they do, and then--BOOM--that's the way it is.
But my point here is that I think that the reason Apple is making everybody do 64-bit applications now is not because there's an incontrovertible technical advantage, but because of the massive simplification they created across their whole hardware and software development business by leveraging their god-like ability to dictate a single hardware platform across their kingdom. Third-party developers needing to fall in line with what they do is just a side-effect of a move Apple made to greatly benefit themselves.
...
Now, consider this: If only SketchUp had the luxury of being able to drop 32-bit support altogether, how much easier it would be for us to make the jump to 64-bit. With no 32-bit, there'd be no doubling of our testing surface, no making separate installers, no need to write all kinds of help center articles to explain to the masses how to differentiate the two releases, no modifying our store to provide different products, no educating our resellers about the differences, no translating all of that junk into a dozen languages, etc., etc., not to mention the cost savings associated with all of that.
I can't blame Apple for ditching 32-bit; that's the world I'd like to live in, too! But maybe this helps explain why adopting 64-bit while still supporting 32-bit would really suck, just like Apple saw for the 5-6 years they were in transition to the 64-bit-only approach.
Again, this means opening the 64-bit can of worms is even more expensive than just the effort required to create a 64-bit capable SketchUp.
Andrew
-
oh. I was thinking it would be more of 64bit sketchup replaces 32bit sketchup but I see what you mean about still needing a 32bit version around.. at least for a while (sort of like when you had ppc and Intel support with the mac version)
on a side note-- I'm a 'victim' of the 64bit only thing with osx since I still own a 1,1 Mac Pro.. I thought it was longer than 18months ago when it lost OS support but you're probably about right.. whenever mountain lion was released.