With the release of Adobe CS4, many of us are seeing software leverage the GPU for the first time. I have personally seen a rather significant increase in redraw performance when zooming, seemingly a direct result of GPU acceleration in Photoshop. But this pales in comparison to what’s on the horizon for desktop computing. Pictured above is the newly announced Tesla desktop “supercomputer” from graphics chip manufacturer Nvidia. Around $9,000 will buy a configuration delivering 4 teraflops of performance and sporting nearly 1,000 processor cores (4 GPUs at 240 cores each). Yes, $9,000 is far more than most of us ever plan to spend on a computer, but it is significantly lower than even the lowest entry point for previous so-called “supercomputers.”

The point is that this signals a sea change in the relative performance of desktop computing. In the near future we could see performance leap by factors of hundreds or even thousands, as opposed to the incremental bumps we’re getting now. As with all technology, prices will come down and the technology will go mainstream (apparently Dell already has plans to manufacture a consumer version).

This is really exciting news for those of us in the creative sector. While most people would never need this kind of performance at any price, it could fundamentally change the way we create and edit graphics, audio, and video. It is also a potential boon for society in general: low-cost machines like these could let scientists and researchers run in a single day the computer simulations and experiments that once took a year, leading to new breakthroughs in science and medicine.
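To put those headline numbers in perspective, here is a quick back-of-the-envelope calculation. The figures (4 GPUs, 240 cores each, 4 teraflops total) come from the announcement above; the per-core estimate is just naive division and ignores real-world utilization.

```python
# Back-of-the-envelope math for the Tesla configuration described above.
gpus = 4
cores_per_gpu = 240
total_tflops = 4.0

total_cores = gpus * cores_per_gpu                 # 4 * 240 = 960 cores
gflops_per_core = total_tflops * 1000 / total_cores

print(total_cores)                # 960
print(round(gflops_per_core, 2))  # 4.17 GFLOPS per core (naive average)
```

The interesting part isn’t the per-core number (a single CPU core of the era is faster); it’s that you get nearly a thousand of them working in parallel.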
More info on the Tesla can be found here, and if your wallet is getting too heavy you can actually buy one here.
On a side note, who the hell designs this hardware? Why does it all look like the old Xbox? Why does it all look like some bad late-nineties rendition of what alien hardware might look like? Why is there always so much green involved? Hopefully Apple designs one soon.
Pretty impressive setup. I’m no engineer, but I always wondered why graphics programs never took full advantage, if any, of the GPU. I want to say that maybe four years ago a GPU had the same processing power as an entire year-2000 computer. Take, for example, the video card in the first-generation Xbox.
As far as computer cases go, the only thing that has prevented me from building my own computer is the case (it’s a little superficial, but hey). The only case I’m impressed by is the Mac Pro desktop’s, with the HP Blackbird 002 a distant second.
Check out this article about Apple design nirvana:
http://www.macworld.com/article/132261/2008/02/casemacs.html
That thing is ready to fight aliens!
Kenny-
“I always wondered why graphics programs never took full advantage if any of the GPU”
I think a few things factor in. First, there’s been a lack of a standardized platform for interacting with the GPU, along the lines of Apple’s Core Image/Quartz Extreme. I am no expert on all this, but it seems like Nvidia has been at the forefront of moving this agenda forward, and for good reason: the only way at this point to unseat a giant like Intel would be to fundamentally change the processing game by offering products with better cost/performance ratios, which in turn would prompt hardware manufacturers (motherboard makers, etc.) to change their products to support the new architecture. Second, video cards have only (relatively) recently become powerful enough to warrant such methods.
But the main reason is probably feature creep and apathy on the part of software developers. With each iteration of software, developers seem more inclined to add useless fluff features than to address the core issues with their software: stability and performance. A perfect point/counterpoint for this is Windows Vista vs. OS X. Microsoft decided to create a bloated, unstable pile of code in its quest to add usability and functionality; no one cares about new features if the OS as a whole doesn’t work. Apple, on the other hand, chose the road less traveled: rebuild your product from the ground up, adding key functionality incrementally as it becomes ready.
So yeah, that was my long answer. The short answer: developers are too busy adding wizards, widgets, and BS features they think will compel consumers to buy the new version of their software to focus on making that software perform better through methods like GPU acceleration. I mean, come on, it’s 2009 and Photoshop still can’t see the 8GB of RAM in my Mac Pro.
What a great name for such an insane machine! Seriously, that thing is a deranged beast.
Funny you should mention the design of the case. The other day I actually thought about giving in fully to the geek design trend and building a system covered in “alien” design and as many garish LEDs as the power supply can handle…
But more seriously, I remember reading a while back about how many geeks (of the LAN-party variety) really disliked the Apple design philosophy and felt they didn’t need a company to dictate taste. They actually preferred their computers to look that way, which could be where Nvidia is coming from in this case.
I have an old G5 tower that I can’t get rid of, mainly because the case is so great. It’s too bad there’s no way to actually upgrade the computer and still use the beautiful metal case. My new Intel Mac looks basically the same on the outside, so I bought the same stupid enclosure twice.
Maybe geeks love late 90’s alien movies? 😉
It looks like something taken out of the ship in Independence Day. All that’s needed are air-piston sounds and it’s complete.
Imagine the kind of physics simulation you could run on that. Games would never be the same again.
That’s cool and all but I bet it still can’t run Mega Race.
Anyone ever play the game The Dig? That game was impossible to beat.
http://ecx.images-amazon.com/images/I/51V685PT24L._SL500_AA280_.jpg
Nice post. Thank you for the info. Keep it up.
I don’t think Apple are the last word in good industrial design.
Dell has been making some pretty sweet-looking products lately. And if they’re going into the consumer market, I think Apple will be sweating. I’ve got a Dell monitor, and it looks hot.
If anyone wants to hold a candle to Apple design, then they have to stop making products that look like they’re ripping Apple off.
But Dell has stepped up on some of their products, that’s for sure.
Yeah, I don’t think Apple is the last word, but they’re one of the few, if not the only, computer manufacturers out there who have the luxury of building quality cases. They can charge enough to cover the cost of making a solid, well-constructed device. Dell can design the hell out of a product, but they’re still going to make it out of plastic because they can only sell it for $800. Some of the new HP Elites look cool, but the second you hold one in your hand, all that coolness goes right out the window. The new MBP (or the old one), on the other hand, has a feel that reinforces its aesthetics. It feels solid, well constructed…
Wow. I bet After Effects would run pretty smoothly on that…
Wasn’t Tesla the name of the dragon in Astrid Lindgren’s The Brothers Lionheart?
With all that power, are you guys really worried about what the CASE looks like? They could put it in a cardboard box for all I care.
It seriously looks like they used some of the leftover parts from that awful Batmobile in 1997’s “Batman & Robin”. At least they didn’t throw any Bat-nipples on the thing – oh, wait, we haven’t seen the back yet!
It’s like computer manufacturers are trying to recreate the Star Trek look, with glowing switches and aerodynamic swoops and swishes (not needed, as the computer won’t be zooming anywhere). A lot of PC users say the computer is simply a tool to be used, but only Apple seems to get it: a big, metal slab used to pound away at graphics and video.
The GPU stuff is getting interesting, with the new MacBook Pros and now this.
Actually Newton, the case might be “zooming” into my trash can at light speed so the design might serve one final purpose. Looks like it’s back to the Holodeck for the Tesla design crew.
I think most of us would be pretty shocked to find out what the average computer user thinks is “cool” or “techy”. I obviously don’t have any data to back this up, but I bet 7 out of 10 “heavy” computer users would buy the machine as shown above.
Scott – In light of your artistic ability, I’m totally impressed with the mechanical/analytical posts you have on the site. Keep it up!
And with computing power like that readily available, brute-force methods of code breaking become much less cumbersome.
Say goodbye to all your sensitive information.
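The scaling behind that worry is easy to sketch. The numbers below are purely hypothetical (a 2^40 keyspace, and assumed checking rates for one CPU core versus roughly a thousand GPU cores); the point is only that search time shrinks linearly with throughput.

```python
# Rough illustration of why raw throughput matters for brute-force search.
# All figures are hypothetical, chosen just to show the scaling.
keyspace = 2 ** 40   # number of candidate keys to try (assumed)
cpu_rate = 1e6       # keys/sec on a single CPU core (assumed)
gpu_rate = 1e9       # keys/sec across ~1,000 GPU cores (assumed)

cpu_days = keyspace / cpu_rate / 86400
gpu_days = keyspace / gpu_rate / 86400

print(round(cpu_days, 1))  # ~12.7 days on the CPU
print(round(gpu_days, 3))  # ~0.013 days on the GPU (about 18 minutes)
```

A thousandfold increase in throughput turns a nearly two-week search into a coffee break, which is exactly why cheap massively parallel hardware changes the security calculus.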
Good article, I support it.
For more content, please visit:
http://www.xasun.com/article/CunChu/