Really neat shit there. Makes sense, since a GPU is really nothing more than an enormous highly parallel math processor which specializes in intensive matrix and vector mathematics.
New Scientist . . . yeah, yeah, yeah. wrote: Computer graphics card simulates supernova collapse
* 16:57 10 June 2005
* NewScientist.com news service
* Will Knight
Web Links
* Advanced Computing Lab, Los Alamos National Laboratory
* Terascale Supernova Initiative
* Computer Science, California Institute of Technology
New software enabling scientists to perform mind-boggling mathematical calculations and see the results rendered almost instantly on their screens has been released by US researchers.
The Scout programming language, developed at Los Alamos National Laboratory (LANL) in New Mexico, US, lets scientists run complex calculations on a computer's graphics processing unit (GPU) instead of its central processing unit (CPU).
In tests, the graphics processor was able to perform certain types of calculation 12 times faster than a single CPU.
Graphics processors generate smooth and realistic three-dimensional imagery by performing rapid calculations on visual data. And the latest graphics chips rival CPUs for raw processing power, thanks to consumer demand for hardware powerful enough to support the latest 3D computer games.
"These chips normally sit idle when scientists work," says Patrick McCormick, a LANL researcher. "They have all this processing power but it's just not being used."
McCormick says researchers could use the Scout programming language to simulate various phenomena, such as ocean currents and the formation of galaxies. He adds that performing these calculations on a graphics processor makes it simple to render simulations visually at the same time.
Super-giant star
Researchers at LANL have already tested Scout by modelling a critical moment during a particularly spectacular astronomical event: a “core-collapse supernova”. The simulations ran 12 times faster than they do on a single CPU, McCormick says, primarily because the problem is so well suited to a graphics processor's capabilities.
The researchers simulated the shockwave produced after the core of a super-giant star collapses upon itself. The collapse occurs when a gravitationally unstable iron core has been generated by fusion reactions inside the star. A video produced by LANL shows the Scout code used to model the shockwave, alongside a graphical representation of the process.
To make the technology much more powerful, McCormick is working on a version of Scout that will work when several computers are linked together.
Floating point
Peter Schröder, a computer simulation expert at the California Institute of Technology, believes graphics processors have great potential for scientific research. "There is a real market driving this hardware that we can use for scientific computation," he told New Scientist.
Schröder adds that the approach is particularly well suited to "anything that has high floating-point needs with low communication needs" – in other words intensive mathematical calculations that can be easily split up into individual portions. This is because graphics chips contain many individual processing cores that are suited to performing intensive calculations on their own.
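A concrete picture of what "easily split up into individual portions" means, sketched purely for illustration in present-day CUDA (which postdates this thread; Scout itself compiled down to the shader hardware of the day), with an invented kernel and made-up array sizes: each GPU thread computes one output element and never communicates with its neighbours.
[code]
#include <cstdio>
#include <cuda_runtime.h>

// One thread per output element, no inter-thread communication:
// the "high floating-point needs, low communication needs" shape
// that maps well onto a GPU's many processing cores.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                    // about a million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float)); // unified memory, for brevity
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expect 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
[/code]
Because no thread depends on any other, the same code runs unchanged whether the hardware offers four execution units or four hundred, which is exactly why this class of problem scales so well on graphics chips.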
But some experts say graphics chips’ design means they will not perform as well as CPUs on less specialised tasks. "For general-purpose scientific computing, GPUs have not proven themselves yet," says Jack Dongarra, a supercomputing expert at the University of Tennessee, US.
Computer Science, University of Tennessee
Scientists put graphics card to work.
Moderator: Thanas
- GrandMasterTerwynn
- Emperor's Hand
- Posts: 6787
- Joined: 2002-07-29 06:14pm
- Location: Somewhere on Earth.
Scientists put graphics card to work.
In running simulations.
Tales of the Known Worlds:
2070s - The Seventy-Niners ... 3500s - Fair as Death ... 4900s - Against Improbable Odds V 1.0
Just a case of scientists waking up to the untapped computing power available to them. This really drives it home, though:
But some experts say graphics chips’ design means they will not perform as well as CPUs on less specialised tasks. "For general-purpose scientific computing, GPUs have not proven themselves yet," says Jack Dongarra, a supercomputing expert at the University of Tennessee, US.
Mostly because, Mr Dongarra, GPUs are not MEANT for general-purpose computing; they are MEANT for highly specialized tasks dealing with very pretty and physically correct light, shading and whatnot.
"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
There is also the small matter of the low precision of GPU floating-point operations.
GPUs are designed to crunch numbers where it doesn't matter if the lower (let's say) five significant digits are off. No one cares if a pixel is 0.5 pixels out of alignment.
But most industries interested in high-performance vector computing can't afford that level of error.
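To put rough numbers on that point: a long accumulation in the 32-bit (or narrower) arithmetic GPUs offered at the time quietly loses exactly those low digits. A minimal sketch, written in present-day CUDA purely for illustration and unrelated to Scout, that sums the harmonic series once in 32-bit float on the device and once in 64-bit double on the host:
[code]
#include <cstdio>
#include <cuda_runtime.h>

// Sum the first n terms of the harmonic series in single precision
// on the GPU, using one thread so the accumulation order matches
// the CPU reference loop below.
__global__ void harmonic_float(int n, float *out)
{
    float s = 0.0f;
    for (int i = 1; i <= n; ++i)
        s += 1.0f / i;
    *out = s;
}

int main()
{
    const int n = 10000000;

    float *d_sum;
    cudaMalloc(&d_sum, sizeof(float));
    harmonic_float<<<1, 1>>>(n, d_sum);

    float gpu_sum = 0.0f;
    cudaMemcpy(&gpu_sum, d_sum, sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(d_sum);

    // Double-precision reference on the CPU.
    double cpu_sum = 0.0;
    for (int i = 1; i <= n; ++i)
        cpu_sum += 1.0 / i;

    printf("float  (GPU): %.6f\n", gpu_sum);
    printf("double (CPU): %.6f\n", cpu_sum);
    return 0;
}
[/code]
On typical hardware the single-precision sum stalls at roughly 15.4 once individual terms drop below its rounding threshold, while the double-precision sum reaches about 16.7. That gap is invisible in a pixel colour but fatal to a simulation that accumulates error over millions of time steps.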
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
- Durandal
- Bile-Driven Hate Machine
- Posts: 17927
- Joined: 2002-07-03 06:26pm
- Location: Silicon Valley, CA
- Contact:
GPUs are parallel processing beasts, so the potential for scientific calculation is great. But they don't support double-precision FP at the moment. However, WGF will require integer and DPFP implemented in hardware on the GPU for acceleration, so this situation will change once Longhorn comes around.
At that point, the days of SIMD implementations on the CPU will be numbered. Why waste real estate on the die when you've got a dedicated GPU that sits relatively idle 90% of the time and can do a ton of parallel calculations much faster than any CPU SIMD?
Damien Sorresso
"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
- The Grim Squeaker
- Emperor's Hand
- Posts: 10319
- Joined: 2005-06-01 01:44am
- Location: A different time-space Continuum
- Contact:
Wait, let me get this straight,
Until now, for intensive graphical simulations, they haven't used graphics cards?
Photography
Genius is always allowed some leeway, once the hammer has been pried from its hands and the blood has been cleaned up.
To improve is to change; to be perfect is to change often.
- GrandMasterTerwynn
- Emperor's Hand
- Posts: 6787
- Joined: 2002-07-29 06:14pm
- Location: Somewhere on Earth.
the .303 bookworm wrote: Wait, let me get this straight,
Until now, for intensive graphical simulations, they haven't used graphics cards?
Powerful GPUs of the scale needed for this sort of work have really only come about in the past few years.
Before, you had ordinary video cards, which didn't have much processing power to speak of. Then you had 2D accelerators, which were ordinary video cards that took over more of the drawing work the CPU would otherwise have handled. Then you had 3D accelerators, but they were constrained by small memory sizes and narrow, slow data pipes.
Then you had GPUs sitting on their own dedicated bus (AGP), which still had quite a bit of a data-moving bottleneck. But the GPU on the card got wider and wider internal data pipes (32-bit became 64-bit, which became 128-bit, and is currently 256 bits wide), coupled with blisteringly fast memory at 64 MB, then 128 MB, up to 256 MB.
Now you have PCI Express cards, which don't have the bottleneck issues that AGP does. And future video cards will have greater processing capability (more precision, and so on).
So, again, really, it's only been within the last few years that the GPU had enough kick to it to make something like LANL's Scout package worthwhile.
Tales of the Known Worlds:
2070s - The Seventy-Niners ... 3500s - Fair as Death ... 4900s - Against Improbable Odds V 1.0
- The Grim Squeaker
- Emperor's Hand
- Posts: 10319
- Joined: 2005-06-01 01:44am
- Location: A different time-space Continuum
- Contact:
Huh, thanks.
Researchers at LANL have already tested Scout by modelling a critical moment during a particularly spectacular astronomical event: a “core-collapse supernova”. The simulations ran 12 times faster.
How much time would it have taken without the GPU then? Months? Days?
Photography
Genius is always allowed some leeway, once the hammer has been pried from its hands and the blood has been cleaned up.
To improve is to change; to be perfect is to change often.