The rumors prove true! AMD has purchased ATI for 5.4 billion dollars.
Forbes wrote: LONDON (AFX) - Advanced Micro Devices Inc confirmed it has agreed the acquisition of Canadian graphics chip maker ATI Technologies Inc for 5.4 bln usd in cash and shares.
AMD buys ATI!
Moderator: Thanas
It's real
"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
- Dahak
- Emperor's Hand
- Posts: 7292
- Joined: 2002-10-29 12:08pm
- Location: Admiralty House, Landing, Manticore
- Contact:
Does it mean Intel will have to buy nVidia?
Great Dolphin Conspiracy - Chatter box
"Implications: we have been intercepted deliberately by a means unknown, for a purpose unknown, and transferred to a place unknown by a form of intelligence unknown. Apart from the unknown, everything is obvious." ZORAC
GALE Force Euro Wimp
Human dignity shall be inviolable. To respect and protect it shall be the duty of all state authority.
DesertFly wrote: What does this mean for those of us who are in the market for graphics cards and processors? (Which would be me on both counts.)
Short term, probably nothing, except you probably won't see Crossfire for Intel anymore (I seriously doubt Intel is going to let AMD produce Intel chipsets. EDIT: Or include a competitor's tech in their chipsets. /EDIT). However, this probably won't be an issue, since rumor has it ATI was heading for Crossfire on a single card (think 7950GX2, but for ATI). Nvidia will probably continue to support AMD (it makes no sense for them not to, unless they want to lose a shitload of sales). Nvidia may also get involved in some type of alliance or partnership with Intel.
Long term, we might see socket-based GPUs, perhaps with HT links directly to the CPU and memory (and in a 64-bit system, you could have several gigs of RAM shared between the CPU and GPU). Then hopefully we'll see GPUs become true parallel processors, capable of handling any parallel task (graphics, physics, complex DSP), and maybe even integration into the CPU die (think Cell, but on steroids). This would basically be the prediction Tim Sweeney gave in his NVNews.net interview at E3.
Artillery. Its what's for dinner.
- Durandal
- Bile-Driven Hate Machine
- Posts: 17927
- Joined: 2002-07-03 06:26pm
- Location: Silicon Valley, CA
- Contact:
They'd better have something good planned, because Morgan Stanley had to loan AMD $2.5 billion to complete the acquisition. That means AMD has no cash left and is in the red. Not a good place to be when Intel is pumping out cheaper, faster chips.
Damien Sorresso
"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
- Vohu Manah
- Jedi Knight
- Posts: 775
- Joined: 2004-03-28 07:38am
- Location: Harford County, Maryland
- Contact:
The acquisition may not even occur. ATI shareholders have yet to approve.
“There are two kinds of people in the world: the kind who think it’s perfectly reasonable to strip-search a 13-year-old girl suspected of bringing ibuprofen to school, and the kind who think those people should be kept as far away from children as possible … Sometimes it’s hard to tell the difference between drug warriors and child molesters.” - Jacob Sullum
- Master of Cards
- Jedi Master
- Posts: 1168
- Joined: 2005-03-06 10:54am
- InnocentBystander
- The Russian Circus
- Posts: 3466
- Joined: 2004-04-10 06:05am
- Location: Just across the mighty Hudson
- Master of Ossus
- Darkest Knight
- Posts: 18213
- Joined: 2002-07-11 01:35am
- Location: California
InnocentBystander wrote: So where exactly does that $5.4 billion go, the shareholders?
Right, at a little over $20 per share, and 0.2229 shares of AMD stock per ATI share.
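For a rough sense of how a cash-and-stock price like that adds up to the $5.4 billion headline figure, here is a back-of-the-envelope sketch. The exchange ratio comes from the post above; the cash component, AMD share price, and ATI share count are illustrative assumptions, not figures from the deal announcement.

```python
# Back-of-the-envelope deal math. All inputs except the exchange ratio are
# illustrative assumptions, not actual terms of the AMD/ATI agreement.
cash_per_ati_share = 16.40          # assumed cash paid per ATI share (USD)
amd_shares_per_ati_share = 0.2229   # exchange ratio quoted in the thread
amd_share_price = 18.00             # assumed AMD share price at announcement (USD)
ati_shares_outstanding = 260e6      # assumed ATI shares outstanding

value_per_ati_share = cash_per_ati_share + amd_shares_per_ati_share * amd_share_price
total_deal_value = value_per_ati_share * ati_shares_outstanding

print(f"Implied value per ATI share: ${value_per_ati_share:.2f}")        # ~ $20.41
print(f"Implied total deal value:    ${total_deal_value / 1e9:.2f} bn")  # ~ $5.3 bn
```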
"Sometimes I think you WANT us to fail." "Shut up, just shut up!" -Two Guys from Kabul
Latinum Star Recipient; Hacker's Cross Award Winner
"one soler flar can vapririze the planit or malt the nickl in lass than millasacit" -Bagara1000
"Happiness is just a Flaming Moe away."
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Arrow wrote: Long term, we might see socket-based GPUs, perhaps with HT links directly to the CPU and memory (and in a 64-bit system, you could have several gigs of RAM shared between the CPU and GPU). Then hopefully we'll see GPUs become true parallel processors, capable of handling any parallel task (graphics, physics, complex DSP), and maybe even integration into the CPU die (think Cell, but on steroids). This would basically be the prediction Tim Sweeney gave in his NVNews.net interview at E3.
I thought the biggest thing GPUs had going for them was the fact that they were purpose-built for doing one thing and doing it well: rendering a fuckton of polygons.
In any case, that sounds like a nightmare for the continually-upgrading hobbyist; it's bad enough that going from Intel to AMD or vice versa means a new motherboard (and, depending on what timeframe you're in, new RAM as well), but tying the graphics card to the CPU maker as well? Yikes.
- Durandal
- Bile-Driven Hate Machine
- Posts: 17927
- Joined: 2002-07-03 06:26pm
- Location: Silicon Valley, CA
- Contact:
Uraniun235 wrote: I thought the biggest thing GPUs had going for them was the fact that they were purpose-built for doing one thing and doing it well: rendering a fuckton of polygons.
More than that, GPUs today sit on their own boards and communicate very quickly with their local memory. GDDR3 and GDDR4 are very fast, and the GPU on a video card can talk to that memory much faster than an Intel or AMD CPU can talk to system memory on a general motherboard. Dedicated memory is a good thing.
On the other side of the coin, an on-die GPU would completely remove the need for going over the PCIe bus. Communication between the GPU and CPU would be as fast as communication between the CPU and its cache.
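To put rough numbers on that bandwidth gap, here is a small sketch using the standard peak-bandwidth formula (bus width times effective transfer rate). The bus widths and transfer rates below are period-typical, illustrative values, not the specs of any particular card or platform.

```python
def peak_bandwidth_gbps(bus_width_bits: int, transfer_rate_mtps: float) -> float:
    """Theoretical peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mtps / 1000.0

# Illustrative, period-typical figures (assumptions, not exact product specs):
gpu_local_memory = peak_bandwidth_gbps(bus_width_bits=256, transfer_rate_mtps=1600)  # GDDR3 on a 256-bit bus
system_memory = peak_bandwidth_gbps(bus_width_bits=128, transfer_rate_mtps=800)      # dual-channel DDR2-800

print(f"GPU local memory: ~{gpu_local_memory:.0f} GB/s")  # ~51 GB/s
print(f"System memory:    ~{system_memory:.0f} GB/s")     # ~13 GB/s
```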
Damien Sorresso
"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
An on-die or socketed GPU, or a "Cell on steroids" general-purpose sequential-and-parallel processor, would be cheaper. Making a card costs money. Special-purpose RAM costs money. Fast forward five years: you've got, say, DDR4, and tons of it, dedicated links between the CPU and GPU, perhaps a shared memory controller and shared cache, and a low-overhead API, and you're not going to need a separate card. Ten years out, PCs may very well look like consoles with keyboards and some upgradability.
Would I try it with today's tech? Hell no. But I'd love to see it five to ten years from now.
Artillery. Its what's for dinner.
- Arthur_Tuxedo
- Sith Acolyte
- Posts: 5637
- Joined: 2002-07-23 03:28am
- Location: San Francisco, California
I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
Still, it's entirely possible that they are merging not because they want to somehow combine their products, but to take advantage of economies of scale in chip manufacturing. Inability to match Intel's manufacturing capabilities has long been a problem for AMD, and things are always cheaper to make in larger quantities.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Just because AMD has bought ATI doesn't mean they're going to start producing cards which only work with AMD chips. It would be suicide, and it would give Nvidia a huge boost, since Nvidia would get all of Intel's customers while ATI is not guaranteed to get all of AMD's customer base.
"May God stand between you and harm in all the empty places where you must walk." - Ancient Egyptian Blessing
Ivanova is always right.
I will listen to Ivanova.
I will not ignore Ivanova's recommendations. Ivanova is God.
AND, if this ever happens again, Ivanova will personally rip your lungs out! - Babylon 5 Mantra
There is no "I" in TEAM. There is a ME however.
- Arthur_Tuxedo
- Sith Acolyte
- Posts: 5637
- Joined: 2002-07-23 03:28am
- Location: San Francisco, California
If they had enough of an advantage, they would. Like when ATI had the 9700 Pro and all NVidia had was the GeForce 4 and then the possibly worse FX5800.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
Arthur_Tuxedo wrote: I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
Who says it has to be a tiny co-processor? Looking ahead to 45nm processes, larger dies, stacked dies and the like, a processor could very well have multiple sequential and parallel cores capable of absolutely annihilating anything we have today. Although I admit it is more likely we'll see separate sequential and parallel chips, linked by a dedicated bus and sharing the same pool of (very large) memory.
Arthur_Tuxedo wrote: However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
We'll probably see something like this in a couple of years, perhaps integrated into the motherboard chipset, serving as graphics in a low-end system and a physics processor in a high-end system.
Long term, I still want the true parallel processor, capable of quickly load balancing Direct3D/OpenGL, Havok/DirectPhysics/Ageia and DirectSound/OpenAL. That would fucking rock (10 years from now...).
Arthur_Tuxedo wrote: But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
I don't think AMD will be that stupid, but I'd bet hard cash that a year and a half from now ATI's cards will work better with AMD than with Intel. That could also force Intel into a partnership with Nvidia, which could be interesting.
But long term, I think Nvidia will be the big loser from this. If the predictions I've presented here are true, then Nvidia will need Intel to survive, but Intel doesn't need Nvidia. Intel is making DX10 graphics chips, just not very fast ones, so they do understand graphics; they could probably pick up physics and other parallel tasks very quickly and produce products to meet those needs at all levels. Nvidia doesn't know a lot about processors; they could very well be fucked seven or eight years from now.
Or, we could end up with three x86 platforms - AMD, Intel and Nvidia. That'd be good for the consumer, even though I doubt that will happen.
Artillery. Its what's for dinner.
The really funny thing about all this is that ATI partnered with Intel to support Crossfire on the 975X chipset.
Arrow wrote: Nvidia doesn't know a lot about processors; they could very well be fucked seven or eight years from now.
I don't really understand how the designers of the best (at the moment) graphics cards can "[not] know a lot about processors". Can you elaborate on that, please?
"The surest sign that the world was not created by an omnipotent Being who loves us is that the Earth is not an infinite plane and it does not rain meat."
"Lo, how free the madman is! He can observe beyond mere reality, and cogitates untroubled by the bounds of relevance."
"Lo, how free the madman is! He can observe beyond mere reality, and cogitates untroubled by the bounds of relevance."
Elaro wrote: I don't really understand how the designers of the best (at the moment) graphics cards can "[not] know a lot about processors". Can you elaborate on that, please?
A CPU is quite a different beast from a GPU. Yes, GPUs have parts in common with a CPU, such as math units and memory access, but those parts are designed for parallel, straight-through computation. IIRC, GPUs aren't designed to handle lots of memory writes, out-of-order execution, conditional jumps and many other sequential operations. Now, GPUs are much more CPU-like than they ever were in the past, but asking Nvidia to come up with a CPU that can compete with AMD and Intel in a short time (a few years) is really pushing it.
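A toy sketch of the distinction being drawn here: the first function is the kind of uniform, independent per-element work a GPU of that era was built for, while the second is branchy, serially dependent work that favors an out-of-order CPU. This is a conceptual illustration only, not a model of any particular piece of hardware.

```python
# Conceptual illustration only; not a model of any specific GPU or CPU.

def shade_pixels(pixels, gain):
    """GPU-friendly: the same independent arithmetic applied to every element,
    with no branches and no dependence between iterations."""
    return [min(255, int(p * gain)) for p in pixels]

def running_total(values):
    """CPU-friendly: each step depends on the previous result and takes
    data-dependent branches, which resists wide parallel execution."""
    total = 0
    for v in values:
        if total % 2 == 0:
            total += v
        else:
            total -= v // 2
    return total

print(shade_pixels([10, 100, 200], gain=1.5))  # [15, 150, 255]
print(running_total([3, 7, 2, 9, 4]))          # 9
```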
Artillery. Its what's for dinner.
It seems that Intel has pulled ATI's chipset license.
Apparently that means no new ATI chipsets for Intel processors after the end of the year.
Marcus Aurelius: ...the Swedish S-tank; the exception is made mostly because the Swedes insisted really hard that it is a tank rather than a tank destroyer or assault gun
Ilya Muromets: And now I have this image of a massive, stern-looking Swede staring down a bunch of military nerds. "It's a tank." "Uh, yes Sir. Please don't hurt us."
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Elaro wrote: The really funny thing about all this is that ATI partnered with Intel to support Crossfire on the 975X chipset. I don't really understand how the designers of the best (at the moment) graphics cards can "[not] know a lot about processors". Can you elaborate on that, please?
Intel and AMD have been making the fastest x86 chips in the world for many years now. They have poured many millions (possibly billions?) of dollars into the development of faster processors. They have, between them, tons and tons of direct experience in developing a specific type of processor (not to mention that they produce many other kinds of computer chips as well, especially Intel).
nVidia is small fry compared to them. They do not have the resources to go up against two juggernauts with years of experience competing with each other. Hell, they don't even have their own fabrication facilities; they just design chips and then contract out manufacturing to someone else.
- Arthur_Tuxedo
- Sith Acolyte
- Posts: 5637
- Joined: 2002-07-23 03:28am
- Location: San Francisco, California
Arrow wrote: Who says it has to be a tiny co-processor? Looking ahead to 45nm processes, larger dies, stacked dies and the like, a processor could very well have multiple sequential and parallel cores capable of absolutely annihilating anything we have today. Although I admit it is more likely we'll see separate sequential and parallel chips, linked by a dedicated bus and sharing the same pool of (very large) memory.
I still see this as a solution for budget chips. CPUs and GPUs are fundamentally different. A processor could "annihilate anything we have today" and still be woefully inadequate as a GPU. The trend has been increasing size of the GPU with each generation, not the reverse. While I don't think that will hold out very much longer, I also don't see the reverse happening.
Arrow wrote: We'll probably see something like this in a couple of years, perhaps integrated into the motherboard chipset, serving as graphics in a low-end system and a physics processor in a high-end system.
Now there's an excellent idea.
Arrow wrote: Long term, I still want the true parallel processor, capable of quickly load balancing Direct3D/OpenGL, Havok/DirectPhysics/Ageia and DirectSound/OpenAL. That would fucking rock (10 years from now...).
My first computer had no separate graphics card. My second computer did, but it was just a straight 2D card with no 3D acceleration. My third computer had a 3D accelerator, and each one I've gotten since then has had a bigger graphics card than the last. Now it's true that sound cards have largely been replaced by onboard sound, but that's because sound quality only needs to reach a certain level before people don't care anymore. Today's game graphics do look mighty impressive, but it will be a very long time before they get so good that any further improvement won't be noticeable.
Arrow wrote: I don't think AMD will be that stupid, but I'd bet hard cash that a year and a half from now ATI's cards will work better with AMD than with Intel. That could also force Intel into a partnership with Nvidia, which could be interesting.
That's if there are big gains from having a company that produces high-end CPUs and GPUs, which we don't know at this point. But if so, and NVidia goes under with no one able to step up and compete effectively, the biggest loser is the consumer.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
Arthur, you're raining on my parade. Your views are conservative, and will probably be proven more or less correct, but I still want the one chip that does it all, pumping out insane levels of realism in my games. Oh, and I want the whole computer to clip onto the back of a 30" monitor (so it's upgradable!), running at 15 megapixel resolution. And I want it ten years from now.
Oh well, maybe I'll have it when I retire forty years from now...
Artillery. Its what's for dinner.
- DesertFly
- has been designed to act as a flotation device
- Posts: 1381
- Joined: 2005-10-18 11:35pm
- Location: The Emerald City
Arrow wrote: Arthur, you're raining on my parade. Your views are conservative, and will probably be proven more or less correct, but I still want the one chip that does it all, pumping out insane levels of realism in my games. Oh, and I want the whole computer to clip onto the back of a 30" monitor (so it's upgradable!), running at 15 megapixel resolution. And I want it ten years from now. Oh well, maybe I'll have it when I retire forty years from now...
And now you're being conservative. I fully expect that when we retire in forty years, computers that can generate photorealistic graphics and real-life physics will not only exist, they will be small enough to wear on your belt, or tuck in your pocket, or sew into your underwear. I also imagine that the interface will have gotten much more immersive, perhaps even to the point where things like the Matrix are close to possible.
Proud member of the no sigs club.