AMD buys ATI!
Posted: 2006-07-24 06:40am
by Mr Bean
It's real
Forbes wrote: LONDON (AFX) - Advanced Micro Devices Inc confirmed it has agreed the acquisition of Canadian graphics chip maker ATI Technologies Inc for 5.4 bln usd in cash and shares.
The rumors prove true! AMD has purchased ATI for 5.4 billion dollars.
Posted: 2006-07-24 06:48am
by DesertFly
What does this mean for those of us who are in the market for graphics cards and processors? (Which would be me on both counts.)
Posted: 2006-07-24 07:58am
by Dahak
Does it mean Intel will have to buy nVidia?
Posted: 2006-07-24 08:31am
by Arrow
DesertFly wrote:What does this mean for those of us who are in the market for graphics cards and processors? (Which would be me on both counts.)
Short term, probably nothing, except you probably won't see Crossfire for Intel anymore (I seriously doubt Intel is going to let AMD produce Intel chipsets. EDIT: Or include a competitor's tech in their chipsets. /EDIT). However, this probably won't be an issue, since rumor has it ATI was heading for Crossfire on a card (think 7950GX2 for ATI). Nvidia will probably continue to support AMD (it makes no sense for them not to, unless they want to lose a shitload of sales). Nvidia may also get involved in some type of alliance or partnership with Intel.
Long term, we might see socket-based GPUs, perhaps with HT links directly to the CPU and memory (and in a 64-bit system, you could have several gigs of RAM shared between the CPU and GPU). Then hopefully we'll see GPUs become true parallel processors, capable of handling any parallel task (graphics, physics, complex DSP), and maybe even integration into the CPU die (think Cell, but on steroids). This is basically the prediction Tim Sweeney gave in his NVNews.net interview at E3.
Posted: 2006-07-24 09:21am
by Durandal
They'd better have something good planned, because Morgan Stanley had to loan AMD $2.5 billion to complete the acquisition. That means AMD has no cash left and is in the red. Not a good place to be when Intel is pumping out cheaper, faster chips.
Posted: 2006-07-24 10:29am
by Vohu Manah
The acquisition may not even occur. ATI shareholders have yet to approve.
Posted: 2006-07-24 11:01am
by DaveJB
Not to mention the FTC.
Posted: 2006-07-24 11:23am
by Master of Cards
DaveJB wrote:Not to mention the FTC.
Not in the same market, so not a problem
Posted: 2006-07-24 12:45pm
by InnocentBystander
So where exactly does that $5.4 billion go, the shareholders?
Posted: 2006-07-24 01:10pm
by Master of Ossus
InnocentBystander wrote:So where exactly does that $5.4 billion go, the shareholders?
Right, at a little over $20 per ATI share, in cash plus 0.2229 shares of AMD stock per ATI share.
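To put rough numbers on that, here's a minimal back-of-the-envelope sketch. The cash component and AMD's share price below are assumed, illustrative figures, not terms quoted in the post or the Forbes report:
[code]
# Back-of-the-envelope check on the per-share consideration and the headline deal value.
# The cash component and AMD share price are assumed, illustrative figures.

cash_per_ati_share = 16.40      # assumed cash paid per ATI share
exchange_ratio = 0.2229         # AMD shares issued per ATI share (from the post above)
amd_share_price = 18.25         # assumed AMD trading price around the announcement

per_share_value = cash_per_ati_share + exchange_ratio * amd_share_price
print(f"Implied value per ATI share: ${per_share_value:.2f}")   # "a little over $20"

headline_value = 5.4e9          # the 5.4 bln usd figure from the Forbes report
implied_shares = headline_value / per_share_value
print(f"Implied ATI share count: ~{implied_shares / 1e6:.0f} million")
[/code]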
Posted: 2006-07-24 03:48pm
by Uraniun235
Arrow wrote:
Long term, we might see socket-based GPUs, perhaps with HT links directly to the CPU and memory (and in a 64-bit system, you could have several gigs of RAM shared between the CPU and GPU). Then hopefully we'll see GPUs become true parallel processors, capable of handling any parallel task (graphics, physics, complex DSP), and maybe even integration into the CPU die (think Cell, but on steroids). This is basically the prediction Tim Sweeney gave in his NVNews.net interview at E3.
I thought the biggest thing GPUs had going for them was the fact that they were purpose-built for doing one thing and doing it well: rendering a fuckton of polygons.
In any case, that sounds like a nightmare for the continually upgrading hobbyist; it's bad enough that going from Intel to AMD or vice versa means a new motherboard (and, depending on what timeframe you're in, new RAM as well), but tying the graphics card to the CPU maker as well? Yikes.
Posted: 2006-07-24 03:53pm
by Durandal
Uraniun235 wrote:I thought the biggest thing GPUs had going for them was the fact that they were purpose-built for doing one thing and doing it well: rendering a fuckton of polygons.
More than that, today's GPUs sit on their own boards right next to their memory and communicate with it very quickly. GDDR3 and GDDR4 are very fast, and the GPU on a video card can talk to that memory much faster than an Intel or AMD CPU on a general motherboard can talk to main memory. Dedicated memory is a good thing.
On the other side of the coin, an on-die GPU would completely remove the need for going over the PCIe bus. Communication between the GPU and CPU would be as fast as communication between the CPU and its cache.
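To make that gap concrete, here's a rough, illustrative peak-bandwidth comparison. The 256-bit bus and 1.6 Gbps-per-pin GDDR3 rate are assumed, era-typical numbers rather than specs for any particular card:
[code]
# Rough peak-bandwidth comparison: a video card's local GDDR3 bus vs. the
# PCIe x16 link it would otherwise use to reach system memory.
# Bus width and per-pin data rate are illustrative, era-typical assumptions.

def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak bandwidth = bytes transferred per cycle * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps_per_pin

gddr3_local = peak_bandwidth_gb_s(256, 1.6)   # ~51 GB/s to on-card memory
pcie_x16 = 16 * 0.25                          # PCIe 1.x: ~250 MB/s per lane, per direction

print(f"Local GDDR3: {gddr3_local:.1f} GB/s")
print(f"PCIe x16 link: {pcie_x16:.1f} GB/s (~{gddr3_local / pcie_x16:.0f}x slower)")
[/code]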
Posted: 2006-07-24 04:11pm
by Arrow
An on-die or socketed GPU, or a "Cell on steroids" general-purpose sequential and parallel processor, would be cheaper. Making a card costs money. Special-purpose RAM costs money. Fast forward five years and you've got, say, DDR4, and tons of it, dedicated links between the CPU and GPU, perhaps a shared memory controller and shared cache, and a low-overhead API; you're not going to need a separate card. Ten years out, PCs may very well look like consoles with keyboards, with some upgradability.
Would I try it with today's tech? Hell no. But I'd love to see it five to ten years from now.
Posted: 2006-07-24 04:36pm
by Arthur_Tuxedo
I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
Still, it's entirely possible that they are merging not because they want to somehow combine their products, but to take advantage of economies of scale in chip manufacturing. Inability to match Intel's manufacturing capabilities has long been a problem for AMD, and things are always cheaper to make in larger quantities.
Posted: 2006-07-24 05:06pm
by Uraniun235
Independent physics coprocessors will, in my opinion, probably be made obsolete by the introduction of quad-core CPUs by Intel and AMD.
Posted: 2006-07-24 05:14pm
by Lost Soal
Just because AMD has bought ATI, they're not going to start producing cards which only work with AMD chips. It would be suicide and would give Nvidia a huge boost, since Nvidia would then get all of Intel's customers while ATI is not guaranteed to get all of AMD's customer base.
Posted: 2006-07-24 05:46pm
by Arthur_Tuxedo
If they had enough of an advantage, they would. Like when ATI had the 9700 Pro and all NVidia had was the GeForce 4 and then the possibly worse FX5800.
Posted: 2006-07-24 05:59pm
by Arrow
Arthur_Tuxedo wrote:I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
Who says it has to be a tiny co-processor? Looking ahead to 45nm processes, larger dies, stacked dies and the like, a processor could very well have multiple sequential and parallel cores capable of absolutely annihilating anything we have today. Although I admit it's more likely we'll see separate sequential and parallel chips linked by a dedicated bus, sharing the same pool of (very large) memory.
However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
We'll probably see something like this in a couple of years, perhaps integrated into the motherboard chipset, serving as graphics in a low-end system and a physics processor in a high-end system.
Long term, I still want the true parallel processor, capable of quickly load balancing Direct3D/OpenGL, Havok/DirectPhysics/Aegia and DirectSound/OpenAL. That would fucking rock (10 years from now...).
But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
I don't think AMD will be that stupid, but I'd bet hard cash that a year and a half from now ATI's cards will work better with AMD than with Intel. That could also force Intel into a partnership with Nvidia, which could be interesting.
But long term, I think Nvidia will be the big loser from this. If the predictions I've presented here are true, then Nvidia will need Intel to survive, but Intel doesn't need Nvidia. Intel is making DX10 graphics chips, just not very fast ones, so they do understand graphics, and they could probably get up to speed on physics and other parallel tasks very quickly and produce products to meet those needs at all levels. Nvidia doesn't know a lot about processors, so they could very well be fucked seven or eight years from now.
Or, we could end up with three x86 platforms - AMD, Intel and Nvidia. That'd be good for the consumer, even though I doubt that will happen.
Posted: 2006-07-24 06:33pm
by Elaro
The really funny thing about all this is that ATI partnered with Intel to support Crossfire on the 975X chipset.
Arrow wrote:Nvidia doesn't know a lot about processors, so they could very well be fucked seven or eight years from now.
I don't really understand how the designers of the best (at the moment) graphics cards can "[not] know a lot about processors". Can you elaborate on that, please?
Posted: 2006-07-24 07:26pm
by Arrow
Elaro wrote:I don't really understand how the designers of the best (at the moment) graphics cards can "[not] know a lot about processors". Can you elaborate on that, please?
A CPU is quite a different beast than a GPU. Yes, a GPU has parts in common with a CPU, such as math units and memory access, but they're designed for parallel, straight-through computation. IIRC, GPUs aren't designed to handle lots of memory writes, out-of-order execution, conditional jumps and many other sequential operations. Now, GPUs are much more CPU-like than they ever were in the past, but asking Nvidia to come up with a CPU that can compete with AMD and Intel in a short time (a few years) is really pushing it.
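A toy illustration of that split (my own example, not anything from either company): the first function below is dominated by data-dependent branches and has to run one step at a time, while the second applies the same arithmetic to every element independently, which is the kind of work a GPU's parallel pipelines are built for.
[code]
import numpy as np

def branchy_sequential(values):
    """CPU-style work: each iteration's branch depends on the previous result,
    so the loop can't be spread across many parallel units."""
    total = 0
    for v in values:
        if total % 2 == 0:   # data-dependent conditional jump
            total += v
        else:
            total -= v
    return total

def data_parallel(values):
    """GPU-style work: the same operation on every element, no dependencies,
    so it maps naturally onto hundreds of parallel ALUs."""
    return np.sqrt(values * 2.0 + 1.0)

data = np.arange(1000)
print(branchy_sequential(data))                    # inherently serial
print(data_parallel(data.astype(np.float64))[:5])  # trivially parallel
[/code]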
Posted: 2006-07-24 07:31pm
by atg
It seems that Intel has pulled ATI's chipset license.
Apparently this means no new ATI chipsets for Intel processors after the end of the year.
Posted: 2006-07-24 07:54pm
by Uraniun235
Elaro wrote:The really funny thing about all this is that ATI partnered with Intel to support Crossfire on the 975X chipset.
Arrow wrote:Nvidia doesn't know a lot about processors, so they could very well be fucked seven or eight years from now.
I don't really understand how the designers of the best (at the moment) graphics cards can "[not] know a lot about processors". Can you elaborate on that, please?
Intel and AMD have been making the fastest x86 chips in the world for many years now. They have poured many millions (possibly billions?) of dollars into the development of faster processors. They have, between them, tons and tons of direct experience in developing a specific type of processor (not to mention that they do produce many other kinds of computer chips as well, especially Intel).
nVidia is small fry compared to them. They do not have the resources to go up against two juggernauts who have years of experience competing with each other. Hell, they don't even have their own fabrication facilities; they just design stuff and then contract out construction to someone else.
Posted: 2006-07-24 08:31pm
by Arthur_Tuxedo
Arrow wrote:Arthur_Tuxedo wrote:I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
Who says it has to be a tiny co-processor? Looking ahead to 45nm processes, larger dies, stacked dies and the like, a processor could very well have multiple sequential and parallel cores capable of absolutely annihilating anything we have today. Although I admit it's more likely we'll see separate sequential and parallel chips linked by a dedicated bus, sharing the same pool of (very large) memory.
I still see this as a solution for budget chips. CPUs and GPUs are fundamentally different. A processor could "annihilate anything we have today" and still be woefully inadequate as a GPU. The trend has been toward a bigger GPU with each generation, not the reverse. While I don't think that will hold out much longer, I also don't see the reverse happening.
However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
We'll probably see something like this in a couple of years, perhaps integrated into the motherboard chipset, serving as graphics in a low-end system and a physics processor in a high-end system.
Now there's an excellent idea.
Long term, I still want the true parallel processor, capable of quickly load balancing Direct3D/OpenGL, Havok/DirectPhysics/Aegia and DirectSound/OpenAL. That would fucking rock (10 years from now...).
My first computer had no separate graphics card. My second computer did, but it was just a straight 2D card with no 3D acceleration. My third computer had a 3D accelerator, and each one I've gotten since then has had a bigger graphics card than the last. Now, it's true that sound cards have largely been replaced by onboard sound, but that's because sound quality only needs to reach a certain level before people stop caring. Today's game graphics do look mighty impressive, but it will be a very long time before they get so good that any further improvement won't be noticeable.
But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
I don't think AMD will be that stupid, but I'd bet hard cash that a year and a half from now ATI's cards will work better with AMD than with Intel. That could also force Intel into a partnership with Nvidia, which could be interesting.
But long term, I think Nvidia will be the big loser from this. If the predictions I've presented here are true, then Nvidia will need Intel to survive, but Intel doesn't need Nvidia. Intel is making DX10 graphics chips, just not very fast ones, so they do understand graphics, and they could probably get up to speed on physics and other parallel tasks very quickly and produce products to meet those needs at all levels. Nvidia doesn't know a lot about processors, so they could very well be fucked seven or eight years from now.
Or, we could end up with three x86 platforms - AMD, Intel and Nvidia. That'd be good for the consumer, even though I doubt that will happen.
That's if there are big gains from having a company that produces high-end CPUs and GPUs, which we don't know at this point. But if so, and NVidia goes under with no one able to step up and compete effectively, the biggest loser is the consumer.
Posted: 2006-07-24 08:48pm
by Arrow
Arthur, you're raining on my parade. Your views are conservative, and will probably be proven more or less correct, but I still want the one chip that does it all, pumping out insane levels of realism in my games. Oh, and I want the whole computer to clip onto the back of a 30" monitor (so it's upgradable!), running at 15-megapixel resolution. And I want it ten years from now.
Oh well, maybe I'll have it when I retire forty years from now...
Posted: 2006-07-24 10:54pm
by DesertFly
Arrow wrote:Arthur, you're raining on my parade. Your views are conservative, and will probably be proven more or less correct, but I still want the one chip that does it all, pumping out insane levels of realism in my games. Oh, and I want the whole computer to clip onto the back of a 30" monitor (so it's upgradable!), running at 15-megapixel resolution. And I want it ten years from now.
Oh well, maybe I'll have it when I retire forty years from now...
And now you're being conservative. I fully expect that when we retire in forty years, computers that can generate photorealistic graphics and real-life physics will not only exist, they will be small enough to wear on your belt, or tuck in your pocket, or sew into your underwear. I also imagine that the interface will have gotten much more immersive, perhaps even to the point where things like the Matrix are close to possible.