AMD buys ATI!

GEC: Discuss gaming, computers and electronics and venture into the bizarre world of STGODs.

Moderator: Thanas

User avatar
Mr Bean
Lord of Irony
Posts: 22464
Joined: 2002-07-04 08:36am

AMD buys ATI!

Post by Mr Bean »

It's real
Forbes wrote: LONDON (AFX) - Advanced Micro Devices Inc confirmed it has agreed the acquisition of Canadian graphics chip maker ATI Technologies Inc for 5.4 bln usd in cash and shares.
The rumors prove true! AMD has purchased ATI for 5.4 billion dollars.

"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
User avatar
DesertFly
has been designed to act as a flotation device
Posts: 1381
Joined: 2005-10-18 11:35pm
Location: The Emerald City

Post by DesertFly »

What does this mean for those of us who are in the market for graphics cards and processors? (Which would be me on both counts.)
Proud member of the no sigs club.
User avatar
Dahak
Emperor's Hand
Posts: 7292
Joined: 2002-10-29 12:08pm
Location: Admiralty House, Landing, Manticore
Contact:

Post by Dahak »

Does it mean Intel will have to buy nVidia? :)
Great Dolphin Conspiracy - Chatter box
"Implications: we have been intercepted deliberately by a means unknown, for a purpose unknown, and transferred to a place unknown by a form of intelligence unknown. Apart from the unknown, everything is obvious." ZORAC
GALE Force Euro Wimp
Human dignity shall be inviolable. To respect and protect it shall be the duty of all state authority.
User avatar
Arrow
Jedi Council Member
Posts: 2283
Joined: 2003-01-12 09:14pm

Post by Arrow »

DesertFly wrote:What does this mean for those of us who are in the market for graphics cards and processors? (Which would be me on both counts.)
Short term, probably nothing, except you probably won't see Crossfire for Intel anymore (I seriously doubt Intel is going to let AMD produce Intel chipsets. EDIT: Or include a competitor's tech in their chipsets. /EDIT). However, this probably won't be an issue, since rumor has it ATI was heading for Crossfire on a card (think 7950GX2 for ATI). Nvidia will probably continue to support AMD (it makes no sense for them not to, unless they want to lose a shitload of sales). Nvidia may also get involved in some type of alliance or partnership with Intel.

Long term, we might see socket-based GPUs, perhaps with HT links directly to the CPU and memory (and in a 64-bit system, you could have several gigs of RAM shared between the CPU and GPU). Then hopefully we'll see GPUs become true parallel processors, capable of handling any parallel task (graphics, physics, complex DSP), and maybe even integration into the CPU die (think Cell, but on steroids). This would basically be the prediction Tim Sweeney gave in his NVNews.net interview at E3.
Artillery. Its what's for dinner.
User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

They'd better have something good planned, because Morgan Stanley had to loan AMD $2.5 billion to complete the acquisition. That means AMD has no cash left and is in the red. Not a good place to be when Intel is pumping out cheaper, faster chips.
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
User avatar
Vohu Manah
Jedi Knight
Posts: 775
Joined: 2004-03-28 07:38am
Location: Harford County, Maryland
Contact:

Post by Vohu Manah »

The acquisition may not even occur. ATI shareholders have yet to approve.
"There are two kinds of people in the world: the kind who think it's perfectly reasonable to strip-search a 13-year-old girl suspected of bringing ibuprofen to school, and the kind who think those people should be kept as far away from children as possible … Sometimes it's hard to tell the difference between drug warriors and child molesters." - Jacob Sullum
User avatar
DaveJB
Jedi Council Member
Posts: 1917
Joined: 2003-10-06 05:37pm
Location: Leeds, UK

Post by DaveJB »

Not to mention the FTC.
User avatar
Master of Cards
Jedi Master
Posts: 1168
Joined: 2005-03-06 10:54am

Post by Master of Cards »

DaveJB wrote:Not to mention the FTC.
Not in the same market, so not a problem
User avatar
InnocentBystander
The Russian Circus
Posts: 3466
Joined: 2004-04-10 06:05am
Location: Just across the mighty Hudson

Post by InnocentBystander »

So where exactly does that $5.4 billion go, the shareholders?
User avatar
Master of Ossus
Darkest Knight
Posts: 18213
Joined: 2002-07-11 01:35am
Location: California

Post by Master of Ossus »

InnocentBystander wrote:So where exactly does that $5.4 billion go, the shareholders?
Right: a little over $20 per ATI share, paid partly in cash and partly as 0.2229 shares of AMD stock per ATI share.
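For a rough sanity check on the headline figure, here's a back-of-the-envelope sketch in Python. The share count, cash component and AMD share price are my own ballpark assumptions, not numbers from the announcement; only the 0.2229 exchange ratio comes from the deal terms above.

# Back-of-the-envelope: how a ~$5.4B price tag breaks down per ATI share.
# Assumed inputs: ~260 million ATI shares outstanding, ~$16.40 cash per share,
# AMD stock near $18 at announcement. Only the 0.2229 ratio is from the deal terms.
ati_shares = 260e6        # assumed
cash_per_share = 16.40    # assumed
amd_price = 18.00         # assumed
exchange_ratio = 0.2229   # from the deal terms

per_share = cash_per_share + exchange_ratio * amd_price
total = per_share * ati_shares

print(f"Per ATI share: ~${per_share:.2f}")            # roughly $20.41
print(f"Implied total: ~${total / 1e9:.1f} billion")  # roughly $5.3 billion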
"Sometimes I think you WANT us to fail." "Shut up, just shut up!" -Two Guys from Kabul

Latinum Star Recipient; Hacker's Cross Award Winner

"one soler flar can vapririze the planit or malt the nickl in lass than millasacit" -Bagara1000

"Happiness is just a Flaming Moe away."
User avatar
Uraniun235
Emperor's Hand
Posts: 13772
Joined: 2002-09-12 12:47am
Location: OREGON
Contact:

Post by Uraniun235 »

Arrow wrote: Long term, we might see socket-based GPUs, perhaps with HT links directly to the CPU and memory (and in a 64-bit system, you could have several gigs of RAM shared between the CPU and GPU). Then hopefully we'll see GPUs become true parallel processors, capable of handling any parallel task (graphics, physics, complex DSP), and maybe even integration into the CPU die (think Cell, but on steroids). This would basically be the prediction Tim Sweeney gave in his NVNews.net interview at E3.
I thought the biggest thing GPUs had going for them was the fact that they were purpose-built for doing one thing and doing it well: rendering a fuckton of polygons.

In any case, that sounds like a nightmare for the continually-upgrading hobbyist; it's bad enough that going from Intel to AMD or vice versa means a new motherboard (and, depending on what timeframe you're in, new RAM as well), but tying the graphics card to the CPU maker as well? Yikes.
User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

Uraniun235 wrote:I thought the biggest thing GPUs had going for them was the fact that they were purpose-built for doing one thing and doing it well: rendering a fuckton of polygons.
More than that, GPUs today sit on their own boards and communicate very quickly with their memory. GDDR3 and GDDR4 are very fast, and the GPU on a video card can talk to that memory much faster than your Intel/AMD CPU can talk to system RAM on a general motherboard. Dedicated memory is a good thing.

On the other side of the coin, an on-die GPU would completely remove the need for going over the PCIe bus. Communication between the GPU and CPU would be as fast as communication between the CPU and its cache.
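To put rough numbers on that, here's a quick Python comparison. The bandwidth figures are ballpark assumptions for typical 2006-era parts (a 256-bit GDDR3 card, dual-channel DDR2-800, and a PCIe 1.x x16 slot), not specs for any particular product.

# Rough bandwidth comparison: dedicated card memory vs. system RAM vs. the PCIe link.
# All figures are ballpark assumptions for typical mid-2006 hardware.
def bandwidth_gb_per_s(bus_width_bits, transfers_per_s):
    return bus_width_bits / 8 * transfers_per_s / 1e9

gddr3_on_card = bandwidth_gb_per_s(256, 1.55e9)  # 256-bit bus at ~1550 MT/s -> ~49.6 GB/s
system_ddr2 = bandwidth_gb_per_s(128, 0.8e9)     # dual-channel DDR2-800     -> ~12.8 GB/s
pcie_x16 = 4.0                                   # PCIe 1.x x16, ~4 GB/s per direction

print(f"GDDR3 on the card:     ~{gddr3_on_card:.1f} GB/s")
print(f"Dual-channel DDR2-800: ~{system_ddr2:.1f} GB/s")
print(f"PCIe x16 link:         ~{pcie_x16:.1f} GB/s each way")

Which is the trade-off in a nutshell: pulling the GPU on-die removes the PCIe hop, but unless the memory system changes too, it swaps tens of GB/s of dedicated GDDR for a share of a much slower system memory bus.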
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
User avatar
Arrow
Jedi Council Member
Posts: 2283
Joined: 2003-01-12 09:14pm

Post by Arrow »

An on-die or socketed GPU, or a "Cell on steroids" general-purpose sequential and parallel processor, would be cheaper. Making a card costs money. Special-purpose RAM costs money. Fast forward five years: you've got, say, DDR4, and tons of it, dedicated links between the CPU and GPU, perhaps a shared memory controller and shared cache, combined with a low-overhead API, and you're not going to need a separate card. Ten years out, PCs may very well look like consoles with keyboards and some upgradability.

Would I try it with today's tech? Hell no. But I'd love to see it five to ten years from now.
Artillery. Its what's for dinner.
User avatar
Arthur_Tuxedo
Sith Acolyte
Posts: 5637
Joined: 2002-07-23 03:28am
Location: San Francisco, California

Post by Arthur_Tuxedo »

I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.

However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.

But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.

Still, it's entirely possible that they are merging not because they want to somehow combine their products, but to take advantage of economies of scale in chip manufacturing. Inability to match Intel's manufacturing capabilities has long been a problem for AMD, and things are always cheaper to make in larger quantities.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali

"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
User avatar
Uraniun235
Emperor's Hand
Posts: 13772
Joined: 2002-09-12 12:47am
Location: OREGON
Contact:

Post by Uraniun235 »

Independent physics coprocessors will, in my opinion, probably be made obsolete by the introduction of quad-core CPUs by Intel and AMD.
User avatar
Lost Soal
Sith Devotee
Posts: 2618
Joined: 2002-10-22 06:25am
Location: Back in Newcastle.

Post by Lost Soal »

Just because AMD has bought ATI, they're not going to start producing cards which only work with AMD chips. It would be suicide and give Nvidia a huge boost, since they would get all of Intel's customers while ATI is not guaranteed to get all of AMD's customer base.
"May God stand between you and harm in all the empty places where you must walk." - Ancient Egyptian Blessing

Ivanova is always right.
I will listen to Ivanova.
I will not ignore Ivanova's recommendations. Ivanova is God.
AND, if this ever happens again, Ivanova will personally rip your lungs out! - Babylon 5 Mantra

There is no "I" in TEAM. There is a ME however.
User avatar
Arthur_Tuxedo
Sith Acolyte
Posts: 5637
Joined: 2002-07-23 03:28am
Location: San Francisco, California

Post by Arthur_Tuxedo »

If they had enough of an advantage, they would. Like when ATI had the 9700 Pro and all NVidia had was the GeForce 4 and then the possibly worse FX5800.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali

"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
User avatar
Arrow
Jedi Council Member
Posts: 2283
Joined: 2003-01-12 09:14pm

Post by Arrow »

Arthur_Tuxedo wrote:I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
Who says it has to be a tiny co-processor? Looking ahead to 45nm processes, larger dies, stacked dies and the like, a processor could very well have multiple sequential and parallel cores capable of absolutely annihilating anything we have today. Although I admit it is more likely we'll see separate sequential and parallel chips linked by a dedicated bus, sharing the same pool of (very large) memory.
However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
We'll probably see something like this in a couple of years, perhaps integrated into the motherboard chipset, serving as graphics in a low-end system and a physics processor in a high-end system.

Long term, I still want the true parallel processor, capable of quickly load-balancing Direct3D/OpenGL, Havok/DirectPhysics/Ageia and DirectSound/OpenAL. That would fucking rock (10 years from now...).
But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
I don't think AMD will be that stupid, but I'd bet hard cash that a year and a half from now ATI's cards work better with AMD than Intel. That could also force Intel into a partnership with Nvidia, which could be interesting.

But long term, I think Nvidia will be the big loser from this. If the predictions I've presented here are true, then Nvidia will need Intel to survive, but Intel doesn't need Nvidia. Intel is making DX10 graphics chips, just not very fast ones, so they do understand graphics, and they could probably understand physics and other parallel tasks very quickly, and produce products to meet those needs at all levels. Nvidia doesn't know a lot about processors, they could very well be fucked seven or eight years from now.

Or, we could end up with three x86 platforms - AMD, Intel and Nvidia. That'd be good for the consumer, even though I doubt that will happen.
Artillery. Its what's for dinner.
User avatar
Elaro
Padawan Learner
Posts: 493
Joined: 2006-06-03 12:34pm
Location: Reality, apparently

Post by Elaro »

The really funny thing about all this is that ATI partnered with Intel to support Crossfire on the 975X chipset.

Arrow wrote:Nvidia doesn't know a lot about processors, they could very well be fucked seven or eight years from now.

I don't really understand how the designers of the best (at the moment) graphic cards can "[not] know a lot about processors". Can you elaborate on that, please?
"The surest sign that the world was not created by an omnipotent Being who loves us is that the Earth is not an infinite plane and it does not rain meat."

"Lo, how free the madman is! He can observe beyond mere reality, and cogitates untroubled by the bounds of relevance."
User avatar
Arrow
Jedi Council Member
Posts: 2283
Joined: 2003-01-12 09:14pm

Post by Arrow »

Elaro wrote:I don't really understand how the designers of the best (at the moment) graphic cards can "[not] know a lot about processors". Can you elaborate on that, please?
A CPU is quite a different beast from a GPU. Yes, a GPU has parts in common with a CPU, such as math units and memory access, but they're designed for parallel, straight-through computation. IIRC, GPUs aren't designed to handle lots of memory writes, out-of-order execution, conditional jumps and many other sequential operations. Now, GPUs are much more CPU-like than they ever were in the past, but asking Nvidia to come up with a CPU that can compete with AMD and Intel in a short time (a few years) is really pushing it.
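A toy illustration of the difference, in Python (my own example, not anything from an actual GPU or driver): a GPU-style workload applies the same math to every element independently, while typical CPU code is full of data-dependent branches and serial dependencies that can't be spread across thousands of little cores.

# GPU-friendly: the same arithmetic on every pixel, with no dependence between
# iterations, so thousands of elements could be processed in parallel.
def shade(pixels, gain):
    return [min(255, int(p * gain)) for p in pixels]  # order doesn't matter

# CPU-friendly: each step depends on the previous result and on a conditional
# jump, so it has to run one iteration at a time, in order.
def compound(balance, rates):
    for r in rates:
        if balance > 1000:        # data-dependent branch
            balance *= (1 + r)
        else:
            balance += 50         # result feeds the next iteration
    return balance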
Artillery. Its what's for dinner.
User avatar
atg
Jedi Master
Posts: 1418
Joined: 2005-04-20 09:23pm
Location: Adelaide, Australia

Post by atg »

It seems that Intel has pulled ATI's chipset license.

Apparently this means no new ATI chipsets for Intel processors after the end of the year.
Marcus Aurelius: ...the Swedish S-tank; the exception is made mostly because the Swedes insisted really hard that it is a tank rather than a tank destroyer or assault gun
Ilya Muromets: And now I have this image of a massive, stern-looking Swede staring down a bunch of military nerds. "It's a tank." "Uh, yes Sir. Please don't hurt us."
User avatar
Uraniun235
Emperor's Hand
Posts: 13772
Joined: 2002-09-12 12:47am
Location: OREGON
Contact:

Post by Uraniun235 »

Elaro wrote:The really funny thing about all this is that ATI partnered with Intel to support Crossfire on the 975X chipset.
Arrow wrote:Nvidia doesn't know a lot about processors, they could very well be fucked seven or eight years from now.
I don't really understand how the designers of the best (at the moment) graphic cards can "[not] know a lot about processors". Can you elaborate on that, please?
Intel and AMD have been making the fastest x86 chips in the world for many years now. They have poured many millions (possibly billions?) of dollars into the development of faster processors. They have, between them, tons and tons of direct experience in developing a specific type of processor (not to mention that they do produce many other kinds of computer chips as well, especially Intel).

nVidia is small fry compared to them. They do not have the resources to go up against two juggernauts who have years of experience competing with each other. Hell, they don't even have their own fabrication facilities; they just design stuff and then contract out construction to someone else.
User avatar
Arthur_Tuxedo
Sith Acolyte
Posts: 5637
Joined: 2002-07-23 03:28am
Location: San Francisco, California

Post by Arthur_Tuxedo »

Arrow wrote:
Arthur_Tuxedo wrote:I think the on-die GPU is pretty much geared toward budget PCs, if it's feasible at all. No matter how advanced technology gets, a big video card will always be able to do much more than a tiny co-processor.
Who says it has to be a tiny co-processor? Looking ahead to 45nm processes, larger dies, stacked dies and the like, a processor could very well have multiple sequential and parallel cores capable of absolutely annihilating anything we have today. Although I admit it is more likely we'll see separate sequential and parallel chips linked by a dedicated bus, sharing the same pool of (very large) memory.
I still see this as a solution for budget chips. CPUs and GPUs are fundamentally different. A processor could "annihilate anything we have today" and still be woefully inadequate as a GPU. The trend has been increasing size of the GPU with each generation, not the reverse. While I don't think that will hold out very much longer, I also don't see the reverse happening.
However, here's one thing that jumped into my mind: A physics co-processor. ATI's been working on physics technology through the GPU, after all, and seems to be ahead of all the other companies in that regard.
We'll probably see something like this in a couple of years, perhaps integrated into the motherboard chipset, serving as graphics in a low-end system and a physics processor in a high-end system.
Now there's an excellent idea.
Long term, I still want the true parallel processor, capable of quickly load-balancing Direct3D/OpenGL, Havok/DirectPhysics/Ageia and DirectSound/OpenAL. That would fucking rock (10 years from now...).
My first computer had no separate graphics card. My second computer did, but it was just a straight 2D card with no 3D acceleration. My third computer had a 3D accelerator, and each one I've gotten since then has had a bigger graphics card than the last. Now it's true that sound cards have largely been replaced by onboard sound, but that's because sound quality only needs to reach a certain level before people stop caring. Today's game graphics do look mighty impressive, but it will be a very long time before they get so good that any further improvement won't be noticeable.
But whatever comes of this, there are definite dangers. If they come out with something so ingenious that it makes all competitors obsolete, that's a very bad thing for consumers in the long run. There's also a very real danger of foreclosure. For instance, ATI comes out with a kickass video card that's much better than the current offering from NVidia, but you need an AMD CPU or it won't work. These are not good things for consumers.
I don't think AMD will be that stupid, but I'd bet hard cash that a year and a half from now ATI's cards work better with AMD than Intel. That could also force Intel into a partnership with Nvidia, which could be interesting.

But long term, I think Nvidia will be the big loser from this. If the predictions I've presented here are true, then Nvidia will need Intel to survive, but Intel doesn't need Nvidia. Intel is making DX10 graphics chips, just not very fast ones, so they do understand graphics, and they could probably understand physics and other parallel tasks very quickly, and produce products to meet those needs at all levels. Nvidia doesn't know a lot about processors, they could very well be fucked seven or eight years from now.

Or, we could end up with three x86 platforms - AMD, Intel and Nvidia. That'd be good for the consumer, even though I doubt that will happen.
That's if there are big gains from having a company that produces high-end CPU's and GPU's, which we don't know at this point. But if so, and NVidia goes under with no one able to step up and compete effectively, the biggest loser is the consumer.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali

"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
User avatar
Arrow
Jedi Council Member
Posts: 2283
Joined: 2003-01-12 09:14pm

Post by Arrow »

Arthur, you're raining on my parade. Your views are conservative, and will probably be proven more or less correct, but I still want the one chip that does it all, pumping out insane levels of realism in my games. Oh, and I want the whole computer to clip onto the back of a 30" monitor (so it's upgradable!), running at 15-megapixel resolution. And I want it ten years from now.

Oh well, maybe I'll have it when I retire forty years from now...
Artillery. Its what's for dinner.
User avatar
DesertFly
has been designed to act as a flotation device
Posts: 1381
Joined: 2005-10-18 11:35pm
Location: The Emerald City

Post by DesertFly »

Arrow wrote:Arthur, you're raining on my parade. Your views are conservative, and will probably be proven more or less correct, but I still want the one chip that does it all, pumping out insane levels of realism in my games. Oh, and I want the whole computer to clip onto the back of a 30" monitor (so it's upgradable!), running at 15-megapixel resolution. And I want it ten years from now.

Oh well, maybe I'll have it when I retire forty years from now...
And now you're being conservative. I fully expect that when we retire in forty years, computers that can generate photorealistic graphics and real-life physics will not only exist, they will be small enough to wear on your belt, or tuck in your pocket, or sew into your underwear. I also imagine that the interface will have gotten much more immersive, perhaps even to the point where things like the Matrix are close to possible.
Proud member of the no sigs club.