Posted: 2005-06-15 01:17pm
by Genii Lodus
An ATI Radeon 9700 Mobile with 256MB RAM on my laptop.
An ancient NVidia GeForce2 440MX with 64MB RAM on my PC.
Expect to see a "recommend me a new PC" topic soon.
(First Post!)
Posted: 2005-06-15 01:21pm
by Companion Cube
NVidia GeForce 4 Ti4600. Pretty old but still good for my purposes.
Posted: 2005-06-15 01:34pm
by Comosicus
Radeon 9800SE 256-bit with 128MB RAM on my desktop. Previously I had a Riva Vanta 16MB.
Posted: 2005-06-15 01:46pm
by Dahak
ATI R420.
(You asked for the chip. Anyway, an X800 XT PE.)
Posted: 2005-06-15 01:53pm
by Nephtys
Radeon PCI-E X600 Pro, 128MB.
Posted: 2005-06-15 02:13pm
by Quadlok
Radeon 9600 XT w/128MB.
Posted: 2005-06-15 03:18pm
by SPOOFE
GeForce 4 Ti4600, 128MB. Was gonna upgrade to a GeForce 6800 or something (or a 6600 GT, but whatever). Might wind up waiting to see what the mid-range of the next-gen cards offers...
Posted: 2005-06-15 03:24pm
by Miles Teg
Hmm... A GeForce2 MX AGP 2x 64MB... heh
Doom 3 Low Quality at 1024x768: 6 spf (yes, spf, not fps). I just *had* to see.
However, if I don't get laid off next week, it'll be upgraded (along with my system) to a GF 6600GT PCI-E, or something better if the prices drop a lot when the G70 comes out.
Posted: 2005-06-15 03:37pm
by Alan Bolte
GF 6600GT AGP
Posted: 2005-06-15 07:08pm
by Exonerate
GeForce 6800, 128 MB.
Posted: 2005-06-16 09:26am
by Arrow
GF 6800 GT 256MB (from BFG, so it's overclocked by default). It runs everything beautifully at 1920x1200 or 1600x1200 (damn PunkBuster FOV detector...) with 2/4 AA and 8/16 AF.
I can't wait to see the R600 and the G80.
Posted: 2005-06-16 09:30am
by Ace Pace
Arrow Mk84 wrote:GF 6800 GT 256MB (from BFG, so it's overclocked by default). It runs everything beautifully at 1920x1200 or 1600x1200 (damn PunkBuster FOV detector...) with 2/4 AA and 8/16 AF.
I can't wait to see the R600 and the G80.
Why? Not interested in the G70 and R520?
Posted: 2005-06-16 09:35am
by Soontir C'boath
Behold my elderly card! For I have a GeForce4 MX440!
Posted: 2005-06-16 09:46am
by The Grim Squeaker
Soontir C'boath wrote:Behold my elderly card! For I have a GeForce4 MX440!
Ha, on my old computer it was one generation behind that.
(Can't remember the name; I dumped the whole shebang on my sister once I got my X800 XT Platinum.)
Posted: 2005-06-16 12:09pm
by Ypoknons
Ace Pace wrote:Why? Not interested in the G70 and R520?
Probably because he already has a 6800 and doesn't want to just upgrade to the next generation; it's wiser to wait two generations and then upgrade...
Posted: 2005-06-16 12:17pm
by Arrow
Ypoknons wrote:Ace Pace wrote:Why? Not interested in the G70 and R520?
Probably because he already has a 6800 and doesn't want to just upgrade to the next generation; it's wiser to wait two generations and then upgrade...
Exactly. And none of the rumors surrounding the G70 and R520 have me excited (now, if Xenos were coming to the PC, then I'd be excited).
Posted: 2005-06-16 03:16pm
by Ace Pace
Xenos won't be that effective on the PC, not enough bandwidth.
And I thought you weren't interested from a tech perspective. I'm holding a 6800 GT PCI-E; I have no reason to buy a new card.
Posted: 2005-06-16 03:59pm
by Arrow
Ace Pace wrote:Xenos won't be that effective on the PC, not enough bandwidth.
I just want the eDRAM and "free" AA. I wonder what the technical reasons for not using eDRAM are? I seem to remember one of them being multiple resolutions, vs. the three resolutions supported on the 360. Is it because of implementation issues with supporting everything from 800x600 to 2100+xWhatever? Or is it just a pricing issue, where the costs grow at an extreme rate to support additional resolutions?
Unified shaders would be good too (looks like that will be in the R600).
Oh, and from a tech perspective, it doesn't sound like there's a lot of new tech in either chip. But I'd be glad to be proven wrong on that point.
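Quick back-of-the-envelope on the eDRAM question, since the numbers make the resolution problem obvious. This is a sketch under my own assumptions (4 bytes of color plus 4 bytes of Z/stencil per sample, and the commonly reported 10MB eDRAM pool on Xenos), not anything from ATI:
Code:
# Rough framebuffer sizing vs. a 10MB eDRAM pool.
# Assumes 32-bit color + 32-bit Z/stencil per sample (my assumption).
EDRAM_BYTES = 10 * 1024 * 1024
BYTES_PER_SAMPLE = 4 + 4

def framebuffer_bytes(width, height, aa_samples):
    return width * height * aa_samples * BYTES_PER_SAMPLE

for w, h in [(800, 600), (1024, 768), (1280, 720), (1600, 1200), (1920, 1200)]:
    for aa in (1, 4):
        size = framebuffer_bytes(w, h, aa)
        verdict = "fits" if size <= EDRAM_BYTES else "needs tiling"
        print(f"{w}x{h} @ {aa}xAA: {size / 2**20:5.1f} MB -> {verdict}")
If those assumptions are anywhere near right, even 1280x720 with 4xAA blows way past 10MB, so the "free" AA already depends on tiling the frame; a PC part would have to make that tiling work across every resolution instead of just three.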
Posted: 2005-06-16 04:03pm
by Ace Pace
I'm assuming it's the differing resolutions; I'm assuming it adds a lot to add the different 'styles' of doing the AA.
And it's not that big an improvement, just the FINAL evolution of the R300. It really is an amazing chip.
Now the G70 looks sweet.
Posted: 2005-06-16 04:18pm
by Pu-239
Soontir C'boath wrote:Behold my elderly card! For I have a GeForce4 MX440!
Bah! My TNT2 pwns all!!
Posted: 2005-06-18 11:08am
by Ace Pace
Arrow Mk84 wrote:Ace Pace wrote:Xenos won't be that effective on the PC, not enough bandwidth.
I just want the eDRAM and "free" AA. I wonder what the technical reasons for not using eDRAM are? I seem to remember one of them being multiple resolutions, vs. the three resolutions supported on the 360. Is it because of implementation issues with supporting everything from 800x600 to 2100+xWhatever? Or is it just a pricing issue, where the costs grow at an extreme rate to support additional resolutions?
Unified shaders would be good too (looks like that will be in the R600).
Oh, and from a tech perspective, it doesn't sound like there's a lot of new tech in either chip. But I'd be glad to be proven wrong on that point.
BUMP!
Update from Beyond3D.com:
Update: ATI have clarified the daughter eDRAM/sample-logic die count to be 105M transistors.
Now that's for three resolutions. I don't think it would be that complex to add more resolutions, but the transistor count can easily spiral, and ATI is reportedly having serious yield issues with the R520.
Posted: 2005-06-18 01:41pm
by Arrow
Well, the max resolution supported by the chip should dictate the transistor count. I'm thinking there's some limitation in the current implementation that prevents the logic from working on, say, a dozen resolutions, as opposed to just three.
Posted: 2005-06-18 01:46pm
by Ace Pace
Arrow Mk84 wrote:Well, the max resolution supported by the chip should dictate the transistor count. I'm thinking there's some limitation in the current implementation that prevents the logic from working on, say, a dozen resolutions, as opposed to just three.
Or maybe adding each resolution needs more logic, since it appears to be three distinct modes, not just a scale-up.
Posted: 2005-06-18 01:59pm
by Arrow
That shouldn't be the case. You need enough logic to handle the max resolution you're supporting. After that, it should be a matter of allocating the logic for a specific resolution (say your max res is 1920x1200 but you're running at 1024x768: you'd only use enough logic to do the 1024x768 rendering, and you disable the rest, because it's not needed). But some designs do require an all-or-nothing approach, which would require each res to have its own logic.
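To put the two designs side by side, here's a toy sketch. Pure illustration under my own assumptions about how the sample logic could be organized (including the particular trio of modes); it has nothing to do with ATI's actual silicon:
Code:
# Toy model of the two approaches (assumed, not ATI's design).
# Gated design: logic is sized once, for the max resolution.
MAX_W, MAX_H = 1920, 1200

def gated_active_fraction(w, h):
    # Enable only the slice of the array the current resolution needs;
    # the unused remainder is simply switched off.
    assert w <= MAX_W and h <= MAX_H
    return (w * h) / (MAX_W * MAX_H)

# All-or-nothing design: one hard-wired block per supported mode,
# so transistor cost grows with the number of modes.
HARDWIRED_MODES = [(640, 480), (1280, 720), (1920, 1080)]

print(f"Gated design at 1024x768: {gated_active_fraction(1024, 768):.0%} of the array active")
print(f"All-or-nothing design: {len(HARDWIRED_MODES)} separate blocks, one per mode")
In the first model the max resolution dictates the transistor budget; in the second, every extra mode costs its own logic, which would explain the count spiraling the way the Beyond3D numbers suggest.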
Posted: 2005-06-18 11:23pm
by EmperorMing
Three systems, all AGP:
Leadtek 6600GT
Leadtek GeForce2 GTS
Abit Siluro Ti4200 VIVO