Topic: That didn't take long... ATI announces new chip (Read 6276 times)
Venkman
Terracotta Army
Posts: 11536
http://biz.yahoo.com/bw/040504/45283_1.html

"ATI has used the most advanced technologies available to design and manufacture its new RADEON X800 graphics chips. Using the successful low-k .13 micron semiconductor process from Taiwan Semiconductor Manufacturing Co., Ltd., ATI created a scalable, highly parallel and powerful visual processor that is twice as fast as today's performance leader, the RADEON(TM) 9800 XT. The advanced fabrication process and leading-edge GDDR3 memory have also enabled ATI and its board partners to deliver high-performance, user-friendly graphics card solutions which will work in customers' existing PCs without requiring radical and costly power supply or cooling upgrades."

I have no idea how good it actually is.
cevik
I'm Special
Posts: 1690
I've always wondered about the All Black People Eat Watermelons
The above space is available for purchase. Send a Private Message for a complete price list and payment information. Thank you for your business.
Fabricated
Moderator
Posts: 8978
~Living the Dream~
Short summary:
The new Radeon is a bit slower than the new GeForce 6800 in OpenGL, but faster across the board with AA and aniso on, and decidedly faster in DirectX games (check out the Far Cry scores).
It also doesn't have a disgustingly huge and noisy heatsink/fan combo, nor does it take 100+W of power and 2 dedicated molex connectors.
Thank you, ATi. Now just work on those drivers and Nvidia will be the next 3dfx.
"The world is populated in the main by people who should not exist." - George Bernard Shaw
schild
Administrator
Posts: 60345
3DFx makes some great professional chips. I still drool over some of them.
Alluvian
Terracotta Army
Posts: 1205
I find it pretty shitty that in the conclusions they talk about some barely noticeable differences in Far Cry rendering (invalid anyway, since they're comparing images at two different resolutions) and completely fucking ignore that the new ATI cards failed to show ANY shadows at all in whole scenes of Splinter Cell. The screenshots there looked like shit on the ATIs compared to what they should have looked like. They went from a dark, shadowy forest to an appearance of high noon, in a game where shadows are critical.
Overall the ATI cards still seem to be doing a better job with their hardware, but their driver engineers are still having issues (I'd suspect the Splinter Cell problems are bad drivers).
I just don't trust ATI yet, but Nvidia is not making hardware that really wows me. Damnit, why won't one of them buy the other out and have them merge all their strengths?
Glamdring
Terracotta Army
Posts: 139
"I just don't trust ATI yet, but Nvidia is not making hardware that really wows me. Damnit why won't one of them buy the other out and have them merge all their strengths?"

Nvidia has just been pissing me off with their hardware designs in general, from jet-engine-sounding fans to the massive power hog that is the 6800. Do they really think we want this crap? I thought technological advancement was supposed to help parts get smaller/quieter, but it seems that Nvidia is hell-bent on going off in some other direction.
HaemishM
Staff Emeritus
Posts: 42632
the Confederate flag underneath the stone in my class ring
I think Nvidia has gotten too focused on maintaining benchmark speed while releasing at the same time or earlier than ATI. Competition is great, until the only goal in your design is to match the competition despite any serious shortcuts you have to take to achieve that. If ATI can release the X800 XT without needing a ginormous fan strapped to the card that requires 2 power connectors, why can't Nvidia? It just seems like those connections and fans are going to cost something awful.
Sky
Terracotta Army
Posts: 32117
I love my TV an' hug my TV an' call it 'George'.
*cough 3dfx cough*
*hack voodoo6 hack*
Alluvian
Terracotta Army
Posts: 1205
I have not looked at the details on the new cards, but in some ways the FX series was more powerful than the 9800 series, just not in ways that matter for gaming. The FX ran at a full 32 bits of precision, which turned out not to be necessary and crippled it vs. the 24 bits of the 9800. You could not see the extra precision, but you could sure see the extra fps of the 9800.
It would not surprise me if the GeForce 6 falls into this same trap by doing more work to get the same result. It seems like they are just trying to use brute force to slam out fps with a hammer.
Sairon
Terracotta Army
Posts: 866
I'm running out of hope for Nvidia; shitty performance per buck. I think it's pretty damn obvious which card is THE card to get, just as it was with the last generation of cards. Of course there are still some Nvidia fanbois talking about how great and powerful their cards are. The expensive Nvidia card supports 3.0 shaders, but who fucking cares? By the time we see them used, you'll have to turn them off anyway because of crappy performance.
Alluvian
Terracotta Army
Posts: 1205
Yeah, I am not really a fanboi, but I like stereoscopic 3D gaming, and ATI still doesn't have that.
Shrike91
Guest
"Damnit why won't one of them buy the other out and have them merge all their strengths?"

Ack! We don't need fewer serious, well-funded companies producing GPUs. We need more.
Glamdring
Terracotta Army
Posts: 139
Well, www.nv40.com is reporting that Nvidia might not be stupid after all...

"Graphics company ready to embrace friendlier, less-expensive 350W power supply recommendation for upcoming GeForce 6800 Ultra GPU.

GameSpot: What are the reasons behind the 480W power supply recommendation for the GeForce 6800 Ultra?

Ujesh Desai: We have to spec our products for the absolute worst case, not the best. Marginal power supplies and poorly ventilated cases must be taken into account. We are traditionally over-cautious about our minimum specifications, and this is no exception. Now that we have had more time with the GPU, we are getting a better feel for its capabilities, tolerances and resource requirements. The reason we initially spec'ed out 480W was because the GeForce 6800 Ultra was targeted at the Enthusiast. These folks tend to push their systems to the max. That means overclocking the GPU, using the fastest CPU on the market, and also typically using multiple peripherals."

Nice spin.
Sky
Terracotta Army
Posts: 32117
I love my TV an' hug my TV an' call it 'George'.
"Our cooling system for the FX is not a leafblower...it's an auditory desensitization device specifically designed for mmog players to drown out their bleating children and significant others."
HaemishM
Staff Emeritus
Posts: 42632
the Confederate flag underneath the stone in my class ring
:: WHOOOSH! ::
What was that sound?
[shouts] DON'T WORRY ABOUT IT, THAT'S JUST MY GRAPHICS CARD STARTING UP! [/shouts]
daveNYC
Terracotta Army
Posts: 722
"...and also typically using multiple peripherals."

What, like a microwave?
Alluvian
Terracotta Army
Posts: 1205
Teh h4rdc0r3 are not exactly going to get UP and go to the KITCHEN to nuke that 3 day old burrito. So yeah, a microwave you smartass :P
Glamdring
Terracotta Army
Posts: 139
Don't forget the leet car cigarette lighter mod!
Shmtur
Terracotta Army
Posts: 67
"...and also typically using multiple peripherals."
"What, like a microwave?"

Just so you know, that was fucking hilarious.