Author Topic: ATI Drivers  (Read 11837 times)
NiX
Wiki Admin
Posts: 7770

Locomotive Pandamonium


on: June 15, 2004, 11:11:55 AM

I just got my Radeon 9600XT and decided to finally see some pretty graphics (I was running a GeForce 4 MX 440 SE before; the name hints at the suck it contained). Anyway, I played WW2O with a buddy and noticed a few problems. In the distance, sharp black triangles would appear randomly. Sometimes ground textures would disappear because they just damn well wanted to. This never happened on the GF4MX. I'm wondering, is it the ATI drivers? I remember reading that there were custom-made drivers for ATI.

Just looking to get the best performance out of my card.
Zephyr
Terracotta Army
Posts: 114


Reply #1 on: June 15, 2004, 11:23:39 AM

I use these drivers.  They seem a bit better than the plain catalyst drivers.
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


Reply #2 on: June 15, 2004, 01:14:55 PM

I hesitate to call the Omega Drivers genuinely *better*.   They're just heavily tweaked and include some powerful utilities for on-the-fly resolution adjustments and whatnot.

Personally I stick with the ATI released drivers (using 4.6 right now) and use a program called ATITool for my overclocking (which my card is actually pretty bad at) and quick resolution/anti-aliasing adjustment needs.

NiX
Wiki Admin
Posts: 7770

Locomotive Pandamonium


Reply #3 on: June 15, 2004, 02:02:05 PM

I'm not one to change my res or other settings on the fly. I just want performance. I don't care about AA or anisotropic (sp?) filtering. I'm not a "shiny" happy kind of person. I've run that GF4 on FFXI/CoH/ToA, so I'm not one to care about missing out on something extra :P
Big Gulp
Terracotta Army
Posts: 3275


Reply #4 on: June 15, 2004, 03:00:18 PM

Quote from: NiX
I'm not one to change my res or other settings on the fly. I just want performance. I don't care about AA or anisotropic (sp?) filtering. I'm not a "shiny" happy kind of person. I've run that GF4 on FFXI/CoH/ToA, so I'm not one to care about missing out on something extra :P


Absolute opposite here.  Before I started buying Nvidia I bought 3DFX for their AA, and now I stick with Nvidia because I've just seen way too many graphical issues with ATI cards to warrant me ever getting one.  I gotta have AA, and I can't stand screwed up textures, yadda yadda.

I'll concede that ATI technically makes better cards, but sorry, Nvidia has it all over them in drivers.
Numtini
Terracotta Army
Posts: 7675


Reply #5 on: June 15, 2004, 04:07:59 PM

You said you had an nvidia before? Did you wipe the hard drive and reinstall the OS from scratch?

I did all the registry edits, nasty (nvidia) file removers, searching every file on the hard drive, and all that nonsense and it simply didn't work. The only way I got any sort of performance out of my Radeon was wiping the drive.

If you can read this, you're on a board populated by misogynist assholes.
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


Reply #6 on: June 15, 2004, 05:00:16 PM

NVIDIA is considered a little more of an industry standard.  After all, they're the ones shelling out cash for the "Best on NVIDIA!" advertisements we see slathered all over games like Unreal Tournament 2004.

However, aside from that I think the only thing NVIDIA really has going for them is 3D Shutter Glasses support.

In WW-II Online's case, it sounds like an application-specific issue.  ATI is continually releasing driver tweaks to overcome this or that flaw with this or that game.   NVIDIA does the same, actually.

NiX
Wiki Admin
Posts: 7770

Locomotive Pandamonium


Reply #7 on: June 15, 2004, 06:37:09 PM

Quote from: Numtini
You said you had an nvidia before? Did you wipe the hard drive and reinstall the OS from scratch?

I did all the registry edits, nasty (nvidia) file removers, searching every file on the hard drive, and all that nonsense and it simply didn't work. The only way I got any sort of performance out of my Radeon was wiping the drive.
Ack. I don't have a spare drive or blank CDs. No way in hell I'm reformatting this SoB.

Well, I would... if I could find a program that does an inventory (per se) of my system and tells me what I have, so I can actually remember what to get again.
Arcadian Del Sol
Terracotta Army
Posts: 397


Reply #8 on: June 16, 2004, 06:49:06 AM

Just in case you installed the drivers that came with the card, new ATI Catalyst drivers came out about 8 days ago.

unbannable
NiX
Wiki Admin
Posts: 7770

Locomotive Pandamonium


Reply #9 on: June 16, 2004, 07:46:02 AM

I got the card Monday =\ So, the new ones were messing up a couple games + performance wasn't that good.

Edit: Benchmarks show Omega drivers aren't any better. BOO ON OMEGA! I might just format for shits and giggles then.
Kenrick
Terracotta Army
Posts: 1401


Reply #10 on: June 16, 2004, 08:32:47 AM

The newest catalyst drivers aren't letting me alt-tab in and out of Dungeon Siege.  Other than that, they seem fine to me.  I'm not an overclocker.
Sable Blaze
Terracotta Army
Posts: 189


Reply #11 on: June 16, 2004, 08:39:25 AM

The skinny: ATI drivers suck.  They always have, they always will.

Now you may be saying to yourself, "Well, that's no help." No, it's not...now. But it's a truism bought and paid for with personal experience.
When I came to the dark side after years of Macs, my Windows box had an ATI card. NEVER again. No way. No.

There are enough problems associated with ATI drivers that I'd personally not buy one of their cards even if they cleaned windows and took out the trash. Additionally, my present machine is an nVidia legacy machine, so switching is simply a game of electronic Russian roulette...with five chambers loaded. The fact that performance with equivalent models is almost identical makes new buying decisions very easy.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #12 on: June 16, 2004, 09:32:11 AM

Both companies have their strengths and weaknesses. In the last couple generations, ATI did slightly better with scene processing (aa/af), and MUCH better with pixel shader 2.0. I switched from my ti4400 to a 9800pro because I'm now running a fixed resolution (1280x720), so any increase in scene quality is crucial because I can't just notch up the resolution.

The next batch of cards looks about equal, with nvidia adding PS3.0 (no big whup imo, 32-bit shader precision instead of 24) and some more elegant pixel processing (because they needed to address missed predictions!) but at the expense of a 480W power supply /requirement/ and two power plugs. ATI gives about the same performance, only no PS3.0.

I'm running 4.4 Catalyst without a problem. I'd hardly say their drivers 'suck'. A leafblower, now /that/ sucks ;) Cheesing me to upgrade my power supply instead of adding in a cheap transformer brick, that sucks.
schild
Administrator
Posts: 60350


Reply #13 on: June 16, 2004, 09:39:29 AM

A leaf blower blows.

This thread was over when 'ATI's drivers suck and both cards are about equal' was mentioned. They are. I like having an ATI in my laptop because I use it more for DVDs than games (they *still* have the best hardware DVD decoder that isn't a separate board) - but their drivers suck balls, took me 2 months to get CoH running properly. And that only happened when I installed the Omega drivers.

I've since switched to the GeForceFX line in my desktops and couldn't be happier with the performance of games. I'm anti-overclocking because from what I've seen it's the equivalent of putting a giant H badge on your rear window....and I edited video and worked with graphics for most of college. It's just unnecessary wear on parts. Some of these people who overclock could have spent that extra money on a new computer that rendered or ran bots full time instead of wasting the clock cycles on their main computer....but I'm ranting now....so anyway, to bring it full circle - buy GeForce until ATI comes up with an ACTUAL reason to support their desktop cards.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #14 on: June 16, 2004, 11:46:59 AM

Quote
but their drivers suck balls, took me 2 months to get CoH running properly

I've had zero problems on my system, which was upgraded from an nvidia card (which I guess causes problems for a lot of people). I don't think I'm a genius by anything but the strictest definition, so why don't I have problems? I'm the statistical anomaly? I don't buy it.

If anything, I've had fewer problems with this 9800pro than my old ti4400. One graphical anomaly in BF:V using the 4.3 drivers, which 4.4 fixed. I also don't generally upgrade my drivers unless I need to, or I hear of particularly good enhancements/optimizations.

Quote
A leaf blower blows.

And how does it get the air to blow out of one end?
Alluvian
Terracotta Army
Posts: 1205


Reply #15 on: June 16, 2004, 11:59:46 AM

Quote
And how does it get the air to blow out of one end?


Air displacement.


I am wanting to go GeForce on my next computer, which my wife wants me to price out soon.  Has the price drop from the 6800 coming out occurred yet?  I am looking for something comparable to the 9800; I am thinking GeForce 5900 for that, right?  Any brands people like more than others?  My last two GeForce cards have been Gainward and I have been pretty happy with them to date.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #16 on: June 16, 2004, 12:19:32 PM

I used to use VisionTek for nvidia cards; they used to develop the reference models, but they've gone out of business.

Displacement? Define your exact meaning, because displacement as I know it doesn't apply to a mechanical action of moving air from one place to another. It's displacing the air at the outflow end of the tube, sure. But that's not how it gets the air to blow out of that end to do said displacement.

It sucks that air in at the back of the leafblowing unit, which is why I made the pun that a leafblower sucks.

Yes, schild, I'm bored today, too :)
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


Reply #17 on: June 16, 2004, 12:33:00 PM

As Sir Isaac Newton tells us, every action has an equal and opposite reaction.   Remember, a blow is just a reverse suck.

...

Back to the matter at hand: ATI cards aren't nearly as incompatible as I've been hearing in this thread.   In the fucking hundreds of games I play, I've only run into compatibility issues in maybe 2 or 3.   The only real compatibility issues my ATi card ever caused were with E-D stereographic glasses.   Which are an orgasmically better way to play a game when they work, but are sadly rarely supported.   NVIDIA's drivers can get a rare few games to work quite well with stereographic glasses, but only 25% of NVIDIA's stereographic successes are possible even with the finest drivers from eDimensional (a stereographic-glasses vendor who actually put some development time into ATI compatibility).

Alluvian
Terracotta Army
Posts: 1205


Reply #18 on: June 16, 2004, 12:35:14 PM

If you define a fan's function as blowing as opposed to sucking, then what you describe as sucking is actually the result of the air being displaced.  The air in the 'back' of the fan simply moves in to fill the void because of nature hating vacuums and all that.

Alternately you can define the function of a fan as sucking and then the blowing part would work by the sucked air displacing the air at the 'front' of the fan and moving it away.

So call it sucking or blowing, I don't have much preference.  They are the same thing really.
NiX
Wiki Admin
Posts: 7770

Locomotive Pandamonium


Reply #19 on: June 16, 2004, 12:59:45 PM

You're an easy one to please then.

Oh god, I feel dirty
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #20 on: June 16, 2004, 01:51:17 PM

Quote
The air in the 'back' of the fan simply moves in to fill the void because of nature hating vacuums and all that.

Then it's the air that is doing the displacing. Even if it's driven by the relative lack of air that was sucked into the blower, the only place where the blower itself would be doing any displacement is forward of the fan.

And it's not the same thing; the nature of the force driving the displacement of air changes when you go from suck to blow.
Alluvian
Terracotta Army
Posts: 1205


Reply #21 on: June 16, 2004, 02:03:30 PM

Sure it is the same thing.  Even your mouth can be defined as doing either.

Blowing is no more than sucking air out of your lungs, and sucking is just blowing air back into your lungs.  It just depends on your perspective.
Sable Blaze
Terracotta Army
Posts: 189


Reply #22 on: June 16, 2004, 08:50:07 PM

The FX6800 Ultra is still weighing in at a hefty $450. The promised stripper model has yet to appear.

Bang for the buck would indicate the FX5900XT. $170 give or take and performance about equivalent to the ATI 9700PRO. It's a touch off the higher end 5950 or 9800XT, but overclocking can fix that, if it bothers you. None of these cards can run AA and aniso at 1600x1200, so I consider them all the same.
Kenrick
Terracotta Army
Posts: 1401


Reply #23 on: June 17, 2004, 04:16:07 AM

For me it came down to:

Radeon 9800 Pro 128mb

or

GeForce FX5900XT 128mb

or

Radeon 9600XT 256mb

All of the cards are in the same price ballpark, but it was the 9800 Pro that finally won me over with its specs.  Happy with my decision and with ATI's drivers.
Alluvian
Terracotta Army
Posts: 1205


Reply #24 on: June 17, 2004, 07:02:44 AM

Quote
Bang for the buck would indicate the FX5900XT. $170 give or take and performance about equivalent to the ATI 9700PRO.


They sure dropped in price fast.  I am glad they are in or under the $200 range.  That is usually my cutoff for video cards.  The extra performance of the 9800 is tempting, but:

1) I like the support that comes with clinging to the 'industry standard' for a few more years
2) I know there are millions of ATI cards out there and the problems I have heard about might be in the minority, but that is still orders of magnitude more problems than I have been hearing about with GeForce cards.
3) I have stereo shutter glasses that I LOVE to use when games support them.  Even if it is somewhat rare.  I don't want to go to ATI and lose the ability to play my current true-3d games or any future full-3d games.  Even though pixel shaders seem to be making this kind of support even more rare.

The ATI bonus is 'more' performance for the buck. That is a big bonus, but it is hard to quantify that 'more'.  Most tests I have seen are run under conditions in which I would never play a game.  I don't give a rat's ass about AA or AF.  It seems like these days benchmarkers are more interested in AA and AF than anything else.  The reason seems to be that without those features maxed out, the top two tiers of cards all kick the living shit out of every game out there.  Any fps past 30 or 40 is pretty much lost on my eyes anyway.

Still a bit till I buy a new system, but that is my current line of thought.  Then again, once I start pricing I usually find some "deal I can't resist".
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #25 on: June 17, 2004, 07:35:38 AM

I always find it odd when folks say they don't care about visual quality, when the entire time spent playing video games is watching the graphics. If I'm going to sit and watch something for hours, I want it to look as good as possible.

Sure, if you don't care about pixel shaders and aa/af, you can pretty much buy most cards on the shelf, even a geforceMX.
Quote
Blowing is no more than sucking air out of your lungs, and sucking is just blowing air back into your lungs.  It just depends on your perspective.

But the point we were debating was whether or not it was displacement. To displace something, you need something else to displace it with. Sucking air out of your lungs displaces nothing (your lungs shrink to a smaller size as they empty; nothing replaces the air as it leaves), but blowing the air out of your mouth will displace the air in front of your mouth with the air from your lungs.

Thus a leaf blower gets the air it needs to do its displacement job by sucking air in through the back of the blower. That air's place is then taken by the ambient air in the vicinity, so the blower does not get the air to do its business by displacement.
Alluvian
Terracotta Army
Posts: 1205


Reply #26 on: June 17, 2004, 07:54:08 AM

Quote
I always find it odd when folks say they don't care about visual quality, when the entire time spent playing video games is watching the graphics. If I'm going to sit and watch something for hours, I want it to look as good as possible.


I just don't like AA or AF.  I never said I don't like visual quality.  But when most reviews do their side-by-side shots and talk their babble about quality, I can't see any difference in either shot.  They both look good to me.  The exception is when they showed Splinter Cell: Pandora Tomorrow side by side.  In those shots the ATI card failed to render 90% of the shadows in one of the outdoor scenes.  That is a total game-breaker in Splinter Cell.  It is probably a one-game issue, but it was the only side-by-side comparison I have seen where I saw enough difference to want one card over the other.

I don't really like AA or AF because they always look a little blurred to me.  I don't find slight blurring superior to slight jaggies.  I would rather just up my resolution a bit and move on.  At 1200 or 1600 and up, jaggies don't bother me in the SLIGHTEST anymore, and I personally think AA and AF make the scene look worse.  Just personal preference.  At low resolutions like 800x600 and below I can see AA and AF improving a scene significantly, but at higher resolutions I just don't like them.  The only time I would use them would be for 3d stereoscopic gaming, where I drop to a lower resolution to get a higher frequency out of my monitor.  But unfortunately neither AA nor AF currently even works in 3d gaming.  So I have no reason to play at those resolutions.  Except for DX2, which ran like shit at 1024 on my poor machine.

Generally speaking, unless the shots are side by side I can't even notice the effects of AA or AF on a high-resolution scene.  But I usually CAN notice the performance hit.  Maybe on a much better card, where I didn't notice the performance hit, it would be something I might use.  But I doubt it.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #27 on: June 17, 2004, 09:05:26 AM

You can debate AA, but not AF. You can get an AA effect by raising the resolution, but if you're not using AF you are looking at a relatively crappy muddy image. You complain about blurriness, which is what AF is made to alleviate. AF is in there to correct problems with the way mipmaps are drawn, and it's a pretty big performance hit.
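
For the curious, the rough math (this is the gist of the EXT_texture_filter_anisotropic spec as I remember it, so treat it as a sketch, not gospel): a pixel's footprint in texel space has a major axis $\rho_{\max}$ and a minor axis $\rho_{\min}$. Plain trilinear filtering picks its mip level from the major axis,

    $\lambda = \log_2 \rho_{\max}$

which is exactly what muds up floors and roads at glancing angles, where the footprint is long and thin. Anisotropic filtering instead takes

    $N = \min\left(\lceil \rho_{\max} / \rho_{\min} \rceil,\ \mathrm{maxAniso}\right)$

samples spread along the major axis and selects the sharper level $\lambda = \log_2(\rho_{\max}/N)$. All those extra texture reads per pixel are also exactly why it's such a performance hit.
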
Quote
It is probably a one-game issue, but it was the only side-by-side comparison I have seen where I saw enough difference to want one card over the other.

Well, this is really my point about this whole thing. I have an ATI card. I don't have any problems with it, never have (and I'm not running a stock setup, which could have been very problematic). I've also run GeForce cards without a lot of problems. I've heard about bugs in both driver sets, about equally, but I've never encountered them (aside from a couple on the nvidia board that were fixed by later driver releases....but you don't see me bringing up that marginal information until it's useful as an example, eh?)

So given the relative parity of the two GPU brands, picking nvidia because they put more money into marketing is pretty dumb. My only gripes about nvidia's new boards are the cooling solution and the fact that they require a 480W power supply, and mine is 380W. So if I buy the new nvidia card, I also have to upgrade my power supply. If I buy the new ATI, I don't.

See, I like practical reasons, too :)
Sable Blaze
Terracotta Army
Posts: 189


Reply #28 on: June 17, 2004, 10:19:53 AM

I don't think the new boards really require a 480-watt PS. I think it's a (stupid) attempt to make sure you have two free Molex connectors for the board (a relatively up-to-date PS, in other words).

I have a good but older 300-watt PS. I know I have enough Molex connectors for an FX6800. I used to run multiple hard drives in a previous incarnation of this machine. Now I don't. However, my PS is old, so I'd be inclined to upgrade. If you have a 380 or 425, I wouldn't hesitate to use it, ASSUMING it's a good one. If you have a no-name cheap POS PS, I'd upgrade now regardless of what you're running. Skimping on your PS is almost as bad as doing it on memory.

I choose nVidia boards because they work. I've had an ATI board and it was crap. I'm not inclined to roll the dice again. Not when I can buy identical performance AND use my present drivers, or at worst simply update them. If you're building from scratch or replacing your HD(s), then it's an open field. However, once burned, twice shy, and you're not getting anything from ATI you wouldn't get from nVidia.

A final addendum: if you're running at 1600x1200, the only two cards that will get you decent (not great, just adequate) framerates with anisotropic filtering are the FX6800 and the ATI X800whatevertheycallit. It is a huge performance hit. If you don't want aniso, then you can save a ton of money on the last generation or two of cards. There really isn't much difference between the FX5900XT, the 9700PRO, and the newer cards once you remove the filtering question.
Alluvian
Terracotta Army
Posts: 1205


Reply #29 on: June 17, 2004, 11:57:36 AM

Quote
So given the relative parity of the two GPU brands, picking nvidia because they put more money into marketing is pretty dumb. My only gripes about nvidia's new boards are the cooling solution and the fact that they require a 480W power supply, and mine is 380W. So if I buy the new nvidia card, I also have to upgrade my power supply. If I buy the new ATI, I don't.



First, to clear it up, you do NOT need a 480W power supply for the 6800.  It has been run just fine on 350W and even 300W systems.  I heard the explanation of that figure being a rather lame assumption that the owner of a 6800 is also likely to have top-of-the-line everything else and to be running a dual-processor system with multiple hard drives, etc...  It has since been rather soundly trounced.  The two connectors are annoying, but not something that would affect me either way.

Second, I am personally not interested in that level of card yet.  I would only go 9800 or 5900 at max.

Regarding AF, I guess I should look into it more and see what it really does for an image.  But I can usually only tell the difference on those kinds of things when they zoom way in on one part of the screen to show the differences.  I am not going to be looking for that kind of thing when gaming.  If it is in my face and detracts from the game then there is a problem.

Finally, one of my reasons for still wanting to stick with Nvidia at this point is that I don't want to give up on 3d stereoscopic gaming.  And going ATI is pretty much giving up on that right now.  There is zero native support, and the only driver available works for only a small percentage of the already small percentage of games that work on Nvidia.

All this said, I can't get the quantitative numbers I want from benchmarking reviews.  Both competitors seem to be in the 'off the chart' range in the scenarios in which I actually play my games.  Reviewers usually have to go to crazy 1600 resolutions to get the figures down to where they can be compared.  For some reason Nvidia's performance drops more than ATI's at the really high resolutions.  But I very rarely run at 1600 because it forces my monitor's refresh rate lower than I like.

For me, I would like to 'rent' both cards after Half-Life 2 or Doom 3 comes out and see what the gaming difference is between them.  Not the numeric benchmark figure.  But I can't do that.  I just know I like my 3d stereoscopic glasses.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #30 on: June 17, 2004, 12:24:16 PM

Quote
I don't think the new boards really require a 480-watt PS. I think it's a (stupid) attempt to make sure you have two free Molex connectors for the board (a relatively up-to-date PS, in other words).

Actually, it pertains to low-end power supplies, which only output their peak wattage under a narrow set of conditions. Since so few people take the time to get a quality power supply, it makes sense. MaximumPC ran tests with the following PSs: PC P&C 510W and 410W; Sparkle 400W; a small-form-factor 250W; and a generic 400W. The SFF and cheapo PSs failed within seconds of a 3d app being launched, but the quality supplies did OK. However, I only have a 380W supply in my machine, so it might not be fine, even though it's name-branded and good quality.

Other than that, you just reinforced what I was saying: the main difference is personal bias based upon isolated incidents. Discounting pixel shading, both cards have been operating at roughly equal speeds for several generations (and nvidia fixed their pixel shading, so that niggling complaint is gone in the newest generation, too). My point is that, all things being equal, I'd go with the card that draws less energy. Just my own preference, which is all I was pointing out :)

Quote
I heard the explanation of that figure being a rather lame assumption that the owner of a 6800 is also likely to have top-of-the-line everything else and to be running a dual-processor system with multiple hard drives, etc...  It has since been rather soundly trounced.

See, that's why I like to use facts from studies.
Quote
Finally, one of my reasons for still wanting to stick with Nvidia at this point is that I don't want to give up on 3d stereoscopic gaming.

That's cool man, I'm not trying to make you buy ATI and don't own their stock or anything. I just get a little nuts when people say one or the other is better, because it's simply not true. They BOTH rock, with only a few caveats (older gf pixel shading, gfMX, etc).

In fact, do you have a linky linky to the stereoscopic stuff? I've never looked into it.
Sable Blaze
Terracotta Army
Posts: 189


Reply #31 on: June 17, 2004, 12:36:02 PM

Considering the six month development cycle of vid cards, renting is about all you're doing now in some respects. I try and make a good purchase as cheaply as possible to get the most life out of my cards that I can. I get about 18 months, which isn't bad.

Right now, I do need a new card (Ti4400 at the moment--damn good card) for the DX9 stuff coming up. I do run at 1600x1200 (I spent plenty on very good monitors to do this). Filtering isn't really an option for me, but at this resolution I don't much care. I've run AA, can take it or leave it. It's great on my Xbox, but on the PC, it hardly matters. Aniso I'd like to run, but simply can't with this card.

However...$400+ for the new cards I have problems with. I like my cards at $200. That's my buy point. Additionally, this box's days are numbered. It just had its last upgrade. The mobo/CPU/memory setup is now maxed. The next upgrade will have to be a complete rebuild, probably around the Athlon64 and PCI Express. I'll need a new card for that regardless of what I do now.

So...what to do now? The FX6800 looks great, but is simply too much money. The lower-priced version is vapor right now. I can wait a few months, so I might be OK with the FX6800-Lite when/if it ships. The FX5900XT would seem to be perfect, but I would like to run aniso; that's not an option with the 5900. This is the fun part of building your own machines...the incessant plotting and planning and cost/benefit "analysis" and the benchracing. Fun stuff indeed.

In the meantime, my 16-month-old Ti4400 soldiers on...the GF4s were great hardware.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #32 on: June 17, 2004, 01:29:19 PM

I agree, Sable. If I didn't have a new fixed-resolution (1280x720) display, I'd be perfectly happy with my old ti4400. But I needs me aa/af; it can get pretty chunky looking at that resolution blown up to 5 feet across.
Alluvian
Terracotta Army
Posts: 1205


Reply #33 on: June 18, 2004, 08:02:23 AM

Here you go Sky.  I try to evangelize when I can.  I love the tech even if it is far from perfect.

Quick description of how it works from the E-Dimensional website (where I got my glasses).  $50 for a pair last I checked; may have gone up or down since then.

http://www.edimensional.com/misc/howitworks.htm
You can get a lot more information by googling keywords such as stereoscopic 3d, glasses/gaming, and lcd/shutter.

I hate calling them shutter glasses; that is an old term from when they made a small click when polarizing and would make a lot of noise flipping manically back and forth. Back then they could not flip as fast, and they could not handle high resolutions, which made the image flicker.  I have used my current glasses up to 200Hz and they kept up fine, and they don't make any sound at all.  There is no perceptible flicker anywhere over 120Hz.  It does not get annoying for me unless I drop below 100Hz or so.  You are essentially cutting your display frequency in half by using them.  Because of how they work, they don't work with LCD screens; the pixels just work differently and can't refresh themselves fast enough to produce the right image.
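
For what it's worth, the standard way a game or driver feeds glasses like these is quad-buffered stereo: two back buffers, one per eye, flipped in sync with the monitor refresh. A minimal sketch of the idea in OpenGL, assuming a stereo-capable pixel format; render_scene_from() is an invented helper, not a real GL call:

Code:
#include <GL/gl.h>

/* One stereo frame: one image per eye into GL_BACK_LEFT and GL_BACK_RIGHT,
   then a single buffer swap presents both. The driver alternates the two
   at the monitor's refresh, in sync with the glasses, which is why each
   eye effectively sees half the refresh rate. */
void render_stereo_frame(double eye_sep)
{
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    render_scene_from(-eye_sep / 2.0);   /* invented helper: camera nudged left */

    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    render_scene_from(+eye_sep / 2.0);   /* invented helper: camera nudged right */

    /* SwapBuffers() / glXSwapBuffers() goes here and presents both eyes */
}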

The only problem I have with the technology is 'ghosting'.  The monitor has to switch scenes at a rate equal to its display frequency.  The LCD glasses keep up, but there is some 'after image' left due to phosphor glow (or, debatably, the LCD not going to full black fast enough), especially with a bright object on a dark background.  The bright object does not fade quite instantly, so it is still partly seen by the other eye.  The end effect is that if you have a bright object on a black background, you see the 3d object, but to each side of it there is a slight ghost image.  It depends a lot on your frequency and your gaming environment.  I got a lot of it in AVP2, for instance, but that game still looked so cool in 3d that it was an easy trade-off for me.  I hear that different monitors produce more or less ghosting as well.

The real problem is in game design.  Even with perfect drivers, not all games will work.  The games have to be designed in 3d as well.  A lot of 3d games cheat, or at least don't have proper depth information for the objects in the game (I think the drivers use the z-buffer).  So you will sometimes get games that are 100% flat, or games like GTA 1 and 2 that work perfectly except for the headlights and streetlights being at screen depth (the lens-flare effect seems to be applied after the scene is rendered).  So when playing GTA on my system, I play in 3d during the day and switch back at night.  Some don't mind the lights, but for me it gives me a headache as I switch focus back and forth between them and the scene 'in' my monitor.

The next main problem is FPS crosshairs.  These are rarely rendered correctly in 3d space.  To render them right, you have to put the crosshair at the depth of the pixel underneath it, like a laser sight.  This is the only way you can get accurate targeting with only one point of reference.  The Nvidia drivers can create a driver-based laser sight that just places a transparent bitmap cursor at the location of the center pixel on the screen.  This only works if you can turn OFF the in-game cursor, though, or else the driver cursor will just get rendered on top of it (the cursor itself being the center-most pixel).  So it is marginally effective.  Up until the latest Nvidia drivers it was near impossible to configure unless the drivers had native support for that particular game (which meant months of waiting for an update).  Now you can toggle the laser sight on or off.  It finally works for Planetside, for instance, and that game is pretty nice in 3d.  Unfortunately, in Planetside you can't turn off the original crosshair (for some reason the laser sight still works, though), making it hard to focus on the right crosshair sometimes.  And the sniper view is just hosed.
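
The laser-sight trick is simple enough to sketch in code. This is only an illustration of the technique, not anything from the actual drivers; the GL/GLU calls are real, but draw_crosshair_at() is invented:

Code:
#include <GL/gl.h>
#include <GL/glu.h>

/* Read the scene depth under the screen-center pixel and draw the
   crosshair at that point in world space, so both eyes converge on the
   target instead of seeing a doubled, screen-depth cursor. */
void draw_laser_sight(int win_w, int win_h)
{
    GLfloat depth;
    GLdouble model[16], proj[16], wx, wy, wz;
    GLint view[4];

    glReadPixels(win_w / 2, win_h / 2, 1, 1,
                 GL_DEPTH_COMPONENT, GL_FLOAT, &depth);

    glGetDoublev(GL_MODELVIEW_MATRIX, model);
    glGetDoublev(GL_PROJECTION_MATRIX, proj);
    glGetIntegerv(GL_VIEWPORT, view);

    /* Unproject the center pixel at that depth back into world space. */
    gluUnProject(win_w / 2.0, win_h / 2.0, depth,
                 model, proj, view, &wx, &wy, &wz);

    draw_crosshair_at(wx, wy, wz);   /* invented helper: billboard at scene depth */
}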

Some wonderful games, like Max Payne 1 and 2, actually render the crosshair as a laser sight and work GREAT in 3d.  Jedi Outcast also worked great.  Other games sometimes get a modder to make a version that works.  UT2K4 has a 3d crosshair mod out for it.  So does NOLF 2, to mention a few examples.

For the rest of the games, you are stuck trying to aim between the blurred double crosshair images.  It's like pointing at something across the room: you can't focus on both your fingertip and the object at once, and in order to properly AIM at the object you have to split the double image down the middle.

3d is very cool when it works, but getting it working is sometimes like its own puzzle game.  And it is more voodoo than science, it seems.  Some can get game X to run fine while someone else with a seemingly identical setup will have trouble.  The newer shader-heavy games are only just starting to get some 3d support in the nvidia drivers, but most still don't work in 3d.  I am hoping the drivers catch up and that Nvidia keeps upgrading its 3d drivers.  I do most of my troubleshooting over at:
http://forums.stereovision.net/index.php

There are some guys there working on a nice DVD viewer which will display movies on a virtual big screen 'inside' your monitor.  Reports from those who have seen the effect are pretty positive; I have no clue whether it would actually look good or not.

Overall I have EASILY gotten my $50 worth out of it.  Your mileage will vary.

The guys over at http://www.dti3d.com/ blow me away.  2d/3d glasses-free switchable monitors.  That kicks so much ass.  The problem is that current drivers only create 2 images; that works with these monitors, but gives a limited viewing angle.  To really get a kick-ass effect that works at more angles, I hear you need about 9 or so rendered images.  Drivers could be made to do that, but it would slow performance down, of course, and also require more video RAM for the additional views rendered for the other angles.
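
To make the 'nine images' point concrete, here is a sketch of how those views could be rendered: the same scene from cameras slid along the x axis, each with an asymmetric (off-axis) frustum so every view converges on the same plane. This is the standard off-axis stereo setup, not necessarily what DTI actually does; render_scene() and store_view() are invented helpers:

Code:
#include <GL/gl.h>

#define NUM_VIEWS 9   /* two for glasses; around nine for a wide-angle panel */

void render_views(double eye_sep, double conv_dist, double near_height,
                  double aspect, double znear, double zfar)
{
    double half_h = near_height / 2.0;
    double half_w = half_h * aspect;

    for (int v = 0; v < NUM_VIEWS; v++) {
        /* camera offsets centered around zero: -4, -3, ... +4 eye steps */
        double off = (v - (NUM_VIEWS - 1) / 2.0) * eye_sep;
        /* shear the frustum opposite the offset so all views share a
           convergence plane at conv_dist (objects there sit at screen depth) */
        double shift = off * znear / conv_dist;

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glFrustum(-half_w - shift, half_w - shift,
                  -half_h, half_h, znear, zfar);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslated(-off, 0.0, 0.0);   /* slide the camera along x */

        render_scene();   /* invented helper: draw the world */
        store_view(v);    /* invented helper: the panel interleaves the views */
    }
}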

It is imperfect technology right now, for sure.  Once people start seeing the 3d monitors, I think interest and demand will go up.  Having to wear glasses to see a 3d image is still a 'geek' thing.  Joe Sixpack won't do it.  But if he imagines 3d football and porn in his living room, it could take off.  With the extra data HDTV can transmit, they could send dual images as a channel.  LCD screens could resolve that signal in 3d.  A 3d camera could be devised pretty easily.  All the tech is there, but it would take a long time to get it standardized.  I just want it thought about more in gaming right now.  I want all devs to think 3d when making a game.  It is just a huge niche market right now.
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


Reply #34 on: June 18, 2004, 10:13:03 AM

No question, stereographic display rocks when it actually works.

About the only games I've ever really gotten to work in 3D were Anachronox, Half-Life (the Natural Selection mod is my fave), and Sinistar Unleashed.    Everything else at least had the crosshair issue going on.   I hear you can get Morrowind to work, but I never could.

Hmm, NOLF 2 has a 3D crosshair mod, you say?   Excellent, that was the only thing holding that game back.   I'll have to try that sometime.
