Author Topic: Need graphics help  (Read 6782 times)
Soln
Terracotta Army
Posts: 4737

the opportunity for evil is just delicious


on: November 27, 2006, 08:36:55 AM

I put this on my blog and in the Tech thread on the official forums.  Trying to get as much info as possible.

I've resubbed and will probably recruit my gf back into the game. But I'm still frustrated with the display. I have a great new LCD (DVI) monitor and want to improve how EQ2 looks. 30 FPS is fine for any game, but even with shadows and animation turned down I'm stuck in the mid-teens. It shouldn't do that. I also scaled way back the existing processes via msconfig, so there's not a lot of extra CPU being used (other than Norton antivirus, which I don't want to uninstall).

I have the following rig and questions:

Dell Dimension 8300 with:
2 Intel 3.2GHz CPU's
Total Physical Memory 3,072.00 MB
Available Physical Memory 2.46 GB
Total Virtual Memory 2.00 GB
Available Virtual Memory 1.96 GB
Page File Space 4.35 GB

RADEON 9800 XT (with latest Omega 3.8.231)
RADEON 9800 XT AGP (0x4E4A), ATI Technologies Inc. (Omega 3.8.231) compatible
Adapter RAM 256.00 MB (268,435,456 bytes)

PCI bus
Intel(R) 82875P Processor to AGP Controller

The above is pulled from the sysinfo report. My questions are:

1) Do I have and can I use PCI for my Radeon? It seems to be for AGP, but I also seem to have PCI (only from reading sysinfo). How do I change that?

2) Is there any way I can optimize for 2 CPUs? For anything, but at least to get more out of EQ2?

3) I wanted to force the DirectX and OpenGL AF and AA up to x16 instead of letting the applications handle it. Should I?

4) Anything else I can do to improve the display?

Thanks!
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


WWW
Reply #1 on: November 27, 2006, 09:34:57 AM

Ah, EQ2 and performance issues go hand in hand.  This game is such a power app it's ridiculous.

Quote from: Soln
RADEON 9800 XT (with latest Omega 3.8.231)
RADEON 9800 XT AGP (0x4E4A), ATI Technologies Inc. (Omega 3.8.231) compatible
Adapter RAM 256.00 MB (268,435,456 bytes)
I used to run EQ2 on a Radeon 9800 PRO w/ 256MB.  It was playable, but quite choppy in places, and this was without anti-aliasing or any other fancy enhancers.  The card blew up and my warranty paved the way to replace it with a relatively cutting-edge Radeon X1600XT w/ 512MB, a card with about 300% of the old one's performance.  EQ2 runs pretty dang smooth now, except around the broker, where several idling players lag things pretty hard.  Looking at a benchmark for the 9800 XT, I'm seeing it's only a slight improvement over the Radeon 9800 PRO, so I'm going to say your performance issues likely come down to the pixel-grinding power your card can deliver versus what EQ2 demands of it, and not to issues with the rest of your system.

Quote from: Soln
1) Do I have and can I use PCI for my Radeon? It seems to be for AGP, but I also seem to have PCI (only from reading sysinfo). How do I change that?
Assuming you don't have both an AGP and a PCI video card installed, chances are you're looking at some kind of PCI legacy emulation mode being made available for your Radeon.  Most likely EQ2 isn't trying to use the card in PCI mode, so this shouldn't be a concern.

Quote from: Soln
2) Is there any way I can optimize for 2 CPUs? For anything, but at least to get more out of EQ2?
I never got to play with a 2 CPU rig before, but it's my understanding that whether or not an application makes adequate use of multiple processors comes down to the coding of the developers in the individual application.  In other words, whether or not your 2 CPUs are being used well in EQ2 is up to the programmers of the game.  There might be some argument you can run the game with in order to make it run in two processor mode if it isn't already - I'd ask over the forums.
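For what it's worth, here's a rough sketch (nothing EQ2-specific, just the plain Win32 route with a made-up tool name) of pinning a running process to particular CPUs with SetProcessAffinityMask - the same thing Task Manager's "Set Affinity" does.  Note this only controls which CPUs the game is allowed to run on; it won't make a single-threaded engine use both.

Code:
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

/* Pin a running process to CPU 0 only (affinity mask 0x1).
 * Usage: setaffinity <pid>   -- get the PID from Task Manager. */
int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: setaffinity <pid>\n");
        return 1;
    }

    DWORD pid = (DWORD)atoi(argv[1]);
    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                              FALSE, pid);
    if (proc == NULL) {
        fprintf(stderr, "OpenProcess failed (error %lu)\n", GetLastError());
        return 1;
    }

    /* 0x1 = CPU 0 only; 0x3 would allow CPUs 0 and 1. */
    if (!SetProcessAffinityMask(proc, 0x1))
        fprintf(stderr, "SetProcessAffinityMask failed (error %lu)\n", GetLastError());

    CloseHandle(proc);
    return 0;
}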

Quote from: Soln
3) I wanted to force the DirectX and OpenGL AF and AA up to x16 instead of letting the applications handle it. Should I?
Forcing that level of anti-aliasing and anisotropic filtering will likely grind most applications down to single-digit frame rates.  Heck, even systems with uber SLI video cards shy away from x16 anti-aliasing, which is typically a major GPU hog.  There's probably no harm in letting applications handle it, just so you can customize each application within its own settings to comfortable levels.

Quote from: Soln
4) Anything else I can do to improve the display?
My advice is to set EQ2 to "high performance" mode and then tweak the aspects of the graphics that bother you until satisfied.  That's what I did on my Radeon 9800 PRO, and I achieved a pretty satisfying balance.  On my X1600XT I've noticed the game might actually look a little worse at the highest settings - the foliage only renders out so far even at maximum settings, and the landmass of Antonica looks a little overly generic.  About the only real benefits I get from the higher-spec card are faster frame rates and better cloth simulation.  Also, watch out when setting texture quality: check the tooltips and they'll explain that some of the larger textures take up massive amounts of on-card video RAM.  You need 512 MB of video RAM to run at the maximum texture size.
« Last Edit: November 27, 2006, 09:37:36 AM by geldonyetich »

Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #2 on: November 27, 2006, 10:04:22 AM

I ran without FSAA and only the most minimal shadows (I may have had none) on my 9800 pro 256. I was also running at 1280x720, your LCD panel is probably even higher resolution, necessitating more cuts to eye candy. The game still ran pretty slowly in most areas, and cities could crawl to a slideshow in busy areas.
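(Rough math, assuming a typical 1280x1024 panel since the exact model isn't given: 1280x720 is about 0.92 million pixels per frame, while 1280x1024 is about 1.31 million - roughly 42% more pixels the card has to fill every frame.)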

Bottom line: you're not going to be running it smoothly at the settings you think you can use. I'd recommend the opposite of Geldon's approach. I usually start at the bottom and slowly turn things up, at least with really intense apps like EQ2 and Oblivion.
Ezdaar
Terracotta Army
Posts: 164


Reply #3 on: November 27, 2006, 12:01:07 PM

EQ2 is also going to run like ass no matter what your hardware is. They need to rewrite their graphics engine from the ground up but I doubt that will happen.

Your best scenario is getting it to run like playable ass while still being pleasing to the eye.
Soln
Terracotta Army
Posts: 4737

the opportunity for evil is just delicious


Reply #4 on: November 27, 2006, 01:02:24 PM

thx for all, this is what I was suspecting :(  too bad Gallenite isn't around here anymore...

hard to believe I need to upgrade cards, but I've heard that from others too.  I mean, 2 CPU's, 3Gig of RAM, come on. 

they ought to at least put in graphics profiles so people can switch whenever they want.  Like going from soloing outdoors to playing in a raid. 

Still, it makes me hopeful -- I ought to be able to eke better quality out of this rig with what I have.

Trippy
Administrator
Posts: 23620


Reply #5 on: November 27, 2006, 04:55:42 PM

Your video card is the bottleneck and not just for EQ2. And there's still only a handful of games that take advantage of multiple CPUs.
Murgos
Terracotta Army
Posts: 7474


Reply #6 on: November 27, 2006, 05:53:04 PM

EQ2 is one of the main reasons I upgraded from a 9800 PRO to an X1600-something-or-other (512 MB AGP version).  All the tweaking in the world couldn't get my 9800 to what I considered acceptable playability.

I wonder if someone with enough patience could write something that turns a second CPU into a dedicated GPU.  There would have to be some serious OS tinkering going on, but it would be neat to see software emulation and a crap 2D card give a dedicated GPU a run for its money.

"You have all recieved youre last warning. I am in the process of currently tracking all of youre ips and pinging your home adressess. you should not have commencemed a war with me" - Aaron Rayburn
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


WWW
Reply #7 on: November 27, 2006, 06:16:53 PM

Quote from: Murgos
I wonder if someone with enough patience could write something that turns a second CPU into a dedicated GPU.  There would have to be some serious OS tinkering going on, but it would be neat to see software emulation and a crap 2D card give a dedicated GPU a run for its money.
I bet AGEIA could do this easily to make a second CPU act as a physics processor.  But, well, they've got a product to sell. ;)  As for GPUs, though, they're supposed to have unique instructions that make them much better at this than a CPU can be.  So while you could probably make a CPU act like a GPU, it wouldn't be quite as efficient.  Maybe it wouldn't have to be if you've got a few extra CPUs.

Murgos
Terracotta Army
Posts: 7474


Reply #8 on: November 27, 2006, 06:38:51 PM

They [GPUs] do vector math on multiple pixels at once.  Modern CPUs have a lot of pipelines (I don't know if GPUs have more) and operate at higher core clock speeds.  There could also be added benefits in using the CPU's extra abilities to shorten or circumvent some of the calculations a GPU does by brute force, but that would probably require research by math types to determine.
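Just to make the "multiple pixels at once" part concrete, here's a toy sketch (mine, not anything a real GPU or EQ2 runs) of a CPU doing the SIMD version of a per-pixel operation with SSE2 - 16 color bytes per instruction instead of one:

Code:
#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stddef.h>
#include <stdint.h>

/* Brighten an RGBA8 image by 'amount', handling 4 pixels (16 bytes) per step.
 * (Toy example: the alpha channel gets brightened along with the colors.) */
void brighten(uint8_t *pixels, size_t pixel_count, uint8_t amount)
{
    __m128i add = _mm_set1_epi8((char)amount);
    size_t i = 0;

    for (; i + 4 <= pixel_count; i += 4) {
        __m128i p = _mm_loadu_si128((__m128i *)(pixels + i * 4));
        p = _mm_adds_epu8(p, add);                 /* saturating add: clamps at 255 */
        _mm_storeu_si128((__m128i *)(pixels + i * 4), p);
    }

    for (; i < pixel_count; i++) {                 /* scalar tail for leftover pixels */
        for (int c = 0; c < 4; c++) {
            int v = pixels[i * 4 + c] + amount;
            pixels[i * 4 + c] = (uint8_t)(v > 255 ? 255 : v);
        }
    }
}

A GPU is doing that kind of thing across a whole lot of pixels at once, with the texture memory sitting right next to the shader units, which is the part a CPU can't fake.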

I work with guys who helped design some of the PowerPC CPUs but I doubt if anyone in my office has a clue what a GPU does, just not that kind of work around here.  Maybe I could ask around though, there is probably someone nerdy enough around to make a good guess at the outcome.

"You have all recieved youre last warning. I am in the process of currently tracking all of youre ips and pinging your home adressess. you should not have commencemed a war with me" - Aaron Rayburn
geldonyetich
Terracotta Army
Posts: 2337

The Anne Coulter of MMO punditry


WWW
Reply #9 on: November 27, 2006, 06:41:01 PM

Yeah, I wouldn't know either.  I read a bit about special low-level machine functions that GPUs can do and that CPUs need a few extra steps to pull off, but I have no real clue how much of an efficiency gain that would translate into.

Margalis
Terracotta Army
Posts: 12335


Reply #10 on: November 27, 2006, 07:40:39 PM

GPUs have different instructions and they are also on the card. The problem with having the CPU try to be a GPU is that it is physically not on the card. This means it can't read texture memory off the card at any acceptable speed. And any calculations that end up having to be finished by the GPU have to be transferred to the GPU over the bus.

Having another GPU is only marginally useful in itself. It depends on a lot of things, like where your app is limited (fill rate, geometry, etc.). But basically 2 GPUs aren't twice as good as one because you can't split the work in half.

If you have your CPU-GPU handle a part of the pipeline that part will become a horrible bottleneck. That means the CPU-GPU can't handle one whole pipeline phase, instead it has to share some phases with the GPU which has the memory bandwidth issues mentioned above.

Short version: wouldn't work well because operations need to access card-local memory, entire pipeline stages could not be farmed out and pipeline stages are hard to split work on, even given something like SLI.

The best thing you could hope to do with extra CPU power would be to do more advanced culling, depth sorting (if applicable), etc.
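To put a face on that last bit, here's a toy sketch of the kind of CPU-side work that actually does help (my own illustration, made-up names, one plane instead of a full six-plane frustum): throw out objects the camera can't see, then hand the rest to the GPU sorted front to back so it can reject hidden pixels early.

Code:
#include <stdlib.h>
#include <string.h>

typedef struct {
    float pos[3];     /* object center in world space */
    float radius;     /* bounding-sphere radius */
    int   mesh_id;    /* whatever the renderer needs to draw it */
} DrawItem;

static float g_cam[3];   /* camera position used by the comparator */

static int cmp_front_to_back(const void *a, const void *b)
{
    const DrawItem *x = (const DrawItem *)a, *y = (const DrawItem *)b;
    float dx = 0.0f, dy = 0.0f;
    for (int c = 0; c < 3; c++) {
        float vx = x->pos[c] - g_cam[c];
        float vy = y->pos[c] - g_cam[c];
        dx += vx * vx;
        dy += vy * vy;
    }
    return (dx > dy) - (dx < dy);   /* nearest first */
}

/* Drop objects whose bounding sphere is entirely behind 'plane' (ax+by+cz+d=0,
 * normal pointing into the visible side), then sort the survivors front to back.
 * Returns how many items are left to draw. */
size_t cull_and_sort(DrawItem *items, size_t count,
                     const float cam[3], const float plane[4])
{
    size_t kept = 0;
    for (size_t i = 0; i < count; i++) {
        float dist = plane[0] * items[i].pos[0] +
                     plane[1] * items[i].pos[1] +
                     plane[2] * items[i].pos[2] + plane[3];
        if (dist > -items[i].radius)    /* sphere pokes into the visible side */
            items[kept++] = items[i];
    }
    memcpy(g_cam, cam, 3 * sizeof(float));
    qsort(items, kept, sizeof(DrawItem), cmp_front_to_back);
    return kept;
}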

vampirehipi23: I would enjoy a book written by a monkey and turned into a movie rather than this.
Bandit
Terracotta Army
Posts: 604


Reply #11 on: November 27, 2006, 10:41:32 PM

A few more screenies; my graphics settings are turned up higher than I normally play at. All screenshots taken in the EoF Lesser Faydark zone.
Soln
Terracotta Army
Posts: 4737

the opportunity for evil is just delicious


Reply #12 on: November 28, 2006, 07:04:55 AM

what are your specs, Bandit?
Bandit
Terracotta Army
Posts: 604


Reply #13 on: November 28, 2006, 10:07:28 AM

I run with a 2.4 GHz CPU, 1 gig of RAM, and an ATI 850, and I usually run at 'balanced'.  The screenshots were taken at 'high quality'.
jpark
Terracotta Army
Posts: 1538


Reply #14 on: November 28, 2006, 11:13:02 AM

Some nice screenshots there - I liked the armor on the pixie/faerie.  Thanks.

I gather they did not update existing zones to allow cliff/wall climbing?

"I think my brain just shoved its head up its own ass in retaliation.
"  HaemishM.
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #15 on: November 28, 2006, 11:35:53 AM

2.4 GHz means nothing. A Barton? A Core 2 Duo?
Signe
Terracotta Army
Posts: 18942

Muse.


Reply #16 on: November 28, 2006, 11:47:02 AM

There was a time when 2.4 GHz meant something.

My Sig Image: hath rid itself of this mortal coil.
Bandit
Terracotta Army
Posts: 604


Reply #17 on: November 28, 2006, 12:00:17 PM

Quote from: Sky
2.4 GHz means nothing. A Barton? A Core 2 Duo?

Fuck if I know, it's an Intel but not dual core.