Author Topic: Hardware Physics Accelerators for gaming  (Read 8600 times)
jpark
Terracotta Army
Posts: 1538


on: August 21, 2006, 12:05:02 AM

What's the scoop here guys - I see that dedicated boards for physics simulations in games now exist:

http://www.pcper.com/article.php?aid=222

but are there actually any widely played games that make use of this? I heard CoH mentioned something like this some time ago.

Is this worth buying or is it on the bleeding edge rather than the cutting edge?


"I think my brain just shoved its head up its own ass in retaliation.
"  HaemishM.
Trippy
Administrator
Posts: 23620


Reply #1 on: August 21, 2006, 12:32:11 AM

Quote from: jpark
What's the scoop here guys - I see that dedicated boards for physics simulations in games now exist:

http://www.pcper.com/article.php?aid=222

but are there actually any widely played games that make use of this? I heard CoH mentioned something like this some time ago.

Is this worth buying or is it on the bleeding edge rather than the cutting edge?
AFAIK the AGEIA stuff in CoH is still Beta and still buggy.
Yoru
Moderator
Posts: 4615

the y master, king of bourbon


WWW
Reply #2 on: August 21, 2006, 12:47:28 AM

The demo at E3 for the physics board - I don't remember the brand or name - was a realtime-rendered demo from Heavy Rain, where it was used to do stuff like extremely realistic hair and tears. I think. Either way, the Heavy Rain demo was hyperrealistic - I actually stared at it for a good while before realizing it was (a) rendered and (b) not pre-rendered.
Hanzii
Terracotta Army
Posts: 729


Reply #3 on: August 21, 2006, 03:21:21 AM

I have the Asus PhysX card installed in my home pc.

Bottom line? It's a very noisy fan that can't be turned off, even though it does nothing useful for 99% of the time it's running. Had I actually bought the card, it would also have been a very expensive fan...

It makes the explosions in GRAW look slightly better, but does nothing to improve speed (running with a 6800 and an AMD X2 3800+).

I still think that AGEIA is merely hoping that nVidia or ATI will buy the company and use the tech. I doubt it.

----------------------------------------------------------------------------
I would like to discuss this more with you, but I'm not allowed to post in Politics anymore.

Bruce
Trippy
Administrator
Posts: 23620


Reply #4 on: August 21, 2006, 03:33:58 AM

Quote from: Hanzii
I still think that AGEIA is merely hoping that nVidia or ATI will buy the company and use the tech. I doubt it.
Well with AMD buying ATI, NVIDIA is their last hope, then, because there is nothing their co-processor does that a dual-core or soon-to-be quad-core CPU can't do.
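The "a CPU core can do it" argument boils down to running the physics step concurrently with everything else. A toy sketch of that structure (hypothetical Python with invented function names; a real engine would do this in C++ on a dedicated core, and Python's GIL makes this illustrative only - the structure, not the speedup, is the point):

```python
import threading

def physics_step(bodies, dt):
    # Naive Euler step: each body is a (position, velocity) pair.
    return [(p + v * dt, v) for p, v in bodies]

def simulate(bodies, dt, steps):
    # Hand each physics step to a worker; the OS would schedule that
    # worker onto the second core while the main thread keeps rendering.
    result = {}
    for _ in range(steps):
        worker = threading.Thread(
            target=lambda b=bodies: result.update(bodies=physics_step(b, dt)))
        worker.start()
        # ... the main thread would build the frame's render commands here ...
        worker.join()
        bodies = result["bodies"]
    return bodies
```

With one body at position 0 moving at 1 unit/s, ten steps of `dt=0.1` leave it at position 1.0.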
jpark
Terracotta Army
Posts: 1538


Reply #5 on: August 21, 2006, 07:34:06 AM

Quote from: Hanzii
I still think that AGEIA is merely hoping that nVidia or ATI will buy the company and use the tech. I doubt it.

Quote from: Trippy
Well with AMD buying ATI, NVIDIA is their last hope, then, because there is nothing their co-processor does that a dual-core or soon-to-be quad-core CPU can't do.


Tangent:  are "duo core" processors the same as "duo processors" ?  Just looking at some Dell info.

"I think my brain just shoved its head up its own ass in retaliation.
"  HaemishM.
Trippy
Administrator
Posts: 23620


Reply #6 on: August 21, 2006, 07:43:43 AM

Quote from: jpark
Tangent: are "duo core" processors the same as "duo processors"? Just looking at some Dell info.
The Intel Core 2 Duo processors are dual-core, if that's what you are asking (the older Intel Core Duo chips are as well). I.e., in a single CPU "package" (the thing you stick into the CPU socket on your motherboard) there are actually two CPU cores on the same die.
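Either way, the OS just sees multiple logical CPUs per package. A quick hypothetical Python check:

```python
import os

# Each dual-core package shows up to the OS as two logical CPUs, so a
# Core 2 Duo machine reports 2 here (more with multiple packages).
logical_cpus = os.cpu_count()
print(f"OS-visible logical CPUs: {logical_cpus}")
```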
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #7 on: August 21, 2006, 07:53:15 AM

The naming conventions this time around are enough to drive you insane.

There are 'dual core' processors, from both AMD and Intel.

There is an Intel line called Core Duo. There is a new Intel line called Core 2 Duo. It is this latter chip that is the new belle of the ball, and probably for good reasons. The E6600 performs as well as the latest and greatest AMD chip, the AM2-socket dual-core processor, yet costs approximately half the AMD chip's price. The market hasn't adjusted yet to reflect the performance increase of the Core 2 Duo chips, but no one who knows what they're doing is buying a new AMD chip if they want both performance and good pricing.

There's a small catch, however. Intel has been very reticent to work with Nvidia, so for now the only chipset from Nvidia that works with the Core 2 Duo is their nForce 4, which is outdated. This is important because without an nForce chipset you can't do SLI. You can also find the nForce 570, but the nForce 590, which has been out a while for AM2 chips, still doesn't exist except in beta for Core 2 Duo chips. There seems to be some sort of problem with regard to the Nvidia chipsets and RAM speed. If you look closely at the specs on Newegg, for instance, you'll notice that the nForce 4 boards run RAM at a lower clock speed than the Intel chipset boards for the Core 2 Duo.

Furthermore, there are some reported issues regarding RAM voltage which are probably related to the above problem: most of this new DDR2 RAM likes to run at higher voltages, around 2.4 volts, but apparently these new Intel boards default to running DDR2 at no more than 1.8 volts. You need to get into the BIOS and raise the voltage for your RAM. Of course, if you don't have a stick that works at 1.8 volts already, you can't get into the BIOS in the first place.

Add to all that that some of the boards left the assembly line unable to handle Core 2 Duo chips without a BIOS upgrade. So you would need an older chip inserted to boot up so you could flash your BIOS to accept the new chips. Only then could you install the Core 2 Duo chip.

In other words, if you're like me and love to build your own systems, it's worth waiting a month or two until this crap gets sorted out. It's truly a home-PC enthusiast's nightmare.
« Last Edit: August 21, 2006, 07:55:25 AM by Engels »

I should get back to nature, too.  You know, like going to a shop for groceries instead of the computer.  Maybe a condo in the woods that doesn't even have a health club or restaurant attached.  Buy a car with only two cup holders or something. -Signe

I LIKE being bounced around by Tonkors. - Lantyssa

Babies shooting themselves in the head is the state bird of West Virginia. - schild
Yegolev
Moderator
Posts: 24440

2/10 WOULD NOT INGEST


WWW
Reply #8 on: August 21, 2006, 08:12:08 AM

Quote from: Engels
In other words, if you're like me and love to build your own systems, it's worth waiting a month or two until this crap gets sorted out. It's truly a home-PC enthusiast's nightmare.

Go go procrastination.  I suddenly feel awesome for forcing myself to ignore the hardware arena until 2007.

Why am I homeless?  Why do all you motherfuckers need homes is the real question.
They called it The Prayer, its answer was law
Mommy come back 'cause the water's all gone
Trippy
Administrator
Posts: 23620


Reply #9 on: August 21, 2006, 08:13:44 AM

Quote from: Engels
In other words, if you're like me and love to build your own systems, it's worth waiting a month or two until this crap gets sorted out. It's truly a home-PC enthusiast's nightmare.

Quote from: Yegolev
Go go procrastination. I suddenly feel awesome for forcing myself to ignore the hardware arena until 2007.
Then you'll have to wait for the quad-core platforms to mature.
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #10 on: August 21, 2006, 08:52:51 AM

It's always a never-ending battle to stay on the bleeding edge, but I think in the case of the Core 2 Duos it may be one worth fighting, once things have settled a bit. The mix of cost-cutting prices and a performance boost has drawn me to consider spending a wad on a new system.

Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #11 on: August 21, 2006, 09:30:06 AM

Count me in the camp that'll be building a 6600 system. With the 7950GX2 cards, I'm not sure I'd even want an SLI system. It should be pretty well shaken out by the time I build early next year, anyway.

As far as the original post goes, I think the physics processor has potential, but I don't see it being adopted unless it gets some serious muscle behind it. It would probably work best integrated onto a graphics card (or a daughtercard, à la the aforementioned 7950).

I'll toss in my usual pining for an AI processor :) Developers, your AI sucks. All of you. For shame.
Yegolev
Moderator
Posts: 24440

2/10 WOULD NOT INGEST


WWW
Reply #12 on: August 21, 2006, 09:40:17 AM

I expect I'll be happy with a dual-CPU rig during the early days of quad-x86 debauchery.  Considering my current level of satisfaction with my AMD 3000+ XP on a Nforce2 board, that is.

Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #13 on: August 21, 2006, 01:49:36 PM

Yeah, I'm just barely ready to upgrade my Barton 3000+ on the nForce 2 Ultra, 9800pro 256. The Soundstorm chip dying helps me part with the wonderful old lady, though. Also, gaming at a fixed 1280x720 really stretches the gpu pretty far. I think the 6600 with an nforce board and a 7950GX2 jobbie will be a nice pc for a while.
Morfiend
Terracotta Army
Posts: 6009

wants a greif tittle


Reply #14 on: August 21, 2006, 01:58:38 PM

Quote from: Sky
I think the 6600 with an nforce board and a 7950GX2 jobbie will be a nice pc for a while.

I'm also thinking of building a new system along these lines some time early next year - maybe a little closer to what's top of the line then. But I'm definitely waiting for the bugs to be ironed out in Core 2 Duo before I get something new.
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #15 on: August 21, 2006, 04:09:16 PM

I just purchased an ASUS 7950gx2, along with an ASUS 939 SLI motherboard, since I'm not willing to wait for Nvidia and Intel to get their collective crap in gear. I'll upgrade my board and chip later, but that card looks too smooth to pass up, and I needed to switch from AGP to PCI-Express anyway.

I'll let you all know how it's working out.

Miasma
Terracotta Army
Posts: 5283

Stopgap Measure


Reply #16 on: August 21, 2006, 05:33:39 PM

Quote from: Engels
I just purchased an ASUS 7950gx2
Holy crap, that's quite the card.  I haven't paid too much attention to hardware since I built my last computer, how do cards with two GPUs like that compare to an SLI setup?
Trippy
Administrator
Posts: 23620


Reply #17 on: August 21, 2006, 05:35:13 PM

Quote from: Engels
I just purchased an ASUS 7950gx2

Quote from: Miasma
Holy crap, that's quite the card. I haven't paid too much attention to hardware since I built my last computer, how do cards with two GPUs like that compare to an SLI setup?
It is an SLI setup.
Miasma
Terracotta Army
Posts: 5283

Stopgap Measure


Reply #18 on: August 21, 2006, 05:42:41 PM

Oh, for some reason I thought SLI required two separate cards in their own PCIe slots.  Now I know better, serves me right for not keeping an eye on these things for a while.
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #19 on: August 21, 2006, 09:10:23 PM

Er, uhm, no, it's not an SLI setup, at least not in the conventional sense. Miasma, yer right in thinking that SLI implies two cards in separate PCIe slots, and in fact you can set up two 7950s with an SLI crossover ribbon, but the results thus far are a bit dodgy.

To answer your previous question, however: two 7900 cards with an SLI ribbon outperform a 7950gx2, but not by much, so the 7950 is the better bang for your buck.

Trippy
Administrator
Posts: 23620


Reply #20 on: August 21, 2006, 09:30:42 PM

Quote from: Engels
Er, uhm, no, it's not an SLI setup, at least not in the conventional sense. Miasma, yer right in thinking that SLI implies two cards in separate PCIe slots, and in fact you can set up two 7950s with an SLI crossover ribbon, but the results thus far are a bit dodgy.
It *is* an SLI setup. To the drivers it's like there are two separate GPUs running in SLI mode because there are two separate GPUs running in SLI mode. So all the glitches and problems you have with SLI exist with that card even though it only uses one slot (though it takes up the space of two slots). SLI means splitting the work among multiple GPUs. The physical packaging is a separate issue. This is the exact same thing 3dfx did back in 2000 with the Voodoo5 by mashing two and four GPUs onto one card running in SLI mode (though the 4 GPU version never made it to release).
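"Splitting the work among multiple GPUs" is easy to sketch. Here's a toy split-frame divider (hypothetical Python, not NVIDIA's actual load balancer) that hands each GPU a contiguous band of scanlines:

```python
def split_frame(height, n_gpus):
    # Split-frame rendering: each GPU gets a contiguous band of scanlines,
    # returned as (start, stop) line ranges covering the whole frame.
    base, extra = divmod(height, n_gpus)
    bands, start = [], 0
    for i in range(n_gpus):
        stop = start + base + (1 if i < extra else 0)
        bands.append((start, stop))
        start = stop
    return bands
```

`split_frame(1080, 2)` gives each GPU 540 lines; a real driver would shift the boundary from frame to frame based on how busy each GPU was.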
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #21 on: August 21, 2006, 11:06:40 PM

You may be right, but nothing in the Nvidia site's technical specs (http://www.nvidia.com/object/7_series_techspecs.html) suggests that. In fact, it specifies that for the 7950 series, SLI drivers are still in production. What makes me think that perhaps you're mistaken is the mention in http://www.slizone.com/page/slizone_faq.html#g3 that states:

Quote
How does this technology differ from 3dfx's SLI technology?
NVIDIA SLI technology differs in many ways. First, 3dfx’s SLI technology was implemented on a shared bus using PCI. The PCI bus delivered ~100MB/sec. of bus throughput, while PCI Express is a point-to-point interface that can deliver ~60x the total bandwidth of PCI. Second, 3dfx’s SLI technology performed interleaving of scan lines, and combined in the analog domain, which could result in image quality issues due to DAC differences and other factors. 3dfx Voodoo technology also only performed triangle setup, leaving the geometry workload for the CPU. This meant 3dfx’s SLI technology only scaled simple texture fill rate, and then used inter-frame scalability. NVIDIA SLI technology is PCI Express based, uses a completely digital frame combining method that has no impact on image quality, can scale geometry performance, and supports a variety of scalability algorithms to best match the scalability method with application demands.


The emphasis on SLI's dependence on PCIe rather than PCI suggests that the dual PCIe slots are the avenue through which a lot of SLI work is done; not simply a connection between two GPUs which then both go through one PCIe slot.

But hey, if you're right, all the better.

Trippy
Administrator
Posts: 23620


Reply #22 on: August 21, 2006, 11:10:47 PM

Quote from: Engels
You may be right, but nothing in the Nvidia site's technical specs (http://www.nvidia.com/object/7_series_techspecs.html) suggests that.
If the 7950 GX2 wasn't dual SLI (in a single card) why would they call a pair of 7950 GX2's quad SLI?

Edit: Also the 7950 GX2 is PCI Express based -- there's a 48-lane PCI Express switch connecting the two PCBs (16 lanes for each GPU plus 16 lanes to the PCI Express slot on the motherboard).
« Last Edit: August 21, 2006, 11:14:46 PM by Trippy »
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #23 on: August 21, 2006, 11:16:56 PM


Quote from: Trippy
If the 7950 GX2 wasn't dual SLI (in a single card) why would they call a pair of 7950 GX2's quad SLI?


Marketing? But hey, I take your point. Don't get me wrong, I don't care if they call it Megatron's Ginormous Graphics EPeen Connector of Doom, so long as it's two GPUs working on the same tasks.

Murgos
Terracotta Army
Posts: 7474


Reply #24 on: August 22, 2006, 06:26:14 AM

Quote from: Engels
You may be right, but nothing in the Nvidia site's technical specs (http://www.nvidia.com/object/7_series_techspecs.html) suggests that. [...] The emphasis on SLI's dependence on PCIe rather than PCI suggests that the dual PCIe slots are the avenue through which a lot of SLI work is done; not simply a connection between two GPUs which then both go through one PCIe slot.

But hey, if you're right, all the better.

That's the worst case I've seen in a while of marketing speak saying that an apple isn't an apple because that one's a red apple and the other one's a green apple.

The only fundamental difference between the two versions is that one interleaved an analog signal and the other interleaves a digital signal.

The PCIe vs PCI argument is a red herring.

Whether there are two chipsets on one card or one chipset on each of two cards is moot. There is still a bus between the two chipsets (probably PCIe in either case, as their whole system will have been designed to work with a PCIe bus from inception), and one of the chipsets will have to do the work of recombining the digital signals and outputting them to the display device.

There may be some issues with the actual implementations on the cards, but in theory whether the chips are on individual PCBs or not really shouldn't be a huge factor.
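For comparison, the 3dfx-style interleaving the FAQ describes assigns alternating scanlines rather than contiguous bands. A toy sketch (hypothetical Python):

```python
def interleave_scanlines(height, n_gpus):
    # 3dfx-style SLI: GPU i renders every n_gpus-th scanline, starting
    # at line i, so the workloads stay balanced without a load balancer.
    return [list(range(i, height, n_gpus)) for i in range(n_gpus)]
```

Either way the frame is the union of the per-GPU line sets; the only real difference is where the combining happens (analog DACs back then, digitally now).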

"You have all recieved youre last warning. I am in the process of currently tracking all of youre ips and pinging your home adressess. you should not have commencemed a war with me" - Aaron Rayburn
Trippy
Administrator
Posts: 23620


Reply #25 on: August 22, 2006, 06:38:28 AM

Quote from: Engels
But hey, if you're right, all the better.
Of course I'm right. Read any reasonable review of the 7950 GX2 and it's patently obvious that it's just two cards mashed together in SLI mode.
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #26 on: August 22, 2006, 11:43:30 AM

The only problem with insisting that it's SLI technology (which it is) is that it may lead someone to conclude that the 7950 can't work on non-SLI boards, which it can. The card can run on many motherboards that don't use an Nvidia chipset. All the SLI logic is handled on the card, although it appears that some motherboards do require a BIOS update.

Furiously
Terracotta Army
Posts: 7199


WWW
Reply #27 on: August 23, 2006, 11:20:14 AM

I bought one about a month ago. It runs EQ2 on highest everything (except shadows) very pretty-like.

Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #28 on: August 23, 2006, 11:58:01 AM

Damn you, shadows! Why must you elude me?

Also, DX10 is just around the corner. I'll probably end up buying parts a couple months before a DX10 part is released. Although I do get a chuckle out of that 'artist's rendering' of DX9 vs 10 technology. Wtf?
Jain Zar
Terracotta Army
Posts: 1362


Reply #29 on: August 25, 2006, 03:07:21 AM

Pretty much everything is multi core now, 'cept maybe the Wii.
Macs have been dual and now quad core, the 360 is a dual, the PS3 is an octo core, 'cept it doesn't need all of 'em or something like that...

These physics cards seem pretty dumb though. But like someone mentioned, they probably want to be bought out or have their tech licensed to be put on videocards and such.

Not too many folks are gonna drop serious coin to have improved physics effects.  Or at least enough to get most game developers to bother coding for it anyhow.
Trippy
Administrator
Posts: 23620


Reply #30 on: August 25, 2006, 03:56:49 AM

Quote from: Jain Zar
Pretty much everything is multi core now, 'cept maybe the Wii. Macs have been dual and now quad core
The Mac Pro is two dual-core CPUs, not quad-core (those aren't out yet).

Quote from: Jain Zar
the 360 is a dual,
Tri-core.
Koyasha
Terracotta Army
Posts: 1363


Reply #31 on: August 25, 2006, 09:27:42 AM

Quote from: Jain Zar
Not too many folks are gonna drop serious coin to have improved physics effects. Or at least enough to get most game developers to bother coding for it anyhow.

Annoyingly, that's one of those chicken-and-egg things. If more games worked with it, more people would be inclined to buy it; but since few games work with it, few people buy it, so few devs bother to make games that work with it. The list of games that I might even want to play that work with the PhysX processor is: City of Heroes. And I'm not actively playing it, nor do I think buying a PhysX card would make me want to actively play it. If there are more future games that are going to work with it, they should be making a lot more noise about which games will be compatible, because while I see the ads for the card around, I see very little to tell me what games the card will actually be USED for.

-Do you honestly think that we believe ourselves evil? My friend, we seek only good. It's just that our definitions don't quite match.-
Ailanreanter, Arcanaloth
Tairnyn
Terracotta Army
Posts: 431


Reply #32 on: August 25, 2006, 12:41:27 PM

Quote from: Sky
I'll toss in my usual pining for an AI processor :) Developers, your AI sucks. All of you. For shame.

Until we can develop an AI that can gracefully handle incomplete state information and uncertainty in a real-time environment, no amount of extra cycles is going to significantly improve the state of the art. A dedicated AI processor would enable a deeper search tree with more predetermined production rules, which only takes the human a little longer to learn to exploit. One could argue that turn-based games would gain a lot from more processing muscle, but the gain is logarithmic since the search tree explodes exponentially. Learning algorithms for real-time games are a hot research topic right now, so there's hope on the horizon.

In my opinion, the paradigm shift in AI will occur when systems are bootstrapped with basic game information and the AI learns on its own how to play well and adapt to opponents given limited processing resources.
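The "gain is logarithmic" point is easy to make concrete: a game tree with branching factor b has on the order of b^d nodes at depth d, so multiplying the node budget by 10 buys roughly one extra ply. A toy count (hypothetical Python):

```python
def nodes(branching, depth):
    # Size of a full game tree: b^0 + b^1 + ... + b^depth nodes.
    return sum(branching ** d for d in range(depth + 1))

def reachable_depth(branching, budget):
    # Deepest full tree a fixed node budget can cover; because nodes()
    # grows exponentially, this answer grows only logarithmically.
    d = 0
    while nodes(branching, d + 1) <= budget:
        d += 1
    return d
```

With a branching factor of 10, a budget of a million nodes reaches depth 5 and ten million only reaches depth 6 - ten times the muscle, one more ply.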
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #33 on: August 25, 2006, 01:37:03 PM

At 4:30 on a Friday I don't have enough brain cells left to get into my pipe dreams for AI, but I'll just put this out there.

One of my angry-at-AI moments that sticks with me is from Medal of Honor: AA. You are escorting a British agent through Nazi territory. I circle around, enter the house from the back and work through to the front. There are several enemies in the front yard/road area. This plays out in only two ways.

1) I block the doorway so the brit can't run out into the yard and he shoots me in the back until I die. Mission over.

2) I let him pass and he's gunned down before I can kill all the nazis. Mission over.

Escort missions have always been a pet peeve of mine for as long as I can remember (back to at least Syndicate). I can't abide babysitting subpar AI. In the above scenario, the AI should /at least/ understand that he can't shoot through the door without hitting me, find cover and cover my back until I move out.
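That "find cover and cover my back" behavior is a one-line decision rule, which is what makes its absence so galling. A toy sketch (hypothetical Python; obviously not MoH:AA's actual AI):

```python
def escort_action(ally_blocks_line_of_fire, enemies_remain):
    # Minimal decision rule for an escort NPC: never shoot through
    # the player, never charge into a live firefight.
    if not enemies_remain:
        return "advance"
    if ally_blocks_line_of_fire:
        return "hold fire, take cover, watch the rear"
    return "engage"
```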
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #34 on: September 06, 2006, 09:54:01 AM
