Author Topic: New Tech To Revolutionize Gaming?  (Read 30126 times)
schild
Administrator
Posts: 60350


Reply #35 on: March 25, 2009, 01:50:27 PM

Both ideas are stupid. These guys would probably have a cheaper and more reasonable business plan running Netflix for PCs. Like, actual computers, because this business, unlike what I just said, is doomed to fail, whereas you might SOMEHOW MAGICALLY, if you have enough customers, manage to last a few years with my stupid ideas.

The whole thing is a fucking jump-to-conclusions mat. Their idea, your idea, my fake idea, all poop.
raydeen
Terracotta Army
Posts: 1246


Reply #36 on: March 25, 2009, 01:58:20 PM

BUT

The 1 ms time to receive the packet is meaningless and arbitrary without knowing the bandwidth and the packet size.  Even assuming a 5 Mbps download speed and 32-bit words, this gives a payload somewhere around 150 words.  How the fuck do you turn 150 words of data into acceptable graphics?

You can do it with a full client because the words just say move pc pov to x, play animation y, change numbers[z] and all the data and processing on doing that is on your side of the pipe.  It doesn't make sense to say you can do that with nothing doing processing on the receiving end.
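
A quick back-of-envelope check of the quoted figures (a sketch, assuming the 5 Mbps link and 32-bit words from the quote, not anything about OnLive's actual protocol):

    # How many 32-bit words arrive in 1 ms on a 5 Mbps link? (Python)
    link_bps = 5_000_000          # the quoted 5 Mbps download speed
    window_s = 0.001              # the 1 ms packet-arrival budget
    word_bits = 32                # one 32-bit word

    bits_in_window = link_bps * window_s          # 5,000 bits
    words_in_window = bits_in_window / word_bits  # ~156 words

    print(f"{bits_in_window:.0f} bits -> ~{words_in_window:.0f} words per ms")
    # ~156 words, in line with the "around 150 words" estimate above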

Here's how I understand it's supposed to work. The remote computer is doing all the game rendering and processing. You, the player, are basically watching a live streamed video of said game, which you interact with via a set-top box or controller plugged into your computer. I'm not sure if we're at crossed paths here or not, but as I said, the video probably isn't the problem. I'd like to see just how well that remote machine can respond to the button press/joystick move that I make on my end. There's got to be a noticeable lag between when I hit my button, when the remote machine picks up on my button hit, makes all its calculations, updates the display, and then streams it back to my screen. And considering all the sessions HAVE to be virtual server logins, do we have tech that can handle that?

We use remote desktop/server software where I work to connect Macs and PCs and such, and it's, umm... not as good as actually sitting in front of the real machine. Even doing voice chat over an Ethernet connection produces very noticeable lag when two or more computers have their audio going in the same room. I just can't see this being viable without some wicked fast connection.

I was drinking when I wrote this, so sue me if it goes astray.
RUiN 427
Terracotta Army
Posts: 292


Reply #37 on: March 25, 2009, 02:06:44 PM

In its simplest concept, it's basically replacing monitor cables and keyboard/mouse cables with the internet. And supposedly their compression makes it work (assuming servers ≥ top-of-the-line PC/Mac/console). Rendering has really nothing to do with whether it will work or not. It's purely about the compression and the connection speed (up and down). On average the networks I connect to at friends' houses, my house, work, etc. run around 1.7 Mbps down and 0.5 Mbps up (this is just an example).

That said, I can't expect to play Crysis at 720p on my TV without it being downgraded. However, I could and should expect good or better quality on my small MacBook anywhere I go. It won't be the best, but I'm sure it will be enjoyable.

Now, imagine for a second OnLive licenses the tech or partners with some devs for iPhone or Android phones... classics like Diablo 2 fully functional on your phone, anyone?
« Last Edit: March 25, 2009, 02:12:45 PM by RUiN 427 »

"There's been no energy reading of any sort on Cybertron for the past seven hundred or so stellar-cycles."
Prospero
Terracotta Army
Posts: 1473


Reply #38 on: March 25, 2009, 02:08:10 PM

Here's their bandwidth requirements: For Standard-Definition TV resolution, OnLive needs a 1.5 Mbps connection. For HDTV resolution (720p60), OnLive needs 5 Mbps.

All I can say is good luck with that. Maybe in a few years, but I live in the SF Bay Area and can't get those rates for a reasonable fee. Not to mention I already have problems with my wife watching Hulu and screwing up my online gaming. Having every game I "own" be at the whim of my net connection? Nu uh.
Trippy
Administrator
Posts: 23657


Reply #39 on: March 25, 2009, 03:07:55 PM

Just for clarity are you suggesting that the server should maintain in memory every pixel needed for the entire zone and the perspective of every client in the zone and then just stream the result at their native resolution directly to each client at 60 FPS?
That's exactly what OnLive is doing.  It runs a version of the game for each client, on their servers, and streams you the output.  N scenes, N copies of the graphics data, rendered N times.  Lots of server hardware, minimal win.

I'm saying that's not the right way to do it.

1 scene, 1 copy of the graphics data, rendered N times.  Exponentially faster, and capable of doing things even the best gaming PC can't (because the server controls the hardware).
Facepalm
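
A minimal sketch of the shared-scene model described above: one copy of the world data in memory, rendered once per connected client's camera. The Scene/Camera names and the render call are hypothetical, purely to illustrate the single-copy idea:

    # One scene in memory, N render passes -- versus N full game instances (Python)
    from dataclasses import dataclass

    @dataclass
    class Camera:                 # hypothetical per-client viewpoint
        client_id: int
        position: tuple

    class Scene:                  # hypothetical shared world state (geometry, textures)
        def render(self, camera: Camera) -> bytes:
            # stand-in for a real GPU pass; returns an encoded frame
            return f"frame for client {camera.client_id} at {camera.position}".encode()

    def frame_tick(scene: Scene, cameras: list[Camera]) -> dict[int, bytes]:
        # one copy of the scene data, one render pass per client
        return {cam.client_id: scene.render(cam) for cam in cameras}

    frames = frame_tick(Scene(), [Camera(1, (0, 0)), Camera(2, (5, 3))])
    print(frames[1])
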
fuser
Terracotta Army
Posts: 1572


Reply #40 on: March 25, 2009, 03:36:18 PM


awesome, for real

I don't understand the business model, did the underpants gnomes write it?

Thinking of the datacenter costs (bandwidth/servers/peak load/rendering/HVAC) plus whatever the subscription/per-access/per-game charge is, there's no fscking way they can beat the cost of a gaming PC/Xbox/PS3. And that's forgetting about the cost of the customer support they are going to require.

I want them to make it, just to see it crash and burn as someone DDoSes their rendering/app servers DRILLING AND MANLINESS
« Last Edit: March 25, 2009, 03:38:50 PM by fuser »
Prospero
Terracotta Army
Posts: 1473


Reply #41 on: March 25, 2009, 03:49:20 PM

Yeah, I think it will be much more interesting when we see what they plan to charge. I can't see it being less than $20 a month just for access, and then rental and MMO subscription fees on top of that. All I can imagine is they are hoping a lot of people sign up for the service and then don't use it all that much.

The presentation is cool to watch. I think they are trying to do things that are interesting, for instance not just seeing that a friend is playing a game, but being able to watch them play. Even if they die a miserable death this magic compression technology of theirs could be a real boon to the internet at large.
Daeven
Terracotta Army
Posts: 1210


Reply #42 on: March 25, 2009, 03:49:55 PM

Pikers.

Let me know when I can use quantum coupling to render the client in a solid state chip. Embedde in your skull. With a sledgehammer.

Then all of you will dance for me.

Dance my puppets! Dance!

"There is a technical term for someone who confuses the opinions of a character in a book with those of the author. That term is idiot." -SMStirling

It is by caffeine alone I set my mind in motion. It is by the beans of Java that thoughts acquire speed, the hands acquire shakes, the shakes become a warning. It is by caffeine alone I set my mind in motion
CadetUmfer
Terracotta Army
Posts: 69


Reply #43 on: March 25, 2009, 04:35:10 PM

Facepalm

What OnLive is doing is dumb and will not catch on.  It's the distributed computing version of a PhysX card.  Excuse me for throwing out some ideas for how this might work in a high-bandwidth, low-latency future.

Regarding latency, "fat games" are running into the same problems.  As environments become more interactive, developers can't rely on traditional client-side prediction as much anyway. (The code that makes your avatar move immediately even though [...] speed of light).
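
For the curious, a minimal sketch of what client-side prediction means: the client applies your input immediately, then reconciles once the authoritative server state arrives. All names and numbers here are hypothetical, not from any real engine:

    # Minimal client-side prediction: move locally now, reconcile with the server later (Python)
    pending = []          # inputs applied locally but not yet acknowledged
    x = 0.0               # predicted avatar position

    def local_input(dx, seq):
        global x
        pending.append((seq, dx))
        x += dx           # the avatar moves immediately, no round-trip wait

    def server_update(server_x, last_acked_seq):
        global x, pending
        # drop acknowledged inputs, then replay the rest on top of the server's truth
        pending = [(s, d) for (s, d) in pending if s > last_acked_seq]
        x = server_x + sum(d for _, d in pending)

    local_input(1.0, seq=1)                # feels instant on screen
    local_input(1.0, seq=2)
    server_update(0.9, last_acked_seq=1)   # server corrects input 1; input 2 is replayed
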
« Last Edit: March 25, 2009, 04:37:05 PM by CadetUmfer »

Anthony Umfer
Developer, LiftOff Studios
Prospero
Terracotta Army
Posts: 1473


Reply #44 on: March 25, 2009, 04:47:02 PM

There's nothing about their tech that would prevent them from doing what you're describing, but existing games are not written that way and would be non-trivial to port. I imagine an OnLive-based MMO would in fact keep most of the geometry in memory, since at any given point someone is looking at almost every object. However, if they want to have a reasonable stable of games on day one they are going to need to use existing binaries, and that means multiple copies being rendered individually.
Venkman
Terracotta Army
Posts: 11536


Reply #45 on: March 25, 2009, 05:04:02 PM

Err, I thought we had this all covered on the first page.

  • They are hosting the game and the entire graphical processing on their end.
  • They need the equivalent of one gaming rig per user account to pull this off. Unless someone tells me there's some way around this (and not retcon some future-state invention  awesome, for real), this is a requirement.
  • They are sending video streams equivalent to HD video to people who barely get that good enough on their TVs.
  • They are expecting to recognize in pseudo-realtime the inputs of a gaming controller.
  • They set the bar at the level of FPS games.

You only need to look at the games your cable provider offers in the Channel 900s range. Solitaire is going to be a pita, and using your TV remote will drive you back to your laptop right quick. Just look at the Gamespot video from GDC. On the best connection they could bring to the show, with none of that pesky internet in the way, they barely had playable FPS games.

As a way to play Peggle? Yea, no problem. Any turn-taking game would be fine.

And that doesn't even get into the business model, which I'm sure will end up being some combination of tiered subscription, premium OnDemand MTX, and embedded advertisement.

Seriously, good idea in the same theorycraft land that it's existed for years now.
Stephen Zepp
Developers
Posts: 1635

InstantAction


Reply #46 on: March 25, 2009, 05:45:40 PM

Here's the scary part: Early word back from GDC is that a substantial majority of the AAA publishers/developers are wildly excited about this and want to throw money at it.

Rumors of War
Samwise
Moderator
Posts: 19323

sentient yeast infection


Reply #47 on: March 25, 2009, 05:47:44 PM

Good.  It's going to be super awesome when they put this in a few homes and the early adopters all get their internet cut by Comcast because they exceeded their monthly bandwidth caps in one day.
fuser
Terracotta Army
Posts: 1572


Reply #48 on: March 25, 2009, 05:48:08 PM


    • They need the equivalent of one gaming rig per user account to pull this off. Unless someone tells me there's some way around this (and not retcon some future-state invention  awesome, for real), this is a requirement.

    This is exactly what boggles me: they're expecting to sell high-end (OK, 720p isn't that high-end in the PC gaming market) graphics as a service. You still need a dedicated GPU per person unless they have some self-created DirectX that can spawn out D3D rendering to multiple GPUs/CPUs like a bastardized WARP (which, from reviews, is slow even on the highest-end i7 CPUs).

    If not, they seem to be using a rack-mount processing workstation, or more realistically a workstation with framebuffer-capture middleware plus magical compression to their client.
    « Last Edit: March 25, 2009, 05:55:11 PM by fuser »
    Stephen Zepp
    Developers
    Posts: 1635

    InstantAction


    Reply #49 on: March 25, 2009, 05:53:36 PM

    The other thing that sticks out to me is that part of their technical solution is taking absolute control over routing information within the packets to obtain maximum throughput.

    Tell me how long that will last when Vonage and all of the other business models that depend on reliable internet throughput are shut out by cheating the RFC for transportation/routing protocols (can't remember which one it is atm--287 I seem to remember?).

    We got pissed at Comcast throttling--what's going to happen when 10 or 20 of these rigs show up on your local Cable network loop?

    Rumors of War
    Daeven
    Terracotta Army
    Posts: 1210


    Reply #50 on: March 25, 2009, 06:43:40 PM

    They need the equivalent of one gaming rig per user account to pull this off. Unless someone tells me there's some way around this (and not retcon some future-state invention  awesome, for real), this is a requirement.

    You doubt my Glorious Vision, Meat Puppet? The Future is made of awesomeness! And negative-ping code running on VAX/VMS terminals sticking out of the side of your head. Now go attend your Daily Hate, Prole.

    "There is a technical term for someone who confuses the opinions of a character in a book with those of the author. That term is idiot." -SMStirling

    It is by caffeine alone I set my mind in motion. It is by the beans of Java that thoughts acquire speed, the hands acquire shakes, the shakes become a warning. It is by caffeine alone I set my mind in motion
    Prospero
    Terracotta Army
    Posts: 1473


    Reply #51 on: March 25, 2009, 07:21:15 PM

    • They need the equivalent of one gaming rig per user account to pull this off. Unless someone tells me there's some way around this (and not retcon some future-state invention  awesome, for real), this is a requirement.


    Technically they need one rig per user during peak usage hours. Steam has something like 12 million users, but their peak is shy of a couple mil. Still, it's a non-trivial amount of hardware.
    Severian
    Terracotta Army
    Posts: 473


    Reply #52 on: March 25, 2009, 08:45:49 PM

    More details are out on what Dave Perry (Earthworm Jim, MDK, Shiny, Acclaim) is saying about his competing approach. It's named Gaikai.

    The big difference is that it's browser-based with no plugin beyond Flash. So it doesn't go direct to the TV as easily as OnLive, but there's also no little box you need to have.

    From what little info is available on both approaches, I don't think he's addressed latency at all, unlike OnLive, although he does say he needs to make a deal with an ISP. I would think the solution would involve QoS (see Vonage etc.), and sending button presses and stick wiggles doesn't require a lot of data, but I don't know much about what's going on with ISPs and QoS nowadays.

    Entire GameDaily interview below.
    Triforcer
    Terracotta Army
    Posts: 4663


    Reply #53 on: March 25, 2009, 09:26:57 PM

    Please humor the nearly computer-illiterate:

    Is the problem here something fundamental (breaking the internet laws of physics or something like that) or is it merely a function of "stuff ain't fast enough to do that yet" (in which case, this will be adopted by everyone in ten years when all those magic box speed and memory numbers are 100x greater)? 

    All life begins with Nu and ends with Nu.  This is the truth!  This is my belief! At least for now...
    Quinton
    Terracotta Army
    Posts: 3332

    is saving up his raid points for a fancy board title


    Reply #54 on: March 25, 2009, 10:07:56 PM

    "Thin clients" for high end games is stupid as hell.  Why in the world would you want to have to maintain all the CPUs/GPUs/ram on the backend?  What's the billing model that's going to justify this nonsense?

    I'm sure there are a lot of stupid gamesdev execs who are drooling over this due to promises of:
    - no piracy (the bits never exist on the untrusted client)
    - no used sales / trading (the user never even has media)
    - "more flexible pricing" (exciting new ways to gouge the user)
    - etc

    At the end of the day though, it's doomed to fail horribly for anything where latency and quality really matter... so it will only be feasible for trivial stuff, turn based, etc, which really don't need much client resources anyway (people can use flash as a platform), so what's the point.

    Also, Perlman is insane.  I know a number of people who worked with him at webtv, rearden, etc.  He is just totally nuts.
    Sheepherder
    Terracotta Army
    Posts: 5192


    Reply #55 on: March 26, 2009, 12:28:52 AM

    Please humor the nearly computer-illiterate:

    Is the problem here something fundamental (breaking the internet laws of physics or something like that) or is it merely a function of "stuff ain't fast enough to do that yet" (in which case, this will be adopted by everyone in ten years when all those magic box speed and memory numbers are 100x greater)?

    It pretty much fails on both counts.  Control responsiveness alone would be fucking hateful, because the signal would go controller -> console box -> big truck -> game server -> processing.  Add to that the requirement for something along the lines of a FiOS connection at your end, and basically a bare-bones gaming PC with a custom OS per concurrent user at their end ($599 is a reasonable estimate awesome, for real ).  Of course, maybe they will do some really weird shit with process threading, but a fucking quantum computer in every house is probably an easier sell at that point.
    Lantyssa
    Terracotta Army
    Posts: 20848


    Reply #56 on: March 26, 2009, 11:28:31 AM

    Please humor the nearly computer-illiterate:

    Is the problem here something fundamental (breaking the internet laws of physics or something like that) or is it merely a function of "stuff ain't fast enough to do that yet" (in which case, this will be adopted by everyone in ten years when all those magic box speed and memory numbers are 100x greater)? 
    Latency is going to be the big problem for any real-time app.

    Remember how people were bitching about WAR's control responsiveness?  It will be like that but worse, because the signal has to be processed on the far end before it's all packaged up and sent back.  Even if their tech can instantly render the scene, it's going to suck for responsiveness.

    Imagine watching a streaming video with no buffer, except what happens in the video depends on your input.

    Hahahaha!  I'm really good at this!
    Venkman
    Terracotta Army
    Posts: 11536


    Reply #57 on: March 26, 2009, 11:38:53 AM

    Exactly, which is why this is perfect only for games that don't require instant ongoing input. Casual games, for example. Maybe some of the light persistent worlds.

    Technically they need one rig per user during peak usage hours. Steam has something like 12 million users, but their peak is shy of a couple mil. Still, it's a non-trivial amount of hardware.

    That's a good point. I wonder what they expect their peak concurrency to be? Still, as you say, it's a non-trivial amount of extra hardware. And that's only 1/3 of the total problem.

    But companies will throw money at it anyway. Not because they think this is viable, but because they think someone else thinks it's viable. Network-effect hype. Nobody wants to be the naysayer, so they're forced into stating support because everyone else is.

    I predict we will see a bunch of PopCap titles on this for Fall, but that's it. Not because those games are crap. Quite the opposite (they're all fun, but my favorite is Venice). But rather because those are the only games that make sense for something like this with today's tech. As OnLive will soon find out when the companies throwing money at them unleash their techs, and as gamers get ahold of betas...
    « Last Edit: March 26, 2009, 11:40:31 AM by Darniaq »
    Sairon
    Terracotta Army
    Posts: 866


    Reply #58 on: March 26, 2009, 12:03:50 PM

    Here's the scary part: Early word back from GDC is that a substantial majority of the AAA publishers/developers are wildly excited about this and want to throw money at it.


    At management level, yes. Amongst the actual production people? Not so much. Pretty much every engineer in the industry who has talked about it is saying how stupid it is.

    1600 x 1200 @ 32-bit color depth at 60 FPS = ~460 MB/s; with lossless compression, perhaps half that, although compressing that amount of data in real time is sure to add extra latency and require a computer of fucking doom. I think we can safely assume they're going to compress the shit out of it using a lossy compression scheme, most likely at the hardware level for it to be cost-effective. The whole thing actually requires a very simple network model, which, surprise surprise, no twitch games get away with using. Even with a very stable ping at sub-100 ms levels, the input lag is very noticeable. Also, anything requiring any form of timing won't be fun if there are any spikes.
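
    A quick check of that arithmetic, as a sketch (the 2:1 lossless ratio is the post's rough guess, not a measured figure):

        # Raw video bitrate at 1600x1200, 32-bit color, 60 FPS (Python)
        width, height, bpp, fps = 1600, 1200, 32, 60

        bits_per_sec = width * height * bpp * fps        # ~3.69 Gbit/s raw
        mb_per_sec = bits_per_sec / 8 / 1_000_000        # ~461 MB/s raw

        lossless = mb_per_sec / 2   # the rough 2:1 lossless guess from the post
        print(f"raw: ~{mb_per_sec:.0f} MB/s, lossless: ~{lossless:.0f} MB/s")
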
    schild
    Administrator
    Posts: 60350


    Reply #59 on: March 26, 2009, 12:12:46 PM

    It's not scary. It just proves that the substantial majority of AAA business people need to be replaced. Fucking chuckleheads.
    pants
    Terracotta Army
    Posts: 588


    Reply #60 on: March 26, 2009, 02:22:03 PM

    Heh, glad to see in my denned duplicate thread, where I said it looked like negative ping code, that I'm not the only one to think this is doomed!  Dooooomed!
    Venkman
    Terracotta Army
    Posts: 11536


    Reply #61 on: March 26, 2009, 04:37:04 PM

    The whole thing actually represents a very simple network model, which surprise surprise, no twitch games get away with using. Even with very stable ping at sub 100 ms levels the input lag is very noticeable. Also, anything requiring any form of timing won't be fun if there's any spikes.

    Yarp. Do a tracert between their PC and their favorite COD4 server. Those are the hops taken the moment you move the analog stick/mouse enough to send a signal, because the client app doesn't do any of the magic that minimizes what is actually needed on the client vs the server. Oh, then strip away the graphics processing that is entirely client-based.
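
    That's easy enough to try yourself; a minimal sketch that shells out to the system route tracer (the hostname is a placeholder, and tracert/traceroute must be installed):

        # Trace the route to a placeholder game-server hostname (Python)
        import platform, subprocess

        host = "cod4-server.example.com"   # placeholder; substitute a real server
        cmd = "tracert" if platform.system() == "Windows" else "traceroute"
        subprocess.run([cmd, host])        # prints each hop and its round-trip times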

    This is a wet dream for people who don't want anything to do with retailers. It just doesn't survive first contact with reality.
    fuser
    Terracotta Army
    Posts: 1572


    Reply #62 on: March 26, 2009, 04:53:42 PM

    It's going to fail, then LodgeNet will buy it up to provide AAA gaming service in hotels/hospitals (any closed loop system).
    Yoru
    Moderator
    Posts: 4615

    the y master, king of bourbon


    Reply #63 on: March 26, 2009, 05:03:06 PM

    Exactly, which is why this is perfect only for games that don't require instant ongoing input. Casual games, for example. Maybe some of the light persistent worlds.

    ...

    I predict we will see a bunch of PopCap titles on this for Fall, but that's it. Not because those games are crap. Quite the opposite (they're all fun, but my favorite is Venice). But rather because those are the only games that make sense for something like this with today's tech. As OnLive will soon find out when the companies throwing money at them unleash their techs, and as gamers get ahold of betas..

    On your first point, it's called fucking Flash.

    As for the second point and all the non-technically-inclined in the thread: No, it will never work. Period. Leave aside the bandwidth issues detailed above. Leave aside the cost of a server cluster that can do both your server and client-side calculations including real-time rendering. Leave aside availability, peak concurrency, and all that other shit. Look entirely at latency.

    There's a theoretical speed limit to data transmission. No information, encoded in any way imaginable, can break the speed of light. 299,792,458 meters per second works out to 299.792458 kilometers per millisecond. Round to 300 for sanity.

    For every 300 kilometers separating your end terminal from their datacenter, there is an absolute floor of 1 millisecond in data transmission time, one-way, or 2 milliseconds round-trip. This assumes you're using a laser in an evacuated tube; in reality, your current best-case scenario is a fiber line directly from you to their datacenter, which transmits at around 2/3 the above speed, or about 200 kilometers per millisecond, and it's not running in a straight line. And then there's the latency involved in optical switching, plus converting from electronic to optical signals at the source and optical to electronic at the destination. This gets slower still if you make the system all-electronic, as you still need to do switching but suffer from the slower speed of electrical wave propagation compared to light: the unattainable best case is about 97% of light speed, with the average case probably in the 70% range.

    Even presuming you had the OnLive datacenter in your hometown with an awesome fiber link just a hop or two off your ISP, your latency floor will be around 20-30 milliseconds one-way. This is much slower than even early LCDs, which no one used for gaming precisely because of the awful, awful response times. Hell, they were painful enough just for desktop use, as you could move the mouse and perceive the delay between your motion and the pointer getting updated.

    In reality, again, you can't have a datacenter in every major metropolitan area unless you're fucking Google. Instead you'll get, maybe, one per time-zone. Your latency floor is now around 30-50 milliseconds, one-way. 60-100 milliseconds to see a response on the screen. It doesn't sound like much until you try it and throw your game controller through the window because it's so fucking frustrating.
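
    The propagation floor is simple to compute from the figures above (a sketch; it counts propagation only, so switching, encoding, and rendering time all come on top):

        # One-way propagation floor: distance / (fraction of light speed) (Python)
        C_KM_PER_MS = 299_792.458 / 1000   # ~300 km per millisecond in vacuum

        def one_way_ms(distance_km, speed_fraction=2/3):   # ~2/3 c in fiber, per the post
            return distance_km / (C_KM_PER_MS * speed_fraction)

        for km in (300, 1000, 3000):   # hometown, regional, one-per-time-zone-ish
            print(f"{km:>5} km: {one_way_ms(km):5.1f} ms one-way, "
                  f"{2 * one_way_ms(km):5.1f} ms round-trip")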

    Now add in all the other shit mentioned before I got into the math, and you should get why anyone who's been on the tech side of things is laughing into their coffee. Or, if they're sneaky, coming up with knock-off business plans to fleece the suits who think this sounds like a FANTASTIC idea, like Sun's THE NETWORK IS THE COMPUTER thing in the 90s, only Web2.0 and jazzy and hip.

    If you'll excuse me, I have a date with a word processor...
    Lantyssa
    Terracotta Army
    Posts: 20848


    Reply #64 on: March 26, 2009, 08:28:57 PM

    I predict we will see a bunch of PopCap titles on this for Fall, but that's it. Not because those games are crap. Quite the opposite (they're all fun, but my favorite is Venice). But rather because those are the only games that make sense for something like this with today's tech. As OnLive will soon find out when the companies throwing money at them unleash their techs, and as gamers get ahold of betas..
    It's not even useful for these.  Those games will run on just about any machine, and their entire model is based on being cheap to produce.  Why suddenly introduce hardware costs on their end when before all they needed was a web-server to either initialize the app or allow a download?  Ignoring the technical reasons why this is stupid, the costs aren't beneficial to the people who might have a use for it.

    Hahahaha!  I'm really good at this!
    Wasted
    Terracotta Army
    Posts: 848


    Reply #65 on: March 26, 2009, 08:52:21 PM



    As for the second point and all the non-technically-inclined in the thread: No, it will never work. Period. Leave aside the bandwidth issues detailed above. Leave aside the cost of a server cluster that can do both your server and client-side calculations including real-time rendering. Leave aside availability, peak concurrency, and all that other shit. Look entirely at latency.

    ...

    I certainly don't work in IT and am not exceptionally technical, but that's the first thing I thought of (without the specific numbers and stuff) when I read about this.  If the investors don't understand something that a dumb shit like myself knows, then they deserve to lose all their money.
    Prospero
    Terracotta Army
    Posts: 1473


    Reply #66 on: March 26, 2009, 08:53:42 PM

    So I actually played OnLive today and it is both  ACK! and DRILLING AND MANLINESS. I started playing Crysis Warfare on a Dell laptop using a mouse and keyboard. I think you can all guess which one it was. It was actually playable, but it was very unpleasant. The booth worker actually said that a PC FPS player would likely not be happy with the performance, and boy howdy she was right. If you tried to turn hard left or right to look for opponents, your view jumped wildly; it was so jerky it actually felt like render lag. It was quite lovely when I stood still, though.

    Later I popped back over and tried console games with a Logitech gamepad. I gotta say, it played very smoothly. I wanted to be angry and find something wrong, but it felt good. I tried Lego Batman and the latest Burnout and both played well. There were a couple hiccups in Lego Batman where there was clearly a network glitch, but otherwise it was as good as local. With Burnout I didn't notice a single difference from playing on a console. Watching other folks play and listening to them chat, most people seemed to agree that it was much more pleasant with a gamepad than mouse and keyboard.

    The basic menus were fairly snappy and the whole watch a friend thing was pretty darn cool to see live. Honestly I could see getting this for my living room if I had the pipe and the subscription fee wasn't ridiculous.
    Wasted
    Terracotta Army
    Posts: 848


    Reply #67 on: March 26, 2009, 09:31:12 PM

    So I actually played OnLive today and it is both  ACK! and DRILLING AND MANLINESS. I started playing Crysis Warfare on a Dell laptop using a mouse and keyboard.

    ...

    Did they say how far away the servers were located?  Were they 3m away in the next room?
    Prospero
    Terracotta Army
    Posts: 1473


    Reply #68 on: March 26, 2009, 09:36:09 PM

    50 miles. They are located in Santa Clara.
    Salamok
    Terracotta Army
    Posts: 2803


    Reply #69 on: March 27, 2009, 08:21:37 AM

    what kind of display was the console experience using?