f13.net  |  f13.net General Forums  |  Gaming  |  Topic: Stadia / Orion / Fuck Everything
Author Topic: Stadia / Orion / Fuck Everything  (Read 28951 times)
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #35 on: March 21, 2019, 04:55:48 AM

Since I've seen this mentioned so often with regards to 5G: 5G won't solve the latency issue of wireless networking. It physically can't. I've read the claim so often that 5G will magically reduce latencies to values even better than wired networking, and that claim is quite frankly utter bullshit.

I'm not even sure that 5G will solve connectivity issues, access issues or bandwidth issues. The technology is vastly more complex and expensive than LTE.

The 5G specification lists the "reduction" of latency as one of its goals, and this claim has mostly been misunderstood, because it refers to the network latency of 3G (UMTS, > 300 ms) or 4G (LTE, < 100 ms) networks, which can be in the order of hundreds of milliseconds. The 1 ms value that has been talked about is the "radio network contribution to packet latency", i.e. the additional latency that comes from the wireless connection between mobile device and base station. This is a function of bandwidth, latency introduced by multiplexing/multiple access of the wireless medium, scheduling (TDMA, FDMA, CDMA), connection signalling and handover mechanisms, and in general how the connection between cell tower and wireless device is set up, negotiated and managed.

In 3G (UMTS with HSPA data) all of those mechanisms add 300 ms or more to the total latency, in 4G (LTE) those mechanisms add less than 100 ms, and the PERFORMANCE TARGET for 5G is that those mechanisms shall contribute only 1 ms to the total latency. They are currently quite far off that target, however.

1 ms, however, is not the total latency of the entire network connection between mobile device and destination, because that latency still depends on the back-end infrastructure used to connect the base stations to the telecom backend and on how the telecom operator has set up its peering with the rest of the internet. Given that most cell towers connect to the back end via line-of-sight radio relay links, the end-to-end latency can still be in the hundreds of milliseconds.
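The point about the 1 ms target being only one term in the total can be sketched as a simple latency budget. The numbers below are illustrative figures consistent with the post, not measurements:

```python
# Illustrative one-way latency budget for a mobile connection.
# The 5G "1 ms" target only covers the first term; the rest is
# backhaul, core routing and peering, which 5G does not change.
def end_to_end_latency_ms(radio_ms, backhaul_ms, core_routing_ms, peering_ms):
    """Total latency is the sum of every hop's contribution."""
    return radio_ms + backhaul_ms + core_routing_ms + peering_ms

# Even with the radio link hitting its 1 ms target, radio-relay backhaul
# and routing can keep the total well above 100 ms (assumed values).
total = end_to_end_latency_ms(radio_ms=1, backhaul_ms=80,
                              core_routing_ms=30, peering_ms=40)
print(total)  # 151
```

The takeaway: optimizing one term to near zero barely moves the sum when the other terms dominate.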

Additionally 5G probably won't solve access, bandwidth or connectivity issues.

5G operates in 2 frequency bands, below 6 GHz (FR1) and above 24 GHz right into the millimeter wave spectrum (FR2). The claimed performance targets listed on the spec page only apply to FR2 at 24 GHz. Quote from the spec: "when using spectrum below 6 GHz performance targets are closer to 4G". 24 GHz and above is basically line of sight only with a very narrow beam, is easily disrupted by atmospheric conditions or obstacles like trees, walls etc., and completely fails when it rains. Even in ideal circumstances the usable range is about 1 km / 0.6 miles.

A large scale 24 GHz setup would require hundreds of base stations to cover even a few city blocks (compared to 1 or 2 LTE base stations), and those base stations would still need to connect to a wired back end. You'd probably need several just to ensure connectivity in your own apartment, given the line-of-sight nature of 24 GHz waves (they don't go through walls or most windows). This will be orders of magnitude more expensive than the existing infrastructure. Additionally, the performance targets only apply if the devices support what the material calls "massive MIMO", which means multiple antennae and multiple transceivers per device, and a large number of them.

The performance targets sound great on paper: 20 Gbit/s data rate, 1 ms of latency etc. If you read the entire spec, however, there are a few caveats to consider.

- 20 Gbit/s is the maximum achievable data rate over the wireless connection, per base station, at 24 GHz, if the devices support MIMO (edit: this means it is the shared data rate, which is then divided up depending on the number of users per base station)
- The "user experienced data rate" "achievable (...) across the coverage area" is 100 Mbit/s over the wireless connection, per base station, at 24 GHz, if the devices support MIMO.
- The "radio network contribution to packet travel time" is 1 ms, and again this applies only over the wireless connection, per base station, at 24 GHz, if the devices support MIMO.
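The gap between the 20 Gbit/s peak and the 100 Mbit/s "user experienced" rate is just the shared capacity divided among users. A sketch (the even split and the user count are idealizing assumptions):

```python
# The peak rate is per base station, shared by everyone on that cell.
def per_user_rate_mbit(shared_peak_gbit, active_users):
    """Idealized even split of one base station's capacity."""
    return shared_peak_gbit * 1000 / active_users

# 20 Gbit/s shared by 200 active users lands exactly on the
# 100 Mbit/s "user experienced data rate" target.
print(per_user_rate_mbit(20, 200))  # 100.0
```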

Additionally, there's currently not even agreement over which frequencies in the FR2 range should be used for 5G, and there are no international agreements to harmonize them.

Networks that use FR1 below 6 GHz won't perform significantly better than current 4G networks, and considering the vastly higher investment and infrastructure required for 24 GHz networking, the bulk of the 5G infrastructure will be below 6 GHz and resemble something almost indistinguishable from 4G LTE.

24 GHz networks will probably be deployed in high-density affluent urban areas, like the Bay Area or New York; most of the world will probably only get something that resembles current 4G networks as far as coverage, bandwidth and access are concerned.

Even then you'll probably still end up with a system that has comparable or even higher latency than current wired networks, since most of the latency of a network connection comes from routing/switching and connection-related issues, not from signal propagation, and you'll still experience network congestion and peak-usage issues.
« Last Edit: March 21, 2019, 05:00:12 AM by Jeff Kelly »
Cyrrex
Terracotta Army
Posts: 10603


Reply #36 on: March 21, 2019, 05:17:13 AM

Dang Jeff.

Here are some more points.  Or perhaps some of the same points, minus the difficult words and numbers:

- The speed of light is a thing.  Maybe Google has figured out how to surpass it, in which case we should be wondering why they are wasting time on gamers who like shitty things.  

- Weakest link. 5G sounds nice, but there is more to a network than that alone, and besides which....wtf does 5G have to do with most home computing?  Just makes me feel more like this is aimed at mobile gaming.

- Caps are a thing.  It is deliberate whitewashing on their part to suggest that ISPs will just adapt.  ISPs will adapt by gouging the fuck out of people.

- The speeds they are touting for 1080p and even 4K at the frames they suggest will require MASSIVE COMPRESSION using technologies that don't exist yet.  Except perhaps in the labs at Google.  Also, moderate compression on video files is one thing.  In an FPS game?  Hahahahahahahahaha.  So many problems.  Even compression on Netflix 4K annoys the shit out of me, and that isn't even close to the kind of compression Google is suggesting.

"...maybe if you cleaned the piss out of the sunny d bottles under your desks and returned em, you could upgrade you vid cards, fucken lusers.." - Grunk
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #37 on: March 21, 2019, 06:37:39 AM

The TL;DR is:

- The touted benefits of 5G will only affect affluent, densely populated, highly urban areas where the likelihood is high that the investment will pay off
- The rest will get something that is basically 4G but a little bit faster
- The coverage (or lack thereof) is commercially driven, so nothing significant will change wrt 5G coverage and bandwidth if you live somewhere more rural
- Nothing will significantly change wrt "soft" performance indicators like latency
Sky
Terracotta Army
Posts: 32117

I love my TV an' hug my TV an' call it 'George'.


Reply #38 on: March 21, 2019, 07:09:27 AM

I love the delusions of tech people who live in bubbles.

I didn't have broadband until 2001, and that's only because I moved closer to work. If I still lived up on the Tughill, I'd have dial-up (no cell service).
schild
Administrator
Posts: 60345


WWW
Reply #39 on: March 21, 2019, 07:11:21 AM

I love the delusions of tech people who live in bubbles.

I didn't have broadband until 2001, and that's only because I moved closer to work. If I still lived up on the Tughill, I'd have dial-up (no cell service).
i'm not QUITE sure what you're saying but

it's not a bubble

it's not giving a shit about people who live like cavemen
HaemishM
Staff Emeritus
Posts: 42628

the Confederate flag underneath the stone in my class ring


WWW
Reply #40 on: March 21, 2019, 08:33:16 AM

Also, he can fuck right off with that 5G bullshit like that's suddenly going to solve everything. That shit is still 2-4 generations of PHONES away from being a viable tech for anyone that isn't an early adopter living in a large metro area.

HaemishM
Staff Emeritus
Posts: 42628

the Confederate flag underneath the stone in my class ring


WWW
Reply #41 on: March 21, 2019, 08:43:36 AM

To add to my own skepticism about 5G doing anything for those outside of cities:

Way back in the early oughts, one of my company's clients that I was helping build a web site for was a company based in Mississippi that wanted to build a cell phone company in Montana. Why Montana? Fuck if I know, other than maybe the market was wide open. I think they burned through investor money for at least 2 years and maybe never even got the web site live before they went tits up. Why? Because trying to build a cell phone network that provided coverage of the whole state outside of the one or two semi-urban metro areas was a goddamn impossibility. And this was in the days before smartphones or getting email on your phone. This was literally getting a phone call on a phone in your hand without a wire. This isn't even last mile problems, this is getting to the last hundred fucking miles with a population density of "Reb and his fucking cows."

If you live in a city, you are in a very densely populated bubble. And unless you take regular trips into the mountains where no man has tread, you cannot understand the sheer desolation of being in rural fuckhole wherever, especially with regards to the amenities of modern life like electricity, running water, television. High-speed Internet? What the fuck is that? You'll take shitty low upload speed satellite and like it. You may as well be on the moon when it comes to products like fucking Google Stadia.

Draegan
Terracotta Army
Posts: 10043


Reply #42 on: March 21, 2019, 08:58:22 AM

Is this like that Assassin's Creed stream Google did? I remember that being really good from a playable standpoint.
Trippy
Administrator
Posts: 23611


Reply #43 on: March 21, 2019, 09:05:58 AM

Quote
The same connectivity challenges that certain physical locations may experience today are the same challenges that prevent them from streaming video, watching YouTube, getting music, playing an online game. While I’m not trying to marginalize those people, it is the reality of the world that we live in. But everything is moving to some kind of digital, some kind of connected future.
Oh and this part is unbelievable bullshit.  The amount of speed and bandwidth those things require is insignificant compared to what he is proposing with Stadia.  And that is before you even get into the latency and input lag conversation.
YouTube already does 1080P at 60 FPS. In fact it can do 4K as well. Stadia is doing the exact same thing — i.e. it’s sending a compressed video stream to your browser, *not* the uncompressed pristine video you get playing locally. The same infrastructure they already have to deliver YT video can be used to deliver Stadia streams.

Is this like that Assassin's Creed stream Google did? I remember that being really good from a playable standpoint.
Yes. But Odyssey’s combat is a button-mashy affair that almost never requires precise input timing, which makes it suitable for Stadia’s extra input lag.

SurfD
Terracotta Army
Posts: 4035


Reply #44 on: March 22, 2019, 12:06:27 AM

Is this like that Assassin's Creed stream Google did? I remember that being really good from a playable standpoint.
Yes. But Odyssey’s combat is a button-mashy affair that almost never requires precise input timing, which makes it suitable for Stadia’s extra input lag.
Yep.  Replace AC with God of War, and watch the rage flow through them as they miss literally every single QTE prompt every single time...

Also, while I would agree that general AC combat is definitely straight up button mashy, there are a few AC games where you get challenges, such as semi-dynamic chase events or "boss" fights (like the War Elephants in AC Origins), that would be straight up nearly impossible with any significant amount of input lag.
« Last Edit: March 22, 2019, 12:12:01 AM by SurfD »

Darwinism is the Gateway Science.
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #45 on: March 22, 2019, 03:21:53 AM

YouTube already does 1080P at 60 FPS. In fact it can do 4K as well. Stadia is doing the exact same thing — i.e. it’s sending a compressed video stream to your browser, *not* the uncompressed pristine video you get playing locally. The same infrastructure they already have to deliver YT video can be used to deliver Stadia streams.

YouTube does not re-encode/encode on the fly. They encode once during/after upload. YouTube transcodes every uploaded video into the different video profiles they support, starting with the lowest one. That's why you often only get the 240p stream at first when you watch a recently uploaded video. Once this is done, the player simply switches between the different 'versions' depending on your current bandwidth.
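That upload-time transcode ladder can be sketched roughly like this (the profile list is illustrative, not YouTube's actual internal configuration):

```python
# Sketch of the one-time transcode ladder described above: each upload is
# encoded once per supported profile, lowest resolution first, so the
# low-res versions become watchable soonest. Profile names are illustrative.
PROFILES = ["1080p", "240p", "720p", "360p", "2160p", "480p"]

def transcode_order(profiles):
    """Order in which versions of a fresh upload become available."""
    return sorted(profiles, key=lambda p: int(p.rstrip("p")))

print(transcode_order(PROFILES))
# ['240p', '360p', '480p', '720p', '1080p', '2160p'] -- why new uploads start blurry
```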

Encoding quality and efficiency are in this case limited only by available memory and CPU, and given that you have the complete video available to you, you can do all kinds of nifty predictive encoding and variable bitrate shenanigans using P- and I-frames to give the user the best quality per bitrate. You also don't have to encode in 'real time', because you only need to encode a video once per supported profile. This is the reason why you can stream a 3840x2160 10-bit HDR video @ 60 frames and still get a 'reasonable' bandwidth requirement (still several Mbit/s). Even live video streaming services add an encoding delay of several seconds so that their encoders can work more efficiently and use predictive 'look-ahead' encoding options.

Also, since everyone who watches a certain video gets the same data YouTube could theoretically do multicasting to conserve even more bandwidth.

Even with all of those nifty features in place, a 1080p YouTube video requires about 3 GB per hour and a 4K 2160p video up to 12 GB per hour. This is already practically impossible for mobile users given the data caps and overage fees, and would still be a problem for many wired internet connections because of data caps and network congestion.
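The GB-per-hour figures follow directly from the stream bitrate. A quick conversion (the bitrates below are ballpark assumptions chosen to reproduce the numbers above):

```python
# Convert a sustained stream bitrate in Mbit/s into data use in GB/hour:
# Mbit/s * 3600 s/h gives Mbit/h, divide by 8 for MB, by 1000 for GB.
def gb_per_hour(bitrate_mbit_s):
    return bitrate_mbit_s * 3600 / 8 / 1000

# ~6.7 Mbit/s for 1080p60 and ~26.7 Mbit/s for 4K are assumed bitrates.
print(round(gb_per_hour(6.7), 1))   # 3.0  -- the ~3 GB/hour 1080p figure
print(round(gb_per_hour(26.7), 1))  # 12.0 -- the ~12 GB/hour 4K figure
```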

If you have to do low-latency real-time encoding of video, you lose almost all of the features of modern codecs that enable high compression rates while still retaining good image quality. That's because those features need to process upcoming frames in order to compress the current frame more efficiently. If you can't do that - and with low-latency real-time encoding you can't, because buffering and looking ahead is what causes latency in the first place - you'll either end up with a video stream that retains the image quality but has a much lower compression ratio, or a stream that retains the compression ratio but has much worse image quality. There's also the upper limit of 16.6 ms per frame for 60 Hz to consider, because it bounds the amount of work the encoder can reasonably do before it has to put out a frame, and the amount of work needed per frame depends on the actual content and is not constant. Real-time codecs are also much more susceptible to noise and to the type of video content.
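The 16.6 ms figure is just the frame period at the target refresh rate, a trivial sketch:

```python
# Hard per-frame deadline a real-time encoder must hit: one refresh
# period. Any look-ahead over future frames would blow this budget.
def frame_budget_ms(fps):
    return 1000 / fps

print(round(frame_budget_ms(60), 1))  # 16.7 -- the ~16.6 ms per-frame limit at 60 Hz
print(round(frame_budget_ms(30), 1))  # 33.3 -- why 30 fps streams are easier to encode
```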

This is the main reason why Twitch usually has an up to 20 second stream delay and also why most digital broadcasting adds 3 - 5 seconds of delay compared to plain old analogue TV. I'd reckon that a significant portion of the 160 ms to 200 ms of latency is actually encoder delay so that they can do at least some video compression.

You can check how this affects image quality yourself. Just watch Twitch when a streamer plays a game that has lots of red (codecs perform worst on content that contains lots of red) or where a lot of the on-screen content changes quickly. The encoder completely shits itself, because the encoding time per frame increases with the rate of change per frame but the output still needs to be processed in 'real time' while keeping the selected bitrate. As the actual encoding time per frame reaches the limit (16.6 ms per frame for 60 Hz) the codec can either reduce the compression ratio to keep up, or reduce the amount of frame processing done on the video feed while retaining the compression ratio (worse image quality).

Google certainly knows how to run a data center and can handle the bandwidth required for video streaming but they can't magically remove all of the issues that you have if you have 'unpredictable' content with high rates of change on a frame per frame basis that needs to be streamed 'real-time' with low latency.

If they target the same IQ as YouTube video at the same resolution the required bitrate will be significantly higher (50% or more) and if they target the same bitrate as YouTube video they'll end up with a significantly worse image quality. Or latency increases to a point where actually playing a game becomes impossible.
Falconeer
Terracotta Army
Posts: 11124

a polyamorous pansexual genderqueer born and living in the wrong country


WWW
Reply #46 on: March 22, 2019, 07:20:33 AM

Your post is very informative, Jeff. But it doesn't exactly explain to me how PlayStation Now works. I mean, I know that PS Now is not 4K, and so it kind of proves your point about image quality having to drop, but it's still quite marvelous both in terms of image quality and latency, and we are talking about something that has been available for about two years.

My point is, when I read about PS Now I laughed, but when I tried it I was somewhat impressed. I don't have any faith in Google Stadia being able to keep their promises, but I don't have a hard time believing they have some aces up their sleeve to vastly improve on a service that already exists, works quite well, and is two years old.

Not sure if I explained myself: PS Now has big limits but it works well, it's also old and not-Google. Wouldn't be surprised if Google pulled off something not perfect but surprisingly good without bending the laws of physics.

Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #47 on: March 22, 2019, 08:56:07 AM

It sort of depends on what you want and what you optimize for.

PS Now limits video output to 720p even for PS4 titles but uses a ridiculously high bitrate of 12 Mbit/s. This is basically what YouTube uses for 1440p video and about 4 times what you would use for 720p video streaming.
This means that PS Now optimizes for latency and uses a very inefficient yet probably very fast encoder that doesn't add a lot of encoding latency. They sacrifice low bitrates for latency, with the average bitrate being about 4 times as high as Netflix or YouTube video. The main problem is that this doesn't scale. If they used 1080p video in the same setup the bitrate would probably double to 25 Mbit/s, which is significantly higher than even YouTube or Netflix 4K video, and at 4K UHD they'd use 50 Mbit/s, which is higher than most Blu-rays.
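As a rough sketch of why this doesn't scale, the doubling-per-resolution-step rule below is an assumption that just matches the ballpark numbers above:

```python
# Assumed rule of thumb from the post: each resolution step
# (720p -> 1080p -> 4K) roughly doubles the bitrate needed to keep
# image quality constant with a fast, inefficient encoder.
def projected_bitrate_mbit(base_720p_mbit, steps_up):
    return base_720p_mbit * 2 ** steps_up

print(projected_bitrate_mbit(12, 0))  # 12 -- 720p, PS Now today
print(projected_bitrate_mbit(12, 1))  # 24 -- ~25 Mbit/s at 1080p
print(projected_bitrate_mbit(12, 2))  # 48 -- ~50 Mbit/s at 4K, beyond most Blu-rays
```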

Additionally, the PS3 and PS4 both have low input latency to begin with. The PS4 DualShock controller, for example, adds only 3 ms of input latency even though it's wireless.

Eurogamer claims that PS Now only adds 4 frames, or 66 ms, of additional input lag, and that the lag is very consistent with nearly no spikes or jitter. The total input lag times are slightly lower than Stadia, at least when EG tested the system in 2017. They lie between 110 ms and 180 ms depending on the game and the network connection, with wireless being worst. EG claims that it "feels" like the latency is lower than it is because the times are very consistent on average. Given that video is 720p, though, and given that the encoder seems to be very fast, PS Now clearly favors latency over image quality and resolution, and even then 100 ms - 150 ms seems to be the achievable limit.
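The frames-to-milliseconds conversion behind the "4 frames or 66 ms" figure is just the refresh period times the frame count:

```python
# Streaming lag quoted in frames converts to milliseconds via the
# display refresh period (1000 ms / fps per frame).
def added_lag_ms(frames, fps=60):
    return frames * 1000 / fps

print(round(added_lag_ms(4), 1))  # 66.7 -- the "4 frames / 66 ms" figure above
```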

Google markets the service as something indistinguishable from a local game, though. They claim that you get 1080p or 4K video at the same image quality as a PC, using the same bitrate as YouTube, with comparable latencies, over mid-range internet connections, using hardware and video settings that are significantly higher than what a PS4 offers.

Basically PS Now, but with lower latency, 4 times the resolution at 1/4th of the bandwidth, and with games that have significantly higher HW requirements.

In my opinion this is almost impossible, because those performance indicators are interdependent and also because Google can only control them in their own network and doesn't know what happens from the border of their data center until the stream reaches the user.
« Last Edit: March 22, 2019, 09:14:22 AM by Jeff Kelly »
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #48 on: March 22, 2019, 09:06:41 AM

I'm not necessarily saying that this is impossible for a company like Google, which could theoretically throw infinite amounts of money at the problem, but Google/Alphabet is not a startup anymore and Stadia needs to be commercially viable or it goes the way of almost all the other Google services before it.

This necessarily limits the number of servers and the hardware they can use, if they want to be profitable.

One reason OnLive failed was because they were never able to amortize the infrastructure, HW and operational costs necessary to run a decent service at the subscription pricing they offered.

01101010
Terracotta Army
Posts: 12002

You call it an accident. I call it justice.


Reply #49 on: March 22, 2019, 01:53:21 PM

In 50 years, these controllers will be worth thousands as collectors' items, similar to Atari ET carts.

Does any one know where the love of God goes...When the waves turn the minutes to hours? -G. Lightfoot
schild
Administrator
Posts: 60345


WWW
Reply #50 on: March 22, 2019, 04:56:10 PM

They're going to sell infinite of these controllers. It works with every console and PC.
Trippy
Administrator
Posts: 23611


Reply #51 on: March 22, 2019, 05:09:14 PM

Huh? What? I don't remember them saying the Stadia controller is going to be compatible with anything other than Stadia. They did say that USB controllers can be used instead of the Stadia controller.
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #52 on: March 22, 2019, 05:17:27 PM

The controller connects directly to the Google Stadia servers via WiFi. It will only work with Stadia.
schild
Administrator
Posts: 60345


WWW
Reply #53 on: March 22, 2019, 05:28:55 PM

oh no i read it backwards

how stupid

don't buy this
HaemishM
Staff Emeritus
Posts: 42628

the Confederate flag underneath the stone in my class ring


WWW
Reply #54 on: March 22, 2019, 06:17:47 PM

The controller connects directly to the Google Stadia servers via WiFi. It will only work with Stadia.

What? the? Fuck? Why would you... what would you possibly want to do that for? It's like making fucking air buds only work if you sign into Apple Music or some equally stupid shit.

Trippy
Administrator
Posts: 23611


Reply #55 on: March 22, 2019, 06:29:42 PM

Because it cuts down on input lag. Seriously. And removes the need to install a special device controller on every device you might want to stream to.
schild
Administrator
Posts: 60345


WWW
Reply #56 on: March 22, 2019, 10:13:08 PM

it's going to be so bad
Trippy
Administrator
Posts: 23611


Reply #57 on: March 23, 2019, 02:08:44 PM

YouTube already does 1080P at 60 FPS. In fact it can do 4K as well. Stadia is doing the exact same thing — i.e. it’s sending a compressed video stream to your browser, *not* the uncompressed pristine video you get playing locally. The same infrastructure they already have to deliver YT video can be used to deliver Stadia streams.
[...Jeff writes lots of stuff about encoding video...]
Uh, my reply to Cyrrex had nothing to do with the intricacies of real-time video encoding. He was claiming that distributing Stadia streams is somehow far beyond what is available now when in fact it's *exactly* like what's available now on YT.

But you keep doing you, awesome, for real

However since I'm here now...

Quote
If you have to do low-latency real-time encoding of video, you lose almost all of the features of modern codecs that enable high compression rates while still retaining good image quality. That's because those features need to process upcoming frames in order to compress the current frame more efficiently. If you can't do that - and with low-latency real-time encoding you can't, because buffering and looking ahead is what causes latency in the first place - you'll either end up with a video stream that retains the image quality but has a much lower compression ratio, or a stream that retains the compression ratio but has much worse image quality. There's also the upper limit of 16.6 ms per frame for 60 Hz to consider, because it bounds the amount of work the encoder can reasonably do before it has to put out a frame, and the amount of work needed per frame depends on the actual content and is not constant. Real-time codecs are also much more susceptible to noise and to the type of video content.

This is the main reason why Twitch usually has an up to 20 second stream delay and also why most digital broadcasting adds 3 - 5 seconds of delay compared to plain old analogue TV. I'd reckon that a significant portion of the 160 ms to 200 ms of latency is actually encoder delay so that they can do at least some video compression.
<<that's_not_how_any_of_this_works.gif>>

The multi-second delays in Twitch streams and live broadcast TV (at least here in the US) have absolutely nothing to do with the encoding process.

The two main sources of "chat delay" in Twitch, where what you see in the chat window shows up a few seconds later in the video stream for those streamers that include their chat in a video overlay, are the use of 2-second HLS segments and an intentional delay on the streamer's side that some streamers add to prevent "stream sniping". You can verify the HLS segment size for yourself if you watch a streamer in a browser with a "debug" console to view the network segment requests. If you download one of them you can verify that they are two seconds long[1]. And if you watch a streamer that's in the "Just Chatting" category or playing a non-competitive game with a good setup (i.e. with no intentional stream delay) and a chat overlay, you can measure for yourself that the delay is in fact about 2 seconds.

There can be a delay caused by Twitch transcoding the source stream into other resolutions if you are not watching the source stream, but Twitch does *not* transcode the source stream itself -- it only transmuxes it into HLS format[2], which does add a small delay but not a multi-second one, as can be confirmed with the test above.
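The HLS arithmetic above can be sketched in a few lines. The transmux overhead constant here is an assumption, not a measured Twitch figure:

```python
# An HLS player typically waits for at least one complete segment before
# playback, so delay ~= buffered segments * segment length, plus a small
# packaging/transmux overhead (the 0.2 s default is an assumption).
def hls_delay_s(segment_len_s, buffered_segments, transmux_overhead_s=0.2):
    return buffered_segments * segment_len_s + transmux_overhead_s

# With Twitch's 2-second segments and one buffered segment you land
# near the ~2 second chat delay described above.
print(hls_delay_s(2, 1))  # 2.2
```

Note the delay comes from segmentation and buffering, not from the encoder itself, which is the point being made here.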

As for live broadcast TV delays, here in the US live TV is usually delayed by at least a few seconds (typically 5 to 7) to prevent another mishap like the 2004 Super Bowl halftime show, though broadcast delays have been around for much longer than that[3]. The delay is often handled through a "dump box"[4] and is not inherent in the video encoding process itself. In fact, if you look at the specs for broadcast-ready MPEG2 hardware encoders you'll see that there are plenty of options for "low-latency" encoders that add < 0.5 seconds of latency[5].

Getting back to low-latency video encoding, the state-of-the-art these days for ultra-low latency encoding is "intra-frame" encoding (encoding parts of a frame at a time) which can lower latency to <5 ms[6]. I haven't used any myself so I don't know how the quality compares to "full-frame" encoders though Fraunhofer claims their ultra-low latency encoder compares favorably to the reference H.264 encoder with similar settings. I wouldn't be surprised if Google is testing this sort of thing with their Stadia platform.
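A back-of-the-envelope for why slice-based ("intra-frame") encoding can get into the <5 ms range, with the slice count as an assumed parameter:

```python
# With intra-frame (slice-based) encoding the encoder can emit a slice
# as soon as it is captured, so the encode-latency contribution shrinks
# from one full frame time to roughly one slice time.
def slice_encode_latency_ms(fps, slices_per_frame):
    frame_time_ms = 1000 / fps
    return frame_time_ms / slices_per_frame

# At 60 Hz, splitting each frame into 8 slices (assumed) cuts the
# latency contribution from ~16.7 ms to ~2.1 ms -- inside the <5 ms claim.
print(round(slice_encode_latency_ms(60, 8), 1))  # 2.1
```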


[1] I captured segments from the OWL stream and heyZeusHeresToast's stream, both used 2 second segments

[2] I.e. it's doing a codec copy in ffmpeg rather than a transcode/reencode: https://blog.twitch.tv/live-video-transmuxing-transcoding-ffmpeg-vs-twitchtranscoder-part-i-489c1c125f28

[3] https://www.atlasobscura.com/articles/how-live-is-live-tv

[4] https://en.m.wikipedia.org/wiki/Broadcast_delay

[5] E.g.: https://www.thormodulators.com/files/How_to_Choose_Right_CATV_Modulator.pdf, http://cdn-docs.av-iq.com/dataSheet/Futura%20III%20ASI%2BIP_Datasheet.pdf

[6] Fraunhofer Ultra Low Latency Video Codec, CAST Low-Latency AVC/H.264 Baseline Profile Decoder Core
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #58 on: March 23, 2019, 08:58:18 PM

I worked for Fraunhofer for 8 years (until 2011) and I personally know some of the people who worked on MP3, MP4 AAC and H.264.

Please keep in mind that ultra low latency encoders are usually targeted for computer vision applications in automotive, medical or industrial applications where you are able to control all of the factors that introduce latency. And where you can basically design the whole application yourself. Including the content aware predictors/algorithms.

Intel, which offers FPGA and IP-core based encoder/decoder solutions (Altera based) that actually use the Fraunhofer HHI codec as one of the supported codecs, "only" achieves 20 ms encoder delay at 1080p in ideal circumstances: encoder and decoder are both FPGA based, UDP and RTP transmission is done in hardware, boards are directly connected via Ethernet, and they eliminated most of the buffers, including the display controller buffers.

https://www.intel.com/content/dam/www/programmable/us/en/pdfs/literature/wp/wp-cast-low-latency.pdf

That's a setup you can basically only use if you design your own application (computer vision for automotive, medical or industrial) that needs the low delay and where you can control all of the variables (bitrate adaption systems, decode buffer sizes, network transmission delays etc.).

And that's even though they use all of the techniques you mentioned, including intra-frame refresh so that the system doesn't need to buffer complete frames, and content-adaptive algorithms.

That's still a one-frame delay for 60 Hz video, you'd need absolutely ideal circumstances, and you still have the tradeoff between bitrate and picture quality.

Most encoder suppliers would actually consider 100 ms to be low latency compared to the multi-second delays that you usually have in digital broadcasting systems.

I'm not talking about US live broadcast delay either; I'm talking about the delay introduced by DVB broadcasting systems compared to analogue TV, which - because of the decoders/encoders, buffers and bitrate adaption - can be an additional several seconds.

Considering all of the different factors, 150 ms is actually a pretty great achievement as far as input latency is concerned. I've also never claimed that you cannot optimize any single one of the parameters down to its absolute optimum.
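To make that 150 ms figure concrete, here is a rough, purely illustrative latency budget for a 1080p60 streaming pipeline. Every number in it is an assumption for the sake of the arithmetic, not a measurement of Stadia or any real product:

```python
# Illustrative latency budget for a 1080p60 game-streaming pipeline.
# Every figure here is an assumption chosen for the arithmetic only.

FRAME_MS = 1000 / 60  # one 60 Hz frame period, about 16.7 ms

budget = {
    "input sampling + game simulation": 2 * FRAME_MS,
    "hardware encode (one frame)":      FRAME_MS,
    "network round trip":               30.0,
    "jitter / receive buffer":          FRAME_MS,
    "decode":                           FRAME_MS,
    "display scan-out":                 FRAME_MS,
}

total_ms = sum(budget.values())
for stage, ms in budget.items():
    print(f"{stage:34s} {ms:6.1f} ms")
print(f"{'total':34s} {total_ms:6.1f} ms")
```

With these assumptions the sum lands around 130 ms before any congestion or Wi-Fi jitter is counted, which is why an end-to-end 150 ms is a respectable result rather than evidence of sloppiness.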

I'm still skeptical, though, that they can do everything at once: very low latency, very high image quality, average bandwidth, PC gaming at high settings, over averagely congested network connections, for a $20 subscription fee.

If they do, great. I'll reserve judgement until then, though.
Raph
Developers
Posts: 1472

Title delayed while we "find the fun."


WWW
Reply #59 on: March 24, 2019, 12:18:13 PM

One thing most people are missing is that the 160ms delay is inclusive of the game's own latency.

An enormous number of the games people play today already sit at that latency. Rockstar's stuff is slower; Ubi's stuff is around that mark already. In fact, Stadia benchmarks are actually slightly *faster* than the same game on some hardware.

Bottom line though, the vast majority of the audience can't tell. Are there genres that won't work? Sure. There are genres that didn't work on touch screens too.

Other big things:

- instant play from any little bit of video or GIF is a huge huge thing. Think of how much the accessibility of free to play changed market penetration. That's a big deal.

- server side compute power. It'll be years before this really gets used (we don't use the compute we have NOW). But it opens huge doors.

I have plenty of skepticism about aspects of Stadia, which I have gotten to personally express to people on the team. :) But I wouldn't discount it.
HaemishM
Staff Emeritus
Posts: 42628

the Confederate flag underneath the stone in my class ring


WWW
Reply #60 on: March 24, 2019, 12:46:22 PM

As I said before, none of the technical shit on Stadia will matter one good goddamn if the games library isn't absolutely top-notch on day one. If it's just a streamed version of the Google Play store's lineup of mobile trash, it will be utter shit.

Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #61 on: March 24, 2019, 01:38:25 PM

Raph, how would you suggest engines be redesigned in order to best make use of the new paradigm?

One fear people have is that games will be optimized and designed to fit the current streaming platforms (high latency, average image quality) and that this will kill more genres.

My question is how you could change the engines so that the whole stack is ultra low latency, in a way that could keep e.g. fighting games viable.
Raph
Developers
Posts: 1472

Title delayed while we "find the fun."


WWW
Reply #62 on: March 24, 2019, 09:32:36 PM

Well, first off, not everyone is going to jump to streaming solutions. And barring exclusivity deals, games will likely end up on multiple platforms, so any platform specific features can’t be too extreme.

To really take advantage of Stadia, you want to use the cloud compute. Google has built some tech that basically gives you more GPUs close to transparently. That gives the “easy” upgrade, which is higher-end graphics.

Making real use of the additional CPU, though, requires having something to use it on, and tech to use it. Game engines aren’t going to magically gain better simulation, cooler AI, more player concurrency, and so on. Even if we could just flip a switch and have 10000 people in a battle royale, it doesn’t mean the game will be any good. But that sort of higher CPU sim is actually what Stadia will eventually be good at.

Stuff like AI as a microservice allowing smarter companion AI, world proc gen as you move around, perhaps to add detail as you look closely at things, generating custom animations as you move reacting to the environment, and so on.

Lower latency will come, but only to a degree. Can’t break the laws of physics.
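Raph's point about the laws of physics can be made concrete with a back-of-envelope sketch: light in optical fiber travels at roughly two thirds of c, about 200 km per millisecond, which puts a hard floor under round-trip time no matter how good the protocol gets. The distances below are arbitrary examples:

```python
# Physical floor on network round-trip time: light in fiber travels
# at roughly 2/3 of c, i.e. about 200 km per millisecond. No protocol
# improvement (5G included) can beat distance itself.

FIBER_KM_PER_MS = 200.0

def min_rtt_ms(one_way_km: float) -> float:
    """Theoretical minimum RTT over fiber, ignoring switching,
    queuing and last-mile delays entirely."""
    return 2 * one_way_km / FIBER_KM_PER_MS

for km in (50, 500, 2000):
    print(f"{km:5d} km to the data center -> at least {min_rtt_ms(km):5.1f} ms RTT")
```

Even 500 km of one-way fiber costs 5 ms before a single router or radio link is involved, so where the data center sits matters more than any air-interface improvement.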
Kageru
Terracotta Army
Posts: 4549


Reply #63 on: March 25, 2019, 12:24:13 AM


That would also depend on the cost of that high-performance compute power. The CPU in my desktop does not require ongoing rental.

A good sales pitch for people who want to play very complex games on lightweight platforms, but those people are probably also price sensitive.

Is a man not entitled to the hurf of his durf?
- Simond
satael
Terracotta Army
Posts: 2431


Reply #64 on: March 25, 2019, 01:29:10 AM

I could see this working for an MMO, especially if it was made exclusively for the platform.
Druzil
Terracotta Army
Posts: 550


Reply #65 on: April 02, 2019, 01:16:16 PM

It will be neat if this revives the Xbox Live days of almost all games on the service having demos. Being able to hop in and play 10 minutes of a game without downloading 20 GB would be nice.
Lucas
Terracotta Army
Posts: 3298

Further proof that Italians have suspect taste in games.


Reply #66 on: June 03, 2019, 09:55:43 AM

Big announcements coming this Thursday, 6th June, at 9am PDT/6pm CET:

https://www.youtube.com/watch?v=klipg69IyB0

" He's so impatient, it's like watching a teenager fuck a glorious older woman." - Ironwood on J.J. Abrams
Jeff Kelly
Terracotta Army
Posts: 6920

I'm an apathetic, hedonistic, utilitarian, nihilistic existentialist.


Reply #67 on: June 03, 2019, 10:37:23 AM

To really take advantage of Stadia, you want to use the cloud compute. Google has built some tech that basically gives you more GPUs close to transparently. That gives the “easy” upgrade, which is higher-end graphics.

Has Google found a way to eliminate network latency? I assume the 'gives you more GPUs' option is something more sophisticated and quicker than people taking apart your servers and putting additional graphics hardware into them on demand. So I guess additional GPU hardware means additional servers, right?

You still have the overhead of parallelizing the GPU workload and transferring it to additional servers over the data center network, then collecting the rendered data and sending it to the client, which incurs a significant amount of additional latency.
Goumindong
Terracotta Army
Posts: 4297


Reply #68 on: June 04, 2019, 05:12:27 PM

If Stadia can do the end turn calculations for my total war games while letting me run the battles then its got legs
Trippy
Administrator
Posts: 23611


Reply #69 on: June 04, 2019, 05:40:03 PM

To really take advantage of Stadia, you want to use the cloud compute. Google has built some tech that basically gives you more GPUs close to transparently. That gives the “easy” upgrade, which is higher-end graphics.
Has Google found a way to eliminate network latency? I assume the 'gives you more GPUs' option is something more sophisticated and quicker than people taking apart your servers and putting additional graphics hardware into them on demand. So I guess additional GPU hardware means additional servers, right?

You still have the overhead of parallelizing the GPU workload and transferring it to additional servers over the data center network, then collecting the rendered data and sending it to the client, which incurs a significant amount of additional latency.
It looks like Google has some sort of shared PCIe "backbone" to which instances can attach GPUs, as their GPU specs page specifically lists the interconnect as PCIe Gen 3 x16 (plus one instance of NVLink). However, you cannot add or remove GPUs "close to transparently", as you have to shut down the instance to make those sorts of changes. I.e. Google hasn't solved the rendering-pipeline network latency issue because they aren't using the network to "attach" GPUs to instances -- they are using PCIe instead.
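A rough sketch of why the transport choice matters: moving one uncompressed 4K framebuffer over PCIe Gen3 x16 versus a nominal 10 GbE datacenter link. The bandwidth figures are nominal peak numbers assumed for illustration; real throughput is lower:

```python
# One uncompressed 4K RGBA framebuffer over PCIe Gen3 x16 vs 10 GbE.
# Nominal peak bandwidths are assumed; real-world throughput is lower.

FRAME_BYTES = 3840 * 2160 * 4       # about 33 MB per frame
PCIE3_X16_BYTES_PER_S = 15.75e9     # ~15.75 GB/s nominal
TEN_GBE_BYTES_PER_S = 10e9 / 8      # 10 Gbit/s = 1.25 GB/s

pcie_ms = FRAME_BYTES / PCIE3_X16_BYTES_PER_S * 1000
net_ms = FRAME_BYTES / TEN_GBE_BYTES_PER_S * 1000

print(f"PCIe Gen3 x16: {pcie_ms:5.1f} ms per frame")
print(f"10 GbE:        {net_ms:5.1f} ms per frame")
```

At 60 Hz there are only about 16.7 ms per frame, so under these assumptions a network hop alone would eat more than the whole frame budget before any switching latency, while the PCIe fabric costs a couple of milliseconds.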

Edit: remove, rendering pipeline
« Last Edit: June 04, 2019, 05:42:52 PM by Trippy »