
Google have now finally unveiled their new cloud gaming service named Stadia, offering instant access to play games in Google Chrome.

What they joked was the worst-kept secret in the industry (no kidding) sounds like quite an interesting service, certainly one that could eventually end up redefining what gaming is. A little hyperbolic maybe? I'm not so sure, considering how easy it should be to jump into a game. On top of that, they very clearly talked about how it's built on Linux (Debian specifically) and Vulkan, with custom GPUs from AMD.

Something they showed off was how you could be watching a game trailer with a button to play it on Stadia and (supposedly within a few seconds) you would jump right into it. That's quite an exciting idea, one that would easily pull in quite a lot of people, I've no doubt.

As for resolution, they said it will support 1080p and 4K at around 60FPS at release, with 8K being worked on as well, although that sounds further out, if anyone even cares about 8K right now.

They also showed off their new controller, with a dedicated Google Assistant button and a button to capture video immediately for YouTube:


While Google are making their own dedicated gamepad, they did say it will be compatible with other devices too.

They also announced partnerships with both Unity and Unreal Engine, and Stadia will "embrace full cross-platform play" including "game saves and progression". They also had id Software talk about how it didn't take long to bring the new Doom Eternal to Stadia, thanks to how they made the previous Doom game with Vulkan.

This means that development for Linux is suddenly going to become a priority for a lot more developers and publishers. I don't want to overstate how important that is, but it's a very exciting prospect. This doesn't suddenly mean we're going to see a lot more Linux games on the desktop, but it's entirely possible after they go through all the work to get the games working on Linux with Vulkan for Stadia.

Stream Connect is another service they talked about. They mentioned how developers have pushed the boundaries of gaming, but local co-op is often left out, as running multiple instances of a top-end game can require really beefy hardware. With Stadia, each instance would be powered by their servers, so it wouldn't be such an issue. They also talked about how, if you're playing some sort of squad-based game, you could bring up a squad-mate's screen to see what they're doing, which sounds very cool.

Google also announced the formation of their own game studio, Stadia Games and Entertainment, to work on exclusive games for their new service.

As for support from more external game developers, they mentioned how they've shipped "development hardware" to over 100 developers. From what they said, it should be open to smaller developers as well as the usual AAA bunch.

Stadia is confirmed to be launching this year, and it will first be available in the US, Canada, UK and "most of Europe". One thing wasn't mentioned at all: price. They did say more details will be available in the summer. The official site is also now up at stadia.com and developers have their own website to look over.

Google also posted up some extra information on their developer blog:

Google believes that open source is good for everyone. It enables and encourages collaboration and the development of technology, solving real-world problems. This is especially true on Stadia, as we believe the game development community has a strong history of collaboration, innovation and shared gains as techniques and technology continually improve. We’re investing in open-source technology to create the best platform for developers, in partnership with the people that use it. This starts with our platform foundations of Linux and Vulkan and shows in our selection of GPUs that have open-source drivers and tools. We’re integrating LLVM and DirectX Shader Compiler to ensure you get great features and performance from our compilers and debuggers. State-of-the-art graphics tools are critical to game developers, and we’re excited to leverage and contribute to RenderDoc, GAPID and Radeon GPU Profiler — best of breed open-source graphics debugging and profiling tools that are continually improving.

There's probably plenty I missed; you can see their video on YouTube here.

As exciting and flashy as it sounds, it's obviously not Linux "desktop" gaming, which is what the majority of our audience is likely interested in. However, things change and if it does become a huge hit, we will cover it more often if readers request it. Linux gaming can mean all sorts of things, from native games to emulators, Wine and Steam Play and now perhaps some cloud gaming, so I don't want to rule it out. However, I can't see this replacing Steam, Humble, GOG, itch.io and so on for me personally.

Obviously there are still a lot of drawbacks to such a service, especially since you will likely have zero ownership of the actual games, so they could get taken away at any time when licensing vanishes. At least with stores like Steam, you still get to access those games because you purchased them. This does depend on what kind of licensing Google do with developers and publishers though, so it might not be an issue at all, but it's still a concern of mine. Latency and input lag are also two other major concerns, but given Google's power with their vast networks, it might not be so bad.

Also, good luck monitoring your bandwidth use with this, as it's likely going to eat up a lot of it. YouTube and Netflix use up quite a bit just for watching a 30-minute episode of something in good quality, so how about a few hours per day gaming across Stadia? Ouch.

That doesn't even address the real elephant in the room: you're going to be giving Google even more of your data if you use this service, a lot more. This is the company that failed to promptly disclose a pretty huge data leak in Google+ after all. I don't want to be some sort of scaremongering crazy-person, but it's something to think about.

As always, the comments are open for you to voice your opinion on it. Please remain respectful to those with a different opinion on the matter.

Ehvis Mar 21, 2019
Quoting: Mohandevir
Quoting: Ehvis
Quoting: Mohandevir: Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss thus reducing the quality of the stream?

TCP includes the control mechanism to deal with packet loss (detection and resending). For UDP it is up to the application to decide whether to detect it and what to do if something is lost.

And still be faster than TCP? Or is it better to go with TCP, in that case?

UDP is faster because it does away with some of the work. But if you have a large block of data that needs to be complete, you'd need the work to be done anyway, so you're better off using TCP. UDP is generally used for small data fragments that expire, like in game protocols where a single packet can update the position of your fellow players. If you miss such a packet, then resending it is not so useful as you may as well get a new updated one.

In the case of video streams it depends on what you want. If you've ever watched digital cable on a bad day, then you know what packet loss means for a digital video stream. Either you accept that you get half a second of mess if you lose something, or you go for data integrity at the possible cost of some more lag.
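To illustrate the trade-off Ehvis describes, here is a minimal sketch of the "fire and forget" pattern a real-time UDP stream relies on, assuming Python and its standard socket module (the address, port and frame payloads are made up for illustration): the sender just keeps transmitting the latest state and never retransmits, because a late packet would be stale anyway.

```python
import socket
import time

ADDRESS = ("127.0.0.1", 9999)  # hypothetical local test address

def run_sender(frames=120):
    """Send the freshest state at ~60 updates per second, no ACKs, no resends."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for frame_number in range(frames):
        payload = f"frame {frame_number}".encode()
        sock.sendto(payload, ADDRESS)  # fire and forget
        time.sleep(1 / 60)
    sock.close()

def run_receiver(timeout=2.0):
    """Display whatever arrives; anything lost is simply superseded by the next packet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(ADDRESS)
    sock.settimeout(timeout)
    try:
        while True:
            data, _ = sock.recvfrom(1024)
            print("got", data.decode())
    except socket.timeout:
        pass  # no retransmission logic: a gap just shows as a missing update
    finally:
        sock.close()
```

TCP would hide the loss by resending, but for a live stream that recovered data arrives too late to be worth showing.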
etonbears Mar 21, 2019
Quoting: silmeth
Quoting: etonbears: A Stadia game would also be an audiovisual stream, but one that cannot really be buffered as the stream content must be synchronized with your input device events. Any buffering would show up immediately as lag, possibly making the game unplayable.

On the other hand it can (similarly to what OnLive did) lower the resolution temporarily on bandwidth fluctuations to deliver a continuous real time stream. That would not, of course, work on connection loss, and could be annoying. That’s why I would not myself prefer game streaming over regular PC gaming, but, as I argued, that’d IMO be perfect for demo/timed trial gaming before buying the game.

I wouldn’t want to suffer a whole game playthrough over a fallible network, but I do prefer streaming 25 Mbit of data every second for a few hours to just try the game than downloading the whole game before I can get a taste of it.

Sure, you can alter the content to fit the stream, but if you want to have the best experience, you do it the other way round if you can. There is a mechanism for this called RSVP (the Resource Reservation Protocol) which allows a receiver to reserve a virtual "channel" of a particular size for data from a sender, but it requires all intervening points to agree, and to effectively prefer your traffic over the first-come, first-served norm for a router.

Tricky to get ISP et al. agreements for even a short path, and potentially costly; so the price of using Stadia maybe gets raised again.
Purple Library Guy Mar 21, 2019
Quoting: etonbears
Quoting: Purple Library Guy
Quoting: etonbears: For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.

Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Interesting point. Mind you, for most games most of that power would be dedicated to graphics stuff, in which case wouldn't those extra-power-hungry games also be extra-bandwidth-hungry? You could end up trading one bottleneck for another.
Which in turn makes me wonder about two futures clashing. Imagine the future of gaming is this kind of streaming solution. Now imagine the future of gaming is VR. I don't think it can be both unless someone spends a bunch of billions on last-mile fibre optics.

The bandwidth required for graphics stream presentation has historically increased quite slowly. It is proportional to frame rate multiplied by pixels per frame multiplied by bits per pixel. Desired frame rate has remained at about 60 for decades, and bits per pixel for most people has been 24 for decades. That leaves pixel resolution as the main variant, which has risen from 1M pixel screens 30 years ago to 6M pixel screens now. Network bandwidth increase in those 30 years far exceeds the increased requirements of a graphics stream, so if both network and graphics bandwidth trends continue, the streaming itself should reduce as a cause of bottleneck. Even the bandwidth to support binocular XR presentation should not be an issue since the size of XR screens you can put in front of your eyes is physically limited, and the human eye's ability to resolve detail at close range tops out at around 1000 pixels per inch.

In contrast, the amount of additional processing power you can put into determining the content of the graphics stream is effectively unbounded, since almost every aspect of current real-time game production is subject to approximation, simplification and deception, in order to fit into the processing 'budget' available.
Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. But yeah, I guess if they're just dumping all the pixels it doesn't matter what the programs are doing with those pixels. Given the pauses I often experience with simple streamed video I can well imagine streamed games having some problems, but that is a separate issue from the backend power needed to run the games.

In terms of VR (XR?) I was thinking more that, as I understand it, for it to work without messing up people's heads you need really, really low latency. I can imagine streaming working well enough for ordinary games in some places with some ISPs and data plans. But well enough for VR not to feel wonky? I seriously doubt it outside maybe South Korea. Mind you, I'm quite unconvinced that the future of gaming is VR. But if it was, it would be damn tough to stream effectively.
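As a rough sanity check of the proportionality etonbears describes (frame rate multiplied by pixels per frame multiplied by bits per pixel), here is a back-of-envelope sketch in Python using assumed typical values; it also shows why the raw stream has to be compressed very heavily before it goes anywhere near a home connection.

```python
# Uncompressed bitrate = frame rate x pixels per frame x bits per pixel.
# 60 FPS and 24 bits per pixel are assumed typical values, not Stadia specs.

def raw_bitrate_mbps(width, height, fps=60, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1_000_000

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{raw_bitrate_mbps(w, h):,.0f} Mbit/s uncompressed")

# 1080p: ~2,986 Mbit/s uncompressed
# 4K:    ~11,944 Mbit/s uncompressed
# Against the ~25 Mbit/s figure mentioned elsewhere in this thread, that
# implies a compression ratio in the region of 100:1 or more.
```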
Purple Library Guy Mar 21, 2019
Quoting: Mohandevir
Quoting: Ehvis
Quoting: Mohandevir: Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss thus reducing the quality of the stream?

TCP includes the control mechanism to deal with packet loss (detection and resending). For UDP it is up to the application to decide whether to detect it and what to do if something is lost.

And still be faster than TCP? Or is it better to go with TCP, in that case?
I might imagine that in a game, (as etonbears points out, without buffering, everything happening in real time) by the time lost packets get re-sent they'd be irrelevant, so it would be better to just ignore them and leave a little fuzz in the picture than to, like, refuse to show the image until it's all complete. That might suggest this UDP thing. But I don't know anything about this, I'm just trying to do logic from too little data.
etonbears Mar 21, 2019
Quoting: Shmerl
Quoting: etonbears: Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network management tool, it's users fleecing, anti-competitive trash. Limiting bandwidth when network is overloaded though is a legitimate network managing technique.

You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States. I mean it in the wider sense of equality/restriction for any purpose. There are good arguments for allowing different quality of service offerings based on what you pay (providing it is transparent) and good arguments for variable service for particular traffic types (largely safety-critical), for example.

As for the current US "debate", monopolist/unacceptable business behaviour is a general problem not unique to the operation of the internet, and would be best addressed in that light. Should the US really want a completely undifferentiated network backbone operated as a public utility, it would be better to pay for it from the public purse, either publicly managed or sub-contracted, rather than the current arrangement. However, I suspect that might be considered "un-American".
x_wing Mar 21, 2019
Quoting: Purple Library Guy
Quoting: Mohandevir
Quoting: Ehvis
Quoting: Mohandevir: Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss thus reducing the quality of the stream?

TCP includes the control mechanism to deal with packet loss (detection and resending). For UDP it is up to the application to decide whether to detect it and what to do if something is lost.

And still be faster than TCP? Or is it better to go with TCP, in that case?
I might imagine that in a game, (as etonbears points out, without buffering, everything happening in real time) by the time lost packets get re-sent they'd be irrelevant, so it would be better to just ignore them and leave a little fuzz in the picture than to, like, refuse to show the image until it's all complete. That might suggest this UDP thing. But I don't know anything about this, I'm just trying to do logic from too little data.

Yeap, TCP has way too much overhead for an application that requires very low latency (the way this stream works is very similar to how TV broadcasting works). Also, in the Google backbone for this service they may be using other lesser-known optimizations in order to reduce the latency (jumbo frames!).
Purple Library Guy Mar 21, 2019
Quoting: etonbears
Quoting: Shmerl
Quoting: etonbears: Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network management tool, it's users fleecing, anti-competitive trash. Limiting bandwidth when network is overloaded though is a legitimate network managing technique.

You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States.
Debate and discussion of the term as used by Shmerl has been extremely widespread for a number of years. Even if it's used differently elsewhere, it's probably not used nearly as much your way overall because your sense is more technical and less controversial in its implications, so probably just less talked about. So you shouldn't be surprised if Shmerl's is the sense people expect. And if you think it's going to stay limited to the US, well, maybe, but I've sure noticed that nasty practices often start in the US and are then exported to much of the rest of the world through trade agreements and by the same interests elsewhere latching onto the American example to make their greed respectable.

I do think that public provision would be a good idea. The internet is infrastructure; infrastructure works well when it's public.


Last edited by Purple Library Guy on 21 March 2019 at 7:03 pm UTC
etonbears Mar 21, 2019
Quoting: Purple Library Guy
Quoting: etonbears
Quoting: Purple Library Guy
Quoting: etonbears: For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.

Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Interesting point. Mind you, for most games most of that power would be dedicated to graphics stuff, in which case wouldn't those extra-power-hungry games also be extra-bandwidth-hungry? You could end up trading one bottleneck for another.
Which in turn makes me wonder about two futures clashing. Imagine the future of gaming is this kind of streaming solution. Now imagine the future of gaming is VR. I don't think it can be both unless someone spends a bunch of billions on last-mile fibre optics.

The bandwidth required for graphics stream presentation has historically increased quite slowly. It is proportional to frame rate multiplied by pixels per frame multiplied by bits per pixel. Desired frame rate has remained at about 60 for decades, and bits per pixel for most people has been 24 for decades. That leaves pixel resolution as the main variant, which has risen from 1M pixel screens 30 years ago to 6M pixel screens now. Network bandwidth increase in those 30 years far exceeds the increased requirements of a graphics stream, so if both network and graphics bandwidth trends continue, the streaming itself should reduce as a cause of bottleneck. Even the bandwidth to support binocular XR presentation should not be an issue since the size of XR screens you can put in front of your eyes is physically limited, and the human eye's ability to resolve detail at close range tops out at around 1000 pixels per inch.

In contrast, the amount of additional processing power you can put into determining the content of the graphics stream is effectively unbounded, since almost every aspect of current real-time game production is subject to approximation, simplification and deception, in order to fit into the processing 'budget' available.
Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. But yeah, I guess if they're just dumping all the pixels it doesn't matter what the programs are doing with those pixels. Given the pauses I often experience with simple streamed video I can well imagine streamed games having some problems, but that is a separate issue from the backend power needed to run the games.

In terms of VR (XR?) I was thinking more that, as I understand it, for it to work without messing up people's heads you need really, really low latency. I can imagine streaming working well enough for ordinary games in some places with some ISPs and data plans. But well enough for VR not to feel wonky? I seriously doubt it outside maybe South Korea. Mind you, I'm quite unconvinced that the future of gaming is VR. But if it was, it would be damn tough to stream effectively.

Yes, you're right, the data streams would most likely be compressed, but compression and decompression schemes have to be computationally simple to work in real time, so they are less likely to have any interest in how detailed each frame is, and more likely to be interested in how much changes between one frame and the next. If it does not change, you don't need to send it. Compression is an active area of research for many uses, often with quite different characteristics/needs.

In terms of VR latency-induced sickness, yes, a problem. Even locally and directly connected, it's largely related to how an individual mentally tolerates the disjuncture of being in an immersive environment that does not behave in an immersive manner.

I, personally, will probably never get on with VR as it exists today, and while a lot of people do think it's great, a surprisingly large number still have quite limited time tolerance. Network latency will only make this worse, and is best reduced through short internet paths, since the latency is primarily in the switches, not the cables/wires. But you're right, in the end it may come down to choosing one or the other.

P.S. XR is eXtended Reality, a convenient lumping together of Virtual Reality, Augmented Reality, Augmented Virtuality, and any other marketing buzzwords that come along.
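etonbears' point about codecs mostly caring what changes between frames can be shown with a toy sketch. The following is a hypothetical, much-simplified delta encoder in Python, nothing like a real video codec, but the "don't resend what hasn't changed" idea is the same.

```python
BLOCK = 8  # assumed block size (values per block)

def encode_delta(prev_frame, new_frame):
    """Return (offset, block) pairs only for blocks that changed since the last frame."""
    changed = []
    for offset in range(0, len(new_frame), BLOCK):
        old = prev_frame[offset:offset + BLOCK]
        new = new_frame[offset:offset + BLOCK]
        if old != new:
            changed.append((offset, new))
    return changed

def apply_delta(prev_frame, delta):
    """Rebuild the current frame on the receiving end from the previous frame plus the delta."""
    frame = list(prev_frame)
    for offset, block in delta:
        frame[offset:offset + len(block)] = block
    return frame

# A mostly static scene produces a tiny delta; a fast or noisy scene changes
# nearly every block and the saving largely disappears.
prev = [0] * 64
new = list(prev)
new[10] = 255  # one pixel changed somewhere on screen
delta = encode_delta(prev, new)
print(len(delta), "of", len(prev) // BLOCK, "blocks need sending")  # 1 of 8
assert apply_delta(prev, delta) == new
```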
Mohandevir Mar 21, 2019
Wow! Thank you all for your great explanations. You are awesome!
Klaas Mar 21, 2019
Quoting: Purple Library Guy: Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. (…)

That's definitely the case if the frames contain lots of moving noise, e.g. falling snow. An easy way to see the problem is a classic Doom stream on a map that contains the texture FIREBLU (e.g. E3M6 Mt. Erebus), where the image quality drops while the bitrate explodes.
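Klaas' FIREBLU example is easy to demonstrate: the following sketch (assumed frame size, using Python's standard zlib as a stand-in for a real encoder) compresses a flat frame and a pure-noise frame of identical size, and the noise barely shrinks at all.

```python
import random
import zlib

random.seed(0)
SIZE = 320 * 240  # assumed small greyscale frame, one byte per pixel

flat_frame = bytes([128] * SIZE)                                  # one solid colour
noise_frame = bytes(random.randrange(256) for _ in range(SIZE))   # FIREBLU-style noise

print("flat :", len(zlib.compress(flat_frame)), "bytes")   # a few hundred bytes
print("noise:", len(zlib.compress(noise_frame)), "bytes")  # close to the full 76,800

# A lossless compressor can't remove information that isn't redundant, so a
# real-time video encoder faced with noisy detail must either raise the
# bitrate or throw detail away, which is exactly the quality drop described above.
```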