Latest Comments by etonbears
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 7:18 pm UTC Likes: 1
Quoting: Purple Library Guy
Yes, you're right, the data streams would most likely be compressed, but compression and decompression schemes have to be computationally simple to work in real-time, so they are less likely to have any interest in how detailed each frame is, and more likely to be interested in how much changes between one frame and the next. If it does not change, you don't need to send it. Compression is an active area of research for many uses, often with quite different characteristics/needs.

Quoting: etonbears
Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. But yeah, I guess if they're just dumping all the pixels it doesn't matter what the programs are doing with those pixels. Given the pauses I often experience with simple streamed video I can well imagine streamed games having some problems, but that is a separate issue from the backend power needed to run the games.

Quoting: Purple Library Guy
The bandwidth required for graphics stream presentation has historically increased quite slowly. It is proportional to frame rate multiplied by pixels per frame multiplied by bits per pixel. Desired frame rate has remained at about 60 for decades, and bits per pixel for most people has been 24 for decades. That leaves pixel resolution as the main variant, which has risen from 1M-pixel screens 30 years ago to 6M-pixel screens now. Network bandwidth increase in those 30 years far exceeds the increased requirements of a graphics stream, so if both network and graphics bandwidth trends continue, the streaming itself should become less of a bottleneck. Even the bandwidth to support binocular XR presentation should not be an issue, since the size of XR screens you can put in front of your eyes is physically limited, and the human eye's ability to resolve detail at close range tops out at around 1000 pixels per inch.

Quoting: etonbears
For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

Interesting point. Mind you, for most games most of that power would be dedicated to graphics stuff, in which case wouldn't those extra-power-hungry games also be extra-bandwidth-hungry? You could end up trading one bottleneck for another.
If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.
Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Which in turn makes me wonder about two futures clashing. Imagine the future of gaming is this kind of streaming solution. Now imagine the future of gaming is VR. I don't think it can be both unless someone spends a bunch of billions on last-mile fibre optics.
In contrast, the amount of additional processing power you can put into determining the content of the graphics stream is effectively unbounded, since almost every aspect of current real-time game production is subject to approximation, simplification and deception, in order to fit into the processing 'budget' available.
In terms of VR (XR?) I was thinking more that, as I understand it, for it to work without messing up people's heads you need really, really low latency. I can imagine streaming working well enough for ordinary games in some places with some ISPs and data plans. But well enough for VR not to feel wonky? I seriously doubt it outside maybe South Korea. Mind you, I'm quite unconvinced that the future of gaming is VR. But if it was, it would be damn tough to stream effectively.
In terms of VR latency-induced sickness, yes, that is a problem even locally and directly connected, and it is largely related to how an individual mentally tolerates the disjuncture of being in an immersive environment that does not behave in an immersive manner.
I, personally, will probably never get on with VR as it exists today, and while a lot of people do think it's great, a surprisingly large number still have quite limited time tolerance. Network latency will only make this worse, and is best reduced through short internet paths, since the latency is primarily in the switches, not the cables/wires. But you're right, in the end it may come down to choosing one or the other.
P.S. XR is eXtended Reality, a convenient lumping together of Virtual Reality, Augmented Reality, Augmented Virtuality, and any other marketing buzzwords that come along.
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 6:14 pm UTC
Quoting: Shmerl
You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States. I mean it in the wider sense of equality/restriction for any purpose. There are good arguments for allowing different quality-of-service offerings based on what you pay (providing it is transparent) and good arguments for variable service for particular traffic types (largely safety-critical), for example.

Quoting: etonbears
Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network-management tool; it's user fleecing, anti-competitive trash. Limiting bandwidth when the network is overloaded, though, is a legitimate network-management technique.
As for the current US "debate", monopolist/unacceptable business behaviour is a general problem not unique to the operation of the internet, and would be best addressed in that light. Should the US really want a completely undifferentiated network backbone operated as a public utility, it would be better to pay for it from the public purse, either publicly managed or sub-contracted, rather than the current arrangement. However, I suspect that might be considered "un-American".
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 5:18 pm UTC
Quoting: silmeth
Sure, you can alter the content to fit the stream, but if you want to have the best experience, you do it the other way round if you can. There is a mechanism for this called RSVP (the Resource Reservation Protocol) which allows a receiver to reserve a virtual "channel" of a particular size for data from a sender, but it requires all intervening points to agree, and effectively prefers your traffic over the first-come, first-served norm for a router.

Quoting: etonbears
A Stadia game would also be an Audiovisual stream, but one that cannot really be buffered, as the stream content must be synchronized with your input device events. Any buffering would show up immediately as lag, possibly making the game unplayable.

On the other hand it can (similarly to what OnLive did) lower the resolution temporarily on bandwidth fluctuations to deliver a continuous real-time stream. That would not, of course, work on connection loss, and could be annoying. That's why I would not myself prefer game streaming over regular PC gaming, but, as I argued, that'd IMO be perfect for demo/timed-trial gaming before buying the game.
I wouldn't want to suffer a whole game playthrough over a fallible network, but I would rather stream 25 Mbit of data every second for a few hours to try the game than download the whole thing before I can get a taste of it.
Tricky to get ISP et al. agreements for even a short path, and potentially costly; so the price of using Stadia maybe gets raised again.
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 5:00 pm UTC
Quoting: Mohandevir
My knowledge of network traffic is limited, but isn't the protocol used for game streaming involved? Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss, thus reducing the quality of the stream? Sorry if I'm totally off topic, because there must be a reason why it's never mentioned. Please enlighten me.

IP is the underlying delivery protocol. It is the equivalent of writing the address on a packet and posting it. There is no guarantee that IP packets will be delivered in the order of sending, and no guarantee that any individual packet will actually be delivered at all.
TCP and UDP are the most common protocols used to split up application information into packets that you then post using IP.
With UDP, the sender simply transmits individual datagrams to a given address and port; the protocol itself adds no sequence numbers and gives no delivery guarantees. Streaming applications therefore number the packets themselves (as RTP does), so a receiver can reconstruct the order of the data that was sent; but if a receiver does not get all of the packets, UDP has no built-in way to ask for a re-send, so it is, as you say, lossy, and the application must tolerate or conceal the loss.
TCP is a point to point protocol where the sender sends similar sequence-numbered packets to exactly one address, and the receiver sends back an acknowledgement for each packet received. This allows both parties to know that the data was received, and to re-send any packet that was lost.
In theory, either could be used for a game stream, with UDP being faster but needing the application to conceal or correct missing packets. In practice, UDP is sometimes blocked by firewalls, but low-latency streams still tend to favour UDP-based protocols (such as RTP; Stadia reportedly uses WebRTC), because waiting for TCP to re-send a lost packet adds delay, while TCP's robustness makes it the usual choice for ordinary downloads and buffered video.
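The difference described above can be sketched without real sockets. The packet format here, a plain `(seq, chunk)` tuple, is a made-up illustration rather than any real wire protocol; it only shows why application-level sequence numbers let a receiver reorder datagrams and detect loss:

```python
def packetize(data: bytes, size: int):
    """Split data into (seq, chunk) packets of at most `size` bytes each."""
    return [(seq, data[off:off + size])
            for seq, off in enumerate(range(0, len(data), size))]

def reassemble(packets, total):
    """Sort received packets by sequence number and report any gaps."""
    received = dict(packets)
    missing = [seq for seq in range(total) if seq not in received]
    data = b"".join(received[seq] for seq in sorted(received))
    return data, missing

pkts = packetize(b"one video frame", 4)      # 4 numbered packets
# A UDP path may reorder or drop datagrams: here packet 1 is lost
# and the remaining three arrive shuffled.
arrived = [pkts[2], pkts[0], pkts[3]]
data, missing = reassemble(arrived, total=len(pkts))
assert missing == [1]   # over TCP the stack would re-send this packet;
                        # over UDP the application must handle the gap itself
```

The same mechanism is why a TCP stream stalls on loss (everything waits for the re-sent packet) while a UDP-based stream can choose to skip ahead and show a slightly damaged frame instead.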
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 4:35 pm UTC
Quoting: silmeth
Three years ago I had a solid symmetric ~300 Mbps connection (I actually measured 291 download and 415 upload, sic! – during that time I might have been the best individual Kubuntu live DVD seeder out there…). Today that ISP delivers symmetric 700 Mbps for ~$9.50/month. Unfortunately I moved to another city district and had to change ISP to a much worse one. Today I have 120 Mbps down / 12 Mbps up. I don't saturate it most of the time, but when I do download something, that 120 Mbps is real. I have no problems with a simultaneous torrent download and two HD Netflix streams running in my house on a Friday evening… The situation outside of bigger cities is much worse (often no optical fibre available, so DSL or mobile ISPs only). You are right that I have no idea what would happen if all the other users started really saturating the link at the same time. I believe none of my neighbours, besides me, ever really used that ~300 Mbps network back then. But then – how many people will use game streaming simultaneously? If Youtube + Netflix (and Amazon, and HBO Go, etc.) + some people torrenting don't seem to generate any problems today, I don't think a game streaming service would change it much, but maybe I am underestimating its impact.

As with my comment directly above, game streaming may not add significantly to the load, but game streams need to be real-time, or almost so, as they are much more sensitive to irregularity in delivery than heavily buffered Netflix or Youtube streams.
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 4:26 pm UTC Likes: 4
Quoting: mylka
Audiovisual traffic is heavily buffered (many seconds' worth) to ensure smooth delivery over an irregular delivery channel, and still you can get such bad contention somewhere on the internet that the buffer runs dry and the stream stutters or breaks.

Quoting: Klaas
wasn't Division 2's day one patch 92GB?

Quoting: Sir_Diealot
So you are not willing to download 50 GB for a weekend, but you are willing to download 50 GB for two hours of streaming?

If we consider the 25 Mbit/s estimate from a few pages back and 8 hours playing time, you would require approximately 88 GB of traffic. That's insane.
i don't get the traffic "problem". i watch a lot of netflix and youtube. i already have 700+GB traffic each month.
since stadia is just streaming a video nothing would change.
but i'll wait and see. i don't think it will replace consoles and pcs anytime soon.
i guess it will be very expensive, and if it isn't much cheaper than my PC, then it makes no sense to me (i wouldn't play games on small screens like a phone).
A Stadia game would also be an Audiovisual stream, but one that cannot really be buffered as the stream content must be synchronized with your input device events. Any buffering would show up immediately as lag, possibly making the game unplayable.
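The ~88 GB figure quoted in this thread is easy to sanity-check. Taking the 25 Mbit/s stream estimate from the discussion (an assumption from the thread, not an official Stadia number) over 8 hours of play:

```python
rate_bits_per_s = 25e6            # 25 Mbit/s video stream (thread's estimate)
play_seconds = 8 * 3600           # 8 hours of play
total_bytes = rate_bits_per_s / 8 * play_seconds

print(total_bytes / 1e9)          # 90.0 decimal gigabytes
print(total_bytes / 2**30)        # ~83.8 GiB; the "~88 GB" quoted sits between
```

So the thread's arithmetic holds up: unlike a one-off download, the cost repeats every session, which is what makes data caps the real constraint.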
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 4:11 pm UTC
Quoting: Purple Library Guy
The bandwidth required for graphics stream presentation has historically increased quite slowly. It is proportional to frame rate multiplied by pixels per frame multiplied by bits per pixel. Desired frame rate has remained at about 60 for decades, and bits per pixel for most people has been 24 for decades. That leaves pixel resolution as the main variant, which has risen from 1M-pixel screens 30 years ago to 6M-pixel screens now. Network bandwidth increase in those 30 years far exceeds the increased requirements of a graphics stream, so if both network and graphics bandwidth trends continue, the streaming itself should become less of a bottleneck. Even the bandwidth to support binocular XR presentation should not be an issue, since the size of XR screens you can put in front of your eyes is physically limited, and the human eye's ability to resolve detail at close range tops out at around 1000 pixels per inch.

Quoting: etonbears
For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

Interesting point. Mind you, for most games most of that power would be dedicated to graphics stuff, in which case wouldn't those extra-power-hungry games also be extra-bandwidth-hungry? You could end up trading one bottleneck for another.
If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.
Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Which in turn makes me wonder about two futures clashing. Imagine the future of gaming is this kind of streaming solution. Now imagine the future of gaming is VR. I don't think it can be both unless someone spends a bunch of billions on last-mile fibre optics.
In contrast, the amount of additional processing power you can put into determining the content of the graphics stream is effectively unbounded, since almost every aspect of current real-time game production is subject to approximation, simplification and deception, in order to fit into the processing 'budget' available.
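The "frame rate × pixels per frame × bits per pixel" relationship above is easy to put into numbers for an uncompressed stream; the screen sizes are the comment's own figures, and compression reduces the transmitted rate by an order of magnitude or more:

```python
def raw_stream_gbit_s(fps, pixels_per_frame, bits_per_pixel):
    """Raw (uncompressed) video bandwidth in Gbit/s."""
    return fps * pixels_per_frame * bits_per_pixel / 1e9

# ~1M-pixel screen of 30 years ago vs a ~6M-pixel screen today,
# both at 60 fps and 24-bit colour:
then = raw_stream_gbit_s(60, 1_000_000, 24)   # 1.44 Gbit/s raw
now = raw_stream_gbit_s(60, 6_000_000, 24)    # 8.64 Gbit/s raw
print(then, now)
```

A 6x growth in raw requirement over 30 years is indeed tiny compared with the growth in consumer link speeds over the same period, which is the comment's point.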
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 3:21 pm UTC
Quoting: Guest
Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Quoting: etonbears
If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant.

This was part of the Stadia presentation as it referred to water effects. I must say (latency, pixel-crush, and streaming slideshows aside), some of the things I have witnessed with game streaming (read: rack-server RAID) are pretty darn impressive! I distinctly remember Lord of the Rings: War in the North loading and playing orders of magnitude faster in OnLive than on my Steam rig at the time, especially with friends.
Unfortunately, however, come Friday night ... the bandwidths are all "Netflix and Chill."
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 3:15 pm UTC
Quoting: Sir_Diealot
Yes, exactly. They are genuinely offering the development community something different, but whether it will be affordable, and when a sufficient audience will have the required network characteristics, are still questions to be answered.

Quoting: etonbears
For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

Oh don't you worry, it won't be long until the first game can't be run on a desktop machine anymore, and it will be a point of pride for the developer. The resource demand will scale with the available resources, so unless Google puts a hefty price tag on it, they'll soon have a serious issue on their hands.
If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.
Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan
21 Mar 2019 at 1:50 am UTC Likes: 2
21 Mar 2019 at 1:50 am UTC Likes: 2
For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.
If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.
Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.