I’m quickly approaching 40, and I would like nothing more than to not have to own the Windows desktop that I only use for one thing: to play Blood Bowl 2 (and eventually 3) a few times a week. If I could do that from a browser on my MacBook, you can bet I’d never own another desktop in this life.
That’s anecdotal of course, but there’s quite a lot of us.
If a developer is not willing to lift a finger to port to mac (a small market, but one with a known size), why would they port to Stadia or some other unknown market?
1) PC gamers tend to revel in owning (building, customizing, optimizing) their hardware; not just because it lets them play the games they want to play, but even for its own sake. RGB arrays, overclocking, custom case builds. Streaming can't compete with that.
2) "Casual" gamers already have powerful devices in their pockets with thousands and thousands of games available, including many free ones and many high-quality ones.
3) Console gamers are presumably the target (?) market. But an Xbox Series S costs $299. The (absolute minimum) Stadia starter kit costs $99; you're already a third of the way there. And then there's the subscription fee. And then you still have to buy the games. Something I don't think Google realized is that over a console generation, the dominant cost quickly becomes the games themselves, not the hardware. If Stadia users still have to buy them at full-price - $60 a pop - that $200 you saved at the beginning quickly becomes a diminishing fraction. You just aren't saving that much, and in exchange, you get the constant risk that your whole library will simply be killed at any moment, as well as...
4) The latency. The problem with latency is that it's not a fully solvable issue, no matter how much hardware or money you throw at the problem. There's a physical lower bound on how long it takes a signal to get from your house to a data center and back. And then there's all the routing infrastructure run by your ISP, which a) is outside of Google or Microsoft or whoever's ability to improve, and b) is unlikely to be improved by the ISP, because game streaming is basically the only use case where bleeding-edge latency actually matters. And in terms of how much it matters: one frame at 60 FPS translates to 16.7ms. Client-rendered multiplayer games don't have as much of an issue with higher latencies because of client-side prediction: https://en.wikipedia.org/wiki/Client-side_prediction
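To make that physical lower bound concrete, here's a back-of-the-envelope sketch. The 2/3-of-c signal speed in fiber and the straight-line route are simplifying assumptions, not figures from this thread:

```python
# Back-of-the-envelope lower bound on streaming round-trip time.
# Assumptions: signals in optical fiber travel at roughly 2/3 the vacuum
# speed of light, and the route to the datacenter is a straight line.

C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3      # typical slowdown in optical fiber

def min_rtt_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time to a datacenter, in ms."""
    one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000  # there and back, in milliseconds

for km in (50, 500, 2000):
    print(f"{km:>5} km away: >= {min_rtt_ms(km):.1f} ms RTT")
```

Even under these ideal assumptions, a datacenter 2000 km away costs you about 20ms of RTT, more than a whole frame at 60 FPS, before routing, encoding, or decoding add anything.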
Here's the only way I could see game streaming being successful:
An all-you-can-eat, Netflix-style buffet of big-budget games. Like Apple Arcade, except it has games like Call of Duty and Borderlands that you could normally only play on a console or a gaming PC. You pay a monthly fee, and you never have to buy or even download a game. Dedicated thin-client hardware is a waste; anybody who wants to buy hardware will just buy a console. Your target customers don't want that. Instead this would only be playable on existing platforms, primarily desktop/web/mobile, though possibly existing consoles as well.
That would be a decent value-proposition for some people. Those playing really fast-paced games and/or sticklers for latency wouldn't go for it, some existing phone-gamers might, but mostly you would get people like your friend from college who just wants to play Borderlands with you but isn't really a "gamer" outside of that.
Microsoft is the most clearly-positioned company to succeed at this, as far as I can tell. They have two decades of experience in the industry, they have cloud chops and datacenters, and they carry clout with publishers and even have in-house studios (because a subscription-only game buffet is going to be a tough sell when it comes to license-holders).
And of course they've already started: Xbox Game Pass is a smallish version of the all-you-can-eat subscription, and they've been experimenting with cloud-hosted releases. You can even play Control on your Nintendo Switch via Microsoft's cloud. That's pretty cool.
But I don't think this will ever make gaming PCs or even consoles obsolete, mainly because of the unsolvability of the latency issue. It will be good enough for some people.
Oh, and Stadia will die anyway, because Google doesn't understand any of the above.
But more importantly: Mac hardware usually isn't really equipped for high-end games. If you have a pro-tier machine you might do okay, but nobody buys Macs for gaming. It's just too niche of a market to justify a lot of effort to support it.
Most AAA games already have 200+ ms delays between pressing a button and anything happening on-screen. So there's plenty of room to redesign things to work around that latency in a lot of games
(This obviously doesn't apply to high-end play on twitch shooters or fighting games though, those are pretty much screwed when it comes to streaming)
It's free with an optional subscription for games and 4k.
I think most of the above still applies, but maybe expand "it'll be good enough for some people" to include some portion of average console-gamers (assuming the rest of the productization is done right, and assuming those console-gamers have fairly good internet)
The thing is that, even there, if you're putting it on a TV you're likely not going to want to plug in your Macbook or whatever. Which means, if you don't already have a console, you're going to be buying dedicated hardware regardless. Which significantly cuts into the "savings"/"no-purchase" angle, and steepens the question of "what's the point of this?"
One thought though: Microsoft could use this as a way to keep last-gen console owners engaged. At some future date when the Xbox Series Y or Z or whatever comes out, people with a Series S might still be able to play the latest games by streaming them. They're using dedicated hardware that plugs into a TV, but it's hardware they already bought which is essentially being repurposed.
Edit: Another thing is that the subscription model and the streaming model don't have to go hand-in-hand. I think game subscriptions are absolutely the future, but I think there will always be a market for devices that download and run those subscribed games locally.
Regardless though, I think buying full-priced games that you don't actually own is the real non-starter. These aren't $0.99 songs on iTunes; these are $60 investments.
Source please?
I have produced / designed / managed a few AAA games in my life and none of them had a 200ms latency between when you pressed a button and something happened on screen. That delay would be horrible for a fighting game or a driving game. How are you even defining "something happening on screen"?
Let's suppose you are right, that there is a longish latency between when your input is polled and when the game systems fully react. That happens to some extent in RTSs, because changes in the game state are synchronized. But in that case the delay isn't going to hide the network latency, it is going to be added on top of the network latency.
The latter isn't a niche market, it's a 'not high-end' market. But that could evolve, I think.
Whereas GNU/Linux, even with the massive amount of games targeting Android, hardly gets to see them.
Same applies to Stadia, which is mostly GNU/Linux + Vulkan, with Google sponsoring Unity and Unreal as well.
You'd think, but a lot of mainstream engine-based games that could "easily" have a mac port never get one, even an unofficial one offered as totally unsupported. Look at Among Us for example. Not by any stretch a high-end game. It runs on Windows, Android, iOS, a bunch of XBoxen, and probably other consoles. I bet the developer could spit out a working native macOS version with the push of a button, but so far hasn't.
Kerbal Space Program is another example. When last I checked, they did have a native mac version, but it was hamstrung in some way--I think it was limited to 32-bit or something.
I can't imagine these examples are actually a huge amount of effort to make happen. As a fan and programmer I'd be willing to do it for free.
To even get on Stadia you have to port to their custom Linux distribution, which is a pretty huge ask for most games.
Absolutely false, and I don't know where you got that from.
If there was a game that had that kind of latency between input and reaction, people would notice and the reviews would be horrible.
This is an honest question, since I don't game much (Witcher 3, Death Stranding, and a few point-and-clicks), and regular 1080p doesn't bother me, so I'm genuinely curious.
A 30 fps game could go through a complete loop, updating everything (object positions, inputs), in 33ms. At 60 fps, assuming everything is synced to the frame rate, that would be 16ms.
I was asking for the commenter's source of information so I didn't have to guess what he or she meant. It's possible to make a game that doesn't respond to a user's input in less than 200ms, but why would you? You don't need to be making a technical tour de force to respond in 16-33ms.
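Those 16-33ms figures follow directly from the frame rate; a trivial sketch of the arithmetic:

```python
# Per-frame time budget at common frame rates. If input is polled and the
# world updated once per frame, this is the worst-case delay added by the
# game loop itself (matching the 33ms / 16ms figures in the comments above).

def frame_budget_ms(fps: int) -> float:
    return 1000 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```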
1. You claim PC gamers do it for the hardware as much as the software. Let's assume the data backs that - it certainly seems like it's likely to be true. And I'm biased in wanting to believe it too, because I like to build and revel in the machines that run the games I own. What isn't true is that those same people, people like me, cannot also be attracted to things like Stadia.
2. Services like Stadia do not replace the many games that people play on the many devices that already exist. It's not a "one or the other" thing. They allow those devices to play more games.
The biggest flaw is in suggesting that casual gamers (a term which is flawed for many other reasons) wouldn't be a potential market for a thing like Stadia. Mobile game sales account for almost half of ALL game related sales. 48%, in fact. $76 billion in sales. A thing like Stadia means that people can play more games on their devices.
And let me say, games on Stadia play incredibly well on my iPad that's a few generations old. That's very attractive. Being able to play PC quality games on my iPad when I travel is worth every penny. I'd even argue it's easier to play games on Stadia than it is to play natively installed games. With Stadia, there's no downloading of the game, no installing, no time wasted waiting for updates. You just turn it on, and it works.
First, where you say "casual gamers", I think what you're trying to say is "people who play games on their mobile devices." You go on to describe the abilities that mobile devices have. While I won't dispute that, one thing I think you're missing is that services like Stadia make it even easier to play games on those devices that don't exist for those devices, or will at some future date, optimized to run on those mobile devices.
I'll probably beat this horse to death, but to compare: I was playing Cyberpunk 2077 on my iPad through Stadia minutes after it was available. It took nearly a day before I could run it on my PC, and after the first several patches I just stopped bothering. Granted, the game is a beautiful mess, but the point is: it was effortless on the iPad, and has been ever since. Not only that, but I can switch to my iPhone, or to my PC and pick up right where I left off. If I do it quickly enough, the game just unpauses when I jump to the new device. And I can travel and still play. There's no way my PC, with its UV reactive liquid cooling, is going to travel with me.
3. Stadia starter kit is optional. Stadia is free. Do you have a controller? Keyboard and mouse? A web browser? You're good. There is no required subscription fee. You buy the games, and they cost the same as console games. So yeah, if you have a device that can run modern browsers, you don't need to buy a console.
4. I assume when you mention latency, you mean "input latency" - meaning, the time it takes for the game to react to your button press or mouse movement. There are indeed hard limits to how low input latency can be. The game cannot update its entire model and render it in 0ms. It has to make calculations based on your inputs, then show you what changed. But that's not the only constraint. Consider the entire picture: a target on the screen moves, and you need to shoot it. If you're good, it'll take you about 100ms to react. Most people can't react in less than 150ms. It takes 5-10ms to transmit your reaction over USB. It takes the simulation any number of milliseconds to process and tell the monitor to redraw itself. Let's assume the processing time of the game engine is 0ms. The best monitors will add 2ms to the clock.
So, from your human reaction to the resulting frame, it takes at best 107ms to react to something on screen and see the results of your reaction.
And that's on your PC. No networking.
What does Stadia add? On a good connection, it'll add 20-30ms. To be fair, that's what I've seen on my pretty normal cable company internet connection over 5ghz Wifi. With most games, you'd never notice the extra time. Are you going to notice it as a pro gamer playing FPS competitively? Probably.
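Tallying the whole chain with the rough numbers quoted above (all illustrative assumptions from this comment, not measurements):

```python
# Rough end-to-end latency tally using the figures quoted above.
# Every value here is an illustrative assumption, not a measurement.

local_chain_ms = {
    "human reaction": 100,  # a good player's reaction time
    "usb input": 5,         # 5-10ms was quoted; take the low end
    "game processing": 0,   # assumed zero for the sake of argument
    "monitor": 2,           # best-case display latency
}

stadia_extra_ms = 25  # middle of the 20-30ms range quoted above

local_total = sum(local_chain_ms.values())
streamed_total = local_total + stadia_extra_ms

print(f"local:    {local_total} ms")  # 107ms, as in the comment
print(f"streamed: {streamed_total} ms "
      f"(+{100 * stadia_extra_ms / local_total:.0f}%)")
```

Framed this way, streaming adds roughly a quarter on top of a chain that's already over 100ms, which is why most people don't notice it in most games.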
Your assertion that Stadia will die is about the most right thing you've said. Even with a market, Google tends to kill things seemingly at random. What will help it die quicker is if Nvidia's service is able to outperform Stadia in terms of simplicity and streaming speeds.
But saying streaming based gaming won't find a market reminds me a lot of what the cable companies and Blockbuster used to say about Netflix.
Found an article from a few years ago: https://www.gamasutra.com/view/feature/3725/measuring_respon...
Not all games are that bad, especially these days. And your overall point is correct: adding even a little bit on top of that already horrendous latency is going to be noticeable by players.
0: https://displaylag.com/best-low-input-lag-tvs-gaming-by-game...
And it is weird how resolutions are the focus in streaming when the most important thing is bitrate. I feel like we need some kind of standard, because bitrate means nothing to most people.
200ms, while possible, is far from "most AAA+ games", as OP stated.
Sure, there are people who play on the lowest-end consoles, on a crappy LCD TV with game mode disabled, but let's not consider that the norm for all players/all AAA+ games, and I'm going to need hard sources showing whether those worst-case environments get even close to triple-digit latencies.
People can perceive delays smaller than their reaction window. For the sake of argument I'll say 50ms is the perceivability barrier, since we seem to be throwing numbers around here. I often get 50 or 60ms of lag on my wifi, and I would say that I have a pretty good connection. So the input lag potential with Stadia is significant: 60 > 50.
I can’t ping my router and get consistent latency that low.
Latency on speed tests varies between 15 (off peak no load) and 100ms (normal).
There is no way that by the time that all adds up, stadia is going to be a better experience than local.
My internet is also shared with other people, in a country with notoriously subpar internet (yay Australia), the closer we get to reality, the less appealing stadia becomes. The kind of game streaming I could get behind is the rainway/local streaming approach where I run the game on local hardware (pc/PS5) and stream to convenient device.
An argument I was trying to make is that for other reasons, and for a lot of games, Stadia is better than local when you take the entire experience into account. Cyberpunk 2077 is a great example of where the overall experience is subjectively better. My RTX 3070 based system renders the game and its bugs beautifully, far better than Stadia does. But is that $4500-worth of eye candy worth it compared to the $0.00-worth of totally acceptable Stadia? Lag-wise, I don't notice a difference.
I prefer playing the game on Stadia now because it's just so simple. I can use a controller or mouse and keyboard with my iPad and play from anywhere in my house. And not just my house - I've played it over an LTE connection several times without issue.
As far as latency goes - people tend to get hung up on network latency when it's only a small part of the latency story. Granted, at 100ms, it becomes a bigger part of the story, but people either don't know about, or forget, that there's more:
There's peripheral latency, "system" latency (which includes CPU, render queue, and GPU), then display latency for single player games.
Stadia, or any streaming service, adds network latency. For me, with a pretty normal American internet connection provided by a craptastic provider (because it's the only choice I have), it works great.
For what it's worth, I've also played with some of the "local" streaming tech. No joke, Stadia performs better than streaming using Steam's local streaming app, by a long shot. There's the iPad app (the name escapes me at the moment) that lets me stream my Xbox to the iPad, and it's better, but still way worse than Stadia.
Yep. I see a good example of this when I watch gameplay videos on Youtube in the highest available 1080p bitrate, and regularly see results that look far worse than playing the game in 720p, maybe even 480p. For example, it's obviously very common to pan the camera through a high-detail scene, which is trivial for a GPU to do, but incredibly information dense for a video encoder. So anything with a lot of detail blurs (in a very ugly way, not like motion blur) when there's movement.
And Youtube has the advantage that the video has as much time to record as Youtube will allow it, it doesn't need to be done with low-latency settings as Stadia does.
Of course, cable TV is even worse, but ordinary consumers don't seem to have noticed or cared about that either.
I'd say your problems 1-3 can be summarized by saying you don't think there's a market for it. I don't think I agree. The prospective market for it is probably console gamers who want to play PC games that aren't ported to their console.
Even CP2077 might be an example of this, because from what I've heard the performance is absolutely terrible on consoles, and if you haven't already spent heavily on an upgraded computer with a graphics card that's going to set you back $1K, you probably can't play it there either. So if you're the stereotypical console gamer, who doesn't care about perfect graphics and the lowest possible latencies, Stadia is going to sound like a pretty decent deal.
And that's before you get to exclusives.
There's an entire chain of things that contribute to latency, and network latency is only one part of that chain.
From what I've experienced on a pretty normal, non-optimized wifi connection (meaning I just plugged a cheap TP Link router in and did nothing to its default settings), I don't notice the latency that Stadia contributes making any difference compared to whatever amount of latency I get on my capable PC.
That's not to say network latency doesn't matter. It matters a lot to pro CS:GO players, for example (who have reaction times in the 130-300ms range, for what it's worth). Those players are willing to pay for high-poll-rate mice to shave a few milliseconds off input latency, or build $5k+ machines stuffed with insanely fast CPUs and GPUs, with $2k+ monitors with 1ms latency.
But Stadia isn't for that kind of game play.
Like I said in another comment, the talk around streaming games is almost identical to people who scoffed at services like Netflix when they first started streaming. You had Laserdisc nerds freaking out about how the streaming would produce compression artifacts, and people like Mark Cuban saying that people were crazy to think streaming video was the way to go, (all while pitching his HD satellite service).
Having used Stadia as a "normal" person might, I'm certain that in the not too distant future, streaming based gaming services will be as mainstream as Netflix is today. Despite whatever compromises it has to make.
OT, but I'm curious, what kind of router do you have? That seems really bad. I tested this on my laptop (over WiFi, in a very heavy traffic apartment building) and see the following:
50 packets transmitted, 50 received, 0% packet loss, time 49115ms
rtt min/avg/max/mdev = 0.751/1.436/5.000/0.812 ms
I don't say that to brag, I really think that's expected for any LAN device.

The obvious example here is a precision platformer like Celeste, but you can say the same (with less and less applicability) of other games, starting with FPS.
In Celeste, there are a handful of frame-perfect inputs in the game. This means you have less than a 20 ms window to get your input in, or you're dead (the game's only failure state). How is this possible, if human reaction time is only ~100 ms at best? It's because there's a difference between reaction time and timing. Reaction time measures your time-to-react to an unpredictable stimulus. Timing is your reaction to a predictable stimulus. Most of the time in games you are reacting to a stimulus that is at least somewhat predictable.
So with a little training you can reliably make that frame perfect jump. But if Stadia adds 60 ms of latency, that means your character is over 3 frames ahead of where you think she is. You're going to miss that jump a lot until you can reprogram your brain to account for the latency, as much as possible. And even then you'll probably find it harder. Throw in a little variability to the latency, so you think the character is 3 frames behind but she's actually 4, and you're doomed.
Granted, not every game is a precision platformer, so there are diminishing returns for low latency in other types of game. But if you, say, enable cross-play between Stadia and non-Stadia in a shooter, the local players are probably going to have a huge advantage. Even making it work against an AI opponent would require some significant work to make the AI's reaction time keyed to Stadia's measurement of latency, not whatever you originally hard-coded into the game.
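To put numbers on the "over 3 frames ahead" point, assuming a 60 fps game and the hypothetical 60ms of added latency used above:

```python
# How many frames of desync a given amount of added latency causes,
# assuming a 60 fps game (~16.7ms per frame). The 60ms figure is the
# hypothetical streaming overhead used in the comment above.

FRAME_MS = 1000 / 60  # ~16.7ms per frame at 60 fps

def frames_behind(added_latency_ms: float) -> float:
    return added_latency_ms / FRAME_MS

print(f"{frames_behind(60):.1f} frames")  # 60ms of lag -> 3.6 frames
```

So for a frame-perfect input whose window is a single frame, the added latency alone is several times larger than the window you're trying to hit.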
> Of course, cable TV is even worse, but ordinary consumers don't seem to have noticed or cared about that either.
According to Wikipedia, a DVB-C stream can be between 6-65 Mb/s [1], certainly higher than YouTube's 3-9 Mb/s (assuming 1080p video). The situation for resolutions above 1080p seems to be a bit better [2].
[1] https://en.m.wikipedia.org/wiki/DVB-C
[2] https://www.androidauthority.com/how-much-data-does-youtube-...
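For a sense of scale, those cited bitrate ranges translate into data volumes roughly like this:

```python
# Convert video bitrates (Mb/s) to data consumed per hour (GB),
# using the DVB-C and 1080p YouTube ranges cited above.

def gb_per_hour(mbps: float) -> float:
    return mbps * 3600 / 8 / 1000  # Mb/s -> seconds/hour -> bytes -> GB

for label, mbps in [("DVB-C low", 6), ("DVB-C high", 65),
                    ("YouTube 1080p low", 3), ("YouTube 1080p high", 9)]:
    print(f"{label:>18}: {gb_per_hour(mbps):.1f} GB/hour")
```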
A lot of games did drop off the Mac when it moved to 64-bit only though.
I imagine when Apple expands their desktop and laptop lineup to M1 chips, it's going to include many of the games that are available from their mobile catalog.
This is one of the worst parts of game Streaming - games potentially being designed around it, making them worse for everyone else.
It runs a lot better; the streaming quality, lack of glitches, and start-up times are incredible. Using Stadia in general is a polished (yet basic) experience. In contrast, Nvidia very much felt like a hack: logging in to my Steam account, seeing weird window glitches.
I see a lot of comments negative on Stadia here, based on bias rather than actual experience. Stadia is nothing short of a tech star, even with its downsides compared to the rest of the market.
I plugged in my cable box for the first time in months to watch the Super Bowl, and was shocked at how terrible the video was. I could see obvious artifacts without glasses on, and I can't even tell 720p from 1080p at that distance. Some of my relatives have those MPEG-2 channels, and I remember them being significantly worse.
Not trying to say that cable TV can never be better than Youtube's quality, of course, just trying to give a general impression of my experience with various American cable companies.
- Stadia will be just as vulnerable to "exclusive content fragmentation" as consoles. Now that they've shuttered their internal studio, they will in fact, be constantly on the defensive in the war of exclusive content.
Actually, out of curiosity I just looked up the bitrates for my local cable company. The quality seems to differ a lot: on average between 3 Mb/s MPEG-2 [1] and 12 Mb/s MPEG-4 [2]. So I guess my previous statement isn't really accurate and it depends on the channel.
That website appears to be quite interesting btw; it also tracks YouTube bitrates for live and non-live video and in different encodings! [3]
[1] https://www.digitalbitrate.com/dtv.php?mux=C049&pid=19126&li...
[2] https://www.digitalbitrate.com/dtv.php?mux=C049&pid=19130&li...
[3] https://www.digitalbitrate.com/dtv.php?lang=en&liste=2&live=...
I suppose it remains to be seen how successful Stadia will be at pulling these titles to its platform. I think you're right to worry about fragmentation. If developers view Stadia as "just another platform" that they can just choose not to support when creating exclusives, it'll fail. If Stadia can get them to view it as a kind of drop-in that lets a much larger number of people (with, say, underpowered hardware) play their game, they'll be more likely to view it as a win.
One other plausible market: people for whom the upfront cost of a platform is still too high. It's a lot easier for most parents to justify buying Cyberpunk for Stadia for their kid for Christmas than it is a brand new $500 console, or God forbid the several thousand dollar PC you'd need to play it.
12 Mbps MPEG-4 should be quite good, for the stations that support it.
63 packets transmitted, 63 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 1.659/110.684/1805.961/305.145 ms

What model of router is it? This really feels like a situation where something has to be broken; I can't imagine any router, no matter how cheap, has an expected ping rtt maxing out at around 2 seconds. Notably, your minimum rtt is under 2 ms, so it's definitely capable of getting a response to you faster than that. Maybe it's just overloaded or something?
Some of these titles I've played multiplayer via Steam without any of the related issues, granted Steam/Stadia is an apples/oranges comparison.
At the end I suggested we try Armagetron. 2.7MB download and runs on Mac/Win/Linux/Potatoes. I started up a private server and we were running a 16-player game without any issues in literally 5 minutes.