Because everything ran locally in a datacenter, the real killer app of Stadia would have been a super-massively multiplayer game. There wouldn't have been any latency problems between game states (any lag would be between the server and the console). Imagine massive wars or mediaeval battles with thousands of participants. They never developed games that took advantage of what was unique about the platform.
AFAIK, MMOs keep all the game state on the servers already. The difference is that what they send to the client is deltas to the game state, which the client then renders. Stadia-type services instead render everything on the datacenter side and send the client images.
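That's the core of the difference, sketched below in Python. The packet layout and the bitrate are ballpark illustrations (roughly what Stadia recommended for 1080p), not any real game's protocol:

```python
import struct

# MMO-style update: a compact per-entity delta the client renders itself.
# The field layout here is invented for illustration, not a real protocol.
def encode_entity_delta(entity_id, x, y, z, yaw):
    # 4-byte id + three 4-byte position floats + 4-byte heading = 20 bytes
    return struct.pack("<Iffff", entity_id, x, y, z, yaw)

delta = encode_entity_delta(42, 103.5, 12.0, -7.25, 1.57)
print(len(delta), "bytes per entity update")            # 20 bytes

# Stadia-style update: the server renders the scene and streams video.
# At ~20 Mbit/s for 1080p60, that's roughly 40 KB per frame, no matter
# how little actually changed on screen.
frame_bytes = 20_000_000 / 8 / 60
print(int(frame_bytes), "bytes per streamed frame (rough average)")
```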
With their expertise in networking and so on, Google might have been able to get a slight advantage in server-to-server communication, but it wouldn't have enabled anything on a whole different scale, AFAIK.
IMO, their real advantage was that they could have handled platform switching seamlessly. Take an addictive turn-based game like Civilization. Right now someone might play 20 turns before work, commute in, think about it all day, then jump back in when they get home. With Stadia, they could have let you keep playing on your phone as you take the train into work, play a few turns on a smoke break, maybe play in a web browser on your work computer if it's a slow day, then play again on your commute home and on the TV at home. And if someone wanted to watch a show, you could move over to a PC, your phone, or a laptop...
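Mechanically, that's mostly a matter of keeping the session and the save state server-side so any device can attach to it. A toy sketch, with invented names (resume_session and play_turn are not any real Stadia API):

```python
# Hypothetical sketch of server-side session continuity: the game state
# lives with the session, so any device picks up exactly where the last
# one left off. Names and structure are invented for illustration.
sessions = {}

def resume_session(account_id, device):
    state = sessions.setdefault(account_id, {"turn": 0})
    print(f"{device}: resuming at turn {state['turn']}")
    return state

def play_turn(state):
    state["turn"] += 1

morning = resume_session("alice", "phone on the train")
play_turn(morning)
play_turn(morning)

evening = resume_session("alice", "TV at home")   # same state, nothing to transfer
play_turn(evening)
print("turns played today:", evening["turn"])     # 3
```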
Larger-scale massively multiplayer capability was one of the features Google was touting at Stadia's launch:
Sure, they claimed that, but it's telling that nobody ever took them up on it.
Google's internal network may be good, but it's not going to be an order of magnitude better than what you can get in any other datacenter. If getting thousands of people into the same virtual space were just a matter of networking, an MMO would have already done it.
A shard is going to be storing the position, orientation and velocity of key entities (players, vehicles, etc.) in memory. If accessed frequently enough, they'll be in the processor's cache. There's no way networking speeds can compare with the speed of accessing that data.
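The gap is several orders of magnitude. Rough ballpark figures (not measurements from any particular system):

```python
# Ballpark access times, order-of-magnitude only:
latencies_ns = {
    "L1 cache hit":                         1,
    "main memory read":                   100,
    "round trip within a datacenter": 500_000,       # ~0.5 ms
    "round trip between regions":  50_000_000,       # ~50 ms
}

cache = latencies_ns["L1 cache hit"]
for name, ns in latencies_ns.items():
    print(f"{name:32s} {ns:>12,} ns  ({ns // cache:>12,}x a cache hit)")
```

Even the best-case hop between two shards in the same building is thousands of times slower than touching the entity data in RAM.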
That doesn't mean there couldn't have been some kinds of innovations. Take a game like Star Citizen, where there are space battles. In theory, you could store the position and orientation of everything inside a ship in one shard, and the position and orientation of the ships themselves in a second shard. Since people inside a ship aren't going to interact directly with things outside it except via the ship, you could maybe afford a bit of latency and inaccuracy there. But if you're just talking about a thousand-on-thousand melee, I think the latency between shards would be too great.
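One way to picture that split, as a toy sketch (the two-shard layout and all the names are my own illustration, not how Star Citizen actually works):

```python
# Toy two-shard layout: one shard tracks ships in world space, another
# tracks people in ship-local space. Purely illustrative.
ship_shard = {
    "ship_7": {"world_pos": (120_000.0, -4_500.0, 88_000.0)},
}

interior_shard = {
    "player_3": {"ship": "ship_7", "local_pos": (12.0, 3.5, -8.0)},
}

def world_position(player_id):
    """Compose the (possibly slightly stale) ship position with the
    player's ship-local position to get an approximate world position."""
    player = interior_shard[player_id]
    ship = ship_shard[player["ship"]]
    return tuple(s + l for s, l in zip(ship["world_pos"], player["local_pos"]))

print(world_position("player_3"))   # (120012.0, -4496.5, 87992.0)
```

The interior shard only ever needs the ship's world position approximately, which is why a bit of cross-shard lag is tolerable there but not in a melee where everyone interacts with everyone.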
You'd only be able to play with people local to you, in the same Stadia datacenter. And if Stadia wanted to minimize latency, they'd have to add more datacenters, which means even fewer people per instance.
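That's physics more than infrastructure. A rough back-of-envelope, assuming light in fiber travels at about 200 km per millisecond (two-thirds of c) and ignoring routing and processing overhead:

```python
# How far away can a player be for a given round-trip latency budget?
# 200 km/ms is roughly the speed of light in fiber; real routes are never
# straight lines, so these are optimistic upper bounds.
fiber_speed_km_per_ms = 200

for rtt_budget_ms in (10, 20, 50):
    one_way_ms = rtt_budget_ms / 2
    max_distance_km = one_way_ms * fiber_speed_km_per_ms
    print(f"{rtt_budget_ms} ms round trip -> at most ~{max_distance_km:,.0f} km from the datacenter")
```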