Let me clarify: We have a certain amount of latency when streaming games from both local and internet servers. In either case, how do we improve that latency, and what limits will we run into as the technology progresses?
The speed of light, so 50ms or so assuming both endpoints are on Earth. In practice a bit more, because the signal has to go around the planet rather than through the core. Game servers already make retroactive calls to compensate, which is why it sometimes looks like you hit but then you didn’t.
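That “retroactive” trick is usually called server-side lag compensation: the server keeps a short history of world states and judges each shot against the state the shooter actually saw. A minimal sketch of the idea in Python; the snapshot format and names here are illustrative, not from any particular engine:

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    time: float                  # server time the snapshot was taken
    positions: dict[str, tuple]  # player id -> (x, y) at that time

history: list[Snapshot] = []     # ring buffer of recent world states

def rewind_and_check_hit(shooter_latency: float, now: float,
                         target_id: str, shot_pos: tuple,
                         radius: float = 0.5) -> bool:
    """Judge the shot against the world as the shooter saw it,
    i.e. roughly `shooter_latency` seconds in the past."""
    shot_time = now - shooter_latency
    # Find the stored snapshot closest to when the shooter fired.
    past = min(history, key=lambda s: abs(s.time - shot_time))
    tx, ty = past.positions[target_id]
    sx, sy = shot_pos
    return (tx - sx) ** 2 + (ty - sy) ** 2 <= radius ** 2
```

The flip side is exactly the effect described above: a target that dodged on their own screen can still be ruled “hit”, because the server scores the shot against the past.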
Interestingly enough, Starlink can have lower latency than wired connections despite the longer path, because light travels noticeably slower than c through glass fiber.
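Rough numbers behind that claim. Fiber’s refractive index (~1.47, so light moves at about 0.68c in glass) and Starlink’s ~550 km orbital altitude are public figures; the 7,000 km ground distance is just an illustrative long-haul hop:

```python
C = 299_792  # km/s, speed of light in vacuum

ground_distance = 7000               # km, e.g. a transatlantic-ish hop
fiber_speed = C / 1.47               # light in glass: ~0.68 c
fiber_ms = ground_distance / fiber_speed * 1000

# Satellite path: up ~550 km, across, back down, all at full c.
sat_path = 550 + ground_distance + 550
sat_ms = sat_path / C * 1000

print(f"fiber:    {fiber_ms:.1f} ms one way")  # ~34 ms
print(f"starlink: {sat_ms:.1f} ms one way")    # ~27 ms
```

The longer vacuum path wins because the ~32% speed penalty in glass outweighs the extra ~1,100 km of travel.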
Where are you getting 50ms? The speed of light is a LOT faster than that, isn’t it?
No, it seems to be in the right order of magnitude
https://www.wolframalpha.com/input/?i=circumference+of+earth+%2F+speed+of+light
Obviously light doesn’t have to travel quite that far, but 50ms is not a bad estimate for a worst case. You also have to add processing delays at each router along the way, which slows everything down further.
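For reference, the arithmetic behind that Wolfram Alpha query, plus the two shorter paths mentioned above:

```python
C = 299_792              # km/s, speed of light in vacuum
CIRCUMFERENCE = 40_075   # km, Earth's equatorial circumference
DIAMETER = 12_742        # km, through the core (not an option for cables)

print(f"through the core: {DIAMETER / C * 1000:.1f} ms")           # ~42.5 ms
print(f"halfway around:   {CIRCUMFERENCE / 2 / C * 1000:.1f} ms")  # ~66.8 ms
print(f"full circle:      {CIRCUMFERENCE / C * 1000:.1f} ms")      # ~133.7 ms
```

So for two antipodal points, one-way propagation alone is ~67 ms in vacuum, and 50 ms is indeed the right order of magnitude.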
But that’s hardly the operative limit for cloud gaming: no cloud gaming design routes traffic all the way around the earth, or even halfway. Stadia used regional data centers, as do GeForce NOW and Shadow.
50ms seems really arbitrary.
I also think 50ms is a bit pessimistic, but there are locations far from Google’s datacenters (at least until they finish their Johannesburg location, South Africa seems very isolated), and you’re never connected via as-the-bird-flies fibre links; the actual path length will be longer than just drawing a line on a map.
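A sketch of that point: compute the great-circle lower bound, then inflate it for real-world cable routing and fiber’s slower propagation. The coordinates and the 1.5× route-inflation factor are illustrative assumptions:

```python
from math import radians, sin, cos, asin, sqrt

C_FIBER = 299_792 / 1.47   # km/s, light in glass fiber (~0.68 c)
EARTH_R = 6_371            # km, mean Earth radius

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points on a sphere."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_R * asin(sqrt(a))

# Cape Town to London, roughly the pre-Johannesburg situation.
d = great_circle_km(-33.9, 18.4, 51.5, -0.1)
rtt_ideal = 2 * d / C_FIBER * 1000     # straight-line fiber, round trip
rtt_real = rtt_ideal * 1.5             # typical route inflation
print(f"{d:.0f} km, ideal RTT {rtt_ideal:.0f} ms, realistic {rtt_real:.0f} ms")
```

That comes out to roughly 9,700 km and an ideal round trip near 95 ms before any routing detours or router hops, which is why an isolated region can’t be fixed by faster hardware alone.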
This can all be mitigated by just building more and closer edge servers, of course, but at some point you just have a computer in your room again.