A white paper by Microsoft Research details a way to lower latency in cloud gaming, through a project called DeLorean.
In cloud gaming, very little processing is done on the user’s device. Instead, a remote server handles game execution and rendering on behalf of thin clients, which simply display the game. Potentially, any device can play any game this way. The downside is that latencies are often prohibitive, as commands have to travel to the server and the rendered frames have to be sent back to the thin client (a delay known as the round-trip time, or RTT). Response times can be in excess of 100ms, which is not acceptable for certain types of games.
The DeLorean system has been developed for mobile cloud gaming and can mask up to 250ms of network latency, says Microsoft Research. Essentially, the server predicts the player’s likely inputs and renders the corresponding frames before those inputs actually occur in-game, sending them to the client device one RTT ahead of time. When the input does take place, the matching frame is already on the device, so users perceive no latency.
To make the system possible, DeLorean uses: i) future input prediction; ii) state space subsampling and time shifting; iii) misprediction compensation; and iv) bandwidth compression.
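The speculation loop behind the first and third of these techniques can be sketched in a few lines of Python. The snippet below is only an illustration, not Microsoft’s implementation: the naive input predictor, the advance() and render() stubs, and the crude fallback on a misprediction are all hypothetical placeholders for what the white paper describes.

```python
# Illustrative sketch of the DeLorean-style speculation idea -- not the
# actual Microsoft Research implementation. advance(), render() and the
# prediction logic are hypothetical stand-ins.

from dataclasses import dataclass

@dataclass
class SpeculativeFrame:
    assumed_input: str   # the player input this frame was rendered for
    pixels: bytes        # stand-in for an encoded video frame

def predict_inputs(recent_inputs):
    """Future input prediction: guess the player's likely next inputs.
    A real predictor would be probabilistic rather than this naive rule."""
    last = recent_inputs[-1] if recent_inputs else "idle"
    return {last, "idle"}  # guess: keep doing the same thing, or stop

def server_tick(game_state, recent_inputs):
    """Render one speculative frame per predicted input and ship the set
    to the thin client one RTT before the input actually happens."""
    frames = []
    for guess in predict_inputs(recent_inputs):
        future_state = advance(game_state, guess)   # speculatively simulate
        frames.append(SpeculativeFrame(guess, render(future_state)))
    return frames

def client_display(frames, actual_input):
    """Pick the frame whose assumed input matches what the player did.
    On a miss, DeLorean applies misprediction compensation; here we just
    fall back to the first speculative frame."""
    for frame in frames:
        if frame.assumed_input == actual_input:
            return frame.pixels
    return frames[0].pixels

# Hypothetical game and renderer stubs so the sketch runs on its own.
def advance(state, player_input):
    return dict(state, last_input=player_input)

def render(state):
    return f"frame rendered for input '{state['last_input']}'".encode()

if __name__ == "__main__":
    state = {"last_input": "idle"}
    speculative = server_tick(state, ["turn_left"])
    print(client_display(speculative, "turn_left").decode())
```

Because several speculative frames are sent for each real one, the state space subsampling and bandwidth compression techniques listed above are what keep the rendering and network cost of this approach manageable.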
DeLorean was tested with two commercially available games: Doom 3, a first-person shooter, and Fable 3, an action RPG. According to Microsoft Research, players ‘overwhelmingly’ preferred the DeLorean system over the traditional cloud gaming experience (no surprise there – nobody likes lag – TA).
Display Daily Comments
We had a long chat with Nvidia about latency at this year’s MWC event, and keen readers might remember that when Nvidia demonstrated its remote gaming, it used a very distant server to do the work. The cloud is, no doubt, the future for a lot of gaming. (BR)