Remember when Matt Booty said they could use the cloud to do certain lighting effects and render on the cloud, and a lot of people called it B.S.? Well, I ran across Nvidia's SIGGRAPH 2013 presentation, and in it they demo split-code, cloud-rendered indirect lighting running on various devices.
The most promising part is that they demonstrate it at latencies from 0 ms on local hardware up to 200 ms (6 frames at 30 fps), and the difference is basically imperceptible, showing that lag and latency simply are not an issue for something like this.
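To see why a 6-frame delay is tolerable, here's a minimal sketch (my own illustration, not code from the talk): direct lighting is fast-changing and computed locally every frame, while the indirect bounce term changes slowly, so an indirect value that's 200 ms stale is nearly identical to the current one. The lighting functions and their rates of change are hypothetical numbers chosen just to make the point.

```python
FPS = 30
DELAY_FRAMES = 6  # ~200 ms of cloud round-trip latency at 30 fps

def direct_light(frame):
    """Fast-changing local term (e.g. a moving specular highlight) -- hypothetical."""
    return 0.6 + 0.4 * ((frame % 10) / 10)

def indirect_light(frame):
    """Slow-changing bounce term -- hypothetical, drifts gently over seconds."""
    return 0.2 + 0.001 * frame

def shade(frame, cloud_delay):
    """Combine this frame's direct light with a possibly stale indirect result."""
    stale_frame = max(0, frame - cloud_delay)
    return direct_light(frame) + indirect_light(stale_frame)

frame = 120
local = shade(frame, 0)              # everything computed this frame
hybrid = shade(frame, DELAY_FRAMES)  # indirect term is 6 frames old
error = abs(local - hybrid)
print(f"local={local:.4f} hybrid={hybrid:.4f} error={error:.4f}")
# the error is 6 frames * 0.001 per frame = 0.006, under 1% of the pixel value
```

The same arithmetic is why the technique breaks down for direct lighting or camera motion: those terms change too much in 200 ms to reuse a stale value.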
They show three different versions (voxel-based, irradiance maps, and photon tracing), chosen based on how powerful the local hardware is, ranging from a basic laptop or tablet up to a desktop device.
Traditional local hardware rendering where all lighting processes are done locally.
Above, all processes are done on the local hardware.
Below, all processes are done on the cloud.
So only the initial player input and the final video-stream decoding are handled locally. Everything else above the white dotted line is performed on the cloud.
Hybrid, split-code, cloud-based rendering.
Individual examples of the two different processes being split and then recombined into the final image.
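The recombination step itself is simple. As a rough sketch (again my own illustration, not Nvidia's code, with made-up pixel values): the locally rendered direct-lighting buffer and the cloud-streamed indirect-lighting buffer are just summed per pixel, clamped to the displayable range.

```python
def combine(direct_buf, indirect_buf):
    """Sum the two lighting passes per pixel, clamping to [0, 1]."""
    return [round(min(1.0, d + i), 3) for d, i in zip(direct_buf, indirect_buf)]

# 4-pixel toy framebuffers (hypothetical values)
direct   = [0.8, 0.5, 0.1, 0.9]   # rendered locally every frame
indirect = [0.1, 0.2, 0.3, 0.4]   # decoded from the cloud's stream
final = combine(direct, indirect)
print(final)  # [0.9, 0.7, 0.4, 1.0]
```

Because the combine is a cheap per-pixel add, even a tablet-class GPU can do it, which is the whole point of pushing only the expensive indirect pass to the cloud.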
Neat. And obviously, if you can do this with lighting, you can do it with pretty much anything that's latency-insensitive: physics, A.I., etc.
MS needs to start demoing things like this.