Nvidia demos hybrid cloud rendered lighting.


#1  Edited By AlexGlass

Remember when Matt Booty said they could use the cloud to handle certain lighting effects, and a lot of people said it was B.S.? Well, I ran across Nvidia's SIGGRAPH 2013 presentation, and they demo split-code, cloud-rendered indirect lighting running on various devices.

Video: Demonstration.

The most promising part is that they demonstrate it at latencies from 0ms on local hardware up to 200ms (6 frames at 30fps), and the difference is basically irrelevant, showing that lag and latency simply are not an issue for something like this.
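The frames figure follows directly from the frame time. A quick sketch of the arithmetic (the latency and frame-rate numbers come from the demo; the helper name is mine):

```python
# Convert network latency into frames of lag at a given frame rate.
# At 30fps a frame lasts 1000/30 ≈ 33.3ms, so 200ms is 6 frames.

def latency_in_frames(latency_ms: float, fps: float = 30.0) -> int:
    """Whole frames of lag introduced by latency_ms at the given fps."""
    return round(latency_ms * fps / 1000.0)

for ms in (0, 100, 200):
    print(f"{ms:>3}ms -> {latency_in_frames(ms)} frames at 30fps")
# 200ms -> 6 frames, matching the demo's figure
```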

They have three different versions: voxels, irradiance maps, and photon tracing, depending on how powerful the local hardware is, ranging from a basic laptop or tablet to a desktop.


Traditional local hardware rendering, where all lighting processes are done locally.


Above, all processes are done on the local hardware.

Below, all processes are done on the cloud.


So only the initial player input and the final video-stream decoding are done locally. Everything else above the white dotted line is performed on the cloud.

Hybrid, split-code, cloud-based rendering.


Individual examples of the two different processes being split and then recombined for the final image.

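The split-and-recombine flow in those slides can be sketched in a few lines. This is a toy illustration (all names and values are mine, not from the demo):

```python
# Toy model of hybrid rendering: the client computes latency-sensitive
# direct lighting every frame, the server computes expensive indirect
# lighting a few frames late, and the two are combined per pixel.

def server_indirect_pass(scene):
    # Stand-in for the cloud GI pass (voxels / irradiance maps / photons).
    return [0.25 for _ in scene]          # placeholder irradiance per pixel

def client_frame(scene, cached_indirect):
    direct = [1.0 for _ in scene]         # computed locally, this frame
    # Combine fresh direct light with the possibly-stale cloud result.
    return [d + i for d, i in zip(direct, cached_indirect)]

scene = ["px0", "px1", "px2"]
indirect = server_indirect_pass(scene)    # in reality arrives ~200ms later
print(client_frame(scene, indirect))      # [1.25, 1.25, 1.25]
```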

Neat. And obviously if you can do this with lighting, you can do it with pretty much anything that's latency-insensitive. Physics, A.I., etc.

MS needs to start demoing things like this.

Syed117

I don't think they are ready to demo that kind of stuff.

It's going to take time, but this is where the potential is. It's up to developers to take advantage of the power they have behind the scenes. Someone is going to do it. It's only a matter of time.

The only catch is that these features can only be used for online-only games. I'm sure we will see some amazing stuff once developers get used to the idea of that computing power being out there. 200ms seems like the upper extreme, and if these things can work with that much latency, I don't think we will have anything to worry about.

BurningStickMan

I guess my question is, what if the connection is spotty or not available? Does the game default to local, low-res lighting, and what does that look like?

AlexGlass

@syed117 said:

I don't think they are ready to demo that kind of stuff.

It's going to take time, but this is where the potential is. It's up to developers to take advantage of the power they have behind the scenes. Someone is going to do it. It's only a matter of time.

The only catch is that these features can only be used for online-only games. I'm sure we will see some amazing stuff once developers get used to the idea of that computing power being out there. 200ms seems like the upper extreme, and if these things can work with that much latency, I don't think we will have anything to worry about.

No, you can do this for single-player games too, games that are traditionally offline, but obviously you need to be connected to the server to take advantage of it.

That's one of the reasons I was 100% on board with the online and bandwidth requirements. Dropping them probably set this back a year or two, since now you will have people without good connections joining the user base.

Hopefully they can just put stickers on the box listing the bandwidth requirement, or a warning before you download the digital game, to keep the slowpokes away, because I can already imagine some bloke buying a game that requires this, getting online, and complaining about how it doesn't run properly because he's connecting to Xbox Live over a 768kbps DSL connection.

charlie_victor_bravo

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

AlexGlass

@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Obviously you didn't watch the video. There's no issue with latency: it's demonstrated at up to 200ms, which is much slower than what even decent online multiplayer gaming requires. We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

What you are talking about is a myth spread by people who didn't understand how it works.

jimmyfenix

@alexglass said:

@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Obviously you didn't watch the video. There's no issue with latency: it's demonstrated at up to 200ms, which is much slower than what even decent online multiplayer gaming requires. We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

What you are talking about is a myth spread by people who didn't understand how it works.

It's great that the Xbox One will not be always-on, always-connected :p

#8  Edited By Nergrim
@alexglass said:

@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Obviously you didn't watch the video. There's no issue with latency: it's demonstrated at up to 200ms, which is much slower than what even decent online multiplayer gaming requires. We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

What you are talking about is a myth spread by people who didn't understand how it works.

So when you say it obviously works, you mean to say they have tried this with 30 million people playing these cloud-graphics-enabled games at the same time?

#9  Edited By Syed117

@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Steam's servers don't exactly stand up to Xbox Live's. There are more people playing on Xbox Live on a daily basis than on Steam, and Xbox Live doesn't exactly crash every single day because of that usage. A single game like COD has more players than the top 10 games on Steam combined.

If they are expanding the service as much as they say they are, all these things are definitely possible.

There will always be hiccups, but it is definitely possible. The whole point of the demo in the first post is to prove that it is possible.

@jimmyfenix said:
@alexglass said:
@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Obviously you didn't watch the video. There's no issue with latency: it's demonstrated at up to 200ms, which is much slower than what even decent online multiplayer gaming requires. We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

What you are talking about is a myth spread by people who didn't understand how it works.

It's great that the Xbox One will not be always-on, always-connected :p

Even when the Xbox One had the online checks, it wasn't always-on. It checked in once every 24 hours. There is a difference.

This tech is obviously built for games that are always online, like Destiny, Titanfall, Planetside, and every future game that will be online-only.

@niko555 said:
@alexglass said:
@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Obviously you didn't watch the video. There's no issue with latency: it's demonstrated at up to 200ms, which is much slower than what even decent online multiplayer gaming requires. We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

What you are talking about is a myth spread by people who didn't understand how it works.

So when you say it obviously works, you mean to say they have tried this with 30 million people playing these cloud-graphics-enabled games at the same time?

I would love to know what game you are talking about that would have 30 million players. The biggest-selling game right now is COD, and it sells maybe 10 million copies, with a maximum of 3-4 million concurrent players.

30 million? No.

#10  Edited By AlexGlass

@jimmyfenix said:

@alexglass said:

@charlie_victor_bravo said:

Too much latency, too much server load. If Steam's and other services' servers can be ground to a halt by foreseeable events (like sales), I don't see how it is possible to do real-time complex rendering in a popular game without serious issues.

Obviously you didn't watch the video. There's no issue with latency: it's demonstrated at up to 200ms, which is much slower than what even decent online multiplayer gaming requires. We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

What you are talking about is a myth spread by people who didn't understand how it works.

It's great that the Xbox One will not be always-on, always-connected :p

Mine will be.

And I really don't think that's going to stop games like Titanfall from coming out. Halo 5's development was also announced to be centered around the cloud. So two of MS's biggest upcoming IPs will require it, which is good news for me.

Over 40 million Xbox Live users are ready to go, and most of them meet the requirements. That's a significant user base. And it's about time my Road Runner connection got put to good use when it comes to gaming.

charlie_victor_bravo

@alexglass:

What is your ping time to Google? Add to that all the other sources of latency: controller, monitor, compositing the images together, server, HDD, ISP, firewalls, etc.

@alexglass said:

We've had online multiplayer for decades now, and the latency requirement for something like this is more lenient than for an online multiplayer game.

How is it more lenient? You have to wait for the data from the cloud to sync with the local data, then process them together.

#12  Edited By EXTomar

This stuff has been kicked around for a while; pre-calculating a lot of this data into the scene data is something many do now. The drawback is that the scenes have to be static to gain the speedup. Doing anything like adding or removing lights, moving lights, or taking out geometry invalidates all of the calculations.

I still believe that although there are some stellar applications for distributed rendering systems that work well in "cloud architecture", the on-the-fly scene building found in video games isn't one of them. Another way to think about it: if engineers can make this style of scene calculation work super efficiently over the Internet, then someone else can make that same thing work even better locally, handling even more geometry and larger systems.

AlexGlass

@extomar said:

This stuff has been kicked around for a while; pre-calculating a lot of this data into the scene data is something many do now. The drawback is that the scenes have to be static to gain the speedup. Doing anything like adding or removing lights, moving lights, or taking out geometry invalidates all of the calculations.

I still believe that although there are some stellar applications for distributed rendering systems that work well in "cloud architecture", the on-the-fly scene building found in video games isn't one of them. Another way to think about it: if engineers can make this style of scene calculation work super efficiently over the Internet, then someone else can make that same thing work even better locally, handling even more geometry and larger systems.

Uhm...too many don't even bother to read the post. They are demonstrating DYNAMIC indirect lighting. The video actually shows the lights moving throughout the scene. I mean, seriously...it's in the OP.

And the advantage is that if you can do this on a server, then you DON'T have to bog down your local resources to do it. That's the entire point: to extend the capabilities of the local hardware. If you can get things like lighting, physics, and A.I. for free from a server, why would you want to use up local hardware resources for them? Use the freed-up local hardware to improve the game even more by having it do something else.
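That "keep using the last result instead of blocking on the server" pattern is easy to sketch. Here's a minimal, hypothetical version using a thread pool in place of a real network call (every name here is illustrative):

```python
# Sketch: offload latency-insensitive work without stalling the frame
# loop. The local loop polls for the remote result and keeps using the
# last completed one (or a baked fallback) in the meantime.
from concurrent.futures import ThreadPoolExecutor

def cloud_lighting_job(version):
    # Stand-in for the server-side pass; really a network round trip.
    return f"lighting-v{version}"

executor = ThreadPoolExecutor(max_workers=1)
cached = "lighting-v0"                    # baked/default fallback
pending = executor.submit(cloud_lighting_job, 1)

for frame in range(3):                    # the frame loop never blocks;
    if pending.done():                    # it only polls for completion
        cached = pending.result()
        pending = executor.submit(cloud_lighting_job, frame + 2)
    print(f"frame {frame} rendered with {cached}")

executor.shutdown(wait=True)
```

Whether a given frame uses a fresh or stale result depends only on timing, which is exactly why the work being offloaded has to be latency-insensitive.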

#14  Edited By Nergrim

@syed117: I did say gameS in my post.

There will obviously be more than one game using this technology.

#15  Edited By Syed117

@niko555 said:

@syed117: I did say gameS in my post.

There will obviously be more than one game using this technology.

Yes, but 30 million people all playing games that require this technology? That's not reasonable. Xbox Live has a total of 46 million members. I don't think we will see that many people all playing games at the same time for a long time.

They have been building the infrastructure for years. It's getting bigger every day and it's waiting for someone to use it for games.

The whole point of this stuff is that you push more and more to cloud so that the local hardware can do other things.

AlexGlass

Also...remember this?


LOL.

Blu3V3nom07

Well it sure looks purdy.

Nergrim

@syed117: Ah yes, well, I may have exaggerated a little bit.

I was just saying that I don't believe it until I see a bit more proof.

Akrid

Very, very cool. I guess latency is definitely not the issue there, as anything under 1000ms still looked relatively reasonable to me - and I know for a fact that irradiance caches are usually a matter of kilobytes, so reaching that extreme would probably be very rare.

I would be a bit hesitant in thinking that MS is ready to provide that full rack of GPUs they showed per person on XBL, however. I'm doubtful that the quality of irradiance mapping shown there could be achieved in the near future while still being affordable for anyone involved - as is often the case with proof-of-concept tech demos - and anything less would inevitably start to look kind of terrible. Still, it's very interesting considering that the consoles will be around for a long time while GPUs and CPUs continue to improve alongside them, and this could potentially give the Xbox One some scalability that the PS4 doesn't have. Though with their Gaikai buy-out, who knows what Sony is cooking. Hell, the guy in the video straight-up mentioned Gaikai - a hint at its future, perhaps?

Another sticking point with this sort of thing is developer adoption. I mean, not a whole lot of developers have even adopted pre-baked GI, and that's been available to them for years now. It seems a trend for them to sit comfortably right behind the bleeding edge of technology. As cool as it is, I could see this being a feature only first-party games will be forced to utilize. Unless MS provides some stellar support in propagating and promoting the tech, most developers won't want to bother - especially if the PS4 version isn't going to benefit.


#20  Edited By rebgav

@alexglass said:

Neat. And obviously if you can do this with lighting, you can do it with pretty much anything that's latency-insensitive. Physics, A.I., etc.

Sure, you could, but why would you? The gains are small, and the comparative inefficiency of a round trip over the Internet vs. calculating these things locally seems pretty crazy.

Why not jump straight to the end-game and just OnLive that shit? Games designed from the ground up to take full advantage of this half-and-half nonsense are probably going to be grotesquely compromised; I honestly think that anyone who is on board for this vision of the future should opt to skip the awkward growing pains and go straight to full server-side rendering. I guess that's impossible right now for a service the size of Xbox Live due to the horrific expense.


#21  Edited By onarum

So if your modem breaks, you get messed-up lighting in your games? Sounds fantastic...

AlexGlass

Looks like GAF is picking up on my slides... Maybe someone should tell those GAF "experts" that Azure doesn't need GPUs to do indirect lighting like this. The dual Xeons in the Azure servers are fully capable of the processing, streaming, and rendering even without GPUs. You can use Azure as a render farm just fine, and MS doesn't actually have to add GPUs to their servers.

In fact, forget a simple line of code like indirect lighting...

LOS ANGELES, California – Axceleon is actively demonstrating the Microsoft Azure integrated workflow rendering process and render farms to customers and prospects.

Axceleon has released CloudFuzion for Azure and is showing existing Media & Entertainment customers and prospects how easy it is to integrate existing studio workflows into a cloud based render farm on Microsoft Azure with no impact on the animator or artist. CloudFuzion is integrated with applications such as Autodesk Maya, 3dsMax, Softimage, Adobe After Effects and allows launching of image renders from the application directly to an Azure render farm anywhere in the world. CloudFuzion will move the scene, including any attributes or references, from the studio data repository to the Azure render farm and in turn will move the resulting rendered images back to the studio data repository as part of an automated workflow. The animator or artist is oblivious as to where the images are being processed or rendered.

http://www.cloudfuzion.com/press/Axceleon_Unveils_Microsoft_Render_Farms.html

You can run an entire 3D animation program on Azure faster than you can on an Alienware laptop: rendering, ray-tracing, everything, and stream it down just like on a traditional streaming cloud service. The Octane Cloud Renderer can also run on Azure and easily outperforms local hardware like the Alienware laptop trying to do the same. In that demo, Autodesk's software runs entirely on the cloud, so the server runs the whole application, does the processing and rendering, and just streams back the results, much like a typical streaming cloud service.


So no, for those of you wondering, MS doesn't need to add GPUs to their servers. The Xeons will be just fine and offer more than enough rendering power for most of your games, especially for partial code processing.

You can run any code on a CPU. Obviously some of it is better suited to GPUs, but CPUs have traditionally been used for developing ray-tracing engines. Only recently have there been enough improvements to get them running on GPUs, and in this case Nvidia is optimizing its code for GPUs because that's what they do: their GRID is made up of Kepler GPUs. But you can also optimize it to run on CPU servers like Azure, and I'm sure MS is already doing this. No need to change the chips; change the code.

CPUs are actually preferred for complex code. GPUs are fast, but dumb, and better suited for simpler code that benefits from parallel processing.

Syed117

@onarum said:

So if your modem breaks, you get messed-up lighting in your games? Sounds fantastic...

If your modem breaks, you can't really play always-online games, can you?

You might as well complain about games like Destiny or Titanfall not working because your modem is broken. Sometimes it really feels like people post without thinking for two seconds.


#24  Edited By EXTomar

@alexglass said:

@extomar said:

This stuff has been kicked around for a while; pre-calculating a lot of this data into the scene data is something many do now. The drawback is that the scenes have to be static to gain the speedup. Doing anything like adding or removing lights, moving lights, or taking out geometry invalidates all of the calculations.

I still believe that although there are some stellar applications for distributed rendering systems that work well in "cloud architecture", the on-the-fly scene building found in video games isn't one of them. Another way to think about it: if engineers can make this style of scene calculation work super efficiently over the Internet, then someone else can make that same thing work even better locally, handling even more geometry and larger systems.

Uhm...too many don't even bother to read the post. They are demonstrating DYNAMIC indirect lighting. The video actually shows the lights moving throughout the scene. I mean, seriously...it's in the OP.

And the advantage is that if you can do this on a server, then you DON'T have to bog down your local resources to do it. That's the entire point: to extend the capabilities of the local hardware. If you can get things like lighting, physics, and A.I. for free from a server, why would you want to use up local hardware resources for them? Use the freed-up local hardware to improve the game even more by having it do something else.

And you didn't bother to read my response. If an engineer can build an efficient DYNAMIC indirect lighting system that uses Internet resources, then an engineer can also build an efficient DYNAMIC indirect lighting system that screams locally, handling more geometry and larger systems.

There is nothing magical going on here. This stuff was kicked around at multiple SIGGRAPHs back in the 1990s, and all of the technical issues are still there; the only thing that has changed is the availability of remote distributed resources.

I'll give you a little hint: when you wrote "...you DON'T have to bog down your local resources to do this..." you clearly didn't understand which part of the rendering pipeline is "expensive". What other purpose is there for engineering and building an expensive graphics processor onto a graphics card than to do this very thing? Stand back, everyone, we've got a genius who knows better!

There are applications for this; as I mentioned, a "render farm" for CG applications would see a big speedup. Off the top of my head, the only game I can think of that could effectively use this is Dota 2, because so many of its scene parameters are fixed.

AlexGlass

@rebgav said:
@alexglass said:

Neat. And obviously if you can do this with lighting, you can do it with pretty much anything that's latency-insensitive. Physics, A.I., etc.

Sure, you could, but why would you? The gains are small, and the comparative inefficiency of a round trip over the Internet vs. calculating these things locally seems pretty crazy.

Why not jump straight to the end-game and just OnLive that shit? Games designed from the ground up to take full advantage of this half-and-half nonsense are probably going to be grotesquely compromised; I honestly think that anyone who is on board for this vision of the future should opt to skip the awkward growing pains and go straight to full server-side rendering. I guess that's impossible right now for a service the size of Xbox Live due to the horrific expense.

Because OnLive has to deal with the limitations of a full streaming cloud service: lag, plus resolution and frame-rate caps, currently 720p/30fps, just like any other streaming cloud service. By splitting the code, you only offload work that is latency-insensitive, so the player doesn't have to deal with those downsides.

Not to mention that if you stream everything from the cloud, your local hardware is useless; you could just as easily plug a SmartTV into it and see the same result. By splitting the code, you are still taking advantage of both the local hardware AND the server. They work together, and the local CPU and GPU are put to use in addition to the server.

And you can probably figure out how to dump off code to the cloud that might require hundreds of GFLOPs or even TFLOPs worth of processing, which might be far more than what the console is actually capable of. It wouldn't just be used for indirect lighting, but for a lot of different latency-insensitive code: physics, A.I., lighting, etc.

Once you start dissecting code and adding it all up, figuring out just how much is latency-insensitive, I think more and more will be offloaded to the cloud, and the amount of processing saved will make it more than worth it. I mean, if I said 50% of your game code is latency-insensitive, and I can save you 1 TFLOP of processing power and give you an additional TFLOP to play with, how would that not be worth it?

Some things, like path or ray tracing, may simply be out of reach of the local hardware altogether, or absolutely crushing, but easily doable on a powerful server and at no cost to the local hardware.
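The 50%/1 TFLOP figure above is hypothetical, but the accounting behind it is simple. A toy version with assumed numbers (none of these are measured values):

```python
# Toy compute budget (all numbers assumed for illustration): if half of
# the per-frame work is latency-insensitive and moves to the server, the
# local budget freed for latency-sensitive work doubles in effect.

local_budget_tflops = 1.0    # assumed local capacity per frame
frame_cost_tflops = 1.0      # assumed total per-frame workload
offloadable_share = 0.5      # assumed latency-insensitive fraction

remote = frame_cost_tflops * offloadable_share   # runs on the server
local = frame_cost_tflops - remote               # still runs locally
headroom = local_budget_tflops - local           # freed local capacity

print(f"server: {remote} TFLOP, local: {local} TFLOP, freed: {headroom}")
# server: 0.5 TFLOP, local: 0.5 TFLOP, freed: 0.5
```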

rebgav

@alexglass said:

I mean, if I said 50% of your game code is latency-insensitive, and I can save you 1 TFLOP of processing power and give you an additional TFLOP to play with, how would that not be worth it?

I would say "that seems deeply unlikely," and then I would go on and on about distributed computing being a horribly expensive dead-end and about network throughput increasing at a much, much slower rate than the raw power of hardware. Assuming we're still talking hypothetically, I'd then chastise you for spending all this time and these resources on building a service for tomorrow that will be obsolete the day after tomorrow, and I'd wrap up by screaming about potential ISP traffic throttling and the insanity of betting everything on a network-based service that requires the complicity of uninterested parties to achieve the infrastructure expansion needed to make it truly viable long-term. Obviously, we would disagree strongly.

I mean, from a realistic perspective, it's almost certain that within a decade you could build an Xbox four times as powerful as the new Xbox for the same price. Is their network going to expand at that rate? Are server-to-user data rates going to expand at that pace? Probably not, right?

AlexGlass

@rebgav said:

@alexglass said:

I mean, if I said 50% of your game code is latency-insensitive, and I can save you 1 TFLOP of processing power and give you an additional TFLOP to play with, how would that not be worth it?

I would say "that seems deeply unlikely," and then I would go on and on about distributed computing being a horribly expensive dead-end and about network throughput increasing at a much, much slower rate than the raw power of hardware. Assuming we're still talking hypothetically, I'd then chastise you for spending all this time and these resources on building a service for tomorrow that will be obsolete the day after tomorrow, and I'd wrap up by screaming about potential ISP traffic throttling and the insanity of betting everything on a network-based service that requires the complicity of uninterested parties to achieve the infrastructure expansion needed to make it truly viable long-term. Obviously, we would disagree strongly.

I mean, from a realistic perspective, it's almost certain that within a decade you could build an Xbox four times as powerful as the new Xbox for the same price. Is their network going to expand at that rate? Are server-to-user data rates going to expand at that pace? Probably not, right?

Azure is already built. I'm not really sure I understand your argument, because it seems to me you are arguing about whether building a network computing service is feasible in the first place, and the answer to that is clearly a resounding yes.

Consoles go obsolete. Servers get upgraded.


#28  Edited By BazookaXp

@alexglass said:

@rebgav said:

@alexglass said:

I mean, if I said 50% of your game code is latency-insensitive, and I can save you 1 TFLOP of processing power and give you an additional TFLOP to play with, how would that not be worth it?

I would say "that seems deeply unlikely," and then I would go on and on about distributed computing being a horribly expensive dead-end and about network throughput increasing at a much, much slower rate than the raw power of hardware. Assuming we're still talking hypothetically, I'd then chastise you for spending all this time and these resources on building a service for tomorrow that will be obsolete the day after tomorrow, and I'd wrap up by screaming about potential ISP traffic throttling and the insanity of betting everything on a network-based service that requires the complicity of uninterested parties to achieve the infrastructure expansion needed to make it truly viable long-term. Obviously, we would disagree strongly.

I mean, from a realistic perspective, it's almost certain that within a decade you could build an Xbox four times as powerful as the new Xbox for the same price. Is their network going to expand at that rate? Are server-to-user data rates going to expand at that pace? Probably not, right?

Azure is already built. I'm not really sure I understand your argument, because it seems to me you are arguing about whether building a network computing service is feasible in the first place, and the answer to that is clearly a resounding yes.

Consoles go obsolete. Servers get upgraded.

Alex, what is your take on the DMEs in the Xbox One and their ability to compress/decompress at 200 MB/s?

How much data over the baseline 1.5Mbps connection confirmed by Microsoft could this amount to for cloud computing?


Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@alexglass said:

@rebgav said:

@alexglass said:

I mean if I said 50% of your game code is latency-insensitive, and I can save you 1TFLOP of processing power and give you an additional TFlop to play with, how would that not be worth it?

I would say "that seems deeply unlikely," and then I would go on and on and on about distributed computing being a horribly expensive dead-end and about network throughput increasing at a much, much slower rate than the raw power of hardware. Assuming we're still talking hypothetically, I'd then chastise you for spending all this time and these resources on building a service for tomorrow which will be obsolete the day after tomorrow, and I'd wrap up by screaming about potential ISP traffic throttling and the insanity of betting everything on a network based service which requires almost complicity from uninterested parties to achieve the expansion of infrastructure required to make it truly viable long-term. Obviously, we would disagree strongly.

I mean, from a realistic perspective, it's almost certain that within a decade you could build an xbox four times as powerful as the new xbox for the same price. Is their network going to expand at that same rate? Are server-to-user data-rates going to expand at that same pace? Probably not, right?

Azure is already built. I'm not really sure I understand your argument because it seems to me you are arguing whether or not building a network computing server is feasible in the first place, and the answer to that is clearly a resounding yes.

Consoles get obsolete. Servers get upgraded.

Alex, what is your take on the DMEs in the Xbox One and their ability to compress/decompress at 200 MB/s?

How much data over the baseline 1.5Mbps connection confirmed by Microsoft could this amount to for cloud computing?

Actually that's something we were discussing in depth over at Ars Technica, but the argument was more about physics particles and how many XYZ coordinates you could fit in that bandwidth. Not so much about what the DMEs are capable of compressing and decompressing, and I have no clue where one would get that info.

We never really came to a solid conclusion. At first the estimates from one poster were pretty low in terms of raw code, but then Microsoft came out with the E3 asteroids demo which pushed a lot of objects around so those data move engines definitely seem to be key.

A better cue and estimate can be drawn from this:

No Caption Provided

Of course, now that MS dropped that requirement, I would imagine developers can also choose to go up if they decide the user base is there, but if they are still trying to stay within that 1.5Mbps, then the irradiance maps row is the one to look at.
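
Just to put that 1.5Mbps figure in perspective, here's a quick back-of-the-envelope; the link speed and frame rate are just the commonly cited numbers, nothing measured:

```python
# Rough per-frame budget for cloud-assisted rendering over a 1.5Mbps link.
# Both inputs are illustrative assumptions, not measurements.

LINK_MBPS = 1.5   # Microsoft's stated baseline connection, in megabits/s
FPS = 30          # target frame rate

bytes_per_second = LINK_MBPS * 1_000_000 / 8   # 187,500 bytes/s
bytes_per_frame = bytes_per_second / FPS       # 6,250 bytes/frame

print(f"{bytes_per_second:,.0f} B/s -> {bytes_per_frame:,.0f} B per frame")
```

A few kilobytes per frame is why small, slowly changing data like irradiance maps fits the budget while anything frame-sized doesn't.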

Avatar image for bazookaxp
BazookaXp

18

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#30  Edited By BazookaXp

Thanks for supplying that chart; it is very informative.

Here is the source for compression/decompression:

http://www.eurogamer.net/articles/digitalfoundry-in-theory-can-xbox-one-cloud-transform-gaming

Various less efficient but non-destructive compression techniques can be used such as zip and Lempel-Ziv (LZ), and it's of note that Microsoft has included four dedicated 'Memory Move Engines' on its main processor, two of which have LZ abilities. One has LZ decode and the other LZ encode, meaning Microsoft has such interest in compressing data that it has dedicated silicon to the job instead of leaving it to the CPU. The raw specs of 200MB/s data decoding is certainly enough to handle any amount of data traffic coming from the internet at broadband speeds, but the inclusion of these engines sadly doesn't point conclusively to an intention in streaming compressed assets from the cloud.

Avatar image for onarum
onarum

3212

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@syed117 said:

You might as well complain about games like destiny or titanfall not working because your modem is broken. Sometimes it really feels like people post without thinking for two seconds.

Indeed it's a problem when people post without thinking...

they claim this thing can be used for any game right? so if there's a SINGLE PLAYER game that uses a technique like this for its lighting and your modem breaks/ your ISP effs up/ a fucking truck hits a pole and breaks the fiber (or any other multitude of things that can fuck up your connection) what happens then?

I bet that you're one of those people that felt like having the console call home every 24 hours so you could play the single player game you paid for was perfectly fine right?

Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

Thanks for supplying that chart; it is very informative.

Here is the source for compression/decompression:

http://www.eurogamer.net/articles/digitalfoundry-in-theory-can-xbox-one-cloud-transform-gaming

Various less efficient but non-destructive compression techniques can be used such as zip and Lempel-Ziv (LZ), and it's of note that Microsoft has included four dedicated 'Memory Move Engines' on its main processor, two of which have LZ abilities. One has LZ decode and the other LZ encode, meaning Microsoft has such interest in compressing data that it has dedicated silicon to the job instead of leaving it to the CPU. The raw specs of 200MB/s data decoding is certainly enough to handle any amount of data traffic coming from the internet at broadband speeds, but the inclusion of these engines sadly doesn't point conclusively to an intention in streaming compressed assets from the cloud.

Thanks. Interestingly, I think the local data size for irradiance maps in that chart (48MB) is in the ballpark of the DMEs' 32MB, and the DMEs were definitely created with the intent of operating independently of GPU/CPU resources. In other words, I believe Eurogamer's conclusion is probably based more on thinking about streaming down large amounts of video or texture data, but in this case we're talking about specific streamlined code. If you want to stream down video, you would just go straight to RAM. The DMEs are clearly not made to receive streaming video, but for specific streamlined code they should suffice.
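
For anyone curious, the LZ-family compression those DMEs do in silicon can be approximated in software with zlib (DEFLATE is LZ77 plus Huffman coding). The payload below is a made-up stand-in for latency-insensitive simulation data, just to show the lossless round trip:

```python
import zlib

# Toy stand-in for latency-insensitive state: 10,000 XYZ coordinate
# triples serialized as text. A real engine would pack binary floats;
# this only shows that LZ-family codecs shrink redundant data losslessly.
payload = b"".join(b"%d,%d,%d;" % (i, i * 2, i * 3) for i in range(10_000))

compressed = zlib.compress(payload, level=6)   # DEFLATE = LZ77 + Huffman
restored = zlib.decompress(compressed)

assert restored == payload                     # non-destructive, as the article notes
print(f"{len(payload)} -> {len(compressed)} bytes "
      f"({len(payload) / len(compressed):.1f}x smaller)")
```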

Avatar image for bazookaxp
BazookaXp

18

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@onarum said:

@syed117 said:

You might as well complain about games like destiny or titanfall not working because your modem is broken. Sometimes it really feels like people post without thinking for two seconds.

Indeed it's a problem when people post without thinking...

they claim this thing can be used for any game right? so if there's a SINGLE PLAYER game that uses a technique like this for its lighting and your modem breaks/ your ISP effs up/ a fucking truck hits a pole and breaks the fiber (or any other multitude of things that can fuck up your connection) what happens then?

I bet that you're one of those people that felt like having the console call home every 24 hours so you could play the single player game you paid for was perfectly fine right?

For one, you need to calm down, and two, there will of course be a dynamic system that would take over in case you lose internet.

There is no reason to think a redundancy couldn't be put in place that automatically adjusts settings so the local hardware can take over.
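
A minimal sketch of what that fallback could look like; every name here is hypothetical, not any real engine API:

```python
# Hypothetical graceful degradation: use cloud-computed indirect lighting
# while results arrive in time, else fall back to a cheaper local pass.

STALE_AFTER_FRAMES = 6   # ~200ms at 30fps, the worst case in the Nvidia demo

def pick_lighting(cloud_result, frames_since_update, local_fallback):
    """Return the cloud lighting if it is fresh enough, else the local one."""
    if cloud_result is not None and frames_since_update <= STALE_AFTER_FRAMES:
        return cloud_result
    return local_fallback

# Connection drops: no cloud result for 30 frames -> local lighting wins.
print(pick_lighting(None, 30, "local_ambient"))        # local_ambient
print(pick_lighting("cloud_gi", 3, "local_ambient"))   # cloud_gi
```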

Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@onarum said:

@syed117 said:

You might as well complain about games like destiny or titanfall not working because your modem is broken. Sometimes it really feels like people post without thinking for two seconds.

Indeed it's a problem when people post without thinking...

they claim this thing can be used for any game right? so if there's a SINGLE PLAYER game that uses a technique like this for its lighting and your modem breaks/ your ISP effs up/ a fucking truck hits a pole and breaks the fiber (or any other multitude of things that can fuck up your connection) what happens then?

I bet that you're one of those people that felt like having the console call home every 24 hours so you could play the single player game you paid for was perfectly fine right?

For one, you need to calm down, and two, there will of course be a dynamic system that would take over in case you lose internet.

There is no reason to think a redundancy couldn't be put in place that automatically adjusts settings so the local hardware can take over.

You can, but if I was a developer, I wouldn't bother; I'd just put a requirements sticker on my box and tell them to upgrade to broadband or they can't play my game. I might also decide to package in warning signs for people to put on their poles to make sure trucks stop hitting them, since apparently it's a common occurrence ;)

Avatar image for liquidprince
LiquidPrince

17073

Forum Posts

-1

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

#35  Edited By LiquidPrince

This continues to be a shitty idea. I don't want my games to have effects calculated elsewhere. What happens when the servers go down for a specific game, or I want my children to play a specific game 20 years in the future? It's a neat idea in theory, but in practice it's garbage. I want the game calculations to be done locally so that what I hold in my hands is a final and complete product without the need to be connected to something else.

Avatar image for bazookaxp
BazookaXp

18

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

This continues to be a shitty idea. I don't want my games to have effects calculated elsewhere. What happens when the servers go down for a specific game, or I want my children to play a specific game 20 years in the future? It's a neat idea in theory, but in practice it's garbage. I want the game calculations to be done locally so that what I hold in my hands is a final and complete product without the need to be connected to something else.

Azure servers do not work this way; they are dynamic, so 10 years from now you can start a Titanfall game online and it will start up a server for you. It is definitely not "garbage" or "shitty"; in fact, this is the future, and where we are heading for more than just gaming.

Internet connections are only going to improve and Azure is only going to get bigger.

Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#37  Edited By AlexGlass

@liquidprince said:

This continues to be a shitty idea. I don't want my games to have effects calculated elsewhere. What happens when the servers go down for a specific game, or I want my children to play a specific game 20 years in the future? It's a neat idea in theory, but in practice it's garbage. I want the game calculations to be done locally so that what I hold in my hands is a final and complete product without the need to be connected to something else.

In most cases, and in the beginning, it's most likely only going to be used for additions, not as a replacement. So if your local hardware is maxed out, like for example with its current lighting model, it's not going to affect your game any if I decide to keep my system connected and benefit from additional effects like indirect lighting.

It's up to developers to decide how and what parts of their game they want to offload. You can still have a game that runs completely fine on local hardware, is designed to take full advantage of the local hardware, and will still run completely fine 20 years from now, but if a person decides to connect to Azure, they get enhancements.

Avatar image for liquidprince
LiquidPrince

17073

Forum Posts

-1

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

@bazookaxp: This is assuming that the servers still exist in the same capacity that they did when whatever game in question launched. It seems naive to believe that anyone would dedicate any amount of server capabilities to older games for extended periods of time.

Also @alexglass, there really isn't such a thing as an "addition" in the gaming world. Once someone sees the game with the "extra" effects on, that becomes the baseline for what they expect the game to look like. So when you are not connected to the server and your game looks less than it does when connected, you get the sense that the game is incomplete and missing elements, rather than the game adding something.

Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#39  Edited By AlexGlass

@liquidprince said:

@bazookaxp: This is assuming that the servers still exist in the same capacity that they did when whatever game in question launched. It seems naive to believe that anyone would dedicate any amount of server capabilities to older games for extended periods of time.

Also @alexglass, there really isn't such a thing as an "addition" in the gaming world. Once someone sees the game with the "extra" effects on, that becomes the baseline for what they expect the game to look like. So when you are not connected to the server and your game looks less than it does when connected, you get the sense that the game is incomplete and missing elements, rather than the game adding something.

Err... what? Ever heard of ports? PC games? Or hardware of varying power? This is what happens every generation.

How many console games get ported over to the PC, where they run with better graphics, better resolutions, etc.? What about even between PC players with PhysX-capable graphics cards and those without?

By your argument no one should ever make a game that takes advantage of additional hardware. Or port a game to weaker hardware.

Avatar image for bazookaxp
BazookaXp

18

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#40  Edited By BazookaXp

@bazookaxp: This is assuming that the servers still exist in the same capacity that they did when whatever game in question launched. It seems naive to believe that anyone would dedicate any amount of server capabilities to older games for extended periods of time.

Also @alexglass, there really isn't such a thing as an "addition" in the gaming world. Once someone sees the game with the "extra" effects on, that becomes the baseline for what they expect the game to look like. So when you are not connected to the server and your game looks less than it does when connected, you get the sense that the game is incomplete and missing elements, rather than the game adding something.

They don't just dedicate servers to a specific game; it's dynamic, and they can host any game at any time.

Avatar image for isomeri
isomeri

3528

Forum Posts

300

Wiki Points

0

Followers

Reviews: 0

User Lists: 26

Seems neat and all but the exciting part is that there's an HD-DVD drive in the very beginning of that video.

Avatar image for rebgav
rebgav

1442

Forum Posts

335

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#42  Edited By rebgav

@rebgav said:

@alexglass said:

I mean if I said 50% of your game code is latency-insensitive, and I can save you 1TFLOP of processing power and give you an additional TFlop to play with, how would that not be worth it?

I would say "that seems deeply unlikely," and then I would go on and on and on about distributed computing being a horribly expensive dead-end and about network throughput increasing at a much, much slower rate than the raw power of hardware. Assuming we're still talking hypothetically, I'd then chastise you for spending all this time and these resources on building a service for tomorrow which will be obsolete the day after tomorrow, and I'd wrap up by screaming about potential ISP traffic throttling and the insanity of betting everything on a network based service which requires almost complicity from uninterested parties to achieve the expansion of infrastructure required to make it truly viable long-term. Obviously, we would disagree strongly.

I mean, from a realistic perspective, it's almost certain that within a decade you could build an xbox four times as powerful as the new xbox for the same price. Is their network going to expand at that same rate? Are server-to-user data-rates going to expand at that same pace? Probably not, right?

Azure is already built. I'm not really sure I understand your argument because it seems to me you are arguing whether or not building a network computing server is feasible in the first place, and the answer to that is clearly a resounding yes.

Consoles get obsolete. Servers get upgraded.

Assume the computational power ratio of Azure is currently 3:1 per user compared to the hypothetical Xbox One userbase, which I think is what was stated at the original reveal. If we estimate, conservatively, that a bespoke gaming console released ten years from now would be about ten times more powerful than the hardware coming out this year, then to maintain that power ratio the per-user server allocation would also have to be multiplied by ten. Let's say that by the end of the gen you're processing twice the data server-side that you are at the start, then to maintain the ratio the server capacity has to double. To me, that seems a pretty huge investment just to start the following generation in the same position as the coming gen, and unless we were to artificially limit technological advancement in console gaming, it's an investment requirement which never ends - and presumably increases with the expansion of the service, greater adoption rates, and the enterprise and OS/Software applications unrelated to gaming.

As I think I said in an earlier post, the small gains on the gaming side probably won't keep pace with advancements in hardware either, which just makes betting on hybrid rendering seem sketchy as hell.
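
That scaling argument reduces to bare arithmetic; every input below is one of the assumptions stated above, not a measured figure:

```python
# The ratio argument in numbers; all inputs are assumptions from the post.
ratio = 3            # claimed Azure:console compute ratio at the reveal
console_growth = 10  # hypothetical console 10x more powerful in a decade
data_growth = 2      # server-side workload doubling within the generation

# Holding a 3:1 ratio against a 10x console while the per-user workload
# doubles means per-user server compute must grow by 10 * 2 = 20x.
required = console_growth * data_growth
print(f"per-user server capacity must grow {required}x to hold {ratio}:1")
```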

Avatar image for liquidprince
LiquidPrince

17073

Forum Posts

-1

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

@liquidprince said:

@bazookaxp: This is assuming that the servers still exist in the same capacity that they did when whatever game in question launched. It seems naive to believe that anyone would dedicate any amount of server capabilities to older games for extended periods of time.

Also @alexglass, there really isn't such a thing as an "addition" in the gaming world. Once someone sees the game with the "extra" effects on, that becomes the baseline for what they expect the game to look like. So when you are not connected to the server and your game looks less than it does when connected, you get the sense that the game is incomplete and missing elements, rather than the game adding something.

Err... what? Ever heard of ports? PC games? Or hardware of varying power? This is what happens every generation.

How many console games get ported over to the PC, where they run with better graphics, better resolutions, etc.? What about even between PC players with PhysX-capable graphics cards and those without?

By your argument no one should ever make a game that takes advantage of additional hardware. Or port a game to weaker hardware.

Yes, exactly my point. The PC version of games always becomes the standard when you talk about how good a game looks. Then when mentioning any other version of the game, you always refer to it as missing elements. "Oh game X is missing SSAO, or game Y is missing high quality shadows that the PC version has..." You refer to the lower versions by what they are missing rather than what the top version adds.

Avatar image for nekroskop
Nekroskop

2830

Forum Posts

47

Wiki Points

0

Followers

Reviews: 0

User Lists: 2

Won't work until the planet is covered in data centers and fiber a la Warhammer 40K. I can't wait for this to crash and burn.

Avatar image for syed117
Syed117

407

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

All this talk of PC games and PC versions and still the hate on the XB1. We've already gone through a decade of online-only games. Every MMO ever, Diablo 3, The Sims recently. Those games die when their servers die. That's where we are headed. Where are the people complaining about all the other games that require servers to work? Somehow, now that the idea is coming to consoles, the flood gates open.

It doesn't matter if a game needs servers simply to run or to produce any other kind of in game effect. It's been happening for more than a decade. As more and more games take advantage of on online connectivity, we are going to see more games that simply die off when it's not practical to run the servers anymore.

Avatar image for slashdance
SlashDance

1867

Forum Posts

1

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

Servers cost money, right? Would anyone realistically dedicate servers to... make their game look slightly better? Does this sound like a stupid idea or am I missing something?

Avatar image for isomeri
isomeri

3528

Forum Posts

300

Wiki Points

0

Followers

Reviews: 0

User Lists: 26

Servers cost money, right? Would anyone realistically dedicate servers to... make their game look slightly better? Does this sound like a stupid idea or am I missing something?

Microsoft is paying for the servers, or rather the servers are paid for with our Xbox Live subscriptions. This leaves game developers with free servers to test stuff out on.

Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@rebgav said:

@alexglass said:

@rebgav said:

@alexglass said:

I mean if I said 50% of your game code is latency-insensitive, and I can save you 1TFLOP of processing power and give you an additional TFlop to play with, how would that not be worth it?

I would say "that seems deeply unlikely," and then I would go on and on and on about distributed computing being a horribly expensive dead-end and about network throughput increasing at a much, much slower rate than the raw power of hardware. Assuming we're still talking hypothetically, I'd then chastise you for spending all this time and these resources on building a service for tomorrow which will be obsolete the day after tomorrow, and I'd wrap up by screaming about potential ISP traffic throttling and the insanity of betting everything on a network based service which requires almost complicity from uninterested parties to achieve the expansion of infrastructure required to make it truly viable long-term. Obviously, we would disagree strongly.

I mean, from a realistic perspective, it's almost certain that within a decade you could build an xbox four times as powerful as the new xbox for the same price. Is their network going to expand at that same rate? Are server-to-user data-rates going to expand at that same pace? Probably not, right?

Azure is already built. I'm not really sure I understand your argument because it seems to me you are arguing whether or not building a network computing server is feasible in the first place, and the answer to that is clearly a resounding yes.

Consoles get obsolete. Servers get upgraded.

Assume the computational power ratio of Azure is currently 3:1 per user compared to the hypothetical Xbox One userbase, which I think is what was stated at the original reveal. If we estimate, conservatively, that a bespoke gaming console released ten years from now would be about ten times more powerful than the hardware coming out this year, then to maintain that power ratio the per-user server allocation would also have to be multiplied by ten. Let's say that by the end of the gen you're processing twice the data server-side that you are at the start, then to maintain the ratio the server capacity has to double. To me, that seems a pretty huge investment just to start the following generation in the same position as the coming gen, and unless we were to artificially limit technological advancement in console gaming, it's an investment requirement which never ends - and presumably increases with the expansion of the service, greater adoption rates, and the enterprise and OS/Software applications unrelated to gaming.

As I think I said in an earlier post, the small gains on the gaming side probably won't keep pace with advancements in hardware either, which just makes betting on hybrid rendering seem sketchy as hell.

Why are you worried about 10 years from now? If you are talking about 3X the CPU processing power, that's quite a bit for this generation. And the simple fact that these same servers can be used as cloud streaming services says there's a huge benefit to gaming in general. If I can plug in my 6-year-old laptop and get current-gen graphics that would never run on my laptop otherwise, I'm sure I would do the same with a console as well.

In the case of Azure it's already being used for a lot of other things, and we have no idea how much of this network is even going to be used to begin with. Considering MS is offering anyone that wants to use the platform unlimited storage, free one month trials and the capability of spinning up servers on demand, I'd say they have a lot to spare.

I also don't think you considered just how much money $5 a month on a 40 million person user base amounts to. Xbox Live users have already paid for the network many times over, so I'm not too concerned about cost. It's about time MS actually gave something back for all that money they've been charging for a matchmaking service. So going back to your argument, by the end of the generation you also have that many more $5-a-month subscribers.

Opinions aside, there is a very well-known and documented industry-wide movement towards cloud gaming. So those that are doing it, be it Microsoft, Nvidia, OnLive, Agawi, or even Nintendo, obviously see financial feasibility in it. When you have all the major video game manufacturers neck-deep in the tech, I'm still not sure what you are arguing. It's clearly feasible to build network computing servers, because all the major players are already doing it and are spending billions on it. And yes, they are worth upgrading. And upgrading is a lot cheaper than installing the initial infrastructure, which in the case of Azure is pretty much already done.

A smaller and perfect example is OnLive, which is currently in the process of upgrading its servers to Nvidia's GRID. And that's a service catering to a userbase of 3-4 million users. It's clearly a successful enough venture to keep going if they're willing to upgrade.

Besides, you don't necessarily have to throw out all the hardware every 4 years. It's expandable. You can just add to it and full server side upgrade cycles can run longer than local hardware. Though companies probably find plenty of benefits from upgrading to more efficient, more powerful machines that require less space and less electricity as technology advances.
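
For scale, the subscription arithmetic from above; the subscriber count and price are the post's assumptions, not official figures:

```python
# Back-of-the-envelope Xbox Live revenue using the post's own figures.
subscribers = 40_000_000
dollars_per_month = 5

annual_revenue = subscribers * dollars_per_month * 12
print(f"${annual_revenue:,} per year")   # $2,400,000,000 per year
```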

Avatar image for onarum
onarum

3212

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#49  Edited By onarum

@bazookaxp said:

For one, you need to calm down, and two, there will of course be a dynamic system that would take over in case you lose internet.

There is no reason to think a redundancy couldn't be put in place that automatically adjusts settings so the local hardware can take over.

yeah, but if the box can do it on its own anyway, why bother with such an intricate system? Well, I guess some FPS gains could be achieved, but still...

@alexglass: unfortunately in my country it is; most of our lines go through light poles instead of being underground. This year alone there were 4 extended internet outages caused by accidents where a vehicle hit a pole and broke the ISP's optical fiber line; one of them lasted for 2 days...

Avatar image for alexglass
AlexGlass

704

Forum Posts

5

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#50  Edited By AlexGlass

@onarum said:

@bazookaxp said:

For one, you need to calm down, and two, there will of course be a dynamic system that would take over in case you lose internet.

There is no reason to think a redundancy couldn't be put in place that automatically adjusts settings so the local hardware can take over.

yeah, but if the box can do it on its own anyway, why bother with such an intricate system? Well, I guess some FPS gains could be achieved, but still...

@alexglass: unfortunately in my country it is; most of our lines go through light poles instead of being underground. This year alone there were 4 extended internet outages caused by accidents where a vehicle hit a pole and broke the ISP's optical fiber line; one of them lasted for 2 days...

Well, I'm not sure if you're talking about Eastern Europe or South America, but while a lot of those countries, especially third-world countries, have been behind, I've seen trends there of newer tech being introduced from the get-go, like fiber optic, which is a plus.

A lot of places in Eastern Europe missed out on the DSL or cable era but are already getting fiber optic implemented. And yes, in a lot of cases the cable is actually hanging from balcony to balcony, but such is life in some parts. Heck, I'd still take an FO cable over Road Runner even if it meant running it from my neighbor's house or dealing with thieves stealing it and selling it off every couple of months, but in NC I'm stuck with cable for now.

So consider yourself lucky that you're talking about fiber optic cutting off on you. Wish I had your "problem".