More and more often, I keep hearing this question: “Why would I need a gigabit? There’s nothing I could possibly do to need that kind of bandwidth.” On the surface, I can see the point. There are precious few home applications that, on their own, would consume this kind of bandwidth. Let’s take a look at how you could use that kind of bandwidth.
I’d like to offer a real-world scenario of my own usage. We have two streaming TVs, two laptops, a desktop, two smartphones, and a Kindle tablet. Our phone service is Google Voice through an ObiHai box, plus a Sprint microcell to boost our cell coverage. We use a fair amount of Netflix and Amazon Prime streaming. I have all of the systems backing up to CrashPlan (currently 100GB or so) and syncing files with ownCloud. I also work from home full-time, so I spend a good chunk of my week (45+ hours) connected to the VPN doing large file transfers, Webex sessions, and VoIP.
I also have a 5.3TB NAS sitting in my closet. It holds backups of all of our software installers, our entire ripped CD/DVD/Blu-ray collection, and a huge library of TV shows (currently 2.2TB and counting). It currently runs Plex for media streaming, and I’m planning on adding both ownCloud and CrashPlan to it for file sync and backups, respectively. Ideally, I’d love for friends and family to be able to use it too.
The outgoing HD streams would be about 20Mbps a pop. Backups and sync would probably run around 10Mbps per user, if they have a good connection. I also occasionally host a Minecraft server and download multi-gigabyte games over Steam. When I add it all up, it’s not hard to see peak usage of over 200Mbps in both directions.
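For what it’s worth, here’s a rough back-of-the-envelope tally of that peak scenario as a quick script. The per-stream and per-user rates are the estimates above; the counts of streams and users and the Steam/VPN/Webex figures are hypothetical round numbers I’m using for illustration, not measurements.

```python
# Back-of-the-envelope peak bandwidth tally (all figures in Mbps).
# Per-stream/per-user rates come from the estimates above; the counts and
# the Steam/VPN/Webex numbers are hypothetical round figures for illustration.
upstream = {
    "Outgoing Plex HD streams (4 x 20 Mbps)": 4 * 20,
    "Remote CrashPlan/ownCloud users (6 x 10 Mbps)": 6 * 10,
    "VPN file transfers, Webex, VoIP": 40,
    "Minecraft server and misc.": 20,
}
downstream = {
    "Two streaming TVs (2 x 20 Mbps)": 2 * 20,
    "Steam game download": 100,
    "VPN file transfers, Webex": 40,
    "ownCloud sync, phones, tablet": 20,
}

for direction, flows in (("Upstream", upstream), ("Downstream", downstream)):
    print(f"{direction}: {sum(flows.values())} Mbps at peak")
    for name, mbps in flows.items():
        print(f"  {mbps:>4} Mbps  {name}")
```

Add one more outgoing stream or a second Steam download to that sketch and you’re comfortably past 200Mbps in either direction.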
Granted, you’re probably not looking at usage as intense as mine. But what happens in a few years when a few data-hungry teenagers are uploading their 50-megapixel photographs while you try to watch a 4K video? How about when you need to back up 100GB worth of movies and pictures from your latest family vacation? What happens when your hard drive fails and you need to restore over 500GB worth of backups to get back on track? Those aren’t far-fetched ideas, and they certainly aren’t going to be outside the norm for long. Can you do it with a slower connection? Maybe, but the experience won’t be any good.
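To put that restore scenario in concrete terms, here’s a quick, hypothetical transfer-time calculation at a few common link speeds. It assumes ideal, sustained throughput with no protocol overhead or provider throttling, so real restores would take even longer.

```python
# How long does a 500 GB restore take at various link speeds?
# Assumes ideal, sustained throughput; overhead and throttling make it worse.
restore_gb = 500
restore_bits = restore_gb * 8 * 10**9  # decimal gigabytes -> bits

for label, mbps in [("3 Mbps DSL", 3), ("25 Mbps cable", 25),
                    ("100 Mbps", 100), ("1 Gbps", 1000)]:
    seconds = restore_bits / (mbps * 10**6)
    print(f"{label:>13}: {seconds / 3600:7.1f} hours ({seconds / 86400:6.2f} days)")
```

At DSL speeds that restore takes more than two weeks of continuous transfer; at gigabit it’s an afternoon, which is rather the point.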
And really, this is what gigabit is about: removing the barrier between what you’d do on your local network and what you’d do over the Internet with remote networks. There’s not much in the way of a single application that would use a gigabit connection on its own, but there are lots of individual applications that, added together, can saturate even 100Mbps. Would you prefer to carefully plan and ration your usage so that too many episodes of Dora the Explorer don’t bog down your marathon Left 4 Dead 2 session? Or should the bandwidth flow so freely that, like electricity, you don’t worry about which straw will break the camel’s back?
My argument, when people bring up the “no one really uses that much bandwidth” line, is this: why does it hurt for them to be able to copy a file faster? It’s all about the burst, so why artificially limit it?
Everyone would love to post their pictures to Flickr or Costco a little faster. Not to mention feeling like you could actually recover all your files from CrashPlan. I know a few people who used Mozy or the like until they realized it was unrealistic to restore their files over their 3/1 DSL line.
My favorite white paper on the subject is “A Blueprint for Big Broadband.”
AR (augmented reality) world-mapping systems are something that will hopefully be pervasive in 10 years or so. A half dozen users on a single home access point could easily saturate anything less than a gig.
AR UIs, AR gaming, AR productivity, and so on will be the next big thing once we have mobile platforms powerful enough to handle them (HSA-enabled ARMv8 chips may finally get us there) and Internet connections big enough to feed them, because they will eat a lot of bandwidth.
Game streaming is going to be huge. NVIDIA is doing it with the Shield, Valve is doing it with SteamOS, and now Microsoft is planning it for the Xbox One. Even more than bandwidth, latency is going to really jump to the forefront with these systems. Gamers do not tolerate even a few extra milliseconds of delay.
The Shield’s highest-quality mode requires a 25Mbit upload connection. There are only two ways to get that in Utah, Utopianet or Google Fiber in Provo, and I suppose maybe a university connection.
It is hard to encode video feeds in real time and keep them small, so generally they are larger than what you watch on YouTube or Netflix.
This is also why, in the case of the NVIDIA Shield, a GeForce 760 (or was it a 660?) is the lowest-end card you can use to stream your PC games to the Shield device.