The Future is Nigh

It’s already January 5th, and I haven’t made any predictions for the year.  I thought about doing this last year, and then scrapped the post because I couldn’t think faster than things were actually happening.  How do you predict the future?  A couple of years back, I followed the advice of not trying to predict what will come, but what will disappear.  While perhaps more accurate, that’s less satisfying than the fanciful sort of over-the-top prediction.  So, here’s a mix.

Nvidia will convince itself that it is the sole heir to the future of GPU computing.  With their new Tegra X1 in hand, their blinders will go up to any other viable alternatives, which means they’ll be blindsided by reality at some point.  It is very interesting to consider the amount of compute power being brought to bear in very small packages.  The X1 is no bigger than typical chips, yet has teraflop capabilities.  Never mind that that’s more than your typical desktop of a few years back; throw a bunch of these together in a desktop-sized box, and you’ve got all of the NSA (circa 1988) at your fingertips.

More wires will disappear from my compute space.  Wi-Fi has surely been a huge boost for eliminating wires from all our lives.  In my house, which was built in 2008, every room is wired with phone jacks, coax connectors, and some of them with Ethernet ports.  The only room where I actually use the Ethernet ports is my home office, where I have several devices, some of which work best when wired up to the central hub over gigabit connections (the NAS box).  But hey, in the garage, it’s all wireless.  Even the speakers for the entertainment system are wireless.  Between traditional Wi-Fi and Bluetooth, I’m losing wires from speakers, keyboards, mice, and other devices.  The last remaining wires are for power and HDMI.  I suspect the power cords will stick around for quite some time, although, increasingly, devices will run off batteries, so power becomes a recharging chore rather than a permanent tether.  HDMI connections are soon to be a thing of the past.  This will happen in two different ways.  The first is the simple wireless HDMI adapter, a highly specialized device you can already purchase today.  The second is that ‘smart glass’, whether in the form of large ‘tablets’ or HDMI compute dongles, will essentially turn every screen into a remote desktop display, or something smarter.  And those devices will all speak Wi-Fi and/or Bluetooth.

So yeah, compute will continue to go down the power/compute density curve, and more wires will disappear.

What about storage?  I’ve got terabytes sitting idle at home.  Every single device, no matter how small, has at least a few gigabytes of one form or another.  Adding a terabyte drive to a media player costs around $50 or so.  The entirety of my archived media library is no more than a couple of terabytes.  Storage is essentially free at this point.  So what about the cloud?  I don’t have a good way to back up all my terabytes of stuff.  The easiest solution might be to get a storage account with someone and mirror from my NAS up to the cloud.  But that assumes my NAS is the central point for everything stored in my home, and it’s not.  It contains most of my media libraries, and tons of backed-up software and the like, but not everything.

Perhaps the question isn’t merely one of storage; perhaps it’s about data centers in general.  With terabytes lying about the house, and excess compute power a mere $35 away, what might emerge?  I think the home data center is on the precipice.  What might that look like?  Some clever software configured to gather, back up, distribute, and consolidate stuff.  Run RAID across multiple storage devices in the home.  Stream backups to the cloud, or simply across multiple storage devices in the home.  The key is having the software be brain-dead simple to install and run, a set-and-forget setup, something like the sketch below.  This raises the question: will the home network change?  Notions of identity and permissions and the like.  We’ll likely see a shift away from the more corporate notions of identity and authorization to a more home-based solution.
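To make the “gather and mirror” half of that idea concrete, here’s a minimal sketch in Python.  The share paths and backup root are hypothetical stand-ins, not anything from a particular product; a real setup would layer scheduling, RAID, and a cloud target on top, but the core job is just “copy anything new or changed into one place.”

```python
# A minimal sketch of set-and-forget home mirroring.
# The paths below are hypothetical examples of home shares and a backup target.
import shutil
from pathlib import Path

SOURCES = [Path("/mnt/nas/media"), Path("/mnt/laptop/photos")]   # hypothetical shares
BACKUP_ROOT = Path("/mnt/backup")                                # hypothetical target

def mirror(src: Path, dest_root: Path) -> None:
    """Copy any file that is missing or newer than its backed-up copy."""
    for file in src.rglob("*"):
        if not file.is_file():
            continue
        dest = dest_root / src.name / file.relative_to(src)
        if not dest.exists() or file.stat().st_mtime > dest.stat().st_mtime:
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(file, dest)   # copy2 keeps timestamps, so reruns are cheap

if __name__ == "__main__":
    for source in SOURCES:
        mirror(source, BACKUP_ROOT)
```

Run it from a scheduled task and it quietly keeps the backup root current; everything beyond that (cloud upload, redundancy, identity) is the part the clever software would have to add.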

And what about all that compute power?  It’s getting smaller and requiring less energy, as usual.  Self-driving cars?  Well, parallel parking assist would at least be a start, and that’s how it will start.  I’d expect a lot of this compute power to be used to improve surveillance capabilities.  Image processing will continue to make leaps in accuracy and capability.  This, tied with the rise of autonomous vehicles, is likely to make a landscape full of tiny flying and roving devices that can track a lot of things.  But that’s rather broad.  In the short term, it will simply mean that military recon will get easier, and not just for large players such as the US.  Corporate espionage will also get easier as a result.

And from the fringe?  The pace of technological advancement is always accelerating.  One thing builds on the next, and the next big thing shows up before you know it.  Writing computer software is still kind of clunky and slow for the most part, but I would expect that to accelerate too.  It will happen because the price of emulation and simulation is decreasing.  Between FPGAs and dynamically configurable virtual machines, it’s becoming easier to create new CPUs, GPUs and the like, try them out at massive scale, make improvements on the fly, and generally leverage and build systems faster than ever before.  CPUs will not only become simulated silicon, they will stop being manufactured as hard silicon, because the cycle times to create new hard silicon will be much slower than what you can do in simulation.  Parallelism will become a thing.  We will throw off the shackles of trying to build massively parallel systems using mutexes, and instead re-embrace message passing as the way the world works.  That, combined with new understandings of how massively parallel systems like brains work, will give renewed emphasis to AI systems built at the scale of data centers, and of continents.  This will be the advantage of the data center: not just compute and storage, but consolidated knowledge and increased AI capabilities.
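As a tiny illustration of the message-passing style, as opposed to shared state guarded by mutexes, here’s a sketch using Python’s standard multiprocessing queues.  The worker and its squaring “work” are stand-ins of my own, not anyone’s actual system; the point is that workers own their state and only exchange messages.

```python
# A minimal sketch of message passing: no shared memory, no locks, just queues.
from multiprocessing import Process, Queue

def worker(inbox: Queue, outbox: Queue) -> None:
    # Each worker pulls messages until it sees the None sentinel, then exits.
    for item in iter(inbox.get, None):
        outbox.put(item * item)          # stand-in for real work

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    procs = [Process(target=worker, args=(inbox, outbox)) for _ in range(4)]
    for p in procs:
        p.start()
    for n in range(20):
        inbox.put(n)                     # send work as messages
    for _ in procs:
        inbox.put(None)                  # one shutdown sentinel per worker
    results = [outbox.get() for _ in range(20)]
    for p in procs:
        p.join()
    print(sorted(results))
```

Scaling the same shape up, with queues between machines instead of processes, is roughly what the data-center version of this story looks like.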

And so it goes.  The future is nigh, so we’ll see what it brings.

 
