The Last Computer

We’ve been saying that for a while, haven’t we? A year and a half ago, when I bought my iPad, I was sure I would never buy another laptop… well, turns out I was wrong.

Last weekend I bought the new 13-inch MacBook Air with the Haswell i7.  All the benchmarks put it either at, or above, the current MacBook Pro lineup, which is due for an update here shortly.  I had previously convinced myself that I was going to wait for the new Retina MacBook Pro.  But then I came to my senses.  Who am I kidding…?  My need for processing power has diminished dramatically in the past few years.  I certainly don’t have the time for gaming, nor the skill for video editing and creation.  Every once in a blue moon, I find myself on a photography kick again.  2D images, however, RAW or not, demand more RAM than CPU, and after all, anything would be a vast improvement over my ’09 iMac.

A confluence of circumstances led me to actually pull the trigger last weekend, not the least of which was my desire to buy something, anything.  I had found myself using my desktop less and less, and I wasn’t sure if it was because it was so restrictive (stationary, that is) or because it was slower at loading web pages than my iPad.  I eventually convinced myself that I am no different from, and require no more computing power than, the average user.  The vast majority of my time is spent either at work or working on work… neither of which requires a great deal of computing power.  In my undergrad I was thankful to have one of the first dual-core processors, able to run my multithreaded reactor core simulations faster than the computers in the library; today I do little more than word processing and web browsing, all perfectly ordinary.

I did it.  I bought a new computer, even in the face of the impending updates to the MacBook Pros.  As we shift more and more to cloud computing, the processing power of your own personal computer becomes less and less valuable.  We are shifting back to the earlier explored, and subsequently abandoned, model of the terminal and the mainframe.  The only difference is that instead of the horrendous ‘HP thin clients’ and crappy remote desktops of your workplace, we’re using Google’s, Apple’s, and Amazon’s semi-infinite computing power and spare cycles to do the hard work for our perfectly capable, but comparatively underpowered, machines.

Even the harder work of video and photo editing can now be done in the cloud.  Google+ is doing some amazing things with your photos these days.  And these on-demand computing resources expand what can usefully be done with them: beyond the normal ho-hum import, algorithms can now determine which of the 15,000 photos you took last year were of the best quality and the most interest.
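For a sense of how even a crude version of that ranking might work, here’s a minimal sketch in Python that scores photos by sharpness (variance of edge intensity) using Pillow and keeps the top few.  The folder path and the use of sharpness as a stand-in for “quality” are my own assumptions; whatever Google actually runs on its servers is far more sophisticated.

    # Toy "pick the best photos" sketch: rank images by a crude sharpness score.
    # Illustrative heuristic only; not how Google+ actually judges your photos.
    from pathlib import Path

    from PIL import Image, ImageFilter, ImageStat  # pip install Pillow


    def sharpness_score(path):
        """Crude focus metric: variance of edge intensities in a grayscale copy."""
        img = Image.open(path).convert("L")
        edges = img.filter(ImageFilter.FIND_EDGES)
        return ImageStat.Stat(edges).var[0]


    def best_photos(folder, top_n=10):
        """Return the top_n sharpest JPEGs in a folder, a stand-in for 'best'."""
        paths = Path(folder).expanduser().glob("*.jpg")
        return sorted(paths, key=sharpness_score, reverse=True)[:top_n]


    if __name__ == "__main__":
        # Hypothetical folder; point it at wherever last year's photos live.
        for photo in best_photos("~/Pictures/2013"):
            print(photo)

The point isn’t the particular metric; it’s that the scoring of thousands of photos is exactly the kind of embarrassingly parallel grunt work that a cloud of spare cycles does far better than the laptop on your lap.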

Perhaps more importantly, this model, as it has always been, appears cheaper to the user.  One can now buy a Chromecast for $35, an ingenious device.  The Chromecast is essentially a $35 computer terminal that interfaces with the cloud.  Your computer sends the request for content to Google, which handles all the processing and encoding and relays it down to the cheap device, which is presumably connected to your very expensive display (i.e. a large TV).  Until now, at least for me, the advent of “Smart TVs” has been relatively uneventful, chiefly because of the incredibly underpowered TV processors handling a mess of a user interface.  No one wanted to use that shit… it was messy, clunky, and slow.  Google can do all of that in the cloud for you now... it’s gorgeous, and you don’t even have to buy a new TV!
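To make that division of labor concrete, here’s a rough conceptual sketch; the class and method names are hypothetical, not the real Google Cast API.  The sender only ever ships a tiny content request, and the heavy lifting (and the video itself) never touches your laptop:

    # Conceptual sketch of the thin-client cast model described above.
    # Names are hypothetical; this is not the real Google Cast API.

    class Cloud:
        """Stands in for Google's servers: does the heavy processing and encoding."""

        def transcode(self, content_id):
            # Imagine an expensive encode job running on cloud hardware.
            return f"stream-for-{content_id}"


    class Chromecast:
        """The $35 terminal: it only knows how to fetch a stream and display it."""

        def __init__(self, cloud):
            self.cloud = cloud

        def cast(self, content_id):
            stream = self.cloud.transcode(content_id)  # the cloud does the hard work
            print(f"Showing {stream} on the big, expensive TV")  # device just renders


    class Laptop:
        """The sender: it transmits a tiny request, never the video itself."""

        def send(self, device, content_id):
            device.cast(content_id)


    Laptop().send(Chromecast(Cloud()), "cat-video-42")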

Do we really need all that processing power in our homes, or can we leave it to the cloud?

Just as users are getting used to the idea that more and more of their personal data is stored in the cloud, we hear stories of the NSA downloading the Internet in its entirety.  Now, my lack of confidence in the cognitive ability of most Americans means that I think the implications escape most people, but for the savvy they are vast.

“Big Data” is awfully similar to “Big Brother”.  This Orwellian scenario isn’t that far-fetched though… even as a miscreant youth of 10, I was aware that the NSA would begin recording my calls if I mentioned certain key words, or that I might expect to see strange tracks in the carpet if my conversations were anything other than on the up-and-up; perhaps I saw too many movies.

Regardless of your feelings on who is, or is not, monitoring the Internet, a more basic question bothers me.  What if the cloud goes away?  What if access is somehow revoked…?  Don’t you want the ability to process your own data regardless of the charity of large corporations?  Should such circumstances arise, though, I’m sure my ability to share and crunch data in the cloud would be the least of my worries.

I am currently reading a Sci-Fi book, a genre I do not often find particularly interesting: Pandora’s Star, by Peter Hamilton.  The book is incredible.  As with most Sci-Fi, though, I find myself confused.  Confused that we have not yet arrived at a way of living that has already been conceived of and, in many senses, fully developed.  Why am I not browsing the web via neural inserts?  Why is my e-butler not handling all communications and alerting me to any pertinent information?  Why am I not connecting to the ‘unisphere’?  Why is my entire life’s memory not being continually backed up in case I meet a demise, untimely or otherwise?

So many questions, so few answers.  And yet, despite the declining computer sales of late, the landscape remains relatively unchanged from my days as a child, longing for my first laptop so I could SSH into some remote Linux server and chat on IRC with geeks around the globe.