The Last Computer

We’ve been saying that for a while, haven’t we? A year and a half ago, when I bought my iPad, I was sure I would never buy another laptop… well, turns out I was wrong.

Last weekend I bought the new 13″ MacBook Air with the Haswell i7.  All the benchmarks put it either at, or above, the current MacBook Pro lineup, which is due for an update shortly.  I had previously convinced myself that I was going to wait for the new Retina MacBook Pro.  But then I came to my senses.  Who am I kidding…?  My need for processing power has diminished dramatically in the past few years.  I certainly don’t have the time for gaming, nor the skill for video editing and creation.  Every once in a blue moon, I find myself on a photography kick again.  2D images, however, RAW or not, require more RAM than CPU, and after all, anything would be a vast improvement over my ’09 iMac.

A confluence of circumstances led me to actually pull the trigger last weekend – not least my itch to buy something, anything.  I had found myself using my desktop less and less, and I wasn’t sure if it was because it was so restrictive – stationary, that is – or because it was slower to navigate to web pages than my iPad.  I eventually convinced myself that I was no different from, and required no more computing power than, the average user.  The vast majority of my time is spent either at work, or working on work… neither of which requires a great deal of computing power.  Whereas in my undergrad I was thankful to have one of the first dual-core processors, able to use multithreaded processes to run my reactor core simulations faster than the computers in the library, today I do little more than word processing and web browsing – all perfectly ordinary.

I did it.  I bought a new computer, even in the face of the impending updates to the MacBook Pros.  As we shift more and more to cloud computing, the processing power of your own personal computer becomes less and less valuable.  We are shifting back to the earlier explored, and subsequently abandoned, model of the terminal and the mainframe.  The only difference is that instead of the horrendous “HP thin clients” and crappy remote desktops of your workplace, we’re using Google’s, Apple’s, and Amazon’s semi-infinite computing power and spare computing cycles to do the hard work for our perfectly capable, but comparatively underpowered, machines.

Even the harder work of video and photo editing can now be done in the cloud.  Google+ is doing some amazing things in the cloud with your photos these days.  And these on-demand computing resources increase the possibilities and usefulness of what can be done with your photos.  Beyond the normal ho-hum import, algorithms can now determine which of the 15,000 photos you took last year were of the best quality and the most interesting.

Perhaps more importantly, this model, as it always has, appears cheaper to the user.  One can now buy a Chromecast for $35 – an ingenious device.  The Chromecast is essentially a $35 computer terminal that interfaces with the cloud.  Your computer sends the request for content to Google, which then handles all the processing and encoding, and relays it down to the cheap device, which is presumably connected to your very expensive display (i.e. a large TV).  Until now, at least for me, the advent of “Smart TVs” has been relatively uneventful, chiefly because of the incredibly underpowered TV processors that were handling the mess of a user interface.  No one wanted to use that shit… it was messy, clunky, and slow.  Google can now do all of that in the cloud for you... it’s gorgeous, and you don’t even have to buy a new TV!
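The division of labor described above – the cheap terminal only forwards a request, while the cloud does the expensive processing – can be sketched in a few lines. This is a toy model, not the actual Chromecast protocol; all class and method names here are hypothetical, and the “transcoding” is just string tagging standing in for real CPU-heavy work.

```python
class CloudService:
    """Stands in for Google's servers: does the expensive work."""

    def transcode(self, content_url: str, target: str) -> str:
        # Real transcoding is CPU-intensive; here we simply tag the
        # stream to show where the heavy lifting happens.
        return f"{content_url} [encoded for {target}]"


class ThinClient:
    """Stands in for the $35 dongle: no processing, just playback."""

    def __init__(self, cloud: CloudService, display: str):
        self.cloud = cloud
        self.display = display

    def cast(self, content_url: str) -> str:
        # The client merely forwards the request; the cloud returns a
        # ready-to-play stream suited to the attached display.
        return self.cloud.transcode(content_url, self.display)


stream = ThinClient(CloudService(), "1080p TV").cast("youtube.com/watch?v=x")
print(stream)  # youtube.com/watch?v=x [encoded for 1080p TV]
```

The point of the sketch is that `ThinClient` contains no processing logic at all, which is exactly why the hardware can be so cheap.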

Do we really need all that processing power in our homes, or can we leave it to the cloud?

Just as more and more users are getting used to the idea that more and more of their personal data is stored in the cloud, we hear the stories of the NSA downloading the Internet in its entirety.  Now, my lack of confidence in the cognitive ability of most Americans means that I think the implications escape most people, but for the savvy they are vast.

“Big Data” is awfully similar to “Big Brother”.  This Orwellian scenario isn’t that far-fetched, though… even as a miscreant youth of 10, I was aware that the NSA would begin recording my calls if I mentioned certain key words, or that I might expect to see strange tracks in the carpet if my conversations were anything other than on the up-and-up; perhaps I saw too many movies.

Regardless of your feelings on who is, or is not, monitoring the Internet, a more basic question bothers me.  What if the cloud goes away?  What if access is somehow revoked…?  Do you want the ability to process data regardless of the charity of large corporations?  Should such circumstances arise, though, I’m sure my ability to share and crunch data in the cloud will be the least of my worries.

I am currently reading a Sci-Fi book – a genre that I do not often find particularly interesting: Pandora’s Star, by Peter Hamilton.  The book is incredible.  As with most Sci-Fi, though, I find myself confused.  Confused that we have not yet arrived at an already conceived, and in many senses fully developed, way of living.  Why am I not browsing the web via neural inserts?  Why is my e-butler not handling all communications and alerting me of any pertinent information?  Why am I not connecting to the ‘unisphere’?  Why is my entire life’s memory not being continually backed up in case I meet a demise, untimely or otherwise?

So many questions, so few answers.  And yet, despite the declining computer sales of late, the landscape remains relatively unchanged from my days as a child, longing for my first laptop so I could SSH into some remote Linux server and IRC with geeks around the globe.

Cloud Computing

How long will it take? That's really the only question there is surrounding cloud computing. The fact of the matter is that the unreliability, general incompetence, and backwards thinking of your company's or school's IT department will eventually become too much to deal with. Google, along with many other companies, has come up with a solution to this: let them store all of your information, keep it safe, and maintain it, and as a bonus you don't have to worry about the large costs of upgrading or the headaches that inevitably come with aging hardware and budget problems. What e-mail client does your company use? Browser version? OS? ...old is the answer.

It may take time, but soon enough we'll all be computing on the cloud. I admit, I too did not see the utility in switching from Microsoft Word and local computing to Google Docs, for example, but with the number of computers per person growing, this will make more and more sense. Especially now that Google Docs is perfectly capable of handling Microsoft's .docx format. Oddly enough, yet too late, Microsoft has also jumped on the cloud computing model, and if memory serves, the next version of Office will incorporate online storage and management of your documents in a big way.

The best part about Google, and what gives them a leg up in the arena, is that the way they make their money allows them to offer services such as these for free. This will make them much more attractive, and eventually paying $99 for MobileMe, and $300 for Microsoft Office, will become harder and harder to justify.

Cloud computing also represents a new direction in computing in general: the growing understanding that you're not really on the computer until you have a broadband internet connection. It does make sense for most of our computing to transition to the browser. This way the computers we own can be less powerful, yet maintain or even increase our productivity. As hard as it will be for some of us, myself included, the idea of a bigger and beefier personal computer will soon fade in favor of bigger and better computers we all access from cheap and plentiful terminals.