Thursday, May 19, 2011

When the brain gets tired

Does anyone at the CIA have the problem of someone coming into the company kitchen and stealing their lunch from the fridge? Is it a common problem? Do people whose very job is gathering the intelligence that keeps our nation safe puzzle over who would be so rude?

Does it spur them to innovation?

Have tiny cameras and poisons one can hide inside common lunch foods been created primarily to prevent corporate kitchen lunch theft, and only later reworked for more covert tasks? Does the CIA actually hire somebody to steal lunches, keeping an internal arms race going to advance the intelligence gathering required to unmask the lunch thief? Are they hiring? Do lunch bringers at the CIA regularly bring in tasty lunches?

Have CEOs learned from this practice? Do they make a habit of stealing the lunches of their engineers when those engineers are struggling to solve a difficult problem, in the hopes of spurring inspiration from the rage-driven hunger of finding one’s lunch gone? What, then, do they do to those who perpetually eat out? Do those who consistently dine out find their wages not keeping up with their peers and the market in general, because their bosses secretly want them to start bringing in lunch so that those same bosses may steal it in hopes of increasing breakthroughs and productivity?

For all the books written about creating workplaces and environments that inspire, does it really just come down to stealing people’s lunches?

Monday, May 2, 2011

That which remains

When computers first appeared, they were massive mainframe machines with terminal stations that people logged into to get their work done. Over time, the personal computer arrived, and people were able to set up an individual PC that operated in isolation from all of the others to do their own work.

But then the ARPANET had to go all viral, and now everything is talking to everything else on this internet thingy. What’s more, a lot of the services that were developed for personal computers have migrated onto the web, and the PCs themselves are slimming down into mere terminals for accessing said online services.

We’ve gone back to mainframes. Only now we call it a cloud.

The analogy is a very gross one, I realize, but it’s my analogy and I like it! And the topic of this rant is not the merits of the above analogy, but data retention. More and more data resides almost exclusively in the cloud. But what is the retention policy? In 150 years, barring any strange medical breakthroughs, it is a safe bet that everyone currently using the internet and all of these cloud services will be dead. It’s hard to get a clear picture of how much data gets added to the cloud every day or year, but the term exabyte shows up in the forecasts. One exabyte is a million one-terabyte hard drives.
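That unit math is easy to sanity-check. A quick back-of-the-envelope sketch in Python, nothing beyond the conversion itself:

    # Back-of-the-envelope: how many 1 TB drives fit in an exabyte?
    TB = 10**12  # one terabyte, in bytes
    EB = 10**18  # one exabyte, in bytes

    drives_per_exabyte = EB // TB
    print(f"1 exabyte = {drives_per_exabyte:,} one-terabyte drives")
    # -> 1 exabyte = 1,000,000 one-terabyte drives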

So hundreds, thousands, maybe even millions of exabytes of data will end up stored in accounts that have gone inactive. Pictures of birthdays, emails about travel plans, videos that are remixes of songs once popular that no one even remembers anymore. Technically, storage capacity is growing fast enough to keep up with the exponential growth in data generated, but is it really worth it? Will companies like Google want to spend that kind of money to keep data that is private, and can never be accessed again, available for immediate access? Will they schedule convenient ‘failures’ to purge the data? Will they announce a policy of erasing accounts that are inactive for 10+ years? 100+ years?
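If a provider ever did announce such a policy, the rule itself would be trivial to state in code. A toy sketch in Python, where the ten-year threshold and the Account shape are just the hypotheticals from above, not anyone’s actual policy:

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Hypothetical policy: purge accounts untouched for 10+ years.
    INACTIVITY_LIMIT = timedelta(days=365 * 10)

    @dataclass
    class Account:
        owner: str
        last_login: datetime

    def due_for_purge(account: Account, now: datetime) -> bool:
        """True if the account has been inactive past the policy limit."""
        return now - account.last_login > INACTIVITY_LIMIT

The hard part, of course, isn’t the code. It’s deciding to run it.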

Perhaps, instead, all of those private accounts get made public 25 years after you die. By then, most of what you’ve said to anyone will likely no longer be relevant. There’s a potential whole new industry in parsing through the stuff we write about on a daily basis, which ceases to be important within a week or a month, to find the things that are useful and important to generations many years later. Google Archives would have plenty of content to process. Maybe the emails could be used as hash tables for encryption algorithms. Then again, given how much we all talk about the same things, maybe not.