I just had a really interesting brainstorming session on a possible future of commodity computing.
I carry around many processors: a cellphone, tablet, laptop, camera. These are all just processors running an operating system, waiting to execute some code. Currently they run application/device-specific code, designed for their single-use lives. Any of the devices could really be application independent and dynamically perform whatever operation I want to give it. My cellphone breaks, so I just use my laptop as a cellphone, or my tablet.
Let me illustrate with an example: all of my devices run Python, so I write a Python video player. Then I can load it onto my cellphone and boom, my cellphone is now a media player, or my tablet is the media player, and so on. The device itself is irrelevant; it is a packaged form factor whose purpose I, as the user/developer, decide.
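Here's a toy sketch of what I mean, in plain Python. The Device class and play_video function are made up for illustration; the point is that the same player code runs unchanged on whatever hardware it lands on, and the device just supplies its own screen parameters.

```python
# Hypothetical sketch: one Python "app" that any device can run.
# Device and its attributes are invented for illustration.

class Device:
    def __init__(self, name, screen_size):
        self.name = name
        self.screen_size = screen_size  # (width, height) in pixels

def play_video(device, frames):
    """Pretend to decode frames, scaled to whatever screen is present."""
    w, h = device.screen_size
    return [f"{device.name}: frame {i} at {w}x{h}" for i in range(frames)]

# The identical player code turns either device into a media player:
phone = Device("cellphone", (320, 240))
tablet = Device("tablet", (1024, 768))
print(play_video(phone, 2))
print(play_video(tablet, 2))
```

Nothing in `play_video` knows or cares which box it's on; drop it onto the phone, the tablet, or the laptop and that box becomes the media player.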
So then carry it forward a little to where I go into my local hardware shop and buy a slew of "processors", or devices: generic devices with several interpreters or operating systems on them that I can drop an image onto. I may buy a couple of matchbox-sized bricks with no display, one that is soap-bar sized with a screen and some buttons, and another screen with a keyboard. I then drop a "cellphone device image" or application onto the soap-bar device, perhaps a word-processor application onto the keyboard/screen, and so on. Think of it like Amazon's EC2 computing cloud, but in the palm of your hand.
Now the matchbox devices can be simple devices, or maybe they're even like Lego bricks, where I can snap them together for easy parallelization. The processors start communicating and sharing data and code between them, without any intervention from me, the user. Maybe some of these small devices then snap into larger devices with a screen, so my tablet device can now have 3, 4, or 8 processors in it. Perhaps the devices communicate via exposed connectors, wirelessly using RF, or even some kind of optical communication.
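The snap-together idea can be sketched today with ordinary Python. Here the "brick count" is hypothetical, with `os.cpu_count()` standing in for discovering how many bricks are snapped in; the program just spreads its work across however many are present.

```python
# Sketch: the same program scales across however many processor
# "bricks" happen to be snapped together. Brick discovery is
# hypothetical; os.cpu_count() stands in for it here.
import os
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    return sum(x * x for x in chunk)

def run_on_bricks(data, bricks=None):
    bricks = bricks or os.cpu_count() or 1
    size = max(1, len(data) // bricks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=bricks) as pool:
        return sum(pool.map(crunch, chunks))

# Snap in more bricks and the same call just runs wider;
# pull a few out and it still finishes, only slower.
print(run_on_bricks(list(range(100)), bricks=8))
```

The code never hardcodes a processor count, which is exactly what makes the Lego model work: the answer is the same with eight bricks or three.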
I may even put 100 of these in a small stack of cards in my computer. Over time they slowly wear out, so after a year or two I only have 80. So what? Some programs run a little slower, or I can't do as many actions on the device. I swap the broken ones out, toss in some more, or even just pick up a new device and interface.
bucket o' computing
Now what if the processors were just tiny grains of silicon, like sand? I'd have a pile of these "sand processors" that communicate omni-directionally with other processors near them: not just the ones touching them, but any within a short spherical range. They would share data, hand off programs, and provide me with a compact, parallelized, mobile, amorphous computer.
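A toy simulation gives a feel for how a program could spread through the pile with no central control. Every number here (positions, range, rounds) is invented; the idea is simple gossip: each grain hands its data to any grain within range, round after round.

```python
# Toy gossip simulation of "sand processors": each grain passes its
# data to any grain within a short range, so a program diffuses
# through the pile with no coordinator. All parameters are made up.
import math

def neighbors(grains, i, radius=1.5):
    return [j for j, g in enumerate(grains)
            if j != i and math.dist(grains[i]["pos"], g["pos"]) <= radius]

def gossip(grains, rounds=5, radius=1.5):
    for _ in range(rounds):
        updates = []
        for i, g in enumerate(grains):
            if g["data"] is not None:
                updates += [(j, g["data"]) for j in neighbors(grains, i, radius)]
        for j, data in updates:
            if grains[j]["data"] is None:
                grains[j]["data"] = data
    return grains

# One grain starts with the program; a few rounds later, every grain
# in gossip range has a copy.
grains = [{"pos": (x, 0), "data": None} for x in range(5)]
grains[0]["data"] = "video-player"
gossip(grains)
print([g["data"] for g in grains])
```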
And if the sand computer could react to visible and non-visible light, I could show it a picture and it would be processed instantaneously.
the future is now
While I really think there is a future in "sand computing", I don't think I'll be able to buy my "Pail o' Processing" after WWDC '07. However, we have reconfigurable devices now. Linux has pushed for this for a long time: a single operating system running on consumer and embedded devices. Even easier is to run an interpreted language like Python or Ruby. Right now I can run a GPS program, an MP3 player, or any number of other programs on a handful of devices. Using open standards I can then share the data between devices, taking my GPS waypoints off my receiver and loading them onto my cellphone.
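The waypoint example works today because GPX, the common XML format for GPS data, is an open standard, and parsing it needs nothing beyond Python's standard library. The coordinates below are made-up sample data; the parser itself would run on any device with Python on it.

```python
# Sketch: open standards make the data portable between devices.
# GPX is the standard XML format for waypoints; the sample data
# here is invented, and only the stdlib is needed to read it.
import xml.etree.ElementTree as ET

GPX = """<gpx xmlns="http://www.topografix.com/GPX/1/1">
  <wpt lat="47.6062" lon="-122.3321"><name>Home</name></wpt>
  <wpt lat="47.6205" lon="-122.3493"><name>Needle</name></wpt>
</gpx>"""

def waypoints(gpx_text):
    ns = {"g": "http://www.topografix.com/GPX/1/1"}
    root = ET.fromstring(gpx_text)
    return [(w.findtext("g:name", namespaces=ns),
             float(w.get("lat")), float(w.get("lon")))
            for w in root.findall("g:wpt", ns)]

# The same parser works whether the file came off a GPS receiver
# or is headed to a cellphone.
print(waypoints(GPX))
```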
The future is now, and it's fun.