Gordon Bell IEEE Lecture

Titled: Industry's evolutionary path or Moore's Law: Que sera sera

Gordon Bell is a pioneer of the computer industry. When he talks he drops names like Bill Gates, Paul Allen, Bill Joy and Jim Gray. He spent some time at UNSW several decades ago.

CPU clock-speed increases are stalling; further gains toward Moore's Law will come from multi-core designs. Storage is going crazy, allowing many interesting possibilities. GPUs are also an interesting area, keeping ahead of Moore's Law.

Gordon suggests everything "cyberizable" will eventually be cyberized. He gave a flowchart of how this happens: world to continent to region (intranet) to campus to home to car to body to in-body! He also said that data, telephony and television will converge; there is no point in keeping three delivery mechanisms around. This is already happening.

He expanded on "cyberization", saying it means there will be a digital copy of everything located in cyberspace. This means that the gyprock in your walls will have RFID tags with serial numbers.

Gordon gave his law, Bell's Law. It's based around computer classes, and says that every decade a new, lower-cost class of computers emerges. Each class is primarily defined by its platform, interface and interconnect. You can see how this has happened from mainframes to PCs to networked PCs to wireless mobile PDAs.

Around 1995, Gordon came to believe that large, single-image computers aren't the future. However, clusters are too hard to program. He jokes that the Gordon Bell Prize exists because programming clusters is so bad that anyone who tries deserves a prize.

One reason he thought UNIX never took off until Linux was price and vendor lock-in. Bill Joy's law is that you don't write software unless you can sell 100,000 copies and it costs you less than $10 million. Bill Gates' law is that you don't write software unless you can sell 1,000,000 copies and it costs you $10 million.

Pharmix was one of Gordon's investments. They created a molecular mechanics accelerator on a chip for modelling drug-receptor interactions. It had a 1000x speedup over a 1GHz PC, but they abandoned it in favour of commodity GPU hardware.

Very large disks are the driver for the new world vs. the old world. In the old world there is a mainframe which costs cents per transaction and around $85/gigabyte/year. The new world is scaled-out PCs (Google); they cost basically 0c per transaction and around $1 million/petabyte/year. You now capture everything because you can. For example, Brewster Kahle runs archive.org for around $2k/terabyte/year, all on open source hardware.
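Normalising the round numbers above to the same unit makes the gap obvious; a minimal sketch, using only the figures quoted in the talk:

```python
# Back-of-envelope comparison of the quoted storage costs, normalised to $/GB/year.
mainframe = 85.0                        # old world: $85 per gigabyte per year
scaled_out = 1_000_000 / 1_000_000      # new world: $1M per petabyte per year (1 PB = 1e6 GB)
archive_org = 2_000 / 1_000             # archive.org: $2k per terabyte per year (1 TB = 1e3 GB)

print(f"mainframe  : ${mainframe:.2f}/GB/year")
print(f"scaled-out : ${scaled_out:.2f}/GB/year")
print(f"archive.org: ${archive_org:.2f}/GB/year")
print(f"old world / new world: {mainframe / scaled_out:.0f}x")   # prints 85x
```

Roughly two orders of magnitude per gigabyte, which is why "capture everything because you can" becomes the default.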

An example of how this information could have been used: book stores could have used "perfect inventory" to kill Amazon. They could have put up a website and told you whether to walk down the road to get your book or order it for delivery. They missed the boat on that.

Gordon is interested in the "1TB life". It turns out some guy called Vannevar Bush proposed the "memex" in 1945, a device in which an individual could store all their books, records, etc. Gordon showed how he calculated an average life based on a day's data: emails, web pages, scanned pages, 8 hours of recorded audio, and a few digital photos will take about 65 years to fill 1TB, which these days can be carried around.

Add to this a GPS, a heart rate monitor, video and you have a complete record of everything you ever did.

He sees sensor networks as the next step in the Bell's Law chain. He told us about some of his investments in Dust Networks, based on wireless sensor technology. They have created some little units with GPS and IrDA which the army is dropping in the middle of nowhere. No one really knows why.

Gordon's vision for the next 10 years is mixed. He sees Moore's Law continuing. He wants paper-quality screens and terabyte personal stores (running WinFS). He sees Murphy's Law remaining, with complex systems throwing up problems we didn't foresee. There will be astronomically sized databases. Personal authentication will be required to access any network service of value. "It's the internet, stupid" (shades of "The network is the computer", anyone?). Of course he's positive about sensor networks.

He mentioned that POTS didn't evolve but the point was raised that network providers came up with packetization. He agreed.

A question was asked about the difficulty of programming clusters. Gordon suggested that individual problems can be broken down to run on clusters, but general-purpose applications do not do well (Google, when it comes down to it, is a single application). He feels this is the "holy grail" of clusters.

Obviously a lot of his technology raises questions about surveillance and big brother. Gordon said it worried him, but suggested there are many advantages. Also there was a bit of fate -- this is the way things are going; what can you do?

He was asked about the cell processor. He thought it was interesting.

Other questions covered quantum computing and nerve interfaces; he liked the ideas but would wait until he saw products in the lab before investing.

I missed a question about another technologist who evidently talks about the digital lifestyle.

He was asked about battery life not keeping up with computers. He mentioned we need to make trade-offs between speed and battery life. There is no point having a battery that lasts twice as long if it takes twice as long to do anything.