Jim Morris's Thought of the Week (or month, or year, ...)

Wednesday, February 21, 2007

Twenty-five Years of Andrew

Twenty-five years ago, I started a project at Carnegie Mellon that created a campus computing system called Andrew. IBM paid for the development. Parts of the system still run at Carnegie Mellon and other places, but it’s time to rethink campus computing strategies. While I ponder that, here is an essay I wrote back then. If some of the technology ideas seem strange, please remember it was written a long time ago.

Quality Communication through Time and Space

Jim Morris, 11/23/82

This slogan sums up what I think should be the focus of CMU's new computer system. It encompasses more than you might think, as I shall explain.

Many years ago, in freshman English at Carnegie Tech, I read a provocative essay by Jacques Barzun about how a university should allocate its limited capital funds: first to dormitories, then to libraries, finally to classrooms. His point was that students first and foremost learn from each other, and bringing them all together was the most important role of the university. Libraries win out over classrooms because, given an inquiring student, the accumulated knowledge of the years will always outweigh whatever wisdom and knowledge are transmitted in the lecture halls and classrooms.

To update Barzun to the computer age and our current agenda:

The primary goal should be to broaden, deepen, and improve the communication among students. The rest of the campus community will also participate in this improvement; but the system should not be looked upon simply as a way to amplify the messages of the faculty. This constitutes the communication through space component of the system.

The secondary goal should be to improve access to the accumulated knowledge of the community and the world as represented by the libraries and other information sources. This constitutes part of the communication through time component.

The third goal should be to facilitate the traditional teacher-student relationship by rechanneling through the system all the communication that would normally be written. This will allow the really precious commodity, face-to-face communication, to be devoted to the higher purposes of inspiration and encouragement -- in both directions.

Space and Time

The current concepts of electronic mail, data bases, and information retrieval should be unified in this system. Electronic mail is essentially a one-sender, one-receiver, non-updating sort of system. Information retrieval systems are one-sender, many-receiver systems with very occasional additions of information. Data bases, as typified by an inventory control system, are many-sender, many-receiver, dynamic systems. Where should this system fall in this multi-dimensional spectrum?
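
To make the spectrum concrete, here is a minimal sketch in modern Python (the names are illustrative, not from the essay) that places the three systems along the dimensions just named: how many senders, how many receivers, and how often the stored information changes.

    from dataclasses import dataclass

    @dataclass
    class CommunicationSystem:
        name: str
        senders: str    # "one" or "many"
        receivers: str  # "one" or "many"
        updates: str    # how often stored information changes

    # The three points on the spectrum described above.
    systems = [
        CommunicationSystem("electronic mail", "one", "one", "never (messages are not revised)"),
        CommunicationSystem("information retrieval", "one", "many", "very occasional additions"),
        CommunicationSystem("inventory data base", "many", "many", "continuous"),
    ]

    for s in systems:
        print(f"{s.name}: {s.senders} sender(s), {s.receivers} receiver(s); updates: {s.updates}")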

Electronic mail will change our lives! Like the telephone, it allows immediate communication over long distances. Unlike the telephone, it does not require simultaneous sending and receiving. This crucial improvement makes it a far more useful tool than the telephone, as anyone who has played “telephone tag” will attest. By allowing communication to be asynchronous, electronic mail increases the number and kinds of conversations a person can have.

In particular, it permits an incredibly enhanced version of the time-honored dormitory bull-session. The most ridiculous one of these I ever witnessed was started by an innocent request for coat hangers sent to a wide distribution list. The requester happened to make a somewhat whimsical statement about his theory that coat hangers actually reproduce. This was answered by a theory that a coat hanger's sex depended upon which way the hook went over the bar, and an explosion of several hundred messages followed. This phenomenon was like a (late night) dormitory bull-session in that its direction was entirely random, people were free to wander in and out contributing as they wished, it was entirely spontaneous, and everyone enjoyed it. What was revolutionary was that it involved hundreds of speakers and listeners and went on for many days! I have also participated in many less spectacular, but more productive and directed, electronic discussions with the same properties. They simply wouldn't have happened without the mail system.

One can easily imagine that a high proportion of classroom discussion will be carried out through electronic mail. The teacher and students will all belong to a message group that exchanges messages on chosen topics throughout the semester. In many ways this is a more reasonable medium for thoughtful discussion than face-to-face group meetings. People have more time to think about what they are saying; indeed, they are forced to because they must type it. When a teacher wishes to base grading on classroom discussion, he has a written record of who actually said what.

The bulletin board discussions here at CMU exhibit this phenomenon quite well. Alongside your technical expertise in computing, I rate the existence of the computer-using community as crucial in making CMU the best place for this experiment. You have already begun to work out the norms associated with this new tool. Even more explicit attention should be devoted to this topic, using the BBoard as a forum, of course.

Communication across time might seem a different topic from electronic mail, information retrieval, or data bases, but I believe it should be looked at in the same context. To oversimplify: an electronic mail system is merely a distributed data base system in which individual recipients of mail search the data base for items with their names in the “To” field, recipients of bulk mail search the data base by topic, and so on. It is already commonplace for people to accumulate all the messages of a particular electronic discussion into a single message file for late-comers to the conversation. Unfortunately, this process is somewhat hit-or-miss and does not produce good information for the long term. The purely mechanical task of collecting everything on a common subject should be part of the system. The thing that will always require intelligence and attention is the task of editing this information in the light of subsequent events and information. For example, some computer systems are “documented” by their successive release messages. A new user learns how to use the system by reading this history. Over time, various pieces of this history become irrelevant to the current system, and someone needs to revise it. Usually, this never happens. Everyone says “The right thing to do is write a manual,” but that never happens either, partly because no one is confident that the system has stopped changing.
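
This unification can be sketched in a few lines. The following hypothetical Python fragment (the names and the single in-memory store are my simplifications, not the essay's design) treats mail as queries over one shared store of messages: personal mail is a search on the “To” field, bulk mail is a search by topic, and collecting a discussion for late-comers is just another query.

    # A minimal sketch of "mail as a distributed data base",
    # simplified to a single in-memory store.
    messages = [
        {"from": "alice", "to": ["bob"], "topic": "coat hangers", "body": "An innocent request..."},
        {"from": "carol", "to": ["bboard"], "topic": "coat hangers", "body": "A theory of reproduction..."},
        {"from": "bob", "to": ["alice"], "topic": "grades", "body": "About the midterm..."},
    ]

    def inbox(user):
        """Personal mail: items with the user's name in the 'To' field."""
        return [m for m in messages if user in m["to"]]

    def bulletin(topic):
        """Bulk mail: search the store by topic."""
        return [m for m in messages if m["topic"] == topic]

    def digest(topic):
        """The purely mechanical step: collect a whole discussion for late-comers."""
        return "\n".join(m["from"] + ": " + m["body"] for m in bulletin(topic))

    print(digest("coat hangers"))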

The concept of a growing, democratically created, intelligently edited data base will greatly amplify another venerable, if dubious, campus tradition: the fraternity files. If, year after year, you capture all the written communication associated with a course -- assignments, tests, papers, etc. -- the potential exists to change the nature of how people learn the body of material. It will no longer be just the nominal teacher of the course who does the teaching; it will be generations of unseen students transmitting their solutions to problems and views on issues. This potential can be realized, however, only if this data is suitably culled for what is truly worth studying. Who should do it?

There was once a mechanical engineering student here who maintained his position at the top of the class because he knew how to search the literature. He took every problem to the library and searched until he found a good solution. Occasionally he had to do some invention, but he always began with what he could find from the past. I had the vague feeling that this was cheating, but there is no doubt that he was going to be an incredibly effective practicing engineer as long as he had access to his information base. In the programming community from which I come, the most effective programmers are the ones who make the greatest use of existing libraries of programs. For example, an entire file system, whose local manifestation on our Ethernet is called Onyx, was initially constructed from existing programs written for other purposes by people other than the file system's implementors. Operating in this way actually requires more experience and talent than traditional programming. We had better come to terms with this method of problem solving, because our tools are about to make it much more attractive and effective. Our general idea that students must learn to “think for themselves” is in danger. We could easily find ourselves in the situation that the only way to force them to do it is always to present them with entirely new problems.

Media Quality

My wife is not a technology fan; nothing invented after the light bulb is allowed in our living room. She does not like to play with computers. Nevertheless, she will kill to use my Alto. The reason is quite simple: she can produce a document that looks as if it had been professionally printed. Anyone who writes likes to see their work presented in the best possible way. Automatic typesetting, demand printing, and other tools like spelling correctors can make writing more effective and enjoyable. In the short term, quality printing will be the strongest draw of the system; people's appreciation of electronic mail and other things takes longer to form.

For a writer, the real challenge is to make the quality of the content match the quality of the form. It is always disconcerting to see poorly thought-out ideas written in a medium that makes them look like God's truth. This is basically media inflation. The major long-term benefit will be that readers become skeptical of everything they read; in the meantime, people with advanced facilities will garner a modicum of undeserved credibility.

This improvement of quality does not stop with multi-font printing. That is only the tip of the iceberg. People also communicate with pictures and sound. Here, however, the technology is far behind the printing world. I don't think there are many graphic artists who would pay, much less kill, to use what is commercially available. We can expect, however, that simple kinds of illustrations will be easy for novices to include in their written documents. I think it will be more productive if those in Fine Arts look at this system as a new medium per se, rather than a better way to work in more traditional media. The latter is possible, but presents considerable technical challenges.

The possibilities for communication do not end with words, pictures, and sound, no matter how powerful. A computer program is a form of communication we have just begun to explore. The programmer/sender of a message actually creates an interactive mechanism that the user/receiver engages with in order to comprehend it. Construed as art, compilers and operating systems are like houses that people called users live in; the builders of these things must employ the same kinds of judgment, inspiration, and understanding of their clients' needs that real architects do. The Adventure game is a rudimentary form of a new kind of art form: the interactive puzzle/mystery. Current mystery novels suffer from the fact that the reader must read them linearly, following the detective around, trying to guess the solution to the mystery. The mystery novel of the future will dispense with the detective as a character and allow the reader to move into his proper place at the center of the investigation. The reader, not the author, will decide where to go, whom to interrogate, and what questions to ask. Needless to say, the readers and writers of mystery novels must become considerably more clever for this new form to work. I think the reason that computer-aided instruction has been a bust at the college level is that we are still waiting for computer literacy to spread far enough that some genius of a teacher/programmer will create a program that really teaches.
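
As a toy illustration of that interactive form, here is a hypothetical Python sketch (my construction, not anything from Adventure): the reader, rather than the author, decides whom to interrogate and when to accuse.

    # A toy interactive mystery: the reader occupies the detective's place.
    suspects = {
        "butler": "I was polishing the silver all evening.",
        "gardener": "I heard a shot near the greenhouse at nine.",
        "heiress": "Ask the gardener where he really was.",
    }

    def interrogate():
        while True:
            choice = input(f"Interrogate whom? {sorted(suspects)} (or 'accuse NAME'): ").strip()
            if choice.startswith("accuse"):
                name = choice.removeprefix("accuse").strip()
                print("Case closed!" if name == "gardener" else "The real killer goes free.")
                return
            print(suspects.get(choice, "No such suspect."))

    interrogate()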

The quality of communication will be greatly enhanced by feedback to authors. A person who creates something for the community to read or use should be guaranteed some response, if not money. Simply knowing how many times a program has been run would be very helpful to many programmers I know. I sometimes suspect that programmers subconsciously put bugs into their programs, or fail to test them adequately, so that there will be some feedback, if only negative.

It seems clear that the users of this system, if it works technically, will be drowning in information. Nobody will have the time to look at everything, and only a small percentage of it will be worth looking at for any particular person. There will be a desperate need for the electronic equivalents of the newsroom editor, the literary critic, the art reviewer, the computer program reviewer, etc. These people will become more important and influential than the individual producers of information and services, and they too must have a source of feedback.

I am personally fascinated by the notion of interactive television so the tie-in with a cable TV company is interesting. A fantasy: The morning news is transmitted to waiting recorders all through the night. Upon waking, the news viewer turns on his computer-controlled recorder and browses through the new program just as he would read a newspaper. CMU has enough other things on its plate right now, however.

Technical Implications

The foregoing dreams and some unspoken nightmares suggest a few things about the detailed aspects of the system.

First, as emphasized in the preliminary report of the TFFC, accessibility and its supporting attributes, reliability and maintainability, are crucial. If the system does not approach the electrical power system in dependability, it cannot play the role we envision. The basic design of the system must emphasize this; other aspects must take lower priority, both in design and in effort allocation. The designers must keep the concept of a 5,000-user system uppermost in their minds so that they remain appropriately conservative. Their talents must be devoted to making a simple system, of the complexity of UNIX or the Alto, work for a huge, demanding, intolerant user community. They must avoid the temptation to design something that would be impressive in small numbers or in a demonstration.

At its heart, the system is a distributed data base, with so much emphasis on distribution that some might see it as a message system. The point is that, except possibly for basic system software, there really is no particular root source of the system. Major sources of information will be the various departments and functions of the university, and one would expect the repositories for their information to be physically distributed. Compared to commercial systems, this system will exhibit less locality. Assuming that a student's primary location is his dormitory, the only information one can expect to be localized there is what he creates for his own use. However, there is no logical location for something like a discussion file being created by an economics class; the participants are scattered uniformly across the network.

The designers of the data base system should consider the following:

- There are great advantages to never overwriting, or even deleting, information once it has entered the system. Information should be moved to less expensive, less accessible media as it becomes obsolete. This makes the process of auditing the system for various purposes easier.

- The author of every piece of information should be automatically identifiable in an authenticated way. It should also be possible to record every access to a piece of information. This is a challenge in a distributed system, but it matters: the creators of information and services should get feedback about who is using their work, and such records will also be needed for auditing various aspects of the system. Some form of encryption will be needed. (A sketch of these two design points follows this list.)

- For the most part, the activities of the University run on a clock whose granularity is a day, not a second. In other words, this system will not have severe real-time constraints if it is designed in a properly decentralized manner. Beyond this basic need, we have several points that come under making the system attractive to use. It doesn't matter that the system is accessible if the community doesn't like to use it, even when it would be more rational to do so.

- It seems clear that we want an Engelbart workstation, i.e., a large bit-map display and a mouse. Anyone who uses one for a while comes to this conclusion. This feature is probably more important to non-computerists. It should be possible to use the system passively without being able to type.

- Good printers will be important. Paper is highly underrated as a communications medium, and the faculty engage in a lot of external communication. To some degree the need for printers can be reduced by software to display page proofs on the bit-map display.

- Being wired into external communication systems will greatly enhance the system's attractiveness. Perhaps the ARPANET is not a possibility, but efforts should be made to connect with the electronic message community.
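
To make the first two design points concrete, an append-only store with authenticated, auditable authorship, here is a minimal modern sketch in Python. Everything in it is an assumption of mine (the names, and HMAC standing in for whatever encryption scheme the system would actually adopt); the essay predates these tools.

    import hashlib, hmac, time

    SECRET_KEYS = {"alice": b"alice-key", "bob": b"bob-key"}  # hypothetical per-author keys

    log = []         # the append-only store: records are added, never overwritten or deleted
    access_log = []  # every read is recorded, for feedback and auditing

    def sign(author, text):
        """Authenticate authorship with an HMAC over the content."""
        return hmac.new(SECRET_KEYS[author], text.encode(), hashlib.sha256).hexdigest()

    def append(author, text):
        log.append({"author": author, "text": text, "time": time.time(),
                    "signature": sign(author, text)})

    def read(reader, index):
        access_log.append({"reader": reader, "item": index, "time": time.time()})
        record = log[index]
        assert hmac.compare_digest(record["signature"], sign(record["author"], record["text"]))
        return record["text"]

    append("alice", "Release 1: the editor now supports multiple fonts.")
    print(read("bob", 0))
    print(len(access_log), "access(es) recorded")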

posted by Jim Morris @ 5:09 PM  1 comments

Wednesday, February 14, 2007

Dave Farber: Master Blogger

Dave Farber has often been described as the grandfather of the internet because so many of his students made crucial contributions. He’s also the grandfather of blogging: his “interesting people” mailing list has been running for several years. It is read and contributed to by a Who’s Who of technology and policy. You can read the archives at http://www.interesting-people.org/archives/interesting-people/ or request a subscription from Dave at dave@farber.net. He doesn’t fool with the mechanics of blogs; he just forwards what he gets, using exquisite taste.

Here, for example, is an important Op-Ed Dave wrote in the Washington Post.

By David Farber and Michael Katz
Friday, January 19, 2007; Page A19

The Internet needs a makeover. Unfortunately, congressional initiatives aimed at preserving the best of the old Internet threaten to stifle the emergence of the new one.

The current Internet supports many popular and valuable services. But experts agree that an updated Internet could offer a wide range of new and improved services, including better security against viruses, worms, denial-of-service attacks and zombie computers; services that require high levels of reliability, such as medical monitoring; and those that cannot tolerate network delays, such as voice and streaming video. To provide these services, both the architecture of the Internet and the business models through which services are delivered will probably have to change.

Congress failed to pass legislation amid rancorous debate last summer, but last week a group of senators reintroduced several initiatives under the banner of "network neutrality."

Network neutrality is supposed to promote continuing Internet innovation by restricting the ability of network owners to give certain traffic priority based on the content or application being carried or on the sender's willingness to pay. The problem is that these restrictions would prohibit practices that could increase the value of the Internet for customers.

Traffic management is a prime example. When traffic surges beyond the ability of the network to carry it, something is going to be delayed. When choosing what gets delayed, it makes sense to allow a network to favor traffic from, say, a patient's heart monitor over traffic delivering a music download. It also makes sense to allow network operators to restrict traffic that is downright harmful, such as viruses, worms and spam.
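
At bottom, the prioritization Farber and Katz describe is a scheduling decision. A minimal sketch in Python (illustrative only; real routers use far more elaborate queueing disciplines, and these priority assignments are mine):

    import heapq

    # Lower number = higher priority; the assignments are illustrative.
    PRIORITY = {"heart-monitor": 0, "voice": 1, "web": 2, "music-download": 3}

    queue = []
    counter = 0  # tie-breaker so equal-priority packets keep their arrival order

    def enqueue(kind, payload):
        global counter
        heapq.heappush(queue, (PRIORITY[kind], counter, kind, payload))
        counter += 1

    def transmit_next():
        """When traffic surges, send the most urgent packet first."""
        _, _, kind, payload = heapq.heappop(queue)
        return kind, payload

    enqueue("music-download", "track 7, block 42")
    enqueue("heart-monitor", "patient 12: rate 114 bpm")
    print(transmit_next())  # the heart monitor goes first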

Pricing raises similar issues. To date, Internet pricing has been relatively simple. Based on experience in similar markets, we expect that, if left alone, pricing and service models will probably evolve. For example, new services with guaranteed delivery quality might emerge to support applications such as medical monitoring that require higher levels of reliability than the current Internet can guarantee. Suppliers could be expected to charge higher prices for these premium services.

Blocking premium pricing in the name of neutrality might have the unintended effect of blocking the premium services from which customers would benefit. No one would propose that the U.S. Postal Service be prohibited from offering Express Mail because a "fast lane" mail service is "undemocratic." Yet some current proposals would do exactly this for Internet services.

We're not saying that all discrimination is good or that the market always gets it right. Some forms of discrimination can be harmful, especially when service providers have market power. For example, if a local telephone company that is a monopoly provider of both broadband access and plain old telephone service for a community blocks its broadband subscribers from using an Internet phone service offered by a rival company, this discrimination can harm both competition and consumers.

Public policy should intervene where anti-competitive actions can be identified and the cure will not be worse than the disease. Policymakers must tread carefully, however, because it can be difficult, if not impossible, to determine in advance whether a particular practice promotes or harms competition. Antitrust law generally takes a case-by-case approach under which private parties or public agencies can challenge business practices and the courts require proof of harm to competition before declaring a practice illegal. This is a sound approach that has served our economy well.

The legislative proposals debated in the 109th Congress take a very different approach. They would impose far-reaching prohibitions affecting all broadband providers, regardless of whether they wielded monopoly power and without any analysis of whether the challenged practice actually harmed competition. If enacted, these proposals would threaten to restrict a wide range of innovative services without providing any compensating customer benefits.

Does this mean we believe that we should place all our trust in the market and the current providers? No. But it does mean we should wait until there is a problem before rushing to enact solutions.

David Farber is distinguished career professor of computer science and public policy at Carnegie Mellon University. Michael L. Katz is a professor of economics at the University of California at Berkeley. Gerald Faulhaber, a professor at the Wharton School and the University of Pennsylvania's law school, and Christopher S. Yoo, a law professor at Vanderbilt University, also contributed to this article.

posted by Jim Morris @ 2:34 AM  0 comments

Sunday, February 04, 2007

BlackBerry + Google beats Windows Mobile

I spent a week hassling with a T-Mobile Dash phone, taking a deep dive into Outlook, Exchange Server, and Windows Mobile. I figured I had just bought my last laptop because I’d be running everything from a handheld in a few years, so I wanted to explore the state of the art in Windows support. Yech! Outlook is as nasty as ever.

Giving up on integrating my laptop and phone, I bought a BlackBerry Pearl, which does the basics of phone and email beautifully. I’m seeking a way to get Oracle Calendar to download—the only other thing my Treo could do. I’m expecting Google Calendar to get there first.

Windows won’t survive the platform change to handhelds and the internet, just as IBM and DEC mainframe software couldn’t survive the change to PCs, and the old elk don’t survive the winter. The disruptive services—BlackBerry and Google Apps—will grow capabilities much more easily than Windows can slim down. Windows is being hit by a triple whammy: software bloat, the Innovator’s Dilemma, and the fact that Google’s twenty-year-old engineers are more savvy than Microsoft’s forty-year-olds.

posted by Jim Morris @ 1:14 PM  0 comments
