Wednesday, November 30, 2005

Five paradoxes of the Web

(Translation of Five paradoxes of the Web by anGel)

The Web is a formidable platform for delivering services and information, but it is showing its age. Fundamental design choices that were right at the Web's inception are now breaking down. Here we attempt to identify the problems that cannot be solved within the Web as it exists today.

The Web's problems are well known and, unfortunately, taken for granted. Over the past month, as a Web user, I have had to deal with spam (in my inboxes and on my blog), denial of service and identity theft, not to mention assorted bugs. As a Web developer, I have had to work around browser incompatibilities and expend effort out of all proportion to the tasks I was trying to finish. And it keeps getting worse.

Many resources go into fighting the problems that Web use engenders, but that does not resolve the platform's underlying paradoxes. Entire industries prosper today solely because of the Web's imperfections. That is a good indication that the computing community should invest in a new global infrastructure and attack the fundamental problems with fundamental solutions.

  • Everything is free, yet nothing is free (Compensation paradox)

    Many Web services are free for users because these services cannot be charged for, yet providing them costs money. This makes business models unsustainable and forces rationing of the resources used. Even without explicit limits, the provider's hardware and bandwidth constraints often lead to denial of service during peak hours, or during attacks.

  • We don't know who you are, yet there is no privacy (Identity paradox)

    There is no global identity system: a website cannot greet you by name unless you have filled in a form beforehand. Identity-management mechanisms are clumsy and sometimes lead to identity theft. At the same time, there are roundabout ways to invade a user's privacy: the IP address, cookies, referrer headers, one-pixel GIFs in emails.

  • Same lines of code, yet different rendering (Compatibility paradox)

    Developing Web applications requires sacrificing one of three important ingredients: capability, compatibility, or development speed. Running compatibility tests across every browser version is a luxury few can afford. It does not matter whether one browser is more "standards-compliant" than the others; in practice, you must be compatible or lose users.

  • Code travels the network, yet it is not mobile (Boundary paradox)

    The Web is asymmetric: there is a client and there is a server. The client speaks one language (JavaScript); the server speaks another (usually not JavaScript). To cross the boundary between client and server, code must be translated into another language. No matter how fast the network is, code mobility is limited by the speed at which the programmer converts code between client and server.

  • The Web is not decentralized enough, yet it is not centralized enough (Responsibility paradox)

    DNS management is centralized; certificate authorities are also mostly centralized. This centralization grants monopolies to the controlling organizations and, at the same time, makes them globally vulnerable. Yet there is no one to turn to when an entity misbehaves (spam, for example), as long as the Web's authorities decline to hold the platform's "citizens" accountable.
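The boundary paradox (client and server speaking different languages) can be made concrete with a hedged sketch; the validation rule and names below are illustrative, not from any real application. The same logic must exist twice, once in the server's language and once as JavaScript shipped to the client, and the two copies must be kept in sync by hand:

```python
import re

# Server-side rule (Python): a username is 3-16 "word" characters.
USERNAME_RE = re.compile(r"^\w{3,16}$")

def valid_username(name: str) -> bool:
    """Server-side check; the authoritative copy of the rule."""
    return bool(USERNAME_RE.fullmatch(name))

# The very same rule, hand-translated into the client's language
# (JavaScript) and shipped to the browser: two copies to keep in sync.
CLIENT_JS = """
function validUsername(name) {
  return /^\\w{3,16}$/.test(name);
}
"""

print(valid_username("alice"))       # True
print(valid_username("a"))           # False: too short
```

Change the rule on one side and forget the other, and the client accepts what the server rejects; that maintenance burden is the cost of crossing the boundary.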

Identifying the problems is the necessary first step. This blog will attempt to explore possible solutions, often rather radical ones. Thinking outside the "Web" bubble is the only way to make real progress. The good news is that the next winning platform will solve each of the five paradoxes of the Web.

In the trenches

Browser wars are still raging. Explorer Destroyer distributes scripts to harass IE users, and Google is paying a $1 bounty for every person who switches to Firefox because of this campaign:
IE users will see a friendly message stating that they need to download and install Firefox to view your site. This script will do the most good and make you the most money. Can you handle it? (At least try it for a day to see how good it feels.)
How good does it feel to deny users access to content based on their choice of Web client—a choice that, in corporate environments, is often forced on users by system administrators who control every piece of software that is installed?

The Web browser market today is a zero-sum game: market share must be pried away, and even small gains now come with a big price tag. In stark contrast, competition for the next platform's users has not even begun: a relatively small investment buys a monopoly. Who will take advantage of this opportunity?

Tuesday, November 29, 2005

One percent of a hundred billion dollars

Computer crime is a big business:
"Last year was the first year that proceeds from cybercrime were greater than proceeds from the sale of illegal drugs, and that was, I believe, over $105 billion," McNiven told Reuters.
We lose billions of dollars because of Web security issues, and billions more are spent fighting them. It will not get any better, because of the fundamental design flaws in the platform; it will only get worse and cost more next year. Why not take one percent of the money that goes into patching the latest problems and do something proactive instead of reactive: hire the best security professionals in the world, peer review like mad, and come up with a state-of-the-art platform?

Let's not waste any time. The later the design of the Web's successor starts, the more security problems it will have and the more money will be lost.

Monday, November 28, 2005

Exceptional advice

Evan Williams summarizes his business experience in Ten Rules for Web Startups:
#6: Be Self-Centered
Great products almost always come from someone scratching their own itch. Create something you want to exist in the world.
I would drop the word "Web" from the title. But that would be a violation of rule #1: Be Narrow.

Wednesday, November 23, 2005

Don't call it bloat

Zawinski's Law of Software Envelopment states:
Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.
Mozilla, Google and Emacs are examples of the law in action. Even Yahoo! Messenger has a component that notifies the user about new email messages.

The Law of Software Envelopment should be updated for Web 2.0:

Every program attempts to expand until it can read or publish RSS/Atom feeds. Those programs which cannot so expand are replaced by ones which can.
Both the original and the updated laws are instances of the general pattern: expansion of information zones.

Tuesday, November 22, 2005

Swimming in terabytes

Is it possible to design an Interstore—a global storage system? It would be like adding a huge hard drive to the Internet. There is a research project at Berkeley that did just that, only it's called OceanStore:
OceanStore is a global persistent data store designed to scale to billions of users. It provides a consistent, highly-available, and durable storage utility atop an infrastructure comprised of untrusted servers.

Any computer can join the infrastructure, contributing storage or providing local user access in exchange for economic compensation.

Now we need to add a processor—or a million processors—to the Internet, and it will look like one gigantic computer. We'll want an OS too. But what should we call this global machine? "Supercomputer" would be an understatement.

Monday, November 21, 2005

Skype: a platform?

Skype, the peer-to-peer phone application, has an ambitious roadmap that includes video (to be released soon) and social networking. Skype has an API that third-party developers can use to build applications. And it has 53 million registered users, as of September.

Did eBay pay $2.6bn for Skype the VoIP application or Skype the platform?

Saturday, November 19, 2005

In the zone

What follows is an attempt to summarize an essential pattern of technological development using a simple quasi-scientific theoretical framework. The theory should provide a qualitative model of the evolution of user experience. The core concept, the information zone, is introduced first.

Definition: an information zone is a collection of information resources—computational, storage, network interfaces, human interfaces, physical interfaces and software—that are jointly useful.

Examples: a standalone PC is an information zone, but its components alone are not: a RAM chip is useless without a motherboard. Craigslist is an information zone; so, of course, is the Web as a whole, but a Web terminal is not: it requires a network connection to function. The iPod, iTunes and the iTunes Music Store are bound together into a single information zone.

Theorem: In the free market, competitive information zones expand with time.

Proof: for a system to stay competitive in the marketplace, its functionality must not decrease. A shrinking information zone leads to a decrease in functionality. (It is possible in principle to compensate for a shrunken information zone, but in practice it is very difficult.) Information zones that are larger than their competitors attract more new users. Older users switch when the value of switching to the larger information zone exceeds the cost of switching. As a result, competitive information zones become larger, and larger information zones become more competitive.

Examples: you upgrade your computer to expand its information zone, and under normal circumstances you don't remove components, to avoid shrinking it. The number of Web pages increases with time, as the Web is a competitive information zone.

Corollary: In the future, all information zones will be unified, with the exception of contexts in which free market rules do not apply.

Proof: expansion of the total information space is slow relative to the expansion of individual information zones. To expand the total information space, new physical devices must be produced. Information zones, on the other hand, can be expanded fairly cheaply—for example, bridging two networks doubles the size of both information zones overnight. Free-market competition drives fast expansion of information zones. The most efficient way to expand zones is to combine them—and given the slow expansion rate of the total space and the high economic pressure to grow the zones, eventually all information zones will merge. Users will still interact with individual devices, but the overall experience will be seamless.

Example: networks merge together to create the Internet, except for military and other closed networks. Global voice network grows to include cellular and VoIP communications, all interoperable.

Applied to business, the information zone theory states: business models that conflict with the vision of a unified information zone are not sustainable in the long term. A specific prediction: efforts to protect proprietary IM networks are pointless and counterproductive. AIM, Yahoo! Messenger, MSN Messenger and Google Talk will all become interoperable, to the benefit of all users.

Thursday, November 17, 2005

Recipe for success, part II

Is it possible to reuse the Web's recipe for success to popularize a new platform? Is it the right time to build a scalable shared world? Let's review the ingredients.
  1. Relevant paradigm.

    New paradigms are not easy to find, but fortunately we have one: peer-to-peer computing. Faster machines and faster connections at the endpoints make a symmetric architecture timely.

  2. Fast and easy content authoring.

    Building a world will require developing good tools first; but when the tools are in place, computer-assisted content authoring can take off. If the system has a good economic model, professionals will be attracted to the medium and will cooperatively design advanced artifacts.

  3. Ease of content distribution.

    With no distinction between a client and a server, the line between content producer and consumer is blurred: today's file sharing networks are an illustration.

  4. Superlinear utility.

    In a digital universe, there is no difference between designing a single chair and building a chair factory—components are reusable, making creation easier with time and causing a network effect.

  5. Interoperability.

    Interfaces to email, instant messaging, voice networks, and the Web itself should make the system useful from the start. The goal is not to replace the existing protocols right away but to augment them.

  6. Optimal use of computing resources.

    Broadband connections and 3D accelerator cards are ubiquitous but underused. A shared world will require modern hardware and modern networking—the resulting experience will surpass the Web.

Though processor and network speeds have doubled many times since 1989, the Web has reaped few of the benefits. It's time for a new platform, so that the potential of the modern computer can be fully realized.

Wednesday, November 16, 2005

Revolution resolution

Wired News front page: "U.S. Maintains Control of Net." The headline should have read "U.S. Maintains Control of DNS Root Zone", but that wouldn't be as catchy.

The Internet is not controlled by the U.S. or any other country. If all U.S. links are severed tomorrow, the Internet will still exist: there will still be interconnected networks, and packets will go through.

The root zone of the domain name system is controlled by the U.S. But handing over control from one organization—the U.S.—to another—a multinational agency—is not a solution. The real solution is decentralization: abolish the Net-wide root zone altogether and manage the namespace competitively, recognizing country top-level domains by agreement. (The obvious question: how do you introduce new global TLDs such as ".xxx" then? The answer: you don't. Stay tuned for a discussion of why new generic top-level domains are not needed.)

Can you expect the U.N. to start a revolution?

Recipe for success, part I

The Web is used by a billion people, consists of over ten billion pages and has been used to create many hundreds of billions of dollars in wealth. How did it become so successful?

The answer sounds simple: the Web was the right platform at the right time. The exponential rate of adoption can be attributed to six factors:

  1. Relevant paradigm.

    The success of the Web is the success of the client/server paradigm. Client/server is simple: both HTTP client and HTTP server are relatively easy to implement. Client/server puts all the burden of scalability on the operators of the server, who are normally in a position to cope with it. Most notably, client/server was a good match for the topology of the Internet circa 1993: high-powered machines on high-speed links, low-powered machines on low-speed links.

  2. Fast and easy content authoring.

The only tool you need to start creating Web content is a text editor. The ability to "View source" lets users learn HTML without reading documentation. The Web browser never refuses to display a page—if you make a mistake, it is automatically corrected most of the time. Learning HTML can be an enjoyable and rewarding experience, since the language makes creating good-looking multimedia documents very easy.

  3. Ease of content distribution.

    You don't have to get permission from anyone to join the Web. In the early days, you could just compile and run CERN httpd—you didn't even need to have your own machine or ask your administrator since you could run the server on an unprivileged port. Today, even non-technical users can quickly gain access to the Web for publishing content. There is no barrier to the exponential growth of the content distribution network.

  4. Superlinear utility.

    The network effect of the Web is remarkable. With a simple tag, you can link to—and build upon—information contributed by anyone worldwide. Each new webpage can be both the source of the links and the target of the links, adding connections that make existing content more valuable.

  5. Interoperability.

If you could pick only one network client, you would choose a Web browser, which has built-in support for other protocols such as FTP. Uniform Resource Locators make all Internet resources addressable from HTML, providing seed content that accelerates and sustains the growth of the Web.

  6. Optimal use of computing resources.

The Web exploded at the moment when networks became fast enough to transmit images and general-purpose computers became good enough to display them: this is no coincidence. By exploiting timely technological advances, the Web captured the popular imagination.
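The forgiving error handling described in factor 2 is easy to demonstrate. Python's standard html.parser, like a browser, accepts malformed markup without raising an error; a sketch, not a claim about any particular browser's recovery rules:

```python
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Record every start tag seen, even in broken markup."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

# Unclosed <b>, a stray </i>, an unquoted attribute value: a browser
# renders this anyway, and a lenient parser raises no error.
broken = '<html><body><b>bold <a href=/page>link</a> </i></body>'
parser = TagCollector()
parser.feed(broken)
print(parser.tags)  # ['html', 'body', 'b', 'a']
```

No validator rejected the author's page, so the author kept writing pages; forgiveness was a growth feature.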

Is it possible to use the same recipe to popularize a new platform? (to be continued...)

Tuesday, November 15, 2005

Is hi-tech timeless?

Loyal starts a comment with: "In technology...More is law. More speed, more storage, more graphics, and so on." But will this law—more of an assumption, really—be true tomorrow, and is it true today?

Consider the fate of an industry that for centuries was synonymous with high technology. Before computers, there was clockmaking.

In clocks, more was law: more precision, or at least a better price-to-precision ratio. All the attributes of hi-tech were there:

  • the industry was intellectual property-driven: innovations led to success;
  • it was prestigious: clocks were presents given to royalty;
  • it was strategic, as high-precision clocks were key to ship navigation.
Here's a fact that illustrates the importance of clockmaking: in 1714, the English Parliament offered a Longitude Prize of £20,000 for a clock accurate enough to allow precise navigation at sea. £20,000 is not bad even today, after three centuries of inflation—at the time it was a staggering sum.

So what happened? Clocks improved slowly but steadily; finally, with the development of the quartz movement, they became both "good enough" and "cheap enough." Quartz technology revolutionized the industry: the companies that continued to compete solely on precision went out of business, and in today's mainstream market clocks are judged on style, not performance.

Before insisting that the computer industry will never suffer the same fate, answer this question: is the iPod a portable special-purpose data processor or a fashion accessory?

Monday, November 14, 2005

Paradigm shift in a freezer

Nature's disruptive technology is crystallization—formation of ice cubes. Here's what happens to water as it cools down:

Water temperature reaches the freezing point—and drops below it. The liquid wants to become solid—the solid is the more cost-effective (more energy-efficient) state when it is cold—but the "ice technology" hasn't been invented yet. As the temperature is lowered, the demand for crystallization grows: it would conserve more energy, the physics equivalent of "saving more money." After a while, a random cluster of molecules finds itself in a crystal-like formation, "inventing ice." As with a network effect, the spread of this new technology is rapid, and soon the solid form becomes the "de facto standard."

While the exact shape of the crystals is unpredictable, the process itself is inevitable. As the temperature drops, the demand for a phase change—expressed in the energy that could be saved—increases. It is only a matter of time before a suitable prototype is discovered through nucleation, allowing the phase transition to begin.

Technological revolutions are inevitable, too. Once computers are connected to a worldwide network, there is economic pressure to use the infrastructure to create efficient media for communication. The industry is in a "supercooled" state: a huge and daily increasing opportunity cost drives the development and deployment of new computer and social protocols. You can't anticipate the exact path the technology will take, but you can successfully predict where it will lead: you don't know what the ice cubes will look like, but you know for sure that the water will freeze.

(Although Nem Chua correctly pointed out that ice is not really a crystal, my analogy between a supercooled liquid and computer industry still holds.)

Friday, November 11, 2005

Standards: Technology vs. Politics

James Gosling wrote about the Phase Relationships in the Standardization Process in 1990, and his observations remain true today. Unfortunately the note is gone from his site, but it can still be found in the Internet Archive:
For a standard to be usefully formed, the technology needs to be understood: technological interest needs to be waning. But if political interest in a standard becomes too large, the various parties have too much at stake in their own vested interest to be flexible enough to accommodate the unified view that a standard requires.
It is safe to assert that today the amount of political activity on the Web outstrips the amount of technical activity. Case in point: blog feed formats.

Thursday, November 10, 2005

Compatibility paradox: Can't we all just get along?

Sam writes: "Most of the advantages that World of Warcraft has over the web is its closed-platform nature." Indeed, exercising full control over the system allows Blizzard to avoid all kinds of compatibility problems. The next platform should offer the best of both worlds—the deployment model of World of Warcraft and the open development model of Firefox.

The mission of both the World Wide Web Consortium and the Mozilla Foundation is to promote the Web as a platform. Why, then, are there two separate organizations? Imagine if the W3C had, from the start, supplied—in concert with developing the standard—a production-quality implementation of the HTML parser and rendering engine that could be easily and freely embedded in third-party products. First, the lives of Web browser implementers would be easier: all they would have to do is take the standard code and add chrome. Second, the lives of Web developers would be easier, because the implementations would be homogeneous and thus interoperable.

"One standard—multiple implementations" is an obsolete model in the software industry. A centralized open implementation is better in every respect: it is both faster to market and has fewer compatibility problems. The main drawback of the latter model is that commercial derivatives of the open implementation can't be exorbitantly priced, since the barrier to entry is much lower and the competition is greater. Is that really a drawback, though?

A single implementation is also more resistant to an "embrace, extend, and extinguish" strategy. Stricter licensing terms can be attached to source code than to a standard; they would legally prevent a malicious entity from corrupting the platform. An open implementation can be viewed essentially as a proprietary system owned by the user community. (I've used "open" instead of "open source" because I am not sure whether effective anti-corruption protection will be compatible with the open source definition. This is a minor point that I will try to resolve later.)

To complete the picture on the technical level, the system must, first, support transparent autoupdate at all levels and, second, be able to run multiple versions of the code—including core components—concurrently. The former is a polite way of saying "forced upgrades," and you can't get away from it. The latter gives components that are not forward compatible—because of bugs, for example—a way of requesting a specific environment and coexisting with newer instances of peer components.
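The second requirement, running multiple versions concurrently, can be sketched as follows. The registry and component names are hypothetical, not part of any existing system: a component that is not forward compatible pins the exact version of a peer it was tested against, while everything else transparently resolves to the newest version.

```python
# Hypothetical component registry: several versions of one component
# are registered side by side and resolved per caller.
class Registry:
    def __init__(self):
        self._components = {}  # name -> {version: implementation}

    def register(self, name, version, impl):
        self._components.setdefault(name, {})[version] = impl

    def resolve(self, name, version=None):
        versions = self._components[name]
        if version is not None:
            return versions[version]   # pinned: bug-for-bug stability
        return versions[max(versions)] # default: newest available

registry = Registry()
registry.register("renderer", (1, 0), lambda doc: "legacy:" + doc)
registry.register("renderer", (2, 0), lambda doc: "modern:" + doc)

# One component pins 1.0 to keep working; everyone else gets 2.0.
print(registry.resolve("renderer", (1, 0))("page"))  # legacy:page
print(registry.resolve("renderer")("page"))          # modern:page
```

Autoupdate then only has to add new versions; nothing is ever yanked out from under a running component.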

There is an important argument against homogeneous implementations: security. A vulnerability in the standard implementation is a vulnerability in every installation. This is no worse, however, than a hole in a commercial implementation that has a near-monopoly on the market—and in fact it is much better, since "given enough eyeballs", it's likely that the bug is discovered by the community and promptly fixed via autoupdate. Pooling resources and cooperating on security, as well as on other aspects of design and development of the system, can lead to a better product for everyone.

Wednesday, November 09, 2005

Banner ads: prosperity or extinction?

Ray Ozzie and Bill Gates see a bright future for the advertising-supported model:
Online advertising has emerged as a significant new means by which to directly and indirectly fund the creation and delivery of software and services. In some cases, it may be possible for one to obtain more revenue through the advertising model than through a traditional licensing model. Only in its earliest stages, no one yet knows the limits of what categories of hardware, software and services, in what markets, will ultimately be funded through this model.
Gordon Parker analyzes the economic effect of the Compensation paradox and views the advertising-supported model as an accident:
Web services become free for users because to the extent they are commodities the price converges on marginal cost which is essentially zero. On the other hand fixed costs are nonzero resulting in the effects described by the author. The result has been an explosion of advertising to cover the gap which leaves us in essentially the state of broadcast television twenty-five years ago (less regulation.) The only reason we have such a paradoxical situation (i.e. modern technology and old model) is because of the "compensation paradox."
Advertising on the Web ranges from informative to obnoxious. Would Bill Gates pay a flat access fee of $10/month to make all the ads opt-in? More to the point, would you pay a flat fee to make all the ads opt-in?

Tuesday, November 08, 2005

Abandon—and move where? And why?

"Abandon" doesn't mean "replace": it means "move on". Building a hypertext system with similar functionality as the Web, but incompatible with it, is a doomed endeavour—no question about it. However, a platform that offers new features, while still providing Web access from within it, can reach critical mass. The "killer app" is computer-mediated realtime human interaction—such as talking to each other. (If you think that's not important, here is a data point: Skype has 200 million downloads.)

A new platform will bring a new perspective. Forget the client-server document view; think of being inside a shared world. (If you like 3D interfaces, the world will be 3D; if you like 2D interfaces, it will be 2D.) It's not in a window—it's full screen all the time; it is your home environment, not a guest. The system shows objects and avatars, and users can do things that do not fit the Web's model of interaction at all. Collaborating in realtime or playing games on the Web is difficult to impossible; in the new medium it will be natural.

One of the objects in the system is an information access device. This device is a computer with a Web browser. By zooming in on it you can access all of the World Wide Web and run old-fashioned applications. (Here's how it looks in Croquet.) You see a computer within a world, not a world within a computer.

In short, the paradigm will change from Memex to Metaverse—a Metaverse in which you can still access the Memex, along with new ways of interacting with humans and information.

Services like There and World of Warcraft represent special-purpose, commercial, closed worlds—just as the online services CompuServe and GEnie were special-purpose, commercial, and closed before the arrival of the Web, which made them practically irrelevant. A general-purpose, non-profit, open platform is to come next—bringing with it the equivalents of HTML, HTTP, Mosaic and Netscape.

To truly succeed, the new platform must incorporate solutions to all of the Web's paradoxes. It is crucial to get the initial design right, since fixing it incrementally may not be possible once backwards compatibility becomes important again. If we do get it right, the Web's problems will not disappear—but they will gradually fade in importance as the new platform is adopted.

Monday, November 07, 2005

A nickel says I am human

CAPTCHA is a form of payment. It's the most useless payment there is: it is 100% wasteful. The user pays with genuine human attention, and the service provider gets absolutely nothing.

The ways of getting around a CAPTCHA are well known: employing a dedicated person to solve it, asking porn surfers to solve it, or asking clever computer scientists to write a program that solves it. The marginal cost of breaking a CAPTCHA is in each case less than a nickel, and probably less than a cent in bulk. So you can easily buy your way into services protected by a CAPTCHA.
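To put rough numbers on that claim; the wage and solving rate below are assumptions for illustration, not measurements:

```python
# Assumed figures, for illustration only.
hourly_wage = 3.00      # dollars per hour paid to a dedicated solver
solves_per_hour = 400   # CAPTCHAs a practiced human can clear per hour

cost_per_solve = hourly_wage / solves_per_hour
print(f"${cost_per_solve:.4f} per CAPTCHA")  # $0.0075 per CAPTCHA

nickel = 0.05
print(cost_per_solve < nickel)  # True: well under a nickel
```

Under these assumptions a solved CAPTCHA costs under a cent, so any service whose abuse is worth more than that is effectively unprotected.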

On the other hand, CAPTCHAs are annoying, especially as the arms race makes them ever more complicated for humans to solve. They are a huge user-interface problem. And last, but not least, they discriminate against disabled humans.

Why not just ask users to pay a nickel directly instead of solving a CAPTCHA? Because we have no good way to charge a Web user a nickel, or to verify their reputation. So let's fix the underlying problems—the Responsibility paradox and the Compensation paradox—and bury the ugly hacks. (This post is sloppy—please read the follow-up.)

Saturday, November 05, 2005

Paradoxes and the oil crisis

Charles Miller questions whether this blog is a parody (for the record: it's serious) and writes: "These problems are labeled the "Five Paradoxes of the Web." They're not paradoxes, of course, most of them don't even manage to be contradictions, but a good name is important." I insist that the term "paradox" is appropriate in this context.

Consider a passenger car that goes 0-60 mph in 20 seconds and gets 10 mpg. You would expect a 10 mpg car to accelerate on par with a Ferrari, or a slow car to use much less fuel. In the dictionary sense of "one exhibiting inexplicable or contradictory aspects," if such a gas-guzzling VW Bug were ubiquitous, it would be a paradox. Not a contradiction, but certainly quite inexplicable.

The Web, too, often offers the worst of both worlds, for example when it comes to identity and privacy management. A JavaScript tracker can effortlessly discover intimate details about a visitor's computer, such as its screen resolution, yet many sites don't recognize users after the browser is restarted. This is equivalent to someone knowing the brand and model of your TV but forgetting your name.
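The asymmetry can be sketched in a few lines. The attribute names and values below are illustrative; in practice a tracker collects them in the browser with JavaScript. A handful of passively observable attributes hash into a fairly distinctive identifier, even though none of them is your name:

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash passively observable attributes into a tracking ID."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# What a tracker sees without asking: the brand and model of your TV,
# so to speak -- but never your name.
visitor = {
    "ip": "192.0.2.17",
    "user_agent": "Mozilla/5.0 (Windows NT 5.1) ...",
    "screen": "1024x768",
    "timezone": "-0500",
}
print(fingerprint(visitor))
```

The same visitor hashes to the same ID on every page load, while a site that actually wants to greet a returning user by name still has to make them log in again.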

If the inefficient and underpowered car is the only one available, it's still a relatively good transportation option: for long distances, it is much better than traveling on foot. Even if this car is a worldwide standard, someone can still design, build and successfully market a Prius—providing a better choice in almost every respect and, in effect, solving the paradox.

Croquet project: It's a very fine day!

The Croquet project solves the Boundary paradox and the Compatibility paradox: it is "a framework for delivering a scalable, persistent, and extensible interface to network delivered resources." The mission is not backwards-compatible:
What if we were to create a new operating system and user interface knowing what we know today, how far could we go? What kinds of decisions would we make that we might have been unable to even consider 20 or 30 years ago, when the current set of operating systems were first created?
The architecture relies on truly mobile code:
More traditional distributed systems replicate data, but try very hard not to replicate computation. But, it is often easier and more efficient to send the computation to the data, rather than the other way round.
Croquet is great research. The system makes full use of its foundation, Squeak, an implementation of Smalltalk. Today Squeak is a niche technology—will Croquet manage to bring it into the mainstream and make a big impact?

Friday, November 04, 2005

Compensation paradox: Show me the money!

You run a website. You put some effort into publicizing it, and your audience grows. Then—a breakthrough: your site is featured on Slashdot. And within an hour, hundreds of new visitors are getting server errors, because your bandwidth is maxed out for the month.

Monetary infrastructure at first appears orthogonal to the technological infrastructure, but in reality it's not. Universal free access is impossible, because it ends up being neither free nor universal. Someone has to pay the electricity bill, since the utility company doesn't believe in universal free access. Scalability on the Web is not cheap: underinvestment leads to poor quality of service for everyone.

Adding a monetary component to each transaction makes the virtual economy more balanced: several classes of attacks remain possible but become too expensive to be worthwhile. If every request costs the originator money, denial-of-service attacks and spam are generally not as profitable as the alternatives. It doesn't matter whether the currency is hashcash, "play" money with limited supply, or real cents and dollars; the critical step is going from "free" to "almost free".
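The hashcash idea can be sketched in a few lines. This is a toy illustration, not the real hashcash protocol; the `mint`/`verify` helpers and the difficulty parameter are hypothetical:

```python
import hashlib
from itertools import count

def mint(resource: str, bits: int = 12) -> str:
    """Search for a stamp whose SHA-1 hash has `bits` leading zero bits.

    Minting costs the sender CPU time (on average 2**bits hashes);
    checking costs the receiver a single hash. That asymmetry is what
    turns "free" into "almost free".
    """
    threshold = 1 << (160 - bits)  # SHA-1 digests are 160 bits wide
    for nonce in count():
        stamp = f"{resource}:{nonce}"
        if int.from_bytes(hashlib.sha1(stamp.encode()).digest(), "big") < threshold:
            return stamp

def verify(stamp: str, resource: str, bits: int = 12) -> bool:
    """Cheap check: one string comparison and one hash, no search."""
    if not stamp.startswith(resource + ":"):
        return False
    threshold = 1 << (160 - bits)
    return int.from_bytes(hashlib.sha1(stamp.encode()).digest(), "big") < threshold

# Each request to a resource must carry a freshly minted stamp:
stamp = mint("alice@example.com")
assert verify(stamp, "alice@example.com")
assert not verify(stamp, "bob@example.com")
```

The same shape works for any of the currencies above: only the cost of minting changes.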

The problem of payments has been well-researched: Google Scholar returns over a thousand hits for [micropayments]. The best mechanism can be selected in due time; the important thing is that the mistake of ignoring compensation altogether will not be repeated.

On the Web, building scalable services is difficult and expensive. The goal of the next platform is to come up with the right economic model, the right programming model, and the right deployment model so that many services can scale on demand: as more requests come in, additional computational resources are provisioned and paid for out of the service usage fees, transparently to the operator, who doesn't lift a finger and doesn't invest upfront. In a healthy economic climate, many more content and service providers can prosper, and administrators will welcome the slashdot effect instead of fearing it.

Shared worlds and the Web

To build a 3D shared world, you need to build a 3D interface and you need to build a shared world. A shared world can exist without a 3D interface: a shared whiteboard, for example, is a 2D shared world. A 3D interface can exist without a shared world: VRML, for example, is a 3D experience that is not truly shared. The Web can be adapted to provide a 3D interface, but it is architecturally ill-suited for shared worlds.

Consider an example: there are two people in the world, Alice and Bob. Bob throws a ball up in the air—there is gravity, so it goes up and then down. Alice looks at the ball. For this example, it is irrelevant whether the view is 2D or 3D.

On the Web, Bob must notify server Sam that the ball was thrown, and Sam must notify Alice. But how? The Web is pull-only, so there is no way for Sam to push information to Alice. Alice has to resort to hacks, such as constantly polling Sam or keeping a connection open. (This blog's feed is another such hack. Why can't I notify you right away that I've published a new post?)
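The polling hack can be sketched as a schematic simulation; this is not real networking, and the `Server` class, the tick loop, and the timeline are hypothetical stand-ins for Sam, the clock, and Bob's activity:

```python
class Server:
    """Sam: like an HTTP server, he can store events but only answer requests."""
    def __init__(self):
        self.pending = []

    def post(self, event):   # Bob publishes an event
        self.pending.append(event)

    def poll(self):          # Alice asks: "anything new since last time?"
        events, self.pending = self.pending, []
        return events

sam = Server()
timeline = [None, None, "ball thrown", None, None]  # one event in five ticks
wasted = 0

for event in timeline:
    if event is not None:
        sam.post(event)
    if not sam.poll():       # Alice polls every tick, since Sam cannot push
        wasted += 1          # a full round trip that carried no information

# 4 of the 5 polls were wasted; the cost grows with the polling
# frequency, not with the number of actual events.
```

Polling faster reduces latency but multiplies the wasted round trips, which is exactly the trade-off a push-capable platform avoids.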

Once the ball is thrown, Alice has to receive a stream of updates about its position. Bob knows the position and can broadcast it, but the network delays the updates, so Alice sees the ball's motion as jerky.

Alternatively, Alice could know the behavior of every object in the universe; then all she needs to display the ball is the initial direction and speed with which it was thrown. But knowing the whole universe means Alice has to know a whole lot, and that the universe can never be extended—what if Bob just invented the ball? Bob knows its behavior, but he can't communicate it to Alice easily and securely: he can't transparently ship code across because of the Boundary paradox.
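Alice's side of that scheme can be sketched as follows; this is a minimal, hypothetical example assuming the one piece of shared behavior is vertical motion under gravity:

```python
def ball_height(v0: float, t: float, g: float = 9.81) -> float:
    """Height (in meters) of a ball thrown straight up at v0 m/s, t seconds later."""
    return v0 * t - 0.5 * g * t * t

# Bob sends a single message -- the initial speed -- instead of streaming
# positions over a jittery network.
v0 = 9.81  # m/s, chosen so the flight lasts exactly two seconds

# Alice can now render the ball at any frame rate she likes, with no jerkiness:
heights = [ball_height(v0, tick / 10) for tick in range(21)]
assert heights[0] == 0.0            # launch
assert heights[20] == 0.0           # back on the ground at t = 2 s
assert max(heights) == heights[10]  # apex at t = 1 s
```

The catch is exactly the one described above: this only works if `ball_height` is already known to Alice, or if Bob can ship it to her as mobile code.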

For building shared worlds, you need flexible two-way communications and workable mobile code—the Web offers neither. The next platform will fix that.

Thursday, November 03, 2005

Solutions: Executive summary

We know how to solve the problems of the Web. The solutions outlined here are by no means original. They have been researched, prototyped, and used in other contexts. They work.

What we don't know is how to solve the problems of the Web and retain backwards compatibility at the same time. The cost of backwards compatibility is high—more importantly it's recurring: we pay for it every day, every hour the Web is an active platform. We pay for it with productivity losses caused by spam, denial of service, cracked accounts, broken JavaScript—and with new services and content that could have been developed but weren't.

What can a new infrastructure offer that would cause a critical mass of people to use it? An escape from paradoxes that seem intractable today.

  • Everything is free, yet nothing is free. (Compensation paradox)

    Solution: build payments into the core protocols. Every transaction will have a monetary component. The economic model will evolve, as long as there is a technical foundation for it. (read more)

  • We don't know who you are, yet there is no privacy. (Identity paradox)

    Solution: make identity and privacy management part of the platform. There will be no password form fields; applications delegate all identity verification to the platform, which uses multifactor identification if possible. There will be no cookies; services will offer explicit trade-offs between the privacy level and the personalization level.

  • Write multiple times, yet it still doesn't run everywhere. (Compatibility paradox)

    Solution: modernize the development and deployment model. Instead of single standard/multiple implementations, there will be a single openly developed implementation of the platform. Applications and services built on the platform can use any licensing model they want. All components will have an autoupdate functionality. (read more)

  • Code goes over the network, yet it's not mobile. (Boundary paradox)

    Solution: hide the boundaries between machines from the platform. Instead of the client-server model, the developer will view the networked system as a single fault-tolerant computer. Use transactions to manage concurrency.

  • The Web is not decentralized enough, yet it is not centralized enough. (Responsibility paradox)

    Solution: decouple technical hierarchies from the hierarchy of responsibility. Dedicated police services will enforce the laws and provide other functionality that requires centralization, such as maintaining certificate authorities and namespaces. Entities will be required to associate their identities with one of the police services, which will have jurisdiction over them, similar to the passport/state system.

This is a brief description of proposed solutions. No solution can be implemented in the context of the Web because of backwards compatibility; but a completely new infrastructure can adopt them all. The upcoming posts will focus on each paradox and its solution in detail.

Wednesday, November 02, 2005

On a scale from zero to five

Let's rate a modern software platform, such as Java or .Net, on how well it solves the five paradoxes of the Web. Java and .Net are similar enough that it doesn't make sense to compare them against each other. For a benchmark, consider a modern gaming platform: an MMORPG such as EverQuest or World of Warcraft.

The modern software platform doesn't address the Compensation, Identity, or Responsibility paradoxes at all, so the maximum score it can get is two. On the Compatibility paradox, it scores around 0.7: I will argue later that a perfect score requires an open-source primary implementation. On the Boundary paradox, the score is at most 0.7 as well: the boundaries between VMs are still apparent, as boundary-aware abstractions such as threads and sockets are part of the core API.

The MMORPG addresses more of the Web's problems. 0.7 on the Compensation paradox: a viable economy is present, but there is just one bank. 0.2 on the Identity paradox: at least it knows your nickname. 0.7 on the Compatibility paradox: a single closed-source implementation. Zero on the Boundary paradox. And 0.5 on the Responsibility paradox, since there is a central and active authority.
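As a quick arithmetic check, the per-paradox scores quoted above do add up to the final ratings (the variable names are mine):

```python
# A zero means the platform doesn't address that paradox at all.
platform = {"Compensation": 0.0, "Identity": 0.0, "Compatibility": 0.7,
            "Boundary": 0.7, "Responsibility": 0.0}
mmorpg = {"Compensation": 0.7, "Identity": 0.2, "Compatibility": 0.7,
          "Boundary": 0.0, "Responsibility": 0.5}

assert round(sum(platform.values()), 1) == 1.4
assert round(sum(mmorpg.values()), 1) == 2.1
```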

Final rating is: modern software platform—1.4, modern gaming platform—2.1. To build a successor to the Web, don't start with .Net: instead, take World of Warcraft and make it suitable for general purpose use.