Saturday, September 6, 2008

The 10 Laws of Cloudonomics

I've been thinking about the strategic competitive advantage of clouds, and in particular about some of the mathematics and statistics behind cloud computing. For example, the benefits of aggregating demand can be characterized based on fundamental laws of statistics, such as the law of large numbers and the behavior of variance when independent random variables are summed. Observations such as these have profound business implications. For example, service provider clouds have fundamental strategic advantages over enterprise clouds or owned (or financed or leased) infrastructure based on this statistical behavior. I coined the term "Cloudonomics", with Don Tapscott and Anthony Williams' book "Wikinomics" in mind, to capture the notion that there are economic laws of cloud service providers and specific curves (e.g., hyperbolic, inverse square, asymptotic) that characterize how they create value.
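To see the smoothing effect of aggregation concretely, here is a minimal simulation sketch (not from the original article; the demand distribution and parameters are illustrative assumptions). Because variances of independent demands add while means add too, the standard deviation of the total grows like the square root of the number of tenants, so relative variability, the coefficient of variation, shrinks roughly as 1/sqrt(n):

```python
import random
import statistics

random.seed(42)

def coefficient_of_variation(samples):
    """Relative variability: standard deviation divided by mean."""
    return statistics.pstdev(samples) / statistics.fmean(samples)

def aggregate_demand(n_tenants, n_periods=2000):
    """Per-period totals of n_tenants independent demands (mean 100, sd 30)."""
    return [sum(random.gauss(100, 30) for _ in range(n_tenants))
            for _ in range(n_periods)]

for n in (1, 4, 16, 64):
    cv = coefficient_of_variation(aggregate_demand(n))
    print(f"tenants={n:3d}  CV≈{cv:.3f}")  # CV shrinks roughly as 1/sqrt(n)
```

A single tenant here shows a CV near 0.30 (30/100); pooling 64 independent tenants drives it down toward 0.04, which is the statistical argument for multi-tenant providers needing proportionally less headroom.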

The article is at BusinessWeek by way of the GigaOM Network. I will post more detailed comments on each law at my Cloudonomics blog, based both on some underlying proofs and simulations that I've posted at Complex Models.

My thesis is that large enterprises have many of the same technologies at their disposal as cloud service providers do. Virtualization, for example, even though widely used by cloud service providers, hardly provides strategic value to an enterprise, since an enterprise can leverage exactly the same technology. Consequently, the advantages of service providers must derive from the fact that they are inherently multi-tenant and multi-enterprise: they provide true on-demand infrastructure or services through dynamic allocation and multiplexing; via larger scale; and via strategic, engineered dispersion.

The 10 Laws of Cloudonomics are:

  1. Utility services cost less even though they cost more.
  2. On-demand trumps forecasting.
  3. The peak of the sum is never greater than the sum of the peaks.
  4. Aggregate demand is smoother than individual.
  5. Average unit costs are reduced by distributing fixed costs over more units of output.
  6. Superiority in numbers is the most important factor in the result of a combat (Clausewitz).
  7. Space-time is a continuum (Einstein/Minkowski).
  8. Dispersion is the inverse square of latency.
  9. Don't put all your eggs in one basket.
  10. An object at rest tends to stay at rest (Newton).

Read more here.
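Law 3 is a mathematical identity rather than an empirical claim, and a short sketch makes it tangible (the tenant counts and demand distribution below are illustrative assumptions, not from the article). At whatever hour the combined load peaks, each tenant contributes at most its own peak, so the peak of the sum can never exceed the sum of the peaks:

```python
import random

random.seed(7)

# 20 tenants, each with a spiky 24-hour demand profile (illustrative)
profiles = [[random.expovariate(1 / 50) for _ in range(24)] for _ in range(20)]

# Capacity needed if every tenant builds for its own peak:
sum_of_peaks = sum(max(p) for p in profiles)

# Capacity a shared provider needs: the peak of the combined hourly load
hourly_totals = [sum(p[h] for p in profiles) for h in range(24)]
peak_of_sum = max(hourly_totals)

print(f"sum of peaks: {sum_of_peaks:.0f}")
print(f"peak of sum:  {peak_of_sum:.0f}")
assert peak_of_sum <= sum_of_peaks  # Law 3 holds for any demand profiles
```

Unless every tenant peaks in the same hour, the inequality is strict, and the gap is exactly the capacity a multi-tenant provider avoids building.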

Saturday, May 31, 2008

How fast is the Internet growing?

Just returned from participating on a panel called "The Exaflood: Managing the coming digital deluge" at the always outstanding Gilder / Forbes Telecosm. The theme this year was "The Exaflood," i.e., the rapidly growing flood of digital information on the Internet and enterprise data networks.

The panel was moderated by Bret Swanson, a Senior Fellow at the Discovery Institute and Director of the Center for Global Innovation at the Progress and Freedom Foundation. It comprised Andrew Odlyzko, Professor at the University of Minnesota and Director of its Digital Technology Center; Bob Metcalfe, Ethernet inventor and author, now at Polaris Venture Partners; Johna Till Johnson, President of Nemertes Research; Tom Evslin, founder and former CEO of ITXC and "Fractals of Change" blogger; Lane Patterson, Chief Technologist at Equinix; Walt Ordway, former CTO of the Digital Cinema Initiative; and myself.

There was a diversity of opinion regarding the growth of demand and how to measure it, both recently and over the next few years. Andrew Odlyzko began with a fairly modest growth estimate, pointing out that correctly forecasting growth rates is key for the service provider and equipment vendor industry: if rates are unexpectedly high, congestion and service outages will follow; if they are unexpectedly low, overcapacity and poor ROIs will occur. Bret Swanson then recapped a recent study he conducted with George Gilder, excerpted in the Wall Street Journal, projecting 50-fold growth in Internet traffic through 2015, which translates to a 54% CAGR. Johna Johnson then discussed the difficulty of acquiring good data, since core network traffic data is likely to differ from edge traffic that doesn't traverse service provider cores. She quoted Nemertes projections of 100% growth, and also pointed out that it can be difficult to determine unserved demand.
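For readers checking the arithmetic: the 50-fold and 54% figures are consistent with compounding over nine annual periods (e.g., a 2006 baseline through 2015 — my assumption about the study's baseline year, not stated above):

```python
# CAGR = growth_multiple ** (1 / years) - 1
growth_multiple = 50   # projected 50-fold traffic growth
years = 9              # assumed: 2006 baseline compounding through 2015

cagr = growth_multiple ** (1 / years) - 1
print(f"CAGR ≈ {cagr:.1%}")  # ≈ 54.4%

# Sanity check: compounding that rate for 9 years recovers the 50x multiple
assert abs((1 + cagr) ** years - growth_multiple) < 1e-9
```

A shorter horizon would imply a much higher rate (50x over seven years works out to roughly 75% annually), which is why the assumed baseline year matters when comparing these projections.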

Who's right? Well, ask again in 2015. In the meantime, core network growth rates of 60% annually, which is what we've seen on a regular basis over the last few years, are unlikely to slow. In fact, if anything, the opposite is likely to happen, as multi-megabit/s consumer broadband access increases, consumer desktop HD video streaming grows, consumer IPTV gets deployed, peer-to-peer file sharing continues, and enterprise video conferencing from desktops and immersive Telepresence solutions accelerate. Even mobile video bandwidth continues to grow, driven by synchronous real-time video across uplink, downlink, and full-duplex connections. Next-generation network upgrades to OC-768 and 3G HSPA deployments and 4G LTE spectrum acquisitions and deployments over the coming years mean that wireline and wireless capacity will be growing, hopefully in tandem with the Exaflood of demand.