Saturday, September 6, 2008

The 10 Laws of Cloudonomics

I've been thinking about the strategic competitive advantage of clouds, and in particular about some of the mathematics and statistics behind cloud computing. For example, the benefits of aggregating demand can be characterized via fundamental laws of statistics, such as the law of large numbers and the fact that variances add when independent random variables are summed, so the relative variability of aggregate demand shrinks as more tenants are pooled. Such observations have profound business implications. For example, service provider clouds have fundamental strategic advantages over enterprise clouds or owned (or financed or leased) infrastructure based on this statistical behavior. I coined the term "Cloudonomics", with Don Tapscott and Anthony Williams's book "Wikinomics" in mind, to capture the notion that there are economic laws of cloud service providers and specific curves (e.g., hyperbolic, inverse square, asymptotic) that characterize how they create value.
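
To make the statistical argument concrete, here is a minimal simulation sketch in Python; the tenant count, demand distribution, and parameters are all illustrative assumptions, not figures from the article. It demonstrates two of the effects the laws below formalize: the peak of aggregate demand never exceeds the sum of the individual peaks, and the relative variability (coefficient of variation) of aggregate demand shrinks roughly as 1/sqrt(n).

    import random
    import statistics

    def simulate(n_tenants=100, periods=10000, mu=100.0, sigma=30.0):
        # Each tenant's demand per period is an independent random variable
        # (truncated at zero); all parameters are illustrative assumptions.
        demands = [[max(0.0, random.gauss(mu, sigma)) for _ in range(periods)]
                   for _ in range(n_tenants)]
        aggregate = [sum(d[t] for d in demands) for t in range(periods)]

        # Law #3: the peak of the sum is never greater than the sum of the peaks.
        print(f"sum of peaks:  {sum(max(d) for d in demands):,.0f}")
        print(f"peak of sum:   {max(aggregate):,.0f}")

        # Law #4: aggregate demand is smoother (lower coefficient of variation).
        def cv(xs):
            return statistics.stdev(xs) / statistics.mean(xs)

        print(f"individual CV: {cv(demands[0]):.3f}")
        print(f"aggregate CV:  {cv(aggregate):.3f}")  # roughly individual / sqrt(n)

    simulate()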

The article is at BusinessWeek by way of the GigaOM Network. I will post more detailed comments on each law at my Cloudonomics blog, based on some underlying proofs and simulations that I've posted at Complex Models.

My thesis is that large enterprises have many of the same technologies at their disposal as cloud service providers do. Virtualization, for example, even though widely used by cloud service providers, is hardly a source of strategic differentiation, since an enterprise can leverage exactly the same technology. Consequently, the advantages of service providers must derive from the fact that they are inherently multi-tenant and multi-enterprise: they provide true on-demand infrastructure or services through dynamic allocation and multiplexing, through larger scale, and through strategic, engineered dispersion.

The 10 Laws of Cloudonomics are:

  1. Utility services cost less even though they cost more.
  2. On-demand trumps forecasting.
  3. The peak of the sum is never greater than the sum of the peaks.
  4. Aggregate demand is smoother than individual.
  5. Average unit costs are reduced by distributing fixed costs over more units of output.
  6. Superiority in numbers is the most important factor in the result of a combat (Clausewitz).
  7. Space-time is a continuum (Einstein/Minkowski).
  8. Dispersion is the inverse square of latency.
  9. Don't put all your eggs in one basket.
  10. An object at rest tends to stay at rest (Newton).

Saturday, May 31, 2008

How fast is the Internet growing?

Just returned from participating on a panel called "The Exaflood: Managing the coming digital deluge" at the always outstanding Gilder / Forbes Telecosm. The theme this year was "The Exaflood," i.e., the rapidly growing flood of digital information on the Internet and enterprise data networks.

The panel was moderated by Bret Swanson, a Senior Fellow at the Discovery Institute and Director of the Center for Global Innovation at the Progress and Freedom Foundation. It comprised Andrew Odlyzko, Professor at the University of Minnesota and Director of its Digital Technology Center; Bob Metcalfe, Ethernet inventor and author, now at Polaris Venture Partners; Johna Till Johnson, President of Nemertes Research; Tom Evslin, founder and former CEO of ITXC and "Fractals of Change" blogger; Lane Patterson, Chief Technologist at Equinix; Walt Ordway, former CTO of Digital Cinema Initiatives; and myself.

There was a diversity of opinion regarding the growth of demand and how to measure it, both recently and over the next few years. Andrew Odlyzko began with a fairly modest estimate of growth rates, pointing out that correctly forecasting them is key for the service provider and equipment vendor industry: if growth is unexpectedly high, congestion and service outages follow; if it is unexpectedly low, overcapacity and poor ROIs result. Bret Swanson then recapped a recent study he conducted with George Gilder, excerpted in the Wall Street Journal, projecting 50-fold growth in Internet traffic through 2015, which translates to a 54% CAGR. Johna Johnson then discussed the difficulty of acquiring good data, since core network traffic data is likely to differ from edge traffic that doesn't traverse service provider cores, and quoted Nemertes projections of 100% growth. She also pointed out that it can be difficult to determine unserved demand.
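
For anyone who wants to check the arithmetic behind such projections, converting a growth multiple into a compound annual growth rate is a one-liner; the baseline year here is my assumption, chosen to be consistent with the quoted figures.

    # CAGR implied by a growth multiple over a number of years:
    # multiple = (1 + cagr) ** years  =>  cagr = multiple ** (1 / years) - 1
    multiple, years = 50, 9      # 50x growth over roughly 2006-2015 (assumed baseline)
    cagr = multiple ** (1 / years) - 1
    print(f"{cagr:.1%}")         # ~54.4%, matching the figure quoted above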

Who's right? Well, ask again in 2015. In the meantime, core network growth rates of 60% annually, which is what we've seen consistently over the last few years, are unlikely to slow. If anything, the opposite is likely, as multi-megabit/s consumer broadband access increases, consumer desktop HD video streaming grows, consumer IPTV gets deployed, peer-to-peer file sharing continues, and enterprise video conferencing, from desktops to immersive Telepresence solutions, accelerates. Even mobile video bandwidth continues to grow due to synchronous real-time video: uplink, downlink, and full duplex. Next-generation network upgrades to OC-768, 3G HSPA deployments, and 4G LTE spectrum acquisitions and deployments over the coming years mean that wireline and wireless capacity will keep growing, hopefully in tandem with the Exaflood of demand.

Saturday, September 15, 2007

Facebook statistics and close friends

I just returned from GITEX, the largest IT and consumer electronics trade show in the Middle East, held in Dubai, United Arab Emirates. Perusing Wednesday's Gulf Today, I came across an article regarding Facebook and networks of friendships in the real world. According to Will Reader, an evolutionary psychologist from Great Britain, the number of close friends that one has is essentially invariant, whether one has thousands of Facebook friends or not. Reader indicates that many studies have shown that people have about 150 people in their networks of friends, and that only a small fraction of these are "close" friends, regardless of how big their online networks grow.

The reason this is significant is that it is consonant with my analysis of the statistics of many networks. Specifically, it shows that connectivity value -- in this case the value of a social connection -- is not distributed equally across all nodes (i.e., friends). This yet again argues against an n-squared value for networks, where n is the size of the network: such a valuation would require that each connection be of equal value, or at least that the expected value contributed per member grow linearly with n, neither of which appears to be the case.
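
A toy calculation makes the point; the per-member cap of 150 relationships and the uniform per-friendship value are simplifying assumptions. If each member carries value on only about 150 relationships no matter how large the network gets, total connectivity value grows as Theta(n), while a Metcalfe-style valuation grows as Theta(n^2), and the gap between the two widens without bound.

    def total_value(n, cap=150, value_per_link=1.0):
        # Metcalfe-style valuation: every pair of members is equally valuable.
        metcalfe = value_per_link * n * (n - 1) / 2
        # Capped valuation: only ~150 relationships per member carry value.
        capped = value_per_link * n * min(cap, n - 1) / 2
        return metcalfe, capped

    for n in (1_000, 100_000, 10_000_000):
        m, c = total_value(n)
        print(f"n={n:>10,}  Metcalfe={m:.2e}  capped={c:.2e}  ratio={m / c:,.0f}")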

The site now reportedly has over 30 million users, which also suggests that the number of actual friends and close friends does not change even as the number of potential Facebook friend connections becomes substantial.

Monday, September 3, 2007

Increasing Network Value III: Front-Loading

If connectivity value is defined by the aggregation across all connections of the expected net present value of transaction streams, that leads us to a third way of increasing that value. As discussed earlier, one way is to increase the value of each transaction, and a second way is to increase the rate of transactions.

A third way is to accelerate transactions into the present. Even with the same nominal value for each transaction, and even with the same average rate of transactions, front-loading increases the net present value by decreasing the time-based discount for each transaction.

When TiVo offers "lifetime subscriptions," it is not only solving a cash flow issue but also translating a stream of future monthly subscriber fees into a single front-loaded payment. Just as with a lottery, where one can accept winnings as a lump sum or as monthly payments stretching out over 30 years, or with a life annuity, there is a breakeven point that depends on the projected expectation of payments and on interest rates. Depending on these assumptions, a front-end non-recurring charge can be equal to, more than, or less than a particular finite or infinite stream of payments.
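
Here is a back-of-the-envelope sketch of that breakeven; the lifetime fee, monthly price, and discount rate are hypothetical numbers for illustration, not TiVo's actual pricing.

    def pv_of_monthly_stream(payment, months, annual_rate):
        r = annual_rate / 12  # monthly discount rate
        # Present value of an ordinary annuity: payment * (1 - (1 + r)**-n) / r
        return payment * (1 - (1 + r) ** -months) / r

    lifetime_fee = 299.00     # hypothetical front-loaded payment
    for years in (2, 5, 10):
        pv = pv_of_monthly_stream(12.95, years * 12, 0.06)
        better = "lifetime" if lifetime_fee < pv else "monthly"
        print(f"{years:>2} years of $12.95/mo -> PV ${pv:,.2f} ({better} is the better deal)")

Under these assumptions the breakeven falls at just over two years of expected service: shorter lifetimes favor the monthly stream, longer ones favor the front-loaded payment.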

Sunday, September 2, 2007

Increasing Network Value II: Transaction Rate

Even if the value of each transaction doesn't change, another way to increase the total value of some or all connections, and therefore total network value, is to increase the transaction rate. Even if a customer buys a less expensive item rather than a more expensive one, value (in the sense of the net present value of the revenue stream) may increase if the purchases are more frequent.

And certainly, if they buy the same item more frequently, that also enhances the total value of the transaction stream.
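
A quick illustration with hypothetical numbers: for a perpetual purchase stream discounted at rate r, a payment of A every T years has present value A / ((1 + r)^T - 1), so the same nominal spend delivered more frequently is worth more today.

    def pv_perpetuity(amount, interval_years, r=0.08):
        # PV of a payment of `amount` recurring every `interval_years` years,
        # first payment one interval from now; r is an assumed discount rate.
        return amount / ((1 + r) ** interval_years - 1)

    print(f"$400 every 4 years: ${pv_perpetuity(400, 4):,.0f}")   # ~$1,110
    print(f"$100 every year:    ${pv_perpetuity(100, 1):,.0f}")   # ~$1,250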

Planned obsolescence, whether a result of engineering (or the lack thereof) or of fashion trends, is a way to incent us to increase our consumption rate. Eliminating physical and societal barriers to consumption is another way to increase the transaction rate. This may mean having a coffee shop or hamburger joint on every street corner, to remove the time and energy barrier of walking an extra 100 feet. Of course, social forces are a tricky thing: whether it is 2-button vs. 3-button suits or SUVs vs. hybrids, trends may shift, causing changes in the transaction rate.

Increasing Network Value I: Transaction Value and Pricing

Connectivity value is at least one metric for valuing networks. Others, such as Reed, have proposed additional metrics such as group-forming value, but let's stick with connectivity value for a moment. Even in Web 2.0 and Enterprise 2.0, groups exist due to relationships, which in turn are based on transactions across connections.

If, as I've proposed, there are conditions where connectivity value is linear in the size of the network -- be it a communications network, a producer-consumer network, or anything else -- does that mean that networks have limited value?

Of course not.

If we define the connectivity value of a link as the expected value (i.e., likelihood-adjusted) of the net present value (i.e., adjusted for time value of money) of the transaction stream of that link, then one easy way to increase the value of the connection is to increase the size (i.e., value) of the transactions.

For example, if the connectivity value between me and my car dealer is defined by buying a car every four years, that connectivity value will increase if I buy a Lamborghini every four years instead of a used Yugo. (For those who don't know, the Yugo was notable when it went on sale in the '80s as the cheapest car sold in the U.S. Presumably used ones are still for sale.)

Nothing has changed in the order of the value of the network: it is still O(n), in other words, proportional to the number of nodes (which in this case are many car buyers and relatively few car dealers). However, if everyone started buying Lamborghinis instead of Yugos, the connectivity value of the "global automotive sales network" would increase by orders of magnitude.
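
To put rough numbers on the example (the prices, purchase interval, and discount rate are illustrative assumptions): the network's total value scales linearly with the number of buyers regardless of what they buy, while the transaction size sets only the constant factor.

    def network_value(n_buyers, price, years_between_purchases=4, r=0.08):
        # PV of one buyer's perpetual purchase stream, summed over all buyers;
        # the result is Theta(n) in the number of buyers, whatever the price.
        pv_per_buyer = price / ((1 + r) ** years_between_purchases - 1)
        return n_buyers * pv_per_buyer

    for price, label in ((4_000, "used Yugo"), (250_000, "Lamborghini")):
        print(f"{label:>11}: ${network_value(1_000_000, price):,.0f}")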

Of course, merely raising prices or selling more expensive products doesn't do the trick. Wal-Mart's revenues are higher than Henri Bendel's. As first steps, understanding price elasticity of demand (what would happen if we charged 10% more for this product?) and using dynamic pricing for yield management (this is why airline seat prices appear to fluctuate randomly) can help maximize the total value of the system.
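
As a sketch of that first step (assuming, purely for illustration, a constant-elasticity demand curve): if quantity scales as (p/p0)^e for elasticity e, then revenue scales as (p/p0)^(1+e), which answers the "charge 10% more" question directly.

    def revenue_change(elasticity, price_increase=0.10):
        # Constant-elasticity demand: q = q0 * (p/p0)**e, so revenue p*q
        # scales as (p/p0)**(1 + e).
        return (1 + price_increase) ** (1 + elasticity) - 1

    for e in (-0.5, -1.0, -2.0):   # inelastic, unit-elastic, elastic demand
        print(f"elasticity {e:+.1f}: a 10% price hike moves revenue {revenue_change(e):+.1%}")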

Also, price targeting, discussed in extremely readable fashion in "The Undercover Economist" by Tim Harford, subtly extracts more money from price-insensitive or otherwise ignorant customers. He addresses three main mechanisms: individual targeting, group targeting, and "self-incrimination." It is this last technique that enables gourmet coffee shops to sell a cheap regular coffee right next to a $5.00 super half-caf iced mocha caramel choco-frappuccino. Lest you think that this is because of special hand-picked beans that cost more... it isn't. Tim assures us that the production and operations cost differential between the cheap and expensive cups may be disregarded.

In summary, one way to increase the value of a network? Raise prices. Or lower them. Or change them dynamically. Whatever it takes to maximize the expected net present value of the connection. And, as Tim points out, in a free market economy such pricing represents the "truth" about what maximizes value to all parties in the transaction: consumers as well as producers.

Mark Cuban and the Emotional Value of Networks

At Blog Maverick, in a post titled "Metcalfe's Law and Video," Mark Cuban discusses a different perspective on network value, specifically with a view towards the intensity over time of connectivity. He comments that "the more people that see content when it is originally 'broadcast,' regardless of the distribution medium, the more valuable the content." Although that can be demonstrated by simple net present value calculations, he is also talking about emergent effects, such as emotional attachment and the social value of real or virtual simultaneous participation.

He also hypothesizes that not only is there greater value from simultaneous delivery, but also that there is greater cost. His argument is that networks that are designed for large scale simultaneous delivery of content cost more than those that are less ambitious.

To me, this is arguable. For example, there are inherent economies in using broadcast, a content distribution network, or IP multicast to distribute content simultaneously rather than redelivering it on demand, sequentially. Once the capital expenditure for a scalable and feature-rich network has been made, broadcast and multicast technologies and architectures actually reduce the cost per bit delivered per person.
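
A simple sketch of why: the stream rate, duration, and audience size below are illustrative assumptions, and the calculation counts only bits crossing a shared core link, since access links carry roughly the same traffic either way.

    def core_bits_delivered(viewers, mbps=8.0, hours=2.0, multicast=False):
        # One HD stream at `mbps` for `hours`; unicast sends a copy per viewer
        # across the core, multicast/broadcast sends one shared copy.
        stream_bits = mbps * 1e6 * hours * 3600
        return stream_bits if multicast else stream_bits * viewers

    viewers = 1_000_000
    uni = core_bits_delivered(viewers)
    multi = core_bits_delivered(viewers, multicast=True)
    print(f"unicast:   {uni:.2e} core bits")
    print(f"multicast: {multi:.2e} core bits ({uni / multi:,.0f}x fewer)")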

If you combine his viewpoint on the value-add of "live" and simultaneous events with my observation that such events can actually cost less to deliver, there is a sweet spot, if the network is engineered properly, in delivering live simultaneous content versus delayed and on-demand content.

This conclusion is actually not surprising, since traditional broadcast TV and movie theaters were only economically viable (in their day) due to the cost reductions inherent in broadcasting program content to a large simultaneous audience rather than unicasting it asynchronously. Of course, today's technology has reduced the marginal cost of unicasting to an infinitesimal fraction of a customer's willingness to pay for such content.

Or so it would seem. In reality, though, for the foreseeable future there will be content that is too bandwidth-hungry for widespread acceptance. Maybe YouTube videos don't have that property right now, but what about HDTV to your laptop screen? How many people are willing to pay for mobile bandwidth sufficient to deliver it in real time, say for 1080p video conferencing? If not that, how about digital cinema quality images?
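
Some rough arithmetic on why HD remains the bandwidth-hungry case; the frame size, frame rate, and compressed-rate rule of thumb are approximations, not measurements.

    # Uncompressed 1080p at 30 frames/s, 24 bits per pixel:
    width, height, bits_per_pixel, fps = 1920, 1080, 24, 30
    raw_bps = width * height * bits_per_pixel * fps
    print(f"uncompressed 1080p30: {raw_bps / 1e9:.2f} Gbit/s")  # ~1.49 Gbit/s
    # H.264-class compression brings 1080p down to very roughly 5-10 Mbit/s --
    # still far more than typical mobile links deliver today.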

For the next 5 to 10 years, there will always be that dilemma. After that, perhaps not, because we will have the ability to deliver enough bandwidth to each user, whether fixed or mobile, to equal or exceed the limits of human perception. At that point, until we evolve or bio-engineer our visual cortex and other sensory modalities to become Human 2.0, any additional bandwidth will be overkill, at least for the purposes of entertainment.