SpectralShifts Blog 
Monday, January 28 2013

TCP/IP Won, OSI Lost.  Or Did It?  Clue: Both Are Horizontal

George Santayana said, "Those who cannot remember the past are condemned to repeat it."  What he didn't add, as it might have undermined his point, is that "history gets created in one moment and gets revised the next."  That's what I like to say.  And nothing could be more true of current telecom and infomedia policy and structure.  How can anyone in government, academia, capital markets or the trade learn from history and make good long-term decisions if they don't have the facts straight?

I just finished a book about the origins of the internet (ARPAnet, CSnet, NSFnet), Where Wizards Stay Up Late: The Origins of the Internet, by Katie Hafner and Matthew Lyon, written back in 1996 before the bubble and crash of Web 1.0.  It has been a major read for computer geeks and has some lessons for people interested in information industry structures and business models.  I cross both boundaries and was equally fascinated by the "anti-establishment" approach of the group of scientists and business developers at BBN, the DoD and academia, and by the haphazard, evolutionary approach to development that resulted in an ecosystem very similar to what the original founders envisioned in the 1950s.

The book has become something of a bible for internet fashionistas, those I refer to as upper-layer (application) types, who unfortunately have, and are given by the book, very little understanding of the middle and lower layers of the service provider "stack."  In fact the middle layers all but disappear as far as they are concerned.  While those upper-layer fashionistas would like to simplify things and say, "so and so was a founder or chief contributor of the internet," or "TCP/IP won and OSI lost," actual history and reality suggest otherwise.

Ironically, the best way to look at the evolution of the internet is via the oft-maligned 7-layer OSI reference model.  It happens to be the basis for one dimension of the InfoStack analytical engine.  The InfoStack relates the horizontal layers (what we call the service provisioning checklist for a complete solution) to the geographic dispersion of traffic and demand on a second axis, and to a third axis which historically covered 4 disparate networks and business models but now maps to applications and market segments.  Looking at how products, solutions and business models unfold along these axes provides a much better understanding of what really happens, as 3 coordinates or vectors provide better than 90% accuracy around any given datapoint.
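
For those who want a concrete feel for the framework, here is a toy sketch in Python of how a product or datapoint might be located on the three axes.  The axis values and the example entry are illustrative assumptions on my part, not the actual InfoStack implementation.

```python
# Toy sketch: locating an industry datapoint on three InfoStack-style axes.
# Axis values and the example entry are illustrative, not the real model.
from dataclasses import dataclass

LAYERS = ["physical", "datalink", "network", "transport",
          "session", "presentation", "application"]   # OSI layers 1-7
GEOGRAPHIES = ["LAN", "MAN", "WAN"]                    # traffic dispersion
SEGMENTS = ["consumer", "enterprise", "government"]    # market segments

@dataclass
class Datapoint:
    name: str
    layer: str       # horizontal service layer (axis 1)
    geography: str   # geographic dispersion (axis 2)
    segment: str     # application/market segment (axis 3)

    def coordinates(self):
        """Return the 3-vector locating this datapoint in the framework."""
        return (LAYERS.index(self.layer),
                GEOGRAPHIES.index(self.geography),
                SEGMENTS.index(self.segment))

# e.g. a dial-up ISP: layer-3 routing, MAN access, consumer market
isp = Datapoint("dial-up ISP", "network", "MAN", "consumer")
print(isp.coordinates())   # -> (2, 1, 0)
```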

The book spans the time between the late 1950s and the early 1990s, but focuses principally on the late 1960s and early 1970s.  Computers were enormously expensive and shared by users, mostly on a local basis because of high costs and slow connections.  No mention is made of the struggle modems and hardware vendors had getting access to the closed telephone system, and PCs had yet to burst on the scene.  The issues around the high-cost monopoly communications network run by AT&T are only briefly mentioned; their impact and import are lost on the reader.

The book makes no mention that by the 1980s development of what became the internet ecosystem was really picking up steam.  After struggling since the 1950s to get a foothold on the "closed" MaBell system, smartmodems burst on the scene in 1981.  Modems accompanied technology developments that had been occurring with fax machines, answering machines and touchtone phones; all generative aspects of a nascent competitive voice/telecom market.

Then, in 1984, AT&T was broken up and an explosion in WAN (long-distance) competition drove pricing down, while advanced intelligent networks increased the possibility of dial-around bypass.  (Incidentally, by the 1990s touchtone penetration in the US was over 90%, versus less than 20% in the rest of the world, driving not only explosive growth in 800 calling but also VPN and card calling, and, last but not least, the simple "touchtone" numeric pager, one of the precursors to our digital cellphone revolution.)  The Bells responded to this potential long-distance bypass threat by seeking regulatory relief with expanded calling areas and flat-rate calling to preserve their Class 5 switch monopoly.

All the while second-line growth exploded, primarily as people connected fax machines, and modems for their PCs to reach commercial ISPs (CompuServe, Prodigy, AOL, etc.).  These ISPs benefited from low WAN costs (competitive transit in layer 2), inexpensive routing (compared with voice switches) in layer 3, and low-cost channel banks and DIDs in those expanded LATAs, to which people could dial up flat-rate (read "free") and remain connected all day long.  The US was the only country in the world with that type of pricing model in the 1980s and early 1990s.

Another foundation of the internet ecosystem, the PC, burst from the same lab (Xerox PARC) that was run by one of the founders of the ARPAnet, Bob Taylor, who arguably deserves equal or more credit than Bob Kahn or Vint Cerf (inventors of TCP) for development of the internet.  As well, the final two technological underpinnings that scaled the internet, Ethernet and the graphical user interface later popularized by Windows, were developed at Xerox PARC.  These technology threads should have been better developed in the book, given their role in the demand for and growth of the internet from the edge.

In the end, what really laid the foundation for the internet were numerous parallel efforts that developed outside the monopoly network and highly regulated information markets.  These were all "generative," to quote Zittrain.  (And as I said a few weeks ago, they were accidental.)  These parallel streams evolved into an ecosystem onto which WWW, HTTP, HTML and Mosaic were laid--the middle and upper layers--of the 1.5-way, store-and-forward, database-lookup "internet" of the early to mid 1990s.  Ironically and paradoxically this ecosystem came together just as the Telecom Act of 1996 was being formed and passed; underscored by the fact that the term "internet" is mentioned only once in the entire Act, one of the reasons I labeled the Act "farcical" back in 1996.

But the biggest error of the book, in my opinion, is not the omission of all these efforts that ran in parallel with the development of TCP/IP, and the failure to give them due weight in the internet ecosystem, but rather the concluding notion that TCP/IP won and the OSI reference model lost.  This was disappointing and has had a huge, negative impact on perception and policy.  What the authors should have said is that a horizontally oriented, low-cost, open protocol, as part of a broader similarly oriented horizontal ecosystem, beat out vertically integrated, expensive, closed and siloed solutions from monopoly service providers and vendors.

With a distorted view of history it is no wonder that the list of ironic and unfortunate paradoxes in policy and market outcomes goes on and on; people don't fully understand what happened between TCP/IP and OSI and how the two are inextricably linked.  Until history is viewed and understood properly, we will be doomed, in the words of Santayana, to repeat it.  Or, as Karl Marx said, history repeats itself, "first as tragedy, then as farce."

Friday, January 18 2013

Last summer I attended a Bingham event at the Discovery Theatre in NYC's Times Square celebrating the Terracotta Warriors of China's first emperor, Qin Shi Huang.  What struck me was how far our Asian ancestors had advanced technically, socially and intellectually beyond our western forefathers by 200 BC.  His reign, which included the building of major transportation and information networks, was followed by a period of nearly 1,500 years of relative peace (and stagnation) in China.  It would take another 1,000 years for the West to catch up, during periods of war, plague and socio-political upheaval.  But once westerners passed their Asian brethren by the 15th and 16th centuries they never looked back.  Having just finished The Art of War by Sun Tzu, I asked myself: are war and strife necessary for mankind to advance?

This question was reinforced over the holidays when I visited the Loire Valley in France, which most people associate with beautiful chateaux, a rich fairy-tale medieval history, and good wines.  What most people don't realize is that the Loire was a war-torn area for the better part of 400 years as the French (Counts of Blois) and English (Counts of Anjou, precursors to the Plantagenet dynasty of England) vied for domination of a then emerging Europe.  The parallels between China and France 1,000 years later couldn't have been more poignant.

After the French finally kicked the English out in the 1400s, this once war-torn region became the center of the European renaissance and later the birthplace of the age of enlightenment.  François I brought Leonardo from Italy for the last 3 years of the artist's life, and the French seized upon his way of thinking; to be followed a few centuries later by Voltaire and Rousseau.  The French aristocracy, without wars to fight, invited such thinkers to stay in their chateaux, built on the fortifications of medieval castles, and develop the enduring principles of liberty, equality and fraternity.  These in turn became foundations upon which America broadly based its constitution and structure of government; all of which in theory supports and leads to competitive markets and network neutrality, the basis of the internet.

And before I left on my trip, I bought a Kindle version of Sex, Bombs and Burgers by Peter Nowak on the recommendation of an acquaintance at Bloomberg.  Nowak's premise is that much of America's advancement and success over the past 50 years rests on our warrior instincts and the need to procreate and sustain life.  I liked the book and recommend it to anyone, especially as I used to quip, "Web 1.0 of the 1990s was scaled by the 4 (application) horsemen: Content, Commerce, Communication and hard/soft-Core porn."  But the book also provides great insights beyond the growth of porn on the internet, into our food industry and where our current military investments might be taking us physically and biologically.

While the book meanders on occasion, my take-away, and the answer to my question above, is that war (and the struggle to survive by procreating and eating) increases the rate of technological innovation, which often results in new products; themselves often mistakes or unintended commercial consequences of their original military intent.  War increases the pace of innovation out of necessity, intensity and focus.  After all, our state of fear is unnaturally heightened when someone is trying to kill us, underscoring the notion that fear and greed, not love and happiness, are man's primary psychological and commercial motivators.

Most people generally believe the internet is an example of a technological innovation hatched from the militarily driven space race, which is the premise of another book I am just starting, Where Wizards Stay Up Late by Hafner and Lyon.  What most people fail to realize, including Nowak, is that the internet was an unintended consequence of the breakup of AT&T in 1984; another type of conflict, an economic war that had been waged from the 1950s through the 1970s.  In that war we had General William McGowan of MCI (microwave, the M in MCI, was a technology principally scaled during WWII) battling MaBell alongside his ally the DOJ.  At the same time, a group of civilian scientists in the Pentagon had been developing the ARPAnet, a weapon/tool built to get around MaBell's monopoly long-distance fortifications and enable low-cost computer communications across the US and globally.

The two conflicts aligned in the late 1980s as the remnants of MaBell, the Baby Bells, sought relief from state and federal regulators from a viciously competitive WAN/long-distance sector in order to preserve two arcane, piggishly profitable monopoly revenue streams: intrastate tolls and terminating access.  The relief provided was to expand local calling areas (LATAs) and move to flat-rate (all-you-can-eat) pricing models.  By then modems and routers, outgrowths of ARPA-related initiatives, had gotten cheap enough that the earliest ISPs could cost-effectively build and market their own layer 1-2 nationwide "data bypass" networks across 5,000 local calling areas.

These networks allowed people to dial up a free or low-cost local number and stay connected to a computer, database or server anywhere, all day long.  The notions of "free" and "cheap" and the collapse of distance were born.  The internet started and scaled in the US because of partially competitive communications networks, which no one else had in 1990.  It would be 10 years before the rest of the world had an unlimited, flat-rate access topology like the US.

Only after these foundational (pricing and infrastructure) elements were in place did the government allow commercial nets to interconnect via the ARPAnet in 1988.  This was followed by Tim Berners-Lee's WWW in 1989 (an address simplification standard), and HTTP and HTML in subsequent years, providing the basis for a simple-to-use, mass-market browser, Mosaic, the precursor to Netscape, in 1993.  The result was the Internet, or Web 1.0: a 4- or 5-layer asynchronous communications stack mostly used as a store-and-forward database lookup tool.

The internet was the result of two wars fought against monopolies: the Soviet communists and the American bellheads; both of which, ironically, share(d) common principles.  Participants and commentators in the current network neutrality, access/USF reform and ITU debates, including Nowak, should be aware of these conflict-driven beginnings of the internet, in particular the power and impact of price, as it would modify their positions significantly.  Issues like horizontal scaling, vertical disintermediation and completeness, balanced settlement systems, and open/equal access need to be better analyzed and addressed.  What we find in almost every instance, on the part of every participant in these debates, are hypocritical and paradoxical positions, since people do not fully appreciate history and how they arrived at their relative and absolute positions.

Friday, January 11 2013

A year ago it was rumored that 250 Apple employees were at CES 2012, even as the company refused to participate directly.  The company could do no wrong and didn't need the industry.  For the better part of 9 months that appeared to be the case, and Apple's stock outperformed the market by 55%.  But a few months on, after a screen size too small for phones and too big for tablets, a mapping app too limited, and finally a buggy OS, Apple's excess performance over the market has narrowed to 10%.

Two major themes of this year's CES--mobile device screen size and extensive application ecosystems that connect just about anything--will place Apple's mobile dominance and lead further in doubt.  To us this was already in evidence last year when we talked about the singularity.  But the real reason is now becoming apparent to all: Apple wants to keep people siloed in its product-specific verticals.  People and their applications don't want that, because the cloud lets people update, access and view information across 3 different screens and any platform.  If you want Apple on one device, all your devices have to be Apple.  It's a twist on the old Henry Ford maxim: "you can have any device…as long as it is Apple."

This strategy will fail further when the access portion of the phone gets disconnected from all the other components of the phone.  It may take a few years, but it will make a lot of sense to just buy an inexpensive dongle or device that connects to the 4G/5G/WiFi network (metro-local, or MAN/LAN) and radiates Bluetooth, NFC and WiFi to a plethora of connected devices in the personal area network (PAN).  Imagine how long your "connection hub" would last if it didn't need to power a screen and a huge processor for all the different apps.  There goes your device-centric business model.

And all that potential device and application/cloud supply-side innovation means that current demand is far from saturated.  The most recent Cisco forecasts indicate that one-third of ALL internet traffic will be from mobile devices by 2016, and in the US 37% of mobile access will be via WiFi.  Applications that utilize and benefit from mobility and transportability will continue to grow, as overall internet access via a fixed computer drops to 39% from 60% today.

While we believe this to be the case, the reality today is far different according to Sandvine, the broadband policy management company.  This should cause the wireless carriers some concern as they look at future capacity costs.  In its recent H2-2012 report Sandvine reveals that power smartphone users are already using 10x more than average smartphone users: 317 megabytes a month versus 33.  But even the former number is a far cry from the 7.3 gigabytes (roughly 20x) that the average person uses on their fixed broadband pipe (assuming 2.3 people per fixed broadband line).  Sandvine estimates that total mobile access will grow from ~1 petabyte in H2-2012 to 17 petabytes by H2-2018.
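
For the spreadsheet-inclined, here is a quick back-of-the-envelope check of those ratios.  The inputs are the figures quoted above; the 2.3 people per fixed line is the stated assumption.

```python
# Quick check of the usage ratios cited above.  Inputs are the Sandvine
# figures quoted in the text; 2.3 people per line is the assumption used.
avg_mobile_mb   = 33      # average smartphone user, MB/month
power_mobile_mb = 317     # power smartphone user, MB/month
fixed_per_person_gb = 7.3 # per-person fixed broadband, GB/month
people_per_line = 2.3     # assumed household sharing per fixed line

print(power_mobile_mb / avg_mobile_mb)               # ~9.6x  -> "10x"
print(fixed_per_person_gb * 1000 / power_mobile_mb)  # ~23x   -> "roughly 20x"
print(fixed_per_person_gb * people_per_line)         # ~16.8 GB per fixed line
```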

My own consumption, since moving from 3G to 4G and from a 4-inch to a 4.7-inch screen, has increased 10x: 1-2 gigabytes of 4G access and 3-6 gigabytes of WiFi access, for a total of 4-8 gigabytes a month.  This is because I have gone from a "store and forward" mentality to a 7x24 multimedia consumption model.  And I am just getting comfortable with cloud-based access and streaming.  All this sounds positive for growth and investment, especially as the other 95% of mobile users evolve to these usage levels, but it will do the carriers no good if they are not strategically and competitively well positioned to handle the demand.  Look for a lot of development in the lower access and transport layers, including WiFi offload and fiber and high-capacity microwave backhaul.

Related Reading:

Smartphones use more data than tablets for first time.
 
Broadcom develops small, multi-modal, multi-band modem chips!

 

Friday, August 17 2012

How To Develop A Blue Ocean Strategy In A Digital Ecosystem

Back in 2002 I developed a 3-dimensional macro/micro framework-based strategy for Multex, one of the earliest and leading online providers of financial information services.  The result was to sell themselves to Reuters in a transaction that benefited both companies.  1+8 indeed equaled 12.  What I proposed to the CEO was simple: do "this" to grow into a $500m company, or sell yourself.  After 3-4 weeks of mulling it over, he took a plane to London and sold his company rather than undertake the "this".

What I didn't know at the time was that the "this" was a Blue Ocean Strategy (BOS): creating new demand by connecting previously unconnected qualitative and quantitative information sets around the "state" of the user.  For example, a portfolio manager might focus on biotech stocks in the morning and make outbound calls to analysts to answer certain questions.  Then the PM goes to a chemicals lunch and returns to focus on industrial products in the afternoon, at which point one of the biotech analysts gets back to him.  Problem: the PM's mental and physical "state," or context, is gone.  Multex had the ability to build a tool that could bring the PM back to his morning "state" in his electronic workplace.  Result: faster and better decisions.  Greater productivity, possibly better performance, definite value.

Sounds like a great story, except there was no BOS in 2002; the concept was introduced in 2005.  But the second slide of my 60-slide strategy deck to the CEO had this quote from the authors of BOS, W. Chan Kim and Renée Mauborgne of INSEAD, the Harvard Business School of Europe:

“Strategic planning based on drawing a picture…produces strategies that instantly illustrate if they will: stand out in the marketplace, are easy to understand and communicate, and ensure that every employee shares a single visual reference point.”

So you could argue that I anticipated the BOS concept to justify my use of 3D frameworks, which were meant to illustrate this entirely new playing field for Multex.

But this piece is less about the InfoStack's use in business and sports and more about the use of the 4Cs and 4Us of supply and demand as tools within the frameworks to navigate rapidly changing and evolving ecosystems.  And we use the BOS graphs postulated by Kim/Mauborgne.  The 4Cs and 4Us let someone introducing a new product, horizontal layer (exchange) or vertical market solution (service integration) figure out optimal product, marketing and pricing strategies and tactics a priori.  A good example is a BOS I created for a project I am working on in the WiFi offload and HetNet (heterogeneous access networks that can be self-organizing) area, called HotTowns (HOT).  Here's a picture of it comparing 8 key supply and demand elements across fiber, 4G macro cellular and super-saturation offload in a rural community.  Note that the "blue area" representing the results of the model can be enhanced on the capacity front by fiber and on the coverage front by 4G.
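
Since the picture may not render for every reader, here is a minimal sketch of how such a strategy canvas can be drawn.  Beyond capacity and coverage, which are named above, the factor labels and all the scores are illustrative placeholders I have invented, not the actual HotTowns model values.

```python
# Minimal strategy-canvas sketch in the Kim/Mauborgne style, comparing
# three access approaches across 8 supply/demand factors.  Factor labels
# (other than capacity and coverage) and all scores are placeholders.
import matplotlib.pyplot as plt

factors = ["capacity", "coverage", "cost", "control",          # "4Cs" (assumed)
           "usability", "ubiquity", "uptime", "upgradability"]  # "4Us" (assumed)
offerings = {
    "fiber":              [10, 2, 4, 8, 6, 2, 9, 9],
    "4G macro cellular":  [3, 9, 3, 7, 7, 8, 7, 5],
    "WiFi offload (HOT)": [7, 6, 9, 5, 8, 6, 6, 8],
}

for name, scores in offerings.items():
    plt.plot(factors, scores, marker="o", label=name)
plt.xticks(rotation=30)
plt.ylabel("relative offering level (0-10, illustrative)")
plt.title("Strategy canvas: rural access alternatives (sketch)")
plt.legend()
plt.tight_layout()
plt.show()
```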

The same approach can be used to rate mobile operating systems or any other product at a boundary of the InfoStack, or any horizontal or vertical solution in the market.  We'll do some of that in upcoming pieces.

 

 

Sunday, July 15 2012

I met the godfather of New York venture capital a few weeks ago and told him about an arbitrage opportunity of a lifetime in the communications sector.  I started talking about the lack of competition and resulting high prices (which I highlighted last week), brought about by bandwidth being 20-150x overpriced.  He just looked at me and said, "Bandwidth issue?  What bandwidth issue!"  It just so happens that his current prize investment is an IPTV application.  I just rolled my eyes thinking, "If he only knew!", remembering what happened to all the Web 1.0 companies that ran into the broadband brick wall in 2000.

This statement is symptomatic of the complacency amongst the venture community, which is investing billions in the upper layers of the stack.  Yet people on Main Street know otherwise: the Kansas City fiber video on the Fiber To The Home Council website indicates that 1,000 communities responded to the contest, with over 200,000 people directly involved.

The numbers tell a worse story.  Because of the CLEC boom-bust 10-15 years ago, the rescission of equal access, the failure of muni-WiFi and WiMax, and BTOP crowding-out, telecom venture spending has disconnected from other venture spending over the past decade.  Based on overall VC spending, telecom spending should be 2-3x greater than it is.  Instead it stands 70% below where it was from 1995-2005.  It took a while for competition to die, but now it is official!

Venture spending for the sector, which used to average 15-20% of total VC spending, has been below 5% for the past 3 years.  All the other TMT sectors have held nearly constant with overall VC spending.

Everyone should look at these numbers with alarm and reach out to policy makers, academics, trade folks, the venture community and capital markets to make them aware of the dearth of investment that has resulted from the lack of competition.  Now, more than ever, contrarian investors should look at the monopoly pricing and realize there are significant profits to be made at all layers of the stack.

Sunday, July 08 2012

Thursday, December 19, 2013 will mark the 100th anniversary of the Kingsbury Commitment.  There are 528 days remaining.  Let's plan something special to observe this tragic moment.

In return for universal service, AT&T was granted a "natural monopoly."  The democratic government in the US, one of the few at the time, recognized the virtue of open communications for all and foolishly agreed to Theodore Vail's deceptions.  Arguably, this one day changed the course of mankind for 50-70 years.  Who knows what might have been had we fostered low-cost communications in the first half of the century?

Anyway, when universal service didn't happen (no sh-t, sherlock), the government stepped in to mandate it in 1934.  So on top of an overpriced monopoly, the American public was taxed to ensure 100% of the population got the benefit of being connected.  Today that tax amounts to $15 billion annually to support overpriced service to less than 5% of the population.  (Competitive networks have shown how this number gets driven to zero!)

Finally, in the early 1980s, after nearly 30 years of trying (the final case started in 1974 and took nearly 9 years), the Department of Justice got a judge to break up the monopoly into smaller monopolies and provide "equal access" to competitors across the long-distance piece, starting and ending at the Class 5 (local switch and calling) boundary.  The AT&T monopoly was dead; long live the Baby Bell monopolies!  But the divestiture began a competitive long-distance (WAN) digitization "wave" in the 1980s that resulted in, amongst other things:

  • 99% drop in pricing over 10 years
  • 90% touchtone penetration by 1990 vs 20% ROW
  • Return of large volume corporate traffic via VPN services and growth of switched data intranets
  • Explosion of free 800 calling (nearly 50% of traffic by 1996)
  • Over 4 (upwards of 7 in some regions/routes) WAN fiber buildouts
  • Bell regulatory relief on intraLATA tolls via expanded calling areas (LATAs)
  • Introduction of flat-rate local pricing by the Bells

The latter begat the internet, the second wave of digitization, in the early 1990s.  The scaling of Wintel, driven by the internet, paved the way for low-cost digital cellphones, the third wave of digitization, in the late 1990s.  (Note that both the data and wireless waves were supported by forms of equal access.)  By 1999 our economy had returned to the forefront of the global scene, our budget was balanced, and we were in a position to pay down our national debt.  I expected the 4th and final wave, last-mile (broadband) digitization, to start sometime in the mid-to-late 2000s.  It never came.  In fact the opposite happened, because of 3 discrete regulatory actions:

  • 1996 Telecom Act
  • 2002 Special Access Deregulation
  • 2004 Rescission of Equal Access and Bell entry into Long Distance (WAN)

Look at the following 6 charts and try not to blink or cry.  In all cases, there is no reason why prices in the US are not 50-70% lower, if not more.  We have the scale.  We have the usage.  We have the industries.  We have the technology.  We started all 3 prior waves and should have oriented our vertically integrated service providers horizontally, a la the data processing industry, to deal effectively with rapid technological change.  Finally, we have Moore's and Metcalfe's laws, which argue for a near 60% reduction in bandwidth pricing and/or improved performance annually!
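
To see what that compounding means, run the arithmetic: a 60% annual decline leaves roughly a hundredth of a percent of the starting price after a decade, while the 99%-per-decade declines of the prior waves imply roughly 37% a year.  This is pure arithmetic on the figures above, nothing more.

```python
# Compounding the ~60%/year decline argued for by Moore's and Metcalfe's
# laws, versus the ~99%-per-decade declines of the prior waves.
annual_decline = 0.60

price = 1.0
for year in range(10):
    price *= (1 - annual_decline)
print(f"after 10 years at 60%/yr: {price:.6f} of starting price")  # ~0.0001

# Conversely, a 99% drop over 10 years implies ~37% per year:
print(f"implied annual decline: {1 - 0.01 ** (1 / 10):.3f}")       # ~0.369
```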

But the government abetted a remonopolization of the sector over the past 15 years.

It's almost a tragedy to be American during this July 4 week.  The FCC and the government killed the competition brought about by Bill McGowan.  But in 2007 Steve Jobs resurrected equal access and competition.  So I guess it's great to be American after all!  Many thanks to Wall and the Canadian government for these stats.

[Six charts comparing US broadband and wireless prices internationally]
Related Reading:

New America Foundation Global Cost of Connectivity (It's bad in the US!)

Sunday, June 03 2012

Since I began covering the sector in 1990, I've been waiting for Big Bang II.  An adult flick?  No, the sequel to Big Bang (aka the breakup of MaBell and the introduction of equal access) was supposed to be the breakup of the local monopoly.  Well, thanks to the Telecom Act of 1996 and the well-intentioned farce that it was, that didn't happen, and equal access officially died (equal access RIP) in 2005 with the Supreme Court's Brand X decision upholding the FCC.  But if it died, we have since seen a resurrection that few noticed.

I am announcing that equal access is alive and well, albeit in a totally unexpected way.  Thanks to the epochal demands Steve Jobs put on AT&T to counter its terrible 2G/3G network coverage and throughput, every smartphone has an 802.11 (WiFi) backdoor built in.  Together with the Apple and Google operating systems being firmly out of carriers' hands and scaling across other devices (tablets, etc.), a large ecosystem of over-the-top (OTT), unified communications and traffic offloading applications is developing to attack the wireless hegemony.

First, a little history.  Around the time of AT&T's breakup the government implemented 2 forms of equal access.  Dial-1 in long distance made marketing- and application-driven voice resellers out of the long-distance competitors.  The FCC also mandated A/B cellular interconnect to ensure nationwide buildout of both cellular networks; this was extended to nascent PCS providers in the early to mid 1990s, leading to dramatic price declines and enormous demand elasticities.  Earlier, the competitive WAN/IXC markets of the 1980s led to rapid price reductions and to monopoly (Baby Bell or ILEC) pricing responses that created the economic foundations of the internet in layers 1 and 2: flat-rate, or "unlimited," local dial-up.  The FCC protected the nascent ISPs by preventing the Bells from interfering at layer 2 or above.  Of course this distinction of MAN/LAN "net neutrality" went away with the advent of broadband, and today it is really just about WAN/MAN fights between the new (converged) ISPs or broadband service providers like Comcast, Verizon, etc. and the OTT or content providers like Google, Facebook, Netflix, etc.

(Incidentally, the FCC ironically refers to edge access providers, who have subsumed the term "internet service providers," as "core" providers, while the over-the-top messaging, communications, e-commerce and video streaming providers, who reside at the real core or WAN, are referred to as "edge" providers.  There are way, way too many inconsistencies for truly intelligent people to a) come up with and b) continue to promulgate!)

But a third form of equal access, this one totally unintended, happened with 802.11 (WiFi) in the mid 1990s.  WiFi became "nano-cellular" in that its power output was regulated, limiting hot-spot or cell size to ~300 feet.  This had the effect of making the frequency band nearly infinitely divisible.  The combination was electric, and the market, unencumbered by monopoly standards and scaling along with related horizontal layer 2 data technologies (Ethernet), quickly seeded itself.  It really took off when Intel built WiFi capability directly into its Centrino chips in the early 2000s.  Before then computers could only access WiFi with USB dongles or cables tethered to 2G phones.

Cisco just forecast that 50% of all internet traffic will be generated from 802.11 (WiFi) connected devices.  Given that 802.11's costs are 1/10th those of 4G, something HAS to give for the communications carriers.  We've talked about their need to better address the pricing paradox of voice and data, as well as the potential for real obviation at the hands of the application and control layer worlds.  While they might think they have a near monopoly on the lower layers, Steve Jobs' ghost may well come back to haunt them if alternative access networks/topologies get developed that take advantage of this equal access.  For these networks to happen, providers will need to think digital, understand, project and foster vertically complete systems, and be able to turn the "lightswitch on" for their addressable markets.

Sunday, April 29 2012

The first-quarter global smartphone stats are in and it isn't even close.  With the market growing more than 40%, Samsung controls 29% and Apple 24%.  The next largest, Nokia, came in 60-70% below the leaders at 8%, followed by RIM at 7% and HTC at 5%, leaving the scraps (28%) to Sony, Motorola, LG and ZTE.  They've all already lost on the scale front; they need to change the playing field.

While this spread sounds large and improbable, it is not without historical precedent.  In 1914, just 6 years after its introduction, the Ford Model T commanded 48% market share.  Even by 1923 Ford still held 40%.  Two years later the price stood at $260, roughly 30% of the original model's 1908 price and less than 10% of what the average car cost in 1908; sounds awfully similar to Moore's law and the pricing of computer/phone devices over the past 30 years.  Also, a read on the Model T's technological and design principles sounds a lot like pages taken out of the book of Apple.  Or is it the other way around?

Another similarity was Ford's insistence on the use of black beginning in 1914.  Over the life of the car 30 different variations of black were used!  The color limitation was a key ingredient of the low cost; prior to 1914 the company used blue, green, red and grey.  Still, 30 variations of black (just like Apple's white-and-black-only, take-it-or-leave-it, siloed product approach) is impressive, and is eerily similar to Dutch master Frans Hals' use of 27 variations of black, so inspirational to Van Gogh.  Who says we can't learn from history?

Ford's commanding lead continued through 1925 even as competitors introduced many new features, designs and colors.  Throughout, Ford was the price leader, but when the end came for that strategy it was swift.  Within 3 years the company had completely changed its product philosophy, introducing the Model A (with 4 colors and no black) and running up staggering losses in 1927-28 in the process.  But the company saw market share rebound from 30% to 45%; something that might have been maintained for a while had the Depression not hit.

The parallels between the smartphone and the automobile seem striking.  The networks are the roads.  The pricing plans are the gasoline.  Cars were the essential component of economic advancement in the first half of the 20th century, just as smartphones are the key to economic development in the first half of the 21st.  And now we find Samsung playing GM to Apple's Ford; only Samsung is taking a page from both companies' histories.  Apple would do well to take note.

So what are the laggards to do to make it an even race?  We don't think Nokia's strategy of coming out with a different color will matter.  Nor do we think more features will matter, nor will price/cost.  So the only answer lies in context; something we have raised in the past in our outlook for the carriers.  More to come on how context can be applied to devices.  Hint: I said devices, not smartphones.  We'll also explore what sets Samsung and Apple apart from the rest of the pack.

Related Reading:

Good article on Ford and his maverick ways; qualities which Jobs also possessed.

 

Sunday, April 22 2012

I love talking to my smartphone and got a lot of grief for doing so from my friends last summer while on vacation at the shore.  "There's Michael talking crazy again," as I was talking TO my phone, not through it.  "Let's ask Michael to 'look' that up! Haha."  And then Siri came along and I felt vindicated and simultaneously awed; the latter by Apple's (AAPL, OCF = 6.9x) marketing and packaging prowess.  Seemingly they had trumped Google/Android (GOOG, OCF = 9.6x) yet again.  Or had they?  What at first appeared to be a marketing coup may become Tim Cook's Waterloo and sound a discordant note for Nuance (NUAN, OCF = 31x).  At its peak in early February NUAN hit a high of 42x OCF, even as Apple stood at 8.2x.

The problem for Apple and Nuance in the short term is that the former is doubling down on its Siri bet with a brand-new round of ads featuring well-known actors such as Samuel L. Jackson and Zooey Deschanel.  Advertising pundits have noted this radical departure for a company that historically shunned celebrities.  Furthermore, with two class action suits (NY and LA) against it and financial pundits weighing in on the potential consumer backlash, Apple could have a major Siri migraine in the second half.  Could it be as big as Volvo's advertising debacle 20 years ago; a proverbial worm in the apple?  Time will tell.

The real problem isn't with Apple, the phone, or Siri and its technology DNA; rather, it lies with the bandwidth and performance of the wireless networks.  Those debating whether Siri is a bandwidth hog are missing the point.  Wireless spectrum is shared: the more people on the same band, the less bandwidth for each user and the higher the noise.  Both are bad for applications like Siri and Android's voice recognition, since they talk to processors in the cloud (or WAN).  Delays and missed packets have a serious impact on comprehension, translation and overall responsiveness.  Being a frequent user of Google's voice-rec, which has been around since August 2010, I know when to count on voice-rec by looking at the WiFi/3G/2G indicator and the number of bars.  But do others know this, and will the focus on Siri's performance shift to the network?  Time will tell.
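
A toy model makes the shared-spectrum point concrete: per-user throughput falls roughly as cell capacity divided by active users.  The cell capacity figure and the threshold below which cloud voice-rec starts to degrade are illustrative assumptions on my part, not measured values.

```python
# Toy illustration of shared spectrum: each active user in a cell gets
# roughly capacity/N, and cloud-based voice-rec feels the squeeze first.
# Capacity and degradation threshold are illustrative assumptions.
def per_user_mbps(cell_capacity_mbps: float, active_users: int) -> float:
    """Evenly shared air interface: each active user gets capacity/N."""
    return cell_capacity_mbps / max(active_users, 1)

CELL_CAPACITY = 20.0  # Mbps, assumed aggregate sector throughput

for users in (1, 5, 20, 50, 100):
    rate = per_user_mbps(CELL_CAPACITY, users)
    verdict = "fine" if rate > 0.5 else "voice-rec likely degrades"
    print(f"{users:3d} active users -> {rate:5.2f} Mbps each ({verdict})")
```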

The argument that people know voice-rec is new and still in beta probably won't cut it either.  I am puzzled that major Apple fanboys don't perceive it as a problem; it didn't even make this list of 5 critical product issues for the company to address.  But maybe that's why companies like Apple are successful; they push the envelope.  In the 1990s I told the WAN guys (Sprint, et al.) that they would have the advantage over the Baby Bells because the "cloud," or core, was more important than the edge.  The real answer, which Apple fully understands, is that the two go hand in hand.  For Sprint (S, OCF = 3.2x) there were too many variables they couldn't control, so they never rolled out voice recognition on wired networks.  They probably should have taken the chance.  Who knows, the "pin-drop" company could have been a whole lot better off!

Related Reading:
First impressions of the i4s from an Android user, and pros/cons of Siri

A natural opposite of "Siri" for the droid crowd would be an "Eliza," after Audrey Hepburn's famous 'The Rain in Spain' character.  Eliza is also the name of a company specializing in voice-rec healthcare apps, and of a 1960s AI psychotherapy program.
 

 

Sunday, March 18 2012

Previously we have written about "being digital" in the context of shifting business models and approaches as we move from an analog world to a digital one.  Underlying this change have been 3 significant tsunami waves of digitization in the communications arena over the past 30 years, underappreciated and unnoticed by almost all until after they had crashed onto the landscape:

  • The WAN wave, between 1983 and 1990, in the competitive long-distance market, continuing through the 1990s;
  • The Data wave, a direct outgrowth of the first, began in the late 1980s with flat-rate local dial-up connections to ISPs and databases anywhere in the world (aka the Web);
  • The Wireless wave began in the early 1990s and was a direct outgrowth of the first two.  Digital cellphones were based on the same technology as the PCs that were exploding with internet usage.  Likewise, super-low-cost WAN pricing paved the way for one-rate national pricing plans; prices dropped from $0.50-$1.00 per minute to less than $0.10.  Back in 1996 we correctly modeled this trend before it happened.

Each wave may have looked different, but they followed the same patterns, building on each other.  As unit prices dropped 99%+ over a 10-year period, unit demand exploded, resulting in 5-25% total market growth.  In other words, as ARPu (average revenue per unit) dropped, ARPU (average revenue per user) rose; u vs U, units vs users.  Elasticity.
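
The implied arithmetic is worth spelling out.  Assume, purely for illustration, a 99% unit price decline over a decade and total market (revenue) growth in the middle of the 5-25% range; unit volumes then have to grow roughly 400x for the numbers to reconcile.

```python
# Implied elasticity behind "prices drop 99%, market still grows."
# The 15%/year growth rate is an illustrative pick from the 5-25% range.
price_ratio = 0.01            # unit price after 10 years (99% drop)
revenue_ratio = 1.15 ** 10    # ~4.05x total market after 10 years
unit_ratio = revenue_ratio / price_ratio

print(f"units must grow ~{unit_ratio:.0f}x over the decade")  # ~405x
# ARPu falls 99%, yet ARPU can still rise if each user consumes
# enough additional units.
```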

Yet with each new wave, people remained unconvinced about demand elasticity.  They were simply incapable of pivoting from the current view and extrapolating to a whole new demand paradigm.  Without fail, demand exploded each time, coming from 3 broad areas: the private-to-public shift, normal price elasticity, and application elasticity.

  • Private-to-Public Demand Capture.  Monopolies are all about average costs and consumption, with little regard for the margin.  As a result, they lose the high-volume customers who can develop their own private solutions.  This loss diminishes the scale economies of those who remain on the public, shared network, raising average costs; the network effect in reverse.  Introducing digitization and competition drops prices and brings back not all, but a significant number, of these private users.  Examples include the private data networks, voice networks, radio networks and computer systems that all came back onto the public networks in the 1980s and 1990s.  Incumbents can't think marginally.
  • Normal Price Elasticity.  As prices drop, people use more.  It gets to the point where they forget how much it costs, since the relative value is so great.  One thing to keep in mind is that lazy companies can rely too much on price and "all-you-can-eat" plans without regard for the real marginal-price-to-marginal-cost spread.  The correct approach requires the right mix of pricing, packaging and marketing so that all customers at the margin feel they are deriving much more value than what they are paying for, thus generating the highest margins.  Apple is a perfect example of this, as was Sprint's famous "Dime" program.  The failure of AYCE wireless data plans has led wireless carriers to implement arbitrary pricing caps, leading to new problems.  Incumbents are lazy.
  • Application Elasticity.  The largest and least definable component of demand is the new ways of using the lower-cost product that 3rd parties drive into the ecosystem.  They are the ones that drive true usage via ease of use and better user interfaces.  Arguably they ultimately account for 50% of the new demand, with the other two sources at 25% each.  With each wave there has been a large crowd of value-added resellers and application developers that more effectively ferret out new areas of demand.  Incumbents move slowly.

Demand generated via these 3 mechanisms soaked up the excess supply from the digital tsunamis.  In each case competitive pricing was arrived at ex ante by new entrants developing new marginal cost models and iterating future supply/demand scenarios.  It is this ex ante competitive guess that so confounds the rest of the market, both before and after the event.  That's why few people recognize that these 3 historical waves are early warning signs for the final big one: the 4th and final wave of digitization, which will occur in the mid-to-last-mile broadband markets.

Many remain skeptical of what the "demand drivers" will be.  These last-mile broadband markets are monopoly/duopoly controlled and have not yet seen the per-unit price declines of the prior waves.  Jim Crowe of Level3 recently penned a piece in Forbes that speaks to this market failure.  In coming posts we will illustrate where we think bandwidth pricing is headed, as people remain unconvinced about elasticity, just as before.  But hopefully the market has learned from the prior 3 waves and will believe the demand forecasts when someone comes along and says last-mile unit bandwidth pricing is dropping 99%.  Because it will.

 

