Weekly blogs circa 1997-1998 providing some interesting historical context. Enjoy!
Section 1: Demand
Being All Things To All People Is Easy In A Telecom Commodity World
The Wild West of Telecommunications
Too Little (or Too Much), Too Late?
Who Will Pay the Piper?
Section 2: Technology
Supercomm--It's Certainly Not Just POTS Anymore!
Bring On An OSI Stack Kind Of World
Stacking Telephony Geographically--The US Is Ahead Of Others
Pssst! Let Me Tell You About The Real Paradigm Shift People Aren't Seeing, And How The Reports of The Voice World's Death Are Greatly Exaggerated
Section 3: Network Economics
On IP Telephony, Let's Not Forget The Real Cost of Communications
The Geometry of Networks Overwhelms Moore's Law
How Called Party Pays Are Centralized Intranets--The Corporate Market Is Key
Bill McGowan's Legacy Is Our Current Prosperity
Section 4: Strategy
Notes From The Wireline Front
Frame-Relay Snafus And Other Reasons Why Strategic Planning Is So Important
Notes From The Wireless Front
The Voice Telephony World Is Going To The Data Dogs
Putting Humpty-Dumpty Back Together Again, Part II--Wireless & Data Services Are Key
Section 5: Marketing
What Is The Most Expensive Wireless Minute? The New Economics Of Wireless
What's In A Name?
Paging Leaders Gather; Same Issues As Six Months Ago, But Here's A Siren Call
Will Some Carriers With Guts Please Stand Up At PCS 98?
Section 6: Regulatory
Beware Double Counting Of Revenues, Internet Access, & The Death Of The PC Tech Boom; They Are All Related
How To Bust The WinTel Monopoly
We Hate To Say It, But We Told You So!
To Bust the WinTel Monopoly & Solve Universal Service, Think National!
Regulatory Malaise Creates Strong Case For Wireless Bypass
The FCC Is Solving A Problem With A Problem
Section 1: Demand
Being All Things To All People Is Easy In A Telecom Commodity World
(Originally published May 26, 1998)
Telecommunications is becoming a commodity business...Not! Over the last few weeks I have been presenting my macro overview to internal corporate forums and industry trade conferences. The presentation consists of four parts, in which I: 1) highlight critical issues; 2) illustrate the past, present and future evolution of the industry; 3) discuss near-term strategic opportunities in the areas of wireless and data; and 4) present a framework for understanding the fundamental restructuring of the industry from vertical integration to horizontal differentiation. Throughout the presentation I emphasize such things as intranets, demand drivers, and shifts from the real movement of goods, people, and ideas to a virtual movement of the same.
The everyday challenge of a telecom company is to get, keep, and stimulate a customer's demand. Unfortunately, every customer represents something of a unique demand curve. Just look around you and ask yourself how many people have communications needs identical to your own. Most likely not many! A carrier's marketing matrix must look like a three-dimensional cube of selling a multiplicity of applications (along one axis), across a multiplicity of networks (a second axis), to a multiplicity of demand segments (a third axis). Imagine a cube of 100 x 100 x 100, or 1 million, outcomes. Add to this yet another axis, namely latency. For as we pointed out two weeks ago, people will likely pay different prices for real-time, near-real-time, or store-and-forward information delivery. Now we have a four-dimensional puzzle on our hands. Surely one, two, or five, end-to-end, all-encompassing super telecom networks (STNs) can solve this riddle! Again...Not!
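The cube arithmetic above can be sketched directly. The 100-per-axis figures echo the column's own example; treating latency as exactly three classes (real-time, near-real-time, store-and-forward) is an illustrative assumption, not a figure the column commits to.

```python
# Sizing the marketing matrix described above. The 100-per-axis figures come
# from the column's example; three latency classes is an assumed count.
applications = 100      # applications sold (first axis)
networks = 100          # networks carried over (second axis)
segments = 100          # demand segments served (third axis)

three_d_outcomes = applications * networks * segments
print(three_d_outcomes)     # 1,000,000 outcomes, matching the cube in the text

latency_classes = 3         # real-time, near-real-time, store-and-forward
four_d_outcomes = three_d_outcomes * latency_classes
print(four_d_outcomes)      # 3,000,000 selling outcomes once latency is priced
```

Even at this coarse granularity, the combinatorics make clear why no single end-to-end network could optimize for every cell of the matrix.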
The point of view that the telecom business is a commodity business could only come from a monopoly supply-centric view of the world. Naturally by keeping customer demand static, the engineers and financial types can create vertically integrated STNs. On the other hand, a more demand-driven view of the world understands the infinite demand potential that two-way, multimedia networks will unleash. For instance, video-enabled work at home, or real time product testing, development, and purchase of say, a car, creates a myriad of demand solutions. In such a world, the distinction between content creator and content consumer becomes fuzzy. No one company or entity could ever hope to control the entire process. The result is likely to be multiple competitive providers at and within each layer of the OSI stack that interact with providers above and below them. As we mentioned last week, the topology of the stack has to be refined for WAN/MAN/LAN interfaces and processes. What works for access may not for the backbone and vice versa.
This interaction is controlled by standards. In the data world there is the very real fear that the Wintel standard is becoming a monopoly. The LEC is already a monopoly in the voice world. As we have written in the past, the two are probably interrelated, and if the government were to effectively deal with the latter, the former would probably resolve itself. Instead the government appears to be on separate paths to dealing with these problems, at relatively great cost.
Despite the regulators and monopolies, demand drives inexorably on. This demand will likely become ever more complex. My ten year view of the world is that multiple intranets come into users' homes, the bulk of which they don't pay for, and they don't care over what fiber, copper, or wireless facility they come. If anything, a democratization of communications, in which there is freedom of choice, is occurring, rather than a socialism, or commoditization, of communications with its corresponding choice of one. Think of the four-dimensional marketing matrix the next time you are confronted with "golden-bullet" claims about how the internet will transfigure life as we know it by making all current applications, like voice and fax, into commodities. Not!
The Wild West Of Telecommunications
(Originally published April 6, 1998)
As we rode through yet another telecom frontier town prospecting for digital gold we pulled into a shopping center called Westgate, prompting one of my colleagues to remark that practically every town in America had a "West something or other" shopping plaza. In response I offered that every American shared a predilection to move westward, to seek out change, to develop something new; hence our preference for Westgates over Eastgates. As we parked our car and strode into the local NYC-style bagel shop, I noticed that we were the only "suits" in town. We looked dangerous, like we were the gun-slingers in some modern western drama played out on the digital frontier. By the way, this frontier town happened to be a surprising hot-bed of telecommunications because it sat on a fiber-optic corridor between two major cities and had its very own nationally-renowned university. The alchemy for telecom gold was certainly present in them thar hills!
As I read the papers about prospectors striking it rich on a daily basis, I might not have been too far off the mark in likening today's telecom marketplace to the wild west of a hundred years ago. There is plenty of opportunity and money to be made; almost unlimited horizons, but there are also a lot of hype-driven booms that ultimately could, or should, lead to busts. In fact, just last week a new company was listed on the OTC market worth $10 billion, with nary a dollar of revenue or capital investment to its name. Furthermore the trend of managers from large corporations leaving the sanctuary of the corporate fold to strike out and make paper millions overnight merely on promises seems to be accelerating. Unfortunately corporate life is vastly different from the entrepreneurial experience and there are bound to be hiccups.
According to the Wall Street Journal, in the first quarter of 1998 telecom funds outperformed every other fund, increasing 23%, as compared to 13.8% for the S&P 500 and 12% for the average equity fund. I see two primary trends driving this performance, namely consolidation and new issues. Whereas two years ago, with the passage of the Telecom Act of 1996, it appeared that everybody would be shooting into everybody's space, large-scale mergers actually seem to be taking guns off the street, which the market is roundly applauding. Replacing the old guns seems to be a smaller, more refined group of gentlemen gunners with pinpoint accuracy focusing on specific market opportunities.
After the dust clears from all the stampeding, the question arises whether or not the streets are any safer with, say, 60 guns versus 80 guns. For this drama to have a good ending for the audience of investors, management in the telecom industry has to figure out a way to strike it rich without knocking the other guy off. If the consolidating factions and new players can figure out a way to drive value to the end user (i.e. satisfy and stimulate unperceived demand) then the territory should expand enough so that everybody has enough space. In other words it is not the supply that matters, rather it is the demand that counts. I believe that almost unlimited demand (virgin territory) will drive revenues in this segment from $200 billion to $500 billion over 10 years. The real gold is for the telecom industry to capture a growing portion of GDP through the substitution of a real movement of goods, people, and ideas, to a virtual movement of the same.
Ten years from now we will give the players in this saga names and delineate the good from the bad and the ugly. In the near-term I see telecom management continuing to fortify themselves through "consolidation", giving them the best odds possible in any eventual showdown. But then, as with any conflict, flexibility should win out, causing management to unload unwanted baggage and weapons through a "devolution" of assets not considered core to the primary goal of driving value to the end user. As the industry moves from being vertically to horizontally integrated, management will have to make hard choices of what to keep and what to shed. After all, only a few prospectors who went looking for gold actually struck it rich, but the wild west was developed and other people became rich in the process. I believe the same should happen on the telecom frontier. Call us for assistance in mapping the new frontier. After all, it's great to make a fistful of dollars.
Too Little (or Too Much), Too Late?
(Originally published June 29, 1998)
Too little, too late? The mega-long-distance/cable merger announced last week? Maybe (I'll get back to that in a second), but after our recent trip to Asia, I think the line is more appropriate for the next generation super-sonic commercial transport due out in 2010 (300 passengers, 1,500 mph, airfare only 20% more than current rates, travel time reduced 50%-60%). Maybe the jet guys hadn't factored in the global communications boom over the past ten years. In any event the rapid growth in global telecoms (with technology advances, each new cross-ocean fiber seems to add as much capacity as all previous lines before it, which is a good thing since cross-ocean fibers can't be upgraded as easily as land-based systems) is certainly making the global economy a smaller place. The only problem is that the physical distances haven't gotten shorter.
You've heard me muse that the current, and likely ongoing, telecom investment boom will be financed by a shift from the real movement of people, goods and ideas to a virtual movement of the same over the next 10 years. No doubt video-conferencing should make for easier dialogue and commercial discussion over enormous distances like those I recently covered, but nothing replaces feet-on-the-street due-diligence and face-to-face meetings, especially the first time, or maybe the last, just before closure on a deal. So I found that our typical sources of information, taxi drivers in Malaysia and Australia, proved that wireless CLECs can become a reality, especially where wireline offerings are scarce or relatively expensive. I also found management and investment opportunities that are capital starved due to the Asia crisis, but that may prove to be better investment opportunities than telecom properties in other, more favored, regions of the world. If you have access to capital and are a contrarian investor, give us a jingle. I have three investments in mind, and they all share the wireless CLEC, high-capacity, low-price theme typical of the computer/data world that I like so much. Strategic and financial investors are welcome.
The plain truth is that the global telecom boom won't make that super-sonic transport obsolete, but already has created the demand for its introduction today. By the time it arrives, we will wish we had more of it, sooner. So too, most major telecom companies hadn't factored in the growth of data networking over the past 10 years. The latter has exhibited, as I pointed out last week, almost unlimited growth in demand due to the math of networks (n(n-1)/2 possible connections among n users). More importantly, the growth of inter-, intra- and extranet clouds (networks or applications) is only beginning. In fact, what I originally wanted to write about this week, before the mega-merger, was that even the demand I contemplated last week pales when one thinks that any micro-processor is capable of connecting to an infinite number of clouds. I'll develop this math in a future column.
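The network math cited above (the same observation popularized as Metcalfe's law) can be made concrete with a short sketch; the sample network sizes are arbitrary illustrations.

```python
def possible_links(n: int) -> int:
    """Distinct two-way connections among n endpoints: n * (n - 1) / 2."""
    return n * (n - 1) // 2

# Demand potential grows roughly with the square of the number of endpoints,
# which is why each endpoint added to a network is worth more than the last.
for n in (10, 100, 1000):
    print(n, possible_links(n))   # 10 -> 45, 100 -> 4950, 1000 -> 499500
```

A 100x growth in endpoints yields roughly a 10,000x growth in possible connections, which is the geometry behind the "almost unlimited growth in demand" claim.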
Therefore the mega-merger, much like the super-sonic transport, should probably have come much sooner. But when I look at the numbers behind the mega-merger I have to shake my head and ask, "Why now? Why this much?" The answer lies in today's holy grail of data networking, namely end-to-end digital connectivity. And this is not to save on access costs, as most monopoly LECs presume, but rather to build the foundation for the applications that corporations so desperately crave. The reality is that the shift from the real to the virtual economy will be paid for by corporate-sponsored data and multimedia intra- and extranets, not unlike the competitively driven 800 boom in the US. It truly is remarkable that in just 10 years more than 50% of all long-distance traffic is 800-based.
The only problem with the mega-merger is its price-tag ($50 billion, albeit in funny-money), and lack of scope (only 30% of the homes). In addition, the shared-bus architecture of the acquired local connection network will require tremendous investment to make it compatible with the demands of the corporate network manager. That buyer, in particular, requires ubiquity, seamlessness, low-cost, and security. I don't think demand from the typical home-owner will amortize the cost of putting what is essentially a rapidly obsoleting PC on the edge of the network. Selling to the home-owner is a supply solution, which I believe will garner only 15%-25% hard-fought penetration rates. Selling to the corporate buyer is a demand solution, which I believe will result in penetration rates in excess of 50%. One other thing to note is that the same acquirer in the mega-merger is still just beginning to tap the potential of the mobile wireless assets it acquired five years ago and continued to build on to the tune of $30 billion. ROI? You go figure! While the deal is on the right track, it may indeed be too little (or much, depending on your view), too late.
Who Will Pay The Piper?
(Originally published September 21, 1998)
So far in 1998, according to SOC, there have been 1,251 deals totaling $357.2 billion in gross transactions in the telemedia market in the United States. Of that, telecoms (wireline, wireless, and enhanced and support services) accounted for 476 deals and $149.5 billion, or 41.6% of the total. Media industries, including broadcasting, advertising, cable, publishing, and entertainment accounted for the balance. Out of the total telemedia activity M&A transactions (554 deals) accounted for $205.8 billion, or 57.3% of the total. Debt issuance, with 487 deals, was $118.2 billion, or 33.7%. Equity, with 210 deals, accounted for a mere $32.2 billion, or 9%.
Within the telecom segment wireline telecommunications led the way with more debt issued, $57.2 billion in 201 deals, than M&A transactions, 59 deals worth $47.5 billion. Equity issuance was a respectable $14.8 billion through 82 deals. Wireless saw a similar mix of transactions, with 44 debt deals worth $14.5 billion outweighing 34 M&A transactions worth $9.6 billion. Equity issuance was $2.7 billion through 14 transactions. Telecom Support and Enhanced Services saw considerably greater M&A activity on a relative basis, with 35 transactions worth $1.9 billion outweighing $0.8 billion in equity issuance through 8 deals. Debt transactions were a mere $0.5 billion despite a large (45) number of deals.
A large number of the debt issues in both the wireless and wireline segments were in the increasingly competitive CLEC, interexchange, and broadband wireless segments. Numerous issues were brought to market attacking the same pools of demand. This reminds me, on a macro level, of what happened on a micro level to the paging sector back in 1994-1996. While one could argue that paging is incomparable to the broader telecom market, one just has to replace the millions with billions to adjust for relative differences in demand.
The conditions surrounding the paging sector could be mirrored in today's telecom markets. The reality back then was that too much high-capacity capital came on line before distribution channels and applications were developed to soak up that capacity. Prices for existing applications plunged as carriers were confronted with technical and marketing glitches in rolling out new applications. Today's telecom infrastructure, with end-to-end high-capacity digital networks reaching less than 10% of the population, due to the ILEC monopoly bottleneck, may well have too much capital in the WAN and enhanced service platforms. I started to warn of this before the summer and certainly many of the telecom stocks are down 50%-80% since then. The inevitable result will be restructuring of capital and consolidation over the next several quarters or years until the monopoly bottleneck is broken. Unfortunately a lot of investors are already paying the piper.
Section 2: Technology
Supercomm--It's Certainly Not Just POTS Anymore!
(Originally published June 15, 1998)
Supercomm 98 was like Comnet, CTIA Wireless, and TCA all rolled into one big tsunami. It now has data, telecom, and wireless and hits you with a monstrous wall of information. It was so big, and complex, that I don't think the Atlanta convention center could have fit many more companies. Three different societies--IEEE Communications Society, International Communications Association, International Engineering Consortium--sponsored some 225 different presentations and workshops. It was sink or swim.
Sailing around the show floor I noted that a lot of space was given over to access technologies, primarily wireline, although some wireless ones were present. The number of different access topologies, and more specifically flavors of xDSL, was mind-boggling. In my estimation, coordinating all these access protocols is going to be a monumental task and is the best argument for recreating seamless, end-to-end networks a la the old Ma Bell system. Unfortunately, given current regulations and industry structure, that is not likely to happen. More likely is that a slew of different access technologies leak out in dribs and drabs, and the market ends up looking like a cross between ISDN, with its lack of penetration, and the competing digital wireless protocols across the US that make our landscape look like a wireless coral reef.
Under the maelstrom surrounding access issues there was a far more interesting and powerful current developing in the OSS arena. OSS, or operational support systems, is the glue that keeps the telecom super-carrier humming along from the most basic network element all the way up to the customer. Heretofore the OSSs have been the rigid nets that kept outsiders out of the monopoly domain. But that is changing for a number of reasons, namely:
1) the sheer tide of competitive forces swimming around the RBOCs has forced equipment vendors and standards organizations to begin to develop open interfaces to these previously closed systems;
2) the monopoly RBOCs realize that they have to control expenses in an increasingly turbulent world. The latter drives the need for automating interfaces that are currently manually controlled;
3) regulatory edicts, both in the form of the 14-point interconnection plan and things like CALEA (law-enforcement eavesdropping) and E911 services, that are prying open portals in these supercarriers.
Looked at another way, a fully evolved and competitive OSS environment should facilitate the transition from a vertically integrated telephony world to a horizontally differentiated world of multimedia intranets in which each layer of the stack (from physical to transmission to switching to the customer) is optimized for rapid technology obsolescence and capital cost amortization. The latter is the most important element that is not appreciated in the markets today, and one that we all witnessed in the demise of the one-way cheap-beep, numeric paging world.
Perhaps one can read into the vastness of Supercomm the tremendous uncertainty facing the industry. The reality is that whatever opinion you hold, you should be able to go to Supercomm and derive some satisfaction that you are right. Over the next few years we believe the market will begin to separate the winning strategies from the losing ones. Central to this will be how service providers are positioned along the notion of the OSI stack and how efficiently their OSSs interface with an increasingly complex communications world.
Bring On An OSI Stack Kind Of World
(Originally published February 25, 1998)
Few people in the voice telecom world know about the OSI stack (Open Systems Interconnect model, established by the International Standards Organization), whose hierarchy rules the way information is transmitted in the data world. We suspect they soon will. After all, Jim Crowe has named his new internet protocol (IP) voice company after the third layer (Layer Three) of the OSI stack. Last week we wrote about our recent experience at Comnet (the mother of all data networking shows), at which we ran into our old voice friends from years back, and about the fact that a very BIG former data person, in the form of Mike Armstrong, will likely introduce some new, data-centric philosophies into the biggest circuit-switched voice carrier of all.
There are seven layers in the OSI stack, each with its own unique protocols. The first three are what really interest us, namely the physical layer (Physical Layer), the transmission layer (Data Link Layer), and the switching layer (Network Layer). The physical layer encompasses the wires (be they twisted pair, coax, or fiber-optics) and connections (pins) in a communications network. The transmission layer defines the basic electronics on either ends of those connections, like ethernet, ATM, xDSL, or Frame-Relay. The third layer includes switching and routing protocols, like the internet protocol. Layers four through seven encompass specific session and application protocols. But what most investors consider to be communications networks are layers one through three.
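As a quick reference, the lower layers just described can be tabulated. The layer names follow the ISO model, and the examples are the ones named in this column.

```python
# The lower layers of the OSI stack as discussed above, keyed by layer number,
# with the example technologies the column assigns to each.
osi_lower_layers = {
    1: ("Physical",  ["twisted pair", "coax", "fiber-optics", "pins"]),
    2: ("Data Link", ["ethernet", "ATM", "xDSL", "Frame-Relay"]),
    3: ("Network",   ["internet protocol (IP)", "routing protocols"]),
}

for number, (name, examples) in sorted(osi_lower_layers.items()):
    print(f"Layer {number} ({name}): {', '.join(examples)}")
```

Layers four through seven (session through application) sit above these and, in the restructuring argued for here, would be the enhanced-services domain.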
We believe the entire competitive and monopoly voice telecom industry can be restructured along the lines of the OSI stack. Were this to occur, it should pave the way for unbundling of the local loop and allow the long-distance and enhanced service carriers to gain access to local networks, while allowing the local monopolies to free themselves of layers one, two, and even three, and focus on enhanced services in layers four through seven. In fact, the 14-point interconnection plan ordered in the Telecom Act of 1996 outlines this restructuring. Now all that regulators, the companies, and Wall Street have to do is come up with financial incentives for the local monopolies to spin off their copper wires and outside plant (layer one) and transmission loop carriers (layer two).
This type of restructuring goes beyond the retail/wholesale model that some RBOCs and independents have suggested, essentially splitting them into three components. In our opinion, the key would be to find third parties that would pay the local monopoly for layers one and two, which are essentially fully depreciated on the regulated books, and to which investors give little incremental value. The third parties would then be free to operate and sell space on both layers as they see fit. This model is currently being practiced in the competitive long-distance industry.
The financial incentive for the monopolies would be to generate cash, which they can subsequently invest in layers three through seven. In turn, the need (and risk) of upgrading layers one and two is obviated. A local carrier today is concerned about bypass and competitive inroads from the likes of the cable companies' high-speed internet modems. While the telco monopoly might want to upgrade to high-bandwidth xDSL lines, the ISDN experience tells us that we shouldn't hold our breath. In fact, we believe the third parties, who own/operate layer two, would be in a better position to spread the investment risk of component and network upgrades across multiple players that offer significant distribution scale. After all, the biggest risk, it appears, in upgrading the final mile, be it telco or cable plant, is the fact that most line cards, and electronics like set-top boxes, will be obsolete within one-to-three years. In the end, modem electronics follow the same silicon curve as PCs--becoming obsolete in a very short time frame. Therefore the investment must be spread over high-volume applications and reasonably large market segments. This can only occur with the availability of competitive distribution channels (i.e., multiple players in layers three through seven). Over time, competition at the upper end of the OSI stack should generate sufficient revenue volumes to justify multiple providers in layers one and two.
We've recently heard about a company that buys capacity at layer one (essentially dark plant), and adds its own components to create both layers two and three. They then sell this capacity at a substantial discount from what other carriers charge to distributors up the food chain who have sold applications directly to end-users. The result is after-tax earnings within four to eight months for this new type of middleman. We have witnessed such financial returns in very few circumstances in the telecom world, and only in high-volume segments, where plant and start-up costs could be rapidly amortized. From what we know, this provider has accomplished this feat in an area not known for high, concentrated volumes. Obviously entrepreneurs have started thinking about the OSI stack; maybe you should too. Who knows, maybe Divestiture II will be a series of little divestitures.
Stacking Telephony Geographically--The US Is Ahead of Others
(Originally published August 10, 1998)
Regular readers of this weekly know that I have argued for a reorientation of the industry from vertically integrated all-things-to-all-people carriers to horizontally differentiated intranets akin to the OSI stack of the computer world. In fact this framework already exists in the long-distance market, or wide-area network, in the US. It is also somewhat developed in the customer premise market, or local-area network, albeit on a private basis only. Only in the last mile, or metro-area network, is its development retarded by incumbent local exchange carriers. In my opinion, this WAN/MAN/LAN view of the world is a very useful topology for understanding the recent strategic developments in the industry and what the competitive landscape holds over the next 12-24 months.
The WAN in the US has been well developed by competition and a price umbrella driven by the big three who have suffered for 14 years trying to be all things for all people. The latter goal has been akin to poor Sisyphus trying to push his rock up the hill, because of the infinite variability of demand. The first to succumb to this fate proved to be MCI, whose post-McGowan management team failed to execute on critical strategic and operational fronts. Now Worldcom is taking on the struggle. AT&T, likewise, was afraid for years of taking on its former offspring and as a result made its own boulder bigger. The question is will they, or somebody else, push that boulder over, and if that occurs, what is on the other side.
What is on the other side is what the long-distance industry's price umbrella has created, namely hundreds of horizontally focused carriers, who every day carry out lease/buy decisions at each layer of the stack. The model breaks down when those carriers feel compelled to be all things to all people, driven primarily by the regulatory-supported bottlenecks in the MAN. Because of the MAN bottleneck, these new carriers cannot develop end-to-end service at each layer of the stack, which in turn prevents them from maximizing economies of scale for each new hardware and software implementation. The result is to look above and below their position in the stack to try to directly drive these economies. The end result over time can only be lowered return on investment. This holds for voice and data centric carriers.
Two events over the past week bear watching. The first is the labor strife between the incumbent MAN monopolies and their workers. The irony here is that one stranglehold leads to another. Such strangleholds are rare in the competitive WAN world. Break the regulatory-supported MAN bottleneck and you will solve the other stranglehold.
The second is the initiative by the Technical Advisory Council (made up of one service provider--who has no revenues--and numerous equipment providers) to develop a new set of protocols to bridge the IP and circuit switched worlds. Seems the IP guys are having a hard time dealing with quality of service and latency issues; issues that the circuit switched world has optimized, at a cost, over 80 years. That cost is further magnified by the MAN bottleneck (how else could the telephony equipment companies charge such high prices for proprietary switching platforms?).
So the IP guys may just be pushing another boulder up the hill in trying to amalgamate their original premise of using a protocol that was optimized for private networks, where bandwidth was not a critical issue, with public carrier-grade service protocols. More specifically, I think that the TAC is looking to strand a lot of intelligence at the periphery of the network, which is just fine from the equipment companies' perspectives, but terrible from the public carriers'.
It seems like we're back to the same problem, namely the MAN bottleneck, which might be best solved if regulators, carriers, users, and others look at what has happened in the WAN in the US over the past 14 years, and what could happen in the MAN and LAN if a natural evolution were allowed to progress. Imagine companies that exist only to develop end-to-end physically connected networks. Others might focus on the transmission layer.
Others might further specialize in switching certain types of traffic. Yet others might be application or support specific. The US, because of its lead in the competitive WAN model, may get there first. Sounds improbable? Look at the auto industry. Look at what GM did after resolving its recent labor strife. It spun off its parts division.
Pssst! Let Me Tell You About The Real Paradigm Shift People Aren't Seeing, And How The Reports of The Voice World's Death Are Greatly Exaggerated
(Originally published October 12, 1998)
Discussions of paradigm shifts appear to be de rigueur these days, particularly in company reports, press releases, and presentations. Many of these are well-intentioned forecasts, but fail to account for the major structural inefficiencies (particularly the local access bottleneck) confronting just about every carrier and equipment vendor. Further clouding the outlook for these shifts is increasing competition, rapid technology obsolescence, and blinding strategic change. The market appears to be sensing this as the major competitive segments of the telecom sector have sold off 50%-90% from their peaks in 1998. Furthermore, as we pointed out two weeks ago, the market is having a major telecom capital-issuance hangover, which for lack of a major financial, strategic, or technological event, may be with us for some time.
One paradigm shift that I stumbled across recently comes from a private telecom equipment company located in Illinois. That company, which I referred to in May after visiting the TRA show in San Francisco, epitomizes the centralized hierarchical "stacked" world that I have repeatedly discussed over the past 9 months. It actually provides software for switches, but has built and is expanding a wholesale network to prove its technology.
Furthermore, it may well have a 12-month lead over similar models, like the ION platform from a major long distance company. Conceptually, this company is to the major network equipment companies what Netscape (NQY: not rated) was to Microsoft (MSFT: not rated) four years ago. In response MSFT came out with an enterprise server solution model in which the client was free. This is very similar to the 800 services model in the voice world. So this company is taking this concept down into the stack (namely layer 3) with a server-centric solution to the current, high-cost, inflexible edge-network, or client-centric, model of today's circuit-switched telecom world.
The latter is controlled by monopoly carriers in the MAN hell-bent on keeping their control over access. The incestuous relationship between the monopoly carriers and the equipment vendors has resulted in the current bottleneck the industry now faces on regulatory, strategic, and financial fronts.
Because the world of data networks has been far more competitive (in the LAN and WAN) for some time, the cost of switching has dropped dramatically. As a result, new competitors have looked to this technology to gain a key competitive advantage. The only problem (as I've pointed out in the past) is that what is good for data may not be good for voice and vice versa. Also, what works well on the edge of the network (LAN) may not work well in the WAN and MAN, and vice versa. Simply put, the reports of the voice, circuit-switching world's death are greatly exaggerated.
Through its VASP (variable access service platform), the company I am referring to enables the real-time rating, monitoring, provisioning and billing of any type of traffic stream (voice, data, multimedia) over ATM. VASP differs from current switching topologies by harnessing SS7 features within the network and not on either side. This not only results in significant bandwidth savings, but also increases a carrier's flexibility in provisioning new services (i.e. they no longer have to wait 6-18 months for new software releases from the major equipment vendors). VASP has obviated the Class 4 switch (long-distance; WAN) and will begin to attack elements of the Class 5 switch (local; MAN) over the next several quarters. If the latter happens then every CLEC that has invested in Class 5 switches may be looking at obsolete investments. Simply put, the company represents the first real assault from inside the telecom world's circuit-switch fortress.
VASP itself is hardware, or switch platform, agnostic and they've signed licensing agreements with numerous switch manufacturers. The real attraction of the company's server model longer term (much like in the computer world) is that it speaks to many clients, while the client-centric architectures just work with like clients. This plug and play aspect of the server-centric model will appeal to carriers as they transition to a world that looks much like the computer world.
Finally, the key difference between the telecom and Microsoft/Netscape client/server struggles is that nobody has a real monopoly on the client in the telecom world. (As an aside, remember the column in which I wrote that, in order to break the Wintel monopoly, the regulators should break the local telephone monopoly. This is precisely what I was talking about.) The end result of the process the company is beginning is a restructuring of the vertically integrated telecom world in the WAN, MAN, and, to some degree, LAN, into a horizontally differentiated stacked world that looks like the OSI model. This is the real paradigm shift, and it is evolving before our very eyes. Call us for a referral to this company.
Section 3: Network Economics
On IP Telephony, Let's Not Forget The Real Costs Of Communications, And Other Musings
(Originally published May 11, 1998)
The other day I was asked what I thought about IP telephony. My answer was that a 30% cost savings on 30% of the telecom business model makes for a 9% cost savings. While 9% helps, the reality is it doesn't mean a revolution in communications. The reason the cost savings is so minimal is that communications today is more than just technology. In fact, the underlying network now represents around 30% of the cost of service. The other 70% is split between marketing and operating costs, and hopefully, taxes and profit. Unless the IP telephony companies can get by without marketing or customer support, their cost structure actually mirrors that of the companies they are supposed to make obsolete, like major IXCs and LECs, less the 9% savings in transport and, to some degree, switching.
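The back-of-the-envelope math above can be checked in two lines (the 30%/70% cost splits are the column's own figures):

```python
# IP telephony savings math, using the column's own assumptions:
# the underlying network is ~30% of the total cost of service, and IP
# transport promises ~30% savings on that network component alone.
network_share = 0.30      # network's share of total cost of service
transport_savings = 0.30  # savings IP telephony promises on that share

overall_savings = network_share * transport_savings
print(f"overall savings on the cost structure: {overall_savings:.0%}")  # 9%
```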
Behind the IP telephony craze is the notion that the circuit-switched world will be made obsolete by the packet-switched world. This internet view of the world is driven by the bandwidth-economizing virtues of the store-and-forward world of data. Unfortunately, the latter has yet to perform well in a real-time, two-way narrow or broadband world. Moreover, it remains to be seen whether the distributed world of router switching is better than the centralized world of circuit switching. What grew up in the private networking world may indeed not be scaleable to the demands of the highly complex public carrier market. Two weeks ago we documented how large frame-relay networks are being replaced by more manageable, and scaleable, higher-capacity hybrid ATM and frame-relay networks.
Similarly, this internet view of the world will likely give way to a series of intranets and extranets from the customer perspective. Grades of service regarding performance, bandwidth, and latency should become common. As the mission critical nature of these networks increase, telecom managers will likely want no latency and buy a service that has a dedicated amount of bandwidth over a certain period of time between two or more points. But isn't this the same as circuit-switched? In reality, what is really occurring here is the simultaneous movement from monopoly (high and average pricing) to competition (lower and marginal pricing), and from balkanized to centralized platforms on all fronts (network, marketing and operating). In the process the telecom industry is beginning to reorient itself from vertical to horizontal integration (i.e. the telephone business resembling the computer world's OSI stack).
Unfortunately, a number of what I'll characterize as schizophrenic companies will emerge that promise a "new future" but very quickly end up trying to emulate the very companies they are trying to obsolete. These new communication companies are raising a lot of capital in the financial markets, some with astronomical valuations on simply a seemingly great idea. One thing is for certain: this process will result in not only a lot more network capacity, but marketing and operating overhead as well. What is not clear is if there is enough current demand to "soak up" this incremental capacity. Over ten years I would say yes, but in the near-term? That's a big question.
Until the answer to that question is better known, we suggest that a two-tiered discount-rate structure be used for determining the "value" of all telecom companies. Given all the rapid shifts on both the supply and demand fronts, I believe the world of wireless cannot be accurately predicted beyond the next 4-12 quarters. In the world of wireline I might extend that timeframe by another 4-8 quarters. For instance, the WACC we might use for some of these start-ups would be the normal 13%-15% for the first few years. For the latter part of the model, we might use a WACC of 20%, or more. This would underweight the terminal value in year ten of these models, which typically represents 80%-90% of the valuation. What worked in a monopoly or duopoly world simply will not work in the competitive telecom landscape going forward, as evidenced by the debacle in the paging sector a few years ago.
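A minimal sketch of that two-tiered approach: near-term years discounted at the "normal" WACC and later years (plus the terminal value) at the punitive rate. The 13%-15% and 20% WACCs come from the column; the cash-flow and terminal-value numbers below are purely illustrative.

```python
# Two-tiered DCF: near-term years at a normal WACC, later years (and the
# terminal value) at a higher WACC to reflect forecast uncertainty.
def two_tier_pv(cash_flows, terminal_value, near_wacc=0.14,
                far_wacc=0.20, near_years=4):
    discount = 1.0
    pv = 0.0
    for year, cf in enumerate(cash_flows, start=1):
        rate = near_wacc if year <= near_years else far_wacc
        discount /= (1.0 + rate)           # cumulative discount factor
        pv += cf * discount
    return pv + terminal_value * discount  # terminal value at final-year factor

# Hypothetical ten-year cash flows ($ millions) and terminal value.
flows = [10, 12, 15, 18, 22, 26, 30, 35, 40, 46]
print(round(two_tier_pv(flows, terminal_value=400), 1))
print(round(two_tier_pv(flows, terminal_value=400, far_wacc=0.14), 1))
```

Raising the far-period WACC from 14% to 20% shrinks the present value of the year-ten terminal value sharply, which is exactly the underweighting the column argues for.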
The Geometry Of Networks Overwhelms Moore's Law
(Originally published June 8, 1998)
The geometry of any network of users is N*(N-1)/2. That is to say, if there are 10 users on a network, adding an 11th requires, or creates, 10 more potential connections or pathways on top of the existing 45, making 55 potential pathways. 20 people connected creates 190 pathways for information to flow. The number of pathways gets really big around 1 million users. This is the basis for the enormous long-run scalability and profitability of public switched intranets and extranets. Replace the image of so many lines or pathways connecting all these users with the picture of a cloud with one line to each user and you have the public network. The cost of one access line is levered by so many new potential pathways, and hence opportunity for the public network provider to collect tolls. Furthermore, the more people are on a network, the more people have to be on a network. Therefore the value of the network and the necessity of connecting to it simultaneously explode. We've witnessed this with fax machines, corporate calling cards, 800 services, iDEN, and e-mail, among other applications.
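The pathway arithmetic above is easy to verify:

```python
# Potential pathways in a fully meshed network of n users: n*(n-1)/2.
def pathways(n: int) -> int:
    return n * (n - 1) // 2

print(pathways(10))         # 45 pathways among 10 users
print(pathways(11))         # 55 -- the 11th user adds 10 new pathways
print(pathways(20))         # 190
print(pathways(1_000_000))  # 499,999,500,000 -- "really big" at a million users
```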
The math of Moore's Law, named for Intel co-founder Gordon Moore, is that the performance of silicon should roughly double for the same price every 18 months. This trend has been clearly demonstrated over the last 30 years and leads us to believe that 1 to 10 gigahertz play/work-stations will be commonplace 10 years from now. Real-time car-driving simulations and car-buyer computer-aided design may be a reality and a cost-effective alternative to today's working-capital-intensive and demand-uncertain automobile production. It is this doubling of local computer processing that has a lot of people convinced that the antiquated, analog, circuit-switched telephony world will be replaced by the dumb network, or edge-routed world of the internet.
But the above math says that can't happen. For every edge device that increases bandwidth consumption by a certain amount, network utilization can go up by a geometrically greater amount. Today's router-based internet consumes 30% of its bandwidth just looking for other routers, because it doesn't rely on layer 2 efficiencies. As a result, I believe networks must stay intelligent and centralized in a hierarchical fashion. Remember, routers grew up in the private, meshed network world, not in the public network world. Visualize, if you will, the elegant cloud versus the confusing number of crisscrossed pathways that become unwieldy after some point. While the cost of silicon and the performance of these routers improve by Moore's Law (and maybe even better), they are overwhelmed by the growth of connections and the potential and realized traffic between those connections. That is the source of internet congestion today. Throwing a lot of cheap, inefficiently utilized capacity from layers 1 and 2 at the problem provides only temporary relief. This is what I read recently into the ION network announcement. ION is an ATM backbone (layer 2) that is edge device/protocol (layer 3) agnostic. Furthermore, it rides on wave-division multiplexed fiber (layer 1).
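A quick sketch of why the geometry wins: even if router performance doubles every 18 months in step with a doubling user base, the potential pathways roughly quadruple over the same period (the starting figures below are illustrative):

```python
# Compare Moore's-law doubling of router performance against the growth of
# potential network pathways when the user base doubles at the same rate.
users, perf = 1_000, 1.0
prev_links = users * (users - 1) // 2
for period in range(1, 6):          # five 18-month periods
    users *= 2                      # user base doubles
    perf *= 2                       # Moore's Law: silicon performance doubles
    links = users * (users - 1) // 2
    print(f"period {period}: perf x{perf:.0f}, "
          f"pathways grew x{links / prev_links:.2f} since last period")
    prev_links = links
```

Each period the pathways grow by a factor of about 4 while performance grows by a factor of 2, so potential traffic compounds away from per-router capability.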
In the end, distributed networks are probably justified for private networks and local area networks, but they will likely fail in the public carrier domain. Still, the debate remains one of the largest issues facing the industry today. This conflict between centralized and decentralized was in full evidence at the Supercomm show in Atlanta this past week. My belief is that carriers will optimize technology and topologies at each layer of the OSI stack, instead of having one optimal strategy. In addition, there will likely be tradeoffs between hardware and software costs. This is consistent with the debate raging in the computer/networking industry over whether distributed or centralized processing is best for the user, particularly with regard to stranding or upgrading processing on the edge. Regardless of that demand-driven issue, a single cost advantage at one layer of the supply chain does not guarantee an overall price advantage vis-a-vis the competition, which I highlighted a few weeks ago in my IP telephony column.
How Called Party Pays Are Centralized Intranets--The Corporate Market Is Key
(Originally published August 3, 1998)
The internet, 800 calling, and paging all share one thing in common. They are all examples of called party pays. Very simply put, there is a demand segment that will pay a carrier to be reached for free. Over the last few weeks I have discussed this phenomenon with several people, using the example that 50% of all long-distance calls are 800-based--in fact, the number, I was informed, is more like 60%! People don't pay for 60% of the long-distance calls they make.
In addition, the internet is all about providing users with free or low-cost access to new information economies. This low-cost solution is often misunderstood as being cheap. It's not, and it actually makes for a lousy two-way communications medium. The point is the internet is heavily subsidized. Furthermore, in my opinion, if the paging carriers got their act together, they would find that their device is a perfect receptacle for "push" applications that corporations would pay for, be they internal e-mail intranets or advertising-like external intranets.
These forms of communications, and their perceived or potential value are easily understood within the context of the Microsoft-Netscape struggle. The latter was best known for its browser, which is also known as a client, which is also synonymous with access in telephony parlance. Microsoft came along and upset Netscape's model by giving the client away for free. The value was put squarely in the server. Corporations bought into that centralized model, because it was more understandable and manageable. Seeing the value shift, Netscape responded quickly. In other words the market was saying, if free access were allowed into the information store, then the value of the store would explode.
The telecom carrier that offers end-to-end, or nearly ubiquitous, high-bandwidth digital access would likely become the darling of the information economy. I feel that the value of "portal" and online retail companies on the internet, like Yahoo and Amazon.com, is truly amazing. Imagine what these companies would pay to rent space on this carrier's network to offer free entry to their stores. Once the marketers and product developers figure out the applications (like downloading entire newspapers, developing cars and other devices in real time, etc.) this process should become self-fulfilling, very profitable, and have enormous societal and economic consequences.
Corporate America paid for the development of the vast media markets we observe today through advertising on broadcast (one-way) communications media. Only Corporate America can pay for the vast $200 billion investment that will be required to build high-bandwidth, two-way networks to shift significant portions of the real economy to the virtual economy. The process won't be funded by end users paying for a few more video channels, or higher-speed access to the corporate LAN, but rather by corporations interested in offering free access to their intranets and extranets, which in turn run on centralized servers.
Nobody owns the server market today the way that Microsoft owns the local processing market, or the local telcos and cable operators own local access. However, I have said in the past that perpetuating the local access monopolies will help Microsoft gain a near-monopoly lock on the server market. Break the local access market today, and you break the Microsoft monopoly of today and tomorrow. Since the regulators seem incapable of accomplishing this, I hope there are some carriers that understand this free access model. While I see elements of this strategy in some of the recent mergers between WAN and MAN players (wide area and metro area; LAN is local area at the customer premise), no one company looks close to the ubiquity necessary to offer Corporate America what it desperately craves.
Bill McGowan's Legacy Is Our Current Prosperity
(Originally published August 24, 1998)
Economies are created through ideas. They are not simply the allocation of scarce resources, because that implies a static level of consumption and production. Nor are they simply the pursuit of happiness, as one writer recently opined in a Sunday NYT edition. Rather, it is one individual's thought being bought and sold by others that creates an economy. This process is facilitated by competitive information networks, both one-way and two-way, the latter being more powerful because they provide immediate measurement of one's success at buying and selling ideas.
The US has a commanding global lead in enhanced wide-area voice and data services, things like 800 services and thousands of voice and data VPNs (virtual private intra- and extranets), provided by thousands of enhanced service providers. This process has occurred because wide-area transmission costs are the lowest in the world due to multiple competitive fiber-optic networks (ironically built on the rights of way of 100-year-old railroads). While low-cost transmission enables information to be transmitted more cheaply, it more importantly paves the way for enhanced services which increase the speed at which information is created, processed, and consumed. Service providers can develop these services over wide areas and wide market segments by centralizing intelligence platforms and backhauling the information over great distances.
Some characterize this process as the death of long-distance. In fact it is the creation of the network as the computer. Those who have access to the computer are growing faster on average than those who do not. As a result, conventional economic analysis, which is not tied to this notion, is incapable of measuring, comparing, and predicting growth accurately. Perhaps a new science, namely that of measuring information velocity, should be developed. In this fashion economists could distinguish efficient (in this country the wide area network--WAN) from inefficient (now the metro area network--MAN) information movement and explain many of today's economic successes and failures. For instance, the current commercial real estate boom may be a function of the fact that whatever competitive fiber optics exists in the MAN ends up in these structures. What happens when fiber, or some other high-bandwidth medium, reaches the home? Probably something not dissimilar to what the PC and fax did to white-collar office space in the late 1980s and early 1990s.
This high-bandwidth access to the home will probably happen when the final step in a 30-year process begun by Bill McGowan, namely breaking up the Bell monopoly, is taken. It seems to me that Bill McGowan's legacy of breaking up the Bell monopoly (at least in the WAN) may well be the prosperity we all currently enjoy, and that breaking the local monopoly may well extend that prosperity for the next several decades.
Section 4: Strategy
Notes From The Wireline Front
(Originally published May 18, 1998)
A few months ago I reported on the war going on in wireless from the CTIA convention in Atlanta. Well, fresh back from the Telecom Resellers Association (TRA) conference in San Francisco, I can safely say that there is a war going on there as well. But while the war on this front has always been a price-intensive affair (with additional fuel being added in the form of IP telephony), this war also deals much more with market segmentation and demand issues. On this front, the reseller lives and dies by his or her relationship with the customer and satisfying that customer's needs. As you, the reader, know, I view the challenge every carrier faces to be one of selling a multiplicity of applications across a multiplicity of networks to a multiplicity of demand segments.
A little history helps to understand this wireline war. Four years ago I used to present at the TRA, when the only service to resell was dial-1 long-distance. At that time my presentation focused on convergence and the ability for resellers to sell other services and enhanced features to their customers. The need, I argued, would be driven by offensive and defensive opportunities. Now, it appears that everybody is going down this path. More recently, however, I modified my view of the world beyond this convergence theme to be one of devolution where the bigger players actually begin to divest themselves of non-core assets. These devolved assets actually end up in horizontally configured layers, or intranets, resembling the computer world's OSI stack.
Next to IP telephony the big theme at the conference was convergence billing, which is giving rise to a whole host of new players in the space. Furthermore as the complexity of billing seems to be on the rise, a whole new subsector of companies that help carriers determine the best billing systems and the performance of each is developing. Another area of big emphasis, as would be expected from resellers that got their start in the arbitrage business, was in international services, both at the wholesale and retail level. Given the complexity of tariffs and agreements overseas, as well as the advent of new technologies and processes, there continues to be a big focus on this front.
Finally, when we first surfaced the idea that the vertically integrated telecom world should be reoriented like the horizontally differentiated computer world along the lines of the 7-layer OSI stack, many people said that the operating support systems (OSS) of the RBOCs could not be easily circumvented. Well, where there is a will, there is a way, and at TRA we discovered that a cottage industry is growing that will do just that. This cottage industry is gaining impetus from competitors like resellers, CLECs, and IXCs, as well as from within the RBOCs themselves. The latter are doing this as they realize that they will have to automate access to their OSSs in order not to be overwhelmed as their networks open up. And we saw some interesting developments on the layer two and three fronts that support our notion that IP telephony isn't all that it is cracked up to be (refer back to last week's piece, which was actually written before TRA). One company demonstrated an ATM backbone, with out-of-band signaling, that enables real-time rating of any type of call, be it voice, video, or data. Call us if you want to chat about some of these developments.
Frame-Relay Snafus And Other Reasons Why Strategic Planning Is So Important
(Originally published April 20, 1998)
Frame-relay (FR) net got you down? Reading about last week's mega-data-networking snafu on the largest inter-exchange carrier's FR network (causing customer losses estimated in the hundreds of millions, according to the WSJ) sent shivers down my spine, but also put a smile on my lips. You see, as management scrambled to find someone, or something, to blame, the real blame lies with decisions made back in 1991-1992 by the head of the then business-services unit. This manager happened to laugh me out of his office when I mentioned that FR was going to hit the big time. His view of the world was that it would remain a niche application and, to some degree, a complement to existing private-line data networks. Because of his decision, the carrier was late to market with FR and developed what, we believe, was a sub-optimal implementation that may well have contributed to the snafu.
Since that day in the office, FR service has hit the big time, despite the growth of the connectionless internet, with a revenue run-rate approaching $3 billion. The reasons for this success are many. First, the standard was defined by a small group of market-driven companies, not by monopoly bell-head types, who have traditionally been slow to accept new technologies. Second, it was an evolution from existing packet-based networks. Third, it was a cheaper solution than other "golden bullets" like ATM (asynchronous transfer mode) and SMDS (switched multimegabit data service) in the early 1990s, and more secure and manageable than IP-based networks. The only problem we've observed with the development of FR has been the tardy development of switched virtual circuits (SVCs), which would give customers greater ease in defining networks, but at the same time lessen carriers' control, as well as revenues, to some degree. On the latter note, the carriers failed to realize, as have many carriers in other segments like paging (multi-frequency pagers), cellular (roaming charges), and ISDN (competing standards), that ease-of-use is paramount in communications and actually creates the more valuable, higher-revenue-generating applications like fax, 800, and numeric touch-tone paging.
Despite this, FR has become a secure intranet backbone protocol for many corporations to connect far-flung offices (intranet), and even to begin to connect to suppliers and customers (extranet). As a result, individual FR networks have grown to gargantuan size, sometimes connecting several thousand sites. The result has been enormous loads on carrier networks; networks which were never optimized to "carrier-scale" requirements. You see, most carriers made the wrong implementation choice by putting in a lot of little "private-networking" FR switches rather than a few "carrier-scale" FR data switches around the country. This band-aid solution resulted in this large long-distance company installing 140 switches. It is the large number of switches, and probably the numerous topologies that resulted, that we believe may have contributed to the mayhem. Upgrading software on one of the FR switches on a Monday morning was also not the best idea!
We spoke with another carrier that has over 300 FR switches and is now switching over to 8 carrier-class ATM switches with 100 ATM edge-switches. This new network will continue to "carry" FR over the ATM backbone, but be capable of carrying new multimedia services at the same time. It costs more, but the carrier can provide more services, handle greater loads, and lower its relative cost and network complexity, ensuring more manageable network evolutions going forward. To me the carrier is finally realizing that data has hit the big time, and this time around the right strategy is being implemented.
Another competitor, which originally said it was going to be entirely IP-based, announced that it is developing an ATM backbone, as well as expanding its existing circuit-switched voice platform. It seems the more the world evolves toward a single golden-bullet solution, the more complex it becomes. This complexity, as we've stated in the past, is a function of the multiplicity of applications, sold to a multiplicity of demand segments, across a multiplicity of networks. In the end, networks and applications are no longer static items, but constantly undergoing a process of creation and destruction. This is ultimately why scale and time to market are so critical in the competitive telecom environment, and why so very few players are actually strategically well-positioned, despite what the market might think!
As networks have evolved from supply-centric, facilities-based monopolies to demand-centric, software- and marketing-driven platforms, the rules of the game are changing. No longer can carriers afford to roll out limited "private" solutions for high-end users; rather, they need to build high-performance WANs from the start and scale those platforms as quickly as possible. Therefore, the high-end user could be viewed as the loss-leader, with the back-end being filled by mass-market access into those applications (intranets and extranets), oftentimes paid for by those very same high-end users. This is the basis of 800 services, the internet, and other services that have developed in the competitive telecom world. This is actually the conclusion I develop in my multimedia, macro-telecom presentation, which I am happy to share with you, the reader, at any time. Please call or e-mail me for a drive-through.
Notes From The Wireless Front
(Originally published March 3, 1998)
There is a war going on. Not necessarily the one traditionally associated with telecommunications, like price wars, but one pitting the supply-siders against the demand-siders. The supply-siders' mantra is "build it and they will come." Demand-siders, on the other hand, perceive a need and opportunistically find or build supply to satisfy that need. Call it the war between the monopolists and competitors. Even though the former still outweigh the latter, suffice it to say, we think the demand satisfiers have the edge. Just think of all the demand in the enhanced voice and data worlds created by entrepreneurs over the past 15 years that was not evident to the supply-siders during the 1980s. Remember when the market questioned the need for a third competitive long-distance company as recently as 1992? Now we have seven or eight.
Actually, the demand is never created, it is always there. In fact, we believe that not only is there plenty of future demand to sell into, but a tremendous amount of latent demand (built up over the past 100 years of inefficient monopoly networks) as well. This demand will materialize in the shift occurring from the real movement of people, goods, and ideas to a virtual movement of the same. The total domestic telemedia services pie can probably expand from $200 billion to $500 billion, or from 4% to 7% of GDP, over the next 10 years. In fact, last week's discussion about restructuring the telecom world along the lines of the OSI stack (going from vertical to horizontal integration, or devolution of the vertically integrated monopoly and duopoly carriers into specialized competitors like in the data world) is really about moving from a supply orientation (layer 1) to a demand orientation (layer 7).
We witnessed this supply-side/demand-side confrontation first-hand at CTIA '98 in Atlanta, one of two big wireless trade shows held each year. This year's official themes were competition and safety, both chosen with an eye toward influencing regulatory perception. Unofficially, we thought the big themes were the unequivocal arrival of CDMA and wireless data. The first was due to the fact that the big pure-play 1900 CDMA systems were activated after last year's CTIA conference and have now been successfully operating for almost a year. In addition, the overlay 800 CDMA systems have seen their performance improve dramatically. As a result, the world of big-capacity, big-minute-bucket wireless digital systems (including GSM, TDMA and iDEN) is finally here!
In our opinion, this increased capacity has everybody in the industry on the edge of their seat and has driven the supply-siders to conjure up the abundant wireless data applications evident at the show to soak up that incremental capacity. The only problem is the inability by carriers to cost-effectively sell into and satisfy that demand. Placing transmitters into the ground and producing minutes or bits per second is the relatively easy part; soaking up the capacity of those networks not knowing by whom, what, where, why, when, or how that production will be utilized is the challenge. This view is based on the fact (as we previously discussed) that broadband wireless networks have been optimized for voice, which is a universally understood application. Data, on the other hand, is a series of specialized applications, compounding the already daunting marketing and operational challenges that every wireless carrier faces.
In other words, specialized wireless data may represent yet another network and operating challenge to carriers who are just trying to get their arms around the voice world. A parallel might be drawn to when cellular carriers had to adjust to the arrival of 0.6 watt portable phones from 3 watt mobile phones while still dealing with their astronomical subscriber and network growth of the early 1990s. Our suggestion to carriers is to focus on selling more voice minutes and let the data market, which we nonetheless believe to be fundamentally bigger than voice, as it is in the wireline world, evolve naturally. In other words, satisfy existing demand cost-effectively by getting the perception of the cost of mobile minutes down for the simplest application of all, namely voice, before trying to satisfy what will likely be extremely varied and complex data demand. After all, one of the big digital carriers at the show said that 44% of customers would prefer to use a wireless voice connection over a wireline connection when the perceived cost is only 3-7 cents per minute higher. For demand-siders, perception is everything.
The Voice Telephony World Is Going To The Data Dogs
(Originally published February 11, 1998)
At the Communications Network (COMNET) conference in Washington two weeks ago we ran into a lot of our old friends from shows like SuperComm--historically the venue for voice equipment and service providers. Since we've been out of the wired world for the last two years, in the wireless domain, we were thrilled and surprised to run into our voice friends in the land of data. With virtual private network services (VPNs) and voice-over-IP being trumpeted everywhere, we had to laugh at the thought of dead voice monopolists rolling over in their graves as their old suppliers gave away the keys to the telco city to the new data crowd.
But we think the dead can rest easy in the near-term since the regulators and the courts don't seem to have the heart (or the know-how) to generate effective local competition. Furthermore, only a select subset of users will truly be able to utilize these new services. Moreover, there is nothing magical about voice over IP other than it is an outgrowth of the store and forward, packet switched data world, while the monopolies maintain the balance of power in the real-time, two-way, circuit-switched world.
In our opinion the reports of either one mastering the other are greatly exaggerated. First, actually accomplishing voice over IP (which still suffers a slight time delay) is one thing; managing to provide all the value-added services and functionality that have been developed in the circuit-switched world is another. Second, the real cost savings in voice over IP lie in private network applications or by-pass of local access charges. And it is this latter point that could bring the issue of local competition to a head. Simply put, the need for internet fax or internet voice services exists only because of inefficient, high-cost, monopolistic, analog access. The ways around this state of affairs currently include virtual bypass (dial-around), private network bypass, and wireless bypass.
Speaking of wireless bypass, higher-frequency wireless technologies, like LMDS, were well represented, as were wireless packet networks in the cellular and PCS frequencies. We believe both will very soon be giving the monopoly wireline providers (telcos and cables) a run for their money. Noticeably absent from this mix were narrowband wireless applications, not to mention real-time internet push applications. We think narrowband is a huge opportunity for the data world, particularly as low-cost, two-way wireless messaging networks develop with the ubiquity and in-building coverage for wide-ranging telemetry applications. We believe connecting billions of embedded processors to inter/intranets dwarfs a lot of other "new" communication opportunities.
Finally, maybe it was coincidence, but in the same week as COMNET, the new AT&T Chairman, formerly of IBM and digital satellite land, spoke of demand elasticity as the future for his company. Very few, if any, voice-world executives have talked up this theme. Do we hear Moore's Law knocking on them telco gates?
Market: The major market indexes gained ground last week, particularly the technology sector, as Asian markets continued to rebound. The Dow picked up 3.6%, the NASDAQ advanced 4.6%, and the S&P 400 gained 3.0% for the week, despite some late-week profit-taking. For the month of January, smaller-cap issues and tech stocks fared best as the Dow picked up 0.8%, the NASDAQ advanced 3.1%, and the S&P 400 gained 2.7%.
Wireless: Wireless stocks (+4.0%/+2.8%)* performed in-line with the up-market. Half of our wireless indexes performed better on a market-cap weighted basis as opposed to a price weighted basis, which is down as a proportion from recent weeks, during which investors heavily favored the larger-cap wireless, and indeed telecom, stocks. The narrowband industry (+6.1%/+1.4%)* benefited from positive earnings announcements and talk of value-added services. International Cellular/PCS stocks climbed 9.1%/4.5%*, while LMDS stocks rose 10.1%/10.9%* as that industry's auctions continue.
Wireline: The wireline sector (-0.1%/+0.3%)* underperformed the market indexes, and slightly favored smaller-cap issues. CAPs/VANs (+2.2%/+2.2%)* paced the wireline sector, while long-distance stocks (+1.5%/+0.2%)* also benefited from some strategic announcements in conjunction with earnings reports. The regional phone companies dropped 1.6%/1.5%* for the week.
Support Services: The Telecom Support Services index (+3.0%/+6.0%)* erased some losses from earlier in the quarter. The enhanced services stocks climbed 12.3%/4.6%* to lead the group, while fraud/activation/distribution stocks (+11.8%/+4.0%)* followed closely. Billing stocks (+1.1%/+6.0%)* underperformed the markets, and bucked the trend toward larger-cap issues.
* (Market cap. weighted/price weighted)
Putting Humpty Dumpty Back Together Again, Part II—Wireless And Data Services Are Key
(Originally published January 12, 1998)
AT&T's (T—60 1/8, rated Buy by Guy Woodlief) purchase of Teleport Communications (TCGI—53 3/16, not rated)—giving it a toe-hold in the local market—came on the heels of SBC's purchase of SNET, which gives SBC a toe-hold in the lucrative northeastern markets. Both transactions represent a more significant blurring of the regulatory lines dividing disparate telemedia assets than any to date, including Worldcom/MFS (WCOM—28 3/4, rated Buy by Guy Woodlief). The latter, while symbolically large, did not involve as much total revenue and potential physical scale as these four players. Worldcom/MCI (MCIC—28 3/4, rated Buy by Guy Woodlief), while large in a revenue sense, is little more than consolidation within one of the balkanized subsets of the telecom world, namely long-distance.
Expect more of this reshuffling of the deck as the markets attempt to put humpty-dumpty back together again in the form of multiple end-to-end ma-bells. These new super-telecom-networks (STNs) will look vastly different from the old AT&T as they attempt to meld and develop multiple end-to-end networks into cohesive wholes. The challenge will be enormous, but the opportunity will be to capture a significant portion of a $500 billion telemedia pie 10 years in the future.
Wireless and data services will be key drivers as these STNs race to capture a significant portion of the customer's growing value-added needs beyond plain-old-telephone service (POTS). Wireless provides a ready form of physical bypass for existing competitors, while data services provide the basis for virtual by-pass. Data is a broadly used term encompassing many applications and services, but the key architectural distinction from voice is that data services have developed on store-and-forward, packet-switched networks, while voice has historically run on real-time, two-way, circuit-switched networks. It is the melding of these two over the next several years that should herald a revolution in communications. Wireless is important because it will enable that boom to happen faster and sooner than wired access.
The problem these consolidating STNs face, in addition to the rapid technology evolution on the network supply front, is the rapidly shifting demand landscape. These carriers have to develop flexible software and marketing platforms in order to sell a multiplicity of applications across a multiplicity of networks to a multiplicity of demand segments. Imagine developing a product chart that measures 1,000 x 10 x 10,000 just to illustrate the enormity of the task at hand. To us, 100 million solutions implies that telecom (telemedia) is anything but a commodity business going forward.
So just as there is going to be a lot of reshuffling in putting the pieces of humpty-dumpty back together, there will at the same time be forces tearing them apart. At a very minimum, we see the need for small and medium-sized enhanced service providers that focus on specific market-segment, application, geographic, and network-element niches. No one STN can satisfy all the potential demand, nor can the 5-6 of them we see developing out of the current large-cap restructuring.
What Is The Most Expensive Wireless Minute? The New Economics of Wireless
(Originally published May 4, 1998)
The new digital PCS entrants should take a page from the Japanese car companies that invaded the US in the early 1970s. The Japanese didn't take Detroit head-on with claims of offering power windows, power seats, FM stereo, etc. Rather they offered tremendous value both in price and gas consumption, which was what a large proportion of people wanted. "Just get me from A to B as cheaply as possible, with only a modicum of fuss," they said. Of course, over time the Japanese started upgrading the cars with the very same features that made American cars great just as the Americans went the opposite way, and the rest, as they say, is history.
So why, I ask, have the new digital PCS providers taken the analog cellular companies head-on? Instead of focusing on their primary virtue, namely cheaper minutes, they instead try to take cellular providers on in terms of things like coverage, phone features, phone price, etc. In my opinion, the reality is that entrenched cellular has the PCS providers beat hands down on many fronts, except capacity. Here the PCS providers, much like the fuel-thrifty Japanese engines, have a significant advantage in an area that counts the most for American consumers. Why? Because Americans are a mobile society. Better gas consumption fuels mobility as dramatically as cheaper wireless minutes.
So what is the most expensive wireless minute? Well, the one that a provider doesn't sell. In fact, I feel that most providers don't focus on their minute production at all, nor try to compare it with total minutes sold. Instead, most carriers focus on things like peak consumption in the most congested cells and/or in the switch. The latter is eminently scalable, and with sectorization and pico-cells, so is the former. I believe that most carriers end up selling less than 20% of their available and reasonable production (5 am to 1 am). And yet they are amazed when, with the large bucket plans (800 minutes for $80), their capacity utilization goes up and the percentage of traffic in the peak busy hour goes from 14-15%, as is the experience for high-priced cellular, to 9%, which implies a far more even distribution of traffic, and hence greater usage and profitability. I believe carriers should get that down to 5%-6%, or improve utilization across 20 hours of the day. That way, all those expensive minutes won't go to waste and burn off into the ether.
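The busy-hour arithmetic above can be sketched in a few lines. The busy-hour fill factor below is an illustrative assumption (how full the network runs during its single peak hour), not carrier data; only the busy-hour traffic shares and the 20-hour selling window come from the column.

```python
# Back-of-the-envelope sketch of the utilization argument above.
# The busy-hour fill factor is an assumed, illustrative number.

def utilization(busy_hour_share, busy_hour_fill=0.5, usable_hours=20):
    """Fraction of available production actually sold.

    busy_hour_share: fraction of daily traffic carried in the peak hour
    busy_hour_fill:  how full the network runs during that peak hour
                     (assumed; a fully loaded peak would be 1.0)
    usable_hours:    reasonable selling window per day (5 am to 1 am)
    """
    # If the peak hour carries `busy_hour_share` of the day's minutes and
    # the network can produce C minutes per hour, daily minutes sold are
    # busy_hour_fill * C / busy_hour_share, out of usable_hours * C
    # producible minutes; C cancels out of the ratio.
    return busy_hour_fill / (busy_hour_share * usable_hours)

for share in (0.15, 0.09, 0.055):
    print(f"busy-hour share {share:5.1%} -> utilization {utilization(share):5.1%}")
```

With these assumptions, a 15% busy-hour share implies selling under 20% of production, and flattening the peak toward 5-6% roughly doubles or triples utilization, which is the column's point.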
To get there, however, carriers must reject old notions about the wireless minute. First, they should shift the consumer's perception from the cost of a wireless minute, which is still viewed as expensive, to the value of the wireless minute within the consumer's time and financial budget. Why haven't we seen PCS providers tout ads that say, for example, "3 minutes on our network costs 30 cents, 3 minutes at a payphone costs 35 cents, 3 minutes on cellular costs $1.15"? Remember MCI's ads when they first started out? Remember Japanese cars that proudly stated they were cheaper and drove a lot farther on a high-priced gallon of gas?
Even the large-bucket PCS providers have not changed the perception of the public significantly. Despite almost two years of intense competition, people do not view a wireless minute like water. Water is essential and plentiful, yet people will pay premiums for it under certain circumstances. What do you think people think when they see a digital PCS ad offering 800 minutes for $80, with a phone for only $149? Expensive, I say! But when compared with the combined cost of local, cellular, and long-distance, these PCS plans are actually cheaper, and offer the user the utility of anywhere, anytime communication. For instance, calling grandma on the west coast for 3 minutes at a cost of 45 cents (the same call might cost more than 75 cents on a payphone), or a local friend for 15 cents over the PCS phone while buying a $2.50 coffee and bagel during a mid-day break, isn't all that expensive given the time savings.
Something is clearly wrong here, namely the inability on the part of the market to come to grips with pricing elasticity. In fact, as ARPu (price per minute) drops, ARPU (revenue per user) has clearly been shown to rise. Furthermore, because one customer using 400 minutes is much more efficient, in spectrum, marketing, and operating dollars, than four customers using 100 minutes each, cost drops dramatically. It is this tendency on the part of companies and, in particular, analysts to equate a high price per minute with high average revenue that is particularly dangerous for these high-capacity PCS providers with their expensive networks and enormous start-up marketing and operations costs. If the PCS providers continue down this path, they may never generate a return on that investment, or, at a minimum, their debt leverage could, I believe, wipe out their equity values.
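The efficiency claim about one 400-minute customer versus four 100-minute customers can be made concrete. The per-subscriber fixed cost and per-minute network cost below are assumed numbers chosen only to show the shape of the argument:

```python
# Illustrative assumptions, not actual carrier economics: a fixed
# monthly cost per subscriber (marketing, billing, support) plus an
# incremental network cost per minute.

FIXED_COST_PER_SUB = 20.0    # $/month per subscriber (assumed)
NETWORK_COST_PER_MIN = 0.02  # $/minute of network cost (assumed)

def cost_per_minute(subs, minutes_each):
    """Total cost divided by total minutes for `subs` identical users."""
    total_minutes = subs * minutes_each
    total_cost = subs * FIXED_COST_PER_SUB + total_minutes * NETWORK_COST_PER_MIN
    return total_cost / total_minutes

one_heavy = cost_per_minute(1, 400)   # one customer at 400 minutes
four_light = cost_per_minute(4, 100)  # four customers at 100 minutes each
print(f"1 x 400 min: {one_heavy * 100:.1f} cents/min")
print(f"4 x 100 min: {four_light * 100:.1f} cents/min")
```

The same 400 network minutes cost far more per minute when spread across four subscriber relationships, because the fixed acquisition and support cost is incurred four times.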
All-you-can-eat (AYCE) plans changed perception, but they were just plain foolish. The reality is that wireless spectrum is inherently shared and dirty, and therefore its capacity is not as unlimited as fiber-optics. The last thing a wireless carrier should resort to is AYCE pricing. Bucket plans are fine, but smart users will utilize them to their fullest, while the average potential customer will not appreciate the real value of the plan. Instead, carriers should resort to what the long-distance industry (and many others) has resorted to, namely marketing plans that get the customer to perceive one value while yielding a higher one. These are software and marketing, not technology, solutions. For instance, the now-famous Dime program from a long-distance carrier had the customer perceiving a 10-cent-per-minute price, yet actually paying, or yielding, a blended 13 cents. The carrier's cost was 7 cents. I remember when other analysts said that that carrier was giving away the farm. In the end competitors dropped their rates to a flat 10 cents and lost a bundle. They hadn't played the perception game well.
I've actually developed an optimal pricing plan, called PoWeR (Prudential Wireless Research), which incorporates the features described above while getting the customer to unknowingly help the carrier utilize spare capacity or defray marketing costs, like Fridays Free or Friends & Family. Furthermore, it takes advantage of the fact that no customer can be in two places at once, and that we are all 7-by-24 creatures. PoWeR is based on zone pricing, which can be accomplished quite cost-effectively with current technology and a few software programs. The plan actually sells 900 minutes for $55, or an average price of just 6 cents per minute, yet yields 11.3 cents, including 30% usage of long-distance minutes at just 10 cents. The cost to the carrier is just 4.3 cents. Imagine 65%-70% margins in a competitive world. Even if the customer calculated that they were spending over 8 cents on average versus the 6 cents I sold them on, they still wouldn't care, because their perception of a wireless minute is probably 50 cents or more. Now that's as big a win/win opportunity as I've ever seen. So what's the most expensive wireless minute? The one that nobody heard about and that evaporated into the ether before it could ever be used!
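The "perceive one value, yield a higher one" mechanics can be sketched with a blended-yield calculation. The traffic mix below is purely an assumption; only the Dime program's 10-cent headline, 13-cent blended yield, and 7-cent cost appear in the column, so this is just one mix consistent with those numbers, not the carrier's actual plan.

```python
# Hedged reconstruction of the perceived-vs-yielded price gap.
# The 60/40 mix and the 17.5-cent non-promoted rate are assumptions
# chosen to reproduce the column's reported 13-cent blended yield.

def blended_yield(mix):
    """mix: list of (fraction_of_minutes, price_per_minute) pairs."""
    assert abs(sum(f for f, _ in mix) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(f * p for f, p in mix)

# 60% of minutes at the promoted off-peak dime, 40% at an assumed
# 17.5-cent peak/other rate.
dime_mix = [(0.60, 0.10), (0.40, 0.175)]
y = blended_yield(dime_mix)
cost = 0.07  # the column's stated per-minute cost
print(f"perceived 10.0c, yield {y * 100:.1f}c, margin {(y - cost) / y:.0%}")
```

The point of the exercise: the customer anchors on the promoted rate, while the carrier's economics ride on the blended yield across the whole traffic mix.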
What's In A Name?
(Originally published March 24, 1998)
With all the consolidation going on in the telecom industry, a little-noticed casualty is the corporate name--a name that typically has cost tens to hundreds of millions of dollars to promote. For instance, when 360 and Alltel announced their merger last week, management indicated that the 360 name, which had well over $50 million of direct advertising poured into it, would be dropped. Likewise, the names NYNEX and Pacific Telesis have been dropped in favor of those of their corporate acquirers, Bell Atlantic and SBC, respectively. In the past 5 years, consumers in 360's markets have received service from the same provider under three different names, and now have a fourth to contend with. We can only guess if there will be a fifth within the next 2-3 years.
All of this results from the process of undoing the inefficient regulation of the telecom industry over the past 100 years. There is nothing wrong with this process, other than that the market is losing some pretty good brands and customers are probably just getting more confused. The latter is being further compounded because just as corporations are consolidating and changing out names, new companies--who had heard of Qwest, RCN, or Level 3, or remembered Wiltel, just two years ago?--are being formed on a daily basis. Moreover, customer confusion can only get worse as hundreds of enhanced data service providers, like ISPs, start encroaching on traditional voice markets.
The real question is whether the market can support so many vertically integrated (network, transmission, switching, application) providers, all with their own marketing and customer support programs. At some point diminishing economies may set in as so much capital is invested in these infrastructures. This, in fact, may be what lies behind the consolidation of the bigger players. For that matter, how long can even Bell Atlantic and SBC, or even BellSouth, retain their corporate identities if the trend is increasingly towards national networks? What customer in Oregon or Massachusetts would be drawn to the name BellSouth? Of course this begs the question of how long a company like BellSouth, as a full-fledged, heavily-capitalized, facilities-based provider, can remain a regional-only player.
Our central thesis is that networks and technology will be less valuable going forward in the acquisition, retention, and stimulation of customer demand than application software and the marketing of those services. The challenge to all carriers is to sell a multiplicity of applications to a multiplicity of market segments on a multiplicity of networks. This industry is far from being a commodity business. If anything, it is an industry with potentially infinite demand. The issue is capturing that demand cost-effectively. Normally that is done with rather simple messages and well-known corporate brands. So another question arises: can newly joined corporations retain dual, or split, personalities in their corporate names? How long will the name Worldcom-MCI stay as it is? If the MCI name disappears, then what ultimately was the value of that brand?
We believe the solution to this name dilemma may be the restructuring of the telecom industry along the lines of the OSI stack. The result should be multiple players and multiple layers. One company might become a specialist in the physical layer, while another might become a specialist at the SONET switching layer. This notion fits well with our 10-year view of the world that the telecom landscape will be a series of intranets that are horizontally, not vertically, configured.
Paging Leaders Gather; Same Issues As Six Months Ago, But Here's A Siren Call
(Originally published April 27, 1998)
Other than more bankers and analysts showing up at the semi-annual paging leadership gathering, it seemed that many of the issues raised were similar to those at the last gathering in October. Chief among these was the push to profitability and positive net present value on existing and future subscribers. Carriers and bankers alike espoused the view that slower growth was better for the industry, even if it meant declining customer bases in the near term. Rational pricing, it was generally agreed, would be a positive as well.
I agree with these views, but only partly. First, networks benefit from scale economies, and rapid loading normally leads to the greatest profitability. Slow the growth too much, and scale economies could eventually be hurt. Second, I believe that in today's age of Moore's Law, carriers have to be cognizant of offering significant increases in value for any form of price increase. Raise prices too much, or with no incremental value added, and consumers are likely to perceive the cost of cheap-beeps as being too close to that of broadband alternatives. It is worth noting that one analyst indicated on a panel that she had gotten rid of her pager and relied solely on her digital broadband voice/messaging/dispatch device from Nextel.
Narrowband messaging, or paging, clearly occupies a space in the world of wireless that is separate and distinct from the broad- and wide-band segments, in my opinion. It offers a form of access, both physical and virtual, that other service providers in the telecom marketplace should recognize. Because of its inherent tolerance of latency (it is essentially a store-and-forward medium, not a real-time, two-way one), narrowband offers low-cost, reliable network solutions compared to broadband. Regional and national implementations are more cost-effectively achieved. Furthermore, because of that same latency tolerance, narrowband one-way and two-way networks offer advantages for in-building and wide-area geographic coverage.
Perhaps one of the hidden values of today's paging providers is the installed base of existing customers, each with a unique identifiable address, or telephone number. This existing "connection" represents a form of virtual access that, when combined with enhanced messaging applications, could be expanded to direct physical access in the person's pocket (i.e. information services, advertising push, and other enhanced offerings). Furthermore, smart service providers within the paging space should recognize the opportunity of attacking the market with commercial "intranet" offerings to corporate users.
Unfortunately, the world of text messaging lacks the ubiquitous interface of touch-tone that made numeric paging so popular and led to paging becoming an accepted form of communicating. The internet is not the solution to this problem, as some would suggest, because it involves multiple addressing schemes and multiple steps by the sender and receiver. In my opinion, an interface is needed, and needed soon, particularly as the industry evolves into 1.5-way and two-way applications, to make the sending and receipt of text messages as simple and widely accepted as numeric paging, facsimile, 800 numbers, the world-wide-web, simple dial-1, and even the work-group, push-to-talk application from Nextel. Furthermore, I believe the evolution to high-capacity, more expensive, narrowband platforms will only increase the complexity of marketing solutions to customers, and possibly decrease the economies of scale of any one application or device. As a result, some form of ubiquitous interface is required for this industry to be squarely in the middle of the messaging world and achieve underlying economies of scale. For instance, selling voice-mail stand-alone is a lot less profitable than selling it combined with telephone service. The telephone is ubiquitous and touches billions of other devices. Voice-mail systems, on the other hand, are typically closed systems and offer no opportunity to continue the communication beyond other users on that system.
The narrowband messaging industry needs to move quickly, because every day I sense the broadband cellular/PCS industry is outspending it 100 times over on the marketing, distribution, customer service, and capital investment fronts. And whether or not the recent Wall Street Journal article, which cast broadband competition in Florida in a negative light, was correct, the reality is that a shift in perception toward low-cost minutes is likely to stimulate demand significantly for broadband minutes and services. The competitive issue then arises: if communicating over two-way broadband networks becomes so relatively inexpensive, who needs store-and-forward numeric or text messaging? Only if this industry gets together can it survive the coming onslaught profitably. An industry built on niche applications and market segments, with sizable regional and national build-out requirements, cannot survive in today's rapidly evolving, and from the demand perspective confusing, telecom business.
Will Some Carriers With Guts Please Stand Up At PCS 98?
(Originally published October 5, 1998)
PCS 98 in Orlando was what one could have expected from a telecom show these days. It was big, yet it took only three hours to traverse the exhibit floor, and that included several 10-15 minute conversations with old and new contacts. Aside from the normal handset innovations (sure, the new iDEN phone is impressive) and yet another generation of new, two-way paging devices, many of the headlines seemed to center around CALEA and other regulatory issues, and 3G. In other words, there was nothing from a service or carrier perspective that got my juices flowing. It seemed to me that nothing was new that would really make it easier for one person to talk to another and, in the process, spark a revolution!
Instead, while the carriers are fighting for their strategic and financial lives, the vendors put on a big show about 3G. 3G (the next-generation digital cellular/PCS standard) is, in my opinion, to current protocols what ISDN was to POTS and SMDS was to X.25. It is monopoly overkill and a senseless exercise, pursued purely to keep a lid on competition for both vendors and carriers. (We will devote an entire column to 3G in the near future.) Suffice it to say, it is interesting to note how hard the monopolist/duopolist standard bearers (on both the vendor and carrier fronts) are trying to suppress inclusion of the current CDMA format (IS-95), which was a direct outgrowth of the most competitive wireless environment in the world.
Shifting gears, there were some interesting insights to be gained from a panel that I hosted about "Increasing The Revenue Pie" for paging and messaging. I asked each of the three panelists from the industry to explore seven case studies of applications that were or were not successful at integrating the simple paging service into a business process. What came to light during the panel was that future applications are almost infinite in scope, but that the challenges on the revenue (pricing), distribution (marketing), support (operating costs), and development (investment & R&D) fronts are equally enormous. They are magnified by the fact that the paging industry has not historically been marketing driven.
What the paging industry needs to do, in my opinion, is find the solution to migrating its existing base of one-way numeric customers to higher-ARPU text messaging services. Paging companies send messages better than anybody, and touch-tone made that process ubiquitous, letting everybody get in touch with everybody. Unfortunately, low-cost, long-battery-life digital PCS and cellular offerings are becoming competitive, in a value context, with numeric paging (the notion being not that they are a pure substitute, but that the need to own a numeric pager above and beyond the broadband device declines for a significant portion of users). To accomplish the simultaneous goals of upgrading the existing base and selling new, two-way applications (which individually are niches, but taken as a whole amount to a mass market), the paging industry needs to create its own version of touch-tone for text messaging.
On a similar note, until cellular and PCS carriers realize that they are not selling handsets, but rather cost-effective minutes, they will be caught in an unending, competitive cycle not dissimilar to the one paging underwent in 1996-1997. At current prices wireless is, in my opinion, clearly becoming a substitute for (or complement to) wireline services. The important distinction is that this pricing not be compared to the price of cellular or even wired telephony, but rather to the time and financial budget of the person involved. Only then will the market begin to believe that long-term ARPUs can rise and even be sustained well north of $50 or even $60.
In the end, it's the marketing, dummy! Until the vendors help the carriers (or perhaps vice versa) recognize this and shift from being supply-centric to demand-centric, these large telecom shows will continue to offer the same old same old. No guts, no glory.
Beware Double Counting Of Revenues, Internet Access, & The Death Of The PC-Tech Boom; They Are All Related
(Originally published April 12, 1998)
Remember last week, when I likened telecom to the boom/bust cycle of the wild west? Well, here's something to think about on the valuation front. How large is the long-distance market? $80 billion in revenues? Wrong! It is between $50 and $60 billion net of access charges. On the same note, how large is the local telephone market? $105 billion? Maybe not, if we take access revenues out of the equation. For that matter, if access charges were to be significantly reduced or disappear completely, enterprise-value-to-sales multiples would increase dramatically for the industry. The CEO of the largest long-distance company recently stated that if access charges were significantly reduced he would pass the entire savings on to consumers. Without any incremental demand stimulation, and assuming he could keep his other costs the same, this action would cause revenues to decline by more than 20% and operating margins to expand to the 35% range from the current 25% range. The same goes for the LECs, who count those fees as revenues. Needless to say, this event probably has not been factored in by the market. As we said last week, speculative busts are likely in this new wild west of telecommunications.
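The pass-through arithmetic above can be checked with the column's round numbers. The two access-charge estimates below are assumptions bracketing the $20-$30 billion implied by an $80 billion gross, $50-$60 billion net long-distance market:

```python
# Sketch of the access-charge pass-through arithmetic.
# GROSS_REVENUE and OPERATING_MARGIN are the column's round numbers;
# the access-charge levels are assumed brackets, not reported figures.

GROSS_REVENUE = 80.0     # $B, long-distance revenues including access
OPERATING_MARGIN = 0.25  # current operating margin on gross revenue

for access in (20.0, 25.0):
    op_income = GROSS_REVENUE * OPERATING_MARGIN  # unchanged in dollars,
    # since the access savings are passed straight through to consumers
    new_revenue = GROSS_REVENUE - access
    decline = access / GROSS_REVENUE
    new_margin = op_income / new_revenue
    print(f"access ${access:.0f}B: revenue declines {decline:.0%}, "
          f"margin {OPERATING_MARGIN:.0%} -> {new_margin:.0%}")
```

Either bracket reproduces the column's claim: revenues fall by more than 20%, and the same operating income dollars on a smaller revenue base push margins into the mid-30s.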
The reason for access charges in the first place is to support universal service. The latter is an anachronism in competitive telecom markets, where intranet-like services pave the way for free access. If the FCC fully understood this, then it would not, in my opinion, have struck a body-blow to the internet-voice telephony companies last week in the form of access fees for voice services. Voice, you see, is the holy grail for the data industry, which has had a hard time making real financial returns on dial-up data services over old-fashioned analog lines. If those same connections could be used not only to access data, but voice nets as well, then the increased revenues should quickly cover the associated marketing and support costs that the data guys never fully factored into their models to begin with. As a result, the FCC decision could seriously retard the internet sector's ability to compete head to head on a local basis. In the past, new competitors have developed networks by sharing existing facilities with entrenched carriers and building new networks when enough revenue dollars have been captured to justify the huge build-out costs. Conversely, the "build it and they will come" philosophy has typically resulted in sub-optimal returns over time.
Which brings us to the last, and related, topic, namely the greatly exaggerated death of the technology boom. Not only was there an article about this in the Wall Street Journal this morning, but also a recent clip in the trade rag Electronic News about the end of the PC decade and its negative implications for the tech sector. If anything, these prognostications can be tied to the same conditions that undid IBM 15 years ago, namely the lack of high-speed bandwidth, or access, to the terminal user's location. Whereas back then it was the local processing of the PC that countered the slow-speed, high-cost analog connection through the monopoly telephone network to the mainframe computer, today it is moving large files of information to and from very powerful PCs that could well be limiting demand. After all, once the user has enough processing capability to handle high-end games, spreadsheets, publishing, and database operations, what more is there? The answer lies in workgroup and other multimedia applications, which would significantly increase demand for a whole new generation of silicon-based processors (high-end multimedia network stations). Only this won't happen until there is low-cost, high-speed access available to a large portion of the population. Bypassing current access charges is one big reason for building high-bandwidth access.
The over-riding goal of the FCC is to stimulate competition locally and make high-bandwidth access as ubiquitous as possible. So it all gets down to the issue of access charges, and the FCC sits at the middle of it. The issues are complex and the solutions are initially painful. The current system of inefficient access charges can be used as an incentive to move many parties, including the LECs, in the right direction. The $20 billion of access charges should be viewed as a lubricant facilitating investment in local facilities. It gives the small guy an opportunity to develop sufficient scale to survive on his or her own and undo the inefficient cost structure that led to the access charges to begin with. Remember, the LECs are built on 80 years of inefficient, rate-of-return regulation, with little thought given to rapid incorporation and amortization of new technologies. To date nobody, in my opinion, has come up with the real cost of local telephone networks and the cost of access in a competitive world. I challenge those involved at the FCC to show me how costs are better taken out of the system than through the introduction of new small competitors who have to scratch every day for a living and implement the most cost-effective and innovative solutions to satisfy the needs of the marketplace. As far as I can see, the FCC is managing its level best to destroy any innovation, and is extending the entrenched monopolists' (the LECs and the Wintel consortium) grip over the free market.
How To Bust The Wintel Monopoly
(Originally published March 10, 1998)
Here's a simple solution to Capitol Hill's quandary over busting the Wintel monopoly. Bust the local telephone (LEC) monopoly! Putting some teeth into the Telecom Act of 1996 and enforcing the 14-point interconnection plan would result in a rapid digitization of the local loop. With high-bandwidth access, users would likely become more tied into the Net and collaborative applications. In our opinion, the client-server debate would swing in favor of the server, where the centralized purchasing decision ultimately accounts for the purchase of the client as well. A rapid shift would give the advantage to the hundreds of companies in the intranet food supply chain. A slow shift (preserving the LEC monopoly) would benefit the Wintel monopoly, giving it time to leverage its monopoly position on the client side into a monopoly position on the server side.
In order to understand this, let's look back at the last 20 years. The stage for IBM's demise (an earlier monopoly) and the Wintel ascendancy had its foundations in the old AT&T monopoly. While data processing requirements were exploding in the 1970s, the telephone network remained a high-priced, low-speed, analog affair. The PC solved that distance problem. It wasn't until the mid-80s, with the break-up of the telephone monopoly, that high-speed, low-cost, digital networks were introduced. But by then the dirty deed was done--the notion of the centralized server (mainframe) was dead. Long live the PC! Long live processing at the edge!
But then a funny thing began to happen: the PCs began to get connected, first over local, then wide-area networks, not unlike what happened with voice telephones in the 1890s. By the early 1990s work-group applications began to appear, and at the same time the amorphous mass known as the Internet reared its majestic head. Our research indicates these trends were driven mostly by the competitive long-distance network, which introduced four competitive price bands: raw material--a.k.a. bits per second; commodity bandwidth--a.k.a. private lines; wholesale--a.k.a. Tariff 12 or virtual private networks (VPNs); and retail--a.k.a. Dial-1--which fell 99%+, 90%+, 70%+, and 50%+, respectively, in 12 years. In turn, arbitrageurs utilized these spreads to roll out a host of enhanced voice and data applications. Long live the Net! Long live processing at the core!
So now the debate rages as to whether the client (PC) or the server (Net) will reign supreme. We feel that debate misses the fundamental trend in communications, namely that information can't be controlled by any one party. The old information monopolies (broadcast/entertainment, switched/communications, and info processing/data) are giving way to a series of intranets and extranets, whose underpinnings are the OSI stack we spoke of two weeks ago. These intranets and extranets are redefining corporations. In fact, corporations are beginning to base their very existence on them. Services and applications like 800, voice and data VPNs, fax, and WANs--all outgrowths of the competitive long-distance markets--have served to fundamentally change the economy. Long live the corporate buyer!
It is the corporate buyer that will drive high-bandwidth services into the last mile, both for intranet and extranet applications. The latter point could be one of the fundamental issues in the telecom world over the next ten years, as the market grows from $200 to $500 billion, or 4% to 7% of GDP. The corporate buyer brings purchase economies into the last mile, enabling the network provider to generate a return on an investment that could be obsolete within 1 to 3 years. We only have to look at the development of data networks over the past few years to appreciate the potential for rapid obsolescence. As a result, this development should be even more anti-inflationary than what the economy has experienced heretofore. In fact it pays for itself, as we witness a shift from the real to virtual movement of people, goods, and ideas. Long live multimedia networks!
We expect to see more consolidation as competitive telecom companies seek to serve the corporate user's wide area needs. Ten years from now, multiple intranets may well be going into a single house--paid for not by the homeowner, but by the intranet or extranet owner. These intranets could come in on wired or wireless facilities, and be owned or leased by the intranet provider. Of the $500 billion market, 70%-80% may well be paid for by the corporate user, making the universal service fund issue moot. Access will be almost free! Oh, and another thing, the pendulum will continue to swing, only not as widely. Processing will never be totally at the edge, nor totally at the core. There will always be a trade-off between intelligence and transmission, not only because of costs, but because of customer preferences.
We Hate To Say It, But We Told You So!
(Originally published March 30, 1998)
Back in February I was roundly criticized at CTIA by members of the FCC for suggesting that they consider auctioning nationwide, not balkanized regional, wireless licenses. We were told it ran counter to public policy, which was looking out for the small user, the little guy, the guy on the farm, and the generally disadvantaged. The best way to serve these customers, the FCC advisors stated, was to keep licenses out of the hands of the large carriers, or "big guys." Regulators believe that small carriers will look out for those small users better. Well, after another disappointing set of frequency auctions, we can say, unequivocally, that they were wrong! And to compound matters, we feel those very same disadvantaged folks have been made relatively worse off!
Our vindication stems from the LMDS auctions that concluded last week. The LMDS auctions consisted of two blocks of spectrum (the A block consisting of 1.15 gigahertz and the B block consisting of 150 megahertz) in the 28 GHz frequency range across 493 BTAs, or 986 licenses. A year ago, these fixed wireless auctions to develop networks for video, data, and voice services (essentially high-bandwidth wireless bypass) were expected to generate $3-$8 billion in gross bids. As business plans developed, highlighting the numerous complexities of developing such networks, that number fell to $1-$5 billion, finally centering around $2-$4 billion earlier this year. After a relatively quick auction, the actual number came in at a disappointing $579 million (nearly $900 million on a gross basis before small-business discounts), fully 80% below the average pre-auction estimate, and well below what could have been. Even worse, 109 BTAs weren't even bid on and 181 received only one bid, leaving the outlook for competition in those markets seriously in doubt.
The primary reason for the poor showing is that the "big guys" never showed up at the auction. They were deterred by the enormity of trying to piece together 493 BTAs from scratch, a feat that has been accomplished by only two other carriers in an open-market fashion, namely Sprint (FON, $68) and Nextel (NXTL, $33). As far as the large carriers are concerned, only national or near-national footprints will allow them to scale existing operations effectively. Two other nationwide providers, Teligent (TGNT, $30) and WinStar (WCII, $44), managed to assemble their nationwide footprints during pre-auction days directly from the FCC for next to nothing. Even mighty AT&T (T, $66) does not have a nationwide network at a common frequency.
Our sense is that 2 or 3 nationwide frequency blocks (400-600 MHz each) would have netted $0.5-$1.5 billion each. This $1.5-$4.5 billion would have been consistent with earlier forecasts. Furthermore, these nationwide blocks would likely have been built out more rapidly and ubiquitously than the individual markets combined. Why is this? Simply because telecom networks benefit from economies of scale at the equipment level, marketing level, and operating and support level. Rather than promote competition, the FCC may have hurt it, not only in the areas that were auctioned off, but in existing markets, since economies of scale on the equipment side are not likely to be achieved. We've witnessed this numerous times in the wired and wireless worlds, where no one provider has been able to achieve national or near-national scale.
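A quick calculation, using the figures cited above, shows both the size of the shortfall and what the hypothetical nationwide-block alternative might have yielded:

```python
# LMDS auction shortfall, using the numbers cited above.
net_bids = 0.579e9                       # actual net bids ($579 million)
estimate_low, estimate_high = 2e9, 4e9   # final pre-auction estimate range

avg_estimate = (estimate_low + estimate_high) / 2
shortfall = 1 - net_bids / avg_estimate
print(f"shortfall vs. average estimate: {shortfall:.0%}")   # ~81%, "fully 80% below"

# Hypothetical alternative: 3 nationwide blocks at $0.5-$1.5 billion each
blocks = 3
alt_low, alt_high = blocks * 0.5e9, blocks * 1.5e9
print(f"nationwide-block range: ${alt_low / 1e9:.1f}-${alt_high / 1e9:.1f} billion")
```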
A national fixed wireless provider should be able to have an advantage over any wired offering by ensuring LAN/MAN/WAN connectivity for corporate intranets with suppliers, vendors, and employees. These intranets increase in value the more access points are created. For example, the 100th access point on an intranet is more valuable than the 10th, while at the same time typically costing significantly less to acquire, maintain, and stimulate demand. This is how the rural and lower tele-density areas get served. Carriers are driven by their corporate customers to cast their net as far and wide as possible.
As we've said in the past, today's telemedia market is well over $200 billion and could be $500 billion within 8-10 years, implying real growth around 10%. The market for intranets is potentially unlimited. National networks that can carve out a small sliver of this potential demand are already given strong valuations. The total market value of our domestic universe is $765 billion. A reassembled nationwide LEC network would be worth $300-$360 billion, generating approximately $106 billion in revenues. A reassembled nationwide cellular network would be worth $70 billion, generating approximately $13 billion in revenues. What's $1-$2 billion for footprint to be a major player in this marketplace?
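The implied growth rate is easy to verify: compounding $200 billion to $500 billion over 8-10 years does indeed work out to roughly 10% real growth per year:

```python
# Implied compound annual growth: $200B today to $500B in 8-10 years.
start, end = 200e9, 500e9
for years in (8, 9, 10):
    cagr = (end / start) ** (1 / years) - 1
    print(f"{years} years: {cagr:.1%}")   # 12.1%, 10.7%, 9.6%
```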
US telecom public policy has to change, and change quickly! The world is going the way of intranets and free access. Regulators concerned about the cost of access and the bypassing of the small user and disadvantaged do not understand this trend. These intranets are geometrically more valuable the larger they get, which results in the large carriers, or "big guys," offering access to the little guys. This in turn is the solution to the issues of universal service and broadband access for rural and disadvantaged market segments. Furthermore, smaller carriers are not disadvantaged, but rather have the opportunity to sell, distribute, and develop new applications on the "big guys'" networks. Over time, if they are successful, the small carriers can indeed grow up to be "big guys." In recognition of this concept, one important change to current FCC policy would be to hold nationwide frequency auctions only! Helping the "big guys" actually makes it easier for the "little guys."
To Bust The Wintel Monopoly And Solve Universal Service, Think National!
(Originally published March 19, 1998)
Last week, while making the case for busting the Wintel monopoly through busting the local access monopoly, Bo and I said that corporate intranets and extranets would pay for universal access. This is because, heretofore, competition has resulted in nationwide infrastructures driving both wide-area economies of scale and prices established on the margin, resulting in lower costs for all users. In many cases access is free to the end-user, as with 800 services. This same development is occurring on the internet, where the client browser is being given away for free to capture the larger and more lucrative server demand (whose cost rises only marginally with each incremental user). As a result, we should throw out the window 100 years of conventional wisdom on the part of regulators that focused on local access to look out for the disadvantaged, little, and rural guys.
As we pointed out two weeks ago regarding the CTIA show, this conventional way of thinking has been driven by supply-side, technology-centric thinking. But in our opinion the reality is that the future will increasingly be driven by demand-side, marketing- and software-centric approaches. In the competition model, networks become more valuable as more people have access to them. So one gives away the client (the edge users) to increase the value of the server. Remember, the math of competitive networks is N*(N-1)/2--each incremental user adds value geometrically. At the same time, the cost of incremental access is lowered by the scale economies achieved by the intranet. So both the centralized buyer and the end-user win!
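That N*(N-1)/2 relationship is simply the count of possible connections among N users, a minimal sketch:

```python
# Value of a network grows with the number of possible connections:
# N users can form N*(N-1)/2 distinct pairs.
def connections(n: int) -> int:
    return n * (n - 1) // 2

print(connections(10))     # 45
print(connections(100))    # 4950
# A 10x increase in users yields a 110x increase in connections,
# while the cost of adding each access point grows only linearly.
print(connections(100) / connections(10))   # 110.0
```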
Over the past 15 years, wherever competition has entered the markets, the result has been to centralize, consolidate, and regroup whatever the government tried to split up and balkanize in the name of fairness to the little guy and the disadvantaged. It's happened in wireline voice and data, paging, SMR, PCS, 38 GHz, and other networks. In fact, we can't think of a single instance where the reverse is true. The result has been that the inefficient allocation of telecom resources has actually retarded the pace of service evolution to those same users the regulators and politicians were trying to protect.
The reason why national scale is important is not just the technology perspective (the cost and implementation of networks based on standardization and purchasing economies, as in the SMR world with iDEN), but perhaps more so the cost of marketing, fulfillment, and customer support. We believe these latter costs represent nearly 60% of the cost structure in the competitive network model. Typically, leading-edge technologies tend to get bought by high-end, high-volume users whose most important telecom needs--be they messaging, voice, data, or multimedia--tend to be wide-area in scope. What we are witnessing in the telecom world today is simply supply following demand. That demand is being driven by the need to move LAN traffic to MAN traffic to WAN traffic. The islands were created first because the monopoly retarded the development of WANs, as we pointed out last week.
So regulators should take this thinking to heart as they auction future wireless spectrum and set telecom policy. We believe that not splitting wireless frequencies into MTA and BTA allotments not only improves efficiency, but actually works toward the FCC's principles of developing competitive and level playing fields that benefit the public good. This holds for both mobile and fixed wireless networks. In a competitive mobile market, a nationwide provider would try to build as quickly as possible to satisfy customer demand and perception. Therefore the high-end urban user would drive demand into the rural markets (which, according to old thinking, would be underserved because demand didn't exist). While less obvious in the case of fixed wireless like MMDS and LMDS, the same holds true because of marketing, operating, content, and even infrastructure-buying scale! How many corporations would outsource all their data transmission needs to be on a ubiquitous LAN/MAN/WAN network? How would the corporation reorient itself if distance and time weren't an issue? Work at home? Not a problem. Locate employees and operations in rural, less costly markets? Not a problem.
The U.S. cannot continue to develop leading telecom infrastructures (which it clearly has in the voice and data corporate markets) as long as the misguided thought that local interests are better served by localized and balkanized monopolies remains in place.
On a sad note, it is too late to apply these principles to the LMDS auctions underway, but future frequency auctions could be more rationally allocated. It remains to be seen what the outcome of the balkanized LMDS auctions will be, but they will likely fall far short of expectations because national players couldn't achieve sufficient scale in their business models to justify getting into a few local or regional opportunities. As a result, LMDS is being touted as a hoped-for golden bullet, delivering everything and solving competition issues at the same time. But there are already three other higher-frequency players (who are national in scope) vying for this same opportunity, as well as satellite, wired cable, and telephone networks, not to mention the down-on-their-luck MMDS players. None of these have been proven out on a stand-alone, local basis.
Regulatory Malaise Creates Strong Case For Wireless Bypass
(Originally published January 5, 1998)
We made the case early in 1997 that wireless carriers should be viewed as competitive local access for the long-distance and enhanced-service-provider industries to bypass expensive analog local telephone monopolies. While investors woke up to that siren song in stocks like Teligent (TGNT--24 1/16), up 12.8% since its November IPO versus 0.4% for the S&P 400, and WinStar, they have yet to appreciate that potential in other wireless carriers like PCS, paging, and MMDS. This is due to the fact that most wireless managements don't realize that their operations (particularly things like marketing and customer support) should be integral to wireline applications in order to amortize those costs over a broader base of revenues. Last week's court ruling, throwing the Telecom Act into question and perhaps slowing down competitive local entry, should be viewed opportunistically by wireless carriers. In other words, wireless is not a standalone business, but rather simply another way of accessing the customer demand that every carrier is competing for in the telemedia world.
This access comes in two forms, physical and virtual. By physical we mean construction of facilities that replace twisted-pair copper lines for carrying information bits for messaging and Internet services, voice, high-speed data, and multimedia applications. This form of bypass is highly capital intensive, since wireless networks are essentially all-or-nothing affairs. When combined with stand-alone marketing and support platforms, these models have often proven uneconomic (e.g., paging and MMDS). These facilities-based carriers have to be extremely aggressive in order to make, or look to be making, a return in under three years (our definition of the capital investment horizon for wireless technologies).
Virtual bypass has the opportunity to be more profitable from an investment perspective since it requires less up-front capital. In addition, just about any wireless carrier (or wireline or enhanced service provider) with the right marketing and software programming skill sets can implement this form of bypass. The concept of virtual bypass is drawn from the successful marketing programs of the 1990s in the competitive long-distance industry, like Friends & Family, The Most, 1-800-COLLECT, and others.
These programs all shared important features. The first is that they all had a simple, understandable "marketing" hook. Oftentimes these services sold themselves because they made intuitive sense. This lowers acquisition costs. Second, their "perceived" cost differed from their actual "revenue" yield. For instance, Sprint's (FON--58, rated Buy by Guy Woodlief) dime plan was viewed as unprofitable by most analysts because the all-in cost per minute was thought to be around 7 cents, providing a 30% margin. This was considered too low in a competitive world, with high acquisition costs, churn, and rapid technology obsolescence. In fact, with breakage, Sprint was generating revenue yields of 13 cents per minute, providing a 45% margin.
While the customer perceived 10 cents, Sprint "received" 13 cents. Even if the customer were to go through the calculations, they would not have been particularly peeved since the conventional thinking at the time was that long-distance prices were generally between 15-25 cents. Similarly, today wireless management should focus on understanding the customer's perceived needs and finding the most economic ways to satisfy those needs.
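The gap between the perceived dime and the received 13 cents comes from billing in whole-minute increments ("breakage"). A small sketch with hypothetical call durations shows the effect:

```python
import math

# Breakage in a dime-per-minute plan: calls are billed in whole-minute
# increments, so revenue per *actual* minute exceeds the advertised price.
price_per_billed_minute = 0.10
cost_per_actual_minute = 0.07

calls = [0.4, 1.2, 2.5, 0.8, 3.1, 1.6]   # hypothetical call durations, minutes

actual_minutes = sum(calls)
billed_minutes = sum(math.ceil(c) for c in calls)   # each call rounded up

revenue = billed_minutes * price_per_billed_minute
yield_per_minute = revenue / actual_minutes
margin = 1 - (cost_per_actual_minute * actual_minutes) / revenue

print(f"yield: {yield_per_minute * 100:.1f} cents/min")   # ~13.5 cents
print(f"margin: {margin:.0%}")                            # ~48%
```

With this particular (made-up) mix of call lengths the realized yield lands in the same neighborhood as the 13 cents and 45% margin cited above; the real numbers depend on Sprint's actual traffic distribution.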
From this point of view, a paging customer is no longer a cheap-beep user, but rather an information consumer, a voice and e-mail messaging user, and a single-number point of access. Just think what value paging companies might have if those managements figured out what to do with the 30-plus million existing telephone numbers they own that serve to control the customer. So too, PCS providers should sell long-distance not at 15-25 cents per minute (cost is now down to 4 cents for transport and termination), but rather for less than 10 cents, in order to capture not only the consumer's wireless demand, but also wireline demand. As "me-too" cellular providers, PCS companies are doomed to fail. Likewise, MMDS providers realized only too late that as "me-too" video companies they did not have the marketing appeal to achieve a high penetration level within three years. Only after they used up their initial investments on this failed strategy did they realize that their bandwidth could be used for the more important (again, from the customer's perception) high-bandwidth data services. Wake up to the marketing, not technology, opportunities that wireless access provides, and the telecom world is your oyster.
The FCC Is Solving A Problem With A Problem
(Originally published July 27, 1998)
The Telecom Act of 1996 was a well-intentioned piece of legislation, but it seems to have lacked the teeth to be effectively implemented in the market, as we have all subsequently found out. Around the time of its passage, the Act was hyped as a way for the RBOCs to gain entry into long-distance, but its real focus was on breaking the RBOC monopoly on local access through a well-defined 14-point interconnection plan. The only problem with the 14-point plan was that implementation meant an end to the RBOC monopoly, and possibly, an end to the RBOCs themselves. Since the RBOCs were created by the pen, they may very likely die by the pen, not, as some would believe, by natural market forces. As a result, they have fought tooth and nail in the courts and at the state level to keep the Act in check.
This may be a foolish move for them in the long run, since the market forces for high-bandwidth, digital access continue to speed on, unabated. This demand pull is spawning a number of large mergers and giving Wall Street the opportunity to raise large sums of capital for competitive local exchange carriers. The demand, while seemingly driven by the internet, will probably be sustained by corporate-subsidized intranets and extranets. While the RBOCs try to save their monopoly on voice, it may well be data that is their undoing, much like IBM's mainframe dominance was undone by new applications running on distributed PCs.
The RBOCs, recognizing this trend, appear to think they can kill two birds with one stone by petitioning the FCC for regulatory relief on high-speed digital access lines for new multimedia networks. The thought is that by keeping these new data networks free of the interconnection requirements imposed on existing analog, circuit-switched voice networks, the RBOCs will have a greater incentive to invest rapidly in these networks and keep all the profit for themselves.
This is wishful thinking, in my opinion, and does not take into account that the RBOCs have very little indigenous distribution through which they can rapidly penetrate the data markets and amortize the cost of putting what essentially amounts to a PC on most lines. The problem is that PCs tend to become obsolete the moment they leave the shelf. So too, xDSL and other digital transmission devices will rapidly become obsolete. There is no easy supply-side solution for this problem. Such has been the dilemma for ISDN for 25 years.
Instead the RBOCs, and the FCC, should allow the large installed base of data, voice, and enhanced service providers (literally thousands) to interconnect to these digital access networks. These providers in turn would likely buy up the digital capacity rapidly and create entirely new information economies on the telephone infrastructure, paid for almost entirely by centralized corporate and institutional buyers. The large penetration of buyers could thus ensure a rapid return on the RBOCs' rapidly depreciating investment. So the FCC should stick to its guns and treat the "new" networks (even though they really aren't new, but merely electronic upgrades) the same way it treats the "old" voice networks under the 14-point plan.
Unfortunately, under this approach the current voice revenue model could be negatively affected by the new digital access model. For instance, high-speed access lines that currently cost end users $800-$2,000 per month might end up costing $100 or less. Given the large number of corporate, wireless, and other competitive communications providers using these facilities, the effect might be felt overnight. Furthermore, one high-speed xDSL line into the home could replace multiple voice, fax, and computer lines, cutting current residential revenues significantly. The uncertainty lies in how quickly demand elasticity will kick in to offset these potential shortfalls.
So what to do? Certainly not solve one problem with another. Rather, the FCC should sit down and work with all carriers to develop a framework for how networks will look 10 years in the future. I believe that today's balkanized communications market is getting our economy nowhere quickly. End-to-end digital services are essential. To resolve this impasse, two RBOCs have already given up their autonomy and two more are about to. Clearly the leadership in these organizations recognizes that rapid change is afoot. At the same time, those leaders probably realize that they lack the experience in competitive markets to develop and distribute applications with risk capital.
The root cause of the RBOCs' problem is that most communication is two-way, and the need for most new communications protocols is by high-end users over wide areas. Furthermore the value-added portion of these WANs is typically the server as opposed to the client (witness the Microsoft/Netscape war). Since an RBOC is typically at only one end of a WAN connection, it is in a losing position on two fronts. The RBOC can't control the server and the client is given away for free. It doesn't look pretty from their perspective.
So maybe the FCC should allow all the RBOCs to merge, but at the same time work closely with state regulators to implement the 14-point plan as quickly as possible. In the process the RBOCs would likely restructure into something that resembles today's wide-area telecom networks and computer world, namely the horizontally differentiated OSI stack.