SpectralShifts Blog 
Thursday, January 05 2012

Counter-intuitive thinking often leads to success.  That's why we practice and practice: so that at a critical moment we are not governed by intuition (chance) or emotion (fear).  There is no better example of this than skiing, an apt metaphor this time of year.  Few self-propelled sports carry such high risk and reward, demanding mental, physical and emotional control.  To master skiing one has to master a) the fear of staying square to (looking/pointing down) the hill, b) keeping one's center over (or forward on) the skis, and c) keeping the majority of pressure on the downhill (or danger-zone) ski and edge.  Master these 3 things and you will become a marvelous skier.  Unfortunately, all 3 run counter to intuitions driven by fear: the safety of the woods at the side of the trail, leaning back, and climbing back uphill.  Overcoming any one of them is tough.

What got me thinking about all this was an Op-Ed by Vint Cerf (one of the godfathers of the Internet) in the NYT this morning, which a) references major internet access policy reports and decisions, b) mildly supports the notion of the Internet as a civil, not human, right, and c) trumpets the need for engineers to put in place controls that protect people's civil (information) rights.  He talks about policy and regulation from two perspectives, business/regulatory and technology/engineering, which is confusing.  In the process he weighs in, at a high level, on current debates over net neutrality, SOPA, universal service and access reform from his positions at Google and the IEEE, and addresses rights and governance from an emotional and intuitive standpoint.

Just as with skiing, let’s look at the issues critically, unemotionally and counter-intuitively.  We can’t do it all in this piece, so I will establish an outline and framework (just like the 3 main ways to master skiing) and we’ll use that as a basis in future pieces to expound on the above debates and understand corporate investment and strategy as 2012 unfolds.

First, everyone should agree that the value of a network goes up geometrically with each new participant.  It's called Metcalfe's law, or Metcalfe's virtue.  Unfortunately people tend to focus on the scale economies and cost of networks, rarely the value.  That value is hard to quantify because most have a hard time understanding elasticity and projecting unknown demand.  Further, few distinguish marginal from average cost.  The intuitive thing for most is to focus on supply, because people fear the unknown (demand).
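Metcalfe's law is usually stated as value proportional to the n(n-1) possible connections among n participants, while build-out cost grows roughly linearly. A minimal Python sketch of that contrast, and of the marginal-vs-average cost distinction; the cost constants here are illustrative assumptions, not real carrier economics:

```python
def network_value(n, k=1.0):
    """Metcalfe's law: value ~ k * n * (n - 1) possible connections."""
    return k * n * (n - 1)

def network_cost(n, fixed=1000.0, per_user=10.0):
    """Linear cost model: a fixed build-out plus a per-subscriber cost."""
    return fixed + per_user * n

for n in (10, 100, 1000):
    value = network_value(n)
    cost = network_cost(n)
    avg_cost = cost / n                                     # average cost per user
    marginal_cost = network_cost(n) - network_cost(n - 1)   # cost of one more user
    print(f"n={n:>5}  value={value:>10.0f}  "
          f"avg_cost={avg_cost:8.2f}  marginal_cost={marginal_cost:5.2f}")
```

Note that average cost falls toward the (constant) marginal cost as the network scales, even as value races ahead quadratically; focusing on supply-side cost alone misses the whole right-hand column.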

Second, everyone needs to realize that there is a fundamental problem with policy making: (social) democrats tend to support and be supported by free-market competitors, just as (conservative) republicans have a similar relationship with socialist monopolies.  Call it the telecom regulatory paradox.  This paradox is a function of small business vs big business, not either side's political dogma; so it is counter-intuitive and likely to remain that way.

Third, the internet was never open and free.  Web 1.0 resulted principally from a judicial action and a series of regulatory access frameworks/decisions in the mid-to-late 1980s that had significant unintended consequences for people's pricing perception.  Markets and technology adapted to and worked around inefficient regulations.  Policy makers did not create or herald the internet, wireless and broadband explosions of the past 25 years.  But in trying to adjust or adapt past regulation they are creating more, not less, inefficiency, no matter how well intentioned their precepts.  Accept it as the law of unintended consequences.  People feel more comfortable explaining results from intended actions than from something unintended or unexplainable.

So, just like skiing, we’ve identified 3 principles of telecoms and information networks that are counter-intuitive or run contrary to accepted notions and beliefs.  When we discuss policy debates, such as net neutrality or SOPA, and corporate activity such as AT&T’s aborted merger with T-Mobile or Verizon’s spectrum and programming agreement with the cable companies, we will approach and explain them in the context of Metcalfe’s Virtue (demand vs supply), the Regulatory Paradox (vertical vs horizontal orientation; not big vs small), and  the law of unintended consequences (particularly what payment systems stimulate network investment).  Hopefully the various parties involved can utilize this approach to better understand all sides of the issue and come to more informed, balanced and productive decisions.

Vint supports the notion of a civil right (akin to universal service) to internet access.  This is misguided and unachievable via regulatory edict/taxation.  He also argues that there should be greater control over the network.  This is disingenuous in that he wants to throttle the openness that produced his godchild's growth.  But consider his positions at Google and the IEEE.  A "counter-intuitive" combination of competition, horizontal orientation and balanced payments is the best approach to an enjoyable and rewarding experience on the slopes of the internet and, who knows, may ultimately and counterintuitively offer free access to all.  The regulators should be like the ski patrol, ensuring the safety of all.   Ski school is now open.

Related reading:
A Perspective from Center for New American Security

Network Neutrality Squad (NNsquad) of which Cerf is a member

Sad State of Cyber-Politics from the Cato Institute

Bike racing also has a lot of counter-intuitive moments, like when your wheel locks with the rider in front.  Here's what to do!

Posted by: Michael Elling AT 01:23 pm   |  Permalink   |  0 Comments  |  Email
Thursday, December 29 2011

67 million Americans live in rural areas. The FCC's benchmark broadband speed is at least 4 Mbps downstream and 1 Mbps upstream. By that definition 65% of all Americans have broadband, but only 50% of those who live in rural markets do, or roughly 34 million. The rural figure is low largely because 19 million rural Americans (28%) do not even have access to these speeds. Put another way, 97% of non-rural Americans have access to these speeds versus 72% of those living in rural areas.  Rural Americans are thus at a significant disadvantage when it comes to working from home, e-commerce or distance education.  Clearly, roughly 70% buy broadband where they have access to it.
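The back-of-the-envelope math behind those figures can be checked in a few lines; the inputs are the numbers cited above (67M rural Americans, 72% with access to 4/1 Mbps, 50% subscribing):

```python
rural_pop_m = 67.0          # million rural Americans
access_rate = 0.72          # share with access to 4 Mbps down / 1 Mbps up
adoption_rate = 0.50        # share actually subscribing

with_access_m = rural_pop_m * access_rate       # ~48 million can buy it
without_access_m = rural_pop_m - with_access_m  # ~19 million (the 28%)
subscribers_m = rural_pop_m * adoption_rate     # ~34 million subscribe

# Take-up among those who can actually buy the service:
take_up = adoption_rate / access_rate           # ~0.69, i.e. roughly 70%
print(f"{without_access_m:.0f}M lack access; "
      f"take-up where available: {take_up:.0%}")
```

The ~70% take-up rate is the key point: where rural Americans can get broadband, they buy it at rates close to the national norm, so access, not demand, is the constraint.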

Furthermore, we would argue the FCC standard is no longer acceptable when it comes to basic or high-definition multimedia, video and file downloads.  These applications require 10+ Mbps downstream and 3+ Mbps upstream to be user friendly.  Without those speeds, rural markets get what we call the "world-wide-wait" for most of today's high-bandwidth applications.  In the accompanying 2 figures we see a clear gap between the blue lines (urban) and green lines (rural) for both download and upload speeds.  The result is that only 7% of rural Americans use broadband service at 6+/1.5+ Mbps, versus 22% nationwide today.
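To make the "world-wide-wait" concrete, here is a small sketch of download times at the FCC's 4 Mbps benchmark versus the 10+ Mbps we argue is actually needed; the file sizes are illustrative assumptions:

```python
def download_minutes(size_gb, mbps):
    """Minutes to download size_gb gigabytes at mbps megabits per second."""
    bits = size_gb * 8e9            # decimal gigabytes -> bits
    return bits / (mbps * 1e6) / 60

for label, size_gb in [("SD movie", 1.5), ("HD movie", 4.0)]:
    t4 = download_minutes(size_gb, 4)
    t10 = download_minutes(size_gb, 10)
    print(f"{label}: {t4:5.1f} min at 4 Mbps vs {t10:4.1f} min at 10 Mbps")
```

An assumed 4 GB HD movie takes over two hours at the 4 Mbps benchmark but under an hour at 10 Mbps, before accounting for contention or overhead, which only widen the gap.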

The problem in rural markets is a lack of alternative, affordable service providers. In fact the NTIA estimates that 4% of Americans have no broadband provider at all, 12% have only 1 provider and 44% just 2. Almost all rural subscribers fall into 1 of these 3 categories. Rural utilities, municipalities, businesses and consumers would benefit dramatically from alternative access providers, as economic growth is directly tied to broadband penetration.

The accompanying chart shows how vital broadband is to regional economic growth.  If alternative access drives rural broadband adoption to levels similar to urban markets, then local economies will grow an additional 3% annually.  That's because new wireless technology and applications such as home energy management, video on demand, video conferencing and distance learning provide the economic justification for alternative, lower-cost, higher bandwidth solutions.

Related Reading

FCC Broadband Map

US 3G Wireless Coverage Map

The UK is Far Ahead of the US; Their deficient is our average

Rural Telcos Against FCC USF Reform

GA Tries to Reduce Subsidies, Again

 

Posted by: Michael Elling AT 08:09 am   |  Permalink   |  0 Comments  |  Email
Sunday, December 18 2011

 

(The web is dead, long live the apps)

 

Is the web dead?  According to George Colony, CEO of Forrester, speaking at LeWeb (Paris, Dec 7-9), it is; on top of that, social is running out of time, and social is where the enterprise is headed.  A lot to digest at once, particularly when Google's Schmidt makes a compelling case for a revolutionary smartphone future that is still in its very, very early stages; courtesy of an ice cream sandwich.

Ok, so let's break all this down.  The Web, dead?  Yes, Web 1.0 is officially dead, replaced by a mobile, app-driven future.  Social saturated?  Yes; call it Social 1.0, and Social 2.0 will be utilitarian.  Time is money, knowledge is power.  Social is really knowledge, and that's where enterprises will take the real-time, always-connected smartphone ice cream sandwich: applications that harness internal and external knowledge bases for rapid product development and customer support.  Utilitarian.  VIVA LA REVOLUTION!

Web 1.0 was a direct outgrowth of the breakup of AT&T: the US' second revolution, 30 years ago, coinciding ironically with the bicentennial of the end of the 1st.  The bandwidth bottleneck of the 1960s and 1970s (the telephone monopoly tyranny), which gave rise to Microsoft and Intel processing at the edge rather than the core, began to reverse course in the late 1980s and early 1990s as a result of flat-rate data access and an unlimited universe of things to easily look for (aka web 1.0).  Flat-rate pricing was a direct competitive response by the RBOCs to the competitive WAN (low-cost, metered) threat.

As silicon scaled via Moore’s law (the WinTel sub-revolution) digital mobile became a low-cost, ubiquitous reality.  The same pricing concepts that laid the foundation for web 1.0 took hold in the wireless markets in the US in the late 1990s; courtesy of the software defined, high-capacity CDMA competitive approach (see pages 34 and 36) developed in the US.

The US is the MOST important market in wireless today and THE reason for its leadership in applications and the smart cloud.  (Incidentally, it appears that most of the LeWeb speakers were either American or from US companies.)  In the process, the relationship between storage, processing and network has come full circle (as best described by Ben Horowitz).  The real question is, "will the network keep up?"  Or are we doomed to repeat the cycle of promise and dashed hopes we witnessed between 1998 and 2003?

The answer is "maybe": maybe the communications oligopolies will see themselves as IBM facing the approaching WinTel tsunami in 1987.  Will Verizon be the service provider that recognizes the importance of, and embraces, openness and horizontalization?  The 700 MHz auctions and recent spectrum acquisitions and agreements with the major cable companies might be a sign that it will.

But a bigger question is whether Verizon will adopt what I call a "balanced payment (or settlement) system" and move away from IP/Ethernet's "bill and keep" approach.  A balanced payment or settlement system for network interconnection simultaneously solves the issue of new service creation AND paves the way for applications to directly drive, and pay for, network investment.  So unlike web 1.0, when communication networks were reluctantly pulled into a broadband present, maybe they can actually make money directly off the applications, instead of the bulk of the value accruing to Apple and Google.
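The difference between the two interconnection regimes can be sketched as a toy model of two networks exchanging traffic; the termination rate is a made-up illustrative number, not an actual industry figure:

```python
TERMINATION_RATE = 0.001   # assumed $ per MB terminated on the other network

def settle(a_to_b_mb, b_to_a_mb, bill_and_keep=True):
    """Return the net payment from network A to network B
    (negative means B pays A)."""
    if bill_and_keep:
        return 0.0  # each network bills only its own customers
    # Balanced settlement: the net sender of traffic compensates the
    # terminating network, giving it revenue to fund capacity.
    return (a_to_b_mb - b_to_a_mb) * TERMINATION_RATE

# A content-heavy network A pushes far more traffic toward access network B:
print(settle(5_000_000, 500_000, bill_and_keep=True))    # 0.0
print(settle(5_000_000, 500_000, bill_and_keep=False))   # 4500.0
```

Under bill and keep, the access network carrying the flood of inbound application traffic collects nothing from the networks generating it; under balanced settlement, that traffic becomes a revenue stream that funds the very investment the applications depend on.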

Think of this as an “800” future on steroids or super advertising, where the majority of access is paid for by centralized buyers.  It’s a future where advertising, product marketing, technology, communications and corporate strategy converge.  This is the essence of what Colony and Schmidt are talking about.   Will Verizon CEO Seidenberg, or his rivals, recognize this?  That would indeed be revolutionary!

Related Reading:
February 2011 Prediction by Tellabs of Wireless Business Models Going Upside Down by 2013

InfoWeek Article on Looming Carrier Bandwidth Shortages

Posted by: Michael Elling AT 09:56 am   |  Permalink   |  0 Comments  |  Email
Sunday, December 11 2011

Look up the definition of information and you'll see a lot of terminological circularity.  It's all-encompassing and tough to define.  It's intangible, yet it drives everything we do.  But information is pretty useless without people; in fact, without them it doesn't really exist.  Think about the tree that fell, unseen, in the forest.  Did it really fall?  I am interested in the velocity of information and its impact on economies, societies and institutions, and as a result in the development of communication networks and the exchange of ideas.

Over the past several years I have increasingly looked at the relationship between electricity and communications.  The former is the number one ingredient for the latter.  Ask anybody in the data-center or server farm world.  The relationship is circular.  One wonders why the NTIA under its BTOP program didn’t figure that out; or at least talk to the DOE.  Both spent billions separately, instead of jointly.  Gee, why didn’t we add a 70 kV line when we trenched fiber down that remote valley?

Cars, in moving people (information) around, are a communications network too; only powered by gasoline.  Until now.  The advent of electric vehicles (EVs) is truly exciting; perhaps more so than the introduction of digital cell phones nearly 20 years ago.  But to realize that future, both the utility and auto industries should take a page from the competitive wireless playbook.

What got me thinking about all this was a NYT article this week about Dan Akerson, a former MCI CFO and Nextel CEO, who has been running (and shaking up) GM over the past 15 months.  It dealt specifically with Dan's handling of the Chevy Volt fires.  Knowing Dan personally, I can say he is up to the task.  He is applying lessons learned in the competitive communications markets to the competitive automotive industry.  And he will win.

But will he and the automotive industry lose because of the utility industry?  You see, the auto industry, the economy and the environment have a lot to gain from the development of electric vehicles.  Unfortunately the utility industry, which is 30 years behind the communications and IT revolution in "digitizing" its business model, is not prepared for an EV eventuality.  Ironically, utilities stand in the way of their own long-term success, as EVs would boost demand dramatically.

A lot has been spent on a "smart grid" with few meaningful results, primarily because most of the efforts and decisions are driven by insiders who do not want to change the status quo.  That status quo includes little knowledge of the consumer, a 1-way mentality, and a focus on average and peak production and consumption.  Utilities and their vendors loathe risk, consider "real time" to be 15 minutes (going down to 5), and view the production and consumption of electricity as paramount.  To them, a smart grid typically means the opposite: a reduction in revenues.

So, it's no surprise that they are building a smart grid which gives the consumer neither choice, flexibility and control, nor the ability to contribute to electricity production and be rewarded for being efficient and socially responsible.  Nor do they want a lot of big data to analyze to make the process even more efficient.  Funny, those are all byproducts of the competitive communications and IT industries we've become accustomed to.

So maybe once Dan has solved GM's problems and recognizes the obstacles facing an electric vehicle future, he will focus his interests, and those of his private equity brethren, on developing a market-driven smart grid; not one your grandmother's utility would build.

By the way, here's a "short", and by no means exhaustive, list of alliances and organizations involved in developing standards and approaches to the smart grid.  Note that they are dominated by incumbents, and each has a different membership!

 

Electricity Advisory Committee
Gridwise Alliance
Gridwise Architecture Council
NIST SmartGrid Architecture Council
NIST SmartGrid Advisory Committee
NIST SmartGrid Interoperability Panel
North American Energy Standards Board (NAESB)
SmartGrid Task Force Members (Second list under Smartgrid.gov)
Global SmartGrid Federation
NRECA SmartGrid Demonstration
IEEE SmartGrid Standards
SmartGrid Information Clearinghouse


 

 

Posted by: Michael Elling AT 10:52 am   |  Permalink   |  0 Comments  |  Email
Sunday, December 04 2011

Be careful what you wish for this holiday season.  After looking at Saks Fifth Avenue's "Snowflake & Bubbles" holiday window and its sound and light display, I couldn't help but think of a darker subtext.  I had to ask the question answered infamously by Rolling Stone back in 2009: "who are the bubble makers?"  The fact that this year's theme was a grown-up redux of last year's child's fantasy, focusing on the "makers," was also striking.  An extensive Google search reveals that NO ONE has tied either year's bubble theme to manias in the broader economy or to the 1%.  In fact, the New York Times called the windows "new symbols of joy and hope."  Only one article referenced the recession and the hardship faced by many people as a stark backdrop for such a dramatic display.  Ominously, one critic likened it to the "Nutcracker with bubbles," and we all know what happened to Tsarist Russia soon thereafter.

The light show created by Iris is spectacular and portends what I believe will be a big trend in the coming decade: using the smartphone to interact with signs and displays in the real world.  It is not unimaginable that every device will soon have a wifi connection and be controllable via a smartphone app.  Using the screen to type a message or draw an illustration that appears on a sign is already happening.  CNBC showcased the windows as significant commercial and technical successes, which they were.  Ironically, the 1% appear to be doing just fine, as Saks reported record sales in November.

Perhaps the lack of critical commentary has something to do with how quickly Occupy Wall Street rose and fell.  Are we really living in a Twitter world, fascinated and overwhelmed by trivia and endless information?  At least the displays were sponsored by FIAT, which is trying to revive two brands in the US market simultaneously and is focused on the very real-world pursuit of car manufacturing.  The same, unfortunately, cannot be said of MasterCard, (credit) bubble maker extraordinaire.  Manias and speculative bubbles are not new and they will not go away.  I've seen two build first hand and know that little could have been done to prevent them.  So it will be in the future.

One was the crash in 1987 of what I like to call the "bull-sheet market of the 1980s".  More than anything, the 1980s were marked by the ascendance of the spreadsheet as a forecasting tool.  Give a green kid out of business school a tool to easily extrapolate logarithmic growth and you've created the ultimate risk-deferral process; at least until the music stops in the form of one down year in the trend.  Who gave these tools out and blessed their use?  The bubble makers (aka my bosses).  But the market recovered and went on to significant new highs (and speculative manias).

Similarly, a new communications paradigm (aka the internet) sprang to life in the early-to-mid 1990s as a relatively simple store-and-forward, database-lookup solution.  By the end of the 1990s there was nothing the internet could not do, especially if communications markets remained competitive.  I remember the day in 1999 when Jeff Bezos said, in good bubble-maker fashion, that "everyone would be buying goods from their cellphones" as justification for Amazon's then-astronomical value of $30bn.  I was (unfortunately) smart enough to know that scenario was a good 5-10 years in the future.  10 years later it was happening, and AMZN recently exceeded $100bn; but not before dropping below $5bn in 2001 as $5 trillion of wealth evaporated in the market.

If the spreadsheet and the internet were the tools of the bubble makers in the 1980s and 1990s, then wireless was their primary tool in the 2000s.  Social media went into hyperdrive with texting, tweeting and 7x24 access from 3G phones and apps.  Arguably, wireless mobility increased people's transiency and ability to move around, aiding the housing bubble.  So what is the primary tool of the bubble makers in the 2010s?  Arguably it is, and will be, the application ecosystems of iOS and Android.  And what could make for an ugly bubble/burst cycle?  Lack of bandwidth and lack of efficient clearinghouse systems (payments) for connecting networks.

Posted by: Michael Elling AT 08:51 am   |  Permalink   |  0 Comments  |  Email
Sunday, November 20 2011

Are We Stressing the Environment?

Two major global concerns are the price of oil and the level of carbon emissions. The US DOE conservatively estimates that oil will be consistently between $110 and $120 per barrel by the end of the decade. Population is the key driver, as evidenced by the chart to the left comparing population growth since 1900 to the production of oil.  Note in particular that despite conservation efforts in the mid-to-late 1970s, production has matched population growth over the past 25 years. Supporting the general upward trend in demand, and hence prices, the UN expects population to continue to expand for the foreseeable future, as shown in the figure below: from the current 6.5 billion to between 8 and 10 billion by 2040.  That would imply production exceeding 100 million barrels per day.

Additionally, and perhaps more alarming, is the increase in CO2 levels and average temperatures from the late 1800s through the present, shown in the figure below.  The critical number for long-term environmental sustainability is 350 ppm of CO2.  As can be seen from the chart, that level was surpassed around 1990, and the figure now exceeds 370 ppm; up roughly 80 ppm over the past 130 years.

Electricity production accounts for 1/3 of all CO2 production.  The 2011 U.S. EIA Energy Outlook Report states that electricity currently accounts for 40% of total residential delivered energy consumption in the U.S., with both residential and commercial consumption projected to increase 1.2% annually from 2010 to 2035 (not including significant electric vehicle penetration). This growth will require over 200 GW of additional generating capacity. With 40% of this capacity already under construction, and assuming current construction costs for a gas turbine plant with transmission facilities of $700/kW, additional generation costs will approach $90 billion in today's dollars, or roughly $750 per U.S. household.
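The back-of-the-envelope math can be reproduced in a few lines; the household count is an assumption (roughly 120 million US households) chosen to be consistent with the per-household figure cited, and the result lands in the ballpark of the $90 billion and $750/household numbers above:

```python
new_capacity_gw = 200      # additional generating capacity required
share_remaining = 0.60     # 40% is already under construction
cost_per_kw = 700          # $ per kW, gas turbine plant plus transmission
households_m = 120         # million US households (assumed)

remaining_kw = new_capacity_gw * 1e6 * share_remaining  # GW -> kW
total_cost_bn = remaining_kw * cost_per_kw / 1e9        # ~$84 billion
per_household = remaining_kw * cost_per_kw / (households_m * 1e6)
print(f"~${total_cost_bn:.0f}B total, ~${per_household:.0f}/household")
```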

This represents both an energy problem and an opportunity for utilities, their customers and society as a whole.  Electric utilities and their customers need to focus on conservation and smart-grid solutions to offset the rise in prices and take advantage of new technologies that make alternative energy and electric vehicles more economic. The incremental power-generation cost of $750 per household can instead be invested in home energy management systems, reducing at the same time the total amount of CO2 generated.

Related Reading:

Map of US showing locations of renewable energy production

Map of US showing over 6400 facilities producing most CO2

 

Posted by: Michael Elling AT 08:00 am   |  Permalink   |  Email
Sunday, November 13 2011

A humble LAN protocol just 10 years ago, packet-based Ethernet (invented at Xerox in 1973) has ascended to the top of the carrier networking pyramid, over traditional circuit-based (time-division) voice protocols, thanks to the growth of data networks (storage and application connectivity) and 3G wireless.  According to AboveNet, the top 3 CIO priorities are cloud computing, virtualization and mobile, up from spots 16, 3 and 12, respectively, just 2 years ago!   Ethernet now accounts for 36% of all access, more than any other single legacy technology, up from nothing 10 years ago when the Metro Ethernet Forum was established.  With Gigabit and Terabit speeds, Ethernet is the only protocol for the future.

The recent Ethernet Expo 2011 in NYC underscored the trends and the importance of what is going on in the market.  Just like fiber and high-capacity wireless (MIMO) in the physical layer (aka layer 1), Ethernet has significant price/performance advantages in transport networks (aka layer 2).  This graphic illustrates why it has spread through the landscape so rapidly, from LAN to MAN to WAN.   With 75% of US business buildings lacking access to fiber, Ethernet over Copper (EoC) will be the preferred access solution.  As bandwidth demand increases, Ethernet has a 5-10x price/performance advantage over legacy equipment.

Ethernet is also getting smarter, via the pejoratively coined term SPIT (Service Provider Information Technology).  The graphic below shows how growing horizontalization is supported by vertical integration of information (ie exchanges) that will make Ethernet truly "on-demand".  This model is critical because of both the variability and the dispersion of traffic brought on by mobility and cloud computing.  Already, the underlying layers are being "re"-developed by companies like AlliedFiber, which is building new WAN fiber with interconnection points every 60 miles.  It will all be Ethernet.  Ultimately, app providers may centralize intelligence at these points, just as Akamai pushed content storage toward the edge of the network for Web 1.0.  At the core and at key boundary points, Ethernet exchanges will begin to develop.  Right now network connections are mostly private, and there is significant debate as to whether there will be carrier exchanges.  The reality is that there will be exchanges in the future; and not just horizontal but vertical as well, to facilitate new service creation and a far larger range of on-demand bandwidth solutions.

By the way, I found this "old" (circa 2005) chart from the MEF illustrating what and where Ethernet is in the network stack.  It is consistent with my own definition of web 1.0 as a 4-layer stack.  Replace layer 4 with clouds and mobile and you get a sense of how much greater the complexity is today.  Compare it to the above charts and you see how far Ethernet has evolved in a very short time, and why companies like Telx, Equinix (8.6x cash flow) and Neutral Tandem (3.5x cash flow) will be interesting to watch, as well as larger carriers like Megapath and AboveNet (8.2x cash flow).   Certainly the next 3-5 years will see significant growth in Ethernet and the obsolescence of the PSTN and legacy voice (time-based) technologies.

Related Reading:
CoreSite and other data centers connect directly to Amazon AWS

Equinix and Neutral Tandem provide seamless service

 

Posted by: Michael Elling AT 12:46 pm   |  Permalink   |  0 Comments  |  Email
Sunday, November 06 2011

Would gamification work in the smart grid?  Possibly.  Others have asked the same question.  But some would ask: why do you need to incent people to save money?  Because people's self-interest may not be aligned with the smart grid as currently envisioned by vendors and utilities.

Gamification's value lies in getting people to do something against their self-interest without realizing it.  At the same time, people play games to accomplish something aspirational.  How can these two somewhat contradictory precepts be applied to the smart grid?

First, people resist the smart grid because of its perceived complexity, expense and intrusiveness; they are acting in their self-interest.  Second, the smart grid is supposedly about giving end-users control over their own consumption.  Unfortunately, utilities are scared of this future, since it runs counter to revenue growth.

Enter gamification, where everyone might win.  If introduced into the design of smart-grid solutions from the get-go, it could have a fundamental impact on penetration, acceptance and, ultimately, revenue and profit growth for the utility industry.   Why?  Because the demand for electricity is potentially unlimited, and the easier and more efficient the industry makes consumption, the greater the growth potential.

So what might gamification of the smart grid look like?  It would need to satisfy three conditions: personal growth, societal improvement and marketing engagement.   The solutions I've read about so far focus on individual rewards (see Welectricity and Lowfoot), but there is a growing body of evidence that people respond better when their use is compared to their neighbors'.  So why not turn efficiency and production into a contest?  Research is already underway in Hawaii and Chicago, and small, innovative app-driven solutions are entering the market; some even supported by former US Vice Presidents.

To maximize participation and ensure widespread rewards, smart-grid gamification contests should be held at the home, neighborhood, city, county, state and national levels.  They should recognize both relative and absolute changes, giving ALL users an incentive to win, not just the largest users; and not just individuals, but groups as well.  Contests could also get down to the appliance level and should ultimately include contribution/cogeneration (here's another example).
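One way to reward both relative and absolute changes is to blend the two into a single contest score, so a small apartment that cuts a large share of its usage can beat a large house that saves more raw kWh. A minimal sketch; the names, weights and normalization are illustrative assumptions, not a proposed standard:

```python
def contest_score(baseline_kwh, current_kwh, w_relative=0.7, w_absolute=0.3):
    """Blend percentage savings (vs own history) with absolute kWh saved."""
    saved = baseline_kwh - current_kwh
    relative = saved / baseline_kwh   # % reduction vs the household's baseline
    absolute = saved / 1000.0         # kWh saved, crudely scaled toward 0-1
    return w_relative * relative + w_absolute * absolute

households = {
    "small apartment": (300, 240),    # 20% saved, 60 kWh
    "large house":     (2000, 1800),  # 10% saved, 200 kWh
}
ranking = sorted(households.items(),
                 key=lambda kv: contest_score(*kv[1]), reverse=True)
for name, (base, cur) in ranking:
    print(f"{name}: score={contest_score(base, cur):.3f}")
```

With these weights the small apartment wins despite saving fewer kWh, which is exactly the "ALL users have an incentive" property; the same scoring function could be summed over groups to run the neighborhood-to-national brackets.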

Utilities have done a poor job of getting customers to look at their information online; less than 10% do, on average.   Playing games with customers and following recipes like this might be a way to change all that.  Win, win, win.

Related Reading:

Gaming across all industries

 

Posted by: Michael Elling AT 11:45 am   |  Permalink   |  0 Comments  |  Email
Sunday, October 30 2011

Without access does the cloud exist?  Not really.

In 2006, cloud computing entered the collective intelligence in the form of Amazon Web Services.  By 2007, over 330,000 developers were registered on the platform.  This rapid uptake was an outgrowth of web 1.0 applications (scale) and of the growth in high-speed broadband access from 1998-2005 (ubiquity).  It became apparent that new solutions could be developed, and efficiencies improved, by collapsing back to the core a portion of the processing and storage that had migrated to the edge during the WinTel revolution; the revolution that fundamentally changed the IT landscape between the late 1980s and early 2000s from a mainframe to a client-server paradigm.

In mid-2007 the iPhone was born, just as 3G digital services were being rolled out by a competitive US wireless industry.  In 2009, "smartphone" penetration was 18% of the market.  By the 3rd quarter of 2011 that number had reached 44%.  The way people communicate and consume information is changing dramatically in a very short time.

The smartphone is driving cloud (aka back-to-the-mainframe) adoption for 3 reasons: 1) it introduces a new computing device that complements, rather than replaces, existing computing devices at home and work; 2) its small screen limits what information can be shown and processed locally; 3) it increases the sociability, velocity and value of information.   Information knows no bounds at the edge or the core.  And we are at the very, very early stages of this dramatic new revolution.

Ice Cream Sandwich (just like Windows 2.0 multi-tasking in 1987) heralds a radical new world of information generation and consumption.  Growth in processing and computation at the edge will drive the core, and vice versa; just as chip advances from Intel fed software bloat on desktops, further necessitating faster chips.

But the process can only expand if the networks are there to support it.  Unfortunately, carriers have responded with data caps and bemoan the lack of new spectrum.  Fortunately, a hidden back door exists in the form of WiFi access.  And if carriers like AT&T and Verizon don't watch out, it will become the preferred form of access.

As a recent adopter of Google Music I have become very attuned to this.  First, it is truly amazing how seamless content storage and playback have become.  Second, I learned how to program my phone to always hunt for a WiFi connection.  Third, when I have access to neither the 3G wireless network nor WiFi and I want something that is stored online, a strange feeling of being disconnected overtakes me, akin to leaving one's cellphone at home in the morning.

With the smartphone we are getting used to choice and instant gratification.  The problem with WiFi is its variability and unreliability.  Capital and technology are being applied to solve that problem, and it will be interesting to see how service providers react to the potential threat (and/or opportunity).  Where carriers once imagined walled application gardens there are now fertile iOS and Android fields watered by clouds over which carriers exert little control.  Storm clouds loom over their control of, and ROI from, access networks.

Posted by: Michael Elling AT 09:10 am   |  Permalink   |  0 Comments  |  Email
Sunday, October 23 2011

Even though the US has the most reliable electric system in the world, utility companies are not schooled in real-time or two-way concepts when it comes to gathering and reporting data, nor when it comes to customer service. All of that changes with a "smart-grid," which may be the best explanation why so many smart-grid solutions stop at the meter and do not extend fully into the customer premise. Unfortunately, utilities are not prepared to "get" so much information, let alone "give" much to the customer.

Over 20 million smart meters, representing 15% penetration of residential markets, had been deployed as of June 2011, according to IEE.  They forecast 65 million (50%) by 2015, at an average cost of $150-250 per household.  While these numbers are significant, it will have taken 15 years to get there, and even then only 6 million premises, less than 5% of the market, are expected to have energy management devices by 2015.  So while the utilities will have a slightly better view of things and gain greater controls and operating efficiencies, the consumer will not be engaged fully, if at all.  That is the challenge of the smart-grid today.
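A back-of-envelope sketch of what those IEE figures imply in aggregate dollars (assuming, hypothetically, that the cited $150-250 per-household cost applies across all 65 million forecast meters):

```python
# Back-of-envelope arithmetic on the IEE smart-meter figures cited above.
meters_2011 = 20_000_000        # deployed as of June 2011
pen_2011 = 0.15                 # ~15% residential penetration
meters_2015 = 65_000_000        # forecast for 2015 (~50%)
cost_low, cost_high = 150, 250  # cited per-household cost range ($)

# Implied residential base, and a sanity check on the 50% forecast.
households = meters_2011 / pen_2011
pen_2015 = meters_2015 / households

# Implied aggregate rollout investment, in billions of dollars.
low_b = meters_2015 * cost_low / 1e9
high_b = meters_2015 * cost_high / 1e9

print(f"Implied households: {households / 1e6:.0f}M")
print(f"Implied 2015 penetration: {pen_2015:.1%}")
print(f"Implied rollout cost: ${low_b:.2f}B to ${high_b:.2f}B")
```

On these assumptions the forecast implies roughly $10-16 billion of meter investment, yet still leaves the customer-premise side of the grid largely untouched.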

Part of the issue is incumbent organizations--regulatory bodies, large utilities and vendors--and their desire to stick to proven approaches, while not all agreeing on what those approaches are. According to NIST, there are no fewer than 75 key standards and 11 different standards bodies and associations involved in smart-grid research and trials. The result is numerous different approaches, many of which are proprietary and expensive.  As well, the industry breaks energy management within the smart-grid into 2 broad categories, namely Demand Response Management (DRM, the side the utility controls) and Demand Side Management (DSM, the side the customer arguably controls), instead of just calling it "end-to-end energy management," which is how we refer to it.

Another challenge, specifically for rural utilities, is that over 60% have PLC meters, which don't work with most of the "standard" DRM solutions on the market, necessitating an upgrade. This could actually present an opportunity for a well-designed end-to-end solution that leapfrogs the current industry debate and offers a new approach.  Such an approach would work around an expensive meter upgrade AND allow DSM at the same time. After working with utilities for over 10 years, we've discovered that rural utilities are the most receptive to this new way of thinking, not least because they are owned by their customers and can achieve greater operating efficiencies from end-to-end "smart" technology investment because of their widely dispersed customer base.

Ultimately the market will need low-cost, flexible end-to-end solutions to make the smart-grid pervasive and generate the expected ROI for utility and customer alike.

Posted by: Michael Elling AT 08:13 am   |  Permalink   |  Email


Information Velocity Partners, LLC
88 East Main Street, Suite 209
Mendham, NJ 07930
Phone: 973-222-0759
Email:
contact@ivpcapital.com

