Category Archives: Business Strategies

The Potential Drawbacks of Doing-it-Yourself with Cloud

The No. 1 initiative in many large corporate IT departments today is moving to the cloud. Bank of America, SAP and many other large companies in non-tech industries are trying it. The decision to adopt a Public, Private or Hybrid Cloud depends largely on the nature of the business, its security concerns and the priorities of company management. Because the cost of building a cloud-ready modern data center is now much lower than in the past, thanks both to best practices published by Google, Facebook and others and to falling hardware costs, many security-sensitive companies may decide to build a Private Cloud entirely on their own, perhaps working with certain technology vendors here and there. TriStrategist thinks such a decision has several obvious drawbacks.

1. Lack of central vision, planning and design for an extensible platform that can serve future cloud needs.

In a huge enterprise that has run on traditional IT models for decades, new ideas are often tested on a small scale, initiated by a few departments first. The same is true for testing out the cloud, often with only limited design, planning and company-wide perspective. For example, some may start by moving applications onto virtual machines; others may interpret cloud services as moving to fee-for-service models between departments. Few people, including management, fully understand the cloud computing concepts and technologies on the market. Yet the modern cloud service platform is a gigantic wave that can disrupt an enterprise’s entire IT operations, resource utilization and existing internal business models. It is hard to implement a comprehensive and cohesive cloud platform and service model without considering the IT and future needs of the company as a whole. Piecemeal implementations are likely to be short-sighted, costly, and incompatible or redundant with each other.

2. Lack of right resources and skillset for an efficient implementation.

Cloud and related technologies, such as Big Data, are mostly new, and many companies simply don’t have people with the skills needed. A cloud platform for a large company usually has to embrace many mixed technologies, from hosting infrastructure and virtualization to automation and self-service. It is often difficult to find people who understand the linkages between these technologies well enough for a smart design. Hiring outside consultants may not help much either, since consultants from a particular technology vendor tend to know more about their own product than about the enterprise’s complex needs and multi-technology environment. Cloud computing also demands new skills; automation development roles, for example, are usually hard to fill with traditional DBAs or IT support staff.

3. Lack of versatility in modern technology infusions.

Cloud technologies are evolving fast. New technologies, concepts and tools mushroom every day, so an enterprise cloud platform needs to stay open, leaving room for better future technologies to fit in. An initial implementation built only by internal people tends to follow the path of least resistance, which can mean a lack of extensibility and versatility when future needs demand them.

What would be a better approach? TriStrategist thinks that companies inexperienced with cloud, especially those outside the technology industry, should take a more open-minded approach to building the cloud at the beginning. Under a comprehensive company-wide vision, business segments with fewer security constraints should consider using established Public Cloud providers first. The goals are twofold: a fast and cost-effective implementation, and collecting experience while training the company’s own staff. This may be a smarter shortcut to an extensible Hybrid Cloud that serves the company’s future needs than the painstaking novice effort of building your own internal cloud, which is sure to encounter unexpected problems. A company’s long-term vision should also retain this open-mindedness for its own sake.

The Geographic Advantage

When Google started its first fiber network offering in the US, it also came up with a clever idea: free colocation in its nearby data centers for content providers such as YouTube, Netflix and Akamai. This minimizes the content buffering and network congestion caused by transferring large amounts of content from providers’ remote hosting locations, and thus greatly improves the customer experience on the Google Fiber network.

Amazon has exploited a similar idea more broadly and globally. With its conglomerate of products and services, Amazon is interested both in delivering content itself and in offering public cloud services to customers with streaming content on their web sites. To expand quickly into a dominant global IaaS provider, it designed and implemented “AWS Edge Locations” as part of its Global Infrastructure strategy, in addition to its global data centers; Amazon now has more than 50 of them. These Edge Locations serve as caching locations for local content and data, placed conveniently for customers but outside the existing data centers. They give AWS an advantage in serving global customers with local content and faster data access, without the full investment cycle of building a data center at every global location.
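The edge-location idea can be illustrated with a minimal sketch (this is an illustrative model of edge caching in general, not Amazon's actual implementation): an edge site keeps local copies of content so that only the first request has to travel to a distant origin data center.

```python
# Illustrative sketch of edge caching (not Amazon's implementation):
# an edge location acts as a cache in front of a distant origin.
class EdgeLocation:
    def __init__(self, origin):
        self.origin = origin   # maps content key -> content (the remote data center)
        self.cache = {}        # locally cached copies near the customer

    def get(self, key):
        if key in self.cache:              # local hit: low latency
            return self.cache[key], "edge"
        content = self.origin[key]         # miss: fetch from remote origin
        self.cache[key] = content          # keep a copy for nearby future requests
        return content, "origin"

origin = {"video42": b"...stream bytes..."}
edge = EdgeLocation(origin)
print(edge.get("video42")[1])  # origin  (first request travels far)
print(edge.get("video42")[1])  # edge    (later requests are served locally)
```

The same caching principle is what lets an edge footprint substitute for full data centers in many locations: only cache misses pay the long-haul cost.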

Google’s colos and Amazon’s Edge Locations not only help the companies and their customers with content and data delivery; they also allow both companies to proactively mitigate the potential negative impacts of the Net Neutrality ruling pending in the US Congress. More importantly, they can be extended into significant geographic advantage and flexibility in cloud service offerings in the near future, both in the US and internationally.

Global geographic advantage will be critical to the success of a global public cloud provider, especially in the IaaS space. The importance of this advantage was discussed in TriStrategist’s earlier blog, The Positioning of Public Cloud Services - Part III, published on September 19, 2014.

In this week’s news, SAP signed a pact with IBM on October 14 to leverage the full fleet of IBM’s global cloud data centers, in addition to SAP’s own 20, to expand SAP services and store SAP data in local regions, better accommodating new regulatory requirements in different countries. IBM has invested $1.2 billion since the beginning of 2014 to expand its global footprint to about 40 cloud data centers, including 15 from the SoftLayer acquisition and 12 existing ones of its own. Although many are still in the planning stage, IBM’s strategy of pursuing global geographic advantage has apparently already gained it some needed edge to catch up in the global cloud war. Data security and sovereignty have long been concerns keeping many countries and governments, certainly including the US government, from adopting public cloud services from global providers, and the breaking of the NSA PRISM scandal only worsened the situation. Several European countries have now passed regulations or compliance requirements demanding that their business or government data reside locally, in the country or on the continent, and countries in other regions will surely follow suit. Very soon, a strategic global spread of data centers will become a prerequisite for a public cloud provider to survive in the global space rather than be reduced to a niche provider in a few markets.

In the next few years, geographic advantage will likely become the most significant deciding factor in the competition among global IaaS providers.

The Positioning of Public Cloud Services – Part III

Successful positioning of new products or services helps a company stand out as a market leader or locate a market niche. It comes from the right insights into and anticipation of future markets, the right understanding of one’s own strengths and limits, and the confidence to win. The right positioning is crucial for a player who does not have the luxury of experimenting with every possible scenario at market entry. Even for a large player with deep pockets, in a market growing and shifting as fast as cloud computing today, losing money can happen much faster than gaining dominance. The right positioning from the beginning can establish a company’s innovative leadership and reputation in customers’ minds and quickly gain market share. Catch-up games in a competitive market, especially one driven by disruptive innovations, are often more costly and have a lesser chance of winning.

For the right positioning as a Public Cloud provider, TriStrategist thinks a company first needs to decide on these initial differentiations:

  • A “department store” or a “specialty store”?
  • A global player or a local player?
  • An infrastructure player or a technology-solution player?
  • A fit-all provider or a specific market-segment provider?

These choices are not necessarily mutually exclusive, and as the cloud computing market evolves, offering choices and levels will evolve with it. Nonetheless, these considerations help narrow down one’s targets and save money and time at the beginning of the game in a new, vast but low-barrier market.

A winning position always comes with the right set of strategies to win. TriStrategist believes that to ensure long-term sustained growth or gain market leadership, Public Cloud providers need to focus, fast and clearly, on establishing one or more of the following strategic advantages with their positioning.

1. Geographic advantage

At the current early-adoption stage of cloud computing, IaaS may still be the initial entry choice and logical offering for many providers, especially “department store” players, although things could change quickly. IaaS is by nature low-barrier both to enter and to switch away from, and it has already started moving toward a commodity service proposition similar to utilities. Surviving utility companies usually own a certain geographic dominance, built on heavy geo-centric infrastructure investments, to fend off competition on price alone. Not surprisingly, Public Cloud IaaS providers may well need a similar defensive strategy today.

Globally, country and jurisdiction barriers to business remain, and data privacy and sovereignty will be ongoing concerns. The strategic placement of data centers will continue to be critical for cloud providers: for redundancy, for data security and privacy, for low latency, and for the availability of local offerings and support. For the next few years, the most significant battlefield for global geographic advantage will likely be China. China still has very low cloud infrastructure and service coverage due to tight government controls and protections, but things can change quickly, as the Chinese government and its booming industries cannot tolerate the lag much longer. The importance of this market lies not only in its huge commercial potential but also in its geographic size and central location for the entire APAC region and the globe. Today Microsoft, IBM and Amazon have all been very busy gaining a foothold in this market.

2. Big Data Advantage

Data is to cloud computing what water is to the natural clouds in the sky, flowing in and out in various forms. Eventually all data will live in clouds, public or private. If we believe that 90% of the data in the world today was created in the past two years, we have seen only the tip of the iceberg: more and more data will be generated and flood in, especially unstructured data. The data reality of the cloud computing age demands compelling Big Data stories and an outstanding ecosystem of Big Data solutions and tools from a successful Public Cloud provider. These offerings also need to be on par with, or able to connect with, the great innovations in data technologies of today and tomorrow.

3. Business Transformation advantage

Cloud computing has also displayed its disruptive nature to businesses on several fronts, especially through SaaS and its possible future variants. Today the worldwide adoption rate for public cloud services is still low, and the technology levels of business and IT operations vary enormously. Even in the US, some industry frontiers are moving forward with mind-blowing speed and explosive innovation, as in computing & IT, biotech and materials, while many traditional businesses, including governments, still run on very old technologies: slow, fearful, and lacking the vision, resources or confidence to change. The survival of many businesses depends on their speed to transform. Demonstrating future possibilities and providing quick, easy solutions that transform customers’ IT and business better than the competition should definitely be among the core strategies of Public Cloud providers. This will likely be the area with the highest growth potential and profit margins for cloud computing in the near future.

4. Flexibility in platform and offerings

Flexibility ensures future extensibility. Market demands will vary, and the growth scenarios enabled by cloud computing will vary with them. Even today, at this earlier stage, flexible offerings tailored to customers’ specific needs are often preferred, in both IaaS and SaaS. Tuning into this advantage will give a player more room to win in the long run.

More than ever, we live in a ubiquitously connected and technology-driven world. Cloud computing opens up new dimensions of possibility where many innovations, disruptive or continuous, will be born. If the future competition is going to resemble a marathon run not on gravity-bound earth but in zero-gravity space, then every business must keep an open mindset to envision, explore and experiment, starting today.

For further in-depth discussion of the right positioning and winning strategies for your business, or how to implement them in detail, please contact TriStrategy.

The Positioning of Public Cloud Services – Part II

In today’s Public Cloud market, each large player displays its own unique strengths. Competition is fierce on price (in IaaS in particular) and on features with higher growth potential that can achieve better economies of scale. New services and features are being added rapidly to compete and differentiate more effectively.

Amazon Web Services (AWS) has a clear early-entry advantage in IaaS. It pioneered industry concepts such as “pay by usage hours”. Recent surveys indicate that AWS currently leads in market share, at more than 50% among businesses of all sizes using Public Cloud. AWS’s key strengths are at least three-fold: a total-cost advantage in IaaS from commodity hardware with promises of unlimited capacity and compute power; a full range of Linux-based open-source solutions for Big Data; and global geographic zoning coverage. The Amazon S3 storage system, very practical for storing massive unstructured data, currently hosts trillions of objects and processes a million requests per second. It offers an industry-leading price of about $0.03 per GB per month (roughly $30 per TB per month). AWS’s packages of Linux-based solutions appeal to a broad market base that has been experimenting with mixed IT solutions built on newer technologies and open source, especially for Big Data. Various Hadoop ecosystem tools for Big Data processing and analytics are wrapped inside AWS managed services such as Kinesis and Elastic MapReduce (EMR), so customers need only tune the number of cluster nodes for their data load (and the associated usage cost) instead of spending time tweaking Hadoop code, which is often a huge resource challenge for open-source adopters. Another leading factor in customers choosing AWS is its worldwide geographic coverage. Amazon claims AWS coverage in 190 countries. Its zoning strategy not only offers great redundancy but also lets customers control the geographic placement of their instances, leading many to believe that their data can reside in the region of their choice with the needed Amazon support in place. However, with 30+ major AWS offerings and all kinds of unfamiliar terminology, things can quickly become confusing. Amazon may need to organize its offerings better and provide better education to the market.
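The S3 price quoted above is easy to sanity-check: $0.03 per GB-month works out to roughly $30 per TB-month. A minimal sketch of the arithmetic (using the 2014 list price cited in the text; current prices differ, and request and transfer charges are ignored):

```python
# Sanity-check the S3 storage arithmetic quoted above:
# $0.03 per GB per month implies about $30 per TB per month.
PRICE_PER_GB_MONTH = 0.03  # USD, the 2014 list price cited in the text

def monthly_storage_cost(size_gb: float) -> float:
    """Estimated monthly S3 storage cost in USD (storage only)."""
    return size_gb * PRICE_PER_GB_MONTH

print(monthly_storage_cost(1024))  # 1 TB (1024 GB) -> 30.72 USD/month
```

This per-unit framing is also how customers typically reason about the IaaS price war: storage and compute reduce to a handful of commodity unit prices.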

Google has clearly positioned its cloud platform as the platform for developers. Google has significant advantages in developer areas, as its software-driven cloud architecture is a direct extension of the internal developer platform and solutions that support its gigantic search service. Google’s App Engine is a strong PaaS contender, adopted by a large share of the small and medium companies in the Public Cloud market. Google runs a data-driven business, and its developer solutions, including MapReduce, BigQuery and others, often become the most widely adopted in the industry. While other companies, including Amazon AWS, are still learning and implementing Google’s earlier solutions such as MapReduce as the top analytic choice for Big Data, Google has already come up with more innovative ones. For example, Dataflow has already replaced MapReduce and BigQuery internally as a faster solution, though it has not yet been fully disclosed externally. Google is also marketing the other advantages of its software-driven cloud, such as high uptime, virtual storage, zero startup time and consistent throughput. The possibility of real-time collaboration in the cloud appeals to developers and many business users, and Google’s Android developers can leverage the cloud platform directly for mobile applications, which will be a huge growth area for clouds. Although Google faces the dilemma of open-source openness in competition, its speed of innovation and the flexibility of its cloud architecture make it a formidable competitor in future cloud offerings, especially in application areas. In fact, Google is aggressively pursuing new cloud-based workplace software to compete more effectively in the enterprise space, where its cloud adoption has lagged behind Amazon’s and Microsoft’s.

Microsoft certainly has the capability to offer the broadest variety of cloud services. It is currently trying to convert its numerous software advantages from the desktop world to the cloud, but it needs new thinking, as the cloud reality demands new dimensions for software beyond the isolated desktop. For cloud adoption, it can benefit from its existing broad customer base, especially among large enterprises, and in fact Windows Azure adoption among large enterprises has been growing steadily. With IaaS and PaaS from Windows Azure and SaaS offerings such as Office 365, Exchange and Dynamics CRM, Microsoft definitely has the capacity to set up a “one-stop shop” for enterprise and mobile customers’ cloud needs, although it has yet to reach that stage or clearly define its positioning. Its IaaS and PaaS pricing moves almost in lock-step with Amazon’s, but the most appealing factor in Microsoft’s cloud offerings is its broad software spectrum. Many enterprise tools, including Active Directory, SQL Server and SharePoint, and developer tools such as Visual Studio, remain popular with customers, but not all features of the desktop versions can be easily transported to the cloud versions. The conversion itself can be a challenge for Microsoft and can easily confuse customers. Even for an enterprise customer on a pure Microsoft technology stack, migrating existing critical business applications to be truly cloud-ready is not an easy task today unless the customer only uses hosted options. Beyond the familiar, one brand-new feature, Azure ML, released in July, is a great addition to Azure PaaS; this drag-and-drop cloud analytics tool was instantly welcomed by developers and data scientists alike. In addition, Microsoft is trying to open the Windows Azure platform to all customer choices, including Linux-based VMs, Hadoop via HDInsight, and tools for Android, iOS, Symbian and other mobile developers. The results of these offerings are yet to be measured. Compared with competitors’ cloud offerings, Microsoft has the advantage of familiarity to customers, but caution is needed on the long-term roadmap, as familiarity can become a deterrent in the face of disruptive innovations.

There are also other providers competing for Public Cloud market share, for example Rackspace in IaaS, Salesforce in SaaS, Verizon (Terremark), AT&T, VMware, IBM and HP. It is definitely a space that is only getting more crowded.

In Part III, we will discuss some possible positioning scenarios and winning strategies for Public Cloud services from TriStrategist’s perspective.

The Positioning of Public Cloud Services – Part I

For the IT industry and for targeted customer organizations, is cloud computing a disruptive force or a continuous evolution? Which types of cloud offerings could become commodity services, and when?

The answers may vary among organizations and across types of cloud services, but addressing these questions may well help determine positioning strategies for cloud service providers in an increasingly competitive market, most importantly for Public Cloud providers.

Today there is a paradigm shift in how businesses perceive cloud computing. From the initial concept of infrastructure renting to connected business operations, from lowering CAPEX to enabling new opportunities, many companies have moved from “cloud watchers” to “cloud chasers” and will eventually want to become “cloud riders”. Worldwide adoption of cloud services keeps increasing as a result, and many global companies are moving from the earlier “Transition” stage to a “Transformation” stage as cloud technologies and services mature and the future with clouds becomes clearer and highly enticing. A surprising jump in SaaS adoption in the fourth annual Cloud Computing survey (2014) offers one proof point: from 13% adoption in 2011 to 72% in 2014. In this fast-shifting market, with numerous global vendors jumping in and huge investments piling up over the past few years, the positioning of services and value differentiation are hugely important for Public Cloud providers, today and in the near future.

Because Public Cloud offerings demand huge upfront and continued investment in both infrastructure and critical capabilities, some have predicted that the industry will eventually consolidate into the hands of a few large players. However, since cloud computing affects the future IT and business models of every business, and offerings can become ever more innovative, the market is immensely large if not unlimited. With an increasing future variety of value-added services (especially on the software and application side), hardware costs getting cheaper and cheaper, and open-source tools for cloud-based data handling and applications readily available, the result could be a very fragmented market with both multiple “global department stores” and many “specialty stores”.

With IaaS today, for example, value differentiation becomes increasingly difficult as the market crowds with large players and smaller players backed by solid investment. For many enterprises, IaaS can be considered a continuous evolution and a low-barrier business; most data center or colo operators of the past can naturally become IaaS providers. At the same time, many large companies that already own data centers continue building more to offer cloud services to themselves or to a related community: for future growth, for global expansion, for the coming age of data and connectedness including IoT. Operating secure clouds of their own makes a lot of strategic sense for them, and some, such as AT&T and Verizon, have also decided to join the fray as Public Cloud providers. Even for many medium and small companies, Hybrid Cloud solutions are more on their minds for the future, with mixed on-premises solutions alongside various value-added public offerings from multiple vendors. Under these market conditions, the coming challenges and competition in Public Cloud services could be even more intense.

Today, cloud computing is still at the early-adoption stage throughout the global IT industry, although momentum is building fast. There are still only a handful of large, early-entry Public Cloud providers on the market who can offer a wide range of cloud services, namely Amazon (IaaS, PaaS), Microsoft (IaaS, PaaS, SaaS) and Google (IaaS, PaaS, SaaS). If cloud computing is viewed as a disruptive innovation in the IT world, early entrants will typically enjoy a certain degree of advantage.

In Part II, we’ll look at the current strengths and positioning of these large players in the Public Cloud arena. In Part III, we’ll discuss possible winning strategies for positioning Public Cloud services in future competition.

IBM’s Counter-Disruptive Strategies

IBM’s profitability and stability have relied on a combination of specialty servers, software and services with high switching costs for enterprise customers. On the server side, IBM has done fairly well in the high-end server market, holding about 57% market share as of early 2014.

Although IBM’s recurring revenue streams and financial performance are still strong, it faces cloud computing as a disruptive threat, especially the commodity-hardware-based, software-driven cloud services promoted by Google, Amazon, VMware and others. With compute power doubling faster and faster and prices getting cheaper and cheaper, cloud services are quickly nudging into the large-enterprise IT and scientific-computation space where IBM used to dominate. Because most of IBM’s software and services business is also chained to its specialty high-end servers, the threat looms ever larger over IBM’s once-prized niche markets. It is hard for such a huge company to compete in the low-end cloud service market without fundamental changes.

To counter the disruptive threat, IBM has adopted several strategies simultaneously.

First, IBM is offering its own cloud platform, but in a different fashion. In January 2014, IBM announced the sale of its low-end server (mostly x86-based) units to Lenovo of China in order to focus on other strategic goals, including cognitive computing, Big Data and cloud. IBM is also trying to acquire more cloud software companies to help migrate its existing software and services to its own “distributed” cloud platform, keeping the existing lucrative customers who still offer the largest profit margins. Timing and luck have yet to play out, and this strategy may or may not sustain long-term advantages for IBM.

Second, instead of competing head-on in low-end cloud services, IBM invested $1 billion starting in January 2014 to form a 2,000-person new business unit, in a new location separate from corporate headquarters, to move IBM’s super machine “Watson” onto the cloud. Aiming to provide cognitive machine-learning technologies and services to enterprises, this strategy competes at the higher end of the cloud services value chain. It is common to house a new initiative in an independent division to avoid being adversely affected by an existing, inefficient corporate culture or playbook, but a new organization of 2,000 people seems overly extravagant and is hard to interpret as a bold innovative move with due efficiency. On another note, the original famous Watson, IBM’s super-intelligent machine built on IBM’s DeepQA technology (which somehow sounds very similar to Deep Thought from the movie The Hitchhiker’s Guide to the Galaxy 🙂 See our earlier blog on Machine Learning), was a room-sized giant “machine being” running on a costly cluster of 90 IBM high-end servers, with a total of 2,880 processor cores and 16 terabytes of RAM. Now it is said that by switching to a cloud-based computing platform and software, Watson’s speed increased 24-fold (a 2,300% performance improvement) and its size shrank to that of a pizza box. Still, this seems to be another tweak of the existing business, a middle-road strategy for dealing with disruptive innovations, an approach not generally favored by past business lessons.
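The two Watson speed figures reported in the press are consistent with each other: a 24-fold speedup over a baseline means the gain over that baseline is (24 - 1) × 100 = 2,300%. A quick check of the arithmetic:

```python
# Verify that a 24x speedup equals a 2,300% performance improvement
# (improvement is measured relative to the original speed).
def improvement_percent(speedup_factor: float) -> float:
    """Percent improvement over the baseline for a given speedup factor."""
    return (speedup_factor - 1) * 100

print(improvement_percent(24))  # 2300.0
```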

Third, few companies can flex R&D muscle like IBM. It has a long tradition of investing heavily in R&D, which has paid great dividends over IBM’s history. IBM has invested in many obscure ideas, including neurosynaptic machines (which mimic the brain), quantum machines (which harness subatomic physics), and replacing silicon with carbon nanotubes and carbon films. In July, IBM announced a $3 billion R&D budget for chip research to further shrink circuitry and seek new materials for chip designs. If these ideas succeed, the landscape of the intelligence market could change dramatically. For example, parallel to the prevailing vision of an all-encompassing super-intelligent machine living in the cloud, a small neurosynaptic machine could be positioned in a different value network in the intelligence market of the future. Although many of these ideas are still some distance from commercialization, TriStrategist thinks they may truly be IBM’s saviors in the near future; they have real potential as the type of innovations that produce the next industry leaders.

Time will tell which strategy yields the best outcomes for IBM and how well it can execute. Either way, it will provide great business lessons for many companies managing disruptive innovations.

6 Common Business Strategy Errors

We very much like the summary of common business strategy errors from the book Playing to Win: How Strategy Really Works, by A.G. Lafley and Roger Martin (Harvard Business Review Press, 2013).

The Economist magazine today offers a succinct summary of these common strategy errors:

  • The Do-It-All strategy: no choices, no priorities.
  • The Don Quixote strategy: foolishly attacks the company’s strongest competitor first.
  • The Waterloo strategy: wages war on too many fronts at once.
  • The Something-for-Everyone strategy: tries to capture every sort of customer at once.
  • The Programme-of-the-Month strategy: the populist approach, pursuing whatever is fashionable in an industry.
  • The Dreams-That-Never-Come-True strategy: never translates ambitious mission statements into clear choices about which markets to compete in and how to win in them.

Business leaders are tasked with making their strategies work and avoiding these common errors. That is easier said than done; we have seen so many new businesses fall into the categories above. We all need to keep these errors in mind and keep practicing. And then, most importantly: “no strategy lasts forever”.