Challenges to Today’s IT Managers

IT managers seem to be living in a sphere of paradoxes these days. With technologies, business models and processes shifting rapidly (from traditional server and desktop software support to today’s cloud computing and service-oriented models, from the role-based IT structures of the past to today’s task-focused realignments, from clearly defined functional teams to highly mixed, collaborative environments), the call for new management thinking, models and skills is urgent.

A few of the interesting paradoxes facing today’s IT managers include:

1. Bold initiatives vs. Business-as-Usual

Nowadays there is no business-as-usual, given all the disturbances and changes around IT businesses. Managers whose goal is to maintain steady growth in functional scope will likely find themselves walking toward a dead end. It’s a demanding world for IT managers: new technologies, new concepts and new markets call for bold initiatives and moves into uncharted territory in order to stay relevant.

2. Delegation vs. Details

To be effective, experienced managers must know how to delegate. Yet to demonstrate competence in today’s fast-moving IT environment, managers are also expected to stay hands-on, understand every detail of the ongoing business, technical or otherwise, and be able to articulate those details to higher management at any time. With things moving faster and change constant, this balancing act becomes harder and harder to achieve, and situations can border on the absurd.

3. Role-based vs. Skill-based Team Building

Increasingly, IT managers no longer manage a functional team with similar job requirements and skillsets. Today’s IT environment often assigns a manager to lead a new initiative or to deliver by projects and milestones. In such situations, a team with a versatile mix of skills is needed to meet the delivery requirements. Members with the needed skills, especially those who possess new skills or can acquire them quickly, are often called upon to assist other concurrent projects in the larger organization. The concept of “a team” becomes more a collection of needed skills, and the IT manager becomes more of a recruiter or facilitator than a traditional “manager” in such a picture.

4. People vs. Cost

In many large IT organizations, one of the central themes in the adoption of new technologies is to drive down costs significantly. At the same time, modern technologies and automation are indeed replacing many human jobs, especially those involving manual labor or routine tasks. RIFs and layoffs are frequent these days in large companies’ IT organizations. Yet we often hear that a good manager must first be a people manager. In an unstable or relentlessly cost-driven organizational environment, this paradox can add tremendous stress.

5. Experience vs. Everything New

Expectations are high for today’s IT managers in a traditional organization. They need to keep existing business operations fully functional without major trouble while leading transitions and new initiatives at the same time. They need to be experienced leaders and people managers who can bring diverse, shared or often newly recruited resources together to perform, while dealing with the constant demands and changes of the organization. They need to know their daily business in detail, yet also keep themselves up to date on rapidly emerging concepts and trends in the markets. A completely new set of skills is urgently needed for IT managers.

What are the solutions to these paradoxes? TriStrategist thinks that these daunting challenges to IT managers may be a direct signal that some fundamental changes to the structure and management concepts of IT businesses are due. In today’s highly technology-driven environment, new thinking on both organizational structure and management as a science is clearly needed.

Neuromorphic Technology

No computer chip today can compete with a functioning human brain, with its moderate size, in information gathering, intelligence and energy efficiency. That is because a human brain works drastically differently from today’s computers. But what if future computers adopted the brain’s neuron structure and started processing information with logic closer and closer to the way a human brain works? With enormous processing power and memory storage available, would computers one day indeed surpass the brain? The answer has become increasingly difficult to come by.

Scientists and engineers have been trying to build a brain-like computer for decades. With the concepts of artificial neural networks (see our earlier blog on Artificial Neuron Networks) and advances in computer engineering, a modern chip designed without a conventional powerful central processing unit (CPU), but with millions of parallel “neurons” and connecting “synapses” packed into a single unit to simulate one brain function, e.g., one cognitive ability, may come close to capturing that specific function after repeatedly learning, storing and processing information. When a large number of these special units are combined in a coordinated fashion, a “machine-made brain” may just be born. This is how “neuromorphic computing” got its name. IBM, Qualcomm and several other chip designers and manufacturers have been experimenting with these ideas in recent years, with great progress.
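The “neurons” and “synapses” described above can be sketched in a few lines of code. Below is a minimal, illustrative model using a leaky integrate-and-fire neuron, a common abstraction in neuromorphic research; it is not any specific vendor’s chip design, and all class names and parameter values here are our own assumptions for illustration.

```python
# A tiny layer of leaky integrate-and-fire "neurons" driven through
# weighted "synapses" -- the basic building blocks that neuromorphic
# chips implement directly in hardware. Illustrative sketch only.

import random

class LIFNeuron:
    """Leaky integrate-and-fire neuron: the membrane potential accumulates
    weighted input, leaks over time, and the neuron fires a spike when the
    potential crosses a threshold, then resets."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold
        self.leak = leak          # fraction of potential retained each step
        self.potential = 0.0

    def step(self, weighted_input):
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return 1              # emit a spike
        return 0

def run_layer(neurons, weights, input_spikes, steps):
    """Feed a fixed input spike pattern through synaptic weights for a
    number of time steps; return each neuron's total spike count."""
    counts = [0] * len(neurons)
    for _ in range(steps):
        for i, neuron in enumerate(neurons):
            drive = sum(w * s for w, s in zip(weights[i], input_spikes))
            counts[i] += neuron.step(drive)
    return counts

random.seed(0)
layer = [LIFNeuron() for _ in range(3)]
weights = [[random.uniform(0, 0.5) for _ in range(4)] for _ in range(3)]
spikes = [1, 0, 1, 1]  # which input lines are currently active
print(run_layer(layer, weights, spikes, steps=20))
```

Note how different the control flow is from a conventional CPU program: there is no central instruction stream, only many simple units that integrate, leak and spike in parallel, which is why such designs map naturally onto massively parallel, low-power hardware.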

Besides the enhanced “brain-like thinking” capabilities of such machines, another key benefit of neuromorphic chips is energy conservation. Information storage and processing are arranged inside the same interconnected neuron nodes, so the energy and heat costs of shuttling data back and forth, such as between memory and CPU in conventional chips, are drastically reduced, resulting in better performance in general.

Due to its significantly disruptive nature (to both future hardware and software) and high commercial potential, the World Economic Forum’s Meta-Council on Emerging Technologies ranked neuromorphic technology as one of the Top 10 Emerging Technologies of 2015. The interesting fact today is that although prototypes of neuromorphic chips are available, great software to demonstrate their “brain” power has yet to come out.

What are the best ways to test the intelligence level of a machine-made brain? And where would we use it first, on a robot called Chappie?

The Potential Drawbacks of Do-It-Yourself Cloud

The No. 1 initiative in many large corporate IT departments nowadays is moving to a cloud platform. Bank of America, SAP and many other large companies in non-tech industries are trying it. The decision to adopt a Public, Private or Hybrid Cloud largely depends on the nature of the business, security concerns and the decision drivers of company management. Because the cost of building a cloud-ready modern data center is much lower than in the past, thanks both to the published best practices of Google, Facebook, etc. and to falling hardware costs, many security-sensitive companies may decide to build a Private Cloud only, and to build it entirely on their own, even if they have to work with certain technology vendors here and there. TriStrategist thinks that such a decision has several obvious drawbacks.

1. Lack of central vision, planning and design for an extensible platform that can serve future cloud needs.

In a huge enterprise that has run on traditional IT models for decades, new ideas are often tested on a small scale first, initiated by a few departments. The same is true for testing out the cloud, which often proceeds with only limited design, planning and company-wide perspective. For example, some departments may start by moving applications onto virtual machines; others may interpret cloud services as moving to fee-for-service models between departments. Few people, including management, fully understand the cloud computing concepts and technologies in the market. Yet a modern cloud service platform is a gigantic wave that can disrupt the entire IT operation, resource utilization and existing internal business models. It is hard to implement a comprehensive, cohesive cloud platform and service model without considering enterprise IT and the future needs of the company as a whole. Piecemeal implementations are most likely to be short-sighted, costly, and incompatible or redundant with one another.

2. Lack of right resources and skillset for an efficient implementation.

Cloud and related technologies, such as Big Data, are mostly new, and many companies simply don’t have people with the skills needed. A cloud platform for a large company usually has to embrace many mixed technologies, from hosting infrastructure and virtualization to automation and self-service. It is often difficult to find people who understand the linkages between the different technologies well enough for a smart design. Hiring outside consultants may not help much either, since consultants from a particular technology vendor tend to know their own product far better than they understand the enterprise’s complex needs and multi-technology environment. Cloud computing also demands new skills; automation development roles, for example, are hard to fill with traditional DBAs or IT support staff.

3. Lack of versatility in modern technology infusions.

Cloud technologies are evolving fast; new technologies, concepts and tools mushroom every day. An enterprise cloud platform therefore needs to stay open, leaving room for better future technologies to fit in. An initial implementation done only by internal people tends to follow the path of least resistance, which can result in a lack of extensibility and versatility when future needs demand them.

What would be a better approach? TriStrategist thinks that companies inexperienced with cloud, especially those outside the technology industry, should take a more open-minded approach to building the cloud from the beginning. Under a comprehensive company-wide vision, business segments with fewer security constraints should consider using established Public Cloud providers first. The goals are twofold: a fast and cost-effective implementation, and collecting experience while training the company’s own staff at the same time. This may be a smarter shortcut to an extensible Hybrid Cloud for the company’s future needs than the painstaking novice effort of building your own internal cloud, which will almost surely encounter unexpected problems. A company’s long-term vision should embrace this open-mindedness for its own sake.