Friday 24 August 2012

The myth around niche.


The idea of a niche product is, by itself, slightly flawed. The intent of this post is certainly not to get mired in the meaning of the term 'niche'. Nevertheless, I begin with its definitions, because they reveal some interesting connotations of how the term has come to be used in the context of products and companies. Here are some of the definitions (noun form only):
- A shallow recess, especially one in a wall, to display a statue or ornament
- A cranny, hollow or crevice, as in a rock
- A position particularly well-suited to the person who occupies it
- A situation or activity specially suited to a person's interests, abilities or nature
- A focused, targetable part of a market
- A special area of demand for a product or service
The sequence of the definitions above traces the rough evolution of the term over the years. It is evident from the very words and phrases used that the last few definitions are fairly recent.
Now let us take a look at the way the industry has used the term. Gartner's Magic Quadrant has 'niche players': companies to which it attributes a relatively lower ability to execute and a less complete vision with respect to a given market segment. On a different note, it beats me why I should be talking about folks who have neither a complete vision nor the ability to follow a strategy through with action!
Somewhere in the entire scheme of things (read: the parlance and thought process) that the industry has gotten used to, every small player with a small footprint of product installations and customers is christened a niche player. Well, there is nothing wrong if a niche player is small. But there is everything wrong when you start talking about every small player as a niche one!
In this era in the evolution of organizations and businesses (why, even in our personal lives, for that matter), the number of brick-and-mortar products available to cater to each aspect of the business, and the choices they offer, is, to put it mildly, large. Against this backdrop, it is somewhat surprising that the entire business application products space comprises far fewer large players than one would expect. A look at the evolution of these large companies reveals some really interesting facts.
Most of these products evolved, the first time around, from robust and comprehensive home-grown automation solutions for rather generic business processes. These were packaged and taken to the neighborhood market with the promise of shorter time-to-market and lower cost of implementation, the obvious advantages of a 'second sale'. With the buyer community latching on to the idea, the footprint grew to an installed base of at best double digits, and at least five or six instances, and the market itself grew beyond the neighborhood and even across borders, aided by a globalization agenda.
In parallel, the number of acquisitions in the space grew dramatically, and the smaller companies, along with their products, were swallowed up. The core reason behind the consolidation was as much aspiration as economics. Well, this is all very fine, because that is precisely how businesses evolve. But what may turn out to be more interesting is to look at the company that had implemented the product from the smaller (and so-called niche) product company in the first place. This company is now pretty much at the mercy of the larger product company and its strategy to retain or retire the product and brand it acquired. And then there are the other niche product companies that haven't sold out and continue to live with the 'niche' they have made for themselves. The niche, however, for these companies, has begun to take on the first two of the definitions presented right at the beginning!
At the other end of the spectrum, the larger product companies, which started with less than 40-50% coverage of, and hold on, the entire spectrum of business processes an organization typically has, have continually grown that coverage to over 70-75% through organic and inorganic expansion. Much can be said and debated about what the core competence of an organization is, and whether its customers would see any product or service differentiation when three-fourths of its processes are standardized and pretty much the same as the rest of the flock! But that is a slightly different point.
It is clear that a strategy built around the adoption of a niche product and company ought to be carefully evaluated against the long-term goals of one's organization and the sustainability of that strategy over a longer time horizon. And if one does go down that path, it is well worth considering keeping an option open to transfer ownership of the product and its support to one's own organization.

You may also find some interesting perspectives on this and related themes in


Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Tuesday 14 August 2012

Supplier collaboration in ICT a poor shadow of its brick & mortar cousin!


One school of thought recommends that when advice comes free, you think twice before taking it (by 'free' here, and through the rest of this write-up, I mean advice volunteered rather than explicitly sought, not a monetary view of free). Well, that's what all of us were taught and that's what most of us do. Now here is something to chew on: taking free advice is obviously not the same as implementing it blindfolded. So what is the cost of taking free advice? It is the time you take to understand it and the time you take to evaluate it. If someone finds that far too expensive, it might be prudent to ask what they think they ought to be doing with their time.
None of us can think of an occasion when we had some funds to put aside in our personal lives and did not agree to meet someone who had an idea or two on where we should invest them for the best returns. The more ideas, views and options, the merrier. Inexplicably, we do not wear the investor hat when conducting business.
Just ask the people who invest in ideas whether they find the time spent listening to, digesting and evaluating an idea worth their while. In this era of conducting business collaboratively across multiple partner ecosystems, ideation and innovation cannot be carried out in silos. There is nothing new about that thought, given that the likes of CPFR (Collaborative Planning, Forecasting and Replenishment) have been around for an eternity and supplier (read: partner) collaboration is fast catching on, right?
Yes and no! The very folks who have made collaboration a reality within business ecosystems by providing the platforms and tools to make it work, and I mean the IT community, are where you will find the least partnership in ideation. I know there are a few organizations that have used their partner ecosystems in information and communication technology effectively to maximize ideation and manage their techno-commercial investments towards the realization of their business goals, but these are few and far between.
In fact, there are few planned, systematic, voluntary and periodic interactions in the ICT supply chain where ideas are sought from ICT partners on the basis of the business data, challenges, issues and problems shared with them. Where such interaction exists, more often than not it isn't initiated where it ought to be. And even where it is, the focus is on the TCO (total cost of ownership) theme, with very little, if any, on the CVC (Contribution to Value Creation) theme. The appetite and intent to ideate, contribute and make a difference is there, on either side of the stream. It's time 'the partners' crossed the stream. It's time they collaborated.
The cost of ideation, to the consumer of the idea, is disproportionately small compared to the potential value to be derived from it! Partner collaboration provides a unique opportunity for the ICT community to drive business behavior and set examples, for a change, rather than just being the passive enabler. This could well be the long-overdue impetus required on the business outcome alignment roadmap that is imperative for the next era of ICT growth.

You may also find some interesting perspectives on this theme in

Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Tuesday 7 August 2012

Big…Bigger…Biggest! - making sense of world data.

There have been numerous articles and publications on the volume of data, across analog and digital forms, that exists in the world and how it could be put to constructive use. The estimates are that the total volume of data stored by the end of this year will be in the 2.7 zettabyte range, growing at just under 50% year-on-year. To the uninitiated, a zettabyte is a trillion gigabytes (GB), or a billion terabytes (TB). Now that sure sounds like the next biggest business opportunity that everyone should latch on to, right?

The answer is yes, but with a note of caution. Here is why.

Over 90% of all the data that exists (and over 99% of the new data being created, or to be created in the future) is unstructured media data: video, audio and images. Take this out of the equation and we are still talking about around 220-230 exabytes of existing data, with another 170 exabytes to be created through to 2017. Analog data, in one form or another, was around 6% of the 220-230 exabytes available in 2007. Since it will grow only infinitesimally, we will take total analog data as around 3.5% of all data on average, and take that piece out of the calculation too!

The new denominator is therefore in the 375 exabytes ballpark!
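
For what it's worth, here is a minimal back-of-the-envelope sketch in Python of the arithmetic so far. All the figures are this post's own estimates (assumptions, not measurements), so the output is a ballpark, nothing more:

```python
# Rough arithmetic behind the ~375 EB denominator, using the
# post's own estimates (assumptions, not measurements).
existing_non_media_eb = 225   # midpoint of the 220-230 EB range above
new_non_media_eb = 170        # additional non-media data through to 2017
analog_share = 0.035          # analog data assumed at ~3.5% of the total

non_media_eb = existing_non_media_eb + new_non_media_eb   # ~395 EB
denominator_eb = non_media_eb * (1 - analog_share)        # ~381 EB

print(f"Non-media, digital-only data: ~{denominator_eb:.0f} EB")
# prints ~381 EB, i.e. the ~375 EB ballpark quoted above
```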

A closer look at this data reveals a whole bunch of mind-boggling realities and helps put the numbers in the right perspective. We will, over the course of this discussion, make some assumptions to help the math along; some may be incorrect (though no one would be able to prove it one way or the other, despite big data!), but none will thematically challenge the hypothesis. Here's one: the total amount of storage for non-user data (system and application software, for example) is assumed to be around a third of the total. That seems fair when you take the Forrester view that there will be around 2 billion computers in the world by 2015 and that the minimum each such device needs is around 50 GB just for OS, office, security and networking tools. That throws a further 125 exabytes out of the window (no pun intended!), leaving us with around 250 exabytes of user data!

Now that media files and system/application software are out of the equation, we will assume that 90% of the remaining data is corporate in nature (the other 10%, or 25 exabytes, is not necessarily personal data; it could even be, for instance, data generated in office tools, mail data and so on). I cannot think of a corporate that does not back up its data, so the minimum redundancy at the transaction data level alone would be 50%. Take that out of the equation, because the same data analyzed twice over yields no more intelligence than analyzing it once! That removes 45% of the data we started this paragraph with. Also, most organizations of a size where data volume matters for this calculation would have a data warehouse, an operational data store or at least a bunch of denormalized data stores, introducing a further 50-67% redundancy; that translates to another ~30% of the data dropping out of the equation. This leaves us with 25% of the 250 exabytes, or ~60 exabytes.

A significant part of the non-corporate data (the 25 exabytes) is essentially generated and consumed with a view to sharing and communicating with other data/information stakeholders. I cannot think of someone creating a file locally, sending it to someone for consumption, and then diligently removing the redundancy of having stored it both in their file system and again in their mail folders. The same happens at the receiver's end. And not all communication is one-on-one; I cannot think of communication that ends with the first receiver either! Assuming a two-step average communication chain and a storage redundancy of 50% at each of the three players, we are talking of a mere 16% of the data being unique. It would be significantly less for personal data, given how little of what you send in your mails is stuff you generated yourself: think about the number of forwarded mails and messages in all your personal mailboxes, accounts and the like! Giving the benefit of the doubt and taking a conservative 20% of the non-corporate data as unique, this part contributes only 5 exabytes, taking the total data we can derive intelligence from down to 40 exabytes.

Assuming there is not much information worthy of analysis in data that is over five years old, the amount of data we are likely to deal with over the next few years, with a view to extracting intelligence and driving decision making, is effectively closer to 30 exabytes in magnitude. Now, to put this into perspective, half of this data would be residing in transaction or operational systems, and the rest would already be in one or another form of data warehouse. That means around 15 exabytes, or around 15 million terabytes, of data. Set that against your take on the number of existing relational and dimensional data stores, and on the average size of an installation.
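
To tie the whole funnel together, here is a minimal Python sketch of the successive cuts described above. Every ratio is carried over from the preceding paragraphs (the 25% cut for data older than five years is implied by the 40-to-30 exabyte step rather than stated outright), so treat the output as an order-of-magnitude estimate, not a measurement:

```python
# The funnel from the ~375 EB denominator down to the analyzable residue.
# Every ratio below is one of the assumptions stated in the paragraphs above.
denominator_eb = 375.0

user_eb = denominator_eb * (2 / 3)    # drop system/app software -> ~250 EB
non_corporate_eb = user_eb * 0.10     # ~25 EB of non-corporate data

# Corporate redundancy: 45% (backups) plus ~30% (warehouses/ODS) of the 250 EB
deduped_eb = user_eb * (1 - 0.45 - 0.30)              # ~62.5 EB ("~60 EB")
corporate_unique_eb = deduped_eb - non_corporate_eb   # ~37.5 EB

non_corporate_unique_eb = non_corporate_eb * 0.20     # 20% unique -> ~5 EB
unique_eb = corporate_unique_eb + non_corporate_unique_eb  # ~42.5 EB
unique_eb = 40.0                      # rounded down, as in the prose ("~40 EB")

recent_eb = unique_eb * 0.75          # drop data older than ~5 years -> ~30 EB
warehoused_eb = recent_eb / 2         # half sits in warehouses already -> ~15 EB

print(f"Unique, recent, analyzable data: ~{recent_eb:.0f} EB")
print(f"Of which already warehoused:     ~{warehoused_eb:.0f} EB "
      f"(~{warehoused_eb * 1e6:,.0f} TB)")
```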

That surely should help put Big, Bigger and Biggest data into perspective. The need is to be pragmatic about the real volume of data we are dealing with, about how effectively we could derive intelligence from it and put it to profitable use, and, therefore, about how far it really pushes the current data paradigm!

You may also find some interesting perspectives on this theme in

Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Thursday 2 August 2012

Back to Basics – the CIO rediscovered.


Those were the times! The majority of the CIO's time, and in turn the behavior CIOs drove within their IT organizations, was focused on the next big possibility for the business to expand. What was hitherto not possible was being made possible, thanks to the advances in information and communications technology. From green-field automation for increased capacity and throughput to enabling the globalization of economies and business operations, the CIO organization was a critical enabler of business growth and expansion. It did so with a never-seen-before speed to market that made mega-businesses nimble-footed and agile, from strategy to execution. And in the entire scheme of things, the fact that it did so at increasingly lower costs was an added plus.

This was, arguably, the golden phase, when the ICT spend of organizations leaped from the fractional existence of yore, aligned to running data processing departments (some old dinosaurs would remember this), to the single-digit percentages that are the norm today.

Somewhere down the road, however, the original purpose, though not entirely lost, got sidetracked. The entire emphasis started moving towards optimizing this spend. This was hardly surprising: ICT spend was now a meaningful percentage of the total cost of running the business, and it was being viewed exactly like every other enabling or support function not directly involved in the larger cause of delivering the service or product the business existed for. Instead of being treated like an R&D, business innovation or value engineering function, it was being likened to business-enabling functions like HCM, F&A or MRO. In fact, in many businesses, the CIO function was being rolled up or aligned into the CFO agenda.

The CIOs and their organizations, ever so imperceptibly, started optimizing their resource supply chains to source globally and focusing on TCO optimization projects that cut the cost of operations and sustenance (run the business, keep the lights on, whatever you want to call it!). Even the little money that was indeed being spent on new initiatives started going into areas like governance, security, risk and compliance. Nothing wrong with that, unless it is the only thing on your agenda! In the midst of all this, the cost of storage, processing power and communication dipped exponentially, making optimization an extended comfort zone.

Life, proverbially as well as in reality, comes full circle. Any bit of optimization today, based on the overarching themes of the previous paragraph, and indeed of the past decade and a half, would ring in infinitesimally small incremental benefits that would not sustain the interest of businesses to pump in investments with the enthusiasm and vigor of the past. There are some exceptional applications of those themes that still pack a punch, especially where the order of magnitude of the requirement is humongous. But these are a minority in the larger context of this discussion and, well, prove the rule anyway.

This has necessitated a revisit of the objectives and the larger purpose of the CIO organization, perhaps even to the extent of rechristening the function and the roles within it. Like any change of this nature and magnitude, it will be evolutionary and may play out over a good part of this decade. But it is inevitable. An inward focus on ICT optimization would be a self-centered and self-defeating strategy. The urgent need is to realign to the business expansion and growth agenda, and what better time to embark upon it than now, when businesses and the larger world economy are at their lowest in decades.

It is a welcome 'back to basics' for CIOs, and my take is that the vast majority of them and their larger teams are resilient and fully capable of rediscovering their true roles. It is only their acknowledgement of this situation, and the urgency with which they act on it, that will separate the boys from the men.

You may also find some interesting perspectives on this theme in
   
Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.