Monday 5 November 2012

Talent Acquisition – 8 things you just can’t ignore.


Talent acquisition in organizations, independent of the industry, society or geography they operate in, has posed and will continue to pose one of the biggest challenges to their success. How much can you decipher from the hour, half-hour or quarter-hour you spend with a prospective employee? For all those who wince at the idea of an arranged marriage – committing a lifetime to a partner they have spent a similar amount of time getting to know – this is exactly the situation in work life. There are zillions of points of view and checklists on what to look for and which pitfalls to avoid, and there is plenty of literature on the topic. Yet the results continue to be disturbing. I am trying to put together my own flavor of the dos and don’ts that I do not see covered often. Some of these might be pertinent only to a specific section of the organization (there is perhaps a bias towards leadership talent acquisition) while others are more generic. I have jotted them down as they came to mind, so here goes.
-         Trust the ecosystem that has produced the candidate in front of you. Career progression, rewards, recognition and awards are all better indicators of consistent performance than anything you might discover about a person in the minutes spent in a discussion.
-         Darwin’s theory of evolution is a strong one – trust it always. Two of the cornerstone ideas that shape the theory of evolution and natural selection are the survival of the fittest and the struggle for existence. Not every high school or undergraduate drop-out will go on to win the Nobel Prize or create the largest organizations in the world – those fairy tales sound good only in retrospect – and they can’t be treated as a template for employing the prospective candidate in front of you. Strong academic performance, the reputation of the alma mater(s) and of the organizations the candidate has been a part of are critical pointers to what the candidate brings to the table.
-         Beware of the glib-talking smooth operator. It is easy to get carried away by the way a person comes across when they have the gift of the gab to sway an audience with their articulation and mannerisms over a short period of time. To demonstrate capability and credibility, all you need is conviction and commitment, not flowery language, though the art of articulation is an added plus. Look for these qualities in the content of the conversation rather than the manner of it.
-         Beware of the over-achiever profile. Our image of superheroes is of someone who wears undergarments over their clothes. While that makes them easy to identify, not everyone who dresses up that way becomes a superhero purely by virtue of doing so. That would be comic-book naiveté! It is fairly normal for people to harbor the misconception that they were the only factor in a success story. Taking credit for what a team delivered with the backing of a sound ecosystem is the norm with the majority. The wiser ones know what worked in a situation and what their contribution to it was. They are able to articulate all the factors that contributed to a success – and, by virtue of that, how they stand out.
-          A leader that’s always right but never the first. A common mistake is to think that a successful leader has to be right always, or at least most of the time; that is considered beyond compromise. Look instead for someone who has taken the initiative and taken the first tentative step – even if nervously, for they were stepping into the unknown – but with the tenacity to learn from their mistakes and stay the course. Leaders are leaders because they lead, not because they are always right.
-         You need the right fit, not the biggest size. Put the role you are hiring for into perspective when you start judging the person across the table. More often than not, the base competency required for a role is exactly that – very basic! What you need is a level head on steady shoulders. You seldom need someone who can rattle off the value of ‘pi’ to the 100th decimal or mentally multiply two 13-digit numbers within a minute to do double-entry bookkeeping! You do not need an IQ of 150+ to understand average business processes and systems and solve potential business problems. A phone operator does not need to know who Alexander Graham Bell was, your travel desk folks do not need to know the Silk or Spice routes, and it is OK if your Chief Technology Officer does not know the exact charge or spin of a positron.
-         You do not have any points to prove, so steer clear of bullying. This is the most common mistake people make. When you have a smart person sitting across the table, do not feel intimidated and do not succumb to the urge to outsmart the person on one topic or another. On the contrary, get the person to talk more and more. What you need outside talent for is to generate newer ideas, not to impose your old ones. Remember the old adage: you are smart if you hire folks smarter than you, and stupid if you do the opposite.
-         Avoid the checklist trap. This is the classic one that dilutes the purpose of having a conversation; for that kind of exercise, you might as well administer a questionnaire instead. When you approach a discussion with a checklist in hand, you are not listening to the person – you are merely listening for the items on your list.

In summary, when you are trying to hire someone, whether for a specific role or otherwise, you can’t be too clouded by the interview or discussion that happens over a few to several minutes. There is a pre- and post-process that can be equally effective. And during the interaction itself, you need to stay grounded – neither overbearing nor overwhelmed! Last, but not least, be fully involved in the process and go in with the right, read appropriate, expectations.

Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Monday 29 October 2012

Leadership – 9 imperatives to success.


What makes a good leader? What are the leadership qualities that are most relevant to the current socio-economic and socio-political context? And what are qualities that are essential in a leader independent of the era, geography, culture, organization and the teams they lead?
The test of any list of leadership qualities is whether, when you take any one of them out, the success of the leader becomes decidedly questionable. Here’s a handy list of 9 leadership imperatives that you would find everywhere when you are not looking for them, but may not come across when you search…
1.       Leadership by command.
You cannot demand respect as a leader; you need to command it. And that does not come from your business card, the office you occupy or the title of your role. It comes from consistently appealing to the values that drive people to work together and achieve common goals. It comes from walking the talk and delivering on promises.
2.       Leadership on demand.
Successful leaders are seldom the ones who breathe down the necks of their teams. They are usually the ones that provide you with the independence and space to operate by yourself and are available to offer advice when sought.
3.       Leadership from the high moral ground.
Integrity, honesty and impartiality are imperatives in any role and in every walk of life. They are even more important as leadership qualities, since a leader cannot effectively drive organizational behavior without practising what one expects of the organization.
4.       Leadership is architecture and not management.
True leaders shape organizations, they do not run them. They build and mould teams, not structure and supervise them. They create ecosystems for success, not plans for wins. They inspire and invigorate, not control and investigate. They create dreams, a vision and the strategy to achieve them.
5.       Leadership through reciprocal trust.
A leader who cannot trust others does not trust themselves. Plain and simple. If a leader has exercised the choice of putting in place the folks (s)he thought were right for the task at hand, and is then immersed in doubt about those choices, the leader lacks self-confidence and trust in their own ability. Nor do people find such a leader trustworthy.
6.       Leadership is exploratory.
Leaders do not have all the answers. They know it. They also know that they are not leaders because they know all the answers. They are leaders because they know how to find the answers, and because they can navigate the course until the answers are found. They have the honesty to see reality for what it is, the spark to seek and explore, the strength and spirit to endure, and the urge and will to conquer.
7.       Leadership is secure in the team’s greater skills and their empowerment.
The success of leadership is a function more of the combined strength and output of the entire team being led than of the individual brilliance of the leader. Successful leaders build teams with individuals potentially stronger than themselves (and are secure in doing so), ensure synergies that make the team stronger than the sum of its individuals’ strengths, share any knowledge they may have that could fill gaps or enhance the team’s potential, and believe in sharing the outcome with their teams.
8.       Leadership is being decisive.
Leadership, by nature, is a direction-loaded function. If the choice were obvious, it would not be a choice anymore. Leadership is most often summoned at the precipice, or at the juncture where the path forward is not obvious. The ability to decide, to decide fast, and to be vindicated later by the decision made is what differentiates a good leader from an average one. When presented with all the data required to make a decision, any competent person would make the cut. It is in the absence of the complete picture – when the data does not lend itself to the decision, when delay means denial and denial means death (okay, agreed, that was dramatic!), when intuition takes preeminence in the making of a decision – that the good leader comes into their own.
9.       Leadership touches, impacts and, at times, transforms lives.
I am yet to come across a good leader whose purpose is not strong enough by itself to appeal to the people they lead. I am also yet to come across successful teams that are not bound together by anything beyond their immediate goals, or, for that matter, individuals in a team who are inspired only by the professional skills of their leaders. I have not seen customers who cannot appreciate a good leader conceiving or designing a product or service for the value its usage brings to the customer’s business, rather than for all the features stuffed into it. Leadership in its true sense leaves the person experiencing it touched, impacted and changed, for the better.
It is not as if you would not find leaders or their teams tasting success in the absence of one, many or all of these aspects of leadership. The unfortunate point is that the success in such cases is despite the leadership rather than because of it!

Friday 12 October 2012

What’s new with Agile or XP?


To start with, I would like to acknowledge that there are several views and interpretations of what Agile – as a thought, methodology, process, model and framework for software development or maintenance – is. To pick one and/or comment on which view is correct is not the intent of this discussion. The underlying principles that represent Agile, in my words –
-          Seamless, in-the-face and short-cycle communication across all stakeholders to ensure faster decision-making and a homogeneous understanding of all project and product aspects
-          A rapid and iterative approach to building software solutions that offers a ‘completed product view’ even while the work is in process, so everyone sees it in the same light and is able to contribute tangible, value-adding changes – instead of a virtual, imaginary idea that takes shape much later
-          People-centric rather than process-centric action, allowing a great deal of flexibility to focus on the what, when, where and which rather than the hows and whys
Extreme programming, or XP, advocates a similar line of thinking vis-à-vis the methodology adopted in Agile, with an added ‘no-frills’ approach to core programming – how functionality is coded, features kept to the bare minimum, et al.
So now that we have set the ground on what we are discussing, let us turn our focus to the question I posed at the start. What’s new about Agile or XP?
I have vivid memories, from more than a couple of decades back, of groups comprising some pretty tech-savvy business users and power users sitting alongside a bunch of programmers and programmer analysts (I must admit that whatever that role or title means has always intrigued me!) – people who understood a bit of the business and a lot of the functionality, had limited analysis or design exposure, but were solid to awesome programmers – huddling together for a few hours, days or weeks and doling out mid-sized enhancements to large applications and building small applications. And the folks I considered granddads of the trade then had told me the same tales from a decade further back, which sounded pretty similar!
A lot of us have had the experience of sitting together with a business user or power user and building apps from scratch without writing the BRDs, SSDs, URSs, HLDs, LLDs, ABCs, PQRs and XYZs! Many of these apps are still in use, are robust and have since been modified a zillion times over by later generations, beyond recognition. So what’s all the fuss about? That was, essentially, as Agile and XP as anyone would care to be.
It is amusing to see the Sprints and the Scrums and the Scrum Masters, extreme programmers and the whole tribe and its rituals. And the fact that there is a solid commercial value attached to the jamboree not just on doing software development and maintenance in this manner but also on learning how to do so (there are certifications galore to prove the point and your worth, if you care)!
In summary, there is no novelty in Agile or XP. There is merit in choosing that path when the situation demands increased velocity and when the torque factor on requirements is a lot more than usual – and it has been done for donkey’s years. There are multiple ways in which a software development or maintenance project can be carried out, and multiple methodologies and models to choose from, each with its inherent advantages and disadvantages. The folks on the ground are the best judges of what a given situation requires, and for them to choose one model over another, the least we can do is keep things simple to understand and dispel the myths rather than create more of them!


Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Friday 21 September 2012

Prudent investments – Knowledge management and the CKO.


There have been multiple views on what the role of a Chief Information Officer in an organization ought to be. In a previous article I shared some perspectives on how that role has panned out over the years and why it is critical to revisit the agenda and purpose of the CIO. While the focus of the CIO, and in turn the CIO organization, has to be on how the business boundaries can be expanded – a Contribution to Value Creation (CVC) view rather than the limited TCO (total cost of ownership) view – it still revolves around innovation based upon the usage of technology to accomplish what was hitherto not possible within the information gathering, storage, analysis and delivery domain. That is the information management ecosystem.
Like all the other assets that sit on the balance sheet of an organization, data, from its most raw form to its most refined, is an asset that needs to be managed optimally to derive maximum returns from it. While the data pyramid and the nomenclature applied to its different layers may vary from person to person and organization to organization, the essence of data refinement, and the views to take of it, are broadly defined by the value that comes out of its usage as an asset. I take a simplistic 4-layer view of the data pyramid through the rest of this article. This is not sacrosanct – everyone can have a different view, and that may not be incorrect at all – but the essence of the takeaways I focus on would remain the same, whatever the view. I simply view the 4 layers as data, information, intelligence and knowledge.
The first 3 layers, and the nomenclature and semantics thereof, have been in use for a long time. I believe these layers of the pyramid warrant no further analysis to gain any deeper understanding than already exists, and they will be touched upon only to provide relative references. What interests me, therefore, is knowledge – and we will discover why.
It pains me no end to hear people talk of almost all the ‘knowledge’ in the world being available at their fingertips within the World Wide Web. Name the thing and you can learn about it by typing it into any decent search engine out there, and the best part is that you don’t even need to know how to spell it correctly. What we do have access to is a plethora of information, neatly stacked and delivered for our consumption, about almost anything in the world, real or virtual, with a thousand more views and opinions thrown in. It even includes how-to procedures and instructions for almost any activity one may need to carry out on a day-to-day basis.
Well, here is the twist. There are so many books written on ‘how to become a millionaire in x easy steps’, ‘how to get rich in y steps’ et al. If one could follow those instructions and procedures and become rich, why is everyone not rich yet? Loads of spiritual literature tells us the path to nirvana – reading them and getting there should be a piece of cake, right? There is all the literature in the world on how to build ships, cities, nuclear reactors, particle accelerators, jet aircraft, social networks, recipes for the best spreads, smartphones, steel making – name the thing! If reading these could have helped someone create them, do so at optimal cost and find the right markets for them, the world would not be what it is today. Here is why…
Structured data is information. Information doled out meaningfully to help analysis and understanding of what the information is about, provides intelligence. Intelligence, in this sense, has the encapsulated knowledge of the person(s) who designed this pre-defined view of the information to make it meaningful in a given context. All of this could still be potentially available to anyone who buys that intelligence, since it has been built into a product that delivers that intelligence.
Knowledge is in the interpretation of the data, information and intelligence that can be put to useful ends. In a business context, it is that crucial element of understanding that helps trigger and enable concrete action to produce tangible and viable results. This needs people and their intellect at the core, and, perhaps a structured approach to how the analysis, hypothesis and resultant interpretation are arrived at. The approach can be likened to the purpose-hypothesis-experiment-observation-analysis-inference cycle in a laboratory experiment context, for instance. Knowledge management deals with the people, methodology, processes, frameworks et al that are required to drive that all-important interpretation-to-action cycle.  
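To make the layering concrete, here is a minimal, hypothetical sketch in Python. The sales records, the regional roll-up and the target comparison are invented purely for illustration; the point it tries to make is that the first three layers can live in code and products, while the fourth, by the argument above, cannot.

# A minimal illustrative sketch (all numbers hypothetical) of the four layers
# described above: data -> information -> intelligence, with knowledge left
# as the human act of interpretation and action.

# Layer 1: data - raw transaction records
sales_records = [
    {"region": "north", "month": "Jan", "revenue": 120},
    {"region": "north", "month": "Feb", "revenue": 95},
    {"region": "south", "month": "Jan", "revenue": 80},
    {"region": "south", "month": "Feb", "revenue": 140},
]

# Layer 2: information - the same data, structured so it can be read and understood
revenue_by_region = {}
for record in sales_records:
    region = record["region"]
    revenue_by_region[region] = revenue_by_region.get(region, 0) + record["revenue"]

# Layer 3: intelligence - a pre-defined view that encapsulates someone's earlier
# knowledge of what is meaningful in this context (here, a comparison against a target)
target_per_region = 200
intelligence = {
    region: {"revenue": total, "vs_target": total - target_per_region}
    for region, total in revenue_by_region.items()
}
print(intelligence)

# Layer 4: knowledge - deliberately not in the code. Deciding why a region missed
# its target and what action to take is the interpretation-to-action cycle that
# resides in people, which is the point made above.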
Often, the world has been so obsessed with and mired in the information management ecosystem that it has blundered into mistaking it for the knowledge management ecosystem. When you see knowledge management frameworks in organizations – very mature ones in some cases – what you actually see is a framework for the submission, storage, retrieval and sharing of information and some static intelligence (the how-tos, procedures, work instructions et al in the best-case scenario). The content of such knowledge management systems is hardly unique except for the name of the organization, and could easily be found in similar forms (if not the same) in the public domain.
The knowledge that resides in an organization, and by virtue of which the organization differentiates itself in the marketplace, is (or rather ought to be) the total intellectual capital of the organization (not just what is patented or copyrighted, and not confined to the final product or service only!). It is the most valuable asset of an organization. Since it largely resides in people, and in the ability of the KM framework to facilitate structured interpretation-to-action cycles, it is a humongous challenge to value this asset precisely. So you do not see it in the balance sheet, or any other financial statement for that matter. The importance and criticality of managing this asset is slowly gaining ground, and organizations are investing in knowledge management teams and a Chief Knowledge Officer. There has never been a more compelling argument for the existence of any other role in the core group of an organization than this one.
Some businesses might think that they aren’t much of a ‘knowledge’ organization and that the nature of their business does not need knowledge management in the elaborate way described above. The fact that an organization differentiates itself from other players in the same space, and does business in an ecosystem of internal and external stakeholders through its own signature products and/or services, inherently means that there is knowledge in the organization. It is up to organizations to acknowledge that, and to appreciate the fact that the least tangible of their assets is the one that truly helps build the rest of the tangible ones.


Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Thursday 6 September 2012

The POWER age!


The world and human history have often been viewed as being made up of various eons, eras, periods and ages. In the very distant past, the view of the world and of different periods in history was largely geological. So we had the Hadean, Archean, Proterozoic and Phanerozoic Eons. The last of these Eons comprises three eras – the Paleozoic, the Mesozoic and the current, Cenozoic Era. The Mesozoic Era, made up of the Triassic, Jurassic and Cretaceous periods, spanned 250-65 million years ago and was ruled by the dinosaurs. The Cenozoic Era is divided into the Paleogene, Neogene and Quaternary periods. The Quaternary (the beginning of which dates back to around 2.6 million years ago) is further sub-divided into the Pleistocene and Holocene stages, the latter being the current one we live in, approximately 11,700 years old. The Quaternary also marks the period of the ascendancy of the human race and its eventual dominance leading to the current day.
I would like to bring into focus this period of human history, in which each phase or stage was defined by one element or another that determined the prominent or dominant force (and hence who would have the advantage of that dominance), changed the natural order of things and paved the way for a new order. I do so with a view to taking a sneak peek into what the future holds in store for us.
From the time living beings came into being on earth until the beginning of the Quaternary period, the physically fittest and strongest being was simply the dominant one and ruled that era. The Quaternary also roughly coincides in time with what we call the Stone Age. The Stone Age refers to stone being used as a tool that helped the human race dominate other beings and extract the maximum out of the environment – hence the use of tools for hunting to start with, and for farming and agriculture, building structures for living and so on by the early stages of the Holocene.
This was followed by the Copper, Bronze and Iron ages, the nomenclature deriving from the specific metal or alloy used during that phase for building tools, implements and weapons, if you please. The first ones to discover these elements and invent tools that could put them to use far more effectively than the primary element of the previous era gained ascendancy and dominated that era.
Fast forward to the past 200-plus years and we had the advent of machines. The folks and civilizations that invented these gained prominence and dominance over the rest. There are other references in public-domain literature to the Industrial Age, the Oil Age, the Atomic Age and, most recently, the Information Age.
The point worth noting is that each of these ages is named after the element, the usage of which resulted in power. The power to determine the new order of things, the power to do different things, the power to do things differently and to determine what the world would do and how it would do it. So the Stone Age, Metal Age, Industrial Age, Information Age and so on. When the have-nots were doing things manually, the haves would use machines. When the machines needed fuel, Oil changed the pecking order. Nuclear capability determined the pecking order for decades. When one playing field was leveled, another was discovered to create the imbalance. The Information Age, though acknowledged widely, is probably defined by the subtlest element of all the ones that determined the period and where the power would reside.
As you would have noticed, the ages are not all mutually exclusive and sequential. There are overlaps and there are periods when many might co-exist.
So, what’s next? What are the elements that will determine the new pecking order? Which elements are likely to bring power and prosperity to those who possess them and change the dynamics of how the world currently functions?
Well, the power is likely to be in power. Literally. With most of the current sources of energy and power coming from exhaustible sources, and the consumption of energy perpetually heading north, this is inevitable. To give a sense of the size of the energy economy: the total energy consumption of the world was around 12,275 mtoe (million tonnes of oil equivalent) in 2011. The total electricity generated was around 22,000 TWh (terawatt hours), which from a calorific-value-equivalence standpoint is around 1,835 mtoe. The approximate size of the energy economy is around $8.7 trillion, or ~12% of the world economy. Only around 8% of this is from renewable and hydro sources. The rest is technically exhaustible (primarily coal, oil, gas and nuclear), though how long it will last is a moot point. Having said that, the cost of extraction is likely to go up exponentially after a point for the exhaustible sources.
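For those who like to check the arithmetic, here is a rough back-of-the-envelope sketch in Python of the figures quoted above. The 11.63 TWh-per-mtoe conversion is the standard calorific equivalence; any small difference from the numbers in the text is down to rounding.

# Back-of-the-envelope check of the energy figures quoted above (2011 values as cited).
# Assumed conversion: 1 mtoe is roughly 11.63 TWh on a calorific-value basis.

total_primary_energy_mtoe = 12_275      # world primary energy consumption, 2011
electricity_generated_twh = 22_000      # world electricity generation, 2011
renewable_and_hydro_share = 0.08        # share quoted above

TWH_PER_MTOE = 11.63
electricity_in_mtoe = electricity_generated_twh / TWH_PER_MTOE
print(f"Electricity in oil-equivalent terms: ~{electricity_in_mtoe:,.0f} mtoe")
# Prints roughly 1,890 mtoe - the same ballpark as the ~1,835 mtoe quoted above.

exhaustible_mtoe = total_primary_energy_mtoe * (1 - renewable_and_hydro_share)
print(f"Technically exhaustible sources: ~{exhaustible_mtoe:,.0f} mtoe, "
      f"i.e. {1 - renewable_and_hydro_share:.0%} of total consumption")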
There are quite a few other elements – like everything in the healthcare space, from medicines and pharmaceuticals to infrastructure and equipment, data and doctors – that would have a crucial role to play in shaping the future and gaining eminence; or, for that matter, all elements critical to subsistence, like food and water. These always were and always will be evergreen elements, but their very nature keeps them from assuming the pride of place of being the element that makes the difference and defines an age or era. Also, as mentioned earlier, there will be overlaps. The Oil and Atomic Ages shall continue. So will the Information Age, or for that matter the Industrial or Machine Age. My view is that (other than elements that contribute to total world energy production and consumption) these will have less and less currency and say in the new order.
Folks dominant in the Oil and Atomic Age paradigm will certainly be up in that new order. They have had, and will continue to have, at least for a while, a prime place under the sun. But that is going to be turned on its head the moment there is a commercially viable alternative energy choice amenable to mass production (and a lot of folks are focusing on the last word of the previous sentence for the answer!). And that isn’t too far in the future. There are several players who are just around the corner in that pursuit, have recognized this opportunity and are positioning themselves for the new order. And for a lot of the folks that are desperately holding on to a machine-age or information-age dream lasting a lifetime – well, time to wake up.

The power is finally back in the power. And the new order is emerging as we speak!


Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Friday 24 August 2012

The myth around niche.


The idea of a niche product is, by itself, slightly flawed. The intent of this post was certainly not to get mired in the meaning of the term niche. Nevertheless, I start there, with the intent of drawing upon the different definitions, which reveal some interesting connotations of how the term has come to be used in the context of products and companies. Here are some of the definitions (only the noun form) –
-          A shallow recess especially one in a wall to display a statue or ornament
-          A cranny, hollow or crevice, as in a rock
-          A position particularly well-suited to the person who occupies it
-          A situation or activity specially suited to a person’s interests, abilities or nature
-          A focused, targetable part of a market
-          A special area of demand for a product or service
The sequence of the definitions above tells us the rough evolution of the term over the years. It is evident from the very words and phrases used to describe it that the last few definitions are fairly recent.
Now let us take a look at the way the industry has used the term niche. We have niche players in Gartner’s Magic Quadrant, which attributes to companies in that category a relatively lower ability to execute and a lower completeness of vision with respect to a market segment. On a different note, it beats me why I should be talking about folks who have neither a complete vision nor the ability to follow a strategy through with action!
Somewhere in the entire scheme of things (read parlance and thought process) that the industry has gotten used to, every small player with a small footprint of product installations and customers is christened a niche player. Well, there is nothing wrong if a niche player is small. But there is everything wrong when you start talking about every small player as a niche one!
In this era in the evolution of organizations and businesses (why, even in our personal lives for that matter), the number of brick-and-mortar products available to cater to each aspect of the business, and the choices they offer, is, to put it mildly, large. Against this backdrop, it is somewhat surprising that the entire business application products space comprises far fewer large players than one would expect. A look at the evolution of these large companies reveals some really interesting facts.
Most of these products evolved, the first time around, from robust and comprehensive home-grown automation solutions for rather generic business processes. These were packaged and taken to the neighborhood market with the promise of a shorter time-to-market and a lower cost of implementation – the obvious advantages of a ‘second sale’. With the buyer community latching on to the idea, the footprint grew to an installed base of at least 5-6 and at best double-digit instances, and the market itself grew beyond the neighborhood and even across borders, aided by a globalization agenda.
In parallel, the number of acquisitions in the space grew dramatically and the smaller companies, with their products, were swallowed up. The core reason behind the consolidation was as much aspiration as economics. This is all very fine, because that is precisely the way businesses evolve. But what may turn out to be more interesting is to look at the company that implemented the product from the smaller (so-called niche) product company in the first place. This company is now pretty much at the mercy of the larger product company and its strategy to retain or retire the product and brand it acquired. And then there are the other niche product companies that haven’t sold out and continue to live with the ‘niche’ they have made for themselves. The niche, however, for these companies, has begun to take on the first two of the definitions presented at the beginning!
At the other end of the spectrum, the larger product companies, which started with less than 40-50% coverage of and hold on the entire spectrum of business processes that an organization typically has, have continually grown that coverage to over 70-75% through organic and inorganic expansion. Much can be said and debated about what the core competence of an organization is, and whether or not its customers would see a product or service differentiation if three-fourths of its processes are standardized and pretty much the same as the rest of the flock! But that is a slightly different point.
It is clear that a strategy built around the adoption of a niche product and company ought to be carefully evaluated against the long-term goals of one’s organization and the sustainability of the strategy over a longer time horizon. And if one does go down that path, it is well worth considering whether to keep open an option to transfer ownership of the product and its support to one’s own organization.



Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Tuesday 14 August 2012

Supplier collaboration in ICT a poor shadow of its brick & mortar cousin!


One school of thought recommends that when advice comes free (the connotation of free advice here and through the rest of this write-up is advice volunteered rather than explicitly called for, not a monetary view of free), you should think twice before you take it. Well, that’s what all of us were taught and that’s what most of us do. Now here is something to chew on – taking free advice is obviously not the same as implementing it blindfolded. Then what is the cost of taking free advice? It is the time you take to understand it and the time you take to evaluate it. If someone finds that far too expensive, it might be prudent to ask what they think they ought to be doing with their time.
None of us can think of an occasion when we had some funds to put aside in our personal lives and did not agree to meet up with someone who had an idea or two on where we should invest those funds to find the best returns. The more ideas, views and options, the merrier. Inexplicably, we do not wear the investor hat when carrying out business.
Just ask the people who invest in ideas whether they find the time spent listening to, digesting and evaluating an idea worth their while. In this era of carrying out business collaboratively across multi-partner ecosystems, ideation and innovation cannot be carried out in silos. There is nothing new about that thought, given that the likes of CPFR (Collaborative Planning, Forecasting and Replenishment in the supply chain) have been around for an eternity and supplier (read partner) collaboration is fast catching on, right?
Yes and no! The very folks that have made collaboration a reality within business ecosystems by providing the platforms and tools to make it work – and I mean the IT community – are where you would find the least partnership in ideation. I know there are a few organizations that have used their partner ecosystems in information and communication technology effectively to maximize ideation and manage their techno-commercial investments towards the realization of their business goals, but these are few and far between.
In fact, there are few planned, systematic, voluntary and periodic interactions in the ICT supply chain where ideas are sought from ICT partners on the basis of business data, challenges, issues and problems shared with them. Even where such interaction exists, more often than not it isn’t initiated where it ought to be. And even where it is, the focus is on the TCO theme and very little, if any, of it is on the CVC (Contribution to Value Creation) theme. The appetite and intent to ideate, contribute and make a difference is there – on either side of the stream. It’s time 'the partners' crossed the stream. It’s time they collaborated.
The cost of ideation, to the consumer of the idea, is insignificant compared to the potential value to be derived from it! Partner collaboration provides a unique opportunity for the ICT community to drive business behavior and set examples, for a change, rather than just being the passive enabler. This could well be the long-due impetus required in the business-outcome-alignment roadmap that is imperative for the next era of ICT growth.


Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Tuesday 7 August 2012

Big…Bigger…Biggest! - making sense of world data.

There have been numerous articles and publications on the volume of data, across analog and digital forms, that exists in the world and how it could be put to constructive use. The estimates are that the total volume of data stored by the end of this year will be in the 2.7 zettabyte range, growing at a rate of just under 50% year-on-year. For the uninitiated, a zettabyte is a trillion gigabytes (GB) or a billion terabytes (TB). Now that sure sounds like the next big business opportunity that everyone should latch on to, right?

The answer is yes, but with a note of caution. Here is why.

Over 90% of all the data that exists (and over 99% of the new data that is being or will be created in the future) is unstructured media data – video, audio and images. Take this out of the equation and we are still talking about around 220-230 exabytes of existing data, with another 170 exabytes being created through to 2017. Analog data, in one form or another, was around 6% of the 220-230 exabytes available in 2007. Since it is growing only infinitesimally, we will take total analog data as around 3.5% of all data on average and take that piece out of the calculation as well.

The new denominator is therefore in the 375 exabytes ballpark!

A closer look at this data reveals a whole bunch of realities that are mind-boggling and help put the numbers in the right perspective. We will, over the course of this discussion, make some assumptions to help the math, some of which may be incorrect (though no one would be able to prove it one way or the other, despite big data!), but which will not thematically challenge the hypothesis. Here’s one: the total amount of storage for non-user data (system and application software, for example) is assumed to be around a third of the total. That seems a fair assumption when you take the Forrester view that we will have around 2 billion computers in the world by 2015 and that the minimum each such device would need is around 50 GB just for the OS, office, security and networking tools. That throws a further 125 exabytes out of the window (no pun intended!) and leaves us with around 250 exabytes of user data!

Now that media files and system/application software are out of the equation, we will assume that 90% of the remaining data is corporate in nature (the other 10%, or 25 exabytes, is not necessarily personal data – it could, for instance, be data generated in office tools, mail data and so on). I cannot think of a corporate that does not back up its data, so the minimum redundancy at the transaction-data level itself would be 50%. Take that out of the equation, because the same data analyzed twice over would not yield any greater intelligence than analyzing it once! That removes 45% of the total data we started with at the beginning of this paragraph. Also, most organizations of a size where data volume matters for this calculation would have a data warehouse, an operational data store or at least a bunch of denormalized data stores, introducing a further 50-67% redundancy. That translates to another ~30% of the data not to be considered in the equation. This leaves us with 25% of the 250 exabytes of data, or ~60 exabytes.

A significant part of the non-corporate data (the 25 exabytes) is essentially generated and consumed with a view to sharing and communicating with other data/information stakeholders. I cannot think of someone creating a file locally, sending it to someone for consumption and then diligently removing the redundancy of storing it both in their file system and again in their mail folders. The same happens at the receiver’s end, and not all communication is one-on-one; I cannot think of communication that ends with the first receiver either! Assuming a two-step average communication and a storage redundancy of 50% at each of the 3 players, we are talking of a mere 16% of the data being unique. It would be significantly less for personal data, given how little of what you send in your mails is stuff you actually generated – think about the number of forwarded mails and messages in all your personal mailboxes and accounts! Giving the benefit of the doubt and taking a conservative 20% of the non-corporate data as unique, this part contributes only 5 exabytes, taking the total data we can derive intelligence from down to 40 exabytes.

Assuming there is not much information worthy of analysis in data that is over 5 years old, the amount of data we are likely to deal with over the next few years, with a view to extracting intelligence and driving decision-making, is effectively closer to 30 exabytes. Now, to put this into perspective, half of this data would already be residing in transaction or operational systems and the rest in one form of data warehouse or another. That leaves around 15 exabytes of data, or around 15 million terabytes. Imagine that in the context of your own estimate of the number of existing relational and dimensional data stores and of the average size of an installation.
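For readers who want to replay the arithmetic, here is a rough Python sketch that chains the estimates above. Every percentage and every exabyte figure in it is an assumption or round number taken from this post, not a measured value; the point is the order of magnitude, not precision.

# Replaying the back-of-the-envelope estimate above. All figures are the rough
# assumptions stated in the text; 1 zettabyte = 1,000 exabytes, 1 exabyte = 1,000,000 TB.

total_data_eb = 2_700                    # ~2.7 ZB of stored data, end of 2012
non_media_digital_eb = 375               # after removing unstructured media and analog data

non_user_eb = non_media_digital_eb / 3   # ~1/3 assumed to be OS / system / application software
user_data_eb = non_media_digital_eb - non_user_eb        # ~250 EB of user data

corporate_eb = 0.90 * user_data_eb       # ~225 EB assumed corporate
non_corporate_eb = 0.10 * user_data_eb   # ~25 EB assumed non-corporate

# Corporate redundancy: backups (~45% of the user data) plus warehouses,
# operational data stores and denormalised copies (a further ~30%)
after_corporate_redundancy_eb = user_data_eb * (1 - 0.45 - 0.30)     # ~25% left, ~60 EB

unique_non_corporate_eb = 0.20 * non_corporate_eb        # ~5 EB assumed unique

analysable_eb = 40    # figure the text arrives at after the two adjustments above
recent_eb = 30        # keeping only data less than ~5 years old
not_yet_warehoused_eb = recent_eb / 2    # ~15 EB, i.e. ~15 million TB

print(f"User data: ~{user_data_eb:.0f} EB; after corporate redundancy: ~{after_corporate_redundancy_eb:.0f} EB")
print(f"Unique non-corporate data: ~{unique_non_corporate_eb:.0f} EB; analysable total: ~{analysable_eb} EB")
print(f"Recent and not already warehoused: ~{not_yet_warehoused_eb:.0f} EB "
      f"(~{not_yet_warehoused_eb * 1_000_000:,.0f} TB)")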

That should help put Big, Bigger and Biggest data into perspective. The need is to be pragmatic about the real volume of data we are dealing with, how effectively we can use the intelligence derived from it and put it to profitable use – and, therefore, how far it really pushes the current data paradigm!


Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Thursday 2 August 2012

Back to Basics – the CIO rediscovered.


Those were the times! The majority of the CIO’s time – and, in turn, the behavior CIOs drove within their IT organizations – was focused on the next big possibility for the business to expand. What was hitherto not possible was being made possible thanks to advances in information and communications technology. From green-field automation for increased capacity and throughput to enabling the globalization of economies and business operations, the CIO organization was a critical enabler of business growth and expansion. It did so with a never-seen-before speed to market that made mega-businesses nimble-footed and agile from strategy to execution. And in the entire scheme of things, the fact that it did so at increasingly lower cost was an added plus.

This was, arguably, the golden phase when the ICT spend of organizations leaped from the fractional existence of yore, aligned to running data processing departments (some old dinosaurs will remember this), to the single-digit percentages that are the norm today.

Somewhere down the road, however, the original purpose, though not entirely lost, got sidetracked. The entire emphasis started moving towards optimizing this spend. This was hardly surprising, given that ICT spend was now a meaningful percentage of the total cost of running the business and was being viewed exactly like every other enabling or support function not directly involved in the larger cause of delivering the service or product the business exists for. Instead of being treated like an R&D, business innovation or value engineering function, it was being likened to business-enabling functions like HCM, F&A or MRO. In fact, in many businesses, the CIO function was being rolled up or aligned into the CFO agenda.

The CIOs and their organizations, ever so imperceptibly, started optimizing their resource supply chains to source globally, focusing on TCO optimization projects that cut the cost of operations and sustenance (run the business, keep the lights on, whatever you want to call it!), and even the little bit of money that was indeed being spent on new initiatives started going to areas like governance, security, risk, compliance et al. Nothing wrong with that, unless that is the only thing on your agenda! In the midst of all this, the cost of storage, processing power and communication dipped exponentially, making this an extended comfort zone for optimization initiatives.

Life, proverbially as well as in reality, comes full circle. Any bit of optimization today based on the overarching themes of the previous paragraph, and indeed of the past decade and a half, would ring in incremental benefits too small to sustain the interest of businesses in pumping in investments with the same enthusiasm and vigor as in the past. There are some exceptional applications of those themes that still pack a punch, especially where the order of magnitude of the requirement along those dimensions is humongous. But these are a minority in the larger context of this discussion and, well, prove the rule anyway.

This has necessitated a revisit of the objectives and the larger purpose of the CIO organization, perhaps even to the extent of rechristening the function and the roles within it. Like any change of this nature and magnitude, it will be evolutionary and may play out over a good part of this decade. But it is inevitable. An inward focus on ICT optimization would be a self-centered and self-defeating strategy. The urgent need is to realign to the business expansion and growth agenda – and what better time to embark upon it than now, when businesses and the larger world economy are at their lowest in decades.

It is a welcome ‘back to basics’ for the CIOs, and my take is that the vast majority of them and their larger teams are resilient and fully capable of rediscovering their true roles. It is only their acknowledgement of this situation, and the urgency with which they act on it, that will separate the boys from the men.

Note: The views expressed here and in any of my posts are my personal views and not to be construed as being shared by any organization or group that I am or have been associated with presently or in the past.

Tuesday 17 July 2012

Sounds like a no-brainer to invest during troughs and reap dividends during peaks?

There is enough historical and mathematical evidence to prove that return on investment is maximized when the investment is made at the troughs of the business cycles instead of at the peaks. Mathematically, this must sound fairly obvious considering that returns come from business benefits that are directly or indirectly a function of the business volume.
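A toy sketch, with entirely hypothetical numbers, of why the math works out that way. The assumptions are loud and simple: business volume follows a stylised cycle, an investment takes a couple of periods to come on stream, and the benefit it then yields each period is proportional to the volume in that period.

# Toy illustration of investing at a trough versus a peak (hypothetical numbers).
# Assumptions: volume follows a simple cycle, the investment needs a lead time of
# two periods before it delivers, and each period's benefit is proportional to volume.

volume_cycle = [80, 90, 100, 110, 120, 110, 100, 90, 80, 90, 100, 110]

def total_benefit(invest_at, lead_time=2, benefit_periods=3, benefit_per_unit_volume=1.0):
    start = invest_at + lead_time
    return sum(benefit_per_unit_volume * v
               for v in volume_cycle[start:start + benefit_periods])

trough, peak = 0, 4                       # indices of a trough and a peak in the cycle above
print("Benefit when invested at the trough:", total_benefit(trough))   # 100 + 110 + 120 = 330
print("Benefit when invested at the peak:  ", total_benefit(peak))     # 100 +  90 +  80 = 270

The same capacity rides the upswing when committed at the trough and comes on stream into the downturn when committed at the peak; add the lower asset prices typical of a trough and the return-on-investment gap only widens.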
Why then do some parts of the investor and business leadership community get overly obsessed with bottom-line improvement in a downturn? When there is a dire need to generate cash on a day-to-day (ok, that was overkill, read short-term) basis – either because of an intent to cash out at the earliest, or because you have already made the mistake of investing at a peak – you are likely to have no choice but to focus on the bottom line. The latter is a reality you can live with if you have a long-term vision and a sound strategy to grow volumes faster than the market on a revival; the former could be a real danger if the underlying reason is that you have run out of ideas to grow volumes faster than the market even on a revival.
You would be surprised just how many people and organizations demonstrate this behavior and what their underlying reasons are. I recommend trying this hypothesis out in your own ecosystem or with businesses you watch closely and see the results for yourself.

Friday 13 July 2012

I am extracting some posts from my blog http://saysmuraliaboutlife.blogspot.in/ that are more relevant in a business scenario than otherwise. Here's one...

THE IMPORTANCE OF GREY
The first and most important step towards building consensus in a multi-party scenario is to establish the presence of grey. If the view remains binary up to the point a decision has to be made, the only way to decide – if the process is to be kept democratic – is to have the majority prevail. That is often the worst outcome in a multi-party discussion or negotiation, since all the parties in the minority are then given infinitely more time to prove the decision wrong, and that too while carrying out the action that follows the decision!