No matter how you shuffle the IT Vendor deck, the same cards stay up top

As part of an ongoing partnership with the good people at The Rust Report, I had the following article published yesterday. As usual, my aim was to question some of the things we take for granted; let me know what you think…

Recently I was reading one of those CXO surveys on ZDNet, using it to challenge my view of the IT marketplace.

These articles are often much the same. You can take them with a pinch of salt, but there’s always a degree of useful insight. Their main aim is to alarm you, and this one was no different: “the most favored vendors are likely to be reshuffled in the next two to three years…IT leaders will have to make bets very carefully.”

However, I ultimately found my existing view unshaken: The Big Four vendors of most relevance in the IT marketplace will continue to be Oracle, Microsoft, SAP and Google for the foreseeable future. Let me explain why…

The principal point of the article was speed of innovation: “don’t get locked into a vendor that won’t be able to be an innovation partner,” warns reputed journalist Larry Dignan. The premise is that the exciting and innovative companies to watch – such as Salesforce, ServiceNow and Workday – could be the new giants of the tech world because they are innovating faster and leaving the incumbents behind. But while this view is exciting and a compelling angle for the media, how realistic is it…really?

The enterprise IT landscape actually changes quite slowly when you think about it. Salesforce is 15 years old this year, and if you take the view that its founder Marc Benioff pioneered the cloud, then cloud adoption in the enterprise has been extremely slow – compared to, say, how quickly the consumer mobile phone market turned on a dime in the middle of the last decade. In that time, both Microsoft and Oracle have built impressive cloud offerings that now rival pure-cloud players like Salesforce and Amazon in terms of revenue. They might innovate more slowly, but that doesn’t make The Big Four any less relevant.

The article’s chart of CXO views of future vendor relevance is very interesting.

Source: ZDNet

With the exception of VMware and Cisco, the only vendors with a “more important” rating of 30 per cent or more are The Big Four: Oracle, Microsoft, Google and SAP! Furthermore, when you look at the reasons for rating these vendors “more important” in the future, ‘relevance’ is first at 70 per cent, ‘dependability’ second, and ‘innovation’ third at 51 per cent. That is because IT buyers instinctively know that these players will always add value, even if they don’t yet know what that value will be. On this point, the article makes one solid recommendation: “Think of your large vendors as a series of divisions that aren’t created equal. Some units may be worthwhile even as you resist the cross-sell and lock-in pitches.”

Oracle and Microsoft are excellent examples of this: The Big Four have extremely innovative and nimble business units within them, competing aggressively with their colleagues for customer wallet-share. The large players have deep pockets and can afford to experiment, and if they can’t innovate, they can acquire – as the rush of marketing automation software acquisitions in recent years has shown.

While these business units are less nimble than standalone startups because of big-company bureaucracy, they have two crucial advantages: they are far better protected from failure by cross-subsidies from more established product lines, and they have access to gold-standard customer focus groups that smaller players can only dream of. Just look at what is happening now. The once-nascent SaaS lines within Oracle now define its corporate strategy; HANA is driving much of SAP’s momentum; and Microsoft’s cloud business has just delivered the company its new CEO! Moreover, while Google’s cloud, search and apps products are the most prominent, its emerging robotics, wearables and AI divisions will drive the agenda for the next decade.

But as a customer, take heed of Mr Dignan’s advice: every time you sit down to renegotiate a deal on your traditional legacy systems and services, you play an important role in the success of these more innovative business units. The Big Four will perpetually leverage their commercial advantages to stay relevant at the innovative cutting edge, while continuing to deliver value at the legacy end of the spectrum too.

We need to focus on Data Basics before embarking on Big Data

I got an incredible response to this post on LinkedIn, with more than 3,000 views and 224 likes, but more importantly 27 comments. You can read the comments here at LinkedIn:

Here is the post in its entirety; I’d welcome more comments and discussion here as well…

With the proliferation of software-as-a-service applications across most organisations, it is likely that many organisations are suffering from a fragmented data environment. This is a problem because just at the time most organisations need to homogenise their data strategy to take advantage of Big Data, the opposite is happening: data decentralisation and even chaos.

In many cases, organisations have been focused on data storage rather than data quality. Just managing the relentless growth of data volumes over the last 15 years has been challenge enough for CIOs. Rapidly scaling data storage infrastructure – software and networking as well as hardware – has been overwhelming, and all too often the actual quality of the data has suffered. How many companies can genuinely claim that their databases were sound, that their CRM data was clean, and that the insane complexity of their spreadsheets was under control, let alone consolidated? The age-old adage “garbage in, garbage out” scales in severity with data volume.

Yet as data storage decamps to the cloud and the focus moves to Big Data strategies, it seems that data quality is still not a priority. I wonder if the industry – here in Australia as well as globally – is doing enough to develop human data skills, rather than relying solely on Hadoop et al to do all the work. I’ve written before on the disconnect between data technology and human data skills. There is a lot of talk about “Data Scientists”, but is that anything more than a fancy title for BI analysts?

Bona fide data scientists work like real scientists. They form a hypothesis, test it against different sets of data, and validate or disprove it. Then they look for correlation and causation, and they might come up with some real insight and a discovery. But in a commercial situation, the data scientist might invest a lot of time developing a hypothesis only to find that the data isn’t available, or is too messy to use. So what then? (It is worth reading this New York Times story on “Data Wrangling”).
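To make that hypothesise-test-validate loop concrete, here is a minimal sketch in Python of a two-sample permutation test. The checkout-flow scenario and all numbers are invented for illustration; the point is the shape of the process, not the specific analysis.

```python
import random
import statistics

def permutation_test(sample_a, sample_b, n_permutations=5000, seed=42):
    """Estimate a p-value: how often does a random relabelling of the
    pooled data produce a mean difference at least as large as the
    observed one?"""
    rng = random.Random(seed)
    observed = abs(statistics.mean(sample_a) - statistics.mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    hits = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            hits += 1
    return hits / n_permutations

# Hypothesis: the new checkout flow lifts average order value.
control = [52, 48, 55, 47, 50, 49, 53, 51]   # old flow, dollars per order
variant = [58, 61, 55, 60, 57, 62, 59, 56]   # new flow, dollars per order
p = permutation_test(control, variant)
print(p < 0.05)  # a small p-value is evidence against "no difference"
```

Notice that the whole exercise presumes the order-value data was captured cleanly in the first place; with missing or corrupt records, the hypothesis simply cannot be tested.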

Organisations need to work out – strategically and operationally – how to collect data appropriately, what data they need, and what they might need to look for. There are data scrubbing tools, deduping tools and analytical tools, but if the raw data is not in an appropriate state, or was never collected at all, there is obviously nothing to scrub or dedupe.
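As a small illustration of what even basic deduping involves, here is a sketch in Python of matching CRM records on a normalised key. The record fields and company names are invented; real tools handle far fuzzier matching, but the principle of canonicalising before comparing is the same.

```python
import re

def normalise(record):
    """Build a canonical key: lowercase the name and strip punctuation
    and whitespace, so cosmetic differences don't hide duplicates."""
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    email = record["email"].strip().lower()
    return (name, email)

def dedupe(records):
    """Keep the first occurrence of each canonical key."""
    seen = set()
    unique = []
    for rec in records:
        key = normalise(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

crm = [
    {"name": "Acme Pty Ltd",  "email": "sales@acme.example"},
    {"name": "ACME Pty. Ltd", "email": " Sales@Acme.example "},  # same company, messier entry
    {"name": "Widget Co",     "email": "info@widget.example"},
]
print(len(dedupe(crm)))  # 2
```

Even this toy version shows why data quality is a design decision: the dedupe only works because someone decided which fields define identity and how to normalise them.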

So it is crucial for CIOs to look first at their overall application architecture and map the data flows and how they integrate; then work out what insight the business might need and, operationally, what data is required and where it can be sourced. This isn’t difficult, but it requires formality and strategy rather than ad hoc evolution. The current trend of SaaS proliferation, with services bought ad hoc on the credit card at the departmental level, is haphazard and is making data increasingly difficult for CIOs to manage – not only because the data is decentralised across different clouds, but because there are now many different data models that are often difficult to access and complicated to understand.

If organisations want to truly benefit from the Big Data opportunity, there needs to be some sober and disciplined thinking about data analysis skills, data quality control and data strategy before the kind of frantic technology acquisition that the media and vendors promote. Otherwise we will get no closer to any kind of data optimisation than we are now – we will just create more data mayhem, and the return on investment will remain just as elusive.
