The Problem with AI Strategy

Eric Sandosham, Ph.D.
5 min read · Oct 22, 2023



Background

In my previous article, I wrote about Data Strategy and building Data Capabilities as the first part of a 3-part sub-series on how organisations should go about building their Data Analytics Capability (see diagram below). To recap, building data analytics capability is made up of Data Capability, Computation Capability and Translation Capability:

  1. Data capability — a thoughtful data instrumentation strategy and methodology to create re-usable information assets, with a priority on 1st-party data to maximise competitive value.
  2. Computation capability — advanced algorithmic solutions integrated with customer touch-points that optimise customer value through the estimation of preferences, risks and price sensitivities.
  3. Translation capability — data sensemaking abilities to understand information and its context to improve decision-making intelligence across the enterprise.

3 Pillars of Building Data Analytics Capability

This 9th article is part 2 (of 3), in which I unpack the building of Computation Capability.

When the world thinks about Computation Capability, it drifts almost inevitably towards Machine Learning (ML) and Artificial Intelligence (AI). Strictly speaking, ML is a subset of AI, but the world tends to think about the two topics quite distinctly — the former as a human-augmented statistical and computational procedure, and the latter as a more autonomous decision-making entity. Over the years, I’ve seen a lot of bad thinking and bad practices when it comes to building organisation-level computation capability. Let’s go through some of them.

AI/ML Algorithms

Firstly, organisations believe that building computation capability means investing in algorithmic superiority. They believe that the more complicated the ML/AI algorithms, the more analytically mature the business, and the greater its problem-solving ability. This is far from the truth. AI/ML algorithms do not equate to problem-solving; rather, they are part of a suite of optimisation solutions. They don’t provide insights or address the root causes of problems, but they do remove inefficiencies arising from uncertainty, error and execution friction.

As an aside, some organisations waste precious time and resources conducting research into proprietary ML/AI algorithms, thinking it will bring them competitive advantage. Unless you are in an industry that monetises research outcomes, you should not be spending ANY time developing proprietary AI/ML algorithms, as this is a multi-year and in some cases multi-decade endeavour. The purpose of AI/ML algorithms is to achieve one of two things: (a) to reduce errors in computation output and thus improve the reliability of decisions, or (b) to mimic the man-in-the-middle’s work so as to replace much of it, significantly reducing human resource cost and powering up scaling. Objective (a) can be done using off-the-shelf AI/ML algorithms, albeit that the items on the shelf grow with time. Objective (b) requires deep study of the nature of the work undertaken by the man-in-the-middle, which is typically multi-faceted, and figuring out how to encode for it (i.e. deep research) — a classic example would be self-driving vehicles. Note also that solving objective (b) isn’t solving for excellence but replicating average behaviour.
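To make objective (a) concrete, here is a minimal sketch of what "off-the-shelf" looks like in practice. It assumes Python with scikit-learn and a synthetic dataset (both my own choices for illustration, not anything prescribed in this series): a baseline model is swapped for a stronger library model, and the only question asked is whether the output error actually drops.

```python
# Minimal sketch of objective (a): reduce output error with off-the-shelf algorithms.
# scikit-learn and the synthetic dataset are assumptions chosen for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real decisioning dataset.
X, y = make_classification(n_samples=5000, n_features=20, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Baseline model versus a stronger off-the-shelf model: no proprietary research,
# just picking the item on the shelf that yields more reliable decisions.
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)
boosted = GradientBoostingClassifier(random_state=42).fit(X_train, y_train)

print("baseline log loss:", log_loss(y_test, baseline.predict_proba(X_test)))
print("boosted log loss: ", log_loss(y_test, boosted.predict_proba(X_test)))
```

The point is not these particular algorithms; it is that the error metric, not the novelty of the method, is what gets managed.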

AI/ML Talent

The other bad thinking / bad practice is that many organisations believe AI/ML talent is hard to come by, particularly in my neck of the woods, which is South East Asia. In my experience, you don’t need to look to India or China to grow your AI/ML talent, as open source has ensured that algorithms (and use-case datasets) are easily available and accessible to data analytics learners and practitioners in all markets. What I have found is that, with some patience and use-case exposure from experienced data analytics leaders, the talent from South East Asia is every bit as good as that in the more established markets, and typically at a fraction of the cost.

Once you’ve overcome the flawed assumption of a ‘talent shortage’, it opens up the real opportunity to site your AI/ML talent in the country of your choice. What you need is simply good mentorship.

Algorithmic Capability as Competitive Advantage

How should organisations go about creating competitive advantage with computation capability? I’m a big subscriber to the thinking of the eminent strategy consultant and academic Roger Martin. As part of his strategy methodology, he asks the questions: “Where to play? How to win?” Using those questions to anchor our thought process brings us to some simple truths: competitive advantage comes either from capturing revenues that your competitors are unable to, or from delivering your solutions at costs that your competitors cannot achieve. In the domain of AI/ML, the obvious opportunity is on the revenue side. To that end, we need to build computation capability that optimises customer value by reducing uncertainty in customer preferences, customer risk, and customer price sensitivity. While this is articulated as 3 simple areas, the work is anything but simple. Using AI/ML to reduce uncertainty in customer preference could mean building sophisticated recommendation engines, figuring out how customers want to be contacted, or predicting a customer’s affinity to your brand. This is all challenging work.
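To ground the preference example, here is a hypothetical sketch of a contact-channel propensity model. The data frame, column names and model choice are all assumptions made for illustration; a real build would use historical response data and far richer features.

```python
# Hypothetical sketch: predicting each customer's preferred contact channel.
# All data, column names and the model choice are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
customers = pd.DataFrame({
    "tenure_months": rng.integers(1, 120, n),
    "app_logins_30d": rng.poisson(5, n),
    "email_open_rate": rng.uniform(0, 1, n),
    "preferred_channel": rng.choice(["email", "sms", "push", "call"], n),  # historical label
})

X = customers[["tenure_months", "app_logins_30d", "email_open_rate"]]
y = customers["preferred_channel"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=7)

model = RandomForestClassifier(n_estimators=200, random_state=7).fit(X_train, y_train)

# Per-customer channel propensities would feed the contact strategy at the touch-point.
propensities = pd.DataFrame(model.predict_proba(X_test), columns=model.classes_)
print(propensities.head())
```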

What does reducing uncertainty in customer risk entail? It could include predicting the likelihood of customer disengagement, figuring out whether customers will become anti-advocates / detractors, or detecting when customers are exploiting certain features of your product. Finally, reducing uncertainty in customer price sensitivity would cover predicting customers’ non-proportional responses to a whole range of pricing promotions and options, across a stable of products. Again, challenging work!
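As a similarly hypothetical sketch of the price-sensitivity piece, the example below fits a simple response model of purchase probability against discount depth on simulated promotion data, so that the non-proportional response can be read off at different price points. In practice this would be estimated from historical promotion outcomes, per segment and per product.

```python
# Hypothetical sketch: estimating a non-proportional price response from promotion data.
# The data here is simulated; real work would use historical promotion outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 5000
discount = rng.uniform(0, 0.4, n)                       # discount depth offered (0% to 40%)
true_prob = 1 / (1 + np.exp(-(-2.0 + 8.0 * discount)))  # simulated take-up curve
purchased = rng.binomial(1, true_prob)

model = LogisticRegression().fit(discount.reshape(-1, 1), purchased)

# Predicted take-up at a few discount levels: the response is clearly not proportional to price.
for d in [0.05, 0.15, 0.30]:
    p = model.predict_proba([[d]])[0, 1]
    print(f"discount {d:.0%} -> predicted purchase probability {p:.2f}")
```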

All the outputs from these computations on customer preferences, customer risk and customer price sensitivity need to be networked to create mutually reinforcing knowledge, and integrated into customer touch-points to drive either customer-facing or employee-facing decision-making.

Conclusion

While data trumps algorithms, that does not mean we should not invest in good computational thinking and algorithmic sophistication, although I abhor complexity for the sake of complexity. But we must be clear about what we are solving for when building computation capability. We are solving for growth in customer value by reducing uncertainties (i.e. output errors) in decision-making processes based on customer motivations, intentions and behaviours. If we can do that consistently and continuously, we will be creating genuine competitive advantage.


Written by Eric Sandosham, Ph.D.

Founder & Partner of Red & White Consulting Partners LLP. A passionate and seasoned veteran of business analytics. Former CAO of Citibank APAC.
