Originally published on Forbes’ website by Ed Stacey, Managing Partner.
The gap between the promise of Artificial Intelligence (AI) and its implementation in practice has never been greater than it was in 2020.
There were clearly some major milestone AI achievements last year. Take Google DeepMind’s AlphaFold, which was shown to accurately predict 3D models of protein structures, paving the way for groundbreaking research across every field of biology. Or in June, when OpenAI publicly released a beta version of GPT-3 – an incredibly sophisticated model capable of almost any language task, including writing in the style of Chaucer and even basic coding. Yet outside of these tech giants, AI adoption remains in the exploratory stages for the vast majority of enterprises and a long way off becoming an integral part of day-to-day business.
At this point in the AI adoption cycle, many enterprises are in an untenable position as long as they fail to appreciate the enormous potential of embedding machine learning into their products and business processes. To remain competitive, businesses have to recognise the unquestionable value that becoming AI-enabled brings. Yet despite this pressing need, most current AI projects fail: roughly 80% never reach deployment, and those that do are profitable only 60% of the time. Appen’s recent report, State of AI and Machine Learning 2020, found that nearly half of respondents felt their companies were behind on their AI journeys. This raises the question: what is holding them back? What is preventing AI from being embedded into every use case, in every enterprise?
In many senses, the AI industry is still in the ‘craft’ stage of development, with expert data scientists and software engineers hand-tailoring systems to match just a few of their organisation’s specific use cases. AI hasn’t yet developed the deep and flexible ecosystem, or supply chain, that any foundational innovation requires in order to scale. That means reliable suppliers of a full range of specialist tools for building and scaling AI systems, plus interoperable industry standards that ensure these tools work together seamlessly.
However, the AI supply chain is evolving fast, and 2021 could be the inflexion point at which a majority of AI projects finally start to succeed. For example, platforms such as Amazon SageMaker can be used to build, train and deploy machine learning models for almost any use case. Others, such as Microsoft’s Azure Machine Learning Studio, combine no-code and code-first experiences in an inclusive data science platform. Both are attempts by the cloud vendors to complete the AI supply chain – provided an enterprise is willing to migrate its data into the cloud (a transition recently made easier by the likes of Snowflake).
Innovative startups are also filling this gap, often with even more advanced functionality and less lock-in than the cloud vendors. Many of them address key supply chain issues, such as seamless access to data. This has been a significant bottleneck in the industry, but we’re now seeing data of all kinds made available, whether structured or unstructured – both for training models and for use in production. To improve this further, there needs to be better collection and management of unstructured data in particular, which traditional systems have overlooked.