The 2020 prediction season is like no other. This year has been a financial and emotional disaster for many, and 2020 has tested us in ways we hope no year ever will again.
Even as a recent Deloitte survey showed that 77% of CEOs reported that the COVID-19 crisis accelerated their digital transformation plans, attempting to predict anything at the moment seems almost unconscionable.
There are still too many questions: “How will we all recover from 2020?”, “How quickly will we return to normal?” and “What will ‘normal’ even look like?”
This is why, in this post, I look at two trends that I think are worth stepping back for and taking a long-range view of: “Augmented Intelligence” and “Composable and Intelligent Applications”.
The case for “Augmented Everything”
In late October 2020, Gartner published its “AI Radar”, a research tool that highlights 24 AI-related technologies bound to affect our future. You’ll find all sorts of predictions, ranging from fully autonomous cars to smart bio-enhancements. The most disruptive one, in my opinion, is the “Composable and Intelligent Applications” trend, which Gartner predicts won’t happen for another 6-8 years. I’ll explain below why you should look into this trend: I predict it will materialize faster than anticipated.
First, let’s look at trend #1: Augmented Intelligence.
This trend refers to the optimization of practices through the intelligent use of automation with Artificial Intelligence (AI). Take the process of data preparation: your people spend a disproportionate amount of time finding, assembling, and cleaning data before they can do anything with it. The process is usually very manual, error-prone, and lengthy. Yet these tasks are repetitive and seem, for the most part, fairly easy to automate.
It makes sense, then, for software to inspect data, derive patterns from it, and apply intelligent resolutions to it. For instance, if an algorithm detects that a portion of the data is sensitive (credit card numbers, for example), it should automatically obfuscate it. If an algorithm detects that physical addresses are bunched up in a single column, it could determine that separating the data across multiple columns would simplify analysis later on. And as it separates the data, if it notes that specific records are missing information, it could enrich the addresses by dynamically looking up external datasets: a great example of such a use case is auto-completing ZIP codes based on street addresses.
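To make the first of these steps concrete, here is a minimal sketch of automated sensitive-data detection. Everything in it is an illustration, not any vendor’s implementation: the card-number regex, the 80% threshold, and the masking format are all assumptions.

```python
import re

# Pattern for 16-digit card numbers, optionally grouped by dashes or spaces.
CARD_RE = re.compile(r"^\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}$")

def mask_sensitive(records, threshold=0.8):
    """Obfuscate any field where most values look like credit card numbers."""
    if not records:
        return records
    sensitive = set()
    for field in records[0]:
        values = [str(r[field]) for r in records if r.get(field) is not None]
        hits = sum(1 for v in values if CARD_RE.match(v))
        if values and hits / len(values) >= threshold:
            sensitive.add(field)
    # Keep only the last four digits of any sensitive field.
    return [
        {f: "****-****-****-" + str(v)[-4:] if f in sensitive else v
         for f, v in r.items()}
        for r in records
    ]

records = [
    {"name": "Ada", "card": "4111-1111-1111-1111"},
    {"name": "Grace", "card": "5500 0000 0000 0004"},
]
masked = mask_sensitive(records)
# masked[0]["card"] → "****-****-****-1111"; the name field is left untouched
```

The point of the threshold is that the algorithm decides column-by-column rather than value-by-value, which is exactly the kind of judgment we currently ask humans to make during data preparation.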
If such a scenario seems futuristic to you, believe me, it’s not: the idea of “augmented” was popularized by Gartner’s Rita Sallam in mid-2017. The “augmented everywhere” trend has since taken flight, and an increasing number of vendors and industry analysts have pointed to its impact. Sure, there have been some skeptics who feared that “augmented” would dangerously change our relationship with algorithms. Some have proposed that “augmented” could work both ways: machines could “augment” human tasks with speed and efficacy, while humans could “augment” machine tasks by applying judgment and auditing for bias.
Yet, as recently as last month, BARC, a European research firm, found that “augmented” was actually being implemented far less than everyone expected. Baking Artificial Intelligence into everything we do is proving more challenging than anticipated. But while we can question how quickly “augmented” will be adopted, it has become difficult to believe that it will not happen.
Your IT environment is an ecosystem, and too many adjacent trends are bound to make “augmented” a sticky concept: the cloud makes it easy to access the computing power needed to process complex tasks at scale, and it makes it affordable to find reliable external information to complete your data.
According to Gartner, in the next two years, public cloud services will be essential for 90% of data and analytics innovation. And in less than four years, Gartner believes that 75% of organizations will have operationalized AI, driving a fivefold increase in streaming data and analytics infrastructures.
Which brings me to my second trend: Composable & Intelligent Applications.
In the previous section, we looked at how AI can intelligently respond to tasks within the context of a use case: we talked about data preparation, but we could have referred to applications like CRM or ERP. Indeed, CRM applications are a packaged, repeatable way our industry has decided to “box in” a set of related tasks or jobs. Over the many decades that such systems have existed, our industry has learned to make them faster and smarter.
One such trend has been the introduction of “intelligent applications”: the infusion of AI inside a known business process. Take, for instance, the use of AI to assess your leads’ propensity to buy. Before intelligent applications, CRM users had to come up with their own methods to predict how receptive a particular prospect would be to a given offer. Now, modern CRM applications automatically score and rank leads for users. Some even proactively propose outreach to prospects on the user’s behalf.
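As a toy illustration of what such scoring might look like under the hood, here is a simple logistic model. The feature names and weights are entirely made up; in a real CRM, they would be learned from historical won/lost deal data.

```python
import math

# Hypothetical weights a model might learn from past won/lost deals.
WEIGHTS = {"visits": 0.8, "emails_opened": 0.5, "demo_requested": 2.0}
BIAS = -3.0

def score_lead(lead):
    """Return a 0-1 propensity-to-buy score (logistic regression)."""
    z = BIAS + sum(w * lead.get(f, 0) for f, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def rank_leads(leads):
    """Rank leads so reps see the hottest prospects first."""
    return sorted(leads, key=score_lead, reverse=True)

leads = [
    {"name": "Prospect A", "visits": 1, "emails_opened": 0},
    {"name": "Prospect B", "visits": 4, "emails_opened": 3, "demo_requested": 1},
]
ranked = rank_leads(leads)
# Prospect B (demo requested, heavy engagement) ranks first
```

The value to the salesperson is not the math but the ranking: the machine does the repetitive scoring, and the human applies judgment to the top of the list.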
We can imagine such applications will continue to improve over the years. Yet, it might be premature to think that the future of our applications will continue to be defined by the monolithic constructs we invented for ourselves decades ago.
Look at the graph below. The Marketing Technology Landscape 2020 features 8,000 vendors. Scott Brinker, its creator, notes that when he started it in 2011, it listed only 150.
One could argue that this is because building narrow applications has become easier. That’s true. But what if the way we bucketed use cases within our boxed applications has run its course? What if the future of applications is, in fact, a set of services called upon based on our needs?
In other words, what if asking salespeople to use a CRM system to do their job in fact reduces their potential because the construct of our CRM systems has grown too limited?
What if the answer to a salesperson’s need is, in fact, better served by the consumption of hundreds of services, assembled on the fly by AI, maybe even ephemerally?
This, to me, is the potential that Composable and Intelligent Applications present us with.
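As a thought experiment, here is a toy sketch of that idea: a registry of narrow services and a composer that assembles an ephemeral “application” from whatever the salesperson needs at that moment. Every name here is invented for illustration, and the composer is deliberately naive; a real composition layer would resolve dependencies and select services far more intelligently.

```python
# Toy registry of narrow services, each doing one small job.
SERVICES = []

def service(provides):
    """Register a function as a composable service that yields one output."""
    def wrap(fn):
        SERVICES.append({"provides": provides, "fn": fn})
        return fn
    return wrap

@service("priority")
def score(ctx):
    return {"priority": "high" if ctx.get("demo_requested") else "low"}

@service("email")
def draft(ctx):
    return {"email": f"Hi {ctx['name']}, following up on your "
                     f"{ctx['priority']}-priority inquiry."}

def compose(needs, ctx):
    """Assemble an ephemeral 'application': run services until needs are met."""
    for svc in SERVICES:  # naive: registration order stands in for dependency resolution
        if svc["provides"] in needs:
            ctx.update(svc["fn"](ctx))
    return ctx

result = compose({"priority", "email"}, {"name": "Ada", "demo_requested": True})
# result["email"] → "Hi Ada, following up on your high-priority inquiry."
```

Notice that there is no “CRM application” anywhere in this picture: the salesperson expressed a need, and a bundle of small services was assembled, used, and could be discarded.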
Now, you might find that this trend is way out there for your organization, and that you won’t have to worry about it for years (Gartner predicts it won’t happen for another 6-8 years for many of us).
However, I will argue that you ought to look at it in 2021. In her latest Forbes piece, Betsy Atkins calls “Low Code, No Code” the most disruptive trend of 2021. I think she’s right. And I also think we are witnessing the beginning of the disaggregation of packaged applications as we have known them for the last two or three decades.
One of the biggest breakthroughs I have witnessed with digital transformation is that it has not only enabled organizations to modernize their approach, it has also enabled them to imagine things they couldn’t before. It has allowed them to look at problems differently. And it has allowed us to come up with different solutions, because we realized that the problems we thought we were solving in the first place were different from the problems we ought to be solving.
I believe that, in the next ten years, digital transformation will take us to an era where most of our tasks are handled intelligently and with more fluidity. The popularization of cloud and AI will accelerate this trend. And if, in 2021, low-code/no-code enables more humans to change the landscape of applications, it is easy to imagine algorithms doing the same next.
When modern systems and approaches let leaders do away with the hassle of infrastructure operation, imagination and innovation ensue. I hope that the digital transformation we experienced this year will carry us through the best nine years of this decade, into 2030.
Here is to innovation and the realization of your imagination!