In the past few years, digital transformation has proliferated across organizations, and sales is no exception. Especially after the onset of COVID-19, many sales organizations accelerated their efforts on this front. At first, this meant simply substituting digital and virtual channels for face-to-face customer interactions, but organizations have increasingly focused on more sophisticated approaches to their sales systems.
Sales organizations seek to design effective sales teams that account for face-to-face, digital and virtual channels; to coordinate customer engagement across those channels; and to manage sales representative performance, as well as hiring, onboarding, training and retaining the best sales talent. Indeed, digital is affecting every aspect of the sales system.
Much organizational energy is spent carefully planning and executing these systems. Put yourself in the position of an executive wanting to integrate AI into your sales process. AI systems can help sales representatives predict which actions are likely to satisfy a customer, which messages are likely to resonate and which customers may be receptive to cross-selling and upselling proposals.
Building an AI system capable of generating such recommendations to a sales representative is an involved process: We think about the data that’s required, build the right data pipelines, determine the best algorithms, ensure their accuracy, discover the best ways to deliver these recommendations to the representative via their CRM, think through the feedback loop and more. Without the planning, effort and precision required for these integral tasks, an AI initiative is unlikely to succeed.
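As a rough sketch of the final step, delivering a ranked next-best-action list to a representative could look something like the toy example below. The customer names, features and weights are all hypothetical illustrations; a real system would use trained models fed by the data pipelines described above rather than hand-set weights.

```python
# Toy next-best-action scoring step, assuming per-customer features
# already arrive from an upstream data pipeline. Everything here is
# illustrative, not a real model.

CUSTOMER_FEATURES = {
    "acme":   {"recent_engagement": 0.9, "days_since_last_call": 4,  "open_opportunities": 2},
    "globex": {"recent_engagement": 0.2, "days_since_last_call": 45, "open_opportunities": 0},
}

def priority_score(features):
    """Score a customer for outreach; higher means call sooner."""
    return (
        2.0 * features["recent_engagement"]
        + 0.05 * features["days_since_last_call"]  # stale accounts drift up the list
        + 1.0 * features["open_opportunities"]
    )

def recommend_calls(customers, top_n=1):
    """Return the top-N customers a representative should call next."""
    ranked = sorted(customers, key=lambda c: priority_score(customers[c]), reverse=True)
    return ranked[:top_n]

print(recommend_calls(CUSTOMER_FEATURES))
```

In practice the ranking would be surfaced inside the representative's CRM, and the feedback loop would log which recommendations were acted on so the model can be retrained.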
But effectively implementing such AI solutions may bring about unintended consequences, which also need to be thoughtfully addressed.
AI may now direct a salesperson to call on a certain customer at a certain time, undertake tasks relevant to that customer and orchestrate customer-facing actions across other channels. Do we, then, require a different kind of sales representative—someone digitally savvy? And if we enhance the effectiveness of all the other channels and shift more tasks to virtual and digital, do we require fewer sales representatives? In this new digital world, should we rethink how representatives get paid?
Let us examine incentive compensation in more detail. Key considerations in a sales representative's incentive compensation plan include pay level, pay mix and performance metrics. Pay mix reflects what percentage of a representative's pay is fixed (base salary) and what percentage is variable (dependent on performance). If the impact of a salesperson relative to other channels is low, or if individual contributions matter less to the overall sales strategy, then companies typically place a higher emphasis on base pay. In a sales system driven by companywide AI that is less reliant on individual decision-making, should sales representatives receive more of their compensation as base pay? Doing so would give the organization more predictable costs.
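The pay-mix mechanics can be made concrete with a toy calculation. The 80/20 split, target pay and quota-attainment figure below are hypothetical illustrations, not benchmarks.

```python
# Toy pay-mix arithmetic: total pay = fixed base + variable portion
# scaled by quota attainment. All figures are hypothetical examples.

def pay_for_period(target_pay, base_fraction, quota_attainment):
    """Total pay for the period, given the fixed/variable mix."""
    base = target_pay * base_fraction                 # fixed regardless of performance
    variable_target = target_pay * (1 - base_fraction)  # at-risk portion
    return base + variable_target * quota_attainment

# A representative with $100,000 target pay on an 80/20 mix who hits
# 90% of quota:
print(pay_for_period(100_000, 0.80, 0.90))  # 80,000 base + 18,000 variable = 98,000.0
```

Shifting the mix toward base pay (a higher `base_fraction`) narrows the gap between strong and weak quarters, which is what makes costs more predictable for the organization.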
As for performance metrics, how should representatives' success be measured in an AI-driven sales system? Typically, organizations use sales- or margin-related measures. In fact, experts discourage paying on activities; after all, an incentive plan isn't meant to replace performance management. But if an organization has developed models that successfully predict customer behavior and provide sales representatives with activity recommendations, should representatives be measured by how closely they adhere to those recommendations?
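An adherence measure of this kind could be as simple as the fraction of recommended actions a representative actually takes in a period. The sketch below is illustrative; the action labels and logging format are invented for the example.

```python
# Sketch of an adherence metric: what share of AI-recommended actions
# did a representative complete? Action names are hypothetical.

def adherence_rate(recommended, completed):
    """Fraction of recommended actions the rep completed (0.0 to 1.0)."""
    if not recommended:
        return 1.0  # nothing was recommended, so nothing was missed
    done = sum(1 for action in recommended if action in completed)
    return done / len(recommended)

recs = ["call:acme", "email:globex", "demo:initech"]  # from the AI system
logged = {"call:acme", "demo:initech"}                # from CRM activity logs
print(adherence_rate(recs, logged))                   # 2 of 3 actions taken
```

Whether such a metric belongs in the incentive plan itself, or only in performance management, is exactly the design question the article raises.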
If the incentive compensation plans are unchanged, there's a chance that AI-driven approaches may not succeed. For them to take root, AI recommendations must influence whom salespeople call, when they call and how they might engage their customers differently. AI integration—however technologically sound—would prove pointless if sales representatives failed to consider the recommendations, so it's important that they be incentivized to do so.
As this example illustrates, the focus must be on more than simply making the AI work. One must think through the shocks to the system, or whatever is being proposed won't be adopted.