Come on in and take a seat; we're ready to listen. This guide brings help for Salesforce Einstein, so hopefully you'll soon be thrilled with your results. In it, you will find common problems and how to diagnose and fix them.
First, let's go back to before you started with Salesforce Einstein. When you decide to use Einstein, there are factors to consider up front; if you don't address them early in the process, they can cause issues down the road.
Take a look at Guide 1 of this series, and see if you missed anything that is affecting you today:
- Understand Advanced Analytics, AI & Einstein
- Set Expectations for AI
- Define Clear Business Goals
- Prepare Salesforce Environment
- Do a Readiness Assessment
Also, did your process closely follow the checklist we share in Guide 2? You might find the root causes of dissatisfaction here:
- Create the core team
- Gather business and technical requirements
- Design the data architecture
- Identify data transformations and processes
- Define segmentation and personas
- Set up integrations
- Build, train and deploy models for each business outcome
- Create reports for insights vs. actions
- Establish ongoing maintenance program
Now, let's dig into common reasons why Einstein implementations fail to deliver on the promised dream, how to fix them, and how quickly you can get back on track.
The Salesforce Einstein analytics journey starts with goals. You know what you need to achieve, and you want Einstein to deliver on it. Maybe you want to increase revenue, improve profitability, boost lead conversion, or reduce customer churn.
When Einstein implementations fail, we've often seen that the project team never got agreement from all key stakeholders on goals, scope, and metrics. All stakeholders should agree on these from the beginning.
So even if you’ve already implemented Einstein, go back and take stock. Were clear goals set?
Here are tips to help:
- Limit the scope of your Einstein goal to one team or channel. This makes it easier to measure success. So for example, rather than set a goal to “reduce customer attrition to 5%” set a goal to “reduce customer attrition in warranties for commercial segment to less than 5% versus current 8%.”
- Measure results toward goals within 6 months. In this window, you can influence short-term goals like campaign results, sales rep productivity and lead conversion. If you set a longer timeline, other variables can affect the results. This raises the risk that measurements could be invalid or inconclusive.
It’s also important for your executive, business and technical teams to agree on the reasons why Einstein was not successful. Document lessons learned and align on the new goals.
Einstein's success can be undermined by a lack of accuracy or reliability. This kind of problem often starts with the data extracted from source systems and is then passed downstream into Einstein.
Data is like water flowing to your house from a source. Imagine if your municipality pulled water from a lake and piped it directly to your house. They don't do that, of course; it would be full of things you don't want in your water. Instead, they invest in treatment facilities to clean the water and deliver it safely. When there is a problem with the water, it is usually not in the delivery closest to you, like the faucet. The problem is further upstream, where the water comes from.
If you have accuracy and reliability problems, look at the source data.
If reports and dashboards become unreliable after setting up Einstein and mapping data to it, the data feeds could be coming intermittently. Re-check the sources, transformations, and ETL/migration processes. Confirm that there are no broken linkages.
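These re-checks can be turned into a simple pre-load audit. Below is a minimal sketch in plain Python; the record shapes and field names (`Id`, `AccountId`) are illustrative assumptions, not your actual Salesforce schema.

```python
# Hypothetical pre-load audit for an extracted feed.
# Field names and feed shapes are assumptions for illustration.

def check_feed(accounts, opportunities, min_rows=1):
    """Return a list of human-readable problems found in the feed."""
    problems = []

    # Intermittent feeds often show up as empty or truncated loads.
    if len(opportunities) < min_rows:
        problems.append("opportunity feed has too few rows")

    # Broken linkages: every opportunity should reference a known account.
    known_ids = {a["Id"] for a in accounts}
    orphans = [o for o in opportunities if o["AccountId"] not in known_ids]
    if orphans:
        problems.append(f"{len(orphans)} opportunity rows reference missing accounts")

    return problems

accounts = [{"Id": "001A"}, {"Id": "001B"}]
opportunities = [
    {"Id": "006X", "AccountId": "001A"},
    {"Id": "006Y", "AccountId": "001Z"},  # broken linkage
]
print(check_feed(accounts, opportunities))
```

Running a check like this on every extract, before the data reaches Einstein, catches broken linkages at the source rather than in the dashboards.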
Analytic model inaccuracies
AI and predictive analytics models can be inaccurate or inconclusive for a few reasons:
- The underlying customer segment changed
- The data are now different from when the models were built
- Your data science team lacks the skills to model accurately
In the chart below, we list issues that can be overlooked during implementation. These steps are often missed when setting up Einstein due to time pressure, limited resources, or gaps in technical ability.
- Duplication and redundant records: Install deduplication tools, and build more rigorous householding methods customized to your needs. Before loading to Einstein, centralize processing to resolve differences across sources and fields.
- Incomplete or inaccurate data: Create validation procedures. Append additional and recent data, and exclude unusable data.
- Poor prediction accuracy for models: Apply data preparation such as cleansing, transformation, and formatting to fine-tune the models.
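As an illustration of the validate-and-deduplicate step before loading, here is a minimal sketch in plain Python. The field names (`Email`, `LastModified`, `Revenue`) and the keep-the-latest rule are assumptions for the example, not Salesforce features.

```python
# Hypothetical dedup-and-validate pass before loading to Einstein.
# Fields and rules are illustrative assumptions.

def normalize_email(email):
    return email.strip().lower()

def dedupe(records):
    """Keep the most recently modified record per normalized email."""
    best = {}
    for r in records:
        key = normalize_email(r["Email"])
        if key not in best or r["LastModified"] > best[key]["LastModified"]:
            best[key] = r
    return list(best.values())

def validate(records):
    """Exclude unusable rows: missing email or negative revenue."""
    return [r for r in records if r.get("Email") and r.get("Revenue", 0) >= 0]

raw = [
    {"Email": "Ann@Example.com ", "LastModified": "2024-01-02", "Revenue": 100},
    {"Email": "ann@example.com", "LastModified": "2024-03-01", "Revenue": 120},
    {"Email": "", "LastModified": "2024-02-01", "Revenue": 50},  # unusable
]
clean = dedupe(validate(raw))
print(clean)
```

Real deduplication tools add fuzzy matching and householding rules on top of this, but the principle is the same: resolve differences in one central place before the data reaches Einstein.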
Next, we share the challenges that arise when converting from legacy systems. In addition, "band-aids" applied along the way can cause problems later.
Hand-offs between IT and business analytics teams are sometimes poorly coordinated. One team might add data or tables without informing the other, which can affect performance.
Here are ways to identify impact on performance, and how you can fix them:
- Too many processes and flows to manage: Create central staging tables to consolidate processes rather than feeding Salesforce directly. Streamline and remove redundant flows.
- Loading jobs are slowing systems down: Jobs are consuming more resources over time. Evaluate the latency, volume, and frequency of loading, then re-prioritize and restructure jobs to reduce the load.
- Data volume and granularity (row detail) increasing significantly: Consider external database structures such as Heroku or external objects.
- Unexpected changes in data feeds, reports, and results: Improve documentation and hand-offs. Check authoring rights, permissions, and ownership.
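Restructuring jobs to reduce load often comes down to splitting one heavy load into smaller batches. Here is a minimal, hypothetical sketch; the batch size and row contents are assumptions for illustration.

```python
# Hypothetical sketch: split one heavy load into smaller batch jobs
# so each job consumes fewer resources at peak.

def batches(rows, size):
    """Yield fixed-size chunks so each load job stays small."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

rows = list(range(10))          # stand-in for 10 records to load
jobs = list(batches(rows, 4))   # 3 jobs: two full batches plus a remainder
print(len(jobs), jobs[-1])
```

Scheduling these smaller jobs at different times, and dropping batches for feeds that rarely change, reduces contention with the systems your users depend on during the day.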
During implementation, Einstein must be designed to serve two purposes: delivering actionable insights to front-line Salesforce users, and empowering those users to execute on the recommended actions and alerts.
The following poor implementation practices can affect Salesforce users:
- Delays in getting refreshed, updated data. When the most recent or valid data is not available, users may stop trusting Einstein's output.
- Not assigning the right licenses or Salesforce permissions to users. Missing views and data due to permission restrictions keep users from getting the full benefit of Einstein.
- Actions and systems that are not connected. A good recommendation engine alone is not enough; the recommended actions must fit seamlessly into the user's daily tasks and activities in Salesforce.
Revisit your implementation to see if any of these are affecting your users.
In addition, users may feel that Einstein is not for their benefit, that it's yet another tool pushed from above to monitor them. Or onboarding may not have clearly explained the benefits or built enough confidence and clarity in what users are supposed to do. We've identified some solutions:
- Users don't understand why they should use Einstein recommendations: Show examples of users in your organization who successfully use recommendations and benefit from them. Share use cases and results from similar organizations.
- Users can't clearly see the benefits they achieved: Build a leaderboard or gamify adoption. Compare incremental revenue (or another goal metric) for users versus non-users.
- Not all data is readily available, so users must still work across multiple systems: Re-assess the data needed for key workflows from beginning to end, and add the missing data.
- After an enthusiastic launch, adoption wanes or is inconsistent: Continue to paint the long-term vision. Rally the team around a customer experience they can relate to. Show short-term wins toward the long-term vision.
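The users-versus-non-users comparison behind a leaderboard can be as simple as a cohort difference on your goal metric. A toy sketch in plain Python, with made-up numbers:

```python
# Hypothetical cohort comparison: average revenue for reps who act on
# Einstein recommendations vs. reps who don't. Numbers are invented.
from statistics import mean

reps = [
    {"name": "A", "uses_einstein": True,  "revenue": 130},
    {"name": "B", "uses_einstein": True,  "revenue": 110},
    {"name": "C", "uses_einstein": False, "revenue": 90},
    {"name": "D", "uses_einstein": False, "revenue": 100},
]

def cohort_lift(reps):
    """Incremental revenue per rep: users minus non-users."""
    users = [r["revenue"] for r in reps if r["uses_einstein"]]
    others = [r["revenue"] for r in reps if not r["uses_einstein"]]
    return mean(users) - mean(others)

print(cohort_lift(reps))
```

A positive lift, published on a leaderboard, gives skeptical users concrete evidence that the recommendations benefit them, not just management.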
Though you might not be entirely happy today, not all is lost!
As a starting point, go back to the basics of the initial vision and plan. Diagnose the cause of discontent across the “data – structure – users” chain. By doing this, you will identify weak spots.
Then focus on the most problematic area and make the corrections.
Re-start a pilot with limited scope, and openly share learnings. If you found success, clearly document it and then continue to roll out and gain adoption.
If the changes were not fruitful, move on to the next diagnosis and run the limited pilot process again. This is not necessarily a bad thing: when you make only one or two changes at a time, you can isolate the problem, which gives you a better chance of solving it.
It’s all worth it, because two things are certain in today’s business climate:
- Advanced Analytics & AI like Salesforce Einstein are proven to transform businesses and grow customer value.
- If you don’t invest and get it right, your competitors and the larger market surely will.