With more than 50 million smart speakers in US homes, Einstein Voice was the new feature du jour at Dreamforce '18. Fully named Einstein Voice Assistant, the feature allows you to dictate to your smart speaker of choice or directly into the Salesforce mobile app as another way to interact with your org's data. Einstein Voice is slated to be a free product offering, entering pilot as part of the Winter '19 release.
Einstein Voice Assistant allows for:
Discovery: Ask questions like "How are my Campaigns performing?" and "What's my agenda look like today?"
Prediction: Make inquiries such as "How does my sales forecast look for next quarter?"
Recommendation: Be given product cross-sell and upsell recommendations based on related data.
Automation: Dictate notes, make record field changes, and create follow-up tasks.
Behind the scenes, three steps make Einstein Voice happen.
Step 1: Automated Speech Recognition ("ASR") - the "easy" part… transcribing spoken voice into unstructured text.
Step 2: Natural Language Processing ("NLP") - the "hard" part… understanding the meaning and context of what was said. This is what gives Einstein Voice the ability to extract company and people names from your commands. It also needs to extract dollar amounts with contextual meaning. If you say "the deal size is $50,000," Einstein Voice needs to know that you're referring to the Opportunity, particularly the Opportunity Amount and not other currency fields like ACV or discounted amounts. It also needs to understand lingo that is unique to your business, such as "V2MOM" and "QBRs" vs "SBRs."
Step 3: CRM Integration - pulling it all together, what should the language and context of your commands do within a CRM?
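The three steps above can be sketched as a simple pipeline. Everything here is a hypothetical illustration: the function names (`transcribe`, `parse_intent`, `apply_to_crm`), the stubbed transcription, and the naive regex are stand-ins, not Salesforce internals or APIs.

```python
import re

def transcribe(audio: bytes) -> str:
    """Step 1: ASR - spoken audio to unstructured text (stubbed here)."""
    return "update the deal size to $50,000"

def parse_intent(text: str) -> dict:
    """Step 2: NLP - extract the action, field, and value from the text.
    A real system would use entity recognition, not a single regex."""
    match = re.search(r"deal size .*?\$([\d,]+)", text)
    amount = float(match.group(1).replace(",", "")) if match else None
    # "deal size" maps to the Opportunity Amount, not ACV or a discount field
    return {"object": "Opportunity", "field": "Amount", "value": amount}

def apply_to_crm(intent: dict) -> str:
    """Step 3: CRM integration - turn the parsed intent into a record update."""
    return f"UPDATE {intent['object']} SET {intent['field']} = {intent['value']}"

print(apply_to_crm(parse_intent(transcribe(b""))))
# -> UPDATE Opportunity SET Amount = 50000.0
```

The value of the tight coupling is visible even in this toy version: step 2 is only useful because step 3 knows which CRM field "deal size" should land in.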
Here are two scenarios showing how these three tightly coupled steps come together in practice.
Scenario 1: Mobile App Usage
Imagine leaving an onsite Client sales meeting. You get back into your car and open Einstein Voice Assistant from the Salesforce Mobile app and say:
"The meeting went great. John wants to renew the service contract for 2 years and will sign next week. Update status to negotiation review. Update the anticipated close date to next Friday. Create a follow up with John for next Friday."
In this case, a new Note is added to the record containing everything up to the actionable commands. Einstein Voice then updates the Opportunity Stage to "Negotiation/Review" and the Close Date to next Friday's date. Finally, it creates a Task for you to follow up with John, the most likely John associated with the Opportunity, about receiving that signed contract.
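One way to picture the note-versus-command split in this scenario: sentences that begin with an action verb become commands, and everything else is saved as the Note. This is a simplified assumption for illustration, not how Einstein Voice actually segments dictation.

```python
# Hypothetical sketch: split a dictated message into a Note plus
# actionable commands. The splitting rule (leading action verbs) is
# an assumption made for this example.
ACTION_VERBS = ("update", "create")

def split_dictation(text: str):
    """Sentences starting with an action verb become commands;
    the remaining sentences are joined into the Note body."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    note, commands = [], []
    for s in sentences:
        (commands if s.lower().startswith(ACTION_VERBS) else note).append(s)
    return ". ".join(note) + ".", commands

dictation = ("The meeting went great. John wants to renew the service contract "
             "for 2 years and will sign next week. Update status to negotiation review. "
             "Update the anticipated close date to next Friday. "
             "Create a follow up with John for next Friday.")

note, commands = split_dictation(dictation)
print(note)      # the free-text portion, saved as a new Note
print(commands)  # three commands to map onto the Opportunity and a Task
```

The harder NLP problems, like resolving "next Friday" to a date and "John" to the right Contact, happen after this split.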
Scenario 2: Smart Speaker Usage
You're in a boardroom for a weekly pipeline review meeting. Normally, you'd keyboard-and-mouse your way through the Analytics dashboards, but this time there's an Einstein Assistant-enabled Amazon Echo in the room. To get the show started, you say, "Show me my weekly forecast dashboard," and the screen on the wall jumps to that dashboard for the initial rounds of discussion. To drill into one of the dashboard components, you say, "Show me my regional pipeline map," and the screen updates to a focused view of the original dashboard, in this case a map of all 50 states with projections for the next quarter. Finally, you say, "Show me all of my open Opportunities in Pennsylvania" to drill into a dashboard sub-component and list the related Opportunity details.
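The drill-down sequence above amounts to routing each spoken phrase to progressively narrower views. A minimal sketch of that routing, where the dashboard names, view identifiers, and substring matching are all illustrative assumptions rather than anything Einstein Voice exposes:

```python
# Hypothetical phrase-to-view routing table for the boardroom scenario.
ROUTES = {
    "weekly forecast dashboard": "dashboard:weekly_forecast",
    "regional pipeline map": "component:regional_pipeline_map",
    "open opportunities in pennsylvania": "drilldown:open_opps?state=PA",
}

def route(command: str) -> str:
    """Match a spoken command against known phrases and return a view id."""
    command = command.lower()
    for phrase, view in ROUTES.items():
        if phrase in command:
            return view
    return "no_match"

print(route("Show me my weekly forecast dashboard"))
# -> dashboard:weekly_forecast
```

A real assistant would classify intent rather than match substrings, but the shape is the same: each utterance resolves to one view, and each successive view is a subset of the previous one.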