Introduction
In this article, I demonstrate how to use the auto delegation feature to simplify the code that collects and confirms slot values: you create a dialog model and delegate the dialog to Alexa. The dialog model identifies the prompts and utterances used to collect, validate, and confirm slot values and intents. When you delegate the dialog to Alexa, Alexa determines the next step in the conversation and prompts the user for the information it still needs.
Ways to delegate the dialog to Alexa
There are two ways to delegate the dialog:
Auto Delegation
Enable auto delegation, either for the entire skill or for specific intents. In this case, Alexa completes all of the dialog steps based on your dialog model. Alexa sends your skill a single intent request when the dialog is complete.
Manual Delegation
Delegate manually with the Dialog.Delegate directive. In this case, Alexa sends your skill an IntentRequest for each turn of the conversation.
- The request indicates the current state of the dialog with the dialogState set to STARTED, IN_PROGRESS, or COMPLETED.
- Your skill returns the Dialog.Delegate directive for incomplete dialogs. This tells Alexa to check the dialog model for the next step and use a prompt to ask the user for more information as needed.
- Once all the steps are complete, your skill receives the final IntentRequest with dialogState set to COMPLETED.
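As a rough sketch of the manual path (not the full ASK SDK handler pattern), a handler can branch on dialogState and return the Dialog.Delegate directive while the dialog is incomplete. The envelope shapes below follow Alexa's IntentRequest and response JSON; the intent and slot names are illustrative:

```javascript
// Sketch of manual delegation: while the dialog is incomplete, return a
// Dialog.Delegate directive so Alexa consults the dialog model for the
// next step; once dialogState is COMPLETED, act on the filled slots.
function handleGetPlanetWeather(request) {
  if (request.dialogState !== 'COMPLETED') {
    // No speech plus Dialog.Delegate tells Alexa to keep driving the dialog.
    return {
      version: '1.0',
      response: { directives: [{ type: 'Dialog.Delegate' }] }
    };
  }
  // All required slots are now filled and confirmed.
  const planet = request.intent.slots.planet.value;
  return {
    version: '1.0',
    response: {
      outputSpeech: { type: 'PlainText', text: 'Looking up the weather on ' + planet + '.' },
      shouldEndSession: true
    }
  };
}
```

In a real skill built with the ASK SDK, you would return the directive through the SDK's response builder rather than hand-assembling the JSON envelope.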
The right delegation strategy for a specific interaction depends on the complexity of the dialog. Auto delegation is simpler because you do not need to handle any of the dialog in your code. Delegating manually with Dialog.Delegate is more flexible because your skill can make run-time decisions, such as defaulting slot values. You can also use Dialog.Delegate in combination with other Dialog directives to take complete control over the dialog if necessary.
You can configure an overall delegation strategy for the entire skill, and then also configure the delegation strategy for each intent in the skill. This means you can use auto delegation for some intents, but use manual delegation or no delegation at all for others.
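In the skill's interaction model JSON, the skill-level strategy and per-intent overrides live in the dialog object. The fragment below is illustrative (the intent name is a placeholder): the skill defaults to auto delegation with "ALWAYS", while one intent opts into manual delegation with "SKILL_RESPONSE".

```json
{
  "interactionModel": {
    "dialog": {
      "delegationStrategy": "ALWAYS",
      "intents": [
        {
          "name": "PlanTripIntent",
          "delegationStrategy": "SKILL_RESPONSE",
          "slots": []
        }
      ]
    }
  }
}
```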
Automatically delegate simple dialogs to Alexa
Configure your skill to delegate simple dialogs where the dialog steps and slots to fill don’t vary based on run-time interaction with the user. For example, consider a skill that answers questions about the planets. An intent in this skill needs to know the name of the planet the user is asking about. This requirement doesn't change based on earlier turns in the dialog, so just delegate this to Alexa.
For example, in this interaction, Alexa sends the skill just one intent, after all the back-and-forth to collect the value for the planet slot is complete:
User: What is the weather like on other planets?
Alexa: What planet do you want to know about? (Alexa asks the user with the elicitation prompt defined for the planet slot on the GetPlanetWeather intent.)
User: The sun. (The user responds with a value that does not meet the validation rules for the planet slot.)
Alexa: We don't think of the sun as having weather, exactly, so please tell me a planet instead. (Because the user's value failed validation, Alexa responds with another prompt to ask for an acceptable value.)
User: Mars.
At this point, the planet slot contains an acceptable value, so Alexa sends the skill an IntentRequest containing the GetPlanetWeather intent with a single slot, planet, whose value is "Mars".
To create an interaction like this, you would do the following:
Configure the planet slot on the GetPlanetWeather intent as a required slot and provide prompts to ask the user for a planet.
Enable slot validation rules for the planet slot. In this case, there are two rules:
- Reject a set of values: to reject specific values like "the sun" and provide a more context-aware prompt. This rule is checked first.
- Accept only Slot type’s values and synonyms: to only accept values that are defined in the custom type for the slot. This rule is checked if the user's value passes the first rule.
Configure the GetPlanetWeather intent to automatically delegate the dialog to Alexa.
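Putting those three steps together, the dialog section of the interaction model for this intent could look roughly like the fragment below. The prompt IDs are placeholders, and the slot and type names must match the ones actually defined in your skill; the validation types isNotInSet and hasEntityResolutionMatch correspond to the two rules above:

```json
{
  "dialog": {
    "intents": [
      {
        "name": "GetPlanetWeather",
        "delegationStrategy": "ALWAYS",
        "slots": [
          {
            "name": "planet",
            "type": "Planet",
            "elicitationRequired": true,
            "prompts": { "elicitation": "Elicit.Slot.planet" },
            "validations": [
              {
                "type": "isNotInSet",
                "values": ["the sun", "sun"],
                "prompt": "Slot.Validation.rejectSun"
              },
              {
                "type": "hasEntityResolutionMatch",
                "prompt": "Slot.Validation.notAPlanet"
              }
            ]
          }
        ]
      }
    ]
  }
}
```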
Code
const GetPlanetWeatherHandler = {
    canHandle(handlerInput) {
        const request = handlerInput.requestEnvelope.request;
        return (request.type === 'IntentRequest'
            && request.intent.name === 'GetPlanetWeather');
    },
    handle(handlerInput) {
        const intent = handlerInput.requestEnvelope.request.intent;

        // Look up the planet by the entity resolution ID of the slot value.
        const planetId = intent.slots.Planet.resolutions.resolutionsPerAuthority[0].values[0].value.id;
        const planet = PLANETS[planetId];

        const speechOutput = "On " + planet.PrintName + ", you can expect " + planet.Weather;

        return handlerInput.responseBuilder
            .speak(speechOutput)
            .withShouldEndSession(true)
            .getResponse();
    }
};
The example above illustrates an intent handler for GetPlanetWeather in this scenario. The handler does not need to check the dialog state or verify that the planet slot value identifies a valid planet, because slot validation and auto delegation in the dialog model handle that before the request is ever sent to your skill.
In this example, details about the planets are stored in a PLANETS map. This maps properties about the planets to the IDs defined for slot values in the Planet custom slot type.
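The article doesn't show the map itself, but a minimal sketch might look like this, keyed by the IDs assigned to slot values in the Planet custom slot type. The IDs and weather descriptions here are illustrative and would depend on your interaction model:

```javascript
// Illustrative PLANETS map: keys are the IDs assigned to slot values in
// the Planet custom slot type; values hold display and weather details.
const PLANETS = {
  MARS: {
    PrintName: 'Mars',
    Weather: 'dust storms and temperatures far below freezing'
  },
  VENUS: {
    PrintName: 'Venus',
    Weather: 'crushing pressure and clouds of sulfuric acid'
  }
};
```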
Summary
In this article, I demonstrated how the auto delegation feature simplifies the code that collects and confirms slot values: you define a dialog model with required slots and validation rules, delegate the dialog to Alexa, and handle a single IntentRequest once the dialog is complete, as the GetPlanetWeather example showed.