Scope
The purpose of this article is to show how to integrate Cortana into the Menu App.
Introduction
One of the interesting features in Windows Phone 8.1 is Cortana. Cortana is an intelligent personal assistant that helps users with basic tasks, such as calling a friend, scheduling an appointment, and more.
Cortana is not available in all languages; for this reason, some non-English users have changed their devices' language and region settings to enable it. For more details about enabling Cortana on a Windows Phone 8.1 device, see the following article.
Integrating Cortana
Cortana uses Voice Commands to interact with apps. Each app installs its voice commands when it starts, so that Cortana knows how to launch it.
The first step in integrating Cortana with the Menu App is to define the Voice Command Definition (VCD) file, an XML file containing the commands that Cortana will recognize and match to the app.
For the Menu App, we will define “Menu” as the prefix for talking to the app, and define the following two commands:
- Show Command: allows the user to choose which menu to see: Beverages, Starters, Mains, Desserts, and Special Offers.
- Natural Language Command: recognizes expressions like “I am hungry”, “I want to eat”, and “I want to drink”.
In this sample, the VCD is defined for English (en-US), but more languages could be added.
The VCD file
Here is the VCD file:
  <?xml version="1.0" encoding="utf-8"?>
  <VoiceCommands xmlns="http://schemas.microsoft.com/voicecommands/1.1">
    <CommandSet xml:lang="en-us" Name="englishCommands">
      <CommandPrefix>Menu</CommandPrefix>
      <Example> I am hungry </Example>

      <Command Name="ShowCommand">
        <Example> Show Mains </Example>
        <ListenFor> Show {dictatedShowTerms} </ListenFor>
        <Feedback> Showing in Menu ... </Feedback>
        <Navigate Target="MainPage.xaml" />
      </Command>

      <Command Name="NaturalLanguageCommand">
        <Example> I want to eat </Example>
        <ListenFor> {naturalLanguage} </ListenFor>
        <Feedback> Starting Menu ... </Feedback>
        <Navigate Target="MainPage.xaml" />
      </Command>

      <PhraseTopic Label="dictatedShowTerms" Scenario="Search">
        <Subject> Starters </Subject>
        <Subject> Mains </Subject>
        <Subject> Desserts </Subject>
        <Subject> Beverages </Subject>
        <Subject> Special Offers </Subject>
      </PhraseTopic>

      <PhraseTopic Label="naturalLanguage" Scenario="Natural Language">
        <Subject> I want to eat </Subject>
        <Subject> I want to drink </Subject>
        <Subject> I am hungry </Subject>
      </PhraseTopic>
    </CommandSet>
  </VoiceCommands>
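To support another language, a second CommandSet could be added to the same file. The sketch below shows what a Portuguese (pt-pt) set might look like; the Portuguese phrases are illustrative and not part of the original sample. Note that the Command names must stay the same across CommandSets, so the app's activation code can handle both languages without changes:

```xml
<!-- Sketch of an additional CommandSet for Portuguese (illustrative). -->
<CommandSet xml:lang="pt-pt" Name="portugueseCommands">
  <CommandPrefix>Menu</CommandPrefix>
  <Example> Tenho fome </Example>

  <!-- Same command name as the en-US set, localized phrases. -->
  <Command Name="ShowCommand">
    <Example> Mostrar sobremesas </Example>
    <ListenFor> Mostrar {dictatedShowTerms} </ListenFor>
    <Feedback> A mostrar no Menu ... </Feedback>
    <Navigate Target="MainPage.xaml" />
  </Command>

  <PhraseTopic Label="dictatedShowTerms" Scenario="Search">
    <Subject> Entradas </Subject>
    <Subject> Sobremesas </Subject>
  </PhraseTopic>
</CommandSet>
```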
Note: In the app manifest, make sure the Microphone capability is enabled.
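In a Windows Phone 8.1 (WinRT) project, this corresponds to a DeviceCapability entry in Package.appxmanifest, which can also be checked on the Capabilities tab of the manifest designer in Visual Studio:

```xml
<!-- Package.appxmanifest: grants the app access to the microphone. -->
<Capabilities>
  <DeviceCapability Name="microphone" />
</Capabilities>
```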
The InstallVoiceCommandsAsync method
Now that we have defined the commands, we need to install them each time the app starts.
Create the following method:
  // Requires the Windows.Storage and Windows.Media.SpeechRecognition namespaces.
  private async Task InstallVoiceCommandsAsync()
  {
      // The VCD file is packaged with the app; in this sample it is named Cortana.xml.
      var storageFile = await StorageFile.GetFileFromApplicationUriAsync(new Uri("ms-appx:///Cortana.xml"));
      await VoiceCommandManager.InstallCommandSetsFromStorageFileAsync(storageFile);
  }
And then call it in the OnNavigatedTo method of the MainPage:
  protected override async void OnNavigatedTo(NavigationEventArgs e)
  {
      _dataTransferManager = DataTransferManager.GetForCurrentView();
      _dataTransferManager.DataRequested += OnDataRequested;
      _navigationHelper.OnNavigatedTo(e);
      await MainViewModel.LoadDataAsync();
      if (e.NavigationMode == NavigationMode.New)
      {
          await InstallVoiceCommandsAsync();
      }
  }
The OnActivated method
We then need to define what the app will do each time it receives a voice command from Cortana. When Cortana recognizes a command, it sends the data to the Menu App, and the OnActivated method in App.xaml.cs is called.
Here is the complete code for the OnActivated method:
  protected override void OnActivated(IActivatedEventArgs args)
  {
      base.OnActivated(args);

      if (args.Kind == ActivationKind.VoiceCommand)
      {
          var commandArgs = args as VoiceCommandActivatedEventArgs;
          if (commandArgs != null)
          {
              SpeechRecognitionResult speechRecognitionResult = commandArgs.Result;

              // The first element of RulePath is the name of the command that was recognized.
              var voiceCommandName = speechRecognitionResult.RulePath[0];
              var textSpoken = speechRecognitionResult.Text.ToLower();

              switch (voiceCommandName)
              {
                  case "ShowCommand":
                      if (textSpoken.Contains("starters"))
                      {
                          RootFrame.Navigate(typeof(StartersPage));
                      }
                      if (textSpoken.Contains("mains"))
                      {
                          RootFrame.Navigate(typeof(Main1Page));
                      }
                      if (textSpoken.Contains("desserts"))
                      {
                          RootFrame.Navigate(typeof(DessertsPage));
                      }
                      if (textSpoken.Contains("beverages"))
                      {
                          RootFrame.Navigate(typeof(BeveragesPage));
                      }
                      if (textSpoken.Contains("special") || textSpoken.Contains("offer"))
                      {
                          RootFrame.Navigate(typeof(MainPage), "SpecialOffers");
                      }
                      break;
                  case "NaturalLanguageCommand":
                      if (textSpoken.Contains("eat") || textSpoken.Contains("hungry"))
                      {
                          RootFrame.Navigate(typeof(Main1Page));
                      }
                      if (textSpoken.Contains("drink"))
                      {
                          RootFrame.Navigate(typeof(BeveragesPage));
                      }
                      if (textSpoken.Contains("special"))
                      {
                          RootFrame.Navigate(typeof(MainPage), "SpecialOffers");
                      }
                      break;
              }
          }
      }
      Window.Current.Activate();
  }
Each possible voice command defined in the VCD must be handled; otherwise the results shown to the user may be unexpected.
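One defensive option, not in the original sample, is to add a default branch inside the switch that falls back to the main page when a voice command activates the app but matches none of the expected cases:

```csharp
// Hypothetical fallback inside the switch in OnActivated: if Cortana
// activates the app with a command we do not explicitly handle, land
// the user on the main page instead of leaving the frame empty.
default:
    RootFrame.Navigate(typeof(MainPage));
    break;
```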
Using Cortana
Before talking to Cortana, we need to run the Menu App on the device so that the VCD is installed. After that, we can start Cortana.
Then tap “see more” to see what we can say:
Each app shows some examples to help users get started with Cortana. Tapping the Menu App shows the examples defined in the VCD file:
Playing with Cortana, we can say, for example, “Menu I am hungry”, “Menu I want to drink”, or “Menu Show offers”.
Code
See the code for this sample in:
Step 9: How to integrate Cortana with Menu App