Log in to the Azure Portal.
Create an event hub namespace. One event hub namespace can hold multiple event hubs.
When the namespace has finished deploying, find the event hub namespace in your list of Azure resources.
Create an event hub inside the namespace and give it a name.
To grant access to the event hub, open Shared access policies and click the "Add" button.
Enter a SAS policy name and click the "Create" button.
Now, our SAS policy is created. Copy the connection string and save it carefully; we will use this connection string later with our Twitter WPF Client application.
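For reference, an Event Hubs connection string generally has the following shape (placeholder values shown; a policy created directly on the event hub also carries an EntityPath=<event-hub-name> segment at the end):

```text
Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=<your-sas-policy>;SharedAccessKey=<generated-key>
```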
Step 2 - Configure and start the Twitter WPF Client application
The client application gets tweet events directly from Twitter. To do so, it needs permission to call the Twitter Streaming APIs.
Create a Twitter application
Creating a Twitter application is easy; please use this URL to do so. You need a verified Twitter account (verified with your mobile number) for this.
After creating the Twitter app, you must create an access token. This token will be used to authenticate our Twitter WPF Client application.
After successful creation of the token, you can see keys and token values, as shown below.
Step 3 - Configure the WPF Twitter client application
Please download the Twitter WPF Client application from here; I have also attached the zip file to this article. Run the TwitterWPFClient.exe application. When the application prompts you, enter the following values -
- Twitter Consumer Key
- Twitter Consumer Secret
- Twitter Access Token
- Twitter Access Token Secret
Also, set EventHubName to the event hub name (that is, to the value of the entity path) and set EventHubNameConnectionString to the connection string.
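For example, assuming an event hub named twitterhub (a placeholder name), these two values would look roughly like this:

```text
EventHubName                 = twitterhub
EventHubNameConnectionString = Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=<your-sas-policy>;SharedAccessKey=<generated-key>
```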
Enter the search term (keyword) whose Twitter sentiment we want to analyze. I have added "flood" as the search term; given the recent flood in Kerala, we will get plenty of related Twitter feed data.
Click the green "Start" button to collect the social sentiment data.
Step 4 - Create a Stream Analytics job
Now that tweet events are streaming in real time from Twitter, we can set up a Stream Analytics job to analyze these events in real time.
It's a good idea to place the job and the event hub in the same region for the best performance and so that you don't pay to transfer data between regions.
We have successfully created our Stream Analytics job now.
Step 5 - Specify the job input
In the Job Topology section, click the Input box and click the "Add" button.
Currently, an Azure Stream Analytics job supports three streaming input types (Event Hub, IoT Hub, and Blob storage). Choose Event Hub, as we are using this service for data streaming.
Give a valid input alias name, choose the Event Hub namespace, event hub, and event hub policy name for our input stream, and then click the "Save" button.
After saving the input stream, sample some data from it by clicking "Sample data" on the input blade.
Click the Query blade under Job Topology and enter a SQL query such as the one given below. I gave the input stream the alias TwitterInputEventHub; we can query the data from this stream as if it were a table. Make sure your Twitter WPF Client application is running continuously, as it is this client application that pulls the Twitter data and pushes it to our event hub.
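For illustration, a query along the following lines fits this scenario; it is a minimal sketch that aggregates tweets per topic over five-second tumbling windows. The field names CreatedAt, Topic, and SentimentScore are assumptions about the payload the Twitter WPF Client sends, and TwitterCosmosDBOutput is a placeholder for the output alias we will create in Step 6; adjust all of them to match your own setup.

```sql
SELECT
    System.Timestamp AS WindowEnd,
    Topic,
    AVG(SentimentScore) AS AverageSentiment,
    COUNT(*) AS TweetCount
INTO
    TwitterCosmosDBOutput
FROM
    TwitterInputEventHub TIMESTAMP BY CreatedAt
GROUP BY
    TumblingWindow(second, 5), Topic
```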
We have successfully got the result from our query. Make sure you have sampled data from the input before testing the query; after some time, this temporary sample data is lost and has to be captured again.
Step 6 - Create an output sink
We have now defined an event stream, an event hub input to ingest events, and a query to perform a transformation over the stream. The last step is to define an output sink for the job.
In this tutorial, we will write the aggregated tweet events from the job query to Cosmos DB. You can also push your results to Azure Blob Storage, Azure SQL Database, Azure Table storage, Event Hubs, or Power BI, depending on your application needs.
In the Job Topology section, click the Output box and click the "Add" button.
We have chosen Cosmos DB as our output sink. Please create a Cosmos DB account before doing this. Enter all the required details and click the "Save" button. Now, our output sink is also created.
We now have one input and one output, and we have saved a query too.
Step 7 - Start the Stream Analytics job
Click the "Start" button and make sure that our Twitter WPF Client application is running. After some time, the job starts and shows as "Running".
If you check the Cosmos DB Data Explorer, you can see that live Twitter data is arriving automatically. You can use the below query to get the data from Cosmos DB.
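For example, the following simple query in Data Explorer returns up to 100 of the documents written by the job. You can also filter on a field emitted by the job, such as WHERE c.Topic = 'flood', assuming your Stream Analytics query outputs a Topic field.

```sql
SELECT TOP 100 * FROM c
```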
Step 8 - Query the data from Cosmos DB using a simple MVC application
As a final step, we will create a simple ASP.NET MVC application and get the data from Cosmos DB. I have attached this sample project to this article. I have not used any pagination in this application, so we will get all the data on a single page. I am planning to create an Angular app with proper pagination and more features, but that is a matter for another article.
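As a rough sketch of how such a controller might read the data using the Cosmos DB .NET SDK (the Microsoft.Azure.DocumentDB NuGet package), something like the code below could be used. The account URI, key, database id, collection id, and the TweetSentiment property names are placeholders and assumptions that must match your own setup and the fields your Stream Analytics query outputs; it is a minimal sketch, not necessarily identical to the attached sample project.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;
using Microsoft.Azure.Documents.Client;

namespace TwitterSentimentMvc.Controllers
{
    // Shape of the aggregated documents written by the Stream Analytics job.
    // Property names are assumptions; they must match the fields your query outputs.
    public class TweetSentiment
    {
        public string Topic { get; set; }
        public double AverageSentiment { get; set; }
        public long TweetCount { get; set; }
    }

    public class HomeController : Controller
    {
        // Placeholder endpoint URI and key of your Cosmos DB account.
        private static readonly DocumentClient client = new DocumentClient(
            new Uri("https://<your-account>.documents.azure.com:443/"),
            "<your-primary-key>");

        public ActionResult Index()
        {
            // Placeholder database and collection used as the Stream Analytics output sink.
            Uri collectionUri = UriFactory.CreateDocumentCollectionUri("<database-id>", "<collection-id>");

            // Read the documents written by the job and pass them to the view.
            List<TweetSentiment> tweets = client
                .CreateDocumentQuery<TweetSentiment>(
                    collectionUri,
                    new FeedOptions { EnableCrossPartitionQuery = true })
                .AsEnumerable()
                .ToList();

            return View(tweets);
        }
    }
}
```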
We have covered all the steps: we created an event hub namespace and an event hub, then created a Twitter application, and with the help of this Twitter app we pushed the data to the event hub. For pulling the data from Twitter and pushing it to the event hub, we used the Twitter WPF Client application provided by Microsoft. We created a Stream Analytics job with one input, one output, and a query. Finally, we created a simple MVC application to get the data from Cosmos DB and show it in the browser.
I hope you enjoyed the article and got a full idea of Sentiment Analysis with Azure Stream Analytics and Event Hub.