OpenTelemetry is an observability framework that provides APIs, libraries, agents, and instrumentation to collect distributed traces and metrics from an application. It's an open-source, CNCF project that aims to standardize the generation and collection of telemetry data (metrics, logs, and traces) for cloud-native software.
In this article, you will learn how to export OpenTelemetry application logs from your ASP.NET Core applications to Azure Monitor.
Here are the steps to follow along.
- Replace the contents of the Program.cs file with the following code.
using Azure.Monitor.OpenTelemetry.AspNetCore;
var builder = WebApplication.CreateBuilder(args);
// Add the Azure Monitor telemetry service to the application.
// This service will collect and send telemetry data to Azure Monitor.
builder.Services.AddOpenTelemetry().UseAzureMonitor();
// Build the ASP.NET Core web application.
var app = builder.Build();
app.MapGet("/logs-demo", (ILogger<Program> logger) =>
{
    // You can send structured log messages to Azure Monitor as well.
    logger.LogInformation("This is an information log. User: {username}", "dummy user");
    logger.LogWarning("This is a warning log");
    logger.LogError("This is an error log");
    return "Logs generated!";
});
// Run the application.
app.Run();
The above code snippet demonstrates how to set up an ASP.NET Core web application with OpenTelemetry and Azure Monitor integration to collect and export telemetry data. Let's break down the code to understand how it works.
To add the OpenTelemetry service and configure it to use Azure Monitor, we chain the AddOpenTelemetry() and UseAzureMonitor() methods on the service collection. This single call registers the OpenTelemetry SDK, plugs the OpenTelemetry logging provider into the application's logging pipeline, and sends all collected telemetry data (logs, metrics, and traces) to Azure Monitor.
If you only need logs, or you want finer control over the logging pipeline, you can instead configure OpenTelemetry logging on the ILoggingBuilder instance: the ClearProviders method removes all other logging providers so that only OpenTelemetry is used for logging, the AddOpenTelemetry method adds the OpenTelemetry logging provider, and inside its configuration lambda the AddAzureMonitorLogExporter method connects the Azure Monitor exporter to the logging pipeline, as sketched below.
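The following is a minimal sketch of that alternative, log-only setup. It assumes the Azure.Monitor.OpenTelemetry.Exporter package is installed; the configuration key used for the connection string is a hypothetical example.
using Azure.Monitor.OpenTelemetry.Exporter;

var builder = WebApplication.CreateBuilder(args);

// Use only OpenTelemetry for logging.
builder.Logging.ClearProviders();

// Add the OpenTelemetry logging provider and attach the Azure Monitor log exporter.
builder.Logging.AddOpenTelemetry(logging =>
{
    logging.AddAzureMonitorLogExporter(options =>
    {
        // Hypothetical configuration key; use whichever configuration source holds your connection string.
        options.ConnectionString = builder.Configuration["ApplicationInsights:ConnectionString"];
    });
});

var app = builder.Build();
app.Run();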
To test the implementation, we created an endpoint at "/logs-demo". This endpoint generates three log messages, each with a distinct severity level: information, warning, and error. It does so through the ILogger<Program> instance injected into the handler. To demonstrate structured logging, the first log message is a structured message carrying a username property. Once the logs have been generated, the endpoint returns the message "Logs generated!".
We are ready to debug our application now, but before that, we have to configure the connection string for the Application Insights resource that we created earlier. To accomplish this, set the APPLICATIONINSIGHTS_CONNECTION_STRING environment variable to the connection string that you copied previously. You can do this by running the following command in the terminal (in PowerShell, use $env:APPLICATIONINSIGHTS_CONNECTION_STRING instead):
export APPLICATIONINSIGHTS_CONNECTION_STRING=<connection-string>
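Alternatively, the connection string can be supplied in code when the Azure Monitor service is registered, instead of relying on the environment variable. A minimal sketch of that option:
// Supply the connection string explicitly when registering Azure Monitor.
builder.Services.AddOpenTelemetry().UseAzureMonitor(options =>
{
    // Replace the placeholder with the connection string copied from your Application Insights resource.
    options.ConnectionString = "<connection-string>";
});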
Debug the application and invoke the endpoint at /logs-demo. Every successful invocation of the endpoint generates three log messages with different severity levels, each of which is exported to Azure Monitor.
To view the exported logs, visit the Azure Portal and navigate to the Application Insights resource that you created earlier. Click on the Logs tab and run the following query:
traces
The following screenshot shows the output of the query.
Click on the structured log message to view the details of the log message. You will notice that the log message contains the username property as a custom dimension that you can later use to filter the logs.
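For example, a query along these lines (a sketch; the value matches the sample log used in this article) returns only the traces that carry that custom dimension:
traces
| where tostring(customDimensions.username) == "dummy user"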
To explore the integration between Azure Monitor and OpenTelemetry further, including the various types of metrics that can be exported, please refer to the official documentation.