Integrate AI into ASP.NET Core App using Semantic Kernel

The rapid development of artificial intelligence (AI) capabilities has increased the demand for integrating them into applications. However, integrating AI into an application can be challenging, because developers have to manage several different AI services.

Introduction

Semantic Kernel is a powerful SDK that lets developers take the latest large language models (LLMs) and bring them into their applications smoothly. By providing a layer of abstraction over AI services, it allows developers to focus on solving the business problem.

It offers the following benefits.

  1. Streamlined Integration
  2. Reduced Learning Curve
  3. Enhanced Reliability

Understanding How Semantic Kernel Works


Key components of Semantic Kernel

  1. Plugins: Plugins describe your use case to Semantic Kernel; you define the tasks of your use case in a plugin, and the kernel uses that definition when it talks to the LLMs.
  2. AI Orchestration Layer: The core part of Semantic Kernel that enables communication between your plugins and the LLMs.
  3. Connectors: Connectors serve as bridges between application code and AI models. The Semantic Kernel SDK offers several connectors that enable smooth integration into applications; see the short sketch after this list.
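
To make these pieces concrete, here is a minimal sketch of how they map onto the SDK. The model id, endpoint, and key are placeholders, and WeatherPlugin stands in for any plugin class, such as the one shown later in this article.

// Connector: bridges the application to a specific AI service
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion("modelId", "endpoint", "api-key");

// Plugin: describes your use case as functions the kernel can call
builder.Plugins.AddFromType<WeatherPlugin>();

// Orchestration layer: the kernel routes prompts and function calls to the LLM
var kernel = builder.Build();
var answer = await kernel.InvokePromptAsync("What is the weather like in Paris?");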

How to Integrate Semantic Kernel into an ASP.NET Core App

Integrating AI into an ASP.NET Core app has never been as easy as it is with the Semantic Kernel SDK. Make sure you have the following prerequisites installed.

Prerequisites

  1. Azure Subscription: To access Azure OpenAI services.
  2. .NET 8 SDK: Installed in your development environment.
  3. Microsoft.SemanticKernel NuGet Package: This is for integrating Semantic Kernel.

Set up the ASP.NET Core project

Run the following command to create a simple console app.

dotnet new console -n IntegrateSemanticKernel

Now install the necessary NuGet packages with the following commands.

dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.SemanticKernel.Connectors.OpenAI

In this demo, we will build a basic AI agent that responds to our questions. We will use the SDK packages we just installed and the model endpoint and key obtained from Azure OpenAI; Semantic Kernel uses that endpoint to connect to the LLM.

Semantic Kernel supports Hugging Face, OpenAI, and Azure OpenAI LLMs.

using Microsoft.SemanticKernel;

// Build a kernel and register the Azure OpenAI chat completion connector
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    "modelId",   // your Azure OpenAI deployment/model name
    "endpoint",  // your Azure OpenAI endpoint
    "api-key");  // your Azure OpenAI API key

var kernel = builder.Build();
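
Hard-coding the model id, endpoint, and key is fine for a quick demo; in a real application you would more likely read them from configuration. As one possible approach (the variable names here are just an assumption), you could use environment variables:

// Illustrative only: read the Azure OpenAI settings from environment variables
string endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")
    ?? throw new InvalidOperationException("AZURE_OPENAI_ENDPOINT is not set");
string apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")
    ?? throw new InvalidOperationException("AZURE_OPENAI_API_KEY is not set");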

To test the configuration, add the following lines after the kernel is built.

// Send a simple prompt to the model and print the response
var result = await kernel.InvokePromptAsync(
    "Give me a list of breakfast foods with eggs and cheese");
Console.WriteLine(result);

Run the code and check that you see a response similar to the following.

1. Omelette
2. Frittata
3. Breakfast burrito
4. Scrambled eggs with cheese
5. Quiche
6. Huevos rancheros
7. Cheese and egg sandwich
8. Egg and cheese bagel
9. Egg and cheese croissant
10. Baked eggs with cheese
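
As mentioned above, Semantic Kernel is not limited to Azure OpenAI. If you use OpenAI directly, only the connector registration changes; here is a minimal sketch, where the model name and key are placeholders:

// Alternative connector: OpenAI instead of Azure OpenAI
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4o",
    apiKey: "api-key");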

How do you construct a plugin for Semantic Kernel?

A plugin is a class that contains functions the kernel can call. The functions can be semantic prompts or native code. To use a plugin, add it to the kernel with the AddFromType<T>() method and then invoke the desired function with the kernel's InvokeAsync() method. The kernel locates the plugin, runs the requested function, and returns the result.
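
For example, assuming a kernel builder already configured with a chat completion connector as shown earlier, and the WeatherPlugin class shown below, registration and a direct function call look roughly like this:

// Assumes kernelBuilder is an IKernelBuilder with a chat completion connector registered
kernelBuilder.Plugins.AddFromType<WeatherPlugin>();
Kernel kernel = kernelBuilder.Build();

// Invoke a specific plugin function by plugin name and function name
var weather = await kernel.InvokeAsync(
    "WeatherPlugin", "GetWeather",
    new KernelArguments { ["location"] = "Marseille, 13" });
Console.WriteLine(weather);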

Here is an example from the official samples; you can find more through the samples link in the references below.

Weather Plugin

using System.ComponentModel;
using Microsoft.SemanticKernel;

public sealed class WeatherPlugin
{
    [KernelFunction]
    [Description("Get the current weather in a given location.")]
    public string GetWeather(
        [Description("The city and department, e.g. Marseille, 13")] string location
    ) => "12°C\nWind: 11 KMPH\nHumidity: 48%\nMostly cloudy";
}

Creating a Kernel with "WeatherPlugin"

private Kernel CreateKernelWithWeatherPlugin()
{
    // Create a logging handler to output HTTP requests and responses
    var handler = new LoggingHandler(new HttpClientHandler(), this.Output);
    HttpClient httpClient = new(handler);

    // Create a kernel with MistralAI chat completion and WeatherPlugin
    IKernelBuilder kernelBuilder = Kernel.CreateBuilder();
    kernelBuilder.AddMistralChatCompletion(
        modelId: TestConfiguration.MistralAI.ChatModelId!,
        apiKey: TestConfiguration.MistralAI.ApiKey!,
        httpClient: httpClient);
    kernelBuilder.Plugins.AddFromType<WeatherPlugin>();
    Kernel kernel = kernelBuilder.Build();
    return kernel;
}
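
Note that this sample uses the MistralAI connector rather than the OpenAI connector installed earlier; to run it as-is, you would also need the MistralAI connector package (published as a prerelease at the time of writing).

dotnet add package Microsoft.SemanticKernel.Connectors.MistralAI --prerelease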

Now we have everything ready to run our application. Paste the following code into the Program class's Main method (or the top-level statements) and run the application. You will see the result that we defined in our WeatherPlugin.

// Create a kernel with MistralAI chat completion and WeatherPlugin
Kernel kernel = this.CreateKernelWithWeatherPlugin();

// Invoke chat prompt with auto invocation of functions enabled
const string ChatPrompt = """
<message role="user">What is the weather like in Paris?</message>
""";
var executionSettings = new MistralAIPromptExecutionSettings { ToolCallBehavior = MistralAIToolCallBehavior.AutoInvokeKernelFunctions };
var chatSemanticFunction = kernel.CreateFunctionFromPrompt(
   ChatPrompt, executionSettings);
var chatPromptResult = await kernel.InvokeAsync(chatSemanticFunction);
Console.WriteLine(chatPromptResult);

Output

Here are the weather details in Paris:
12°C
Wind: 11 KMPH
Humidity: 48%
Mostly cloudy
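
If you stick with the Azure OpenAI connector configured earlier instead of MistralAI, the same auto-invocation pattern applies; only the execution settings type changes. Here is a sketch under that assumption, reusing the kernel and ChatPrompt from the code above:

// With the OpenAI/Azure OpenAI connector, auto function calling uses OpenAIPromptExecutionSettings
// (from Microsoft.SemanticKernel.Connectors.OpenAI)
var settings = new OpenAIPromptExecutionSettings { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions };
var weatherFunction = kernel.CreateFunctionFromPrompt(ChatPrompt, settings);
Console.WriteLine(await kernel.InvokeAsync(weatherFunction));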

Conclusion

Integrating AI into your application is now easy with the help of Semantic Kernel. We can provide our business use case as a plugin to Semantic Kernel, which communicates with the LLMs to produce a response. Plugins define the tasks the kernel should complete, and they can be native code or natural language prompts. Check out the references below to learn more.

References

  1. Samples: https://github.com/microsoft/semantic-kernel/tree/main/dotnet
  2. Docs: https://learn.microsoft.com/en-us/training/paths/develop-ai-agents-azure-open-ai-semantic-kernel-sdk/
  3. AI Receptionist Sample: https://www.c-sharpcorner.com/article/ai-receptionist-with-microsoft-semantic-kernel-and-openai/

Thanks for reading!