Semantic Kernel and Microsoft.Extensions.AI: Better Together, Part 2
Authored by Roger Barreto, this article explores practical non-agent integration patterns for Microsoft.Extensions.AI and Semantic Kernel within the .NET AI ecosystem. It guides developers through real code samples for chat completion, embeddings, function calls, dependency injection, and service selection.
By Roger Barreto
This is the second installment of a series focused on integrating Microsoft.Extensions.AI with Semantic Kernel. After covering how the two libraries complement each other in Part 1, this article provides practical examples to help developers leverage both technologies, especially in non-agent AI scenarios.
Getting Started with Microsoft.Extensions.AI and Semantic Kernel
Microsoft.Extensions.AI introduces foundational abstractions such as IChatClient and IEmbeddingGenerator<string, Embedding<float>>. Semantic Kernel builds atop these, enabling features like plugins, prompt templates, and automated workflows. By combining both, you can address a wide range of AI development patterns in .NET.
1. Basic Chat Completion with IChatClient
Semantic Kernel now natively supports Microsoft.Extensions.AI’s IChatClient interface. Below are several ways to run chat completions using this integration:
Using Kernel Builder
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
// Create a kernel with OpenAI chat client
var kernel = Kernel.CreateBuilder()
.AddOpenAIChatClient("gpt-4o", "your-api-key")
.Build();
// Simple chat completion
var response = await kernel.InvokePromptAsync("What is the capital of France?");
Console.WriteLine(response);
Using a Chat Client Directly with Azure OpenAI
var kernel = Kernel.CreateBuilder()
.AddAzureOpenAIChatClient(
deploymentName: "gpt-4o",
endpoint: "https://your-resource.openai.azure.com/",
apiKey: "your-api-key")
.Build();
var client = kernel.GetRequiredService<IChatClient>();
var response = await client.GetResponseAsync([
new(ChatRole.User, "Hello, AI!")
]);
Console.WriteLine(response.Text);
Using Dependency Injection
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
var services = new ServiceCollection();
// Register the chat client
services.AddOpenAIChatClient("gpt-4o", "your-api-key");
// Register Semantic Kernel
services.AddKernel();
var serviceProvider = services.BuildServiceProvider();
var kernel = serviceProvider.GetRequiredService<Kernel>();
var response = await kernel.InvokePromptAsync("Tell me about artificial intelligence.");
Console.WriteLine(response);
Converting Between IChatCompletionService and IChatClient
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
// Get the chat completion service
var chatService = kernel.GetRequiredService<IChatCompletionService>();
// Convert to IChatClient when needed
IChatClient chatClient = chatService.AsChatClient();
// Or convert back
IChatCompletionService backToService = chatClient.AsChatCompletionService();
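Once converted, the resulting IChatClient can be used like any other chat client. A minimal sketch, reusing the GetResponseAsync pattern shown earlier (the prompt text is purely illustrative):
// Use the converted client like any other IChatClient
var reply = await chatClient.GetResponseAsync([
    new(ChatRole.User, "Summarize what Semantic Kernel does in one sentence.")
]);
Console.WriteLine(reply.Text);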
2. Embedding Generation with IEmbeddingGenerator
Semantic Kernel has transitioned from its own ITextEmbeddingGenerationService to Microsoft.Extensions.AI’s generalized IEmbeddingGenerator<string, Embedding<float>>.
Basic Embedding Generation
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
#pragma warning disable SKEXP0010 // Type is for evaluation purposes only
var kernel = Kernel.CreateBuilder()
.AddOpenAIEmbeddingGenerator("text-embedding-ada-002", "your-api-key")
.Build();
var embeddingGenerator = kernel.GetRequiredService<IEmbeddingGenerator<string, Embedding<float>>>();
// Generate embeddings
var embeddings = await embeddingGenerator.GenerateAsync([
"Semantic Kernel is a lightweight, open-source development kit.",
"Microsoft.Extensions.AI provides foundational AI abstractions."
]);
foreach (var embedding in embeddings)
{
Console.WriteLine($"Generated embedding with {embedding.Vector.Length} dimensions");
}
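Each Embedding<float> exposes its raw vector through the Vector property, so you can work with the values using plain .NET. As a minimal illustrative sketch (not part of the original sample), here is a cosine similarity computed over the two embeddings generated above:
// Compare the two embeddings with cosine similarity (illustrative helper)
static float CosineSimilarity(ReadOnlySpan<float> a, ReadOnlySpan<float> b)
{
    float dot = 0, magA = 0, magB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        magA += a[i] * a[i];
        magB += b[i] * b[i];
    }
    return dot / (MathF.Sqrt(magA) * MathF.Sqrt(magB));
}
var similarity = CosineSimilarity(embeddings[0].Vector.Span, embeddings[1].Vector.Span);
Console.WriteLine($"Cosine similarity: {similarity:F4}");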
Working with Azure OpenAI Embeddings
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
#pragma warning disable SKEXP0010
var kernel = Kernel.CreateBuilder()
.AddAzureOpenAIEmbeddingGenerator(
deploymentName: "text-embedding-ada-002",
endpoint: "https://your-resource.openai.azure.com/",
apiKey: "your-api-key")
.Build();
var embeddingGenerator = kernel.GetRequiredService<IEmbeddingGenerator<string, Embedding<float>>>();
// Generate embeddings with custom dimensions (if supported by model)
var embeddings = await embeddingGenerator.GenerateAsync(
["Custom text for embedding"],
new EmbeddingGenerationOptions { Dimensions = 1536 }
);
Console.WriteLine($"Generated {embeddings.Count} embeddings");
3. Function Calling Integration
Semantic Kernel’s function calling model is built atop Microsoft.Extensions.AI: KernelFunction derives from AIFunction, so kernel functions can be used directly wherever Microsoft.Extensions.AI expects a tool.
Creating and Using Kernel Functions
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
using System.ComponentModel;
var kernel = Kernel.CreateBuilder()
.AddOpenAIChatClient("gpt-4o", "your-api-key")
.Build();
// Import the plugin (its [KernelFunction] methods) from the WeatherPlugin type
kernel.ImportPluginFromType<WeatherPlugin>();
// Use function calling
var settings = new PromptExecutionSettings
{
FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
var response = await kernel.InvokePromptAsync(
"What's the weather like in Seattle and what time is it?",
new(settings)
);
Console.WriteLine(response);
public class WeatherPlugin
{
[KernelFunction, Description("Get the current weather for a city")]
public static string GetWeather([Description("The city name")] string city)
{
return $"The weather in {city} is sunny and 72°F";
}
[KernelFunction, Description("Get the current time")]
public static string GetCurrentTime()
{
return DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss");
}
}
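The imported plugin functions can also be invoked directly through the kernel, bypassing the model entirely. A minimal sketch, assuming the WeatherPlugin registration above (the argument name matches the city parameter of GetWeather):
// Invoke an imported plugin function directly, without the LLM
var weather = await kernel.InvokeAsync("WeatherPlugin", "GetWeather",
    new KernelArguments { ["city"] = "Seattle" });
Console.WriteLine(weather);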
Working with KernelFunction Directly
var kernel = Kernel.CreateBuilder()
.AddOpenAIChatClient("gpt-4o", "your-api-key")
.Build();
// Create a function from a method
var weatherFunction = KernelFunctionFactory.CreateFromMethod(
() => "Sunny and 75°F", "GetWeather", "Gets the current weather");
// KernelFunction is already an AIFunction, so you can use it directly
var chatOptions = new ChatOptions
{
Tools = [weatherFunction], // KernelFunction as AITool
ToolMode = ChatToolMode.Auto
};
var chatClient = kernel.GetRequiredService<IChatClient>();
var messages = new List<ChatMessage> { new(ChatRole.User, "What's the weather like?") };
var response = await chatClient.GetResponseAsync(messages, chatOptions);
Console.WriteLine(response.Text);
4. Content Type Conversions
Semantic Kernel enables you to directly return Microsoft.Extensions.AI types from prompts.
Using InvokeAsync<T> with Microsoft.Extensions.AI Types
var kernel = Kernel.CreateBuilder()
.AddOpenAIChatClient("gpt-4o", "your-api-key")
.Build();
// Get Microsoft.Extensions.AI ChatResponse directly
var chatResponse = await kernel.InvokeAsync<ChatResponse>(
kernel.CreateFunctionFromPrompt("Tell me a joke")
);
Console.WriteLine($"Model: {chatResponse.ModelId}");
Console.WriteLine($"Content: {chatResponse.Text}");
// Get a single ChatMessage directly
var message = await kernel.InvokeAsync<ChatMessage>(
kernel.CreateFunctionFromPrompt("Start a conversation about AI")
);
Console.WriteLine($"Message Role: {message.Role}");
Console.WriteLine($"Message Content: {message.Text}");
// Get Microsoft.Extensions.AI TextContent directly
var textContent = await kernel.InvokeAsync<Microsoft.Extensions.AI.TextContent>(
kernel.CreateFunctionFromPrompt("Start a conversation about AI")
);
Console.WriteLine($"Text Content: {textContent.Text}");
5. Service Selection and Dependency Injection
Semantic Kernel supports advanced dependency injection and service selection patterns, aiding projects with multiple AI providers.
Multiple Chat Providers
var services = new ServiceCollection();
// Register multiple chat clients
services.AddOpenAIChatClient("gpt-4", "openai-key", serviceId: "OpenAI");
services.AddAzureOpenAIChatClient(
"gpt-4", "https://your-resource.openai.azure.com/", "azure-key", serviceId: "AzureOpenAI");
services.AddKernel();
var serviceProvider = services.BuildServiceProvider();
var kernel = serviceProvider.GetRequiredService<Kernel>();
// Use specific service
var settings = new PromptExecutionSettings { ServiceId = "AzureOpenAI" };
var response = await kernel.InvokePromptAsync<ChatResponse>(
"Explain machine learning", new(settings)
);
Console.WriteLine("Model: " + response.ModelId);
Console.WriteLine("Content: " + response.Text);
Conclusion
Combining Microsoft.Extensions.AI and Semantic Kernel streamlines the development of .NET AI applications by offering:
- Flexibility: Use foundational abstractions for simple scenarios
- Productivity: Incorporate plugins, prompt templates, and other advanced Semantic Kernel features
- Interoperability: Convert between content types and interfaces with ease
- Scalability: Use dependency injection and service selection for complex workflows
These patterns empower developers to implement solutions ranging from simple chatbots to robust workflow engines. Future installments in this series will explore more advanced agent-based scenarios.
Packages and References
- Microsoft.Extensions.AI.Abstractions (NuGet)
- Microsoft.Extensions.AI (NuGet)
- Semantic Kernel Agent Framework (Microsoft Learn)
- Semantic Kernel Process Framework (Microsoft Learn)
- Semantic Kernel Samples (GitHub)
- Vector Data Extensions – Blog Post
- [Microsoft.Extensions.AI libraries – .NET Microsoft Learn](https://learn.microsoft.com/en-us/dotnet/ai/microsoft-extensions-ai)
- eShop Support with Microsoft.Extensions.AI
- .NET AI Samples