Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET

Luis Quintanilla

We are excited to introduce the Microsoft.Extensions.AI.Abstractions and Microsoft.Extensions.AI libraries, available in preview today. These packages provide the .NET ecosystem with essential abstractions for integrating AI services into .NET applications and libraries, along with middleware for adding key capabilities.

To support the .NET ecosystem, the .NET team has enhanced the core Microsoft.Extensions libraries with these abstractions, or “exchange types,” for .NET Generative AI applications and libraries.

AI capabilities are rapidly evolving, with common patterns emerging for functionality like “chat,” embeddings, and tool calling. Unified abstractions are crucial for developers to work effectively across different providers. Middleware can layer in valuable functionality without placing that burden on producers, so consumers benefit immediately.

For example, the IChatClient interface allows consumption of language models, whether hosted remotely or running locally. Any .NET package providing an AI client can implement this interface, enabling seamless integration with consuming .NET code.

Code:
// Use a local Ollama model during development and a hosted Azure AI model in production.
IChatClient client =
    environment.IsDevelopment() ?
    new OllamaChatClient(...) :
    new AzureAIInferenceChatClient(...);

Then, regardless of the provider you’re using, you can send requests as follows:

Code:
var response = await client.CompleteAsync(
    "Translate the following text into Pig Latin: I love .NET and AI");

Console.WriteLine(response.Message);

What is Microsoft.Extensions.AI?

Microsoft.Extensions.AI is a set of core .NET libraries developed in collaboration with developers across the .NET ecosystem, including Semantic Kernel. These libraries provide a unified layer of C# abstractions for interacting with AI services, such as small and large language models (SLMs and LLMs) and embedding generators, along with middleware for adding common capabilities.

(Architectural diagram: the Microsoft.Extensions.AI core abstractions form a unified API layer between .NET applications and AI services.)

Benefits of Microsoft.Extensions.AI

Microsoft.Extensions.AI offers a unified API abstraction for AI services, similar to our successful logging and dependency injection (DI) abstractions. Our goal is to provide standard implementations for caching, telemetry, tool calling, and other common tasks that work with any provider.

Core benefits

  • Unified API: Delivers a consistent set of APIs and conventions for integrating AI services into .NET applications.
  • Flexibility: Allows .NET library authors to use AI services without being tied to a specific provider, making it adaptable to any provider.
  • Ease of Use: Enables .NET developers to experiment with different packages using the same underlying abstractions, maintaining a single API throughout their application.
  • Componentization: Simplifies adding new capabilities and facilitates the componentization and testing of applications.

Common abstractions for AI services

These abstractions make it easy to use idiomatic C# code for various scenarios with minimal code changes, whether you’re using different services for development or production, addressing hybrid scenarios, or exploring other service providers.

Library authors who implement these abstractions will make their clients interoperable with the broader Microsoft.Extensions.AI ecosystem. Service-specific APIs remain accessible if needed, allowing consumers to code against the standard abstractions and pass through to proprietary APIs only when required.

Code:
public interface IChatClient : IDisposable 
{ 
    // Send a set of chat messages and get back the completed response.
    Task<ChatCompletion> CompleteAsync(...); 

    // Stream the response back as a sequence of incremental updates.
    IAsyncEnumerable<StreamingChatCompletionUpdate> CompleteStreamingAsync(...); 

    // Metadata describing the provider and model behind this client.
    ChatClientMetadata Metadata { get; } 

    // Access provider-specific services layered beneath the abstraction.
    TService? GetService<TService>(object? key = null) where TService : class; 
}
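
For example, a consumer can code against IChatClient and use the GetService<TService> method shown above to reach the underlying provider client only when a proprietary feature is needed. The following is a minimal sketch; it assumes the OpenAI implementation surfaces its inner OpenAIClient through GetService, which is up to each library author.

Code:
using OpenAI;
using Microsoft.Extensions.AI;

// Code against the abstraction...
IChatClient client =
    new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
        .AsChatClient("gpt-4o-mini");

// ...and drop down to the provider-specific client only when a proprietary API is required.
// Whether GetService returns the inner OpenAIClient depends on the implementation, so check for null.
OpenAIClient? openAIClient = client.GetService<OpenAIClient>();
if (openAIClient is not null)
{
    // Use OpenAI-specific APIs that the abstraction does not expose.
}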

As of this preview, we provide reference implementations for the following services:

  • OpenAI
  • Azure AI Inference
  • Ollama

However, we intend to work with package authors across the .NET ecosystem so that implementations of these Microsoft.Extensions.AI abstractions end up being part of the respective client libraries rather than requiring installation of additional packages. If you have a .NET client library for a particular AI service, we would love to see implementations of these abstractions in your library.
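
As a rough illustration of what such an implementation could look like, here is a minimal sketch of a hypothetical client. SampleChatClient and its echo behavior are placeholders rather than a real service; the member signatures follow the preview abstractions shown above.

Code:
using System.Runtime.CompilerServices;
using Microsoft.Extensions.AI;

// Hypothetical adapter for an imaginary service; the class name and behavior are placeholders.
public sealed class SampleChatClient(Uri endpoint, string modelId) : IChatClient
{
    public ChatClientMetadata Metadata { get; } = new("sample", endpoint, modelId);

    public async Task<ChatCompletion> CompleteAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        // A real implementation would call the underlying service here; this sketch just echoes.
        await Task.Delay(100, cancellationToken);
        return new ChatCompletion(
            new ChatMessage(ChatRole.Assistant, $"Echo: {chatMessages[^1].Text}"));
    }

    public async IAsyncEnumerable<StreamingChatCompletionUpdate> CompleteStreamingAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Stream the (single) completion back as one update.
        ChatCompletion completion = await CompleteAsync(chatMessages, options, cancellationToken);
        yield return new StreamingChatCompletionUpdate
        {
            Role = ChatRole.Assistant,
            Text = completion.Message.Text,
        };
    }

    public TService? GetService<TService>(object? key = null) where TService : class =>
        this as TService;

    public void Dispose() { }
}

With an implementation like this in place, the client can be consumed, decorated with middleware, and registered with dependency injection in the same way as the reference implementations listed above.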

Standard middleware implementations

Connecting to and using AI services is just one aspect of building robust applications. Production-ready applications require additional features like telemetry, logging, and tool calling capabilities. The Microsoft.Extensions.AI abstractions enable developers to easily integrate these components into their applications using familiar patterns.

The following sample demonstrates how to register an OpenAI-based IChatClient with dependency injection. IChatClient lets you attach these capabilities in a consistent way across various providers.

Code:
app.Services.AddChatClient(builder => builder
    .UseLogging()                // log requests and responses
    .UseFunctionInvocation()     // automatically invoke .NET methods the model requests
    .UseDistributedCache()       // cache completions
    .UseOpenTelemetry()          // emit traces and metrics
    .Use(new OpenAIClient(...).AsChatClient(...)));

The capabilities demonstrated above are included in the Microsoft.Extensions.AI library, but they are a small subset of the kinds of capabilities that can be layered in with this approach. We’re excited to see the creativity of .NET developers shine with all types of middleware exposed for creating powerful, robust AI-related functionality.
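
As a sketch of what a custom middleware component could look like, the following hypothetical TimingChatClient derives from the library's DelegatingChatClient base class to measure each request; the class name and timing behavior are illustrative, not part of the library.

Code:
using System.Diagnostics;
using Microsoft.Extensions.AI;

// Hypothetical middleware that times each chat completion and writes the duration to the console.
public sealed class TimingChatClient(IChatClient innerClient) : DelegatingChatClient(innerClient)
{
    public override async Task<ChatCompletion> CompleteAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        var stopwatch = Stopwatch.StartNew();
        try
        {
            // Delegate to the next client in the pipeline.
            return await base.CompleteAsync(chatMessages, options, cancellationToken);
        }
        finally
        {
            Console.WriteLine($"Chat completion took {stopwatch.ElapsedMilliseconds} ms");
        }
    }
}

A component like this could then be inserted into the pipeline above with a call such as .Use(inner => new TimingChatClient(inner)).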

How to get started

Microsoft.Extensions.AI is available in preview starting today.

To get started, you can create a console application and install the Microsoft.Extensions.AI package for the respective AI service you’re working with.

Chat

The following examples show how to use Microsoft.Extensions.AI for chat scenarios.

Azure AI Inference (GitHub Models)

  1. Install the Microsoft.Extensions.AI.AzureAIInference NuGet package, which works with models from GitHub Models as well as the Azure AI Model Catalog.
  2. Add the following code to your application:

Code:
using Azure; 
using Azure.AI.Inference;
using Microsoft.Extensions.AI;

IChatClient client =
    new ChatCompletionsClient(
        endpoint: new Uri("https://models.inference.ai.azure.com"), 
        new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")))
        .AsChatClient("Phi-3.5-MoE-instruct");

var response = await client.CompleteAsync("What is AI?");

Console.WriteLine(response.Message);

OpenAI

  1. Install the Microsoft.Extensions.AI.OpenAI NuGet package.
  2. Add the following code to your application.

Code:
using OpenAI;
using Microsoft.Extensions.AI;

IChatClient client =
    new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
        .AsChatClient("gpt-4o-mini");

var response = await client.CompleteAsync("What is AI?");

Console.WriteLine(response.Message);

Ollama

  1. Install the Microsoft.Extensions.AI.Ollama NuGet package.
  2. Add the following code to your application.

Code:
using Microsoft.Extensions.AI;

IChatClient client = 
    new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");

var response = await client.CompleteAsync("What is AI?");

Console.WriteLine(response.Message);

Embeddings

Similar to chat, you can also use Microsoft.Extensions.AI for text embedding generation scenarios.

OpenAI

  1. Install the Microsoft.Extensions.AI.OpenAI NuGet package.
  2. Add the following code to your application.

Code:
using OpenAI;
using Microsoft.Extensions.AI;

IEmbeddingGenerator<string, Embedding<float>> generator =
    new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
        .AsEmbeddingGenerator("text-embedding-3-small");

// GenerateAsync returns a collection of embeddings, one per input value.
var embeddings = await generator.GenerateAsync("What is AI?");

Console.WriteLine(string.Join(", ", embeddings[0].Vector.ToArray()));

Ollama

  1. Install the Microsoft.Extensions.AI.Ollama NuGet package.
  2. Add the following code to your application.

Code:
using Microsoft.Extensions.AI;

IEmbeddingGenerator<string, Embedding<float>> generator =
    new OllamaEmbeddingGenerator(new Uri("http://localhost:11434/"), "all-minilm");

// GenerateAsync returns a collection of embeddings, one per input value.
var embeddings = await generator.GenerateAsync("What is AI?");

Console.WriteLine(string.Join(", ", embeddings[0].Vector.ToArray()));

Start building with Microsoft.Extensions.AI

With the release of Microsoft.Extensions.AI, we’re excited to build the foundation of an ecosystem for AI application development. Here are some ways you can get involved and start building with Microsoft.Extensions.AI:

  • Library Developers: If you own libraries that provide clients for AI services, consider implementing the interfaces in your libraries. This allows users to easily integrate your NuGet package via the abstractions.
  • Service Consumers: If you’re developing libraries that consume AI services, use the abstractions instead of hardcoding to a specific AI service (a minimal sketch follows this list). This approach gives your consumers the flexibility to choose their preferred service.
  • Application Developers: Try out the abstractions to simplify integration into your apps. This enables portability across models and services, facilitates testing and mocking, leverages middleware provided by the ecosystem, and maintains a consistent API throughout your app, even if you use different services in different parts of your application (e.g., local and hosted model hybrid scenarios).
  • Ecosystem Contributors: If you’re interested in contributing to the ecosystem, consider writing custom middleware components.
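
For instance, a consuming library or application component can depend only on IChatClient and let the host decide which provider to register. A minimal sketch, assuming a hypothetical SummaryService class:

Code:
using Microsoft.Extensions.AI;

// Hypothetical consumer: depends on the abstraction, not on any specific provider.
public sealed class SummaryService(IChatClient chatClient)
{
    public async Task<string> SummarizeAsync(string text, CancellationToken cancellationToken = default)
    {
        var response = await chatClient.CompleteAsync(
            $"Summarize the following text in one sentence: {text}",
            cancellationToken: cancellationToken);

        return response.Message.Text ?? string.Empty;
    }
}

The host application registers whichever IChatClient it prefers (for example, via AddChatClient as shown earlier), and SummaryService works unchanged.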

We have a set of samples in the dotnet/ai-samples GitHub repository to help you get started.

What’s next for Microsoft.Extensions.AI?

As mentioned, we’re currently releasing Microsoft.Extensions.AI in preview. We expect the library to remain in preview through the .NET 9 release in November as we continue to gather feedback.

In the near term, we plan to:

  • Continue collaborating with Semantic Kernel on integrating Microsoft.Extensions.AI as its foundational layer.
  • Update existing samples like eShop to use Microsoft.Extensions.AI.
  • Work with everyone across the .NET ecosystem on the adoption of Microsoft.Extensions.AI. The more providers implement the abstractions, the more consumers use them, and the more middleware components are built, the more powerful all of the pieces become.

We look forward to shaping the future of AI development in .NET with your help.

Please try out Microsoft.Extensions.AI and share your feedback so we can build the experiences that help you and your team thrive.

The post Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET appeared first on .NET Blog.
