Xtreme .Net Talk

Posts by .Net
  1. HybridCache is a new .NET 9 library, available via the Microsoft.Extensions.Caching.Hybrid package, and it is now generally available! HybridCache, named for its ability to leverage both in-memory and distributed caches like Redis, ensures that data storage and retrieval are optimized for performance and security, regardless of the scale or complexity of your application.

Why use HybridCache?

HybridCache reduces the usual caching boilerplate (object serialization, cache-aside pattern implementation, and data consistency) to a single line of code. It also improves data performance by combining in-memory and distributed cache stores, enabling the application to run faster. If you're building ASP.NET Core applications that use repeated or complicated data queries, have a microservice-based architecture, or require real-time data processing, you should use HybridCache to improve the performance and responsiveness of your applications.

Key features

  • Built on top of IDistributedCache, which means it is compatible with all existing cache backends such as Redis, SQL Server, CosmosDB, Garnet, etc.
  • Simple and easy-to-use API
  • Cache-stampede protection
  • Cache invalidation with tags
  • Performance enhancements, such as built-in support for the newer IBufferDistributedCache API
  • Fully configurable serialization for JSON and Protobuf
  • Secure-by-default authentication and data handling

Keep reading to learn more about the key features!

Simple API – GetOrCreateAsync

If you've previously tried caching in your ASP.NET Core applications, you may be familiar with other abstractions such as IDistributedCache or IMemoryCache. With these, you had to write the code to generate keys, try to retrieve the data matching the key from the cache, deserialize the data if it existed in the cache, or serialize the data and push it into the cache if it did not. This is a very manual process; hooking up all the different components is time consuming, requires knowledge of multiple APIs, and is difficult to get right.

```csharp
// This is the code without HybridCache
public class SomeService(IDistributedCache cache)
{
    public async Task<SomeInformation> GetSomeInformationAsync(
        string name, int id, CancellationToken token = default)
    {
        var key = $"someinfo:{name}:{id}"; // Unique key for this combination.
        var bytes = await cache.GetAsync(key, token); // Try to get from cache.
        SomeInformation info;
        if (bytes is null)
        {
            // Cache miss; get the data from the real source.
            info = await SomeExpensiveOperationAsync(name, id, token);

            // Serialize and cache it.
            bytes = SomeSerializer.Serialize(info);
            await cache.SetAsync(key, bytes, token);
        }
        else
        {
            // Cache hit; deserialize it.
            info = SomeSerializer.Deserialize<SomeInformation>(bytes);
        }
        return info;
    }

    // This is the work we're trying to cache.
    private async Task<SomeInformation> SomeExpensiveOperationAsync(
        string name, int id, CancellationToken token = default)
    { /* ... */ }
}
```

HybridCache simplifies the cache-aside pattern down to a single line of code while significantly accelerating your application's data performance.

```csharp
// Same code as above, now using HybridCache
public class SomeService(HybridCache cache)
{
    public async Task<SomeInformation> GetSomeInformationAsync(
        string name, int id, CancellationToken token = default)
    {
        return await cache.GetOrCreateAsync(
            $"someinfo:{name}:{id}", // Unique key for this entry.
            async cancel => await SomeExpensiveOperationAsync(name, id, cancel),
            cancellationToken: token
        );
    }
}
```

Cache-stampede protection

A cache stampede occurs when multiple clients request the same data simultaneously after it expires from the cache, leading to a surge of redundant computations and database queries. This can significantly slow down a web application and degrade the user experience, especially under high-traffic conditions. HybridCache prevents cache stampedes by ensuring that when multiple requests for the same data arrive, it performs the necessary computation only once. Instead of allowing every request to independently regenerate the data, HybridCache detects the situation, processes the request a single time, and returns the result to all queued requests. This approach maintains data consistency and optimizes performance at scale.

Cache invalidation with tags

When you add an item to a cache, you can assign one or more tags to it; these are simply string values that help categorize the data. HybridCache now makes it easier to manage and invalidate data in bulk via tagging. For example, when you need to invalidate many items at a time, instead of deleting the keys one by one, you can delete all items with a specific tag in one API call. You can even delete items with multiple tags at a time, simply by passing a set of tags. It's no longer necessary to rely exclusively on unique keys for identifying, grouping, and deleting data from the cache.

Summary

With all these new features hot off the press, it's time to try the new .NET 9 HybridCache! To get started or learn more, visit the documentation. The post Hello HybridCache! Streamlining Cache Management for ASP.NET Core Applications appeared first on .NET Blog. View the full article
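Putting the registration, tagging, and tag-based invalidation together, here is a minimal sketch assuming the public API of the Microsoft.Extensions.Caching.Hybrid package (AddHybridCache, GetOrCreateAsync with a tags argument, RemoveByTagAsync); the Invoice type, key names, and "invoices" tag are illustrative, not from the post:

```csharp
using Microsoft.Extensions.Caching.Hybrid;
using Microsoft.Extensions.DependencyInjection;

// Registration: in-memory (L1) by default; register an IDistributedCache
// such as Redis to also get a distributed (L2) layer.
var services = new ServiceCollection();
services.AddHybridCache();

public record Invoice(int Id, decimal Total);

public class InvoiceService(HybridCache cache)
{
    public async Task<Invoice> GetInvoiceAsync(int id, CancellationToken token = default) =>
        await cache.GetOrCreateAsync(
            $"invoice:{id}",
            async cancel => await LoadInvoiceAsync(id, cancel),
            tags: ["invoices"],            // tag the entry for bulk invalidation
            cancellationToken: token);

    // Invalidate every cached entry tagged "invoices" in one call.
    public async Task InvalidateAllInvoicesAsync(CancellationToken token = default) =>
        await cache.RemoveByTagAsync("invoices", token);

    // Stand-in for the real data source.
    private Task<Invoice> LoadInvoiceAsync(int id, CancellationToken token) =>
        Task.FromResult(new Invoice(id, 0m));
}
```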
  2. Welcome to our combined .NET servicing updates for March 2025. Let's get into the latest releases of .NET & .NET Framework. Here is a quick overview of what's new in these releases:

Security improvements

The following CVEs have been fixed this month:

CVE #           Title                                      Applies to
CVE-2025-24070  .NET Elevation of Privilege Vulnerability  .NET 9.0, .NET 8.0

                         .NET 8.0   .NET 9.0
Release Notes            8.0.14     9.0.3
Installers and binaries  8.0.14     9.0.3
Container Images         images     images
Linux packages           8.0.14     9.0.3
Known Issues             8.0        9.0

Release changelogs

  • ASP.NET Core: 8.0.14 | 9.0.3
  • Entity Framework Core: 9.0.3
  • Runtime: 8.0.14 | 9.0.3
  • SDK: 8.0.14 | 9.0.3
  • Winforms: 9.0.3

Share feedback about this release in the Release feedback issue.

.NET Framework March 2025 Updates

This month, there are no new security or non-security updates. For recent .NET Framework servicing updates, be sure to browse our release notes for .NET Framework for more details.

See you next month

That's it for this month; make sure you update to the latest service release today. The post .NET and .NET Framework March 2025 servicing releases updates appeared first on .NET Blog. View the full article
  3. Want to get started with AI development, but not sure where to start? I've got a treat for you: we have a new AI Chat Web App template now in preview. This template is part of our ongoing effort to make AI development with .NET easier to discover and use, with scaffolding and guidance within Visual Studio, Visual Studio Code, and the .NET CLI. https://devblogs.microsoft.com/dotnet/wp-content/uploads/sites/10/2025/03/AI-Template-Preview.mp4

Please note that this template is in preview, and future versions may change based on your feedback and the rapid advancements in AI.

Install the template now

To get started with the first preview of the template, install Microsoft.Extensions.AI.Templates from your terminal. Just run:

```shell
dotnet new install Microsoft.Extensions.AI.Templates
```

Once installed, the template is available in Visual Studio and Visual Studio Code (with the C# Dev Kit), or you can just run dotnet new aichatweb to create it in your working directory.

Getting Started with the .NET AI Chat Template

The .NET AI Chat template is designed to help you quickly build an AI-powered chat application and start chatting with custom data. This initial release focuses on a Blazor-based web app, built using the Microsoft.Extensions.AI and Microsoft.Extensions.VectorData abstractions. The template uses the Retrieval Augmented Generation (RAG) pattern commonly used for chat applications.

Key Features and Configuration Options

  • Chat with Custom Data: The template allows you to create a chat-based UI that can interact with sample PDFs or your own data using the RAG pattern.
  • Local and Azure Integration: The template supports both a local vector store for prototyping and Azure AI Search for more advanced configurations.
  • Customizable Code: The generated code includes UI components for chat interactions, citation tracking, and follow-up suggestions. You can customize or remove these components as needed.
  • Data Ingestion: The template includes code for data ingestion, caching, and processing, allowing you to handle various data sources and formats.

Using the template in Visual Studio

Once the template is installed from the command line, you can find it in Visual Studio via the File > New Project… menu. You can search for AI Chat, or choose the AI project type to find the template. After choosing your project name and location, you can select an AI model provider and vector store to get started. By default we're using GitHub Models with a local vector store, which is the easiest way to get started with minimal setup. You can learn about each of the options from the .NET AI Templates documentation.

Using Visual Studio Code and the C# Dev Kit

To use the template in Visual Studio Code, first install the C# Dev Kit extension. Then, use the .NET: New Project… command. By default this will create a new project using the GitHub Models model provider and a local vector store. You can learn about additional options from the .NET AI Templates documentation.

Chatting with your own data

This template includes two sample PDF files and example data ingestion code to process the PDFs. The data ingestion code is flexible, so you can swap out the sample PDFs. To chat with your own data, do the following:

  1. If you have your project running, stop it.
  2. Remove the sample PDF files from the /wwwroot/Data folder.
  3. Add your own PDF files to the /wwwroot/Data folder.
  4. Run the application again.

On startup, the data ingestion code (located in /Services/Ingestion/DataIngestor.cs) will compare the contents of the Data folder; it will remove old files from the configured vector store and add new ones. Note: depending on how many files you have, and how big they are, you may run into quota and rate limits with your configured AI model provider. When you hit a limit, you may see an error message or experience long delays in the startup of the application. See the AI Template Documentation for help troubleshooting.

Extending the chatbot's behavior

The code is built using Microsoft.Extensions.AI, which makes it very straightforward to plug in custom behaviors. You can give the chatbot access to call any C# function, extending its capabilities to include retrieving additional data or taking actions. As a very simple example, you can try giving it access to "weather" data. In Pages/Chat/Chat.razor, define a C# function in the @code block:

```csharp
private async Task<string> GetWeather(
    [Description("The city, correctly capitalized")] string city)
{
    string[] weatherValues = ["Sunny", "Cloudy", "Rainy", "Snowy", "Balmy", "Bracing"];
    return city == "London"
        ? "Drizzle"
        : weatherValues[Random.Shared.Next(weatherValues.Length)];
}
```

Then, inside the OnInitialized method, update chatOptions.Tools to include your method:

```csharp
chatOptions.Tools = [
    AIFunctionFactory.Create(SearchAsync),
    AIFunctionFactory.Create(GetWeather)
];
```

Now try setting a breakpoint inside GetWeather and ask the chatbot about the weather. You'll find that it calls your method and uses the result in its answer. You can use this to retrieve any information from external systems, including via asynchronous calls. Bear in mind that the parameters passed in from the LLM should be treated as untrusted input.

See it in action

Check out the latest episode of the .NET AI Community Standup as Alex, Bruno, and Jordan overview the new templates!

What's Coming Next – Share Your Thoughts

In future releases, we plan to expand the template offerings to include an AI Console template, a Minimal API template, support for .NET Aspire, and inclusion of the templates by default in the .NET SDK. We'll also be exploring support for Azure AI Foundry and working with the Semantic Kernel team to expand template options for Semantic Kernel users. We want to learn from you and use your feedback to help us shape these templates. Please share your thoughts about the template: both what works well for you, and what you'd like to see change! Thank you and happy coding! The post .NET AI Template Now Available in Preview appeared first on .NET Blog. View the full article
  4. The Microsoft.Extensions.AI.Evaluations library is designed to simplify the integration of AI evaluation processes into your applications. It provides a robust framework for evaluating your AI applications and automating the assessment of their performance. In November, we announced the public preview of the library, and today, we are thrilled to announce that it is now available open source in the dotnet/Extensions repo. This repository contains a suite of libraries that provide facilities commonly needed when creating production-ready applications. By making this library available to everyone, we aim to empower developers to harness the power of AI more effectively in their projects.

New Samples for Using the Library

To help you get started with the Microsoft.Extensions.AI.Evaluations library, we have released a set of new samples. These samples showcase various use cases and demonstrate how to leverage the library's capabilities effectively. Whether you are a seasoned developer or just beginning your AI journey, these samples will provide valuable insights and practical guidance. You can find the samples on our GitHub repository. We encourage you to explore them, experiment, and share your feedback with us. Your contributions and feedback are invaluable as we continue to enhance and expand the library's features.

Introducing the Azure DevOps Plug-in

Looking to integrate your AI evaluations into your Azure DevOps pipeline? We are excited to announce the availability of a plug-in in the marketplace. This plug-in allows you to seamlessly integrate AI evaluations into your pipelines, enabling continuous assessment of your AI models as part of your CI/CD workflows. With the AzDO plug-in, you can automate the evaluation process, ensuring that your applications meet the desired criteria before deployment. This integration enhances the reliability and efficiency of your AI solutions, helping you deliver high-quality applications with confidence.

Get Started Today

We invite you to explore the open-source Microsoft.Extensions.AI.Evaluations preview libraries, try out the new samples, and integrate the AzDO plug-in into your pipelines. We are excited to see how you will use these tools to innovate and create impactful AI solutions. Stay tuned for more updates and enhancements, and, as always, we welcome your feedback and contributions. The post Unlock new possibilities for AI Evaluations for .NET appeared first on .NET Blog. View the full article
  5. Today, we are excited to announce the first preview release of .NET 10! This first preview brings major enhancements across the .NET Runtime, SDK, libraries, C#, ASP.NET Core, Blazor, .NET MAUI, and more. Check out the full release notes linked below and get started today.

Download .NET 10 Preview 1

This release contains the following improvements.

Libraries

  • Finding certificates by thumbprints other than SHA-1
  • Finding PEM-encoded data in ASCII/UTF-8
  • New method overloads in ISOWeek for the DateOnly type
  • String normalization APIs to work with spans of characters
  • Numeric ordering for string comparison
  • TimeSpan.FromMilliseconds overload with a single parameter
  • ZipArchive performance and memory improvements
  • Additional TryAdd and TryGetValue overloads for OrderedDictionary<TKey, TValue>
  • More left-handed matrix transformation methods
  • Full Release Notes

Runtime

  • Array interface method devirtualization
  • Stack allocation of arrays of value types
  • AVX10.2 support
  • Full Release Notes

SDK

  • Pruning of framework-provided package references
  • Full Release Notes

C#

  • nameof in unbound generics
  • Implicit span conversions
  • field-backed properties
  • Modifiers on simple lambda parameters
  • Experimental feature: string literals in data section
  • Full Release Notes

F#

This release includes updates across the F# language, the FSharp.Core standard library, and FSharp.Compiler.Service.

Visual Basic

  • unmanaged constraint support
  • Honor overload resolution priority
  • Full Release Notes

ASP.NET Core & Blazor

  • OpenAPI 3.1 support
  • Generate OpenAPI documents in YAML format
  • Response description on ProducesResponseType
  • Detect if a URL is local using RedirectHttpResult.IsLocalUrl
  • Improvements to integration testing of apps with top-level statements
  • QuickGrid RowClass parameter
  • Blazor script as a static web asset
  • Route syntax highlighting for Blazor RouteAttribute
  • Full Release Notes

.NET MAUI

This release focused on quality improvements for .NET MAUI. In addition to the CollectionView enhancements for iOS and Mac Catalyst included in this release, browse through the full GitHub release notes for all of the improvements.

.NET for Android

  • Android 16 (Baklava) Beta 1
  • Minimum supported Android API recommendations
  • Building with JDK 21 is now supported
  • dotnet run support for Android projects
  • Marshal methods enabled by default
  • Visual Studio design-time builds no longer invoke aapt2

.NET for iOS, Mac Catalyst, macOS, tvOS

  • Trimmer warnings enabled by default
  • Bundling original resources in libraries

Browse the full release notes for all of this and more.

Windows Forms

  • Clipboard-related serialization and deserialization changes
  • Obsoleted Clipboard APIs
  • New Clipboard-related APIs
  • Full Release Notes

Windows Presentation Foundation (WPF)

This release focused on quality improvements. A full list of changes can be found in the release notes.

Entity Framework Core

  • Support for the .NET 10 LeftJoin operator
  • ExecuteUpdateAsync now accepts a regular, non-expression lambda
  • Full Release Notes

Container Images

  • 10.0-preview tags use Ubuntu 24.04
  • Debian images use Debian 13 "Trixie"
  • Ubuntu Chiseled images now contain the Chisel manifest
  • Full Release Notes

Get started

To get started with .NET 10, install the .NET 10 SDK. If you're on Windows using Visual Studio, we recommend installing the latest Visual Studio 2022 preview. You can also use Visual Studio Code and the C# Dev Kit extension with .NET 10.

Join us for the .NET 10 Preview 1 Unboxed live stream

Join us for an unboxing video with the team to discuss what's new in this preview release, with live demos from the dev team!

.NET 10 Discussions

The team has been making monthly announcements alongside full release notes on the dotnet/core GitHub Discussions and has seen great engagement and feedback from the community.

Stay up-to-date with .NET 10

You can stay up-to-date with all the features of .NET 10 with:

  • What's new in .NET 10
  • What's new in C# 14
  • What's new in .NET MAUI
  • What's new in ASP.NET Core
  • What's new in Entity Framework Core
  • What's new in Windows Forms
  • What's new in WPF
  • Breaking Changes in .NET 10
  • .NET 10 Releases

Additionally, be sure to subscribe to the GitHub Discussions RSS news feed for all release announcements. We want your feedback, so head over to the .NET 10 Preview 1 GitHub Discussion to discuss features and give feedback for this release. The post .NET 10 Preview 1 is now available! appeared first on .NET Blog. View the full article
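Two of the C# 14 language items in the preview can be shown in a few lines. This is a hedged sketch that assumes a .NET 10 preview SDK with the C# 14 language version; the Counter type is illustrative, not from the release notes:

```csharp
using System.Collections.Generic;

public class Counter
{
    // field-backed property: the new `field` keyword stands in for a
    // compiler-generated backing field, so no manual field is needed.
    public int Count
    {
        get => field;
        set => field = value < 0 ? 0 : value; // clamp negatives to zero
    }

    // nameof in unbound generics: nameof(List<>) was previously a
    // compile error; in C# 14 it evaluates to "List".
    public static string OpenGenericName => nameof(List<>);
}
```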
  6. We are excited to announce the release of .NET Aspire 9.1! This release includes several new features and quality-of-life improvements based on feedback from developers using .NET Aspire in production applications. In this post, we will focus on the new features in the .NET Aspire dashboard, as well as some other cool features added in this release.

Six great new dashboard features

The .NET Aspire dashboard has received several new features in this release. Here are the highlights:

  • Resource Relationships: The dashboard now reflects the concept of a "parent" and "child" resource relationship. For example, if you create a Postgres instance with multiple databases, they will now be nested under the same instance on the Resources page.
  • Localization Overrides: The dashboard defaults to the language set in your browser. This release introduces the ability to override this setting and change the dashboard language independently of the browser language via a new language dropdown.
  • Filtering: You can now filter what you see on the Resources page by resource type, state, and health state.
  • More Resource Details: When you select a resource in the dashboard, more data points are now available in the details pane, including references, back references, and volumes with their mount types.
  • CORS Support for Custom Local Domains: You can now set the DOTNET_DASHBOARD_CORS_ALLOWED_ORIGINS environment variable to allow the dashboard to receive telemetry from other browser apps, such as resources running on custom localhost domains. For more information, see .NET Aspire app host: Dashboard configuration.
  • Flexibility with Console Logs: The console log page has two new options. You can now download your logs so you can view them in your own diagnostics tools, and you can turn timestamps on or off to reduce visual clutter when needed.

Various UX Improvements

Several new features in .NET Aspire 9.1 enhance and streamline popular tasks:

  • Resource commands, such as Start and Stop buttons, are now available on the Console logs page.
  • Single selection to open in the text visualizer.
  • URLs within logs are now automatically clickable, with commas removed from endpoints.
  • The scrolled position now resets when switching between different resources.

For more details on the latest dashboard enhancements, check out James Newton-King on Bluesky, where he's been sharing new features daily.

And A Whole Lot More

In addition to the new dashboard features, .NET Aspire 9.1 includes several other cool features:

  • Start Resources on Demand: You can now tell resources not to start with the rest of your app by using WithExplicitStart() on the resource in your app host. Then, you can start it whenever you're ready from inside the dashboard.
  • Better Docker Integration: The PublishAsDockerfile() feature was introduced for all projects and executable resources. This enhancement allows for complete customization of the Docker container and Dockerfile used during the publish process.
  • Cleaning Up Docker Networks: 9.1 addresses a persistent issue where Docker networks created by .NET Aspire would remain active even after the application was stopped. This bug, tracked in issue #6504, is resolved; Docker networks are now properly cleaned up, ensuring a more efficient and tidy development environment.
  • Improved Dev Container support: This release improves support for Dev Containers in both GitHub Codespaces and Visual Studio Code. See dotnet/aspire-devcontainer.

See the full list of new features in the What's new in .NET Aspire 9.1 documentation.

Learn More on the .NET Aspire Community Standup

Join us on the .NET Aspire Community Standup to learn more about the new features in .NET Aspire 9.1. Watch the recording below:

Learn More & Get Involved

We hope you enjoy the new features in .NET Aspire 9.1! As always, we welcome your feedback and contributions. Here are some ways to get involved:

  • GitHub: Collaborate with us on GitHub or join us on Discord to chat with team members.
  • Documentation: Check out the official documentation for more information on .NET Aspire.
  • Community Standup: Join us on the .NET Aspire Community Standup to stay up-to-date with the latest developments.

Thank you for being a part of the .NET Aspire community! The post .NET Aspire 9.1 is here with six great new dashboard features, and more! appeared first on .NET Blog. View the full article
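The two app host features above, WithExplicitStart() and PublishAsDockerfile(), can be sketched in an app host Program.cs. This is a hypothetical sketch: the resource names, the Projects.BatchWorker project reference, and the AddNpmApp call (from the Aspire.Hosting.NodeJs package) are illustrative assumptions, not code from the post:

```csharp
// Hypothetical .NET Aspire app host (Program.cs).
var builder = DistributedApplication.CreateBuilder(args);

// Opt this resource out of automatic startup; it stays stopped until
// you press Start on it in the dashboard.
builder.AddProject<Projects.BatchWorker>("batch-worker")
       .WithExplicitStart();

// Publish an executable resource as a Dockerfile, allowing full
// customization of the generated container at publish time.
builder.AddNpmApp("frontend", "../frontend")
       .PublishAsDockerfile();

builder.Build().Run();
```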
  7. Join us for an exciting Let's Learn .NET live stream event where we will explore GitHub Copilot and its capabilities. This event will cover an introduction to GitHub Copilot, best practices, and tips on how to generate documentation, tests, and more. We will also build a mini-game from scratch using Copilot and GitHub Codespaces.

Event Details

  • Date: February 27th, 2025
  • Time: 10:00 AM Pacific Time
  • Location: YouTube Live Stream

Topics Covered

  • Introduction to GitHub Copilot: Learn about GitHub Copilot, an AI-powered code completion tool that helps you write code faster and with fewer errors. We will cover its features, benefits, and how it can improve your development workflow.
  • Best Practices and Tips: Discover best practices and tips for using GitHub Copilot effectively. Learn how to generate documentation, tests, and more with ease.
  • Building a Mini-Game with Copilot: Watch as we build a mini-game from scratch using GitHub Copilot. See how Copilot can assist in the development process and help you create a functional game quickly. We will be taking the Microsoft Learn training challenge and earning a badge!
  • Using GitHub Codespaces: Throughout the workshop, we will explore GitHub Codespaces and how it can be used in conjunction with GitHub Copilot. Learn how to set up a development environment quickly and efficiently.

Get Started

To get started with GitHub Copilot and follow along with the event, use this repository: mastering-github-copilot-for-dotnet-csharp-developers. Don't forget to sign up for your free GitHub Copilot account.

Join Us

Don't miss out on this exciting event! Join us on February 27th at 10:00 AM Pacific Time on YouTube. Join the Event. The post Let's Learn .NET: GitHub Copilot Event appeared first on .NET Blog. View the full article
  8. .NET Multi-platform App UI (.NET MAUI) continues to evolve with each release, and .NET 9 brings a focus on trimming and a new supported runtime: NativeAOT. These features can help you reduce application size, improve startup times, and ensure your applications run smoothly on various platforms. Both developers looking to optimize their .NET MAUI applications and NuGet package authors can take advantage of these features in .NET 9. We'll also walk through the options available to you as a developer for measuring the performance of your .NET MAUI applications. Both CPU sampling and memory snapshots are available via dotnet-trace and dotnet-gcdump, respectively. These can give insights into performance problems in your application, your NuGet packages, or even something we should look into in .NET MAUI itself.

Background

By default, .NET MAUI applications on iOS and Android use the following settings:

  • "Self-contained", meaning a copy of the BCL and runtime is included with the application. Note: this makes .NET MAUI applications suitable for distribution through app stores, as no prerequisites such as installing a .NET runtime are required.
  • Partially trimmed (TrimMode=partial), meaning that code within your application or NuGet packages is not trimmed by default. Note: this is a good default, as it is the most compatible with existing code and NuGet packages in the ecosystem.

Full Trimming

This is where full trimming (TrimMode=full) can make an impact on your application's size. If you have a substantial amount of C# code or NuGet packages, you may be missing out on a significant application size reduction. To opt into full trimming, add the following to your .csproj file:

```xml
<PropertyGroup>
  <TrimMode>full</TrimMode>
</PropertyGroup>
```

For an idea of the impact of full trimming, see the size comparison charts in the original post. Note: MyPal is a sample .NET MAUI application that makes a useful comparison because of its use of several common NuGet packages. See our trimming .NET MAUI documentation for more information on "full" trimming.

NativeAOT

Building upon full trimming, NativeAOT relies on libraries being both trim-compatible and AOT-compatible. NativeAOT is a new runtime that can improve startup time and reduce application size compared to existing runtimes. Note: NativeAOT is not yet supported on Android, but is available on iOS, Mac Catalyst, and Windows. To opt into NativeAOT:

```xml
<PropertyGroup>
  <IsAotCompatible>true</IsAotCompatible>
  <PublishAot>true</PublishAot>
</PropertyGroup>
```

For an idea of the impact of NativeAOT on application size and startup performance, see the charts in the original post. Note: macOS in those charts is running on Mac Catalyst, the default for .NET MAUI applications running on Mac operating systems. See our NativeAOT deployment documentation for more information about this newly supported runtime.

NuGet Package Authors

As a NuGet package author, you may wish for your package to run in either fully trimmed or NativeAOT scenarios. This can be useful for developers targeting .NET MAUI, mobile, or even self-contained ASP.NET microservices. To support NativeAOT, you will need to:

  1. Mark your assemblies as "trim-compatible" and "AOT-compatible".
  2. Enable Roslyn analyzers for trimming and NativeAOT.
  3. Solve all the warnings.

Begin by modifying your .csproj file:

```xml
<PropertyGroup>
  <IsTrimmable>true</IsTrimmable>
  <IsAotCompatible>true</IsAotCompatible>
</PropertyGroup>
```

These properties will enable Roslyn analyzers as well as include [assembly: AssemblyMetadata] information in the resulting .NET assembly. Depending on your library's usage of features like System.Reflection, you could have just a few warnings or potentially many. See the documentation on preparing libraries for trimming for more information.

XAML and Trimming

Sometimes, taking advantage of NativeAOT in your app can be as easy as adding a property to your project file. However, for many .NET MAUI applications, there can be a lot of warnings to solve.
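When the analyzers flag reflection usage in a library, the usual remedies are the standard trimming attributes from System.Diagnostics.CodeAnalysis. A minimal sketch, assuming a hypothetical plugin-loading helper (PluginLoader and its members are illustrative, not from the post):

```csharp
using System;
using System.Diagnostics.CodeAnalysis;

public static class PluginLoader
{
    // Option 1: the caller passes a Type, and the attribute tells the
    // trimmer to preserve that type's parameterless constructor.
    public static object CreateKnownPlugin(
        [DynamicallyAccessedMembers(DynamicallyAccessedMemberTypes.PublicParameterlessConstructor)]
        Type pluginType)
        => Activator.CreateInstance(pluginType)!;

    // Option 2: the type name is only known at runtime, so the trimmer
    // cannot help; surface a warning to callers instead of suppressing it.
    [RequiresUnreferencedCode("Plugin types resolved by name may be removed by trimming.")]
    public static object CreateByName(string typeName)
        => Activator.CreateInstance(Type.GetType(typeName, throwOnError: true)!)!;
}
```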
The NativeAOT compiler removes unnecessary code and metadata to make the app smaller and faster. However, this requires understanding which types can be created and which methods can and cannot be called at runtime, which is often impossible in code that heavily uses System.Reflection. There are two areas in .NET MAUI that fall into this category: XAML and data binding.

Compiled XAML

Loading XAML at runtime provides flexibility and enables features like XAML hot reload. XAML can instantiate any class in the whole app, the .NET MAUI SDK, and referenced NuGet packages, and it can set a value on any property. Conceptually, loading a XAML layout at runtime requires:

  1. Parsing the XML document.
  2. Looking up the control types based on the XML element names using Type.GetType(xmlElementName).
  3. Creating new instances of the controls using Activator.CreateInstance(controlType).
  4. Converting the raw string XML attribute values into the target type of each property.
  5. Setting properties based on the names of the XML attributes.

This process is not only slow; it presents a great challenge for NativeAOT. For example, the trimmer does not know which types will be looked up using the Type.GetType method. This means that either the compiler would need to keep all the classes from the whole .NET MAUI SDK and all the NuGet packages in the final app, or the method might fail at runtime because it cannot find the types declared in the XML input. Fortunately, .NET MAUI has a solution: XAML compilation. This turns XAML into the actual code for the InitializeComponent() method at build time. Once the code is generated, the NativeAOT compiler has all the information it needs to trim your app. In .NET 9, we implemented the last remaining XAML features that the compiler could not handle in previous releases, especially compiling bindings. Lastly, if your app relies on loading XAML at runtime, NativeAOT might not be suitable for your application.
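To see why the reflection-based path is hostile to trimming, here is a minimal sketch (not .NET MAUI's actual loader) of the type lookup and instantiation described above; StringBuilder stands in for a control type:

```csharp
using System;

// Because the type name arrives as a runtime string, the trimmer cannot
// tell which types this might name, so it cannot safely remove any
// instantiable type from the app.
static object CreateControlFromXmlName(string xmlElementName)
{
    Type controlType = Type.GetType(xmlElementName, throwOnError: true)!;
    return Activator.CreateInstance(controlType)!;
}

Console.WriteLine(CreateControlFromXmlName("System.Text.StringBuilder").GetType().Name);
```

Compiled XAML sidesteps this entirely: the generated InitializeComponent() calls constructors and property setters directly, so every reachable type is visible to static analysis.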
Compiled Bindings

A binding ties a source property to a target property; when the source changes, the value is propagated to the target. Bindings in .NET MAUI are defined using a string "path" that resembles a C# expression for accessing properties and indexers. When the binding is applied to a source object, .NET MAUI uses System.Reflection to follow the path to the desired source property. This suffers from the same problems as loading XAML at runtime: the trimmer does not know which properties could be accessed by reflection, so it does not know which properties it can safely trim from the final application. When we know the type of the source object at build time from x:DataType attributes, we can compile the binding path into a simple getter method (and a setter method for two-way bindings). The compiler also ensures that the binding listens for property changes along the binding path on properties that implement INotifyPropertyChanged. The XAML compiler could already compile most bindings in .NET 8 and earlier; in .NET 9 we made sure that any binding in your XAML code can be compiled. Learn more about this feature in the compiled bindings documentation.

Compiled bindings in C#

Up until .NET 8, the only supported way of defining bindings in C# code was a string-based path. In .NET 9, we are adding a new API which compiles the binding using a source generator:

```csharp
// .NET 8 and earlier
myLabel.SetBinding(Label.TextProperty, "Text");

// .NET 9
myLabel.SetBinding(Label.TextProperty, static (Entry nameEntry) => nameEntry.Text);
```

The Binding.Create() method is also an option, for when you need to save the Binding instance for later use:

```csharp
var nameBinding = Binding.Create(static (Entry nameEntry) => nameEntry.Text);
```

.NET MAUI's source generator compiles the binding the same way the XAML compiler does, so the binding can be fully analyzed by the NativeAOT compiler.
Even if you aren't planning to migrate your application to NativeAOT, compiled bindings can improve the general performance of the binding. To illustrate the difference, let's use BenchmarkDotNet to measure the difference between the calls to SetBinding() on Android using the Mono runtime:

// dotnet build -c Release -t:Run -f net9.0-android
public class SetBindingBenchmark
{
    private readonly ContactInformation _contact = new ContactInformation(new FullName("John"));
    private readonly Label _label = new();

    [GlobalSetup]
    public void Setup()
    {
        DispatcherProvider.SetCurrent(new MockDispatcherProvider());
        _label.BindingContext = _contact;
    }

    [Benchmark(Baseline = true)]
    public void Classic_SetBinding()
    {
        _label.SetBinding(Label.TextProperty, "FullName.FirstName");
    }

    [Benchmark]
    public void Compiled_SetBinding()
    {
        _label.SetBinding(Label.TextProperty, static (ContactInformation contact) => contact.FullName?.FirstName);
    }

    [IterationCleanup]
    public void Cleanup()
    {
        _label.RemoveBinding(Label.TextProperty);
    }
}

When I ran the benchmark on a Samsung Galaxy S23, I got the following results:

Method               Mean      Error     StdDev    Ratio  RatioSD
Classic_SetBinding   67.81 us  1.338 us  1.787 us  1.00   0.04
Compiled_SetBinding  30.61 us  0.629 us  1.182 us  0.45   0.02

The classic binding needs to first parse the string-based path and then use System.Reflection to get the current value of the source.
Each subsequent update of the source property will also be faster with the compiled binding:

// dotnet build -c Release -t:Run -f net9.0-android
public class UpdateValueTwoLevels
{
    ContactInformation _contact = new ContactInformation(new FullName("John"));
    Label _label = new();

    [GlobalSetup]
    public void Setup()
    {
        DispatcherProvider.SetCurrent(new MockDispatcherProvider());
        _label.BindingContext = _contact;
    }

    [IterationSetup(Target = nameof(Classic_UpdateWhenSourceChanges))]
    public void SetupClassicBinding()
    {
        _label.SetBinding(Label.TextProperty, "FullName.FirstName");
    }

    [IterationSetup(Target = nameof(Compiled_UpdateWhenSourceChanges))]
    public void SetupCompiledBinding()
    {
        _label.SetBinding(Label.TextProperty, static (ContactInformation contact) => contact.FullName?.FirstName);
    }

    [Benchmark(Baseline = true)]
    public void Classic_UpdateWhenSourceChanges()
    {
        _contact.FullName.FirstName = "Jane";
    }

    [Benchmark]
    public void Compiled_UpdateWhenSourceChanges()
    {
        _contact.FullName.FirstName = "Jane";
    }

    [IterationCleanup]
    public void Reset()
    {
        _label.Text = "John";
        _contact.FullName.FirstName = "John";
        _label.RemoveBinding(Label.TextProperty);
    }
}

Method                            Mean      Error     StdDev    Ratio  RatioSD
Classic_UpdateWhenSourceChanges   46.06 us  0.934 us  1.369 us  1.00   0.04
Compiled_UpdateWhenSourceChanges  30.85 us  0.634 us  1.295 us  0.67   0.03

The differences for a single binding aren't that dramatic, but they add up. This can be noticeable on complex pages with many bindings or when scrolling lists like CollectionView or ListView. The full source code of the above benchmarks is available on GitHub.

Profiling .NET MAUI Applications

Attaching dotnet-trace to a .NET MAUI application allows you to get profiling information in formats like .nettrace and .speedscope. These give you CPU sampling information about the time spent in each method in your application. This is quite useful for finding where time is spent in the startup or general performance of your .NET applications.
Likewise, dotnet-gcdump can take memory snapshots of your application that display every managed C# object in memory. dotnet-dsrouter is required for connecting dotnet-trace to a remote device, so it is not needed for desktop applications. You can install these tools with:

$ dotnet tool install -g dotnet-trace
You can invoke the tool using the following command: dotnet-trace
Tool 'dotnet-trace' was successfully installed.

$ dotnet tool install -g dotnet-dsrouter
You can invoke the tool using the following command: dotnet-dsrouter
Tool 'dotnet-dsrouter' was successfully installed.

$ dotnet tool install -g dotnet-gcdump
You can invoke the tool using the following command: dotnet-gcdump
Tool 'dotnet-gcdump' was successfully installed.

From here, the instructions differ slightly for each platform, but generally the steps are:

Build your application in Release mode. For Android, set <AndroidEnableProfiler>true</AndroidEnableProfiler> in your .csproj file, so the required Mono diagnostic components are included in the application.

If profiling mobile, run dotnet-dsrouter android (or dotnet-dsrouter ios, etc.) on your development machine.

Configure environment variables so the application can connect to the profiler. For example, on Android:

$ adb reverse tcp:9000 tcp:9001
# no output
$ adb shell setprop debug.mono.profile '127.0.0.1:9000,nosuspend,connect'
# no output

Run your application.
Attach dotnet-trace (or dotnet-gcdump) to the application, using the PID of dotnet-dsrouter:

$ dotnet-trace ps
38604 dotnet-dsrouter ~/.dotnet/tools/dotnet-dsrouter.exe ~/.dotnet/tools/dotnet-dsrouter.exe android

$ dotnet-trace collect -p 38604 --format speedscope
No profile or providers specified, defaulting to trace profile 'cpu-sampling'
Provider Name                        Keywords            Level             Enabled By
Microsoft-DotNETCore-SampleProfiler  0x0000F00000000000  Informational(4)  --profile
Microsoft-Windows-DotNETRuntime      0x00000014C14FCCBD  Informational(4)  --profile
Waiting for connection on /tmp/maui-app
Start an application with the following environment variable: DOTNET_DiagnosticPorts=/tmp/maui-app

For iOS, macOS, and MacCatalyst, see the iOS profiling wiki page for more information.

Note: For Windows applications, you might consider using Visual Studio's built-in profiling tools, but dotnet-trace collect -- C:\path\to\an\executable.exe is also an option.

Now that you've collected a file containing performance information, the next step is opening it to view the data:

dotnet-trace by default outputs .nettrace files, which can be opened in PerfView or Visual Studio.
dotnet-trace collect --format speedscope outputs .speedscope files, which can be opened in the Speedscope web app.
dotnet-gcdump outputs .gcdump files, which can be opened in PerfView or Visual Studio. Note that there is currently no good option for opening these files on macOS.

In the future, we hope to make profiling .NET MAUI applications easier in both future releases of the above .NET diagnostic tooling and Visual Studio.

Note: The NativeAOT runtime does not support dotnet-trace and performance profiling. You can use the other supported runtimes for this, or use native profiling tools instead, such as Xcode's Instruments. See the profiling .NET MAUI wiki page for links to documentation on each platform, or a profiling demo on YouTube for a full walkthrough.
Conclusion

.NET 9 introduces performance enhancements for .NET MAUI applications through full trimming and NativeAOT. These features enable developers to create more efficient and responsive applications by reducing application size and improving startup times. By leveraging tools like dotnet-trace and dotnet-gcdump, developers can gain insights into their application's performance. For a full rundown on .NET MAUI trimming and NativeAOT, see the .NET Conf 2024 session on the topic.

The post .NET MAUI Performance Features in .NET 9 appeared first on .NET Blog.

View the full article
  9. We're excited to announce the Chroma C# SDK. Whether you're building AI solutions or enhancing existing projects with advanced search capabilities, you now have the option of using Chroma as a database provider in your .NET applications.

What is Chroma?

Chroma is an open-source database for your AI applications. With support for storing embeddings, metadata filtering, vector search, full-text search, document storage, and multi-modal retrieval, you can use Chroma to power semantic search and Retrieval Augmented Generation (RAG) features in your app. For more details, check out the Chroma website.

Get started with Chroma in your C# application

In this scenario, we'll be using the ChromaDB.Client package to connect to a Chroma database and search for movies using vector search. The easiest way to start is locally, using the Chroma Docker image. You can also deploy an instance in Azure.

Connect to the database

Create a C# console application.
Install the ChromaDB.Client NuGet package.
Create a ChromaClient with configuration options.

using ChromaDB.Client;

var configOptions = new ChromaConfigurationOptions(uri: "http://localhost:8000/api/v1/");
using var httpClient = new HttpClient();
var client = new ChromaClient(configOptions, httpClient);

When using a hosted version of Chroma, replace the uri with your hosted endpoint.

Create a collection

Now that you have a client, create a collection to store movie data.

var collection = await client.GetOrCreateCollection("movies");

To perform operations on that collection, you'll then need to create a collection client.

var collectionClient = new ChromaCollectionClient(collection, configOptions, httpClient);

Add data to your collection

Once your collection is created, it's time to add data to it. The data we're storing will consist of:

Movie IDs
Embeddings to represent the movie description.
Metadata containing the movie title

ID  Title          Embedding                   Movie Description
1   The Lion King  [0.10022575, -0.23998135]   The Lion King is a classic Disney animated film that tells the story of a young lion named Simba who embarks on a journey to reclaim his throne as the king of the Pride Lands after the tragic death of his father.
2   Inception      [0.10327095, 0.2563685]     Inception is a mind-bending science fiction film directed by Christopher Nolan. It follows the story of Dom Cobb, a skilled thief who specializes in entering people's dreams to steal their secrets. However, he is offered a final job that involves planting an idea into someone's mind.
3   Toy Story      [0.095857024, -0.201278]    Toy Story is a groundbreaking animated film from Pixar. It follows the secret lives of toys when their owner, Andy, is not around. Woody and Buzz Lightyear are the main characters in this heartwarming tale.
4   Pulp Fiction   [0.106827796, 0.21676421]   Pulp Fiction is a crime film directed by Quentin Tarantino. It weaves together interconnected stories of mobsters, hitmen, and other colorful characters in a non-linear narrative filled with dark humor and violence.
5   Shrek          [0.09568083, -0.21177962]   Shrek is an animated comedy film that follows the adventures of Shrek, an ogre who embarks on a quest to rescue Princess Fiona from a dragon-guarded tower in order to get his swamp back.
List<string> movieIds = ["1", "2", "3", "4", "5"];

List<ReadOnlyMemory<float>> descriptionEmbeddings = [
    new [] { 0.10022575f, -0.23998135f },
    new [] { 0.10327095f, 0.2563685f },
    new [] { 0.095857024f, -0.201278f },
    new [] { 0.106827796f, 0.21676421f },
    new [] { 0.09568083f, -0.21177962f },
];

List<Dictionary<string, object>> metadata = [
    new Dictionary<string, object> { ["Title"] = "The Lion King" },
    new Dictionary<string, object> { ["Title"] = "Inception" },
    new Dictionary<string, object> { ["Title"] = "Toy Story" },
    new Dictionary<string, object> { ["Title"] = "Pulp Fiction" },
    new Dictionary<string, object> { ["Title"] = "Shrek" },
];

await collectionClient.Add(movieIds, descriptionEmbeddings, metadata);

Search for movies (using vector search)

Now that your data is in the database, you can query it. In this case, we're using vector search.

Text                     Embedding
A family friendly movie  [0.12217915, -0.034832448]

List<ReadOnlyMemory<float>> queryEmbedding = [new([0.12217915f, -0.034832448f])];
var queryResult = await collectionClient.Query(
    queryEmbeddings: queryEmbedding,
    nResults: 2,
    include: ChromaQueryInclude.Metadatas | ChromaQueryInclude.Distances);

foreach (var result in queryResult)
{
    foreach (var item in result)
    {
        Console.WriteLine($"Title: {(string)item.Metadata["Title"] ?? string.Empty} {(item.Distance)}");
    }
}

The result should look similar to the following output.

Title: Toy Story 0.028396977
Title: Shrek 0.032012463

Watch it live

Join Jiří Činčura on the .NET Data Community Standup on February 26 to learn more about how to use Chroma and the new C# SDK.

Conclusion

This latest addition enhances the growing AI ecosystem in .NET. It paves the way for a simpler implementation of the existing Semantic Kernel connector and seamless integration into your .NET apps using foundational components like Microsoft.Extensions.VectorData and Microsoft.Extensions.AI. We'd like to thank @ssone95 for his work and contributions to the project.
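As an aside, the distances in the query output above can be reproduced by hand. Chroma collections default to squared Euclidean (L2) distance, so a few lines of LINQ over the sample embeddings recover the same ranking; this sketch is standalone and does not touch the ChromaDB.Client API.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// The "A family friendly movie" query embedding from the table above.
float[] query = [0.12217915f, -0.034832448f];

var movies = new Dictionary<string, float[]>
{
    ["The Lion King"] = [0.10022575f, -0.23998135f],
    ["Inception"]     = [0.10327095f, 0.2563685f],
    ["Toy Story"]     = [0.095857024f, -0.201278f],
    ["Pulp Fiction"]  = [0.106827796f, 0.21676421f],
    ["Shrek"]         = [0.09568083f, -0.21177962f],
};

// Squared L2 distance: sum of squared per-dimension differences.
static float SquaredL2(float[] a, float[] b) =>
    a.Zip(b, (x, y) => (x - y) * (x - y)).Sum();

// Rank all movies by distance and keep the two nearest (nResults: 2).
var top2 = movies
    .Select(m => (Title: m.Key, Distance: SquaredL2(query, m.Value)))
    .OrderBy(r => r.Distance)
    .Take(2)
    .ToList();

foreach (var (title, distance) in top2)
    Console.WriteLine($"Title: {title} {distance}");
```

This prints "Toy Story" first and "Shrek" second, matching the SDK's query result.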
We’re excited to continue building partnerships and working with the community to enable .NET developers to build AI applications. To learn how you can start building AI apps using databases like Chroma, check out the .NET AI documentation. Try out the Chroma C# SDK today and provide feedback. The post Announcing Chroma DB C# SDK appeared first on .NET Blog. View the full article
  10. If you are building web apps with Razor, we have some great new features that you are going to love, for both Visual Studio and Visual Studio Code! Extract to Component refactoring and the new Roslyn-based C# tokenizer are now available and are designed to improve your productivity in Razor files. Let's take a look.

Extract to Component

Extract to Component, available in Visual Studio 17.12, is a new refactoring that automates the process of creating a new Razor/Blazor component. Instead of manually creating a new file and copy/pasting the code you want to extract, this feature does that work for you: highlight the code (or tag) you want to extract, then select the lightbulb refactoring (Ctrl+.). This feature makes it easier to create reusable components, allowing for a cleaner and more manageable codebase. In this first iteration of the feature, Extract to Component focuses on support for basic, mostly HTML-based extraction scenarios. However, we have plans to add additional improvements and more advanced scenarios (i.e., more consistent extractions involving variable dependencies, C#, parameters, etc.).

Roslyn C# Tokenizer

The C# tokenizer/lexer update brings significant improvements to how Razor handles C# code. Many users have expressed frustration with not being able to use raw string literals and verbatim interpolated strings in Razor files, and the new Roslyn C# lexer fixes that! In addition to these string formats, the lexer also adds support for binary literals and improves the handling of C# preprocessor directives, ensuring they follow C# rules. Ultimately, the new lexer will also make it easier to support new C# language features going forward. This new tokenizer is not on by default until .NET 10, but it is available in both Visual Studio (17.13) and Visual Studio Code for .NET 9.
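For reference, these are the literal forms in question, shown here as plain C# (the same literals previously tripped up the old Razor lexer when used inside Razor files):

```csharp
using System;

string name = "Razor";

// Raw string literal: embedded quotes and braces need no escaping.
string json = """
    { "framework": "Blazor", "kind": "component" }
    """;

// Verbatim interpolated string: literal backslashes plus interpolation.
string path = $@"C:\src\{name}\Pages";

// Binary literal: also now tokenized correctly.
int flags = 0b0010_0100;

Console.WriteLine(json.Trim());
Console.WriteLine(path);  // C:\src\Razor\Pages
Console.WriteLine(flags); // 36
```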
To enable the C# tokenizer today, check the Use the C# tokenizer for Razor files in the IDE option under Tools > Options > Preview Features, and add <Features>use-roslyn-tokenizer;$(Features)</Features> to a property group in your .csproj or directory.props file. This new lexer does currently come with some breaking changes, particularly around preprocessor directives, so we encourage you to share any related issues you may experience in the Razor GitHub repository.

Summary

These two updates, Extract to Component and the C# tokenizer, help enhance your Razor productivity. By adopting these features, you can ensure cleaner code, better language support, and an overall more efficient development process. However, there's always room for improvement! To share your Razor feedback, submit issues in our Razor GitHub repo or the Developer Community, or check out this survey to share your Extract to Component feedback! Finally, if you'd like to chat directly with the Razor team about our upcoming roadmap and how we're addressing your issues, you can join our upcoming .NET Community Standup on February 18th!

The post New Features for Enhanced Razor Productivity! appeared first on .NET Blog.

View the full article
  11. Today we're excited to introduce a new hands-on course designed for .NET developers who want to explore the world of Generative AI: Generative AI for Beginners - .NET. Our focus in this course is code-first, to teach you what you need to know to be confident building .NET GenAI applications today.

What is this course about?

As generative AI becomes more accessible, it's essential for developers to understand how to use it responsibly and effectively. To fill this need, we created a course that covers the basics of Generative AI for the .NET ecosystem, including how to set up your .NET environment, core techniques, practical samples, and responsible use of AI. You'll learn how to create real-world .NET AI-based apps using a variety of libraries and tools including Microsoft Extensions for AI, GitHub Models and Codespaces, Semantic Kernel, Ollama, and more. We've included several lessons and they all include:

Short 5–10 minute videos explaining each concept.
Fully functional .NET code samples ready to run and explore.
Integration with GitHub Codespaces and GitHub Models for quick, convenient setup.
Guidance on using GitHub Models and local models with Ollama for flexibility and privacy.

Lessons Overview

These lessons provide a guided roadmap, starting with core generative AI concepts for .NET developers and how to configure your environment to access AI models in the cloud or locally via Ollama. You'll then explore techniques that go beyond text processing, such as assembling practical solutions with chatbots, including adding video and real-time audio to chat. You'll also learn about the world of AI agents: autonomous, intelligent agents that act on the user's behalf. Finally, you'll learn about the importance of responsible AI use, ensuring your applications remain ethical and secure.
The course includes examples of the semantic search feature you'll build and of the real-time voice chat in action.

Getting Started

All that's required is some .NET experience and a desire to learn! You can clone the repo and start working all locally. Even better, we've done our best to reduce all of the friction from getting started! You can run everything in GitHub Codespaces and use GitHub Models to access the various LLMs we'll use in the course – all for free. Check out the course repository, and explore the lessons at your own pace.

Watch an overview on the .NET AI Community Standup

Check out the .NET AI Community Standup where we gave a sneak peek into the Generative AI for Beginners .NET course, showcasing how .NET developers can harness the power of Generative AI in real-world scenarios.

Contribute and Connect

Join us on GitHub; contributions are welcome! Submit issues, add new code samples, or create pull requests. You can also join the Azure AI Community Discord to connect with other AI enthusiasts. We look forward to seeing what you build with us! Get started right away and discover how simple it can be to bring AI into your .NET projects.

The post Announcing Generative AI for Beginners – .NET appeared first on .NET Blog.

View the full article
  12. Here is a list of this month's .NET releases, including .NET 9.0.2 and .NET 8.0.13. It should be noted that this month's release does not include any new security updates.

                         .NET 8.0  .NET 9.0
Release Notes            8.0.13    9.0.2
Installers and binaries  8.0.13    9.0.2
Container Images         images    images
Linux packages           8.0.13    9.0.2
Known Issues             8.0       9.0

Release changelogs

ASP.NET Core: 8.0.13 | 9.0.2
EF Core: 9.0.2
Runtime: 8.0.13 | 9.0.2
SDK: 8.0.13 | 9.0.2
Windows Forms: 8.0.13 | 9.0.2

Share feedback about this release in the Release feedback issue.

.NET Framework February 2025 Updates

This month, there are no new security or non-security updates for .NET Framework. For recent .NET Framework servicing updates, be sure to browse our release notes for .NET Framework for more details.

See you next month

That's it for this month. Make sure you update to the latest service release today.

The post .NET and .NET Framework February 2025 servicing releases updates appeared first on .NET Blog.

View the full article
  13. Responding to your feedback, the team has been rolling out a series of updates aimed at enhancing the user experience and improving performance and reliability. These updates are designed to make coding in C# more efficient, enjoyable, and productive for developers using VS Code.

Solution Explorer Updates

You told us you don't always need a solution file in your workspace. Solution-less workspace mode is now in preview. This feature allows developers to work on C# projects without the need for a solution file (.sln), streamlining the workflow and reducing overhead. Try it out now by setting dotnet.previewSolution-freeWorkspaceMode to true.

.NET Aspire Orchestration

Also in preview now, you can make any solution a .NET Aspire solution by adding the .NET Aspire App Host and Service Defaults projects to your solution, letting .NET Aspire simplify your run, debug, and deployment process for your existing application. Open the command palette and select .NET: Add .NET Aspire Orchestration, tell it which projects to orchestrate, name the AppHost and ServiceDefaults projects, and you are on your way.

Razor/Blazor experience

Improvements to the Razor/Blazor experience include improvements to Hot Reload (currently in experimental mode) and enhancements to Razor error management and IntelliSense. For Hot Reload, enable this by setting csharp.experimental.debug.hotReload to true. We continue to improve this experience and have made it more reliable, working toward this feature's general availability. For IntelliSense, we've addressed several issues around go-to-definition reliability and erroneous errors appearing in the Problems pane. When you fix a problem, the error now goes away without a build, making your Razor editing experience much more productive.
Debugging Enhancements

The debugging capabilities of the C# Dev Kit have been improved, including enhancements to Blazor web page debugging and the ability to locally debug Azure Functions apps (including Azure Functions within .NET Aspire apps). These updates make it easier for developers to identify and resolve issues in their cloud-native code, leading to faster and more effective debugging sessions. As always, you can debug your solutions without creating a launch.json file. Just press F5 (or select Run > Start Debugging), select C# from the menu, and select which project is your start-up project, and a debug session will begin.

Testing

Testing has seen several improvements as well, fixing issues with test diffing and adding support for call stacks in test failures. And if you experience issues with your testing experience, we've added a diagnostic level for the testing experience to help us troubleshoot and get to a resolution quicker. To enable it, set csharp.debug.testExplorerVerbosity to diagnostic.

Try the new features and give us your feedback

We work from your feedback and will continue working through the issues submitted to help bring a more reliable and more productive C# editing experience to VS Code. If you haven't installed the C# Dev Kit yet, install it now from the Visual Studio Marketplace. For those already using the C# Dev Kit, make sure to update to the newest release to try out the new features and enhancements.

Get the latest C# Dev Kit

The post C# Dev Kit Updates: .NET Aspire, Hot Reload, and More! appeared first on .NET Blog.

View the full article
  14. A year ago, we launched Microsoft.Testing.Platform as part of the MSTest Runner announcement. Our goal was to create a reliable testing platform for .NET projects, focused on extensibility and modularity. We are excited to announce that Microsoft.Testing.Platform has now reached 20+ downloads. We are thrilled to see the adoption of the platform by all major .NET test frameworks. Whether you are using Expecto, MSTest, NUnit, TUnit, or xUnit.net, you can now leverage the new testing platform to run your tests. In this post, we'll highlight the test frameworks that have embraced Microsoft.Testing.Platform, share their unique characteristics, and provide resources for getting started.

What is Microsoft.Testing.Platform

Microsoft.Testing.Platform is a lightweight and portable alternative to VSTest for running tests in all contexts, including continuous integration (CI) pipelines, the CLI, Visual Studio Test Explorer, and VS Code Test Explorer. Microsoft.Testing.Platform is embedded directly in your test projects, and there are no other app dependencies, such as vstest.console or dotnet test, needed to run your tests. Microsoft.Testing.Platform is open source. To submit an issue or contribute to the project, you can find the Microsoft.Testing.Platform code in the microsoft/testfx GitHub repository.

Key features

Microsoft.Testing.Platform is designed as a modular and extensible testing platform, allowing you to include only the components you need and to extend any part of the test execution. The core platform is designed to be portable and dependency-free, allowing you to produce test applications that can run anywhere .NET is supported. Microsoft.Testing.Platform is also integrated with Visual Studio Test Explorer, VS Code Test Explorer in C# Dev Kit, Azure DevOps, and the .NET SDK, providing a seamless experience for developers.
Additional resources

Overview
Comparison with VSTest
dotnet test support
Available extensions
GitHub repository
Microsoft.Testing.Platform for extension authors

Enabling Microsoft.Testing.Platform in your favorite test framework

The test frameworks are ordered alphabetically. All examples below will assume the following production source code:

Contoso.csproj:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>
</Project>

Calculator.cs:

public class Calculator
{
    public int Add(int a, int b)
    {
        return a + b;
    }
}

Expecto

Expecto aims to make it easy to test CLR-based software, be it with unit tests, stress tests, regression tests, or property-based tests. Expecto tests are parallel and async by default, so that you can use all your cores for testing your software. This also opens up a new way of catching threading and memory issues for free using stress testing. With the release of v0.15.0, YoloDev.Expecto.TestSdk now supports running tests through the new testing platform. To opt in, simply edit your project's project file to set <EnableExpectoTestingPlatformIntegration>true</EnableExpectoTestingPlatformIntegration> and <OutputType>Exe</OutputType>.
Expecto Sample Application

Contoso.Tests.fsproj:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <EnableExpectoTestingPlatformIntegration>true</EnableExpectoTestingPlatformIntegration>
    <OutputType>Exe</OutputType>
    <TestingPlatformDotnetTestSupport>true</TestingPlatformDotnetTestSupport>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="YoloDev.Expecto.TestSdk" Version="0.15.0" />
  </ItemGroup>
  <ItemGroup>
    <Compile Include="Test.fs" />
  </ItemGroup>
</Project>

Test.fs:

open Expecto

let tests =
    testList "Calculator Tests" [
        test "Add function returns sum" {
            let calculator = Calculator()
            let result = calculator.Add(1, 2)
            Expect.equal result 3 "Expected sum to be 3"
        }
    ]

[<EntryPoint>]
let main argv =
    runTestsWithArgs defaultConfig argv tests

MSTest

MSTest, the Microsoft Testing Framework, is a fully supported, open source, and cross-platform test framework with which to write tests targeting .NET Framework, .NET Core, .NET, UWP, and WinUI on Windows, Linux, and Mac. With v3.2.0 or later, MSTest.TestAdapter supports running tests through the new testing platform. To opt in, simply edit your project's project file to set <EnableMSTestRunner>true</EnableMSTestRunner> and <OutputType>Exe</OutputType>.

MSTest Sample Application

Contoso.Tests.csproj:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <EnableMSTestRunner>true</EnableMSTestRunner>
    <OutputType>Exe</OutputType>
    <TestingPlatformDotnetTestSupport>true</TestingPlatformDotnetTestSupport>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="MSTest" Version="3.7.3" />
  </ItemGroup>
</Project>

Test.cs:

[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Add_WhenCalled_ReturnsSum()
    {
        var calculator = new Calculator();
        var result = calculator.Add(1, 2);
        Assert.AreEqual(3, result);
    }
}

NUnit

NUnit is a unit-testing framework for all .NET languages.
Initially ported from JUnit, the current production release has been completely rewritten with many new features and support for a wide range of .NET platforms. With the release of v5, NUnit3TestAdapter supports running tests through the new testing platform. To opt in, simply edit your project's project file to set <EnableNUnitRunner>true</EnableNUnitRunner> and <OutputType>Exe</OutputType>.

NUnit Sample Application

Contoso.Tests.csproj:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <EnableNUnitRunner>true</EnableNUnitRunner>
    <OutputType>Exe</OutputType>
    <TestingPlatformDotnetTestSupport>true</TestingPlatformDotnetTestSupport>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.12.0" />
    <PackageReference Include="NUnit" Version="4.3.2" />
    <PackageReference Include="NUnit.Analyzers" Version="4.6.0" />
    <PackageReference Include="NUnit3TestAdapter" Version="5.0.0" />
  </ItemGroup>
</Project>

Test.cs:

public class CalculatorTests
{
    [Test]
    public void Add_WhenCalled_ReturnsSum()
    {
        var calculator = new Calculator();
        var result = calculator.Add(1, 2);
        Assert.That(result, Is.EqualTo(3));
    }
}

TUnit

TUnit is a modern, flexible, and fast testing framework for C#, featuring Native AOT and trimmed single-file application support! This new test framework is built solely on top of Microsoft.Testing.Platform.
TUnit Sample Application

Contoso.Tests.csproj:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <OutputType>Exe</OutputType>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="TUnit" Version="0.8.4" />
  </ItemGroup>
</Project>

Test1.cs:

public class CalculatorTests
{
    [Test]
    public async Task Add_WhenCalled_ReturnsSum()
    {
        var calculator = new Calculator();
        var result = calculator.Add(1, 2);
        await Assert.That(result).IsEqualTo(3);
    }
}

xUnit.net

xUnit.net is a free, open source, community-focused unit testing tool for the .NET Framework. Written by the original inventor of NUnit v2, xUnit.net is the latest technology for unit testing C#, F#, VB.NET, and other .NET languages. With the release of xunit.v3, xUnit.net supports running tests through the new testing platform. To opt in, simply edit your project's project file to set <UseMicrosoftTestingPlatformRunner>true</UseMicrosoftTestingPlatformRunner>.
xUnit.net Sample Application

Contoso.Tests.csproj:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <UseMicrosoftTestingPlatformRunner>true</UseMicrosoftTestingPlatformRunner>
    <OutputType>Exe</OutputType>
    <TestingPlatformDotnetTestSupport>true</TestingPlatformDotnetTestSupport>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="xunit.v3" Version="1.0.1" />
    <PackageReference Include="xunit.runner.visualstudio" Version="3.0.1" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.12.0" />
  </ItemGroup>
</Project>

Test.cs:

public class CalculatorTests
{
    [Fact]
    public void Add_WhenCalled_ReturnsSum()
    {
        var calculator = new Calculator();
        var result = calculator.Add(1, 2);
        Assert.Equal(3, result);
    }
}

Looking Ahead

We would like to extend our heartfelt appreciation to the framework authors we have collaborated with and continue to work closely with. We are thrilled to witness the ongoing evolution of this platform and its ability to empower developers. We eagerly anticipate numerous contributions from the community and look forward to the innovative extensions that will be created. If you haven't already, we encourage you to explore the platform, experiment with your preferred framework, and share your feedback. Together, let's continue to build an outstanding .NET testing ecosystem!

The post Microsoft.Testing.Platform: Now Supported by All Major .NET Test Frameworks appeared first on .NET Blog.

View the full article
  15. Continuing our tradition, we are excited to share a blog post highlighting the latest and most interesting changes in the networking space with the new .NET release. This year, we are introducing updates in the HTTP space, new HttpClientFactory APIs, .NET Framework compatibility improvements, and more.

HTTP

In the following section, we introduce the most impactful changes in the HTTP space. These include performance improvements in connection pooling, support for multiple HTTP/3 connections, an auto-updating Windows proxy, and, last but not least, community contributions.

Connection Pooling

In this release, we made two impactful performance improvements in HTTP connection pooling.

We added opt-in support for multiple HTTP/3 connections. Using more than one HTTP/3 connection to the peer is discouraged by RFC 9114, since a single connection can multiplex parallel requests. However, in certain scenarios, like server-to-server, one connection might become a bottleneck even with request multiplexing. We saw such limitations with HTTP/2 (dotnet/runtime#35088), which has the same concept of multiplexing over one connection. For the same reasons (dotnet/runtime#51775), we decided to implement multiple connection support for HTTP/3 (dotnet/runtime#101535). The implementation tries to closely match the behavior of multiple HTTP/2 connections: at the moment, it always prefers to saturate existing connections with as many requests as the peer allows before opening a new one. Note that this is an implementation detail and the behavior might change in the future. 
As a result, our benchmarks showed a nontrivial increase in requests per second (RPS). Comparison for 10,000 parallel requests:

client                      | single HTTP/3 connection | multiple HTTP/3 connections
Max CPU Usage (%)           | 35                       | 92
Max Cores Usage (%)         | 971                      | 2,572
Max Working Set (MB)        | 3,810                    | 6,491
Max Private Memory (MB)     | 4,415                    | 7,228
Processor Count             | 28                       | 28
First request duration (ms) | 519                      | 594
Requests                    | 345,446                  | 4,325,325
Mean RPS                    | 23,069                   | 288,664

Note that the increase in Max CPU Usage implies better CPU utilization, which means that the CPU is busy processing requests instead of being idle. This feature can be turned on via the EnableMultipleHttp3Connections property on SocketsHttpHandler:

var client = new HttpClient(new SocketsHttpHandler
{
    EnableMultipleHttp3Connections = true
});

We also addressed lock contention in HTTP/1.1 connection pooling (dotnet/runtime#70098). The HTTP/1.1 connection pool previously used a single lock to manage the list of connections and the queue of pending requests. This lock was observed to be a bottleneck in high-throughput scenarios on machines with a high number of CPU cores. We resolved this problem (dotnet/runtime#99364) by replacing the lock-guarded list with a concurrent collection. We chose ConcurrentStack because it preserves the observable behavior, where requests are handled by the newest available connection, which allows collecting older connections when their configured lifetime expires. The throughput of HTTP/1.1 requests in our benchmarks increased by more than 30%:

Client   | .NET 8.0   | .NET 9.0    | Increase
Requests | 80,028,791 | 107,128,778 | +33.86%
Mean RPS | 666,886    | 892,749     | +33.87%

Proxy Auto Update on Windows

One of the main pain points when debugging HTTP traffic of applications using earlier versions of .NET is that the application doesn’t react to changes in Windows proxy settings (dotnet/runtime#70098). The proxy settings were previously initialized once per process with no reasonable ability to refresh them. 
For example (with .NET 8), HttpClient.DefaultProxy returns the same instance upon repeated access and never refetches the settings. As a result, tools like Fiddler, which set themselves as the system proxy to listen for traffic, weren’t able to capture traffic from already running processes. This issue was mitigated in dotnet/runtime#103364, where HttpClient.DefaultProxy is set to an instance of the Windows proxy that listens for registry changes and reloads the proxy settings when notified. The following code:

while (true)
{
    using var resp = await client.GetAsync("https://httpbin.org/");
    Console.WriteLine(HttpClient.DefaultProxy.GetProxy(new Uri("https://httpbin.org/"))?.ToString() ?? "null");
    await Task.Delay(1_000);
}

produces output like this:

null
// After Fiddler's "System Proxy" is turned on.
http://127.0.0.1:8866/

Note that this change applies only to Windows, as it has a unique concept of machine-wide proxy settings. Linux and other UNIX-based systems only allow setting up a proxy via environment variables, which can’t be changed during the process lifetime.

Community contributions

We’d like to call out community contributions.

CancellationToken overloads were missing from HttpContent.LoadIntoBufferAsync. This gap was resolved by an API proposal (dotnet/runtime#102659) from @andrewhickman-aveva and an implementation (dotnet/runtime#103991) from @manandre.

Another change improves a units discrepancy for the MaxResponseHeadersLength property on SocketsHttpHandler and HttpClientHandler (dotnet/runtime#75137). All the other size and length properties are interpreted as being in bytes; however, this one is interpreted as being in kilobytes. And since the actual behavior can’t be changed due to backward compatibility, the problem was solved by implementing an analyzer (dotnet/roslyn-analyzers#6796). The analyzer tries to make sure the user is aware that the value provided is interpreted as kilobytes, and warns if the usage suggests otherwise. 
For example, if the value is higher than a certain threshold, the analyzer warns that the value is likely meant in bytes rather than kilobytes. The analyzer was implemented by @amiru3f.

QUIC

The prominent changes in the QUIC space in .NET 9 include making the library public, more configuration options for connections, and several performance improvements.

Public APIs

From this release on, System.Net.Quic isn’t hidden behind PreviewFeature anymore, and all the APIs are generally available without any opt-in switches (dotnet/runtime#104227).

QUIC Connection Options

We expanded the configuration options for QuicConnection (dotnet/runtime#72984). The implementation (dotnet/runtime#94211) added three new properties to QuicConnectionOptions:

HandshakeTimeout – we were already imposing a limit on how long a connection establishment can take; this property just enables the user to adjust it.

KeepAliveInterval – if this property is set to a positive value, PING frames are sent out regularly at this interval (in case no other activity is happening on the connection), which prevents the connection from being closed on idle timeout.

InitialReceiveWindowSizes – a set of parameters to adjust the initial receive limits for data flow control sent in transport parameters. These data limits apply only until the dynamic flow control algorithm starts adjusting the limits based on the data reading speed. And due to MsQuic limitations, these parameters can only be set to values that are powers of 2.

All of these parameters are optional. Their default values are derived from MsQuic defaults. 
The following code reports the defaults programmatically:

var options = new QuicClientConnectionOptions();
Console.WriteLine($"KeepAliveInterval = {PrettyPrintTimeStamp(options.KeepAliveInterval)}");
Console.WriteLine($"HandshakeTimeout = {PrettyPrintTimeStamp(options.HandshakeTimeout)}");
Console.WriteLine(@$"InitialReceiveWindowSizes = {{
    Connection = {PrettyPrintInt(options.InitialReceiveWindowSizes.Connection)},
    LocallyInitiatedBidirectionalStream = {PrettyPrintInt(options.InitialReceiveWindowSizes.LocallyInitiatedBidirectionalStream)},
    RemotelyInitiatedBidirectionalStream = {PrettyPrintInt(options.InitialReceiveWindowSizes.RemotelyInitiatedBidirectionalStream)},
    UnidirectionalStream = {PrettyPrintInt(options.InitialReceiveWindowSizes.UnidirectionalStream)}
}}");

static string PrettyPrintTimeStamp(TimeSpan timeSpan)
    => timeSpan == Timeout.InfiniteTimeSpan ? "infinite" : timeSpan.ToString();

static string PrettyPrintInt(int sizeB)
    => sizeB % 1024 == 0 ? $"{sizeB / 1024} * 1024" : sizeB.ToString();

// Prints:
// KeepAliveInterval = infinite
// HandshakeTimeout = 00:00:10
// InitialReceiveWindowSizes = {
//     Connection = 16384 * 1024,
//     LocallyInitiatedBidirectionalStream = 64 * 1024,
//     RemotelyInitiatedBidirectionalStream = 64 * 1024,
//     UnidirectionalStream = 64 * 1024
// }

Stream Capacity API

.NET 9 also introduced new APIs to support multiple HTTP/3 connections in SocketsHttpHandler (dotnet/runtime#101534). The APIs were designed with this specific usage in mind, and we don’t expect them to be used apart from very niche scenarios.

QUIC has built-in logic for managing stream limits within the protocol. As a result, calling OpenOutboundStreamAsync on a connection gets suspended if there isn’t any available stream capacity. Moreover, there isn’t an efficient way to learn whether the stream limit was reached or not. All these limitations together didn’t allow the HTTP/3 layer to know when to open a new connection. 
So we introduced a new StreamCapacityCallback that gets called whenever stream capacity is increased. The callback itself is registered via QuicConnectionOptions. More details about the callback can be found in the documentation.

Performance Improvements

Both performance improvements in System.Net.Quic are TLS related, and both only affect connection establishment times.

The first performance-related change was to run the peer certificate validation asynchronously in the .NET thread pool (dotnet/runtime#98361). The certificate validation can be time consuming on its own, and it might even include execution of a user callback. Moving this logic to the .NET thread pool stops us from blocking MsQuic threads, of which MsQuic has a limited number, and thus enables MsQuic to process a higher number of new connections at the same time.

On top of that, we have introduced caching of MsQuic configuration (dotnet/runtime#99371). MsQuic configuration is a set of native structures containing connection settings from QuicConnectionOptions, potentially including the certificate and its intermediaries. Constructing and initializing the native structure can be very expensive, since it might require serializing and deserializing all the certificate data to and from PKCS #12 format. Moreover, the cache allows reusing the same MsQuic configuration for different connections if their settings are identical. Server scenarios with static configuration in particular can notably profit from the caching, like the following code:

var alpn = "test";
var serverCertificate = X509CertificateLoader.LoadCertificateFromFile("../path/to/cert");

// Prepare the connection options upfront and reuse them.
var serverConnectionOptions = new QuicServerConnectionOptions()
{
    DefaultStreamErrorCode = 123,
    DefaultCloseErrorCode = 456,
    ServerAuthenticationOptions = new SslServerAuthenticationOptions
    {
        ApplicationProtocols = new List<SslApplicationProtocol>() { alpn },
        // Re-using the same certificate. 
        ServerCertificate = serverCertificate
    }
};

// Configure the listener to return the pre-prepared options.
await using var listener = await QuicListener.ListenAsync(new QuicListenerOptions()
{
    ListenEndPoint = new IPEndPoint(IPAddress.Loopback, 0),
    ApplicationProtocols = [ alpn ],
    // Callback returns the same object.
    // Internal cache will re-use the same native structure for every incoming connection.
    ConnectionOptionsCallback = (_, _, _) => ValueTask.FromResult(serverConnectionOptions)
});

We also built an escape hatch for this feature. It can be turned off either with an environment variable:

export DOTNET_SYSTEM_NET_QUIC_DISABLE_CONFIGURATION_CACHE=1
# run the app

or with an AppContext switch:

AppContext.SetSwitch("System.Net.Quic.DisableConfigurationCache", true);

WebSockets

.NET 9 introduces the long-desired PING/PONG Keep-Alive strategy to WebSockets (dotnet/runtime#48729). Prior to .NET 9, the only available Keep-Alive strategy was Unsolicited PONG. It was enough to keep the underlying TCP connection from idling out, but in cases where a remote host becomes unresponsive (for example, the remote server crashes), the only way to detect such situations was to depend on the TCP timeout.

In this release, we complement the existing KeepAliveInterval setting with the new KeepAliveTimeout setting, so that the Keep-Alive strategy is selected as follows:

Keep-Alive is OFF, if KeepAliveInterval is TimeSpan.Zero or Timeout.InfiniteTimeSpan

Unsolicited PONG, if KeepAliveInterval is a positive finite TimeSpan, -AND- KeepAliveTimeout is TimeSpan.Zero or Timeout.InfiniteTimeSpan

PING/PONG, if KeepAliveInterval is a positive finite TimeSpan, -AND- KeepAliveTimeout is a positive finite TimeSpan

By default, the preexisting Keep-Alive behavior is maintained: the KeepAliveTimeout default value is Timeout.InfiniteTimeSpan, so Unsolicited PONG remains the default strategy. 
The following example illustrates how to enable the PING/PONG strategy for a ClientWebSocket:

var cws = new ClientWebSocket();
cws.Options.KeepAliveInterval = TimeSpan.FromSeconds(10);
cws.Options.KeepAliveTimeout = TimeSpan.FromSeconds(10);
await cws.ConnectAsync(uri, cts.Token);

// NOTE: There should be an outstanding read at all times to
// ensure incoming PONGs are promptly processed
var result = await cws.ReceiveAsync(buffer, cts.Token);

If no PONG response is received before KeepAliveTimeout elapses, the remote endpoint is deemed unresponsive, and the WebSocket connection is automatically aborted. It also unblocks the outstanding ReceiveAsync with an OperationCanceledException.

To learn more about the feature, you can check out the dedicated conceptual docs.

.NET Framework Compatibility

One of the biggest hurdles in the networking space when migrating projects from .NET Framework to .NET Core is the difference between the HTTP stacks. In .NET Framework, the main class to handle HTTP requests is HttpWebRequest, which uses the global ServicePointManager and individual ServicePoints to handle connection pooling. Whereas in .NET Core, HttpClient is the recommended way to access HTTP resources. On top of that, all the classes from .NET Framework are present in .NET, but they’re either obsolete, missing implementation, or just not maintained at all. As a result, we often see mistakes like using ServicePointManager to configure the connections while using HttpClient to access the resources.

The recommendation has always been to fully migrate to HttpClient, but sometimes that’s not possible. Migrating projects from .NET Framework to .NET Core can be difficult on its own, let alone rewriting all the networking code. Expecting customers to do all this work in one step proved to be unrealistic and is one of the reasons why customers might be reluctant to migrate. 
To mitigate these pain points, we filled in some missing implementations of the legacy classes and created a comprehensive guide to help with the migration.

The first part is an expansion of the supported ServicePointManager and ServicePoint properties that were missing implementation in .NET Core up until this release (dotnet/runtime#94664 and dotnet/runtime#97537). With these changes, they’re now taken into account when using HttpWebRequest.

For HttpWebRequest, we implemented full support for AllowWriteStreamBuffering in dotnet/runtime#95001. And we also added missing support for ImpersonationLevel in dotnet/runtime#102038.

On top of these changes, we also obsoleted a few legacy classes to prevent further confusion:

ServicePointManager in dotnet/runtime#103456. Its settings have no effect on HttpClient and SslStream, while it might be misused in good faith for exactly that purpose.

AuthenticationManager in dotnet/runtime#93171, done by community contributor @deeprobin. It’s either missing implementation or its methods throw PlatformNotSupportedException.

Lastly, we put together a guide for migration from HttpWebRequest to HttpClient in the HttpWebRequest to HttpClient migration guide. It includes comprehensive lists of mappings between individual properties and methods, e.g., Migrate ServicePoint(Manager) usage, and many examples for trivial and not so trivial scenarios, e.g., Example: Enable DNS round robin.

Diagnostics

In this release, diagnostics improvements focus on enhancing privacy protection and advancing distributed tracing capabilities.

Uri Query Redaction in HttpClientFactory Logs

Starting with version 9.0.0 of Microsoft.Extensions.Http, the default logging logic of HttpClientFactory prioritizes protecting privacy. In older versions, it emitted the full request URI in the RequestStart and RequestPipelineStart events. In cases where some components of the URI contain sensitive information, this can lead to privacy incidents by leaking such data into logs. 
Version 8.0.0 introduced the ability to secure HttpClientFactory usage by customizing logging. However, this doesn’t change the fact that the default behavior might be risky for unaware users. In the majority of the problematic cases, sensitive information resides in the query component. Therefore, a breaking change was introduced in 9.0.0, removing the entire query string from HttpClientFactory logs by default. A global opt-out switch is available for services/apps where it’s safe to log the full URI. For consistency and maximum safety, a similar change was implemented for EventSource events in System.Net.Http.

We recognize that this solution might not suit everyone. Ideally, there would be a fine-grained URI filtering mechanism, allowing users to retain non-sensitive query entries or filter other URI components (e.g., parts of the path). We plan to explore such a feature for future versions (dotnet/runtime#110018).

Distributed Tracing Improvements

Distributed tracing is a diagnostic technique for tracking the path of a specific transaction across multiple processes and machines, helping identify bottlenecks and failures. This technique models the transaction as a hierarchical tree of Activities, also referred to as spans in OpenTelemetry terminology. HttpClientHandler and SocketsHttpHandler are instrumented to start an Activity for each request and propagate the trace context via standard W3C headers when tracing is enabled.

Before .NET 9, users needed the OpenTelemetry .NET SDK to produce useful OpenTelemetry-compliant traces. This SDK was required not just for collection and export but also to extend the instrumentation, as the built-in logic didn’t populate the Activity with request data. Starting with .NET 9, the instrumentation dependency (OpenTelemetry.Instrumentation.Http) can be omitted unless advanced features like enrichment are required. 
In dotnet/runtime#104251, we extended the built-in tracing to ensure that the shape of the Activity is OTel-compliant, with the name, status, and most required tags populated according to the standard.

Experimental Connection Tracing

When investigating bottlenecks, you might want to zoom into specific HTTP requests to identify where most of the time is spent. Is it during connection establishment or the content download? If there are connection issues, it’s helpful to determine whether the problem lies with DNS lookups, TCP connection establishment, or the TLS handshake. .NET 9 has introduced several new spans to represent activities around connection establishment in SocketsHttpHandler. The most significant one is the HTTP connection setup span, which breaks down into three child spans for DNS, TCP, and TLS activities. Because connection setup isn’t tied to a particular request in the SocketsHttpHandler connection pool, the connection setup span can’t be modeled as a child span of the HTTP client request span. Instead, the relationship between requests and connections is represented using Span Links, also known as Activity Links.

Note: The new spans are produced by various ActivitySources matching the wildcard Experimental.System.Net.*. These spans are experimental because monitoring tools like Azure Monitor Application Insights have difficulty visualizing the resulting traces effectively due to the numerous connection_setup → request backlinks. To improve the user experience in monitoring tools, further work is needed. It involves collaboration between the .NET team, OTel, and tool authors, and may result in breaking changes in the design of the new spans.

The simplest way to set up and try connection trace collection is by using .NET Aspire. Using the Aspire Dashboard, it’s possible to expand the connection_setup activity and see a breakdown of the connection initialization. 
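As a rough sketch, the experimental sources can also be observed without the OpenTelemetry SDK by registering a plain ActivityListener. The prefix match below stands in for the Experimental.System.Net.* wildcard mentioned above, and the request URL is just a placeholder:

```csharp
using System.Diagnostics;
using System.Net.Http;

// Minimal sketch: subscribe to the experimental networking ActivitySources
// with a raw ActivityListener instead of the OpenTelemetry SDK.
ActivitySource.AddActivityListener(new ActivityListener
{
    // Match any source under the experimental networking namespace.
    ShouldListenTo = source => source.Name.StartsWith("Experimental.System.Net."),
    Sample = (ref ActivityCreationOptions<ActivityContext> _) => ActivitySamplingResult.AllData,
    // Print each finished span (e.g. connection setup, DNS, TLS) with its duration.
    ActivityStopped = activity =>
        Console.WriteLine($"{activity.Source.Name} {activity.DisplayName}: {activity.Duration.TotalMilliseconds:F1} ms")
});

using var client = new HttpClient();
await client.GetAsync("https://example.com"); // connection setup spans are emitted alongside the request
```

This only prints span names and durations; for span links and proper trace export, the OpenTelemetry SDK remains the practical choice.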
If you think the .NET 9 tracing additions might bring you valuable diagnostic insights and you want to get some hands-on experience, don’t hesitate to read our full article about Distributed tracing in System.Net libraries.

HttpClientFactory

For HttpClientFactory, we’re introducing Keyed DI support, offering a new convenient consumption pattern, and changing the default Primary Handler to mitigate a common erroneous use case.

Keyed DI Support

In the previous release, Keyed Services were introduced to the Microsoft.Extensions.DependencyInjection packages. Keyed DI allows you to specify keys while registering multiple implementations of a single service type—and to later retrieve a specific implementation using the respective key.

HttpClientFactory and named HttpClient instances, unsurprisingly, align well with the Keyed Services idea. Among other things, HttpClientFactory was a way to overcome this long-missing DI feature. But it required you to obtain, store and query the IHttpClientFactory instance—instead of simply injecting a configured HttpClient—which might be inconvenient. While Typed clients attempted to simplify that part, they came with a catch: Typed clients are easy to misconfigure and misuse (and the supporting infra can also be a tangible overhead in certain scenarios). As a result, the user experience in both cases was far from ideal.

This changes as the Microsoft.Extensions.DependencyInjection 9.0.0 and Microsoft.Extensions.Http 9.0.0 packages bring Keyed DI support into HttpClientFactory (dotnet/runtime#89755). Now you can have the best of both worlds: you can pair the convenient, highly configurable HttpClient registrations with the straightforward injection of specific configured HttpClient instances.

As of 9.0.0, you need to opt in to the feature by calling the AddAsKeyed() extension method. 
It registers a Named HttpClient as a Keyed service for the key equal to the client’s name—and enables you to use the Keyed Services APIs (e.g., [FromKeyedServices(...)]) to obtain the required HttpClients. The following code demonstrates the integration between HttpClientFactory, Keyed DI and ASP.NET Core 9.0 Minimal APIs:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddHttpClient("github", c =>
{
    c.BaseAddress = new Uri("https://api.github.com/");
    c.DefaultRequestHeaders.Add("Accept", "application/vnd.github.v3+json");
    c.DefaultRequestHeaders.Add("User-Agent", "dotnet");
})
.AddAsKeyed(); // Add HttpClient as a Keyed Scoped service for key="github"

var app = builder.Build();

// Directly inject the Keyed HttpClient by its name
app.MapGet("/", ([FromKeyedServices("github")] HttpClient httpClient) =>
    httpClient.GetFromJsonAsync<Repo>("/repos/dotnet/runtime"));

app.Run();

record Repo(string Name, string Url);

Endpoint response:

> ~ curl http://localhost:5000/
{"name":"runtime","url":"https://api.github.com/repos/dotnet/runtime"}

By default, AddAsKeyed() registers HttpClient as a Keyed Scoped service. The Scoped lifetime can help catch cases of captive dependencies:

services.AddHttpClient("scoped").AddAsKeyed();
services.AddSingleton<CapturingSingleton>();

// Throws: Cannot resolve scoped service 'System.Net.Http.HttpClient' from root provider.
rootProvider.GetRequiredKeyedService<HttpClient>("scoped");

using var scope = provider.CreateScope();
scope.ServiceProvider.GetRequiredKeyedService<HttpClient>("scoped"); // OK

// Throws: Cannot consume scoped service 'System.Net.Http.HttpClient' from singleton 'CapturingSingleton'.
public class CapturingSingleton([FromKeyedServices("scoped")] HttpClient httpClient)
//{ ... 
You can also explicitly specify the lifetime by passing the ServiceLifetime parameter to the AddAsKeyed() method:

services.AddHttpClient("explicit-scoped")
    .AddAsKeyed(ServiceLifetime.Scoped);

services.AddHttpClient("singleton")
    .AddAsKeyed(ServiceLifetime.Singleton);

You don’t have to call AddAsKeyed for every single client—you can easily opt in “globally” (for any client name) via ConfigureHttpClientDefaults. From the Keyed Services perspective, it results in the KeyedService.AnyKey registration.

services.ConfigureHttpClientDefaults(b => b.AddAsKeyed());

services.AddHttpClient("foo", /* ... */);
services.AddHttpClient("bar", /* ... */);

public class MyController(
    [FromKeyedServices("foo")] HttpClient foo,
    [FromKeyedServices("bar")] HttpClient bar)
//{ ...

Even though the “global” opt-in is a one-liner, it’s unfortunate that the feature still requires it, instead of just working “out of the box”. For full context and reasoning on that decision, see dotnet/runtime#89755 and dotnet/runtime#104943.

You can explicitly opt out from Keyed DI for HttpClients by calling RemoveAsKeyed() (for example, per specific client, in case of the “global” opt-in):

services.ConfigureHttpClientDefaults(b => b.AddAsKeyed()); // opt IN by default
services.AddHttpClient("keyed", /* ... */);
services.AddHttpClient("not-keyed", /* ... */).RemoveAsKeyed(); // opt OUT per name

provider.GetRequiredKeyedService<HttpClient>("keyed"); // OK
provider.GetRequiredKeyedService<HttpClient>("not-keyed"); // Throws: No service for type 'System.Net.Http.HttpClient' has been registered. 
provider.GetRequiredKeyedService<HttpClient>("unknown"); // OK (unconfigured instance)

If called together, or if either is called more than once, AddAsKeyed() and RemoveAsKeyed() generally follow the rules of HttpClientFactory configs and DI registrations:

If used within the same name, the last setting wins: the lifetime from the last AddAsKeyed() is used to create the Keyed registration (unless RemoveAsKeyed() was called last, in which case the name is excluded).

If used only within ConfigureHttpClientDefaults, the last setting wins.

If both ConfigureHttpClientDefaults and a specific client name were used, all defaults are considered to “happen” before all per-name settings for this client. Thus, the defaults can be disregarded, and the last of the per-name ones wins.

You can learn more about the feature in the dedicated conceptual docs.

Default Primary Handler Change

One of the most common problems HttpClientFactory users run into is when a Named or a Typed client erroneously gets captured in a Singleton service, or, in general, stored somewhere for a period of time that’s longer than the specified HandlerLifetime. Because HttpClientFactory can’t rotate such handlers, they might end up not respecting DNS changes.

It is, unfortunately, easy and seemingly “intuitive” to inject a Typed client into a singleton, but hard to have any kind of check/analyzer to make sure HttpClient isn’t captured when it wasn’t supposed to be. It might be even harder to troubleshoot the resulting issues.

On the other hand, the problem can be mitigated by using SocketsHttpHandler, which can control PooledConnectionLifetime. Similarly to HandlerLifetime, it allows regularly recreating connections to pick up DNS changes, but at a lower level. A client with PooledConnectionLifetime set up can be safely used as a Singleton. 
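The singleton-safe pattern described above can be sketched as a long-lived client whose handler recycles its own connections. The two-minute lifetime is an illustrative value, not an official recommendation:

```csharp
// Sketch: a singleton-safe HttpClient. SocketsHttpHandler retires pooled
// connections after PooledConnectionLifetime elapses, so DNS changes are
// eventually picked up even though the client instance lives forever.
public static class SharedHttp
{
    public static readonly HttpClient Client = new(new SocketsHttpHandler
    {
        PooledConnectionLifetime = TimeSpan.FromMinutes(2) // illustrative value
    });
}
```

With this in place, injecting or capturing SharedHttp.Client in a singleton service does not suffer from the stale-DNS problem that captured factory-created clients have.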
Therefore, to minimize the potential impact of the erroneous usage patterns, .NET 9 makes the default Primary handler a SocketsHttpHandler (on platforms that support it; other platforms, e.g. .NET Framework, continue to use HttpClientHandler). And most importantly, SocketsHttpHandler also has the PooledConnectionLifetime property preset to match the HandlerLifetime value (it reflects the latest value, if you configured HandlerLifetime one or more times).

The change only affects cases when the client was not configured to have a custom Primary handler (via e.g. ConfigurePrimaryHttpMessageHandler<T>()). While the default Primary handler is an implementation detail, as it was never specified in the docs, it’s still considered a breaking change. There could be cases in which you wanted to use the specific type, for example, casting the Primary handler to HttpClientHandler to set properties like ClientCertificates, UseCookies, UseProxy, etc. If you need to use such properties, it’s suggested to check for both HttpClientHandler and SocketsHttpHandler in the configuration action:

services.AddHttpClient("test")
    .ConfigurePrimaryHttpMessageHandler((h, _) =>
    {
        if (h is HttpClientHandler hch)
        {
            hch.UseCookies = false;
        }

        if (h is SocketsHttpHandler shh)
        {
            shh.UseCookies = false;
        }
    });

Alternatively, you can explicitly specify a Primary handler for each of your clients:

services.AddHttpClient("test")
    .ConfigurePrimaryHttpMessageHandler(() => new HttpClientHandler() { UseCookies = false });

Or, configure the default Primary handler for all clients using ConfigureHttpClientDefaults:

services.ConfigureHttpClientDefaults(b =>
    b.ConfigurePrimaryHttpMessageHandler(() => new HttpClientHandler() { UseCookies = false }));

Security

In System.Net.Security, we’re introducing the highly sought support for SSLKEYLOGFILE, more scenarios supporting TLS resume, and new additions in the Negotiate APIs. 
SSLKEYLOGFILE Support

The most upvoted issue in the security space was to support logging of the pre-master secret (dotnet/runtime#37915). The logged secret can be used by the packet capturing tool Wireshark to decrypt traffic. It’s a useful diagnostic tool when investigating networking issues. Moreover, the same functionality is provided by browsers like Firefox (via NSS) and Chrome, and by command line HTTP tools like cURL.

We have implemented this feature for both SslStream and QuicConnection. For the former, the functionality is limited to the platforms on which we use OpenSSL as the cryptographic library. In terms of the officially released .NET runtime, that means only Linux operating systems. For the latter, it’s supported everywhere, regardless of the cryptographic library. That’s because TLS is part of the QUIC protocol (RFC 9001), so the user-space MsQuic has access to all the secrets, and so does .NET. The limitation of SslStream on Windows comes from SChannel using a separate, privileged process for TLS, which won’t allow exporting secrets due to security concerns (dotnet/runtime#94843).

This feature exposes security secrets, and relying solely on an environment variable could unintentionally leak them. For that reason, we’ve decided to introduce an additional AppContext switch necessary to enable the feature (dotnet/runtime#100665). It requires the user to prove ownership of the application by either setting it programmatically in the code:

AppContext.SetSwitch("System.Net.EnableSslKeyLogging", true);

or by changing the {appname}.runtimeconfig.json next to the application:

{
  "runtimeOptions": {
    "configProperties": {
      "System.Net.EnableSslKeyLogging": true
    }
  }
}

The last thing is to set the environment variable SSLKEYLOGFILE and run the application:

export SSLKEYLOGFILE=~/keylogfile
./<appname>

At this point, ~/keylogfile will contain pre-master secrets that can be used by Wireshark to decrypt the traffic. 
For more information, see the TLS Using the (Pre)-Master-Secret documentation.

TLS Resume with Client Certificate

TLS resume enables reusing previously stored TLS data to re-establish a connection to a previously connected server. It can save round trips during the handshake as well as CPU processing. This feature is a native part of Windows SChannel; therefore, it’s implicitly used by .NET on Windows platforms. However, on Linux platforms, where we use OpenSSL as the cryptographic library, enabling caching and reusing TLS data is more involved. We first introduced the support in .NET 7 (see TLS Resume). It has its own limitations that, in general, are not present on Windows. One such limitation was that it was not supported for sessions using mutual authentication by providing a client certificate (dotnet/runtime#94561). That has been fixed in .NET 9 (dotnet/runtime#102656), and TLS resume now works if one of these properties is set as described:

ClientCertificateContext

LocalCertificateSelectionCallback returns a non-null certificate on the first call

ClientCertificates collection has at least one certificate with a private key

Negotiate API Integrity Checks

In .NET 7, we added the NegotiateAuthentication APIs, see Negotiate API. The original implementation’s goal was to remove access via reflection to the internals of NTAuthentication. However, that proposal was missing functions to generate and verify message integrity codes from RFC 2743. They’re usually implemented as a cryptographic signing operation with a negotiated key. The API was proposed in dotnet/runtime#86950 and implemented in dotnet/runtime#96712, and, as with the original change, all the work from the API proposal to the implementation was done by community contributor filipnavara.

Networking Primitives

This section encompasses changes in the System.Net namespace. We’re introducing new support for server-sent events and some small additions in APIs, for example, new MIME types. 
Server-Sent Events Parser

Server-sent events is a technology that allows servers to push data updates to clients over an HTTP connection. It is defined in the HTML living standard. It uses the text/event-stream MIME type, and the stream is always decoded as UTF-8. The advantage of the server-push approach over client-pull is that it can make better use of network resources and also save battery life on mobile devices.

In this release, we're introducing an out-of-band package, System.Net.ServerSentEvents, available as a .NET Standard 2.0 NuGet package. The package offers a parser for the server-sent events stream, following the specification. The protocol is stream-based, with individual items separated by an empty line. Each item has two fields:

- type – the default type is message
- data – the data itself

On top of that, there are two other optional fields that progressively update properties of the stream:

- id – determines the last event id, which is sent in the Last-Event-Id header in case the connection needs to be re-established
- retry – the number of milliseconds to wait between reconnection attempts

The library APIs were proposed in dotnet/runtime#98105 and contain type definitions for the parser and the items:

- SseParser – static class to create the actual parser from the stream, allowing the user to optionally provide a parsing delegate for the item data
- SseParser<T> – the parser itself, offering methods to enumerate (synchronously or asynchronously) the stream and return the parsed items
- SseItem<T> – a struct holding the parsed item data

The parser can then be used like this, for example:

```csharp
using HttpClient client = new HttpClient();
using Stream stream = await client.GetStreamAsync("https://server/sse");

var parser = SseParser.Create(stream, (type, data) =>
{
    var str = Encoding.UTF8.GetString(data);
    return Int32.Parse(str);
});

await foreach (var item in parser.EnumerateAsync())
{
    Console.WriteLine($"{item.EventType}: {item.Data} [{parser.LastEventId};{parser.ReconnectionInterval}]");
}
```

And for the following input:

```
: stream of integers
data: 123
id: 1
retry: 1000

data: 456
id: 2

data: 789
id: 3
```

It outputs:

```
message: 123 [1;00:00:01]
message: 456 [2;00:00:01]
message: 789 [3;00:00:01]
```

Primitives Additions

Apart from server-sent events, the System.Net namespace received a few other small additions:

- IEquatable<Uri> interface implementation for Uri in dotnet/runtime#97940, which allows using Uri in functions that require IEquatable<T>, like Span.Contains or SequenceEqual
- span-based (Try)EscapeDataString and (Try)UnescapeDataString for Uri in dotnet/runtime#40603. The goal is to support low-allocation scenarios, and we now take advantage of these methods in FormUrlEncodedContent
- new MIME types for MediaTypeNames in dotnet/runtime#95446. These types were collected over the course of the release and implemented in dotnet/runtime#103575 by a community contributor, @CollinAlpert

Final Notes

As every year, we try to write about the interesting and impactful changes in the networking space. This article can't possibly cover all the changes that were made; if you are interested, you can find the complete list in our dotnet/runtime repository, where you can also reach out to us with questions and bugs. On top of that, many of the performance changes not mentioned here are covered in Stephen's great article Performance Improvements in .NET 9.

We'd also like to hear from you, so if you encounter an issue or have any feedback, you can file it in our GitHub repo.

Lastly, I'd like to thank my co-authors:

- @antonfirsov, who wrote Diagnostics.
- @CarnaViire, who wrote HttpClientFactory and WebSockets.

The post .NET 9 Networking Improvements appeared first on .NET Blog.

View the full article