#### AI description (iteration 1)
#### PR Classification
New feature
#### PR Summary
This pull request adds a repository digest reporting script and an embedding generator for Azure AI Inference.
- Added `Get-RepoDigest.ps1` script to generate a repository digest report.
- Added `AzureAIInferenceEmbeddingGenerator.cs` to support embedding generation using Azure AI Inference (see the usage sketch after this list).
- Added `repo-digest-template.html` to serve as a template for the repository digest report.
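A rough usage sketch of the new embedding generator. The endpoint, key, model id, and constructor shape here are assumptions for illustration, not details confirmed by this PR:

```csharp
using System;
using System.Threading.Tasks;
using Azure;
using Azure.AI.Inference;
using Microsoft.Extensions.AI;

// Hypothetical endpoint/key/model; assumes the generator wraps an Azure.AI.Inference
// EmbeddingsClient and implements IEmbeddingGenerator<string, Embedding<float>>.
static async Task<GeneratedEmbeddings<Embedding<float>>> EmbedAsync(string text)
{
    IEmbeddingGenerator<string, Embedding<float>> generator =
        new AzureAIInferenceEmbeddingGenerator(
            new EmbeddingsClient(
                new Uri("https://example.services.ai.azure.com/models"),
                new AzureKeyCredential("<api-key>")),
            modelId: "text-embedding-3-small");

    return await generator.GenerateAsync(new[] { text });
}
```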
* Work around a since-fixed bug in System.Memory.Data
BinaryData had a bug in its ToString implementation that would throw an exception if _bytes was empty. That was fixed several years ago, but Azure SDK libraries still reference older versions of System.Memory.Data that lack the fix.
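A minimal illustration of the shape of such a workaround (not the literal code from this change), assuming the guard simply avoids calling ToString on an empty instance:

```csharp
using System;

// Older System.Memory.Data builds can throw from BinaryData.ToString() when the
// instance wraps no bytes, so check for emptiness before converting to a string.
static string ToStringSafe(BinaryData data) =>
    data.ToMemory().IsEmpty ? string.Empty : data.ToString();
```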
* Add pragma warning and update condition check
* Improve EmbeddingGeneratorExtensions
- Renames the GenerateAsync extension method (not the interface method) to GenerateEmbeddingAsync, since it produces a single TEmbedding
- Adds GenerateEmbeddingVectorAsync, which returns a `ReadOnlyMemory<T>`
- Adds GenerateAndZipEmbeddingsAsync, which creates a `List<KeyValuePair<TInput, TEmbedding>>` pairing each input with its corresponding embedding (see the sketch after this list)
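A minimal sketch of how these extension methods might be used, with signatures inferred from the bullets above (they may differ from what finally shipped):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

static async Task UseEmbeddingExtensionsAsync(IEmbeddingGenerator<string, Embedding<float>> generator)
{
    // Single input -> a single embedding.
    Embedding<float> embedding = await generator.GenerateEmbeddingAsync("hello world");

    // Single input -> just the raw vector.
    ReadOnlyMemory<float> vector = await generator.GenerateEmbeddingVectorAsync("hello world");

    // Multiple inputs -> each input paired with its generated embedding.
    List<KeyValuePair<string, Embedding<float>>> pairs =
        await generator.GenerateAndZipEmbeddingsAsync(new[] { "first", "second" });
}
```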
* Address PR feedback
Enable producing stable versions and flow dependencies from aspnetcore and runtime
----
#### AI description (iteration 1)
#### PR Classification
Dependency update
#### PR Summary
This pull request updates various dependencies to their stable versions and adjusts the build configuration to support producing stable package versions.
- Updated dependency versions from `9.0.0-rc.2` to `9.0.0` in `/eng/Version.Details.xml` and `/eng/Versions.props`.
- Modified `NuGet.config` to update package sources and disable certain internal feeds.
- Adjusted build pipeline configurations in `/azure-pipelines.yml` to remove the code coverage stage and disable source indexing.
- Added `SuppressFinalPackageVersion` property in multiple `.csproj` files to prevent final package versioning during development stages.
- Disabled NU1507 warning in `Directory.Build.props` for internal branches.
* Fix a few issues in IChatClient implementations
- Avoid a null-argument exception when constructing a system message with null text
- Avoid an exception when constructing a user message with no content parts
- Use all parts, rather than just the first text part, for the system message
- Handle assistant messages with both content and tool calls
- Avoid unnecessarily trying to weed out duplicate call IDs
* Address PR feedback
- Normalize null to string.Empty in TextContent
- Ensure GetContentParts always produces at least one part, even when the text content is empty (see the sketch after this list)
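A rough illustration of the two behaviors described above; the real logic lives inside the IChatClient adapters and differs in detail:

```csharp
using System.Collections.Generic;
using Microsoft.Extensions.AI;

// Guarantee at least one content part so providers never receive an empty message;
// null text is represented as string.Empty rather than causing an exception.
static IList<AIContent> GetContentParts(ChatMessage message)
{
    var parts = new List<AIContent>(message.Contents);
    if (parts.Count == 0)
    {
        parts.Add(new TextContent(string.Empty));
    }

    return parts;
}
```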
* Use 8.0-era dependencies for non-net9.0 TFMs
* PR Feedback.
* Apply suggestions from code review
Co-authored-by: Eric Erhardt <eric.erhardt@microsoft.com>
* Split files per feedback and update dependency versions to latest servicing and RC2
* Removing latest version from Microsoft.Extensions.AI.Abstractions
* Removing unnecessary package reference
* Also removing property from AzureAIInference project
---------
Co-authored-by: Eric Erhardt <eric.erhardt@microsoft.com>
* Update UseOpenTelemetry for latest genai spec updates
- Events are now expected to be emitted as body fields, and the newly recommended way to achieve that is via ILogger. UseOpenTelemetry therefore now takes an optional logger that it uses for emitting such data (see the sketch after this list).
- I restructured the implementation to reduce duplication.
- Added logging of response format and seed.
- Added ChatOptions.TopK, as it's one of the parameters considered special by the spec.
- Updated the Azure.AI.Inference provider name to match the convention and what the library itself uses
- Updated the OpenAI client to use `openai` as the provider name regardless of the kind of the actual client being used, per spec and recommendation
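A minimal sketch of wiring this up, assuming the builder pattern shown here and the optional ILogger parameter described above; the exact extension signature and parameter order are assumptions:

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Logging;

// Wrap an existing IChatClient so requests/responses are emitted as OpenTelemetry
// activities, with event body fields written through the supplied ILogger.
static IChatClient AddTelemetry(IChatClient innerClient, ILoggerFactory loggerFactory) =>
    new ChatClientBuilder(innerClient)
        .UseOpenTelemetry(loggerFactory.CreateLogger("GenAI"))
        .Build();
```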
* Address PR feedback
* Expose FunctionCallUtilities class.
* Update src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/FunctionCallUtilities.cs
Co-authored-by: Stephen Toub <stoub@microsoft.com>
* Remove function call formatting helpers.
* Extract JSON schema inference settings into a separate options class.
* Update src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/JsonSchemaInferenceOptions.cs
Co-authored-by: Stephen Toub <stoub@microsoft.com>
* Address feedback
* Return FunctionCallContent in parser helpers.
* Address feedback
* Update src/Libraries/Microsoft.Extensions.AI/Functions/AIFunctionFactory.cs
Co-authored-by: Stephen Toub <stoub@microsoft.com>
* Refactor to AIJsonUtilities class.
* Move all utilities to M.E.AI and downgrade STJ version to 8 for M.E.AI.Abstractions.
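Taken together, the commits above expose schema inference through AIJsonUtilities and JsonSchemaInferenceOptions. The call below is illustrative only: the CreateJsonSchema method and parameter names are assumptions about how that surface might be used, and the request type is hypothetical.

```csharp
using System.Text.Json;
using Microsoft.Extensions.AI;

// Derive a JSON schema for a .NET type, e.g. to describe an AIFunction's parameters.
JsonElement schema = AIJsonUtilities.CreateJsonSchema(
    typeof(WeatherRequest),
    description: "Parameters for a weather lookup",
    inferenceOptions: new JsonSchemaInferenceOptions());

record WeatherRequest(string City, string? Units);
```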
---------
Co-authored-by: Stephen Toub <stoub@microsoft.com>
* Improve CachingChatClient's coalescing of streaming updates
- Avoid memory allocation that is O(N^2) in the length of the received text (see the sketch after this list)
- Propagate additional metadata from coalesced nodes
- Propagate metadata on the coalesced TextContent, like ModelId
- Expose whether to coalesce as a setting on the client
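To make the first bullet concrete, here is the general idea reduced to a toy helper (not the actual CachingChatClient code): appending chunks to a StringBuilder grows amortized-linearly, whereas repeatedly concatenating strings copies all prior text on every update.

```csharp
using System.Collections.Generic;
using System.Text;

// Coalesce streamed text chunks into one string with O(total length) work,
// instead of `combined += chunk`, which is O(N^2) in the received text length.
static string CoalesceText(IEnumerable<string> chunks)
{
    var sb = new StringBuilder();
    foreach (string chunk in chunks)
    {
        sb.Append(chunk);
    }

    return sb.ToString();
}
```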
* Remove dictionary merging until we have evidence it's warranted