
Migrate Node.js LangChain sample to GA packages and Microsoft.OpenTelemetry distro#309

Open

biswapm wants to merge 2 commits into main from pmohapatra/langchain-ga-migration

Migrate Node.js LangChain sample to GA packages and Microsoft.OpenTelemetry distro#309
biswapm wants to merge 2 commits into
mainfrom
pmohapatra/langchain-ga-migration

Conversation

@biswapm
Contributor

biswapm commented May 15, 2026

Summary

  • Upgrade @microsoft/agents-a365-* packages from preview (0.2.0-preview.x) to GA (^1.0.0); drop the agents-a365-observability and agents-a365-observability-hosting packages, whose public surface (InferenceScope, BaggageBuilder, BaggageBuilderUtils, AgenticTokenCacheInstance) is now re-exported from @microsoft/opentelemetry
  • Fix manual InferenceScope to record the real Azure OpenAI deployment + aggregated usage_metadata across the React-loop AI messages (was hardcoded gpt-4o-mini / 45 / 78)
  • Switch AzureChatOpenAI to azureOpenAIBasePath so the client works against *.cognitiveservices.azure.com endpoints, not just *.openai.azure.com
  • Enable enableConsoleExporters in useMicrosoftOpenTelemetry() so spans are visible in the console alongside the Agent365 backend export
  • Add AZURE_OPENAI_API_VERSION to env/.env.playground.user template and the matching m365agents.playground.yml env mapping
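The usage-aggregation change in the second bullet can be sketched as below. This is a minimal illustration, not the sample's actual code: the message and `usage_metadata` shapes mirror LangChain's AIMessage usage fields, and the helper name `aggregateUsage` is hypothetical.

```typescript
// Sketch of aggregating usage_metadata across React-loop AI messages,
// instead of hardcoding one call's numbers (the old gpt-4o-mini / 45 / 78).
// Field names follow LangChain's AIMessage.usage_metadata convention.
interface UsageMetadata {
  input_tokens: number;
  output_tokens: number;
  total_tokens: number;
}

interface AgentMessage {
  role: string;                    // "ai", "tool", "human", ...
  usage_metadata?: UsageMetadata;  // present on AI messages that report usage
}

// Sum token usage over every AI message produced during the React loop.
function aggregateUsage(
  messages: AgentMessage[]
): { inputTokens: number; outputTokens: number } {
  let inputTokens = 0;
  let outputTokens = 0;
  for (const m of messages) {
    if (m.role === "ai" && m.usage_metadata) {
      inputTokens += m.usage_metadata.input_tokens;
      outputTokens += m.usage_metadata.output_tokens;
    }
  }
  return { inputTokens, outputTokens };
}
```

The aggregated totals would then be what the manual InferenceScope records, rather than a single hardcoded pair.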

API shape changes accommodated

| GA change | Adjustment |
| --- | --- |
| `Request` type renamed → `A365Request` (avoids DOM `Request` clash) | Import alias + variable annotation updated |
| `RefreshObservabilityToken` → `refreshObservabilityToken` (camelCase) | Method call renamed |
| `refreshObservabilityToken` no longer requires an explicit scopes arg (singleton ships A365 observability scope) | Dropped `getObservabilityAuthenticationScope()` call |
| `TurnContextLike` / `AuthorizationLike` are stricter than agents-hosting types | Narrow `as any` casts at the two call sites |
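The first two rows can be pictured as a code fragment. This is illustrative only: the module path and GA surface are assumed from the PR description, not verified against the published packages.

```
// Hypothetical fragment -- module path assumed, not verified.
import { Request as A365Request } from "@microsoft/agents-a365-runtime";

// Preview:
//   await AgenticTokenCacheInstance.RefreshObservabilityToken(
//     getObservabilityAuthenticationScope());
// GA (camelCase; default A365 observability scope is built in):
//   await AgenticTokenCacheInstance.refreshObservabilityToken();
```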

Test plan

  • npm run build
  • Playground (npm run dev:teamsfx:playground) — gen_ai spans for chat (azure) and invoke_agent (langchain) emit with full Microsoft baggage (tenant, user, conversation)
  • Production (npm run dev with NODE_ENV=production + .env stamped by a365 setup all --aiteammate) — Teams traffic via devtunnel passes JWT middleware; Chat manual scope now shows real model (gpt-5.4) and real token usage
  • Agentic-auth token acquisition working (agents.auth.token.duration histogram populated for agentic_instance and agentic_user methods)

Migrate Node.js LangChain sample to GA packages and Microsoft.OpenTelemetry distro

- Upgrade @microsoft/agents-a365-{notifications,tooling,tooling-extensions-langchain,runtime}
  from preview (0.2.0-preview.x) to GA (^1.0.0)
- Drop @microsoft/agents-a365-observability and -observability-hosting; their exports
  (InferenceScope, BaggageBuilder, BaggageBuilderUtils, AgenticTokenCacheInstance)
  are now re-exported from @microsoft/opentelemetry
- Rename Request type import to A365Request to avoid clashing with DOM Request
- Use refreshObservabilityToken (camelCase) per GA naming; drop the scopes arg
  (the singleton ships with the A365 observability scope as default)
- Cast TurnContext / Authorization to satisfy stricter TurnContextLike /
  AuthorizationLike shapes in GA
- Use real AZURE_OPENAI_DEPLOYMENT and aggregate usage_metadata across React-loop
  AI messages in the manual InferenceScope (was hardcoded gpt-4o-mini / 45 / 78)
- Switch AzureChatOpenAI to azureOpenAIBasePath so cognitiveservices.azure.com
  endpoints work (previous endpoint-parsing only handled openai.azure.com)
- Enable enableConsoleExporters in useMicrosoftOpenTelemetry so spans are visible
  alongside the Agent365 backend export
- Add AZURE_OPENAI_API_VERSION to env/.env.playground.user template and map it
  in m365agents.playground.yml
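The azureOpenAIBasePath switch above can be sketched as a small helper. The function name `toAzureOpenAIBasePath` is illustrative and not from the sample; it shows how one endpoint string can serve both `*.openai.azure.com` and `*.cognitiveservices.azure.com` resources.

```typescript
// Build the base path AzureChatOpenAI expects when configured via
// azureOpenAIBasePath: the resource endpoint plus the deployments route.
// Works for both openai.azure.com and cognitiveservices.azure.com hosts,
// so no host-specific endpoint parsing is needed.
function toAzureOpenAIBasePath(endpoint: string): string {
  // Strip any trailing slashes, then append the shared deployments route.
  return endpoint.replace(/\/+$/, "") + "/openai/deployments";
}
```

In the sample, this base path would be passed to AzureChatOpenAI as `azureOpenAIBasePath`, with the deployment name and `AZURE_OPENAI_API_VERSION` supplied separately (e.g. via `azureOpenAIApiDeploymentName` and `azureOpenAIApiVersion`).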
Copilot AI review requested due to automatic review settings May 15, 2026 16:08
biswapm requested a review from a team as a code owner May 15, 2026 16:08

github-actions Bot commented May 15, 2026

⚠️ Deprecation Warning: The deny-licenses option is deprecated for possible removal in the next major release. For more information, see issue 997.

Dependency Review

The following issues were found:
  • ✅ 0 vulnerable package(s)
  • ✅ 0 package(s) with incompatible licenses
  • ✅ 0 package(s) with invalid SPDX license definitions
  • ⚠️ 4 package(s) with unknown licenses.
See the Details below.

License Issues

nodejs/langchain/sample-agent/package.json

| Package | Version | License | Issue Type |
| --- | --- | --- | --- |
| @microsoft/agents-a365-notifications | ^1.0.0 | Null | Unknown License |
| @microsoft/agents-a365-runtime | ^1.0.0 | Null | Unknown License |
| @microsoft/agents-a365-tooling | ^1.0.0 | Null | Unknown License |
| @microsoft/agents-a365-tooling-extensions-langchain | ^1.0.0 | Null | Unknown License |

Denied Licenses: GPL-3.0-only, AGPL-3.0-only

OpenSSF Scorecard

| Package | Version | Score | Details |
| --- | --- | --- | --- |
| npm/@microsoft/agents-a365-notifications | ^1.0.0 | Unknown | Unknown |
| npm/@microsoft/agents-a365-runtime | ^1.0.0 | Unknown | Unknown |
| npm/@microsoft/agents-a365-tooling | ^1.0.0 | Unknown | Unknown |
| npm/@microsoft/agents-a365-tooling-extensions-langchain | ^1.0.0 | Unknown | Unknown |

Scanned Files

  • nodejs/langchain/sample-agent/package.json

- Add Environment Settings block (NODE_ENV, PORT) with production-mode note
- Add agent365Observability__* placeholders (stamped by a365 setup all)
- Add connections__service_connection__settings__scopes placeholder
- Add agentic_connectionName placeholder; switch agentic_scopes to graph default
- Drop redundant ENABLE_A365_OBSERVABILITY_EXPORTER and A365_OBSERVABILITY_LOG_LEVEL
  (sample passes a365.enabled programmatically; distro internal logging is rarely
  needed in a sample)
- Drop unused legacy vars (DEBUG, AZURE_EXPERIMENTAL_*, AZURE_TRACING_*,
  OPENAI_AGENTS_DISABLE_TRACING, OTEL_SDK_DISABLED, CONNECTION_STRING,
  USE_AGENTIC_AUTH)
- Keep LLM, MCP Tooling, and MCPPlatform Configuration blocks unchanged

Copilot AI left a comment


Pull request overview

This PR updates the Node.js LangChain sample agent to use the GA @microsoft/agents-a365-* packages and the @microsoft/opentelemetry distro, while improving manual InferenceScope telemetry accuracy (real deployment/model + aggregated token usage) and expanding Azure OpenAI endpoint compatibility.

Changes:

  • Migrate observability APIs (InferenceScope, baggage helpers, AgenticTokenCacheInstance) to @microsoft/opentelemetry and update API-shape changes (e.g., A365Request, refreshObservabilityToken).
  • Improve manual inference telemetry by aggregating usage_metadata across LangChain React-loop messages and recording deployment/model dynamically.
  • Switch Azure OpenAI client configuration to azureOpenAIBasePath and wire AZURE_OPENAI_API_VERSION through playground env templates.

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| nodejs/langchain/sample-agent/src/index.ts | Switches to @microsoft/opentelemetry exports and enables console exporters in the OpenTelemetry distro config. |
| nodejs/langchain/sample-agent/src/client.ts | Updates observability imports/types, switches Azure OpenAI config, and fixes manual InferenceScope model + token usage recording. |
| nodejs/langchain/sample-agent/src/agent.ts | Moves baggage/token-cache observability helpers to @microsoft/opentelemetry and updates token refresh API call. |
| nodejs/langchain/sample-agent/package.json | Upgrades Agent365 packages to GA and removes preview observability packages. |
| nodejs/langchain/sample-agent/m365agents.playground.yml | Adds AZURE_OPENAI_API_VERSION env mapping for Playground runs. |
| nodejs/langchain/sample-agent/env/.env.playground.user | Adds AZURE_OPENAI_API_VERSION placeholder to the user env template. |

Comment on lines 15 to 17

```typescript
useMicrosoftOpenTelemetry({
  enableConsoleExporters: true,
  a365: {
```

Comment on lines 151 to 158

```diff
 /**
  * Sends a user message to the LangChain agent and returns the AI's response.
  * Handles streaming results and error reporting.
  *
  * @param {string} userMessage - The message or prompt to send to the agent.
  * @returns {Promise<string>} The response from the agent, or an error message if the query fails.
  */
-async invokeAgent(userMessage: string): Promise<string> {
+async invokeAgent(userMessage: string): Promise<{ content: string; inputTokens: number; outputTokens: number; finishReason: string }> {
```

Comment on lines 199 to 204

```diff
 async invokeInferenceScope(prompt: string) {
+  const model = process.env.AZURE_OPENAI_DEPLOYMENT || process.env.OPENAI_MODEL || 'unknown';
   const inferenceDetails: InferenceDetails = {
     operationName: InferenceOperationType.CHAT,
-    model: "gpt-4o-mini",
+    model,
   };
```