Conversation

@Husainbw786
Collaborator

No description provided.

- Introduced a new JSON column `batch_data` in the `ConversationLog` model to store batch-related information.
- Implemented `create_batch_conversation_logs` function to handle batch log creation for multiple messages.
- Added `updateConversationLog` and `updateConversationLogByBatchData` functions for updating logs based on log ID and batch data.
- Enhanced the `batch` function to process batch conversation logs in the background.
- Updated the `check_batch_status` function to update conversation logs with results from batch processing.

This update improves the handling of batch requests and enhances the logging mechanism for better tracking of conversation states.
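The batch log flow described above can be sketched as follows. This is a minimal in-memory illustration, not the PR's actual implementation: the real `create_batch_conversation_logs` presumably writes rows through the project's ORM, and the `conversation_logs` list here is a hypothetical stand-in for the `ConversationLog` table.

```python
import json
import uuid

# Hypothetical stand-in for the ConversationLog table; the real code
# would persist rows via the project's ORM instead.
conversation_logs = []

def create_batch_conversation_logs(batch_id, messages):
    """Create one queued log row per message in the batch.

    The new `batch_data` JSON column stores just enough context
    (batch id + message id) to match each row to its batch result later.
    """
    created = []
    for message in messages:
        log = {
            "log_id": str(uuid.uuid4()),
            "message": message,
            "status": "queued",
            "batch_data": json.dumps({
                "batch_id": batch_id,
                "message_id": message["message_id"],
            }),
        }
        conversation_logs.append(log)
        created.append(log)
    return created
```

One row per message keeps later updates cheap: when a batch result arrives, only the matching `batch_data` row is touched rather than a single aggregate record.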
@harshhsahu harshhsahu changed the base branch from master to testing January 17, 2026 13:50

@harshhsahu harshhsahu left a comment


  1. custom_id -> message_id
  2. store the webhook + response
  3. improve the LLM message when data is queued
  4. handle errors when data is missing
  5. use the same function
  6. metrics handling

…of custom_id

- Renamed `custom_id` to `message_id` in various functions and services to improve clarity and consistency.
- Updated the `updateConversationLogByBatchData` function to reflect the change in parameter naming.
- Adjusted batch processing services (Gemini, Mistral, Anthropic, OpenAI, Groq) to use `message_id` for API requests and responses.
- Enhanced the `Batch_Response_formatter` and `process_batch_results` functions to accommodate the new naming convention.
- Modified the `check_batch_status` function to ensure proper handling of `message_id` during log updates.

This update streamlines the handling of message identifiers across the codebase, improving maintainability and readability.
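The lookup-and-update step described above can be sketched as below. This is an illustrative version only: the PR's `updateConversationLogByBatchData` operates on the database, while this sketch matches against in-memory log dicts whose `batch_data` column holds a JSON blob of `batch_id` and `message_id` (the renamed identifier).

```python
import json

def update_conversation_log_by_batch_data(logs, batch_id, message_id, updates):
    """Find the log whose batch_data matches (batch_id, message_id) and apply updates.

    Returns the updated log, or None when no row matches -- the
    missing-data error path the review asked to handle explicitly.
    """
    for log in logs:
        data = json.loads(log["batch_data"])
        if data.get("batch_id") == batch_id and data.get("message_id") == message_id:
            log.update(updates)
            return log
    return None
```

Returning `None` instead of raising keeps the caller (e.g. a status-polling loop) free to log the mismatch and continue processing the remaining batch results.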
- Added `org_id`, `bridge_id`, `version_id`, and `thread_id` to the batch JSON payloads in Gemini, Mistral, Anthropic, OpenAI, and Groq services for improved tracking and context.
- Updated the `check_batch_status` function to handle new metadata and calculate individual message costs, including metrics storage for batch processing.
- Enhanced error handling and logging for metrics saving to Timescale DB.

This update improves the granularity of batch processing data and facilitates better monitoring and cost calculation.
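The per-message cost calculation and metrics collection can be sketched as follows. The function name `process_batch_results` comes from the PR, but the body, the `pricing` parameter (per-million-token rates), and the result/metadata field names are assumptions for illustration; the real code also persists these rows to Timescale DB with error handling.

```python
def process_batch_results(results, pricing):
    """Compute per-message cost from token usage and collect metrics rows.

    `pricing` holds assumed per-million-token rates, e.g.
    {"input": 2.0, "output": 6.0}. Each result is expected to carry the
    metadata (org_id, bridge_id, ...) that the batch payload now includes.
    """
    metrics = []
    for result in results:
        usage = result["usage"]
        cost = (usage["input_tokens"] * pricing["input"]
                + usage["output_tokens"] * pricing["output"]) / 1_000_000
        metrics.append({
            "message_id": result["message_id"],
            "org_id": result["metadata"]["org_id"],
            "bridge_id": result["metadata"]["bridge_id"],
            "cost": cost,
        })
    return metrics  # the real code would save these rows to Timescale DB
```

Carrying `org_id` and `bridge_id` through the batch payload means each metrics row can be attributed without an extra lookup when results come back, which matters when thousands of messages share one batch job.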
