
Conversation

@ViaSocket-Git (Collaborator):

All intermediate agents in the flow publish to the queue.

@ViaSocket-Git changed the title from "all agents in orchestrator flow publishes to queue" to "all agents in orchestral flow publishes to queue" on Jan 8, 2026.
@ViaSocket-Git (Collaborator, Author):

/windsurf-review


@windsurf-bot (bot) left a comment:


💡 To request another review, post a new comment with "/windsurf-review".

Comment on lines +224 to +227
# Prepare queue data for this intermediate agent
queue_data = await make_request_data_and_publish_sub_queue(parsed_data, result, params, thread_info)
queue_data = make_json_serializable(queue_data)
current_history_data['queue_data'] = queue_data

Consider adding error handling around the queue data serialization to prevent potential runtime errors if the data contains non-serializable objects:

Suggested change (replacing the lines quoted above):

# Prepare queue data for this intermediate agent
try:
    queue_data = await make_request_data_and_publish_sub_queue(parsed_data, result, params, thread_info)
    queue_data = make_json_serializable(queue_data)
    current_history_data['queue_data'] = queue_data
except Exception as e:
    logger.error(f"Error serializing queue data: {str(e)}")
    # Still continue with the flow even if queue data couldn't be serialized
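
As a further option (not part of the bot's suggestion), the serialization step itself can be made defensive by coercing unknown objects to strings instead of raising. The helper below is a hypothetical sketch for illustration only; it is not the repository's actual make_json_serializable, and the str fallback is an assumption.

```python
import json
from typing import Any


def make_json_serializable_safe(data: Any) -> Any:
    """Round-trip data through JSON, coercing non-serializable objects to str.

    Hypothetical helper; the real make_json_serializable in this codebase
    may behave differently.
    """
    return json.loads(json.dumps(data, default=str))
```

With a fallback like this, the except branch in the suggestion becomes a genuine last resort rather than the normal path for unusual payloads.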

Comment on lines +513 to +517
# Publish all agents to queue
for history_entry in transfer_chain:
    if 'queue_data' in history_entry:
        await sub_queue_obj.publish_message(history_entry['queue_data'])


The queue publishing operations have no error handling, so a failure while publishing for one agent can break the entire chain. Consider wrapping each publish in a try/except so the process continues even if one publish fails:

Suggested change (replacing the lines quoted above):

# Publish all agents to queue
for history_entry in transfer_chain:
    if 'queue_data' in history_entry:
        try:
            await sub_queue_obj.publish_message(history_entry['queue_data'])
        except Exception as e:
            logger.error(f"Failed to publish queue message for agent {history_entry['bridge_id']}: {str(e)}")
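
If the individual publishes are independent of each other, another option (not part of the bot's suggestion) is to send them concurrently and collect per-message failures with asyncio.gather(return_exceptions=True). This is a sketch under the assumption that sub_queue_obj.publish_message tolerates concurrent calls; publish_transfer_chain and its argument list are illustrative names, not code from this PR.

```python
import asyncio
import logging

logger = logging.getLogger(__name__)


async def publish_transfer_chain(sub_queue_obj, transfer_chain):
    """Publish every prepared queue payload, tolerating individual failures.

    Sketch only: assumes publish_message is safe to await concurrently.
    """
    entries = [e for e in transfer_chain if 'queue_data' in e]
    results = await asyncio.gather(
        *(sub_queue_obj.publish_message(e['queue_data']) for e in entries),
        return_exceptions=True,
    )
    for entry, outcome in zip(entries, results):
        if isinstance(outcome, Exception):
            logger.error(f"Failed to publish queue message for agent {entry.get('bridge_id')}: {outcome}")
```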

Comment on lines +464 to +466
queue_data = await make_request_data_and_publish_sub_queue(parsed_data, result, params, thread_info)
queue_data = make_json_serializable(queue_data)
current_history_data['queue_data'] = queue_data
Copy link

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Storing all queue_data in memory before publishing could lead to memory issues with large payloads or many agents in a complex flow. Consider publishing each agent's data immediately after preparing it rather than storing all data and publishing at the end.
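
A minimal sketch of the publish-as-you-go shape this comment describes, assuming the helpers referenced in the PR (make_request_data_and_publish_sub_queue, make_json_serializable, sub_queue_obj.publish_message) are in scope; the function boundary, argument list, and logger are illustrative, not the PR's actual code.

```python
import logging

logger = logging.getLogger(__name__)


async def prepare_and_publish_agent(sub_queue_obj, parsed_data, result, params, thread_info):
    """Prepare one intermediate agent's payload and publish it immediately.

    Illustrative sketch: avoids accumulating every payload in transfer_chain
    and flushing them all at the end of the flow.
    """
    queue_data = await make_request_data_and_publish_sub_queue(parsed_data, result, params, thread_info)
    queue_data = make_json_serializable(queue_data)
    try:
        await sub_queue_obj.publish_message(queue_data)
    except Exception as e:
        logger.error(f"Failed to publish queue message: {str(e)}")
```

The trade-off is losing the ability to batch or reorder publishes at the end of the chain, but peak memory stays bounded by a single payload.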
