Add generic MemoryTool and Anthropic-native MemoryTool #119

Draft

KavyaSree2610 wants to merge 10 commits into main from kkaitepalli/memorytool

Conversation

@KavyaSree2610
Collaborator

No description provided.

@codecov-commenter

codecov-commenter commented Mar 3, 2026

Codecov Report

❌ Patch coverage is 99.70501% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 96.59%. Comparing base (ba575dc) to head (3067b59).

Files with missing lines Patch % Lines
...rc/microbots/tools/tool_definitions/memory_tool.py 99.47% 1 Missing ⚠️
Additional details and impacted files


@@            Coverage Diff             @@
##             main     #119      +/-   ##
==========================================
+ Coverage   95.88%   96.59%   +0.70%     
==========================================
  Files          26       29       +3     
  Lines        1046     1379     +333     
==========================================
+ Hits         1003     1332     +329     
- Misses         43       47       +4     
Flag Coverage Δ
integration 56.05% <23.59%> (-10.39%) ⬇️
ollama_local 51.92% <20.35%> (-10.13%) ⬇️
slow-browser 42.56% <20.35%> (-7.15%) ⬇️
slow-other 57.43% <20.35%> (-11.41%) ⬇️
unit 85.64% <99.70%> (+4.95%) ⬆️

Flags with carried forward coverage won't be shown.

Files with missing lines Coverage Δ
src/microbots/MicroBot.py 100.00% <ø> (ø)
src/microbots/llm/anthropic_api.py 100.00% <100.00%> (ø)
src/microbots/llm/llm.py 100.00% <100.00%> (ø)
src/microbots/tools/tool_definitions/__init__.py 100.00% <100.00%> (ø)
...ts/tools/tool_definitions/anthropic_memory_tool.py 100.00% <100.00%> (ø)
...rc/microbots/tools/tool_definitions/memory_tool.py 99.47% <99.47%> (ø)

... and 2 files with indirect coverage changes


@KavyaSree2610 KavyaSree2610 force-pushed the kkaitepalli/memorytool branch from bad1c47 to abba902 Compare March 5, 2026 08:23
@KavyaSree2610 KavyaSree2610 marked this pull request as ready for review March 5, 2026 10:08
Member

@0xba1a left a comment


Can you please explain why this _dispatch_tool_use and native_tool concept is necessary?

I can understand the requirement for an AnthropicMemoryTool wrapper over the generic MemoryTool. But can't that simply be used as an ExternalTool itself? IMO, the logic of _dispatch_tool_use can be handled in the ask itself, right? It acts as a tool_call wrapper for every tool call (internal and external).

I think we can pass all the tools (not just the native tools) in the tools argument of the Anthropic API call. When every tool can be passed via this, we may not even need a nativeTool abstraction.

You can silently upgrade generic MemoryTool into AnthropicMemoryTool inside the anthropic_api's __init__
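
A minimal sketch of that silent upgrade, assuming hypothetical class names and constructor shape (these are illustrative stand-ins, not the PR's actual implementation):

```python
# Illustrative sketch only: class names and the constructor shape are
# assumptions for the example, not the PR's actual code.

class MemoryTool:
    """Generic, provider-agnostic memory tool."""

class AnthropicMemoryTool(MemoryTool):
    """Anthropic-native wrapper over the generic MemoryTool."""

class AnthropicApi:
    def __init__(self, tools):
        # Silently upgrade any generic MemoryTool into its
        # Anthropic-native variant; other tools pass through unchanged.
        self.tools = [
            AnthropicMemoryTool() if type(t) is MemoryTool else t
            for t in tools
        ]
```

With this shape, callers keep constructing the provider-agnostic MemoryTool and never need to know the Anthropic-native variant exists.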

self._memory_dir = base
self._memory_dir.mkdir(parents=True, exist_ok=True)

def is_model_supported(self, model_name: str) -> bool:
Member


This overriding is required here as it exhibits the default behavior of ToolAbstract

Collaborator Author


I believe you mean it is not required

folder_to_mount=folder_to_mount,
)

def _upgrade_tools_for_provider(self):
Member


Let's move this function to the LLM class itself. Each LLM can have an independent logic to control this behavior.
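
A sketch of what that move could look like: a default hook on the LLM base class that each provider may override. All names here (LLM, AnthropicApi, upgrade_tools, the tool classes) are assumptions for the example, not the PR's actual code:

```python
# Illustrative sketch of a per-provider upgrade hook on the LLM base
# class; every identifier here is a hypothetical stand-in.

class MemoryTool:
    """Generic memory tool."""

class AnthropicMemoryTool(MemoryTool):
    """Anthropic-native variant."""

class LLM:
    def upgrade_tools(self, tools):
        # Default behaviour: no provider-specific optimisation.
        return tools

class AnthropicApi(LLM):
    def upgrade_tools(self, tools):
        # Anthropic-specific logic: swap in the native memory tool.
        return [
            AnthropicMemoryTool() if type(t) is MemoryTool else t
            for t in tools
        ]
```

This keeps MicroBot free of provider knowledge: it calls the same hook regardless of backend, and only the Anthropic subclass knows an optimised variant exists.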

self.additional_tools = upgraded

def _create_llm(self):
self._upgrade_tools_for_provider()
Member


I think we may not even need this explicit call. We can let the LLM silently replace tools with their optimised native-tool equivalents.

# Dispatch any tool_use rounds before looking for a JSON response.
# The model may call the memory tool multiple times before producing
# its final JSON command.
while response.stop_reason == "tool_use":
Member


Why while loop here?
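
For context on the pattern: the Anthropic Messages API returns stop_reason == "tool_use" each time the model requests a tool call, and the model can issue several such rounds back-to-back, so a single if would stop after the first round. A runnable sketch with a stubbed client (FakeClient, _dispatch_tool_use, and ask are illustrative stand-ins, not the PR's code):

```python
# Illustrative sketch of the tool_use dispatch loop. FakeClient and
# _dispatch_tool_use are stand-ins; the real code talks to the
# Anthropic Messages API.

class FakeResponse:
    def __init__(self, stop_reason, content):
        self.stop_reason = stop_reason
        self.content = content

class FakeClient:
    """Returns two tool_use rounds, then a final answer."""
    def __init__(self):
        self._rounds = [
            FakeResponse("tool_use", "memory lookup #1"),
            FakeResponse("tool_use", "memory lookup #2"),
            FakeResponse("end_turn", '{"command": "done"}'),
        ]

    def create(self, **kwargs):
        return self._rounds.pop(0)

def _dispatch_tool_use(response):
    # Execute the requested tool and return its result (stubbed here).
    return f"result of {response.content}"

def ask(client):
    response = client.create()
    # The model may call the memory tool several times before it
    # produces the final JSON command, hence a loop rather than a
    # single check.
    while response.stop_reason == "tool_use":
        _dispatch_tool_use(response)
        response = client.create()
    return response.content

print(ask(FakeClient()))  # → {"command": "done"}
```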

@KavyaSree2610 KavyaSree2610 marked this pull request as draft March 11, 2026 08:38

4 participants