Conversation
jamesbraza
left a comment
LGTM, added some optional comments
    """

    supported_templates: ClassVar[set[str]] = {
        "llama2_chat_template_ori.jinja",
No one but us is going to know what an "ori" template is. Any chance you can rename this file, or add a comment explaining it?
Yeah, it's on my to-do list to rename this, and there's documentation in chat_templates/README.md for all of the chat templates. I'll leave it for a later PR to sort this out.
    assert len(msg.tool_calls) == 1, (
        "Support parsing only single tool call for now"
    )
Since this is about something not yet supported, I would rather see raise NotImplementedError than assert.
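The suggested change can be sketched as follows. This is a hedged illustration, not the PR's actual code: the Message/ToolCall classes and the parse_tool_call helper are hypothetical stand-ins for the project's real types. The point is that an unimplemented case should raise NotImplementedError (which survives `python -O` and signals a missing feature) rather than trip an assert (which signals an invariant violation and can be stripped).

```python
from dataclasses import dataclass, field


@dataclass
class ToolCall:  # hypothetical stand-in for the project's tool-call type
    name: str


@dataclass
class Message:  # hypothetical stand-in for the project's message type
    tool_calls: list[ToolCall] = field(default_factory=list)


def parse_tool_call(msg: Message) -> ToolCall:
    # Raise instead of asserting: multiple tool calls are unsupported,
    # not impossible, so NotImplementedError is the honest signal.
    if len(msg.tool_calls) != 1:
        raise NotImplementedError("Support parsing only single tool call for now")
    return msg.tool_calls[0]
```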
    self.model_name = model_config.model

    @classmethod
    @abstractmethod
    def prep_tools_for_tokenizer(cls, tools: Tools | None) -> list[dict] | None:
If we are going to prepare tools, why are we then passing None tools?
Just pointing out that Tools | None doesn't really make sense here; imo this should be something like *tools: Tool or tools: Tools.
It's because tools can be None here:
LocalLLMCallOp, line 256 in 0d0e0bd
LocalLLMCallOp previously hardcoded Llama 3.1-friendly message-parsing logic. This PR makes that configurable by the user.