feat(deepseek generator) #138
Conversation
…representation with a None value source or destination concept
schmarion left a comment
Hi Oumar,
Great to see a review from you! Nice job ;)
| Text: {context}""" | ||
| return prompt_template | ||
|
|
||
| def deepseek_prompt_concept_term_extraction(context: str) -> str: |
I feel like this function is the same as openai_prompt_concept_term_extraction, so maybe you should delete the redundant prompts.
Hi Marion,
Delighted to contribute to the project.
So shall we keep only the openai prompt function under the same name, or rename the openai prompt function to a new, more generic name?
A new and more generic name sounds good!
OK, I chose llm_prompt_concept_term_extraction.
The prompt schema you used is abstract enough to be compatible with most modern LLMs.
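A minimal sketch of what such a provider-agnostic prompt builder might look like; the template wording below is an assumption for illustration, not OLAF's actual prompt:

```python
def llm_prompt_concept_term_extraction(context: str) -> str:
    """Build a prompt for concept term extraction usable with any chat LLM.

    The instruction text here is illustrative; the real OLAF template
    may differ.
    """
    prompt_template = f"""Extract the concept terms from the text below.
Return one term per line, with no extra commentary.

Text: {context}"""
    return prompt_template
```

Because the prompt is plain text, the same function can feed an OpenAI, DeepSeek, or any other chat-completion backend without duplication.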
    def __repr__(self) -> str:
        if self.source_concept is None or self.destination_concept is None:
            return f"(None, {self.label}, None)"
        return f"({self.source_concept.label}, {self.label}, {self.destination_concept.label})"
The case where only the source or only the destination is None is still missing.
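One way to cover the mixed cases is to format each side independently. This is a sketch, assuming a relation object with `source_concept`, `destination_concept`, and `label` attributes as in the diff above; the class definitions here are illustrative stand-ins, not OLAF's actual classes:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Concept:
    label: str


@dataclass
class Relation:
    label: str
    source_concept: Optional[Concept] = None
    destination_concept: Optional[Concept] = None

    def __repr__(self) -> str:
        # Render each endpoint on its own, so that a relation with only
        # one missing endpoint still shows the endpoint that exists.
        source = self.source_concept.label if self.source_concept else "None"
        destination = (
            self.destination_concept.label if self.destination_concept else "None"
        )
        return f"({source}, {self.label}, {destination})"
```

With this shape, `(cat, is_a, None)` and `(None, is_a, animal)` both render correctly instead of collapsing to `(None, is_a, None)`.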
Added a DeepSeek (a powerful and cheaper LLM) generator to OLAF, along with all the necessary tests.