
Conversation

@lopuhin (Contributor) commented Apr 2, 2025

This makes it possible to see when the model is less certain:

[image: example output with token probabilities highlighted]

TODO:

  • tests
  • tutorial, including limitations in case of CoT
  • mention as an extra method in README and docs
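The underlying idea can be sketched as follows. This is a minimal illustration only, not eli5's actual API (eli5 renders the result as weighted HTML spans), and the threshold values are made up for the example:

```python
import math

def bucket_token_probabilities(token_logprobs, thresholds=(0.5, 0.9)):
    """Map each (token, logprob) pair to a coarse certainty bucket.

    Hypothetical helper for illustration: converts each token's logprob
    back to a probability and buckets it, roughly mirroring how a
    highlighter would pick a color intensity per token.
    """
    low, high = thresholds
    result = []
    for token, logprob in token_logprobs:
        p = math.exp(logprob)  # logprob -> probability in [0, 1]
        if p >= high:
            bucket = "certain"
        elif p >= low:
            bucket = "uncertain"
        else:
            bucket = "very uncertain"
        result.append((token, round(p, 3), bucket))
    return result

# Example: a confidently predicted token vs. a low-probability one.
print(bucket_token_probabilities([("Hello", -0.01), ("maybe", -1.2)]))
```

Tokens the model was unsure about (low probability) stand out immediately, which is the signal the screenshot above visualizes.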

@lopuhin (Contributor, Author) commented Apr 5, 2025

It would be cool to support this with open-source models, likely in a future PR. I had a look:

  • ollama does not support logprobs, but the feature is being discussed in ollama/ollama#2415 (Provide logits or logprobs in the API)
  • vLLM does seem to support it, but it's probably not the most popular hobbyist option
  • mlx-llm also supports it; it probably makes sense to use in a tutorial.
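For backends that expose an OpenAI-compatible server (vLLM does), the request shape is the same as for OpenAI itself. A sketch of the payload; the field names follow the OpenAI chat completions API, while the helper name and model string are made up for the example:

```python
def logprobs_request(model, prompt, top_logprobs=5):
    """Build an OpenAI-style chat completions payload asking for logprobs.

    `logprobs=True` requests the logprob of each sampled token;
    `top_logprobs` additionally requests the top alternative tokens
    at each position.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "logprobs": True,
        "top_logprobs": top_logprobs,
    }

payload = logprobs_request("my-local-model", "What is eli5?")
```

The same payload can then be POSTed to the server's `/v1/chat/completions` endpoint with any HTTP client.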

@lopuhin lopuhin changed the title [WIP] Explain LLM predictions, using logprobs to highlight token probability Explain LLM predictions, using logprobs to highlight token probability Apr 5, 2025
@lopuhin lopuhin requested a review from Copilot April 5, 2025 19:22
Copilot AI (Contributor) left a comment


Copilot reviewed 10 out of 18 changed files in this pull request and generated 2 comments.

Files not reviewed (8)
  • README.rst: Language not supported
  • docs/source/libraries/index.rst: Language not supported
  • docs/source/libraries/openai.rst: Language not supported
  • docs/source/overview.rst: Language not supported
  • docs/source/tutorials/explain_llm_logprobs.rst: Language not supported
  • docs/update-notebooks.sh: Language not supported
  • eli5/templates/weighted_spans.html: Language not supported
  • tox.ini: Language not supported

lopuhin and others added 2 commits April 5, 2025 20:25
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@lopuhin lopuhin merged commit 3219fdf into master Apr 5, 2025
9 checks passed
@lopuhin lopuhin deleted the llm-token-probs branch April 5, 2025 19:33
@codelion commented Apr 7, 2025

You can use optillm for that: https://github.com/codelion/optillm (see algorithmicsuperintelligence/optillm#168, reply in thread). You can also visualize the outputs using the logprobs visualizer: https://huggingface.co/spaces/codelion/LogProbsVisualizer

@lopuhin (Contributor, Author) commented Apr 8, 2025

Thank you @codelion, do you mind if I add those links to the end of the tutorial here: https://eli5.readthedocs.io/en/stable/tutorials/explain_llm_logprobs.html#open-source-and-other-models ?

@codelion commented Apr 8, 2025

@lopuhin please go ahead, thanks!

