Improve LLM response stream logs: add thoughts, fix double print#83

Merged
0xba1a merged 4 commits into main from copilot/improve-llm-response-logs on Dec 30, 2025

Conversation

Contributor

Copilot AI commented Dec 11, 2025

  • Understand the codebase and identify the issues
  • Fix double printing of LLM output (removed duplicate log that printed command output twice)
  • Add logging of LLM thoughts alongside commands (using 💭 emoji with cyan color)
  • Add logging of final LLM thoughts when task is completed
  • Run tests to validate changes (all 103 tests pass)
  • Address code review comments (removed extra space before colon in log messages)
  • Run security check (no issues found)
  • Address PR review: Convert command execution details to debug log with 🔧 icon
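
The logging behavior described in the checklist above can be sketched as follows. This is an illustrative reconstruction, not the actual code from src/microbots/MicroBot.py: the function name `log_llm_turn`, the logger name, and the ANSI constants are assumptions; only the 💭/🔧 icons, the cyan color for thoughts, and the info-vs-debug split are taken from the PR description.

```python
import logging
from typing import Optional

# ANSI escape codes for cyan thought lines (illustrative constants,
# not the actual MicroBot.py identifiers).
CYAN = "\033[96m"
RESET = "\033[0m"

logger = logging.getLogger("microbot")


def log_llm_turn(thought: Optional[str], command: Optional[str]) -> None:
    """Log one LLM response: thought at info level (cyan, 💭),
    command execution details at debug level (🔧)."""
    if thought:
        # LLM thoughts are surfaced in the info log alongside commands.
        logger.info("%s💭 %s%s", CYAN, thought, RESET)
    if command:
        # Execution details go to debug so the info log stays uncluttered;
        # the command output itself is logged only once, avoiding the
        # double print the issue reported.
        logger.debug("🔧 Executing: %s", command)
```

In this sketch the fix for the double print is simply that command output is no longer echoed a second time when it is sent back to the LLM as a message; only the single debug-level entry remains.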
Original prompt

This section details the original issue you should resolve

<issue_title>Improve the llm response stream logs</issue_title>
<issue_description>Improve the info log of llm commands and response stream.

  • Avoid double print of llm output (once as command output and second as msg back to llm)
  • Print llm thoughts too</issue_description>

Comments on the Issue (you are @copilot in this section)


Copilot AI and others added 2 commits December 11, 2025 12:59
Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Improve logging for llm response streams" to "Improve LLM response stream logs: add thoughts, fix double print" on Dec 11, 2025
Copilot AI requested a review from 0xba1a December 11, 2025 13:03
@codecov-commenter

codecov-commenter commented Dec 11, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 91.71%. Comparing base (cb7389d) to head (11f2cff).
⚠️ Report is 2 commits behind head on main.

Additional details and impacted files


@@            Coverage Diff             @@
##             main      #83      +/-   ##
==========================================
- Coverage   91.92%   91.71%   -0.21%     
==========================================
  Files          21       21              
  Lines         805      809       +4     
==========================================
+ Hits          740      742       +2     
- Misses         65       67       +2     
Flag           Coverage Δ
integration    79.85% <100.00%> (-0.03%) ⬇️
ollama_local   61.92% <100.00%> (-0.19%) ⬇️
unit           57.84% <0.00%> (-0.29%) ⬇️

Flags with carried forward coverage won't be shown. Click here to find out more.

Files with missing lines     Coverage Δ
src/microbots/MicroBot.py    100.00% <100.00%> (ø)

... and 2 files with indirect coverage changes


Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
Copilot AI requested a review from 0xba1a December 11, 2025 14:04
@0xba1a 0xba1a marked this pull request as ready for review December 30, 2025 15:25
@0xba1a
Member

0xba1a commented Dec 30, 2025

@copilot Resolve merge conflict

Contributor Author

Copilot AI commented Dec 30, 2025

> @copilot Resolve merge conflict

I attempted to merge main into this branch and there are no actual conflicts; the changes merge cleanly. However, I cannot push merge commits directly. You'll need to run `git pull origin main` locally or use the "Update branch" button in the GitHub PR interface to merge the latest main into this branch.

Copilot AI requested a review from 0xba1a December 30, 2025 15:30
@0xba1a 0xba1a merged commit 583f65d into main Dec 30, 2025
7 of 9 checks passed
@0xba1a 0xba1a deleted the copilot/improve-llm-response-logs branch December 30, 2025 16:04

Development

Successfully merging this pull request may close these issues.

Improve the llm response stream logs

4 participants