Add ask command for natural-language search across meeting notes #7
Merged
Two-stage LLM pipeline: first identifies relevant meetings from summaries, then answers from full transcripts with quote verification. Includes spinner UX, meeting source citation in answers, keyword fallback, and context_size config option.
Pull request overview
Adds a new `ownscribe ask` CLI command that performs two-stage natural-language search over stored meeting notes: first selecting relevant meetings from summaries, then answering from full transcripts with quote verification and citations.
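The quote-verification step described above could work along these lines — a minimal sketch, not the PR's actual implementation; the function name, the word-length cutoff, and the 0.8 overlap threshold are illustrative assumptions:

```python
import re


def verify_quote(quote: str, transcript: str) -> bool:
    """Sketch: check a model-produced quote against the source transcript.

    Try an exact (whitespace-normalized, case-insensitive) match first;
    if that fails, fall back to a keyword-overlap check so lightly
    paraphrased quotes can still pass. Threshold values are assumptions.
    """
    def norm(s: str) -> str:
        return re.sub(r"\s+", " ", s).strip().lower()

    if norm(quote) in norm(transcript):
        return True

    # Keyword fallback: require most content words of the quote to appear.
    words = [w for w in re.findall(r"[a-z']+", norm(quote)) if len(w) > 3]
    if not words:
        return False
    hits = sum(w in norm(transcript) for w in words)
    return hits / len(words) >= 0.8
```

An exact-match-first design like this keeps verification cheap and deterministic, while the fallback tolerates the small transcription drift LLMs often introduce when quoting.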
Changes:
- Introduces `ownscribe.search`, implementing meeting discovery, summary chunking, LLM-based meeting selection, transcript answering, and quote verification (with keyword fallback).
- Adds `Summarizer.chat(...)` to the summarization interface and implements it for the Ollama/OpenAI backends (including OpenAI `response_format` fallback behavior).
- Adds extensive automated tests for search behavior and OpenAI JSON-mode fallback; wires the new `ask` command into the CLI + docs/config.
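The two-stage flow above can be sketched as follows — a hypothetical outline, assuming a `chat` callable like the `Summarizer.chat(...)` this PR adds; the `Meeting` shape, prompts, and ID-matching heuristic are illustrative, not the PR's code:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Meeting:
    meeting_id: str
    summary: str
    transcript: str


def ask(question: str, meetings: list[Meeting], chat: Callable[[str], str]) -> str:
    """Two-stage ask: select meetings from summaries, answer from transcripts."""
    # Stage 1: show the model only summaries and ask which meetings matter.
    summaries = "\n".join(f"[{m.meeting_id}] {m.summary}" for m in meetings)
    reply = chat(f"Which meetings are relevant to: {question}?\n{summaries}")
    selected = [m for m in meetings if m.meeting_id in reply]

    # Stage 2: answer from the full transcripts of the selected meetings,
    # instructing the model to cite meeting IDs as sources.
    context = "\n\n".join(
        f"=== {m.meeting_id} ===\n{m.transcript}" for m in selected
    )
    return chat(
        "Answer using only these transcripts; cite meeting IDs.\n"
        f"{context}\n\nQuestion: {question}"
    )
```

Splitting the work this way keeps the expensive full-transcript context limited to the handful of meetings stage 1 selected, which is what makes a `context_size` budget practical.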
Reviewed changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| tests/test_search.py | Adds unit + integration tests for meeting search/ask pipeline and quote verification. |
| tests/test_pipeline.py | Updates mocks to patch the new create_summarizer factory. |
| src/ownscribe/summarization/prompts.py | Adds search-specific system/user prompts for find/answer stages. |
| src/ownscribe/summarization/openai_summarizer.py | Implements chat() with JSON response_format fallback attempts. |
| src/ownscribe/summarization/ollama_summarizer.py | Implements chat() with optional JSON formatting. |
| src/ownscribe/summarization/base.py | Extends Summarizer ABC with chat(). |
| src/ownscribe/summarization/__init__.py | Adds shared create_summarizer(config) factory. |
| src/ownscribe/search.py | New search/ask implementation: discovery, ranking, LLM stages, quote verification, context sizing. |
| src/ownscribe/pipeline.py | Switches to shared create_summarizer factory. |
| src/ownscribe/config.py | Adds summarization.context_size config option and docs comment. |
| src/ownscribe/cli.py | Adds the ask subcommand and options (--since, --limit). |
| README.md | Documents ask usage and the two-stage pipeline behavior. |
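The `chat()` extension and the OpenAI `response_format` fallback listed in the table could look roughly like this — a sketch under stated assumptions: the method signature, the `json_mode` flag, and the injected `client` interface are hypothetical stand-ins, not the PR's actual API:

```python
from abc import ABC, abstractmethod


class Summarizer(ABC):
    """Illustrative version of the extended summarizer interface."""

    @abstractmethod
    def chat(self, prompt: str, json_mode: bool = False) -> str:
        """Send a free-form prompt and return the model's reply."""


class OpenAISummarizer(Summarizer):
    def __init__(self, client):
        # `client` is any object exposing create(prompt, response_format=...);
        # a stand-in for the real OpenAI client wiring.
        self.client = client

    def chat(self, prompt: str, json_mode: bool = False) -> str:
        if json_mode:
            # Try JSON mode first; backends that reject response_format
            # raise, and we fall back to a plain-text request.
            try:
                return self.client.create(
                    prompt, response_format={"type": "json_object"}
                )
            except (TypeError, ValueError):
                pass
        return self.client.create(prompt, response_format=None)
```

Putting the fallback inside `chat()` means callers such as the search pipeline can always request JSON and let each backend degrade gracefully on its own.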