feat(lightspeed): show expandable card for deep thinking responses #2092
Conversation
Changed Packages
@debsmita1 In the recording I can see the Thinking block is below the main content, but I think it should be above the main content?
This is a PatternFly component that I am using for rendering the deep thinking content.
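For context, here is a minimal sketch of how a deep thinking block could be rendered above the main answer using PatternFly's ExpandableSection from @patternfly/react-core. This is not the PR's actual code: the component name DeepThinkingMessage, its props, and the toggle labels are illustrative assumptions.

```tsx
import React, { useState } from 'react';
import { ExpandableSection } from '@patternfly/react-core';

// Hypothetical props: `reasoning` is the model's deep thinking text and
// `answer` is the main response content.
type DeepThinkingMessageProps = {
  reasoning: string;
  answer: React.ReactNode;
};

export const DeepThinkingMessage = ({ reasoning, answer }: DeepThinkingMessageProps) => {
  const [isExpanded, setIsExpanded] = useState(false);

  return (
    <>
      {/* Thinking block rendered above the main content, as requested in the review. */}
      <ExpandableSection
        toggleText={isExpanded ? 'Hide thinking' : 'Show thinking'}
        isExpanded={isExpanded}
        // Ignore the callback arguments so this works across PatternFly versions,
        // whose onToggle signatures differ.
        onToggle={() => setIsExpanded(open => !open)}
      >
        {reasoning}
      </ExpandableSection>
      {answer}
    </>
  );
};
```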
3ee5dc1 to 6788339
rohitkrai03 left a comment
Overall looks good. Just one comment - shouldn't the thinking component be rendered at the top even if the main content is available?
6788339 to be30337
its-mitesh-kumar left a comment
Code looks good to me, though I was not able to verify it locally. Adding lgtm based on the screen recording.
/lgtm
/approve
rohitkrai03 left a comment
/approve




Hey, I just made a Pull Request!
Resolves: https://issues.redhat.com/browse/RHIDP-11420
Adds an expandable card when the model responds with reasoning
deppthinking.mov
Test setup:
- Pull a model that supports deep thinking.
- Replace run.yaml in lightspeed-stack.yaml with this.
- Set the llama_stack url in lightspeed-stack.yaml to url: http://localhost:8321 (a minimal config sketch follows below).

Note: not all AI models support tool calling, like this DeepSeek model, so ensure that you do not have an MCP server configured in lightspeed-stack.yaml.
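For convenience, a minimal sketch of the lightspeed-stack.yaml fragment described above, assuming the llama_stack section takes a url key; apart from the url value quoted from the setup notes, the surrounding structure is an assumption and your actual file will contain more settings.

```yaml
# Illustrative fragment only; everything except the url value is an assumption.
llama_stack:
  url: http://localhost:8321   # llama_stack url from the test setup above
# Note: if the model does not support tool calling (e.g. this DeepSeek model),
# make sure no MCP server is configured elsewhere in lightspeed-stack.yaml.
```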
✔️ Checklist