Is your feature request related to a problem? Please describe.
I recently heard about this project and thought I'd try it out with a locally run openai/gpt-oss-120b instance (llama-server, OpenAI-compatible API). Looking at what it can do, I noticed that the reasoning portions of the output don't seem to be shown in an `interpreter` session.
Describe the solution you'd like
Is there an existing setting one could toggle, or is it perhaps planned to show thinking/reasoning tokens?
Describe alternatives you've considered
- look further for an option I might be able to enable
- use a different model, maybe a non-thinking one (though for my use cases I find gpt-oss to be pretty good so far)
- look into adding this feature, maybe as a PR
- look for other similar projects
- create a very simple thin wrapper based on harmony (see the sketch after this list)
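As a rough illustration of what such a thin wrapper could do, here is a minimal sketch that streams a chat completion from a local llama-server (OpenAI-compatible API) and prints reasoning tokens separately from the final answer. The `reasoning_content` field is an assumption about how the server is configured (e.g. llama-server's reasoning parsing for gpt-oss/harmony) and is not part of this project's API; the model name and base URL are placeholders.

```python
# Hedged sketch: surface reasoning tokens from a local OpenAI-compatible
# server. Field names such as `reasoning_content` are assumptions that
# depend on the server's reasoning-parsing configuration.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="none")

stream = client.chat.completions.create(
    model="openai/gpt-oss-120b",
    messages=[{"role": "user", "content": "List the files in the current directory."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta
    # Some servers expose chain-of-thought in a separate delta field (assumption).
    reasoning = getattr(delta, "reasoning_content", None)
    if reasoning:
        print(f"[thinking] {reasoning}", end="", flush=True)
    if delta.content:
        print(delta.content, end="", flush=True)
```

If the server instead emits raw harmony channel markers in the content stream, the wrapper would need to parse those channels itself rather than rely on a separate field.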
Additional context
No response