mirror of https://github.com/spantaleev/matrix-docker-ansible-deploy.git (synced 2026-04-19 20:01:09 +03:00)
8889b018f3
Reasoning models like `o1` and `o3` and their `-mini` variants report errors if we try to configure `max_response_tokens` (which ultimately influences the `max_tokens` field in the API request):

> invalid_request_error: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead. (param: max_tokens) (code: unsupported_parameter)

`max_completion_tokens` is not yet supported by baibot, so the best we can do is at least get rid of `max_response_tokens` (`max_tokens`).

Ref: https://github.com/etkecc/baibot/commit/db9422740ceca32956d9628b6326b8be206344e2
2.4 KiB