fix: invalid model setting when passing prompt to Agent #1852
Conversation
  conversation=self._non_null_or_not_given(conversation_id),
  instructions=self._non_null_or_not_given(system_instructions),
- model=self.model,
+ model=self.model if prompt is None else NOT_GIVEN,
can you add a comment here explaining why?
rm-openai left a comment:
Also, confirming: are you sure this is correct? Devs are not allowed to override the model set in the prompt?
@rm-openai Good point; that is actually feasible. So, when a developer explicitly passes a model, it should be allowed; only the Agent's default model still needs to be ignored.
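To make that precedence concrete, here is a minimal sketch of the resolution rule being discussed. `resolve_model` is a hypothetical helper for illustration only; the real change lives inline in the SDK's Responses call, and the model names below are placeholders.

```python
from openai import NOT_GIVEN  # sentinel the openai client uses to omit a parameter


def resolve_model(explicit_model, default_model, prompt):
    """Pick the `model` value to send on the Responses API call (illustrative)."""
    # A model the developer explicitly set always wins, prompt or not.
    if explicit_model is not None:
        return explicit_model
    # With a prompt and no explicit model, omit `model` so the model
    # configured on the prompt takes effect instead of the SDK default.
    if prompt is not None:
        return NOT_GIVEN
    # No prompt and no explicit model: fall back to the default.
    return default_model
```

For example, `resolve_model(None, "gpt-4o", prompt)` returns `NOT_GIVEN`, while `resolve_model("gpt-4.1", "gpt-4o", prompt)` keeps the explicitly chosen model.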
Force-pushed from 33fc4f5 to 42166de.
@codex Can you review the changes with fresh eyes?
Codex Review: Didn't find any major issues. Can't wait for the next one!
This pull request fixes the bug reported for the TS SDK, which exists for the Python SDK too.
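For context, a minimal repro sketch, assuming the openai-agents Python SDK's `Agent`/`Runner` API; the prompt id and version are illustrative placeholders, not values from the PR.

```python
import asyncio

from agents import Agent, Runner


async def main() -> None:
    # The stored prompt is expected to carry its own model configuration.
    # Before this fix, the Agent's default model was still sent on the
    # Responses API call and silently overrode the prompt's model.
    agent = Agent(
        name="Assistant",
        prompt={"id": "pmpt_123", "version": "1"},  # placeholder prompt id
    )
    result = await Runner.run(agent, "Hello!")
    print(result.final_output)


asyncio.run(main())
```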