
Update to new GPT 3.5-Turbo type #385

Open
tyu1996 opened this issue Jun 5, 2024 · 1 comment

Comments


tyu1996 commented Jun 5, 2024

From our existing ChatDev typing.py:

from enum import Enum


class ModelType(Enum):
    GPT_3_5_TURBO = "gpt-3.5-turbo-16k-0613"
    GPT_3_5_TURBO_NEW = "gpt-3.5-turbo-16k"
    GPT_4 = "gpt-4"
    GPT_4_32k = "gpt-4-32k"
    GPT_4_TURBO = "gpt-4-turbo"
    GPT_4_TURBO_V = "gpt-4-turbo"

    STUB = "stub"

    @property
    def value_for_tiktoken(self):
        # STUB has no tokenizer of its own, so fall back to a concrete model name.
        return self.value if self.name != "STUB" else "gpt-3.5-turbo-16k-0613"
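
For context, value_for_tiktoken is presumably what ChatDev hands to tiktoken when counting tokens. The call site is not shown in this issue, so the snippet below is only a minimal sketch of that assumed usage (with camel.typing as the assumed import path):

import tiktoken

from camel.typing import ModelType  # assumed import path inside ChatDev

# Resolve the tokenizer for the configured model and count tokens for a sample string.
encoding = tiktoken.encoding_for_model(ModelType.GPT_3_5_TURBO.value_for_tiktoken)
print(len(encoding.encode("Hello from ChatDev")))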

Meanwhile, on OpenAI's official models page:

(Screenshot of the OpenAI models page showing the legacy GPT 3.5-Turbo models.)

The GPT 3.5-Turbo types currently used in ChatDev (gpt-3.5-turbo-16k-0613 and gpt-3.5-turbo-16k) are considered legacy and will be deprecated very soon (next week).

I advise updating the types so ChatDev stays on OpenAI's latest GPT 3.5-Turbo; using the value "gpt-3.5-turbo" should be sufficient, since it already supports a 16k context window by default.
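
Concretely, the change could look something like the sketch below (only a sketch: whether GPT_3_5_TURBO_NEW stays as an alias or gets removed depends on how the rest of the codebase references it):

from enum import Enum


class ModelType(Enum):
    # Both 3.5 members now point at OpenAI's current alias; "gpt-3.5-turbo"
    # already supports a 16k context window, so the -16k variants are not needed.
    GPT_3_5_TURBO = "gpt-3.5-turbo"
    GPT_3_5_TURBO_NEW = "gpt-3.5-turbo"  # becomes an alias of GPT_3_5_TURBO
    GPT_4 = "gpt-4"
    GPT_4_32k = "gpt-4-32k"
    GPT_4_TURBO = "gpt-4-turbo"
    GPT_4_TURBO_V = "gpt-4-turbo"

    STUB = "stub"

    @property
    def value_for_tiktoken(self):
        # Keep the STUB fallback pointing at the new model name as well.
        return self.value if self.name != "STUB" else "gpt-3.5-turbo"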


tyu1996 commented Jun 5, 2024

However, I suggest doing some testing before pushing this change. In my local repo I modified it to use "gpt-3.5-turbo", and ChatDev seems to underperform. More observations from different users are needed to confirm that the latest "gpt-3.5-turbo" type works well with ChatDev.
