OpenAI/ChatGPT API: maximum input tokens for each model at a glance
The table below lists the maximum input size (context window) for each ChatGPT API model; a short usage sketch follows the table:
MODEL | DESCRIPTION | CONTEXT WINDOW | TRAINING DATA |
---|---|---|---|
gpt-3.5-turbo-1106 | The latest GPT-3.5 Turbo model, with improved instruction following, JSON mode, reproducible outputs, parallel function calling, and more. Returns a maximum of 4,096 output tokens. | 16,385 tokens | Up to Sep 2021 |
gpt-3.5-turbo | Currently points to gpt-3.5-turbo-0613. Will point to gpt-3.5-turbo-1106 starting Dec 11, 2023. | 4,096 tokens | Up to Sep 2021 |
gpt-3.5-turbo-16k | Currently points to gpt-3.5-turbo-16k-0613. Will point to gpt-3.5-turbo-1106 starting Dec 11, 2023. | 16,385 tokens | Up to Sep 2021 |
gpt-3.5-turbo-instruct | Similar capabilities as text-davinci-003 but compatible with legacy Completions endpoint and not Chat Completions. | 4,096 tokens | Up to Sep 2021 |
gpt-3.5-turbo-0613 Legacy | Snapshot of gpt-3.5-turbo from June 13th 2023. Will be deprecated on June 13, 2024. | 4,096 tokens | Up to Sep 2021 |
gpt-3.5-turbo-16k-0613 Legacy | Snapshot of gpt-3.5-turbo-16k from June 13th 2023. Will be deprecated on June 13, 2024. | 16,385 tokens | Up to Sep 2021 |
gpt-3.5-turbo-0301 Legacy | Snapshot of gpt-3.5-turbo from March 1st 2023. Will be deprecated on June 13th 2024. | 4,096 tokens | Up to Sep 2021 |
text-davinci-003 Legacy | Can do language tasks with better quality and consistency than the curie, babbage, or ada models. Will be deprecated on Jan 4th 2024. | 4,096 tokens | Up to Jun 2021 |
text-davinci-002 Legacy | Similar capabilities to text-davinci-003 but trained with supervised fine-tuning instead of reinforcement learning. Will be deprecated on Jan 4th 2024. | 4,096 tokens | Up to Jun 2021 |
code-davinci-002 Legacy | Optimized for code-completion tasks. Will be deprecated on Jan 4th 2024. | 8,001 tokens | Up to Jun 2021 |
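
The context window listed above is shared between input (prompt) tokens and generated output tokens, so a request needs prompt tokens plus the requested `max_tokens` to stay under the model's limit. Below is a minimal sketch of one way to check this before calling the Chat Completions API. It assumes the `openai` Python SDK (v1.x) and `tiktoken` are installed and `OPENAI_API_KEY` is set; the `CONTEXT_WINDOWS` dict is hand-copied from the table above rather than fetched from the API, and chat-format messages add a few tokens of per-message overhead, so the check is approximate.

```python
# Minimal sketch (assumes openai>=1.0 and tiktoken are installed,
# and OPENAI_API_KEY is set in the environment).
import tiktoken
from openai import OpenAI

# Hand-copied from the table above; the window covers prompt + completion tokens.
CONTEXT_WINDOWS = {
    "gpt-3.5-turbo-1106": 16_385,
    "gpt-3.5-turbo": 4_096,
    "gpt-3.5-turbo-16k": 16_385,
    "gpt-3.5-turbo-instruct": 4_096,
}

def fits_in_context(model: str, prompt: str, max_output_tokens: int) -> bool:
    """Approximate check that prompt + requested output fits the model's window."""
    enc = tiktoken.encoding_for_model(model)  # cl100k_base for the gpt-3.5-turbo family
    prompt_tokens = len(enc.encode(prompt))
    # Chat messages carry a few extra formatting tokens per message,
    # so leave a small margin rather than filling the window exactly.
    return prompt_tokens + max_output_tokens <= CONTEXT_WINDOWS[model]

model = "gpt-3.5-turbo-1106"
prompt = "Explain the difference between the 4k and 16k GPT-3.5 Turbo models."

if fits_in_context(model, prompt, max_output_tokens=256):
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=256,  # cap on generated tokens, counted against the same window
    )
    print(response.choices[0].message.content)
else:
    print(f"Prompt too long for {model}'s context window.")
```

Note that `gpt-3.5-turbo-instruct` and the `text-davinci-*` / `code-davinci-002` models in the table use the legacy Completions endpoint rather than Chat Completions, i.e. `client.completions.create(model="gpt-3.5-turbo-instruct", prompt=prompt, max_tokens=256)` instead of `client.chat.completions.create(...)`.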