Cohere Command R Input Token Limit? #138672
This model lists its context window as 131k input tokens. Yet when I run the sample for making a single request and pass ~125k tokens, I get an error indicating an 8k input token limit.
Why do the two numbers (131k and 8k) not line up?
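For reference, here is a minimal sketch of the kind of single-request sample being described, using the Python `azure-ai-inference` client against the GitHub Models endpoint. The endpoint URL, model name, and the idea that the oversized prompt triggers the limit error are assumptions based on the GitHub Models documentation, not details confirmed in this thread.

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential
from azure.core.exceptions import HttpResponseError

# GitHub Models endpoint; authentication uses a GitHub personal access token.
client = ChatCompletionsClient(
    endpoint="https://models.inference.ai.azure.com",
    credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),
)

# A very long prompt (on the order of 125k tokens) to illustrate the scenario.
long_document = "..." * 100_000

try:
    response = client.complete(
        messages=[
            SystemMessage(content="You are a helpful assistant."),
            UserMessage(content=long_document),
        ],
        model="Cohere-command-r",  # assumed model identifier in GitHub Models
    )
    print(response.choices[0].message.content)
except HttpResponseError as err:
    # The per-request input token limit applied by GitHub Models (not the
    # model's native 131k context window) would surface here as an error.
    print(f"Request rejected: {err.message}")
```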
Replies: 1 comment 1 reply
Sorry for the confusion. Those are the characteristics of the model itself, but we apply our own token limits on top, which you can read about here: https://docs.github.com/en/github-models/prototyping-with-ai-models#rate-limits
We're still evaluating these limits since this is a limited preview, and there isn't a seamless upgrade path yet to let you go beyond these constraints. Hopefully more updates soon, and I'll think about how we can provide more clarity here.