Troubleshooting of OpenAI API
=== How to fix "This model's maximum context length is 4097 tokens" ===
Solution:
* Reduce the length of the input message<ref>[https://community.openai.com/t/splitting-chunking-large-input-text-for-summarisation-greater-than-4096-tokens/18494/3 Splitting / Chunking Large input text for Summarisation (greater than 4096 tokens) - General API discussion - OpenAI API Community Forum]</ref>. ([[Count number of characters]])
* Switch to another [https://platform.openai.com/docs/models/overview model] that supports a longer context window
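The first option above — splitting an over-long input into smaller pieces — can be sketched as follows. This is a minimal illustration, not official OpenAI code: the <code>split_into_chunks</code> helper and the ~4-characters-per-token heuristic are assumptions; for exact counts, use the model's actual tokenizer (e.g. the tiktoken library).

<syntaxhighlight lang="python">
# Sketch: split a long input into chunks that each fit well under the
# model's context limit (4097 tokens in the error message).
# ASSUMPTION: ~4 characters per token is only a rough English-text
# heuristic; a real tokenizer gives exact counts.

def split_into_chunks(text, max_tokens=3000, chars_per_token=4):
    """Split text into pieces of at most roughly max_tokens tokens."""
    max_chars = max_tokens * chars_per_token
    chunks = []
    while text:
        chunks.append(text[:max_chars])
        text = text[max_chars:]
    return chunks

long_input = "word " * 10000  # ~50,000 characters, far over the limit
chunks = split_into_chunks(long_input)
print(len(chunks), max(len(c) for c in chunks))
</syntaxhighlight>

Each chunk can then be sent as a separate request (e.g. summarize each piece, then summarize the summaries), which is the approach discussed in the forum thread cited above.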
=== How to fix "The server had an error while processing your request" ===