LLMs Usage FAQ

== Generating Longer Article Content ==

📝 Problem:
I want to use LLMs to generate articles of 5,000-6,000 words, but each attempt only produces 1,000-1,500 words.


💬 Reason:
LLMs impose a fixed per-request limit on output tokens, and the context window further caps the total tokens of input and output combined. As a result, a single response typically tops out at around 1,000-1,500 words. The recommended workaround is to outline the intended article first, then generate the content chapter by chapter, one request per chapter.
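The chapter-by-chapter workaround can be sketched as a simple loop. This is a minimal illustration, not a prescribed implementation: the `generate` helper below is a hypothetical stand-in for whichever LLM API you use, stubbed here so the control flow is self-contained.

```python
def generate(prompt: str) -> str:
    # Stub: a real implementation would send `prompt` to an LLM API
    # and return the model's completion.
    return f"[~1,000-1,500 words for: {prompt}]"

def generate_long_article(topic: str, outline: list[str]) -> str:
    chapters = []
    for heading in outline:
        # One request per chapter keeps each response comfortably
        # under the model's per-request output-token limit.
        prompt = (
            f"Write the chapter '{heading}' of an article about {topic}. "
            f"Full outline for context: {', '.join(outline)}."
        )
        chapters.append(f"== {heading} ==\n\n{generate(prompt)}")
    # Stitch the per-chapter responses into one long article.
    return "\n\n".join(chapters)

outline = ["Introduction", "Background", "Main Discussion", "Conclusion"]
article = generate_long_article("LLM context windows", outline)
```

Passing the full outline in each prompt helps keep the chapters consistent with one another; a variant is to also include a short summary of the previously generated chapters.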

