How to optimize your OpenAI API token usage

From LemonWiki共筆

Prevent Sending Duplicate Content to the OpenAI API
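If you process articles in batches, the same text can easily be submitted more than once, wasting tokens. One straightforward safeguard is to hash each article and skip any hash you have already sent. The sketch below is a minimal illustration of this idea; the function name `dedupe_texts` and the use of SHA-256 are assumptions for this example, not part of any OpenAI library.

```python
import hashlib

def dedupe_texts(texts, seen_hashes=None):
    """Return only texts not seen before, tracked by SHA-256 hash.

    `seen_hashes` can persist across calls (e.g., loaded from a
    database) so a duplicate article is never sent to the API twice.
    """
    if seen_hashes is None:
        seen_hashes = set()
    unique = []
    for text in texts:
        digest = hashlib.sha256(text.strip().encode("utf-8")).hexdigest()
        if digest not in seen_hashes:
            seen_hashes.add(digest)
            unique.append(text)
    return unique
```

In practice you would store `seen_hashes` between runs (a file or database table) and call the API only for the texts this function returns.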

Streamlining Questions and Responses

While it's acceptable to pose open-ended questions to explore the capabilities of ChatGPT, keep in mind that such questions can lead to longer, more creative responses that might increase costs. To achieve concise and cost-effective answers, consider refining your question by providing specific and limited options for the AI to select from.

For example:

  • Initial question for exploration:
Please offer five keywords for the following article:

```
Long text
``` 
  • Refined question:
Please select one of the following keywords: keyword1, keyword2, keyword3, keyword4, keyword5, for the subsequent article:

```
Long text
``` 
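The refined question above can be generated programmatically so every request uses the same constrained wording. This is a minimal sketch; the helper name `build_keyword_prompt` is an assumption for illustration.

```python
def build_keyword_prompt(article, keywords):
    """Build a prompt that restricts the model to a fixed keyword list,
    which keeps completions short and predictable."""
    options = ", ".join(keywords)
    return (
        f"Please select one of the following keywords: {options}, "
        f"for the subsequent article:\n\n{article}"
    )
```

The resulting string is what you would send as the user message in your API call.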


Handling Multiple Article Packages

Original prompt:

Please select one of the following keywords for the subsequent articles: keyword1, keyword2, keyword3, keyword4, keyword5.
```
Long text of article No.1
``` 

Another prompt:

Please select one of the following keywords for the subsequent articles: keyword1, keyword2, keyword3, keyword4, keyword5.
```
Long text of article No.2
``` 

Refined prompt:

Each row is the article number and content. For each article, select one of the keywords: keyword1, keyword2, keyword3, keyword4, keyword5. Provide your answer in the CSV format: "article number", "comma_separated_keywords"

```
No1. Long text of article No.1 (without return symbol)
No2. Long text of article No.2 (without return symbol)
No3. Long text of article No.3 (without return symbol)
...
``` 
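The batching idea above — one instruction, many numbered articles, a CSV answer — can be sketched in code. Newlines inside each article are flattened so that one row corresponds to one article, and the model's CSV reply is parsed back into a mapping. The function names (`build_batch_prompt`, `parse_csv_answer`) and the default keyword list are assumptions for this example.

```python
import csv
import io

KEYWORDS = ["keyword1", "keyword2", "keyword3", "keyword4", "keyword5"]

def build_batch_prompt(articles, keywords=KEYWORDS):
    """Pack several articles into one prompt to avoid repeating the
    instruction text for every article."""
    header = (
        "Each row is the article number and content. For each article, "
        "select one of the keywords: " + ", ".join(keywords) + ". "
        'Provide your answer in the CSV format: '
        '"article number", "comma_separated_keywords"\n\n'
    )
    rows = []
    for i, text in enumerate(articles, start=1):
        flat = " ".join(text.split())  # remove return symbols
        rows.append(f"No{i}. {flat}")
    return header + "\n".join(rows)

def parse_csv_answer(answer):
    """Parse the model's CSV reply into {article_number: [keywords]}."""
    result = {}
    for row in csv.reader(io.StringIO(answer), skipinitialspace=True):
        if len(row) >= 2:
            result[row[0].strip()] = [k.strip() for k in row[1].split(",")]
    return result
```

Batching this way saves tokens because the instruction is sent once per batch instead of once per article; the trade-off is that a malformed reply affects the whole batch, which is why the verification step below matters.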

Preparation Before API Result Verification

Prepare several sample texts and verify the API results to ensure they are as expected before processing a large number of articles.
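The check on sample results can be partly automated: if each answer should be exactly one of the allowed keywords, anything outside that set signals a prompt problem before you commit to a large run. This is a minimal sketch under that assumption; `ALLOWED` and `verify_samples` are hypothetical names.

```python
ALLOWED = {"keyword1", "keyword2", "keyword3", "keyword4", "keyword5"}

def verify_samples(sample_answers, allowed=ALLOWED):
    """Return the answers that are NOT in the allowed keyword set.

    An empty return value means the sample run looks as expected and
    the full batch can proceed."""
    return [a for a in sample_answers if a.strip() not in allowed]
```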

Related pages