GPT 3.5 Turbo settings¶
API Key Setting¶
Get API Key¶
Click Create new secret key. The secret key will be generated as shown below; click the Copy button on the right to copy it.
Configure into the plugin¶
Paste the copied key into the API Key field of the official source in the IDE ChatGPT settings.
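As a rough sketch of what happens with the key you paste in: clients of the official source send it in an `Authorization` header on every request. The code below is illustrative, not the plugin's actual implementation, and `sk-your-copied-key` is a placeholder.

```python
import json

# Placeholder for the key copied from the OpenAI dashboard.
api_key = "sk-your-copied-key"

# The key travels in the Authorization header of each request.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

# A minimal Chat Completions request body.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload)
```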
For GPT 3.5 Turbo, a total of 5 models are supported:
- gpt-3.5-turbo (default model)
Choose the model that best fits your needs.
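The selected model presumably ends up in the `model` field of each request. A minimal sketch, assuming the standard Chat Completions request shape (`build_request` is a hypothetical helper, not part of the plugin):

```python
# Default model, matching the plugin's default setting.
DEFAULT_MODEL = "gpt-3.5-turbo"

def build_request(prompt: str, model: str = DEFAULT_MODEL) -> dict:
    """Build a Chat Completions request body for the selected model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
```

Switching models is then just a matter of passing a different `model` string.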
Enable streaming return¶
Responses are displayed as they are generated, so there is no long wait for the full reply and the experience is better.
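To illustrate why streaming feels faster: with streaming enabled, the API returns Server-Sent Events whose `delta` fragments can be shown the moment they arrive. The sample lines below are a hand-written illustration of the wire format, not captured output:

```python
import json

# Hand-written example of streamed Chat Completions chunks.
sample_stream = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]

def collect_stream(lines):
    """Append each delta fragment as it arrives instead of waiting for the full reply."""
    text = ""
    for line in lines:
        data = line[len("data: "):]
        if data == "[DONE]":  # sentinel marking the end of the stream
            break
        chunk = json.loads(data)
        text += chunk["choices"][0]["delta"].get("content", "")
    return text
```

A UI would render `text` after every fragment rather than once at the end.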
Enable Token statistics¶
Shows the token consumption after each conversation finishes.
This feature does not work when streaming return is enabled.
Enable contextual support¶
This feature is off by default; to use it, check Enable Context Support here.
Note that enabling contextual support will consume more tokens.
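The extra token cost follows from how context works: every earlier turn is resent with each new request, so the prompt grows as the conversation does. A minimal sketch (`build_messages` is a hypothetical helper, not the plugin's code):

```python
def build_messages(history, new_question):
    """With context enabled, prior turns are included; without it, only the new question."""
    return history + [{"role": "user", "content": new_question}]

# Two earlier turns that context support would resend.
history = [
    {"role": "user", "content": "What is an IDE?"},
    {"role": "assistant", "content": "An integrated development environment."},
]

with_context = build_messages(history, "Name one example.")
without_context = build_messages([], "Name one example.")
```

Every message in `with_context` is billed as prompt tokens on each request.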
If you have other sources, you can configure them here.
Note that the request parameter structure and the response data structure of a custom source must match the official API; otherwise parsing errors may occur.
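To make the compatibility requirement concrete, here is an illustrative check of the minimal response shape an OpenAI-compatible parser would need; this is a sketch, not the plugin's actual validation logic:

```python
def looks_like_official_response(resp: dict) -> bool:
    """Check the minimal structure needed to parse a Chat Completions reply."""
    try:
        return isinstance(resp["choices"][0]["message"]["content"], str)
    except (KeyError, IndexError, TypeError):
        return False

# Matches the official structure: parseable.
ok = {"choices": [{"message": {"role": "assistant", "content": "Hi"}}]}

# A custom shape that would trigger a parsing error.
bad = {"result": "Hi"}
```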
The exported file is in Markdown format by default.
- Storage Location: the default location where exported dialogs are saved
- Question: the title used for questions in the exported file
- Answer: the title used for answers in the exported file
For reference, here is an example:
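A hypothetical exported file, assuming the titles are left at Question and Answer (the dialog text is invented for illustration):

```markdown
## Question

What does the streaming return option do?

## Answer

It displays the reply as it is generated instead of waiting for the full response.
```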