 | A | B | C | D | E | F | G | H | I | J
---|---|---|---|---|---|---|---|---|---|---|
1 | Instructions for the /completions endpoint: 1. (optional) Edit the query parameters in cells B3 and B4 based on the definitions in OpenAI's documentation. 2. Fill in the prompts in column A. Don't edit any of the grey cells. 3. In API Connector, tick the "Multiple request bodies" box and enter +++Completions!E7+++ to run a request for each prompt (see screenshot) | |||||||||
2 | ||||||||||
3 | Query part 1 | {"model":"text-davinci-003","prompt":" | ||||||||
4 | Query part 2 | ","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} | ||||||||
5 | ||||||||||
6 | 1. Enter prompts | 2. This formula automatically wraps each prompt in a request body | 3. This formula automatically appends :::BREAK::: to every body except the last | 4. This formula automatically concatenates columns B and C | 5. This formula automatically concatenates all request bodies into a single cell | |||||
7 | When you receive this prompt, print out some text beginning with 'I've received the 1st prompt' | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 1st prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} | :::BREAK::: | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 1st prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} :::BREAK::: | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 1st prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} :::BREAK:::{"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 2nd prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} :::BREAK:::{"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 3rd prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} | |||||
8 | When you receive this prompt, print out some text beginning with 'I've received the 2nd prompt' | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 2nd prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} | :::BREAK::: | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 2nd prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} :::BREAK::: | ||||||
9 | When you receive this prompt, print out some text beginning with 'I've received the 3rd prompt' | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 3rd prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} | | {"model":"text-davinci-003","prompt":"When you receive this prompt, print out some text beginning with 'I've received the 3rd prompt'","temperature":0.7,"max_tokens":80,"top_p":0.3,"frequency_penalty":0.5,"presence_penalty":0} | ||||||
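The concatenation logic the sheet performs can be sketched outside the spreadsheet. Below is a minimal Python sketch, assuming the behavior described above: each prompt is wrapped between the two query parts (mirroring cells B3 and B4), and the resulting bodies are joined with API Connector's :::BREAK::: separator, which is omitted after the last body. The function name `build_request_bodies` is hypothetical, introduced only for illustration.

```python
import json

# Query template split in two parts, mirroring cells B3 and B4 of the sheet.
QUERY_PART_1 = '{"model":"text-davinci-003","prompt":"'
QUERY_PART_2 = ('","temperature":0.7,"max_tokens":80,"top_p":0.3,'
                '"frequency_penalty":0.5,"presence_penalty":0}')

def build_request_bodies(prompts):
    """Wrap each prompt in the query parts, then join all bodies with the
    :::BREAK::: separator (no separator after the last body), as the
    sheet's formulas do.

    Note: like the spreadsheet, this naive string splice assumes prompts
    contain no unescaped double quotes or backslashes; otherwise the
    resulting JSON would be invalid.
    """
    bodies = [QUERY_PART_1 + p + QUERY_PART_2 for p in prompts]
    return " :::BREAK:::".join(bodies)

prompts = ["First prompt", "Second prompt"]
combined = build_request_bodies(prompts)

# Each piece between separators parses as a valid JSON request body.
for piece in combined.split(":::BREAK:::"):
    json.loads(piece)
```

Splitting the template into two parts (rather than one formula per row) keeps the shared parameters editable in a single place, which is why the sheet warns against editing the grey cells.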