concepts.language.openai_utils.llm_prompting.simple_llm_prompt_auto_retry#
- simple_llm_prompt_auto_retry(system_instruction, user_prompts, task_prompt=None, model=None, max_tokens=2048, tag=None)#
Prompt the LLM with the given system instruction and user prompts, automatically retrying on failure.
- Parameters:
system_instruction (str) – the system instruction.
user_prompts – the user prompt or prompts to send to the model.
task_prompt (str | None) – the task prompt. If provided, it will be appended to the user prompts.
model (str | None) – the model name.
max_tokens (int) – the maximum number of tokens to generate.
tag (str | None) – the tag to extract from the generated text. For example, 'python' can be used to extract contents inside <python></python> tags.
- Returns:
The text generated by the model, restricted to the contents of the specified tag if tag is given.
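
A minimal usage sketch, assuming the function can be imported from the module path above and that it returns the generated text. The model name, prompt strings, and the assumption that user_prompts accepts a list are illustrative, not part of the documented API.

```python
# Usage sketch: prompt the model and extract a tagged code block from the reply.
# Assumes simple_llm_prompt_auto_retry is importable as documented above and
# that user_prompts accepts a list of strings (an assumption, not confirmed).
from concepts.language.openai_utils.llm_prompting import simple_llm_prompt_auto_retry

response = simple_llm_prompt_auto_retry(
    system_instruction='You are a helpful coding assistant.',
    user_prompts=['Write a function that reverses a string.'],
    task_prompt='Wrap your final answer in <python></python> tags.',
    model='gpt-4o',      # illustrative model name
    max_tokens=2048,
    tag='python',        # extract the contents of <python></python> from the reply
)
print(response)
```
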