uqlm.utils.response_generator.ResponseGenerator#

class uqlm.utils.response_generator.ResponseGenerator(llm=None, max_calls_per_min=None, use_n_param=False)#

Bases: object

__init__(llm=None, max_calls_per_min=None, use_n_param=False)#

Class for generating data from a provided set of prompts

Parameters:
  • llm (langchain BaseChatModel, default=None) – A langchain BaseChatModel instance. The user is responsible for specifying temperature and other relevant parameters in the constructor of their llm object.

  • max_calls_per_min (int, default=None) – Maximum number of LLM calls allowed per minute; used to control rate limiting.

  • use_n_param (bool, default=False) – Specifies whether to use the n parameter of the BaseChatModel. Not compatible with all BaseChatModel classes. If used, it speeds up the generation process substantially when count > 1. See the example below.
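
Example

A minimal construction sketch. The ChatOpenAI model is just one possible langchain BaseChatModel; the specific model name and parameter values below are illustrative assumptions, not requirements of ResponseGenerator.

from langchain_openai import ChatOpenAI  # assumes langchain-openai is installed

from uqlm.utils.response_generator import ResponseGenerator

# Temperature and other generation parameters are set on the llm object itself.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=1.0)  # illustrative model choice

generator = ResponseGenerator(
    llm=llm,
    max_calls_per_min=60,   # optional rate limiting
    use_n_param=False,      # set True only if the chat model supports the n parameter
)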

Methods

__init__([llm, max_calls_per_min, use_n_param])

Class for generating data from a provided set of prompts

generate_responses(prompts[, system_prompt, ...])

Generates an evaluation dataset from a provided set of prompts.

async generate_responses(prompts, system_prompt='You are a helpful assistant.', count=1)#

Generates an evaluation dataset from a provided set of prompts. For each prompt, count responses are generated. See the example below.

Parameters:
  • prompts (list of strings) – List of prompts from which LLM responses will be generated

  • system_prompt (str or None, default="You are a helpful assistant.") – Optional argument for user to provide custom system prompt

  • count (int, default=1) – Specifies number of responses to generate for each prompt.

Returns:

A dictionary with two keys: 'data' and 'metadata'.

  • 'data' (dict) – A dictionary containing the prompts and responses.

      ◦ 'prompt' (list) – A list of prompts.

      ◦ 'response' (list) – A list of responses corresponding to the prompts.

  • 'metadata' (dict) – A dictionary containing metadata about the generation process.

      ◦ 'temperature' (float) – The temperature parameter used in the generation process.

      ◦ 'count' (int) – The number of responses generated per prompt.

      ◦ 'system_prompt' (str) – The system prompt used for generating responses.

Return type:

Dict[str, Any]
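
Example

A minimal async usage sketch. The key names ('data', 'metadata', 'prompt', 'response', 'temperature', 'count', 'system_prompt') follow the return structure documented above; the chat model choice and the prompts themselves are illustrative assumptions.

import asyncio

from langchain_openai import ChatOpenAI  # assumes langchain-openai is installed

from uqlm.utils.response_generator import ResponseGenerator

async def main():
    llm = ChatOpenAI(model="gpt-4o-mini", temperature=1.0)  # illustrative model choice
    generator = ResponseGenerator(llm=llm, max_calls_per_min=60)

    results = await generator.generate_responses(
        prompts=["Summarize the water cycle.", "State Bayes' theorem."],
        system_prompt="You are a helpful assistant.",
        count=3,  # three responses per prompt
    )

    data = results["data"]          # {'prompt': [...], 'response': [...]}
    metadata = results["metadata"]  # {'temperature': ..., 'count': ..., 'system_prompt': ...}
    print(f"Generated {len(data['response'])} responses")
    print(f"system_prompt used: {metadata['system_prompt']}")

asyncio.run(main())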
