LLM Services

LLM Clients

class econsimulacra.llm_services.clients.base.LLMClient(config, prng=None, registered_classes=[])[source]

Bases: ABC

LLM Client class (abstract class).

You can implement your own LLM client by inheriting this class and implementing the generate_response method. Currently, OpenAIClient, TransformersClient, and VLLMClient are implemented as built-in options.

See also

  • econsimulacra.llm_services.clients.OpenAIClient: LLM client implementation for OpenAI’s API.

  • econsimulacra.llm_services.clients.TransformersClient: LLM client implementation using the Transformers library and Outlines for structured generation.

  • econsimulacra.llm_services.clients.VLLMClient: LLM client implementation backed by vLLM.

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

abstractmethod async generate_response(prompt)[source]
Parameters:

prompt (str)

Return type:

dict[str, Any]
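To implement a custom client, subclass LLMClient and override the async generate_response method so that it returns a dict. The sketch below mirrors that interface with a minimal stand-in base class so it runs on its own; the EchoClient name and its canned response are illustrative, and a real subclass would inherit econsimulacra.llm_services.clients.base.LLMClient instead:

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Any


class LLMClient(ABC):
    """Minimal stand-in mirroring the LLMClient interface described above."""

    def __init__(self, config, prng=None, registered_classes=None):
        self.config = config
        self.prng = prng
        self.registered_classes = registered_classes or []

    @abstractmethod
    async def generate_response(self, prompt: str) -> dict[str, Any]: ...


class EchoClient(LLMClient):
    """Toy client that returns a canned structured response."""

    async def generate_response(self, prompt: str) -> dict[str, Any]:
        # A real client would call a language model here and parse its
        # JSON output into a dict.
        return {"action": "noop", "echoed_prompt": prompt}


client = EchoClient(config={})
response = asyncio.run(client.generate_response("What do you buy today?"))
```

Because generate_response is a coroutine, callers must await it (or drive it with asyncio.run as above).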

class econsimulacra.llm_services.clients.openai_client.OpenAIClient(config, prng=None, registered_classes=[])[source]

Bases: LLMClient

OpenAI client for interacting with OpenAI’s language models.

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

async generate_response(prompt)[source]

Generate a response from the OpenAI API based on the given prompt.

Parameters:

prompt (str) – The input prompt to send to the OpenAI API.

Returns:

The parsed JSON response from the OpenAI API.

Return type:

dict[str, Any]

class econsimulacra.llm_services.clients.transformers_client.TransformersClient(config, prng=None, registered_classes=[])[source]

Bases: LLMClient

Transformers client using Outlines for structured generation.

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

async generate_response(prompt)[source]

Generate a response from the model based on the given prompt.

Parameters:

prompt (str) – The input prompt to send to the model.

Returns:

The parsed JSON response from the model.

Return type:

dict[str, Any]

class econsimulacra.llm_services.clients.vllm_client.VLLMClient(config, prng=None, registered_classes=[])[source]

Bases: LLMClient

LLM client implementation backed by a vLLM engine.

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

async generate_response(prompt)[source]
Parameters:

prompt (str)

Return type:

dict[str, Any]

close()[source]

Close the client and release its underlying resources.

Return type:

None

async aclose()[source]

Asynchronously close the client and release its underlying resources.

Return type:

None
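Since the client holds engine resources, it is good practice to guarantee that aclose() runs even when generation raises. The sketch below uses a hypothetical DummyClient with the same generate_response/aclose surface so it runs without vLLM installed; the try/finally pattern is what matters:

```python
import asyncio


class DummyClient:
    """Stand-in with the same async surface as VLLMClient."""

    def __init__(self):
        self.closed = False

    async def generate_response(self, prompt):
        return {"text": prompt.upper()}

    async def aclose(self):
        # A real client would shut down the engine here.
        self.closed = True


client = DummyClient()


async def main():
    try:
        return await client.generate_response("hello")
    finally:
        # Runs even if generate_response raises, so resources are
        # always released.
        await client.aclose()


result = asyncio.run(main())
```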

Prompt Builders

class econsimulacra.llm_services.prompts.base.PromptBuilder(config, prng=None, registered_classes=[])[source]

Bases: object

Prompt Builder class. Prompt builders are responsible for generating the prompt, except for the persona description (if applicable); that is, they translate the observation into a prompt for LLM input. You can implement your own prompt builder by inheriting this class and implementing the build_prompt method.

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

build_prompt(obs)[source]

Translate the observation into a prompt for LLM input.

Parameters:

obs (dict[str, Any]) – the observation to translate into a prompt for LLM input

Returns:

the generated prompt for LLM input

Return type:

str

Note

Called by LLMAgent.act
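A custom prompt builder only needs a build_prompt method that maps the observation dict to a string. The sketch below shows the pattern with a hypothetical MarketPromptBuilder and made-up observation keys (price, inventory); a real builder would inherit econsimulacra.llm_services.prompts.base.PromptBuilder:

```python
from typing import Any


class MarketPromptBuilder:
    """Hypothetical builder that verbalizes a market observation."""

    def build_prompt(self, obs: dict[str, Any]) -> str:
        # Translate the structured observation into natural language
        # suitable as LLM input.
        return (
            f"The current price is {obs['price']:.2f} and you hold "
            f"{obs['inventory']} units. What do you do?"
        )


prompt = MarketPromptBuilder().build_prompt({"price": 9.5, "inventory": 3})
```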

Persona Builders

class econsimulacra.llm_services.personas.base.PersonaBuilder(config, prng=None, registered_classes=[])[source]

Bases: ABC

Persona Builder class (abstract class).

You can implement your own persona builder by inheriting this class and implementing the build_persona method. Currently, Big5PersonaBuilder, which builds personas from the Big Five personality traits, and ScoredPersonaBuilder, which builds personas as dictionaries of attributes and scores, are implemented as built-in options.

See also: econsimulacra.llm_services.personas.big5.Big5PersonaBuilder, econsimulacra.llm_services.personas.scored_persona.ScoredPersonaBuilder

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

abstractmethod build_persona(agent_id, agent_config)[source]

Register the agent’s persona in agent_id2persona_dic.

Parameters:
  • agent_id (int) – agent_id of the agent to build persona for

  • agent_config (dict) – config of the agent to build persona for, which is the same as the one in env_config["agents"][agent_name]

Return type:

None

Note

Called when LLMAgent is initialized. See also: econsimulacra.agents.llm_agent.LLMAgent._setup_env_services()

get_persona(agent_id)[source]

Get the persona for the agent with the given agent_id.

Parameters:

agent_id (int)

Return type:

dict[str, Any] | None

build_persona_prompt(agent_id)[source]

Build persona prompt for the agent with the given agent_id.

Parameters:

agent_id (int) – agent_id of the agent to build persona prompt for

Returns:

persona prompt for the agent

Return type:

str

Note

Called when LLMAgent.act is called. The persona prompt contains the description and the persona information of the agent, and is used as part of the prompt for generation.

assign_name(agent_id, default_name, config)[source]

Assign a name to the agent with the given agent_id, using default_name when the config does not specify one.

Parameters:
  • agent_id (int)

  • default_name (str)

  • config (dict)

Return type:

str
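The contract above can be sketched in a few lines: build_persona registers a persona in agent_id2persona_dic, and build_persona_prompt reads it back to produce the persona portion of the prompt. The SimplePersonaBuilder below is a self-contained stand-in whose trait names are illustrative; a real builder would inherit PersonaBuilder and implement the abstract build_persona:

```python
from typing import Any


class SimplePersonaBuilder:
    """Stand-in mirroring the PersonaBuilder register/read-back contract."""

    def __init__(self):
        self.agent_id2persona_dic: dict[int, dict[str, Any]] = {}

    def build_persona(self, agent_id: int, agent_config: dict) -> None:
        # Register the persona; called once when the agent is initialized.
        self.agent_id2persona_dic[agent_id] = {
            "name": agent_config.get("name", f"Agent {agent_id}"),
            "risk_aversion": "high",  # illustrative trait
        }

    def build_persona_prompt(self, agent_id: int) -> str:
        # Read the registered persona back; called on each act.
        persona = self.agent_id2persona_dic[agent_id]
        traits = ", ".join(f"{k}: {v}" for k, v in persona.items())
        return f"You are {persona['name']}. Your traits are: {traits}."


builder = SimplePersonaBuilder()
builder.build_persona(0, {"name": "Alice"})
persona_prompt = builder.build_persona_prompt(0)
```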

class econsimulacra.llm_services.personas.scored_persona.ScoredPersonaBuilder(config, prng=None, registered_classes=[])[source]

Bases: PersonaBuilder

Persona builder that builds personas with scores.

The persona is represented as a dictionary of attributes and their corresponding scores.

Parameters:
  • config (dict[str, Any])

  • prng (Optional[Any])

  • registered_classes (list[Type])

build_persona(agent_id, agent_config)[source]

Register the agent’s persona in agent_id2persona_dic with random scores for each attribute.

Parameters:
  • agent_id (int) – agent_id of the agent to build persona for

  • agent_config (dict) – config of the agent to build persona for, which is the same as the one in env_config["agents"][agent_name]

Return type:

None

Note

Called when LLMAgent is initialized. See also: econsimulacra.agents.llm_agent.LLMAgent._setup_env_services()
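The scored-persona idea can be illustrated in a few lines: each attribute is mapped to a random score drawn from a seeded PRNG, yielding the attribute-to-score dictionary described above. The attribute names and score range here are assumptions for illustration, not values taken from the library:

```python
import random


def build_scored_persona(attributes, prng):
    """Map each attribute to a random integer score (range is assumed)."""
    return {attr: prng.randint(1, 5) for attr in attributes}


# Seeding the PRNG makes persona generation reproducible across runs.
persona = build_scored_persona(["openness", "patience"], random.Random(0))
```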