agentscope.models.openai_model module
Model wrapper for OpenAI models
- class agentscope.models.openai_model.OpenAIWrapperBase(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]
Bases: ModelWrapperBase, ABC
The model wrapper for OpenAI API.
- Response:
```json
{
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1677652288,
    "model": "gpt-4o-mini",
    "system_fingerprint": "fp_44709d6fcb",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Hello there, how may I assist you today?"
            },
            "logprobs": null,
            "finish_reason": "stop"
        }
    ],
    "usage": {
        "prompt_tokens": 9,
        "completion_tokens": 12,
        "total_tokens": 21
    }
}
```
- __init__(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any) None [source]
Initialize the OpenAI client.
- Parameters:
config_name (str) – The name of the model config.
model_name (str, default None) – The name of the model to use in OpenAI API.
api_key (str, default None) – The API key for OpenAI API. If not specified, it will be read from the environment variable OPENAI_API_KEY.
organization (str, default None) – The organization ID for OpenAI API. If not specified, it will be read from the environment variable OPENAI_ORGANIZATION.
client_args (dict, default None) – The extra keyword arguments to initialize the OpenAI client.
generate_args (dict, default None) – The extra keyword arguments used in OpenAI API generation, e.g. temperature, seed.
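A minimal instantiation sketch of these constructor arguments through a concrete subclass (the base class itself is abstract); all values below are placeholders, not defaults.
```python
# Sketch: the base class is abstract, so the arguments are passed through a
# concrete subclass such as OpenAIChatWrapper. All values are placeholders.
from agentscope.models.openai_model import OpenAIChatWrapper

model = OpenAIChatWrapper(
    config_name="my_openai_config",        # name of this model config
    model_name="gpt-4o-mini",              # model used in the OpenAI API
    api_key="sk-...",                      # placeholder; or set OPENAI_API_KEY
    organization=None,                     # or set OPENAI_ORGANIZATION
    client_args={"timeout": 30},           # extra kwargs for the OpenAI client
    generate_args={"temperature": 0.5},    # extra kwargs for generation calls
)
```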
- class agentscope.models.openai_model.OpenAIChatWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, stream: bool = False, generate_args: dict | None = None, **kwargs: Any)[source]
Bases: OpenAIWrapperBase
The model wrapper for OpenAI’s chat API.
- model_type: str = 'openai_chat'
The type of the model wrapper, which is used to identify the model wrapper class in model configurations.
- deprecated_model_type: str = 'openai'
- substrings_in_vision_models_names = ['gpt-4-turbo', 'vision', 'gpt-4o']
The substrings in the model names of vision models.
- __init__(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, stream: bool = False, generate_args: dict | None = None, **kwargs: Any) None [source]
Initialize the OpenAI client.
- Parameters:
config_name (str) – The name of the model config.
model_name (str, default None) – The name of the model to use in OpenAI API.
api_key (str, default None) – The API key for OpenAI API. If not specified, it will be read from the environment variable OPENAI_API_KEY.
organization (str, default None) – The organization ID for OpenAI API. If not specified, it will be read from the environment variable OPENAI_ORGANIZATION.
client_args (dict, default None) – The extra keyword arguments to initialize the OpenAI client.
stream (bool, default False) – Whether to enable stream mode.
generate_args (dict, default None) – The extra keyword arguments used in OpenAI API generation, e.g. temperature, seed.
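A hedged sketch of selecting this wrapper through a model configuration; it assumes the usual agentscope.init(model_configs=...) flow, and the config_name and api_key values are placeholders.
```python
# Sketch of registering this wrapper via a model configuration. The
# "model_type" value matches the class attribute above; other values are
# placeholders.
import agentscope

agentscope.init(
    model_configs=[
        {
            "config_name": "gpt4o_chat",
            "model_type": "openai_chat",   # selects OpenAIChatWrapper
            "model_name": "gpt-4o",
            "api_key": "sk-...",           # placeholder; or use OPENAI_API_KEY
            "stream": False,
            "generate_args": {"temperature": 0.7},
        },
    ],
)
```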
- static static_format(*args: Msg | Sequence[Msg], model_name: str) List[dict] [source]
A static version of the format method, which can be used without initializing the OpenAIChatWrapper object.
- Parameters:
args (Union[Msg, Sequence[Msg]]) – The input arguments to be formatted, where each argument should be a Msg object or a list of Msg objects. In distributed mode, placeholder messages are also allowed.
model_name (str) – The name of the model to use in OpenAI API.
- Returns:
The formatted messages in the format that the OpenAI Chat API requires.
- Return type:
List[dict]
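A short usage sketch, assuming Msg is imported from agentscope.message; the message contents here are placeholders.
```python
# Sketch: formatting messages without constructing a wrapper instance.
from agentscope.message import Msg
from agentscope.models.openai_model import OpenAIChatWrapper

messages = OpenAIChatWrapper.static_format(
    Msg(name="system", content="You are a helpful assistant.", role="system"),
    Msg(name="Alice", content="Hi!", role="user"),
    model_name="gpt-4o",
)
# messages is a list of dicts accepted by the OpenAI Chat API
```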
- format(*args: Msg | Sequence[Msg]) List[dict] [source]
Format the input messages into the format that the OpenAI Chat API requires. If you’re using an OpenAI-compatible model without the prefix “gpt-” in its name, the format method will automatically format the input messages into the required format.
- Parameters:
args (Union[Msg, Sequence[Msg]]) – The input arguments to be formatted, where each argument should be a Msg object or a list of Msg objects. In distributed mode, placeholder messages are also allowed.
- Returns:
The formatted messages in the format that the OpenAI Chat API requires.
- Return type:
List[dict]
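A sketch of the typical format-then-call pattern; it assumes that calling the wrapper forwards the formatted messages to the OpenAI Chat API and returns a ModelResponse whose generated text is available as response.text.
```python
# Sketch of format-then-call usage. The `.text` attribute on the returned
# response is an assumption; api_key is a placeholder.
from agentscope.message import Msg
from agentscope.models.openai_model import OpenAIChatWrapper

model = OpenAIChatWrapper(
    config_name="gpt4o_chat",
    model_name="gpt-4o",
    api_key="sk-...",          # placeholder; or set OPENAI_API_KEY
)

prompt = model.format(
    Msg(name="system", content="You are a helpful assistant.", role="system"),
    Msg(name="Bob", content="What is AgentScope?", role="user"),
)
response = model(prompt)
print(response.text)
```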
- config_name: str
The name of the model configuration.
- model_name: str
The name of the model, which is used in model API calls.
- class agentscope.models.openai_model.OpenAIDALLEWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]
Bases: OpenAIWrapperBase
The model wrapper for OpenAI’s DALL·E API.
- Response:
- model_type: str = 'openai_dall_e'
The type of the model wrapper, which is used to identify the model wrapper class in model configurations.
- config_name: str
The name of the model configuration.
- model_name: str
The name of the model, which is used in model API calls.
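A hedged usage sketch; the prompt-string calling convention and the image_urls response field are assumptions, not documented on this page.
```python
# Sketch of image generation with the DALL·E wrapper. The calling convention
# and the `image_urls` attribute are assumptions; api_key is a placeholder.
from agentscope.models.openai_model import OpenAIDALLEWrapper

dalle = OpenAIDALLEWrapper(
    config_name="dalle_config",
    model_name="dall-e-3",
    api_key="sk-...",          # placeholder; or set OPENAI_API_KEY
)
response = dalle("A watercolor painting of a lighthouse at dusk")
print(response.image_urls)    # assumed attribute holding the generated URLs
```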
- class agentscope.models.openai_model.OpenAIEmbeddingWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]
Bases: OpenAIWrapperBase
The model wrapper for OpenAI embedding API.
- Response:
Refer to https://platform.openai.com/docs/api-reference/embeddings/create
```json
{
    "object": "list",
    "data": [
        {
            "object": "embedding",
            "embedding": [
                0.0023064255,
                -0.009327292,
                .... (1536 floats total for ada-002)
                -0.0028842222
            ],
            "index": 0
        }
    ],
    "model": "text-embedding-ada-002",
    "usage": {
        "prompt_tokens": 8,
        "total_tokens": 8
    }
}
```
- model_type: str = 'openai_embedding'
The type of the model wrapper, which is used to identify the model wrapper class in model configurations.
- config_name: str
The name of the model configuration.
- model_name: str
The name of the model, which is used in model API calls.
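A hedged usage sketch; the list-of-strings calling convention and the embedding response field are assumptions based on the response format shown above.
```python
# Sketch of embedding a batch of texts. The calling convention and the
# `embedding` attribute are assumptions; api_key is a placeholder.
from agentscope.models.openai_model import OpenAIEmbeddingWrapper

embedder = OpenAIEmbeddingWrapper(
    config_name="embedding_config",
    model_name="text-embedding-ada-002",
    api_key="sk-...",          # placeholder; or set OPENAI_API_KEY
)
response = embedder(["AgentScope", "model wrappers"])
print(response.embedding)     # assumed attribute holding the embedding vectors
```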