agentscope.models.openai_model module

Model wrapper for OpenAI models

class OpenAIChatWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, stream: bool = False, generate_args: dict | None = None, **kwargs: Any)[source]

Bases: OpenAIWrapperBase

The model wrapper for OpenAI’s chat API.
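
As an illustration, a minimal construction sketch using only the constructor arguments documented above (the API key is a placeholder):

from agentscope.models.openai_model import OpenAIChatWrapper

# Instantiate the wrapper directly; config_name labels this configuration,
# model_name is forwarded to the OpenAI API.
model = OpenAIChatWrapper(
    config_name="my-gpt-4o",
    model_name="gpt-4o",
    api_key="xxx",                       # placeholder
    generate_args={"temperature": 0.7},  # extra keyword args for generation
    stream=False,
)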

format(*args: Msg | Sequence[Msg]) → List[dict][source]

Format the input messages into the format required by the OpenAI Chat API. If you’re using an OpenAI-compatible model whose name does not start with the prefix “gpt-”, the format method will still automatically convert the input messages into the required format.

Parameters:

args (Union[Msg, Sequence[Msg]]) – The input arguments to be formatted, where each argument should be a Msg object or a list of Msg objects. In distributed mode, placeholder messages are also allowed.

Returns:

The formatted messages in the format required by the OpenAI Chat API.

Return type:

List[dict]
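
A sketch of calling format with Msg objects, reusing the model instance from the sketch above (and assuming Msg is constructed with name, content, and role, as elsewhere in the library):

from agentscope.message import Msg

prompt = model.format(
    Msg("system", "You are a helpful assistant.", role="system"),
    Msg("user", "What is the capital of France?", role="user"),
)
# prompt is a List[dict], e.g.
# [{"role": "system", "content": "..."}, {"role": "user", "content": "..."}]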

static static_format(*args: Msg | Sequence[Msg], model_name: str) → List[dict][source]

A static version of the format method, which can be used without instantiating an OpenAIChatWrapper object.

Parameters:
  • args (Union[Msg, Sequence[Msg]]) – The input arguments to be formatted, where each argument should be a Msg object or a list of Msg objects. In distributed mode, placeholder messages are also allowed.

  • model_name (str) – The name of the model to use in the OpenAI API.

Returns:

The formatted messages in the format required by the OpenAI Chat API.

Return type:

List[dict]
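
A sketch of the static variant, which needs only the model name to decide how to format (no wrapper instance required):

from agentscope.models.openai_model import OpenAIChatWrapper
from agentscope.message import Msg

prompt = OpenAIChatWrapper.static_format(
    Msg("user", "Summarize this paragraph.", role="user"),
    model_name="gpt-4o",  # used to decide, e.g., vision-specific formatting
)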

config_name: str

The name of the model configuration.

model_name: str

The name of the model, which is used when calling the model API.

model_type: str = 'openai_chat'

The type of the model wrapper, which is used to identify the model wrapper class in the model configuration.

substrings_in_vision_models_names = ['gpt-4-turbo', 'vision', 'gpt-4o']

Substrings used to identify vision models by their model names.

class OpenAIDALLEWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]

Bases: OpenAIWrapperBase

The model wrapper for OpenAI’s DALL·E API.

Response:
{
    "created": 1589478378,
    "data": [
        {
            "url": "https://..."
        },
        {
            "url": "https://..."
        }
    ]
}
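
A minimal sketch of generating an image, assuming the wrapper is called with a text prompt and the returned response exposes the generated URLs (the response field name below is an assumption):

from agentscope.models.openai_model import OpenAIDALLEWrapper

dalle = OpenAIDALLEWrapper(
    config_name="my-dalle",
    model_name="dall-e-3",
    api_key="xxx",  # placeholder
)

response = dalle("A watercolor painting of a lighthouse at dusk")
print(response.image_urls)  # assumed field mirroring the "url" entries above
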
config_name: str

The name of the model configuration.

model_name: str

The name of the model, which is used when calling the model API.

model_type: str = 'openai_dall_e'

The type of the model wrapper, which is used to identify the model wrapper class in the model configuration.

class OpenAIEmbeddingWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]

Bases: OpenAIWrapperBase

The model wrapper for OpenAI embedding API.

Response:
  • Refer to https://platform.openai.com/docs/api-reference/embeddings/create

{
    "object": "list",
    "data": [
        {
            "object": "embedding",
            "embedding": [
                0.0023064255,
                -0.009327292,
                .... (1536 floats total for ada-002)
                -0.0028842222,
            ],
            "index": 0
        }
    ],
    "model": "text-embedding-ada-002",
    "usage": {
        "prompt_tokens": 8,
        "total_tokens": 8
    }
}
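
A minimal sketch of computing embeddings, assuming the wrapper is called with the text to embed and the returned response exposes the vectors (the response field name below is an assumption):

from agentscope.models.openai_model import OpenAIEmbeddingWrapper

embedder = OpenAIEmbeddingWrapper(
    config_name="my-embedding",
    model_name="text-embedding-ada-002",
    api_key="xxx",  # placeholder
)

response = embedder("AgentScope is a multi-agent platform.")
print(len(response.embedding[0]))  # e.g. 1536 floats for text-embedding-ada-002
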
config_name: str

The name of the model configuration.

model_name: str

The name of the model, which is used when calling the model API.

model_type: str = 'openai_embedding'

The type of the model wrapper, which is used to identify the model wrapper class in the model configuration.

class OpenAIWrapperBase(config_name: str, model_name: str | None = None, api_key: str | None = None, organization: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]

Bases: ModelWrapperBase, ABC

The model wrapper for OpenAI API.

Response:
{
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1677652288,
    "model": "gpt-4o-mini",
    "system_fingerprint": "fp_44709d6fcb",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Hello there, how may I assist you?",
            },
            "logprobs": null,
            "finish_reason": "stop"
        }
    ],
    "usage": {
        "prompt_tokens": 9,
        "completion_tokens": 12,
        "total_tokens": 21
    }
}
format(*args: Msg | Sequence[Msg]) → List[dict] | str[source]

Format the input messages into the format required by the model API.
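
Since the concrete subclasses above are registered by their model_type values, they are typically selected through a model configuration rather than constructed directly. A sketch, assuming the standard agentscope.init entry point (API keys are placeholders):

import agentscope

agentscope.init(
    model_configs=[
        {
            "config_name": "chat",
            "model_type": "openai_chat",
            "model_name": "gpt-4o",
            "api_key": "xxx",
        },
        {
            "config_name": "embedding",
            "model_type": "openai_embedding",
            "model_name": "text-embedding-ada-002",
            "api_key": "xxx",
        },
    ],
)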