agentscope.models.zhipu_model module
Model wrapper for ZhipuAI models
- class ZhipuAIChatWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, stream: bool = False, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]
Bases:
ZhipuAIWrapperBase
The model wrapper for ZhipuAI’s chat API.
- Example Response:

{
    "created": 1703487403,
    "id": "8239375684858666781",
    "model": "glm-4",
    "request_id": "8239375684858666781",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Drawing blueprints with ...",
                "role": "assistant"
            }
        }
    ],
    "usage": {
        "completion_tokens": 217,
        "prompt_tokens": 31,
        "total_tokens": 248
    }
}
- format(*args: Msg | Sequence[Msg]) List[dict] [source]
A common format strategy for chat models, which formats the input messages into a system message (if provided) and a user message containing the conversation history.
Note that this strategy may not be suitable for all scenarios, and developers are encouraged to implement their own prompt engineering strategies.
The following is an example:
prompt1 = model.format(
    Msg("system", "You're a helpful assistant", role="system"),
    Msg("Bob", "Hi, how can I help you?", role="assistant"),
    Msg("user", "What's the date today?", role="user")
)

prompt2 = model.format(
    Msg("Bob", "Hi, how can I help you?", role="assistant"),
    Msg("user", "What's the date today?", role="user")
)
The prompt will be as follows:
# prompt1
[
    {
        "role": "system",
        "content": "You're a helpful assistant"
    },
    {
        "role": "user",
        "content": (
            "## Conversation History\n"
            "Bob: Hi, how can I help you?\n"
            "user: What's the date today?"
        )
    }
]

# prompt2
[
    {
        "role": "user",
        "content": (
            "## Conversation History\n"
            "Bob: Hi, how can I help you?\n"
            "user: What's the date today?"
        )
    }
]
- Parameters:
args (Union[Msg, Sequence[Msg]]) – The input arguments to be formatted, where each argument should be a Msg object or a list of Msg objects. In distributed mode, placeholder messages are also allowed.
- Returns:
The formatted messages.
- Return type:
List[dict]
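The merging behavior described above can be sketched in plain Python. This is a simplified illustration, not the wrapper's actual implementation: Msg is reduced to a stand-in dataclass, and the helper name format_for_chat is hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Msg:
    # Simplified stand-in for agentscope's Msg (name, content, role).
    name: str
    content: str
    role: str


def format_for_chat(*msgs: Msg) -> list:
    """Sketch of the strategy: keep system messages as-is, and merge
    all other messages into one user message whose content is a
    '## Conversation History' section."""
    system_msgs = [
        {"role": "system", "content": m.content}
        for m in msgs
        if m.role == "system"
    ]
    history = "\n".join(
        f"{m.name}: {m.content}" for m in msgs if m.role != "system"
    )
    user_msg = {
        "role": "user",
        "content": "## Conversation History\n" + history,
    }
    return system_msgs + [user_msg]
```

Applied to the three messages from the example above, this produces the same two-entry prompt shown for prompt1.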
- config_name: str
The name of the model configuration.
- model_name: str
The name of the model, which is used in model API calls.
- model_type: str = 'zhipuai_chat'
The type of the model wrapper, which is used to identify the model wrapper class in the model configuration.
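A model configuration selects this wrapper via its model_type field. The sketch below shows such a configuration as a plain dict; the api_key value is a placeholder, and the generate_args keys (e.g. temperature) are passed through to ZhipuAI's chat API, so they follow that API's conventions.

```python
# Configuration dict selecting ZhipuAIChatWrapper by its model_type.
zhipuai_chat_config = {
    "config_name": "my-glm-4",        # name used to look up this config
    "model_type": "zhipuai_chat",     # matches ZhipuAIChatWrapper.model_type
    "model_name": "glm-4",            # model used in API calls
    "api_key": "xxx",                 # placeholder; use your ZhipuAI key
    "generate_args": {"temperature": 0.5},
}
```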
- class ZhipuAIEmbeddingWrapper(config_name: str, model_name: str | None = None, api_key: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]
Bases:
ZhipuAIWrapperBase
The model wrapper for ZhipuAI embedding API.
Example Response:
{
    "model": "embedding-2",
    "data": [
        {
            "embedding": [
                (a total of 1024 elements)
                -0.02675454691052437,
                0.019060475751757622,
                ......
                -0.005519774276763201,
                0.014949671924114227
            ],
            "index": 0,
            "object": "embedding"
        }
    ],
    "object": "list",
    "usage": {
        "completion_tokens": 0,
        "prompt_tokens": 4,
        "total_tokens": 4
    }
}
- config_name: str
The name of the model configuration.
- model_name: str
The name of the model, which is used in model API calls.
- model_type: str = 'zhipuai_embedding'
The type of the model wrapper, which is used to identify the model wrapper class in the model configuration.
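Given a response shaped like the embedding example above, the vectors live under data[i]["embedding"]. The helper below is a hypothetical illustration of reading that structure, not part of the wrapper's API; the sample response is truncated to two elements for brevity.

```python
def extract_embeddings(response: dict) -> list:
    """Collect the embedding vectors from a ZhipuAI embedding
    response, one vector per entry in the 'data' list."""
    return [item["embedding"] for item in response["data"]]


# A truncated sample following the response shape shown above.
sample_response = {
    "model": "embedding-2",
    "data": [
        {"embedding": [-0.0267, 0.0190], "index": 0, "object": "embedding"},
    ],
    "object": "list",
    "usage": {"completion_tokens": 0, "prompt_tokens": 4, "total_tokens": 4},
}

vectors = extract_embeddings(sample_response)
```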
- class ZhipuAIWrapperBase(config_name: str, model_name: str | None = None, api_key: str | None = None, client_args: dict | None = None, generate_args: dict | None = None, **kwargs: Any)[source]
Bases:
ModelWrapperBase, ABC
The model wrapper for ZhipuAI API.