agentscope.memory.temporary_memory module
Memory module for conversation
- class agentscope.memory.temporary_memory.TemporaryMemory(embedding_model: str | Callable | None = None)[source]
Bases:
MemoryBase
An in-memory memory module that does not write to the hard disk.
- __init__(embedding_model: str | Callable | None = None) None [source]
Temporary memory module for conversation.
- Parameters:
embedding_model (Union[str, Callable]) – If the temporary memory needs to be embedded, pass either the name of the embedding model or the embedding model itself.
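A minimal construction sketch (assuming the import path agentscope.memory.TemporaryMemory; toy_embed is a hypothetical stand-in for a real embedding model):

    from agentscope.memory import TemporaryMemory

    # Plain in-memory store, no embeddings.
    memory = TemporaryMemory()

    # Hypothetical embedder: any callable mapping text to a numeric vector works.
    def toy_embed(text):
        return [float(len(str(text))), 0.0, 0.0]

    # Alternatively, a model name string can be passed instead of the callable.
    embedded_memory = TemporaryMemory(embedding_model=toy_embed)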
- add(memories: Sequence[Msg] | Msg | None, embed: bool = False) None [source]
Add new memory fragments; how they are stored depends on the memory implementation.
- Parameters:
memories (Union[Sequence[Msg], Msg, None]) – Memories to be added.
embed (bool) – Whether to generate embeddings for the newly added memories.
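A short usage sketch of add (assuming Msg is importable from agentscope.message and accepts name, content, and role keyword arguments):

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    memory = TemporaryMemory()

    # Add a single message.
    memory.add(Msg(name="user", content="Hello!", role="user"))

    # Add several messages at once; embed=True is only useful when an
    # embedding model is available.
    memory.add(
        [
            Msg(name="assistant", content="Hi, how can I help?", role="assistant"),
            Msg(name="user", content="Tell me a joke.", role="user"),
        ],
        embed=False,
    )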
- delete(index: Iterable | int) None [source]
Delete memory fragments; how they are stored and matched depends on the memory implementation.
- Parameters:
index (Union[Iterable, int]) – Indices of the memory fragments to delete.
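A sketch of delete (same assumed imports as above); indices refer to positions in the current memory sequence:

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    memory = TemporaryMemory()
    memory.add([Msg(name="user", content=f"note {i}", role="user") for i in range(3)])

    # Remove the first fragment, then two more by index.
    memory.delete(0)
    memory.delete([0, 1])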
- export(file_path: str | None = None, to_mem: bool = False) list | None [source]
Export memory; how it is exported depends on the memory implementation.
- Parameters:
file_path (Optional[str]) – File path to save the memory to. The messages will be serialized and written to the file.
to_mem (bool) – If True, just return the list of messages in memory without writing to a file.
Notice: file_path must not be None when to_mem is False.
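A sketch of export under the documented semantics; memory.json is an illustrative path:

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    memory = TemporaryMemory()
    memory.add(Msg(name="user", content="Hello!", role="user"))

    # Serialize the stored messages to a file.
    memory.export(file_path="./memory.json")

    # Or return the list of stored messages without touching disk.
    messages = memory.export(to_mem=True)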
- load(memories: str | list[Msg] | Msg, overwrite: bool = False) None [source]
Load memory; the behavior depends on how the memories are passed, and it is designed to load from either a file or message objects.
- Parameters:
memories (Union[str, list[Msg], Msg]) – Memories to be loaded. If given as a str, it is first checked whether it is a file path; if not, it is deserialized as messages. Otherwise, memories must be a single message or a list of messages.
overwrite (bool) – If True, clear the current memory before loading the new ones; if False, the new memories are appended to the end of the existing ones.
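A sketch of load, reusing the illustrative memory.json path from the export example:

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    memory = TemporaryMemory()

    # Load from a previously exported file, replacing the current contents.
    memory.load("./memory.json", overwrite=True)

    # Append a single message without clearing.
    memory.load(Msg(name="user", content="One more thing.", role="user"))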
- retrieve_by_embedding(query: str | list[Number], metric: Callable[[list[Number], list[Number]], float], top_k: int = 1, preserve_order: bool = True, embedding_model: Callable[[str | dict], list[Number]] | None = None) list[dict] [source]
Retrieve memory units by their embeddings.
- Parameters:
query (Union[str, Embedding]) – Query string or embedding.
metric (Callable[[Embedding, Embedding], float]) – A metric to compute the relevance between the embeddings of the query and the memory units. By default, higher relevance means a better match.
top_k (int, defaults to 1) – The number of memory units to retrieve.
preserve_order (bool, defaults to True) – Whether to preserve the original order of the retrieved memory units.
embedding_model (Callable[[Union[str, dict]], Embedding], defaults to None) – A callable object to embed the memory units. If not provided, the default embedding model (the one given at initialization) is used.
- Returns:
A list of retrieved memory units in the specified order.
- Return type:
list[dict]
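A sketch of embedding-based retrieval with a user-supplied cosine-similarity metric; toy_embed is again a hypothetical stand-in for a real embedding model:

    import math

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    def toy_embed(text):
        # Hypothetical embedder: maps text to a fixed-size numeric vector.
        return [float(ord(c)) for c in str(text)[:8].ljust(8)]

    def cosine_similarity(a, b):
        # Higher score means a better match, matching the default ordering.
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm if norm else 0.0

    memory = TemporaryMemory(embedding_model=toy_embed)
    memory.add(Msg(name="user", content="Tell me a joke.", role="user"), embed=True)
    memory.add(Msg(name="user", content="What's the weather?", role="user"), embed=True)

    results = memory.retrieve_by_embedding(
        query="a joke, please",
        metric=cosine_similarity,
        top_k=1,
        embedding_model=toy_embed,
    )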
- get_embeddings(embedding_model: Callable[[str | dict], list[Number]] | None = None) list [source]
Get the embeddings of all memory units. If embedding_model is provided, memory units that do not have an embedding attribute will be embedded; otherwise, their embeddings will be None.
- Parameters:
embedding_model (Callable[[Union[str, dict]], Embedding], defaults to None) – The embedding model used to embed memory units that do not yet have an embedding attribute.
- Returns:
A list of embeddings, with None for memory units that have no embedding.
- Return type:
list[Union[Embedding, None]]
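A short sketch of get_embeddings; the inline lambda embedder is a hypothetical stand-in:

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    memory = TemporaryMemory()
    memory.add(Msg(name="user", content="Hello!", role="user"))

    # Without a model, units that were never embedded come back as None.
    print(memory.get_embeddings())

    # With a model, un-embedded units are embedded first.
    print(memory.get_embeddings(embedding_model=lambda t: [float(len(str(t)))]))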
- get_memory(recent_n: int | None = None, filter_func: Callable[[int, dict], bool] | None = None) list [source]
Retrieve memory.
- Parameters:
recent_n (Optional[int], defaults to None) – The number of most recent memories to return.
filter_func (Callable[[int, dict], bool], defaults to None) – A function to filter memories, which takes the index and a memory unit as input and returns a boolean value.
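A sketch of get_memory; the "role" field used in filter_func assumes the memory units are the dict-like Msg objects added above:

    from agentscope.memory import TemporaryMemory
    from agentscope.message import Msg

    memory = TemporaryMemory()
    memory.add(
        [
            Msg(name="user", content="Question?", role="user"),
            Msg(name="assistant", content="Answer.", role="assistant"),
            Msg(name="user", content="Follow-up?", role="user"),
        ]
    )

    # The two most recent memories.
    recent = memory.get_memory(recent_n=2)

    # Keep only user messages; filter_func receives (index, memory_unit).
    user_only = memory.get_memory(
        filter_func=lambda idx, unit: unit.get("role") == "user",
    )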