View module source on GitHub

parse_langchain_provider

```python
def parse_langchain_provider(serialized: Dict[str, Any])
```

Parses the LangChain provider from serialized data.

Arguments:

| Name | Description |
| ---- | ----------- |
| serialized | Dict[str, Any]: Serialized data to parse the provider from |

Returns:

| Name | Description |
| ---- | ----------- |
| str | Parsed provider |
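The SDK's implementation is not shown here, but the idea can be sketched. In LangChain callbacks, `serialized` carries the fully qualified class path under the `"id"` key, and a provider name can be inferred from it. The function below is an illustrative stand-in, not the SDK's code; the substring-to-provider mapping is an assumption.

```python
from typing import Any, Dict

def parse_provider_sketch(serialized: Dict[str, Any]) -> str:
    """Illustrative only: infer a provider name from the serialized
    class path that LangChain callbacks pass under the "id" key."""
    class_path = serialized.get("id", [])
    joined = ".".join(class_path).lower()
    # Map well-known substrings to provider names (assumed mapping).
    for needle, provider in (("openai", "openai"),
                             ("anthropic", "anthropic"),
                             ("bedrock", "bedrock")):
        if needle in joined:
            return provider
    return "unknown"

provider = parse_provider_sketch(
    {"id": ["langchain", "chat_models", "openai", "ChatOpenAI"]}
)
```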

parse_langchain_llm_error

```python
def parse_langchain_llm_error(
    error: Union[Exception, BaseException,
                 KeyboardInterrupt]) -> GenerationError
```

Parses a LangChain LLM error into a format accepted by the Maxim logger.

Arguments:

| Name | Description |
| ---- | ----------- |
| error | Union[Exception, BaseException, KeyboardInterrupt]: Error to be parsed |

Returns:

| Name | Description |
| ---- | ----------- |
| [GenerationError](/sdk/python/references/logger/components/types) | Parsed LLM error |
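As a rough sketch of what such a conversion does, an exception can be flattened into a message/type payload of the kind a `GenerationError`-style object carries. The field names below are assumptions for illustration, not the actual `GenerationError` schema.

```python
from typing import Union

def parse_llm_error_sketch(
    error: Union[Exception, BaseException, KeyboardInterrupt],
) -> dict:
    """Illustrative only: flatten an exception into a loggable
    message/type payload (field names are assumed)."""
    return {
        "message": str(error),
        "type": type(error).__name__,
    }

err = parse_llm_error_sketch(ValueError("rate limit exceeded"))
```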

parse_langchain_model_parameters

```python
def parse_langchain_model_parameters(**kwargs: Any
                                     ) -> Tuple[str, Dict[str, Any]]
```

Parses LangChain kwargs into a model name and model parameters. You can use this function with any LangChain `_start` callback.

Arguments:

| Name | Description |
| ---- | ----------- |
| kwargs | Dict[str, Any]: Kwargs to be parsed |

Returns:

Tuple[str, Dict[str, Any]]: Model and model parameters

Raises:

  • Exception - If model_name is not found in kwargs
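A minimal sketch of the split, assuming (as LangChain `_start` callbacks commonly do) that the model name arrives inside an `invocation_params` dict. This is an illustrative re-implementation, not the SDK's code.

```python
from typing import Any, Dict, Tuple

def parse_model_parameters_sketch(**kwargs: Any) -> Tuple[str, Dict[str, Any]]:
    """Illustrative only: pull the model name out of callback kwargs
    and return the remaining parameters separately."""
    params = dict(kwargs.get("invocation_params", {}))
    # Providers differ on the key name; try the common variants.
    model = params.pop("model_name", None) or params.pop("model", None)
    if model is None:
        raise Exception("model_name not found in kwargs")
    return model, params

model, params = parse_model_parameters_sketch(
    invocation_params={"model_name": "gpt-4o", "temperature": 0.2}
)
```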

parse_base_message_to_maxim_generation

```python
def parse_base_message_to_maxim_generation(message: BaseMessage)
```

Parses a LangChain BaseMessage into a format accepted by the Maxim logger.

Arguments:

| Name | Description |
| ---- | ----------- |
| message | BaseMessage |

Returns:

Dict[str, Any]: Parsed message

parse_langchain_message

```python
def parse_langchain_message(message: BaseMessage)
```

Parses a LangChain BaseMessage into an OpenAI-style message choice.

Arguments:

| Name | Description |
| ---- | ----------- |
| message | BaseMessage |

Returns:

Dict[str, Any]: Parsed message
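To illustrate the shape of that conversion, the sketch below maps LangChain message types to OpenAI roles using the conventional `human` → `user`, `ai` → `assistant` correspondence. The stand-in dataclass and the exact output shape are assumptions; the real function operates on LangChain's `BaseMessage`.

```python
from dataclasses import dataclass

@dataclass
class FakeBaseMessage:
    """Minimal stand-in for LangChain's BaseMessage (type + content)."""
    type: str
    content: str

# Conventional mapping from LangChain message types to OpenAI roles.
ROLE_MAP = {"human": "user", "ai": "assistant", "system": "system"}

def parse_message_sketch(message: FakeBaseMessage) -> dict:
    """Illustrative only: shape a message into an OpenAI-style
    role/content dict."""
    return {
        "role": ROLE_MAP.get(message.type, message.type),
        "content": message.content,
    }

choice = parse_message_sketch(FakeBaseMessage(type="ai", content="Hello!"))
```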

parse_langchain_generation

```python
def parse_langchain_generation(generation: Generation)
```

Parses a LangChain Generation into a format accepted by the Maxim logger.

Arguments:

| Name | Description |
| ---- | ----------- |
| generation | [Generation](/sdk/python/references/logger/components/generation): Generation to be parsed |

Returns:

Dict[str, Any]: Parsed generation

parse_token_usage_for_result

```python
def parse_token_usage_for_result(result: LLMResult)
```

Parses token usage for a given LLM result.
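Most providers attach a `token_usage` block to `LLMResult.llm_output`; a sketch of reading and normalizing it is below. The normalized key names are assumptions for illustration, and the real function takes the full `LLMResult` rather than a bare dict.

```python
def parse_token_usage_sketch(llm_output: dict) -> dict:
    """Illustrative only: read the token_usage block that providers
    attach to LLMResult.llm_output and normalize the key names."""
    usage = llm_output.get("token_usage", {})
    return {
        "prompt_tokens": usage.get("prompt_tokens", 0),
        "completion_tokens": usage.get("completion_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }

usage = parse_token_usage_sketch(
    {"token_usage": {"prompt_tokens": 12, "completion_tokens": 30,
                     "total_tokens": 42}}
)
```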

parse_langchain_chat_result

```python
def parse_langchain_chat_result(result: ChatResult) -> Dict[str, Any]
```

Parses a LangChain ChatResult into a format accepted by the Maxim logger.

Arguments:

| Name | Description |
| ---- | ----------- |
| result | ChatResult: Chat result to be parsed |

Returns:

Dict[str, Any]: Parsed chat result

Raises:

  • Exception - If there is an error parsing the chat result

parse_langchain_llm_result

```python
def parse_langchain_llm_result(result: LLMResult) -> Dict[str, Any]
```

Parses a LangChain LLMResult into a format accepted by the Maxim logger.

Arguments:

| Name | Description |
| ---- | ----------- |
| result | LLMResult: LLM result to be parsed |

Returns:

Dict[str, Any]: Parsed LLM result

Raises:

  • Exception - If there is an error parsing the LLM result

parse_langchain_messages

```python
def parse_langchain_messages(input: Union[List[str], List[List[Any]]],
                             default_role="user")
```

Parses LangChain messages into messages accepted by the Maxim logger.

Arguments:

| Name | Description |
| ---- | ----------- |
| input | List[str] or List[List[Any]]: List of messages to be parsed |
| default_role | str: Default role to assign to messages without a role |

Returns:

List[Dict[str, str]]: List of messages with role and content

Raises:

  • Exception - If input is not List[str] or List[List[Any]]
  • Exception - If message type is not str or list
  • Exception - If message type is not recognized
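A minimal sketch of the `List[str]` path: plain strings become role/content dicts under `default_role`, and anything unrecognized raises. How the SDK unpacks the nested `List[List[Any]]` form is not shown in this reference, so the pair-unpacking branch below is purely an assumption.

```python
from typing import Any, Dict, List, Union

def parse_messages_sketch(
    input: Union[List[str], List[List[Any]]],
    default_role: str = "user",
) -> List[Dict[str, str]]:
    """Illustrative only: normalize plain strings (and assumed
    (role, content) pairs) into role/content dicts."""
    parsed: List[Dict[str, str]] = []
    for item in input:
        if isinstance(item, str):
            # Bare strings get the default role.
            parsed.append({"role": default_role, "content": item})
        elif isinstance(item, list):
            # Assumed shape: a list of (role, content) pairs.
            for role, content in item:
                parsed.append({"role": role, "content": content})
        else:
            raise Exception(f"Unrecognized message type: {type(item)}")
    return parsed

msgs = parse_messages_sketch(["Hi there", "How are you?"])
```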