xtrace_sdk.inference.llm

Classes

InferenceClient

A client wrapper for interacting with inference models via the OpenAI API.

Module Contents

class xtrace_sdk.inference.llm.InferenceClient(inference_provider, model_name, api_key=None, prompt_template=None)

A client wrapper for interacting with inference models via the OpenAI API.

Parameters:
  • inference_provider (str)

  • model_name (str)

  • api_key (Optional[str])

  • prompt_template (Optional[Callable[[str, str], str]])

client
  The underlying OpenAI-compatible client used to send requests.

model_name
  The model identifier passed at construction.
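The `prompt_template` parameter is typed as `Callable[[str, str], str]`: a callable taking two strings (presumably the query and the context) and returning the final prompt. A minimal sketch of such a template; the function name and formatting here are illustrative assumptions, not part of the SDK:

```python
def qa_prompt_template(query: str, context: str) -> str:
    # Hypothetical template matching Callable[[str, str], str]:
    # combines the supplied context and the user query into one prompt string.
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = qa_prompt_template("What does the SDK do?", "xtrace_sdk wraps inference APIs.")
print(prompt)
```

A template like this would be passed as `prompt_template=qa_prompt_template` when constructing `InferenceClient`.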
query(query, context=None, stream=False, print_response=True)

Query the inference model.

Parameters:
  • query (str) – The query string to send to the model.

  • context (str, optional) – Context string providing additional information to the model.

  • stream (bool) – Whether to stream the response incrementally rather than return it in full.

  • print_response (bool) – Whether to print the model's response.

Return type:

Any
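Because the return type is `Any`, the shape of the result depends on `stream`. The sketch below shows one way a caller might normalize both cases; the assumption that a streamed response is an iterable of text chunks follows common OpenAI-style clients and is not confirmed by this reference:

```python
from typing import Any

def collect_text(response: Any, stream: bool = False) -> str:
    # Assumption: a streamed response yields text chunks to be joined;
    # a non-streamed response is (or converts to) the full text at once.
    if stream:
        return "".join(chunk for chunk in response)
    return str(response)

print(collect_text("Hello, world."))                   # full response at once
print(collect_text(iter(["Hel", "lo"]), stream=True))  # joined chunks
```

In practice this means setting `print_response=False` and handling the returned value yourself when you need the text programmatically.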