xtrace_sdk.x_vec.inference.llm

Classes

InferenceClient

A client wrapper for interacting with inference models via OpenAI API.

Functions

_default_prompt_template(context, query)

Module Contents

xtrace_sdk.x_vec.inference.llm._default_prompt_template(context, query)

Build the default prompt string from a context passage and a query.

Parameters:
  • context (str)

  • query (str)

Return type:

str
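A prompt template is any callable matching the `Callable[[str, str], str]` signature that `InferenceClient` accepts for its `prompt_template` parameter. The sketch below is a hypothetical custom template (not the SDK's actual default) showing the expected shape:

```python
def my_prompt_template(context: str, query: str) -> str:
    """Combine retrieved context and a user query into one prompt string.

    Matches the Callable[[str, str], str] signature expected by the
    prompt_template parameter of InferenceClient. The exact wording here
    is illustrative, not the SDK's built-in default.
    """
    if not context:
        # With no context, pass the query through unchanged.
        return query
    return (
        "Use the following context to answer the question.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )
```

Such a callable can then be passed as `prompt_template=my_prompt_template` when constructing `InferenceClient`.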

class xtrace_sdk.x_vec.inference.llm.InferenceClient(inference_provider, model_name, api_key=None, base_url=None, prompt_template=None)

A client wrapper for interacting with inference models via OpenAI API.

Parameters:
  • inference_provider (str)

  • model_name (str)

  • api_key (str | None)

  • base_url (str | None)

  • prompt_template (Callable[[str, str], str] | None)

client
model_name
prompt_template
query(query, context=None, stream=False)

Query the inference model.

Parameters:
  • query (str) – The query string to send to the model.

  • context (str, optional) – Optional context string to provide additional information to the model.

  • stream (bool, optional) – If True, stream the response incrementally.

Returns:

The model’s response text.

Return type:

str
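The call pattern for `query()` can be sketched as follows. Since running the real client requires provider credentials and a network call, the class below is a stand-in with the same method signature, used only to make the snippet self-contained; a real `InferenceClient` would send the prompt to the configured provider:

```python
class FakeInferenceClient:
    """Stand-in with the same query() signature as InferenceClient.

    A real InferenceClient would forward the prompt to the configured
    inference provider; this stub just echoes a canned response.
    """

    def query(self, query, context=None, stream=False):
        text = f"answer to: {query}"
        if stream:
            # Streaming mode hands back the response incrementally.
            return iter(text.split())
        # Non-streaming mode returns the full response text at once.
        return text


client = FakeInferenceClient()

# Non-streaming call: the whole response arrives as one string.
answer = client.query("What does x_vec do?", context="x_vec docs excerpt")

# Streaming call: iterate over chunks as they arrive.
for chunk in client.query("What does x_vec do?", stream=True):
    print(chunk, end=" ")
```

With the real class, construction would look like `InferenceClient(inference_provider, model_name, api_key=..., base_url=...)`, with the same `query()` usage as above.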