mlflow.tracing
Attention
The mlflow.tracing namespace only contains a few utility functions for managing traces. The main entry points for MLflow Tracing are the Tracing Fluent APIs defined directly under the mlflow namespace, and the low-level Tracing Client APIs.
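For orientation, a minimal sketch of the fluent entry point; the function f and its body are illustrative:

import mlflow


# The fluent API: decorate a function to capture a trace for each call
@mlflow.trace
def f(x: int) -> int:
    return x + 1


f(1)

# Retrieve logged traces with the fluent search API
traces = mlflow.search_traces()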
- mlflow.tracing.disable()[source]
  Disable tracing.
Note
This function sets up OpenTelemetry to use NoOpTracerProvider and effectively disables all tracing operations.
Example:
import mlflow


@mlflow.trace
def f():
    return 0


# Tracing is enabled by default
f()
assert len(mlflow.search_traces()) == 1

# Disable tracing
mlflow.tracing.disable()
f()
assert len(mlflow.search_traces()) == 1
- mlflow.tracing.disable_notebook_display()[source]
  Disables displaying the MLflow Trace UI in notebook output cells. Call mlflow.tracing.enable_notebook_display() to re-enable the display.
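A minimal sketch of suppressing the inline Trace UI, assuming a Jupyter-style notebook environment; the traced function f is illustrative:

import mlflow

# Stop rendering the Trace UI in notebook output cells;
# traces are still logged as usual
mlflow.tracing.disable_notebook_display()


@mlflow.trace
def f():
    return 0


f()  # completes without rendering the Trace UI in the cell output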
- mlflow.tracing.enable()[source]
  Enable tracing.
Example:
import mlflow


@mlflow.trace
def f():
    return 0


# Tracing is enabled by default
f()
assert len(mlflow.search_traces()) == 1

# Disable tracing
mlflow.tracing.disable()
f()
assert len(mlflow.search_traces()) == 1

# Re-enable tracing
mlflow.tracing.enable()
f()
assert len(mlflow.search_traces()) == 2
- mlflow.tracing.enable_notebook_display()[source]
  Enables the MLflow Trace UI in notebook output cells. The display is on by default, and the Trace UI will show up when any of the following operations are executed:
  - On trace completion (i.e. whenever a trace is exported)
  - When calling the mlflow.search_traces() fluent API
  - When calling the mlflow.client.MlflowClient.get_trace() or mlflow.client.MlflowClient.search_traces() client APIs
  To disable the display, call mlflow.tracing.disable_notebook_display().
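A minimal sketch of the display triggers listed above, assuming a notebook environment where the display was previously disabled; the traced function f is illustrative:

import mlflow

# Turn the inline Trace UI back on
mlflow.tracing.enable_notebook_display()


@mlflow.trace
def f():
    return 0


# The Trace UI is shown on trace completion ...
f()

# ... and again when traces are searched via the fluent API
traces = mlflow.search_traces()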
- mlflow.tracing.reset()[source]
  Reset the flag that indicates whether the MLflow tracer provider has been initialized. This ensures that the tracer provider is re-initialized when the next tracing operation is performed.
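A minimal sketch, assuming a custom destination was previously configured via mlflow.tracing.set_destination(); the experiment ID "123" is illustrative:

import mlflow
from mlflow.tracing.destination import MlflowExperiment

# Route traces to a specific experiment (ID "123" is illustrative)
mlflow.tracing.set_destination(MlflowExperiment(experiment_id="123"))

# ... traced code runs here ...

# Clear the initialization flag; the tracer provider is re-initialized
# on the next tracing operation, falling back to the default destination
mlflow.tracing.reset()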
- mlflow.tracing.set_destination(destination: mlflow.tracing.destination.TraceDestination)[source]
  Note
  Experimental: This function may change or be removed in a future release without warning.
  Set a custom span destination to which MLflow will export the traces.
  A destination specified by this function takes precedence over other configurations, such as the tracking URI and OTLP environment variables.
  To reset the destination, call the mlflow.tracing.reset() function.
  - Parameters
    destination – A TraceDestination object that specifies the destination of the trace data.
Example
import mlflow
from mlflow.tracing.destination import MlflowExperiment

# Setting the destination to an MLflow experiment with ID "123"
mlflow.tracing.set_destination(MlflowExperiment(experiment_id="123"))

# Reset the destination (to the active experiment as the default)
mlflow.tracing.reset()
- mlflow.tracing.set_span_chat_messages(span: LiveSpan, messages: Union[dict, ChatMessage], append=False)[source]
  Set the mlflow.chat.messages attribute on the specified span. This attribute is used in the UI, and also by downstream applications that consume trace data, such as MLflow evaluate.
  - Parameters
    span – The LiveSpan to add the attribute to
    messages – A list of standardized chat messages (refer to the spec for details)
    append – If True, the messages will be appended to the existing messages. Otherwise, the attribute will be overwritten entirely. Default is False. This is useful when you want to record messages incrementally, e.g., log input messages first, and then log output messages later.
Example:
import mlflow
from mlflow.tracing import set_span_chat_messages


@mlflow.trace
def f():
    messages = [{"role": "user", "content": "hello"}]
    span = mlflow.get_current_active_span()
    set_span_chat_messages(span, messages)
    return 0


f()
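A minimal sketch of incremental logging with append=True, assuming the same traced-function setup as above; the function name chat and the message contents are illustrative:

import mlflow
from mlflow.tracing import set_span_chat_messages


@mlflow.trace
def chat(question: str) -> str:
    span = mlflow.get_current_active_span()

    # Log the input messages first
    set_span_chat_messages(span, [{"role": "user", "content": question}])

    answer = "hello!"  # placeholder for a real model call

    # Later, append the output message instead of overwriting the attribute
    set_span_chat_messages(span, [{"role": "assistant", "content": answer}], append=True)
    return answer


chat("hi")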
- mlflow.tracing.set_span_chat_tools(span: LiveSpan, tools: list[ChatTool])[source]
  Set the mlflow.chat.tools attribute on the specified span. This attribute is used in the UI, and also by downstream applications that consume trace data, such as MLflow evaluate.
  - Parameters
    span – The LiveSpan to add the attribute to
    tools – A list of standardized chat tool definitions (refer to the spec for details)
Example:
import mlflow
from mlflow.tracing import set_span_chat_tools

tools = [
    {
        "type": "function",
        "function": {
            "name": "add",
            "description": "Add two numbers",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        },
    }
]


@mlflow.trace
def f():
    span = mlflow.get_current_active_span()
    set_span_chat_tools(span, tools)
    return 0


f()
- class mlflow.tracing.destination.Databricks(experiment_id: Optional[str] = None, experiment_name: Optional[str] = None)[source]
  Note
  Experimental: This class may change or be removed in a future release without warning.
  A destination representing a Databricks tracing server.
  By setting this destination in the mlflow.tracing.set_destination() function, MLflow will log traces to the specified experiment. If neither experiment_id nor experiment_name is specified, the experiment that is active when traces are created will be used as the destination. If both are specified, they must refer to the same experiment.
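A minimal sketch, assuming Databricks credentials are already configured and the experiment ID "123" exists; the traced function f is illustrative:

import mlflow
from mlflow.tracing.destination import Databricks

# Send traces to a Databricks-hosted experiment (ID "123" is illustrative)
mlflow.tracing.set_destination(Databricks(experiment_id="123"))


@mlflow.trace
def f():
    return 0


f()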
- class mlflow.tracing.destination.MlflowExperiment(experiment_id: Optional[str] = None, tracking_uri: Optional[str] = None)[source]
  Note
  Experimental: This class may change or be removed in a future release without warning.
  A destination representing an MLflow experiment.
  By setting this destination in the mlflow.tracing.set_destination() function, MLflow will log traces to the specified experiment.
  - experiment_id
    The ID of the experiment to log traces to. If not specified, the current active experiment will be used.
    - Type
      Optional[str]
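A minimal sketch combining the two constructor arguments, assuming a tracking server is reachable at the illustrative URI below and experiment "123" exists on it:

import mlflow
from mlflow.tracing.destination import MlflowExperiment

# Route traces to experiment "123" on a specific tracking server
# (the URI and experiment ID are illustrative)
mlflow.tracing.set_destination(
    MlflowExperiment(experiment_id="123", tracking_uri="http://localhost:5000")
)


@mlflow.trace
def f():
    return 0


f()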
- class mlflow.tracing.destination.TraceDestination[source]
  Note
  Experimental: This class may change or be removed in a future release without warning.
  A configuration object for specifying the destination of trace data.