MLflow: A Tool for Managing the Machine Learning Lifecycle
MLflow is an open-source platform purpose-built to help machine learning practitioners and teams handle the complexities of the machine learning process. MLflow focuses on the full lifecycle of machine learning projects, ensuring that each phase is manageable, traceable, and reproducible.
MLflow Getting Started Resources
If this is your first time exploring MLflow, the tutorials and guides here are a great place to start. Each one focuses on getting you up to speed as quickly as possible with MLflow's basic functionality, terminology, APIs, and general best practices, so you can get more out of the area-specific guides and tutorials that follow.
- Learn about MLflow
- MLflow Basics
- MLflow Models Introduction
- GenAI Quickstarts
- Deep Learning Quickstarts
Learn about the core components of MLflow
Quickstarts
Get Started with MLflow in our 5-minute tutorial
Guides
Learn the core components of MLflow with this in-depth guide to Tracking
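For a taste of what the Tracking quickstart covers, the core workflow boils down to a few calls. The sketch below assumes a local MLflow installation; the experiment name and logged values are placeholders:

```python
import mlflow

# Minimal tracking sketch: experiment name and logged values are placeholders.
mlflow.set_experiment("my-first-experiment")

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)  # a hyperparameter
    mlflow.log_metric("accuracy", 0.92)      # a result metric
    mlflow.set_tag("stage", "quickstart")    # free-form metadata
```

Everything logged this way shows up in the MLflow UI, where runs can be compared side by side.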
Learn how to perform common tasks in MLflow
Guides
Autologging tutorial for effortless model tracking
Model Signatures and type validation in MLflow
Hyperparameter tuning with MLflow
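As a flavor of the autologging guide above, a single call is enough to capture parameters, metrics, and the fitted model for supported libraries; the scikit-learn model below is only an illustrative choice:

```python
import mlflow
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# One call enables automatic logging for supported libraries (scikit-learn here).
mlflow.autolog()

X, y = load_iris(return_X_y=True)
with mlflow.start_run():
    # Params, metrics, and the trained model are logged automatically.
    LogisticRegression(max_iter=200).fit(X, y)
```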
Learn about MLflow Model-related topics
Guides
Introduction to Custom Python Models
Model dependency management in MLflow
Model Signatures and type validation
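To give a sense of the custom Python model and signature guides above, here is a toy `mlflow.pyfunc.PythonModel` sketch; the `AddN` model and its input are purely illustrative:

```python
import mlflow
import pandas as pd
from mlflow.models import infer_signature


class AddN(mlflow.pyfunc.PythonModel):
    """Toy custom model that adds a constant to its input."""

    def __init__(self, n: int):
        self.n = n

    def predict(self, context, model_input: pd.DataFrame) -> pd.DataFrame:
        return model_input + self.n


example_input = pd.DataFrame({"x": [1, 2, 3]})
signature = infer_signature(example_input, AddN(5).predict(None, example_input))

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="add_n_model",
        python_model=AddN(5),
        signature=signature,
    )
```

The inferred signature records the expected input and output schema, so type validation can happen when the model is loaded and scored.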
Get started with MLflow's GenAI integrations
Quickstarts
Transformers Text Generation Introductory Tutorial
Sentence Transformers Basic Embedding Tutorial
OpenAI Quickstart Tutorial
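As a sketch of what these GenAI quickstarts walk through, logging a transformers pipeline looks roughly like this; the small `gpt2` model is just an example and any supported pipeline works the same way:

```python
import mlflow
from transformers import pipeline

# A small text-generation pipeline used purely for illustration.
generator = pipeline("text-generation", model="gpt2")

with mlflow.start_run():
    model_info = mlflow.transformers.log_model(
        transformers_model=generator,
        artifact_path="text_generator",
    )

# Reload the logged pipeline through the generic pyfunc interface.
loaded = mlflow.pyfunc.load_model(model_info.model_uri)
print(loaded.predict(["MLflow is"]))
```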
Get started with MLflow's Deep Learning Library integrations
GenAI and MLflow
Explore the comprehensive GenAI-focused support in MLflow. From MLflow Deployments for GenAI models to the Prompt Engineering UI and native GenAI-focused MLflow flavors like openai, transformers, and sentence-transformers, the tutorials and guides here will help you get started with these powerful models, services, and applications. You'll learn how MLflow simplifies both using GenAI models and developing solutions that leverage them. Important tasks such as prompt development, prompt evaluation, comparison of foundation models, fine-tuning, logging, and deploying production-grade inference servers are all covered by MLflow.
Explore the guides and tutorials below to start your journey!
- GenAI Integrations
- Tracing
- Prompt Engineering UI
- MLflow AI Gateway
- GenAI Evaluation
- RAG
Explore the Native MLflow GenAI Integrations
Learn about how to instrument your GenAI Workloads with MLflow Tracing
Guides
- Learn how to leverage Tracing in MLflow
- View the Tracing Guide for more information on tracing
- Learn how to use MLflow autologging with OpenAI for automated trace logging
- Discover the automated LangChain trace logging with MLflow autologging
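As a quick sketch of the tracing guides above, both the autologging and the decorator approaches are essentially one-liners; the `answer` function below is a stand-in for your own chain or model call:

```python
import mlflow

# Automatic tracing of OpenAI calls; LangChain has an equivalent
# via mlflow.langchain.autolog().
mlflow.openai.autolog()


# Manual tracing of your own functions with the @mlflow.trace decorator.
@mlflow.trace
def answer(question: str) -> str:
    # ...call your model or chain here; this stub is only for illustration.
    return f"echo: {question}"


answer("What is MLflow Tracing?")
```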
Explore the Prompt Engineering UI
Quickstarts
Learn how to use the Prompt Engineering UI
Learn about managed access to GenAI services with the MLflow AI Gateway
Learn about GenAI Evaluation
Guides
Learn how to evaluate your GenAI applications with MLflow
Discover how to use MLflow Evaluate with the Prompt Engineering UI
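To illustrate the evaluation guides above, a minimal `mlflow.evaluate` sketch might look like the following; the dataset and the registered model URI are hypothetical placeholders:

```python
import mlflow
import pandas as pd

# A tiny, hypothetical evaluation dataset.
eval_data = pd.DataFrame(
    {
        "inputs": ["What is MLflow?"],
        "ground_truth": ["An open-source platform for the ML lifecycle."],
    }
)

with mlflow.start_run():
    results = mlflow.evaluate(
        model="models:/my-genai-model/1",  # hypothetical registered model
        data=eval_data,
        targets="ground_truth",
        model_type="question-answering",
    )
    print(results.metrics)
```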
Learn about using Retrieval Augmented Generation (RAG) with MLflow
Running MLflow Anywhere
MLflow can be used in a variety of environments, including your local environment, on-premises clusters, cloud platforms, and managed services. As an open-source platform, MLflow is vendor-neutral: no matter where you do machine learning, you have access to MLflow's core capabilities, such as tracking, evaluation, and observability.
Run the MLflow server locally, or use direct access mode (no server required) to run MLflow in your local environment. Click the card to learn more.
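A rough sketch of both options, assuming a local MLflow installation (the paths and port below are placeholders):

```python
import mlflow

# Direct access mode: no server required; runs are written to a local file store.
mlflow.set_tracking_uri("file:./mlruns")

# Alternatively, start a local tracking server first, e.g.:
#   mlflow server --host 127.0.0.1 --port 5000
# and point the client at it instead:
# mlflow.set_tracking_uri("http://127.0.0.1:5000")

with mlflow.start_run():
    mlflow.log_metric("demo", 1.0)
```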

Databricks Managed MLflow is a FREE, fully managed solution, seamlessly integrated with the Databricks ML/AI ecosystem, including Unity Catalog, Model Serving, and more.
MLflow on Amazon SageMaker is a fully managed service for MLflow on AWS infrastructure, integrated with SageMaker's core capabilities such as Studio, Model Registry, and Inference.

Azure Machine Learning workspaces are MLflow-compatible, which lets you use an Azure Machine Learning workspace the same way you would an MLflow server.
Nebius, a cutting-edge cloud platform for GenAI explorers, offers a fully managed service for MLflow, streamlining LLM fine-tuning with MLflow's robust experiment tracking capabilities.

You can use MLflow on your on-premises or cloud-managed Kubernetes cluster. Click this card to learn how to host MLflow on your own infrastructure.