Prompt Management in MLflow

What is MLflow Prompt Registry?

MLflow Prompt Registry is a powerful tool that streamlines prompt engineering and management in your Generative AI (GenAI) applications. It enables you to version, track, and reuse prompts across your organization, helping maintain consistency and improving collaboration in prompt development.

Key Features
  • Reusability - Store and manage prompts in a centralized registry and reuse them across multiple applications.
  • Version Control - Track the evolution of your prompts with Git-inspired commit-based versioning and side-by-side comparison of prompt versions with diff highlighting.
  • Aliasing - Build robust yet flexible deployment pipelines for prompts using aliases, allowing you to isolate prompt versions from main application code and perform tasks such as A/B testing and roll-backs with ease.
  • Lineage - Seamlessly integrate with MLflow's existing features such as model tracking and evaluation for end-to-end GenAI lifecycle management.
  • Collaboration - Share prompts across your organization with a centralized registry, enabling teams to build upon each other's work.

Getting started

1. Create a Prompt

Create Prompt UI

  1. Run mlflow ui in your terminal to start the MLflow UI.
  2. Navigate to the Prompts tab in the MLflow UI.
  3. Click on the Create Prompt button.
  4. Fill in the prompt details such as name, prompt template text, and commit message (optional).
  5. Click Create to register the prompt.
Note

Prompt template text can contain variables in {{variable}} format. These variables can be filled with dynamic content when the prompt is used in your GenAI application. MLflow also provides a utility method to convert the template into single-brace format for frameworks such as LangChain or LlamaIndex.

This creates a new prompt with the specified template text and metadata. The prompt is now available in the MLflow UI for further management.
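You can also register a prompt programmatically instead of using the UI. The following is a minimal sketch, assuming the mlflow.register_prompt() API is available in your MLflow version; the prompt name and template text are illustrative.

import mlflow

# Register a prompt with a {{variable}}-style template
# (prompt name and template text are examples)
prompt = mlflow.register_prompt(
    name="summarization-prompt",
    template=(
        "Summarize the following text in {{num_sentences}} sentences:\n\n{{sentences}}"
    ),
    commit_message="Initial version of the summarization prompt",
)
print(f"Registered '{prompt.name}' as version {prompt.version}")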

Registered Prompt in UI

2. Update the Prompt with a New Version

Update Prompt UI

  1. The previous step takes you to the page of the newly created prompt. (If you closed the page, navigate to the Prompts tab in the MLflow UI and click on the prompt name.)
  2. Click on the Create Prompt Version button.
  3. The popup dialog is pre-filled with the existing prompt text. Modify the prompt as you wish.
  4. Click Create to register the new version.
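Registering a prompt under an existing name creates a new version rather than overwriting the previous one, so you can also create new versions in code. A minimal sketch, assuming the same mlflow.register_prompt() API as above:

import mlflow

# Registering with an existing prompt name creates a new version of that prompt
prompt = mlflow.register_prompt(
    name="summarization-prompt",
    template=(
        "Summarize the following text in {{num_sentences}} concise sentences, "
        "focusing on the key points:\n\n{{sentences}}"
    ),
    commit_message="Ask for more concise, key-point focused summaries",
)
print(f"'{prompt.name}' is now at version {prompt.version}")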

3. Compare the Prompt Versions

Once you have multiple versions of a prompt, you can compare them to understand the changes between versions. To compare prompt versions in the MLflow UI, click on the Compare tab in the prompt details page:

Compare Prompt Versions
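The UI comparison is the primary workflow, but if you want a quick textual diff in code, you can load two versions and compare their templates yourself. A sketch using the standard library's difflib, assuming the Prompt object exposes its text via the template attribute (see the Prompt Object section below); the prompt name and version numbers are examples:

import difflib

import mlflow

# Load two versions of the same prompt (name and versions are illustrative)
v1 = mlflow.load_prompt("prompts:/summarization-prompt/1")
v2 = mlflow.load_prompt("prompts:/summarization-prompt/2")

# Print a unified diff of the two template texts
diff = difflib.unified_diff(
    v1.template.splitlines(),
    v2.template.splitlines(),
    fromfile="version 1",
    tofile="version 2",
    lineterm="",
)
print("\n".join(diff))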

4. Load and Use the Prompt

To use a prompt in your GenAI application, you can load it with the mlflow.load_prompt() API and fill in the variables using the mlflow.entities.Prompt.format() method of the prompt object:

import mlflow
import openai

target_text = """
MLflow is an open source platform for managing the end-to-end machine learning lifecycle.
It tackles four primary functions in the ML lifecycle: Tracking experiments, packaging ML
code for reuse, managing and deploying models, and providing a central model registry.
MLflow currently offers these functions as four components: MLflow Tracking,
MLflow Projects, MLflow Models, and MLflow Registry.
"""

# Load the prompt
prompt = mlflow.load_prompt("prompts:/summarization-prompt/2")

# Use the prompt with an LLM
client = openai.OpenAI()
response = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": prompt.format(num_sentences=1, sentences=target_text),
        }
    ],
    model="gpt-4o-mini",
)

print(response.choices[0].message.content)

Prompt Object

The Prompt object is the core entity in MLflow Prompt Registry. It represents a versioned template text that can contain variables for dynamic content.

Key attributes of a Prompt object:

  • Name: A unique identifier for the prompt.
  • Template: The text of the prompt, which can include variables in {{variable}} format.
  • Version: A sequential number representing the revision of the prompt.
  • Commit Message: A description of the changes made in the prompt version, similar to Git commit messages.
  • Version Metadata: Optional key-value pairs for adding metadata to the prompt version. For example, you may use this for tracking the author of the prompt version.
  • Tags: Optional key-value pairs assigned at the prompt level (across versions) for categorization and filtering. For example, you may add tags for project name, language, etc, which apply to all versions of the prompt.
  • Alias: A mutable named reference to the prompt. For example, you can create an alias named production to refer to the version used in your production system. See Aliases for more details.
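In code, these attributes are available on the object returned by mlflow.load_prompt(). A short sketch that reads them and sets a production alias, assuming the mlflow.set_prompt_alias() API and the alias-style URI (prompts:/<name>@<alias>) are available in your MLflow version:

import mlflow

prompt = mlflow.load_prompt("prompts:/summarization-prompt/2")
print(prompt.name)      # "summarization-prompt"
print(prompt.version)   # 2
print(prompt.template)  # the {{variable}}-style template text
print(prompt.tags)      # prompt-level key-value tags

# Point the "production" alias at version 2, then load the prompt through the alias
# (assumes mlflow.set_prompt_alias and the alias URI form are supported)
mlflow.set_prompt_alias("summarization-prompt", alias="production", version=2)
prod_prompt = mlflow.load_prompt("prompts:/summarization-prompt@production")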

FAQ

Q: How do I delete a prompt version?

A: You can delete a prompt version using the MLflow UI or Python API:

import mlflow

# Delete a prompt version
mlflow.delete_prompt("summarization-prompt", version=2)

To avoid accidental deletion, you can only delete one version at a time via the API. If you delete all versions of a prompt, the prompt itself is deleted.

Q: Can I update the prompt template of an existing prompt version?

A: No, prompt versions are immutable once created. To update a prompt, create a new version with the desired changes.

Q: Can I use prompt templates with frameworks like LangChain or LlamaIndex?

A: Yes, you can load prompts from MLflow and use them with any framework. The following example demonstrates how to use a prompt registered in MLflow with LangChain. Refer to Logging Prompts with LangChain for more details.

import mlflow
from langchain.prompts import PromptTemplate

# Load prompt from MLflow
prompt = mlflow.load_prompt("question_answering")

# Convert the prompt to single brace format for LangChain (MLflow uses double braces),
# using the `to_single_brace_format` method.
langchain_prompt = PromptTemplate.from_template(prompt.to_single_brace_format())
print(langchain_prompt.input_variables)
# Output: ['num_sentences', 'sentences']

Q: Is Prompt Registry integrated with the Prompt Engineering UI?

A: Direct integration between the Prompt Registry and the Prompt Engineering UI is coming soon. In the meantime, you can iterate on the prompt template in the Prompt Engineering UI and register the final version in the Prompt Registry by manually copying the prompt template.