Prompt flow is a new feature in Azure Machine Learning that helps users create, evaluate, and deploy large language model (LLM)-infused applications for various business scenarios.

Prompt engineering is an essential process that involves several steps, such as preparing data, crafting tailored prompts, executing those prompts against an LLM API, and refining the generated content. These steps require users to understand, experiment with, and optimize their prompts using complex logic, metrics, and tools (a minimal sketch of this manual loop appears after the list below). Prompt flow offers a range of benefits that streamline and simplify the prompt engineering process, such as:

  • Agility: Users can easily track, compare, and improve their prompts and flows with a range of built-in tools and resources.
  • Enterprise readiness: Users can collaborate on, deploy, monitor, and secure their flows using the Azure Machine Learning platform.
  • Evaluation: Users can define custom metrics and test their prompts and flows with built-in or tailored evaluation flows.
  • Production: Users can deploy their flows as enterprise-grade endpoints and monitor them in production with alerting to drive continuous improvement.
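
To make the manual loop that prompt flow streamlines more concrete, the Python sketch below wires the four steps together by hand under simplified assumptions. The names `call_llm`, `prepare_examples`, `craft_prompt`, and `evaluate` are hypothetical placeholders for illustration only, not prompt flow or Azure Machine Learning APIs.

```python
# A minimal sketch of the manual prompt engineering loop described above:
# prepare data, craft a prompt, execute it against an LLM API, then score the
# output. All function names here are illustrative placeholders.

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for an LLM completion API (e.g. an Azure OpenAI deployment)."""
    return "Prompt flow is an Azure Machine Learning feature for building LLM apps."

def prepare_examples() -> list:
    # Step 1: data preparation - gather the inputs the prompt will be tested against.
    return [{"question": "What is prompt flow?"}]

def craft_prompt(example: dict) -> str:
    # Step 2: craft a tailored prompt from a template and the prepared data.
    return (
        "Answer the question concisely.\n\n"
        f"Question: {example['question']}\nAnswer:"
    )

def evaluate(answer: str) -> float:
    # Step 4: score the generated content with a simple custom metric
    # (a trivial length check standing in for a real evaluation flow).
    return 1.0 if 0 < len(answer.split()) <= 50 else 0.0

if __name__ == "__main__":
    for example in prepare_examples():
        prompt = craft_prompt(example)
        answer = call_llm(prompt)   # Step 3: execute the prompt via the LLM API.
        score = evaluate(answer)    # Step 4: refine the prompt based on this score.
        print(f"score={score:.1f}\n{prompt} {answer}")
```

Prompt flow's value is in taking over this orchestration: tracking runs, comparing prompt variants, and attaching evaluation flows, rather than leaving users to maintain ad hoc scripts like the one above.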

Prompt flow is currently in private preview; users can sign up for the AzureML Insiders Program to gain early access and experience its benefits.