Advent of 2024, Day 22 – Microsoft Azure AI – Prompt flow using VS Code and Python

In this Microsoft Azure AI series:

  1. Dec 01: Microsoft Azure AI – What is Foundry?
  2. Dec 02: Microsoft Azure AI – Working with Azure AI Foundry
  3. Dec 03: Microsoft Azure AI – Creating project in Azure AI Foundry
  4. Dec 04: Microsoft Azure AI – Deployment in Azure AI Foundry
  5. Dec 05: Microsoft Azure AI – Deployment parameters in Azure AI Foundry
  6. Dec 06: Microsoft Azure AI – AI Services in Azure AI Foundry
  7. Dec 07: Microsoft Azure AI – Speech service in AI Services
  8. Dec 08: Microsoft Azure AI – Speech Studio in Azure with AI Services
  9. Dec 09: Microsoft Azure AI – Speech SDK with Python
  10. Dec 10: Microsoft Azure AI – Language and Translation in Azure AI Foundry
  11. Dec 11: Microsoft Azure AI – Language and Translation Python SDK
  12. Dec 12: Microsoft Azure AI – Vision and Document AI Service
  13. Dec 13: Microsoft Azure AI – Vision and Document Python SDK
  14. Dec 14: Microsoft Azure AI – Content safety AI service
  15. Dec 15: Microsoft Azure AI – Content safety Python SDK
  16. Dec 16: Microsoft Azure AI – Fine-tuning a model
  17. Dec 17: Microsoft Azure AI – Azure OpenAI service
  18. Dec 18: Microsoft Azure AI – Azure AI Hub and Azure AI Project
  19. Dec 19: Microsoft Azure AI – Azure AI Foundry management center
  20. Dec 20: Microsoft Azure AI – Models and endpoints in Azure AI Foundry
  21. Dec 21: Microsoft Azure AI – Prompt flow in Azure AI Foundry

Prompt Flow is particularly beneficial for organisations leveraging AI to streamline operations, enhance customer experiences, and innovate in digital transformation projects.

Why Use Prompt Flow?

  • Ease of Use: Simplifies the complex process of prompt engineering and integration.
  • Scalability: Makes it easier to deploy and scale AI workflows across applications.
  • Cost-Effectiveness: Helps optimize prompts, reducing unnecessary API calls and improving efficiency.

With Python, you can start using prompt flow by installing the package

pip install promptflow

and creating a prompty file (save it as my_prompty_file.yaml):

---
name: Minimal Chat
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-35-turbo
    api_key: ${env:AZURE_OPENAI_API_KEY}
    api_version: ${env:AZURE_OPENAI_API_VERSION}
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
  parameters:
    temperature: 0.2
    max_tokens: 1024
inputs:
  question:
    type: string
sample:
  question: "Where can I get the most famous pasta in the world?"
---
system:
You are an AI assistant who helps people find information.
As the assistant, you answer questions briefly, succinctly,
and in a personable manner using markdown and even add some personal flair with appropriate emojis.

# Safety
- You **should always** reference factual statements to search results based on [relevant documents]
- Search results based on [relevant documents] may be incomplete or irrelevant. You do not make assumptions
# Customer
You are helping to find answers to their questions.
Use their name to address them in your responses.

user:
{{question}}

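To make the anatomy of this file concrete, here is a small stdlib-only sketch that splits a prompty document into its YAML front matter and Jinja-style template body, with a naive str.replace standing in for real Jinja rendering (an illustration of the format, not promptflow's actual parser):

```python
# Naive illustration of the prompty format: YAML front matter between
# "---" markers, followed by a Jinja-style template body.
# NOTE: simplified sketch, not promptflow's internal implementation.

def split_prompty(text: str):
    """Split a prompty document into (front_matter, template_body)."""
    parts = text.split("---\n")
    # parts[0] is empty (the text starts with ---), parts[1] is the
    # front matter, and the remainder is the template body
    front_matter = parts[1]
    body = "---\n".join(parts[2:]).strip()
    return front_matter, body

def render(template: str, **inputs) -> str:
    """Very naive stand-in for Jinja rendering of {{name}} placeholders."""
    for key, value in inputs.items():
        template = template.replace("{{" + key + "}}", str(value))
    return template

doc = """---
name: Minimal Chat
inputs:
  question:
    type: string
---
user:
{{question}}"""

meta, template = split_prompty(doc)
print(render(template, question="Where is the best pasta?"))
```

Real prompty files support full Jinja syntax in the body; this sketch only handles simple placeholder substitution.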
The connection values in the configuration section are resolved from environment variables at load time, and you can also override parts of the model configuration with a dictionary:

from promptflow.core import Prompty

# Load prompty with dict override
override_model = {
    "configuration": {
        "api_key": "${env:AZURE_OPENAI_API_KEY}",
        "api_version": "${env:AZURE_OPENAI_API_VERSION}",
        "azure_endpoint": "${env:AZURE_OPENAI_ENDPOINT}"
    },
    "parameters": {"max_tokens": 512}
}
prompty = Prompty.load(source="path/to/my_prompty_file.yaml", model=override_model)
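To see what the ${env:...} placeholders do, here is a small sketch that mimics how such a placeholder resolves against environment variables (illustrative only, not promptflow's internal code):

```python
import os
import re

# Mimics how a "${env:VAR}" placeholder in a prompty configuration
# resolves against environment variables. Illustration only; this is
# NOT promptflow's internal implementation.

ENV_PATTERN = re.compile(r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}")

def resolve_env(value: str) -> str:
    """Replace ${env:NAME} with os.environ['NAME'] (empty string if unset)."""
    return ENV_PATTERN.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["AZURE_OPENAI_ENDPOINT"] = "https://example.openai.azure.com/"
print(resolve_env("${env:AZURE_OPENAI_ENDPOINT}"))
# -> https://example.openai.azure.com/
```

This is why the prompty file itself never needs to contain secrets: the key and endpoint stay in the environment.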

The path to the prompty (YAML) file is specified in the Prompty.load() function. To load and call the flow:

from promptflow.core import Prompty

prompty_obj = Prompty.load(source="path/to/my_prompty_file.yaml")
# Only inputs declared in the prompty file (here: question) are passed
result = prompty_obj(question="What is the capital of France?")

Prompty gives you the ability to build end-to-end solutions, such as RAG, where you chat with an LLM over an article or document, or flows that classify input data (a list of URLs, for example).

A prompty is a markdown file with YAML front matter that encapsulates the metadata fields defining the model's configuration and the inputs. After this front matter comes the prompt template, written in Jinja format.

  • name: The name of the prompt.
  • description: A description of the prompt.
  • model: The prompty's model configuration, including connection info and parameters for the LLM request.
  • inputs: The input definitions that are passed to the prompt template.
  • outputs: The fields of the prompty result (only applies when response_format is json_object).
  • sample: A dictionary or JSON file containing sample data for the inputs.
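As noted in the table, the outputs field only takes effect when the model is asked to return JSON. A hypothetical front-matter fragment to illustrate (the output field names answer and confidence are made up for this example):

```yaml
model:
  api: chat
  parameters:
    response_format:
      type: json_object
outputs:
  answer:
    type: string
  confidence:
    type: string
```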

You can run this from VS Code using the SDK or the CLI, and use the Trace UI to analyse the flow and its runs.
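For the CLI route, a single test call can be made with the pf command (a sketch; it assumes the promptflow package from above is installed and the Azure OpenAI environment variables are set, and flag names should be verified with pf flow test --help):

```
# Run one test call of the prompty with a concrete input; promptflow
# prints the result and a link to the local trace viewer.
pf flow test --flow path/to/my_prompty_file.yaml --inputs question="What is the capital of France?"
```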

Tomorrow we will look into the Tracing feature in Azure AI Foundry.

All of the code samples will be available on my GitHub.
