# Create a Chat with Your Data Logic App Project

This readme provides step-by-step instructions for setting up a Chat with Your Data Logic Apps project.
## Prerequisites

### Required installations

- Visual Studio Code
- Azure Logic Apps extension for Visual Studio Code
- Azure Functions extension for Visual Studio Code
- Azurite extension for Visual Studio Code

This guide also assumes you have cloned this repo to your local machine.
### Required AI Services

#### Access to an Azure OpenAI Service

If you already have an existing Azure OpenAI Service and model deployments, you can skip these steps.
1. Go to the Azure portal.
2. Click **Create a resource**.
3. In the search box, type **OpenAI**.
4. In the search results list, click **Create** on **Azure OpenAI**.
5. Follow the prompts to create the service in your chosen subscription and resource group.
6. Once your OpenAI service is created, you will need to create deployments for generating embeddings and chat completions:
   - Go to your OpenAI service and, under the **Resource Management** menu pane, click **Model deployments**.
   - Click **Manage Deployments**.
   - On the **Deployments** page, click **Create new deployment**.
   - Select an available embedding model (e.g. `text-embedding-ada-002`), model version, and deployment name. Keep track of the deployment name; it will be used in later steps.
   - Ensure your model is successfully deployed by viewing it on the **Deployments** page.
   - On the **Deployments** page, click **Create new deployment**.
   - Select an available chat model (e.g. `gpt-35-turbo`), model version, and deployment name. Keep track of the deployment name; it will be used in later steps.
   - Ensure your model is successfully deployed by viewing it on the **Deployments** page.
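Optionally, you can confirm that both deployments respond before wiring them into the workflows. The minimal sketch below uses the `openai` Python package; the endpoint, API key, and deployment names are placeholders to replace with your own values.

```python
# Optional sanity check for the embedding and chat deployments.
# Placeholders: replace the endpoint, key, and deployment names with your own.
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com/",
    api_key="<your-openai-key>",
    api_version="2024-02-01",
)

# Embedding deployment check (the deployment name is passed as `model`).
embedding = client.embeddings.create(
    model="<your-embedding-deployment-name>",
    input="hello world",
)
print("embedding length:", len(embedding.data[0].embedding))

# Chat deployment check.
chat = client.chat.completions.create(
    model="<your-chat-deployment-name>",
    messages=[{"role": "user", "content": "Say hello."}],
)
print("chat reply:", chat.choices[0].message.content)
```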
#### Access to an Azure AI Search Service
If you already have an existing AI Search service, you can skip to step 6 to create the index.
1. Go to the Azure portal.
2. Click **Create a resource**.
3. In the search box, type **Azure AI Search**.
4. In the search results list, click **Create** on **Azure AI Search**.
5. Follow the prompts to create the service in your chosen subscription and resource group.
6. Once your AI Search service is created, you will need to create an index to store your document content and embeddings:
   - Go to your search service and, at the top of the **Overview** page, click **Add index (JSON)**.
   - Go up one level to the root folder `ai-sample` and open the `Deployment` folder. Copy the entire contents of the file `aisearch_index.json` and paste them into the index window. You can change the name of the index in the `name` field if you choose; this name will be used in later steps.
   - Ensure your index is created by viewing it on the **Indexes** page.
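Optionally, you can confirm the index from code instead of the portal. This is a minimal sketch using the `azure-search-documents` package; the endpoint, admin key, and index name are placeholders.

```python
# Optional check that the index exists and lists its fields.
# Placeholders: replace the endpoint, admin key, and index name with your own.
from azure.core.credentials import AzureKeyCredential       # pip install azure-search-documents
from azure.search.documents.indexes import SearchIndexClient

client = SearchIndexClient(
    endpoint="https://<your-search-service>.search.windows.net",
    credential=AzureKeyCredential("<your-admin-key>"),
)

index = client.get_index("<your-index-name>")
print("index:", index.name)
print("fields:", [field.name for field in index.fields])
```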
## Function App and Workflows Creation

There are two projects that need to be created and published to Azure:

- An Azure Functions project located in the `TokenizeDocFunction` folder
- An Azure Standard Logic Apps project located in the `SampleAIWorkflows` folder
Follow these steps to create the Azure Functions project and deploy it to Azure:

1. Open Visual Studio Code.
2. Go to the Azure Functions extension.
3. Under the Azure Functions option, click **Create New Project**, then navigate to and select the `TokenizeDocFunction` folder.
4. Follow the setup prompts:
   - Choose the **Python** language.
   - Choose Python programming model V1 or V2.
   - Skip **Trigger Type** selection.
   - Select **Yes** if asked to overwrite any existing files, except the `requirements.txt` file.
5. Deploy your Function App:
   - Go to the Azure Functions extension.
   - Under the Azure Functions option, click **Create Function App in Azure**.
   - Select a subscription and resource group to deploy your Function App.
6. Go to the Azure portal to verify your app is up and running.
7. Make note of the URL generated by your Function App; it will be used in later steps.
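As an optional check outside the portal, you can request the Function App's root URL; a running app serves the default Azure Functions host page. The URL below is a placeholder for the one you just noted.

```python
# Optional check that the deployed Function App host is responding.
# Placeholder: replace with the Function App URL you noted above.
import requests  # pip install requests

function_app_url = "https://<your-function-app>.azurewebsites.net"
response = requests.get(function_app_url, timeout=30)
print(response.status_code)  # 200 indicates the host is up and serving its default page
```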
Follow these steps to create the Azure Standard Logic Apps project and deploy it to Azure:

1. Open Visual Studio Code.
2. Go to the Azure Logic Apps extension.
3. Click **Create New Project**, then navigate to and select the `SampleAIWorkflows` folder.
4. Follow the setup prompts:
   - Choose **Stateful Workflow**.
   - Press Enter to use the default `Stateful` name. This workflow can be deleted later.
   - Select **Yes** if asked to overwrite any existing files.
5. Update your `parameters.json` file:
   - Open the `parameters.json` file.
   - Go to your Azure OpenAI service in the portal:
     - Under the **Resource Management** menu, click **Keys and Endpoint**.
       - Copy the **KEY 1** value and place it into the `value` field of the `openai_api_key` property.
       - Copy the **Endpoint** value and place it into the `value` field of the `openai_endpoint` property.
     - Under the **Resource Management** menu, click **Model deployments**, then click **Manage Deployments**.
       - Copy the deployment name of the embeddings model you want to use and place it into the `value` field of the `openai_embeddings_deployment_id` property.
       - Copy the deployment name of the chat model you want to use and place it into the `value` field of the `openai_chat_deployment_id` property.
   - Go to your Azure AI Search service in the portal:
     - On the **Overview** page, copy the **Url** value and place it into the `value` field of the `aisearch_endpoint` property.
     - Under the **Settings** menu, click **Keys**. Copy either the **Primary** or **Secondary** admin key and place it into the `value` field of the `aisearch_admin_key` property.
   - Go to your Tokenize Function App in the portal:
     - On the **Overview** page, copy the **URL** value, append `/api/tokenize_trigger` to the end of the URL, and place the result into the `value` field of the `tokenize_function_url` property.
6. Deploy your Logic App:
   - Go to the Azure Logic Apps extension.
   - Click **Deploy to Azure**.
   - Select a subscription and resource group to deploy your Logic App.
7. Go to the Azure portal to verify your app is up and running.
8. Verify your Logic App contains two workflows, named `chat-workflow` and `ingest-workflow`.
## Run your workflows

Now that the Azure Function and Azure Logic App workflows are live in Azure, you are ready to ingest your data and chat with it.
### Ingest Workflow

1. Go to your Logic App in the Azure portal.
2. Go to your `ingest-workflow` workflow.
3. On the **Overview** tab, click the **Run** drop-down and select **Run with payload**.
4. Fill in the JSON **Body** section with your `fileUrl` and `documentName`. For example: `{ "fileUrl": "https://mydata.enterprise.net/file1.pdf", "documentName": "file1" }`

   NOTE: The expected file type is pdf.
5. Click **Run** to trigger the `ingest-workflow`. This pulls in your data from the file above and stores it in your Azure AI Search service.
6. View the **Run History** to ensure a successful run.
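If you prefer to trigger ingestion from a script rather than the portal, you can POST the same payload to the workflow's callback URL (copy the **Workflow URL** from the `ingest-workflow` Overview page). This sketch assumes the workflow's HTTP Request trigger accepts the JSON body shown above; the URL and file details are placeholders.

```python
# Optional: trigger the ingest workflow from a script instead of the portal.
# Placeholder: paste the Workflow URL copied from the ingest-workflow Overview page
# (it includes the signed query parameters).
import requests

ingest_workflow_url = "<your-ingest-workflow-url>"
payload = {
    "fileUrl": "https://mydata.enterprise.net/file1.pdf",  # expected file type is pdf
    "documentName": "file1",
}

response = requests.post(ingest_workflow_url, json=payload, timeout=300)
print(response.status_code)  # check Run History in the portal for the ingestion result
```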
### Chat Workflow

1. Go to your Logic App in the Azure portal.
2. Go to your `chat-workflow` workflow.
3. On the **Overview** tab, click the **Run** drop-down and select **Run with payload**.
4. Fill in the JSON **Body** section with your `prompt`. For example: `{ "prompt": "Ask a question about your data?" }`
5. Click **Run** to trigger the `chat-workflow`. This queries your data stored in your Azure AI Search service and responds with an answer.
6. View the **Run History** to see the response to your query.
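Similarly, you can send a prompt to the chat workflow from a script by POSTing to its callback URL (the **Workflow URL** copied from the `chat-workflow` Overview page). The sketch below is illustrative; the URL is a placeholder, and the shape of the response depends on the workflow's Response action.

```python
# Optional: query the chat workflow from a script instead of the portal.
# Placeholder: paste the Workflow URL copied from the chat-workflow Overview page.
import requests

chat_workflow_url = "<your-chat-workflow-url>"
payload = {"prompt": "Ask a question about your data?"}

response = requests.post(chat_workflow_url, json=payload, timeout=300)
print(response.status_code)
print(response.text)  # the answer grounded in your indexed documents
```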
## Conclusion

In this readme, you learned how to:

- Create an Azure OpenAI Service
- Create an Azure AI Search Service
- Create and deploy an Azure Function and multiple Logic Apps workflows using Visual Studio Code and their respective extensions

For more information and advanced usage, refer to the official documentation for Azure Logic Apps and Azure Functions.