# Create a Chat with Your Data Logic App Project
This readme document provides step-by-step instructions on how to enable a Chat with your Data Logic Apps project.
## Prerequisites

### Required installations
- Visual Studio Code
- Azure Logic Apps extension for Visual Studio Code
- Azure Functions extension for Visual Studio Code
- Azurite extension for Visual Studio Code
- This guide also assumes you have pulled this repo down to your local machine.
### Required AI Services

#### Access to an Azure OpenAI Service
If you already have an existing OpenAI Service and model you can skip these steps.
1. Go to the Azure portal.
2. Click `Create a resource`.
3. In the search box, type `OpenAI`.
4. In the search results list, click `Create` on `Azure OpenAI`.
5. Follow the prompts to create the service in your chosen subscription and resource group.
6. Once your OpenAI service is created, you will need to create deployments for generating embeddings and chat completions. (A quick way to verify both deployments is sketched after this list.)
   - Go to your OpenAI service and, under the `Resource Management` menu pane, click `Model deployments`.
   - Click `Manage Deployments`.
   - On the `Deployments` page, click `Create new deployment`.
   - Select an available embedding `model` (e.g. `text-embedding-ada-002`), a `model version`, and a `deployment name`. Keep track of the `deployment name`; it will be used in later steps.
   - Ensure your model is successfully deployed by viewing it on the `Deployments` page.
   - On the `Deployments` page, click `Create new deployment`.
   - Select an available chat `model` (e.g. `gpt-35-turbo`), a `model version`, and a `deployment name`. Keep track of the `deployment name`; it will be used in later steps.
   - Ensure your model is successfully deployed by viewing it on the `Deployments` page.
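If you want to confirm both deployments respond before wiring up the workflows, here is a minimal sketch using the `openai` Python package. The endpoint, key, API version, and deployment names are placeholders; substitute the values from your own service.

```python
# Minimal sketch: verify the embedding and chat deployments respond.
# All values below are placeholders; use your own endpoint, key, and deployment names.
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com/",
    api_key="<your-openai-key>",
    api_version="2024-02-01",  # adjust to an API version supported by your service
)

# Embeddings deployment check
emb = client.embeddings.create(model="<embedding-deployment-name>", input="hello world")
print("embedding length:", len(emb.data[0].embedding))

# Chat deployment check
chat = client.chat.completions.create(
    model="<chat-deployment-name>",
    messages=[{"role": "user", "content": "Say hello"}],
)
print("chat reply:", chat.choices[0].message.content)
```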
#### Access to an Azure AI Search Service
If you already have an existing AI Search Service, you can skip ahead to step 6 and just create the index.
1. Go to the Azure portal.
2. Click `Create a resource`.
3. In the search box, type `Azure AI Search`.
4. In the search results list, click `Create` on `Azure AI Search`.
5. Follow the prompts to create the service in your chosen subscription and resource group.
6. Once your AI Search service is created, you will need to create an index to store your document content and embeddings.
   - Go to your search service and, at the top of the `Overview` page, click `Add index (JSON)`.
   - Go up one level to the root folder `ai-sample` and open the `Deployment` folder. Copy the entire contents of the file `aisearch_index.json` and paste them into the index window. You can change the name of the index in the `name` field if you choose; this name will be used in later steps. (An illustrative sketch of the general shape of an index definition follows this list.)
   - Ensure your index is created by viewing it on the `Indexes` page.
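For orientation only, the sketch below shows the general shape of an `Add index (JSON)` definition with hypothetical field names. The definition you should actually paste is the repo's `Deployment/aisearch_index.json`, which also defines the embedding vector field and its vector search configuration.

```json
{
  "name": "my-demo-index",
  "fields": [
    { "name": "id", "type": "Edm.String", "key": true, "filterable": true },
    { "name": "content", "type": "Edm.String", "searchable": true }
  ]
}
```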
## Function App and Workflows Creation
There are two projects that need to be created and published to Azure:

- An Azure Functions project located in the `TokenizeDocFunction` folder
- An Azure Standard Logic Apps project located in the `SampleAIWorkflows` folder
Follow these steps to create the Azure Functions project and deploy it to Azure:
1. Open Visual Studio Code.
2. Go to the Azure Functions extension.
3. Under the Azure Functions option, click `Create New Project`, then navigate to and select the `TokenizeDocFunction` folder.
4. Follow the setup prompts:
   - Choose the `Python` language
   - Choose Python programming model V1 or V2
   - Skip `Trigger Type` selection
   - Select `Yes` if asked to overwrite any existing files, except the `requirements.txt` file
5. Deploy your Function App:
   - Go to the Azure Functions extension.
   - Under the Azure Functions option, click `Create Function App in Azure`.
   - Select a Subscription and Resource Group to deploy your Function App.
6. Go to the Azure portal to verify your app is up and running. (A quick reachability check is sketched after this list.)
7. Make note of the URL generated by your Function App; it will be used in later steps.
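As a quick sanity check, assuming the `requests` package is installed locally, you can confirm the Function App is reachable before moving on. The URL below is a placeholder for the one shown on your Function App's `Overview` page.

```python
# Quick reachability check for the deployed Function App.
# The URL is a placeholder; use the one from your Function App's Overview page.
import requests

resp = requests.get("https://<your-function-app>.azurewebsites.net")
print(resp.status_code)  # a 200 response means the app is up and serving requests
```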
Follow these steps to create the Azure Standard Logic Apps project and deploy it to Azure:
1. Open Visual Studio Code.
2. Go to the Azure Logic Apps extension.
3. Click `Create New Project`, then navigate to and select the `SampleAIWorkflows` folder.
4. Follow the setup prompts:
   - Choose Stateful Workflow
   - Press Enter to use the default `Stateful` name. This workflow can be deleted later.
   - Select `Yes` if asked to overwrite any existing files
5. Update your `local.settings.json` file (a sketch of this file appears after this list):
   - Open the `local.settings.json` file
   - Go to your Azure OpenAI service in the portal
     - Under the `Resource Management` menu, click `Keys and Endpoint`
     - Copy the `KEY 1` value and place it into the `value` field of the `openai_openAIKey` property
     - Copy the `Endpoint` value and place it into the `value` field of the `openai_openAIEndpoint` property
   - Go to your Azure AI Search service in the portal
     - On the `Overview` page, copy the `Url` value and place it into the `value` field of the `azureaisearch_searchServiceEndpoint` property
     - Under the `Settings` menu, click `Keys`. Copy either the `Primary` or `Secondary` admin key and place it into the `value` field of the `azureaisearch_searchServiceAdminKey` property
   - Go to your SQL Server in the Azure portal or in SQL Server Management Studio
     - Copy the `Connection String` for the SQL database and place it into the `sql_connectionString` property
6. Update your `parameters.json` file (a sketch of this file also appears after this list):
   - Open the `parameters.json` file
   - Go to your Azure OpenAI service in the portal
     - Under the `Resource Management` menu, click `Model deployments`
     - Click `Manage Deployments`
     - Copy the `Deployment name` of the embeddings model you want to use and place it into the `value` field of the `openai_embeddings_deployment_model` property
     - Copy the `Deployment name` of the chat model you want to use and place it into the `value` field of the `openai_chat_deployment_model` property
   - Go to your Azure AI Search service in the portal
     - Under the `Resource Management` menu, click `Indexes`
     - Copy the name of the index that you want to use and place it into the `value` field of the `oaisearch_index_name` property
   - Go to your Tokenize Function App
     - On the `Overview` page, copy the `URL` value and place it into the `value` field of the `tokenize_function_url` property, then append `/api/tokenize_trigger` to the end of the URL
   - Go to your SQL Server in the Azure portal or in SQL Server Management Studio
     - Copy the name of the table that will be the source of your data and place it into the `value` field of the `sql_table_name` property
7. Deploy your Logic App:
   - Go to the Azure Logic Apps extension
   - Click `Deploy to Azure`
   - Select a Subscription and Resource Group to deploy your Logic App
8. Go to the Azure portal to verify your app is up and running.
9. Verify your Logic App contains two workflows. They will be named `RAG-Ingestion-Workflow` and `RAG-Retrieval-Workflow`.
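For reference, here is a minimal sketch of what `local.settings.json` might look like once the settings above are filled in. This assumes the standard Logic Apps Standard layout, where app settings live as flat key/value pairs under `Values`; every value is a placeholder, and any settings already present in the file in `SampleAIWorkflows` should be kept.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "openai_openAIKey": "<your-openai-key>",
    "openai_openAIEndpoint": "https://<your-openai-resource>.openai.azure.com/",
    "azureaisearch_searchServiceEndpoint": "https://<your-search-service>.search.windows.net",
    "azureaisearch_searchServiceAdminKey": "<your-search-admin-key>",
    "sql_connectionString": "<your-sql-connection-string>"
  }
}
```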
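Similarly, a sketch of `parameters.json` with the parameters named above, assuming the standard Logic Apps Standard parameters format where each parameter is an object with `type` and `value`; all values are placeholders.

```json
{
  "openai_embeddings_deployment_model": {
    "type": "String",
    "value": "text-embedding-ada-002"
  },
  "openai_chat_deployment_model": {
    "type": "String",
    "value": "gpt-35-turbo"
  },
  "oaisearch_index_name": {
    "type": "String",
    "value": "<your-index-name>"
  },
  "tokenize_function_url": {
    "type": "String",
    "value": "https://<your-function-app>.azurewebsites.net/api/tokenize_trigger"
  },
  "sql_table_name": {
    "type": "String",
    "value": "<your-table-name>"
  }
}
```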
## Run your workflows
Now that the Azure Function and Azure Logic App workflows are live in Azure, you are ready to ingest your data and chat with it.
### Ingest Workflow
1. Go to your Logic App in the Azure portal.
2. Go to your `RAG-Ingestion-Workflow` workflow.
3. On the `Overview` tab, click `Run` to trigger the workflow. This pulls in your data from the SQL database and stores it in your Azure AI Search service.
4. View the `Run History` to ensure a successful run. (An optional index spot-check is sketched below.)
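Optionally, assuming the `azure-search-documents` package is installed, you can spot-check that documents landed in the index after the run completes. Endpoint, admin key, and index name are placeholders for your own values.

```python
# Optional spot-check that the ingestion workflow populated the index.
# Endpoint, admin key, and index name are placeholders; use your own values.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="<your-index-name>",
    credential=AzureKeyCredential("<your-search-admin-key>"),
)

results = client.search(search_text="*", top=3)  # pull back a few ingested documents
for doc in results:
    print(doc)
```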
### Chat Workflow
1. Go to your Logic App in the Azure portal.
2. Go to your `RAG-Retrieval-Workflow` workflow.
3. On the `Overview` tab, click the `Run` drop-down, then select `Run with payload`.
4. Fill in the JSON `Body` section with your `prompt`. For example: `{ "prompt": "Provide insights and recommendations using sales data on how to improve album sales" }`
5. Click `Run`. This triggers the `RAG-Retrieval-Workflow`, which queries the data stored in your Azure AI Search service and responds with an answer.
6. View the `Run History` to see the response to your query. (If you prefer to call the workflow from code, see the sketch below.)
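Since the retrieval workflow accepts a payload, it can also be invoked over HTTP using the workflow URL shown on its `Overview` page. This is a minimal sketch with a placeholder URL; the response shape depends on how the workflow's Response action is defined.

```python
# Minimal sketch: invoke the retrieval workflow over HTTP.
# The callback URL is a placeholder; copy the real one (including its signature
# query parameters) from the workflow's Overview page in the portal.
import requests

workflow_url = "https://<your-logic-app>.azurewebsites.net/api/RAG-Retrieval-Workflow/triggers/..."
payload = {"prompt": "Provide insights and recommendations using sales data on how to improve album sales"}

resp = requests.post(workflow_url, json=payload)
print(resp.status_code)
print(resp.text)  # the answer returned by the workflow's Response action
```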
## Conclusion
In this readme document, you learned how to:
- Create an Azure OpenAI Service
- Create an Azure AI Search Service
- Create and deploy an Azure Function and multiple Logic Apps workflows using Visual Studio Code and their respective extensions.
For more information and advanced usage, refer to the official documentation of Azure Logic Apps and Azure Functions.