
Create a Chat with Your Data Logic App Project

This readme provides step-by-step instructions for setting up a Chat with Your Data Logic Apps project.

Prerequisites

Required installations

  • Visual Studio Code
  • Azure Functions extension for Visual Studio Code
  • Azure Logic Apps (Standard) extension for Visual Studio Code
  • Python (used by the TokenizeDocFunction Azure Functions project)

Required AI Services

Access to an Azure OpenAI Service

If you already have an existing Azure OpenAI Service and model deployments, you can skip these steps.

  1. Go to the Azure portal

  2. Click Create a resource

  3. In the search box type: OpenAI.

  4. In the search results list, click Create on Azure OpenAI.

  5. Follow the prompts to create the service in your chosen subscription and resource group.

  6. Once your Azure OpenAI service is created, you will need to create deployments for generating embeddings and chat completions (a quick verification sketch follows the list below).

    • Go to your Azure OpenAI service and, under the Resource Management menu pane, click Model deployments
    • Click Manage Deployments
    • On the Deployments page, click Create new deployment
    • Select an available embeddings model (for example, text-embedding-ada-002), a model version, and a deployment name. Keep track of the deployment name; it will be used in later steps.
    • Ensure your model is successfully deployed by viewing it on the Deployments page
    • On the Deployments page, click Create new deployment
    • Select an available chat model (for example, gpt-35-turbo), a model version, and a deployment name. Keep track of the deployment name; it will be used in later steps.
    • Ensure your model is successfully deployed by viewing it on the Deployments page
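
If you want to confirm that an embeddings deployment is reachable before wiring it into the workflows, a quick check against the Azure OpenAI embeddings REST endpoint can help. This is only a sketch: the endpoint, key, and deployment name are placeholders for your own values, and the api-version shown is an assumption, so substitute a version your service supports.

    import requests  # assumes the requests package is installed

    endpoint = "https://<your-openai-resource>.openai.azure.com"   # placeholder
    api_key = "<your-KEY-1-value>"                                 # placeholder
    deployment = "<your-embeddings-deployment-name>"               # placeholder
    api_version = "2023-05-15"                                     # assumed; use a version your service supports

    # Ask the deployment to embed a short test string.
    response = requests.post(
        f"{endpoint}/openai/deployments/{deployment}/embeddings?api-version={api_version}",
        headers={"api-key": api_key, "Content-Type": "application/json"},
        json={"input": "connectivity test"},
    )
    response.raise_for_status()
    print(len(response.json()["data"][0]["embedding"]), "embedding dimensions returned")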

Access to an Azure AI Search Service

If you already have an existing Azure AI Search Service, you can skip to step 6 and create the index.

  1. Go to the Azure portal.

  2. Click Create a resource.

  3. In the search box type: Azure AI Search.

  4. In the search results list, click Create on Azure AI Search.

  5. Follow the prompts to create the service in your chosen subscription and resource group.

  6. Once your AI Search service is created, you will need to create an index to store your document content and embeddings (an illustrative index sketch follows this list).

    • Go to your search service; on the Overview page, at the top, click Add index (JSON)
    • Go up one level to the root folder ai-sample and open the Deployment folder. Copy the entire contents of the file aisearch_index.json and paste them into the index window. You can change the name of the index in the name field if you choose. This name will be used in later steps.
    • Ensure your index is created by viewing it on the Indexes page
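
For orientation, a vector-capable index definition generally declares a key field, a content field, and a vector field tied to a vector search profile. The sketch below is illustrative only: the field names, dimensions, and vector search settings are assumptions, the exact JSON schema depends on your Azure AI Search API version, and the aisearch_index.json file in the Deployment folder remains the authoritative definition for this sample.

    {
      "name": "rag-sample-index",
      "fields": [
        { "name": "id", "type": "Edm.String", "key": true, "filterable": true },
        { "name": "content", "type": "Edm.String", "searchable": true },
        { "name": "contentVector", "type": "Collection(Edm.Single)", "searchable": true,
          "dimensions": 1536, "vectorSearchProfile": "vector-profile" }
      ],
      "vectorSearch": {
        "algorithms": [ { "name": "hnsw-config", "kind": "hnsw" } ],
        "profiles": [ { "name": "vector-profile", "algorithm": "hnsw-config" } ]
      }
    }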

Function App and Workflows Creation

There are 2 projects that need to be created and published to Azure:

  • Azure Functions project located in the TokenizeDocFunction folder
  • Azure Standard Logic Apps project located in the SampleAIWorkflows folder

Follow these steps to create the Azure Functions project and deploy it to Azure:

  1. Open Visual Studio Code.

  2. Go to the Azure Functions extension.

  3. Under the Azure Functions option, click Create New Project, then navigate to and select the TokenizeDocFunction folder.

  4. Follow the setup prompts:

    • Choose the Python language
    • Choose the Python programming model (V1 or V2)
    • Skip the trigger type selection
    • If asked to overwrite existing files, select Yes for every file except requirements.txt
  5. Deploy your Function App:

    • Go to the Azure Functions extension.
    • Under the Azure Functions option, click Create Function App in Azure
    • Select a Subscription and Resource Group to deploy your Function App.
  6. Go to the Azure portal to verify your app is up and running.

  7. Make note of the URL generated by your Function App; it will be used in later steps.

Follow these steps to create the Azure Standard Logic Apps project and deploy it to Azure:

  1. Open Visual Studio Code.

  2. Go to the Azure Logic Apps extension.

  3. Click Create New Project then navigate to and select the SampleAIWorkflows folder.

  4. Follow the setup prompts:

    • Choose Stateful Workflow
    • Press Enter to use the default name, Stateful. This placeholder workflow can be deleted later
    • Select Yes if asked to overwrite any existing files
  5. Update your local.settings.json file (a sketch of the file's shape follows this list):

    • Open the local.settings.json file
    • Go to your Azure OpenAI service in the portal
      • Under the Resource Management menu, click Keys and Endpoint
        • Copy the KEY 1 value and place it into the value field of the openai_openAIKey property
        • Copy the Endpoint value and place it into the value field of the openai_openAIEndpoint property
    • Go to your Azure AI Search service in the portal
      • On the Overview page, copy the Url value and place it into the value field of the azureaisearch_searchServiceEndpoint property
      • Under the Settings menu, click Keys. Copy either the Primary or Secondary admin key and place it into the value field of the azureaisearch_searchServiceAdminKey property
    • Go to your SQL Server in the Azure portal or in SQL Server Management Studio
      • Copy the connection string for the SQL database and place it into the value field of the sql_connectionString property
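
    For reference, these app settings end up as name/value pairs under Values in local.settings.json. The sketch below only illustrates the shape; the values are placeholders, and any settings already present in the file (such as the storage connection) should be left as they are.

      {
        "IsEncrypted": false,
        "Values": {
          "openai_openAIKey": "<your-azure-openai-key>",
          "openai_openAIEndpoint": "https://<your-openai-resource>.openai.azure.com",
          "azureaisearch_searchServiceEndpoint": "https://<your-search-service>.search.windows.net",
          "azureaisearch_searchServiceAdminKey": "<your-search-admin-key>",
          "sql_connectionString": "<your-sql-connection-string>"
        }
      }
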
  6. Update your parameters.json file (a sketch of the file's shape follows this list):

    • Open the parameters.json file

    • Go to your Azure OpenAI service in the portal

      • Under the Resource Management menu, click Model deployments
        • Click Manage Deployments
        • Copy the Deployment name of the embeddings model you want to use and place it into the value field of the openai_embeddings_deployment_model property
        • Copy the Deployment name of the chat model you want to use and place it into the value field of the openai_chat_deployment_model property
    • Go to your Azure AI Search service in the portal

      • Under the Resource Management menu, click Indexes
        • Copy the name of the index that you want to use and place it into the value field of the oaisearch_index_name property
    • Go to your Tokenize Function App

      • On the Overview page, copy the URL value and place it into the value field of the tokenize_function_url property, then append /api/tokenize_trigger to the end of the URL
    • Go to your SQL Server in the Azure portal or in SQL Server Management Studio

      • Copy the name of the table that will be the source of your data and place it into the value field of the sql_table_name property
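
    As a reference for the shape of this file, parameters in a Standard Logic Apps parameters.json each carry a type and a value. The entries below use placeholder values and assume every parameter is a String; keep the types already declared in the sample file.

      {
        "openai_embeddings_deployment_model": { "type": "String", "value": "<your-embeddings-deployment-name>" },
        "openai_chat_deployment_model": { "type": "String", "value": "<your-chat-deployment-name>" },
        "oaisearch_index_name": { "type": "String", "value": "<your-index-name>" },
        "tokenize_function_url": { "type": "String", "value": "https://<your-function-app>.azurewebsites.net/api/tokenize_trigger" },
        "sql_table_name": { "type": "String", "value": "<your-table-name>" }
      }
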
  7. Deploy your Logic App:

    • Go to the Azure Logic Apps extension
    • Click Deploy to Azure
    • Select a Subscription and Resource Group to deploy your Logic App
  8. Go to the Azure portal to verify your app is up and running.

  9. Verify your Logic App contains two workflows. They will be named RAG-Ingestion-Workflow and RAG-Retrieval-Workflow.

Run your workflows

Now that the Azure Function and Azure Logic App workflows are live in Azure, you are ready to ingest your data and chat with it.

Ingest Workflow

  1. Go to your Logic App in the Azure portal.

  2. Go to your RAG-Ingestion-Workflow workflow.

  3. On the Overview tab, click Run. This triggers the RAG-Ingestion-Workflow workflow, which pulls your data from the SQL database and stores it in your Azure AI Search Service.

  4. View the Run History to ensure a successful run.

Chat Workflow

  1. Go to your Logic App in the Azure portal.

  2. Go to your RAG-Retrieval-Workflow workflow.

  3. On the Overview tab, click the Run drop-down, then select Run with payload.

  4. Fill in the JSON Body section with your prompt. For example: { "prompt": "Provide insights and recommendations using sales data on how to improve album sales" }

  5. Click Run. This triggers the RAG-Retrieval-Workflow workflow, which queries the data stored in your Azure AI Search Service and responds with an answer (a command-line alternative is sketched after these steps).

  6. View the Run History to see the Response from your query.
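
If you prefer to call the retrieval workflow outside the portal, and assuming it is exposed through an HTTP Request trigger, you can post the same payload to the workflow's callback URL (copied from the workflow's Overview page in the portal). This is only a sketch; the URL below is a placeholder.

    import requests  # assumes the requests package is installed

    # Callback URL copied from the RAG-Retrieval-Workflow Overview page (placeholder).
    callback_url = "<workflow callback URL copied from the portal>"

    payload = {"prompt": "Provide insights and recommendations using sales data on how to improve album sales"}

    response = requests.post(callback_url, json=payload)
    response.raise_for_status()
    print(response.text)  # the workflow's answer, based on the data indexed from SQL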

Conclusion

In this readme document, you learned how to:

  • Create an Azure OpenAI Service
  • Create an Azure AI Search Service
  • Create and deploy an Azure Function and multiple Logic Apps workflows using Visual Studio Code and their respective extensions.

For more information and advanced usage, refer to the official documentation of Azure Logic Apps and Azure Functions.