# Snowflake API Reference
This connector is based on the Snowflake SQL REST API. Snowflake enables data storage, processing, and analytic solutions that are faster, easier to use, and more flexible than traditional offerings. The connector uses the Snowflake REST API V2 to submit synchronous and asynchronous queries and retrieve corresponding results.
## Publisher: Snowflake
## Prerequisites
- Users must have Microsoft Entra ID for external authentication.
- Users must have a premium Power Apps license.
- Users must have a Snowflake account.
## Supported Operations

### Submit SQL Statement for Execution
Submits a SQL statement for execution on Snowflake.

### Check the Status and Get Results
Checks the status of a statement's execution and gets the results for a given `statementHandle`.

### Cancel the Execution of a Statement
Cancels the execution of a statement for a given `statementHandle`.

### Convert result set rows from array to objects
Converts result set rows from arrays of strings to JSON objects.
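The statement operations are thin wrappers over the Snowflake SQL REST API V2. As a rough illustration of the round trip the connector performs, here is a minimal Python sketch of the submit, poll, and cancel flow; the account URL, token, and table name are placeholders, and error handling is omitted.

```python
# Minimal sketch of the Snowflake SQL REST API V2 round trip that the
# statement operations wrap. ACCOUNT_URL and TOKEN are placeholders and
# MY_TABLE is a hypothetical table; error handling is omitted for brevity.
import time
import requests

ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"  # placeholder
TOKEN = "<oauth_access_token>"  # placeholder

HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "X-Snowflake-Authorization-Token-Type": "OAUTH",
    "Content-Type": "application/json",
    "Accept": "application/json",
}

# Submit SQL Statement for Execution.
resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/statements",
    headers=HEADERS,
    json={"statement": "SELECT * FROM MY_TABLE", "timeout": 60},
)
handle = resp.json()["statementHandle"]

# Check the Status and Get Results: a 202 means the statement is still
# running, so poll the handle until a 200 arrives with the result set.
while resp.status_code == 202:
    time.sleep(1)
    resp = requests.get(f"{ACCOUNT_URL}/api/v2/statements/{handle}", headers=HEADERS)

# Large result sets arrive in partitions; partitions beyond the first are
# fetched with the `partition` query parameter, e.g. params={"partition": 1}.
rows = resp.json().get("data", [])
print(f"partition 0 contains {len(rows)} rows")

# Cancel the Execution of a Statement (when it is no longer needed):
# requests.post(f"{ACCOUNT_URL}/api/v2/statements/{handle}/cancel", headers=HEADERS)
```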
## Obtaining Credentials
Set up Azure AD authentication for Snowflake by following these steps:
- In Step 1: Configure the OAuth Resource in Azure AD, follow steps 1-10 and define the scope as `SESSION:ROLE-ANY` by following these instructions.
- In Step 2: Create an OAuth Client in Azure AD, follow steps 1-13.
- Navigate to Authentication -> Platform configurations -> Add a platform -> Add "https://global.consent.azure-apim.net/redirect/snowflakepa" -> Click Save. Ensure that the redirect URL is set in the Snowflake OAuth Client and not the Snowflake OAuth Resource.
- Go to the resource created in Step 1, then go to Expose an API -> Add a client application -> Add your `APPLICATION_CLIENT_ID` from step 3 above -> Click Save.
- Follow Step 3: Collect Azure AD Information for Snowflake entirely.
- Copy and paste the text below into your Snowflake worksheet, which is where you execute your queries in Snowflake. Before you execute the query, make the following replacements so that it succeeds:
  - In Microsoft Azure, go to your Snowflake OAuth Resource app and click on Endpoints.
  - To get the `AZURE_AD_ISSUER` in line 5, copy the link in the Federation metadata document field and open it in a new tab. Copy the `entityID` link, which should look something like this: `https://sts.windows.net/90288a9b-97df-4c6d-b025-95713f21cef9/`. Paste it into the query, make sure there is a `/` before the last quotation mark, and keep the quotation marks.
  - To get the Keys URL in line 6, copy the link in the OpenID Connect metadata document field and open it in a new tab. Copy the `jwks_uri`, which should look something like this: `https://login.microsoftonline.com/90288a9b-97df-4c6d-b025-95713f21cef9/discovery/v2.0/keys`. Paste it into the query and keep the quotation marks.
  - Replace the Audience List URL in line 7 with the Application ID URI from Step 1. Keep the quotation marks.
  - If your Snowflake account uses the same email address as your Microsoft Azure account, replace `login_name` in line 9 with `email_address`. If not, keep it as is and do not type in your login name. Keep the quotation marks.
  - Make sure you've set your role to `ACCOUNTADMIN`. Now you can execute your query.
```sql
CREATE SECURITY INTEGRATION <integration name>
  type = external_oauth
  enabled = true
  external_oauth_type = azure
  external_oauth_issuer = '<AZURE_AD_ISSUER>'
  external_oauth_jws_keys_url = '<Keys URL from step 6>'
  external_oauth_audience_list = ('[Application ID URI from registered resource app in Azure]')
  external_oauth_token_user_mapping_claim = 'upn'
  external_oauth_snowflake_user_mapping_attribute = 'login_name'
  external_oauth_any_role_mode = 'ENABLE';
```
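The metadata documents referenced above are machine-readable, so the values can also be pulled with a short script instead of copying them by hand. A minimal Python sketch, assuming the sample tenant ID from the examples above (substitute the OpenID Connect metadata URL from your own Endpoints pane):

```python
# Read the Keys URL (jwks_uri) out of the OpenID Connect metadata document.
# The tenant ID below is the sample one from the steps above -- replace the
# whole URL with the one shown in your app's Endpoints pane in Azure.
import requests

METADATA_URL = (
    "https://login.microsoftonline.com/"
    "90288a9b-97df-4c6d-b025-95713f21cef9/"
    "v2.0/.well-known/openid-configuration"
)

config = requests.get(METADATA_URL).json()
print("Keys URL for line 6:", config["jwks_uri"])
print("v2.0 issuer:", config["issuer"])
# Note: the AZURE_AD_ISSUER for line 5 is the entityID from the Federation
# metadata document (the https://sts.windows.net/<tenant>/ form), which is
# not the same value as the v2.0 issuer printed above.
```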
## Getting Started
- Add the Snowflake connector in your Power App or Flow.
- Enter the connection credentials as below:
  - Client Id: Snowflake OAuth Client ID from the registered client app in Azure.
  - Client Secret: Snowflake OAuth Client secret from the registered client app in Azure.
  - Resource URL: Application ID URI from the registered resource app in Azure.
- Click on the Sign in button and provide user credentials.
## Known Issues and Limitations
- If you get a 500 response when creating a new connection, that is a transient error. Please wait a few minutes and try again.
- If you get a 401 response and your Host field in Step 1 follows this format "orgname-accountname," replace the Host field with your "locator" URL.
- The connector may time out on large query results. This is due to a general limitation that a custom connector must finish all operations, including fetching the data from Snowflake, within a total of 5 seconds, as documented here:
  > Q: Are there any limits?
  > A: Yes. Your script must finish execution within 5 seconds and the size of your script file can't be more than 1 MB.
- Snowflake does not currently support variable binding in multi-statement SQL requests. This behavior passes through to the connector.
- Fields with true null values are omitted from the result set of a query returned via a custom connector, while fields with quoted null strings ("null") are returned as expected (see the sketch below). This is due to the serialization settings of the APIM layer that inherently sits underneath a custom connector as part of the Power Platform. Refer to the Snowflake API Reference documentation for more details about the Nullable setting.
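To make the null behavior concrete, below is a minimal Python sketch of the array-to-object conversion idea; the actual conversion happens in the connector's script.csx, and the column names and values here are hypothetical.

```python
# Hypothetical illustration of the null-handling limitation described above.
columns = ["ID", "NICKNAME", "NOTES"]
row = ["42", None, "null"]  # NICKNAME is a true NULL; NOTES is the string "null"

# Convert one array-format row into a JSON-style object, as the connector's
# "Convert result set rows from array to objects" behavior does.
obj = {col: val for col, val in zip(columns, row)}
print(obj)  # {'ID': '42', 'NICKNAME': None, 'NOTES': 'null'}

# After the APIM serialization layer underneath the custom connector drops
# true nulls, the consumer effectively sees only:
# {'ID': '42', 'NOTES': 'null'}
```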
## Frequently Asked Questions
- How can the connector be used within Power Apps? Currently, Power Apps does not support dynamic schema. You can still use the connector from Power Apps by calling a flow from the app instead of calling the connector directly.
## Deployment Instructions
- Download the connector source code from the GitHub repository.
- Follow the instructions provided in Create a New Custom Connector to deploy the connector in your favourite Power Platform environment (a quick sanity check of the source files is sketched below).
- Once the connector is deployed, follow the instructions in Update an existing custom connector to update the connector with the latest changes.
- Follow the steps in the Obtaining Credentials section above to get credentials.
- Test the connector with the credentials obtained above.
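As referenced in the deployment steps above, here is a minimal Python sketch of a pre-upload sanity check; it assumes the three source files sit in the current directory and only confirms that the JSON definition files parse.

```python
# Quick pre-upload sanity check for the connector source files. This only
# confirms the JSON parses and the script file exists -- it is not a
# substitute for the Power Platform's own validation on import.
import json
from pathlib import Path

for name in ("apiDefinition.swagger.json", "apiProperties.json"):
    json.loads(Path(name).read_text(encoding="utf-8"))  # raises on malformed JSON
    print(f"{name}: OK")

assert Path("script.csx").is_file(), "script.csx is missing"
print("script.csx: present")
```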
## Version History

### September 2024
- Fix issue where Submit SQL Statement for Execution and Check the Status and Get Results actions were not handling fields with an object or array schema type.
- Resolve issue where multi-statement count parameter was not being handled by the Submit SQL Statement for Execution action.
### August 2024

#### Fixes
- Resolve issue where null values for numeric columns would result in an error when calling the Submit SQL Statement for Execution action or the Convert result set rows from array to objects action.
- Resolve issue where multi-partition results were not properly converted to JSON objects when calling the Convert result set rows from array to objects action.
- Resolve issue where the Convert result set rows from array to objects action would not properly convert the result set rows to JSON objects.
#### Breaking Changes
- The Convert result set rows from array to objects action is deprecated. This behavior has been integrated into the Submit SQL Statement for Execution and Check the Status and Get Results actions.
- The Check the Status and Get Results action now requires the DataSchema input parameter to be provided.