Update Synapse gallery documentation (#84)
Co-authored-by: Karthick Narendan <kanarend@microsoft.com>
If you've landed on this page from the Synapse gallery, we assume you already have the required infrastructure and the Custom Vision model package ready. If so, proceed; otherwise, visit the [infrastructure deployment](https://github.com/Azure/Azure-Orbital-Analytics-Samples/tree/main/deploy#infrastructure-deployment) section and provision the required infrastructure and the package.

The instructions on this page will guide you through configuring the pipeline template in the gallery to run a custom vision model for object detection (in this case, detecting swimming pools). You can also use the pipeline template to explore other object detection use cases.

**Prerequisites**

* Follow the steps in the [readme](https://github.com/Azure/Azure-Orbital-Analytics-Samples/blob/main/deploy/README.md) to deploy the infrastructure. This includes creation of the custom vision model package as well.

* Run the following command to create the linked services and Spark job definition on the Synapse workspace. Occasionally, creating the linked services fails because of an ongoing issue with the Azure CLI; if you encounter a failure, re-run the command.

```bash
./deploy/gallery/create_service.sh <environmentCode>
```

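Since the docs recommend simply re-running the command on a transient linked-service failure, that re-run can be scripted with a small retry helper. This is only a sketch: it assumes `create_service.sh` is safe to run repeatedly, which the note above implies but does not guarantee.

```bash
#!/usr/bin/env bash
# Sketch: retry a command up to N times before giving up. Assumes the wrapped
# script is safe to re-run, as the transient-failure note suggests.
retry() {
  local attempts=$1; shift
  local i
  for ((i = 1; i <= attempts; i++)); do
    "$@" && return 0
    echo "attempt ${i}/${attempts} failed; retrying..." >&2
  done
  return 1
}

# Example (commented out; requires the repository checkout):
# retry 3 ./deploy/gallery/create_service.sh "<environmentCode>"
```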

NOTE: **environmentCode** should be the same as the one used in the deployment steps.

* Run the following command to copy the sample GeoTiff image and the required configurations into the storage account for detecting swimming pools using the Object Detection CV model:

```bash
./deploy/scripts/copy_geotiff.sh <environmentCode>
```

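Both setup scripts take the same positional `environmentCode` argument. A minimal sketch of the two calls (the value `demo28` is a made-up example, not from the docs; the commands are echoed so the sketch runs without the repository, and you would drop the `echo` to execute them for real):

```bash
# Hypothetical environmentCode; must match the one used in the deployment steps.
env_code="demo28"

# Print the two gallery setup commands in order.
echo "./deploy/gallery/create_service.sh ${env_code}"
echo "./deploy/scripts/copy_geotiff.sh ${env_code}"
```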
**Switch back to Synapse gallery**

1. The input page appears as shown below.

![](./images/1.png)

2. Select the values from the dropdowns as shown below and click **Open pipeline**.

![](./images/2.png)

3. On the right side, there is a list of mandatory fields. Click each of them and select the respective name from the dropdown as shown below. This additional step is only an interim measure; we are working with the Synapse product group on a long-term fix.

NOTE: The list may not exactly match the one shown below; configure the names for the fields that appear.

- Transforms

![](./images/3.png)

- Pool Geolocation

![](./images/4.png)

4. When the mandatory fields are populated, enable the integration runtime by turning on the **Data flow debug** toggle. While it is warming up, enter the values for the parameters as shown below, then click **Publish** to save the changes.

![](./images/5.png)

|No |Parameter | Value | Comments |
|--| ---- | --- | ------- |
| 1|Prefix| \<environmentCode>-test-container | |
| 4|BatchAccountName | | Get the batch account name from \<environmentCode>-orc-rg |
| 5|BatchJobName | \<environmentCode>-data-cpu-pool | Get the job name from the batch account|
| 6|BatchLocation | | Get the region from the batch account; usually the deployment region|
| 7|SparkPoolName | pool<10-character-random-string>| Get the Spark pool name from \<environmentCode>-pipeline-rg |
| 8|EnvCode | \<environmentCode> | Same as used in the deployment steps|
| 9|KeyVaultName | kvp<10-character-random-string>| Get the name from \<environmentCode>-pipeline-rg |
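Several of the table values follow mechanically from `environmentCode`, while the account, pool, and vault names must still be looked up in the resource groups. A sketch of the derivable ones, again using the hypothetical code `demo28`:

```bash
# Hypothetical environmentCode; use the one from your deployment.
env_code="demo28"

# Parameters derivable directly from the environment code:
prefix="${env_code}-test-container"         # Prefix
batch_job_name="${env_code}-data-cpu-pool"  # BatchJobName

# Resource groups to inspect for the remaining values:
orc_rg="${env_code}-orc-rg"              # holds the batch account
pipeline_rg="${env_code}-pipeline-rg"    # holds the Spark pool and key vault

echo "${prefix} ${batch_job_name} ${orc_rg} ${pipeline_rg}"
```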

5. All set to start the pipeline now. Press **Debug**.