Update to Edge 1.1 and RemoteDeviceAdapter (#116)

* Update to Edge 1.1 and RemoteDeviceAdapter

* address feedback
This commit is contained in:
giakas 2021-10-19 14:59:12 -07:00 committed by GitHub
Parent 12af82d8e0
Commit 5bfb8a03d3
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
24 changed files with 11414 additions and 41 deletions


@@ -1,8 +1,8 @@
# Azure Video Analyzer Extension for VS Code
# Azure Video Analyzer Extension for VSCode
Azure Video Analyzer support for Visual Studio Code is provided through this extension that makes it easy to edit and manage Video Analyzer pipelines. The getting started walk through below closely mirrors what you can find in our online quickstart for the extension [here](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/create-pipeline-vs-code-extension).
The Azure Video Analyzer extension makes it easy to edit and manage Video Analyzer pipelines. The steps below closely mirror our how-to guide on how to [use the Visual Studio Code extension for Azure Video Analyzer](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/edge/use-visual-studio-code-extension).
If you have already connected to your IoT Hub and are looking for reference on how to use the extension, please go to our doc on how to [use the Video Analyzer Visual Studio Code extension](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/visual-studio-code-extension).
If you have already connected to your IoT Hub and are looking for reference on how to use the extension, please go to our reference guide on the [Visual Studio Code extension for Azure Video Analyzer](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/visual-studio-code-extension).
## Suggested Pre-reading
@@ -12,7 +12,10 @@ If you have already connected to your IoT Hub and are looking for reference on h
## Prerequisites
- An Azure account that includes an active subscription. [Create an account](https://azure.microsoft.com/free/) for free if you don't already have one.
- A deployed Video Analyzer edge module. If you didn't complete the [Get started - Azure Video Analyzer](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/get-started-detect-motion-emit-events) quickstart, you can deploy a sample set up in the [set up Azure resources](#set-up-azure-resources) section.
- Either the [Quickstart: Get started with Azure Video Analyzer](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/edge/get-started-detect-motion-emit-events) or [Continuous video recording and playback](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/edge/use-continuous-video-recording) tutorial
> [!NOTE]
> The images in this article are based on the [Continuous video recording and playback](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/edge/use-continuous-video-recording) tutorial.
## Set up Azure resources
@@ -28,56 +31,57 @@ The deployment process will take about **20 minutes**. Upon completion, you will
If you run into issues with Azure resources that get created, please view our [troubleshooting guide](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/troubleshoot) to resolve some commonly encountered issues.
## Connect the Azure Video Analyzer Visual Studio Code extension to your IoT Hub
## Set up your development environment
To connect the extension to the edge module, you first need to retrieve your connection string. Follow these steps to do so.
### Obtain your IoT Hub connection string
1. Go to the [Azure portal](https://portal.azure.com) and select your IoT Hub.
1. On the left under `Settings`, select `Shared access policies`.
1. Select the Policy Name `iothubowner`.
1. From the window on the right, copy the `Primary connection string`.
To make calls to the Video Analyzer Edge module, a connection string is first needed to connect the Visual Studio Code extension to the IoT Hub.
Now that you have the connection string, the steps below will connect the extension to the edge module.
1. In the Azure portal, go to your IoT Hub account.
1. Look for **Shared access policies** in the left pane and select it.
1. Select the policy named **iothubowner**.
1. Copy the **Primary connection string** value. It will look like `HostName=xxx.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=XXX`.
1. In Visual Studio Code, select the `Azure Video Analyzer` icon on the left.
1. Click on the `Enter Connection String` button.
1. At the top, paste the connection string from the portal.
1. Select the device – default is `avasample-iot-edge-device`.
1. Select the Video Analyzer module – default is `avaedge`.
### Connect the Visual Studio Code extension to the IoT Hub
![Setup IoT Hub Connection String](https://github.com/Azure/lva-edge-vscode-extension/raw/main/resources/gifs/EnterConnectionString.gif)
Using your IoT Hub connection string, connect the Visual Studio Code extension to the Video Analyzer module.
Along the left, you will now see your connected device with the underlying module. By default, there are no pipeline topologies deployed.
1. In Visual Studio Code, select the **Azure Video Analyzer** icon from the activity bar on the far left-hand side.
1. In the Video Analyzer extension pane, click on the **Enter Connection String** button.
1. At the top, paste the IoT Hub connection string.
1. Select the device where AVA is deployed. The default is named `avasample-iot-edge-device`.
1. Select the Video Analyzer module. The default is named `avaedge`.
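The connection string is a semicolon-separated list of `key=value` pairs. As a minimal sketch of its shape (the `parseConnectionString` helper below is illustrative, not part of the extension):

```typescript
// Illustrative helper: split an IoT Hub connection string into its parts.
// The extension itself uses the Azure IoT SDK; this only shows the format.
function parseConnectionString(cs: string): Record<string, string> {
    const parts: Record<string, string> = {};
    for (const segment of cs.split(";")) {
        const idx = segment.indexOf("=");
        if (idx > 0) {
            // Keys never contain "="; values (e.g. base64 keys) may.
            parts[segment.slice(0, idx)] = segment.slice(idx + 1);
        }
    }
    return parts;
}

const example = "HostName=xxx.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=XXX";
const parsed = parseConnectionString(example);
// parsed.HostName → "xxx.azure-devices.net"
```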
## Create a topology and live pipeline
![Gif showing how to enter the connection string](https://github.com/Azure/lva-edge-vscode-extension/raw/main/resources/gifs/EnterConnectionString.gif)
Pipeline topologies are the basic building blocks that Video Analyzer uses to define how work happens. You can learn more about [pipeline topologies here](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/pipeline.md). In this section you will deploy a pipeline topology, which is a template, and then create an instance of the topology, or live pipeline. The live pipeline is connected to the actual video stream.
The Video Analyzer extension pane should now show the connected device with all of its modules. Below the modules are where pipeline topologies are listed. By default, there are no pipeline topologies deployed.
1. On the left under `Modules`, right click on `Pipeline topologies` and select `Create pipeline topology`.
1. Along the top, under `Try sample topologies`, under `Motion Detection`, select `Publish motion events to IoT Hub`. When prompted, click `Proceed`.
1. Click `Save` in the top right.
## Create a pipeline topology
![Create a graph topology](https://github.com/Azure/lva-edge-vscode-extension/raw/main/resources/gifs/AddToplogy.gif)
A [pipeline topology](https://docs.microsoft.com/en-us/azure/azure-video-analyzer/video-analyzer-docs/pipeline) enables you to describe how live video or recorded videos should be processed and analyzed for your custom needs through a set of interconnected nodes.
You should now see an entry in the `Pipeline topologies` list on the left labeled `MotionDetection`. This is a pipeline topology, where some of the parameters are defined as variables that you can feed in when you create a live pipeline. Next we will create a live pipeline.
1. On the left under **Modules**, right click on **Pipeline topologies** and select **Create pipeline topology**.
1. Along the top, under **Try sample topologies**, under **Continuous Video Recording**, select **Record to Azure Video Analyzer video**. When prompted, click **Proceed**.
1. Click **Save** in the top right.
1. On the left under `Pipeline topologies`, right click on `MotionDetection` and select `Create live pipeline`.
1. For `Live pipeline name`, put in `mdpipeline1`.
1. In the `Parameters` section:
- For `rtspUrl`, put in `rtsp://rtspsim:554/media/camera-300s.mkv`.
- For `rtspUserName`, put in `testuser`.
- For `rtspPassword`, put in `testpassword`.
1. In the top right, click `Save and activate`.
![Gif showing how to add a topology](https://github.com/Azure/lva-edge-vscode-extension/raw/main/resources/gifs/AddToplogy.gif)
![Activate a graph instance](https://github.com/Azure/lva-edge-vscode-extension/raw/main/resources/gifs/CreateAndActivate.gif)
Notice that there is now an entry in the **Pipeline topologies** list on the left labeled **CVRToVideoSink**. This is a pipeline topology, where some of the parameters are defined as variables.
This gets a starting topology deployed and a live pipeline up and running on your edge device. If you have the Azure IoT Hub extension installed from the Get Started quickstart, you can monitor the built-in event endpoint in the Azure IoT Hub Visual Studio Code extension, as shown in the [Observe Results](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/get-started-detect-motion-emit-events.md#observe-results) section. This requires the [Azure IoT Tools extension](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools) as well.
## Create a live pipeline
If you are looking for more instructions on the extension, please go to our reference doc on how to [use the Video Analyzer Visual Studio Code extension](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/visual-studio-code-extension).
A live pipeline is an instance of a pipeline topology. The variables in a pipeline topology are filled when a live pipeline is created.
## Clean up resources
1. On the left under **Pipeline topologies**, right click on **CVRToVideoSink** and select **Create live pipeline**.
1. For **Instance name**, put in `livePipeline1`.
1. In the **Parameters** section, under the **rtspUrl** parameter, put in `rtsp://rtspsim:554/media/camera-300s.mkv`.
1. In the top right, click **Save and activate**.
If you intend to try the Video Analyzer quickstarts or tutorials, keep the resources you created. Otherwise, go to the Azure portal, go to your resource groups, select the resource group where you ran this quickstart, and delete all the resources.
![Gif showing how to create and activate a live pipeline](https://github.com/Azure/lva-edge-vscode-extension/raw/main/resources/gifs/CreateAndActivate.gif)
Now that a live pipeline has been activated, operational events can be viewed by clicking on the **Start Monitoring Built-in Event Endpoint** button on the IoT Hub extension, as shown in the [Continuous video recording and playback](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/edge/use-continuous-video-recording#prepare-to-monitor-the-modules) tutorial.
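Under the hood, creating a live pipeline corresponds to a direct method call on the edge module. A hedged sketch of what a `livePipelineSet` payload might look like, using the values from the steps above (the exact schema is defined by the Video Analyzer module):

```typescript
// Hypothetical sketch of a livePipelineSet direct-method payload,
// built from the parameter values used in the steps above.
const livePipeline = {
    name: "livePipeline1",
    properties: {
        topologyName: "CVRToVideoSink",
        parameters: [
            { name: "rtspUrl", value: "rtsp://rtspsim:554/media/camera-300s.mkv" }
        ]
    }
};
```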
If you are looking for more instructions on the extension, please go to our reference doc on the [Visual Studio Code extension for Azure Video Analyzer](https://docs.microsoft.com/azure/azure-video-analyzer/video-analyzer-docs/visual-studio-code-extension).
## Contributing


@@ -2,7 +2,7 @@
"name": "azure-video-analyzer",
"displayName": "Azure Video Analyzer",
"description": "%extensionDescription%",
"version": "0.1.6",
"version": "0.1.7",
"publisher": "ms-azuretools",
"license": "SEE LICENSE IN LICENSE",
"homepage": "https://github.com/Azure/video-analyzer-vscode-extension",
@@ -134,6 +134,19 @@
{
"command": "moduleExplorer.livePipeline.showJson",
"title": "%livePipeline.showJson%"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.create",
"title": "%remoteDeviceAdapter.create%",
"icon": "$(add)"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.delete",
"title": "%remoteDeviceAdapter.delete%"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.showJson",
"title": "%remoteDeviceAdapter.showJson%"
}
],
"viewsContainers": {
@@ -289,6 +302,23 @@
"command": "moduleExplorer.livePipeline.showJson",
"when": "view == moduleExplorer && viewItem =~ /^livePipelineItemContext.*$/",
"group": "3_instanceCommands@0"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.create",
"when": "view == moduleExplorer && viewItem == remoteAdapterListContext"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.create",
"when": "view == moduleExplorer && viewItem == remoteAdapterListContext",
"group": "inline"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.delete",
"when": "view == moduleExplorer && viewItem == remoteDeviceAdapterContext"
},
{
"command": "moduleExplorer.remoteDeviceAdapter.showJson",
"when": "view == moduleExplorer && viewItem == remoteDeviceAdapterContext"
}
]
},


@@ -57,6 +57,23 @@
"modulesListTreeItem": "Modules",
"no": "No",
"refresh": "Refresh",
"remoteDeviceAdapter.create": "Create remote device adapter",
"remoteDeviceAdapter.list.treeItem": "Remote device adapters",
"remoteDeviceAdapter.delete.confirmation": "Are you sure you want to delete the remote device adapter?",
"remoteDeviceAdapter.delete.failedError": "Failed to delete the remote device adapter",
"remoteDeviceAdapter.delete.successMessage": "Successfully deleted the remote device adapter",
"remoteDeviceAdapter.delete": "Delete remote device adapter",
"remoteDeviceAdapter.showJson": "Show remote device adapter JSON",
"remoteDeviceAdapter.create.name.prompt": "Enter a unique name for the remote device adapter",
"remoteDeviceAdapter.create.name.existing.validation.error": "Name already in use, please enter a unique name for the remote device adapter",
"remoteDeviceAdapter.create.deviceId.pick.placeHolder": "Select a device",
"remoteDeviceAdapter.create.deviceId.new.button": "Create a new device",
"remoteDeviceAdapter.create.newDevice.prompt": "Enter a unique ID for the new device",
"remoteDeviceAdapter.create.newDevice.existing.validation.error": "ID already in use, please enter a unique ID for the new device",
"remoteDeviceAdapter.create.newDevice.regex.validation.error": "Device ID should follow '^[A-Za-z0-9-:.+%_#*?!(),=@$']{0,128}$'",
"remoteDeviceAdapter.create.host.prompt": "Enter a hostname or IP address of the remote device",
"remoteDeviceAdapter.save.failedError": "Failed to save the remote device adapter",
"remoteDeviceAdapter.save.successMessage": "Successfully saved the remote device adapter",
"saveGraphFailedError": "Failed to save the graph ",
"saveGraphSuccessMessage": "Successfully saved the graph topology ",
"saveInstanceFailedError": "Failed to save the instance ",
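The device ID rule quoted in the validation message can be exercised directly; a small sketch (regex copied from the message above, example IDs illustrative):

```typescript
// Device ID rule from the validation message above: up to 128 chars from
// the allowed set (letters, digits, and - : . + % _ # * ? ! ( ) , = @ $ ').
const deviceIdPattern = /^[A-Za-z0-9-:.+%_#*?!(),=@$']{0,128}$/;

const valid = deviceIdPattern.test("camera-01");
const invalid = deviceIdPattern.test("bad device id"); // spaces are not allowed
```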


@@ -0,0 +1 @@
<svg width="16" height="16" xmlns="http://www.w3.org/2000/svg"><title>Layer 1</title><rect height="11" width="3" y="3" x="7" fill="#C5C5C5"/><rect height="3" width="11" y="7" x="3" fill="#C5C5C5"/></svg>



@@ -0,0 +1 @@
<svg width="16" height="16" xmlns="http://www.w3.org/2000/svg"><title>Layer 1</title><rect height="11" width="3" y="3" x="7" fill="#424242"/><rect height="3" width="11" y="7" x="3" fill="#424242"/></svg>



@@ -240,6 +240,32 @@ export interface PipelineTopology {
systemData?: MediaGraphSystemData;
properties?: MediaGraphTopologyProperties;
}
export interface RemoteDeviceAdapter {
name: string;
apiVersion?: string;
systemData?: MediaGraphSystemData;
properties?: RemoteDeviceAdapterProperties;
}
export interface RemoteDeviceAdapterProperties {
description?: string;
target?: RemoteDeviceAdapterTarget;
iotHubDeviceConnection?: IotHubDeviceConnection;
}
export interface RemoteDeviceAdapterTarget {
host: string;
}
export interface IotHubDeviceConnection {
deviceId: string;
credentials: { key: string };
}
export interface SymmetricKeyCredentials {
"@type": "#Microsoft.VideoAnalyzer.SymmetricKeyCredentials";
key: string;
}
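Putting these interfaces together, a remote device adapter payload might look like the following sketch (the types are re-declared locally so the example stands alone; all values are illustrative, and the symmetric key would come from the IoT Hub device registry):

```typescript
// Minimal local copies of the interfaces above, so the sketch is self-contained.
interface RemoteDeviceAdapterTarget { host: string }
interface IotHubDeviceConnection { deviceId: string; credentials: { "@type"?: string; key: string } }
interface RemoteDeviceAdapterProperties {
    description?: string;
    target?: RemoteDeviceAdapterTarget;
    iotHubDeviceConnection?: IotHubDeviceConnection;
}
interface RemoteDeviceAdapter {
    name: string;
    apiVersion?: string;
    properties?: RemoteDeviceAdapterProperties;
}

// Illustrative adapter: maps an RTSP camera host to an IoT Hub device identity.
const adapter: RemoteDeviceAdapter = {
    name: "camera1-adapter",
    properties: {
        target: { host: "camera1.local" },
        iotHubDeviceConnection: {
            deviceId: "camera1",
            credentials: {
                "@type": "#Microsoft.VideoAnalyzer.SymmetricKeyCredentials",
                key: "base64key=="
            }
        }
    }
};
```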
/**
* Collection of graph topologies.


@@ -85,4 +85,8 @@ export class IotHubData {
}
return null;
}
public async addDevice(deviceId: string) {
return await this.registryClient?.addDevices([{ deviceId: deviceId, capabilities: { iotEdge: false } }]);
}
}


@@ -0,0 +1,27 @@
import {
PipelineTopology,
RemoteDeviceAdapter
} from "../../Common/Types/VideoAnalyzerSDKTypes";
import { ModuleDetails } from "../ModuleExplorerPanel/ModuleItem";
import { IotHubData } from "./IotHubData";
export class RemoteDeviceAdapterData {
public static async getRemoteDeviceAdapters(iotHubData: IotHubData, moduleDetails: ModuleDetails): Promise<RemoteDeviceAdapter[]> {
try {
const response = await iotHubData.directMethodCall(moduleDetails, "remoteDeviceAdapterList");
return response?.value;
} catch (error) {
return Promise.reject(error);
}
}
public static putRemoteDeviceAdapter(iotHubData: IotHubData, moduleDetails: ModuleDetails, remoteAdapter: RemoteDeviceAdapter): Promise<RemoteDeviceAdapter[]> {
return iotHubData.directMethodCall(moduleDetails, "remoteDeviceAdapterSet", remoteAdapter);
}
public static deleteRemoteDeviceAdapter(iotHubData: IotHubData, moduleDetails: ModuleDetails, remoteAdapterName: string): Promise<void> {
return iotHubData.directMethodCall(moduleDetails, "remoteDeviceAdapterDelete", {
name: remoteAdapterName
});
}
}
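Each operation above is a thin wrapper over `IotHubData.directMethodCall`. The sketch below uses a hypothetical in-memory stand-in for `IotHubData` to show the method-name/payload contract the class relies on:

```typescript
// Fake stand-in for IotHubData, recording direct method calls so the
// contract (method name plus optional payload) is visible in isolation.
const calls: { method: string; payload?: unknown }[] = [];
const fakeIotHubData = {
    directMethodCall(_module: unknown, method: string, payload?: unknown): Promise<{ value: unknown[] }> {
        calls.push({ method, payload });
        return Promise.resolve({ value: [] });
    }
};

// The three operations above reduce to these direct method calls:
fakeIotHubData.directMethodCall({}, "remoteDeviceAdapterList");
fakeIotHubData.directMethodCall({}, "remoteDeviceAdapterSet", { name: "camera1-adapter" });
fakeIotHubData.directMethodCall({}, "remoteDeviceAdapterDelete", { name: "camera1-adapter" });
```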


@@ -9,6 +9,7 @@ import { Logger } from "../Util/Logger";
import { TreeUtils } from "../Util/TreeUtils";
import { GraphEditorPanel } from "../Webview/GraphPanel";
import { INode } from "./Node";
import { RemoteDeviceAdapterListItem } from "./RemoteDeviceAdapterListItem";
import { TopologyListItem } from "./TopologyListItem";
export interface ModuleDetails {
@@ -56,7 +57,18 @@ export class ModuleItem extends vscode.TreeItem {
this._collapsibleState
);
await topologyListItem.loadInstances();
return [topologyListItem];
const items: any[] = [topologyListItem];
if (!versionDetails.legacy && versionDetails.apiVersion !== "1.0") {
const remoteDeviceAdapterListItem = new RemoteDeviceAdapterListItem(this.iotHubData, {
deviceId: this.deviceId,
moduleId: this.moduleId,
apiVersion: versionDetails.apiVersion,
legacyModule: versionDetails.legacy,
versionFolder: versionDetails.versionFolder
});
items.push(remoteDeviceAdapterListItem);
}
return items;
} else {
return [new vscode.TreeItem(Localizer.localize("iotHub.connectionString.moduleNotLVA"), vscode.TreeItemCollapsibleState.None) as INode];
}


@@ -0,0 +1,59 @@
import * as vscode from "vscode";
import {
LivePipeline,
MediaGraphInstanceState,
PipelineTopology,
RemoteDeviceAdapter
} from "../../Common/Types/VideoAnalyzerSDKTypes";
import { IotHubData } from "../Data/IotHubData";
import { LivePipelineData } from "../Data/LivePipelineData";
import { RemoteDeviceAdapterData } from "../Data/RemoteDeviceAdapterData";
import { Constants } from "../Util/Constants";
import { AvaHubConfig, ExtensionUtils } from "../Util/ExtensionUtils";
import Localizer from "../Util/Localizer";
import { Logger } from "../Util/Logger";
import { TreeUtils } from "../Util/TreeUtils";
import { GraphEditorPanel } from "../Webview/GraphPanel";
import { ModuleDetails } from "./ModuleItem";
import { INode } from "./Node";
export class RemoteDeviceAdapterItem extends vscode.TreeItem {
private _logger: Logger;
constructor(public iotHubData: IotHubData, private readonly _moduleDetails: ModuleDetails, private readonly _remoteDeviceAdapter?: RemoteDeviceAdapter) {
super(_remoteDeviceAdapter?.name ?? "", vscode.TreeItemCollapsibleState.None);
this._logger = Logger.getOrCreateOutputChannel();
this.contextValue = `remoteDeviceAdapterContext`;
this.iconPath = TreeUtils.getThemedIconPath("iothub");
}
public getChildren(avaHubConfig: AvaHubConfig): Promise<INode[]> | INode[] {
return [];
}
public async deleteRemoteDeviceAdapterCommand() {
if (this._remoteDeviceAdapter) {
const allowDelete = await ExtensionUtils.showConfirmation(Localizer.localize("remoteDeviceAdapter.delete.confirmation"));
if (allowDelete) {
RemoteDeviceAdapterData.deleteRemoteDeviceAdapter(this.iotHubData, this._moduleDetails, this._remoteDeviceAdapter.name).then(
(response) => {
TreeUtils.refresh();
this._logger.showInformationMessage(`${Localizer.localize("remoteDeviceAdapter.delete.successMessage")} "${this._remoteDeviceAdapter?.name}"`);
},
(error) => {
const errorList = GraphEditorPanel.parseDirectMethodError(error);
this._logger.logError(`${Localizer.localize("remoteDeviceAdapter.delete.failedError")} "${this._remoteDeviceAdapter?.name}"`, errorList);
}
);
}
}
}
public async showRemoteDeviceAdapterJson() {
if (this._remoteDeviceAdapter) {
vscode.workspace.openTextDocument({ language: "json", content: JSON.stringify(this._remoteDeviceAdapter, undefined, 4) }).then((doc) => {
vscode.window.showTextDocument(doc);
});
}
}
}


@@ -0,0 +1,206 @@
import { Device } from "azure-iothub";
import * as vscode from "vscode";
import {
LivePipeline,
PipelineTopology
} from "../../Common/Types/VideoAnalyzerSDKTypes";
import { IotHubData } from "../Data/IotHubData";
import { LivePipelineData } from "../Data/LivePipelineData";
import { RemoteDeviceAdapterData } from "../Data/RemoteDeviceAdapterData";
import { TopologyData } from "../Data/TolologyData";
import { Constants } from "../Util/Constants";
import { AvaHubConfig } from "../Util/ExtensionUtils";
import Localizer from "../Util/Localizer";
import { Logger } from "../Util/Logger";
import { MultiStepInput } from "../Util/MultiStepInput";
import { TreeUtils } from "../Util/TreeUtils";
import { GraphEditorPanel } from "../Webview/GraphPanel";
import { ModuleDetails } from "./ModuleItem";
import { INode } from "./Node";
import { RemoteDeviceAdapterItem } from "./RemoteDeviceAdapterItem";
import { TopologyItem } from "./TopologyItem";
interface RemoteDeviceAdapterCreateModel {
name: string;
device: any;
newDeviceId: string;
host: string;
}
export class RemoteDeviceAdapterListItem extends vscode.TreeItem {
private _logger: Logger;
private _remoteDeviceAdapters: any[] = [];
constructor(
public iotHubData: IotHubData,
private readonly _moduleDetails: ModuleDetails,
private readonly _collapsibleState: vscode.TreeItemCollapsibleState = vscode.TreeItemCollapsibleState.Collapsed
) {
super(Localizer.localize("remoteDeviceAdapter.list.treeItem"), _collapsibleState);
this.contextValue = `remoteAdapterListContext`;
this._logger = Logger.getOrCreateOutputChannel();
RemoteDeviceAdapterData.getRemoteDeviceAdapters(this.iotHubData, this._moduleDetails).then((adapters) => {
this._remoteDeviceAdapters = adapters;
});
}
public getChildren(): Promise<INode[]> | INode[] {
return new Promise((resolve) => {
if (!this._remoteDeviceAdapters?.length) {
RemoteDeviceAdapterData.getRemoteDeviceAdapters(this.iotHubData, this._moduleDetails).then(
(adapters) => {
this._remoteDeviceAdapters = adapters;
resolve(
adapters?.map((remoteDeviceAdapter) => {
return new RemoteDeviceAdapterItem(this.iotHubData, this._moduleDetails, remoteDeviceAdapter);
})
);
},
(error) => {
const errorString = this._moduleDetails.legacyModule ? "getAllGraphsFailedError" : "topologies.getAll.failedError";
const errorNode = new vscode.TreeItem(Localizer.localize(errorString), vscode.TreeItemCollapsibleState.None);
const errorList = GraphEditorPanel.parseDirectMethodError(error);
this._logger.logError(`${Localizer.localize(errorString)}`, errorList, false);
resolve([errorNode as INode]);
}
);
} else {
resolve(
this._remoteDeviceAdapters.map((remoteDeviceAdapter) => {
return new RemoteDeviceAdapterItem(this.iotHubData, this._moduleDetails, remoteDeviceAdapter);
})
);
}
});
}
public async createNewRemoteDeviceAdapterCommand(context: vscode.ExtensionContext) {
const dataModel = (await this.collectRemoteDeviceAdapterCreateInputs(context)) as RemoteDeviceAdapterCreateModel;
if (dataModel) {
if (dataModel.newDeviceId) {
await this.iotHubData.addDevice(dataModel.newDeviceId);
dataModel.device = await this.iotHubData.getDevice(dataModel.newDeviceId);
}
const remoteDeviceAdapter = {
name: dataModel.name,
properties: {
target: { host: dataModel.host },
iotHubDeviceConnection: {
deviceId: (dataModel.device as any).deviceId,
credentials: {
"@type": "#Microsoft.VideoAnalyzer.SymmetricKeyCredentials",
key: (dataModel.device as any).authentication.symmetricKey.primaryKey
}
}
}
};
return RemoteDeviceAdapterData.putRemoteDeviceAdapter(this.iotHubData, this._moduleDetails, remoteDeviceAdapter).then(
() => {
TreeUtils.refresh();
this._logger.showInformationMessage(`${Localizer.localize("remoteDeviceAdapter.save.successMessage")} "${remoteDeviceAdapter?.name}"`);
return Promise.resolve();
},
(error) => {
const errorList = GraphEditorPanel.parseDirectMethodError(error, remoteDeviceAdapter);
this._logger.logError(`${Localizer.localize("remoteDeviceAdapter.save.failedError")} "${remoteDeviceAdapter.name}"`, errorList);
return Promise.reject();
}
);
}
}
private async collectRemoteDeviceAdapterCreateInputs(context: vscode.ExtensionContext) {
const title = Localizer.localize("remoteDeviceAdapter.create");
class MyButton implements vscode.QuickInputButton {
constructor(public iconPath: { light: vscode.Uri; dark: vscode.Uri }, public tooltip: string) {}
}
const createNewDeviceButton = new MyButton(TreeUtils.getThemedIconPath("add") as any, Localizer.localize("remoteDeviceAdapter.create.deviceId.new.button"));
const inputRemoteAdapterName = async (inputStep: MultiStepInput, dataModel: Partial<RemoteDeviceAdapterCreateModel>) => {
dataModel.name = await inputStep.showInputBox({
title,
prompt: Localizer.localize("remoteDeviceAdapter.create.name.prompt"),
step: 1,
totalSteps: 3,
validate: (value) =>
Promise.resolve(
this._remoteDeviceAdapters.find((adapter) => adapter.name === value)
? Localizer.localize("remoteDeviceAdapter.create.name.existing.validation.error")
: ""
),
value: typeof dataModel.name === "string" ? dataModel.name : ""
});
return (input: MultiStepInput) => pickDeviceId(input, dataModel);
};
const pickDeviceId = async (inputStep: MultiStepInput, dataModel: Partial<RemoteDeviceAdapterCreateModel>) => {
const devices = await this.iotHubData.getDevices();
const iotDevices = devices?.filter((device) => !device.capabilities?.iotEdge);
if (!iotDevices?.length) {
return (input: MultiStepInput) => inputNewDeviceId(input, dataModel);
}
const pick = await inputStep.showQuickPick({
title,
step: 2,
totalSteps: 3,
placeholder: Localizer.localize("remoteDeviceAdapter.create.deviceId.pick.placeHolder"),
items:
iotDevices?.map((device) => {
return { label: device.deviceId, device: device };
}) ?? [],
activeItem: typeof dataModel.device !== "string" ? dataModel.device : undefined,
buttons: [createNewDeviceButton]
});
if (pick instanceof MyButton) {
return (input: MultiStepInput) => inputNewDeviceId(input, dataModel);
}
dataModel.device = (pick as any).device;
return (input: MultiStepInput) => inputHostName(input, dataModel);
};
const inputNewDeviceId = async (inputStep: MultiStepInput, dataModel: Partial<RemoteDeviceAdapterCreateModel>) => {
dataModel.newDeviceId = await inputStep.showInputBox({
title,
prompt: Localizer.localize("remoteDeviceAdapter.create.newDevice.prompt"),
step: 3,
totalSteps: 3,
validate: (value) => {
return new Promise((resolve) => {
this.iotHubData.getDevices().then((devices) => {
if (devices?.find((adapter) => adapter.deviceId === value)) {
resolve(Localizer.localize("remoteDeviceAdapter.create.newDevice.existing.validation.error"));
} else if (!/^[A-Za-z0-9-:.+%_#*?!(),=@$']{0,128}$/.test(value)) {
resolve(Localizer.localize("remoteDeviceAdapter.create.newDevice.regex.validation.error"));
} else {
resolve("");
}
});
});
},
value: typeof dataModel.newDeviceId === "string" ? dataModel.newDeviceId : ""
});
return (input: MultiStepInput) => inputHostName(input, dataModel);
};
const inputHostName = async (inputStep: MultiStepInput, dataModel: Partial<RemoteDeviceAdapterCreateModel>) => {
const extraSteps = dataModel.newDeviceId ? 1 : 0;
dataModel.host = await inputStep.showInputBox({
title,
prompt: Localizer.localize("remoteDeviceAdapter.create.host.prompt"),
step: 3 + extraSteps,
totalSteps: 3 + extraSteps,
validate: (value) => Promise.resolve(""),
value: typeof dataModel.host === "string" ? dataModel.host : ""
});
};
const dataModel = {} as Partial<RemoteDeviceAdapterCreateModel>;
await MultiStepInput.run((input) => inputRemoteAdapterName(input, dataModel));
return dataModel;
}
}


@@ -13,7 +13,7 @@ export class Constants {
};
public static LegacySupportedApiVersions = ["2.0"];
public static SupportedApiVersions = ["1.0"];
public static SupportedApiVersions = ["1.0", "1.1"];
public static VideoAnalyzerGlobalStateKey = "videoAnalyzerGlobalStateConfigKey";
public static VideoAnalyzerGlobalStateGraphAlignKey = "videoAnalyzerGlobalStateGraphAlignKey";
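A small sketch of how the supported-version constant might gate the remote device adapter feature (the `showsRemoteDeviceAdapters` helper is hypothetical; the actual check lives in `ModuleItem`, which requires a non-legacy module with an API version newer than 1.0):

```typescript
// Sketch of the version gate: remote device adapters are only offered for
// non-legacy modules on a supported API version newer than 1.0 (i.e. 1.1+).
const SupportedApiVersions = ["1.0", "1.1"];

function showsRemoteDeviceAdapters(apiVersion: string, legacy: boolean): boolean {
    return !legacy && SupportedApiVersions.includes(apiVersion) && apiVersion !== "1.0";
}
```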


@@ -0,0 +1,183 @@
import {
CancellationToken,
Disposable,
ExtensionContext,
QuickInput,
QuickInputButton,
QuickInputButtons,
QuickPickItem,
Uri,
window
} from "vscode";
class InputFlowAction {
static back = new InputFlowAction();
static cancel = new InputFlowAction();
static resume = new InputFlowAction();
}
type InputStep = (input: MultiStepInput) => Thenable<InputStep | void>;
interface QuickPickParameters<T extends QuickPickItem> {
title: string;
step: number;
totalSteps: number;
items: T[];
activeItem?: T;
placeholder: string;
buttons?: QuickInputButton[];
shouldResume?: () => Thenable<boolean>;
}
interface InputBoxParameters {
title: string;
step: number;
totalSteps: number;
value: string;
prompt: string;
validate: (value: string) => Promise<string | undefined>;
buttons?: QuickInputButton[];
shouldResume?: () => Thenable<boolean>;
}
export class MultiStepInput {
static async run<T>(start: InputStep) {
const input = new MultiStepInput();
return input.stepThrough(start);
}
private current?: QuickInput;
private steps: InputStep[] = [];
private async stepThrough<T>(start: InputStep) {
let step: InputStep | void = start;
while (step) {
this.steps.push(step);
if (this.current) {
this.current.enabled = false;
this.current.busy = true;
}
try {
step = await step(this);
} catch (err) {
if (err === InputFlowAction.back) {
this.steps.pop();
step = this.steps.pop();
} else if (err === InputFlowAction.resume) {
step = this.steps.pop();
} else if (err === InputFlowAction.cancel) {
step = undefined;
} else {
throw err;
}
}
}
if (this.current) {
this.current.dispose();
}
}
async showQuickPick<T extends QuickPickItem, P extends QuickPickParameters<T>>({
title,
step,
totalSteps,
items,
activeItem,
placeholder,
buttons,
shouldResume
}: P) {
const disposables: Disposable[] = [];
try {
return await new Promise<T | (P extends { buttons: (infer I)[] } ? I : never)>((resolve, reject) => {
const input = window.createQuickPick<T>();
input.title = title;
input.step = step;
input.totalSteps = totalSteps;
input.placeholder = placeholder;
input.items = items;
if (activeItem) {
input.activeItems = [activeItem];
}
input.buttons = [...(this.steps.length > 1 ? [QuickInputButtons.Back] : []), ...(buttons || [])];
disposables.push(
input.onDidTriggerButton((item) => {
if (item === QuickInputButtons.Back) {
reject(InputFlowAction.back);
} else {
resolve(<any>item);
}
}),
input.onDidChangeSelection((items) => resolve(items[0])),
input.onDidHide(() => {
(async () => {
reject(shouldResume && (await shouldResume()) ? InputFlowAction.resume : InputFlowAction.cancel);
})().catch(reject);
})
);
if (this.current) {
this.current.dispose();
}
this.current = input;
this.current.show();
});
} finally {
disposables.forEach((d) => d.dispose());
}
}
async showInputBox<P extends InputBoxParameters>({ title, step, totalSteps, value, prompt, validate, buttons, shouldResume }: P) {
const disposables: Disposable[] = [];
try {
return await new Promise<string | (P extends { buttons: (infer I)[] } ? I : never)>((resolve, reject) => {
const input = window.createInputBox();
input.title = title;
input.step = step;
input.totalSteps = totalSteps;
input.value = value || "";
input.prompt = prompt;
input.buttons = [...(this.steps.length > 1 ? [QuickInputButtons.Back] : []), ...(buttons || [])];
let validating = validate("");
disposables.push(
input.onDidTriggerButton((item) => {
if (item === QuickInputButtons.Back) {
reject(InputFlowAction.back);
} else {
resolve(<any>item);
}
}),
input.onDidAccept(async () => {
const value = input.value;
input.enabled = false;
input.busy = true;
if (!(await validate(value))) {
resolve(value);
}
input.enabled = true;
input.busy = false;
}),
input.onDidChangeValue(async (text) => {
const current = validate(text);
validating = current;
const validationMessage = await current;
if (current === validating) {
input.validationMessage = validationMessage;
}
}),
input.onDidHide(() => {
(async () => {
reject(shouldResume && (await shouldResume()) ? InputFlowAction.resume : InputFlowAction.cancel);
})().catch(reject);
})
);
if (this.current) {
this.current.dispose();
}
this.current = input;
this.current.show();
});
} finally {
disposables.forEach((d) => d.dispose());
}
}
}
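The step loop above navigates with sentinel errors: a step rejects with `InputFlowAction.back` to return to the previous step, `resume` to re-show the current one, or `cancel` to abort. The same pattern can be sketched standalone, outside the VS Code API; every name below is illustrative and not part of the extension:

```typescript
// Sentinel values signalling how the step loop should proceed,
// mirroring the InputFlowAction pattern above (illustrative names).
class FlowAction {
    static back = new FlowAction();
    static cancel = new FlowAction();
    private constructor() {}
}

// A step mutates shared state and returns the next step, or undefined to finish.
type Step = (state: string[]) => Promise<Step | undefined>;

async function runSteps(start: Step): Promise<string[]> {
    const state: string[] = [];
    const stack: Step[] = [];
    let step: Step | undefined = start;
    while (step) {
        stack.push(step);
        try {
            step = await step(state);
        } catch (err) {
            if (err === FlowAction.back) {
                stack.pop(); // discard the step that threw...
                step = stack.pop(); // ...and re-run the previous one,
                state.pop(); // forgetting the value it had collected
            } else if (err === FlowAction.cancel) {
                step = undefined;
            } else {
                throw err; // real errors still propagate
            }
        }
    }
    return state;
}
```

Because the sentinels are compared by identity (`err === FlowAction.back`), ordinary exceptions pass through the loop unchanged, exactly as in the `stepThrough` method above.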

View file

@@ -1,6 +1,8 @@
import * as vscode from "vscode";
import { LivePipelineItem } from "./ModuleExplorerPanel/LivePipelineItem";
import ModuleExplorer from "./ModuleExplorerPanel/ModuleExplorer";
import { RemoteDeviceAdapterItem } from "./ModuleExplorerPanel/RemoteDeviceAdapterItem";
import { RemoteDeviceAdapterListItem } from "./ModuleExplorerPanel/RemoteDeviceAdapterListItem";
import { TopologyItem } from "./ModuleExplorerPanel/TopologyItem";
import { TopologyListItem } from "./ModuleExplorerPanel/TopologyListItem";
import { Constants } from "./Util/Constants";
@@ -90,6 +92,15 @@ export async function activate(context: vscode.ExtensionContext) {
}),
vscode.commands.registerCommand("moduleExplorer.livePipeline.showJson", (instanceNode: LivePipelineItem) => {
instanceNode.showLivePipelineJson();
}),
vscode.commands.registerCommand("moduleExplorer.remoteDeviceAdapter.create", (remoteDeviceAdapter: RemoteDeviceAdapterListItem) => {
remoteDeviceAdapter.createNewRemoteDeviceAdapterCommand(context);
}),
vscode.commands.registerCommand("moduleExplorer.remoteDeviceAdapter.delete", (remoteDeviceAdapter: RemoteDeviceAdapterItem) => {
remoteDeviceAdapter.deleteRemoteDeviceAdapterCommand();
}),
vscode.commands.registerCommand("moduleExplorer.remoteDeviceAdapter.showJson", (remoteDeviceAdapter: RemoteDeviceAdapterItem) => {
remoteDeviceAdapter.showRemoteDeviceAdapterJson();
})
);
}

File diff not shown because it is too large

View file

@@ -0,0 +1,13 @@
{
"sources": ["#Microsoft.VideoAnalyzer.RtspSource", "#Microsoft.VideoAnalyzer.IotHubMessageSource"],
"processors": [
"#Microsoft.VideoAnalyzer.MotionDetectionProcessor",
"#Microsoft.VideoAnalyzer.ObjectTrackingProcessor",
"#Microsoft.VideoAnalyzer.LineCrossingProcessor",
"#Microsoft.VideoAnalyzer.SignalGateProcessor",
"#Microsoft.VideoAnalyzer.CognitiveServicesVisionProcessor",
"#Microsoft.VideoAnalyzer.GrpcExtension",
"#Microsoft.VideoAnalyzer.HttpExtension"
],
"sinks": ["#Microsoft.VideoAnalyzer.IotHubMessageSink", "#Microsoft.VideoAnalyzer.FileSink", "#Microsoft.VideoAnalyzer.VideoSink"]
}
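This definition enumerates the node kinds the designer can place on the canvas, keyed by their `@type` discriminator. A helper of roughly this shape (hypothetical, not code from the extension; the processor list is a subset for brevity) could classify a node against such a definition:

```typescript
type NodeCategory = "source" | "processor" | "sink" | "unknown";

// Node kinds from the availability list above (processors abbreviated).
const available = {
    sources: ["#Microsoft.VideoAnalyzer.RtspSource", "#Microsoft.VideoAnalyzer.IotHubMessageSource"],
    processors: ["#Microsoft.VideoAnalyzer.MotionDetectionProcessor", "#Microsoft.VideoAnalyzer.GrpcExtension"],
    sinks: ["#Microsoft.VideoAnalyzer.IotHubMessageSink", "#Microsoft.VideoAnalyzer.FileSink", "#Microsoft.VideoAnalyzer.VideoSink"]
};

// Map a node's @type discriminator to its category on the canvas.
function categorize(nodeType: string): NodeCategory {
    if (available.sources.includes(nodeType)) return "source";
    if (available.processors.includes(nodeType)) return "processor";
    if (available.sinks.includes(nodeType)) return "sink";
    return "unknown";
}
```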

View file

@@ -1,4 +1,4 @@
import DefinitionGenerator from "./DefinitionGenerator";
// constructor generates files as side effect
new DefinitionGenerator("ava1.0", "Definitions");
new DefinitionGenerator("ava1.1", "Definitions");

View file

@@ -0,0 +1,40 @@
export const limitOnePerGraph = [
// Only one RTSP source is allowed per graph topology.
"#Microsoft.VideoAnalyzer.RtspSource"
];
export const mustBeImmediatelyDownstreamOf = [
// Motion detection processor: Must be immediately downstream from RTSP source.
["#Microsoft.VideoAnalyzer.MotionDetectionProcessor", ["#Microsoft.VideoAnalyzer.RtspSource"]],
// Signal gate processor: Must be immediately downstream from RTSP source.
["#Microsoft.VideoAnalyzer.SignalGateProcessor", ["#Microsoft.VideoAnalyzer.RtspSource"]],
// File sink: Must be immediately downstream from signal gate processor.
["#Microsoft.VideoAnalyzer.FileSink", ["#Microsoft.VideoAnalyzer.SignalGateProcessor"]],
// Line Crossing processor: Must be immediately downstream from object tracking processor.
["#Microsoft.VideoAnalyzer.LineCrossingProcessor", ["#Microsoft.VideoAnalyzer.ObjectTrackingProcessor"]]
];
export const cannotBeImmediatelyDownstreamOf = [
// File sink: Cannot be immediately downstream of HTTP extension processor or motion detection processor.
["#Microsoft.VideoAnalyzer.FileSink", "#Microsoft.VideoAnalyzer.HttpExtension"],
["#Microsoft.VideoAnalyzer.FileSink", "#Microsoft.VideoAnalyzer.MotionDetectionProcessor"],
// IoT Hub Sink: Cannot be immediately downstream of an IoT Hub Source.
["#Microsoft.VideoAnalyzer.IotHubMessageSink", "#Microsoft.VideoAnalyzer.IotHubMessageSource"]
];
export const cannotBeDownstreamOf = [
// Motion detection processor: Cannot be used downstream of a graph extension processor.
["#Microsoft.VideoAnalyzer.MotionDetectionProcessor", "#Microsoft.VideoAnalyzer.HttpExtension"],
["#Microsoft.VideoAnalyzer.MotionDetectionProcessor", "#Microsoft.VideoAnalyzer.GrpcExtension"],
// Motion detection processors cannot be in sequence
["#Microsoft.VideoAnalyzer.MotionDetectionProcessor", "#Microsoft.VideoAnalyzer.MotionDetectionProcessor"],
// Signal gate processors cannot be in sequence
["#Microsoft.VideoAnalyzer.SignalGateProcessor", "#Microsoft.VideoAnalyzer.SignalGateProcessor"],
// ObjectTrackingProcessor cannot be downstream of CognitiveServicesProcessor
["#Microsoft.VideoAnalyzer.ObjectTrackingProcessor", "#Microsoft.VideoAnalyzer.CognitiveServicesVisionProcessor"]
];
export const documentationLinks = {
limitationsAtPreview: "https://docs.microsoft.com/en-us/azure/media-services/live-video-analytics-edge/quotas-limitations#limitations-on-graph-topologies-at-preview"
};
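The rule tables above drive topology validation in the graph editor. A minimal checker for the "cannot be immediately downstream of" rules might look like the following sketch (`Edge`, the function name, and the inlined rule subset are illustrative; a full validator would also walk transitive connections for `cannotBeDownstreamOf`):

```typescript
// An edge between two typed nodes: upstream kind -> downstream kind.
type Edge = { from: string; to: string };

// Same shape as cannotBeImmediatelyDownstreamOf above: [node, forbidden upstream].
const forbiddenPairs: [string, string][] = [
    ["#Microsoft.VideoAnalyzer.FileSink", "#Microsoft.VideoAnalyzer.HttpExtension"],
    ["#Microsoft.VideoAnalyzer.FileSink", "#Microsoft.VideoAnalyzer.MotionDetectionProcessor"],
    ["#Microsoft.VideoAnalyzer.IotHubMessageSink", "#Microsoft.VideoAnalyzer.IotHubMessageSource"]
];

// Collect a message for every edge that places a node directly after a forbidden upstream kind.
function findImmediateDownstreamViolations(edges: Edge[]): string[] {
    const violations: string[] = [];
    for (const { from, to } of edges) {
        for (const [node, upstream] of forbiddenPairs) {
            if (to === node && from === upstream) {
                violations.push(`${node} cannot be immediately downstream of ${upstream}`);
            }
        }
    }
    return violations;
}
```

Rules like `mustBeImmediatelyDownstreamOf` invert the check: instead of flagging matching edges, the validator would flag a node whose incoming edges match none of the allowed upstream kinds.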

View file

@@ -0,0 +1,192 @@
import Localizer from "../../Localization/Localizer";
export class SamplesList {
public static gitHubInfo = {
apiUrl: "https://api.github.com/repos/azure/video-analyzer/git/trees/main?recursive=1"
};
public static getCommandBarItems = (menuItemOnClick: () => void) => {
return [
{
text: Localizer.l("sample.group.continuousRecording"),
key: "sample.group.continuousRecording",
subMenuProps: {
items: [
{
text: Localizer.l("sample.cvr-video-sink"),
key: "pipelines/live/topologies/cvr-video-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.cvr-with-grpcExtension"),
key: "pipelines/live/topologies/cvr-with-grpcExtension/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.cvr-with-httpExtension"),
key: "pipelines/live/topologies/cvr-with-httpExtension/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.cvr-with-motion"),
key: "pipelines/live/topologies/cvr-with-motion/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.audio-video"),
key: "pipelines/live/topologies/audio-video/topology.json",
onClick: menuItemOnClick
}
]
}
},
{
text: Localizer.l("sample.group.eventBasedVideRecording"),
key: "sample.group.eventBasedVideRecording",
subMenuProps: {
items: [
{
text: Localizer.l("sample.evr-grpcExtension-video-sink"),
key: "pipelines/live/topologies/evr-grpcExtension-video-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.evr-httpExtension-video-sink"),
key: "pipelines/live/topologies/evr-httpExtension-video-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.evr-hubMessage-video-sink"),
key: "pipelines/live/topologies/evr-hubMessage-video-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.evr-hubMessage-file-sink"),
key: "pipelines/live/topologies/evr-hubMessage-file-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.evr-motion-video-sink-file-sink"),
key: "pipelines/live/topologies/evr-motion-video-sink-file-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.evr-motion-video-sink"),
key: "pipelines/live/topologies/evr-motion-video-sink/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.evr-motion-file-sink"),
key: "pipelines/live/topologies/evr-motion-file-sink/topology.json",
onClick: menuItemOnClick
}
]
}
},
{
text: Localizer.l("sample.group.motionDetection"),
key: "sample.group.motionDetection",
subMenuProps: {
items: [
{
text: Localizer.l("sample.motion-detection"),
key: "pipelines/live/topologies/motion-detection/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.motion-with-grpcExtension"),
key: "pipelines/live/topologies/motion-with-grpcExtension/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.motion-with-httpExtension"),
key: "pipelines/live/topologies/motion-with-httpExtension/topology.json",
onClick: menuItemOnClick
}
]
}
},
{
text: Localizer.l("sample.group.extensions"),
key: "sample.group.extensions",
subMenuProps: {
items: [
{
text: Localizer.l("sample.httpExtension"),
key: "pipelines/live/topologies/httpExtension/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.httpExtensionOpenVINO"),
key: "pipelines/live/topologies/httpExtensionOpenVINO/topology.json",
onClick: menuItemOnClick
}
]
}
},
{
text: Localizer.l("sample.group.computerVision"),
key: "sample.group.computerVision",
subMenuProps: {
items: [
{
text: Localizer.l("sample.ava-spatial-analysis-person-count"),
key: "pipelines/live/topologies/spatial-analysis/person-count-operation-topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.ava-spatial-analysis-person-crossing-line"),
key: "pipelines/live/topologies/spatial-analysis/person-line-crossing-operation-topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.ava-spatial-analysis-person-crossing-zone"),
key: "pipelines/live/topologies/spatial-analysis/person-zone-crossing-operation-topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.ava-spatial-analysis-person-distance"),
key: "pipelines/live/topologies/spatial-analysis/person-distance-operation-topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.ava-spatial-analysis-custom"),
key: "pipelines/live/topologies/spatial-analysis/custom-operation-topology.json",
onClick: menuItemOnClick
}
]
}
},
{
text: Localizer.l("sample.group.aiComposition"),
key: "sample.group.aiComposition",
subMenuProps: {
items: [
{
text: Localizer.l("sample.ai-composition"),
key: "pipelines/live/topologies/ai-composition/topology.json",
onClick: menuItemOnClick
}
]
}
},
{
text: Localizer.l("sample.group.miscellaneous"),
key: "sample.group.miscellaneous",
subMenuProps: {
items: [
{
text: Localizer.l("sample.object-tracking"),
key: "pipelines/live/topologies/object-tracking/topology.json",
onClick: menuItemOnClick
},
{
text: Localizer.l("sample.line-crossing"),
key: "pipelines/live/topologies/line-crossing/topology.json",
onClick: menuItemOnClick
}
]
}
}
];
};
}
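`SamplesList` points the sample gallery at the GitHub `git/trees` API with `?recursive=1`, and each menu item's `key` is a path inside that tree. Filtering such a response down to topology documents can be sketched as follows (the interfaces model only the slice of the GitHub response that is needed here; the function name is illustrative):

```typescript
// Minimal slice of the GitHub git/trees API response shape.
interface TreeEntry {
    path: string;
    type: "blob" | "tree";
}
interface TreeResponse {
    tree: TreeEntry[];
}

// Collect the paths of pipeline topology documents from a recursive tree listing.
function listTopologyPaths(response: TreeResponse): string[] {
    return response.tree
        .filter((e) => e.type === "blob" && e.path.startsWith("pipelines/live/topologies/") && e.path.endsWith(".json"))
        .map((e) => e.path);
}
```

Matching the returned paths against the `key` values above is then a simple set lookup, which is how a gallery could tolerate samples being added to or removed from the repository.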

View file

@@ -0,0 +1,13 @@
{
"Endpoint.url": "urlFormat",
"FileSink.filePathPattern": "number",
"VideoCreationProperties.segmentLength": "isoDuration",
"SignalGateProcessor.activationEvaluationWindow": "isoDuration",
"SignalGateProcessor.activationSignalOffset": "isoDuration",
"SignalGateProcessor.minimumActivationTime": "isoDuration",
"SignalGateProcessor.maximumActivationTime": "isoDuration",
"ImageScale.width": "number",
"ImageScale.height": "number",
"VideoSink.localMediaCacheMaximumSizeMiB": "number",
"SamplingOptions.maximumSamplesPerSecond": "number"
}

View file

@@ -0,0 +1,25 @@
{
"AssetSink.segmentLength": {
"placeholder": "Enter value in seconds"
},
"IoTHubMessageSink.hubOutputName": {
"placeholder": "Enter the name of the IoT Hub output"
},
"SignalGateProcessor.activationEvaluationWindow": {
"placeholder": "Enter value in seconds"
},
"SignalGateProcessor.activationSignalOffset": {
"placeholder": "Enter value in seconds"
},
"SignalGateProcessor.minimumActivationTime": {
"placeholder": "Enter value in seconds"
},
"SignalGateProcessor.maximumActivationTime": {
"placeholder": "Enter value in seconds"
},
".nodeName": {
"title": "Node name",
"description": "The name of the node",
"placeholder": ""
}
}

File diff not shown because it is too large

File diff not shown because it is too large

View file

@@ -0,0 +1,38 @@
{
"FileSink.filePathPattern": {
"type": "minMaxLength",
"value": [1, 260]
},
"VideoCreationProperties.segmentLength": {
"type": "minMaxValue",
"value": [30, 300, "seconds"]
},
"IotHubMessageSink.hubOutputName": {
"type": "maxLength",
"value": 50
},
"SignalGateProcessor.activationEvaluationWindow": {
"type": "minMaxValue",
"value": [0, 10, "seconds"]
},
"SignalGateProcessor.activationSignalOffset": {
"type": "minMaxValue",
"value": [-60, 60, "seconds"]
},
"SignalGateProcessor.minimumActivationTime": {
"type": "minMaxValue",
"value": [10, 3600, "seconds"]
},
"SignalGateProcessor.maximumActivationTime": {
"type": "minMaxValue",
"value": [10, 3600, "seconds"]
},
"ImageScale.width": {
"type": "minValue",
"value": 1
},
"ImageScale.height": {
"type": "minValue",
"value": 1
}
}
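Each entry above pairs a property path with a bounds rule (`minMaxValue` carries a unit as its third element). Interpreting those rules could look like this sketch (`Rule` and `checkRule` are illustrative, not the extension's actual validator):

```typescript
// The rule shapes that appear in the validation JSON above.
type Rule =
    | { type: "maxLength"; value: number }
    | { type: "minMaxLength"; value: [number, number] }
    | { type: "minValue"; value: number }
    | { type: "minMaxValue"; value: [number, number, string] };

// Returns an error message, or undefined when the value satisfies the rule.
function checkRule(rule: Rule, value: string | number): string | undefined {
    switch (rule.type) {
        case "maxLength":
            return String(value).length <= rule.value ? undefined : `must be at most ${rule.value} characters`;
        case "minMaxLength": {
            const [min, max] = rule.value;
            const len = String(value).length;
            return len >= min && len <= max ? undefined : `length must be between ${min} and ${max}`;
        }
        case "minValue":
            return Number(value) >= rule.value ? undefined : `must be at least ${rule.value}`;
        case "minMaxValue": {
            const [min, max, unit] = rule.value;
            const n = Number(value);
            return n >= min && n <= max ? undefined : `must be between ${min} and ${max} ${unit}`;
        }
    }
}
```

For example, `VideoCreationProperties.segmentLength` above would accept 45 but reject 10, since its `minMaxValue` range is 30 to 300 seconds.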