update readme.md
Parent: ee431035a8
Commit: c3217141e7

@@ -21,7 +21,7 @@ A compiled Windows application that connects to a Kinect sensor device and provi

A compiled Windows application that loads human joint position CSV files produced by the KinectReader or other tools, as well as optional corresponding video and audio files. It provides a timeline-based method to trim audio, video, and joint movement sequences into representative human gestures.

#### **[LabanEditor: ](https://github.com/microsoft/LabanotationSuite/tree/master/GestureAuthoringTools/LabanEditor) Gesture Analysis and Labanotation Generator**

-A Python application that loads a Kinect joint CSV file representing a human gesture, provides algorithmic options for automatically extracting keyframes from the gesture that correspond to Labanotation data, and provides a graphical user interface for selection and modification of the extracted keyframes. Additionally, it saves the resulting gesture data in a JSON file format suitable for controlling robots running a gesture interpretation driver, as well as .png graphic file renderings of the charts and diagrams used in the interface.
+A Python application that loads a Kinect joint CSV file representing a human gesture, provides algorithmic options for automatically extracting keyframes from the gesture that correspond to Labanotation data, and provides a graphical user interface for selection and modification of the extracted keyframes. Additionally, it saves the resulting gesture data in a JSON file format suitable for controlling robots running a gesture interpretation driver, as well as PNG graphic file renderings of the charts and diagrams used in the interface.
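
As a rough illustration only, here is a minimal Python sketch (not part of the project) that opens such a saved gesture JSON file and reports its top-level structure. The file name `gesture.json` is a placeholder, and the actual key layout is defined by LabanEditor's output format.

```python
# Illustrative only: inspect a gesture JSON file saved by LabanEditor.
# "gesture.json" is a placeholder name; the real key layout depends on
# LabanEditor's output, so this just reports the top-level structure.
import json

with open("gesture.json", "r") as f:
    gesture = json.load(f)

if isinstance(gesture, dict):
    for key, value in gesture.items():
        print(key, type(value).__name__)
else:
    print(type(gesture).__name__, "with", len(gesture), "entries")
```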

#### **[MSRAbotSimulation: ](https://github.com/microsoft/LabanotationSuite/tree/master/MSRAbotSimulation) Gesture Performance with Simulated Robot**

This Python and browser-based simulation software uses JavaScript and HTML code to implement an animated 3D model of the robot and a user interface for selecting and rendering gestures described in the JSON format. A temporary local HTTP server invoked with Python, or an existing web server, can be used to host the software, and the simulation runs within a modern web browser. The user can choose from a collection of sample gestures, or select a new gesture captured and created using this project's Gesture Authoring Tools.
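
One convenient way to host the simulation locally is Python's built-in `http.server` module, for example by running `python -m http.server` from the directory that contains the simulation files. Below is a minimal sketch of the programmatic equivalent; it assumes it is started from that directory, and the port number is arbitrary.

```python
# Minimal sketch: serve the current directory over HTTP so the browser-based
# simulation can be opened at http://localhost:8000 (the port is arbitrary).
# Roughly equivalent to running `python -m http.server 8000` from the same folder.
from http.server import HTTPServer, SimpleHTTPRequestHandler

if __name__ == "__main__":
    httpd = HTTPServer(("localhost", 8000), SimpleHTTPRequestHandler)
    print("Serving on http://localhost:8000 ... press Ctrl+C to stop")
    httpd.serve_forever()
```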