# Azure Kinect Fastpointcloud Example

## Introduction

The Azure Kinect Fastpointcloud example computes a 3D point cloud from a depth map. The example precomputes a lookup table that stores an x- and a y-scale factor for every pixel. At runtime, the 3D X-coordinate of a pixel in millimeters is obtained by multiplying the pixel's depth value with the corresponding x-scale factor, and the 3D Y-coordinate by multiplying with the y-scale factor; the 3D Z-coordinate is the depth value itself.

This method is an alternative to calling `k4a_transformation_depth_image_to_point_cloud()` and lends itself to an efficient GPU implementation, since each output point is produced by a single multiply per coordinate.
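The sketch below illustrates the idea under stated assumptions: helper names (`create_xy_table`, `generate_point_cloud`) and the exact structure are illustrative rather than the example's actual code, which lives in `main.cpp`. It only relies on documented SDK calls (`k4a_calibration_2d_to_3d`, `k4a_image_get_buffer`).

```cpp
// Minimal sketch of the lookup-table approach (illustrative, not the shipped example).
#include <k4a/k4a.h>
#include <cmath>
#include <cstdint>
#include <vector>

struct Point3f { float x, y, z; };

// Precompute the x- and y-scale factors for every pixel by unprojecting each
// pixel at a depth of 1 mm with the depth camera's intrinsics.
static std::vector<k4a_float2_t> create_xy_table(const k4a_calibration_t &calibration)
{
    const int width = calibration.depth_camera_calibration.resolution_width;
    const int height = calibration.depth_camera_calibration.resolution_height;
    std::vector<k4a_float2_t> xy_table(static_cast<size_t>(width) * height);

    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            k4a_float2_t p = { { static_cast<float>(x), static_cast<float>(y) } };
            k4a_float3_t ray;
            int valid = 0;
            k4a_calibration_2d_to_3d(
                &calibration, &p, 1.0f, K4A_CALIBRATION_TYPE_DEPTH, K4A_CALIBRATION_TYPE_DEPTH, &ray, &valid);

            k4a_float2_t &entry = xy_table[static_cast<size_t>(y) * width + x];
            if (valid)
            {
                entry.xy.x = ray.xyz.x; // x-scale factor for this pixel
                entry.xy.y = ray.xyz.y; // y-scale factor for this pixel
            }
            else
            {
                entry.xy.x = entry.xy.y = nanf(""); // pixel has no valid unprojection
            }
        }
    }
    return xy_table;
}

// Convert a depth image to a point cloud with the precomputed table:
// X = depth * x-scale, Y = depth * y-scale, Z = depth (all in millimeters).
static std::vector<Point3f> generate_point_cloud(const k4a_image_t depth_image,
                                                 const std::vector<k4a_float2_t> &xy_table)
{
    const uint16_t *depth = reinterpret_cast<const uint16_t *>(k4a_image_get_buffer(depth_image));
    std::vector<Point3f> points;
    points.reserve(xy_table.size());

    for (size_t i = 0; i < xy_table.size(); i++)
    {
        if (depth[i] != 0 && !std::isnan(xy_table[i].xy.x))
        {
            const float d = static_cast<float>(depth[i]);
            points.push_back({ xy_table[i].xy.x * d, xy_table[i].xy.y * d, d });
        }
    }
    return points;
}
```

The table depends only on the camera intrinsics, so it is computed once per device configuration and reused for every frame.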

## Usage Info

```
fastpointcloud.exe <output file>
```

Example:

```
fastpointcloud.exe pointcloud.ply
```