Green Screen tests should not run as part of functional tests (#729)

* Ready for automation
* Address PR comments and clean up other MD files based on the feedback.

Parent: b64f399a3f
Commit: 467d8272fa
@@ -85,9 +85,9 @@ start a conversation with the Azure Kinect Team to get the Issue Triage Approved
 1) If you have not already, fork the repo.
 1) Make changes.
 1) Test the change, all tests should pass:
-   * ``` CTest -L unit ```
-   * ``` CTest -L function```
-   * ``` CTest -L perf```
+   * ` CTest -L unit `
+   * ` CTest -L function`
+   * ` CTest -L perf`
 1) Create a pull request.
    * The PR description must reference the issue.
 1) An Azure Kinect SDK team member will review the change. See the [review process](#review-process) for more information.

@@ -685,6 +685,11 @@ jobs:
       displayName: 'Run Functional Tests'
       timeoutInMinutes: 15

+    - script: 'python $(Build.SourcesDirectory)/scripts/RunTestList.py --list bin/functional_custom_test_list.txt --bin bin/ --output=xml --gtestargs "--gtest_filter=-*ONBOARDING*"'
+      workingDirectory: '$(System.ArtifactsDirectory)/amd64-windows-msvc-RelWithDebInfo'
+      displayName: 'Run Custom Functional Tests'
+      timeoutInMinutes: 15
+
     - script: 'python $(Build.SourcesDirectory)/scripts/RunTestList.py --list bin/functional_test_list.txt --bin bin/ --output=xml --gtestargs "--gtest_filter=*ONBOARDING*"'
       workingDirectory: '$(System.ArtifactsDirectory)/amd64-windows-msvc-RelWithDebInfo'
       displayName: 'Run Functional Tests - Onboarding'

@@ -822,6 +827,11 @@ jobs:
       displayName: 'Run Functional Tests'
       timeoutInMinutes: 15

+    - script: 'python $(Build.SourcesDirectory)/scripts/RunTestList.py --list bin/functional_custom_test_list.txt --bin bin/ --output=xml --gtestargs "--gtest_filter=-*ONBOARDING*"'
+      workingDirectory: '$(System.ArtifactsDirectory)/x86_64-linux-clang-relwithdebinfo'
+      displayName: 'Run Custom Functional Tests'
+      timeoutInMinutes: 15
+
     - script: 'python $(Build.SourcesDirectory)/scripts/RunTestList.py --list bin/functional_test_list.txt --bin bin/ --output=xml --gtestargs "--gtest_filter=*ONBOARDING*"'
      workingDirectory: '$(System.ArtifactsDirectory)/x86_64-linux-clang-relwithdebinfo'
       displayName: 'Run Functional Tests - Onboarding'

@@ -17,7 +17,15 @@
 #
 # FUNCTIONAL        - Tests meant to run on test machine. These tests run
 #                     quickly (<10s), may require hardware, run on PCs that
-#                     meet min spec requirements, and are reproducible.
+#                     meet min spec requirements, and are reproducible.
+#                     These tests must also be capable of working on a
+#                     single Azure Kinect and not require any additional
+#                     setup to succeed.
+#
+# FUNCTIONAL_CUSTOM - Similar to FUNCTIONAL tests above. These tests
+#                     however are allowed to have additional physical
+#                     requirements like lighting, multiple devices, or
+#                     visible chessboard pattern for calibration.
 #
 # STRESS            - Tests that run repeatedly and look for statistical
 #                     failures

@@ -40,7 +48,7 @@ if (NOT is_defined)
         BRIEF_DOCS "List of types of tests"
         FULL_DOCS "Contains full list of all test types")

-set(TEST_TYPES "UNIT" "FUNCTIONAL" "STRESS" "PERF" "FIRMWARE")
+set(TEST_TYPES "UNIT" "FUNCTIONAL" "STRESS" "PERF" "FIRMWARE" "FUNCTIONAL_CUSTOM")
 set_property(GLOBAL PROPERTY TEST_TYPES ${TEST_TYPES})

 foreach(TEST_TYPE ${TEST_TYPES})

@@ -60,18 +60,18 @@ The following tools are required to build on Windows:
 The following tools are optional:

 * [Doxygen](http://www.doxygen.nl/download.html). Add doxygen to the PATH.
-  Required for building documentation. To use, pass the CMake parameter ```-DK4A_BUILD_DOCS=1```
+  Required for building documentation. To use, pass the CMake parameter `-DK4A_BUILD_DOCS=1`

 * [Clang-Format](http://releases.llvm.org/download.html). Please download clang
-  v6.0.0 since that is what we are using to format our code. To invoke, call ```ninja clangformat```
+  v6.0.0 since that is what we are using to format our code. To invoke, call `ninja clangformat`

 If you are building from a command prompt, it **must** be a **x64 Visual Studio
 developer command prompt** in order for CMake to find the installed compilers.
 We build both 32-bit and 64-bit binaries, but 64-bit binaries are the only
 binaries that are tested. (The command prompt should be called something like
 x64 Native Tools Command Prompt for VS 2017). Note: call the command line tool
-with the option ```-arch=amd64``` for x64 builds i.e ```VsDevCmd.bat
--arch=amd64```
+with the option `-arch=amd64` for x64 builds i.e `VsDevCmd.bat
+-arch=amd64`

 **NOTE:** *You can run
 [scripts/verify-windows.ps1](../scripts/verify-windows.ps1) to verify that your

@@ -95,20 +95,20 @@ need for building, but is required running the SDK*
 1. Create a folder named "build" in the root of the git repo and cd into that
    directory.

-    ```shell
+    ```
    mkdir build && cd build
    ```

 2. Run CMake from that directory. The preferred build is ninja. All other
    generators are untested.

-    ```shell
+    ```
    cmake .. -GNinja
    ```

 3. Run the build (ninja).

-    ```shell
+    ```
    ninja
    ```

@@ -11,7 +11,7 @@ To build from source, the Azure Kinect repo must first download the source.
 The Azure Kinect repo uses git submodules to download the source. These submodules are
 initialized on the first CMake configure by executing

-```shell
+```
 git submodule update --init --recursive
 ```

|
@ -15,13 +15,13 @@ Releases are scheduled on demand based on program needs.
|
|||
### Alpha Release
|
||||
|
||||
Alpha builds are built using source from the develop branch and are numbered with the
|
||||
expected release number for current development, such as ```1.1.0-alpha.0```.
|
||||
expected release number for current development, such as `1.1.0-alpha.0`.
|
||||
|
||||
Alpha builds expect heavy churn and are not guaranteed to be backward compatible with each other.
|
||||
|
||||
### Beta Release
|
||||
|
||||
Beta builds are built from the release branches, such as ```1.1.0-beta.0```.
|
||||
Beta builds are built from the release branches, such as `1.1.0-beta.0`.
|
||||
|
||||
Release branches are created when that release is being stabilized, at which point only bug fixes and changes
|
||||
required for that release are merged or cherry-picked in to the release branch. Fixes may alternatively be made
|
||||
|
@@ -30,15 +30,15 @@ to the release branch directly and then merged back to the develop branch.
 ### Official Release

 Once a beta build has been signed off for release, an official build is created with code from that release branch,
-such as ```1.1.0```
+such as `1.1.0`

 ### Patch releases

 Critical changes to a released build may be made in the release branch to patch an existing release. These
 changes do not introduce functionality or break compatibility.

-Changes are made in the release branch for the existing release, such as ```release/1.0.x```, and are verified with beta
-builds for the patch, such as ```1.0.1-beta.0```, before the patch is signed off and released as ```1.0.1```
+Changes are made in the release branch for the existing release, such as `release/1.0.x`, and are verified with beta
+builds for the patch, such as `1.0.1-beta.0`, before the patch is signed off and released as `1.0.1`

 ## Moving changes between release branches

@@ -8,6 +8,8 @@ The Azure Kinect repo has several categories of tests:

 * Unit tests
 * Functional tests
+  * Functional tests that only depend on 1 Azure Kinect
+  * Functional tests that have custom requirements.
 * Stress tests
 * Perf tests
 * Firmware tests

@@ -31,24 +33,35 @@ using the Google Test framework. For a basic example of writing a unit test
 please see
 [tests/UnitTests/queue_ut/queue.cpp](../tests/UnitTests/queue_ut/queue.cpp).

-After compiling, unit tests can be run using "ctest -L unit" in the build
+After compiling, unit tests can be run using `ctest -L unit` in the build
 directory. Unit tests are run as part of the CI system.

 **NOTE:** *These tests must succeed for a pull request to be merged.*

 ### Functional Tests

+#### Single Device
+
 Functional tests are tests which run on the test machine. They must be quick
 (~ <10s), be reproducible, and may require hardware. Functional tests are
 built using the Google Test framework. For a basic example of writing a
 functional test please see
 [tests/example/test.cpp](../tests/example/test.cpp).

-After compiling, functional tests can be run using "ctest -L functional" in the build
-directory. Functional tests are run as part of the CI system.
+After compiling, functional tests can be run using `ctest -L "^functional$"`
+in the build directory. Functional tests are run as part of the CI system.

 **NOTE:** *These tests must succeed for a pull request to be merged.*

+#### Custom Configurations
+
+Not everyone will have access to multiple devices, so we have moved tests with
+extra dependencies into their own test label. Some tests require OpenCV,
+multiple devices, or even a known chessboard. The hope is that users running
+tests outside of automation will be less likely to miss errors if tests that
+would fail outside automation are disabled in these workflows by default. To
+run these tests, use the command `ctest -L "functional_custom"`.
+
 ### Stress Tests

 Stress tests are tests which run the same logic repeatedly to check for

@@ -56,11 +69,10 @@ crashes while the host is under heavy load. These tests will be run on a rolling
 build and may require hardware. Stress tests are built using the Google Test
 framework.

-After compiling, stress tests can be run using "ctest -L stress"
-in the build directory.
+After compiling, stress tests can be run using `ctest -L stress` in the build directory.

 **NOTE:** *At the moment, the Azure Kinect SDK does not have any stress tests. However,
 there are some 'stress-like' tests running as unit tests that run for several
 seconds to detect threading and timing related issues.*

 ### Perf Tests

@@ -69,7 +81,7 @@ Perf tests are tests whose results are purely statistics and not Pass/Fail.
 These tests will be run on a rolling build and may require hardware. Perf
 tests are built using the Google Test framework.

-After compiling, perf tests can be run using "ctest -L perf" in the build directory.
+After compiling, perf tests can be run using `ctest -L perf` in the build directory.

 **NOTE:** *These tests are run on demand*

@@ -100,4 +112,4 @@ To run tests on separate build machine:

 * Copy the bin folder to the target machine.
 * Copy [RunTestList.py](../scripts/RunTestList.py) to the target machine.
-* Use \<xxx\>_test_list.txt files from bin folder as ```list``` input to the script.
+* Use \<xxx\>_test_list.txt files from bin folder as `list` input to the script.

@@ -15,15 +15,15 @@ More details about release versioning can be found [here](releasing.md)
 ## Firmware Versioning

 The Azure Kinect firmware is composed of 4 different firmware versions. These firmware versions are exposed through
-```k4a_hardware_version_t```. Here is a sample of that hardware version:
+`k4a_hardware_version_t`. Here is a sample of that hardware version:

-```shell
+```
 RGB Sensor Version:  1.6.98
 Depth Sensor Version:1.6.70
 Mic Array Version:   1.6.14
 Sensor Config:       5006.27
 ```

-This version can also be simplified as ```1.6.098070014``` where ```098```, ```070```,
-and ```014``` are the patch versions of each component version, converted to a
+This version can also be simplified as `1.6.098070014` where `098`, `070`,
+and `014` are the patch versions of each component version, converted to a
 zero-based 3 digit form, and concatenated.

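The simplification rule described in that hunk can be sketched as follows (the helper name is ours for illustration, not part of the SDK):

```python
def simplified_firmware_version(major, minor, rgb_patch, depth_patch, mic_patch):
    """Build the simplified firmware string: the shared major.minor prefix,
    then each component's patch number zero-padded to 3 digits and
    concatenated in RGB, Depth, Mic Array order."""
    patches = "".join(f"{p:03d}" for p in (rgb_patch, depth_patch, mic_patch))
    return f"{major}.{minor}.{patches}"

# RGB 1.6.98, Depth 1.6.70, Mic Array 1.6.14 -> 1.6.098070014
print(simplified_firmware_version(1, 6, 98, 70, 14))  # 1.6.098070014
```
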
@@ -12,12 +12,12 @@ to efficient implementation on the GPU.

 ## Usage Info

-```shell
+```
 fastpointcloud.exe <output file>
 ```

 Example:

-```shell
+```
 fastpointcloud.exe pointcloud.ply
 ```

@@ -9,12 +9,12 @@ This example also covers adding custom attachments and tags to a recording file.

 ## Usage Info

-```shell
+```
 k4arecord_custom_track.exe <output file>
 ```

 Example:

-```shell
+```
 k4arecord_custom_track.exe custom.mkv
 ```

@@ -32,14 +32,14 @@ format. Note that the color channel needs to be in MJPEG format.

 ## Usage Info

-```shell
+```
 transformation_example capture <output_directory> [device_id]
 transformation_example playback <filename.mkv> [timestamp (ms)] [output_file]
 ```

 ### Examples:

-```shell
+```
 transformation_example capture c:\temp\
 transformation_example capture output 1

@@ -11,12 +11,12 @@ remap() function.

 ## Usage Info

-```shell
+```
 undistort.exe <output file>
 ```

 Example:

-```shell
+```
 undistort.exe undistorted.csv
 ```

@@ -5,14 +5,24 @@ include(k4aTest)

 # Only run green screen tests if OpenCV is installed
 find_package(OpenCV)
-add_executable(executables_ft executables_ft.cpp)
+add_executable(executables_ft executables_ft.cpp) # Single Device Tests
+add_executable(executables_ft_custom executables_ft.cpp) # Multi Device Tests
 if (OpenCV_FOUND)
     target_compile_definitions(executables_ft PRIVATE -DUSE_OPENCV=1 )
+    target_compile_definitions(executables_ft_custom PRIVATE -DUSE_OPENCV=1 )
 endif()
+target_compile_definitions(executables_ft PRIVATE -DUSE_CUSTOM_TEST_CONFIGURATION=0 )
+target_compile_definitions(executables_ft_custom PRIVATE -DUSE_CUSTOM_TEST_CONFIGURATION=1 )

 target_link_libraries(executables_ft PRIVATE
     k4a::k4a
     k4ainternal::utcommon
     gtest::gtest)

+target_link_libraries(executables_ft_custom PRIVATE
+    k4a::k4a
+    k4ainternal::utcommon
+    gtest::gtest)
+
 k4a_add_tests(TARGET executables_ft HARDWARE_REQUIRED TEST_TYPE FUNCTIONAL)
+k4a_add_tests(TARGET executables_ft_custom HARDWARE_REQUIRED TEST_TYPE FUNCTIONAL_CUSTOM)

@@ -43,6 +43,7 @@ static int run_and_record_executable(std::string shell_command_path, std::string
     // In Linux, forking a process causes the underlying buffers to be forked, too. So, because popen uses fork under the
     // hood, there may have been a risk of printing something in both processes. I'm not sure if this could happen in
     // this situation, but better safe than sorry.
     std::cout << "Running: " << formatted_command << std::endl;
+    std::cout.flush();
     FILE *process_stream = POPEN(formatted_command.c_str(), "r");
     if (!process_stream)

@@ -51,7 +52,6 @@ static int run_and_record_executable(std::string shell_command_path, std::string
         return EXIT_FAILURE; // if popen fails, it returns null, which is an error
     }
     int return_code = PCLOSE(process_stream);
     std::cout << "Ran: " << formatted_command << std::endl;
     std::cout << "<==============================================" << std::endl;
     try
     {

@@ -129,6 +129,7 @@ protected:
     }
 };

+#if (USE_CUSTOM_TEST_CONFIGURATION == 0)
 TEST_F(executables_ft, calibration)
 {
     const std::string calibration_path = PATH_TO_BIN("calibration_info");

@@ -170,31 +171,6 @@ TEST_F(executables_ft, enumerate)
     test_stream_against_regexes(&results, &regexes);
 }

-#ifdef USE_OPENCV
-TEST_F(executables_ft, green_screen_single_cam)
-{
-    const std::string green_screen_path = PATH_TO_BIN("green_screen");
-    const std::string green_screen_out = TEST_TEMP_DIR + "/green_screen-single-out.txt";
-    // Calibration timeout for this is 10min due to low light conditions in the lab and slow perf of
-    // cv::findChessboardCorners.
-    ASSERT_EQ(run_and_record_executable(green_screen_path + " 1 9 6 22 1000 4000 2 600 5", green_screen_out),
-              EXIT_SUCCESS);
-}
-
-TEST_F(executables_ft, green_screen_double_cam)
-{
-    const std::string green_screen_path = PATH_TO_BIN("green_screen");
-    const std::string green_screen_out = TEST_TEMP_DIR + "/green_screen-double-out.txt";
-    // Calibration timeout for this is 10min due to low light conditions in the lab and slow perf of
-    // cv::findChessboardCorners.
-    ASSERT_EQ(run_and_record_executable(green_screen_path + " 2 9 6 22 1000 4000 2 600 5", green_screen_out),
-              EXIT_SUCCESS);
-    std::ifstream results(green_screen_out.c_str());
-    std::vector<std::string> regexes{ "Finished calibrating!" };
-    test_stream_against_regexes(&results, &regexes);
-}
-#endif
-
 TEST_F(executables_ft, fastpointcloud)
 {
     const std::string fastpoint_path = PATH_TO_BIN("fastpointcloud");

@@ -291,6 +267,40 @@ TEST_F(executables_ft, undistort)
     // don't bother checking the csv file- just make sure it's there
     ASSERT_TRUE(undistort_results.good());
 }
+#endif // USE_CUSTOM_TEST_CONFIGURATION == 0
+
+#if (USE_CUSTOM_TEST_CONFIGURATION == 1)
+#ifdef USE_OPENCV
+TEST_F(executables_ft, green_screen_single_cam)
+#else
+TEST_F(executables_ft, DISABLED_green_screen_single_cam)
+#endif
+{
+    const std::string green_screen_path = PATH_TO_BIN("green_screen");
+    const std::string green_screen_out = TEST_TEMP_DIR + "/green_screen-single-out.txt";
+    // Calibration timeout for this is 10min due to low light conditions in the lab and slow perf of
+    // cv::findChessboardCorners.
+    ASSERT_EQ(run_and_record_executable(green_screen_path + " 1 9 6 22 1000 4000 2 600 5", green_screen_out),
+              EXIT_SUCCESS);
+}
+
+#ifdef USE_OPENCV
+TEST_F(executables_ft, green_screen_double_cam)
+#else
+TEST_F(executables_ft, DISABLED_green_screen_double_cam)
+#endif
+{
+    const std::string green_screen_path = PATH_TO_BIN("green_screen");
+    const std::string green_screen_out = TEST_TEMP_DIR + "/green_screen-double-out.txt";
+    // Calibration timeout for this is 10min due to low light conditions in the lab and slow perf of
+    // cv::findChessboardCorners.
+    ASSERT_EQ(run_and_record_executable(green_screen_path + " 2 9 6 22 1000 4000 2 600 5", green_screen_out),
+              EXIT_SUCCESS);
+    std::ifstream results(green_screen_out.c_str());
+    std::vector<std::string> regexes{ "Finished calibrating!" };
+    test_stream_against_regexes(&results, &regexes);
+}
+#endif // USE_CUSTOM_TEST_CONFIGURATION == 1

 int main(int argc, char **argv)
 {

|
@ -9,4 +9,4 @@ target_link_libraries(multidevice_ft PRIVATE
|
|||
k4a::k4a
|
||||
k4ainternal::utcommon)
|
||||
|
||||
k4a_add_tests(TARGET multidevice_ft TEST_TYPE FUNCTIONAL)
|
||||
k4a_add_tests(TARGET multidevice_ft TEST_TYPE FUNCTIONAL_CUSTOM)
|
||||
|
|
|
@@ -48,7 +48,7 @@ public:
     k4a_device_t m_device2 = nullptr;
 };

-TEST_F(multidevice_ft, DISABLED_open_close_two)
+TEST_F(multidevice_ft, open_close_two)
 {
     ASSERT_LE((uint32_t)2, k4a_device_get_installed_count());

@@ -73,7 +73,7 @@ TEST_F(multidevice_ft, DISABLED_open_close_two)
     m_device2 = NULL;
 }

-TEST_F(multidevice_ft, DISABLED_stream_two_1_then_2)
+TEST_F(multidevice_ft, stream_two_1_then_2)
 {
     k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;

@@ -115,7 +115,7 @@ TEST_F(multidevice_ft, DISABLED_stream_two_1_then_2)
     m_device2 = NULL;
 }

-TEST_F(multidevice_ft, DISABLED_stream_two_2_then_1)
+TEST_F(multidevice_ft, stream_two_2_then_1)
 {
     k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;

@@ -157,7 +157,7 @@ TEST_F(multidevice_ft, DISABLED_stream_two_2_then_1)
     m_device1 = NULL;
 }

-TEST_F(multidevice_ft, DISABLED_ensure_color_camera_is_enabled)
+TEST_F(multidevice_ft, ensure_color_camera_is_enabled)
 {
     bool master_device_found = false;
     bool subordinate_device_found = false;

@@ -3,5 +3,5 @@
 This is a simple command line tool to determine the version of the Azure Kinect plugin
 that is currently in the loader's path.

-```deversion``` will print out the version of the depth engine plugin it loads. If
-```deversion``` is unable to load a depth engine it will not print out a version.
+`deversion` will print out the version of the depth engine plugin it loads. If
+`deversion` is unable to load a depth engine it will not print out a version.

@@ -7,7 +7,7 @@ the most recent frames to the specified folder when notified by [Azure Kinect Fa

 ## Usage Info

-```shell
+```
 d:\fastcapture_streaming.exe /?
 * fastcapture_streaming.exe Usage Info *

@@ -21,4 +21,5 @@ Examples:

 2 - fastcapture_streaming.exe -DirectoryPath C:\data\ -PcmShift 5 -StreamingLength 1000 -ExposureValue -3

-3 - fastcapture_streaming.exe -d C:\data\ -s 4 -l 60 -e -2```
+3 - fastcapture_streaming.exe -d C:\data\ -s 4 -l 60 -e -2
+```

@@ -9,6 +9,6 @@ Streaming tool is already running.

 ## Usage Info

-Run ```fastcapture_trigger.exe``` to take a capture.
+Run `fastcapture_trigger.exe` to take a capture.

-Run ```fastcapture_trigger.exe``` to exit the streaming process.
+Run `fastcapture_trigger.exe` to exit the streaming process.

@@ -10,8 +10,6 @@ To use, select a device from the list, click "Open Device", choose the settings
 k4aviewer will try to detect if you have a high-DPI system and scale automatically; however, if you want to force it into or out of
 high-DPI mode, you can pass -HighDPI or -NormalDPI to override that behavior.

-```shell
-k4aviewer.exe [-HighDPI|-NormalDPI]
-```
-
+```
+k4aviewer.exe [-HighDPI|-NormalDPI]
+```