Home

Right now the Wiki only applies to the in-development update of the project happening in the 2018.3-packman branch.

How the Facial AR Remote works

Networking

The remote is made up of a lightweight client iOS app paired with a Network Stream source in the editor. The client makes use of the latest additions to ARKit and sends that data over the network to the Network Stream source. Using a simple TCP/IP socket and a fixed-size byte stream, every frame of blendshape, camera, and head pose data is sent from the device to the editor. The editor then decodes the stream and uses it to update one or more rigged characters in real time.
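
To make the fixed-size byte stream idea concrete, here is a rough Python sketch of how such a frame protocol can be packed and unpacked. The project itself is built in Unity, and the field layout, counts, and function names below are assumptions for illustration, not the remote's actual wire format.

```python
# Minimal sketch of a fixed-size frame protocol over a TCP socket.
# Field layout and counts are assumptions, not the remote's actual format.
import socket
import struct

BLENDSHAPE_COUNT = 52                       # ARKit face blendshape coefficients
# timestamp + blendshapes + head pose (pos xyz, rot xyzw) + camera pose (pos xyz, rot xyzw)
FRAME_FORMAT = "<d" + "f" * (BLENDSHAPE_COUNT + 7 + 7)
FRAME_SIZE = struct.calcsize(FRAME_FORMAT)

def send_frame(sock, timestamp, blendshapes, head_pose, camera_pose):
    """Pack one frame into the fixed-size layout and push it over the socket."""
    payload = struct.pack(FRAME_FORMAT, timestamp, *blendshapes, *head_pose, *camera_pose)
    sock.sendall(payload)

def read_frame(sock):
    """Block until a full frame has arrived, then decode it."""
    buf = b""
    while len(buf) < FRAME_SIZE:
        chunk = sock.recv(FRAME_SIZE - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    values = struct.unpack(FRAME_FORMAT, buf)
    timestamp = values[0]
    blendshapes = values[1:1 + BLENDSHAPE_COUNT]
    head_pose = values[1 + BLENDSHAPE_COUNT:1 + BLENDSHAPE_COUNT + 7]
    camera_pose = values[1 + BLENDSHAPE_COUNT + 7:]
    return timestamp, blendshapes, head_pose, camera_pose
```

Because every frame is the same size, the reader never has to scan for delimiters; it simply waits for FRAME_SIZE bytes and decodes them.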

Jitter Reduction

To smooth out jitter caused by network latency, the Stream Reader keeps a tunable buffer of historic frames for when the editor inevitably lags behind the phone. We found this to be a crucial feature for keeping the preview character looking smooth while staying as close as possible to the actor's current pose. In poor network conditions, the preview will sometimes drop frames to catch up, but all data is still recorded with the original timestamps from the device.
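
The sketch below illustrates the buffering idea in Python. The class, its parameter names, and the specific catch-up policy are hypothetical; the point is only that the preview stays a few frames behind the device and skips ahead when it falls too far back, while recording keeps every frame.

```python
# Hypothetical sketch of a tunable frame buffer: hold a small window of recent
# frames so playback can lag slightly behind the phone, and skip stale frames
# (from the preview only) when the buffer grows too large.
from collections import deque

class FrameBuffer:
    def __init__(self, target_depth=3, max_depth=12):
        self.frames = deque()
        self.target_depth = target_depth   # how many frames behind the phone to aim for
        self.max_depth = max_depth         # beyond this, drop preview frames to catch up

    def push(self, frame):
        """Called whenever a frame arrives; the recorder still sees every frame."""
        self.frames.append(frame)

    def next_preview_frame(self):
        """Called once per editor update to pick the frame shown on the preview character."""
        if not self.frames:
            return None                    # editor got ahead; hold the last pose
        while len(self.frames) > self.max_depth:
            self.frames.popleft()          # catch up by skipping stale frames
        if len(self.frames) > self.target_depth:
            return self.frames.popleft()
        return self.frames[0]              # stay a few frames behind to absorb jitter
```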

How data is ingested from the remote app

On the editor side, we use the stream data to drive the character for preview as well as for baking animation clips. Since we save the raw stream from the phone to disk, we can continue to play back this data on a character as we refine the blend shapes. And since the saved data is just the raw stream from the phone, you can even re-target the motion to different characters.
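
One way to picture the re-targeting step is shown below: each rig supplies its own mapping from ARKit blendshape names to its own channels, so the same recorded data can drive different characters. The names, mapping shape, and apply_blendshape callback are hypothetical stand-ins, not the project's actual API.

```python
# Sketch of re-targeting one recorded stream to different character rigs.
ARKIT_BLENDSHAPE_NAMES = ["eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthSmileLeft"]  # etc.

def drive_character(frame_blendshapes, rig_mapping, apply_blendshape):
    """frame_blendshapes: values decoded from one recorded frame, in ARKit order.
    rig_mapping: {arkit_name: rig_channel} for whichever character is being driven."""
    for name, value in zip(ARKIT_BLENDSHAPE_NAMES, frame_blendshapes):
        channel = rig_mapping.get(name)
        if channel is not None:            # rigs can simply ignore shapes they don't have
            apply_blendshape(channel, value)
```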

Baking streamed data to an Animation Clip

Once you have captured a stream you're happy with, you can bake it to an animation clip. This is useful because you can then use the clip you authored like any other animation in Unity to drive a character in Mecanim, Timeline, or any of the other ways animation is used. Note that the animation clip is specific to the particular character rig that was used when baking the clip.
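
Conceptually, baking turns the recorded frames into per-channel keyframe curves keyed on the target rig's channels, which is the shape of data an animation clip needs. The Python sketch below is only an illustration of that transformation; the channel naming and data layout are assumptions, not the actual baking code.

```python
# Rough sketch of the baking step: recorded frames -> {rig_channel: [(time, value), ...]},
# with times rebased to start at zero using the original device timestamps.
def bake_clip(frames, rig_mapping, arkit_names):
    """frames: list of (timestamp, blendshape_values) tuples from a recorded stream."""
    if not frames:
        return {}
    start_time = frames[0][0]
    curves = {channel: [] for channel in rig_mapping.values()}
    for timestamp, values in frames:
        t = timestamp - start_time
        for name, value in zip(arkit_names, values):
            channel = rig_mapping.get(name)
            if channel is not None:
                curves[channel].append((t, value))
    return curves
```

Because the curves are built against one rig's channel mapping, the resulting clip is tied to that rig, which is why a clip baked for one character will not automatically fit another.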