The following is an example of a volumetric capture workflow using four Azure Kinect cameras. The person stands in the middle of a one-square-meter volumetric capture stage, with four cameras facing her at 90-degree angles to each other. This type of setup is the most popular, as it offers the best balance of quality and price, but setups with more cameras are also supported. It allows you to create a 360-degree pointcloud covering the whole body of the person.
4x Azure Kinect setup with 1×1 meter volumetric capture stage
All cameras are cable-synchronized and run on a single computer. Through EF EVE™ Volcapp the user can control the whole network of devices and build a powerful, highly dynamic volumetric capture system. Four Volcapp subscriptions are needed to run a four-camera setup.
One computer can handle all of your cameras
EF EVE™ Volcapp is known for having the most complete set of settings for robust volumetric capture. With over 14 advanced features, such as auto-calibration and real-time pointcloud filtering, it is a power-packed software package for your Azure Kinect.
Two EVE files are combined to produce a 360-degree overlapping pointcloud. EF EVE™ Creator is the ultimate tool to edit, improve the quality of, and export volumetric video recordings. A number of post-processing features enable you to create a watertight mesh finish and optimize it for a fast web browser experience.
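The idea behind combining per-camera captures can be sketched in a few lines: once each camera's pose in the shared coordinate system is known, every cloud is transformed into that frame and stacked. This is a generic numpy illustration of the principle, not the EVE format or Creator's actual pipeline; the function name and (R, t) representation are assumptions.

```python
import numpy as np

def merge_clouds(clouds, extrinsics):
    """Transform each camera's pointcloud into the shared coordinate
    system and stack them into one 360-degree cloud.

    clouds: list of (N_i, 3) arrays of XYZ points, one per camera.
    extrinsics: list of (R, t) pairs mapping camera i's frame to world,
    where R is a (3, 3) rotation and t a (3,) translation.
    """
    merged = [pts @ R.T + t for pts, (R, t) in zip(clouds, extrinsics)]
    return np.vstack(merged)
```

With four calibrated cameras the same call takes four clouds and four extrinsics; the overlap between neighboring views is what closes the cloud into a full 360 degrees.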
EF EVE™ Volcapp features
The most complete set of functionality on the market. It enhances your 3D camera's capabilities, from capture and evaluation to software development.
Pointcloud calibration into a single coordinate system is the most important part of the volumetric capture process.
EF EVE™ Volcapp automatically detects markers and uses advanced algorithms to estimate the exact positions of the 3D cameras. Print the calibration markers on an A4 sheet of paper and follow the tutorials in our Help Center. Calibrating four cameras usually takes around 5 minutes.
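At the core of marker-based calibration is recovering each camera's rigid transform from matched marker positions seen by two cameras. Below is a minimal sketch of one standard approach to that sub-problem, the Kabsch/SVD method; it is a generic illustration under assumed inputs, not Volcapp's proprietary algorithm.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t so that R @ p + t maps
    points from `src` onto `dst`.

    src, dst: (N, 3) arrays of matching 3D marker positions as seen by
    two cameras. Uses the Kabsch/SVD method on the centered point sets.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Chaining such pairwise transforms brings every camera into one shared coordinate system, which is the goal of the calibration step.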
Capture only what you need – remove the environment
The bounding box enables quick and simple pointcloud removal. You can adjust the box to match the size of your capture stage. The system records only what is inside the bounding box, producing a much smaller file and leaving less work for later. We have also made presets for standard volumetric capture stage dimensions.
Cleaning pointclouds in real time is a must for today’s volumetric video capture
We created a set of advanced and sensitive 3D filters to produce the most accurate pointclouds possible. Because pointcloud quality depends on distance, object materials, and environmental conditions, real-time pointcloud filtering is essential. You will be able to quickly test and find the settings that give the best results.
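One common family of such filters is statistical outlier removal: points whose mean distance to their nearest neighbours is abnormally large are treated as noise. The sketch below is a generic, brute-force numpy illustration of that technique (not Volcapp's filters, which would need a spatial index to run in real time); `k` and `std_ratio` are assumed tuning parameters.

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than `std_ratio` standard deviations above the cloud average.

    Uses O(N^2) pairwise distances, so it is only suitable for small
    demo clouds; real-time filters use a k-d tree or voxel grid instead.
    """
    points = np.asarray(points)
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # mean distance to the k nearest neighbours (column 0 is self, at 0)
    knn_mean = np.sort(dist, axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return points[knn_mean <= threshold]
```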
OBJ | PLY | GLTF
It is easy, fast, and reliable to work with our patented Depth Engine and EVE format. As soon as you are ready to move your content to third-party apps, you can export it to industry-standard formats. Learn more here.
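Of the listed targets, PLY is the simplest to illustrate: an ASCII header declaring the vertex count and properties, followed by one line per point. The minimal writer below is a generic sketch of the format, not EF EVE™'s exporter; it handles points only (no color or faces).

```python
def write_ascii_ply(path, points):
    """Write an iterable of (x, y, z) points as a minimal ASCII PLY file."""
    points = list(points)
    with open(path, "w") as f:
        f.write("ply\n")
        f.write("format ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\n")
        f.write("property float y\n")
        f.write("property float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")
```

OBJ is similarly line-oriented (`v x y z` per vertex), while glTF is a JSON scene description with binary buffers, which is why it is the usual choice for web playback.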
Volumetric Video Use Cases – We’ve got you covered!
Let’s dive into some of the most popular Volumetric video use cases.
AR & VR Content
Location Based Entertainment
Film & Storytelling
Customer support is always here to help you
Our support team works around the clock to assist you whenever you have a question. Drop us a message or have a look at our Help Center: Tutorials, FAQs, Troubleshooting.
1. Why are some of my cameras not shown in the camera list?
There are multiple possible reasons. To ensure that a camera is shown, try these steps: unplug all the cables from the camera, wait a couple of seconds, and plug them all back in. Make sure that no other program is using the camera. After all this, refresh the camera list. If your camera is still not shown, try launching the Azure Kinect Viewer (also known as k4aviewer) and check whether that program can find your camera. If even the viewer cannot find it, consult the Microsoft support team.
2. Why are my cameras missing frames?
Your cameras are most likely missing frames because of an unsupported USB host controller (as stated by Microsoft, on Windows the Intel, Texas Instruments (TI), and Renesas host controllers are the only ones supported). Alternatively, you might have insufficient hardware (usually the CPU), or your disk might currently be in use by another program. We have also noticed that on unsupported USB host controllers you can get more stable results by using fewer of the USB ports (for example, only one port on a hub that has four).
3. Why am I unable to log in?
One of the most common issues is that Creator must be launched with administrator privileges, or else you won't be able to log in. Alternatively, our servers might be down, you might not have an internet connection, or someone else might already be using your account.
4. How do I calibrate my scene?
There are several tools for calibrating your scene, all found in the Calibration tab, but the one we suggest everyone use is Stereo Calibration: print out a marker, record a volumetric video while showing the marker to different pairs of cameras, and then use that recording in Creator with the Stereo Calibration function in the Calibration tab.
5. I found a bug
If you find a bug, it would be great if you let us know about it so that we can fix it as soon as possible. You can find ways to contact us here: Contact Us.
It's best if you can provide us with the following information about the bug you encountered: program version, description of the expected behavior, description of the encountered behavior, steps to reproduce the bug, and program logs (found by clicking [Help]->[Open Logs path] in Creator).
Visit our Help Center to find other commonly asked questions as well as other information such as tutorials, feature descriptions, and more.