Integration Description #
We have been working together with the Notch team to turbo-charge your Volumetric Video production. Notch motion graphics and real-time VFX software is trusted by the world’s biggest artists and brands, such as Billie Eilish, Gorillaz, and the 2020 MTV Video Music Awards, to mention a few. Notch is the first tool that enables you to create both interactive and high-quality video content in one unified real-time environment.
How to use #
- To start working with EF EVE™ volumetric video in Notch, you will need an .OBJ sequence. You can get a free sample here or record your own using the EF EVE™ software and 1 to 10 Azure Kinect cameras.
- Once you have your files, import them into Notch by going to “Resource” and choosing “.obj sequence”. You only need to select the first frame from the list; the rest will be imported automatically.
- Once you have your file, drag it into the scene and you will have your volumetric video asset. However, you will also need to import the image files to get color data. Again, go to “Resources” → “Import resources” → “Video” → “Image sequence” and select just the first frame.
- The next step is important for the video sequence to work: you have to turn the image files into NotchLC movie files. Select the image sequence, right-click it, and go to the bottom of the menu, where you will see the name of the file you selected. There, choose “Send to render queue for transcoding”.
The file will appear in the render queue. Select the file and hit “Render”. To bring in your render, go to “Resources” → “Video” → “Import video” and choose the movie file you want to drop into the scene.
- You have all of your files ready to start working on materials. Go to “Materials” and bring in a “material node”.
- Once it is dropped into the scene, connect the “Material” to the “Object”, then connect the “Video loader” to the “Material” color slot.
- From there, click on the “Material” and go to the material settings, where you will flip the Y axis of the UVs: set “UV scale Y” to -1.
- Once you play the file, you will notice that the image and video frames are not in sync. To fix this, go to “Nodes” → “Modifiers” → “Extractor”.
- Once it is in the scene, connect the “Extractor” to your “Video loader”.
- In the “Extractor” settings next to “Source Envelope” choose “Current Playing Frame”.
- You will also notice a time difference between the imported video loader and the 3D video scene: the video reports its current playing frame, while the 3D video scene expects time in seconds. Therefore, you need to convert the frame number by multiplying it by 1/25 (for 25 fps footage).
- Once you have done that, connect the “Extractor” to the “Time controller” on the “Video scene”.
- Now you are ready to import your volumetric video into your selected environment. Repeat all of the above steps: select and import the .obj sequence, select the image file, and transcode it into a movie, which will appear in your render queue. Once that is done, connect your 3D scene to the “Null” and connect the “Extractor” to your 3D scene. For our tutorial we have selected a “Clone to Volume” effect, which creates a boxes effect on the volumetric video; however, you can apply or create many different VFX. To connect the “Clone to Volume” effect, expand your imported 3D scene menu, select “Object”, and connect it to “Clone to Volume”. We have also added a “Plain Effector” and a “Randomize” effect, which moves up and down randomly, removing some of the boxes, as well as “Combined geometry”, which lets you see all of the polygons of the mesh.
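The frame-to-seconds conversion behind the Extractor setup above can be sketched as follows. This is a minimal illustration of the arithmetic only; the function name is ours and is not part of Notch, which performs this conversion through its node graph rather than code.

```python
def frame_to_seconds(frame_index: int, fps: float = 25.0) -> float:
    """Convert a current-playing-frame value to time in seconds.

    The video loader outputs its current frame, while the 3D video
    scene's time controller expects seconds, so each frame is worth
    1/fps seconds (1/25 for 25 fps footage, as in the tutorial).
    """
    return frame_index / fps

# 50 frames of 25 fps footage correspond to 2.0 seconds of scene time.
print(frame_to_seconds(50))
```

If your footage was recorded at a different frame rate, substitute that rate for 25 in the conversion.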