Following the helpful post by 'borkia' on the Fulldome Yahoo group, I set up a project in Blender to render fulldome frames or video from 360 videos. 'borkia' had posted this explanatory image.
I had to take a few minutes to reacquaint myself with the Blender interface, as it has been 7 years since my Blender fulldome experiments.
- Right-click to select an object (this includes cameras).
- Change its properties (position, rotation, etc.) using the Object tab of the Properties panel.
- The rotation of the sphere in the screenshot above changes the up-down viewing angle; increasing the X rotation is like tilting the camera down.
- A tutorial I found on editing nodes suggested dragging the timeline panel up and, in its place, switching the editor type to the Node Editor.
- The Cycles rendering engine must be selected. The rest of the settings are as per 'borkia's explanatory image. On my old-model i7 MacBook Pro, 512x512 frames rendered in less than 2 seconds, while 2048x2048 frames took around 30 seconds each: roughly 1000x slower than real-time. At that rate, a half-hour segment would take around 20 days to render, so it would be better to render in few-second chunks. One minute would take around 15 hours, and 100 frames around 50 minutes. Cycles supports GPU-based rendering, so it might be much faster on a machine with an NVIDIA GPU.
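The settings above can be sketched as a Blender Python (bpy) script. This is a minimal sketch only: the object names `'Camera'` and `'Sphere'` are assumed defaults for a scene like 'borkia's (a camera inside a sphere carrying the 360 video as a texture), and the exact property paths may differ between Blender versions.

```python
# Sketch of the fulldome render setup described above (run inside Blender).
# Assumes default object names 'Camera' and 'Sphere'; adjust to your scene.
import math
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'            # Cycles is required for the
                                          # panoramic fisheye camera
scene.render.resolution_x = 2048          # square output for a dome master
scene.render.resolution_y = 2048
scene.render.resolution_percentage = 100

cam = bpy.data.objects['Camera'].data
cam.type = 'PANO'
cam.cycles.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.cycles.fisheye_fov = math.radians(180)  # 180-degree fulldome view

# Tilt the sphere carrying the 360 video: increasing the X rotation
# is like tilting the camera down, as noted above.
sphere = bpy.data.objects['Sphere']
sphere.rotation_euler.x = math.radians(20)  # example tilt angle
```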
Edit: OCVWarp is much faster, at around 4 frames per second for 2K output.
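The render-time arithmetic above can be sanity-checked with a quick script. The 30 fps playback rate is an assumption (the post does not state the frame rate); with it, the estimates land close to the rough figures quoted above.

```python
# Back-of-the-envelope render-time estimates for 2048x2048 Cycles frames,
# assuming ~30 s per frame and 30 fps playback (both rough figures).
SECONDS_PER_FRAME = 30
FPS = 30

def render_time_seconds(playback_seconds):
    """Total render time for a clip of the given playback length."""
    return playback_seconds * FPS * SECONDS_PER_FRAME

half_hour = render_time_seconds(30 * 60)
print(half_hour / 86400, "days")    # 18.75 days, i.e. around 20 days

one_minute = render_time_seconds(60)
print(one_minute / 3600, "hours")   # 15.0 hours

print(100 * SECONDS_PER_FRAME / 60, "minutes")  # 100 frames -> 50 minutes
```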