Earlier, I posted about converting 360 videos to fulldome for planetarium projection using Blender, but that workflow was excruciatingly slow. Since then, I have been working on a tool to do generalized warping (remapping from one coordinate system to another) using OpenCV and CPU-based calculations - OCVWarp. It is still a work in progress, but some parts already work: mapping to a 180 degree or 360 degree fisheye centred on the "north pole" of the equirectangular image works as of now, and there is a simplified commandline-only tool to run on headless cloud servers. Meanwhile, I found that ffmpeg now also has an option to do this directly, without the remap filter. I must still explore the relative speeds.
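To make the mapping idea concrete, here is a minimal sketch in Python with OpenCV, operating on a single extracted frame. This is not the OCVWarp code itself, and the helper name build_fisheye_maps and the file names are just for illustration. The idea is that each output fisheye pixel's distance from the image centre is treated as angular distance from the north pole, and its azimuth as longitude, which tells us which equirectangular pixel to sample via cv2.remap.

```python
# Minimal sketch (not OCVWarp itself): equirectangular -> fisheye centred
# on the "north pole" (top row) of the input, for one frame.
import numpy as np
import cv2

def build_fisheye_maps(in_w, in_h, out_size, aperture_deg=180.0):
    """Build map_x/map_y for cv2.remap: output fisheye -> input equirect."""
    half = out_size / 2.0
    # Pixel grid of the output fisheye image, centred at (half, half)
    ys, xs = np.mgrid[0:out_size, 0:out_size].astype(np.float32)
    dx, dy = xs - half, ys - half
    r = np.sqrt(dx * dx + dy * dy) / half          # 0 at centre, 1 at rim
    azimuth = np.arctan2(dy, dx)                   # angle around the pole
    # Radius maps linearly to angular distance from the pole
    polar = r * np.radians(aperture_deg) / 2.0     # 0 .. aperture/2
    # Equirectangular: x = longitude, y = angle down from the north pole
    map_x = ((azimuth / (2 * np.pi)) % 1.0) * (in_w - 1)
    map_y = np.clip(polar / np.pi, 0, 1) * (in_h - 1)
    # Points outside the fisheye circle fall back to the border colour
    map_x[r > 1.0] = -1
    map_y[r > 1.0] = -1
    return map_x.astype(np.float32), map_y.astype(np.float32)

frame = cv2.imread("equirect_frame.png")           # one frame of the 360 video
mx, my = build_fisheye_maps(frame.shape[1], frame.shape[0], 2048, 180.0)
dome = cv2.remap(frame, mx, my, cv2.INTER_LINEAR,
                 borderMode=cv2.BORDER_CONSTANT, borderValue=(0, 0, 0))
cv2.imwrite("dome_frame.png", dome)
```

The remap maps can be built once and reused for every frame of the video, so the per-frame cost is essentially the cv2.remap call plus decode/encode.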
On my i7 MacBook with only 4 GB RAM, output sizes greater than 2048x2048 (or maybe 3072x3072) caused the process to run out of physical memory and start writing to swap. 2K output ran at around 4 fps. I tried running on an Azure VM with 16 GB RAM and on a borrowed Dell laptop with 8 GB RAM; both could comfortably output 4096x4096 video at around 2 or 3 fps.
A quick note about the Azure VMs: a Windows 10 VM took around 15 minutes to provision and start running, while an Ubuntu 18.04 server took around 2 minutes or less.
Edit - OCVWarp has been updated. Here is a link to all my posts about OCVWarp.