Sunday, January 29, 2023

quality loss when converting "warpings" with OCVWarp

Fiske Planetarium's "Climate Change in our Backyard" is available as 4k frames for free download, as well as a VR experience. So, I tried some conversions to see how much quality is lost when I use OCVWarp to change the "format" between 4k fulldome, 4k 360 VR, 2k fulldome and "pre-warped". 

The results were interesting. 

1. 4k 360 VR to 2k fulldome was not as good as 4k 360 VR to 4k fulldome. I expected the reverse.

Zoomed in views of screenshots below:

This is the 4k frame bicubic resized to 2k.


4k 360 VR to 2k fulldome - shows aliasing for fine lines.

4k 360 VR to 4k fulldome - shows less aliasing, better than the above!



2. 4k 360 VR to "prewarped" was almost as good as 2k fulldome to "prewarped". From earlier experience, I know that if I convert 4k fulldome directly to "prewarped", I get aliasing artifacts, so better results are obtained by bicubic-resizing 4k to 2k and then "warping". That being the case, this result shows that I need not download the 4k frames and redo the "warping" if I have good-quality 4k 360 VR video.
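As an aside, the bicubic resize step itself is straightforward - a minimal sketch assuming ffmpeg is installed and an illustrative frame numbering pattern:

# bicubic downscale of 4096x4096 fulldome frames to 2048x2048 before warping
ffmpeg -i frame_%05d.png -vf scale=2048:2048:flags=bicubic frame2k_%05d.png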

This was warped from 2K frames.



This was warped from 4k 360 VR video.

Friday, January 27, 2023

using a VPN for faster downloads without ISP throttling

FTP downloads on our 100 Mbps BSNL fiber averaged around 10 Mbps when downloading from Fiske Planetarium's server in Colorado. First I tried setting up OpenVPN to one of our Azure servers, and when that setup did not go according to plan, I tried ProtonVPN's free plan. 

Using a free server in the US, FTP traffic tunneled through ProtonVPN went at around 30 Mbps on average, touching 40-48 Mbps occasionally. 

Going back to the OpenVPN setup - I went through this webinar on OpenVPN for ZTNA, which mentions how to tunnel only specific domains through OpenVPN. I tried it, but instead of specifying only Fiske's server, I turned Split Tunnel off for a particular user I created. The IP address was shown as our Azure machine's IP address, but there was not much of a speed improvement - rather, FTP did not go through at all. This might have been due to some PASV mode setting etc., but I didn't try troubleshooting since ProtonVPN was sufficient for my needs.

Detailed setup:

ProtonVPN setup was relatively easy. Just create the free account, download the GUI client, and click to start.

OpenVPN cloud setup was more complicated. I wanted to explore some of its features, so I tried various options. The setup steps were mostly as per
https://openvpn.net/cloud-vpn/quick-start/
https://openvpn.net/cloud-docs/user-guide-protecting-your-users-and-your-network-using-cyber-shield/

But instead of a Linux internet gateway, I was using our Azure server running Windows. While setting up connectors and users, a few hiccups were caused by the following points. 

  • For the Internet gateway setup, there is a "Deploy connector" option that leads to the download of the common Windows connector (which creates a TAP network adapter etc.). But before connecting with that connector, we should first download the profile file - which is also available in the Deploy connector drop-down - and use that profile for the connector. That is the correct profile for an internet gateway.

  • When creating a user without an (optional) email linked to it, we need to manually set a temporary password and then view that temporary password. On first login, the temporary password needs to be changed.

  • The OpenVPN Cloud free plan has unlimited data and allows creating a large number of users, but is limited to 3 concurrent connections. 

Tuesday, January 24, 2023

backing up mysql databases to google shared drive

Following the nixCraft script, also seen on GitHub, I implemented a weekly backup for some of our MySQL databases, run from root's crontab:

05 0 * * Sun /path/to/script.sh

When I added an rclone move line to the script after the backup step, it was moving incomplete 20 kB files to the shared Google Drive. So I now run the rclone move as a separate script half an hour later,

35 0 * * Sun /path/to/script.sh

which has lines similar to

MBD="$DEST/mysql"    # DEST is the backup destination defined earlier in the script
rclone --config "/home/ouruser/.config/rclone/rclone.conf" move "$MBD/" gshareddrive:/

The database dump takes only a couple of minutes, so the half-hour gap is more than sufficient. 
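For reference, the core of such a backup script boils down to something like this - a minimal sketch with illustrative names and paths, not the exact nixCraft script:

#!/bin/bash
# illustrative credentials and paths - adjust to the real setup
MyUSER="backupuser"
MyPASS="secret"
MyHOST="localhost"
DEST="/backup"
MBD="$DEST/mysql"
NOW="$(date +"%d-%m-%Y")"
DBS="db1 db2"          # space-separated list of databases to dump

mkdir -p "$MBD"
for db in $DBS
do
    FILE="$MBD/$db.$NOW.gz"
    mysqldump -u "$MyUSER" -h "$MyHOST" -p"$MyPASS" "$db" | gzip -9 > "$FILE"
done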

 Some things to remember about using this script:

  • If we're defining only a few databases to be backed up, like
    # Store list of databases
    DBS="db1 db2"

    then we need to disable/comment out the DBS= line which appears later,
    #DBS="$($MYSQL -u $MyUSER -h $MyHOST -p$MyPASS -Bse 'show databases')"

  • As seen above, the list of databases (or the ones to be ignored, in the IGGY variable) is entered as space-separated names.

  • The script changes directory permissions for security, so it is meant to be run as root. If run as an unprivileged user, it will fail to write the files.


Bitnami mysql VM disk usage - disable binlog

Found that the disk utilization for one of our database servers was 83%. Checking with df -h and du -sh, the directory using the most space turned out to be 

/bitnami/mysql/data$ sudo du -sh *
192K    #ib_16384_0.dblwr
8.2M    #ib_16384_1.dblwr
676K    #innodb_temp
4.0K    auto.cnf
1.1G    binlog.000867
1.1G    binlog.000868
1.1G    binlog.000869
1.1G    binlog.000870
1.1G    binlog.000871
1.1G    binlog.000872
1.1G    binlog.000873
1000M   binlog.000874
1.1G    binlog.000875

So, these binary logs were taking up all the space, while the actual database itself was less than a GB. Followed this to disable binary logging - How to Disable Binary Logging in MySQL on Bitnami servers? (supportsages.com) 

sudo /opt/bitnami/ctlscript.sh stop mysql
sudo nano /opt/bitnami/mysql/conf/my.cnf

and added the disable_log_bin line just below the [mysqld] section header:
[mysqld]
disable_log_bin

sudo /opt/bitnami/ctlscript.sh start mysql

Waited for some days, but the binlogs were not getting auto-deleted. So I manually deleted them and restarted mysql with

 sudo /opt/bitnami/ctlscript.sh restart

Now the disk is only 30% used.
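For reference, the cleanup can also be done from within MySQL instead of deleting the files by hand - a sketch assuming the usual mysql client and root credentials; the purge statement only works while binary logging is still enabled:

# confirm binary logging is off after the config change
mysql -u root -p -e "SHOW VARIABLES LIKE 'log_bin';"

# purge binlogs older than now (run while log_bin is still on, before disabling it)
mysql -u root -p -e "PURGE BINARY LOGS BEFORE NOW();"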

 

 

Monday, January 23, 2023

file not found error on Google Apps Script Drive API for files on shared drives - workaround

Failing to get a file in Google Apps Script using the Drive API - Stack Overflow

So, we must pass an extra parameter to the Drive API call, instead of just the file ID:

Drive.Files.get('111111FileIDxxxxxxxxxxx', {supportsAllDrives: true});
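The same parameter applies when calling the Drive REST API directly, for example with curl against the v3 endpoint (a sketch; FILE_ID and the access token are placeholders):

curl -H "Authorization: Bearer $ACCESS_TOKEN" \
  "https://www.googleapis.com/drive/v3/files/FILE_ID?supportsAllDrives=true&fields=id,name"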

 

 

Moodle course slow to load

An issue with one of our Moodle instances - one of the courses was loading very slowly in Edit Mode, and was timing out. Another issue - the question bank was not loading at all.

1. Question bank issue - The database table prefix_question_statistics showed a(n abnormally?) high number of entries -  that table alone is 49 MB,  with more being added every day. Apparently that was the issue. Statistics was disabled for the question bank, and the question bank started loading again. 

Some suggestions to make the course(s) load faster:

a. Currently, if you click on one of the courses, a huge page, with many images, for all weeks, loads in the browser. We can make it load a bit faster using Cloudflare, but such a large page is bound to cause problems with older computers / machines with less RAM / on slower networks / etc. It would be better to show a collapsed view, with each week opening up only on clicking. (the collapse all link on the top right). Or on separate pages.

b. Use jpg at quality 70 to 75 instead of png (a sketch of the conversion follows this list). 

c. Turn on cloudflare proxying. 

d. Clicking on the question bank also loads up 1000 questions at once. This would cause problems as the number of questions and the size of each question increase. It would be desirable to load fewer questions - maybe 100 at a time, using the link at the bottom, "Show 100 at a time" - I have clicked it now. 
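A minimal sketch of the image conversion in (b), assuming ImageMagick is installed and the course images are local png files:

# write .jpg copies at quality 75 alongside the existing .png files
mogrify -format jpg -quality 75 *.png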

2. Course not loading / slow to load - Further troubleshooting tips -
Moodle in English: Very large course takes a minute to load unless "editing" is on

" It's almost always because somebody is trying to build the learning material on the course page itself rather than in activities/resources "behind" the page. There's nothing wrong with that, of course, but it's does have an effect.

We had a course with 120 videos embedded in labels on the course page. It took 10 minutes plus to load - even on a good connection."

So, probably not a good idea to embed all your content on the course page itself; it should be in "activities/resources behind the page".


Also, another person says:
"To track down the culprit I usually switch to one section per page. With one section per page, only the content for that pages loads, so some sections will load normally, but your dragger will still drag."

How to show one topic per page:
Course format types in Moodle · Technology Help · Lafayette College
 
(Apparently the "embedding videos in labels" was one of the issues. Another suggestion which S passed on was: 
another way to improve performance overall is to move out three folders namely (cache, localcache and temp) out of moodle data folder. Here is a tutorial for the same.  https://youtu.be/qXnSlwYzDfU )
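A rough sketch of that folder move - the paths are illustrative, and the corresponding $CFG->cachedir, $CFG->localcachedir and $CFG->tempdir settings in config.php then need to point at the new locations:

# assumption: moodledata lives at /var/moodledata and the new location is /var/moodlecache
sudo mkdir -p /var/moodlecache
sudo mv /var/moodledata/cache /var/moodledata/localcache /var/moodledata/temp /var/moodlecache/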

Sunday, January 22, 2023

trying out Google Earth Studio for planetarium fulldome content

When I was googling for ways to create fulldome or 360 VR content from Google Earth (desktop), I came across this interesting product.

https://earth.google.com/studio/docs/making-animations/rendering/

Apparently, this is a separate product from Google Earth (desktop) and is invite only at present. I sent in a request and got approved within 12 hours. 

We can render 360 VR scenes fairly easily, with the caveat that their attribution must appear onscreen, and we should not sell the resulting content.

Using OCVWarp, I tried out turning the rendered 360 VR files into fulldome content, a couple of examples below.



Edit: For exporting regular flat content from the Desktop version of Google Earth, this youtube video has a step-by-step tutorial.


Deprecated SSL/TLS version warning from NIC-CERT - action taken on cloudflare

Copy-pasting from an email exchange, after one of our institutions received an email saying that some/all of their domains had deprecated versions of TLS enabled - 

There are pros and cons for disabling these.

1. You can see which clients will be affected by looking at the last answer at

2. If you log in to your cloudflare dashboard, click on your domain, then SSL/TLS --> Overview on the left hand pane, you can find a percentage of users still using TLS 1.0 etc.

3. The security risk is that users still on TLS 1.0 etc. can be tricked into connecting to some other site masquerading as our site, since the old protocol versions are vulnerable, and the data they send to our site can then be stolen. Example attacks are at
https://www.acunetix.com/blog/articles/tls-vulnerabilities-attacks-final-part/

4. If you want to disable TLS 1.0 and 1.1, you can do it from your cloudflare dashboard: click on the relevant website, go to SSL/TLS --> Edge Certificates and set the Minimum TLS Version to 1.2.
 
 

They have now moved to a higher min TLS version.
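To verify from the command line that the old versions are really rejected, something like the following can be used - a sketch assuming an openssl build that still supports the legacy protocol options, with example.com standing in for the real domain:

# these handshakes should now fail
openssl s_client -connect example.com:443 -tls1 < /dev/null
openssl s_client -connect example.com:443 -tls1_1 < /dev/null
# this one should still succeed
openssl s_client -connect example.com:443 -tls1_2 < /dev/null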


Saturday, January 21, 2023

tunneling through ssh

We wanted to protect one of our database servers by closing its open ports and allowing access only by tunneling via ssh or a VPN. Since this post indicates that ssh is less CPU intensive than OpenVPN, I wanted to keep an ssh tunnel open across reboots. Autossh seems to be a ready-made solution for this - 

autossh(1): monitor/restart ssh sessions

So, in our case, via

Create an ssh tunnel background service with autossh and systemd (systemctl)

and

16.04 - Start autossh on system startup - Ask Ubuntu 

 

sudo apt install autossh

sudo nano /etc/systemd/system/ourtunnel.service

[Unit]
Description=our tunnel service
After=network.target network-online.target sshd.service

[Service]
ExecStart=/usr/bin/autossh -i /home/ouruser/.ssh/id_rsa_ourkey -L 9999:localhost:9999 -NT ourserveruser@our.remoteserver.tld

[Install]
WantedBy=multi-user.target
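Then the usual systemd steps to have the tunnel start at boot (unit name as above):

sudo systemctl daemon-reload
sudo systemctl enable --now ourtunnel.service

With the tunnel up, the remote database is reachable through the forwarded local port, e.g. (assuming a MySQL server listening on port 9999 on the remote machine):

mysql -h 127.0.0.1 -P 9999 -u dbuser -p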

The -NT switch is important - otherwise, the sshd on the remote server complains, "Pseudo-terminal will not be allocated because stdin is not a terminal." And the service status will show that it failed - 

service ourtunnel status

 autossh[494191]: ssh exited prematurely with status 0; autossh exiting

 

linux - Pseudo-terminal will not be allocated because stdin is not a terminal - Stack Overflow

After a few days of use, top shows the CPU usage - around 6.6% for autossh on the tunnel client which runs a web server, and the tunnel server which runs a db shows 8 to 12% CPU usage for sshd.

 

 

Thursday, January 19, 2023

issues with a Moodle course

One of our institutions had some trouble with one of their courses not loading - or being very slow to load. After first suspecting server issues, some excerpts from an email exchange:

Some suggestions to make the course(s) load faster:

0. Have finished the upgrade now, and the site is working. The database table prefix_question_statistics shows a(n abnormally?) high number of entries -  that table alone is 49 MB,  with more being added every day - around 11:15 this morning a large number of records were added - this might have been one of you adding another set of questions?
 
1. Currently, if you click on one of the courses, a huge page, with many images, for all weeks, loads in the browser. We can make it load a bit faster using Cloudflare, but such a large page is bound to cause problems with older computers / machines with less RAM / on slower networks / etc. It would be better to show a collapsed view, with each week opening up only on clicking. (the collapse all link on the top right). Or on separate pages.

2. As I mentioned, please use jpg quality = 70 to 75 instead of png. 

3. I'll make the cloudflare changes to turn proxying on. 

4. Clicking on question bank also loads up 1000 questions at once. This would cause problems as the number of questions / size of each question increase. It would be desirable to load up fewer questions - maybe 100 at a time as with the link at the bottom, "Show 100 at a time" - I have clicked it now

I do not think anyone was adding questions. I have only been opening that page to troubleshoot. The problem still existed even after the upgrade. However, I found a fix by disabling question statistics, which seem to be triggered every time we access the question bank.
 
For now this should work till moodle finds a permanent solution for the statistics trigger. 
 
Checking the server, I found node.js (which runs our notifier service) was running at 60% CPU. I have stopped it now and restarted the webserver with
cd notifierdirectory
sudo forever stop index.js
sudo service apache2 restart

So, now the site itself is loading.

https://our.site/course/view.php?id=4
loads fine.

But when I turn on edit mode, the page becomes very large, and hence loads slowly.

I guess the same issue is affecting https://our.site/course/view.php?id=6 in edit mode.
 
--Later--
 
Further troubleshooting tips -
Moodle in English: Very large course takes a minute to load unless "editing" is on

"It's almost always because somebody is trying to build the learning material on the course page itself rather than in activities/resources "behind" the page. There's nothing wrong with that, of course, but it's does have an effect.

We had a course with 120 videos embedded in labels on the course page. It took 10 minutes plus to load - even on a good connection."


So, probably not a good idea to embed all your content on the course page itself; it should be in "activities/resources behind the page".

Now, whether you are already doing this, or how to do this if you are not doing this - maybe S can help.

Also, another person says:
"To track down the culprit I usually switch to one section per page. With one section per page, only the content for that pages loads, so some sections will load normally, but your dragger will still drag."

How to show one topic per page:
Course format types in Moodle · Technology Help · Lafayette College