Tuesday, May 28, 2024

Zoom in from Milky Way to Puttaparthi

The scripts and settings used to generate this 4k fisheye fulldome video are shared at https://github.com/hn-88/openspace-scripts 


The following script was used,
https://github.com/hn-88/openspace-scripts/blob/main/user/recordings/zoom-out-psn-to-galaxy4.osrectxt

where the motion is actually recorded backwards, zooming out, in order to give more controlled zoom travel; the frames are then reversed.

ffmpeg and Blender were used to render the screenshot image sequences into video, forwards and backwards. Blender could be used to fine-tune the video, for example by speeding up parts without much action.
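
As a rough sketch (assuming the screenshots are numbered like frame000001.png - the actual filename pattern from OpenSpace will differ), the forward video can be assembled with something like:

ffmpeg -framerate 30 -i frame%06d.png -c:v libx264 -pix_fmt yuv420p forward.mp4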

ffmpeg to reverse a frame sequence - https://stackoverflow.com/questions/40475480/ffmpeg-convert-image-sequence-to-video-with-reversed-order

The zoom-in video (reversed frames) has been uploaded and made available at archive.org, encoded with x265 in lossless mode, under a Public Domain licence.

https://archive.org/details/zoom-in-to-psn-lossless

Monday, May 27, 2024

ffmpeg lossless for archive.org and reverse order of frames

Archive.org did not accept zip files containing frames when uploading 4K content, but video files using lossless codecs were accepted.

ffmpeg - Best Lossless Video Codec for 8 bpc RGB image sequences? - Super User

seems to indicate that x265 is the best choice, with the command line
 
ffmpeg -i input.mp4 -c:v libx265 -x265-params lossless=1 output.mp4 
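
To go straight from an image sequence to a lossless file, something like this should work (a sketch - the frame name pattern is an assumption, and -pix_fmt yuv444p avoids the extra loss from 4:2:0 chroma subsampling):

ffmpeg -framerate 30 -i frame%06d.png -c:v libx265 -x265-params lossless=1 -pix_fmt yuv444p output.mp4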

And for reversing the frame order (backwards video),

- first bulk rename the frames so that the numbers prefix gets a - (minus) sign; ffmpeg then reads the negative numbers in ascending order, which plays the original sequence backwards - see the sketch below.
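
A minimal sketch, assuming 9999 frames named 0001.png to 9999.png (names and count are assumptions):

for f in [0-9]*.png ; do mv -- "$f" "-$f" ; done
ffmpeg -framerate 30 -start_number -9999 -i %05d.png backwards.mp4

With the minus prefix, ffmpeg reads -9999.png first and -0001.png last, so the original sequence plays backwards.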

Saturday, May 25, 2024

github importer, for making a copy of a repository from the web interface

For testing a reported opencv bug, I wanted to make a copy of one of my repositories. Instead of cloning to a local directory etc., I wanted to do it from GitHub's web interface. The solution seemed to be to use GitHub Importer.
 
At the upper-right corner of any page, click the plus sign, and then click Import repository
 

making google great again

A follow-up to my post about the beginning of the end of Google -

https://askleo.com/why-ive-stopped-using-google-search/

and a temporary band-aid fix to get better search results from google,

https://tedium.co/2024/05/17/google-web-search-make-default/

- just add &udm=14 to the end of the search query URL.
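
For example, https://www.google.com/search?q=openspace&udm=14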

Tuesday, May 21, 2024

Fly through Mars valley and Grand Canyon - generated with OpenSpace

The scripts and settings used to generate these 4k fisheye fulldome videos are shared at https://github.com/hn-88/openspace-scripts


 

https://github.com/hn-88/openspace-scripts/blob/main/user/recordings/setupgrandcanyon.osrectxt

https://github.com/hn-88/openspace-scripts/blob/main/user/recordings/grandcanyonflyin5.osrectxt

(default profile, with window options set to fisheye 4k as at
https://github.com/hn-88/openspace-scripts/blob/main/config/single_fisheye-4k.json )

Link to 4096x4096 video at archive.org -  https://archive.org/details/grand-canyon5

Flying 5 km (3 miles) above a valley on Mars, near Valles Marineris


https://github.com/hn-88/openspace-scripts/blob/main/user/recordings/marsvalley2setup2.osrectxt

https://github.com/hn-88/openspace-scripts/blob/main/user/recordings/marsvalley2upfrombeginning3.osrectxt

(This recording is played in reverse at the beginning of the video.)

https://github.com/hn-88/openspace-scripts/blob/main/user/recordings/marsvalley2.osrectxt

Link to 4096x4096 video at archive.org - https://archive.org/details/mars-fly-into-valley0001-6940

The 4K frames are with me, and can be made available on request. I'll also see if I can upload them to archive.org - Edit - apparently zip files with jpg frames can't be uploaded to archive.org, so I have added links above to the 4K movie files uploaded to archive.org.

Monday, May 20, 2024

Microsoft's answer to brute force mitigation - $$$

 How to avoid successful SSH Brute Force Attack - Microsoft Q&A

mentions just-in-time access. But that

https://learn.microsoft.com/en-us/azure/defender-for-cloud/just-in-time-access-usage

requires Defender for Cloud Plan 2

https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-select-plan#plan-features

which costs nearly $15 per month per server!

https://azure.microsoft.com/en-us/pricing/details/defender-for-cloud/

Indian income tax e-filing - validation failed

When filling up the ITR2 income tax return, one of the "validation failed" messages said:

In Schedule CG, Sl. No. B4a LTCG u/s 112A is not equal to total of Col. 14 of Schedule 112A

This is probably because of rounding errors. The numbers in the quarter-wise table in Schedule CG (Capital Gains) must add up to the numbers in Schedule 112A. So we should manually add and check, and make small one-rupee adjustments to make the quarterly numbers add up to the total in Schedule 112A.
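
For example, with made-up numbers: if Col. 14 of Schedule 112A totals Rs. 10,001 and the four quarterly figures were entered as Rs. 2,500 each (total Rs. 10,000), bumping one quarter up to Rs. 2,501 makes the two totals match exactly.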

Another couple of points to note when there are "brought-forward losses" (BFL) -
(1) in that case, the full amount of the BFL has to be applied - we can't apply only part of the losses to cover just a portion of this financial year's profits
(2) and the quarterly numbers should then add up to zero (if the losses completely cover the profits.)

Monday, May 13, 2024

making a copy of a wordpress blog and setting it up on another domain

What I should have done:

Probably I should have used wp-cli to deactivate all plugins before the migration - see the one-liner after the links below. And apparently, the whole thing could have been done far more easily with wp-cli - https://rocketgeek.com/basics/using-wp-cli-to-migrate-copy-your-site/
https://guides.wp-bullet.com/migrate-wordpress-site-new-server-wp-cli/
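
The plugin step is a one-liner (run from the WordPress install directory, assuming wp-cli is set up):

wp plugin deactivate --all

and wp plugin activate --all after the migration to turn them back on.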

Steps taken:

0. Ran all commands inside screen.

1. https://wordpress.stackexchange.com/questions/75135/how-to-export-import-wordpress-mysql-database-properly-via-command-line
says a normal mysqldump should suffice.
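
Something like this (the database name is a placeholder; the user is as in step 4):

mysqldump -u theuser -p thewpdb > wpdevelbackup.sql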

2. Mysqldump complained about permissions -
mysqldump: Error: 'Access denied; you need (at least one of) the PROCESS privilege(s) for this operation' when trying to dump tablespaces

So, https://dba.stackexchange.com/questions/271981/access-denied-you-need-at-least-one-of-the-process-privileges-for-this-ope

SHOW GRANTS FOR someuser@localhost;

To add the global privilege use the SQL command
GRANT PROCESS ON *.* TO someuser@localhost;

That solved the issue. In my case, I was running the SQL commands from the SQL console of DBeaver.

3. Created a new db as root, granted permissions for the relevant user as per https://www.digitalocean.com/community/tutorials/how-to-create-a-new-user-and-grant-permissions-in-mysql
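
For example, something like this (assuming the 'theuser' account from step 4 already exists):

mysql -u root -p -e "CREATE DATABASE thenewdb; GRANT ALL PRIVILEGES ON thenewdb.* TO 'theuser'@'localhost'; FLUSH PRIVILEGES;"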

4. mysql -u theuser -p thenewdb < wpdevelbackup.sql to populate the new db.

5. Copied over the directory with cp -a to retain permissions as well as do a recursive copy - https://unix.stackexchange.com/questions/44967/difference-between-cp-r-and-cp-a
sudo cp -a olddirectory newdirectory

6. After that, we need to update the wp_options table in the new database. A query for option_value like '%public_html%' showed all the required changes to be made, starting with siteurl -
siteurl
home
astra_sites_recent_import_log_file
recently_edited
fs_active_plugins
duplicator_package_active
uagb_downloaded_font_files
_transient_dirsize_cache

(Had to copy-paste into a text editor and find-and-replace the multiple instances of the old path in some of these fields.

The _transient field was very big. Since it would be recreated anyway, I deleted its value instead of changing the path inside it to the new path.)
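
A sketch of the kind of queries involved (the URL is as in step 7; serialized values like fs_active_plugins are safer edited by hand, as above):

mysql -u theuser -p thenewdb -e "SELECT option_name FROM wp_options WHERE option_value LIKE '%public_html%';"
mysql -u theuser -p thenewdb -e "UPDATE wp_options SET option_value = 'https://oursite.com' WHERE option_name IN ('siteurl', 'home');"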

7. Have to edit the wp-config file also, since the site was redirecting back to the old URL. https://www.sitepoint.com/how-to-migrate-a-wordpress-site-to-a-new-domain-and-hosting/
Had to add the lines
define('WP_HOME','https://oursite.com');
define('WP_SITEURL','https://oursite.com');

at the end of the wp-config file also, since I had not deactivated the plugins before doing the migration. Or maybe it was just a caching problem with Firefox, which I used for testing - the site was working at this point when tested with another browser. (I had done the editing of /etc/apache2/sites-available etc. beforehand.)

Wednesday, May 08, 2024

Schedule Page date listing much in advance

PB wanted to verify that a schedule created a few days in advance loads OK on the schedule page. But only a few days were visible. This was because the code checks if the schedule data txt file is available and then updates the dropdown, but only up to 5 days in advance -


while (dtimems < dateplus5ms) {
    // add a dropdown entry only if schedule data exists for this date
    addifscheduledataavailable(dtimems);
    // step forward one day, in milliseconds
    dtimems += 24 * 60 * 60 * 1000;
}

It may be possible to see the schedule of other dates by changing your system date to the future.

And indeed, PB confirmed that he could check the published playlists for future dates by changing the system date.

Monday, May 06, 2024

run simple python code online

There is Google's Colab for executing Python notebooks online, in the browser. But for simple code, there is trinket.io - "no need to log in, download plugins, or install software."
via
https://www.youtube.com/watch?v=v7svyPnIeGA
via
https://www.wired.com/story/why-the-solar-system-is-flat/

Thursday, May 02, 2024

interesting problem with S3 buckets - anyone can do a $ DDoS now

Copy-pasting from Slashdot - 

How an Empty S3 Bucket Can Make Your AWS Bill Explode (medium.com)

Posted by msmash on Tuesday April 30, 2024 @03:10PM from the oops dept.
Maciej Pocwierz, a senior software engineer at Semantive, writing on Medium: A few weeks ago, I began working on the PoC of a document indexing system for my client. I created a single S3 bucket in the eu-west-1 region and uploaded some files there for testing. Two days later, I checked my AWS billing page, primarily to make sure that what I was doing was well within the free-tier limits. Apparently, it wasn't. My bill was over $1,300, with the billing console showing nearly 100,000,000 S3 PUT requests executed within just one day! By default, AWS doesn't log requests executed against your S3 buckets. However, such logs can be enabled using AWS CloudTrail or S3 Server Access Logging. After enabling CloudTrail logs, I immediately observed thousands of write requests originating from multiple accounts or entirely outside of AWS.

Was it some kind of DDoS-like attack against my account? Against AWS? As it turns out, one of the popular open-source tools had a default configuration to store their backups in S3. And, as a placeholder for a bucket name, they used... the same name that I used for my bucket. This meant that every deployment of this tool with default configuration values attempted to store its backups in my S3 bucket! So, a horde of misconfigured systems is attempting to store their data in my private S3 bucket. But why should I be the one paying for this mistake? Here's why: S3 charges you for unauthorized incoming requests. This was confirmed in my exchange with AWS support. As they wrote: "Yes, S3 charges for unauthorized requests (4xx) as well[1]. That's expected behavior." So, if I were to open my terminal now and type: aws s3 cp ./file.txt s3://your-bucket-name/random_key. I would receive an AccessDenied error, but you would be the one to pay for that request. And I don't even need an AWS account to do so.

Wednesday, May 01, 2024

OCVWarp on Google Cloud Shell

Checked and found that OCVWarp works (though slowly - 5 fps for XVID-encoded 1080p output) even on Google Cloud Shell, after installing the dependencies.

# download and unpack the OCVWarp source, and switch to its build directory
wget https://github.com/hn-88/OCVWarp/archive/refs/tags/v4.01.zip
unzip v*.zip
cd OCVWarp*/build
# fetch the prebuilt AppImage, make it executable, and give it a shorter name
wget https://github.com/hn-88/OCVWarp/releases/download/v4.01/OCVWarp-4.01-x86_64.AppImage
chmod +x OCV*mage
mv OCV*mage ocvwarp
# install the OpenCV libraries the AppImage needs
sudo apt install libopencv-dev


(after uploading the input file)


./ocvwarp OCVWarp.ini 30fps.mp4 output.mp4