Thursday, June 29, 2023

miscellaneous Wordpress fixes

Since I'm new to administering WordPress, I'm listing below solutions to some of the issues I fixed on our servers.

Exporting and importing the database - 

https://ubuntu.com/tutorials/install-and-configure-wordpress#5-configure-database

Initially I had set up a WordPress site with WP-CLI, since auto-updates and plugin installs were not working via the web console. Later, the web-based administration was fixed by changing ownership of the files to the webserver,
sudo chown -R www-data:www-data the-wp-directory

The site did not need 777 permissions; 744 seemed to work.
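A quick way to verify what mode a directory actually has before loosening anything - sketched on a scratch directory rather than the live site:

```shell
# Inspect the effective permission bits with stat instead of guessing;
# done on a throwaway directory so nothing live is touched
tmp=$(mktemp -d)
chmod 744 "$tmp"
stat -c '%a' "$tmp"   # prints 744
rm -rf "$tmp"
```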

But here are some of the useful WP-CLI commands.

cd the-wp-directory
wp plugin update --all

wp plugin search recaptcha
wp plugin install advanced-google-recaptcha
 
A migrated site would not open the web admin console - the error seen in /var/log/apache2/error.log seemed to be related to the Yoast plugin, which was not compatible with PHP 8.1. So it needed to be disabled before I could open the web console. To do that,
https://kinsta.com/knowledgebase/disable-wordpress-plugins/
Rename public/wp-content/plugins folder to plugins_old, log in, then rename it back to plugins.
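The rename trick can be sketched like this - shown on a scratch directory standing in for the real wp-content path:

```shell
# Disable all plugins by renaming the plugins folder, then restore it
# (a scratch directory stands in for public/wp-content here)
wp_content=$(mktemp -d)
mkdir "$wp_content/plugins"
mv "$wp_content/plugins" "$wp_content/plugins_old"   # all plugins now disabled
# ...log in to wp-admin, remove or replace the offending plugin...
mv "$wp_content/plugins_old" "$wp_content/plugins"   # back to normal
rm -rf "$wp_content"
```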
 
One of our sites was showing 404 errors for all pages other than the home page. The simple fixes at https://www.cloudways.com/blog/wordpress-404-error/ did not work. Digging in a bit, I found that if I rewrote the URL as
domain.tld/?postname
it would work. So, it was some issue with the Permalinks setting. Though I checked that the .httpdocs was writable and so on, the WordPress URL rewrite was not working. So I had to manually paste the following code at the end of the .htaccess file in the root directory, as per the advice at this page
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
 
Also, the W3 Total Cache plugin needed mod_expires.
Enabling mod_expires (and mod_headers), as per
https://unix.stackexchange.com/questions/694530/enable-mod-expires-on-apache
sudo a2enmod expires headers

https://lindevs.com/install-tidy-on-ubuntu gives the method for installing tidy.
But what we need is the PHP tidy extension, not the standalone tool:
sudo apt install php-tidy
https://linuxhint.com/install-latest-php-ubuntu22-04/

For testing the sites, I had edited the hosts file to point the browser to the new server -
sudo nano /etc/hosts on Linux

On Windows, you would do this by going to the Start menu, typing notepad.exe, right-clicking on Notepad, choosing Run as Administrator, and opening the file
C:\Windows\System32\drivers\etc\hosts
(since it is just hosts and not hosts.txt, when using File->Open to find it, you must choose the drop-down "All Files (*.*)" option.)
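On either OS, the entry added to the hosts file looks like this (IP address and domain are placeholders):

```
# temporary override while testing the new server
203.0.113.10    example.org www.example.org
```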
 
One of the migrated sites on a "multi-site network" wordpress instance did not allow logging in to the admin console with sitename/wp-admin - "An error occurred".
AFTER chown'ing all files under the WP directory to www-data, deleting inactive plugins from the Network Admin web console solved the issue.

Resetting a password when a wordpress site is not configured for sending emails - via https://wordpress.org/documentation/article/reset-your-password/
- create an MD5 hash of the desired password from http://www.miraclesalad.com/webtools/md5.php and put it in the user_pass field of the prefix_users table (WordPress transparently re-hashes it in its stronger format on the next login). If not using DBeaver, from the WP documentation above:
  1. mysql -u root -p (log in to MySQL)
  2. enter your MySQL password
  3. use name-of-database; (select the WordPress database)
  4. show tables; (you're looking for a table name ending with "users")
  5. SELECT ID, user_login, user_pass FROM name-of-table-you-found; (this gives you an idea of what's going on inside)
  6. UPDATE name-of-table-you-found SET user_pass='MD5-string-you-made' WHERE ID = id-of-account-you-are-resetting-password-for; (actually changes the password)
  7. SELECT ID, user_login, user_pass FROM name-of-table-you-found; (confirm that it was changed)
  8. type Ctrl-D to exit the mysql client
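The MD5 hash can also be generated locally instead of on a third-party website - a sketch, with "password" standing in for the actual desired password:

```shell
# MD5 hash of the new password, computed locally
# ("password" is only a placeholder)
printf '%s' 'password' | md5sum | awk '{print $1}'
# prints 5f4dcc3b5aa765d61d8327deb882cf99
```

Note that printf (not echo) avoids hashing a trailing newline.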

Edit: A list of items to secure a wordpress site - https://www.hostinger.in/tutorials/how-to-secure-wordpress

Friday, June 23, 2023

open a password protected PDF when the password is not completely known

There was an interest statement from SBI for NK in his email. The name in the email was correct, and it came from cbssbi.info@alerts.sbi.co.in, so it was assumed to be genuine. NK thought he did not have any SBI account, so we needed to check the statement. The statement was supposed to be protected with the last 5 digits of the phone number followed by the date of birth in DDMMYY format. But no combination of phone numbers and dates of birth would open it.

Checking for tools to brute-force it, after many false starts with "free" offerings that are limited to 3-letter passwords (!), I finally landed on John the Ripper.

Just running it with the default settings was bound not to work, since the default password list did not have the particular kinds of numbers we wanted to check. Also, brute-forcing with an 11-digit mask indicated an ETA of a week or so. So, googling around, I found a nice post detailing how to customize JtR runs. The final steps were:

sudo apt-get install libssl-dev
git clone https://github.com/magnumripper/JohnTheRipper.git
cd ./JohnTheRipper/src
./configure && make
cd
JohnTheRipper/run/pdf2john.pl pdf_protected.pdf > pdf.hash

# created a file mywordlist with all relevant phone numbers
# created a file john-local.conf with the following contents
[List.Rules:myrule]
# append all possible dates DDMMYY
: $[0-3]$[0-9]$[0-1]$[0-9]$[0-9]$[0-9]
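The mask in that rule multiplies out to a small keyspace, which is why the run finishes so quickly - each wordlist entry gets only 4 x 10 x 2 x 10 x 10 x 10 candidate date suffixes (it over-generates invalid dates, but the count is still tiny):

```shell
# Candidate suffixes per wordlist entry for the DDMMYY mask
# $[0-3]$[0-9]$[0-1]$[0-9]$[0-9]$[0-9]
echo $((4 * 10 * 2 * 10 * 10 * 10))   # prints 80000
```

So even a few dozen phone numbers give only a few million candidates, versus 10^11 for a blind 11-digit mask.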

Then ran
JohnTheRipper/run/john --conf=john-local.conf --wordlist=mywordlist --stdout --rules:myrule >longlist
JohnTheRipper/run/john  pdf.hash --wordlist=longlist

and Hey Presto!

Instead of a week to brute-force 11 digits, the result was ready in less than a second.

AWS MFA and user experience

There was an email from Amazon Web Services (AWS), asking for a password reset due to inactivity, as well as adding multi-factor authentication (MFA) for security. The overall experience was good, with useful, clear, up-to-date documentation, unlike Azure/Microsoft.

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_passwords_change-root.html

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_mfa_enable.html

To use authenticator apps like Google Authenticator / Microsoft Authenticator etc,

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_mfa_enable_virtual.html



Thursday, June 22, 2023

403 error for apache server after copying over files to fresh instance

Even though permissions and file ownership were set correctly - directory ownership was set with
sudo chown -R www-data:www-data public_html
there were 403 errors for all pages of one of our websites running on Apache. 

The reason for the 403 errors in this case was that the Directory directive was being set for the wrong directory. For wordpress, for example, we would need

<Directory /our/public_html>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride All
        Require all granted
</Directory>

Tuesday, June 20, 2023

how cloud storage costs vary with usage

Some numbers which we crunched:

Azure blob storage, 1 TB:

Storage cost = $24 per month for 1 TB (I suppose they will reach 1 TB only after a year or so).
Data write cost is probably negligible, at $0.065 per 10,000 operations.
Data read cost = $0.005 per 10,000 operations.

Assuming 10k read requests per day, that would be
$0.005 * 30 = $0.15 per month.

Assuming 10k requests per hour (maybe if they scale up a lot), that would be
$0.005 * 30 * 24 = $3.60 per month.

So, total cost less than $30 per month estimated.

Assuming an average file size of 10 MB,
10k requests/day = 10,000 * (10 MB) * (30 days) / month = 3 TB/month.
10k requests/hour = 10,000 * (10 MB) * 24 * (30 days) / month = 72 TB/month.
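The traffic-volume arithmetic can be double-checked quickly (taking 1 TB = 10^6 MB, as in the figures above):

```shell
# TB/month at 10k requests/day and at 10k requests/hour,
# assuming 10 MB per request
awk 'BEGIN { printf "%.0f\n", 10000 * 10 * 30 / 1e6 }'        # prints 3
awk 'BEGIN { printf "%.0f\n", 10000 * 10 * 24 * 30 / 1e6 }'   # prints 72
```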

But what is missed out here is data transfer cost.

First 100 GB/month is free. But after that, $0.12 per GB. That would make the 3 TB data transfer cost = $360. (actually for 3.1 TB, counting the 100 GB free).

1.1 TB transfer = $120. If we set a data transfer budget of $60, i.e. $2 per day, we would get 600 GB of transfer (the 100 GB free tier plus $60/$0.12 = 500 GB).
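A quick sanity check of the egress arithmetic (first 100 GB free, $0.12/GB afterwards):

```shell
# cost of 3.1 TB of egress, and the GB a $60/month budget buys
awk 'BEGIN { printf "%.0f\n", (3100 - 100) * 0.12 }'   # prints 360
awk 'BEGIN { printf "%.0f\n", 60 / 0.12 + 100 }'       # prints 600
```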

AWS S3 object storage:

Pricing is similar - storage is cheap, but bandwidth is expensive. Using the calculator at https://calculator.aws/#/estimate

1 TB of data (assuming 15 MB average size, 64k objects) = $25 per month

10k put requests, 200k get requests, all less than a dollar.

Bandwidth, 10 TB = $1,120. So, for $60 data transfer budget, we get 535 GB of transfer. 

Dedicated server:

If a dedicated server is used, usually 10 TB of transfer is included in the monthly price of around Rs. 10k or $120.

Analog for Apache log processing

Our old favourite analog - the analog.cx domain no longer works; alternatives are the archive.org Wayback Machine copy, or analog.gsp.com or some such hosting provider.

Analog came in handy when there was a request for some log info from one of our servers.

https://serverfault.com/questions/25769/how-to-make-analog-to-parse-only-one-week-of-logs#25848

But on our server, there was only 14 days of apache log data being retained, so the request had to be handled using other techniques.

Images are inside
 /usr/share/analog/images
and need to be manually copied to the analog run directory for them to be properly linked in the generated HTML files. The simplistic configuration at /etc/analog.cfg:
 
LOGFILE /home/theuser/analog/logcopy/access.lo*
FROM 190101
TO 231231
# LOGFILE /old/logs/access_log.*
OUTFILE total.html
####
REQLINKINCLUDE pages
REFLINKINCLUDE *
REDIRREFLINKINCLUDE *
FAILREFLINKINCLUDE *
SUBBROW */*
SUBTYPE *.gz,*.Z
####
PAGEINCLUDE *.php
and all the default typeinclude definitions.



Monday, June 19, 2023

data transfer numbers

One of our Moodle websites wanted their data transfer numbers for the last two years. I tried analog, but that gave only 14 days of data, since the old logs were probably being deleted by logrotate. Then I got 30 days of data from Cloudflare, which they show for free plans, for all their sites put together - 


This shows 60 GB per month across all their websites. Analog's data of the uncached calls to one of their servers showed "Average data transferred per day: 1.98 gigabytes, over the last 14 days." Which again works out to around 60 GB per month. 
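Checking that Analog's per-day figure is consistent with Cloudflare's monthly total:

```shell
# 1.98 GB/day from Analog, over a 30-day month
awk 'BEGIN { printf "%.1f\n", 1.98 * 30 }'   # prints 59.4
```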

Another interesting data point: another of our sites currently pays around $35 a month for their approx. 1 TB of data stored on AWS S3, so their data transfer cost is only around $10 of that. Via calculator.aws, that means their usage is probably around 100 GB per month from S3. Or less, since they are likely to have more than the 1M GET requests in the estimate below.

Tiered price for: 1000 GB
1000 GB x 0.0250000000 USD = 25.00 USD
Total tier cost = 25.0000 USD (S3 Standard storage cost)
3,000 PUT requests for S3 Standard Storage x 0.000005 USD per request = 0.015 USD (S3 Standard PUT requests cost)
1,000,000 GET requests in a month x 0.0000004 USD per request = 0.40 USD (S3 Standard GET requests cost)
25 USD + 0.40 USD + 0.015 USD = 25.41 USD (Total S3 Standard Storage, data requests, S3 select cost)
S3 Standard cost (monthly): 25.41 USD
Inbound:
Internet: 10 GB x 0 USD per GB = 0.00 USD
Outbound:
Internet: 100 GB x 0.1093 USD per GB = 10.93 USD
Data Transfer cost (monthly): 10.93 USD
Total Monthly cost: 36.34 USD




Gimp gradients to fade

The goal was to fade out the sharp edges of high-res images used in a planetarium show. 

If we use a feathered selection, the feathering radius is limited to 100, so that method is only suitable for smaller images. What I used instead was layer masks.

Creating gradients on transparency is as simple as right-clicking on the layer, choosing Add Layer Mask (the layer mask is then automatically selected for editing) and then dragging a gradient from FG to BG - ensure that FG is white and BG is black.

Cumulative gradients - I created a new layer "from visible", added a block of black or background below that layer, and added the next gradient. Later I found that we can just Apply Layer Mask, then add a layer mask again, and so on - no need to create new layers as in the screenshot below!


creating a planetarium show with Gimp, Reaper, Blender, Openshot and other free tools

This might be a long post, with separate sections on each tool.

The Gimp - for working with still images; it can also do "convert to polar", which results in something useful for fulldome. 180-degree pans can be stretched to 270 degrees or so without too much degradation. Here is a separate post on Gimp alpha channel gradients to fade out the edges of high-res images.

Reaper - not free, but reasonably priced, and very capable DAW - Digital Audio Workstation - for multitrack editing of audio files. I've been using Reaper for a looong time - since 2010 or so.

Openshot - Just for quick and dirty creation of titles. Actually this was not required, Blender's Video Sequence Editor (VSE) can do all these and more.
Openshot - display frame number -
https://www.youtube.com/watch?v=5HyYj1eJH_o

Blender - I used Blender almost exclusively in its Video Sequence Editor (VSE) mode, only using the general purpose mode for creating the long "movie-style" scrolling credits as per this video

Blender's VSE had most of the features from Adobe Premiere and Adobe After Effects which I needed, though the final rendering is slow. Of course, I had worked extensively with Premiere and AE only around 20 years ago! Some short notes on how to do particular tasks:

Blender 3.4: Video Editing Tutorial For Beginners. - YouTube 

Transitions
https://www.youtube.com/watch?v=5WhlRQky90w
and a better way to do simple transitions - Click on first clip to select it, Shift click 2nd clip, Add -> transition -> gamma cross

Movie style end credits with long list of text file
https://www.youtube.com/watch?v=GACSUfqDl28
Very detailed movie style end credits, 75 minute video, if you want more control:
 
Some changes in Blender 3.4,

Delete default object
Delete default light
Move camera to z=5, x=0, y=0
Camera rotation to 0,0,0

Add --> Text from top menu of object mode
The Text Menu appears on top in Edit mode, for "Paste" or "Paste File"

For converting to Mesh, I used
Object --> Convert --> Mesh
with the default settings.

Materials - there is ready made Emit material
We need to set the Emit value to 18 or so for maximum brightness, not 1.

Similarly, for editing the World,
World --> Surface --> Emission
can be used for a green screen background or black background etc.

For editing the keyframes to make them linear,
(Extreme left) -> Graph Editor (Shift F6)
select both keyframes, then
Key --> Interpolation mode --> Linear.

For rotation, a good slow linear rotation speed has a slope of
400 frames === 10 (degrees?) on y axis.

Unfortunately, Indic scripts and other complex scripts were not rendering properly in Blender - https://github.com/oormicreations/VSEIndic
My workaround was to create the title in Google Docs or something, export as PDF, export as image, etc.

Blurring a face - animated masking with keyframes - https://youtu.be/NF_Q282V_uo

Free video stock footage is a thing - but very few useful clips. Youtube's CC filter and free videos and hi res images listed at
http://www.lss-planetariums.info/index.php?lang=en&menu=videos&page=free_shows
https://www.eso.org/
https://esahubble.org/
https://webb.nasa.gov/content/multimedia/images.html

Green screen chroma keying with Blender -
https://www.youtube.com/watch?v=dI1kMT3O6uk
the preferable, easier way, using the VSE

https://www.youtube.com/watch?v=_pPGR3d9WWc
uses the object view and nodes, has more control

Colour correcting a specific section in Blender VSE can be done with an adjustment layer strip - https://www.youtube.com/watch?v=yjON3i-890o

Lots of Blender resources and interesting reading:
https://github.com/agmmnn/awesome-blender
http://www.shadedrelief.com/natural3/pages/textures.html
https://github.com/search?q=blender+fulldome&type=repositories
https://blog.exppad.com/article/importing-actual-3d-models-from-google-maps

Edit: Also, lots of detailed info about tracking and animation in this video about how to create tracked callout titles in blender, https://www.youtube.com/watch?v=uT6qNVAKHzU


Saturday, June 17, 2023

Wordpress installation on Ubuntu 22.04 with S3 bucket/Azure Storage for content files

https://linux.how2shout.com/how-to-install-wordpress-on-ubuntu-22-04-lts-server/

Created user etc as per

https://hnsws.blogspot.com/2023/05/dot-net-core-ubuntu-linux-server.html

Apparently there were some unmet dependencies,
https://ubuntu.com/tutorials/install-and-configure-wordpress#2-install-dependencies

Still got the error "Please check that the mysqli PHP extension is installed and enabled." For fixing this, uncommented the mysqli extension line in
/etc/php/8.0/apache2/php.ini 

We need sudo rights to chown a resource to another user (one with different permissions from the user we are logged in as).

So, instead, chmod 777 the public_html folder for WP to be able to write to the configuration file. (The 27 June edit below has the better fix: chown to www-data.)

Connecting to an existing S3 bucket for auto-moving content files there, and all the advantages of using an S3 bucket -
https://www.codeinwp.com/blog/wordpress-s3-guide/

Using WP CLI to make changes like installing plugins, templates etc -
https://mediatemple.zendesk.com/hc/en-us/articles/360056149791-how-to-install-wp-cli
because otherwise, WP needs FTP or FTPS access - not SFTP by default. Apparently we can add SFTP support by
sudo apt install php-ssh2
https://stackoverflow.com/questions/53203050/how-to-use-sftp-instead-of-ftp-when-updating-wordpress-plugins

I may add more to this as the configuration progresses.

Edit: 27 June - 
Instead of WP-CLI, in order to facilitate plugin installation, updates etc. from the WP console, we just need to chown the WP files and the directory to www-data - not make their permissions 777.

If using Azure storage for content files, even though some how-tos indicate that using our own custom domain requires the use of Azure CDN, we can actually use Cloudflare to proxy traffic with our custom domain and https - we need to first disable proxying and verify the custom domain after setting the CNAME, at the Azure portal -> Storage Account -> Security + networking -> Networking, Custom domain tab. After verification, we can turn on proxying.

For getting the Primary Access key, we need to go to the Azure portal -> Storage Account -> Security + networking -> Access keys.

 



Tuesday, June 13, 2023

secrets found in repository email

There was an email reporting google services json as having the API key for a repository of ours. According to this,
https://github.com/firebase/firebase-android-sdk/issues/4425

the API key can be public. So no action taken.


Visual Studio's vs folder in git repositories

Thought of removing the .vs directory created by Visual Studio in every git repository opened by it - seems to be a pain,
https://stackoverflow.com/questions/30868544/gitignore-wont-ignore-vs-folder

So I'm just ignoring it for now.

Saturday, June 10, 2023

moodle app build blues and fix

After my previous post about building the Moodle app using GitHub Actions, Moodle app version 4.1.1 would not build with the same commands. I guess there were some compatibility issues with the versions of Cordova or Gradle or the SDK or ....

Anyway, the newly released Moodle app version 4.2.0 builds without any issues with this basic workflow. I had seen an announcement that cordova-android is going to default to the latest version in future, so the workflow did not have to specify cordova-android@12.0.0.

Edit: 13 Jan - This commit forces a change to the moodle.config.json format. The Moodle tracker does not have any documentation about the expected format. By trial and error, found that this format works.

"sites": [
    {
      "name": "This seems to be ignored",
      "url": "https://our.moodle.instance.org"
    }
  ],

setting up an opencv C++ project in Visual Studio 2022

This video has a good overview. https://www.youtube.com/watch?v=unSce_GPwto

To summarize the points in the video:

  • Install OpenCV, add dll to path if required, add OPENCV_DIR env var if required. If doing these, we would need to close and open the command prompt or Visual Studio for these to take effect.
  • Empty project (or Console project)
  • If right-clicking through Solution Explorer to "create new c++ file" inside Visual Studio, the file would be automatically added to the project.
  • Project -> (name of project) Properties, 
    Configuration Release in dropdown
    Platform x64 in dropdown
  • C/C++ -> General -> Additional Include directories -> %OPENCV_DIR%\build\include
  • Linker -> General -> Additional Library directories -> %OPENCV_DIR%\build\x64\vc15\lib
  • Linker -> Input -> Additional dependencies -> (add the lib file of opencv, like opencv_world480.lib for version 4.8.0)
     

My older post about Visual Studio 2019 with OpenCV and Spinnaker probably still works, with the appropriate build tools installed. Creating a props (properties) file containing the OpenCV dependencies is another possibility, which I have used in my projects. For example, this GitHub Actions build works fine with the latest windows runner, where Visual Studio 2022 is installed and where the older build tools are also installed.

Notes on failed builds on Visual Studio 2022:

Cannot find float.h


This is probably due to some path issue - for example, in one case the include path needed an extra opencv directory level.

Similarly, for VS19 upwards, there are some differences due to a different way of precompiled header handling - no need to add an include for windows.h as in earlier versions, as seen in the code used for the youtube example linked at the top of this post. 

Another issue with Visual Studio builds would be the SDK version(s) which are installed. An answer at this post points to the location in the configuration as
Project properties -> Configuration properties -> General -> Windows SDK version

If it just says latest, then we must install the latest SDK - now it is for Windows 11, but the version number is still 10.<something>. And after the install, we may need to restart Windows too. And then in VS, may need to do "Retarget solution" and then Rebuild.

 


Saturday, June 03, 2023

MIT's Scratch for teaching how to program

From a discussion about embedding a custom player in moodle, https://moodle.org/mod/forum/discuss.php?d=435456

https://scratch.mit.edu/about

Useful for teaching kids (or adults) programming, I guess. Drag and drop code elements like loops and so on. Getting started - https://www.youtube.com/watch?v=ssoRNCtmhVM

Creating an animation with scratch - https://www.youtube.com/watch?v=k4zMuBf-7Vs  - basically drawing everything and then looping - much easier for our needs to do it in Blender VSE with keyframes etc.

Thursday, June 01, 2023

throw distance and brightness numbers

To evaluate the suitability of some projectors for our planetarium theatre, calculated some numbers based on projectorcentral's calculator.

Our existing projector:

0.15 m, output = 0 cm vertical offset, 8 cm x 5 cm image.
20.0 m, output = 0 cm vertical offset, 1049x656 cm image = 1229 cm diagonal
28 nits brightness @ 20 m.
15.0 m, output = 0 cm vertical offset, 1037x648 cm image = 1229 cm diagonal
50 nits brightness @ 15 m. for wide angle
15.0 m, output = 0 offset, 658x411 = 776 cm diagonal
65 nits @ 1.0x telephoto zoom.

Highlights page shows 2.0 to 10.8 m Throw Range.

With that throw range, only NEC projector with zoom lens can fit @ 10k lumens.

with NP19ZL lens
15.0 m, output = 115 offset, 409x230 = 469 cm diagonal
243 nits @ 1.0x telephoto zoom.

If non-4K

Sony VPL-FHZ65B - 6000 lumens
@zoom = 1.0x
15.0 m, output = 0 offset, 650x406 = 766 cm diagonal
57 nits @ 1.0x telephoto zoom.

7300 lumens
66 nits.
$7000

Barco F80 4K12
via paulbourke
$26,000
w GLD Zoom R9801721 lens
@15 m
141 cm offset, 556 cm diag im, 176 nits.



0.15 m, output = 0 cm vertical offset, 9 cm x 5 cm image.

20.0 m, output = 0 cm vertical offset, 950x535 cm image = 1083 cm diagonal
26 nits brightness @ 20 m. @zoom=1.05

20.0 m, output = 0 cm vertical offset, 901x507 cm image = 1033 cm diagonal
28 nits brightness @ 20 m. @zoom=1.0

15.0 m, output = 0 cm vertical offset, 1037x583 cm image = 1190 cm diagonal
26 nits brightness @ 15 m. for wide angle @zoom=1.53x

15.0 m, output = 0 offset, 676x380 = 775 cm diagonal
49 nits @ 1.0x telephoto zoom.

TODO: check zoom on current position, distance to mirror and size on mirror. (noted below)

Current setup:
Distance to mirror centre ~ 80 cm
Width of image ~41 cm
Corresponding values on
= 1.16x zoom

@1.16x zoom, distance of 15.0 metres
== 760 x 427 cm, 872 diag, for 16x9, 52 nits.

Corresponding values on
= 1.14x zoom

@1.14x zoom, distance of 15.0 metres
== 768 x 432 cm, 881 diag, for 16x9, 40 nits.

If we adjust for a slightly smaller 871 diag as for the mitsubishi,
@1.12x zoom, distance of 15.0 metres
== 759 x 427 cm, 871 diag, for 16x9, still 40 nits.
 
(52-40)/52 = 23% dimmer. 
 
But we can get 23% dimmer/brighter just by changing modes from theatre to custom etc, and also when the lamp age increases. So, probably this projector may be OK. https://www.projectorcentral.com/Optoma-ZK507-W.htm