Friday, November 15, 2013

side-chaining - ducking in Reaper

Following the method given at AskAudio:


  1. Drag and drop from the io button of the controlling track (the one that will trigger the ducking) onto the track that will be affected by the compressor.
  2. In the receive's channel drop-down, change the destination from the default 1-2 to 3-4.
  3. On the compressor, set the Detector Input to Auxiliary Input L+R, and adjust the other settings to your taste.

Monday, November 11, 2013

shortcut to creating multiple videos on dome

I wanted to have a number of videos tiled on the dome - maybe six or nine videos on the front upper surface of our tilted dome. The classical method would have been to use Blender and a video texture, but that is very time-consuming, and my Blender skills are extremely rusty. Looking around for shortcuts, the first thing I thought of was the General quadrilateral transform filter in VirtualDub. But that filter does not support overlays - the background remains a single colour. The Reform and Reformer plugins for AviSynth did support overlays, but Reformer turned out to be buggy - the background video kept jumping forward and back, turning green, and so on - and Reform's output was too jagged and aliased.

So I used the General quadrilateral transform in VirtualDub, created a mask for each video, and used simple Overlay calls in an AviSynth script to composite all the videos:

lrmask=Imagesource("lower_right_white0.jpg",end=450)
lr=avisource("lower_right.avi")
and so on for the other clips and masks, where lower_right_white0.jpg is the mask for the lower-right clip (a white region marking where that clip should appear).



bg=overlay(bg,ul,mask=ulmask)
bg=overlay(bg,ll,mask=llmask)
bg=overlay(bg,um,mask=ummask)
bg=overlay(bg,ur,mask=urmask)
overlay(bg,lr,mask=lrmask)
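
Putting the pieces together, a minimal sketch of the whole script would look something like the following. Apart from lower_right, the filenames and the background clip name are my assumptions here, not the exact names used:

# Sketch only: apart from lower_right, the filenames are assumptions.
# Each *_white0.jpg is a mask painted in GIMP; each *.avi has already been
# warped into place with VirtualDub's General quadrilateral transform.
bg     = avisource("dome_background.avi")    # base video covering the whole dome
ulmask = imagesource("upper_left_white0.jpg", end=450)
ul     = avisource("upper_left.avi")
ummask = imagesource("upper_middle_white0.jpg", end=450)
um     = avisource("upper_middle.avi")
urmask = imagesource("upper_right_white0.jpg", end=450)
ur     = avisource("upper_right.avi")
llmask = imagesource("lower_left_white0.jpg", end=450)
ll     = avisource("lower_left.avi")
lrmask = imagesource("lower_right_white0.jpg", end=450)
lr     = avisource("lower_right.avi")

# Composite each clip onto the background through its mask;
# the last Overlay call is the script's return value.
bg = overlay(bg, ul, mask=ulmask)
bg = overlay(bg, ll, mask=llmask)
bg = overlay(bg, um, mask=ummask)
bg = overlay(bg, ur, mask=urmask)
overlay(bg, lr, mask=lrmask)

Each mask has to be the same frame size as the clip it gates, which is one more reason to paint the masks on a full dome-sized frame in GIMP.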

For getting the shapes right, I first tried trial and error using the warped mesh as a template. But that did not give me the shapes easily - the warped mesh has latitude and longitude lines, not rectangular ones, so it is not well suited to placing rectangular videos. Finally I thought of painting directly on the dome frame in GIMP: I put the mesh in as a background layer to align the frame to the dome approximately, then painted the desired rectangles on a new layer. Using those rough outlines, I noted the vertex co-ordinates as below.


lower middle video = 440x248 (aspect ratio 1.774)
lbotx= 740
ltopx= 740
lboty=1079 #default
rboty=1079 #default
rtopx=1180
rbotx=1180
rtopy= 832
ltopy= 832

upper middle video = 340x220 (aspect ratio 1.54)
lbotx= 800
ltopx= 800
lboty= 740  
rboty= 740  
rtopx=1140
rbotx=1140
rtopy= 520
ltopy= 520

lower right video 
lbotx=1264
ltopx=1264
lboty=1079 #default  
rboty=1079 #default  
rtopx=1560 #changed from 1600
rbotx=1600
rtopy= 780
ltopy= 832

calculated lower left
rbotx= 656
rtopx= 656
rboty=1079 #default  
lboty=1079 #default  
ltopx= 360 #changed from 320
lbotx= 320
ltopy= 780
rtopy= 832

upper right video
lbotx=1260
ltopx=1220
lboty= 770  
rboty= 730  
rtopx=1480
rbotx=1550 #changed from 1590
rtopy= 530
ltopy= 600

calculated upper left
rbotx= 660
rtopx= 700
rboty= 770  
lboty= 730  
ltopx= 440
lbotx= 370 #changed from 330
ltopy= 530
rtopy= 600

updating remote database with records in one field

We had to update the 'firstplayed' field of the schedule page database on our website. Our local database had the info. The way to do this seemed to be:

  • Export the filename/firstplayed fields from remote db as csv, 
  • import into local db as a table, create a view (run a join query - roughly the one sketched at the end of this post) to get the desired firstplayed field for each filename, 
  • export as csv, 
  • import into remote db, 
  • run an update query there. 
But Murphy's law was evident at every step. My database guru roommate Nz helped navigate the Murphy minefield. The remote db was MySQL, with a phpMyAdmin front-end; the local db was Postgres, with phpPgAdmin. (The import button shows up in phpPgAdmin only while viewing a particular table.)



  • phpMyAdmin's CSV export uses ; (semicolon) as the default separator, while phpPgAdmin's import needed , (comma) as the separator. 
  • The remote db often took a long time to respond since it is on shared hosting. 131 seconds to update 2000 records, for example.
  • phpPgAdmin demanded the field names as the first row of the csv.
  • phpMyAdmin, on the other hand, did not treat the first row as field names - it took that row as data to be entered, and also mixed up the fields: firstplayed was entered as filename and vice versa!
  • Since there were many streams, a filename could have multiple firstplayed matches in the local db, quite a few of them NULL. Just ignored those and used a NOT NULL condition - the Postgres syntax is IS NOT NULL, not <> NULL. :)
  • Had to apply left() to the firstplayed field, since the local db had a full timestamp while the remote db needed only the date. 
  • Telugustream has a lot of files yet to be updated in the temp_play_history table. So will have to do the above procedure again.
  • Finally, the update query syntax for MySQL was non-standard - Access style, as my db guru roommate put it. Instead of the usual update dt set df = st.sf from st where dt.jf = st.jf, the syntax was
    update `dt`,`st` set `dt`.`df`  = left(`st`.`sf`,10) where `dt`.`df` = `st`.`sf`
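
For reference, here is a rough sketch of the local-side join query. temp_play_history, filename and firstplayed are the real names mentioned above, but imported_filenames (the table created from the remote CSV) and the use of min() to pick one value per filename are assumptions for illustration, not the exact query we ran:

-- Local Postgres db: build the filename -> firstplayed list for export.
-- imported_filenames is a made-up name for the table imported from the remote CSV.
CREATE VIEW firstplayed_export AS
SELECT i.filename,
       min(h.firstplayed) AS firstplayed      -- one value per filename
FROM   imported_filenames i
JOIN   temp_play_history  h ON h.filename = i.filename
WHERE  h.firstplayed IS NOT NULL              -- Postgres wants IS NOT NULL, not <> NULL
GROUP  BY i.filename;

The view was then exported as CSV, imported into the remote MySQL db, and the MySQL-style update query above did the final step.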

Thursday, November 07, 2013

terminal credentials in the new internap portal

After Internap took over Voxel, the configuration portal was "upgraded". In the new portal, the ssh terminal credentials are shown when you click on your.servername.com, scroll down to device access, and click on the show link next to the ssh password. The confusing character in the middle of our password is a capital I (which could otherwise be read as a lowercase L or the number one).

Sunday, November 03, 2013

Sureyyasoft Shira Player mini review update

In my earlier post about the Shiraplayer, I had mentioned the warp quality issues it carried forward from its Stellarium base. Now the developer has added better warp functionality in the latest release, 1.5.1 - a custom warp data file module.

Initially, on using this version, I got blockiness, as in the screenshot below.


The developer suggested restarting the application after choosing the warp file and trying again; this time it worked fine. So, no more warp quality issues.

Unfortunately my PC (Athlon 1.9 GHz laptop) is not fast enough for realtime usage of shiraplayer with warping.

The Shiraplayer process with the new warping mode takes approximately as much CPU as the warping VLC player produced by Paul. But Shiraplayer runs two processes, one for the video and one for the other features, and the latter takes up nearly 50% CPU on my system. So Shiraplayer stutters even on videos which play OK on Paul's realtime player. For example, the 1.2K videos which take up 30% CPU on Paul's player, playing back at the full 30 fps, play back at only 15 fps on Shiraplayer.

But all this will probably not be an issue for those with more powerful PCs; a decently specced machine from the last couple of years should be able to handle it just fine.

Friday, November 01, 2013

video editing with After Effects

Since the Blender methods for cross-fades were quite cumbersome, I tried out After Effects instead - what I have is AE 4.1. For fades, you need to set opacity keyframes on the upper track. For previewing audio, you have to scrub the timeline with Ctrl pressed - audio doesn't play when the time controls are used. The menu items Composition -> Preview -> RAM preview or Audio preview can make the audio play. Will try this and check if it is better than installing my old copy of Adobe Premiere. 

video editing with blender

Following the steps given at this post. Basically, choose Video Editing in the Choose Screen Layout drop-down just to the right of the menus, and change the default Graph Editor window into a Properties window when you want to do the export.

Right-click and drag to move clips. K to cut. Shift + right-click on two clips to select both of them, then choose the bottom menu item Add -> Effect Strip... -> Gamma Cross to cross-fade. Enable AV sync if needed; the default is no sync.

And as this tip says, to move a view-port to a different monitor, shift+left click on the upper right corner of the view port to make it a separate window.

To get rid of a view port, as mentioned here, move the cursor near the border with the next area till it becomes a double-sided arrow, then right-click and choose Join Areas.

The scroll bar handles can be used to expand/contract the timeline view in the Video Sequence editor.




Mouse scroll wheel also zooms in and out. Middle mouse button to pan. Full details are at the Blender wiki.

Select a clip and hit K to cut. Right-click and drag on the arrows at either end of a clip to trim.



another audio cleanup tool-chain - Reaper for noise reduction

In previous posts, I've mentioned using Cool Edit Pro and Sound Forge for audio cleanup. For audio with a lot of hiss, noise reduction with Cool Edit Pro produces a lot of artifacts. Reaper's ReaFir filter in Subtract mode is useful in such situations. For an old recording taken from a VHS tape, the following steps were used.

NR and gating in Reaper,


Then sweetening the sound using the EQ setting "30-band Punch and Sparkle" in Cool Edit Pro,


and finally RMS normalizing to -10 dB RMS with Sound Forge, choosing regions of roughly equal loudness.



Edit: 13 Feb 2020 - adding a couple of videos to demonstrate choosing regions of equal loudness by visual inspection.


and then normalizing them with the keyboard shortcut Ctrl+Y to speed up the process.




taking screenshots on Mac OS X

This wikihow article has seven methods of taking Mac OS X screenshots.

Command-Shift-4  -> screenshot region of interest, saved on desktop

Command-Shift-3  -> grab full screen, saved on desktop

Command-Shift-4-Spacebar  -> grab an open window, by clicking on it, saved on desktop.

Adding Ctrl, as in Command-Ctrl-Shift-3 etc., copies to the clipboard instead.

The Grab application (Applications -> Utilities -> Grab) is also available. Some extra tools are also mentioned in the article.

more details of Reaper-BUTT broadcast setup

My post on the setup for live broadcasts using Jack OS X was a bit skimpy on the Reaper setup details. These screenshots below tell the story of the setup on the Reaper side.

I've got sends going to a full-mix recording track as well as to the broadcast track, so that I can use them without interruption even when I 'solo' any of the tracks to listen to that mic - the 'solo' happens by muting all other tracks in the mix which goes to the headphones (outputs 1-2). The mix which is recorded goes to outputs 3-4. I also have another Jack setup where output 4 gets a compressed signal from the broadcast channel below, which could be sent to the video team, for example.


On this broadcast track, I have a high-pass filter with its knee at 100 Hz to cut out rumble, an aggressive compressor, and the insert from Jack. The insert's return is used as the input for BUTT, via Jack routing.