I often want to run commands or scripts, e.g. Python scripts, and keep them running after my terminal session is closed.

There’s a handy tool to do this called screen.

It’s quick to install.

sudo apt-get install screen

Then, prefixing your command with 'screen' will run it inside the screen utility. There are a few things you need to know about using screen.

Ctrl a c – Creates a new window inside the screen session, so you can run more than one thing at once.
Ctrl a n – Switches to the next window (if you have more than one).
Ctrl a p – Switches to the previous window (if you have more than one).
Ctrl a d – Detaches from the screen session (without killing the processes in it).

For example (invoking the Python interpreter explicitly, since screen needs an executable command):

screen python path/to/pythonscript.py
Ctrl a d

The process is now running in the background. You can check that it is still alive by listing the active screen sessions:

screen -ls
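
To pick a session up again later, reattach with:

screen -r

If more than one session is detached, screen -ls shows their IDs, and you can pass the one you want, e.g. screen -r 12345 (the ID here is just an illustration).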

After the initial DHCP configuration is received, I set my virtual machines to a static address.

To do this:

sudo nano /etc/network/interfaces

and change the following from:

auto eth0
iface eth0 inet dhcp

to (in this example, the static IP is 192.168.1.10):

auto eth0
iface eth0 inet static
address 192.168.1.10
netmask 255.255.255.0
network 192.168.1.0
broadcast 192.168.1.255
gateway 192.168.1.1

Then issue the commands:

sudo ifdown eth0
sudo ifup eth0

Running ‘ifconfig’ should then display the correct results for eth0.

I was running into a frustrating permissions issue when trying to set up FTP access for various users. They couldn't follow a symlink to /var/www/sitefolder for their respective vhost.

I worked around this by making their www site folder the HOME directory for their user account. As each user exists purely for the website, this was an effective workaround.

useradd -d /var/www/sitefolder username

Now when logging in via FTP I am greeted with the respective www files and can upload and download accordingly.
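
If the account already exists, the same change can be made with usermod (a quick sketch reusing the example path and username from above; adding -m would also move the contents of the old home directory across):

sudo usermod -d /var/www/sitefolder username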

I'm finally getting round to monitoring what's going on with various network devices. I have been meaning to set up MRTG with SNMP for a while now. It's actually very easy with the following steps from the Ubuntu wiki:

sudo apt-get install snmpd
sudo apt-get install mrtg

Now that mrtg is installed, we must create a home where the web pages it generates can reside. The Ubuntu / apache2 default web root is /var/www/.

sudo mkdir /var/www/mrtg

Backup the original /etc/mrtg.cfg file:

sudo cp /etc/mrtg.cfg /etc/mrtg.cfg.ORIGINAL

Create a configuration file for MRTG:

cfgmaker snmp_community_string@ip_address_of_device_to_be_monitored > /etc/mrtg.cfg
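
For example, assuming a router at 192.168.1.1 with the (hypothetical) community string public, and noting that the shell redirection itself needs root:

sudo sh -c 'cfgmaker public@192.168.1.1 > /etc/mrtg.cfg'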

Create an index file for the webserver:

indexmaker /etc/mrtg.cfg > /var/www/mrtg/index.html

Wait about 5 minutes before browsing to:

http://server_ip/mrtg/index.html

I also repeated the above steps for each of my SNMP enabled devices, writing to a new config file each time.

I then made a new entry in /etc/cron.d/ to run the mrtg process every 5 minutes against each new config file.
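
A minimal sketch of such a cron entry, assuming a second config at /etc/mrtg-device2.cfg (a hypothetical name):

*/5 * * * * root env LANG=C /usr/bin/mrtg /etc/mrtg-device2.cfg

The user field (root) is required in /etc/cron.d/ entries, and env LANG=C stops mrtg complaining about non-C locales.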

I often need to package and compress a folder from the command line, for example, before running updates so I have a fallback position.

The following works for me:

tar -czf filename.tar.gz foldername

So when I had a folder named www, visible from the command prompt, and wanted to name the archive www-040713.tar.gz, the command was:

tar -czf www-040713.tar.gz www

c = create an archive
z = compress with gzip
f = write to the named file
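
To unpack such an archive later, x (extract) replaces c:

tar -xzf www-040713.tar.gz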

I was going round in circles trying to get WordPress permalinks to work on an Ubuntu server with Apache2 installed.

The .htaccess file was fine:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

However, I still got a 404 Page Not Found error when I went to siteurl.com/page-name rather than siteurl.com/?p=123.

In the end, it turned out I needed to make a tweak to the Apache virtual host .conf file for the site in question.

By default, in the <Directory> section it had:

AllowOverride None

This needed to be changed to

AllowOverride All
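
For context, a sketch of the relevant block, assuming the site lives at /var/www/sitefolder (a placeholder path):

<Directory /var/www/sitefolder>
    AllowOverride All
</Directory>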

A quick

service apache2 reload

And, hey presto, permalinks were working again.

I wanted to check that suPHP was working correctly and executing scripts under a specific user id.

This little PHP script did the trick.

<?php echo 'whoami = ' . exec('/usr/bin/whoami'); ?>

That echoed (output) the result of running /usr/bin/whoami, confirming which user the script was executing as.

I had a frustrating error for a while when using suPHP. It kept giving a 500 error after enabling mod_suphp for .php files:

Directory /var/www is not owned by user

Of course, the directory was not owned by the user. The user only owned a sub-directory, which was the whole reason for using suPHP.

It turned out that, to solve this, the directory /var/www itself had to be owned by the root user. That fixed the error.
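
That is a one-line fix:

sudo chown root:root /var/www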

I was configuring an Apache2 setup to work with suPHP and ran into various Internal Server Error 500s because folders and files had overly generous permissions on them. suPHP doesn't like that.

These two commands, when run from the directory where the web files reside, will recursively reset the permissions as required.

Find files and set their permissions to 644:

sudo find . -type f -exec chmod 644 {} \;

Find folders and set their permissions to 755:

sudo find . -type d -exec chmod 755 {} \;
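
To verify nothing was missed, these list anything that still deviates from those permissions:

find . -type f ! -perm 644
find . -type d ! -perm 755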