InfluxDB on Raspberry Pi

I found a blog post by Aymerick describing how to build InfluxDB on a Raspberry Pi. Here’s what I did to get it working.

Install prerequisites

$ sudo apt-get install -y bison ruby2.1 ruby-dev build-essential
$ sudo gem2.1 install fpm

Install gvm

This installs gvm for the current user:

$ bash < <(curl -s -S -L https://raw.githubusercontent.com/moovweb/gvm/master/binscripts/gvm-installer)
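
The installer puts everything under ~/.gvm and hooks itself into the shell startup files. If a fresh shell doesn’t pick gvm up automatically, sourcing its script by hand should do it (paths assume a default install):

$ source ~/.gvm/scripts/gvm
$ gvm version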

Setup Go

$ gvm install go1.4.3
$ gvm use go1.4.3 --default

Create an influxdb package set

$ gvm pkgset create influxdb
$ gvm pkgset use influxdb

Build InfluxDB

$ go get github.com/sparrc/gdm
$ go get github.com/influxdata/influxdb
$ cd ~/.gvm/pkgsets/go1.4.3/influxdb/src/github.com/influxdata/influxdb
$ gdm restore
$ go clean ./...
$ go install ./...

The ./package.sh command did not work for me, so I settled for the influxd and influx binaries in ~/.gvm/pkgsets/go1.4.3/influxdb/bin.
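
To sanity-check the build, the binaries can be run straight from that directory. The exact flags depend on the InfluxDB version, so treat this as a sketch:

$ cd ~/.gvm/pkgsets/go1.4.3/influxdb/bin
$ ./influxd config > influxdb.generated.conf   # dump a default config
$ ./influxd -config influxdb.generated.conf    # start the daemon
$ ./influx                                     # in another shell, connect with the CLI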

Limiting SSH access to certain users

There are multiple ways to limit SSH access to a machine. The one I’ve found most straightforward is to use PAM access rules. First, edit /etc/pam.d/sshd and uncomment the line:

account required pam_access.so

Next, edit /etc/security/access.conf

The following rules allow root from a local connection and deny everyone except members of the ssh group:

+:root:LOCAL
-:ALL EXCEPT ssh:ALL

With this in place, managing SSH access is a matter of tweaking the ssh group.
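
For example, creating the group and granting or revoking access for a (hypothetical) user alice looks like this — the group name just has to match whatever you used in access.conf:

sudo groupadd ssh              # only needed if the group doesn't exist yet
sudo usermod -aG ssh alice     # allow alice to log in over SSH
sudo gpasswd -d alice ssh      # revoke her access again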

Accessing end-of-life’d Ubuntu packages

Sometimes it’s necessary to keep a server running an end-of-life’d version of Ubuntu breathing just a bit longer. Because Canonical removes old packages from http://archive.ubuntu.com/ubuntu/, apt-get install will spew out messages like:

Err http://archive.ubuntu.com karmic-updates/main libpam0g 1.1.0-2ubuntu1.1
404 Not Found [IP: 91.189.88.45 80]

You can update /etc/apt/sources.list to use the URL http://old-releases.ubuntu.com/ubuntu/ instead of http://archive.ubuntu.com/ubuntu/ and get to those packages.
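
A quick way to make that switch in place is a sed one-liner over sources.list (assuming a stock file that only references archive.ubuntu.com and security.ubuntu.com), followed by refreshing the package lists:

sudo sed -i.bak \
    -e 's|archive.ubuntu.com/ubuntu|old-releases.ubuntu.com/ubuntu|g' \
    -e 's|security.ubuntu.com/ubuntu|old-releases.ubuntu.com/ubuntu|g' \
    /etc/apt/sources.list
sudo apt-get update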

Note that old versions of Ubuntu no longer receive security updates, so upgrading is the best way to go.

Using the Debian alternatives system

On Debian-based systems like Ubuntu it is possible for common programs to have multiple versions installed. The alternatives system is a way to manage which version is used when a command is run.

Vi, for example, can have multiple versions, and even variants such as vim, installed. Recently I custom-compiled vim in order to get CommandT working and realized that it broke if I ever ran vi instead of vim.

I fixed this by first adding /usr/local/bin/vim as an alternative for vi:

sudo update-alternatives --install /usr/bin/vi vi /usr/local/bin/vim 1

Then I selected it as the active alternative by running:

sudo update-alternatives --config vi
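
To double-check which target vi now points at, update-alternatives can list the registered alternatives and the current selection:

update-alternatives --display vi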

References:

http://www.infodrom.org/Debian/doc/maint/Maintenance-alternatives.html

Grabbing [and removing] remote files

While SSH’d into a remote machine, I often find that I need to grab a file to use locally. But I’m behind a firewall … and so is the server, which means I can’t scp the file directly to or from my machine. I resort to temporarily dropping the file on an intermediate machine accessible to both.

After doing this for a while I ended up with a lot of junk piled up on the intermediate machine, because I never cleaned up the files after copying them to my lappie.

Using a recently discovered rsync option, --remove-source-files, I created an alias which does the cleanup for me:

alias grabrm='rsync --remove-source-files'

Now, after copying the file to the intermediate machine:

remote_machine$ scp foo.txt intermediate:

I run the following command, which copies the file and removes it from the intermediate machine:

local_machine$ grabrm intermediate:foo.txt .

CommandT on Ubuntu 11.04 & 11.10

Vim has a pretty awesome fuzzy search plugin, named after the TextMate shortcut, CommandT. Installing it is fairly straightforward, but it crashes pretty badly on Ubuntu. After some reading I found out that this is because Ubuntu ships a version of vim with flaky Ruby support.

In order to recompile I installed the following packages:

sudo aptitude install python-dev ruby-dev mercurial ncurses-dev liblua5.1-0-dev lua5.1

I then followed Kresimir Bojcic’s instructions on building from the HG repo and ended up with vim in /usr/local:

hg clone https://vim.googlecode.com/hg/ ~/vim 
cd ~/vim
hg update -C v7-3-154
./configure --with-features=huge  --disable-largefile \
            --enable-perlinterp   --enable-pythoninterp \
            --enable-rubyinterp   --enable-gui=gtk2
make
sudo make install

With a new vim installed I followed the *command-t-installation* instructions and so far it has been stable.
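
For reference, the part that ties Command-T to the newly built vim is compiling the plugin’s C extension against the same Ruby that vim was linked with. Assuming a vimball install under ~/.vim, the build step looks roughly like:

cd ~/.vim/ruby/command-t
ruby extconf.rb
make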

Update 2012-01-16: After installing 11.10 I found that vim doesn’t have ruby support at all. Following the instructions above worked like a charm.

Hash buckets, rsync, and xargs magic

At work we have a couple of directories that are organized as two-deep hash buckets, totaling 65536 directories [1]. Traversing this structure, e.g. with find . -type f, takes ages, and it also causes rsync to use a lot of memory.

One way to solve this is to work on a single top-level directory at a time instead of all 256 at once (each containing 256 directories of its own). For example, this runs rsync once per directory, which dramatically decreases rsync’s workload and works pretty well:

for i in *; do rsync -a $i server:/path/to/dest/$i; done;

With xargs the serial loop above can be parallelized. The following keeps 8 rsync processes running at a time until all 256 directories have been copied over:

ls | xargs -n 1 -P 8 -I% rsync -a % server:/path/to/dest/%

I tried 32, then 16, then 8 parallel processes. In my case a -P value of more than 10 caused xargs to blow up while trying to create that many rsync processes. I haven’t figured out why, but it doesn’t really matter: with 8 running in parallel, the disk and network should be pretty well saturated anyway.

[1] The base directory has 256 directories, 00 – ff, each of which contains 256 more 00 – ff directories of its own: 256^2 = 65536 directories.
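
For the curious, here is a throwaway sketch that recreates that layout under a hypothetical buckets/ directory, just to make the scale concrete:

# creates 256 * 256 = 65536 leaf directories (slow, purely illustrative)
for i in $(printf '%02x ' $(seq 0 255)); do
    for j in $(printf '%02x ' $(seq 0 255)); do
        mkdir -p buckets/$i/$j
    done
done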