Archive for category Stuff

Server alive test?

What’s the main problem with running a server at home? The 50% uptime figure… (OK, it’s not that bad 🙂)

One of my Zoneedit passwords expired the other day and I forgot to change it on the server (duh)…

So – I created a really simple cron script to check that all the expected services are up at each domain I host. It does this by logging in to a remote server (assuming you have SSH access to some external machine) and attempting to connect back to each domain on a particular port. I check for SMTP and HTTP.

And here it is:
[sourcecode language=’sh’]
#!/bin/bash
#
# Script to test if all the domains are up and working.
# It opens a connection to a remote server and tries to connect
# back to the given port of each domain. If the connection fails or times
# out, this script prints an error to stderr and returns non-zero.
#
# To use, make this file executable and make a cronjob to run it or place
# the script in /etc/cron.hourly.
#
# Wilson Waters 20100824
#

# domains to test (space separated list)
DOMAINS="mail.mydomain.com mail.someother.domain."

# remote server to test from
SERVER=""

# port to try and connect to
PORT=25

# timeout on connect after this many seconds
TIMEOUT=5

# user to connect to the remote host as
USER="root"

#-----------------------------------------------------------------------------

error=0
for domain in $DOMAINS
do
    command="ssh $USER@$SERVER nc -z -w$TIMEOUT $domain $PORT"
    output=`$command 2>&1`
    if [ $? -ne 0 ]; then
        error=1
        echo "Connect error $domain:$PORT ($output)" >&2
    fi
done

exit $error
[/sourcecode]
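
For reference, a minimal crontab entry to run it hourly could look like the following – the script path and admin address are placeholders, and cron’s MAILTO makes sure anything written to stderr gets mailed to you:

[sourcecode language=’sh’]
# /etc/crontab entry (hypothetical path and address - adjust to suit)
MAILTO=admin@mydomain.com
0 * * * *  root  /usr/local/bin/check-domains.sh
[/sourcecode]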

UPDATE
I recently updated this script to be a little more useful and robust. It now sends mail directly from the script (rather than via cron) and allows you to set a port for each host.

[sourcecode language=’sh’]
#!/bin/bash
#
# Script to test if all the domains are up and working.
# It opens a connection to a remote server and tries to connect
# back to the given port of each domain. If the connection fails or times
# out, this script mails an error report to the configured recipient.

# domains to test (space separated list; append :port to override the default)
DOMAINS="mail.alintech.com.au www.alintech.com.au:80"

# remote server to test from (leave empty to test from this machine)
SERVER="somedomain.net"

# default port to try and connect to
DEFAULT_PORT=25

# timeout on connect after this many seconds
TIMEOUT=5

MAIL_RECIPIENT=blablabla@alintech.com.au

#-----------------------------------------------------------------------------

for domain in $DOMAINS
do
    # work out the port
    port=$DEFAULT_PORT
    IFS=':' read -ra addr <<< "$domain"
    if [[ ${#addr[@]} = 2 ]]; then
        domain=${addr[0]}
        port=${addr[1]}
    fi

    # try to connect, either directly or via the remote server
    if [ "X$SERVER" == "X" ]; then
        command="nc -v -z -w$TIMEOUT $domain $port"
    else
        command="ssh $SERVER nc -v -z -w$TIMEOUT $domain $port"
    fi
    output=`$command 2>&1`
    res=$?
    output="${output}\nTime: `date`"
    if [ $res -ne 0 ]; then
        echo -e "$output" | /usr/bin/mail -s "Connect error $domain" $MAIL_RECIPIENT
    fi
done
[/sourcecode]


WiFi location vulnerability

Interesting video from the Black Hat conference on “hacking” the Google WiFi location service.

http://www.securityweek.com/hacker-uses-xss-and-google-streetview-data-determine-physical-location

The basic gist of it is:

  1. Create a malicious HTML page which tries to load local router status pages in hidden iframes – e.g. http://192.168.1.1/router.asp?status, http://10.0.0.1/index.shtml, etc.
  2. Use JavaScript to parse the DOM tree and read the status page text (including the WiFi router’s MAC address).
  3. Send the MAC address back to a server (via JSON/POST, etc.).
  4. Perform a lookup on the Google location service using the discovered MAC address.
  5. Google returns the location of the discovered MAC address.
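
Steps 4 and 5 boil down to a single web request. As a rough sketch (not necessarily the exact service used in the talk), here’s what the lookup looks like against Google’s documented Geolocation API – the API key and MAC addresses below are placeholders, and note the API generally wants at least two access points:

[sourcecode language=’sh’]
# Hedged sketch only - YOUR_API_KEY and both MAC addresses are placeholders.
curl -s -X POST \
  "https://www.googleapis.com/geolocation/v1/geolocate?key=YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"considerIp": false,
       "wifiAccessPoints": [
         {"macAddress": "00:11:22:33:44:55"},
         {"macAddress": "66:77:88:99:aa:bb"}
       ]}'
# on success, returns something like:
# {"location": {"lat": ..., "lng": ...}, "accuracy": ...}
[/sourcecode]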

Of course, this only works if the user’s browser’s same-origin policy allows parsing the DOM tree of the iframe, which I think most browsers would disallow…

I went looking at my home router status page and, sure enough, it shows the WiFi MAC address on the status page which is accessible on the local network without logging on.


Google street view dissection

I felt like working out how Google Street View does its stuff today – so here goes!

Here’s the image of the street view interface we’re dealing with

Street View web page

Time to open up wireshark and see how it gets these images!

Looks like there are lots of tile requests to the server at cbk2.google.com:

GET /cbk?output=tile&zoom=3&x=4&y=1&cb_client=maps_sv&fover=2&onerr=3&v=4&panoid=0JTH3YvHt93HUiWezPIRhg HTTP/1.1\r\n

Parts of interest are the zoom, x, y, and panoid parameters. I presume x and y are the tile locations within the entire panorama and zoom is the zoom level (0 to 3 in this case). The panoid must be a specific identifier for the street view image you’re looking at.

Putting this into a browser retrieves this tile:


sv tile 1
click image to go to the google version

Nice! Now for a play – let’s change the x and y values to (0,0).


street view tile at (0,0) zoom 3

Hmm… don’t know where that is. Let’s look down a little at (0,1).


street view tile at (0,1)

Ahh! There’s something interesting. Still, 0 on the x axis doesn’t seem to be “forward”. Let’s look at the tile to the left, which is at (6,1).


street view tile at 6,1

That explains it… Looks like there’s some overlap at the join. And – taking a closer look at this tile shows that the join is actually on the “back” of the image (as it was captured), because there’s a one-way arrow on the ground. Perhaps tile 0 is actually aligned to west? Not sure on this one.

Ok, time to play with zoom. Here’s zoom=4 at (0,0)


street view zoom 4

A value of 4 doesn’t seem to work – it just gives a black tile. I think that’s because this area (in Perth, WA) was only covered at a fairly low resolution this time around (it looks like they used Ladybug2 cameras for this pass).

Well, here’s zoom=2 at (0,0) then.


sv7

That’s more like it. Interestingly, the join point on the x axis is still the same (i.e. the left of tile 0).

Now for zoom=1


street view zoom=1

and zoom=0


street view zoom=0

So there we go.

Just for fun I stitched all the zoom=1 tiles to generate this

street view zoom=2 stitched
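
If you want to reproduce this, a rough fetch-and-stitch sketch is below. It assumes the 2×1 tile grid I’d expect at zoom=1 (that part is a guess) and uses ImageMagick’s montage to do the stitching:

[sourcecode language=’sh’]
#!/bin/bash
# Sketch: fetch and stitch the zoom=1 tiles for one panorama.
# Assumes a 2x1 tile grid at this zoom level; requires curl and ImageMagick.
PANOID="0JTH3YvHt93HUiWezPIRhg"
for x in 0 1; do
  curl -s -o "tile_${x}_0.jpg" \
    "http://cbk2.google.com/cbk?output=tile&zoom=1&x=${x}&y=0&panoid=${PANOID}"
done
# stitch left-to-right into a single panorama
montage tile_0_0.jpg tile_1_0.jpg -tile 2x1 -geometry +0+0 stitched.jpg
[/sourcecode]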

Definitely looks like an equirectangular projection.

Out of interest I thought I’d also estimate the total size of each full-sized image: about 750kB. This assumes a tile size of 32kB, with 21 full tiles (21 × 32kB = 672kB) plus 7 quarter tiles at the bottom (7 × 8kB = 56kB). Call it 1MB per image including the lower zoom levels, multiplied by the number of street view images over the world… equals a lot!



Google WiFi data collection

Google have caused quite a fuss after their accidental collection of unencrypted WiFi data.

“Communications Minister Stephen Conroy has lashed out at Google, accusing the internet giant of the single biggest breach of privacy in history.”

Here’s the official Google response.

I posted an article in February last year on how/why the Street View cars collect WiFi data, and I think it’s quite a clever idea. Basically, Google holds a database of WiFi MAC addresses collected by the Street View car and users of their Latitude application. This database is then used to provide location information to users without a GPS device (i.e. on your laptop).

Google aren’t denying they collect WiFi access point information. Their mistake was storing unencrypted WiFi data. Every WiFi access point broadcasts information about itself, including a unique identifier (the MAC address), which is what Google want. WiFi also broadcasts all the data being transferred – such as internet banking passwords and emails – meaning anyone within a reasonable distance can collect this private information. To prevent this, most WiFi links use encryption to scramble the data, making it unintelligible to anyone who doesn’t know the secret password. It wouldn’t matter if Google collected encrypted WiFi data because they can’t decrypt it.

So really, if you use an unencrypted WiFi link you only have yourself to blame. An analogy for unencrypted WiFi communications would be sitting in your house with the windows open, yelling your internet banking details over a megaphone for all your neighbors to hear. An encrypted WiFi link is slightly better because you’d be yelling the details in your own made-up language. I think Google are the least of your concerns.

It will be interesting to see what comes of the various government investigations into Google’s privacy breach. Maybe here in Australia Stephen Conroy could add google.com to the proposed internet filter.



int overflow

I haven’t posted anything for a while as the result of a bad case of integer overflow.

Many months ago, I thought I’d dust off the old photo gallery and upload some more photos. After numerous failures I realized PHP has a POST limit of only 4MB and I was trying to upload 5MB+ photos. This took me quite a while to work out, as it failed without an error. I thought a good solution was to up the limit to something big – 2GB should be more than enough for anyone!

When that still didn’t work I just gave up and posted the photos somewhere else.

Little did I know that I’d just broken almost everything PHP on that server. It took a couple of months before I finally realized that the max size of an int on this platform is about 2 billion (a 32-bit signed int)… and post_max_size = 2048M in php.ini actually wraps around to a negative value. duh.
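
If you ever want to check for this, a quick sanity test from the shell (assuming the PHP CLI is installed) is:

[sourcecode language=’sh’]
# PHP_INT_MAX shows the platform's int ceiling; compare it against
# post_max_size before setting anything huge in php.ini.
php -r 'var_dump(PHP_INT_MAX, ini_get("post_max_size"));'
[/sourcecode]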

So back to it!


Themingly great

I finally spent a bit of time browsing the WordPress themes. How does this one look?

I’ve also put a link to my album in the photo gallery which used to be hosted at this address. Not too sure about the random image block in the sidebar yet, though. I can’t figure out how to restrict it to random images from my album only, so you randomly see everyone’s photos.

Now, to update the photo gallery! I’ve decided to host my photos here, rather than giving away all rights to Facebook.



Programming for dummies

I just stumbled across Scratch.

Ok, it’s meant to be for kids, but I like the idea. It’s basically a visual programming environment for creating animation and music.

I found this via a post on the Google Research blog about “Android App Inventor”. They’re planning to make a similar graphical block-based programming environment for writing Android apps.

I like the idea of visual programming. The traditional code-compile-run-test-stop-change-compile-run sequence is a little boring and time consuming. I want to change something “on the fly” and see what it does.

This programming style often seems to be associated with education – Greenfoot and Alice are two examples. If I were a real engineer, I might find myself playing with LabVIEW for programming microcontrollers and whatever else real engineers do.

But anyway. A couple of years ago I started working on a visual programming environment for creating video/audio/frame based processing applications – FraMMWorks. The idea is that you have a collection of building blocks (video sources, face detection, edge detection, etc) which you join together to generate an overall video processing “chain”. You can then save the resulting code as an executable and run it anywhere.

If there’s any interest I might just spend some more time on it.



Google Real-estate

Browsing through this morning’s articles I came across a blog post for the new Google Maps Real Estate layer.

It’s a really nice visualisation of what is currently for sale or rent. And the best part – it works for Perth!

Here’s an example for some typical “first-home-buyers” in the Perth area. It’s interesting to see clusters north, south and east and some areas with no houses at all.



Energy efficiency Microsoft style.

Microsoft’s latest technology –  Hohm – is ready to save the world from global warming.

It sounds, ugh, confusing.

The general concept behind it is good. In fact, it seems similar to Google’s offering – PowerMeter – which borrows the catchline:

If you cannot measure it, you cannot improve it. — Lord Kelvin

I can’t see how these “applications” will work yet, though. It all seems a bit abstract and idealistic. The main issue is that someone needs to make a buck. Who wants to pay? The energy providers? Unlikely – they’ll lose money through reduced energy usage anyway. End users? Well, yeah, but why would I want to pay for something I already know?

I suppose in essence it’s really about re-educating people. Which means it’s not going to make anyone any money. If companies such as Microsoft and Google have a spare few mil to throw at this – good on ’em. But it seems a little fishy to me.

And for something amusing:
“Later on, Microsoft intends for the service to grab data from programmable thermostats and so-called “smart plugs” to provide better real-time information. It all sounds well and good until the day we’re inevitably forced to call Microsoft’s activation center after adding a deck to the house”
http://www.theregister.co.uk/2009/06/24/microsoft_launches_hohm/



Backup using rdiff-backup

I’ve been using rdiff-backup to automate my linux backups for a while now. I just recently made some improvements to my cronjob which performs a nightly backup and thought I’d share it here.

Rdiff-backup is a great application which wraps around rdiff to back up entire directories from one place to another. It’s very efficient too: rdiff only transfers the changes to files, which makes it great for performing offsite backups.
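
For anyone who hasn’t used it, the raw commands look something like this (the host name and paths are made up):

[sourcecode language=’sh’]
# back up a local directory to a remote machine over SSH
rdiff-backup /home/wilson user@backuphost::/backups/wilson

# restore a file as it was 10 days ago
rdiff-backup -r 10D user@backuphost::/backups/wilson/notes.txt /tmp/notes.txt

# drop increments older than 62 days
rdiff-backup --remove-older-than 62D user@backuphost::/backups/wilson
[/sourcecode]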

Only problem is that it’s still not easy enough to use – so I wrote a script to *really* automate things.

Here’s how to get it working:

  1. Install rdiff-backup on the local and remote computers (apt-get install rdiff-backup for Debian/Ubuntu; build for Windows, build for OS X, or build from source for anything else).
  2. Copy this script to /etc/cron.d/backup (you’ll probably have to rename that downloaded file)
  3. Create configuration directories /etc/backup/ and /etc/backup/hosts/
  4. Create the main configuration file /etc/backup/backup.conf with your settings using this template
    [sourcecode language=’sh’]
    # This is the config for a script which uses rdiff-backup to perform a
    # nightly backup of various servers to this computer. It should be called
    # via cron and should be configured to send stderr to a sysadmin.
    #

    #==============================Config section==================================

    # local directory to store backups
    BACKUP_DIR=/usr/data/backups

    # location of remote host specific backup details
    HOST_CONFIGS_DIR=/etc/backup/hosts

    # Number of days to keep backups for. If unset, backups will be kept forever
    KEEP_BACKUP_DAYS=62

    # parameters to pass to rdiff-backup
    RDIFF_BACKUP_PARAMETERS="--force"

    # set to 0 for debugging information to be suppressed
    DEBUG=1

    #================================End Config Section============================
    [/sourcecode]

  5. Create a configuration file for each host you want to back up. This must be named in the format “backup.hostname” – e.g. backup.hammer.cs.curtin.edu.au or backup.beans.ath.cx. You can refer to the localhost through its name too (e.g. backup.mycomp). These must be placed in /etc/backup/hosts/
    [sourcecode language=’sh’]
    # Config options for backing up remote computers.
    # The filename must be in the format of
    # backup.hostname.domain

    # The directory to start the backup at.
    START_DIR="/"

    # Directories to backup. Use a space to separate.
    # INCLUDE_DIRS takes preference,
    # i.e. if we include /usr/data/wilson but exclude /usr/data, everything under
    # /usr/data/wilson will be backed up, but /usr/data/tim won't be.
    INCLUDE_DIRS="/usr/data/wilson/data"

    EXCLUDE_DIRS="/usr/data /proc /cdrom /tmp /mnt /sys /home/squid /home/www/gallery2"
    [/sourcecode]

  6. Create an SSH key with no passphrase which will allow you to log on to the remote computer without typing a password (we will be automating this with cron, remember) – see the sketch after this list. This link gives some details.
  7. Create a cronjob to run the script.
    30 4 * * *     root    /etc/cron.d/backup
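
The key setup in step 6 is just a couple of commands (the key path and host below are examples):

[sourcecode language=’sh’]
# generate a passphrase-less key and install it on the remote host
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
ssh-copy-id -i ~/.ssh/id_rsa.pub root@remotehost
[/sourcecode]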

Your remote computers should now be automatically backed up every night!

I haven’t actually tried this with Windows or Mac OS X computers, but I believe it can be made to work.

