Speed Test Your iPhone

We are painfully reminded every day when we venture out of our homes that the iPhone does not have 3G technology. The EDGE network is brutally slow, and there is simply no way to sugarcoat that harsh reality. But wait! All hope is not lost, my fellow iPhoners, as we do have a little feature called Wireless Fidelity, better known as WiFi (and no, I didn’t need to Wikipedia that ;) ).

Many features on the iPhone, most notably Safari and Mail, are simply better to use the more bandwidth you have available. So it becomes important to know how much bandwidth is actually flowing between your iPhone and your wireless router. Enter iNetwork Test.

The Web site iNetwork Test exists to perform exactly this function: to tell you how fast your WiFi connection is currently running. The site is a piece of cake to use. Simply navigate to inetworktest.com, click the “Start Test” button, and then just wait. The quicker your connection, the quicker the test will complete. Once the test completes, you select whether you’re operating via WiFi or EDGE, and the Web site records your score.

My WiFi network speeds were as follows:

1st attempt – 925 kbps
2nd attempt – 940 kbps
3rd attempt – 841 kbps
4th attempt – 894 kbps
5th attempt – 910 kbps

Once your score has been recorded you can click on the “Results” button to see how your score relates to the average EDGE and/or WiFi speeds. At the time of this writing, the average EDGE speed is 208 kbps, and the average WiFi speed is 798 kbps, putting me a bit above the average — gotta love that fiber optic pipe. :)

And to the wise guys out there, yes, it can detect if you’re not using an iPhone:

[Screenshot: iNetwork Test result of 2659.3 kbps, flagged “Not an iPhone”]


My First Linux Shell Script

Last week I wrote about my Ubuntu installation. At the end of that post I promised some Linux-related posts in the future, and here is my first one.

In my opinion one of the things about Linux that makes it so powerful is the shell. And as powerful as the shell can be, it becomes even more powerful when you create a shell script to execute a multitude of commands. Nobody wants to type ten commands in a row in order to achieve a desired result. Further, nobody wants to type the same ten commands in a row every single day. Enter the shell script.
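
To make the idea concrete before diving in, here’s a tiny, purely illustrative sketch; the commands and paths are just examples and have nothing to do with the backup script below:

#!/bin/bash
# status.sh - illustrative only: a few everyday commands collected in one file
date                      # print the current date and time
df -h /home               # show free disk space on the /home partition
ls -l /home/user/web/dev  # list a directory we care about

Save that to a file, make it executable, and a single command now does the work of three.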

My first shell script started out as a bit of experimenting but has evolved into quite a useful utility. The end result is a script that automatically backs up the development area of partybody.com at a specified time each night, stores a copy locally on the server, and emails a copy to the relevant administrators.

Let’s take a look at the code (some values have been changed for security reasons):

1 #!/bin/bash
2
3 # "scriptname" - copies all contents of a directory into a time stamped tar.gz file in /path/to/backup/file
4
5 TD=$(date +%T-%d_%m_%Y)
6 FILE="/home/user/web/cms/backup-$TD.tar.gz"
7 DIR="/home/user/web/dev/"
8
9 # Create the gzipped archive file
10 tar -zcvf "$FILE" "$DIR"
11
12 # Send successful backup notification email
13 DEST="/web/cms"
14 FILENAME="backup-$TD.tar.gz"
15
16 echo "Automated backup ran successfully at $TD and created file: $FILENAME in FTP location: $DEST." | mutt -s "Daily Automated Backup Complete" -a "$FILE" -c cc_email@yoursite.com to_email@yoursite.com
17

To get an idea of what’s going on here, let’s examine this code line by line.

Line 1 simply tells the system to run the script using the BASH shell. Different shell environments allow for different scripting capabilities, so it’s necessary to declare which one you’re using on the first line of your script. Note: BASH is the default for most Linux distributions.

Line 3 is a comment. The # character at the beginning of the line tells the shell to ignore all content on that particular line. In this case we’re just using the comment to identify the name of the script and describe what it’s intended to do.

Line 5 creates our first variable, TD, and gives it the value of $(date +%T-%d_%m_%Y). Here we’re using command substitution to run the date command and format its output with the time, day, month and year in order to create a time stamp.
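
If you’re curious what that time stamp actually looks like, you can run the same date command straight from a terminal (the output shown below is just an example, of course):

date +%T-%d_%m_%Y
# prints something like: 03:00:01-15_11_2007

That’s the time, day, month and year, which is exactly the stamp that ends up in the name of each backup file.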

Line 6 creates a variable called FILE and gives it the full path of the backup file to be created. In other words, we’re specifying the filename we want to create and the location where we want to put it.

Line 7 creates a variable called DIR and gives it the value of the directory that we want to back up.

Line 9 is a comment declaring the action that we want to take next, in this case, creating the actual gzipped file.

Line 10 creates the archive and gzips it, taking the directory in $DIR as input and writing the compressed archive to the path stored in $FILE.
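
As a side note, if you ever need to inspect or restore one of these backups, tar can read the archive back just as easily. These commands are not part of the script, just a handy sanity check, with a placeholder path standing in for a real backup file:

tar -ztvf /path/to/backup-file.tar.gz          # list the archive contents without extracting
tar -zxvf /path/to/backup-file.tar.gz -C /tmp  # restore a copy under /tmp for inspection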

Line 12 is another comment declaring the final action the script will take, sending the email to relevant administrators.

Lines 13 and 14 are variables that contain values to include as descriptors in the notification email, and have no effect on the execution of the script.

Line 16 echoes the notification message, which becomes the body of the email. The program Mutt is used to send the message: it accepts the piped input from echo, and we specify a subject using the -s flag, an attachment using the -a flag, and a CC address using the -c flag. All of that together sends an email to the “to” and “cc” addresses with the subject “Daily Automated Backup Complete” and the file the script just created ($FILE, i.e. /home/user/web/cms/backup-$TD.tar.gz) attached.

And that’s it. Only 16 lines to back up any directory (including sub-directories), store a copy on your server, and email it to whomever you wish.

There is still one problem with this script, though. As it stands, it must be executed manually, and who wants to ssh into their server every time they want to run a backup? Therefore, we need to auto-schedule the script using cron.

Cron is a time-based scheduling service driven by a crontab, a configuration file that specifies shell commands to run periodically on a schedule. Basically, we just need to add a line to this crontab file telling the shell to execute our backup script at a time of our choosing.

In this case, I added the following line to my crontab file to execute the script every day at 3 a.m.:

0 3 * * * /path/to/script/scriptname
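
In case the crontab format looks cryptic, the five fields before the command are minute, hour, day of month, month, and day of week, with an asterisk meaning “every” (the line itself gets added by opening your crontab in an editor with crontab -e). Annotated, the entry reads like this:

# minute  hour  day-of-month  month  day-of-week  command-to-run
  0       3     *             *      *            /path/to/script/scriptname

In other words: minute 0 of hour 3, every day of every month, regardless of the day of the week.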

Also, one final note: be sure to make your script executable, or else it will not run. We can do this using chmod +x as follows:

chmod +x /path/to/script/scriptname

That’s all for now. You should be able to adapt this code into a backup script of your own. Feel free to ask any questions in the comments section below.


Ubuntu 7.10 Gutsy Gibbon: A Joy Once Installed

Tonight I made the switch from Fedora to the latest version of Ubuntu, 7.10, dubbed “Gutsy Gibbon.” I have long been a Fedora user over Ubuntu for the simple fact that Fedora worked better “out-of-the-iso” in terms of supporting my hardware, an old Sony Vaio VGN-S480P. However, given some recent issues with configuring the wireless adapter, coupled with the fact that I had installed an older version of Ubuntu a few years ago and wireless worked instantly, I decided to give it a go.

I downloaded the Ubuntu ISO and burned the image to a blank CD using Alex Feinman’s ISO Recorder. Everything up until this point worked like a charm and I booted up the Ubuntu Live CD to the desktop.

Older Ubuntu Live CDs had failed to even reach the desktop on this old Vaio, so already I was making much more progress than in the past. Once on the desktop, I was able to connect to my wireless network and I got a warm, fuzzy feeling inside that this would be the one. Then things took a turn south.

After testing the wireless, I double-clicked the “Install” icon on the desktop to put Ubuntu on the hard drive for good. Navigating through the on-screen menus and selecting my preferences was no problem; it was once the actual installation process started that problems arose.

Without getting into too much detail, the installer basically froze at 5% and wouldn’t budge. To make a long story short, the problem was with the ntfs-3g driver during the ext3 formatting step. After a rather extensive investigation into the Ubuntu support archives and The Google, I discovered a few solutions which somehow, some way fixed the issue. I’m not sure exactly which command got the installer to stop hiccuping and complete the installation, but the two I used were:

killall -9 <ntfs-3g process>

rm -rf *

Maybe I needed only one, maybe both, I have no idea. But I do know that now it works, and I encourage anyone having similar problems to use these commands as a starting point for your research into a fix.

Once I got past that, the rest of the install was a breeze. The computer booted up no problem and has been running like an absolute gem for a few hours now.

The GUI Add/Remove Software application is simply awesome, and made getting Flash, MP3 support, and many other items a breeze.

Now that Gutsy Gibbon is up and running, you can expect some Linux-related posts in the future. For now I just wanted to check in with you all and say a few words about my installation troubleshooting experience.

To end with, here is a screen shot of my current Ubuntu desktop:

[Screenshot: my current Ubuntu desktop]
