Category Archives: Linux

Using Rsync to Copy Remote Files to Local Machine on OS X / Linux

The following uses rsync to copy files from a remote server to your local machine, keeping the two synchronized. This is useful for things like backing up important directories on your web server to your external hard drive or elsewhere.

Rsync can incrementally synchronize files between two locations, whether local or remote. This means only the files that have changed on your remote server are transferred to your local machine.

From here on in I assume you’re using OS X, but this should work on Linux as well. Press CMD+Space and open a Terminal via Spotlight (I use TotalTerminal). At the command prompt, just use the following:

#!/bin/sh
rsync -a -e "ssh" --rsync-path="sudo rsync -vau " remoteusername@remote.host.com:/home .
rsync -a -e "ssh" --rsync-path="sudo rsync -vau " remoteusername@remote.host.com:/var .
rsync -a -e "ssh" --rsync-path="sudo rsync -vau " remoteusername@remote.host.com:/etc .

This will copy /home, /var, and /etc from your remote machine to the current directory.
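Rsync’s incremental behavior is easy to see locally before pointing it at a server. A quick sketch — the /tmp paths here are made up purely for the demonstration:

```shell
# Sync a dir, then sync again -- the second pass transfers nothing.
mkdir -p /tmp/rsdemo/src /tmp/rsdemo/dst
echo "v1" > /tmp/rsdemo/src/file.txt
rsync -a /tmp/rsdemo/src/ /tmp/rsdemo/dst/
# -i itemizes changes; empty output means the two dirs are already in sync.
rsync -ai /tmp/rsdemo/src/ /tmp/rsdemo/dst/
```

Adding -n (--dry-run) to the real commands above previews a transfer without writing anything.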

Bash Tips for Power Users

Every Geek site needs an obligatory Bash Tips post

Copy Files Securely Between Two Machines

I always used to forget the syntax for this, until I realized that it’s exactly like the standard cp command. In fact, you can use scp to copy files on your local machine, just like you normally would. The following are equivalent:

$ cp file file.orig
$ scp file file.orig

Where they differ is, scp lets you copy files over a network, through SSH. Here’s an example:

$ scp contents.txt silver@ssh.domain.com:/tmp

This will copy local file contents.txt to /tmp on the remote machine ssh.domain.com, as user silver. Here are some more examples:

$ scp draft.pdf ssh.domain.com:

(copy draft.pdf to my home dir on the remote machine. The username is implied to be the same locally and remotely.)

$ scp swine.jpg rex@ssh.domain.com

(read: this will copy swine.jpg to the local machine as a file named rex@ssh.domain.com. To make it go remote, append a : to the address, like above.)

scp supports, among other things, compression (-C) and recursive copying of directories (-r).

$ scp -rC code/ ssh.domain.com:/archive/code_02032009

Trying to copy to a directory you don’t have permission to (/usr etc) will fail.
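scp copies remote-to-local too; just put the remote path first. The host below is a placeholder, and since scp without a colon behaves exactly like cp, the last two lines are runnable locally:

```shell
# Remote -> local (placeholder host):
#   scp rex@ssh.domain.com:/tmp/contents.txt .
# No colon anywhere, so scp falls back to a plain local copy, just like cp:
echo "hello" > /tmp/contents.txt
scp -q /tmp/contents.txt /tmp/contents.copy
```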

Don’t Get Lost Jumping To and Fro Between Directories

You can use cd - to jump to the previous (NOT parent) dir. For example:

kiwi@localhost: ~ $ cd /usr/local/share
kiwi@localhost: /usr/local/share $ cd -
/home/kiwi
kiwi@localhost: ~ $ cd -
/usr/local/share
kiwi@localhost: /usr/local/share $

Another way is using pushd/popd – A Last In First Out (LIFO) stack of dirs.

kiwi@localhost: ~ $ pushd /usr/local/share/
/usr/local/share ~

pushd is like cd but keeps note of the current dir before cd’ing into a new one. The stack of dirs is listed every time you invoke pushd (the “/usr/local/share ~” output you see above.)

kiwi@localhost: /usr/local/share $ pushd /
/ /usr/local/share ~

Stack is ordered left to right, latest push first. If we pop the first dir off:

kiwi@localhost: / $ popd
/usr/local/share ~
kiwi@localhost: /usr/local/share $

We’re back in the share dir. We can keep popping until there’s nothing left (throws an error):

kiwi@localhost: /usr/local/share $ popd
~
kiwi@localhost: ~ $ pushd /lib
/lib ~
kiwi@localhost: /lib $ popd
~
kiwi@localhost: ~ $ popd
bash: popd: directory stack empty
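pushd and popd also work non-interactively, so the session above can be condensed into a bash script (the directories here are arbitrary):

```shell
cd /tmp
pushd /usr > /dev/null   # stack: /usr /tmp
pushd /var > /dev/null   # stack: /var /usr /tmp
dirs                     # prints: /var /usr /tmp
popd > /dev/null         # pops /var, drops us back in /usr
pwd                      # prints: /usr
```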

Working with Long Lines

No need for more Bash shortcut cheat sheets, but here are some useful ones to help you work with long lines.

You can jump to the start & end of a line using CTRL+a & CTRL+e respectively. Example (* is the cursor):

kiwi@localhost: ~ $ echo al the ducks are swimming in the w*

and you want to fix the first word. You can hop to the beginning of the line with CTRL+a:

kiwi@localhost: ~ $ *echo al the ducks are swimming in the w

and now you can jump to the end of the misspelled word “al” using CTRL+Right twice to correct it:

kiwi@localhost: ~ $ echo all* the ducks are swimming in the w

Now ctrl+e to jump to the end of line:

kiwi@localhost: ~ $ echo all the ducks are swimming in the w*

Instead of backspacing every character, use ALT+Backspace to delete entire words. You can also delete all or part of a line: CTRL+u deletes everything before the cursor, and CTRL+k wipes out everything after it. I’ve developed a habit of using CTRL+e CTRL+k to delete lines.

Bash has a lot of ALT commands that let you move around and manipulate words. ALT+l and ALT+u will make the word in front of the cursor lowercase or uppercase, for example. A neat one I don’t think I’ve ever used is ALT+\. It pulls everything after the cursor left to the first non-whitespace character. Here’s an example, * is the cursor:

BEFORE:

$ my     spacebar is    *sticky

AFTER (ALT+\):

$ my     spacebar issticky

Avoid Retyping Commands & Arguments

ESC+. is very useful. Escape followed by a period inserts the last argument you passed to a Bash command. If the previous command was invoked without any arguments (popd, ls, etc.), the command name itself is inserted.

Example, unzipping a file and moving the archive to /tmp:

$ unzip archive-with-a-long-ambiguous-name-03092009-5960-1.2.5.zip
$ mv archive-with-a-long-ambiguous-name-03092009-5960-1.2.5.zip /tmp

In the mv command, the archive name was inserted by pressing ESC+. (the full command being mv (ESC+.) /tmp). There was no need to type the long archive name twice.

The argument is taken from your bash history. You can keep invoking ESC+. to cycle back through all your recent command arguments. (history -c to clear)

Try not to forget this; you’ll naturally find plenty of uses for it.
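A scriptable cousin of ESC+. is Bash’s special variable $_, which holds the last argument of the previous command and works even inside scripts:

```shell
mkdir -p /tmp/a_long_directory_name_nobody_wants_to_retype
cd "$_"   # $_ expands to the mkdir argument above
pwd       # prints: /tmp/a_long_directory_name_nobody_wants_to_retype
```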

Another way to avoid re-typing commands is CTRL+R. It will initiate a search of your command history. Begin typing, and watch Bash try to complete your command from previous ones you entered.

Command Getting Too Big? Send it to your Editor

Sometimes you begin writing what you think will be a simple command, only to realize that it has grown too complex for the command line, and you wish you were in your text editor.

First make sure your default editor is set. This is either in $EDITOR (export EDITOR=/usr/local/bin/vim) or elsewhere depending on the distro.

Use “fc” to open the last executed command in your editor:

ls -paul --sort=size
... ls output ...
fc

Now the ls line will be open in your editor. But what if you hadn’t executed the command yet? No problem. You’re sending off an email, but quickly realize that the command line isn’t ideal for everything:

echo -e "Dear Santa, \n\n\tIt has become evident that your fat ass is contributing to Global Warming, primarily due to the large quantity of coal you distribute annually. We hereby

No matter where you are on the line, hit CTRL+x, CTRL+e to invoke your editor, which now contains what you were typing on the cmd line.

I always find myself wanting to finish a command in vim, but unwilling to type the first few lines over, especially when I’m trying to write a for loop or any ugly multiline Bash code.

IMPORTANT: Whatever you type in your editor is executed automatically after you quit the editor.
Continue reading Bash Tips for Power Users

How to Block AIM’s Annoying ‘AOL System Msg’ in Pidgin

The following plugin for Pidgin will block the incredibly annoying and useless notifications from AOLSystemMsg on AIM.

“AOL System Msg: Your screen name (mrEman) is now signed into AOL(R) Instant Messenger (TM) in 2 locations. Click here for more information.”

To use it, paste the code below into a file, save the file as blockaolsystemmsg.pl in ~/.purple/plugins/, then open (or re-open) Pidgin, go to Tools -> Plugins (or press CTRL+U), and enable “Block AOLSystemMsg.” That should be it!

If you’re having any trouble, try going to Help -> Debug to open up Pidgin’s debug console.

#!/usr/bin/perl
# BlockAOLSystemMsg plugin tested on Pidgin 2.5.5. Put in ~/.purple/plugins/ and enable
use Purple;
our $target = 'AOL System Msg'; # case-insensitive
our $plugin_name = 'Block AOLSystemMsg'; 

%PLUGIN_INFO = (
  perl_api_version => 2,
  name => $plugin_name,
  version => "0.1",
  summary => "Blocks the screen name 'AOL System Msg'",
  description => "Ignore annoying 'your SN has signed on at 2 locations' AIM message",
  author => "Isam ",
  url => "http://biodegradablegeek.com",
  load => "plugin_load",
  unload => "plugin_unload"
);

sub loginfo { Purple::Debug::info($plugin_name, " @_\n"); }
sub minimize {
  my $r = lc($_[0]);
  $r =~ s/ //g;
  return $r;
}

sub plugin_init { return %PLUGIN_INFO; }

sub plugin_load {
  my $plugin = shift;
  $target = minimize($target);
  loginfo("Sight set on '$target'");
  Purple::Signal::connect(Purple::Conversations::get_handle(),
                          'receiving-im-msg', $plugin, \&callback, '');
}

sub plugin_unload {
  my $plugin = shift;
  loginfo('Block AOLSystemMsg Unloaded.');
}

sub callback {
  my ($acc, $sender, $msg, $conv, $flags) = @_;
  if (minimize($sender) eq $target) {
    loginfo("(BLOCKED) <$sender> $msg");
    return 1;
  }
}

update: Fixed the botched code. Thanks.

I Can’t Live Without My vim Config

I have updated the vim page with my vimrc/gvimrc configs. Instead of repeating myself, I will quote some parts of the page ..

More details and the vim config itself here

I recommend turning backups on if you have them off. I personally hate having the ~ files all over my OS, so I keep them, along with the .swp files, in one backup dir in ~/.vim/

The programming language skeleton stuff will detect what files you are editing and change options in vim by inheriting the specified files which I put in ~/.vim/skeletons and ~/.vim/inherit.

The skeletons are automatically inserted in new files that vim is aware of. For example, in my own config, I have ~/.vim/inherit/c which has all the usual includes and int main() code. When I make a new C file (“gvim hello.c”), the new file begins with the skeleton code already present. Neat huh?

The inherit files can be used to set specific options for each language. This can mean different bindings, whitespace options, themes, etc depending on what language you’re working with, automatically.

See the vim page

What options have helped you the most?

Burning Xbox 360 Games on Linux (Stealth!)

xbox360-bg
You could run ImgBurn in Wine, or probably burn the games in VirtualBox running Windows, but that’s no solution… you’re reading this because you want to burn Xbox 360 games on Linux using native tools. It’s surprisingly easy!

The games are usually an ISO file, along with a little DVD (.dvd) file that tells the burner to use a layer break value of 1913760. This file is not necessary in Linux (or Windows) as we will be telling the app to use that break value explicitly.
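The .dvd file is plain text, so you can read the layer break straight out of it. Here’s a sketch that recreates a typical one (this follows the usual ImgBurn-style format; the filename is made up) and extracts the value:

```shell
# Recreate a sample .dvd file and pull the LayerBreak value out with sed
printf 'LayerBreak=1913760\nkfc-gamex.iso\n' > /tmp/sample.dvd
sed -n 's/^LayerBreak=//p' /tmp/sample.dvd   # prints: 1913760
```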

I will go into detail on how to setup what you need. If you’re impatient, you might wanna skip the setup and jump straight to the quick recap.

Extract the ISO

cd /games/360/GameX
rar x kfc-gamex.part01.rar

If you don’t have rar (“winrar”) installed, lookie:

The program 'unrar' can be found in the following packages:
 * unrar-free
 * unrar
Try: sudo apt-get install <selected package>

You can also download it from rarlab.com.

Now we need to see if the game is stealth/valid. This is done using an app that runs natively on Linux (and OS X) called abgx360.

Install abgx360

abgx360-linux

Download the tar.gz files from http://abgx360.net/download.html. The TUI is nice. Don’t bother getting the GUI for abgx360.

tar -zxvf abgx360-1.0.0.tar.gz
cd abgx360-1.0.0/
./configure && make
sudo checkinstall -D

(You may use ‘make install’ but this is not recommended on Debian/Ubuntu. checkinstall keeps your shit organized.)

If ./configure fails with an error about wx-config/wxWidgets, make sure wxWidgets is installed..

apt-cache search wxgtk2 

and make sure wx-config is in your PATH. On Ubuntu Intrepid, it wasn’t. Find it and make a symlink to something in your path.. i.e.,

locate wx-config # (finds it in /etc/alternatives/wx-config)
sudo ln -s /etc/alternatives/wx-config /usr/bin/wx-config

Rerun ./configure/make/checkinstall

If you downloaded the local database (abgx360-data) from the site above, install it now: just extract and move the .abgx360/ dir into your ~/

Checking ISO CRC/SS – Is the game stealth?

abgx360 -af3 kfc-gamex.iso

The -af3 flag will automagically fix/patch the ISO should it encounter any problems.
What abgx360 does is check the ISO’s CRC against an online (or offline, ~/.abgx360/) database. It might begin by updating its database. If this is a problem (no net connection), pass it -localonly

When that’s done…

Burning the ISO Using growisofs

Making sure the dual layer DVD is in your drive, run the following command:

# growisofs -use-the-force-luke=dao -use-the-force-luke=break:1913760  -dvd-compat -speed=4 -Z /dev/burner=kfc-gamex.iso

I commented it out so you don’t accidentally execute it when pasting. Let’s look closer at this command…

The break:1913760 is the layer break, which you’ll find in the .dvd file. If for whatever reason you can’t check the .dvd file, just use this value.

Set your speed to something low. Some say 2.5x but I have no problems burning at 4X (my max is 8X). You don’t need to know the lowest speed your burner can go. Just set it to 2-4 and you’ll be fine.

Set /dev/burner to your own device. It’s probably /dev/scd0 or /dev/scd1, or it may already have a symlink like /dev/dvd6, /dev/dvd, etc.

Try grepping dmesg to find your device. i.e.,

dmesg | grep "LITE"

This might give you some information but probably nothing too helpful:

sudo dvdrecord -scanbus

To see if you have the right device, try ejecting it.

eject /dev/dvd6

Set the kfc-gamex.iso to whatever the name/path of your ISO is (case sensitive of course).

Now I usually begin with a dry run. By passing -dry-run to growisofs, it will proceed as normal but quit before writing anything to disk. Actually, it kind of just spits out a command and dies. Awful design! i.e.,

$ growisofs -dry-run -use-the-force-luke=dao -use-the-force-luke=break:1913760  -dvd-compat -speed=4 -Z /dev/burner=kfc-gamex.iso
Executing 'builtin_dd if=kfc-gamex.iso of=/dev/burner obs=32k seek=0'
$ 

So the above is good. Now remove the -dry-run flag to proceed with the actual burn.

growisofs -use-the-force-luke=dao -use-the-force-luke=break:1913760  -dvd-compat -speed=4 -Z /dev/burner=kfc-gamex.iso

Find something to do, or just stare at the screen. After about 20 minutes (at 4X), you’ll see the burn end successfully with output like this:

 7798128640/7835492352 (99.5%) @3.8x, remaining 0:06 RBU 100.0% UBU  99.8%
 7815495680/7835492352 (99.7%) @3.8x, remaining 0:03 RBU  59.7% UBU  99.8%
 7832862720/7835492352 (100.0%) @3.8x, remaining 0:00 RBU   7.9% UBU  99.8%
builtin_dd: 3825936*2KB out @ average 3.9x1352KBps
/dev/burner: flushing cache
/dev/burner: closing track
/dev/burner: closing disc

You’re done!


Quick Recap


Assuming you installed all the dependencies above, here’s a quick recap of what needs to be done to burn a game.
It really takes about 1 minute to begin the process. Write a shell script if you like.

cd GameX_REGION_FREE_XBOX360_KFC/
rar x kfc-gamex.part01.rar # Extract game ISO 
abgx360 -af3 kfc-gamex.iso # Checks if rip is valid/stealth/ss patched
growisofs -use-the-force-luke=dao -use-the-force-luke=break:1913760  -dvd-compat -speed=4 -Z /dev/burner=kfc-gamex.iso
eject /dev/burner # When burn is done, eject & play. 

Gentoo Sucks, Ubuntu Doesn’t.

I used Gentoo for a few years, and at first I loved it, mainly because of portage. But the only distro I had experience with before Gentoo was Slackware, where I used to install packages and dependencies manually, so you can see why Gentoo was so appealing to me.

When I first began my new job, the only distro available was Ubuntu, which deep down I hated without any real reason. I guess I saw Ubuntu as being “too user-friendly” and Mandrake-ish: bloated and sluggish. But 10 minutes into using it, I decided that as soon as I got home, I was wiping out Gentoo and installing Ubuntu.

You Learn From Compiling Apps Yourself

This is somewhat true, but I don’t believe it applies to Gentoo/portage. There’s nothing educational about watching shit scroll across the screen. None. If you want a real learning experience, try Slackware or Arch. You’ll learn if you’re forced to figure out what an app depends on, and what the most efficient compile flags are for your system. With Gentoo, the app is being compiled from scratch, but you aren’t doing any work, or research, for that matter. Running 1 command and then grabbing a bite while you wait for portage to do all the work for you isn’t going to teach you more than installing an RPM.

Gentoo’s installation isn’t going to teach you much of anything either, except maybe that patience is a virtue. The Gentoo docs are great, but each step is spoon fed to you. You’re basically copying and pasting commands so you can compile all the necessary files to get you started. After installation, Gentoo is as user-friendly as Ubuntu, even if it doesn’t seem like it at first.

Compiled Apps Are More Efficient Than Packages

Maybe. Prebuilt packages are usually compiled independently for each arch, and are already optimized, probably by people way more experienced in the field than you. Compiling your own apps can be slower if you don’t know what you’re doing, and even if you optimize your portage compile flags, the performance difference between a prebuilt package for a specific arch and an app compiled on that arch is minimal. There are too many drawbacks to compiling every app from scratch to make this tiny (and largely theoretical) performance boost worth it. Continue reading Gentoo Sucks, Ubuntu Doesn’t.

Quick BASH Script to Dump & Compress a MySQL Database

A quick script I whipped up to dump my MySQL database.
Usage: sh backthatsqlup.sh

(be warned that it dumps ALL databases. This can get huge uncompressed)


#!/bin/bash
# Isam (Biodegradablegeek.com) public domain 12/28/2008
# Basic BASH script to dump and compress a MySQL dump

out=sequel_`date +'%m%d%Y_%H%M%S'`.sql
dest=/bx/

function e {
  echo -e "\n** $1"
}

e "Dumping SQL file ($out). May take awhile..."
#echo "oh snap" > $out
sudo mysqldump -u root -p --all-databases > $out
if [ $? -ne 0 ]; then
  e "MySQL dump failed. Check that server is up and your username/pass"
  exit 7
fi

e "Uncompressed SQL file size"
du -hs $out

e "Compressing SQL file"
gz=$out.tar.gz
tar -zvvcf $gz $out
rt=$?

if [ $rt -ne 0 ]; then
  e "tar failed (error=$rt). Will NOT remove uncompressed SQL file"
else
  e "Removing uncompressed SQL file"
  rm -f $out
  out=$gz

  e "Compressed SQL file size"
  du -hs $out
fi

e "Moving shit to '$dest'"
sudo mv $out $dest

download BackThatSqlUp.sh
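If you’re curious what the script’s naming and compression steps actually produce, here’s the same sequence run on a dummy file (paths and contents are placeholders). Restoring a real dump is just mysql -u root -p < file.sql after extracting:

```shell
cd /tmp
out=sequel_$(date +'%m%d%Y_%H%M%S').sql
echo "-- fake dump --" > "$out"
tar -zcf "$out.tar.gz" "$out"   # compress, as the script does
rm -f "$out"                    # the script removes the uncompressed copy
tar -zxf "$out.tar.gz"          # extracting gives the .sql back
cat "$out"                      # prints: -- fake dump --
```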

RescueTime’s 22 Gigabyte notifier.debuglog Log File

I did find it weird that I kept running out of disk space recently. That hasn’t happened in years, and most of my big files go on another HD. On top of that, this box has been sluggish lately, even taking into account the fact that it’s ~4 years old and I always have 5-6 desktops filled to the brim.

I finally found the culprit. RescueTime‘s (unofficial) Linux client keeps a log of every single window that has gotten focus, EVER. I figured this would be cleared when the notifications were sent out, but apparently it wasn’t. My failed/ dir is nearly empty, so I know the notifications are getting sent out. The file is named ~/.rescuetime/tmp/notifier.debuglog
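If you’re hunting for a runaway file like this yourself, find will flush it out. A sketch (the demo file is sparse, so it occupies no real disk space; for real use, adjust the path and threshold, e.g. find ~ -type f -size +1G):

```shell
# Create a sparse 2 GB file, then list anything over 1 GB beneath the dir
mkdir -p /tmp/bigdemo
truncate -s 2G /tmp/bigdemo/huge.log
find /tmp/bigdemo -type f -size +1G   # prints: /tmp/bigdemo/huge.log
```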

It might be that the client only clears the log when the app is closed? That sucks, because I don’t shutdown or reboot (or log out of X for that matter). Aside from the handful of kernel-update reboots (yeah yeah I could just init level down to preserve my uptime), I literally haven’t kept my PC off since 2006.

I don’t mind the disk space, but how the hell are you opening, seeking and writing to a 22gig+ file literally every single time focus is switched? I swear to the Gods I was one CC digit away from ordering a Mac.

notifier debug log rescue time picture

RescueTime is a great service/app, but I’ll keep the client off until this is fixed or they release an official client. No offense to the guys working on the Linux client (<3), especially considering it’s probably their pet project, and I’ve had no other problems with it thus far. Hell, maybe the debug log could have easily been turned off.

I’m using version 90 (newest release as of 11/16). This may have already been fixed in trunk. I’ll check/submit a bug report… eventually.

https://launchpad.net/rescuetime-linux-uploader

Mephisto for the Masses – Installation HOWTO

I’ve recently taken a fancy to Mephisto, a blogging-platform written in Rails. I have nothing against WordPress, but being in Ruby and using Liquid for themes, Mephisto is far easier (and more fun) to tweak and configure, especially when I want to migrate my sites away from the “blog look” and make them more dynamic.

It’s unfortunate that development isn’t as active as that of, say, Typo (also a Rails app, but I haven’t tried it), but I find that Mephisto at its current level makes a simple and fast starting point for most of my projects.

The point of this post is to address numerous problems with the installation. These are present in the tarball release of 0.8 Drax, and in trunk (as of 10/21).

Git The Code

Get the files, either the compressed archive or from edge (recommended).

git clone git://github.com/technoweenie/mephisto.git

Pre-installation

You’ll need to freeze rails 2.0.2, and have the latest tzinfo gem installed:

gem install tzinfo 
cd mephisto/ 
rake rails:freeze:edge RELEASE=2.0.2

The file it downloads should be named rails_2.0.2.zip and NOT rails_edge.zip.

Copy the “new” boot.rb into the config/ folder, overwriting the existing one:

cp vendor/rails/railties/environments/boot.rb config/boot.rb

Now rename the database sample file in config/ to database.yml and edit it to fit your own DB settings. You’ll probably only be using production.

Bootstrapping

Now bootstrap:

rake db:bootstrap RAILS_ENV=production

If it works, GREAT. But you’ll probably get an error or two. If you’re getting the following error:

Error message:
  undefined method `initialize_schema_information' for module  
  `ActiveRecord::ConnectionAdapters::SchemaStatements'
Exception class:
  NameError

You forgot to copy over boot.rb from vendor/rails/ – scroll up. If you’re getting an error that redcloth is missing (no such file to load -- RedCloth-3.0.4/lib/redcloth), even though it’s in vendor/, it’s because the path to RedCloth is relative in config/environment.rb. Change it from:

require '../vendor/RedCloth-3.0.4/lib/redcloth' unless Object.const_defined?(:RedCloth)

to

require File.join(File.dirname(__FILE__), '../vendor/RedCloth-3.0.4/lib/redcloth') unless Object.const_defined?(:RedCloth)

Running

After the bootstrap, you may either start the server (ruby script/server, thin, mongrel, etc), or go with mod_rails (Phusion Passenger). I recommend the latter – Passenger is amazing, and the error screen is pretty.

Just point your Apache2 vhost to Mephisto’s PUBLIC/ dir. Here’s an example:


<VirtualHost *:80>
   ServerAdmin mrEman@domain.com
   ServerName domain.com
   ServerAlias www.domain.com

   # DocumentRoot must be rails_app/public/
   DocumentRoot /home/kiwi/www/domain.com/public
   RailsEnv production

   DirectoryIndex index.html index.htm index.php
   ErrorLog /home/kiwi/www/domain.com/log/error.log
   CustomLog /home/kiwi/www/domain.com/log/access.log combined
</VirtualHost>

Restart Apache2, and you’re done. The site should work right away. If you get the following error:

No such file or directory - /tmp/mysql.sock

It’s because the socket file resides somewhere else on your (host’s) distro. Just find it (man find, locate, etc) and add a symlink to it. Here’s an example (Debian):

ln -s /var/run/mysqld/mysqld.sock mysql.sock

If you’re getting an error that gems you know you have aren’t found, like:

no such file to load -- tzinfo (MissingSourceFile)

it’s because the gems are not located anywhere Ruby checks. You’ll have to explicitly pass Ruby -rubygems or require ‘rubygems’ — what a nuisance. Open config/environment.rb and add the latter line:

# requires vendor-loaded redcloth
require 'rubygems'

This will be global. Now either restart the server you ran (i.e., thin), or tell mod_rails to restart the app. To do so, just create a file named “restart.txt” in the tmp/ folder of the RAILS app:

cd mephisto_root/
touch tmp/restart.txt

and refresh the page. Passenger will restart the app and restart.txt will vanish.

The default login for the /admin page is admin/test. Wasn’t that a blast?

Script to Quickly Setup WebApp Environment and Domain

Just sharing a script I wrote to quickly deploy WordPress (and eventually a few other webapps) sites, which somebody might find useful. This uses Linode‘s API* to add the domain name to the DNS server along with some subdomains. If you’re using another server (Slicehost, your own, etc), you can alter the dns class to use that API, or just ignore the DNS stuff completely; it’s optional.

This will be updated periodically as I refactor and add support for more apps (notably Joomla and Clipshare – though this would violate their terms unless you have the unlimited license). This was written primarily because I couldn’t stand setting up another vhost and WordPress installation. There are plenty of existing deployers but I plan on adding very specific features and tweaking this for in-house work. I also wanted to try Rio (Ruby-IO). GPL license. Go nuts.

* As of 10/11, the apicore.rb file on the site has some syntactic errors in the domainResourceSave method. I sent an email out to the author about it. Problems aren’t major. You can get my apicore.rb here.

This won’t run unless you create the appropriate folder structure in /etc/mksite/. I’ll get going on this in a bit. See the code below:

Continue reading Script to Quickly Setup WebApp Environment and Domain