Archive for the ‘Bash’ Category

Pseudothreading with BASH

January 20, 2010

It’s more of a trick than real threading, but it beats repeating the same operation serially. Some explanations:

TH_NUM=`ps aux | grep Python | grep -v "grep" | wc -l`

  • TH_MAX is the maximum number of “threads” that can be executed at the same time.
  • The first grep selects the processes that run Python (adjust the pattern to match your own script)
  • The second grep excludes the grep command you issued above from the count 😉
  • wc -l counts the matching lines. When no workers are running yet, the pipe output is empty, so wc gives “0” as result.

#!/bin/bash
TH_MAX=10
for sample in `ls ./data`
do
    while true; do
        TH_NUM=`ps aux | grep Python | grep -v "grep" | wc -l`
        if [ "$TH_NUM" -lt "$TH_MAX" ]
        then
            ./analyze_sample.py -s ${sample} > /dev/null &
            echo -en " ${sample} "
            break
        else
            echo -en "."
            sleep 1
        fi
    done
done
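Counting Python processes with ps also counts Python processes you didn’t start. A tidier variant (my sketch, not the original) bounds concurrency using the shell’s own job table; `worker` here is a placeholder standing in for the real per-sample command:

```shell
#!/bin/bash
# Sketch: bound concurrency with the shell's job table instead of ps|grep.
# "worker" stands in for the real command (e.g. ./analyze_sample.py).
TH_MAX=4
worker() { sleep 0.3; }            # placeholder job

for sample in a b c d e f g h; do
    # Wait until fewer than TH_MAX background jobs are running.
    while [ "$(jobs -rp | wc -l)" -ge "$TH_MAX" ]; do
        sleep 0.1
    done
    worker "$sample" &
done
wait    # block until every remaining job finishes
echo "all done"
```

Since `jobs` only sees children of this shell, other Python processes on the machine no longer skew the count.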


Decode command line arguments given to a BASH script

August 25, 2009

Try the elegant way, at last!

For instance, imagine that your program takes the following arguments:
./myprog -s /source/directory -d /dest/directory -c deep

Then, the script myprog should contain a piece of code like this:


if [ $# -eq 0 ] ; then
    echo "Usage: $0 -s source_dir -d dest_dir -c copy_mode"
    exit 1
fi
while [ $# -gt 1 ] ; do
    case $1 in
        -s) source_dir=$2 ; shift 2 ;;
        -d) dest_dir=$2 ; shift 2 ;;
        -c) copy_mode=$2 ; shift 2 ;;
        *) shift 1 ;;
    esac
done

Of course, this is only a hint. Just tailor the decoding according to your needs.
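For short option sets like this one, bash’s built-in getopts does the same decoding with less bookkeeping. A sketch using the same three options:

```shell
#!/bin/bash
# Sketch: the same -s/-d/-c decoding with bash's built-in getopts.
# A ":" after a letter means that option takes a value (in OPTARG).
while getopts "s:d:c:" opt; do
    case $opt in
        s) source_dir=$OPTARG ;;
        d) dest_dir=$OPTARG ;;
        c) copy_mode=$OPTARG ;;
        *) echo "Usage: $0 -s source_dir -d dest_dir -c copy_mode" >&2
           exit 1 ;;
    esac
done
echo "$source_dir $dest_dir $copy_mode"
```

getopts also rejects unknown options for free, which the hand-rolled loop silently skips.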

sed one-liners

March 24, 2009

Handy one-liners for the UNIX stream editor sed.
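Two of the classics, as a taste (my examples, not from the original list):

```shell
# Delete blank lines from a stream:
printf 'one\n\ntwo\n' | sed '/^$/d'

# Substitute the first occurrence of a pattern on each line:
echo 'hello world' | sed 's/world/sed/'
```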

Using ssh forwarding to retrieve papers instead of VPN

November 5, 2008

So far I had been using a VPN to connect to my university network whenever I needed to download a paper from home, but that meant temporarily losing my normal network connection, with all the attached hassle.

Today I discovered a much simpler method:

1) Open a ssh port forwarding to the remote machine (university server with IEEE or ACM subscription) in this way:

ssh -D 8080 -N <username>@<server address> &

where -D opens a dynamic (SOCKS) proxy on the given local port, and -N avoids opening a remote shell.

2) Download the FoxyProxy Firefox extension, and configure it to use a SOCKS proxy on localhost:8080.

3) Add rules to FoxyProxy so that the proxy is active only when needed (*.ieee.*, *.acm.* etc…).

Thanks to Timo Reimann for having suggested that.

Sage to Google Reader OPML converter

September 24, 2008

For some reason, when exporting RSS feeds from Sage (a Firefox plugin) to an OPML file, it does not save the feed links (i.e. the xml/rss/atom/whatever URLs) but the feeds themselves, i.e. the content of the feeds.

If you have the same problem, here’s a fast-and-dirty solution.

  1. export your Firefox bookmarks in HTML format, in a file called, say, bookmarks.html
  2. using a bash shell, extract the feed lines into a file called feeds.xml doing
     grep FEEDURL bookmarks.html > feeds.xml
  3. create the following script called “scriptfeed.sh”
    #!/bin/bash
    echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
    echo "<opml version=\"1.0\">"
    echo "<head><title>RSS Subscriptions</title></head>"
    echo "<body>"
    while read line
    do
            temp=`echo "$line" | awk -F"<" '{print $3}'`
            feed=`echo "$temp" | awk -F'"' '{print $2}'`
            url=`echo "$temp" | awk -F'"' '{print $4}'`
            text=`echo "$temp" | awk -F">" '{print $2}'`
            echo "<outline text=\"$text\""
            echo "  title=\"$text\""
            echo "  type=\"rss\""
            echo "  xmlUrl=\"$feed\""
            echo "  htmlUrl=\"$url\"/>"
    done < feeds.xml
    echo "</body>"
    echo "</opml>"
  4. Give the script the right to execute
    chmod +x scriptfeed.sh
  5. Launch the script on the output file, say, exp.xml
    ./scriptfeed.sh > exp.xml

I know, it’s dirty and tricky, but it works 😛
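To see what the awk field-splitting in step 3 actually extracts, here is a sample line in the layout I’m assuming Sage’s bookmark export uses, run through the same pipeline:

```shell
# One sample FEEDURL line (layout assumed from Sage's export format),
# extracted with the same awk field-splitting as in scriptfeed.sh:
line='<DT><A FEEDURL="http://example.com/feed.xml" HREF="http://example.com/">Example Feed</A>'
temp=$(echo "$line" | awk -F"<" '{print $3}')   # everything inside the <A ...> tag
feed=$(echo "$temp" | awk -F'"' '{print $2}')   # 2nd quoted value: the feed URL
url=$(echo "$temp" | awk -F'"' '{print $4}')    # 4th quoted value: the site URL
text=$(echo "$temp" | awk -F">" '{print $2}')   # text after ">": the title
echo "$feed | $url | $text"
```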

Remove all binaries from a directory tree

June 4, 2008

Here is a one liner that packs some serious bash punch:

rm `find . -type f -exec file '{}' \; | grep -i linux | awk -F: '{print $1}'`
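The backticks version breaks on paths containing spaces. A variant of mine that is safe with any filename, matches on “ELF” rather than “linux”, and shows what it would delete before doing it:

```shell
# Safer variant: handles spaces in paths; prints instead of deleting.
# Drop the "echo" once the list looks right.
find . -type f -exec sh -c 'file -b "$1" | grep -qi "ELF" && echo rm "$1"' _ {} \;
```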

tar – argument list too long

February 12, 2008

If you have a lot of files in one directory and you try to process them, for example, to tar them, you will very likely get this error:

$ tar cvzf allhtml.tgz *.html
-bash: /bin/tar: Argument list too long

Try this:

$ find . -name '*.html' -print > ./allfiles
$ tar -czf allhtml.tgz --files-from ./allfiles
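With GNU tar the temporary file list can be skipped entirely; find can feed names straight in, NUL-separated so filenames with spaces survive:

```shell
# No intermediate file: find streams NUL-separated names into tar.
find . -name '*.html' -print0 | tar -czf allhtml.tgz --null -T -
```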

Command-line construction of argument lists with xargs

January 23, 2008

xargs is a really useful unix command line tool that can be used to construct an argument list for piping into other unix commands.
For example, let’s say we want to find and delete all files fitting a certain criterion, such as those with a suffix of “.java~”.
We can use the find utility to easily list these:


$ find . -name '*.java~'
./path/to/file/file1.java~
./path/to/file/file2.java~
./path/to/file/file3.java~
./path/to/file/file4.java~

However, the output from this is a list with one file per line, which cannot be piped directly into a command like rm. If the list is very long, it would be tedious to retype it manually as a single argument list.

Instead, we can pass the output from the find command into xargs to convert this list into a single argument list for us:


$ find . -name '*.java~' | xargs
./path/to/file/file1.java~ ./path/to/file/file2.java~ ./path/to/file/file3.java~ ./path/to/file/file4.java~

This can then be piped into rm or any other command as needed.

This is just one simple example of the use of xargs. It’s a much more powerful tool, able to handle much larger argument lists with great flexibility.
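For the delete step itself, the -print0/-0 pair is the usual refinement (my addition, not in the example above): it keeps filenames containing spaces intact all the way to rm:

```shell
# Build the rm argument list with xargs; NUL separators are space-safe.
find . -name '*.java~' -print0 | xargs -0 rm -f
```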

Download papers via SSH

January 11, 2008

Want to download a paper from home, or from another university that doesn’t have all the subscriptions? Try this paper-get script.

Usage: paper-get URL [filename]

#!/bin/sh

SSH_TARGET=user@mymachine

if [ -n "$2" ]; then
    DEST="$2"
else
    DEST="$(basename "$1")"
    echo "Saving to $DEST"
fi

if [ -e "$DEST" ]; then
   echo Warning: the destination file already exists.
   echo Press Enter to overwrite, Ctrl-C to abort.
   read ANTANI
fi

ssh $SSH_TARGET wget -O- \"$1\" > "$DEST"

Repetitive programming

November 30, 2007

Hint for the lazy programmer. How many times have you needed to write switches/if clauses/… with lots of conditions, maybe coming from the n-th field of each line in a file?

You’re lucky: bash and awk are here to help. This example generates (brrr…) VB code:

#!/bin/bash
while read line
do
    echo "If variable = `echo $line | awk '{print $1}'` Then " >> new_file
    echo "    boolean = True" >> new_file
    echo "End If" >> new_file
    echo " " >> new_file
done < file_to_read
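The same generation can be done in a single awk invocation, which avoids spawning one awk process per input line (my variant of the loop above, same file names):

```shell
# One awk pass over file_to_read, emitting one VB "If" block per line.
awk '{
    print "If variable = " $1 " Then"
    print "    boolean = True"
    print "End If"
    print ""
}' file_to_read > new_file
```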