
Add the following function/alias combo to your profile for fast directory access:
fd(){ $(grep "$1" ~/.fd | head -1); }; alias addfd='echo "cd `pwd`" >> ~/.fd'

Usage example:
user@host:~/long/path/you/dont/want/to/type/often$ addfd

user@host:~$ fd often

To see the list of all fast-access directories, use "cat ~/.fd".
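Here's a throwaway end-to-end sketch of the idea. The /tmp paths and the bookmark file contents are made up for the demo, and this variant takes only the first match from the file:

```shell
# simulate the bookmark file that addfd would normally build up
mkdir -p /tmp/projects/alpha /tmp/notes
printf 'cd /tmp/projects/alpha\ncd /tmp/notes\n' > /tmp/fake_fd

# same idea as the tip's fd, reading the demo file and taking the first match;
# the unquoted $(...) word-splits "cd /path" and runs it in the current shell
fd(){ $(grep "$1" /tmp/fake_fd | head -1); }

fd alpha
pwd    # now /tmp/projects/alpha
```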


Sometimes when using sed you find that you need to match across line endings. This can be achieved by getting sed to match the first line and then pulling the next line into the pattern space with the N command.

For example, if we have a file:
$ cat foo
This is a sample hello
world file.

And want to change 'hello world' to 'hello shell-fu' we need to replace across lines. This can be done with the following command:
$ cat foo | sed '/hello$/N;s/hello\nworld/hello\nshell-fu/'
This is a sample hello
shell-fu file.

Here sed first looks for lines which end with 'hello' then reads the next line, finally replacing 'hello\nworld' with 'hello\nshell-fu'.

The N command has plenty of other uses, for example converting a double-spaced file to single spacing by joining each line with the blank line that follows it:
cat doublespace | sed '$!N;s/\n$//'
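A quick check of the technique on a throwaway file (this uses the `$!N` variant, which joins every line with the one after it and then deletes the trailing newline):

```shell
# make a small double-spaced sample file
printf 'first\n\nsecond\n\n' > /tmp/doublespace

# join each line with the following blank line, then strip the embedded newline
cat /tmp/doublespace | sed '$!N;s/\n$//'
```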


The following isn't particularly pretty and should be considered a work in progress, but it's quite fun.

Get examples of ways a command can be used, direct from shell-fu, by adding the following function:
function examples { lynx -width=$COLUMNS -nonumbers -dump "http://www.shell-fu.org/lister.php?tag=$1" | \
sed -n '/^[a-zA-Z]/,$p' | egrep -v '^http|^javas|View Comm|HIDE|] \+|to Share|^ +\*|^ +[HV][a-z]* l|^ .*efu.*ep.*!$' | \
sed -e '/^  *__*/N;s/\n$//g' | less -r; }

This pulls out the tips tagged by the given command. (Make sure you tag any tips you submit!)


Implementing "join" in bash (python's join, php's implode, etc.).

NOTE: the $ means "type what follows at your terminal prompt" (some people think that the $ is part of the command).

$ list_join() {
    local OLDIFS=$IFS
    IFS=${1:?"Missing separator"}; shift
    echo "$*"
    IFS=$OLDIFS
}

$ list_join : one two three
one:two:three

It only works with single-character separators (when joining, "$*" uses just the first character of IFS), so this won't work as expected:

$ list_join ', ' one two three


Also notice that you need to pass the list expanded (hence the name "list_join"), so if you have an array you do:

$ myarray=(one two three)

$ list_join : "${myarray[@]}"

This function is pretty handy with fgrep (grep -F). fgrep is faster than grep because it doesn't use regular expressions, so unless you actually need a regexp it's best to use it instead of plain grep. fgrep also accepts several search strings at once, combined like an OR operation (chaining several greps gives you an AND). The only catch is that fgrep expects the strings separated by newlines, which is ugly to type, so you can use the function like this:

$ fgrep "$(list_join $'\n' samus root)" /etc/group
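Putting it together with a stand-in group file (the file contents and path are made up for the demo, and grep -F is the modern spelling of fgrep):

```shell
list_join() {
  local OLDIFS=$IFS
  IFS=${1:?"Missing separator"}; shift
  echo "$*"
  IFS=$OLDIFS
}

# a tiny stand-in for /etc/group
printf 'root:x:0:\nwheel:x:10:samus\nmail:x:8:\n' > /tmp/group.sample

# one call, two fixed strings, OR semantics
grep -F "$(list_join $'\n' samus root)" /tmp/group.sample
```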


This tip is inspired by #27 ( http://www.shell-fu.org/lister.php?id=27 ), which I think is misleading. Here's a much more reliable and efficient version:

rename_ext() {
   local filename
   for filename in *."$1"; do
     mv "$filename" "${filename%.*}"."$2"
   done
}

use it like:

rename_ext php html

to rename all .php files to .html in the current directory.
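A sandbox run (the function is repeated so the snippet is self-contained; the /tmp directory and filenames are just for illustration):

```shell
rename_ext() {
   local filename
   for filename in *."$1"; do
     mv "$filename" "${filename%.*}"."$2"
   done
}

rm -rf /tmp/rename-demo && mkdir -p /tmp/rename-demo
cd /tmp/rename-demo
touch a.php b.php notes.txt

rename_ext php html
ls    # a.html  b.html  notes.txt
```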


This can be used instead of "mv" to move file(s) and then cd into the destination folder.

mvf() {
  if mv "$@"; then
    shift $(($#-1))
    if [ -d "$1" ]; then
      cd "$1"
    else
      cd "$(dirname "$1")"
    fi
  fi
}
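Here's how a session might look, with throwaway paths (the function body is repeated so the snippet is self-contained):

```shell
mvf() {
  if mv "$@"; then
    shift $(($#-1))
    if [ -d "$1" ]; then
      cd "$1"
    else
      cd "$(dirname "$1")"
    fi
  fi
}

rm -rf /tmp/mvf-demo && mkdir -p /tmp/mvf-demo/dest
cd /tmp/mvf-demo
touch note.txt

mvf note.txt dest/    # note.txt is moved, and we land in dest
pwd                   # /tmp/mvf-demo/dest
```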


You can use the following to output your microphone to a remote computer's speaker:

dd if=/dev/dsp | ssh -c arcfour -C username@host dd of=/dev/dsp

This will output the sound from your microphone port to the ssh target computer's speaker port. The sound quality is very bad, so you will hear a lot of hissing.


The 'read' command can be used to split arguments based on any arbitrary character using IFS:

$ echo 'a/bc/ de fg hi/j' | { IFS='/' read first second others; echo "$first"; echo "$second"; echo "$others"; }
a
bc
 de fg hi/j
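A related trick (bash-specific, since it uses arrays): read -a splits into an array when you don't know how many fields to expect:

```shell
# split on '/' into an array; -r keeps backslashes literal
IFS='/' read -r -a parts <<< 'a/bc/ de fg hi/j'

echo "${#parts[@]}"    # 4
echo "${parts[0]}"     # a
echo "${parts[2]}"     # " de fg hi" (leading space preserved)
```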


You often see people using aliases like "alias ..='cd ..'" and "alias ...='cd ../..'". Here is a general version of this kind of '..' command, which takes an argument for how many levels up to go.

# .. - Does a 'cd ..'
# .. 3 - Does a 'cd ../../..'
function .. (){
  local arg=${1:-1};
  while [ $arg -gt 0 ]; do
    cd .. >&/dev/null;
    arg=$(($arg - 1));
  done
}


Simulate a never-ending compilation so you have an excuse for why you're browsing the net (see http://xkcd.com/303/):

while true; do awk 'BEGIN { srand() } { print; system("sleep " int(rand()*10)) }' compiler.log; done

When you want to kill it, CTRL-C won't work. You have to suspend it (CTRL-Z) then kill the job (kill %1).


Many people use netcat on both the remote and local machine in order to transfer stdin and stdout to and from machines over the network, but many don't realize you can do the exact same thing using one line with SSH:

ssh remote_machine 'cat - > file' < file

Of course that is useless since if you have ssh access, you can use scp or sftp, but what is more useful is inserting ssh anywhere in a pipeline to run a command on the piped data on a remote machine. For example:

wget -O - http://ftp.mozilla.org/.../thunderbird-3.0b2-i686.tar.bz2 | ssh remote_machine 'tar xjvf -'

That will download the Thunderbird 3.0b2 tarball and extract it on remote_machine.

Of course, it works both ways:

ssh remote_machine 'wget -O - http://ftp.mozilla.org/.../thunderbird-3.0b2-i686.tar.bz2' | tar xjvf -

That downloads Thunderbird on the remote machine and extracts it on the local machine.

That example is still kinda useless, but combine it with tee and command substitution, and you have a nice way to distribute the same tarball to multiple hosts:

wget -O - http://ftp.mozilla.org/.../thunderbird-3.0b2-i686.tar.bz2 \
| tee >(ssh host1 'tar xjvf -') | tee >(ssh host2 'tar xjvf -') | ssh host3 'tar xjvf -'

You can pretty much use ssh anywhere you want to send stdin to or receive stdout from remote machine commands. Using netcat sends everything in the clear, and if you have the access to run netcat, you probably have ssh access.


The script below will find duplicate files (files with the same md5sum) in a specified directory and output a new shell script containing commented-out rm statements for deleting them. You can then edit this output to decide which to keep.

OUTF=rem-duplicates.sh; echo "#! /bin/sh" > $OUTF;
find "$@" -type f -print0 |
  xargs -0 -n1 md5sum |
    sort --key=1,32 | uniq -w 32 -d --all-repeated=separate |
    sed -r 's/^[0-9a-f]*( )*//;s/([^a-zA-Z0-9./_-])/\\\1/g;s/(.+)/#rm \1/' >> $OUTF;
chmod a+x $OUTF; ls -l $OUTF


You sometimes need your machine's IP address without relying on external sources; here is a command that returns your IP address in one neat line.

ifconfig eth0 | grep -o "addr:[0-9.]*" | grep -o "[0-9.]*"

You of course need the necessary rights to use this. If you lack them, you might want to prepend "sudo" and add ifconfig to the allowed passwordless commands for the user that needs this.

ifconfig retrieves the interface details; the first grep pulls out the "addr:x.x.x.x" field and the second grep strips the "addr:" prefix, leaving just the IP address.


The bash shell has a variable called $RANDOM, which outputs a pseudo-random number every time you call it. This allows you to randomize the lines in a file for example:

    while read -r i; do echo "$RANDOM $i"; done < unusual.txt | sort | sed -r 's/^[0-9]+ //' > randorder.txt

In other words, put a random number on every line, sort the file, then take off the random numbers.
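As a quick sanity check, the trick round-trips: shuffling and then sorting restores the original lines. The /tmp paths are throwaway, and a while read loop is used so each whole line survives intact:

```shell
seq 5 > /tmp/unusual.txt

# prefix each line with a random key, sort on it, then strip the key off
while read -r line; do echo "$RANDOM $line"; done < /tmp/unusual.txt \
  | sort -n | sed 's/^[0-9]* //' > /tmp/randorder.txt

# the same five lines come back in some order; sorting recovers them
sort -n /tmp/randorder.txt
```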


Kick all users other than you from your box and keep them out.

watch -d "w | awk 'NR==4 {print \"/dev/\"\$2}' | xargs fuser -k"


Sometimes, you want to add a lot of files to SVN from the command line. This simple command will find all unversioned files in an SVN checkout and add them to SVN versioning.

for i in `svn status | awk '{if ($1 == "?") print $2}'`; do svn add "$i"; done


'colrm' is a column removal filter.

If only one parameter is specified, the characters of each line are removed starting from that column number. If called with two parameters (a range of character positions), the columns from position x through position y are removed.

Some examples:

# Remove characters from the 2nd character position to the end
$ echo "abcdefghij" | colrm 2
a

# Remove characters from the 2nd to the 5th column position
$ echo "abcdefghij" | colrm 2 5
afghij

# The following command deletes the characters in columns 4-8 from the file file.txt
$ colrm 4 8 < file.txt


cut -d"," -f1,5 file.csv > newfile.csv

Will return only the 1st and 5th columns of a comma-delimited csv file.
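For instance, with a tiny made-up CSV (file path and contents are just for the demo):

```shell
# hypothetical five-column CSV
printf 'id,name,qty,price,total\n1,apple,3,0.50,1.50\n' > /tmp/file.csv

# keep only the 1st and 5th comma-delimited fields
cut -d"," -f1,5 /tmp/file.csv
```

This prints "id,total" and "1,1.50".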


Once more on the subject of changing file extensions.

There are two tips about this:
#27 - http://www.shell-fu.org/lister.php?id=27
#544 - http://www.shell-fu.org/lister.php?id=544

I'm happy to provide one more variant. It's up to you which one is the most useful.
function chext(){
  local fname
  local new_ext="$1"
  shift
  for fname in "$@"; do
    mv "$fname" "${fname%.*}.$new_ext"
  done
}

If you place this function into .bashrc, then you may use it like as follows:
chext new_ext *.old_ext
chext html `find ~ -iname "*.htm"`

(Note that piping filenames to xargs won't work here, since xargs cannot invoke a shell function directly.)


This bit of sed will print the contents of a file until the first line which doesn't contain the specified expression. A useful alternative to 'head' when you're not sure how much of the file you need.

sed -n '/Hello/!q; p'
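For example, on sample input piped in with printf, printing stops at the first line that lacks the expression:

```shell
# prints the first two lines, then quits at 'bye'
printf 'Hello a\nHello b\nbye\nHello c\n' | sed -n '/Hello/!q; p'
```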


I noticed a one liner here for working out when you started work that day. The code given was:

date "+%b %d" | xargs -i grep -m1 -i {} /var/log/syslog.0 |awk '{ print "Today I got to work at " $3 }'

This basically just gets the time the machine was started that day.

There are a few problems here. Firstly, this can be done a lot more easily with the 'uptime' command. More importantly, the machine I work on doesn't get rebooted each day, so instead of the startup time I need the time I first logged in that day. This can be achieved with the following:

last $USER | sed -n '/'"$(date +"%b %d")"'/!q; p' | tail -1 | awk '{ print "Today I got to work at " $7 }'


Create an iso image from the contents of a directory:
mkisofs -hide-joliet-trans-tbl -l -J -f -T -r . > "../directory_dump_`basename $PWD`.iso"

When run in a directory (say tmp) it will create a file called directory_dump_tmp.iso one directory higher, which will contain the contents of the tmp directory in its root.


nice ssh username@remoteservername "tar cjf - -C /from/basedir/ dirtocopy" | tar xjvf - -C /to/dir/ ; sleep 120 ; shutdown -P now

Copy the remote directory dirtocopy from the remote server to /to/dir on the local machine, transferring the contents with bzip2 compression. When it's done (even if it fails), wait 120 seconds and power off the machine.


I used to have 'em' as an alias for Emacs. One day, I wanted to edit a file and typed 'rm' instead of 'em', losing the file I had no backup of. So I set a common alias to keep me focused.

alias rm='rm -i'

But sometimes this can be very annoying, as you can imagine. Placing a backslash in front of a command bypasses the alias for that one invocation. This works for any alias you have set.

\rm *.c

I still use the 'rm' alias, but switched to just 'e' for Emacs ;^)


The 'tr' command can be used to quickly convert uppercase to lowercase and vice versa. This is done with:

tr '[:upper:]' '[:lower:]'

To give an example, this can be used to rename all files in the current directory to lowercase (there are other ways to do this, but this is a decent example):

for file in * ; do echo mv "$file" "`echo $file | tr '[:upper:]' '[:lower:]'`"; done

The echo makes this a dry run that only prints the mv commands; remove it to actually rename the files.

Of course you can convert the other way by swapping '[:upper:]' and '[:lower:]'.
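For instance, a quick one-off conversion:

```shell
# lowercase everything in the string
echo 'Shell-Fu TIPS' | tr '[:upper:]' '[:lower:]'    # shell-fu tips
```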

