
unix




Tue Jan 06 03:02:30 GMT 2009 From /weblog/unix

virtualization


According to http://wiki.debian.org/SystemVirtualization , there are several virtualization tools available for Linux. First I tried QEMU / KVM, which was easy enough to set up, but I could not get networking working for a Vista guest: the driver loaded and reported it was running fine, yet the guest never got an IP. Then I tried VirtualBox, using "Intel PRO/1000 MT" as the network adapter; networking worked fine and I could connect to the VPN.
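
For reference, a minimal sketch of setting that adapter from the VirtualBox command line, assuming a VM named "vista" and a host interface "eth0" (both names are hypothetical); 82540EM is VirtualBox's identifier for the Intel PRO/1000 MT Desktop adapter:


VBoxManage modifyvm vista --nictype1 82540EM                    # Intel PRO/1000 MT Desktop
VBoxManage modifyvm vista --nic1 bridged --bridgeadapter1 eth0  # bridge to the host NIC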



Thu May 08 06:00:20 GMT 2008 From /weblog/unix/script

date


A working date calculation script

http://www.unix.com[..]s/4870-days-elapsed-between-2-dates.html
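
The gist of the usual epoch-seconds technique, sketched here assuming GNU date (the dates are placeholders):


start=$(date -u -d "2008-01-01" +%s)   # seconds since the epoch, UTC
end=$(date -u -d "2008-05-08" +%s)
echo $(( (end - start) / 86400 ))      # prints 128, the days elapsed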



Thu Feb 15 08:45:24 GMT 2007 From /weblog/unix

netcat


From "man nc"

The nc (or netcat) utility is used for just about anything under the sun
involving TCP or UDP. It can open TCP connections, send UDP packets,
listen on arbitrary TCP and UDP ports, do port scanning, and deal with
both IPv4 and IPv6. Unlike telnet(1), nc scripts nicely, and separates
error messages onto standard error instead of sending them to standard
output, as telnet(1) does with some.

http://www.onlamp.com/lpt/a/3771
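
A few typical invocations, with placeholder host and ports; the flag syntax follows the BSD nc man page quoted above, and other netcat variants differ slightly (e.g. some need -l -p for listening):


nc -l 1234                  # listen on TCP port 1234
nc example.com 1234         # connect to a remote TCP port
nc -zv example.com 20-80    # verbose scan of ports 20-80, sending no data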



Fri Jun 16 13:37:41 GMT 2006 From /weblog/unix/script

check exit status



Need to check $? and PIPESTATUS

http://www.mattryall.net/article.cgi?id=247
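
A minimal bash illustration of the difference; both expansions are read in a single command, because the next command would overwrite them:


false | true
echo "exit=$? pipestatus=${PIPESTATUS[*]}"   # exit=0 pipestatus=1 0
# $? only reports the last stage; PIPESTATUS holds one status per stage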



Sun May 28 10:31:46 GMT 2006 From /weblog/unix/script

Batch rename


Scripts for batch renaming files on a unix system

http://mattfleming.com/node/110
http://stuart.woodward.jp/?p=279
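
The linked scripts differ in detail; the common shell idiom underneath them looks roughly like this (the extensions are placeholders):


# rename every *.txt to *.bak via parameter expansion
for f in *.txt; do
    mv -- "$f" "${f%.txt}.bak"
done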



Tue Dec 20 05:20:44 GMT 2005 From /weblog/unix

wget


The answer is wget. It can be used to download just a single file, a list of specified files, or a recursive chain of files. For example, the following command will download an entire site, following all links as long as they stay in the same domain.


wget -r http://www.jlamp.com/

This command will do the same but include referenced CSS, inline images, etc.


wget -p -r http://www.jlamp.com/

In this case, I wanted all files with extension "mp3", skipping everything else. My first thought was to use the -A option to only "Accept" and download mp3 files.


wget -p -r -A mp3 http://www.escapepod.org/

The problem though is that Escape Pod, like many podcast sites, has its actual mp3 files hosted by a third party in order to reduce bandwidth. I could do the recursive download across domains but thought that might get a bit dangerous.

In the end, I ran three commands. The first downloads the html files for the entire site. The second line scans the html for full URLs and uses sed to filter out everything else. (If I knew sed better this could probably be a shorter command.) Note the use of find to navigate all files in the tree, egrep to restrict to actual URLs, sed to eliminate irrelevant parts of the line, and sort/uniq to remove duplicates.

Finally, the third line uses wget to download all files found in the previous command. (Note: remember the ending \ causes the command to extend to the next line).


wget -r -A htm,html http://www.escapepod.org/

cat `find . -name \*htm\* -print` | egrep "http.*mp3" | \
sed "s/.*\(http:\/\/.*mp3\).*$/\1/" | sort | uniq > files.txt

wget -i files.txt

http://www.jlamp.com/blog/2005/12/17/1134883853233.html



Tue Nov 08 06:33:09 GMT 2005 From /weblog/unix/script

Unix KornShell Quick Reference


http://www.maththinking.com/boat/kornShell.html
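
An illustrative scrap of ksh-specific syntax of the sort the reference covers (this snippet is mine, not taken from the linked page):


#!/bin/ksh
typeset -i count=0               # typed integer variable
for f in *.log; do
    (( count += 1 ))             # arithmetic, no $ needed inside (( ))
done
print "found $count log files"   # print is the ksh builtin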



Tue Oct 04 03:58:05 GMT 2005 From /weblog/unix/script

safer rm


A number of tips for making the rm command safer

http://www.macosxhints.com[..]le.php?story=20050928082624470&lsrc=osxh
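
Two of the usual approaches, sketched; the trash function is a hypothetical wrapper, not taken from the article:


alias rm='rm -i'    # prompt before every removal

# or move files aside instead of deleting them outright
trash() {
    mkdir -p ~/.trash
    mv -- "$@" ~/.trash/
}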
