If you are putting together a script that uses wget to post data to various URLs, you likely do not want wget to save the response from the server to files. You can easily avoid this by specifying the output file wget saves to with the -O switch. Below is an example of what happens when the -O switch is not used, followed by an example of using wget with the -O switch to send the results to /dev/null.
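A minimal sketch of both cases; the URL and form field names here are placeholders, not from the original post:

```shell
# Without -O, wget saves the response to a file named after the URL
# (submit, submit.1, submit.2, ... on repeated runs):
wget --post-data "name=value&id=42" http://example.com/submit

# With -O /dev/null, the response is discarded instead of saved:
wget --post-data "name=value&id=42" -O /dev/null http://example.com/submit
```

The second form is what you want in a loop that posts to many URLs, since no response files pile up in the working directory.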
Continuing Google Chrome downloads that were canceled because of a crashed browser, a Windows bluescreen, or some other reason is quite easy using the tools available on Linux. The primary tool for continuing downloads on a Linux computer is called wget, and it can be obtained for Windows using the information below. If a Chrome download has been canceled for some reason, you can continue downloading the file with wget as explained below. I am surprised there is not an extension that provides a resume feature yet, but I imagine one will come along in the near future.
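The basic idea can be sketched as follows. This assumes you can copy the original download URL from Chrome's Downloads page; the file name and URL below are placeholders:

```shell
# Chrome keeps an in-progress download as <name>.crdownload;
# rename it to the final file name so wget can pick it up:
mv ubuntu.iso.crdownload ubuntu.iso

# -c (--continue) tells wget to resume from the bytes already
# on disk rather than starting the download over:
wget -c http://example.com/ubuntu.iso
</imports>
```

Resuming only works if the server supports HTTP range requests; if it does not, wget will restart the download from the beginning.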
The other day I was attempting to use the Linux ftp command line application to obtain all of the files, subdirectories, and files within those subdirectories from an FTP site. The first issue I ran into was being prompted to confirm each and every file that is downloaded. Below I describe how to turn off that prompt; just to note, I ended up using wget to download all of the files and subdirectories from the remote FTP server.
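For reference, both approaches can be sketched like this; the hostname, user, and path are placeholders:

```shell
# Option 1: start the ftp client in non-interactive mode so mget
# does not ask for confirmation on every file:
ftp -i ftp.example.com

# Option 2: toggle interactive prompting from inside a session:
#   ftp> prompt

# The route the post ends up taking: let wget mirror the whole
# tree recursively over FTP in one command:
wget -r ftp://user:password@ftp.example.com/pub/
```

Note that the stock ftp client's mget does not recurse into subdirectories, which is a large part of why wget -r is the simpler tool for this job.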
Earlier tonight I was working on a project for a customer who wants to translate the Hebrew Interlinear Bible into English, which obviously has been done many times before. This customer, however, has some translations that he wants to make for himself, so I needed to find a Hebrew Interlinear Bible in text or PDF format. I was able to locate the Hebrew Interlinear Bible in PDF format, however there was a separate PDF for each chapter in each book, which numbers something like 930 different PDFs. I was able to use the wget command described in detail below to download all of the PDFs with a single command on my Windows 7 computer.
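A sketch of the kind of single command that does this; the URL is a placeholder and the exact switches used in the post may differ:

```shell
# -r       recurse through the linked pages
# -np      do not ascend to the parent directory
# -A pdf   accept (keep) only files ending in .pdf
# -nd      save everything into the current directory, no site tree
wget -r -np -nd -A pdf http://example.com/interlinear/
```

With -A, wget still fetches the HTML pages it needs for link discovery but deletes everything that does not match the accept list, leaving only the PDFs on disk.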
In going through all the tools with Alex on Backtrack I have discovered a few bugs and missing modules or libs. I will be writing posts on how to fix them, but I will also be adding the fixes to the Backtrack svn as well. This morning I was writing the article on Dnsenum by my buddy Barbsie and I ran into a missing perl module.
- root@666:/pentest/enumeration/dnsenum# ./dnsenum.pl --enum -f dns.txt --update a -r cnn.com
- dnsenum.pl VERSION:1.2
- Warning: can't load Net::Whois::IP module, whois queries desabled.
Below I will show how to download and install the needed module:
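One common way to install a missing perl module is through CPAN; this is a sketch of that route, and the post's own steps may use a different method:

```shell
# Install the module dnsenum complained about from CPAN:
perl -MCPAN -e 'install Net::Whois::IP'

# Verify the module now loads cleanly; this prints nothing and
# exits 0 on success, or dies with "Can't locate ..." on failure:
perl -MNet::Whois::IP -e 1
```

Once the module loads, rerunning dnsenum.pl should no longer print the whois warning and whois queries will be enabled.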