How to make wget faster or multithreaded?

I’ve just upgraded my computer hardware (CPU, motherboard, graphics card, memory, and hard disk), so I need to install a new OS. I tried to download debian-6.0.6-amd64-netinst.iso with wget, but the speed is so slow I can’t bear it: 4 KB/s to 17 KB/s, crawling like a turtle, and even slower if I use Chrome.

I’ve read wget’s help output, and it seems there are no options that could make it any faster.

Is there any way to make wget faster? Or is it possible to make it download with multiple threads?

PS: My bandwidth is 4 Mbit/s. I use this command:

wget -c http://hammurabi.acc.umu.se/debian-cd/6.0.6/amd64/iso-cd/debian-6.0.6-amd64-netinst.iso

Here are the solutions:

There are several solutions to this problem, but we recommend the first one because it has been tested and is known to work.

Solution 1

Why not try axel? It is a fully fledged command-line download accelerator.

Install axel and start a download with:

axel -a -n [Num_of_Thread] link1 link2 link3 ...

where '[Num_of_Thread]' is the number of parallel connections to create for each link you want to download.

-a just shows an improved progress bar.

Unlike many other download managers, Axel downloads all the data directly to the destination file, using one single thread. This saves some time at the end because the program doesn’t have to concatenate all the downloaded parts.
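
For instance, to fetch the Debian ISO from the question with axel, something like the following should work (the choice of 10 connections is only an illustration, not a tested optimum):

axel -a -n 10 http://hammurabi.acc.umu.se/debian-cd/6.0.6/amd64/iso-cd/debian-6.0.6-amd64-netinst.iso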

Solution 2

I tried axel upon Gufran’s recommendation, but it hugely disappointed me. My goal was to find a CLI replacement for DownThemAll, because it hogs the CPU and hard disk and slows the entire system down, even on an 8-core Mac Pro. I also wanted a multithreaded replacement for wget and curl, not some kludge of a script that runs multiple instances of them. So I searched further and found what I think is currently the most modern multithreaded CLI downloader there is: aria2.

The big problem I had with axel was that it ‘faked’ downloading files over SSL. I caught it doing that with tcpdump: it was downloading https links as ordinary http. That really pissed me off, and if I hadn’t checked, I would have had a false sense of security. I doubt that many people know about this serious breach in security.

Getting back to aria2: it is more advanced than any other downloader. It supports the HTTP(S), FTP, BitTorrent, and Metalink protocols, is multiplatform, and is a download guerrilla. It maxes out my ISP’s bandwidth with no load on the CPU or hard disk, unlike DTA. The man page is gigantic; I will never use more than a few of its many options. And, by the way, I checked its SSL behavior with tcpdump and it is solid, not faked. I wrote a script that mimics DTA’s behavior, if not its convenience.

The basic command I use to get max bandwidth is

aria2c --file-allocation=none -c -x 10 -s 10 -d "mydir" URL

-c allows the download to resume if it gets interrupted, -x 10 allows up to 10 connections per server, -s 10 splits the download into up to 10 pieces, and -d "mydir" saves the file to the directory mydir.
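
As a concrete sketch, the same command applied to the Debian ISO from the question might look like this (the output directory name debian-iso is just an example):

aria2c --file-allocation=none -c -x 10 -s 10 -d "debian-iso" http://hammurabi.acc.umu.se/debian-cd/6.0.6/amd64/iso-cd/debian-6.0.6-amd64-netinst.iso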

aria2files.sh:

#!/bin/bash

filename="$1" # get filename from command line argument

while read -r line
do
    if [ "$line" ] # skip blank lines
    then
        if [[ "$line" =~ (https?|ftp)\:\/\/ ]] # line contains a URL, download file
        then
            echo "URL: '$line'"
            aria2c --file-allocation=none -c -x 10 -s 10 -d "$currdir" "$line"
        else # line contains a directory name, create directory if not already present
            echo "Directory: '$line'"
            currdir="$line"
            if [ ! -d "$currdir" ]
            then
                mkdir -p "$currdir" # '-p' enables creation of nested directories in one command
            fi
        fi
    fi
done < "$filename"

It reads a text file of the format:

files.txt:

directory 1
url1
url2
…
directory 2/subdirectory/sub-subdirectory/…
url3
url4
…
…
…

The script reads the filename from the command line:

aria2files.sh files.txt

It creates the directories and downloads to them. It can create nested directories as shown in the second example.
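
Assuming the script is saved as aria2files.sh in the current directory, a typical invocation would be (the chmod step is only needed the first time):

chmod +x aria2files.sh
./aria2files.sh files.txt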

For more details see my post Bash script to download files from URLs to specified directories listed in a text file.


All methods were sourced from stackoverflow.com or stackexchange.com and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
