Don't call download_multipart for 1 chunk

Previously, when a file was larger than `$LARGE_FILE_SIZE` but smaller
than `$CHUNK_SIZE*2`, `download_multipart` would be called but would
only download a single chunk spanning the whole file.

This fix keeps download performance the same as before but cuts out the
unnecessary chunk-processing step for those files.
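
For reference, a minimal sketch (not the script's actual code) of the
size-based dispatch this change is concerned with. Only
`download_multipart`, `MB`, `CHUNK_SIZE`, and `LARGE_FILE_SIZE` come from
the script; the `download_file` wrapper and the plain `curl` fallback are
hypothetical stand-ins:

    MB=$((1024*1024))
    CHUNK_SIZE=$((64*$MB))
    LARGE_FILE_SIZE=$((CHUNK_SIZE*2))

    # Hypothetical dispatcher: only take the multipart path when the file
    # will actually split into more than one CHUNK_SIZE chunk.
    function download_file {
        local url=$1 size=$2
        if [ "$size" -gt "$LARGE_FILE_SIZE" ]; then
            download_multipart "$url" "$size"           # parallel, chunked (from the script)
        else
            curl -fsSL -o "$(basename "$url")" "$url"   # single-request stand-in
        fi
    }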
Tom Davis
2016-07-23 16:41:04 -04:00
parent e3eaa9efaf
commit 2991ffd193

@@ -36,10 +36,11 @@ export LC_ALL=C
EPOCH_DATE="Jan 1 00:00:00 1970"
MB=$((1024*1024))
LARGE_FILE_SIZE=$((100*$MB))
CHUNK_SIZE=$((64*$MB))
LARGE_FILE_SIZE=$((CHUNK_SIZE*2))
NUM_WORKERS=10
function kill_background_processes {
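
With this change `LARGE_FILE_SIZE` works out to 2 × `CHUNK_SIZE` = 128 MB, so
any file large enough to take the `download_multipart` path splits into at
least two chunks; files between the old 100 MB threshold and 128 MB now take
the plain single-request path instead.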