Spitzer Science Center
Knowledgebase: Archive
The SHA broke up my request into many zip files! Is there any way I can get it in larger pieces, or somehow streamline the data download?
Posted by z-Luisa Rebull on 06 September 2013 03:48 PM
When downloading large quantities of data (big programs, whole campaigns, etc.), the SHA breaks the download into "manageable" pieces, where "manageable" means "no larger than common computers and software can handle." We understand your frustration if your computer is configured to handle much larger files than the average machine. If you don't want to click to download each piece individually, use the download script provided by the Background Monitor, available once packaging is complete, either in the Monitor itself or in the email you can arrange to have sent to you. The script can also be configured to unzip the files.

The Download Retrieval Script dialog gives you some options regarding which script to use. Generally speaking, the wget script is best for Linux and Unix users. The curl script is best for Mac users, because curl is part of the standard OS distribution; Mac users can also install wget and then use the wget scripts. For any of the scripts, you can also choose an option that unzips the zip files automatically. The files stay on disk here for at least 72 hours, so you have a window of time to download them.

Save the script to a plain text file and invoke it. You can copy and paste the script lines individually into your terminal window, or run the whole file by typing "csh [yourtextfile]" at the prompt. The files will be downloaded to your disk automatically and sequentially, and, if you selected that option, unzipped as well.
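If csh is not installed, the saved script can also be driven line by line from a plain POSIX shell. A minimal sketch, assuming the script was saved as sha_download.csh (an illustrative name); echo stand-ins replace the real download commands here so the sketch runs as-is:

```shell
#!/bin/sh
# Create a stand-in for the saved Background Monitor script. In real use
# you would skip this step and use the file you saved from the Monitor.
cat > sha_download.csh <<'EOF'
echo downloading piece 1
echo downloading piece 2
EOF

# Run each line of the script with sh, skipping comments and blank lines.
while IFS= read -r cmd; do
  case "$cmd" in \#*|'') continue ;; esac
  sh -c "$cmd"
done < sha_download.csh
```

Running each line through `sh -c` mirrors pasting the lines into a terminal one at a time, so a failed piece stops only that line, not the whole batch.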

For Windows users, download and save the text file of URLs, then follow these steps to install wget and download your data:
  • Go to the Windows wget web page.
  • Scroll to the Download section and retrieve the wget installer.
  • Install wget and add the binary to your path.
  • Download the text file of URLs.
  • At the command prompt, run: wget --content-disposition -i <file_of_urls_downloaded.txt>
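Whichever route you take, the pieces arrive as zip files. If you didn't choose the auto-unzip option, a small loop can unpack everything afterward. A sketch, assuming the zips were downloaded to the current directory and a POSIX shell with unzip is available (on Windows, e.g. via Git Bash or Cygwin):

```shell
#!/bin/sh
# Unpack every downloaded zip in the current directory, deleting each
# archive only if it extracted cleanly.
for f in *.zip; do
  [ -e "$f" ] || break          # glob didn't match: no zips to unpack
  unzip -o "$f" && rm -f "$f"
done
echo "all pieces unpacked"
```

The `&& rm` guard keeps any archive that failed to extract, so a truncated download is easy to spot and re-fetch.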