Scheduled Download Script

Hi,
I use cron on my server to tar 3 files nightly that are placed in my backups directory.
I'm looking for a script to connect to the backup directory and download the 3 files to my Mac's Desktop/backups folder each night. The remote files reside in a directory outside my public_html, so they can't be reached via HTTP. The script can be for Fetch, Terminal, or whatever. Any help would be appreciated.

i.e.
mysite.com
– cgi-bin
– backups (the files are in here)
– public_html
– – index.html
– – etc…

Thank you.

Try this:



(*
	You are going to need a valid IP address to do this.
*)

set MyDestinationFolder to "~/Desktop/"
set OriginFolder to "222.333.111.222:/Users/UserName/folder/" -- the first set of numbers should be a valid IP address

tell application "Terminal"
	set MyCommand to do script "scp -r " & OriginFolder & space & MyDestinationFolder
end tell


With my script you would have to type the password manually.

I hope someone more experienced will post how to do it without having to type the password.
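One common way to avoid the password prompt entirely is SSH public-key authentication. A minimal one-time setup sketch follows; the key filename and the remote account (`username@mysite.com`) are placeholders, not values from this thread:

```shell
#!/bin/bash
# One-time setup for password-less scp. The key path and the remote
# account shown in the comments are placeholders -- adjust to taste.
mkdir -p "$HOME/.ssh"
keyFile="$HOME/.ssh/backup_key"

# Generate a key pair with an empty passphrase, so nothing prompts at 3 am
if [ ! -f "$keyFile" ]; then
	ssh-keygen -q -t rsa -N "" -f "$keyFile"
fi

# Install the public key on the server once (this asks for the password
# one last time):
#   ssh-copy-id -i "$keyFile.pub" username@mysite.com
#
# After that, scp runs unattended:
#   scp -i "$keyFile" username@mysite.com:backups/file1 ~/Desktop/backups/
```

Once the public key is on the server, any scp or ssh command using that key runs without a prompt, which is what a cron job needs.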

Yes, ideally the script would run at say 3am each morning. I doubt I’ll be up or in any condition to type responses into prompt boxes.
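For the 3 am part, whatever script ends up doing the download can be driven by cron on the Mac; the script path below is just a placeholder:

```
# min  hour  day  month  weekday  command
0      3     *    *      *        /Users/userName/bin/get_backups.sh
```

You add the line with `crontab -e`. On newer versions of Mac OS X Apple steers people toward launchd for scheduling, but a cron entry like this still works.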

Try this shell script:


#!/bin/bash

# Fetch each named backup file from the server via FTP
remoteFiles="file1 file2 file3"

for theFile in $remoteFiles; do
	/usr/bin/curl -o "/Users/userName/$theFile" "ftp://username:password@mysite.com/backups/$theFile"
done

HTH -john

This works well, but I need two things:

  1. This downloads the entire backup directory. I need to be able to specify certain files.

  2. I need it to quit the Terminal app at the end. Right now it stays logged in.

Thanks a lot.
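For what it's worth, both points go away if the curl loop runs straight from cron instead of through the Terminal app: you list exactly the files you want, and no Terminal session is ever opened, so there is nothing to quit. A sketch, with placeholder host, account, and file names, and a connect timeout added so an unreachable server can't hang the nightly job:

```shell
#!/bin/bash
# Placeholder host, account, and file names -- adjust to taste.
remoteHost="mysite.com"
remoteUser="username"
localDir="$HOME/Desktop/backups"
remoteFiles="file1.tar file2.tar file3.tar"   # only these files are fetched

mkdir -p "$localDir"

for theFile in $remoteFiles; do
	# -s keeps cron mail quiet; --connect-timeout stops a dead server
	# from hanging the job
	/usr/bin/curl -s --connect-timeout 15 \
		-o "$localDir/$theFile" \
		"ftp://$remoteUser:password@$remoteHost/backups/$theFile"
done
```

Run this from cron directly (see the crontab discussion above) and the Terminal app never enters the picture.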

You still have to type the password for every file, but this should work and quit the Terminal at the end…

Hope I am helping you…