URL Access Scripting: downloading images

Hello,

I have a function that, for the most part, works (with some redundant checking):


--theURL is the full URL of the file to download
--filePath is the full local file path, including the file name and extension
--nottxt is true or false and determines the file extension put on the downloaded file (true when it is not a text file)

on GetImage(theURL, filePath, nottxt)
	if ItemnotExists(filePath) then
		try
			tell application "URL Access Scripting"
				download theURL to file filePath
			end tell
			repeat while ItemnotExists(filePath)
				---Just waiting
			end repeat
			
			if size of (info for filePath as alias) is less than 9000 and nottxt then
				tell application "Finder"
					delete file filePath
				end tell
			end if
		end try
	end if
end GetImage

on ItemnotExists(thePath) -- returns true when nothing exists at thePath
	try
		set thePath to thePath as alias
	on error
		return true
	end try
	return false
end ItemnotExists

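For reference, I call it along these lines (the URL and the HFS path below are just placeholders, not the real sites):

GetImage("http://www.example.com/comics/strip.gif", "Macintosh HD:Users:me:Comics:strip.gif", true)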

I am using this to go to multiple websites once a week and download that week's comics to be placed in the newspaper. Not something anyone asked for; it's just to make my life easier, mind you.

Here are the issues I'm having.

I have to log in to multiple websites. Using cURL I had to manage cookies to stay logged in, and I just couldn't get it to work. For whatever reason, URL Access Scripting seems to be working (maybe it uses Safari's cookies? I don't know). If I get an authentication error, there is a handler that does the login via Safari. I'm telling you, cURL just isn't my solution. I tried, honest.
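The fallback is basically just opening the login page in Safari and retrying afterwards. Roughly like this (the handler name, the error handling, and the login URL are made up for illustration; it's a sketch, not my exact code):

on LoginViaSafari(theLoginURL)
	-- Open the site's login page in Safari so I can sign in and its cookies get set
	tell application "Safari"
		activate
		open location theLoginURL
	end tell
end LoginViaSafari

try
	tell application "URL Access Scripting"
		download theURL to file filePath
	end tell
on error errMsg number errNum
	-- On an authentication failure, fall back to logging in through Safari
	LoginViaSafari("http://www.example.com/login") -- placeholder URL
end try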

The checks are there so that if the file already exists locally it will be skipped.
The next step is to try to download it, but the file may not exist on the web server.
That can produce a 404 error, which URL Access Scripting throws; the try block is there for that reason.
If the file does not exist on the server, a blank 4 KB file may also end up on the local drive.
That is where the size check and the file deletion come in.

The idea:

try to download the file
wait for it to finish
check to see if it is a real file
if not delete it

The problem: the blank files aren't being deleted. The 404 error makes the script jump out of the try block before the size check ever runs.

Perhaps the solution is:


--theURL is the full URL of the file to download
--filePath is the full local file path, including the file name and extension
--nottxt is true or false and determines the file extension put on the downloaded file (true when it is not a text file)

on GetImage(theURL, filePath, nottxt)
	if ItemnotExists(filePath) then
		try
			tell application "URL Access Scripting"
				download theURL to file filePath
			end tell
			repeat while ItemnotExists(filePath)
				---Just waiting
			end repeat
		end try
		try
			if size of (info for filePath as alias) is less than 9000 and nottxt then
				tell application "Finder"
					delete file filePath
				end tell
			end if
		end try
	end if
end GetImage

on ItemnotExists(thePath) -- returns true when nothing exists at thePath
	try
		set thePath to thePath as alias
	on error
		return true
	end try
	return false
end ItemnotExists


Since I only thought of that as I reached the end of writing this, I'm posting it anyway to see if there is a better solution.
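One other thought that occurred to me while writing this: the wait loop can spin forever if the download never produces a file, and it chews CPU while it waits. It could be bounded instead, something like this (the 30-second cap is an arbitrary number picked for illustration):

on WaitForFile(thePath)
	-- Give the download up to about 30 seconds to produce a file, then give up
	repeat 30 times
		if not ItemnotExists(thePath) then return true
		delay 1
	end repeat
	return false
end WaitForFile

GetImage could then call WaitForFile(filePath) in place of the bare repeat loop.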