Link history

What is the best way to save about 10 URLs daily to a txt file or plist, but only if a URL isn't in the txt/plist already? The txt/plist has thousands of URLs; it's a kind of history file.

Basically, this script fetches one webpage daily, scans it for links, and checks whether each link is new; if it is new, it opens it in Safari and adds it to the txt/plist.

Hi,

a text file should be sufficient, something like this:


add_new_Link("http://www.apple.com")

on add_new_Link(newLink)
	set linkFile to (path to desktop as Unicode text) & "myLinks.txt"
	try
		-- open (or create) the history file on the desktop
		set ff to open for access file linkFile with write permission
		try
			-- one URL per line
			set theLinks to paragraphs of (read ff)
		on error
			-- file was empty or unreadable, start with no known links
			set theLinks to {}
		end try
		if newLink is not in theLinks then
			-- append only links we haven't seen before
			write newLink & return to ff starting at eof
		end if
		close access ff
	on error
		-- make sure the file gets closed even if something above failed
		try
			close access file linkFile
		end try
	end try
end add_new_Link
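
To tie this back to the question (open a link in Safari only when it is new), here is a minimal sketch that reads the history file once and then handles the whole day's batch. todaysLinks is just a placeholder for whatever your page-scanning step produces; it writes to the same myLinks.txt as above.

-- daily driver sketch: todaysLinks is a placeholder for the scraped URLs
set todaysLinks to {"http://www.apple.com", "http://www.example.com/news"}
set linkFile to (path to desktop as Unicode text) & "myLinks.txt"

try
	set ff to open for access file linkFile with write permission
	try
		set knownLinks to paragraphs of (read ff)
	on error
		set knownLinks to {}
	end try
	repeat with aLink in todaysLinks
		if (aLink as text) is not in knownLinks then
			-- record it and open it, since we haven't seen it before
			write (aLink as text) & return to ff starting at eof
			tell application "Safari" to open location (aLink as text)
		end if
	end repeat
	close access ff
on error
	try
		close access file linkFile
	end try
end try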

Interestingly, it is also possible to write to ff starting at eof as URL, though I confess I don't know why one would want to do that.
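
As for the "fetches one webpage daily and scans for links" part of the question, a quick way from AppleScript is to shell out to curl and pull anything that looks like an http(s) URL out of the page source. This is only a rough sketch: thePageURL is a placeholder, the pattern will need adjusting for the real page, and it won't pick up relative links.

-- page-scanning sketch: thePageURL is a placeholder
set thePageURL to "http://www.example.com/daily"
try
	set foundLinks to paragraphs of (do shell script "curl -s " & quoted form of thePageURL & " | grep -oE 'https?://[^\" <>]+' | sort -u")
on error
	-- curl failed or grep found no matches
	set foundLinks to {}
end try
-- foundLinks can then be fed to the repeat loop above as todaysLinks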