Get URL by right-clicking a link

I had hoped that Automator’s service plugin would be powerful enough to fetch a URL by right-clicking a link. But it isn’t. Automator seems to extract URLs only if they are written in plain text and at full length in between other strings.

It’s so damn tedious to write different scripts for specific applications (Mail, Safari, etc.) when things could be so much easier… Anyway, the design is all on purpose…
Meanwhile I cobbled together a working script for Mail… But I’m not excited at all, as I cannot get the link I want directly; I have to choose from all the links found in the whole mail…

tell application "Mail"
	set sel to selection
	set getSource to source of item 1 of sel
	set {uniqueUrls, uniqueLbs} to my extractUrls(getSource)
end tell
choose from list uniqueLbs
log uniqueUrls

on extractUrls(getSource)
	set {uniqueUrls, uniqueLbs} to {{}, {}}
	repeat with b in paragraphs of getSource
		set b to b as text
		if "https://" is in b or "http://" is in b then
			set findOffs to offset of "http" in b
			-- need at least a label plus " (" before the URL,
			-- otherwise "text 1 thru (findOffs - 3)" errors out
			if findOffs > 3 then
				set myUrl to text findOffs thru -1 of b
				set UrlLb to text 1 thru (findOffs - 3) of b
				if "<" is not in UrlLb then
					if myUrl is not in uniqueUrls then
						copy myUrl to end of uniqueUrls
						copy UrlLb to end of uniqueLbs
					end if
				end if
			end if
		end if
	end repeat
	return {uniqueUrls, uniqueLbs}
end extractUrls


What is it you’re trying to do? I’m confused, because Safari and Mail both already have “copy link” on the menu when I right-click a link.

I see, ok.
Most URLs are hidden behind a label, for a quicker reading experience.

To save the content of a link, we usually have to right-click a link and choose “save as” in our browser’s or mail client’s context menus.

It seems that the Automator service (“get input as URL”) isn’t able to return the embedded URL of a right-clicked link; we still get only the label of the selected link. So my guess is that Automator can’t recognize URLs unless they are written out at full length. Ouch!

Oh… I forgot to add that the built-in “share content with other applications” feature returns the same useless (frustrating) result: no URL is shared, just the label.
Example: right-click a link and share it with Apple’s Reminders/Notes. The passed content will be the name (label) of the link, not the URL (web address) itself.

This may be relevant for Safari, but won’t make it work in Mail or other apps.

Just out of curiosity, what’s the endgame here? I mean, what does the whole script accomplish?

At first I tried to make a working Automator service, and I expected that this service would work for any app that supports links.
As you can see, I’m trying to collect links, because during my research I often don’t have time to follow them right away.

Well, it won’t work system-wide from one application the way you want, but I think you could make it work the way you want for at least Mail and Safari, though it’s sort of ugly and hacky.

Have the service, on right-click, go ahead and get the link title. Then have your script check which application is frontmost. Have it get the page source from Safari, or the raw source of a Mail message. Then parse out the URLs and check whether there is only one that uses the exact title text you’ve got; if there is only one, script getting the URL out of the HREF with that title. You’d only have to choose from a list of possibilities when there is more than one HREF with the exact title on the page you’re on.
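That idea can be sketched in shell, which AppleScript can call via “do shell script”. This is only a rough illustration with an invented HTML snippet and title; real page source would really want a proper HTML parser rather than grep/sed:

```shell
# Rough sketch: find the href of every <a> whose visible text exactly
# matches a given link title. HTML and title here are invented examples.
html='<a href="https://example.com/ep1">Listen now</a><a href="https://example.com/ep2">Other</a>'
title='Listen now'

url=$(printf '%s\n' "$html" \
  | grep -oE '<a [^>]*href="[^"]*"[^>]*>[^<]*</a>' \
  | grep -F ">$title<" \
  | sed -E 's/.*href="([^"]*)".*/\1/')

echo "$url"
```

When more than one line survives the title filter, that’s exactly the case where you’d fall back to a “choose from list” dialog.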

That’s nice, thanks t.spoon

Meanwhile I tried to figure out the right syntax to parse the (usually unique) link name (label) from the source text in Mail.

The problem is that awk, sed and grep seem to need a file to work on. I’m used to shells, but no “expert” :frowning: when it comes to dealing with more complicated situations.
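For what it’s worth, awk, sed and grep also read standard input, so no reference file is needed: you can pipe text straight into them (from AppleScript, via “do shell script” and “quoted form of”). A minimal sketch with an invented sample string:

```shell
# Sketch: grep reads the piped text directly; no temporary file involved.
# From AppleScript this could look like:
#   do shell script "echo " & quoted form of getSource & " | grep -oE 'https?://[^ )]+'"
src='Networking ( https://example.com/a ) and more text'
url=$(printf '%s\n' "$src" | grep -oE 'https?://[^ )]+')
echo "$url"
```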

I wrote a very simple and quick solution in pure AppleScript, and apparently everything works flawlessly. But when I compare the extracted URL with the URL in the source text, it becomes clear that URLs aren’t stored as one-line strings: very long URLs are broken across paragraphs. Hence my idea to look for a shell script, which would be much more effective. If I don’t break the source text into paragraphs, AppleScript throws an error.
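If the source is quoted-printable encoded (common in Mail), those broken lines end with a soft line break “=”, so rejoining the continuation lines first gives back one-line URLs. A sketch with an invented two-line sample, using awk:

```shell
# Sketch: join quoted-printable soft line breaks (a trailing "=") before
# extracting URLs, so a URL wrapped across paragraphs comes back in one piece.
src='Label ( https://example.com/very/long=
/path )'
url=$(printf '%s\n' "$src" \
  | awk '{ if (sub(/=$/, "")) printf "%s", $0; else print }' \
  | grep -oE 'https?://[^ )]+')
echo "$url"
```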

Could you try out if this Automator service works for you too? :smiley:

on run {input, parameters}
	set LkLabel to input as text #"Listen to the episode now"
	tell application "Mail" to set getSource to source of item 1 of (get selection)
	set dd to 0
	set lg to length of paragraphs of getSource
	set UrlAdr to "" -- fallback when no matching link is found
	repeat with b in paragraphs of getSource
		set dd to dd + 1
		if LkLabel is in b and "http" is in b then
			set UrlAdr to text (offset of "http" in b) thru -1 of b
			if " )" is in UrlAdr then
				-- URL ends on this paragraph: trim the closing " )"
				set UrlAdr to text 1 thru ((offset of " )" in UrlAdr) - 1) of UrlAdr
			else
				-- URL continues on the following paragraphs
				set UrlAdr to UrlAdr & my GetFullUrl(getSource, dd, lg)
			end if
			exit repeat
		end if
	end repeat
	tell application "Reminders"
		#prep default import list
		set remind_list to "Web - Links"
		if (exists list remind_list) is false then
			make new list with properties {name:remind_list}
		end if
		set allItems to name of reminders of list remind_list
		if LkLabel is not in allItems then
			make new reminder at end of list remind_list with properties {name:LkLabel, body:UrlAdr}
		end if
	end tell
end run

on GetFullUrl(getSource, dd, lg)
	set dd to dd + 1
	set myUrl to ""
	repeat with c from dd to lg
		set thisP to paragraph c of getSource
		if thisP is not "" then
			if " )" is in thisP then
				-- last fragment: keep everything up to the closing " )"
				set getOffs to (offset of " )" in thisP) - 1
				set myUrl to myUrl & (text 1 thru getOffs of thisP)
				exit repeat
			else
				set myUrl to myUrl & thisP
			end if
		end if
	end repeat
	return (myUrl as text)
end GetFullUrl