[Greasemonkey] UI idea

Aaron Boodman zboogs at gmail.com
Tue Apr 5 11:31:13 EDT 2005


Good points. I'm glad you're listening.

> I don't quite follow Aaron's talking about delicious -- won't using
> that site create as much load as using your own?  

Yes, but someone else's load - more on that below.

> (Really, it'd likely
> create more load, 'cause you can optimize your code for GM's need but
> delicious isn't changeable.)  

Well, I think you can use its existing functionality. This is really
the only thing that appeals to me about that idea: we don't have to
write anything. People could tag things as:

userscript
allsites

or:

userscript
google.com
gmail.com
etc...

We could put a frontend on userscripts.org that figures out the right
tags for a script.

Say then that GM visits google.com. It would request from delicious
everything tagged "userscript" AND "google.com", and then do the
pattern matching itself. Presumably, it would cache the results of
these calls for some time.
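To make the "do the pattern matching itself" part concrete, here's a rough sketch (my assumption of how it might look, not real GM internals) of turning a script's @include/@exclude globs into regexes and testing them against the current URL:

```javascript
// Sketch: convert a Greasemonkey-style glob (e.g. "http://*google.com/*")
// into a RegExp by escaping metacharacters and mapping '*' to '.*'.
function includeToRegExp(glob) {
  var escaped = glob.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  return new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
}

// A script applies if any @include matches the URL and no @exclude does.
// 'meta' is a hypothetical object like {include: [...], exclude: [...]}.
function scriptMatches(meta, url) {
  var included = meta.include.some(function (g) {
    return includeToRegExp(g).test(url);
  });
  var excluded = (meta.exclude || []).some(function (g) {
    return includeToRegExp(g).test(url);
  });
  return included && !excluded;
}
```

So GM would pull the list of scripts tagged for the domain, then run each one's metadata through something like scriptMatches() before showing it to the user.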

> [delicious is] the opposite of decentralized 'cause they're by definition
> only on one site.

Wasn't it you who quipped recently that 'all the interesting stuff is
centralized' :)? I would love for it to be decentralized, but we have
already seen that google (and every other search engine) cannot find
user scripts. Grr. Which is why we came to delicious in the first
place. We could change the extension. Even then, I'm not sure how well
web indexing would work in this case; it seems like there might be a
lot of noise.

> The bloggery way to track scripts would be getting trackbacks sent to
> a particular userscript.org URL when people post.  But then you don't
> have any structured data available on userscript.org.  

The structured data is in the user script header. userscripts.org
could simply request it.
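For anyone who hasn't looked at one recently, the header is the ==UserScript== comment block at the top of the script, and pulling the structured data out of it is only a few lines. A sketch (my own parsing, not code from GM itself):

```javascript
// Sketch: extract the metadata from a user script's
// "// ==UserScript== ... // ==/UserScript==" header block.
function parseHeader(source) {
  var meta = {};
  var block = source.match(/\/\/ ==UserScript==([\s\S]*?)\/\/ ==\/UserScript==/);
  if (!block) return meta;
  var lines = block[1].split('\n');
  for (var i = 0; i < lines.length; i++) {
    var m = lines[i].match(/^\/\/\s*@(\S+)\s+(.+)$/);
    if (!m) continue;
    // Keys like @include can repeat, so collect every value into an array.
    if (!meta[m[1]]) meta[m[1]] = [];
    meta[m[1]].push(m[2]);
  }
  return meta;
}
```

userscripts.org could fetch the raw .user.js file and run it through something like this to get @name, @include, @exclude, etc. as structured data.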

> Another reason I think sending off URLs to see what matches is a bad
> idea is that it becomes sorta spywareish: sending off every URL I
> visit to some other site!

Yikes. That's a good point.

> If you really want it to have some visual indicator of whether scripts
> will apply to this page, the best I can think of is:
> - have userscript.org index all scripts, extracting their @include lines
> - have GM fetch that index every so often, like once a day
> - on each page load, iterate through the index
> 
> But that still won't work for scripts that apply to every page but
> only modify some (like my nyt linker or my unembed script).  

I don't understand this bit; I'm probably missing something. I don't
care if a script only modifies some pages. I'm trying to find scripts
that apply to this page (e.g., their includes and excludes would make
them run on this page), whether or not they would actually do
something productive.

I think even that much is hard, but I figure there is some fancy
indexing (hand waving here) that could make it work. At the very
least, I think that by indexing on domain-name substrings found in the
includes (or allsites if none is found), you'd cover a giant
percentage of existing user scripts.

So for instance:

http://*google.com/ - indexed under "google.com"
*.xhtml - indexed under "all sites"
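That extraction step is simple enough to sketch. Something like this (just hand-waving made concrete; the regex is my guess at what "domain-shaped" means) would cover both examples above:

```javascript
// Sketch: pull a host-like token out of an @include pattern to use
// as an index key, falling back to "all sites" when there isn't one.
function indexKeyFor(include) {
  // Skip the scheme and any leading wildcard, then grab a
  // domain-shaped substring like "google.com".
  var m = include.match(/^https?:\/\/\*?\.?([a-z0-9.-]+\.[a-z]{2,})/i);
  return m ? m[1] : 'all sites';
}
```

It's crude (it would miss includes like "http://*/foo"), but those would just land in the "all sites" bucket, which degrades gracefully.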

Backing up a step, there are two main goals here: 
* make it easy to find out which scripts apply to the current site
from the browser
* develop a better repository which can deal with lots and lots of scripts

I don't really have an opinion on the second one. For the first one, I
think it would be cool, though not necessary, to let users know as
they browse that scripts are available for the current page. It
doesn't seem to change anything except load. It's still sorta
spywareish to send URLs to another server, even when you "ask" by
navigating to the part of the UI that does that, so I think that if we
do this, we would have to ask users if they are OK with it (and never,
ever, store the requests).

So we are left with three options:

* Index on a custom server, make requests per-page (with caching)
* Index on client (or server, I don't think it makes a big
difference), make requests to get entire list ~ once a day.
* List scripts on delicious, index by assigning appropriate tags. 
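For the second option, the client-side caching logic is trivial; roughly this (fetchIndex is a stand-in for however we'd actually download the list):

```javascript
// Sketch of the "fetch the whole index ~once a day" option.
// 'cache' is a persistent object {index, fetchedAt}; 'now' is a
// timestamp in ms; 'fetchIndex' is a hypothetical downloader.
var ONE_DAY_MS = 24 * 60 * 60 * 1000;

function getIndex(cache, now, fetchIndex) {
  // Re-use the cached copy unless it is more than a day old.
  if (cache.index && now - cache.fetchedAt < ONE_DAY_MS) {
    return cache.index;
  }
  cache.index = fetchIndex();
  cache.fetchedAt = now;
  return cache.index;
}
```

Note this never sends the user's URLs anywhere, which sidesteps the spyware concern entirely; only the index download leaves the machine.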

The major difference I see in any of these is maintenance. I'd rather
foist the issue of database maintenance onto a dedicated service than
create our own.
