[Greasemonkey] UI idea

Evan Martin evan.martin at gmail.com
Tue Apr 5 10:25:37 EDT 2005

On Apr 5, 2005 6:54 AM, Jeremy Dunck <jdunck at gmail.com> wrote:
> On Apr 5, 2005 3:21 AM, Aaron Boodman <zboogs at gmail.com> wrote:
> > Now I realize what you meant I think -- that it's hard to lookup
> > scripts that match by URL.
> >
> > Maybe we could cheat a little. Associate scripts in del.icio.us with a
> > specific domain, or the special * domain. GM finds all scripts that
> > match the current domain and the all domain, and then does the pattern
> > matching itself.
> Hmm.  I had planned to use del.icio.us just as a discovery and sorting
> mechanism.
> Displaying available scripts per page would be based on an REST
> service off of userscript.org.  If the load turns out to be too much
> (even caching results) then we could use a trusted delicious user (I
> just registered userscriptdotorg) whose tags would be populated by the
> directory.

I don't quite follow Aaron's point about delicious -- won't using
that site create as much load as using your own?  (Really, it'd likely
create more load, since you can optimize your own code for GM's needs
but delicious isn't changeable.)  I think Jeremy's talking about using
delicious for finding more scripts, which seems reasonable except that
there are lots of people (like me) who don't use that site.  And
delicious tags are the opposite of decentralized, since by definition
they live on only one site.

The bloggery way to track scripts would be to have trackbacks sent to
a particular userscript.org URL whenever people post.  But then you
don't have any structured data available on userscript.org.  I'm still
inclined to think that simply having people fill out a form on
userscript.org is the best idea.

Regarding detection:

Another reason I think sending off URLs to see what matches is a bad
idea is that it's sort of spyware-ish: it means sending every URL I
visit to some other site!

If you really want a visual indicator of whether any scripts apply to
the current page, the best scheme I can think of is:
- have userscript.org index all scripts, extracting their @include lines
- have GM fetch that index every so often, like once a day
- on each page load, iterate through the index
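
A minimal sketch of the client side of that loop, assuming the fetched
index maps script names to their raw @include globs (the function names
here are illustrative, not GM's actual API):

```javascript
// Convert a Greasemonkey-style @include glob (where "*" matches any
// run of characters) into a RegExp; everything else matches literally.
function includeToRegExp(glob) {
  var escaped = glob.replace(/[.?+^$\[\]\\(){}|\/-]/g, "\\$&");
  return new RegExp("^" + escaped.replace(/\*/g, ".*") + "$");
}

// Given an index of {scriptName: ["@include glob", ...]}, return the
// names of scripts with at least one glob matching the current URL.
function scriptsMatching(index, url) {
  var hits = [];
  for (var name in index) {
    for (var i = 0; i < index[name].length; i++) {
      if (includeToRegExp(index[name][i]).test(url)) {
        hits.push(name);
        break;
      }
    }
  }
  return hits;
}
```

Note this is linear in the total number of globs, which is exactly the
scaling problem mentioned below.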

But that still won't work for scripts that apply to every page and
only modify some (like my nyt linker or my unembed script).  A more
complicated @include language would handle those, but I don't know if
it's worth it.  And matching will slow down as more scripts are
created.  With a more restricted @include language you could make the
indexing faster, I imagine.
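
To illustrate what a restricted language could buy (a hypothetical
scheme, not anything GM supports): if every @include had to begin with
a literal scheme and hostname, the index could be keyed by hostname, so
each page load probes two hash buckets instead of scanning every glob.

```javascript
// Hypothetical: build a hostname-keyed index from @include lines of
// the restricted form "http://host/anything".
function buildIndex(scripts) {
  var byHost = {};
  for (var name in scripts) {
    scripts[name].forEach(function (glob) {
      var m = /^https?:\/\/([^\/*]+)\//.exec(glob);
      var host = m ? m[1] : "*";  // unparseable globs -> check-always bucket
      (byHost[host] = byHost[host] || []).push(name);
    });
  }
  return byHost;
}

// Lookup cost no longer grows with the number of scripts.
function candidatesFor(index, url) {
  var host = /^https?:\/\/([^\/]+)/.exec(url)[1];
  return (index[host] || []).concat(index["*"] || []);
}
```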
