RubyForge has been slow today because

For the record, the answer is "no".

Chad Fowler

···

On Mon, 25 Oct 2004 08:58:00 +0900, Hal Fulton <hal9000@hypermetrics.com> wrote:

trans. (T. Onoma) wrote:
>
> I am truly sorry if this inconveniences you, but it's the sacrifice we all
> need to make if we wish to continue to have such a great resource.
>

Have you been put in charge of RubyGarden? I had not heard of that.

Hal

------------------
http://chadfowler.com

http://rubygems.rubyforge.org (over 20,000 gems served!)

Good god, it's getting stupid with the spam already. I was just clicking through and noticed pages of spam. What is the solution RubyGarden is going to use to fix the problem? It's starting to get irritating now. I'm having to de-spam RubyGarden pages when I see them.

David Ross

···

--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

gabriele renzi wrote:

trans. (T. Onoma) wrote:

> I agree and CAPTCHA was my first suggestion. But the general take seemed
> to be against it, citing reasons of use and implementation, and that
> spammers would just find a way around it. I'm not so sure about these
> points, but nonetheless pre-moderating pages with new external links is
> simple enough and 100% effective.

but this kills half of the goodness of the wiki...
I remain of the opinion that just stopping the bots by changing the
post interface a little could be enough

How can you change it so HTML won't use a <form> tag with an <input type="submit"> button?

Automated Web page hits don't need to "look for" the Submit button by
pixels. They just parse the page and concoct an HTTP POST request.
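
For illustration, a rough Ruby sketch of what such a bot does (the URL and
field names here are invented):

    require 'net/http'
    require 'uri'

    # A bot never renders the page or "clicks" anything; it just sends
    # the same POST the browser's <form> would have produced.
    uri = URI.parse('http://wiki.example.com/cgi-bin/wiki.pl')
    res = Net::HTTP.post_form(uri,
            'title' => 'SomePage',
            'text'  => 'spam spam http://spam.example/')
    puts res.code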

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

Half? I've done a lot of wiki editing, probably more than most. In all that
time I've added just over a handful of external links. This change only
affects pages with _new_ external links. So I do not see how this is
anywhere near "half". Do you honestly add a new external link every other
time you edit/add a wiki page?

T.

···

On Sunday 24 October 2004 09:49 am, gabriele renzi wrote:

trans. (T. Onoma) wrote:
> I agree and CAPTCHA was my first suggestion. But the general take seemed
> to be against it, citing reasons of use and implementation, and that
> spammers would just find a way around it. I'm not so sure about these
> points, but nonetheless pre-moderating pages with new external links is
> simple enough and 100% effective.

but this kills half of the goodness of the wiki...
I remain of the opinion that just stopping the bots by changing the
post interface a little could be enough

Did you notice that page was spammed too?

T.

···

On Sunday 24 October 2004 11:08 am, ts wrote:

> I would love captchas, I think they are a good idea. Unfortunately not
> all people think so; I think they need to be implemented regardless. It
> would solve problems.

make it optional

     http://simon.incutio.com/archive/2004/07/29/jimmy

How about an ASCII-art-based CAPTCHA? I'd be willing to take that route if
more people find it preferable.

T.
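
It could be as simple as this toy sketch (the tiny four-glyph "font" is
invented for illustration):

    # Toy ASCII-art captcha: render a short word in a crude banner font;
    # the edit form then accepts the page only if the poster types the
    # word back.
    FONT = {
      'R' => ['##. ', '#.#.', '##. ', '#.#.', '#.#.'],
      'U' => ['#.#.', '#.#.', '#.#.', '#.#.', '.##.'],
      'B' => ['##. ', '#.#.', '##. ', '#.#.', '##. '],
      'Y' => ['#.#.', '#.#.', '.#. ', '.#. ', '.#. '],
    }

    word = 'RUBY'
    (0...5).each { |row| puts word.chars.map { |c| FONT[c][row] }.join('  ') }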

···

On Sunday 24 October 2004 10:52 am, David Ross wrote:

Actually I haven't seen any type of anti-bot method being applied;
someone needs to add support via CAPTCHAs and see what happens. Yahoo
uses captchas and you don't see them whinging about it. Wikis are free
support systems via webpages; someone should just build the damn support
to stop this moronic spam.

Jim Weirich wrote:

> I agree and CAPTCHA was my first suggestion. But the general take seemed
> to be against it, citing reasons of use and implementation, and that
> spammers would just find a way around it. I'm not so sure about these
> points, but nonetheless pre-moderating pages with new external links is
> simple enough and 100% effective.

I am trying an experiment on my wikis (UseMod based) where I require all
external links to be written HTTP://host/yada rather than http://host/yada.
Any page with a lowercase http link is rejected (with a message directing
the user to an explanation). The patch to UseMod affected only a few lines
of code. And although the measure is simple to circumvent, it has cut down
spam on my wikis to about one incident a week. I believe Tom has implemented
this patch on some (all?) of the RubyForge project wikis with some success
as well.
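
The check itself is tiny; a rough Ruby equivalent of the idea (the actual
patch is a few lines of Perl inside UseMod):

    # Reject any page whose body contains a lowercase http:// link;
    # deliberate external links must be written HTTP://... instead.
    def lowercase_link?(text)
      !!(text =~ %r{\bhttp://})
    end

    page = 'See HTTP://rubygarden.org/ and http://spam.example/ for more.'
    puts 'rejected -- point the user at the explanation' if lowercase_link?(page)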

Tom has implemented this for at least 5 of the RubyForge projects in which I am involved. It has had a 100% success rate in stopping the spam (so far). This is a temporary measure until we switch over to Ruwiki for RubyForge.

Austin is adding authentication to Ruwiki and Tom is integrating that into RubyForge's login, so you'll have to be logged-in to RubyForge to edit pages.

Curt

···

On Sunday 24 October 2004 09:36 am, trans. (T. Onoma) wrote:

Another possibility might be to combine simple tactics. Perhaps only
require a CAPTCHA if the edit has new external links. It wouldn't be
restrictive or require extra moderation.

Did you notice that page was spammed too?

Yes, this is why I've not given the right reference for the interview

    Wikipedia Founder Jimmy Wales Responds - Slashdot

:-)))

Guy Decoux

Phlip wrote:

gabriele renzi wrote:

trans. (T. Onoma) wrote:

I agree and CAPTCHA was my first suggestion. But the general take seemed
to be against it, citing reasons of use and implementation, and that
spammers would just find a way around it. I'm not so sure about these
points, but nonetheless pre-moderating pages with new external links is
simple enough and 100% effective.

but this kills half of the goodness of the wiki...
I remain of the opinion that just stopping the bots by changing the
post interface a little could be enough

How can you change it so HTML won't use a <form> tag with an <input type="submit"> button?

No, but you can change the way the values are passed. I think a change as simple as renaming the field from 'text' to 'stuff' could suffice.

Automated Web page hits don't need to "look for" the Submit button by
pixels. They just parse the page and concoct an HTTP POST request.

See my previous message. UseMod is a perfect fit for a spambot: widely
used, with a standard interface.
I think that spambots just use its edit form (I mean, they pass the known
field name with their stuff added) without scanning more closely.
That's why we don't see lots of links in the 'summary' notes, and why we
don't find them in 'alternative' wikis such as Ruwiki or Instiki. But this
is just my opinion, I may be completely wrong :)
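
A sketch of that rename idea in Ruby CGI terms (UseMod itself is Perl, and
the handler below is invented):

    require 'cgi'

    cgi = CGI.new

    # Read the page body from a nonstandard field, so canned spambot
    # POSTs -- which blindly fill in the well-known 'text' field --
    # arrive empty and can be dropped.
    body = cgi['stuff']   # renamed from the conventional 'text'
    if body.empty?
      print cgi.header('status' => 'BAD_REQUEST')
    else
      # save the page as usual (not shown)
    end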

trans. (T. Onoma) wrote:

> but this kills half of the goodness of the wiki...
> I remain of the opinion that just stopping the bots by changing the
> post interface a little could be enough

Half? I've done a lot of wiki editing, probably more than most. In all that time I've added just over a handful of external links. This change only affects pages with _new_ external links. So I do not see how this is anywhere near "half". Do you honestly add a new external link every other time you edit/add a wiki page?

T.

I didn't mean that half of the stuff in the wiki is external links :)
I meant that a wiki is a place where you collect information, by creating it and by pointing to places where you can find more; just think of all the pointers to ruby-talk.org. Sorry for being unclear.

trans. (T. Onoma) wrote:

···

On Sunday 24 October 2004 10:52 am, David Ross wrote:
> Actually I haven't seen any type of anti-bot method being applied;
> someone needs to add support via CAPTCHAs and see what happens. Yahoo
> uses captchas and you don't see them whinging about it. Wikis are free
> support systems via webpages; someone should just build the damn support
> to stop this moronic spam.

How about an ASCII-art-based CAPTCHA? I'd be willing to take that route if more people find it preferable.

I've got a PMC (poor man's captcha) that guards comment submission on my blog (http://www.jamisbuck.org/jamis; click a 'comment' link to see it). It's just plain text that must be entered backwards in a text box. Easily circumvented, 'tis true, but I haven't had a single spam comment on my blog since I implemented it.

- Jamis

--
Jamis Buck
jgb3@email.byu.edu
http://www.jamisbuck.org/jamis
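
The whole trick fits in a few lines; a minimal sketch (the challenge word
here is invented):

    # Minimal "poor man's captcha": show a word, require it typed
    # backwards. Trivial to script around, but bots don't bother.
    CHALLENGE = 'comment'

    def pmc_pass?(answer)
      answer.to_s.strip == CHALLENGE.reverse
    end

    p pmc_pass?('tnemmoc')   # => true,  accept the comment
    p pmc_pass?('comment')   # => false, likely a bot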

Curt Hibbs wrote:

Jim Weirich wrote:

> I agree and CAPTCHA was my first suggestion. But the general take seemed
> to be against it, citing reasons of use and implementation, and that
> spammers would just find a way around it. I'm not so sure about these
> points, but nonetheless pre-moderating pages with new external links is
> simple enough and 100% effective.

I am trying an experiment on my wikis (UseMod based) where I require all external links to be written HTTP://host/yada rather than http://host/yada. Any page with a lowercase http link is rejected (with a message directing the user to an explanation). The patch to UseMod affected only a few lines of code. And although the measure is simple to circumvent, it has cut down spam on my wikis to about one incident a week. I believe Tom has implemented this patch on some (all?) of the RubyForge project wikis with some success as well.
   
Tom has implemented this for at least 5 of the RubyForge projects in which I am involved. It has had a 100% success rate in stopping the spam (so far). This is a temporary measure until we switch over to Ruwiki for RubyForge.

Austin is adding authentication to Ruwiki and Tom is integrating that into RubyForge's login, so you'll have to be logged-in to RubyForge to edit pages.

Curt

I think having to be logged in to edit a wiki is good. Can't wait for the full conversion.

David Ross

···

On Sunday 24 October 2004 09:36 am, trans. (T. Onoma) wrote:

--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

I didn't notice that the Simon Willison link mentioned above suggests the
same. Sorry for the noise.

···

On Mon, 25 Oct 2004 01:43:16 -0500, Christian Metts <mintxian.list@gmail.com> wrote:

Another possibility might be to combine simple tactics. Perhaps only
require a CAPTCHA if the edit has new external links. It wouldn't be
restrictive or require extra moderation.
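
That combination is easy to express; a hedged sketch (the link extraction
and helper names are invented):

    # Demand a captcha only when the edit introduces external links
    # that the previous revision didn't already contain.
    def extract_links(text)
      text.scan(%r{https?://\S+}).uniq
    end

    def needs_captcha?(old_text, new_text)
      !(extract_links(new_text) - extract_links(old_text)).empty?
    end

    old_rev = 'See http://www.ruby-lang.org/ for details.'
    p needs_captcha?(old_rev, old_rev + ' Also http://casino.example/!')  # => true
    p needs_captcha?(old_rev, old_rev + ' And some more prose.')          # => false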

...and I hope to finally begin rolling that out on the Ruwiki site
itself later this week. I just added a mandatory redirect through
google.com (Redirect Notice) -- this will
ultimately have a configurable list so that redirects will only be
done for unknown URIs.
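
A rough sketch of that link rewriting (the allow-list and helper name are
invented; only the google.com redirect is from Austin's description):

    require 'cgi'

    # Send unknown external URIs through a redirector so spammed links
    # earn no direct benefit; known hosts pass through untouched.
    KNOWN_HOSTS = %w[ruby-lang.org rubyforge.org rubygarden.org]

    def rewrite_link(url)
      host = url[%r{\Ahttps?://([^/]+)}, 1]
      return url if host && KNOWN_HOSTS.include?(host)
      'http://www.google.com/url?q=' + CGI.escape(url)
    end

    puts rewrite_link('http://rubyforge.org/projects/ruwiki')  # unchanged
    puts rewrite_link('http://spam.example/casino')            # redirected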

I've just been VERY busy lately with work and home life.

-austin

···

On Mon, 25 Oct 2004 03:24:40 +0900, Curt Hibbs <curt@hibbs.com> wrote:

Austin is adding authentication to Ruwiki and Tom is integrating that into RubyForge's
login, so you'll have to be logged-in to RubyForge to edit pages.

--
Austin Ziegler * halostatue@gmail.com
               * Alternate: austin@halostatue.ca
: as of this email, I have [ 5 ] Gmail invitations

I see. That's true. And we have a special tag for ruby-talk. And certainly a
few more special tags like that might be useful too.

But whatever course of action we take, it needs to be firm. And if that means
I have to get permission to post external links, I am willing to do that. A
wiki still offers a lot of goodness even without free-for-all external links.

But like I said, if this kind of moderation isn't preferred, then we can give
a captcha system a try.

T.

···

On Sunday 24 October 2004 01:09 pm, gabriele renzi wrote:

I didn't mean that half of the stuff in the wiki is external links :)
I meant that a wiki is a place where you collect information, by creating
it and by pointing to places where you can find more; just think of all
the pointers to ruby-talk.org. Sorry for being unclear.

gabriele renzi wrote:

see my previous message. UseMod is a perfect fit for a spambot: widely
used, with a standard interface.
I think that spambots just use its edit form (I mean, they pass the known
field name with their stuff added) without scanning more closely.
That's why we don't see lots of links in the 'summary' notes, and why we
don't find them in 'alternative' wikis such as Ruwiki or Instiki.
But this is just my opinion, I may be completely wrong :)

Well, let's put it like this: I think I could write a wiki-spamming
engine in fewer lines than you could, and it would hit more kinds of wikis
than yours.

Rest assured I don't want to push the state of this art...

Public Wikis were a nice concept. Like e-mail, they will remain either
hostile or useless until we invent a more secure 'net infrastructure.

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

That's fine for RubyForge, but not for Garden. Something else must still be
done there.

T.

···

On Sunday 24 October 2004 02:28 pm, David Ross wrote:

>Austin is adding authentication to Ruwiki and Tom is integrating that into
> RubyForge's login, so you'll have to be logged-in to RubyForge to edit
> pages.
>
>Curt

I think having to be logged in to edit a wiki is good. Can't wait for
the full conversion.

David Ross

Jamis,

What did you use to do that captcha? That's a captcha I might actually
support; I just don't want to do an image-based captcha because of
accessibility issues.

-austin

···

On Mon, 25 Oct 2004 02:24:51 +0900, Jamis Buck <jgb3@email.byu.edu> wrote:

I've got a PMC (poor man's captcha) that guards the comment submission
for my blog (http://www.jamisbuck.org/jamis, and click a 'comment' link
to see it). It's just plain text that must be entered backwards in a
text box. Easily circumvented, 'tis true, but I haven't had a single
spam comment on my blog since I implemented it.

--
Austin Ziegler * halostatue@gmail.com
               * Alternate: austin@halostatue.ca
: as of this email, I have [ 5 ] Gmail invitations

trans. (T. Onoma) wrote:

···

On Sunday 24 October 2004 02:28 pm, David Ross wrote:

>Austin is adding authentication to Ruwiki and Tom is integrating that into
> RubyForge's login, so you'll have to be logged-in to RubyForge to edit
> pages.
>
>Curt

I think having to be logged in to edit a wiki is good. Can't wait for
the full conversion.

David Ross

That's fine for RubyForge, but not for Garden. Something else must still be done there.

T.

Why not just have a .Ruby Passport service? RubyForge could set something up like that; if they don't have the bandwidth, I could host it.

David Ross
--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/