RubyForge has been slow today because some freaking dork at the following
IP addresses was continually downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

200.98.63.142 - - [23/Oct/2004:17:41:34 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:17:53:18 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:17:56:34 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:00:47 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:06:31 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:10:56 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:11:14 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:11:28 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:11:41 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:19:10 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 9190167
200.98.63.142 - - [23/Oct/2004:18:19:12 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:19:18 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:23:16 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:23:55 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:26:32 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:26:36 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:27:46 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:28:32 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:29:58 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:31:51 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:32:07 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136

And I mean continually. Those IP addresses are now officially blocked. If we
find the perp who did this, they are going to be NAILED. We realize that
this is probably a DSL line or cable modem. If someone wants to help track
down who is doing this, it would be great. It seems to be coming from Brazil
(www.uol.com.br). RubyForge is a community resource and this screws the
whole community.

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge

Richard Kilmer wrote:

Some freaking dork at the following IP addresses was continually
downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

[snip]

And I mean continually. Those IP addresses are now officially blocked. If we
find the perp who did this, they are going to be NAILED. We realize that
this is probably a DSL line or cable modem. If someone wants to help track
down who is doing this, it would be great. It seems to be coming from Brazil
(www.uol.com.br). RubyForge is a community resource and this screws the
whole community.

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge

I believe there are better ways than blacklisting so many users. RubyForge is a great place for browsing projects, and it would be unwise to prevent users from learning about RubyForge.

Maybe you could implement some sort of cap on downloads per day, or on bandwidth usage per day?
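A per-day download cap of the sort suggested here could be sketched in a few lines of Ruby. Everything below (class name, limit, the in-memory counter) is illustrative, not actual RubyForge code; a real deployment would hook into the web server and persist its state:

```ruby
# Hypothetical per-IP daily download quota. Counts downloads per IP in
# memory and refuses requests once the day's limit is reached.
class DownloadQuota
  def initialize(max_per_day)
    @max_per_day = max_per_day
    @counts = Hash.new(0)     # ip => downloads seen today
    @day    = Time.now.yday
  end

  # Returns true if this download is allowed, false once over quota.
  def allow?(ip)
    if Time.now.yday != @day  # new day: forget yesterday's counts
      @counts.clear
      @day = Time.now.yday
    end
    return false if @counts[ip] >= @max_per_day
    @counts[ip] += 1
    true
  end
end

quota = DownloadQuota.new(5)
quota.allow?("200.98.63.142")  # true until the daily limit is hit
```

The same shape works for a bandwidth cap: accumulate bytes served per IP instead of a request count.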

David Ross


--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

Speaking of attacks, I jumped over to the Garden Wiki just now and see that
the front page is spammed to the hilt and all the RecentChanges are nothing
but spam entries. So then I check out the PreventingWikiSpam page (which I
started) to see what was new there. There I find this interesting link:

  http://www.rubygarden.org/ruby?action=browse&id=SpamIssue&revision=1

Imagine my surprise at seeing this! I am not sure who wrote it (daz?), but
actually I don't really care, though it seems I am being insinuated as the
potential "Great Garden Wiki Spam Artist". Lol! Sorry to disappoint folks,
but it ain't me.

But I can tell you this. I am quite disappointed that the spamming problem
has not been satisfactorily dealt with yet. So much so that I am now declaring
my intent to FIX IT. The plan is simple: any revision that adds an external
link will be denied --or perhaps better, honeypotted. If you want to add
external link(s) you'll have to email the administrator or an official
moderator and ask that the link(s) be added to the page. I doubt the email
load this will create will be very high, especially if spread out over a
handful of moderators. But if it does prove too much, we can later add
temporary passwords (as in 48hrs) to let anyone do so via a special
(yet-to-be-determined) secure interface. That's it. Problem solved.
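The core of that plan, detecting whether a revision adds a new external link, is only a few lines of Ruby. The link pattern and function names below are illustrative, not actual Ruwiki code:

```ruby
# Illustrative check for the "new external link => deny/honeypot" rule.
LINK_PATTERN = %r{https?://[^\s\]>"']+}

def extract_links(text)
  text.scan(LINK_PATTERN).uniq
end

# A revision needs moderation if it introduces any external link
# that was not already present in the previous revision.
def needs_moderation?(old_text, new_text)
  added = extract_links(new_text) - extract_links(old_text)
  !added.empty?
end
```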

So how should I proceed? Should I make a patch for Ruwiki? Or what?

T.

And BTW: I am not BillGuindon either.


On Sunday 24 October 2004 12:41 am, Richard Kilmer wrote:

Some freaking dork at the following IP addresses was continually
downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

[snip]

And I mean continually. Those IP addresses are now officially blocked. If
we find the perp who did this, they are going to be NAILED. We realize
that this is probably a DSL line or cable modem. If someone wants to help
track down who is doing this, it would be great. It seems to be coming from
Brazil (www.uol.com.br). RubyForge is a community resource and this screws
the whole community.

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge

--
( o _ カラチ
// trans.
/ \ transami@runbox.com

I don't give a damn for a man that can only spell a word one way.
-Mark Twain

Richard Kilmer wrote:

Some freaking dork at the following IP addresses was continually
downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

[snip]

And I mean continually. Those IP addresses are now officially blocked. If we
find the perp who did this, they are going to be NAILED. We realize that
this is probably a DSL line or cable modem. If someone wants to help track
down who is doing this, it would be great. It seems to be coming from Brazil
(www.uol.com.br). RubyForge is a community resource and this screws the
whole community.

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge

Solution: check RBL lists, e.g.
http://rbls.org/?q=200.98.136.108

Implement checks against these lists; I use them as well:
                   opm.blitzed.org, /* Remember this is a hijacked-IP range domain, so it's your choice to use. Questions, ask me. */
                   list.dsbl.org,
                   bl.spamcop.net,
                   sbl-xbl.spamhaus.org,
                   dnsbl.njabl.org,
                   http.dnsbl.sorbs.net,
                   socks.dnsbl.sorbs.net,
                   misc.dnsbl.sorbs.net,
                   smtp.dnsbl.sorbs.net,
                   web.dnsbl.sorbs.net,
                   spam.dnsbl.sorbs.net, block.dnsbl.sorbs.net,
                   zombie.dnsbl.sorbs.net,
                   rhsbl.sorbs.net,
                   dnsbl.ahbl.org

You were attacked, yes. The solution is to implement RBLs. This is what to do if you are going to be under attack. Some people, like the big sites, don't care. I know RubyForge isn't HUGE and doesn't have 100,000 terabytes of transfer a month, so it's best to implement RBLs.
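For what it's worth, querying one of those lists from Ruby is straightforward: reverse the IPv4 octets, append the zone, and do a DNS lookup; an NXDOMAIN answer means "not listed". A sketch (which zones are still operating varies over time):

```ruby
require 'resolv'

# Build the DNSBL query name: 200.98.136.108 checked against
# zone.example becomes 108.136.98.200.zone.example.
def dnsbl_query(ip, zone)
  ip.split('.').reverse.join('.') + '.' + zone
end

# True if the IP resolves in the given blocklist zone (i.e. is listed).
def listed?(ip, zone)
  Resolv.getaddress(dnsbl_query(ip, zone))
  true
rescue Resolv::ResolvError
  false   # NXDOMAIN => not listed
end

# e.g. listed?("200.98.136.108", "sbl-xbl.spamhaus.org")
```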

Thanks, and have a nice day.

David Ross


--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

I am contacting the provider...I will attempt to identify the individual.
This is deliberate, and I will defend RubyForge from it. I will blacklist
the subnet as a last resort, but this WILL stop.

-rich


On 10/24/04 12:52 AM, "David Ross" <dross@code-exec.net> wrote:

I believe there are better ways than blacklisting so many users.
RubyForge is a great place for browsing projects, and it would be
unwise to prevent users from learning about RubyForge.

Maybe you could implement some sort of cap on downloads per day, or on
bandwidth usage per day?

trans. (T. Onoma) wrote:

Speaking of attacks, I jumped over to the Garden Wiki just now and see that the front page is spammed to the hilt and all the RecentChanges are nothing but spam entries. So then I check out the PreventingWikiSpam page (which I started) to see what was new there. There I find this interesting link:

http://www.rubygarden.org/ruby?action=browse&id=SpamIssue&revision=1

Imagine my surprise at seeing this! I am not sure who wrote it (daz?), but actually I don't really care, though it seems I am being insinuated as the potential "Great Garden Wiki Spam Artist". Lol! Sorry to disappoint folks, but it ain't me.

But I can tell you this. I am quite disappointed that the spamming problem has not been satisfactorily dealt with yet. So much so that I am now declaring my intent to FIX IT. The plan is simple: any revision that adds an external link will be denied --or perhaps better, honeypotted. If you want to add external link(s) you'll have to email the administrator or an official moderator and ask that the link(s) be added to the page. I doubt the email load this will create will be very high, especially if spread out over a handful of moderators. But if it does prove too much, we can later add temporary passwords (as in 48hrs) to let anyone do so via a special (yet-to-be-determined) secure interface. That's it. Problem solved.

So how should I proceed? Should I make a patch for Ruwiki? Or what?

T.

And BTW: I am not BillGuindon either.

Some freaking dork at the following IP addresses was continually
downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

[snip]

And I mean continually. Those IP addresses are now officially blocked. If
we find the perp who did this, they are going to be NAILED. We realize
that this is probably a DSL line or cable modem. If someone wants to help
track down who is doing this, it would be great. It seems to be coming from
Brazil (www.uol.com.br). RubyForge is a community resource and this screws
the whole community.

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge

These spam attacks are ridiculous. There needs to be some type of honeypot system. I'm still curious how these spammers are working: manually or by bot? Logs or an explanation would help.

I still think the better idea is setting a link to trap the spammers in their own database, where no public changes are made and they only see their own changes.

David Ross


On Sunday 24 October 2004 12:41 am, Richard Kilmer wrote:

--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

trans. (T. Onoma) wrote:

...So much so that I am now declaring
my intent to FIX IT.

Look up "reverse Turing test". You could write one in Java in about 6 hours,
or one in Ruby in 15 minutes.
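For a sense of scale, a bare-bones reverse Turing test in Ruby really is tiny. A plain-text arithmetic challenge (no images, trivially breakable by a determined bot, but enough to stop a dumb one) might look like this sketch:

```ruby
# Minimal text-based challenge: a human answers it without thinking;
# a generic form-posting bot does not. Purely illustrative.
class ArithmeticChallenge
  attr_reader :question

  def initialize(rng = Random.new)
    @a = rng.rand(1..9)
    @b = rng.rand(1..9)
    @question = "What is #{@a} plus #{@b}?"
  end

  # Accepts the submitted form field and checks it against the sum.
  def correct?(answer)
    answer.to_i == @a + @b
  end
end

challenge = ArithmeticChallenge.new
challenge.question   # e.g. "What is 3 plus 7?"
```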

The plan is simple: any revision that adds an external
link will be denied --or perhaps better, honeypotted.

I don't like that, because I add links-out to my other sites all the time.


--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

Richard Kilmer wrote:

I am contacting the provider...I will attempt to identify the individual.
This is deliberate, and I will defend RubyForge from it. I will blacklist
the subnet as a last resort, but this WILL stop.

-rich

I believe there are better ways than blacklisting so many users.
RubyForge is a great place for browsing projects, and it would be
unwise to prevent users from learning about RubyForge.

Maybe you could implement some sort of cap on downloads per day, or on
bandwidth usage per day?

Well, I scanned those two computers and came up with nothing. I seriously doubt the people behind those computers were the ones attacking, but try to contact the provider just in case. Brazil is the ultimate cracker funhouse; there are many exploited computers and many stupid people there (yes, stupid people; I don't care how it sounds, but it's true, and there are smart people in Brazil too). They are the ones who usually get infected, so you are probably going to end up blocking all of Brazil before the end of it. Would it be possible to integrate some sort of alert system into RubyForge? Say, an alert if a person tries to download too much, or keeps downloading without pauses within a 20-second range?

The worst countries are Brazil and some of the Asian countries: spam, crackers, and kiddies.

David Ross


On 10/24/04 12:52 AM, "David Ross" <dross@code-exec.net> wrote:

--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

uol.com.br is one of Brazil's largest internet providers. I would be
careful about blocking the whole subnet.


On Sun, 24 Oct 2004 13:57:48 +0900, Richard Kilmer <rich@infoether.com> wrote:

I am contacting the provider...I will attempt to identify the individual.
This is deliberate, and I will defend RubyForge from it. I will blacklist
the subnet as a last resort, but this WILL stop.

-rich

David Ross wrote:

trans. (T. Onoma) wrote:

These spam attacks are ridiculous. There needs to be some type of honeypot system. I'm still curious how these spammers are working: manually or by bot? Logs or an explanation would help.

I guess it is bots. That is why the spammed pages are usually the ones linked from the main page, and why UseMod-based wikis get spammed almost every day while other engines are not (UseMod is widely used, so it is an optimal target for a spambot).

I still think the better idea is setting a link to trap the spammers in their own database, where no public changes are made and they only see their own changes.

I never understood this. A lone rider spamming some pages by hand is not a problem; the wiki community can fix it easily. Automated systems are the real problem, and should be fought with some simple CAPTCHA. All IMHO, anyway.

David Ross wrote:

These spam attacks are ridiculous. There needs to be some type of honeypot
system. I'm still curious how these spammers are working: manually or by
bot? Logs or an explanation would help.

They are probably using HttpUnit to "test" Wikis.

I still think the better idea is setting a link to trap the spammers in
their own database, where no public changes are made and they only see
their own changes.

Preventing a series of changes from the same IP might raise the bar.
Preventing the same text from appearing on different pages would too.
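Both bar-raisers can be combined in one small guard object. This is an in-memory sketch with made-up names and limits, not code from any wiki engine:

```ruby
require 'digest'

# Illustrative edit guard: throttle bursts of edits from one IP, and
# reject text that already appeared verbatim on a different page.
class EditGuard
  def initialize(max_edits:, window:)
    @max_edits = max_edits
    @window = window                          # seconds
    @edits = Hash.new { |h, k| h[k] = [] }    # ip => edit timestamps
    @seen  = {}                               # text digest => first page
  end

  def allow?(ip, page, text, now = Time.now)
    @edits[ip].reject! { |t| now - t > @window }   # drop old timestamps
    return false if @edits[ip].size >= @max_edits  # too many recent edits

    digest = Digest::SHA256.hexdigest(text)
    first_page = @seen[digest]
    return false if first_page && first_page != page  # same text, new page

    @edits[ip] << now
    @seen[digest] ||= page
    true
  end
end
```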


--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

trans. (T. Onoma) wrote:
> ...So much so that I am now declaring
> my intent to FIX IT.

Look up "reverse Turing test". You could write one in Java in about 6
hours, or one in Ruby in 15 minutes.

No. Motivated spammers will always find a way around these things. A REAL
Turing test --a human being-- is effective and works. We can no longer afford
to play games.

> The plan is simple: any revision that adds an external
> link will be denied --or perhaps better, honeypotted.

I don't like that, because I add links-out to my other sites all the time.

But that's EXACTLY what we don't want! Do us a favor: make a single link to
your own page and add all the links you want there.

I'm truly sorry. But we must do something about this if the Garden Wiki is to
remain a viable resource. I for one have already stopped using it because of
this spam problem, and I am sure others have done likewise. Not to mention the
number of man-hours that have been wasted in fighting this.

I am truly sorry if this inconveniences you, but it's the sacrifice we all
need to make if we wish to continue to have such a great resource.

T.


On Sunday 24 October 2004 02:44 am, Phlip wrote:

Hi,

uol.com.br is one of Brazil's largest internet providers. I would be
careful about blocking the whole subnet.

You are right. UOL is maybe the biggest ISP in Brazil; their main
business is just that. Here in Brazil it's very difficult to
educate people on how to avoid security problems and how to avoid
being a jerk and taking advantage of others. There is an ongoing sense
of impunity. We need professional politicians; unfortunately, many of
our politicians have interests other than the well-being of
society. The only cure for such misery is time. And count that in
hundreds of years. :-)

Meanwhile, I hope that we can fight these vandal acts.

Cheers,
Joao


On Sun, 24 Oct 2004 15:53:22 +0900, Carl Youngblood <carl.youngblood@gmail.com> wrote:

I agree, and CAPTCHA was my first suggestion. But the general take seemed to be
against it, citing reasons of usability and implementation, and the claim that
spammers would just find a way around it. I'm not so sure about these points,
but nonetheless pre-moderating pages with new external links is simple enough
and 100% effective.

T.


On Sunday 24 October 2004 06:29 am, gabriele renzi wrote:

David Ross wrote:
> I still think the better idea is setting a link to trap the spammers in
> their own database, where no public changes are made and they only see
> their own changes.

I never understood this. A lone rider spamming some pages by hand is not
a problem; the wiki community can fix it easily. Automated systems are
the real problem, and should be fought with some simple CAPTCHA.
All IMHO, anyway.

gabriele renzi wrote:

David Ross wrote:

trans. (T. Onoma) wrote:

These spam attacks are ridiculous. There needs to be some type of honeypot system. I'm still curious how these spammers are working: manually or by bot? Logs or an explanation would help.

I guess it is bots. That is why the spammed pages are usually the ones linked from the main page, and why UseMod-based wikis get spammed almost every day while other engines are not (UseMod is widely used, so it is an optimal target for a spambot).

I still think the better idea is setting a link to trap the spammers in their own database, where no public changes are made and they only see their own changes.

I never understood this. A lone rider spamming some pages by hand is not a problem; the wiki community can fix it easily. Automated systems are the real problem, and should be fought with some simple CAPTCHA. All IMHO, anyway.

I would love CAPTCHAs; I think they are a good idea. Unfortunately not all people think so, but I think they need to be implemented regardless. It would solve problems.

David Ross


--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

trans. (T. Onoma) wrote:

I am truly sorry if this inconveniences you, but it's the sacrifice we all need to make if we wish to continue to have such a great resource.

Have you been put in charge of RubyGarden? I had not heard of that.

Hal

trans. (T. Onoma) wrote:

I agree, and CAPTCHA was my first suggestion. But the general take seemed to be against it, citing reasons of usability and implementation, and the claim that spammers would just find a way around it. I'm not so sure about these points, but nonetheless pre-moderating pages with new external links is simple enough and 100% effective.

But this kills half of the goodness of the wiki.
I remain of the opinion that just stopping the bots by changing the post interface a little could be enough.

trans. (T. Onoma) wrote:


On Sunday 24 October 2004 06:29 am, gabriele renzi wrote:

David Ross wrote:
> I still think the better idea is setting a link to trap the spammers in
> their own database, where no public changes are made and they only see
> their own changes.

I never understood this. A lone rider spamming some pages by hand is not
a problem; the wiki community can fix it easily. Automated systems are
the real problem, and should be fought with some simple CAPTCHA.
All IMHO, anyway.

I agree, and CAPTCHA was my first suggestion. But the general take seemed to be against it, citing reasons of usability and implementation, and the claim that spammers would just find a way around it. I'm not so sure about these points, but nonetheless pre-moderating pages with new external links is simple enough and 100% effective.

T.

Actually, I haven't seen any type of anti-bot method being applied. Someone needs to add support via CAPTCHAs and see what happens. Yahoo uses CAPTCHAs and you don't see them whinging about it. Wikis are free support systems via web pages; someone should just build the damn support to stop this moronic spam.

David Ross
--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

I would love CAPTCHAs; I think they are a good idea. Unfortunately not
all people think so, but I think they need to be implemented regardless. It
would solve problems.

make it optional

     http://simon.incutio.com/archive/2004/07/29/jimmy

Guy Decoux

I am trying an experiment on my wikis (UseMod based) where I require all
external links to be written HTTP://host/yada rather than http://host/yada.
Any page with a lowercase http link is rejected (with a message directing
the user to an explanation). The patch to UseMod affected only a few lines
of code. And although the measure is simple to circumvent, it has cut down
spam on my wikis to about one incident a week. I believe Tom has implemented
this patch on some (all?) of the RubyForge project wikis with some success as
well.
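The check itself is essentially one case-sensitive string test: accept a page only if it contains no lowercase http:// link. This is an illustration of the rule, not the actual UseMod patch:

```ruby
# Reject any page text containing a lowercase http:// link; links
# written as HTTP:// pass, since String#include? is case-sensitive.
def accept_page?(text)
  !text.include?("http://")
end

accept_page?("see HTTP://rubyforge.org/")   # accepted
accept_page?("buy at http://spam.example/") # rejected
```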

-- Jim Weirich


On Sunday 24 October 2004 09:36 am, trans. (T. Onoma) wrote:

I agree, and CAPTCHA was my first suggestion. But the general take seemed to
be against it, citing reasons of usability and implementation, and the claim
that spammers would just find a way around it. I'm not so sure about these
points, but nonetheless pre-moderating pages with new external links is
simple enough and 100% effective.