Flea vs RubyGarden

Rubistas:

I sympathize with the battle on spam, but I was using RubyGarden to build
the "home sites" for MRW and Flea. Both require pages with transcluded
graphics, to avoid the need for thousands of boring words.

RubyGarden stopped building the <img> tag around graphic file links.

I'm too lazy to build my own static Web site for these pictures. Were images
in spam a big problem? Or did some script just clobber my graphics by
accident?

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

Phlip wrote:

Rubistas:

I sympathize with the battle on spam, but I was using RubyGarden to build
the "home sites" for MRW and Flea. Both require pages with transcluded
graphics, to avoid the need for thousands of boring words.

RubyGarden stopped building the <img> tag around graphic file links.

I'm too lazy to build my own static Web site for these pictures. Were images
in spam a big problem? Or did some script just clobber my graphics by
accident?

Did you have the location with the capital HTTP://?

David Ross

···

--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

Flipster,

The image links just need to be changed to upper-case the protocol
part of the URL (e.g. HTTP://www.blah.com/someimage.jpg).

Hopefully not too much of a pain.

Thanks,
Chad Fowler

http://chadfowler.com

http://rubygems.rubyforge.org (over 20,000 gems served!)

···

On Sun, 31 Oct 2004 02:23:54 +0900, Phlip <phlip_cpp@yahoo.com> wrote:

Rubistas:

I sympathize with the battle on spam, but I was using RubyGarden to build
the "home sites" for MRW and Flea. Both require pages with transcluded
graphics, to avoid the need for thousands of boring words.

RubyGarden stopped building the <img> tag around graphic file links.

I'm too lazy to build my own static Web site for these pictures. Were images
in spam a big problem? Or did some script just clobber my graphics by
accident?

Wasn't the idea of the "HTTP" patch to only apply to _new_ Wiki edits?

Gavin

···

On Sunday, October 31, 2004, 9:32:52 AM, Chad wrote:

The image links just need to be changed to upper-case the protocol
part of the URL (e.g. HTTP://www.blah.com/someimage.jpg).

Hopefully not too much of a pain.

Gavin Sinclair wrote:

···

On Sunday, October 31, 2004, 9:32:52 AM, Chad wrote:

The image links just need to be changed to upper-case the protocol
part of the URL (e.g. HTTP://www.blah.com/someimage.jpg).
   
Hopefully not too much of a pain.
   
Wasn't the idea of the "HTTP" patch to only apply to _new_ Wiki edits?

Gavin

What would be the point if it were only new? They would spam the current wiki pages.

David Ross
--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

Well, the patch does 2 things: (1) makes "HTTP:" the proper prefix for
recognized lines and (2) rejects edits containing "http:". I suppose the
patch author could have been more clever and allowed either HTTP or http as
recognized line prefixes. Probably a better solution is a script to update
the wiki pages to the new format.
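Jim's two-part description can be sketched in Ruby. This is purely illustrative — the method names are invented and the actual wiki patch works differently:

```ruby
# Illustrative sketch of the two behaviors Jim describes.
# These method names are invented; the real RubyGarden patch differs.

# (1) Only the upper-case "HTTP:" prefix is recognized as a link.
def render_links(text)
  text.gsub(%r{HTTP://(\S+)}) { %(<a href="http://#{$1}">http://#{$1}</a>) }
end

# (2) Any edit that still contains a lower-case "http:" is rejected.
def accept_edit?(new_text)
  !new_text.include?("http:")
end

accept_edit?("See HTTP://example.org/pic.jpg")   # => true
accept_edit?("Buy pills at http://spam.example") # => false
```

The "better solution" Jim mentions would be a one-off migration in the same spirit: rewrite `http://` to `HTTP://` across the stored pages so existing content keeps rendering.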

···

On Saturday 30 October 2004 08:21 pm, Gavin Sinclair wrote:

On Sunday, October 31, 2004, 9:32:52 AM, Chad wrote:
> The image links just need to be changed to upper-case the protocol
> part of the URL (e.g. HTTP://www.blah.com/someimage.jpg).
>
> Hopefully not too much of a pain.

Wasn't the idea of the "HTTP" patch to only apply to _new_ Wiki edits?

--
-- Jim Weirich jim@weirichhouse.org http://onestepback.org
-----------------------------------------------------------------
"Beware of bugs in the above code; I have only proved it correct,
not tried it." -- Donald Knuth (in a memo to Peter van Emde Boas)

Edits, not pages.

···

David Ross (dross@code-exec.net) wrote:

Gavin Sinclair wrote:

>On Sunday, October 31, 2004, 9:32:52 AM, Chad wrote:
>
>>The image links just need to be changed to upper-case the protocol
>>part of the URL (e.g. HTTP://www.blah.com/someimage.jpg).
>
>>Hopefully not too much of a pain.
>
>Wasn't the idea of the "HTTP" patch to only apply to _new_ Wiki edits?

What would be the point if it were only new? They would spam the current
wiki pages.

--
Eric Hodel - drbrain@segment7.net - http://segment7.net
All messages signed with fingerprint:
FEC2 57F1 D465 EB15 5D6E 7C11 332A 551C 796C 9F04

Jim Weirich wrote:

Well, the patch does 2 things: (1) makes "HTTP:" the proper prefix for
recognized lines and (2) rejects edits containing "http:". I suppose the
patch author could have been more clever and allowed either HTTP or http as
recognized line prefixes. Probably a better solution is a script to update
the wiki pages to the new format.

Thanks - I just "got" it. To rescue my few pages, I must edit _all_ their
http: tags.

When you change a page and save it, any http: tags bounce, not just the ones
you edited.

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

Eric Hodel wrote:

Edits, not pages.

That's the thing - the pages "should" have all been grandfathered-in. I
ain't touching nothin' until the spam fixes stabilize, so the question
remains: Did the new Wiki source disable all graphic transclusion by fiat?
Or by accident?

I didn't think the spammers had (yet) started to push in images.

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

That's right.

···

On Sun, 31 Oct 2004 14:33:54 +0900, Phlip <phlip_cpp@yahoo.com> wrote:

Jim Weirich wrote:

> Well, the patch does 2 things: (1) makes "HTTP:" the proper prefix for
> recognized lines and (2) rejects edits containing "http:". I suppose the
> patch author could have been more clever and allowed either HTTP or http as
> recognized line prefixes. Probably a better solution is a script to update
> the wiki pages to the new format.

Thanks - I just "got" it. To rescue my few pages, I must edit _all_ their
http: tags.

When you change a page and save it, any http: tags bounce, not just the ones
you edited.

--

Chad Fowler
http://chadfowler.com

http://rubygems.rubyforge.org (over 20,000 gems served!)

Phlip wrote:

Eric Hodel wrote:

Edits, not pages.
   
That's the thing - the pages "should" have all been grandfathered-in. I
ain't touching nothin' until the spam fixes stabilize, so the question
remains: Did the new Wiki source disable all graphic transclusion by fiat?
Or by accident?

I didn't think the spammers had (yet) started to push in images.

Right, this spamming problem is just text-related (I've yet to see image spam), but the current obfuscations disallow any type of modification due to the nature of the posed solution.

David Ross

···

--
Hassle-free packages for Ruby?
RPA is available from http://www.rubyarchive.org/

Chad Fowler wrote:

Phlip wrote:

> Thanks - I just "got" it. To rescue my few pages, I must edit _all_ their
> http: tags.
>
> When you change a page and save it, any http: tags bounce, not just the
> ones you edited.

That's right.

Okay. Imagine a spammer changes their URLs (in their input files!) to read
HTTP:

Then they attack.

Any page with http: in it, they can't edit. They must erase the content, or
upgrade its http:s to HTTP:s. The latter is unlikely.

However, my pages, which I just upgraded to HTTP:, are now _more_ vulnerable
to attack.

>sigh<

"O bother, said Pooh, as his magazine emptied."

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

You bring up an interesting insight. If a page contains a "mark" of
un-save-ability, then it is useless to spammers unless they can identify that
mark and remove it. We've already seen that something as simple as 'http://'
can help do that. Of course eventually they could easily circumvent that. But
we have an interesting solution in the making.

Suppose we allowed a special mark for a page, such that the lock sequence
would have to be _removed_ from the page before it will save. This would
largely thwart spammers b/c they generally just _add_ to the page
--moreover, if the lock sequence is redefinable to some degree it may help
even more.

But how might it be done?
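One hypothetical way the lock-sequence idea might look (nothing like this existed in the actual wiki; the constant and method names are invented):

```ruby
# Sketch of trans's "lock sequence" idea: a page carries a marker
# that an editor must delete before the wiki will accept the save.
# Bots that merely append spam leave the marker in place and fail.

LOCK_MARK = "%%LOCKED%%" # could be redefinable per wiki, per trans

def save_allowed?(old_text, new_text)
  # Unmarked pages save normally.
  return true unless old_text.include?(LOCK_MARK)
  # Marked pages save only if the mark was removed in the edit.
  !new_text.include?(LOCK_MARK)
end
```

A page owner would re-add the mark after each legitimate edit; the cost is one extra manual step per save.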

T.

···

On Sunday 31 October 2004 10:28 am, Phlip wrote:

Chad Fowler wrote:
> Phlip wrote:
> > Thanks - I just "got" it. To rescue my few pages, I must edit _all_ their
> > http: tags.
> >
> > When you change a page and save it, any http: tags bounce, not just the
> > ones you edited.
>
> That's right.

Okay. Imagine a spammer changes their URLs (in their input files!) to read
HTTP:

Then they attack.

Any page with http: in it, they can't edit. They must erase the content, or
upgrade its http:s to HTTP:s. The latter is unlikely.

However, my pages, which I just upgraded to HTTP:, are now _more_
vulnerable to attack.

>sigh<

trans. (T. Onoma) wrote:

You bring up an interesting insight. If a page contains a "mark" of
un-save-ability, then it is useless to spammers unless they can identify
that mark and remove it. We've already seen that something as simple as
'http://' can help do that. Of course eventually they could easily
circumvent that. But we have an interesting solution in the making.

Suppose we allowed a special mark for a page, such that the lock sequence
would have to be _removed_ from the page before it will save. This would
largely thwart spammers b/c they generally just _add_ to the page
--moreover, if the lock sequence is redefinable to some degree it may help
even more.

But how might it be done?

This arms race is bows-and-arrows against nukes. If we put a secret mark on
a page, this is the equivalent of a CAPTCHA, because a human must read the
documentation to find the mark. But the loophole is the spammer will just
erase the entire page.

···

--
  Phlip
  http://industrialxp.org/community/bin/view/Main/TestFirstUserInterfaces

Right, it is sort of a captcha. The only thing I was really drawing on was
that you'd have to take something away rather than put something in. But
you're right, they could just remove the whole page. So you'd have to have
both, I suppose.

Just for sh*ts and g*ggles, what are the spammer ramifications if the wiki
stuck a mark on the end of the page when you go to edit that you must
delete, and also gave a captcha string you must add to the end, in order to
save the document?
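The combined delete-a-mark-plus-add-a-captcha proposal could be sketched like this (hypothetical throughout; the marker format and helper names are invented):

```ruby
require "securerandom"

# Sketch of trans's combined proposal: the edit form appends a mark
# the editor must delete, and shows a captcha string the editor must
# append, before the page will save.

# Prepare the edit form: returns the text with a fresh mark appended,
# plus the mark and captcha the editor must act on.
def prepare_edit(text)
  mark    = "%%DELETE-ME-#{SecureRandom.hex(4)}%%"
  captcha = SecureRandom.hex(4)
  [text + "\n" + mark, mark, captcha]
end

# Accept the save only if the mark is gone AND the captcha was added.
def save_allowed?(submitted, mark, captcha)
  !submitted.include?(mark) && submitted.end_with?(captcha)
end
```

A bot that blindly appends spam fails both checks: the mark survives, and the text no longer ends with the captcha. A bot that wipes the page still has to supply the captcha, which is the part a script can't trivially automate.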

T.

···

On Sunday 31 October 2004 03:13 pm, Phlip wrote:

This arms race is bows-and-arrows against nukes. If we put a secret mark on
a page, this is the equivalent of a CAPTCHA, because a human must read the
documentation to find the mark. But the loophole is the spammer will just
erase the entire page.