
Grant bots the sboverride userright
Closed, ResolvedPublic

Description

The purpose of the spam blacklist is to prevent spam, not to prevent specific links from being added to a page. As bots are already vetted to ensure that they are not going to spam the wiki, they should be exempt from the spam blacklist. Among other issues this would solve (see the discussion in T36928 for more), archive bots can attempt to archive a section containing a blacklisted link, but the bot is unable to add the discussion to the archive page. This results in thousands of discussions being lost, usually without anyone knowing it happened.

Event Timeline

What's the proposed implementation?

Bot is a default user group in MediaWiki. Should code be added to SpamBlacklist to automatically add sboverride to the bot group by default? Or do we need to do this one wiki at a time via config files?

Should code be added to SpamBlacklist to automatically add sboverride to the bot group by default? Or do we need to do this one wiki at a time via config files?

I think this right can be added to the standard set of bot rights and removed in the config of individual wikis at the request of their users. Please complete this request; every month without this feature is an inconvenience for bots and bot operators.
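
For the per-wiki route, this would be an ordinary permissions grant in each wiki's configuration. A minimal sketch, assuming the 'sboverride' right that SpamBlacklist registers:

  // LocalSettings.php on a single wiki
  wfLoadExtension( 'SpamBlacklist' );

  // Let members of the built-in 'bot' group save edits that match the blacklist.
  $wgGroupPermissions['bot']['sboverride'] = true;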

Urbanecm subscribed.

Removing Community-consensus-needed; this is a change in the code, not a configuration change. Wikimedia-specific config may be requested in Wikimedia-Site-requests.

IMO that tag should also be used for changes in code, but 🤷

Change 923728 had a related patch set uploaded (by Legoktm; author: Legoktm):

[mediawiki/extensions/SpamBlacklist@master] Grant bots the "sboverride" userright by default

https://gerrit.wikimedia.org/r/923728

Legoktm subscribed.

Like I said last year (T36928#8058329), introducing this right without giving it to anyone is silly. Bots seem like a good place to start given that people seem to have some apprehension about giving it to humans.

Change 923728 merged by jenkins-bot:

[mediawiki/extensions/SpamBlacklist@master] Grant bots the "sboverride" userright by default

https://gerrit.wikimedia.org/r/923728
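
The merged change presumably amounts to a default grant in the extension's registration, along these lines (a sketch, not the verbatim patch); a wiki that does not want the behaviour can revoke it locally:

  // Sketch of the default grant in SpamBlacklist's extension.json:
  //   "GroupPermissions": { "bot": { "sboverride": true } }

  // Local opt-out in a wiki's LocalSettings.php:
  $wgRevokePermissions['bot']['sboverride'] = true;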

Suggested Tech News wording:

Bots will no longer be prevented from making edits because of URLs that match the [[mw:Special:MyLanguage/Extension:SpamBlacklist|spam blacklist]].

stjn reopened this task as Open. Edited May 27 2023, 7:15 PM
stjn subscribed.

To get into User notice, this task needs to remain open, @MBH

Quiddity subscribed.

Tasks don't need to be "open" to show up in the User-notice workboard, which is where I/we get the Tech News entries from. Thanks though!
And thanks immensely for the draft wording! Added.

This should not be hardcoded like this. It should be coded so as to allow additions of matches according to user rights: some links should simply never be added (not even by admins, as the current SBL works), while others should be disallowed only for unregistered or new editors. In this way one could even allow link shorteners for, e.g., extended-confirmed users or users with specifically granted rights, while disallowing their use by spambots (and then bots can replace all of them).
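
Purely as a hypothetical sketch of that idea (none of these names exist in the current extension), per-entry tiers keyed on user rights might look like:

  <?php
  // Hypothetical: each blacklist entry carries the minimum right needed to
  // add it, instead of a single global 'sboverride' check.
  // 'never' reproduces the current hard-blacklist behaviour (not even admins).
  $blacklistTiers = [
      '/\bbit\.ly\//'     => 'extendedconfirmed', // shorteners: trusted users only
      '/\bspam-domain\./' => 'never',             // never addable, by anyone
      '/\bborderline\./'  => 'autoconfirmed',     // blocked only for new/anon editors
  ];

  function mayAddLink( array $userRights, string $requiredRight ): bool {
      // 'never' entries cannot be overridden by any right.
      return $requiredRight !== 'never'
          && in_array( $requiredRight, $userRights, true );
  }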

Note that overriding the SBL like this will result in massive vandalism-fighting problems: if a vandalism edit removes a blacklisted link, and a good follow-up edit then makes the vandalism edit impossible to revert, you cannot edit the vandalised information back in because you are not a bot.

@Beetstra can you elaborate on the use case you are mentioning - is it this?

  1. A page exists, and has a SBL link on it
  2. A disruptive edit is made to the page, where the edit also removes the blacklisted link
  3. A constructive edit is made to the page
  4. ???

Please also identify what is different from the prior state.

if a vandalism edit removes a blacklisted link, and a good follow-up edit then makes the vandalism edit impossible to revert, you cannot edit the vandalised information back in because you are not a bot

It has always worked this way; the change discussed here did not change this mechanism and did not make it worse, so this is not the place to make such claims. And secondly, you can roll back such an edit if you are a rollbacker.

  1. Now try to revert the edit that vandalized … you can if the diffs are not conflicting (a revert overrides the SBL), but if they are conflicting you cannot; you have to edit it back in, but you cannot, because that would add a blacklisted link back in.

What I mean is, the only way to get the blacklisted link back in is through a revert or undo, not through a regular edit. And situations do happen where a revert is wrong (you would revert one or more good edits, which you then have to re-do), and where undo does not work.

In short, even if blacklisted links on pages are not an immediate problem, they ALL need to be removed or whitelisted. Allowing them to be added (by bots, of all things) is just plain wrong. There is a reason why blacklisted links are blacklisted: most of them are crappy material that we do not want to link to, and quite a significant number of them are blacklisted not because they were spammed, but because the community did not want them to be linked.

Bots are not really "adding" blacklisted links; the links were already there. We can continue to block bots from doing their work, but the consequences, IMO, are much more severe than the edge scenario outlined above.

@Beetstra The reasons why we requested this change are:

  1. because archiving bots can't archive a discussion if it contains a link that was blacklisted after the discussion took place. Ruwiki lost thousands of discussions because of this; it is one of the most disruptive events in Wikipedia history.
  2. because tools like AWB repair broken link syntax on pages when editing in automatic mode, and if a bot repairs a blacklisted link, it can't save the edit (a sketch of the failure a bot sees is below).
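
To make the failure mode in point 2 concrete, here is a minimal sketch of what a bot sees when such a save is rejected, assuming a standard api.php endpoint and the spamblacklist error code reported by the extension (illustrative, not taken from any bot's actual code):

  <?php
  // Sketch: an archive bot's failed save via the action API.
  // $text and $token are assumed to come from a normal bot login/edit flow.
  $response = json_decode( file_get_contents(
      'https://example.wikipedia.org/w/api.php', false, stream_context_create( [
          'http' => [
              'method'  => 'POST',
              'header'  => 'Content-Type: application/x-www-form-urlencoded',
              'content' => http_build_query( [
                  'action' => 'edit',
                  'title'  => 'Talk:Example/Archive 1',
                  'text'   => $text,  // archived section containing the blacklisted URL
                  'token'  => $token,
                  'format' => 'json',
              ] ),
          ],
      ] )
  ), true );

  if ( isset( $response['error'] ) && $response['error']['code'] === 'spamblacklist' ) {
      // Without 'sboverride', the bot lands here and the discussion is never archived.
  }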

@MBH: on en.wiki, bots just don't archive when they can't save. Moreover, the discussions are not lost; barring oversight and deletion, everything is still there. The better solution is to break the link so it is disabled. Well, actually the better solution would be for the SBL to be able to distinguish namespaces.
Bots should not repair spam links; those links should be removed. There is a reason why stuff is blacklisted: the community does not want it. Do you want spammers to deliberately maim redirect links so that a bot repairs them for them and they have their link? Spammers spam because it makes them money. Any link they get gives them a chance of being followed.

Fix the SBL, not circumvent the problems it gives.

if a vandalism edit removes a blacklisted link, and a good follow-up edit then makes the vandalism edit impossible to revert, you cannot edit the vandalised information back in because you are not a bot

It has always worked this way; the change discussed here did not change this mechanism and did not make it worse, so this is not the place to make such claims. And secondly, you can roll back such an edit if you are a rollbacker.

Yes, it has always been like this, and that is why blacklisted links need to be removed, and certainly not added en masse by bots.

Rollback and undo do indeed work, but not everyone has the former right, and if there are follow-up edits, rollback does not work, and sometimes undo doesn't either.
