Wikipedia:Bots/Requests for approval



If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming it may be a good idea to ask someone else to run a bot for you, rather than running your own.


Current requests for approval


Operator: Elliot321 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 14:46, Saturday, January 23, 2021 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s):

Source code available:

Function overview: Automatically apply {{redirect category shell}} templates to redirects with Wikidata, and remove redundant {{Wikidata redirect}} templates.

Links to relevant discussions (where appropriate):

Edit period(s): one time run

Estimated number of pages affected: 50,000-100,000

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: I recently modified {{Redirect category shell}} to automatically detect Wikidata links and apply {{Wikidata redirect}} when such a link exists. The shell already did this for protection levels, and there is no reason {{Wikidata redirect}} should not be applied the same way.

There are currently 100,000 redirects in the category Category:Redirects connected to a Wikidata item, which is applied by the software. There are currently 30,000 redirects in the category Category:Wikidata redirects. Nearly all of these were put into that category by applying {{Wikidata redirect}} manually, meaning they will need the tag removed (as it will be a duplicate).

Many of the remaining 70,000 pages will need the template {{rcat shell}} added. As the change to {{Redirect category shell}} was recent, many redirects connected to Wikidata items that lack {{Wikidata redirect}} but carry {{Redirect category shell}} have not yet been added to Category:Wikidata redirects. The difference in count between Category:Wikidata redirects and Category:Redirects connected to a Wikidata item is the number of pages that will be modified.

The edits will be carried out with AWB running as an automated bot. There is very low risk of disruption in this task, though the number of edits is significant. Using AWB, this bot can also carry out other generic fixes to redirects, though this is not a significant part of its functions.

A somewhat similar failed request was Wikipedia:Bots/Requests for approval/TomBot, but that request was for a bot that would edit ~30-60x more pages, with less benefit overall. This is a much narrower and more useful request.


  1. Any prior discussions on doing this that you're aware of, which establish broader consensus for this task?
  2. Will this BRFA cause Template:Wikidata redirect to become redundant? If I understand correctly, this task will orphan all of its transclusions? If so, and especially if there's no prior discussion, I suggest sending that template to TfD (and then this bot task can be technically implementing that TfD). That would be one way to test the wider consensus for this task, too.

ProcrastinatingReader (talk) 16:01, 23 January 2021 (UTC)

There are no discussions I know of establishing consensus around this particular task. {{Wikidata redirect}} will not become redundant, for two reasons. First, {{redirect category shell}} transcludes it (though this usage could be subst'd, of course). Second, it is also used on cross-wiki redirects (such as to Wiktionary) and category redirects – the "soft" usage. The "hard" usage could be deprecated from the template, however (the two are implemented slightly differently, with an automatic switch). Elliot321 (talk | contribs) 16:20, 23 January 2021 (UTC)
To begin with, I'd say stylistically this presentation is inferior. See eg here. The one on the top (caused by the edit) doesn't look as good as the manual one & looks slightly out of place with the plaintext.
If the rcat shell has to be manually added by bot, is there really a point to this? Why not have a bot add {{Wikidata redirect}} to pages in Category:Redirects connected to a Wikidata item? ProcrastinatingReader (talk) 00:39, 24 January 2021 (UTC)
Sorry - that was due to my changes being misunderstood and reverted. If you check now, you can see the way they were intended to look.
The reason for adding {{redirect category shell}} over {{wikidata redirect}} is for automatic detection. If the link on Wikidata is removed, no update on Wikipedia is necessary (likewise, if a link on Wikidata is added to one using the shell, no update is necessary). Elliot321 (talk | contribs) 07:52, 24 January 2021 (UTC)
Okay, makes sense. I'd suggest dropping a link to this BRFA from the template talk pages for the two templates, to allow some time for comments. ProcrastinatingReader (talk) 09:37, 24 January 2021 (UTC)
Done. Elliot321 (talk | contribs) 10:08, 24 January 2021 (UTC)

So the idea is that {{RCAT shell}} should add the Wikidata box by checking for the connected item. Manually adding the template wouldn't be necessary then because the software can already detect if a page is connected to a Wikidata item. Is that correct? --PhiH (talk) 13:20, 24 January 2021 (UTC)

@PhiH: pretty much. The shell will automatically detect a link to Wikidata, and if found, transclude the template. Therefore, this bot will remove the redundant manual transclusions of the template, and add the shell to automatically transclude on any redirect linked to Wikidata. Elliot321 (talk | contribs) 15:36, 24 January 2021 (UTC)

SDZeroBot 9

Operator: SD0001 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 12:57, Thursday, November 12, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): TypeScript

Source code available: GitHub

Function overview: Monitor activity of other bots. Issue alerts to bot operators via talk page or email if they subscribe.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests/Archive_80#A_bot_to_monitor_the_activity_level_of_other_bots

Edit period(s): Continuous

Estimated number of pages affected: -

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: Based on pre-configured information about bot tasks (name of bot account, what edit summaries it uses, what pages/namespaces it edits, how many edits are expected in the last x days, etc), it identifies bots and bot tasks which have stopped working. Stalled bot tasks can be identified even if the bot account is still running other tasks. Bots which perform actions other than editing (deletions/blocks/patrols etc) can also be monitored. A bot status table would be generated and posted to WP:Bot activity monitor.

If configured, this bot can also issue alerts to the operator to let them know that their bot tasks are not running. Alerts can be sent via talk page or email, or, least intrusively, via a ping from a central page.

I expect anyone should be able to set up a bot for tracking (to be included in status table), but of course only the operator(s) should set up alerts for themselves.


Pinging some users from the old BOTREQ discussion: @Sdkb, GreenC, Redrose64, Headbomb, Primefac, Majavah, and Amorymeltzer:. – SD0001 (talk) 15:34, 12 November 2020 (UTC)

  • The configuration parameters for describing bot tasks are given at WP:Bot activity monitor. However, I'm a bit confused about how and where people should set up these configurations. My initial thought was to have a central JSON page, Wikipedia:Bot activity monitor/config.json, but the problems with JSON are (i) it requires regexes to be weirdly escaped (eg. \d needs to be written as \\d, though it will show up as \d while viewing the page) and (ii) it looks clumsy, especially when there would be hundreds of tasks. It seems template markup is better for describing the configurations – but should they go on a central page or be decentralized on bot user pages? The drawback of the latter is that it discourages users from setting up tracking for others' bots. – SD0001 (talk) 15:35, 12 November 2020 (UTC)
  • I'm super happy to see this; thanks for your work on it, SD0001! I don't have the expertise to comment on the technical questions, but as far as monitoring goes, my sense is that many bots that stop working have retired operators, so it would be good for there to be notifications not just to the talk page of the operator. Looking forward to seeing this in operation! {{u|Sdkb}}talk 15:58, 12 November 2020 (UTC)
  • Okay, so am I reading it correctly that this is an opt-in situation, only "checking up" on whichever specific bots are listed? Also, why do we need a second process when Wikipedia:Bots/Requests for approval/MajavahBot 3 exists? Primefac (talk) 14:12, 13 November 2020 (UTC)
    @Primefac: This is a lot more advanced than MajavahBot 3. See User:SD0001/Bot_activity_monitor/Report for the kind of output it produces – that's based on the data for a cherry-picked set of bots at Wikipedia:Bot activity monitor/config.json. And as mentioned, it also supports sending notifications to botops. Because all of this requires data about bot tasks in a machine-readable form, it necessarily has to be "opt-in" (though folks can opt in others' bots). – SD0001 (talk) 19:07, 13 November 2020 (UTC)
    Fair enough. Per the general precedent, there's no issue with creating a database-style report for these bots (i.e. "only edits one page") but when it starts getting towards notifications there come more questions. Speaking as a bot operator, I don't really care if someone keeps tabs on my bot, but I don't want any sort of automated notice if I happen to decide not to run one of my "active" tasks for some period of time, and I'd rather not find out after receiving a notification that someone's added my name to the "notify" list. Primefac (talk) 20:15, 13 November 2020 (UTC)
    Agreed. I myself wouldn't want these notifications – I've implemented error handling in my own bot tasks so that whenever an error occurs, I get an email with the stack trace of the error – which is more useful than a generic message from a third-party bot saying the task didn't run. This is why I say above that notifications would (should) only be enabled by botops themselves. But I think we can just let this be a convention rather than try to restrict it at the technical level, and hope that people won't be jerks? Remember that it's technically also possible for a random guy to subscribe you to a random wikiproject's newsletter – but this doesn't seem to happen in practice. – SD0001 (talk) 11:33, 14 November 2020 (UTC)
  • I'll drop a note at WP:BON about this bot. One point worth noting, just in case it isn't obvious, is that the "monitoring" is intended for bots that run fully automatically – with zero human intervention. It wouldn't make sense to track bots that are one-time or on-demand, or even the ones which require any level of operator intervention to run. The intent is to "catch" bot stoppages which the operator may not be aware of, typically occurring due to unforeseen issues such as the ones Anomie mentioned at this discussion, quoting:
  • Something changes that causes the bot to fail unless its code or configuration is updated ...
  • A software update by the hosting provider breaks the bot's code, again requiring a code update.
  • The bot's process stops running or locks up, and the operator isn't paying attention to notice and restart it.
  • The hosting provider where the bot is being run closes the account (or closes entirely).

SD0001 (talk) 12:04, 14 November 2020 (UTC)

  • I think this is helpful and not particularly problematic. Of course, operators are not obligated to run tasks, but many times the downtime is accidental not intentional. For example, when my task 3 stops we lose a day of Main Page history. It did lock up once after some maintenance from my host, so Template:2020 Main Page history is missing November 9b and 10. Setting up good app/server monitoring is not what most bots do. Note I haven't looked too closely at the implementation yet to say if I have any concerns with that part. ProcrastinatingReader (talk) 15:21, 14 November 2020 (UTC)
  • Sounds like a great idea. I would prefer JSON where each bot monitor is its own object with fields for summary regexes, expected run times, number of runs per day, an array of pages that are expected to have been edited, etc. JSON has the added benefit of being highly extensible since it can contain other objects allowing for more complex configurations. That said it may not be widely accessible to less-technical botops, and the regex escape problem is always a nuisance. Either way, sounds great and I look forward to seeing it in operation! Wug·a·po·des 03:25, 15 November 2020 (UTC)
  • BAG question: is the notification system up and running? Primefac (talk) 10:52, 16 November 2020 (UTC)
    Not yet; the notifications code as presently written would keep spamming the botop every half an hour until their bot comes back up again! I'll probably have to use SQLite to keep track of notifications already sent, to avoid repetition. – SD0001 (talk) 15:12, 16 November 2020 (UTC)
    If you can have the dbase/check running without the notifications enabled, feel free to start running that part while the rest gets hammered out. Primefac (talk) 17:40, 16 November 2020 (UTC)
  • @SD0001: Happy new year! Gently checking in on the status of this. — The Earwig talk 05:10, 12 January 2021 (UTC)
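On the JSON-escaping point raised in this discussion: the double backslash is needed because JSON's string-escape layer consumes one level of backslashes before the regex engine ever sees the pattern. A minimal TypeScript sketch (the bot's language; the config field names here are illustrative, not the actual schema):

```typescript
// Raw wikitext of a hypothetical config.json page. To express the regex
// pattern \d+ inside a JSON string, the backslash itself must be escaped,
// so the page source contains \\d+ (written here with TS-level escaping).
const rawJson = '{ "bot": "ExampleBot", "summaryPattern": "task \\\\d+" }';

// JSON.parse strips one level of escaping, leaving the single backslash
// that the regex engine expects: "task \d+"
const config = JSON.parse(rawJson) as { bot: string; summaryPattern: string };

const pattern = new RegExp(config.summaryPattern);
console.log(pattern.test("task 42")); // true: the pattern matches
```

Template markup sidesteps this because parameter values are passed through verbatim, with no string-escape layer in front of the regex.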

Bots in a trial period

Usernamekiran BOT 4

Operator: Usernamekiran (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:47, Thursday, August 13, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): AWB

Source code available: AWB

Function overview: find and replace wikilinks

Links to relevant discussions (where appropriate): User talk:Xaosflux#bot task

Edit period(s): As required

Estimated number of pages affected: variable, based on task

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: I became active again at closing page move discussions. If a DAB page is involved in the moves, sometimes the bot (RussBot) is unable to fix the links, as they are not double redirects. Recently the issue came up with the RM at Talk:Chani (character)#Requested move 30 July 2020, but the links to be fixed were very few. Today a similar issue came up with Talk:Bell Satellite TV#Requested move 6 August 2020. I had to update somewhere around 450 pages from "Bell TV" to "Bell Satellite TV": Bell TV was moved to Bell Satellite TV, and Bell TV was converted into a DAB pointing to Bell Satellite TV, Bell Fibe TV, and Bell Mobile TV. In this case RussBot is not an option, as there wouldn't have been a double redirect; and no fully-automated bot will know which link to choose from the DAB page.

My method/task is pretty basic and simple. I added a basic find-and-replace rule: find [[Bell TV, and replace it with [[Bell Satellite TV. So far I have updated more than 150 articles without any issues, checking each diff in AWB and hitting ctrl+S.

I am aware this is a very basic task, but there is no bot at WP:RM with this function. It is possible that some other bot might be approved for this task, but that would mean I would have to wait till that bot's operator comes online; whereas if I had the approval, I could do it whenever the need arises. As this is a basic and uncontroversial task, I thought I should ask about it. Regards, —usernamekiran (talk) 19:47, 13 August 2020 (UTC)

PS: This method also handles previously piped links: special:diff/972777535. —usernamekiran (talk) 19:49, 13 August 2020 (UTC)
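For illustration, the find-and-replace rule described above (find [[Bell TV, replace with [[Bell Satellite TV) could look like the following in TypeScript. The function name and the boundary check are my own additions, not part of the operator's actual AWB rule; the boundary check keeps titles that merely start with the old name (e.g. a hypothetical "Bell TV Tower") untouched:

```typescript
// Sketch of the find-and-replace rule: retarget wikilinks from an old
// title to a new one, covering both plain and piped links.
function retargetLinks(wikitext: string, oldTitle: string, newTitle: string): string {
  // Escape regex metacharacters in the title.
  const escaped = oldTitle.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  // Match "[[Old Title" only when followed by "]]" or "|" (a piped link),
  // so links to longer titles that share the prefix are not touched.
  const pattern = new RegExp(`\\[\\[${escaped}(?=\\]\\]|\\|)`, "g");
  return wikitext.replace(pattern, `[[${newTitle}`);
}

// Plain link:  [[Bell TV]]              -> [[Bell Satellite TV]]
// Piped link:  [[Bell TV|the provider]] -> [[Bell Satellite TV|the provider]]
const out = retargetLinks(
  "See [[Bell TV]] and [[Bell TV|the provider]] and [[Bell TV Tower]].",
  "Bell TV",
  "Bell Satellite TV"
);
```

Because the lookahead leaves the pipe and display text in place, previously piped links keep their displayed text unchanged, which is the behaviour the PS above describes.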


What's the general logic / criteria for the task? Would this be automatic or assisted somehow (i.e. triggered by moves, or human-supervised, telling it 'fix those specifically') ? Headbomb {t · c · p · b} 17:11, 20 August 2020 (UTC)

@Headbomb: Basically, I would be making the list in AWB after closing the RM, but before performing the actual move. I apologise for selecting the bot's mode as automatic above. The operation would be just like manual AWB editing, with the only exception being the "bot flag", to avoid hitting ctrl+S a lot of times. It would also be convenient and time-saving. —usernamekiran (talk) 23:18, 20 August 2020 (UTC)
For the record, what you just described is "automatic mode". Primefac (talk) 23:51, 20 August 2020 (UTC)
Then where do the bots like MuzikBot, and archive bots fall in? —usernamekiran (talk) 10:51, 21 August 2020 (UTC)
They are also fully automatic. Primefac (talk) 19:32, 21 August 2020 (UTC)

  Approved for trial (1 move discussion). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Alright, then let's have a trial and see if any bolts come loose. Headbomb {t · c · p · b} 23:22, 20 August 2020 (UTC)

  • This is a classic case where context matters. There's a whole wikiproject whose goal is fixing links to dab pages (WP:DPL), and there are various tools they have developed over the years to help with aspects of the work. Sadly, a simple find-and-replace will not work, because the links are by definition ambiguous, so you can't know the intended target unless you examine the context. And you can't assume that the previous primary topic will always be it either. Fixing links in some very narrowly defined contexts can be helpful, like [[New York]] City -> [[New York City]], but situations where it's needed to do this at scale are quite rare. Also adding that, contrary to what was suggested in the Chani RM linked above, closers of RM discussions are not expected to fix dab links resulting from the moves. – Uanfala (talk) 10:19, 21 August 2020 (UTC)
    @Uanfala: Hi. I am aware of the issues that can arise, like the ones you mentioned. Like I stated in the comment above, I will be making the list in AWB (and then I will check the list, and the edits, before making the actual edits); only once the list/edits are okay will I save the edits. It will not be a "blind" task where I make a list from "what links here" and hit save. —usernamekiran (talk) 10:50, 21 August 2020 (UTC)
    @Uanfala: Yes they are, see WP:FIXDABLINKS - "Before moving an article to a qualified name (in order to create a disambiguation page at the base name, to move an existing disambiguation page to that name, or to redirect that name to a disambiguation page), click on What links here to find all of the incoming links and repair them." Narky Blert (talk) 20:56, 22 August 2020 (UTC)
    Thank you for the link, Narky Blert. This is indeed what that page says. This is an abomination we should be thankful is never followed in practice – otherwise many moves wouldn't happen, because the editors involved would never have the time and inclination to fix those links, and we would be stuck with all the bad primary topics in perpetuity. – Uanfala (talk) 21:38, 22 August 2020 (UTC)
    I'm having a look at the talk archives and I'm not surprised that this piece of nonsense doesn't appear to have been discussed much. The one discussion there was on the topic, however, is clear that closers of RM discussions should neither be required nor expected to fix those links. – Uanfala (talk) 21:44, 22 August 2020 (UTC)
    @Uanfala: FIXDABLINKS is idealistic and unworkable, even penal. If there is only a handful of links, it does make sense for the closer to fix the links, as being quicker and simpler than asking the proposer to do so and checking to see if they've done it. The problem is with moves which break several hundred or several thousand links. (Example: a move involving one of the Jammu and Kashmir pages at the beginning of June broke over 2,000 links; two-and-a-half months later, 495 are still broken, see DPwL.) It is unfair to expect that an editor who has made an administrative change reflecting a WP:CONSENSUS should have sole responsibility for cleaning up the mess; especially prior to making the change, which is what FIXDABLINKS says.
    That 2014 discussion is still relevant. It includes a proposal which I came up with independently, that the onus should be on those who supported the move (and in big cases, there will always be more than one of them).
    I once proposed a change to a guideline; I won't be making that mistake again. My legal training taught me that if one party finds a clause ambiguous, it is ambiguous and it needs to be redrafted. I found the wording of a guideline ambiguous; some agreed, others didn't, and the result was WP:NOCONSENSUS. Narky Blert (talk) 07:00, 23 August 2020 (UTC)
    FIXDABLINKS is vague as to who should fix the links. In my opinion, it should be the person(s) requesting the move, rather than the closer who carries out the mechanics of moving the pages but may not be a subject expert. All too often, it gets left for a passing gnome. Certes (talk) 09:37, 23 August 2020 (UTC)
    Yeah, proposing changes to guidelines can be a pain (I would think even more so for those like me who lack any legal training), and it may sometimes be easier to just ignore antiquated guidelines than to attempt changing them, but oh well, there we go: Wikipedia talk:Disambiguation#Rewriting WP:FIXDABLINKS (Links to disambiguated topics). – Uanfala (talk) 14:06, 23 August 2020 (UTC)
    @Uanfala: I've seen that discussion, I intend to comment in detail once I've gathered my wool and rowed up my ducks. Narky Blert (talk) 18:31, 24 August 2020 (UTC)
  • a comment in general: I am okay with fixing the links from my normal (non-bot) account if there are few pages to be fixed. If I am running the task from the bot account, I will make sure there are no issues after my edits. I am requesting the bot flag only for the cases where there are a lot of pages/links to be fixed. For example, in the case of Talk:Bell Satellite TV#Requested move 6 August 2020: in step one, I created the list after the pre-parse mode; there were around 450 articles to be fixed. In step two, I skimmed that list. In step three, I made around 150 edits, and in a second run around 50 more, checking the diff before saving each edit. All were without any technical problems. I performed the remaining ~250 edits without checking the diffs, and after the entire session I checked a lot of them at random; none of the diffs I checked had any problem. What I am trying to say is, I will make a logical judgement before and during step two. I will not be changing the links without getting to know the context. The only difference between a regular/normal AWB run and this bot run would be the last step of saving the edit manually vs automatically. Given my previous experience of moving pages and editing Wikipedia in general, I can figure out where to use caution and how to handle things. I am looking at this trial run as a means to check only the technical accuracy of my method. —usernamekiran (talk) 15:02, 21 August 2020 (UTC)
    Thank you for the detailed explanation. Just a clarifying question, what proportion of links do you manually examine the context of? By examining the context, I mean looking at the article text that has the link and reading the sentence in which the link is found (possibly also the sentences before and after). – Uanfala (talk) 16:10, 21 August 2020 (UTC)
    That is a very broad question. The solution, and the context, begins with the page that is being moved. Bell TV was the easiest one: there was not much scope for linking to the incorrect page (the incorrect page being the one that is to be moved). Biographies are also easy. Till now, whenever I came across such instances, I used to skim the page which is to be moved; that gives you a general idea of what the topic is and where you can expect the links to come from. Also, after making the first list, I remove disamb pages from AWB's list and edit it manually. Most complicated are the ones for communities/castes; sometimes an incorrect process can lead to an (intended) group being linked to a language (eg: Tamils (people), with redirect Tamil people; then there is Tamil language, and Tamil script. A complete list can be found at the dab Tamil). I do have a method for weeding out the pages that might get incorrectly updated, but I don't know how to put it in words. I do it when I see the "what links here" list of Wikipedia, and/or the AWB list. It's based partly on hunch, and mostly on logic (don't know how to put this logic in words). The only thing I can say is, I will make the edits carefully, and there wouldn't be any issues. Not trying to brag, but according to xtools I have moved 1,787 pages till now. That number is exaggerated, as it counts one page swap/round-robin move as 3 page moves. Assuming I have moved 400 pages and performed this task (the one in discussion) a few times, there have never been any issues (however, there were two instances during my early days as a page mover where my closure was not perfect, but almost nothing after that). And I will try to keep things that way   —usernamekiran (talk) 18:42, 21 August 2020 (UTC)
    Thank you for the detailed reply, but I still don't see an answer to my question. Should I take it that you don't intend to examine each link in its context? In that case, this task should definitely not go ahead in its present form. To decide the target of the link, you need to examine the link itself and the text around it; it's not enough to guess from the article title. You can use AWB or another tool in combination with whatever heuristics you choose, as long as you manually examine each link to be fixed: either before fixing it, or in the diff afterwards. A process that gets it right 99% of the time does not lead to a net improvement for readers: a link to a disambiguation page is merely an inconvenience that adds an extra step to a navigation path; an incorrectly disambiguated link, on the other hand, completely cuts off navigation to the intended article and also introduces a factual inaccuracy in the text. If you examine each and every link to be fixed, then it's up to you what technical means you use to save your edit; but whichever way you do it, you shouldn't be doing it under a bot flag, as this would be a manual edit that needs to invite the same amount, and type, of review from page watchers as any other manual edit. – Uanfala (talk) 19:10, 21 August 2020 (UTC)
    Adding that you can ask for more feedback at WT:DPL. – Uanfala (talk) 20:35, 21 August 2020 (UTC)
    @Uanfala: I have no problem with checking the edits. However, I can't understand why you are so reserved about this task. If [ABC] gets moved to [XYZ], and we have to divert the incoming links from ABC to XYZ in [foo] and [lorem] (provided there are already wikilinks in foo and lorem pointing to ABC), I don't see much room for the error you are talking about, if any. —usernamekiran (talk) 08:25, 22 August 2020 (UTC)
    We're talking about moves involving primary topics, right? Say, ABC to ABC (foo) and then ABC (disambiguation) to ABC. Your assumption seems to be that all links to ABC will be intended for ABC (foo). This is not necessarily true, and in fact it is very rarely the case. There will be links to ABC where the target is not ABC (foo), but ABC (bar) or ABC (buz). See for example the incoming links to Gaddi, an article about an ethnic group of India: there are some links intended for it (like at Kareri Lake), but there are also links not intended for it (like at List of biblical names starting with G). – Uanfala (talk) 11:41, 22 August 2020 (UTC)
    That's what I was trying to say when I gave the example of Tamil. Anyway, like I said earlier, I don't have any problem with checking the edits. —usernamekiran (talk) 11:58, 22 August 2020 (UTC)
  • When an article moves and a disambiguation page usurps its title, this is because that title is ambiguous. Many links will intend the moved article but some will have other meanings. It is necessary to examine each link manually. Requests to automatically change links which have been vetted by human eyes occasionally pop up at WP:AWB/TA, and I'm sure many editors use semi-automated tools for similar purposes. (I prefer WP:DisamAssist.) A bot to do that might be useful. However, I would oppose an automated change where links have not been checked manually or even where a sample has been checked. This would simply turn known unknowns into unknown unknowns, for example by replacing Foo by Foo (computing) where Adeline Foo was intended. I recommend automatically generating lists of potentially good changes, then applying them selectively with manual supervision. Certes (talk) 12:09, 22 August 2020 (UTC)
    I did a small amount of the cleanup after Talk:Vinyl#Requested move 19 June 2017, when a redirect was turned into a DAB page. That broke about 2,500 links. Only about 10% of them were intended for the original target, and the other 90% were mostly split three ways in roughly equal amounts. Eyeballs were essential. Narky Blert (talk) 21:05, 22 August 2020 (UTC)
    Another example is New York, which required checking and changing 119,000 wikilinks, of which more than 12,000 were not for the original target. (I did a tiny fraction of the work.) Certes (talk) 21:55, 22 August 2020 (UTC)
  • @Certes, Narky Blert, and Uanfala: Once my problem with AWB resolves (can't start it up), I can fix some links to a page from WP:DPL as a bot trial, or maybe some other page. @Headbomb: I know this is different from the proposed task involving RMs, but the task itself is the same. Would it be possible to run the trial on a page from DPL? —usernamekiran (talk) 09:25, 23 August 2020 (UTC)
    That sounds good to me, if you identify a DPL case where all links are intended for a single target. Do you actually need AWB for this? WP:JWB can perform many AWB tasks, though you will have to get the list of pages to fix from elsewhere such as Special:WhatLinksHere. Certes (talk) 09:32, 23 August 2020 (UTC)
    No, I meant pages whose links are intended for two or three different targets. —usernamekiran (talk) 09:36, 23 August 2020 (UTC)
    (ec) DPwL would be an excellent source for a test-run, it's updated twice daily, and you could choose the number of links from 1 upwards.
    A thought. If a fix isn't obvious, could the bot be programmed to add a {{disambiguation needed}} tag? It's the DABfixer's last resort, and it's remarkable how often problems in Category:Articles with links needing disambiguation get fixed. Narky Blert (talk) 09:44, 23 August 2020 (UTC)
    I will look into DPwL as soon as I get on a computer. A disambiguation needed tag can be added, but that would be just me manually adding it, hehe. Seriously speaking, I think all of that will become clearer after the bot's first 4-5 heavy runs. —usernamekiran (talk) 09:58, 23 August 2020 (UTC)
  • @Certes, Narky Blert, and Uanfala: I picked up Manny Perez from DPwL, but all the incoming links to the disamb were intended for only one article. I can easily fix the disambs of the likes of Chuuk, Stefan Marinović, and Wyn Jones. All of them have very few incoming links, but that's not the reason: as long as the targets are easily identifiable, I can do the task through AWB/bot, no matter how many pages are listed on the disamb page. For example, Cruel Summer has only 5 entries, but all of them are songs; for now I don't know how to approach such situations, but I can come up with some idea/solution if I keep working on the task. Tomorrow I will work on any two articles from the above, and then we can have a bot trial for the remaining third article. Note: there would be no difference in the working method at all, except for the automated saves. —usernamekiran (talk) 18:17, 24 August 2020 (UTC)
    I'm not the best person to consult about these newfangled tool and bot thingies. I fix links to DAB pages the way my father taught me, which is the way his father taught him, and his forefathers taught him, and so on time out of mind - using craftsman-knapped flint tools and low cunning.
    It's the bulk stuff - DAB pages with 10+ links, say - which is a problem. It's mind-blowingly tedious work, and I try to avoid it. Props to those who do firefight such links. Narky Blert (talk) 19:37, 24 August 2020 (UTC)
  • Done with Wyn Jones and Stefan Marinović. No technical issues found. There were a few articles where {{sortname}} was used; I fixed one instance manually, and then a couple through AWB. Then there was this instance where the first and last names were used the other way around, and this instance where an initial was used. Such instances will not be edited even by the automated account. I will fix one link manually (through the non-bot account) while making the lists for the other targets using AWB. Doubtful/problematic articles will be added to one list for manual editing; the other lists will have no chance of error. Links to Chuuk have been fixed by someone. I can fix the links to Big box; there are 4 targets, and 109 incoming links. I am ready to do this with the automated/bot account. Pinging @Headbomb and Primefac: not sure if I should ping the other editors as well. —usernamekiran (talk) 16:38, 27 August 2020 (UTC)
  • Since my last comment here, I have been working on a custom/hybrid module for this task. I got input from David at their talkpage: special:permalink/975938658#find and replace random piped wikilinks. I tested this module in my sandbox, and it worked successfully: special:diff/975939820. I then tested it on the Gaddi disamb. All the edits were as expected, including special:diff/977067060. The only problematic edit was special:diff/977066941, where it changed [[Gaddi]]s to [[Gaddis]]s. I updated my module and fixed the issue: special:diff/977067363. I should have anticipated that, but the module now handles this scenario. In short: if there are many pages to be fixed, I will create the lists first and then handle them through the bot account. If there are not many pages, or creating the lists is not worth it, I will do the task through the non-bot account. But now I can positively say that if I do it from the bot account, there will be no mistakes. —usernamekiran (talk) 19:45, 6 September 2020 (UTC)
    The Gaddi run has introduced a number of grammatical errors. Replacing [[Gaddi]] with [[Gaddis]] doesn't work in all contexts as the first word is singular and the second one – plural. I've checked the first 10 edits, and the following contain this error: [1] [2] [3] [4]. – Uanfala (talk) 20:04, 6 September 2020 (UTC)
    Yes. If I had been told sooner, I could have easily done that. I generally ask about this stuff during/while closing the RM (eg: Talk:Neurolathyrism#Requested move 2 July 2020). In either case, there were no incorrect targets, and there were no technical errors :) —usernamekiran (talk) 20:34, 6 September 2020 (UTC)
    Well, editors are expected to figure this out by themselves as part of their preparation before fixing the dablinks, regardless of the method they're going to use (I only pointed that out to you after I noticed an error washing up on my watchlist). This particular kind of error can be avoided if you don't change the visible article text, but use piping in the link (though of course, there are cases where linking directly is preferable). – Uanfala (talk) 21:28, 6 September 2020 (UTC) Adding that I've now fixed the 13 such errors introduced in this run. – Uanfala (talk) 22:39, 6 September 2020 (UTC)
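The piping approach discussed above can be sketched in a few lines of Python. This is a hypothetical illustration, not the operator's actual AWB module: `retarget_links` points links at a new target while leaving the reader-visible text untouched, which also sidesteps the linktrail problem ([[Gaddi]]s rendering as "Gaddis").

```python
import re

def retarget_links(wikitext, old_target, new_target):
    """Retarget wikilinks without changing what the reader sees.

    Hypothetical sketch (assumes exact-case targets; the real English
    Wikipedia also matches a lowercase first letter):
    - piped links [[Old|label]] keep their label;
    - bare links [[Old]] become [[New|Old]], so any linktrail suffix
      (e.g. [[Gaddi]]s) still renders exactly as before.
    """
    # [[Old|...]] -> [[New|...]]: only the target changes
    piped = re.compile(r"\[\[" + re.escape(old_target) + r"\|")
    wikitext = piped.sub("[[" + new_target + "|", wikitext)
    # [[Old]] -> [[New|Old]]: pipe preserves the visible text
    bare = re.compile(r"\[\[" + re.escape(old_target) + r"\]\]")
    wikitext = bare.sub("[[" + new_target + "|" + old_target + "]]", wikitext)
    return wikitext
```

For example, `retarget_links("the [[Gaddi]]s", "Gaddi", "Gaddis")` yields `the [[Gaddis|Gaddi]]s`, which still displays as "the Gaddis" while linking to the new target.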

BAG note Just as a note, I won't have time to check this for a couple of days/a week-ish so if some other BAG member wants to take a look, feel free. Headbomb {t · c · p · b} 22:04, 6 September 2020 (UTC)

  Trial complete. Done through my alt account Usernamekiran (AWB) with this move. I checked all the diffs; no issues found at all. Sample diffs: sample diff 1, sample diff 2, and 3. —usernamekiran (talk) 18:58, 10 September 2020 (UTC)

So, um... why did this not get run with the bot?
Second question, which I only just realized now since I haven't really been paying much attention (since Headbomb's been the primary BAG) - is there going to be some sort of request process, or is this just so you don't have to hit "save" a bunch of times when you close an RM yourself? Primefac (talk) 22:02, 29 September 2020 (UTC)
Headbomb had approved the trial when the account didn't have a bot flag. I was not sure if I should use the bot account, so I went with the non-bot account. I am also not sure if the bot needs to be enabled somewhere (from your side) to be able to edit the mainspace.
I am willing to accept requests which I can handle without any problems. I can handle doubtful requests with the non-bot account (or the bot account in non-bot mode). If approved, I was thinking about posting a normal discussion thread similar to special:permalink/978981536#bot for link fixing before/after page moves on WT:RM and WT:Page mover, with a really long {{DNAU}}. —usernamekiran (talk) 03:17, 30 September 2020 (UTC)

{{BAGAssistanceNeeded}} Techie3 (talk) 06:48, 29 November 2020 (UTC)

  • Hmm. @Headbomb: thoughts on this? As far as I can see, there are a few arguments from editors for and against the consensus and technical viability of this task; some concerns seem more valid to me than others, though I'll admit I have exactly zero involvement in dab pages. I've advertised this to WT:DPL for comments, as well. I've only checked a couple of diffs, but I have a few thoughts. First, the proposal says "replace [[Bell TV", I presume you mean [[Bell TV| (so it won't match [[Bell TV 2, for example)? Second, edits like Special:Diff/977744374 or Special:Diff/977743962 are introducing the subject's first name, when it seems the article content explicitly is choosing to use last names only? ProcrastinatingReader (talk) 13:46, 29 November 2020 (UTC)

  Approved for extended trial (5 dab runs). Please provide a link to the relevant contributions and/or diffs when the trial is complete. @Usernamekiran: says they can restrict the bot to non-problematic cases, so let's see that in action. One thing though, concerning [[Jonathan David (soccer)|Jonathan David]][[Johnathan David]], I really don't believe those are needed. Especially given Jonathan David might become ambiguous in the future. Same for [[Jonathan David (soccer)|David]][[Jonathan David]], which is straight up an error. Headbomb {t · c · p · b} 15:25, 29 November 2020 (UTC)

Just based on this note it's starting to sound more like a CONTEXT issue... let's see how well this trial goes. Primefac (talk) 15:53, 29 November 2020 (UTC)
@Headbomb: thank you. [[Jonathan David (soccer)|Jonathan David]][[Johnathan David]] was part of the list. At the time, I thought it would be better to use the actual target page instead of the redirect. Also, should I use the bot account for trial runs, or the normal account? @ProcrastinatingReader: Yes, I meant [[Bell TV|, sorry about that. Regarding the use of only the last name (or a partial name of the target page), such instances can be handled in later runs where only the last name is required. As mentioned in the comments above, I will need to create different lists in AWB; one list can be created for partial names. @Primefac: I will use the same conventions already used in the articles; the only context issue here would be the one pointed out by Uanfala above — regarding targets. I will be avoiding such tasks. —usernamekiran (talk) 16:02, 29 November 2020 (UTC)
I've said this multiple times - it's a BRFA, it's a bot trial, so use your bot account. Primefac (talk) 16:03, 29 November 2020 (UTC)
Bot account, yes. Headbomb {t · c · p · b} 16:04, 29 November 2020 (UTC)

I'm a little concerned over the technical soundness here given the David error, though, and have a gut feeling the replacement may be overcomplicating things. Kiran, can you publish the module code here? ProcrastinatingReader (talk) 16:14, 30 November 2020 (UTC)

I am a little confused, not sure how [[Jonathan David (soccer)|David]][[Jonathan David]] is an error. —usernamekiran (talk) 21:10, 30 November 2020 (UTC)
The link originally displayed as "David", after the edit it displays as "Jonathan David". The display of "David" was intentional on those articles, because the scoreboards used players' last names only. As I understand it, this bot task should never be changing how a link displays, it should only be changing the target (removing the pipe when the display = target is merely a technical point). ProcrastinatingReader (talk) 21:13, 30 November 2020 (UTC)
Indeed. Whether to write According to Einstein, ... or According to Albert Einstein, ... is not something the bot can decide. Headbomb {t · c · p · b} 22:35, 30 November 2020 (UTC)
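That invariant — retarget links, never reword what the reader sees — is easy to machine-check. A hypothetical guard (plain Python for illustration; not part of AWB or any existing module) that compares the visible text of every wikilink before and after a proposed edit:

```python
import re

# One wikilink: target in group 1, optional piped label in group 2
LINK = re.compile(r"\[\[([^|\]]+)(?:\|([^\]]*))?\]\]")

def display_texts(wikitext):
    """Reader-visible text of each link: the piped label if present, else the target."""
    return [m.group(2) if m.group(2) is not None else m.group(1)
            for m in LINK.finditer(wikitext)]

def display_unchanged(before, after):
    """True if an edit only retargeted links, leaving the visible text intact."""
    return display_texts(before) == display_texts(after)
```

Under this check, `[[Jonathan David|David]]` → `[[Jonathan David (soccer)|David]]` passes, while `[[Jonathan David (soccer)|David]]` → `[[Jonathan David]]` fails, because the rendered text changes from "David" to "Jonathan David".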

If you'd like something to practice on from the most recent update of Disambiguation Pages with Links, try Argos (3 links in, each needing a different fix, and one of them a MOS:OVERLINK), DSS (3 links in, all the same, but you'll need to find which one on the DAB page it is), Elizabeth Ferris (1 easy fix, 2 will need redlinking) and Saint Mary's College (1 easy fix, 2 might be soluble by googling). Narky Blert (talk) 15:17, 2 December 2020 (UTC)

{{OperatorAssistanceNeeded}} Status? Noticed the bot hasn't made any edits since September; do you still plan to proceed with this task, kiran? If so, could you please address the concerns above, particularly clarification on the task's scope, and ideally post the module code? ProcrastinatingReader (talk) 21:51, 16 January 2021 (UTC)

Hello. I apologise for my absence. There were two separate deaths in my extended family, and a wedding in between, and each of these consumed about 20 days. Regarding the context issue, I will go with the style already used in the article. I will shortly respond to Narky Blert. Courtesy ping to ProcrastinatingReader. —usernamekiran (talk) 16:48, 17 January 2021 (UTC)

No problem, take as long as you need; was just making sure you still planned to proceed with the request. ProcrastinatingReader (talk) 16:54, 17 January 2021 (UTC)
thanks. I was not close to the deceased, nor to the bride, but I had to be there because of the norms. First my computer started behaving erratically, then came the funerals and wedding. I have been very occupied and away from the computer. I think I will be back in 5 to 7 days. —usernamekiran (talk) 17:15, 17 January 2021 (UTC)

Bots that have completed the trial period

ShortDescBot 2

Operator: MichaelMaggs (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 14:22, Friday, January 22, 2021 (UTC)

Function overview: (a) Add new short descriptions to organism articles. (b) Improve some existing moth short descriptions

Automatic, Supervised, or Manual: Automatic, after pre-review

Programming language(s): Pywikibot

Source code available: GitHub

Links to relevant discussions (where appropriate): WikiProject. Also noted on the WP short description page. Not a lot of interest, but there wasn't much for the moths task either, and that was entirely uncontentious.

Edit period(s): One time

Estimated number of pages affected: (a) 210,000 with relevant infobox; (b) 2000 moth articles

Namespace(s): Mainspace

Exclusion compliant: Yes

Already has a bot flag: Yes

Function details: ShortDescBot has successfully completed its addition of new short descriptions to all the moth articles. Next, I want to move on to categories of other organisms. This is a good bot task since non-technical short descriptions complying with WP:HOWTOSD can't be generated automatically from the usual infoboxes, at least not without expensive Lua calls.

Each bot run is based on a single category at some level in the tree that I can manually associate with a suitable common generic name. Sometimes that may be the same as the category name (Category:Butterflies --> "butterfly"), but often not (Category:Poaceae --> "grass" or Category:Onychophorans --> "velvet worm"). The bot then constructs and adds new short descriptions such as "Species of butterfly", "Genus of velvet worms", "Family of grasses" and so on. The text is deliberately simple so that a low error rate (<1%) can be maintained while minimising the number of non-standard articles that the bot has to skip as 'too difficult to parse'. For each category the procedure is:

  1. With the bot in trial mode, write the proposed descriptions to a local spreadsheet; review and repeat until the error rate is sufficiently low
  2. Manually remove from the list any obvious classes of article that the bot will not realistically be able to handle [not had to do this so far in testing]
  3. Re-run the bot in edit mode, making live changes only to the pages in the final corrected list.

The bot won't change existing short descriptions, with one small exception. A new feature this time is the inclusion of "Extinct ..." in the bot-created description of extinct organism articles, and "Single-species ..." in monotypic genus articles (where that can be done without making the text too long). The exception: 2000 or so existing moth short descriptions of the form "Genus of moths" etc. can be improved.
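As a rough sketch of the construction described above (hypothetical code for illustration only — the bot's actual Pywikibot source is linked on GitHub): each run pairs a category with a manually chosen common name, and the description is assembled from the taxonomic rank plus the singular or plural form.

```python
# Hypothetical per-run mapping from category to (singular, plural) common name;
# in practice the association is made manually for each category.
COMMON_NAME = {
    "Category:Butterflies": ("butterfly", "butterflies"),
    "Category:Poaceae": ("grass", "grasses"),
    "Category:Onychophorans": ("velvet worm", "velvet worms"),
}

def short_description(rank, category, extinct=False, monotypic=False):
    """Build a short description like 'Species of butterfly' or 'Family of grasses'."""
    singular, plural = COMMON_NAME[category]
    # A species is one organism kind; ranks above species describe groups
    name = singular if rank == "Species" else plural
    rank_word = "Single-species genus" if (monotypic and rank == "Genus") else rank
    desc = f"{rank_word} of {name}"
    # Prefix extinct taxa, lowercasing the rank word that follows
    return f"Extinct {desc[0].lower()}{desc[1:]}" if extinct else desc
```

So `short_description("Genus", "Category:Onychophorans", monotypic=True)` gives "Single-species genus of velvet worms", matching the deliberately simple phrasing described in the function details.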

You can see a sample of suggested edits from a variety of categories at User:MichaelMaggs/ShortDesc.


  Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 16:10, 22 January 2021 (UTC)

  Trial complete. The results look good, I think, although I did notice that in a few cases such as Phomatosphaeropsis and Pecoramyces the bot used "Genus of fungi" rather than "Single-species genus of fungi" which would have been better. So far I've not been adding "Single-species ..." to genus articles solely on the basis that the article is in a monotypic-specific category, as categorisations can very often be wrong. But in practice, monotypic categorisation seems to be done carefully, by specialists, and I suspect that using the name of the category will pick up a few more instances that can't be parsed from the lead: things like Wollemia, for example, where the fact that the genus is monotypic is well-hidden in the body of the article but can easily be seen from the category. I'll do that from now on. MichaelMaggs (talk) 15:35, 23 January 2021 (UTC)

BattyBot 53

Operator: GoingBatty (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 03:33, Monday, December 21, 2020 (UTC)

Function overview:

Automatic, Supervised, or Manual: automatic

Programming language(s): AutoWikiBrowser

Source code available: AWB

Links to relevant discussions (where appropriate): Category documentation

Edit period(s): Monthly

Estimated number of pages affected: Thousands initially, fewer each month

Namespace(s): Articles

Exclusion compliant (Yes/No): Yes

Function details: Use AWB's general fixes to remove the maintenance categories when the year and/or date of birth exists.

  • Load all articles from Category:Year of birth missing
  • Skip the article if the category is no longer on the article (e.g. someone else removed the category while the bot was running)
  • Skip the article if the category is not removed
  • Use a default edit summary that states "Removed Category:Year of birth missing and other general fixes"
  • Repeat for the other three categories
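The removal step above amounts to a guarded substitution. A minimal sketch (assuming, purely for illustration, that a [[Category:YYYY births]] category signals a known birth year — AWB's general fixes consult more signals than this):

```python
import re

# The maintenance category to remove, with or without the (living people) suffix
BIRTH_CAT = re.compile(r"\[\[Category:Year of birth missing(?: \(living people\))?[^\]]*\]\]\n?")
# Illustrative signal that a birth year is present, e.g. [[Category:1950 births]]
BIRTH_YEAR = re.compile(r"\[\[Category:\d{3,4} births")

def remove_stale_cat(wikitext):
    """Drop the 'Year of birth missing' category once a birth year is present.

    Returns (new_wikitext, changed). When nothing is removed the bot
    skips the page, matching the skip conditions listed above.
    """
    if BIRTH_YEAR.search(wikitext) and BIRTH_CAT.search(wikitext):
        return BIRTH_CAT.sub("", wikitext), True
    return wikitext, False  # skip: category absent or year still missing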


Question: the DOB in BLPs is one of the most inappropriately-added bits of content (from a sourcing perspective), and I don't know if having a bot automatically remove pages from these tracking categories is a good idea; while I don't necessarily know if people are using said cats to find pages where there might not be a sourced DOB, it seems like enough of a possibility that I would be concerned about having this task. What are your thoughts on this? Primefac (talk) 13:56, 21 December 2020 (UTC)

@Primefac: My understanding of these categories is that they should only be used if the YOB/DOB is missing, not if it is unsourced. If the YOB/DOB is unsourced, then it may be appropriate for a human to add {{citation needed}}. GoingBatty (talk) 17:45, 21 December 2020 (UTC)
I guess my point is more that the sequence goes 1) someone adds an unsourced DOB, 2) the bot removes the cat, 3) someone removes the invalid DOB, 4) the cat is re-added, 5) repeat steps 1-4. With no bot edit, only 1 and 3 are enacted (saving two edits to the page).
Don't get me wrong, I understand why the category exists, but if any rando chucking a DOB into an article causes the cat to disappear (assuming that no one notices the cat was removed and thus doesn't re-add it), it almost makes the category seem like a pointless addition. Primefac (talk) 17:54, 21 December 2020 (UTC)
@Primefac: I'm hoping the sequence would go like this:
  1. Someone adds a DOB
  2. Someone reviews the edit and removes the DOB if it's invalid (and the bot would skip the article)
  3. If the DOB is still there, the bot removes the category
  4. Reviewers notice the bot's edit summary and have another opportunity to review the DOB
  5. If needed, add {{citation needed}} or remove the DOB and readd the category
In case it would be helpful, here is a list of articles for people whose last name starts with "G" where the bot would remove the cat:
Extended content
  1. James Gaddas
  2. Olivier Gaillard
  3. Kenny Gajewski
  4. Haribol Gajurel
  5. Larisa Galadza
  6. Tommy Galán
  7. Silvia Gallego
  8. Ian Galvin
  9. Diana Gamage
  10. Jim Gamble
  11. Ganabatirau Veraman
  12. Claude de Ganay
  13. Latif Gandilov
  14. Marricke Kofi Gane
  15. India Gants
  16. Elizabeth Garber
  17. Adolfo García-Sastre
  18. Joel Garcia
  19. María Fernanda García
  20. Osmani García
  21. Kimberly Gardner
  22. Fernando Garibay
  23. Elsa M. Garmire
  24. Gary Garnett
  25. Diane Gaston
  26. Alan M. Gates
  27. Marie-Hélène Gaudreau
  28. Fabrice Gautrat
  29. Adam Gaynor
  30. Gazapizm
  31. Babatunde Gbadamosi
  32. Hilary Gbedemah
  33. David Geary
  34. Amber Gell
  35. Jamie Geller
  36. Stephen Geller
  37. Genesis Owusu
  38. Amika George
  39. Bill George (academic)
  40. Esther George
  41. Lisa George
  42. P. C. George
  43. Wilf George
  44. Barbara Gervin-Hawkins
  45. Sabine Getty
  46. Fazli Ghafoor
  47. Gholamreza Soleimani
  48. Luc Giard
  49. Mark Gibbon
  50. Nicole Gibbons
  51. Sadie Gibbs
  52. Hogan Gidley
  53. Zerbanoo Gifford
  54. Chris Payne Gilbert
  55. Mahdi Gilbert
  56. Faiza Gillani
  57. Hank Gilpin
  58. Saskia Giorgini
  59. Jean-Denis Girard
  60. Girl Ultra
  61. Peter Njuguna Gitau
  62. Lisa Glasberg
  63. Judith E. Glaser
  64. George Edward Glass
  65. Roland Glasser
  66. Jeff Glassman
  67. Avram Glazer
  68. Beverly Glenn-Copeland
  69. Roman Glick
  70. Cat Glover
  71. Corey Glover
  72. Juleanna Glover
  73. Andrew Goddard
  74. Pete Gofton
  75. Mike Goggin
  76. Joanne Goh
  77. Adam Goldman
  78. Taylor Goldsmith
  79. Solomon Goldstein-Rose
  80. Meathead Goldwyn
  81. Daniel Gollán
  82. Alfonso Gomez-Rejon
  83. Carlo Gonzales
  84. Arcadio González
  85. Eric Gonzalez (lawyer)
  86. Jessica González
  87. Jeff Goode
  88. Vikki Goodwin
  89. Lakshmi Gopalaswamy
  90. Priyal Gor
  91. Bert Gordon (rugby league)
  92. Jon Gordon
  93. Maria Goretti (actress)
  94. Nancy Goroff
  95. Jake Gosling
  96. Hattie Gossett
  97. Tia Gostelow
  98. KK Goswami
  99. Franziska Gottwald
  100. Andrew Gounardes
  101. Céline Gounder
  102. Kathleen Graber
  103. Sina Grace
  104. Gaetano Grado
  105. Phyllis E. Grann
  106. Angel Grant
  107. Brigitte Granville
  108. Farrah Gray
  109. Jenna Gray
  110. John Gray (Canadian author)
  111. Jonathan Grayer
  112. Joe Green (entrepreneur)
  113. Carin Greenberg
  114. Gerald Greene
  115. Keith Greiner
  116. Raymond Grew
  117. Narender Kumar Grewal
  118. Laura Griffin
  119. Katie Griffiths
  120. Ron Griggs
  121. Leo Groarke
  122. Louis Groarke
  123. Lisbeth Gronlund
  124. Nick Gross
  125. Laura Groves
  126. Jakub J. Grygiel
  127. Benedikt Guðmundsson
  128. Matteo Guerinoni
  129. Francisco Guerreiro
  130. Abhijit Guha (director)
  131. Sujit Guha
  132. Capitão Guimarães
  133. Luke Guldan
  134. Dolly Guleria
  135. Randeep Guleria
  136. Kartikeya Gummakonda
  137. Begench Gundogdyev
  138. Steven Gundry
  139. Abdul-Samed Muhamed Gunu
  140. Aparna Dutta Gupta
  141. Deepak Gupta (attorney)
  142. Prerna Gupta
  143. Sangam Lal Gupta
  144. Vandana Gupte
  145. Merceditas Gutierrez
  146. Simon Gutierrez
  147. Titus Gwaze
GoingBatty (talk) 19:48, 21 December 2020 (UTC)

{{BAG assistance needed}} Happy New Year! GoingBatty (talk) 19:45, 31 December 2020 (UTC)

  • Hmm. Personal 2c: I don't think this task is too problematic, and I'm also not sure it's too helpful.
    Not too problematic because: (a) these are no worse than other unsourced info, if the DOB is problematic; (b) if it's a problematic DOB, the person could've just removed the tracking cat in the same edit; (c) Category:Year of birth missing (living people) has 141,310 pages - safe to say nobody is using it; (d) we have Filter 712 for this.
    Not too helpful because: Category:Year of birth missing (living people) has 141,310 pages - safe to say nobody is using it. ProcrastinatingReader (talk) 20:16, 31 December 2020 (UTC)
    PR, would you say then that it might be worth pursuing the removal/deletion of this category? If so, and it fails, where would your thoughts on the appropriateness of this task fall? If not, is it still worth running? I find myself largely in the same camp (i.e. not really useful, not really harmful), be good to think long-term on it. Primefac (talk) 20:27, 31 December 2020 (UTC)
    I think CfD seems to be the way to go. This is kinda like Category:Pages using deprecated image syntax (CfD) -- too large and too minute to be useful. Unlike Category:Living people it has no real technical purpose either afaics. Last batch CfD seems to be 2007 - NC.
    Re bot: On the one hand, a tracking cat which is incorrect is useless. If it were populated by a template, updates would be automatic regardless of sourcing, so this should probably be treated the same way. OTOH, I'm not sure removing a few thousand articles changes anything; it's still useless. I guess the task falls closer to useful than useless, but it's indeed very much near the middle. It may be useful to get some thoughts from Wikipedia:WikiProject Biography -- if anyone is actually using this, they're probably there. ProcrastinatingReader (talk) 22:28, 31 December 2020 (UTC)
@ProcrastinatingReader: Per your suggestion, I added a note in Wikipedia talk:WikiProject Biography#Notification of bot request. Thanks! GoingBatty (talk) 23:29, 31 December 2020 (UTC)

{{BAG assistance needed}} I appreciate the feedback received so far. What's the appropriate next step to resolve this request? Thanks! GoingBatty (talk) 15:12, 10 January 2021 (UTC)

  Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. At some point it sounds like a few of us were going to do this anyway and dropped the ball. Doing so now before I forget again. Primefac (talk) 20:47, 10 January 2021 (UTC)
@Primefac:   Trial complete, with link to results. Thanks! GoingBatty (talk) 04:43, 12 January 2021 (UTC)
Six diffs into the review and this is a situation we were worried about: the bot adds a cat for an unsourced year, an editor removes the unsourced year but doesn't touch the cat, and we're left with an unsourced cat. Wondering if we need a companion task to remove a DoB cat (or at least flag it somehow) if the corresponding statement in the article is removed?   Another thought I had earlier but did not mention: the problematic DoB was added less than a month before the bot came through, so maybe have the bot not do anything until the DoB has been stable in the article for some period of time? OTOH, this adds complexity without really solving the problem. I'm not sure what to do here. — The Earwig talk 03:25, 16 January 2021 (UTC)
@The Earwig: As far as I know, AWB doesn't have functionality that allows bots to skip articles if they have been edited in some recent period of time. If that's a requirement, then I will have to withdraw this request. GoingBatty (talk) 03:40, 18 January 2021 (UTC)
Not a requirement per se, just trying to brainstorm. — The Earwig talk 05:36, 18 January 2021 (UTC)

Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.

Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.