Wikipedia:Bots/Requests for approval

BAG member instructions

If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming, it may be a good idea to ask someone else to run a bot for you rather than running your own.

New to bots on Wikipedia? Read these primers!
 Instructions for bot operators

Current requests for approval

AWMBot

Operator: BJackJS (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 18:21, Wednesday, October 21, 2020 (UTC)

Automatic, Supervised, or Manual: supervised

Programming language(s): Node.JS with MWN Library

Source code available: github

Function overview: Repair broken links that occur due to page moves. This would be done for broken links that occur in large volumes and cannot be fixed efficiently or promptly by individual editors.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Bot_to_fix_broken_peer_review_links was the primary reason for this bot, but it could be extended.

Edit period(s): Likely weekly or when a need arises to repair a group of links.

Estimated number of pages affected: 1000+

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: Scans categories identified as having broken links (such as). Once broken links are found, it locates redirects and fixes each link using the most recent redirect/move target. After all of the links have been fixed, it rescans and repeats the process for any links with a later redirect if needed.
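
As a rough illustration of the approach (in Python, although the bot itself is Node.js with the mwn library): for each page in a tracking category, retarget wikilinks that now point at redirects left behind by page moves. The category name, edit summary and link-rewriting details below are placeholders, not the bot's actual code.

import re
import pywikibot
from pywikibot.exceptions import Error as PywikibotError

site = pywikibot.Site('en', 'wikipedia')

def fix_links(page, summary='Repairing links broken by page moves'):
    text = page.text
    # Collect each distinct wikilink target (up to any "|" or "#").
    for match in set(re.findall(r'\[\[([^\[\]|#]+)', text)):
        try:
            target = pywikibot.Page(site, match)
            if target.exists() and target.isRedirectPage():
                # Follow the redirect left by the move and point the link there.
                new_title = target.getRedirectTarget().title()
                text = text.replace('[[' + match, '[[' + new_title)
        except PywikibotError:
            continue
    if text != page.text:
        page.text = text
        page.save(summary=summary)

# Example run over a tracking category (name is illustrative only):
# for article in pywikibot.Category(site, 'Category:Example broken-link tracking').articles():
#     fix_links(article)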

Discussion

  • This is a very useful bot. Relevant details:--Tom (LT) (talk) 20:59, 22 October 2020 (UTC)
  • This is a tricky issue and so a trial run of 10 articles or so is likely necessary to make sure any unanticipated issues are ironed out. Thanks again to BJackJS for proposing this bot.--Tom (LT) (talk) 20:59, 22 October 2020 (UTC)
  •   Note: This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT 21:52, 23 October 2020 (UTC)

Cewbot 6

Operator: Kanashimi (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 10:38, Saturday, October 10, 2020 (UTC)

Function overview: Fix broken anchors

Automatic, Supervised, or Manual: Automatic

Programming language(s): wikiapi

Source code available: 20201008.fix_anchor.js

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Fixing_broken_shortcuts_to_sections

Edit period(s): Continuous

Estimated number of pages affected: about 10/day

Namespace(s): Articles, and may expand

Exclusion compliant (Yes/No): Yes

Function details:

 
20201008.fix_anchor.js screenshot
  1. Listen for edits that modify a section title in ARTICLE.
  2. Check all pages linking to ARTICLE.
  3. If there are links using the old anchor, update them to the new one (a rough sketch of this step follows the list).
  4. If needed, the bot will search past revisions to find the previously renamed section title.
  5. The bot may leave a notice on the talk page for lost anchors.
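
A rough sketch of step 3 (in Python for illustration; the bot itself uses the wikiapi JavaScript library). The function and argument names are placeholders:

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def fix_anchors(article_title, old_section, new_section):
    """Update incoming [[Article#Old section]] links after a section rename."""
    article = pywikibot.Page(site, article_title)
    old_link = f'{article_title}#{old_section}'
    new_link = f'{article_title}#{new_section}'
    for referrer in article.backlinks(namespaces=[0]):
        text = referrer.text
        if old_link in text:
            referrer.text = text.replace(old_link, new_link)
            referrer.save(summary=f'Fix broken anchor: [[{old_link}]] -> [[{new_link}]]')

# fix_anchors('Example article', 'Old heading', 'New heading')  # hypothetical values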

Sample edits: 1, 2 --Kanashimi (talk) 10:38, 10 October 2020 (UTC)

Discussion

Looks like nice functionality. Code is quite interesting, although haven't looked closely enough for issues. I know sample edit 1 was one where the anchor already existed, but in many cases it wouldn't. It may be best to limit this to articles (incl redirects in mainspace), since talk/project-space has its own myriad of issues in this sense (with archiving and all), so it may at worst conflict and at best not solve many issues (since most will be busted archive links etc). Don't think there's any issues preventing a trial for articles, though. ProcrastinatingReader (talk) 09:29, 12 October 2020 (UTC)

Thanks for the ping on BOTREQ. I think this is really cool, but I'm not sure looking at the code how "Listen to edits modifying section title in ARTICLE." is supposed to work. If I delete a section header and add a different section header in the same edit, would the bot mistakenly detect that as renaming the section? Also, what if a section is renamed and the content in it is substantially altered (which probably means incoming links shouldn't be automatically fixed)? I guess some trials could give an idea of the extent of these problems. – SD0001 (talk) 10:12, 12 October 2020 (UTC)

Thank you for the comments. The bot will compare the section titles added and deleted in an edit. Only edits that modify a single section title, where the change is small enough to be regarded as the same section, will be treated as a rename. If the content is substantially altered but the section titles are not modified, the edit will not be counted. If the section titles are modified and the difference is too big to be regarded as the same section, the case will be recorded as a lost section title. --Kanashimi (talk) 20:55, 12 October 2020 (UTC)
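
For illustration, the kind of title-similarity test described in the reply above could look like the following; the 0.5 threshold is an invented placeholder, not the bot's actual cutoff.

from difflib import SequenceMatcher

def is_same_section(old_title, new_title, threshold=0.5):
    """Treat the edit as a rename only if the titles are similar enough."""
    return SequenceMatcher(None, old_title, new_title).ratio() >= threshold

print(is_same_section('Early life', 'Early life and education'))  # True: small extension
print(is_same_section('Early life', 'Discography'))               # False: unrelated title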

Dapperbot

Operator: WikiMacaroons (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 08:27, Thursday, August 13, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: Still in development (Will change this once I've done it)

Function overview: Automatically uses the Covid-19 API to update Coronavirus disease rates.

Links to relevant discussions (where appropriate):

Edit period(s): Perhaps twice daily?

Estimated number of pages affected: 2,760

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: Every day, perhaps twice a day, the bot sweeps all COVID-19 articles and updates the rates. It would perhaps use a dedicated COVID-19 REST API to fetch the data and automatically edit each article. There would perhaps be some templates laid out on each page so the bot could edit them.
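
A minimal sketch of such a sweep, under the proposal above; the API endpoint, response field and template title are placeholders rather than an agreed design.

import requests
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def update_case_template(country, template_title, api_url):
    data = requests.get(api_url, timeout=30).json()  # hypothetical endpoint
    cases = data['cases']                            # assumed response field
    page = pywikibot.Page(site, template_title)
    new_text = f'{cases:,}'
    if page.text != new_text:
        page.text = new_text
        page.save(summary=f'Updating COVID-19 case count for {country}')

# update_case_template('Russia', 'Template:Russia COVID-19 cases',
#                      'https://example.org/api/covid?country=Russia')  # placeholders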

Discussion

Would you make a few manual edits with your own account that illustrate the edit you would make with your bot account and link those diffs here? — xaosflux Talk 11:26, 13 August 2020 (UTC)

Extended content
@Xaosflux: For example, this diff shows the updating of the cases count in the UK. On any pages related to Covid-19 in the UK, that template will be there and show recent data. WikiMacaroonsCinnamon? 11:59, 13 August 2020 (UTC)
@WikiMacaroons: I'm looking for an actual live update to something in an article - not an isolated sandbox - can you do one of those? — xaosflux Talk 14:45, 13 August 2020 (UTC)
@Xaosflux: Ah I see, sorry. So the rates of Covid-19 on each page listing them will instead be a template. For example, under Russia, instead of saying "15,000" etc, it will have "{{Russia Covid-19 Cases}}". The bot will do a sweep every few hours of the Covid cases and update each template, thus updating them on the page. Unlike in this diff, where I give an example of the process, it will not be a userspace template, I just used that for demonstration purposes. WikiMacaroonsCinnamon? 19:04, 13 August 2020 (UTC)
@WikiMacaroons: so every time you update a fact, I'm assuming you would update the reliable reference as well? Having a template to just store one number, then using that in an article doesn't sound like a good idea - as it makes the article harder for others to edit. If this was maybe only being used in an info box, using wikidata and having the bot update wikidata might be better. — xaosflux Talk 19:44, 13 August 2020 (UTC)
@Xaosflux: So the bot could potentially edit the structured data on Wikidata, or simply update the tables without the templates, as in this diff. Which would be your preference? WikiMacaroonsCinnamon? 20:08, 13 August 2020 (UTC)
@WikiMacaroons: so in that edit you made a change to a value, but that value was supported by a reference, {{cite web |title=Оперативные данные. По состоянию на 23 апреля 10:35 |url=https://xn--80aesfpebagmfblc0a.xn--p1ai/ |website=Стопкоронавирус.рф |accessdate=11 August 2020 |language=ru}}. Are you now supporting your edit with a new reliable source, did that source change, etc? Think of this the same as if you just made that edit - where is your source, how are you showing readers the current source? — xaosflux Talk 01:09, 14 August 2020 (UTC)
@Xaosflux: That's a good point. I notice that particular source is actually a live dynamic one, meaning that the bot could constantly webscrape from that. However, I'm not sure if this applies to all of the references in the table. WikiMacaroonsCinnamon? 07:49, 14 August 2020 (UTC)
@WikiMacaroons: so basically if you are going to make statistics edits, it is expected that you will also include current reference information -- your plan to just use a template with a bare number in it doesn't work for this - not to mention that using such templates in articles is generally a bad idea. Absent a larger discussion where you have gained a broad consensus to make these edits in this manner (please provide a link if this happened already), I'm going to decline your bot request. If things change, you can always open a new request. Your idea may be useful for wikidata, but we don't control bots on wikidata here on Wikipedia (see wikidata:Wikidata:Requests for permissions/Bot for their process). Using a bot to maintain a statistical value there could be useful, though I expect they would also be looking to have you update the reference link, and then it comes down to our editors whether we want to load wikidata information into encyclopedia articles or not. Will hold this open for at least a day in case there is something I'm overlooking, but like I said if things change you can always just open a new BRFA. — xaosflux Talk 15:06, 14 August 2020 (UTC)
@Xaosflux: You didn't comment on my previous message: I had a thought that if all the references were dynamic, like the Russian one, then it could webscrape data from those and the refs would consistently be up-to-date. WikiMacaroonsCinnamon? 15:15, 14 August 2020 (UTC)
I'm trying to think of a clearer way to say what I mean... WikiMacaroonsCinnamon? 15:19, 14 August 2020 (UTC)
Ok, here's a better explanation. When the bot enters the page, it will go down the list of countries. Assume here that, like the Russian source, each ref is a dynamic site that makes live updates of its COVID values. The bot will webscrape the reference and put the new value in the table. Is that a little more clear? WikiMacaroonsCinnamon? 15:41, 14 August 2020 (UTC)
@WikiMacaroons: so in your example edit here - is the source of your data still the same ref that is shown (Стопкоронавирус.рф) ? Or are you using a new source? And certainly the "accessdate=11 August 2020" value is no longer valid, since you have a new access date - but you didn't update the reference. Anytime you change a statistic that is supported by a reference, the reference listed should be what is currently supporting the statistic. Also in that article, the entire table is labeled as "Summary table of confirmed cases in Europe (as of 11 August 2020)" - but you didn't advance the date there either. — xaosflux Talk 16:16, 14 August 2020 (UTC)
@Xaosflux: So, in theory, the bot would update all of that. It would update the access date of the reference, and the "Summary table of confirmed cases in Europe (as of DATE)". The data that I put in that example edit was just an arbitrary placeholder, the bot will webscrape from the actual reference, meaning that if the reference is dynamic, it doesn't need to be changed. WikiMacaroonsCinnamon? 16:22, 14 August 2020 (UTC)
Also, in answer to "is the source of your data still the same ref that is shown", yes, as the bot will gather the dynamic data from that site. Is that unclear? I want to make sure I'm explaining myself correctly. WikiMacaroonsCinnamon? 16:24, 14 August 2020 (UTC)
@WikiMacaroons: OK, let's try this again: manually, under your own account - make one edit that you think is good, that would be the same as what you are proposing your bot would do for you. Use an edit summary that is appropriate to the edit (don't mention the bot in your own edit's edit summary at this time - make the edit because it is something you think actually needs to be edited), use the minor flag if appropriate. Place the single diff here. — xaosflux Talk 16:28, 14 August 2020 (UTC)
@Xaosflux: this diff shows the general kind of edit the bot would be making. WikiMacaroonsCinnamon? 16:41, 14 August 2020 (UTC)
Keep in mind that the references don't need to have their urls changed, as they show live updates. WikiMacaroonsCinnamon? 17:08, 14 August 2020 (UTC)
OK, @WikiMacaroons: that looks better, you listed that you want to run this on 2000+ pages, do you have the list of the pages somewhere (is it dynamic like category based?). Is the format of the data you will be changing on all of these pages the same today? What will your bot do if the format changes on a page? — xaosflux Talk 17:53, 14 August 2020 (UTC)
@Xaosflux: Ah, well I got that rough digit from getting the amount of articles in Category:COVID-19. I expect that not all of them contain rate counts, is there a category for this? Also, if the format changes I'll try to get the bot to find that, and notify me so I can rework the code to fit the new format... there may be a better way. WikiMacaroonsCinnamon? 19:11, 14 August 2020 (UTC)
@Xaosflux: For a trial, do you reckon I should just focus it on the main continent Covid-related pages? WikiMacaroonsCinnamon? 12:45, 17 August 2020 (UTC)
@WikiMacaroons: the trial will need to cover at least all of the different kinds of things that would get approved - it doesn't look like these are using a standard template - just wikitables - have you reviewed how many different things you will need to code for? — xaosflux Talk 13:32, 17 August 2020 (UTC)
@Xaosflux: I could potentially make the wikitables their own templates to edit, and leave a comment saying where to go to edit them WikiMacaroonsCinnamon? 15:44, 17 August 2020 (UTC)
@WikiMacaroons: those sort of design and content issues are something that you really need to work out before programming, this bot approval request is premature right now, you need to have a solid repeatable plan to have the bot component of it evaluated. The plan needs to be something that has existing support for the types of edits that will be made, or you will need to establish the consensus for the edits you will be making just as if you will be making them yourself (for example if you plan on introducing a new template to replace tables to thousands of articles). — xaosflux Talk 16:18, 17 August 2020 (UTC)
@Xaosflux: I'm so sorry for being vague about the plan, I've been a bit busy but I'm free now. I've partly programmed it, here's the plan:

There will be templates of wikitables for each Covid page about a continent, minus Antarctica:

The bot will sweep templates of those continents, so the pages themselves will be dynamically updated. There will be a <!--Hidden Comment--> that tells you where to find the template to edit if you're a newcomer. It will also go into the countries within those and have templates for each graph of rates. Again, thank you, you're a saint for putting up with me. WikiMacaroonsCinnamon? 18:07, 17 August 2020 (UTC)

@WikiMacaroons: our goal in BRFA is not just technical, but also to make sure there won't be editor interaction problems. For your idea above, this is getting much clearer - and the scope of the bot now sounds like it will be "update a small list of templates" which sounds very manageable. I'd like to see the templates created and in use first - so either you or others working on these pages updates should do that (and make sure you don't get push back on something like "we shouldn't use a template here"). You may want to discuss this with Wikipedia:WikiProject COVID-19 first. There is no time limit on these BRFA's as long as there is momentum. — xaosflux Talk 18:41, 17 August 2020 (UTC)
@Xaosflux: "update a small list of templates" - I like the sound of that! I've left a message at WP COVID. WikiMacaroonsCinnamon? 09:25, 18 August 2020 (UTC)
Right, @Xaosflux: Here's where we're at. I've spent a while programming the webscrapers for the bot, but have found a fatal flaw that means that I'll have to put aside next few days reworking the code. Hope to have the result with you soon. WikiMacaroonsCinnamon? 19:59, 9 September 2020 (UTC)
Phew, @Xaosflux: That took longer than expected. The two templates that are currently available for listing covid rates are Europe and Asia.
I hooked up the value-changing script that the bot would run on to my account and it created these diffs:
Would you rather we set Europe and Asia running now while I figure out the other 4 continents (not doing Antarctica, obviously), or program all the continents before approving the bot? Thanks, WikiMacaroonsCinnamon? 16:31, 2 October 2020 (UTC)
P.S.: I realise that the numbers don't have thousands-separator commas (e.g. 1000000, not 1,000,000). I've just put that in the code, so it should be ok next time WikiMacaroonsCinnamon? 19:18, 2 October 2020 (UTC)
@Xaosflux: Sorry to ping again, I don't know if the previous ping went through. WikiMacaroonsCinnamon? 17:03, 12 October 2020 (UTC)
  A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag by replacing it with {{tl|BAG assistance needed}}. can't look at this right now - but queuing it for attention. — xaosflux Talk 17:40, 12 October 2020 (UTC)

Bots in a trial period

ProcBot 3

Operator: ProcrastinatingReader (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 13:52, Wednesday, July 22, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Ruby

Source code available:

Function overview: Creates a snapshot of the main page twice daily.

Links to relevant discussions (where appropriate): Talk:Main_Page/Archive_199#Main_Page_history, Wikipedia:Bots/Requests for approval/Amalthea (bot)

Edit period(s): Two runs per day

Estimated number of pages affected: 2 per day

Exclusion compliant (Yes/No): N/A - bot only creates pages

Already has a bot flag (Yes/No): No

Function details: Replacement for this bot while that bot remains down. Happy to return this task to that bot should the operator return.

Does almost exactly the same as that BRFA: creates a page at Wikipedia:Main Page history/YYYY MMMM DD(a/b). The main difference is that it snapshots twice a day, with an (a/b) suffix depending on whether the snapshot is taken in the morning or the afternoon (to account for both DYKs). Sample edit in my sandbox: Special:Permalink/968948661
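
For illustration, the title logic might look roughly like this (a Python sketch, though the bot itself is written in Ruby), following the naming settled in the discussion below: no suffix for the morning run, "b" for the evening run.

from datetime import datetime, timezone

def history_page_title(now=None, evening=False):
    """E.g. 'Wikipedia:Main Page history/2020 July 22' (morning run)
    or 'Wikipedia:Main Page history/2020 July 22b' (evening run)."""
    now = now or datetime.now(timezone.utc)
    suffix = 'b' if evening else ''
    return f"Wikipedia:Main Page history/{now.year} {now.strftime('%B')} {now.day}{suffix}"

# The snapshot would then be saved to that title in create-only mode, so an
# existing page (e.g. one already made by Amalthea (bot)) is never overwritten.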

Discussion

For the record, the change to archiving twice a day was discussed at Talk:Main_Page/Archive_194#Propose_archiving_Main_Page_history_twice_a_day_when_DYK_is_on_12-hour_schedule.

ProcrastinatingReader, editors (eg. PrimeHunter) preferred that the first archive of the day should not have an a in the title. --- C&C (Coffeeandcrumbs) 14:03, 22 July 2020 (UTC)

c.f. Special:PermaLink/968949473#Main_Page_history. — xaosflux Talk 14:05, 22 July 2020 (UTC)
Coffeeandcrumbs, thanks, updated. To double check, Wikipedia:Main Page history/2020 July 22 for the morning, Wikipedia:Main Page history/2020 July 22b for the night? ProcrastinatingReader (talk) 14:17, 22 July 2020 (UTC)
Correct. --- C&C (Coffeeandcrumbs) 14:18, 22 July 2020 (UTC)
Pinging in @Amalthea: as we want to ensure the bots aren't colliding with each other. — xaosflux Talk 14:08, 22 July 2020 (UTC)
@Xaosflux:, since it's 6 days since your message, and the operator isn't currently active, what's the plan if there's no response? Can this still go forward? If so, are there any additional considerations I should make regarding collisions? This will be creating pages slightly after that one, and works in createonly, so even if that bot starts working again my thoughts are mine would just do nothing (for the morning run, afternoon run would add as normal). ProcrastinatingReader (talk) 11:53, 28 July 2020 (UTC)
  Approved for trial (25 edits or 7 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. (Project space only). — xaosflux Talk 12:45, 28 July 2020 (UTC)
@ProcrastinatingReader: please report back here after trialing. — xaosflux Talk 12:46, 28 July 2020 (UTC)
@Xaosflux:, thanks! Looks like Amalthea (bot) is now back online. Operator's response to your message: Special:Diff/969947853. Whilst this task does do a second evening snapshot which that bot doesn't, it seems rather silly to have one bot for the morning and one for the afternoon, so in light of that I think it's best to withdraw this BRFA in favour of Coffeeandcrumbs (or any other interested parties) requesting Amalthea to consider adding the afternoon functionality to her task. ProcrastinatingReader (talk) 17:39, 28 July 2020 (UTC)
Xaosflux and ProcrastinatingReader, I have reopened this ticket after discussing here. I am sorry if this is not the proper process. Feel free to undo. --- C&C (Coffeeandcrumbs) 15:53, 22 October 2020 (UTC)
  Approved for extended trial (100 edits or 15 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. — xaosflux Talk 00:23, 23 October 2020 (UTC)
@ProcrastinatingReader: trial reactivated. — xaosflux Talk 00:25, 23 October 2020 (UTC)

SDZeroBot 8

Operator: SD0001 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 13:29, Tuesday, October 6, 2020 (UTC)

Automatic, Supervised, or Manual: supervised

Programming language(s): Node.js or AWB

Source code available: Coming soon

Function overview: Add {{set category}} to applicable categories

Links to relevant discussions (where appropriate):

Edit period(s): One time runs

Estimated number of pages affected: 1000s

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: Add {{set category}} template to categories that are WP:SETCATs. The template has 32,500 transclusions so its usage is well-established.

It is not possible for a bot to tell apart set categories from others. I would be manually telling the bot what categories to edit. For example, all pages in Category:Television programs by director and Category:Television series by creator should be tagged.
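
A minimal sketch of the manual-feed approach described above; the parent categories are taken from the examples given, and the check for an existing tag is deliberately naive.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')
PARENTS = ['Category:Television programs by director',
           'Category:Television series by creator']

for parent in PARENTS:
    # The members of these container categories are themselves set categories.
    for cat in pywikibot.Category(site, parent).subcategories():
        if '{{set category' not in cat.text.lower():
            cat.text = '{{Set category}}\n' + cat.text
            cat.save(summary='Tagging [[Template:Set category|set category]]')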

This came out of a discussion at User_talk:SDZeroBot/Category_cycles#It's_hopeless.

Discussion

cc RoySmith. – SD0001 (talk) 13:32, 6 October 2020 (UTC)

  • If you're specifying precisely which pages/categories this will run on, I feel like this is assisted editing and may not need a BRFA - even if you're using some Node.js code and not AWB (although I feel like this task would be suitable for AWB). I also think you'll still need consensus for this somewhere, though. If we can't think of a more specific venue, WP:VPPR will work. Enterprisey (talk!) 03:43, 7 October 2020 (UTC)
    • Hi. Not very familiar with AWB, will give it a whirl. Can you add AWB access for the bot account? Since the edits will be made from the bot account (as I don't want to litter the contribs of my own account), I think BRFA is needed, right? Regarding consensus, I left a comment at WT:WikiProject Categories#Tagging set categories. – SD0001 (talk) 12:17, 7 October 2020 (UTC)
      I have given your bot account an AWB flag (pinging Swarm to see if I did it correctly). For your other point, a dedicated AWB account (or any other declared alternative account) will only need a BRFA if you're doing (fully) automated editing; assisted editing doesn't need a BRFA. And thanks for opening the discussion; if nobody comments there I assume it'll be okay. Enterprisey (talk!) 09:03, 12 October 2020 (UTC)
      Wouldn't it be best practice to do these edits on main account (or an AWB account), rather than on the bot account? Just seems less confusing to limit bot edits to BRFA approved / exempt ones, rather than assisted editing. Not forbidden, of course, but I think it's a general good practice? ProcrastinatingReader (talk) 09:33, 12 October 2020 (UTC)
@Enterprisey: You did it right. Good job alphabetizing correctly. @SD0001: It is indeed common for users to use a secondary account for AWB usage to avoid "cluttering" their contributions. As Enterprisey says, AWB use is not treated as "bot-like editing", it's treated as manual editing because it requires that you personally screen and approve every edit made with it, although the general community position on this is that it's unacceptable to use AWB to make potentially-controversial, sweeping changes without a prior consensus. Strangely, WP:BOT does not include any sort of caveat for allowing manual editing with a bot account without BAG approval. Is this sort of thing common, Enterprisey? I agree with ProcrastinatingReader that it seems like a bit of an unnecessarily complex, esoteric situation. ~Swarm~ {sting} 03:14, 13 October 2020 (UTC)
An AWB account sounds like a fine solution to me. SD0001, thoughts? After the WT:WikiProject Categories discussion goes a week without anyone replying (and there's consensus among the replies, if any), I figure you're good to go for this task; just let me know the name of the account (or use the AWB checkpage). Thank you to both Swarm and ProcrastinatingReader for weighing in. Enterprisey (talk!) 00:22, 15 October 2020 (UTC)
  • For what it's worth, I think "supervised" is being taken very literally by SD0001; I would consider this a fully-automated task, even if they just feed the bot the pages to change; this could happen anywhere (not just AWB) and thus I would consider this a great task for a bot; correct me if I'm wrong, SD0001, but your intention isn't to sit there and hit "Save" thousands of times, correct? Primefac (talk) 16:41, 16 October 2020 (UTC)
    Ah yes, Primefac and Enterprisey. I may have been a bit unclear in the original request – most of the time, pages won't need to be looked at individually. So using the human mode of AWB definitely would be a banal waste of time. My understanding of the process/policy is that AWB alt accounts (use of which was suggested above) are not given access to the bot mode of AWB, right? If so, a bot approval would be good to have. – SD0001 (talk) 13:59, 17 October 2020 (UTC)
    SD0001, would it be easier to run on Node.js as you initially proposed? Go with whatever is easier for you. Primefac (talk) 17:30, 17 October 2020 (UTC)
      Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. I don't really care what you run it with, though if you do need the bot's name put on the AWB list just post here and someone will take care of it. Primefac (talk) 17:32, 17 October 2020 (UTC)

MilHistBot 8

Operator: Hawkeye7 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 05:20, Tuesday, October 13, 2020 (UTC)

Function overview: Go through the MilHist stub-class articles and reclassify the ones that are no longer stubs

Automatic, Supervised, or Manual: Automatic

Programming language(s): C#

Source code available: User:MilHistBot/AutoStub.cs

Links to relevant discussions (where appropriate): Wikipedia talk:WikiProject Military history/Coordinators#A-Class Reviewing Drive?

Edit period(s): daily

Estimated number of pages affected: ~ 50,000 articles to be checked. Expect up to 80% will be upgraded to start class or better.

Namespace(s): Talk

Exclusion compliant (Yes/No): Yes

Function details: The bot will loop through articles in Category:Stub-Class military history articles and re-assess each one using autoCheck - the same rules as the AutoCheck run, which assesses based on our B1 to B5 criteria, with assistance from ORES. Each daily run will process 1,000 entries. Edit summaries are tagged with "AutoStub" in case they have to be rolled back or rechecked. A checkpoint file is created by each run. When a run starts, it checks the checkpoint file and resumes from there. When it reaches the end, it deletes the checkpoint file, which will cause the next run to start again from the top. Articles reassessed as B-class are flagged for human checking.
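
A sketch of the checkpoint/resume behaviour described above (in Python for illustration; the bot itself is C#). The file name is an assumption, and the actual reassessment logic is stubbed out.

from pathlib import Path
import pywikibot

CHECKPOINT = Path('autostub.checkpoint')  # assumed local file name
BATCH = 1000                              # "Each daily run will process 1,000 entries"

site = pywikibot.Site('en', 'wikipedia')
cat = pywikibot.Category(site, 'Category:Stub-Class military history articles')

last_done = CHECKPOINT.read_text().strip() if CHECKPOINT.exists() else None
skipping = last_done is not None
processed = 0
reached_end = True

for talk_page in cat.members(namespaces=[1]):  # the banner lives on Talk pages
    title = talk_page.title()
    if skipping:
        if title == last_done:
            skipping = False
        continue
    # reassess(talk_page)  # autoCheck: B1-B5 checks plus ORES, per the description above
    processed += 1
    if processed >= BATCH:
        CHECKPOINT.write_text(title)  # resume from here tomorrow
        reached_end = False
        break

if reached_end and CHECKPOINT.exists():
    CHECKPOINT.unlink()  # next run starts again from the top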

Discussion

  •   Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 16:20, 16 October 2020 (UTC)

ProcBot 2.5

Operator: ProcrastinatingReader (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 20:26, Thursday, September 24, 2020 (UTC)

Function overview: Extension of task 2 to allow for general maintenance of date params in television-related infoboxes

Automatic, Supervised, or Manual: Automatic

Programming language(s): Ruby

Source code available:

Links to relevant discussions (where appropriate): User_talk:ProcrastinatingReader#Initial_run_complete

Edit period(s): Continuous

Estimated number of pages affected: 5000 initial run

Exclusion compliant (Yes/No): Yes

Function details: Task 2 was approved to convert text dates to date templates, tracking Category:Pages using infobox television with nonstandard dates. The initial run of task 2 has been completed, but 5000 pages remain in the category. These have various issues, ranging from the wrong template being used (e.g. {{start date}} for |end_date=), to wikilinks in dates, invalid values, start and end dates both in start_date, an uppercase "Present", and likely other formatting issues we haven't considered yet (since there are 5000 pages currently, it is hard to know all of what's wrong with them). Thus, requesting an open-ended maintenance approval (as described in the function overview). The aim is to fix the obvious, repeated errors, so the category numbers are more manageable for human review of the remaining contextual cases.
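
Two of the repetitive fixes described above, expressed as Python regexes for illustration (the bot itself is written in Ruby); the patterns and the |last_aired= parameter are illustrative examples, not the bot's actual rules.

import re

def fix_infobox_dates(wikitext):
    # De-link the year in dates such as |first_aired={{Start date|[[2005]]|9|13}}
    wikitext = re.sub(r'\{\{Start date\|\[\[(\d{4})\]\]', r'{{Start date|\1',
                      wikitext, flags=re.IGNORECASE)
    # Normalise an uppercase "Present" in |last_aired=
    wikitext = re.sub(r'(\|\s*last_aired\s*=\s*)Present', r'\1present', wikitext)
    return wikitext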

Discussion

  Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 14:55, 25 September 2020 (UTC)

YoutubeSubscriberBot

Operator: EncodedRainbow (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 13:04, Wednesday, September 2, 2020 (UTC)

Function overview: This bot will periodically and automatically update YouTubers' pages with accurate subscriber information.

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: https://github.com/TisTiller/WikipediaYoutubeSubscriptionBot

Links to relevant discussions (where appropriate): Non-controversial, maintenance edit.

Edit period(s): Daily or Weekly

Estimated number of pages affected: ~200 (current pages: List of YouTubers)

Namespace(s): Articles

Exclusion compliant (Yes/No): Yes

Function details: One (or two) purpose(s).

1: Periodically update the subscriber counter for each YouTuber's channel(s). The information is requested directly from Google using an API. Once per (day/week), the subscriber counter will be updated by sending a request to Google's YouTube Data API v3 containing the channel name, determined manually *or* from the value(s) in the infobox. If data is retrieved, the figure is updated to fit, and a date of update is included.

2: Same as above but for views.
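
A sketch of the API call described in point 1, assuming an API key is available in a hypothetical YT_API_KEY environment variable and that the channel ID is already known.

import os
import requests

def channel_stats(channel_id):
    """Return (subscriber count, view count) from the YouTube Data API v3."""
    resp = requests.get(
        'https://www.googleapis.com/youtube/v3/channels',
        params={'part': 'statistics', 'id': channel_id, 'key': os.environ['YT_API_KEY']},
        timeout=30,
    )
    stats = resp.json()['items'][0]['statistics']
    return int(stats['subscriberCount']), int(stats['viewCount'])

# subs, views = channel_stats('UC...')  # channel ID deliberately left elided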

Discussion

Will the bot make an edit if the new figure is too close to the existing one (that is, will it "update" a figure of 2,000,000 to 2,001,000)? A daily update on any article will clutter the page history and would certainly be undesirable. IMO the bot shouldn't edit the same article more than once a month, and then only if the change in the number is significant. Also, has this task been discussed somewhere? – SD0001 (talk) 16:04, 3 September 2020 (UTC)

I randomly checked about a half-dozen YouTubers and the subscription counts tended to be either 2 or 3 significant figures, so you wouldn't get an update of 2 million until it hit 2.1 million (though obviously the bot needs to be able to deal with this). I concur that daily is unnecessary and that monthly might be better. Primefac (talk) 17:05, 3 September 2020 (UTC)
I agree with both of you. Daily/weekly is unnecessary. The API only displays 2 (or 3) significant figures anyway. I actually didn't think to discuss it anywhere since the Wikipedia:Bots/Requests_for_approval page says that if the bot could be controversial, it should be discussed. But it makes sense to talk to people about it beforehand; I was ignoring procedure. I'll do it at Wikipedia:WikiProject YouTube. Also, if the bot does not notice a difference in value, it will not submit an edit. EncodedRainbow (talk) 17:02, 8 September 2020 (AEST)

Suggest looking for the transclusions of {{Infobox YouTube personality}}, rather than the items in the list article. ProcrastinatingReader (talk) 02:11, 5 September 2020 (UTC)

  •   Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. If possible, please also keep track of how many pages it skips for having "the same" number. Primefac (talk) 16:28, 15 September 2020 (UTC)
  •   Comment: Hey ER, instead of editing locally, why not update the YouTube channel ID property of the associated Wikidata item and contribute globally? ---- Nokib Sarkar Poke 11:00, 28 September 2020 (UTC)

DomdomeggBot

Operator: Domdomegg (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:17, Tuesday, August 11, 2020 (UTC)

Function overview: Where possible, automatically update data (starting with the UK) for Template:COVID-19_pandemic_data

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: https://github.com/domdomegg/uk-covid-stats-wikipedia-updater

Links to relevant discussions (where appropriate): No explicit discussion about this bot, but on Village Pump automating this has been met with positive reception and the approved Wugbot 4 does a similar thing for the maps.

Edit period(s): Daily

Estimated number of pages affected: 1

Namespace(s): Template

Exclusion compliant (Yes/No): No, only runs on one page with many safeguards

Function details:

Daily, the bot will call the official UK government COVID data API for cumulative case and death numbers, and update the page. This will happen soon after the data is published at 4pm BST (3pm UTC). This source is https://coronavirus.data.gov.uk/, the one recommended and being used for manual updates now. The API data is licensed under the Open Government License v3.0, so fine to use on Wikipedia with a reference (although it's a fact so not copyrightable under UK law anyway).

It's a fairly simple Python script which has clear monitoring and can easily be shut off. The only Wikipedia API interactions it should have are: get login token, log in, get current page, get edit token, make edit. At several points during the process it will bail if something is unexpected, e.g. the data from the government API has an unexpected structure, it cannot definitively find the location of the UK statistics on the page, or the numbers already on the page are higher than the new figures (should not be possible with cumulative cases) or significantly lower (should not jump >10%/day), etc.
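
A minimal sketch of those sanity checks; the 10% figure comes from the description above, and the function name and sample numbers are illustrative only.

def safe_to_update(current_on_page, new_from_api):
    """Return True only if the new cumulative figure passes the safeguards."""
    if new_from_api < current_on_page:
        return False   # cumulative counts should never decrease
    if new_from_api > current_on_page * 1.10:
        return False   # should not jump more than ~10% in a day
    return True

assert safe_to_update(1_000_000, 1_050_000) is True
assert safe_to_update(1_000_000, 990_000) is False
assert safe_to_update(1_000_000, 1_200_000) is False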

A bit about me - I've been an editor on-and-off since 2013, and have 3050 live edits on Wikimedia Commons. I've contributed to various open-source Wikimedia projects such as the Wikimedia Commons Android App, and the source code of this bot is open too (will sort out a license soon). I won't add any extra functionality without submitting another BRFA (unless the procedure is to do something different - let me know :)).

Thanks for considering this bot! Apologies if I have made any mistakes in submitting this bot proposal - it's my first bot on Wikipedia :)

Discussion

Oh, and what I think I'd initially be looking for would be a 7 day supervised trial of this bot - stopping if it runs into any unexpected issues. Also, I've done a little testing in userspace. Thanks again :) Domdomegg (talk) 23:34, 11 August 2020 (UTC)

Will this bot conflict with Dapperbot? Primefac (talk) 20:37, 20 August 2020 (UTC)
Hi Primefac, thanks for replying :) No, it will not conflict with Dapperbot. It will only update the base article exactly as a human would. It has several safeguards in place to avoid editing the figures if it is at all uncertain about the true statistic (e.g. it'll never decrease the cumulative count). Domdomegg (talk) 22:12, 28 August 2020 (UTC)
  Approved for trial (7 edits or 7 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. I suppose technically they're the same thing, but wanted to specify. Primefac (talk) 19:06, 1 September 2020 (UTC)

Yapperbot 3

Operator: Naypta (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 14:13, Saturday, June 20, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Golang

Source code available: https://github.com/mashedkeyboard/yapperbot-scantag

Function overview: Scans every article on Wikipedia, and checks for configured patterns. When it finds a pattern, it tags the article with an appropriate maintenance tag.

Links to relevant discussions (where appropriate): Wikipedia:Bot requests#Populate tracking category for CS1|2 cite templates missing "}}" (now archived here)

Edit period(s): Continuous

Estimated number of pages affected: Hard to say; as it's configurable, potentially the entire corpus of Wikipedia articles

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: GreenC was looking for a bot that would populate a tracking category for templates that are unclosed and contained within a <ref> tag. I started out making this, but then realised that, rather than having people make a mishmash of bots that did regex scanning of all articles with different regexes, it would be a better idea to have one bot parsing the pages and doing the regex scanning for all the regexes people wanted to match over pages. So, that's what I've made.

Scantag, the name I've given to this task, is a dynamic system for creating rules for pattern matching articles that need maintenance using regular expressions, and then maintenance tagging them according to the associated matches. At present, it is only configured for the specific request that GreenC has made; however, its configuration (as is becoming a recurring theme with my bots!) is entirely on-wiki, so it can be reconfigured on-the-fly. You can see its configuration file here. This is in a user JSON file, so it is only editable by administrators and myself through the bot account; I think this should be a sufficient threshold to prevent abuse, but was considering getting the content model changed to JS to make it interface protected, instead, due to the potential for danger inherent in the task. Thoughts on this would be appreciated.
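
For illustration, a single rule entry in that file might look roughly like the following, expressed here as a Python dict rather than the on-wiki JSON; the pattern, tag name and wording are invented placeholders rather than the actual CS1|2 rule.

import re

example_rule = {
    "regex": r"<ref[^>]*>[^<]*\{\{[Cc]ite \w+[^}]*</ref>",   # pattern to search for
    "noTagIf": r"\{\{[Ee]xample maintenance tag",            # skip if already tagged
    "prefix": "{{Example maintenance tag|date=October 2020}}\n",
    "suffix": None,
    "detected": "an unclosed citation template inside a <ref> tag",
}

# The bot compiles both regexes up front, as in step 1 of the task description.
compiled = {k: re.compile(v) for k, v in example_rule.items() if k in ("regex", "noTagIf")}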

Whilst the edit filter also matches regexes against changes, it is designed only to be used for preventing serious issues that actively harm the wiki, and there's a limit to the number of rules that it can have - after all, a user is waiting. Scantag, on the other hand, is a deliberately slow process - it runs with a short maxlag, a high number of retries for maxlag, and after every edit it waits a full ten seconds before continuing. This brings with it the advantage that, while it may be a slow process, it can be used for a lot more than the edit filter would ever be. Because it's looking over every single article, it can also be useful for finding and tagging articles that would be impossible to run through a standard regex Elasticsearch, because it would simply time out. Case in point, the maintenance tagging that we're talking about here - but potentially, the same system could be useful for a number of other applications that involve matching patterns in articles.

The task works as follows:

  1. The bot examines the User:Yapperbot/Scantag.json file, reads the rules, and compiles the regexes.
  2. The bot then iterates through the latest database dump's list of page titles in NS0.
  3. For every title in NS0, the bot retrieves the wikitext.
  4. The bot matches each regex (currently only the one) specified in Scantag.json against the wikitext of the article.

If there is no match, the bot skips to the next article. If the bot matches the regex, however, it performs the following steps (a condensed code sketch follows the list):

  1. Check the "noTagIf" regex specified alongside the main regex. This is a rule designed to check for cases where the article has already been tagged with the correct maintenance tag.
  2. Prefix the article with the corresponding "prefix" property in the JSON file, if there is one.
  3. Suffix the article with the corresponding "suffix" property in the JSON file, if there is one.
  4. Edit the page, with an edit summary linking to the task page, and listing the "detected" parameter as the reason.
  5. Wait ten seconds before moving on. This is a safety mechanism to prevent a situation where a badly-written regex causes the bot to go completely haywire, editing every single article it comes across.
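
A condensed sketch of those steps (in Python for illustration; the bot itself is written in Go), using a rule dict like the one sketched earlier; page fetching and saving are shown with pywikibot purely for readability.

import time
import pywikibot

def apply_rule(page, rule, compiled):
    text = page.text
    if not compiled['regex'].search(text):
        return False
    if rule.get('noTagIf') and compiled['noTagIf'].search(text):
        return False   # already carries the maintenance tag
    new_text = (rule.get('prefix') or '') + text + (rule.get('suffix') or '')
    page.text = new_text
    page.save(summary=f"Tagging: detected {rule['detected']} (see [[User:Yapperbot/Scantag]])")
    time.sleep(10)     # safety delay after every edit
    return True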

In common with other Yapperbot tasks, the bot respects the kill page at User:Yapperbot/kill/Scantag, so in the event of an emergency, it could be turned off that way.

Because writing the regexes involved requires not only a good knowledge of regex, but for those regexes to be JSON escaped as well to stay in the JSON string correctly, and because of the potential for issues to come up as a result, there is also a sandbox for the rules. Myself or any other administrator configuring a Scantag rule would be able to set one up to test in here. Rules in the sandbox generate a report at User:Yapperbot/Scantag.sandbox, explaining exactly what the bot has understood from the JSON it's been given, and rendering an error if there are any obvious problems (e.g. failure to compile one of the regexes, noTagIf being set to anything other than a regex or false, etc). Each rule also can have a "testpage" parameter, specifying a test page with the prefix "User:Yapperbot/Scantag.sandbox/tests/", which is designed as a place to set up tests to make sure that the corresponding regex is matching when it's supposed to, and not matching when it's not. An example of one of these is here.

I appreciate that this is a fair bit more complicated than the previous bot tasks, so I'm absolutely happy to answer any questions! There are specific instructions for admins on how to deal with Scantag rule requests on the task page. I think there is also an open question here as to whether each rule would require a separate BRFA. Fundamentally, what's going on here isn't all too different from a "retroactive edit filter", of sorts, so I should think either the default restriction for JSON files to only admins editing, or changing the content model to JS so only interface admins can edit, should be sufficient to protect from misuse; however, I'd definitely like to hear BAG members' thoughts on this.

Discussion

  • This bot is currently proposing to check for the "CS1|2 non-closed }} issue". How do you propose that "new changes" be proposed/vetted/implemented? Primefac (talk) 18:31, 30 June 2020 (UTC)
    @Primefac: I'd envisage the process to be similar to that which is used for edit filters, and indeed have modelled the system around many of the same assumptions, but I'm absolutely open to any better suggestions! To give an overview of what I'm thinking of, though:
    Proposing new changes would happen through User talk:Yapperbot/Scantag, where I've set up a requests system very similar to that of WP:EFR. In much the same way, I'd expect that anything that is posted there has a clearly-demonstrated need, and in cases where it is envisaged to affect a large number of pages, a clear consensus so to do. Any editor would be welcome to propose and discuss rules there, just like EFR, and as discussed below, myself or any sysop would then be able to implement them.
    Vetting changes would take place in two stages: community review of the rule requests from any interested users (much like a BRFA or an EFR) if applicable, as well as (hopefully!) other experienced technical editors and myself, and then implementation review - i.e. checking that the regexes that are written are sane and will run correctly. I'll talk a bit more about this below, as it leads into:
    Implementing changes, which would be done by myself through the Yapperbot account or by any other administrator who edits the JSON file containing the rules. Because this process is non-urgent by its very nature, I would expect that even a sysop making a request would go through the same processes as any other request - there's no reason for them to directly skip to editing the JSON file. As I've mentioned in the instructions up at User:Yapperbot/Scantag, it would be expected that all changes are tested in the sandbox first before actually being implemented; I'm also considering adding a separate "live" parameter to the actual JSON file, which would notate whether or not a rule should be live, or on a dry run. This would allow for more complex regexes to be tested on the entire Wikipedia text, and have the bot save to a page a list of pages the regex would match, prior to it actually making those edits.
    Hopefully that clears things up a bit, let me know if there's anything that's not clear though! All of this is just "how it's struck me as being best", not "how it is definitively best", so any thoughts are definitely appreciated. As I mentioned in the original BRFA text, I'm particularly interested in thoughts on whether this is actually better to be restricted to interface administrators only rather than all administrators (or perhaps the sandbox should be admins, and the real rules intadmins? or perhaps even the sandbox and "dry run" rules being admins only, and the real rules intadmins?)
    PS. I appreciate that this is a chunky and annoying wall of text; sorry this BRFA is a bit more complex than the others! Naypta ☺ | ✉ talk page | 18:52, 30 June 2020 (UTC)
  • This bot appears to be fetching page texts from the API individually for every page. If it's going to do that for 6 million pages, that's horribly inefficient. Please batch the queries - it's possible for bots to query the texts of up to 500 pages in one single request, which is more efficient for the server. See mw:API:Etiquette. I see you're already handling edit conflicts, which is great (as they would occur more often because of the larger duration between fetching and editing).
Regarding the editing restrictions, I don't think there's a need to restrict it to intadmins. Just put a banner as an editnotice asking admins not to edit unless they know what they're doing. (non-BAG comment) SD0001 (talk) 14:05, 2 July 2020 (UTC)
@SD0001: I had a chat with some of the team in either #wikimedia-cloud or #wikimedia-operations on IRC (one or the other, I don't recall which, I'm afraid) who had indicated that there wouldn't be an issue with doing it this way, so long as maxlag was set appropriately (which is deliberately low here, at 3s). I didn't initially want to do too many page requests in a batch, for fear of ending up with a ton of edit conflicts towards the end of the batch; even with the ability to handle edit conflicts, it's expensive both in terms of client performance and also in terms of server requests to do so. That being said, batching some of the requests could be an idea - if either you or anyone else has a feel for roughly what that batch limit ought to be, I'd appreciate any suggestions, as this is the first time I'm building a bot that parses the entire corpus. Naypta ☺ | ✉ talk page | 14:38, 2 July 2020 (UTC)
@Naypta: Now that I've actually read the task description: since the bot is only editing the very top or bottom of the page, it is unlikely to run into many conflicts. Edit conflicts are only raised if the edits touched content in nearby areas; the rest are auto-merged using diff3. I'd be very surprised if you get more than 5-10 edit conflicts in a batch of 500. So if you want to reduce the number of server requests (from about 1000 to about 510 per 500 pages), batching is what I'd use. If you choose to do this, you'd want to give the jsub command enough memory to avoid an OOM. SD0001 (talk) 16:09, 2 July 2020 (UTC)
@SD0001: Sure, thanks for the recommendation - I'll plonk them all into batches then. You're right that it's only editing the very top and bottom, but it does need to do a full edit (because of maintenance template ordering) rather than just a prependpage and appendpage, which is unfortunate, but so the edit conflict issues might still come about from that. No harm in giving it a go batched and seeing how it goes though!   I'll make sure it gets plenty of memory assigned on grid engine to handle all those pages - a few gigabytes should do it in all cases. Naypta ☺ | ✉ talk page | 16:13, 2 July 2020 (UTC)
  Done - batching implemented. I've also changed the underlying library I use to interact with MediaWiki to make it work with multipart encoding, so it can handle large pages and large queries, like these batches, an awful lot better. Naypta ☺ | ✉ talk page | 22:20, 3 July 2020 (UTC)
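
For reference, the batching discussed above might look roughly like this (a Python sketch using the standard MediaWiki query API; the 500-title limit applies to accounts with apihighlimits, 50 otherwise).

import requests

API = 'https://en.wikipedia.org/w/api.php'

def fetch_batch(titles):
    """titles: a list of up to 500 page titles; returns {title: wikitext}."""
    resp = requests.get(API, params={
        'action': 'query',
        'prop': 'revisions',
        'rvprop': 'content',
        'rvslots': 'main',
        'titles': '|'.join(titles),
        'format': 'json',
        'formatversion': 2,
    }, timeout=60)
    pages = resp.json()['query']['pages']
    return {p['title']: p['revisions'][0]['slots']['main']['content']
            for p in pages if 'revisions' in p}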

  Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 22:22, 2 August 2020 (UTC)

DannyS712 bot III 72

Operator: DannyS712 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 10:26, Sunday, July 26, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Javascript

Source code available: Not written yet

Function overview: Automatically unpatrol pages created by global rollbackers

Links to relevant discussions (where appropriate): Wikipedia talk:New pages patrol/Reviewers#Autopatrol and global rollback

Edit period(s): Continuous

Estimated number of pages affected: Likely no more than a handful per day

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: See Wikipedia talk:New pages patrol/Reviewers#Autopatrol and global rollback. Users with global rollback that create pages here on enwiki are autopatrolled. For those users that lack local autopatrol, automatically un-patrol the pages that they create that are in the page curation system, so that they can be reviewed normally.

Pages to unpatrol will be fetched from the replicas using the query below. This filters for pages that last had their review status changed by global rollbackers where

  • The current status corresponds to autopatrolled pages (some global rollbackers, myself included, are new page reviewers, and so there are many entries where the status was changed by a global rollbacker, but in the context of patrolling others' creations, rather than creating their own)
  • The user lacks both local autopatrol and adminship (yes, bots also have autopatrol, so technically this wouldn't properly filter out global rollbackers that are also local bots and should not be unpatrolled, but there aren't any, and it's unlikely there will be in the future)
  • The page is in the main namespace (I don't think the concerns raised in the linked discussion suggest that we should be concerned about the user (sub-)pages created by global rollbackers, and user pages are also a part of the page curation system)
/* Value for pagetriage_page.ptrp_reviewed for autopatrol */
SET @autopatrol_status = 3;

/* Value for global_user_groups.gug_group for global rollbackers */
SET @global_group = 'global-rollbacker';

SELECT
    pg.page_id AS 'Target page id',
    gu.gu_name AS 'Creator (name)',
    a.actor_user AS 'Creator (user id)',
    ptrp.*,
    pg.*
FROM centralauth_p.global_user_groups gug
JOIN centralauth_p.globaluser gu
    ON gu.gu_id = gug.gug_user
JOIN actor a
    ON a.actor_name = gu.gu_name
JOIN pagetriage_page ptrp
    ON ptrp.ptrp_last_reviewed_by = a.actor_user
JOIN page pg
    ON pg.page_id = ptrp.ptrp_page_id
WHERE gug.gug_group = @global_group

/* Global rollbackers can be new page reviewers, only care about pages that they autopatrolled */
AND ptrp.ptrp_reviewed = @autopatrol_status

/* The focus is on articles, global rollbackers can be trusted not to abuse user pages */
AND pg.page_namespace = 0

/* Global rollbackers can also be locally autopatrolled. Exclude users in the relevant local groups */
AND NOT EXISTS (
    SELECT 1
    FROM user_groups ug
    WHERE ug.ug_user = a.actor_user
    AND ug.ug_group IN ('autoreviewer', 'sysop')
)

Testing the query correctly flagged some recent pages I and other global rollbackers created.
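
Once flagged by the query, each page would be un-patrolled through the PageTriage API. A rough sketch follows (in Python for illustration; the bot itself is JavaScript, and the exact pagetriageaction parameter names should be checked against the ApiPageTriageAction source mentioned in the discussion below rather than taken from this sketch).

import pywikibot
from pywikibot.data import api

site = pywikibot.Site('en', 'wikipedia')

def unpatrol(page_id):
    req = api.Request(site=site, parameters={
        'action': 'pagetriageaction',
        'pageid': page_id,
        'reviewed': 0,                 # assumed: 0 = unreviewed
        'token': site.tokens['csrf'],
    })
    return req.submit()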

Discussion

@DannyS712: to unpatrol a page we need phab:T22399.--GZWDer (talk) 10:52, 30 July 2020 (UTC)
@GZWDer: Not with page curation - that is for rc patrolling. See https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/PageTriage/+/refs/heads/master/includes/Api/ApiPageTriageAction.php DannyS712 (talk) 10:53, 30 July 2020 (UTC)
  Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. And yes, I realize these aren't "edits", but please post the log of un-patrols here when complete. Primefac (talk) 22:06, 2 August 2020 (UTC)
Will do. Unfortunately, the replicas are currently lagging, so it might be a bit before I can do this DannyS712 (talk) 22:08, 2 August 2020 (UTC)
Update - haven't gotten to this yet due to some other commitments, but I should be able to do it soon DannyS712 (talk) 05:21, 18 August 2020 (UTC)

Bots that have completed the trial period

MilHistBot 7

Operator: Hawkeye7 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 03:16, Thursday, September 24, 2020 (UTC)

Function overview: This job runs through the backlog of MilHist project articles without task forces assigned, and adds appropriate task force cards to the MilHist project template on the talk page.

Automatic, Supervised, or Manual: Automatic

Programming language(s): C#

Source code available: User:MilHistBot/AutoClass.cs

Links to relevant discussions (where appropriate): Wikipedia talk:WikiProject Military history/Coordinators#The final frontier

Edit period(s): weekly

Estimated number of pages affected: Initially, a backlog of about 900 articles. Thereafter about 30 articles per week.

Namespace(s): Talk

Exclusion compliant (Yes/No): Yes


Function details: The job loops through the Category:Military history articles with no associated task force and examines the articles found there. The categories are matched against a list of keywords, and the task force cards are added to the Template:WikiProject Military history template.
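
As a rough illustration of the keyword matching (in Python; the bot itself is C#), with a made-up fragment of a keyword table rather than the project's real list.

KEYWORDS_TO_TASK_FORCE = {         # hypothetical fragment of the keyword list
    'World War II': 'WWII',
    'Royal Navy': 'Maritime',
    'United States Army': 'US',
}

def task_forces_for(article_categories):
    """Return the task forces suggested by an article's category names."""
    found = set()
    for category in article_categories:
        for keyword, task_force in KEYWORDS_TO_TASK_FORCE.items():
            if keyword.lower() in category.lower():
                found.add(task_force)
    return sorted(found)

print(task_forces_for(['Category:Royal Navy officers', 'Category:World War II admirals']))
# ['Maritime', 'WWII']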

Discussion

  Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 14:56, 25 September 2020 (UTC)

  • This has been delayed due to changes to the software for another job. Hope to get it run soon. Hawkeye7 (discuss) 04:03, 11 October 2020 (UTC)
  Trial complete. No significant problems. Hawkeye7 (discuss) 22:24, 11 October 2020 (UTC)
Hawkeye7, link to diffs please? Primefac (talk) 16:46, 16 October 2020 (UTC)
There's a discussion in progress. Will provide another set. Diffs look like this. Nothing harmful. Hawkeye7 (discuss) 21:44, 16 October 2020 (UTC)

MajavahBot 4

Operator: Majavah (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 15:30, Friday, October 9, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: on GitHub

Function overview: Fill in DYK blurbs on pages that are missing them.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#DYK_blurb_filling_bot

Edit period(s): Go through the backlog as a one-time run; after that, run periodically, probably weekly

Estimated number of pages affected: currently 2,500 pages in the category

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: Go through pages in Category:Pages using DYK talk with a missing entry, parse the date from {{DYK talk}}, load the blurb from the archive, and fill it in
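
A rough sketch of that lookup (assuming the date can be assembled from the {{DYK talk}} parameters; the archive-page format follows the Wikipedia:Recent additions/YYYY/Month examples in the discussion below, and the blurb extraction here is deliberately naive).

from datetime import datetime
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def find_blurb(article_title, dyk_date_str):
    """dyk_date_str e.g. '14 May 2006', assembled from the {{DYK talk}} parameters."""
    date = datetime.strptime(dyk_date_str, '%d %B %Y')
    archive = pywikibot.Page(
        site, f"Wikipedia:Recent additions/{date.year}/{date.strftime('%B')}")
    # Naively pick the first bulleted hook that links to the article.
    for line in archive.text.splitlines():
        if line.lstrip().startswith('*') and f'[[{article_title}' in line:
            return line.lstrip('* ').strip()
    return None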

Discussion

  Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Yeah, it's my own Bot request, but this is hardly controversial. I'll review things, but leave it to another BAG member to do the final approval if everything is peachy. Headbomb {t · c · p · b} 15:43, 9 October 2020 (UTC)

  Trial complete. I'm having an issue that is causing the bot to sometimes incorrectly change the DYK appearance date. I'm currently manually correcting those and I have a guess on what is causing that bug.  Majavah talk · edits 17:11, 9 October 2020 (UTC)
I found out what is causing those issues ([1], [2]) and added some code to automatically detect similar issues.  Majavah talk · edits 17:27, 9 October 2020 (UTC)

  Approved for extended trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Let's test the improved bot logic then. Headbomb {t · c · p · b} 18:26, 9 October 2020 (UTC)

  Trial complete. LGTM  Majavah talk · edits 19:25, 9 October 2020 (UTC)
Looks good to me too. As a note, the date fixing wasn't requested, but it brings the {{DYK talk}} template in line with what is shown in the DYK archives, i.e. the UTC date. The archive date is often off by ±1 day from the {{DYK talk}} date because the local date can differ by that much depending on where you are in the world. This is a good thing in general, both WTF-reducing (±1 day differences) and error-correcting (differences of more than ±1 day), so there shouldn't be any issue with doing this task in combination with adding the missing |entry=.
It might be worth exploring approval for general date fixing, not just doing date fixing when there's a missing entry, but I'll leave that to the final BAG approval. Headbomb {t · c · p · b} 20:28, 9 October 2020 (UTC)
General date fixing would be nice while we're at it. As mentioned, it's confusing and I can't count the number of times I got messed up because of it. Wug·a·po·des 23:37, 9 October 2020 (UTC)
Headbomb Majavah The date in the DYKtalk template is semantically different from the date in the archives. DYKtalk template reflects the UTC date/time the hook appeared on the Main page. The archives reflect the UTC date/time the hook left the Main Page. I suggest a conversation with the broader DYK community before attempting to change these semantics. Shubinator (talk) 17:49, 10 October 2020 (UTC)
Flipped through the bot's trial edits - with the existing semantics that the DYKtalk template should reflect the UTC date/time the hook was placed on the Main Page, the bot is introducing errors by changing the date to when the hook left the Main Page:
Also it looks like 1922 College Football All-America Team is a separate bug, perhaps the one that was fixed between the first and second trial? The hook-filling logic looks great, so it might be best to scope the bot's responsibilities to just filling the hook in DYKtalk templates. Shubinator (talk) 20:58, 10 October 2020 (UTC)
I wasn't aware of those semantic differences, thanks for letting me know. If that's the case I agree that it's best to only fill the hook ignoring the date parameter.  Majavah talk · edits 08:44, 11 October 2020 (UTC)
That's not quite true. The date is used to create links to anchors on the archive page, e.g. Wikipedia:Recent_additions/2006/May#14_May_2006 (before) vs Wikipedia:Recent_additions/2006/May#15_May_2006 (after) for Hulk Hogan's Pastamania. The bot fixed those links to point to the correct archives. Headbomb {t · c · p · b} 15:19, 11 October 2020 (UTC)
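For illustration, a tiny sketch of how a date maps onto one of those archive anchors; the title format is simply inferred from the example links above.

<syntaxhighlight lang="python">
from datetime import date

def archive_anchor(d: date) -> str:
    # e.g. date(2006, 5, 15) -> "Wikipedia:Recent additions/2006/May#15 May 2006"
    return (f"Wikipedia:Recent additions/{d.year}/{d.strftime('%B')}"
            f"#{d.day} {d.strftime('%B %Y')}")

print(archive_anchor(date(2006, 5, 15)))
</syntaxhighlight>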
I'd be in favor of a new |archiveDate= parameter on Template:DYK talk to address the archive link anchors. Shubinator (talk) 00:42, 12 October 2020 (UTC)
I highly doubt that |date= truly reflects a difference between archiving time and displaying time, or that the difference is even meaningful. But I agree that's a different discussion, and that this bot should be concerned with sticking to the entries for now, and fixing dates later if there's consensus for it. Headbomb {t · c · p · b} 01:00, 12 October 2020 (UTC)
Shubinator's suggestion may be a good way forward for the long term; for now, we shouldn't be changing the dates for {{DYK talk}}, since the main concern is to show when the DYK for the article was posted on the main page. The archive, as noted, gives the time that the DYK was moved off the main page (or archived), which was sometimes the following day in the early days, when we were running three or four sets a day; nowadays that happens all the time when doing one set a day, or half the time when doing two. BlueMoonset (talk) 15:53, 12 October 2020 (UTC)
Agreed, this isn't the correct place to discuss how {{DYK talk}} should be changed. For now I've removed code that changes |date=, if it's wanted in the future I'll open a new BRFA.  Majavah talk · edits 15:26, 13 October 2020 (UTC)

  Approved for extended trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Let's have a hopefully final trial on this, and then some other BAG member can evaluate and do the final approval. Headbomb {t · c · p · b} 22:58, 13 October 2020 (UTC)

  Trial complete, and I've reverted the previous date changes.  Majavah talk · edits 07:18, 14 October 2020 (UTC)
Looks all good to me. I'll ping other BAG members for final review/approval. Headbomb {t · c · p · b} 21:43, 14 October 2020 (UTC)
Majavah Can the bot fill out |nompage= where easily possible/applicable, like DYKUpdateBot does? It's useful for retaining the link in case the current page title later moves. Mostly an extra really, so it doesn't matter too much. Regarding the above thread on dates, you say the date code is removed, so (just confirming) if Talk:1922 College Football All-America Team were rolled back and done again, it wouldn't modify that date? (it would stay "7 April 2009"?). Recent trial LGTM. ProcrastinatingReader (talk) 12:16, 16 October 2020 (UTC)
Correct, it does not modify dates anymore. I probably could add nompages but I'd like to do that in a separate BRFA to avoid problems similar to dates above when approving this BRFA.  Majavah talk · edits 18:20, 16 October 2020 (UTC)
  • A link to diffs/contribs of the last trial would be very helpful. Primefac (talk) 16:47, 16 October 2020 (UTC)
    @Primefac: Here you go: [3]  Majavah talk · edits 18:09, 16 October 2020 (UTC)

Usernamekiran BOT 4

Operator: Usernamekiran (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:47, Thursday, August 13, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): AWB

Source code available: AWB

Function overview: find and replace wikilinks

Links to relevant discussions (where appropriate): User talk:Xaosflux#bot task

Edit period(s): As required

Estimated number of pages affected: variable, based on task

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: I became active again at closing page move discussions. If a DAB page is involved in the moves, sometimes the bot (RussBot) is unable to fix the links, as they are not double redirects. Recently, the issue came up with the RM at Talk:Chani (character)#Requested move 30 July 2020, but the links to be fixed were very few. Today, a similar issue came up with Talk:Bell Satellite TV#Requested move 6 August 2020. I had to update somewhere around 450 pages from "Bell TV" to "Bell Satellite TV", as Bell TV was moved to Bell Satellite TV and Bell TV was converted into a DAB pointing to Bell Satellite TV, Bell Fibe TV, and Bell Mobile TV. In this case, RussBot is not an option, as there wouldn't have been a double redirect; and no fully-automated bot will know which link to choose from the DAB page.

My method/task is pretty basic and simple. I added a basic find-and-replace rule: find [[Bell TV and replace it with [[Bell Satellite TV. So far I have updated more than 150 articles with no issues, checking each diff in AWB and hitting Ctrl+S.

I am aware this is a very basic task, but there is no bot at WP:RM with this function. There is a possibility that some other bot might be approved for this task, but that would mean I would have to wait until that bot's operator comes online, whereas if I had the approval I could do it whenever the need arises. As this is a basic and uncontroversial task, I thought I should ask about it. Regards, —usernamekiran (talk) 19:47, 13 August 2020 (UTC)

PS: This method also handles previously piped links: special:diff/972777535. —usernamekiran (talk) 19:49, 13 August 2020 (UTC)
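For illustration, a rough Python equivalent of the find-and-replace rule described in the function details above. The actual edits were made with AWB's literal find-and-replace; the lookahead in this sketch is an extra assumed guard so that longer titles beginning with "Bell TV" are not caught.

<syntaxhighlight lang="python">
# Sketch only; the real task used AWB's built-in find-and-replace, not this script.
import re

OLD = "Bell TV"
NEW = "Bell Satellite TV"

def retarget(wikitext: str) -> str:
    # Replace the link target while keeping any existing pipe or section anchor:
    #   [[Bell TV]]            -> [[Bell Satellite TV]]
    #   [[Bell TV|Bell TV's]]  -> [[Bell Satellite TV|Bell TV's]]
    return re.sub(r"\[\[" + re.escape(OLD) + r"(?=[\]|#])", "[[" + NEW, wikitext)

print(retarget("carried on [[Bell TV]] and [[Bell TV|Bell TV's]] receivers"))
# carried on [[Bell Satellite TV]] and [[Bell Satellite TV|Bell TV's]] receivers
</syntaxhighlight>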

Discussion

What's the general logic / criteria for the task? Would this be automatic or assisted somehow (i.e. triggered by moves, or human-supervised, telling it 'fix those specifically') ? Headbomb {t · c · p · b} 17:11, 20 August 2020 (UTC)

@Headbomb: Basically, I would be making the list in AWB after closing the RM, but before performing the actual move. I apologise for selecting the bot's mode as automatic above. The operation would be just like manual AWB editing, with the only exception being the bot flag, to avoid hitting Ctrl+S a lot of times. It would also be convenient and time-saving. —usernamekiran (talk) 23:18, 20 August 2020 (UTC)
For the record, what you just described is "automatic mode". Primefac (talk) 23:51, 20 August 2020 (UTC)
Then where do bots like MuzikBot and the archive bots fall? —usernamekiran (talk) 10:51, 21 August 2020 (UTC)
They are also fully automatic. Primefac (talk) 19:32, 21 August 2020 (UTC)

  Approved for trial (1 move discussion). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Alright, then let's have a trial and see if any bolts come loose. Headbomb {t · c · p · b} 23:22, 20 August 2020 (UTC)

  • This is a classic case where context matters. There's a whole wikiproject whose goal is fixing links to dab pages (WP:DPL), and there are various tools they have developed over the years to help with aspects of the work. Sadly, a simple find and replace will not work, because the links are by definition ambiguous, so you can't know the intended target unless you examine the context. And you can't assume that the previous primary topic will always be it either. Fixing links in some very narrowly defined contexts can be helpful, like [[New York]] City -> [[New York City]], but situations where it's needed to do this at scale are quite rare. Also adding that, contrary to what was suggested in the Chani RM linked above, closers of RM discussions are not expected to fix dab links resulting from the moves. – Uanfala (talk) 10:19, 21 August 2020 (UTC)
    @Uanfala: Hi. I am aware of the issues that can arise, like the ones you mentioned. As I stated in the comment above, I will be making the list in AWB, and then I will check the list and the edits; only once the list/edits are okay will I save the edits. It will not be a "blind" task where I make a list of "what links here" and hit save. —usernamekiran (talk) 10:50, 21 August 2020 (UTC)
    @Uanfala: Yes they are, see WP:FIXDABLINKS - "Before moving an article to a qualified name (in order to create a disambiguation page at the base name, to move an existing disambiguation page to that name, or to redirect that name to a disambiguation page), click on What links here to find all of the incoming links and repair them." Narky Blert (talk) 20:56, 22 August 2020 (UTC)
    Thank you for the link, Narky Blert. This is indeed what that page says. This is an abomination we should be thankful is never followed in practice – otherwise many moves wouldn't happen because the editors involved wouldn't ever have the time and inclination to fix those links and we would be stuck with all the bad primary topics for perpetuity. – Uanfala (talk) 21:38, 22 August 2020 (UTC)
    I'm having a look at the talk archives and I'm not surprised that this piece of nonsense doesn't appear to have been discussed much. The one discussion there was on the topic, however, is clear that closers of RM discussions should neither be required nor expected to fix those links. – Uanfala (talk) 21:44, 22 August 2020 (UTC)
    @Uanfala: FIXDABLINKS is idealistic and unworkable, even penal. If there is only a handful of links, it does make sense for the closer to fix the links, as being quicker and simpler than asking the proposer to do so and checking to see if they've done it. The problem is with moves which break several hundred or several thousand links. (Example: a move involving one of the Jammu and Kashmir pages at the beginning of June broke over 2,000 links; two-and-a-half months later, 495 are still broken, see DPwL.) It is unfair to expect that an editor who has made an administrative change reflecting a WP:CONSENSUS should have sole responsibility for cleaning up the mess, especially prior to making it, which is what FIXDABLINKS says.
    That 2014 discussion is still relevant. It includes a proposal which I came up with independently, that the onus should be on those who supported the move (and in big cases, there will always be more than one of them).
    I once proposed a change to a guideline; I won't be making that mistake again. My legal training taught me that if one party finds a clause ambiguous, it is ambiguous and it needs to be redrafted. I found the wording of a guideline ambiguous; some agreed, others didn't, and the result was WP:NOCONSENSUS. Narky Blert (talk) 07:00, 23 August 2020 (UTC)
    FIXDABLINKS is vague as to who should fix the links. In my opinion, it should be the person(s) requesting the move, rather than the closer who carries out the mechanics of moving the pages but may not be a subject expert. All too often, it gets left for a passing gnome. Certes (talk) 09:37, 23 August 2020 (UTC)
    Yeah, proposing changes to guidelines can be a pain (I would think even more so for someone like me who lacks any legal training), and it sometimes may be easier to just ignore antiquated guidelines than attempt changing them, but oh well, there we go: Wikipedia talk:Disambiguation#Rewriting WP:FIXDABLINKS (Links to disambiguated topics). – Uanfala (talk) 14:06, 23 August 2020 (UTC)
    @Uanfala: I've seen that discussion, I intend to comment in detail once I've gathered my wool and rowed up my ducks. Narky Blert (talk) 18:31, 24 August 2020 (UTC)
  • a comment in general: I am okay with fixing the links from my normal (non-bot) account if there are only a few pages to be fixed. If I am running the task from the bot account, I will make sure there are no issues after my edits. I am requesting the bot flag only for the cases where there are a lot of pages/links to be fixed. For example, in the case of Talk:Bell Satellite TV#Requested move 6 August 2020: in step one, I created the list after the pre-parse mode; there were around 450 articles to be fixed. In step two, I skimmed that list. In step three, I made around 150 edits, and in a second run around 50 edits, checking the diff before saving each edit. All were without any technical problems. I performed the remaining ~250 edits without checking the diffs, and after the entire session I checked a lot of them at random. None of the diffs I checked had any problem. What I am trying to say is, I will make a logical judgement before and during step two. I will not be changing the links without getting to know the context. The only difference between a regular/normal AWB run and this bot run would be the last step of saving the edit manually vs automatically. Given my previous experience of moving pages, and of editing Wikipedia in general, I can figure out where to use caution and how to handle things. I am looking at this trial run as a means to check only the technical accuracy of my method. —usernamekiran (talk) 15:02, 21 August 2020 (UTC)
    Thank you for the detailed explanation. Just a clarifying question, what proportion of links do you manually examine the context of? By examining the context, I mean looking at the article text that has the link and reading the sentence in which the link is found (possibly also the sentences before and after). – Uanfala (talk) 16:10, 21 August 2020 (UTC)
    That is a very broad question. The solution, and the context, begins with the page that is being moved. Bell TV was the easiest one: there was not much scope for linking to the incorrect page (the incorrect page being the one that is to be moved). Biographies are also easy. Until now, whenever I came across such instances, I used to skim the page which is to be moved. That gives you a general idea of what the topic is and where you can expect the links to come from. Also, after making the first list, I remove disamb pages from AWB's list and edit it manually. The most complicated are the ones involving communities/castes; sometimes an incorrect process can lead to an (intended) group being linked to a language (e.g. Tamils (people), with the redirect Tamil people; then there is Tamil language, and Tamil script. A complete list can be found at the dab Tamil). I do have a method for weeding out the pages that might get incorrectly updated, but I don't know how to put it in words. I do it when I see the "what links here" list on Wikipedia, and/or the AWB list. It's based partly on hunch, and mostly on logic (I don't know how to put this logic into words). The only thing I can say is, I will make the edits carefully, and there won't be any issues. Not trying to brag, but according to xtools I have moved 1,787 pages till now. That number is exaggerated, as it counts one page swap/round-robin move as three page moves. Assuming I have moved 400 pages, and performed this task (the one in discussion) a few times, there have never been any issues (however, there were two instances during my early days as a page mover where my closure was not perfect, but almost nothing after that). And I will try to keep things that way   —usernamekiran (talk) 18:42, 21 August 2020 (UTC)
    Thank you for the detailed reply, but I still don't see an answer to my question. Should I take it that you don't intend to examine each link in its context? In such a case, this task should definitely not go ahead in its present form. To decide the target of the link, you need to examine the link itself and the text around it, it's not enough to guess from the article title. You can use AWB or another tool in combination with whatever heuristics you choose, as long as you manually examine each link to be fixed: either before fixing it, or in the diff afterwards. A process that gets it right 99% of the time does not lead to a net improvement for readers: a link to a disambiguation page is merely an inconvenience that adds an extra step to a navigation path, an incorrectly disambiguated link on the other hand completely cuts off navigation to the intended article and also introduces a factual inaccuracy in the text. If you examine each and every link to be fixed, then it's up to you what technical means you use to save your edit; but whichever way you do it, you shouldn't be doing it under a bot flag, as this would be a manual edit that needs to invite the same amount, and type, of review from page watchers as any other manual edit. – Uanfala (talk) 19:10, 21 August 2020 (UTC)
    Adding that you can ask for more feedback at WT:DPL. – Uanfala (talk) 20:35, 21 August 2020 (UTC)
    @Uanfala: I have no problem with checking the edits. However, I can't understand why you are so reserved about this task. If [ABC] gets moved to [XYZ], and we have to divert the incoming links to [ABC] from [foo] and [lorem] over to [XYZ], provided [foo] and [lorem] already have wikilinks pointing to [ABC], I don't see much room for the error you are talking about, if any. —usernamekiran (talk) 08:25, 22 August 2020 (UTC)
    We're talking about moves involving primary topics, right? Say, ABC to ABC (foo) and then ABC (disambiguation) to ABC. Your assumption seems to be that all links to ABC will be intended for ABC (foo). This is not necessarily true, and in fact it is very rarely the case. There will be links to ABC where the target is not ABC (foo), but ABC (bar) or ABC (buz). See for example the incoming links to Gaddi, an article about an ethnic group of India: there are some links intended for it (like at Kareri Lake), but there are also links not intended for it (like at List of biblical names starting with G). – Uanfala (talk) 11:41, 22 August 2020 (UTC)
    That's what I was trying to say when I gave the example of Tamil. Anyway, as I said earlier, I don't have any problem checking the edits. —usernamekiran (talk) 11:58, 22 August 2020 (UTC)
  • When an article moves and a disambiguation page usurps its title, this is because that title is ambiguous. Many links will be intended for the moved article, but some will have other meanings. It is necessary to examine each link manually. Requests to automatically change links which have been vetted by human eyes occasionally pop up at WP:AWB/TA, and I'm sure many editors use semi-automated tools for similar purposes. (I prefer WP:DisamAssist.) A bot to do that might be useful. However, I would oppose an automated change where links have not been checked manually, or even where only a sample has been checked. This would simply turn known unknowns into unknown unknowns, for example by replacing Foo with Foo (computing) where Adeline Foo was intended. I recommend automatically generating lists of potentially good changes, then applying them selectively with manual supervision. Certes (talk) 12:09, 22 August 2020 (UTC)
    I did a small amount of the cleanup after Talk:Vinyl#Requested move 19 June 2017, when a redirect was turned into a DAB page. That broke about 2,500 links. Only about 10% of them were intended for the original target, and the other 90% were mostly split three ways in roughly equal amounts. Eyeballs were essential. Narky Blert (talk) 21:05, 22 August 2020 (UTC)
    Another example is New York, which required checking and changing 119,000 wikilinks, of which more than 12,000 were not for the original target. (I did a tiny fraction of the work.) Certes (talk) 21:55, 22 August 2020 (UTC)
  • @Certes, Narky Blert, and Uanfala: Once my problem with AWB is resolved (I can't start it up), I can fix the links to a page from WP:DPL as the bot trial, or maybe some other page. @Headbomb: I know this is different from the proposed task involving RMs, but the task is the same. Would it be possible to run the trial on a page from DPL? —usernamekiran (talk) 09:25, 23 August 2020 (UTC)
    That sounds good to me, if you identify a DPL case where all links are intended for a single target. Do you actually need AWB for this? WP:JWB can perform many AWB tasks, though you will have to get the list of pages to fix from elsewhere such as Special:WhatLinksHere. Certes (talk) 09:32, 23 August 2020 (UTC)
    No, I meant pages whose incoming links are intended for two or three different targets. —usernamekiran (talk) 09:36, 23 August 2020 (UTC)
    (ec) DPwL would be an excellent source for a test run; it's updated twice daily, and you could choose the number of links from 1 upwards.
    A thought. If a fix isn't obvious, could the bot be programmed to add a {{disambiguation needed}} tag? It's the DABfixer's last resort, and it's remarkable how often problems in Category:Articles with links needing disambiguation get fixed. Narky Blert (talk) 09:44, 23 August 2020 (UTC)
    I will look into DPwL as soon as I get on a computer. A disambiguation needed tag can be added, but that would just be me manually adding it, hehe. Seriously speaking, I think all of that would become clearer after the bot's first 4-5 heavy runs. —usernamekiran (talk) 09:58, 23 August 2020 (UTC)
  • @Certes, Narky Blert, and Uanfala: I picked up Manny Perez from DPwL, but all the incoming links to the disamb were intended for only one article. I can easily fix the disambs of the likes of Chuuk, Stefan Marinović, and Wyn Jones. All of them have very few incoming links, but that's not the reason. As long as the targets are easily identifiable, I can do the task through AWB/bot, no matter how many entries are listed on the disamb page. For example, Cruel Summer has only 5 entries, but all of them are songs. For now, I don't know how to approach such situations, but I can come up with some idea/solution if I keep working on the task. Tomorrow I will work on any two of the articles above, and then we can have a bot trial for the remaining third article. Note: there would be no difference in the working method at all, except for the automated saves. —usernamekiran (talk) 18:17, 24 August 2020 (UTC)
    I'm not the best person to consult about these newfangled tool and bot thingies. I fix links to DAB pages the way my father taught me, which is the way his father taught him, and his forefathers taught him, and so on time out of mind - using craftsman-knapped flint tools and low cunning.
    It's the bulk stuff - DAB pages with 10+ links, say - which is a problem. It's mind-blowingly tedious work, and I try to avoid it. Props to those who do firefight such links. Narky Blert (talk) 19:37, 24 August 2020 (UTC)
  • Done with Wyn Jones and Stefan Marinović. No technical issues found. There were a few articles where {{sortname}} was used; I fixed one instance manually, and then a couple through AWB. Then there was this instance where the first and last names were used the other way around, and this instance where an initial was used. But edits like these would not be made by the automated account: I will be fixing such links manually (through the non-bot account), and at the same time I will be making the lists for the other targets using AWB. The doubtful/problematic articles would be added to one list for manual editing; the other lists would have no chance of error. Links to Chuuk have been fixed by someone. I can fix the links to Big box; there are 4 targets, and 109 incoming links. I am ready to do this with the automated/bot account. Pinging @Headbomb and Primefac: not sure if I should ping the other editors as well. —usernamekiran (talk) 16:38, 27 August 2020 (UTC)
  • Since my last comment here, I have been working on a custom/hybrid module for this task. I got input from David at their talk page: special:permalink/975938658#find and replace random piped wikilinks. I tested this module in my sandbox, and it worked successfully: special:diff/975939820. I then tested this module on the Gaddi disamb. All the edits were as expected, including special:diff/977067060. The only problematic edit was special:diff/977066941, where it changed [[Gaddi]]s to [[Gaddis]]s. I updated my module and fixed the issue: special:diff/977067363. I should have anticipated that already, but the module now handles this scenario. In short: if there are many pages to be fixed, I will create the lists first and then handle it through the bot account. If there are not many pages, or creating the lists is not worth it, I will do the task through the non-bot account. But now I can positively say that if I do it from the bot account, there will be no mistakes. —usernamekiran (talk) 19:45, 6 September 2020 (UTC)
    The Gaddi run has introduced a number of grammatical errors. Replacing [[Gaddi]] with [[Gaddis]] doesn't work in all contexts as the first word is singular and the second one – plural. I've checked the first 10 edits, and the following contain this error: [4] [5] [6] [7]. – Uanfala (talk) 20:04, 6 September 2020 (UTC)
    Yes. If I had been told sooner, I could easily have done that. I generally ask about this while closing the RM (e.g. Talk:Neurolathyrism#Requested move 2 July 2020). In either case, there were no incorrect targets and no technical errors :) —usernamekiran (talk) 20:34, 6 September 2020 (UTC)
    Well, editors are expected to figure this out by themselves as part of their preparation before fixing the dablinks, regardless of the method they're going to use (I only pointed that out to you after I noticed an error washing up on my watchlist). This particular kind of error can be avoided if you don't change the visible article text, but use piping in the link (though of course, there are cases where linking directly is preferable). – Uanfala (talk) 21:28, 6 September 2020 (UTC) Adding that I've now fixed the 13 such errors introduced in this run. – Uanfala (talk) 22:39, 6 September 2020 (UTC)
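A small sketch of the piping approach suggested above, which retargets the link without altering the visible article text; the handling of the plural link trail is an illustrative assumption, not part of the module under discussion.

<syntaxhighlight lang="python">
# Sketch only; illustrates piping instead of changing the displayed word.
import re

def repoint_gaddi(wikitext: str) -> str:
    # [[Gaddi]]s      -> [[Gaddis]]          (the 's' link trail folds into the new title)
    wikitext = re.sub(r"\[\[Gaddi\]\]s", "[[Gaddis]]", wikitext)
    # [[Gaddi]]       -> [[Gaddis|Gaddi]]    (pipe, so the displayed word stays singular)
    wikitext = re.sub(r"\[\[Gaddi\]\]", "[[Gaddis|Gaddi]]", wikitext)
    # [[Gaddi|label]] -> [[Gaddis|label]]    (existing pipes keep their labels)
    wikitext = re.sub(r"\[\[Gaddi\|", "[[Gaddis|", wikitext)
    return wikitext

print(repoint_gaddi("the [[Gaddi]]s herd sheep; a [[Gaddi]] shepherd"))
# the [[Gaddis]] herd sheep; a [[Gaddis|Gaddi]] shepherd
</syntaxhighlight>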

BAG note Just as a note, I won't have time to check this for a couple of days/a week-ish so if some other BAG member wants to take a look, feel free. Headbomb {t · c · p · b} 22:04, 6 September 2020 (UTC)

  Trial complete. Done through my alt account Usernamekiran (AWB) with this move. I checked all the diffs; no issues found at all. Sample diffs: sample diff 1, sample diff 2, and 3. —usernamekiran (talk) 18:58, 10 September 2020 (UTC)

So, um... why did this not get run with the bot?
Second question, which I only just realized now since I haven't really been paying much attention (since Headbomb's been the primary BAG) - is there going to be some sort of request process, or is this just so you don't have to hit "save" a bunch of times when you close an RM yourself? Primefac (talk) 22:02, 29 September 2020 (UTC)
Headbomb had approved the trial when the account didn't have the bot flag. I was not sure if I should have used the bot account, so I went with the non-bot account. I am also not sure if the bot needs to be enabled somewhere (on your side) to be able to edit the mainspace.
I am willing to accept requests which I can handle without any problems. I can handle doubtful requests with the non-bot account (or the bot account, but in non-bot mode). If approved, I was thinking about posting a normal discussion thread, similar to special:permalink/978981536#bot for link fixing before/after page moves, on WT:RM and WT:Page mover, with a really long {{DNAU}}. —usernamekiran (talk) 03:17, 30 September 2020 (UTC)


Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.


Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.