Wikipedia:Bot requests

This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).

You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.

Before making a request, please see the list of frequently denied bots: tasks that are either too complicated to program or that lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).

Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did so with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).


Please add your bot requests to the bottom of this page.


Copy coordinates from lists to articles

Virtually every one of the 3000-ish places listed in the 132 sub-lists of National Register of Historic Places listings in Virginia has an article, and with very few exceptions, both lists and articles have coordinates for every place, but the source database has lots of errors, so I've gone through all the lists and manually corrected the coords. As a result, the lists are a lot more accurate, but because I haven't had time to fix the articles, tons of them (probably over 2000) now have coordinates that differ between article and list. For example, the article about the John Miley Maphis House says that its location is 38°50′20″N 78°35′55″W (38.83889, -78.59861), but the manually corrected coords on the list are 38°50′21″N 78°35′52″W (38.83917, -78.59778). Like most of the affected places, the Maphis House has coords that differ only slightly, but (1) ideally there should be no difference at all, and (2) some places have big differences, and either we should fix everything, or we'll have to have a rather pointless discussion of which errors are too small to fix.

Therefore, I'm looking for someone to write a bot to copy coords from each place's NRHP list to the coordinates section of {{infobox NRHP}} in each place's article. A few points to consider:

  • Some places span county lines (e.g. bridges over border streams), and in many of these cases, each list has separate coordinates to ensure that the marked location is in that list's county. For an extreme example, Skyline Drive, a long scenic road, is in eight counties, and all eight lists have different coordinates. The bot should ignore anything on the duplicates list; this is included in citation #4 of National Register of Historic Places listings in Virginia, but I can supply a raw list to save you the effort of distilling a list of sites to ignore.
  • Some places have no coordinates in either the list or the article (mostly archaeological sites for which location information is restricted), and the bot should ignore those articles.
  • Some places have coordinates only in the list or only in the article's {{Infobox NRHP}} (for a variety of reasons), but not in both. Instead of replacing information with blanks or blanks with information, the bot should log these articles for human review.
  • Some places might not have {{infobox NRHP}}, or in some cases (e.g. Newport News Middle Ground Light) it's embedded in another infobox, and the other infobox has the coordinates. If {{infobox NRHP}} is missing, the bot should log these articles for human review, while embedded-and-coordinates-elsewhere is covered by the previous bullet.
  • I don't know if this is the case in Virginia, but in some states we have a few pages that cover more than one NRHP-listed place (e.g. Zaleski Mound Group in Ohio, which covers three listings); if the bot produced a list of all the pages it edits, a human could go through the list, find any entries with multiple appearances, and check them for fixes.
  • Finally, if a list entry has no article at all, don't bother logging it. We can use WP:NRHPPROGRESS to find what lists have redlinked entries.

I've copied this request from an archive three years ago; an off-topic discussion happened, but no bot operators offered any opinions. Neither then nor now has any discussion been conducted for this idea; it's just something I've thought of. I've come here basically just to see if someone's willing to try this route, and if someone says "I think I can help", I'll start the discussion at WT:NRHP and be able to say that someone's happy to help us. Of course, I wouldn't ask you actually to do any coding or other work until after consensus is reached at WT:NRHP. Nyttend (talk) 15:53, 12 February 2020 (UTC)
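
In case it helps a prospective operator, here is a minimal sketch of the copy step described above, using pywikibot and mwparserfromhell. The parameter names (|article=, |lat= and |lon= in {{NRHP row}}, |coordinates= in {{Infobox NRHP}}) are assumptions that would need checking against the real templates, and nothing like this would run before the WT:NRHP discussion mentioned above.

import pywikibot
import mwparserfromhell

site = pywikibot.Site('en', 'wikipedia')
review_log = []   # articles needing human review (missing infobox, one-sided coords)

def sync_list(list_title, skip_titles):
    list_page = pywikibot.Page(site, list_title)
    rows = mwparserfromhell.parse(list_page.text).filter_templates(
        matches=lambda t: t.name.matches('NRHP row'))      # assumed row template
    for row in rows:
        if not row.has('article'):
            continue
        title = str(row.get('article').value).strip()
        if not title or title in skip_titles:               # skip multi-county duplicates
            continue
        article = pywikibot.Page(site, title)
        if not article.exists():
            continue                                        # redlinked entries: not logged
        has_list_coords = row.has('lat') and row.has('lon')
        code = mwparserfromhell.parse(article.text)
        boxes = [t for t in code.filter_templates() if t.name.matches('Infobox NRHP')]
        if not boxes or boxes[0].has('coordinates') != has_list_coords:
            review_log.append(title)                        # log for human review
            continue
        if not has_list_coords:
            continue                                        # no coords anywhere: ignore
        lat = str(row.get('lat').value).strip()
        lon = str(row.get('lon').value).strip()
        boxes[0].add('coordinates', '{{coord|%s|%s|display=inline,title}}' % (lat, lon))
        article.text = str(code)
        article.save(summary='Copy human-checked coordinates from the NRHP county list')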

You could use {{Template parameter value}} to pull the coordinate values out of the {{NRHP row}} template. It would still likely take a bot to do the swap but it would mean less updating in the future. Of course, if the values are 100% accurate on the lists then I suppose it wouldn't be necessary. Primefac (talk) 16:55, 12 February 2020 (UTC)
Never heard of that template before. It sounds like an Excel =whatever function, e.g. in cell L4 you type =B4 so that L4 displays whatever's in B4; is that right? If so, I don't think it would be useful unless it were immediately followed by whatever's analogous to Excel's "Paste Values". Is that what you mean by having a bot doing the swap? Since there are 3000+ entries, I'm sure there are a few errors somewhere, but I trust they're over 99% accurate. Nyttend (talk) 02:57, 13 February 2020 (UTC)
That's a reasonable analogy, actually. Check out the source of Normani#Awards_and_nominations: it pulls the wins and nominations values from the infobox at the "list of awards", which means the main article doesn't need to be updated every time the list is changed.
As far as what the bot would do, it would take one value of {{coord}} and replace it with a call to {{Template parameter value}}, pointing in the direction of the "more accurate" data. If the data is changed in the future, it would mean not having to update both pages.
Now, if the data you've compiled is (more or less) accurate and of the not-likely-to-change variety (I guess I wouldn't expect a monument to move locations) then this is a silly suggestion – since there wouldn't be a need for automatic syncing – and we might as well just have a bot do some copy/pasting. Primefac (talk) 21:27, 14 February 2020 (UTC)
Y'know, this sort of situation is exactly what Wikidata is designed for... --AntiCompositeNumber (talk) 22:29, 14 February 2020 (UTC)
Primefac, thank you for the explanation. The idea sounds wonderful for situations like the list of awards, but yes these are rather accurate and unlikely to change (imagine someone picking up File:Berry Hill near Orange.jpg and moving it off site), so the bot copy/paste job is probably best. Nyttend (talk) 02:23, 15 February 2020 (UTC)
By the way, Primefac, are you a bot operator, or did you simply come here to offer useful input as a third party? Nyttend (talk) 03:12, 20 February 2020 (UTC)
I am both botop and BAG, but I would not be offering to take up this task as it currently stands. Primefac (talk) 11:24, 20 February 2020 (UTC)
Thank you for helping me understand. Regarding "as it currently stands": is there something wrong with it, i.e. if changes were made you'd be offering, or do you simply mean that you have other interests (WP:VOLUNTEER) and don't feel like getting involved in this one? This question might sound like I'm being petty; I'm writing with a smile and not trying to complain at all. Nyttend (talk) 00:27, 21 February 2020 (UTC)
I came here to say what AntiCompositeNumber said. It's worth emphasising: this is exactly what Wikidata is designed for. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:25, 17 March 2020 (UTC)
Actually not. A not-so-small fraction of articles need to have different coordinates in lists and infoboxes, as I already noted here. If we consistently rely on the lists to inform Wikidata, it's going to end up with a good number of self-contradictions due to lists that appropriately don't provide coordinates that make sense in articles (e.g. multi-county listings). Moreover, you can't rely on the infoboxes to inform Wikidata, because there's a consistently unacceptable error rate in coordinates unchecked by humans, and very few infoboxes are checked by humans; they're derived from the National Register database, and it would be pointless to ignore or trash the human-corrected Virginia coordinates. Literally all that needs to be done is a bot doing some copy/pasting; it would be greatly appreciated if someone were to spend a few minutes on this, instead of passing the buck. Nyttend backup (talk) 19:36, 28 April 2020 (UTC)
The coordinates in the lists are often incorrect too. Let me know if you want help manually correcting them. Abductive (reasoning) 02:46, 22 June 2020 (UTC)

Enlist help to clear Category:Harv and Sfn no-target errors

  1. Fetch all articles in Category:Harv and Sfn no-target errors
  2. Compile a list of who created what article
  3. Compile a list of which Wikiproject covers what article
  4. Send each user[1] and each WikiProject a personalized report about which articles they created have errors in them, e.g.
 
== List of your created articles that are in [[:Category:Harv and Sfn no-target errors]] ==

A few articles you created are in need of some reference cleanup. Basically, some short references created via {{tl|sfn}} and {{tl|harvnb}} and similar templates have missing full citations or have some other problems. This is ''usually'' caused by copy-pasting a short reference from another article without adding the full reference, or because a full reference is not making use of citation templates like {{tl|cite book}} (see [[Help:CS1]]) or {{tl|citation}} (see [[Help:CS2]]). See [[Category:Harv and Sfn template errors#Resolving errors|how to resolve issues]]. To easily see which citation is in need of cleanup, you can check '''[[:Category:Harv and Sfn template errors#Displaying error messages|these instructions]]''' to enable error messages ('''Svick's script''' is the simplest to use, but '''Trappist the monk's script''' is a bit more refined if you're interested in doing deeper cleanup).

The following articles could use some of your attention
{{columns-list|colwidth=30em|
#[[Article 1]]
#[[Article 2]]
...
}}

If you could add the full references to those articles, that would be great. Again, the easiest way to deal with those is to install Svick's script per [[:Category:Harv and Sfn template errors#Displaying error messages|these instructions]]. If, after installing the script, you do not see an error, that means it was either taken care of, or was a false positive, and you don't need to do anything else.

Also note that the use of {{para|ref|harv}} is no longer needed to generate anchors. ~~~~
  1. ^ Skip user talk pages that already have a "List of your created articles that are in [[:Category:Harv and Sfn no-target errors]]" header, since they already have such a report

Headbomb {t · c · p · b} 23:18, 18 May 2020 (UTC)
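
For whoever takes this up, a rough sketch of steps 1, 2 and 4 (the message body would come from the draft above, batching as suggested in the replies below could be added by slicing the sorted list, and the revision-access details should be treated as approximate):

from collections import defaultdict
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
cat = pywikibot.Category(site, 'Category:Harv and Sfn no-target errors')
HEADER = 'List of your created articles that are in [[:Category:Harv and Sfn no-target errors]]'

# Steps 1 and 2: fetch the articles and group them by creator.
by_creator = defaultdict(list)
for article in cat.articles(namespaces=0):
    creator = article.oldest_revision['user']
    by_creator[creator].append(article.title())

# Step 4: deliver a personalised report, skipping anyone already notified (footnote 1).
for user, titles in sorted(by_creator.items(), key=lambda kv: -len(kv[1])):
    talk = pywikibot.Page(site, 'User talk:' + user)
    if HEADER in talk.text:
        continue
    listing = '\n'.join('#[[%s]]' % t for t in sorted(titles))
    talk.text += '\n\n== %s ==\n%s\n~~~~\n' % (HEADER, listing)
    talk.save(summary='Report of created articles with Harv/Sfn no-target errors', minor=False)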

I think the message needs to provide a link to a discussion page where people can go for help. Keep in mind that most requests for help will be of the form "What is this message? I didn't do anything or ask for this. I don't understand it. Help me." – Jonesey95 (talk) 23:31, 18 May 2020 (UTC)
Agree it would be a good idea to point to a help page. Where would that be? Help talk:CS1 perhaps? Headbomb {t · c · p · b} 00:21, 19 May 2020 (UTC)
Maybe Template talk:Sfn? If this goes through, I'd like to see these messages go out in batches, in case a potential help system (run by you and me, presumably) gets a lot of traffic. – Jonesey95 (talk) 01:44, 19 May 2020 (UTC)
Doesn't really matter much to me where things go. Module talk:Footnotes could be a place. Messages could be sent in batches too. Maybe top 25 users, then next 25, and so on each day for the first week. And then see what the traffic is and adjust rates if it's nothing crazy. Headbomb {t · c · p · b} 11:28, 19 May 2020 (UTC)

A bot to develop a mass of short stubs and poorly built articles for Brazilian municipalities

I propose that a bot along the lines of {{Brazil municipality}} be created to develop our stubs like Jacaré dos Homens, which in some cases have been lying around for up to 14 years. There are 5,570 municipality articles, mostly poorly developed or inconsistent in data and formatting even within different states. A bot would bring much-needed information and consistency to the articles and leave them in a half-decent state for the time being; Igaci, which Aymatth2 expanded, is an example of what is planned and would happen to stubs like Jacaré dos Homens. Some municipalities have infoboxes and some information, but hopefully this bot will iron out the current inconsistencies and dramatically improve the average article quality. It would be far too tedious to do it manually, would take years, and they've already been like this for up to 14 years! So support on this would be appreciated.† Encyclopædius 12:09, 20 May 2020 (UTC)

@Encyclopædius: and @Aymatth2: Where's the community-endorsed consensus from WikiProject Brazil/WikiProject Latin America/Village Pump? Where's your driver list of proposed articles? How are you proposing to improve the page so that these aren't perma-stubs with no chance at improvement? Per WP:FAIT and WP:MASSCREATION it's expected that there will be a very large and well-attended consensus that this bulk creation is appropriate. In short,   Not a good task for a bot. Table this until you have a community consensus in hand, as very few bot operators will roll the dice on doing this task in exchange for having their bot revoked. Hasteur (talk) 17:40, 23 June 2020 (UTC)
@Hasteur: The title of this proposal is a bit misleading. The idea is not to create a mass of short stubs and poorly built articles, but to improve the existing mass of short stubs and poorly built articles. There are 5,570 of them, all notable based on WP:NGEO. The Brazilian Institute of Geography and Statistics (IBGE) maintains a database with extensive information on the geography, population, economy etc. of each of them. See Cocalinho and Brasil / Mato Grosso / Cocalinho for a sample IBGE entry. Using this information, and information from sources like GeoNames and Mindat.org, we can upgrade a stub like Araguaiana into a more useful article like Cruzeiro do Sul, Paraná. This seems uncontroversial. The proposal is to develop a screen-scraping tool that will make it easier to copy the data into each Brazil municipality stub.
There are quite a lot of these database-type websites on different topics, displaying each entry in a highly standardized format. There is no copyright concern as long as we stick to dates, numbers etc. It would probably be very difficult to develop a generic screen-scraper that could be configured to handle them all, but it might be possible to develop reusable logic that could make it fairly simple to develop a new one. That seems to be worth discussing. Aymatth2 (talk) 19:02, 23 June 2020 (UTC)
  Not a good task for a bot. Absolutely NOT. FULL STOP. Get a broadly endorsed consensus at Village Pump, as there have been several cases (NRHP, Betacommand, etc.) of automated database dumps that have gotten editors drummed out, either in part (restrictions on creation) or fully community/Arbitration banned. While you may think this is uncontroversial, it requires a well-attended RFC to confirm the sense of the community. Hasteur (talk) 20:58, 23 June 2020 (UTC)
I see this as a screen scraping tool running under editor control rather than a bot. I have started a discussion at Wikipedia:Village pump (idea lab)/Archive 31#Database website screenscrapers. All comments welcome. Aymatth2 (talk) 23:29, 23 June 2020 (UTC)
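
To make the idea concrete, a very rough illustration of such an editor-controlled screen-scraping helper. The URL pattern and the CSS selectors below are placeholders only, not the real IBGE markup, which would have to be inspected first; and per the discussion above, nothing would be written to articles without consensus.

import requests
from bs4 import BeautifulSoup

def fetch_municipality(state_slug, town_slug):
    # Placeholder URL pattern and selectors -- not the real IBGE page structure.
    url = 'https://cidades.ibge.gov.br/brasil/%s/%s/panorama' % (state_slug, town_slug)
    soup = BeautifulSoup(requests.get(url, timeout=30).text, 'html.parser')
    data = {}
    for row in soup.select('.indicador'):
        label = row.select_one('.indicador__nome')
        value = row.select_one('.indicador__valor')
        if label and value:
            # Only bare figures and dates are kept, to stay clear of copyright concerns.
            data[label.get_text(strip=True)] = value.get_text(strip=True)
    return data

# An editor-controlled step would then merge these values into the existing stub's
# infobox and standard sentences, rather than creating anything new.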

DYKN image resize bot

Greetings. At WP:DYKN, the image size is based on the orientation of the image; vertical images at 120px, square at 140, and horizontal at 160. However, there is no way to set the resolution during nomination, which means that even experienced editors often forget to fix the size of the image, and new editors don't know that they should.

I am proposing that a bot do a daily check and update the resolution where needed. In order to cut down on the amount of resources required, it needs only look at recent additions.

It would, I'm guessing, work something like this:

  1. Generate a list of all DYK nominations added to Template talk:Did you know since the task was last run. (It can't use the nomination date because there's a 7-day window to nominate.)
  2. Determine if they contain {{main page image}}.
  3. For nominations where that template is present, determine the aspect ratio of the image.
  4. If the ratio is between 5:6 and 6:5, change the field width= from 120 to 140.
  5. If the ratio is greater than 6:5, change the field width= from 120 to 160.

Sincerely, The Squirrel Conspiracy (talk) 00:31, 7 June 2020 (UTC)
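
For whoever takes this on, a sketch of the ratio logic in steps 3-5 above; the image dimensions come from the file's imageinfo, the thresholds and the width= field are as described in the request, and how the nomination subpages are located is left out.

import re
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def width_for(filename):
    info = pywikibot.FilePage(site, 'File:' + filename).latest_file_info
    ratio = info.width / info.height
    if ratio > 6 / 5:
        return '160'        # horizontal
    if ratio >= 5 / 6:
        return '140'        # roughly square
    return '120'            # vertical: default stays

def fix_width(nomination_wikitext, filename):
    # Only bumps an untouched width=120 in the nomination's {{main page image}} call.
    return re.sub(r'(\|\s*width\s*=\s*)120\b',
                  lambda m: m.group(1) + width_for(filename),
                  nomination_wikitext, count=1)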

Since nominations can be reviewed quite quickly and moved to the Template talk:Did you know/Approved page, the bot would need to check there as well. While the main Nominations page has a "Current nominations" section comprising the current date and the previous seven days—this is updated at midnight every day—the Approved page doesn't have the equivalent section. Depending on how often it runs, the bot may need to check earlier on the page, because the dates are not when the nomination was added, but rather when work on the article began, which is supposed to be no more than seven days before nominating. (But it is sometimes a little late.) BlueMoonset (talk) 01:44, 7 June 2020 (UTC)
I wonder if it's possible to do this with a module? I'm not familiar with them, but a quick glance shows file metadata can provide height and width.[1] If it is possible to do with a module, that'd probably be better, and it would update automatically rather than having to wait for periodic bot runs. ProcrastinatingReader (talk) 02:34, 14 July 2020 (UTC)
The Squirrel Conspiracy, BlueMoonset,   Done using a template and module. See: Template:DYK image. ProcrastinatingReader (talk) 10:17, 20 July 2020 (UTC)

Convert comma separated values into List

Comma-separated values like A, B, C can instead be converted into

  • A
  • B
  • C

or

{{hlist|A|B|C}}

This is usually found in infoboxes. Additionally, values separated by a

<br/>

can also be converted into a list.

I'mFeistyIncognito 16:39, 14 June 2020 (UTC)
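
For what it's worth, the string transformation itself is small; the hard part is deciding when it is safe to apply inside a given infobox. A sketch that deliberately skips anything a naive split would break (wider context checks, such as how the infobox itself wraps the parameter, are not shown):

import re

def to_hlist(value):
    parts = [p.strip() for p in re.split(r'\s*(?:,|<br\s*/?>)\s*', value) if p.strip()]
    if len(parts) < 2:
        return value
    for p in parts:
        # A comma inside a piped link or a nested template means the split is unsafe.
        if p.count('[[') != p.count(']]') or '{{' in p or '}}' in p:
            return value
    return '{{hlist|' + '|'.join(parts) + '}}'

# to_hlist('[[A]], [[B]], [[C]]')  ->  '{{hlist|[[A]]|[[B]]|[[C]]}}'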

@I'mfeistyincognito: Is there a particular reason for doing this? It looks cosmetic to me, and per WP:FLATLIST, either style is acceptable for the MOS. Naypta ☺ | ✉ talk page | 17:25, 14 June 2020 (UTC)
@Naypta: I always try to turn the data carried by the infoboxes into a more structured form (I know it'll never get there completely). It would make it easier to export data from infoboxes into WikiData. I'mFeistyIncognito 20:01, 16 June 2020 (UTC)
This is a context-sensitive task. To give just one example, {{hlist}}, because it uses <div>...</div> tags, cannot be wrapped by any tags or templates that use <span>...</span> tags, like {{nowrap}}. If an infobox wraps a parameter with {{nowrap}}, converting that parameter's contents to use {{hlist}} will lead to invalid HTML output. – Jonesey95 (talk) 22:17, 14 June 2020 (UTC)
@Jonesey95: You are probably right. Nevertheless, using {{Comma separated entries}} shouldn't be a problem. I'mFeistyIncognito 20:13, 16 June 2020 (UTC)

DYK bot

Can someone make a bot to automatically update Wikipedia:List of Wikipedians by number of DYKs, just like Wikipedia:List of Wikipedians by featured list nominations and Wikipedia:List of Wikipedians by featured article nominations? Thanks. ~~ CAPTAIN MEDUSAtalk 18:37, 15 June 2020 (UTC)

This is something I'd be interested to work on, if it would be useful. Do you know how this list is currently updated? Pi (Talk to me!) 06:08, 20 June 2020 (UTC)
Manually, by its participants.
As an aside, two users on that list are combining totals from old and new accounts. I'm not on the list because I never bothered, but I would also be combining from two accounts. Is there a way for your proposed bot to handle this? The Squirrel Conspiracy (talk) 17:49, 21 June 2020 (UTC)
That shouldn't be a problem, I'd just have to put the list somewhere of all the accounts that needed adding up. I'll look into the feasibility of it tomorrow Pi (Talk to me!) 05:28, 22 June 2020 (UTC)

  Coding... - I'm just making the script to get the data. Once that's working I'll look at making the bot to update the table Pi (Talk to me!) 17:53, 22 June 2020 (UTC)

@CAPTAIN MEDUSA: I've made some progress with getting the list of nominations, and getting the article creators is relatively simple, but I'm not sure where to find the data for who is credited with the expansion of the article or promotion to GA. Does DYK as a process have a policy on this, and is the data recorded anywhere? Pi (Talk to me!) 22:54, 22 June 2020 (UTC)
Pi, here but you have to manually search for a user.
You can also find the user by going through nominations. It would usually say, Created, Improved to GA, Moved to main space, 5x expanded, and nominated by..... ~~ CAPTAIN MEDUSAtalk 23:32, 22 June 2020 (UTC)
This is coming along OK, I should have a prototype in a couple of days Pi (Talk to me!) 04:32, 23 June 2020 (UTC)
Category:DYK/Nominations. This category is quite useful. ~~ CAPTAIN MEDUSAtalk 12:26, 23 June 2020 (UTC)

Wikipedia:Categories for discussion/Archive debates

Let's make a bot that creates each day's page at midnight. 95.49.166.194 (talk) 13:10, 17 June 2020 (UTC)

Looks like it's mostly ProveIt who normally does this; they've previously mentioned that they have a script to do it that they then copy and paste from. I've pinged them in here - ProveIt, is this botreq something you're interested in having? Naypta ☺ | ✉ talk page | 13:38, 17 June 2020 (UTC)
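
Assuming this refers to the daily log subpages (Wikipedia:Categories for discussion/Log/...), the creation step itself is tiny; a sketch, where the preload text is a placeholder for whatever header those pages actually use, and a BRFA would still be needed:

from datetime import datetime, timezone
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
today = datetime.now(timezone.utc)
title = 'Wikipedia:Categories for discussion/Log/%d %s %d' % (
    today.year, today.strftime('%B'), today.day)

page = pywikibot.Page(site, title)
if not page.exists():
    page.text = '{{subst:CfD log day header}}'   # placeholder preload, not the real one
    page.save(summary="Create today's CfD log page")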

Remove malformed obsolete Template:Infobox drug field

Per Wikipedia talk:WikiProject Pharmacology#Molecular weights in drugboxes, I am requesting bot attention to remove the following regexp line:

/\| *molecular_weight *= *[0-9.]+ *g\/mol\n/

in articles that transclude Template:Infobox drug. There are a few rare variations that I can remove by hand or that require a manual decision on whether to remove, but this seems to be the vast majority and a conservative regex for it. This is a one-time cleanup pass that I started doing with WP:JWB before I realized it was possibly the majority of the 12K articles in that transcluders list. DMacks (talk) 19:18, 17 June 2020 (UTC)

Am I correct in that the parameter itself has not been deprecated, just the usage where a value and units are given? Primefac (talk) 19:38, 17 June 2020 (UTC)
Mostly-correct. The units should not be given with the number...that's a mistake that needs to be fixed. The majority of cases, even the number does not need to be given (it's a deprecated use-case of the field, not the field deprecated as a whole). One detail I had in my offline note and forgot to paste (yikes! sorry!) is to limit the scope to pages where there is a:
/\| *C *= *\d/
as those are pages where the value can be automatically calculated, so the field is not needed. In terms of regex, this is almost always on the line immediately preceding the /molecular_weight/ if it would be useful to have a single regex rather than "one regex to filter the pages, another to replace". Rather than simply fixing the units across-the-board, this is an opportunity to upgrade the usage wherever easily possible. There are a bunch of special cases, where the field contains other than a single number or where the number really does need to be manually specified, but I'm setting those aside for now...once the majority of mindless fixes are done, individual decisions about each remaining case can be made. DMacks (talk) 00:37, 18 June 2020 (UTC)
Might be useful to set up some tracking categories, then; those that don't need the param, and those that need the units removed. Primefac (talk) 00:40, 18 June 2020 (UTC)
Category:Chem-molar-mass both hardcoded and calculated tracks where the param is redundant (or will, as soon as the job queue catches up), so we can use that rather than looking for "transclusion of {{Infobox drug}} ∧ |C=\d". DMacks (talk) 05:10, 18 June 2020 (UTC)
...has stabilized around 5600 pages. Next step is to filter the ones whose field is malformed (mistake to fix) rather than just redundant (deprecated but valid format). DMacks (talk) 14:40, 19 June 2020 (UTC)
  Deferred I'm JWB'ing it, with a looser regex and manual oversight...manually annoying but still scratches the itch. DMacks (talk) 13:43, 21 June 2020 (UTC)

category watch and notification bot

Hi. Is there a bot which can monitor a category, such as the category that {{helpme}} requests are added to, and leave notifications of each new addition on-wiki at a specified target page, such as my personal talk page? Just checking as I am looking for something similar for AfC WikiProject, and I suspect that it might be already implemented. Prior discussion one, prior discussion two. Can look at implementing it in Python or Nodejs or Perl, but I hope that perhaps there is an existing bot for such a task. Thank you in advance for your advice. --Gryllida (talk) 05:13, 29 June 2020 (UTC)
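
The polling loop itself is small; a sketch in Python, where the category and target page are just placeholders based on this request and the snapshot file is local state between runs:

import json, pathlib
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
WATCHED = 'Category:Wikipedians looking for help'               # e.g. where {{helpme}} requests land
TARGET = 'Wikipedia talk:WikiProject Articles for creation'     # or a user talk page
STATE = pathlib.Path('seen.json')

seen = set(json.loads(STATE.read_text())) if STATE.exists() else set()
current = {p.title() for p in pywikibot.Category(site, WATCHED).members()}

new_entries = sorted(current - seen)
if new_entries:
    target = pywikibot.Page(site, TARGET)
    target.text += '\n\n== New pages in [[:%s]] ==\n%s\n~~~~\n' % (
        WATCHED, '\n'.join('* [[%s]]' % t for t in new_entries))
    target.save(summary='Category watch: %d new entries' % len(new_entries))

STATE.write_text(json.dumps(sorted(current)))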

@Gryllida: This might end up with you getting a lot of talk page spam. Worth bearing that in mind. Naypta ☺ | ✉ talk page | 09:11, 29 June 2020 (UTC)
This is for testing of the bot - the spam will eventually need to track the {{article start request}} tracking category, and go to an AfC help desk wiki page. As this template is currently not in use, testing out a bot on the existing and often-used {{helpme}} sounds easier. Gryllida (talk) 20:56, 29 June 2020 (UTC)
I made a tool that kind of does this but instead of posting to a page it emails. It monitors backlinks for a given page, generates a list of the backlinks (configurable, e.g. transclusions to userspace only), checks again the next day or however often you like, diffs the two lists and emails any differences. It can send additions, subtractions or both. It correctly compensates for the bug where, when a vandal briefly blanks a page, the backlink database takes a few days to re-register it. It is a single-file GNU awk script and will work on any unix machine, with cron. on git. -- GreenC 21:16, 29 June 2020 (UTC)

Template:Date is used directly in articles and shouldn't be

Template:Date is supposed to be used only in templates, but there are more than a few uses otherwise. A simple "subst" won't work, as a significant portion of the uses are inside <ref> tags, where substitution fails.

A bot to process these would be appreciated. --Izno (talk) 00:15, 4 July 2020 (UTC)

I am not sure that there would be consensus for such edits. I have wondered aloud on the template's talk page whether that guidance is valid. I did not get a satisfactory answer. – Jonesey95 (talk) 01:18, 4 July 2020 (UTC)
Jonesey95, agreed. I wonder whether the template might be useful for integration with Wikidata or similar applications. I'd want to see matters like that explored before mass-substituting. {{u|Sdkb}}talk 05:37, 7 July 2020 (UTC)

Automatically format TV run dates

Hey geniuses, I was looking at this version of Bigg Boss Tamil 3 and noted that

| first_aired          = 23 June 2019
| last_aired           = 6 October 2019

was problematic, because these dates should be properly formatted for Template:Infobox television. So I wondered if there was a bot that could look at these parameters, then look to see if there is one of the {{Use DMY dates}} or {{Use MDY dates}} templates on the page, and adjust accordingly, with a result of:

| first_aired          = {{Start date|df=y|2019|06|23}}
| last_aired           = {{End date|df=y|2019|10|06}}

or

| first_aired          = {{Start date|2019|06|23}}
| last_aired           = {{End date|2019|10|06}}

Depending on whatever date format it finds.

Also, could this be incorporated into an existing bot? Don't we have maintenance bots that could be looking for stuff like this?

Thanks! Cyphoidbomb (talk) 18:44, 6 July 2020 (UTC)
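
A sketch of the per-page conversion, for reference; it only touches values that parse as plain dates in a few common formats, and leaves anything else (including "present" and values already wrapped in a template) alone:

import re
from datetime import datetime

def templatize(raw, param, use_dmy):
    # Turn e.g. '23 June 2019' into '{{Start date|df=y|2019|06|23}}'.
    raw = raw.strip()
    for fmt in ('%d %B %Y', '%B %d, %Y', '%d %b %Y', '%b %d, %Y'):
        try:
            d = datetime.strptime(raw, fmt)
            break
        except ValueError:
            continue
    else:
        return None                       # not a recognisable raw date: skip
    name = 'Start date' if param == 'first_aired' else 'End date'
    df = '|df=y' if use_dmy else ''
    return '{{%s%s|%d|%02d|%02d}}' % (name, df, d.year, d.month, d.day)

def fix_page(text):
    use_dmy = bool(re.search(r'\{\{\s*Use dmy dates', text, re.I))
    def repl(m):
        new = templatize(m.group(2), m.group(1), use_dmy)
        return m.group(0) if new is None else '| %s = %s' % (m.group(1), new)
    # Values beginning with '{' (already templated) never match this pattern.
    return re.sub(r'\|\s*(first_aired|last_aired)\s*=\s*([^\n|{}]+)', repl, text)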

@Primefac: since using {{start date}} and {{end date}} is advised by Template:Infobox television, for a couple of good reasons, would there be a need for a wider discussion for consensus before filing a RFBA for this? ProcrastinatingReader (talk) 18:37, 17 July 2020 (UTC)
Not Primefac, but looks clearly uncontroversial to me.   Doing... Will file a BRFA for this soon. SD0001 (talk) 20:35, 17 July 2020 (UTC)
SD0001, I've already made up a bot for this, just not sure if it's eligible for BRFA. If you're already finished coding it as well, feel free to just file yours, since you're more familiar with the bot approval process. ProcrastinatingReader (talk) 20:41, 17 July 2020 (UTC)
@ProcrastinatingReader: Well in fact, I have only done a data collection step and saw there are about 17000 affected pages. I didn't write any code for making changes yet (and that does seem a bit tricky), so if you're done with that, please go ahead. SD0001 (talk) 20:51, 17 July 2020 (UTC)
SD0001, assuming my script is correct (it looks to be making the correct replacements, and ignoring when it can't make sense of the data) I believe I've got the changes part down. I've looked at ~75 replacements locally and it made the correct one for all.
But for the data collection, I got a result below 17k (though only accounting for first_aired and last_aired currently, and not other poorly formulated date params). Turns out, after asking in #-discovery, Special:Search doesn't allow regex lookaheads, and the engine messes up with a not after a star, so I had to use a simpler query to fetch results which are possibly outdated and then do a better check in my script with a lookahead. This does work, but it generates too many false positives initially (also, I got about 1k needing update, not 17k), which isn't a problem since I check fetches locally, but it is annoying. I wanted to use the lookahead to ignore results which aren't already using a template as the value. Curious what search query you used? ProcrastinatingReader (talk) 21:37, 17 July 2020 (UTC)
@ProcrastinatingReader: Yes, search doesn't support lookaheads and also it can display only the first 10k results (via either the UI or the API). I didn't use a search query. I used my self-written bot framework to load all the 48,000 articles with the template, parse the templates using my template parser, and pass the values in the first_aired and last_aired fields to the JS Date() function, which would be able to make sense of raw dates but not the ones within a template. And there seem to be 17,000 articles where JS Date() is able to make sense of the dates, which implies they're all raw dates that need to be templatized. SD0001 (talk) 05:25, 18 July 2020 (UTC)
If it would help, a tracking category can always be added to the infobox code. Gonnym (talk) 06:54, 18 July 2020 (UTC)
I'm not sure how extra tracking cats work, but it may be helpful? I've run a manual search along the lines suggested above and got about 20-25k across both templates for both applicable params. Quite a bit larger than Special:Search. ProcrastinatingReader (talk) 23:39, 19 July 2020 (UTC)
I would set up a category along the lines of {{#ifexpr: {{Str find|{{lc:{{{first_aired|}}}}}|start date}} > 1 | [[Category:Pages using infobox television with nonstandard dates]]}} with a similar tracker for end date as well. Primefac (talk) 00:40, 20 July 2020 (UTC)
@Primefac: I might be testing it incorrectly, but that doesn't seem to work for me. Maybe it has to do with the fact that "start date" is a template name? If I check "dtstart" which appears in the class name it does work. --Gonnym (talk) 08:48, 20 July 2020 (UTC)
I looked at some examples and think it's meant to be in the template itself? Also < 1 (since it returns -1 for no match). I added Primefac's example to sandbox here: Template:Infobox television/sandbox, and you can see it flagging here. Might need some slight tweaks; it currently flags empty strings, and ideally it should ignore some strings like "present". It can probably be made more specific using the template, but I wonder if it's better to just stash this into a module and do it more cleanly, and not have to repeat? Alternatively, a check to see if the param is not empty, and begins with either a number or a letter, excl words like "present" would probably do it (all of these should be non-standard). Good idea on the category btw, saves having to query the API for all 40k+ transclusions (and their contents) for every run. ProcrastinatingReader (talk) 16:37, 20 July 2020 (UTC)
Done using a switch, seems to work. Special:Diff/968642472. ProcrastinatingReader (talk) 16:45, 20 July 2020 (UTC)
Your example does not work. Using the valid template, the tracking category still appears. --Gonnym (talk) 16:53, 20 July 2020 (UTC)
Good catch; I only tested the ones that should flag (and the false positives) and forgot to check start date itself. {{Str find|{{lc:{{{first_aired|}}}}}|may}} seems to return "1" for example, which makes me think |first_aired= is already passed through the start date template by the time it's evaluated here. Will read through some docs. ProcrastinatingReader (talk) 17:05, 20 July 2020 (UTC)
Uhh, I have a solution, but it's disgusting. Looks like it works, though? No template: Special:Permalink/968658413. Has template: Special:Permalink/968659176. I imagine there's a far neater way to do this, though. ProcrastinatingReader (talk) 18:36, 20 July 2020 (UTC)

Yeah, you're right, Gonnym, I didn't realize that it would parse the {{start date}} template before it hit the infobox call. In that case, you'll be wanting {{#if:{{{first_aired|}}}|{{#ifexpr: {{Str find|{{{first_aired|}}}|dtstart }} < 1 | [[Category:Pages using infobox television with nonstandard dates]]}}}} and using dtend for the end date. Really nice, actually because it means that you don't have to worry about template redirects. I've tested it in the sandbox and it looks good to me, but if someone else wants to run it through the paces before we go live let me know. Primefac (talk) 21:42, 20 July 2020 (UTC)

Neater than my module idea. I tried that dtstart concept in Special:Diff/968653168 but I guess the nowiki broke it. I think it might be worth retaining the switch/adding some form of check, so 'present' (for last_aired) isn't added to the tracking cat ('present' is a valid value for that param) ProcrastinatingReader (talk) 21:53, 20 July 2020 (UTC)
Well, the easiest way would be to assume that if they're using the proper template for start they'll use the proper template for end, and only check the |first_aired param. Primefac (talk) 21:57, 20 July 2020 (UTC)
Possibly a dangerous assumption, eg could be different editors at different times who last touched either of the params. I've done a quick data collection run with a script. For {{Infobox television}}, 19399 templates with first_aired being improper, 21122 templates with last_aired being improper. For {{Infobox television series}}, 2032 and 2104. These could overlap, of course (templates with one improper param may also have a second improper param). But I think that's (at least) 1795 templates where last_aired isn't valid while first_aired is? ProcrastinatingReader (talk) 22:48, 20 July 2020 (UTC)
Fair enough. Go for it. Primefac (talk) 22:52, 20 July 2020 (UTC)
Am I being silly or is something fishy going on with last_aired? See User:ProcrastinatingReader/sandbox3 with both first_aired and last_aired, and Special:Permalink/968696281 for last_aired only. Both seem to flag up as nonstandard? This is without my edit. ProcrastinatingReader (talk) 23:09, 20 July 2020 (UTC)
Got the sandbox tracking working correctly now (I believe). --Gonnym (talk) 00:32, 21 July 2020 (UTC)
Was going to say, looks good to me... and for what it's worth, I've turned the sandbox into a [[:Category... just so it's a bit more obvious if/when it triggers. Obviously will need to have the : removed when it goes live. Primefac (talk) 00:33, 21 July 2020 (UTC)
LGTM now, as well. ProcrastinatingReader (talk) 08:53, 21 July 2020 (UTC)
If the infobox is supposed to use those two templates, then it wouldn't be controversial to enforce that. Primefac (talk) 22:21, 17 July 2020 (UTC)
Also, just to note, quite a lot of dates are not using the proper templates. See Elizabeth I for an example of the dates in Template:Infobox person not being properly used. Probably a bunch of uncontroversial cleanup which can be done by a bot here. Baby steps, I suppose. ProcrastinatingReader (talk) 00:07, 18 July 2020 (UTC)

Bypassing redirects for hatnotes and see also sections

This task might be better for semi-automated editing than a straight bot, but I'll throw it out here. I often come across hatnotes and see also sections that link to an old title for a page, e.g. this sort of fix or this one. Would it be possible to create a bot or a tool that lists or fixes instances where hatnotes or see also sections include a redirect to a page that has been moved to a new title? {{u|Sdkb}}talk 05:30, 7 July 2020 (UTC)

Probably not, unless there is some other substantive change to be made. See WP:NOTBROKEN. – Jonesey95 (talk) 15:06, 7 July 2020 (UTC)
Normally, if a page title is not broken, the page move won't succeed. If it does succeed, it's likely there's a good enough reason that it'd be worth changing the see also links and hatnotes as well. {{u|Sdkb}}talk 01:13, 10 July 2020 (UTC)
I think the point is more that if there is a link to United States Constitution there is little reason to change it to Constitution of the United States purely for the purpose of avoiding a redirect. Primefac (talk) 01:53, 10 July 2020 (UTC)
Oppose There can be reason to use a correct but alternate name in a hatnote because it is shorter, such as the example Primefac gave above. (t · c) buidhe 10:38, 10 July 2020 (UTC)

Automatically remove Commons files that are tagged as not PD-US

Although Commons is supposed to follow US law, in practice it is very difficult to get URAA-violating files (i.e. non-US media that is not public domain in the US) deleted there. In the meantime many files are tagged with {{URAA}} template, but despite this, files so marked continue to be used on enwiki despite it being against our copyright policies according to Wikipedia:Copyrights. (Earlier today I manually removed several uses of a non-URAA compliant Chinese stamp from several articles, including Joseph Stalin and Tiananmen). How easy would it be for a bot to automatically remove Commons files tagged with URAA template, similar to the bot which removes fair use images without a rationale specifically for the article it is being used in? (t · c) buidhe 06:37, 9 July 2020 (UTC)

For "URAA restored" files on Commons, the Wikimedia Foundation Board of Trustees says here "We are not recommending that community members undertake mass deletion of existing content on URAA grounds, without such actual knowledge of infringement or takedown notices." So, it may also be inappropriate to do mass removals of WP links to these Commons files. Anyway, even on a one-off basis, I do not think we should be removing links to files on Commons solely because we think the Commons hosting is wrong. The correct procedure is to raise a Commons:Deletion request. What is "very difficult"? Is it the work needed to place a request or the difficulty in getting agreement that a file should be deleted? Thincat (talk) 09:31, 19 July 2020 (UTC)
There are many Commons admins who do not agree that the files in the site must be free according to US law, despite the fact that current policies on both Commons ("If the end result of copyright evaluation is that there is significant doubt about the freedom of a file under US or local law, the file must be deleted in line with the precautionary principle.") and enwiki ("The Wikimedia Foundation is based in the United States and accordingly governed by United States copyright law.") insist on this. All that Wikimedia says is that they are not interfering in the process of deleting files, except where they receive takedown requests. (t · c) buidhe 09:45, 19 July 2020 (UTC)
I think the {{Not-PD-US-URAA}} tag on, for example, File:Chinese stamp in 1950.jpg is wrong. Even if it is valid it is not so self-evidently so as to require mass deletion of the file. I'll go to your talk page to discuss this further. Thincat (talk) 11:56, 19 July 2020 (UTC)

Name hatnotes

Request for bot for medical templates to move ICD data to Wikidata, and remove from view

Hi all, hope you are well in this crazy time period. I am seeking a bot that will:

  1. Go through all templates in this category: Category:Disease_and_disorder_templates
  2. In the Wikidata entries associated with the templates insert the relevant ICD9 and ICD10 codes
  3. Then, remove the associated data from the template header

This is per the discussion here: Wikipedia_talk:WikiProject_Medicine#Proposal_to_remove_ICD_codes_from_templates, essentially because they clutter the titles and don't help editors.

An example of this would be here:

The codes are: "C44.L40–L68/D23.L15–49, 173/216"; each is linked to a respective ICD9 and ICD10 category; Wikidata would need to be updated and then these removed from the title. We did this a few years ago within the Anatomy space; ping to Nihlus who was very helpful then. Please let me know if there's any additional information that I can provide to help. Many thanks, --Tom (LT) (talk) 23:43, 15 July 2020 (UTC)
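
A sketch of the three steps, to gauge feasibility. The Wikidata property IDs here (P494 for ICD-10, P493 for ICD-9) and the pattern used to pull codes out of the navbox title are assumptions to be confirmed, and which Wikidata item the codes should sit on is exactly the question raised in the replies below.

import re
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
repo = site.data_repository()
cat = pywikibot.Category(site, 'Category:Disease and disorder templates')
ICD10, ICD9 = 'P494', 'P493'   # assumed property IDs, to be confirmed

for navbox in cat.articles(namespaces=10):            # step 1: each template
    item = pywikibot.ItemPage.fromPage(navbox)        # Wikidata item linked to the navbox
    text = navbox.text
    # Illustrative pattern only: pull the displayed ICD-10 code out of the title line.
    m = re.search(r'\[\[ICD-10[^\]|]*\|([^\]]+)\]\]', text)
    if m:
        claim = pywikibot.Claim(repo, ICD10)           # step 2: store the code on Wikidata
        claim.setTarget(m.group(1).strip())
        item.addClaim(claim, summary='Move ICD-10 code from enwiki navbox title')
        # step 3: strip the code from the |title= line and save the template,
        # which is the part covered by the WT:MED consensus linked above.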

Tom (LT), for clarification, is the slash part of the same ICD10 code?
eg if Template:Tumors of skin appendages were being added to wikidata, would it be:
  • ICD-10 = C44.L40–L68/D23.L15–49
  • ICD-9 = 173/216
Similarly, is it the "ICD-10" property that's desired, or "ICD-10-CM" / "ICD-10-PCS"? ProcrastinatingReader (talk) 00:54, 16 July 2020 (UTC)
The slash represents two ICD ranges (C44.L40-68, and D23.L15-49), but beyond that I don't know. Ping to some editors who might though: DePiep, Tobias1984, Was a bee. --Tom (LT) (talk) 02:10, 16 July 2020 (UTC)
More importantly: as WhatamIdoing noted, the template {{Medical resources}} already reads ICDs from WD, with an option to overwrite locally (at enwiki). Why is this not considered? -DePiep (talk) 07:18, 16 July 2020 (UTC)
@DePiep correct me if I am wrong, but {{Medical resources}} is for use on articles. What I am requesting is for the relevant wikidata for TEMPLATES to be moved and the template titles stripped. How do you propose we use the medical resources template in this instance to remove the template ICD codes? --Tom (LT) (talk) 07:44, 16 July 2020 (UTC)
I still don't understand this proposal. Is it:
1. Add the codes to Wikidata (but to which WD item exactly? C44.L40–L68/D23.L15–49, 173/216 must be added to skin cancer (Q192102), right? and/or to WD-items listed in this navbox, like papillary eccrine adenoma (Q7132983)?)
2. Remove code from the enwiki template titlebar;
3. How should that WD property be used (shown) in enwiki (infobox? {{Medical resources}}), in which articles/templates? -DePiep (talk) 08:29, 16 July 2020 (UTC)
I'm thinking this: (1) Add the range codes to the relevant wikidata item for each template so they are preserved (2) remove the code, and (3) remove them from view completely. There was consensus and no objections at WP:MED when I proposed this. They don't (especially the older codes) contribute to navigation or organisation of the navboxes in any meaningful way. --Tom (LT) (talk) 08:52, 16 July 2020 (UTC)
That would be, for example, Template:Tumors of skin appendages (Q20346485) then. But I do have the impression that in WD it should be tied to the navbox topic (skin cancer (Q192102)), not the navbox template. Maybe WD people can make a suggestion re this. I'll leave it here. -DePiep (talk) 09:31, 16 July 2020 (UTC)
DePiep I forgot about the bot request process on Wikidata - thanks for reminding me, it is the appropriate venue for me to go first, which I've now done: Wikidata:Wikidata:Bot_requests#Move_ICD_codes_on_medical_navboxes_to_wikidata. --Tom (LT) (talk) 03:14, 17 July 2020 (UTC)

Please consider this request to be suspended / closed until I get the Wikidata component sorted. Many thanks --Tom (LT) (talk) 03:14, 17 July 2020 (UTC)

I think it might be easier to have one bot to do the lot, filing BRFAs both here and at Wikidata for approval to run. It's not really much more work to do both tasks with the same bot in one clean sweep. ProcrastinatingReader (talk) 12:39, 21 July 2020 (UTC)
If that is not too difficult, that does sound easier. --Tom (LT) (talk) 20:21, 21 July 2020 (UTC)

External links

(This is not a request for a bot. This is a request for a sanity check.)

WP:ELN is discussing the ==External links== section of Mary Tyler Moore. It contains (in part) this list:

* {{NYTtopic|people/m/mary_tyler_moore/}}
* {{IMDb name|1546}}
* {{tcmdb name|id=134771|name=Mary Tyler Moore}}
* {{iBDB name|023123}}
* {{findagrave|175697586}}

This is not an unusual set of links for BLP articles. Obviously, the exact list of links and the order they're presented in varies. Most of them use external link templates.

Imagine a future in which we developed a consensus that some/all of this "standard link dump" should be combined into a single template, perhaps similar to Template:Authority control. Am I correct that it would (if that magical future arrives) be a relatively simple matter for a bot to remove some of these (existing) items from this list and transform them into the new template, in at least most articles? If it's harder than it sounds, then I'd rather know that in advance. (Please ping me.) WhatamIdoing (talk) 17:48, 19 July 2020 (UTC)

I take it you're saying that you're envisioning some sort of template where when someone calls {{ELinksTemplate|Mary Tyler Moore}} it spits out the five templated links you mention? If so, I'm not sure how feasible it would be to do that, because you would need a HUGE module to account for the millions of names and links that would be required. Unless I'm mistaken on your future vision, the rest of the discussion is a rather moot point.
As I typed out the above, I thought about having this magical template be basically a wrapper for the links you mention, so you would set (for example) |imdb=1546 to have it kick out the IMDb link. I suppose that could be doable, but I don't think you'll ever get consensus to basically turn five templates into "five templates plus a wrapper template for them all". Primefac (talk) 18:26, 19 July 2020 (UTC)
This idea was discussed briefly at Template talk:Authority control in 2014. I remember a more recent discussion, but I don't recall where it happened. – Jonesey95 (talk) 18:49, 19 July 2020 (UTC)
Interesting. As mentioned in that discussion, it would be a nightmare to get consensus on what to include/exclude in such a template. Not to say it can't be done, just a little tedious. Primefac (talk) 18:53, 19 July 2020 (UTC)
I fear that this idea would have a similar problem to Authority control: no one would agree on a "standard" set of external links. For example, if a TV/film actor had a minor off-broadway role, they would be listed in iMDb and IOBDB, and both would likely be represented in Wikidata (because of course we'd use wikidata for this template). However, some editors might not want to link to the IOBDB page because it doesn't provide much more information, especially if there are already many external links. That would mean implementing overrides and having protracted discussions about what sites are suitable for general external links. --AntiCompositeNumber (talk) 18:59, 19 July 2020 (UTC)
Primefac, what I want is for the bot to take that list and turn it into something like {{new thing |NYTtopic=people/m/mary_tyler_moore/ |IMDb name=1546 |tcmdb name=134771 |iBDB name=023123 |findagrave=175697586}} and have the template display the same links more compactly. WhatamIdoing (talk) 22:18, 19 July 2020 (UTC)
I can't see why that'd be useful, personally. That template would call these under the hood, so the only thing that eliminates is writing out the bullets. imo this single template idea would only make sense if the data was to be sourced from Wikidata, perhaps some kind of {{links|imdb|tcmmb|ibd}} which sources the info from Wikidata? ProcrastinatingReader (talk) 22:36, 19 July 2020 (UTC)
I think having it display in a standardized, compact format, similar to Template:Medical resources or Template:Authority control would be beneficial. My question for this group is whether it's feasible to have the bot convert the articles, given that not all articles will use the same templates, place them in the same order, etc. WhatamIdoing (talk) 22:54, 19 July 2020 (UTC)
Technically speaking, sure. It's possible to parse the vast majority, yes, despite those display differences. The order doesn't really cause issues with parsing, neither does them being bullets or newlines or something else. Displaying them again might need more design thought if those differences are to be retained whilst using a wrapper template. ProcrastinatingReader (talk) 23:48, 19 July 2020 (UTC)
Thanks, I appreciate all the responses and the time people took to understand my question. There might (someday, not soon, possibly a couple of months from now) be a request for a bot to do this. WhatamIdoing (talk) 21:48, 4 August 2020 (UTC)
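
On the feasibility question only, a sketch of the conversion pass, where "new thing" is the hypothetical wrapper template from the comment above; named parameters such as |id= in {{tcmdb name}} and the leftover bullet markers would need extra handling.

import mwparserfromhell

WRAPPED = {'nyttopic', 'imdb name', 'tcmdb name', 'ibdb name', 'findagrave'}

def fold_external_links(section_text):
    code = mwparserfromhell.parse(section_text)
    collected = []
    for tpl in code.filter_templates():
        name = str(tpl.name).strip()
        if name.lower() in WRAPPED and tpl.params:
            # First parameter only; templates whose id is in |id= etc. need more care.
            collected.append((name, str(tpl.params[0].value).strip()))
            code.remove(tpl)
    if not collected:
        return section_text
    wrapper = '{{new thing ' + ' '.join('|%s=%s' % (n, v) for n, v in collected) + '}}'
    # Leftover "* " bullets from the removed lines would still need cleaning up.
    return str(code).rstrip() + '\n' + wrapper + '\n'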

Redirects from Townname to Townname, Statename [US]

Look at the two most recent redirects that I have created. There were communities at the name with the state disambiguator, but the base name was a redlink. Is there any way to do what I just did for every article that is in the form of "[anything], [state/province/country]" with a corresponding redlink? It would mostly need to run only once, but it could run again for a minor update every 3 months or so. HotdogPi 11:20, 20 July 2020 (UTC)

I'm currently working through bad links of this type, e.g. to Villa Park where Villa Park, California was intended. I'm not finding many redlinks. I am finding plenty of duplicate names. For example, one bad link, despite mentioning California nearby, actually related to Villa Park, Illinois. Certes (talk) 11:34, 20 July 2020 (UTC)
When I say "redlink", I just mean that there's nothing there. The two I created were found by typing in the URL bar, not by clicking a link. HotdogPi 12:33, 20 July 2020 (UTC)
I mean the same. I'm creating lists of targets by removing ", State" from the article titles. Most such abbreviated titles are redirects to the correct city (no action needed) or disambiguation pages (incoming links will be caught and fixed elsewhere). Several are articles on another topic (or a primary redirect thereto), and I'm fixing links to such pages. Very few are redlinks, and those which are may be duplicated in other states or countries. Certes (talk) 12:57, 20 July 2020 (UTC)
I can see this being messy if done by bot. Who gets priority in the case of naming conflicts? Just the first one to be processed by the bot? ProcrastinatingReader (talk) 11:38, 20 July 2020 (UTC)
@HotdogPi: I've worked through redlinks for cities in U.S. states A–G as a by-product of my other work. I've created one dab (Cherokee Village), one redirect (McRae–Helena, a new city) and have two outstanding where I'm undecided between those approaches: Greers Ferry (Greers Ferry, Arkansas is unique but has a dam and a lake) and Clarkedale (Clarkedale, Arkansas is unique but the reader may want a Clarkdale or a Clarksdale). I've skipped places without city status, but some of these "cities" have only a few hundred residents. I'm also checking for clashes with place names beyond the U.S., as many American cities are named after places in Britain and elsewhere. Although I'm using semi-automated tools, I doubt that we could specify this task tightly enough to deploy a bot. Certes (talk) 15:50, 22 July 2020 (UTC)
@HotdogPi: Here is a full list of pages matching "$foo,_$state" where "$foo" does not exist. Some of these obviously should not be redirected because they're titles of works or whatever (I didn't feel like adding a category filter). I would say that this is   Not a good task for a bot. As Certes noted, these decisions would rely heavily on context. Hopefully the list will be helpful to you though. --AntiCompositeNumber (talk) 18:44, 22 July 2020 (UTC)

Amalthea (bot)

Since this bot is down, there is a request to replace one of its functions at Wikipedia:Bots/Requests for approval/ProcBot 3. However, there is a second task it does: updating Wikipedia:Sockpuppet investigations/Cases/Overview. We may need someone to create a bot to fill that function. Ping Amalthea, ProcrastinatingReader and Xaosflux --- C&C (Coffeeandcrumbs) 14:17, 22 July 2020 (UTC)

Doesn't DeltaQuadBot have similar functionality?  Majavah talk · edits 14:21, 22 July 2020 (UTC)
Looks like it's already running anyway, at User:AmandaNP/SPI case list. Should be possible to just swap it out with the existing one not being updated? Ping @AmandaNP to see if she's okay with that / has any comments? ProcrastinatingReader (talk) 09:38, 23 July 2020 (UTC)
It should already be transcluding to the main SPI page. My bot has always been the backup until the other one is alive + I use it for my personal formatting. I see no need to deviate from that. -- Amanda (aka DQ) 20:47, 23 July 2020 (UTC)

Land use of the municipalities in Switzerland

The links to the sources for the land use of the municipalities (within the geography section) in Switzerland point to a web page that is no longer available (e.g. Bulle), and only some of them have been linked to the Wayback Machine. Can someone link the rest of them to the Wayback Machine?--Horizon Sunset (talk) 17:44, 22 July 2020 (UTC)

May wish to post this at WP:URLREQ. ProcrastinatingReader (talk) 10:16, 23 July 2020 (UTC)

Article History template script

Hi all, I was wondering if anyone was interested in developing a script for talk pages to automatically roll templates like DYK, GA and PR into an {{ArticleHistory}} format? I occasionally Wikignome, and it occurs to me such a script would likely be very useful for myself and many other editors, by automating a fairly time-consuming manual process. The benefits will be more readable and organised talk pages, as well as a more comprehensive history for some articles. What a noble goal! --Tom (LT) (talk) 06:44, 27 July 2020 (UTC)

I posted at village pump earlier and didn't get any responses, so I assume such a script doesn't exist, therefore I thought I'd ask here :). --Tom (LT) (talk) 06:44, 27 July 2020 (UTC)

The MilHistBot has this capability. It normally does so when articles are promoted to A-class or FAC. Hawkeye7 (discuss) 18:37, 27 July 2020 (UTC)
@Hawkeye7 that's great! Is there a way to get it as a user script? --Tom (LT) (talk) 04:50, 28 July 2020 (UTC)
Unfortunately, it would have to be rewritten. As far as I know, scripts have to be written in JavaScript. Hawkeye7 (discuss) 10:06, 28 July 2020 (UTC)
I have written code that can parse {{Article History}}. You may use the code to modify {{Article History}}, and it is in JavaScript. But I think this request could be executed automatically? --Kanashimi (talk) 11:37, 28 July 2020 (UTC)
Thanks Kanashimi! I don't understand the second half of what you have said though. How will the script happen automatically? (Is it possible that I can choose for it to execute, like with most scripts via a button added to the "More" menu?) --Tom (LT) (talk) 00:52, 29 July 2020 (UTC)
For example, we may use a bot to merge the DYK, GA and PR marks into {{Article History}}. But I don't know if there is such a mark... --Kanashimi (talk) 08:47, 29 July 2020 (UTC)
I see Kanashimi. But how can I do that using a script that I can initiate? --Tom (LT) (talk) 00:53, 2 August 2020 (UTC)
Are there some sample edits for DYK, GA and PR, so we can know more clearly what to do? --Kanashimi (talk) 01:08, 2 August 2020 (UTC)

Fix talk page to mainspace redirects

Query 46704 lists all of the talk namespace redirects that point to an article. Usually, if "A" redirects to "B" and "Talk:A" is also a redirect, then "Talk:A" should redirect to "Talk:B", not "B".

So, I think that we should have a bot that lists all of the talk page to mainspace redirects on a single page (perhaps a user subpage for the bot, or a "database reports subpage"). After that, the bot will find all of the redirects that do not include a slash (slashes indicate subpages), and fix them to point to the talk page of the mainspace target instead. If "Talk:A" happens to redirect to "A", then the (admin)bot would delete "Talk:A" because otherwise, it would redirect to itself. There are currently 1292 talk namespace redirects that point to articles (plus possibly some more due to a database replication lag). GeoffreyT2000 (talk) 20:46, 30 July 2020 (UTC)
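
A sketch of the per-page fix, with the Quarry result set as the input list; the self-redirect case is only reported, since deletion would need an admin.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def fix_talk_redirect(talk_title):
    talk = pywikibot.Page(site, talk_title)
    if '/' in talk.title(with_ns=False):
        return                                    # subpages are excluded
    target = talk.getRedirectTarget()
    if target.namespace() != 0:
        return                                    # only handle talk -> article cases
    if target.title() == talk.title(with_ns=False):
        print('Needs admin attention (would self-redirect):', talk_title)
        return
    talk.text = '#REDIRECT [[%s]]' % target.toggleTalkPage().title()
    talk.save(summary='Retarget talk-page redirect to the talk page of its subject')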

Excluding subpages gets us down to just about 1,000 pages: https://quarry.wmflabs.org/query/47087. There's a few that shouldn't be redirects, and a few where fixing it naively would cause a double redirect. I don't think any of these pages should be deleted, since they're likely to have some sort of history. All in all, it wouldn't be a very difficult bot task. It may be worth putting in a warn edit filter for top-level ns1 pages being redirected to a page in a non-talk namespace. --AntiCompositeNumber (talk) 22:18, 30 July 2020 (UTC)

WikiProject redirects

A common mistake is to type "Wikiproject" instead of "WikiProject" to get to pages like Template:WikiProject Physics or Wikipedia:WikiProject Physics. So a bot that would automatically create those redirects would be really useful.

This should only be the base pages, not the subpages like Wikipedia:WikiProject Physics/Quality Control. Headbomb {t · c · p · b} 16:51, 31 July 2020 (UTC)
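
The creation pass itself would be small; a sketch covering both namespaces and base pages only (whether it should run at all is of course the question discussed in the replies below):

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

for ns in (4, 10):                                 # Wikipedia: and Template: namespaces
    for page in site.allpages(prefix='WikiProject ', namespace=ns, filterredir=False):
        title = page.title(with_ns=False)
        if '/' in title:
            continue                               # base pages only, no subpages
        redirect = pywikibot.Page(site, '%s:Wikiproject %s' % (
            site.namespace(ns), title[len('WikiProject '):]))
        if not redirect.exists():
            redirect.text = '#REDIRECT [[%s]] {{R from miscapitalisation}}' % page.title()
            redirect.save(summary='Redirect from the common "Wikiproject" miscapitalisation')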

There was a deliberate action some time long ago to remove WikiProject template redirects to make it easier to maintain them. I am not entirely certain that part of this request would have consensus. --Izno (talk) 17:05, 31 July 2020 (UTC)
The redirects that people got rid of were those that were very weird/non-standard ("WikiProject Phys"). This would be a systematic creation for very common typos very often made by newbies. Headbomb {t · c · p · b} 17:10, 31 July 2020 (UTC)
I'd be opposed to this template redirect creation as I find it useless (and template redirects always have a hidden downside later on). The templates are used exactly once per page. It's ok if it takes you 2 seconds more to type in the correct "P". --Gonnym (talk) 17:20, 31 July 2020 (UTC)
The point is newbies don't know that and make that mistake often. WP:CHEAP applies here. There is no downside to those redirects, and many upsides. Headbomb {t · c · p · b} 17:21, 31 July 2020 (UTC)
Newbies, and others, should be directed to User:Evad37/rater if you find them having problems with these templates. --Izno (talk) 18:00, 31 July 2020 (UTC)
Newbies at AFC should not be directed to scripts. Headbomb {t · c · p · b} 18:11, 31 July 2020 (UTC)
Newbies at AFC should also not be directed to add rubbish to talk pages.
Using a bot to create redirects for variant capitalisation will not help much when a given miscapitalisation is rarely used a second time. Look through the history of Wikipedia:Database reports/Broken WikiProject templates beginning at this revision to see the sheer variety that the newbies come up with. The last column of the report tells you how many instances existed at the time that the report ran: it's rarely above 1. --Redrose64 🌹 (talk) 18:42, 31 July 2020 (UTC)
So? That's where WP:CHEAP applies. This doesn't fix every "mistake" someone can make, but it fixes a good bunch of them. Headbomb {t · c · p · b} 19:12, 31 July 2020 (UTC)

I want a bot

I want a bot to do all of my editing. It is hard to do editing. It may help with deleting pages if you want to. Having a bot also puts less stress on editing. Was an explorer —Preceding undated comment added 14:21, 4 August 2020

I don't think you really understand what bots are used for. They can assist with editing for tedious and/or repetitive tasks, but they won't "do your editing". Primefac (talk) 15:48, 4 August 2020 (UTC)