Wikipedia:Bots/Requests for approval/AWMBot: Difference between revisions

**::::The problem is that it adds a 2nd request per edit and I'm not sure I want to increase that. [[User:BJackJS|<span style="background-color: coral; color: black">BJackJS</span>]] [[User talk:BJackJS|talk]] 03:48, 2 December 2020 (UTC)
**:::::I don't think it's much of a problem. Your only per-edit request currently is the edit one, so adding a data fetch for each, whilst not ideal, is not a dealbreaker; it's certainly better than making bad edits. That being said, you can just make one request for all your pages if you want. Loop through your pages, fetch your redirect and generate the predicted archive page title as you do, add it into an array, then make a single [[mw:API:Query]] with all the page names stuffed in as "titles". You know it's a valid title if the resulting request (when plucked for page titles) contains the title. If so, make the edit. [https://en.wikipedia.org/w/api.php?action=query&format=json&titles=Main%20Page%7CAbdbdbabdsadsad%7CAotoasd%7CSandbox Sample] [[User:ProcrastinatingReader|ProcrastinatingReader]] ([[User talk:ProcrastinatingReader|talk]]) 13:17, 2 December 2020 (UTC)
**:::::: Another idea is to merge that with completely removing the template, then re-adding it with the correct arguments. If the page isn't valid, it wouldn't make the edit. The goal of the bot is to whittle it down to around 50 pages that can't be repaired by the bot, so humans can do them. I'll work on getting that implemented. [[User:BJackJS|<span style="background-color: coral; color: black">BJackJS</span>]] [[User talk:BJackJS|talk]] 18:09, 2 December 2020 (UTC)