Why was this removed? [1] Also, if you are going to run a bot, you should have a way to communicate and turn it off on each project you run it on. Jokestress (talk) 01:54, 1 February 2009 (UTC)[reply]
Please make a clear link on your bot page and bot's talk page indicating that users can leave a message here if there are problems. I do not like to leave messages on other Wikimedia projects, so there should be clear instructions on how to leave them here as well as on the German one. Thanks. Jokestress (talk) 07:55, 12 February 2009 (UTC)[reply]
I thought it would be enough to put the bot's template on the page, which also contains the operator's contact address. Isn't it? The bot's discussion page also contains a further link to my home wiki. --Xqt (talk) 08:11, 12 February 2009 (UTC)[reply]
Hi, this is just informational, because I doubt you could have anticipated it, but [2] shows your bot making a bad change on behalf of a subtle vandal who had redirected the original target (Tard, Hungary). I've requested permanent protection of Tard, but I'd also like you to consider a blacklist of terms that your bot shouldn't edit, to stop things like this. In any case, please try not to make these sorts of bad edits unintentionally.
I would also join previous posters to this page in asking that you not point your bot's talk page to de.wiki when it should point here instead. I understand that you don't want to miss talk messages, but communications with an en.wiki bot's operator should take place here. Thanks in advance for your consideration of both points. — Gavia immer (talk) 19:12, 28 February 2009 (UTC)[reply]
I checked the bot's edit and I agree it was not a good idea to change the redirect, but the bot is not able to detect vandalism. Tard, Hungary was overwritten with a redirect yesterday at 2:40, but the RC patrol did not identify it. At 10 o'clock Special:DoubleRedirects was updated, and my bot began fixing the entries four hours later. I am sorry for the mistaken edit, but there is no possibility for a bot acting in the standard manner to detect vandalism or to judge whether a link is right or not. On the other hand, it wasn't detected by human beings for a long time either. You can prevent a page from being changed by a bot, but I don't think this would make sense. My bot fixes double redirects on more than 1000 pages a day on this Wikipedia. I don't think it's a good idea to stop it because potential vandalism wasn't detected in time. For a bot it is normal to correct pages that have been changed by vandalism. Btw, I agree to communicate here as well as on de-wiki. If I don't answer, please give me a hint there, because it is not possible to keep an eye on all my SUL talk pages (perhaps I'll write a bot to solve this). Sorry if my English is not so good, and thanks for your messages, which give me hints for improving my bot --Xqt (talk) 15:49, 1 March 2009 (UTC)[reply]
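For readers unfamiliar with the task: the core of a double-redirect fix is only a few lines. Here is a minimal sketch using the modern Pywikibot API (the 2009-era pywikipedia framework used different module names); note that it contains exactly the limitation discussed above, trusting whatever the intermediate redirect currently says:

```python
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def fix_double_redirect(title):
    """Retarget a redirect that points at another redirect."""
    page = pywikibot.Page(site, title)
    if not page.isRedirectPage():
        return
    middle = page.getRedirectTarget()
    if not middle.isRedirectPage():
        return  # single redirect, nothing to fix
    final = middle.getRedirectTarget()
    # No sanity check on 'final': if the middle redirect was vandalized,
    # the bot faithfully propagates the bad target.
    page.set_redirect_target(
        final, summary=f'Bot: Fixing double redirect to [[{final.title()}]]')

fix_double_redirect('Tard')  # hypothetical invocation
```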
Double redirects change
Hi, I notice you run a bot that fixes double redirects. You might be interested in participating in the thread at WP:Village pump (proposals)#Double redirects, which discusses the possibility of having certain double redirects left unfixed. If we adopt that solution, I would be interested to know how such redirects might be marked so that bots know to leave them alone.--Kotniski (talk) 10:10, 8 March 2009 (UTC)[reply]
The simplest way is to mark such articles with a {{nobots}} template. But I don't see the point, because my redirect bot only fixes redirects that point to other redirect pages, which is usually the result of moving an article. It doesn't change any links in articles that point to a redirect page. --Xqt (talk) 17:01, 8 March 2009 (UTC)[reply]
It seems {{nobots}} doesn't work with standard pywikipedia bots. I've changed this in my bot, and I'll look for other solutions and comment on the given discussion page. --Xqt (talk) 22:05, 10 March 2009 (UTC)[reply]
I noticed that this bot removed a language link to a redirect; I corrected that link here. I wonder if you could make a change that improves links rather than removing them. cygnis insignis 14:45, 14 April 2009 (UTC)[reply]
Thanks for fixing the iw-link. The target page had been moved and the bot couldn't find it, because no site pointed to the new one. But it would have been fixed a few steps later, when a bot checked the iw-links of the sv-wiki. I am working on a feature to run my bot specifically on moved pages. That would improve these edits. Regards. --Xqt (talk) 07:19, 16 April 2009 (UTC)[reply]
Is this bot approved?
I couldn't find any link that proves that this bot is approved. Is it? I noticed for example that it changes Image->File, but I think there was a discussion about that and there was no consensus for that action. -- Magioladitis (talk) 23:03, 19 April 2009 (UTC)[reply]
In doing "cosmetic changes", Xqbot created a syntax error in Balochistan by inserting a space between the asterisk and number sign (*#) used to created an ordered (numbered) list within an unordered list. This wiki markup only works if the symbols are adjacent, with no whitespace. Please fix this in your bot.
I suspect the bot's problem is that it incorrectly assumes that it can safely insert a space after an asterisk if it's the first character in a line (i.e., markup), or possibly after the last asterisk in a series (allowing for indented lists). I commend the effort, as we really should have spaces separate markup from text wherever possible because it's easier for newbies to understand what's happening. But you should account for all nest-able markup, which includes asterisks (bullet list), number signs (numbered list), colons (indentation), and semi-colons (definition lists, which are often used for non-TOC headings). I believe a substitution pattern replacing "^([*#:;]+) *(.)" with "$1 $2" should do the trick. Thank you for your attention. ~ Jeff Q (talk) 18:18, 21 April 2009 (UTC)[reply]
I've switched this feature off for the en-wiki as requested above but you are right. I'll fix it for the others. Thanks! --Xqt (talk) 15:27, 22 April 2009 (UTC)[reply]
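For reference, Jeff Q's substitution is easy to try out in Python; a small self-contained check (the sample input is invented):

```python
import re

# Insert one space after the *whole* run of list/indent markup characters,
# never inside the run, so nested markup such as "*#" stays intact.
pattern = re.compile(r'^([*#:;]+) *(.)', flags=re.MULTILINE)

sample = "*#first numbered item\n*#second item\n;term\n:definition"
print(pattern.sub(r'\1 \2', sample))
# *# first numbered item
# *# second item
# ; term
# : definition
```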
Two days ago I created the page Husch. I included a link to [[de:Hüsch]]. Take a look at the English and German pages: they are obviously appropriate for linking.
Yesterday your bot removed [[de:Hüsch]] from the article. It did not even give any justification - the edit summary was just (robot Removing: de:Hüsch). I find this very irritating - it's a time-consuming process to go through looking for links to Wikipedias in other languages, and if I put it back I could quite easily find that your bot removes it again.
I'm sure you constructed and run the bot in good faith, but I wonder how many hours of people's good work your bot is undoing in this manner.
de:Hüsch is a disambiguation page, but {{surname}} is not listed at MediaWiki:Disambiguationspage. This forces iw-bots to remove links between different page types. My bot will not touch this page for a while, but you may solve this by adding the template to the MediaWiki page, or by excluding bots from the given page by marking it with the {{nobots}} template. --Xqt (talk) 17:41, 14 May 2009 (UTC)[reply]
I know about this behavior, and I made two feature requests to the pywikipedia framework to solve it. I've also found a request there to change MediaWiki:Disambiguationspage, prompted by a similar bot edit from another bot owner. In the meantime my bot will not return to change this, and I'll put this page on an exception list until the behavior is changed globally. On the other hand, it would be a good idea to use {{disambig}} instead of {{surname}}, because this really seems to be a disambiguation page and not an article. Btw: please see the version history of the given MediaWiki page. The {{surname}} template was removed from that page in February, and this led to the new behavior of iw-bots. --Xqt (talk) 08:32, 15 May 2009 (UTC)[reply]
Please see MOS:DABNAME which clearly states, "Pages only listing persons with a certain given name or surname (unless they are very frequently referred to by that name alone) are not disambiguation pages, and this Manual of Style does not apply to them. In such cases, do not use {{disambig}} or {{hndis}}, but {{given name}} or {{surname}} instead." If I were to take your advice and use {{disambig}} I am sure another editor would correct my mistake.
It seems that German WP regards pages of people with the same surname as disambiguation pages, whereas English WP does not. Your bot wrongly assumes agreement on this – an invalid assumption on which to operate. This logical contradiction needs a radical solution which is beyond the realms of a humble editor like me. Are you able to get something done to correct this? -- Hebrides (talk) 16:04, 15 May 2009 (UTC)[reply]
I just had to revert another ten links vandalised by this bot. Please suspend its action until it respects the protocol described in the paragraph above. I'm getting desperate! I don't want to spend the rest of my life reverting this bot's changes... -- Hebrides (talk) 19:16, 17 May 2009 (UTC)[reply]
The problem is that you are using the -force parameter at the same time as -autonomous, which is something you are not supposed to do because it breaks on disambiguation pages, as you have seen. It is a common error; as long as you stop using the two parameters together, you will stop having this problem. -Djsasso (talk) 03:49, 30 May 2009 (UTC)[reply]
Yes, you are right. I've changed my bot's behaviour now and have been testing it for about two weeks. It will skip pages if it finds a disambig mismatch. Can someone unblock my bot, so it can start working again? That would be very nice. --Xqt (talk) 19:39, 3 July 2009 (UTC)[reply]
I am wondering about that. The purpose of the bot is described at User:Xqbot's page. It is the normal function of an interwiki and redirect bot, which comes from the pywikipedia framework. --Xqt (talk) 17:27, 14 May 2009 (UTC)[reply]
Hello, thanks for your contributions to the Afrikaans Wikipedia. It seems your bot is programmed to organize interwiki links alphabetically. While this may be acceptable on the English Wikipedia, this is not the policy of the Afrikaans Wikipedia. Please reprogramme your bot to organize interwiki links according to the alphabetical order of the language name of the Wikipedia, and not according to the link name of the Wikipedia. For reference, see how your bot moves the Finnish language between the F-languages at [6], while the language name is Suomi and belongs between the S-names. Thanks for your co-operation. — Adriaan (T★C) 18:14, 6 June 2009 (UTC)[reply]
Hi Adriaan, thanks for this report. I will report it to the framework to fix the sorting of iw-links for the bots too. --Xqt (talk) 16:17, 9 June 2009 (UTC)[reply]
Thanks for the reply. The Afrikaans Wikipedia doesn't have a custom policy about this, so if this is the case as you explained it, then it is in fact correct. I was mistaken in thinking that the default policy would be to sort the iw-links not alphabetically by language code, but alphabetically by language name or a transliterated form of the language name. But that was an incorrect assumption. Thanks for the reply and sorry for the trouble. — Adriaan (T★C) 10:45, 10 June 2009 (UTC)[reply]
Misspelling redirect. I thought this was a bot edit, but see it's yours. For genera names, or taxon names, two names may be very close in spelling but not be the same. Pelagorhynchus appears to be a dinoflagellate. Pelargorhynchus is an extinct fish. Particularly when dealing with marine single-celled organisms, the genus may be so obscure to appear to be a misspelling. It still could be, but in the absence of a source for it, and a top note for the redirect, I'd rather the redirect simply be deleted. --69.226.103.13 (talk) 09:02, 3 July 2009 (UTC)[reply]
Well I found a broken link and tried to repair it. But if you prefer to delete the redirect, it is ok to do so. --Xqt (talk) 19:31, 3 July 2009 (UTC)[reply]
It's probably a broken link because the dinoflagellate genus was deleted; however, this doesn't mean it's a misspelling to the extinct fish. It's a tricky edit, though, so no fault on your part, just a suggestion that with genera names misspellings may require more looking. --69.226.103.13 (talk)
fr:Princesse royale was marked as a disambig, but it is not. You are right, but the bot didn't know about this. I've removed the disambig template on the French site and let the bot correct the rest. --Xqt (talk) 05:04, 6 July 2009 (UTC)[reply]
I have stopped the bot from working for the last few days because of a problem with the order of the links, and will change the scripts before making it work again. When I restart it I will let you know; please let me know if there are any problems. I am planning to make it work only once all the scripts are updated to the latest version. Many thanks. Ghaly (talk) 12:47, 25 July 2009 (UTC)[reply]
No, it doesn't work properly. Changing the iw-link from ar-wiki is quite right, but the interwiki sorting is wrong; you can see here that mhr: appears at the top of the list, but it must not. Is your local version up to date? mhr-wiki was added to the family a few days ago.
Modifying the ar-wiki links was OK, and I was surprised that your bot was one of the first to solve this correctly. That wiki just created a new extension namespace for year articles, and on de-wiki an abuse filter was in place until I helped some bot owners change the crossnamespace table.
What you mentioned about the sorting of the codes is the reason I stopped GhalyBot from working on 23 July 2009.
Since then I have updated the pywikipedia files on my computer and ran a test on .af, and now the sorting order is correct, so I let it update more pages.
If your bot speaks Python, did you change the crossnamespace table? That seems right, because you recognize the year extension namespace on ar-wiki, which is quite fine. But on the other hand there was a sorting problem with iw links to mhr-wiki. I looked through the last edits, but those pages don't contain an mhr link. You should test your bot on pages which contain such a link, like 1962 or something else. But I am puzzled by this, because my bot is also Python and it could correct your bot's missorting [7]. Good luck --Xqt (talk) 06:09, 28 July 2009 (UTC)[reply]
GhalyBot has been running on an updated version since 26 July, and I think this has sorted out the order of links now. Many thanks. --Ghaly (talk) 06:13, 28 July 2009 (UTC)[reply]
I have updated the software and GhalyBot has been running on an updated version since 26 July, but now, because of what you told me on the arz.wikipedia page, I have stopped the bot from working until I further update the software. I will let you know when I manage to update its software again. So, if you don't mind, I can run a trial, but I am not going to make it work before the new updates. Thanks. Ghaly (talk) 08:56, 15 September 2009 (UTC)[reply]
You have to keep your bot up to date. This means more often than monthly, because there is frequently a new software version, a bugfix, a new translation and localisation, or new Wikipedia families. I strongly recommend doing this. You can get the current version from the toolserver here. Your bot is blocked on pdc- and de-wiki until 17th September; enough time to bring it up to date. I trust that you will keep your bot in order in the future. Thanks for your kind attention. --Xqt (talk) 12:55, 15 September 2009 (UTC)[reply]
Your edits on the Yiddish Wikipedia
Your bot has been making wholesale unexplained deletions of interwiki links under the guise of Cosmetic changes. Please revert these deletions. --Redaktor (talk) 22:54, 3 August 2009 (UTC)[reply]
IIRC, some bots respect reverts: When you revert them, they don't re-revert. Can Xqbot do that too, please? There are legitimate reasons for double redirects, and there seems to have been a majority for allowing longer redirect chains at Wikipedia:Village_pump_(proposals)/Archive_44#Double_redirects, and if your bot doesn't have the human intelligence to recognize them, it should leave the judgment to humans, and not edit war with them, as it did here. — Sebastian 23:24, 15 August 2009 (UTC)[reply]
Hi Sebastian, I think you were right to block the bot in this case. That was the second-best choice for this problem. A better option would be the ability to tell the bot what it should do with this page, as you did for human editors (here: leave it unchanged). I placed a {{nobots}} there to do this. This bot uses a standard script from the Python Wikipedia Robot Framework, as others do, and its behavior is unchanged. But a better solution would be to mark the middleman of a double redirect and to change the bots' behavior to leave the middleman's references unchanged. As I am in the process of becoming a developer of this framework, I could help to solve this problem in another way. In conclusion, it is necessary to give bots a hint about what to do with such double redirects, as you did for the human ones. BTW: there is no script in the PWRF which recognizes its own reverts. It is a good suggestion, but it would degrade performance a lot. It is better to improve the bot to prevent malfunctions.
Thank you for your friendly reply to my somewhat breathless post! I see your performance argument, but checking the history is something we humans have to do, too, and it hampers our performance as well. Maybe there is a way to make life easier both for humans and bots? (Any solution would probably be well beyond the scope of this bot; e.g. de:WP:Sichten comes to mind. But maybe you have a better idea.)
Thank you also for placing {{nobots}} on the page in question; I wasn't aware of this. It would be better if the bot told us of this option in the edit summary. However, this is not in general a good solution; imagine what you would say if we had a human user who busily reverts edits of others, and if you want him to respect those edits, you would have to speak in Chinese with him - in each single case! I quickly looked at the last 20 edits the bot did, and found that seven of these are potentially the same situation as described at the village pump. I don't know what the conclusion of that discussion was, but I'm only mentioning this to show that the problem may go far beyond this one single redirect.
A simpler and more effective solution may be more specific to this bot: This is one of a class of bots that never should go back to an article it has edited before, and does not do urgent edits. I don't know how the bot proceeds from article to article, but off hand I could imagine two solutions: If it is possible to go by creation date of the redirect, then the bot can start with the oldest, and it would be guaranteed that it would never revisit a page. If that isn't possible, then maybe it could work off a list that is created once every week or so, which would be created by taking into account a "master list" of articles already visited. To better compare the two lists, they could be sorted alphabetically, which would also have the added benefit that we humans get a feel for what the bot is up to. — Sebastian 15:50, 16 August 2009 (UTC)[reply]
That same page now got changed again by another bot. So, obviously, {{nobots}} doesn't even work as its name promises. That makes it even less of a solution than I said above. Please, therefore, take my concerns and proposals above seriously. — Sebastian 14:32, 24 August 2009 (UTC)[reply]
Hi Sebastian, this is a good example of why your proposal won't work. I've discussed it with other bot owners and developers. In general there are several bots doing these jobs, but there is no way to disable an edit except by looking for {{nobots}} (and the bot has to be exclusion compliant and up to date) or by giving the bots another syntax to handle that case. I changed my bot to respect {{nobots}} on 10th March to prevent the given problem, and the bugfix for the PWRF came on 15th March with release 6508. Keeping the bot up to date is strongly recommended. Exclusion compliance is described at WP:Bots as well. In this special case, there is no difference between a redirect which links to a target containing an article and, on the other hand, a redirect which belongs to another lemma, which could itself be a redirect or an article but whose link should be fixed. There is a magic word __STATICREDIRECT__, but it is not supported by any bot. A possible solution for the latter would be to mark these redirects with a template like {{softredirect}}. This would work because normal redirect bots wouldn't see these pages as redirects and wouldn't try to fix them. Such a soft redirect could also link to a redirect which leads to the article. This doesn't differ significantly from a double redirect, because MediaWiki only resolves one hop. --Xqt (talk) 14:03, 28 August 2009 (UTC)[reply]
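For reference, exclusion compliance boils down to a text check before saving. A simplified sketch of the {{bots}}/{{nobots}} convention described at WP:Bots (the regexes here are deliberately rough assumptions, not the full rules; current Pywikibot exposes this check as Page.botMayEdit()):

```python
import re

def bot_may_edit(page_text: str, bot_name: str) -> bool:
    """Rough {{bots}}/{{nobots}} exclusion check (simplified)."""
    if re.search(r'\{\{nobots\}\}', page_text, re.IGNORECASE):
        return False
    # {{bots|deny=BotA,BotB}} denies only the listed bots (or 'all')
    deny = re.search(r'\{\{bots\|deny=([^}]*)\}\}', page_text, re.IGNORECASE)
    if deny:
        denied = {name.strip() for name in deny.group(1).split(',')}
        return bot_name not in denied and 'all' not in denied
    return True

# An exclusion-compliant bot calls this before every save:
# if not bot_may_edit(page.text, 'Xqbot'): skip the page.
```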
Can some of the bot's work be undone without going through each one?
The editor who made the move didn't trouble to fix any of the redirects, though. Accordingly, they all became double redirects. The bot jumped in and "fixed" them. As a result, they now reach a dead end. For example, a reader who enters Social Security Act of 1935 used to get to the relevant section of Social Security (United States), but now reaches a redirect page.
I don't want to have to go through and fix all of these by hand. Is there a way to automate it?
I wonder if the bot prefers working on articles that have been most recently changed, which would also explain the revert war I mentioned above. — Sebastian 06:06, 22 August 2009 (UTC)[reply]
Struck out above; that was an edit conflict. Xqt, if you're there, could you please reply to what I wrote above? Thank you! — Sebastian 06:12, 22 August 2009 (UTC)[reply]
The other bot handled some of the problems, but quite a few were left. On closer examination, though, I realized that this was because the editor who moved the article also edited the "ssusa" template to direct to the new title, so it showed up as being linked in all the articles that used the template. I edited the template and that seems to have solved the problems that the bot didn't fix. Thanks for your help! JamesMLane t c 03:05, 24 August 2009 (UTC)[reply]
Sorry, but the bot corrected the interwiki links to the given articles as they are written in the foreign languages. For the spelling you have written, there exist no articles on the other wikis. I've reverted your edit so the iw links work again. --Xqt (talk) 06:23, 31 August 2009 (UTC)[reply]
I see what happened now. Sorry about that. Technically speaking, the bot is correct. However, the correct spelling of Never Shout Never is not NeverShoutNever! and it is incorrect on those select pages in other languages. Could you correct it on the Español, Norsk (bokmål), and Português pages? Thanks! --Russ is the sex (talk) 13:46, 31 August 2009 (UTC)[reply]
I think it's up to the local community to change the title. For es-wiki, IMHO it's OK to move the page, but my normal local user account is not autoconfirmed, so I couldn't do this. Try explaining your request on those local talk pages to get the title changed. Regards --Xqt (talk) 06:24, 1 September 2009 (UTC)[reply]
It is not a bot war [9]. My bot just removed a non-existent link, but SieBot had found the right link to that wiki first. Maybe the right link was changed manually elsewhere. --Xqt (talk) 23:08, 3 October 2009 (UTC)[reply]
Merde
A vandal redirected hip hop to shit, and Xqbot dutifully fixed many of the double-redirects, even though the redirect was only in place for 42 minutes. Although I may agree with the sentiment, it was difficult for me to find all the redirects. Any suggestions? — Arthur Rubin (talk) 18:28, 9 October 2009 (UTC)[reply]
Increasing the time offset would not solve this problem. I have seen this behaviour twice in one year, and it was never reported to the robot framework. This mis-edit is annoying, but it might not be a big problem. I will change the bot scripts when I get the chance, to detect some cases of vandalism. Thanks for your request --Xqt (talk) 17:21, 12 October 2009 (UTC)[reply]
May I suggest that the edit summary might be a good way to reduce this? "Fixing double redirect hip hop to shit" would be easy enough to search for. Rich Farmbrough, 19:00, 15 October 2009 (UTC).[reply]
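Rich's idea amounts to putting the final target into the summary so that mass mis-fixes become searchable and revertable. A sketch of such a summary builder (a hypothetical helper; Xqt describes implementing something similar in release 7499 further down this page):

```python
def redirect_fix_summary(source_title: str, final_target: str) -> str:
    """Build an edit summary naming both ends of the fix, so that after
    vandalism like 'hip hop' -> 'shit' every bad bot edit can be found
    by searching the bot's contributions for the bogus target."""
    return (f'Bot: Fixing double redirect from [[{source_title}]] '
            f'to [[{final_target}]]')

print(redirect_fix_summary('Hip hop music', 'Shit'))
```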
Hi, your bot just 'fixed' a double redirect - effectively reverting my edit from Heme C to Heme c. However, the former name (my change) is actually correct, given the context of the target article. I've undone its change. Is there any reason why this change took place please? Brammers (talk) 15:02, 29 October 2009 (UTC)[reply]
Addendum: please could you clarify if your bot is approved? I've only been able to find its "Request expired" page. Looking at your talk page, it seems to be misbehaving quite a bit. Brammers (talk) 15:06, 29 October 2009 (UTC)[reply]
Sorry, I don't understand your point. If Heme C is the right name, you should move the article to that title. This bot only fixes double redirects, like some others do, because MediaWiki doesn't resolve them yet. These pages are listed at Special:DoubleRedirects. Btw, my bot has a global bot flag. --Xqt (talk) 05:44, 30 October 2009 (UTC)[reply]
Please accept my apologies for being a little terse; my understanding of the double-redirect was mistaken. All sorted now. Best wishes, Brammers (talk) 14:53, 7 November 2009 (UTC)[reply]
This behavior is the result of a namespace mismatch. I don't want to allow all of these crossnamespace links, so I blocked the page for bots instead --Xqt (talk) 13:49, 11 November 2009 (UTC)[reply]
Please try to understand that the different language Wikipedias have their own individual ways of organizing their articles. German Wikipedia does not keep its articles on the different language Wikipedias in the article namespace; German Wikipedia has a special namespace for this here. Thus you cannot remove such valid language links from English Wikipedia articles. If, for instance, you create a German article on the Norwegian Wikipedia and place that article in the German article namespace, what will happen is that a German admin will come along, take that article out of the article namespace, and place it into the appropriate namespace on German Wikipedia. So the language links are correct. This is not a typical "namespace mismatch". Please adjust your bot accordingly and refrain from removing valid language links from English Wikipedia articles. Thank you. Amsaim (talk) 20:48, 3 December 2009 (UTC)[reply]
I've fixed the crossnamespace on my local copy now. After testing the behavior, I will upload the new revision to the SVN repository for the other bots too. I hope my bot gets unblocked for doing this work --Xqt (talk) 17:31, 7 December 2009 (UTC)[reply]
Recently a hyphen was added in Rosie Malek-Yonan's last name when spelled in Farsi. Though her last name does have a hyphen in English, when translated into Farsi, there should be no hyphen. I've removed it a few times but it appears again. Can you please help remove it. Thanks! Zayya 17:51, 4 December 2009 (UTC)
I've blocked Xqbot from editing; it has not gone through a request for approval and is editing outside the scope allowed for global bots. Per WP:GLOBALBOTS, "The English Wikipedia allows the use of Global bots to update interwiki links . . . Use of global bots for any other purpose is not currently permitted." — QTC 11:06, 5 December 2009 (UTC)[reply]
Sorry, I do not agree. You cannot change the rules and then hang the delinquent. When I started fixing double redirects in January 2009, this had long been allowed for global bots [10]. It was never revoked by the global bot policy, and I found no restriction on that policy in local policy [11] at the time I started this task. Moreover, I informed the community about my bot's tasks on its user page as well as at Wikipedia:Registered bots #Other registered bots. So I cannot understand why my bot was blocked after this long time, when its maintenance was well known. Anyway, I have switched off fixing double redirects on en-wiki, and it would be very helpful to unblock it now so it can fix interwiki links again. Thanks a lot --Xqt (talk) 17:25, 7 December 2009 (UTC)[reply]
English Wikipedia opted in to global bots, not the global bot policy. Global bot policy does not overrule local policy. As agreed, ENWP bot policy would only allow global bots to do interwiki work. When the global policy expanded to include double redirects, that did not grant the authority to do so on English Wikipedia, as ENWP does not agree to the global policy, but allows global bots under local restrictions. While this might have been ambiguous for a period of time since adoption, the Global rights page was updated to explicitly state that it is against policy to perform tasks outside of the original consensus of interwiki-only. The fact that it was running did not constitute approval that it was running within policy. It only came to my attention after it 'fixed' some malicious redirects. You're more than welcome to do double redirect fixing, but only after gaining approval for this task. I've gone ahead and unblocked the account so it can continue to do interwiki work only; as mentioned, any other tasks will require approval. QTC 01:26, 8 December 2009 (UTC)[reply]
I respect this policy without any doubt. But when I came here, I also had no doubt that fixing redirects was allowed. I felt that blocking the bot, instead of informing me about this misunderstanding, was an overreaction. I check all talk pages every day and I am reachable at #pywikipediabot. Anyway, it's quite OK for me. Thanks for unblocking, and best regards --Xqt (talk) 07:30, 8 December 2009 (UTC)[reply]
Oh, I see. I'm sorry, then. A suggestion: perhaps the bot should check if any part of the page name (a "part" being a word separated by a space from the rest of the name) exists in the second redirected page. This should be true 100% of the time (real double redirects will always be fixed), although it would not always prevent vandalism like this. Perhaps you have another idea which might make it prevent more vandalism. The problem with this kind of vandalism, combined with bots, is that when it gets "fixed", it seems to be a mistake rather than vandalism.
Thanks for your suggestion. Maybe it will work. I am currently experimenting, for a future release, with detecting whether a broken redirect or redirect loop can be fixed or should be speedily deleted. Something similar could be used to find potentially vandalized pages and write them to a service page for checking. By the way, my bot has a delay time, to give the RC patrol a chance to detect such vandalism. And as described at #Merde above, I changed the bot's behaviour in release 7499 to write the redirect target into the comment line. This helps to revert such derived links to their origin. --Xqt (talk) 18:03, 11 December 2009 (UTC)[reply]
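The anonymous editor's heuristic - only trust a redirect whose target shares a word with the source title - is easy to sketch. A rough illustration in Python (the stop-word set and tokenization are assumptions, and as noted above it would not catch every case):

```python
def plausible_redirect(source_title: str, target_title: str) -> bool:
    """Heuristic: a legitimate rename usually keeps at least one word.

    'Social Security Act of 1935' -> 'Social Security (United States)'
    shares words, while 'Hip hop music' -> 'Shit' shares none and
    would be flagged for human review instead of being auto-fixed.
    """
    stop = {'of', 'the', 'a', 'an', 'in', 'and'}  # assumed stop words
    src = {w.strip('()').lower() for w in source_title.split()} - stop
    tgt = {w.strip('()').lower() for w in target_title.split()} - stop
    return bool(src & tgt)

print(plausible_redirect('Social Security Act of 1935',
                         'Social Security (United States)'))  # True
print(plausible_redirect('Hip hop music', 'Shit'))             # False
```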
The whole thing is not plausible. There are iw links from a portal namespace to templates and normal articles. This doesn't make any sense at all. The bot just tried to unlink the crossed namespaces --Xqt (talk) 12:40, 20 December 2009 (UTC)[reply]
It's not a bot malfunction; it is the normal way interwiki links work. I've changed the source pages on fr-wiki so that the new links lead to the corresponding targets. --Xqt (talk) 11:08, 26 December 2009 (UTC)[reply]
Welcome to Wikipedia. Although everyone is welcome to contribute to Wikipedia, at least one of your recent edits, such as the one you made to List of Wikipedias, did not appear to be constructive and has been reverted. Please use the sandbox for any test edits you would like to make, and read the welcome page to learn more about contributing constructively to this encyclopedia. Thank you. Amsaim (talk) 10:05, 13 January 2010 (UTC)[reply]
Did not help
This edit by the bot didn't help. The previous edit was pretty obviously some click vandalism and should just have been removed (since done). Can the bot be improved to detect this and do the best thing? --J Clear (talk) 03:22, 14 January 2010 (UTC)[reply]
This does not seem very likely. I've seen an iw bot do this before, so if you can figure out what combination of factors is causing this, please leave a note. —TheDJ (talk • contribs) 03:15, 19 January 2010 (UTC)[reply]
Bot Propagated Redirect vandalism.
Just a heads up, your bot propagated some vandalism. (Or maybe just a new user's test edit?)
When this edit was made, your bot not only obscured it by making another redirect, but it propagated the change.
Oh. I see now, from reading more of your talk page that you've already discussed and implemented changes that should prevent this from occurring in the future. APL (talk) 16:53, 19 January 2010 (UTC)[reply]
... Do you have a mechanism for quickly fixing this sort of error in the future? If a vandal were to temporarily redirect a page that was the target of many redirects, your bot might go on a spree. That would be tedious to fix manually. APL (talk) 17:06, 19 January 2010 (UTC)[reply]
I am honored. Oh, btw, I am not a bot but a human :D I sent greetings to my bot and he responded 1111010001010010. Greetings to you both --Xqt (talk) 20:56, 10 February 2010 (UTC)[reply]
A tag has been placed on Steffen Mueller, requesting that it be speedily deleted from Wikipedia. This has been done for the following reason:
not notable
You may wish to consider using a Wizard to help you create articles. See the Article Wizard.
Thank you.
Under the criteria for speedy deletion, articles that do not meet basic Wikipedia criteria may be deleted at any time. Please see the guidelines for what is generally accepted as an appropriate article, and if you can indicate why the subject of this article is appropriate, you may contest the tagging. To do this, add {{hangon}} on the top of the page and leave a note on the article's talk page explaining your position. Please do not remove the speedy deletion tag yourself, but don't hesitate to add information to the article that would confirm its subject's notability under the guidelines.
Xqbot removing interwiki link to renamed page without adding replacement
I noticed here that Xqbot removed an interwiki link to an en.wikipedia that had been renamed and deleted, but did not replace/update the link. It was easy enough to add the links manually, but I thought perhaps you might be able to use the information. Cheers, -- Black Falcon (talk) 22:54, 21 February 2010 (UTC)[reply]
As categories cannot be moved, I have to find some regularity in such cases before implementing any new behavior. But this is a first hint; thank you for it. Last week, in r7936, I implemented a new function that enables bots to follow {{category redirect}}s as well as ordinary redirects. And I am working on a new module which can detect backlinks to a given page. This will help to fix the new link as fast as possible. Xqt (talk) 08:09, 22 February 2010 (UTC)[reply]
I'm glad if I've been any help. One more thought I had is that the deletion summary could possibly be useful, as the new category name is often linked in the summary; I don't know, however, whether there is sufficient regularity in this practice for a bot to be able to make use of it. Thank you for your response and all your work, -- Black Falcon (talk) 08:39, 22 February 2010 (UTC)[reply]
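For illustration, the backlink detection Xqt mentions is a one-call affair in the current Pywikibot API (shown as a sketch; the r7936-era module predates this interface, and the category title below is made up):

```python
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
old = pywikibot.Page(site, 'Category:Renamed example category')  # hypothetical

# Pages that still reference the old name; redirects included but not followed
for referrer in old.backlinks(follow_redirects=False, total=50):
    print(referrer.title())
```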
Someone once confused two separate blackface minstrels who were both living and working in 19th c. America. One man was called Charles White: the other was called John Hodges, but acted under the stagename Cool White. Somehow the two got confused and so a page was created for a non-existent Charles "Cool" White. To try and sort it out I made a page for each man separately, and then made a redirect from the stagename "Cool" White to the real person, John Hodges. But somehow the bot has reinstated the 'Charles "Cool" White' page. RLamb (talk) 23:06, 26 February 2010 (UTC)[reply]
Yes, I made that page before I discovered redirects. How can we stop Wikipedia from responding to a search for 'Charles "Cool" White' when there is no such person? Or, since it redirects to the right person, does it not matter? RLamb (talk) 13:54, 27 February 2010 (UTC)[reply]
bot mindlessly processing double redirects
The situation regarding "Taqi al-Din" was satisfactory until yesterday. Then editor Jagged 85 carried out an ill-judged move of it, to "Taqi al-Din (disambiguation)". I protested, at User talk:Jagged 85#Taqi al-Din, explaining that, amongst other things, he had left a whole lot of redirects in a wrong state. Now your bot has "fixed" them by making it worse. Wikipedia work is hard enough without well-intentioned messing-up like this. SamuelTheGhost (talk) 22:19, 1 March 2010 (UTC)[reply]
Removing double redirects is actually harmful to the Wikipedia. The policy was originally that they should be removed; the current policy is that it's unnecessary, as it does not improve performance in any significant way. There's actually a policy that you should not do things for efficiency reasons; that's the website's problem. In some common situations removing links actually damages the Wikipedia. I want this ill-conceived bot function shut down. - Wolfkeeper 15:14, 4 March 2010 (UTC)[reply]
Double redirects are a problem and do need to be fixed. This bot seems to be doing a good job with them. If the editors above have suggestions, they should be advised that being constructive would be more effective. — Martin (MSGJ · talk) 10:34, 16 March 2010 (UTC)[reply]
Double redirects are not a problem, merely an irritant. Wrong links are a problem, and there are a lot of them, sometimes caused by vandalism, more often by editor carelessness. When the links are right this bot removes the irritant. When the links are wrong this bot compounds the problem. It should only be used with great care. SamuelTheGhost (talk) 08:59, 7 June 2010 (UTC)[reply]
All transliterations of Choy Li Fut are causing moving/redirect problems.
This is a great page here which gives credit to many of the branches of Choy Li Fut - a style of martial arts. The only reason why anyone would want to change the name of this page is for political reasons. Let's keep politics and passions out of this wiki page and concentrate on the content. I believe that the page should be left as Cai Li Fo. I will add both Mandarin and Cantonese to the top description.

Okay, let me explain the linguistic problems. First, Chinese as a spoken language is tonal, not phonetic like most western languages such as our English. There are basically 4 tones in Mandarin, 7 in Cantonese. The system of writing Chinese words in English is called pinyin. For example, "choy" can be written as "Choi" or "Tsoi", etc. That is because it is difficult to write tones and refined sounds into letters. To try to standardize the English writing of Chinese words and to take pronunciation into consideration, pinyin standards such as Gwoyeu Romatzyh of 1928, Latinxua Sin Wenz of 1931, Wade-Giles (1859; modified 1892), zhuyin, etc. were created over history to address these problems. Each of them had differing standards. The official 2009 national standardized pinyin of China is called Hanyu Pinyin.

There are 107+ known spoken dialects in China. In Cantonese alone, you have dialects such as Toi-san, Sam-yup, Sei-yup, Gok-gong, Hakka, etc. Each will pronounce "Choy Li Fut" slightly differently; thus the transliteration into English, depending on what pinyin you used and when it was used, will create differences in the English spelling. An example would be the word "Chi". If you use the Chinese Postal Romanization, you can write it as chi, ch'i, and hsi (pinyin ji, qi, and xi), represented as either tsi, tsi, and si or ki, ki, and hi depending on historic pronunciation, etc. The official Chinese Hanyu Pinyin of 2009 romanized spelling of Chi is Qi, whether you like it or not, whether you are from the South or the North. So arguing whether Choy Li Fut should be written as Choi Lei Fut or Tsoi Lee Fot is ridiculous and a waste of time. If you wish to conform to the most popular Southern Cantonese standard for the name, then "Choy Li Fut" would be the one.

Another problem: to make Choy Li Fut a widely known martial art in China, and to standardize its name, you have to use Mandarin. To unify the country as a whole and remove the dialect issues, the government of China made Mandarin the official language of China. Since Hong Kong is now part of China again, Mandarin is now the official language in Hong Kong, even though people still speak their dialects. Even with written and spoken Mandarin, Taiwan uses the older written language while mainland China uses a simplified version. Most people born and educated before WWII in China and Japan can read the old style of writing as well as the newer simplified form. To deal with this issue, I will mention both names at the top of the article.

To deal with transliterations of the romanized spellings: when a wiki user types in any transliteration of Choy Li Fut - Cai Li Fo, Choi Lei Fut, whatever - the wiki has been set up to automagically send them to this page. The Xqbot is causing problems with all the transliterations of Choy Li Fut. All the various pinyin spellings should point to Cai Li Fo as the official name and page. Huo Xin (talk) 20:43, 22 April 2010 (UTC)[reply]
I don't see the problem. As Cai li fo had been moved, Xqbot fixed the double redirects to the new target after a delay. The delay is there to prevent fixing double redirects during move wars. You moved that page back to its original title; this is no problem for the bot, which would fix the redirects again after the given delay. But in this case you were faster doing it by hand. -Xqt (talk) 06:11, 23 April 2010 (UTC)[reply]
Please refrain from removing valid language links from this article. The links you removed are valid, and they lead to the corresponding language entry on "List of Wikipedias" in other language Wikipedias. Check the discussion at the articles talk page. Thank you. Amsaim (talk) 22:28, 5 May 2010 (UTC)[reply]
It looks like a number of the links that were added were erroneous, as well, pointing to other languages' articles on the Laotian Wikipedia, rather than to their list of Wikipedias; e.g., fr:Wikipédia_en_lao. But yes, as Amsaim says, List of Wikipedias is tricky as far as interwikis are concerned, because different encyclopedias might have a list article, a list in project space, or both, and consensus on enwiki is to link all of them. This shouldn't be changed by a bot. — Gavia immer (talk)00:43, 6 May 2010 (UTC)[reply]
This comes from a faulty namespace on hy-wiki, because the namespace delimiter there is not a ":" but a Unicode letter. This leads to this malfunction; normally the bot avoids the project namespace. I'll try to fix it manually -Xqt (talk) 07:51, 6 May 2010 (UTC)[reply]
I implemented some checking procedures for other vandalism types in the past, and after analyzing this new type I am sure I will find a solution for it. Give me a bit of time to implement it. -Xqt (talk) 04:47, 31 May 2010 (UTC)[reply]
over-enthusiastic bot
This is not the first time I have wished this bot did not exist (see above). It is really disconcerting when doing slightly complicated reorganisations of articles, as I have been doing, when this bot crashes in and fixes double redirects which I was about to deal with anyway, but only after case-by-case examination of links which used the redirection. Could I suggest that at least you equip the bot with a time-limit, so that it doesn't act on a double redirect until the situation is, for example, three days old, rather than doing it less than 30 minutes later, as happened to me just now. SamuelTheGhost (talk) 21:59, 6 June 2010 (UTC)[reply]
Bot Issue Involving Xqbot & A New Proposed Bot Configuration Tip
Some pages and categories of ml-wiki were not reachable for some hours, for humans and bots alike, and MediaWiki replied with "page not found". This caused the bot to remove the links. -Xqt (talk) 17:05, 11 April 2010 (UTC)[reply]
Huh. Well, it seems to me like the wiser decision might be to skip ml.wp, or have your bot operate from ml.wp, removing only articles that are deleted on that wiki. If I hadn't checked, how long do you think it would have been before a different bot re-added ml.wp? —Justin (koavf)❤T☮C☺M☯ 18:47, 11 April 2010 (UTC)[reply]
A wise decision, if one could have known. The lost pages might have something to do with the MediaWiki version upgrade from 1.16alpha-wmf to 1.16wmf4 and maybe some changes to namespace aliases. Anyway, I've started my bot fixing iw links for that site (and other bots did that too), and all links should be restored now. -Xqt (talk) 16:22, 12 April 2010 (UTC)[reply]
Thanks. This is a well-known bug in the interwiki bot, and nobody has an idea how to fix it. It is due to the required transformation from Latin to Cyrillic letters. I've fixed the iw link now.
Yet again this bot has caused me annoyance, where this attempt to "fix" a double redirect was simultaneous with my actual fixing of it, so timed as to cause an edit conflict.
The reason I've been having problems with this bot is the type of work I've been doing recently, namely sorting out disambiguations of Arabic names. There is often a fairly common name (for example Abdur Rashid, as in this case) which occurs in several variant spellings (for example Abdul Rashid). Often some form of the name has arbitrarily been used in unqualified form as the name of the article about one of its bearers. So I've been moving such articles to better qualified names (for example Abdul Rashid (Chief Justice) in this case), updating all the correct inward links (while leaving alone the incorrect ones of which there are often many), then turning one of the pages with the plain name into a disambiguation page, and redirecting the other forms of the name to it. Between my doing the move and creating the dab page there is a short time lag where there may be a double redirect situation. It is not at all helpful if this bot "fixes" this situation at this time, since what it does is wrong, and it serves only to confuse and irritate.
I will repeat my suggestion given above, that this bot should be modified so that it doesn't touch double redirects until the situation is, say, at least three days old. That would give me (and others) enough time to sort out any necessary restructuring. SamuelTheGhost (talk) 09:31, 20 June 2010 (UTC)[reply]
I am not the bot runner, but I disagree. It should not be taking a human editor anywhere near three days to do a move; typing a redirect takes less than a minute, and even with some other intervening cleanup edits, manual moves-and-cleanup should not take more than ten minutes, and usually less than five. If anything, I often wish this bot acted faster, so that we wouldn't have so many double redirects all over the place. —Lowellian (reply) 23:15, 5 July 2010 (UTC)[reply]
With respect, you obviously have never attempted anything on the scale of the restructurings I've been doing. I've been doing moves involving fifty to a hundred "intervening cleanup edits", each of which I have checked for appropriateness. That takes rather more than ten minutes, and is tedious enough that one needs to take breaks. SamuelTheGhost (talk) 17:09, 15 July 2010 (UTC)[reply]
That assumption about me is not correct. I have made tens of thousands of edits in my years on Wikipedia, including some large and complex move jobs with intervening cleanup edits on that scale in the past, and I have never had any problems with this bot acting too quickly. Those intervening cleanup edits might take more than ten minutes, but this bot works on a far slower scale than that; it can often take hours to fix redirects. —Lowellian (reply) 08:57, 8 August 2010 (UTC)[reply]
Redirect confusion of Henry Nicholas for Henry Nicholis
I notice that several pages that were meant to redirect to Henry Nicholis were incorrectly "corrected" by the bot in order to clean up a double redirect. (See Heinrich Nicklaes, Heinrich Niclaes and Hendrik Niclaes.) I've tried to undo all those false redirects, but I'm not sure how to locate where the double redirect came from, and I can't tell from the revision history of the "Nicholis" page if it was simply a problem of an incorrect rename of the article. Can you help me figure out if the problem is fixed? Or is there another article awaiting a chance to trick the bot? Thanks, Aristophanes68 (talk) 21:23, 10 August 2010 (UTC)[reply]
Actually it's not our responsibility; it's that of the bot runner to make sure there are no bugs. Your cosmetic changes have been making errors on simple.wikipedia. This is one. It moves the default sort and the stub out of their correct positions and places them up with the external links. Please stop using cosmetic changes on simple.wikipedia until this is fixed, or your bot may be blocked. -DJSasso (talk) 10:35, 8 September 2010 (UTC)[reply]
This affects not only cosmetic changes but also all bots dealing with categories, because of the non-standard placement of that {{stub}} template (see textlib.py for further information). I do not think this is a reason for blocking a bot. Anyway, I've deactivated my cosmetic changes on simple-wiki. I guess we have other problems to fix first, haven't we? ;) Greetings Xqt (talk) 07:48, 8 October 2010 (UTC)[reply]
Thanks for this hint. The misleading links were introduced by a user, not a bot. I fixed the remaining pages on the other sites. Xqt (talk) 18:47, 1 September 2010 (UTC)[reply]
Yet again
Once more this stupid bot is interfering with my work. I moved Abd al-Qadir al-Jaza'iri less than an hour ago and was in the middle of sorting out the redirects, and this stupid bot cuts across what I'm doing and makes a whole lot of wrong edits. When will this nonsense stop? SamuelTheGhost (talk) 21:27, 2 September 2010 (UTC)[reply]
20:32, 2 September 2010 (diff | hist) N Abd al-Qadir (moved Abd al-Qadir to Abd al-Qadir al-Jaza'iri: let bare name be disambiguated)
This automatically created a redirect from Abd al-Qadir to Abd al-Qadir al-Jaza'iri.
Then I made eight straightforward corrections of double redirects, of which the following is typical:
20:37, 2 September 2010 (diff | hist) Emir Abdel-Kader Al-Jazairi (←Redirected page to Abd al-Qadir al-Jaza'iri) (top)
There remained a dozen or so double redirects of other spellings of Abd al-Qadir, which had in total around 100 uses in articles. These were uses of variant spellings of Abd al-Qadir, currently referring to Abd al-Qadir al-Jaza'iri. I wished to point these variant spellings to the disambiguation page, so I went through the 100 or so articles which used them, changing their links to go directly to Abd al-Qadir al-Jaza'iri. I checked each carefully as I went, since in a few cases the link was wrong. Less than an hour into this process your bot "fixed" the double redirects.
I left the redirection of Abd al-Qadir itself until after I had finished, as this made it easier for me to see what I was doing, and so that anyone accessing the articles in the interim would in fact find the links to be as correct as possible, even if double. I'm sure you understand that carefully checking over 100 links and changing them took 14 hours (spread over two days). SamuelTheGhost (talk) 12:11, 6 September 2010 (UTC)[reply]
I just moved Abd el-Krim as part of a re-structuring exercise, and two and a half hours later you crash in with your stupid fucking bot. You don't have the grace to discuss this with me properly, nor the sense to support my work. I see you as about as much help as a vandal. SamuelTheGhost (talk) 18:37, 6 October 2010 (UTC)[reply]
Samuel, after moving a page you may replace the remaining redirect with an article or a disambig page. This would prevent bots from fixing redirects to the new target. If you want to keep that redirect without its incoming redirects being fixed, just add __STATICREDIRECT__ to its content. I've changed the behavior of redirect bots especially for this request, to keep redirects pointing at a static redirect unchanged. This gives you enough time to check every link. When you are done, you should remove the magic word. I guess this is what you had in mind and the best solution for your work. Regards Xqt (talk) 20:04, 6 October 2010 (UTC)[reply]
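The skip logic Xqt describes is a short check; a sketch in current Pywikibot terms (illustrative only, not the bot's actual code):

```python
import pywikibot

def should_fix(redirect: pywikibot.Page) -> bool:
    """Fix a double redirect only if its target is not a static redirect."""
    if not redirect.isRedirectPage():
        return False
    middle = redirect.getRedirectTarget()
    if not middle.isRedirectPage():
        return False  # not a double redirect
    if '__STATICREDIRECT__' in middle.text:
        return False  # an editor is mid-restructuring; leave everything alone
    return True
```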
You redirected Warren Woods High School (my new article) to Warren Woods Tower High School, a pre-existing article. The 2 schools are NOT the same. The former (WWHS) predated WWTower HS (see WWHS text). I have undone your redirect in that it is not accurate.
Hi, when I move a navbox template, the source code "Name" parameter does not change, so the v-d-e box ({{navbox}}) still links to the old name. Is that easy to detect, to notify about, and maybe even to change by bot? -DePiep (talk) 22:24, 22 September 2010 (UTC)[reply]
I've got my wires crossed somewhere; could you explain? {{navbox}} isn't moved. Did you change the "Name" assignment and it didn't change on the pages? Then wait for some hours: these changes are recorded in a queue and applied step by step. There is also a possibility to speed this up; if it's urgent, please call me again (via IRC if you like) Xqt (talk) 14:51, 23 September 2010 (UTC)[reply]
You described the problem correctly (it's about the Name parameter), and for me it doesn't need a speedup. Just didn't know it was automated already. -DePiep (talk) 11:17, 27 September 2010 (UTC)[reply]
Not your fault, but...
After someone vandalized the feminism page by redirecting it to misandry, your Xqbot dutifully did its job and changed every redirect to feminism to misandry instead. Took me some time to clean up the extensive mess. Again, not your fault at all, but thought you should know. Maybe there's a way to throttle the bot so it waits some number of hours or days before fixing double redirects, to avoid turning a minor vandalism into a major headache. --Jayron32 06:42, 30 September 2010 (UTC)[reply]
I am not sure whether this was vandalism. I guess he meant this seriously. Anyway I am working on a solution for these cases. Xqt (talk) 13:42, 5 October 2010 (UTC)[reply]
The bot appears to have removed the link to the Chinese WP for this article. Is there a reason this was done? I'm relatively new to linking to foreign-language WPs, and I know there are some issues with firewalls and censorship in China. Please enlighten me if there is a legit reason for this edit. Mercurywoodrose (talk) 19:42, 8 October 2010 (UTC)[reply]
You can see the reason here, and I've fixed it. Unfortunately, interwiki bots are not able to transliterate between different forms of Chinese characters the way MediaWiki does. The right approach is to copy the page name from zh-wiki, and then it should work. Xqt (talk) 23:43, 8 October 2010 (UTC)[reply]
Thanks: specific interwiki links bots documentation
Thanks for your prompt reply.
I would like to find documentation about the specific bots dealing with interwiki links that help in translation efforts. The link you provided is about writing bots in Python; I cannot find specific documentation on interwiki bots (instances of the general concept) dealing with link checking and completeness, which would be useful during translation work.
--Pastore Italy (talk) 12:45, 20 October 2010 (UTC)[reply]
Surely this wrong iw-link exists on another site too. This should prevent its return for now. I'll investigate the wrong link later. Xqt (talk) 05:59, 27 October 2010 (UTC)[reply]
Jentadueto redirected to Linagliptin/metformin, which is tagged "with possibilities". If Linagliptin/metformin is changed into a standalone page, Jentadueto should still redirect to it. The bot changed Jentadueto to redirect to the page that Linagliptin/metformin redirects to, even though it is tagged "with possibilities".
I noticed that you had changed "Khuzestan" into "Arabistan" in referring to an article on the Arabic Wikipedia. The illegal action (we are not at liberty to change internationally-recognized names as we please) on the Arabic Wikipedia conforms with what some Arabs seem to be doing all the time, naming the entire world Arabistan, a manifestation of an acute form of inferiority complex from which this lot is clearly suffering. For historical details and the relevant historical maps (even Arabic maps) concerning Khuzestan, please consult this note. We do not need to follow uncritically some Arab chauvinists/illiterates here on the English Wikipedia. --BF 00:07, 5 December 2010 (UTC)[reply]
Recently, an anonymous vandal redirected a page with a degrading name (Whore of Babylon) to a different page (Lady Gaga). This redirect lasted for 2 hours and 19 minutes; during this time, your bot (Xqbot) "fixed" redirects to it. To prevent such vandalism, I think your bot should only assume a redirect to be valid (and therefore fix redirects to it) if either the redirecting was done by a registered user, or it had lasted 24 hours. עוד מישהו Od Mishehu 13:28, 6 December 2010 (UTC)[reply]
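Od Mishehu's rule is straightforward to express against page history. A sketch using the modern Pywikibot API (the 24-hour threshold is the one proposed above; anonymity is detected via the IP-user flag):

```python
from datetime import timedelta

import pywikibot

def redirect_is_trusted(page: pywikibot.Page) -> bool:
    """Trust a redirect only if its last editor is registered,
    or the revision has stood for at least 24 hours."""
    rev = page.latest_revision
    author = pywikibot.User(page.site, rev.user)
    if not author.isAnonymous():
        return True
    age = pywikibot.Timestamp.utcnow() - rev.timestamp  # both naive UTC
    return age >= timedelta(hours=24)
```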
Hello, I would like to ask for your help. I saw the comment of Parrot of Doom (delete a load of nonsense, and possibly a copyvio) and I would appreciate it if someone could explain to me why it is nonsense to show everyone the 20 persons who studied, researched and wrote about the lost city of Helike in the past. This information was published in several Greek magazines and in one of the most reliable magazines for archaeologists (Archaeology magazine, issue No. 9, November 1983, wrote: "In the Corinthian Gulf and in the area of Aegeion the Greek diver-explorer Alexis Papadopoulos has discovered a sunken town. It lies at a depth of 25m-45m with exhibits walls, fallen roofs, discarded roof tiles, streets, etc. Whether or not this town can be identified with Eliki is a question to be answered by extensive underwater research. In any case, the discovery of this town can be regarded as an extremely interesting find"). Please advise to whom I can send the permission of the owner for using material from his site. Please find the link for the underwater documentary film in Greek and English versions: http://oudeterapleustotita.blogspot.com/2009/11/blog-post.html
Thank you for your time.
Happy New Year.
Alchemistria (talk) 17:15, 26 December 2010 (UTC)[reply]
The Pskott m/68 is not a rocket. It is similar in operation to the larger recoilless rifles (see text on that page). Also, please note that the AT-4, which replaced the Pskott m/68 and is in widespread use around the world (including the US), is also not a rocket, even though for whatever reason many official documents in the US military state it is. Jack Jackehammond (talk) 14:44, 25 January 2011 (UTC)[reply]
Please be advised that your bot is adding the {{stub}} template to articles in the Persian Wikipedia. This is not advised, because stub articles should be marked with subject-specific stub templates. It would be a good idea to stop this behavior of your bot at your earliest convenience.
Huji, could you please give me a diff link? I do not remember a bot placing stub templates without manual operation, and mine doesn't. Thanks. Xqt (talk) 09:35, 21 February 2011 (UTC)[reply]
Sorry, my bad. It appears the abuse filter is triggered even when the stub template already exists in the article. Your bot is fine. huji—TALK 02:49, 22 February 2011 (UTC)[reply]
I noticed that when you fixed the double redirect of City of thieves on 1/21/2011, you left in place the "R from other capitalisation" template, which no longer applied to the updated redirect. I'm not sure what the solution to this problem is in the general case, but I think a reasonable thing to do when fixing a double redirect would be to remove the redirect template if present. AlphaPyro (talk) 15:43, 12 April 2011 (UTC)
Bot edits pages even when they are marked as "In Use"
In this edit, your bot edited a page which was marked by the "in use" template. This caused a duplication of the "References" section which had to be fixed manually. Perhaps your bot should respect the "in use" template, which says "please do not edit this page while this message is displayed." —Bill Price (nyb) 16:41, 1 May 2011 (UTC)[reply]
I implemented this behavior for interwiki bots in pyrev:7759 several months ago, but it would be a good idea to do the same for the other bot scripts too. Give me some time to implement this. Xqt (talk) 10:26, 3 May 2011 (UTC)[reply]
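For illustration, a sketch of such a check layered on pywikibot's standard exclusion handling (the set of template names is an assumption; botMayEdit() is the real API call that honours {{bots}}/{{nobots}}):

import pywikibot

IN_USE = {'in use', 'inuse', 'under construction'}  # assumed names

def may_edit(page):
    # botMayEdit() already honours {{bots}}/{{nobots}}; in addition,
    # skip pages carrying an "in use"-style template.
    if not page.botMayEdit():
        return False
    names = {t.title(with_ns=False).lower() for t in page.templates()}
    return not (names & IN_USE)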
Hi there, and thanks for your great work with Xqbot. This is just to let you know that in this edit, xqbot tried to fix a link that was already correct. I'm not sure exactly what the problem is, but I would appreciate it if you could take a look at it. Thanks! — Mr. Stradivarius ♫ 11:53, 14 June 2011 (UTC)[reply]
The reason was this redirection of the target page, which resulted in a double redirect of the source. I have no idea how to prevent this except by keeping a delay time of more than 1 hour. Sorry for this. Xqt (talk) 12:04, 14 June 2011 (UTC)[reply]
Possible bot edit warring
Hi, I'd like to bring to your attention the history of this article. Three bots seem to be warring here. Could you try and explain what exactly is happening there? Thanks. Lynch7 12:28, 23 June 2011 (UTC)[reply]
This is a known bug (3081100), not in the bot framework but in the underlying Python interpreter. We are not able to fix it, only to keep some bots from editing the affected language codes. Xqt (talk) 16:59, 23 June 2011 (UTC)[reply]
I also needed another clarification. For instance, if I add a sa interwiki to a page in English Wikipedia, will all the interwikis be automatically added to the article in sa? Lynch7 08:58, 24 June 2011 (UTC)[reply]
Yes, under the conditions that the target page exists, is not empty or under construction, and there is no interwiki conflict (which means more than one article pointing to the same language site). Xqt (talk) 11:21, 25 June 2011 (UTC)[reply]
And why was en ignored (non-disambiguation page [[pl:Makary]] to disambiguation [[en:Makarios]]) while the it, ro, sh, co disambigs weren't? Bulwersator 06:46, 17 August 2011 (UTC)
The way disambigs are treated by pywikibot is a bit tricky. You'd better ask one of the script developers for more technical details - en:User:Xqt --Volkov (?!) 09:32, 17 August 2011 (UTC)
Disambig templates are normally recognized via MediaWiki:Disambiguationspage. {{Given name}} is not a disambig, but pwb handles it as one per a bug request from years ago. The bot only follows interwiki links of disambig pages if the target page is a disambig page too. And guess what it does for non-disambig pages ;) Most of the bots are running in autonomous mode, which means the operator does not verify all bot edits before updating. But the rule described above could be wrong: it may be right for a disambig page to point to a non-disambig page, e.g. for a page marked with the "Given name" template to point to a name article on another site. We had a lot of wrong bot edits in the past with the -force option, which enables removing wrong crosswiki disambig links, and bot owners are recommended not to use that option in autonomous mode. As a result, given links will not be deleted; only new links are added and existing links may be changed. In pyrev:8536 I introduced a -cleanup option which enables removing non-existent links. In the given sample, existing links weren't removed, but a link was changed from a disambig to a non-disambig page. Xqt (talk) 21:19, 23 August 2011 (UTC)[reply]
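The rule described above boils down to a single comparison. A sketch of how an interwiki bot might apply it (the function name is mine; isDisambig() relies on the MediaWiki:Disambiguationspage list mentioned above):

import pywikibot

def may_follow(source, target):
    # source and target are pywikibot.Page objects. Follow an interwiki
    # link only when both ends agree on being (or not being) a disambig.
    return source.isDisambig() == target.isDisambig()

An autonomous run would then skip, rather than "fix", any link for which may_follow() is False.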
Xqbot and double redirects resulting from vandal edits
Thanks. I am collecting some ideas for avoiding some of these vandal edits before changing the framework. I guess I need a bit of time for the implementation. Xqt (talk) 15:54, 26 September 2011 (UTC)[reply]
Found it. I guess the problems are over-aggressive DR fixes, and DR fixes that don't get reverted when there are new cues suggesting they were invalid. Arguably, it's just a bigger problem with MediaWiki itself and how redirects are handled, but that's another can of worms. --Bxj (talk) 17:28, 2 October 2011 (UTC)[reply]
missing template-generated refs
Xqt seems to have missed Caolan language, where the ref is generated by the infobox. (It's since been done manually.) Is there a way to include this? I'd like to try to get Wikiproject languages to review our language articles, and if the population figure checks out with the latest edition of Ethnologue, to set ref=e16 to generate the reference. There are thousands of such articles, and it would be nice if people could just work on getting through them without worrying about formatting the references. — kwami (talk) 07:35, 5 December 2011 (UTC)[reply]
The bot didn't see the reference during screen scraping since the reference got an additional <span> tag. I have fixed the bot now, and it should work with the new MediaWiki output. Thanks for reporting this bug. @xqt 10:58, 5 December 2011 (UTC)
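One way to sidestep such scraping pitfalls is to test the parsed HTML for the Cite extension's output instead of matching raw markup. A sketch against the public API (an illustration of the approach, not the bot's actual code):

import requests

API = 'https://en.wikipedia.org/w/api.php'

def has_rendered_refs(title):
    # Render the page server-side and look for Cite's footnote anchors,
    # which appear for template-generated references as well.
    params = {'action': 'parse', 'page': title,
              'prop': 'text', 'format': 'json'}
    html = requests.get(API, params=params).json()['parse']['text']['*']
    return 'id="cite_ref-' in html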
Great! User:WikHead has expressed reservations on my talk page; if you think his worries are justified, maybe the bot could just fix cases of the ref parameter being set to e16? Though that parameter is used for manual refs as well. — kwami (talk) 23:35, 9 December 2011 (UTC)[reply]
Please see that your xqbot does not add "Ksalol" as the Serbian translation. The international drug name is Alprazolam, with its translations. "Ksalol" is a brand name of a Serbian company. This seems to be an error in sr:wiki.
They should use the drug name, not a brand name.
70.137.152.25 (talk) 07:47, 12 December 2011 (UTC)[reply]
Lescinel Jean-Francois
Please stop changing his name to the incorrect version "Jean-Francois Lescinel" - Lescinel is his first name, as published by his current club, used on the back of his shirt, and confirmed by the player in interviews, where he says he is fed up with people assuming differently. 09:41, 8 March 2012 (UTC)
Your bot has been removing links to the Croatian Wikipedia, either adding them back later or letting another bot re-add them. Do you know why? Jared Preston (talk) 05:29, 14 March 2012 (UTC)[reply]
There's probably a simple explanation for it, but I can't/couldn't work it out. I reverted here, here, here, here and here. I'm sorry if I should have left them. But what caught my eye was Miroslav Klose on my watchlist, see history here, your bot added a correct link, but also removed the Croatian link in the same edit. But there is nothing in the Croatian public log to show why the link would or should have automatically been removed. Your bot corrected itself here and here, but another bot corrected yours here, which is why I am a little bit confused. Jared Preston (talk) 07:17, 14 March 2012 (UTC)[reply]
When a redirect is edited by your bot to a new destination, such as when an article has been renamed and an already-existing redirect thereby becomes part of a double redirect, R templates in the redirect apparently are not edited by the bot and may no longer be correct. I observed this in the last week or so. I recommend that when destinations are changed the bot delete all R templates, thus making a redirect like those that were never assigned R templates in the first place. If Wikipedia already has a system for accessing redirects lacking R templates, then deleting R templates will put the newly-edited redirects together with the redirects that always lacked them, which might inspire someone to assign R templates. That's likelier than if they're left erroneous and it saves the work of reading and judging all the R tags when editing a destination.
I don't know if yours is the only bot doing this function. Should I post elsewhere?
As I am a member of the developer team, this place is OK for me. Could you give me a hint about your request, please (e.g. diff links)? @xqt 18:06, 17 March 2012 (UTC)
Hello Xqt. I am currently conducting a study on the dispute resolution processes on the English Wikipedia, in the hope that the results will help improve these processes in the future. Whether you have used dispute resolution a little or a lot, now we need to know about your experience. The survey takes around five minutes, and the information you provide will not be shared with third parties other than to assist in analyzing the results of the survey. No personally identifiable information will be released.
Please click HERE to participate.
Many thanks in advance for your comments and thoughts.
Featured.py to mine Featured pictures on many wikis
Hi, I am working on Commons:Template:Assessments, which reports the "featured" status of files on Commons. For instance, if a file is a Featured Picture on the English Wikipedia, the Assessments template would have the parameter |enwiki=1 and optionally |enwiki-nom= to link to the nomination page. If a file is a formerly featured picture, the parameter would be |enwiki=2. If the file is a featured sound, the parameter would be |enwiki=3, and if the file is a formerly featured sound, the parameter would be |enwiki=4. Updating the Commons page is sufficient to update all wikis, since the stars are visible on all wikis. Though it is also possible to run the code in an interwiki.py-like manner, as featured pictures are typically not interwiki linked. -- A Certain White Cat chi? 09:25, 20 May 2012 (UTC)
fr:Knowlesville does not exist and therefore the bot removed it. Interwiki links aren't translations. They must link to existing corresponding pages on other sites. @xqt 05:35, 29 May 2012 (UTC)[reply]
Garbo edit
Hello there, you just edited something on the Garbo page which I don't understand. Can you explain what you did? Many people make these adjustments which I don't understand so, out of curiosity, I want to learn this stuff! You can just answer, if you wish, on my talk p. thanks,--Classicfilmbuff (talk) 00:50, 26 November 2012 (UTC)[reply]
Unexplained removal by bot
I don't know if the bot's supposed to explain but ....
It removed a component I don't fully understand myself yet but which I had looked at with some care and decided was a reasonable addition, namely the "zh" with the article (ICE, spelled out) name in (in this case) Chinese (to start, I had to look up "zh"), here. The component was put in here. To check the addition myself, I had translated via Google the Chinese phrase and it was correct to the article title. (GoogleTransl. did seem a little confused between Japanese and Chinese but the Chinese seemed firmly the preference so I didn't probe further; from my bit of knowledge about the two languages (Japanese "kanji" (sp?) being Chinese characters "reused", about my limit), it wasn't a bothersome confusion and was one that could happen easily.)
Short question: Why remove it?
Curiously, neither you (nor anyone) has (yet) removed the same IP editor's "zh" addition here from 20 Dec. (well before the ICE case above, which was an early 24 Dec. addition and a later 24 Dec. removal), but another bot (User:MastiBot) has removed the "zh" addition here (12 Dec. edit; 15 Dec. removal), also without explanation or communication with the IP'er here at the IP Talk page. From this quick look, it feels like you both, via bot, are (sometimes?) treating this component like vandalism but aren't labeling it or combating it as such. It may not be your "job" to do so; I'm just trying to understand. I would and will post to the IP's Talk if you have a good explanation. I don't like to see apparently good faith work "cleaned [out]" without explanation.
For the record, ICE is a global enterprise as, partic., is NYSE which it is acquiring, so a Chinese link for the article made and makes good sense to me.
I guess the reason is a kind of misspelling. MediaWiki can resolve different zh spellings, but pywikibot unfortunately can't. I'll have a look at this problem tomorrow and solve it if necessary. @xqt 21:10, 24 December 2012 (UTC)
You lost me with "Mediawiki" but I appreciate your looking into it. I've done another bit of research or three. First, I've learned now that these "language:translation" bits don't show up on the page as presented, for instance here Neo-Scholasticism. I'm assuming the article has some "l:t" bits because I see them here on a "differences" page that's addressing a change in them. However, you'll note (or know) that they (still) don't appear in the body of the text below the "differences" section. This particular differences page is of interest, further, because there's no "zh:_" on it; yet one was added here. The problem of these not appearing on the page anywhere is that it becomes cumbersome to track who has removed one like the "zh:_" one here, gone between the two pages I've linked to. In this case I got a bit cleverer (and lucky) and looked at the one edit with substantial negative (red) bytes on the Rev. hist. and found this one, a wholesale "language:translation" bits remover (on the Neo-Schol. page at least).
How did I find User:Isnow who added the zh to Neo-Schol. back in July 2010? Because he or she just (since your bot removal) added it (back) to our ICE. And he or she is mostly a zh adder. So I went back aways in his or her contributions history and picked Neo-Schol. randomly.
As is I expect clear, I'm poking my way through this subject without necessarily a lot of intrinsic interest even in it; but I'm curious about a zone of Wiki I have found I don't know anything about but which, in seeming to have some problems, interests me. And the China interaction piques my interest as well. Maybe it's good to have zh adders, maybe not, don't know yet. OK? I am going to link in User:Hydro who did that wholesale removal and User:Isnow to see if there is anything to be learned from them, too. I hope you can continue to try to meet me halfway and are OK with me starting a (little) larger forum here at your user talk page. I'll find another locale if you'd rather. Cheers and thanks. Swliv (talk) 20:11, 26 December 2012 (UTC)[reply]
Hello, I am a human editor. The interwiki links were added between zh and en based on my knowledge. The "zh:_" links were usually the same page title as in zh.wikipedia, which may be Simplified Chinese or Traditional Chinese. I am also interested in how the bots decided to remove the links. --Isnow (talk) 21:46, 26 December 2012 (UTC)[reply]
As I said above and as shown here: you inserted the right link; xqbot removed a kind of "misspelled" link. The MediaWiki software can resolve that in some circumstances; the Pywikipediabot framework cannot. The best way is to copy the target title from zh-wiki and paste the link here. @xqt 13:12, 27 December 2012 (UTC)
In addition to my comment above, here are some explanations about the deleted links:
Well, as I predicted, my interest in this has ebbed. But I'm glad Isnow came in as someone directly affected. And I appreciate xqt's earnest attention to the detail.
I did know, and said above, that Methaqualone was fine.
The Ferrari adds a new element to my sense of what I'm dealing with here (and no one's explained where these components even live on a page, much less what they do; I know, I could look it up): Do these link to articles in other-language Wikipedia editions? Wow! if so. Neat. And are we then saying that MastiBot was correct in removing the link (if it's a link; the "component" was my vaguer, working term) (because there was no Ferrari FF article in the zh Wiki)? That would be a nice resolution of that. (This isn't yours, Isnow, but you could look into that if you wanted. I did not even reach out to that IPer, in this go-round.)
As to Neo-Schol.: I gave you the "fixed" page you've given back to me (mine at the end of my first 20:11 paragraph). I don't understand the "other languages" explanation you give. All I saw at the (User:Hydro) edit was the "wholesale", as I called it, removal of lots including the zh one. I guess I'll leave this to Isnow to explore further if wanted but I'll sure try to understand an explanation if one's given.
ICE. I'll take your word it's fixed. I'll also leave it to Isnow to figure out the "miss-spelling" if that's the explanation for the (mistaken) removal. I guess I do, now that I'm back where I started – with ICE – wonder if this is a systemic problem with the bot removing components that it shouldn't be removing. I guess your answer, xqt, is "Yes, sometimes, come to me and I'll fix it, case by case". That's of course not a systemic fix. But maybe it's the best way to proceed. I've seen your defense above (1000 edits a day or an hour or whatever) and can certainly appreciate (from a place of extreme basic ignorance) the work you're doing for the encyclopedia.
I'll leave it (almost) at that for now. (And I mustered some focus and interest ... for another round, eh?) Final point: I've just been picking randomly a tiny number of examples, following my nose on this. Is there really a sense, xqt, that this is not a systemic problem? Thanks for your attention. Cheers.
ps Reviewing before posting, I've just picked up with more focus on "interwiki" in your Neo-Schol. explanation. That ... has led me to Wikipedia:Interwiki. I think, if any other generalist wanders into this thicket hereafter, that would be where one would start to look up what's going on here, what we're talking about here. Am I right, xqt and Isnow? should you care to answer such a basic question. Feels like I'm right. So much to learn .... (And Isnow, I think we're all human in this discussion, right? xqt operates a bot (prob. designed it too). But I was charmed by your opening, and by both of your willingness to engage in this. :-)) All best. Swliv (talk) 02:10, 28 December 2012 (UTC)[reply]
I gave you a link to another talk page discussion with a diff exactly like your bot's. Anyway, here's the specific one. You'll notice that each of the IWs linked to deal with German World Heritage Sites – not all of Western Europe like the English article. EricLeb01 (Page | Talk) 03:46, 25 December 2012 (UTC)[reply]
I am very sorry, but I cannot fix it on the other sites at the moment. I will do it some time later. For now I have blocked interwiki bots on that page. @xqt 19:42, 31 January 2013 (UTC)[reply]
Important note: Bots that continue to add, remove, or update interwiki links on the English Wikipedia may be blocked from editing after Saturday, February 16, 2013.
If you are running pywikipedia's interwiki.py, please update to pyrev:11073 which will automatically prevent your bot from updating links on this wiki.
If you have any questions, please ask at the bot owners' noticeboard. Thank you for your past work maintaining interwiki links. It has been very appreciated and we're looking forward to an even brighter future with Wikidata. Legoktm (talk) 10:30, 14 February 2013 (UTC)[reply]
(BK) Unfortunately, not all of these pages are at Wikidata. I've blocked my bot now. Could anybody unblock me (and of course my bot too, since I've stopped it due to a malfunction with unregistered pages on Wikidata)? Please. @xqt 20:37, 16 February 2013 (UTC)[reply]
I've seen some recent edits by User:Xqbot that initially looked suspicious and/or wrong. A little investigation revealed they are related to this Wikidata project/initiative. This is probably a good thing, but it seems a bit opaque right now in the early stages.
While there is the "edit links" thing over in the languages frame, these otherwise unexplained edits might catch editors unawares. Providing very explicit edit summaries along with a link to the associated wikidata page may be helpful during this transition. --Dfred (talk) 04:30, 17 February 2013 (UTC)[reply]
Hi Xqt. You fixed the missing wikis in the pywikibot rewrite branch (in r11100 and r11101), but not yet in the trunk. I couldn't find much information about the branch. Should new bots use the rewrite branch, or is it not ready yet? —Pathoschild 00:47, 23 February 2013 (UTC)
Hi Pathoschild. Sorry, I forgot to update the trunk release from rewrite. You may use the rewrite branch as you like. I don't know where we are going in the future; perhaps the Amsterdam hackathon will bring more clarity. I use both branches for development. Rewrite has a lot of good ideas, but I feel trunk is more stable and does not need a lot of API calls to get the site information for all sites of a project. I would prefer to merge them, because it is hard work to keep both versions up to date. Best @xqt 11:52, 23 February 2013 (UTC)[reply]
Thanks. I'll stay with the trunk for now, and see if anything comes out of the hackathon. —Pathoschild 16:58, 23 February 2013 (UTC)
Redirect.py in case of multiple redirects
Hello Xqt, you have handled and closed my error message here: "2 redirects on a page: all replaced, instead the first only - ID: 3605596". (It was my first such message and I misinterpreted the "Private". And maybe this isn't the right place for the discussion?) Am I right that you didn't change the source code? I think that wouldn't be the best solution.
I had first written about the problem here to Avocato, giving more information. Clearly, a second redirect is useless. But in reality, users make errors and use such redirects to put information on the pages. You have seen such a case in ANSCA. And redirect.py now destroys this information. I have already changed nearly half of the pages with more than one redirect (and will handle the others too), but we cannot hope that users will not make this error in the future.
Hm, I guess we should have a maintenance page to be fixed, either on the redirect's talk page, on a page in the bot's namespace, or somewhere else. Or the bot could transform the redirect into a disambig page. Otherwise this problem may persist for a while until it is detected by accident. Any further suggestions? @xqt 16:59, 9 March 2013 (UTC)[reply]
Yes, a maintenance page could help to correct known cases of multiple redirects on a page more quickly. A bot could find the cases, but it could not correct them: a good correction depends on the context, and a disambig page isn't appropriate in every case. Often a note "This article is about .... For ..., see ..." on one of the target pages (and deletion of the other redirects) is better. So it should be handled manually. But a first step could be to avoid destroying (mostly) valuable information... --Griot-de (talk) 00:27, 10 March 2013 (UTC)[reply]
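A report-only first step could be as small as this sketch (the regex and function name are mine; it flags pages for the maintenance list instead of editing them):

import re

REDIRECT_RE = re.compile(r'#\s*REDIRECT\s*\[\[', re.IGNORECASE)

def needs_human(page):
    # page is a pywikibot.Page. Only the first redirect line is active;
    # any further ones are plain text that a bot rewrite would silently
    # destroy, so list such pages for manual cleanup.
    return len(REDIRECT_RE.findall(page.text)) > 1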
cosmetic_changes.py
Some time ago the message <!--interwiki (no, sv, da first; then other languages alphabetically by name)--> was added to cosmetic_changes.py for nn.wiki. Now that interwiki is no longer to be stored locally, the message should be removed/deactivated. Thanks. --Njardarlogar (talk) 09:51, 28 March 2013 (UTC)[reply]
There was, iirc, discussion back in 2006 or '07 about making it policy to have bots exclude certain types of pages, and in my case, one like this clearly tagged as under construction. I can really do without bot edit conflicts, you know? Preferably, you should be skipping any article less than 48 hrs old as a matter of common sense. Letting the brainless bots loose whilst the human intelligence is still working out the tough stuff is counterproductive. // FrankB 02:08, 12 September 2013 (UTC)[reply]
Georgia Brown (Brazilian singer)
Hello Xqt,
This is just a courtesy visit to apprise you that I took the liberty of slightly expanding your above article by adding a section and relevant references. Hopefully you'll like my little effort. Best regards,
I was disappointed when I got the notification that you've reversed the edit work I did for your Georgia Brown. The information I put in about the person in question, albeit brief, and the extra references and wikilinks did not, I believe, harm the article; on the contrary, they supported it. Anyway, I'm not airing a grievance, but just to be more sure about my work in the future, I'd truly appreciate knowing what was lacking. Best regards, (MrNiceGuy1113 (talk) 09:26, 24 September 2013 (UTC))[reply]
Hi,
Back again. This time to apologize to you. The article Georgia Brown (as I learned later) was not reverted by you but by a different gentleman (Seokhun). Sorry again for taking your time. Regards, (MrNiceGuy1113 (talk) 09:47, 24 September 2013 (UTC))[reply]
Yes, this was before the fix above. There were some changes to the MediaWiki software in the previous weeks which also caused other bugs. @xqt 10:36, 13 October 2013 (UTC)[reply]
Preferred ref tag
Hi,
I see you consistently add <references /> to the ref section. I always add {{Reflist}}. Is one preferable to the other? (They used to do different things, but no longer.) Pls ping me if this matters. If {{Reflist}} is dispreferred, it would be a good idea to remove it from the edit-window options. — kwami (talk) 05:26, 14 November 2013 (UTC)[reply]
I am not sure which of the two is preferred; imho there is no preference for either of them. The template does the same as the MediaWiki magic word, except that the template is able to show the reference list in columns (see also bugzilla:51260). @xqt 06:25, 14 November 2013 (UTC)
Bot shouldn't add "References" section to disambiguation pages
Hi,
On occasion an editor will add a reference to a disambiguation page. This is an error, so xqbot should not add a corresponding "References" section to such pages (as happened in this edit, for example).
Hi. Can you resolve bug 59008, please? It just requires adding two lines of code. Unfortunately, my current computer doesn't have Git installed on it, or else I would have submitted the patch to Gerrit myself. --Meno25 (talk) 20:06, 29 December 2013 (UTC)[reply]
Thank you for fixing up the reference for Zoology mnemonic. I was not able to fix up the problem myself because my computer is currently only showing extremely tiny unreadable text - both the article page and the editing page. All the best. Figaro (talk) 02:57, 6 February 2014 (UTC)[reply]
blacklist hits
Xqt, I noticed your bot sometimes hits the blacklist by adding {{reflist}} to articles where there are 'blacklisted references'. Two recent/current examples are:
16:55, 25 April 2014 Xqbot (talk | contribs | block) caused a spam blacklist hit on Truly Scrumptious (song) by attempting to add youtu.be.
16:52, 25 April 2014 Xqbot (talk | contribs | block) caused a spam blacklist hit on Faubourg St. John by attempting to add www.google.com/url?.
I have now solved the problems (both are redirect links; I have expanded them to the proper links, which are not blacklisted), but maybe you can make your bot alert you of these cases so they can be repaired. Thanks! --Dirk Beetstra TC 06:28, 26 April 2014 (UTC)[reply]
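pywikibot raises a dedicated exception on such hits, so the bot could log them for manual repair rather than failing silently. A sketch (the function name is mine; in older releases the exception was called SpamfilterError):

import pywikibot
from pywikibot.exceptions import SpamblacklistError

def save_with_report(page, summary):
    try:
        page.save(summary=summary)
    except SpamblacklistError as err:
        # Saving re-submits every link on the page, so a pre-existing
        # blacklisted URL can block an unrelated bot edit; log it.
        pywikibot.error('Blacklist hit on {}: {}'.format(page.title(), err))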
When fixing this double redirect, the bot has dropped the hash part of the original redirect that links to a section in the target article. This edit is wrong, as it changes the intended target of the first redirect and therefore the fix loses information. I've found another example of the same error here, which should have been fixed like this.
This is a severe bug; sure, it's not earth-shattering, as the redirect still leads to the same page, but the meaning of the redirect and the section that the editor intended as the meaning of the link are lost, forcing the reader to infer why the link was created. Please disable the double-redirect function until this bug is fixed, as the bot shouldn't be making wrong edits unsupervised, and it's very hard to detect. You may also want to check all the edits where the second redirect is a redirect-to-section, as it's likely that this has been going on for a long time. Diego (talk) 06:27, 14 May 2014 (UTC)[reply]
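The fix amounts to carrying the fragment through when retargeting. A sketch of the intended behaviour (the regex is mine, not redirect.py's actual parser):

import re

LINK_RE = re.compile(r'#\s*REDIRECT\s*\[\[([^\]#|]+)(#[^\]|]*)?',
                     re.IGNORECASE)

def new_redirect_text(middle_redirect_text):
    # Keep the '#Section' part of the second redirect instead of
    # linking to the bare page, so no targeting information is lost.
    m = LINK_RE.match(middle_redirect_text.lstrip())
    if m is None:
        return None
    title, fragment = m.group(1).strip(), m.group(2) or ''
    return '#REDIRECT [[{}{}]]'.format(title, fragment)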
Hi: The bot seems to be "fixing" redirects as being double that aren't. The only thing being "fixed" in this and this is that there's no space between the #REDIRECT and the brackets - which is, incidentally, how the redirect button on the editing menu formats it. Space or no space makes no functional difference to the redirect, so the bot's edits aren't necessary at all. BMK (talk) 02:56, 26 June 2014 (UTC)[reply]
Why are we using "fancy dashes" in article titles, when they cannot be easily typed from a normal keyboard? BMK (talk) 08:19, 27 June 2014 (UTC)[reply]
Apologies if this has already been fixed - the instance I've just discovered was 4 years ago but I didn't find any mention of it in the earlier discussions.
This edit left {{R from other capitalisation}} there, although it is inappropriate to the new title to which the redirect was changed. I suppose fixing this would be a matter of programming it to recognise certain redirect templates and remove them if they are no longer appropriate.
Just read the text of that template again, and you're right. Though I'm made to wonder whether the template is defined that way purely for the benefit of bots fixing double redirects. Though maybe there are other templates that need to be thought about - I'll let you know if I find one. — Smjg (talk) 22:34, 27 June 2014 (UTC)[reply]
Bot messed up HTML comments while fixing double redirect
Hi, I would like to report a bug in your bot where it did not only fix a double redirect, but also messed up the text in HTML comments: [41], [42], [43], [44]. These comments exist for documentation purposes and as a suggestion for other editors for possible alternative link targets depending on the future development of the articles. In either case, comments should never be touched by a bot. Please fix your bot to ignore them. Thanks. --Matthiaspaul (talk) 09:50, 19 July 2014 (UTC)[reply]
In amongst all the bug reports and moans, this is just to say thanks for the bot. When I moved the page, I got the page (reasonably) asking me to deal with the double-redirects, but I didn't know how to do that, and the bot has done it for me. Thanks. DrArsenal (talk) 20:33, 17 December 2014 (UTC)[reply]
Jackie Joyner Kersee award
On Jackie Joyner Kersee award I have included a note about the intentional double redirect on this article. Your bot can't read. This is a simple lower-case disambiguation of the original version. The original name currently redirects to the Jesse Owens Award because that is what the award was known as until a 2013 decision to convert it to the Jackie Joyner Kersee Award, so the last two versions have followed that name. As this new name gets established, there will certainly be a new article--probably by me. When that is done, I don't want the disambiguation directing to the wrong article. I need to figure out a way to do so without losing the 20-year history of the award under the original name. The Board of USATF, who present this award, is in disarray right now. Who knows what decision they will make for the legacy of their most important award once some sane individuals are appointed to the board. Trackinfo (talk) 09:31, 26 December 2014 (UTC)[reply]
Hi Trackinfo. There is a way to prohibit the bot from solving the double redirect. Just add __STATICREDIRECT__ to your redirect page; it won't be fixed anymore. @xqt 10:54, 29 December 2014 (UTC)
Sorry, I have to explain that behavior. You edited the wrong redirect. The middle man must be the static redirect, which I've set up now here. This drawing shows how static redirects work with the bot(s). Please excuse the trouble I've caused. @xqt 09:04, 1 January 2015 (UTC)
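For reference, __STATICREDIRECT__ sets a page property that a bot can query before touching a redirect; a minimal sketch of the check (the function name is mine):

import pywikibot

def is_static_redirect(page):
    # page is a pywikibot.Page. The magic word adds 'staticredirect' to
    # the page properties; redirect bots skip such pages.
    return 'staticredirect' in page.properties()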
Bot changed Virginia Kelley redirect Virginia Clinton Kelley
Virginia Kelley was set up in 2007 as a redirect to Virginia Clinton Kelley, who is Bill Clinton's mother and was known as Virginia Kelley. On April 17, 2015 Xqbot with this edit incorrectly changed the redirect to point to Bill Clinton. I'm changing it back to what it originally was. — Maile (talk) 20:58, 12 May 2015 (UTC)[reply]
Hi. Could you please have a look at phab:T109225? As I can see here, you have a disagreement with John Vandenberg, so could you please commit the one-line change that I requested in the task and postpone other changes until you settle your disagreement with John? Thank you, sir. --Meno25 (talk) 14:05, 27 August 2015 (UTC)[reply]
Hi. Regarding T110529, when using the core version of template.py it fails and gives me the following error:
python pwb.py template Persondata -remove -page:"آدام هامل" -lang:ar -family:wikipedia
WARNING: Bot.site was not set before being retrieved.
WARNING: Using the default site: wikipedia:ar
Retrieving 1 pages from wikipedia:ar.
You can't edit page [[آدام هامل]]
Although the page ar:آدام هامل is not protected and the bot account is not blocked. This is the first time I try the core script. Is this a bug in the script or am I doing something wrong? --Meno25 (talk) 17:10, 7 April 2016 (UTC)[reply]
You may ignore the warnings; it seems the script uses an older bot class which sets the default site when it is not set in the constructor. But the blocking of the page looks ugly. I'll investigate this matter soon. @xqt 18:22, 7 April 2016 (UTC)
The edit succeeded, but it turns out that the same bug described in T110529 also exists in core (see edit), as can be seen in the following output:
python pwb.py template "Persondata" -remove -page:"User:Meno25/test core" -lang:ar -family:wikipedia
WARNING: Bot.site was not set before being retrieved.
WARNING: Using the default site: wikipedia:ar
Retrieving 1 pages from wikipedia:ar.
WARNING: Bot.site was not set before being retrieved.
WARNING: Using the default site: wikipedia:ar
>>> مستخدم:Meno25/test core <<<
@@ -14,5 +14 @@
+
- {{Persondata
- |NAME=آدام هامل
- |ALTERNATIVE NAMES=
- |SHORT DESCRIPTION=لاعب كرة قدم إنجليزي
- |DATE OF BIRTH={{تاريخ الميلاد|1988|1|25|df=y}}
Do you want to accept these changes? ([y]es, [N]o, [e]dit, open in [b]rowser, [a]ll, [q]uit): y
Waiting for 1 pages to be put. Estimated time remaining: 0:00:10
Password for user MenoBot on wikipedia:ar (no characters will be shown):
Logging in to wikipedia:ar as MenoBot
WARNING: API warning (login): Fetching a token via action=login is deprecated. Use action=query&meta=tokens&type=login instead.
Page [[مستخدم:Meno25/test core]] saved
The problem seems to be the nested template. I reopened T110529. I guess we should copy the related parts there. @xqt 06:47, 8 April 2016 (UTC)
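A parser that understands nesting avoids this class of bug entirely. A sketch using mwparserfromhell instead of template.py's regular expressions (an alternative approach for illustration, not the committed fix):

import mwparserfromhell

def drop_template(text, name='Persondata'):
    # filter_templates() copes with templates nested inside parameters,
    # e.g. the birth-date template inside {{Persondata}} above.
    code = mwparserfromhell.parse(text)
    for tpl in code.filter_templates():
        if tpl.name.matches(name):
            code.remove(tpl)
    return str(code)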
The other issue: I tried to run the bot on your test page as follows (without ignore_bot_templates=True):
>>> import pwb, pywikibot as py
>>> s = py.Site('ar')
>>> p = py.Page(s, 'user:Meno25/test core')
>>> p.botMayEdit()
True
>>>
I tried the other page you mentioned:
>>> p = py.Page(s, 'user:xqt/Test')
>>> dest = list(p.linkedPages())[0]
>>> dest
Page(\u0622\u062f\u0627\u0645 \u0647\u0627\u0645\u0644)
>>> dest.botMayEdit()
True
>>>
It also works.
Finally I tried an edit on that page:
C:\pwb\GIT\core>pwb.py template Persondata -remove -page:"%D8%A2%D8%AF%D8%A7%D9%
85_%D9%87%D8%A7%D9%85%D9%84" -lang:ar
WARNING: Bot.site was not set before being retrieved.
WARNING: Using the default site: wikipedia:ar
Retrieving 1 pages from wikipedia:ar.
WARNING: Bot.site was not set before being retrieved.
WARNING: Using the default site: wikipedia:ar
>>> آدام هامل <<<
@@ -17,5 +17 @@
+
- {{Persondata
- |NAME=آدام هامل
- |ALTERNATIVE NAMES=
- |SHORT DESCRIPTION=لاعب كرة قدم إنجليزي
- |DATE OF BIRTH={{تاريخ الميلاد|1988|1|25|df=y}}
Do you want to accept these changes? ([y]es, [N]o, [e]dit, open in [b]rowser, [a
]ll, [q]uit): y
Sleeping for 4.9 seconds, 2016-04-08 09:26:40
Hello, Please see this edit by Xqbot: [45]. In short, the preceding edit added a <ref group=n>...</ref> tag to the page, and the bot then added the <references /> tag to the article, which doesn't actually fix the citation error. What Xqbot should have added to the page was {{Reflist|group=n}} ([46]).
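Detecting the group attribute before choosing the markup would cover this case. A sketch (the regex and function name are mine, not Xqbot's code):

import re

GROUP_RE = re.compile(r'<ref[^>]*\sgroup\s*=\s*"?([^">/\s]+)',
                      re.IGNORECASE)

def references_markup(wikitext):
    # Emit one list per ref group found, instead of a bare tag that
    # leaves grouped footnotes unrendered.
    lines = ['<references />']
    for group in sorted(set(GROUP_RE.findall(wikitext))):
        lines.append('<references group="{}" />'.format(group))
    return '\n'.join(lines)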
Hello, Xqt. Voting in the 2016 Arbitration Committee elections is open from Monday, 00:00, 21 November through Sunday, 23:59, 4 December to all unblocked users who have registered an account before Wednesday, 00:00, 28 October 2016 and have made at least 150 mainspace edits before Sunday, 00:00, 1 November 2016.
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
Delving deeper into the data, we found that most of the disagreement occurs between bots that specialize in creating and modifying links between different language editions of the encyclopedia. The lack of coordination may be due to different language editions having slightly different naming rules and conventions.
In support of this argument, we also found that the same bots are responsible for the majority of reverts in all the language editions we study. For example, some of the bots that revert the most other bots include Xqbot, EmausBot, SieBot, and VolkovBot, all bots specializing in fixing inter-wiki links. Further, while there are few articles with many bot-bot reverts (S7 Fig), these articles tend to be the same across languages. For example, some of the articles most contested by bots are about Pervez Musharraf (former president of Pakistan), Uzbekistan, Estonia, Belarus, Arabic language, Niels Bohr, Arnold Schwarzenegger. This would suggest that a significant portion of bot-bot fighting occurs across languages rather than within.
The original study is published here, which is less spectacular. Anyway, most of these bots are based on the same framework, i.e. have the same script. There were three main reasons that bots were reverting each other:
a Unicode bug found on the underlying development system, see phab:T102461 and [47]
cross-namespace interlanguage links might be misleading when the framework was not updated in the short term
The BAG Newsletter is now the Bots Newsletter, per discussion. As such, we've subscribed all bot operators to the newsletter. You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list.
(You can unsubscribe from future newsletters by removing your name from this list.)
Cross-namespace double redirects
Would it be possible for the bot to log all the cross-namespace and U2 redirects it creates when bypassing redirects (and possibly not create them)? Especially when article-space is involved, they are often R2-able, and they make cleaning up from confused users a lot harder. For example, see User:Marina51/sandbox -> User:Amedeo Schiattarella -> Amedeo Schiattarella -> Help:Marina51/sandbox that I just encountered. I was able to bring the page back to article-space, but if it were something completely unfit for mainspace and had to be brought back all the way back to the sandbox, I'd need admin help. – Train2104 (t • c) 18:34, 11 May 2017 (UTC)[reply]
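A guard of this shape would catch the case Train2104 describes; a sketch (the log call and function name are assumptions, not existing redirect.py behaviour):

import pywikibot

def bypass_is_safe(old_target, new_target):
    # Refuse to retarget when the fix would cross namespaces
    # (e.g. mainspace -> Help:), and leave those for a human.
    if old_target.namespace() != new_target.namespace():
        pywikibot.log('Cross-namespace fix skipped: {} -> {}'.format(
            old_target.title(), new_target.title()))
        return False
    return True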
Bots Newsletter, July 2017
Bots Newsletter, July 2017
Greetings!
Here is the 4th issue of the Bots Newsletter (formerly the BAG Newletter). You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list.
21 inactive bots have been deflagged (see discussion).
WP:BOTISSUE has been updated to mention that BAG members can act as neutral mediators in bot-related disputes.
WP:INTERWIKIBOT has been updated to reflect the post-February 2013 practice of putting interwiki links on Wikidata, rather than on Wikipedia (see discussion).
The bot only fixed the double redirect. If this redirect isn't needed at all, please feel free to make a deletion request at the dedicated page. Greetings. @xqt 11:58, 7 November 2017 (UTC)[reply]
Xqtbot
Your bot is correcting double redirects from student users moving their pages around multiple times. This makes it much more difficult to fix the errors they are causing with the moves. Can you put a delay on your bot so that users have a chance to fix these pages before your bot fixes the double redirects? Thanks. Nihlus 01:25, 2 December 2017 (UTC)[reply]
Unfortunately, the ones I was working with are deleted, since they had to make way for a move. However, a few minutes isn't enough. I'm looking at something around a 60-minute delay as a bare minimum where a page move is involved. Nihlus 08:54, 4 December 2017 (UTC)[reply]
ArbCom 2017 election voter message
Hello, Xqt. Voting in the 2017 Arbitration Committee elections is now open until 23.59 on Sunday, 10 December. All users who registered an account before Saturday, 28 October 2017, made at least 150 mainspace edits before Wednesday, 1 November 2017 and are not currently blocked are eligible to vote. Users with alternate accounts may only vote once.
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
Here is the 5th issue of the Bots Newsletter (formerly the BAG Newletter). You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list.
While there were no large-scale bot-related discussion in the past few months, you can check WP:BOTN and WT:BOTPOL (and their corresponding archives) for smaller issues that came up.
The edit summary limit has been increased to 1000 characters (see T6715). If a bot you operate relied on the old truncation limit (255 characters), please review/update your code, as overly long summaries can be disruptive/annoying. If you want to use extra characters to have more information in your edit summary, please do so intelligently.
You will soon be able to ping users from the edit summary (see T188469). If you wish to use this feature in your bot, please do so intelligently.
Here is the 6th issue of the Bots Newsletter. You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list.
Highlights for this newsletter include:
ARBCOM
Nothing particularly important happened. Those who care already know; those who don't know wouldn't care. The curious can dig through the ARBCOM archives themselves.
BAG
There were no changes in BAG membership since the last Bots Newsletter. Headbomb went from semi-active to active.
In the last 3 months, only 3 BAG members have closed requests - help is needed with the backlog.
{{Automated tools}}, a new template linking to user-activated tools and scripts, has been created. It can be used in article previews, and can be placed on any non-mainspace page/template (e.g. {{Draft article}}) to provide convenient links to editors.
AWB 5.10.0.0 is out, after nearly 20 months without updates. If you run an old version, you will be prompted to install the new version automatically. See the changelog for what's new. Note that the next version will require .NET Framework 4.5. Many thanks to Reedy and the AWB team.
BotWatch, "a listing of editors that have made >2 edits per minute [without] a bot flag", is being developed by SQL (see discussion).
(You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
ArbCom 2018 election voter message
Hello, Xqt. Voting in the 2018 Arbitration Committee elections is now open until 23.59 on Sunday, 3 December. All users who registered an account before Sunday, 28 October 2018, made at least 150 mainspace edits before Thursday, 1 November 2018 and are not currently blocked are eligible to vote. Users with alternate accounts may only vote once.
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
Sorry, I don't understand what bots have to do here. Maybe other bot owners cannot follow your task either. Could you explain in simple steps what you expect bots to do?
When your helpful bot fixes double redirects, could it also correct redirects on the associated talk page? That would be very useful and save editor time. Thanks! Liz Read! Talk! 03:31, 26 July 2019 (UTC)[reply]
I am surprised that this hasn't been done. Do you have an example for me to compare with my log files? @xqt 14:56, 26 July 2019 (UTC)[reply]
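If the bot were extended this way, the step itself would be small. A sketch of the idea (not an existing redirect.py feature; the function name is mine):

import pywikibot

def fix_talk_redirect(page, new_target):
    # After retargeting a redirect, retarget its talk page the same
    # way if that talk page is itself a redirect.
    talk = page.toggleTalkPage()
    if talk and talk.exists() and talk.isRedirectPage():
        talk.text = '#REDIRECT [[{}]]'.format(
            new_target.toggleTalkPage().title())
        talk.save(summary='Bot: fixing double redirect on talk page')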
Here is the 7th issue of the Bots Newsletter, a lot happened since last year's newsletter! You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list.
BAG members are expected to be active on Wikipedia to have their finger on the pulse of the community. After two years without any bot-related activity (such as posting on bot-related pages, posting on a bot's talk page, or operating a bot), BAG members will be retired from BAG following a one-week notice. Retired members can re-apply for BAG membership as normal if they wish to rejoin the BAG.
We thank former members for their service and wish Madman a happy retirement. We note that Madman and BU Rob13 were not inactive and could resume their BAG positions if they so wished, should their retirements happen to be temporary.
Activity requirements: BAG members now have an activity requirement. The requirements are very light, one only needs to be involved in a bot-related area at some point within the last two years. For purpose of meeting these requirements, discussing a bot-related matter anywhere on Wikipedia counts, as does operating a bot (RFC).
Copyvio flag: Bot accounts may be additionally marked by a bureaucrat upon BAG request as being in the "copyviobot" user group on Wikipedia. This flag allows using the API to add metadata to edits for use in the New pages feed (discussion). There is currently 1 bot using this functionality.
Mass creation: The restriction on mass-creation (semi-automated or automated) was extended from articles to all content pages. There are subtleties, but content here broadly means whatever a reader could land on when browsing the mainspace in normal circumstances (e.g. Mainspace, Books, most Categories, Portals, ...). There is also a warning that WP:MEATBOT still applies in other areas (e.g. Redirects, Wikipedia namespace, Help, maintenance categories, ...) not explicitly covered by WP:MASSCREATION.
It looks like the bot moves a redirect to PANDAS over to Giant_panda. Not sure how this happens, but the page Pandas is redirected to Giant_panda while there is another page with the title PANDAS. — Preceding unsigned comment added by Jieralv (talk • contribs) 22:16, 20 October 2019 (UTC)[reply]
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
I don't know how to resolve this, but I am frequently reverting your bot. What happens is that an editor moves a page (or sometimes many pages) to the wrong page, and then the bot moves every single redirect to the wrong location. I end up following behind the bot, reverting many of its edits. Perhaps it could delay responding to page moves for 48 or 92 hours, which would allow bad moves to be undone before the bot gets to work and makes many changes that other editors then need to clean up.
The reason why this is a concern is that when the pages DO get moved back, the original redirects appear to be broken, and then a different bot comes along and deletes them even though they are valid redirects; they just point to the wrong page because of your bot's edits. Any ideas how this could be avoided? Liz Read! Talk! 00:58, 9 January 2020 (UTC)[reply]
Sorry, I have been absent from Wikipedia and didn't notice your response.
Here is a recent example: DolbyPedia, a sockpuppet, moved Tajik alphabet to Persian (Iran) alphabet (see this edit), and within 5 minutes all of the redirects (like this one) were redirected to the incorrect page by the bot. When an editor moved the page back, the bot didn't change the redirects back to the correct location. This might be because the editor or admin who corrected the wrong move frequently doesn't leave a redirect from the wrong page to the right one.
I should say that yours is not the only bot I've come across which does this (changes the redirect pages following a bad move and doesn't change them back when the page gets moved back). See this edit that was a part of a series of incorrect double redirects by EmausBot. And I'm not sure if these redirect corrections can occur if the editor or admin who reverts the move doesn't leave a redirect. I've only noticed them because they often show up at the broken redirect page and I'm concerned that User:AnomieBOT III will just delete the redirect if an editor doesn't revert the bot moves. Liz Read! Talk! 19:32, 25 January 2020 (UTC)[reply]
Once again, I'm reverting about a hundred edits of your bot which changed dozens and dozens of redirects after a few bad page moves. The pages were moved back almost immediately and the redirects then pointed to empty pages. If only the bot would change the redirects back to where they were in the first place so I didn't have to revert each one individually myself. Sorry, I'm a bit frustrated with the bot and the new editor who thought it would be smart to move highly visible articles to talk pages. Liz Read! Talk! 05:05, 13 February 2020 (UTC)[reply]
There may be several bots using the redirect.py script from mw:Manual:Pywikibot framework. I guess user:EmausBot and user:Xqbot are working on this site. The source is public. I have tried to implement some kind of vandalism detection, and I can say that page moves by unconfirmed users are always ignored. In addition, there is a delay time after a page move before the bot starts working. Anyway, what can someone do if the page must be moved back and the redirects should be fixed accordingly? Just wait until the bots have fixed the double redirects again, and delete the wrong redirect afterwards. What to do if the redirect is deleted after moving back, or the page was moved without leaving a redirect? Just wait again until the bots have fixed the broken redirects; they are able to restore the broken links. Or make a mass revert of the bot's edits. Or make a bot task request. Or ask a bot owner who uses that redirect script. Or ask me or the other folks in the #pywikibot channel. Probably you have a better idea of how to verify whether a page move is useful or not. There are several tasks at Phabricator related to this script. Any hints and proposals are welcome. @xqt 16:13, 13 February 2020 (UTC)[reply]
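The checks described above might look roughly like this (thresholds and names are assumptions for illustration; the published redirect.py is the authority):

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def move_is_trusted(page, min_age_hours=48, min_edits=10):
    # Wait a while after a page move before fixing double redirects,
    # and ignore moves made by unregistered or very new accounts.
    entry = next(iter(site.logevents(logtype='move', page=page, total=1)),
                 None)
    if entry is None:
        return True  # never moved; nothing to wait for
    age = site.server_time() - entry.timestamp()
    if age.total_seconds() < min_age_hours * 3600:
        return False
    mover = pywikibot.User(site, entry.user())
    return mover.isRegistered() and mover.editCount() >= min_edits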
See DIFF. Hint: I think that came after an admin moved a redirect upon request at WP:RM/TR, where after the request to move the page was made there, another editor moved it to a different title before the admin honored the request, and the admin failed to notice that they were actually moving a redirect, not an article. Sigh, sometimes managing Wikipedia editors feels like trying to herd cats. Best, wbm1058 (talk) 19:04, 5 April 2020 (UTC)[reply]
Where is the approval for that bot task? I currently have a BRFA open for it, but your bot doesn't seem to have been approved for it --DannyS712 (talk) 19:21, 22 April 2020 (UTC)[reply]
Hello! Voting in the 2020 Arbitration Committee elections is now open until 23:59 (UTC) on Monday, 7 December 2020. All eligible users are allowed to vote. Users with alternate accounts may only vote once.
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
I reverted the redirect of the Baron Macbre (Marvel comics character) page because the data on him had gotten lost due to repeated redirects, merges, page splits, re-merges, etc. Please grant this page an exemption from your bot until such time as the merge is successful. Thank you. Blackfyr (talk) 19:06, 29 April 2021 (UTC)[reply]
Hello! It seems that none of the double-redirect fixing bots has been running for the past several days. Thought you might want to check on this. --R'n'B (call me Russ) 14:21, 10 November 2021 (UTC)[reply]
BRFA activity by month
Welcome to the eighth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Maintainers disappeared to parts unknown... bots awakening from the slumber of æons... hundreds of thousands of short descriptions... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots.
Our last issue was in August 2019, so there's quite a bit of catching up to do. Due to the vast quantity of things that have happened, the next few issues will only cover a few months at a time. This month, we'll go from September 2019 through the end of the year. I won't bore you with further introductions — instead, I'll bore you with a newsletter about bots.
Overall
Between September and December 2019, there were 33 BRFAs. Of these, 25 were approved, and 8 were unsuccessful (3 denied, 3 withdrawn, and 2 expired).
TParis goes away, UTRSBot goes kaput: Beeblebrox noted that the bot for maintaining on-wiki records of UTRS appeals stopped working a while ago. TParis, the semi-retired user who had previously run it, said they were "unlikely to return to actively editing Wikipedia", and the bot had been vanquished by trolls submitting bogus UTRS requests on behalf of real blocked users. While OAuth was a potential fix, neither maintainer had time to implement it. TParis offered access to the UTRS WMFLabs account to any admin identified with the WMF: "I miss you guys a whole lot [...] but I've also moved on with my life. Good luck, let me know how I can help". Ultimately, SQL ended up in charge. Some progress was made, and the bot continued to work another couple of months — but as of press time, UTRSBot has not edited since November 2019.
Curb Safe Charmer adopts reFill: TAnthony pointed out that reFill 2's bug reports were going unanswered; creator Zhaofeng Li had retired from Wikipedia, and a maintainer was needed. As of June 2021, Curb Safe Charmer had taken up the mantle, saying: "Not that I have all the skills needed but better me than nobody! 'Maintainer' might be too strong a term though. Volunteers welcome!"
Hi Xqt, in case you don't know, Marathi Wikipedia has completed 2 million edits, and the 2 millionth edit was made by a bot run by you, i.e. Xqbot. You may see a notice of the completion of 2 million edits on all pages of Marathi Wikipedia. The 2 millionth edit made by the bot is this: [49]. A Marathi Wikipedia administrator has also congratulated the bot on Xqbot's Marathi talk page. Congratulations on running a bot that is part of a Wikipedia milestone. ExclusiveEditor Notify Me! 12:40, 23 January 2022 (UTC)[reply]
Bots Newsletter, January 2022
[Graph unavailable: BRFA activity by month]
Welcome to the ninth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Vicious bot-on-bot edit warring... superseded tasks... policy proposals... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots.
After a long hiatus between August 2019 and December 2021, there's quite a bit of ground to cover. Due to the vastness, I decided in December to split the coverage up into a few installments that covered six months each. Some people thought this was a good idea, since covering an entire year in a single issue would make it unmanageably large. Others thought this was stupid, since they were getting talk page messages about crap from almost three years ago. Ultimately, the question of whether each issue covers six months or a year is only relevant for a couple more of them, and then the problem will be behind us forever.
Of course, you can also look on the bright side – we are making progress, and this issue will only be about crap from almost two years ago. Today we will pick up where we left off in December, and go through the first half of 2020.
Overall
In the first half of 2020, there were 71 BRFAs. Of these, 59 were approved and 12 were unsuccessful (8 denied, 2 withdrawn, and 2 expired).
January 2020
Yeah, you're not gonna be able to get away with this anymore.
A new Pywikibot release dropped support for Python 3.4, and it was expected that support for Python 2.7 would be removed in coming updates. Toolforge itself planned to drop Python 2 support in 2022.
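(Illustrative only, and not Pywikibot's actual startup code: a release that has dropped support for an old Python version might guard itself roughly like this. The exact minimum version shown is an assumption:)

    import sys

    # Hypothetical version guard for a release that dropped Python 3.4.
    if sys.version_info < (3, 5):
        raise RuntimeError('This Pywikibot release requires Python 3.5 or newer')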
On February 1, some concerns were raised about ListeriaBot performing "nonsense" edits. Semi-active operator Magnus Manske (who originally coded the Phase II software, the precursor of MediaWiki) was pinged. Meanwhile, the bot was temporarily blocked for several hours until the issue was diagnosed and resolved.
In March, a long discussion was started at Wikipedia talk:Bot policy by Skdb about the troubling trend of bots "expiring" without explanation after their owners became inactive. This can happen for a variety of reasons: API changes break code, hosting providers' software updates break code, hosting accounts lapse, software changes make bots' edits unnecessary, and policy changes make bots' edits unwanted. The most promising solution seemed to be Toolforge hosting (although it has some problems of its own, like the occasional necessity of refactoring code).
A discussion on the bot noticeboard, "Re-examination of ListeriaBot", was started by Barkeep49, who pointed out repeated operation outside the scope of its BRFA (i.e. editing pages in mainspace, and adding non-free images to others). Some said it was doing good work, and others said it was operating beyond its remit. It was blocked on April 10; the next day it was unblocked, reblocked from article space, reblocked "for specified non-editing actions", unblocked, and indeffed. The next week, several safeguards were implemented in its code by Magnus; the bot was allowed to roam free once more on April 18.
Issues and enquiries are typically expected to be handled on the English Wikipedia. Pages reachable via unified login, like a talk page at Commons or at Italian Wikipedia could also be acceptable [...] External sites like Phabricator or GitHub (which require separate registration or do not allow for IP comments) and email (which can compromise anonymity) can supplement on-wiki communication, but do not replace it.
MajavahBot 3, an impressively meta bot task, was approved this month for maintaining a list of bots running on the English Wikipedia. The page, located at User:MajavahBot/Bot status report, is updated every 24 hours; it contains a list of all accounts with the bot flag, as well as their operator, edit count, last activity date, last edit date, last logged action date, user groups and block status.
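(A minimal sketch of how such a report could be gathered, assuming Pywikibot's allusers generator, which wraps the MediaWiki list=allusers API; this is not MajavahBot's actual code:)

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    # Iterate over every account in the 'bot' group; fields other than
    # 'name' (such as 'editcount') depend on what the API returns.
    for account in site.allusers(group='bot'):
        print(account['name'], account.get('editcount'))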
In July 2017, Headbomb made a proposal that a section of the Wikipedia:Dashboard be devoted to bots and technical issues. In November 2019, Lua code was written superseding Legobot's tasks on that page, and operator Legoktm was asked to stop them so that the new code could be deployed. After no response to pings, a partial block of Legobot for the dashboard was proposed. Some months later, on June 16, Headbomb said: "A full block serves nothing. A partial block solves all current issues [...] Just fucking do it. It's been 3 years now." The next day, however, Legoktm disabled the task, and the dashboard was successfully refactored.
On June 7, RexxS blocked Citation bot for disruptive editing, saying it was "still removing links after request to stop". A couple of weeks later, a discussion on the bots noticeboard was opened, saying "it is a widely-used and useful bot, but it has one of the longest block logs for any recently-operating bot on Wikipedia". While its last BRFA approval was in 2011, its code and functionality had changed dramatically since then, and AntiCompositeNumber requested that BAG require a new BRFA. Maintainer AManWithNoPlan responded that most blocks were from years ago (when it lacked a proper test suite), and problems since then had mostly been one-off errors (like a June 2019 incident in which an LTA had "weaponized" the bot to harass editors).
David Tornheim opened a discussion about whether bots based on closed-source code should be permitted, and proposed that they not be. He cited a recent case in which a maintainer had said "I can only suppose that the code that is available on GitHub is not the actual code that was running on [the bot]". Some disagreed: Naypta said that "I like free software as much as the next person, and I strongly believe that bot operators should make their bot code public, but I don't think it should be that they must do so".
(You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
Why?
Hi! Yesterday you reverted "Restored revision 399635460 by Xqbot" on the page Drug laws, saying: "unnecessary diversion". I do not understand, and I do not see it as very legitimate, given that I had redirected to "Drug law (disambiguation)", which includes a redirection to "prohibition of drugs" anyway.
It seems more logical that "Drug law" should point to "Drug law (disambiguation)" rather than to any other page. I would be happy to understand why not. Thank you!
Teluobir (talk) 00:34, 2 March 2022 (UTC)[reply]
I'm curious as to one aspect of the bot fixing double redirects. Article "A" is moved to "B" and then is moved to "C", but with the redirect suppressed, so that "A" is now a redirect to a red link that was "B". Will the bot be able to fix that and redirect "A" to "C"? Thanks. CambridgeBayWeather, Uqaqtuq (talk), Huliva 08:05, 6 July 2022 (UTC)[reply]
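(A minimal sketch of how a bot could recover in this situation, assuming it consults the move log when a redirect target turns out not to exist; whether Xqbot actually does this is not established here. The page names "A", "B", and "C" are hypothetical:)

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    redirect = pywikibot.Page(site, 'A')
    target = redirect.getRedirectTarget()  # "B", which may be a red link

    if not target.exists():
        # Even with the redirect suppressed, the move of "B" to "C" is
        # recorded in the move log, so the destination can be recovered.
        for entry in site.logevents(logtype='move', page=target, total=1):
            redirect.set_redirect_target(entry.target_page,
                                         summary='Fixing broken redirect')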
I got an error running archivebot.py as well. When it tried to create a new archive page, it threw a NoPageError in load_page. I was able to bypass the error and continue execution by putting a try/except for NoPageError around the text = self.get() call. However, it then failed to add the header properly. My repository was a week old, so I'm not sure if it's still a problem in the latest version. -- Prod (Talk) 23:40, 30 July 2022 (UTC)[reply]
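(Sketched below is the workaround described above; the real load_page() in archivebot.py has more surrounding context, so treat this as an assumption about its shape rather than the actual code:)

    from pywikibot.exceptions import NoPageError

    def load_page(self):
        # Tolerate a missing page when a brand-new archive is being created.
        try:
            text = self.get()
        except NoPageError:
            # New archive page: fall back to empty text. As noted above,
            # the header still has to be added separately.
            text = ''
        return text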
It looks like you are not on Wikipedia much lately, but I thought I'd drop you a note about your bot. Xqbot correctly changed a redirect on Draft:Skeleton Crew (TV series) when the target page was moved, but didn't change the redirect on the talk page. Is there any way you can ensure that the bot corrects both talk pages and article/draft pages when a move happens? Thanks for your contributions and the work of your bot! Liz Read! Talk! 22:14, 10 September 2022 (UTC)[reply]
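(One way a redirect-fixing bot could catch the talk page too, sketched with Pywikibot's toggleTalkPage(); this is a suggestion of how it might work, not a description of Xqbot's actual behaviour:)

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, 'Draft:Skeleton Crew (TV series)')
    talk = page.toggleTalkPage()  # the corresponding talk page, if any
    if talk is not None and talk.exists() and talk.isRedirectPage():
        print('Talk-page redirect also needs fixing:', talk.title())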
ArbCom 2022 Elections voter message
Hello! Voting in the 2022 Arbitration Committee elections is now open until 23:59 (UTC) on Monday, 12 December 2022. All eligible users are allowed to vote. Users with alternate accounts may only vote once.
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
Would it be possible for you to implement a whitelist system in Xqbot? When WP:Sandbox was changed to a redirect, it caused a bunch of disruptive changes such as this one. A system preventing such changes to certain test pages would be helpful! Thanks ~ Eejit43 (talk) 05:00, 30 January 2023 (UTC)[reply]
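(A minimal sketch of such a safeguard; the page names are illustrative assumptions, not Xqbot's actual configuration:)

    # Hypothetical skip-list of test pages whose redirects should be left alone.
    SKIP_TITLES = {'Wikipedia:Sandbox', 'Wikipedia talk:Sandbox'}

    def may_fix_redirect(page):
        """Return False for pages the bot should never touch."""
        return page.title() not in SKIP_TITLES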