This is an archive of past discussions with User:SD0001. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
A weird problem happened with yesterday's SDZeroBot/G13 soon report. It states that there are 278 Drafts eligible on November 14, 2021 but it only lists 89 pages. There is no similar problem on today's report or on previous reports. Since I began working with SDZeroBot's reports in September 2020, this particular problem has never occurred, so I'm not sure what happened.
Is there any way you could rerun the report for November 14, 2021 so all of the eligible Drafts and User pages are listed? At this point, it has expanded beyond just me and there are several admins who utilize these lists on a daily basis.
Oh, and my fears were incorrect; as you already know, there were no end-of-the-month issues with the G13 soon list on November 1st. I guess we only have to worry when the current month is shorter than the month six months ago. We'll see what happens at the end of November! Liz Read! Talk! 03:05, 9 November 2021 (UTC)
All soon-to-be eligible pages were listed; it's the count that was wrong, which I think may have been because of database replag (the count is calculated before recently edited/deleted pages are skipped to account for replag). The error logs show nothing unusual. – SD0001 (talk) 03:48, 9 November 2021 (UTC)
Thanks for checking, SD0001, but if that's the case, then that is REALLY odd. On average, there are between 200 and 250 expiring drafts on the daily G13 soon report, but recently it has been higher, over 300 drafts several times, and on one day when some mass AWB editing was done, there were 512 expiring draft pages. In the past 15 months, I've never seen the count go below 100 drafts, so 89 drafts seems dramatically low during a period of the year when there seemed to be a surge of expiring drafts.
I used to think that the changing number of drafts throughout the year had to do with increasing and decreasing levels of draft creation, but now I think the fluctuating number has to do with changing levels of AFC reviewer activity, since it seems like many new editors don't return to drafts once they are declined in AFC reviews. More drafts reviewed => more drafts declined => more drafts that are left unedited.
So, there were 89 expiring drafts on the G13 Soon list a few days ago and today there are 520?! I guess there was more AWB editing on existing drafts that day. The number was so consistent for so many weeks that these fluctuations are really unusual. Liz Read! Talk! 00:05, 11 November 2021 (UTC)
Hello, SD0001,
Today was the day that the G13 soon list said there were 278 drafts that would be eligible for G13 deletion but only listed 89 drafts. Guess what happened? Today's G13 eligible list, which usually has 0 drafts, listed 75 drafts suddenly eligible! So, for some reason, those 75 drafts should have appeared on the G13 soon list for 11/14/2021 but didn't. But still, 89 drafts + 75 drafts doesn't add up to 278 pages, so some things are still a mystery!
But I thought I'd let you know what happened so that next time the count is unusually low, we can anticipate those extra pages showing up at 00:00 UTC on the G13 eligible list. Luckily, I think this has only happened 2 or 3 times before so the bot is still consistently reliable by my standards. Thanks again! Liz Read! Talk! 00:48, 15 November 2021 (UTC)
ArbCom 2021 Elections voter message
Hello! Voting in the 2021 Arbitration Committee elections is now open until 23:59 (UTC) on Monday, 6 December 2021. All eligible users are allowed to vote. Users with alternate accounts may only vote once.
The Arbitration Committee is the panel of editors responsible for conducting the Wikipedia arbitration process. It has the authority to impose binding solutions to disputes between editors, primarily for serious conduct disputes the community has been unable to resolve. This includes the authority to impose site bans, topic bans, editing restrictions, and other measures needed to maintain our editing environment. The arbitration policy describes the Committee's roles and responsibilities in greater detail.
BRFA activity by month
Welcome to the eighth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Maintainers disappeared to parts unknown... bots awakening from the slumber of æons... hundreds of thousands of short descriptions... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots.
Our last issue was in August 2019, so there's quite a bit of catching up to do. Due to the vast quantity of things that have happened, the next few issues will only cover a few months at a time. This month, we'll go from September 2019 through the end of the year. I won't bore you with further introductions — instead, I'll bore you with a newsletter about bots.
Overall
Between September and December 2019, there were 33 BRFAs. Of these, 25 were approved, and 8 were unsuccessful (3 denied, 3 withdrawn, and 2 expired).
TParis goes away, UTRSBot goes kaput: Beeblebrox noted that the bot for maintaining on-wiki records of UTRS appeals stopped working a while ago. TParis, the semi-retired user who had previously run it, said they were "unlikely to return to actively editing Wikipedia", and the bot had been vanquished by trolls submitting bogus UTRS requests on behalf of real blocked users. While OAuth was a potential fix, neither maintainer had time to implement it. TParis offered access to the UTRS WMFLabs account to any admin identified with the WMF: "I miss you guys a whole lot [...] but I've also moved on with my life. Good luck, let me know how I can help". Ultimately, SQL ended up in charge. Some progress was made, and the bot continued to work another couple months — but as of press time, UTRSBot has not edited since November 2019.
Curb Safe Charmer adopts reFill: TAnthony pointed out that reFill 2's bug reports were going unanswered; creator Zhaofeng Li had retired from Wikipedia, and a maintainer was needed. As of June 2021, Curb Safe Charmer had taken up the mantle, saying: "Not that I have all the skills needed but better me than nobody! 'Maintainer' might be too strong a term though. Volunteers welcome!"
(You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
Your afch-rewrite pull reqs
Hey, just thought I'd note here that I've been sorta busy IRL, but I know it's annoying to have PRs sit for a while. I'll try to get to them when I have some free time. Enterprisey (talk!) 10:28, 14 December 2021 (UTC)
Hi! I'm User:Ed6767. I'm probably best known for founding and developing Wikipedia:RedWarn, which is now one of the most used user scripts on the English Wikipedia. I couldn't help but notice that you also develop user scripts, so I was wondering if I could ask you to try and give me feedback on my new tool called EasyWikiDev. It's a new way to develop user scripts quickly and easily using Visual Studio Code, and only takes a few minutes to set up and install on your computer, whilst saving you the headache of constantly having to save edits and reload the page for every single change to your script you'd like to test. EasyWikiDev makes it so you can develop your script locally, on your own computer, and only publish the changes to your users when you are ready to - and unlike other solutions, EasyWikiDev reloads the page right away when you make a change, so you always see the latest version of your script. Plus, by using Visual Studio Code, you have access to some of the most extensive and helpful extensions and tools available to developers right now.
If you're interested, you can find the GitHub repository here and a video tutorial that shows both how to set up EasyWikiDev and how to use it (which you should watch) here. When you have tried it and would like to give feedback, or just need help, please let me know by pinging me - you'd play a big part in my goal to make user script development easier for all Wikipedians. Thanks again for your consideration, ✨ Edtalk! ✨ 12:46, 17 December 2021 (UTC)
Hey there! I'm going through a report of untranscluded templates, and {{DraftCorp}} appears to be unused and links to deleted categories. If you don't have a use for it anymore, can you please tag it with {{Db-g7}}? Thanks. – Jonesey95 (talk) 04:24, 21 December 2021 (UTC)
Thank you for your reviews of my Twinkle pull requests. Reviewing pull requests for legacy software isn't always the most fun, so I appreciate you staying on top of it. –Novem Linguae (talk) 13:31, 27 December 2021 (UTC)
See the entry for Dark Ages (Europe). The bot ranked it "GA". The author of the article has become so convinced they have a Good Article that all attempts to reason with them have failed; they are kind of bragging about it, saying how much better it is than other similar articles (it is actually a mess of POV, Original Research, CSPLIT etc. and will soon be deleted). Your bot's ranking is apparently, in their mind, an authority. They are claiming its "GA" status as a primary AfD Keep rationale. It also was a sticking point during 4 days of difficult "I can't hear you" talk page discussions. There may be a competency/commonsense problem. Maybe I don't understand the report, but would it make sense for the bot not to automatically rank articles as GA or FA? Or to add some qualifier or warning not to take it as literal truth? GreenC 06:44, 28 December 2021 (UTC)
Thanks for the note @GreenC. Reading this rather late. I'll look into adding a * mark next to "GA" and "FA" in the reports. The bot merely blurts out whatever ORES tells it, so removing GA/FA ratings altogether probably isn't a good idea. – SD0001 (talk) 12:42, 31 December 2021 (UTC)
Whatever the rating is called, I think it is valid to question a proposal to delete the higher rated article and make it a redirect to a lower rated article. ThuDauMot (talk) 05:13, 1 January 2022 (UTC)
Here we are at the end of another 30 days/31 days month change. Tonight's (December 31st) G13 soon is for July 7th expiring drafts and I'm hoping the G13 soon report for January 1st is for July 8th expiring drafts. It seems like it is difficult to make up for a lost day's report when a date is skipped so I'm posting this in advance in case there are any problems at this time tomorrow.
@Liz Yes it will be like that. Have there been any issues of late? From the page history, it looks like there is continuity now. There were 2 reports on 1 Dec 2021, and none on 1 Nov 2021, all as expected. – SD0001 (talk) 12:37, 31 December 2021 (UTC)
Well, except for this end-of-the-month problem, things have been great with SDZeroBot. It looks like SDZeroBot just skipped the G13 soon report for today, I assume because there isn't a June 31st. As long as we can pick up with the next report being for July 8th tomorrow, that will be fine. But I have the feeling that the bot will just skip this day and just pick up tomorrow with a report for July 9th.
Can you run a report later today or tomorrow to replace today's missing report for July 8th? That would be great. Happy New Year's Eve to you! Liz Read! Talk!
Okay, that's fine. It's better to be a day behind in reports than to skip a day. There are work-arounds, like using Category:AfC G13 eligible soon submissions instead, but your list is much more complete; without it we end up with a lot of extra drafts on User:SDZeroBot/G13 eligible at the end of the day. Thanks again for working out such an efficient system...it makes the old days, when we used Wikipedia:Database reports/Stale drafts, look rather chaotic. Liz Read! Talk! 18:55, 1 January 2022 (UTC)
I don't understand exactly what happened here, SD0001...typically, the G13 soon list contains around 175-225 expiring drafts each day, although there have been occasions where it topped 300 drafts. The list for July 8th has 425! But the dates seem correct; it doesn't look like the chart is for 2 days. I see that AFC was running a Backlog Drive during July, so we should expect to see larger daily counts. Still, it is a big jump from 201 drafts for today's list for July 1st. Thanks for keeping everything running smoothly, in spite of calendar variations. Liz Read! Talk! 01:01, 2 January 2022 (UTC)
I have a question and this time it's not about the G13 soon list! I've noticed that your bot notifies page creators for AFDs when the nominator fails to do so. Could the bot also do this for PRODs? I deal with PRODs more than AFDs and I've noticed that new editors and IP editors frequently fail to notify page creators when they add a PROD tag. This can be a big problem because it is perfectly okay for page creators to remove the tag before the page is deleted, but if they don't know it's been tagged, they may not even be aware that their page creation is going to be deleted.
This is really a great service for a bot to do. It would be better if the notification came from the nominator, but some notice is better than none. Thanks for considering this request. Liz Read! Talk! 05:50, 15 December 2021 (UTC)
Hi, I was working on categorising templates when I stumbled upon a redirect that is printworthy and happens to be a colloquial name too. But I'm not able to make out exactly what to put within the r cat shell. You happen to be the most recent editor of {{R from colloquial name}}. Can you please help me out? Thanks! - CX Zoom (he/him) (let's talk | contribs) 18:50, 27 January 2022 (UTC)
BRFA activity by month
Welcome to the ninth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Vicious bot-on-bot edit warring... superseded tasks... policy proposals... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots.
After a long hiatus between August 2019 and December 2021, there's quite a bit of ground to cover. Due to the vastness, I decided in December to split the coverage up into a few installments that covered six months each. Some people thought this was a good idea, since covering an entire year in a single issue would make it unmanageably large. Others thought this was stupid, since they were getting talk page messages about crap from almost three years ago. Ultimately, the question of whether each issue covers six months or a year is only relevant for a couple more of them, and then the problem will be behind us forever.
Of course, you can also look on the bright side – we are making progress, and this issue will only be about crap from almost two years ago. Today we will pick up where we left off in December, and go through the first half of 2020.
Overall
In the first half of 2020, there were 71 BRFAs. Of these, 59 were approved, and 12 were unsuccessful (with 8 denied, 2 withdrawn, and 2 expired).
January 2020
Yeah, you're not gonna be able to get away with this anymore.
A new Pywikibot release dropped support for Python 3.4, and it was expected that support for Python 2.7 would be removed in coming updates. Toolforge itself planned to drop Python 2 support in 2022.
On February 1, some concerns were raised about ListeriaBot performing "nonsense" edits. Semi-active operator Magnus Manske (who originally coded the Phase II software, the precursor of MediaWiki) was pinged. Meanwhile, the bot was temporarily blocked for several hours until the issue was diagnosed and resolved.
In March, a long discussion was started at Wikipedia talk:Bot policy by Skdb about the troubling trend of bots "expiring" without explanation after their owners became inactive. This can happen for a variety of reasons: API changes break code, hosting providers' software updates break code, hosting accounts lapse, software changes make bots' edits unnecessary, and policy changes make bots' edits unwanted. The most promising solution seemed to be Toolforge hosting (although it has some problems of its own, like the occasional necessity of refactoring code).
A discussion on the bot noticeboard, "Re-examination of ListeriaBot", was started by Barkeep49, who pointed out repeated operation outside the scope of its BRFA (i.e. editing pages in mainspace, and adding non-free images to others). Some said it was doing good work, and others said it was operating beyond its remit. It was blocked on April 10; the next day it was unblocked, reblocked from article space, reblocked "for specified non-editing actions", unblocked, and indeffed. The next week, several safeguards were implemented in its code by Magnus; the bot was allowed to roam free once more on April 18.
Issues and enquiries are typically expected to be handled on the English Wikipedia. Pages reachable via unified login, like a talk page at Commons or at Italian Wikipedia could also be acceptable [...] External sites like Phabricator or GitHub (which require separate registration or do not allow for IP comments) and email (which can compromise anonymity) can supplement on-wiki communication, but do not replace it.
MajavahBot 3, an impressively meta bot task, was approved this month for maintaining a list of bots running on the English Wikipedia. The page, located at User:MajavahBot/Bot status report, is updated every 24 hours; it contains a list of all accounts with the bot flag, as well as their operator, edit count, last activity date, last edit date, last logged action date, user groups and block status.
In July 2017, Headbomb made a proposal that a section of the Wikipedia:Dashboard be devoted to bots and technical issues. In November 2019, Lua code was written superseding Legobot's tasks on that page, and operator Legoktm was asked to stop them so that the new code could be deployed. After no response to pings, a partial-block of Legobot for the dashboard was proposed. Some months later, on June 16, Headbomb said: "A full block serves nothing. A partial block solves all current issues [...] Just fucking do it. It's been 3 years now." The next day, however, Legoktm disabled the task, and the dashboard was successfully refactored.
On June 7, RexxS blocked Citation bot for disruptive editing, saying it was "still removing links after request to stop". A couple weeks later, a discussion on the bots noticeboard was opened, saying "it is a widely-used and useful bot, but it has one of the longest block logs for any recently-operating bot on Wikipedia". While its last BRFA approval was in 2011, its code and functionality had changed dramatically since then, and AntiCompositeNumber requested that BAG require a new BRFA. Maintainer AManWithNoPlan responded that most blocks were from years ago (when it lacked a proper test suite), and problems since then had mostly been one-off errors (like a June 2019 incident in which an LTA had "weaponized" the bot to harass editors).
David Tornheim opened a discussion about whether bots based on closed-source code should be permitted, and proposed that they should not be. He cited a recent case in which a maintainer had said "I can only suppose that the code that is available on GitHub is not the actual code that was running on [the bot]". Some disagreed: Naypta said that "I like free software as much as the next person, and I strongly believe that bot operators should make their bot code public, but I don't think it should be that they must do so".
Hello! I wanted to drop a quick note for all of our AFC participants; nothing huge and fancy like a newsletter, but a few points of interest.
AFCH will now show live previews of the comment to be left on a decline.
The template {{db-afc-move}} has been created - this template is similar to {{db-move}} when there is a redirect in the way of an acceptance, but specifically tells the patrolling admin to let you (the draft reviewer) take care of the actual move.
It's been a while since I visited your talk page which, I guess, is a good thing because there have been no problems with SDZeroBot! But I thought I'd check in with you because we have another calendar jump with the G13 soon report as we go from the February 28 report to the March 1 report. We can handle the extra three days of expiring drafts okay (for August 29-31) but we really can't have SDZeroBot skip ahead three days of G13 reports and miss reports for drafts expiring in early September. Both DGG (on the part of the AFC) and I review the G13 soon reports prior to the time of deletion and having 3 missing reports would negatively affect this process.
I should say that simple 1-day mismatches (like when months are only 30 days long) seem to be handled by SDZeroBot with few problems these days, so maybe you have fixed this glitch, which has more to do with calendar irregularities than with how your bot functions. Thanks again for all of the work that both you and your bot do for the project! Liz Read! Talk! 00:31, 22 February 2022 (UTC)
I thought that calendar problem was fixed but, you know, I look at these pages every day and so I try to anticipate problems that might occur. But I will just assume from now on that this problem won't reoccur. Today's G13 soon report is for drafts that would expire February 29, 2022 but we'll just take care of them on February 28, 2022.
But you know how I said above that I hadn't posted here lately, so there must be no problems? LOL! Well, I spoke too soon. The G13 soon reports have been a little short lately, which wasn't a surprise: the AFC did a huge July 2021 Backlog Drive, so six months ago there were thousands of drafts reviewed in July (which would expire in January) as AFC reviewers caught up with the draft backlog, and we were expecting February would be much lighter. Today's report had just 150 expiring drafts (there are typically about 200-225), which was just fine, but then User:SDZeroBot/G13 eligible, which is usually empty, listed 45 drafts that were G13 eligible today but didn't show up on the G13 soon report! That's unusual...it's happened before but it doesn't occur very often, as you can see in the page history. Usually the only time drafts appear on the G13 eligible page is because they were missed by myself, Explicit or one of the editors who helps out with G13s. Any idea what might have happened with SDZeroBot? It's not a big deal unless it starts happening more frequently. And even then, it's not urgent; it's just that we like to delete expiring drafts over the course of the day & night rather than all at once. But it's curious. Thanks! Liz Read! Talk! 01:55, 23 February 2022 (UTC)
@Liz I see pages on today's eligible report had last edits on 22 August. If we look at the soon report for 22 August, the entries are there, but unfortunately there was a formatting problem, phew! The raw wikitext of the links is all inside the Excerpt field of User:Rockpromo/sandbox. This happened because the generated excerpt for the sandbox page had an unclosed <nowiki> tag. I made a change some time back to keep unclosed <ref> tags out of excerpts (which too can affect the formatting of the rest of the page) because they were more common. I'll take a look at whether nowikis could be handled the same way. – SD0001 (talk) 04:00, 23 February 2022 (UTC)
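For illustration only, the kind of guard described above could be sketched roughly like this (hypothetical helper, not SDZeroBot's actual code):

```javascript
// Hypothetical sketch: drop everything from an unmatched opening
// <nowiki> or <ref> tag onwards, so a generated excerpt can't swallow
// the rest of the report page's wikitext formatting.
function sanitizeExcerpt(excerpt) {
    for (const tag of ['nowiki', 'ref']) {
        // Count opening tags like "<ref>" or "<ref name=..>" and closing "</ref>"
        const opens = (excerpt.match(new RegExp(`<${tag}[\\s>]`, 'gi')) || []).length;
        const closes = (excerpt.match(new RegExp(`</${tag}\\s*>`, 'gi')) || []).length;
        if (opens > closes) {
            // An unclosed tag is present: truncate at the last opening tag.
            const idx = excerpt.toLowerCase().lastIndexOf(`<${tag}`);
            excerpt = excerpt.slice(0, idx) + ' ...';
        }
    }
    return excerpt;
}
```

Balanced tags are left untouched; only an excerpt with more opens than closes gets truncated.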
Everything worked out perfectly, SDZeroBot issued 4 G13 soon reports today so we won't miss a day of reports. Thanks! LizRead!Talk!03:31, 1 March 2022 (UTC)
Hello, SD0001. This is a bot-delivered message letting you know that Draft:Amazon MemoryDB, a page you created, has not been edited in at least 5 months. Drafts that have not been edited for six months may be deleted, so if you wish to retain the page, please edit it again or request that it be moved to your userspace.
If the page has already been deleted, you can request it be undeleted so you can continue working on it.
In accordance with our policy that Wikipedia is not for the indefinite hosting of material deemed unsuitable for the encyclopedia mainspace, the draft has been deleted. If you plan on working on it further and you wish to retrieve it, you can request its undeletion. An administrator will, in most cases, restore the submission so you can continue to work on it.
Now, I didn't come this week and post here about the fact that September (6 months ago) only had 30 days and March has 31 days because SDZeroBot had all this stuff worked out, right? But SDZB skipped the G13 soon report tonight. I guess this is because it doesn't want to get a day ahead of itself? As long as there is a report for every day (today's report would be for October 8th), then it doesn't really matter whether it is issued today or tomorrow. But I thought I'd check in anyway because I'm a creature of habit and you are so nice about all of the questions I've come to you with in the past. I hope that your life is going well! Liz Read! Talk! 01:06, 1 April 2022 (UTC)
Yeah you're right – it doesn't want to get a day ahead of itself. The October 8 report would be there tomorrow. Enjoy your day off :) – SD0001 (talk) 03:30, 1 April 2022 (UTC)
Hot articles on high-risk templates
Hello! I noticed that Wikipedia:WikiProject Anarchism/Hot articles (edit | talk | history | links | watch | logs) was not updating and went to its history to figure out why, only to discover that it has extended confirmed protection because it is included in the project's talk page banner and is highly transcluded (high risk). Would the better route be to request the permissions for the bot or to lower the template's protection? Alternatively, if those are too complicated, I could remove it from the WikiProject banner, but I'd like to think it's helpful there, so that would be the last resort. czar 05:23, 27 March 2022 (UTC)
It's only extconf protected so the bot should be able to edit it (as the bot user group includes extendedconfirmed right). Not sure why there are no edits. I don't see anything in the output or error logs. – SD0001 (talk) 13:05, 27 March 2022 (UTC)
It does appear to be permissions-related. I removed its "high-risk" transclusion and undid the permission changes, and it started right back up. czar 04:14, 2 April 2022 (UTC)
Hey, Draftify Watch didn't update yesterday so I just took a look in the log page and it appears it ran into an issue regarding a spam blacklisted link. If you're already aware, sorry just ignore this. Curbon7 (talk) 01:18, 6 April 2022 (UTC)
Hi! Sorry to bug with tech support questions, but I'm a bit stuck. I've made, by this point, a basically fully-operational script for DYK to implement a rule change (replacing the old QPQ checker). I have a tool account on toolforge, but I'm not sure how to implement lighttpd to host the script publicly? If you know of any resources to point me towards, that'd be appreciated. Thanks in advance! theleekycauldron (talk • contribs) (she/they) 18:21, 13 April 2022 (UTC)
@Theleekycauldron Put the source code in the public_html directory. Create an index.html file which will be read by toolforge as the main page of the tool, on which you can place the <script> tag to load the javascript. If that doesn't make sense or you need further help, point me to your code and I can explain what to do. – SD0001 (talk) 19:08, 13 April 2022 (UTC)
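To make the suggestion above concrete, a minimal `public_html/index.html` might look something like this (the script filename and page title are made up for illustration; adjust to the actual tool):

```html
<!-- Illustrative index.html placed in the tool's public_html directory.
     Toolforge serves this as the tool's main page; the <script> tag
     loads the JavaScript file, also placed in public_html/. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>QPQ checker</title>
</head>
<body>
  <div id="app"></div>
  <script src="qpq-check.js"></script>
</body>
</html>
```

Paths in the `src` attribute are resolved relative to `public_html/`, so keeping the script alongside the index page is the simplest layout.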
Recognising that you're a busy individual, just wanted to note that you've either missed or forgotten about a few pings at this BRFA, so whenever is convenient for you please respond. Primefac (talk) 14:18, 27 February 2022 (UTC)
@Primefac oh, sorry for the much delayed response. As the editor is back, my comment there is somewhat moot. I have not reviewed the latest trial edits – I don't think I'm the best person to approve this, as I became a bit involved over the course of initial trials. Feel free to take it over! – SD0001 (talk) 12:40, 23 April 2022 (UTC)
Wikimedia Code Review - Gerrit
Hello SD! Some time ago you helped me learn the basis of contributing in Gerrit for Wikimedia and solving a localization problem I had with the special pages in my community. The commit was merged some days ago and now I'm waiting for deployment. Motivated by that, I was wondering if there were other possibilities I could get involved more, reviewing other changes and basically helping around there. Do you have any advice on "how to start" or maybe some easy tasks to review or small details where I can help around? I understand if my request might feel rather odd so don't worry much about the answer. :) - Klein Muçi (talk) 02:00, 24 January 2022 (UTC)
Yes, but they're not without issues. For instance, the 2nd link includes a unix command that won't work on Windows. What OS do you have? If Windows, I believe using WSL2 is recommended nowadays, though I think MediaWiki can be slow if the files are on a Windows partition (and if you put them on the Linux partition, then the code editor becomes slow instead) – not sure though, as it's been a while since I used Windows. If you're not using Windows, you're in luck. – SD0001 (talk) 04:09, 25 January 2022 (UTC)
Okay then. I'll try setting up my developing environment in Linux soon. If i get stuck somewhere, I'll come here, maybe you can help me. Thank you! :) - Klein Muçi (talk) 09:31, 25 January 2022 (UTC)
After some errors I was able to install docker and docker-compose and I verified they're working properly by checking their versions and downloading some images and playing a bit around. (This is the first time I ever deal with docker though.)
Now I'm stuck at the next step: Prepare .env file.
It says create a .env file in the root of the MediaWiki core repository but I'm not really sure what that means. What should the name be (what is its exact purpose) and where exactly should it be located (what command do I use to get there)?
Also, should this line be kept unchanged: MEDIAWIKI_USER=Admin?
And finally, unrelated specifically to this but, when using sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu bionic stable", I get this error. I only started getting this after I had to delete something after some errors in the first time. Any idea what I've done wrong and how to correct it? - Klein Muçi (talk) 03:30, 26 January 2022 (UTC)
You'd need to figure out things like that via google and stackoverflow and so on – searching for .env files brings up https://docs.docker.com/compose/env-file/ as the 3rd result. Are you using linux mint? I couldn't see any relevant search results but you'd probably get something useful with a specific search like "add-apt-repository on linux mint gives additional-repositories.list not found error" – SD0001 (talk) 04:02, 26 January 2022 (UTC)
Oh, so it is a literal ".env" file. I thought I had to create it with a name like: Something.env
The only question now is where do I create that? What does "root of the MediaWiki core repository" mean in simpler language? In the MediaWiki directory? Or somewhere here somehow? - Klein Muçi (talk) 11:15, 26 January 2022 (UTC)
The directory to which you cloned mediawiki – the one which contains the docker-compose.yml file. This directory should be named "mediawiki", (not "core"), so that you can later install extensions under mediawiki/extensions/. (core/extensions/* would also work in theory but just sounds awkward). – SD0001 (talk) 13:47, 26 January 2022 (UTC)
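For reference, the file itself is just a plain-text list of KEY=value lines. A sketch of what it might contain, loosely following MediaWiki's Docker development quickstart at the time (values here are illustrative; double-check against the current docs):

```ini
# Illustrative .env in the root of the cloned mediawiki/ directory
# (the same directory that contains docker-compose.yml).
MW_SCRIPT_PATH=/w
MW_SERVER=http://localhost:8080
MW_DOCKER_PORT=8080
MEDIAWIKI_USER=Admin
MEDIAWIKI_PASSWORD=dockerpass
XDEBUG_CONFIG=
# On Linux, the docs also suggest setting the UID/GID so files created
# inside the container are owned by your user, e.g.:
# MW_DOCKER_UID=1000
# MW_DOCKER_GID=1000
```

Keeping `MEDIAWIKI_USER=Admin` unchanged is fine for a local development wiki; it just sets the name of the admin account the installer creates.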
@SD0001, thank you! I prepared the .env file and then created a docker-compose.override.yml file in the mediawiki directory. But, when I wanted to start the containers, I got an error. I gave the command while being in the mediawiki directory and got this error. What am I doing wrong? - Klein Muçi (talk) 21:51, 26 January 2022 (UTC)
I was able to fix that error. (I had downloaded an older version.) Things went smoothly after that (got MediaWiki on localhost:8080) and I also downloaded Visual Studio Code. A bit of a silly question, but what would be the next step after this?
How does one identify problems and usually go about solving them? I mean, I understand the whole Phabricator/Gerrit thing theoretically but I've never dealt with a situation like that before (besides the one you helped me with) and any kind of practical advice would be helpful. As I said in the beginning, I understand if this sounds odd as a question and you might not really have an answer for it. - Klein Muçi (talk) 02:59, 27 January 2022 (UTC)
Pick your favourite extensions and look at their phab workboards to see if there's anything you find interesting. Extensions work primarily via hooks – so the first place to look for understanding the code of any extension would be the file in the includes/ directory having "Hooks" in the name. These are usually self-explanatory, e.g. the onPageSaveComplete hook allows an extension to execute code every time any page is saved. MW core itself does not use hooks, though, so its code is generally harder to modify. (Also make sure your code editor is set up so that it allows you to navigate to a class/function definition by clicking on where it's used, provides autocomplete suggestions, etc.) – SD0001 (talk) 14:18, 27 January 2022 (UTC)
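For readers unfamiliar with the mechanism: MediaWiki's hook system is essentially an observer pattern. Extensions register handlers for named hooks, and core runs all registered handlers at the right moment. A language-neutral sketch in Python (the real thing is PHP; names here are simplified for illustration):

```python
# Toy observer-pattern sketch of MediaWiki-style hooks.
# Extensions register handlers for a named hook; "core" runs the
# hook at the appropriate moment, e.g. after every page save.
hooks = {}

def register(name, handler):
    """An extension registers a handler for a named hook."""
    hooks.setdefault(name, []).append(handler)

def run_hook(name, *args):
    """Core fires the hook, calling every registered handler."""
    for handler in hooks.get(name, []):
        handler(*args)

# A hypothetical extension's handler, analogous to onPageSaveComplete
saved_pages = []
register("PageSaveComplete", lambda title: saved_pages.append(title))

# Core fires the hook whenever a page is saved
run_hook("PageSaveComplete", "Test")
print(saved_pages)  # ['Test']
```

This is why extension behaviour is easy to bolt on at hook points, while changing behaviour core doesn't expose through a hook requires editing core itself.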
Thanks a lot! I think I'll give it a try with a task from here. If I manage to solve anything, maybe I'll add you as a reviewer in Gerrit. I'm assuming you'll have the same name. Thank you one more time! Really grateful. :) - Klein Muçi (talk) 00:50, 28 January 2022 (UTC)
I'm experimenting around with the various Mediawiki files and folders and I was trying the PHP update script. Every time I use it, I get this. Is that supposed to be normal? What are those 8 listed entries? - Klein Muçi (talk) 03:57, 29 January 2022 (UTC)
update.php mainly just runs the db schema upgrades – which of course can't happen if it can't connect to the db. Do you have the wgDB* credentials in LocalSettings correctly configured? Are there any db errors while viewing pages? The folks in the MediaWiki discord (https://discord.gg/ZrV2Ex9) may be better positioned to help with this. – SD0001 (talk) 11:14, 29 January 2022 (UTC)
I haven't touched the file besides adding the WikiLove extension, just to test out how "things" (extensions/skins) can be added. I have this. - Klein Muçi (talk) 11:48, 29 January 2022 (UTC)
If it still isn't working, consider using a MySQL/MariaDB database. You can set one up using docker, and specify its credentials in LocalSettings. – SD0001 (talk) 11:42, 30 January 2022 (UTC)
I tried using docker-compose exec mediawiki php maintenance/update.php following the instructions here and I get this. It seems to be working fine, no?
Yes, that's working fine. So you were running it earlier without using docker? Running any maintenance script directly in a linux terminal will not work. Your sqlite database is inside the docker container and can't be accessed from outside (unless ...). You need to do it inside the container – docker-compose exec mediawiki bash gets you a shell inside the container – and then you can do php maintenance/update.php (the command you mention is the shortcut for this). Some commands like composer install and composer test may work when run outside – but if your local php version is different from what's there inside docker, it could result in subtle bugs. – SD0001 (talk) 13:34, 30 January 2022 (UTC)
Hello back! I was working with some good first tasks at Phab and I noticed that I've started encountering conflicts. This is normal, but do you have any advice on how to handle that? For example, in this small change, what would I need to do, if anything, about the current conflicts that are listed there? Do I follow the links and solve them manually somehow? Or is that considered rude and bad manners? As usual, I'm just starting so any kind of information would be appreciated. - Klein Muçi (talk) 13:27, 7 February 2022 (UTC)
You don't have to do anything. Depending on which patch gets merged first – the conflicts will have to be fixed in the other. Also don't +1 your own patches. – SD0001 (talk) 14:43, 7 February 2022 (UTC)
Ah, I see. I don't do that, but I saw that it came with an automatic comment along the lines of "it looks good to me, maybe someone else can check it" and given that it was just a simple change of characters, not code per se, I thought it would be a good and acceptable occasion to do that. Apparently it almost never is. :P Thank you! :) - Klein Muçi (talk) 15:28, 7 February 2022 (UTC)
Oh, and also, what about the "topic" content? What are some examples for good topics? I ask because I thought that would be the first line of the commit message but apparently it gets put aside and the description, which is optional, is what becomes the commit message. The topic in my task above seems strange now.
Hey there! I'm back with a new question. I was using what you helped me set up months ago to test something. I'm still grateful for all the help you provided on that. I've learned everything I know about Gerrit from you.
I just created 2 articles in "my Wikipedia". Test and Test2. Do created articles appear as files in the mediawiki folder in my laptop? If so, where? If not, where do they exist? And how do I delete them? - Klein Muçi (talk) 04:34, 24 March 2022 (UTC)
I just checked on my local wiki – I can see the text in the text table, for each revid. You can also see your local database using a tool like MySQL Workbench or https://sqlitebrowser.org/ if your db backend is sqlite. In WMF production they are of course stored differently – the text table just contains a pointer to a blob in external storage which stores the content in compressed form (see mw:Manual:External_Storage). In any case, editing/deleting things directly from the database or filesystem would NOT be safe – for instance when you delete a page, references from pagelinks, categorylinks and many other tables would need to be cleared. For all of that to happen you have to use the UI or API or some maintenance script. – SD0001 (talk) 05:27, 24 March 2022 (UTC)
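The warning above – that deleting rows by hand leaves dangling references in other tables – can be illustrated with a toy version of the schema. This is a hedged sketch in Python with SQLite; the table and column names are simplified stand-ins, not MediaWiki's real schema:

```python
# Toy illustration of why hand-deleting a page from the database is
# unsafe: other tables (here a simplified "pagelinks") still reference
# it. Simplified schema, not MediaWiki's actual one.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE text (old_id INTEGER PRIMARY KEY, old_text TEXT)")
db.execute("CREATE TABLE pagelinks (pl_from INTEGER, pl_target TEXT)")

# "Save" a revision of the page Test, which links to Test2
db.execute("INSERT INTO text VALUES (1, 'Hello, this links to [[Test2]]')")
db.execute("INSERT INTO pagelinks VALUES (1, 'Test2')")

# Deleting only the text row leaves an orphaned pagelinks entry behind
db.execute("DELETE FROM text WHERE old_id = 1")
dangling = db.execute(
    "SELECT COUNT(*) FROM pagelinks WHERE pl_from = 1"
).fetchone()[0]
print(dangling)  # 1 -- a link row pointing at a page that no longer exists
```

Going through the UI, API, or a maintenance script lets MediaWiki clean up all the dependent tables together.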
My db backend is sqlite and I have sqlitebrowser (maybe I should give MySQL Workbench a try too), but if I'm not supposed to edit the database directly, what would be the normal way of doing it? I tried manually going to the delete link: https://local host link/w/index.php?title=Article'sTitle&action=delete but it said I didn't have permission to do that because I wasn't an admin. How do I make myself a crat? - Klein Muçi (talk) 13:40, 24 March 2022 (UTC)
The user account created during MW installation will have admin, crat and intadmin rights. You can log in to that account to add rights to other accounts. – SD0001 (talk) 15:31, 24 March 2022 (UTC)
I went to Special:ListUsers and you were right. A bit of a noobish question but how do I login to that account? What's the password? - Klein Muçi (talk) 17:01, 24 March 2022 (UTC)
You would have set its password during MW installation :) If you don't remember it, you can reset its password or create a new account with rights using maintenance/createAndPromote.php script. – SD0001 (talk) 17:50, 24 March 2022 (UTC)
Can you perhaps show me where the manual page describing this step is? If I read it, maybe it will remind me of what I put. Also, shouldn't I be able to locate its password somewhere in the database, given that I supposedly have full control of everything? - Klein Muçi (talk) 18:07, 24 March 2022 (UTC)
There's mw:Manual:CreateAndPromote.php, though you can also figure out what most maintenance scripts do by looking at their code (the constructor will have descriptions of the params). Passwords are stored in the database after being salted and passed through one-way cryptographic hash functions. You can see the hashed values in the user table, but it is not possible to figure out the original password from them. – SD0001 (talk) 19:30, 24 March 2022 (UTC)
Thanks a lot for the extra information. I had no knowledge of cryptography whatsoever besides a basic understanding of what SSH is and how it works. One last basic question out of curiosity:
I imported some skins and extensions, created and deleted some articles, created and promoted some users... Basically, I did some stuff. Now I'd like a clean start. How do I do that? All histories of everything deleted, all the imported things removed – basically a "factory reset". What's the easiest way to achieve that? - Klein Muçi (talk) 03:29, 25 March 2022 (UTC)
Hey, SD! Recently I've been doing some work in Gerrit, mostly just small typo code fixing to familiarize myself with it, and I was wondering some things.
Whenever a reviewer reviews my patches, JenkinsBot comes to do some tests in them. Is there any way to "invite the JenkinsBot" for its tests before reviewers come? Can I set it as a reviewer or is that something only +2 users can do?
Also, let's say I'm fixing these typos in the first entry here. If you scroll down a bit, you'll see that most likely, just for the first entry, I'd have to patch around 30 files in the same directory which have more or less the same content, making the same change in each. What I'm currently doing is opening and changing them one by one on the web interface and then publishing that change. Is there any faster way to approach these kinds of situations? It kinda feels like doing a bot job. Are there bots in Gerrit doing these kinds of jobs? - Klein Muçi (talk) 18:47, 4 May 2022 (UTC)
You can add yourself to the CI allowlist so that jenkins-bot automatically runs tests for your patches. You can just check out the repo locally and use project-wide find-and-replace, which most code editors have. – SD0001 (talk) 19:59, 4 May 2022 (UTC)
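"Project-wide find-and-replace" means replacing a string across every file in a checked-out repository at once, instead of file by file in the web UI. A scripted Python sketch of the same idea (paths, file names, and the example typo are made up for illustration; an editor like VS Code does this interactively):

```python
# Sketch of a project-wide find-and-replace on a local checkout:
# walk the tree, rewrite every matching file, report how many changed.
from pathlib import Path
import tempfile

def replace_in_tree(root, old, new, pattern="**/*.txt"):
    changed = 0
    for path in Path(root).glob(pattern):
        text = path.read_text()
        if old in text:
            path.write_text(text.replace(old, new))
            changed += 1
    return changed

# Demo on a throwaway directory containing the same typo in several files
root = tempfile.mkdtemp()
for i in range(3):
    Path(root, f"file{i}.txt").write_text("teh quick brown fox")

changed = replace_in_tree(root, "teh", "the")
print(changed)                            # 3
print(Path(root, "file0.txt").read_text())  # the quick brown fox
```

After a change like this, all 30-odd edited files go into a single commit, which Gerrit treats as one patch.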
Can you show me an example of what you actually mean with project-wide find-and-replace? If I checkout the repo locally, I believe I can't use the web interface for Gerrit anymore, can I? - Klein Muçi (talk) 00:11, 5 May 2022 (UTC)
I've never used the gerrit web interface. Isn't that just for those who haven't figured out how to set up git review? VS Code has a search option in the sidebar which searches across a repository, and it can replace too. – SD0001 (talk) 03:19, 5 May 2022 (UTC)
Thank you!
Most likely. I've used git review in the past but it felt extremely tedious. It took me around an hour to push a simple 1-2 bit change (and I was just using the sandbox). I currently lack the courage to clone and edit things in the bash interface, so I was happy when I found the web interface. I suppose eventually I'll have to learn how to utilize that as well. If you have any pro tips in that regard, I'd appreciate reading them, but I doubt there are any. I've already read everything there is on MediaWiki. I believe it's just a matter of my brain getting used to it. - Klein Muçi (talk) 12:04, 5 May 2022 (UTC)