This is an archive of past discussions with User:TCN7JM. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Wikibooks will get access to language links via Wikidata on February 24th. Coordination is happening at d:Wikidata:Wikibooks.
Roughly 17,000 of the candidate articles that Google identified as potentially covering the same topic but lacking a language link have been merged. About 17,500 remain and are waiting for you to go through them via https://tools.wmflabs.org/yichengtry/
Did more work on Capiunto to get it into a state where it can be deployed to Wikimedia sites.
Mourned Titan (phabricator:T88550) after its developers were acquired. Began evaluating which graph database to use instead for a Wikidata query service (mw:Wikibase/Indexing)
Further work on header section redesign
Tweaks to the sitelink section
Worked on making monolingual text datatype accept more languages
Finished showing the language when a language fallback is used
Started looking into fixing the existing guided tours after API changes in the Guided Tours extension
Worked on implementing Lua convenience functions for rendering arbitrary Snaks. This is useful for displaying references or qualifiers (phabricator:T76213); a sketch follows this list.
The templates for adding badges (good article, featured article, etc.) to articles' sidebars on Wikipedia are rapidly being removed in favor of getting that information from Wikidata \o/ German Wikipedia has even deleted those templates already. English Wikipedia seems to be getting close.
We are hiring! Passionate about Wikidata and know your way around JavaScript and co.? Apply!
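A minimal sketch of how such a convenience function might be used from a wiki module, assuming the renderSnaks name discussed in phabricator:T76213 and a purely illustrative property P580:

 -- Sketch: render the qualifiers of a statement as text.
 -- P580 is a hypothetical example property.
 local p = {}

 function p.qualifiers( frame )
     local entity = mw.wikibase.getEntityObject()
     if not entity or not entity.claims or not entity.claims.P580 then
         return ''
     end
     local statement = entity.claims.P580[1]
     if not statement.qualifiers then
         return ''
     end
     -- renderSnaks takes the qualifier (or reference) Snak serialization
     -- and returns it formatted in the wiki's content language
     return mw.wikibase.renderSnaks( statement.qualifiers )
 end

 return p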
We'll spend the next week working in Berlin with Nik and Stas from the WMF to move queries forward
Worked on implementing a Lua interface for arbitrary Snak rendering. This can be used to render e.g. qualifiers or references in Wikipedia infoboxes.
Did groundwork for Lua convenience functions that render data in the user's interface language rather than the content language (relevant only for multilingual wikis such as Commons or Wikidata)
Did further work on making the Lua interface code nicer and sharing code with the parser functions (a comparison sketch follows this list)
Removed input method selector in the sitelink input as it was hiding the actual input and not very useful there
Fixed some issues where diff views showed new data instead of old data
Created a few scripts to make it easier for 3rd parties to install Wikibase
Fixed the Icinga notification that checks whether dispatch lag is getting high
Investigated several issues regarding storage of time values and started fixing them
Fixed editing of qualifiers
Fixed most browser tests after introduction of new header design
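As a rough illustration of that shared code path, here is what the same lookup could look like from both sides; this is a sketch assuming the entity:formatPropertyValues method from the client Lua library, with P36 purely as an example property:

 -- Wikitext parser function:  {{#property:P36}}
 -- Lua equivalent, which should render identically since both
 -- share the same formatting code (P36 is just an example):
 local p = {}

 function p.capital( frame )
     local entity = mw.wikibase.getEntityObject()
     if not entity then
         return ''
     end
     -- formatPropertyValues returns a table; the rendered text is in .value
     return entity:formatPropertyValues( 'P36' ).value
 end

 return p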
That's it, the first round is done, sign-ups are closed and we're into round 2. 64 competitors made it into this round and are now broken into eight groups of eight. The top two of each group will go through to round 3, along with the 16 top-scoring "wildcards" across all groups. Round 1 saw some interesting work on some very important articles, with round leader Freikorp (submissions) owing most of his 622 points to a Featured Article on the 2001 film Final Fantasy: The Spirits Within, which qualified for a times-two multiplier. This is a higher score than in previous years: Godot13 (submissions) had 500 points at the end of round 1 in 2014, and our very own judge, Sturmvogel_66 (submissions), led round 1 with 601 points in 2013.
In addition to Freikorp's work, some other important articles and pictures were improved during round one. Here's a snapshot of a few of them:
Rodw (submissions) developed an extremely timely article to Good Article status, taking Magna Carta there some 800 years after it was first sealed;
And last but not least, Godot13 (submissions) (FP bonus points) worked up a number of Featured Pictures during round 1, including the 1948 one Deutsche Mark (pictured right), receiving the maximum bonus due to the number of wikis the related article appears in.
You may also wish to know that The Core Contest is running through the month of March. Head there for further details - they even have actual prizes!
Changed a number of Lua modules to use mw.wikibase.getEntityObject instead of the deprecated mw.wikibase.getEntity (see the migration sketch after this list)
Updated the JSON documentation
Always link to Wikidata on client pages that don't have any langlinks. This affects users without JavaScript and logged-out users; logged-in users will still see the link item dialog (gerrit:168632)
Fixes for the Wikibase QUnit Jenkins job
Made Vagrant git-update also properly update Wikibase and dependencies
Fix for phabricator:T88254 (malformatted Wikidata entries appearing in watchlist RSS feeds on client wikis)
Final touches on new header design
Investigated how we can also provide language fallback in suggestions when searching for or adding new statements
More work on allowing additional languages in monolingual text datatype
Added missing backend piece for quantities with units. Now the remaining piece is the user interface.
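For module authors, the getEntity migration mentioned above is usually a one-line change; a minimal before/after sketch, assuming a module that only reads plain fields such as entity.claims:

 -- Before (deprecated):
 local entity = mw.wikibase.getEntity()

 -- After:
 local entity = mw.wikibase.getEntityObject()

 -- Both return the entity connected to the current page (or nil),
 -- so code reading fields like entity.claims keeps working unchanged.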
Hello. You recently wrote "The GAN queue for road articles has been pretty slow recently, and most of the reviews are done within the project. People like you from outside the project reviewing them helps to speed up the queue, while also making sure the articles make sense to people who aren't as knowledgeable on roads". With that in mind, could you pass your expert eye over this article and see what else could be done about it? There are a few bits still unsourced (most of which is not what I'd call "information challenged or likely to be challenged") and it could do with some more political history ... what else, I wonder? Ritchie333 (talk) (cont) 18:40, 6 March 2015 (UTC)
Road articles are my forte, but – being completely unfamiliar with London's political or transportation history – I'm not sure how much I can help with this one aside from a few general/minor things. I can't seem to locate a standards page for WP:UKRD, so I'm not sure if/how the standards for this article differ from other WP:HWY articles (specifically those in WP:USRD, my fortissimo, if you will). Does a normal UKRD junction list look like the one in this article? Finally, a cursory read shows there are units missing conversion templates. I added one in the lead, but there are more yet to be added. TCN7JM 06:38, 7 March 2015 (UTC)
Another one of the standards I was unsure of. I remember some British editors being adamant about using coordinates a while ago, so while I'd prefer KMLs on all road articles, I don't know if the consensus is for that in the UK. Could you clear it up for me? TCN7JM 07:06, 7 March 2015 (UTC)
If memory serves me right, KML is a worldwide thing, but I don't think it's been done much outside the US/Canada/Australia before. I'll try and think about it, or maybe another talk page stalker can help since it was so long ago... --Rschen7754 07:13, 7 March 2015 (UTC)
Well, somewhere buried on a hard drive I have some OpenStreetMap (OSM) traces of various bits of the NCR that I think I can convert to KML; it depends what people's requirements are. What would be really useful is something with traffic figures, which I have seen done elsewhere on the web. For junctions, a typical GLR traffic bulletin would say something like "heavy queues on the North Circular at the moment eastbound between Brent Cross, that's the A41, and Henlys Corner, that's the A1, due to a broken down lorry in the inside lane", so I'd include all of that (minus the broken-down lorry, of course). Regarding transport history, very simply put: since WWII there's been no money, Ken Livingstone hates cars, and Boris Johnson is more interested in buses, cyclists and taxis. Still, there are a ton of news sources to mine, so plenty of room to add content. Ritchie333 (talk) (cont) 15:42, 7 March 2015 (UTC)
Wikidata weekly summary #148
Here's your quick overview of what has been happening around Wikidata over the last week.
Lots of improvements around MixNMatch: it has a new catalog overview page, ~340K IDs have been matched with it so far, and there is now an FAQ for institutions wanting to get their identifiers linked in Wikidata
The first screenshots of the primary sources tool, which will help with migrating data from Freebase and enriching it with references, have been leaked ;-): 1 and 2