User Details
- User Since: Sep 3 2015, 10:55 AM
- Availability: Available
- LDAP User: Unknown
- MediaWiki User: Envlh
May 19 2023
Migration done:
- crontab replaced by scheduled jobs
- webservice upgraded to php7.4
Feb 27 2023
I'm not sure there will be exactly one URL, but there should be only one subdomain (formations.wikimedia.fr). I'm awaiting confirmation from @mickeybarber.
Sep 27 2022
I can reproduce with Firefox 105.0.1.
Apr 12 2022
Lowering the priority as the main issue is now dealt with.
Apr 6 2022
We're working on this (as volunteers, so resolution will not be immediate).
Jan 11 2022
Hi! I checked the example (and another one with the Infobox Biographie2), both on web and on the iOS app. It seems fixed. Thank you!
Jun 3 2021
This issue also impacts account recovery: when you have lost your password, no email is sent when you try to reset it.
May 19 2021
The issue is not limited to the Cyclists Infobox. It impacts every infobox that uses the Infobox module on the French Wikipedia, including Infobox Biographie2, which is used in nearly 300,000 articles. I left a message on the talk page of the module. I don't know whether this should be fixed in MediaWiki or in the Infobox module, but the priority should probably be higher.
Nov 15 2020
The copy of Denelezh's MySQL data files is 365 GB; the denelezh schema is 324 GB. I'm working on setting up a local MySQL server to restore it (carefully, as this is the only backup at the moment). One idea is to export only a subset of it, so it will be easier to transfer to Humaniki's server and to restore.
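A minimal sketch of the subset-export idea, assuming the backup has been restored into a local MySQL server and that mysqldump is available; the table names and connection details are placeholders, not the actual denelezh layout:

```python
# Export a subset of the restored denelezh schema, one mysqldump file per table.
# Table names and credentials are placeholders; authentication is assumed to be
# configured elsewhere (e.g. in ~/.my.cnf).
import subprocess

TABLES = ["human", "label", "statistics"]  # hypothetical subset of tables

for table in TABLES:
    with open(f"denelezh_{table}.sql", "w") as out:
        subprocess.run(
            [
                "mysqldump",
                "--single-transaction",  # consistent snapshot without locking the schema
                "--host", "127.0.0.1",
                "--user", "denelezh",
                "denelezh",              # schema (database) name
                table,
            ],
            stdout=out,
            check=True,
        )
```

One file per table also keeps each transfer to Humaniki's server small enough to retry individually if a copy fails.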
Oct 2 2020
On Denelezh, I installed MySQL using Oracle's apt repository: https://dev.mysql.com/downloads/repo/apt/
Mar 17 2020
A project grant proposal has been opened to carry out this project. Feedback is of course welcome :)
Oct 14 2019
@Maximilianklein I sent you an email about this topic.
Oct 1 2019
Started to work on this with @Seb35. I should be able to deliver a merge request on Saturday or Sunday.
Sep 3 2019
Just a kind reminder that I reported this issue at Wikimania 2017.
Aug 27 2019
Lightning talk at the French WikiConvention (the third one): https://meta.wikimedia.org/wiki/WikiConvention_francophone/2019/Programme/pr%C3%A9sentations_%C3%A9clair_communaut%C3%A9_et_outils
Aug 18 2019
Link to Denelezh Git: https://framagit.org/wikimedia-france/denelezh-core
Aug 7 2019
From @Magnus in the Wikidata Telegram channel: "I restarted it, someone please close the phab issue".
Jul 19 2019
Thank you all! :)
Jul 7 2019
Thank you all! Using Wikidata Toolkit, I was able to load the dump generated on 2019-07-04 :)
Jul 3 2019
@TheDatum @ArielGlenn Thank you for the clarification!
Sorry to reopen this bug, but it seems that the new dumps still have the .not extension:
https://dumps.wikimedia.org/wikidatawiki/entities/20190701/
Jun 25 2019
Thanks for the ping! I don't use RDF dumps at the moment, and I'm fine with this change.
May 9 2019
I did not have the opportunity to test the fix before it was reverted, but I totally agree with @Nikki.
Apr 28 2019
Thanks for the ping! I don't use this table either, but it seems like a huge and well-prepared piece of work. Good luck :-)
Apr 23 2019
Thank you for your replies. A few comments / questions:
- While I understand your point, I fear that isolating some data from the main dump is only a temporary solution to its size growth. Sooner or later, it will weigh 1 TB (even compressed), and we'll have to deal with that (as producers or as consumers).
- Will the lexeme dumps contain the P namespace, or will consumers also have to download the complete dump to get the data about properties?
- Will there be one dump per namespace (one for P, one for Q, one for L)? A rough consumer-side sketch of this kind of split follows below.
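Purely as an illustration of the consumer-side work implied by the questions above, here is a minimal Python sketch that splits the standard wikidata-*-all.json.gz dump into per-namespace files by entity ID prefix; the input and output file names are placeholders:

```python
# Split a Wikidata JSON dump into per-namespace files (Q items, P properties,
# L lexemes) based on the entity ID prefix. The dump is the usual JSON array
# with one entity per line; file names below are placeholders.
import gzip
import json

outputs = {
    "Q": open("items.ndjson", "w", encoding="utf-8"),
    "P": open("properties.ndjson", "w", encoding="utf-8"),
    "L": open("lexemes.ndjson", "w", encoding="utf-8"),
}

with gzip.open("wikidata-all.json.gz", "rt", encoding="utf-8") as dump:
    for line in dump:
        line = line.strip().rstrip(",")
        if line in ("[", "]", ""):
            continue  # skip the brackets wrapping the JSON array
        entity = json.loads(line)
        prefix = entity["id"][0]  # "Q", "P" or "L"
        if prefix in outputs:
            outputs[prefix].write(json.dumps(entity) + "\n")

for f in outputs.values():
    f.close()
```

If properties ended up only in the full dump, a lexeme consumer would still need a pass like this over the big file just to pick up the P entities.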
Apr 17 2019
Thank you for the reply and for clarifying that this issue has no assignee at the moment.
Glad my help was useful :) Thank you for your quick fix!
Apr 15 2019
FYI, I also have very different results for P380 (even though it is data from the dump of 2019-04-08). If you follow the "Usage history" link on Property_talk:P380, you'll see that there was no recent major change in the usage of this property.
Apr 13 2019
@thiemowmde Are you still working on this issue?
Mar 23 2019
Hello, I'm also very surprised by how things are evolving here. This task was about changing how dump generation is scheduled (by week or by month), not its frequency. The first proposals by @JAllemandou were for 4 dump generations per month. Now, you are talking about reducing the frequency from 4-5 dump generations per month to only 2, on an irregular basis (sometimes with 3 weeks between two dumps, sometimes with only 1 week).
Mar 17 2019
@ArielGlenn, like @Melderick, I commented here thanks to @Lea_Lacroix_WMDE's addition to the Wikidata weekly summary a few weeks ago :)
Feb 25 2019
Women in Red relies on the Wikidata Human Gender Indicators (WHGI) statistics, which are generated once a week from a Wikidata dump. If not already done, you should get in touch with these communities to evaluate the impact of the change.
Dec 9 2017
As a user, it's back to normal for me. Thanks!
Dec 7 2017
Everything seems slow after a few edits: page load, auto-completion, save, etc. It looks like a rate limit is triggered by edits and then impacts every request. After a short time without editing, it's back to normal.
Dec 3 2017
Sorry, I didn't see this task as I wasn't added to it. The problem was solved for me a few hours/days later in August. Thanks :)
Oct 11 2017
@Dereckson Thanks! It seems that there is no IPv6 (ping6 does not work, even though I have an IPv6 address inside the network). I was also informed that the network relies on two Orange DSL routers; I hope the IPs will not change during the event...
Sep 13 2017
I've at last published a blog post: https://www.lehir.net/the-french-connection-at-the-wikimania-2017-hackathon/
Aug 13 2017
I totally agree with Ash_Crow. Wikidata aims to be the sum of all knowledge that can be referenced, which data about Wikimedians is not. A U namespace could be a solution, but only if each Wikimedian's item can be edited only by that Wikimedian.