* Migrates namespace info from the page table's page_namespace field
to the new linter table field linter_namespace. This duplication
of the namespace value was requested to greatly reduce the amount
of database activity required by the linter search and reporting
code.
* This patch has been prepared as a dark-launch patch enabled with
the config value LinterMigrateNamespaceStage. It assumes that the
linter table already has the linter_namespace column added to it,
and that recording of the namespace field is already enabled and
populating that column.
* The migrate code is now runnable from the Linter/maintenance
directory using migrateNamespace.php, which will be deployed in a
separate patch (a rough sketch of its shape is included below).
The maintenance code creates an appropriate environment in which
to call migrateNamespace() in Database.php.
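A minimal sketch of what such a maintenance script could look like,
assuming core's Maintenance base class and the migrateNamespace()
helper named above; the batch size and the exact calling convention
are assumptions, not the deployed script:

  <?php
  // Sketch only; the real migrateNamespace.php ships in a separate patch.
  use MediaWiki\Linter\Database;

  $IP = getenv( 'MW_INSTALL_PATH' );
  if ( $IP === false ) {
      $IP = __DIR__ . '/../../..';
  }
  require_once "$IP/maintenance/Maintenance.php";

  class MigrateNamespace extends Maintenance {
      public function __construct() {
          parent::__construct();
          $this->requireExtension( 'Linter' );
          $this->addDescription(
              'Copy page_namespace into linter_namespace for existing rows'
          );
          $this->setBatchSize( 1000 ); // assumed default
      }

      public function execute() {
          // Delegate the batched update loop to the Linter Database class;
          // the exact signature of migrateNamespace() is assumed here.
          Database::migrateNamespace( $this->getBatchSize() );
          $this->output( "Namespace migration complete.\n" );
      }
  }

  $maintClass = MigrateNamespace::class;
  require_once RUN_MAINTENANCE_IF_MAIN;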
Bug: T299612
Change-Id: I73cb80729d6a5a8716fe93164ad1e42e6958d672
* Add "mw-blank" as another tag value that erases all lint errors
for a page as a blank page cannot have any lint errors.
Bug: T280193
Change-Id: Iaad8ce75950588b2676de5dfb5f5221d64231f0e
* Determines whether the new content type is wikitext and, if not,
deletes all existing lint error records for that page ID (roughly
as sketched below).
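A minimal sketch of that check, assuming a handler with access to
the new Content object and a hypothetical helper for dropping a
page's rows; only CONTENT_MODEL_WIKITEXT is core API here:

  // Sketch only; the handler and helper names are hypothetical.
  public function onContentModelChanged( int $pageId, Content $newContent ): void {
      if ( $newContent->getModel() !== CONTENT_MODEL_WIKITEXT ) {
          // Non-wikitext pages are not linted, so any existing lint
          // records for this page are stale and can be removed.
          $this->database->deleteErrorsForPage( $pageId ); // hypothetical
      }
  }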
Bug: T298343
Change-Id: I20fac9a0c901f3e7a5cc898566a4487fbe70798f
WikiPage::factory() has been deprecated since 1.36 and should be
replaced with WikiPageFactory::newFromTitle().
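The usual shape of this replacement (variable names illustrative):

  use MediaWiki\MediaWikiServices;

  // Deprecated since 1.36:
  $page = WikiPage::factory( $title );

  // Replacement: go through the WikiPageFactory service.
  $page = MediaWikiServices::getInstance()
      ->getWikiPageFactory()
      ->newFromTitle( $title );

In classes that already receive constructor-injected services,
injecting WikiPageFactory directly is the cleaner option.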
Bug: T297688
Change-Id: I63bf3ba1c2ad6f8b59d369d91777af0418746a6b
* Adapted other core PHPUnit test user, title and page creation
code to avoid creating a mock Title, so that the job runner finds
the page (title) in the database and runs the job without the
hackery of populating the title in the constructor of
RecordLintJob. When getForPage() runs, it finds the page and its
lint errors through the standard code paths (roughly as sketched
below).
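A rough sketch of the pattern, assuming MediaWikiIntegrationTestCase
helpers; the job parameters and the accessor at the end are
assumptions, not the test's exact code:

  // Create a real page row instead of a mock Title.
  $page = $this->getExistingTestPage( 'LinterRecordLintJobTest' );
  $title = $page->getTitle();

  // Parameter shape is assumed for illustration.
  $job = new RecordLintJob( $title, [ 'errors' => [ /* fake lint data */ ] ] );
  $this->assertTrue( $job->run() );

  // With a real page in the database, the lookup goes through the
  // normal code paths rather than a Title stuffed into the job.
  $errors = ( new Database( $page->getId() ) )->getForPage(); // assumed shape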
Bug: T225337
Change-Id: Ibb57523ee2f066c7bd0465c14f0dcb2bab51286b
Currently we select 20 rows and return the accurate count if it's
less than that, so up to 19 rows. Since we want to return an
accurate count if it's 20 rows or fewer, select one more row, 21,
so we can differentiate between having exactly 20 result rows and
hitting the limit. This is the same technique used in MediaWiki's
Pager system.
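A sketch of the technique, assuming $dbr is a replica IDatabase
handle, the table and field names follow the Linter schema, and
estimateTotal() stands in for whatever estimation code is used:

  $threshold = 20; // accuracy threshold
  $rows = $dbr->selectRowCount(
      'linter',
      '*',
      [ 'linter_cat' => $categoryId ],
      __METHOD__,
      // Fetch 21 so "exactly 20 rows" and "hit the limit" differ.
      [ 'LIMIT' => $threshold + 1 ]
  );
  if ( $rows <= $threshold ) {
      $total = $rows; // accurate count, 0..20 inclusive
  } else {
      // More than 20 matching rows exist; fall back to the estimate.
      $total = $this->estimateTotal( $categoryId ); // illustrative
  }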
Change-Id: I50fa96238eb4c7178414ee92c53799fd69520926
* The code now produces an accurate count if the number of errors
for a category is below the threshold set by the public constant
MAX_ACCURATE_COUNT (currently 20). The database record count limit
was originally set to 1 in order to determine accurately whether a
category actually had 0 errors, as the estimate code would never
report 0. If the count was not 0, the estimated count was used,
which is not accurate for any other number of errors. For low
error counts this is annoying to editors and unnecessary. The
additional CPU/disk activity needed to accurately check for low
error counts is not significantly more than checking for 0 or 1,
as checking for 0 likely requires a complete table scan, which is
probably expensive compared to a low-count query that exits early
when it hits the record limit.
* An improvement to consider is recording the accurate count in a
separate tiny table and maintaining it there, to be used in
preference to the row-limited select, with, say, a 30-second TTL
to prevent a stampede of requests from doing extraneous database
operations.
* Added unit test coverage for accurately counting low error
counts that are below the threshold, and for verifying that the
estimate is inaccurate beyond the error count threshold (see the
sketch below).
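A sketch of what such coverage might look like; the helper and
accessor names are hypothetical, and MAX_ACCURATE_COUNT is assumed
to live on the Linter Database class:

  public function testCategoryTotalsAccuracy() {
      // Below the threshold the reported total must be exact.
      $this->insertFakeErrors( 'obsolete-tag', 5 ); // hypothetical helper
      $this->assertSame( 5, $this->getTotalForCategory( 'obsolete-tag' ) );

      // Beyond MAX_ACCURATE_COUNT the value comes from the row
      // estimate, so it only needs to be "large", not exact.
      $this->insertFakeErrors( 'obsolete-tag', Database::MAX_ACCURATE_COUNT + 10 );
      $this->assertGreaterThan(
          Database::MAX_ACCURATE_COUNT,
          $this->getTotalForCategory( 'obsolete-tag' )
      );
  }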
Bug: T194872
Change-Id: I4f74cfe3bf9601baa0dc8fa6464a68030ac2bc4b
No integration needed.
Requires bumping the minimum MediaWiki version to 1.34, in which
MediaWikiUnitTestCase was introduced.
Change-Id: Ibc0a1028cc61a7bdc149081aeaa1109de18ee119
"Using assertContains() with string haystacks is
deprecated and will not be supported in PHPUnit 9.
Refactor your test to use assertStringContainsString()
or assertStringContainsStringIgnoringCase() instead."
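The mechanical fix looks like this (the haystack and needle here
are illustrative):

  // Deprecated for string haystacks in PHPUnit 8, removed in 9:
  $this->assertContains( 'obsolete-tag', $html );

  // Replacement:
  $this->assertStringContainsString( 'obsolete-tag', $html );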
Change-Id: I88df8a91660eb332a0ec87070eff31cfcf8c4955
The following sniffs are failing and were disabled:
* MediaWiki.Commenting.FunctionComment.MissingDocumentationPrivate
Additional changes:
* Also sorted "composer fix" command to run phpcbf last.
Change-Id: Icdd0d0e60dd543921a5757162548ae149c3316ea
This eases deployment dependencies by allowing Parsoid to supply an
appropriate database category ID, so that new lint categories can
be stored during the interval between adding a new lint category
to Parsoid and deploying an Extension:Linter patch to describe it.
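A rough sketch of the lookup shape this implies; the method,
parameter, and exception used here are assumptions, not the
extension's actual interface:

  public function getCategoryId( string $name, ?int $hintId = null ): int {
      if ( isset( $this->categoryIds[$name] ) ) {
          return $this->categoryIds[$name];
      }
      if ( $hintId !== null ) {
          // Category not yet described by this extension: trust the
          // ID supplied by Parsoid so the error can still be stored.
          return $hintId;
      }
      throw new InvalidArgumentException( "Unknown lint category: $name" );
  }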
Change-Id: Ib7b2342168fa53ca2abac7d5f54fe313be341eb7
This test previously wasn't running because the foreach() in the
data provider was totally wrong.
Also, the -details variant for fostered isn't supposed to exist, so
hard-code an exception for it (roughly as sketched below).
Finally, add the @coversNothing annotation, since this test is just
verifying the contents of en.json, not any PHP code.
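A sketch of the intended data-provider shape; the provider name and
the source of the category list are hypothetical:

  public static function provideCategories(): iterable {
      foreach ( self::getKnownCategories() as $category ) { // hypothetical
          yield $category => [ $category ];
          if ( $category !== 'fostered' ) {
              // The -details variant intentionally does not exist for
              // 'fostered', so it is skipped here.
              yield "$category-details" => [ "$category-details" ];
          }
      }
  }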
Change-Id: I7ffffcc3a910aefb082f7ff59265d3be8bc46347
The following sniffs are failing and were disabled:
* MediaWiki.FunctionComment.Missing.Protected
* MediaWiki.FunctionComment.Missing.Public
Change-Id: I96e32df48d13040893bfd1be6d90d0db4f7c7d0a
The query itself is too expensive to be run on large Wikimedia
wikis, so put it behind WAN cache and touch the check keys for each
category whenever errors are added to or deleted from it.
If this happens to get out of sync, it will get fully refreshed
regularly when the totals are sent to statsd.
WANObjectCache's 'lockTSE' feature will help avoid the cache
stampedes that made this query expensive in the past.
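A sketch of the caching pattern described above, using the core
WANObjectCache API; the key names and the fetch helper are
illustrative:

  $cache = MediaWikiServices::getInstance()->getMainWANObjectCache();
  $checkKey = $cache->makeKey( 'linter-total-check', $categoryId );

  $total = $cache->getWithSetCallback(
      $cache->makeKey( 'linter-total', $categoryId ),
      WANObjectCache::TTL_DAY,
      function () use ( $dbr, $categoryId ) {
          // The expensive counting query only runs on a cache miss.
          return $this->fetchTotalFromDb( $dbr, $categoryId ); // illustrative
      },
      [
          // Cheap invalidation: write paths bump this key whenever
          // errors are added to or deleted from the category.
          'checkKeys' => [ $checkKey ],
          // Let one request regenerate while others briefly serve
          // stale data, avoiding stampedes on the expensive query.
          'lockTSE' => 30,
      ]
  );

  // On the write paths:
  $cache->touchCheckKey( $checkKey );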
Change-Id: I3774103a29fa0f29d36283950f136259fa71bffe
These tests insert variations of fake lint errors into the
database, and then read them back out to check that they
round-trip properly.
And while we're at it, improve the setForPage() return value.
These tests can be run with something like:
php tests/phpunit/phpunit.php extensions/Linter/tests/phpunit/
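A sketch of the round-trip these tests exercise; the error fields
and the exact accessor signatures are assumed here:

  $error = [
      'type' => 'obsolete-tag',        // illustrative lint category
      'location' => [ 0, 10 ],         // illustrative offsets
      'params' => [ 'name' => 'tt' ],
  ];

  $lintDb = new Database( $pageId );   // constructor shape assumed
  $lintDb->setForPage( [ $error ] );   // write the fake error
  $stored = $lintDb->getForPage();     // read it back

  $this->assertCount( 1, $stored );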
Change-Id: Ifdba8a8a104d218a822f909bc5d7b3512aca499d