As the web fills up with junk text written by robots, it’s no surprise that AI-generated slop is also making its way into ebooks, YouTube, and even Wikipedia. That last one is a particular problem, since Wikipedia’s open access is its greatest strength and its primary method for dealing with misinformation. In response, the site’s administrators are using new policies to combat the onslaught of AI text.
According to a new policy write-up, Wikipedia administrators now have the authority and the tools to rapidly delete articles and edits that are obviously generated by large language models. It’s an expansion of the existing “speedy deletion” option, which skips the week-long discussion among Wikipedia’s teams of volunteer editors and administrators that a full deletion normally requires.
Articles that are new or have been substantially rewritten can now be flagged for obvious signs of LLM text, including phrases like “Here is your Wikipedia article on…” or citations and references to things that don’t exist. (That’s a common problem for auto-generated text, as some lawyers and would-be diners of glue-enhanced pizza have discovered.) Such signs are a strong indicator that whoever’s submitting the article hasn’t even read through it themselves.
If an article shows such telltale signs of being automatically generated, it can be tossed out under the speedy deletion option, something previously reserved for additions that were obvious nonsense or thinly disguised advertisements.
In an interview with 404 Media, Wikimedia editor Ilyas Lebleu says that most removed articles still go through the week-long discussion option, but the flood of quickly generated content demanded a much faster way to deal with obvious junk. Lebleu calls the new policy a “band-aid” for the most egregious examples of AI-generated submissions, acknowledging that the larger problem will continue.
Not for the first time, I can’t help but think of John Henry racing the steam drill, a timeless image of humans versus machines. Wikimedia’s new policy is notable in contrast to a would-be change from earlier this year, when editors overwhelmingly rejected AI-generated article summaries. “Wikipedia’s brand is reliability, traceability of changes, and ‘anyone can fix it.’ AI is the opposite of these things,” said Wikipedia editor Bawolff.