Artificial intelligence in Wikimedia projects

Artificial intelligence is used in Wikipedia and other Wikimedia projects to support the development of those projects.[1][2] Human and bot interaction in Wikimedia projects is routine and iterative.[3]

Using artificial intelligence for Wikimedia projects

Various projects seek to improve Wikipedia and Wikimedia projects by using artificial intelligence tools.

ORES

The Objective Revision Evaluation Service (ORES) project is an artificial intelligence service for grading the quality of Wikipedia edits.[4][5] The Wikimedia Foundation presented the ORES project in November 2015.[6]
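
ORES exposes its scores through a public web API. The following is a minimal sketch of how such a query might look, assuming the v3 REST endpoint at ores.wikimedia.org and using an arbitrary placeholder revision ID; the response layout shown in the comments is an assumption based on the publicly documented interface, not a detail drawn from the cited sources, and the service may have been superseded since.

```python
# Minimal sketch: ask ORES to score a single English Wikipedia revision
# with the "damaging" model. Endpoint path and response structure are
# assumed from the public ORES v3 REST API; the revision ID is a placeholder.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"

def score_revision(rev_id: int, model: str = "damaging") -> dict:
    """Return the ORES score object for one revision and one model."""
    resp = requests.get(
        ORES_URL,
        params={"revids": rev_id, "models": model},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    # Assumed layout: {"enwiki": {"scores": {"<revid>": {"<model>": {"score": {...}}}}}}
    return data["enwiki"]["scores"][str(rev_id)][model]["score"]

if __name__ == "__main__":
    score = score_revision(1234567)  # placeholder revision ID
    print("Predicted damaging:", score["prediction"])
    print("Probabilities:", score["probability"])
```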

Detox

Detox was a project by Google, in collaboration with the Wikimedia Foundation, to research methods of addressing users who post unkind comments in Wikimedia community discussions.[7] As part of the Detox project, the Wikimedia Foundation and Jigsaw collaborated to use artificial intelligence for basic research and to develop technical solutions to the problem. In October 2016 those organizations published "Ex Machina: Personal Attacks Seen at Scale", describing their findings.[8][9] Various popular media outlets reported on the publication of this paper and described the social context of the research.[10][11][12]

Bias reduction

In August 2018, a company called Primer reported attempting to use artificial intelligence to create Wikipedia articles about women as a way to address gender bias on Wikipedia.[13][14]

Generative language models

In 2022, the public release of ChatGPT inspired further experimentation with AI for writing Wikipedia articles. It sparked debate about whether and to what extent such large language models are suitable for this purpose, given their tendency to generate plausible-sounding misinformation, including fake references; to produce prose that is not encyclopedic in tone; and to reproduce biases.[15][16] A draft Wikipedia policy on ChatGPT and similar large language models (LLMs) recommended that users unfamiliar with LLMs avoid using them because of these risks, as well as the potential for libel or copyright infringement.[16]

Using Wikimedia projects for artificial intelligence

Content in Wikimedia projects is useful as a dataset in advancing artificial intelligence research and applications. For instance, in the development of Google's Perspective API, which identifies toxic comments in online forums, a dataset containing hundreds of thousands of Wikipedia talk page comments with human-labelled toxicity levels was used.[17]
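
To illustrate how a comment can be scored for toxicity, the following sketch requests a TOXICITY score from the Perspective API. The endpoint, request body, attribute name, and API key handling are assumptions based on the publicly documented commentanalyzer.googleapis.com interface, not details taken from the cited sources.

```python
# Minimal sketch of a Perspective API call, assuming the v1alpha1
# commentanalyzer.googleapis.com endpoint and an API key issued
# through Google Cloud. The attribute name follows the public docs.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def toxicity_score(text: str) -> float:
    """Return the TOXICITY summary score (0..1) for a single comment."""
    body = {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(URL, json=body, timeout=10)
    resp.raise_for_status()
    result = resp.json()
    # Assumed layout: attributeScores -> TOXICITY -> summaryScore -> value
    return result["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

if __name__ == "__main__":
    print(toxicity_score("You are a wonderful editor."))
```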

A 2012 paper reported that more than 1,000 academic articles, including those using artificial intelligence, examine Wikipedia, reuse information from Wikipedia, use technical extensions linked to Wikipedia, or research communication about Wikipedia.[18] A 2017 paper described Wikipedia as the mother lode of human-generated text available for machine learning.[19]
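
As one illustration of how Wikipedia text can be gathered for machine-learning use, the sketch below pulls plain-text article extracts through the standard MediaWiki Action API. The article titles are arbitrary examples, and this approach is offered only as a hedged illustration; it is not a method described in the cited papers, which may rely on bulk database dumps instead.

```python
# Minimal sketch: assemble a tiny text corpus from plain-text article
# extracts served by the MediaWiki Action API (query/extracts module).
# Article titles are arbitrary examples.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def fetch_plaintext(title: str) -> str:
    """Return the plain-text extract of one English Wikipedia article."""
    params = {
        "action": "query",
        "prop": "extracts",
        "explaintext": 1,
        "format": "json",
        "titles": title,
    }
    resp = requests.get(API_URL, params=params, timeout=10)
    resp.raise_for_status()
    pages = resp.json()["query"]["pages"]
    # One title was requested, so take the single page entry.
    return next(iter(pages.values())).get("extract", "")

if __name__ == "__main__":
    titles = ["Artificial intelligence", "Machine learning"]
    corpus = [fetch_plaintext(t) for t in titles]
    print(len(corpus), "documents,", sum(len(d) for d in corpus), "characters")
```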

A 2016 research project called "One Hundred Year Study on Artificial Intelligence" named Wikipedia as a key early project for understanding the interplay between artificial intelligence applications and human engagement.[20]

References
