A recent viral graph suggests that ChatGPT's user engagement has surpassed Wikipedia's, raising concerns about the reliability of the information people now turn to. The data, compiled by GWI, points to a decline in Wikipedia use while ChatGPT's popularity has soared, especially among university students. Where Wikipedia relies on human editors to maintain accuracy, ChatGPT generates answers from opaque training data, raising ethical questions about sourcing and reliability. Its rapid rise poses a significant challenge to the trust and verification mechanisms that traditional sources like Wikipedia have built up.
If the data - which is based on survey responses rather than site visits - is to be believed, there is good reason to be concerned about where people are getting their information online.
Wikipedia has an army of more than 49 million registered editors maintaining accuracy across 64 million articles worldwide. In contrast, it remains notoriously unclear exactly what data the large language models behind ChatGPT were trained on.