And no surprise, apartheid apologetics: https://grokipedia.com/page/Apartheid#debunking-prevailing-n...
Hilarious factual errors in https://grokipedia.com/page/Green_Line_(CTA)
Many of the most glaring errors are linked to references which either directly contradict Grokipedia's assertion or don't mention the supposed fact one way or the other.
I guess this is down to LLM hallucinations? I've not used Grok before, but the problems I spotted in 15 minutes of casual browsing made it feel like the output of state-of-the-art models from 2-3 years ago.
Has this been done on the cheap? xAI should probably have prioritised quality over quantity for the initial launch.
Citizendium is still around, though they've loosened some of their requirements to encourage more contributions, which seems self-defeating to me. I think they should have tried to cooperate with Wikipedia instead: the edits and opinions of subject-matter experts could be a special layer on top of existing Wikipedia articles. There could be a link for each expert highlighting the sections they have peer-reviewed, along with a diff of what they would change about the article if those changes haven't been accepted. There could also be labels indicating how much expert consensus and trust a given snapshot of an article has, or how frozen the article should be based on the consensus and evidence the experts provide. This would help users judge whether an article contains mostly common knowledge or whether it's more speculative or controversial.
Regardless, the business was there, and Wikipedia killed all that. So if you want to create an expert-written encyclopedia anno 2025, you have a real problem: you need to pay experts for their time somehow, otherwise why would they compete with the million monkeys? But your source of revenue has been strangled by those very same monkeys, who, it turns out, produce content that is orders of magnitude better than anything I've ever read in a for-pay encyclopedia from before Wikipedia.
The bar to entry is insanely high.