nonran (OP) 2022-12-12 09:47:49
In a way it does provide something useful.

A null hypothesis.

If you take the output of GPT for what it really is, the sum of all written human thought divided by several billion, resulting in a soup of banal, conformist cringe, then it serves as a marker of the average.

Original human content can then be graded by its deviation from that baseline in some high-dimensional space of semantic novelty.
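
A minimal sketch of what that grading could look like, assuming sentence embeddings stand in for the high-dimensional space; the model name, the sample texts, and cosine distance as the "novelty" score are all illustrative assumptions, not a real system:

  # Score a post by how far its embedding sits from a stand-in "GPT baseline".
  # Everything here (model choice, texts, threshold-free scoring) is illustrative.
  from sentence_transformers import SentenceTransformer
  import numpy as np

  model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

  def novelty_score(post: str, gpt_baseline: str) -> float:
      """Cosine distance between a post and a generic GPT-style text."""
      a, b = model.encode([post, gpt_baseline])
      sim = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
      return 1.0 - sim  # higher = further from the banal average

  print(novelty_score("my strange original take", "a generic GPT answer on the topic"))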

There are two worrying social fallouts from this:

  First, we will get used to our posts being graded not by each
  other but by algorithms.

  Second, this creates an incentive to post more extreme and unhinged
  content.