codeki (OP) 2019-03-11 21:48:06
> And it's not just vocabulary, the successes of RNNs show that grammar is also mostly patterns.

The resulting word strings do indeed form patterns. However, matching a pattern is different from being able to knowledgeably generate those patterns so that they make sense in the context of a human conversation. It has been said that mathematics is so successful because it is contentless. That is a problem for domains that cannot be treated this way.

Go can be described in a contentless (mathematical) way, so success there is not surprising (though perhaps it was to some).
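
To make "contentless" concrete: the entire game can be reduced to states, legal moves, and a terminal score, with no reference to what any of it means. Here is a minimal sketch of such an interface (my own illustration, not any particular engine's API):

    # Illustrative only: a game reduced to opaque states, legal moves,
    # and a terminal score. Nothing here refers to meaning or content;
    # an agent can search or learn over this structure without
    # "understanding" anything about the symbols it manipulates.
    from typing import Iterable, Tuple

    State = Tuple[int, ...]  # an opaque encoding of a board position

    class Game:
        def initial_state(self) -> State:
            raise NotImplementedError

        def legal_moves(self, s: State) -> Iterable[int]:
            raise NotImplementedError

        def apply(self, s: State, move: int) -> State:
            raise NotImplementedError

        def is_terminal(self, s: State) -> bool:
            raise NotImplementedError

        def score(self, s: State) -> float:
            # e.g. +1 for a win, -1 for a loss
            raise NotImplementedError

Everything an AlphaGo-style system needs to optimize fits behind an interface like this; the argument below is that something like "feeling angry" offers no analogous interface.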

It is on those things that cannot be described in this manner that 'AGI' (Edit: 'AGI' based on current DL) will consistently fall down. You can see it in the datasets: try to imagine creating a dataset for the machine to 'feel angry'. What are you going to do, show it pictures of pissed-off people? This may seem like a silly argument at first, but try to think of other things that might be characteristic of 'GI' for which it would be difficult to envision creating a training set.
