zlacker

[parent] [thread] 6 comments
1. lidHan+(OP)[view] [source] 2019-12-13 16:37:58
I think that this particular topic is evergreen because people are perennially surprised that this technology, which seems so reasonable and advanced at first blush, has failed to be useful in practice.
replies(2): >>The_ra+pe >>xamuel+1r
2. The_ra+pe[view] [source] 2019-12-13 18:18:25
>>lidHan+(OP)
There's still no complete explanation of why it's a failure. What are the technical difficulties they can't overcome?
replies(1): >>goatlo+Rv
3. xamuel+1r[view] [source] 2019-12-13 19:48:06
>>lidHan+(OP)
I spent a year doing an ontology postdoc. I can't speak for Cyc, but from what I saw of the ontology world, there are a lot of charlatans and people who use it as a buzzword to make grant proposals sexier. Whether or not there's real potential in the technology, that kind of environment surely isn't conducive to achieving said potential. An outsider trying to peek into the field is immediately swamped by vast oceans of garbage, and everyone in the field is an expert at marketing themselves, so if there are legitimate research gems in the field, I don't know how you'd actually find them amidst all the noise.
replies(1): >>Jeff_B+Aw
4. goatlo+Rv[view] [source] [discussion] 2019-12-13 20:20:07
>>The_ra+pe
Maybe because the world and human knowledge are incredibly complex, and hard to put into logical relations well enough to achieve much success?

Consider that humans learn through having bodies to explore the world with, while forming a variety of social relations to learn the culture. That's very different from encoding a bunch of rules to make up an intelligence.

replies(1): >>The_ra+uM
5. Jeff_B+Aw[view] [source] [discussion] 2019-12-13 20:25:29
>>xamuel+1r
This supply-side opacity is definitely a problem. There seems to be a corresponding demand-side problem, that clients often don't know quite what they want. If there was an easy way of generating hard tests with clear-cut answers, maybe there would be an easy way for a winner to distinguish themselves.
6. The_ra+uM[view] [source] [discussion] 2019-12-13 22:10:43
>>goatlo+Rv
Creating software that learns the real world indeed seems like a really hard problem without a body.

I was asking why software that parses the semantics of Wikipedia articles and makes them queryable through natural-language questions is something humanity isn't able to build.

replies(1): >>goatlo+cW
7. goatlo+cW[view] [source] [discussion] 2019-12-13 23:40:25
>>The_ra+uM
That might depend on how much semantics is tied to having a body. We do use quite a lot of metaphors that are based on the kinds of bodies and senses we have. The question is how much embodiment is necessary for understanding semantics. Maybe it's possible to brute-force around that with ML, or to stack enough human hours into building the right ontologies in symbolic AI. But maybe not.

I think people like Rodney Brooks are of the belief you need to start with robots that learn their environment and build up from there.
