zlacker

[parent] [thread] 2 comments
1. dex_te+(OP)[view] [source] 2019-12-13 22:58:51
1) What do you think about a hybrid approach: hypergraphs + large-scale NLP models (transformers)?

2) How far are we from real self-evolving cognitive architectures with self-awareness features? Is it a question of years or months, or is it already a solved problem?

3) Does it make sense to use embeddings like https://github.com/facebookresearch/PyTorch-BigGraph to achieve better results? (A rough sketch of what I mean follows after these questions.)

4) Why did Cycorp decide, at some point, to limit communication and collaboration with the scientific community / AI enthusiasts?

5) Did you try to solve the GLUE / SuperGLUE / SQuAD challenges with your system?

6) Does Douglas Lenat still contribute actively to the project?

Thanks
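
To clarify question 3, here is a rough, hypothetical sketch of what I have in mind: export KB assertions as a tab-separated edge list of the kind PyTorch-BigGraph's import tooling reads, train graph embeddings over it, and use the vectors downstream. The triples, relation names, and file name below are made-up placeholders, not real Cyc content.

    # Hypothetical sketch: export KB-style assertions as the kind of
    # tab-separated (lhs, relation, rhs) edge list that PyTorch-BigGraph's
    # import tooling reads before training graph embeddings.
    # The triples and file name are placeholders, not real Cyc content.
    import csv

    triples = [
        ("Dog", "isa", "Mammal"),        # placeholder assertions
        ("Mammal", "genls", "Animal"),
        ("Fido", "isa", "Dog"),
    ]

    with open("edges.tsv", "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        for lhs, rel, rhs in triples:
            writer.writerow([lhs, rel, rhs])

    # PyTorch-BigGraph can then partition the entities and train embeddings
    # over this edge list; the resulting vectors could feed nearest-neighbour
    # lookup or a downstream transformer as extra features.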

replies(1): >>choamn+Y
2. choamn+Y[view] [source] 2019-12-13 23:07:16
>>dex_te+(OP)
Doug Lenat is very much still active in the project. He doesn't do as much work building the ontology, but he plays a role in how various projects develop and provides a lot of feedback.
replies(1): >>The_ra+wH
3. The_ra+wH[view] [source] [discussion] 2019-12-14 12:01:27
>>choamn+Y
How do you compare with SOAR and OpenCog/AtomSpace?

Which is the most promising AGI project, in your opinion?
