
didibu (OP) 2022-10-17 01:47:54
> The OpenAI Codex model (from which Copilot is derived) works a lot like a translation tool. When you use Google to translate from English to Spanish, it’s not like the service has ever seen that particular sentence before. Instead, the translation service understands language patterns (i.e. syntax, semantics, common phrases). In the same way, Copilot translates from English to Python, Rust, JavaScript, etc. The model learns language patterns based on vast amounts of public data.

As I understand it, this isn't proven, is it?

We don't know whether the model is simply stitching together and approximating the closest combination of all the data it saw, or whether it actually understands the concepts and logic.

Or is my understanding already behind the times?
