> While this analysis might be completely off, the simple fact that I could get even this information without much effort is mind-boggling. With a better setup it might be able to get more.
This can essentially be rephrased as: "I don't know whether what the LLM said is true, but the fact that it may or may not be correct is amazing!"
Btw, LLMs are already being used in vulnerability discovery and exploit development.
That's something you should've looked into before making such statements, imo.