
So it sounds like LLMs will not achieve AGI.


They are regurgitrons that will eventually begin training themselves on their own output, and humanity will get stupider as a result.

RIP art and poetry.



