
And have you actually managed to perform such a test, or is that just an imagined result you're convinced will happen? Not trying to be snarky here, but I see this kind of thing a lot, and "this is my model of how LLMs work, so this is how they would behave in this test I can't verify" is very unconvincing.

