Hacker News | rf15's comments

I wonder how the actual way of opening a walnut fits into this metaphor: apply pressure along the seam between the two halves. You can even do it with your bare hands a few times. Do not use a chisel; the nut will slip easily, duh.

What about self-determination? Peace enforced by whom?

To remove them, to prevent martyrdom in death, and to force a change of government that sells them oil on better terms. Same thing the US always does.

Another country, without justification, and again after we promised not to do that after the last dozen similar cases, duh.

And Trump said he wants a Nobel Peace Prize. Such a joke.

Without justification?

idk, is "drug trafficking" enough justification to topple a country? I don't really think so. And the claim is rather spurious anyway...

Ignoring the results of an election seems like a reasonable justification to topple a specific leader (no country is being toppled here)

Works well, maybe it was a small hiccup?

It feels like the off-by-one between the buffer-size argument and the string length is horrible ergonomics and will probably lead to further usage errors in the future.
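
The comment doesn't say which API it's about, so as a hedged illustration, here is the classic C version of the problem: the "size" argument counts the whole buffer, while the string it can hold is one byte shorter because of the terminating NUL, and different functions treat that boundary differently.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[8];                  /* room for 7 characters + '\0' */
        const char *src = "12345678"; /* 8 characters: one too many */

        /* strncpy takes the buffer size but does NOT guarantee termination:
           here it copies 8 bytes and leaves buf without a '\0'. */
        strncpy(buf, src, sizeof buf);
        buf[sizeof buf - 1] = '\0';   /* the caller must remember this step */

        /* snprintf also takes the buffer size, but it DOES terminate,
           silently truncating to 7 characters + '\0'. */
        snprintf(buf, sizeof buf, "%s", src);
        printf("%s\n", buf);          /* prints "1234567" */
        return 0;
    }

Two functions, the same "size" argument, two different off-by-one conventions: that is exactly the kind of ergonomics problem being complained about.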

Yes, I have a degree in bikeshedding. Why am I always getting this particular question?


Everyone outside the US: lol, polygraph self-own.

For those not in the know: they're unreliable to the point of uselessness, and the US government is somehow really enamoured with the fantasy of mind-reading and lie detection. But what can you do when government agencies suffer from chuunibyou (roughly, adolescent delusions of grandeur)?


I’ve said this or similar several times before, but I’m too lazy to dig through my comment history and link it. Nobody (or almost nobody) in the government is under the illusion that the polygraph is a magic mind-reading device that can detect lies. Polygraphs are used to test how well a person responds to stress, and as a political/managerial tool. It seems in this case they’re getting a lot of mileage out of it…


Even in the US, lie detector results are generally inadmissible in court.


Isn't formal verification a "just write it twice" approach with different languages (and different logical constraints on how you write each of them)?
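
In that framing, the spec is the first "write", the implementation is the second, and the proof is the constraint that ties them together. A minimal Lean 4 sketch of the idea (toy example, names invented for illustration):

    -- First "write": the specification, as naive as possible.
    def doubleSpec (n : Nat) : Nat := n + n

    -- Second "write": the implementation you actually want to ship.
    def doubleImpl (n : Nat) : Nat := 2 * n

    -- The verification step: a machine-checked proof that the two writes agree.
    theorem doubleImpl_eq_spec (n : Nat) : doubleImpl n = doubleSpec n := by
      unfold doubleImpl doubleSpec
      omega

The "different logical constraints" part is that the spec is written for clarity and the implementation for efficiency, and the proof obligation is what forces the two to mean the same thing.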


You should not have mercy on someone who repeatedly ignores all warnings without thinking and then hurts themselves in exactly the way the warnings promised. At that point you are on your own.


It feels like LLMs are specifically laser-targeting the "never learn" mindset, with a promise of offloading skill and knowledge to a machine. (People like that don't even pause to think about why they would be needed in the loop at all if that were the case.)

