Hacker News

>Using LLMs doesn't kill people

Guess you missed the post where lawyers were submitting legal documents generated by LLMs. Or people taking medical advice and ending up with bromide poisoning. Or the lawsuits around LLMs softly encouraging suicide. Or the general AI psychosis being studied.

It's way past "some exceptions" at this point.



Besides the suicide one, I don't know of any examples where that has actually killed someone. Someone could search on Google just the same and ignore their symptoms.


>I don't know of any examples where that has actually killed someone.

You don't see how a botched law case could cost someone their life? Let's not wait until more die to rein this in.

>Someone could search on Google just the same and ignore their symptoms.

Yes, and it's not uncommon for websites or search engines to be sued. Millennia of laws exist for this exact purpose: so companies can't deflect blame for bad outcomes onto the people.

If you want the benefits, you accept the consequences. Especially when you fail to put up guard rails.


LLMs generate text. It is people who decide what to do with it.

Removing all personal responsibility from this equation isn't going to solve anything.


>It is people who decide what to do with it.

That argument is rather naive, given that millennia of law are meant to regulate and disincentivize behavior. "If people didn't get mad, they wouldn't murder!"

We've regulated public messages for decades, and for good reason. I'm not absolving them this time because they want to hide behind a chatbot. They have blood on their hands.


Sticks and stones, my friend...


If you were offended by that comment, I apologize. There's a 99.99% chance you're not the problem, and infighting gets us nowhere.

But you may indeed be working against your own best interests. I hope you can take some time to understand where you stand in life and whether your society is really benefiting you.


I am not offended. And I'll be the one to judge my own best interests (back to: "personal responsibility"). I have more information about my own life than you or anyone else, so I'm best situated to make decisions for myself about my own beliefs.

For instance I work for one of the companies that produces some of the most popular LLMs in use today. And I certainly have a stake in them performing well and being useful.

But your line of reasoning would have us believe that Henry Ford is a mass murderer due to the number of vehicular deaths each year, or that the Wright brothers bear some responsibility for 9/11. They should have foreseen that people would fly their planes into buildings, of course.

If you want to blame someone for LLMs hurting people, we really need to go all the way back to Alan Turing -- without him these people would still be alive!


>And I'll be the one to judge my own best interests thank you.

Okay, cool. Note that I never asked for your opinion; you decided to pop up in this chain while I was talking to someone else. Go about your day or be curious, but don't butt in and then pretend "well, I don't care what you say" when you get a response back.

Nothing you said contradicted my main point. So this isn't really a conversation but simply more useless defense. Good day.


I said "sticks and stones" to suggest the end of that quote: "words can never hurt me". That's a response to your comment about LLMs hurting people.

Didn't think that would go so cleanly over your head given you're all the way up there on your high horse of morality.




