Hacker News | tiahura's comments

How does Tesla FSD respond to inactive traffic control lights?

Coincidentally, we were in a Robotaxi during the blackout (we didn't know about it; we were going to Japantown from the Mission). Noticed that it navigated through the non-working traffic lights fine, treating them like stop-sign junctions. One advantage of building an unsupervised system from the public version that had to deal with these edge cases all around the country.

Though the safety driver disengaged twice to let emergency vehicles pass safely.



https://x.com/edgecase411/status/2002630953844552094

Looks like it treats it as a 4-way stop. Is this because Tesla has more training data?


I'd default to assuming that Waymo's and Tesla's respective roadmaps differed on which features to implement when, not training data, and that's what results in the two behaving differently.

50/50 bet it would either go right through or treat it as a stop.

Don't think I have had a totally inactive light. I have had the power out with the emergency battery switched to a blinking red light, and it correctly treats that as a stop sign.


> Is this because Tesla has more training data?

Its human takes over. FSD is still Level 3.

(Robotaxi, Tesla's Level 4 product, is still in beta. Based on reports, its humans had to intervene.)


FSD is level 2. Level 3 doesn't require the human driver to monitor the outside environment, only take over when requested. Tesla also doesn't report data from FSD under L3 reporting requirements anywhere in the US.

Why does Firefox need a CEO? Is the Linux model not feasible?

The Linux Foundation has an executive director, which is the usual title (not CEO) for the head of a non-profit.

Because Mozilla is an explicitly mission-driven non-profit. Linux doesn't really have a model; the closest equivalent is basically Chromium, which is to say it's an open-source project to which extremely large companies donate the vast majority of developer hours.

Most of it is _not_ Amazon’s content. They don’t own the book, so they can’t sell you the book. Nemo dat.

It’s not that it brings it up in unrelated conversations, it’s that it nudges related conversations in unwanted directions.

And, don't forget to figure in that OpenAI has indicated they're getting into porn.

Is this about when Sam mentioned they want to continue/start letting people do lewd texting with LLMs? Or are you talking about actual pornography?

The “lewd texting with LLM” will be a tool for writing actual pornography, and in workflows for image and video pornography, even if the image and video generation doesn't happen on OpenAI's platform. (In fact, people are already using ChatGPT and other major AI engines as tools for that, but loosening the filters will facilitate it even more on OpenAI's platform.)

OpenAI knows that, and the people interested in that capability know that, even if many of the other people seeing the marketing about it don't.


> The “lewd texting with LLM” will be a tool for writing actual pornography

Sure, but does that mean "OpenAI has indicated they're getting into porn"? A bit like saying the W3C is getting into porn because the web is used for porn, among other things. Even when I try to read the parent's comment in the most charitable way, I don't think that's what they meant.

Personally I prefer if my tools stay as tools, and let me do professional work with them regardless of what that profession is.


> Sure, but does that mean "OpenAI has indicated they're getting into porn"?

Yes, it literally means they have indicated it to the customer base that is looking into making porn.

It may not mean they have indicated it to some other audiences.

> A bit like saying W3C is getting into porn because the web is used for porn, together with other things.

No, it's a bit like saying the W3C is getting into porn if the W3C had announced changes to the platform whose main market appeal was to people making porn, but announced it in a way that glossed over and minimized that.

If, on the other hand, the web had a steady state of being used for porn, you wouldn't say the W3C is getting into anything, you’d just say “the internet is for porn” (which has, of course, rather famously been said, and even sung.)


The initial claim was "OpenAI has indicated they're getting into porn." Letting writers write the scripts, storylines, or dialogue for pornography does not mean OpenAI suddenly "does porn." In that case, Google and Microsoft, with their Docs and Office, are also "getting into porn", which would be a ridiculous claim.

> The initial claim was "OpenAI has indicated they're getting into porn." Letting writers write the scripts, storylines, or dialogue for pornography does not mean OpenAI suddenly "does porn." In that case, Google and Microsoft, with their Docs and Office, are also "getting into porn", which would be a ridiculous claim.

Actively announcing a change of policy whose marketable function is to facilitate porn production is only the case for the OpenAI action, and you have presented nothing analogous for the entities you are trying to hold up as comparable.


> Actively announcing a change of policy whose marketable function is to facilitate porn production

Where exactly did this happen, though? And how am I supposed to prove a negative? It's up to you to present evidence that this is something OpenAI actively promotes as a use case for their tools, something I personally haven't seen, but I'm open to changing what I think is happening if proof is presented that this is the case.


It's hard to tell.

How did they indicate?

Iger and Altman on CNBC at 10:30.

You expect The Washing Machine Institute Foundation is going to sponsor it?

Bad news for Boise.


Why? Micron will still fabricate things there... just not for us.


Like when OpenAI started experiencing a massive brain drain.

