
I know nothing about companies (esp. in the US), but I find it weird that a company can go from non-profit to for-profit. Surely this would be taken advantage of. Can someone explain to me how this works?


That was the point Musk was complaining about.

In practice it's doable, though. You can just create a new legal entity and move stuff and/or do future value-creating activity in the new co. If everyone is on board with the plan on both sides of the move, then it's totally doable with enough lawyers and accountants.


If the non-profit is on board with that though, then they're breaking the law. The IRS should reclassify them as a for-profit for private inurement and the attorney general should have the entire board removed and replaced.


OpenAI Global, LLC - the entity that actually employs all the engineers, makes revenue from ChatGPT and the API, and pays taxes - has been a for-profit corporation since at least 2019: https://en.wikipedia.org/wiki/OpenAI#2019:_Transition_from_n...

The IRS isn’t stupid. The rules on what counts as taxable income and what the nonprofit can take tax-free have been around for decades.


Whatever you think of the IRS, they aren't the master of their own destiny:

https://www.propublica.org/article/how-the-irs-was-gutted (2018)

> An eight-year campaign to slash the agency’s budget has left it understaffed, hamstrung and operating with archaic equipment. The result: billions less to fund the government. That’s good news for corporations and the wealthy.


Still, if the non-profit has private inurement, the non-profit shouldn't be able to take anything tax-free as it wouldn't qualify as a 501(c)(3). The bigger issue is definitely Delaware non-profit law though.


But if the non-profit gives all its assets to the new legal entity, shouldn't the new legal entity be taxed heavily? The gift tax rate goes up to 40% in the US, and 40% of the value of OpenAI is huge.


A non-profit can't give away its assets to a private entity, but it can exchange its assets for fair value, in this case, equity in the for-profit.


You don't need to sell/give the assets away to allow the for-profit to use them.

You sign an exclusive, irrevocable licensing agreement. Ownership of the original IP remains 100% with the original non-profit.

Now, this only works if the non-profit's board is on board.


ICYMI, Elon Musk restarted his lawsuit a month or two ago: https://www.reuters.com/technology/elon-musk-revives-lawsuit...

I'm wondering if OpenAI's charter might provide a useful legal angle. The charter states:

>OpenAI’s mission is to ensure that [AGI ...] benefits all of humanity.

>...

>We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity or unduly concentrate power.

>Our primary fiduciary duty is to humanity. We anticipate needing to marshal substantial resources to fulfill our mission, but will always diligently act to minimize conflicts of interest among our employees and stakeholders that could compromise broad benefit.

>...

>We are committed to doing the research required to make AGI safe, and to driving the broad adoption of such research across the AI community.

>We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. [...]

>...

https://openai.com/charter/

I'm no expert here, but to me, this charter doesn't appear to describe OpenAI's behavior as of 2024. Safety people have left, Sam has inexplicably stopped discussing risks, and OpenAI seems to be focused on racing with competitors. My question: Is the charter legally enforceable? And if so, could it make sense for someone to file an additional lawsuit? Or shall we just wait and see how the Musk lawsuit plays out, for now?


I would think it is legally enforceable, but I suspect Kathy Jennings is the only person who has standing to sue over it.


So perhaps we can start a campaign of writing letters to her?

I'm curious about the "fiduciary duty" part. As a member of humanity, it would appear that OpenAI has a fiduciary duty to me. Does that give me standing? Suppose I say that OpenAI compromises my safety (and thus finances) by failing to discuss risks, having a poor safety culture (as illustrated by employee exits), and racing. Would that fly?


Under Lujan v. Defenders of Wildlife, you have to suffer a concrete, discernible injury. They may have broken their promise to you, but unless you can prove the dollar amount of the harm to you, you can't sue.

Even if you donated to them, all states I know of assign sole oversight for proper management of those funds to the state AG. If you donate to a food bank and they use the money to buy personal Ferraris instead of helping the hungry, that's clearly illegal, but you'd be out the money either way, so you wouldn't have standing to sue. The attorney general has to sue for mismanagement of funds. If you feel OpenAI is violating their charter, I would definitely encourage writing to Mrs. Jennings to voice that opinion.


"Humanity vs. OpenAI" would look good on a docket.


Elon Musk absolutely has standing, as one of the biggest donors to the nonprofit. I assume he will settle for some ownership in the for-profit, though.


Maybe. In California that has been ruled to not be the case: https://www.thetaxadviser.com/issues/2021/sep/donor-no-stand...

I don't know the laws of Delaware well, but I would be surprised if he has standing even as a donor.


That was also specifically about a donor-advised fund, which is different from a nonprofit corporation. Elon Musk's tort would be something like "fraud in the inducement" or some weird theory like that, not a breach of fiduciary duty.


Didn't he already refuse the shares offered to him?


I'm sure they just didn't offer him enough shares.


I suppose Open Philanthropy does as well, then.


LOL, it's like a bank with stained glass windows to make passers-by think it's a church, isn't it.


Sam had a blog post literally two days ago that acknowledged risks. There's also still a sizeable focus on safety and people with roles dedicated to it at OpenAI.


Is there a sizable focus on safety? Last time I heard, there was only like one safety person left on the team.


> That was the point Musk was complaining about.

I think the real issue Musk was complaining about is that sama is quickly becoming very wealthy and powerful, and Musk doesn't want any competition in this space.

Hopefully some people watching all this realize that the people running many of these big AI-related projects don't care about AI. Sam Altman is selling a dream about AGI to help make himself both wealthier and more powerful; Elon Musk is doing the same with electric cars and AI.

People on HN are sincerely invested in the ideas behind these things, but it's important to recognize that the people pulling the strings largely don't care outside how it benefits them. Just one of the many reasons, at least in AI, truly open source efforts are essential for any real progress in the long run.


The notion that consciousness is going to emerge in a system where neurons are modelled as bits is laughable.


The famous last words of humanity before Skynet wakes up (mostly joking, but only mostly).


It's not weird, it's illegal.

There's a lot of law around preventing this sort of abuse of the non-profit concept.

The reason why the people involved are not on trial right now is a bit of a mystery to me, but could be a combination of:

* Still too soon, all of this really took shape in the past year or two.

* Only Musk has sued them, so far, and that happened last month.

* There's some favoritism from the government to the leading AI company in the world.

* There's some favoritism from the government to a big company from YC and Sam Altman.

I do believe Musk's lawsuit will go through. The last two points are worth less and less with time as AI is being commoditized. Dismantling OpenAI is actually a business strategy for many other players now. This is not good for OpenAI.


> Dismantling OpenAI is actually a business strategy for many other players now.

Which ones exactly?

NVIDIA is drinking sweet money from OpenAI.

Microsoft & Apple are in cahoots with it.

Meta/Facebook seems happy to compete with OpenAI on a fair playing field.

Anthropic lacks the resources.

Amazon doesn't seem to care.

Google is asleep.


Meta has to be happy someone else is currently looking as sketchy as they are. Thus the business strategy is to limit their power and influence as much as possible while also avoiding any appearance of direct competition, and letting the other guy soak up the bad PR.

Amazon gets paid either way, because even if OpenAI doesn't use them, where are you going to host the API that's talking with OpenAI?

If OpenAI looks weakened, I think we'll see that everyone else has a service they want you to try. But there's no use in making much noise about that, especially during an election year. No matter who wins, all the rejected everywhere will blame AI, and who knows what that will look like. So, sit back and wait for the leader of the pack to absorb all the damage.


Google is asleep? Gemini is the product of a company that's asleep?


Gemini is the product of a company that is still half-asleep. We’re trying to work with it on a big data case, and have seen everything, from missing to downright wrong documentation, missing SDKs and endpoints, random system errors and crashes, clueless support engineers… it’s a mess.

OpenAI is miles ahead in terms of ecosystem and platform integration. Google can come up with long context windows and cool demos all they want, OpenAI built a lot of moat while they were busy culling products :)


Fair enough.

I didn't realise it was that bad.


You’re right, Gemini is more of a product from a company in a vegetative state.


Gemini thinks the founding fathers of America were black and that the Nazis were racially diverse. So, yeah.


>NVIDIA is drinking sweet money from OpenAI.

NVIDIA makes money from any company doing AI. I would be surprised if OpenAI was a whole digit percentage of their revenue.

>Microsoft & Apple are in cahoots with it.

Nope. Apple is using OpenAI to fill holes their current model is not good at. This doesn't sound like a long-term partnership.

>Meta/Facebook seems happy to compete with OpenAI on a fair playing field.

They want open-source models to rule, driving proprietary models out of existence while they're at it.

>Anthropic lacks the resources.

Hence it would be better for them if OpenAI did not exist. It's the same for all the other AI companies out there.

>Amazon doesn't seem to care.

Citation needed. AWS keeps putting out market-leading products; they just don't make a big fuss about it.

>Google is asleep.

I'll give you this one. I have no idea why they keep Pichai around.


> I would be surprised if OpenAI was a whole digit percentage of their revenue.

It is not publicly known how much revenue Nvidia gets from OpenAI, but it is likely more than 1%, and they may be one of the top 4 unnamed customers in their 10Q filing, which would mean at least 10% and $3 billion [0].

That's not nothing.

[0] https://www.yahoo.com/tech/nvidia-gets-almost-half-revenue-0...


>I would be surprised if OpenAI was a whole digit percentage of their revenue.

As opposed to? The euphemism "I wouldn't be surprised" usually means you think what you're saying. If you negate that, you're saying what you _don't_ think is the case? I may be reading too much into what's probably a typo.


I read it as "I would be surprised if OpenAI were spending enough to constitute even 1% of Nvidia's revenue."


At first, I thought, “Wow, if companies can start as nonprofits and later switch to for-profit, they’ll exploit the system.” But the more I learned about the chaos at OpenAI, the more I realized the opposite is true. Companies will steer clear of this kind of mess. The OpenAI story seems more like a warning than a blueprint. Why would any future company want to go down this path?


It's quite simple: the talent pool was people who already had enough money that they quit their well-paying jobs at for-profit companies, in part because they wanted to keep doing high-impact work at a non-profit.

Now that OpenAI has found product-market fit, the early visionaries aren't needed anymore (although I'm sure the people working there are still amazing).


I think OpenAI took this play right out of one of its founding donors' playbooks. Pretend your company has lofty goals and you can get people to slide into moral relativism and work super-duper hard for you. These people definitely have framed posters with the "If you want to build a ship, don't drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea" quote somewhere in their living places/workspaces.


> Why would any future company want to go down this path?

Most would happily sell their soul and deal with any mess to reach a $150B valuation.


It is going to be taken advantage of. Musk and others have criticized this “novel” method of building a company. If it is legal then it is a puzzling loophole. But another way to look at it is it gives small and vulnerable companies a chance to survive (with different laws and taxes applying to the initial nonprofit). If you look at it as enabling competition against the big players it looks more reasonable.


>But another way to look at it is it gives small and vulnerable companies a chance to survive (with different laws and taxes applying to the initial nonprofit).

I feel like this is quite a slippery slope, though. Should we also give small companies a right to violate trademarks? Copyright? Kill people? These could also give them a chance to compete against big players.


I am not a tax specialist but from my understanding a non-profit is a for-profit that doesn't pay dividends. Why would the government care?


No, a non-profit is one in which there are no shareholders. The non-profit entity can own a lot and be extremely successful and wealthy, but it cannot give that money to any shareholders. It can pay out large salaries, but those salaries are scrutinized. It doesn't prevent abuse, and it certainly doesn't prevent some unscrupulous person from becoming extremely wealthy with a non-profit, but it is a little more complicated and limiting than you would think. Also, you get audited with routine regularity and if you are found in violation you lose your tax-exempt status, but you still are not a for-profit.


Yes: non-profits usually have members, not shareholders.

And, most importantly: non-profit charities (not the only kind of nonprofit, but presumably what OpenAI was) are legally obligated to operate “for the public good”. That’s why they’re tax exempt: the government is basically donating to them, with the understanding that they’re benefiting the public indirectly by doing so, not just making a few people rich.

In my understanding, this is just blatant outright fraud that any sane society would forbid. If you want to start a for-profit that’s fine, but you’d have to give away the nonprofit and its assets, not just roll it over to your own pocketbook.

God I hope Merrick Garland isn’t asleep at the wheel. They’ve been trust busting like mad during this administration, so hopefully they’re taking aim at this windmill, too.


> God I hope Merrick Garland isn’t asleep at the wheel. They’ve been trust busting like mad during this administration, so hopefully they’re taking aim at this windmill, too.

Little chance of that, as Sama is a big-time Democrat fundraiser and donor.


So are Google and Facebook :shrug:

Can't find a good source for both rn, but this one has Alphabet in the top 50 nationwide for this election: https://www.opensecrets.org/elections-overview/top-organizat...

edit: and Sam Altman isn’t exactly donating game changing amounts — around $300K in 2020, and seemingly effectively nothing for this election. That’s certainly nothing to sneeze at as an individual politician, but that’s about 0.01% of his net worth (going off Wikipedia’s estimate of $2.8B, not counting the ~$7B of OpenAI stock coming his way).

https://www.dailydot.com/debug/openai-sam-altman-political-d...


> So are Google and Facebook

When you see any numbers for corporations contributing to political campaigns, that's actually just measuring the contributions from the employees of those corporations. That's why most corporations "donate to both parties"--because they employ both Republicans and Democrats.


I'm not sure extreme wealth is possible with a non-profit. You can pay yourself half a million a year, get incredible kickbacks from the firms you hire to manage the nonprofit's investments, have the non-profit hire outside companies that you have financial interests in, and probably some other stuff. But none of these things are going to get you a hundred million dollars out of a non-profit. The exception seems to be OpenAI, which is definitely going to be netting at least a couple of people over a billion dollars, but as Elon says, I don't understand how or why this is possible.


Yes, definitely, that's the vast majority. I actually had Mozilla and their CEO in mind when I was thinking of "extreme" wealth. Also, I've heard some of the huge charities in the US have execs pulling down many millions per year, but I don't want to name any names because I'm not certain.


In the USA, the salaries of execs of non-profits are publicly listed in their form 990s they file with the IRS.

Name names. We can look it up.


> No, a non-profit is one in which there are no shareholders.

Again, I am not a lawyer, but that makes no sense. Otherwise, couldn't anyone claim the non-profit? So clearly there are some beneficial owners out there somehow.


The nonprofit is controlled by trustees and bound by its charter, not shareholders. Any profit a nonprofit organization makes is retained within the organization for its benefit and mission, not paid out to shareholders.


Has OpenAI been profitable so far? If not, is there any substantial tax that you have to pay in the US as a for-profit organization if you are not profitable?


A non-profit is a company that for accounting purposes does not have shareholders and therefore keeps nothing in retained earnings at the end of the period. The leftover money must be distributed (e.g. as salaries, towards the stated mission, etc.). Their financial statements list net profit for the period and nothing is retained.


The money doesn't have to be used. Many non-profits have very large balance sheets of cash and cash equivalent assets. The money just won't be paid out as dividends to shareholders.


Correct. They carry a net assets balance at the end of the period. But they do not retain earnings because there are no shareholders to pay out.


That's not correct, they also have tax advantages and a requirement to fulfill their charter.


Non-profits are tax-exempt, that's why they're carefully[1] regulated.

1: In principle; in practice, well, we'll see with this one!


Isn't transferring all of your value to a for-profit company that can pay dividends kinda the same thing?


The NFL used to be a nonprofit and is now for-profit. OpenAI can use similar routes.


Not an accountant, but there are different kinds of nonprofits: OpenAI is a 501(c)(3) (religious/charitable/educational), whereas the NFL was a 501(c)(6) (trade association).

Obviously we all think of the NFL as a big money organisation, but it basically just organises the fixtures and the referees. The teams make all the money.


If you want to be pedantic, in legal terms, no, the NFL is a big money org ($13B+/yr in revenue, with a commissioner earning $65M/yr).

They pay dividends to the teams, however, yes. But all that revenue (which is distinct from team revenue) is actually legally earned by the NFL itself.


65m a year... Wow! Richard Masters, the chief exec of the most-watched, most popular sports league in the world, is on about 2m a year.


Time to go open source



