I know nothing about companies (especially in the US), but I find it weird that a company can go from non-profit to for-profit. Surely this would be taken advantage of. Can someone explain to me how this works?
In practice it's doable, though. You can just create a new legal entity and move assets and/or do future value-creating activity in the new co. If everyone is on board with the plan on both sides of the move, then it's totally doable with enough lawyers and accountants.
If the non-profit is on board with that though, then they're breaking the law. The IRS should reclassify them as a for-profit for private inurement and the attorney general should have the entire board removed and replaced.
> An eight-year campaign to slash the agency’s budget has left it understaffed, hamstrung and operating with archaic equipment. The result: billions less to fund the government. That’s good news for corporations and the wealthy.
Still, if the non-profit has private inurement, the non-profit shouldn't be able to take anything tax-free as it wouldn't qualify as a 501(c)(3). The bigger issue is definitely Delaware non-profit law though.
But if the non-profit gives all its assets to the new legal entity, shouldn't the new legal entity be taxed heavily? The gift tax rate goes up to 40% in the US, and 40% of the value of OpenAI is huge.
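To put a rough number on that (purely illustrative: the valuation is an assumption, and whether gift tax would even apply to this kind of transfer is exactly the open question), the arithmetic would look something like this:

    # Back-of-the-envelope sketch: top US gift tax rate applied to a
    # hypothetical transferred asset value. The valuation is made up.
    valuation = 80e9       # assume an $80B valuation (illustrative only)
    gift_tax_rate = 0.40   # top US gift tax rate
    tax = valuation * gift_tax_rate
    print(f"Hypothetical tax: ${tax / 1e9:.0f}B")  # Hypothetical tax: $32B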
I'm wondering if OpenAI's charter might provide a useful legal angle. The charter states:
>OpenAI’s mission is to ensure that [AGI ...] benefits all of humanity.
>...
>We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity or unduly concentrate power.
>Our primary fiduciary duty is to humanity. We anticipate needing to marshal substantial resources to fulfill our mission, but will always diligently act to minimize conflicts of interest among our employees and stakeholders that could compromise broad benefit.
>...
>We are committed to doing the research required to make AGI safe, and to driving the broad adoption of such research across the AI community.
>We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. [...]
I'm no expert here, but to me, this charter doesn't appear to characterize OpenAI's behavior as of the year 2024. Safety people have left, Sam has inexplicably stopped discussing risks, and OpenAI seems to be focused on racing with competitors. My question: Is the charter legally enforceable? And if so, could it make sense for someone to file an additional lawsuit? Or shall we just wait and see how the Musk lawsuit plays out, for now?
So perhaps we can start a campaign of writing letters to her?
I'm curious about the "fiduciary duty" part. As a member of humanity, it would appear that OpenAI has a fiduciary duty to me. Does that give me standing? Suppose I say that OpenAI compromises my safety (and thus finances) by failing to discuss risks, having a poor safety culture (as illustrated by employee exits), and racing. Would that fly?
Under Lujan v. Defenders of Wildlife, you have to suffer a concrete, particularized injury. They can have broken their promise to you, but unless you can point to a concrete harm, say a dollar amount it cost you, you can't sue.
Even if you donated to them, all states I know of assign sole oversight for proper management of those funds to the state AG. If you donate to a food bank and they use the money to buy personal Ferraris instead of helping the hungry, that's clearly illegal, but you'd be out the money either way, so you wouldn't have standing to sue. The attorney general has to sue for mismanagement of funds. If you feel OpenAI is violating their charter, I would definitely encourage writing to Mrs. Jennings to voice that opinion.
Elon Musk absolutely has standing, as one of the biggest donors to the nonprofit. I assume he will settle for some ownership in the for-profit, though.
That was also specifically about a donor-advised fund, which is different from a nonprofit corporation. Elon Musk's tort would be something like "fraud in the inducement" or some theory like that, not a breach of fiduciary duty.
Sam had a blog post literally two days ago that acknowledged risks. There's also still a sizeable focus on safety at OpenAI, and people with roles dedicated to it.
I think the real issue Musk was complaining about is that sama is quickly becoming very wealthy and powerful and Musk doesn't want any competition in this space.
Hopefully some people watching all this realize that the people running many of these big AI-related projects don't care about AI. Sam Altman is selling a dream about AGI to make himself wealthier and more powerful; Elon Musk is doing the same, whether with electric cars or with "better" AI.
People on HN are sincerely invested in the ideas behind these things, but it's important to recognize that the people pulling the strings largely don't care beyond how it benefits them. Just one of the many reasons why, at least in AI, truly open-source efforts are essential for any real progress in the long run.
There's a lot of jurisprudence around preventing this sort of abuse of the non-profit concept.
The reason why the people involved are not on trial right now is a bit of a mystery to me, but could be a combination of:
* Still too soon, all of this really took shape in the past year or two.
* Only Musk has sued them, so far, and that happened last month.
* There's some favoritism from the government to the leading AI company in the world.
* There's some favoritism from the government to a big company from YC and Sam Altman.
I do believe Musk's lawsuit will go through. The last two points are worth less and less with time as AI is being commoditized. Dismantling OpenAI is actually a business strategy for many other players now. This is not good for OpenAI.
Meta has to be happy someone else currently looks as sketchy as they do. So the business strategy becomes: limit OpenAI's power and influence as much as possible, avoid any appearance of direct competition, and let the other guy soak up the bad PR.
Amazon gets paid either way, because even if OpenAI doesn't use them, where are you going to host the API that's talking to OpenAI?
If OpenAI looks weakened, I think we'll see that everyone else has a service they want you to try. But there's no use in making much noise about that, especially during an election year. No matter who wins, everyone who feels rejected will blame AI, and who knows what that will look like. So, sit back and wait for the leader of the pack to absorb all the damage.
Gemini is the product of a company that is still half-asleep. We’re trying to work with it on a big data case, and have seen everything, from missing to downright wrong documentation, missing SDKs and endpoints, random system errors and crashes, clueless support engineers… it’s a mess.
OpenAI is miles ahead in terms of ecosystem and platform integration. Google can come up with long context windows and cool demos all it wants; OpenAI built a lot of moat while Google was busy culling products :)
> I would be surprised if OpenAI was a whole digit percentage of their revenue.
It is not publicly known how much revenue Nvidia gets from OpenAI, but it is likely more than 1%, and OpenAI may be one of the top four unnamed customers in Nvidia's 10-Q filing, which would mean at least 10% of revenue, or roughly $3 billion [0].
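To spell out the arithmetic behind that (the revenue figure below is just an assumption for illustration, not Nvidia's reported number; the 10% cutoff is the customer-concentration disclosure threshold in question):

    # If a customer crosses the 10%-of-revenue disclosure threshold,
    # the implied minimum spend is 10% of total revenue for the period.
    total_revenue = 30e9   # assumed revenue for the period (illustrative only)
    threshold = 0.10       # 10% customer-concentration disclosure cutoff
    min_spend = total_revenue * threshold
    print(f"Implied minimum spend: ${min_spend / 1e9:.0f}B")  # $3B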
>I would be surprised if OpenAI was a whole digit percentage of their revenue.
As opposed to what? The phrase "I wouldn't be surprised" usually means you think what you're saying is the case. If you negate that, are you saying what you _don't_ think is the case? I may be reading too much into what's probably a typo.
At first, I thought, “Wow, if companies can start as nonprofits and later switch to for-profit, they’ll exploit the system.” But the more I learned about the chaos at OpenAI, the more I realized the opposite is true. Companies will steer clear of this kind of mess. The OpenAI story seems more like a warning than a blueprint. Why would any future company want to go down this path?
It's quite simple: the talent pool consisted of people who already had enough money that they quit their well-paying jobs at for-profit companies, in part because they wanted to keep working at a high-impact non-profit.
Now that OpenAI has found its product-market fit, the early visionaries aren't needed anymore (although I'm sure the people working there are still amazing).
I think OpenAI took this play right out of one of its founding donors' playbooks. Pretend your company has lofty goals and you can get people to rationalize moral compromises and work superduper hard for you. These people definitely have framed posters with the "If you want to build a ship, don't drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea" quote somewhere in their living places/workspaces.
It is going to be taken advantage of. Musk and others have criticized this “novel” method of building a company. If it is legal then it is a puzzling loophole. But another way to look at it is it gives small and vulnerable companies a chance to survive (with different laws and taxes applying to the initial nonprofit). If you look at it as enabling competition against the big players it looks more reasonable.
>But another way to look at it is it gives small and vulnerable companies a chance to survive (with different laws and taxes applying to the initial nonprofit).
I feel like this is quite a slippery slope, though. Should we also give small companies a right to violate trademarks? Copyright? Kill people? These could also give them a chance to compete against big players.
No, a non-profit is one in which there are no shareholders. The non-profit entity can own a lot and be extremely successful and wealthy, but it cannot give that money to any shareholders. It can pay out large salaries, but those salaries are scrutinized. That doesn't prevent abuse, and it certainly doesn't prevent some unscrupulous person from becoming extremely wealthy via a non-profit, but it is a little more complicated and limiting than you might think. Also, you get audited with routine regularity, and if you are found in violation you lose your tax-exempt status, though you're still not a for-profit.
Yes: non-profits usually have members, not shareholders.
And, most importantly: non-profit charities (not the only kind of nonprofit, but presumably what OpenAI was) are legally obligated to operate “for the public good”. That’s why they’re tax exempt: the government is basically donating to them, with the understanding that they’re benefiting the public indirectly by doing so, not just making a few people rich.
In my understanding, this is just blatant outright fraud that any sane society would forbid. If you want to start a for-profit that’s fine, but you’d have to give away the nonprofit and its assets, not just roll it over to your own pocketbook.
God I hope Merrick Garland isn’t asleep at the wheel. They’ve been trust busting like mad during this administration, so hopefully they’re taking aim at this windmill, too.
> God I hope Merrick Garland isn’t asleep at the wheel. They’ve been trust busting like mad during this administration, so hopefully they’re taking aim at this windmill, too.
Little chance of that as Sama is a big time Democrat fundraiser and donor.
edit: and Sam Altman isn't exactly donating game-changing amounts: around $300K in 2020, and seemingly effectively nothing for this election. That's certainly nothing to sneeze at for an individual politician to receive, but it's only about 0.01% of his net worth (going off Wikipedia's estimate of $2.8B, not counting the ~$7B of OpenAI stock coming his way).
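For what it's worth, the 0.01% figure checks out against the numbers quoted above:

    # Donation as a fraction of estimated net worth, using the figures above.
    donation = 300_000   # ~$300K donated in 2020
    net_worth = 2.8e9    # Wikipedia's net worth estimate
    print(f"{donation / net_worth:.4%}")  # 0.0107%, i.e. about 0.01%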
When you see any numbers for corporations contributing to political campaigns, that's actually just measuring the contributions from the employees of those corporations. That's why most corporations "donate to both parties"--because they employ both Republicans and Democrats.
I'm not sure extreme wealth is possible with a non-profit. You can pay yourself half a million a year, get incredible kickbacks from the firms you hire to manage the non-profit's investments, have the non-profit hire outside companies that you have financial interests in, and probably some other stuff. But none of these things are going to get you a hundred million dollars out of a non-profit. The exception seems to be OpenAI, which is definitely going to net at least a couple of people over a billion dollars each, but, as Elon says, I don't understand how or why this is possible.
Yes, definitely, that's the vast majority. I actually had Mozilla and their CEO in mind when I was thinking of "extreme" wealth. I've also heard that some of the huge charities in the US have execs pulling down many millions per year, but I don't want to name names because I'm not certain.
> No, a non-profit is one in which there are no shareholders.
Again, I am not a lawyer, but that makes no sense. Otherwise, couldn't anyone just claim the non-profit? So clearly there are some beneficial owners out there somehow.
The nonprofit is controlled by trustees and bound by its charter, not shareholders. Any profit a nonprofit organization makes is retained within the organization for its benefit and mission, not paid out to shareholders.
Has OpenAI been profitable so far? If not, is there any substantial tax that you have to pay in the US as a for-profit organization if you are not profitable?
A non-profit is a company that, for accounting purposes, does not have shareholders and therefore keeps nothing in retained earnings at the end of the period. The leftover money must be distributed (e.g. as salaries, towards the stated mission, etc.). Its financial statements list net profit for the period, and nothing is retained.
The money doesn't have to be used. Many non-profits have very large balance sheets of cash and cash equivalent assets. The money just won't be paid out as dividends to shareholders.
Not an accountant, but there are different kinds of nonprofits: OpenAI is a 501(c)(3) (religious/charitable/educational), whereas the NFL was a 501(c)(6) (trade association).
Obviously we all think of the NFL as a big money organisation, but it basically just organises the fixtures and the referees. The teams make all the money.
If you want to be pedantic, in legal terms, no, the NFL is a big money org ($13B+/yr in revenue, with a commissioner earning $65M/yr).
They do distribute that money to the teams, yes. But all that revenue (which is distinct from team revenue) is actually legally earned by the NFL itself.