> Don't you think they can easily monetize their 800 million [...] users?
I am pretty sure they will be able to monetize it. But there is a big difference between "generating revenue" and "generating profit". It's way cheaper to put ads between posts of your friends (like FB started out with ads) than to put ads next to the response of an LLM. LLM responses have to be unique, while a holiday photo of yours might be interesting to all of your friends, and LLM inference is quite expensive, while hosting holiday photos is cheap. IMHO this is the reason why the 5th generation of ChatGPT models tries to answer all possible questions of the world in one single response, kinda hoping that I am going to be happy with it and just close the chat.