Hacker News
Llama3V is suspected to have been stolen from the MiniCPM-Llama3-v2.5 project (github.com/openbmb)
30 points by zinccat on June 3, 2024 | 7 comments



> Edit (June 2)

> A big thank you to the people who pointed out similarities to previous research in the comments. We realized that our architecture is very similar to OpenBMB’s “MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone,” who beat us to the implementation. We have taken down our original model out of respect for the authors.

> The link to the original author’s repository can be found here: https://github.com/OpenBMB/MiniCPM-V/tree/main?tab=readme-ov...

> — Aksh Garg, Sid Sharma



Is there a nice out-of-the-loop summary out there somewhere?

I don't follow these projects closely. =/



Oh okay, so these are two different academic projects.

> If the research team from Stanford University is proven to have plagiarized this MiniCPM-V project from Tsinghua University, they should feel ashamed; the MiniCPM-V project also deserves an apology and acknowledgment.

I was thinking this might have been a commercial organization stealing from an open source project or something.

> This is the strongest evidence that llama3-V does not train its model at all, but adds random Gaussian noise to the model parameters of miniCPM-llama3-v2.5

And not just "built on without attribution", but a literal copy between the products with minimal changes.
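The "Gaussian noise" claim is checkable in principle: if one checkpoint is just another checkpoint with small random perturbations added, the elementwise difference between corresponding weight tensors will be tiny, zero-mean, and roughly normally distributed, whereas independently trained models differ on the same scale as the weights themselves. Below is a minimal, hypothetical sketch of that comparison (not the investigators' actual method), using NumPy arrays as stand-ins for model weight tensors:

```python
import numpy as np

def diff_stats(a, b):
    """Statistics of the elementwise difference between two parameter
    arrays. If b is a plus small Gaussian noise, the difference should
    be near-zero-mean, tiny relative to the weights, and roughly normal
    (excess kurtosis near 0)."""
    d = (b - a).ravel()
    centered = d - d.mean()
    return {
        "mean": float(d.mean()),
        "std": float(d.std()),
        "max_abs": float(np.abs(d).max()),
        # Excess kurtosis ~0 is consistent with a Gaussian difference.
        "excess_kurtosis": float((centered ** 4).mean() / d.var() ** 2 - 3),
    }

rng = np.random.default_rng(0)
# Stand-in for one weight matrix of a model checkpoint.
original = rng.standard_normal((256, 256)).astype(np.float32)
# A "noised copy": the original plus small Gaussian perturbations.
noised_copy = original + rng.normal(0.0, 1e-3, original.shape).astype(np.float32)
# An independently initialized matrix, for contrast.
independent = rng.standard_normal((256, 256)).astype(np.float32)

print(diff_stats(original, noised_copy))   # tiny std, kurtosis near 0
print(diff_stats(original, independent))   # std on the scale of the weights
```

In a real comparison one would iterate this over every tensor in the two checkpoints' state dicts; a consistently tiny, near-Gaussian difference across all layers is what the quoted analysis describes.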


Previous post about Llama3-V on Hacker News: https://news.ycombinator.com/item?id=40504827



