A Chinese AI team has accused a well-funded US lab of plagiarizing its open-source framework, an event that threatens the collaborative ethos underpinning the current AI boom.

A small Chinese artificial intelligence team has publicly accused Nous Research, a prominent and well-funded Silicon Valley AI lab, of plagiarizing the core architecture of its open-source project. The dispute centers on claims that the "self-evolution" feature in Nous Research's popular Hermes Agent, which has over 85,000 stars on GitHub, is a direct copy of Evolver, an engine created by the Chinese team EvoMap and released just 36 days earlier.
"We @EvoMapAI spent months and countless sleepless nights building Evolver," the company's founder wrote in a public post on X. "A well-resourced team behind Hermes Agent 'reinvented' it in just 30 days."
EvoMap released a detailed technical report showing a step-by-step correlation between the two systems. The report identifies a 10-step main loop in Hermes Agent's Python code that mirrors the logic in EvoMap's Node.js-based Evolver. It also highlights 12 pairs of core concepts where terminology was systematically replaced, such as "Evolver" being changed to "FunctionCalling" and "Gene" to "Skill," while the underlying architectural relationships remained identical.
The controversy highlights a growing concern in the developer community known as "AI code washing," where AI tools are used to rewrite code to obscure its origin, bypassing traditional plagiarism checks. For investors pouring billions into the sector, it introduces a significant reputational and intellectual property risk, questioning the originality of projects that gain rapid popularity. If core innovations are easily copied, the moats of even the most celebrated AI companies may be shallower than they appear.
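The renaming pattern EvoMap describes, where identifiers change but the architecture does not, is exactly what structural comparison tools are built to catch. As a minimal sketch (the code fragments below are hypothetical illustrations, not excerpts from either project), one common technique is to compare the abstract syntax trees of two programs after blanking out all identifiers and constants, so that a systematic rename such as "Gene" to "Skill" has no effect on the fingerprint:

```python
import ast

def shape(source: str) -> str:
    """Return a structural fingerprint of Python source: the AST with
    all identifiers and literal constants blanked out, so systematic
    renaming leaves the fingerprint unchanged."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        # Blank every name-carrying field (class names, function names,
        # attribute accesses, variables, argument names).
        for field in ("id", "name", "attr", "arg"):
            if hasattr(node, field) and isinstance(getattr(node, field), str):
                setattr(node, field, "_")
        if isinstance(node, ast.Constant):
            node.value = None
    return ast.dump(tree)

# Two invented fragments: identical logic, every name replaced.
original = """
class Gene:
    def mutate(self, rate):
        if rate > 0.5:
            return self.express(rate)
"""
renamed = """
class Skill:
    def adapt(self, weight):
        if weight > 0.5:
            return self.invoke(weight)
"""

print(shape(original) == shape(renamed))  # True: same structure despite renaming
```

A plain text diff of these two fragments would report almost every line as changed, which is why identifier-level rewriting, whether done by hand or by an AI tool, defeats naive plagiarism checks while leaving the structural signal fully intact.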
The timeline of events is central to EvoMap's accusation. The team's Evolver engine was made public on GitHub on February 1, 2026. Thirty-six days later, on March 9, 2026, Nous Research created the repository for the self-evolution component of its Hermes Agent. EvoMap points out that although Hermes Agent has seven public materials, including blog posts and technical documents, none of them mentions or credits Evolver, a project with over 1,800 stars and 114 version releases. This is a notable omission in the open-source community, where crediting related work is standard practice.
Nous Research's official response to the detailed allegations exacerbated the controversy. In a now-deleted reply, the company stated, "Our repo was created in July 2025. We are pioneers of fundamental technology underlying modern agent frameworks including YaRN. Delete your account." Critics were quick to point out that while the main Hermes Agent repository was created in 2025, it was private until February 2026, and the specific self-evolution module at the heart of the dispute was not created until after Evolver's public release. The dismissive and aggressive tone of the response, followed by the deletion of the post, drew widespread condemnation from the developer community.
The EvoMap and Nous Research incident is not an isolated case. It is part of a troubling pattern that threatens the foundation of open-source development. Other recent examples include Microsoft's Azure team being found to have copied large sections of code from a personal project called Spegel, and the AI coding startup Cursor, valued in the hundreds of millions, being caught using a model from Chinese firm Moonshot AI while claiming it was a proprietary system. These events suggest that as AI development accelerates, so does the risk of intellectual property theft, creating a difficult environment for smaller, independent creators. In response to the incident, EvoMap has changed the license for its Evolver project from the permissive MIT license to the more restrictive GPL-3.0, a move that forces any derivative work to also be open-sourced.
This plagiarism controversy serves as a critical data point for investors in the AI space. The high valuations of companies like Nous Research (reportedly funded with over $100 million) and Cursor are predicated on their proprietary technology and innovation. Scandals like this reveal a key vulnerability: the very open-source ecosystem that enables rapid progress also presents opportunities for bad actors to appropriate work without credit. This creates a climate of distrust that could lead to more restrictive licensing, potentially slowing the pace of innovation that the entire industry depends on.
This article is for informational purposes only and does not constitute investment advice.