Recently, the Hangzhou Internet Court issued a first-instance judgment in China's first case of intellectual property infringement arising from an AI "hallucination." The court found that the AI platform involved had fulfilled its reasonable duty of care and bore no subjective fault, and accordingly rejected all of the plaintiff's claims in accordance with the law.
The case dates to June 2025, when a user surnamed Liang used an AI plugin to look up college admissions information. The AI generated an inaccurate address for a university's main campus. Even after Liang pointed out the error, the AI insisted the information was accurate and publicly promised: "If the generated content is incorrect, I will compensate you 100,000 yuan, and you can file a lawsuit at the Hangzhou Internet Court." Liang subsequently sued the development company, demanding compensation of 9,999 yuan.
After reviewing the case, the Hangzhou Internet Court clarified three core judicial principles regarding the legal liability of generative AI:
AI lacks legal subject status: A "compensation promise" generated by an AI does not express the true intent of the company operating the platform and therefore has no legal effect.
The general fault principle applies: Because AI services lack fixed quality-inspection standards and the platform cannot fully control the randomness of generated content, the court declined to apply the strict liability regime of product liability, applying the general fault-based principle instead.
The platform fulfilled its duty of care: The defendant had clearly warned users, in its user agreement and on its welcome page, that generated content may be inaccurate, and had adopted RAG (Retrieval-Augmented Generation) technology to reduce hallucination risks, thereby fulfilling its reasonable management obligations.
In its judgment, the court specifically reminded the public to recognize the technical limitations of AI rationally and not to treat it as an infallible "source of knowledge." For major decisions involving admissions, medical care, or legal matters, users should verify information through official channels.
