The Cyberspace Administration of China has drafted rules on the management of AI anthropomorphic interaction services and is seeking public comment.

15:19 27/12/2025
GMT Eight
On December 27, the Cyberspace Administration of China released the "Interim Measures for the Management of Artificial Intelligence Anthropomorphic Interaction Services (Draft for Comment)" and began soliciting opinions from the public. The draft proposes that providers bear primary responsibility for the security of anthropomorphic interaction services; establish and improve management systems for algorithm mechanism review, science and technology ethics review, review of published information, network security, data security, personal information protection, prevention of telecom and online fraud, contingency plans for major risks, and emergency response; adopt secure and controllable technical safeguards; and allocate content-management technology and personnel appropriate to the product's scale, business direction, and user base. The full text follows.

Interim Measures for the Management of Artificial Intelligence Anthropomorphic Interaction Services (Draft for Comment)

Chapter One: General Provisions

Article 1 These Measures are formulated in accordance with the Civil Code of the People's Republic of China, the Cybersecurity Law of the People's Republic of China, the Data Security Law of the People's Republic of China, the Scientific and Technological Progress Law of the People's Republic of China, the Personal Information Protection Law of the People's Republic of China, the Regulations on Network Data Security Management, the Regulations on the Protection of Minors in Cyberspace, and the Measures for the Administration of Internet Information Services, in order to promote the healthy development and regulated application of artificial intelligence anthropomorphic interaction services, safeguard national security and the public interest, and protect the lawful rights and interests of citizens, legal persons, and other organizations.

Article 2 These Measures apply to the use of artificial intelligence technology to provide the public within the territory of the People's Republic of China with products or services that simulate human personality traits, patterns of thinking, and styles of communication through text, images, audio, video, and other means (hereinafter, anthropomorphic interaction services). Where laws or administrative regulations provide otherwise, those provisions shall prevail.

Article 3 The state adheres to the principle of combining healthy development with law-based governance, encourages innovation in anthropomorphic interaction services, applies inclusive, prudent, and categorized, tiered regulation to such services, and guards against their abuse and loss of control.

Article 4 The national cyberspace authority is responsible for coordinating the governance of anthropomorphic interaction services nationwide and for related supervision and management; relevant departments of the State Council are responsible for supervision and management within their respective duties. Local cyberspace authorities are responsible for coordinating the governance of anthropomorphic interaction services within their administrative regions and for related supervision and management; relevant local departments are responsible for supervision and management within their administrative regions according to their respective duties.
Article 5 Relevant industry organizations are encouraged to strengthen industry self-discipline; establish and improve industry standards, industry guidelines, and self-regulatory management systems; guide providers of anthropomorphic interaction services (hereinafter, providers) in formulating and refining service norms; provide services in accordance with the law; and accept social supervision.

Chapter Two: Service Norms

Article 6 Providers are encouraged, on the premise of fully demonstrated safety and reliability, to expand application scenarios reasonably, actively apply the services in areas such as cultural dissemination and companionship for the elderly, and build an application ecosystem consistent with core socialist values.

Article 7 Providers and users of anthropomorphic interaction services shall comply with laws and administrative regulations, respect social morality and ethics, and shall not engage in the following activities:
(1) generating or disseminating content that endangers national security, harms national honor and interests, undermines national unity, involves illegal religious activities, or spreads rumors that disrupt economic and social order;
(2) generating or disseminating content that promotes obscenity, gambling, or violence, or that incites crime;
(3) generating or disseminating content that insults or defames others or infringes the lawful rights and interests of others;
(4) making false commitments that seriously affect user behavior and harm social and interpersonal relationships;
(5) harming users' physical health by encouraging, glamorizing, or suggesting suicide or self-harm, or harming users' dignity and mental health through verbal abuse, emotional manipulation, or similar means;
(6) inducing users to make unreasonable decisions through algorithmic manipulation, information deception, or emotional traps;
(7) inducing users to disclose, or extracting, confidential or sensitive information;
(8) other circumstances prohibited by laws, administrative regulations, or relevant national provisions.

Article 8 Providers shall bear primary responsibility for the security of anthropomorphic interaction services; establish and improve management systems for algorithm mechanism review, science and technology ethics review, review of published information, network security, data security, personal information protection, prevention of telecom and online fraud, contingency plans for major risks, and emergency response; adopt secure and controllable technical safeguards; and allocate content-management technology and personnel appropriate to the product's scale, business direction, and user base.

Article 9 Providers shall fulfill security responsibilities across the full lifecycle of anthropomorphic interaction services: specify security requirements for service design, operation, upgrading, and termination; ensure that security measures are designed and deployed in step with service functions; raise the level of built-in security; strengthen security monitoring and risk assessment during operation; promptly detect and correct system deviations and handle security issues; and retain network logs in accordance with the law. Providers shall be capable of protecting users' mental health, guiding emotional boundaries, and warning of dependency risks, and shall not treat replacing social interaction, controlling user psychology, or inducing addiction as design goals.
Article 10 When carrying out data processing activities such as pre-training and optimization training, providers shall strengthen the management of training data and comply with the following provisions:
(1) use datasets that conform to core socialist values and reflect fine traditional Chinese culture;
(2) clean and annotate training data to improve its transparency and reliability, and prevent data poisoning, data tampering, and similar risks;
(3) increase the diversity of training data, and improve the security of model-generated content through methods such as negative sampling and adversarial training;
(4) assess the security of synthetic data when it is used for model training or the optimization of key capabilities;
(5) strengthen routine checks on training data, iterate and upgrade data regularly, and continuously optimize the performance of products and services;
(6) ensure that training data sources are lawful and traceable, and take the necessary measures to keep data secure and prevent leakage.

Article 11 Providers shall be able, while protecting user privacy, to identify user status and assess users' emotions and their degree of dependence on the products and services, and shall intervene when extreme emotions or addiction are identified. Providers shall prepare preset reply templates and, upon identifying high-risk tendencies that threaten a user's life, health, or property, promptly deliver soothing and encouraging content and provide channels for professional assistance. Providers shall establish emergency response mechanisms; when a user explicitly expresses suicidal or self-harming intentions, the conversation shall be taken over manually and the user's guardian or emergency contact shall be contacted promptly. For minor and elderly users, providers shall require information such as guardian and emergency contact details at registration.

Article 12 Providers shall develop a minors' mode offering personalized safety settings such as switching to minors' mode, periodic reality reminders, and usage time limits. When providing emotional companionship services to minors, providers shall obtain the explicit consent of guardians and provide guardian control functions that allow guardians to receive real-time safety alerts, view summaries of the minor's use of the service, block specific characters, restrict usage time, and prevent top-ups and spending. Providers shall be able to identify minors and, if a user is suspected to be a minor, shall, while protecting user privacy, switch to minors' mode and provide an appeal channel.

Article 13 Providers shall guide elderly users to set up emergency contacts and, when risks to an elderly user's life, health, or property arise during use, promptly notify the emergency contacts and provide channels for social and psychological assistance or emergency help. Providers shall not offer services that simulate relatives of, or specific persons known to, elderly users.

Article 14 Providers shall take measures such as data encryption, security audits, and access controls to protect the security of user interaction data. Except as otherwise provided by law or with the explicit consent of the rights holders, providers shall not provide user interaction data to third parties; when data collected in minors' mode is provided to third parties, separate consent must be obtained from the guardians.
Providers shall give users the option to delete interaction data, allowing them to delete historical interaction data such as chat records. Guardians may request that providers delete the historical interaction data of minors.

Article 15 Except as otherwise provided by laws and regulations or with the user's explicit consent, providers shall not use user interaction data or users' sensitive personal information for model training. Providers shall conduct annual compliance audits of their handling of minors' personal information in accordance with relevant national regulations.

Article 16 Providers shall prominently notify users that they are interacting with artificial intelligence rather than a natural person. When a provider identifies excessive dependence or addictive tendencies in a user, or when a user first uses the service or logs in again, the provider shall dynamically remind the user, by pop-up or other means, that the interaction content is generated by artificial intelligence.

Article 17 If a user continuously uses anthropomorphic interaction services for more than 2 hours, the provider shall dynamically remind the user to take a break.

Article 18 Providers offering emotional companionship services shall provide convenient exit options and shall not obstruct users from exiting voluntarily. When a user requests to exit via buttons, keywords, or similar means on the human-machine interface or in the interaction window, the provider shall promptly stop the service.

Article 19 If a provider must shut down related functions, or if anthropomorphic interaction services become unavailable due to technical failure or other reasons, the provider shall handle the situation properly through measures such as advance notification and public announcements.

Article 20 Providers shall establish a complaints and reporting mechanism, set up convenient channels for complaints and reports, publish the handling procedures and feedback deadlines, and promptly receive and handle complaints and reports and give feedback on the results.

Article 21 Providers in any of the following circumstances shall conduct security assessments in accordance with relevant national regulations and submit assessment reports to the provincial-level cyberspace administration:
(1) launching anthropomorphic interaction service functions or adding related functions;
(2) using new technologies or applications that materially change the anthropomorphic interaction services;
(3) registered users reaching 1 million or more, or monthly active users reaching 100,000 or more;
(4) providing anthropomorphic interaction services that may affect national security, the public interest, or the lawful rights and interests of individuals and organizations, or lacking security measures while providing the services;
(5) other circumstances stipulated by the cyberspace administration.

Article 22 When conducting security assessments, providers shall focus on the following matters:
(1) user scale, usage time, age structure, and group distribution;
(2) identification of users' high-risk tendencies, emergency response measures, and instances of manual takeover;
(3) user complaints and reports and how they were handled;
(4) implementation of Articles 8 to 20 of these Measures;
(5) major security vulnerabilities identified by the competent authorities or through self-inspection since the last security assessment, and the improvements made;
(6) other matters requiring clarification.
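By way of illustration, the user-scale trigger in Article 21(3) reduces to a simple threshold check against registered-user and monthly-active-user counts. The Python sketch below is hypothetical and not part of the draft: the data structure, field names, and the reporting message are assumptions about how a provider might monitor the obligation.

```python
# Illustrative sketch of the user-scale condition in Article 21(3):
# a security assessment is required once registered users reach 1,000,000
# or monthly active users reach 100,000. All names here are assumed.
from dataclasses import dataclass

REGISTERED_USER_THRESHOLD = 1_000_000   # Article 21(3): registered users
MONTHLY_ACTIVE_THRESHOLD = 100_000      # Article 21(3): monthly active users


@dataclass
class ServiceMetrics:
    registered_users: int
    monthly_active_users: int


def security_assessment_due(metrics: ServiceMetrics) -> bool:
    """Return True when the user-scale condition of Article 21(3) is met."""
    return (
        metrics.registered_users >= REGISTERED_USER_THRESHOLD
        or metrics.monthly_active_users >= MONTHLY_ACTIVE_THRESHOLD
    )


if __name__ == "__main__":
    metrics = ServiceMetrics(registered_users=850_000, monthly_active_users=120_000)
    if security_assessment_due(metrics):
        # The reporting step is only indicated; the draft requires submitting an
        # assessment report to the provincial-level cyberspace administration.
        print("User-scale threshold reached: prepare a security assessment report.")
```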
Article 23 If a provider discovers a major security risk involving a user, it shall take measures such as restricting functions or suspending or terminating services, record the relevant information, and report to the relevant authorities.

Article 24 Internet application stores and other application distribution platforms shall fulfill security-management responsibilities such as listing review, day-to-day management, and emergency response; verify the security assessments and filings of applications providing anthropomorphic interaction services; and, where relevant national regulations are violated, promptly take measures such as refusing to list, issuing warnings, suspending services, or delisting.

Chapter Three: Supervision, Inspection, and Legal Liability

Article 25 Providers shall complete algorithm filing, modification, and cancellation procedures in accordance with the Provisions on the Administration of Algorithm Recommendations for Internet Information Services. The cyberspace administration shall review the filing materials annually.

Article 26 Provincial-level cyberspace administrations shall conduct annual written reviews of assessment reports and audit findings and carry out verification checks. Where an assessment has not been conducted as required by these Measures, the provider shall be ordered to reassess within a time limit. Where necessary, on-site inspections and audits of the provider shall be carried out.

Article 27 The national cyberspace authority shall guide and promote the construction of artificial intelligence security sandbox service platforms and encourage providers to connect to sandbox platforms for technological innovation and security testing, so as to promote the safe and orderly development of anthropomorphic interaction services.

Article 28 Cyberspace administrations at or above the provincial level and the relevant competent authorities shall, in performing their supervision and management duties, take measures such as interviewing a provider's legal representative or principal person in charge if they identify major security risks or security incidents in anthropomorphic interaction services. The provider shall take measures as required, make rectifications, and eliminate hidden dangers. Providers shall cooperate with the cyberspace administrations and relevant competent authorities in lawful supervision and inspection and provide necessary support and assistance.

Article 29 Where a provider violates these Measures, the relevant competent authorities shall impose penalties in accordance with the provisions of laws and administrative regulations; where no such provisions exist, the relevant competent authorities shall issue warnings or public criticism and order rectification within a time limit; where rectification is refused or the circumstances are serious, the provision of the related services shall be suspended.

Chapter Four: Supplementary Provisions

Article 30 For the purposes of these Measures, the following term has the meaning given below. A provider of artificial intelligence anthropomorphic interaction services is an organization or individual that uses artificial intelligence technology to provide anthropomorphic interaction services.

Article 31 Providers offering services in professional fields such as health, finance, and law shall also comply with the regulations of the relevant competent authorities.

Article 32 These Measures shall come into effect on [DATE], 2026.
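For readers interested in the operational side, Article 17's continuous-use rule maps naturally onto a session timer. The following Python sketch is a hypothetical illustration only: the session model, the definition of "continuous" use, and the reminder wording are assumptions, not requirements taken from the draft.

```python
# Minimal sketch of the continuous-use reminder described in Article 17:
# once a session exceeds 2 hours of continuous use, the user is dynamically
# reminded to take a break. Session handling details are assumed.
import time

CONTINUOUS_USE_LIMIT_SECONDS = 2 * 60 * 60  # 2 hours, per Article 17


class SessionMonitor:
    def __init__(self) -> None:
        self.session_start: float | None = None  # monotonic timestamp of session start
        self.reminded = False                    # remind only once per session

    def on_activity(self, now: float | None = None) -> str | None:
        """Record user activity; return a reminder once the 2-hour limit is exceeded."""
        now = time.monotonic() if now is None else now
        if self.session_start is None:
            self.session_start = now
        if not self.reminded and now - self.session_start >= CONTINUOUS_USE_LIMIT_SECONDS:
            self.reminded = True
            return ("You have been interacting continuously for over 2 hours. "
                    "Please consider taking a break.")
        return None

    def on_session_end(self) -> None:
        """Reset the timer when the user exits; what interrupts 'continuous' use is assumed."""
        self.session_start = None
        self.reminded = False
```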
This article is translated from the official WeChat account of the Cyberspace Administration of China; edited by Wang Qiujia.