
Highlights of International Legal Developments in Children’s Rights: The First Half of 2025

The year 2025 marked a period in which children’s rights were debated anew around the world, risks grew in scale with technological advances, and states sought to respond more visibly and quickly at the legislative level. Many countries proposed new legislation in areas such as child abuse, online safety, the right to education, the protection of refugee children, and child labor; carried out comprehensive reforms of existing regulations; or adopted dedicated rules on children’s rights for the first time. These developments were shaped not only by the legal framework itself but also by rising public awareness, international pressure, and the new threats posed by the digital world.

Redefining the Concept of Childhood

In recent years, the concept of childhood has evolved into a multidimensional phenomenon, no longer limited to biological age but shaped by developmental, psychosocial, digital, and cultural layers. With the accelerated digitalization that followed the pandemic, children are coming online at ever earlier ages, which presents both opportunities and threats.

In 2025, many countries adopted new regulations that included children’s digital rights, recognizing the need to redefine the concept of “childhood” in line with the times and technology. This transformation has come to be viewed not only as a pedagogical issue but also as a legal and ethical one.

Driving Forces of Legal Developments

Four fundamental dynamics appear to be driving legal developments in the field of children’s rights:

The Impact of International Conventions: Instruments such as the United Nations Convention on the Rights of the Child (CRC), the Lanzarote Convention, and ILO Conventions Nos. 138 and 182 continued to shape the domestic laws of many countries in 2025.
Technological Threats: Problems in areas such as cyberbullying, online abuse, and AI-assisted content filtering forced countries to develop new legal mechanisms. Liability rules for social media companies were particularly noteworthy.
Social Pressure and Awareness: Women’s and children’s rights activism, media campaigns, and civil society organizations pushed some countries into amending their laws.
The Refugee Problem Evolving into a Crisis: Regulations on the protection of children displaced by war, climate change, and migration occupied a prominent place on the 2025 legal agenda.

Expansion of Protective Law

Child protection legislation, which in the past primarily focused on physical violence and the right to education, evolved into a more complex and multilayered structure in 2025. The boundaries of protective law have been expanded in the following areas:

Online privacy and data security
Making psychological violence within the family visible
Equality and combating discrimination in the school environment
Freedom of expression and the right to self-actualization
The effects of climate change on children
In addition, some countries have taken steps to directly provide constitutional protection to children.

Preventive and Restorative Legal Approaches

Another prominent theme in 2025 is the development of preventive and restorative justice models rather than reliance on punitive sanctions alone. Examples include blocking harmful content on social media platforms through automatic filtering systems before it is even published, so that it never reaches children; providing digital literacy training to parents; offering psychosocial support to child victims; and designing restorative justice processes between perpetrator and victim. These practices have prompted legal steps aimed not only at punishment but also at rehabilitation.
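To make the idea of pre-publication filtering concrete, here is a minimal, purely illustrative Python sketch of a screening step that decides whether a draft post may be shown to minors before it goes live. The blocklist terms, labels, and function name are hypothetical assumptions rather than any platform’s or legislator’s actual mechanism; real systems combine machine-learning classifiers with human review.

```python
# Minimal illustrative sketch: screen a post before publication and decide
# whether it may be shown to minors. The blocklist and labels are
# hypothetical; production systems rely on ML classifiers and human review.
BLOCKED_FOR_MINORS = {"self-harm challenge", "casino bonus"}  # hypothetical terms

def screen_post(text: str) -> str:
    """Return 'all_audiences' or 'adults_only' before the post goes live."""
    lowered = text.lower()
    if any(term in lowered for term in BLOCKED_FOR_MINORS):
        return "adults_only"   # never surfaced to accounts flagged as minors
    return "all_audiences"

if __name__ == "__main__":
    print(screen_post("Weekend casino bonus inside!"))   # adults_only
    print(screen_post("Notes from today's chess club"))  # all_audiences
```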

So, let’s move on to the countries that, in the first half of 2025, attempted and/or implemented significant legislative amendments in the field of children’s rights or have pending legislative proposals, looking at the objectives they pursued in introducing these regulations and at the substance of the amendments themselves.

USA..

The “Kids Online Safety Act,” which aims to strengthen children’s online safety and privacy, has been introduced in the Senate.

According to the bill, there is a distinction between “Child” and “Minor” as follows:

The term “Child” means any individual under the age of 13.

The term “Minor” means any individual under the age of 17.

Each digital platform covered by the bill shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors, where a reasonable and prudent person would consider such harms reasonably foreseeable by the platform and the design feature a contributing factor to them:

(1) Eating disorders, substance use disorders, and suicidal behaviors.

(2) Depressive disorders and anxiety disorders, if such conditions have objectively verifiable and clinically diagnosable symptoms and are associated with compulsive use.

(3) Patterns of use that indicate compulsive use.

(4) Physical violence or online harassment that is severe, pervasive, or objectively offensive enough to interfere with a significant life activity of a minor.

(5) Sexual exploitation and abuse of minors.

(6) Distribution, sale, or use of narcotics, tobacco products, cannabis products, gambling, or alcohol.

(7) Financial damages resulting from unfair or deceptive acts or practices (as defined in the relevant section of the Federal Trade Commission Act).

The bill also comprehensively addresses platforms’ duty of care, the requirement to implement protective measures (particularly for minors), transparency about those measures, and the imposition of administrative fines following inspection where platforms fail to comply. It further discusses age verification and reporting processes, as well as technical penalties such as closing the platform in the event of repeated violations.

The bill also establishes a “Kids Online Safety Council.”

Accordingly, the Council’s duties are to submit reports to Congress containing recommendations and suggestions on matters related to the online safety of minors. The Council will address the following topics:

(1) Identify emerging or existing risks of harm to minors associated with online platforms;

(2) Recommend measures and methods for assessing, preventing, and mitigating harms suffered by minors online;

(3) Recommend methods and themes for conducting research on online harms to minors, including in English and non-English languages; and

(4) Recommend best practices and clear, consensus-based technical standards for transparency reports and audits, as required under this heading, including methods, criteria, and scope to promote overall accountability.

Continuing with the United Kingdom.

The “Online Safety Act” has come into effect, making age verification mandatory on social media.

The online age verification rules, which aim to protect children in the UK from harmful content, officially came into effect on July 26. The law requires digital platforms, particularly pornography sites, to verify the age of their users. Approximately 6,000 pornography sites have announced that they have complied with the law and implemented age verification.
The law isn’t limited to adult-content platforms. Services such as Xbox, Reddit, Bluesky, X, and Spotify, as well as dating apps, now also require their UK users to prove their age through selfies, passports, or government-issued ID documents. This is being interpreted as the beginning of a new era in internet use:

“Is this the end of an anonymous online existence?”

However, despite the stated aim of protecting children, these requirements have drawn harsh criticism from privacy advocates. Digital rights organizations such as the Electronic Frontier Foundation (EFF) warn that age verification systems can compromise user privacy and eliminate anonymity. Indeed, a recent example confirms these concerns: selfies and digital ID documents collected by the dating app Tea for its verification process were exposed on cyber forums in a data leak, laying bare how vulnerable such systems are.
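One way platforms try to reconcile age checks with the privacy concerns raised above is a “verify, then discard” pattern: derive only an over-18 flag from the document and never store the document itself. The Python sketch below is a hedged illustration of that idea under assumed data structures; it is not based on any specific provider’s implementation.

```python
# Illustrative sketch of a "verify, then discard" age check: only a boolean
# over-18 flag and a timestamp are retained, never the document itself.
# The dataclass fields and flow are assumptions for illustration only.
from dataclasses import dataclass
from datetime import date

@dataclass
class IdDocument:
    holder_name: str
    date_of_birth: date

@dataclass
class AgeAssertion:
    over_18: bool
    checked_on: date  # retained for audit; no personal data is kept

def check_age(doc: IdDocument, today: date | None = None) -> AgeAssertion:
    today = today or date.today()
    years = today.year - doc.date_of_birth.year - (
        (today.month, today.day) < (doc.date_of_birth.month, doc.date_of_birth.day)
    )
    assertion = AgeAssertion(over_18=years >= 18, checked_on=today)
    # The document object goes out of scope here and is never stored.
    return assertion

if __name__ == "__main__":
    doc = IdDocument("Sample User", date(2010, 5, 1))
    print(check_age(doc, today=date(2025, 7, 26)))  # over_18=False
```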

Users have already begun developing various methods to bypass the new age verification requirements. Creating fake ID documents, creating fake selfies using video game characters’ faces, or bypassing geographic restrictions via VPN are among the most common “digital escapes,” but these seemingly creative solutions also carry serious risks. Using fake documents is a crime that can result in legal penalties, while sharing these documents on platforms vulnerable to fraud can expose users to threats such as identity theft and data leaks.

In other words, those who try to trick systems are often forced to compromise their own security.

While the UK’s move is designed to make children’s online experience safer, it could pave the way for age verification to become the new digital standard around the world. Similar debates, particularly in European Union countries and the United States, indicate that the anonymous nature of the internet is increasingly being regulated.

This brings to mind a digital world in which users may be forced to reveal their identities not only when producing content but also when consuming it.

All these developments bring us to a fundamental dilemma of the digital age:

“Protecting children online or defending individuals’ right to privacy?”

This dilemma creates a deep fault line not only technologically but also in its ethical, legal, and societal dimensions. On one side, there are lawmakers and families trying to prevent harmful content to which children are exposed; on the other, there are digital rights organizations, activists, and other dissident individuals who advocate for a free and anonymous internet. With the increasing prevalence of artificial intelligence, facial recognition technologies, and biometric data, the boundaries of privacy are being redrawn daily.

In conclusion, the UK’s online age verification initiative should be considered not just a law, but a turning point that will influence the trajectory of the digital age.

How this process is managed, the extent to which states will respect individual rights, and how technology companies will protect user data will determine the boundaries of the future internet. Perhaps most importantly, users must no longer be merely content creators but also the greatest defenders of their digital rights.

Now, let’s move on to Australia.

Legislation banning social media use under the age of 16 has been passed.

A social media ban targeting children under the age of 16 has been passed by the Australian Parliament, a world first.

The legislation allows fines of up to 50 million Australian dollars (about 33 million US dollars) for platforms such as TikTok, Facebook, Snapchat, Reddit, X, and Instagram that systematically fail to prevent children under 16 from holding accounts.

The Senate passed the bill by 34 votes to 19. The House of Representatives overwhelmingly approved the bill by 102 votes to 13.

The House of Representatives then endorsed the amendments made in the Senate, completing the bill’s passage into law.

Prime Minister Anthony Albanese said the legislation supports parents concerned about online harm to their children.

Now we look at Canada.

Parliament passed education reform to protect the cultural identity of Indigenous children.

Germany…

Germany does not want to ban social media for children under 16, citing the right of children and young people to participate digitally and explore their digital lives safely.

However, parental consent is required in Germany for children under 16 to use social media. In practice, there appears to be no rigorous verification that this consent has actually been given: children can simply provide a false date of birth when registering on social media platforms.

This situation often doesn’t result in sanctions against social media providers. Germany places the responsibility for age limit checks on social media companies.

The German Federal Ministry for Family Affairs, Senior Citizens, Women and Youth points out that the EU General Data Protection Regulation (GDPR, known in German as the DSGVO) requires the consent of parents before service providers may process the personal data of children and adolescents under the age of 16.

Italy..

New legislation has been adopted to combat bullying and abuse in schools.

The law instructs the government to establish a technical committee for the prevention of bullying and cyberbullying within the Ministry of National Education, comprised of experts in psychology, pedagogy, and social communication.

Furthermore, the government will conduct periodic information campaigns on the prevention and awareness of bullying and cyberbullying, as well as parental control techniques.

The government is also instructed to adopt, within 12 months, one or more legislative decrees establishing other appropriate measures to assist victims of bullying and cyberbullying, including the public Childhood Emergency number 114, accessible free of charge 24/7.

This public number will be tasked with providing psychological and legal assistance to victims, their families, and friends and, in the most serious cases, with immediately reporting dangerous situations to the police.

Furthermore, the National Institute of Statistics will be required to conduct a survey every two years to measure the problem of bullying and cyberbullying and identify those most exposed to these risks.

Under the new legislation, each school must adopt an internal regulation on the prevention of bullying and cyberbullying and establish a permanent supervisory board composed of students, teachers, families, and experts.

The law designates January 20th as “Respect Day” each year to examine the issue of respect for others, raise awareness about psychological and physical non-violence, and combat all forms of discrimination and abuse.

We’re moving on to Spain.

The draft law, designed to protect children in digital environments, was approved by the Council of Ministers in March 2025. It includes measures such as raising the age for opening a social media account from 14 to 16, requiring default parental control systems on devices, and defining crimes related to AI-based child pornography and deepfake content.

ICT product manufacturers are now required to offer free parental control systems activated at the time of purchase.
The regulation prohibits in-game random reward mechanisms, such as loot boxes, for users under the age of 18.
The new regulation also introduces provisions penalizing online grooming and the creation of profiles for criminal purposes, as well as virtual restraining orders (bans on approaching victims online).

Portugal..

A law introducing harsh penalties to combat child labor has entered into force.

The political party Bloco de Esquerda (BE) is proposing raising the working age from 16 to 18 in Portugal to align it with the duration of compulsory education. This step aims to prevent children from entering the workforce before completing their education.

The proposal has been debated in parliament and discussed in committees, but has not yet been enacted.

Unfortunately, the majority appears to be cautious about this proposal; some parties argue that this change could alienate young workers from the formal system and make it more difficult to monitor them.

We continue with Sweden.

New regulations have been made to the Parental Code.

The amendments, which came into effect on January 1, 2025, define children’s rights regarding custody, residence, and visitation (boende och umgänge). These rules are now in effect for all ongoing cases.

The principle of “best interests of the child” (barnets bästa) is taken into account when making decisions to ensure the safety and well-being of children.

Furthermore, a new Social Services Law proposal was adopted on January 23, scheduled to come into force on July 1, 2025.

The new Social Services Law strengthens children’s rights by aligning the legislation with the Convention on the Rights of the Child. It states that “social services must take the child’s views into account when assessing the child’s best interests.” Children also have the right to information about interventions that concern them, and social services must ensure that the child understands this information.

The new bill includes the following:

Social services should be more preventative and identify needs before they become too serious;
Social services’ preventative work against crime should be clarified;
It should be easier to access social services and receive help when needed; and
Social services should be able to respond more quickly in emergencies.

Also in Sweden:

A bill has been introduced that would allow the monitoring of the electronic communications of children under the age of 15, including a proposed wiretapping authorization.

Ju2024/02286 – Data lagring och åtkomst till elektronisk information (on the retention of electronic communications data and law enforcement access to it)

Children’s rights organizations in the country oppose this proposal as an invasion of children’s privacy and call for respect for legal rights.

Let’s look at Norway.

Increased oversight of social media platforms has been implemented to address children’s digital rights.

Furthermore,

The Norwegian government is taking decisive action to protect children online by proposing a public consultation on a new law that would ban social media platforms from providing services to children under the age of 15.

Recognizing the serious impact of screen use and social media on children’s sleep, mental health, learning, and concentration, Norway appears committed to creating a safer online environment for children.

Prime Minister Jonas Gahr Støre said, “This is one of the most pressing social and cultural challenges of our time and cannot be solved by national measures alone. We aim to strengthen cooperation with Europe to ensure a safe digital environment for children and young people.”

Minister of Children and Families Lene Vågslid said: “We cannot allow screens and algorithms to take over childhood. Children must be protected from harmful content, abuse, commercial exploitation, and the misuse of their personal data.”

Developing effective enforcement mechanisms for absolute age limits is both a legal and technological challenge. Currently, there is no fully effective solution for age verification. Norway aims to work closely with the EU and other European countries addressing the same issue to develop practical and accessible solutions.

Karianne Tung, Minister of Digitalization and Public Administration, said:

“Digitalization transcends national borders.

Norway is working closely with the EU on how to regulate large technology companies. We want to find common solutions on age verification and age restrictions.”

The proposed law aims to protect children and young people from potential harms associated with social media use, including exposure to criminal activity.

The law also includes a definition of what constitutes a social media platform, which will play a key role in determining which services are subject to age restrictions.

Most importantly, the law will not restrict children’s participation in leisure activities or social communities. The law is designed to respect children’s fundamental rights, such as freedom of expression, access to information, and the right to association.

Exceptions will be proposed for services such as video games and platforms used for communication purposes related to school or extracurricular activities.

The Norwegian government is also implementing several complementary initiatives to protect children online:

Increasing the age of consent under the GDPR for the processing of personal data by information society services to 15 years.
Publishing recommendations from national health authorities on screen use, screen time, and social media.
A clear national proposal to remove mobile phones from schools.
Proposed legislation to increase penalties for violations of child-targeted marketing regulations.
Combating online crime and the exploitation of children and young people.

Some figures on the situation in Norway:

72% of children aged 9-12 use social media.
75% of the population supports electronic age verification on social media.
60% believe that age limits for social media use should be imposed by the government, not platforms or parents.
Denmark…

A new national strategy plan for children’s internet safety has been implemented.

Under the EU Digital Services Act (DSA), Denmark has launched a pilot program for an age verification app for children’s internet safety. This app will be used to verify those over the age of 18, ensuring the protection of personal information.

The Danish government has actively implemented the DSA to tighten control over problems such as cyberbullying and harmful, potentially addictive content on online platforms.

The government aims to make age verification tools mandatory to protect children online.

Following a proposal from the Danish Welfare Commission, children aged 7–16 are prohibited from bringing mobile phones to school; this practice is being given legal force in all folkeskoler (primary and lower-secondary schools).

It is also recommended that children under the age of 13 not be given smartphones or tablets.

Sikker Internet Centre Danmark (Danish Safer Internet Centre) provides awareness raising, a helpline, and psychosocial counseling to ensure a safe online experience that respects children’s digital rights.

This structure operates within the BIK+ platform, a joint initiative of the EU.

As part of “Sikker Internet Day” (Safer Internet Day) for 2025, a conference focusing on children’s digital rights was held in Copenhagen on February 26.

Belgium..

As part of the joint Child-Friendly Justice Project of the Council of Europe and the EU, Belgium launched its new “Child-Friendly Justice Assessment Tool” in June 2025. This tool aims to align the justice system with child-focused norms, with Belgium participating as a pilot country in this process.

The Council of Europe’s Children’s Rights Division, together with representatives of Belgium, Poland, and Slovenia, presented this vital new document at a high-level meeting held in Brussels during the Polish Presidency of the Council of the European Union.

This innovative tool, a product of the Joint European Union/Council of Europe Project on Child-Friendly Justice (CFJ Project), is designed to enable member states to rigorously assess and subsequently strengthen their national justice frameworks. By providing clear indicators, it enables a comprehensive assessment of legislation, institutions, and practices, ensuring their compliance with the Council of Europe’s established Child-Friendly Justice Guidelines.

Following a practical demonstration of the Assessment Tool, compelling presentations were made from Belgium, Poland, and Slovenia, the key focus countries of the CFJ Project. The common findings from country-specific self-assessments highlighted the practical value of such tools in promoting progress and mutual learning across the continent.

This new Assessment Tool is envisioned as an important reference for national authorities and all professionals interacting with children within the legal system. It will facilitate the identification of strengths, the addressing of shortcomings, and the long-term monitoring of progress, ultimately contributing to broader European efforts to protect children’s rights in all proceedings affecting children.

The tool is currently available in English, with translations into French, Dutch, Polish, and Slovenian underway, and its official release is expected in the second half of 2025.

Poland

A bill imposing new duties on electronic service providers to limit children’s access to harmful content (including pornography) online was submitted to public consultation in February 2025.

The bill:

Requires providers to conduct risk analyses;
Requires age verification mechanisms before pornographic content is displayed;
Creates a domain name blocking system to prevent access to undesirable content;
Proposes the development of independent verification systems that do not rely on direct biometric identification or on users’ own declarations.

Penalties are quite severe: platforms and internet service providers that fail to implement verification may be subject to administrative fines of up to PLN 1 million (approximately EUR 230,000). (A simplified, purely illustrative sketch of these requirements follows.)
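A heavily simplified sketch of the two gates the Polish bill describes, domain-level blocking and an age check before pornographic content is served, might look like the following. The domain names, session handling, and function shape are illustrative assumptions only, not the bill’s actual technical requirements.

```python
# Highly simplified sketch of the two gates described above: a domain
# blocklist and an age-verification step before adult content is served.
# Domain names and data structures are hypothetical.
BLOCKED_DOMAINS = {"example-adult-site.test"}   # hypothetical blocking registry
AGE_VERIFIED_SESSIONS = {"session-abc"}         # sessions that passed a check

def may_serve(domain: str, session_id: str, is_adult_content: bool) -> bool:
    if domain in BLOCKED_DOMAINS:
        return False                            # blocked at the domain level
    if is_adult_content and session_id not in AGE_VERIFIED_SESSIONS:
        return False                            # age check required first
    return True

if __name__ == "__main__":
    print(may_serve("example-adult-site.test", "session-abc", True))   # False
    print(may_serve("news.example", "anonymous", False))               # True
```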

Czechia

With a bill passed in 2025, the Czech Republic allows children from the age of 14 to work during the summer holidays under certain conditions. This regulation recognizes the right of some children to work while also introducing strict controls and restrictions to protect their education and health. However, the amendment is not seen as part of a broader strategy to combat child labour, as it was prepared from a rather narrow and shallow perspective.

Consider Hungary.

On March 18, 2025, the Hungarian Parliament passed a law further strengthening the “Child Protection Act.” This law prohibits events involving “gender reassignment or homosexuality” for children. It also introduced a regulation allowing the use of facial recognition technology to identify participants in such events.

On April 14, 2025, the 15th amendment to the Hungarian Constitution was adopted. This amendment controversially defined “person” as “male or female” in Article L(1) of the Constitution and guaranteed the right of every child to “the protection and care of their physical, psychological, and moral development” in Article XVI(1).

We look at Ukraine, a country victimized by war.

Ukraine has launched various projects for the protection and rehabilitation of child victims of war within the framework of the Council of Europe’s Children’s Rights Strategy for the period 2022-2027. These projects aim to protect children from violence, enhance their access to fair judicial processes, and strengthen psychosocial support services. Efforts to protect displaced, parentless, or victims of violence are particularly prominent.

During the Russian occupation, Ukraine faced serious human rights violations, including the forced deportation of children and their military training in Russia. International organizations and the Ukrainian government emphasize that these violations should be considered war crimes. The European Parliament adopted a resolution on this issue, stating that the forced deportation and military training of children are against international law.

UNICEF and the United Nations are running various rehabilitation programs for child victims of war in Ukraine. These programs aim to ensure that children receive psychological support, continue their education, and grow up in safe environments. In particular, efforts are being made to prevent and treat injuries resulting from mines and unexploded ordnance.

We’re coming to South Africa…

Regulations have been introduced to ban child marriage.

The bill proposes to completely ban child marriage in both civil and traditional marriages, setting the minimum marriage age at 18. In this context:

Marriage involving anyone under the age of 18 will be strictly prohibited; under current law, girls as young as 12 and boys as young as 14 could, regrettably, be married with parental or local court permission.
The bill subjects individuals who carry out or facilitate child marriage to criminal sanctions: imprisonment or a fine could be imposed.
During the bill’s approval process, campaigns and public hearings were held, and many citizens and NGOs argued that the minimum marriage age should be raised from 18 to 21.

Let’s look at Nigeria…

The Federal Government has decided to review the National Policy and the 2021–2025 National Child Labour Elimination Action Plan in collaboration with the ILO.

As part of the policy review, the hazardous work list will be updated and existing legal gaps will be addressed.

On February 14, 2025, the Federal Ministry of Labour and Employment, the ILO, and the National Child Labour Elimination Steering Committee launched a new platform and mobile application.

This tool is used for centralized reporting, monitoring, and rapid response to child labour cases.

Nigeria has also incorporated ILO Convention No. 138 on the Minimum Age for Admission to Employment and Convention No. 182 on the Worst Forms of Child Labour into its domestic law.

Egypt

A new budget increase and legal amendments have been made to ensure children’s right to education.

The Egyptian Government approved the budget for the 2025/26 fiscal year, which begins in July 2025; total expenditure was set at 4.6 trillion Egyptian pounds (approximately 91 billion USD).

Total spending increased by 18%, and the share allocated to education also increased compared to previous years. There was a significant expansion in resources allocated to areas such as social services, education, and healthcare.

Even so, the budget still falls short of the constitutional requirement to allocate at least 4% of GDP to preschool and primary education, with only approximately 1.7% of government spending going to education.

Kenya..

As of April 29, 2025, the Communications Authority of Kenya (CA) published a Child Online Protection and Security Industry Guide covering the entire IT sector.

The guide aims to protect children under 18 from online risks by requiring:

Age verification systems, parental controls, and default privacy settings;
Complaint and reporting mechanisms and privacy-by-design practices;
Suppliers and content providers to establish child safety policies.

The Free Pentecostal Fellowship of Kenya has launched a mobile application called the Linda Mtoto Early Warning System in the Busia region.

The system allows users to anonymously report child abuse, exploitation, or neglect via SMS.

Reports are forwarded to local authorities; the system is being rolled out in stages across the Teso North, Teso Central, and Busia regions in 2025.
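Purely as an illustration of how such an SMS-based early-warning intake could be structured, the sketch below parses an anonymous report and routes it to a duty office. The message format, field names, and routing table are hypothetical assumptions and are not taken from the actual Linda Mtoto system.

```python
# Purely illustrative intake handler for an SMS-based early-warning report.
# Region routing, field names, and the message format are assumptions,
# not the real Linda Mtoto system.
from dataclasses import dataclass

ROUTING = {  # hypothetical region -> duty office mapping
    "teso north": "Teso North children's office",
    "teso central": "Teso Central children's office",
    "busia": "Busia children's office",
}

@dataclass
class Report:
    region: str
    category: str   # e.g. "abuse", "neglect", "exploitation"
    details: str    # free text; no reporter identity is collected

def route_sms(body: str) -> tuple[str, Report]:
    """Parse 'region;category;details' and return (duty office, report)."""
    region, category, details = (part.strip() for part in body.split(";", 2))
    office = ROUTING.get(region.lower(), "national helpline")
    return office, Report(region=region, category=category, details=details)

if __name__ == "__main__":
    office, report = route_sms("Busia; neglect; child left unattended near market")
    print(office, report.category)
```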

Moving back to the Americas…

Brazil…

CONANDA Resolution No. 245, dated April 5, 2024, established key principles governing the rights and privacy of children and adolescents in the digital world.

The decision includes:

Only necessary data must be collected,
Clear and understandable information must be provided,
The basis for consent must be free, informed, and clearly stated,
Age verification systems must be made mandatory,
Digital platforms must be accountable and publish annual risk reports.
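The principles listed above (data minimization, clear information, and an explicit consent basis) can be pictured with a minimal consent-record sketch like the one below. The allowed fields and the registration flow are illustrative assumptions, not CONANDA’s or the ANPD’s specification.

```python
# Minimal sketch of a consent record reflecting the principles listed above:
# collect only what is needed, require free and informed guardian consent,
# and keep the notice text understandable. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

ALLOWED_FIELDS = {"nickname", "birth_year"}   # data minimization: nothing else

@dataclass
class ConsentRecord:
    child_fields: dict
    guardian_consent: bool
    notice_text: str                           # must be clear and understandable
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def register_child(data: dict, guardian_consent: bool, notice_text: str) -> ConsentRecord:
    extra = set(data) - ALLOWED_FIELDS
    if extra:
        raise ValueError(f"collecting unnecessary fields: {extra}")
    if not guardian_consent:
        raise ValueError("free, informed guardian consent is required")
    return ConsentRecord(data, guardian_consent, notice_text)

if __name__ == "__main__":
    rec = register_child({"nickname": "Lu", "birth_year": 2014}, True,
                         "We only keep your nickname and birth year.")
    print(rec.recorded_at.isoformat())
```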

The ANPD has prioritized child data protection as part of its 2025 Regulatory Agenda.

Among the agenda targets are:

Age verification,
Parental consent mechanisms,
Implementation of privacy-by-design policies,
Regulation of biometric data, and the mandatory use of risk assessment reports (PIA).
The ANPD is developing specific guidelines on child data protection, particularly regarding the use of facial recognition systems, educational platforms, and artificial intelligence.

We examine Argentina.

It became the first Latin American country to impose age restrictions for children’s social media use.

Chile.

A national prosecutor’s office for children’s rights violations has been established.

The “Brigada Investigadora de Delitos Sexuales y Menores” (BRISEXME), operating under the Jefatura Nacional de Delitos contra la Familia (JENAFAM), is a special police unit responsible for investigating crimes against children and supporting their prosecution.

Under Article 94 bis of the Penal Code, added in 2019, the statute of limitations for sexual crimes against children has been abolished; prosecutors can therefore pursue these cases without any time limit.

Looking at Colombia:

Rehabilitation programs to reduce the impact of war on children have become legal.

Ley 2421 de 2024, which entered into force on August 24, 2024, strengthens Ley 1448 de 2011 (Law on Victims and Land Restitution), particularly with respect to child victimization.

Under this law:

The state is obligated to develop a psychosocial and health rehabilitation policy for children and youth. Programs supported by trained personnel ensure the reintegration of victims into society and their psychological recovery.
Special support and resources are allocated specifically for children who have been abused, exploited by armed groups, or victims of conflict.
Rehabilitation and psychosocial support for children in Colombia are legally guaranteed.
Ley 2421 de 2024 provides for the development of public policies, the allocation of financial resources, and coordination mechanisms specifically for child victims. Implementation is carried out by institutions such as the ICBF and the UAEARIV and aims to support child victims in terms of health, education, family unity, and rights.

Mexico…

In 2025, the Supreme Court of Mexico (SCJN) ruled that child sex crimes would no longer be subject to a statute of limitations in criminal and civil cases. This decision aims to provide victims with more time to heal from their trauma and ensure justice.

Amendments to the Federal Penal Code have increased penalties for child sexual assault. For example, the penalty for pederasty has been increased from 17 to 24 years. These reforms aim to impose harsher sentences on offenders and protect victims.

In the State of Yucatán, amendments to the Penal Code in 2025 increased penalties for child sexual assault. These changes strengthen local efforts to protect children.

In Mexico, criminal sanctions have been introduced against individuals who sexually abuse children through social media, in order to prevent child sexual abuse in the digital environment. This measure aims to increase children’s safety online.

Various laws and protocols are in place to protect the rights of child victims and provide them with psychosocial support. For example, the “Protocol for the Prevention of Sexual Abuse of Children” (Protocolo de Prevención del Abuso Sexual a Niñas, Niños y Adolescentes) provides a framework for the protection and support of victims.

Japan…

Age verification systems have been made mandatory for children’s online safety.

Japan has implemented various regulations to ensure the safety of children on social media platforms. For example, Instagram launched “Teen Accounts” in January 2025 for users aged 13-17. These accounts only allow messaging with approved followers, and users under 16 require parental consent to change security settings.

Additionally, online service providers in Japan are required to use digital identity verification systems to confirm users’ ages; these systems draw on a variety of methods to establish identity.

Japan is working to strengthen its digital identity verification systems. For example, “My Number” cards are used to verify individuals’ identities and are equipped with IC chips to enhance security in online transactions. Such digital identity verification systems are expected to support age verification processes on online platforms.

South Korea

South Korea has taken a significant step toward protecting children in the digital environment, establishing the Digital Children’s Rights Commission. This commission works to ensure children’s online safety, protect their digital rights, and mitigate the risks they face in the digital world. The commission aims to safeguard children’s digital rights by working in collaboration with government agencies, civil society organizations, and other stakeholders.

This step has become a global priority for children’s rights due to the rapid development of the digital world and the increasing potential risks children face in this environment. The United Nations Committee on the Rights of the Child has issued recommendations to protect children’s rights in the digital world and urged states to take measures. In this context, South Korea’s establishment of the Digital Children’s Rights Commission aims to ensure children have a safer presence in the digital environment by demonstrating an approach consistent with international standards.

The Commission’s activities include developing various strategies to reduce the risks children face in the digital world, increase their digital literacy, and protect their digital rights. These efforts demonstrate South Korea’s commitment to protecting children’s safety and rights in the digital world.

Let’s also take a look at China:

China has taken significant steps toward integrating digitalization into its education system by 2025. In particular, Beijing has made artificial intelligence training mandatory for all students from primary to secondary school. As part of this initiative, students will receive at least eight hours of AI training annually. The training is differentiated by age group:

Primary school students: Basic AI concepts and applications.
Middle school students: Use of AI in daily life and schoolwork.
High school students: In-depth studies on AI applications and innovation.
This reform aims to increase China’s competitiveness in the global AI race.

China has introduced strict regulations to prevent children’s addiction to digital games. Specifically, during the 2025 winter break, children’s total gaming time has been limited to 15 hours. This is a measure aimed at reducing children’s gaming addiction and promoting a more balanced lifestyle.

Additionally, regulations implemented in 2021 limit the gaming time of individuals under the age of 18 to three hours per week. These regulations require gaming companies to use real-name identity verification systems and to refuse service outside the designated hours.
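As a rough illustration of how a “designated hours” rule of this kind can be enforced in code, the sketch below checks whether an unverified minor may play at a given moment. The hard-coded window (20:00–21:00 on Fridays, weekends, and holidays, as commonly reported) and the omission of holiday handling are simplifying assumptions, not the regulator’s specification.

```python
# Illustrative check of a "designated hours" rule. The commonly reported
# window (20:00-20:59 on Fridays, Saturdays, and Sundays) is hard-coded as
# an assumption; public-holiday handling is omitted for brevity.
from datetime import datetime

ALLOWED_WEEKDAYS = {4, 5, 6}         # Friday=4, Saturday=5, Sunday=6
ALLOWED_HOUR = 20                    # 20:00-20:59

def minor_may_play(now: datetime, is_verified_adult: bool) -> bool:
    if is_verified_adult:            # real-name verification gates the check
        return True
    return now.weekday() in ALLOWED_WEEKDAYS and now.hour == ALLOWED_HOUR

if __name__ == "__main__":
    print(minor_may_play(datetime(2025, 1, 31, 20, 30), is_verified_adult=False))  # Friday evening: True
    print(minor_may_play(datetime(2025, 2, 3, 20, 30), is_verified_adult=False))   # Monday: False
```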

India, another hugely populous country…

India has introduced new penalties for child labor. Under the Child and Adolescent Labour (Prohibition and Regulation) Act, 1986, the employment of children under the age of 14 is prohibited, and adolescents between the ages of 14 and 18 may be employed only in non-hazardous work. As of 2025, penalties for employers who violate this law have been increased:

Employing child labour: imprisonment from 6 months to 2 years and a fine of 20,000 to 50,000 INR.
Repeat offenders: up to 3 years in prison and a fine of up to 1,000,000 INR.

These penalties have been tightened further by reforms, particularly in the state of Gujarat. Various measures are being implemented across India to combat child labor:

Gujarat State: Between 2020 and 2025, 4,824 raids were conducted, and 616 child laborers were rescued. These operations resulted in 791 criminal cases and 339 criminal complaints.
Bihar State: As of 2025, 30 children were rescued, each receiving financial support of 25,000 Indian Rupees. An additional contribution of 5,000 Indian Rupees was also made for each rescued child.

Such practices are implemented to ensure children’s right to education and provide economic support to families.

Pakistan..

In 2025, Pakistan took a significant step by enacting a law in the capital, Islamabad, banning child marriage. This law set the minimum age for marriage for both girls and boys at 18, criminalizing child marriage. The law also repealed the old 1929 law regulating child marriage.

⚖️ Key Articles of the New Law
Marriage Age: The marriage age for both girls and boys has been set at 18.
Punishment Sanctions: Those who facilitate, force, or organize child marriage will face prison sentences of up to 7 years and a fine.
Courts: Cases related to child marriage will be heard only in regional and high criminal courts.
Protection Measures: The law includes measures such as confidentiality and anonymity to protect victims.

The Council of Islamic Ideology (CII), Pakistan’s highest religious advisory body, described this law as “un-Islamic” and argued that setting the marriage age at 18 violates Sharia law. However, despite these objections, Pakistani President Asif Ali Zardari approved the law and ensured its enactment.

Bangladesh..

Bangladesh adopted a national campaign law against child labor and abuse in 2025. This law has led to significant steps towards the protection of children.

⚖️ Key Features of the Law
Prohibition of Child Marriage: The law prohibits the marriage of those under the age of 18.
Combating Child Labor: The employment of children in hazardous work is prohibited, and special units have been established to combat such situations.
Education and Awareness Programs: Education programs have been launched to raise public awareness of children’s rights.
National Monitoring Mechanisms: National monitoring mechanisms have been established to monitor the effectiveness of the law and address any problems encountered in its implementation.

Indonesia..

Indonesia took a significant step towards protecting children in the digital environment in 2025. The Electronic System Operators’ Child Protection Regulation (GR 17/2025), approved by President Prabowo Subianto, entered into force on January 13, 2025.

⚖️ Key Provisions of GR 17/2025
Children’s Digital Rights: Special regulations have been introduced for the digital protection of children under the age of 18.
Parental Consent: Parental consent has been made mandatory for the collection of children’s personal data.
Data Protection: Children’s personal data is considered sensitive and has been given stricter protection.
Responsibility of Electronic System Providers: Digital platform providers are obligated to prevent the misuse of children’s data and to take measures against harmful content.

🛡️ Measures Against Cyberbullying
GR 17/2025 aims to prevent the dangers children face in the digital world, such as cyberbullying. However, this regulation is not a law directly aimed at combating cyberbullying. Rather, it provides a general framework for ensuring children’s safety in the digital environment.

The Indonesian government is working on more specific regulations to combat cyberbullying. In this context, measures such as age restrictions and content controls for social media platforms are on the agenda. However, the details of these regulations have not yet been finalized.

Philippines..

In the Philippines, an important legal regulation came into force in 2025 to ensure faster and fairer investigations of child abuse cases. In this context, new provisions added to the existing law, RA 9231 (Child and Adolescent Labor Act), have significantly expedited criminal prosecution processes.

The newly added Section 16-A regulates the following process (an illustrative timeline sketch follows below):

Preliminary investigation: must be completed within 30 days of the filing of the complaint.
Filing of the information: the information must be filed within 48 hours of the completion of a preliminary investigation that confirms the allegations.
Trial: the case is expected to be concluded within 90 days, and the verdict delivered within 15 days.
Additionally:

Sec 16-B: Provides victims with exemptions from filing fees and criminal litigation fees.

Sec 16-C: Provides the right to free legal, medical, and psychosocial support.
This comprehensive legal reform aims to effectively expedite court proceedings related to child labor and abuse.
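To show how the statutory periods above chain together, here is an indicative Python sketch that computes the resulting deadlines from a complaint date. Which event starts each clock, and the simple day arithmetic, are assumptions made for illustration rather than a legal reading of Section 16-A.

```python
# Indicative timeline calculator for the periods listed above (30 days,
# 48 hours, 90 days, 15 days). Which event starts each clock is a
# simplifying assumption here, not a reading of the statute.
from datetime import date, timedelta

def case_timeline(complaint_filed: date) -> dict[str, date]:
    prelim_due = complaint_filed + timedelta(days=30)
    information_due = prelim_due + timedelta(days=2)      # 48 hours
    trial_end_due = information_due + timedelta(days=90)
    verdict_due = trial_end_due + timedelta(days=15)
    return {
        "preliminary investigation due": prelim_due,
        "information filed by": information_due,
        "trial concluded by": trial_end_due,
        "verdict delivered by": verdict_due,
    }

if __name__ == "__main__":
    for step, due in case_timeline(date(2025, 3, 1)).items():
        print(f"{step}: {due.isoformat()}")
```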

Taken together, these provisions suggest that:

The principle of providing victim-centered and swift justice aims to minimize the long-term effects of trauma.

Instead of slow processes in the official system, a structure is being implemented that prevents unnecessary waiting for victims.

Instead of delaying legal obligations, a “priority litigation process” is effectively defined for victims of abuse.

Moving on to Malaysia;

Malaysia significantly revised its legal framework for protecting children in the digital environment in 2025. Its aim is to create a stronger regulatory framework against cyberbullying, abusive content, and digital threats targeting children.

Published in the Federal Government Gazette on May 22, 2025, the Online Safety Act 2025 regulates the protection of children by expanding the definition of “harmful content” on digital platforms.

Sections 15A and 15B cover violations such as child sexual abuse content, grooming, and sextortion.

Licensed internet and social media providers (ASPs and CASPs) are now obligated to implement the Child Safety Code, block harmful content, implement age-appropriate filters, and provide a security plan.

A revision planned for June 2025 will require parents and guardians to monitor their children’s online activities and participate in digital safety training.

This approach is based on the principle that “child safety is everyone’s responsibility.”

As of January 1, 2025, the authorities have implemented a licensing requirement for platforms with over 8 million users.

These platforms are also required to submit an Online Safety Plan to the MCMC annually.

The establishment of an Online Safety Committee and an Online Safety Appeal Tribunal is also part of the bill.

Through its Penal Code (Amendment) (No. 2) Bill 2024, Malaysia criminalized cyberbullying, punishing actions such as threats, insults, and the sharing of a person’s identifying information.

We’re in Singapore.

Singapore significantly upgraded its national online child safety standards to ensure children’s internet safety by 2025. These regulations include multiple components, including platform sanctions, age verification systems, app store codes, and parental controls.

🔹 The new Online Safety Code of Practice for App Distribution Services (ADS), published by the IMDA (Infocomm Media Development Authority), came into effect on March 31, 2025. Under this code:

System-level measures must be in place to protect users under the age of 18 and prevent their exposure to harmful content.
App stores (the Apple App Store, Google Play, etc.) are required to implement age verification processes to prevent children from downloading inappropriate apps. Platforms that have not yet implemented age verification systems must submit an IMDA-approved implementation plan.

Amendments to the Broadcasting Act and related online safety laws empower the authorities to require social media services to deal quickly with content that could harm children and to impose sanctions on platforms that respond late to complaints or fail to moderate; these sanctions have been strengthened.

The ADS code mandates age verification technologies in stores to prevent children from downloading inappropriate content: methods such as age-estimating AI, biometric verification, or the use of official ID.
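A tiered age-assurance flow of the kind the ADS code points to might, in very simplified form, try a less intrusive method first and fall back to a document check. In the Python sketch below, the method names, their order, and the placeholder return values are all hypothetical.

```python
# Illustrative sketch of a tiered age-assurance flow: try a less intrusive
# estimate first, then fall back to a document check. Method names,
# ordering, and placeholder values are assumptions.
from typing import Callable, Optional

def facial_age_estimate(user_id: str) -> Optional[int]:
    # Placeholder for an ML age-estimation service; returns None if unsure.
    return None

def id_document_age(user_id: str) -> Optional[int]:
    # Placeholder for a verified age taken from an official ID check.
    return 20

def assured_age(user_id: str,
                methods: list[Callable[[str], Optional[int]]]) -> Optional[int]:
    for method in methods:
        age = method(user_id)
        if age is not None:
            return age          # first method that yields a confident answer
    return None                 # no method succeeded; treat as unverified

if __name__ == "__main__":
    age = assured_age("user-1", [facial_age_estimate, id_document_age])
    print("may download 18+ app:", age is not None and age >= 18)
```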

Measures such as applying stricter default privacy settings to the accounts of users under 18 and requiring parental consent have also been built into the system.

Platforms are expected to establish functional user complaint mechanisms and ensure that complaints are resolved quickly.

Additionally, under IMDA governance requirements, platforms are required to submit annual online safety reports and to be transparent about how user behavior is monitored.

App stores that fail to comply may be subject to administrative fines of up to S$1 million.

As part of GrowWell SG, the national health and education strategy, standards have been established to limit children’s screen time and prevent digital addiction.

Digital literacy training and online safety awareness programs are being offered to families in collaboration with schools and parents. Parental control apps and tools are also being expanded.

Regarding Russia…

Russia significantly tightened penalties in 2025 to combat child pornography and the exploitation of children in the digital environment. The new legislation introduces particularly strong measures against the involvement of children in criminal organizations, sexual abuse content, or the exploitation of children for malicious purposes.

In December 2024 and the first half of 2025, the State Duma passed amendments that specifically stipulate harsh penalties for the incitement and exploitation of children in online crimes. These include:

Luring children via the internet into joining criminal organizations or aggressive groups;

Involving children under the age of 14 in criminal activities through violence or threats;

Drawing large numbers of children into crime via the internet.

In these cases, penalties range from 3–9 years, and in aggravated cases, 8–10 years in prison, along with professional bans and other sanctions.

These new norms in Russia have attracted attention as tools for combating digital child abuse, particularly in the period since the invasion of Ukraine.

Amendments to Articles 150 and 151 of the Penal Code significantly increased the maximum penalty for crimes committed online.

Another regulation, which came into effect in June 2025, mandated that pedophilia offenders be subject to strict supervision after release; these individuals must be registered in the monitoring system within 72 hours and kept under continuous reporting and psychiatric monitoring.

Furthermore, pedophilia offenders were banned from entering schools, daycare centers, and public institutions for children; this decision took effect on April 6, 2024.

Saudi Arabia.

The regulation, which came into effect on February 21, 2025, introduced clear provisions on matters directly affecting children, such as the age of marriage, custody, alimony, and inheritance rights.

Marriages under the age of 18 are now permissible only if the required consent requirements are met and supported by official health and psychological reports. This is a significant development, and we very much hope this positive trend continues.

The Public Prosecution has clarified its strict prohibition of employing individuals under the age of 15.

Saudi Arabia’s “CPC – Child Protection in Cyberspace” initiative, presented at the UN Human Rights Council in 2025, was recognized as a step toward internationally supporting child safety in the digital environment.

This initiative brings together technical capacity building, training, and collaboration tools, paving the way for raising standards of digital protection within the country.

The Frontliners training program, launched on December 12, 2024, is supported by the Social Affairs Council, the Ministry of Labor, and the ILO, and aims to increase monitoring and enforcement capacity against child labor.

United Arab Emirates

The Digital Wellbeing Pact was signed in February 2025; the government, social media platforms, and telecom operators are collaborating to increase children’s online safety.

Awareness training programs for both parents and teachers on children’s online safety have been expanded through the “Child Digital Safety” campaign.

Criticism of Legal Reforms

Although many countries highlighted new regulations on children’s rights in 2025, serious problems have been observed in implementing these reforms. In some countries laws have been enacted but oversight mechanisms remain weak, while in others the legal changes remain largely symbolic. There has also been criticism that some countries have used “moral panic” rhetoric to legitimize changes to child abuse laws in ways that could restrict children’s freedoms. It is therefore clear that legal regulation alone is not enough to protect children’s rights; enforcement, oversight, and awareness must be strengthened as well.

“Seeing Children”

In 2025, children’s rights ceased to be a matter solely related to children and became an indicator that tests societies’ perspectives on justice, equality, and conscience. Every law developed, every regulation enacted, reflects the way adult society confronts its own responsibilities. Recognizing children as “subjects” is as important as viewing them as beings “in need of protection.” In other words, it’s crucial not only to safeguard their rights but also to listen to them, see them, and ensure their participation.
