Why in News?
- The Union Government is considering a separate legislation to regulate social media usage among children under 18 years, adopting a graded, age-based restriction framework rather than a blanket ban.
- The proposed law may be introduced during the Monsoon Session of Parliament after stakeholder consultations.
- The move reflects rising concerns globally about digital addiction, harmful online content, and mental health impacts on children, while also balancing digital rights and access to information.
What’s in Today’s Article?
- Government’s Proposed Approach
- Global Context - Rising Regulation of Children’s Social Media Use
- Policy Support
- Rationale Behind Regulation
- Concerns and Challenges
- Way Forward
- Conclusion
Government’s Proposed Approach:
- Graded age-based restrictions:
- The government is examining a tiered regulatory structure for different age groups -
- 8–12 years: Strictest restrictions with parental supervision and limited usage.
- 12–16 years: Moderate restrictions with controlled access.
- 16–18 years: Relatively relaxed restrictions but still monitored.
- The aim is to balance child safety with digital exposure, acknowledging that today’s children are more digitally aware than earlier generations.
- Possible regulatory measures:
- Several mechanisms are under discussion. For example,
- Time-based restrictions: Limiting daily usage hours.
- Login restrictions: Preventing social media access during evening or night hours.
- Parental consent: Required for creating accounts.
- Platform accountability: Social media companies may need to implement safeguards for minors.
- These proposals draw inspiration from global models such as China’s one-hour-per-day limit for online gaming for minors.
Global Context - Rising Regulation of Children’s Social Media Use:
- Concerns regarding children’s online safety have triggered regulatory initiatives worldwide. For instance,
- Australia: Introduced landmark legislation restricting children’s social media use.
- Indonesia: Plans to ban Instagram and other “high-risk” platforms for users under 16.
- France: President Emmanuel Macron proposed banning social media for children below 15.
- European countries: Spain and others are considering similar restrictions.
- These initiatives highlight growing concerns about AI-driven harmful content, cyberbullying, and digital addiction.
Policy Support:
- The Economic Survey 2025–26 recommended -
- Age-based limits on social media use for children.
- Restrictions on targeted digital advertisements for minors.
- Promotion of simpler devices such as basic phones and education-focused tablets.
- The Survey also suggested content filters and usage limits to protect children from violent, sexual, gambling-related, or addictive online content.
Rationale Behind Regulation:
- Mental health concerns: Excessive social media use can lead to digital addiction, anxiety and depression, sleep disruption, and cyberbullying exposure.
- Exposure to harmful content: Algorithms and AI-generated content have increased the circulation of violent material, sexual content, gambling promotions, and misinformation.
- Child safety as a policy priority: Government sources emphasise that “citizen safety” will be the guiding principle in designing the law.
Concerns and Challenges:
- Operational challenges:
- Tech companies warn about difficulties in implementing restrictions.
- For example, geo-restrictions at the state level are difficult due to nationwide connectivity.
- Different state-level rules (e.g., Karnataka vs Andhra Pradesh) may create compliance challenges.
- Definitional issues: States may define “child” differently. For example, Andhra Pradesh proposes a ban for children below 13, while Karnataka’s applies to those under 16. Such inconsistencies could create regulatory fragmentation.
- Risk of circumvention: Industry representatives warn that bans may push teenagers towards unregulated platforms and logged-out browsing, bypassing safety protections.
- Digital rights and freedom of expression: Digital rights groups argue that blanket bans may violate children’s rights to information, expression, and participation in digital spaces.
- Digital gender divide: The Internet Freedom Foundation (IFF) cautions that restrictions framed as “protection” may reinforce patriarchal control over girls’ internet access, widening the digital gender divide, particularly in rural areas.
Way Forward:
- Balanced regulatory framework: Instead of outright bans, India could adopt proportionate regulations combining age verification systems, content moderation requirements, and platform accountability.
- Strengthening digital literacy: Education systems should integrate digital safety education, awareness about misinformation, cyberbullying, and online risks.
- Parental and institutional involvement: Policies should emphasise parental supervision and school-level digital awareness programmes.
- Platform design reforms: Regulators could mandate child-safe algorithms, reduced use of addictive design features, and stronger privacy and data protection mechanisms.
- Uniform national framework: A central law would ensure consistency across states and reduce compliance complexities.
Conclusion:
- India is moving toward a nuanced regulatory framework for children’s social media usage, prioritising safety while preserving digital access and freedoms.
- A graded, age-based system combined with digital literacy, platform accountability, and parental oversight may provide a balanced solution.
- Such an approach could help mitigate risks like digital addiction, harmful content exposure, and mental health issues, while ensuring that children remain empowered participants in the digital ecosystem.