UNICEF’s Children’s Data Manifesto: Gaps to Fill in Children’s Privacy Laws
UNICEF published “The Case for Better Governance of Children’s Data: A Manifesto.” The report details how children, despite being some of the most significant technology users, are among the least protected when it comes to data privacy. From the moment an ultrasound captures their first heartbeat to their adulthood, data is generated about children. Yet despite this constant exposure to and connection with the digital world, current legal frameworks lag behind in adequately protecting children’s data privacy.
Key players in data privacy
The manifesto begins by highlighting the key players when it comes to children and their data. The first is governments, which are responsible for establishing the agencies and policies that limit and dictate what data can be collected and how it can be used. Next are private sector companies, which collect data and decide how it is used within the constraints of the law. Then come data brokers: third-party companies that often operate behind the scenes, collecting data points (age, gender, interests, location, etc.) to build user profiles for targeted advertising and other purposes. Next are parents and guardians, who create and share children’s data, often acting as a proxy and providing consent on behalf of a child. Finally, the most important players are children themselves. Children constantly create and share data about themselves and their peers on social media and other sites, often without any idea that they are doing so. These actors do not operate independently of one another; just as data flows between servers, it flows between the parties handling it. While no single player fully controls how data is shared, all of them are responsible for protecting children’s data.
Why children are more vulnerable
One of the biggest pitfalls in legislating for children is the failure to acknowledge that children’s brains work differently from adults’. This is true for all areas of law but is especially pronounced in data privacy. Children have a different level of understanding of what data is collected online and for what purpose. While a child may understand that posting something online shares their thoughts, they may be unaware of the metadata they are sharing along with it. Also, because children’s cognitive capacity is still developing, they have a harder time discerning truth from misinformation. Because of this susceptibility, companies can not only push children towards certain products but also influence a child’s behavior and beliefs. Bad actors have used the same tactics to push children towards extremist groups and to spread misinformation. Tracking children and using their data to influence them undermines their freedom and agency over their own decision-making.
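To make the metadata point concrete, here is a minimal, hypothetical Python sketch (assuming the third-party Pillow imaging library and a placeholder file name) showing how much information can ride along inside a single photo a child uploads: camera details, timestamps, and even GPS coordinates.

```python
# Illustrative sketch only (assumes the Pillow library and a placeholder file name):
# a photo shared online can carry far more than the image itself.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def extract_metadata(photo_path):
    """Return the EXIF tags embedded in an image, including any GPS data."""
    image = Image.open(photo_path)
    exif = image.getexif()

    # Top-level tags: camera make/model, capture timestamp, software, etc.
    metadata = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

    # GPS information lives in its own sub-directory (IFD 0x8825).
    gps_ifd = exif.get_ifd(0x8825)
    if gps_ifd:
        metadata["GPSInfo"] = {GPSTAGS.get(tag_id, tag_id): value
                               for tag_id, value in gps_ifd.items()}
    return metadata

if __name__ == "__main__":
    # "photo.jpg" is a hypothetical file standing in for a child's uploaded picture.
    for tag, value in extract_metadata("photo.jpg").items():
        print(tag, value)
```

Many platforms strip such metadata before displaying an image, but the original file, metadata and all, has typically already been transmitted to the platform by then.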
The problem with EdTech
The vulnerability of children’s data has grown immensely during the COVID-19 pandemic as schools switched to remote learning. UNICEF highlighted three areas where educational technology (EdTech) is lacking. The first is a lack of understanding by children, parents, and educators of how data flows from one source to another and whether vendors and third parties use it for exploitative purposes such as targeted advertising. UNICEF also criticized how eagerly schools adopted EdTech programs: vendors offered these programs at little to no cost, and the enthusiasm to adopt remote learning may have led to lower privacy standards. Finally, UNICEF critiqued the lack of evaluation that went into these programs. There was no universal review, leaving many schools to assess the safety and privacy of the programs themselves. Schools bought and deployed remote education programs without knowing what risks they exposed students and educators to.
The gaps in legal frameworks
UNICEF argues that countries and territories are thinking about data privacy legislation the wrong way. Many countries create data privacy laws to regulate the relationship between an individual, a business that collects data, and a third party that processes that data. While laws regulating these transactions are important, they ignore the most significant contributor to the data economy, which is not any one individual’s data but data about groups. Focusing solely on individual data rights ignores the risks that groups face, especially a group such as children. UNICEF highlights an example in which a breach could reveal the location where a group of children gathers, such as a school or bus stop, making them a possible target for an attack.
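As a purely hypothetical illustration of that group-level risk, the short Python sketch below (with invented coordinates and identifiers) shows how individually unremarkable location pings, once breached together, can be aggregated to reveal a spot where many different children regularly appear.

```python
# Hypothetical illustration with invented data: no single record names a school,
# yet aggregating leaked location pings exposes where a group of children gathers.
from collections import defaultdict

# (child_id, latitude, longitude) pings, e.g. from a breached app's logs.
pings = [
    ("child_01", 39.2901, -76.6121),
    ("child_02", 39.2902, -76.6122),
    ("child_03", 39.2903, -76.6123),
    ("child_04", 39.2904, -76.6118),
    ("child_05", 40.7128, -74.0060),  # an unrelated ping elsewhere
]

def hotspots(pings, precision=3, min_children=3):
    """Bucket pings into roughly 100 m grid cells (3 decimal places of
    latitude/longitude) and flag cells visited by many distinct children."""
    cells = defaultdict(set)
    for child_id, lat, lon in pings:
        cells[(round(lat, precision), round(lon, precision))].add(child_id)
    return [(cell, len(kids)) for cell, kids in cells.items() if len(kids) >= min_children]

print(hotspots(pings))
# -> one grid cell shared by four distinct children, which strongly suggests a
#    school, bus stop, or playground, even though no individual record said so.
```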
Additionally, current legal frameworks have a problematic relationship with consent. Consent has been the legal basis for countries and companies to collect data from children, whether that consent comes from the parent or the child themselves, and it is often the only requirement needed to collect children’s data. The GDPR, considered the strictest and strongest data privacy regulation, sets the age at which a child can consent on their own behalf at 16 by default (member states may lower it to as low as 13). In contrast, America’s Children’s Online Privacy Protection Act (COPPA) requires parental consent only for children under 13. Also, many laws treat children as a homogeneous group, but that is not the case, especially at an international level. Children worldwide come from communities with different levels of digital literacy, language skills, and internet access; age alone therefore does not truly represent an individual’s capacity to understand what data they are sharing. In other words, a 13-year-old who grew up with the internet will have a different understanding of data privacy than a 13-year-old who has never had access to a computer, yet the two are governed the same way. Without any education or training, a child will not suddenly comprehend data privacy when they turn 13 or 16, yet that is when they become responsible for handling their own data and must take matters into their own hands to opt out of data collection. There needs to be a shift in thinking from treating data privacy as a personal responsibility to placing that responsibility on companies and governments.
The “Ten Commandments” for international standards
UNICEF highlighted these issues to demonstrate the need for international standards. Laws and territories differ dramatically in how they define a child, and some countries do not have a single law in place to protect children’s data. All the issues highlighted above become even more complicated when applied internationally. Take language barriers as an example: once schools in Brazil switched to remote learning, they relied on G Suite, an educational platform, yet the privacy notice and explanatory videos were only available in English, making them inaccessible to most users in Brazil. Issues such as this highlight the many challenges children around the world face without international standards. For this reason, among many others, UNICEF developed ten steps we must take to secure children’s data and children’s rights.
1. Protect children and their rights through child-centered data governance.
2. Prioritize children’s best interests in all decisions about children’s data.
3. Consider children’s unique identities, evolving capacities, and circumstances in data governance frameworks.
4. Shift responsibility for data protection from children to companies and governments.
5. Collaborate with children and their communities in policy building and management of their data.
6. Represent children’s interests within administrative and judicial processes, as well as redress mechanisms.
7. Provide adequate resources to implement child-inclusive data governance frameworks.
8. Use policy innovation in data governance to solve complex problems and accelerate results for children.
9. Bridge knowledge gaps in the realm of data governance for children.
10. Strengthen international collaboration for children’s data governance and promote knowledge and policy transfer among countries.
You can access UNICEF’s “The Case for Better Governance of Children’s Data: A Manifesto” here.
About Ardent Privacy
Ardent Privacy is an "Enterprise Data Privacy Technology" solutions provider based in the Maryland/DC region of the United States and Pune, India. Ardent harnesses the power of AI to enable companies with data discovery and automated compliance with DPB (India), RBI Security Guidelines, GDPR (EU), CCPA/CPRA (California), and other global regulations by taking a data-driven approach. Ardent Privacy's solution utilizes machine learning and artificial intelligence to identify, inventory, map, minimize, and securely delete data in enterprises to reduce legal and financial liability.
For more information, visit https://ardentprivacy.ai/ and for more resources here.
Ardent Privacy articles should not be considered legal advice on data privacy regulations or any other specific facts or circumstances.