After more than three years at UNICEF, it is time to reflect on achievements and learnings and write a “brag pack” (this looking back is a tradition of mine; see my previous reviews).
I’m the Digital Policy Specialist for UNICEF, based in New York in the Office of Global Insight and Policy (OGIP). The Office serves as an internal think-tank, investigating issues with implications for children, equipping the Organization to more effectively shape global discourse, and preparing it for the future by scanning the horizon for frontier issues and ways of working.
I have tried to do two things since joining UNICEF: focus on key emerging digital issues for children, such as AI, digital literacy, and mis/disinformation, and position the Organization as a thought leader on digital issues for children. Below are some highlights:
Project leadership and innovation on emerging digital issues
AI for children
While AI is a hot topic, not enough attention is paid to how it affects children in policies and systems (see the report, which I co-authored, reviewing how little national AI strategies say about children). I thus helped set up and lead the AI for Children Policy Project, a 2-year initiative in partnership with the Ministry of Foreign Affairs (MFA), Finland, that aims to see more child-centred AI systems and policies in the world. Working with a stellar team (Melanie Penagos and consultants Prof Virginia Dignum, Dr Klara Pigmans and Eleonore Pauwels, and under the guidance of Jasmina Byrne and Laurence Chandy), I:
- Developed the work plan for the project, raised the funds for it (largest external funding for OGIP) and manage the partnership with the MFA.
- Co-authored the Policy Guidance on AI for Children (a world first).
- Pioneered a user-centred design approach to policy development within the UN: first we held consultations with experts around the world to inform and ground the guidance; then we released an official draft version, held public consultations on it and (here’s the interesting bit) invited governments and companies to pilot it, acknowledging that we don’t have all the answers in moving from AI policy to practice. From the field learnings we wrote eight case studies about what works and what doesn’t, which informed version 2.0 (non-draft) of the policy guidance, released a year later.
- Oversaw the first UN global consultation with children on AI, led by rock star colleague Kate Pawelczyk, to inform the development of the guidance. Adolescent perspectives on AI documents the findings from nine workshops with 245 children in five countries. A major contribution here is the workshop methodology for consulting children on AI.
- Helped to grow and manage an external advisory group for the AI project, including the World Economic Forum, Berkman Klein Center for Internet & Society (Harvard University), IEEE Standards Association, PwC UK and Cetic.br.
- Hosted the world’s first Global Forum on AI for Children with 450 participants to raise awareness of children and AI and help plot a better AI future.
Achievements: the Government of Scotland has officially adopted the draft policy guidance in its national AI strategy. The policy guidance was shortlisted as a promising responsible AI initiative by the Global Partnership on AI and the Future Society, was nominated for a Harvard Kennedy School Tech Spotlight recognition, and is our Office’s most popular download.
Digital literacy for children
While many excellent digital literacy initiatives were under way at UNICEF, the efforts were often ad hoc and not situated within a coherent framework for the Organization. Working with Dr Fabio Nascimbeni and colleagues, we mapped the current digital literacy policy and practice landscape; highlighted existing competence frameworks and how they could be adapted to UNICEF’s needs; surveyed the needs and efforts of UNICEF country offices (a first across the Organization); and offered policy and programme recommendations, including a new definition of digital literacy for UNICEF. Our resulting paper tells all.
Digital mis/disinformation and children
As with AI, mis/disinformation is a current and crucially important topic — but the discourse offers little insight into how children are affected. In navigating the digital world, with their cognitive capacities still in development, children are particularly vulnerable to the risks of mis/disinformation. At the same time, they are capable of playing a role in actively countering its flow and in mitigating its adverse effects through online fact-checking and myth-busting. Working with Prof Philip N. Howard, Lisa-Maria Neudert and Nayana Prakash of the Oxford Internet Institute, we authored a report (and a companion 10 Things you need to know) that go beyond simply trying to understand the phenomenon of false and misleading information to explain how policymakers, civil society, tech companies, and parents and caregivers can act to support children as they grow up in a digital world rife with mis/disinformation.
Helping to share knowledge and steer discourse on key issues:
- I’m a member of the World Economic Forum’s Global Future Council on Artificial Intelligence for Humanity (invitation only). I contributed to the White Paper: A Holistic Guide to Approaching AI Fairness Education in Organizations and the AI Fairness Global Library.
- I’m an advisory group member of UNESCO’s Global Declaration on Connectivity for Education.
- Delivered one of the keynotes at the Beijing AI Conference on Why we need child-centred AI and how we can achieve it.
- Contributed to UNICEF’s Prospects for children: A global outlook 2021-2025.
- Co-authored internal / public intelligence briefs for the Office of the Executive Director on cyber attacks, online hate speech, and COVID-19 and children’s digital privacy.
- I’m a working group member of UNICEF’s initiative for Good Governance of Children’s Data.
What’s my big idea?
Digital is only a force for good when it serves all of humanity’s interests, not just those of a privileged few. Meaningful technology use must be for everyone, provide opportunities for development and livelihoods, and support well-being. Technology cannot be only for those who can control and afford it; it should not constrain opportunity or undermine well-being.
These are not new ideas, but what I have come to believe is that the best way to achieve meaningful digital inclusion is to focus on children and youth. A digital world that works for children works best for everyone. Children under 18 make up one-third of all internet users, and youth (here, 15-24 year olds) are the most online age cohort (globally, 71% use the internet, compared with 57% of the other age groups). And yet, despite being significant user groups, they are the unseen teens. Digital platforms are not sufficiently designed or regulated with or for them.
A focus on children and youth will force platform creators and digital regulators to be more conscious of a range of different user needs, not just privilege the adult experience. It will help them take online child protection more seriously, reduce digital surveillance of children, and think creatively and co-operatively about digital experiences that support children’s well-being. It does not mean “dumbing down” the internet to the lowest common denominator (not every part of the internet is appropriate for children), but rather holding inclusion, protection and empowerment for all as guiding principles.
So far it has been an incredible journey at UNICEF: stimulating, challenging and rewarding, working with amazing people on issues that really affect children. I look forward to continuing to do pioneering and relevant work in the coming years.