The Impact of Artificial Intelligence on Children’s Rights in Africa: A Primer

Keketso Kgomosotho

This research was conducted as part of the African Commission’s Draft Study on human and peoples’ rights and artificial intelligence, robotics, and other new and emerging technologies in Africa.

Introduction

Artificial Intelligence (AI) is proving to be a transformative technology driving digital revolutions globally, with impacts that extend profoundly into various aspects of human life and activity, including the lives and human rights of children. This emerging technology, driven by data and algorithms, promises to transform children’s realisation of their human rights. At the same time, AI also raises critical concerns for children’s rights.

The role of AI in performing tasks previously limited to humans significantly alters the societal context for children, impacting their growth, learning, and understanding of the world. In this primer, we explore the relationship between AI and children’s rights. We explore in more detail the impact of AI on children’s rights within the specific African context, through an intersectional lens.

The Significance of Protecting Children in the Context of AI

In the African legal and human rights framework, the rights of the child are accorded significant importance. As established in various judicial precedents, children are recognised as particularly vulnerable members of society, warranting enhanced protection. The South African Constitutional Court in Centre for Child Law v Minister for Justice and Constitutional Development outlined the rationale behind this enhanced protection, noting that children’s inherent physical and psychological immaturity makes them especially susceptible to harm and exploitation, and that their lack of physical strength and decision-making capacity necessitates a heightened level of care and protection compared to adults. Specifically, the Court noted that “Children’s bodies are generally frailer, and their ability to make choices generally more constricted, than those of adults. They are less able to protect themselves, more needful of protection, and less resourceful in self-maintenance than adults.”[1]

The Court added further that “Not only are children less physically and psychologically mature than adults: they are more vulnerable to influence and pressure from others”.[2] This viewpoint is further supported by international and regional legal instruments, which uphold the “best interests of the child” as a paramount principle. The Preamble to the UNCRC confirms that “the child, by reason of his physical and mental immaturity, needs special safeguards and care, including appropriate legal protection, before as well as after birth.”[3] Moreover, in its memorandum on Artificial Intelligence and Child Rights, UNICEF reiterates this viewpoint: the potential impact of AI on children deserves special attention due to children’s heightened vulnerabilities and the significant role AI will play throughout the entirety of their lives.[4]

According to the 2023 report of the UN Secretary-General on the Status of the Convention on the Rights of the Child,[5] children account for an estimated one third of Internet users around the world. Children are becoming digitally connected from a very early age, often having their first experiences with digital technologies before they turn two years old. Their online presence and usage of digital tools and platforms have grown significantly, offering new avenues for them to exercise their rights. This reliance on digital resources was greatly intensified during the COVID-19 pandemic.[6]

In General Comment No. 25 on children’s rights in relation to the digital environment, the Committee on the Rights of the Child confirms the imperative to protect children’s rights in the digital environment. It confirms that existing children’s rights must be respected, protected and fulfilled in the digital environment. This is because innovations in digital technologies such as AI affect children’s lives and their rights in ways that are “wide-ranging and interdependent”, including where children do not themselves access these tools and services.[7]

Legal Protection of Children’s Rights in Africa

Due to the established significance of protecting children’s rights, children enjoy special, widespread and comprehensive protection under the African human rights system. As a start, children’s rights and wellbeing are recognised in the African Charter on Human and Peoples’ Rights,[8] which, at Article 18(3), provides that the State shall ensure the protection of the rights of the child as stipulated in international declarations and conventions. In the context of the current discussion, this means that African States have a legal obligation, premised on Article 18(3) of the African Charter, to protect children’s rights in accordance with international law standards embodied in the African Charter on the Rights and Welfare of the Child (ACRWC),[9] the Convention on the Rights of the Child (UNCRC)[10] and all other international law norms aimed at the protection of children.

The ACRWC is the first regional legal instrument to provide for the comprehensive protection of children and their rights, spanning civil, economic, social, cultural and political rights, with monitoring and enforcement vested in the African Committee of Experts on the Rights and Welfare of the Child.[11]

Outside of the African context, children’s rights already enjoy protection as a matter of customary international law, as well as under international human rights agreements and institutions. The UDHR, ICCPR and ICESCR, although dealing with a range of social, economic, civil and political rights,[12] allocate certain rights to children.[13] Theoretically, it is accepted that the rights in international human rights treaties apply to children as well, within certain contextual and defined limits.[14] This is evidenced first by the guarantee of rights to all persons without discrimination on any basis, including that of age, and secondly by the text of the treaties, which often employs inclusive language such as “everyone,” “all human beings,” “all individuals,” “all citizens,” or “any/all persons” to allocate the various fundamental rights.[15] However, even at the international level, this theoretical presumption lacked the specificity required to respond to children’s special needs and vulnerabilities, and their unique position in society.

In 1990 the UNCRC became the first international treaty to afford specific and comprehensive protection to children, in recognition of children’s special needs and vulnerabilities.[16] Like the ACRWC, the UNCRC sets out comprehensive children’s rights covering the entire spectrum of children’s civil, political, economic, social and cultural rights, and defines universal principles and norms for the status of children under international law.[17]

In addition to being the most widely ratified international convention, the UNCRC is also accepted as customary international law, meaning that all States, regardless of their ratification of the convention, are legally bound by its obligations since its provisions have attained the status of international custom. Relative to the UNCRC, the ACRWC offers a higher and more contextualised level of protection of children’s rights in the African context. It is alive to the specific circumstances in which children in Africa live, shaped by cultural, socio-economic, political and developmental factors. Further, because of the economic realities in many African States, and to avoid delaying the implementation of economic, social and cultural rights, the ACRWC notably does not make the realisation of the rights therein subject to progressive realisation.[18] As of 2022, a total of 61% of African States had enacted legislation aimed specifically at data protection and privacy.[19]

The existing legal framework on children’s rights does not mention AI and does not respond directly to the novel challenges presented by AI and the digital revolution. Even though AI is expanding rapidly,[20] there is currently no international law framework for the governance of AI, or for the protection of children’s rights in the context of AI.[21] This is because AI has historically not been examined under any field of law.[22] Although the scientific study of AI has been underway since 1956, Machine Learning (ML) has only recently become popularised and integrated into aspects of human activity.[23] At its heart, ML is about drawing inferences about hidden patterns, rules, factors and correlations by observing and processing problem-relevant data.[24] This is the technique on which today’s AI systems are premised. Because of the rapid progress witnessed in AI-based technologies in recent years, the existing international framework, developed before the advent of AI in mainstream society, does not explicitly address AI. As a result, the jurisprudence examining the interplay between AI and children’s rights is still in its infancy, especially in the African context, where the development and application of AI is slower relative to developed countries.[25]
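To make the idea concrete, the following is a minimal, hypothetical sketch of the ML technique described above: a model infers a hidden rule purely from example data, rather than being explicitly programmed with that rule. The toy data, feature names and outcome labels are invented for illustration, and the sketch assumes the scikit-learn library is available.

```python
# A toy illustration (hypothetical data): the model is never told the rule;
# it infers the hidden pattern from the examples it observes.
from sklearn.tree import DecisionTreeClassifier

# Problem-relevant data: [hours of study per week, hours of sleep per night]
X = [[1, 4], [2, 5], [3, 8], [6, 7], [8, 6], [9, 8]]
# Observed outcomes for those examples (0 = fail, 1 = pass)
y = [0, 0, 0, 1, 1, 1]

model = DecisionTreeClassifier(random_state=0)
model.fit(X, y)                   # the hidden rule is learned from the data alone

print(model.predict([[7, 7]]))    # the learned correlation is applied to an unseen case
```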

At the same time, there is growing consensus that the existing children’s rights legal framework will find application equally in the context of Ai, and that where context requires further legal protection, the existing framework will form the foundational point of departure from which to develop new forms of legal protection for children. Within the framework of children’s rights, there are specific rights that are particularly pertinent when considering the risks and challenges posed by AI technologies. These rights serve as a crucial starting point for analysing AI’s potential impact on children’s rights and well-being. Key among these rights are rights to privacy, education, autonomy, and protection from algorithmic discrimination.

The current legal discourse is grappling with the intricate challenge of comprehending how AI will impact various facets of children’s rights while also striving to adapt existing norms to the unique context of AI’s risks and opportunities. Notably, the discourse reveals a noticeable gap in research pertaining to the African context, where the impact of AI on children’s rights remains largely uncharted.

The 2022 Resolution of the ACERWC Working Group on Children’s Rights and Business of the African Committee of Experts on the Rights and Welfare of the Child[26] focuses on protecting and promoting children’s rights in the digital sphere in Africa. Premised on various articles of the ACRWC, the resolution emphasises the need to protect children from abuse, exploitation, and privacy infringements in the digital world. It calls for specific actions from State parties, the private sector, and non-governmental organisations aimed at the protection and fulfilment of children’s rights in the digital context in Africa. These child protection measures include legislation on cybersecurity and data protection, the establishment of digital regulatory bodies, and digital industry codes of conduct and terms of service, each reinforcing the obligation on the private sector to protect children.[27]

The Committee on the Rights of the Child developed General Comment No. 25 on children’s rights in relation to the digital environment in order to explain “how States parties should implement the Convention in relation to the digital environment.” Having noted that the existing framework is silent on new risks emanating from the digital environment, the Committee provides guidance on relevant legislative, policy and other measures to ensure full compliance with States’ obligations under the Convention in the light of the risks and challenges in promoting, respecting, protecting and fulfilling all children’s rights in the digital environment.[28] In addition to UNESCO addressing AI’s impact on children’s lives and rights in its 2021 Recommendation on the Ethics of Artificial Intelligence, UNICEF has published policy guidance on promoting, respecting, protecting and fulfilling children’s rights in the context of AI.[29]

In a similar vein, the 2023 report of the UN Secretary-General on the Status of the Convention on the Rights of the Child[30] draws attention to how children’s lives and rights are increasingly mediated through technological tools. In addition to the opportunities for realising children’s rights, he acknowledges the potential harms to children, and draws attention to “the implementation gaps and barriers affecting the realization of children’s rights in the context of the digital environment, including in relation to the legislation … to ensure its safe and empowering use.”[31]

Furthermore, Article 15 of the UNESCO Recommendation accentuates the importance of ensuring that AI systems, including those interfacing with children, operate within a framework that safeguards human dignity and fundamental freedoms. This provision underscores the need to prevent the objectification of individuals and the infringement of their rights, while emphasising the ethical and human rights dimensions inherent in AI interactions. In the African context, the work of translating and adapting the provisions of the ACRWC to the context of AI-related risks and challenges remains an area warranting attention.

Impact of AI on Children’s Rights in Africa

AI systems, through their pervasive integration in various aspects of human activity, are already having a significant influence on the developmental, educational, and social environments of children. This impact on children’s rights is a complex interplay of both negative and positive consequences, encompassing opportunities and challenges alike.[32] Of course, as a neutral tool, AI lacks inherent intentions or ethical predispositions. Its impact on any group hinges on factors such as the specific use case, the context in which it is deployed, the legal and regulatory environment, and the human objectives it has been employed to pursue.[33] Several scholars and bodies have noted the implications of AI for children’s rights.

In his report on the ‘Status of the Convention on the Rights of the Child’, the UN Secretary-General notes that while the digital environment offers previously unattainable potential for the realisation of children’s rights, its accelerated development and integration, at a speed that far exceeds the legislative response, presents challenges and risks for children and their rights.[34]

For instance, AI-driven tools and content personalisation can improve learning experiences and outcomes, and have the potential to make education more widely accessible and customised to individual needs. On the other hand, there are critical concerns about AI’s impact on children’s rights regarding safety, privacy, data security, exploitation of children, and the potential for AI to perpetuate discrimination and existing inequalities. The report highlights that children’s rights are often overlooked in the conceptualisation, design and development phases, and have similarly been largely ignored in the governance of the Internet. An AI-driven digital environment enables new behaviours and ways to perpetrate violence against children and to manipulate children, and risks amplifying children’s exposure to harmful and untrustworthy content. Furthermore, the increasing use of digital tools by children draws attention to child rights issues relating to privacy, non-discrimination, data protection, consent, accountability, and the availability of remedies.[35]

In General Comment No. 25 (2021) on children’s rights in relation to the digital environment, the Committee on the Rights of the Child notes that digital technologies such as AI are gaining increasing importance across various aspects of children’s lives, including in education, as a socio-technical function, in the allocation and distribution of government services, during times of crisis, and generally in commerce and entertainment, to name a few. As noted earlier, the General Comment insists that children’s rights must be respected, protected and fulfilled in the digital environment.[36]

The UNESCO Recommendation on the Ethics of AI notes the long-term impact of AI on children’s lives: AI, because of its nature and characteristics, will play a significant new role in human practices and society, including in our connection to the environment and ecosystems around us. For children, this will create “a new context to grow up in, develop an understanding of the world and themselves, critically understand media and information, and learn to make decisions.” The Recommendation notes that in the long term, “AI systems could challenge humans’ special sense of experience and agency, … raising additional concerns about human self-understanding, social, cultural and environmental interaction, autonomy, agency, worth and dignity.”[37]

There are four key principles that emerge as being of critical importance in the implementation of the children’s rights framework.[38] These are the best interests principle,[39] the principle of non-discrimination, the right to survival and development,[40] and the child’s right to participate in matters concerning his or her well-being.[41] As Michael Gose writes, these principles serve as foundational elements for the Convention’s application and interpretation: they are the “soul of children’s rights treaties”.[42] AI bears an impact on each of these principles.

In terms of privacy, children face a higher risk of having their rights violated due to their vulnerability to privacy breaches. Their limited understanding of the long-term consequences of consenting to data processing and sharing personal information makes them particularly susceptible to intrusions into their privacy.[43] This is further confirmed by the African Committee of Experts on the Rights and Welfare of the Child in its 2023 Day of the African Child Concept Note. The Committee[44] draws attention to online threats to children, including cyberbullying and exposure to harmful content and advice, which can be exacerbated by the use of AI. Moreover, the Committee noted “an information gap” in terms of online harms affecting children in the region, noting that only three States have reported on children’s online experiences, including on the violation of children’s rights online.[45]

In terms of non-discrimination, AI systems are widely reported to produce algorithmic bias in decision-making processes that bear on human rights, including the rights of children. Algorithmic discrimination refers to discriminatory outcomes (or disparate impact) against individuals or groups defined by protected grounds, produced by data analytics in AI algorithmic systems.[46] Algorithmic discrimination can occur in any context where data-analytical systems are applied. In fact, there is overwhelming evidence that AI algorithms produce discriminatory outcomes in nearly all contexts where their use produces a socio-technical impact.[47] Examples of algorithmic bias include LinkedIn’s[48] and Amazon’s[49] recruiting machine learning systems being biased against women, racial discrimination in face recognition technology,[50] and racial discrimination in the distribution of government social services.[51]
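To give the notion of disparate impact a concrete shape, the sketch below shows one common way it is quantified: comparing favourable-outcome rates between a protected group and a reference group (the “four-fifths rule” heuristic). The decision data, group labels and the 0.8 threshold are illustrative assumptions and are not drawn from the cases cited above.

```python
# Hypothetical algorithmic decisions (1 = favourable outcome, 0 = unfavourable),
# split by group; the numbers are invented purely for illustration.

def favourable_rate(outcomes):
    """Share of individuals who received the favourable outcome."""
    return sum(outcomes) / len(outcomes)

reference_group = [1, 1, 1, 0, 1, 1, 0, 1]   # e.g. a historically advantaged group
protected_group = [1, 0, 0, 1, 0, 0, 1, 0]   # e.g. a group defined by a protected ground

ratio = favourable_rate(protected_group) / favourable_rate(reference_group)
print(f"Disparate impact ratio: {ratio:.2f}")

# Under the four-fifths heuristic, a ratio well below 0.8 is often treated as a
# signal of possible algorithmic discrimination warranting closer scrutiny.
```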

All data is a product of human language, experiences, cultural artifacts, human interpretation, selection and decision-making at some stage throughout the data life cycle, whether at collection, categorisation, or presentation.[52] In this way, every data set carries with it the imprints of societal, cultural, and personal biases of those who played a role in its life cycle.

Algorithms are trained on this data, and these human biases enter the algorithmic system dynamically at various stages of the AI pipeline, from design and development to deployment.[53] As a start, the data selection stage can introduce bias where the data set is not representative of the broader population or phenomenon it intends to capture.[54] Where the algorithms require labelled data, the individuals labelling the data bring their own biases and interpretations to bear, which will subsequently be learned by the algorithm. Further, certain design choices or the features chosen to represent the data can introduce bias into the algorithm, for instance where important features are omitted or where irrelevant features are given undue weight.[55] These human biases are further reinforced in the system through feedback loops, where the algorithm’s biased decisions influence the generation of new data on which the algorithm is further trained and updated, thus reinforcing the initial biases.[56] Those interpreting the algorithmic output also bring their pre-existing biases to bear, and if they lack a deep understanding of the algorithm’s limitations, they may draw biased conclusions.[57]
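The feedback-loop mechanism described above can be illustrated with a minimal, hypothetical simulation: a decision rule learned from skewed historical data generates new outcome data, the rule is then updated on its own outputs, and the initial gap between groups persists rather than correcting itself. All rates, group names and the update rule are invented for illustration.

```python
import random

random.seed(0)

# Approval rates the system "learned" from biased historical decisions.
approval_rate = {"group_a": 0.70, "group_b": 0.40}

for _ in range(5):
    for group, rate in approval_rate.items():
        # The system makes 1,000 new decisions for this group at the learned rate...
        decisions = [1 if random.random() < rate else 0 for _ in range(1000)]
        observed = sum(decisions) / len(decisions)
        # ...and is then retrained on its own outputs, nudging the learned rate
        # toward what it just produced (a crude feedback loop).
        approval_rate[group] = 0.5 * rate + 0.5 * observed

print(approval_rate)   # the initial disparity between the groups is preserved, not corrected
```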

Finally, the demographics and backgrounds of those involved in algorithm development teams also play a key role, since a homogenous group might inadvertently overlook biases that would be evident to a more diverse team.

In this way, AI algorithms can reinforce existing inequalities in areas such as education and social services. For instance, if an AI algorithm used in educational settings is biased against certain groups (say, due to a lack of representation in the training data sets), it may disproportionately affect children from those groups, leading to unfair treatment or unequal access to opportunities. Similarly, in social and government services, biased algorithms could result in unequal access to resources or support for children in contexts such as social security, healthcare and the justice system. Moreover, algorithmic discrimination can also influence the content children are exposed to online and in the classroom, potentially reinforcing historic patterns of exclusion and stereotypes, or exposing them to inappropriate material.

Reading the Impact of AI on Children’s Rights in the African Context

In the African context, the impact of AI on children’s rights is intimately intertwined with the region’s unique socio-economic, infrastructural, and cultural challenges. There are notable inequalities and disparities that necessitate an intersectional lens to appreciate the impact of AI within the unique context of the region. There is a significant rural-urban divide in the region, with a notable disparity in internet access between urban and rural areas, and among different socio-economic groups. This divide not only affects children’s ability to benefit from AI-driven educational and developmental tools but also risks widening existing inequalities. Research shows that a significant portion of rural populations in Africa lack access to mobile broadband and network coverage, highlighting a stark urban-rural divide. For instance, in Angola and Malawi, rural communities face barriers to internet access due to affordability and infrastructural issues, resulting in markedly lower internet usage compared to urban areas.[58] In Malawi, only an estimated 9.3% of people in rural areas have internet access, relative to 40.7% of those based in urban settings. A 2020 UNICEF report indicates stark disparities: only about 1% of children in the poorest segments of West and Central Africa have internet access. The report also highlights that in Eastern and Southern Africa, just 13% of children and young people under 25 have home internet access, in sharp contrast to 59% in Eastern Europe and Central Asia, reflecting wider socio-economic inequalities.[59]

While research on the rural-urban digital divide in Africa is limited, existing data reveals significant disparities. In 2021, the ITU found that nearly a third of Africa’s rural population lacked mobile broadband coverage, and around 18% were without any mobile network coverage.[60] Furthermore, the continent exhibited the largest urban-rural internet usage gap globally, with 50% of the urban population using the internet, in sharp contrast to only 15% in rural areas.[61]

There is also a significant gender divide in the region. In education, girls generally show lower enrolment in subjects that develop digital and technological literacy, such as STEM and ICTs. This trend contributes to the gender digital divide, especially in underprivileged areas. In Africa, research shows that women and girls are notably less likely to possess smartphones or have internet access compared to men. Limited data specific to Southern Africa still reveals consistent patterns of gender disparity in ICT access and usage across countries like South Africa, Tanzania, Mozambique, and Lesotho. The Committee on the Rights of the Child, in General Comment No. 25, highlights that while access to digital technology can empower children in realising their rights, the lack of digital inclusion risks exacerbating existing inequalities and creating new ones.[62] For girl children in Africa, this means facing heightened barriers to realising their rights and potential. With limited access to digital tools and technologies, girls may struggle to participate in the digital economy and benefit from educational opportunities that foster technological skills. This disparity not only hinders their personal and professional development but also further entrenches historic gender inequalities in African societies.

The digital divide then reflects these larger socio-economic disparities, including those based on wealth, gender, urban-rural locations, and educational levels. The African Committee of Experts on the Rights and Welfare of the Child (ACERWC) highlights how this digital gap hinders children’s rights, including access to education, freedom of expression, and association, as well as the right to play.

Moreover, there is the challenge of low literacy and digital literacy, particularly prevalent in rural and underserved communities, which hinders children’s ability to engage with AI technologies equitably. Low levels of internet penetration, especially among poorer children, further exacerbate these challenges. Additionally, due to the economic realities in most African States, there is a notable lack of investment in AI and technology infrastructure, which accounts for the slower progression of the AI revolution in the region. The immediate priority for many African countries remains addressing basic needs rather than advancing in technology. This context necessitates a nuanced approach to implementing AI in the region, in ways that are contextual and alive to the intersecting socio-economic realities on the ground.

Conclusion

The impact of Artificial Intelligence on children’s rights in Africa is proving to be complex and multifaceted. For the region, AI presents both opportunities for enhancing children’s realisation of their rights, including in access to education and information, and unprecedented risks related to privacy, discrimination, exploitation, data protection, and exposure to harmful content. The challenges are amplified by realities on the ground: the existing digital divide, influenced by socio-economic, infrastructural, and cultural factors, including gender disparities and rural-urban gaps.

What is clear is that the increasing role of AI in children’s lives demands a comprehensive and child rights-based approach to AI. Further, there is a need for targeted interventions to bridge existing socio-economic divides in the region, with an emphasis on inclusive policies and investments that target the specific needs of underprivileged and marginalised children.

References

[1] ‘Centre for Child Law v Minister for Justice and Constitutional Development and Others (CCT98/08) [2009] ZACC 18; 2009 (2) SACR 477 (CC) ; 2009 (6) SA 632 (CC) ; 2009 (11) BCLR 1105 (CC) (15 July 2009) para 25’ <https://www.saflii.org/za/cases/ZACC/2009/18.html> accessed 7 January 2024.

[2] ibid, at para 27.

[3] ‘Convention on the Rights of the Child’ (OHCHR) <https://www.ohchr.org/en/instruments-mechanisms/instruments/convention-rights-child> accessed 10 January 2024.

[4] ‘Memorandum on Artificial Intelligence and Child Rights | UNICEF Office of Innovation’ (21 May 2019) <https://www.unicef.org/innovation/reports/memoAIchildrights> accessed 8 January 2024.

[5] ‘Status of the Convention on the Rights of the Child – Report of the Secretary-General (A/78/366) [EN/AR/RU/ZH] – World | ReliefWeb’ (17 October 2023) <https://reliefweb.int/report/world/status-convention-rights-child-report-secretary-general-a78366-enarruzh> accessed 10 January 2024.

[6] ibid.

[7] ‘General Comment No. 25 (2021) on Children’s Rights in Relation to the Digital Environment | OHCHR’ <https://www.ohchr.org/en/documents/general-comments-and-recommendations/general-comment-no-25-2021-childrens-rights-relation> accessed 8 January 2024, at para 4.

[8] Article 18 of the African Charter recognises that “[t]he State shall ensure the elimination of every discrimination against women and also ensure the protection of the rights of women and the child as stipulated in international declarations and conventions.”

[9] ‘African Charter on the Rights and Welfare of the Child | African Union’ <https://au.int/en/treaties/african-charter-rights-and-welfare-child> accessed 10 January 2024.

[10] ‘Convention on the Rights of the Child’ (n 6).

[11] The OAU is the first regional organisation to adopt a binding regional legal instrument targeting children’s rights specifically.

[12] Such as the ICCPR, ICESCR, UDHR.

[13] For example, see Article 24 of the ICCPR, Article 10 of the ICESCR, Article 25 of UDHR.

[14] Practice shows that children are often denied these internationally protected rights.

[15] Moreover, it is accepted that human rights vest in all human beings, because of our common humanity, which entitles children, too, to human rights without discrimination.

[16] Amanda Lloyd, A theoretical analysis of the reality of children’s rights in Africa: An introduction to the African Charter on the Rights and Welfare of the Child in C Heyns (ed) Human Rights law in Africa 1997 (1999) 38.

[17] Swace Digital, ‘African Charter on the Rights of the Child’ (Save the Children’s Resource Centre) <https://resourcecentre.savethechildren.net/document/african-charter-rights-child/> accessed 8 January 2024; ibid.

[18] This is contrary to Article 4 of the UNCRC, which provides that ‘[s]tates shall take implementation measures “to the maximum extent of their available resources.”

[19] Digital Rights Landscape in SADC Report, Centre for Human Rights, University of Pretoria, 2022, https://www.chr.up.ac.za/images/researchunits/dgdr/documents/reports/Digital_Rights_Landscape_in_SADC_Report.pdf

[20] ‘Google Scholar Reveals Its Most Influential Papers for 2021’ (Nature Index, 24 August 2021) <https://www.nature.com/nature-index/news/google-scholar-reveals-most-influential-papers-research-citations-twenty-twenty-one> accessed 6 August 2023; Nestor Maslej and others, ‘The AI Index 2023 Annual Report’ (AI Index Steering Committee, Institute for Human-Centered AI, Stanford University 2023) <https://aiindex.stanford.edu/wp-content/uploads/2023/04/HAI_AI-Index-Report_2023.pdf>.

[21] AI is only governed by international law to the extent that the principles and drafting language happen to be broad enough to respond to the challenges posed by AI. There are no special rules designed with an AI context in mind.

[22] Matthijs Maas, ‘International Law Does Not Compute: Artificial Intelligence and the Development, Displacement or Destruction of the Global Legal Order’ (2019) 20 29; Gabbrielle M Johnson, ‘Algorithmic Bias: On the Implicit Biases of Social Technology’ (2020) 198 Synthese 9941.

[23] ML refers to the “subfield of AI that studies the ability to improve performance based on experience.” See Stuart Russell and Peter Norvig, ‘Artificial Intelligence: A Modern Approach’ (2021).

[24] Gizem Halis Kasap, ‘Can Artificial Intelligence (“AI”) Replace Human Arbitrators? Technological Concerns and Legal Implications’ (2021) 2021 J Disp Resol 209; David Lehr and Paul Ohm, ‘Playing with the Data: What Legal Scholars Should Learn About Machine Learning’ (2017) 51 UC Davis L Rev 653, 671, wherein they define ML as “an automated process of discovering correlations, relationships or patterns between variables in a dataset, often to make predictions or estimates of some outcome.” See also David Danks, ‘Learning’, in The Cambridge Handbook of Artificial Intelligence 151, 157, where Danks clarifies that the value of ML lies in the way that the output can be used for future tasks, such as prediction, planning, classification, recognition, language, etc.

[25] For instance, the pace of AI development is much faster in the US, China, the EU, Russia and other developed economies.

[26] ACERWC Working Group on Children’s Rights and Business of the African Committee of Experts on the Rights and Welfare of the Child, Resolution No. 17/2022, 2022, available at https://www.acerwc.africa/sites/default/files/2022-10/Resolution%20No%2017%202022%20of%20the%20Working%20Group%20on%20Children%27s%20Rights%20and%20Business_0.pdf

[27] Ibid.

[28] ‘General Comment No. 25 (2021) on Children’s Rights in Relation to the Digital Environment | OHCHR’ (n 10).

[29] UNICEF, Policy guidance on AI for children 2.0, November 2021, available at ‘Recommendation on the Ethics of Artificial Intelligence | UNESCO’ <https://www.unesco.org/en/legal-affairs/recommendation-ethics-artificial-intelligence> accessed 6 August 2023.

[30] ‘Status of the Convention on the Rights of the Child – Report of the Secretary-General (A/78/366) [EN/AR/RU/ZH] – World | ReliefWeb’ (n 9).

[31] Ibid, para 3.

[32] ‘Status of the Convention on the Rights of the Child – Report of the Secretary-General (A/78/366) [EN/AR/RU/ZH] – World | ReliefWeb’ (n 9).

[33] Johnson (n 29).

[34] ‘Status of the Convention on the Rights of the Child – Report of the Secretary-General (A/78/366) [EN/AR/RU/ZH] – World | ReliefWeb’ (n 9).

[35] ibid.

[36] ‘General Comment No. 25 (2021) on Children’s Rights in Relation to the Digital Environment | OHCHR’ <https://www.ohchr.org/en/documents/general-comments-and-recommendations/general-comment-no-25-2021-childrens-rights-relation> accessed 6 January 2024.

[37] ‘Recommendation on the Ethics of Artificial Intelligence | UNESCO’ (n 3).

[38] General guidelines regarding the form and contents of initial reports to be submitted by States parties under article 44, paragraph 1 (a), of the Convention on the Rights of the Child, adopted by the Committee on the Rights of the Child at its first session in October 1991, Official Records of the General Assembly, Forty-seventh Session, Supplement No. 41 (A/47/41), annex III. [Available at http://www.unhchr.ch/html/menu6/2/fs10.htm]

[39] Article 3 of the UNCRC.

[40] Article 6 UNCRC.

[41] Article 12 UNCRC.

[42] Michael Gose, ‘The African Charter on the Rights and Welfare of the Child’. https://dullahomarinstitute.org.za/childrens-rights/archives/Publications/Other%20publications/The%20African%20Charter%20on%20the%20Rights%20and%20Welfare%20of%20the%20Child.pdf

[43] ‘Industry Toolkit: Children’s Online Privacy & Freedom of Expression by UNICEF USA – Issuu’ (4 May 2018) <https://issuu.com/unicefusa/docs/unicef_toolkit_privacy_expression> accessed 10 January 2024.

[44] African Committee of Experts on the Rights and Welfare of the Child, ‘DAC Concept Note 2023’ (2023) https://www.acerwc.africa/sites/default/files/2023-02/DAC%20CONCEPT%20NOTE%202023_EN.pdf

[45] African Committee of Experts on the Rights and Welfare of the Child, ‘DAC Concept Note 2023’ (2023) https://www.acerwc.africa/sites/default/files/2023-02/DAC%20CONCEPT%20NOTE%202023_EN.pdf

[46] John W Patty and Elizabeth Maggie Penn, ‘Algorithmic Fairness and Statistical Discrimination’ (2023) 18(1) Philosophy Compass.

[47] Robin Staab and others, ‘Beyond Memorization: Violating Privacy Via Inference with Large Language Models’ (arXiv, 11 October 2023) <http://arxiv.org/abs/2310.07298> accessed 18 October 2023; Köchling and Wehner (n 16); Sara Khor and others, ‘Racial and Ethnic Bias in Risk Prediction Models for Colorectal Cancer Recurrence When Race and Ethnicity Are Omitted as Predictors’ (2023) 6 JAMA Network Open e2318495; Prince and Schwarcz (n 26); Anya ER Prince, ‘Insurance Risk Classification in an Era of Genomics: Is a Rational Discrimination Policy Rational?’ (2017) 96 Nebraska law review 624; Johnson (n 26); Gordon (n 17); R Stuart Geiger and others, ‘“Garbage in, Garbage out” Revisited: What Do Machine Learning Application Papers Report about Human-Labeled Training Data?’ (2021) 2 Quantitative Science Studies 795; Will Knight, ‘AI Chatbots Can Guess Your Personal Information From What You Type’ Wired <https://www.wired.com/story/ai-chatbots-can-guess-your-personal-information/> accessed 18 October 2023.

[48] ‘LinkedIn’s Job-Matching AI Was Biased. The Company’s Solution? More AI.’ (MIT Technology Review) <https://www.technologyreview.com/2021/06/23/1026825/linkedin-ai-bias-ziprecruiter-monster-artificial-intelligence/>.

[49] ‘Why Amazon’s Automated Hiring Tool Discriminated Against Women | ACLU’ (American Civil Liberties Union, 12 October 2018) <https://www.aclu.org/news/womens-rights/why-amazons-automated-hiring-tool-discriminated-against> accessed 19 October 2023.

[50] Raji and Buolamwini (n 26).

[51] ‘Racial Bias in Health Care Artificial Intelligence’ (NIHCM) <https://nihcm.org/publications/artificial-intelligences-racial-bias-in-health-care> accessed 19 October 2023.

[52] John Bower, ‘The Nature of Data and Their Collection’ (2013).

[53] Datta and others (n 21).

[54] Bower (n 101).

[55] Borja Seijo-Pardo and others, ‘Biases in Feature Selection with Missing Data’ (2019) 342 Neurocomputing 97.

[56] Nicolò Pagan and others, ‘A Classification of Feedback Loops and Their Relation to Biases in Automated Decision-Making Systems’ (arXiv, 10 May 2023) <http://arxiv.org/abs/2305.06055> accessed 19 October 2023.

[57] Ninareh Mehrabi and others, ‘A Survey on Bias and Fairness in Machine Learning’ (2021) 54 ACM Computing Surveys 115:1; Samuel (n 88).

[58] Digital Rights Landscape in SADC Report, Centre for Human Rights, University of Pretoria, 2022, https://www.chr.up.ac.za/images/researchunits/dgdr/documents/reports/Digital_Rights_Landscape_in_SADC_Report.pdf; Freedom House, ‘Freedom on the Net 2021: Angola’, 2021.

[59] UNICEF & ITU, ‘How many children and young people have internet access at home? Estimating digital connectivity during the COVID-19 pandemic’ (2020); atzatzev, ‘How Many Children and Young People Have Internet Access at Home?’ (UNICEF DATA, 30 November 2020) <https://data.unicef.org/resources/children-and-young-people-internet-access-at-home-during-covid19/> accessed 10 January 2024.

[60] ITU, ‘Measuring digital development: Facts and figures 2021’ (2021) 12; Digital Rights Landscape in SADC Report, Centre for Human Rights, University of Pretoria, 2022, https://www.chr.up.ac.za/images/researchunits/dgdr/documents/reports/Digital_Rights_Landscape_in_SADC_Report.pdf

[61] ITU, ‘Measuring digital development: Facts and figures 2021’ (2021) 12; Digital Rights Landscape in SADC Report, Centre for Human Rights, University of Pretoria, 2022, https://www.chr.up.ac.za/images/researchunits/dgdr/documents/reports/Digital_Rights_Landscape_in_SADC_Report.pdf

[62] Digital Rights Landscape in SADC Report, Centre for Human Rights, University of Pretoria, 2022, https://www.chr.up.ac.za/images/researchunits/dgdr/documents/reports/Digital_Rights_Landscape_in_SADC_Report.pdf