Measures to support development of AI education tools in the United Kingdom and Singapore
ISE11/2025
Subject: Education, artificial intelligence, AI, generative AI, education technology
- The use of artificial intelligence ("AI") in education has become increasingly prevalent, given its potential to enhance teaching and learning. In response, some jurisdictions have updated school curricula with AI-related content and strengthened teachers' AI competence. On the back of growing demand from schools, the market has also seen a proliferation of new AI-powered education technology ("EdTech") products.1 Yet, concerns remain regarding the technical and ethical risks associated with AI. Risks such as misinformation and biased data may undermine educators' confidence in employing EdTech. These concerns have been shared by stakeholders, including some Legislative Council ("LegCo") Members.2
- To address these risks, some jurisdictions, such as the United Kingdom ("UK") and Singapore, have issued guidance on the ethical and responsible use of AI in school settings, and taken extra measures to ensure AI tools provided for education are safe and reliable. A notable measure in the UK is the establishment of a dedicated content store for training AI tools for school use, whereas in Singapore, core AI tools for schools are developed directly by the government and centrally provided through a single platform. This issue of Essentials briefly discusses (a) the use of AI in education, (b) the current policy on the use of AI in Hong Kong's primary and secondary schools, and (c) initiatives pursued by the UK and Singapore to promote safe and reliable use of AI in schools.
Use of AI in education
- AI enables machines to perform tasks that normally require human intelligence, such as recognizing patterns, making decisions, or generating content. It has long been used to make predictions or conduct categorizations using pre-programmed rules and mathematical models. In recent years, generative AI tools (e.g. ChatGPT) capable of producing content such as text, images or videos based on the patterns and structures learnt from training data have also seen rapid growth. Given their versatility, AI tools are considered promising for teaching and learning. For students, AI can enable personalized learning that suits learners of varying competence and supports self-learning through automated feedback. For teachers, AI can help create customized teaching plans and materials, and facilitate automated marking or assessment. AI can also reduce the administrative burden on teachers, sparing them more time to focus on teaching tasks.3 It is projected that the global AI-powered education market will grow from US$5.2 billion (HK$41 billion) in 2024 to US$112 billion (HK$874 billion) by 2034.4
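As a rough indication derived from the above figures (this derivation is not stated in the cited source), the projection implies a compound annual growth rate of about 36% over the decade:

    (112 / 5.2)^(1/10) − 1 ≈ 21.5^(0.1) − 1 ≈ 0.36, i.e. roughly 36% per year.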
Hong Kong's policy on use of AI tools in schools
- In line with the global trend, the use of AI in local schools is increasingly common. A survey by the Hong Kong Federation of Education Workers ("HKFEW") released in May 2025 found that nearly 80% of teachers had used AI to assist teaching.5 On the choice of AI tools, most teachers adopted mainstream large language models such as ChatGPT, Copilot and DeepSeek. While recognizing the opportunities, the teachers polled were also concerned about the potential risks of AI, citing misinformation, privacy and data protection risks as key barriers to adopting AI tools.
- To mitigate these risks, the Digital Policy Office ("DPO") issued in April 2025 the Generative Artificial Intelligence Technical and Application Guideline, which serves as a general guide for reference by various sectors.6 For the education sector specifically, the guideline recommends that (a) students seek prior approval before using generative AI in their coursework, and (b) teachers ensure that generated content is truthful, accurate and consistent, and that, where AI is used for grading, final results are always reviewed by human educators. On addressing privacy and data protection risks, the Privacy Commissioner for Personal Data issued in August 2021 the Guidance on the Ethical Development and Use of Artificial Intelligence. DPO also issued in July 2025 the Ethical Artificial Intelligence Framework, which, however, applies only to the use of AI by government departments.
- On the wider policy landscape in education, the Government has been keen to promote the use of AI in schools and AI literacy among students. To enhance AI literacy, upper primary and lower secondary school students have, since the 2023-2024 school year, received enrichment courses on coding and AI. In the 2025 Policy Address, the Chief Executive indicated clear support for widening the application of generative AI in schools and rolling out more AI tools for them, with a commitment to (a) develop an "AI literacy" learning framework, (b) incorporate AI education into the core curriculum, and (c) enhance AI training for teachers.7
- Currently, the Education Bureau ("EB") does not directly develop or provide AI tools to schools. The Hong Kong Education City, an EB-funded corporation, operates an online platform that helps promote EdTech in schools and serves as a hub of e-resources. Riding on the AI trend, the platform has begun to provide AI- and big data-driven tools under the e-Learning Ancillary Facilities Programme ("eLAFP").8 Launched by EB in October 2021 with HK$500 million in funding from the Quality Education Fund, eLAFP has so far supported the production of 22 tools (at least six featuring AI) through collaboration among schools, universities and/or professional bodies. Three of the tools have been rolled out in the 2025-2026 school year, and the rest will be made available on the platform for school subscription in phases by the end of 2026.
- While AI tools are being introduced under eLAFP, their supply and development have largely been left to the private market. Some stakeholders, including LegCo Members, have expressed concerns over possible cultural bias stemming from AI models predominantly trained on Western data, as well as data security risks associated with the use of overseas cloud technology.9 Separately, some local schools that are early movers in the use of AI tools have reported challenges in ensuring the accuracy and relevance of generated information due to limitations in training data.10 In another HKFEW survey released in September 2025, a majority of the teachers polled hoped to receive support from the Government, including the development of (a) AI tools aligned with Hong Kong's education landscape, and (b) new platforms where AI tools could be shared among schools.11
UK's approach to the supply of AI tools for schools
- In the UK, use of AI in schools is a high priority, as it constitutes part of the national AI strategy unveiled in September 2021.12 To support schools in adopting AI, the UK Department for Education ("DfE") published a policy paper titled "Generative artificial intelligence in education" in November 2023.13 The document, updated in August 2025, sets out the principles for safe, responsible and effective use of AI in education settings, with a view to mitigating risks from the user's perspective. For example, use of student homework or assessment data to train AI models should be prohibited unless explicit permission is obtained.
- Apart from providing user guidance, the UK government has also actively supported the development of safe AI tools for school use. On the one hand, it has developed its own AI tools for teachers, such as a lesson-assistant to customize teaching plans, through Oak National Academy, a DfE-funded public body, for better quality control.14 On the other hand, it has also proactively addressed potential issues with privately developed AI tools through the following measures:
(a) Setting clear safety expectations for AI tools: DfE has developed guidance on generative AI product safety for the education sector. Published in January 2025, the guidance targets EdTech developers and suppliers, outlining the minimum requirements in terms of capabilities and design that AI products should meet to be considered safe for educational use.15 For example, it sets out clear technical safeguards, including enhanced filtering of harmful content and protection against unauthorized modifications to the products. To bolster the credibility of the guidance, DfE collaborated with major technology firms such as Google and Microsoft in developing what it describes as "the world's most detailed set of safety expectations for AI in education";16

(b) Building a government-approved content store: To help AI tool developers meet these safety expectations, DfE has since 2024 been developing what is reportedly the world's first content store for training AI tools for education.17 With an investment of £3 million (HK$32 million), the content store is currently being piloted, providing accurate, relevant and legally compliant data with which developers can train their AI tools for educational use. The data repository comprises local teaching materials and curricula, guidance from DfE, and anonymized student assessments and work. Use of the store is voluntary. According to DfE's estimate, the content store could raise the accuracy of AI-generated content from 67% to 92%. To promote the content store, the government has allocated an extra £1 million (HK$11 million) to encourage 16 selected innovators in the private market to use the store data to create AI tools for teachers' use; and

(c) Facilitating evaluation of AI tools in the market: The rapid development of AI has spurred numerous EdTech products in the market, and schools may find it challenging to identify tools that are both effective and safe. In this connection, DfE established an EdTech Evidence Board in early 2025, a panel of experts from the education and technology sectors tasked with evaluating the pedagogical efficacy of these EdTech products. DfE will invite AI tool developers to submit evidence on their products, which will be measured against a set of criteria currently under development. The findings will then be published for reference, helping schools learn about the educational value of EdTech products and make informed procurement decisions.18
Singapore's approach to the supply of AI tools for schools
- Singapore also places strong emphasis on EdTech. In 2024, it published the renewed EdTech Masterplan 2030, supplementing the National AI Strategy released in 2019, which called for "personalized education through adaptive learning and assessment". Furthermore, the Ministry of Education ("MOE") issued in 2024 an AI-in-Education Ethics Framework to guide schools and students on the use of AI. The framework establishes key principles, such as fairness, accountability, transparency and safety, for integrating AI into the learning environment, alongside practical guidance on the use of generative AI in schools.19
- On the supply of AI tools, the Singaporean government has reinforced Student Learning Space ("SLS"), its national online learning platform, by incorporating new AI-powered tools.20 Key safety features include:
(a)

(b) Using representative local data for AI training: To mitigate data bias in the AI tools provided on the SLS platform, MOE requires them to be trained with representative local data.23 For example, AI tools for language learning (e.g. writing) were trained with local data to reduce the bias inherent in large language models trained on Internet data, which covers global content. This helps ensure an accurate reflection of Singapore students' language capabilities; and

(c) "Whitelisting" for third-party AI tools: While SLS centrally provides government-developed AI tools to schools, it allows the integration of external resources from the market through a "whitelisting" arrangement. Providers have to seek pre-approval from MOE before their products can be accessed by teachers and students via the SLS platform.
Concluding remarks
- Both the UK and Singapore have implemented measures to address the risks associated with deploying AI in education. Beyond issuing guidance, both have taken additional steps to ensure the safe provision of AI tools. In the UK, the dedicated content store for AI training helps ensure that the tools are trained on data that is accurate, relevant and legally compliant (e.g. with intellectual property laws and regulations). On top of that, an expert board is in place to evaluate AI tools, helping schools make informed procurement choices. In contrast, Singapore adopts a centralized approach to the provision of AI tools, through direct development of these tools and AI training with representative local data, thereby building user confidence. These initiatives are considered key to ensuring a reliable and quality supply of AI-assisted tools in education, which may offer valuable insights for Hong Kong.
Prepared by CHEUNG Chi-fai
Research Office
Research and Information Division
Legislative Council Secretariat
17 October 2025
Endnotes:
1. Precedence Research (2025).
2. Panel on Education (2025).
3. UNESCO (2024) and Office for Artificial Intelligence (2021).
4. Precedence Research (2025).
5. 香港教育工作者聯會 (2025a).
6. Digital Policy Office (2025b).
7. Policy Address (2024, 2025). The 2025 Policy Address announced that a Blueprint for Digital Education in Primary and Secondary Schools will be issued in 2026.
8. Apart from eLAFP, the Hong Kong Education City was studying the feasibility of developing AI tools for learning. See Hong Kong Education City (2024, 2025).
9. Panel on Education (2025). Other concerns include whether schools have adequate funding for procuring the latest digital technology.
10. 明報 (2025).
11. 香港教育工作者聯會 (2025b).
12. UKAI (2025).
13. The policy paper applies only to schools in England, as Wales and Scotland have autonomy over their education policies. See Department for Education (2025e).
14. The Oak National Academy, which had provided online resources for teaching during the Covid-19 pandemic, has developed an AI-driven lesson-assistant known as "Aila" which can create and customize teaching plans and lessons efficiently. It claims the tool has met the highest safety standards. See Oak National Academy (2024a, 2024b).
15. Seven aspects are covered in the guidance, including filtering, monitoring, security, design and testing, privacy and data protection, intellectual property and governance. See Department for Education (2025d).
16. Department for Education (2025b).
17. Department for Education (2024).
18. Chartered College of Teaching (2025).
19. Ministry of Education (2025a).
20. Ministry of Education (2025a, 2025d).
21. Adaptemy (2025).
22. These might include chatbots providing student counselling advice or algorithms deciding a student's score for education pathway decisions. See Ministry of Education (2025b).
23. Ministry of Education (2025b).
Essentials are compiled for Members and Committees of the Legislative Council. They are not legal or other professional advice and shall not be relied on as such. Essentials are subject to copyright owned by The Legislative Council Commission (The Commission). The Commission permits accurate reproduction of Essentials for non-commercial use in a manner not adversely affecting the Legislative Council. Please refer to the Disclaimer and Copyright Notice on the Legislative Council website at www.legco.gov.hk for details. The paper number of this issue of Essentials is ISE11/2025.