Data Protection Impact Assessment (DPIA)
The Haven – Digital Learning Platforms
Version 1.0
| Name of controller | Autistic Girls Network |
|---|---|
| ICO Registration Number | ZB867479 |
| Role / title of DPO | Data Protection Officer |
| Name of controller contact / DPO | Cathy Wassell |
| Date of DPIA | February 2026 |
| Next scheduled review | February 2027 |
Step 1: Identify the Need for a DPIA
Project Description and Aims
This DPIA relates to The Haven’s use of digital education platforms to deliver online education, tutoring, communication, and safeguarding oversight for neurodivergent children and young people. Many learners hold Education, Health and Care Plans (EHCPs) and access The Haven as an alternative or specialist provision.
The project involves the processing of personal data via multiple cloud-based systems, including:
- Google Workspace for Education
- Canvas (learning management system)
- Pencil Spaces (live virtual classrooms)
- TutorCruncher (timetabling, billing, attendance tracking)
These platforms are used together to provide a coherent online learning environment, including live lessons, asynchronous learning materials, communication, attendance tracking, and safeguarding oversight.
Reason for DPIA
A DPIA is required under Article 35 UK GDPR because the processing meets all three high-risk criteria set out in the ICO’s guidance:
- Systematic and large-scale processing of special category data (health and SEN information)
- Processing involving children and other vulnerable individuals
- Systematic monitoring of individuals via online platforms
Additional triggers include the involvement of third-party cloud processors and the potential for significant harm to data subjects if data were misused, breached, or accessed inappropriately.
Step 2: Describe the Processing
Nature of the Processing
Collection
- Data is collected directly from parents/carers and learners during onboarding and enrolment.
- Additional data is generated through platform use (attendance, coursework, communications, engagement records).
Use
- Account creation and authentication across platforms
- Delivery of live and recorded lessons
- Distribution and assessment of learning materials
- Communication between staff, learners, and families
- Monitoring of attendance, engagement, and safeguarding indicators
Storage
- Data is stored securely on third-party cloud platforms (Google-hosted infrastructure or equivalent; see processor list at Step 4).
- Access is role-based and restricted to authorised staff only.
Deletion
- Data is retained in accordance with The Haven’s retention schedule and applicable education and safeguarding legislation.
- Accounts are disabled and data archived or deleted when learners leave, subject to mandatory retention periods.
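The exit lifecycle described in this step (disable the account, archive the record, delete once the mandatory retention period has expired) can be sketched as follows. This is an illustrative sketch only: the six-year retention constant is a placeholder assumption, and actual periods are set by The Haven’s retention schedule and applicable legislation.

```python
from datetime import date, timedelta

# Placeholder assumption: real retention periods come from The Haven's
# retention schedule, not from this illustrative constant.
RETENTION_YEARS = 6

def deletion_due(leave_date: date, retention_years: int = RETENTION_YEARS) -> date:
    """Earliest date on which an archived leaver record may be deleted.

    Uses approximate 365-day years for illustration; a real schedule
    would count calendar years.
    """
    return leave_date + timedelta(days=365 * retention_years)

def on_learner_exit(account: dict, today: date) -> dict:
    """Disable the account immediately and record when deletion becomes permissible."""
    account["active"] = False
    account["archived_on"] = today
    account["delete_after"] = deletion_due(today)
    return account
```

The design point is that disabling is immediate on exit, while deletion is deferred and date-driven, so the mandatory retention period cannot be shortened by an ad hoc manual decision.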
Data Sharing
- Data is shared with platform providers acting as data processors under Data Processing Agreements (DPAs).
- No personal data is sold or shared for marketing or commercial purposes.
- Data may be shared with statutory bodies (e.g. local authorities, social care, police) only where legally required, including under safeguarding obligations.
- The four named platforms operate independently. There is no automated cross-platform data sharing.
High-Risk Processing
- Processing of children’s data
- Processing of special category data (health, SEN, safeguarding)
- Live online learning environments involving real-time interaction, including risk of: unmonitored one-to-one contact; visible home environments; potential recording of sessions
Scope of the Processing
Nature of Data
Personal data processed includes:
- Names, email addresses, usernames, and authentication credentials
- Dates of birth
- Education records and coursework submissions
- Communications between staff, learners, and families
- SEN, health, and therapeutic support information (special category data under Article 9 UK GDPR)
- Data relating to safeguarding concerns, which may include special category data and/or criminal offence data, processed under Schedule 1, Part 2 of the Data Protection Act 2018 and the safeguarding provisions therein
Volume and Frequency
- Data is processed daily during term time.
- Processing covers all enrolled learners, relevant staff, and tutors.
Retention
- Retained in accordance with education and safeguarding legal requirements.
- Not kept longer than is necessary for the original purpose.
- A formal retention schedule is maintained and reviewed annually.
Individuals Affected
- Children and young people (primary data subjects)
- Parents and carers
- Staff, tutors, and contractors
Geographical Scope
- Primarily UK-based learners and staff.
- Data hosted primarily within the UK/EEA, with appropriate safeguards (e.g. Standard Contractual Clauses) applied where any international transfers occur.
Context of the Processing
- Learners are neurodivergent young people. There is a high duty of care and safeguarding responsibility.
- Families reasonably expect digital platforms to be used for the delivery of online education.
- Learners and families have limited practical alternative to digital engagement, given The Haven’s online-only model. This means freely given consent cannot be obtained for core platform use; processing is therefore based on lawful bases other than consent (see Step 4).
- The use of cloud-based platforms for online education is not novel, but risk is heightened due to the vulnerability of the cohort.
- The technology used is industry-standard for online education provision.
- There is significant and current public concern regarding children’s data, online safety, and the use of commercial platforms in educational settings.
- No approved certification scheme or code of conduct currently applies to this processing, though The Haven adheres to relevant ICO guidance and DfE data protection standards.
Purposes of the Processing
- Deliver safe, accessible, and personalised online education to learners who cannot access mainstream settings
- Support neurodivergent learners through flexible, relationship-first digital access
- Enable communication between staff, learners, and families
- Monitor attendance, engagement, and safeguarding indicators in line with statutory obligations
- Meet legal education, SEND, and safeguarding duties
Benefits
- Enables access to education for learners unable to attend mainstream or physical provision
- Supports learner autonomy, wellbeing, and dignity
- Improves consistency, record-keeping, and safeguarding oversight
- Enables transparency and accountability to commissioners, local authorities, and Ofsted
AI tools: scope of this DPIA
Some of the platforms covered by this DPIA include integrated AI features (e.g. Google Workspace’s assistive AI, Canvas AI assistants, Pencil Spaces collaborative AI features). Where AI features process or could infer personal data of children:
- The feature is reviewed under the Responsible Use of AI Policy v10.26 (institutional Diamond Standard + relational Diamond AI Posture).
- A separate AI Review entry is created in the AI tools register, cross-referenced to this DPIA.
- Where the AI feature would process identifiable children’s data for training, profiling, behaviour prediction, biometric inference, or affect recognition, it is not deployed — these are prohibited practices under the AI policy.
- Standalone AI tools used by staff for non-child-facing tasks (drafting, summarisation, planning) are subject to the same policy but recorded in the AI register rather than this DPIA.
Generative AI tools used directly by learners (where adopted) require their own DPIA entry before deployment.
Step 3: Consultation Process
- Internal consultation undertaken with senior leadership, safeguarding leads, SENCo, and systems administration staff.
- Platform providers’ published security documentation, data processing terms, and sub-processor lists have been reviewed as part of due diligence.
- Ongoing informal feedback from families, captured through onboarding calls and annual review discussions, informs platform configuration and privacy communications.
- Direct consultation with learners is limited given the age and vulnerability of the cohort. Learner voice is captured through keyworker and mentor relationships, and through family and carer advocacy.
- Any decision that departs materially from concerns raised by families or staff will be documented with reasons in Step 7.
| Recommended: Record the dates and format of any consultation activities undertaken (e.g. ‘Leadership meeting, [date]’; ‘Onboarding call review, [date range]’). This strengthens the evidential record if the ICO reviews this DPIA. |
Step 4: Assess Necessity and Proportionality
Lawful Basis
The following lawful bases apply under UK GDPR and the Data Protection Act 2018:
- Article 6(1)(e) – processing necessary for a task carried out in the public interest (delivery of education)
- Article 6(1)(c) – processing necessary to comply with a legal obligation (statutory education and safeguarding duties)
- Article 6(1)(b) – processing necessary for the performance of a contract (where applicable to the contractual relationship with families)
- Article 9(2)(g) – processing necessary for reasons of substantial public interest (DPA 2018, Schedule 1, Part 2, para 6 — statutory and government purposes; and para 18 — safeguarding of children and individuals at risk)
- Article 9(2)(h) – processing necessary for health or social care purposes (DPA 2018, Schedule 1, Part 1, para 2 — health care; applicable to SEN and therapeutic support data)
| Note: Consent is NOT used as the lawful basis for core platform processing. This is correct given that learners and families have limited practical alternative to digital engagement. Ensure privacy notices do not use consent language in relation to core platform use. |
Necessity
- Digital platforms are necessary to deliver online education to The Haven’s learner cohort. No reasonable offline or in-person alternative exists for this provision model.
- Each platform serves a distinct and necessary function; no single platform could replace the combination.
Proportionality
- Only data that is necessary for the stated purposes is collected.
- Access controls are role-restricted; staff only access data relevant to their function.
- Clear functional separation exists between platforms.
Preventing Function Creep
- Platforms are used strictly for education delivery, communication, and safeguarding.
- No secondary, commercial, or research use of personal data without separate legal basis and, where applicable, further DPIA.
Data Quality and Minimisation
- Data is entered by trained staff following defined processes.
- Regular review and correction procedures are in place.
- Retention schedule ensures data is not kept beyond the necessary period.
Transparency
- Privacy notices are provided to parents and carers at the point of enrolment.
- Clear explanations of platform use are provided during onboarding.
- Notices are written in plain English and are accessible to the families The Haven serves.
Individual Rights
- Processes are in place to handle requests for access, rectification, and erasure (where applicable and not overridden by legal retention obligations).
- Parental rights are respected alongside learner autonomy, with particular sensitivity given learner age and capacity.
Processors
- All platform providers have Data Processing Agreements (DPAs) in place with The Haven.
- Due diligence has been conducted on the security standards and sub-processor arrangements of each named processor.
- Sub-processor lists for each platform are reviewed periodically.
International Transfers
- Where any processing involves transfers of personal data outside the UK/EEA, appropriate safeguards are in place, including Standard Contractual Clauses (SCCs) or equivalent UK-approved transfer mechanisms.
Step 5: Identify and Assess Risks
| Risk | Description | Likelihood | Severity | Overall Risk |
|---|---|---|---|---|
| Unauthorised access to learner accounts | Weak credentials or shared access leading to exposure of learner data | Possible | Significant | Medium |
| Accidental data disclosure by staff | Staff sharing data inappropriately, e.g. via email or incorrect platform settings | Possible | Significant | Medium |
| Platform security breach | Breach of a third-party cloud platform exposing personal and special category data | Remote | Severe | Medium |
| Learner identity disclosure in live sessions | Learner home environment, appearance, or identity visible to peers or unintended parties during live lessons | Possible | Significant | Medium |
| Learner data exposure causing distress or reputational harm | Sensitive SEN, safeguarding, or health data accessed or disclosed without authorisation, causing harm to the learner | Possible | Significant | Medium |
| Retention failure | Data retained beyond the required period due to manual processes or oversight, increasing exposure risk | Possible | Minimal | Low |
| Third-party sub-processor risk | Platform providers using sub-processors whose security standards have not been independently verified | Possible | Significant | Medium |
| Privacy notice / consent failure | Families not receiving, understanding, or meaningfully engaging with privacy information at onboarding | Possible | Minimal | Low |
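The Overall Risk column above follows a conventional likelihood × severity matrix. The numeric scales and thresholds in the sketch below are assumptions chosen so that the mapping reproduces the rows of this table; they are not an ICO-prescribed formula.

```python
# Illustrative risk-scoring sketch. Scale values and band thresholds are
# assumptions, chosen only to be consistent with the Step 5 table.

LIKELIHOOD = {"Remote": 1, "Possible": 2, "Probable": 3}
SEVERITY = {"Minimal": 1, "Significant": 2, "Severe": 3}

def overall_risk(likelihood: str, severity: str) -> str:
    """Map a likelihood/severity pair to an overall risk band."""
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score <= 2:
        return "Low"
    if score <= 4:
        return "Medium"
    return "High"

# Consistent with the table: e.g. platform breach (Remote x Severe) scores
# Medium, and retention failure (Possible x Minimal) scores Low.
```

Recording the scoring convention alongside the table makes Step 5 reproducible at each annual review, since reviewers re-derive the same band from the same likelihood and severity judgements.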
Step 6: Identify Measures to Reduce Risk
| Risk | Mitigation Measures | Effect on Risk | Residual Risk | Approved |
|---|---|---|---|---|
| Unauthorised access | Strong password policy; role-based access controls; multi-factor authentication (MFA) where available; regular access reviews; staff training | Reduced | Low | Yes |
| Staff disclosure error | Clear data handling policies; onboarding training; audit trails; regular reminders; incident reporting procedure | Reduced | Low | Yes |
| Platform breach | Use of reputable, contractually bound processors; DPAs in place for all platforms; encryption in transit and at rest; review of sub-processor lists; incident response plan | Reduced | Medium | Yes |
| Live session identity risks | Camera-optional policy enforced; guidance to learners and families on background privacy; no recording without consent; session oversight by staff | Reduced | Low | Yes |
| Learner data exposure causing harm | Trauma-informed data handling policies; minimal data visibility principle applied; role-based access; staff training in sensitive data management | Reduced | Low | Yes |
| Retention failure | Formal retention schedule in place; annual review; account disabling process on learner exit; staff responsible for deletion identified | Reduced | Low | Yes |
| Sub-processor risk | Sub-processor lists reviewed during onboarding of new platforms and annually thereafter; DPAs require processor notification of sub-processor changes | Reduced | Low | Yes |
| Privacy notice failure | Plain-English privacy notices provided at enrolment; onboarding calls used to explain platform use; notices reviewed annually | Reduced | Low | Yes |
| Any risk recorded as residual Medium or above should be reviewed with the DPO before processing commences, and the ICO consulted if residual risk remains High. |
Step 7: Sign Off and Record Outcomes
| Item | Name / Position / Date | Notes |
|---|---|---|
| Measures approved by: | [Name / Role / Date] | Integrate actions back into operational plan with dates and responsible owners. |
| Residual risks approved by: | [Name / Role / Date] | If accepting any residual High risk, consult the ICO before proceeding. |
| DPO advice provided: | Yes | DPO should advise on compliance, Step 6 measures, and whether processing can proceed. |
| Summary of DPO advice: | Processing may proceed subject to: continued annual review; staff training on data handling and platform use; adherence to retention schedule; and ongoing review of processor sub-processor arrangements. | |
| DPO advice accepted or overruled by: | [Name / Role / Date] | If overruled, reasons must be documented here. |
| Consultation responses reviewed by: | [Name / Role / Date] | If any decision departs from views expressed by families or staff, reasons must be documented. |
| This DPIA will be kept under review by: | Data Protection Officer and Senior Leadership Team | Review annually, or upon any material change to processing activities, platforms used, or applicable legislation. |
| Next scheduled review date: | [Insert Date] | At minimum, annual review is required. |
Document control
| Document owner | Data Protection Officer / Senior Leadership Team |
|---|---|
| Version | 1.0 |
Refreshed for AI / DfE 2025 / Diamond AI — signed off April 2026
This policy was refreshed on 2026-04-29 to align with DfE Generative AI in Education 2025, ICO Children’s Code, EU AI Act (compliance + extension), UK GDPR / DPA 2018, and to make explicit the link between the institutional Diamond Standard (Safety, Sovereignty, Symmetry, Stewardship) and the relational Diamond AI posture (work with AI; do not offload decisions to AI; do not defer entirely from AI).
Status: live — signed off 29 April 2026 by Proprietor and Governing Body.
| Status | Draft — awaiting sign-off |
|---|---|
| Date created | [Insert Date] |
| Date last reviewed | [Insert Date] |
| Next review due | [Insert Date] |
| Related documents | Privacy Notice (Families); Child Protection and Safeguarding Policy v01.26; Data Retention Schedule; Information Security Policy; Individual Platform DPAs |