
Improper use of Biometric Data

14/10/2024

As the first term draws to a close, new systems may have been put in place, but is the ICO likely to find them acceptable? It is always worth putting new data processes under the microscope, especially with the ICO ramping up its focus on compliance.

Over the summer the ICO issued a reprimand to a secondary school for using children’s biometric data in a cashless payment system. In doing so, the ICO made it clear that the school had breached the Data Protection Act 2018 and the UK GDPR in a number of ways. Whilst Data Protection Impact Assessments may seem like unnecessary admin when a business wants new systems up and running, this case highlights the risks of implementation without due consideration. Proper DPIAs significantly lower that risk.

The school, Chelmer Valley High, was issued with a reprimand for failing to undertake a Data Protection Impact Assessment (DPIA) prior to the implementation of facial recognition technology (FRT). A copy of the ICO’s reprimand is available here: https://ico.org.uk/action-weve-taken/enforcement/chelmer-valley-high-school/ and is a sobering reminder that data protection transgressions are made public. Not only had the school failed to conduct a DPIA, it had also failed to obtain opt-in “explicit consent” for the use of this technology, as required under Article 9 of the UK GDPR for biometric data.

Public sector breaches are addressed differently by the ICO: organisations are predominantly reprimanded rather than fined, with the ICO making recommendations for matters to be remedied in specific ways.

 

Processing biometric data: the importance of explicit consent

The Chelmer Valley case is the latest in a line of ICO decisions concerning failures to obtain explicit consent for the processing of biometric data. It highlights the importance of discussing commercial decisions and business efficiencies with Data Protection Officers and of undertaking the necessary DPIAs, however frustrating that may seem when there is pressure to implement new systems.

  • In 2018, the ICO undertook regulatory action against HMRC regarding its use of the Voice ID service for customer verification. The ICO determined that HMRC had (i) failed to give customers adequate information about the processing of their biometric data; and (ii) not given them the opportunity to consent or withhold consent. Subsequently, HMRC was required to delete all biometric data collected under the Voice ID service for which explicit consent had not been obtained. No doubt a time-consuming and frustrating exercise that could well have been avoided.
  • North Ayrshire Council found itself in difficulties concerning its cashless catering in schools (January 2023) when the ICO wrote to it raising concerns about its use of biometric data in nine school canteens. The ICO’s letter stated that (i) explicit consent was the most appropriate legal basis for using facial recognition technology (FRT) for cashless catering, and (ii) a Data Protection Impact Assessment (DPIA) should have been conducted before starting the processing. The Council instructed its schools to stop using FRT for taking payments from children, setting them back to square one whilst the situation was rectified or another solution was found.
  • Serco Leisure (“Serco”) found themselves in difficulties concerning the use of fingerprint technology to monitor staff attendance. This technology, similar to facial recognition, converts fingerprint images into numerical identifiers, which are then compared with reference images in a database for authentication. The ICO found this practice to be overly intrusive and unnecessary for monitoring staff attendance, adding that Serco and the centres in question had not obtained explicit, free, and fair consent from their staff. As a result, the ICO issued an Enforcement Notice requiring Serco and the Trust centres to cease all biometric processing and destroy the data within three months. The ICO also recommended implementing an alternative monitoring method that staff could opt into without any pressure or penalty.

Those responsible for implementing biometric data systems often misunderstand the core data protection principles involved and the importance of undertaking DPIAs to ensure compliance. The key issue in these cases is that the initial capture of an image and its conversion into a numerical identifier stored in a database constitutes the processing of personal data. Regardless of whether the facial image or fingerprint is immediately deleted, the processing still occurs. The identifier qualifies as personal data, as it can indirectly identify an individual when combined with other information.
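To make the point concrete, here is a minimal sketch, in Python and entirely hypothetical: the class, function and field names are ours, and a simple hash stands in for the model-derived template a real facial recognition or fingerprint system would produce. It shows the data flow the ICO is concerned with: an image is captured, converted into a numerical identifier, stored against a pupil record and later compared for authentication. Every step after capture is still processing of personal data.

```python
# Illustrative sketch only (not drawn from any system discussed above): a real
# FRT or fingerprint system derives its numerical template from the image with
# a trained model; a SHA-256 hash stands in here purely to show the data flow.
import hashlib
from dataclasses import dataclass


@dataclass
class BiometricRecord:
    pupil_id: str   # links the identifier to a living individual
    template: str   # numerical identifier derived from the captured image


def derive_template(image_bytes: bytes) -> str:
    # Converting the captured image into a stored identifier is itself
    # processing of special category personal data, even if image_bytes
    # is discarded immediately afterwards.
    return hashlib.sha256(image_bytes).hexdigest()


def enrol(database: dict[str, BiometricRecord], pupil_id: str, image_bytes: bytes) -> None:
    # The stored template remains personal data: keyed against a pupil_id,
    # it can indirectly identify the individual.
    database[pupil_id] = BiometricRecord(pupil_id, derive_template(image_bytes))


def authenticate(database: dict[str, BiometricRecord], pupil_id: str, image_bytes: bytes) -> bool:
    # Matching at the till or the door compares a fresh template with the
    # reference template held in the database.
    record = database.get(pupil_id)
    return record is not None and record.template == derive_template(image_bytes)


if __name__ == "__main__":
    db: dict[str, BiometricRecord] = {}
    enrol(db, "pupil-001", b"captured facial image bytes")
    print(authenticate(db, "pupil-001", b"captured facial image bytes"))  # True
```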

Children’s data is regarded by the ICO as especially important and must be treated as such. Because of this, the education sector in particular has come under increasing scrutiny from the ICO, which has observed that in certain contexts children should be recognised as having the ability to consent to the processing of their own data.

Biometric data is special category data (Article 9(1) UK GDPR) and can only be processed when the data controller has identified both a lawful basis under Article 6 UK GDPR and a separate condition for processing under Article 9 UK GDPR. Further conditions may have to be satisfied under Schedule 1 of the Data Protection Act 2018. In line with the purpose limitation principle under data protection law, schools and colleges can only store and use biometric information for the purpose for which it was originally obtained and for which parental/child consent was given. For further information, refer to the July 2022 guidance provided by the Department for Education, found here: Protection of biometric data of children in schools and colleges (publishing.service.gov.uk)

It is also worth noting that under the Protection of Freedoms Act 2012, schools and colleges must notify each parent, carer or legal guardian of the child of their intention to process the child’s biometric information, and that the parent may object at any time to the processing of that information. A child’s biometric information must not be processed unless at least one parent of the child consents, and no parent of the child has withdrawn his or her consent, or otherwise objected, to the information being processed. In addition, a pupil’s or student’s objection or refusal overrides any parental consent to the processing. Quite a number of compliance hurdles.

Further, if a pupil or student under 18 objects or refuses to participate (or to continue to participate) in activities that involve the processing of their biometric data, the school or college must ensure that the pupil’s or student’s biometric data is not taken or used as part of a biometric recognition system. Sections 26 and 27 of the Protection of Freedoms Act 2012 make no reference to a lower age limit on a child’s right to refuse to participate in sharing their biometric data. Once a student is 18 years old, they are considered an adult, and parental consent is no longer relevant.
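By way of illustration only, and not as legal advice, the short sketch below encodes the consent conditions described above for a school deciding whether it may process a pupil’s biometric data. The helper and its field names are hypothetical assumptions; they simply restate the position set out above: every parent or carer notified, at least one parental consent, no parental objection or withdrawal, the child’s own refusal overriding everything, and the student’s own consent being what matters from 18.

```python
from dataclasses import dataclass


@dataclass
class ConsentState:
    """Illustrative snapshot of the consent position for one pupil (hypothetical fields)."""
    pupil_age: int
    parents_notified: bool        # every parent/carer notified of the intention to process
    parental_consents: int = 0    # number of parents who have given consent
    parental_objections: int = 0  # parents who have objected or withdrawn consent
    pupil_refuses: bool = False   # the pupil's own objection or refusal
    pupil_consents: bool = False  # the student's own consent, relevant from age 18


def may_process_biometric_data(state: ConsentState) -> bool:
    """Apply the conditions described above (hypothetical helper, not legal advice)."""
    if state.pupil_age >= 18:
        # An adult student consents in their own right; parental consent is irrelevant.
        return state.pupil_consents and not state.pupil_refuses
    if not state.parents_notified:
        return False  # notification of each parent/carer comes first
    if state.pupil_refuses:
        return False  # the child's refusal overrides any parental consent
    return state.parental_consents >= 1 and state.parental_objections == 0


# A refusing 14-year-old blocks processing even where a parent has consented.
print(may_process_biometric_data(ConsentState(
    pupil_age=14, parents_notified=True, parental_consents=1, pupil_refuses=True)))  # False
```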

The bottom line is that there is a LOT to think about, which is the whole point of requiring Data Protection Impact Assessments. Firstly, they are designed to help organisations identify the data protection implications of a project during the design phase. Secondly, they demonstrate that an organisation has properly considered the impact of its projects, which, should the ICO investigate, will help to mitigate the harshest penalties. You may not have got it perfectly correct, but you were, at least, trying.

  • Consult your Data Protection Officer;
  • Do a Data Protection Impact Assessment;
  • Adjust projects according to the risks identified to ensure compliance; and
  • Minimise compliance failures.

For information about biometric data and conducting DPIAs contact [email protected].
