GDPR Updates: Workplace data & EU fines



    We have decided to publish this blog every two weeks so that we can give you more relevant information and more time to get around to reading the parts that interest you.

    In this blog we look at articles on Guernsey’s new data protection regime, using CCTV in the workplace, a decision on the erasure of personal data, new guidelines from the European Data Protection Board on recording personal data on video devices, the final guidelines on the GDPR’s territorial scope and the requirements for an EU GDPR Representative, and finally the vulnerability of SMEs to cyber-attacks.

    We have also seen this week the release of new guidance from the EDPB and the UK and Irish commissioners on subjects such as special category data, subject access requests, data protection by design and default, and the obligations on data controllers and processors.  We are really seeing clarity on a lot of issues where there was previously confusion or differing opinions on meaning.

    Propelfwd is still pressing the question of whether businesses that fall within the territorial scope of the GDPR require a Representative in the Union.  We are also looking ahead to 2020 to develop a training schedule for our data protection courses and our customised online courses.

    The message on data protection is still not getting through to front-line staff, especially in the hospitality sector.  On a recent trip to Ireland, staying in a large group hotel, I was left waiting to be seated for breakfast with the entire guest list lying open in front of me.  This careless approach has led to fines: one restaurant in Romania was fined €15,000 for exactly this.  Get the message to your staff – they are your front line of protection against a data breach.

    Our fines section this time looks at the top 10 countries by total value of fines and the top 10 by number of fines. (Spain is the most proactive.)

    Guernsey’s data protection regime – shifting our attitudes

    A shift in our corporate and individual attitude to the misuse of data is now central to the Office of the Data Protection Authority’s (the ODPA) future approach to governance and enforcement in Guernsey.

    Following the end of the transitional relief period under the Data Protection (Bailiwick of Guernsey) Law, 2017 (the Law) in May this year, we have now rounded up the key issues which the ODPA have communicated to us and which will dictate that approach.

    A change in culture in the workplace

    The ODPA has repeatedly highlighted its encouragement for a shift in our attitude (as consumers as well as businesses) so that the misuse of data is seen as both legally and socially unacceptable.

    While legislation and regulatory action both have a role to play in protecting our data, the ODPA sees each of us as the key factor in achieving secure, ethical use of our data. As we begin to recognise the ever-growing value of our personal information and have open access to information about the frequency and severity of data breaches, we can begin to impose an ethical baseline when it comes to the use of our data and punish those businesses which fall beneath it. Over time this will have the effect of building a self-correcting market.

    A simple rule of thumb for officers and employees undertaking any aspect of personal data management, to ensure they don’t fall foul of the standards of protection required by the ODPA, is to treat personal data in the manner in which they, themselves, would wish their own personal data to be treated.

    Predict, prevent, detect, enforce

    The ODPA is seeking to achieve a balanced approach across the four key areas of regulation (prediction, prevention, detection and enforcement) in fulfilling its functions under the law.

    In particular, businesses have been reminded that the principal purpose of the breach reporting requirements under the Law is to assist the regulator in predicting breaches and preventing harm before it occurs, and in identifying areas of industry which may require additional resources and training to achieve compliance and/or best practice, rather than to serve as an enforcement tool.

    Delayed introduction of self-funded charging system

    The ODPA released a statement on 28 October 2019 to confirm that while it had been working with the States of Guernsey for the past year to agree a funding model for the ODPA’s activities based on the charging of annual registration fees, it has taken longer than expected to agree and implement such a model.

    Guernsey’s Data Protection Commissioner, Emma Martins stated that the ODPA’s goal is to achieve a “fair, low-cost, low-admin business that allows local businesses to concentrate their efforts on running their businesses well, rather than filling in bureaucratic forms.”

    The delay in agreeing the funding model has resulted in the extension of the current registration exemptions for small businesses and sole traders. Those persons to whom the exemptions apply will now not be required to register with the ODPA until January 2021.

    Candid camera: CCTV in the workplace

    In a recent case, the European Court of Human Rights (ECHR) has provided useful guidance on an employer’s ability to monitor staff activities covertly. The case relates to a Spanish employer but the outcome is relevant to UK law.

    Some supermarket employees were dismissed for theft in light of confessions by colleagues and covertly recorded video evidence showing the misconduct. Five of the employees then embarked on a 10-year attempt to have the video evidence rendered inadmissible on the basis that it breached their privacy rights under both Spanish law and the European Convention on Human Rights.

    The ECHR confirmed that employers can conduct covert video surveillance without breaching privacy rights, but must be proportionate in balancing the loss of privacy against the necessity of conducting the surveillance covertly. Here the thefts were very substantial – amounting to tens of thousands of euros per month over a period of some months. The employees had been informed that certain surveillance cameras were in place, but not that there were cameras trained on four tills covertly recording employees at those tills throughout their working day.

    The factors to be considered in determining proportionality included: (i) whether (and how) the employee has been informed; (ii) the extent of the monitoring and the degree of intrusion; (iii) whether the employer has legitimate reasons to carry out the monitoring; (iv) whether less intrusive measures could be used; and (v) the consequences for the employee. Here the covert monitoring was proportionate given the gravity of the thefts, the fact that the shop was a public place, that the covert surveillance only lasted 10 days and that only three people (including the employees’ trade union representative) reviewed the video evidence before the dismissals.

    Breaches of data protection rights were not argued in this case, but privacy rights are inherent to UK data protection legislation. Employers should undertake a proper and detailed Data Protection Impact Assessment (DPIA) to carry out a balancing act before undertaking any employee monitoring. Where monitoring is overt, with employees being clearly informed about it, and it is proportionate taking into account the data protection principles and the above points, then it should be permissible. Covert monitoring is more difficult to justify but as in this case may still be proportionate depending on the reason for it.

    The fact that this case, which looked hopeless and involved patently guilty claimants, took 10 years in the courts to resolve should in itself be enough to persuade even the most righteous employer to have a properly considered, reasoned justification in place before undertaking any covert monitoring.

    GDPR: Decision of the DPA on the erasure of personal data

    The daily business of a company is simply inconceivable without the processing of data, and it is difficult to imagine a scenario in which a company does not process personal data.

    As a rule, however, this data processing is not possible for an unlimited period, so sooner or later the time comes when the data must be erased. This also coincides with the data subjects’ right to demand the erasure of their data from the controller if certain reasons exist.

    The question, therefore, remains: what is meant by “erasure” within the meaning of the General Data Protection Regulation (GDPR)?

    The GDPR does not define this, nor does it provide any information on how the erasure of personal data is to be carried out.

    If one considers that a conventional hard disk consists of millions of bits that can assume two states (e.g. 1 or 0, on or off) and whose number always remains the same, it quickly becomes clear: when a file is erased, these bits do not simply disappear from the hard disk; rather, the file system merely notes that the corresponding data area is free again for new data. To physically destroy data completely, it would be necessary to destroy the hard disk with a hammer, a drill or the like. This solution is not always feasible. Fortunately, the Austrian data protection authority (DPA) recently dealt with the question of which technical characteristics amount to an erasure (DPA 5.12.2018, DSB-D123.270/009-DPO/2018).

    In the specific case, a policyholder lodged a complaint with the data protection authority because the insurance company did not erase all personal data from its systems by irreversibly overwriting it, but “merely” anonymised some of it. As part of this anonymisation, the individual’s name, address and sex were irrevocably overwritten manually with a “dummy customer connection”, namely “John Doe”. The customer connection was also aggregated with a further non-assignable entry, so that the change sequence was no longer reconstructable. The insurer argued that, due to this overwriting, no further information remained which referred to the identity of the policyholder.
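    To make the mechanism concrete, the following is a minimal Python sketch of erasure by anonymisation along the lines described above: the identifying fields of a record are irreversibly overwritten with fixed dummy values while the non-identifying business data is retained. The record structure, field names and placeholder values are illustrative assumptions, not details taken from the decision.

```python
from dataclasses import dataclass, field
from typing import List

# Placeholder values used to overwrite identifying fields; "John Doe" mirrors
# the dummy name mentioned in the case, the other values are illustrative.
DUMMY_NAME = "John Doe"
DUMMY_ADDRESS = "n/a"
DUMMY_SEX = "n/a"

@dataclass
class PolicyholderRecord:
    name: str
    address: str
    sex: str
    claims: List[str] = field(default_factory=list)  # non-identifying business data

def anonymise(record: PolicyholderRecord) -> PolicyholderRecord:
    """Remove the personal reference by overwriting identifying attributes in place.

    After this step the remaining data can no longer be linked to the data
    subject without disproportionate effort, which is the standard the DPA
    accepted as erasure.
    """
    record.name = DUMMY_NAME
    record.address = DUMMY_ADDRESS
    record.sex = DUMMY_SEX
    return record

if __name__ == "__main__":
    record = PolicyholderRecord("Jane Example", "1 Sample Street", "F", ["claim-2017-001"])
    print(anonymise(record))
    # PolicyholderRecord(name='John Doe', address='n/a', sex='n/a', claims=['claim-2017-001'])
```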

    “Processing”: erasure and destruction are not necessarily congruent

    In its decision, the DPA stated that the definition of “processing” in the GDPR does not necessarily mean that erasure and destruction are identical. From this, the authority deduced that erasure does not necessarily require the final destruction of the data. A possible measure of erasure could also be the removal of the personal reference, i.e. anonymisation, if it is ensured that no one can restore this personal reference without disproportionate effort. Even the possibility of restoring the data at a later point in time, for example by using new technical procedures, does not mean that the erasure by anonymisation is currently insufficient. Complete irreversibility is therefore not required by the authority. It would also be up to the controller to decide how the erasure is to be carried out. The data subject therefore has no subjective claim to a concrete method.

    As a result, this decision is to be welcomed by controllers. On the one hand, it allows a certain flexibility with regard to the erasure modalities and, on the other hand, it makes it easier for more complex technical systems, where erasure is not simply possible by pressing the “delete” button, to comply with the requirements of the GDPR through anonymisation.

    Here you will find the decision of the data protection authority.

    New EDPB guidelines on processing personal data through video devices

    How does the GDPR apply to the use of video devices?

    The key takeaway

    Businesses that use CCTV and other video monitoring should check that their current practices are compliant with data protection laws.

    The background

    In July 2019 the European Data Protection Board (EDPB) published their guidelines on data processing in relation to the use of video devices. The public were able to submit their comments on the consultation version of the guidelines until 9 September 2019.

    These guidelines come within the context of increased concern from the EDPB about the use of personal data obtained from videos. The EDPB has stated that a significant amount of personal data is being generated and stored and there is growing concern over the potential for misuse – for example, using the data for purposes beyond security which data subjects may not expect (eg marketing or employee monitoring). The introduction of facial recognition technology presents additional privacy challenges, as does combining surveillance systems with other technology (eg biometrics), which makes it harder for individuals to remain anonymous.

    The guidance

    Exemptions

    The guidelines explain that there are a number of scenarios where video footage does not fall within the scope of the GDPR. These include footage in which individuals cannot be identified (for example, their face or number plate is blurred), and footage captured for law enforcement activity or purely personal use.
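    As an illustration of the first exemption, the snippet below sketches how faces in a frame might be blurred so that individuals can no longer be identified. It assumes the opencv-python package is available; the cascade model, blur strength and file names are illustrative choices, not requirements from the guidelines.

```python
import cv2  # requires the opencv-python package

# Pre-trained face detector shipped with OpenCV (an illustrative choice of model).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def blur_faces(frame):
    """Return a copy of the frame with every detected face heavily blurred."""
    output = frame.copy()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        region = output[y:y + h, x:x + w]
        output[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return output

if __name__ == "__main__":
    frame = cv2.imread("frame.png")                      # illustrative input file
    cv2.imwrite("frame_blurred.png", blur_faces(frame))  # blurred copy for storage
```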

    Specific GDPR requirements for use of video devices

    In cases where the exemptions do not apply, the guidelines set out a number of key requirements:

    • if video devices are being used to monitor a large public area, a data protection impact assessment (DPIA) must be carried out (Article 35(3)(c))
    • if video devices are being used to monitor individuals on a regular or systematic basis, a data protection officer must be appointed (Article 37(1)(b))
    • every camera in use must be for a specific purpose which is recorded in writing (Article 5(2))
    • data subjects must be made aware of the purpose for which they are being recorded and this information must be provided in a transparent manner. This will usually involve installing a prominent sign with initial information and then offering more detailed information in an accessible manner (for example, via a link or telephone number).

    Legal bases for processing

    As with other types of processing, the use of personal data obtained through a video device must have a legal basis. For video devices the EDPB states this is most likely to be a legitimate interest or a task carried out in the public interest.

    A legitimate interest must be balanced with the rights of data subjects. Factors that are particularly relevant for this balancing exercise include:

    • the size of the area being monitored
    • the number of data subjects being monitored, and
    • the reasonable expectations of the data subject in relation to the processing of their data (for example, the EDPB states that individuals would usually expect not to be monitored in leisure areas such as gyms and restaurants).

    If a data subject objects to the surveillance, there must be compelling legitimate interest in order to continue. This could potentially include situations involving a threat such as criminal activity.

    However, the interest will only be a legitimate reason to continue the monitoring if it relates to a current (rather than a speculative) threat.

    In line with the principle of data minimisation, personal data collected should also be processed only to the extent necessary. For example, if audio recordings and facial recognition are not required, these video functions should be disabled. The recording should also not take place at times of day or in areas which are not necessary or relevant for the purpose.
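    A minimal sketch of how those minimisation decisions might be captured in a camera configuration is shown below. The configuration keys, times and retention period are illustrative assumptions, not values prescribed by the guidelines.

```python
from datetime import time

# Hypothetical configuration for a single camera: optional functions are switched
# off and recording is limited to the times and area actually needed.
CAMERA_CONFIG = {
    "camera_id": "entrance-01",
    "purpose": "security of the staff entrance",    # documented purpose (Article 5(2))
    "audio_recording": False,                       # not needed for the stated purpose
    "facial_recognition": False,                    # not needed for the stated purpose
    "recording_window": (time(19, 0), time(7, 0)),  # record outside opening hours only
    "field_of_view": "staff entrance only, public pavement masked",
    "retention_days": 3,                            # keep footage only as long as necessary
}

def recording_allowed(now: time, config: dict = CAMERA_CONFIG) -> bool:
    """Return True only when the current time falls inside the recording window."""
    start, end = config["recording_window"]
    if start <= end:
        return start <= now <= end
    return now >= start or now <= end  # window spans midnight

print(recording_allowed(time(22, 30)))  # True – inside the overnight window
print(recording_allowed(time(12, 0)))   # False – recording not necessary during the day
```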

    In some exceptional cases the data controller may rely on the consent of an individual as their lawful basis. However, in order to be valid, consent must be freely given, specific, informed and unambiguous. Power imbalances, such as those between an employee and an employer, are likely to negate consent.

    Particular care must be taken where special category data is being recorded (for example, facial recognition via biometric data might fall within this ambit). In order to process this more sensitive type of information you are likely to have to rely on the consent of the individual. If you are capturing and analysing the image of anyone who has not properly consented, this will be a breach.

    The EDPB also provides some helpful examples of ways to protect processed data – compartmentalising it during storage and transmission, using an integrity code, prohibiting external access and storing raw data on a different platform to biometric templates.
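    By way of illustration, the sketch below shows one of those measures, an integrity code computed as an HMAC over stored footage, alongside compartmentalised storage locations. The key handling and paths are illustrative assumptions; in practice the key would come from a proper key-management system.

```python
import hashlib
import hmac

# Illustrative only: a real deployment would fetch this from a key-management system.
INTEGRITY_KEY = b"replace-with-a-securely-managed-secret"

# Compartmentalised storage: raw footage and derived biometric templates live on
# separate stores so a single compromise does not expose both (paths are illustrative).
RAW_FOOTAGE_STORE = "/srv/video/raw/"
BIOMETRIC_TEMPLATE_STORE = "/srv/video/templates/"

def integrity_code(footage: bytes) -> str:
    """Compute an HMAC-SHA256 over the footage so later tampering can be detected."""
    return hmac.new(INTEGRITY_KEY, footage, hashlib.sha256).hexdigest()

def verify_integrity(footage: bytes, stored_code: str) -> bool:
    """Check the stored code before the footage is relied upon (e.g. as evidence)."""
    return hmac.compare_digest(integrity_code(footage), stored_code)

clip = b"...raw video bytes..."
code = integrity_code(clip)
assert verify_integrity(clip, code)                    # untouched footage verifies
assert not verify_integrity(clip + b"tampered", code)  # any modification is detected
```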

    Why is this important?

    The guidelines published by the EDPB provide greater clarity on the application of the rules on video recording. The examples given are helpful in terms of demonstrating what data controllers need to be considering. Above all, the guidelines emphasise that every situation needs to be considered on its own merits. Now would be a good time for businesses to start assessing (or re-assessing) their practices to ensure that they are working towards the required standards.

    Any practical tips? If you want to use the footage from a video device, ensure that you can justify it with an appropriate legal basis. Only use the video device in the areas and at the times necessary. Provide clear signs which explain to data subjects why they are being recorded and make sure that detailed information on the use of the video devices is available.

    Finally, keep an eye out for any updates to the EDPB guidelines following the close of the consultation – there will likely be some fine tuning. Assessments that involve subjective considerations like the reasonable expectations of a data subject are always going to be difficult to interpret, so hopefully more examples to expand our understanding of this concept will follow.

    EDPB adopts final version of guidelines on territorial scope of the GDPR

    On November 14, 2019, the EDPB adopted a final version of Guidelines 3/2018 on the territorial scope of the GDPR (Art. 3). This takes into account the contributions and feedback that the EDPB received during a public consultation on a draft version of the guidelines (see here).

    The draft version of the guidelines raised many questions, which the final version aims to address by clarifying that:

    1. Article 3 determines whether the GDPR applies to a processing activity (i.e., a purpose for which personal data is processed) and not whether a controller or processor is subject to the GDPR.
    2. The mere presence of an employee in the EU does not suffice to trigger the application of the GDPR; in order for the processing activity to be covered by the GDPR, it must be carried out in the context of the activities of the EU-based employee.
    3. A processing activity will not fall outside the scope of Art. 3(1) simply because the controller in the EU instructs a non-EU processor to carry out the processing.
    4. The inadvertent or incidental offering of goods or services to individuals in the EU (e.g., to non-EU individuals travelling in the EU) does not trigger the application of the GDPR.
    5. The provision of certain targeted advertisement to individuals in the EU, for example, on the basis of their location, triggers the application of the GDPR because it qualifies as monitoring the behavior of individuals in the EU (Art. 3(2)(b)).
    6. A non-EU processor that targets or monitors the behavior of EU individuals on behalf of a controller (e.g., carries out a social media campaign targeting individuals in the EU or stores data from an app offered to individuals in the EU) must comply with the GDPR with regard to that processing.
    7. Where requested by the Supervisory Authority, the GDPR representative (Art. 27) must be able to produce a copy of the record of processing operations.
    8. The GDPR representative can only be held directly liable for violations of its own obligations, such as under Art. 30 and 58(1)(a) (record of processing operations and cooperation with Supervisory Authorities) and not for violations committed by the non-EU controller or processor.
    9. The GDPR representative of a non-EU controller apparently cannot be a processor of that controller. The EDPB does not explain why this is. Given the limited direct liability of the representative, it is unclear why a processor could not be the liaison of a controller and the “addressee” of corrective measures imposed on the controller.

    The EDPB will increase international cooperation in order to facilitate enforcement against controllers and processors established in third countries and against international organisations (Art. 50).

    Are SMEs leaving the door ajar allowing cyber-criminals to sneak in?

    It has been well publicised that cyber-attacks have increased dramatically in recent years and that small and medium-sized enterprises (“SMEs”) in particular are vulnerable to such attacks.

    As a consequence, there has been an upward trend in SMEs looking to place blame on their (usually former) directors following cyber-attacks or IT failure. In certain circumstances, directors can find themselves facing a claim if they have failed to ensure that appropriate IT security measures were in place or adhered to where this has seemingly left the door open for cyber criminals. Such claims are usually covered under a directors’ and officers’ insurance policy.

    Directors of SMEs are often responsible for a number of different areas of their business. Just because a company faces a cyber-attack or IT failure does not mean that its directors can be held responsible. Cyber-criminals use sophisticated techniques to hack even the largest businesses with the most secure networks. It is crucial that directors of SMEs are aware of IT/cyber risks and the importance of implementing and enforcing measures to keep their business secure.

    Exposures for SMEs

    The Cyber Security Breaches Survey 2018 measured how organisations in the UK approach cyber security and the impact of breaches. The directors or senior management in 74% of businesses with fewer than 50 staff (“Micro businesses”) said that cyber security is a high priority, with 42% identifying at least one breach or attack in the last 12 months; in 17% of cases it took these businesses a day or more to recover from the breach.

    Micro businesses are less likely than medium/large businesses to have: 1) sought any information, advice or guidance about cyber security (58% vs. 79%); 2) formal cyber security policies (26% vs. 62%); and 3) any cyber security training (19% vs. 47%).

    SMEs, by their very nature, often do not benefit from large corporate structures with dedicated teams to manage cyber risks. Cyber security will often form a small part of an SME director’s responsibilities. Directors may not realise the sheer amount of information their business holds which would be attractive to cyber criminals. This will often include sensitive customer information.

    Common reasons for not having a cyber-security policy in place are: the cost of implementing and enforcing a cyber-security policy; not being fully aware of the risks; not having in-house IT advice; and not prioritising the review or updating of security protocols and programs.

    Among the Micro businesses that identified at least one breach or attack in the last 12 months: 36% needed to implement new measures to prevent or protect against future breaches; 31% used additional staff time to deal with the breaches; 27% said that breaches stopped staff carrying out day-to-day work; and 21% said that breaches incurred further recovery or repair costs.

    Potential exposure for directors

    SMEs are less resilient to the reputational and financial damage which can result from cyber-attacks or IT failures. The obligation to protect against cyber risks does not mean directors are guaranteeing no attack will succeed, but if directors ignore the risks then they could be exposed to liability for failing to fulfil their duties to the business.

    The rules governing directors are contained within case law and within specific legislation but the Companies Act 2006 (“CA 2006”) sets out what are described as a director’s general duties within sections 171-177.

    The statutory duties under S.172 (to promote the success of the company) and S.174 (to exercise reasonable care, skill and diligence) of the CA 2006 are commonly the focus of claims against directors in this context. These duties are arguably the most fundamental in that a director must act in the way he or she considers would be most likely to promote the success of the company for the benefit of its members as a whole, and will be held to the standard of the ‘reasonable director’, who is assumed to have the knowledge, skill and experience to be expected of a director in that role.

    A director will need to be able to demonstrate that they have considered what IT security measures may be required by the company. There will undoubtedly be a cost benefit analysis that will take place in relation to some solutions. Directors should ensure that any such discussions are documented (for example, in board minutes). When considering the level of cyber security that would be appropriate, this will very much depend on the size and type of company. Directors should also consider the type of information being held.

    Conclusion

    The mere existence of a cyber-attack or IT failure does not automatically mean there has been any wrongdoing on the part of a director. However, directors may be exposed to criticism if they are unable to demonstrate that cyber/IT security has been properly considered.

    For businesses handling significant levels of data, it should be part of insurers’ underwriting processes to understand whether cyber and IT security has been considered by the directors, as such claims against a director are often covered by a standard SME D&O policy.

     A look at EU Fines and Trends 

    Top 10: Countries with the highest sum of fines

    Country – Total Sum of Fines [€] – Total Number of Fines
    UNITED KINGDOM – 314,990,200 – 2
    FRANCE – 51,100,000 – 5
    AUSTRIA – 18,070,100 – 8
    GERMANY – 14,954,925 – 13
    BULGARIA – 3,145,210 – 8
    THE NETHERLANDS – 1,360,000 – 2
    POLAND – 933,648 – 5
    SPAIN – 841,400 – 20
    GREECE – 550,000 – 3
    PORTUGAL – 424,000 – 4

    Top 10: Countries with the highest number of fines

    Country – Total Sum of Fines [€] – Total Number of Fines
    SPAIN – 841,400 – 20
    GERMANY – 14,954,925 – 13
    CZECH REPUBLIC – 19,070 – 11
    HUNGARY – 165,691 – 10
    ROMANIA – 342,500 – 9
    AUSTRIA – 18,070,100 – 8
    BULGARIA – 3,145,210 – 8
    SLOVAKIA – 90,000 – 6
    FRANCE – 51,100,000 – 5
    POLAND – 933,648 – 5
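    To put the “Spain is the most proactive” point in context, the short sketch below computes the average fine per country from the first table: Spain issues by far the most fines but at comparatively low amounts, while the UK total is driven by just two very large fines. The figures are simply those listed above.

```python
# (total sum of fines in EUR, number of fines) per country, taken from the table above.
fines = {
    "United Kingdom": (314_990_200, 2),
    "France": (51_100_000, 5),
    "Austria": (18_070_100, 8),
    "Germany": (14_954_925, 13),
    "Bulgaria": (3_145_210, 8),
    "The Netherlands": (1_360_000, 2),
    "Poland": (933_648, 5),
    "Spain": (841_400, 20),
    "Greece": (550_000, 3),
    "Portugal": (424_000, 4),
}

# Average fine per country, highest first.
for country, (total, count) in sorted(fines.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{country}: {count} fines, average ~EUR {total / count:,.0f}")
```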

