GA-Alliance

Data Protection & Cybersecurity


GA-Alliance specializes in privacy, data protection and cybersecurity, assisting clients in navigating complex industry regulations and managing cybersecurity risks effectively.
We support clients with multi-disciplinary teams comprising both legal professionals and information technology experts.

Personal Data Protection Legal Advice
Our team provides legal advice on personal data protection, assisting clients in interpreting and adhering to national and international privacy regulations.

Our legal services encompass:

  • Privacy audits, gap analysis and risk assessments
  • Creation and implementation of all acts and documents required under applicable regulations
  • Drafting of data protection policies and procedures
  • Assistance in the management of data breaches and notification to the relevant authorities

Our professionals also serve as DPOs (Data Protection Officers) for institutional investors and leading companies, both domestically and internationally, spanning diverse sectors.

Cybersecurity Services
We provide specialised legal advice for the prevention and management of cybersecurity incidents, enabling clients to comprehend and mitigate cybersecurity risks to protect their businesses and data.

Our legal services include:

  • Cybersecurity audits and vulnerability assessments
  • Advice in drafting and negotiating cybersecurity contracts
  • Assistance in the management of security incidents and responding to cyber attacks

With our extensive experience and expertise in the fields of data protection and cybersecurity, we deliver comprehensive and reliable legal support to address challenges associated with digital information management and cybersecurity effectively.

Our experts


Insights


News

Lahore, Jan 30 2026

GA-Alliance lands in Pakistan
Press release

GA-Alliance lands in Pakistan: strategic partnership signed with Axis Law Chambers

MILAN – 29 January 2026

GA-Alliance, a global legal and tax firm with more than 2,600 professionals in 80 countries, announces its entry into the Pakistani market. The strategic partnership with Axis Law Chambers, a leading full‑service law firm in the region, marks a further expansion of GA‑Alliance’s network, which today covers geographies that generate nearly 90% of global GDP.

The agreement strengthens GA‑Alliance’s commitment to its “one‑stop‑shop” strategy. By integrating local expertise with the highest global standards, the Alliance offers clients a single, efficient access point for all legal and tax needs. This model removes the complexities of managing multiple advisers across different jurisdictions, delivering a coordinated and seamless experience that prioritizes clarity and business growth.

Axis Law Chambers brings to the Alliance a reputation for excellence, particularly in high‑value cross‑border mandates and advice on complex regulatory matters. Regularly listed by Chambers and Partners and The Legal 500, Axis Law stands out for its transactional work in corporate matters, mergers and acquisitions (M&A), employment law, intellectual property, foreign investment, public‑private partnerships, corporate governance, antitrust, tax, data protection and sectoral compliance. The firm advises clients in key industries such as energy, oil & gas, mining, healthcare, telecommunications, automotive, financial services, defense, retail, manufacturing, agriculture, media, IT, logistics, real estate and non‑profit organizations.

Axis Law also boasts one of Pakistan’s most authoritative dispute resolution practices, including litigation and international arbitration, with solid experience in proceedings before ICSID (International Centre for Settlement of Investment Disputes, based in Washington, D.C., and part of the World Bank), ICC (International Chamber of Commerce, based in Paris) and LCIA (London Court of International Arbitration, based in London). This depth of expertise ensures GA‑Alliance clients receive top‑level support in the world’s fifth most populous country, one of the most dynamic economies in Asia.

Francesco Sciaudone, Managing Partner of GA‑Alliance, emphasized the strategic importance of the operation: “Our entry into Pakistan through the partnership with Axis Law Chambers is another step that strengthens our global growth path. At GA‑Alliance, the goal is to simplify complexity for our clients. By extending our ‘one‑stop‑shop’ model to an outstanding Pakistani firm, we are increasingly able to offer our clients the ability to operate with confidence in a very large number of markets worldwide. We are not only expanding our geographic presence; we are enhancing a sophisticated ecosystem where international best practices and precision meet local market leadership to meet clients’ needs in a simple, direct and highly efficient way.”


About GA‑Alliance

With more than 2,600 professionals in 80 countries, GA‑Alliance is a global legal and tax firm with deep European roots, combining a strong legal tradition with a broad international presence. Founded on principles of excellence and innovation, GA‑Alliance offers integrated, multidisciplinary expertise and positions itself as a strategic partner to promote sustainable growth in an ever‑evolving regulatory environment.


About Axis Law Chambers

Axis Law Chambers is a leading Pakistani law firm recognized for excellence in corporate and transactional advice and in resolving commercial disputes. With a team of over 30 professionals and seven partners, the firm assists national and multinational clients in high‑impact transactions, regulatory compliance and complex dispute resolution matters, including international arbitrations.


News

Jul 15 2025

GA-Alliance Welcomes Salvatore Figliuolo as New Partner

GA-Alliance Launches Cybersecurity and Digital Compliance Practice and Welcomes Salvatore Figliuolo as New Partner

GA-Alliance, the leading law firm renowned for its innovative and client-centric global legal services, is pleased to announce the establishment of a new Cybersecurity and Digital Compliance Desk. This strategic initiative - designed to support public and private clients, both domestically and internationally, in preventing digital risks, managing data, and ensuring compliance with Italian and international regulations - underscores GA's relentless commitment to providing outstanding and thorough legal support to help businesses navigate the rising challenges of the digital era.

The new practice will be led by Mr. Salvatore Figliuolo, an experienced lawyer who will join GA as a new partner. Mr. Figliuolo has gained extensive exposure to technology law and cybersecurity, both in Italy and abroad, coupled with significant managerial roles in Generative AI companies.

GA recognizes the importance of moving from a reactive assistance approach to a more proactive one, aimed at strengthening clients' digital resilience. As a result, the new desk will offer joint legal and technical support, thanks in part to a collaboration with Visibily, a managed security service provider (MSSP) specialized in advanced enterprise solutions. This unique desk will provide integrated legal and technological services, including:

  1. Digital risk and vulnerability analysis;
  2. Review and set up of internal policies and data management protocols;
  3. Training and staff awareness on security and privacy issues;
  4. Ongoing assistance during inspections, data breach situations, and digitalization projects;
  5. Management of relationships with authorities (e.g., Police, Data Protection, Cybersecurity, European Authorities).

With this new initiative, GA-Alliance reaffirms its commitment to supporting clients through their digital evolution with a practical, multidisciplinary, and prevention-oriented approach. The team will also leverage the existing expertise within the law firm, particularly in privacy and administrative law.

Francesco Sciaudone, Managing Partner of GA-Alliance, commented: "The arrival of Salvatore Figliuolo and the launch of the Cybersecurity and Digital Compliance desk represent a natural evolution in GA's growth toward an increasingly sophisticated professional services market. In an environment where companies are increasingly exposed to digital risks and to evolving, complex regulations, we believe it is essential to offer clients comprehensive and integrated support – both domestically and internationally – combining legal expertise with technological solutions. The cooperation with a sophisticated technical partner and the synergies among our internal desks further strengthen our ability to support our clients promptly, concretely and strategically in facing these new digital challenges."


Knowledge Management

Jul 23 2024

EU Alert - Data, IP and Privacy

This newsletter provides a selection of opinions and analysis from our EU legal experts on interesting policy developments, recent case law and new regulatory directions of major industry practices. It is released biweekly and covers areas such as: Competition Law, Sanctions, Trade, Energy, Finance, EU funds, Data, IP and Privacy, Life Sciences, Transport and Court of Justice of the European Union news.

The aim is to provide an up–to–date tool for quick and easy consultation on the most current and important topics at EU level.

EUROPEAN COMMISSION (EC)

The European Commission designates adult content platform XNXX as a Very Large Online Platform under the Digital Services Act (10.07.2024) – The Commission has formally designated XNXX as a Very Large Online Platform (VLOP) under the Digital Services Act (DSA). Therefore, XNXX will have to comply with the most stringent rules under the DSA within four months of its notification. Such obligations include adopting specific measures to empower and protect users online, to prevent minors from accessing pornographic content online, including with age-verification tools, to provide researchers with access to publicly available data, and to publish a repository of ads.

The European Commission publishes the second report on the State of the Digital Decade (02.07.2024) – The European Commission has published the second report on the State of the Digital Decade, providing a comprehensive overview of the progress made in the quest to achieve the digital objectives and targets set for 2030 by the Digital Decade Policy Programme (DDPP). This year, for the first time, the report is accompanied by an analysis of the national Digital Decade strategic roadmaps presented by Member States, detailing the planned national measures, actions and funding to contribute to the EU's digital transformation.


Knowledge Management

Jul 16 2024

EU AI Act - General Purpose AI Rules

Artificial Intelligence Act: fostering responsible AI development in Europe

Overview

The Artificial Intelligence Act (“AI Act”) is set to be published in the EU’s Official Journal soon, following the final approval of the Council of the EU on 21 May 2024. This landmark legislation aims to establish a regulatory framework for Artificial Intelligence (“AI”) across the European Union, promoting the trustworthy and ethical development, deployment and use of AI technologies. The new rules will enter into force twenty days after publication, with obligations then phased in gradually over three years; more specifically:

  • Bans on prohibited practices, which will apply six months after entry into force;
  • Codes of practice, which will apply nine months after entry into force;
  • General-purpose AI rules, including governance, which will apply 12 months after entry into force;
  • Obligations for high-risk systems, which will apply 36 months after entry into force.

The Significance of Codes of Practice

One crucial aspect of the AI Act involves the creation of Codes of Practice for General-Purpose AI (“GPAI”) models. These codes are fundamental for bridging the gap between the high-level requirements outlined in the AI Act for GPAI providers and the practical implementation of those requirements. In essence, they serve as a detailed roadmap for ensuring compliance with the principles enshrined in the new Regulation.

Concerns Regarding Stakeholder Involvement

On 8th July 2024, certain Members of the European Parliament (“MEPs”) expressed their concerns in a letter sent to the EU's AI Office, urging it to include civil society in the drafting of rules for powerful AI models. In particular, they argued against the European Commission's initial approach, which reportedly proposed allowing AI model providers to take the lead in drafting the codes, with civil society organizations (“CSOs”) playing a more limited consultative role.

MEPs expressed apprehension that such an approach could result in codes that prioritize industry interests over broader societal concerns. They advocate for an inclusive process that actively engages a diverse range of stakeholders, including:

  • Companies, as input from the AI development and deployment sectors is crucial for ensuring the codes are practical and workable.
  • Civil Society Organizations, which bring valuable perspectives on ethical considerations, potential biases, and the impact of AI on fundamental rights.
  • Academia, with researchers and experts offering insights into the latest advancements in AI technology and potential risks.
  • Other Stakeholders, considering that a diverse range of voices can contribute to well-rounded and comprehensive codes.

At the same time, civil society members highlight the risk that large technology companies could end up writing their own rules, potentially undermining the AI Act's goal of establishing equal and globally influential standards for GPAI development.

Looking Forward

The European Commission has acknowledged the need for clarity on stakeholders’ involvement. Details regarding the participation of CSOs and other stakeholders are expected to be included in a forthcoming call for expressions of interest. An external firm will be responsible for leading the drafting process, with the AI Office maintaining oversight and approving the final versions of the codes.

The coming months will be crucial in determining how the EU navigates stakeholders’ involvement in crafting the AI Act's Codes of Practice. A transparent and inclusive process will be essential for establishing strong, effective, and ethically sound standards for trustworthy AI development across Europe.


Knowledge Management

Jun 26 2024

EU Alert - Data, IP and Privacy


COUNCIL OF THE EUROPEAN UNION (COUNCIL)

Data protection: Council agrees position on GDPR enforcement rules (13.06.2024) – The Council has reached agreement on a common member states’ position on a new law that will improve cooperation between national data protection authorities when they enforce the General Data Protection Regulation (GDPR). The Council position maintains the general thrust of the proposal but amends the draft regulation as regards clearer timelines, enhanced and more efficient cooperation, and an early-resolution mechanism.


Knowledge Management

Jun 25 2024

Alert - Data Protection and Cybersecurity

The recent EDPB opinion on the use of the “Pay or Consent” system

When browsing on the websites of some newspapers or other online platforms, users are increasingly faced with a choice: subscribe to the service or consent to the use of their data for receiving customized advertising content.

The “Pay or Consent” system is a widespread practice but not necessarily lawful. This issue was addressed in the EDPB's recent opinion (Opinion 08/2024 of 17 April 2024), which was requested by some national Data Protection Authorities (Netherlands, Norway and Germany) and, moreover, also takes into account the ruling of the Court of Justice of the European Union in case C-252/21, in which Meta was involved.

One of the primary legal concerns regarding the “Pay or Consent” system is undoubtedly the validity of consent, which, under the GDPR, must be, inter alia, freely given. According to the EDPB, a user's freedom of choice also depends on the options available to him or her, and it is hard to imagine that platforms can obtain freely given consent if the choice is limited to paying a subscription or “paying” with one's data.

The EDPB argues that Data controllers should not merely offer a single alternative involving payment alongside a service that includes the processing of data for behavioural advertising purposes. Instead, they should consider providing data subjects with a third “equivalent alternative” that does not entail paying a fee. This would enable users to make a genuine choice; in fact, if users can access the service both free of charge and also without necessarily having to consent to behavioural advertising, it can be inferred that those who consent to profiling do so freely and knowingly, rather than opting for the only (seemingly) free alternative.

Another critical factor affecting the validity of the consent given is the power imbalance between a proprietor holding a significant market position (e.g. a newspaper) and the data subject (the website user). This imbalance is heightened by the type of service and the fact that it can be said to be “essential” for users. In essence, in “Pay or Consent” systems, users who neither wish to pay nor consent to having their data processed for behavioural advertising may be forced to forgo essential services such as staying informed.

While the freedom of the consent given is undoubtedly the most controversial aspect in this context, all other conditions for consent to be deemed lawful must also be met: it must be free, specific, informed, unambiguous and obtained in a manner that is clear and comprehensible to the data subject. Furthermore, data controllers must of course adhere to the other principles of the GDPR when processing personal data.

For more information:

EDPB, 'Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms', 17 April 2024.

AI Act: new rules on Artificial Intelligence in Europe

On 21 May 2024, the final text of the AI Act was approved.

It is the world's first regulation on artificial intelligence, whose goal is to promote the development and adoption of safe and reliable AI systems within the European single market, ensuring they operate in accordance with the fundamental freedoms and rights of citizens, including the right to protection of personal data.

The AI Act aims to stimulate new investments and foster innovation in artificial intelligence, a sector now deemed crucial not only in the European market but also globally, recognized as a strategically important element across various domains and for many stakeholders.

First of all, the AI Act classifies AI systems based on their risk to the rights and freedoms of stakeholders. AI systems deemed to present a “limited risk” are subject to less stringent transparency obligations. In contrast, “high-risk” AI systems must comply with a number of specific requirements and face more stringent transparency obligations, except for AI systems authorised by law and used to ascertain, prevent, investigate or prosecute crimes.

In addition, the AI Act introduces a number of prohibitions regarding certain practices considered to have an element of risk, including:

  • adoption of techniques that manipulate individual cognition and behaviour;
  • random collection of biometric data from public spaces, the Internet or via video surveillance systems;
  • use of emotion recognition systems in workplace or educational settings;
  • implementation of social scoring systems;
  • biometric processing to infer data belonging to special categories;
  • use of predictive policing systems targeting specific individuals.

The AI Act also establishes new regulation for so-called foundation models, i.e. AI-based computational systems (e.g. ChatGPT) used for various activities and purposes, such as generating videos, texts and images, speech-language conversations, calculations and more.

In connection with such systems, the AI Act requires providers to conduct impact assessments on high-risk systems or systems used in the banking and insurance sector. Other obligations imposed on these system providers include the obligation to conduct tests to try to resolve systemic risks and to take measures aimed at ensuring an adequate level of security of the hardware and software infrastructure.

Another aspect involves implementing measures to foster the development and adoption of safe AI. In this respect, the competent authorities of each Member State are required to establish regulatory testing spaces dedicated to AI to ensure a controlled environment that fosters innovation and facilitates the development, training, testing and validation of AI systems.

In this context, it is crucial that AI systems undergo testing before being placed on the market. Therefore, tests must be conducted under real conditions, i.e. on the basis of existing data provided by subjects who have consented to their data being processed for AI regulatory testing purposes.

In any case, there is a strong emphasis on safeguarding the right to personal data protection, since the collection of data by AI systems poses a high risk to the rights of data subjects if protective measures are not taken or action is not taken to mitigate the risk of unlawful processing.

The AI Act therefore establishes specific rules to protect individuals whose data are processed, especially with regard to special categories of data (as in the case of “high-risk” AI systems that utilize datasets to train, test and validate AI-based learning models).

Providers of such systems are allowed to process data belonging to special categories, provided that: a) the use of such data is limited and appropriate security measures are taken; b) appropriate measures are taken to protect data and implement adequate safeguards; c) data are not disclosed or notified to third parties; and d) data are deleted once the purpose of the processing has been achieved.

Finally, the AI Act includes sanctions for those who violate its provisions. Depending on the nature of the breach, the competent authority can impose a penalty up to a maximum amount (ranging from EUR 7.5 million to EUR 35 million) or, if the offender is a company, up to a percentage of the total annual worldwide turnover for the preceding year, whichever is greater. However, SMEs and start-ups will benefit from reduced penalties.
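The "whichever is greater" mechanism described above can be illustrated with a short sketch. This is not part of the Regulation's text; the 7% / EUR 35 million tier used in the example is assumed for illustration only, and the applicable tier always depends on the nature of the breach under the final text of the AI Act.

```python
# Illustrative sketch (assumption, not the Regulation's text): the AI Act
# caps fines at a fixed amount OR a share of worldwide annual turnover,
# whichever is greater.

def max_penalty(fixed_cap_eur: float, turnover_pct: float,
                worldwide_turnover_eur: float) -> float:
    """Return the applicable maximum fine: the fixed cap or the percentage
    of total annual worldwide turnover, whichever is greater."""
    return max(fixed_cap_eur, turnover_pct * worldwide_turnover_eur)

# Example: a company with EUR 2 billion turnover, breach in an assumed
# 7% / EUR 35 million tier -> the turnover-based amount prevails.
cap = max_penalty(35_000_000, 0.07, 2_000_000_000)
print(int(cap))  # 140000000
```

For a smaller company (say EUR 100 million turnover), 7% of turnover is EUR 7 million, so the fixed EUR 35 million cap would be the greater amount instead.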

For more information: Artificial Intelligence Act

The Privacy Authority has sanctioned a municipality for unlawful video surveillance and violations of employee privacy

The Data Protection Authority has sanctioned a municipality for improperly processing personal data through video surveillance, prompted by an employee’s complaint regarding a camera placed near the time-clock, the only tool used to record employees' working hours. The municipality had used the images to address alleged breaches of the employee’s official duties, such as not adhering to her working hours, and, when questioned by the Privacy Authority, cited security concerns to justify the camera’s presence. However, the Privacy Authority determined that the municipality had failed to comply with the remote-oversight procedures and had used the footage for disciplinary purposes. Moreover, the municipality had not provided adequate information to workers and visitors regarding the camera’s processing of personal data.

Although the fine imposed on the municipality was modest, the incident underscores the critical nature of this issue, which requires careful handling by companies.

For more details: provision dated 11 April 2024 (10013356)

Telemarketing: the Privacy Authority fined two energy operators 100,000 Euro each

The Personal Data Protection Authority has imposed fines of EUR 100,000 each on two energy operators for unlawful processing of personal data. The sanctions were prompted by two complaints and 56 reports from users who received unsolicited phone calls and unauthorized activations of energy contracts. Investigations revealed that the calls, made without the consent of the individuals involved, primarily targeted users listed in the Public Opposition Register (PRo). Call centres, after acquiring users' contacts from third-party companies and agents or intermediaries, illegally contacted those users, many of whom subsequently signed supply contracts. The Authority also ordered the call centres to implement appropriate technical, organisational and monitoring measures to ensure compliance with privacy laws when processing the personal data of the persons involved.

For more details: provision of 11 April 2024 (10008076)

Video surveillance with facial recognition in Rome: the Privacy Authority initiates an investigation

The Personal Data Protection Authority has launched an inquiry into a project involving video surveillance with facial recognition at metro stations in Rome. According to press reports, in preparation for the Jubilee, the Administration of “Roma Capitale” intends to deploy cameras equipped with facial recognition technology to detect “disruptive actions” in the metro by those who have committed “non-compliant acts” in the past. The Authority therefore requested the Administration of Roma Capitale to provide a technical description of the facial recognition technology, its purpose, the legal basis for processing biometric data, and a copy of the data protection impact assessment. The Administration was granted a 15-day deadline to respond. The Authority also recalled that a moratorium is currently in effect until 2025 on the use of video surveillance systems with facial recognition technology, in public places or places accessible to the public, by public authorities or private entities. Only judicial authorities, in the course of their judicial duties, and public authorities, engaged in crime prevention and suppression, may carry out such processing activities, subject to the Authority’s approval.

For further information: release dated 9 May 2024

Amendments to the Privacy Code: streamlined rules for medical, biomedical and epidemiological research

A recent amendment to the Italian Privacy Code, enacted through the conversion of Decree-Law no. 19 of 2 March 2024, brings significant changes to scientific research in the medical, biomedical and epidemiological fields. In particular, under the revised Article 110 of the Privacy Code, when obtaining the prior consent of the data subject is not possible, patients' personal data may be processed for the purposes of scientific research in the medical, biomedical and epidemiological fields, provided an ethics committee has given a favourable opinion and the guarantees outlined by the Privacy Authority are met. The requirement for prior authorisation from the Privacy Authority has therefore been replaced by compliance with the guarantees set out by the Privacy Authority in the ethical rules on the processing of data for research purposes.

The Personal Data Protection Authority will therefore have to establish general measures applicable to a plurality of projects through these ethical rules, which will undergo public consultation, also involving the scientific community in their formulation.

This reform aims to strike a balance between the need to protect personal data and fostering scientific research, which is particularly crucial in public health.

For more details: provision dated 9 May 2024 (10016146)

EDPB: Annual Report published

The annual report of the European Data Protection Board (“EDPB”) provides an overview of the board’s activities throughout the year (recommendations and best practice reports issued by the board, binding decisions, practical application of guidelines, etc.).

In 2023, the EDPB adopted two binding decisions, one of an urgent nature, and two new guidelines. In addition, the EDPB issued 37 opinions pursuant to Article 64 of the GDPR, most of them focusing on binding corporate rules and the accreditation requirements for certification bodies. Finally, the EDPB collaborated with the EDPS to release two legislative opinions.

With specific reference to binding decisions, notable mentions include:

  • Binding Decision 1/2023, by which the EDPB resolved a dispute concerning data transfers by Meta Platforms Ireland Limited.
  • Binding Decision 2/2023, by which the EDPB resolved a dispute concerning the processing of data of users aged between 13 and 17 years by TikTok Technology Limited, highlighting issues with registration and video posting pop-ups not offering objective and neutral options to the user. 

In addition, the EDPB published the following guidelines in 2023:

  • Guideline 03/2022, issued on 14 February 2023, on Misleading Design Patterns in Social Media Platforms, aiming to provide recommendations and practical guidance to social media providers on how to identify and eliminate misleading designs in social media platforms.
  • Guidelines 05/2022 on the use of Facial Recognition Technology (FRT) in law enforcement. The guidelines provide relevant information for European and national legislators, as well as law enforcement authorities on implementing and using such FRT systems.

For further information: EDPB Annual Report

EDPB's opinion on the use of facial recognition technologies by airport operators

In late May, the EDPB issued an opinion (Opinion No. 11/2024) on the use of facial recognition technologies to streamline passenger flow and the storage of biometric data by airport operators.

The opinion, in particular, examines the compatibility of such practices with:

  • the principle of data retention limitation (Art. 5(1)(e) GDPR),
  • the principle of integrity and confidentiality (Article 5(1)(f) GDPR),
  • data protection by design and by default (Article 25 GDPR),
  • security of processing (Article 32, GDPR).

As a preliminary remark, the EDPB highlights that there is no uniform EU legal obligation for airport operators and airlines to verify the name on a passenger's boarding pass against their identity document. However, any such obligation may be governed by national law. Therefore, in the absence of national requirements for identity verification of passengers, biometric data cannot be used for recognition purposes, as this would entail excessive processing of personal data.

That said, the EDPB evaluated the compliance of the biometric data processing of passengers in four distinct scenarios.

  1. Data stored exclusively on passengers' personal devices

In this scenario, biometric data reside solely on passengers' personal devices, under their exclusive control, and are used for authentication at various airport checkpoints. This approach could align with GDPR requirements, provided that adequate security measures are implemented and that no alternative, less intrusive solutions are available.

  2. Centralized data storage at the airport with passengers holding access keys

In the second scenario, biometric data are centrally stored at the airport in encrypted form, with the decryption key held exclusively by passengers. The EDPB acknowledges that centralised storage poses risks, but these can be mitigated with appropriate security measures, ensuring GDPR compliance, provided that the storage period is justified and limited to the minimum necessary.

  3. Centralized data storage controlled by airport operators

Another scenario examined by the EDPB involves central storage of biometric data under the control of airport operators, enabling passenger identification, for a maximum period of 48 hours. According to the EDPB, this approach is incompatible with the GDPR, as centralization poses high risks to passengers' fundamental rights in the event of a data breach.

  4. Cloud-based data storage controlled by airlines or their service providers

Finally, the EDPB evaluates a scenario where biometric data are stored in the cloud under the control of an airline or its service provider, facilitating passenger identification. This scenario carries substantial risks, as the data may be accessible to multiple entities, including non-EEA providers. The EDPB concludes that this scenario is incompatible with the GDPR due to the high risk of a data breach and the lack of control by passengers over their own data.

In all instances, only biometric data of passengers who actively register and provide their consent should be processed.

In conclusion, the EDPB determined that scenarios where biometric data is stored exclusively by passengers (scenario 1) or in a centralized database with decryption keys solely in the possession of users (scenario 2), if implemented with a set of recommended minimum safeguards, are the only approaches that are compatible with the above-mentioned GDPR principles and that adequately mitigate the intrusiveness of data processing while ensuring maximum control for data subjects over their personal data.

Conversely, scenarios 3 and 4 are deemed excessively intrusive by the EDPB, lacking proportionality in relation to the expected benefits. Therefore, solutions based on centralized storage at the airport or in the cloud, without passengers holding decryption keys, cannot be considered compliant with the above-mentioned principles.

For more information:

EDPB, 'Opinion 11/2024 on the use of facial recognition to streamline airport passengers' flow (compatibility with Articles 5(1)(e) and (f), 25 and 32 GDPR)', 23 May 2024.

Keep in touch!

Sign up for our newsletters!

Stay up to date on domestic and international legislative and tax news, as well as all the Firm’s events and initiatives.
