I was sitting across from a German data protection officer in Frankfurt when she said something that changed how I think about GDPR: "Article 32 isn't a checkbox—it's a philosophy. It asks you to think like an attacker, but act like a protector."
That was in 2018, just months after GDPR enforcement began. Since then, I've helped over 30 organizations across Europe and the US implement Article 32 controls, witnessed three major regulatory investigations, and learned that this single article is perhaps the most misunderstood—and most critical—piece of the entire GDPR framework.
Let me share what 15+ years in cybersecurity and six years of GDPR implementation have taught me about getting Article 32 right.
What Article 32 Actually Says (And What It Really Means)
Here's the official text, but stick with me—I'll translate the legalese:
"Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk."
When I first read this to a CTO in 2018, he looked at me and said, "That's beautifully vague. What does it actually require?"
Exactly. And that's both the challenge and the opportunity.
The Four Pillars of Article 32: What You Must Consider
Article 32 specifically mentions four measures to consider "as appropriate" (note: consider, not necessarily implement every one of them):
Security Measure | What It Means | Real-World Example |
|---|---|---|
Pseudonymization and encryption | Making data unreadable without specific keys or rendering it unlinkable to individuals | Customer database with hashed identifiers; encrypted backup storage |
Ongoing confidentiality, integrity, availability | Ensuring systems work correctly, data stays accurate, and authorized users can access it | Multi-factor authentication; integrity monitoring; redundant systems |
Ability to restore availability | Quick recovery from incidents | Tested backup systems; disaster recovery procedures; incident response plans |
Regular testing and evaluation | Proving your security actually works | Penetration testing; security audits; vulnerability assessments; tabletop exercises |
But here's what kept me up at night when I first started implementing Article 32: these four measures are examples, not requirements. The real requirement is "appropriate security based on risk."
"Article 32 doesn't tell you what to do. It tells you what to think about. And that's infinitely harder."
The Risk-Based Approach: What "Appropriate Security" Actually Means
I learned this lesson the hard way in 2019 while working with a small marketing agency in Amsterdam.
They processed email addresses and basic contact information for about 5,000 newsletter subscribers. Their CEO had read about GDPR and was convinced they needed military-grade encryption, full disk encryption on every device, and a $200,000 security infrastructure.
I had to break some news to him: "That's not appropriate security. That's overkill. And Article 32 actually requires something different—security that matches the risk."
Here's the framework I use to determine "appropriate security":
The Article 32 Risk Assessment Matrix
Factor | Questions to Ask | Impact on Security Requirements |
|---|---|---|
Nature of Data | Is it special category data? Financial data? Basic contact info? | Higher sensitivity = stronger controls |
Scope of Processing | How many data subjects? How much data? How many systems? | Larger scope = more robust controls |
Context | Who has access? Where is data stored? How is it transmitted? | Higher exposure = enhanced protections |
Purposes | Why are you processing this data? Who benefits? | Critical purposes = stronger safeguards |
Likelihood of Risk | How likely is unauthorized access or breach? | Higher probability = preventive controls |
Severity of Impact | What happens to individuals if something goes wrong? | Greater harm = comprehensive protection |
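The matrix above can be turned into a rough scoring model. To be clear, the 1-to-3 scale, the equal weighting, and the tier thresholds below are my illustrative assumptions; nothing in the GDPR prescribes a formula. A minimal sketch:

```python
# Hypothetical scoring model for the six factors in the matrix above.
# Scores: 1 = low, 2 = medium, 3 = high. The thresholds are illustrative
# assumptions, not regulatory guidance.

FACTORS = ["nature", "scope", "context", "purposes", "likelihood", "severity"]

def risk_level(scores: dict) -> str:
    """Map per-factor scores to a coarse risk tier."""
    missing = set(FACTORS) - scores.keys()
    if missing:
        raise ValueError(f"unscored factors: {sorted(missing)}")
    total = sum(scores[f] for f in FACTORS)  # ranges from 6 to 18
    if total <= 9:
        return "low"      # proportionate baseline controls
    if total <= 13:
        return "medium"   # add MFA everywhere, monitoring, regular testing
    return "high"         # encryption, SOC coverage, DPIA-driven extras

# Roughly the two cases discussed below: a newsletter service and a
# telehealth platform (scores are my illustrative estimates).
newsletter = {"nature": 1, "scope": 1, "context": 1,
              "purposes": 1, "likelihood": 2, "severity": 1}
telehealth = {"nature": 3, "scope": 3, "context": 2,
              "purposes": 3, "likelihood": 2, "severity": 3}
```

The point of a model like this is not precision; it is forcing the conversation about each factor before anyone buys a security tool.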
Let me show you how this works in practice.
Real-World Implementation: Three Stories
Case 1: The Healthcare Startup (High Risk = Maximum Security)
In 2020, I consulted for a telehealth platform processing medical consultations, prescriptions, and payment information for 50,000 patients across Europe.
Their Article 32 Implementation:
Security Domain | Specific Measures Implemented | Why This Level? |
|---|---|---|
Encryption | • AES-256 for data at rest<br>• TLS 1.3 for data in transit<br>• Field-level encryption for sensitive fields<br>• Encrypted backups with separate key management | Health data is special category under Article 9; breach could cause serious harm to individuals |
Access Control | • Multi-factor authentication mandatory<br>• Role-based access control<br>• Just-in-time privileged access<br>• Biometric authentication for mobile apps | Unauthorized access to health records could cause discrimination, embarrassment, or harm |
Monitoring | • 24/7 security operations center<br>• Real-time anomaly detection<br>• Automated threat response<br>• Quarterly penetration testing | High-value target for attackers; need to detect and respond quickly |
Availability | • 99.9% uptime SLA<br>• Multi-region redundancy<br>• Real-time replication<br>• 4-hour recovery time objective | Patients need access to medical records; unavailability could impact health outcomes |
Organizational | • Mandatory annual security training<br>• Background checks for all employees<br>• Segregation of duties<br>• Data protection impact assessments | Insider threats significant; human error major risk factor |
Cost: €340,000 initial implementation, €120,000 annually for maintenance.
Supervisory Authority Response: During a 2022 audit by the Dutch DPA, they specifically praised the "proportionate and comprehensive" security measures. No findings, no fines.
Case 2: The Newsletter Service (Low Risk = Proportionate Security)
Remember that marketing agency in Amsterdam? Here's what "appropriate security" actually looked like for them:
Their Article 32 Implementation:
Security Domain | Specific Measures Implemented | Why This Level? |
|---|---|---|
Encryption | • TLS for email transmission<br>• Encrypted database connections<br>• Basic password hashing (bcrypt) | Low sensitivity data; encryption prevents opportunistic attacks |
Access Control | • Strong passwords required<br>• Two-factor authentication for admin accounts<br>• Regular access reviews | Small team; insider risk low; focus on external threats |
Monitoring | • Basic logging enabled<br>• Weekly log reviews<br>• Cloud provider security alerts | Lower risk profile allows less intensive monitoring |
Availability | • Daily automated backups<br>• Cloud provider redundancy<br>• 24-hour recovery target | Service interruption annoying but not harmful to subscribers |
Organizational | • Annual security awareness training<br>• Clear data processing procedures<br>• Documented incident response plan | Small team needs clarity and repeatability |
Cost: €8,500 initial implementation, €2,400 annually for maintenance.
Result: Proportionate to risk, cost-effective, and fully compliant with Article 32 requirements.
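The table mentions bcrypt for password hashing. Here is a minimal sketch of the same idea (salted, deliberately slow hashing plus constant-time comparison) using only the standard library's PBKDF2 so it runs without third-party packages. The storage format and iteration count are illustrative assumptions, not a recommendation:

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 100_000) -> str:
    """Return a self-describing salted hash (algorithm$iterations$salt$digest)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"pbkdf2_sha256${iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    _algo, iterations, salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                 bytes.fromhex(salt_hex), int(iterations))
    return hmac.compare_digest(digest.hex(), digest_hex)
```

In production you would use bcrypt, scrypt, or Argon2 via a maintained library; the sketch just shows why "basic password hashing" in the table still means salt, iteration, and constant-time comparison.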
Case 3: The Financial Services Firm (The Goldilocks Approach)
In 2021, I worked with a boutique investment advisory firm in London managing portfolios for 300 high-net-worth clients.
They'd initially gone the overkill route—trying to implement every security control imaginable. After six months, they'd spent £180,000 and still weren't compliant because the complexity was overwhelming their small team.
I helped them right-size their approach:
Their Revised Article 32 Implementation:
Security Domain | What They Were Doing (Wrong) | What We Changed To (Right) | Why It Worked Better |
|---|---|---|---|
Encryption | Attempting to encrypt everything, including internal systems | Focused encryption on customer data, communications, and backups | Maintained security where it mattered; improved performance elsewhere |
Access Control | Complex zero-trust architecture they couldn't manage | Strong MFA + role-based access + quarterly reviews | Achievable security without overwhelming IT team |
Monitoring | Enterprise SIEM they didn't have staff to manage | Cloud-native security monitoring with managed detection and response | Actually got value from alerts; 24/7 coverage without 24/7 staff |
Testing | Trying to do monthly penetration tests | Quarterly external pentests + annual internal assessments | Maintained assurance without budget exhaustion |
Original Budget: £180,000 initial, £95,000 annual
Revised Budget: £65,000 initial, £42,000 annual
Compliance Status: Full Article 32 compliance with ICO validation
"The goal isn't maximum security. It's appropriate security that you can actually maintain. A complex system you can't manage is worse than a simple system you execute flawlessly."
The "State of the Art" Requirement: What's Expected in 2024-2025
Article 32 specifically mentions "taking into account the state of the art." This is crucial—what was appropriate in 2018 isn't necessarily appropriate now.
Here's what I consider baseline "state of the art" for different organization sizes:
Small Organizations (< 50 employees)
Control Category | Minimum Expected "State of the Art" |
|---|---|
Authentication | • Multi-factor authentication for all business systems<br>• Password manager for team<br>• Single sign-on where feasible |
Encryption | • TLS 1.2+ for all data transmission<br>• Encrypted backups<br>• Full disk encryption on laptops |
Access Management | • Regular access reviews (quarterly)<br>• Immediate deactivation of departing employees<br>• Principle of least privilege |
Monitoring | • Cloud provider security alerts enabled<br>• Basic logging of access to personal data<br>• Awareness of unusual activity |
Updates & Patches | • Automatic updates enabled where possible<br>• Critical patches within 30 days<br>• Regular system inventory |
Backups | • Daily automated backups<br>• Test recovery quarterly<br>• Offsite/cloud backup storage |
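The quarterly access review and leaver-deactivation rows above lend themselves to automation even in a small shop. A minimal sketch, assuming account records with a last-login date and a leaver flag (both fields and the 90-day threshold are illustrative):

```python
from datetime import date, timedelta

# Illustrative threshold: accounts idle longer than this get reviewed.
STALE_AFTER = timedelta(days=90)

def accounts_to_review(accounts, today: date) -> list:
    """Flag accounts belonging to leavers or idle beyond the threshold."""
    flagged = []
    for acct in accounts:
        if acct["left_company"] or today - acct["last_login"] > STALE_AFTER:
            flagged.append(acct["user"])
    return flagged

# Hypothetical account records for the sketch.
accounts = [
    {"user": "anna",  "last_login": date(2024, 5, 1),  "left_company": False},
    {"user": "bjorn", "last_login": date(2024, 1, 2),  "left_company": False},
    {"user": "clara", "last_login": date(2024, 4, 20), "left_company": True},
]
flagged = accounts_to_review(accounts, today=date(2024, 5, 10))
```

Even a script this simple, run quarterly with its output filed away, doubles as the "regular access reviews" evidence a supervisory authority will ask for.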
Medium Organizations (50-500 employees)
Control Category | Minimum Expected "State of the Art" |
|---|---|
Authentication | • MFA mandatory for all systems<br>• Adaptive authentication based on risk<br>• Privileged access management solution |
Encryption | • TLS 1.3 for external communications<br>• Encryption at rest for sensitive data<br>• Key management system |
Access Management | • Automated provisioning/deprovisioning<br>• Monthly access reviews<br>• Just-in-time access for privileged operations |
Monitoring | • Security information and event management (SIEM)<br>• Automated alerting and response<br>• Log retention per regulatory requirements |
Vulnerability Management | • Monthly vulnerability scanning<br>• Annual penetration testing<br>• Patch management within 14 days |
Incident Response | • Documented incident response plan<br>• Defined roles and responsibilities<br>• Annual tabletop exercises |
Large Organizations (500+ employees)
Control Category | Minimum Expected "State of the Art" |
|---|---|
Authentication | • Zero-trust architecture implementation<br>• Risk-based authentication<br>• Continuous authentication validation |
Encryption | • End-to-end encryption for sensitive data<br>• Quantum-resistant cryptography planning<br>• Hardware security modules for key storage |
Access Management | • Identity governance and administration platform<br>• Automated certification campaigns<br>• Real-time anomaly detection |
Monitoring | • 24/7 security operations center<br>• Advanced threat detection and response<br>• Security orchestration and automation |
Testing | • Continuous vulnerability assessment<br>• Quarterly penetration testing<br>• Red team exercises annually |
Recovery | • Documented business continuity plans<br>• Quarterly disaster recovery testing<br>• Multiple recovery sites/regions |
Pseudonymization: The Underrated Article 32 Superpower
Let me share something that transformed how one of my clients approached GDPR compliance.
A retail analytics company in Berlin was processing shopping behavior data for millions of customers. Every data subject access request, every deletion request, every breach notification was a nightmare involving multiple systems and databases.
Then we implemented proper pseudonymization:
Before Pseudonymization:
Customer_ID: 12345
Name: "Hans Schmidt"
Email: "[email protected]"
Purchase_History: [extensive details]
After Pseudonymization:
Pseudonym: "a8f92b3e5c1d"
Purchase_History: [same details]

The Results Were Dramatic:
Metric | Before | After | Improvement |
|---|---|---|---|
Data subject request processing time | 6-8 hours | 20 minutes | 95% reduction |
Systems requiring access for analytics | 7 systems | 1 system | Massive scope reduction |
Risk exposure in case of breach | High - direct identifiers exposed | Low - pseudonyms meaningless without key | Significantly reduced |
Compliance with data minimization | Questionable | Excellent | Full compliance |
The German DPA specifically highlighted this pseudonymization implementation as a "best practice example" during their audit.
"Pseudonymization isn't just about compliance—it's about building systems that are fundamentally more secure by design."
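A common way to implement pseudonyms like the one in the Berlin example is a keyed hash: the analytics system sees only the pseudonym, while the key that could re-link it stays with the controller. A minimal sketch, with placeholder key and field names:

```python
import hashlib
import hmac

def pseudonymize(customer_id: str, key: bytes) -> str:
    """Derive a stable pseudonym from an identifier using a keyed hash (HMAC).
    Without the key, pseudonyms cannot be linked back to customers."""
    return hmac.new(key, customer_id.encode(), hashlib.sha256).hexdigest()[:12]

KEY = b"held-by-the-controller-not-analytics"  # placeholder secret

record = {"customer_id": "12345", "name": "Hans Schmidt",
          "email": "[email protected]", "purchase_history": ["..."]}

# Only the pseudonym and the analytically useful data leave the source system.
pseudonymized = {"pseudonym": pseudonymize(record["customer_id"], KEY),
                 "purchase_history": record["purchase_history"]}
```

Note that under GDPR this is still personal data, because the key exists; the win is that a breach of the analytics system alone exposes pseudonyms, not identities.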
Regular Testing: The Article 32 Requirement Everyone Forgets
Here's a confession: In my first year implementing GDPR, I focused obsessively on getting security controls in place. Encryption? Check. Access controls? Check. Monitoring? Check.
Then came the question from a Belgian DPA auditor in 2019: "How do you know these controls actually work?"
I froze. We'd implemented everything, but we'd never actually tested whether it would hold up under attack.
The Testing Framework That Actually Works
Based on painful lessons learned, here's my recommended testing cadence:
Test Type | Frequency | What You're Testing | Typical Cost Range |
|---|---|---|---|
Vulnerability Scanning | Weekly (automated) | Known vulnerabilities in systems | €2K-15K/year for tools |
Phishing Simulations | Monthly | Employee awareness and response | €1K-5K/year |
Access Control Review | Quarterly | Who has access to what; orphaned accounts | Internal time only |
Backup Recovery Test | Quarterly | Can you actually restore from backups | Internal time only |
Tabletop Exercise | Semi-annually | Incident response procedures and team readiness | €3K-10K per exercise |
External Penetration Test | Annually | Real-world attack simulation | €15K-50K per test |
Internal Security Assessment | Annually | Comprehensive control evaluation | Internal time or €20K-40K |
Red Team Exercise | Every 2-3 years (large orgs) | Advanced persistent threat simulation | €50K-150K+ |
A Story About Testing:
In 2022, I was working with an e-commerce company that had "great security"—at least on paper. We conducted a tabletop exercise simulating a ransomware attack.
Within 15 minutes, we discovered:
The incident response plan was stored on a shared drive that would be encrypted in a real attack
Nobody knew the password to their offline backups
The "backup admin" had left the company three months earlier
Their cyber insurance policy required breach notification within 24 hours, but nobody knew how to contact the insurer
All this from a €4,000 tabletop exercise. A real ransomware attack would have cost them millions.
They fixed everything within two weeks. When they actually got hit by ransomware eight months later, they recovered in six hours with zero data loss.
That's the power of testing.
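The quarterly backup recovery test in the table above does not need to be elaborate. A minimal sketch of the core check, back up a file, restore it elsewhere, and prove the restored copy is byte-identical, using temp-dir placeholder paths:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file's contents, for before/after comparison."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_and_verify(source: Path, backup_dir: Path) -> bool:
    """Simulate a backup, then a restore, then verify integrity."""
    backup = backup_dir / source.name
    shutil.copy2(source, backup)                      # the "backup" step
    restored = backup_dir / f"restored_{source.name}"
    shutil.copy2(backup, restored)                    # the "restore" step
    return checksum(source) == checksum(restored)

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "customers.db"
    src.write_bytes(b"subscriber records ...")
    backup_dir = Path(tmp) / "backups"
    backup_dir.mkdir()
    ok = backup_and_verify(src, backup_dir)
```

A real test would restore from the actual backup medium to a clean environment, but the principle is the same: the test is the restore, not the backup.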
Organizational Measures: The Non-Technical Side of Article 32
Here's something that took me years to fully appreciate: Article 32 requires both "technical and organizational" measures.
I've seen organizations spend €500,000 on security technology while ignoring the human and process elements. They always fail audits.
Essential Organizational Measures
Organizational Measure | What It Includes | Why It Matters | Real-World Example |
|---|---|---|---|
Security Policies | Acceptable use, data handling, incident response, access management | Defines expected behavior and procedures | Written policy that employees actually read and follow |
Training & Awareness | Initial onboarding, annual refresher, role-specific training, ongoing communications | Humans are the weakest link—and the strongest defense | Monthly security tips, quarterly training, simulated phishing |
Access Management Processes | Joiner/mover/leaver procedures, access request workflow, periodic reviews | Prevents unauthorized access from process failures | Automated deprovisioning when employees leave |
Change Management | Approval process, testing requirements, rollback procedures, documentation | Prevents security gaps from system changes | All production changes require security review |
Vendor Management | Due diligence, contracts with security requirements, ongoing monitoring | Third parties can create your biggest vulnerabilities | Annual vendor security assessments |
Incident Response Plan | Defined roles, communication procedures, escalation paths, documentation requirements | Chaos during incident guarantees worse outcomes | Written plan tested quarterly |
Physical Security | Building access, workspace policies, device handling, visitor management | Digital security means nothing if someone steals the server | Badge access, clean desk policy, locked server rooms |
A Tale of Two Breaches:
I witnessed two very similar data breaches in 2023—same attack vector (phished credentials), same industry (financial services), similar data exposure.
Company A had strong technical controls but weak organizational measures:
No clear incident response plan
Employees unsure who to contact
No communication procedures
Took 36 hours to contain
€2.3 million in damages
€800,000 GDPR fine
Company B had equally strong technical controls AND solid organizational measures:
Clear incident response procedures
Everyone knew their role
Pre-approved communication templates
Contained in 4 hours
€120,000 in damages
€0 in fines (supervisory authority praised their response)
The difference? Organizational measures.
Common Article 32 Mistakes (And How to Avoid Them)
After six years of GDPR implementation, I've seen the same mistakes repeatedly:
Mistake #1: Treating Article 32 as a Checklist
What Organizations Do Wrong: "We have encryption, so we're compliant with Article 32."
Why It's Wrong: Article 32 requires risk-appropriate security. Encryption alone doesn't address availability, testing, access controls, or organizational measures.
How to Fix It: Use a comprehensive framework like ISO 27001 or SOC 2 as your foundation, then map it to Article 32 requirements.
Mistake #2: Ignoring the "State of the Art" Evolution
What Organizations Do Wrong: Implementing security measures in 2018 and never updating them.
Why It's Wrong: "State of the art" changes. What was appropriate five years ago may not be sufficient now.
How to Fix It: Annual review of security measures against current best practices and emerging threats.
Mistake #3: Over-Focusing on Technical, Ignoring Organizational
What Organizations Do Wrong: Spending €200,000 on security tools while using a shared password for admin accounts.
Why It's Wrong: Article 32 explicitly requires both technical AND organizational measures.
How to Fix It: Allocate at least 30% of your security budget to training, policies, and processes.
Mistake #4: Implementing Everything Without Risk Assessment
What Organizations Do Wrong: Trying to implement maximum security regardless of actual risk.
Why It's Wrong: Article 32 requires "appropriate to the risk"—overkill wastes resources and creates unsustainable complexity.
How to Fix It: Start with a thorough risk assessment. Let risk drive your security decisions.
Mistake #5: Forgetting About Processors
What Organizations Do Wrong: Implementing strong security internally while using processors with weak security.
Why It's Wrong: Article 32 applies to both controllers AND processors. You're responsible for ensuring processors maintain appropriate security.
How to Fix It: Include specific Article 32 requirements in processor contracts. Audit your processors regularly.
The Article 32 Documentation Playbook
Supervisory authorities don't just want to see security—they want to see proof of security. Here's what you need to document:
Document Type | What to Include | Update Frequency | Why DPAs Care |
|---|---|---|---|
Risk Assessment | • Data inventory<br>• Threat analysis<br>• Vulnerability assessment<br>• Risk treatment decisions | Annual or when major changes occur | Demonstrates you're making risk-based decisions |
Security Policy | • Technical controls implemented<br>• Organizational measures<br>• Responsibilities<br>• Review procedures | Annual review | Shows you have a coherent security strategy |
Implementation Records | • What was implemented<br>• When it was implemented<br>• Who approved it<br>• Testing results | Ongoing | Proves controls exist and work |
Training Records | • Who was trained<br>• What topics covered<br>• When training occurred<br>• Assessment results | After each training session | Demonstrates organizational measures |
Testing Reports | • Vulnerability scans<br>• Penetration test results<br>• Tabletop exercise outcomes<br>• Remediation actions | After each test | Shows you verify controls work |
Incident Response Plan | • Roles and responsibilities<br>• Communication procedures<br>• Technical response steps<br>• Escalation paths | Annual review | Required for responding to breaches |
Audit Logs | • Access to personal data<br>• System changes<br>• Security events<br>• Administrative actions | Continuous | Provides evidence of monitoring |
"Documentation isn't bureaucracy—it's insurance. When a supervisory authority asks how you protect data, you want to show them, not tell them."
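The audit-log row in the playbook above translates into something concrete: structured, append-only records of who accessed which data subject's data and why. A minimal sketch; the field set is my assumption and should be aligned with your own retention policy:

```python
import json
import logging
from datetime import datetime, timezone

audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)

def log_data_access(user: str, subject_id: str, purpose: str) -> dict:
    """Emit one structured audit record for an access to personal data."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "subject_id": subject_id,  # ideally a pseudonym, not a raw identifier
        "purpose": purpose,
        "event": "personal_data_access",
    }
    audit.info(json.dumps(event))  # JSON lines are easy to retain and search
    return event

entry = log_data_access("j.doe", "a8f92b3e5c1d", "dsar_fulfilment")
```

Routing these records to write-once storage with a defined retention period turns the table's "continuous" requirement into evidence you can hand an auditor.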
Article 32 in the Context of Other GDPR Articles
Article 32 doesn't exist in isolation. Understanding how it connects to other GDPR articles is crucial:
GDPR Article | Connection to Article 32 | Practical Implication |
|---|---|---|
Article 5 (Principles) | Integrity and confidentiality principle requires security | Your Article 32 measures must support all GDPR principles |
Article 25 (Data Protection by Design) | Security must be built in from the start | Article 32 controls should be designed into systems, not bolted on |
Article 28 (Processor Requirements) | Processors must implement Article 32 measures | Your processor contracts must specify Article 32 obligations |
Article 33 (Breach Notification) | Detection capability required for 72-hour notification | Your Article 32 monitoring must enable rapid breach detection |
Article 35 (DPIA) | High-risk processing requires additional measures | DPIA findings should drive Article 32 implementation |
What Supervisory Authorities Actually Look For
I've supported organizations through five supervisory authority audits. Here's what DPAs consistently examine:
The Article 32 Audit Checklist (From Real DPA Inspections)
Initial Request (Usually):
1. Description of technical and organizational measures implemented
2. Risk assessment documentation
3. Evidence of "state of the art" consideration
4. Testing and evaluation records

If They Dig Deeper:
5. Actual configuration of security controls (not just policies)
6. Evidence that controls are working (logs, monitoring data)
7. Incident response capabilities (through questioning or testing)
8. Processor security validation records
9. Training completion records
10. Change management records for security-relevant changes
Red Flags That Trigger Fines:
No documented risk assessment
Security measures clearly inadequate for data sensitivity
No evidence of testing or evaluation
Processor security not validated
Repeated similar incidents indicating control failures
No response to previously identified deficiencies
Costs: What Article 32 Compliance Actually Takes
Here's the question I get most: "How much will this cost?"
Based on 30+ implementations, here are realistic budget ranges:
Small Organization (< 50 people, low-risk data)
Cost Category | Initial Cost | Annual Cost |
|---|---|---|
Risk Assessment | €3,000 - €8,000 | €2,000 - €4,000 |
Tool Licensing | €2,000 - €5,000 | €3,000 - €8,000 |
Implementation | €5,000 - €15,000 | - |
Training | €1,000 - €3,000 | €1,500 - €3,000 |
Testing | €2,000 - €5,000 | €4,000 - €8,000 |
Documentation | €2,000 - €5,000 | €1,000 - €2,000 |
Consultant Support | €5,000 - €15,000 | €3,000 - €8,000 |
TOTAL | €20,000 - €56,000 | €14,500 - €33,000 |
Medium Organization (50-500 people, medium-risk data)
Cost Category | Initial Cost | Annual Cost |
|---|---|---|
Risk Assessment | €10,000 - €25,000 | €5,000 - €12,000 |
Tool Licensing | €15,000 - €40,000 | €20,000 - €60,000 |
Implementation | €50,000 - €150,000 | - |
Training | €8,000 - €20,000 | €10,000 - €25,000 |
Testing | €15,000 - €40,000 | €25,000 - €60,000 |
Documentation | €10,000 - €25,000 | €5,000 - €12,000 |
Staff Time (FTE) | 0.5 - 1.0 FTE | 0.5 - 1.0 FTE |
Consultant Support | €25,000 - €75,000 | €15,000 - €40,000 |
TOTAL | €133,000 - €375,000 | €80,000 - €209,000 |
Large Organization (500+ people, high-risk data)
Cost Category | Initial Cost | Annual Cost |
|---|---|---|
Risk Assessment | €40,000 - €100,000 | €25,000 - €60,000 |
Tool Licensing | €100,000 - €300,000 | €150,000 - €400,000 |
Implementation | €250,000 - €750,000 | - |
Training | €30,000 - €80,000 | €40,000 - €100,000 |
Testing | €50,000 - €150,000 | €80,000 - €200,000 |
Documentation | €30,000 - €75,000 | €15,000 - €40,000 |
Staff Time (FTE) | 3-5 FTE | 2-4 FTE |
SOC/Monitoring | €100,000 - €250,000 | €200,000 - €500,000 |
Consultant Support | €100,000 - €300,000 | €50,000 - €150,000 |
TOTAL | €700,000 - €2,005,000 | €560,000 - €1,450,000 |
Important Note: These are security program costs. If you're already doing some of this, costs will be lower. If you're processing special category data or operating in highly regulated industries, costs may be higher.
Your Article 32 Action Plan: Next 90 Days
If you're starting from scratch, here's a realistic implementation timeline:
Days 1-30: Assessment Phase
Week 1-2: Data and Risk Assessment
[ ] Inventory all personal data processing activities
[ ] Identify data categories and sensitivity levels
[ ] Map data flows and system dependencies
[ ] Conduct initial risk assessment
Week 3-4: Gap Analysis
[ ] Document current security measures
[ ] Compare against Article 32 requirements
[ ] Identify gaps and priorities
[ ] Estimate implementation costs
Days 31-60: Planning and Quick Wins
Week 5-6: Implementation Planning
[ ] Develop comprehensive security roadmap
[ ] Prioritize based on risk
[ ] Secure budget and resources
[ ] Engage vendors/consultants if needed
Week 7-8: Quick Wins
[ ] Enable MFA on all business systems
[ ] Implement password manager
[ ] Enable automatic security updates
[ ] Conduct security awareness training
[ ] Document incident response procedures
Days 61-90: Foundation Building
Week 9-10: Technical Controls
[ ] Implement encryption for data at rest
[ ] Ensure TLS 1.2+ for all communications
[ ] Set up centralized logging
[ ] Configure automated backups
[ ] Implement access control improvements
Week 11-12: Organizational Measures
[ ] Document security policies
[ ] Establish access request procedures
[ ] Create training program
[ ] Set up regular access reviews
[ ] Prepare for first round of testing
Final Thoughts: Article 32 as Competitive Advantage
I started this article in Frankfurt with a DPA officer's words about philosophy. Let me end with a realization that came years later.
In 2023, I was helping a SaaS company prepare for Series B fundraising. The lead investor spent 30 minutes discussing their Article 32 implementation during due diligence.
The founder told me afterward: "I thought security compliance was just a cost center. But the investor said our Article 32 program was a major factor in their decision to invest. They said most startups our size have terrible security, and it's a huge risk they have to price into their valuations."
That company raised €15 million at a valuation €3 million higher than comparable companies. The investor specifically cited their "mature security posture" as justification for the premium.
Article 32 isn't just about avoiding fines. It's about building an organization that:
Protects people's fundamental rights
Earns customer trust
Attracts investors
Operates sustainably
Sleeps better at night
Is it easy? No. Is it cheap? No. Is it worth it?
Absolutely.
"Article 32 asks one fundamental question: If someone trusted you with their personal information, did you protect it the way you'd want your own information protected? Everything else is just details."