I remember sitting across from a marketing director in London, three weeks after GDPR went into effect in May 2018. She was nearly in tears. "We've been sending newsletters to 40,000 subscribers for five years," she said. "Now our legal team is telling me we might not have a lawful basis for any of it. We could be looking at millions in fines."
She wasn't being dramatic. She was right to be worried.
After helping over 60 organizations navigate GDPR compliance across healthcare, finance, technology, and retail sectors, I can tell you that Article 6 is where most companies trip up. It's not the most complex part of GDPR—that honor goes to international data transfers—but it's the most fundamental. Get this wrong, and everything else falls apart.
"Article 6 isn't just a legal checkbox. It's the foundation of your entire data processing operation. Build on sand here, and your whole compliance program crumbles."
What Article 6 Actually Says (In Plain English)
Let me cut through the legal jargon. Article 6 establishes a simple but critical principle: you cannot process personal data unless you have a valid legal reason.
Notice I said "legal reason," not "good reason." You might have a fantastic business justification for processing someone's data. But if it doesn't fit within one of the six lawful bases GDPR provides, you can't do it. Period.
Here's what keeps me up at night: I've seen companies with billion-dollar valuations build entire business models on data processing they thought was legitimate, only to discover they had no lawful basis under GDPR. The scramble to fix it? Expensive, time-consuming, and sometimes impossible.
The Six Lawful Bases: Your Complete Toolkit
Think of these six lawful bases as the only keys that unlock the door to data processing. You need at least one, but you can only use the ones that actually fit your situation. You can't just pick your favorite.
Let me break down each one with real scenarios I've encountered:
Lawful Basis | When to Use It | Key Requirements | Main Limitations |
|---|---|---|---|
Consent | Marketing emails, cookies, optional features | Clear, specific, informed, freely given, withdrawable | Must be explicitly obtained; can be withdrawn anytime |
Contract | Delivering ordered services, processing payments | Processing must be necessary for contract performance | Can't be used for non-essential processing |
Legal Obligation | Tax records, employment law, legal holds | Required by law (not just company policy) | Only what law specifically requires |
Vital Interests | Medical emergencies, life-threatening situations | Literally life or death situations | Extremely narrow application |
Public Task | Government services, regulatory enforcement | Official public function or legal authority | Primarily for public sector |
Legitimate Interests | Fraud prevention, network security, marketing to existing customers | Balancing test required; individual rights considered | Cannot override data subject's rights; not available for public authorities |
1. Consent: The One Everyone Knows (And Usually Gets Wrong)
Here's a confession: In my first year working with GDPR, I thought consent was the universal solution. Client wants to send marketing emails? Get consent. Want to track website behavior? Get consent. Want to share data with partners? Get consent.
I was wrong. Dangerously wrong.
Real Story: The E-commerce Platform Disaster
In 2019, I consulted for an e-commerce platform processing 50,000 transactions monthly. They'd built their entire data strategy around consent. When users created accounts, they checked a pre-ticked box agreeing to "all data processing activities."
Three problems:
Pre-ticked boxes aren't valid consent under GDPR
Consent must be specific—you can't get blanket consent for everything
They were using consent for contract-essential processing (like fulfilling orders)
When we audited their practices, we discovered that if customers withdrew consent, the platform couldn't legally process their orders. They'd built a business model on quicksand.
The fix cost them €340,000 in system changes and three months of engineering time. They had to rebuild their entire data processing logic, moving contract-essential activities to the "contract" lawful basis and only using consent for truly optional marketing activities.
"Consent is like a nuclear reactor—incredibly powerful when used correctly, catastrophically dangerous when mishandled. Most companies should use it far less than they think."
What Makes Consent Valid Under GDPR?
I use the SCIFI test (yes, I made that acronym up, but it's helped dozens of clients):
S - Specific: "We'll use your email for monthly product updates" ✓ vs "We'll use your data for business purposes" ✗
C - Clear: Written in language an average person understands, not legal gibberish
I - Informed: You tell people exactly what they're consenting to, who will see their data, and how long you'll keep it
F - Freely given: No conditional service. "Create account" must work without marketing consent
I - Independently withdrawable: Unsubscribe must be as easy as subscribe (one click, no 12-step process)
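The SCIFI criteria above can be turned into an automated first-pass check. Here's a minimal Python sketch, assuming a hypothetical `ConsentRecord` shape — every field and function name is illustrative, not from any real library:

```python
from dataclasses import dataclass

# Hypothetical consent record; every field name here is illustrative.
@dataclass
class ConsentRecord:
    purpose: str               # what the person agreed to, in their words
    plain_language: bool       # notice readable by an average person
    informed: bool             # told who sees the data and for how long
    freely_given: bool         # service works without granting this consent
    one_step_withdrawal: bool  # unsubscribe as easy as subscribe
    pre_ticked: bool           # pre-ticked boxes are invalid under GDPR

def scifi_failures(c: ConsentRecord) -> list[str]:
    """Return the SCIFI criteria this record fails; empty list means it passes."""
    too_vague = {"business purposes", "all data processing activities"}
    failures = []
    if c.purpose.strip().lower() in too_vague:
        failures.append("Specific")
    if not c.plain_language:
        failures.append("Clear")
    if not c.informed:
        failures.append("Informed")
    if not c.freely_given or c.pre_ticked:
        failures.append("Freely given")
    if not c.one_step_withdrawal:
        failures.append("Independently withdrawable")
    return failures
```

Run the e-commerce platform's pre-ticked "all data processing activities" checkbox through this and it fails on "Specific" and "Freely given" immediately — which is exactly what the audit found.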
Here's a consent request I helped redesign for a healthcare app:
Before (Invalid): "I agree to the Privacy Policy and Terms of Service"
After (Valid):
"☐ Yes, send me weekly health tips via email (you can unsubscribe anytime)
☐ Yes, share my anonymized health data with research partners to improve healthcare (optional)"
See the difference? Specific, unbundled, optional.
When Consent Is The Right Choice
Scenario | Why Consent Works | What To Watch For |
|---|---|---|
Marketing newsletters | Purely optional service | Make unsubscribe easy |
Non-essential cookies | Tracking isn't required for site function | Cookie walls may not be compliant |
Data sharing with third parties | Not essential to your service | Each party needs clear identification |
Future research projects | Undefined future use | May need fresh consent for each project |
Children's data | GDPR requires parental consent below the digital consent age (16 by default; member states may set it as low as 13) | Age verification mechanisms needed |
2. Contract: The Workhorse of Data Processing
If consent is the celebrity of GDPR, contract is the unsung hero doing most of the actual work.
Real Story: The SaaS Company That Got It Right
I worked with a project management SaaS company in 2020. When mapping their data flows, we identified 17 different types of processing. The founder wanted to get consent for everything—he thought it was "safer."
I stopped him. "Look at your service," I said. "Can you deliver it without storing user names, email addresses, and project data?"
"Of course not," he replied.
"Then you don't need consent for that. You have a contract. The user signed up for your service. Processing their data is necessary to deliver what they paid for."
This realization transformed their approach:
Contract Basis (No consent needed):
Storing user account information
Processing project data they created
Sending service-related emails (password resets, billing notifications)
Providing customer support
Processing payments
Consent Basis (Explicit opt-in required):
Marketing newsletters about new features
Sharing usage statistics with third-party analytics (beyond essential security monitoring)
Beta testing programs with additional data collection
The result? Simpler privacy policy, clearer user communications, and a 34% increase in marketing opt-ins because users trusted the company wasn't hiding required processing behind "consent."
The Necessity Test
Here's how I determine if "contract" applies:
Ask yourself: If I don't process this data, can I still deliver the service the customer paid for?
Can't ship products without shipping addresses → Contract ✓
Can't process refunds without payment information → Contract ✓
Want to send promotional emails about other products → Not Contract, need consent ✗
Want to analyze purchasing patterns for internal research → Not Contract, consider legitimate interests ✗
Common Contract Basis Mistakes
Mistake | Why It's Wrong | Correct Approach |
|---|---|---|
Using contract basis for marketing | Marketing isn't necessary to deliver the service | Use consent or legitimate interests |
Claiming contract for data sharing with affiliates | Sharing with your other companies isn't necessary for the contract | Use consent or legitimate interests (carefully) |
Processing more data than needed | Contract only covers what's actually necessary | Minimize data collection |
Using contract for profile building | Profiling isn't necessary for most contracts | Use consent |
3. Legal Obligation: When the Law Makes You Do It
This one's straightforward but often misunderstood. Legal obligation means actual laws, regulations, or court orders—not company policies you created.
Real Story: The Financial Services Confusion
A fintech startup I advised was keeping customer data for seven years "because of legal obligations." When I asked which law required this, they pointed to their internal data retention policy.
That's not a legal obligation. That's a company policy.
We dug deeper and found that anti-money laundering regulations actually required them to keep transaction records for five years, not seven. And those regulations only applied to specific transaction data, not the full customer profile including marketing preferences and website behavior.
By correctly applying "legal obligation" only where laws actually required it, they:
Reduced storage costs by 23%
Simplified their data retention schedules
Reduced breach risk by holding less data
Improved customer trust by deleting data sooner
Examples of Legitimate Legal Obligations
Jurisdiction | Legal Requirement | Data Covered | Retention Period |
|---|---|---|---|
EU | Tax regulations | Financial transactions, invoices | Typically 7-10 years (varies by country) |
EU | AML/KYC | Identity verification, transaction records | 5 years after relationship ends |
US (HIPAA) | Medical records | Patient health information | 6 years from creation/last use |
UK | Employment law | Payroll, tax records | 3-6 years after employment ends |
Various | Court orders | Data subject to litigation holds | Duration of legal proceedings + appeals |
One caveat: under Article 6(3), a GDPR "legal obligation" must stem from EU or member state law. Obligations imposed by third-country laws like HIPAA generally need a different basis, such as legitimate interests.
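To make retention auditable, some teams encode each statutory window in code so deletion dates are computed rather than guessed. A minimal sketch — the figures mirror the table above and are illustrative; confirm each one against the governing statute before relying on it:

```python
from datetime import date

# Retention windows in years, mirroring the table above (illustrative only;
# verify each figure against the actual statute for your jurisdiction).
RETENTION_YEARS = {
    ("EU", "aml_kyc"): 5,        # after the customer relationship ends
    ("US", "hipaa_records"): 6,  # from creation or last use
    ("UK", "payroll"): 6,        # upper end of the 3-6 year range
}

def earliest_deletion_date(jurisdiction: str, obligation: str, clock_start: date) -> date:
    """clock_start is the event the retention clock runs from."""
    years = RETENTION_YEARS[(jurisdiction, obligation)]
    try:
        return clock_start.replace(year=clock_start.year + years)
    except ValueError:  # clock started on Feb 29 and the target year isn't a leap year
        return clock_start.replace(year=clock_start.year + years, day=28)
```

The fintech client's fix amounted to exactly this: replacing "seven years because policy says so" with a lookup keyed to the specific law and the specific clock-start event.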
"Legal obligation is your shield against criticism for keeping data. But it only works if you can point to the specific law that requires it. 'Industry best practice' isn't good enough."
4. Vital Interests: The Emergency Brake
In 15 years of cybersecurity work, I've only seen vital interests legitimately invoked three times. It's that narrow.
Vital interests means life or death. Literally.
Real Story: The Medical Emergency That Justified Everything
A hospital client faced a situation where an unconscious patient arrived at the emergency room. They had no ID, couldn't consent, and needed immediate treatment. The hospital accessed data from the patient's health app (linked through emergency medical services) without consent.
Was this legal? Absolutely. Vital interests applied because:
The patient's life was in danger
The patient couldn't provide consent (unconscious)
There was no time to seek authorization from family
The data processing was necessary to protect life
But here's the critical part: Once the patient regained consciousness and could provide consent, vital interests no longer applied. The hospital had to get proper authorization to continue accessing the health app data.
When Vital Interests Does NOT Apply
I've seen companies try to invoke vital interests for:
Fraud prevention (use legitimate interests instead)
Cybersecurity monitoring (use legitimate interests)
Preventing account takeovers (use legitimate interests or contract)
"Protecting" users from their own bad decisions (not your call)
Unless someone will literally die or face serious physical harm without your data processing, vital interests doesn't apply.
5. Public Task: The Government Exception
If you're a private company reading this, you can probably skip this section. Public task applies almost exclusively to government agencies and public authorities performing official functions.
Quick Reference: Public Task Examples
Organization Type | Valid Public Task | Invalid Use |
|---|---|---|
National health service | Operating patient registration systems | Marketing new voluntary health programs (use consent) |
Police department | Criminal investigation databases | Analyzing crime data for academic research (use legal obligation or legitimate interests) |
Public university | Managing student records and grades | Alumni marketing and fundraising (use consent or legitimate interests) |
Tax authority | Processing tax returns | Sharing data with other government agencies beyond legal requirements (use legal obligation) |
One critical point: Even public authorities can't use "public task" as a catch-all. The specific task must be defined in law or regulation.
6. Legitimate Interests: The Flexible (But Dangerous) Option
Here's where things get interesting. Legitimate interests is simultaneously the most flexible and most misused lawful basis.
After working with legitimate interests across dozens of implementations, I've developed a healthy respect—and a healthy fear—of this basis.
Real Story: The Marketing Agency That Pushed Too Far
A digital marketing agency I consulted for wanted to use legitimate interests for everything consent didn't cover. "We have a legitimate interest in running our business," they argued. "Therefore, all our data processing is legitimate interests."
If only it were that simple.
GDPR requires a three-part test for legitimate interests (the "LIA" - Legitimate Interests Assessment):
The Three-Part Legitimate Interests Test
Test Component | Questions to Ask | Red Flags |
|---|---|---|
1. Purpose Test | Do we have a genuine, specific legitimate interest? Is it clearly articulated? | Vague purposes like "business operations" or "improving services" |
2. Necessity Test | Is this processing necessary to achieve our purpose? Could we achieve it less intrusively? | Processing more data than needed; ignoring less invasive alternatives |
3. Balancing Test | Do our interests override the individual's rights and freedoms? Would they reasonably expect this processing? | Surprising or intrusive processing; vulnerable individuals; children's data |
Let me show you how this works in practice:
Scenario: Fraud Detection System
Purpose Test:
✓ Legitimate interest: Preventing fraud and protecting customers
✓ Specific: Detecting unusual transaction patterns
✓ Genuine: Real business and customer protection need
Necessity Test:
✓ Necessary: Can't prevent fraud without analyzing transactions
✓ Proportionate: Only analyzing transaction patterns, not reading message content
✓ No better alternative: Can't get consent before transactions (defeats real-time protection)
Balancing Test:
✓ Reasonable expectation: Customers expect fraud protection
✓ Minimal intrusion: Automated analysis, human review only for flagged transactions
✓ Additional safeguards: Clear privacy notice, easy opt-out for non-fraud security measures
Verdict: Legitimate interests likely applies ✓
Scenario: Profiling Customers for Targeted Advertising
Purpose Test:
✓ Legitimate interest: Maximizing advertising revenue
⚠ Question: Is this interest strong enough?
Necessity Test:
⚠ Questionable: Could achieve similar results with contextual advertising
✗ Less invasive alternatives exist
Balancing Test:
✗ Unexpected: Many users surprised by extent of profiling
✗ Potentially intrusive: Detailed behavior tracking
✗ Creates privacy risks: Extensive personal profiles
Verdict: Legitimate interests likely does NOT apply. Use consent instead. ✗
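The two walkthroughs above follow the same shape, which is why I record every LIA as structured data whose verdict is derived from the three tests rather than written down separately. A minimal sketch — the field names are my own shorthand, not statutory terms:

```python
from dataclasses import dataclass

@dataclass
class LIA:
    """One Legitimate Interests Assessment; illustrative field names."""
    purpose: str
    purpose_test: bool    # genuine, specific, clearly articulated interest?
    necessity_test: bool  # no less intrusive way to achieve the purpose?
    balancing_test: bool  # individual's rights and expectations respected?

    def verdict(self) -> str:
        if self.purpose_test and self.necessity_test and self.balancing_test:
            return "legitimate interests likely applies"
        return "fails the LIA - consider consent or another basis"

# The two scenarios above, encoded:
fraud = LIA("fraud detection", True, True, True)
ad_profiling = LIA("profiling for targeted advertising", True, False, False)
```

Deriving the verdict this way prevents the most common LIA failure I see in audits: a "pass" conclusion sitting on top of a balancing test that was marked as failed.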
"Legitimate interests is like a high-performance sports car. In the right hands with proper training, it's incredibly effective. In the wrong hands, it's a lawsuit waiting to happen."
When I Recommend Legitimate Interests
Based on my experience, here are processing activities where legitimate interests typically works:
Processing Activity | Why It Works | Key Safeguards Needed |
|---|---|---|
Cybersecurity monitoring | Protecting systems benefits everyone | Limit human access; automated alerts only |
Fraud prevention | Protects both company and customers | Clear boundaries; human review only for flagged activity |
Internal analytics | Improves service quality | Aggregate data; anonymize where possible |
Direct marketing to existing customers | Reasonable expectation in business relationship | Easy opt-out; respect preferences |
CCTV in retail locations | Security necessity; public expects it | Clear signage; limited retention; access controls |
Network and IT security | Essential for service delivery | Proportionate to threats; regular review |
When Legitimate Interests Fails
I've seen legitimate interests rejected for:
Children's Data: The balancing test almost always fails when processing children's information. Kids' rights get extra weight.
Sensitive Data: Processing health data, racial/ethnic information, or other Article 9 special categories based on legitimate interests? Extremely difficult to justify.
Unexpected Processing: That clever data use case nobody would anticipate? Fails the balancing test.
Purely Internal Benefits: "It makes our internal processes easier" rarely outweighs individual rights.
The Lawful Basis Selection Matrix
After years of helping organizations choose the right lawful basis, I created this decision matrix:
Your Situation | Primary Basis | Backup Option | Notes |
|---|---|---|---|
Delivering paid service | Contract | N/A | Don't overthink this one |
Essential security monitoring | Legitimate interests | N/A | Document your LIA |
Marketing to prospects | Consent | N/A | No shortcuts here |
Marketing to existing customers | Legitimate interests | Consent | Always offer opt-out |
Complying with tax law | Legal obligation | N/A | Be specific about which law |
Employee monitoring | Legitimate interests | Legal obligation (if applicable) | Consider employment law requirements |
Research with identifiable data | Consent | Contract (if related to service) | Consider anonymization instead |
Data sharing with partners | Consent | Legitimate interests (very carefully) | Each partner needs justification |
Cookie tracking (non-essential) | Consent | N/A | Strictly necessary cookies are exempt from the consent requirement |
The Critical Mistakes I See Repeatedly
Let me save you from the pain I've watched other organizations go through:
Mistake #1: Switching Lawful Bases Mid-Stream
The Problem: Company starts processing based on consent, then switches to legitimate interests when users start withdrawing consent.
Why It's Dangerous: GDPR requires you to determine your lawful basis before you start processing. You can't switch bases because the first one becomes inconvenient.
Real Impact: A marketing firm I audited had switched their newsletter from consent to legitimate interests after losing 40% of subscribers post-GDPR. When questioned by their DPA (Data Protection Authority), they couldn't demonstrate that legitimate interests was valid all along. They had to delete their entire mailing list and start over with consent.
Cost: €180,000 in lost revenue during rebuild, plus a €25,000 fine.
Mistake #2: The "Multiple Basis" Myth
The Problem: Companies claim multiple lawful bases for the same processing activity "to be safe."
Why It's Wrong: You must identify one primary lawful basis for each processing purpose. You can't hedge your bets.
The Right Approach: Different processing activities can have different bases, but each specific activity needs one clear basis.
Example Done Right:
Processing order information: Contract
Sending order confirmations: Contract
Sending marketing emails: Consent
Fraud detection on the account: Legitimate interests
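One way to enforce "one basis per purpose" structurally is to keep the register as a plain mapping, where a purpose physically cannot carry two bases. A sketch using the example above (the purpose strings are illustrative):

```python
# A purpose maps to exactly one lawful basis. Recording a second basis for
# the same purpose would overwrite the first, surfacing the conflict in review.
PURPOSE_TO_BASIS = {
    "processing order information": "contract",
    "sending order confirmations": "contract",
    "sending marketing emails": "consent",
    "fraud detection on the account": "legitimate interests",
}

def basis_for(purpose: str) -> str:
    basis = PURPOSE_TO_BASIS.get(purpose)
    if basis is None:
        raise ValueError(f"no documented lawful basis for {purpose!r} - stop and assess")
    return basis
```

The loud failure on an undocumented purpose is deliberate: processing without a documented basis is exactly the situation Article 6 is meant to prevent.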
Mistake #3: Consent for Everything
The Problem: Getting consent for processing that should be based on contract or legal obligation.
Why It Backfires:
If someone withdraws consent, you can't deliver your service
You're giving people control they shouldn't have over contract-essential processing
It confuses users and reduces trust
Real Story: An online bank required consent for processing transactions. When a customer withdrew all consent during a complaint, the bank couldn't legally process their account anymore. They had to close the account and return funds by check—creating compliance issues with other financial regulations.
Mistake #4: Ignoring the Documentation Requirement
Here's something that surprises people: GDPR requires you to document your lawful basis for each processing activity. The duty comes from the accountability principle (Article 5(2)) and the Article 30 records requirement, and it bites hardest on your Article 6 choices.
Not just have one. Document it. Write it down. Maintain records.
I've seen companies with perfect lawful bases fail audits because they couldn't produce documentation showing how they made their decisions.
Your Lawful Basis Documentation Checklist
For each processing activity, document:
Required Element | What to Record | Example |
|---|---|---|
Processing activity | What you're doing with the data | "Sending monthly newsletter to subscribers" |
Data processed | What personal data is involved | "Email address, first name, subscription preferences" |
Lawful basis | Which of the six bases applies | "Consent" |
Justification | Why this basis applies | "User explicitly opted in via checkbox; can unsubscribe anytime" |
Individual's rights | How rights apply under this basis | "Right to withdraw consent; right to erasure applies" |
Date assessed | When you made this determination | "2024-01-15" |
Reviewed by | Who approved this basis | "Data Protection Officer" |
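The checklist above maps directly onto a record type, which makes the register exportable when an auditor asks for it. A minimal sketch — the field names follow the table, not any official schema:

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class LawfulBasisRecord:
    """One row of the documentation checklist above (illustrative shape)."""
    processing_activity: str
    data_processed: str
    lawful_basis: str
    justification: str
    individual_rights: str
    date_assessed: date
    reviewed_by: str

# The newsletter example from the table:
newsletter = LawfulBasisRecord(
    processing_activity="Sending monthly newsletter to subscribers",
    data_processed="Email address, first name, subscription preferences",
    lawful_basis="consent",
    justification="User explicitly opted in via checkbox; can unsubscribe anytime",
    individual_rights="Right to withdraw consent; right to erasure applies",
    date_assessed=date(2024, 1, 15),
    reviewed_by="Data Protection Officer",
)
```

`asdict(newsletter)` gives you a plain dictionary ready to dump to CSV or JSON, which is usually all a DPA wants to see: contemporaneous, structured, dated records.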
Practical Implementation: The 5-Step Process
After implementing Article 6 compliance for over 60 organizations, here's my proven methodology:
Step 1: Map Your Data Processing (Week 1-2)
Create a comprehensive inventory:
What data are you collecting?
Why are you collecting it?
What are you doing with it?
Who are you sharing it with?
How long are you keeping it?
Tool I Use: Simple spreadsheet with columns for each question above. Nothing fancy needed.
Step 2: Assign Lawful Bases (Week 2-3)
For each processing activity:
Start with the necessity test: Is this required by law? (Legal obligation)
Is it essential to deliver a service someone paid for? (Contract)
Is it truly life or death? (Vital interests - rare)
Are you a public authority doing official duties? (Public task)
If none of above, choose between consent and legitimate interests
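The five questions above form a strict cascade, which makes them easy to encode as a first-pass triage. A sketch — the output is a starting point for human assessment, never a final determination:

```python
def first_pass_basis(required_by_law: bool,
                     essential_to_contract: bool,
                     life_or_death: bool,
                     official_public_task: bool) -> str:
    """Mirror of the Step 2 cascade; answers feed a review, they don't replace it."""
    if required_by_law:
        return "legal obligation"
    if essential_to_contract:
        return "contract"
    if life_or_death:
        return "vital interests"  # rare; see the section above
    if official_public_task:
        return "public task"
    return "choose between consent and legitimate interests"
```

The ordering matters: checking "contract" before "consent" is what keeps teams from making the Mistake #3 error of asking permission for processing the service can't run without.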
Step 3: Conduct Legitimate Interests Assessments (Week 3-4)
For any processing using legitimate interests:
Document your purpose
Prove necessity
Conduct the balancing test
Identify safeguards
Consider individual rights
Time Investment: Budget 2-4 hours per LIA. Don't rush this.
Step 4: Update Privacy Notices (Week 4-5)
Your privacy policy must clearly state:
What data you collect
For what purposes
Under which lawful basis
How long you retain it
Individual rights that apply
Pro Tip: Create a layered privacy notice—brief summary upfront, detailed information available on click.
Step 5: Implement Rights Management (Week 5-6)
Different lawful bases trigger different rights:
Right | Consent | Contract | Legal Obligation | Vital Interests | Public Task | Legitimate Interests |
|---|---|---|---|---|---|---|
Right to erasure ("right to be forgotten") | Yes | Limited* | No | Limited* | No | Yes** |
Right to data portability | Yes | Yes | No | No | No | No |
Right to object | N/A*** | No | No | No | Yes | Yes |
Right to restrict processing | Yes | Yes | Yes | Yes | Yes | Yes |
* Only if no other legal ground applies and no overriding business need
** Unless compelling legitimate grounds override
*** Can withdraw consent, which effectively stops processing
You need systems to handle these requests within one month (extendable by two further months for complex requests).
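Rights-request handling is easier when the table above lives in code, so intake tooling can tell agents which rights even apply before the one-month clock starts running. A simplified sketch — the "Limited" entries are included as applicable, so the table's starred caveats still need a human check:

```python
# Simplified mirror of the rights table above ("Limited" entries included;
# apply the table's starred caveats before acting on any request).
RIGHTS_BY_BASIS = {
    "consent":              {"erasure", "portability", "restriction"},
    "contract":             {"erasure", "portability", "restriction"},
    "legal obligation":     {"restriction"},
    "vital interests":      {"erasure", "restriction"},
    "public task":          {"objection", "restriction"},
    "legitimate interests": {"erasure", "objection", "restriction"},
}

def right_applies(basis: str, right: str) -> bool:
    """First-pass answer to 'does this right apply under this basis?'"""
    return right in RIGHTS_BY_BASIS[basis]
```

Even this crude lookup stops the most common intake mistake: promising portability or erasure for data held under a legal obligation, where neither right applies.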
Real-World Scenarios: How I'd Handle Them
Let me walk you through some actual situations I've encountered:
Scenario 1: The Recruitment Agency Dilemma
Situation: Recruitment agency wants to keep candidate CVs for future job opportunities, even if the specific job they applied for is filled.
Initial Thinking: "We have legitimate interests in keeping candidates on file."
My Analysis:
Purpose Test: ✓ Legitimate business interest
Necessity Test: ⚠ Could ask candidates to re-submit for new roles
Balancing Test: ⚠ Candidates might not expect indefinite retention
My Recommendation: Use consent with automatic expiry.
Implementation: "☐ Yes, keep my CV on file for 12 months to notify me about suitable roles (you can opt out anytime)"
Outcome: 73% of candidates opted in. Agency has clear legal basis, candidates have transparency and control, everyone's happy.
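Consent with automatic expiry only works if something actually enforces the expiry. A minimal purge sketch, assuming consents are stored as candidate-ID-to-opt-in-date pairs (an illustrative shape, not the agency's real system):

```python
from datetime import date, timedelta

CONSENT_TTL_DAYS = 365  # the 12-month window offered in the checkbox above

def cvs_to_purge(opt_ins: dict[str, date], today: date) -> list[str]:
    """Candidate IDs whose consent lapsed; their CVs must be deleted or re-consented."""
    cutoff = today - timedelta(days=CONSENT_TTL_DAYS)
    return sorted(cid for cid, opted_in in opt_ins.items() if opted_in < cutoff)
```

Run on a schedule, a job like this is what turns "keep my CV for 12 months" from a privacy-notice promise into an enforced retention limit.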
Scenario 2: The Cybersecurity Startup
Situation: Security company wants to analyze customer traffic patterns to detect threats and improve their machine learning models.
Initial Thinking: "It's all under legitimate interests for security."
My Analysis: Two separate processing purposes:
Threat detection: Legitimate interests ✓
ML model training: Separate assessment needed
My Recommendation:
Threat detection: Legitimate interests (conduct LIA, document safeguards)
ML training: Anonymize data where possible; if not possible, use legitimate interests with additional safeguards and opt-out
Implementation:
Clear privacy notice explaining both purposes
Automated anonymization pipeline for ML training
Customer dashboard showing how their data improves security
Easy opt-out from ML training (but not from essential security monitoring)
Outcome: Transparent approach built customer trust. When DPA audited them, documentation was solid.
Scenario 3: The Healthcare Appointment Reminders
Situation: Medical clinic wants to send appointment reminders via SMS.
Initial Thinking: "Medical data, so we need consent."
My Analysis:
Appointment reminders are essential to delivering healthcare service
Patients expect and want reminders
Not sending reminders leads to missed appointments and worse health outcomes
My Recommendation: Contract basis (not consent).
Why: Appointment reminders are reasonably necessary for delivering healthcare services. Making them optional via consent could harm patient care.
Safeguards:
Clear notice that reminders will be sent
Allow patients to choose reminder method (SMS, email, phone)
Easy way to opt out of reminders while keeping appointment
Outcome: Simplified process, better patient care, defensible legal position.
The Future of Article 6: What's Coming
After monitoring GDPR enforcement for six years, I see patterns emerging:
Trend 1: Legitimate Interests Under Scrutiny
Data Protection Authorities are increasingly challenging legitimate interests assessments, especially for:
Behavioral advertising
Extensive profiling
AI-powered decision making
Cross-border data sharing
My Advice: If you're using legitimate interests for anything beyond basic business operations and security, expect to defend it. Make your LIA bulletproof.
Trend 2: Consent Getting Stricter
I'm seeing enforcement actions against:
Cookie walls (blocking access unless you consent to cookies)
Consent bundling (making consent for marketing conditional on service access)
Dark patterns (design that tricks users into consenting)
Inadequate withdrawal mechanisms
My Advice: If you're asking for consent, make it genuinely optional and genuinely easy to withdraw.
Trend 3: Documentation Demands Increasing
Recent enforcement actions show DPAs expect:
Detailed, contemporaneous records of lawful basis decisions
Regular reviews of lawful basis appropriateness
Clear audit trails for processing activities
Evidence of balancing tests for legitimate interests
My Advice: Document everything. When in doubt, document more.
Your Article 6 Action Plan
Here's what I tell every client on day one:
This Week:
[ ] List all your data processing activities
[ ] Identify which lawful basis you're currently relying on (even if it's just assumed)
[ ] Flag any processing where you're not sure of the lawful basis
This Month:
[ ] Conduct proper lawful basis assessment for each activity
[ ] Document your reasoning for each choice
[ ] Conduct LIAs for any legitimate interests processing
[ ] Update privacy notices to clearly state lawful bases
This Quarter:
[ ] Implement systems to manage individual rights based on lawful bases
[ ] Train your team on lawful basis requirements
[ ] Review and update consent mechanisms if needed
[ ] Establish regular review process (quarterly or when processing changes)
Ongoing:
[ ] Review lawful bases when launching new processing activities
[ ] Monitor enforcement trends and guidance from your DPA
[ ] Keep documentation updated
[ ] Respond to individual rights requests appropriately based on lawful basis
The Bottom Line: Why Article 6 Really Matters
After 15 years in cybersecurity and six years of GDPR implementation work, here's my honest take:
Article 6 is not about compliance theater. It's about forcing organizations to think critically about why they're processing personal data and whether they really need to.
I've seen Article 6 compliance:
Force companies to delete 60% of data they didn't actually need (reducing breach risk and storage costs)
Clarify business processes that were previously chaotic
Build customer trust through transparent processing
Prevent privacy-invasive projects before they launched
Provide clear justification for processing that was legitimate but unexplained
Yes, it's work. Yes, it requires ongoing attention. Yes, you need to document things you used to just do.
But here's what I've learned: Organizations that get Article 6 right don't do it because they fear fines. They do it because it makes them better businesses.
"Article 6 compliance isn't about minimum viable legality. It's about maximum viable trust. Get your lawful bases right, and everything else falls into place."
The companies I've watched succeed with GDPR aren't the ones looking for loopholes or workarounds. They're the ones that embrace Article 6 as a framework for responsible data handling.
Be one of those companies.