
Social Norms & Privacy

Privacy isn’t just a legal checkbox — it’s a fundamental aspect of user trust. How you collect, store, and share data shapes how users feel about your product. Respect their expectations, be transparent, and give them control.

Research shows that 48% of users distrust websites with deceptive UX patterns, and 49% of consumers believe websites don’t provide enough information about how their data is used. The stakes are high on both fronts: user trust and regulatory penalties.


The GDPR’s data protection principles translate directly into product requirements:

| Principle | Requirement | Implementation |
| --- | --- | --- |
| Lawfulness, fairness, transparency | Legal basis for processing; clear communication | Privacy policy, consent flows |
| Purpose limitation | Process only for stated purposes | Explicit purpose declaration |
| Data minimization | Collect only what’s necessary | Field-by-field justification |
| Accuracy | Keep data current and correct | Update mechanisms, validation |
| Storage limitation | Delete when no longer needed | Retention policies, auto-purge |
| Integrity & confidentiality | Protect against unauthorized access | Encryption, access controls |
| Accountability | Demonstrate compliance | Audit logs, documentation |

For consent specifically, each requirement has a clear good and bad practice:

| Requirement | Good Practice | Bad Practice |
| --- | --- | --- |
| Freely given | Equal-weight Accept/Reject buttons | Reject buried in submenu |
| Specific | Per-purpose consent | Blanket “agree to all” |
| Informed | Plain-language explanation | Legal jargon |
| Unambiguous | Active opt-in required | Pre-checked boxes |
| Withdrawable | One-click revocation | Multi-step cancellation |

Common consent dark patterns and the rules they violate:

| Pattern | Description | Violation |
| --- | --- | --- |
| Pre-checked boxes | Default opt-in to marketing | GDPR Art. 7 |
| Hidden opt-outs | Reject option hard to find | GDPR Art. 7 |
| Confirm-shaming | “No, I don’t want to save money” | DSA/DMA |
| Forced continuity | Auto-renewal hidden in T&C | CCPA |
| Privacy Zuckering | Confusing privacy settings | GDPR Art. 25 |
| Misdirection | Visual design emphasizes Accept | GDPR Art. 7 |

These consent and data-collection requirements can be expressed as validation rules:

privacy_validation:
  rules:
    - id: consent-freely-given
      severity: error
      check: "Accept and Reject buttons have equal visual weight"
      bad: "Accept: large blue button; Reject: gray text link"
      good: "Accept: blue button; Reject: blue outlined button (same size)"
    - id: no-prechecked-boxes
      severity: error
      check: "Marketing/tracking consent checkboxes default to unchecked"
      legal: "GDPR Article 7, CCPA"
    - id: consent-specific
      severity: error
      check: "Each data use purpose has separate consent toggle"
      bad: "By continuing, you agree to all data uses"
      good: "Toggle: Analytics, Toggle: Marketing, Toggle: Personalization"
    - id: plain-language
      severity: warning
      check: "Privacy explanations readable at grade 8 level or below"
      bad: "Pursuant to applicable regulations, data may be processed..."
      good: "We use your location to show nearby stores"
    - id: withdrawal-accessible
      severity: error
      check: "Consent withdrawal requires no more steps than granting"
      bad: "Opt-in: 1 click; Opt-out: email support"
      good: "Same toggle to enable/disable"
    - id: data-minimization
      severity: warning
      check: "Each collected field has documented necessity"
      question: "Would the product still work without this field?"
    - id: purpose-limitation
      severity: error
      check: "Data not used for undisclosed purposes"
      bad: "Collected for orders; used for ad targeting"
    - id: retention-policy
      severity: warning
      check: "Data retention period defined and enforced"
      pattern: "Auto-delete after [defined period]"
    - id: sharing-scope-clear
      severity: error
      check: "User knows exactly who will see their content before sharing"
      bad: "'Share' button with no audience indication"
      good: "'Share with 3 workspace members' or 'Post publicly'"
    - id: default-to-private
      severity: warning
      check: "New content defaults to private or limited visibility"
      rationale: "Most users never change defaults"
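
Rules like these only matter if something checks them. Below is a minimal, hypothetical TypeScript sketch of how a few of them (equal-weight buttons, no pre-checked boxes, withdrawal parity) could be evaluated against a declarative description of a consent dialog; the `ConsentUI` shape and the reuse of the rule IDs above are illustrative assumptions, not an existing API.

```typescript
// Hypothetical description of a consent dialog, used only for illustration.
interface ConsentUI {
  acceptButton: { visualWeight: number }; // e.g. 1.0 = primary button style
  rejectButton: { visualWeight: number };
  checkboxes: { purpose: string; prechecked: boolean }[];
  grantSteps: number;
  withdrawSteps: number;
}

interface Finding {
  id: string;
  severity: "error" | "warning";
  message: string;
}

// Evaluates a consent UI against a subset of the rules above.
function validateConsentUI(ui: ConsentUI): Finding[] {
  const findings: Finding[] = [];

  if (ui.acceptButton.visualWeight > ui.rejectButton.visualWeight) {
    findings.push({
      id: "consent-freely-given",
      severity: "error",
      message: "Accept and Reject buttons do not have equal visual weight",
    });
  }

  for (const box of ui.checkboxes) {
    if (box.prechecked) {
      findings.push({
        id: "no-prechecked-boxes",
        severity: "error",
        message: `Consent checkbox for "${box.purpose}" is pre-checked`,
      });
    }
  }

  if (ui.withdrawSteps > ui.grantSteps) {
    findings.push({
      id: "withdrawal-accessible",
      severity: "error",
      message: "Withdrawing consent takes more steps than granting it",
    });
  }

  return findings;
}
```

A check like this can run in CI against design tokens or component props, so consent regressions surface before launch.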

Helen Nissenbaum’s theory of contextual integrity provides the conceptual foundation for understanding privacy expectations. Privacy violations occur when information flows don’t match contextual norms. Nissenbaum describes each flow in terms of five parameters:

| Parameter | Definition | Example |
| --- | --- | --- |
| Data subject | Person the data is about | Patient, customer, employee |
| Sender | Who transmits the data | Doctor, website, app |
| Recipient | Who receives the data | Insurance, advertiser, employer |
| Information type | Category of data | Health, location, purchase history |
| Transmission principle | Conditions of flow | With consent, as required by law, freely |

A practice violates contextual integrity when flows fail to match expected parameter values:

FUNCTION detectPrivacyViolation(flow):
  expected = getUserExpectations(flow.context)
  violations = []
  IF flow.recipient NOT IN expected.recipients:
    violations.push("Unexpected recipient: " + flow.recipient)
  IF flow.information_type NOT IN expected.types:
    violations.push("Unexpected data type: " + flow.information_type)
  IF flow.transmission_principle NOT IN expected.principles:
    violations.push("Unexpected condition: " + flow.transmission_principle)
  RETURN violations
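
The same check can be written as runnable code. The sketch below is a TypeScript rendering of the pseudocode above; the `InformationFlow` and `ContextNorms` types stand in for whatever representation of user expectations you actually maintain (survey data, documented norms) and are assumptions for illustration.

```typescript
// Illustrative types for a contextual-integrity check; the shapes and the
// example norms are assumptions, not an existing schema.
interface InformationFlow {
  recipient: string;
  informationType: string;
  transmissionPrinciple: string;
}

interface ContextNorms {
  recipients: Set<string>;
  types: Set<string>;
  principles: Set<string>;
}

// Flags every flow parameter that falls outside the norms users expect
// for this context (mirrors the pseudocode above).
function detectPrivacyViolations(flow: InformationFlow, expected: ContextNorms): string[] {
  const violations: string[] = [];
  if (!expected.recipients.has(flow.recipient)) {
    violations.push(`Unexpected recipient: ${flow.recipient}`);
  }
  if (!expected.types.has(flow.informationType)) {
    violations.push(`Unexpected data type: ${flow.informationType}`);
  }
  if (!expected.principles.has(flow.transmissionPrinciple)) {
    violations.push(`Unexpected condition: ${flow.transmissionPrinciple}`);
  }
  return violations;
}

// Example: a fitness app sending activity data to an employer.
const fitnessNorms: ContextNorms = {
  recipients: new Set(["personal-dashboard"]),
  types: new Set(["activity"]),
  principles: new Set(["with-consent"]),
};

detectPrivacyViolations(
  { recipient: "employer", informationType: "activity", transmissionPrinciple: "with-consent" },
  fitnessNorms,
); // ["Unexpected recipient: employer"]
```
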
Examples of flows that match and violate contextual norms:

| Context | Expected Flow | Violation |
| --- | --- | --- |
| Medical | Doctor → Insurance (for billing) | Doctor → Advertiser |
| E-commerce | Purchase history → Recommendations | Purchase history → Political ads |
| Fitness app | Activity → Personal dashboard | Activity → Employer |
| Social media | Photo → Tagged friends | Photo → Facial recognition training |

Every data point you collect is:

  • A responsibility to protect
  • A potential liability if breached
  • A decision users must make about trusting you

Question for each field: Would the product still work without this information?

| Field | Required? | Justification |
| --- | --- | --- |
| Email | Yes | Account recovery, essential notifications |
| Phone | Maybe | 2FA (offer alternatives), critical alerts |
| Birth date | Rarely | Age verification (collect year only if needed) |
| Gender | Usually no | Personalization (offer “prefer not to say”) |
| Full address | Only if shipping | Physical delivery |
| Location (precise) | Usually no | Approximate location often sufficient |
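
One way to keep the “would the product still work without this field?” question from being forgotten is to require a written justification next to every field in the collection schema itself. A minimal sketch, assuming a hypothetical `CollectedField` record:

```typescript
// Hypothetical schema that forces every collected field to carry a
// documented justification; fields without one fail review.
interface CollectedField {
  name: string;
  required: boolean;
  justification: string; // must answer: why does the product need this?
}

const signupFields: CollectedField[] = [
  { name: "email", required: true, justification: "Account recovery, essential notifications" },
  { name: "phone", required: false, justification: "Optional 2FA; alternatives offered" },
  { name: "birthYear", required: false, justification: "Age verification where legally required" },
];

// Flags fields whose necessity is not documented.
function undocumentedFields(fields: CollectedField[]): string[] {
  return fields.filter((f) => f.justification.trim() === "").map((f) => f.name);
}
```
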

Collect data only when needed, not upfront:

┌─────────────────┐
│ Sign Up         │ ← Email only
└────────┬────────┘
         ▼
┌─────────────────┐
│ First Purchase  │ ← Shipping address
└────────┬────────┘
         ▼
┌─────────────────┐
│ Saved Cards     │ ← Payment info (optional)
└────────┬────────┘
         ▼
┌─────────────────┐
│ Enable Alerts   │ ← Phone number (optional)
└─────────────────┘
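
The staged flow above can be captured as a simple mapping from product milestone to the fields requested at that point, so no screen asks for data its feature doesn’t need. A sketch, with hypothetical milestone and field names:

```typescript
// Hypothetical mapping of product milestones to the data requested at each,
// mirroring the staged flow above: ask only when the feature needs it.
type Milestone = "signup" | "firstPurchase" | "savedCards" | "enableAlerts";

const fieldsByMilestone: Record<Milestone, { field: string; optional: boolean }[]> = {
  signup: [{ field: "email", optional: false }],
  firstPurchase: [{ field: "shippingAddress", optional: false }],
  savedCards: [{ field: "paymentInfo", optional: true }],
  enableAlerts: [{ field: "phoneNumber", optional: true }],
};

// Returns the fields a given screen is allowed to request.
function fieldsFor(milestone: Milestone): string[] {
  return fieldsByMilestone[milestone].map((f) => f.field);
}
```
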

Data you don’t keep can’t be breached.

| Data Type | Retention Guideline | Auto-Action |
| --- | --- | --- |
| Session logs | 30 days | Delete |
| Payment records | Legal requirement (varies) | Archive, restrict access |
| Support tickets | 2 years | Anonymize |
| Abandoned carts | 30 days | Delete |
| Inactive accounts | 2 years | Prompt, then delete |
| Analytics events | 26 months (GA default) | Anonymize |
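
Retention guidelines only reduce risk if a scheduled job enforces them. The sketch below encodes the table as data plus a small decision function; the periods are the illustrative ones above, not legal advice, and the type and rule names are assumptions.

```typescript
// Retention policy expressed in code so a scheduled purge job can enforce it.
// Periods mirror the guideline table above; adjust to your legal obligations.
type RetentionAction = "delete" | "anonymize" | "archive" | "prompt-then-delete";

interface RetentionRule {
  dataType: string;
  maxAgeDays: number;
  action: RetentionAction;
}

const retentionRules: RetentionRule[] = [
  { dataType: "session-logs", maxAgeDays: 30, action: "delete" },
  { dataType: "abandoned-carts", maxAgeDays: 30, action: "delete" },
  { dataType: "support-tickets", maxAgeDays: 730, action: "anonymize" },
  { dataType: "inactive-accounts", maxAgeDays: 730, action: "prompt-then-delete" },
];

// Decides what a purge job should do with a record of a given age.
function retentionAction(dataType: string, ageDays: number): RetentionAction | "keep" {
  const rule = retentionRules.find((r) => r.dataType === dataType);
  if (!rule || ageDays <= rule.maxAgeDays) return "keep";
  return rule.action;
}
```
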

Limit access to personal data within your organization:

| Role | Access Level |
| --- | --- |
| Customer support | Name, email, order history |
| Marketing | Anonymized segments only |
| Engineering | Production: anonymized; debug: time-limited access |
| Finance | Billing records; no PII beyond what’s necessary |
| Executive | Aggregated metrics only |
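
Access tiers like these can be enforced as field-level allowlists, so internal tooling redacts anything a role isn’t entitled to see. A minimal sketch with hypothetical role names:

```typescript
// Hypothetical field-level allowlist per internal role; anything not listed
// is redacted before a record reaches that role's tooling.
const fieldAccess: Record<string, Set<string>> = {
  "customer-support": new Set(["name", "email", "orderHistory"]),
  marketing: new Set([]), // anonymized segments only, no record-level access
  executive: new Set([]), // aggregated metrics only
};

// Returns a copy of the record with disallowed fields removed.
function redactForRole(record: Record<string, unknown>, role: string): Record<string, unknown> {
  const allowed = fieldAccess[role] ?? new Set<string>();
  return Object.fromEntries(Object.entries(record).filter(([key]) => allowed.has(key)));
}
```
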

Dark patterns in consent are both unethical and increasingly illegal. In 2024, average FTC penalties for dark pattern violations reached $14.8 million.

Requirements:

  • Don’t pre-check boxes for marketing or data sharing
  • Don’t hide opt-outs in walls of legal text
  • Don’t make rejection harder than acceptance
  • Do use plain language to explain what and why

Applied to a cookie consent banner:

| Element | Good | Bad |
| --- | --- | --- |
| Accept button | “Accept all” | “Yes! Give me the best experience!” |
| Reject button | “Reject all” (equally visible) | Small “Manage preferences” link |
| Granular options | Per-category toggles | One toggle for everything |
| Pre-selection | All off by default | “Legitimate interest” pre-enabled |
| Language | “Marketing cookies” | “Functional enhancement modules” |

Vague: “We use your data to improve our services”

Specific and honest:

  • “We use your location to show nearby stores” (necessary for feature)
  • “We share your email with shipping partners for tracking updates” (clear recipient)
  • “We analyze usage patterns to fix bugs (no personal data leaves your device)” (reassuring limitation)

The GDPR standard: Withdrawal of consent must be as easy as granting it.

| Consent Action | Steps to Grant | Steps to Withdraw |
| --- | --- | --- |
| Marketing emails | 1 click (checkbox) | 1 click (unsubscribe link) |
| Location sharing | 1 tap (enable) | 1 tap (disable) |
| Account deletion | N/A | 3 clicks max, no waiting period |
| Data export | N/A | Self-service in settings |

When users create content or profiles:

  • Default visibility: Private or limited
  • Pre-publish preview: Show who will see
  • Expansion warning: Alert before increasing visibility

Vague “Share” buttons create anxiety. Be specific:

| Vague | Specific |
| --- | --- |
| “Share” | “Share with 3 people in this workspace” |
| “Post” | “Post publicly (anyone on the internet)” |
| “Share profile” | “Visible to your 847 connections” |
| “Share location” | “Share with Sarah for 1 hour” |

Show current sharing state clearly:

┌────────────────────────────────────────┐
│ Document: Q4 Report                    │
│ 🔒 Private — Only you can see this     │
│ [Share]  [Make public]                 │
└────────────────────────────────────────┘
┌────────────────────────────────────────┐
│ Post: Product announcement             │
│ 🌐 Public — Anyone can see this        │
│ Shared with: Everyone on the internet  │
│ [Edit visibility]                      │
└────────────────────────────────────────┘

Preserve the context in which data was shared:

| Source Context | Inappropriate Cross-Context Use |
| --- | --- |
| Medical records | Search engine indexing |
| Photos with close friends | Recommendations to strangers |
| Work documents | Personal social media |
| Private messages | Training data for AI |

Before showing real-time activity indicators, ask whether users want that visibility:

| Feature | Default | User Control |
| --- | --- | --- |
| “Typing…” indicator | Off | Enable per conversation |
| Read receipts | Off | Enable globally |
| “Online now” status | Off | Enable with schedule |
| Last active time | Off | Enable globally |
| “Viewing this page” | Off | Enable per page |
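
Keeping every presence signal off by default is simpler when the defaults live in one settings object that each feature reads from. A minimal sketch with hypothetical setting names:

```typescript
// Hypothetical presence settings: every signal starts disabled and records
// the scope at which the user can turn it on.
interface PresenceSetting {
  enabled: boolean;
  controlScope: "per-conversation" | "global" | "scheduled" | "per-page";
}

const defaultPresenceSettings: Record<string, PresenceSetting> = {
  typingIndicator: { enabled: false, controlScope: "per-conversation" },
  readReceipts: { enabled: false, controlScope: "global" },
  onlineStatus: { enabled: false, controlScope: "scheduled" },
  lastActiveTime: { enabled: false, controlScope: "global" },
  viewingThisPage: { enabled: false, controlScope: "per-page" },
};
```
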

Showing social signals creates pressure. Evaluate carefully:

| Feature | Benefit | Privacy Cost | Recommendation |
| --- | --- | --- | --- |
| View counts | Social proof | Embarrasses low-view content | Aggregate only |
| Who viewed your profile | Curiosity | Creates surveillance | Opt-in only |
| Like counts | Validation | Creates pressure | Option to hide |
| “X is reading this” | Urgency | Forces response | Don’t implement |

83% of users never change privacy defaults. Your defaults are your true values.

| Default | Message to Users |
| --- | --- |
| Maximum sharing, opt-out | “We share unless you stop us” |
| Minimum sharing, opt-in | “We protect you unless you choose otherwise” |
| Transparent trade-offs | “Share more for these benefits, or keep private” |

Decision Logic for Privacy-Respecting Design

FUNCTION evaluatePrivacyDesign(feature):
  issues = []

  // Data collection check
  FOR each field IN feature.collected_data:
    IF NOT hasDocumentedNecessity(field):
      issues.push({
        severity: "warning",
        issue: "Unnecessary data collection: " + field
      })

  // Consent flow check
  IF feature.requires_consent:
    IF feature.consent.accept_weight > feature.consent.reject_weight:
      issues.push({
        severity: "error",
        issue: "Asymmetric consent buttons (dark pattern)"
      })
    IF feature.consent.prechecked_boxes > 0:
      issues.push({
        severity: "error",
        issue: "Pre-checked consent boxes violate GDPR"
      })

  // Sharing scope check
  IF feature.involves_sharing:
    IF feature.default_visibility == "public":
      issues.push({
        severity: "warning",
        issue: "Default to public violates privacy by design"
      })
    IF NOT feature.shows_audience_before_share:
      issues.push({
        severity: "error",
        issue: "User cannot see who will receive shared content"
      })

  // Withdrawal check
  IF feature.has_consent_mechanism:
    IF feature.withdrawal_steps > feature.grant_steps:
      issues.push({
        severity: "error",
        issue: "Withdrawal harder than granting (GDPR violation)"
      })

  RETURN issues

Regulations vary by jurisdiction but share common themes.

| Regulation | Jurisdiction | Key Requirements | Effective |
| --- | --- | --- | --- |
| GDPR | EU/EEA | Consent, right to deletion, data portability | 2018 |
| CCPA/CPRA | California | Right to know, right to delete, opt-out of sale | 2020/2023 |
| LGPD | Brazil | Similar to GDPR, local enforcement | 2020 |
| DSA | EU | Dark pattern prohibition, transparency | 2024 |
| DMA | EU | Gatekeepers can’t use dark patterns | 2024 |
| MODPA | Maryland (US) | Strict data minimization | Oct 2025 |

Recent enforcement examples:

| Company | Fine | Violation |
| --- | --- | --- |
| Amazon | €746M | Cookie consent manipulation |
| WhatsApp | €225M | Privacy policy transparency |
| Google | €150M | Confusing cookie rejection |
| Honda | Under investigation | Dark patterns in consent (CCPA) |

Building for GDPR compliance generally satisfies other regulations:

FUNCTION checkCompliance(feature):
  // GDPR is typically the strictest
  IF meetsGDPRRequirements(feature):
    RETURN "Likely compliant globally"

  // Check jurisdiction-specific requirements
  violations = []
  IF targetsCalifornia AND NOT meetsCCPA(feature):
    violations.push("CCPA")
  IF targetsEU AND NOT meetsGDPR(feature):
    violations.push("GDPR")
  IF targetsBrazil AND NOT meetsLGPD(feature):
    violations.push("LGPD")
  RETURN violations
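
As with the earlier pseudocode, this can be turned into a runnable sketch. The `Feature` shape, the jurisdiction flags, and the per-regulation booleans below are placeholders for whatever compliance checks your team actually maintains; they are assumptions, not a real library.

```typescript
// Hypothetical feature description for jurisdiction checks; all names are
// illustrative placeholders for your own compliance checklist.
interface Feature {
  targets: { eu: boolean; california: boolean; brazil: boolean };
  meetsGDPR: boolean;
  meetsCCPA: boolean;
  meetsLGPD: boolean;
}

// GDPR is typically the strictest baseline; if it is met, other regimes
// usually are too. Otherwise, report which targeted jurisdictions fail.
function checkCompliance(feature: Feature): string[] | "likely compliant globally" {
  if (feature.meetsGDPR) return "likely compliant globally";

  const violations: string[] = [];
  if (feature.targets.california && !feature.meetsCCPA) violations.push("CCPA");
  if (feature.targets.eu && !feature.meetsGDPR) violations.push("GDPR");
  if (feature.targets.brazil && !feature.meetsLGPD) violations.push("LGPD");
  return violations;
}
```
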

A 2024 study found that 97% of EU apps still deploy dark patterns despite GDPR enforcement. The Norwegian Consumer Council found that 90% of popular websites and apps use dark patterns that conflict with GDPR principles.

The EU AI Act (adopted March 2024) addresses the tension between AI’s data hunger and privacy principles. Recital 69 states: “The right to privacy and to protection of personal data must be guaranteed throughout the entire lifecycle of the AI system. In this regard, the principles of data minimisation and data protection by design and by default… are applicable when personal data is processed.”

The ICO found that 65% of users prefer websites that simplify data rights management. A University of Cambridge study found that 48% of users distrust websites with deceptive UX patterns. IAB’s January 2024 report shows 49% of consumers believe websites don’t provide enough information about data use.

The Maryland Online Data Privacy Act (MODPA, effective October 2025) takes a stricter approach than other US laws: data collection must be “reasonably necessary and proportionate” to provide a specific product or service requested by the consumer. It also prohibits targeted advertising to minors entirely.

