CUES Data Protection Impact Assessment (DPIA)

A privacy-focused template to evaluate whether an AI system’s data collection, retention, access, transfer, and storage model is appropriate before deployment.

Privacy template · Data minimization · Sensitive data review · Retention controls · Access review · Transfer review

Executive Summary

The DPIA is used when an AI use case processes personal, confidential, or otherwise sensitive information in a way that could create privacy, governance, or reputational risk. It helps CUES apply data minimization, retention discipline, and appropriate controls before launch.

Use a DPIA when the use case handles member-related data, employee information, confidential internal material, recordings, transcripts, or any workflow in which data leaves a controlled system or is retained by a vendor.

Core DPIA Questions

For each topic, document answers to the following questions:

  • Data inventory: What categories of data are processed? Is any of it personal, confidential, or restricted?
  • Purpose limitation: Why is each data element needed? Could the use case work with less data?
  • Storage and retention: Where are data, outputs, and logs stored, and for how long?
  • Access and permissions: Who can view, edit, export, or share the data or generated outputs?
  • Third-party handling: Does a vendor process or store the data? Under what contractual and technical controls?
  • Cross-system transfer: Does information move between Microsoft 365, cloud platforms, websites, LMS tools, or external vendors?
  • Individual impact: Could mishandling expose or misrepresent a person, team, or sensitive activity?
  • Mitigations: What specific controls reduce the identified privacy risks?
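
Teams that track many use cases may want DPIA answers in a structured, auditable record rather than free-form notes. The sketch below is one illustrative way to do that in Python; the field names, the example values, and the `is_reviewable` helper are assumptions for this example, not part of the CUES template.

```python
# Hypothetical DPIA record keyed to the core questions above.
# Field names and the example use case are illustrative only.
dpia_record = {
    "use_case": "Meeting transcript summarization",
    "data_inventory": ["transcripts", "attendee names"],
    "contains_personal_data": True,
    "purpose": "Summarize internal meetings for absent staff",
    "storage_location": "Approved internal document library",
    "retention_days": 90,
    "access_roles": ["meeting-owners"],
    "third_party_processor": None,   # None = no vendor involved
    "cross_system_transfer": False,
    "mitigations": ["mask attendee names before processing"],
    "outcome": "approve with required controls",
}

# Minimum fields a reviewer needs before the record can be assessed.
REQUIRED_FIELDS = {
    "data_inventory", "purpose", "storage_location",
    "retention_days", "access_roles", "mitigations", "outcome",
}

def is_reviewable(record: dict) -> bool:
    """Return True only when every required field is present."""
    return REQUIRED_FIELDS <= record.keys()
```

A record that fails `is_reviewable` would be sent back to the business owner before IT/security and compliance spend time on it.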

Typical Mitigations

  • Mask or remove unnecessary identifiers before processing.
  • Limit transcript, recording, and export access to named roles.
  • Use approved storage locations with defined retention windows.
  • Disable public sharing and unmanaged downloads when not required.
  • Require documented approval before moving sensitive data into external AI tools.
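
The first mitigation, masking identifiers before processing, can be sketched as a pre-processing step. This is a minimal regex-based illustration, not a complete PII detector; the pattern names and placeholder format are assumptions, and a production system would use a vetted detection library.

```python
import re

# Illustrative patterns only; real deployments need broader,
# vetted PII detection than these three regexes.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def mask_identifiers(text: str) -> str:
    """Replace matched identifiers with labeled placeholders
    before the text is sent to an AI tool."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Running the masking step before any external transfer supports both data minimization and the "documented approval" control, since the masked text is what actually leaves the controlled system.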

Decision Outcomes

The DPIA should end with one of three outcomes: approve, approve with required controls, or do not proceed until issues are resolved.

Where privacy risk remains elevated, require compensating controls, tighter scoping, or a different architecture.

FAQ

Does every AI tool need a DPIA? Not necessarily. Use it when privacy exposure is meaningful, when recordings or transcripts are involved, or when sensitive data is processed or retained.

Who should review it? The business owner, IT/security, and the relevant privacy or compliance stakeholder.