UW Privacy Office

Privacy Assessments

Privacy Impact Assessment for Personal Data Processing Activities (PIA)

The Privacy Impact Assessment (PIA) helps UW and UW units assess:

  • The potential benefits and harms to the individuals whose data are being processed and to the UW; and
  • The appropriateness of processing personal data based on UW’s values and principles and relevant laws, regulations, or policies.

Scope and Applicability

The PIA must be completed before a UW unit begins processing personal data (other than Protected Health Information (PHI)) in connection with one or more of the high-risk data processing activities listed below. These risk categories and descriptions were sourced from the TrustArc Privacy Management Platform and then augmented and endorsed by the Privacy Steering Committee.

High-Risk Data Processing Activities

  1. Automated decision-making with legal or similarly significant effects (i.e., no human involvement or intervention in the decision-making), such as[vi]:
    1. Automated decision-making that may result in physical harm to individuals (e.g., autopilot in connected cars)[i]
    2. Automated decision-making that may result in denial of a job opportunity, promotion, loan, or other financial opportunity[i]
    3. Automated decision-making that may result in differential pricing of goods and services
    4. Automated decision-making that may result in denial of health treatment (e.g., organ donation, blood transfusion, or bone marrow transplant) or denial of health payment (e.g., based on a poor prognosis)
    5. Automated decision-making that may result in reputational harm, such as a predictive association with a group that may result in stigmatization[i]
    6. Automated decision-making that may result in embarrassment, shock or surprise[i]
    7. Automated decision-making that may result in unlawful or other inappropriate discrimination, such as discrimination based on gender, age, ethnicity, race, or genetics[i]
  2. Evaluation or scoring, such as:
    1. Credit checks
    2. Financial assessment[i]
    3. Behavioral profiling or assessment
    4. Health risk assessment
    5. Genomic testing
    6. Pre-employment personality and preference assessments
    7. Employee capabilities assessment
    8. Evaluation or scoring of employees for talent assessment and profiling
    9. Leadership potential assessment
    10. Academic success prediction
    11. Evaluation or scoring of performance at work
    12. Evaluation or scoring of economic situations
    13. Evaluation or scoring of personal preferences or interests
    14. Evaluation or scoring of reliability
    15. Evaluation of an individual’s location or movements
    16. Use of past incidents in a scoring algorithm to determine the severity of sanctions
    17. Screening prospects or customers against a credit reference database
    18. Screening prospects or customers against an anti-money laundering database
    19. Screening prospects or customers against a denied-parties database
    20. Screening prospects or customers against a fraud database
    21. Other evaluation, scoring, profiling or prediction[i]
  3. Systematic monitoring, such as[vi]:
    1. Systematic observation, monitoring, or control of individuals, such as use of a camera system to monitor driving behavior on highways or monitoring of an employee’s workstation or internet activities
    2. Computer systems and network access monitoring
    3. Building or physical site access monitoring
  4. Sensitive data or data of a highly personal nature, such as[vi]:
    1. Any of the following categories of data (sensitive or special categories of data or other higher risk data): race or ethnicity, political affiliation or opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data used to uniquely identify an individual, health data, data related to sex life or sexual orientation, criminal convictions or offenses, electronic communications data, location data, financial account data[i][vi]
    2. Cloud computing services for personal or household activities, such as personal document management, personal webmail, e-readers, e-diaries (with note-taking)
    3. Facial recognition technology[ii]
    4. Biometric Identifiers[iii]
  5. Data processed on a large scale, such as[vi]:
    1. High-volume activities involving a large number of people or a large percentage of the relevant population
    2. High volume of data
    3. Broad range of different data items being processed
    4. Long duration of the data processing activity
    5. Broad geographic extent of the data processing activity
  6. Datasets that have been matched or combined, such as:
    1. Predictive analytics involving two or more data sources
    2. Prescriptive analytics involving two or more data sources
    3. Diagnostic analytics involving two or more data sources
    4. Descriptive analytics involving two or more data sources
  7. Data concerning vulnerable subjects, such as:
    1. Data involving vulnerable individuals (i.e., where there is a power imbalance between the organization and the individuals whose data are being processed), such as employees, children, mentally ill individuals, the elderly, patients, and asylum seekers
    2. Health assessments[iv]
    3. Medical treatment[iv]
    4. Biomedical research
    5. Clinical research
    6. Payment for health care[iv]
    7. Health care operations[iv]
    8. Products for children[vii]
  8. Innovative use or new technology, such as[vi]:
    1. New technology, such as Internet of Things applications or combining fingerprint and facial recognition for improved access control
    2. Technology product design
    3. Technology product development
    4. New or innovative use of, or purpose for, an existing product
    5. A mobile app that transparently or passively collects personal data and that the app owner is requesting to publish in the UW App Store[v]
  9. Interference with rights or opportunities, such as:
    1. Preventing an individual from exercising a right or from using a service or contract, such as through credit checks or automated decision-making
  10. Other likely high risks to the fundamental rights or freedoms of individuals, such as:
    1. Any other activity that could lead to physical, material or non-material damage to individuals, such as discrimination, identity theft or fraud, financial loss, damage to reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorized reversal of pseudonymization, or any other significant economic or social disadvantage[i][vi]
  11. Other likely high risks, such as:
    1. Targeted advertising[i]
    2. Sale of personal data[i]

Form and Workflow Process

Download PIA [docx] [Update in progress]
The PIA form and process are being incorporated into the TrustArc Privacy Management Platform. The new form and process will be released in January 2022. To receive updates on this and other changes, please subscribe to Privacy Office communications.

References

[i] Reserved
[ii] Related to RCW 43.386, Facial Recognition
[iii] Related to RCW 40.26, Biometric Identifiers
[iv] If the processing activity is related to protected health information, then follow UW Medicine Compliance’s policies and procedures.
[v] Related to discussions and decisions with UW Privacy Office and UW-IT, Academic Experience Design and Delivery
[vi] Related to General Data Protection Regulation Article 35 and/or Recital 75
[vii] UW Privacy Policy for UW Youth Programs