Privacy Risk Assessment Methodology#
The NIST Privacy Risk Assessment Methodology (PRAM) is a tool that helps organizations analyze, assess, and prioritize privacy risks to determine how to respond and select appropriate solutions. [1] The PRAM can help drive collaboration and communication between various components of an organization, including privacy, cybersecurity, business, and IT personnel.
Taking on the persona of a financial institution, the NCCoE applied the PRAM to assess the privacy risks of using an mDL to establish a financial account through online remote identity proofing. The content below is representative of how a financial institution may complete a privacy assessment and is exemplary rather than normative, as the responses will vary for each organization. Completing this process allowed the NCCoE to create a more impactful set of recommendations and improve the build behind our fictitious NCCoE Bank. It is also worth noting that our example PRAM covers the mDL request and presentation model only; it does not assess alternative models of identity verification, which would likely have different privacy risks. For example, document capture of a physical driver’s license does not support the selective disclosure of information that an mDL does, introducing different risks and issues with different impacts for the financial institution. A comparable assessment should be conducted of all techniques a financial institution uses to identify and authenticate its users.
The NIST PRAM is a freely available resource that any organization can use. However, financial institutions may extend their existing privacy risk assessment approaches if they have already adopted one. These organization-specific approaches should include a detailed evaluation of the data flows and functions of the technology under evaluation, similar to the method applied in the PRAM.
The sections that follow walk through the completed sample privacy risk assessment, answering five key questions for the fictitious NCCoE Bank (referenced later as “our organization”).
What are the organizational privacy goals?#
Mission/business needs the system serves#
The term “digital first” describes the growing population of customers who prefer to interact online rather than through traditional methods, such as making telephone calls or visiting a branch. Pursuing a digital-first strategy prioritizes the customer experience, enables operational scaling, and strengthens security, all crucial aspects of our exemplar organization’s goals. More specifically, using a mobile driver’s license (mDL) for online account establishment and access serves the following needs of our “digital-first” bank customers:
- Customer Convenience: Increased efficacy and accuracy with which customers can present and validate their identity with our online services, reducing the need for customers to make in-person visits to a brick-and-mortar branch and expanding the hours when they can complete identity verification.
- Operational Efficiency: Increased automation, reducing manual labor and lowering operational costs.
- Customer Empowerment: Self-service options for account opening, enrollment (setting up secure online access), executing transactions, and account recovery.
- Competitive Advantage: Attracts “digital-first” customers and expands reach to new markets (e.g., customers from various generations and non-local customers [2], [3]).
- Security and Compliance: Advanced security protocols that improve the accuracy of customer information collected for identity verification; increased ability to report fraud to states whose IDs may be used by bad actors; and improved regulatory compliance for Customer Identification Programs/Customer Due Diligence (CIP/CDD).
- Increased Customer Service: Increased ability of bank tellers to focus more on customer service and less on identifying fraud, such as checking for counterfeit IDs or explaining to customers that they cannot be served due to a failed security check.
Privacy-preserving goals of the system#
When customers opt to use an mDL for establishing and accessing an online account, our exemplar financial institution’s goal is to ensure they benefit from the following enhanced privacy protections:
- User control and selective data disclosure: Only data required for opening the account is shared, minimizing the data collected and stored—a level of granular data control which is unique to the mobile driver’s license.
- Notice and transparency: Customers are provided with easy-to-understand customer interfaces to enable informed decision-making about sharing their personal information. Additionally, mobile wallets commonly offer customers a record of the organizations with which they have shared their information.
- Built-in security: We incorporate strong security into the way we present, store, and transmit customer information. Enabling confidentiality, integrity, and availability of data facilitates controls which double as privacy protections for customer data, such as access control, encryption, and retention/deletion.
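The selective-disclosure goal above can be illustrated with a minimal sketch. The `org.iso.18013.5.1` namespace and the data element names follow ISO/IEC 18013-5 mDL conventions, but the request/response structure here is simplified and illustrative, not a complete mdoc request, and the function and variable names are our own.

```python
# Illustrative sketch of selective disclosure with an mDL.
# Data element names follow the ISO/IEC 18013-5 "org.iso.18013.5.1"
# namespace; the surrounding structures are simplified for illustration.

MDL_NAMESPACE = "org.iso.18013.5.1"

# The verifier requests only the elements needed to open the account;
# True/False indicates whether the verifier intends to retain the value.
account_opening_request = {
    MDL_NAMESPACE: {
        "family_name": True,
        "given_name": True,
        "birth_date": True,
        "resident_address": True,
        "document_number": False,  # verified but not retained
    }
}

def disclosed_elements(request: dict, wallet_record: dict) -> dict:
    """Return only the requested elements present in the holder's wallet."""
    requested = request.get(MDL_NAMESPACE, {})
    return {name: wallet_record[name]
            for name in requested
            if name in wallet_record}

# The wallet holds more than the verifier asked for; elements that were
# never requested (portrait, driving privileges) are never disclosed.
wallet = {
    "family_name": "Doe",
    "given_name": "Jan",
    "birth_date": "1990-01-01",
    "resident_address": "123 Main St",
    "document_number": "D1234567",
    "portrait": b"...",
    "driving_privileges": "C",
}

shared = disclosed_elements(account_opening_request, wallet)
assert "portrait" not in shared and "driving_privileges" not in shared
```

This granular, per-element request is what distinguishes the mDL model from document capture of a physical license, where the full card face is always exposed.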
Legal environment in which we operate#
Federal Regulations
As a financial institution, our organization is subject to the Financial Services Modernization Act of 1999, also known as the Gramm-Leach-Bliley Act (GLBA), which includes:
- The Financial Privacy Rule (governs how financial institutions can collect and disclose customers’ personal financial data);
- The Safeguards Rule (requires all financial institutions to maintain safeguards to protect customer information); and
- A provision to prevent access to customers’ financial information under false pretenses (“pretexting”).
Source: FTC Financial Privacy
When our customers establish or access an online financial account, we are also subject to Customer Identification Program (CIP) and Customer Due Diligence (CDD) regulations, both falling under the USA PATRIOT Act and aimed at preventing fraud and financial crimes.
Source: FDIC explains the CIP requirements
Cybersecurity and privacy risks fall under operational risk for our organization. Operational risk is subject to safety and soundness examinations under the FDI Act and Federal Reserve Board Supervision and Regulation letters.
Source: FDI Act Section 39, SR 21-14
State Regulations
In addition, since we operate in California and Oregon, we also must comply with the California Consumer Privacy Act and the Oregon Consumer Privacy Act.
International Regulations
Our mDL efforts will be subject to the regulations in each country where we operate. This PRAM is based upon the requirements of our US operations. Privacy risks should be evaluated for each country where we introduce mDL, either in a separate PRAM or by integrating international considerations into this PRAM.
Contractual Requirements
Certain contracts with key partners include privacy requirements beyond those mandated by law. These may include:
- Permissible uses of shared data,
- Specific data retention periods and methods of destruction,
- Role-based access control for certain data, and
- Notification procedures for incidents.
These contractual privacy requirements should also be addressed in the PRAM.
How does this mDL system work?#
The figure below is an example data flow diagram based on the architecture we have implemented as part of the NCCoE Lab demonstration. This diagram highlights how data flows through our NCCoE Bank and third-party systems, including government and commercial third parties. Each step of the flow elucidates data actions relevant to the PRAM.

Fig. 1. mDL Privacy Assessment Data Flow.
The following table describes the data actions (collection, generation, disclosure, retention, disposal, or transformation) for each of the numbered interaction steps in the flow in Figure 1.
| Data Action | Data |
|---|---|

*Note: The below items happen outside of the identity verification flow.*
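The verifier-side data actions in the flow (collect the presented attributes, validate the mDL, disclose the attributes to the IDMS, then dispose of the local copy immediately) can be sketched as follows. This is an illustrative sketch only: the class and method names are our own and do not correspond to any real mDL verifier API, and signature verification is stubbed out.

```python
# Illustrative sketch of the verifier's handling of mDL attributes:
# collect, validate, disclose to the IDMS, then dispose of the local
# copy immediately, even if validation fails. All names are illustrative.

class IDMS:
    """Stand-in for the identity management system receiving attributes."""
    def __init__(self):
        self.received = None

    def receive(self, attributes: dict) -> None:
        self.received = dict(attributes)

class Verifier:
    def __init__(self, idms: IDMS):
        self.idms = idms
        self._attributes = None  # held only for the life of the transaction

    def collect(self, mdl_attributes: dict) -> None:
        """Collect the attributes presented from the holder's wallet."""
        self._attributes = dict(mdl_attributes)

    def verify_and_disclose(self) -> bool:
        """Validate the mDL, disclose to the IDMS, then delete locally."""
        try:
            if not self._signature_valid(self._attributes):
                return False
            self.idms.receive(self._attributes)  # disclosure
            return True
        finally:
            self._attributes = None  # disposal, on success or failure

    @staticmethod
    def _signature_valid(attributes) -> bool:
        # Placeholder for issuer-signature verification under the
        # device retrieval model; a real verifier checks the mobile
        # security object signed by the issuing authority.
        return attributes is not None

idms = IDMS()
verifier = Verifier(idms)
verifier.collect({"family_name": "Doe", "birth_date": "1990-01-01"})
ok = verifier.verify_and_disclose()
assert ok and verifier._attributes is None  # nothing retained locally
```

The `finally` block is the design point: immediate disposal happens on every path, so the verifier never becomes a long-term store of mDL attributes.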
What potential privacy risks does this mDL system introduce?#
The table below uses the privacy problems for individuals as defined in the NIST PRAM and highlights problems that could manifest for NCCoE Bank customers.
| Data Action | Potential Problems for Individuals |
|---|---|
| (1) The IDMS collects personal attributes self-asserted by the applicant. | Loss of Trust: Individuals could lose trust in a bank because they’re uncomfortable with how the bank is handling their data. Example: A customer assumes the data they provide is only for account enrollment, then receives a marketing email tailored to their identity. Customers may also lose trust in the bank if they’re asked for sensitive data without knowing why. Example: A customer may not realize an SSN is required in all financial institution account creation processes, regardless of mDL use. |
| (2) The applicant presents an mDL from their digital wallet. The verifier collects the applicant’s attributes from the mDL. | Loss of Trust: Customers could lose trust in a bank if the data it collects is beyond what’s predictable or expected. Example: A customer may think only certain attributes from their mDL are transferred, while the bank may actually collect all attributes from the mDL. Loss of Autonomy: A perceived loss of customer control over how their data is used. Example: A customer’s data is sent to a third party without their knowledge or used in a way they were not made explicitly aware of. |
| (3) The verifier confirms that the mDL is valid and discloses the mDL attributes to the IDMS. The verifier disposes of all the mDL attributes by deleting them immediately after disclosure. | Loss of Trust: The customer could lose trust in the bank and its partners because of how they’re handling the customer data. Dignity Loss: An issuer tracking customer activities across transactions could lead to stress or embarrassment. Example: The issuer uses a series of transactions to piece together habits of the customer, or aspects of the customer’s identity, which they do not expect or want to be known. |
| (4) The IDMS creates a unique user identifier for the applicant and transfers the identifier along with all the attributes (both self-asserted and from the mDL) to the appropriate bank IT back-end applications. | Loss of Trust: A customer identifier could be used to aggregate data about the customer’s transactions beyond the financial institution tenant. Dignity Loss: An IDMS tracking customer activities across tenants could lead to stress or embarrassment. Example: The IDMS uses a series of transactions to piece together habits of the customer, or aspects of the customer’s identity, which they do not expect or want to be known. Economic Loss: If customer data is breached at the IDMS, the leak may be used to defraud the individual. Example: A nefarious actor could use leaked data to phish a customer and steal their identity, which could result in significant economic loss. |
| (5) The bank validates that the application attributes match the applicant’s SSN. The SSN verifier responds with “yes” or “no” and disposes of the data sent by the bank back-end. The bank may also validate attributes with third-party databases. | Loss of Trust: The bank’s business partner may not effectively secure customer data from external actors. |
| (6) The bank IT back-end processes the information gathered and uses its decision logic to determine what financial offering will be made to the customer. | Discrimination: If assumptions are made about individuals or groups of individuals based on their self-asserted information, this could result in discrimination. Joining this self-asserted data with other datasets could reveal a narrative about an individual. Example: Drawing conclusions about race or financial situation. Economic Loss: Individuals could receive unfair value in a transaction due to discrimination. Example: Assuming creditworthiness based solely on someone’s area code. Additionally, if customer data is breached at the bank back end, the leak may be used to defraud the individual. Example: A nefarious actor could use leaked data to phish a customer and steal their identity, which could result in significant economic loss. Loss of Autonomy: Customers may lose control over how their data is processed if the bank chooses to process it in a new way internally without proper notice (i.e., secondary use or repurposing). Example: Customers may expect their personal data to be accessed only for certain purposes, such as identity verification when applying for a financial product, but it is later used for fraud scoring without their consent. |
| (7) The bank creates an account for the applicant and initiates passkey setup via the IDMS. | Loss of Trust: Customers may lose trust in the bank if they’re unclear on who has access to which data. Example: A customer doesn’t know why they’re interacting with the IDMS instead of the bank. Customers may also lose trust in the bank if they’re asked to present their mDL and attributes each time they authenticate after account creation. Example: If customers are asked to use their mDL for regular authentication, they will need to assert attributes from their mDL each time; such actions could be perceived as disproportionate to the purpose or outcome of the transaction. Loss of Autonomy: A perceived loss of customer control over how their data is used. Example: Customers may experience stress if they cannot update or delete their information within financial institution systems. Customers may also not be aware of financial institution data retention requirements. |
What’s the potential impact to the Financial Institution?#
The table below uses the privacy problems for individuals as defined in the NIST PRAM and highlights some of the organizational impacts that could result if those problems manifested for NCCoE Bank customers.
| Data Action | Potential Problems for Individuals | Impact to the Financial Institution |
|---|---|---|
| (1) The IDMS collects personal attributes self-asserted by the applicant. | Loss of Trust: Individuals could lose trust in a bank because they’re uncomfortable with how the bank is handling their data. Example: A customer assumes the data they provide is only for account enrollment, then receives a marketing email tailored to their identity. Customers may also lose trust in the bank if they’re asked for sensitive data without knowing why. Example: A customer may not realize an SSN is required in all financial institution account creation processes, regardless of mDL use. | The bank may incur direct business costs if customers choose to bank elsewhere because they aren’t comfortable with how banks are using their data. |
| (2) The applicant presents an mDL from their digital wallet. The verifier collects the applicant’s attributes from the mDL. | Loss of Trust: Customers could lose trust in a bank if the data it collects is beyond what’s predictable or expected. Example: A customer may think only certain attributes from their mDL are transferred, while the bank may actually collect all attributes from the mDL. Loss of Autonomy: A perceived loss of customer control over how their data is used. Example: A customer’s data is sent to a third party without their knowledge or used in a way they were not made explicitly aware of. | If customers are not clear about their choices in how to enroll with or without an mDL, are not clear on the data collected from the mDL, or find their data used in ways they didn’t predict, they may lose trust in the bank and choose to do business elsewhere. |
| (3) The verifier confirms that the mDL is valid and discloses the mDL attributes to the IDMS. The verifier disposes of all the mDL attributes by deleting them immediately after disclosure. | Loss of Trust: The customer could lose trust in the bank and its partners because of how they’re handling the customer data. Dignity Loss: An issuer tracking customer activities across transactions could lead to stress or embarrassment. Example: The issuer uses a series of transactions to piece together habits of the customer, or aspects of the customer’s identity, which they do not expect or want to be known. | If third-party software that is processing data on the bank’s behalf uses customer data to surveil them, banks may face reputational damage. The bank may incur direct business costs if customers choose to bank elsewhere because they aren’t comfortable with how banks and associated external organizations are using their data. The bank may incur noncompliance costs if customer data is stored when customers are told it’s deleted, or if it’s used beyond the purposes to which the customers consented. Customers could lose trust in mDLs, driving the financial institution to use less effective and secure means for meeting CIP requirements, opening them up to fraud and customer loss. |
| (4) The IDMS creates a unique user identifier for the applicant and transfers the identifier along with all the attributes (both self-asserted and from the mDL) to the appropriate bank IT back-end applications. | Loss of Trust: A customer identifier could be used to aggregate data about the customer’s transactions beyond the financial institution tenant. Dignity Loss: An IDMS tracking customer activities across tenants could lead to stress or embarrassment. Example: The IDMS uses a series of transactions to piece together habits of the customer, or aspects of the customer’s identity, which they do not expect or want to be known. Economic Loss: If customer data is breached at the IDMS, the leak may be used to defraud the individual. Example: A nefarious actor could use leaked data to phish a customer and steal their identity, which could result in significant economic loss. | If third-party software that is processing data on the bank’s behalf uses customer data to surveil them, banks may face reputational damage. The bank may incur direct business costs if customers choose to go elsewhere for banking because they aren’t comfortable with how banks and associated external organizations are using their data. The bank may incur noncompliance costs if the bank isn’t securely handling sensitive data about the customer. |
| (5) The bank validates that the application attributes match the applicant’s SSN. The SSN verifier responds with “yes” or “no” and disposes of the data sent by the bank back-end. The bank may also validate attributes with third-party databases. | Loss of Trust: The bank’s business partner may not effectively secure customer data from external actors. | The bank may incur noncompliance costs if the bank isn’t securely handling sensitive data about the customer. Customers may choose to work with other banks if they don’t trust a bank with their data. |
| (6) The bank IT back-end processes the information gathered and uses its decision logic to determine what financial offering will be made to the customer. | Discrimination: If assumptions are made about individuals or groups of individuals based on their self-asserted information, this could result in discrimination. Joining this self-asserted data with other datasets could reveal a narrative about an individual. Example: Drawing conclusions about race or financial situation. Economic Loss: Individuals could receive unfair value in a transaction due to discrimination. Example: Assuming creditworthiness based solely on someone’s area code. Additionally, if customer data is breached at the bank back end, the leak may be used to defraud the individual. Example: A nefarious actor could use leaked data to phish a customer and steal their identity, which could result in significant economic loss. Loss of Autonomy: Customers may lose control over how their data is processed if the bank chooses to process it in a new way internally without proper notice (i.e., secondary use or repurposing). Example: Customers may expect their personal data to be accessed only for certain purposes, such as identity verification when applying for a financial product, but it is later used for fraud scoring without their consent. | The bank may face noncompliance costs if data is not used for the purposes approved/expected. The bank may incur direct business costs if it makes inaccurate assumptions about customers and maintains poor-quality data. The bank may incur noncompliance costs if the bank isn’t securely handling sensitive data about the customer. |
| (7) The bank creates an account for the applicant and initiates passkey setup via the IDMS. | Loss of Trust: Customers may lose trust in the bank if they’re unclear on who has access to which data. Example: A customer doesn’t know why they’re interacting with the IDMS instead of the bank. Customers may also lose trust in the bank if they’re asked to present their mDL and attributes each time they authenticate after account creation. Example: If customers are asked to use their mDL for regular authentication, they will need to assert attributes from their mDL each time; such actions could be perceived as disproportionate to the purpose or outcome of the transaction. Loss of Autonomy: A perceived loss of customer control over how their data is used. Example: Customers may experience stress if they cannot update or delete their information within financial institution systems. Customers may also not be aware of financial institution data retention requirements. | The bank may face reputational damage if customers’ data is accessible by organizations they didn’t expect, or if there are security issues. The bank may incur direct business costs if customers choose to bank elsewhere because they don’t have control over their data. |
How do we mitigate these risks?#
The table below highlights controls that can help mitigate problems to individuals and impacts to banks. For a complete rationale for why these controls were chosen, please refer to our Excel mapping here.
Note: The Privacy Framework subcategories below align with the Initial Public Draft of the Privacy Framework version 1.1.
| Problems | Controls |
|---|---|
| (1) The IDMS collects personal attributes self-asserted by the applicant. | |
| Loss of Trust: Individuals could lose trust in a bank because they’re uncomfortable with how the bank is handling their data. Example: A customer assumes the data they provide is only for account enrollment, then receives a marketing email tailored to their identity. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
| (2) The applicant presents requested attributes from their mDL stored in their digital wallet. The verifier collects the applicant’s attributes from the mDL. | |
| Loss of Trust: Customers could lose trust in a bank if the data it collects is beyond what’s predictable or expected. Example: A customer may think only certain attributes from their mDL are transferred, while the bank may actually collect all attributes from the mDL. | See loss of trust controls for data action 1. These controls apply to both the IDMS in data action 1 and the verifier in data action 2. In some architectures the verifier may be an element of the IDMS. |
| Loss of Autonomy: A perceived loss of customer control over how their data is used. Example: A customer’s data is sent to a third party without their knowledge or used in a way they were not made explicitly aware of. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
| (3) The verifier confirms that the mDL is valid and discloses the mDL attributes to the IDMS. The verifier disposes of all the mDL attributes by deleting them immediately after disclosure. | |
| Loss of Trust: The customer could lose trust in the bank and its partners because of how they’re handling the customer data. Dignity Loss: An issuer tracking customer activities across transactions could lead to stress or embarrassment. Example: The issuer uses a series of transactions to piece together habits of the customer, or aspects of the customer’s identity, which they do not expect or want to be known. | NIST SP 800-53r5 Controls; Device retrieval, a verification model that allows the verifier to independently verify the integrity and accuracy of the user’s attributes without contacting the issuer at transaction time (server retrieval is not supported); Privacy Framework 1.1 Subcategories |
| (4) The IDMS creates a unique user identifier (UUID) for the applicant and transfers the identifier along with all the attributes (both self-asserted and from the mDL) to the appropriate bank IT back-end applications. | |
| Loss of Trust: A customer identifier could be used to aggregate data about the customer’s transactions beyond the financial institution tenant. Dignity Loss: An IDMS tracking customer activities across tenants could lead to stress or embarrassment. Example: The IDMS uses a series of transactions to piece together habits of the customer, or aspects of the customer’s identity, which they do not expect or want to be known. | NIST SP 800-53r5 Controls; Unique identifiers are randomly generated so they aren’t inherently revealing about an individual; Privacy Framework 1.1 Subcategories |
| Economic Loss: If customer data is breached at the IDMS, the leak may be used to defraud the individual. Example: A nefarious actor could use leaked data to phish a customer and steal their identity, which could result in significant economic loss. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
| (5) The bank validates that the applicant attributes match a valid SSN record. The SSN verifier responds with “yes” or “no” and disposes of the data sent by the bank back-end. The bank may also validate attributes with third-party databases. | |
| Loss of Trust: The bank’s business partner may not effectively secure customer data from external actors. | NIST SP 800-53r5 Controls; Use of the Social Security Administration’s SSN verifier incorporates a secure approach to verification; Privacy Framework 1.1 Subcategories |
| (6) The bank IT back-end processes the information collected and uses its decision logic to determine what financial offering will be made to the applicant. | |
| Discrimination: If assumptions are made about individuals or groups of individuals based on their self-asserted information, this could result in discrimination. Joining this self-asserted data with other datasets could reveal a narrative about an individual. Example: Drawing conclusions about race or financial situation. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
| Economic Loss: Individuals could receive unfair value in a transaction due to discrimination. Example: Assuming creditworthiness based solely on someone’s area code. Additionally, if customer data is breached at the bank back end, the leak may be used to defraud the individual. Example: A nefarious actor could use leaked data to phish a customer and steal their identity, which could result in significant economic loss. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
| Loss of Autonomy: Customers may lose control over how their data is processed if the bank chooses to process it in a new way internally without proper notice (i.e., secondary use or repurposing). Example: Customers may expect their personal data to be accessed only for certain purposes, such as identity verification when applying for a financial product, but it is later used for fraud scoring without their consent. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
| (7) The bank creates an account for the approved applicant and initiates passkey setup via the IDMS. | |
| Loss of Trust: Customers may lose trust in the bank if they’re unclear on who has access to which data. Example: A customer doesn’t know why they’re interacting with the IDMS instead of the bank. Customers may also lose trust in the bank if they’re asked to present their mDL and attributes each time they authenticate after account creation. Example: If customers are asked to use their mDL for regular authentication, they will need to assert attributes from their mDL each time; such actions could be perceived as disproportionate to the purpose or outcome of the transaction. | NIST SP 800-53r5 Controls; Pseudonymous authentication, using authentication methods (e.g., passkeys) that do not require the assertion of user attributes as part of the authentication protocol; Privacy Framework 1.1 Subcategories |
| Loss of Autonomy: A perceived loss of customer control over how their data is used. Example: Customers may experience stress if they cannot update or delete their information within financial institution systems. Customers may also not be aware of financial institution data retention requirements. | NIST SP 800-53r5 Controls; Privacy Framework 1.1 Subcategories |
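The identifier control for data action 4, a randomly generated unique identifier that reveals nothing about the individual, can be sketched as follows. The function name is our own, illustrative choice; the point is only that the identifier is drawn at random rather than derived from customer attributes.

```python
# Illustrative sketch: the unique user identifier created by the IDMS
# (data action 4) is randomly generated, so the identifier itself is
# not inherently revealing about the individual. Python's uuid.uuid4
# produces random identifiers; because the value is not derived from
# any attribute, two applicants with identical attributes still
# receive unrelated identifiers.
import uuid

def new_user_identifier() -> str:
    """Return a random, non-derived identifier for a new applicant."""
    return str(uuid.uuid4())

first = new_user_identifier()
second = new_user_identifier()
assert first != second  # identifiers carry no linkable attribute content
```

A derived identifier (e.g., a hash of name plus date of birth) would be stable across organizations and could enable the cross-tenant aggregation described in the Loss of Trust and Dignity Loss rows above; random generation avoids that linkage by construction.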