
Statement on the Need for An Interpretive Rule Concerning the Solicitation of Demographic Data for the Purposes of Fair Lending Self-Testing


June 22, 2022

We are a group of fintechs and consumer advocates who share a common interest in making the credit system fairer for historically underserved consumers.

On June 29, 2021, the National Community Reinvestment Coalition and Affirm, Lending Club, Oportun, PayPal, Square (now “Block”), and Varo provided a joint statement to the Consumer Financial Protection Bureau (“the Bureau”) in support of the use of disparate impact testing under the Equal Credit Opportunity Act (“ECOA”) to address risks of “digital discrimination” that result from artificial intelligence, machine learning, or alternative data used in credit models.[1] That statement identified five topics we believe the Bureau should address in an interpretive rule to enhance the disparate impact framework.

In the following statement, we elaborate on the fifth of our earlier recommendations: enhancing the “self-testing” option that enables creditors to collect demographic data directly from consumers. We believe this option could provide consumers with better protection against discrimination.

We call on the Bureau to issue an interpretive rule on how creditors could collect demographic information from applicants for the purpose of improving their ability to monitor their underwriting for disparate impact on consumers who are members of protected groups. We are not suggesting that the collection of demographic data is mandatory under ECOA, only that, when a lender voluntarily elects to collect information for self-testing purposes (a practice already permitted by federal fair lending laws), additional clarification from the Bureau would be helpful.

The Rationale Behind Our Request for an Interpretive Rule

To ensure compliance with ECOA and Regulation B,[2] creditors are expected to conduct fair lending analyses of their credit models to ensure that the models do not unlawfully discriminate against applicants on the basis of protected class status, such as race, national origin, sex, or age.[3] These regulatory requirements apply to fintech, algorithmic, AI, ML, and alternative data credit models just as they do to traditional credit models.

We recognize that the Bureau has applied the unfairness doctrine to discriminatory practices beyond the context of credit. Although this letter does not address in detail self-testing as it relates to the unfairness doctrine, we encourage the Bureau to consider an approach to the self-testing privilege that allows financial institutions to apply consistent compliance procedures as they monitor for discrimination in non-credit products and services.

Regulation B, which the Bureau supervises, prohibits a creditor from inquiring about the race, color, religion, national origin, or sex of a credit applicant except under certain circumstances.[4] One exception to this prohibition is the privilege conferred upon creditors to conduct “self-tests” for the purpose of fair lending compliance. However, creditors rarely conduct self-tests, owing to fears of self-incrimination, confusion about what is permissible, potential increases in attrition during the application process, and uncertainty about best practices for demographic data collection.

While 12 CFR 1002.5(b)(1) et seq. does provide an exception allowing the collection of demographic information for self-testing, this is not a common practice in the industry, due to concerns that collecting such information would create a perception of disparate treatment among customers and leave creditors at risk of regulatory enforcement actions.

Creditors that seek to evaluate the fairness of their lending practices when they are not under an affirmative obligation to collect demographic information may use alternative methods to estimate the demographic composition of a class of applicants.[5]

Absent certainty that proxy estimation is permissible, or because of the relative complexity of the most commonly used method, Bayesian Improved Surname Geocoding (“BISG”), some firms may limit their fair lending testing. The Bureau has examined the capabilities of BISG using Home Mortgage Disclosure Act (“HMDA”) data and concluded that it can improve on simpler diagnostic methodologies, but it has also acknowledged that BISG’s inferential power differs among demographic groups and that it is less predictive for persons of color.[6] While the Bureau used HMDA mortgage data to show that BISG provides reliable estimates for that population, others have raised concerns about BISG’s accuracy at an individual level and for other populations. Proxy methodologies may lead to less effective evaluations of fair lending practices than would be possible if lenders were permitted to collect demographic information for fair lending “self-test” purposes only, as outlined in 12 CFR 1002.5(b)(1) et seq.
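
To make the mechanics concrete, below is a minimal sketch of the Bayes’-rule calculation at the core of BISG, assuming the surname-based probabilities (e.g., from the Census surname list) and block-group demographics have already been looked up; the function, group labels, and numbers are illustrative assumptions, not the Bureau’s implementation.

```python
import numpy as np

# Demographic categories of the kind used in BISG (order is arbitrary here).
GROUPS = ["hispanic", "white", "black", "api", "aian", "multiracial"]

def bisg_posterior(p_group_given_surname, p_geo_given_group):
    """Combine surname- and geography-based evidence via Bayes' rule.

    p_group_given_surname: P(group | surname), e.g., from the Census
        surname list.
    p_geo_given_group: P(block group | group), the share of each group's
        national population living in the applicant's block group.
    BISG assumes surname and geography are independent given group.
    """
    joint = p_group_given_surname * p_geo_given_group  # unnormalized posterior
    total = joint.sum()
    if total == 0:
        return None  # no usable evidence; fall back to a single-source proxy
    return joint / total

# Hypothetical inputs for a single applicant:
p_surname = np.array([0.80, 0.10, 0.04, 0.04, 0.01, 0.01])
p_geo = np.array([1e-5, 4e-5, 2e-5, 1e-5, 1e-6, 1e-6])
posterior = bisg_posterior(p_surname, p_geo)
print(dict(zip(GROUPS, posterior.round(3))))
```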

The collection of demographic information would improve the ability of financial institutions to self-test effectively. It would allow lenders to validate the accuracy of proxy methods such as BISG against their specific borrower populations, and to develop and validate novel methods that could improve on BISG when estimating race and ethnicity where self-reported attributes are missing.[7] Additionally, some credit applicants may decline to provide demographic data when asked, and the efficacy of testing could be undermined if those applications were discarded. Statistically sound procedures exist for handling missing data, and the Bureau could provide guidance on proper proxy approaches for datasets with missing responses.
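
One statistically sound way to avoid discarding non-respondents, sketched below under the assumption that a BISG estimate is available for every applicant, is to fall back to the proxy only where the self-reported field is missing; the table and column names are hypothetical.

```python
import pandas as pd

# Hypothetical applicant records: "race_self_reported" is missing where the
# applicant declined to answer; "bisg_estimate" is the proxy result.
apps = pd.DataFrame({
    "race_self_reported": ["black", None, "white", None, "hispanic"],
    "bisg_estimate":      ["black", "white", "white", "hispanic", "hispanic"],
    "approved":           [1, 0, 1, 1, 0],
})

# Fall back to the proxy only where self-reported data is missing, so every
# application stays in the fair lending analysis.
apps["race_for_testing"] = apps["race_self_reported"].fillna(apps["bisg_estimate"])
print(apps[["race_self_reported", "race_for_testing", "approved"]])
```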

Additionally, an interpretive rule indicating when and how a creditor might rely on visual observation, similar to what is permitted under HMDA, to estimate the demographic status of an applicant would further assist with efforts to conduct self-testing. Ideally, it would provide specifics for such estimations in each channel, including how a creditor could make use of identifying documentation gathered through digital channels.

Regulation C (HMDA) allows for the collection of demographic data by mortgage lenders, and a pending notice of proposed rulemaking would permit similar procedures for small business creditors. HMDA has supported efforts to ensure accountability for credit accessibility and fair lending testing, and its requirement to collect demographic data has not had a noticeable impact on application incompletion rates.

The following chart lists examples of why a creditor might refrain from self-testing and how those hurdles might be addressed through an interpretive rule from the Bureau.

The publication of an interpretive rule would support efforts to enhance fair lending testing.

Background

Self-testing is an important aspect of advancing the policy objectives embodied in ECOA and Regulation B. The methodologies that institutions use to test their models and lending policies for fair lending vary, but such testing generally includes: (1) ensuring that models and policies do not include protected class status, or close proxies for it, as attributes; and (2) performing disparate impact testing that assesses whether facially neutral automated models are likely to lead disproportionately to negative outcomes for a protected class and, where such negative impacts exist, ensuring that the models serve legitimate business needs and that no “less discriminatory” model or practice would achieve similar business results.[8] These practices allow creditors to provide products and services on an equitable basis.
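
As a concrete illustration of step (2), one common first-pass screen is the adverse impact ratio: the ratio of favorable-outcome rates between a protected class and a control group. The sketch below uses hypothetical counts, and the 0.8 screening threshold is a rule of thumb borrowed from employment law, not a lending-specific regulatory standard.

```python
def adverse_impact_ratio(favorable_protected, total_protected,
                         favorable_control, total_control):
    """Ratio of favorable-outcome (e.g., approval) rates between groups.

    A ratio well below 1.0 suggests the protected class receives the
    favorable outcome less often; 0.8 (the EEOC "four-fifths" rule of
    thumb) is sometimes used as a screening threshold, after which a
    creditor would assess business justification and search for less
    discriminatory alternatives.
    """
    rate_protected = favorable_protected / total_protected
    rate_control = favorable_control / total_control
    return rate_protected / rate_control

# Hypothetical portfolio counts:
air = adverse_impact_ratio(favorable_protected=300, total_protected=1000,
                           favorable_control=450, total_control=1000)
print(f"AIR = {air:.2f}")  # 0.67 here, which would warrant further review
```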

The clarifications recommended below would provide well-intentioned creditors with guidelines and rules governing how to ask for demographic data while remaining in compliance with Regulation B and Regulation Z.

Recommendations

We urge the Bureau to conduct research and provide an interpretive rule to clarify when and how creditors can solicit demographic data from individual credit applicants in the context of fair lending testing programs, including the following:

  1. Providing examples of best practices, such as those the Bureau already provides for notices of adverse action.[9]
  2. Creating a model guide for the solicitation of demographic information in varying application formats, including mobile, desktop, phone, in-person, and through the mail. The Bureau should state that creditors using practices in accordance with the model guide will not risk enforcement under Regulation B.
  3. Enumerating operational safeguard best practices to ensure that the demographic data is not misused by creditors after being acquired for “self-testing” purposes.
  4. Providing clarity on how to determine optimal sample sizes that maximize the reliability of the demographic information collected while minimizing the impact on conversion (one conventional power calculation is sketched after this list).
  5. Providing clarity on how to supplement data for customers who choose not to provide demographic data, including allowing visual observation similar to what is permitted under HMDA, as well as guidance on acceptable proxy methods, both for financial institutions that supplement their data and for those that might choose not to directly request demographic data.
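
On recommendation 4, the sketch below illustrates one conventional way to size a sample: a two-proportion power calculation using the statsmodels library. The approval rates, power, and significance level are illustrative assumptions, not Bureau guidance.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Smallest approval-rate gap the self-test should reliably detect
# (illustrative: 45% control vs. 40% protected-class approval rate).
effect = proportion_effectsize(0.45, 0.40)

# Applicants needed per group for 80% power at a 5% significance level.
n_per_group = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"~{n_per_group:.0f} applicants per group")
```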


[1] Statement on Request for Guidance on Implementation of Disparate Impact Rules Under ECOA (June 29, 2021), https://ncrc.org/statement-on-request-for-guidance-on-implementation-of-disparate-impact-rules-under-ecoa/.

[2] Regulation B defines an “empirically derived, demonstrably and statistically sound, credit scoring system.” See 12 C.F.R. § 1002.2(p).

[3] ECOA, 15 U.S.C. § 1691 et seq.; Regulation B (12 C.F.R. part 1002). ECOA and Regulation B prohibit disparate treatment and disparate impact, including “prohibit[ing] a creditor practice that is discriminatory in effect because it has a disproportionately negative impact on a prohibited basis, even though the creditor has no intent to discriminate and the practice appears neutral on its face, unless the creditor practice meets a legitimate business need that cannot reasonably be achieved as well by means that are less disparate in their impact.” 12 C.F.R. part 1002, Supp. I ¶ 1002.6(a)-2.

[4] Amendments to Equal Credit Opportunity Act (Regulation B) Ethnicity and Race Information Collection, 82 FR 16307 (Apr. 4, 2017).

[5] As a first step, creditors review an underwriting policy or practice, usually through a quantitative technique, to identify whether it disproportionately disadvantages a protected class. If this step gives reason for concern, the creditor would then determine whether the policy or practice meets a legitimate business interest; if it does, the last step is to determine whether the objective could be achieved through a less discriminatory alternative.

[6] Consumer Financial Protection Bureau. “Using Publicly Available Information to Proxy for Unidentified Race and Ethnicity: A Methodology and Assessment.” Washington, D.C.: CFPB, Summer 2014. https://files.consumerfinance.gov/f/201409_cfpb_report_proxy-methodology.pdf.

[7] Baines, A. P., & Courchane, M. J. (2014). Fair Lending: Implications for the Indirect Auto Finance Market, at 143.

[8] See Initial Report of the Independent Monitor, Fair Lending Monitorship of Upstart Network’s Lending Model at 7 (April 14, 2021) (“Initial Upstart Report”), https://www.relmanlaw.com/media/cases/1086_Upstart%20Initial%20Report%20-%20Final.pdf; David Skanderson & Dubravka Ritter, Federal Reserve Bank of Philadelphia, “Fair Lending Analysis of Credit Cards” 38–40 (2014).

[9] See Appendix C to 12 CFR Part 1002 (Regulation B) — Sample Notification Forms, https://www.consumerfinance.gov/rules-policy/regulations/1002/c/
