This article discusses demographic differentials that impact forensic error rate assessment of latent prints via covariate-specific ROC regression.
The challenge of understanding how automated tools might introduce bias has attracted considerable interest. If such biases are not identified and addressed, algorithms may reinforce societal discrimination and injustice, and it remains unclear how fairness in algorithms should be measured. In biometric disciplines such as automatic facial recognition, examining images of male subjects has been shown not to yield the same error rates as examining images of female subjects. Furthermore, recent studies have found that automatic fingerprint match scores vary with an individual's age and gender. Although the ROC curve has been essential for assessing classification performance, the presence of covariates can affect discriminatory capacity, so it may be advisable to incorporate these covariates into the ROC curve to exploit the additional information they provide. Importantly, the ROC regression modeling discussed in the paper can handle both continuous covariates, such as age, and discrete covariates, such as gender and race. The resulting adjusted ROC curve yields error rates that account for each subject's demographic information, providing a better measure of discriminatory capacity than the pooled ROC curve. (Publisher abstract provided)
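The abstract does not give an implementation, but the idea of a covariate-adjusted ROC curve can be illustrated with a minimal sketch of induced ROC regression under a binormal location model. Everything below is an assumption for illustration: the synthetic match scores, the hypothetical age effect on genuine-score means, and the choice to model impostor scores as covariate-free are not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic scores: genuine (mated) and impostor (non-mated) comparisons,
# with a hypothetical age effect shifting the genuine-score mean downward.
n = 2000
age = rng.uniform(20, 70, size=n)
genuine = 2.0 - 0.02 * age + rng.normal(0.0, 1.0, size=n)
impostor = rng.normal(0.0, 1.0, size=n)  # no covariate effect assumed

# Induced ROC regression: fit a linear model for the genuine-score mean
# as a function of the covariate, then build the covariate-specific ROC.
Xg = np.column_stack([np.ones(n), age])
beta_g, *_ = np.linalg.lstsq(Xg, genuine, rcond=None)
sigma_g = (genuine - Xg @ beta_g).std(ddof=2)

mu_i = impostor.mean()
sigma_i = impostor.std(ddof=1)

def covariate_roc(age_value, t):
    """TPR at false-positive rate t for a given age (binormal form)."""
    mu_g = beta_g[0] + beta_g[1] * age_value
    a = (mu_g - mu_i) / sigma_g
    b = sigma_i / sigma_g
    return norm.cdf(a + b * norm.ppf(t))

def covariate_auc(age_value):
    """AUC of the covariate-specific binormal ROC curve."""
    mu_g = beta_g[0] + beta_g[1] * age_value
    a = (mu_g - mu_i) / sigma_g
    b = sigma_i / sigma_g
    return norm.cdf(a / np.sqrt(1.0 + b**2))

# Discrimination differs by age: under this synthetic effect,
# younger subjects' scores separate better than older subjects'.
print(round(covariate_auc(25), 3), round(covariate_auc(65), 3))
```

A pooled ROC curve would average over the age distribution and report a single AUC, masking the difference the two calls above reveal; the covariate-specific curve makes the age-dependent error rates explicit.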