Biometrics & You

Biometrics as a form of authentication, in other words using a fingerprint or other biological characteristic as proof of identity, is increasingly common in the security world. The technology can serve as a standalone authentication method or as one factor in a two-factor authentication (2FA) scheme.

Smartphones, laptops, secured entryways, mantraps, and even credit cards are being made with biometric capabilities. Most of these implement biometrics with some kind of scanner that registers the user's fingerprint, retina pattern, facial features, or similar characteristics at enrollment time. When the user wishes to use the device, the scan is repeated, and if the new scan matches the stored record the user is authenticated.
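The enroll-then-verify loop described above can be sketched as follows. This is a deliberately simplified toy, not any vendor's implementation: the feature vectors, the `similarity` measure, and the `0.9` threshold are all illustrative assumptions, and real scanners extract far richer templates.

```python
# Toy sketch of biometric enrollment and verification.
# Feature vectors, similarity measure, and threshold are hypothetical.

def similarity(a, b):
    """Fraction of positions where two equal-length feature vectors agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def enroll(database, user, features):
    """Store the user's reference template at enrollment time."""
    database[user] = features

def verify(database, user, features, threshold=0.9):
    """Authenticate if a fresh scan is similar enough to the stored template."""
    template = database.get(user)
    return template is not None and similarity(features, template) >= threshold

db = {}
enroll(db, "alice", [1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
print(verify(db, "alice", [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]))  # exact match: True
print(verify(db, "alice", [0, 1, 0, 0, 1, 0, 1, 0, 0, 1]))  # different finger: False
```

The key design point is that nothing exact is compared: a fresh scan is never bit-for-bit identical to the enrolled template, so the system must settle for "similar enough," and everything hinges on where that threshold sits.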

This technology has a few notable limitations. First, many of these scanners are compact; to keep miniaturization costs down they often have very low resolution, which reduces the fidelity of the image-matching process. Second, as elsewhere in security, convenience trades off against security: users have little tolerance for false negatives (when the system fails to recognize someone it should), so many manufacturers err on the side of permissiveness. This can mean, for example, that someone with only a partial fingerprint match is authenticated as though they were a different user.
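The permissiveness trade-off can be made concrete with the same toy similarity measure. The vectors and both thresholds here are invented for illustration; the point is only that a threshold lowered to avoid false negatives also admits partial matches.

```python
# Illustrative only: hypothetical feature vectors and thresholds.

def similarity(a, b):
    """Fraction of positions where two equal-length feature vectors agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

template = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]   # enrolled user
partial  = [1, 0, 1, 1, 0, 1, 0, 0, 0, 1]   # someone else's partial match

strict_threshold = 0.9      # fewer false accepts, more false rejects
permissive_threshold = 0.6  # fewer complaints, weaker security

score = similarity(template, partial)
print(score)                          # 0.7: a 70% partial match
print(score >= strict_threshold)      # False: a strict system rejects it
print(score >= permissive_threshold)  # True: a permissive system lets it in
```

A strict threshold turns the partial match away; the permissive one, tuned so legitimate users are rarely rejected, accepts it.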

One other objection that has been raised is that biometrics, by its very nature, creates a digital signature of the biology it verifies. That signature must be stored and transmitted, and is therefore subject to compromise; a clever malicious actor could simply do the biometric equivalent of passing the hash to gain access.
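A minimal sketch of that pass-the-hash analogy, under assumed details: a hypothetical server stores a SHA-256 digest of the biometric template, and a deliberately flawed protocol accepts the digest itself as proof. The template string and function names are invented for illustration.

```python
import hashlib

def digest(template_bytes):
    """The 'digital signature' of the biometric: a hash of the template."""
    return hashlib.sha256(template_bytes).hexdigest()

# The server stores only the digest, never the raw fingerprint data.
stored = digest(b"alice-fingerprint-template")  # hypothetical template

def naive_authenticate(submitted_digest):
    """Flawed protocol: whoever presents the stored digest is let in."""
    return submitted_digest == stored

# An attacker who exfiltrates the stored digest never needs the finger:
stolen = stored
print(naive_authenticate(stolen))  # True: the hash itself became the credential
```

Unlike a password, the underlying secret here cannot be rotated after a breach; the user cannot be issued a new fingerprint, which is what makes replayable biometric digests especially damaging.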

Just as with any form of authentication, biometrics is most powerful as part of a two-factor configuration, where it supplements another form such as a passphrase. A cord of two strands is not easily broken; an authentication framework with two factors is not easily fooled.
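A two-factor gate of the kind described above could be sketched like this. Everything here is an assumption for illustration: the toy similarity measure, the plain SHA-256 passphrase hash (a real system would use a salted, slow KDF such as bcrypt or Argon2), and the sample values.

```python
import hashlib
import hmac

def check_passphrase(stored_hash, passphrase):
    """Compare a candidate passphrase hash in constant time."""
    candidate = hashlib.sha256(passphrase.encode()).hexdigest()
    return hmac.compare_digest(stored_hash, candidate)

def similarity(a, b):
    """Fraction of positions where two equal-length feature vectors agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def two_factor_login(template, scan, stored_hash, passphrase, threshold=0.9):
    """Both factors must pass; either one alone is insufficient."""
    return (similarity(scan, template) >= threshold
            and check_passphrase(stored_hash, passphrase))

template = [1, 0, 1, 1, 0, 1, 0, 1]                       # hypothetical enrollment
stored = hashlib.sha256(b"correct horse").hexdigest()     # hypothetical passphrase

print(two_factor_login(template, template, stored, "correct horse"))  # True
print(two_factor_login(template, template, stored, "wrong"))          # False
```

A stolen biometric digest fails without the passphrase, and a phished passphrase fails without the finger: each strand covers the other's weakness.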