In biometrics, the false acceptance rate (FAR) is a security metric that measures how often a biometric verification or identification system incorrectly accepts an unauthorized user. This metric is a function of the decision threshold used in template matching and nearest-neighbor methods. A higher FAR indicates a less secure system, while a lower FAR indicates a more secure one.
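The dependence on the decision threshold can be sketched as follows. This is a minimal illustration with hypothetical scores and a hypothetical `accept` helper, not the interface of any particular biometric system: a match score is compared against a tunable threshold, and moving that threshold trades FAR against FRR.

```python
def accept(similarity_score: float, threshold: float) -> bool:
    """Accept the claimed identity if the match score clears the threshold."""
    return similarity_score >= threshold

# Lowering the threshold admits more impostor attempts (raising FAR);
# raising it rejects more genuine attempts (raising FRR).
print(accept(0.72, threshold=0.80))  # a score below the threshold is rejected
```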
When evaluating a biometric system, the FAR should be balanced against the false rejection rate (FRR), which measures how often a biometric identification system rejects an authorized user. Ideally, a biometric security system should have both a low FAR and a low FRR, while still verifying or identifying a high percentage of legitimate users.
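The two rates can be estimated empirically at a given threshold. The sketch below uses made-up score lists purely for illustration: `impostor_scores` are match scores from unauthorized users, and `genuine_scores` are scores from authorized users.

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    # FAR: fraction of impostor attempts wrongly accepted (score >= threshold)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    # FRR: fraction of genuine attempts wrongly rejected (score < threshold)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Illustrative (fabricated) score distributions:
impostor_scores = [0.10, 0.35, 0.62, 0.20, 0.45]
genuine_scores = [0.55, 0.80, 0.90, 0.70, 0.95]

far, frr = far_frr(impostor_scores, genuine_scores, threshold=0.60)
print(far, frr)  # 0.2 0.2 — one impostor accepted, one genuine user rejected
```

Sweeping the threshold over a range of values traces out the FAR/FRR trade-off curve; the threshold where the two rates meet is often reported as the equal error rate (EER).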
Every type of biometric authentication has its own FAR and FRR, depending on the technology used and the specific use case. For example, fingerprint scanners tend to have a higher FAR than iris scanners, in part because a fingerprint reader can be spoofed by presenting a fake finger.
If the biometric system matches the spoof against its database of enrolled fingerprints, the unauthorized user gains access. This is known as a type II error in biometrics, and it is considered the most serious type of error because it grants unauthorized users access to systems that are expressly trying to keep them out. A low FAR of 0.1%, for example, means that out of 1,000 attempts by unauthorized users to gain access, the system will accept only one. The lower the FAR, the more secure the system is against such attempts.
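The arithmetic behind that example is straightforward; the numbers below simply restate the 1-in-1,000 figure from the text.

```python
# A FAR of 0.1% corresponds to roughly 1 acceptance per 1,000 impostor attempts.
false_accepts = 1
impostor_attempts = 1000

far = false_accepts / impostor_attempts
print(f"{far:.4f} = {far:.1%}")  # 0.0010 = 0.1%
```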