Wednesday, November 21, 2018

"Researchers Created Fake 'Master' Fingerprints to Unlock Smartphones"

Well, this doesn't sound good.
But maybe, looking on the bright side, it means criminal masterminds won't have to chop off your digits to access your phone.

From Motherboard:

It’s the same principle as a master key, but applied to biometric identification with a high rate of success. 
AI can generate fake fingerprints that work as master keys for smartphones that use biometric sensors. According to the researchers that developed the technique, the attack can be launched against individuals with “some probability of success.”

Biometric IDs seem to be about as close to a perfect identification system as you can get. These types of IDs are based on the unique physical traits of individuals, such as fingerprints, irises, or even the veins in your hand. In recent years, however, security researchers have demonstrated that it is possible to fool many, if not most, forms of biometric identification.

In most cases, spoofing biometric IDs requires making a fake face or finger vein pattern that matches an existing individual. In a paper posted to arXiv earlier this month, however, researchers from New York University and the University of Michigan detailed how they trained a machine learning algorithm to generate fake fingerprints that can serve as a match for a “large number” of real fingerprints stored in databases.

Known as DeepMasterPrints, these artificially generated fingerprints are similar to the master key for a building. To create a master fingerprint, the researchers fed an artificial neural network—a type of computing architecture loosely modeled on the human brain that “learns” based on input data—the real fingerprints from over 6,000 individuals. Although the researchers were not the first to consider creating master fingerprints, they were the first to use a machine learning algorithm to create working master prints.

A “generator” neural net then analyzed these fingerprint images so it could begin producing its own. These synthetic fingerprints were then fed to a “discriminator” neural net that determined if they were genuine or fake. If they were determined to be fake, the generator then made a small adjustment to the image and tried again. This process was repeated thousands of times until the generator was able to successfully fool the discriminator—a setup known as a generative adversarial network, or GAN......
....MORE
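
If you're curious what that generator/discriminator loop actually looks like, here is a minimal, illustrative sketch in PyTorch. To be clear, this is not the researchers' DeepMasterPrints code: the image size, network shapes, training length, and the stand-in "real fingerprint" batches are all placeholder assumptions, included only to show the GAN mechanics the excerpt describes.

```python
import torch
import torch.nn as nn

# Toy dimensions -- placeholders, not the sizes used in the DeepMasterPrints paper.
LATENT_DIM = 100       # size of the random noise vector fed to the generator
IMG_DIM = 64 * 64      # flattened "fingerprint" image
BATCH_SIZE = 32

# Generator: turns random noise into a synthetic fingerprint image.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256),
    nn.LeakyReLU(0.2),
    nn.Linear(256, IMG_DIM),
    nn.Tanh(),                 # pixel values in [-1, 1]
)

# Discriminator: scores an image as real (1) or fake (0).
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256),
    nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
    nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

def real_fingerprint_batch():
    # Stand-in for a batch of real fingerprint images; in the actual work this
    # would come from a dataset of scanned prints (the paper used prints from
    # over 6,000 individuals).
    return torch.randn(BATCH_SIZE, IMG_DIM).clamp(-1, 1)

for step in range(1000):
    real = real_fingerprint_batch()
    noise = torch.randn(BATCH_SIZE, LATENT_DIM)
    fake = generator(noise)

    # 1) Train the discriminator to tell real prints from generated ones.
    opt_d.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(BATCH_SIZE, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(BATCH_SIZE, 1))
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator -- the "small adjustment
    #    and try again" loop described in the excerpt.
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(BATCH_SIZE, 1))
    g_loss.backward()
    opt_g.step()
```

Repeat that loop enough times and the generator gets good at producing prints the discriminator can't distinguish from real ones; the DeepMasterPrints twist is then searching that generator's output space for prints that match as many *other* people's enrolled fingerprints as possible.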
HT: naked capitalism