Facial Recognition Could be The Next Access Point For Hackers

Facial recognition is a form of biometric technology: a biological measurement, or physical characteristic, that can be used to identify an individual. Researchers note that the shape of an ear, the way someone sits and walks, unique body odors, the veins in one’s hands, and even facial contortions are similarly unique identifiers.

Because physical characteristics are relatively fixed and individualized — even in the case of twins — they are being used to replace or at least augment password systems for computers, phones, and restricted access rooms and buildings.

Biometric scanners are becoming increasingly sophisticated. Apple’s iPhone X, for example, projects 30,000 infrared dots onto a user’s face; the resulting pattern of reflections captures the 3D shape of the face, and the phone authenticates the user by matching that pattern against the enrolled one.

The chance of mistaken identity is one in a million, according to Apple. This advance should be welcome news: it promises to relieve us of constantly having to remember passwords, and of the risk of carelessly exposing them to cybercriminals.
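In software terms, this style of authentication boils down to comparing a freshly captured biometric template against the enrolled one and accepting only if they are similar enough. Here is a minimal sketch using made-up four-dimensional embeddings and a hypothetical threshold; real systems use far richer templates and carefully tuned thresholds:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def authenticate(probe, enrolled_template, threshold=0.8):
    """Accept the probe only if it is similar enough to the enrolled template.
    The threshold trades false accepts (impostors let in) against false
    rejects (the owner locked out)."""
    return cosine_similarity(probe, enrolled_template) >= threshold

# Toy 4-dimensional "face embeddings" (purely illustrative values).
enrolled = np.array([0.9, 0.1, 0.4, 0.3])
same_person = np.array([0.88, 0.12, 0.41, 0.29])  # near-identical vector
impostor = np.array([0.1, 0.9, 0.2, 0.7])         # very different vector

print(authenticate(same_person, enrolled))  # True
print(authenticate(impostor, enrolled))     # False
```

The security question the rest of this article raises is exactly about this step: a sufficiently faithful physical replica of a face could produce a probe vector that clears the threshold.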

In the Bitglass survey Data Games: Security Blind Spots, which polled 129 white hat and black hat hackers attending Black Hat 2017, password-protected documents were ranked as the least effective security tool, and facial recognition was ranked the second-worst tool overall.

After all, you would have to be living on the moon not to have heard of, or felt, the impact of these criminals’ nefarious activities, which they are constantly refining. Yet while everybody is harping on the privacy concerns of facial recognition, few are really looking at its security fallout.

This is the main reason we have to be very wary of reports that researchers in Zürich have developed a computational method for automatically designing synthetic skin to match real individuals.

The process begins by scanning 3D facial expressions from a human subject. A novel optimization scheme then determines the shape of the synthetic skin, as well as the control parameters for the robotic head, so that it conforms to the human subject.

The result is a more realistic character: an animatronic face that closely resembles the human subject. By any standard this innovation should be lauded, but given the ugly situation we have on our hands, it comes with mixed feelings.

Dr. Bernd Bickel, a research scientist at Disney Research, Zürich, did not mince words when he said: “With our method, we can simply create a robotic clone of a real person.” To clarify what they can actually achieve, he added: “The custom digitally designed skin can be fabricated using injection molding and modern rapid prototyping technology. We 3D print a mold and use elastic silicone with properties similar to human skin as base material.”

It is also a cause for concern that their findings are in the public domain: they were presented at ACM SIGGRAPH 2012, the International Conference on Computer Graphics and Interactive Techniques. We might be tempted to downplay the significance of these findings, especially since some years have passed since the report was released.

That, however, should make us more cautious, not less. “Our research focuses on the creation of the silicone skin,” said Dr. Peter Kaufmann, a researcher at Disney Research, Zürich. He went on to explain: “We use computation to carefully modify the thickness of the skin across the face, leading to deformations that closely match those of the real human.”
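The actual optimization in the Disney work involves physically simulating the silicone, but the underlying idea of inverting a deformation model to recover a thickness profile can be sketched with a deliberately toy model. The inverse-proportional relation below is an assumption chosen for illustration, not the paper’s method, and all the numbers are made up:

```python
import numpy as np

# Toy model (assumption, not the paper's method): at each sample point on
# the face, surface deformation under actuation scales inversely with the
# local silicone thickness: deformation = k / thickness.
k = 2.0  # hypothetical material constant

# Deformations measured at four sample points from 3D scans of the real
# face (made-up numbers).
target_deformation = np.array([0.5, 1.0, 2.0, 4.0])

# Inverting the toy model gives the thickness that reproduces each target:
# a thinner skin where large deformations are needed, thicker elsewhere.
thickness = k / target_deformation

# Check: running the forward model with these thicknesses reproduces the
# scanned deformations.
assert np.allclose(k / thickness, target_deformation)
print(thickness)
```

The real system replaces this one-line inversion with a numerical optimization over a full physical simulation, but the goal is the same: choose the skin geometry so the fabricated face deforms like the scanned one.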

Looking at the situation critically, we may be unwittingly laying the ground for a season of heavy cyber attacks, especially as corporations like Amazon are reported to be selling facial recognition technology. Cybercriminals have repeatedly started from virtually nothing and worked their way up to cracking passwords and breaking encryption.

They have managed to access vital information protected by VPNs and other security measures, despite being given no obvious opening.

Andrew Bud, CEO of iProov, put it plainly: “When everything is done on the device, there is no supporting system to detect and defend against attacks.” His warning should be a clarion call that we may be forging ahead in the wrong direction.

There should not be an iota of doubt that facial recognition has clear advantages for certain applications. This laudable advancement in technology should be concentrated on effectively tackling crime and terrorism.

Photo Credit: YO! What Happened To Peace? Flickr via Compfight cc


About John Ejiofor

John Ejiofor is a curious life-researcher, whose quest to finding answers to life's pertinent questions has led to founding Nature Torch. This blog aims to debate and explore many questions about our earth -- including those a lot of people are uncomfortable with asking. He has been published on some of the internet's most respected websites, which you can find online.
