Speakers:

Alex Polyakov

Testing Facial Recognition System Security for Smart Home Devices

Date:

Monday, May 11, 2020

Time:

4:15 pm

Summary:

Did you know that it is possible to craft glasses that bypass a facial recognition system? Facial recognition technology is on the rise, and you can find it in many areas of human activity, including social media, smart homes, ATMs, and stores. Recently, researchers have discovered that deep learning algorithms are vulnerable to a class of attacks called adversarial examples. Our client, a smart home solution provider, asked us to test software and hardware to determine whether these attacks are a real threat or merely academic research, and to select the most secure solution available on the market.

Here is how we conducted the security assessment. Facial recognition systems rely on their own specific deep learning models. These systems usually operate in a physical environment, and their attack surface differs from the digital one presented in research papers. Furthermore, published attacks and defensive measures are reported for varying models, datasets, and conditions, so even examining roughly 100 research papers on the subject does not reveal the real situation. To test properly, we composed our own attack taxonomy and used it to check the effectiveness of recent approaches to attacking facial recognition systems. I will present our research, conducted in a real environment with various cameras and algorithms, and show how to protect production systems from this kind of attack.
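The adversarial examples mentioned above exploit a model's gradients: a small, targeted perturbation of the input flips the prediction. A minimal sketch of the idea, using a toy linear classifier rather than any real facial recognition model (all names and values here are illustrative; image-domain attacks like the adversarial glasses work the same way, just in far higher dimensions):

```python
def sign(v):
    # Sign of a scalar, used for the L-infinity (FGSM-style) step.
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def predict(w, b, x):
    """Return 1 if the linear score w.x + b is positive, else 0."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else 0

def fgsm_flip(w, b, x, eps):
    """Perturb each feature of x against the gradient of the score.

    For a linear model the gradient of the score w.r.t. x is simply w,
    so x' = x - eps * sign(w) lowers the score as fast as possible
    under a per-feature budget of eps.
    """
    return [xi - eps * sign(wi) for xi, wi in zip(x, w)]

# Toy model and input: the clean input is confidently classified as 1.
w, b = [2.0, -1.0], 0.0
x = [1.0, 1.0]
x_adv = fgsm_flip(w, b, x, eps=0.6)

print(predict(w, b, x))      # clean input: class 1
print(predict(w, b, x_adv))  # perturbed input: class 0
```

A perturbation of at most 0.6 per feature is enough to flip the toy classifier; against deep models the same gradient-sign step is applied per pixel, which is why physical artifacts such as printed glasses frames can carry the attack into the real world.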

Ready to attend?

Register now! Join your peers.
