{"id":12806,"date":"2016-08-22T09:00:24","date_gmt":"2016-08-22T13:00:24","guid":{"rendered":"https:\/\/www.kaspersky.com\/blog\/?p=12806"},"modified":"2019-11-15T06:54:16","modified_gmt":"2019-11-15T11:54:16","slug":"bad-facial-recognition","status":"publish","type":"post","link":"https:\/\/www.kaspersky.com\/blog\/bad-facial-recognition\/12806\/","title":{"rendered":"The dark side of facial recognition technology"},"content":{"rendered":"<p>You can change your name or use a pseudonym. You can also edit or delete your social media accounts. But you cannot change your face so easily. Facial recognition <a href=\"https:\/\/www.kaspersky.com\/blog\/good-facial-recognition\/12796\/\" target=\"_blank\" rel=\"noopener nofollow\"> helps us solve a lot of problems<\/a> \u2014 and it simultaneously creates plenty of new ones. In this post, we discuss the threats that come with the global spread of such systems.<\/p>\n<h3>1. Losing the right to privacy \u2014 on a global scale<\/h3>\n<p>The FBI officially maintains the Next Generation Identification-Interstate Photo System (NGI-IPS) \u2014 a database containing photos of people accused or convicted in civil and criminal proceedings. That\u2019s OK, isn\u2019t it?<\/p>\n<p>Not at all! In May, the US Government Accountability Office <a href=\"http:\/\/www.gao.gov\/assets\/680\/677098.pdf\" target=\"_blank\" rel=\"noopener nofollow\"> audited<\/a> the Bureau and found that the FBI\u2019s database of <i>412 million<\/i> photos includes pictures of people who were never the target of any investigation. 
The Bureau even has a separate unit that is in charge of facial recognition: Facial Analysis, Comparison, and Evaluation (<a href=\"https:\/\/www.fbi.gov\/services\/records-management\/foipa\/privacy-impact-assessments\/facial-analysis-comparison-and-evaluation-face-services-unit\" target=\"_blank\" rel=\"noopener nofollow\"> FACE<\/a>) Services.<\/p>\n<p>As it turned out, FBI representatives made arrangements with several states and obtained photos from driver\u2019s licenses and passport and visa applications, as well as images of criminal suspects and convicts. The database also includes photos of foreign nationals \u2014 potentially about 100 million of them.<\/p>\n<p>The FBI actively uses facial recognition during investigations. We\u2019ve <a href=\"https:\/\/www.kaspersky.com\/blog\/good-facial-recognition\/12796\/\" target=\"_blank\" rel=\"noopener nofollow\"> already written<\/a> about how this approach pays off. But the situation is more complicated than that. Facial recognition technology is young and imperfect, and the FBI\u2019s system is no exception: It <a href=\"https:\/\/www.theguardian.com\/technology\/2016\/apr\/08\/facial-recognition-technology-racial-bias-police\" target=\"_blank\" rel=\"noopener nofollow\"> has racial biases<\/a> and at best <a href=\"https:\/\/www.theguardian.com\/us-news\/2016\/jun\/15\/fbi-facial-recognition-software-photo-database-privacy\" target=\"_blank\" rel=\"noopener nofollow\"> achieves 80%-85% accuracy<\/a>. 
At the same time, the FBI deliberately <a href=\"https:\/\/www.theguardian.com\/us-news\/2016\/jun\/15\/fbi-facial-recognition-software-photo-database-privacy\" target=\"_blank\" rel=\"noopener nofollow\"> covered up<\/a> the scale of its facial recognition usage, contrary to the requirements of the Privacy Impact Assessment.<\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"500\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">10 ways <a href=\"https:\/\/twitter.com\/hashtag\/facial?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#facial<\/a> recognition can be used for <a href=\"https:\/\/twitter.com\/hashtag\/good?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#good<\/a> <a href=\"https:\/\/t.co\/pkaIiUJHvQ\" target=\"_blank\" rel=\"noopener nofollow\">https:\/\/t.co\/pkaIiUJHvQ<\/a> <a href=\"https:\/\/t.co\/VzlvLtcHWd\" target=\"_blank\" rel=\"noopener nofollow\">pic.twitter.com\/VzlvLtcHWd<\/a><\/p>\n<p>\u2014 Kaspersky (@kaspersky) <a href=\"https:\/\/twitter.com\/kaspersky\/status\/766647220351430656?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">August 19, 2016<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Also noteworthy: The city council of Moscow and Russian law enforcement try to keep abreast of relevant technology and are getting ready to implement FaceN technology (developers of this system also provided the code for <a href=\"https:\/\/www.kaspersky.com\/blog\/findface-experiment\/11916\/\" target=\"_blank\" rel=\"noopener nofollow\"> FindFace<\/a> \u2014 a service that lets people search for other people using their photos). 
These new systems will be connected to hundreds of thousands of surveillance cameras in Moscow.<\/p>\n<p>As the Russian news outlet Meduza <a href=\"https:\/\/meduza.io\/feature\/2016\/07\/07\/konets-chastnoy-zhizni\" target=\"_blank\" rel=\"noopener nofollow\"> reports<\/a>, <i>no other city in the world has an analogous system. The algorithm can compare people on the streets with a database of criminals, but that\u2019s not all. It can also detect individuals in any part of the city and match their images with social network accounts, which usually contain a lot of personal information.<\/i><\/p>\n<p>We should also point out that at the beginning of this year, the Russian Senate obliged Russian courts to accept photos and videos as legal evidence. Prior to that, the decision was at the court\u2019s discretion.<\/p>\n<h3>2. Abuse by law enforcement<\/h3>\n<p>Facial recognition makes mistakes. People in charge of these systems misuse them \u2014 it\u2019s a known fact. For example, in August 2015, the New York Times <a href=\"http:\/\/www.nytimes.com\/2015\/08\/13\/us\/facial-recognition-software-moves-from-overseas-wars-to-local-police.html?_r=0\" target=\"_blank\" rel=\"noopener nofollow\"> reported<\/a> that San Diego police gathered images of guilty and innocent people without their permission.<\/p>\n<p>Aaron Harvey, a 27-year-old African-American living in San Diego, claimed that police treated him with prejudice. Harvey lives in one of the most violent areas of the city. That is probably why the police stopped him more than 50 times and said he was a suspected gang member. When he refused to allow the officer to take his photo, the officer boasted that he could do so anyway.<\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"500\" data-dnt=\"true\">\n<p lang=\"es\" dir=\"ltr\"><a href=\"https:\/\/twitter.com\/hashtag\/Man?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#Man<\/a> vs. 
<a href=\"https:\/\/twitter.com\/hashtag\/machine?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#machine<\/a>: facial recognition \u2013 <a href=\"https:\/\/t.co\/5WDfsSQVRC\" target=\"_blank\" rel=\"noopener nofollow\">https:\/\/t.co\/5WDfsSQVRC<\/a> <a href=\"https:\/\/t.co\/2LSme8O9Ym\" target=\"_blank\" rel=\"noopener nofollow\">pic.twitter.com\/2LSme8O9Ym<\/a><\/p>\n<p>\u2014 Kaspersky (@kaspersky) <a href=\"https:\/\/twitter.com\/kaspersky\/status\/748919096952094720?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">July 1, 2016<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>\u201cHe said: \u2018We\u2019re going to do this either legally or illegally,\u2019 and pulled me out of the car\u201d \u2014 Harvey, describing the incident to the New York Times.<\/p>\n<p>Earlier, in 2013, authorities in <a href=\"http:\/\/www.nytimes.com\/2015\/08\/13\/us\/facial-recognition-software-moves-from-overseas-wars-to-local-police.html?_r=0\" target=\"_blank\" rel=\"noopener nofollow\"> Boston also tested a facial recognition system<\/a>. It was linked to surveillance cameras that covertly scanned people\u2019s faces during concerts and other outdoor events. At the end of the testing period, the project was scrapped for ethical reasons. But one city\u2019s trial is one thing, and global implementation is quite another: Facial recognition systems are now being widely used by government agencies.<\/p>\n<h3>3. Corporations spying on everybody<\/h3>\n<p>Some organizations own face databases much bigger than the FBI\u2019s collection. Social networks top the list: Facebook, Instagram (which belongs to Facebook), Google (with its Google+), VK.com, and other social sites. 
The majority of these companies <a href=\"https:\/\/www.kaspersky.com\/blog\/how-facial-recognition-works\/12073\/\" target=\"_blank\" rel=\"noopener nofollow\"> have their own facial recognition solutions<\/a> that they constantly develop and improve.<\/p>\n<p>Microsoft is now working on a similar technology for the <a href=\"https:\/\/blogs.windows.com\/buildingapps\/2016\/06\/28\/familynotes-using-the-camera-to-detect-a-user\/\" target=\"_blank\" rel=\"noopener nofollow\"> FamilyNotes<\/a> app that will enable the software to distinguish one user from another with the help of a camera built into a laptop or a tablet. Microsoft develops one of the most popular operating systems in the world, and this app will substantially expand the company\u2019s database of faces.<\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"500\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">You can\u2019t replace your face, says facial recognition \u2013 <a href=\"https:\/\/t.co\/tW6vdmxPWE\" target=\"_blank\" rel=\"noopener nofollow\">https:\/\/t.co\/tW6vdmxPWE<\/a> <a href=\"https:\/\/t.co\/dKXmOVdJ33\" target=\"_blank\" rel=\"noopener nofollow\">pic.twitter.com\/dKXmOVdJ33<\/a><\/p>\n<p>\u2014 Kaspersky (@kaspersky) <a href=\"https:\/\/twitter.com\/kaspersky\/status\/723502848114307072?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">April 22, 2016<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Facebook\u2019s facial recognition system is one of the most accurate in the world. The company quietly <a href=\"http:\/\/www.cbsnews.com\/news\/how-to-disable-facebooks-new-facial-recognition-feature\/\" target=\"_blank\" rel=\"noopener nofollow\"> launched<\/a> this tool in 2012 and kept it on by default for the majority of users. 
Later, the company faced dozens of lawsuits \u2014 that <a href=\"http:\/\/www.theverge.com\/2016\/5\/5\/11605068\/facebook-photo-tagging-lawsuit-biometric-privacy\" target=\"_blank\" rel=\"noopener nofollow\"> number is still growing<\/a>, and Google is also being pursued in court <a href=\"http:\/\/www.ibtimes.com\/google-gets-sued-over-face-recognition-joining-facebook-shutterfly-battle-over-2330278\" target=\"_blank\" rel=\"noopener nofollow\"> on similar charges<\/a>. As a result, <a href=\"https:\/\/www.theguardian.com\/technology\/2016\/may\/11\/facebook-moments-facial-recognition-app-europe\" target=\"_blank\" rel=\"noopener nofollow\"> Facebook had to disable facial recognition features<\/a> in certain regions.<\/p>\n<p>We should also note that Facebook has a one-sided approach to this issue: For example, its knowledge base has zero articles on how to disable the facial recognition function \u2014 which, by the way, <a href=\"http:\/\/www.cbsnews.com\/news\/how-to-disable-facebooks-new-facial-recognition-feature\/\" target=\"_blank\" rel=\"noopener nofollow\"> is not a one-click operation<\/a>.<\/p>\n<p>Even if you are not a member of any social network (or if you avoided uploading your real photos to any of them), your face can still get into a social media company\u2019s database. Last year, a Chicago citizen <a href=\"http:\/\/fortune.com\/2015\/06\/18\/shutterfly-lawsuit-facial-recognition\/\" target=\"_blank\" rel=\"noopener nofollow\"> sued<\/a> photo-book service Shutterfly because the site added his photo to its database without his consent. A third party (a friend, most likely) had uploaded his photo to Shutterfly and tagged the image with his name.<\/p>\n<h3>4. Anybody can find you<\/h3>\n<p>Any facial recognition system that is available to everyone can be used as a powerful tool for <i>lynch law<\/i>, or vigilante justice. This year, for example, two young men set a fire in a building lobby in Saint Petersburg. 
After they finished, the pyromaniacs caused a ruckus in the elevator of the same building. Cameras in the elevator and around the neighborhood recorded how the duo entertained themselves.<\/p>\n<p>When local police declined to open a criminal case, the tenants of the house took matters into their own hands: They took screenshots showing the culprits\u2019 faces and used <a href=\"https:\/\/www.kaspersky.com\/blog\/findface-experiment\/11916\/\" target=\"_blank\" rel=\"noopener nofollow\"> FindFace<\/a> to locate them on social networks.<\/p>\n<p>The amateur detectives reported their findings to the police, and as a result, the young men were charged. In a <a href=\"https:\/\/www.youtube.com\/watch?v=xhjg4hubpb0\" target=\"_blank\" rel=\"noopener nofollow\"> report<\/a> by Ren TV (a Russian TV channel), one of the tenants said that they had enough data and evidence to send messages to the hooligans\u2019 friends and to their places of study and work.<\/p>\n<p>Though the Petersburgers had enough patience to ask police for help, not all Internet users are as level-headed. And where there\u2019s a will, there\u2019s a way \u2014 to bully people. If you\u2019ve heard of FindFace, you know about its most infamous use case: when members of the anonymous 2ch image board <a href=\"https:\/\/www.kaspersky.com\/blog\/findface-deanon\/11921\/\" target=\"_blank\" rel=\"noopener nofollow\"> used it to hunt porn actresses online<\/a>. 
The trolls found the women\u2019s social media pages and sent scandalous messages to their friends and relatives along with corresponding images.<\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"500\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\"><a href=\"https:\/\/twitter.com\/hashtag\/Porn?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#Porn<\/a> stars and <a href=\"https:\/\/twitter.com\/hashtag\/sex?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#sex<\/a> workers targeted with facial recognition <a href=\"https:\/\/twitter.com\/hashtag\/app?src=hash&amp;ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">#app<\/a> \u2013 <a href=\"https:\/\/t.co\/3dEOWKPd8V\" target=\"_blank\" rel=\"noopener nofollow\">https:\/\/t.co\/3dEOWKPd8V<\/a> <a href=\"https:\/\/t.co\/HBf8KztWNa\" target=\"_blank\" rel=\"noopener nofollow\">pic.twitter.com\/HBf8KztWNa<\/a><\/p>\n<p>\u2014 Kaspersky (@kaspersky) <a href=\"https:\/\/twitter.com\/kaspersky\/status\/726111488893702146?ref_src=twsrc%5Etfw\" target=\"_blank\" rel=\"noopener nofollow\">April 29, 2016<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>At the same time, FindFace founder Maxim Perlin <a href=\"https:\/\/meduza.io\/feature\/2016\/07\/07\/konets-chastnoy-zhizni\" target=\"_blank\" rel=\"noopener nofollow\"> is sure<\/a> that nowadays <b>people literally need to pay to preserve their privacy<\/b>. In a television interview, he said that people who want to wipe their data from FindFace\u2019s database will have to buy a premium account. A month of privacy on the service costs about $8.<\/p>\n<h3>5. There is a thin line between security and disaster<\/h3>\n<p>Many experts are sure: biometrics will replace passwords and make the world even more secure. 
So in the future, people will let the systems scan their irises, fingerprints, and even face prints instead of entering complicated combinations of symbols.<\/p>\n<p><a href=\"https:\/\/support.microsoft.com\/en-us\/help\/17215\/windows-10-what-is-hello\" target=\"_blank\" rel=\"noopener nofollow\">Microsoft is already developing<\/a> technology that allows users to authenticate with selfies. NEC is <a href=\"http:\/\/www.biometricupdate.com\/201607\/nec-trials-facial-recognition-based-cashless-payment-wins-award-for-ticket-id-system\" target=\"_blank\" rel=\"noopener nofollow\"> researching<\/a> the use of facial recognition to secure electronic payments. MasterCard is working on a <a href=\"https:\/\/www.youtube.com\/watch?v=xGMlz-0gvjs\" target=\"_blank\" rel=\"noopener nofollow\"> selfie identification system<\/a> that lets users send money without passwords.<\/p>\n<p>We\u2019ve <a href=\"https:\/\/www.kaspersky.com\/blog\/stealing-digital-identity\/10386\/\" target=\"_blank\" rel=\"noopener nofollow\"> already written<\/a> about <a href=\"https:\/\/www.kaspersky.com\/blog\/fingerprints-sensors-security\/10951\/\" target=\"_blank\" rel=\"noopener nofollow\"> the downsides of fingerprint identification<\/a>, so let\u2019s now focus on weaknesses in facial recognition. It really comes down to recent advances in 3D printing: Today you can <a href=\"http:\/\/imgur.com\/gallery\/yzoyq\" target=\"_blank\" rel=\"noopener nofollow\"> print a highly realistic copy of a person\u2019s face<\/a>. Developers of new identification systems will need to take that into account if they want to create truly secure solutions.<\/p>\n<p>For example, MasterCard and Google ask users to blink \u2014 a simple action that stops fraudsters from fooling the system with the help of a 3D-printed face or even just a photo. 
Unfortunately, Google\u2019s solution failed \u2014 <a href=\"http:\/\/www.droiddog.com\/android-blog\/2012\/08\/jelly-beans-face-unlock-liveness-check-easily-fooled-with-basic-photo-editing\/\" target=\"_blank\" rel=\"noopener nofollow\"> people managed to bypass the security measure<\/a> with the help of a simple animated picture. MasterCard\u2019s system is under development, so nobody knows yet if it can be fooled the same way.<\/p>\n<div id=\"attachment_12808\" style=\"width: 2058px\" class=\"wp-caption alignright\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/92\/2016\/08\/06021716\/3d-printed-face.jpg\"><img decoding=\"async\" aria-describedby=\"caption-attachment-12808\" class=\"size-full wp-image-12808\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/92\/2016\/08\/06021716\/3d-printed-face.jpg\" width=\"2048\" height=\"1536\"><\/a><p id=\"caption-attachment-12808\" class=\"wp-caption-text\">\u201cMy coworker 3D printed my face\u201d \u2013 image from Imgur<\/p><\/div>\n<h3>6. Don\u2019t share your face with any Tom, Dick, or Harry<\/h3>\n<p>You may have heard of Anaface, a <a href=\"http:\/\/www.anaface.com\/\" target=\"_blank\" rel=\"noopener nofollow\"> website<\/a> that analyzes your photo and rates your level of attractiveness. It uses symmetry as the main criterion \u2014 quite a questionable standard, don\u2019t you think? For example, Angelina Jolie <a href=\"http:\/\/www.forbes.com\/sites\/kashmirhill\/2009\/07\/01\/free-computer-analysis-of-your-beauty-whats-the-catch\/%25231aa3adea3dde\" target=\"_blank\" rel=\"noopener nofollow\"> rated<\/a> only 8.4 out of 10 on Anaface. But the site\u2019s accuracy is not its only problem.<\/p>\n<p>First, Anaface\u2019s owners admitted they launched the project to encourage people to seek plastic surgery. Well, at least they\u2019re giving it to us straight.<\/p>\n<p>Second, the website\u2019s terms and conditions are hard to read and really shady. 
They are displayed in a very small window, so a user has to scroll a lot to read the more than 7,000 words of small print. That\u2019s why many will probably miss that every user of the site provides it with a \u201cnon-exclusive, transferable, sub-licensable, royalty-free, worldwide license\u201d to use all photos uploaded to Anaface. In plain English: The service can sell the photos people upload without any obligation to pay the real owners of the images.<\/p>\n<p>At the same time, users pledge to upload only their own photos: \u201cYou may not post, upload, display or otherwise make available Content that contains video, audio photographs, or images of another person without his or her permission (or in the case of a minor, the minor\u2019s legal guardian);\u201d The conditions also include vague remarks about privacy and the possibility of removing photos after registering a user account \u2014 but nobody can do that on Anaface; the site doesn\u2019t allow it.<\/p>\n<p>All in all, everybody collects photos: governments, corporations, and even regular people. Nowadays, everyone can use and misuse facial recognition systems \u2014 and all we can do is try to <a href=\"https:\/\/www.kaspersky.com\/blog\/camouflaging-from-global-surveillance\/10133\/\" target=\"_blank\" rel=\"noopener nofollow\"> hide from them<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Lynch law, loss of basic privacy, disgusting marketing, digital identity theft \u2014 how else can facial recognition be misused? 
<\/p>\n","protected":false},"author":522,"featured_media":12807,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[5,1788,1789],"tags":[1405,301,1565,1233,422],"class_list":{"0":"post-12806","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-news","8":"category-privacy","9":"category-technology","10":"tag-3d-printing","11":"tag-facial-recognition","12":"tag-findface","13":"tag-fingerprint-sensors","14":"tag-threats"},"hreflang":[{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/bad-facial-recognition\/12806\/"},{"hreflang":"en-us","url":"https:\/\/usa.kaspersky.com\/blog\/bad-facial-recognition\/7547\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/bad-facial-recognition\/7575\/"},{"hreflang":"es-mx","url":"https:\/\/latam.kaspersky.com\/blog\/bad-facial-recognition\/7545\/"},{"hreflang":"es","url":"https:\/\/www.kaspersky.es\/blog\/bad-facial-recognition\/8970\/"},{"hreflang":"it","url":"https:\/\/www.kaspersky.it\/blog\/bad-facial-recognition\/8809\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/bad-facial-recognition\/12823\/"},{"hreflang":"tr","url":"https:\/\/www.kaspersky.com.tr\/blog\/bad-facial-recognition\/2451\/"},{"hreflang":"pt-br","url":"https:\/\/www.kaspersky.com.br\/blog\/bad-facial-recognition\/6487\/"},{"hreflang":"pl","url":"https:\/\/plblog.kaspersky.com\/bad-facial-recognition\/5357\/"},{"hreflang":"de","url":"https:\/\/www.kaspersky.de\/blog\/bad-facial-recognition\/8503\/"},{"hreflang":"ja","url":"https:\/\/blog.kaspersky.co.jp\/bad-facial-recognition\/12343\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/bad-facial-recognition\/12823\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/bad-facial-recognition\/12806\/"},{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/bad-facial-recognition\/
12806\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/www.kaspersky.com\/blog\/tag\/3d-printing\/","name":"3d printing"},"_links":{"self":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/12806","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/users\/522"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/comments?post=12806"}],"version-history":[{"count":2,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/12806\/revisions"}],"predecessor-version":[{"id":30178,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/12806\/revisions\/30178"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/media\/12807"}],"wp:attachment":[{"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/media?parent=12806"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/categories?post=12806"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kaspersky.com\/blog\/wp-json\/wp\/v2\/tags?post=12806"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}