For $29.99 a month, a website called PimEyes offers a potentially dangerous superpower from the world of science fiction: the ability to search for a face, finding obscure photos that would otherwise have been as safe as the proverbial needle in the vast digital haystack of the internet.
A search takes mere seconds. You upload a photo of a face, check a box agreeing to the terms of service and then get a grid of photos of faces deemed similar, with links to where they appear on the internet. The New York Times used PimEyes on the faces of a dozen Times journalists, with their consent, to test its powers.
PimEyes found photos of every person, some that the journalists had never seen before, even when, in the image used to conduct the search, they were wearing sunglasses or a mask or their face was turned away from the camera.
PimEyes found one reporter dancing at an art museum event a decade ago, and crying after being proposed to, a photo that she didn’t particularly like but that the photographer had decided to use to advertise his business on Yelp. A tech reporter’s younger self was spotted in an awkward crush of fans at the Coachella music festival in 2011. A foreign correspondent appeared in countless wedding photos, evidently the life of every party, and in the blurry background of a photo taken of someone else at a Greek airport in 2019. A journalist’s past life in a rock band was unearthed, as was another’s preferred summer camp getaway.
Unlike Clearview AI, a similar facial recognition tool available only to law enforcement, PimEyes does not include results from social media sites. The sometimes surprising images that PimEyes surfaced came instead from news articles, wedding photography pages, review sites, blogs and pornography sites.

Most of the matches for the dozen journalists’ faces were correct. For the women, the incorrect photos often came from pornography sites, which was unsettling in the suggestion that it could be them. (To be clear, it was not them.)

A tech executive who asked not to be identified said he used PimEyes fairly regularly, primarily to identify people who harass him on Twitter and use their real photos on their accounts but not their real names. Another PimEyes user who asked to stay anonymous said he used the tool to find the real identities of actresses from pornographic films and to search for explicit photos of his Facebook friends.
The new owner of PimEyes is Giorgi Gobronidze, a 34-year-old academic who says his interest in advanced technology was sparked by Russian cyberattacks on his home country, Georgia.
Gobronidze said he believed that PimEyes could be a tool for good, helping people keep tabs on their online reputation. The journalist who disliked the photo that a photographer was using, for example, could now ask him to take it off his Yelp page.
PimEyes users are supposed to search only for their own faces or for the faces of people who have consented, Gobronidze said. But he said he was relying on people to act “ethically,” offering little protection against the technology’s erosion of the long-held ability to stay anonymous in a crowd. PimEyes has no controls in place to prevent users from searching for a face that is not their own and suggests a user pay a hefty fee to keep damaging photos from an ill-considered night from following him or her forever.
“It’s stalkerware by design no matter what they say,” said Ella Jakubowska, a policy adviser at European Digital Rights, a privacy advocacy group.
Under new management
Gobronidze grew up in the shadow of military conflict. His kindergarten was bombed during the civil war that ensued after Georgia declared independence from the Soviet Union in 1991. The country was effectively cut off from the world in 2008 when Russia invaded and the internet went down. The experiences inspired him to study the role of technological dominance in national security.
After stints working as a lawyer and serving in the Georgian army, Gobronidze got a master’s degree in international relations. He began his career as a professor in 2014, eventually landing at European University in Tbilisi, Georgia, where he still teaches.
In 2017, Gobronidze was in an exchange program, lecturing at a university in Poland, when one of his students introduced him, he said, to two “hacker” types — Lucasz Kowalczyk and Denis Tatina — who were working on a facial search engine. They were “brilliant masterminds,” he said, but “absolute introverts” who were not interested in public attention.
They agreed to speak with him about their creation, which eventually became PimEyes, for his academic research, Gobronidze said. He said they had explained how their search engine used neural net technology to map the features of a face, in order to match it to faces with similar measurements, and that the program was able to learn over time how to best determine a match.
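The process Gobronidze describes matches the general embedding-and-similarity approach used by most modern face search systems: a neural network reduces a face to a vector of measurements, and faces with nearby vectors are treated as likely matches. The sketch below illustrates that general idea only; it is not PimEyes’ code, and embed_face is a hypothetical stand-in for whatever model the service actually uses.

```python
# Loose illustration of embedding-based face matching (not PimEyes' code).
import numpy as np

def embed_face(image_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in: a neural network that maps a cropped face
    image to a fixed-length feature vector (the "measurements" of a face)."""
    raise NotImplementedError("replace with a real face-embedding model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two feature vectors; closer to 1.0 means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_matches(query_embedding, indexed_faces, top_k=10):
    """indexed_faces: iterable of (url, embedding) pairs gathered by a crawler.
    Returns the top_k most similar faces and the pages where they appear."""
    scored = [(cosine_similarity(query_embedding, emb), url)
              for url, emb in indexed_faces]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[:top_k]
```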
“I felt like a person from the Stone Age when I first met them,” Gobronidze said. “Like I was listening to science fiction.” He kept in touch with the founders, he said, and watched as PimEyes began getting more and more attention in the media, mostly of the scathing variety. In 2020, PimEyes claimed to have a new owner, who wished to stay anonymous, and the corporate headquarters were moved from Poland to the Seychelles, an island nation off East Africa that is a popular offshore tax haven.
Gobronidze said he “heard” sometime last year that this new owner of the site wanted to sell it. So he quickly set about gathering funds to make an offer, selling a seaside villa he had inherited from his grandparents and borrowing a large sum from his younger brother, Shalva Gobronidze, a software engineer at a bank. The professor would not reveal how much he had paid.
“It wasn’t as big an amount as someone might expect,” Gobronidze said.
In December, Gobronidze created a corporation, EMEARobotics, to acquire PimEyes and registered it in Dubai because of the United Arab Emirates’ low tax rate. He said he had retained most of the site’s small tech and support team, and hired a consulting firm in Belize to handle inquiries and regulatory questions.
Gobronidze has rented office space for PimEyes in a tower in downtown Tbilisi. It is still being renovated, light fixtures hanging loose from the ceiling.
Tatia Dolidze, a colleague of Gobronidze’s at European University, described him as “curious” and “stubborn,” and said she had been surprised when he told her that he was buying a face search engine.
“It was difficult to imagine Giorgi as a businessman,” Dolidze said by email.
Now he is a businessman who owns a company steeped in controversy, primarily around whether we have any special right of control over images of us that we never expected to be found this way. Gobronidze said facial recognition technology would be used to control people if governments and big companies had the only access to it.
And he is imagining a world where facial recognition is accessible to anyone.
‘Essentially extortion’
A few months back, Cher Scarlett, a computer engineer, tried out PimEyes for the first time and was confronted with a chapter of her life that she had tried hard to forget.
In 2005, when Scarlett was 19 and broke, she considered working in pornography. She traveled to New York City for an audition that was so humiliating and abusive that she abandoned the idea.
PimEyes unearthed the decades-old trauma, with links to where exactly the explicit photos could be found on the web. They were sprinkled in among more recent portraits of Scarlett, who works on labor rights and has been the subject of media coverage for a high-profile worker revolt she led at Apple.
“I had no idea up until that point that those images were on the internet,” she said.
Worried about how people would react to the images, Scarlett immediately began looking into how to get them removed, an experience she described in a Medium post and to CNN. When she clicked on one of the explicit photos on PimEyes, a menu popped up offering a link to the image, a link to the website where it appeared and an option to “exclude from public results” on PimEyes.
But exclusion, Scarlett quickly discovered, was available only to subscribers who paid for “PROtect plans,” which cost from $89.99 to $299.99 per month. “It’s essentially extortion,” said Scarlett, who eventually signed up for the most expensive plan.
Gobronidze disagreed with that characterization. He pointed to a free tool for deleting results from the PimEyes index that is not prominently advertised on the site. He also provided a receipt showing that PimEyes had refunded Scarlett for the $299.99 plan last month.
PimEyes has tens of thousands of subscribers, Gobronidze said, with most visitors to the site coming from the United States and Europe. It makes the bulk of its money from subscribers to its PROtect service, which includes help from PimEyes support staff in getting photos taken down from external sites.
PimEyes has a free “opt-out” as well, for people to have data about themselves removed from the site, including the search images of their faces. To opt out, Scarlett provided a photo of her teenage self and a scan of her government-issued identification. At the beginning of April, she received a confirmation that her opt-out request had been accepted.
“Your potential results containing your face are removed from our system,” the email from PimEyes said.
But when The Times ran a PimEyes search of Scarlett’s face with her permission a month later, there were more than 100 results, including the explicit ones.
Gobronidze said that this was a “sad story” and that opting out didn’t block a person’s face from being searched. Instead, it blocks from PimEyes’ search results any photos of faces “with a high similarity level” at the time of the opt-out, meaning people need to regularly opt out, with multiple photos of themselves, if they hope to stay out of a PimEyes search. Gobronidze said explicit photos were particularly tricky, comparing their tendency to proliferate online to the mythical beast Hydra.
“Cut one head and two others appear,” he said.
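Gobronidze’s explanation suggests that opting out works as a suppression list of face measurements captured at the moment of the request, not as a block on the person. A rough sketch of that assumed mechanism, using the same kind of embedding-plus-threshold matching described earlier, shows why a later photo taken from a different angle can score below the cutoff and reappear in results.

```python
# Rough sketch of similarity-based opt-out suppression (assumed mechanism,
# not PimEyes' implementation). Embeddings are NumPy vectors.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_suppressed(result_embedding: np.ndarray,
                  opted_out_embeddings: list[np.ndarray],
                  threshold: float = 0.9) -> bool:
    """Hide a result only if it closely matches one of the embeddings
    submitted at opt-out time. A new photo of the same person, shot from a
    different angle or indexed later, can fall below the threshold and
    still be shown -- which is why repeated opt-outs are needed."""
    return any(cosine_similarity(result_embedding, blocked) >= threshold
               for blocked in opted_out_embeddings)
```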
Gobronidze said he wanted “ethical usage” of PimEyes, meaning that people search only for their own faces and not those of strangers.
But PimEyes does little to enforce this goal, beyond a box that a searcher must click asserting that the face being uploaded is his or her own. Helen Nissenbaum, a Cornell University professor who studies privacy, called this “absurd,” unless the site had a searcher provide government identification, as Scarlett had to when she opted out.
“If it’s a useful thing to do, to see where our own faces are, we have to imagine that a company offering only that service is going to be transparent and audited,” Nissenbaum said.
PimEyes does no such audits, though Gobronidze said the site would bar a user whose search activity went “beyond anything logical,” citing as an example someone with more than 1,000 searches in a day. He is relying on users to do what’s right and mentioned that anyone who searched someone else’s face without permission would be breaking European privacy law.
“It should be the responsibility of the person using it,” he said. “We’re just a tool provider.”

Scarlett said she had never thought she would talk publicly about what happened to her when she was 19, but felt she had to after she realized that the images were out there.
“It would have been used against me,” she said. “I’m glad I’m the person who found them, but to me, that’s more about luck than PimEyes working as intended. It shouldn’t exist at all.”
Exceptions to the rule
Despite saying PimEyes should be used only for self-searches, Gobronidze is open to other uses as long as they are “ethical.” He said he approved of investigative journalists and the role PimEyes played in identifying Americans who stormed the U.S. Capitol on Jan. 6, 2021.
The Times allows its journalists to use face recognition search engines for reporting but has internal rules about the practice. “Each request to use a facial recognition tool for reporting purposes requires prior review and approval by a senior member of the masthead and our legal department to ensure the usage adheres to our standards and applicable law,” said a Times spokeswoman, Danielle Rhoades Ha.
There are users Gobronidze doesn’t want. He recently blocked people in Russia from the site, in solidarity with Ukraine. He mentioned that PimEyes was willing, like Clearview AI, to offer its service for free to Ukrainian organizations or the Red Cross, if it could help in the search for missing persons.
The better-known Clearview AI has faced serious headwinds in Europe and around the world. Privacy regulators in Canada, Australia and parts of Europe have declared Clearview’s database of 20 billion face images illegal and ordered Clearview to delete their citizens’ photos. Italy and Britain issued multimillion-dollar fines.
A German data protection agency announced an investigation into PimEyes last year for possible violations of Europe’s privacy law, the General Data Protection Regulation, which includes strict rules around the use of biometric data. That investigation is continuing.
Gobronidze said he had not heard from any German authorities. “I am eager to answer all of the questions they might have,” he said.
He is not concerned about privacy regulators, he said, because PimEyes operates differently. He described it as almost being like a digital card catalog, saying the company does not store photos or individual face templates but rather URLs for individual images associated with the facial features they contain. It’s all public, he said, and PimEyes instructs users to search only for their own faces. Whether that architectural difference matters to regulators is yet to be determined.
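As described, such a system would keep only a feature vector and pointers to the public pages where an image appears, never the photograph itself. The structure below is a guess at what one entry in that “card catalog” might contain; the field names are illustrative, not PimEyes’ actual schema.

```python
# Illustrative "card catalog" entry: facial measurements plus public URLs,
# with no stored pixels (assumed design, not PimEyes' actual schema).
from dataclasses import dataclass
import numpy as np

@dataclass
class CatalogEntry:
    face_embedding: np.ndarray  # measurements computed when the page was crawled
    image_url: str              # where the image file lives on the public web
    page_url: str               # the page that displays it

# At query time, the engine compares a query embedding against entries like
# these and returns the URLs; the photos themselves are fetched from their
# source sites, not from the index.
```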
This article originally appeared in The New York Times.