Since it exploded onto the scene in January after a newspaper exposé, Clearview AI has become one of the most elusive, secretive, and reviled companies in the tech startup scene.
The controversial facial recognition startup allows its law enforcement users to take a picture of a person, upload it, and match it against its alleged database of 3 billion images, which the company scraped from
public social media profiles.
But for a time, a misconfigured server exposed the company's internal files, apps, and source code for anyone on the internet to find.
Mossab Hussein, chief security officer at Dubai-based cybersecurity firm SpiderSilk, found the repository storing Clearview’s source code. Although the repository was protected with a password, a misconfigured setting allowed anyone to register as a new user to log in to the system storing the code.
The repository contained Clearview’s source code, which could be used to compile and run the apps from scratch. The repository also stored some of the company’s secret keys and credentials, which granted access to Clearview’s cloud storage buckets. Inside those buckets, Clearview stored copies of its finished Windows, Mac, and Android apps, as well as its iOS app, which Apple
recently blocked for violating its rules. The storage buckets also contained early, pre-release developer app versions that are typically only for testing, Hussein said.