New concerns raised over how Amazon handles people’s Cloud Cam footage
AI has to be trained by a human, and what better way to do this than with real-world examples?
What you need to know
- Video clips from Amazon Cloud Cams can be manually submitted for review when a false alarm is detected.
- Teams of workers in India and Romania review hundreds of clips every day.
- Amazon’s handling of private and sensitive data is being brought into question.
Amazon, which is no stranger to privacy controversies involving Alexa and its Ring cameras, is under fire again over how recorded footage from Amazon Cloud Cams is used. A Bloomberg report about Amazon's Cloud Cam is raising eyebrows today, though some of the reaction may be unwarranted skepticism fueled by the legitimate privacy issues that have unfolded over the past several months.
According to a number of Amazon employees who spoke to Bloomberg on condition of anonymity, Amazon runs a program that helps train the AI used to identify events in the home. This AI is supposed to be able to tell the difference between a false alarm and a real one, but it needs human help to ensure everything is identified correctly. Customers and Amazon employees alike can submit footage to Amazon for review when a false alarm has been identified.
But some employees question whether the footage being reviewed was actually submitted manually by customers, as some clips contain "inappropriate content" that seems out of the ordinary to submit for review. Some clips contain very sensitive personal content, such as people having sex, according to the employees who work on the program.
Understanding how Amazon's AI detects intruders is an important part of the puzzle, especially when considering why these types of clips would be submitted for review. Flagging someone being loud in the home as an intruder isn't always the desired outcome, so it's likely those customers submitted the offending clips as examples of false alarms.
But the real questions come in the form of how the footage was obtained and how securely this footage is monitored.
Amazon states that all footage under review was voluntarily submitted by end-users and cannot be accessed in any other way. Recorded footage, after all, is stored in a private cloud locker and can only be seen by the registered account. According to Amazon’s Q&A:
Only you or people you have shared your account information with can view your clips, unless you choose to submit a clip to us directly for troubleshooting. Customers can also choose to share clips via email or social media.
Some workers say they are unsure how clips get chosen for review, though, as thousands of clips are submitted on a regular basis and auditors can expect to review up to 150 clips in a single day. This breeds distrust in and of itself, as not everyone in the chain seems to know where these clips originate. But the bigger issue is how these clips are viewed and the environment in which they are monitored.
Cloud Cam auditors work on a restricted floor that's kept under tight lock and key, according to the workers Bloomberg interviewed, but even tight security allegedly hasn't prevented employees from sharing clips with outside sources. This certainly feeds the notion that Amazon is less careful with customer data than it should be and needs to tighten the reins on data privacy.
While Bloomberg notes that "nowhere in the Cloud Cam user terms and conditions does Amazon explicitly tell customers that human beings are training the algorithms behind their motion detection software," the fact of the matter is that clips have to be manually submitted for feedback and review. By its very nature, that means someone has to watch the footage to check for problems.
Much of this information is eyebrow-raising but, ultimately, circumstantial without hard evidence of data breaches or foul play. What it represents is a continued mistrust of Amazon and its practices of data handling, and that may cause more damage to the company than faulty products could.