I just threw my Wyze home security cameras in the trash. I’m done with this company.
I just learned that for the past three years, Wyze has been fully aware of a vulnerability in its home security cameras that could have theoretically let hackers access your video feeds over the internet — but chose to sweep it under the rug. And the security firm that found the vulnerability largely let them do it.
Instead of patching it, instead of recalling it, instead of just, you know, saying something so I could stop pointing these cameras at my kids, Wyze simply decided to discontinue the WyzeCam v1 this January without a full explanation. But on Tuesday, security research firm Bitdefender finally shed light on why Wyze stopped selling it: because someone could theoretically access your camera’s SD card, steal the encryption key, and start watching and downloading its video.
Since I published this editorial, several people have reached out to explain the issue isn’t nearly as bad as you might have imagined reading my words — that hackers would likely have to be inside your home network, or you would have had to make an egregious mistake by configuring your firewall to provide internet access to the camera’s virtual port. I checked with Bitdefender, and it suggests that’s partially true:
The remote (from outside the network) attacks requires an initial camera ID (it’s completely random and non-predictable string) that can only be acquired if present on the same network as device. In other words, if someone connects to your home WiFi, they can get that token and, at a later moment, use any of the other working remote exploits to hack your device from their home or wherever else in the world they are.
That makes me feel a little better, and I apologize for misleading anyone. But my cameras are still going to the e-waste bin, because Wyze decided it wasn’t important to tell anyone about this vulnerability or do anything about it for years.
It didn’t tell customers when it discontinued the camera, in the three years since Bitdefender brought it to Wyze’s attention in March 2019, and possibly not ever: Wyze spokesperson Kyle Christensen told me that as far as the company is concerned, it’s already been transparent with its customers and has “fully corrected the issue.” But Wyze only corrected it for newer versions of the WyzeCam, and even then it only finished patching the v2 and v3 on January 29th, 2022, according to BleepingComputer.
As far as being transparent goes, the most I’ve seen Wyze tell customers is that “your continued use of the WyzeCam after February 1, 2022 carries increased risk, is discouraged by Wyze, and is entirely at your own risk.” It also sometimes sends vague emailed warnings to its customers, which I used to appreciate but am now retroactively questioning.
When I read those words about “increased risk” in our Verge post regarding the WyzeCam v1’s discontinuation, I remember thinking it just referred to future security updates — not a major vulnerability that already exists and justifies discontinuing the entire device.
Wyze isn’t the only smart home company that’s danced around something like this: when randos actually did access people’s Google Nest security cameras (due to password issues, not a hack), Google didn’t fulfill its responsibility to properly warn its customers either.
Here’s another question: Why on earth would Bitdefender not disclose this for three whole years, when it could have forced Wyze’s hand?
According to the security research firm’s own disclosure timeline (PDF), it reached out to Wyze in March 2019 and didn’t even get a response until November 2020, a year and eight months later. Yet Bitdefender chose to keep quiet until just yesterday.
In case you’re wondering, no, that is not normal in the security community. While experts tell me that the concept of a “responsible disclosure timeline” is a little outdated and heavily depends on the situation, we’re generally measuring in days, not years. “The majority of researchers have policies where if they make a good faith effort to reach a vendor and don’t get a response, that they publicly disclose in 30 days,” Alex Stamos, director of the Stanford Internet Observatory and former chief security officer at Facebook, tells me.
“Even the US government has a 45-day default disclosure deadline to prevent vendors from burying bug reports and never fixing them,” writes Katie Moussouris, founder and CEO of Luta Security and co-author of the international ISO standards for vulnerability disclosure and vulnerability handling processes.
I asked Bitdefender about this, and PR director Steve Fiore had an explanation, but it doesn’t sit well with me. Here it is in full:
Our findings were so serious, our decision, regardless of our usual 90-day-with-grace-period-extensions-policy, was that publishing this report without Wyze’s acknowledgment and mitigation was going to potentially expose millions of customers with unknown implications. Especially since the vendor didn’t have a known (to us) security process / framework in place. Wyze actually implemented one last year as a result of our findings (https://www.wyze.com/pages/security-report).
We have delayed publishing reports (iBaby Monitor M6S cameras) for longer periods for the same reason before. The impact of making the findings public, coupled with our lack of information on the ability of the vendor to address the fallout, dictated our waiting.
We understand that this is not necessarily a common practice with other researchers, but disclosing the findings before having the vendor provide patches would have put a lot of people at risk. So when Wyze did eventually communicate and provided us with credible information on their ability to address the issues reported, we decided to allow them time and granted extensions.
Waiting sometimes makes sense. Both of the experts I spoke to, Moussouris and Stamos, independently brought up the infamous Meltdown CPU vulnerability as an example of where balancing security and disclosure was difficult — because of how many people were affected, how deeply embedded the affected computers might be, and how difficult they are to fix.
But a $20 consumer smart home camera just sitting on my shelf? If Bitdefender put out a press release two years ago that Wyze had a flaw it’s not fixing, it’s damn easy to stop using that camera, not buy any more of them, and pick a different one instead. “There’s an easy mitigation strategy for affected customers,” Stamos says.
The iBaby Monitor example that Bitdefender brings up is a little ironic too — because there, Bitdefender actually did force a company to action. When Bitdefender and PCMag revealed that the baby monitor company hadn’t patched its security hole, the resulting bad publicity pushed the company to fix it just three days later.
Days, not years.
Now if you’ll excuse me, I need to go say goodbye to those Wyze earbuds I liked, because I’m serious about being done with Wyze. I was willing to write off the company’s disastrous leak of 2.4 million customers’ data as a mistake, but it doesn’t look like the company made one here. If these flaws were bad enough to discontinue the camera in 2022, customers deserved to know that back in 2019.
Update March 31st, 2:57PM ET: Clarified with Bitdefender that the attack would likely require some amount of access to your local network first — which means it’s vastly less likely to happen.