Car thieves are targeting catalytic converters for the precious metals they contain. Researchers have identified new privacy vulnerabilities in Amazon’s Alexa.


Stricter emissions regulations have given car thieves a new focus: catalytic converters. These devices contain precious metals such as palladium and rhodium, whose prices have soared to record highs as tighter car emissions rules drive up demand, according to a recent article from The New York Times.

The price of palladium rose from about $500 an ounce five years ago to a record $2,875 an ounce last year and now hovers between $2,000 and $2,500 an ounce, above the price of gold. Rhodium prices have climbed more than 3,000 percent, from about $640 an ounce five years ago to a record $21,900 an ounce this year, roughly 12 times the price of gold. The Times said those prices are fueling a black market in stolen catalytic converters, with police nationwide reporting a surge in such thefts.
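The percent changes quoted above can be sanity-checked with a quick calculation (the figures are the approximate prices from the article):

```python
def pct_increase(old: float, new: float) -> float:
    """Percent increase from an old price to a new price."""
    return (new - old) / old * 100

# Palladium: roughly $500/oz five years ago to a $2,875/oz record
print(round(pct_increase(500, 2875)))   # 475

# Rhodium: roughly $640/oz five years ago to a $21,900/oz record,
# consistent with "more than 3,000 percent"
print(round(pct_increase(640, 21900)))  # 3322
```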

Source: “Thieves Nationwide Are Slithering Under Cars, Swiping Catalytic Converters,” The New York Times, Feb. 9, 2021


A recent study has revealed new privacy concerns for users of Amazon’s voice-activated assistant, Alexa.

The authors warned that although Alexa users may think they are interacting only with Amazon, many of the assistant’s apps (called skills) are actually created by third-party developers. The researchers identified several flaws in Amazon’s vetting process that could let those third parties gain access to users’ personal or private information.

The researchers used an automated program to collect 90,194 unique skills from seven different skill stores, then built an automated review process that produced a detailed analysis of each skill.

The review revealed a number of privacy concerns. Amazon does not verify developers’ names, so users could be led to think a skill was published by a trustworthy organization when it was not. Amazon also allows multiple skills to use the same invocation phrase, which could leave users sharing information with the wrong developer; this is especially risky for skills that require linking to a third-party account, such as an email, banking, or social media account. Finally, developers can change a skill’s back-end code after the skill has been approved and placed in stores.

Source: “Study Reveals Extent of Privacy Vulnerabilities with Amazon’s Alexa,” North Carolina State University release, March 4, 2021; Study: “Hey Alexa, is this Skill Safe?: Taking a Closer Look at the Alexa Skill Ecosystem”