Artificial intelligence can be used to falsify satellite imagery. Baby cushions from The Boppy Company have been recalled due to safety risks.
Geographers from the University of Washington are warning that deepfake satellite imagery could be used to create hoaxes about wildfires or floods, or to discredit stories based on real satellite imagery.
Such AI-generated images of cityscapes and countryside might even be a national security issue, as fake satellite imagery could be used to mislead tacticians and interfere with mission planning.
The fake satellite images are “uncannily realistic,” said Bo Zhao, lead author of the study published in the journal Cartography and Geographic Information Science. He noted that “untrained eyes would easily consider they are authentic.” And while detection software can spot the fakes based on characteristics like texture, contrast and color, these tools need constant updates to keep up with improvements in deepfake generation.
Source: “Deepfake satellite imagery poses a not-so-distant threat, warn geographers,” The Verge, April 27, 2021; “A growing problem of ‘deepfake geography’: How AI falsifies satellite images,” University of Washington, April 21, 2021
Baby-product manufacturer The Boppy Company is recalling 3.3 million lounger pads for newborns after at least eight infant deaths over less than five years were associated with the pillows.
Federal regulators say that the deaths, which were reported from December 2015 to June 2020, occurred after the babies were placed on their backs, sides or stomachs in the loungers.
The recall covers all models of the newborn lounger, including the original and “preferred” versions, as well as a line sold through Pottery Barn Kids. The loungers were sold from January 2004 through September 2021. The retailers that carried them included Target, Walmart and Amazon.
Source: “3 Million Baby Cushions Are Recalled After 8 Reported Infant Deaths,” New York Times, Sept. 23, 2021