The term “deepfake” has penetrated the 21st-century vernacular, mainly in relation to videos that convincingly replace the likeness of one person with that of another. These videos often insert celebrities into pornography or depict world leaders saying things they never said.
But anyone with the know-how can also use similar artificial intelligence techniques to fabricate satellite images, a practice known as “deepfake geography.” Researchers caution that such misuse could open new channels of disinformation and even threaten national security.