On Monday, social media was flooded with images of an explosion purportedly on the grounds of the Pentagon, briefly causing alarm worldwide. The image, shared on Twitter on May 22, appeared to show an explosion on a grass lawn outside the Pentagon.
However, the original post has since been removed, and the US Department of Defense has clarified that there was no explosion in or near the Pentagon. A Department of Defense spokesperson described the image as “misinformation.” The Arlington Fire Department also swiftly tweeted that there was no explosion or incident at or near the Pentagon reservation and that there was no immediate danger or hazard to the public.
The source of the viral image remains unknown at this time.
In recent times, deepfakes generated by artificial intelligence (AI) have drawn significant attention for their remarkably realistic portrayals of real individuals and subjects, and they have spread rapidly across social media platforms.
The proliferation of such deepfakes has been amplified by the public availability of powerful generative AI tools, including OpenAI’s ChatGPT and AI image generators. Notable AI-generated fakes have included images of Pope Francis wearing a Balenciaga coat and viral images of former President Donald Trump resisting authorities during a fictitious arrest.
The image depicting an explosion within the Pentagon’s premises has also been described as a deepfake.
Notably, Sam Altman, the CEO of OpenAI, the company behind ChatGPT, recently called for regulation of AI. During a US Senate hearing, Altman highlighted the potential risks of the technology, warning that if it goes wrong, the consequences could be significant.
Altman proposed establishing a US or global agency to license the most powerful AI systems and ensure compliance with safety standards, with the authority to revoke licenses if necessary.