Her bones tell Gaza story, but Grok thinks it is Yemen: Musk’s chatbot flagged for false claims on starving girl photo – Can AI be trusted?


Image of a starving girl (Pic credit: AFP)

A harrowing image captured in Gaza, showing a severely malnourished young girl held in her mother's arms, has become the latest flashpoint in the ongoing battle over truth, technology, and the Israel-Hamas war. The photo, taken on August 2, 2025, by AFP photojournalist Omar al-Qattaa, documents the frail, skeletal frame of nine-year-old Mariam Dawwas amid growing fears of mass famine in the besieged Palestinian enclave. Israel's blockade of the Gaza Strip has cut off critical humanitarian aid, pushing over two million residents to the brink of starvation.

But when users turned to Elon Musk's AI chatbot, Grok, on X to verify the image, the response was stunningly off the mark. Grok insisted the photo was taken in Yemen in 2018, claiming it showed Amal Hussain, a seven-year-old girl whose death from starvation made global headlines during the Yemen civil war. That answer was not just incorrect; it was dangerously misleading.

When AI becomes a disinformation machine

Grok's faulty identification spread rapidly online, sowing confusion and weaponising doubt. French left-wing lawmaker Aymeric Caron, who shared the image in solidarity with Palestinians, was swiftly accused of spreading disinformation, even though the image was authentic and current.

"This image is real, and so is the suffering it represents," said Caron, pushing back against the accusations.

The controversy spotlights a deeply unsettling trend: as more users rely on AI tools to fact-check content, the technology's errors are not just mistakes; they are catalysts for discrediting truth.

A human tragedy, buried under algorithmic error

Mariam Dawwas, a healthy child who weighed 25 kilograms before the war began in October 2023, now weighs just nine. "The only nutrition she gets is milk," her mother Modallala told AFP, "and even that's not always available."

Her image has become a symbol of Gaza's deepening humanitarian crisis. But Grok's misfire reduced her to a data point in the wrong file, an AI hallucination with real-world consequences.

Even after being challenged, Grok initially doubled down: "I do not spread fake news; I base my answers on verified sources." The chatbot eventually acknowledged the error, yet repeated the incorrect Yemen attribution the very next day.


