Her bones tell Gaza story, but Grok thinks it is Yemen: Musk’s chatbot flagged for false claims on starving girl photo – Can AI be trusted?

Image of a starving girl (Pic credit: AFP)

A harrowing image captured in Gaza, showing a severely malnourished young girl held in her mother's arms, has become the latest flashpoint in the ongoing battle over truth, technology, and the Israel-Hamas war.

The photo, taken on August 2, 2025, by AFP photojournalist Omar al-Qattaa, documents the frail, skeletal frame of nine-year-old Mariam Dawwas amid growing fears of mass famine in the besieged Palestinian enclave. Israel's blockade of the Gaza Strip has cut off critical humanitarian aid, pushing over two million residents to the brink of starvation.

But when users turned to Elon Musk's AI chatbot, Grok, on X to verify the image, the response was stunningly off the mark. Grok insisted the photo was taken in Yemen in 2018, claiming it showed Amal Hussain, a seven-year-old girl whose death from starvation made global headlines during the Yemen civil war.

That answer was not just incorrect; it was dangerously misleading.

When AI becomes a disinformation machine

Grok's faulty identification rapidly spread online, sowing confusion and weaponising doubt. French left-wing lawmaker Aymeric Caron, who shared the image in solidarity with Palestinians, was swiftly accused of spreading disinformation, even though the image was authentic and current.

"This image is real, and so is the suffering it represents," said Caron, pushing back against the accusations.

The controversy spotlights a deeply unsettling trend: as more users rely on AI tools to fact-check content, the technology's errors are not just mistakes; they are catalysts for discrediting truth.

A human tragedy, buried under algorithmic error

Mariam Dawwas, once a healthy child weighing 25 kilograms before the war began in October 2023, now weighs just nine. "The only nutrition she gets is milk," her mother Modallala told AFP, "and even that's not always available."

Her image has become a symbol of Gaza's deepening humanitarian crisis. But Grok's misfire reduced her to a data point in the wrong file, an AI hallucination with real-world consequences.

Even after being challenged, Grok initially doubled down: "I do not spread fake news; I base my answers on verified sources." While the chatbot eventually acknowledged the error, it repeated the incorrect Yemen attribution the very next day.
