Tencent’s AI model Hunyuan Image 3.0 tops leaderboard, beating Google’s Nano Banana


Tencent Holdings’ new artificial intelligence model, Hunyuan Image 3.0, has surpassed Google DeepMind’s “Nano Banana” as the leading image-generation model among both open-source and closed products, according to a major public leaderboard.

As of Saturday, the open-source Hunyuan Image 3.0 secured the top position in the text-to-image rankings on LMArena, an AI model evaluation platform originally started by researchers at the University of California, Berkeley.

The previous leader – Google DeepMind’s Gemini 2.5 Flash Image, also known as Nano Banana – rapidly gained popularity after its late-August release for its image-editing accuracy and viral 3D-figurine generations.

When Tencent released Hunyuan Image 3.0 late last month, the company said the model was “completely comparable to the industry’s flagship closed-source models”. It has 80 billion parameters, making it the largest open-source image-generation model to date.

An image of a Star Ferry-inspired spacecraft traversing a wormhole, generated by Google’s Nano Banana.

Parameters are the variables that encode a model’s intelligence and are adjusted during training. Generally, a higher number of parameters indicates a more powerful model, though it also requires greater computational resources to train and operate.
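To give a rough sense of the computational cost the article alludes to, here is a minimal sketch estimating the memory needed just to store a model's weights. The 80-billion-parameter figure comes from Tencent's announcement; the assumption of 2 bytes per parameter (16-bit floating point, a common storage format) is ours for illustration.

```python
def param_memory_gb(num_params: int, bytes_per_param: int = 2) -> float:
    """Approximate memory (in GB) required just to hold the model weights,
    assuming 16-bit (2-byte) storage per parameter."""
    return num_params * bytes_per_param / 1e9

# Hunyuan Image 3.0's reported size: 80 billion parameters.
print(param_memory_gb(80_000_000_000))  # -> 160.0 (GB of weights alone)
```

Actual serving costs are higher still, since inference also requires memory for activations and intermediate computation, which is why larger models demand substantially more hardware to train and run.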
