BIOENGINEER.ORG

Why AI Lacks the Human Touch in Understanding Flowers

By Bioengineer
September 6, 2025
in Technology

Despite advancements in artificial intelligence, a recent study indicates that AI tools like ChatGPT fall short of representing concepts as richly and deeply as humans do. This research, conducted by a group of psychologists led by Qihui Xu from The Ohio State University, highlights the fundamental differences in how humans and AI understand the world, particularly regarding sensory and motor experiences. The study found that while AI models excelled in representing words devoid of sensory connections, they struggled significantly with concepts that rely on the richness of human experiences.

The core of the issue lies in the architecture of large language models (LLMs), which are trained predominantly on linguistic data. Unlike humans, who possess multisensory experiences—sight, sound, touch, taste, and smell—AI models rely on extensive text-based datasets to learn. This discrepancy leaves a significant gap in understanding concepts such as flowers or food, where sensory engagement is critical. As Xu articulates, “A large language model can’t smell a rose, touch the petals of a daisy or walk through a field of wildflowers.” Without sensory interaction, the AI’s conceptual framework is limited: it lacks the rich, embodied experiences humans draw upon.

The study, published in Nature Human Behaviour, examined how humans and four large language models—OpenAI’s GPT-3.5 and GPT-4, and Google’s PaLM and Gemini—represent a range of concepts. The researchers focused on nearly 4,500 words, analyzing the degree of alignment between human and AI conceptual understandings. Assessments were based on two distinct measures: the Glasgow Norms, which evaluate words across dimensions like arousal and imageability, and the Lancaster Norms, which scrutinize how concepts interrelate with sensory and motor information.
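The alignment analysis described here can be sketched roughly as follows: for each rating dimension (say, imageability from the Glasgow Norms, or a Lancaster sensorimotor dimension such as touch), correlate human ratings with model-produced ratings over a shared word list. The function and the toy ratings below are illustrative assumptions, not the study’s actual data or code.

```python
# Minimal sketch of a norm-alignment analysis, assuming both human and model
# ratings are available as word -> score mappings on the same scale.
import numpy as np

def dimension_alignment(human: dict, model: dict) -> float:
    """Pearson correlation between human and model ratings on shared words."""
    words = sorted(set(human) & set(model))
    h = np.array([human[w] for w in words], dtype=float)
    m = np.array([model[w] for w in words], dtype=float)
    return float(np.corrcoef(h, m)[0, 1])

# Toy touch-relatedness ratings on a 1-7 scale (invented for illustration):
# humans rate "flower" and "sand" as strongly touch-related, while a
# text-only model underestimates them.
human_touch = {"flower": 5.8, "justice": 1.2, "sand": 6.1, "theory": 1.0}
model_touch = {"flower": 3.1, "justice": 1.4, "sand": 4.0, "theory": 1.1}

print(f"touch alignment r = {dimension_alignment(human_touch, model_touch):.2f}")
```

In the study’s terms, a dimension where this correlation is high (as for abstract, non-sensorimotor dimensions) indicates close human–AI alignment, while lower correlations on sensorimotor dimensions reflect the gap the researchers observed.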


In one part of the analysis, the researchers measured how closely human and AI ratings correlated across concepts, asking whether the two agreed on a concept’s emotional weight and on how it is perceived along other dimensions. The results revealed a clear pattern: while AI performed admirably on abstract concepts lacking sensory connection, it faltered significantly when confronted with sensory-rich terminology.

Words that relate to human touch, taste, or sight posed considerable challenges for AI. For instance, the term ‘flower’ encompasses a multitude of experiences beyond its mere linguistic definition; it evokes vivid memories of scent, texture, and emotional context. Xu highlighted that the representation of a flower in human thought encompasses diverse sensory experiences that AI fails to integrate adequately. In essence, human cognition creates a multifaceted tapestry of experiences that words alone cannot encapsulate.

The researchers further explored the implications of these findings for future interactions between AI and humans. If AI processes the world differently, it could lead to misunderstandings or diminished effectiveness in communication. As AI technologies become more integrated into our daily lives, the nuances in their understanding of concepts can significantly affect their interactions with human users.

The study also spotlighted an evolving trend: while AI has a long way to go in replicating human-like conceptualization, there are improvements on the horizon. Models trained not just on textual data but also on images have shown better performance in grasping vision-related concepts than their text-only counterparts. This suggests a pathway for AI to enrich its representations by incorporating various sensory modalities.

As the domain of artificial intelligence continues to evolve, it is conceivable that future advancements may incorporate more sophisticated forms of understanding, potentially enriched through sensory data directly linked to robotic interactions within the physical world. Xu anticipates that as LLMs become more integrated with sensor technologies, their ability to emulate human-like understanding could dramatically improve.

Despite the current limitations, researchers and developers maintain an optimistic outlook for the next generation of AI models. By embracing a more holistic approach that combines language with sensory experiences, the gap between human and AI understanding could narrow, leading to more intuitive and effective interaction paradigms.

The findings of this study emphasize the multifaceted nature of human understanding, underscoring that our experiences are significantly shaped by our direct engagement with the world. In contrast, AI’s reliance on text alone renders it an incomplete mimic of human cognition, illustrating that there is much room for growth and improvement in developing AI technologies.

In conclusion, the study by Xu and her colleagues paves the way for critical discussions surrounding the future of AI and its evolving relationship with human users. As advancements are made, the integration of sensory and motor experiences into AI frameworks could herald a new era in artificial intelligence, where AI not only understands language but also experiences the richness of life similarly to humans.

Subject of Research: People
Article Title: ‘Large language models without grounding recover non-sensorimotor but not sensorimotor features of human concepts’
News Publication Date: 4-Jun-2025
Web References: http://dx.doi.org/10.1038/s41562-025-02203-8

Keywords

AI, language models, human cognition, sensory experience, concept representation, emotional arousal, multimodal learning, artificial intelligence, robotics, human-computer interaction.

Tags: AI and human experience gap, AI understanding of sensory experiences, challenges in AI conceptualization, differences between human and AI perception, embodied experiences in understanding, flower concepts and AI, human touch in AI, language models and sensory data, limitations of artificial intelligence, multisensory experiences in learning, psychology of AI comprehension, Qihui Xu research findings


Bioengineer.org © Copyright 2023 All Rights Reserved.
