
Why AI Lacks the Human Touch in Understanding Flowers

By Bioengineer | June 4, 2025 | Technology
Despite advancements in artificial intelligence, a recent study indicates that AI tools like ChatGPT fall short of representing concepts as richly and deeply as humans do. This research, conducted by a group of psychologists led by Qihui Xu from The Ohio State University, highlights the fundamental differences in how humans and AI understand the world, particularly regarding sensory and motor experiences. The study found that while AI models excelled in representing words devoid of sensory connections, they struggled significantly with concepts that rely on the richness of human experiences.

The core of the issue lies in the architecture of large language models (LLMs), which predominantly harness linguistic data. Unlike humans, who possess multisensory experiences—sight, sound, touch, taste, and smell—AI models rely on extensive text-based datasets to learn. This discrepancy leads to a significant gap in understanding complex concepts such as flowers or food, where sensory engagement is critical. As Xu articulates, “A large language model can’t smell a rose, touch the petals of a daisy or walk through a field of wildflowers.” Without this sensory interaction, the AI’s conceptual framework remains limited, lacking the rich, embodied experiences humans draw upon.

The study, published in Nature Human Behaviour, examined how humans and four advanced AI models (OpenAI’s GPT-3.5 and GPT-4, and Google’s PaLM and Gemini) represent a range of concepts. The researchers focused on nearly 4,500 words, analyzing the degree of alignment between human and AI conceptual understandings. Assessments were based on two distinct measures: the Glasgow Norms, which evaluate words across dimensions like arousal and imageability, and the Lancaster Norms, which scrutinize how concepts interrelate with sensory and motor information.
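
To make that alignment analysis concrete, the sketch below (in Python; not the authors’ code, and the file and column names are hypothetical) correlates human norm ratings with model-produced ratings for the same words, one dimension at a time.

    # Minimal sketch of a norm-alignment analysis, assuming two hypothetical CSV
    # files with one row per word and one column per rating dimension (e.g.
    # "arousal" and "imageability" from the Glasgow Norms, or Lancaster
    # sensorimotor dimensions such as "haptic" or "olfactory").
    import pandas as pd
    from scipy.stats import spearmanr

    human = pd.read_csv("human_norms.csv", index_col="word")  # hypothetical file
    model = pd.read_csv("llm_ratings.csv", index_col="word")  # hypothetical file

    # Keep only the words rated by both humans and the model.
    shared = human.index.intersection(model.index)
    human, model = human.loc[shared], model.loc[shared]

    # Alignment per dimension: rank correlation between human and model ratings.
    for dim in human.columns:
        rho, p = spearmanr(human[dim], model[dim])
        print(f"{dim:>15}: rho = {rho:.2f} (p = {p:.3g})")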

In one part of their analysis, the researchers examined how closely human and AI ratings of differing concepts correlated, asking whether the two agreed on a concept’s emotional weight and on how it is perceived along other dimensions. The results revealed an intriguing pattern: while AI performed admirably on abstract concepts lacking sensory connection, it faltered significantly when confronted with sensory-rich terminology.

Words that relate to human touch, taste, or sight posed considerable challenges for AI. For instance, the term ‘flower’ encompasses a multitude of experiences beyond its mere linguistic definition; it evokes vivid memories of scent, texture, and emotional context. Xu highlighted that the representation of a flower in human thought encompasses diverse sensory experiences that AI fails to integrate adequately. In essence, human cognition creates a multifaceted tapestry of experiences that words alone cannot encapsulate.
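
As a rough illustration of that split (a sketch under assumed data, not the study’s actual pipeline), one could divide the word list by Lancaster-style sensorimotor strength and compare human-model agreement within each half:

    # Hedged sketch: compare human-model alignment for sensory-rich versus
    # abstract words. The merged table and its column names are assumptions.
    import pandas as pd
    from scipy.stats import spearmanr

    merged = pd.read_csv("merged_ratings.csv")  # hypothetical columns: word,
                                                # human_score, model_score,
                                                # sensorimotor_strength

    # Median split: words that strongly engage the senses vs. those that do not.
    cutoff = merged["sensorimotor_strength"].median()
    rich = merged[merged["sensorimotor_strength"] >= cutoff]
    abstract = merged[merged["sensorimotor_strength"] < cutoff]

    for label, group in (("sensory-rich", rich), ("abstract", abstract)):
        rho, _ = spearmanr(group["human_score"], group["model_score"])
        print(f"{label:>12}: alignment rho = {rho:.2f}")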

The researchers further explored the implications of these findings for future interactions between AI and humans. If AI processes the world differently, it could lead to misunderstandings or diminished effectiveness in communication. As AI technologies become more integrated into our daily lives, the nuances in their understanding of concepts can significantly affect their interactions with human users.

Moreover, the study spotlighted an evolving trend: while AI has a long way to go in replicating human-like conceptualization, there are improvements on the horizon. Models trained not just on textual data but also on images have shown better performance in grasping vision-related concepts than their text-only counterparts. This suggests a pathway for AI to enrich its representations by incorporating various sensory modalities.

As the domain of artificial intelligence continues to evolve, it is conceivable that future advancements may incorporate more sophisticated forms of understanding, potentially enriched through sensory data directly linked to robotic interactions within the physical world. Xu anticipates that as LLMs become more integrated with sensor technologies, their ability to emulate human-like understanding could dramatically improve.

Despite the current limitations, researchers and developers maintain an optimistic outlook for the next generation of AI models. By embracing a more holistic approach that combines language with sensory experiences, the gap between human and AI understanding could narrow, leading to more intuitive and effective interaction paradigms.

The findings of this study emphasize the multifaceted nature of human understanding, underscoring that our experiences are significantly shaped by our direct engagement with the world. In contrast, AI’s reliance on text alone renders it an incomplete mimic of human cognition, illustrating that there is much room for growth and improvement in developing AI technologies.

In conclusion, the study by Xu and her colleagues paves the way for critical discussions surrounding the future of AI and its evolving relationship with human users. As advancements are made, the integration of sensory and motor experiences into AI frameworks could herald a new era in artificial intelligence, where AI not only understands language but also experiences the richness of life similarly to humans.

Subject of Research: People
Article Title: ‘Large language models without grounding recover non-sensorimotor but not sensorimotor features of human concepts’
News Publication Date: 4-Jun-2025
Web References: http://dx.doi.org/10.1038/s41562-025-02203-8
References: Not available
Image Credits: Not available

Keywords

AI, language models, human cognition, sensory experience, concept representation, emotional arousal, multimodal learning, artificial intelligence, robotics, human-computer interaction.

Tags: AI and human experience gap, AI understanding of sensory experiences, challenges in AI conceptualization, differences between human and AI perception, embodied experiences in understanding, flower concepts and AI, human touch in AI, language models and sensory data, limitations of artificial intelligence, multisensory experiences in learning, psychology of AI comprehension, Qihui Xu research findings
