BIOENGINEER.ORG

Understanding brain activity when you name what you see

By Bioengineer
June 24, 2019
in Health
Credit: Baylor College of Medicine

You see an object, you think of its name, and then you say it. This apparently simple activity engages a set of brain regions that must interact with each other to produce the behavior quickly and accurately. A report published in eNeuro shows that a reliable sequence of neural interactions occurs in the human brain, corresponding to the visual processing stage, the language stage when we think of the name, and finally the articulation stage when we say it. The study reveals that this processing involves not just a sequence of different brain regions, but a sequence of changing interactions between those regions.

“In this study, we worked with patients with epilepsy whose brain activity was being recorded with electrodes to find where their seizures started. While the electrodes were in place, we showed the patients pictures and asked them to name them while we recorded their brain activity,” said co-corresponding author Dr. Xaq Pitkow, assistant professor of neuroscience and McNair Scholar at Baylor College of Medicine and assistant professor of electrical and computer engineering at Rice University.

“We then analyzed the data we recorded and derived a new level of understanding of how the brain network comes up with the right word and enables us to say that word,” said Dr. Nitin Tandon, professor in the Vivian L. Smith Department of Neurosurgery at McGovern Medical School at The University of Texas Health Science Center at Houston.

The researchers’ findings support the view that when a person names a picture, the different behavioral stages – looking at the image, thinking of the name and saying it – consistently correspond to dynamic interactions within neural networks.

“Before our findings, the typical view was that separate brain areas would be activated in sequence,” Pitkow said. “But we used more complex statistical methods and fast measurement methods, and found more interesting brain dynamics.”
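The "more complex statistical methods" mentioned here amount to inferring a sequence of hidden network states from multichannel brain recordings, rather than simply asking which region is active when. As a purely illustrative sketch (not the authors' actual analysis pipeline), the Python example below simulates three-stage multichannel activity and recovers the stage sequence with a left-to-right hidden Markov model decoded by the Viterbi algorithm. The channel count, stage means, noise level, and transition probabilities are all invented for this toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-stage model: visual -> language -> articulation.
# Each stage has a distinct mean activity pattern across 4 recording channels.
means = np.array([
    [1.0, 0.2, 0.1, 0.0],   # visual processing
    [0.1, 1.0, 0.8, 0.1],   # word retrieval ("language" stage)
    [0.0, 0.2, 0.9, 1.0],   # articulation
])
true_states = np.repeat([0, 1, 2], 40)               # 120 time bins
obs = means[true_states] + 0.3 * rng.standard_normal((120, 4))

# Left-to-right transition matrix: stages advance but never go back.
A = np.array([[0.95, 0.05, 0.00],
              [0.00, 0.95, 0.05],
              [0.00, 0.00, 1.00]])
log_A = np.log(A + 1e-12)

def log_emission(x):
    # Isotropic Gaussian log-likelihood (up to a constant) of one
    # observation under each of the three stage means.
    return -0.5 * np.sum((x[None, :] - means) ** 2, axis=1) / 0.3 ** 2

def viterbi(observations):
    """Most likely stage sequence given the left-to-right HMM."""
    T, K = len(observations), len(means)
    delta = np.full((T, K), -np.inf)      # best log-score ending in state k
    psi = np.zeros((T, K), dtype=int)     # backpointers
    delta[0] = np.log([1.0, 1e-12, 1e-12]) + log_emission(observations[0])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A        # scores[i, j]: i -> j
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(K)] + log_emission(observations[t])
    path = np.empty(T, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):        # trace backpointers
        path[t] = psi[t + 1, path[t + 1]]
    return path

decoded = viterbi(obs)
accuracy = np.mean(decoded == true_states)
print(f"decoded-stage accuracy: {accuracy:.2f}")
```

Because the transition matrix forbids moving backward, the decoder is forced to explain the data as one forward pass through the stages, mirroring the idea that naming unfolds as a reliable sequence of network states rather than isolated regional activations.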

“This methodological advance provides a template by which to assess other complex neural processes, as well as to explain disorders of language production,” Tandon said.

###

Aram Giahi Saravani of Baylor College of Medicine and Kiefer J. Forseth of UTHealth also are authors of this work.

Financial support for this study was provided by the National Institute on Deafness and Other Communication Disorders (R01DC014589), the National Institute of Neurological Disorders and Stroke (U01NS098981), the National Science Foundation Awards 1533664 and IOS-1552868, and the McNair Foundation.

Media Contact
Graciela Gutierrez
[email protected]

Original Source

https://www.bcm.edu/news

Related Journal Article

http://dx.doi.org/10.1523/ENEURO.0472-18.2019

Tags: Algorithms/Models, Biology, Language/Linguistics/Speech, Neurobiology, Physiology, Social/Behavioral Science


Bioengineer.org © Copyright 2023 All Rights Reserved.
