Rice U. scientists slash computations for deep learning

June 1, 2017

Image credit: Jeff Fitlow/Rice University

Rice University computer scientists have adapted a widely used technique for rapid data lookup to slash the amount of computation — and thus energy and time — required for deep learning, a computationally intense form of machine learning.

"This applies to any deep-learning architecture, and the technique scales sublinearly, which means that the larger the deep neural network to which this is applied, the more the savings in computations there will be," said lead researcher Anshumali Shrivastava, an assistant professor of computer science at Rice.

The research will be presented in August at the KDD 2017 conference in Halifax, Nova Scotia. It addresses one of the biggest issues facing tech giants like Google, Facebook and Microsoft as they race to build, train and deploy massive deep-learning networks for a growing body of products as diverse as self-driving cars, language translators and intelligent replies to emails.

Shrivastava and Rice graduate student Ryan Spring have shown that techniques from "hashing," a tried-and-true data-indexing method, can be adapted to dramatically reduce the computational overhead for deep learning. Hashing involves the use of smart hash functions that convert data into manageable small numbers called hashes. The hashes are stored in tables that work much like the index in a printed book.
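
For readers unfamiliar with the idea, here is a minimal Python sketch of hash-based indexing; the toy hash function and items are illustrative, not the paper's:

    def simple_hash(text, num_buckets=8):
        # Toy hash function: fold character codes into a small bucket index.
        return sum(ord(c) for c in text) % num_buckets

    # Build the table once: each bucket lists the items that hash to it.
    table = {}
    for word in ["neuron", "layer", "gradient", "hash"]:
        table.setdefault(simple_hash(word), []).append(word)

    # A lookup inspects a single bucket instead of scanning every item,
    # like jumping to an index entry instead of reading the whole book.
    print(table[simple_hash("gradient")])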

"Our approach blends two techniques — a clever variant of locality-sensitive hashing and sparse backpropagation — to reduce computational requirements without significant loss of accuracy," Spring said. "For example, in small-scale tests we found we could reduce computation by as much as 95 percent and still be within 1 percent of the accuracy obtained with standard approaches."

The basic building block of a deep-learning network is an artificial neuron. Though originally conceived in the 1950s as models for the biological neurons in living brains, artificial neurons are just mathematical functions, equations that act upon an incoming piece of data and transform it into an output.
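
Written out, a single artificial neuron really is just a small function. This toy Python version uses the classic logistic (sigmoid) activation:

    import math

    def neuron(inputs, weights, bias):
        # Weighted sum of the incoming data, squashed by a nonlinear
        # activation (here the logistic sigmoid).
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1.0 / (1.0 + math.exp(-z))

    print(neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))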

In machine learning, all neurons start the same, like blank slates, and become specialized as they are trained. During training, the network is "shown" vast volumes of data, and each neuron becomes a specialist at recognizing particular patterns in the data. At the lowest layer, neurons perform the simplest tasks. In a photo recognition application, for example, low-level neurons might recognize light from dark or the edges of objects. Output from these neurons is passed on to the neurons in the next layer of the network, which search for their own specialized patterns. Networks with even a few layers can learn to recognize faces, dogs, stop signs and school buses.
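
Stacking such neurons gives the layered structure described above. In this illustrative sketch each row of a weight matrix is one neuron, the sizes are arbitrary, and one layer's output feeds the next:

    import numpy as np

    rng = np.random.default_rng(1)

    def layer(x, W, b):
        # Each row of W is one neuron; ReLU keeps only the "excited" units.
        return np.maximum(W @ x + b, 0.0)

    # Hypothetical sizes: raw features -> simple patterns -> higher-level ones.
    x = rng.standard_normal(16)   # e.g. pixel features
    h = layer(x, rng.standard_normal((8, 16)), rng.standard_normal(8))
    y = layer(h, rng.standard_normal((4, 8)), rng.standard_normal(4))
    print(y.round(2))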

"Adding more neurons to a network layer increases its expressive power, and there's no upper limit to how big we want our networks to be," Shrivastava said. "Google is reportedly trying to train one with 137 billion neurons." By contrast, he said, there are limits to the amount of computational power that can be brought to bear to train and deploy such networks.

"Most machine-learning algorithms in use today were developed 30-50 years ago," he said. "They were not designed with computational complexity in mind. But with 'big data,' there are fundamental limits on resources like compute cycles, energy and memory. Our lab focuses on addressing those limitations."

Spring said computation and energy savings from hashing will be even larger on massive deep networks.

"The savings increase with scale because we are exploiting the inherent sparsity in big data," he said. "For instance, let's say a deep net has a billion neurons. For any given input — like a picture of a dog — only a few of those will become excited. In data parlance, we refer to that as sparsity, and because of sparsity our method will save more as the network grows in size. So while we've shown a 95 percent savings with 1,000 neurons, the mathematics suggests we can save more than 99 percent with a billion neurons."

###

The research was supported by the National Science Foundation and Rice University.

A copy of the paper "Scalable and Sustainable Deep Learning via Randomized Hashing" is available at: https://arxiv.org/abs/1602.08194

More information about KDD 2017 is available at: http://www.kdd.org/kdd2017/

Related machine learning research from Rice:

Researchers working toward indoor location detection — April 17, 2017

http://news.rice.edu/2017/04/17/researchers-working-toward-indoor-location-detection/

Computer Science's Shrivastava wins NSF CAREER Award — March 6, 2017

http://news.rice.edu/2017/03/06/computer-sciences-shrivastava-wins-nsf-career-award/

Rice, Baylor team sets new mark for 'deep learning' — Dec. 16, 2016

http://news.rice.edu/2016/12/16/rice-baylor-team-sets-new-mark-for-deep-learning/

Rice's energy-stingy indoor mobile locator ensures user privacy — Oct. 20, 2016

http://news.rice.edu/2016/10/20/rices-energy-stingy-indoor-mobile-locator-ensures-user-privacy/

Rice wins interdisciplinary 'big data' grant — July 12, 2016

http://news.rice.edu/2016/07/12/rice-wins-interdisciplinary-big-data-grant/

This release can be found online at news.rice.edu.

Follow Rice News and Media Relations on Twitter @RiceUNews.

Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation's top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 3,879 undergraduates and 2,861 graduate students, Rice's undergraduate student-to-faculty ratio is 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for happiest students and for lots of race/class interaction by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger's Personal Finance. To read "What they're saying about Rice," go to http://tinyurl.com/RiceUniversityoverview.

Media Contact

David Ruth
[email protected]
713-348-6327
@RiceUNews

http://news.rice.edu

Story Source: Materials provided by Scienmag
