Google Fellow: Neural Nets Need Optimized Hardware
If you aren't currently considering how to use deep neural networks to solve your problems, you almost certainly should be, according to Jeff Dean, a Google senior fellow and leader of the deep learning artificial intelligence research project known as Google Brain.

In a keynote address at the Hot Chips conference here Tuesday (Aug. 22), Dean outlined how deep neural nets are dramatically reshaping computational devices and making significant strides in speech, vision, search, robotics and healthcare, among other areas. He said hardware systems optimized for the handful of specific operations that make up the vast majority of machine learning models would enable more powerful neural networks.

"Building specialized computers for the properties that neural nets have makes a lot of sense," Dean said. "If you can produce a system that is really good at doing very specific [accelerated low-precision linear algebra] operations, that's what we want."

Of the 14 Grand Challenges for Engineering in the 21st Century identified by the National Academy of Engineering in 2008, Dean believes that neural networks can play an integral role in solving five, including restoring and improving urban infrastructure, advancing health informatics, engineering better medicines, and reverse engineering the human brain. But Dean said neural networks offer the greatest potential for helping to solve the final challenge on the NAE's list: engineering the tools for scientific discovery.

"People have woken up to the idea that we need more computational power for a lot of these problems," Dean said.

Google recently began giving customers and researchers access to the second generation of its Tensor Processing Unit (TPU) machine-learning ASIC through a cloud service. A custom accelerator board featuring four of the second-generation devices delivers 180 teraflops of computation and 64 GB of High Bandwidth Memory (HBM).

Dean said the devices are designed to be connected into larger configurations: a "TPU pod" featuring 64 second-generation TPUs, capable of 11.5 petaflops and offering 4 terabytes of HBM. He added that Google is making 1,000 Cloud TPUs available for free to top researchers who are committed to open machine learning research.

"We are pretty excited about the possibilities of the pod for solving bigger problems," Dean said.

In 2015, Google released its TensorFlow software library for machine learning as open source, with the goal of establishing a common platform for expressing machine learning ideas and systems. Dean showed a chart demonstrating that, in just over a year and a half, TensorFlow has become far more popular than other libraries with similar uses.

"It's been pretty rewarding to have this rather large community now crop up," Dean said.

The rise of neural networks, which has accelerated greatly over the past five years, has been made possible by tremendous advances in compute power over the past 20 years, Dean said. He noted that he wrote a thesis on neural networks in 1990, believing at the time that they were not far from being viable and needed only about 60 times more compute power than was available then.

"It turned out that what we really needed was about 1 million times more compute power, not 60," Dean said.
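To make the "accelerated low-precision linear algebra" Dean refers to concrete, the NumPy sketch below quantizes a float32 matrix multiply to 8-bit integers with 32-bit accumulation, the style of arithmetic a TPU-like matrix unit is built to accelerate. This is an illustration of the general technique, not Google's implementation, and every name in it is hypothetical.

```python
import numpy as np

def quantize(m, bits=8):
    """Map a float32 matrix onto signed integers with a single scale factor."""
    scale = np.abs(m).max() / (2 ** (bits - 1) - 1)
    return np.round(m / scale).astype(np.int8), scale

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 8)).astype(np.float32)
b = rng.standard_normal((8, 3)).astype(np.float32)

qa, sa = quantize(a)
qb, sb = quantize(b)

# Multiply in int8, accumulate in int32, then rescale to float:
# the core low-precision operation a specialized matrix unit speeds up.
approx = (qa.astype(np.int32) @ qb.astype(np.int32)) * (sa * sb)
exact = a @ b
print(np.max(np.abs(approx - exact)))  # small quantization error
```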
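As a sanity check, the pod figures Dean quoted are internally consistent if the 64 units counted are the four-chip accelerator boards described above (an assumption; the article does not say explicitly which unit is meant):

```latex
64 \times 180~\text{TFLOPS} = 11{,}520~\text{TFLOPS} \approx 11.5~\text{PFLOPS},
\qquad
64 \times 64~\text{GB} = 4{,}096~\text{GB} = 4~\text{TB}
```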
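For a sense of what "a common platform for expressing machine learning ideas" looks like in practice, here is a minimal sketch in the TensorFlow 1.x graph API that was current at the time of the keynote; the model and shapes are illustrative, not taken from the talk.

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x graph-style API, current in 2017

# Express a model as a dataflow graph: one dense layer with softmax.
x = tf.placeholder(tf.float32, shape=[None, 784])   # input batch
w = tf.Variable(tf.zeros([784, 10]))                # weights
b = tf.Variable(tf.zeros([10]))                     # biases
probs = tf.nn.softmax(tf.matmul(x, w) + b)          # class probabilities

# The same graph definition can be executed on CPUs, GPUs, or TPUs.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(2, 784).astype(np.float32)
    print(sess.run(probs, feed_dict={x: batch}).shape)  # (2, 10)
```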
Release time: 2017-08-24
