Toshiba Develops High-Speed Algorithm and Hardware Architecture for Deep Learning Processor
Toshiba Memory Corporation today announced the development of a high-speed, energy-efficient algorithm and hardware architecture for deep learning processing with little degradation of recognition accuracy. The new deep learning processor, implemented on an FPGA, achieves four times the energy efficiency of conventional processors. The advance was announced at the IEEE Asian Solid-State Circuits Conference 2018 (A-SSCC 2018) in Taiwan on November 6.

Deep learning calculations generally require large numbers of multiply-accumulate (MAC) operations, resulting in long calculation times and high energy consumption. Techniques that reduce the number of bits used to represent parameters (bit precision) have been proposed to cut the total amount of calculation; some reduce the bit precision down to one or two bits, but such techniques degrade recognition accuracy.

Toshiba Memory developed a new algorithm that reduces MAC operations by optimizing the bit precision of MAC operations for individual filters in each layer of a neural network. With the new algorithm, MAC operations can be reduced with little degradation of recognition accuracy.

Furthermore, Toshiba Memory developed a new hardware architecture, called the bit-parallel method, which is suited to MAC operations at differing bit precision. The method splits each operand of arbitrary bit precision into individual bits and executes the resulting 1-bit operations on numerous MAC units in parallel. This significantly improves the utilization efficiency of the MAC units in the processor compared with conventional MAC architectures that execute bit-serially.

Toshiba Memory implemented ResNet50, a deep neural network, on an FPGA using the variable bit precision and the bit-parallel MAC architecture.
In image recognition on the ImageNet dataset, the technique reduces both the operation time and the energy consumption of recognizing image data to 25% of the conventional method's, with little degradation of recognition accuracy.
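The bit-parallel decomposition can be illustrated in software: a k-bit multiply splits into k one-bit partial products (a 1-bit multiply is just an AND mask, followed by a shift), and each partial product maps onto its own 1-bit MAC unit in hardware. The following is a minimal sketch with illustrative names, not Toshiba Memory's implementation:

```python
def bit_planes(w, bits):
    """Split a non-negative integer weight into its 1-bit planes, LSB first."""
    return [(w >> i) & 1 for i in range(bits)]

def bit_parallel_mac(weights, activations, bits):
    """Accumulate sum(w * a) using only 1-bit multiplies and shifts.

    In hardware, each 1-bit partial product could run on a separate
    1-bit MAC unit in parallel; here they are simply summed in software.
    """
    acc = 0
    for w, a in zip(weights, activations):
        for i, b in enumerate(bit_planes(w, bits)):
            acc += (b * a) << i  # 1-bit multiply: keep or drop the operand
    return acc

print(bit_parallel_mac([3, 5], [2, 4], bits=3))  # 3*2 + 5*4 = 26
```

Because the cost is proportional to the bit width, a filter quantized to 2 bits consumes half the 1-bit operations of a 4-bit filter, which is why per-filter precision optimization reduces the total MAC workload.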
Release time: 2018-11-07
LG Electronics and Luxoft forge partnership to bring webOS to new markets
Global IT service provider Luxoft has partnered with LG Electronics to create the next generation of webOS, as part of a strategy to extend its capabilities and ecosystem into the automotive, robotics, and smart home vertical markets.

LG Electronics has previously deployed webOS in over 60 million LG smart TVs and digital signage displays worldwide, and this installed base is growing rapidly. As a strategic partner, Luxoft will bring additional technical assets as well as experience in designing and deploying software platforms for a wide variety of products and services.

“Thanks to our collaboration with Luxoft, we are able to bring webOS into automotive and beyond,” said I.P. Park, Chief Technology Officer at LG Electronics. “Luxoft is providing a substantial technological contribution to webOS and has also greatly enhanced our ability to deploy it into new industries.”

Luxoft will lead the deployment of webOS into the new target sectors, beginning with automotive. The initial focus is digital cockpit development, which includes infotainment, navigation, and other features centered on human-car interaction. Luxoft and LG Electronics also plan to introduce the new platform into the robotics and smart home sectors.

“We’re already leveraging LG Electronics’ thriving smart TV ecosystem to customise and enhance webOS so it provides an innovation canvas for car manufacturers to develop next-generation autonomous vehicles,” explained Mikael Soderberg, Senior Technical Director, Automotive at Luxoft. “Having access to webOS and its cloud services platform will enable car makers to design and develop better customer experiences for autonomous mobility services.”

Commenting on the agreement, President and CEO of Luxoft, Dmitry Loschinin, said, “Underlying this partnership is a shared desire to make it easier for manufacturers to innovate with technology. This platform gives them the flexibility to make digital changes. This will help accelerate the mobility revolution, improve human-robotic interactions and make smart devices even smarter.”
Release time: 2018-09-04
Global TV Panel Shipments Grew 9.1% Month-on-Month in May; BOE Surpassed LGD and Ranked First
According to the latest report by WitsView, a division of TrendForce, global TV panel shipments reached 23.52 million in May, a 9.1% growth over the previous month. Although branded TV makers have purchased fewer panels since 2Q18 in order to adjust their inventory levels, panel makers have kept utilization rates high. In this situation, they planned special deal projects to increase shipments, which would probably lower panel prices further.

Panel shipments have shown some improvement in June. “TV set retail prices in China’s 618 mid-season sales have been 20% lower than in the 315 sales,” says Iris Hu, research manager of WitsView, “which may stimulate channel sales and consume inventories.” Toward the end of 2Q, TV panel prices have approached cash costs. TV brands are expected to stock up on panels in advance, considering the limited room for further price decline. “Therefore, we expect TV panel shipments in June to be roughly the same as in May.”

BOE returned to the top of the shipment ranking thanks to special deal projects; Innolux ranked second with 42.5% monthly growth

BOE has adjusted the product mix of its Gen 8.5 fab, increasing the share of 55-inch panels, whose shipments reached a record high of 590,000 pieces. In small sizes, BOE shipped all 600,000 pieces of 32-inch panels in inventory at special deal prices, and total shipments of this size reached 2.578 million pieces, a growth of 30.2% over last month. In total, BOE shipped 4.774 million pieces of TV panels in May, a monthly growth of 19%. This is a new shipment record for BOE, bringing the company back to the top of the ranking.

Innolux recorded relatively low shipments in April due to poor demand, ending up with high inventory levels of 39.5-inch and 50-inch panels. Like BOE, Innolux also offered special deal projects, consuming nearly 40% of its 39.5-inch panel inventory. Its shipments of 39.5-inch panels increased significantly, by 136%, to 1.37 million pieces in May. In total, Innolux shipped 3.817 million pieces of TV panels, a growth of 42.5% month-on-month. With the highest shipment growth among the six major panel makers, Innolux returned to second place in the ranking.

The panel price drop has reduced customers’ willingness to stock up; LGD’s shipments decreased by 7% month-on-month in May

LGD has also been affected by the panel price drop that began in 2Q18, which lowered customers’ willingness to stock up. As a result, LGD’s shipments have been decreasing for two consecutive months. The mainstream sizes, 43- to 65-inch products, recorded a shipment decline of 9.4%. In all, LGD shipped 3.643 million pieces of TV panels in May, a decrease of 7% compared with the previous month, the steepest decline among the six major panel makers.

CSOT, one of the major suppliers of 32-inch panels, has adjusted its product mix to reduce the risk of stocking up on 32-inch products. Its shipments of 43-inch and 49-inch products grew by 38.8% and 21.1% compared with last month. Yet in total, CSOT’s monthly shipments across all sizes still declined by 3.1% to 3.241 million pieces.

Samsung Display (SDC) started the annual maintenance of its Korean fab in April amid the weak demand in Q2, resulting in low production volume and shipments in April and May. In June, SDC shipped 400,000 pieces of 65-inch panels thanks to its production scale and cost advantages. Across all sizes, SDC shipped 3.15 million pieces in May, a 2.8% increase over the previous month.

AUO’s TV panel shipments recorded 2.005 million pieces in May, a monthly decline of 1.1%. This was mainly attributed to customers’ lower willingness to stock up because of the overall poor demand and the continued drop in panel prices. Shipments of 50-inch and 55-inch panels declined by 6.3% and 7.8% respectively.
Release time: 2018-06-29
LG Display shares take a dive on grim LCD outlook
Investors continue to dump shares of LG Display Co. as the outlook on the world’s largest TV-screen display maker deteriorates amid a supply glut and heavy competition from government-backed Chinese rivals. LG Display’s shares sank to a new 52-week low of 17,750 won ($15.87) in mid-day trading on Tuesday. The last time its shares fell below 18,000 won was in October 2011. The company has lost nearly 30 percent of its market value over the past three months. Its shares closed Wednesday at 18,300 won, up 0.55 percent from the previous session.

LG Display was the only local display manufacturer to fall into the red in the first quarter. It posted an operating loss of 98.3 billion won in the first three months of this year, its first loss since the first quarter of 2012. Market observers forecast that its losses widened in the second quarter, with few signs of a pickup in soft LCD prices. LG Display, which generates 90 percent of its sales from LCD panels, has recently taken a hit as BOE, CSOT and other Chinese rivals have pumped up supply and driven down prices on the back of strong government support.

“LG Display needs to beef up its organic light-emitting diode (OLED) business but it is running low on cash as it fails to make money from its mainstay LCD,” Korea Investment & Securities said in a report released on Tuesday. “Conditions remain tough as its losses are expected to continue throughout the third quarter and its cash flow to face rapid deterioration.”

Earlier this year, the company vowed to make OLED its mainstay product by 2020 and to cement its leadership in next-generation TV screens by investing 20 trillion won over the next few years. But it may have to adjust its OLED investment plans due to the faster-than-expected cooling of the display market.

Korea Investment & Securities Co. cut its price target for LG Display from 37,000 won to 29,000 won. Other local brokerages have also lowered their targets on the bleak outlook. KTB Investment & Securities Co. slashed its target price from 25,000 won to 22,000 won, and Hi Investment & Securities Co. from 31,000 won to 23,000 won.
Release time: 2018-06-28
Algorithm Speeds GPU-based AI Training 10x on Big Data Sets
IBM Zurich researchers have developed a generic artificial-intelligence preprocessing building block for accelerating Big Data machine learning algorithms by at least 10 times over existing methods. The approach, which IBM presented Monday (Dec. 4) at the Neural Information Processing Systems conference (NIPS 2017) in Long Beach, Calif., uses mathematical duality to cherry-pick the items in a Big Data stream that will make a difference, ignoring the rest.

“Our motivation was how to use hardware accelerators, such as GPUs [graphics processing units] and FPGAs [field-programmable gate arrays], when they do not have enough memory to hold all the data points” for Big Data machine learning, IBM Zurich collaborator Celestine Dünner, co-inventor of the algorithm, told EE Times in advance of the announcement.

“To the best of our knowledge, we are the first to have a generic solution with a 10x speedup,” said co-inventor Thomas Parnell, an IBM Zurich mathematician. “Specifically, for traditional, linear machine learning models — which are widely used for data sets that are too big for neural networks to train on — we have implemented the techniques on the best reference schemes and demonstrated a minimum of a 10x speedup.” Martin Jaggi at École Polytechnique Fédérale de Lausanne (EPFL) also contributed to the machine learning preprocessing algorithm.

For their initial demonstration, the researchers used a single Nvidia Quadro M4000 GPU with 8 gigabytes of memory, training on a 30-Gbyte data set of 40,000 photos using a support vector machine (SVM) algorithm that resolves the images into classes for recognition. The SVM algorithm also creates a geometric interpretation of the model learned (unlike neural networks, which cannot justify their conclusions). IBM’s data preprocessing method enabled the algorithm to run in less than one minute, a tenfold speedup over existing methods using limited-memory training.

The key to the technique is preprocessing each data point to see if it is the mathematical dual of a point already processed. If it is, the algorithm simply skips it, which happens increasingly often as the data set is processed. “We calculate the importance of each data point before it is processed by measuring how big the duality gap is,” Dünner said.

“If you can fit your problem in the memory space of the accelerator, then running in-memory will achieve even better results,” Parnell told EE Times. “So our results apply only to Big Data problems. Not only will it speed up execution time by 10 times or more, but if you are running in the cloud, you won’t have to pay as much.”

As Big Data sets grow, such time- and money-saving preprocessing algorithms will become increasingly important, according to IBM. To show that its duality-based algorithm works with arbitrarily large data sets, the company showed an eight-GPU version at NIPS that handles a billion examples of click-through data for web ads.

The researchers are developing the algorithm further for deployment in IBM’s Cloud. It will be recommended for Big Data sets involving social media, online marketing, targeted advertising, finding patterns in telecom data, and fraud detection. For details, read “Efficient Use of Limited-Memory Accelerators for Linear Learning on Heterogeneous Systems,” by Dünner, Parnell, and Jaggi.
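The "importance by duality gap" idea Dünner describes can be sketched for one concrete linear model, the hinge-loss SVM: the total duality gap decomposes into a per-point term that is zero for points already handled optimally, so points with a large per-point gap are the ones worth keeping in the accelerator's limited memory. The code below is an illustrative sketch under that assumption, with invented names and random data, not IBM's implementation:

```python
import numpy as np

def per_point_duality_gap(X, y, alpha, lam):
    """Per-point duality gap for an L2-regularized hinge-loss SVM.

    With margins m_i = y_i * w.x_i and dual variables alpha_i in [0, 1],
    gap_i = max(0, 1 - m_i) + alpha_i * (m_i - 1) >= 0, and the sum of
    gap_i equals the total primal-dual gap.
    """
    w = (alpha * y) @ X / lam        # primal iterate induced by the duals
    m = y * (X @ w)                  # per-point margins
    return np.maximum(0.0, 1.0 - m) + alpha * (m - 1.0)

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                  # 8 toy data points
y = np.where(rng.normal(size=8) > 0, 1.0, -1.0)
alpha = rng.uniform(0.0, 1.0, size=8)        # dual variables in [0, 1]

gaps = per_point_duality_gap(X, y, alpha, lam=1.0)
keep = np.argsort(gaps)[-4:]  # keep only the most "important" half
```

Points whose gap is (near) zero can be skipped without affecting convergence, which is why, as the article notes, the skipping becomes more frequent as training proceeds.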
Release time: 2017-12-06
Big Data Algorithms, Languages Expand
The buzz around big data is spawning new algorithms, programming languages, and techniques at the speed of software.

“Neural networks have been around for a long time. What’s new is the large amounts of data we have to run against them and the intensity of engineering around them,” said Inderpal Bhandari, a veteran computer scientist who was named IBM’s first chief data officer. He described work using generative adversarial networks to pit two neural nets against each other to create a better one. “This is an engineering idea that leads to more algorithms — there is a lot of that kind of engineering around neural networks now.”

In some ways, the algorithms are anticipating tomorrow’s hardware. For example, quantum algorithms are becoming hot because they “allow you to do some of what quantum computers would do if they were available, and these algorithms are coming of age,” said Anthony Scriffignano, chief data scientist for Dun & Bradstreet. Deep belief networks are another hot emerging approach. Scriffignano describes them as “a non-regressive way to modify your goals and objectives while you are still learning — as such, it has characteristics of tomorrow’s neuromorphic computers,” systems geared to mimic the human brain.

At Stanford, the DeepDive algorithms developed by Chris Ré have been getting traction. They help computers understand and use unstructured data like text, tables, and charts as easily as relational databases or spreadsheets, said Stephen Eglash, who heads the university’s data science initiative. “Much of existing data is un- or semi-structured. For example, we can read a datasheet with ease, but it’s hard for a computer to make sense of it.” So far, DeepDive has helped oncologists use computers to interpret photos of tumors. It’s being used by the New York attorney general as a law enforcement tool. It’s also in use across a large number of companies working in different domains.

DeepDive is unique in part because “it IDs and labels everything and then uses learning engines and probabilistic techniques to figure out what they mean,” said Eglash. While successful, the approach is just one of many algorithm efforts in academia these days. Others focus on areas such as computer vision or try to ID anomalies in real-time data streams. “We could go on and on,” said Eglash.
Release time: 2017-06-09
