Analog Devices and Seeing Machines Collaborating on ADAS Solutions
Analog Devices Inc. and Seeing Machines are collaborating to support high-performance driver and occupant monitoring system (DMS/OMS) technology.

Long-haul driving and congested traffic are two scenarios where driver fatigue and distraction often occur and frequently cause accidents, resulting in injury or worse. New and sophisticated advanced driver assistance systems (ADAS) are rapidly evolving to support safety across increasing, varied levels of autonomous capability.

The collaboration pairs ADI's advanced infrared driver and high-speed Gigabit Multimedia Serial Link (GMSL) camera connectivity solutions with Seeing Machines' artificial intelligence (AI) DMS and OMS software to support eye-gaze, eyelid, head, and body-pose tracking that more accurately monitors driver fatigue and distraction. The combined solution will readily meet European Commission General Safety Regulations (GSR) and European New Car Assessment Program (Euro NCAP) requirements. It also paves the way for future occupant monitoring features and a range of in-cabin camera placement options that were previously unworkable due to challenges related to power efficiency, functional safety, hardware footprint, and image quality.

Semi-autonomous driving systems rely on in-cabin DMS and OMS to recognize and address driver fatigue and distraction. These systems must operate in all lighting conditions and require proper infrared illumination to ensure the frame-by-frame image quality necessary for eye tracking in real time. The combined solution from ADI and Seeing Machines leverages ADI's industry-first infrared driver for DMS and OMS, capable of delivering up to 100 W of peak power in a compact and functionally safe package. This allows for a non-intrusive, smaller camera module in a vehicle's cabin.

Seeing Machines' AI software interprets the signals from the optical hardware, monitors for and diagnoses fatigue and distraction, and works with other ADAS features to generate output signals that warn drivers and vehicle occupants when necessary.

"Seeing Machines exists to get people home safely, and our work with ADI aims to support semi-autonomous driving with increased safety levels to deliver what we call 'supervised automation'," said Nick DiFiore, SVP and GM of Automotive at Seeing Machines. "ADI's proven automotive-grade, near-infrared drivers and GMSL devices enable a sophisticated optical path to provide critical illumination and high-speed video bandwidth for real-world and real-time processing of interior cabin environments."

"Cabin monitoring is complex and requires careful integration of infrared illumination, image capture, data processing, and algorithm layers to achieve a real-time response," said Yin Wu, Director of Automotive Product Line Management at Analog Devices. "Together with Seeing Machines, we are supporting the automotive industry with pragmatic solutions to help reduce collisions and save lives."
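As background on how a DMS turns frame-by-frame eye tracking into a fatigue warning, below is a minimal, illustrative sketch of a PERCLOS-style eye-closure monitor. This is not Seeing Machines' proprietary algorithm: the openness threshold, window length, and alarm fraction are assumed placeholder values, and the per-frame eye-openness score is presumed to come from an upstream tracker.

```python
from collections import deque

# Illustrative only: a PERCLOS-style fatigue check, NOT Seeing Machines'
# proprietary algorithm. Assumes an upstream eye tracker reports a per-frame
# eye-openness score in [0.0, 1.0] (1.0 = fully open); all thresholds are
# assumed placeholder values.

class DrowsinessMonitor:
    def __init__(self, fps=60, window_s=60.0, closed_thresh=0.2, perclos_alarm=0.15):
        self.window = deque(maxlen=int(fps * window_s))  # rolling frame history
        self.closed_thresh = closed_thresh   # openness below this counts as "closed"
        self.perclos_alarm = perclos_alarm   # closed-frame fraction that triggers a warning

    def update(self, eye_openness: float) -> bool:
        """Feed one frame's eye-openness score; return True to raise a fatigue warning."""
        self.window.append(eye_openness < self.closed_thresh)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        perclos = sum(self.window) / len(self.window)  # fraction of time eyes were closed
        return perclos >= self.perclos_alarm

# Tiny demo with a deliberately short window so the warning can fire.
monitor = DrowsinessMonitor(fps=1, window_s=5.0)
for openness in [0.9, 0.1, 0.1, 0.05, 0.8]:  # stand-in for real tracker output
    if monitor.update(openness):
        print("Fatigue warning: eyes closed too often")
```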
Release time: 2023-01-06 10:55
Renesas and Fixstars Developing Tools for AD, ADAS AI Software Optimization
Renesas Electronics Corp. and Fixstars Corp. are jointly developing a suite of tools that allows optimization and fast simulation of software for autonomous driving (AD) systems and advanced driver-assistance systems (ADAS), specifically designed for the R-Car system-on-chip (SoC) devices from Renesas.

These tools make it possible, from the initial stage of software development, to rapidly develop network models that achieve highly accurate object recognition while taking full advantage of the R-Car's performance. This reduces post-development rework and thereby helps shorten development cycles.

"Renesas continues to create integrated development environments that enable customers to adopt the 'software-first' approach," said Hirofumi Kawaguchi, Vice President of the Automotive Software Development Division at Renesas. "By supporting the development of deep learning models tailored to R-Car, we help our customers build AD and ADAS solutions, while also reducing the time to market and development costs."

"GENESIS for R-Car, a cloud-based evaluation environment that we built jointly with Renesas, allows engineers to evaluate and select devices earlier in the development cycle and has already been used by many customers," said Satoshi Miki, CEO of Fixstars. "We will continue to develop new technologies to accelerate machine learning operations (MLOps) that can be used to maintain the latest versions of software in automotive applications."

Today's AD and ADAS applications use deep learning to achieve highly accurate object recognition. Deep learning inference processing requires massive amounts of data calculation and memory capacity, so the models and executable programs must be optimized for the target automotive SoC: real-time processing with limited arithmetic units and memory resources is a challenging task. In addition, the process from software evaluation to verification must be accelerated, and updates need to be applied repeatedly to improve accuracy and performance.
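To make the optimization problem concrete, here is a generic numpy sketch of symmetric per-tensor int8 post-training quantization, one common way to fit a deep learning model into an SoC's limited memory and arithmetic units. This is not the Renesas/Fixstars toolchain, just an illustration of the idea: float32 weights shrink by 4x at the cost of a small reconstruction error.

```python
import numpy as np

# Generic sketch, not the Renesas/Fixstars toolchain: symmetric per-tensor
# int8 post-training quantization, illustrating how a model's memory footprint
# shrinks 4x (float32 -> int8) for a resource-limited automotive SoC.

def quantize_int8(weights: np.ndarray):
    scale = np.abs(weights).max() / 127.0            # map largest magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale              # approximate reconstruction

w = np.random.randn(256, 256).astype(np.float32)     # stand-in for one layer's weights
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).max()
print(f"footprint: {w.nbytes} -> {q.nbytes} bytes, max abs error {err:.4f}")
```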
Release time: 2022-12-27 13:23
Wind River teams up with Renesas to advance ADAS
Wind River and Renesas are expanding their collaboration on connected, advanced driver assistance, and autonomous driving programmes.

Plans include the pre-validation of Wind River software with the latest Renesas R-Car systems-on-chip (SoCs). Software from the Wind River automotive portfolio is currently available on reference hardware for several existing Renesas SoCs. These include SoCs for high-end, next-generation systems targeting vision and image processing applications relevant to autonomous driving and safety support systems, as well as SoCs targeting in-vehicle infotainment.

"Connected and autonomous cars must intelligently interpret the world around them, and this is pushing software complexity to rapidly increase within the automobile. In order to achieve advanced automotive systems, software and hardware must work together seamlessly," said Marques McCammon, vice president of Automotive at Wind River. "By leveraging pre-validated solutions that are ready for deployment, car makers can quickly achieve their goals for delivering innovation and accelerating their time-to-market. We look forward to continuing our work with leaders such as Renesas, as well as further strengthening our efforts with silicon partners across the ecosystem."

Renesas is a major supplier of advanced semiconductor solutions, including microcontrollers (MCUs), SoCs, and a broad range of analogue and power devices.

"With the move toward a more connected and autonomous future and the growing sophistication of automotive systems, increased collaboration across the ecosystem is essential," said Masayasu Yoshida, senior director of the Automotive Technical Customer Engagement Division at Renesas. "Together, Wind River and Renesas can combine their expertise to dramatically improve development efficiencies for car makers to better meet the evolving demands of the automotive industry."

The Wind River automotive portfolio includes VxWorks, a high-performance RTOS tuned for both determinism and responsiveness, with a proven track record in safety- and security-certified environments that makes it an ideal solution for quickly commercializing ADAS and autonomous applications. Wind River Linux and other commercial-grade open-source technologies, in combination with Wind River technical support and maintenance, will help customers stay up to date on the latest innovations. Wind River Edge Sync provides a software framework for remote over-the-air (OTA) updates and software lifecycle management, allowing rapid, safe, and secure updates to software and firmware throughout the vehicle lifecycle.

Under the hood, SoCs play an important role in safety-oriented systems such as autonomous driving and automated driving technologies. Renesas SoCs deliver high computer vision performance and AI processing at industry-leading low power levels, targeting automotive front cameras for use in mass-produced Level 3 (conditional automation) and Level 4 (high automation) autonomous vehicles. Part of the open Renesas autonomy Platform for autonomous and automated driving, R-Car SoCs enable design flexibility for Tier 1 suppliers and auto manufacturers looking to develop highly automated vehicles.
Release time: 2018-12-05
A Look at ST's Plans for EVs, ADAS and China
A new report from International Data Corporation (IDC) presents IDC's inaugural forecast for the worldwide 5G network infrastructure market for the period 2018–2022. It follows the release of IDC's initial forecasts for Telecom Virtual Network Functions (VNF) and Network Functions Virtualization Infrastructure (NFVI) in September and August 2018, respectively.

With the first instances of 5G services rolling out in the fourth quarter of 2018, 2019 is set to be a seminal year in the mobile industry. 5G handsets will begin to hit the market and end-users will be able to experience 5G technology firsthand.

From an infrastructure standpoint, the mobile industry continues to trial innovative solutions that leverage new spectrum, network virtualization, and machine learning and artificial intelligence (ML/AI) to create new value from existing network services. While these and other enhancements will play a critical role, 5G NR represents a key milestone in the next mobile generation, enabling faster speeds and enhanced capacity at lower cost per bit. Even as select cities begin to experience 5G NR today, the full breadth of 5G's potential will take several years to arrive, which will require additional standards work and trials, particularly related to a 5G NG core.

In addition to 5G NR and 5G NG core, procurement patterns indicate communications service providers (SPs) will need to invest in adjacent domains, including backhaul and NFVI, to support the continued push to cloud-native, software-led architectures.

Combined, IDC expects the total 5G and 5G-related network infrastructure market (5G RAN, 5G NG core, NFVI, routing and optical backhaul) to grow from approximately $528 million in 2018 to $26 billion in 2022 at a compound annual growth rate (CAGR) of 118%. IDC expects 5G RAN to be the largest market sub-segment through the forecast period, in line with prior mobile generations.

"Early 5G adopters are laying the groundwork for long-term success by investing in 5G RAN, NFVI, optical underlays, and next-generation routers and switches. Many are also in the process of experimenting with the 5G NG core. The long-term benefit of making these investments now will be when the standards-compliant SA 5G core is combined with a fully virtualized, cloud-ready RAN in the early 2020s. This development will enable many communications SPs to expand their value proposition and offer customized services across a diverse set of enterprise verticals through the use of network slicing," says Patrick Filkins, senior research analyst, IoT and Mobile Network Infrastructure.

The report, Worldwide 5G Network Infrastructure Forecast, 2018–2022 (IDC #US44392218), presents IDC's inaugural forecast for the 5G network infrastructure market. Revenue is forecast for both the 5G RAN and 5G NG Core segments and each of the three related sub-segments (NFVI, Routing Backhaul, and Optical Backhaul). The report also provides a market overview, including drivers and challenges for communications service providers and advice for technology suppliers.
Release time: 2018-11-15
Nvidia Enters ADAS Market via AI-Based Xavier
Nvidia is in Munich this week to declare that it is coming after the advanced driver assistance system (ADAS) market. The GPU company is now pushing its AI-based Nvidia Drive AGX Xavier system, originally designed for Level 4 autonomous vehicles, down to Level 2+ cars.

In a competitive landscape already crowded with ADAS solutions from rival chip vendors such as NXP, Renesas, and Intel/Mobileye, Nvidia is boasting that its GPU-based automotive SoC isn't just a "development platform" for OEMs to prototype their self-driving vehicles.

At the company's own GPU Technology Conference (GTC) in Europe, Nvidia announced that Volvo cars will be using the Nvidia Drive AGX Xavier for its next generation of ADAS vehicles, with production starting in the early 2020s.

[Photo caption: Nvidia's Drive AGX Xavier will be designed into Volvo's ADAS L2+ vehicles. Henrik Green (left), head of R&D of Volvo Cars, with Nvidia CEO Jensen Huang on stage at GTC Europe in Munich. (Photo: Nvidia)]

Danny Shapiro, senior director of automotive at Nvidia, told us, "Volvo isn't doing just traditional ADAS. They will be delivering wide-ranging features of 'Level 2+' automated driving."

By Level 2+, Shapiro means that Volvo will be integrating "360° surround perception and a driver monitoring system" in addition to a conventional adaptive cruise control (ACC) system and automated emergency braking (AEB) system. Nvidia added that its platform will enable Volvo to "implement new connectivity services, energy management technology, in-car personalization options, and autonomous drive technology."

It remains unclear if car OEMs designing ADAS vehicles are all that eager for the AI-based Drive AGX Xavier, which is hardly cheap. Shapiro said that if any car OEMs or Tier Ones are serious about developing autonomous vehicles, taking an approach that "unifies ADAS and autonomous vehicle development" makes sense. The move allows carmakers to develop software algorithms on a single platform. "They will end up saving cost," he said.

Phil Magney, founder and principal at VSI Labs, agreed. "The key here is that this is the architecture that can be applied to any level of automation." He said, "The processes involved in L2 and L4 applications are largely the same. The difference is that L4 would require more sensors, more redundancy, and more software to assure that the system is safe enough even for robo-taxis, where you don't have a driver to pass control to when the vehicle encounters a scenario that it cannot handle."

Better than discrete ECUs

Another argument for the use of AGX for L2+ is that the alternative requires the use of multiple discrete ECUs. Magney said, "An active ADAS system (such as lane keeping, adaptive cruise, or automatic emergency braking) requires a number of cores fundamental to automation. Each of these tasks requires a pretty sophisticated hardware/software stack." He asked, "Why not consolidate them instead of having discrete ECUs for each function?"

Scalability is another factor. Magney rationalized, "A developer could choose AGX Xavier to handle all these applications. On the other hand, if you want to develop a robo-taxi, you need more sensors, more software, more redundancy, and higher processor performance … so you could choose AGX Pegasus for this."

Is AGX Xavier safer?

Shapiro also brought up safety issues. He told us, "Recent safety reports show that many L2 systems aren't doing what they say they would do." Indeed, in August, the Insurance Institute for Highway Safety (IIHS) exposed "a large variability of Level 2 vehicle performance under a host of different scenarios." An EE Times story entitled "Not All ADAS Vehicles Created Equal" reported that some L2 systems can fail under any number of circumstances. In some cases, certain models equipped with ADAS are apparently blind to stopped vehicles and could even steer directly into a crash.

Nvidia's Shapiro implied that by "integrating more sensors and adding more computing power" that runs robust AI algorithms, Volvo can make its L2+ cars "safer."

On the topic of safety, Magney didn't necessarily agree. "More computing power doesn't necessarily mean that it is safer." He noted, "It all depends on how it is designed." Lane keeping, adaptive cruise, and emergency braking for L2 could rely on a few sensors and associated algorithms while a driver at the wheel manages events beyond the system's capabilities.

However, the story is different with a robo-taxi, explained Magney. "You are going to need a lot more … more sensors, more algorithms, some lock-step processing, and localization against a precision map." He said, "For example, if you go from a 16-channel LiDAR to a 128-channel LiDAR for localization, you are working with eight times the amount of data for both your localization layer as well as your environmental model." (A back-of-envelope check of that arithmetic follows this article.)

Competitive landscape

But really, what does Nvidia have that competing automotive SoC chip suppliers don't? Magney, speaking from his firm VSI Labs' own experience, said, "The Nvidia Drive development package has the most comprehensive tools for developing AV applications."

He added, "This is not to suggest that Nvidia is complete and a developer could just plug and play. To the contrary, there is a ton of organic codework necessary to program, tune, and optimize the performance of AV applications."

However, he concluded that, in the end, "you are going to be able to develop faster with Nvidia's hardware/software stack because you don't have to start from scratch. Furthermore, you have DRIVE Constellation for your hardware-in-loop simulations where you can vastly accelerate your simulation testing, and this is vital for testing and validation."
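Magney's "16-channel to 128-channel" figure is easy to sanity-check with back-of-envelope arithmetic. In the sketch below, only the channel counts come from the article; the per-channel return rate and bytes per point are assumed round numbers.

```python
# Back-of-envelope check of Magney's "eight times the amount of data" point.
# Only the channel counts (16 vs. 128) come from the article; the per-channel
# return rate and bytes-per-point packing below are assumed round numbers.

POINTS_PER_CHANNEL_PER_S = 18_000   # assumed returns per channel per second
BYTES_PER_POINT = 16                # assumed x, y, z, intensity, timestamp packing

def lidar_rate_mb_s(channels: int) -> float:
    return channels * POINTS_PER_CHANNEL_PER_S * BYTES_PER_POINT / 1e6

for ch in (16, 128):
    print(f"{ch:3d}-channel LiDAR ~ {lidar_rate_mb_s(ch):.1f} MB/s")

print(f"ratio: {lidar_rate_mb_s(128) / lidar_rate_mb_s(16):.0f}x")  # -> 8x
```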
Release time: 2018-10-11
Not All ADAS Vehicles Created Equal
The Insurance Institute for Highway Safety (IIHS) earlier this week unveiled results, and insights gained, from tests to evaluate such ADAS features as adaptive cruise control (ACC) and lane-keeping assist (LKA). In the tests, performed both on the road and on test tracks, IIHS found that some models struggled "in typical driving situations, such as approaching stopped vehicles and negotiating hills and curves." IIHS is a Virginia-based nonprofit organization funded by auto insurers.

The five Level 2 models that IIHS used for its testing were a 2017 BMW 5-series with "Driving Assistant Plus," a 2017 Mercedes-Benz E-Class with "Drive Pilot," a 2018 Tesla Model 3 and 2016 Model S with "Autopilot" (software versions 8.1 and 7.1, respectively), and a 2018 Volvo S90 with "Pilot Assist."

IIHS's ADAS tests have exposed a large variability of Level 2 vehicle performance under a host of different scenarios. These systems can fail under any number of circumstances. In some cases, certain models equipped with ADAS are apparently blind to stopped vehicles and could even steer directly into a crash. IIHS examined driver assistance features in road and track tests and shared its test results.

Mixed bag

Mike Demler, senior analyst at The Linley Group, called the IIHS test results a "mixed bag." He noted, "You can't point to just one factor for … poor performance." He said, "If you just look at how the Tesla Model 3 and Model S performed equally well with lane-keeping assist on curves, you might conclude they mastered that function. But then on hills, the Model S is by far the worst, and the less expensive Model 3 is the best."

Indeed, the test results can be confusing. When IIHS tested the system with adaptive cruise control turned off but automatic braking on, at 31 mph, both Teslas, the Model S and Model 3, braked but still hit a stationary vehicle. According to IIHS, they were the only two models that failed to stop in time during tests. And yet, when the same test was repeated with ACC engaged, the BMW 5-series, Mercedes-Benz E-Class, and Tesla Model 3 and Model S braked earlier and more gently than with emergency braking and still avoided the stationary vehicle.

IIHS acknowledged that it is still "crafting a consumer ratings program for ADAS." The institute noted, "IIHS can't say yet which company has the safest implementation of Level 2 driver assistance."

Building blocks of L2 vehicles

One of the most striking elements of the test results is the inconsistent ADAS performance among the five cars. Does the explanation lie in the building blocks used in L2 vehicles' ADAS features? Phil Magney, founder and principal at VSI Labs, explained that L2 systems are largely vision-first systems, often with the help of radar: "[Vision systems] maintain their lane keeping with their vision algorithms. If the lines become obscured in any way, the performance degrades. If the lines are gone, they simply will not work and cannot be engaged. All these solutions are enabled with radar as well, which gives them their dynamic speed control when following other vehicles. Most L2 solutions (including all tested) are further enabled with automated emergency braking (AEB) that is designed to mitigate collisions with stationary vehicles. This feature is typically enabled with radar and/or camera."

What causes variability?

But exactly which technical factors induce variability in ADAS behavior? Magney said, "A lot of performance variance is found on these systems because there are so many elements of the HW/SW configurations." For example, active lane keeping is a multi-step process that partitions lane detection from control systems, said Magney. "Each of these steps has a unique set of code with its own parameters. In tight turns, these solutions can fail depending on the look-ahead settings, which are necessary to calculate the curvature."

Magney added that the ACC pipeline is equally complex. "ACC regulates the longitudinal velocity of a vehicle based on the kinematics of the host vehicle and the target vehicle." The primary goal of ACC, as Magney sees it, "is to apply throttle and brakes in order to match the speed to that of the target vehicle. Both comfort and safety are key features for ACC, but in some cases, safety will take priority over comfort to avoid a collision." (A minimal sketch of such a control loop follows this article.)

Unique to radar-based calculations is all the filtering necessary to avoid false positives, he noted. "For example, if you are traveling at speed on an expressway and are overtaking a slower car in an adjacent lane, you must be certain that what you choose to brake for is within your trajectory!" False positives related to ACC happen occasionally when a vehicle brakes despite nothing in its path. Magney observed, "Most Tesla owners have experienced this. It's not so much dangerous as it is annoying."

Other details also contribute to varied performance. Demler cited "differences in the sensors, where they are positioned" and "differences in the steering control mechanisms," among others. He added, "All the variances in design factors come into play between manufacturers as well as between models from the same manufacturer."

Should we define ADAS performance standards?

Differences in ADAS performance matter to drivers. If ADAS featured in various vehicles handles driving tasks so differently, won't drivers get confused? Might such variability make it tougher for consumers to choose the vehicle they want? Wouldn't this be even worse for a rental car driver encountering ADAS features that he or she has never driven before?

A case in point, as pointed out by IIHS: "One of the questions researchers looked to answer is, do the systems handle driving tasks as humans would?" The report said, "Not always, tests showed." It explained, "When they didn't perform as expected, the outcomes ranged from the irksome, such as too-cautious braking, to the dangerous: for example, veering toward the shoulder if sensors couldn't detect lane lines."

EE Times asked if this would be the time to start defining acceptable ADAS performance standards, for safety reasons. Is anyone talking about it? Demler said that he isn't aware of anyone specifically discussing ADAS standards using a set of tests like those done by IIHS. But he agreed: "This is definitely an argument for doing it." He said, "I also expect that it would go into the New Car Assessment Program (NCAP) rating, but lane-keep assist and ACC aren't mandated features. The car magazines and consumer reports do follow the same test procedures on all cars that they evaluate, so that information is available to car buyers."

Magney concurred. "An argument for establishing performance standards can be made to level the expectations in terms of capabilities," he said. "A protocol for doing this is pragmatic and a natural extension of existing safety agencies. A Level 2 automated system should perform well against its intended design domain. This would not include an all-out scenario test but, rather, a defined protocol that examines measurable performance against defined targets."

ADAS ratings

IIHS clearly noted in its report that it "can't say yet which company has the safest implementation of Level 2 driver assistance." If so, what would it take for an institution like IIHS to come to clearer conclusions? Demler doesn't believe that IIHS is equipped to design and implement rigorous tests. He sees what's reported here as "just subjective evaluations." He said, "We need the National Highway Traffic Safety Administration (NHTSA) to implement standards, but the SAE and manufacturers should get together to drive that."

Magney also believes that it takes "a more refined approach to attempt to rate these features." He said, "We don't know how the course was set, but I think that in some examples, the curve testing and hill testing were perhaps outside the normal operating domain." Magney added, "In order to rate automated driving features, they would need to be tested against a devised set of scenarios on various types of road segments and in various conditions. Probably a pass/fail-type test based on multiple runs."
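As flagged in the ACC discussion above, here is a minimal sketch of the control loop Magney describes: regulate the host's longitudinal velocity from host and target kinematics, and gate out radar returns that sit outside the host's trajectory to avoid false-positive braking. The gains, time gap, and lane half-width are assumed illustrative values, not any production tuning.

```python
# Minimal sketch of the ACC behavior described above: match the target's speed
# while holding a safe gap, and ignore radar targets outside the host trajectory
# (the false-positive filtering Magney mentions). All constants are assumed
# illustrative values, not production tuning.

TIME_GAP_S = 1.8             # desired constant time gap to the target
MIN_GAP_M = 5.0              # standstill distance
K_GAP, K_SPEED = 0.25, 0.6   # proportional gains (assumed)
LANE_HALF_WIDTH_M = 1.8      # trajectory gate: ignore targets offset beyond this

def acc_command(host_speed, target_speed, gap, target_lateral_offset):
    """Return a longitudinal acceleration command in m/s^2 (positive = throttle)."""
    if abs(target_lateral_offset) > LANE_HALF_WIDTH_M:
        return 0.0  # target is in an adjacent lane: do not brake for it
    desired_gap = MIN_GAP_M + TIME_GAP_S * host_speed
    gap_error = gap - desired_gap               # negative: we are too close
    speed_error = target_speed - host_speed     # negative: closing on the target
    accel = K_GAP * gap_error + K_SPEED * speed_error
    return max(-6.0, min(2.0, accel))           # clamp to comfort/safety limits

# Host at 30 m/s closing on a slower in-lane target -> moderate braking (-2.8 m/s^2).
print(acc_command(host_speed=30.0, target_speed=27.0, gap=55.0, target_lateral_offset=0.3))
```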
Release time: 2018-08-13
Renesas and HELLA Aglaia offer scalable ADAS solution
Renesas Electronics and HELLA Aglaia, a specialist in intelligent visual sensor systems, have announced their open and scalable front camera solution for advanced driver assistance systems (ADAS) and automated driving.

The front camera system combines the R-Car V3M, Renesas' high-performance, low-power image recognition system-on-chip (SoC) for the New Car Assessment Program (NCAP), with HELLA Aglaia's camera software designed to meet Level 2 (partial automation) and Level 3 (conditional automation) of SAE International's new J3016 standard. The solution, which has been designed for scalability, enables system designers to build a wide range of front cameras, from cameras supporting NCAP up to cameras supporting requirements for Level 3 applications.

This collaboration is intended to open up the market for front cameras, as sensor vision is projected to become a major requirement for NCAP over the next several years, driven by the rapid adoption of NCAP safety features such as automated emergency braking, lane departure warning, and traffic sign recognition. OEMs and Tier 1s are seeking open-but-proven solutions with the right balance of performance and cost efficiency, giving them the freedom to implement a front camera with their own differentiators.

The Renesas R-Car V3M SoC implements a computer vision platform using different accelerators, including a versatile pipeline engine (IMP) and a computer vision engine (CVE), allowing the R-Car V3M to run convolutional neural networks for the classification of objects and to manage algorithms like optical flow and object detection. (A toy illustration of this operation follows the article.) The R-Car V3M also features an integrated ISP and a single DDR3L memory channel, offering potential for further cost reductions. The R-Car V3M is part of the Renesas autonomy Platform, Renesas' ground-breaking platform for ADAS and automated driving that delivers total end-to-end solutions scaling from cloud to sensing and vehicle control.

HELLA Aglaia's software solutions will shorten the development time, effort, and risk for the next wave of front camera customers. As an experienced Tier 2 supplier of field-proven camera software for Level 2 and Level 3, HELLA Aglaia delivers software that is flexible and portable and that can be combined with Tier 1 or OEM software IP, allowing customers to focus on creating their USPs. The software is scalable and covers everything from low-cost NCAP applications up to Level 3-based front camera applications.
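For intuition about the workload those accelerators pipeline, here is a toy numpy implementation of a single convolution-plus-ReLU layer, the basic building block of the CNN object classifiers the article mentions. On the R-Car V3M this work runs on the dedicated IMP/CVE hardware engines rather than in software; the kernel and frame sizes here are arbitrary.

```python
import numpy as np

# Toy illustration of the core CNN operation the V3M's accelerators pipeline:
# a 2-D convolution followed by ReLU. For intuition only; on the SoC this runs
# on the IMP/CVE hardware engines, and the kernel/frame sizes are arbitrary.

def conv2d_relu(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1  # "valid" output size
    out = np.empty((oh, ow), dtype=np.float32)
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)  # ReLU activation

edge_kernel = np.array([[-1.0, 0.0, 1.0]] * 3, dtype=np.float32)  # vertical-edge filter
frame = np.random.rand(48, 64).astype(np.float32)                 # stand-in camera tile
features = conv2d_relu(frame, edge_kernel)
print(features.shape)  # (46, 62) feature map fed to later classification layers
```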
Release time: 2017-12-07
Imagination ADAS technology licensed by DENSO
