Autonomous & Self-Driving Vehicle News: Toyota, Stanford, Applied Intuition, dSPACE & Hyundai


In autonomous and self-driving vehicle news are Toyota, Stanford, Applied Intuition, dSPACE and Hyundai.

Autonomous Double Drifting from TRI & Stanford

Toyota Research Institute (TRI) and Stanford Engineering announced a world first in driving research: autonomously drifting two cars in tandem.

For nearly seven years, the teams have collaborated on research to make driving safer. The experiments automate a motorsports maneuver called “drifting,” where a driver precisely controls a vehicle’s direction after breaking traction by spinning the rear tires—a skill transferable to recovering from a slide on snow or ice. By adding a second car drifting in tandem, the teams have now more closely simulated dynamic conditions where cars must respond quickly to other vehicles, pedestrians, and cyclists.

“Our researchers came together with one goal in mind – how to make driving safer,” said Avinash Balachandran, vice president of TRI’s Human Interactive Driving division. “Now, utilizing the latest tools in AI, we can drift two cars in tandem autonomously. It is the most complex maneuver in motorsports, and reaching this milestone with autonomy means we can control cars dynamically at the extremes. This has far-reaching implications for building advanced safety systems into future automobiles.”

“The physics of drifting are actually similar to what a car might experience on snow or ice,” said Chris Gerdes, professor of mechanical engineering and co-director of the Center for Automotive Research at Stanford (CARS). “What we have learned from this autonomous drifting project has already led to new techniques for controlling automated vehicles safely on ice.”

In an autonomous tandem drifting sequence, two vehicles—a lead car and a chase car—navigate a course at times within inches of each other while operating at the edge of control. The team used modern techniques to build the vehicle’s AI, including a neural network tire model that allowed it to learn from experience, much like an expert driver.
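
To make the "neural network tire model" idea concrete, here is a minimal sketch: a tiny network that learns the relationship between slip angle and lateral tire force from logged runs, the kind of curve a controller would otherwise take from a fixed analytic formula. The synthetic data, network size, and training loop are illustrative assumptions, not the teams' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "track logs": lateral tire force vs. slip angle, saturating like
# a real tire curve (a stand-in for the data the cars record on each run).
slip = rng.uniform(-0.5, 0.5, (256, 1))                          # slip angle [rad]
force = np.tanh(6.0 * slip) + rng.normal(0.0, 0.02, slip.shape)  # normalized force

# One hidden layer, trained with plain gradient descent on squared error.
W1, b1 = rng.normal(0.0, 0.5, (1, 16)), np.zeros(16)
W2, b2 = rng.normal(0.0, 0.5, (16, 1)), np.zeros(1)

for _ in range(2000):
    h = np.tanh(slip @ W1 + b1)                # hidden activations
    pred = h @ W2 + b2                         # predicted force
    d_pred = 2.0 * (pred - force) / len(slip)  # mean-squared-error gradient
    dW2, db2 = h.T @ d_pred, d_pred.sum(axis=0)
    dz = (d_pred @ W2.T) * (1.0 - h ** 2)      # backprop through tanh
    dW1, db1 = slip.T @ dz, dz.sum(axis=0)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.1 * g                           # gradient step

# A controller would query this learned curve in place of a fixed formula,
# refitting it as each new trip to the track adds data.
query = np.array([[0.1]])
print((np.tanh(query @ W1 + b1) @ W2 + b2).item())
```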

“The track conditions can change dramatically over a few minutes when the sun goes down,” said Gerdes. “The AI we developed for this project learns from every trip we have taken to the track to handle this variation.”

Car crashes result in more than 40,000 fatalities in the US and about 1.35 million fatalities worldwide every year. Many of these incidents are due to a loss of vehicle control in sudden, dynamic situations. Autonomy holds tremendous promise for assisting drivers to react correctly.

“When your car begins to skid or slide, you rely solely on your driving skills to avoid colliding with another vehicle, tree, or obstacle. An average driver struggles to manage these extreme circumstances, and a split second can mean the difference between life and death,” added Balachandran. “This new technology can kick in precisely in time to safeguard a driver and manage a loss of control, just as an expert drifter would.”

“Doing what has never been done before truly shows what is possible,” added Gerdes. “If we can do this, just imagine what we can do to make cars safer.”

Technical Details 

  • Experiments were conducted at Thunderhill Raceway Park in Willows, California, using two modified GR Supras: Algorithms on the lead car were developed at TRI, while Stanford engineers developed those on the chase car.
  • TRI focused on developing robust and stable control mechanisms for the lead car, allowing it to make repeatable, safe lead runs.
  • Stanford Engineering developed AI vehicle models and algorithms that enable the chase car to adapt dynamically to the motion of the lead car so that it can drift alongside without colliding.
  • GReddy and Toyota Racing Development (TRD) modified each car’s suspension, engine, transmission, and safety systems (e.g., roll cage, fire suppression). Though subtly different from each other, the vehicles were built to the same specifications used in Formula Drift competitions to help the teams collect data with expert drivers in a controlled environment.
  • Both are equipped with computers and sensors that allow them to control their steering, throttle, and brakes while also sensing their motion (e.g., position, velocity, and rotation rate).
    • Crucially, they share a dedicated WiFi network that allows them to communicate in real time by exchanging information such as their relative positions and planned trajectories.
    • To achieve autonomous tandem drifting, the vehicles must continually plan their steering, throttle, and brake commands and the trajectory they intend to follow using a technique called Nonlinear Model Predictive Control (NMPC).
  • In NMPC, each vehicle starts with objectives, represented mathematically as rules or constraints that it must obey.
    • The lead vehicle’s objective is to sustain a drift along a desired path while remaining subject to the constraints of the laws of physics and hardware limits like maximum steering angle.
    • The chase vehicle’s objective is to drift alongside the lead vehicle while proactively avoiding a collision.
  • Each vehicle then solves and re-solves an optimization problem up to 50 times per second to decide what steering, throttle, and brake commands best meet its objectives while responding to rapidly changing conditions (a simplified sketch of this receding-horizon loop appears after this list).
  • By leveraging AI to constantly train the neural network using data from previous tests, the vehicles improve from every trip to the track.
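
The receding-horizon loop described in the bullets above can be sketched as follows. This is a minimal illustration assuming a kinematic bicycle model in place of the actual drift dynamics and learned tire model; the horizon length, limits, and cost weights are invented for the example, and the chase car's collision-avoidance terms are omitted.

```python
import numpy as np
from scipy.optimize import minimize

DT = 0.02        # 50 Hz replanning, matching "up to 50 times per second"
HORIZON = 10     # number of steps the controller looks ahead
WHEELBASE = 2.5  # metres (illustrative)

def step(state, control):
    """Advance [x, y, heading, speed] one step with a kinematic bicycle model."""
    x, y, yaw, v = state
    steer, accel = control
    x += v * np.cos(yaw) * DT
    y += v * np.sin(yaw) * DT
    yaw += v / WHEELBASE * np.tan(steer) * DT
    v += accel * DT
    return np.array([x, y, yaw, v])

def cost(flat_controls, state, path):
    """Penalize deviation from the desired path plus control effort."""
    controls = flat_controls.reshape(HORIZON, 2)
    total = 0.0
    for k in range(HORIZON):
        state = step(state, controls[k])
        total += np.sum((state[:2] - path[k]) ** 2)  # path-tracking error
        total += 0.1 * np.sum(controls[k] ** 2)      # control effort
    return total

def plan(state, path):
    """Solve the horizon problem; hardware limits such as maximum steering
    angle enter as box constraints, as the bullets above describe."""
    bounds = [(-0.6, 0.6), (-5.0, 3.0)] * HORIZON  # steer [rad], accel [m/s^2]
    result = minimize(cost, np.zeros(HORIZON * 2), args=(state, path),
                      bounds=bounds, method="SLSQP")
    # Execute only the first command, then re-solve from the newest state --
    # the essence of receding-horizon (NMPC) control.
    return result.x.reshape(HORIZON, 2)[0]

state = np.array([0.0, 0.0, 0.0, 15.0])  # x, y, heading, speed
path = np.array([[15.0 * DT * (k + 1), 0.0] for k in range(HORIZON)])
steer_cmd, accel_cmd = plan(state, path)
print(f"steer={steer_cmd:.3f} rad, accel={accel_cmd:.2f} m/s^2")
```

Re-solving from the newest state estimate every cycle, rather than following a precomputed plan, is what lets each car absorb rapidly changing conditions and the other vehicle's motion.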

For more technical information, please visit TRI’s Tandem Drifting Medium blog.

Funding for Applied Intuition

Applied Intuition, a Silicon Valley-based vehicle software supplier for automotive, trucking, construction, mining, agriculture and other industries, today announced it has closed a secondary round of over $300 million and welcomes Fidelity Management & Research Company as a new investor. Existing investors General Catalyst, BOND, Lux Capital and Elad Gil also participated. As part of the secondary sale, investors purchased equity from current employees, former employees, and early investors.

Roof Sensors for AVs from Top Auto Supplier Webasto

With its Roof Sensor Module (RSM) for passenger cars, Webasto has already successfully established itself as a pioneer in the forward-looking field of autonomous driving. Now the top-100 automotive supplier is expanding its RSM portfolio to include robot taxis and autonomous trucks.

In addition to the integration of individual static and extendable lidar modules in cars, Webasto has already developed complete roof modules that contain several sensor technologies such as camera, radar and lidar. In the USA, the supplier has won a customer project with a robot taxi manufacturer and is supplying the Roof Sensor Module for the vehicle.

Thomas Schütt, head of the business unit at Webasto, explains: ‘The next big step in this field of development will be autonomous trucks. Here, the requirements for sensor integration in terms of vibration, environmental detection and runtime are even more demanding, and we can make optimum use of our expertise from the passenger car sector. For example, a compact sensor strip from Webasto, in which various sensors and functions for sensor availability are integrated, reliably transmits signals and important environmental information to the vehicle’s control unit. For Webasto, this technical development represents a strategic expansion of the product portfolio, which now covers sensor modules for passenger cars, robot taxis, people movers and autonomous trucks.’

It is of the utmost importance that the functionality of the detection systems is guaranteed at all times. Webasto combines innovative and automated cleaning and thermal management systems with sensor solutions to achieve this. Functions for cleaning, de-icing and defogging as well as sensor cooling ensure the availability of the sensors in a wide range of weather conditions. Depending on customer requirements, various sensors are combined in the RSM to create visually appealing solutions.

‘With this expansion of our portfolio, we are once again demonstrating our commitment to actively shaping the future of autonomous driving. After all, this will change the way we get around even more drastically than electromobility. We offer our customers tailor-made solutions in both areas and support them from concept to industrialization,’ concludes Schütt.

dSPACE & AWS

dSPACE, one of the world’s leading solution providers for simulation and validation of automated and connected vehicles and an Amazon Web Services (AWS) Automotive Competency Partner, is developing an advanced scenario generation solution by leveraging generative AI technologies from AWS.

The new scenario generation feature, which dSPACE is developing with support from the AWS Generative AI Innovation Center, will be integrated into the dSPACE portfolio for SIL/HIL testing and data-driven development. It is based on a large language model (LLM) available on Amazon Bedrock and a retrieval-augmented generation (RAG) architecture that leverages scenario data provided by dSPACE. As a result, users will be able to easily create a vast number of complex and realistic scenarios, including edge cases, according to the ASAM OpenSCENARIO standard, simply by entering a textual scenario description. This allows automotive manufacturers and suppliers to simulate and validate their autonomous driving systems more comprehensively and efficiently, reducing the time and cost associated with physical prototyping and testing.
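
As a rough illustration of such a pipeline (not dSPACE's implementation), the sketch below retrieves stored scenario snippets as RAG context and asks an LLM on Amazon Bedrock to draft an OpenSCENARIO file from a textual description. The retrieval function, prompt, and model ID are assumptions for the example.

```python
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def retrieve_similar_scenarios(description: str) -> list[str]:
    """Hypothetical RAG retrieval step: look up stored OpenSCENARIO examples
    resembling the request (e.g., from a vector index). Stubbed out here."""
    return ["<OpenSCENARIO><!-- cut-in on a wet two-lane road --></OpenSCENARIO>"]

def generate_scenario(description: str) -> str:
    examples = "\n\n".join(retrieve_similar_scenarios(description))
    prompt = (
        "Using the following ASAM OpenSCENARIO examples as reference:\n"
        f"{examples}\n\n"
        "Write a complete ASAM OpenSCENARIO file for this situation:\n"
        f"{description}"
    )
    # Anthropic Messages request format on Bedrock; the model ID is illustrative.
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 4000,
        "messages": [{"role": "user", "content": prompt}],
    })
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=body,
        contentType="application/json",
        accept="application/json",
    )
    return json.loads(response["body"].read())["content"][0]["text"]

print(generate_scenario(
    "A pedestrian steps off the curb at dusk while the ego vehicle "
    "approaches an unsignalized crosswalk at 50 km/h."))
```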

“dSPACE is an AWS Automotive Competency Partner. In this new initiative, we are creating innovative capabilities in scenario generation for autonomous vehicle validation. Using generative AI technologies, we ensure that our customers can easily create realistic and diverse simulation scenarios to develop safer and more reliable autonomous vehicles,” says André Rolfsmeier, Director of Strategic Product Management at dSPACE.

dSPACE’s close collaboration with AWS and the integration of generative AI technologies into dSPACE products pave the way for more comprehensive and efficient validation pipelines, ultimately contributing to safer autonomous vehicles.

dSPACE demonstrated these capabilities to the market for the first time at the AWS Automotive Symposium in Munich on June 26.

Hyundai Robotics Lab Win

Hyundai Motor Group’s (the Group) Robotics LAB has been recognized with three prestigious accolades at the 2024 Red Dot Design Award, including two Best of Best prizes. The Red Dot Award is one of the world’s most prominent design competitions; the Robotics LAB’s accolades were given in Red Dot’s Robotics category.

“These awards are the result of Hyundai Motor Group’s continuous efforts to ensure that customers and innovative robotics technologies can meet naturally,” said Dong Jin Hyun, Vice President and Head of Robotics LAB at Hyundai Motor and Kia. “We will continue to make efforts as a friendly guide to the robot intelligence society, alongside devotion to take the lead in turning imagination into reality.”

The Group’s innovative Safety Inspection Robot – named a Best of Best winner – is in operation at Hyundai Motor Group Innovation Center Singapore (HMGICS). It utilizes artificial intelligence to inspect machinery within the smart urban mobility hub and identify abnormalities. The robot’s design features optimized packaging with a sculpted cover to prevent sensor blind spots, enabling accurate detection and increased productivity. Its exterior design emphasizes the robot’s technological capabilities and integrates seamlessly into the environment at HMGICS.

To maximize its field of view and improve recognition and detection capabilities, the robot can elevate the position of its camera, which is mounted on a telescopic extension. Equipped with four PnD (Plug and Drive) modules, one at each corner, the robot can navigate around obstacles on HMGICS’ production center floor. The robot is unique in its ability to inspect gauges, valves, and components in awkward locations, correcting false readings caused by human error and reporting any missed contaminants or spillages.

The Group’s MobED Delivery robot, a compact and minimalistic mobility platform, was also awarded the title of Best of Best. MobED Delivery’s DnL (Drive and Lift) module integrates driving, steering and braking into a single eccentric wheel mechanism. This design allows MobED Delivery to dynamically adjust its body inclination and height for seamless movement and safe delivery of goods. The robot’s wheelbase can be minimized to reduce inconvenience for pedestrians while ensuring secure transportation to destinations.

MobED Delivery’s adaptability allows it to traverse different terrains with varying elevations effortlessly. Its load tray extends towards the ground when the main body is inclined, with the unit’s conveyor belt guiding items smoothly towards the rear of the robot. This mechanism ensures the safe delivery of items. MobED Delivery can be seamlessly integrated into a diverse range of environments, including offices and apartments, eliminating the need for individuals to receive items personally and enhancing convenience.

The Robotics LAB’s DAL-e Delivery robot was also recognized as a Red Dot Winner. This food, beverage and parcel delivery robot can conveniently deliver to customers in complex spaces, such as offices and shopping malls, all thanks to its PnD module.

As part of its sophisticated design, DAL-e Delivery features a top display and signature exterior LED lighting, enabling users to identify its current state. The unit’s load cabin stores delivery items securely in transit and features double doors that open automatically, as well as a powered tray that operates on delivery. The robot’s power module offers autonomous navigation thanks to a suite of cameras and sensors.

 
