Jeff Bogue 2018-02-28 00:41:51
Your autonomous chauffeur could be here sooner than you think

I HAVE to admit that trying to keep up with the throng of technologies bombarding the automotive industry is like trying to juggle cats. It seems like every other day an engineer says something completely outrageous, and then some company puts it in a new car. Active Vehicle Tracking, Head-Up Displays (HUD), Remote Vehicle Shutdown, and Biometric Access are just a few.

The Autonomous Vehicle and Driver Override Systems are the two that have me most intrigued. These are a pair that, and I love this, were touted just a few years ago as being "in cars within the next 10 to 20 years." Well, that was the quickest ten years I have seen yet. Google has sent its self-driving car out on several excursions, but Tesla, always thumbing its nose at conventional automotive ideals, had one of its vehicles cross the U.S. in just 58 hours while being "driven" mostly autonomously. "Mostly" is the key word here, but cars are parking themselves, and even finding their own parking spots. On top of that, they are overriding active human control to apply the brakes in accident-avoidance situations. I do believe that "mostly" is going to get smaller and smaller very quickly, and I am excited.

Now don't get me wrong, I love my four-wheel-drive pickup truck with the manual transmission, crank windows, and smell of hound dog, but sometimes it is just not universally practical. I remember just six or so years ago reading that most people owned phones with more computing power than what was used to put a man on the moon. Now that rough estimate seems grossly understated, and modern vehicles are making modern phones seem like weak little cameras. How does all of this technology work together, and where will it end?

What is telematics, and what does it do? Telematics is a term describing the connectivity of the vehicle (think OnStar and others), and automakers are increasing it exponentially each year.
(In 2010, fewer than 10% of vehicles had factory-installed telematics; we are looking at about 62% for the 2016 model year.)

The last couple of years at the Consumer Electronics Show (CES) have proven themselves automotive showcases. In 2016, Audi sent an A7 from San Francisco to Las Vegas fully autonomously. BMW had a car that could roam a parking lot by itself looking for a parking spot. Nvidia (all you gamers know these guys, and Tesla has a partnership with them) developed two computer systems specifically designed to handle the information generated by self-driving systems.

In January 2017, it became even more abundantly clear that tech in vehicles was going to be a force to be reckoned with. BMW debuted a new user-interface concept for the interior of the car, the HoloActive Touch system. The system features a virtual "free-floating" display that is controlled via finger gestures, not a touchscreen, and also provides haptic feedback. It uses sensors to detect where your finger is pointing, and then a vent shoots air to that location to provide the haptic feedback (which tech services that?). That is just touching the surface (pun intended) of all the other vehicles there that were capable of complete autonomous operation and/or incorporated artificial intelligence to entertain and chauffeur. The 2018 CES was no less automotive-centric, with multiple speakers from the automotive industry and manufacturers on high alert to be jaw-dropping or go home with a participation trophy.

The automobile's transition into a giant computer becomes more and more prevalent every year. These technological marvels employ a vast array of sensors, cameras, and inputs that generate an absolutely amazing amount of data. This data is handled primarily by multiple bus systems, depending on the sensor, but CAN and FlexRay are predominant, with Media Oriented Systems Transport (MOST) taking care of the video and GPS.
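To make the bus idea a little more concrete, here is a hedged Python sketch of decoding one signal from a classic CAN 2.0 data frame. The message ID, byte layout, and scaling below are invented for illustration only; real layouts come from the manufacturer's signal database, not from this example.

```python
# Hypothetical sketch: decoding a signal from a classic CAN 2.0 data frame.
# The message ID (0x3D0) and signal layout here are invented for illustration.
from dataclasses import dataclass

@dataclass
class CanFrame:
    arbitration_id: int   # 11-bit identifier; lower IDs win bus arbitration
    data: bytes           # 0-8 payload bytes on classic CAN

def decode_wheel_speed(frame: CanFrame) -> float:
    """Assume bytes 0-1 hold wheel speed, big-endian, 0.01 km/h per bit."""
    raw = int.from_bytes(frame.data[0:2], "big")
    return raw * 0.01

frame = CanFrame(arbitration_id=0x3D0, data=bytes([0x1F, 0x40, 0, 0, 0, 0, 0, 0]))
print(decode_wheel_speed(frame))  # 0x1F40 = 8000 -> 80.0 km/h
```

Dozens of such frames per second, from dozens of modules, is what the scan tool is ultimately summarizing for the technician.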
The computing power alone to handle and process this data stream is truly astounding. The modern vehicle has the equivalent of several computers just to run the vehicle's engine in the most efficient way possible. Now add a couple more computers to monitor and respond to the non-engine inputs. These inputs consist of several different types that work in conjunction with one another to survey the automobile's surroundings and driving conditions. Here is a tentative list of sensors for a modern vehicle:

- Long- and short-range ultrasonic sensors
- Front and rear radar systems
- Lidar (laser imaging) systems
- Front- and rear-facing cameras
- GPS (integrated global positioning)
- Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication

It may seem a little redundant, but all of these sensors work in conjunction to give the vehicle and its occupants a full moving picture of their active surroundings, and V2V is set to complete that picture.

Long- and short-range ultrasonic sensors. Long-range ultrasonic sensors are primarily used for positioning and tracking vehicles in the lane adjacent to the host vehicle, identifying free areas around the vehicle and feeding an automatic collision-avoidance system that can perform autonomous braking and lane-change maneuvers. Short-range ultrasonic sensors have a limited range and are relegated to the close-in work around the vehicle. These sensors have a basic range of 30 feet or less and take care of light-duty work like park assist, but their input is deemed no less significant than the others and is added to the fray when the vehicle is in motion.

Front and rear radar systems. These support adaptive cruise control, pre-crash protection, and collision-warning systems with and without automatic steering and braking intervention. They also take care of the medium- and long-range activity.
These systems can range from wide-angle (up to 80° and 30 m range) to multi-mode and long-range (60° and 18° respectively, out to 200 m). They give the vehicle a better view front and back out to some distance, better situating the vehicle in its surroundings and supplying early warning of trouble ahead.

The lidar system is for the final brush strokes. Laser sensors (lidar) gather detailed distance measurements of a vehicle's surroundings at a frame rate that matches video's 30 frames per second. This information comes from a vertical array of spinning and/or scanning lasers and mirrors that send and receive pulses of light more than a million times a second. This forms a complete, detailed 3D dot-matrix picture (down to the centimeter) of the vehicle's immediate surroundings.

Front and rear camera systems. These monitor and assist low-speed maneuvering and situational awareness for parking assist, active avoidance, and enhanced visual dashboard displays for the modern "driver." (Note: these can also be infrared, added to the HUD, etc.)

GPS. The integrated global positioning system assists in high-speed situational awareness and tells you where you are, how fast you are going, and when to turn.

V2V and V2I. Vehicle-to-vehicle and vehicle-to-infrastructure communications are the last piece of the puzzle, one that is very important yet a few years out. The federal government has started the ball rolling on this, and it could potentially be the "golden ticket" as far as autonomous driving is concerned. It would give the vehicle the ability to basically see around corners, spotting and tagging other vehicles, road signs, and traffic signals it cannot yet get a visual on, for better situational awareness and active avoidance. Tesla, BMW, Google, and others are going all in, and Cadillac announced that V2V would be standard in 2017, with "Super Cruise" available too. Things are going to get interesting quickly.
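The lidar picture described above boils down to simple geometry: each pulse reports a range plus the beam's rotation and elevation angles, and the sensor converts that to a 3D point. Here is a minimal Python sketch of that per-pulse conversion; the axis convention is chosen for illustration, not taken from any particular sensor.

```python
# Sketch: turning one lidar return (range + beam angles) into a 3D point.
# A spinning lidar reports, per pulse, the measured distance and the beam's
# azimuth (rotation angle) and elevation (which laser in the vertical array).
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert a single return to (x, y, z) in the sensor frame:
    x forward, y left, z up (an assumed convention). Centimeter-level
    precision in range_m is what lets the point cloud resolve detail."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horiz = range_m * math.cos(el)       # projection onto the ground plane
    return (horiz * math.cos(az),        # x: forward
            horiz * math.sin(az),        # y: left
            range_m * math.sin(el))      # z: up

# A pulse straight ahead, level with the sensor, 10 m out:
x, y, z = lidar_point(10.0, 0.0, 0.0)
print(round(x, 2), round(y, 2), round(z, 2))  # 10.0 0.0 0.0
```

Run over a million pulses a second across a spinning array, this is the "dot matrix" picture the article describes.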
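To make the "sensors working in conjunction" idea concrete, here is a deliberately toy Python sketch of one possible fusion rule. The sensor names, rated ranges, and the keep-the-closest policy are all assumptions for illustration; production systems use far more sophisticated probabilistic fusion.

```python
# Illustrative only: a toy "fusion" step that keeps, per direction, the
# closest obstacle reported by any sensor within its rated range.
# Sensor names and range limits are invented, not from any real vehicle.

SENSOR_MAX_RANGE_M = {
    "short_ultrasonic": 9.0,    # ~30 ft close-in work (park assist)
    "long_ultrasonic": 30.0,    # adjacent-lane tracking (assumed figure)
    "radar_long": 200.0,        # long-range radar mode from the article
}

def fuse_obstacles(readings):
    """readings: list of (sensor, direction, distance_m). Returns the
    closest credible obstacle per direction, ignoring readings beyond
    the reporting sensor's rated range."""
    closest = {}
    for sensor, direction, dist in readings:
        if dist > SENSOR_MAX_RANGE_M[sensor]:
            continue  # beyond rated range: treat as noise
        if direction not in closest or dist < closest[direction]:
            closest[direction] = dist
    return closest

readings = [
    ("short_ultrasonic", "rear", 1.2),
    ("radar_long", "front", 45.0),
    ("long_ultrasonic", "front", 55.0),  # dropped: beyond its 30 m rating
]
print(fuse_obstacles(readings))  # {'rear': 1.2, 'front': 45.0}
```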
That being said, the National Highway Traffic Safety Administration has proposed a formal classification system:

Level 0: The driver completely controls the vehicle at all times.

Level 1: Individual vehicle controls are automated, such as traction control, electronic stability control, and automatic braking.

Level 2: At least two controls can be automated in unison, such as adaptive cruise control in conjunction with lane assist.

Level 3: The driver can fully cede control of all safety-critical functions in certain conditions. The car senses when conditions require the driver to retake control and provides a "sufficiently comfortable transition time" for the driver to do so.

Level 4: The vehicle performs all safety-critical functions for the entire trip, with the driver not expected to control the vehicle at any time. As this vehicle would control all functions from start to stop, including all parking functions, it could include unoccupied cars.

It all sounds mind-blowingly complicated, and it is from the engineering point of view, but most of these systems are modularly designed. There would be no real troubleshooting of failed systems; most of that would be taken care of by the host computers before the vehicle ever gets to the shop. The technician would be utilized to take care of bad connections, swap bad components, and adjust sensors for optimum field of view. In reality, what the technician will see is answers on the scan tool.

Together, these systems give the automobile and driver a better grasp of what is going on in and around the vehicle, more so than a human could gather alone, but there is still work to be done. Computers cannot yet react to certain situations the way a human could, but then again, over 80% of all highway accidents are caused by driver error. I'm looking forward to getting in my car and saying, "Home, James."

Jeff Bogue is an electronics specialist focused on research and development.
He works at ATech Training as a product representative and contributor to ATech Educator News. This article is reprinted from the December 2017 issue of ATech Educator News.
Published by Prakken Publications, Inc.