How is AI used in autonomous vehicles?

  • The term artificial intelligence (AI) was coined by computer scientist John McCarthy in 1955 and refers to the ability of a computer or machine to think, learn, and make decisions, often mimicking human cognition.
  • The automotive AI market is projected to grow from $783 million in 2017 to nearly $11 billion by 2025, with AI systems becoming standard in new vehicles, particularly in infotainment interfaces and advanced driver assistance systems (ADAS).
  • AI in autonomous vehicles means equipping them with sensory, cognitive, and executive functions akin to a human driver's, allowing them to perceive, reason, and act on real-time data fed into an intelligent agent in a loop known as the Perception Action Cycle.

Artificial intelligence's surge in the automotive sector, with the market projected to reach nearly $11 billion by 2025, is transforming vehicles into cognitive entities that mimic the capabilities of human drivers.

What is artificial intelligence?

The term “Artificial Intelligence” was coined by John McCarthy, a computer scientist, in 1955. AI is defined as the capacity of a computer program or machine to reason, learn, and make decisions; in common parlance, it denotes a machine that emulates human cognitive functions. Through AI, we give computer programs and machines these human-like capabilities.

These programs and machines are fed copious amounts of data, which is analysed and processed so that they can reason and carry out human-like actions. The automation of repetitive human tasks is merely the surface of AI's potential; its applications extend to medical diagnostic equipment and autonomous vehicles, with the aim of preserving human lives.

What is the expansion of AI in the automotive sector?

The automotive AI market was valued at an estimated $783 million in 2017 and is expected to approach nearly $11 billion by 2025, a compound annual growth rate (CAGR) of approximately 38.5% (a quick check of this figure follows the two categories below). According to IHS Markit, the installation rate of AI-based systems in new vehicles is forecast to reach 109% in 2025, up from 8% in 2015, implying that many vehicles will carry more than one such system. AI-based systems are poised to become a standard feature in new vehicles, particularly in two main categories:

Firstly, infotainment human-machine interfaces, encompassing speech recognition, gesture recognition, eye tracking, driver monitoring, virtual assistance, and natural language interfaces.

Secondly, Advanced Driver Assistance Systems (ADAS) and autonomous vehicles, incorporating camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).
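
The growth figures above can be sanity-checked with a few lines of arithmetic. The sketch below simply compounds the 2017 market size at the quoted CAGR over eight years; the only inputs are the $783 million and 38.5% figures cited in this article.

```python
# Back-of-the-envelope check of the automotive AI market projection:
# $783 million in 2017, compounding at roughly 38.5% per year until 2025.

start_value_musd = 783      # 2017 market size in millions of USD (from the article)
cagr = 0.385                # compound annual growth rate quoted above
years = 2025 - 2017         # eight compounding periods

projected_musd = start_value_musd * (1 + cagr) ** years
print(f"Projected 2025 market size: ${projected_musd / 1000:.1f} billion")
# Prints roughly $10.6 billion, consistent with the "nearly $11 billion" projection.
```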

How does AI function in autonomous vehicles?

AI has emerged as a prevalent buzzword, but what role does it truly play in autonomous vehicles?

Firstly, let us consider the human aspect of driving: we use sensory faculties such as sight and hearing to observe the road and other vehicles. When we halt at a red light or yield to a pedestrian, we rely on our memory to make these decisions swiftly. Years of driving experience instill in us the habit of noticing recurring elements on the road, whether it is an optimal route to the workplace or a prominent pothole.

While we endeavor to develop vehicles capable of autonomous operation, our objective is for them to navigate roads akin to human drivers. This necessitates equipping these vehicles with sensory faculties, cognitive functions (memory, logical reasoning, decision-making, and learning), and executive capabilities akin to those employed by human drivers. The automotive industry has been diligently progressing towards this goal in recent years.

According to Gartner, by 2020 approximately 250 million cars would be interconnected with each other and with surrounding infrastructure via diverse V2X (vehicle-to-everything) communication systems. As the volume of data channeled into IVI (in-vehicle infotainment) units or telematics systems increases, vehicles become capable of capturing and sharing not only internal system status and location data, but also real-time changes in their surroundings. Autonomous vehicles are equipped with cameras, sensors, and communication systems that enable them to amass vast amounts of data which, when coupled with AI, allow the vehicle to perceive, analyse, reason, and act like a human driver.
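
As an illustration of the kind of data involved, the sketch below models one simplified status message that a connected vehicle might capture and share. The `VehicleStatus` and `DetectedObject` records and their fields are assumptions made for this example, not part of any real V2X standard.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectedObject:
    """One object the vehicle's perception stack has noticed nearby."""
    kind: str            # e.g. "pedestrian", "vehicle", "traffic_light"
    distance_m: float    # distance from the vehicle, in metres
    bearing_deg: float   # direction relative to the vehicle's heading

@dataclass
class VehicleStatus:
    """A simplified snapshot of internal status, location, and surroundings."""
    vehicle_id: str
    latitude: float
    longitude: float
    speed_kmh: float
    battery_pct: float
    detections: List[DetectedObject] = field(default_factory=list)

# Example snapshot that could be fed into an on-board intelligent agent
# or shared with nearby vehicles and infrastructure.
status = VehicleStatus(
    vehicle_id="demo-001",
    latitude=52.5200, longitude=13.4050,
    speed_kmh=42.0, battery_pct=76.5,
    detections=[DetectedObject("pedestrian", distance_m=18.0, bearing_deg=-12.0)],
)
print(status)
```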

What is the AI perception action cycle in autonomous vehicles?

A recurrent loop, known as the Perception Action Cycle, is established when an autonomous vehicle generates data from its environment and feeds it into an intelligent agent, which makes decisions that let the vehicle execute specific actions within that environment. Those actions in turn change the environment, producing new sensor data and starting the cycle again.
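
A minimal sketch of such a loop is shown below, with a deliberately trivial rule-based agent standing in for the real perception and planning stacks. The function names and the simulated sensor readings are illustrative assumptions, not an actual autonomous-driving API.

```python
import random

def perceive() -> dict:
    """Simulate one snapshot of sensor data about the environment."""
    return {
        "obstacle_ahead_m": random.uniform(5, 100),
        "traffic_light": random.choice(["red", "green"]),
    }

def decide(observation: dict) -> str:
    """A toy 'intelligent agent': apply simple rules to choose an action."""
    if observation["traffic_light"] == "red" or observation["obstacle_ahead_m"] < 15:
        return "brake"
    return "cruise"

def act(action: str) -> None:
    """Execute the chosen action (here, just report it)."""
    print(f"action: {action}")

# The Perception Action Cycle: perceive, decide, act, then perceive again.
for _ in range(5):
    observation = perceive()
    action = decide(observation)
    act(action)
```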

Tilly Lu

Tilly Lu is an intern reporter at BTW media dedicated to Fintech and Blockchain. She is studying Broadcasting and Hosting at Sanming University. Send tips to t.lu@btw.media.
