Next-gen Driving Analysis - Video recognition AI × LLM [1:27]

Video transcript

The fusion of video recognition AI and an LLM enables next-generation driving analysis.
This technology analyzes driving data to evaluate the driver's skills and provide feedback.

Let's walk through a simulated drive based on real driving scenarios.

The driver begins the journey—
but faces three challenging situations:

First, a sudden cut-in from a car parked on the roadside.
Second, a right turn requiring careful safety checks.
Third, a motorcycle weaving between cars at a red light.

After the drive,
AI analyzes the driver's behavior based on dashcam video and in-vehicle sensor data.

By analyzing video and sensor data simultaneously, and taking driving smoothness into account, the system provides specific feedback on both safety and fuel efficiency.
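As a rough illustration of how video-derived events and in-vehicle sensor data might be combined into coaching feedback, here is a minimal sketch. NEC has not published this pipeline; the names (`DrivingEvent`, `score_smoothness`, `build_feedback`), the jerk-based smoothness heuristic, and the canned tips are all hypothetical stand-ins for the video recognition AI and LLM components.

```python
# Hypothetical sketch of fusing video events with sensor data for driver
# feedback. Not NEC's actual method; all names and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class DrivingEvent:
    """An event a video recognition model might flag in dashcam footage."""
    label: str        # e.g. "cut_in", "right_turn", "motorcycle_weaving"
    timestamp_s: float

def score_smoothness(accel_mps2: list[float], jerk_limit: float = 2.0) -> float:
    """Fraction of acceleration changes (a crude jerk proxy) below a limit.

    1.0 means perfectly smooth driving; lower values mean harsher
    acceleration and braking, which also hurts fuel efficiency.
    """
    if len(accel_mps2) < 2:
        return 1.0
    deltas = [abs(b - a) for a, b in zip(accel_mps2, accel_mps2[1:])]
    return sum(1 for d in deltas if d < jerk_limit) / len(deltas)

def build_feedback(events: list[DrivingEvent], smoothness: float) -> list[str]:
    """Turn detected events plus a smoothness score into coaching points
    (the kind of raw material an LLM could phrase as natural feedback)."""
    tips = []
    for ev in events:
        if ev.label == "cut_in":
            tips.append("Keep a longer following distance near parked cars.")
        elif ev.label == "right_turn":
            tips.append("Pause and check both directions before turning right.")
        elif ev.label == "motorcycle_weaving":
            tips.append("Check mirrors for motorcycles when stopped at lights.")
    if smoothness < 0.8:
        tips.append("Accelerate and brake more gradually to save fuel.")
    return tips

events = [DrivingEvent("cut_in", 12.4), DrivingEvent("right_turn", 57.0)]
accel = [0.0, 0.5, 3.2, 0.1, -2.8, 0.0]   # made-up accelerometer samples
print(build_feedback(events, score_smoothness(accel)))
```

In a real system the event list would come from a video recognition model, the smoothness score from calibrated vehicle telemetry, and the tips would be generated and personalized by the LLM rather than hard-coded.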

It also cross-references traffic regulations,
making it a powerful tool for driver training and fleet management.

This technology goes beyond assistance—
enhancing transport efficiency and contributing to road safety at scale.

From reducing fuel costs to supporting senior driver training,
its applications will only grow.

With video recognition AI and LLM, NEC is building a safer mobility future.

CI:
Orchestrating a brighter world
NEC

Overview

By combining NEC's video recognition AI with an LLM, this technology analyzes both camera footage and sensor data to assess driving behavior and road conditions.
It provides tailored feedback to help drivers improve their habits—reducing accident risks and enhancing fuel efficiency.
In addition, by identifying potential hazards in the driving environment, it supports proactive accident prevention.
Through AI-powered analysis, NEC aims to promote safer, smarter, and more sustainable mobility for everyone.


Your Opinion/Comments

Did you like the content? Please rate it on a scale of 1 to 5.
  • *
    Please send messages in English. Thank you for your cooperation.
  • *
    We will not respond to individual comments.
  • *
    All personal information will be handled according to our privacy policy.

Contact

(November 18, 2025)