PLM-Net:

Perception Latency Mitigation Network for Vision-Based Lateral Control of Autonomous Vehicles

Aws Khalil1, Jaerock Kwon1
1University of Michigan-Dearborn

🎥 Demo Video

📄 Abstract

This study introduces the Perception Latency Mitigation Network (PLM-Net), a novel deep learning approach for addressing perception latency in vision-based Autonomous Vehicle (AV) lateral control systems. Perception latency is the delay between capturing the environment through vision sensors (e.g., cameras) and applying an action (e.g., steering). This issue is understudied in both classical and neural-network-based control methods. Reducing this latency with powerful GPUs and FPGAs is possible but impractical for automotive platforms. PLM-Net consists of two models: the Base Model (BM), which represents the original Lane Keeping Assist (LKA) system, and the Timed Action Prediction Model (TAPM), which predicts future actions for different latency values. By combining these models, PLM-Net mitigates perception latency: the final steering command is obtained by linear interpolation of the BM and TAPM outputs based on the real-time latency. This design handles both constant and varying latency, improving driving trajectories and steering control. Experimental results validate the efficacy of PLM-Net across various latency conditions.
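
The sketch below illustrates the latency-based interpolation step described above. It is a minimal, hypothetical example (function names, anchor latencies, and steering values are illustrative and not taken from the paper): the BM output is treated as the zero-latency anchor, the TAPM outputs as anchors at their prediction horizons, and the measured latency selects the interpolated command.

```python
import numpy as np

def interpolate_action(latency_ms, bm_action, tapm_actions, tapm_latencies_ms):
    """Hypothetical sketch of latency-based blending of BM and TAPM outputs.

    latency_ms        -- measured perception latency in milliseconds
    bm_action         -- steering command from the Base Model (zero-latency anchor)
    tapm_actions      -- TAPM steering predictions, one per latency anchor
    tapm_latencies_ms -- latency values (ms) the TAPM predictions correspond to
    """
    # Anchor points: BM output at 0 ms, TAPM outputs at their latency horizons.
    xs = np.concatenate(([0.0], np.asarray(tapm_latencies_ms, dtype=float)))
    ys = np.concatenate(([bm_action], np.asarray(tapm_actions, dtype=float)))
    # np.interp linearly interpolates and clamps outside the covered range.
    return float(np.interp(latency_ms, xs, ys))

# Example: TAPM predicts actions for 100 ms and 200 ms ahead; measured latency is 150 ms.
steering = interpolate_action(150.0, bm_action=0.02,
                              tapm_actions=[0.05, 0.08],
                              tapm_latencies_ms=[100.0, 200.0])
print(steering)  # 0.065 -- halfway between the 100 ms and 200 ms predictions
```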

🖼️ Method Overview

PLM-Net architecture

📊 Evaluation Results

Steering Similarity

Steering comparison - 200ms
Steering comparison - random 1
Steering comparison - random 2

Trajectory Similarity

Trajectory plot - 200ms Latency

📖 Citation

  @article{khalil2024plm,
    title={PLM-Net: Perception Latency Mitigation Network for Vision-Based Lateral Control of Autonomous Vehicles},
    author={Khalil, Aws and Kwon, Jaerock},
    journal={arXiv preprint arXiv:2407.16740},
    year={2024}
  }