Autonomous Vehicle Navigation System
2025–2026 Graduation Thesis — End-to-End Deep Learning on Embedded Hardware
Built a complete autonomous driving pipeline: dataset collection (~20,000 labeled image-command pairs), PilotNet-inspired CNN training on NVIDIA GPU, dynamic quantization (FP32 → INT8, ~1.2x speedup), and TorchScript deployment on Raspberry Pi 4.
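The quantization and export steps above can be sketched in PyTorch. This is a minimal illustration, not the thesis's actual code: the layer sizes are the classic PilotNet dimensions, and note that PyTorch's dynamic quantization converts only the `Linear` layers (the convolutions stay FP32), which is consistent with a modest ~1.2x speedup.

```python
import torch
import torch.nn as nn

class PilotNet(nn.Module):
    """PilotNet-style CNN: conv feature extractor + FC regression head.
    Layer sizes are illustrative (66x200 RGB input), not the thesis's exact model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(48 * 5 * 22, 100), nn.ReLU(),
            nn.Linear(100, 1),  # single steering command
        )

    def forward(self, x):
        return self.head(self.features(x))

model = PilotNet().eval()

# Dynamic quantization: weights of Linear layers become INT8,
# activations are quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Trace to TorchScript so the model runs on the Pi without Python-side
# model code, then serialize the archive for deployment.
example = torch.randn(1, 3, 66, 200)
scripted = torch.jit.trace(quantized, example)
scripted.save("pilotnet_int8.pt")
```

Dynamic quantization was a sensible choice here because it needs no calibration dataset, unlike static post-training quantization.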
- Achieved real-time inference at ~20–25 Hz on ARM hardware, enabling autonomous hallway navigation using monocular vision with behavioral cloning.
- Integrated four HC-SR04 ultrasonic sensors as an independent safety layer with emergency obstacle avoidance, direction-aware recovery maneuvers, and a 500 ms watchdog timer.
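A watchdog of the kind described can be sketched as a background thread that cuts motor power when no control command arrives within the timeout. Names (`Watchdog`, `stop_fn`, `kick`) are hypothetical, not taken from the thesis code:

```python
import threading
import time

class Watchdog:
    """Stops the vehicle if no control command is received within `timeout`
    seconds. `stop_fn` is a hypothetical callback that cuts motor power."""

    def __init__(self, stop_fn, timeout=0.5):
        self.stop_fn = stop_fn
        self.timeout = timeout
        self._last = time.monotonic()
        self._lock = threading.Lock()
        self._running = True
        threading.Thread(target=self._watch, daemon=True).start()

    def kick(self):
        # Call on every received control command to reset the timer.
        with self._lock:
            self._last = time.monotonic()

    def stop(self):
        self._running = False

    def _watch(self):
        # Poll well below the timeout so expiry is detected promptly.
        while self._running:
            with self._lock:
                expired = time.monotonic() - self._last > self.timeout
            if expired:
                self.stop_fn()
            time.sleep(0.05)
```

Keeping the watchdog independent of the vision pipeline means a stalled inference thread or dropped network link still results in a stopped vehicle.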
- Developed a multi-threaded Python control server (drive_server.py) with dedicated threads for TCP control, telemetry, and MJPEG video streaming.
- Created a cross-platform Flutter mobile application with dual-joystick control, live camera feed, real-time sensor HUD, and seamless manual/autonomous mode switching.