Map-Less Autonomous Navigation Using Deep Learning (2020)

Working on IntentionNet at NUS: Goal-Directed Autonomous Navigation in Unknown Environments
In 2020, I worked on a problem that sits at the uncomfortable edge of autonomy:
How does a robot navigate when a map doesn’t exist—and cannot be created?
This question became the foundation of my research work at National University of Singapore (NUS) on the IntentionNet project.
📄 Reference paper:
Intention-Net: Integrating Planning and Deep Learning for Goal-Directed Autonomous Navigation
https://arxiv.org/abs/1710.05627
🎥 Implementation demo:
https://www.youtube.com/watch?v=4MOrfOzaW9A
Why Map-less Navigation Matters
Most autonomous navigation systems rely on:
- prior maps
- extensive data collection
- SLAM-based localization
But many real-world environments don’t allow that:
- disaster zones
- new buildings
- temporary layouts
- restricted or dynamic spaces
Mapping first—and navigating later—isn’t always possible.
The IntentionNet approach explored a different idea:
Can a robot navigate using intent and perception—without a pre-built map?
That idea was both technically demanding and conceptually elegant.
My Role: From Sensors to System Behavior
My primary responsibility was system-level design and evolution, not just model execution.
1. Designing the Sensor Suite
I worked on designing and integrating a multi-modal sensor setup, including:
- LiDAR – spatial awareness and obstacle geometry
- RGB Camera – visual perception
- Depth Camera – near-field structure understanding
- Wheel odometry – motion estimation from wheel encoders
- IMU – orientation and stability
The challenge wasn’t adding sensors—it was making them work coherently under real-time constraints.
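To make that concrete: the first coherence problem is temporal. Each sensor reports at its own rate, so readings have to be aligned by timestamp before they can be fused. The sketch below is not the project's code; it is a minimal, hypothetical illustration of nearest-timestamp pairing (the idea behind approximate time synchronization), with the camera driving the fusion clock and every name chosen for this example.

```python
import bisect
from dataclasses import dataclass, field

@dataclass
class SensorBuffer:
    """Keeps timestamped readings for one sensor, sorted by time."""
    stamps: list = field(default_factory=list)
    values: list = field(default_factory=list)

    def push(self, stamp, value):
        i = bisect.bisect(self.stamps, stamp)
        self.stamps.insert(i, stamp)
        self.values.insert(i, value)

    def nearest(self, stamp, tol):
        """Return the reading closest to `stamp`, or None if none lies within `tol`."""
        if not self.stamps:
            return None
        i = bisect.bisect(self.stamps, stamp)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self.stamps)]
        j = min(candidates, key=lambda k: abs(self.stamps[k] - stamp))
        return self.values[j] if abs(self.stamps[j] - stamp) <= tol else None

def fuse_frame(camera_stamp, buffers, tol=0.05):
    """Assemble one fused frame: every sensor contributes its reading
    nearest to the camera timestamp, within `tol` seconds."""
    frame = {}
    for name, buf in buffers.items():
        reading = buf.nearest(camera_stamp, tol)
        if reading is None:
            return None          # incomplete frame -> skip this cycle
        frame[name] = reading
    return frame

# Usage: pair a LiDAR scan and an IMU sample with a camera frame at t = 1.00 s
buffers = {"lidar": SensorBuffer(), "imu": SensorBuffer()}
buffers["lidar"].push(0.98, "scan_a")
buffers["imu"].push(1.01, "imu_a")
print(fuse_frame(1.00, buffers))   # {'lidar': 'scan_a', 'imu': 'imu_a'}
```

Dropping incomplete frames rather than extrapolating is the conservative choice under real-time constraints: a late sensor degrades the control rate instead of silently feeding stale data downstream.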
2. System Upgrades and Iterative Testing
Beyond integration, I:
- implemented upgrades to the navigation pipeline
- tested system robustness across scenarios
- evaluated behavior under sensor noise and uncertainty
- improved visual navigation stability
This wasn’t static research.
It was continuous experimentation, refinement, and validation.
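One way to picture the noise-and-uncertainty testing described above is a replay-style stress test: corrupt a clean sensor reading many times and check that a safety-critical decision never flips. The sketch below is an illustrative toy, not the actual evaluation harness; the noise model, thresholds, and function names are all assumptions made for the example.

```python
import random

def noisy(reading, sigma=0.02, dropout=0.1, rng=random):
    """Corrupt one scalar range reading: Gaussian noise plus occasional
    dropout (returned as None), mimicking real depth/LiDAR failure modes."""
    if rng.random() < dropout:
        return None
    return reading + rng.gauss(0.0, sigma)

def min_clearance(ranges):
    """Conservative reduction: a dropped beam (None) is treated as an
    obstacle at zero range, so sensor failure biases toward stopping."""
    return min(0.0 if r is None else r for r in ranges)

def should_brake(ranges, threshold=0.5):
    return min_clearance(ranges) < threshold

# Stress test: replay one clean scan under noise and check the safety
# decision never flips to "keep driving" while an obstacle is close.
random.seed(0)
clean_scan = [0.4, 2.0, 3.5]          # nearest obstacle at 0.4 m
for _ in range(1000):
    corrupted = [noisy(r) for r in clean_scan]
    assert should_brake(corrupted), "brake decision flipped under noise"
print("1000/1000 noisy trials kept the brake decision")
```

The design point is the failure bias: uncertainty should push the system toward the safe action, so robustness tests assert on decisions, not on raw sensor values.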
Visual Navigation Using Intent, Not Maps
A key idea in IntentionNet is intent-driven navigation.
Instead of relying on a metric map, the system uses:
- current perception
- a high-level goal
- and learned behavior
One of the most interesting aspects was the use of a hand-drawn map.
Not a metric map.
Not SLAM-generated.
Just a rough, human-provided sketch.
From that, the system was expected to:
- localize itself visually
- understand directional intent
- and navigate toward the goal
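The conditioning structure behind this can be sketched in a few lines. The actual Intention-Net model is a deep network over camera images; the toy below replaces the CNN with a generic feature vector and uses small random weights, purely to show the key idea: the same perception produces different control commands depending on the discrete intent. All names and dimensions here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

INTENTS = ["go_straight", "turn_left", "turn_right", "stop"]

def one_hot(intent):
    v = np.zeros(len(INTENTS))
    v[INTENTS.index(intent)] = 1.0
    return v

class IntentConditionedPolicy:
    """Toy stand-in for an intention-conditioned network: perception
    features and an intent one-hot are concatenated, then mapped to
    (linear velocity, angular velocity) by a single linear layer."""
    def __init__(self, feat_dim, seed=0):
        rng = np.random.default_rng(seed)
        in_dim = feat_dim + len(INTENTS)
        self.W = rng.normal(0.0, 0.1, size=(2, in_dim))
        self.b = np.zeros(2)

    def act(self, features, intent):
        x = np.concatenate([features, one_hot(intent)])
        v, w = self.W @ x + self.b
        return float(np.tanh(v)), float(np.tanh(w))   # bounded controls

policy = IntentConditionedPolicy(feat_dim=8)
feats = np.ones(8)                       # placeholder for learned image features
left = policy.act(feats, "turn_left")
right = policy.act(feats, "turn_right")
print(left, right)                       # same perception, different commands
```

In the real system the intent comes from the human-provided sketch (a rough directional cue) rather than a hand-picked label, but the mechanism is the same: intent disambiguates what perception alone cannot.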
This allowed navigation in environments where:
- mapping is infeasible
- data collection is expensive
- layouts change frequently
Why This Was Hard
Map-less navigation exposes everything that traditional pipelines hide:
- perceptual ambiguity
- localization drift
- sensor inconsistencies
- compounding uncertainty
There’s no “ground truth map” to fall back on.
The system must continuously reason from perception, not memory.
That constraint forces better design.
What This Work Taught Me
This research reshaped how I think about autonomy:
- Navigation is a behavioral problem, not just a geometric one
- Sensor fusion matters more than sensor quantity
- Learning-based systems benefit from intent signals
- Removing maps forces systems to become adaptive
Most importantly:
Generalization improves when assumptions are removed.
Why This Work Still Matters
The ideas explored in IntentionNet are increasingly relevant today:
- autonomous robots in unknown buildings
- service robots in dynamic indoor spaces
- last-mile delivery in new neighborhoods
- exploration without prior infrastructure
Map-less, intent-driven navigation isn’t a shortcut.
It’s a necessary direction.
Closing Thought
This project wasn’t about making robots smarter.
It was about making them less dependent.
Less dependent on maps.
Less dependent on prior data.
Less dependent on controlled environments.
Working on IntentionNet at NUS taught me that autonomy becomes truly powerful only when it can operate where preparation is impossible.
And that idea continues to influence how I design intelligent systems today.