Tag: stem

When My Robot Started Paying Attention To Me (2015)

From Making Things Fly to Teaching Them How to See

In 2015, I watched a machine I had built do something new.

It didn’t just move.
It didn’t just balance.

It noticed a human—and followed them.

That moment marked a shift in my thinking.
Up until then, I had been teaching machines how to behave.
This time, I was teaching one how to perceive.


From Flight to Following

A year earlier, I had built my first drone. That project taught me control, stability, and respect for physics.

But something kept bothering me.

Flying was impressive—but reactive.
The system responded to inputs, not intent.

I wanted to build something that could:

  • Observe its environment
  • Make decisions based on visual input
  • Interact with a human without explicit commands

That curiosity became my major project in college in 2015:

A person-following robot using image-based pattern recognition.

🎥 Project demo video:
https://www.youtube.com/watch?v=vECu79wfyEg


The Idea: Simple on Paper, Difficult in Reality

The goal sounded straightforward:

  • Detect a person visually
  • Track them in real time
  • Move the robot to follow while maintaining distance

But in practice, this meant solving multiple problems at once:

  • Real-time image processing
  • Pattern recognition under changing lighting
  • Frame-to-frame consistency
  • Motion control linked to perception
  • Noise, latency, and false positives

Unlike the drone, where physics dominated, this robot lived in a messy visual world.
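
Stripped down, the whole thing is one perceive-decide-act loop. Below is a minimal sketch of that structure in Python with OpenCV; the library choice and the helper names (detect_person, follow_command, drive) are illustrative assumptions, not the original code. The two interesting helpers are only stubs here and are sketched in later sections.

  import cv2

  def detect_person(frame):
      """Locate the person in the frame: (center_x, box_width) in pixels, or None."""
      return None   # placeholder; one possible heuristic is sketched later

  def follow_command(center_x, box_width, frame_width):
      """Turn the pixel measurement into (forward, turn) speed commands."""
      return 0.0, 0.0   # placeholder; a control sketch appears later

  def drive(forward, turn):
      """Send speed commands to the motor driver (hardware-specific)."""
      pass

  cap = cv2.VideoCapture(0)                  # on-board camera
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      target = detect_person(frame)          # perception
      if target is None:
          drive(0.0, 0.0)                    # nothing to follow: stop
          continue
      center_x, width = target
      forward, turn = follow_command(center_x, width, frame.shape[1])
      drive(forward, turn)                   # action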


Teaching a Machine to See

The robot didn’t “recognize a person” the way humans do.

It relied on:

  • Image-based patterns
  • Visual features and contrasts
  • Continuous frame analysis
  • Heuristics that worked most of the time

And “most of the time” turned out to be the real challenge.
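
To make that concrete, here is the kind of heuristic such a system can lean on. The sketch below uses OpenCV's stock HOG pedestrian detector and a largest-box-wins rule purely as an illustration; it stands in for, rather than reproduces, the pattern-recognition pipeline the robot actually ran.

  import cv2

  # OpenCV ships a HOG descriptor pre-trained for pedestrians; using it here
  # is an assumption about the kind of detector, not the project's own model.
  hog = cv2.HOGDescriptor()
  hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

  def detect_person(frame):
      """Return (center_x, box_width) in pixels for the most likely person, or None."""
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      boxes, weights = hog.detectMultiScale(gray, winStride=(8, 8), scale=1.05)
      if len(boxes) == 0:
          return None                        # nothing person-shaped this frame
      # Heuristic: assume the largest detection is the person to follow.
      x, y, w, h = max(boxes, key=lambda b: b[2] * b[3])
      return x + w / 2, w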

Lighting changes broke detection.
Background clutter confused tracking.
Sudden movement caused loss of lock.

This was my first real exposure to a core truth of computer vision:

Vision systems don’t fail loudly. They fail ambiguously.


When Perception Meets Control

The hardest part wasn’t detection alone.

It was closing the loop.

Every visual decision had physical consequences:

  • Move too fast → overshoot
  • Move too slow → lose the subject
  • Turn aggressively → lose frame stability

Perception and motion had to agree.

This was where my earlier experience with control systems paid off. I wasn’t just tuning motors—I was tuning behavior.

The robot had to feel intentional, not mechanical.
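
Closing the loop mostly means mapping pixel error to motion gently. Here is a minimal sketch of that mapping, with a deadband, small gains, and hard speed limits; the numbers are placeholders, not the values the robot was actually tuned to.

  def follow_command(center_x, box_width, frame_width,
                     desired_width=180, k_turn=0.8, k_forward=0.01,
                     deadband_px=15, max_speed=0.5):
      """Map a pixel-space measurement of the person to (forward, turn) speeds."""
      # Turn toward the person: proportional to how far they sit from the
      # image center, ignored inside a small deadband to avoid hunting.
      error_x = center_x - frame_width / 2
      turn = 0.0 if abs(error_x) < deadband_px else k_turn * error_x / (frame_width / 2)

      # Drive forward or back: proportional to how much smaller or larger the
      # person appears than the size that corresponds to a comfortable distance.
      forward = k_forward * (desired_width - box_width)

      # Clamp both commands so one noisy frame can't demand a violent move.
      forward = max(-max_speed, min(max_speed, forward))
      turn = max(-max_speed, min(max_speed, turn))
      return forward, turn

The deadband keeps the robot from hunting left and right once the person is already roughly centered, and the clamp keeps a single noisy detection from demanding a violent move.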


What Didn’t Work (At First)

A lot didn’t.

  • Detection flickered under poor lighting
  • Background patterns caused false tracking
  • Latency created delayed reactions
  • Minor calibration errors caused drifting behavior

I stopped trying to make it “perfect” and focused on making it robust enough to work reliably.

That decision mattered.
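
In practice, "robust enough" came less from a better detector and more from small, unglamorous filters around it. For example, smoothing the measurement and coasting through a few missed frames before declaring the target lost; the sketch below illustrates the idea, with made-up parameter values.

  class TargetFilter:
      """Smooth the detection and tolerate short dropouts before giving up."""

      def __init__(self, alpha=0.3, max_missed=10):
          self.alpha = alpha            # exponential smoothing factor (0..1)
          self.max_missed = max_missed  # frames to coast on a stale estimate
          self.state = None             # smoothed (center_x, box_width)
          self.missed = 0

      def update(self, detection):
          """detection is (center_x, box_width) or None; returns the filtered target or None."""
          if detection is None:
              self.missed += 1
              # Coast briefly on the old estimate, then declare the target lost.
              return self.state if self.missed <= self.max_missed else None
          self.missed = 0
          if self.state is None:
              self.state = detection
          else:
              cx, w = self.state
              dx, dw = detection
              self.state = (cx + self.alpha * (dx - cx),
                            w + self.alpha * (dw - w))
          return self.state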


The Moment It Felt Alive

Then came the moment that changed everything.

A person walked in front of the robot.
The robot locked on.
It adjusted its position.
And it followed.

Not aggressively.
Not blindly.

Deliberately.

Watching it move based on what it saw—not what I told it—felt fundamentally different from anything I had built before.

This wasn’t automation anymore.
This was early autonomy.


What This Project Gave Me

This robot taught me things that stayed with me far longer than the project itself:

  • Perception is fragile but powerful
  • Intelligence emerges from tight perception–action loops
  • Vision without control is useless
  • Control without perception is blind

Most importantly, it taught me that real-world AI is never clean—and that’s okay.


From Person-Following Robots to Applied AI

Years later, when I worked on:

  • Computer vision systems
  • Real-time perception pipelines
  • AI models deployed outside the lab

I recognized the same patterns.

The same compromises.
The same trade-offs.
The same need to make systems work despite imperfect inputs.

That understanding started in 2015—with a robot trying to follow a human.


Closing Reflection

The drone taught me how systems stay stable.
This robot taught me how systems understand.

One learned how to fly.
The other learned how to see.

Together, they quietly defined the direction my work would take.
