Smart Holographic Displays for Retail Advertising (2021)

Building an AI-Driven Holographic Display During the NUS GRIP Program
Some projects feel like experiments.
Some feel like products.
This one felt like a glimpse into the future.
During the NUS GRIP program, I got the opportunity to work on a holographic display system that combined AI, computer vision, gesture interaction, and holographic projection into a single interactive experience.
🎥 Story of the Project:
https://www.youtube.com/watch?v=aSiqphFwof0
🎥 Demonstration Video:
https://www.youtube.com/watch?v=3R9sDwjNNEs
Why Holographic Displays?
Traditional displays are passive.
They show the same content to everyone:
- regardless of who’s watching
- regardless of interest
- regardless of engagement
But retail, exhibitions, and public spaces demand something more:
- attention
- interaction
- personalization
The question we explored was simple but ambitious:
What if a display could see the person in front of it—and respond intelligently in real time?
The Core Idea: AI + Hologram
This system wasn’t just about visual novelty.
It combined multiple technologies into a coherent experience:
- Holographic projection for spatial, eye-catching visuals
- Camera-based perception to detect and analyze the person in front
- Real-time AI analytics to understand engagement
- Gesture sensing to enable touchless interaction
Together, they turned a display into an interactive system, not just a screen.
How the System Worked
1. Visual Perception
A camera mounted on top of the device:
- detected the presence of a user
- analyzed movement and engagement in real time
- enabled AI-driven analytics
This allowed the system to respond dynamically instead of playing static loops.
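The project's actual perception pipeline isn't documented here, but the core idea of "detect presence, then react" can be sketched with simple frame differencing. Everything below is illustrative: it assumes grayscale frames arrive as 2-D lists of 0-255 pixel intensities, whereas the real system would have used a live camera feed and trained vision models.

```python
# Minimal sketch of presence detection via frame differencing.
# Assumes grayscale frames as 2-D lists of 0-255 pixel intensities;
# the real system used a camera and AI models, not this heuristic.

def mean_abs_diff(frame_a, frame_b):
    """Average per-pixel absolute difference between two frames."""
    total, count = 0, 0
    for row_a, row_b in zip(frame_a, frame_b):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count

def presence_detected(prev_frame, curr_frame, threshold=10.0):
    """Flag a viewer when inter-frame motion exceeds a threshold."""
    return mean_abs_diff(prev_frame, curr_frame) > threshold

# Example: a static scene vs. one where something moved.
static = [[50] * 4 for _ in range(4)]
moving = [[50, 50, 200, 200] for _ in range(4)]
print(presence_detected(static, static))  # False (no motion)
print(presence_detected(static, moving))  # True (motion detected)
```

Once this flag flips, the display can switch from an idle loop to interactive content, which is exactly the "respond dynamically" behavior described above.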
2. Real-Time Content Curation
Using AI models, the system could:
- adapt ad content in real time
- curate what was displayed based on interaction
- create a personalized experience for brands and users
Instead of broadcasting ads, the display reacted.
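One common way to frame "curate based on interaction" is as a multi-armed bandit: ads that hold viewers' attention longer get shown more often. The sketch below is a hypothetical epsilon-greedy version of that idea; the project's real models, metrics, and ad names are not described in the post, so all of them here are placeholders.

```python
import random

# Hypothetical sketch of engagement-driven ad curation as an
# epsilon-greedy bandit. Ads with higher average viewer engagement
# are exploited; a small epsilon keeps exploring alternatives.

class AdCurator:
    def __init__(self, ad_ids, epsilon=0.1):
        self.epsilon = epsilon
        # ad -> [number of plays, total engagement seconds]
        self.stats = {ad: [0, 0.0] for ad in ad_ids}

    def pick(self):
        """Mostly show the best-performing ad, sometimes explore."""
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))
        return max(self.stats, key=self._avg)

    def record(self, ad, engagement_seconds):
        """Feed back how long the viewer stayed engaged with an ad."""
        self.stats[ad][0] += 1
        self.stats[ad][1] += engagement_seconds

    def _avg(self, ad):
        plays, total = self.stats[ad]
        # Unplayed ads rank first so every ad gets at least one showing.
        return total / plays if plays else float("inf")

curator = AdCurator(["sneakers", "watches", "coffee"])
curator.record("sneakers", 12.0)
curator.record("watches", 3.0)
curator.record("coffee", 5.0)
```

With these stats and exploration turned off, `pick()` would keep returning `"sneakers"`, the ad viewers engaged with longest.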
3. Gesture-Based Interaction
A gesture sensor allowed users to:
- swipe in the air
- change content without touching the device
- interact naturally and intuitively
No screens to tap.
No buttons to press.
Just intent → action.
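The intent-to-action mapping can be sketched in a few lines: classify the hand's horizontal travel as a swipe, then turn that swipe into a content change. This is an illustrative approximation; the actual gesture sensor, its output format, and the thresholds used in the project are all assumptions here.

```python
# Illustrative sketch of touchless swipe handling, assuming a gesture
# sensor that reports normalized horizontal hand positions (0..1)
# over the course of one gesture.

def classify_swipe(x_positions, min_travel=0.3):
    """Classify a sequence of x positions as a left/right air swipe."""
    if len(x_positions) < 2:
        return "none"
    travel = x_positions[-1] - x_positions[0]
    if travel > min_travel:
        return "swipe_right"
    if travel < -min_travel:
        return "swipe_left"
    return "none"

def next_content(current_index, gesture, num_items):
    """Map an air swipe to a content change, no touch required."""
    if gesture == "swipe_left":
        return (current_index + 1) % num_items
    if gesture == "swipe_right":
        return (current_index - 1) % num_items
    return current_index

samples = [0.8, 0.6, 0.45, 0.3, 0.2]  # hand sweeping right-to-left
gesture = classify_swipe(samples)      # "swipe_left"
print(next_content(0, gesture, 5))     # advances to item 1
```

The minimum-travel threshold is what separates a deliberate swipe from idle hand movement, which matters a lot for making the interaction feel intuitive rather than twitchy.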
4. Holographic Presentation
All of this intelligence was delivered through a holographic display, creating an experience that was:
- futuristic
- sci-fi-like
- immersive
People didn’t just look at the display—they engaged with it.
Why This Project Was Exciting
This work brought together many threads from my earlier projects:
- computer vision
- gesture recognition
- holographic systems
- real-time AI inference
- user-centered interaction
But this time, the outcome wasn’t just a demo.
It felt product-ready.
Something you could imagine deployed in:
- shopping malls
- brand showrooms
- exhibitions
- experiential marketing spaces
What This Project Taught Me
This project reinforced several important lessons:
- AI becomes powerful when it’s visible—but not intrusive
- Interaction quality matters more than technical complexity
- Real-time systems demand tight integration across hardware and software
- Futuristic experiences are built from very practical engineering decisions
Most importantly:
The future of displays isn’t higher resolution—it’s intelligence.
Closing Thought
Holograms capture attention.
AI creates relevance.
Gestures remove friction.
When combined thoughtfully, they don’t just display content—they create experiences.
Working on this project during the NUS GRIP program showed me how AI and spatial interfaces together can redefine how people interact with information.
And that’s a future worth building toward.