Pupil Labs
@pupil-labs.bsky.social
We make wearable eye tracking technology. Our eye trackers are used by researchers and integrators around the world. https://pupil-labs.com
That’s a wrap on @sfn.org (Society for Neuroscience) #SfN25!

Four days of conversations, demos, and reconnecting with the neuroscience community. Thank you to everyone who stopped by Booth 435 to explore Neon, discuss your research, or share ideas.
November 21, 2025 at 7:11 AM
What if you could automatically track gaze on any moving object?

Our new Alpha Lab tutorial makes it possible, introducing a powerful workflow that combines our Neon eye tracker with @metaopensource.bsky.social's Segment Anything Model 2 (SAM2).

The process is simple: Click. Segment. Track.
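The tutorial itself lives in Alpha Lab; as a toy illustration of the final "Track" step (deciding whether gaze lands on a segmented object), here is a minimal sketch. It assumes SAM2 has already produced a boolean mask per frame; the mask and gaze coordinates below are synthetic, and none of this is the tutorial's actual code.

```python
import numpy as np

def gaze_on_object(mask: np.ndarray, gaze_xy: tuple[float, float]) -> bool:
    """Return True if a gaze point (pixel coords) falls inside a segmentation mask.

    mask: boolean array of shape (H, W), e.g. one object mask produced by SAM2.
    gaze_xy: (x, y) gaze position in the same pixel coordinate system.
    """
    x, y = int(round(gaze_xy[0])), int(round(gaze_xy[1]))
    h, w = mask.shape
    if not (0 <= x < w and 0 <= y < h):
        return False  # gaze landed outside the frame
    return bool(mask[y, x])

# Synthetic example: a 100x100 frame with an "object" occupying a square region.
mask = np.zeros((100, 100), dtype=bool)
mask[30:60, 40:80] = True

print(gaze_on_object(mask, (50.0, 45.0)))  # gaze inside the square -> True
print(gaze_on_object(mask, (10.0, 10.0)))  # gaze off the object -> False
```

Running this per frame, with SAM2 propagating the mask as the object moves, gives the "gaze on moving object" signal the tutorial describes.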
October 29, 2025 at 7:16 AM
Real-time Python Client Update: Now supports streaming live audio from Neon!

Play synced gaze + video + audio, analyze, or transcribe (STT) in real-time.

Start listening. Update now:
pip install -U pupil-labs-realtime-api

Learn more: pupil-labs.github.io/pl-realtime-...
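Synced playback of gaze, video, and audio boils down to aligning streams by timestamp. Below is a generic sketch of nearest-timestamp matching on synthetic data — it is not the `pupil-labs-realtime-api` client's actual API, just the underlying idea:

```python
import bisect

def nearest_sample(timestamps: list[float], t: float) -> int:
    """Index of the sample whose timestamp is closest to t.

    timestamps must be sorted ascending, as stream timestamps normally are.
    """
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbour is closer in time.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Synthetic streams: gaze at 200 Hz, audio chunks every 20 ms, 10 s each.
gaze_ts = [k * 0.005 for k in range(2000)]
audio_ts = [k * 0.020 for k in range(500)]

# For each audio chunk, look up the gaze sample recorded closest to it.
matches = [nearest_sample(gaze_ts, t) for t in audio_ts]
```

With real Neon streams you would use the timestamps the client delivers; the matching logic stays the same.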
October 24, 2025 at 8:36 AM
What does the expert eye see during ultrasound-guided embryo transfer?

Researchers at the LTSI (University of Rennes) combined Pupil Labs Neon eye tracking glasses with a high-fidelity simulator to capture how specialists visually navigate this delicate procedure.

(Video credit: Josselin Gautier)
October 21, 2025 at 4:16 AM
Updates for Pupil Cloud

We’ve rolled out a big update to the Video Renderer and introduced a new drawing tool in the AOI Editor.
October 7, 2025 at 10:17 AM
There’s always more than meets the eye at ECVP @ecvp.bsky.social 👀

We took Neon for a spin through "Highlights from the Barn" – an exhibition featuring illusions from Dr. Bernd Lingelbach's collection.
August 27, 2025 at 8:33 AM
Meet us in Mainz at the @ecvp.bsky.social (European Conference on Visual Perception).

Next Sunday we will host a hands-on eye tracking workshop with Neon. The focus is on programmatic control - using our Python APIs to stream live data, manage experiments, and analyze results directly with code.
August 20, 2025 at 8:59 AM
Knowing where someone is looking is good. Knowing what they're looking at is a game-changer.
August 12, 2025 at 8:56 AM
We've launched an updated version of the Neon shop on our website 👀

As our ecosystem of eye tracking frames, accessories, and add-ons has grown, we wanted to make it easier to explore all the options and understand how Neon's modular system works.
August 11, 2025 at 9:33 AM
More updates for Pupil Cloud!

Now you can customize how gaze and fixations are visualized within Cloud. Current options for gaze visualization are circle and crosshair. Adjust size, color, width, and more!

Stay tuned, we’re bringing these updates to the video renderer soon! 👀
August 7, 2025 at 10:04 AM
New Real-Time API Docs

We’ve just released comprehensive documentation for our Real-Time API, making it easier than ever to stream and visualize live gaze data from Neon.

Remotely control recordings, send event annotations, and synchronize data streams across devices — all in real-time.
July 31, 2025 at 8:49 AM
Big updates to Pupil Cloud!

There's so much eye tracking data generated with each Neon recording, and with this release we're visualizing more of it (with more on the way 😉).

Now you can see eye video overlays, eye aperture graphs, pupil diameter graphs, and audio graphs directly in Pupil Cloud.
July 11, 2025 at 9:26 AM
Tired eyes tell a story. Now you can read it in real-time.

Don't just observe fatigue, quantify it. Our new tool uses Neon eye tracking to calculate PERCLOS (Percentage of Eye Closure) live, giving you a direct, data-driven window into drowsiness.
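PERCLOS is commonly defined as the fraction of time within a window that the eyes are at least ~80% closed. Here is a minimal sketch on synthetic eye-aperture data — the 0.2 threshold and the sample window are illustrative assumptions, not the tool's actual parameters:

```python
def perclos(apertures, closed_threshold=0.2):
    """Fraction of samples where eye aperture falls below a closure threshold.

    apertures: eye-openness samples in [0, 1] (1 = fully open) over a time window.
    closed_threshold: aperture below this counts as "closed" (eye >= 80% shut).
    """
    if not apertures:
        return 0.0
    closed = sum(1 for a in apertures if a < closed_threshold)
    return closed / len(apertures)

# Synthetic window: mostly open eyes with a few near-closed samples.
window = [0.9, 0.85, 0.1, 0.05, 0.9, 0.95, 0.15, 0.9, 0.9, 0.9]
print(perclos(window))  # 3 of 10 samples below 0.2 -> 0.3
```

Fed with a live eye-openness stream from Neon, the same computation over a sliding window yields a real-time drowsiness signal.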
July 8, 2025 at 9:06 AM
Here’s a little demo we built at the conference: hand and ball tracking using our real-time API. Stop guessing what your athletes are seeing – start knowing!
July 4, 2025 at 3:45 AM
What if you could see a live concert through the eyes of an entire audience?

Researchers at @mcmasteruniversity.bsky.social's LIVELab just did.

Their new "SocialEyes" framework, powered by data from our Neon eye tracking glasses, captured the synchronized gaze of 30 audience members in real time.
June 30, 2025 at 4:50 AM
How do babies see the world?

For a long time, that question was limited to the lab. Not anymore.

Researchers at BabyLab (Grenoble Alpes University) have designed a custom headband for our Neon module, enabling eye tracking studies with children as young as 3 months old in natural settings.
June 24, 2025 at 6:04 AM
Where did I see that?

With our new Alpha Lab tool, you can now fuse Neon eye tracking with real-time location data (GPS), all visualized on a single interactive map.

Track a participant’s full journey, including:
- Gaze behavior
- Head orientation
- Precise GPS path
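Fusing the two streams amounts to looking up a position for each gaze timestamp; since GPS updates far less often than gaze, a common approach is linear interpolation between fixes. A generic sketch with synthetic data — not the Alpha Lab tool's code:

```python
def interp_position(gps, t):
    """Linearly interpolate a (lat, lon) position at time t from GPS fixes.

    gps: list of (timestamp, lat, lon) tuples sorted by timestamp.
    Times outside the recorded range clamp to the first/last fix.
    """
    if t <= gps[0][0]:
        return gps[0][1], gps[0][2]
    if t >= gps[-1][0]:
        return gps[-1][1], gps[-1][2]
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(gps, gps[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)

# 1 Hz GPS fixes; gaze arrives much faster, so positions are interpolated.
gps = [(0.0, 48.0, 11.00), (1.0, 48.0, 11.01), (2.0, 48.0, 11.02)]
print(interp_position(gps, 0.5))  # halfway between the first two fixes
```

Each gaze sample then gets a map coordinate, which is what makes the single interactive map view possible.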
June 19, 2025 at 3:29 AM
And that wraps up another year at VSS (Vision Sciences Society).

We invited everyone to come get a hands-on demo with Neon, and that's exactly what they did - check out the video!

Thanks to everyone who stopped by to say hi - lots of great discussions with old and new faces 🙏
May 23, 2025 at 9:37 AM