If you’re looking for high-grade, flexible eye-tracking tools that adapt to your research, from VR setups to real-world environments, visit us at Booth 435.
Find our team at @sfn.org Neuroscience 2025 (Booth 435), San Diego Convention Center, Nov 16-19.
We'll be showing how Neon's real-world eye tracking is opening new frontiers in science.
#SfN25
Our new Alpha Lab tutorial introduces a powerful workflow that combines our Neon eye tracker with metaopensource.bsky.social Segment Anything Model 2 (SAM2).
The process is simple: Click. Segment. Track.
Play synced gaze + video + audio, analyze, or transcribe (STT) in real-time.
Start listening. Update now:
pip install -U pupil-labs-realtime-api
Learn more: pupil-labs.github.io/pl-realtime-...
Researchers at the LTSI (University of Rennes) combined Pupil Labs Neon eye tracking glasses with a high-fidelity simulator to capture how specialists visually navigate this delicate procedure.
(Video credit: Josselin Gautier)
We’ve rolled out a big update to the Video Renderer and introduced a new drawing tool in the AOI Editor.
We took Neon for a spin through "Highlights from the Barn" – An exhibition featuring illusions from Dr. Bernd Lingelbach's collection.
Next Sunday we will host a hands-on eye tracking workshop with Neon. The focus is on programmatic control - using our Python APIs to stream live data, manage experiments, and analyze results directly with code.
Researchers have taken eye tracking out of the lab and onto the basketball court to find out.
As our ecosystem of eye tracking frames, accessories, and add-ons has grown, we wanted to make it easier to explore all the options and understand how Neon's modular system works.
Now you can customize how gaze and fixations are visualized within Cloud. Current options for gaze visualization are circle and crosshair. Adjust size, color, width, and more!
Stay tuned, we’re bringing these updates to the video renderer soon! 👀
We’ve just released comprehensive documentation for our Real-Time API, making it easier than ever to stream and visualize live gaze data from Neon.
Remotely control recordings, send event annotations, and synchronize data streams across devices — all in real-time.
There's so much eye tracking data generated with each Neon recording and with this release we're visualizing more of it (with more on the way 😉 ).
Now you can see eye video overlays, eye aperture graphs, pupil diameter graphs, and audio graphs directly in Pupil Cloud.
Don't just observe fatigue, quantify it. Our new tool uses Neon eye tracking to calculate PERCLOS (Percentage of Eye Closure) live, giving you a direct, data-driven window into drowsiness.
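For the curious, here is a minimal, purely illustrative sketch of the PERCLOS idea (not our tool's implementation): the fraction of samples in a time window during which the eye is at least 80% closed. The `apertures` data below is hypothetical.

```python
# Illustrative PERCLOS sketch: fraction of time within a window
# that the eye is >= 80% closed (aperture <= 0.2 on a 0-1 scale).

def perclos(apertures, closed_threshold=0.2):
    """apertures: eye-openness samples in [0, 1], where 1 = fully open.
    A sample counts as 'closed' when aperture <= closed_threshold."""
    if not apertures:
        return 0.0
    closed = sum(1 for a in apertures if a <= closed_threshold)
    return closed / len(apertures)

# Hypothetical 10-sample window: two samples nearly closed.
window = [0.9, 0.85, 0.8, 0.1, 0.05, 0.7, 0.9, 0.95, 0.88, 0.92]
print(perclos(window))  # -> 0.2
```

In practice PERCLOS is computed over a sliding window (often 60 s) of continuous eye-aperture estimates rather than a fixed list.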
The winning edge is often invisible. A split-second glance, a moment of intense focus. We believe you should be able to see it all.
Researchers at @mcmasteruniversity.bsky.social LIVELab just did.
Their new "SocialEyes" framework, powered by data from our Neon eye tracking glasses, captured the synchronized gaze of 30 audience members in real time.
For a long time, that question was limited to the lab. Not anymore.
Researchers at BabyLab (Grenoble Alpes University) have designed a custom headband for our Neon module, enabling eye tracking studies with children as young as 3 months old in natural settings.
With our new Alpha Lab tool, you can now fuse Neon eye tracking with real-time location data (GPS), all visualized on a single interactive map.
Track a participant’s full journey, including:
- Gaze behavior
- Head orientation
- Precise GPS path
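The core of any gaze + GPS fusion is aligning two streams recorded at different rates. Here is a minimal, purely illustrative sketch (not the Alpha Lab tool itself) that pairs each gaze sample with the nearest-in-time GPS fix; timestamps and coordinates are hypothetical.

```python
import bisect

# Illustrative sketch: match each gaze sample to the GPS fix that is
# closest in time, using binary search over sorted GPS timestamps.

def match_gaze_to_gps(gaze, gps):
    """gaze: list of (timestamp, x, y); gps: list of (timestamp, lat, lon),
    both sorted by timestamp. Returns (gaze_sample, nearest_gps_fix) pairs."""
    gps_times = [t for t, _, _ in gps]
    pairs = []
    for sample in gaze:
        t = sample[0]
        i = bisect.bisect_left(gps_times, t)
        # Compare the neighbouring fixes and keep the closer one in time.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gps)]
        best = min(candidates, key=lambda j: abs(gps_times[j] - t))
        pairs.append((sample, gps[best]))
    return pairs

gaze = [(0.00, 0.5, 0.5), (0.52, 0.4, 0.6)]          # 2 gaze samples
gps = [(0.0, 52.520, 13.405), (1.0, 52.521, 13.406)]  # 2 GPS fixes
for g, p in match_gaze_to_gps(gaze, gps):
    print(g[0], "->", p[0])
```

A real pipeline would also interpolate GPS positions between fixes and handle clock-offset correction between the two devices.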
We've been steadily developing new frames for the Neon eye tracking system – and with 15 options now available, we thought it was time to write a guide to help you choose the right one for your application.
Read the guide: pupil-labs.com/blog/a-guide...
We said come get a hands-on demo with Neon, and that is exactly what they did - check out the video!
Thanks to everyone who stopped by to say hi - lots of great discussions with old and new faces 🙏
Available now: pupil-labs.com/products/neo...