Science is like magic but real.
What would you do if you had access to the pixel stream from @zeiss-microscopy.bsky.social inside @napari.org?
Btw, it works in Fiji/ImageJ as well :-)
Sidenote: experiments started from the "outside" do not update the ZEN UI, but they are executed by ZENservice.
Are you ready to automate your next large-scale microscopy experiment and work smarter, not harder?
Join us for the Smart Microscopy workshop organized by EPFL Bio-Imaging Platform (Lausanne) and @zeiss-microscopy.bsky.social
www.zeiss.ch/mikroskopie/...
github.com/zeiss-micros...
Control a ZEN system from the outside using Python (or any other language), or stream the incoming pixels directly into an array for online processing :-)
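To make the "stream the pixels into an array" idea concrete, here is a minimal sketch. The `pixel_stream` generator is a hypothetical stand-in for the real ZEN API source (which I'm not showing here); the point is that each frame arrives as a plain numpy array you can process on the fly:

```python
import numpy as np

def pixel_stream(num_frames=5, shape=(512, 512), seed=0):
    """Stand-in for the ZEN pixel stream: yields frames as numpy arrays.
    In the real setup each frame would come from the microscope."""
    rng = np.random.default_rng(seed)
    for _ in range(num_frames):
        yield rng.integers(0, 65535, size=shape, dtype=np.uint16)

# Online processing: build a maximum-intensity projection frame by frame,
# without ever writing the raw frames to disk.
running_max = None
for frame in pixel_stream():
    running_max = frame if running_max is None else np.maximum(running_max, frame)
```

Because the frames are just arrays, any downstream tool (napari layers, scikit-image filters, a DL model) plugs in at the spot where `np.maximum` is called.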
- Guided Acquisition running on a Celldiscoverer 7
- overnight experiment (fully automated)
- mitotic events detected in the overview image (over time)
- 300 3D stacks acquired using a high-resolution water-immersion objective
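The Guided Acquisition loop above can be sketched in a few lines. Everything here is a hypothetical mock: the real workflow uses a trained detector and ZEN experiment calls, which the placeholder functions only stand in for:

```python
import numpy as np

def detect_events(overview, threshold=0.8):
    """Hypothetical detector: flag bright pixels in the low-res overview.
    The real experiment used a model to find mitotic cells instead."""
    ys, xs = np.where(overview > threshold)
    return list(zip(ys.tolist(), xs.tolist()))

def guided_acquisition(overview):
    """Queue one high-res 3D stack per detected event. The job dict is a
    placeholder for the real ZEN experiment that would be triggered."""
    return [{"pos": (y, x), "task": "3D stack, water immersion"}
            for y, x in detect_events(overview)]

overview = np.zeros((64, 64))
overview[10, 20] = 1.0            # one simulated "mitotic" event
jobs = guided_acquisition(overview)
print(len(jobs))  # → 1
```

Run overnight, this detect-then-acquire loop is what turns one overview scan into hundreds of targeted high-resolution stacks.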
The stream of pixels will be directly available as an array that can be used right away. Here is an example of using a simple DL model for online segmentation.
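A minimal sketch of the online-segmentation pattern: segment each frame the moment it arrives, rather than after the acquisition finishes. The frame source and the "model" are both stand-ins (a mean threshold replaces the small DL model from the post):

```python
import numpy as np

def segment(frame):
    """Stand-in for the DL model: mean-threshold segmentation.
    In the post, a simple deep-learning model plays this role."""
    return (frame > frame.mean()).astype(np.uint8)

def stream(n=3, shape=(128, 128), seed=1):
    """Hypothetical frame source standing in for the incoming pixel stream."""
    rng = np.random.default_rng(seed)
    for _ in range(n):
        yield rng.random(shape).astype(np.float32)

# Online: one mask per frame, produced while the next frame is still acquiring.
masks = [segment(f) for f in stream()]
```

Swapping `segment` for a real model call is the only change needed to go from this toy to the actual demo.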
- 300 3D time-lapse datasets
- fully automated
- "mitotic" events detected in the low-resolution image
- 3D stacks using a high-resolution water-immersion objective
It makes partial labeling to train your models (semantic or instance) even faster. The first cell is still annotated the usual way; the others using the new tool.
Sometimes on Fridays I find time to play around with some of the new features of the ZEN software from #zeiss_micro.
Here is a demo of using the PixelStream (via the ZEN API) to directly "segment" the acquired images by applying an ONNX model to the pixel stream.
Hello Feedback Microscopy 😁!
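The per-frame inference pattern from the demo can be sketched like this. To keep the sketch runnable without a model file, a fixed threshold stands in for the ONNX model; the comment notes roughly where an onnxruntime call would go. The NCHW batching and the stream itself are assumptions, not the actual ZEN API:

```python
import numpy as np

def run_model(batch):
    """Placeholder for ONNX inference. With onnxruntime this step would be
    roughly: ort.InferenceSession("model.onnx").run(None, {"input": batch})[0].
    A fixed threshold stands in so the sketch runs without a model file."""
    return (batch > 0.5).astype(np.float32)

def apply_to_stream(frames):
    """Apply the model to each frame as it arrives from the (hypothetical)
    PixelStream; yields one segmentation mask per frame."""
    for frame in frames:
        batch = frame[np.newaxis, np.newaxis, ...]  # NCHW layout, batch of 1
        yield run_model(batch)[0, 0]

frames = [np.full((4, 4), v, dtype=np.float32) for v in (0.2, 0.9)]
masks = list(apply_to_stream(frames))
print([int(m.sum()) for m in masks])  # → [0, 16]
```

Feeding the masks back into acquisition decisions is what closes the loop for feedback microscopy.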
Here you can see it in action 😜