Hackathon participation

Between Dec 16th and 18th, my Stemson collaborator Angus Lothian and I participated in the 2nd Global AI and Machine Learning for Microscopy Hackathon organized by the Kalinin lab. I got to participate in person at the Thermo Fisher site in Eindhoven, sponsored by VISION. It was a great time with plenty of hacking, networking, and inspiring tours of the impressive Thermo Fisher facilities.

Our contribution to the hackathon was inspired by CLIP (openai/CLIP: Contrastive Language-Image Pretraining), but rather than learning a joint embedding between text labels and image content, we learn a joint embedding between instrument parameters and image style. We imagine this could be used for lightweight physical denoising quick enough to run live on the microscope, by using the learned embeddings to condition and train a generative model. Essentially, we want to be able to record data at a low dose and use the model to predict what the image would have looked like had it been acquired at a higher dose.
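
To make the CLIP-style idea concrete, here is a minimal sketch of a contrastive joint embedding between micrograph patches and their acquisition parameters. It is illustrative only: the encoder architectures, the choice and number of instrument parameters, and the hyperparameters are assumptions for the example, not our hackathon code.

```python
# Minimal sketch: CLIP-style contrastive embedding of (image, instrument parameters)
# pairs. Encoder sizes and the parameter vector are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ImageEncoder(nn.Module):
    """Small CNN mapping a micrograph patch to an embedding vector."""
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x):
        return self.net(x)


class ParamEncoder(nn.Module):
    """MLP mapping instrument parameters (e.g. dose, defocus) to the same space."""
    def __init__(self, n_params: int = 4, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_params, 64), nn.ReLU(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, p):
        return self.net(p)


def clip_loss(img_emb, par_emb, temperature: float = 0.07):
    """Symmetric contrastive loss: matching (image, parameters) pairs should
    have high cosine similarity, mismatched pairs within the batch low."""
    img_emb = F.normalize(img_emb, dim=-1)
    par_emb = F.normalize(par_emb, dim=-1)
    logits = img_emb @ par_emb.t() / temperature
    targets = torch.arange(len(logits), device=logits.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    # Dummy batch: 8 image patches and their acquisition parameter vectors.
    images = torch.randn(8, 1, 64, 64)
    params = torch.randn(8, 4)

    img_enc, par_enc = ImageEncoder(), ParamEncoder()
    loss = clip_loss(img_enc(images), par_enc(params))
    loss.backward()
    print(f"contrastive loss: {loss.item():.3f}")
```

In the envisioned pipeline, the parameter embedding (e.g. of a higher target dose) would then be used as the conditioning signal for a generative model that restyles a low-dose image accordingly.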