Real-time Experiments with an AI Co-Scientist - Stefania Druga, fmr. Google DeepMind

Live Demonstration of AI Co-Scientist 00:20

  • The presentation begins with a live demo showing a micro:bit board connected to temperature sensors, demonstrating real-time data collection (see the sketch after this list)
  • Data from heat pad experiments is analyzed in real time by an AI science assistant, which provides feedback based on the collected sensor data
  • Users can input context and create experiment protocols for the assistant to interpret and provide relevant insights and monitoring
  • The setup allows creation of custom experiment pages to monitor and plot real-time data
  • An open-source camera capable of running object tracking models is demonstrated; it can autonomously track targets and be customized for specific scientific observations
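The collection code itself isn't shown in the talk; a minimal Python sketch of the micro:bit-to-log data path might look like the following, assuming the board streams comma-separated readings over USB serial (the port name, baud rate, and line format are assumptions, not the presenter's setup).

```python
# Minimal sketch: read temperature readings streamed by a micro:bit over USB
# serial and append them to a CSV log for real-time plotting. Port name, baud
# rate, and the "<device_ms>,<temperature_c>" line format are assumptions.
import csv
import time

import serial  # pyserial

PORT = "/dev/ttyACM0"   # adjust for your OS (e.g. "COM3" on Windows)
BAUD = 115200

with serial.Serial(PORT, BAUD, timeout=1) as ser, \
        open("temperature_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["host_time", "device_ms", "temperature_c"])
    while True:
        line = ser.readline().decode("utf-8", errors="ignore").strip()
        if not line:
            continue  # read timed out with no data
        try:
            device_ms, temp_c = line.split(",")
            writer.writerow([time.time(), device_ms, float(temp_c)])
            f.flush()  # keep the log current so a live plot can follow it
        except ValueError:
            pass  # skip malformed lines
```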

Motivation and Inspiration for AI Co-Scientists 03:38

  • The concept addresses scientific data overload and complexity, with AI assisting in data parsing, generating hypotheses, and identifying blind spots
  • AI enables simultaneous testing of numerous hypotheses, greatly accelerating research
  • Inspiration came from a recent DeepMind paper that orchestrated multiple AI agents (using Gemini 2.0) to perform varied scientific roles, such as summarizing, ranking hypotheses, and planning experiments
  • The DeepMind system replicated a 12-year gene transfer discovery in only two days, without prior exposure to the data, and proposed new hypotheses for liver fibrosis treatments that were verified in wet lab tests
  • This demonstrated the practical, immediate potential of AI co-scientists in areas like drug discovery and healthcare

Real-Time AI Collaboration in Experiments 07:08

  • The vision expands from asynchronous AI analysis to real-time collaboration, where AI formulates hypotheses based on live experimental data
  • Referenced the "Era of Experience" paper, emphasizing the progression from static human-curated data toward AI learning continuously from real-world environments, especially through multimodal data streams (images, sensors, audio)

System Architecture and Workflow 08:28

  • The system is a React app that aggregates data from USB sensors, multiple webcams, text, and voice inputs; all inputs are forwarded via webhooks to a backend that communicates with the Gemini API
  • A dynamic context assembly process determines which modalities are currently active and constructs an adaptive input for the AI assistant (see the sketch after this list)
  • The setup allows users to define different protocols and have the system respond accordingly based on current experiment parameters
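The backend code isn't shown in the talk; below is a minimal Python sketch of the dynamic-context-assembly idea, assuming the google-generativeai client and made-up field names (protocol, sensors, frame_jpeg, note). It only illustrates how whichever modalities are active might be assembled into a single multimodal Gemini request, not the presenter's actual implementation.

```python
# Illustrative sketch of dynamic context assembly: gather whichever inputs are
# currently active (sensor readings, a webcam frame, a researcher note) and
# build one multimodal request to the Gemini API. Field names, protocol text,
# and model name are assumptions for illustration.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # user-supplied key
model = genai.GenerativeModel("gemini-2.0-flash")

def assemble_context(protocol: str, sensors: dict | None,
                     frame_jpeg: bytes | None, note: str | None) -> list:
    """Build the prompt parts from whichever modalities are available."""
    parts = [f"Experiment protocol:\n{protocol}"]
    if sensors:
        parts.append("Latest sensor readings: " +
                     ", ".join(f"{k}={v}" for k, v in sensors.items()))
    if frame_jpeg:
        parts.append({"mime_type": "image/jpeg", "data": frame_jpeg})
    if note:
        parts.append(f"Researcher note: {note}")
    parts.append("Given the protocol and the live data, give feedback and flag anomalies.")
    return parts

# Example call with only sensor data active (no webcam frame or voice note)
response = model.generate_content(
    assemble_context(
        protocol="Monitor the heat pad; temperature should stay between 30 and 40 C.",
        sensors={"temperature_c": 36.4, "elapsed_s": 120},
        frame_jpeg=None,
        note=None,
    )
)
print(response.text)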

Hardware Overview and Experiment Design 10:19

  • Experimentation is guided by available hardware (sensors, cameras, boards), with constraints such as real-time measurement, safety, and portability
  • Two key types of experiments highlighted: crystal growth and fermentation, both suitable for at-home science

Crystal Growth Experiment Insights 11:17

  • Crystal growth is achieved by supersaturating hot water with salt and cooling it gradually, with careful control yielding better crystal formation
  • Measurements include salt dissolution rates, nucleation sites, and crystal growth rates, influenced by parameters like temperature and concentration
  • Real-time sensor data is recorded alongside the experiment; the resulting CSV data is used for visualization and analysis (currently via an external script; see the sketch after this list)
  • Discovery: Crystal growth is not gradual but occurs in sudden bursts after critical saturation is reached
  • Experiments with different samples (fridge, room temp) allow for controlled comparisons
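The external analysis script isn't shown; a possible shape for it, assuming the logged CSV has columns like time_s, temperature_c, and crystal_size_mm (column names and the burst threshold are assumptions), is sketched below. It plots the series and flags intervals where the growth rate jumps well above its typical value, which is how the burst behavior would show up.

```python
# Sketch of a CSV analysis script: plot temperature and crystal size over time
# and flag sudden growth "bursts". Column names and threshold are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("crystal_log.csv")  # assumed columns: time_s, temperature_c, crystal_size_mm
df["growth_rate"] = df["crystal_size_mm"].diff() / df["time_s"].diff()

# Flag intervals where growth is far above the typical rate
threshold = df["growth_rate"].mean() + 3 * df["growth_rate"].std()
bursts = df[df["growth_rate"] > threshold]

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(df["time_s"], df["temperature_c"], label="temperature (C)")
ax2.plot(df["time_s"], df["crystal_size_mm"], label="crystal size (mm)")
ax2.scatter(bursts["time_s"], bursts["crystal_size_mm"], color="red", label="growth burst")
ax1.legend(); ax2.legend(); ax2.set_xlabel("time (s)")
plt.show()
```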

Fermentation Experiment (Brief Mention) 14:34

  • Fermentation experiments focused on varying salt/sugar content and temperature, measuring growth rate and CO2 levels
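As a hypothetical illustration of how such runs could be compared (file names, column names, and condition labels are all assumptions, not the presenter's data), fitting a line to each jar's CO2 readings gives a rough per-condition fermentation rate:

```python
# Hypothetical comparison of fermentation conditions: fit a line to each jar's
# CO2 readings and compare slopes as a rough fermentation-rate estimate.
import numpy as np
import pandas as pd

conditions = {
    "2% salt, room temp": "ferment_salt2_room.csv",
    "2% salt, fridge": "ferment_salt2_fridge.csv",
    "5% sugar, room temp": "ferment_sugar5_room.csv",
}

for label, path in conditions.items():
    df = pd.read_csv(path)  # assumed columns: time_h, co2_ppm
    slope, _intercept = np.polyfit(df["time_h"], df["co2_ppm"], 1)
    print(f"{label}: CO2 accumulation ~ {slope:.1f} ppm/hour")
```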

Educational Tools and Open Source Ecosystem 15:16

  • Demonstrated an educational version of the system for hands-on experiments, with the code available so users can run it with their own API keys and devices
  • Encouraged audience members to try the educational tool and connect their own hardware

Open Source Lab Automation and Future Directions 15:57

  • Cited a robust open-source ecosystem for reproducing lab equipment and automating experimental procedures (e.g., pipetting, analysis, droplet manipulation)
  • Jubilee motion platform (from the University of Washington) and open bioreactor projects highlighted, with reference to a recent workshop on lab automation
  • Future vision includes integrating real experimentation data with simulations, enabling realistic and efficient planning of lab conditions and experiments
  • AI-informed simulations could support both scientific discovery and more advanced experimentation strategies

Closing Remarks and Education Initiatives 18:08

  • Presenter shares resources and ongoing open-source education-related projects on her website
  • Announcement of an upcoming AI education summit, inviting attendees to participate
  • Presenter expresses passion for educational outreach and thanks the audience