Abstract
In iterative optimization, actions are adjusted based on what we observe, such as dosing until dissolution occurs or stirring until mixing is complete. Self-driving laboratories (SDLs) offer an opportunity to guide such experimental adjustments autonomously and iteratively using visual feedback. However, current SDLs do not monitor these visual cues. HeinSight 4.0 fills this gap by integrating computer vision into SDLs to enable real-time experimental adjustments based on visual feedback. The computer vision system detects equipment (e.g., reactor, vial), classifies chemical phases (solid, liquid, air), and analyzes image features such as turbidity and color. HeinSight 4.0 tracks these characteristics frame by frame and interprets the corresponding physical states (e.g., dissolution, separation). These observations feed into a rule-based system that integrates with the SDL to adjust experiments in real time. We demonstrate the adaptability of HeinSight 4.0 in two pharmaceutical case studies: purification (solubility screening) and drug formulation (melt spray congeal). We also developed a hardware-agnostic architecture and deployed it across two institutions with distinct robotic systems. The open-source HeinSight 4.0 enables SDLs to see, think, and act in real time.