Open Access Article
Suyash Damir,a Fernando Delgado-Licona,b Andrew deMello*a and Milad Abolhasani*b
aDepartment of Chemistry and Applied Biosciences, Institute of Chemical and Bioengineering, ETH Zurich, Vladimir-Prelog-Weg 1, Zürich 8093, Switzerland. E-mail: andrew.demello@chem.ethz.ch
bDepartment of Chemical & Biomolecular Engineering, North Carolina State University, Raleigh, NC, USA. E-mail: abolhasani@ncsu.edu
First published on 17th February 2026
Global challenges such as climate change, escalating energy demands, and health equity require new scientific innovations able to deliver timely solutions. Self-driving laboratories (SDLs) combine robotics and lab automation with artificial intelligence to efficiently explore complex experimental spaces, reduce human effort, and speed up discovery through intelligent experimentation. Central to this transformation is responsible research acceleration (RRA). This ensures that advances are reproducible, transparent, and resource-efficient, and lays the foundation for sustainable innovation. Microfluidics, with its precise control of heat and mass transfer rates, minimal reagent use, and seamless integration with real-time sensing and automation, represents an ideal platform to embody RRA principles within SDLs. This perspective explores the synergy between microfluidics and autonomous experimentation, highlights key challenges, and proposes strategies for fully autonomous microfluidic workflows. We argue that flow-based platforms are essential to expedite discovery and that stronger academia–industry collaboration is critical in shortening the path from scientific insight to real-world implementation and impact.
In this context, a new research paradigm is emerging, centered on the integration of automation, artificial intelligence (AI), and closed-loop experimentation. At the core of this shift is the self-driving laboratory (SDL), a physical–digital platform that automates the iterative design–make–test–analyze (DMTA) cycle (Fig. 1). SDLs replace linear workflows with adaptive ones, where machine learning (ML) models guide the selection of experiments, interpret outcomes, and rapidly refine hypotheses. This approach enables researchers to explore vast design spaces systematically and efficiently, reducing the number of physical experiments needed and increasing the likelihood of identifying high-performing solutions.2
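The closed DMTA loop described above can be sketched in a few lines of Python. In this toy example, `run_experiment` stands in for the physical "make and test" steps, and `propose_next` is a deliberately simple greedy proposal rule standing in for a real ML planner; all names and the one-dimensional temperature search space are illustrative assumptions, not part of any published SDL framework.

```python
import random

def run_experiment(temperature):
    """Toy stand-in for the physical 'make + test' steps: returns a
    simulated yield metric that peaks at 75 degrees C."""
    return -((temperature - 75.0) ** 2)

def propose_next(history, bounds, n_candidates=50):
    """Design step: sample candidate conditions and pick the one closest
    to the best result seen so far (a greedy surrogate for a planner)."""
    if not history:
        return sum(bounds) / 2.0  # start at the midpoint of the range
    best_t, _ = max(history, key=lambda h: h[1])
    candidates = [random.uniform(*bounds) for _ in range(n_candidates)]
    return min(candidates, key=lambda t: abs(t - best_t))

def dmta_loop(n_iterations=20, bounds=(25.0, 125.0)):
    """Run the design-make-test-analyze cycle and return the best
    (condition, outcome) pair observed."""
    history = []
    for _ in range(n_iterations):
        t = propose_next(history, bounds)  # design
        y = run_experiment(t)              # make + test
        history.append((t, y))             # analyze / record
    return max(history, key=lambda h: h[1])
```

A real SDL would replace the greedy rule with a Bayesian or other surrogate-model planner, but the loop structure — propose, execute, record, repeat — is the same.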
Importantly, SDLs do not replace human insight but rather extend it. Researchers define goals, constraints, and hypotheses, with the SDL executing experiments, analyzing results, and proposing new directions. This hybrid model of human–AI–robot collaboration is especially well-suited for use in domains where experimentation is expensive, slow, or multidimensional, such as materials discovery, catalysis, and pharmaceutical development. That said, as SDLs begin to reshape these fields, questions of trust, reproducibility, and equitable access become more pressing. Accelerating discovery must not come at the cost of rigor. To ensure this balance, the concept of responsible research acceleration (RRA) offers a guiding framework. Building on the principles of responsible AI ecosystems,3,4 RRA emphasizes transparency, reproducibility, sustainability, and equitable access across all parts of the DMTA cycle. In practice, this means documenting experimental decisions, reporting uncertainty, minimizing waste and energy use, and ensuring that the benefits of accelerated discovery are broadly distributed. Platforms that inherently support these values, by generating reliable data at scale with minimal resource consumption, are particularly valuable in the SDL landscape.5
Microfluidics represents one such enabling platform. By miniaturizing and compartmentalizing chemical and biological processes, microfluidic systems reduce reagent use, accelerate reaction times, and offer exquisite control over experimental conditions. Their intrinsic scalability and compatibility with automation make them a natural fit for closed-loop experimentation, minimizing variability and enhancing reliability. Moreover, the ability to multiplex experiments, integrate in-line sensing, and achieve high spatial and temporal resolution enables rich, multimodal data collection, which is an essential (but often ignored) ingredient for effective predictive ML modeling and optimization. Microfluidic platforms can thus serve as efficient engines for SDLs, supporting both rapid exploration and reproducibility.
This perspective article explores the intersection of microfluidics and autonomous experimentation, focusing on how recent advances are enabling the realization of fully autonomous microfluidic labs. We examine the current state of the field, identify unmet challenges, and highlight promising directions for research and deployment. We argue that microfluidics is not merely compatible with the SDL paradigm, but rather constitutes a central technology for realizing rapid, reproducible, and resource-efficient autonomous experimentation.
The field of microfluidics emerged in the early 1980s from the need to enhance the sensitivity, reproducibility, and longevity of chemical sensors.9 As researchers recognized that miniaturization could reduce reagent usage and shorten analysis times, microfluidic systems evolved rapidly, giving rise to integrated lab-on-a-chip platforms (or micro total analysis systems, μTAS) in the late 1990s.10,11 These platforms enabled the consolidation of laboratory functions such as sample preparation, reaction, detection, and analysis onto a single device, transforming workflows across disciplines.
Research in microfluidics has demonstrated substantial impact across chemistry,12–14 biology,15–19 and materials science.20–23 Perhaps the most transformative contribution has been in single-cell and single-molecule analysis,15,17,24–30 and nucleic acid amplification,31–36 where microfluidic compartmentalization underpins technologies such as digital polymerase chain reaction (PCR)37 and single-cell genomics,38 enabling measurements that are fundamentally inaccessible to bulk assays. Further, in chemical synthesis and materials discovery, microfluidic reactors have provided precise control over heat and mass transfer rates, enabling reproducible synthesis, access to transient reaction regimes, and high-throughput exploration of parameter spaces with drastically reduced material consumption.39–41 These capabilities establish microfluidics as a mature and distinct experimental modality, enabling the investigation of complex chemical and biological systems with a level of precision, throughput, and data richness that is difficult to achieve by conventional means.
Despite the success of miniaturized workflows, monolithic μTAS platforms are often limited in terms of scalability. Their rigid architectures restrict configurational flexibility, making it difficult to iterate on designs or troubleshoot failures without extensive refabrication. In contrast, modular microfluidic systems, composed of discrete, task-specific units, offer enhanced flexibility, enabling rapid reconfiguration and easier integration with external automation (Table 1).42,43 Each component in such modular workflows typically serves one of four primary functions: (a) formulation and mixing with precise compositional control, (b) continuous or droplet-based synthesis, (c) in-line or on-line detection using integrated sensors, and (d) downstream processing such as separation, purification, sorting, and archiving (Fig. 2a).
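This modular decomposition can be illustrated as a chain of interchangeable processing stages covering the four functional classes above. The module functions and the `sample` dictionary below are hypothetical placeholders, not an existing API; the point is that any stage can be swapped, removed, or reordered without refabricating the rest of the workflow.

```python
def formulate(sample):
    """Formulation/mixing stage: set a precursor composition."""
    sample["composition"] = {"precursor_A": 0.6, "precursor_B": 0.4}
    return sample

def synthesize(sample):
    """Continuous or droplet-based synthesis stage."""
    sample["product"] = "droplet_batch"
    return sample

def detect(sample):
    """In-line detection stage: record a placeholder sensor readout."""
    sample["absorbance"] = 0.42
    return sample

def postprocess(sample):
    """Downstream stage: separation, purification, or archiving."""
    sample["archived"] = True
    return sample

def run_pipeline(sample, modules):
    """Pass the sample through each discrete module in order."""
    for module in modules:
        sample = module(sample)
    return sample
```

Reconfiguring the workflow is then just a change to the `modules` list, mirroring how physical modules are swapped in a modular platform.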
| Features | μTAS | Modular microfluidic systems |
|---|---|---|
| Scalability | Low during development (exponentially high once the design is fixed) | High (limited by the number of modules) |
| Reconfigurability | Medium (rigid architecture) | High (discretized units) |
| Reproducibility | Excellent (with established microfabrication practices) | Good (for standardized modules), can vary with assembly |
| Automation | High (but less flexible) | High (dynamic automation via module integration) |
| Failure recovery | Poor (single point failure may require full replacement) | Excellent (swap individual modules) |
| Cost | Low (for large-scale commercial applications) | High upfront costs (decreasing with methods like 3D printing) |
This modularity proves particularly advantageous when investigating dynamic and multidimensional experimental landscapes. For example, enzymatic assays often require iterative optimization of substrate ratios, pH, temperature, and reaction time44—parameters that demand flexible experimental workflows capable of accommodating diverse assay formats and detection methods. While integrated μTAS platforms excel at executing defined protocols with high precision, their architectures are optimized for specific experimental workflows, making it challenging to adapt to fundamentally different assay requirements or incorporate new analytical techniques without substantial redesign effort.45 Similarly, in colloidal nanomaterial synthesis, where product quality is highly sensitive to flow profiles and mixing dynamics, modular platforms allow individual tuning of each stage while enabling real-time adjustments based on sensor feedback.46 In both cases, modular microfluidic systems reduce development time, enhance reproducibility, and allow selective upgrades or repairs without disrupting the entire workflow.
The convergence of microfluidics and SDLs is thus a natural progression. Modular microfluidic platforms satisfy key requirements for autonomous experimentation: precise control, scalability, and compatibility with real-time sensing and actuation.114 Reactions can be executed with high temporal and spatial resolution, and outcomes continuously measured, producing comprehensive, structured datasets that are ideally suited for ML. When combined with robotic liquid handling, programmable pumps, and digital control systems (Fig. 2b), these platforms move beyond static, miniaturized versions of traditional labs toward truly adaptive systems that learn from data and respond dynamically to new information.47,48
In this context, miniaturization is not simply shrinking workflows but a foundational technology that supports rapid iteration, efficient modular reconfigurability, and data-rich experimentation, features crucial when performing complex, multistep assays in fully autonomous SDLs.
| Features | Modular microfluidic systems | Modular batch systems |
|---|---|---|
| ^a Dependent on system design and level of standardization. | | |
| Operating mode | Continuous (state depends on flow conditions) | Discrete (state progresses by scheduled steps) |
| Search-space | Ideal for continuous variables; can handle categorical changes with reagent switching modules | Ideal for discrete variables and broad formulation libraries |
| Experimental coordinate | Residence time/position | Experiment index (well/vial ID) and time |
| Data richness per unit material | High^a (continuous signals yield rich information) | Moderate^a (can increase with repeated sampling) |
| Data latency | Low (time-resolved traces along reactor coordinate) | Moderate^a (often endpoint or sparse time points) |
| Reconfigurability | High (discretized units, facile module integration) | Medium (discretized units, steeper workflow integration) |
| Ease of protocol transfer | High^a (aided by standardized platform ecosystems/well-specified systems), though custom builds are prevalent | High^a (aided by standardized platform ecosystems) |
| Cost | Medium (lower reagent use; maintenance on sub-modules) | High (greater consumable use, e.g., tips and plates; CapEx depends on robots and modules) |
| Typical failure modes | Clogging, pump drift, sensor fouling/calibration; mitigated by inline monitoring and control logic | Mechanical drift, evaporation, liquid-handling variance, sensor calibration |
| Safety | Small internal volumes reduce hazards | Larger discrete volumes and open handling can increase exposure |
| Workflow diversity & complexity | High^a (if modularization is prioritized; multi-step trains possible) | High (excellent for discrete libraries and multi-step handling) |
| Cross-contamination | Effects can arise from fouling; managed by flushing, surface and secondary phase choices | Disposable consumables simplify isolation; managed by protocols |
| Residence time control | Direct and precise via flow rate and reactor volume | Indirect; set by protocol timing and mixing/incubation history |
| Material transport control | Transport engineered via channel geometry/flow regime; mixing is often predictable, reproducible, and model-informed | Handles slurries/solids; mixing governed by geometry/agitation; potential edge effects |
| Energy transport control | High surface-to-volume ratios enable enhanced thermal control; reduced characteristic lengths enable uniform energy flux | Thermal gradients are possible across the reactors/plates; slower heating/cooling rates |
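The direct residence-time control noted in the table follows from the simple relation τ = V/Q. A minimal helper pair, assuming plug flow and the unit conventions spelled out in the function names (the names themselves are illustrative):

```python
def residence_time_s(reactor_volume_ul, flow_rate_ul_min):
    """Mean residence time tau = V / Q for a plug-flow microreactor,
    with volume in microliters and flow rate in microliters per minute."""
    return reactor_volume_ul / (flow_rate_ul_min / 60.0)

def flow_rate_for_tau(reactor_volume_ul, tau_s):
    """Invert the relation: the flow rate (uL/min) that yields a target
    residence time for a given reactor volume."""
    return reactor_volume_ul / tau_s * 60.0
```

For example, a 100 μL reactor run at 60 μL min⁻¹ gives a 100 s residence time; halving the flow rate doubles it, with no protocol-level timing changes required.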
Continuous flow systems, by contrast, enable a fundamentally different experimental architecture. In microfluidic reactors, operated either as segmented (droplet-based) or continuous flows, reaction conditions are controlled in both space and time. The spatial progression of fluid through the reactor maps directly onto reaction time, enabling kinetic resolution and fine temporal control without process interruption.50–52
A key consequence of using continuous systems is the seamless integration of flow reactors with real-time, in-line characterization techniques.53 Unlike batch workflows that rely on discrete sampling and aliquoting, in-line sensing allows continuous, non-invasive monitoring of reaction evolution, supporting the rapid acquisition of large, time-resolved datasets and access to transient phenomena that are impossible to capture off-line. Optical spectroscopy (including UV-vis and near infrared) is the most widely adopted technique in microfluidics, as it can be integrated readily via bespoke flow cells or fiber-optic probes and provides low-latency readouts of reaction outcome (concentration, conversion, and optical properties of compounds). This makes it particularly suitable for closed-loop optimization, where rapid feedback is essential.54–58
Complementary modalities extend this analytical space in microfluidic reactors. Specifically, infrared (IR) spectroscopy offers chemically distinct fingerprints of functional group transformations and is commonly implemented using in-line flow cells to mitigate substrate transparency and path-length constraints.59–62 Emission-based techniques such as fluorescence and photoluminescence provide high sensitivity at low concentrations and are particularly effective for nanomaterial synthesis and biochemical assays,63–65 though their use in autonomous workflows requires accounting for photobleaching, inner-filter effects, and temperature-dependent signal drift.66,67 Beyond optical methods, on-line mass spectrometry enables molecular-level insight into intermediates and reaction pathways, albeit with added latency from flow splitting and the need to synchronize residence time with data acquisition.68,69 Together with specialized approaches such as Raman,70 nuclear magnetic resonance (NMR),71,72 electron paramagnetic resonance (EPR),73 and X-ray spectroscopy,74,75 these techniques provide a versatile and dynamic characterization toolbox for multimodal sensing in autonomous microfluidic platforms, capabilities that remain difficult to replicate in batch-based autonomous workflows.
While these analytical tools enable data-rich experimentation, their effective deployment in autonomous microfluidic systems depends on more than just sensor availability. Reliable flow characterization often requires stable calibration over long experimental campaigns, resistance to fouling and bubble formation, and seamless integration with control software. As such, in-flow characterization in SDLs is best viewed as a co-designed component of the experimental workflow, rather than a modular add-on, with sensing, data handling, and decision-making tightly coupled. These features collectively support more adaptive, efficient, and informative experimentation, which are critical to SDLs.
Recent demonstrations highlight the performance and efficiency gains of microfluidic flow systems in autonomous research settings. In the AlphaFlow platform, for example, over 9000 distinct reaction conditions were explored using less than 30 μL of each reagent, achieving rapid optimization of quantum dot optical properties within 1000 hours, an outcome inaccessible to standard mini-batch workflows.39 Beyond throughput, the small reaction volumes (typically nanoliters to microliters) lead to dramatic reductions in material consumption and waste. In the case of atropine synthesis, transitioning from batch to flow reduced waste by nearly two orders of magnitude while improving product purity and throughput.76 A shift to continuous formats also reduces the dependence on single-use consumables and improves energy efficiency, offering clear advantages for sustainable and toxicity-sensitive applications.77
While continuous- and segmented-flow platforms are ideally suited for data-rich autonomous experimentation, the majority of existing SDLs rely on batch-based, macroscale formats. This reflects practical considerations in implementing microfluidic SDLs rather than fundamental limitations of continuous flow approaches.48 In particular, issues related to robustness, system integration, and product volumes obtained for post-processing become increasingly important when translating microfluidic concepts into autonomous workflows. Initial setup costs, including microreactors, pumps, and analytical modules, can also exceed those of traditional batch infrastructure.78 Additionally, as platforms scale to include multiple parallelized flow paths or downstream processes, engineering complexity increases, demanding reliable integration strategies and modular design principles.79,80
Nonetheless, batch and microfluidic-based SDLs address different practical needs within modular experimentation workflows. Robotic, batch-based platforms are particularly effective when the DMTA cycle relies on decoupled, multi-step procedures involving solid and heterogeneous mixtures, where high-dimensional variables are explored as discrete, programmable, and auditable protocols. Robotic/batch ecosystems facilitate protocol transfer owing to their commercial maturity and widespread adoption. However, the quantity and reliability of the resulting datasets can be limited in part by more constrained engineered controls over material and energy transport, and by the complexity of integrating in-process characterization, which often results in endpoint readouts with limited temporal sampling. These limitations are frequently compounded by a higher cost per datapoint, driven by material consumption and reliance on single-use consumables. If the process is amenable to flow (i.e., no significant solid precursors), the choice of microfluidic modules allows finely engineered control over transport phenomena, enabling reproducible, time-resolved measurements and seamless integration of in-line analytics under tightly defined reaction conditions. Such capabilities directly support closed-loop, adaptive discovery by delivering reproducible, high-fidelity measurements, albeit with increased demands on system robustness, integration, and modular standardization.113 Moving forward, integrating microfluidic modules into SDL architectures offers a practical route to scaling data quality, control, and autonomy while complementing batch-based scientific workflows.
Studies from industrial settings underscore the scale of this issue. For instance, a seminal assessment by Lonza indicated that more than 60% of reactions amenable to continuous processing were rendered impractical on the microscale due to solid-related complications.40 Perovskite nanocrystal synthesis serves as a representative example, where shifts in precursor concentration or temperature can induce premature nucleation, leading to salt precipitation and channel blockage. Without reliable fouling detection and compensation strategies, such events introduce errors into the data stream that can misguide decision-making algorithms and compromise the integrity of subsequent iterations. In practice, fouling detection can be realized through multiple modalities. Optical microscopy remains the most widely used approach, with bright-field and fluorescence imaging enabling real-time visualization of aggregation, deposition, and adsorption in microchannels.81–84 Complementary techniques discussed for in-line characterization, such as Raman (including surface-enhanced Raman) spectroscopy,85 IR spectroscopy,86 NMR imaging,87,88 and X-ray methods89 have also been employed to probe early-stage fouling and particle composition without labeling, albeit with higher integration complexity. Beyond direct imaging, indirect indicators such as real-time analysis of pressure drops across the reactor length, or deviations in volumetric flow rate under constant actuation, provide simple, robust signatures of fouling or blockage that are well suited for autonomous operation.
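The pressure-drop signature lends itself to a simple autonomous check. The sketch below flags a fouling event when a pressure reading exceeds a rolling-median baseline by a fractional threshold; the window size and threshold are illustrative defaults, and a production controller would add debouncing and sensor-fault handling before pausing a campaign.

```python
from statistics import median

def detect_fouling(pressure_trace, window=5, rel_threshold=0.2):
    """Return the index of the first reading that exceeds the rolling
    baseline (median of the previous `window` readings) by more than
    `rel_threshold` (fractional increase), or None if no event occurs.
    A sustained pressure rise at constant flow rate is a classic
    signature of channel fouling or blockage."""
    for i in range(window, len(pressure_trace)):
        baseline = median(pressure_trace[i - window:i])
        if pressure_trace[i] > baseline * (1.0 + rel_threshold):
            return i
    return None
```

On detection, the control layer could trigger a flush cycle, reroute flow to a parallel path, or quarantine the affected datapoints so they do not mislead the optimization algorithm.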
Addressing challenges related to solid aggregation requires a combination of physical, chemical, and algorithmic solutions. Passive strategies include surface modifications to reduce particle adhesion90 and hydrodynamic control schemes that keep solids away from channel walls. Active techniques such as acoustic streaming,91 magnetic field manipulation, and electrophoretic separation92 enable dynamic control of solids within microchannels. However, these methods remain underutilized in autonomous workflows and often lack generalizability across chemistries. Incorporating redundancy into system architecture, such as parallelized flow paths or self-cleaning loops, can also increase robustness by enabling fault isolation and recovery. Furthermore, integrating in-line imaging or pressure-based sensing could allow SDLs to autonomously detect clogging and adapt protocols in real time, thereby preserving data quality and experimental continuity.
A second major challenge is the substantial upfront cost and specialized expertise required to establish SDL infrastructure. Off-the-shelf automation systems can cost 0.1–0.5 M USD or more, creating steep entry barriers that often concentrate operational knowledge among a few trained users.93 To save time and cost, many labs resort to bespoke solutions that function locally but may lack the reproducibility and transferability needed for industrial adoption.
This is compounded by another barrier: integration of modular microreactor components with automated hardware and data infrastructure. While individual instruments (e.g., pumps, reactors, and sensors) have matured considerably, they are rarely designed for seamless machine-to-machine communication. Most commercial tools are optimized for manual operation, often requiring proprietary software interfaces, limiting interoperability. For SDLs to function autonomously and scalably, experimental components must operate as part of an orchestrated ecosystem capable of sharing data, triggering actions, and responding to sensor feedback in real time. Currently, this orchestration is largely implemented through ad hoc workarounds. For example, synchronizing a commercial syringe pump with an in-line spectrometer often demands microcontroller-based intermediaries, reverse-engineered communication protocols, or custom Python scripts emulating manual inputs.48 These solutions, while functional, are fragile and difficult to maintain, hindering reproducibility and slowing deployment and widespread adoption.
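To make the orchestration problem concrete, the sketch below coordinates a mock pump and spectrometer through a shared Python layer. Both device classes are stand-ins for vendor drivers (no real instrument API is assumed), and the steady-state wait is truncated so the example runs instantly; a real deployment would wrap serial or proprietary protocols behind the same interface.

```python
import time

class SyringePump:
    """Stand-in for a vendor pump driver; a real wrapper would translate
    these calls into a serial or proprietary command protocol."""
    def __init__(self):
        self.flow_rate_ul_min = 0.0

    def set_flow_rate(self, ul_min):
        self.flow_rate_ul_min = ul_min

class Spectrometer:
    """Stand-in for an in-line detector; acquire() would normally block
    on hardware and return a measured spectrum."""
    def acquire(self):
        return {"wavelength_nm": [400, 500, 600],
                "absorbance": [0.1, 0.3, 0.2]}

def run_condition(pump, spectrometer, flow_rate_ul_min, reactor_volume_ul):
    """Orchestrate one steady-state measurement: set the flow rate, wait
    one mean residence time for the reactor to reach steady state, then
    trigger the in-line readout."""
    pump.set_flow_rate(flow_rate_ul_min)
    tau_s = reactor_volume_ul / (flow_rate_ul_min / 60.0)
    time.sleep(min(tau_s, 0.01))  # truncated so the sketch runs instantly
    return spectrometer.acquire()
```

The value of a standardized interface is that `run_condition` never needs to know which vendor's pump or detector sits behind the class; swapping hardware becomes a driver change rather than a workflow rewrite.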
Moving forward, two complementary strategies have the potential to drastically accelerate progress. First, democratizing access through open-source hardware (e.g., FINDUS, OTTO) and standardized software stacks, supported by community-driven training, can reduce both financial and knowledge barriers.48,94,95 Second, the research community would benefit from the development and adoption of hardware-agnostic communication standards tailored to automated laboratories, analogous to the OPC-UA protocol in industrial automation or SCPI in instrumentation control. Initiatives like SiLA 2, combined with open data standards like AnIML, represent important steps toward this goal.96 Broader adoption of such frameworks, coupled with efforts to design SDL-compatible instrumentation with standardized application programming interfaces (APIs), is likely to significantly reduce integration overheads, particularly in academic and early-stage applications where resources are most constrained.
Beyond technical integration, establishing unified data structures for capturing experimental metadata, instrument settings, and sensor outputs is critical. Such a unified data structure ensures interoperability across devices and also supports long-term goals in ML, benchmarking, and collaboration. Emerging efforts toward “FAIR” (findable, accessible, interoperable, reusable) data standards in autonomous experimentation provide a promising direction and should be extended to microfluidic contexts, where the richness of spatial–temporal data is especially valuable.97
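One minimal realization of such a unified record pairs a unique identifier with explicit units, instrument settings, and provenance, and serializes to plain JSON. The field layout below is illustrative only — it is not a published schema — but it captures the FAIR intent: findable (stable ID), accessible and interoperable (plain JSON), and reusable (units and settings recorded alongside the readouts).

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class ExperimentRecord:
    """Illustrative FAIR-style record for one autonomous experiment.
    Field names encode units explicitly (e.g. temperature_C) so that
    downstream ML pipelines never have to guess them."""
    experiment_id: str
    conditions: dict          # e.g. {"temperature_C": 80, "flow_rate_ul_min": 20}
    instrument_settings: dict # e.g. {"integration_time_ms": 50}
    readouts: dict            # e.g. {"absorbance_au": 0.31}
    provenance: dict = field(default_factory=dict)  # code version, operator, etc.

    def to_json(self):
        """Serialize deterministically (sorted keys) for diff-friendly logs."""
        return json.dumps(asdict(self), sort_keys=True)
```

Because every record is self-describing, datasets from different platforms can be pooled for benchmarking or model training without per-lab decoding conventions.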
While the majority of the SDL development has focused on chemical synthesis and materials science, significant opportunities exist in expanding these capabilities to biological systems. Applications such as single-cell genomics, where microfluidics enables the isolation and analysis of individual cells,18,24,29 or genotype–phenotype linkage studies that track cellular responses under controlled microenvironments,98,99 represent particularly promising directions for autonomous experimentation. Notably, several widely adopted commercial platforms illustrate the practical maturity of microfluidics in this domain. Commercially available digital PCR systems, which partition samples into millions of isolated volumes for absolute nucleic acid quantification, have become a diagnostic gold standard in the last decade.100,101 Similarly, droplet-based microfluidic workflows revolutionized single-cell analysis, fundamentally advancing our understanding of cellular heterogeneity in cancer and immunology.102–104 These successes underscore how microfluidics can support scalable, standardized, and data-intensive biological workflows. Translating such capabilities into autonomous microfluidic SDLs will require advances in integrated biological sensing, sterile and robust fluid handling, and biocompatible system design, offering a compelling pathway toward accelerating discovery in synthetic biology and personalized medicine.105
Beyond microscale optimization, an important consideration for autonomous microfluidic laboratories is the translation of knowledge to larger-scale processes. While scaling microreactors through numbering-up or parallelization has often been proposed,106 it remains a nontrivial challenge in practice. Instead, microfluidic SDLs are best viewed as discovery engines that generate transferable kinetic and mechanistic insights under well-defined conditions.107
The precise control over transport phenomena in microreactors enables the systematic mapping of reaction performance through dimensionless variables, such as the Damköhler number for reaction-transport coupling, the Péclet number for mixing efficiency, and the Biot number for heat transfer dynamics.108 Importantly, the degree to which these parameters must be preserved during scale-up depends on the controlling regime. For chemistries that are kinetically limited (Da ≪ 1), translation to larger scales primarily requires maintaining practical homogeneity and thermal control rather than matching mixing-related dimensionless variables exactly. In contrast, when reactions outpace mixing (Da > 1), observed rates and selectivity become functions of mass transfer; scale-up in this regime can distort the apparent kinetics, making transport similarity (and the associated dimensionless variables) a primary target for preserving performance.109 Additionally, microfluidic platforms rapidly identify process boundaries, such as precipitation thresholds, thermal runaway conditions, or selectivity changes, that might take months to map using traditional batch screening.110 These fundamental insights, captured as dimensionless correlations and operating windows, guide the design of production-scale reactors where transport limitations differ significantly from microscale conditions.
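The dimensionless groups above can be computed directly. The helpers below assume the common first-order form Da = kτ and the standard Pe = uL/D, with the Da ≈ 1 boundary used only as a heuristic for labeling the controlling regime:

```python
def damkohler(rate_constant_per_s, residence_time_s):
    """First-order Damkohler number Da = k * tau: reaction rate relative
    to the residence (transport) timescale."""
    return rate_constant_per_s * residence_time_s

def peclet(velocity_m_s, length_m, diffusivity_m2_s):
    """Peclet number Pe = u * L / D: convective vs. diffusive transport."""
    return velocity_m_s * length_m / diffusivity_m2_s

def controlling_regime(da):
    """Heuristic label: kinetics control for Da << 1, transport effects
    dominate observed rates once Da exceeds ~1."""
    return "kinetically-limited" if da < 1.0 else "transport-limited"
```

For instance, a slow reaction (k = 0.01 s⁻¹) in a reactor with a 10 s residence time gives Da = 0.1, so matching mixing-related dimensionless variables exactly at scale is less critical than preserving thermal control.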
Looking forward, the ultimate vision for microfluidic SDLs extends beyond individual experimental optimization to encompass end-to-end workflow automation, from application selection and device design to implementation and scale-up (Fig. 4). Such automation would leverage ML algorithms not only to optimize experimental conditions but also to guide the selection of appropriate microfluidic architectures, predict optimal reactor materials, and even suggest novel reaction pathways based on accumulated experimental data.111,112 This systems-level approach would democratize advanced chemical and biological research by making elaborate experimental capabilities accessible to researchers without specialized microfluidics expertise.
In sum, advancing autonomous microfluidic labs requires confronting both chemical and infrastructural challenges. Solutions will likely emerge at the intersection of hardware innovation, software interoperability, and systems-level thinking. The development of robust solid-handling strategies, standardized communication protocols, and open-source control frameworks will be central to this effort. As these components mature, microfluidic SDLs are poised to become more capable and resilient while improving accessibility for the broader scientific community, ushering in a new era of scalable, data-rich, and intelligent experimentation.
This journal is © The Royal Society of Chemistry 2026