An Open-Source Modular Framework for Automated Pipetting and Imaging Applications

The number of samples in biological experiments is continuously increasing, but complex protocols and manual experimentation in many cases lead to suboptimal data quality and hence to difficulties in reproducing scientific findings. Laboratory automation can alleviate many of these problems by precisely reproducing machine-readable protocols. However, these instruments generally require high up-front investments and, due to the lack of open APIs, are notoriously difficult for scientists to customize and control outside of the vendor-supplied software. Here, we demonstrate automated, high-throughput experiments for interdisciplinary research in life science that can be replicated on a modest budget, using open tools to ensure reproducibility, by combining OpenFlexure, Opentrons, ImJoy, and UC2. Our automated sample preparation and imaging pipeline can easily be replicated and established in many laboratories, as well as in educational contexts, through easy-to-understand algorithms and easy-to-build microscopes. Additionally, the creation of feedback loops, with later pipetting or imaging steps depending on the analysis of previously acquired images, enables the realization of smart microscopy experiments that run completely autonomously. All documents and source files are publicly available (https://beniroquai.github.io/Hi2) to prove the concept of smart lab automation using inexpensive, open tools. We believe this democratizes access to the power and repeatability of automated experiments.


Introduction
One of the core interests in modern life sciences is to understand how cells interact with each other and form highly organized biological systems. [1,2] Tools such as microscopy help to discover cell dynamics on very small scales in both time and space, for example during processes such as mitosis or apoptosis. Automated acquisition and on-the-fly analysis can greatly reduce data storage challenges. Overall, laboratory automation minimizes time spent by highly trained researchers on routine tasks and significantly improves reproducibility by eliminating human error. Automated analysis during or after experiments can fine-tune parameters in future experimental steps. [19] The increasing complexity and associated costs of experiments, due to the increased use of laboratory equipment, as well as the knowledge required to properly operate all the necessary instruments, make high-throughput research very exclusive. [20,21] In turn, it is particularly challenging to train students to use such methods. [22] Our aim in this work is to prove the principle of lab automation using open, accessible tools, with the ultimate goal of making this technology more widespread and easier to use.
In the past, much open-source software and, more recently, open hardware projects have shown that by creating a community of developers and allowing them to develop the projects together, a high level of professionalism and usability is achieved, with support offered free of charge. [23][24][25][26][27] The quality of documentation and the skills required to replicate a given solution are often quite variable, which currently limits the uptake of this approach.
On the other hand, projects such as the 3D-printed OpenFlexure Microscope, [25] the Octopi/SQUID project, [14] and the cellSTORM microscope [24] come with a fully illustrated manual that enables high-performance microscopy on a stand-alone or modular scale. These projects have all been replicated by the growing community of microscopy users. For a great overview of the subject, please refer to Rosario et al. [21] In most cases, commercial devices lack open hardware or software interfaces for customization or automation, making it difficult or impossible to use them outside of their intended application. The Opentrons OT-2 pipetting robot, [6,28] which we use in this manuscript, is an example of commercially produced open-source hardware. Its commercial success shows that collaborative development also works in an industry context and that additional functions from the developer community can be quickly incorporated.
Open-source software projects, such as the data analysis tool CellProfiler [29] or the image processing tool Fiji, [30] have created vibrant communities where the core software is enhanced with a huge library of contributed plugins. This common interface can accelerate the research process enormously by making it much easier for researchers to implement a published method, or to integrate multiple plugins together in new work. On closer inspection, such an interface between individual open hardware projects seems to be lacking at present. With UC2, we have already shown that it is possible to create a framework to connect different components from other projects or commercial sources so that a variety of optical experiments are possible. [22] Tools such as GitHub/GitLab repositories, in combination with proper open-source licenses, help to organize such collaborations in the institutional context and beyond. This way, researchers and enthusiasts can participate, and the lifetime of such a project very often outlives classical research projects.
In this work, we ask how expertise from several open-source hardware and software projects can be merged to generate a research tool of professional laboratory quality. We show how the interaction between open-source tools such as the Opentrons OT-2 pipetting robot (Opentrons, New York, USA), the OpenFlexure Microscope (OFM server [31]), ImJoy, [11] and UC2, [22] which specialize in liquid handling, robotic microscopy, web-based image processing, and modular optics, respectively, can help to democratize smart lab automation with a close-to turn-key solution. We present two new compact slide-scanning (fluorescence) microscopes that can be placed directly in the Opentrons robot or in cell culture incubators. Furthermore, we show an example of a complete protocol, from staining the microtubule network of fixed HeLa cells, to simultaneous in situ observation via the microscope, to quantitative analysis, all realized via a simple interface in a web browser.

Experimental Section
Here, a brief description of the software pipeline, the automated pipetting system (Figure 1a), and the two different DIY high-throughput microscopy imaging systems (Figure 1c,d) for on-site microscopic imaging and image analysis inside the Opentrons pipetting robot (Figure 1b) is given. Additionally, it is shown how all components can easily be integrated into a common workflow using a Jupyter notebook to design complex biological protocols, as depicted in Figure 1a. The soft- and hardware used in this project is summarized in Table 1. An extensive overview of alternative open-source DIY labware is given in refs. [14,21].
The first stand-alone microscope (Figure 1c) with a fixed sample stage was used as a development device for the software in several different locations (e.g., Sweden, Germany). Based on the experience gained from prototype development, the second, UC2-based device (Figure 1d) moves the sample instead and was found to be much easier to reproduce.

Microscope Design
To perform cell observation both during the pipetting process and after completion of the protocol, two compact microscopes with cellular resolution (≈2-3 µm) capable of recording time series of individual wells were developed. The necessary parameters identified for the desired microscope for prototyping were: 1) a flat design to fit into laminar flow hoods, biological incubators, and liquid-handling robots (i.e., the Opentrons OT-2); 2) scanning of multi-well plates (6, 24, 96) as well as individual sample slides, with high speed and reproducibility with respect to sample coordinates; 3) transmission brightfield and optional fluorescence imaging; 4) easy reproducibility, expandability, and cost-effectiveness, to quickly build a variety of instruments; 5) an easy-to-use control system that can be easily integrated into existing workflows; 6) simple integration of image processing tasks into the software; and 7) the ability to perform in vitro experiments at temperatures around 37 °C and high humidity for long periods of time, with automatic focus.
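To illustrate criterion 2), the well positions of a multi-well plate can be enumerated programmatically. The following minimal Python sketch assumes the standard 9 mm well pitch of a 96-well plate and a serpentine scan order to reduce stage travel; the function name and origin convention are illustrative, not taken from the Hi2 codebase.

```python
# Sketch: generate XY stage targets for scanning a 96-well plate.
# Assumes the standard SLAS 9 mm well pitch; origin at well A1.
# Serpentine ordering minimizes stage travel between rows.

def well_plate_coordinates(rows=8, cols=12, pitch_mm=9.0, serpentine=True):
    """Return a list of (well_name, x_mm, y_mm) tuples in scan order."""
    coords = []
    for r in range(rows):
        col_range = range(cols)
        if serpentine and r % 2 == 1:
            col_range = reversed(range(cols))  # reverse every other row
        for c in col_range:
            name = chr(ord("A") + r) + str(c + 1)  # e.g. "A1", "H12"
            coords.append((name, c * pitch_mm, r * pitch_mm))
    return coords

wells = well_plate_coordinates()
```

The same helper generalizes to 6- or 24-well plates by changing the grid size and pitch.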
Recent work such as the Incubot [34] or the open-source Lab Platform [35] had already shown how parts from a 3D printer or computer numerical control (CNC) machine could be used to create low-cost, high-throughput microscopes. The ideas presented there were taken up and two new compact 3D-printed devices with a similar range of functions were demonstrated. Optimal use of the limited working volume within the OT-2 was made and the maximum pipetting height of 150 mm measured from the base was considered.
The standalone "OpenmiTronScope" (Figure 1c and Section S1, Supporting Information) offered a static sample well, moved the camera in XYZ to generate large-scale microscope images, and used a white-light light-emitting diode (LED) to realize transmission brightfield imaging at cellular resolution. A more generic solution, for customized imaging schemes such as fluorescence or multiple objective lenses for different magnifications and optical resolutions, was provided by the UC2-based [22,36] system "Hi2" (Figure 1d). The functionality of the recently introduced open-source modular optical toolbox UC2 was extended with the capability of large-scale fluorescence microscopy through the development of a laser engraver-based XY stage (Figure 2a) and a novel Z-focusing unit (Figure 2b). Because most parts came pre-assembled, the complexity of replicating such devices was significantly reduced, hence enabling increased accessibility and wide-spread dissemination.
An in-depth description and analysis of the mechanical properties can be found in Sections S1 and S2 in the Supporting Information, while a set of instructions on how to manufacture and assemble the parts as well as a detailed bill of materials (BOM) can be found on the project page https://beniroquai.github.io/Hi2.

Software for Hardware Control
For software control, it was important that all components, namely, the pipetting robot, the image processing pipeline, the microscope, and any additional components such as sensors or laboratory devices, could communicate with each other so that a fully automated workflow could be created. For this purpose, integration via a local area network (LAN) using Ethernet or WiFi was a convenient option that allowed the use of existing software to run each device, rather than requiring a single integrated application. Furthermore, intuitive control using a graphical user interface (GUI) or a scripting interface such as Jupyter Notebook enabled users to get started quickly with familiar tools.
In addition to an online protocol designer, the Raspberry Pi-based Opentrons OT-2 could be controlled using a Python application programming interface (API) from a Jupyter Notebook hosted on the OT-2, or using a representational state transfer (REST) API via hypertext transfer protocol (HTTP) requests. [28] Similarly, the OFM server offered control through a browser-based GUI and a REST API, which allowed, e.g., the initiation of scans and image capture via the browser [31] or from a Python module running anywhere on the network.

Figure 1. Components for a fully autonomous pipetting and imaging pipeline. a) A typical workflow starts with sketching the biological protocol and implementing it in Python-based code for the Opentrons/microscope. After simulating the code and setting up the experiment, which typically involves placing reagents and samples inside the volume of the OT-2, a first run can be started with some dummy reagents to debug the experiment. The autonomous experimentation includes robotic pipetting steps, continuous cell observation, and on-the-fly image processing, all of which can be controlled remotely. After collecting the data and an adjacent analysis, a hypothesis can be proven, and the experiment repeated or refined. b) The microscope should fit into the working volume of the Opentrons OT-2 pipetting robot to be integrated into a common pipetting workflow. c) The standalone high-throughput microscope "OpenmiTronscope" moves the optical assembly around a fixed sample plate. d) An extension to the modular optical toolbox "UC2", based on a widely available laser engraving x/y table to realize high accuracy at a low price, where the sample is moved in x/y.
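As a rough illustration of such HTTP-based control, the sketch below composes (but does not send) a request using only the Python standard library. The host name and endpoint path are placeholders, not the documented OT-2 or OpenFlexure routes; the respective API documentation should be consulted for the real endpoints.

```python
import json
import urllib.request

# Sketch: composing HTTP requests for networked lab devices.
# The endpoint path below is an illustrative placeholder, not a
# documented OT-2/OpenFlexure route; consult each project's API docs.

def build_request(host, endpoint, payload=None, port=80):
    """Prepare (but do not send) an HTTP request for a device on the LAN."""
    url = f"http://{host}:{port}{endpoint}"
    data = json.dumps(payload).encode() if payload is not None else None
    method = "POST" if data else "GET"
    return urllib.request.Request(url, data=data, method=method,
                                  headers={"Content-Type": "application/json"})

# Hypothetical example: ask the microscope to move the stage.
req = build_request("microscope.local", "/api/v2/actions/stage/move",
                    payload={"absolute": True, "x": 9000, "y": 9000})
```

Sending the request with `urllib.request.urlopen(req)` would then trigger the action on the device.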
The OFM server supported software extensions so that, e.g., the "GRBL" interface [37] used here for stepper motor control, running on a serially connected Arduino, and the USB3 camera could be integrated easily. The server was designed for low computing resources and could be used on a single-board computer (SBC) such as the Raspberry Pi (v3b, UK) or the Nvidia Jetson Nano (Santa Clara, USA). Because the GUI ran in the browser, the hardware could also be controlled remotely with a suitable (i.e., secured) network configuration. The core HTTP server code is available as the flask-labthings Python package. [38]

Software for On-Site Image Processing
In order to process the generated image data directly, without transferring each image from the microscope to an external processing machine, and to involve imaging data in automated decision making for future pipetting events, the browser- and web-based image processing tool ImJoy [11] was integrated directly into the OFM server. With this, it was, e.g., possible to evaluate captured images directly in the browser using ImageJ.JS [32] or already available plugins such as the ITK/VTK Viewer. This was useful, e.g., when the pixel size had to be calibrated (as shown in Figure 2d), image tiles had to be stitched to form a larger field of view (FOV), or time-lapse series had to be combined into a video. ImJoy also offered a large number of plugins developed by the community, which could be accessed directly from within the browser. These included previously trained networks and denoising or deconvolution algorithms to partially compensate for the loss of quality caused by the low-cost hardware. ImJoy could use the computing resources of the browser in which the plugin was operated, as well as external computer clusters, e.g., for training neural networks, and was therefore not limited by the low computational resources of the SBCs.
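The tile-stitching step mentioned above can be sketched, in its simplest form, as a grid concatenation of equally sized tiles. This toy Python example ignores overlap registration and blending, which a real stitching plugin would perform; all names are illustrative.

```python
# Sketch: naive grid stitching of image tiles into one large field of view.
# Tiles are a row-major list of 2D arrays (lists of pixel rows); a real
# pipeline would additionally register and blend overlapping borders.

def stitch_tiles(tiles, grid_cols):
    """Concatenate tiles of equal shape into a single 2D mosaic."""
    mosaic = []
    for i in range(0, len(tiles), grid_cols):
        row_of_tiles = tiles[i:i + grid_cols]
        tile_height = len(row_of_tiles[0])
        for y in range(tile_height):
            stitched_row = []
            for tile in row_of_tiles:
                stitched_row.extend(tile[y])  # append this tile's pixel row
            mosaic.append(stitched_row)
    return mosaic

# Four 2x2 tiles arranged in a 2x2 grid give a 4x4 mosaic.
tiles = [[[1, 1], [1, 1]], [[2, 2], [2, 2]],
         [[3, 3], [3, 3]], [[4, 4], [4, 4]]]
mosaic = stitch_tiles(tiles, grid_cols=2)
```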

Open Collaboration for Distributed Hardware Development
The ability to develop hardware in a decentralized fashion has proven particularly helpful in the still ongoing COVID-19 pandemic, where many scientists have limited access to the lab. A discussion of the idea of building a compact microscope for integration in the pipetting robot was followed by first prototyping of the "OpenmiTronscope" in Jena (Germany). Two prototypes were sent to JCP and WO in Sweden at an early stage of development. With this functional prototype, it was possible to work on the further development of the software at several locations in parallel and to incorporate possible hardware and software optimizations into the next design iteration. Using tools such as remote coding sessions in VSCode (https://code.visualstudio.com/, Microsoft, Redmond, USA), versioning tools such as GitHub/GitLab, and video conferencing tools such as Teams or Zoom, the integration of the various components was realized from multiple locations even though the project partners never met in person. Any hardware changes can be quickly implemented using 3D printing or off-the-shelf components so that upgrades can be performed in situ. The iterative development process led to the UC2-based system, in which all issues from the first version have been resolved to maximize user experience and stability. This approach could serve as a blueprint for future projects since it offers a very fast development process, from the first prototype in February 2021 to a fully working solution in May 2021. Development was simplified by adopting an "openly developed hardware" [20] approach, removing the overhead of restricting access to designs.
The selection of the projects presented in Table 1 followed specific criteria which we will briefly discuss below. An in-depth discussion about detailed requirements for the optical system and the software can be found in the adjacent sections and in the Supporting Information.

Table 1 (excerpt):
- ImJoy: browser-based image processing platform (https://imjoy.io) [11]
- Fiji/ImageJ.js: online Java image processing tool that enables versatile processing using a large variety of available plugins in the browser (https://doi.org/10.5281/zenodo.4944985) [32]
- Jupyter Notebook: a Python-based notebook that organizes experimental workflows and controls execution of pipetting/imaging tasks as well as plans upcoming tasks based on image processing results (https://jupyter.org/)

Microscope
The microscope must be compact and adaptable to experimental conditions through simple modification. UC2 offers a modular optical approach to quickly change modules such as excitation wavelength to observe different fluorophores. At the same time, it offers a cost-effective, high-quality solution that integrates into existing laboratory workflows via OpenFlexure's REST API.

Pipetting Robot
Although there are a large number of open-source tools for sample preparation of 96-well plates, they are often not capable of washing samples or transferring reagents across multiple well locations. The Opentrons OT2 is capable of both by providing accurate aspirating and dispensing steps and, like the microscope, has a REST API that allows it to be controlled over the network for integration into an automated workflow.

Image Processing
The goal is to operate all devices from the browser, so that copying data for the purpose of analysis becomes unnecessary. ImJoy.io, as a browser-based image processing tool, can therefore be integrated very well into the workflow. Through an integration with ImageJ.js, which is embedded as a native "app" in the OpenFlexure GUI, everything can be found in one place, increasing user-friendliness.

Control Software
Jupyter Notebook is used to unite all the participants in a biological workflow, such as the pipetting robot, the microscope, and the processing software, and to share the results reproducibly with the research community as a human- and machine-readable protocol. Using the Python scripting language, the individual devices can be controlled and connected to each other both via a Python API (Opentrons OT2, ImJoy) and via a REST API (OpenFlexure/Hi2). This way, an image acquisition can be requested and processed via ImJoy, and updated control commands can be sent to the pipetting robot or the microscope.
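The resulting feedback loop can be summarized in a few lines of Python. In this hedged sketch, the device interactions (capture, analyze, pipette) are passed in as callables so the control logic can be dry-run with stubs instead of hardware; none of the function names correspond to the actual APIs.

```python
# Sketch: the closed-loop structure of a smart-microscopy protocol.
# Device interactions are injected as callables so the decision logic
# can be tested without hardware; all function names are illustrative.

def closed_loop_experiment(wells, capture, analyze, pipette, threshold):
    """Image each well, analyze on the fly, and pipette where needed."""
    log = []
    for well in wells:
        image = capture(well)            # e.g., an OFM REST call
        score = analyze(image)           # e.g., an ImJoy plugin result
        if score < threshold:
            pipette(well)                # e.g., an Opentrons API call
            log.append((well, score, "pipetted"))
        else:
            log.append((well, score, "skipped"))
    return log

# Stub devices for a dry run: "analysis" just returns the fake image value.
log = closed_loop_experiment(
    wells=["A1", "A2"],
    capture=lambda well: {"A1": 5, "A2": 50}[well],
    analyze=lambda image: image,
    pipette=lambda well: None,
    threshold=10,
)
```

Swapping the stubs for real device calls turns the same loop into a hardware protocol.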
A comprehensive list of additional labware for individual experimental needs is offered by the databases from Open Know-How, [39] Thingiverse [40] or the NIH 3D print exchange. [41]

Using the Hi2 Microscope for Live-Cell and High-Throughput Imaging
For a high-throughput microscope, we identify the following criteria for use in biological experiments: 1) long-term imaging in the incubator in a high temperature/humidity environment; 2) permanent in-focus operation across the full well plate; 3) reproducibility of the targeted XYZ coordinates over multiple runs, to locate individual cells reproducibly over time; 4) the ability to stitch a larger FOV from multiple image tiles; and 5) fluorescence imaging with at least one excitation channel. A 3D printing material with an increased glass transition temperature, such as polyethylene terephthalate glycol (PETG), in combination with an autofocus algorithm that compensates for temporal focus drift and tilted plates (see Section S3, Supporting Information), helps to perform long-term imaging experiments of living organisms in cell culture incubators, where 37 °C and high humidity represent challenging conditions for electronics and thermoplastics.
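A contrast-based autofocus of the kind referred to above can be sketched as follows: capture a z-stack, score each image with a sharpness metric, and move to the sharpest plane. This minimal Python example uses a squared-gradient metric on synthetic data; the actual Hi2 autofocus implementation may differ.

```python
# Sketch: a simple contrast-based autofocus, of the kind used to
# compensate focus drift and tilted plates. The z-stack here is
# synthetic; on the microscope each "image" would be captured at a
# different z position.

def focus_metric(image):
    """Mean squared horizontal gradient: higher means sharper."""
    total, count = 0, 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += (b - a) ** 2
            count += 1
    return total / count

def best_focus(z_stack):
    """Return the index of the sharpest image in a z-stack."""
    return max(range(len(z_stack)), key=lambda i: focus_metric(z_stack[i]))

# Synthetic stack: the middle "image" has the strongest edges.
blurred = [[10, 12, 14, 16]] * 4
sharp = [[0, 40, 0, 40]] * 4
z_index = best_focus([blurred, sharp, blurred])
```

On the hardware, the stage would then simply be moved to the z position with the winning index.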
To estimate the mean displacement of multiple regions of interest within one well plate, a long-term experiment, in which the microscope periodically (t = 5 min) scans 32 out of the 96 wells, was performed at room temperature. The measurement in Section S3 in the Supporting Information suggests an average displacement of less than 32 µm (4% of a region of interest (ROI)) across an 18 h measurement, making it possible to continuously monitor cells at multiple locations (e.g., wells). Fluorescent imaging (Figure 3) of the robot-labeled HeLa cell sample (Thermo Fisher alpha-Tubulin Monoclonal Antibody, cat. number A11126; Alexa Fluor 647 secondary antibody, cat. number A-21235) illustrates the ability to stitch multiple image tiles to form a larger FOV directly on the device, as shown in the zoomed ROI in Figure 3.
A detailed discussion about the selection of individual hardware components can be found in Section S1/S2 in the Supporting Information.

Making the Labware Talk to Each Other
Until now, the individual components, such as the microscope, the GUI, and the image processing, worked independently of each other. In the following, we want to show how to connect them to create a fully automated workflow. This could in turn mean that pipetting steps can be based on previously obtained and processed imaging results to, e.g., adjust pipetting volumes or achieve better environmental conditions for future experiments (e.g., buffer and antibody concentration).
A router (Netgear Nighthawk AC1900 R7000, 100€) ensures a stable connection of all devices over Ethernet (WiFi for the Opentrons OT2) for command as well as data transfer. An additional laptop or the Jetson Nano running the OFM server is used to render the OFM GUI as well as the Jupyter notebook from the Opentrons. A detailed description of how to set up the environment and perform simple tasks, and a selection of ready-to-use pipetting protocols, can be found on our project webpage (https://beniroquai.github.io/Hi2).
Using the Opentrons Python API, arbitrarily complex pipetting operations can be formulated in an easily readable code framework and executed both in standalone Python scripts and in a dynamic, browser-based Jupyter notebook. The latter has the advantage that each step is clearly logged, can be added to the lab notebook in a graphically easy-to-understand form, and facilitates reproducibility by sharing directly in the browser (Figure 4).
The basic code structure for a protocol is divided into the definition of the labware (pipette tip rack, well plate, microscope, etc.), which is assigned to the deck coordinates, and the execution of steps such as aspiration, dispensing, or, e.g., the movement from tip location A1 to well location B2. To this end, common Python libraries (e.g., numpy [42]) can be easily integrated, which is useful for controlling the Hi2 microscope or doing calculations based on computed results. Using the OFM client library (Figure 4, bottom), [43] Python commands such as image capture, move to coordinate, or laser on/off are wrapped in human-readable protocol steps and executed via the REST API. This enables time-lapse imaging series during incubation times, or fluorescence imaging after a staining process to check on site whether the experiment worked. Custom labware, such as the microscopes or 3D-printed tip racks, can be added using custom labware definitions or manually set XYZ coordinates.
The ImJoy Jupyter Notebook Plugin simplifies the execution of available plugins directly in the automated workflow.A typical use case is on-the-fly image processing, where the robot requests an image capture, downloads the data from the OFM server, and transfers it to an externally running (e.g., browser of the laptop/computer cluster) ImJoy Plugin.This way, processing steps requiring high computational resources (e.g., segmentation by neural networks) can be realized without storing large data sets continuously.This enables the use of external libraries that cannot be installed on the read-only file system from the Opentrons and is especially useful for graphics rendering applications, where the ITK/VTK Viewer in the form of an ImJoy plugin helps illustrate results for further debugging.
Alternatively, e.g., in case an OT2 is not available, the microscope can conveniently be used as a standalone device, where ImJoy is integrated into the OFM web app, allowing, e.g., image processing using ImageJ.js (Figure 2d) or other ImJoy plugins. The OFM GUI can be accessed through the browser or an Electron app, providing a live view of the microscope's camera feed and controls for common imaging operations. Section S4 in the Supporting Information describes alternative configurations, e.g., using the experimental HTTP interface on the Opentrons robot to move overall control to the OpenFlexure software, but this is not yet able to access the full functionality of the Opentrons software.

Figure 4. Diagram for intra-device communication, where the Opentrons OT2 pipetting robot's Jupyter server controls the experiment. The protocol is formulated using human-readable Python syntax, and on-the-fly image processing can be performed using available ImJoy plugins directly in the browser. Additional hardware, such as the GRBL stage and camera, can be connected to the Jetson Nano, which also runs the OFM server.

Imaging Results from a Fully Automated Antibody Labeling Workflow
After presenting the individual building blocks for the automated pipetting, microscopic imaging, image analysis, and hardware control, the open question remains: can we make the hardware work together in order to perform a fully autonomous biological experiment? For this, we perform a standard, often very time-consuming laboratory experiment using primary and secondary antibodies for fluorescent immunostaining. A complete protocol can be found in the online repository protocols.io (see Section S5, Supporting Information). This is an ideal use case, where human errors, like varying pipetting volumes or wrong timings, can successfully be avoided, since every step is properly tracked in the digital protocol. A repeated run with another fluorophore does not require immense labor, since the protocol can simply be run again. As shown in Figure 5, the protocol consists of several pipetting and washing steps, with image acquisition during the waiting times to observe possible reactions of samples to reagents. In the experiment presented here, only microtubules (anti-tubulin) are labeled with an Alexa Fluor 647-conjugated (anti-mouse) antibody and then automatically imaged with the fluorescence unit (see Figure 5, right). After the image acquisition, the results are directly processed using ImJoy to apply a look-up table (LUT), perform a pixel size calibration, and quantify the cells' features (see also Section S5, Supporting Information).
A critical point of such an experiment is the debugging of the protocol without living samples and expensive reagents.For this purpose, the protocol can first be simulated before it is run on the robot without samples to detect and eliminate any errors in advance.
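The dry-run idea can be mimicked in plain Python by representing the protocol as data and validating it before execution. The step names, volumes, and checks below are illustrative placeholders, not the actual Opentrons simulation tooling used in the real workflow.

```python
# Sketch: representing a staining protocol as data and "simulating" it
# before any reagent is touched, in the spirit of the dry runs described
# above. Step names and volumes are illustrative, not the real protocol.

PROTOCOL = [
    {"action": "aspirate", "volume_ul": 100, "source": "reagent_1"},
    {"action": "dispense", "volume_ul": 100, "target": "B2"},
    {"action": "wait_and_image", "minutes": 30, "well": "B2"},
    {"action": "wash", "cycles": 3, "target": "B2"},
]

def simulate(protocol, tip_capacity_ul=300):
    """Dry-run: check each liquid step stays within the pipette's limits."""
    errors = []
    for i, step in enumerate(protocol):
        if step["action"] in ("aspirate", "dispense"):
            if step["volume_ul"] > tip_capacity_ul:
                errors.append(f"step {i}: volume exceeds tip capacity")
    return errors

errors = simulate(PROTOCOL)
```

Only once such a dry run reports no errors would the protocol be executed on the robot, first with dummy reagents and then with real samples.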
Live cell experiments inside the robot at 37 °C ambient temperature and sufficiently high humidity are possible in principle.For this purpose, similar to ref. [44], we have placed a hotplate heated to 75 °C with a 2 L water vessel in the working volume of the robot to achieve a constant temperature of about 35-37 °C with the housing completely closed.

Performing "Smart Microscopy" by Finding Highest Cell Density
A unique feature of the closed-loop pipetting, imaging, and processing pipeline presented here is the ability to plan future steps within a protocol based on a computer-aided decision. We give a simple example in which the pipetting robot seeds an unknown quantity of yeast cells, which are imaged by the microscope after sedimentation (Figure 6). One image from each of the 96 wells is sent to a customized ImJoy plugin, written in Python and running on the laptop, which returns the number of yeast cells as the result. After cell preparation of all wells, the microscope moves to the position with the lowest cell density and performs a long time series. In this way, unknown cell densities can easily be calibrated, and subsequent experiments can be better selected in advance, e.g., to better observe the growth rate at the most ideal cell density, which can impact the biological outcome of a given experiment.
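The cell-counting and well-selection logic of this example can be sketched with a simple connected-component count on thresholded images. The plugin used in the experiment is a custom ImJoy plugin; the standard-library-only Python below is a hypothetical stand-in for illustration.

```python
# Sketch: counting cells per well by connected-component labeling on a
# thresholded image, then picking the well to observe, mirroring the
# yeast example above. Images here are tiny synthetic binary masks.

def count_blobs(mask):
    """Count 4-connected foreground regions in a binary image."""
    seen = set()
    h, w = len(mask), len(mask[0])
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and (y, x) not in seen:
                count += 1
                stack = [(y, x)]          # flood-fill this blob
                while stack:
                    cy, cx = stack.pop()
                    if (cy, cx) in seen or not (0 <= cy < h and 0 <= cx < w):
                        continue
                    if not mask[cy][cx]:
                        continue
                    seen.add((cy, cx))
                    stack += [(cy + 1, cx), (cy - 1, cx),
                              (cy, cx + 1), (cy, cx - 1)]
    return count

def pick_well(counts):
    """Select the well with the lowest cell count for the time series."""
    return min(counts, key=counts.get)

well_images = {
    "A1": [[1, 0, 1], [0, 0, 0], [1, 0, 0]],   # three separate cells
    "A2": [[1, 1, 0], [0, 0, 0], [0, 0, 0]],   # one cell
}
counts = {well: count_blobs(img) for well, img in well_images.items()}
target = pick_well(counts)
```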

Discussion and Outlook
In this work, we demonstrate how a fusion of knowledge from open-source projects, each hosting a large and active developer community, leads to a powerful tool that makes the increasingly important field of lab automation available and accessible. The total cost, including the commercial OT-2 (≈5k€ vs >50k€, e.g., Eppendorf epMotion, Hamburg, Germany), the microscope (≈2k€ vs >50k€, e.g., Biosense oCelloscope, Denmark, or CytoSMART Omni, Netherlands), and the software (free vs ≈20k€), is about 7k€, plus individual labor costs of around 6k€ if a Ph.D. student works on it full-time for two months. This stands in contrast to >120k€, and is thus more than one order of magnitude less expensive than a comparable commercial setup with similar functionality. More importantly, the open-source solution presented here for the pipetting robot-contained microscope offers the ability to be readily customized, since the software and hardware can be adapted and extended to individual needs. The low price and high accessibility make it ideal for use in low-resource laboratories.
Our solution includes browser-based GUI software for controlling the microscope, a Jupyter notebook, and ImJoy plugins for the reproducible execution of pipetting protocols and processing of image data. Imaging is performed on easily reproducible microscopes from the UC2 toolkit. This demonstrates that smart microscopy experiments can be realized without great effort and cost. We rely on existing projects and many off-the-shelf components, like the commercially available open-source pipetting robot, laser engraving stages, and pre-assembled UC2 modules, to fully concentrate on the experiment.
The additional 3D-printed parts can be easily adapted to the conditions of the experiment and replaced if necessary. However, this flexibility and the low price of the components are accompanied by significantly reduced stability compared to machined metal parts. In particular, the cantilevered sample holder in the Hi2 microscope is susceptible to low-frequency vibrations. This is particularly noticeable through small variations in the FOV and is especially visible when the robot is moving or when vibrations prevail in the building. 3D-printed parts made from PLA start to soften at a temperature around 37 °C (e.g., inside an incubator). Using PETG with a much higher glass transition temperature solves the problem, although a microscope warm-up phase with corresponding temperature-induced drift has to be considered. The low positioning error of less than +/− 4% within the FOV with repeated movements over 18 h was a surprising finding for the low-cost x/y laser engraving table and shows the strength of already well-engineered maker hardware for use in the scientific context. [45] Other critical points within an experiment are possible malfunctions of components during an ongoing experiment, leading to cancelation of the process; this matters because reagents such as antibodies are often very expensive and should not be wasted. This can be remedied by early debugging in the form of a protocol run with "dummy reagents" such as dyed water and checking the intermediate results, or by evaluating a series of time lapses made using a cell phone camera.
The low-cost optics used here is easy to obtain and can be replaced by, e.g., higher quality lenses to achieve a higher optical resolution and better optical performance.Additional UC2 modules such as different excitation lasers and filter cubes can enable additional functionalities such as multicolor fluorescence imaging based on the building block-based principle.
The same applies to the integration of the web-based image processing tool ImJoy. Available libraries, code snippets, and plugins for Fiji, Python, or JavaScript can be integrated into the workflow, allowing the measured values, in the form of 2D images, to be quantified and used for the next experiment or for an adaptation of the previous workflow. The model of open, collaborative science demonstrated here relies on the participation of many research groups. Often, direct commercial exploitation of research results still prevails, which can lock up intellectual property and prevent projects like this one from using those results. [46]
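The kind of quantification step an ImJoy plugin could run can be sketched in a few lines. Here a plain-Python threshold count stands in for a real segmentation routine, and the tiny image is invented example data:

```python
# Sketch of on-the-fly quantification of a 2D image, the kind of step an
# ImJoy plugin could run in the browser. A simple threshold count stands in
# for a real segmentation routine; the image values are invented.

image = [  # tiny grayscale image (0-255)
    [ 12,  10, 200, 210],
    [  9, 180, 220,  15],
    [  8,  11,  13,  14],
]

def fraction_above(img, threshold=128):
    """Fraction of pixels brighter than the threshold (a crude stain metric)."""
    total = sum(len(row) for row in img)
    bright = sum(1 for row in img for px in row if px > threshold)
    return bright / total

metric = fraction_above(image)
# this scalar can now feed back into the next pipetting or imaging step
print(round(metric, 3))
```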

Conclusion
The effort to plan an existing biological protocol, such as the antibody labeling shown here, and to transform it into machine-readable code is currently much larger and more time-consuming than the manual approach. If the planning phase is included, the time required is an order of magnitude higher than conducting the pipetting manually. One can therefore ask for whom laboratory automation will be useful. However, with the workflows presented here making this technology available to a wide audience, the ability to share protocols that have already been performed, and the prospect of better training in this area already at the university level, it seems likely that automated execution of experiments will prevail in the long term. The ability to perform many experiments in parallel, or to replicate an experiment exactly, will greatly improve both the quality of data and the reproducibility of studies. Expensive, sophisticated automation systems are often underutilized because many biologists are not trained in their use and there are insufficient specialist technicians available to support them. The system as presented in this manuscript is not yet a step change in user friendliness, but our eventual goal is that, by vastly increasing the accessibility of automation, we will help it to become a more mainstream technique. Building a large community of users and developers will, in turn, help to develop training material and share know-how to improve the uptake of these powerful tools, both the open and low-cost ones and the existing commercial systems. Our next goal is to perform closed-loop experiments where, based on the data recorded in the experiments, trained neural networks can help to make decisions about future experiments and to plan and execute them autonomously.

Figure 1. Components of a fully autonomous pipetting and imaging pipeline. a) A typical workflow starts with sketching the biological protocol and implementing it in Python-based code for the Opentrons/microscope. After simulating the code and setting up the experiment, which typically involves placing reagents and samples inside the working volume of the OT2, a first run can be started with dummy reagents to debug the experiment. The autonomous experimentation includes robotic pipetting steps, continuous cell observation, and on-the-fly image processing, all of which can be controlled remotely. After collecting the data and a subsequent analysis, a hypothesis can be tested, and the experiment repeated or refined. b) The microscope should fit into the working volume of the Opentrons OT2 pipetting robot to be integrated into a common pipetting workflow. c) The standalone high-throughput microscope "OpenmiTronscope" moves the optical assembly around a fixed sample plate. d) An extension to the modular optical toolbox "UC2," based on a widely available laser-engraving x/y table to achieve high accuracy at a low price; here, the sample is moved in x/y.

Figure 2. a) For the UC2-based high-throughput microscope "Hi2," we developed two additional modules, where the XY stage derived from a commercially available laser engraving machine can simply be integrated into the existing cube-based framework using a minimum of 3D-printed parts. b) The novel Z-stage features low wobbling along the optical axis with the help of a magnetic ball-bearing decoupling mechanism in combination with linear rail bearings. The hardware module is optimized for simple replication, long lifetime, and low price. c) A long-term time-lapse image series of in vivo HeLa cells conducted inside a cell culture incubator revealed a mitotic event. d) With the help of a newly created ImJoy extension for the OpenFlexure server, image processing, such as pixel size calibration using ImageJ.js, can be conducted directly inside the browser without transferring the data to external computers.

Figure 3. a) A representative stitched tile scan of a µSlide 15-well sample slide, where HeLa cells were labeled with anti-tubulin AF647 primary/secondary antibodies using the Opentrons OT2. b) ROI2 and c) its zoomed-in version (ROI3) suggest the presence of fibrous structures, which can be identified as the microtubule network. d,e) Two ROI scans were performed 12 h apart to demonstrate the location error, computed in f) as the difference between the two timestamps. Only small variations of the sample, mostly due to photobleaching, can be observed.

Figure 4. Diagram of intra-device communication, where the Opentrons OT2 pipetting robot's Jupyter server controls the experiment. The protocol is formulated using human-readable Python syntax, and on-the-fly image processing can be performed using available ImJoy plugins directly in the browser. Additional hardware, such as the GRBL stage and camera, can be connected to the Jetson Nano, which also runs the OFM server.

Figure 5. A common automated workflow that involves robot-assisted immunostaining, microscopic imaging, and image processing. The control blocks are formulated in the form of a Jupyter Notebook running on the Opentrons OT2. All steps, except sample preparation (fixation and permeabilization) and sample disposal, are conducted directly at the robot.
However, a truly functioning feedback loop has not yet been established. The open-source nature of all projects in this study pays off, since they allow rapid integration and provide a legally valid framework through corresponding open-source licenses.

Figure 6. After seeding an unknown number of yeast cells using the pipetting robot, the microscope performs a whole-plate scan before a dedicated ImJoy plugin running on a separate server computes the cell density and decides which well will be analyzed in long-term experiments. In this case, well number 19 showed the lowest cell density.

Table 1. Summary of the open-source hard- and software tools that have been combined to form the automated labeling and imaging pipeline.