Jessie Howell a, Tansy C. Hammarton b, Yoann Altmann *c and Melanie Jimenez *a
aBiomedical Engineering Division, James Watt School of Engineering, University of Glasgow, Glasgow, G12 8LT, UK. E-mail: Melanie.Jimenez@glasgow.ac.uk
bInstitute of Infection, Immunity and Inflammation, University of Glasgow, Glasgow, G12 8TA, UK
cSchool of Engineering and Physical Sciences, Heriot-Watt University, Edinburgh Campus, Edinburgh, EH14 4AS, UK
First published on 17th July 2020
Visualising fluids and particles within channels is a key element of microfluidic work. Current imaging methods for particle image velocimetry often require expensive high-speed cameras with powerful illumination sources, thus potentially limiting accessibility. This study explores for the first time the potential of an event-based camera for particle and fluid behaviour characterisation in a microfluidic system. Event-based cameras have the unique capacity to detect light intensity changes asynchronously and to record spatial and temporal information with low latency, low power and high dynamic range. Event-based cameras could consequently be relevant for detecting light intensity changes due to moving particles, chemical reactions or the uptake of fluorescent dyes by cells, to name but a few examples. As a proof of principle, event-based sensing was tested in this work to detect 1 μm and 10 μm diameter particles flowing in a microfluidic channel at average fluid velocities of up to 1.54 m s−1. Importantly, experiments were performed by directly connecting the camera to a standard fluorescence microscope, relying solely on the microscope arc lamp for illumination. We present a data processing strategy that allows particle detection and tracking in both bright-field and fluorescence imaging. Detection was achieved up to a fluid velocity of 1.54 m s−1 and tracking up to 0.4 m s−1, suggesting that event-based cameras could represent a paradigm shift in microscopic imaging.
Using current event-based cameras (available for under 5k USD), events are detected with microsecond resolution. Since only events are transmitted, event-based cameras offer low latency, low power (ca. 10 mW) and high dynamic range (>120 dB).2 These “silicon retinas” have become increasingly popular for high-speed robotic vision, e.g., for ball detection,3–5 gesture recognition,6,7 3D mapping8 or for unmanned aerial vehicles9 and predator robots.10 They have also been used for tracking macroscopic objects such as vehicles11 or stars/satellites12,13 (see the recent survey14 discussing applications and challenges of event-based sensing). Importantly, and to the best of our knowledge, application to the “micro-world” has been limited to the work proposed in ref. 15 for micro-robotics, demonstrating tracking of microparticles in a Petri dish, and to the imaging of neural activity.16 In ref. 17, particle tracking in a fluid–solid system has also been tested in a 5 cm inner diameter pipe with 950 μm particles. The potential of event-based sensing for microfluidic applications consequently remains untapped.
Due to their characteristic microscopic scale, microfluidic systems primarily rely on imaging technologies (such as microscopes and cameras) to monitor fluids and particles inside a channel. Imaging modalities offer considerable flexibility and find applications in quality control (e.g., detection of dust or bubbles), performance evaluation (e.g., mixing, separation, detection) and in better understanding localised phenomena, but limitations remain. As an example, consider the role of imaging in inertial microfluidics.
Inertial focusing devices have been widely used by the community for their unique capabilities to focus and separate particles based on size, shape and/or deformability.18 Despite advances in the field of computational inertial microfluidics,19 prototypes are often tailored to a targeted application following long design/test/optimise iterations to empirically explore the capabilities of new channel designs. Accurately imaging fluid and particle behaviour has become essential to assess the underlying physical phenomena and inform further design changes. Imaging inside inertial focusing devices (and microfluidics channels in general) typically relies on either long-exposure fluorescence or high-speed imaging (Fig. S1†).
Long-exposure fluorescence consists of imaging fluorescent particles over an extended period of time (long enough to be representative of the particle/fluid behaviour) and of building corresponding composite images (by stacking/integrating several images). This approach is particularly well suited to inertial devices, with clear visualisation of “streaks” representing fluorescent particles that are focused at specific locations inside the channel. By mapping the intensity profile, an estimation of the focusing efficiency can be obtained. Long-exposure imaging has been used with a wide range of particles (e.g., beads,20–24 cancer cells,25 bacteria26–28 or fungal cells29 to mention a few) and has enhanced our understanding of the impact of channel geometry on focusing.21,22,30,31 Advantages of this imaging approach include compatibility with standard fluorescence microscopes equipped with conventional cameras and relatively light data processing. Importantly, however, the information extracted is reduced to global (i.e., population) behaviours. Moreover, high particle concentrations are usually required to ensure detectable fluorescent signals, hindering an in-depth understanding of single-particle behaviour.
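The composite-image construction described above (stacking or integrating a sequence of frames of fluorescent particles) can be sketched in a few lines. This is an illustrative Python fragment, not the authors' code (the processing in this work was implemented in Matlab); the choice between a maximum-intensity projection and a mean projection is a common convention rather than a detail taken from the cited studies.

```python
import numpy as np

def composite_image(frames, mode="max"):
    """Build a long-exposure composite from a stack of frames,
    either as a maximum-intensity projection ("max") or by
    integrating/averaging the stack ("mean")."""
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    return stack.max(axis=0) if mode == "max" else stack.mean(axis=0)
```

Mapping the intensity profile of such a composite along a line across the channel then gives the focusing estimate discussed above.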
On the other hand, high-speed imaging unveils other aspects of inertial focusing devices, such as the formation of trains of particles,32 the measurement of migration velocities33 or the number of focusing positions.30 In contrast to long-exposure fluorescence imaging, studies exploiting high-speed imaging for in-depth quantification are scarcer. One reason might be the requirement for bespoke and expensive imaging systems to limit motion blur. Typically, imaging is performed with a high-speed camera synchronised with a high-power, pulsed illumination source to reach exposure times in the order of 1–10 μs.33–36 Although products have been commercialised, e.g., by Dolomite, Fluigent or PreciGenome, to offer plug-and-play solutions to the community, such products are often limited to bright-field imaging. In the presence of mixed populations, as often occurs in microfluidic systems, detecting and differentiating particles without fluorescent signals can be a challenge. New approaches are emerging to unlock the potential of high-speed data acquisition;37–41 however, micro-particle image velocimetry (μPIV) remains the most widely accessible technique for high-speed fluorescence imaging. In μPIV, the illumination is provided by high-power, pulsed lasers to record pairs of images with a short time delay. μPIV set-ups or similar have been used for inertial focusing systems to access particle or fluid velocities.32,33,42–46 Access to μPIV can be challenging due to high capital costs; commercialised μPIV set-ups are also often limited to one wavelength, thus requiring iterative measurements for mixed populations (one measurement per fluorescent population).
In this work, we investigate the potential of event-based cameras as a cost-effective alternative to particle detection and tracking in microfluidic devices that 1) is compatible with standard microscopes, 2) does not rely on high-power pulsed illumination sources, 3) is significantly less data-consuming and less expensive than traditional, frame-based cameras and 4) is attractive for both bright-field and fluorescence imaging. As proof-of-principle, particle detection and tracking were performed in a spiral microfluidic channel in both fluorescence and bright-field modes for 1 μm and 10 μm-diameter beads. For the first time, this work reveals the unique capabilities of event-based sensing for overcoming some of the commonly encountered challenges in microfluidics imaging.
Prior to any measurement, and in between samples, 5 mL filtered PBS was flushed through the syringe three times, and then through the spiral at 1.5 mL min−1, to clean the system. Experiments were performed in triplicate.
To validate data from the event-based camera, a DinoLite camera (Dino-Lite) with a resolution of 1280 × 1024 pixels and a frame rate of 30 fps was mounted on a clamp stand and positioned above the spiral channel, and images were recorded at ∼50× magnification using DinoCapture 2.0 software.
Data analysis was performed using a bespoke data processing pipeline implemented in Matlab, which is detailed in the following section.
In principle, object detection can be performed directly from the stream of events. Here, however, more traditional image processing tools were adopted for the detection task, especially given the small size and simple shape of the beads considered in this work. One of the main advantages of event-based cameras is that this frame-based representation can be built after the data acquisition, with both the frame rate and the integration time user-defined. In this context, the integration period is the temporal window used to aggregate events and build a frame. Each pixel is subject to a dead time, i.e., a period after each detection during which that pixel cannot record subsequent events. This dead time varies with the overall number of events recorded in the array but is typically lower than 50 μs with the camera used here; integration periods smaller than this value are therefore not recommended. Conversely, the longer the integration period, the blurrier the reconstructed frames, as will be illustrated in Fig. 3. Here, integration periods in the 100 μs–1 ms range were used, depending on the expected particle velocity and illumination mode, to find the best trade-off between satisfactory particle detection and motion blur. Note that within a given integration period, several events can be recorded by the camera at a given pixel; in that case, only the most recent event is considered to generate the frame. The frame rate is also user-defined and can be set independently from the per-frame integration time, e.g., it is possible to consider overlapping integration windows. Again, setting this parameter can depend on the expected speed of the particles to be tracked.
Low frame rates can lead to large distances travelled by the particles between two frames and can jeopardise the tracking task, especially when multiple particles are present in the field of view simultaneously (a challenging data association problem). High frame rates make the tracking task easier but might unnecessarily increase the number of frames to be processed. In this work, the frame rate was set to 20k fps to ensure the particles are visible in a sufficient number of frames to estimate their velocity (>40 frames).
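The reconstruction procedure described in the last two paragraphs (a user-defined integration window, a user-defined and possibly overlapping frame period, and the rule that only the most recent event at each pixel is kept) can be sketched as follows. This is a hedged Python illustration rather than the authors' Matlab pipeline, and the event tuple format (timestamp in μs, x, y, polarity ±1) is an assumption made for the sketch.

```python
import numpy as np

def events_to_frames(events, sensor_shape, t_start, t_end,
                     frame_period, integration_time):
    """Aggregate asynchronous events into frames.

    events: iterable of (t, x, y, p) tuples sorted by timestamp t
    (here in microseconds), with polarity p = +1 (intensity increase)
    or -1 (intensity decrease). Within each integration window, later
    events overwrite earlier ones at the same pixel, implementing the
    "most recent event" rule described in the text.
    """
    frames = []
    t0 = t_start
    while t0 + integration_time <= t_end:
        frame = np.zeros(sensor_shape, dtype=np.int8)
        for t, x, y, p in events:
            if t0 <= t < t0 + integration_time:
                frame[y, x] = p  # most recent event wins
        frames.append(frame)
        t0 += frame_period  # windows overlap if period < integration time
    return frames
```

Because the frame period and integration time are independent arguments, overlapping windows (frame_period smaller than integration_time) come for free, mirroring the flexibility described above.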
Based on previous work with similar designs,47–49 10 μm rigid spherical beads were expected to focus towards the inner wall of the spiral channel for Reynolds numbers above ∼50 in the region of interest of the channel, while 1 μm particles were expected to remain unfocused. It can also be noted that according to ref. 50, no secondary Dean vortices are expected for the Reynolds numbers in the range of interest (Re < 160). With an average fluid velocity in the spiral of Uf = 0.5 m s−1 at Re = 50, a light source with a very short pulse duration (∼10 μs) would normally be necessary for single particle detection to circumvent motion blur. In this work, only a standard lighting source (Zeiss HBO100 Mercury vapor short-arc lamp) from a fluorescent microscope was used with the event-based camera.
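As a rough consistency check, the Reynolds numbers quoted in this work can be reproduced from the stated pairing of Uf = 0.5 m s−1 with Re = 50. The channel's hydraulic diameter is not given in this excerpt; assuming water at room temperature, that pairing implies D_h ≈ 100 μm, which is used below purely as an inferred, illustrative value.

```python
# Illustrative sanity check; D_H is inferred from the text, not stated in it.
RHO = 1000.0   # kg m^-3, density of water (assumed)
MU = 1.0e-3    # Pa s, dynamic viscosity of water (assumed)
D_H = 100e-6   # m, hydraulic diameter inferred from Uf = 0.5 m/s <-> Re = 50

def reynolds(u_f, d_h=D_H, rho=RHO, mu=MU):
    """Channel Reynolds number Re = rho * U_f * d_h / mu."""
    return rho * u_f * d_h / mu

for u in (0.04, 0.39, 0.5, 1.54):
    print(f"Uf = {u:4.2f} m/s -> Re = {reynolds(u):5.1f}")
```

With these assumptions, the fastest flow tested (1.54 m s−1) corresponds to Re = 154, consistent with the Re < 160 range quoted above.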
It is interesting to note from Fig. 3 that the trails produced by moving particles (e.g., the negative (blue) events for fluorescent particles in Fig. 3A and B) are usually longer than the events created when the particles first become visible in a given pixel. As presented in Fig. 4A, this phenomenon is due to the fact that only the most recent event is kept at each pixel when constructing the frames. Depending on the selected integration period, the reconstructed image consequently has more negative (blue) events than positive (red) ones for fluorescent particles (and more positive events for bright-field reconstructed images).
Fig. 4 A. Example of image reconstruction from recorded asynchronous events. In this example, a single pixel first records a positive event and then a negative one (top panel). The events are mapped onto temporal frames based on a user-defined integration window (mid panel). This gives access to a sequence of reconstructed images that can then be analysed using image processing tools. The event camera only keeps information corresponding to the last event, leading to “trails” (a higher number of negative (blue) events in this example) on the reconstructed image (bottom panel). B. Simulation of a moving particle between times t1 and t2 in bright-field and fluorescence modes and corresponding events. The third row from the top illustrates how the (simulated) reconstructed event frames are expected to look for a sufficiently short time delay t2–t1 and a short integration time. The displacement of the particles is further highlighted by the dashed yellow lines. C. Reconstructed images obtained from measurements of 10 μm particles moving slowly (∼0.0003 m s−1) in the spiral channel for integration times of 10 ms and 100 ms. As in Fig. 3A and B, the longer the integration time, the longer the trails.
Fig. 3 also shows that moving fluorescent particles induce a local increase of light flux, while particles in bright-field mode are characterized by a local reduction of the light flux. Consequently, fluorescent particles first produce positive events and then negative events, depicted in red and blue, respectively, in Fig. 3A and B. Conversely, particles in bright-field first produce negative events and then positive events (Fig. 3C). This detail is essential to process data appropriately and differences between bright-field and fluorescence imaging are further highlighted via the simulated data presented in Fig. 4B.
In bright-field mode, static particles typically appear as dark rings.51 The darker edges and brighter centre can create a characteristic pattern on the reconstructed images (see simulated images in Fig. 4B, left panels), especially for slowly moving particles. Although not visible in Fig. 3C due to the high velocity of the particles, this pattern was observed at higher magnification (20×) and lower particle velocity (∼0.0003 m s−1), as reported in Fig. 4C. It can also be noted that for slow-moving objects, the length of the trail could be used to estimate the particle velocity from a single image, i.e., without the need for advanced tracking algorithms. However, Fig. 3 shows that the trail length of high-velocity particles cannot be accurately measured due to the low signal-to-noise ratio in that case.
Consequently, in the following sections, only the positive events are used to detect particles in fluorescence mode and, conversely, only the negative events in bright-field mode. More precisely, the frames of positive or negative events are first denoised using morphological transforms (e.g., erosion and dilation) to remove isolated events. The particles are then extracted by identifying groups of spatially connected events whose size falls into a predefined range. The position of each particle is then computed as the centroid of the corresponding region. Note that here, all the particles used had the same apparent size, but the size/shape of the connected regions could be used in the future to classify particles and potentially enhance the tracking results.
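The detection chain just described (keep one event polarity, denoise, extract spatially connected groups of events within a size range, report centroids) can be sketched in Python as follows. The authors' pipeline was written in Matlab and used morphological erosion/dilation for denoising; in this simplified, dependency-free stand-in, the component-size filter plays that role, and min_size/max_size are illustrative values rather than the authors' settings.

```python
import numpy as np
from collections import deque

def detect_particles(frame, polarity=1, min_size=3, max_size=200):
    """Detect particle centroids in a reconstructed event frame.

    Keeps only events of the requested polarity, flood-fills
    4-connected components, discards components whose pixel count
    falls outside [min_size, max_size] (this also removes isolated
    noise events) and returns (x, y) centroids of the survivors.
    """
    mask = (frame == polarity)
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    centroids = []
    for sy, sx in zip(*np.nonzero(mask)):
        if visited[sy, sx]:
            continue
        # Flood-fill one connected component of same-polarity events.
        queue, pixels = deque([(sy, sx)]), []
        visited[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            pixels.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                    visited[ny, nx] = True
                    queue.append((ny, nx))
        if min_size <= len(pixels) <= max_size:
            ys, xs = zip(*pixels)
            centroids.append((float(np.mean(xs)), float(np.mean(ys))))
    return centroids
```

For fluorescence frames one would call this with polarity=+1 and for bright-field frames with polarity=-1, matching the convention established above.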
Particles were successfully detected in fluorescence mode for Uf ∈ [0.04, 1.54] m s−1. The focusing of beads close to the inner wall is clearly visible when the fluid velocity increases, as expected for this microfluidic design.48 Note that each distribution has been derived from at least 1000 particles and that similar distribution profiles can be plotted at any x-location along the imaged channel (cf. Fig. 3D for the definition of the x axis), or mapped onto the entire imaged section of the channel (the channel length imaged with the current set-up is circa 1.5 mm).
For comparison purposes, similar experiments were performed in bright-field mode. Due to a lower contrast between beads and background, reliable detection was only possible up to Uf = 0.15 m s−1 (Fig. 5B). To measure the similarity between the distributions obtained in Fig. 5A and B, we computed their percentage overlap. For Uf = 0.04, 0.08 and 0.15 m s−1, the overlap between the distributions plotted in Fig. 5A and B was 88%, 79% and 77%, respectively. The distributions of fluorescence and bright-field modes are generally in good agreement and the increasing discrepancy with increasing Re can be partly explained by a degradation of the detection performance in bright-field mode.
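The percentage overlap used above to compare distributions is not defined in this excerpt; one common choice, sketched below under that assumption, is the histogram intersection of the two distributions after normalising each to unit mass.

```python
import numpy as np

def percent_overlap(hist_a, hist_b):
    """Percentage overlap between two distributions given as histograms
    over the same bins: 100 * sum_i min(p_i, q_i) after normalising
    each histogram to unit mass. Identical distributions give 100,
    disjoint ones give 0."""
    p = np.asarray(hist_a, dtype=float)
    q = np.asarray(hist_b, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return 100.0 * np.minimum(p, q).sum()
```

Applied to the binned y-position distributions of Fig. 5A and B, such a metric yields a single similarity figure per fluid velocity, comparable to the 88%, 79% and 77% values reported above.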
Being able to detect particles for fluid velocities up to 1.54 m s−1 without a pulsed light confirms the potential of event-based cameras for particle detection in microfluidic channels. Note that 1) results for both fluorescence and bright-field modes were highly reproducible as demonstrated for three replicates (cf. blue inset, Fig. 5B; overlap >94% for replicates in fluorescence mode and >89% in bright-field mode) and 2) detection at higher fluid velocities might possibly be achieved upon further optimisation of the hardware/software (see discussion in Conclusion).
As a control, long-exposure fluorescence imaging was used for all the average fluid velocities considered (cf. Fig. 6A); in this case, images were recorded at 30 fps and stacked over a 15 second period. Fluorescence intensity distributions were then estimated in a measurement window similar to the one used in Fig. 5. As visible in Fig. 6B, the intensity distributions follow patterns and trends similar to those obtained using the event-based camera, with focusing occurring at the inner wall of the spiral channel with increasing Reynolds numbers. The horizontal spread of the fluorescence distribution at Uf = 1.54 m s−1 does not appear as sharp as the particle distribution, which might be due to pixel saturation and the likely non-linear relationship between the particle and light intensity densities. This limitation of the fluorescence imaging set-up is confirmed by the similarities in results obtained with the event-based camera and a high-speed camera for Uf = 1.54 m s−1 (Fig. S4†). Only one main mode is also observed in fluorescence imaging, while two focusing streams seem to be detected with the event-based camera at lower velocities (visible at ∼215 μm, ∼193 μm and ∼138 μm along the y-axis for Uf = 0.08, 0.15 and 0.39 m s−1 in Fig. 6B). Multiple focusing streamlines in inertial devices have been previously reported in the literature, especially for higher volume fractions.47,52 However, the particle concentration used here was kept the same for event-based and fluorescence imaging experiments. Although the exact nature of this observed second mode remains unclear, its observation in both event-based fluorescence and bright-field modes in Fig. 5 seems to confirm that it is not an artefact of the cameras or of the data processing pipeline.
Depending on the concentration of particles, a varying number of particles can be observed simultaneously in the field of view. Consequently, algorithms for multiple target tracking (MTT) can be used. Although it might be possible to identify particles from the complete set of generated frames, in practice this is computationally intractable given the high frame rates considered (200 000 frames for a 10 s experiment at 20k fps). In all experiments performed, the number of particles simultaneously present was relatively low (fewer than 10) and the particles presented similar trajectories and velocities. Thus, a standard online approach to MTT was adopted, which updated the particle tracks sequentially as each frame was processed. Following the particle detection steps described above, the data association problem was solved using a variant of the Munkres algorithm.53,54 This problem consisted of deciding which detected particles were associated with existing tracks (from previous frames) and which were new particles. Once the data association was performed, the actual tracking of each particle was carried out using a standard Kalman filter55 assuming a near-constant velocity model (for each track). The algorithm also included a track-ending mechanism, which terminated tracks for particles that had not been seen over a given period. This made the algorithm more robust against missed particles (which might not be detected in a few frames during the detection process).
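The tracking loop described above can be sketched as follows. To keep the example short and dependency-free, a greedy nearest-neighbour assignment stands in for the Munkres algorithm and a constant-velocity alpha-beta filter stands in for the full Kalman filter; the gating distance, filter gains and track-termination threshold are illustrative values, not the authors' settings.

```python
import numpy as np

class Track:
    """One particle track with a constant-velocity alpha-beta filter
    (a simplified stand-in for the Kalman filter described in the text)."""
    def __init__(self, pos, track_id, alpha=0.85, beta=0.005):
        self.id = track_id
        self.pos = np.asarray(pos, dtype=float)  # (x, y) in pixels
        self.vel = np.zeros(2)                   # pixels per frame
        self.alpha, self.beta = alpha, beta
        self.missed = 0

    def predict(self):
        return self.pos + self.vel

    def update(self, meas):
        pred = self.predict()
        r = np.asarray(meas, dtype=float) - pred  # innovation
        self.pos = pred + self.alpha * r
        self.vel = self.vel + self.beta * r
        self.missed = 0

def step(tracks, detections, next_id, gate=15.0, max_missed=5):
    """Process one frame: greedy nearest-neighbour data association
    (a simple substitute for the Munkres algorithm); unmatched
    detections start new tracks; tracks unseen for more than
    max_missed frames are terminated."""
    detections = [np.asarray(d, dtype=float) for d in detections]
    unmatched = set(range(len(detections)))
    for tr in tracks:
        pred = tr.predict()
        if unmatched:
            j = min(unmatched, key=lambda k: np.linalg.norm(detections[k] - pred))
            if np.linalg.norm(detections[j] - pred) < gate:
                tr.update(detections[j])
                unmatched.discard(j)
                continue
        tr.missed += 1
        tr.pos = pred  # coast on the constant-velocity prediction
    for j in unmatched:
        tracks.append(Track(detections[j], next_id))
        next_id += 1
    tracks[:] = [t for t in tracks if t.missed <= max_missed]
    return next_id
```

Each reconstructed frame is first passed through the detection step, and the resulting centroids are fed to step(); track velocities (in pixels per frame) can then be converted to m s−1 using the frame rate and the pixel size.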
Fig. 7A presents an example of four tracks identified in the channel at Uf = 0.04 m s−1 (Re = 4), leading to the estimated particle velocities plotted in Fig. 7B. The estimated average particle velocity of 0.05 m s−1 is in accordance with Uf. A slight decrease in particle velocity can be observed when approaching the outlets (x > 1000 μm), as expected due to the opening of the channel. Particle velocity profiles were then plotted as a function of the distance to the inner wall for different Reynolds numbers. As presented in Fig. 7C, successful tracking was achieved up to Uf = 0.39 m s−1 (Re = 40); the colormap highlights in purple regions where many detected particles present similar velocity and distance to the inner wall, for a given Reynolds number. For Re < 40, an approximately constant velocity is measured for all the particles tracked (>1000 for each experiment). For particles detected closer to the inner wall, especially at Re = 40 with a high number of particles tightly focused in the region near y = 45 μm, a decrease in particle velocity is observed. This behaviour in particle velocity is further confirmed by the probability density functions displayed in Fig. 7D. For Re > 40, the high speed of the particles induced large particle displacements between successive frames and the read-out limitations of the camera made it more challenging to track particles (low probability of detection) and estimate their velocity.
Fig. 7 A. Example of tracks for four randomly picked 10 μm particles flowing in the spiral at Uf = 0.04 m s−1 (one colour per track). Dashed lines correspond to the microfluidic channel. Each track contains approximately 800 positions. B. Corresponding estimated particle velocity (raw data) as a function of the x-position in the channel. C. Particle velocity profile as a function of the distance to the inner wall for Re = 4–40 (Uf = 0.04–0.39 m s−1) and D. corresponding probability density functions (P.D.F.). Plots in C are based on particles tracked in the blue region of interest highlighted in A. The colormap highlights particle density, with dense regions in purple (linear colour map L17 in ref. 56).
The results in Fig. 7 confirm that event-based cameras can be used to track individual particle behaviours in the size range of commonly used biological cells. With a particle velocity of 0.4 m s−1, a recording rate of 30 fps (the typical recording rate of conventional cameras57) would allow a particle to be seen at most twice in the region of interest (1.5 mm). With a recording rate of 20k fps, these results confirm that event-based cameras can support tracking based on high-speed data acquisition. Although the aim of this work is to demonstrate the compatibility of event-based sensing with stand-alone microscopes, it can also be noted that performance could be improved by coupling the camera with a stronger illumination source.
To map fluid patterns, smaller tracer particles would typically be used for μPIV experiments. Experiments were consequently also conducted with 1 μm fluorescent particles at Uf = 0.04 m s−1, still using only the illumination from the microscope. With the current set-up, one pixel of the 480 × 360 pixel CMOS vision sensor corresponds to a ∼3.3 μm × 3.3 μm field of view. As presented in Fig. 8A, subpixel detection was possible in fluorescence mode, with 1 μm particles successfully detected across the channel. The estimated probability density function of particle velocity is displayed in Fig. 8B. It can be noted that only particles with lower velocities (<0.03 m s−1) were successfully detected and tracked. Due to the small size of the particles, the fluorescence-induced intensity change was lower than with the previously tested 10 μm particles, causing a lower signal-to-noise ratio, while the particle passing time was shorter. Importantly, the fact that 1 μm beads could still be detected and tracked further illustrates the potential of event-based sensing for μPIV experiments in microfluidic devices; higher detection performance could be reached by either increasing the magnification or working with sensors with a higher number of pixels (cf. ref. 2 for a descriptive review of existing cameras and their characteristics).
Fig. 8 Tracking results for 1 μm fluorescent particles at Uf = 0.04 m s−1. Probability density functions (P.D.F.) of A. tracked particles as a function of the distance to the spiral inner wall and B. corresponding particle velocities. Plots A and B are based on particles tracked in the blue region of interest highlighted in Fig. 7A.
For both 10 and 1 μm particles, no information was obtained on the z-position of particles detected with the current data processing strategy; a similar approach to ref. 33 using velocimetric reconstruction for obtaining the z-position could potentially be used with event-based data too.
Finally, the drastic change in data, from images to events, imposes the development of a new processing framework. Importantly, it has been demonstrated here that these data can be analysed to extract relevant information (e.g., particle focusing position, particle velocity) and can also be directly compared to images (e.g., comparison with composite images from fluorescence imaging). This, in addition to their high sensitivity to intensity changes, compatibility with standard microscopes, high-speed capabilities, low power consumption and lower cost compared to standard high-speed cameras, makes event-based cameras unique candidates to change the way we characterise the microscopic world.
Footnote
† Electronic supplementary information (ESI) available. See DOI: 10.1039/d0lc00556h
This journal is © The Royal Society of Chemistry 2020 |