Open Access Article
This Open Access Article is licensed under a Creative Commons Attribution 3.0 Unported Licence.

High-speed particle detection and tracking in microfluidic devices using event-based sensing

Jessie Howell a, Tansy C. Hammarton b, Yoann Altmann *c and Melanie Jimenez *a
aBiomedical Engineering Division, James Watt School of Engineering, University of Glasgow, Glasgow, G12 8LT, UK. E-mail: Melanie.Jimenez@glasgow.ac.uk
bInstitute of Infection, Immunity and Inflammation, University of Glasgow, Glasgow, G12 8TA, UK
cSchool of Engineering and Physical Sciences, Heriot-Watt University, Edinburgh Campus, Edinburgh, EH14 4AS, UK

Received 29th May 2020, Accepted 16th July 2020

First published on 17th July 2020


Abstract

Visualising fluids and particles within channels is a key element of microfluidic work. Current imaging methods for particle image velocimetry often require expensive high-speed cameras with powerful illumination sources, thus potentially limiting accessibility. This study explores for the first time the potential of an event-based camera for particle and fluid behaviour characterisation in a microfluidic system. Event-based cameras have the unique capacity to detect light intensity changes asynchronously and to record spatial and temporal information with low latency, low power and high dynamic range. Event-based cameras could consequently be relevant for detecting light intensity changes due to moving particles, chemical reactions or the uptake of fluorescent dyes by cells, to mention a few examples. As a proof of principle, event-based sensing was tested in this work to detect 1 μm and 10 μm diameter particles flowing in a microfluidic channel at average fluid velocities of up to 1.54 m s−1. Importantly, experiments were performed by directly connecting the camera to a standard fluorescence microscope, relying only on the microscope arc lamp for illumination. We present a data processing strategy that allows particle detection and tracking in both bright-field and fluorescence imaging. Detection was achieved up to a fluid velocity of 1.54 m s−1 and tracking up to 0.4 m s−1, suggesting that event-based cameras could bring about a paradigm shift in microscopic imaging.


Introduction

Event-based cameras emerged in the 1990s as neuromorphic vision sensors mimicking biological retinas.1 Unlike frame-based cameras, event-based cameras respond to, and only record, brightness changes (log intensity depicted as log(n) in Fig. 1), asynchronously and independently for each pixel. When a change in brightness is detected at a given pixel, the event information is transmitted, that is, its (x, y) location on the pixel array, its time stamp and the sign/polarity of the change (increase (+1 in Fig. 1C) or decrease (−1 in Fig. 1C)).
Fig. 1 Event-based detection of a particle over time. A. Represents the light flux reaching a given pixel when a fluorescent particle passes through the pixel field of view (B). C. When the light intensity change (with respect to the last recorded event) exceeds a user-defined threshold, the camera records a new event whose polarity encodes the sign of the intensity change.
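
To make the recorded data format concrete, the short Matlab sketch below builds a hypothetical two-event stream for the single-pixel scenario of Fig. 1 (the field names and numerical values are illustrative assumptions, not the camera's native output format):

    % Hypothetical event stream for one pixel (illustrative values only).
    % Each event is a tuple (x, y, t, p): pixel coordinates, microsecond
    % timestamp and polarity. A fluorescent particle crossing pixel
    % (120, 85) first raises the light flux (p = +1), then lowers it (p = -1).
    events = struct( ...
        'x', [120; 120], ...        % pixel column
        'y', [ 85;  85], ...        % pixel row
        't', [503120; 503870], ...  % timestamps in microseconds
        'p', [ +1;  -1]);           % polarity: +1 brighter, -1 darker

    % Unlike a frame, nothing is stored for this pixel at any other time:
    % the data volume scales with scene activity, not with a frame rate.
    fprintf('%d events, %.0f us apart\n', numel(events.t), diff(events.t));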

Using current event-based cameras (available for under 5k USD), events are detected with microsecond resolution. Since only events are transmitted, event-based cameras offer low latency, low power (ca. 10 mW) and high dynamic range (>120 dB).2 These “silicon retinas” have become increasingly popular for high-speed robotic vision, e.g., for ball detection,3–5 gesture recognition,6,7 3D mapping8 or for unmanned aerial vehicles9 and predator robots.10 They have also been used for tracking macroscopic objects such as vehicles11 or stars/satellites12,13 (see the recent survey,14 which discusses applications and challenges of event-based sensing). Importantly, and to the best of our knowledge, application to the “micro-world” has been limited to the work proposed in ref. 15 for micro-robotics, which demonstrated tracking of microparticles in a Petri dish, and to the imaging of neural activity.16 In ref. 17, particle tracking in a fluid–solid system was also tested in a 5 cm inner diameter pipe with 950 μm particles. The potential of event-based sensing for microfluidic applications consequently remains untapped.

Due to their characteristic microscopic scale, microfluidic systems primarily rely on imaging technologies (such as microscopes and cameras) to monitor fluids and particles inside a channel. Imaging modalities offer considerable flexibility and find applications in quality control (e.g., detection of dust/bubbles), performance evaluation (e.g., mixing, separation, detection) or in better understanding localised phenomena, but limitations remain. As an example, the role of imaging in inertial microfluidics is considered below.

Inertial focusing devices have been widely used by the community for their unique capabilities to focus and separate particles based on size, shape and/or deformability.18 Despite advances in the field of computational inertial microfluidics,19 prototypes are often tailored to a targeted application following long design/test/optimise iterations to empirically explore the capabilities of new channel designs. Accurately imaging fluid and particle behaviour has become essential to assess the underlying physical phenomena and inform further design changes. Imaging inside inertial focusing devices (and microfluidics channels in general) typically relies on either long-exposure fluorescence or high-speed imaging (Fig. S1).

Long-exposure fluorescence consists of imaging fluorescent particles over an extended period of time (long enough to be representative of the particle/fluid behaviour) and of building corresponding composite images (by stacking/integrating several images). This approach is particularly well suited to inertial devices, giving clear visualisation of “streaks” representing fluorescent particles that are focused at specific locations inside the channel. By mapping the intensity profile, an estimate of the focusing efficiency can be obtained. Long-exposure imaging has been used with a wide range of particles (e.g., beads,20–24 cancer cells,25 bacteria26–28 or fungal cells29 to mention a few) and has enhanced our understanding of the impact of channel geometry on focusing.21,22,30,31 Advantages of this imaging approach include compatibility with standard fluorescence microscopes equipped with conventional cameras and relatively light data processing. Importantly, however, the information extracted is reduced to global (i.e., population) behaviours. Moreover, high particle concentrations are usually required to ensure detectable fluorescent signals, hindering an in-depth understanding of single-particle behaviour.

On the other hand, high-speed imaging unveils other aspects of inertial focusing devices such as the formation of trains of particles,32 the measurement of migration velocities33 or the number of focusing positions.30 In contrast to long-exposure fluorescence imaging, studies exploiting high-speed imaging for in-depth quantification are scarcer. One reason might be the requirement for bespoke and expensive imaging systems to limit motion blur. Typically, imaging is performed with a high-speed camera synchronised with a high-power, pulsed illumination source to reach exposure times in the order of 1–10 μs.33–36 Although products have been commercialised, e.g., by Dolomite, Fluigent or PreciGenome, to offer plug-and-play solutions to the community, such products are often limited to bright-field imaging. In the presence of mixed populations, as often occurs in microfluidic systems, detecting and differentiating particles without fluorescent signals can be a challenge. New approaches are emerging to unlock the potential of high-speed data acquisition;37–41 however, micro-particle image velocimetry (μPIV) remains the most widely accessible technique for high-speed fluorescence imaging. In μPIV, the illumination is provided by high-power, pulsed lasers to record pairs of images with a short time delay. μPIV set-ups or similar have been used in inertial focusing systems to access particle or fluid velocities.32,33,42–46 Access to μPIV can be challenging due to its high capital cost; commercialised μPIV set-ups are also often limited to one wavelength, thus requiring iterative measurements for mixed populations (one measurement per fluorescent population).

In this work, we investigate the potential of event-based cameras as a cost-effective alternative for particle detection and tracking in microfluidic devices that 1) is compatible with standard microscopes, 2) does not rely on high-power pulsed illumination sources, 3) is significantly less data-consuming and less expensive than traditional, frame-based cameras and 4) is attractive for both bright-field and fluorescence imaging. As a proof of principle, particle detection and tracking were performed in a spiral microfluidic channel in both fluorescence and bright-field modes for 1 μm- and 10 μm-diameter beads. For the first time, this work reveals the unique capabilities of event-based sensing for overcoming some commonly encountered challenges in microfluidics imaging.

Materials and methods

Bead preparation

Red fluorescent polystyrene beads, 10 μm or 1 μm in diameter (Magsphere), were diluted in filtered phosphate-buffered saline (PBS) supplemented with 0.1% v/v Triton X-100 to a final concentration of 1 × 10⁴–5 × 10⁵ beads per mL. Bead concentrations were determined using a haemocytometer.

Microfluidic setup

A spiral device was fabricated by lithography (Epigem, UK) and consisted of a channel in an Archimedean spiral, with a rectangular cross-section of 360 μm (width) × 60 μm (height). The channel had one inlet, four outlets and six loops with a radius of curvature varying from 0.9 mm to 4.2 mm (Fig. 2). The full design of the spiral is provided in the ESI (Fig. S2). Samples were injected into the spiral channel via the inlet with a mid-pressure syringe pump (neMESYS 1000N, Cetoni, Germany) and polytetrafluoroethylene tubing with an internal diameter of 0.5 mm. The tubing was connected to the chip via Cheminert nuts (1/4′′-28 for 1/16′′ outer diameter). Applied flow rates in this work ranged from 0.05 to 2 mL min−1, corresponding to average fluid velocities in the interval 0.04–1.54 m s−1 and Reynolds numbers in the interval 4–159; the Reynolds number is defined as Re = ρUDh/μ, where ρ is the fluid density, μ is the fluid viscosity, U is the average velocity of the fluid and Dh is the hydraulic diameter of the channel.
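
As a worked example, the Matlab sketch below converts an applied flow rate into the corresponding average fluid velocity and Reynolds number for this channel (water-like density and viscosity are assumed for the PBS-based samples):

    % Flow rate to average velocity and Reynolds number (illustrative sketch).
    w   = 360e-6;         % channel width (m)
    h   = 60e-6;          % channel height (m)
    rho = 1000;           % fluid density (kg m^-3), water-like assumption
    mu  = 1e-3;           % dynamic viscosity (Pa s), water-like assumption

    Q  = 2 / (60 * 1e6);  % 2 mL min^-1 converted to m^3 s^-1
    U  = Q / (w * h);     % average fluid velocity (m s^-1)
    Dh = 2*w*h / (w + h); % hydraulic diameter of a rectangular channel (m)
    Re = rho * U * Dh / mu;

    fprintf('U = %.2f m/s, Re = %.0f\n', U, Re);  % ~1.54 m/s and Re ~ 159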
Fig. 2 Imaging in inertial focusing devices. A. Illustration of the optical setup. Experiments were performed with a standard fluorescence microscope, with the event-based camera mounted via a C-mount port. B. Picture of the event-based camera used in this work. The camera is circa 5.7 cm in width, 4.8 cm in height and 3.6 cm in depth. C. Schematic of the spiral channel used for focusing experiments with images recorded in the region of interest (ROI).

Prior to any measurement, and in between samples, 5 mL filtered PBS was flushed through the syringe three times, and then through the spiral at 1.5 mL min−1, to clean the system. Experiments were performed in triplicate.

Imaging setup

Particles flowing in the spiral were characterised by capturing video footage (15 seconds in duration, from triplicate experiments) using two separate cameras. As depicted in Fig. 2, an event-based camera (CSD3SHCD, Prophesee), consisting of a 480 × 360 pixel CMOS vision sensor with 20 μm × 20 μm event-based pixels and a typical equivalent frame rate of >10k frames per second (fps), was mounted on a Zeiss Axioskop 2 fluorescence microscope (Zeiss, Germany) to visualise the spiral at 10× magnification. Prophesee player software (version 1.4.1-1935316) was used to adjust the camera settings and to record videos. A picture of the experimental setup with the event-based camera is available in Fig. S3.

To validate data from the event-based camera, a DinoLite camera (Dino-Lite) with a resolution of 1280 × 1024 pixels and a frame rate of 30 fps was mounted on a clamp stand and positioned above the spiral channel; images were recorded at ∼50× magnification and 30 fps using DinoCapture 2.0 software.

Data analysis was performed using a bespoke data processing pipeline implemented in Matlab, which is detailed in the following section.

Event-data collection and pre-processing

As mentioned above, event-based cameras do not provide series of frames but, instead, lists of time-tagged events. Thus, pre-processing steps are usually required prior to information extraction. As explained in Fig. 1, events are recorded when the intensity change exceeds a user-defined threshold. If this threshold is set too low, a large number of events are recorded, including signal events but also spurious “nuisance events”. For extremely low threshold values, this can lead to read-out issues whereby not all the events can be properly recorded and transmitted. Conversely, using a large threshold value reduces the number of background events but also the number of signal events, potentially hindering the detection of particles generating weak intensity changes. Consequently, this threshold has to be set carefully. Here, the trade-off between accurate particle detection and low background noise was found via visual inspection using the camera software for each Reynolds number and illumination mode. Note that, in the bright-field mode, the intensity changes are weaker than in the fluorescence mode, leading to lower thresholds in practice.

In principle, object detection can be performed directly from the stream of detected events. However, here, we have adopted more traditional image processing tools, applied to frames reconstructed from the events, to perform the detection task, especially given the small size and simple shape of the beads considered in this work. One of the main advantages of event-based cameras is that this frame-based representation can be built after the data acquisition, with a user-defined frame rate and integration time. In this context, we define the integration period as the temporal window used to aggregate events and build a frame. Each pixel is subject to a dead time, i.e., a period after each detection during which that pixel is not able to record subsequent events. This dead time varies depending on the overall number of events recorded in the array but is typically lower than 50 μs with the camera used here; thus, integration periods smaller than this value are not recommended. Conversely, as will be illustrated in Fig. 3, the longer the integration period, the blurrier the reconstructed frames. Here, integration times in the 100 μs–1 ms range were used, depending on the expected particle velocity and illumination mode, to find the best visual trade-off between satisfactory particle detection and motion blur. Note that, within an integration period, several events can be recorded by the camera at a given pixel; in that case, only the most recent event is considered to generate the frame. The frame rate is also user-defined and can be set independently from the per-frame integration time, e.g., it is possible to consider overlapping integration windows. Setting this parameter also depends on the expected speed of the particles to be tracked. Low frame rates can lead to large distances travelled by the particles between two frames and can jeopardize the tracking task, especially when multiple particles are present in the field of view simultaneously (a challenging data association problem). High frame rates make the tracking task easier, but might unnecessarily increase the number of frames to be processed. In this work, the frame rate was set to 20k fps to ensure that the particles are visible in a sufficient number of frames (>40) to estimate their velocity.
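
The Matlab sketch below illustrates this frame reconstruction (a minimal illustration written for this description rather than the exact pipeline used for the results; the event list is assumed to be a struct of column vectors x, y, t (in μs) and p (±1), sorted by time):

    % Build frames from an event list; only the most recent event per pixel
    % within each integration window is kept, as described above.
    function frames = events_to_frames(events, sensorSize, T_us, dt_us)
        tStarts = events.t(1) : dt_us : events.t(end) - T_us; % window starts
        nFrames = numel(tStarts);
        frames  = zeros([sensorSize, nFrames], 'int8');
        for k = 1:nFrames
            in  = events.t >= tStarts(k) & events.t < tStarts(k) + T_us;
            % Events are assigned in chronological order, so later events
            % overwrite earlier ones at the same pixel.
            idx = sub2ind(sensorSize, events.y(in), events.x(in));
            f   = zeros(sensorSize, 'int8');
            f(idx) = int8(events.p(in));
            frames(:, :, k) = f;
        end
    end

    % e.g., 100 us integration windows stepped every 50 us (i.e., overlapping
    % windows at an equivalent 20k fps) for the 480 x 360 pixel sensor:
    % frames = events_to_frames(events, [360, 480], 100, 50);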


Fig. 3 Raw data for 10 μm particle visualisation inside the spiral channel at Uf = 0.04 m s−1 (Re = 4) in fluorescence mode for an integration time of 100 μs (A) and 750 μs (B). C. Example of raw data for bright-field imaging with the event camera and an integration time of 750 μs. Red arrows denote the position of particles in the channel, red pixels correspond to pixels detecting an increase in brightness and blue ones to a decrease. D. Reconstructed long-exposure intensity image. The scale bars (in the bottom right corner of each subplot) correspond to 200 μm; the inner and outer walls are defined in A.

Results

As presented in Fig. 2, the passage of spherical particles through a spiral channel with a rectangular cross-section (360 μm × 60 μm) was analysed with an event-based camera. Particle detection and tracking were performed near the outlet of the spiral (1200 μm × 1500 μm region of interest (ROI) in Fig. 2C).

Based on previous work with similar designs,47–49 10 μm rigid spherical beads were expected to focus towards the inner wall of the spiral channel for Reynolds numbers above ∼50 in the region of interest of the channel, while 1 μm particles were expected to remain unfocused. It can also be noted that, according to ref. 50, no secondary Dean vortices are expected for Reynolds numbers in the range of interest (Re < 160). With an average fluid velocity in the spiral of Uf = 0.5 m s−1 at Re = 50, a light source with a very short pulse duration (∼10 μs) would normally be necessary for single-particle detection to circumvent motion blur. In this work, only a standard light source (Zeiss HBO 100 mercury vapour short-arc lamp) from a fluorescence microscope was used with the event-based camera.

Visualisation and detection of microparticles

As visible in Fig. 3A and B (and the ESI Video), 10 μm fluorescent particles can be clearly seen at a low fluid velocity (Uf = 0.04 m s−1, Re = 4) using the event-based camera. The motion of particles in bright-field mode (Fig. 3C) was also captured, although the contrast was generally lower than in the fluorescence mode. Fig. 3A–C highlight one specific characteristic of event-based cameras, namely the possibility to define, after the data recording, the integration period and frame rate for visualisation/analysis purposes, as discussed above. The camera used in this work also records information allowing the reconstruction of grey-scale images (at a much lower time resolution than the event data processed here), although not all event-based cameras provide this feature. This feature was used to create a long-exposure intensity image (in bright-field mode) for calibration purposes and to identify the location of the channel in the field of view. An example is depicted in Fig. 3D.

It is interesting to note from Fig. 3 that the trails produced by moving particles (e.g., the negative (blue) events for fluorescent particles in Fig. 3A and B) are usually longer than the regions of events created when the particles first become visible at a given pixel. As presented in Fig. 4A, this phenomenon is due to the fact that only the most recent event is kept at each pixel when constructing the frames. Depending on the selected integration period, the reconstructed image consequently contains more negative (blue) events than positive (red) ones for fluorescent particles (and more positive events for bright-field reconstructed images).


Fig. 4 A. Example of image reconstruction from recorded asynchronous events. In this example, a single pixel records first a positive event and then a negative one (top panel). The events are mapped onto temporal frames based on a user-defined integration window (mid panel). This gives access to a sequence of reconstructed images that can then be analysed using image processing tools. The event camera only keeps information corresponding to the last event, leading to “trails” (a higher number of negative (blue) events in this example) on the reconstructed image (bottom panel). B. Simulation of a moving particle between times t1 and t2 in bright-field and fluorescence modes and corresponding events. The third row from the top illustrates how the (simulated) reconstructed event frames are expected to look (for a sufficiently short time delay t2 − t1 and a short integration time). The displacement of the particles is further highlighted by the dashed yellow lines. C. Reconstructed images obtained from measurements of 10 μm particles slowly moving (∼0.0003 m s−1) in the spiral channel for integration times of 10 ms and 100 ms. As in Fig. 3A and B, the longer the integration time, the longer the trails.

Fig. 3 also shows that moving fluorescent particles induce a local increase of light flux, while particles in bright-field mode are characterized by a local reduction of the light flux. Consequently, fluorescent particles first produce positive events and then negative events, depicted in red and blue, respectively, in Fig. 3A and B. Conversely, particles in bright-field first produce negative events and then positive events (Fig. 3C). This detail is essential for processing the data appropriately, and differences between bright-field and fluorescence imaging are further highlighted via the simulated data presented in Fig. 4B.

In bright-field mode, static particles typically appear as dark rings.51 The darker edges and brighter centre can create a characteristic pattern on the reconstructed images (see simulated images in Fig. 4B, left panels), especially for slowly moving particles. Although not visible in Fig. 3C due to the high velocity of the particles, this pattern was observed at higher magnification (20×) and lower particle velocity (∼0.0003 m s−1), as reported in Fig. 4C. It can also be noted that, for slow-moving objects, the length of the tail could be used to estimate the particle velocity from a single image, i.e., without the need for advanced tracking algorithms. However, Fig. 3 shows that the length of the tail of high-velocity particles cannot be accurately measured due to the low signal-to-noise ratio in that case.
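
A back-of-the-envelope Matlab sketch of this single-image velocity estimate is given below (all numbers are illustrative assumptions rather than measured values):

    % Single-image velocity estimate from the trail of a slow particle: the
    % trail spans roughly the distance travelled during the integration window.
    tailPixels = 12;       % trail length measured in the frame (pixels)
    pixelFoV   = 3.3e-6;   % field of view of one pixel at 10x (m)
    T          = 10e-3;    % integration time used for the frame (s)

    v = tailPixels * pixelFoV / T;  % estimated particle velocity (m s^-1)
    fprintf('v ~ %.4f m/s\n', v);   % ~0.004 m/s for these values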

Consequently, in the following sections, only the positive events are used to detect particles in fluorescence mode and, respectively, only the negative events in bright-field mode. More precisely, the frames of positive or negative events are first denoised using morphological transforms (e.g., erosion and dilation) to remove isolated events. The particles are then extracted by identifying groups of spatially connected events whose size falls into a predefined range. The position of each particle is then computed as the centroid of the corresponding region. Note that, here, all the particles used had the same apparent size, but the size/shape of the connected regions could be used in the future to classify particles and potentially enhance the tracking results.
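
A minimal Matlab sketch of these detection steps is given below for illustration (an indicative implementation rather than the exact pipeline used for the results; the structuring element and size range are placeholder values, and the Image Processing Toolbox is required):

    % Detect particle centroids in one reconstructed event frame (+1/0/-1).
    % usePositive = true selects positive events (fluorescence mode);
    % usePositive = false selects negative events (bright-field mode).
    function centroids = detect_particles(frame, usePositive, areaRange)
        if usePositive
            bw = frame > 0;
        else
            bw = frame < 0;
        end
        % Morphological opening (erosion then dilation) removes isolated
        % spurious events while preserving compact groups of events.
        bw = imopen(bw, strel('disk', 1));
        % Keep connected groups of events whose size falls in a preset range,
        % then take the centroid of each region as the particle position.
        stats = regionprops(bwconncomp(bw), 'Area', 'Centroid');
        keep  = [stats.Area] >= areaRange(1) & [stats.Area] <= areaRange(2);
        centroids = cat(1, stats(keep).Centroid);  % one [x, y] row per particle
    end

    % e.g., centroids = detect_particles(frames(:, :, k), true, [4, 200]);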

Quantification of focusing behaviours in fluorescence and bright-field modes

In addition to demonstrating that particles can be detected at low fluid velocity, we investigated whether detection was possible for the range of velocities usually considered for focusing experiments with such microfluidic designs (0.04 ≤ Uf ≤ 1.54 m s−1 tested here, corresponding to 4 ≤ Re ≤ 159). The spatial distribution (expressed as the distance to the inner wall of the spiral channel) of particles detected next to the outlet of the channel, in a region of constant cross-section (before the opening), was recorded (Fig. 5A). For visualisation purposes, violin plots were used as a comprehensive representation of the spatial distribution of particles. A wide horizontal spread of these plots (normalised by the number of detected particles in each case) corresponds to a large number of particles detected within a narrow region of the channel (characteristic of focusing).
Fig. 5 A. Distribution of particles detected in one video recorded with an event-based camera using fluorescence microscopy for average fluid velocities in the interval 0.04–1.54 m s−1 (corresponding to Re = 4, 8, 16, 40 and 159). B. Distribution of particles detected with an event-based camera using bright-field microscopy for average fluid velocities in the interval 0.04–0.15 m s−1. Inset. For Uf = 0.15 m s−1 (light blue plots), three replicates (blue, red and black) are superimposed for both fluorescence and bright-field imaging. All the distributions depicted in this figure are normalised such that they do not depend on the particle concentration. Thus, all the violin plots (except in the inset which presents a different vertical scale) have the same area. Although the average fluid velocity Uf is used in the x-axis for differentiation purposes, a wide horizontal spread corresponds to a large number of particles detected within a given distance to the inner wall.

Particles were successfully detected in fluorescence mode for Uf ∈ [0.04, 1.54] m s−1. The focusing of beads close to the inner wall is clearly visible when the fluid velocity increases, as expected for this microfluidic design.48 Note that each distribution has been derived from at least 1000 particles and that similar distribution profiles can be plotted at any x-location along the imaged channel (cf. Fig. 3D for the definition of the x-axis), or mapped onto the entire imaged section of the channel (the channel length imaged with the current set-up is circa 1.5 mm).

For comparison purposes, similar experiments were performed in bright-field mode. Due to the lower contrast between beads and background, reliable detection was only possible up to Uf = 0.15 m s−1 (Fig. 5B). To measure the similarity between the distributions obtained in Fig. 5A and B, we computed their percentage overlap. For Uf = 0.04, 0.08 and 0.15 m s−1, the overlap between the distributions plotted in Fig. 5A and B was 88%, 79% and 77%, respectively. The distributions obtained in fluorescence and bright-field modes are generally in good agreement, and the increasing discrepancy with increasing Re can be partly explained by a degradation of the detection performance in bright-field mode.
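
One plausible implementation of such an overlap measure is the histogram intersection sketched below in Matlab (an assumption on our part; the exact formula used for the figures may differ and the bin width is indicative only):

    % Percentage overlap between two particle distributions, computed as the
    % intersection of their normalised histograms (1 = identical, 0 = disjoint).
    % d1, d2 are vectors of particle distances to the inner wall (um).
    function ov = distribution_overlap(d1, d2, edges)
        p1 = histcounts(d1, edges, 'Normalization', 'probability');
        p2 = histcounts(d2, edges, 'Normalization', 'probability');
        ov = sum(min(p1, p2));
    end

    % e.g., 5 um bins spanning the 360 um channel width:
    % ov = distribution_overlap(dFluo, dBrightField, 0:5:360);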

Being able to detect particles at fluid velocities up to 1.54 m s−1 without pulsed lighting confirms the potential of event-based cameras for particle detection in microfluidic channels. Note that 1) results for both fluorescence and bright-field modes were highly reproducible, as demonstrated for three replicates (cf. blue inset, Fig. 5B; overlap >94% for replicates in fluorescence mode and >89% in bright-field mode) and 2) detection at higher fluid velocities might be achieved upon further optimisation of the hardware/software (see discussion in Conclusions).

As a control, long-exposure fluorescence imaging was used for all the average fluid velocities considered (cf. Fig. 6A) – in this case, images were recorded at 30 fps and stacked over a 15 second period. Fluorescence intensity distributions were then estimated in a measurement window similar to the one used in Fig. 5. As visible in Fig. 6B, the intensity distributions follow patterns and trends similar to those obtained using the event-based camera, with focusing occurring at the inner wall of the spiral channel with increasing Reynolds numbers. The horizontal spread of the fluorescence distribution at Uf = 1.54 m s−1 does not appear as sharp as the particle distribution, which might be due to pixel saturation and the likely non-linear relationship between the particle and light intensity densities. This limitation of the fluorescence imaging set-up is confirmed by the similarity of the results obtained with the event-based camera and a high-speed camera for Uf = 1.54 m s−1 (Fig. S4). Only one main mode is also observed in fluorescence imaging, while two streams of focusing seem to be detected with the event-based camera at lower velocities (visible at ∼215 μm, ∼193 μm and ∼138 μm along the y-axis for Uf = 0.08, 0.15 and 0.39 m s−1 in Fig. 6B). Multiple focusing streamlines in inertial devices have been previously reported in the literature, especially for higher volume fractions.47,52 However, the particle concentration used here was kept the same for event-based and fluorescence imaging experiments. Although the exact nature of this second mode remains unclear, its observation in both event-based fluorescence and bright-field modes in Fig. 5 seems to confirm that it is not an artefact of the cameras or of the data processing pipeline.


Fig. 6 A. Composite images of 10 μm fluorescent beads flowing near the outlet of a spiral channel at Uf = 0.15, 0.39 and 1.54 m s−1 (Re = 16, 40 and 159 respectively). The dashed lines correspond to the channel walls, the yellow rectangles are regions where the intensity distributions have been estimated and the arrow highlights the focused stream of beads at Uf = 1.54 m s−1. Scale bars: 200 μm. B. Intensity distributions estimated in the measurement windows (yellow rectangles in A) and depicted as distances to the channel inner wall in blue. Particle densities obtained from the event-camera are superimposed in red. All the distributions depicted in this figure are normalised such that all the violin plots have the same area. Although the average fluid velocity Uf is used in the x-axis for differentiation purposes, a wide horizontal spread corresponds to a large number of particles detected within a given distance to the inner wall.

Particle tracking and velocity mapping

In this section, the potential of event-based imaging for particle tracking was investigated. Only results in fluorescence mode are reported here, but tracking was also achieved in bright-field mode for all the fluid velocities reported in Fig. 5B.

Depending on the concentration of particles, a varying number of particles can be observed simultaneously in the field of view. Consequently, algorithms for multiple target tracking (MTT) can be used. Although it might be possible to identify particles jointly from the complete set of generated frames, in practice this is computationally intractable given the high frame rates considered (200k frames for a 10 s experiment at 20k fps). In all experiments performed, the number of particles simultaneously present was relatively low (fewer than 10) and the particles presented similar trajectories and velocities. Thus, a standard online approach to MTT was adopted, which updated the particle tracks sequentially as each frame was processed. Following the particle detection steps described above, the data association problem was solved using a variant of the Munkres algorithm.53,54 This problem consists of deciding which detected particles are associated with existing tracks (from previous frames) and which are new particles. Once the data association was performed, the actual tracking of each particle was performed using a standard Kalman filter55 assuming a near-constant-velocity model (for each track). The algorithm also included a track-ending mechanism, which terminated tracks for particles that had not been seen over a given period. This made the algorithm more robust against missed detections (particles might not be detected in a few frames during the detection process).
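
A minimal Matlab sketch of one iteration of such a tracker is given below (an indicative implementation, not the exact code used here: matchpairs, available from MATLAB R2019a, provides a Munkres-type rectangular assignment, and the noise levels, gating cost and track-ending threshold are placeholder values):

    % One online MTT iteration: predict, associate, update, create/end tracks.
    % State per track: [x; y; vx; vy]. detections is an N x 2 matrix of [x, y].
    % Initialise once with: tracks = struct('x', {}, 'P', {}, 'misses', {});
    function tracks = update_tracks(tracks, detections, dt, gateCost, maxMisses)
        F = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1]; % near-constant velocity
        H = [1 0 0 0; 0 1 0 0];                     % position is observed
        Q = 0.1 * eye(4);  R = 2 * eye(2);          % placeholder noise levels

        for i = 1:numel(tracks)                     % 1. predict each track
            tracks(i).x = F * tracks(i).x;
            tracks(i).P = F * tracks(i).P * F' + Q;
        end

        cost = zeros(numel(tracks), size(detections, 1));
        for i = 1:numel(tracks)                     % 2. data association costs
            for j = 1:size(detections, 1)
                cost(i, j) = norm(H * tracks(i).x - detections(j, :)');
            end
        end
        [pairs, unTrk, unDet] = matchpairs(cost, gateCost);

        for k = 1:size(pairs, 1)                    % 3. Kalman update
            i = pairs(k, 1);  z = detections(pairs(k, 2), :)';
            S = H * tracks(i).P * H' + R;
            K = tracks(i).P * H' / S;
            tracks(i).x = tracks(i).x + K * (z - H * tracks(i).x);
            tracks(i).P = (eye(4) - K * H) * tracks(i).P;
            tracks(i).misses = 0;
        end

        for j = unDet'                              % 4. start new tracks
            tracks(end+1) = struct('x', [detections(j, :)'; 0; 0], ...
                                   'P', 10 * eye(4), 'misses', 0); %#ok<AGROW>
        end
        for i = unTrk'                              % 5. count missed frames
            tracks(i).misses = tracks(i).misses + 1;
        end
        tracks([tracks.misses] > maxMisses) = [];   % track-ending mechanism
    end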

Fig. 7A presents an example of four tracks identified in the channel at Uf = 0.04 m s−1 (Re = 4), leading to the estimated particle velocities plotted in Fig. 7B. The estimated average particle velocity of 0.05 m s−1 is in accordance with Uf. A slight decrease in particle velocity can be observed when approaching the outlets (x > 1000 μm), as expected due to the opening of the channel. Particle velocity profiles were then plotted as a function of the distance to the inner wall for different Reynolds numbers. As presented in Fig. 7C, successful tracking was achieved up to Uf = 0.39 m s−1 (Re = 40); the colormap highlights in purple the regions where many detected particles present a similar velocity and distance to the inner wall for a given Reynolds number. For Re < 40, an approximately constant velocity is measured for all the particles tracked (>1000 for each experiment). For particles detected closer to the inner wall, especially at Re = 40 with a high number of particles tightly focused in the region near y = 45 μm, a decrease in particle velocity is observed. This behaviour is further confirmed by the probability density functions displayed in Fig. 7D. For Re > 40, the high speed of the particles induced large particle displacements between successive frames and, together with the read-out limitations of the camera, made it more challenging to track particles (low probability of detection) and estimate their velocity.


Fig. 7 A. Example of tracks for four randomly picked 10 μm particles flowing in the spiral at Uf = 0.04 m s−1 (one colour per track). Dashed lines correspond to the microfluidic channel. Each track contains approximately 800 positions. B. Corresponding estimated particle velocity (raw data) as a function of the x-position in the channel. C. Particle velocity profile as a function of the distance to the inner wall for Re = 4–40 (Uf = 0.04–0.39 m s−1) and D. corresponding probability density functions (P.D.F). Plots in C are based on particles tracked in the blue region of interest highlighted in A. Colormap highlights particle density, with dense regions in purple (linear colour maps L17 in ref. 56).

The results in Fig. 7 confirm that event-based cameras can be used to track individual particle behaviours in the size range of commonly used biological cells. With a particle velocity of 0.4 m s−1, a recording rate of 30 fps (the typical recording rate of conventional cameras57) would allow the particle to be seen at most twice in the region of interest (1.5 mm). With a recording rate of 20k fps, these results confirm that event-based cameras can be used for tracking based on high-speed data acquisition. Although the aim of this work is to demonstrate the compatibility of event-based sensing with stand-alone microscopes, it can also be noted that performance could be improved by coupling the camera with a stronger illumination source.

To map fluid patterns, smaller particles would typically be used for μPIV experiments. Experiments were consequently also conducted with 1 μm fluorescent particles at Uf = 0.04 m s−1, still using only the illumination from the microscope. With the current set-up, one pixel of the 480 × 360 pixel CMOS vision sensor corresponds to a ∼3.3 μm × 3.3 μm field of view. As presented in Fig. 8A, sub-pixel detection was possible in fluorescence mode, with 1 μm particles successfully detected across the channel. The estimated probability density function of particle velocity is displayed in Fig. 8B. It can be noted that only particles with lower velocities (<0.03 m s−1) were successfully detected and tracked. Due to the small size of the particles, the fluorescence-induced intensity change was lower than with the previously tested 10 μm particles, causing a lower signal-to-noise ratio, while the particle passing time was shorter. Importantly, the fact that 1 μm beads could still be detected and tracked further illustrates the potential of event-based sensing for μPIV experiments in microfluidic devices; a higher detection performance could be reached by either increasing the magnification or working with sensors with a higher number of pixels (cf. ref. 2 for a review of available cameras and their characteristics).


Fig. 8 Tracking results of 1 μm fluorescent particles at Uf = 0.04 m s−1. Probability density functions (P.D.F.) of A. tracked particles as a function of the distance to the spiral inner wall and B. corresponding particle velocities. Plots A and B are based on particles tracked in the blue region of interest highlighted in Fig. 7A.

For both 10 μm and 1 μm particles, no information was obtained on the z-position of detected particles with the current data processing strategy; an approach similar to that of ref. 33, using velocimetric reconstruction to obtain the z-position, could potentially be applied to event-based data too.

Conclusions

Event-based cameras offer unique advantages for tracking high-speed phenomena thanks to their sensors acting as silicon retinas. Although the benefits of this technology have already been demonstrated for robotics, its potential for biological/microscopic applications remains largely untapped. In this work, the performance of an event-based camera for detecting and tracking micrometric particles in a microfluidic channel was evaluated. Inertial focusing devices, due to their high working Reynolds numbers, are often recognised as challenging systems for individual particle tracking. Current approaches typically rely on high-power, pulsed illumination sources and expensive micro-PIV setups to track fluorescent particles. The present work demonstrates that event-based cameras can offer an alternative to such state-of-the-art imaging setups. Particle detection was possible for a wide range of fluid velocities, up to Uf = 1.54 m s−1, simply by using a standard fluorescence microscope (and its lighting), both in bright-field and fluorescence modes. As opposed to micro-PIV setups, the event-based camera is not limited to one wavelength; any particles that are excitable in the visible spectrum with the microscope can potentially be detected. Although more challenging to track accurately, the velocity profile of particles down to 1 μm was also measured with the tested set-up. Since the application of event-based cameras to the microfluidic world is still new, specific challenges/limitations also need to be considered. The field of event-based cameras is fast evolving, with ever faster and more sensitive sensors being developed. For instance, efforts are currently being made to increase the fill factor and reduce the pitch of event-based detectors, and at the same time to produce larger arrays to improve the spatial resolution. However, it is important to mention that, as opposed to the camera tested here, most products do not directly offer grey-scale “reconstructed images”. This might cause significant difficulties when setting up the system (e.g., for focusing) since only moving/blinking objects are visible on the display. This could be tackled by engineering new tools to help with the calibration, either in the setup itself or computationally.58

Finally, the drastic change in the nature of the data – from images to events – imposes the development of a new processing framework. Importantly, it has been demonstrated here that the data can be analysed to extract relevant information (e.g., particle focusing position, particle velocity) and can also be directly compared to images (e.g., comparison with composite images from fluorescence imaging). This, in addition to their high sensitivity to intensity changes, compatibility with standard microscopes, high-speed capabilities, low power consumption and lower cost compared to standard high-speed cameras, makes event-based cameras unique candidates to change the way we characterise the microscopic world.

Author contributions

JH, TH and MJ designed and performed the experiments. YA designed the data processing approach and analysed the experimental results with MJ. JH, YA and MJ wrote the manuscript; all authors discussed the results and commented on the manuscript. YA and MJ contributed equally to this work.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

We would like to thank Dr Chas Nelson and Dr Will Peveler for our discussions on the event-based camera. We also would like to thank Dr Ewa Guzniczak for the help provided with the high-speed imaging set-up. MJ and YA are supported by the Royal Academy of Engineering under the Research Fellowship scheme (RF/201718/1741 and RF201617/16/31). MJ would also like to thank the Engineering and Physical Sciences Research Council (EPSRC) and the Royal Society for their support (EP/R006482/1 and RGS\R1\191188).

Notes and references

  1. M. Mahowald, VLSI analogs of neuronal visual processing: a synthesis of form and function, Ph.D. dissertation, California Institute of Technology, 1992. Available from: https://www.ini.uzh.ch/~amw/publicat/mishathesis.pdf
  2. G. Gallego, T. Delbruck, G. Orchard, C. Bartolozzi and B. Taba, et al., Event-based Vision: A Survey, 2020. Available from: https://github.com/uzh-rpg/event-based
  3. T. Delbruck and P. Lichtsteiner, Fast sensory motor control based on event-based hybrid neuromorphic-procedural system, in 2007 IEEE International Symposium on Circuits and Systems, IEEE, 2007, pp. 845–848.
  4. T. Delbruck and M. Lang, Robotic goalie with 3 ms reaction time at 4% CPU load using event-based dynamic vision sensor, Front. Neurosci., 2013, 7, 223.
  5. A. Glover and C. Bartolozzi, Event-driven ball detection and gaze fixation in clutter, in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2016, pp. 2203–2208.
  6. J. H. Lee, T. Delbruck, M. Pfeiffer, P. K. J. Park, C.-W. Shin and H. Ryu, et al., Real-Time Gesture Interface Based on Event-Driven Processing From Stereo Silicon Retinas, IEEE Trans. Neural Netw. Learn. Syst., 2014, 25(12), 2250–2263.
  7. A. Amir, B. Taba, D. Berg, T. Melano, J. McKinstry and C. Di Nolfo, et al., A Low Power, Fully Event-Based Gesture Recognition System, in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, 2017, pp. 7388–7397.
  8. C. Brandli, T. A. Mantel, M. Hutter, M. A. Höpflinger, R. Berner and R. Siegwart, et al., Adaptive pulsed laser line extraction for terrain reconstruction using a dynamic vision sensor, Front. Neurosci., 2014, 7, 275.
  9. B. J. Pijnaker Hordijk, K. Y. W. Scheper and G. C. H. E. de Croon, Vertical landing for micro air vehicles using event-based optical flow, J. Field Robot., 2018, 35, 69–90.
  10. D. P. Moeys, F. Corradi, E. Kerr, P. Vance, G. Das and D. Neil, et al., Steering a predator robot using a mixed frame/event-driven convolutional neural network, in 2016 Second International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP), IEEE, 2016, pp. 1–8.
  11. M. Litzenberger, B. Kohn, A. N. Belbachir, N. Donath, G. Gritsch and H. Garn, et al., Estimation of Vehicle Speed Based on Asynchronous Data from a Silicon Retina Optical Sensor, in 2006 IEEE Intelligent Transportation Systems Conference, IEEE, 2006, pp. 653–658.
  12. G. Cohen, S. Afshar, B. Morreale, T. Bessell, A. Wabnitz and M. Rutten, et al., Event-based Sensing for Space Situational Awareness, J. Astronaut. Sci., 2019, 66(2), 125–141.
  13. T.-J. Chin, S. Bagchi, A. Eriksson and A. van Schaik, Star Tracking using an Event Camera, 2018, arXiv:1812.02895.
  14. G. Gallego, T. Delbruck, G. Orchard, C. Bartolozzi, B. Taba and A. Censi, et al., Event-based Vision: A Survey, 2019, arXiv:1904.08405.
  15. Z. Ni, C. Pacoret, R. Benosman, S. Ieng and S. Régnier, Asynchronous event-based high speed vision for microparticle tracking, J. Microsc., 2012, 245(3), 236–244.
  16. G. Taverni, D. P. Moeys, F. F. Voigt, C. Li, C. Cavaco and V. Motsnyi, et al., In-vivo imaging of neural activity with dynamic vision sensors, in 2017 IEEE Biomedical Circuits and Systems Conference (BioCAS), IEEE, 2017, pp. 1–4.
  17. D. Drazen, P. Lichtsteiner, P. Häfliger, T. Delbrück and A. Jensen, Toward real-time particle tracking using an event-based dynamic vision sensor, Exp. Fluids, 2011, 51(5), 1465–1469.
  18. J. Zhang, S. Yan, D. Yuan, G. Alici, N.-T. Nguyen and M. E. Warkiani, et al., Fundamentals and applications of inertial microfluidics: a review, Lab Chip, 2016, 16(1), 10–34.
  19. S. Razavi Bazaz, A. Mashhadian, A. Ehsani, S. C. Saha, T. Krüger and M. E. Warkiani, Computational inertial microfluidics: a review, Lab Chip, 2020, 20(6), 1023–1048.
  20. A. A. S. Bhagat, S. S. Kuntaegowdanahalli and I. Papautsky, Continuous particle separation in spiral microchannels using dean flows and differential migration, Lab Chip, 2008, 8(11), 1906.
  21. G. Guan, L. Wu, A. A. Bhagat, Z. Li, P. C. Y. Chen and S. Chao, et al., Spiral microchannel with rectangular and trapezoidal cross-sections for size based particle separation, Sci. Rep., 2013, 3, 1475.
  22. J. M. Martel and M. Toner, Inertial focusing dynamics in spiral microchannels, Phys. Fluids, 2012, 24(3), 032001.
  23. J. Zhang, S. Yan, R. Sluyter, W. Li, G. Alici and N.-T. Nguyen, Inertial particle separation by differential equilibrium positions in a symmetrical serpentine micro-channel, Sci. Rep., 2015, 4(1), 4527.
  24. P. Paiè, F. Bragheri, D. Di Carlo and R. Osellame, Particle focusing by 3D inertial microfluidics, Microsyst. Nanoeng., 2017, 3(1), 17027.
  25. J. Sun, et al., Size-based hydrodynamic rare tumor cell separation in curved microfluidic channels, Biomicrofluidics, 2013, 7(1), 11802.
  26. J. Cruz, T. Graells, M. Walldén and K. Hjort, Inertial focusing with sub-micron resolution for separation of bacteria, Lab Chip, 2019, 19(7), 1257–1266.
  27. J.-H. Lee, S.-K. Lee, J.-H. Kim and J.-H. Park, Separation of particles with bacterial size range using the control of sheath flow ratio in spiral microfluidic channel, Sens. Actuators, A, 2019, 286, 211–219.
  28. H. W. Hou, R. P. Bhattacharyya, D. T. Hung and J. Han, Direct detection and drug-resistance profiling of bacteremias using inertial microfluidics, Lab Chip, 2015, 15(10), 2297–2307.
  29. B. B. Fuchs, S. Eatemadpour, J. M. Martel-Foley, S. Stott, M. Toner and E. Mylonakis, Rapid Isolation and Concentration of Pathogenic Fungi Using Inertial Focusing on a Chip-Based Platform, Front. Cell. Infect. Microbiol., 2019, 9, 27.
  30. J. M. Martel and M. Toner, Particle Focusing in Curved Microfluidic Channels, Sci. Rep., 2013, 3(1), 3340.
  31. A. Russom, A. K. Gupta, S. Nagrath, D. Di Carlo, J. F. Edd and M. Toner, Differential inertial focusing of particles in curved low-aspect-ratio microchannels, New J. Phys., 2009, 11, 75025.
  32. Z. Pan, R. Zhang, C. Yuan and H. Wu, Direct measurement of microscale flow structures induced by inertial focusing of single particle and particle trains in a confined microchannel, Phys. Fluids, 2018, 30(10), 102005.
  33. K. Hood, S. Kahkeshani, D. Di Carlo and M. Roper, Direct measurement of particle inertial migration in rectangular microchannels, Lab Chip, 2016, 16(15), 2840–2850.
  34. P. M. Holloway, J. Butement, M. Hegde and J. West, Serial integration of Dean-structured sample cores with linear inertial focussing for enhanced particle and cell sorting, Biomicrofluidics, 2018, 12(4), 044104.
  35. T. Kwon, R. Yao, J.-F. P. Hamel and J. Han, Continuous removal of small nonviable suspended mammalian cells and debris from bioreactors using inertial microfluidics, Lab Chip, 2018, 18(18), 2826–2837.
  36. E. Guzniczak, O. Otto, G. Whyte, N. Willoughby, M. Jimenez and H. Bridle, Deformability-induced lift force in spiral microchannels for cell separation, Lab Chip, 2020, 20(3), 614–625.
  37. S. Karpf, C. T. Riche, D. Di Carlo, A. Goel, W. A. Zeiger and A. Suresh, et al., Spectro-temporal encoded multiphoton microscopy and fluorescence lifetime imaging at kilohertz frame-rates, Nat. Commun., 2020, 11(1), 2062.
  38. L. E. Weiss, Y. Shalev Ezra, S. Goldberg, B. Ferdman, O. Adir and A. Schroeder, et al., Three-dimensional localization microscopy in live flowing cells, Nat. Nanotechnol., 2020, 1–7.
  39. B. Dong, S. Chen, F. Zhou, C. H. Y. Chan, J. Yi and H. F. Zhang, et al., Real-time Functional Analysis of Inertial Microfluidic Devices via Spectral Domain Optical Coherence Tomography, Sci. Rep., 2016, 6(1), 33250.
  40. Y. Suzuki, K. Kobayashi, Y. Wakisaka, D. Deng, S. Tanaka and C.-J. Huang, et al., Label-free chemical imaging flow cytometry by high-speed multicolor stimulated Raman scattering, Proc. Natl. Acad. Sci. U. S. A., 2019, 116(32), 15842–15848.
  41. B. Guo, C. Lei, Y. Wu, H. Kobayashi, T. Ito and Y. Yalikun, et al., Optofluidic time-stretch quantitative phase microscopy, Methods, 2018, 136, 116–125.
  42. D. R. Gossett and D. Di Carlo, Particle Focusing Mechanisms in Curving Confined Flows, Anal. Chem., 2009, 81(20), 8459–8465.
  43. J. Zhang, W. Li, M. Li, G. Alici and N.-T. Nguyen, Particle inertial focusing and its mechanism in a serpentine microchannel, Microfluid. Nanofluid., 2014, 17(2), 305–316.
  44. B. Miller, M. Jimenez and H. Bridle, Cascading and Parallelising Curvilinear Inertial Focusing Systems for High Volume, Wide Size Distribution, Separation and Concentration of Particles, Sci. Rep., 2016, 6(1), 36386.
  45. Y. W. Kim and J. Y. Yoo, The lateral migration of neutrally-buoyant spheres transported through square microchannels, J. Micromech. Microeng., 2008, 18(6), 065015.
  46. E. J. Lim, T. J. Ober, J. F. Edd, S. P. Desai, D. Neal and K. W. Bong, et al., Inertio-elastic focusing of bioparticles in microchannels at high throughput, Nat. Commun., 2014, 5(1), 4120.
  47. M. Jimenez, B. Miller and H. L. Bridle, Efficient separation of small microparticles at high flowrates using spiral channels: Application to waterborne pathogens, Chem. Eng. Sci., 2017, 157, 247–254.
  48. E. Guzniczak, O. Otto, G. Whyte, N. Willoughby, M. Jimenez and H. Bridle, Deformability-induced lift force in spiral microchannels for cell separation, Lab Chip, 2020, 20(3), 614–625.
  49. E. Guzniczak, O. Otto and G. Whyte, et al., Purifying stem cell-derived red blood cells: a high-throughput label-free downstream processing strategy based on microfluidic spiral inertial separation and membrane filtration, Biotechnol. Bioeng., 2020, 117, 2032–2045.
  50. N. Nivedita, P. Ligrani and I. Papautsky, Dean Flow Dynamics in Low-Aspect Ratio Spiral Microchannels, Sci. Rep., 2017, 7(1), 44072.
  51. O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan and W. Bishara, et al., Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications, Lab Chip, 2010, 10(11), 1417.
  52. K. J. Humphry, P. M. Kulkarni, D. A. Weitz, J. F. Morris and H. A. Stone, Axial and lateral particle ordering in finite Reynolds number channel flows, Phys. Fluids, 2010, 22(8), 081703.
  53. J. Munkres, Algorithms for the Assignment and Transportation Problems, J. Soc. Ind. Appl. Math., 1957, 5(1), 32–38.
  54. F. Bourgeois and J.-C. Lassalle, An extension of the Munkres algorithm for the assignment problem to rectangular matrices, Commun. ACM, 1971, 14(12), 802–804.
  55. R. E. Kalman, A New Approach to Linear Filtering and Prediction Problems, J. Basic Eng., 1960, 82(1), 35–45.
  56. P. Kovesi, Good Colour Maps: How to Design Them, 2015, arXiv:1509.03700.
  57. M. Versluis, High-speed imaging in fluids, Exp. Fluids, 2013, 54(2), 1458.
  58. H. Rebecq, R. Ranftl, V. Koltun and D. Scaramuzza, High Speed and High Dynamic Range Video with an Event Camera, 2019, arXiv:1906.07165.

Footnote

Electronic supplementary information (ESI) available. See DOI: 10.1039/d0lc00556h

This journal is © The Royal Society of Chemistry 2020