
Biosensors and Bioelectronics

Contents lists available at ScienceDirect

journal homepage: www.elsevier.com/locate/bios

A review of recent progress in lens-free imaging and sensing


Mohendra Roy a,b, Dongmin Seo a, Sangwoo Oh a,c, Ji-Woon Yang a, Sungkyu Seo a,*

a Department of Electronics and Information Engineering, Korea University, Sejong, Republic of Korea
b Department of Physics, Rajiv Gandhi University, Rono Hills, Arunachal Pradesh, India
c Korea Research Institute of Ships & Ocean Engineering, Daejeon, Republic of Korea

ARTICLE INFO

Article history:
Received 7 June 2016
Received in revised form 27 July 2016
Accepted 31 July 2016

Keywords:
Lens-free imaging and sensing
From theory to applications
Shadow images
Diffraction patterns
CMOS image sensor

ABSTRACT

Recently, lens-free imaging has evolved as an alternative imaging technology. The key advantages of this technology, including simplicity, compactness, low cost, and flexibility of integration with other components, have facilitated the realization of many innovative applications, especially in the fields of on-chip lens-free imaging and sensing. In this review, we discuss the development of lens-free imaging, from theory to applications. This article includes the working principle of lens-free digital inline holography (DIH) with coherent and semi-coherent light, on-chip lens-free fluorescence imaging and sensing, lens-free on-chip tomography, lens-free on-chip gigapixel nanoscopy, detection of nanoparticles using on-chip microscopy, wide-field microscopy, and lens-free shadow-image-based point-of-care systems. Additionally, this review discusses lens-free fluorescent imaging and its dependence on structure and optical design, the advantage of using the compact lens-free driven equilibrium Fourier transform (DEFT) resolved imaging technique for on-chip tomography, the pixel super-resolved algorithm for gigapixel imaging, and lens-free technology for point-of-care applications. All these low-cost, compact, and fast-processing lens-free imaging and sensing techniques may play a crucial role, especially in environmental, pharmaceutical, biological, and clinical applications in resource-limited settings.

© 2016 Elsevier B.V. All rights reserved.

1. Introduction

In recent years, the biomedical electronics industry has undergone a remarkable evolution, bringing compact and low-cost wearable devices to consumers. This evolution also triggered the development of various advanced sensing and characterization systems, not just for academic research but also for industry. Advances in computing, such as high-performance graphics processing units (GPUs) and central processing units (CPUs), facilitated the implementation of computational methods for fast characterization and prediction, thus decreasing the complexity of these systems and their dependence on complex hardware components. This advancement in computational resources provided an opportunity to develop advanced biomedical imaging techniques such as computed tomography (CT) and X-ray imaging (Coskun and Ozcan, 2014). In addition, considerable effort using these advanced electronics and computational methods has gone into developing portable point-of-care devices. The design and development of a point-of-care diagnostic instrument that can improve the delivery of health care in resource-limited countries has been an ambitious goal. Such an instrument would improve health care by reducing the overall cost of testing and providing early diagnosis. The recent development of miniaturization technologies, such as 20-nm feature-size fabrication and nanoelectromechanical systems (NEMS), has fueled research into compact, integrated devices with very small form factors for use in these kinds of diagnostic devices. These developments in semiconductor engineering led to compact and integrated optoelectronic products, including complementary metal oxide semiconductor (CMOS) and charge-coupled device (CCD) sensors, for use in imaging. These high-resolution sensors, along with traditional optical components such as lenses, can acquire high-resolution microscopic images (Kim et al., 2012). However, the optical components are large and expensive, which in turn makes the whole system bulky and costly. Interestingly, because of their high pixel resolution, the image sensors can acquire diffraction patterns of microsamples without the use of any lenses (Gurkan et al., 2011). Recently, some research groups have tried to take advantage of these high-resolution sensors to fabricate compact, lensless microscopes (Kim et al., 2011). Significant progress has been made toward alternative imaging by integrating lens-free technology with advances in computational and image-processing algorithms (Gorocs and Ozcan, 2013).

* Correspondence to: Rm #205, Science & Technology Bldg. II, 2511 Sejong-ro, Sejong 30019, Republic of Korea. E-mail address: sseo@korea.ac.kr (S. Seo).
Fig. 1. Schematic of a lens-free imaging system showing the simplicity of the platform (Gorocs and Ozcan, 2013).

Fig. 2. Schematic of DIH. A coherent wave from a point source strikes an object and forms a scattered wave, which, in turn, forms a diffraction pattern of the object because of interference with the reference light.

The components used for this purpose, such as light-emitting diodes (LEDs) and CMOS image sensors, are low cost, lightweight, and easy to integrate (see Fig. 1), which makes the system simple. Several groups are working to develop various applications that take advantage of the lens-free imaging platform (Zhu et al., 2013a, 2013b). In this article, we discuss the development of lens-free imaging technology and its various applications.

2. Theory behind lens-free imaging and sensing

The basic principle of lens-free imaging can be explained using digital inline holography (DIH), a two-step process in which the amplitude and phase information of the wave front that originates from the object are digitally recorded and then reconstructed using computational algorithms (Gorocs and Ozcan, 2013). Digital inline holography became popular during the 1990s (Picart and Leval, 2008). Since then, its importance in applications has been demonstrated, owing to its ability to image transparent objects and to analyze images easily through mathematical morphology.

In DIH, the hologram of an object is formed by a spherical illumination wave of wavelength λ emitted from a point source about the size of the wavelength. This spherical wave splits into a scattered wave and an unscattered reference wave when it encounters an object (Fig. 2), creating interference patterns (the hologram). This event can be expressed as A_scatr(r) = A(r) − A_ref(r), where A(r) is the amplitude of the incident wave, A_scatr(r) is the amplitude of the scattered wave, and A_ref(r) is the amplitude of the unscattered reference wave. Thus, the contrast of the hologram is defined as

I(r) = |A_ref(r) + A_scatr(r)|^2 − |A_ref(r)|^2    (1)

I(r) = A_ref*(r) A_scatr(r) + A_ref(r) A_scatr*(r) + |A_scatr(r)|^2    (2)

The first two terms in Eq. (2), A_ref*(r) A_scatr(r) + A_ref(r) A_scatr*(r), constitute the holographic diffraction pattern generated by the superposition of the reference wave coming directly from the source on the scattered wave from the object, and |A_scatr(r)|^2 is the classical diffraction pattern generated by the interference of the scattered wave only (Xu et al., 2002). Because a hologram results from the interference between scattered and reference waves, the hologram of a transparent object can be created with DIH.

In general, the phase and amplitude data of the hologram are recorded digitally by image sensors such as CMOSs or CCDs. These data are used in a numerical reconstruction process with the help of a computational algorithm. Reconstruction consists of virtually illuminating the recorded holographic pattern with a reference wave and evaluating the complex field distribution. The first DIH reconstruction was reported by Goodman and Lawrence in 1967 (Gorocs and Ozcan, 2013). However, the true revolution in holography began with the use of modern digital imaging sensors (i.e., CCDs and CMOSs) by Schnars and Jüptner (1994). In general, most current numerical reconstruction methods are based on the Kirchhoff-Helmholtz transform, which can be described by the following equation:

K(r) = ∫_S d^2ξ I(ξ) exp(2πi ξ·r / (λ|ξ|))    (3)

where ξ = (x, y, L) is the 3D coordinate of a point on the screen, L is the distance of the X,Y screen from the source, and I(ξ) is the contrast of the hologram. This integration yields the reconstructed information of the object. Plotting K(r) in an X,Y plane yields the 2D holographic reconstructed image, and stacking the 2D holograms yields the 3D information. The detailed steps of DIH are illustrated in Fig. 3.
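The virtual re-illumination described above is carried out numerically. As a rough illustration only (not the specific implementation used in any of the cited works), the Python sketch below back-propagates a recorded hologram to the object plane with the angular spectrum method; the hologram array, wavelength, pixel pitch, and propagation distance are assumed example values.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex optical field over `distance` (meters) in free space."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)   # spatial frequencies along x (1/m)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)   # spatial frequencies along y (1/m)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.clip(arg, 0.0, None))
    H = np.exp(1j * kz * distance)           # free-space transfer function
    H[arg < 0] = 0.0                         # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative usage (values are assumptions, not parameters of the cited systems):
hologram = np.ones((512, 512))               # stand-in for a recorded intensity hologram
field_at_sensor = np.sqrt(hologram)          # square root of intensity as a crude field estimate
recon = angular_spectrum_propagate(field_at_sensor, wavelength=530e-9,
                                   pixel_pitch=2.2e-6, distance=-1.0e-3)
object_amplitude = np.abs(recon)             # amplitude image at the object plane
```

Because only the intensity is recorded, the twin-image and zero-order artifacts discussed next would still be present in such a naive reconstruction; phase retrieval (not shown) is what suppresses them.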
Although the reconstruction provides high-precision surface topography of objects, artifacts associated with the acquisition process, such as twin images due to the zero-order term and the complex conjugate of the object wave, make reconstruction challenging, because the image sensor measures only the intensity profile of these waves. These kinds of artifacts can be eliminated with phase retrieval algorithms, but such algorithms make the system complex and place more demand on computational resources; the effort is worthwhile for applications in which high-contrast topography of an object is required. Fortunately, common biological applications, such as gathering information on the concentration or population of cells, determining cell size, and differentiating cell types, do not require high-contrast topography. Therefore, research on lens-free shadow imaging with partially coherent planar wave illumination is ongoing. In this method, the sample-to-sensor distance is smaller than the source-to-sample distance; Fig. 4 presents a schematic of this approach. Because of this distance relationship, the reference wave can be considered a plane wave.
This eliminates the aliasing effects associated with the spherical wave and relaxes the spatial and temporal coherence requirements on the illumination, because the reduced difference between the paths of the scattered and reference waves permits the use of a common semicoherent light source, such as an LED conjugated with a pinhole. Because the sample-to-sensor distance is much smaller than the source-to-sample distance, the cross interference among the scattered waves is reduced significantly, which, in turn, leads to a reduction in the noise and in the spatial magnification of the image. This reduced spatial magnification allows the system to use the full active area of the sensor more efficiently; however, it also limits the spatial resolution because of the finite pixel size of the detector (Bishara et al., 2010, 2011a; Mudanyali et al., 2010; Seo et al., 2010; Isikman et al., 2011a; Greenbaum et al., 2012a, 2012b). The limited pixel size can be overcome by implementing the super-resolution method, in which several pictures taken by subpixel shifting of the sample are stitched together using a computational algorithm (Bishara et al., 2010). However, this method is time-consuming, and assembling the images places a great demand on computational resources. Therefore, many research groups are working on cost-effective and compact systems that can directly process the shadow images. Recently, Roy et al. demonstrated the application of such a system for obtaining blood cell counts, differentiating white blood cells, and determining the size of micro-objects by directly characterizing the shadow parameters (Roy et al., 2014, 2015). An overall review of lens-free imaging was provided by Greenbaum et al., who discussed the various categories of lens-free imaging, especially contact-mode shadow imaging and diffraction-based lens-free imaging. They concisely discussed the advantages, limitations, and future challenges of these techniques, in particular the limited spatial resolution and the overlapping of signatures at high sample density. They also discussed the need for standardization of reconstruction methods and the limitations of on-chip lens-free fluorescence imaging, namely the limited spatial and temporal coherence of the fluorescence light and the effect of the thickness of the filters (Greenbaum et al., 2012b). However, given the growing range of applications of lens-free imaging, a more detailed discussion of some of its specific applications is desirable. In the following sections, we discuss the development of lens-free imaging and its various applications.

Fig. 3. Flowchart of the formation of images using DIH (Picart and Leval, 2008).

Fig. 4. Schematic of lens-free shadow imaging with partially coherent planar wave illumination. (a) Experimental setup used by Roy et al. (2014, 2015). (b) Generation of the diffraction pattern of a micro-object as described by Roy et al. (2014, 2015).

3. Applications of lens-free imaging and sensing

3.1. Lens-free on-chip platforms

In this section, we discuss the development of lens-free imaging in an on-chip imaging system. Although lens-free DIH was reported in the 1990s, the first on-chip application came in 2008, when it was used for automated detection and characterization of microparticles on a microfluidic chip integrated with a lens-free imaging system (Ozcan and Demirci, 2008). The system, which consisted of a CCD image sensor with an active area of 37.25 mm × 25.70 mm, was very compact and had a wider field of view (FOV) than a conventional imaging system. The working principle of this system is similar to that of lens-free imaging with a partially coherent planar wave, as discussed in the previous section.
Fig. 5. (a) Experimental apparatus (under blue light) and (b) schematic diagram of the holographic LUCAS (Lensless Ultra-wide-field Cell monitoring Array platform based on Shadow imaging) platform.

The goal was a compact, optimized alternative imaging system comparable to existing systems with respect to the information provided. To achieve this goal, various parameters, such as the dependence of the system on the wavelength of illumination, the effect of variation in object size, and the variation in the error rate with object concentration, were optimized. This optimized system was used to automatically detect biological cells such as NIH-3T3 cells, AML-12 hepatocytes, and CD4 T lymphocytes.

Seo et al. further improved this on-chip imaging by introducing the ability to differentiate cell types with lens-free imaging (Seo et al., 2009). The system had a higher-resolution sensor with smaller pixels (2.2 μm), and the sample plane was illuminated with tunable monochromatic light, as shown in Fig. 5. The shadow images from this system were characterized automatically by a custom algorithm developed based on the parameters of the diffraction patterns of the samples, including the digital signal-to-noise ratio (SNR), shadow radius (R_rms), and correlation index (Corr), which are defined as follows.

3.1.1. Signal-to-noise ratio

SNR = (max(I) − μ_b) / σ_b

where I is the intensity of the light on the sensor array and μ_b and σ_b are the mean and standard deviation of the background noise region.

3.1.2. Shadow radius (R_rms)

R_rms = [ Σ_{x=1}^{W} (x − x̄)^2 f(x, y = y0) / Σ_{x=1}^{W} f(x, y = y0) ]^{1/2}

where W is the maximum number of pixels in the region of interest (ROI) and

x̄ = Σ_{x=1}^{W} x f(x, y = y0) / Σ_{x=1}^{W} f(x, y = y0)

Here, (x, y) is the index of the image pixel and f(x, y = y0) is the detected intensity profile of a line represented by y = y0.

3.1.3. Correlation index

Dev = Σ_{(x, y) ∈ ROI} |f(x, y) − L(x, y)|

where L(x, y) is the mean library image formed by averaging >20 samples. Therefore,

Corr = 1 − (Dev − Dev_MIN) / (Dev_MAX − Dev_MIN)

where Dev_MAX and Dev_MIN are the maximum and minimum deviations of f(x, y) calculated using the individual library images that make up L(x, y).
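To make these three shadow parameters concrete, the sketch below evaluates them for a single cropped shadow, assuming the region of interest, a background patch, and a set of library images are already available as NumPy arrays. It is an illustrative reading of the definitions above (the deviation is taken as a sum of absolute differences), not the authors' released code.

```python
import numpy as np

def shadow_parameters(roi, background, library_images, y0):
    """Compute SNR, R_rms, and Corr for one shadow-image ROI (all inputs are 2D arrays)."""
    # Digital signal-to-noise ratio: peak signal relative to the background statistics.
    snr = (roi.max() - background.mean()) / background.std()

    # Shadow radius: RMS spread of the intensity profile along the line y = y0.
    profile = roi[y0, :].astype(float)
    x = np.arange(profile.size)
    x_bar = np.sum(x * profile) / np.sum(profile)
    r_rms = np.sqrt(np.sum((x - x_bar) ** 2 * profile) / np.sum(profile))

    # Correlation index against a mean "library" shadow averaged from many samples.
    library = np.asarray(library_images, dtype=float)          # shape (n_samples, H, W)
    mean_library = library.mean(axis=0)
    dev = np.sum(np.abs(roi - mean_library))                   # deviation of this ROI
    devs = np.array([np.sum(np.abs(roi - lib)) for lib in library])
    corr = 1.0 - (dev - devs.min()) / (devs.max() - devs.min())

    return snr, r_rms, corr
```

Different cell types then tend to occupy different regions of the (SNR, R_rms, Corr) space, which is what the classification described next relies on.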
The results obtained with this algorithm show that different signatures can be obtained for different types of cells, as shown in Fig. 6: the diffraction patterns of red blood cells (RBCs), yeast cells, and polystyrene microbeads exhibit significant signal differences. However, cells with similar optical properties but different functions are very difficult to differentiate with the lens-free imaging system, a problem recently addressed by Wei et al. (2013). They implemented an optical modulation method based on the plasmonic resonance effect produced by metallic nanoparticles (Schultz, 2003). Plasmonic nanoparticles are used as contrast agents because of their unique optical modulation properties. Wei et al. used gold (Au) and silver (Ag) nanoparticles as contrast agents to label CD4 and CD8 cells. This produced distinguishable diffraction signatures in the lens-free image, which could then be characterized using principal component analysis (PCA) and multiclass support vector machine (SVM) analysis on the multispectral data of the lens-free images. Their system consisted of a CMOS image sensor with an active area of 24 mm² and a monochromatic light source. Images of all samples were taken with this setup in the wavelength range of 480-900 nm in 10-nm steps. Fig. 7(a) shows a schematic of the system.

The lens-free system has also been used to study the movement of micro-objects. Su et al. demonstrated the feasibility of using the lens-free imaging system for analyzing the activity of human sperm (Su et al., 2010a), a major parameter in the evaluation of male fertility. They used lens-free technology to develop a portable sperm activity monitoring system that captured shadow images of sperm at 20 frames per second. The images were normalized to their mean intensity and digitally summed to quantify the immotile sperm. To identify the motion of motile sperm, each shadow image was subtracted from its subsequent frame, and a reconstruction algorithm was then used to reconstruct the subtracted images. The distance between the dark and bright spots in the reconstructed image gives the displacement of the sperm, indicating movement (Fig. 8). The speed and trajectory of the motile sperm can be determined from this relative distance.
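A minimal sketch of the frame arithmetic behind this workflow is given below: immotile sperm are emphasized by summing the normalized frames, and candidate motile sperm by differencing successive frames. The array names and the frame interval are assumptions for illustration, and the holographic reconstruction of each difference image (the step that yields the paired dark and bright spots) is not shown.

```python
import numpy as np

def motility_maps(frames):
    """frames: sequence of lens-free shadow images captured at a fixed frame rate."""
    stack = np.asarray(frames, dtype=float)
    # Normalize each frame to its mean intensity to suppress illumination drift.
    normalized = stack / stack.mean(axis=(1, 2), keepdims=True)

    # Summing the normalized frames reinforces stationary (immotile) sperm shadows.
    immotile_map = normalized.sum(axis=0)

    # Differencing consecutive frames cancels stationary objects and keeps movers;
    # each difference image would then be reconstructed holographically to locate
    # the paired dark (previous position) and bright (new position) spots.
    motile_diffs = normalized[1:] - normalized[:-1]
    return immotile_map, motile_diffs

# With the spot pair located in a reconstructed difference image, the speed follows as
# speed = |displacement| / frame_interval, e.g., frame_interval = 1/20 s at 20 frames per second.
```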
Fig. 6. (a) and (b) illustrate the difference in performance between conventional LUCAS (with incoherent light) and holographic LUCAS (with coherent light). A high-resolution CMOS sensor array (2.2-μm pixel) was used in each image. (c) and (d) show the cross-sectional intensity profiles of the various micro-objects imaged in (a) and (b), respectively. Due to the increased spatial coherence, the holographic diffraction pattern of each micro-object exhibits more textural information, with unique oscillating features that contain phase information of the cell or microparticle. This phase information, usually lost under incoherent illumination, provides magnified images of the holographic diffraction signatures of 10-μm beads, yeast cells (S. pombe), and RBCs. (e)-(j) Microscope images of the same FOV, acquired with a 10x objective lens, are shown next to the holographic LUCAS images of the three microparticles for comparison (Seo et al., 2009).

In addition, Cui et al. (2008) demonstrated an on-chip optofluidic microscope based on lens-free imaging technology, which promises point-of-care analysis by facilitating large-scale scanning of cell samples through microfluidics (Cui et al., 2008). Zhang et al. (2011) likewise demonstrated an on-chip sperm motion characterization system, which enables on-chip monitoring of sperm health over a wide field of view and thus provides statistically more accurate results than low-throughput conventional devices (Zhang et al., 2011).

3.2. Lens-free fluorescence microscopy

Fluorescent imaging is an important method for characterizing cells at the molecular level, and a fluorescent imaging system with a large FOV is desirable for the best characterization. Because a conventional fluorescent imaging system has a small FOV, it was expected that a lens-free imaging system integrated with a fluorescent imaging system would provide better cellular characterization. Recently, Coskun et al. (2010) demonstrated such a system. They placed a prism on top of the sample plane to facilitate total internal reflection of the excitation light for the fluorescent samples. The emitted fluorescent light is filtered by an absorption filter and then recorded with a CCD or CMOS sensor. Fig. 9 shows the schematic of this experimental setup. The advantage of this system is that it can take fluorescent and corresponding holographic images simultaneously over an FOV of 2.5 cm × 3.5 cm. The feasibility of the system was tested by imaging a heterogeneous sample of 10-μm fluorescent particles and 20-30-μm non-fluorescent particles. The 470-nm wavelength excitation light was introduced through the side walls of the prism. A detected image is shown in Fig. 10(a); Fig. 10(b) is the same image with improved resolution after reconstruction using the accelerated Lucy-Richardson algorithm.
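For reference, a bare-bones Richardson-Lucy iteration of the kind used to sharpen such raw lens-free fluorescent images is sketched below, written directly in NumPy/SciPy rather than calling the accelerated variant used by the authors; the point-spread function (PSF) and iteration count are placeholders.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=100):
    """Basic (non-accelerated) Richardson-Lucy deconvolution of a blurred intensity image."""
    image = image.astype(float)
    estimate = np.full_like(image, image.mean())        # flat initial estimate
    psf_mirror = psf[::-1, ::-1]                         # flipped PSF for the correction step
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)                # guard against division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```

In practice the PSF would be measured from isolated fluorescent particles at the relevant object-to-sensor distance, and more iterations trade computation time for sharper spots, as the 100- and 600-iteration panels of Fig. 10 illustrate.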
Coskun et al. (2010) also demonstrated dual imaging of microparticles and a fluorescent sample by imaging a heterogeneous mixture of fluorescent and nonfluorescent particles using the vertical transmission imaging configuration depicted in Fig. 4. The results in Fig. 11 show that the system can capture the fluorescence and hologram images simultaneously over an 8-cm² FOV, making the system a unique portable fluorescence detector.

The spatial resolution of this platform was improved to about 10 μm with the implementation of a compressive sensing algorithm (Coskun et al., 2010). The system used for this purpose was almost identical to that described above, except for the introduction of a fiber-optic faceplate between the coverglass and the absorption filter, as shown in Fig. 12. This lens-free system significantly enlarges and distorts the fluorescence image because of diffraction and the introduction of the faceplate, but the distortion is eliminated by using a compressive sensing algorithm, which is described as follows.

The fluorescent particle/cell distribution within the sample volume is c = [c_1, c_2, c_3, ..., c_N], where N is the number of voxels and d is the physical grid size of c. In addition, c is sparse: only S coefficients of c are nonzero, such that S << N. The intensity distribution of light impinging on the detector array is determined by c, and the intensity just above the detector plane is expressed as

f(x, y) = Σ_{i=1}^{N} c_i ψ_i(x, y)

where ψ_i(x, y) is the intensity of the 2D wave from the ith voxel immediately before the detector.
Fig. 7. (a) Schematic illustration of a multispectral lens-free in-line holographic system. (b) Full FOV of a lens-free hologram of Au-CD4 cells excited at 560 nm; the center region of the white box is shown in (g). (c) and (f) Dark-field scattering microscopy of CD4 and Au-CD4 cells, respectively, obtained with an objective lens. (d) and (g) are lens-free super-resolved (SR) holograms, and (e) and (h) are reconstructed amplitude images of the same regions of interest (ROIs) as the dark-field scattering images in (c) and (f). Scale bar: 25 μm for (c), (d), (e); 10 μm for (f), (g), (h) (Wei et al., 2013).

Fig. 8. (a) Digitally subtracted lens-free hologram of three moving sperm, generated from two successive frames (500 ms apart). (b) Microscopic image, digitally reconstructed from the lens-free differential hologram shown in (a), showing the positions of the three sperm in the two successive frames, where white spots indicate the final positions and black spots indicate the starting positions. S1, S2, and S3 are the displacement vectors of these sperm (Su et al., 2010a).

ψ_i(x, y) can be measured for each object plane from isolated fluorescent particles. Without a faceplate, ψ_i(x, y) for a given object plane becomes space invariant and can be written as ψ_i(x, y) = p(x − x_i, y − y_i), where p(x, y) is the incoherent point spread function (PSF) of the system for a given object layer and (x_i, y_i) is the physical location of c_i. Therefore, the general equation for lens-free imaging with or without a faceplate can be written as

f(x, y) = Σ_{i=1}^{N} c_i p(x − x_i, y − y_i)

The intensity sampled by the detector is

I_m = ∫∫ f(x, y) φ(x − x_m, y − y_m) dx dy

where φ(x − x_m, y − y_m) represents the sampling or measurement basis. Here, m ∈ {1, ..., M} denotes the mth pixel of the detector array, with center coordinates (x_m, y_m), and φ(x, y) is the pixel function. For better decoding, there should be good spatial correlation between φ_m and ψ_i for all possible m ∈ {1, ..., M} and i ∈ {1, ..., N} pairs. This spatial correlation provides the incoherence between the sampling and representation bases, which in turn increases the probability of accurately reconstructing c from M measurements.
The coherence is obtained by calculating the correlation between the pixel function φ(x, y) and the PSF p(x, y); the smaller this correlation, the better the compressive decoding. In essence, compressive sampling digitally counters the diffraction-induced spreading described by f(x, y) = Σ_i c_i ψ_i(x, y) and f(x, y) = Σ_i c_i p(x − x_i, y − y_i) by decoding the lens-free image pixels expressed by I_m = ∫∫ f(x, y) φ(x − x_m, y − y_m) dx dy. Fig. 13 presents the results of the compressive decoding method and of the conventional Lucy-Richardson deconvolution method for comparison. The former technique reportedly improved the spatial resolution of the lens-free fluorescent imaging system to about 10 μm.
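As a rough illustration of this decoding step, the sketch below recovers a sparse emitter vector c from lens-free pixel measurements with a non-negative iterative shrinkage/thresholding loop. It assumes the measurement matrix A, whose columns are the sampled lens-free signatures ψ_i of each candidate emitter location, has already been built; the random matrix, step size, and threshold here are illustrative stand-ins, not the solver used in the cited work.

```python
import numpy as np

def sparse_decode(A, y, n_iter=500, threshold=1e-3):
    """Recover a sparse, non-negative emitter distribution c from measurements y = A @ c."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # safe gradient step for ||A c - y||^2
    c = np.zeros(A.shape[1])
    for _ in range(n_iter):
        c = c - step * (A.T @ (A @ c - y))      # gradient step on the data-fit term
        c = np.maximum(c - threshold, 0.0)      # soft threshold + non-negativity (sparsity prior)
    return c

# Illustrative usage with a random stand-in for the measured signature matrix:
rng = np.random.default_rng(0)
A = rng.random((256, 1024))                     # 256 detector pixels, 1024 candidate voxels
c_true = np.zeros(1024)
c_true[[10, 500, 900]] = 1.0                    # three fluorescent emitters
y = A @ c_true
c_hat = sparse_decode(A, y)                     # sparse estimate of the emitter distribution
```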

Fig. 9. Schematic diagram of the lens-free on-chip fluorescent imaging platform over a large FOV of 2.5 cm × 3.5 cm. Fluorescent excitation is achieved via illumination through the side of a rhomboid prism (if more convenient, a different prism geometry could be used). A simple LED or a Xenon lamp tuned by a monochromator is used for excitation. Lens-free holographic imaging of the same FOV is achieved with vertical incoherent illumination (another LED) through the flat top of the prism. Drawing is not to scale. Dimensions: prism height p, 17 mm; active area of the imager, w1 × w2, 25 mm × 35 mm; depth of the solution reservoir, k, ~10-100 μm; distance of the vertical source, h, ~5-10 cm; distance of the fluorescent excitation source, f, ~1-2 cm. Not shown here, an index-matching gel can be used to avoid total internal reflection and undesired scattering at the bottom facet of the prism. In addition, to better control the vertical distance between the sample microchannel and the active region of the sensor, the protective coverglass of the chip was removed. The thin absorption filter above the CCD/CMOS acts as a protective layer in this case, isolating the active region of the sensor chip from the microchannels (Coskun et al., 2010).

Despite the simplicity of lens-free imaging, it does have some limitations, including limited resolution due to the limited temporal coherence of the incident light. In most lens-free imaging systems, the light sources are either partially coherent (in the case of holography) or incoherent (particularly in the case of fluorescent objects), implemented mainly with LEDs, which have limited temporal coherence. In addition, in fluorescent imaging, the light from fluorescent particles spatially overlaps, yet most traditional reconstruction algorithms are single-wavelength procedures, which limits the usable bandwidth of the light emitted by the fluorescent particles. Sencan et al. (2014) addressed this problem by introducing a spectral demultiplexing method in which a polychromatic phase retrieval algorithm is used to demultiplex the superimposed spectral components of the holograms generated with broadband LED illumination. For this purpose, they used super-resolved polychromatic intensity measurements. The multicolor lens-free fluorescent signatures are simultaneously demultiplexed using a sparse recovery algorithm with the PSF at each wavelength and the spectral response curve of the sensor. The measured fluorescent intensities from multiple isolated particles are interpolated, aligned, and averaged. The intensity profiles depend on the structure and optical design of the sensor pixel, the filter response, and the distance between the object and detector planes. Sencan et al. demonstrated the demultiplexing method by using it for on-chip lens-free fluorescent imaging with various RGB fluorescent emitters; the results are shown in Fig. 14.

3.3. Lens-free tomography

Structural information on biological cells and microparticles provides deep insight into cell morphology. Several approaches, such as confocal microscopy and optical coherence tomography, are available for obtaining 3D information on microparticles. However, these techniques rely on bulky hardware and are time-consuming, whereas the growth of biological research calls for a structural analysis system that is fast and compact. Recently, lens-free technology has been studied as a potential alternative for 3D imaging (Fig. 15). Lens-free technology makes the system simpler, more compact, and lightweight. The working principle of this technology for 3D imaging is similar to that of conventional lens-free technology, with the addition of multiple-angle illumination. The reconstructed images from this system provide 3D information on micro-objects. Recently, Su et al. demonstrated multiangle lensless digital holography for depth-resolved imaging on a chip (Su et al., 2010b).

In the multiangle illumination method, the height of a micro-object is determined by calculating the lateral shift of the hologram of the object as a function of the illumination angle. The advantage of the compact lens-free driven equilibrium Fourier transform (DEFT) resolved imaging technique is that its FOV is wider than that of the conventional DEFT resolved imaging technique. The compactness of this system facilitates imaging on miniaturized platforms such as lab-on-a-chip devices. Recently, Bishara et al. demonstrated the integration of lens-free tomography with a microfluidic channel (Bishara et al., 2011b). Isikman et al., who presented a summary of a field-portable benchtop lens-free optical tomographic microscope, further explained this integration (Isikman et al., 2011b). They also demonstrated sectional imaging of a large volume (20 mm³) with a spatial resolution of <7 μm (Isikman et al., 2011a), providing high-throughput images with a long depth of field. These systems provide better 3D information on the microscopic world at minimal cost. The main advantage of this kind of lens-free tomography system is that the spatial resolution can be improved without compromising the FOV, a trade-off encountered with lens-based microscopy systems.
3.4. Lens-free gigapixel nanoscopy

Nanoscale imaging with an optical imaging system has long been a challenge because of the diffraction limit, which research groups are trying to overcome. In this section, we discuss some of the new approaches developed to address this issue that are based on lens-free imaging.

Gigapixel (10^9-pixel) images can be obtained with the lens-free technique by capturing multiple images of the holograms produced by the objects while slightly shifting the image plane; in general, the shift is less than the size of a pixel. The images are then fused into a single high-resolution image using a pixel super-resolved algorithm and an iterative gradient algorithm (McLeod et al., 2013). The super-resolved holograms are then reconstructed via back-propagation to the object plane through an angular spectrum approach. McLeod et al. presented a study of recent developments in lens-free imaging toward the goal of high-resolution imaging (McLeod et al., 2013).
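A much-simplified sketch of the pixel super-resolution idea is given below: several low-resolution holograms, each displaced by a known subpixel amount, are placed onto a finer grid and averaged. The cited work instead solves this fusion as an iterative gradient-based optimization; the shift values, upsampling factor, and array names here are assumptions for illustration.

```python
import numpy as np

def shift_and_add(frames, shifts_px, factor=4):
    """Fuse low-resolution frames with known subpixel shifts onto a finer pixel grid.

    frames: list of 2D arrays of identical shape.
    shifts_px: list of (dy, dx) subpixel shifts, in low-resolution pixel units.
    factor: upsampling factor of the high-resolution grid.
    """
    ny, nx = frames[0].shape
    accum = np.zeros((ny * factor, nx * factor))
    counts = np.zeros_like(accum)
    for frame, (dy, dx) in zip(frames, shifts_px):
        # Map each frame's subpixel shift to the nearest high-resolution grid offset.
        oy = int(round(dy * factor)) % factor
        ox = int(round(dx * factor)) % factor
        accum[oy::factor, ox::factor] += frame
        counts[oy::factor, ox::factor] += 1.0
    filled = counts > 0
    accum[filled] /= counts[filled]
    # Grid positions never sampled stay zero here; in practice they are interpolated or
    # filled in by the iterative optimization. The result is a super-resolved hologram,
    # which is then back-propagated (Section 2) to the object plane.
    return accum

# Illustrative usage with four captures shifted by quarter-pixel steps:
rng = np.random.default_rng(1)
frames = [rng.random((64, 64)) for _ in range(4)]   # stand-ins for shifted lens-free captures
shifts = [(0.0, 0.0), (0.0, 0.25), (0.25, 0.0), (0.25, 0.25)]
hi_res_hologram = shift_and_add(frames, shifts, factor=4)
```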
Fig. 10. Lens-free fluorescent imaging of 10-μm fluorescent beads over an FOV of >8 cm² (excitation/emission: 495 nm/505 nm). (a) Raw lens-free image pumped by an LED and acquired with an integration time of 1 s. (b) Digitally deconvolved fluorescent image of the same FOV as (a). (c)-(e) From left to right: magnified image of the indicated area in (a), the result of the deconvolution process after 100 iterations, and after 600 iterations. The letters f, g, and h and dashed lines in (c) and (d) refer to panels (f)-(h), which illustrate the different cross sections of the raw and deconvolved fluorescent images, demonstrating the ~5x improvement in fluorescent spot size. Through the iterative deconvolution process, two particles that almost completely overlap in the raw lens-free image [see, for instance, (c) and (d)] are now separated, as shown to the right of (c) and (d) and in (f)-(h). The pixel size of the CCD in this experiment was 9 μm. The resolution of the deconvolved lens-free fluorescent image could have been further improved with the use of a smaller pixel (Coskun et al., 2010).

In their review, McLeod et al. discussed methods to improve the effective numerical aperture (NA) of lens-free imaging to reach nanoscale resolution, including iterative error-reduction algorithms, smaller pixel sizes of the image sensor, and self-assembly. Self-assembly is an effective way to overcome the resolution limitation of optical microscopes and achieve nanoresolution imaging. In this technique, micro- or nanolenses are formed as individually dispersed components naturally adopt a configuration over the target nano-objects. This greatly increases the scattering cross section of the target objects, which, in turn, increases the amplitude of the waves scattered from the objects. McLeod and Ozcan discussed the self-assembly process and its integration with lens-free imaging technology in detail (McLeod and Ozcan, 2014). This combination offers the advantages of the wide FOV of lens-free imaging and the improved resolution of self-assembly technology.

Greenbaum et al. investigated super-resolved on-chip color imaging in which they implemented YUV color-space averaging and Dijkstra's shortest-path computational methods (Greenbaum et al., 2013). Mudanyali et al. demonstrated a high-throughput, on-chip detection scheme that uses biocompatible wetting films to self-assemble aspheric liquid nanolenses around individual nanoparticles to enhance the contrast between the scattered and background light (Mudanyali et al., 2013). In this work, they used the pixel super-resolution on-chip imaging technique to capture the nanolens-enhanced light signals of the nanoparticles, the nanolenses being formed by the dispersed components over the individual nanoparticles as stated above. They demonstrated the feasibility of this scheme by detecting H1N1 viral particles, which indicates the feasibility of the lens-free system for detecting nanoparticles. Sobieranski et al. (2015) also demonstrated pixel super-resolution lens-free imaging technology for detecting sperm and platelets, which have dimensions of a few microns (Sobieranski et al., 2015). This work further showed the feasibility of on-chip gigapixel nanoscopy for the characterization of sub-micron cells or particles.

3.5. Lens-free imaging and sensing systems for point-of-care diagnostics

The lens-free imaging system is compact, inexpensive, and has high throughput, all of which are suitable characteristics for a point-of-care system. In this section, we discuss recent developments in lens-free imaging technology with respect to point-of-care systems.
Fig. 11. Demonstration of combined lens-free holography and on-chip fluorescent imaging. Left column: raw lens-free fluorescent images. Middle column: results of digital deconvolution. Right column: results of lens-free holographic imaging of the same FOV, showing the shadow signatures of all the particles, both fluorescent (F) and nonfluorescent (NF). The images in the other two columns show only the fluorescent signatures (Coskun et al., 2010).

Gurkan et al. presented a study of a lens-free imaging system as a point-of-care system (Gurkan et al., 2011), and Zhu et al. published a critical review of optical imaging technology for point-of-care diagnostics, in which they reviewed the development of the lens-free imaging system as a point-of-care system (Zhu et al., 2013a). Portability, low power consumption, functionality in heat and humidity, and continuous monitoring are the major challenges in realizing point-of-care devices. Some groups have built point-of-care systems with the goal of incorporating all these features by integrating microfluidics technology with the lens-free imaging platform. Moon et al. obtained the CD4 T-lymphocyte count for HIV diagnosis with such a point-of-care system (Moon et al., 2009). In addition, Biener et al. combined reflection and transmission imaging by integrating lens-free imaging, i.e., transmission imaging, with an optical imaging system, i.e., reflection microscopy (see Fig. 16). This setup provides spatial and phase information of samples simultaneously (Biener et al., 2011). In parallel, studies on retrieving phase and spatial information using computational methods are underway; Mudanyali et al. demonstrated the reconstruction of lensless incoherent images from a compact and lightweight microscope (Mudanyali et al., 2010).

Many groups are using the self-assembly approach to overcome the optical limitations of the lens-free imaging system. In the self-assembly process, the individually dispersed components naturally adopt the desired configuration and form micro- and nanolenses that enhance the signal, yielding a spatial resolution of <100 nm (McLeod and Ozcan, 2014).

The cell viability test is a major component in the diagnosis of many cell-borne diseases, and measuring cell viability in a point-of-care system has been a major challenge because of the complexity and reagent dependence of existing methods. Recently, Jin et al. successfully demonstrated a reagent-free method for characterizing cell viability using a lens-free imaging platform (Jin et al., 2012). They monitored the continuous viability of human alveolar epithelial A549 cells without the use of a labeling reagent. By integrating video microscopy with lens-free imaging, this work improved upon that of Kesavan et al., who used such a system to continuously monitor cell proliferation inside a standard incubator (Kesavan et al., 2014). However, video processing is not well suited to point-of-care systems.
Fig. 12. (a) Schematic diagram of the lens-free on-chip fluorescent imaging platform with unit magnification, such that the imaging FOV equals the entire active area of the sensor array (i.e., >8 cm²). Total internal reflection occurs at the glass-air interface at the bottom facet of the coverglass. To avoid detection of scattered pump photons, a plastic absorption filter is introduced after the faceplate. Typical dimensions (see the caption of Fig. 9 for a description of the parameters): w1 × w2, 25 mm × 35 mm; p, 1.7 cm; k, ~10-100 μm; f, 1-2 cm. (b) Microscope image of the faceplate. The numerical aperture of each fiber is ~0.3 (Coskun et al., 2010).

Fig. 13. Top row: raw lens-free fluorescent images of different pairs of 10-μm-diameter particles. As the particles get closer to each other, their signatures in the raw lens-free image become indistinguishable to the naked eye. The inset images (bottom right corner) are transmission microscope images of the same particles and were used to calculate the center-to-center distance (g) for comparison purposes. Middle row: results of compressive decoding of the lens-free images in the top row. g_CS is the center-to-center distance of the resolved fluorescent particles, where CS denotes compressive sampling. Bottom row: results of deconvolution of the lens-free images in the top row using the Lucy-Richardson algorithm. g_LR is the center-to-center distance of the resolved fluorescent particles, where LR denotes Lucy-Richardson (Coskun et al., 2010).

The most common point-of-care diagnostic test is the complete blood cell count. In general, a complete blood cell count is performed using a sophisticated hematology analyzer or a conventional optical microscope, both of which are bulky and require a trained user to operate them and analyze the results. Because a point-of-care system needs to be quick and easy to use, Roy et al. developed a custom telemedicine system that can perform a complete blood cell count on human whole blood.
Fig. 14. Lens-free simultaneous multicolor fluorescent imaging on a chip. (a1), (b1), (c5), and (d5) Raw lens-free fluorescence signatures of microbeads cropped from a large FOV of 2.42 cm². These raw images were decoded to retrieve color and spatial distribution information. (a3), (b3), (c6), (c7), (c8), (d6), (d7), and (d8) Decoding results are verified against bright-field microscope images [insets in (a4) and (b4)] and fluorescent microscope images [insets in (c3), (c4), (d3), and (d4)] of the same objects. (a2), (b2), (c1), and (d1) Debayered lens-free images illustrate the limitations of the color sensor for lens-free fluorescent imaging without the demultiplexing method. The arrows highlight two regions where the method effectively resolved the color and the spatial overlap of lens-free fluorescent signatures.
Fig. 15. Schematic diagram illustrating the principles of multiangle lens-free holographic imaging. For each illumination angle, a spatially incoherent source, such as an LED, is filtered by a large aperture (~0.05-0.1-mm diameter) located ~6 cm from the object plane. Unlike conventional inline holography, the sample plane is much closer to the detector plane, with a vertical distance of ~1 mm, so that the entire active area of the sensor becomes the imaging FOV. (a) The shadow of each cell shifts laterally on the sensor plane as a function of the illumination angle of the incoherent source, encoding its axial position. (b) The shadows of the cells acquired at different illumination angles are matched to their sources by forming imaginary rays between each shadow and the corresponding source. The shift of the centroid position as a function of illumination angle was used to determine the depth of the cells on the chip. This technique determined the depth of a cell with an accuracy of ~300-400 nm (Su et al., 2010b).

Fig. 16. Combination of transmission lens-free imaging and optical imaging via reflection microscopy.

They also used the device to calculate the subpopulations of white blood cells in the samples, i.e., the concentrations of lymphocytes, monocytes, and neutrophils, on the basis of the characterization of the custom-developed diffraction parameters, as shown in Fig. 17 (Roy et al., 2014). The work of Zhu et al. similarly characterized blood cells using a smartphone-powered point-of-care system (Zhu et al., 2013b).

Automated determination of the size of cells and other biological substances is a major criterion for the diagnosis of many diseases. Until now, however, sophisticated instruments such as the Coulter counter and conventional optical microscopes have been used for measuring cell size, and integrating these functions with a point-of-care system has been a major challenge. Recently, Roy et al. demonstrated a low-cost lens-free telemedicine system that can automatically detect and characterize the size of cells on the basis of their custom-developed shadow parameters, as shown in Fig. 18 (Roy et al., 2015). The compactness and the ability to perform reagent-free detection make this system a suitable candidate for a point-of-care diagnostic instrument.
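One of the shadow parameters used for this size characterization is the peak-to-peak distance (PPD) detailed in Fig. 18. The sketch below extracts a PPD from a shadow's central intensity profile under the simple assumption that the profile is centered on the shadow, and maps it to a diameter through a hypothetical linear calibration; both the extraction rule and the calibration coefficients are illustrative stand-ins for the empirical relationship established in the cited work.

```python
import numpy as np

def peak_to_peak_distance(profile, pixel_pitch_um):
    """Distance (in micrometers) between the dominant peaks in the two halves of a shadow profile."""
    profile = np.asarray(profile, dtype=float)
    mid = profile.size // 2
    left_peak = int(np.argmax(profile[:mid]))          # brightest point in the left half
    right_peak = mid + int(np.argmax(profile[mid:]))   # brightest point in the right half
    return (right_peak - left_peak) * pixel_pitch_um

def diameter_from_ppd(ppd_um, slope=0.9, offset=1.5):
    """Hypothetical linear calibration from PPD to particle diameter (placeholder coefficients)."""
    return slope * ppd_um + offset
```

In the cited system, such a calibration is built from beads of known size (as in Fig. 18(e)) and then applied to unknown cells.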
4. Conclusions and perspectives

In this review, we summarized the theory, basic design, and applications of the lens-free imaging system.
Fig. 17. Illustration of the custom-developed diffraction parameters (shadow parameters). (a) and (b) Shadow images of 10- and 20-μm beads. (c) and (d) Intensity profiles from (a) and (b), respectively. (e) 3D plot of the diffraction parameters of heterogeneous samples of 10- and 20-μm beads.

Fig. 18. Detailed evaluation of the mathematical relationship between peak-to-peak distance (PPD) and the actual size of a microparticle. (a) Optical micrograph of a heterogeneous sample of beads. (b) Digitally magnified lens-free shadow image of the sample in (a). (c) Intensity profile of the 21-μm-diameter bead. (d) Intensity profile of the 10-μm-diameter bead. (e) PPD vs. actual size of 222 beads. (f) Statistical size distribution of 222 beads calculated from the PPD.

With respect to theory and design, the lens-free imaging system can be divided into three categories: DIH, lens-free imaging with partially coherent light, and on-chip lens-free imaging. The complexity of DIH and its dependence on phase and amplitude data and on advanced computational methods have been partly eliminated by the introduction of lens-free imaging with partially coherent light. These simplified systems are used for quantification and size determination.
A range of applications has been developed from these techniques. The integration of on-chip fluorescent imaging with traditional optical microscopy provides better insight into samples by facilitating cross inspection over a large FOV. Spatial resolution can be improved by implementing a compressive sensing algorithm, and the effective bandwidth of the light emitted by fluorescent particles can be improved by implementing the demultiplexing method based on the polychromatic phase retrieval algorithm. On-chip tomography can provide depth-resolved imaging of microparticles over a wide FOV; this compact approach is suitable for integration with existing on-chip analysis tools such as lab-on-a-chip devices. A super-resolved lens-free image of microparticles can be obtained by capturing multiple hologram images of the objects with subpixel shifting of the image plane; these images are then fused into a single high-resolution image using a pixel super-resolved algorithm and an iterative gradient algorithm, and the super-resolved hologram is reconstructed by back-propagation to the object plane with an angular spectrum approach. The compactness and low cost of the lens-free system make it suitable for point-of-care diagnostic instruments. Integration of automated shadow-image feature extraction and characterization algorithms with the lens-free system creates an alternative system that performs complete blood cell counts and size characterization, and a lens-free system integrated with sophisticated microfluidics can be used for continuous measurement of cell viability. All these low-cost, compact, and fast-processing lens-free imaging and sensing techniques may play a crucial role, especially in environmental, pharmaceutical, biological, and clinical applications in resource-limited settings.

Acknowledgment

This research was supported by the Basic Science Research Program (Grant #2013-010832, Grant #2014R1A6A1030732) through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning and the Ministry of Education, Korea. This work was also supported by the Korea Research Institute of Ships and Ocean Engineering (KRISO) Endowment Grant (PES2440) and in part by the project "Development of Management Technology for HNS Accident", funded by the Ministry of Oceans and Fisheries, Korea.

References

Biener, G., Greenbaum, A., Isikman, S.O., Lee, K., Tseng, D., Ozcan, A., 2011. Lab Chip 11 (16), 2738-2743.
Bishara, W., Su, T.-W., Coskun, A.F., Ozcan, A., 2010. Opt. Express 18 (11), 11181-11191.
Bishara, W., Sikora, U., Mudanyali, O., Su, T.-W., Yaglidere, O., Luckhart, S., Ozcan, A., 2011a. Lab Chip 11 (7), 1276.
Bishara, W., Isikman, S.O., Ozcan, A., 2011b. Ann. Biomed. Eng. 40 (2), 251-262.
Coskun, A.F., Ozcan, A., 2014. Curr. Opin. Biotechnol. 25, 8-16.
Coskun, A.F., Su, T.-W., Ozcan, A., 2010. Lab Chip 10 (7), 824-827.
Cui, X., Lee, L., Heng, X., Zhong, W., Sternberg, P., Psaltis, D., Yang, C., 2008. Proc. Natl. Acad. Sci. USA 105, 10670-10675.
Gorocs, Z., Ozcan, A., 2013. IEEE Rev. Biomed. Eng. 6, 29-46.
Greenbaum, A., Sikora, U., Ozcan, A., 2012a. Lab Chip 12 (7), 1242.
Greenbaum, A., Luo, W., Su, T.-W., Göröcs, Z., Xue, L., Isikman, S., Coskun, A., Mudanyali, O., Ozcan, A., 2012b. Nat. Methods 9, 889-895.
Greenbaum, A., Feizi, A., Akbari, N., Ozcan, A., 2013. Opt. Express 21 (10), 12469-12483.
Gurkan, U.A., Moon, S., Geckil, H., Xu, F., Wang, S., Lu, T.J., Demirci, U., 2011. Biotechnol. J. 6 (2), 138-149.
Isikman, S.O., Bishara, W., Mavandadi, S., Yu, F.W., Feng, S., Lau, R., Ozcan, A., 2011a. Proc. Natl. Acad. Sci. USA 108 (18), 7296-7301.
Isikman, S.O., Bishara, W., Sikora, U., Yaglidere, O., Yeah, J., Ozcan, A., 2011b. Lab Chip 11 (13), 2222-2230.
Jin, G., Yoo, I.-H., Pack, S.P., Yang, J.-W., Ha, U.-H., Paek, S.-H., Seo, S., 2012. Biosens. Bioelectron. 38 (1), 126-131.
Kesavan, S.V., Momey, F., Cioni, O., David-Watine, B., Dubrulle, N., Shorte, S., Allier, C., 2014. Sci. Rep. 4. http://dx.doi.org/10.1038/srep05942.
Kim, S.B., Bae, H., Cha, J.M., Moon, S.J., Dokmeci, M.R., Cropek, D.M., Khademhosseini, A., 2011. Lab Chip 11 (10), 1801-1807.
Kim, S.B., Bae, H., Koo, K., Dokmeci, M.R., Ozcan, A., Khademhosseini, A., 2012. J. Lab. Autom. 17 (1), 43-49.
McLeod, E., Luo, W., Mudanyali, O., Greenbaum, A., Ozcan, A., 2013. Lab Chip 13 (11), 2028-2035.
McLeod, E., Ozcan, A., 2014. Nano Today 9, 560-573.
Moon, S., Keles, H.O., Ozcan, A., Khademhosseini, A., Haeggstrom, E., Kuritzkes, D., Demirci, U., 2009. Biosens. Bioelectron. 24 (11), 3208-3214.
Mudanyali, O., Tseng, D., Oh, C., Isikman, S.O., Sencan, I., Bishara, W., Ozcan, A., 2010. Lab Chip 10 (11), 1417-1428.
Mudanyali, O., McLeod, E., Luo, W., Greenbaum, A., Coskun, A., Hennequin, Y., Allier, C., Ozcan, A., 2013. Nat. Photonics 7, 247-254.
Ozcan, A., Demirci, U., 2008. Lab Chip 8 (1), 98-106.
Picart, P., Leval, J., 2008. J. Opt. Soc. Am. A 25 (7), 1744-1761.
Roy, M., Jin, G., Seo, D., Nam, M.H., Seo, S., 2014. Sens. Actuators B: Chem. 201, 321-328.
Roy, M., Seo, D., Oh, C., Nam, M., Jun, Y., Seo, S., 2015. Biosens. Bioelectron. 67, 715-723.
Schnars, U., Jüptner, W., 1994. Appl. Opt. 33, 179-181.
Schultz, D.A., 2003. Curr. Opin. Biotechnol. 14, 13-22.
Sencan, I., Coskun, A.F., Sikora, U., Ozcan, A., 2014. Sci. Rep. 4. http://dx.doi.org/10.1038/srep03760.
Seo, S., Su, T.-W., Tseng, D.K., Erlinger, A., Ozcan, A., 2009. Lab Chip 9 (6), 777-787.
Seo, S., Isikman, S.O., Sencan, I., Mudanyali, O., Su, T.-W., Bishara, W., Ozcan, A., 2010. Anal. Chem. 82 (11), 4621-4627.
Sobieranski, A., Inci, F., Tekin, H., Yuksekkaya, M., Comunello, E., Cobra, D., Wangenheim, A., Demirci, U., 2015. Light Sci. Appl. 4, e346.
Su, T.-W., Erlinger, A., Tseng, D., Ozcan, A., 2010a. Anal. Chem. 82 (19), 8307-8312.
Su, T.-W., Isikman, S.O., Bishara, W., Tseng, D., Erlinger, A., Ozcan, A., 2010b. Opt. Express 18 (9), 9690-9711.
Wei, Q., McLeod, E., Qi, H., Wan, Z., Sun, R., Ozcan, A., 2013. Sci. Rep. 3, 1699. http://dx.doi.org/10.1038/srep01699.
Xu, W., Jericho, M.H., Meinertzhagen, I.A., Kreuzer, H.J., 2002. Appl. Opt. 41 (25), 5367-5375.
Zhang, X., Khimji, I., Gurkan, U., Safaee, H., Catalano, P., Keles, H., Kayaalp, E., Demirci, U., 2011. Lab Chip 11, 2535-2540.
Zhu, H., Isikman, S.O., Mudanyali, O., Greenbaum, A., Ozcan, A., 2013a. Lab Chip 13 (1), 51-67.
Zhu, H., Sencan, I., Wong, J., Dimitrov, S., Tseng, D., Nagashima, K., Ozcan, A., 2013b. Lab Chip 13 (7), 1282-1288.
