Serial time-encoded amplified microscopy

Serial time-encoded amplified imaging/microscopy or stretched time-encoded amplified imaging/microscopy (STEAM) is a fast real-time optical imaging method that provides MHz frame rates, ~100 ps shutter speed, and ~30 dB (×1000) optical image gain. An example of time-stretch microscopy, STEAM holds world records for shutter speed and frame rate in continuous real-time imaging. STEAM employs photonic time stretch together with optical image amplification to circumvent the fundamental trade-off between sensitivity and speed that affects virtually all optical imaging and sensing systems. Because it uses a single-pixel photodetector, it eliminates the need for a detector array and the readout-time limitations that come with one. By avoiding this bottleneck and using optical image amplification to dramatically improve sensitivity at high image acquisition rates, STEAM achieves a shutter speed at least 1000 times faster than state-of-the-art CCD[1] and CMOS[2] cameras. Its frame rate is 1000 times faster than the fastest CCD cameras and 10–100 times faster than the fastest CMOS cameras.

History

The system combines photonic time stretch with internal Raman amplification, a technique developed earlier to create a femtosecond real-time single-shot digitizer, and with the concept of spectral encoding. The first demonstration was a one-dimensional version,[3] followed later by a two-dimensional version.[4] More recently, imaging of rapidly vibrating objects such as loudspeaker cones has been accomplished by operating the system in an interferometric configuration.[5]

Background

Fast real-time optical imaging technology is indispensable for studying dynamical events such as shockwaves, laser fusion, chemical dynamics in living cells, neural activity, laser surgery, microfluidics, and MEMS. Conventional CCD and CMOS cameras are inadequate for capturing fast dynamical processes with high sensitivity and speed. There are technological limitations: it takes time to read out the data from the sensor array, and there is a fundamental trade-off between sensitivity and speed, because at high frame rates fewer photons are collected during each frame, a problem that affects nearly all optical imaging systems.

The streak camera, used for diagnostics in laser fusion, plasma radiation, and combustion, operates in burst mode only (providing just several frames) and requires synchronization of the camera with the event to be captured. It is therefore unable to capture random or transient events in biological systems. Stroboscopes have a complementary role: they can capture the dynamics of fast events—but only if the event is repetitive, such as rotations, vibrations, and oscillations. They are unable to capture non-repetitive random events that occur only once or do not occur at regular intervals.

Principle of operation

The basic principle involves two steps, both performed optically. In the first step, the spectrum of a broadband optical pulse is converted by a spatial disperser into a rainbow that illuminates the target. The rainbow pulse consists of many sub-pulses of different colors (frequencies), so the different frequency components (colors) of the rainbow pulse are incident on different spatial coordinates of the object. The spatial information (image) of the object is therefore encoded into the spectrum of the reflected or transmitted rainbow pulse. The image-encoded pulse then returns through the same spatial disperser, or passes through another one, which recombines the colors of the rainbow into a single pulse. STEAM's shutter speed, or exposure time, corresponds to the temporal width of the rainbow pulse.

In the second step, the spectrum is mapped into a serial temporal signal that is stretched in time using dispersive Fourier transformation, slowing it down so that it can be digitized in real time. The time stretch occurs inside a dispersive fiber that is optically pumped to provide internal Raman amplification; the image is thus optically amplified by stimulated Raman scattering to overcome the thermal noise of the detector. The amplified, time-stretched serial image stream is detected by a single-pixel photodetector, and the image is reconstructed in the digital domain. Subsequent pulses capture subsequent frames, so the pulse repetition rate of the laser corresponds to the frame rate of STEAM. The second step is known as the time stretch analog-to-digital converter, also called the time stretch recording scope (TiSER).
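The following is a minimal numerical sketch of this two-step chain for a single one-dimensional line scan, written in Python with NumPy. All parameter values (rainbow span, total dispersion, Raman gain, digitizer rate) and the toy object are illustrative assumptions, not values taken from the STEAM papers; the sketch only shows how space maps to wavelength, wavelength maps to arrival time, and a uniformly sampled photodetector record maps back to an image line.

```python
# Minimal sketch of the STEAM imaging chain for one 1-D line scan.
# All numerical values below are illustrative assumptions.
import numpy as np

n_points = 512
lam0, span = 1550e-9, 20e-9                       # assumed center wavelength and rainbow span (m)
wavelengths = lam0 + np.linspace(-span / 2, span / 2, n_points)

# Step 1: space-to-wavelength encoding.
# The spatial disperser sends each wavelength to a different position on the object,
# so the object's reflectivity profile modulates the pulse spectrum.
x = np.arange(n_points)
object_profile = 0.2 + 0.8 * np.exp(-((x - 200) / 30.0) ** 2)   # toy 1-D object
encoded_spectrum = object_profile                                # image now lives in the spectrum

# Step 2: wavelength-to-time mapping (amplified dispersive Fourier transform).
# Large group-velocity dispersion delays each wavelength differently,
# t ~ D_total * (lambda - lambda0), serializing the spectrum into a slow waveform.
D_total = 1.2                                     # assumed total dispersion, s per m of wavelength
arrival_time = D_total * (wavelengths - lam0)     # about +/- 12 ns over the 20 nm span

raman_gain_db = 30.0                              # 30 dB distributed Raman gain (x1000)
waveform = encoded_spectrum * 10 ** (raman_gain_db / 10)

# Single-pixel photodetector + digitizer: uniform time sampling of the serialized waveform.
fs = 50e9                                         # assumed digitizer sample rate (50 GS/s)
t_samples = np.arange(arrival_time.min(), arrival_time.max(), 1 / fs)
digitized = np.interp(t_samples, arrival_time, waveform)

# Reconstruction: invert time -> wavelength -> position to recover the image line.
recovered_positions = (t_samples / D_total + lam0 - wavelengths.min()) / span * (n_points - 1)
recovered_image = digitized / 10 ** (raman_gain_db / 10)
```

Each laser pulse would produce one such serialized line, so in this picture the frame rate is simply the pulse repetition rate, as stated above.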

Amplified dispersive Fourier transformation

The simultaneous stretching and amplification is also known as amplified dispersive Fourier transformation.[6][7] It is a process in which the spectrum of an optical pulse is mapped by large group-velocity dispersion into a slowed-down temporal waveform and simultaneously amplified by stimulated Raman scattering. Consequently, the optical spectrum can be captured with a single-pixel photodetector and digitized in real time. Pulses are repeated for repetitive measurements of the optical spectrum. The amplified dispersive Fourier transformer consists of a dispersive fiber pumped by lasers and wavelength-division multiplexers that couple the pump lasers into and out of the dispersive fiber. Amplified dispersive Fourier transformation was originally developed to enable ultra-wideband analog-to-digital converters and has also been used for high-throughput real-time spectroscopy. The resolution of the STEAM imager is determined mainly by the diffraction limit, the sampling rate of the back-end digitizer, and the spatial dispersers.[8]
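The wavelength-to-time mapping can be summarized by the standard relation for dispersive Fourier transformation, written here in common notation under the usual large-dispersion (far-field) assumption rather than following any one reference:

```latex
t(\omega) \;\approx\; \beta_2 L \,(\omega - \omega_0),
\qquad
I_{\mathrm{out}}(t) \;\propto\; G \,\Bigl|\tilde{E}\!\Bigl(\omega_0 + \tfrac{t}{\beta_2 L}\Bigr)\Bigr|^{2},
```

where \(\beta_2 L\) is the total group-delay dispersion of the fiber, \(\omega_0\) the center frequency, \(\tilde{E}(\omega)\) the field spectrum of the image-encoded pulse, and \(G\) the distributed Raman gain. In words: each frequency arrives at a different time, so the photodetector current traces out an amplified copy of the spectrum.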

Applications

This method is useful for a broad range of scientific, industrial, and biomedical applications that require high shutter speeds and frame rates. The one-dimensional version can be employed for displacement sensing, barcode reading, and blood screening;[9] the two-dimensional version for real-time observation, diagnosis, and evaluation of shockwaves, microfluidic flow,[10] neural activity, MEMS,[11] and laser ablation dynamics. The three-dimensional version is useful for range detection, dimensional metrology, and surface vibrometry and velocimetry.[12]

Image compression in optical domain

Figure caption: illustration of the warped stretch transform in imaging and of image compression by the warped stretch transform.

Big data brings not only opportunity but also a challenge to biomedical and scientific instruments, whose acquisition and processing units are overwhelmed by a torrent of data. The need to compress massive volumes of data in real time has fueled interest in nonuniform stretch transformations – operations that reshape the data according to its sparsity.

Recently, researchers at UCLA demonstrated image compression performed in the optical domain and in real time.[13] Using nonlinear group-delay dispersion and time-stretch imaging, they optically warped the image so that the information-rich portions are sampled at a higher density than the sparse regions. This was done by restructuring the image before optical-to-electrical conversion, followed by a uniform electronic sampler. Reconstruction of the nonuniformly stretched image shows that the resolution is higher where the information content is rich and lower where it is sparse and less important. The information-rich region at the center is well preserved, without down-sampling, while the overall sampling rate stays the same as in the uniform case. Image compression was demonstrated at 36 million frames per second in real time.
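A toy numerical sketch of the idea is shown below (Python/NumPy; the warp profile, line content, and sample counts are assumptions chosen for illustration, not the parameters of the UCLA system). A nonlinear group delay stretches the detail-rich center of a line more than its edges, so a uniform-rate digitizer effectively samples the center more densely.

```python
# Toy sketch of warped-stretch sampling of one image line (illustrative assumptions only).
import numpy as np

n = 1024
x = np.linspace(-1.0, 1.0, n)                      # normalized position across the line
line = 0.1 + np.exp(-(x / 0.15) ** 2) * (1 + 0.5 * np.sin(40 * x))   # detail-rich center

# Warped group delay: the slope is steepest near the center, so that region is stretched
# more in time and a uniform-rate digitizer samples it more densely.
group_delay = np.arctan(4.0 * x)                   # assumed monotonic warp profile

# Uniform sampling in time corresponds to nonuniform sampling in position.
t_uniform = np.linspace(group_delay.min(), group_delay.max(), 256)   # 4x fewer samples than n
x_sampled = np.interp(t_uniform, group_delay, x)   # where those time samples land in space
compressed = np.interp(x_sampled, x, line)         # the 256 recorded values

# Reconstruction: unwarp back to a uniform spatial grid; resolution is higher near the
# densely sampled center and lower at the sparsely sampled edges.
reconstructed = np.interp(x, x_sampled, compressed)
```

In this sketch the fine sinusoidal detail near the center survives the fourfold reduction in sample count, while the featureless edges are represented coarsely, which is the qualitative behavior described above.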

References

  1. J. R. Janesick (2001). Scientific charge-coupled devices. SPIE Press. ISBN 9780819436986.
  2. H. Zimmermann (2000). Integrated silicon optoelectronics. Springer. ISBN 3540666621.
  3. K. Goda; K. K. Tsia & B. Jalali (2008). "Amplified dispersive Fourier-transform imaging for ultrafast displacement sensing and barcode reading". Applied Physics Letters. 93: 131109. arXiv:0807.4967. Bibcode:2008ApPhL..93m1109G. doi:10.1063/1.2992064.
  4. K. Goda; K. K. Tsia & B. Jalali (2009). "Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena". Nature. 458: 1145–9. Bibcode:2009Natur.458.1145G. doi:10.1038/nature07980. PMID 19407796.
  5. A. Mahjoubfar; K. Goda; A. Ayazi; A. Fard; S. H. Kim & B. Jalali (2011). "High-speed nanometer-resolved imaging vibrometer and velocimeter". Applied Physics Letters. 98: 101107. Bibcode:2011ApPhL..98j1107M. doi:10.1063/1.3563707.
  6. K. Goda; D. R. Solli; K. K. Tsia & B. Jalali (2009). "Theory of amplified dispersive Fourier transformation". Physical Review A. 80: 043821. Bibcode:2009PhRvA..80d3821G. doi:10.1103/PhysRevA.80.043821.
  7. K. Goda & B. Jalali (2010). "Noise figure of amplified dispersive Fourier transformation". Physical Review A. 82: 033827. Bibcode:2010PhRvA..82c3827G. doi:10.1103/PhysRevA.82.033827.
  8. K. K. Tsia; K. Goda; D. Capewell & B. Jalali (2010). "Performance of serial time-encoded amplified microscope". Optics Express. 18: 10016.
  9. Chen C., Mahjoubfar A., Tai L., Blaby I., Huang A., Niazi K., Jalali B. (2016). "Deep Learning in Label-free Cell Classification". Scientific Reports. 6: 21471. doi:10.1038/srep21471.
  10. D. Di Carlo (2009). "Inertial microfluidics". Lab on a Chip. 9: 3038. doi:10.1039/b912547g.
  11. T. R. Hsu (2008). MEMS & microsystems: design, manufacture, and nanoscale engineering. Wiley. ISBN 0470083018.
  12. Mahjoubfar A., Goda K., Ayazi A., Fard A., Kim S., Jalali B. (2011). "High-speed nanometer-resolved imaging vibrometer and velocimeter". Applied Physics Letters. 98: 101107. Bibcode:2011ApPhL..98j1107M. doi:10.1063/1.3563707.
  13. CL Chen; A Mahjoubfar; B Jalali (2015). "Optical Data Compression in Time Stretch Imaging". PLoS ONE. 10: 1371. doi:10.1371/journal.pone.0125106.