A Progress Review on Solid‐State LiDAR and Nanophotonics‐Based LiDAR Sensors

Light detection and ranging (LiDAR) sensors enable precise 3D sensing of an object. LiDAR technology is widely used in metrology, environment monitoring, archaeology, and robotics, and shows high potential for application in autonomous driving. In traditional LiDAR sensors, a mechanical rotator is used for optical beam scanning, which places limitations on their reliability, size, and cost. These limitations can be overcome by a more compact solid-state solution. Solid-state LiDAR sensors are commonly categorized into three types: flash-based LiDAR, microelectromechanical system (MEMS)-based LiDAR, and optical phased array (OPA)-based LiDAR. Furthermore, advanced optics technology enables novel nanophotonics-based devices with high potential and superior advantages to be utilized in LiDAR sensors. In this review, LiDAR sensor principles are introduced, including the three commonly used sensing schemes: pulsed time of flight (TOF), amplitude-modulated continuous wave TOF, and frequency-modulated continuous wave. Recent advances in conventional solid-state LiDAR sensors are summarized and presented, covering flash-based, MEMS-based, and OPA-based LiDAR. The recent progress on emerging nanophotonics-based LiDAR sensors is also covered. Finally, a summary and a future outlook on advanced LiDAR sensors are provided.


Introduction
Light detection and ranging (LiDAR) technology enables the accurate determination of an object's distance (and velocity) information. Compared with the more mature radio detection and ranging (RADAR) technology, LiDAR makes use of optical waves, which lie in a shorter-wavelength regime than radio waves, and hence has the potential to achieve higher precision in 3D sensing (DOI: 10.1002/lpor.202100511). LiDAR has been widely applied in metrology, [1][2][3] environment monitoring, [4][5][6] archaeology, [7,8] and robotics, [9,10] and shows high potential to be used for autonomous driving. [11][12][13] The demand for advanced LiDAR technology in the fast-growing autonomous driving industry can be inferred from the appearance and growth of startup companies in this area. Based on the sensing/ranging mechanism, most LiDAR sensors can be categorized into the following three schemes: pulsed time of flight (TOF), amplitude-modulated continuous wave (AMCW) TOF, and frequency-modulated continuous wave (FMCW). Traditional LiDAR sensors make use of a mechanical rotation mechanism to achieve wide field-of-view (FOV) scanning, which places limitations on reliability, size, and cost. [14] These limitations can be overcome by using a solid-state approach.
There is a growing demand for compact ranging systems, [1] and solid-state LiDAR provides an alternative to traditional LiDAR by eliminating the often bulky mechanical rotator. Therefore, solid-state LiDARs have recently drawn significant interest in both academic research and industrial applications. Depending on the mapping/illumination method, solid-state LiDARs are commonly categorized into the following three types: [15,16] flash-based LiDAR, [17][18][19] microelectromechanical system (MEMS)-based LiDAR, [20][21][22] and optical phased array (OPA)-based LiDAR. [23][24][25] Flash-based LiDAR makes use of a photodetector (PD) array, and hence can capture the entire target scene within a single shot. Since no scanning is involved, flash LiDAR promises superior long-term reliability. However, the resolution of flash-based LiDAR is constrained by the physical size of PD arrays. Incorporating MEMS mirrors can bring down the size of the LiDAR system and enable solid-state LiDAR scanning, with the advantages of being compact and lightweight. OPA-based LiDAR is based on integrated photonics technology, which also provides a compact platform. The fabrication of OPAs is compatible with complementary metal-oxide-semiconductor (CMOS) processes, [26][27][28] which brings down the manufacturing cost.
In addition to the above-mentioned three types of solid-state LiDAR, advanced nanophotonics technology also enables novel nanophotonics devices with high potential and superior advantages to be utilized in LiDAR sensors. One example of a nanophotonics device is the optical switch based on an integrated photonics platform. [29] The switching networks formed by optical switches enable sequential illumination in a LiDAR system for a large sensing range. Another example is the optical frequency comb (OFC), [2] which is also based on an integrated photonics platform. The OFC is a compact light source for high-performance LiDAR sensing. It has recently been demonstrated for LiDAR sensing with high resolution, [30] parallel scanning, [2] as well as the capability to capture object profiles at fast moving speeds. [1] A third typical example is the metasurface-based spatial light modulator (SLM). [31] A metasurface is a thin layer of patterned nanostructures, which can control light phase and amplitude on a sub-wavelength scale. Hence, it brings potential for compact LiDAR sensors with high precision. [16] A comprehensive review of nanophotonics devices for LiDAR sensing has recently been conducted by Kim et al. [16] In this review, the recent advances in conventional solid-state LiDAR are summarized and presented, including flash-based, MEMS-based, and OPA-based LiDAR. Following that, the recent progress on emerging nanophotonics-based LiDAR sensors is also reviewed. At the end, an outlook on advanced LiDAR sensors is provided. A note worth mentioning is that for the LiDAR sensors covered in this review, we emphasize the optics part of the system rather than the electronics part.
The whole review is organized as follows: Section 2 covers the LiDAR sensor principles and design rules; Section 3 presents recent advances in conventional solid-state LiDAR sensors; Section 4 focuses on emerging nanophotonics-based LiDAR sensors; Section 5 summarizes the review content and provides a future outlook on advanced LiDAR sensors. The overall structure of this review is visualized in Figure 1.

LiDAR Principles
As mentioned in the introduction, the three sensing schemes most commonly used in LiDAR sensors are: pulsed TOF, AMCW TOF, and FMCW. Good comparisons and summaries of these three sensing schemes have been made in refs. [34,35]. In general, pulsed TOF and AMCW TOF are based on the modulation of light intensity, while FMCW is based on the modulation of light frequency. The mechanisms to obtain the distance information (for all three schemes) and velocity information (for FMCW LiDAR) are explained in this section. The plot of the LiDAR signal under different sensing schemes is illustrated in Figure 2. The power budget of the LiDAR system can be expressed using the following equation [23]

P_RX = P_in · η_TX · η_RX · ρ · A_RX / (4πR²)   (1)

where P_in is the input optical power, η_TX and η_RX represent the efficiencies of the transmitter and receiver, respectively, ρ is the reflectivity of the sensing object, A_RX represents the area of the receiver, and R is the distance of the sensing object. Typically, a larger A_RX contributes to higher received optical power, and hence longer sensing distance. [34] Also, a note worth mentioning is that the maximum input (transmit) power is mostly limited by eye-safety concerns. [34]
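As a quick numerical illustration of the link-budget relation in Equation (1), the sketch below evaluates the received power for a set of purely hypothetical parameter values (transmit power, efficiencies, reflectivity, aperture, and range are all assumptions, not numbers from any cited system):

```python
# Hypothetical link-budget sketch for Equation (1): received power falls off
# as 1/R^2. All parameter values below are illustrative assumptions.
import math

def received_power(p_in, eta_tx, eta_rx, rho, a_rx, r):
    """P_RX = P_in * eta_TX * eta_RX * rho * A_RX / (4 * pi * R^2)."""
    return p_in * eta_tx * eta_rx * rho * a_rx / (4 * math.pi * r**2)

# Example: 10 mW transmit power, 80% TX / 50% RX efficiency,
# 10% target reflectivity, 1 cm^2 receiver aperture, 50 m range.
p_rx = received_power(10e-3, 0.8, 0.5, 0.1, 1e-4, 50.0)
print(f"received power: {p_rx:.3e} W")
```

Doubling the range quarters the received power, which is why receiver aperture A_RX and transmit power dominate the achievable sensing distance.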

Pulsed TOF LiDAR
The pulsed TOF LiDAR works based on the time delay of an optical pulse emitted by the TX, reflected from the sensing object, and received by the RX. The sensing distance can be expressed using the following equation

R = c·Δt / 2   (2)

where Δt is the time delay and c is the speed of light. The plot of the TX and RX signals in the time domain is illustrated in Figure 2a.
The range resolution is limited by the available time-counting resolution, which is mainly determined by the electronics timing resolution. [16] Benefiting from high peak power, pulsed TOF LiDAR can sense longer distances while maintaining low average power to comply with eye-safety limitations. [34]
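The round-trip relation R = cΔt/2 and the way timer resolution bounds range resolution can be sketched as follows (the 100 ps timer value is an illustrative assumption):

```python
# Minimal pulsed-TOF sketch: one-way distance from round-trip delay, and the
# range step implied by a given timing resolution.
c = 299_792_458.0  # speed of light, m/s

def range_from_delay(dt_s):
    """Round-trip time delay -> one-way distance, R = c * dt / 2."""
    return c * dt_s / 2

def range_resolution(timer_resolution_s):
    """Smallest resolvable range step for a given timing resolution."""
    return c * timer_resolution_s / 2

print(range_from_delay(666.7e-9))   # a ~100 m target returns after ~667 ns
print(range_resolution(100e-12))    # a 100 ps timer resolves ~1.5 cm steps
```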

AMCW TOF LiDAR
Compared with pulsed TOF LiDAR, AMCW TOF LiDAR makes use of an amplitude/intensity-modulated optical signal rather than a pulsed optical signal for sensing. It works based on the phase difference between the modulated light from the TX and the light received by the RX. The sensing distance can be expressed as

R = c·Δφ / (4πf)   (3)

where Δφ is the phase shift and f is the modulation frequency of the optical signal. The plots of the TX and RX signals can be visualized in Figure 2b. Since AMCW TOF LiDAR uses a modulated optical signal rather than an optical pulse, it is suitable for moderate-range sensing rather than long-range sensing. [34]
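A minimal sketch of the AMCW phase-to-range conversion, including the phase-wrapping ambiguity range c/(2f) that is one reason AMCW suits moderate distances (the 10 MHz modulation frequency is an assumed value, not from any cited system):

```python
import math

# AMCW-TOF sketch: distance from measured phase shift,
# R = c * dphi / (4 * pi * f). Parameter values are illustrative.
C = 299_792_458.0  # speed of light, m/s

def amcw_range(phase_shift_rad, mod_freq_hz):
    """One-way distance from the TX/RX phase difference."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def ambiguity_range(mod_freq_hz):
    """Beyond c / (2 f) the measured phase wraps past 2*pi and the
    range becomes ambiguous without multi-frequency disambiguation."""
    return C / (2 * mod_freq_hz)

f = 10e6                       # assumed 10 MHz modulation
print(amcw_range(math.pi, f))  # half-cycle phase shift -> ~7.5 m
print(ambiguity_range(f))      # ~15 m unambiguous range
```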

FMCW LiDAR
The FMCW LiDAR sensor emits a frequency-modulated optical signal and collects the reflected signal. From the beat signal between the original and reflected signals, the distance and velocity information of a moving object can be obtained. For a triangular modulation of period T whose up-chirp sweeps the bandwidth B during half the period, the calculation can be expressed in the following two equations

R = cT(f_u + f_d) / (8B)   (4)

v = c(f_d − f_u) / (4f_c)   (5)

where T is the period of modulation, B is the chirp bandwidth, and c is the light propagation speed. f_u and f_d represent the beat frequencies for the upward and downward laser scans, respectively; both include the Doppler frequency shift. f_c is the starting frequency without chirp, also called the optical carrier frequency. Please note that Equation (5) assumes the angle θ between the target velocity vector and the LiDAR line of sight is zero. If such an angle is considered, the denominator on the right side of Equation (5) needs to be multiplied by cos(θ).

[Figure 1 caption, partially recovered: Top left: … Adapted with permission. [19] Copyright 2016, The Optical Society. Top middle: Schematic of an MEMS-based LiDAR. Adapted with permission. [21] Copyright 2018, IARIA. Top right: Schematic of solid-state OPA-based beam scanner with integrated laser source and amplifiers. Adapted with permission. [14] Copyright 2020, IEEE. Bottom left: Schematic of RX (receiver) block formed by heterodyne PD pixels for a LiDAR imager with sequential illumination enabled by optical switches. Adapted with permission. [32] Copyright 2021, Springer Nature. Bottom middle: Schematic of optical frequency comb (OFC)-based LiDAR for parallel sensing. Adapted with permission. [2] Copyright 2020, Springer Nature. Bottom right: Schematic of a metasurface-based LiDAR sensor setup. Adapted with permission. [33] Copyright 2021, Springer Nature.]
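The FMCW range/velocity extraction can be sketched numerically as follows, assuming a triangular modulation of period T whose up-chirp sweeps the bandwidth B in T/2; the chirp parameters and the target placed in the forward check are illustrative assumptions:

```python
import math

# Sketch of FMCW range/velocity extraction from up- and down-chirp beat
# frequencies. The Doppler shift cancels in the sum of the two beat
# frequencies (range) and survives in their difference (velocity).
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(f_up, f_down, period_s, bandwidth_hz):
    """R = c * T * (f_u + f_d) / (8 * B)."""
    return C * period_s * (f_up + f_down) / (8 * bandwidth_hz)

def fmcw_velocity(f_up, f_down, carrier_hz, angle_rad=0.0):
    """v = c * (f_d - f_u) / (4 * f_c * cos(theta)); cos() corrects for
    the angle between the velocity vector and the line of sight."""
    return C * (f_down - f_up) / (4 * carrier_hz * math.cos(angle_rad))

# Forward check: place a target at 30 m moving at +5 m/s and recover both.
T, B, fc = 10e-6, 1e9, C / 1550e-9   # 10 us period, 1 GHz chirp, 1550 nm
f_beat = 4 * B * 30.0 / (C * T)      # range-only beat frequency
f_dopp = 2 * 5.0 * fc / C            # Doppler shift
print(fmcw_range(f_beat - f_dopp, f_beat + f_dopp, T, B))   # ~30 m
print(fmcw_velocity(f_beat - f_dopp, f_beat + f_dopp, fc))  # ~5 m/s
```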
Compared with TOF, the FMCW method has the following advantages: [32] the sensing system does not suffer from interference from nearby LiDAR systems, due to the coherent detection nature of FMCW. In addition to distance information, it can obtain velocity based on the Doppler effect. Also, the FMCW method can achieve higher depth accuracy compared with TOF. Lastly, FMCW requires relatively lower optical peak power than the pulsed TOF method, where a strong optical pulse is required.
In addition, it is worth mentioning that ref. [36] provides a comprehensive explanation on an implementation of FMCW RADAR in millimeter wavelength range, which can be used as a reference for FMCW sensor design.

Conventional Solid-State LiDAR Sensor
In the earlier section, LiDAR principles were introduced. In this section, we move on to the review and discussion of solid-state LiDAR sensors. Solid-state LiDAR can be categorized into two main types: flash-based and scanning-based LiDAR. For scanning-type LiDAR, including MEMS-based and OPA-based LiDAR, the advantage is a higher signal-to-noise ratio compared with flash-type LiDAR, since the optical power is concentrated during laser point scanning. [18] Meanwhile, the advantage of flash-based LiDAR is the capability to capture the sensing object with a single shot by using a PD array. Since no scanning part is involved, flash-based LiDAR has better long-term reliability and a higher data acquisition rate. Key features and comparisons of solid-state LiDAR, including flash-based, MEMS-based, and OPA-based LiDAR, have been summarized in refs. [15,16,35]. In this section, the recent research works in the past 5 years are reviewed.

Flash-Based LiDAR
As mentioned earlier, a LiDAR system can obtain a 3D image through two approaches: flash and scanning. The scanning approach makes use of one or a few detectors and a scanner to obtain 3D information, while the flash approach makes use of a 2D detector array to capture the sensing object and obtain 3D information. [19] The flash-type LiDAR system enables the capture of a 3D depth image with a single shot of an optical pulse, and hence has a relatively higher data acquisition rate compared with the scanning type. Meanwhile, it requires an optical pulse with high power and a detector array with high sensitivity. [16] In a LiDAR sensing system, the optical power of the reflected signal is inversely proportional to the sensing distance squared, as indicated in Equation (1). Hence, for long-distance sensing, single-photon avalanche diodes (SPADs) have been extensively demonstrated for TOF-based LiDAR. [18,[38][39][40][41] In the research work by Zhang et al., [18] a SPAD flash LiDAR sensor with 252 × 144 pixels and 30 frames s−1 has been reported. The image sensor is fabricated on a 180 nm CMOS technology platform containing the SPAD array and time-to-digital converters (TDC). More details on the electronics of the image sensor can be found in ref. [18]. Its operational diagram is illustrated in Figure 3a. The 3D imaging is performed in real time, with results shown in Figure 3b. Six frames of a 3D movie showing a hand clenching and unclenching are illustrated. The movie is captured at 30 frames s−1, with the hand located at a distance of 0.7 m. Furthermore, in a later study by Hutchings et al., [39] a 256 × 256 SPAD imaging sensor is reported for TOF LiDAR imaging. The sensor has a low power consumption of less than 100 mW, and reports a long imaging distance of up to 50 m. Also, in the research work by Hu et al., [40] a noise-filtering circuit is utilized in the pixels formed by SPADs to improve the signal-to-background noise ratio, and hence the TOF LiDAR detection range.
More recently, the study by Padmanabhan et al. [41] reports a SPAD array with the capability to perform photon-coincidence detection to suppress the background light and hence improve the signal-to-background noise ratio. A maximum TOF LiDAR ranging distance of 100 m has been achieved. In addition, in the study by Beer et al., [38] photon-coincidence detection is also used to achieve ambient light suppression for a SPAD-based LiDAR image sensor.
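To make the direct-TOF SPAD principle concrete, the sketch below shows how a pixel's TDC histogram of photon arrival times yields range despite background counts. This is an illustrative toy, not any cited sensor's pipeline; the bin width, histogram depth, and count levels are all assumptions:

```python
import random

# Toy direct-TOF SPAD pixel: photon timestamps accumulate into a TDC
# histogram over many laser shots; signal photons pile up in one bin,
# background photons spread uniformly, and the peak bin gives the delay.
C = 299_792_458.0
BIN_S = 250e-12  # assumed TDC bin width (250 ps)

def range_from_histogram(hist):
    """Locate the peak bin and convert its round-trip delay to distance."""
    peak_bin = max(range(len(hist)), key=hist.__getitem__)
    return C * peak_bin * BIN_S / 2

random.seed(0)
hist = [random.randint(0, 3) for _ in range(4000)]  # ambient background
hist[2668] += 200  # signal photons pile up at ~100 m (2668 bins * 250 ps)
print(f"estimated range: {range_from_histogram(hist):.1f} m")
```

Coincidence detection, as used in the studies above, effectively sharpens this histogram by rejecting counts that do not arrive in pairs or bursts.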
Meanwhile, the SPAD-based flash LiDAR sensors discussed above have limitations on range precision and spatial resolution, since it is technically challenging to increase the pixel count of the detector array. [19] To further improve the performance of the 3D LiDAR sensor in terms of spatial resolution and ranging precision, in the research work by Jo et al., [19] a novel flash LiDAR system with high spatial resolution (0.12 mrad) and high ranging precision (5.2 mm at 16 m) has been reported. The schematic of the flash LiDAR is shown in Figure 3c. The laser source provides optical pulses at 532 nm wavelength. Most of the optical beam is reflected by the mirror (M), while a small leaky portion is collected by the PD behind the mirror to trigger a delay pulse generator (DPG) in the setup. The DPG then activates the Pockels cell after a set delay time. The optical beam from the laser is collimated by two lenses, L1 and L2. A rotating diffuser is placed at the focal point for speckle reduction. The reflected signal from the sensing object is detected by a micropolarizer charge-coupled device camera (MCCD).

[Figure 4 caption, partially recovered: a) … Adapted with permission. [21] Copyright 2018, IARIA. b) Schematic of 2D MEMS scanning mirror assembly, with bases consisting of two PZT (lead zirconate titanate) ceramics. c) 2D scanning pattern by the packaged MEMS scanner. b,c) Adapted under the terms of a Creative Commons Attribution 4.0 International License. [45] Copyright 2017, The Authors, published by MDPI. d) SEM image of the fabricated vertical MEMS mirrors with forward scanning scheme. Adapted with permission. [47] Copyright 2018, IEEE. e) SEM image of fabricated large-aperture 2D MEMS scanning mirror with mirror plate size of 2 × 2.5 mm² and an optical FOV of 15° × 12°. Adapted with permission. [48] Copyright 2019, IEEE. f) Conceptual schematic of MEMS mirror mounted on a micro-robot for zoom-in 3D scanning. Adapted with permission. [10] Copyright 2021, IEEE.]
From the polarization state of the reflected signal, the TOF information can be obtained. Using the flash LiDAR setup mentioned above, a 3D image with 200 × 200 pixel resolution is obtained. The sensing target (a Venus plaster) has a size of 60 cm × 30 cm. Its 2D and 3D sensing images are shown in the left and right panels of Figure 3d, respectively. Furthermore, the study by Zhang et al. [37] demonstrates two high-resolution flash LiDAR systems based on polarization modulation: one uses a polarization beam splitter together with two CCD imaging cameras, and the other uses a micropolarizer array with a CCD array. Both systems can achieve ranging precision of a few mm. Compared with conventional flash LiDAR, the advantage of the demonstrated systems is the use of a low-bandwidth detector (e.g., a CCD) together with a polarization modulator, instead of a high-bandwidth detector, which has limitations on size, spatial resolution, and range precision.

MEMS-Based LiDAR
MEMS technology is able to reduce the size and weight of a scanning LiDAR system, enabling the use of miniaturized LiDAR sensors on small unmanned aerial vehicles (UAVs). [20] Meanwhile, for MEMS mirror design, the trade-off between optical beam size (mirror size) and scanning speed needs to be considered and balanced. [16] A recent review by Wang et al. [42] provides a good summary of the different kinds of MEMS mirrors for LiDAR applications. In this section, the focus is on reported demonstrations of MEMS-based LiDAR sensors as well as MEMS mirrors in the past 5 years, with key features summarized in Table 2.

1D MEMS mirror
A LiDAR sensor utilizing a 1D scanning MEMS mirror has been demonstrated by Druml et al. [21] The schematic of the LiDAR sensor prototype is shown in Figure 4a. To achieve 2D scanning, the MEMS mirror performs horizontal scanning of a vertical line of laser beams. The work not only demonstrates a LiDAR prototype, but also shows the potential for future LiDAR systems with a long sensing distance of >200 m at a cost of <$200. Furthermore, in the study by Schwarz et al., [43] a resonant 1D MEMS mirror with a scanning angle of >±45°, and hence a scanning FOV of up to 180°, has been presented. The aluminum nitride (AlN)-based piezoelectric MEMS mirror, with a size of 2 mm × 4 mm, can achieve accurate scanning at a frequency of 1.5 kHz. Further, scandium-doped AlN, with a higher piezoelectric coefficient than AlN, has been identified to further improve the device efficiency.

2D MEMS mirror
The 2D MEMS mirror typically has a fast axis and a slow axis, which are used for horizontal scanning and vertical scanning, respectively. The scanning frequencies are typically within the range of 0.5-2 kHz and 10-30 Hz for the fast and slow axis, respectively. [42] The frame rate of the LiDAR scanner is limited by the scanning speed of the slow axis.
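The fast-axis/slow-axis relationship described above can be sketched with a back-of-envelope raster calculation; the simplification here counts one scan line per fast-axis period (ignoring bidirectional scanning), and the axis frequencies are example values within the quoted ranges:

```python
# Raster-geometry sketch for a 2D MEMS mirror: the slow axis sets the frame
# rate, while the ratio of fast- to slow-axis frequency sets the number of
# scan lines drawn per frame.
def raster_stats(fast_hz, slow_hz):
    frame_rate = slow_hz                 # one vertical sweep per frame
    lines_per_frame = fast_hz / slow_hz  # horizontal sweeps per frame
    return frame_rate, lines_per_frame

fps, lines = raster_stats(fast_hz=1000.0, slow_hz=20.0)
print(fps, lines)  # a 1 kHz fast axis over a 20 Hz slow axis -> 50 lines
```

This makes the trade-off explicit: raising the frame rate (slow-axis speed) directly reduces the vertical line count unless the fast axis is driven proportionally faster.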
In 2016, the study by Kasturi et al. [22] demonstrated a 2D MEMS scan module with a weight of <40 g mounted on a UAV. The MEMS scan module is controlled by a smartphone through Bluetooth. Also, for a proof-of-concept demonstration, the MEMS mirror is integrated with an off-the-shelf laser range finder to demonstrate accurate distance measurement at 2 m with ±8° FOV. In the same year, the study by Ye et al. [44] demonstrated a 2D MEMS scanner with a wedge-like structure for scanning-angle amplification. Scanning angles of 45.3° and 42.6° have been demonstrated in the x- and y-axes, respectively. In 2017, a follow-up work by Ye et al. [45] demonstrated a 2D MEMS mirror with a driving voltage of 5 V and two-axis scanning frequencies of 947.51 and 1464.66 Hz. The schematic of the 2D MEMS mirror assembly, with a base consisting of two lead zirconate titanate (PZT) ceramics, and its 2D scanning pattern are shown in Figure 4b,c, respectively. Also, in the same year, a Ti-alloy-based microscanning mirror with a large aperture size of 12 mm and a fast scanning frequency of 1.24 kHz was demonstrated by the same group. [46] The large aperture size of the microscanning mirror enables the LiDAR system to work at longer ranging distances.
In 2018, the study by Wang et al. [20] demonstrated an electrothermally actuated MEMS scanner. The scanner has been applied in a LiDAR prototype, which has a volume of 100 mm × 100 mm × 60 mm and a weight of <100 g. The LiDAR prototype is suitable for sensing applications on small UAVs. In the same year, in a study by the same research group, [47] a novel design of MEMS mirrors bending vertically to the substrate was demonstrated. The vertical mirror scheme is able to perform direct forward scanning without beam folding, in contrast to the conventional case where mirrors are parallel to the substrate. Hence, compared with the conventional case, direct forward scanning by a vertical mirror takes less space and reduces the effort for optical alignment. This is important for a miniaturized LiDAR scanner. The scanning electron microscopy (SEM) image of the fabricated MEMS scanner is shown in Figure 4d. The MEMS scanning range is reported to be 17° under only 4.5 V of driving voltage. The resonant frequency of the scanning mode reaches 2.2 kHz. The forward-view scanner has a compact size of 4 mm × 4.5 mm × 1.6 mm and a light weight of 16 mg, and hence can be applied for small-size LiDAR in micro-air vehicles. [47] Also based on the electrothermal actuation mechanism, in 2019, the same group reported a large-aperture two-axis MEMS mirror, with a mirror plate size of 2 × 2.5 mm² and an optical FOV of 15° × 12°. [48] The SEM image of the fabricated MEMS mirror is shown in Figure 4e. In a follow-up research study by the same group, [10] a miniature LiDAR with a detached MEMS scanner has been demonstrated. The MEMS mirror has a weight of only 10 g and dimensions of 36 mm × 30 mm × 13 mm. The LiDAR configuration with the MEMS mirror mounted on micro-robots enables zoom-in 3D scanning of the sensing object. The conceptual schematic of the LiDAR sensor configuration is shown in Figure 4f.

OPA-Based LiDAR
The integrated photonics platform enables compact, functional optical devices on chip with a small footprint. These functional devices include laser sources, [49][50][51][52] optical modulators, [53][54][55][56] optical filters, [57][58][59] optical couplers, [60,61] PDs, [62][63][64] and nonlinear optical generators. [65][66][67][68] Also, among solid-state LiDAR approaches, integrated photonics provides an alternative and compact platform for optical beam scanning using an OPA, [14] which has drawn significant interest in the research community in recent years. [23,69] Compared with MEMS-based LiDAR, OPA-based LiDAR does not need mechanical moving parts, and hence has orders-of-magnitude faster speed and higher reliability by avoiding vulnerability to mechanical shocks. [16] The study by Heck [70] and the more recent study by Guo et al. [71] have made comprehensive summaries of the research progress on OPAs. Also, the review work by Sun et al. [72] provides a good summary of silicon photonics OPAs and key components (e.g., antennas, phase shifters) for practical LiDAR solutions. In this section, our focus will be on OPA-based solid-state LiDAR sensors that have been demonstrated, with key features/parameters summarized in Table 3.
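As background for the demonstrations below, the textbook 1D phased-array relation connects a uniform per-element phase increment Δφ across emitters of pitch d to the steering angle via sin(θ) = λΔφ/(2πd). The sketch below evaluates this relation (this is the generic relation, not any cited device; the 1550 nm wavelength and 2 μm pitch are illustrative assumptions):

```python
import math

# Generic 1D phased-array steering sketch: a linear phase gradient across
# the emitter array tilts the emitted wavefront, steering the main lobe.
def steering_angle_deg(wavelength_m, pitch_m, dphi_rad):
    """Main-lobe angle for a per-element phase increment dphi_rad."""
    s = wavelength_m * dphi_rad / (2 * math.pi * pitch_m)
    return math.degrees(math.asin(s))

# Illustrative assumptions: 1550 nm light, 2 um emitter pitch.
for dphi in (0.0, math.pi / 4, math.pi / 2):
    theta = steering_angle_deg(1550e-9, 2e-6, dphi)
    print(f"dphi = {dphi:.3f} rad -> theta = {theta:.2f} deg")
```

This also shows why phase-shifter count and emitter pitch dominate OPA design: a smaller pitch widens the steerable range, while the phase-shifter resolution sets the angular step.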
In the study by Sun et al., [26] a large-scale 2D OPA with 64 × 64 antennas has been demonstrated on a silicon photonics chip, fabricated at wafer scale in a CMOS-compatible fabrication line. Hence, the compact and robust OPA can be mass-produced at low cost. Also, the OPA has the potential to be integrated with other integrated photonics devices [27] as well as electronic circuits. [74] In 2017, the study by Poulton et al. [23] demonstrated the first LiDAR using a 1D OPA on the same silicon photonics platform. The FMCW sensing scheme enables the LiDAR system to capture both distance and velocity information simultaneously. The schematic of an FMCW LiDAR system is shown in Figure 5a, including the TX OPA and RX OPA. By steering these two OPAs, the range information for three targets at different locations is obtained, as shown in Figure 5b. The chip with OPAs can be packaged on a circuit board with an optical fiber, as shown in Figure 5c. In 2019, a follow-up study by the same group [24] demonstrated an FMCW LiDAR system using an OPA for long-range sensing. The OPA containing 512 elements is wire-bonded onto a printed circuit board and packaged with a polarization-maintaining fiber, as shown in Figure 5d. A look-up table is generated for beam steering from the packaged OPA. Figure 5e shows the beam spots from the OPA captured by an IR camera. The beam-spot steering speed has been reported to be ≈30 μs, limited by the interface between the digital-to-analog converter and field-programmable gate arrays (FPGA). A beam steering range of 56° × 15° has been demonstrated. Furthermore, a prototype OPA-based LiDAR system has been used for outdoor long-range sensing. The OPA without phase shifters has been used for the long-range sensing demonstration. The system has a frame rate of 10 Hz and is claimed to be the first coherent OPA LiDAR for long-range sensing. The ranging distance is up to 185 m. Furthermore, the research work by Bhargava et al.
[73] reports the first demonstration of integration between a photonics front end (including OPAs) and CMOS electronics in a single chip, with the system schematic shown in Figure 5g. Photographs of the bonded wafer and packaged optical device are illustrated in the left panel of Figure 5h. The microscopy image of the LiDAR chip, including the transmitter OPA and receiver OPA, is shown in the right panel of Figure 5h. The FMCW laser signal is coupled onto the chip through a fiber edge coupler, and then split into the transmitter OPA and local oscillator (LO) paths. The reflected signal from the sensing target is collected by the receiver OPA and beat with the signal from the LO path. A photograph of the LiDAR testing setup is also presented.

In 2020, the study by Lee et al. [14] demonstrated the first chip-scale LiDAR solution with an integrated optical source and amplifier, which paves the way for low-cost, compact, fully integrated solid-state LiDAR sensors. The schematic of the chip-scale device based on III-V-on-Si is illustrated in Figure 6a, where the inset shows the microscopy image of the fabricated chip with a size of 7.5 mm × 3 mm. The chip is fabricated on a silicon-on-insulator (SOI) wafer. The III-V gain layers are bonded on the patterned SOI wafer. The LiDAR system based on the TOF sensing mechanism is illustrated in Figure 6b. For TOF sensing, a modulated signal with 30 ns width at 1 MHz from a designed drive board is used to drive the beam scanner in pulsed mode. Semiconductor optical amplifiers (SOAs) in the optical scanner are driven by 100 mA of current to enable an optical beam power of 10 mW. An avalanche photodiode (APD) array is used for detection of the reflected signal. The TX signal from the driving board and the RX signal from the APD are transferred to an analog-to-digital converter (ADC) circuit with a 1 GHz sampling rate. The TOF is obtained by calculating the cross-correlation of the TX and RX signals on an FPGA, from which the 3D depth image and point cloud plot are reconstructed. The 3D LiDAR scanning can achieve a frame rate of >20 Hz.
For LiDAR demonstration, a pedestrian walking from a wall 10 m away has been captured. The selected camera image, depth image, and 3D point cloud plot are illustrated in Figure 6 panels c, d, and e, respectively.
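The cross-correlation TOF step described above can be sketched as follows. This is an illustrative software model, not the cited FPGA implementation: the pulse shapes are synthetic, and only the 1 GHz sampling rate is taken from the text:

```python
# Cross-correlation TOF sketch: the lag that maximizes the correlation
# between the TX template and the digitized RX waveform gives the
# round-trip delay in ADC samples.
C = 299_792_458.0
FS = 1e9  # 1 GHz ADC sampling rate, as stated in the text

def tof_by_xcorr(tx, rx):
    """Return the round-trip delay (s) at the best-matching lag."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(rx) - len(tx) + 1):
        score = sum(t * r for t, r in zip(tx, rx[lag:lag + len(tx)]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag / FS

tx = [0.0] * 5 + [1.0] * 30 + [0.0] * 5  # synthetic 30 ns pulse template
rx = [0.0] * 67 + tx + [0.0] * 50        # echo delayed by 67 samples
delay = tof_by_xcorr(tx, rx)
print(f"range: {C * delay / 2:.2f} m")   # 67 ns round trip -> ~10 m
```

Correlating against the full pulse template, rather than thresholding a single sample, is what makes this approach robust to noise on the APD output.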
More recently, in 2021, Nakamura et al. demonstrated an OPA-based LiDAR using liquid crystal (LC) as the tunable material. [25] The LC tuning provides one more dimension for beam steering, in addition to the OPA beam steering by phase shifters, and hence enables 2D beam steering at a single wavelength. The OPA with LC is fabricated using a standard silicon photonics process and an LC process used for commercial LC displays. The schematic of the 1D OPA with LC is illustrated in Figure 7a. The vertical beam steering is enabled by the 1D OPA, and the horizontal beam steering is enabled by LC tuning. The LC tunable antenna is formed by an LC core sandwiched between distributed Bragg reflectors (DBRs), as illustrated in the right panel of Figure 7a. The top DBR has higher transmittance for light emission. The LC molecular orientation and refractive index can be changed by an applied electric field, and hence a phase change can be achieved for beam steering. The layout and cross-section of the eight-channel OPA are shown in Figure 7b. The thermo-optic phase shifter enables 1D OPA tuning in the vertical direction. The photonics components are based on a silicon nitride (Si3N4) platform, with a working wavelength of 940 nm. The microscopy image of the fabricated device is shown in Figure 7c. The beam steering range of the OPA reaches 15° × 16°. Furthermore, a LiDAR system has been demonstrated using the LC-tunable device capable of 1D beam steering, with the schematic shown in Figure 7d. Target tracking has been realized by detecting a person with a black-and-white camera and steering the beam toward the target person. By using a TOF camera with a synchronized laser diode coupled to the LC beam steering device, the distance of the sensing target can be obtained. A maximum sensing distance of 12 m and a refresh rate of 10 frames s−1 have been reported.

Emerging Nanophotonics-Based LiDAR Sensor
With the advances in nanofabrication, nanostructures with sub-optical-wavelength dimensions can be patterned at large scale, enabling engineered light-matter interaction. The engineering and patterning of nanostructures contribute to the control and manipulation of electromagnetic waves in the optical wavelength regime. One example is integrated photonics technology, which demonstrates large-scale patterning of nanostructures (e.g., Bragg gratings [61,75] and high-Q microring resonators [76][77][78]) on photonic integrated circuits (PICs). The high-Q microring resonator enables the generation of OFCs, which can be used for sensing. Another example is flat optics technology, where the nanostructures in a layer are engineered to obtain a desired phase profile [79] and hence achieve various functionalities. [80] The metasurface brings the advantages of compactness, capability for dispersion control, and high optical beam quality without aberration. The novel nanophotonics devices mentioned here have also been demonstrated for LiDAR sensing. In this section, we review recently demonstrated LiDAR sensors implementing novel nanophotonics devices, including optical switches for sequential-illumination LiDAR, OFCs as light sources for parallel scanning and high-precision ranging in LiDAR, and metasurfaces for optical beam steering and deflection in LiDAR.
A summary of nanophotonics-based LiDAR sensors with key features is also presented in Table 4.
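To make the "desired phase profile" idea of flat optics concrete, the sketch below evaluates the standard hyperbolic lens phase used in metasurface design, where each position must retard the wavefront so all rays arrive at the focus in phase. This is the generic textbook profile, not any cited device; the 940 nm wavelength, focal length, and sample positions are illustrative assumptions:

```python
import math

# Metasurface lens sketch: required phase at radial position x so that the
# optical path x -> focus is constant, phi(x) = -(2*pi/lambda) *
# (sqrt(x^2 + f^2) - f), wrapped to [0, 2*pi) as a nanostructure would be.
def lens_phase(x_m, wavelength_m, focal_m):
    """Wrapped phase (rad) the meta-atom at radius x must impart."""
    phi = -2 * math.pi / wavelength_m * (math.hypot(x_m, focal_m) - focal_m)
    return phi % (2 * math.pi)

wl, f = 940e-9, 100e-6  # assumed design wavelength and focal length
for x in (0.0, 10e-6, 25e-6, 50e-6):
    print(f"x = {x * 1e6:5.1f} um -> phase = {lens_phase(x, wl, f):.3f} rad")
```

In a real device, each sampled phase value is realized by selecting a nanostructure geometry from a pre-characterized library, which is where dispersion control enters the design.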

Optical Switches for Sequential Illumination LiDAR
In the earlier section, OPA-based LiDAR sensors were reviewed. Also based on the integrated photonics platform, sequential illumination/flash LiDAR sensors have more recently been demonstrated. [29,32,[82][83][84] In such a sequential flash approach, spatial scanning is achieved by controlling optical switches to sequentially switch the emission among different emitters. Compared with the flash approach mentioned in the earlier section, the sequential illumination/flash approach overcomes the limitation on the optical power budget, and hence enables a larger sensing range. For the LiDAR sensor hardware, the PIC components covered in this section include optical switches and PDs. An additional note is that the OFC generator can also be on an integrated photonics platform, which will be discussed separately in Section 4.2.
In 2018, the study by Martin et al. [29] reported a sequential flash LiDAR sensor working in the FMCW scheme. The sensor PIC includes two optical switching networks (SNs) to separately switch illumination among eight emission and eight collection channels on a single CMOS-compatible silicon photonics chip. Switching among the eight channels by controlling the SNs enables spatial scanning, so that beam scanning by moving parts is avoided. A frequency-modulated distributed feedback laser is used as the external off-chip light source. Part of the optical power goes into the TX as well as a delay line interferometer (DLI), and the other part goes into the RX as the LO. The optical signal is coupled onto the photonics chip via grating couplers. The DLI, cascaded with a balanced photodetector (BPD), is used to monitor and control the chirp of the laser source. The TX emits light in eight different directions through eight collimation lenses. The reflected light is collected and routed to the RX through eight external fiber circulators. These reflected signals beat with the LO in BPDs in the RX. The operation of the LiDAR sensor with all eight channels working is demonstrated by measuring a wall at 9.5 m distance. Sequential flash LiDAR sensors combining optical switches with lens-assisted beam-steering (LABS) technology have also been reported in recent studies. [82,83] LABS technology has attracted research interest owing to its low control complexity and high background suppression. [83] The more recent work by Li et al. [83] reports 2D beam steering by placing a cylindrical lens above an emitter array on a PIC chip, with the schematic shown in Figure 8a. 2D beam steering is achieved by both thermal switching among different emitters (along the x direction) and wavelength tuning of the input signal (along the y direction). One point to emphasize is that at any one time, only one emitter is switched on.
A TOF LiDAR has been demonstrated using this beam steering device. The schematic of the experiment setup is shown in Figure 8b. A pulsed laser is used as the light source, followed by a pulse picker to reduce the repetition rate in the time domain, and a spectral filter to select the wavelength in the frequency domain. The signal is then split into two paths: one path goes directly into a PD, and the other goes into an amplifier and the LABS device transmitter. In this TOF LiDAR demonstration, optical signals with multiple wavelengths are emitted and collected simultaneously to improve the sensing speed. The returning signals are received by a fiber array and a few avalanche photodiodes (APDs), as plotted in Figure 8c together with the reference signal. From the time delay between the reference signal and the returned signals, the target distances can be calculated as 1.08 and 11.22 m.
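For reference, the range calculation behind such a pulsed-TOF measurement is simply distance = c × delay / 2; a minimal sketch (the 7.2 ns delay below is a back-calculated illustration, not a value quoted in ref. [83]):

```python
# Pulsed-TOF ranging: the target distance is half the round-trip
# optical path, distance = c * delay / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_delay(delay_s: float) -> float:
    """Target distance (m) from the delay between the reference pulse
    and the returned pulse."""
    return C * delay_s / 2.0

# A round-trip delay of ~7.2 ns corresponds to roughly 1.08 m:
print(round(distance_from_delay(7.2e-9), 2))  # -> 1.08
```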
Figure 8. A LiDAR sensor with optical switches for sequential illumination. a) Schematic of the 2D beam steering device with a cylindrical lens placed above the emitter array on a chip with integrated optical switches. b) Schematic of the TOF LiDAR sensing setup with the beam steering device. Inset: Optical source signal in the frequency domain and time domain from i) pulsed laser, ii) pulse picker, iii) spectral filter. c) Plot of the reference signal and two reflected signals from sensing objects located at distances of 1.08 and 11.12 m. a-c) Adapted with permission. [83] Copyright 2021, Chinese Laser Press.

Furthermore, a study reported in 2021 by Rogers et al. [32] demonstrated a 3D imaging sensor based on sequential illumination/flash with an optical switching tree on a silicon photonics platform. The LiDAR sensor is based on the FMCW scheme, with a TX focal plane array (FPA) and an RX FPA. The large-scale coherent receiver array with 512 pixels operates at the quantum noise limit. The heterodyne detector is integrated with the electronic readout circuit through the monolithic integration of photonic and electronic circuits, which also provides the possibility for further scale-up of the pixel number. The trade-off between the FOV and sensing range is eliminated by sequentially illuminating and reading out the sensing scene. The schematic of the RX block formed by heterodyne PD pixels is shown in Figure 9a. The zoom-in view of one RX pixel is illustrated in the inset of Figure 9a on the left side. The LO light is guided through a 1 × 8 switching tree and combines with the reflected light collected by grating couplers in each pixel. The heterodyne signal is detected by a Ge BPD and the generated photocurrent signal is amplified by a transimpedance amplifier (TIA). At the end of each row, there is an output amplifier to transmit the signal off the chip. The FMCW LiDAR sensing result is shown in Figure 9b.
A 3D point cloud of a rotating basketball located at 17 m distance, with velocity indicated by the color bar, is illustrated in Figure 9b. A photograph of the basketball setup, rotating at 1 rpm, and the velocity measured across the middle of the ball by the FMCW scheme are shown in the top and bottom panels of Figure 9c, respectively.
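The range and velocity extraction in an FMCW scheme like this can be sketched with the standard triangular-chirp relations: the Doppler shift cancels in the sum of the up- and down-chirp beat frequencies, while the range term cancels in their difference. The sweep parameters below are assumed for illustration and are not taken from ref. [32]:

```python
# Hedged FMCW sketch: distance and radial velocity from the beat
# frequencies of the up- and down-chirp halves of a triangular sweep.
C = 299_792_458.0     # m/s
WAVELENGTH = 1.55e-6  # m, telecom-band source (assumed)
CHIRP_BW = 1e9        # Hz, sweep bandwidth (assumed)
CHIRP_T = 10e-6       # s, duration of one chirp half (assumed)

def beat_freqs(dist_m: float, vel_mps: float) -> tuple[float, float]:
    """Forward model: (up-chirp, down-chirp) beat frequencies."""
    slope = CHIRP_BW / CHIRP_T
    tau = 2.0 * dist_m / C                    # round-trip delay
    f_doppler = 2.0 * vel_mps / WAVELENGTH    # Doppler shift
    return slope * tau - f_doppler, slope * tau + f_doppler

def range_and_velocity(f_up: float, f_down: float) -> tuple[float, float]:
    """Inverse: Doppler cancels in the sum, range cancels in the difference."""
    slope = CHIRP_BW / CHIRP_T
    tau = (f_up + f_down) / (2.0 * slope)
    f_doppler = (f_down - f_up) / 2.0
    return C * tau / 2.0, WAVELENGTH * f_doppler / 2.0

d, v = range_and_velocity(*beat_freqs(17.0, 0.5))
print(round(d, 3), round(v, 3))  # -> 17.0 0.5
```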
More recently, the study by Zhang et al. [84] reported a 16,384-pixel LiDAR realized through the monolithic integration of grating antennas and MEMS-based optical switches on a silicon photonics chip. Compared with optical switches based on thermally tuned Mach-Zehnder interferometers (MZIs), [29,82,83] the MEMS-based optical switches have the advantages of smaller footprint, lower power consumption, and higher switching speed. [84] The schematic of the switching array with a lens on top is illustrated in Figure 10a. The optical signal is routed to the selected grating antenna through row-selection and column-selection switches. These switches operate by MEMS electrostatic actuation, with schematics of the ON and OFF states shown in Figure 10b. In the ON state, the coupler tip (in green) is pulled down to couple light from the bus waveguide (in yellow) to the grating antenna. The emitted light from the grating antenna is then collimated by the lens on top. 3D imaging at distances of 0.8, 5, and 10 m with a distance resolution of 1.7 cm has been achieved by the LiDAR sensor working in the FMCW scheme. At around 0.8 m distance, the point clouds captured by the FMCW LiDAR sensor, together with camera images of three letters at the same height and at different heights, are illustrated in Figure 10c,d, respectively.
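The scalability of this row/column-addressed architecture comes from the control count: selecting one of N × N pixels needs only 2N select lines rather than N² independent drivers. A small sketch, assuming a square 128 × 128 array for the 16,384 pixels:

```python
# Row/column addressing sketch for an N x N switch/antenna array:
# one pixel is turned on by asserting one row line and one column line.
def controls_for_pixel(row: int, col: int) -> set[str]:
    """Control lines that must be asserted to select pixel (row, col)."""
    return {f"row{row}", f"col{col}"}

N = 128
num_pixels = N * N          # 16384 addressable pixels
num_control_lines = 2 * N   # but only 256 select lines in total
print(num_pixels, num_control_lines)  # -> 16384 256
```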
Figure 9. A LiDAR image sensor based on sequential illumination enabled by an optical switching tree. a) Schematic of the RX block formed by heterodyne PD pixels. Inset: Zoom-in view of one RX pixel. Light from the LO is guided by a 1 × 8 switching tree. A grating coupler is used to collect the reflected light from the sensing object. The LO light and reflected light undergo heterodyne detection in a Ge BPD. The generated electrical signal is amplified by a TIA. At the end of each row, there is an output amplifier to transmit the signal off the chip. b) 3D point cloud of a rotating basketball located at 17 m distance, with velocity information indicated in the color bar. c) Top panel: Photograph of the basketball setup rotating at a speed of 1 rpm. Bottom panel: Measured velocity across the middle of the rotating basketball. a-c) Adapted with permission. [32] Copyright 2021, Springer Nature.

In addition to the integrated beam splitters, optical switches, gratings, and PDs discussed in the earlier works, the study by Yang et al. [81] demonstrated an optical pulse circulator on an integrated photonics platform, which has potential applications in a LiDAR system. The circulator is formed by a high-Q silicon racetrack resonator and a silicon bus waveguide with an inverse-designed reflector. The geometrical asymmetry and optical nonlinearity contribute to the nonreciprocity of the device. Although the proof-of-concept LiDAR demonstration reported in ref. [81] is not based on sequential illumination/flash, the integrated circulator has the potential to be implemented in various LiDAR systems including sequential illumination, e.g., to replace the off-chip circulators within the LiDAR system reported in ref. [29].
An additional note worth mentioning is that the sequential illumination/flash approach has been deployed in commercial LiDAR sensors. One example is the LiDAR sensor from Ibeo Automotive Systems GmbH for autonomous driving. Another example is the LiDAR sensor on the iPhone and iPad from Apple Inc. for consumer electronics. A typical configuration is to use a vertical-cavity surface-emitting laser (VCSEL) array on the transmitter side and a single-photon avalanche diode (SPAD) array on the receiver side. For the LiDAR sensor from Ibeo, the mapping between the emitter and receiver overcomes the power budget limitation constrained by eye-safety regulations, and hence enables a longer sensing range from the LiDAR system.

Frequency Comb Sources for High-Performance LiDAR
In addition to the above-mentioned OPA-based LiDAR and sequential illumination LiDAR with integrated optical switches, OFCs can also be generated on an integrated photonics platform. An OFC is a high-precision metrology tool consisting of a series of equally spaced optical frequency lines. It has been used in many areas, including optical frequency metrology, [85] optical frequency synthesis, [86][87][88] microwave photonics, [89][90][91] distance measurement, [92,93] chemical sensing, and spectroscopy. [94,95] Broadband light sources including OFCs and supercontinuum can be generated through the nonlinear optical effects of waveguide materials. [66,[96][97][98][99] Contributed by the nonlinear optical properties of the materials on integrated photonics platforms, chip-scale OFCs have been demonstrated and investigated. [100][101][102] Recently, integrated OFCs have also been used as light sources for LiDAR sensing. [1,2,30,103] In this section, recent research progress on LiDAR sensors using integrated OFCs is reviewed.

Figure 10. A LiDAR sensor based on sequential illumination enabled by MEMS-actuated optical switches. a) Schematic of the MEMS-actuated optical switching array with a lens on top. Light is routed to the selected grating antenna through row-selection and column-selection switches. The lens on top collimates the emitted light from the selected grating antenna. b) Schematic of the ON and OFF states of the optical switches and grating antennas. In the ON state, the coupler tip (in green) is pulled down to couple light from the bus waveguide (in yellow) to the selected grating antenna. c,d) Point clouds captured by the FMCW LiDAR sensor and camera images of the sensing target formed by three letters at the c) same height and d) different heights, located at around 0.8 m distance. a-d) Adapted under the terms of a Creative Commons Attribution 4.0 International License. [84] Copyright 2022, The Authors, published by Springer Nature.
In 2018, the study by Suh and Vahala [30] reported dual-frequency combs used for TOF LiDAR sensing, achieving 200 nm precision in distance measurement. Sensing distances up to 25 m with lower precision were also reported. The high precision and long range of the LiDAR sensing are enabled by the use of dual-frequency combs, which are generated by pumping a single microresonator in the clockwise (CW) and counterclockwise (CCW) directions. A dual comb from a single resonator not only simplifies the system by avoiding the use of two resonators and pump sources, but also improves the mutual coherence between the two combs. [30] The schematic of the dual-comb generation setup and LiDAR sensor setup is illustrated in Figure 11a. A continuous-wave pump laser source is amplified by an erbium-doped fiber amplifier (EDFA) and then split into two arms through a 50/50 coupler. In each arm, an acousto-optic modulator (AOM) is used to control the pump frequency, and a polarization controller (PC) is used to tune the polarization of the pump light. The frequency of the pump laser is locked by the servo through a feedback loop with a PD detecting the CCW soliton. A fiber Bragg grating (FBG) filter is used to attenuate the residual pump. The optical spectrum, electrical spectrum, and time-domain signals are monitored by an optical spectrum analyzer (OSA), an electrical spectrum analyzer (ESA), and an oscilloscope, respectively. For target distance sensing, the CW soliton is split into two arms through a 50/50 splitter. One arm is a reference beam (green dotted arrow), and the other arm is a beam for target sensing (orange dotted arrow). The beams from both arms are combined with the CCW beam (blue dotted arrow) for dual-comb interferometric detection. In another dual-comb ranging demonstration, [1] the high measurement speed is contributed by the large free spectral range of the comb lines, and the high ranging precision by the wide optical bandwidth (>11 THz). The ranging of an in-flight gun projectile (v = 150 m s−1) has been demonstrated. The schematic of this LiDAR setup is shown in Figure 11c top panel.
Two dissipative Kerr soliton combs are generated from two separate Si3N4 microring resonators. The sensing measurement is conducted for a bullet flying at 150 m s−1 from an air gun. The measured bullet profile is plotted in red in the Figure 11c middle panel, together with a reference measurement of the static bullet by an optical coherence tomography system (in blue) for comparison. A photograph of the bullet is shown in the Figure 11c bottom panel for reference.
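The bandwidth argument above can be made concrete: up to order-unity factors that depend on the spectral shape, the transform-limited depth resolution of comb-based ranging goes as c/(2B), where B is the optical bandwidth, and averaging improves the precision beyond this single-shot figure. A back-of-envelope sketch:

```python
# Nominal (transform-limited) depth resolution of comb-based ranging:
# resolution ~ c / (2 * optical bandwidth), up to order-unity factors.
# Averaging pushes the achievable precision well below this figure.
C = 299_792_458.0  # m/s

def nominal_resolution(optical_bandwidth_hz: float) -> float:
    return C / (2.0 * optical_bandwidth_hz)

# An >11 THz comb span supports micrometre-scale nominal resolution:
print(f"{nominal_resolution(11e12) * 1e6:.1f} um")  # -> 13.6 um
```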
Furthermore, in addition to the dual-comb approach, the research work by Riemensberger et al. [2] demonstrated a frequency-comb-based FMCW LiDAR for massively parallel 3D sensing. In comparison with TOF LiDAR, although FMCW LiDAR has the advantages of obtaining velocity information, immunity to interference, and operation at lower optical peak power, it is limited in acquisition speed and requires a precisely chirped coherent laser source. [2] The massively parallel 3D sensor reported in ref. [2] provides a solution to overcome this limitation. A frequency-modulated continuous-wave laser is used to pump a high-Q Si3N4 microring resonator. The idea is to transfer the chirp from the pump source to multiple comb lines while retaining the repetition rate of the optical signal. In this way, an array of independent frequency-modulated sources can be obtained. These channels are then dispersed through diffractive optics, and each channel can be used to measure distance and velocity simultaneously at a different location. The distance and velocity information is obtained through homodyne detection between the original signal and the reflected signal from the sensing object. Using the proposed sensor, a proof-of-concept demonstration of a parallel sensing system has been performed, with the schematic illustrated in Figure 12a. The frequency-modulated comb lines are first amplified by an EDFA and then split into two arms by a 90/10 splitter. The signal in one arm is spectrally dispersed by a transmission grating (966 lines per millimeter) for the sensing of a flywheel. The signal in the other arm is used as the LO. The radio frequency spectrum of the mixed signal has two peaks (f_u and f_d), as illustrated in the inset of Figure 12a. The optical spectrum of the emitted comb is also illustrated in the middle part of Figure 12a.
In addition, it is worth mentioning that the research work by Wang et al. [103] demonstrated a high-performance LiDAR using an integrated soliton microcomb to achieve sensing with long distance, high precision, and high speed simultaneously. The ranging distance reaches up to 1179 m, with measurement rates up to 35 kHz and high precision (a minimum Allan deviation of 5.6 μm at an averaging time of 0.2 ms). The microcomb is generated by a high-index doped silica glass-integrated microring resonator, which has the advantage of compact integration. The sensing distance is obtained through a dispersive interferometry method.
For the OFC-based LiDAR sensors discussed above, [1,2,30,103] the only integrated photonics component is the ring resonator for OFC generation. In the near future, more photonic components can be integrated on the same chip to achieve a fully integrated OFC-based LiDAR sensing system. [104] These integrated photonics components include the pump source, [105][106][107][108][109][110][111][112] OPA, [26,70,71] and BPD. [23,29,113] Recently, the study by Xiang et al. [114] demonstrated the monolithic integration of a semiconductor pump source and a Si3N4 microring resonator on silicon, which opens the door to low-cost, compact integrated OFC sources fabricated using CMOS-compatible techniques. Also, a tunable laser source with remarkable performance (118 nm tuning range, sub-100 Hz linewidth) has recently been reported. [115] Furthermore, large-area silicon photonics OPAs [26] have been demonstrated. OPAs with BPDs, directional couplers, and edge couplers have also been demonstrated on a silicon photonics platform for LiDAR sensing. [23] In addition, different chips and photonic platforms can be connected through photonic wire bonding. [116,117] Given the demonstrated photonics components mentioned here, the silicon photonics platform shows significant potential to realize a fully integrated solid-state LiDAR sensor. The integration of LiDAR on photonic and electronic chips can further reduce the cost, size, and power consumption. [34]

Metasurfaces for Beam Steering and Deflection in LiDAR
The metasurface has become an emerging field in optics and photonics in the past decade. [118,119] A metasurface is a thin layer of patterned nanostructures that manipulates the phase, amplitude, and polarization of light. By engineering the phase and amplitude profiles of the meta-elements, various functional devices have been demonstrated, including lenses, [120][121][122][123][124][125][126] beam deflectors, [127][128][129] waveplates, [130][131][132][133] spectral filters, [134][135][136][137][138] and holograms. [139][140][141][142] The metasurface is a disruptive technology relative to conventional optical devices, which are bulky by comparison. [143] Furthermore, metasurface-based devices can be fabricated using a single-step lithography process, which is compatible with the CMOS fabrication line. [144][145][146] Owing to the sub-wavelength-scale phase control by the meta-elements, a metasurface-based LiDAR system has the capability to achieve high-resolution 3D sensing. [16] Also, optical beam steering can be achieved using metasurfaces with active tuning capability. [31,147] A recent study by Park et al. [33] demonstrated a spatial light modulator (SLM) based on an electrically tunable metasurface applied in LiDAR sensing. The all-solid-state metasurface array can achieve complete phase sweeping between 0° and 360° at an estimated rate of 5.4 MHz, as well as independent adjustment of amplitude, which overcomes the limitations of earlier reported active metasurfaces. The functional tunable metasurface is formed by an array of plasmonic nanoresonators. Each nanoresonator consists of a gold (Au) layer at the top as the antenna, an indium tin oxide layer in the middle, and an Al layer at the bottom as a mirror. These three layers are electrically insulated by oxide layers in between. The real and imaginary parts of the reflection coefficient can be tuned by independently applying electric voltages to the top electrode (V_t) and bottom electrode (V_b).
The fabricated SLM packaged with driving electronics is illustrated in Figure 13a. The driving electronics provide 100 independently controlled channels, 50 for V_t and 50 for V_b. Hence, the active array shown in the left and right panels of Figure 13a has 50 channels, each containing 11 nanoantennas. Using the developed SLM, a proof-of-concept TOF LiDAR sensor has been demonstrated, with the setup schematic shown in Figure 13b. A pulsed laser at 1560 nm is used as the light source. The SLM is used for beam steering. A lens together with an APD array is used as the RX. The sensing objects include a human model, a car model, and a screen located at 2.4, 3.4, and 4.7 m away, respectively. The scanning region and the corresponding depth image are shown in the Figure 13c top and bottom panels, respectively. Good agreement between the measurement results and the actual distances can be observed.
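The beam steering available from such a phase-gradient array can be estimated from the grating equation, sin θ = λΔφ/(2πp), where Δφ is the phase step between neighbouring channels and p is the channel pitch. The pitch and phase step below are assumed values for illustration, not parameters from ref. [33]:

```python
# Grating-equation sketch for a phase-gradient SLM/metasurface:
# sin(theta) = wavelength * phase_step / (2*pi * pitch).
import math

def steering_angle_deg(wavelength_m: float, pitch_m: float,
                       phase_step_rad: float) -> float:
    s = wavelength_m * phase_step_rad / (2.0 * math.pi * pitch_m)
    return math.degrees(math.asin(s))

# Example: 1560 nm light, 900 nm channel pitch (assumed), pi/2 phase
# step between neighbouring channels:
print(round(steering_angle_deg(1.56e-6, 0.9e-6, math.pi / 2), 1))  # -> 25.7
```

Full 0° to 360° phase control, as demonstrated by the SLM above, allows the phase step, and hence the steering angle, to be varied continuously.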
In addition to the SLM mentioned above, demonstrations of other active metasurface-based devices have shown potential for application in LiDAR sensors. These active metasurface-based devices include ones utilizing MEMS technology, [148][149][150][151][152] ones implementing liquid crystal (LC) for tuning, [31,153,154] and ones using phase-change materials (PCMs). [155][156][157][158] Metasurfaces on MEMS actuator substrates enable flat optics devices with larger beam steering angles, which correspond to a larger FOV in a LiDAR system. Megahertz-level modulation speed has been achieved in the research work by Holsteen et al., [148] and this fast modulation speed translates to a potentially high frame rate during LiDAR scanning. Also, active metasurfaces using LC are an attractive approach, since they can leverage the well-developed LC display industry [153] and can be tuned by either temperature or electric field. The study by Chung and Miller [154] demonstrated the feasibility of using an LC-based metasurface to achieve a wide beam deflection angle of 144° and a high efficiency of >80% through inverse design. More recently, the study by Zhang et al. [158] reported large-scale non-volatile beam switching based on an optical PCM. The works mentioned above show the promise of using active metasurfaces for compact LiDAR sensing.
Figure 13. a) Fabricated SLM packaged with driving electronics. b) Schematic of the TOF LiDAR setup: a pulsed laser at 1560 nm is used as the light source; the SLM is used for beam steering; the steered optical pulse hits the sensing object and is reflected and collected by the RX, formed by a lens together with an APD array. c) Top panel: Optical image of the sensing objects, including a human model, a car model, and a screen located 2.4, 3.4, and 4.7 m away, respectively. Bottom panel: 3D depth image obtained from the LiDAR sensor. a-c) Adapted with permission. [33] Copyright 2021, Springer Nature. d) Schematic of the beam steering device with a metalens placed above the emitter array. MZ: Mach-Zehnder. e) Ray tracing of the designed optical device. f) SEM image of the fabricated metalens formed by silicon posts on a fused silica substrate. g) Overlapped far-field angular distribution from the 4 × 4 microring emitter array showing an FOV of 12.4° × 26.8°. d-g) Adapted with permission. [159] Copyright 2021, The Optical Society.

Furthermore, a metalens has also been combined with an active silicon photonic microring emitter array to achieve 2D beam steering, as reported by Chang et al. [159] The schematic of this beam steering device is illustrated in Figure 13d. A Mach-Zehnder (MZ) switch tree enables switching among the different microring emitters in the 2D array. An aberration-free metalens above the emitter array converts emission from different emitter locations into different propagation directions in the far field. The use of the metalens enables compact integration with the photonics devices and high optical beam quality without aberration. Also, the switching tree in the PIC allows single-wavelength operation and lower power consumption compared with OPA-based beam steering. [159] The metalens is designed to have an FOV of ±13.6°, with ray tracing shown in Figure 13e. The fabricated metalens is formed by silicon posts on a fused silica substrate, with an SEM image shown in Figure 13f. The overlapped far-field angular distribution from the 4 × 4 emitter array is illustrated in Figure 13g. An FOV of 12.4° × 26.8° has been demonstrated experimentally. This solid-state device shows potential for LiDAR sensing with compact size, low power consumption, and high optical beam quality.
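The emitter-to-angle mapping underlying this lens-assisted scheme is simple ray optics: an emitter displaced by x from the metalens axis yields a collimated beam at θ = arctan(x/f). The focal length and array half-extent below are assumed for illustration and are not taken from ref. [159]:

```python
# Lens-assisted beam steering sketch: an emitter offset x under a lens
# of focal length f produces a collimated beam at angle atan(x / f).
import math

def beam_angle_deg(offset_m: float, focal_length_m: float) -> float:
    return math.degrees(math.atan2(offset_m, focal_length_m))

f = 100e-6             # assumed metalens focal length
half_extent = 24.2e-6  # assumed half-extent of the emitter array
print(round(beam_angle_deg(half_extent, f), 1))  # -> 13.6 (edge-emitter angle)
```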
An additional note worth mentioning is that metasurface-based dot projectors/point cloud generators [129,160,161] also have the potential to be applied in 3D sensing. A remarkable work using such an approach is the full-space random point cloud generated by a scrambling metasurface, demonstrated in the research work by Li et al. [160] The metasurface is formed by amorphous silicon on a SiO2 substrate, which can be patterned by a single-step lithography process. 4044 points are observed in full space, with angles covering up to 90°. Also, in the research work by Xie et al., [147] metasurface beam deflectors are monolithically integrated on an array of 10 × 10 VCSELs. This configuration enables point cloud generation as well as programming of the emission angle by controlling each VCSEL independently. Point cloud 3D sensors have already been applied in consumer electronics, e.g., cellphones and tablets using point clouds for facial recognition. Metasurface-based dot projectors/point cloud generators have the potential to further improve the compactness of such commercial products.

Summary and Outlook
To sum up, in this review, first, different LiDAR sensing/ranging approaches are introduced, including pulsed TOF, AMCW TOF, and FMCW. The mechanism of each approach is explained with diagrams and equations. Next, recent research progress on conventional solid-state LiDAR sensors is reviewed, covering flash-based, MEMS-based, and OPA-based LiDAR sensors. Following that, LiDAR sensors utilizing novel nanophotonics devices are summarized and discussed. The nanophotonics devices implemented in LiDAR sensors include optical switches for sequential illumination, integrated resonators for OFC-based high-performance sensing, and metasurface-based SLMs for beam steering.
In the near future, compact solid-state LiDAR sensors with high performance in terms of speed, resolution, FOV, and power consumption are in demand and require further research and development. [15] Thanks to advanced nanoscale semiconductor fabrication technology, photonics devices can be mass-produced at low cost. These devices have the potential to miniaturize and redesign existing LiDAR sensing systems for various applications. [16] Hence, we believe future research and development of advanced LiDAR sensors should leverage the advancement of photonics technology and can be directed along the following three pathways. First, as mentioned earlier, for the OFC-based LiDAR sensors demonstrated so far, [1,2,30,103] the only integrated photonics component is the microring resonator. In future work, different photonics devices can be integrated on the same chip, including the pump source, OPA, and PD, to achieve a fully integrated photonics-based LiDAR sensing system. Also, different chips or photonic platforms can be linked by photonic wire bonding. [116,117] In addition, the integration of photonic and electronic chips can further reduce the system size, cost, and power consumption. [34,73,162] Second, the demonstrations of active metasurface-based devices have shown their potential for LiDAR sensors. Active tuning of a metasurface can be achieved by utilizing MEMS technology, [148][149][150][151][152] implementing LC, [31,153,154] or using PCM. [155][156][157][158] As mentioned earlier, the sub-wavelength-scale phase control by meta-elements enables a metasurface-based LiDAR system to achieve high-resolution 3D sensing. Also, the large beam steering angle of active metasurface-based devices enables a LiDAR system with a large FOV, and the fast modulation speed of active metasurfaces contributes to a high frame rate.
Future efforts can be directed toward active metasurface-based LiDAR sensors, to demonstrate compact, high-performance 3D sensing systems.
Last but not least, further exploration of multispectral LiDAR sensors can be conducted. In addition to the 3D information of the sensing object, multispectral LiDAR is capable of capturing information in one more dimension: the spectral domain. Spectral information can be used for material sensing and identification of chemical composition. [163][164][165][166][167][168][169][170] A recent review summarizes the research progress toward compact spectral imaging and spectral LiDAR sensors. [104] Spectral LiDAR has been applied to environment monitoring [4,5,171] and shows potential for application in autonomous driving. [172] While nanophotonics-based spectral imaging systems have been demonstrated, [173][174][175][176][177][178] compact nanophotonics-based spectral LiDAR sensing systems, together with wide-bandgap integrated photonics materials, [179][180][181] remain to be further explored. Potential applications include biometric identification, biomedical imaging, autonomous driving, archaeology, and art conservation.