ISSCC 2023 (written by Dan McGrath)

On Monday, 20 February, I attended the Image Sensor Session at ISSCC 2023. It comprised eight talks and an interesting mix of presentations. The most noticeable change in this year’s session was the large presence of event-driven image sensors, compared with only a single presentation each on SPAD-based devices and on small-pixel mobile devices. The other presentations described a silicon-based terahertz image sensor, an x-ray image sensor and a low-power image sensor powered by on-chip energy harvesting.

The first two talks described image sensors that integrated both event-driven and standard CIS pixels within the same pixel array to allow high-speed acquisition of changes in the scene interleaved with full-color video. This allowed the creation of high-speed video by using the event-driven data for deblurring. Both used a row-based scanning event-driven readout, with an architecture that minimizes loss of events and with selectable adaptation that allows skipping over areas with no activity. Andreas Suess of Omnivision (Paper 5.1) described a three-stack 1-Mpix sensor with a top layer of CIS pixel devices, a face-to-face layer with event-driven pixel circuits and a bottom layer with readout circuitry, achieving a small stacked die size. Kazutoshi Kodama of SONY (Paper 5.2) described a two-stack 2-Mpix sensor in which the face-to-face bottom layer contains the peripheral circuitry adjacent to the event-driven circuits placed under the image array. When run in adaptive mode, both allowed event rates of 4.6 keps.
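
As a rough illustration of what "skipping over areas with no activity" can mean in a row-scanned event readout, here is a minimal Python sketch; the function names, the binary event map and the skipping rule are my own simplification, not the architecture of either paper.

```python
import numpy as np

def scan_events(event_flags, skip_inactive=True):
    """Row-by-row scan of a binary event map, optionally skipping rows
    with no activity (a stand-in for the papers' adaptive mode)."""
    events, rows_read = [], 0
    for r, row in enumerate(event_flags):
        if skip_inactive and not row.any():
            continue                       # quiet row: spend no readout time
        rows_read += 1
        for c in np.flatnonzero(row):
            events.append((r, int(c)))     # (row, column) event address
    return events, rows_read

# Example: a sparse 8x8 event map. Only the two active rows are scanned.
flags = np.zeros((8, 8), dtype=bool)
flags[2, 5] = flags[6, 1] = True
print(scan_events(flags))                  # ([(2, 5), (6, 1)], 2)
```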

Atsumi Niwa of SONY (Paper 5.3) described achieving the record smallest event-driven pixel pitch of 2.97µm by using a single comparator per readout to sequentially capture the rising and falling intensity events. The image sensor also included circuitry to provide auto-thresholding to account for shifting contrast in the scene and to address illumination flicker.
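
A toy model of how one comparator can serve both event polarities, checked one after the other against a stored reference level, might look like the following; the threshold-adaptation rule is purely illustrative and not taken from the paper.

```python
def detect_event(log_intensity, ref_level, theta_on, theta_off):
    """Single comparator reused sequentially: first test the rising (ON)
    threshold, then the falling (OFF) threshold against the stored level."""
    diff = log_intensity - ref_level
    if diff >= theta_on:            # rising-intensity event
        return +1, log_intensity    # event fired: refresh the reference
    if diff <= -theta_off:          # falling-intensity event
        return -1, log_intensity
    return 0, ref_level             # no event: keep the old reference

def adapt_threshold(theta, event_rate, target_rate, gain=0.1, floor=0.05):
    """Crude auto-thresholding: raise the threshold when the event rate is
    too high (for example under flicker), relax it when the scene is quiet."""
    return max(floor, theta * (1.0 + gain * (event_rate - target_rate)))

# One pixel over a short intensity sequence, starting from a 0.0 reference.
ref, theta = 0.0, 0.2
for x in [0.05, 0.3, 0.35, 0.1]:
    pol, ref = detect_event(x, ref, theta, theta)
    print(pol, round(ref, 2))       # polarities 0, +1, 0, -1
```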

Hyuncheol Kim of Samsung (Paper 5.4) described the design of a 2×2 pixel with 0.64µm photodiode pitch that enabled all-directional PDAF. To achieve this, along with a large full well of 20ke- and a read noise of 0.98e-, the design included internal overflow within each quad of photodiodes and the use of a quarter-annulus source-follower gate.
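
For context, the quoted full well and read noise imply a single-exposure dynamic range of roughly 86 dB; the short calculation below is my own arithmetic, not a figure from the talk.

```python
import math

# Back-of-envelope single-exposure dynamic range from the quoted numbers.
full_well, read_noise = 20_000, 0.98          # electrons
dr_db = 20 * math.log10(full_well / read_noise)
print(f"{dr_db:.1f} dB")                      # about 86 dB
```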

Min Liu of the Chinese Academy of Sciences (Paper 5.5) described a silicon-based 16-kpix terahertz imaging array. The advantage of THz imaging is the combination of high resolution with non-destructive depth penetration. The 60µm pixel is built in a 180nm CMOS process with the antenna formed as a square of Metal 6. The on-chip circuitry incorporates filtering and chopping to reduce noise and a column-based 1-bit I-ADC to provide good linearity in a small area.
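
Assuming the 1-bit I-ADC is an incremental (charge-balancing) converter, its linearity comes from simple counting, as the behavioural sketch below suggests; the cycle count and voltages are illustrative.

```python
def incremental_adc_1bit(vin, vref=1.0, n_cycles=256):
    """First-order 1-bit incremental conversion (assumed meaning of 'I-ADC'):
    accumulate the input each cycle, subtract the reference whenever the
    accumulator reaches it, and count those events. The count over a fixed
    number of cycles is proportional to vin/vref, hence the good linearity."""
    acc, count = 0.0, 0
    for _ in range(n_cycles):
        acc += vin
        if acc >= vref:
            acc -= vref
            count += 1
    return count / n_cycles          # digital estimate of vin / vref

print(incremental_adc_1bit(0.37))    # ~0.37
```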

Byungchoul Park of Yonsei University (Paper 5.6) described the ROIC for a Gd2O2S scintillator-based x-ray detector demonstrated for dental and machine-inspection applications. The pixel is based on a silicon SPAD and incorporates a series of design features to enhance performance. The design incorporates twin ping-pong counters to create a seamless digital global shutter. It includes active quenching activated by counter overflow and a global reset for global-shutter operation. Imagery was shown both for a low-dose photon-counting mode and for a high-dose extrapolation mode that extends total counts by 400x. Radiation-hardness data show an expected 8-year life for the part.
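
The ping-pong idea, that one counter bank accumulates while the other is read out and the two swap at each frame boundary, can be sketched behaviourally as follows; the class and bit depth are my own illustration, not the ROIC's implementation.

```python
class PingPongCounter:
    """One pixel's pair of counter banks: the active bank accumulates photon
    counts while the other holds the previous frame for readout; the roles
    swap at every frame boundary, so no photons are missed between frames."""
    def __init__(self, depth_bits=8):
        self.banks = [0, 0]
        self.active = 0                          # bank currently counting
        self.max_count = (1 << depth_bits) - 1

    def photon(self):                            # called per detected photon
        if self.banks[self.active] < self.max_count:
            self.banks[self.active] += 1
        # a real pixel would trigger active quenching on counter overflow

    def frame_swap(self):                        # global signal, all pixels
        self.active ^= 1                         # count in the other bank now
        value = self.banks[1 - self.active]      # last frame's counts
        self.banks[1 - self.active] = 0          # cleared for the next swap
        return value

px = PingPongCounter()
for _ in range(5):
    px.photon()
print(px.frame_swap())                           # 5, counting uninterrupted
```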

Karim Ali Ahmed of the National University of Singapore (Paper 5.7) presented an image sensor aimed at surveillance that features on-chip circuitry for illumination-based energy harvesting and for a combination of feature extraction and tile-based determination of when objects enter or leave the scene. The pixel is a 12T dynamic-least-significant-bit circuit producing a 4-bit time-based output. The goal of this combination is to keep power usage low enough that the harvested energy can sustain operation, providing extended battery life.
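
A software stand-in for tile-based enter/leave detection could be as simple as thresholding per-tile frame differences; the tile size and threshold below are arbitrary and only meant to convey the idea.

```python
import numpy as np

def tile_activity(prev_frame, curr_frame, tile=16, thresh=8.0):
    """Flag tiles whose mean absolute frame difference exceeds a threshold
    (assumes the frame dimensions are multiples of the tile size)."""
    h, w = prev_frame.shape
    flags = np.zeros((h // tile, w // tile), dtype=bool)
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    for i in range(0, h, tile):
        for j in range(0, w, tile):
            flags[i // tile, j // tile] = diff[i:i+tile, j:j+tile].mean() > thresh
    return flags            # True tiles: something likely entered or left

# Example: an 'object' appears in one corner of a 64x64 scene.
a = np.zeros((64, 64), dtype=np.uint8)
b = a.copy(); b[48:, 48:] = 200
print(np.argwhere(tile_activity(a, b)))          # only the lower-right tile flags
```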

Rahul Gulve of the University of Toronto (Paper 5.8) presented a VGA-format image sensor that can produce, individually or simultaneously, global-shutter imagery and HDR flux-triggered readout. The use of random-access dual-tap pixel operation enables high-speed global-shutter updates with no light lost. The HDR mode takes a different approach from standard single-slope readout, both in using a 3T pixel to directly monitor flux and in multiple sampling of a non-linear reference waveform combined with a look-up-table-based regression to provide high-speed acquisition. The image sensor accepts an external ramp and so acted as a test bed for evaluating different reference waveforms, with the conclusion that the best waveform is a sine wave. Using 25 subexposures mapped to a 5×5 tile, the sensor produced a 25-frame burst-mode video at 4.7 kfps.
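
To make the flux-triggered HDR readout concrete, here is a toy model in which a linearly decaying 3T pixel voltage is compared against samples of a sine reference and a lookup table maps the comparator code back to a flux estimate; all waveforms, grids and numbers are my own illustration, not the paper's implementation.

```python
import numpy as np

# Toy model of flux-to-digital conversion against a non-linear reference.
times = np.linspace(0.0, 1.0, 25)                # 25 subexposure sample times
ref   = 0.5 + 0.4 * np.sin(2 * np.pi * times)    # non-linear (sine) reference
v0    = 1.0                                      # pixel reset level

def bit_pattern(flux):
    """Comparator output per sample: 1 once the pixel drops below the reference."""
    return tuple(bool(b) for b in (v0 - flux * times) < ref)

# Lookup table built offline over a grid of candidate fluxes.
candidates = np.linspace(0.1, 10.0, 200)
lut = {bit_pattern(phi): phi for phi in candidates}

def estimate_flux(observed):
    """'Regression' step: return the LUT flux whose pattern best matches."""
    return min(lut.items(),
               key=lambda kv: sum(a != b for a, b in zip(kv[0], observed)))[1]

print(round(estimate_flux(bit_pattern(3.3)), 2))  # recovers roughly 3.3
```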

The session was well attended with a good number of familiar faces. The papers were well presented and of good technical quality. The one thing missing in the session was attendee participation in the Q&A. This may be due to the novelty and variety of the presentations making it difficult to frame succinct questions on the spot.

The sensors from Papers 5.1, 5.2, 5.3 and 5.6 were part of the evening demonstration session. The interest and discussion around the tables seemed lively.

Dan McGrath.

2 Responses to “ISSCC 2023 (written by Dan McGrath)”

  1. James Zhang says:

    Hello Dan,

    I am interested in these papers.
    Could you please send the presentation slides/papers to me?
    I would very much appreciate it.

    Best regards,
    James

  2. albert says:

    James, I sent you the material through wetransfer.com
    Best, Albert.