Live demonstration: a frame-based CMOS vision sensor with high dynamic range for events generation
Publisher
IEEE Xplore
Abstract
This live demonstration shows a frame-based CMOS vision sensor with high dynamic range for event generation. The sensor features 64 × 64 processing elements, each comprising one 4T-APS and local circuitry for event generation and dynamic range extension. Events are generated synchronously by thresholding the difference between consecutive frames. The dynamic range extension is carried out per pixel with the overflow-capacitance method. The sensor can reach thousands of event frames per second. Electrical simulations indicate a dynamic range of 85 dB, which narrows the gap with dynamic vision sensors.
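The synchronous event-generation scheme described above can be sketched as a simple per-pixel operation: threshold the intensity difference between two consecutive frames and emit ON/OFF events where it exceeds the threshold. The sketch below is an illustrative software model under assumed conventions (8-bit frames, a signed ternary event map); it is not the chip's actual circuit-level implementation, and the function name and threshold value are hypothetical.

```python
import numpy as np

def frame_difference_events(prev_frame, curr_frame, threshold):
    """Ternary event map from two consecutive frames:
    +1 (ON) where intensity rose by more than `threshold`,
    -1 (OFF) where it fell by more than `threshold`, 0 otherwise."""
    diff = curr_frame.astype(np.int32) - prev_frame.astype(np.int32)
    events = np.zeros_like(diff, dtype=np.int8)
    events[diff > threshold] = 1
    events[diff < -threshold] = -1
    return events

# Example on a synthetic 64 x 64 frame pair, matching the sensor's array size
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10, 10] = 50  # a brightness increase at one pixel -> ON event
ev = frame_difference_events(prev, curr, threshold=20)
print(ev[10, 10])  # 1
```

In the sensor itself this comparison happens in each pixel's local circuitry, so the whole array produces an event frame in one synchronous step rather than pixel by pixel.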
Description
Demonstrator of an event-sensor circuit with 64 × 64 pixels, generating events from the difference of two images. The sensor is designed in a 180 nm technology, and its main features are: 81 dB dynamic range, 1,400 events per second, and 250 nW/px power consumption.
Bibliographic citation
M. Jaklin, D. García-Lesta, P. López and V. M. Brea, "Live Demonstration: A Frame-Based CMOS Vision Sensor with High Dynamic Range for Events Generation," 2025 IEEE International Symposium on Circuits and Systems (ISCAS), London, United Kingdom, 2025, pp. 1-1, doi: 10.1109/ISCAS56072.2025.11044206
Publisher version
https://doi.org/10.1109/ISCAS56072.2025.11044206
Sponsors
This project has received funding from the H2020 Marie Skłodowska-Curie Actions (Grant/Award Number: 860370); Xunta de Galicia (ED431G 2019/04, ED431C 2021/048, ERDF/FEDER); the "Cátedra Televés en Diseño Microelectrónico" (TSI-069100-2023-0010) by the PERTE Chip, Secretaría de Estado de Telecomunicaciones e Infraestructuras Digitales, Ministerio de Asuntos Económicos y Transformación Digital, co-funded by the European Union NextGenerationEU; and the Spanish Ministry of Science, Innovation and Universities under Grant PID2021-128009OB-C32.