A Single-Wavelength Real-Time Material-Sensing Camera Based on Time-of-Flight Measurements

Citation Author(s):
Otmar Loffeld, Thomas Kerstein, Bernd Buxbaum
Submitted by:
Miguel Heredia Conde
Last updated:
14 May 2020 - 3:55am
Document Type:
Demo
Document Year:
2020
Presenters Name:
Miguel Heredia Conde
Paper Code:
S&T-P2.3
Abstract 

Time-of-Flight (ToF) cameras provide a fast and robust way of acquiring the 3D shape of real scenes. Dense depth images can be generated at tens of frames per second. 3D shapes can then be segmented and objects classified, but can we directly sense an object's material using just a ToF camera? This live demonstration proves the answer to be affirmative. This possibility has only very recently been unveiled, and we are, to the best of our knowledge, the first to provide a live demonstrator showing the feasibility of this approach. Unlike mainstream research on material classification, we do not rely on costly hyperspectral measurements, but on single-wavelength ToF measurements. Depending on the nature of the material, the probing function arising from the modulation/demodulation sensing scheme (e.g., a sinusoid in CW-ToF) is further convolved with a different impulse response function that is characteristic of the material. If the material is perfectly opaque and plain, this response function is a Dirac delta function. If not, phenomena such as internal reflections produce band-limited response functions. Using a multi-frequency CW-ToF system, we are able to extract the leading Fourier coefficients of such band-limited functions, thus obtaining an accurate approximation of them and enabling native material recognition. This novel approach opens the door for low-cost ToF cameras to enter high-impact applications, e.g., those requiring the ability to distinguish biological materials from artificial ones, which typically have markedly different internal light-diffusion characteristics. The demonstrator consists of our material-sensing camera, connected to a computer that shows a real-time video of "material images". The camera observes a dynamic scene containing real-life objects of different materials. Objects can be removed from the scene and new objects can be incorporated.
The spectators will enjoy a literal "hands-on" interaction, and will also be able to challenge the system with their own items.
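The measurement principle described above can be sketched numerically: a CW-ToF lock-in pixel driven at modulation frequency f effectively measures one complex Fourier coefficient of the material's impulse response, so sweeping several frequencies samples the response's spectrum. The snippet below is a minimal illustrative simulation, not the demonstrator's actual processing pipeline; the specific impulse-response models (a narrow spike for an opaque surface, an exponential decay standing in for internal light diffusion), the time constants, and the modulation frequencies are all assumed values chosen for illustration.

```python
import numpy as np

def cw_tof_coefficients(h, t, frequencies):
    """Demodulate a material impulse response h(t) against complex
    sinusoids at several modulation frequencies, mimicking the
    per-frequency phasor a multi-frequency CW-ToF camera measures.
    Returns one complex Fourier coefficient per frequency."""
    dt = t[1] - t[0]
    return np.array([np.sum(h * np.exp(-2j * np.pi * f * t)) * dt
                     for f in frequencies])

# Time axis: 50 ns window, fine sampling (illustrative, not calibrated).
t = np.linspace(0.0, 50e-9, 5000)
dt = t[1] - t[0]

# Perfectly opaque, plain surface: impulse response ~ Dirac delta,
# approximated by a unit-area spike at a single sample.
h_opaque = np.zeros_like(t)
h_opaque[1] = 1.0 / dt

# Light-diffusing material (e.g., biological tissue): internal
# reflections smear the response into a band-limited exponential decay.
tau = 5e-9                               # assumed diffusion time constant
h_diffuse = np.exp(-t / tau) / tau       # unit-area exponential

freqs = np.array([10e6, 20e6, 40e6, 80e6])  # assumed modulation freqs (Hz)

c_opaque = cw_tof_coefficients(h_opaque, t, freqs)
c_diffuse = cw_tof_coefficients(h_diffuse, t, freqs)

# A delta response yields flat coefficient magnitudes (~1 at every
# frequency); the diffusing material's magnitudes roll off with
# frequency as 1 / sqrt(1 + (2*pi*f*tau)**2), which is the spectral
# signature that separates the two materials.
print(np.abs(c_opaque))
print(np.abs(c_diffuse))
```

In this toy model, material classification reduces to comparing the measured magnitude profile across frequencies against the flat spectrum of a delta response: any significant roll-off indicates a band-limited response and hence a diffusing material.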

Dataset Files

heredia_ICASSP_2020_poster.pdf
