Abstract

Sensing and predicting an individual’s thermal comfort in buildings is a complex but crucial task for transitioning toward human-centered indoor climate control. Non-intrusive collection of physiological and personal information from the human body, such as skin temperature, activity rate, and clothing insulation, is possible by applying computer vision algorithms to infrared (IR) and RGB images. In this study, we applied multi-modal non-intrusive computer vision algorithms to extract personal features such as the clothing ensemble, activity level, posture, sex, age, and skin temperature as parameters defining a human’s thermal comfort. Moreover, we evaluated the capability of an IR camera to detect the skin temperature of the face and hands by comparing its readings with contact measurements from iButton skin temperature sensors in controlled experiments performed on males and females in a climatic chamber. Finally, we highlighted the remaining research directions that would pave the way for non-intrusive personalized human thermal comfort.