
A recent Apple patent describes new ways to use the FaceTime camera and accompanying sensors in current iPhones to take health measurements. The process “uses one or more of the camera and the proximity sensor to emit light into a body part of a user touching a surface of the electronic device and one or more of the camera, the ambient light sensor, and the proximity sensor to receive at least part of the emitted light reflected by the body part of the user,” much like the Apple Watch takes its measurements with reflected light from sensors in contact with the user’s skin. With additional sensors in future devices to augment what is already there, Apple claims it might be able to provide “a blood pressure index, a blood hydration, a body fat content, an oxygen saturation, a pulse rate, a perfusion index, an electrocardiogram, a photoplethysmogram, and/or any other such health data.” [via 9to5Mac]
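
The core technique behind this kind of reflected-light measurement is photoplethysmography (the “photoplethysmogram” in the patent’s list): blood absorbs light, and the volume of blood in the skin rises and falls with each heartbeat, so the brightness of the light bouncing back oscillates at the pulse rate. As a rough illustration of the idea (not Apple’s implementation; the function, sample rate, and thresholds below are all hypothetical), here’s a minimal Swift sketch that recovers a pulse rate from a series of brightness samples by counting peaks:

```swift
import Foundation

// All names and numbers here are hypothetical illustrations, not from the patent.

/// Estimates pulse rate in beats per minute from evenly spaced
/// reflected-light brightness samples by counting peaks in the signal.
func estimatePulseRate(samples: [Double], sampleRate: Double) -> Double {
    guard samples.count > 2, sampleRate > 0 else { return 0 }

    // Center the signal: subtract the mean brightness so only the
    // oscillation caused by changing blood volume remains.
    let mean = samples.reduce(0, +) / Double(samples.count)
    let centered = samples.map { $0 - mean }
    let threshold = 0.5 * (centered.max() ?? 0)

    // Count local maxima above half the peak amplitude, enforcing a
    // 0.25 s refractory gap (a 240 bpm ceiling) to ignore noise wiggles.
    let minGap = Int(sampleRate * 0.25)
    var peaks = 0
    var lastPeak = -minGap
    for i in 1..<(centered.count - 1) where
        centered[i] > centered[i - 1] &&
        centered[i] >= centered[i + 1] &&
        centered[i] > threshold &&
        i - lastPeak >= minGap {
        peaks += 1
        lastPeak = i
    }

    // One peak per heartbeat; scale the count to beats per minute.
    let durationSeconds = Double(samples.count) / sampleRate
    return Double(peaks) / durationSeconds * 60
}

// Synthetic test: a 1.2 Hz (72 bpm) pulse sampled at 30 Hz for 10 seconds,
// with a little random noise standing in for sensor jitter.
let sampleRate = 30.0
let samples = (0..<300).map { i in
    sin(2 * .pi * 1.2 * Double(i) / sampleRate) + 0.05 * Double.random(in: -1...1)
}
print("Estimated pulse: \(Int(estimatePulseRate(samples: samples, sampleRate: sampleRate))) bpm")
```

A real pipeline would add band-pass filtering and motion rejection on top of this, but the peak-counting core is the same idea the Apple Watch already relies on.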