Apple’s iOS 13 touts new multi-cam capabilities, including simultaneous capture of audio, video, and photos from multiple cameras on iPads and iPhones.
A new API lets developers build apps that take advantage of this feature, streaming audio, photos, or video from the rear and front cameras at the same time. At WWDC, Apple demonstrated picture-in-picture video capture of the user via the front camera while recording with the rear lens.

Multi-cam support is limited to newer hardware, including the iPhone XR, XS, XS Max, and the latest iPad Pro. Due to hardware constraints, running multiple capture sessions at once, whether across several apps or with several cameras in separate sessions, is not currently supported.
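
On supported devices, simultaneous capture goes through the AVCaptureMultiCamSession class in AVFoundation. The sketch below is a minimal illustration, not a complete capture pipeline: it assumes the standard built-in wide-angle front and back cameras, checks for multi-cam support, and wires each camera to its own video data output with explicit connections.

```swift
import AVFoundation

// Minimal sketch of simultaneous front + back capture with AVCaptureMultiCamSession.
// Preview layers, audio, and error handling are omitted.
func makeMultiCamSession() -> AVCaptureMultiCamSession? {
    // Multi-cam capture only works on supported hardware.
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // Discover the back and front wide-angle cameras and wrap them as inputs.
    guard let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
          let frontCamera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
          let backInput = try? AVCaptureDeviceInput(device: backCamera),
          let frontInput = try? AVCaptureDeviceInput(device: frontCamera)
    else { return nil }

    // Add inputs and outputs without implicit connections, then wire them up
    // explicitly so each camera feeds its own video data output.
    let backOutput = AVCaptureVideoDataOutput()
    let frontOutput = AVCaptureVideoDataOutput()
    guard session.canAddInput(backInput), session.canAddInput(frontInput),
          session.canAddOutput(backOutput), session.canAddOutput(frontOutput)
    else { return nil }
    session.addInputWithNoConnections(backInput)
    session.addInputWithNoConnections(frontInput)
    session.addOutputWithNoConnections(backOutput)
    session.addOutputWithNoConnections(frontOutput)

    guard let backPort = backInput.ports(for: .video, sourceDeviceType: backCamera.deviceType,
                                         sourceDevicePosition: .back).first,
          let frontPort = frontInput.ports(for: .video, sourceDeviceType: frontCamera.deviceType,
                                           sourceDevicePosition: .front).first
    else { return nil }

    let backConnection = AVCaptureConnection(inputPorts: [backPort], output: backOutput)
    let frontConnection = AVCaptureConnection(inputPorts: [frontPort], output: frontOutput)
    guard session.canAddConnection(backConnection), session.canAddConnection(frontConnection)
    else { return nil }
    session.addConnection(backConnection)
    session.addConnection(frontConnection)

    return session
}
```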
Semantic Segmentation Mattes are another new feature in iOS 13. They isolate skin, hair, and teeth in a captured photo so that effects such as hair color changes or face paint can be applied to each region separately. Apple has released the API so developers can start tinkering with the technology.
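
In AVFoundation, the mattes are requested through AVCapturePhotoOutput and read back from the captured AVCapturePhoto. The sketch below shows one plausible flow under those assumptions; the photoOutput and delegate objects are assumed to already exist as part of a configured capture session.

```swift
import AVFoundation

// Ask the photo output to deliver every semantic segmentation matte type
// the current device supports (skin, hair, teeth on iOS 13).
func requestSegmentationMattes(from photoOutput: AVCapturePhotoOutput,
                               delegate: AVCapturePhotoCaptureDelegate) {
    photoOutput.enabledSemanticSegmentationMatteTypes =
        photoOutput.availableSemanticSegmentationMatteTypes

    let settings = AVCapturePhotoSettings()
    settings.enabledSemanticSegmentationMatteTypes =
        photoOutput.enabledSemanticSegmentationMatteTypes
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

// Inside an AVCapturePhotoCaptureDelegate conformance, each matte is
// retrieved by type from the finished photo.
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    for matteType in [AVSemanticSegmentationMatte.MatteType.skin, .hair, .teeth] {
        if let matte = photo.semanticSegmentationMatte(for: matteType) {
            // matte.mattingImage is a CVPixelBuffer mask that can be used to
            // recolor hair, apply face paint, and so on.
            _ = matte.mattingImage
        }
    }
}
```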