Remote photoplethysmography (rPPG) is attractive for tracking a subject’s physiological parameters without the need for a wearable device. However, rPPG is prone to artifacts induced by body movement, making it unreliable in realistic settings. Here we report a method to minimize these movement-induced artifacts. The method automatically selects an optimal region of interest (ROI), prunes frames in which the ROI is not clearly captured (e.g., when the subject moves out of view), and analyzes the rPPG signal in CIELab color space rather than the widely used RGB color space. We show that body movement primarily affects image intensity rather than chromaticity, so separating chromaticity from intensity in CIELab color space effectively reduces the movement-induced artifacts. We validate the method in a pilot study of 17 participants with diverse skin tones.
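The intuition behind the CIELab claim can be sketched numerically. The following is an illustrative example, not the authors' implementation: it converts a hypothetical skin-tone pixel from sRGB to CIELab using the standard D65 conversion, then simulates a movement-induced illumination drop by scaling linear RGB by 0.7. The 30% intensity change lands mostly in the lightness channel L*, while the chromaticity channels a* and b* shift by only about 11%.

```python
# Illustrative sketch (not the paper's code): sRGB -> CIELab conversion,
# showing that an intensity change perturbs chromaticity (a*, b*) much
# less than it perturbs the raw RGB channels.

def srgb_to_linear(c):
    """Invert the sRGB gamma to obtain linear-light RGB in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_rgb_to_lab(r, g, b):
    """Linear RGB (D65) -> XYZ -> CIELab (L*, a*, b*)."""
    # Standard sRGB-to-XYZ matrix.
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # Normalize by the D65 reference white.
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# Hypothetical skin-tone pixel; scale its linear RGB by 0.7 to simulate
# a movement-induced illumination drop (a 30% intensity change).
rgb = tuple(srgb_to_linear(c) for c in (0.8, 0.6, 0.5))
dimmed = tuple(0.7 * c for c in rgb)

L1, a1, b1 = linear_rgb_to_lab(*rgb)
L2, a2, b2 = linear_rgb_to_lab(*dimmed)
print(f"original: L*={L1:.1f}  a*={a1:.1f}  b*={b1:.1f}")
print(f"dimmed:   L*={L2:.1f}  a*={a2:.1f}  b*={b2:.1f}")
```

In this toy example the chromaticity coordinates scale roughly with the cube root of the intensity factor (0.7^(1/3) ≈ 0.89), so a pulse-related chromatic signal survives an intensity disturbance far better in (a*, b*) than in raw RGB; the ROI selection and frame pruning described above handle the cases this separation cannot.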