Significance: The rates of melanoma and nonmelanoma skin cancer are rising across the globe. Due to a shortage of board-certified dermatologists, the burden of dermal lesion screening and erythema monitoring has fallen to primary care physicians (PCPs). An adjunctive device for lesion screening and erythema monitoring would be beneficial because PCPs are not typically extensively trained in dermatological care. Aim: We aim to examine the feasibility of using a smartphone-camera-based dermascope and a USB-camera-based dermascope utilizing polarized white-light imaging (PWLI) and polarized multispectral imaging (PMSI) to map dermal chromophores and erythema. Approach: Two dermascopes integrating LED-based PWLI and PMSI with both a smartphone-based camera and a USB-connected camera were developed to capture images of dermal lesions and erythema. Image processing algorithms were implemented to provide chromophore concentrations and redness measures. Results: PWLI images were successfully converted to an alternate colorspace for erythema measures, and the spectral bandwidth of the PMSI LED illumination was sufficient for mapping of deoxyhemoglobin, oxyhemoglobin, and melanin chromophores. Both types of dermascopes were able to achieve similar relative concentration results. Conclusion: Chromophore mapping and erythema monitoring are feasible with PWLI and PMSI using LED illumination and smartphone-based cameras. These systems can provide a simpler, more portable geometry and reduce device costs compared with interference-filter-based or spectrometer-based clinical-grade systems. Future research should include a rigorous clinical trial to collect longitudinal data and a large enough dataset to train and implement a machine-learning-based image classifier.
1. Introduction

The rates of melanoma and nonmelanoma skin cancers (NMSC) have been steadily rising,1,2 and early diagnosis is key for improved outcomes.3 Because there is a shortage of board-certified dermatologists,4,5 particularly in remote or underserved settings where <10% of dermatologists practice,6 most of the burden of diagnosis and treatment falls on primary care physicians (PCPs), who are not extensively trained in dermatological care.3,7 Dermoscopy improves the in vivo diagnostic accuracy of distinguishing benign from malignant lesions, but it is a skill that requires additional training, even among board-certified dermatologists. In remote settings, dermascopes can capture and document pigmented lesions that are then forwarded to expert colleagues through telemedicine for further analysis.8 Unfortunately, dermascopes and their accessories range from hundreds to thousands of dollars,9,10 which is potentially too expensive for general medical practice. Thus, there is a need for a low-cost, readily available dermoscopy tool to fill this clinical need.
Lesion evaluation using visual, subjective methods such as the ABCDE criteria and the seven-point checklist provides useful tools for PCPs.3,11 The ABCDE criteria predict melanoma from a lesion's asymmetry, border irregularity, coloration, diameter, and evolution, providing a sensitivity of 0.85 and a specificity of 0.72.3,11 The seven-point checklist monitors a lesion's change in size, shape, and color and looks for large diameters, crusting or bleeding, and sensory change, providing a sensitivity of 0.77 and a specificity of 0.80.3 Continuous monitoring has been shown to improve outcomes through early detection, as evidenced by mole-mapping techniques12,13 and by the increase in sensitivity and specificity with the addition of the evolution criterion to the ABCDE criteria.11 Adjunctive tools utilizing objective measures such as polarized multispectral imaging (PMSI) and polarized white-light imaging (PWLI) to map dermal chromophores [oxyhemoglobin (HbO2), deoxyhemoglobin (Hb), and melanin], quantify erythema, and perform image classification for lesion screening have the potential to increase early detection of melanoma by PCPs, even outside the physician's office, leading to a reduced need for biopsy and improved outcomes.14–27 We propose a smartphone combined with LED illumination as the ideal platform for an adjunctive medical device, providing a portable system with easy-to-operate apps and native image capture, processing, and data transmission. These systems can reduce the costs associated with interference-filter-based14,15,20 or spectrometer-based21,23 systems while also providing a more compact, portable geometry for use in any testing environment compared with clinical-grade imaging systems.17–19

2. Materials

We have developed two point-of-care dermascope design concepts for skin lesion screening and erythema monitoring, implementing both PMSI and PWLI28 on an LG G5 (LG, Seoul, South Korea) smartphone platform.
One system concept utilizes the embedded smartphone camera for imaging, while the other uses a USB-connected camera module. Both systems share a common illumination system and software application to enable PWLI and PMSI. The PMSI and PWLI dermascope using the smartphone's embedded rear camera is shown in Figs. 1(c)–1(e). The main LG G5 camera consists of a Sony IMX234 Exmor RS sensor paired with a 4.42-mm focal length lens. To decrease the working distance of the optical system and allow imaging of the epidermis, a 24-mm focal length achromatic doublet (Ross Optical, El Paso, Texas, USA) is placed 4 mm away from the principal plane of the smartphone optical system. The cropped field of view (FOV) is shown in Fig. 2. The imaging achromat is aligned to the smartphone camera using a machined PMMA disk installed in a removable 3D-printed annulus of VeroBlue RGD840 (Stratasys, Eden Prairie, Minnesota, USA) plastic. The annulus serves as an imaging guide; its length equals the optical system working distance (23 mm), so the PCP can rest the device against the patient to stabilize it and ensure correct focus. An additional 3D-printed structure serves as a mounting platform for the smartphone, imaging annulus, and LED electronics. The alternative PMSI and PWLI dermascope [Figs. 1(a) and 1(b)] is also based on an LG smartphone platform, but it utilizes an external USB-connected RGB camera (OV5648, Omnivision, Santa Clara, California, USA; 5 MP) with the vendor-supplied lens adjusted to a working distance of 30 mm; its integrated infrared (IR) filter was removed. Again, the mechanical design of the annulus is matched to the working distance of the camera, providing in-focus imaging when the device contacts the patient.
For both systems, multispectral illumination is provided by a custom printed circuit board (PCB) with LEDs of various wavelengths (Lumileds, Amsterdam, The Netherlands; Vishay, Malvern, Pennsylvania, USA), as listed in Table 1. The wavelengths were chosen based on commercial availability and the ability to probe both hemoglobin isosbestic points and regions that separate oxygenated from deoxygenated hemoglobin content along the molar attenuation curves (Fig. 3).

Table 1: LED and camera settings for each illumination wavelength and each dermascope. Listed are the LED wavelength (λ) and part number; the LED-driving current (I), single-LED flux, ISO setting, and exposure time for the smartphone camera; and the LED-driving current (I), single-LED flux, brightness setting, and exposure time for the USB camera.
For the smartphone-based dermascope, the PMMA disk used for mounting the lens also extends over the illumination LEDs to provide mounting for a linear polarizer (Edmund Optics, Barrington, New Jersey, USA). An orthogonal linear polarizer is installed in front of the imaging channel, enabling both PMSI and PWLI and reducing the effect of specular reflection on the images.28 The LED sources' spectral fluxes, shown in Fig. 3, were measured with a spectrometer (Ocean Optics). The USB-camera-based dermascope uses the same LED PCB and wavelengths for illumination, along with orthogonal polarizers in the illumination channel (Edmund Optics) and the imaging channel (Moxtek, Orem, Utah, USA). To help normalize white-light image luminance, an 18% gray color reference (Kodak, Rochester, New York, USA) is permanently installed on both sides of the image FOV. Because the 3D-printed mounting foundation does not need to mount the LED board and imaging annulus, a previously designed geometry is used for this system.29 The illumination PCB consists of three LEDs of each color soldered in a symmetrical pattern around the camera aperture to maximize uniformity without additional beam-shaping optics. The backside solder mask of the PCB was removed to expose the copper, which is attached to a copper heatsink with electrically insulating epoxy (DP240, 3M, St. Paul, Minnesota, USA). Numerous vias were placed on the PCB to ensure a low thermal resistance between the front and backside copper planes. The LEDs are driven with a switching boost power supply (LT3478, Linear Technology, Milpitas, California, USA) powered by two lithium-ion batteries (Orbtronic, Saint Petersburg, Florida, USA). Each LED color string can be turned on individually with a custom power level setting, and illumination and image capture are synchronized by a custom Android application through a Bluetooth-connected microcontroller (MCU, IOIO-OTG, SparkFun Electronics, Niwot, Colorado, USA).
The LED-driving currents, fluxes, and image capture settings for both dermascopes are shown in Table 1. The smartphone camera uses the daylight white balance setting; the white balance setting of the USB camera is inaccessible. A block diagram of the system electronics is shown in Fig. 4.30 The Android application controls the camera functions, synchronizes the LED illumination, and sets the camera exposure time. For the USB camera, the Android app was modified to use the USB camera instead of the on-board smartphone camera. Images are associated with an ID assigned to each patient, keeping identifiable information off the smartphone. Screenshots of the app are shown in Fig. 4.

3. Methods

3.1. Data Processing

The algorithms used to process collected dermal images are provided in Algorithms 1 and 2. Descriptions of the steps and related equations are provided in the following sections.

Algorithm 1: Processing of reference images.
Algorithm 2: Processing of dermal images.
3.1.1. Image collection

When the dermascopes were first built, images of an 18% reflective gray card were collected by each system at each wavelength to serve as both the optical density (OD) and illumination uniformity references. For dermal image collection, a pilot study was performed on human subjects at the University of Arizona College of Medicine, Division of Dermatology to determine the feasibility of each multispectral dermascope. This study received institutional review board approval (#1612067061). All patients provided informed written and oral consent.

3.1.2. Colorspace conversions

The melanin content, erythema, and chromophore concentration measurements rely on conversion to the CIELAB and CIEXYZ colorspaces. The imaging systems natively capture in the sRGB colorspace, and each channel C of the image is first converted to linear RGB:31

C_linear = C / 12.92 for C <= 0.04045; C_linear = [(C + 0.055) / 1.055]^2.4 otherwise.

Images are then converted from linear RGB to CIEXYZ using the transformation matrix31

[X, Y, Z]^T = [[0.4124, 0.3576, 0.1805], [0.2126, 0.7152, 0.0722], [0.0193, 0.1192, 0.9505]] [R, G, B]^T,

where Y is the luminance value and is used to calculate ODs from the dermis images and reference. Luminance is a measure that scales optical radiation by the response of the human visual system.32 Because the images will be processed by a computer, accurate color representation for a human is not required, so an additional luminance measure, Yequal, is created using the equal sum of all three channels:

Yequal = R + G + B.

3.1.3. Reference and illumination uniformity correction

Using the reference images that have been converted to CIEXYZ or Yequal, reference luminance images Yref are defined as the Y (luminance) channel of the CIEXYZ image or as Yequal. The reference grayscale image is normalized by its mean to serve as the illumination reference for the dermal images:

K = Yref / mean(Yref),

where K is now the illumination uniformity correction matrix. The dermal CIEXYZ and Yequal images are corrected in the same way:

Ycorr = Ydermal / K,

where Ycorr is the illumination-uniformity-corrected dermal image with constant mean luminance.
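The colorspace conversion and uniformity correction above can be sketched in NumPy as follows (a minimal sketch; the function names and the float-image, channels-last array layout are our assumptions, not the published implementation):

```python
import numpy as np

def srgb_to_linear(img):
    """Invert the sRGB transfer function (IEC 61966-2-1) channel-wise.
    img: float array in [0, 1], e.g., shape (H, W, 3)."""
    return np.where(img <= 0.04045,
                    img / 12.92,
                    ((img + 0.055) / 1.055) ** 2.4)

def y_equal(linear_img):
    """Equal-weight luminance: the sum of the three linear channels."""
    return linear_img.sum(axis=-1)

def uniformity_correction(reference_y, dermal_y):
    """Normalize the gray-card reference by its mean to form the
    correction matrix K, then divide the dermal luminance by K."""
    k = reference_y / reference_y.mean()
    return dermal_y / k
```

Dividing by K flattens the illumination falloff while leaving the mean luminance of the dermal image unchanged.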
Finally, OD dermal images are calculated as

OD = -log10(Ycorr / Yref).

The USB dermascope also has sections of an 18% gray photography card mounted on either side of the FOV [Fig. 1(b)]. Knowing that the card image should equal 50% levels of RGB, the luminance of the white-light image is scaled so that the mean level within the card regions equals 0.5.

3.1.4. Chromophore concentration

The Beer–Lambert law is utilized to measure the relative concentrations of Hb, oxyhemoglobin (HbO2), and melanin:17,22,33–35

I = I0 × 10^(-ε c l),

where I is the resulting intensity, I0 is the incident intensity, c is the concentration of the chromophore, ε is the molar attenuation coefficient of the chromophore at a particular wavelength, and l is the optical path length of the light in the medium for the incident wavelength. This is restated as OD:

OD = -log10(I / I0) = ε c l + G,

where G is due to residual absorption from molecules present in the epidermis and dermis. The molar extinction coefficients for Hb and HbO2 36 and melanin37 are shown in Fig. 3. Jacques's melanin data37 was fit with an exponential curve to extend the wavelength range to 1000 nm. Optical path lengths, l(λ), for the chromophores are calculated from a linear fit of Anderson's data38 in the region of the illumination wavelengths, where λ is in units of nm and l is in units of cm. Because the LEDs are broad spectrum, we integrate over the wavelength probability density function p(λ) of each LED's spectral flux to calculate a total molar attenuation coefficient39,40 for each color:

ε_total = ∫ ε(λ) p(λ) dλ.

The resulting molar attenuation coefficients for all of the chromophores are shown in Table 2.

Table 2: Molar extinction coefficients calculated using Eq. (13) for each illumination wavelength compared with the molar extinction coefficients for the peak wavelength.
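The OD calculation and the LED-bandwidth-weighted attenuation coefficient can be sketched as below (the trapezoidal helper and all names are ours; sampled ε(λ) and flux curves are assumed as inputs):

```python
import numpy as np

def optical_density(dermal_y, reference_y):
    """Per-pixel OD relative to the gray-card reference: OD = -log10(I/I0)."""
    return -np.log10(dermal_y / reference_y)

def _trapz(y, x):
    """Trapezoidal integration over samples y(x)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def effective_epsilon(wavelengths_nm, led_flux, epsilon):
    """Weight a chromophore's molar attenuation curve by the LED's
    normalized spectral flux (a probability density over wavelength)
    and integrate, yielding one effective coefficient per LED color."""
    pdf = led_flux / _trapz(led_flux, wavelengths_nm)
    return _trapz(epsilon * pdf, wavelengths_nm)
```

Because the flux is normalized to unit area, a spectrally flat ε(λ) is returned unchanged, and a narrowband LED recovers the peak-wavelength coefficient.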
A system of equations is built from the multispectral datacube and the molar attenuation coefficients shown in Table 2; for each pixel,

OD_λ = ε_Hb,λ c_Hb l_λ + ε_HbO2,λ c_HbO2 l_λ + ε_mel,λ c_mel l_λ + G,

and the system is solved by linear-algebra least-squares techniques,33 where the OD_λ are calculated OD matrices for each illumination wavelength. The ability of the dermascopes to properly measure relative chromophore concentrations was validated using a finger occlusion test. Images were taken with both dermascopes and the chromophores mapped preocclusion, after 2 min of occlusion, postocclusion, and 5 min after ending the occlusion.41

3.1.5. Melanin and erythema

To measure melanin content and erythema, the white-light image is converted to the CIELAB42 colorspace, using lightness (L*) as a measure of relative melanin content and the red color stimulus direction (a*) as a measure of redness, with more positive values indicating higher levels of erythema.43 Before converting to CIELAB, normalization constants must be calculated from the white-LED spectral content. Using the color matching functions44 x̄(λ), ȳ(λ), and z̄(λ) (Fig. 5), X, Y, and Z are calculated as42

X = ∫ x̄(λ) Φ(λ) dλ, Y = ∫ ȳ(λ) Φ(λ) dλ, Z = ∫ z̄(λ) Φ(λ) dλ,

where Φ(λ) is the relative spectral flux of the white-LED source as shown in Fig. 3. The normalization constants Xn, Yn, and Zn are obtained by scaling these integrals so that Yn = 100. The image is then converted to CIELAB by42

L* = 116 f(Y/Yn) − 16,
a* = 500 [f(X/Xn) − f(Y/Yn)],
b* = 200 [f(Y/Yn) − f(Z/Zn)],

where f(t) = t^(1/3) for t > (6/29)^3 and f(t) = t / [3 (6/29)^2] + 4/29 otherwise.

In addition to the white-light image measures, melanin and erythema measures are constructed from the color-OD images. Melanin content16,45 is calculated from the ODs at two wavelengths that, as shown in Table 2, maximize the difference in melanin absorption and minimize the effect of Hb and HbO2 absorption. Erythema, due to increased blood content, results in increased blue-light absorption but little change in red-light absorption,46 as shown in Table 2.
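The per-pixel least-squares unmixing can be sketched as follows (a minimal sketch; the array shapes and names are our assumptions, not the published code):

```python
import numpy as np

def unmix_chromophores(od_stack, eps, path_lengths):
    """Solve OD_lambda = sum_i eps_i(lambda) c_i l(lambda) + G per pixel.

    od_stack:     (N, H, W) OD image for each of N illumination colors
    eps:          (N, 3) effective attenuation coefficients for
                  [Hb, HbO2, melanin] at each color
    path_lengths: (N,) optical path length l for each color
    Returns (3, H, W) relative concentrations and an (H, W) background G.
    """
    n, h, w = od_stack.shape
    # Design matrix: chromophore columns scaled by path length, plus a
    # constant column for the residual background absorption G.
    a = np.hstack([eps * path_lengths[:, None], np.ones((n, 1))])
    b = od_stack.reshape(n, -1)                  # one column per pixel
    x, *_ = np.linalg.lstsq(a, b, rcond=None)    # solve all pixels at once
    return x[:3].reshape(3, h, w), x[3].reshape(h, w)
```

Stacking every pixel as a right-hand-side column lets one `lstsq` call unmix the whole image at once rather than looping per pixel.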
Therefore, an erythema index is constructed from the difference between the blue- and red-wavelength OD images.

3.2. Optical System Characterization

The linearity of the camera responses was measured by adjusting the exposure time (smartphone-camera-based dermascope) or the image brightness (USB-camera-based dermascope), capturing images of the matte 18% gray photography card with each LED color, and measuring the mean image luminance at each wavelength. The imaging systems' cutoff frequency and FOV were validated with a 1951 United States Air Force (USAF) resolution test chart, and the modulation transfer function (MTF) was measured using the slanted-edge method.47 Illumination uniformity was measured by illuminating the matte 18% gray photography card with each LED color and imaging the surface with the dermascope. The uniformity is quantified using the coefficient of variation48 on normalized data,

c_v = σ / μ,

where μ is the mean of the pixels in the image and σ is the standard deviation of the pixel values.

4. Results

4.1. Clinical Results

Following are the RGB, chromophore, melanin, and erythema measures for cases of junctional nevus (JN) (Fig. 6) and squamous cell carcinoma (SCC) (Fig. 7); each case was captured with both the USB-camera dermascope and the smartphone-camera dermascope. The chromophore maps for both dermascopes at the chosen time points for the occlusion test are shown in Fig. 8.

4.2. Optical System Performance

Figure 9 shows the changes in the mean of the sum of the red, green, and blue image channels over varying exposure times for the smartphone-based camera and over brightness settings for the USB camera. Figure 10 shows full-field and zoomed 1951 USAF resolution test chart images after cropping, along with measured MTF data using the slanted-edge test for both dermascopes. Maps of the illumination uniformity of both systems are shown in Fig. 11, and the coefficients of variation are given in Table 3.

Table 3: LED illumination uniformity according to Eq. (21).
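The coefficient-of-variation uniformity metric of Eq. (21) reduces to a few lines (a sketch; the function name is ours):

```python
import numpy as np

def coefficient_of_variation(flat_field):
    """sigma / mu of the mean-normalized flat-field (gray card) image."""
    img = flat_field / flat_field.mean()
    return float(img.std() / img.mean())
```

A perfectly uniform field gives c_v = 0; larger values indicate stronger illumination falloff toward the edges of the FOV.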
4.3. CIEXYZ Normalization

The CIEXYZ normalization constants calculated from the white-LED spectrum for the two dermascopes are shown in Table 4.

Table 4: Measured CIEXYZ normalization constants for both dermascopes.
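Given the measured normalization constants, the CIEXYZ-to-CIELAB conversion of Sec. 3.1.5 can be sketched as below (standard CIELAB equations; the function name and scalar interface are our assumptions):

```python
import numpy as np

def xyz_to_lab(x, y, z, xn, yn, zn):
    """CIEXYZ -> CIELAB using white-point constants (xn, yn, zn),
    here taken from the measured white-LED spectrum (Table 4)."""
    delta = 6.0 / 29.0

    def f(t):
        # Cube root above the linear-segment threshold, linear below it.
        return np.where(t > delta ** 3,
                        np.cbrt(t),
                        t / (3.0 * delta ** 2) + 4.0 / 29.0)

    l_star = 116.0 * f(y / yn) - 16.0
    a_star = 500.0 * (f(x / xn) - f(y / yn))
    b_star = 200.0 * (f(y / yn) - f(z / zn))
    return l_star, a_star, b_star
```

Using the white-LED spectrum as the white point makes L* and a* comparable across images captured by the same dermascope.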
5. Discussion

The distribution of polarized multispectral dermascopes based on smartphone platforms and low-cost color LEDs to PCPs (and eventually to consumers) has the potential to democratize dermal chromophore mapping and melanoma screening along with erythema monitoring, improving quantitative monitoring of lesions and increasing early detection of skin cancers. This platform demonstrates a number of advantages compared with previous systems targeting chromophore mapping and skin cancer screening.14,15,17,18,20,22–24 The smartphone platform is a compact, low-cost, portable, easy-to-use system with native image capture and processing capabilities, which removes the need for expensive, clinical-grade imaging systems.17–19 The platform is flexible enough to use either the embedded camera or a separate USB-connected camera for imaging, depending on the desired ergonomics. Both implementations also retain wide-field, white-light dermal imaging with the built-in smartphone camera: the smartphone-camera-based dermascope with its imaging annulus removed [Fig. 1(c)], or the USB camera's host smartphone directly. The use of low-cost, compact, high-power, high-efficacy, surface-mount LEDs reduces the costs and complexities associated with laser-based,22,24 interference-filter-based,14,15,20 and spectrometer-based21,23 systems. While those systems likely allow better discrimination due to their narrow-bandwidth sources or detection schemes, the costs involved (with the possible exception of the laser-based systems) are prohibitive. High-reliability LEDs are available in myriad wavelengths to probe various points along the chromophore molar attenuation curves (Fig. 3) and can be powered with simple driving circuits.
Surface-mount packages remove the bulk of the transistor-outline can packages (or larger) necessary for edge-emitting lasers, and the available wavelength selection is wider than that of surface-mount laser packages such as vertical-cavity surface-emitting lasers. The low cost of LED sources compared with laser sources or interference filters allows the use of multiple wavelengths in a single system while keeping bill of materials (BOM) costs low.

5.1. Clinical Testing

Initial testing of the systems is promising, as both systems were able to capture full image datasets and return similar relative chromophore concentration results across multiple dermal lesions, except for Hb in the JN case, as shown in Fig. 6. The deviation could be explained by the difference in IR imaging performance between the two dermascopes. In addition, relative melanin content and erythema as measured through the CIELAB white-light images and OD color images agreed between systems and are reasonable based on visual examination. The USB camera and smartphone camera have differing levels of luminance in their white-light images, as seen in Figs. 6 and 7, leading to a difference in baseline lightness and redness values, where the higher-luminance smartphone images show higher overall L* and a* values. However, as seen in Fig. 6, the relative changes in L* and a* between the nevus and the surrounding skin are similar for both systems. The occlusion test (Fig. 8) provided directionally correct results for both dermascopes, although the magnitudes of change in chromophore concentration differed between dermascopes. Again, this deviation could be explained by the difference in IR imaging performance between the two dermascopes. With the next system revision, the ability to measure absolute concentrations should be confirmed with known blood and melanin phantoms.
To fully validate the system, a full clinical trial collecting longitudinal data with multiple types of skin lesions, and testing patients with a wide range of baseline melanin levels, will be necessary.49 Once a large dataset is collected along with biopsy and diagnosis results, classification algorithms can be built using machine learning, principal components analysis, or similar tools.25–27,50 The statistics of the large dataset and the classifier can then be used to predict the relationships between chromophores, lesion type, and diagnosis. In our two datasets, high melanin concentrations were present for the JN case but not for the SCC case. The classifier will help determine whether this relationship holds more generally and how it might change in patients with high baseline levels of melanin. Likewise, while the Hb and HbO2 levels were similar in our two datasets, a larger dataset might reveal that cancerous activity increases blood flow,51 increasing both Hb and HbO2 and possibly the ratios between them. The classifier could use additional features and relationships in the images. For example, by Eq. (12), the optical path length increases with wavelength, increasing the probe depth. Detecting lesion shape changes over depth through edge detection or similar means could provide another layer of information. Hints of these changes are apparent in both the JN and SCC cases, as both have edges that change with wavelength. Likewise, the classifier could potentially use additional measures such as blood contrasts16 and oxygenation percentages.52

5.2. Measured Optical Performance

Both cameras produced approximately linear responses when changing exposure time (smartphone camera dermascope) or brightness (USB camera dermascope), providing confidence in the ability of the systems to respond linearly to intensity changes from illumination absorption.
For the smartphone dermascope, the measured MTF performance matched both the predicted diffraction-limited performance and the cutoff frequency measured with the USAF target, with a low root mean square error (RMSE) between the measured MTF and the diffraction-limited prediction. The USB dermascope's measured MTF performance did not match the predicted diffraction-limited performance; however, full specifications of the imaging lens are not provided by the manufacturer, precluding a more accurate estimation of the true diffraction-limited performance. The lens's NA was estimated to be 0.004 based on the slanted-edge measurement, and the measured MTF cutoff frequency matched the USAF target measurement. As shown in the dermal images, both dermascopes demonstrated sufficient image quality for most reasonably sized lesions. Illumination uniformity was greater than 85% for all wavelengths with both dermascopes and was easily corrected in the image processing algorithms.

5.3. Next Steps

A number of improvements could be made to the systems before conducting a large-scale clinical trial. Currently, the system processing does not incorporate color-to-color spatial image registration. The effects of this are most readily seen in Fig. 7, where the border markings do not completely overlap. Image capture of a full dataset takes about 20 s. Increasing the capture speed would reduce the likelihood of image blur between images, easing the need for color-to-color image registration, and would also increase patient comfort. If the image capture speed cannot be increased, having the clinician deliberately add the markings would likely improve registration because they provide high-contrast, well-defined features to extract.
The USB dermascope could benefit from an improved lens design. Future systems could take advantage of smartphones with two rear cameras and add stereoscopic 3D imaging to provide a topography of the skin lesion. Alternatively, the dual cameras could provide two FOVs or two NAs for imaging flexibility. Additional illumination optics, such as diffusers,24 could increase illumination uniformity. The LED board was originally designed to take advantage of the dual cameras of the LG G5, but reducing the center aperture of the LED board could increase illumination uniformity and reduce system size. LED wavelengths could also be better tailored to the task or expanded into UV wavelengths to probe potential autofluorescence signatures. Finally, to determine the effect of the IR filter on the mapping performance, an additional dermascope should be built and tested with the USB camera's IR filter left in place.

6. Conclusion

Two geometries of smartphone-based dermascopes for dermal lesion screening and erythema monitoring using PMSI and PWLI are described. These devices augment the capabilities of PCPs, with the potential for earlier detection of melanoma and NMSC along with quantitative monitoring of erythema. The combination of LED sources, 3D printing, and smartphone-based imaging enables the creation of low-cost (a low high-volume BOM cost, excluding the smartphone, should be easily achievable), feature-rich, easy-to-use medical imaging devices using either the smartphone camera or a USB camera. While initial results are promising, a longitudinal clinical trial with histopathology gold standards will be necessary to validate the diagnostic performance of the devices across multiple lesion and skin types.

7. Appendix A: Processing Algorithms Visualized

Figures 12–14 provide flowcharts to help visualize the processing algorithms for the reference images, white-light images, and color images provided in Algorithms 1 and 2.
8. Appendix B: Finger Occlusion

While the Hb and HbO2 chromophore levels should change during the finger occlusion test as shown in Fig. 8, the melanin and background measures should remain constant. Figure 15 shows these additional measures during the occlusion test; the melanin measure has been divided by 100 and the background measure by 10,000 for easier comparison of changes between measures. The USB camera's melanin and background measurements are more stable over the time points than the smartphone camera's.

Acknowledgments

The authors would like to thank Oliver Spires for assistance in manufacturing the PMMA disk, Pier Morgan and the Center for Gamma-ray Imaging for use and operation of the rapid prototype printer, David Coombs for assistance with PCB assembly, and Shaobai Li for collection of USAF and slanted-edge images. We are grateful for our funding sources. This research was supported by the National Institutes of Health Biomedical Imaging and Spectroscopy Training under Grant No. T32EB000809 and the National Institutes of Health under Grant No. S10OD018061.

References

F. Bray et al.,
“Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries,”
CA Cancer J. Clin., 68
(6), 394
–424
(2018). https://doi.org/10.3322/caac.21492 CAMCAM 0007-9235 Google Scholar
R. L. Siegel et al.,
“Cancer statistics, 2019,”
CA Cancer J. Clin., 69
(1), 7
–34
(2019). https://doi.org/10.3322/caac.21551 CAMCAM 0007-9235 Google Scholar
E. Harrington et al.,
“Diagnosing malignant melanoma in ambulatory care: a systematic review of clinical prediction rules,”
BMJ Open, 7
(3), e014096
(2017). https://doi.org/10.1136/bmjopen-2016-014096 Google Scholar
A. Ehrlich et al.,
“Trends in dermatology practices and the implications for the workforce,”
J. Am. Acad. Dermatol., 77
(4), 746
–752
(2017). https://doi.org/10.1016/j.jaad.2017.06.030 JAADDB 0190-9622 Google Scholar
M. R. Sargen et al.,
“The dermatology workforce supply model: 2015–2030,”
Dermatol. Online J., 23
(9), 8
(2017). Google Scholar
J. Y. Yoo et al.,
“Trends in dermatology: geographic density of US dermatologists,”
Arch Dermatol., 146
(7), 779
–779
(2010). https://doi.org/10.1001/archdermatol.2010.127 Google Scholar
C. J. Koelink et al.,
“Skin lesions suspected of malignancy: an increasing burden on general practice,” BMC Fam. Pract., 15(1), 29 (2014). https://doi.org/10.1186/1471-2296-15-29
A. J. Gewirtzman et al., “Computerized digital dermoscopy,” J. Cosmet. Dermatol., 2(1), 14–20 (2003). https://doi.org/10.1111/j.1473-2130.2003.00009.x
“Photo dermoscopy devices,” https://dermlite.com/collections/photo-dermoscopy-devices
N. R. Abbasi et al., “Early diagnosis of cutaneous melanoma: revisiting the ABCD criteria,” JAMA, 292(22), 2771 (2004). https://doi.org/10.1001/jama.292.22.2771
M. A. Weinstock et al., “Enhancing skin self-examination with imaging: evaluation of a mole-mapping program,” J. Cutan. Med. Surg., 8(1), 5 (2004). https://doi.org/10.1177/120347540400800101
V. Chiu et al., “The use of mole-mapping diagrams to increase skin self-examination accuracy,” J. Am. Acad. Dermatol., 55(2), 245–250 (2006). https://doi.org/10.1016/j.jaad.2006.02.026
B. Farina et al., “Multispectral imaging approach in the diagnosis of cutaneous melanoma: potentiality and limits,” Phys. Med. Biol., 45(5), 1243–1254 (2000). https://doi.org/10.1088/0031-9155/45/5/312
A. Vogel et al., “Using noninvasive multispectral imaging to quantitatively assess tissue vasculature,” J. Biomed. Opt., 12(5), 051604 (2007). https://doi.org/10.1117/1.2801718
D. Kapsokalyvas et al., “Imaging of human skin lesions with the multispectral dermoscope,” Proc. SPIE, 7548, 754808 (2010). https://doi.org/10.1117/12.841790
D. Jakovels et al., “RGB imaging system for mapping and monitoring of hemoglobin distribution in skin,” Proc. SPIE, 8158, 81580R (2011). https://doi.org/10.1117/12.893789
I. Kuzmina et al., “Contact and contactless diffuse reflectance spectroscopy: potential for recovery monitoring of vascular lesions after intense pulsed light treatment,” J. Biomed. Opt., 16(4), 040505 (2011). https://doi.org/10.1117/1.3569119
I. Diebele et al., “Clinical evaluation of melanomas and common nevi by spectral imaging,” Biomed. Opt. Express, 3(3), 467 (2012). https://doi.org/10.1364/BOE.3.000467
F. Wang et al., “High-contrast subcutaneous vein detection and localization using multispectral imaging,” J. Biomed. Opt., 18(5), 050504 (2013). https://doi.org/10.1117/1.JBO.18.5.050504
F. Vasefi et al., “Polarization-sensitive hyperspectral imaging in vivo: a multimode dermoscope for skin analysis,” Sci. Rep., 4(1), 4924 (2015). https://doi.org/10.1038/srep04924
J. Spigulis et al., “Snapshot RGB mapping of skin melanin and hemoglobin,” J. Biomed. Opt., 20(5), 050503 (2015). https://doi.org/10.1117/1.JBO.20.5.050503
F. Vasefi et al., “Multimode optical dermoscopy (SkinSpect) analysis for skin with melanocytic nevus,” Proc. SPIE, 9711, 971110 (2016). https://doi.org/10.1117/12.2214288
J. Spigulis et al., “Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination,” J. Biomed. Opt., 22(9), 091508 (2017). https://doi.org/10.1117/1.JBO.22.9.091508
M. Elbaum et al., “Automatic differentiation of melanoma from melanocytic nevi with multispectral digital dermoscopy: a feasibility study,” J. Am. Acad. Dermatol., 44(2), 207–218 (2001). https://doi.org/10.1067/mjd.2001.110395
G. Lu et al., “Medical hyperspectral imaging: a review,” J. Biomed. Opt., 19(1), 010901 (2014). https://doi.org/10.1117/1.JBO.19.1.010901
B. Song et al., “Automatic classification of dual-modality, smartphone-based oral dysplasia and malignancy images using deep learning,” Biomed. Opt. Express, 9(11), 5318–5329 (2018). https://doi.org/10.1364/BOE.9.005318
R. R. Anderson, “Polarized light examination and photography of the skin,” Arch. Dermatol., 127(7), 1000–1005 (1991). https://doi.org/10.1001/archderm.1991.01680060074007
R. D. Uthoff et al., “Small form factor, flexible, dual-modality handheld probe for smartphone-based, point-of-care oral and oropharyngeal cancer screening,” J. Biomed. Opt., 24(10), 106003 (2019). https://doi.org/10.1117/1.JBO.24.10.106003
R. D. Uthoff, “Rossuthoff/multichannel_led_driver: KiCAD design files,” (2018).
International Electrotechnical Commission, “Multimedia systems and equipment—colour measurement and management—Part 2-1: colour management—default RGB colour space—sRGB,” (1999).
B. G. Grant, Field Guide to Radiometry, SPIE Press, Bellingham, Washington (2011).
I. Kuzmina et al., “Towards noncontact skin melanoma selection by multispectral imaging analysis,” J. Biomed. Opt., 16(6), 060502 (2011). https://doi.org/10.1117/1.3584846
S. L. Jacques et al., “Rapid spectral analysis for spectral imaging,” Biomed. Opt. Express, 1(1), 157–164 (2010). https://doi.org/10.1364/BOE.1.000157
D. Jakovels et al., “2-D mapping of skin chromophores in the spectral range 500–700 nm,” J. Biophotonics, 3(3), 125–129 (2010). https://doi.org/10.1002/jbio.200910069
S. A. Prahl, “Tabulated molar extinction coefficient for hemoglobin in water,” (1998). https://omlc.org/spectra/hemoglobin/summary.html
S. L. Jacques, “Melanosome absorption coefficient,” (1998). https://omlc.org/spectra/melanin/mua.html
R. R. Anderson et al., “The optics of human skin,” J. Invest. Dermatol., 77(1), 13–19 (1981). https://doi.org/10.1111/1523-1747.ep12479191
A. E. Siegman, Lasers, University Science Books, Mill Valley, California (1986).
A. Belay et al., “Determination of integrated absorption cross-section, oscillator strength and number density of caffeine in coffee beans by the integrated absorption coefficient technique,” Int. J. Phys. Sci., 4(11), 722–728 (2009).
I. Nishidate et al., “Estimation of melanin and hemoglobin using spectral reflectance images reconstructed from a digital RGB image by the Wiener estimation method,” Sensors, 13(6), 7902–7915 (2013). https://doi.org/10.3390/s130607902
J. Schanda et al., Colorimetry: Understanding the CIE System, Wiley-Interscience, Hoboken, New Jersey (2007).
J. K. Wagner et al., “Comparing quantitative measures of erythema, pigmentation and skin response using reflectometry,” Pigment Cell Res., 15(5), 379–384 (2002). https://doi.org/10.1034/j.1600-0749.2002.02042.x
C. Wyman et al., “Simple analytic approximations to the CIE XYZ color matching functions,” J. Comput. Graph. Tech., 2(2), 11 (2013).
I. Diebele et al., “Analysis of skin basalioma and melanoma by multispectral imaging,” Proc. SPIE, 8427, 842732 (2012). https://doi.org/10.1117/12.922301
B. L. Diffey et al., “A portable instrument for quantifying erythema induced by ultraviolet radiation,” Br. J. Dermatol., 111(6), 663–672 (1984). https://doi.org/10.1111/j.1365-2133.1984.tb14149.x
P. D. Burns, “Slanted-edge MTF for digital camera and scanner analysis,” in Proc. IS&T, 135–138 (2000).
Z. Qin et al., “Analysis of condition for uniform lighting generated by array of light emitting diodes with large view angle,” Opt. Express, 18(16), 17460–17476 (2010). https://doi.org/10.1364/OE.18.017460
S. Sprigle et al., “Detection of skin erythema in darkly pigmented skin using multispectral images,” Adv. Skin Wound Care, 22(4), 172–179 (2009). https://doi.org/10.1097/01.ASW.0000305465.17553.1c
P. Rubegni et al., “Digital dermoscopy analysis and artificial neural network for the differentiation of clinically atypical pigmented skin lesions: a retrospective study,” J. Invest. Dermatol., 119(2), 471–474 (2002). https://doi.org/10.1046/j.1523-1747.2002.01835.x
M. Lupu et al., “Vascular patterns in basal cell carcinoma: dermoscopic, confocal and histopathological perspectives,” Oncol. Lett., 17(5), 4112–4125 (2019). https://doi.org/10.3892/ol.2019.10070
A. Cysewska-Sobusiak, “Noninvasive monitoring of arterial blood oxygenation with spectrophotometric technique,” Proc. SPIE, 1711, 311–323 (1993). https://doi.org/10.1117/12.155671
Biography

Ross D. Uthoff received his BS degree in optical engineering from the Rose-Hulman Institute of Technology and his PhD from the James C. Wyant College of Optical Sciences at the University of Arizona, with a focus on biomedical imaging with smartphone cameras. He is currently a lidar engineer at Lumotive, working on lidar utilizing metasurface-based beam steering. He is a member of SPIE.