Kristen Whitney, DO, FAAD, a dermatologist at the Allegheny Health Network, discusses advancements in dermoscopy, predicting that emerging technologies and artificial intelligence (AI) will improve dermatologists' efficiency in monitoring and diagnosing skin conditions.
In an interview with The American Journal of Managed Care® (AJMC®), Kristen Whitney, DO, FAAD, a dermatologist at the Allegheny Health Network, discussed the evolution of dermoscopy and predicted how dermatologists will use it moving forward.
Dermoscopy, or dermatoscopy, involves using a handheld light to examine cutaneous lesions before skin cancer surgery.1 It is considered a relatively inexpensive, readily available tool that can improve outcomes and quality of life in patients who need surgical excision of nonmelanoma skin cancers.
This transcript has been lightly edited.
AJMC: What do you see as the future of dermoscopy in the dermatology field? Are there any emerging technologies or techniques that you are excited about?
Whitney: Dermoscopy is great because it's basically a handheld microscope that dermatologists use to magnify different things on the skin. So, we use it primarily in our skin checks to see if something's benign, malignant, or atypical.
It started to really take shape in the 1990s, but it didn't really pick up steam until the early 2000s. It's great technology to have; it helps us catch skin cancers much earlier, including things that might not look like much to the naked eye. When you magnify, you get a better understanding of what you're looking at. I've caught skin cancers that were only one and a half millimeters in diameter with this scope. So, it's definitely exciting because you can catch things so much earlier.
Now, that being said, training the human eye can take dermatologists 5 to 7 years of extensive practice, and, of course, it starts in our residency. Most residencies now are teaching this technology, but it takes a while to really get good at it. I would say most dermatologists do use it today, and it's taught in all dermatology residencies, which wasn't the case 10 to 15 years ago.
Moving forward, it's a great technology. For example, in patients covered in atypical moles, it can be very cumbersome to watch closely for subtle changes, so we used to take photographs with our dermatoscope. They even make attachments for your clinic iPad or iPhone that sync up to the patient's chart. That was newer technology as of 5 to 10 years ago. It was great because you have these zoomed-in photos you can use to monitor patients, but the problem is: what happens when you have a patient with 100 or 1000 spots that you want to photograph?
For myself, it was very time-consuming to photograph these patients; you would spend an hour, 2 hours, trying to photograph them. But now the cameras are getting more advanced for that part of it, and with different forms of AI you can photograph most of the body in a few minutes; you can get amazing photographs of the skin as well as zoomed-in images of individual lesions.
So, that technology is really exciting to have, and it's something that is going to make a lot of dermatologists' jobs much easier. With some advanced imaging, we even have algorithms that can characterize a mole and grade how atypical or concerning it is. That's relatively new, and that's exciting. Of course, it doesn't replace dermatologists, because you still need that trained eye to interpret the findings, but it can make our job even better and more efficient.
Reference