Imaging technique helps surgeons remove cancer cells

Under the current standard of care for head and neck cancers, surgeons remove a margin of tissue around a tumor to make sure they’ve caught all the cancer cells. In the process, however, they can remove more healthy tissue than necessary, which can leave patients with debilitating problems eating, drinking, talking, or breathing.

To improve medical outcomes for these patients, researchers at the Illinois Institute of Technology (Chicago, Illinois, USA; www.iit.edu) work with surgical oncologists at the University of Groningen (Groningen, Netherlands; www.rug.nl). Using fluorescence-guided imaging, they want to assess tumors that have been surgically removed while patients are still in the operating room.

The goal is to use the images of the tumors to determine if surgeons need to remove more tissue from the patient during the same procedure. If the outer margins of the tumor are free of cancerous cells, there is less chance that problem cells will still be inside the patient.

If the methodology proves effective, it would be a significant change. This type of follow-up information is usually not available until after a patient is discharged and pathologists have had time to examine the tumor.

To achieve their goals, surgical oncologists specializing in head and neck cancers at the University of Groningen inject patients with a fluorescent marker before surgery. The marker binds to cancer cells, which then emit a light signal visible under certain lighting conditions.

Guided fluorescence imaging

It’s not a perfect solution, however. According to Kenneth M. Tichauer, associate professor of biomedical engineering at the Illinois Institute of Technology, fluorescent imaging uses low-energy light, which tends to scatter rather than absorb into tissue. As a result, this method is not effective in detecting small clusters of cancer cells, which typically occur at the margins of the tumor. The technique also only depicts cells up to 1 mm below the surface of the tumor, whereas surgeons would ideally like to assess cells up to 5 mm below the surface.
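To see why depth matters so much, consider a rough illustration (not from the article): near-infrared light in tissue attenuates approximately exponentially, and the fluorescence signal must survive the round trip down to the cells and back up. Assuming a hypothetical effective attenuation coefficient of 1 mm⁻¹, which is a round number for soft tissue near 800 nm and not a measured value from this work, the signal from 5 mm depth is orders of magnitude weaker than from 1 mm:

```python
import math

# Hypothetical effective attenuation coefficient for soft tissue in the
# NIR window (~800 nm); real values vary widely by tissue type.
MU_EFF = 1.0  # mm^-1

def relative_signal(depth_mm: float) -> float:
    """Round-trip attenuation: excitation light travels down `depth_mm`
    and the emitted fluorescence travels back up, so the total path
    length is 2 * depth_mm."""
    return math.exp(-MU_EFF * 2.0 * depth_mm)

for d in (1.0, 3.0, 5.0):
    print(f"depth {d:.0f} mm -> relative signal {relative_signal(d):.2e}")
```

Under these assumed numbers, the signal from 5 mm is roughly 3,000 times weaker than from 1 mm, which is one way to appreciate why extending the assessment depth is hard.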

To address these issues, Tichauer and his team developed a method that takes advantage of the scattering effect of fluorescent imaging and combines it with images taken at multiple apertures.

Why several openings? “If something is deep and you take a picture with an open aperture or a closed aperture, you actually tend to get about the same image. But if it’s close to the surface and you’re looking with an open aperture and a closed aperture, one looks different from the other,” he says.

Based on the differences in images taken with different apertures, Tichauer and his research team developed algorithms to determine how deep in the tissue the light-emitting cells are.
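The article doesn’t publish the team’s algorithms, but the intuition can be sketched with a toy model (all constants below are hypothetical, not the team’s calibration): treat the measured spot width as the combination of scatter blur, which grows with depth, and aperture blur, which depends only on the f-number. For deep sources, scatter dominates and the open- and closed-aperture images converge; for shallow sources, the aperture term matters, so the ratio of the two widths encodes depth:

```python
import math

# Toy model with assumed constants (not the team's calibration):
# measured blur = sqrt(scatter_blur^2 + aperture_blur^2)
SCATTER_PER_MM = 1.0                          # mm of blur per mm of depth
APERTURE_BLUR = {"open": 0.8, "closed": 0.1}  # mm, e.g. f/2 vs. f/16

def spot_width(depth_mm: float, aperture: str) -> float:
    """Apparent width of a fluorescent spot at a given depth."""
    scatter = SCATTER_PER_MM * depth_mm
    return math.sqrt(scatter**2 + APERTURE_BLUR[aperture]**2)

def width_ratio(depth_mm: float) -> float:
    """Open/closed width ratio: near 1 for deep sources (scatter
    dominates), noticeably larger for shallow sources."""
    return spot_width(depth_mm, "open") / spot_width(depth_mm, "closed")

def estimate_depth(ratio: float) -> float:
    """Invert the toy model for the depth producing a given ratio.
    From ratio^2 = (s^2 d^2 + a_o^2) / (s^2 d^2 + a_c^2),
    solve for d."""
    a_o, a_c = APERTURE_BLUR["open"], APERTURE_BLUR["closed"]
    d2 = (a_o**2 - ratio**2 * a_c**2) / (ratio**2 - 1.0)
    return math.sqrt(d2) / SCATTER_PER_MM
```

In this sketch, `width_ratio(0.3)` is well above 1 while `width_ratio(5.0)` is close to 1, matching Tichauer’s description, and `estimate_depth` recovers the depth by inverting the model.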

The prototype microscopy vision system starts with a 780 nm LED light source (M780LP1) from Thorlabs (Newton, NJ, USA; www.thorlabs.com). On top of the LED, the team mounted other Thorlabs products: an adjustable collimation adapter (SM1U) and a bandpass excitation NIR filter centered at 780 nm (FBH05780-10).

The camera is a PCO.panda 4.2 from PCO Imaging (Kelheim, Germany; www.pco-tech.com). The team used a high-resolution 24 mm Schneider VIS-NIR lens with an f/2–f/16 aperture range from Edmund Optics (Barrington, NJ, USA; www.edmundoptics.com). They mounted a Thorlabs emission filter (FELH0800) between the camera and the lens.

They attached this assembly to a light tube, which splits the illumination into two beams aimed at the sample at 45° angles from opposite sides. The specimen, or tumor, rests on a Thorlabs stage that can be raised or lowered to accommodate the thickness of the specimen.

The team used MATLAB from MathWorks (Natick, MA, USA; www.mathworks.com) as the primary coding environment, says Cody Rounds, a doctoral student at the Illinois Institute of Technology. They used the software to control the system, process the images, and perform calculations. “A GUI (graphical user interface) was created using MATLAB to consolidate all of this into an easy-to-use user interface for surgeons,” says Rounds.
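As a minimal sketch of the kind of processing step such an interface might wrap (hypothetical, and in Python rather than the team’s MATLAB code): take co-registered, normalized frames captured at the open and closed apertures and flag pixels where the two differ strongly, since per the multi-aperture principle a large difference indicates fluorescence close to the surface. The threshold and frame values below are illustrative only:

```python
# Hypothetical processing step: compare co-registered, normalized frames
# taken at two aperture settings and flag pixels whose intensities
# differ by more than a threshold, indicating near-surface fluorescence.

def near_surface_mask(frame_open, frame_closed, threshold=0.2):
    """Return a boolean mask: True where the normalized difference
    between the open- and closed-aperture frames exceeds `threshold`."""
    mask = []
    for row_o, row_c in zip(frame_open, frame_closed):
        mask.append([
            abs(o - c) / max(o, c, 1e-9) > threshold
            for o, c in zip(row_o, row_c)
        ])
    return mask

# Tiny synthetic example: one shallow pixel (responses differ a lot)
# and one deep pixel (aperture change barely matters).
open_frame   = [[1.00, 0.40]]
closed_frame = [[0.55, 0.38]]
print(near_surface_mask(open_frame, closed_frame))  # [[True, False]]
```

A real implementation would also handle registration, exposure normalization between apertures, and noise, which this sketch omits.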

The team tested the system on samples from six patients at the end of a clinical trial run by the University of Groningen. Tichauer says he was pleased with the results and hopes to publish the work in a peer-reviewed journal.