Digital Pathology’s Past to Present



Fig. 1.1
An example of how our own in-house developed viewer software can be used to annotate different features on a digital slide



As will be discussed in the following pages, this broad range of potential uses for digital pathology has led both to its rapidly expanding use across an even broader range of indications, and to an ever-growing requirement for expanded, advanced, and enhanced technology to support all these functions. This technology started with a desire to transmit (digital) images of pathology slides to distant sites, primarily for the purposes of obtaining the opinions of off-site pathologists, a process which has been called telepathology.



1.2 Beginnings and Evolution


On February 9, 2009, United States President Barack Obama said: “We have the most inefficient health care system imaginable. We’re still using paper. Nurses can’t read the prescriptions that doctors have written out. Why wouldn’t we want to put that on an electronic medical record that will reduce error rates, reduce our long-term costs of health care, and create jobs right now?” [12].

With that statement and a subsequent federal economic stimulus package called the American Recovery and Reinvestment Act (ARRA) of 2009, which President Obama signed into law on February 17, 2009, Health Information Technology (HIT) achieved a new level of prominence in the USA. But the move toward creating a system of electronic records that would allow for the paperless storage and transmission of health information started decades before this, with the development of very limited systems to transmit information between sites and centers using the primitive computer technology of the day. It progressed as computer technology progressed, leading first to networks that allowed for the sharing of information between multiple sites, and ultimately to all-encompassing health information systems capable of far more.


1.2.1 Reaching Out: Telepathology Networks


The first digital camera was not developed until 1975, and nearly two decades passed before digital cameras gained widespread commercial popularity [13]. However, when black-and-white microscopy (digital) images were transmitted from Boston’s Logan Airport to Massachusetts General Hospital in 1968 [14], it was a huge step toward recognizing one of the primary advantages of such technology: the ability to convert pathology slides into (digital) images that could then be transmitted across distances. Transmitting these microscopy images was feasible only because a rudimentary laboratory information system, called the Massachusetts General Hospital Utility Multi-Programming System (MUMPS), had been created as a collaboration between Massachusetts General Hospital and a company called Bolt, Beranek and Newman [15]. At about the same time, General Electric announced its intention to create a commercially available hospital information system through a subsidiary called MediNet. However, given perceived astronomical costs and complexities, this plan ultimately was abandoned [16]. Further attempts to create such systems failed repeatedly throughout much of the 1970s, largely because programming tools and computer technology were inadequate [17]. Among the various problems encountered was that, despite their enormous size relative to modern-day computers, systems generally lacked the computing power to handle more than a single user or interface at a given time. It was not until the 1980s that technology companies such as Intel and IBM started to rapidly enhance their ability to construct semiconductors, delivering the pace described by Moore’s Law: a doubling of available computing power roughly every one to two years.
Meanwhile, standardized, easy-to-use, increasingly powerful, and highly portable programming languages emerged, such as Pascal and C/C++; and Intel’s x86 instruction set architecture, introduced with the 8086 CPU in 1978 (which widened the processor’s registers and data path from 8 to 16 bits), rose to prominence [18]. This, in turn, led to the creation of more powerful and much more user-friendly relational database management systems, and to their eventual integration into workplaces that included clinical laboratories. Ultimately, however, it was the creation of the World Wide Web in the 1990s that led to the widespread development and use of the data transmission systems that now comprise one of the cornerstones of digital pathology, overcoming the greatest challenge faced by the first systems. Transmitting pathology data from one laboratory to a distant site was becoming not only feasible, but almost instantaneous.

Informatics has been defined as “the discipline focused on the acquisition, storage, and use of information in a specific setting or domain” [12]. Discussion about creating a digital informatics environment began as early as the late 1970s and early 1980s, but among radiologists rather than pathologists [19, 20]. It was not until the mid- to late 1980s that the term telepathology, which Weinstein et al. [21, 22] defined as “the practice of pathology over a long distance,” made its first appearance in the scientific literature. Components of the first telepathology systems included a remote-controlled light microscope attached to a high-resolution video camera; a pathologist workstation that incorporated controls for manipulating the microscope; a high-resolution video monitor; and some form of telecommunications linkage [21]. The first published report of a telepathology network linking multiple facilities was by Weinstein et al. [23], who described their international pathology network linking pathology services across four cities in Arizona and two international sites: one in China and the other in Mexico. By the mid-1990s, telepathology was gaining momentum. But it was not until the commercial availability of digital cameras and scanners, followed by their integration into medical practice and ultimate linkage to the World Wide Web, that the true potential of telepathology took a giant leap forward [24], ultimately allowing for the creation of digitally based pathology service networks far grander than anything initially proposed, or likely even conceived, by Weinstein or anyone else in the mid-1990s [14, 21–23, 25, 26].


1.2.2 Digital Pathology and Whole Slide Imaging


With its emergence in the twenty-first century, digital pathology has represented a fundamental change in the way pathological specimens are viewed. Instead of viewing glass slides or other specimens through a microscope, such specimens can now be examined through a digital monitor. The most essential element of this process is a device to digitally capture images (image digitization). To date, such devices include digital cameras and digital scanners.

Digital cameras work by recording (optical) images not on film, as classic cameras do, but on electronic image sensors. These sensors, in turn, generate electronic signals that are converted into long digital sequences (of “1”s and “0”s). By sampling, which digitizes the coordinate values, and by quantization, which digitizes the amplitude values, we obtain a digital image of the optical image. In truth, there are four principal images: (1) the optical image, which is created by the lens system; (2) the digital image, which is created by digitizing the optical image; (3) the displayed image, which is the digital image converted back into viewable form; and (4) the continuous image, which is the mathematical representation of the original scene.
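The two digitization steps above can be sketched in a few lines of code. This is a minimal illustration, not any particular camera's pipeline: a continuous intensity profile (standing in for the optical image) is sampled at discrete coordinates, and each sample's amplitude is then quantized to one of 256 gray levels, yielding an 8-bit digital image row. The profile function and all parameter values are hypothetical.

```python
import numpy as np

def digitize_row(intensity, width_mm, n_samples, n_levels=256):
    """Sample a continuous intensity profile (values in [0, 1]) at
    n_samples coordinates, then quantize each amplitude to n_levels."""
    xs = np.linspace(0.0, width_mm, n_samples)        # sampling: coordinates
    analog = intensity(xs)                            # continuous amplitudes
    digital = np.round(analog * (n_levels - 1)).astype(np.uint8)  # quantization
    return digital

# Hypothetical optical profile: a bright background with one dark "cell".
profile = lambda x: 0.8 - 0.5 * np.exp(-((x - 5.0) ** 2))
row = digitize_row(profile, width_mm=10.0, n_samples=1024)
print(row.dtype, int(row.min()), int(row.max()))
```

Coarser sampling (fewer samples) loses spatial detail; coarser quantization (fewer levels) loses tonal detail: the two steps are independent and each discards information in its own way.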

To function, each digital camera contains a built-in microcomputer, to which images are saved for later editing, viewing, and transfer to other devices. Like digital cameras, digital slide scanners contain electronic image sensors that generate electronic signals; but, unlike cameras, slide scanners capture multiple images as the sensor moves, without moving the object of interest. And whereas a camera must be mounted on top of a microscope, a scanner has its imaging optics and sensor built in, bypassing the need for a separate camera.

Low- to medium-resolution scanners were being used commercially and in clinical settings long before digital cameras ever gained popularity. In the clinical setting, their first use was as early as the 1980s. However, such use was only for single-function purposes like DNA sequencing from gel autoradiographs, measuring immunofluorescence in stained cells, quantifying immunoblots, and copying figures and other images, like slides, for scientific publications [27–31]. These early scanners lacked both the resolution and the functionality of twenty-first-century scanners. To be useful for digital pathology, both needed significant enhancement.

Digital pathology images must have sufficient resolution to visualize individual cells and their characteristics. As with digital photography, image resolution is determined by the optical components (the lens) and the image sensor: the smaller the pixel spacing in the sensor and the better the resolving power of the lens, the more finely the optical image can be captured without sampling distortion (also called pixelation or undersampling). However, achieving higher-resolution images comes with its own challenges and costs, like greater storage requirements and higher bandwidth capacity to capture, upload, and transmit digital images of adequate resolution.
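A back-of-envelope calculation makes the storage cost concrete. The numbers here are assumptions chosen only for illustration, though they are in the range commonly quoted for slide scanning: a 15 mm × 15 mm tissue area, 0.25 µm per pixel (a typical "40x" scan setting), and 3 bytes per pixel (24-bit RGB), before any compression.

```python
# Assumed, illustrative parameters for one whole slide image.
UM_PER_PIXEL = 0.25      # sampling pitch: 0.25 micrometers per pixel
TISSUE_MM = 15.0         # scanned tissue area: 15 mm x 15 mm
BYTES_PER_PIXEL = 3      # 24-bit RGB, uncompressed

pixels_per_side = int(TISSUE_MM * 1000 / UM_PER_PIXEL)   # 60,000 pixels
total_pixels = pixels_per_side ** 2                      # 3.6 billion pixels
raw_bytes = total_pixels * BYTES_PER_PIXEL

print(f"{pixels_per_side} x {pixels_per_side} pixels")
print(f"uncompressed size: {raw_bytes / 1e9:.1f} GB")    # 10.8 GB
```

Halving the pixel pitch quadruples this figure, which is why storage and bandwidth, not optics alone, constrain how finely slides are digitized in practice.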

One of the many advances in digital pathology over time has been the creation of the pathologist’s new workstation, which has been called a digital cockpit or digital dashboard [32]. It enhances each pathologist’s ability to access, visualize, interpret, and share digital pathology images, and thereby to use digital informatics systems effectively and efficiently. One major problem with dashboards, and indeed with most components of telepathology and digital pathology systems, has been their lack of standardization.

Another major accomplishment, which had its origins in seminal work reported by Ferreira et al. [4] but truly only started to be used clinically in the new millennium [33], has been the evolution from capturing individual microscopy fields for review to whole slide imaging (WSI). This process requires specialized scanners, both high-resolution and high-speed, that can digitize images across large arrays and combine them, through a process called stitching, into even larger arrays [34–36]. As such, not just selected fields but entire slides can be visualized. Moreover, they can be scanned at multiple levels of magnification and in all three planes (x, y, and z) [14]. One major advantage of stitching numerous images together, rather than trying to capture an entire slide at once, is that each component image can be brought into focus and captured at high magnification, rather than capturing the whole slide at low magnification and enlarging it later, which degrades focus and resolution. However, stitching creates its own complications, like ensuring that the lighting, focus, and coloring of the partial images match, especially since the topology of the specimen may vary from one part of the specimen to another. This has led to numerous process refinements, like shading and lighting normalization, auto-focusing, and independent dual-sensor scanning that allows image acquisition and focusing to be performed sequentially, rather than in the same step [37, 38].
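The core of stitching can be sketched simply, under the assumption that the scanner reports each tile's pixel offset on the slide. Real systems must also normalize shading and match focus and color across tiles, as described above; this sketch only pastes grayscale tiles onto a shared canvas, with later tiles overwriting the overlap region. All tile sizes and offsets are invented for illustration.

```python
import numpy as np

def stitch(tiles, canvas_shape):
    """Paste tiles onto one canvas.

    tiles: list of (offset_row, offset_col, 2-D uint8 array);
    later tiles overwrite earlier ones wherever they overlap."""
    canvas = np.zeros(canvas_shape, dtype=np.uint8)
    for r, c, tile in tiles:
        h, w = tile.shape
        canvas[r:r + h, c:c + w] = tile
    return canvas

# Two 4x6 tiles with a 2-pixel horizontal overlap on a 4x10 canvas.
left = np.full((4, 6), 100, dtype=np.uint8)
right = np.full((4, 6), 200, dtype=np.uint8)
mosaic = stitch([(0, 0, left), (0, 4, right)], canvas_shape=(4, 10))
print(mosaic[0])   # 100s on the left, 200s where the right tile overwrote
```

Production stitchers replace the crude overwrite with blending across the overlap, precisely because visible seams appear when lighting or color differs between adjacent tiles.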

Scanning at different levels (i.e., different focal planes) must be distinguished from zooming in, which refers to magnification within a single plane; both are useful, but for different reasons. Magnification allows for closer inspection and better detection of smaller structures. The ability to scan tissues at different levels and in all three planes, by contrast, has led to the generation of three-dimensional image reproductions of the original tissue, achieved by scanning multiple focal planes and then stacking the resulting images. This is invaluable for the evaluation of cytological specimens, frozen sections, and other thick specimens where the pathologist needs to assess cellular architecture in multiple planes; in this way, entire tissue sections can be visualized [39–42]. Thick specimens can not only be scanned throughout, but the focal plane can be rotated in any direction. In addition, as will be elaborated further in Chap. 4, stains can be both detected and quantified [43, 44]. This is accomplished using recent innovations like automated (histopathology) pattern recognition [45]; color enhancement and standardization techniques [46–48], as well as color content analysis that allows for the detection and quantification of histochemical stains [49]; and image microarrays (IMA) and multiplexed biomarker testing, so that several tissue characteristics, biomarkers, or stains can be sought and detected on the same slide, thereby replacing the tedious-to-make and difficult-to-maintain cell and tissue (paraffin) blocks of traditional microscopy [50, 51].
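One building block of z-stack scanning is deciding which focal plane is sharpest for a given field. A minimal sketch, under simplifying assumptions: we score each plane by the variance of a finite-difference gradient (a common sharpness proxy) and pick the highest-scoring plane. Production scanners score small regions and blend results rather than scoring whole planes, and the two test images below are invented for illustration.

```python
import numpy as np

def sharpness(plane):
    """Sharpness proxy: variance of the image gradient.
    In-focus detail produces strong local gradients; defocus flattens them."""
    gy, gx = np.gradient(plane.astype(float))
    return float(np.var(gy) + np.var(gx))

def best_focal_plane(stack):
    """stack: list of 2-D arrays, one per focal plane; returns the index
    of the plane with the highest sharpness score."""
    return int(np.argmax([sharpness(p) for p in stack]))

# Hypothetical two-plane stack: a flat (defocused) field vs. a sharp
# checkerboard pattern standing in for in-focus cellular detail.
blurry = np.full((8, 8), 128)
sharp = np.indices((8, 8)).sum(axis=0) % 2 * 255
print(best_focal_plane([blurry, sharp]))  # → 1
```

Stacking then either keeps each region's sharpest plane (extended depth of field) or retains all planes so the viewer can refocus interactively, as when screening thick cytology preparations.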

Alongside all these functions, systems have become faster and can now multitask. Not just one slide, but numerous slides or tissue specimens, captured in many different focal planes, can be imaged. Several different cellular characteristics, like the uptake of a fluorescent stain, can be detected or quantified at the same time. In addition, data can be extracted from the images and analyzed, using various algorithms that allow for the automatic detection and characterization of, for example, cancerous cells [52]. Automated image analysis of routine histological sections is now being used to detect and quantify the expression of human epidermal growth factor receptor 2 (HER2) in breast cancer, since over-expression is associated with an increased risk of recurrence and poor outcomes [53]. It also predicts responsiveness to trastuzumab, a monoclonal antibody that targets the HER2/neu receptor [53].
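The simplest form of such quantification can be sketched as follows. This is only an illustrative toy, not a validated scoring method: it thresholds a hypothetical single-channel stain-intensity image and reports the positively stained pixel fraction. Clinical HER2 analysis works on color-deconvolved membrane staining with validated scoring rules; the threshold and the random stand-in image here are arbitrary.

```python
import numpy as np

def positive_fraction(stain, threshold=150):
    """Fraction of pixels whose stain intensity exceeds the threshold.
    `stain` is any 2-D array of intensities in the range 0-255."""
    return float((stain > threshold).mean())

# Stand-in for a digitized stain channel: random intensities 0-255.
rng = np.random.default_rng(0)
field = rng.integers(0, 256, size=(100, 100))
print(f"positive fraction: {positive_fraction(field):.2f}")
```

Real pipelines add segmentation (scoring only tumor-cell membranes), stain normalization, and calibration against pathologist scores, but each reduces, at bottom, to counting pixels or objects that pass a rule like this one.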


1.2.3 Differences with Radiology


Notice how we call it “radiology,” and not “digital radiology.” That is because radiology already is digital; we take this for granted, so we omit the “digital” adjective. We estimate that pathology is about 10 years behind radiology in terms of digitization. It is therefore tempting to think that solutions devised for radiology can be replicated or even reused for pathology.

However, it is important to recognize that pathology is fundamentally different from radiology, and there are reasons why digital pathology lags behind. First, the source material is different: radiology usually works with live subjects (patients), whereas pathology usually concerns specimen samples (biopsies, cytologies). Protocols also are fundamentally different in pathology, in that additional stains often are requested by the pathologist based on observations made on the original H&E-stained slide. This constant interchange between digital observations and wet-laboratory techniques poses new challenges for laboratory information systems, and it is one reason RIS software (Radiology Information System) is not a good fit for today’s pathology departments (and therefore cannot easily be ported to them).


