"Big Data" comes to medical imaging

June 27, 2013
by Brendon Nafziger, DOTmed News Associate Editor
Google has changed the way people do a lot of things, from planning road trips to checking e-mail, to — if Google Glass pans out — choosing eyewear. A few years ago, the California tech giant decided to shake up another field: epidemiology. In 2008, amidst much fanfare, the company launched Google Flu Trends. The web service rests on a pretty clever idea. It uses complex algorithms to monitor people’s searches for flu terms or symptoms, using this activity as a proxy for the flu — if people are googling the flu, Google reasons, they probably have it — and thereby modeling the spread of the disease.

The company says its data pretty closely matches the Centers for Disease Control and Prevention’s own weekly influenza reports, which are based on tried-and-true epidemiological methods. And for several years it did. But this spring, Nature reported that Google Flu probably botched it, predicting about twice as many cases of flu as there actually were this season, according to the CDC’s numbers. No one knows why it failed, but one suggestion is that the media hype around the flu confused the system: people were searching for the flu not because they had it, but because they had learned of the outbreak somewhere and wanted to follow up on it.

Even if Google Flu had a bit of a misstep (one possibly soon fixed with a better algorithm), it does point to what’s in store for medicine: Big Data, or pooling the data drawn from the collective behaviors of millions of people to discover everything from public health trends to disease risk factors and more.

This is coming to medical imaging, too, and the implications won’t be confined to clinical matters: business software can dip into those richer mines of data and dig up nuggets that help hospital administrators understand how their enterprises are running.

Nadim Michel Daher

“Big Data analytics is key, both for the clinical aspect and for its operational aspects for an enterprise to be able to look at data from the last five years and analyze their own enterprise, how it can be improved and trends in imaging, and make data-driven decisions accordingly,” Nadim Daher, principal analyst with Frost & Sullivan, tells DOTmed Business News.

But there’s still a lot of work to be done — on exploring new storage models and figuring out how to get the most out of analytics — before Big Data has its impact, big or small, on the field.

Data deluge
For medical imaging, the data being created is expected to be very, very big. According to Frost & Sullivan’s recent predictions, diagnostic imaging alone will generate nearly 1 exabyte of data by 2016. One exabyte is one of those units so large the number used to describe it seems like a nonsense word — it’s 1 quintillion bytes. That is, 1 billion gigabytes. To put that in perspective, it’s the equivalent of 250 million DVDs, a little more than one month’s worth of the entire world’s mobile data traffic last year, or one-fifth of a text containing all words ever spoken, according to Cisco Systems Inc. In short, it’s a lot of data.

But even this might be an underestimate. Frost & Sullivan says its projections for diagnostic imaging growth, which, by the way, exclude some data-intensive imaging like interventional radiology, are a mid-range estimate. Under its higher-range forecast, the volume crosses the 1 exabyte Rubicon either next year or in 2015.

Mike Leonard

Mike Leonard, director of product management for the health care services business at Iron Mountain, Inc., thinks the number’s too conservative, based on the storage firm’s own reports of 50 percent year-over-year growth in its medical imaging archive business.

“If the growth rates (elsewhere) are as high as we’re seeing, those growth rates exceed what Frost & Sullivan has,” he tells DOTmed News. “We definitely see the amount of storage that hospitals need to budget for each year growing incrementally.”

He estimates the storage budget’s growing at about 3 to 5 percent of the total IT budget per year, with the growth larger for smaller hospitals. All told, between 2012 and 2015, hospitals’ total data load is expected to grow from 650 terabytes to about 1.7 petabytes, he says, citing published reports.
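Taken at face value, those endpoints imply a much steeper curve than a few percent a year. Here is a quick back-of-the-envelope calculation of the growth rate they work out to; the constant-rate assumption is ours, not Leonard’s.

```python
# Implied compound annual growth rate between the two endpoints Leonard cites
# (~650 TB in 2012 to ~1.7 PB in 2015); the smooth, constant-rate growth
# assumption is ours, made only to put a single number on the trend.
start_tb, end_tb, years = 650, 1_700, 3
cagr = (end_tb / start_tb) ** (1 / years) - 1
print(f"implied growth: {cagr:.0%} per year")   # roughly 38% per year
```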

True, the growth of diagnostic imaging procedures has plateaued or even fallen over the past few years, but the absolute volumes are still up, Daher says. “The average storage volume requirement is going up, and that’s showing no sign of slowing down,” he predicts.

Which modalities appear to be the culprits behind the growth? “Generally, CT and MRI because of size of the studies coupled with the volume of studies done,” Leonard says. Digital mammograms are large, too, and an emerging innovation in the field might just make mammography more burdensome: breast tomosynthesis.

Tomo troubles
Breast tomosynthesis is a 3-D digital mammography technology in which an X-ray tube sweeps in an arc over the breast, taking multiple images that are reconstructed into a 3-D image. Only one tomosynthesis unit, Hologic’s Selenia Dimensions, is on the market, having been cleared by the Food and Drug Administration in February 2011, but other manufacturers have released tomosynthesis devices in Europe and will probably bring them over to the U.S.

When they do, and should the tomosynthesis market take off (right now it is not reimbursed by Medicare, which somewhat limits its appeal), it could be an even bigger data archiving burden for imaging centers and hospitals.

Dr. David Clunie, an imaging informaticist, is even helping organize a talk at the Society for Imaging Informatics in Medicine’s 2013 conference this summer on how breast tomosynthesis could “kill” traditional PACS, in part because of the size of its files. Even a compressed, four-view tomo study would be roughly 350 MB, larger than most CT chest and abdomen scans, he said.

“That is quite a lot of data to transmit and store, particularly when one considers the relatively high throughput of a dedicated screening facility,” he wrote in a March article on Aunt Minnie. “It is certainly a nontrivial amount of data to include in one’s consideration of capacity and cost of the archival distribution infrastructure.”
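To put rough numbers on that, here is a back-of-the-envelope sketch of what a 350 MB study means for a busy screening site; the daily throughput and working-day figures are assumptions for illustration, not numbers from Clunie’s article.

```python
# Back-of-the-envelope archive load for a dedicated screening facility.
# The 350 MB per compressed four-view tomo study comes from Clunie's estimate;
# the throughput and working-day figures are assumptions for illustration only.
STUDY_MB = 350
studies_per_day = 80      # assumed daily screening volume
working_days = 250        # assumed working days per year

daily_gb = STUDY_MB * studies_per_day / 1_000
yearly_tb = daily_gb * working_days / 1_000
print(f"{daily_gb:.0f} GB per day, about {yearly_tb:.0f} TB per year")
# -> 28 GB per day, about 7 TB per year
```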

Tomo’s big file size also has knock-on effects, requiring faster network speeds to shuttle tomo data around and beefier specs for viewing workstations, according to Steve Deaton, vice president of sales with Viztek, a company that recently added a tomo viewing feature to its mammography PACS.

“The DBT size causes a ripple that will not only require more investments in IT infrastructure between the server and the radiologists' eyes, but it will also change the workflow models,” Deaton, who’s sitting on the tomo SIIM panel with Clunie, told DOTmed News by e-mail. “We have seen more radiologists reading from locations remote to the actual acquisition location, and this usually involves transmitting data over the Internet. Internet speeds usually are a fraction of what local networks are, so it will be increasingly difficult for radiologists to maintain their current distributed location model.”

Keep in mind, too, that diagnostic imaging isn’t the only discipline pumping up imaging volumes. Cardiology is a big contributor here, Daher says, and there’s already an established market for cardiology PACS. But possibly, the lion in the road is pathology. If CT or MR studies average 50 to 100 MB per study, digitized pathology studies start at over 1 gigabyte, Iron Mountain’s Leonard says.

“That’s going to be a tremendous growth driver of the next few years,” he notes.

Daher agrees. “It’s something everyone’s looking at,” he says, but even that’s not all. “It’s not only digital pathology; it’s also endoscopy, ophthalmology, dermatology, surgery — a number of image-using and image-producing departments.”

Changing the business model
New pressures on storage could lead to new storage models, but in general the industry is fairly conservative. “Most health care organizations, regardless of size, continue with business-as-usual in terms of managing storage,” Leonard says.

This means building out a disk-based archive system, periodically adding to it or swapping out obsolete equipment after five years. But under new storage demands, offsite cloud storage (a fancy way of describing systems managed remotely by a vendor or third party) is emerging as an option. Slowly.

“From what we’ve seen, I don’t know anybody would disagree, that cloud storage market for medical imaging is really just at the beginning stages,” Leonard says. Even so, cloud systems are generally reserved for older studies in the archive, not the tier 1, ultra-fast, low-latency storage that radiologists need for recent exams.

“It’s very rare that somebody’s going to put their tier 1 storage offsite,” Leonard says. “As bandwidth becomes less expensive and more available, that potentially changes.”

Still, Frost & Sullivan’s Daher believes that more facilities are moving to a cloud-style, pay-as-you-go model, where a facility pays per procedure or volume unit, such as by terabyte, rather than adding more modular racks into the already crowded archives. But he says it could be in the “early adopter stage” for some time.

Analytics, phase 1
Once you have all this data, though, whether stored onsite or off, what do you do with it? Here’s where analytics comes in. And here, as with the cloud, medicine is just catching up.

“When the world talks about Big Data, you’ve got Google and Amazon and their way more advanced computer science going at these problems, and in medicine we’ll be watching that and potentially trying to harness new ways of crawling through data,” Dr. Matthew Morgan, a radiologist and imaging informaticist at the University of Utah, tells DOTmed News. For medicine, digital trends tend to come five to 10 years later, he says.

In radiology, for now, informaticists and vendors are working more on data aggregation than on true “Big Data,” with less emphasis on making discoveries by combing through reams of data and more interest in silo-breaking: making sure information held by RIS, PACS, electronic medical records and even dose-monitoring software can be pooled together.

The silo-breaking, which Morgan calls phase 1 for analytics, is nonetheless proceeding. Medicalis, Primordial and M*Modal, for instance, are all vendors working to provide analytics on top of the PACS you’re already running, mostly by offering software that listens to the HL7 messages (pre-formed data packets generated by different medical software systems) and wraps them up under one user interface.
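For a sense of what “listening to the HL7 messages” means in practice: HL7 version 2 messages are plain text, broken into segments whose fields are separated by pipe characters, so an analytics layer can sit on the message feed and pull out the handful of fields it wants to pool. Below is a minimal sketch of that idea; the sample message and the fields chosen are illustrative assumptions, not any particular vendor’s feed.

```python
# A toy illustration of "listening" to an HL7 v2 result message (ORU^R01) and
# pulling out fields an analytics layer might pool across RIS, PACS and EMR.
# The sample message and the choice of fields are invented for illustration.
SAMPLE_ORU = "\r".join([
    "MSH|^~\\&|RIS|HOSP|ANALYTICS|HOSP|20130601120000||ORU^R01|00001|P|2.3",
    "PID|1||MRN0001||DOE^JANE",
    "OBR|1|ACC123||71020^CHEST XRAY|||20130601113000",
])

def parse_segments(message):
    """Index each pipe-delimited segment (MSH, PID, OBR, ...) by its ID."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

def exam_summary(message):
    """Pull a few fields a cross-system dashboard might aggregate."""
    seg = parse_segments(message)
    return {
        "patient_id": seg["PID"][3],  # PID-3: patient identifier
        "order_id":   seg["OBR"][2],  # OBR-2: placer order number
        "procedure":  seg["OBR"][4],  # OBR-4: universal service ID
        "performed":  seg["OBR"][7],  # OBR-7: observation date/time
    }

print(exam_summary(SAMPLE_ORU))
```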

To see how effective this is, the industry also needs solid case studies of how people can most benefit from analytics, and how they’re using indicators and so-called dashboards — analytics tools that gather key metrics together in one place. Some information is trickling in. This spring, the Journal of the American College of Radiology published a study on the use of dashboards by academic medical centers in the U.S., finding that nearly two-thirds used them to track patient volume, turnaround times and access to MRIs and other high-end devices.

But the programs were fairly new, with about half saying they used the dashboard technology for less than two years, the study said.
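To make the dashboard idea concrete, here is a minimal sketch of how two of those indicators, patient volume and report turnaround time, boil down to simple aggregation over timestamps; the example records are invented, not drawn from the study.

```python
# A minimal sketch of two indicators the JACR study mentions: patient volume
# and report turnaround time. The records are invented example data; in
# practice they would be pooled from RIS/PACS feeds like the HL7 one above.
from collections import Counter
from datetime import datetime
from statistics import median

FMT = "%Y%m%d%H%M"
exams = [  # (exam completed, report finalized), as timestamps
    ("201306011130", "201306011415"),
    ("201306011300", "201306020930"),
    ("201306020900", "201306021100"),
]

turnaround_hours = [
    (datetime.strptime(done, FMT) - datetime.strptime(start, FMT)).total_seconds() / 3600
    for start, done in exams
]
daily_volume = Counter(start[:8] for start, _ in exams)

print(f"median turnaround: {median(turnaround_hours):.1f} hours")  # 2.8 hours
print(dict(daily_volume))  # {'20130601': 2, '20130602': 1}
```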

One possible answer to the backroom analytics headaches — of getting programs working together — might be discovered in the world of startups. Analytics Informatics, a spinoff founded by Paul Nagy, at Johns Hopkins University, and Christopher Meenan at the University of Maryland, is laying out one promising strategy, Morgan says. The idea behind the company is that if several institutions run the same “under the blankets” technology — sort of like the Android operating system being common to different kinds of phones or tablets — they can more easily create tools or apps on top that can be shared among different institutions without having to invest too much time from their IT teams.

“Once you’re mapped into the system, all the complexity fades away,” Morgan says. “If I build a little tool that tracks scanner utilization, anyone else using the same platform will also (be able to use it).”

“Now you’re able to be creative,” he adds.