Q&A: Pillar Biosciences' Dale Yuzuki on Managing Cost Waste and Quality Control in Clinical Labs


NEW YORK (GenomeWeb) – Quality control and cost are key issues in running any successful clinical lab. With the US Food and Drug Administration weighing increased oversight of laboratory-developed tests, and the Centers for Medicare & Medicaid Services moving closer to implementing rules expected to reduce Medicare payments to clinical labs, both issues are attracting even more scrutiny.

Uncertainty in healthcare, in general, also has put increasing pressure on clinical labs to better assess their costs and quality controls. Project Santa Fe, a coalition of regional laboratories formed to define the economic valuation of clinical labs, noted in a report released this spring that cost and quality controls will likely play a more equal role in the value of clinical labs in coming years.

"For the most part, laboratory professionals have focused on improving quality and outcomes — the numerator in the value equation. But transformation from volume to value will require meaningful quantification of costs — the denominator," the report stated.

360Dx recently spoke with Dale Yuzuki, senior global market development manager at Pillar Biosciences, a Massachusetts-based startup developing NGS-based assays for oncology applications, about costs and waste in clinical labs and the challenges of controlling costs while maintaining tight quality standards. Yuzuki has extensive experience in product development and sales and marketing for firms involved in the genomics research and clinical markets including SeraCare Life Sciences, Thermo Fisher Scientific, and Life Technologies. Earlier this year he penned a blog piece on SeraCare's website about reducing the costs associated with running a genomics lab.

Below is an edited transcript of the interview.

How much of an issue is cost waste for labs, and what are some of the areas where labs see the most waste?

I think waste is a big problem in terms of cost. Why? Because of labor. Labor is a huge human cost that people in the pathology laboratories don't always [fully consider]. We certainly have labs that do look at labor costs, but often in academic centers they are not looking at time sheets and time use of their employees in the same way that maybe a large commercial laboratory would, and it's a huge problem. One of the big areas is the amount of time spent doing repetitive tasks. You have highly trained people who are fixing things because they weren't maintained, or something went awry with the human manipulation of things. Things get mislabeled, plates get turned around, records get mangled, things happen, both human caused and not human caused. Fixing those problems is a cost waste and a drag on the system.

What role does employee training play in controlling costs?

Training is an important point and laboratories do have regular training, and they really invest quite a lot in their people. At the same time, you have people who are doing really repetitive work and they are going to burn out. We have a really finely tuned machine where we are training a lot of people and we are treating them well, yet they leave because they don't like the repetitive nature of the work.

How has the cost issue changed with the advent of molecular and genetic tests?

With molecular and genetic tests, the cost issue changes because of the nature of the complexity. When you think about clinical chemistry companies, they have gigantic automated instruments.

The clinical chemistry side is relatively mature. It has been around for a long time, and the level of automation is mind blowing. But the molecular and genetic side requires a lot more training. It requires a lot more expensive labor. It's not as automated, not as push-button. It requires more skill and interpretation. That is maybe a generality, in that there might be molecular tests that are real-time PCR-based that are more automated, but on the genetic side? There are so many bits and pieces to worry about that it increases the labor and the cost just from the point of view of somebody implementing that.

The flip side of the cost issue is that for an NGS type of assay that may be running 20 samples at a time, it might be $8,000 or $10,000 just to press go. For each individual test per sample there may easily be $400 to $500 invested in that single sample, whereas other laboratory tests might be in the tens of dollars, and chemistry analyzers might be $5. So, when you talk about a cost per sample, if you are getting a genomic profile to match therapy to the tumor, before they have to press start on the instrument there has been a lot of labor to take those 20 samples and do what we call library preparation. When they do press go, there is already an investment in time that is not captured in the cost. Even with Illumina trying to package things in a smaller format with the MiSeq, the sample preparation could be $150 to $200 per sample for 20 samples, so right now you are talking about $3,000 or $4,000. It could easily be another $6,000 just for the reagents for the sequencing. And that is before we even talk about the cost of analysis.
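The back-of-the-envelope arithmetic above can be sketched in a few lines. This is purely illustrative: the function name is my own, and the dollar figures are the ranges quoted in the interview, not vendor pricing.

```python
# Rough sketch of the per-run NGS consumable cost arithmetic described above.
# All figures are illustrative ranges from the interview, not vendor pricing.

def run_cost(samples, library_prep_per_sample, sequencing_reagents):
    """Return (library prep cost, run total, cost per sample) for one run.

    Excludes labor, instrument amortization, and downstream analysis,
    which the interview notes are significant additional costs.
    """
    prep = samples * library_prep_per_sample
    total = prep + sequencing_reagents
    return prep, total, total / samples

prep, total, per_sample = run_cost(
    samples=20,
    library_prep_per_sample=200,  # upper end of the $150-$200 range cited
    sequencing_reagents=6000,     # reagent figure cited for the run
)
print(f"library prep: ${prep}, run total: ${total}, per sample: ${per_sample:.0f}")
# -> library prep: $4000, run total: $10000, per sample: $500
```

The $10,000 total and $500 per sample line up with the "press go" and per-sample figures quoted above.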

That's how some of the economics work, and that doesn't even include the costs of my expensive technician and the training. Then, what if a mistake was made, and with that $10,000 we have to say, 'OK, we have to redo it.' We don't have to go all the way back to the beginning, we still have plenty of this library material, but nevertheless it's time and energy spent setting up another run. These kinds of challenges add costs.

There seems to be interest in improving reference materials and reference standards. What needs to be done to develop these tools so that they can lead to improvements in labs and eventual cost savings?

People really have to walk a tightrope of reducing costs as low as they can, and yet producing the highest-quality result because they know it impacts patient care. Let's say this is a laboratory-developed test. Using reference materials to make sure that the answer that they are getting is the answer they should be getting is an extra cost. That's the fine balance between wanting to be doing the best I know how, and yet at the same time knowing that is going to incur an extra cost. That financial burden is a tough one. It really is hard. There is no easy answer.

Let's say the cost per sample is on the order of $500, and there are 20 samples. If I cram in a sample that's a control, and one out of 20 is just a regular normal sequence, that's a $500 opportunity cost. I can run a patient sample, and spend $500 that way, or I can run the reference material standard and it will still cost me $500. So, for the $500 cost, do I run another patient sample or do I run a reference standard to make sure I'm measuring what I need to measure?
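That trade-off can be made concrete under the 20-slot, $10,000-run assumptions above (the function name is mine): dedicating one slot to a control spreads the same run cost over 19 patient samples instead of 20.

```python
# Opportunity cost of dedicating run slots to reference standards.
# Assumes the 20-slot, $10,000-per-run figures quoted in the interview.

def effective_patient_cost(run_total, slots, control_slots):
    """Spread the full run cost over the patient samples only."""
    patient_slots = slots - control_slots
    return run_total / patient_slots

print(effective_patient_cost(10_000, 20, 0))  # 500.0 -- no control on the run
print(effective_patient_cost(10_000, 20, 1))  # ~526.3 -- one slot is a control
```

In other words, the control does not just cost its own $500 slot; it also nudges up the effective cost of every patient sample on that run.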

A control is a ruler, a measuring stick, so we know what truth is. It costs money to know the truth, so maybe I just check in once a month, which is fine. But there are people in the current laboratory-developed-test environment who are not forced to measure truth, so they won't. The FDA does not regulate LDTs. In the laboratory-developed-test world there is freedom to innovate, and that is where the FDA exercises enforcement discretion.

Is quality control a problem for labs in general?

No, quality is not a huge problem, but controls come into the discussion of cost, because cost pressures in the laboratory are very high due to reimbursement, due to how much labs are getting paid. The FDA really depends upon a vibrant LDT environment for innovation to occur, and then a test matures enough that they can put their stamp of approval on it. FDA approval is a very arduous, expensive, and difficult undertaking, but at the end of that process, with that FDA stamp, a test can be easily adopted.

Is there anything that the FDA should do to help labs control costs and avoid waste?

The FDA is really there to make sure that patients are not harmed. The FDA is also there to make sure that there is usefulness for whatever it is they approve – it has to have a clear benefit. … They are not really worried about cost. They kind of let the free market sort out the cost problem. They really cannot mandate you must be less expensive or run more efficiently. What they can do is make sure that whatever they approve is really a lot better.

Is there anything the industry can do?

I have had people ask me what does a clinical laboratory need. Does it need more automation? That's what companies like Bio-Rad with GnuBio [the droplet-based DNA sequencing technology company Bio-Rad acquired in 2014] and NanoString with Hyb & Seq [its single-molecule, direct digital sequencing technology] are tackling in the clinical market. Can we make it as straightforward as extracting the DNA from the sample and putting it in the instrument?

Others like Pillar Biosciences are tackling it from the existing-infrastructure perspective, asking: can we make the up-front technician's process easier, so that it's less labor intensive? That is the value proposition: if I'm able to cut the process down so the technician has a lot less manipulation to do, or you have a chemistry process by which the same goals are achieved but with a lot less room for failure, that is how we are tackling it. I think there is merit to both sides.

What the industry can do, they are already trying to do. They spend a lot of money trying to make it faster and easier and basically to help drive adoption. But the inherent processes are hard and labor-intensive. It's just kind of the nature of the beast.

So quality and cost control in the industry will be driven by market forces?

The FDA is concerned about quality. CAP [College of American Pathologists] does annual proficiency evaluations and publishes anonymized results to get a pulse of the state of testing. Two years ago, CAP put out basically a dry experiment in which they issued large data files, and groups would take those files as if they were coming off a sequencer, make the appropriate interpretation, and return them. In the last year or two, they have actually started giving out anonymized DNA samples and saying go ahead and test it and report back. Then CAP reports to the community how well these laboratories are doing.

Laboratories want to know how they compare to their peers. They want to know for their own reasons, and CAP as an organization wants to know how the field is doing with these high-complexity tests.

As an industry, quality works in everybody's interest. I think that's one of the things that is different about life sciences: it attracts people who are truly driven by the best intentions. The bottom-line profit motive that causes all kinds of distortions in, say, the financial world just isn't the driver in life sciences. It really is about patient health and safety. From the diagnostic testing point of view, they keep that front and center. When I talk to pathology lab people, for them it's all about the patient.