Pre-market tests required for the U.S. Food and Drug Administration to approve stents, pacemakers and other cardiovascular devices may not reveal their true risks to patients, according to a new study published online Monday in the Archives of Internal Medicine.
That’s because most studies don’t include in their data the results from “training patients,” say researchers from the University of California, San Francisco. Training patients are those on whom doctors essentially practice, learning how to insert the device or perform a related procedure. Once doctors have gotten the hang of it, they begin the study proper, using a fresh batch of patients, called “treatment patients,” for the actual test results.
And in the relatively few studies that do include results for training patients, their outcomes were much worse than those of the treatment patients. According to one study summary, there were nine deaths among 91 training patients – a 9.9% mortality rate – compared with five deaths among 135 treatment patients – a 3.7% mortality rate. Another summary showed that placing a stent in training patients took longer than in treatment patients – 1.17 hours compared with 0.98 hours. It seems a little practice makes a difference.
But most doctors using these devices won’t be practiced the first time they have to implant a pacemaker or deliver a stent, the study authors point out. So these studies may not accurately predict how patients will fare – and it may not be fair to the training patients, who may not realize that their odds of survival are lower than those of other study patients.
“To protect the rights of training patients,” as well as to prevent problems with safety and efficacy outcomes “and to better understand the effect of operator learning on device performance, we call for increased transparency of data from training patients,” the authors write.
Follow me at twitter.com/LAT_aminakhan.