Contact: Vanessa Wasta
wasta@jhmi.edu
410-955-1287
Johns Hopkins Medical Institutions
A combination of several well-known safety procedures could greatly reduce patient-harming errors in the use of radiation to treat cancer, according to a new study led by Johns Hopkins researchers.
Radiation oncologists use more than a dozen quality assurance (QA) checks to prevent radiotherapy errors, but until now, the Hopkins researchers say, no one has systematically evaluated their effectiveness. Working with researchers at Washington University in St. Louis, the Hopkins team gathered data on about 4,000 "near miss" events that occurred at the two institutions from 2008 to 2010. They then narrowed the data set to 290 events in which errors occurred that, had they not been caught in time, could have seriously harmed patients. For each commonly used QA check, they determined the percentage of these potential patient-harming incidents it could have prevented.
The group's key finding was that a combination of approximately six common QA measures would have prevented more than 90 percent of the potential incidents.
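The release does not spell out how the researchers arrived at that combination, but the underlying arithmetic can be illustrated with a small sketch. In the hypothetical Python example below, each near-miss incident is tagged with the QA checks that could have caught it, each check is scored by the fraction of incidents it alone would have prevented, and checks are then combined until more than 90 percent of incidents are covered. The incident data, check names, and greedy selection are all assumptions made for illustration, not the study's actual data or method.

    # Illustrative sketch only: the incidents and QA-check names below are hypothetical,
    # and a greedy selection is just one plausible way to build a high-coverage combination.

    # Each "near miss" incident maps to the set of QA checks that could have caught it.
    incidents = {
        "evt01": {"physics chart review", "therapist timeout"},
        "evt02": {"physician chart review", "checklist"},
        "evt03": {"in-vivo EPID dosimetry", "film dosimetry"},
        "evt04": {"therapist timeout", "physics chart review"},
        "evt05": {"checklist", "physician chart review"},
    }

    all_checks = set().union(*incidents.values())

    # Score each check on its own: fraction of incidents it would have prevented.
    for check in sorted(all_checks):
        caught = sum(check in catchers for catchers in incidents.values())
        print(f"{check}: {caught / len(incidents):.0%}")

    # Build a combination greedily: repeatedly add the check that catches the most
    # still-uncovered incidents, until at least 90 percent of incidents are covered.
    uncovered = set(incidents)
    combination = []
    while len(uncovered) > 0.1 * len(incidents):
        best = max(all_checks, key=lambda c: sum(c in incidents[e] for e in uncovered))
        combination.append(best)
        uncovered = {e for e in uncovered if best not in incidents[e]}

    coverage = 1 - len(uncovered) / len(incidents)
    print("combination:", combination, f"covers {coverage:.0%} of incidents")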
"While clinicians in this field may be familiar with these quality assurance procedures, they may not have appreciated how effective they are in combination," says Eric Ford, Ph.D., D.A.B.R., assistant professor of Radiation Oncology and Molecular Radiation Sciences at Johns Hopkins, who will present the group's findings on August 3 at the joint American Association of Physicists in Medicine (AAPM) and Canadian Organization of Medical Physicists annual meeting, held July 31 to August 4, 2011 in Vancouver, Canada.
At a separate symposium at the meeting, also on August 3, Ford and his colleagues will make related recommendations for the standardization of radiotherapy accident investigation procedures.
Ionizing radiation such as gamma radiation or proton beam radiation has long been a staple in cancer treatment, because it can efficiently create cell-killing DNA breaks within tumors. The goal is to use it in ways that maximize the dose delivered to a tumor, while keeping healthy tissue around the tumor as protected as possible by sharply focusing the radiation treatment area.
Unfortunately, the multistep complexity of radiation therapy, and the numerous precision measurements its use entails, can sometimes lead to mistakes, with patients getting too little radiation where it's needed, or too much where it isn't.
One QA check, a piece of hardware called an Electronic Portal Imaging Device (EPID), is built into many radiotherapy delivery machines and can provide a real-time, X-ray-like image of the radiation passing through a patient. But Ford says fewer than one percent of radiotherapy clinics use EPID, because the software and training needed to operate it are largely lacking.
However, the research showed that another key to safety is a humble checklist of relatively low-tech measures, "assuming it's used consistently correctly, which it often isn't," Ford adds. The checklist includes reviews of patient charts before treatment by both physicians and radiation physicists, who calculate the right dose of radiation.
Also among the most effective QA procedures were film-based radiation-dose measurements, used as an alternative to EPID, and a mandatory "timeout" taken by the radiation therapist before the beam is turned on to double-check that the written treatment plan and doses match the settings on the radiation delivery machine.
A common QA measure known as pretreatment IMRT (intensity modulated radiation therapy), in which clinical staff do a "test run" of the radiotherapy device at its programmed strength with no patient present, ranked very low on the list because it would have prevented almost none of the potential incidents studied. "This is important to know, because pre-treatment IMRT often consumes a lot of staff time," says Ford.
Ford and his Johns Hopkins colleague Stephanie Terezakis, M.D., a pediatric radiation oncologist and a contributor to the QA evaluation study, also are members of the AAPM Working Group on the Prevention of Errors. At the Vancouver meeting, in a symposium on August 3, the group will make recommendations for a national radiotherapy incident reporting system. The working group is developing a way to have treatment errors and near-misses reported to a central body for evaluation and dissemination of lessons to clinics, says Ford. "It could work in ways similar to how air and train accidents are reported to the National Transportation Safety Board," he says.
###
Other experts who contributed to the QA-check effectiveness study are Kendra Harris, M.D., a radiation oncology resident at Johns Hopkins; Annette Souranis, a therapist in the radiation oncology department; and Sasa Mutic, Ph.D., associate professor of radiation oncology at Washington University School of Medicine in St. Louis, Missouri.
The study was funded with a pilot research grant from Elekta Inc.
Abstract Title/Number: WE-C-214-5: A Quantification of the Effectiveness of Standard QA Measures at Preventing Errors in Radiation Therapy and the Promise of in Vivo EPID-Based Portal Dosimetry
Abstract Link: http://www.aapm.org/meetings/amos2/pdf/59-16302-92754-297.pdf
Source: http://www.eurekalert.org/pub_releases/2011-08/jhmi-coe080111.php