The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and the treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to fewer patient delays. The methods used here can be applied to any TMS and TPS that allow queries of the database.
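The core of an automated check of this kind is comparing the same parameter as recorded in the TPS and in the TMS. The following is a minimal sketch of that idea; the dictionaries, field names, and tolerances are hypothetical illustrations, not the PCT implementation or any Varian API.

```python
# Hypothetical sketch of one automated checklist item: compare plan
# parameters retrieved from the TPS against those in the TMS and
# report any mismatches for the physicist to review.

def compare_plans(tps_plan: dict, tms_plan: dict, keys: list) -> list:
    """Return human-readable discrepancies between TPS and TMS data."""
    discrepancies = []
    for key in keys:
        tps_value = tps_plan.get(key)
        tms_value = tms_plan.get(key)
        if tps_value != tms_value:
            discrepancies.append(f"{key}: TPS={tps_value!r} vs TMS={tms_value!r}")
    return discrepancies

# Illustrative data only; real values would come from API queries.
checks = ["prescription_dose_cGy", "fractions", "energy", "machine_id"]
tps = {"prescription_dose_cGy": 6000, "fractions": 30, "energy": "6X", "machine_id": "TB1"}
tms = {"prescription_dose_cGy": 6000, "fractions": 30, "energy": "6X", "machine_id": "TB2"}
print(compare_plans(tps, tms, checks))  # flags the machine_id mismatch
```

Each discrepancy string can then be surfaced in the checklist UI so the mismatch is visible at the physics check rather than discovered at the machine.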
Proper quality assurance (QA) of the radiotherapy process can be time-consuming and expensive. Many QA efforts, such as data export and import, are inefficient when done by humans. Additionally, humans can be unreliable, lose attention, and fail to complete critical steps that are required for smooth operations. In our group we have sought to break down QA tasks into separate steps and to automate those steps that are better done by software running autonomously or at the instigation of a human. A team of medical physicists and software engineers worked together to identify opportunities to streamline and automate QA. Development efforts follow a formal cycle of writing software requirements, developing software, testing, and commissioning. The clinical release process is separated into clinical evaluation testing, training, and finally clinical release. We have improved six processes related to QA and safety. Steps that were previously performed by humans have been automated or streamlined to increase first-time quality, reduce time spent by humans on low-level tasks, and expedite QA tests. Much of the gain came from automating data transfer, implementing computer-based checking, and automating systems with an event-driven framework. These coordinated efforts by software engineers and clinical physicists have resulted in substantially faster patient-sensitive QA tests.
Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time‐consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS versions is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in‐preparation plans to be tested. It then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of a software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that relied on repeating pretreatment QA measurements or labor‐intensive and fallible hand comparisons.
PACS numbers: 87.55.T, 87.55.Qr
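The plan comparison step described above can be sketched as flattening each XML representation into parameter paths and diffing the two. This is an illustrative simplification, assuming unique leaf paths; the element names and MU values are hypothetical, not the actual Aria plan schema.

```python
# Sketch of comparing two XML representations of the same plan, in the
# spirit of the migration check described above. Assumes each leaf
# element path is unique (repeated sibling tags would need indexing).
import xml.etree.ElementTree as ET

def diff_plan_xml(old_xml: str, new_xml: str) -> list:
    """Flatten each XML tree into path->text pairs and report differences."""
    def flatten(element, prefix=""):
        items = {}
        for child in element:
            path = f"{prefix}/{child.tag}"
            if len(child):                       # has sub-elements: recurse
                items.update(flatten(child, path))
            else:                                # leaf: record its text
                items[path] = (child.text or "").strip()
        return items

    old_items = flatten(ET.fromstring(old_xml))
    new_items = flatten(ET.fromstring(new_xml))
    diffs = []
    for path in sorted(old_items.keys() | new_items.keys()):
        if old_items.get(path) != new_items.get(path):
            diffs.append(f"{path}: {old_items.get(path)!r} -> {new_items.get(path)!r}")
    return diffs

# Hypothetical plan fragments: a monitor-unit value that changed in migration.
old = "<Plan><Beam><MU>120.0</MU></Beam></Plan>"
new = "<Plan><Beam><MU>120.1</MU></Beam></Plan>"
print(diff_plan_xml(old, new))
```

Running such a diff over all 358 plans in batch mode, with one pass/fail report per plan, matches the console-application workflow the abstract describes.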
Purpose: To enhance patient safety and radiation therapy quality assurance by implementing an immediate computer‐based audit of the radiation therapy treatment record. Provide an automatic second check of the RT record in a paperless environment. Replace some aspects of weekly physics chart checks with a computer‐based audit of the treatment. Method: A software agent receives signals from a system we have developed, EventNet, when sentinel events occur in the TMS. Audits are triggered when a “treatment canceled” signal or one of the many “treatment complete” signals is received. The treatment-canceled audit assumes the patient treatment was canceled and verifies that no treatment happened. The treatment-completed audit assumes that a treatment occurred and checks the treatment history parameters and dose summary against the scheduled plans, session, daily, and total doses. The agent extracts the plan(s) and the day's treatment history from the TMS with a DICOM query and sends that data to a web service that compares the history against the plan. If the audit succeeds, the agent quietly adds an entry to a log file. When it fails, the agent alerts staff via pager, email, or SMS message. Results: The agent is able to catch events from the TMS and trigger one of two audit processes. Basic audits of the treatment histories work as designed. Intentional variations in treatment parameters in test cases are caught and trigger an immediate alert. Conclusion: Immediate audit of the RT record is better than waiting for the weekly physics chart check to verify plan parameters. Rather than having a physicist seek out variations in treatments by checking every treated parameter, a software agent can provide a basic audit and alert the physicist if needed. This brings more focus to patient treatments that require oversight and could free up valuable time for other QA tasks.
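The audit logic described above, comparing the day's treatment history against the scheduled plan, and alerting only on failure, can be sketched as follows. EventNet, the DICOM query, and the alert channels are stubbed out; all field names and the dose tolerance are hypothetical assumptions for illustration.

```python
# Minimal sketch of the "treatment completed" audit. In the real system
# this would be triggered by an EventNet signal, with plan and history
# data fetched from the TMS; here both are plain dictionaries.
import logging

def audit_treatment_completed(plan: dict, history: dict, alert) -> bool:
    """Return True if the delivered treatment matches the scheduled plan.

    On failure, call `alert` (pager/email/SMS in the real system) with a
    summary of the problems found.
    """
    problems = []
    if history["fields_treated"] != plan["fields_scheduled"]:
        problems.append("field count mismatch")
    if abs(history["delivered_dose_cGy"] - plan["session_dose_cGy"]) > 0.5:
        problems.append("session dose outside tolerance")
    if problems:
        alert("; ".join(problems))
        return False
    logging.info("audit passed")   # quiet log entry on success
    return True

# Illustrative check: a correct delivery, then one with a missing field.
alerts = []
plan = {"fields_scheduled": 2, "session_dose_cGy": 200.0}
good = {"fields_treated": 2, "delivered_dose_cGy": 200.0}
bad = {"fields_treated": 1, "delivered_dose_cGy": 180.0}
print(audit_treatment_completed(plan, good, alerts.append))  # no alert
print(audit_treatment_completed(plan, bad, alerts.append))   # alert raised
```

The design choice matters: success is silent (a log entry), so staff attention is consumed only by treatments that actually require oversight.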
Purpose: To optimize clinical efficiency and shorten patient wait time by minimizing the time and effort required to perform the Winston‐Lutz test before stereotactic radiosurgery (SRS) through automation of the delivery, analysis, and documentation of results. Methods: The radiation fields of the Winston‐Lutz (WL) test were created in a “machine‐QA patient” saved in ARIA for use before SRS cases. Images of the BRW target ball placed at mechanical isocenter are captured with the portal imager for each of four 2 cm × 2 cm MLC‐shaped beams. When the WL plan is delivered and closed, this event is detected by in‐house software called EventNet, which automates subsequent processes with the aid of the ARIA web services. Images are automatically retrieved from the ARIA database and analyzed to determine the offset of the target ball from radiation isocenter. The results are posted to a website, and a composite summary image of the results is pushed back into ImageBrowser for review and authenticated documentation. Results: The total time to perform the test was reduced from 20–25 minutes to less than 4 minutes. The results were found to be more accurate and consistent than the previous method, which used radiochromic film. The images were also analyzed with DoseLab for comparison. The differences between the film and automated WL results in the X and Y directions and the radius were (−0.17 ± 0.28) mm, (0.21 ± 0.20) mm, and (−0.14 ± 0.27) mm, respectively. The differences between the DoseLab and automated WL results were (−0.05 ± 0.06) mm, (−0.01 ± 0.02) mm, and (0.01 ± 0.07) mm, respectively. Conclusions: This process reduced patient wait times by 15–20 minutes, making the treatment machine available to treat another patient. Accuracy and consistency of results were improved over the previous method and were comparable to other commercial solutions. Access to the ARIA web services is made possible through an Eclipse co‐development agreement with Varian Medical Systems.
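The core of the WL image analysis is locating the centers of the radiation field and the ball shadow in each portal image and reporting their offset. The following sketch works on a synthetic NumPy image with assumed intensity thresholds and an assumed pixel size; the real tool retrieves EPID images from ARIA and would need calibrated thresholds and geometry.

```python
# Illustrative WL offset calculation on a synthetic portal image:
# bright pixels are the open field, darker pixels inside it are the
# ball shadow. Thresholds (0.05, 0.5) and pixel size are assumptions.
import numpy as np

def centroid(mask):
    """Center of mass (x, y) of the True pixels in a boolean mask."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def wl_offset(image, pixel_mm=0.392):
    """Offset (dx, dy) in mm of the ball shadow from the field center."""
    irradiated = image > 0.05                 # everything the beam hit
    field_cx, field_cy = centroid(irradiated)
    ball = irradiated & (image < 0.5)         # attenuated shadow region
    ball_cx, ball_cy = centroid(ball)
    return ((ball_cx - field_cx) * pixel_mm, (ball_cy - field_cy) * pixel_mm)

# Synthetic test image: a square field with the ball shadow shifted
# 2 pixels in +x, so the expected offset is 2 * pixel_mm in x only.
img = np.zeros((100, 100))
img[30:70, 30:70] = 1.0    # open field
img[45:55, 47:57] = 0.2    # ball shadow, offset +2 px in x
print(wl_offset(img))
```

In practice one such offset is computed per gantry/couch combination, and the four results are combined into the composite summary image pushed back for review.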