Abstract
What is known and objective
Patients who have undergone haematopoietic stem cell transplantation are prone to drug–drug interactions because of polypharmacy. Drug–drug interaction databases are essential tools for identifying interactions in this patient group. However, the interaction checkers that help manage these interactions may disagree on whether an interaction exists or how severe it is. This study aimed to determine the differences among popular drug–drug interaction databases, across several dimensions, for patients who underwent haematopoietic stem cell transplantation.
Methods
The 21-day treatment sheets of one hundred patients who underwent haematopoietic stem cell transplantation were examined in two subscription-based databases (Uptodate and Micromedex) and two open-access databases (Drugs.com and Epocrates) across several categories for two consecutive years. Statistical analyses were used to assess the compatibility of the databases in terms of severity ratings, evidence levels, cited references, and word counts of the interaction reports. Fleiss' and Cohen's kappa statistics were used to analyse the databases' agreement levels.
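As a rough illustration of the statistics involved (not the study's own code or data), the sketch below shows how Fleiss' kappa and pairwise Cohen's kappa could be computed in Python with statsmodels and scikit-learn, assuming each database's severity rating for an interaction is encoded as an integer category:

```python
# Minimal sketch with hypothetical data, not the study's actual ratings.
# Each row is one drug-drug interaction; each column is one database's
# severity rating (0 = minor, 1 = moderate, 2 = major).
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats import inter_rater as irr

ratings = np.array([
    [2, 2, 1, 2],
    [1, 1, 1, 0],
    [2, 1, 2, 2],
    [0, 0, 1, 0],
    [1, 2, 1, 1],
    [2, 2, 2, 1],
])

# Fleiss' kappa: overall agreement among all four databases.
# aggregate_raters converts rater-per-column data into per-category counts.
counts, _ = irr.aggregate_raters(ratings)
print("Fleiss' kappa:", irr.fleiss_kappa(counts, method="fleiss"))

# Cohen's kappa: pairwise agreement, e.g. between the first two databases.
print("Cohen's kappa (db0 vs db1):",
      cohen_kappa_score(ratings[:, 0], ratings[:, 1]))
```

Pairwise comparisons across all database pairs would simply repeat the Cohen's kappa call for each pair of columns.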
Results and discussion
A total of 1393 and 1382 distinct drug–drug interactions were detected in the two consecutive versions of the databases, namely the 2021 and 2022 versions. None of the databases detected all of the interactions, and the severity categories assigned to interactions often differed among the four drug interaction database programmes. The overall Fleiss kappa agreement among the databases was slight. Uptodate and Micromedex showed fair agreement in severity ratings, whereas the other database pairs showed only slight agreement.
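"Slight" and "fair" here follow the Landis and Koch benchmarks commonly used to interpret kappa values; a small, purely illustrative helper (not part of the study) that maps a kappa value to those labels could look like:

```python
def landis_koch_label(kappa: float) -> str:
    """Map a kappa value to the Landis & Koch (1977) agreement benchmark."""
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(landis_koch_label(0.15))  # slight
print(landis_koch_label(0.33))  # fair
```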
Conclusion
There was poor agreement among the databases for interactions seen in bone marrow transplantation patients. Therefore, it would be safer to use more than one database in daily practice. Further work is needed to understand the agreement level of the databases for different types of interactions.