Authors: Anna Lene Seidler (University Medicine Rostock), Kylie Hunter (The University of Sydney), Mason Aberoumand (The University of Sydney), Sol Libesman (The University of Sydney), James Sotiropoulos (The University of Sydney), Jonathan Williams (The University of Sydney), Jannik Aagerup (The University of Sydney), Angie Barba (The University of Sydney), Nipun Shrestha (The University of Sydney), Rui Wang (The University of Sydney), Ben Mol (Monash University), Wentao Li (Monash University), Angela Webster (The University of Sydney)
Background: Mistrust in research is increasing, leading some to argue that relying solely on published aggregate data is no longer sufficient. Instead, experts suggest that checking individual participant data (IPD) is optimal for producing the most trustworthy systematic reviews and guidelines. However, there is limited guidance on how to conduct integrity checks on datasets of completed studies.
Methods: We developed the tool through a literature review of existing items for assessing the integrity of RCTs and their IPD, followed by consultation with an expert advisory group. Agreed items were incorporated into a standardised tool and automated where possible. The tool was piloted on 73 trials from two IPD meta-analyses, a sample of five trials with IPD datasets flagged by journal editors as having known integrity issues, and eight similar datasets without known integrity issues. Evaluation workshops were held to iteratively refine the tool.
Results: We developed the IPD Integrity Tool, comprising seven study-level domains and eight IPD-specific domains (unusual or repeated data patterns, baseline characteristics, correlations, date violations, patterns of allocation, internal inconsistencies, external inconsistencies, and plausibility of data). Within each domain, items are rated as having either no issues, some/minor issue(s), or many/major issue(s) according to decision rules. If there are many and/or major issues that cannot be resolved, the study should be excluded from evidence synthesis and/or not considered suitable for publication. In our validation checks, the Tool accurately identified all five studies with known integrity issues.
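The kind of automated checks the tool describes can be illustrated with a minimal sketch covering two of the IPD-specific domains (date violations and unusual or repeated data patterns). The record structure, field names, and flagging logic below are illustrative assumptions, not the published IPD Integrity Tool:

```python
# Hypothetical sketch of two automated IPD checks; field names and
# flagging rules are assumptions for illustration, not the actual tool.
from datetime import date

def find_date_violations(records):
    """Flag participant IDs whose follow-up date precedes randomisation."""
    return [r["id"] for r in records
            if r["followup_date"] < r["randomisation_date"]]

def find_repeated_rows(records, keys=("sex", "age", "weight")):
    """Group participant IDs sharing identical baseline values, which may
    indicate duplicated or fabricated rows warranting closer review."""
    seen = {}
    for r in records:
        sig = tuple(r[k] for k in keys)
        seen.setdefault(sig, []).append(r["id"])
    return [ids for ids in seen.values() if len(ids) > 1]

records = [
    {"id": 1, "sex": "F", "age": 34, "weight": 61.2,
     "randomisation_date": date(2020, 1, 5), "followup_date": date(2020, 3, 1)},
    {"id": 2, "sex": "F", "age": 34, "weight": 61.2,
     "randomisation_date": date(2020, 1, 9), "followup_date": date(2020, 1, 2)},
]

print(find_date_violations(records))  # → [2]: follow-up before randomisation
print(find_repeated_rows(records))    # → [[1, 2]]: identical baseline values
```

In practice such flags feed the tool's decision rules (no issues / some or minor issues / many or major issues) rather than triggering exclusion on their own.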
Discussion: The IPD Integrity Tool enables users to assess the integrity of RCTs via examination of IPD. The Tool may be applied by various stakeholders, such as journal editors to prevent publication of untrustworthy studies, and systematic reviewers to assess RCTs for inclusion in analyses. The overarching goal is to ensure that only trustworthy evidence informs guidelines, policy and practice.
Conflict of interest: No conflicts of interest to declare.