Reducing required sample size of new studies by effectively using historical data: new methods for evaluating mechanisms, conditions, and alternative explanations - Data Synthesis Power
Details
Locations: Netherlands
Start Date: Sep 1, 2018
End Date: Aug 31, 2020
Contract value: EUR 165,598
Sectors: Research, Science & Innovation
Description
Programme(s): H2020-EU.1.3.2. - Nurturing excellence by means of cross-border and cross-sector mobility
Topic(s): MSCA-IF-2017 - Individual Fellowships
Call for proposal: H2020-MSCA-IF-2017
Funding Scheme: MSCA-IF-EF-ST - Standard EF
Grant agreement ID: 792119
Objective
Scientists, organizations, and governments are moving in the direction of sharing their data. This will lead to an unprecedented amount of available data that researchers could use as prior information in a Bayesian analysis. While Bayesian methods are a promising way to use this growing body of information to increase statistical power in research studies, there is a great need for guidelines on calibrating existing data into accurate informative prior distributions.
Bayesian methods that incorporate accurate prior information in the statistical analysis reduce the required sample size of the new study without decreasing the chance of detecting a true effect. However, prior data may be from a different population, measured using different instruments, and/or collected using different experimental procedures than those in the current study. Using the results from a prior study that differs from the current study in any of these respects as prior information in a Bayesian analysis can lead to biased results.
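To make the trade-off concrete, the following is a minimal sketch (not part of the project itself, with hypothetical numbers and a simple conjugate normal model with known variance) of how an accurate informative prior substitutes for new observations, and how a precise but mis-centred prior biases the result:

```python
# Conjugate normal model for a mean with known sampling variance sigma^2.
# Posterior precision = prior precision + n / sigma^2, so every unit of
# precision contributed by an accurate historical prior replaces new data.

sigma2 = 4.0            # known sampling variance (hypothetical)
target_precision = 25.0 # posterior precision needed to detect the effect of interest

def n_required(prior_precision):
    """New-study sample size needed to reach the target posterior precision."""
    return max(0.0, (target_precision - prior_precision) * sigma2)

print(n_required(prior_precision=0.01))  # vague prior: ~100 observations
print(n_required(prior_precision=10.0))  # informative historical prior: 60

# A prior that is precise but centred on the wrong value biases the posterior:
# posterior mean = (prior_prec*prior_mean + (n/sigma2)*data_mean) / total precision
prior_mean, data_mean, n = 0.8, 0.2, 60
prior_prec, data_prec = 10.0, n / sigma2
post_mean = (prior_prec * prior_mean + data_prec * data_mean) / (prior_prec + data_prec)
print(post_mean)  # 0.44: pulled toward 0.8 even if the true effect is 0.2
```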
Several promising calibration methods have been proposed for linear regression analysis. In this project I will extend these methods to models with third variables that reveal the conditions under which an effect exists (moderators), the mechanism through which one variable affects another (mediators), and variables that might explain an observed effect (confounders), and test the extended methods; see the sketch below.
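One calibration idea from the existing literature is the power prior, which down-weights the historical likelihood by a factor a0 in [0, 1]. The sketch below illustrates this for a moderation (interaction) model; the weight a0, the simulated data, and the variable names are hypothetical, and this is only one possible calibration approach, not necessarily the method the project will adopt:

```python
import numpy as np

# Power-prior-style down-weighting: with a flat initial prior, raising the
# historical likelihood to a power a0 makes the posterior mode a weighted
# least-squares fit in which each historical row counts for a0 of a current
# observation. The moderation model is y = b0 + b1*x + b2*m + b3*x*m.

rng = np.random.default_rng(0)

def design(x, m):
    return np.column_stack([np.ones_like(x), x, m, x * m])

def simulate(n, betas):
    x, m = rng.normal(size=n), rng.normal(size=n)
    y = design(x, m) @ np.asarray(betas) + rng.normal(scale=1.0, size=n)
    return design(x, m), y

X_hist, y_hist = simulate(500, betas=[0.0, 0.5, 0.3, 0.25])  # historical study
X_new,  y_new  = simulate(100, betas=[0.0, 0.5, 0.3, 0.25])  # smaller new study

def power_prior_fit(a0):
    """Posterior mode of the regression coefficients given weight a0 on the historical data."""
    w = np.concatenate([np.full(len(y_hist), a0), np.ones(len(y_new))])
    X = np.vstack([X_hist, X_new])
    y = np.concatenate([y_hist, y_new])
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)

print(power_prior_fit(a0=0.0))  # ignore the historical data entirely
print(power_prior_fit(a0=0.5))  # borrow half of its information
print(power_prior_fit(a0=1.0))  # pool as if the two studies were identical
```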
I will develop guidelines and tools for social science researchers in fields ranging from psychology and pedagogics to education and epidemiology for properly calibrating data from previous studies when evaluating the conditions under which an effect exists, the mechanism through which one variable affects another, and alternative explanations for an observed effect. These calibration methods will reduce the required sample sizes of new studies, thereby reducing the funds needed for data collection and the burden on researchers and participants.