On February 15, 2019, the Taskforce issued a report with a number of recommendations and associated actions that set out the needs to be addressed, although the mechanisms by which these may be achieved will require further focused work over the coming year. Since the term "big data" is widely used but lacks a commonly accepted definition, the Report defines big data as "extremely large datasets which may be complex, multi-dimensional, unstructured and heterogeneous, which are accumulating rapidly and which may be analyzed computationally to reveal patterns, trends, and associations".
In particular, the Taskforce considers the considerable volume of data generated by wearable devices, electronic health records, social media and clinical trials, and assesses how such data may be used for regulatory purposes. Its analysis focused on six data subgroups: genomics; bioanalytical 'omics; clinical trials; observational data; spontaneous adverse drug reaction reports; and social media and m-health data.
For instance, with regard to data standardization, the Taskforce recommends promoting global, harmonized and comprehensive standards to facilitate data interoperability. With respect to data quality, it specifies that characterizing data quality across multiple data sources is essential to fully understand the reliability of the evidence. It further recommends developing timely, efficient and sustainable frameworks for data sharing and access, together with additional support for mechanisms that promote a culture of data sharing.