Additional observations on sharing
Our review produced four additional areas of learning that are beneficial to report.
Poor quality logic diagrams
We found peer-reviewed articles that contained poor quality, low resolution images of model logic. For example, in several studies that used the ARENA simulation software, model documentation consisted of screenshots of the software rather than carefully designed diagrams. These images were often of no use for reporting and sharing: the resolution was too low and the text was unreadable.
Data availability statements
Where journals mandated it, study authors provided a data availability statement within their publication. A generic example is that 'the data used within this study' or 'the model inputs' are 'available upon reasonable request'. Such statements ignored the availability of the model itself. It is unclear whether this omission is intentional, not considered relevant, or not considered at all because of standard practice in the authors' home discipline. What is clear is that journals do not treat models as 'data', and as such are not enforcing details on access to them.
Open science journals and reporting guidelines
Due to journal policy, we expected articles published in journals championing open science, such as PLOS ONE and BioMed Central, to enforce links to model artefacts. However, this was not always the case: in some instances we found no model at all, or only an online appendix with a table of input data. We also found that these journals did not enforce reporting guidelines. It may be that authors, reviewers, and editors do not consider these requirements to extend to computational artefacts such as DES models.
Time spent to set up sharing
The majority of the DES model sharing we found did not score well in our best practice audit. One explanation is that sharing was only considered at the end of the simulation study or model write-up, with minimal time dedicated to the mechanism and quality of sharing. Evidence for this can be taken from the GitHub repositories with only a single commit, or a small number of commits (updates to code): a sign that model code was uploaded only at the end of a project, often without any documentation, license, instructions to run, information on software dependencies, or a link to a paper describing the work.
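The kind of repository audit described above can be partly automated. The sketch below is a minimal, hypothetical helper (not the audit instrument used in our review): it checks a local copy of a repository for common sharing artefacts. The file names are illustrative conventions (e.g. `requirements.txt` is a Python-specific convention for dependencies), not a formal standard.

```python
from pathlib import Path

# Illustrative list of artefacts that were often missing from the
# repositories we reviewed. These names are common conventions only.
EXPECTED_ARTEFACTS = [
    "README.md",         # documentation and instructions to run
    "LICENSE",           # license granting reuse rights
    "requirements.txt",  # software dependencies (Python convention)
    "CITATION.cff",      # link to the paper describing the work
]


def audit_repository(repo_path):
    """Return a dict mapping each expected artefact to whether it exists
    in the given local repository directory."""
    repo = Path(repo_path)
    return {name: (repo / name).exists() for name in EXPECTED_ARTEFACTS}
```

For example, running `audit_repository(".")` on a cloned model repository gives a quick per-file presence report that could feed into a best practice score.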