“About 60 percent of IEEE conferences, magazines, and journals have no practices in place to ensure reproducibility of the research they publish. That’s according to a study by an ad hoc committee formed by the IEEE Computer Society to investigate the matter and suggest remedies.
Reproducibility—the ability to repeat a line of research and obtain consistent results—can help confirm the validity of scientific discoveries, IEEE Fellow Manish Parashar points out. He is chair of the society’s Committee on Open Science and Reproducibility….
The goal of the ad hoc committee’s study was to ensure that research results IEEE publishes are reproducible and that readers can look at the results and “be confident that they understand the processes used to create those results and they can reproduce them in their labs,” Parashar says….
Here are three key recommendations from the report:
Researchers should include specific, detailed information about the products they used in their experiments. When naming a software program, for example, authors should include the version used and any custom code written for the work. In addition, journals should make submitting this information easier by adding a step to the submission process. The survey found that 22 percent of the society’s journals, magazines, and conferences already have infrastructure in place for submitting such information.
All researchers should include a clear, specific, and complete description of how the reported results were reached. That includes input data, computational steps, and the conditions under which experiments and analysis were performed.
Journals and magazines, as well as scientific societies requesting submissions for their conferences, should develop and disclose policies on achieving reproducibility. Guidelines should include such information as how papers will be evaluated for reproducibility and the criteria that code and data must meet….”
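As a minimal sketch of the first two recommendations, a researcher might record the interpreter, platform, and package versions alongside their results so that readers can reconstruct the environment. The function name `capture_environment` and the package list are hypothetical, not from the report:

```python
import json
import platform
import sys
from importlib import metadata


def capture_environment(packages):
    """Record interpreter, OS, and package versions for a results artifact."""
    env = {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "packages": {},
    }
    for name in packages:
        try:
            env["packages"][name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            env["packages"][name] = "not installed"
    return env


if __name__ == "__main__":
    # Emit the environment record as JSON, to be archived with the results.
    print(json.dumps(capture_environment(["pip"]), indent=2))
```

A record like this, submitted with the paper as the report suggests, gives reviewers a concrete starting point for rerunning the experiment.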