This is the UM case. This should have been easy to catch because they duplicated the same images multiple times in the same figure.
It's not so easy to catch manipulation in most other cases, especially if it's data manipulation. Experiments can't easily be replicated within the review period, so you have to rely on your experience to judge whether the data could have been manipulated.
Problems with the current peer review process:
1. Journals are shortening review periods. They now ask for reviews to be completed in 2-4 weeks when they used to give 3-6 months. There's no time to really mull over the results.
2. Senior researchers with the experience to catch manipulation aren't reviewing papers. Many pass them off to their post-graduate students to do the reviews for them.
3. Pay-to-publish journals don't really want negative reviews from reviewers.
I just reviewed a paper last month for a prestigious journal in my field. The paper's claims, methods, and theory were obviously wrong, but of the three reviewers, one gave it a generic review saying he liked the tone of the paper, it was well-written, well-organised, etc. Nothing about its technical (de)merits.
Fortunately, two of us raised the technical issues and the paper was rejected.
Science Image Integrity Violation in Research Publications
Jun 19 2016, 09:58 AM