Digital imaging ethics: Where does India stand? - IndiaBioscience

In the last decade, several Indian colleges, universities and research institutes have gained access to high-end microscopes and post-imaging processing tools. With increasing allegations of image manipulation and duplication against researchers from India, we need to stop and ask if there has been a concomitant increase in ‘digital imaging literacy’.

Between January 2018 and May 2018, 12 papers published by two researchers from the Indian Institute of Technology (Indian School of Mines), Dhanbad were retracted because of image duplication and manipulation.

On April 19, 2018, a PhD student from Calcutta University wrote a public post about data manipulation in her lab, alleging that her guide and a senior research scholar drew “molecular weight markers with pencil on a blot and ‘rescanned’ the blot to convince the reviewers and the editorial board of the journal”.

In recent times, India has been in the limelight for several cases of fraudulent imaging practices. In a study published in 2016, researchers from Stanford University, Johns Hopkins School of Medicine and Washington School of Medicine analysed the prevalence of image duplication across countries. To do this, they looked at 348 papers with image duplications published between 2013 and 2014 in PLOS ONE and mapped their country of origin. The papers with duplicated images most frequently originated in India, followed by China and Taiwan.

Honest mistakes or fabrications?

Before we can discuss strategies to combat this trend, we need to understand the origin of this behaviour. Is it possible to determine if these cases arise from honest mistakes during image processing, data compilation or figure preparation – or are they outright acts of misconduct?

In a recent study available as a preprint on bioRxiv, researchers visually analysed 960 papers published in Molecular and Cellular Biology between 2009 and 2016 for inappropriate image duplication. Of these, 59 papers (6.1%) contained inappropriately duplicated images. Interestingly, most authors who made formal corrections reported that the error arose during figure assembly: for example, they accidentally included the same image twice, selected the wrong image, or placed images in the wrong panels.

It is reassuring that most of these errors could be resolved by improving the authors’ image-handling and figure-assembly practices. However, these results may not represent a general trend for papers from other countries, and we do not know how many of the 960 papers were from India.

The line between ‘adjustment’ and ‘manipulation’ can be blurry

[Photo: Imaged by P Surat, courtesy Maithreyi Narsimha lab, TIFR, Mumbai]

These images show a view of a few cells from a developing Drosophila tissue. The first image (A) is unadjusted. The second (B) is adjusted, but with no loss of information. The third (C), though cleanest in appearance, has been adjusted such that certain information is now lost: you can no longer see the small dots present in some cells at the bottom right (blue circles, B). The last panel (D, blue rectangle) shows a view where the intensity of one cell has been selectively increased. Of these, the second modification (B) may be acceptable, but the third and fourth (C and D) undeniably count as ‘manipulation’.
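The difference between panels B and D can be shown in a few lines of code. Below is a minimal sketch (the “image” and all pixel values are hypothetical), using nested Python lists to stand in for a grayscale image: a uniform linear adjustment treats every pixel identically, while a selective boost rewrites only a chosen region.

```python
# Hypothetical 2x3 "grayscale image" with 8-bit values (0-255).
image = [
    [10, 40, 80],
    [20, 60, 200],
]

def adjust_global(img, gain):
    """Uniform linear adjustment applied to every pixel (cf. panel B):
    no pixel is treated differently, and no information is lost as long
    as values stay below saturation."""
    return [[min(255, round(p * gain)) for p in row] for row in img]

def boost_region(img, rows, cols, gain):
    """The same operation restricted to a chosen region (cf. panel D):
    this counts as manipulation because it changes the apparent biology."""
    out = [row[:] for row in img]
    for r in rows:
        for c in cols:
            out[r][c] = min(255, round(out[r][c] * gain))
    return out

global_adj = adjust_global(image, 1.2)                       # every pixel scaled
selective = boost_region(image, rows=[0], cols=[2], gain=1.5)  # one pixel scaled
```

The ease of the second operation is precisely the point made in the article: a region-selective edit takes no more effort than a legitimate global one.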

The first point about making such changes is that it is easy! A decade ago, anyone wanting to make these changes would have had to put in real effort. Today, one can create, within minutes, a panel that appears to show one cell with selectively increased levels of protein X.

Second, even if the error arose from an errant mouse click, it still counts as ‘manipulation’. Below is a list of ‘honest errors’ which nonetheless count as manipulation:

  • If you splice two gel images from different experiments into one without specifying.
  • If you duplicate a lane in a western blot into another panel.
  • If you modify specific parts of an image rather than adjusting the whole image uniformly.
  • If you combine images from different regions or time points into one without specifying or showing the borders.
  • If you do not possess unaltered original images/blots for each of your panels.
  • If you label your images/panels incorrectly.
  • If you use Photoshop to remove certain signals to make your image look ‘nice and clean’.
  • If you crop certain parts of an image without mentioning it.

“If you desperately want to see a particular result, you can probably find it in some corner of your image – despite the fact that the major part of the data says a different story,” quips Sudipto Maiti, Professor at the Tata Institute of Fundamental Research (TIFR), Mumbai and a regular instructor at the Bangalore Microscopy Course at the National Centre for Biological Sciences (NCBS), Bangalore, one of the few courses in India that impart training in microscopy and image processing.
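Several of the errors listed above – duplicated lanes, reused panels – are mechanically detectable, which is one reason they keep coming to light. A toy sketch (hypothetical pixel values; real screening tools are far more sophisticated and must tolerate compression noise) that flags exact and mirrored duplicates:

```python
def flipped(img):
    """Mirror an image horizontally -- a common way a duplicated
    panel is disguised."""
    return [row[::-1] for row in img]

def duplication_flags(a, b):
    """Flag whether two panels are pixel-for-pixel identical, or
    identical after a horizontal flip."""
    return {"identical": a == b, "mirrored": a == flipped(b)}

lane = [[5, 9, 7], [3, 8, 6]]
reused = [row[:] for row in lane]   # the same lane pasted again
disguised = flipped(lane)           # the same lane, mirrored
unrelated = [[0, 0, 0], [0, 0, 0]]
```

Here `duplication_flags(lane, reused)` reports the exact duplicate and `duplication_flags(lane, disguised)` catches the mirrored copy, while the unrelated panel triggers neither flag.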

The changing landscape of image submissions to journals

Journals are adapting to the increased threat of image manipulation: most now require that images be minimally processed and that all unprocessed data and metafiles be submitted at the time of review. All image acquisition tools, image processing software, and processing steps used to improve the image must be declared.

Journals are also training their editors to detect ‘tell-tale’ signs that an image has been manipulated. Some journals perform ‘spot checks’, in which all images of a randomly selected paper in each issue are visually screened for manipulation. Recently, researchers developed software that can check papers for image manipulation. In time, such software could become as prevalent in journals as plagiarism checkers are today. Maiti has another suggestion along these lines. “It may not be too much work to get the image processing software providers to make their software so that it always embeds the raw image inside the processed image,” he says. “This will enable anyone to say do a left click on the image and check what the raw data looked like. This embedded raw image should be non-manipulable and traceable to the original machine. The journals should insist on having the image in this format, so that at least on the online format, the raw data is always available.”
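Maiti’s suggestion amounts to making every processed image traceable to its raw data. A much simpler stand-in for the same idea (a sketch only, not any journal’s actual workflow) is to publish a cryptographic fingerprint of the raw acquisition file alongside the figure, so that any later alteration of the supposed raw data is detectable:

```python
import hashlib

def fingerprint(raw_bytes: bytes) -> str:
    """SHA-256 digest of the raw acquisition data. Stored with the
    processed figure (e.g. in its metadata), it lets anyone verify
    that a processed panel traces back to one specific raw file."""
    return hashlib.sha256(raw_bytes).hexdigest()

# Hypothetical raw data standing in for a real detector frame.
raw = b"raw microscope frame, straight off the detector"
tag = fingerprint(raw)

# Re-hashing the untouched raw data reproduces the fingerprint;
# even a one-byte change breaks the match.
assert fingerprint(raw) == tag
assert fingerprint(raw + b"!") != tag
```

Unlike Maiti’s embedded-raw-image proposal, a hash does not let the reader *see* the raw data, but it does make silent substitution of the raw file detectable.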

What can the scientific community do to counter image manipulations in India?

Increasing awareness

“Training in advanced microscopy and imaging is indeed lagging behind,” states Rahul Roy, Associate Professor at the Indian Institute of Science (IISc), Bangalore and also an instructor at the Bangalore Microscopy Course. “There have been several efforts to impart such training through workshops and courses but they are still limited. Therefore, ‘imaging literacy’ is poor and access to new technology without proper training and support is already an impediment to doing good science”, he says.

Shifting from qualitative to quantitative reporting of images

Roy stresses that “the focus should be shifted from qualitative research reporting (like images) to quantitative reporting with statistical analysis from raw images that accompany the images”. This is especially useful in cases where a single displayed image may not be representative of the underlying data.
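As an illustration of what such quantitative reporting might look like (all numbers below are invented for the example), summary statistics computed over replicate raw images replace a single chosen picture:

```python
import statistics

# Hypothetical replicate measurements: mean signal intensity
# (arbitrary units) of the same region measured in three
# independent raw images per condition.
control = [102.0, 98.5, 100.3]
treated = [151.2, 148.7, 155.1]

def summarize(values):
    """Mean and sample standard deviation, so readers can judge
    variability instead of trusting one 'representative' image."""
    return statistics.mean(values), statistics.stdev(values)

c_mean, c_sd = summarize(control)
t_mean, t_sd = summarize(treated)
print(f"control: {c_mean:.1f} ± {c_sd:.1f};  treated: {t_mean:.1f} ± {t_sd:.1f}")
```

Reporting the spread across replicates makes it immediately visible when a displayed image is an outlier rather than typical of the data.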

National and institutional legal policies

Two recent studies assessed risk factors for scientific misconduct. Interestingly, the factor most commonly blamed for this behaviour – the pressure to ‘publish or perish’ – was not found to be a significant risk factor. Instead, the likelihood of retraction was lower in countries where policies against scientific misconduct were legally defined at the national or institutional level. Establishing such legal infrastructure in Indian institutes and government, covering all categories of scientific misconduct, could therefore deter deliberate malpractice.

Peer control and cultural factors

Open communication and mutual criticism are two of the pillars of science. A scientific culture in which peers or collaborators are discouraged from, or afraid of, criticising each other’s work breeds an environment ripe for fraudulent practices. As a scientific community, we should foster an environment where students and colleagues can provide honest and unbiased peer control both locally (within labs and institutes) and globally.

Misconduct does not affect just one person or one lab; it shapes the perception of science in our country and reflects on us as a scientific community. Sincere and urgent efforts are therefore required from both the scientific community and the government to improve the pursuit of science in India.


This was the first article in our new series on research ethics. Please let us know your views on this topic in the comments below.


This is a very timely and informative article. Congratulations to Surat and to IndiaBioscience. I look forward to more in this series.
Microscopy and digital imaging have become very popular and powerful tools, thanks to innovative developments in microscopy and imaging and to the robust cell biological methods now available to investigators. However, automation has also resulted in unintentional (but often serious) errors, and of course provided convenient scope for unethical manipulation.

One of the major errors (often unintentional, but it can also be intentional) happens during image acquisition itself. The basic issue of signal saturation in the image is very often ignored (many may not even be aware of it). Saturation not only distorts the qualitative information; it also leads to seriously compromised quantification. The same applies to western blots and semi-quantitative RT-PCRs.

An associated issue with confocal and more advanced microscopy, especially in centralised facilities, is that images are often acquired by a technician/operator when the investigator herself/himself is not around. This can mean missing an important event that an experienced researcher would notice, and it can also produce an image that looks very ‘nice and impressive’ but is not representative of the actual situation, since one or another signal may have been unduly amplified or reduced. Since, in the end, only a few images are available to the investigator, the generality of the ‘observed’ results also remains uncertain.
As noted in the article, awareness about such issues is very critical. Commentaries like this would go a long way to help researchers make the best use of newer methods that allow us to peep into lives of cells.
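The saturation problem raised in the comment above is easy to screen for before any quantification. A minimal sketch (hypothetical 8-bit pixel values): count the fraction of pixels sitting at the detector maximum, and treat any clipped image with suspicion.

```python
MAX_VALUE = 255  # assuming an 8-bit detector

def saturation_fraction(img, max_value=MAX_VALUE):
    """Fraction of pixels clipped at the detector maximum. Clipped
    pixels hide the true signal, so intensity ratios computed from
    such an image are unreliable."""
    pixels = [p for row in img for p in row]
    return sum(p >= max_value for p in pixels) / len(pixels)

# Two toy images: one within range, one heavily clipped.
ok_image = [[10, 120, 240], [30, 200, 180]]
bad_image = [[255, 255, 240], [255, 200, 255]]

assert saturation_fraction(ok_image) == 0.0
# Two thirds of bad_image is clipped; any quantification from it
# would understate the real signal.
assert saturation_fraction(bad_image) > 0.5
```

Running a check like this at acquisition time, before the investigator walks away from the microscope, catches the problem while it can still be fixed by lowering gain or exposure.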


Thanks Dr Lakhotia for your kind remarks. Yes, hopefully by having more discussions and raising awareness, we, as a scientific community, can bring down the prevalence of such events.


“We would also like to suggest that mechanisms be put in place at every department and institutional level, so that a centralised cell supervises archiving of data. Cloud storage at institutional level is essential to prevent loss of data. The matching of raw data with final images being sent for publication also needs third-party supervision (which is in place in several western institutions) so that errors are minimised” – a suggestion put forth in a report on yet another case of serial image duplication-linked retractions, this time from the Bose Institute in Kolkata.


Having a centralised storage system for data (at the university level) sounds like a great idea. I think some labs do have a lab-specific common server for data, but that could be tampered with in cases where deliberate fabrication is attempted.
