An interesting read by Daniel Sarewitz in a recent issue of Nature. It follows up on the long-running discussion of the quality of research papers versus their quantity, and the argument that quality, rather than quantity, is what should count when evaluating scientists for jobs, grants, and prizes.*
He gives a striking example of poor quality, which is quite shocking from my naïve perspective:
"...The quality problem has been widely recognized in cancer science, in which many cell lines used for research turn out to be contaminated. For example, a breast-cancer cell line used in more than 1,000 published studies actually turned out to have been a melanoma cell line. The average biomedical research paper gets cited between 10 and 20 times in 5 years, and as many as one-third of all cell lines used in research are thought to be contaminated, so the arithmetic is easy enough to do: by one estimate, 10,000 published papers a year cite work based on contaminated cancer cell lines. Metastasis has spread to the cancer literature."
* The main problem is, as usual, that committee members rarely read the actual papers and instead stick to single-number metrics (such as journal impact factors or the h-index).