The world of scientific publishing has undergone a metamorphosis, with most scientific articles now being published online. Many concerted efforts have been made to develop new tools for measuring the impact of scientific output. Rather than waiting for citations to accumulate in print media, these tools help us gauge the impact of articles in the online medium.
One of the most prominent journal metrics is the “download impact factor,” defined as the rate at which articles are downloaded from a journal, by analogy with the “journal impact factor.” Another prominent tool is the Journal Usage Factor, which is calculated on the basis of the mean rather than the median. Although there are many social network metrics, these estimate usage from clicks rather than from download logs.
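To make the analogy with the journal impact factor concrete, a download impact factor could be computed as the downloads in a measurement year of articles published in the two preceding years, divided by the number of those articles. This is a minimal sketch with hypothetical figures, not a definition from any official source:

```python
# Hypothetical counts: downloads in 2023 of articles published in
# 2021 and 2022, mirroring the impact factor's two-year window.
downloads_2023 = {2021: 4200, 2022: 5100}
articles = {2021: 120, 2022: 140}

def download_impact_factor(downloads, articles):
    """Total downloads of articles from the window years, divided by
    the number of articles published in those years."""
    return sum(downloads.values()) / sum(articles.values())

dif = download_impact_factor(downloads_2023, articles)
print(round(dif, 2))  # 9300 downloads / 260 articles = 35.77
```

The same two-year window makes the figure directly comparable to a journal's citation-based impact factor.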
Both citations and download logs have been used to measure journal impact; no single indicator can capture the impact of scientific journals on its own. Most researchers now believe that download-based indicators carry greater weight today, given the firm grasp of online media.
The download frequency of a journal is not strongly driven by its impact factor. In absolute terms, there is a strong correlation between citations and download frequency for a journal, and a moderate correlation between download numbers and the journal impact factor.
Scopus is a very useful tool for measuring citation data, while ScienceDirect measures the number of downloads. Together, the two are used to examine the relationship between downloads and citations, and thereby the influence of publication output. The Scopus citation counts exclude conference papers and abstracts, whereas the ScienceDirect download counts include all kinds of papers.
Citation data from Scopus reflect the time taken for a paper to be cited, whereas downloads measure the innovative value of papers. In each subject area, “excellent” papers were those with a high mean number of downloads.
In both English and non-English journals, there was a strong correlation between downloads and citations. Some journals, however, had papers that were downloaded in great numbers without those downloads translating into citations.
For individual papers, correlations are weaker than for journals, although they remain statistically significant given the larger sample size. The number of downloads depends on how widely the journal circulates rather than on the novelty of the paper; the quality of a paper is reflected today in its citations. Journals with wide circulation and diffusion attract many downloads, but these do not necessarily correspond with citations.
Papers published in low-impact journals attract fewer downloads, regardless of whether they receive many citations later. This implies that download data cannot be treated as a predictor of citations, especially in a journal's early years when its standing is lower.
In English journals, the number of downloads for a paper is slightly lower than its number of citations, whereas in non-English journals downloads slightly exceed citations. In non-English journals, the correlation between citations and downloads is also much weaker.