Wiley, a leading academic publisher of STEM journals, recently announced a five-year agreement with the DEAL Consortium, a collaboration of more than 1,000 academic institutions in Germany.
The shift in scholarly communication has favored open access (OA) publishing and the evolution of hybrid publishing models. This partnership will better address the needs of the scholarly community.
Under this five-year deal, authors at leading German institutions will be able to publish open access articles in Wiley journals, increasing the visibility of and access to content from Wiley’s portfolio. Let us examine how this agreement benefits Germany’s research ecosystem.
The deal will foster a better understanding of OA publishing among the scholarly institutions associated with it in Germany. The investment needs of journals will also be recognized as they deliver high-quality, high-impact content to researchers.
The deal also enables the development of better infrastructure in terms of workflows. Both parties have agreed to support the readers, authors, and librarians involved in the OA movement. Finally, all institutions in the consortium will benefit from training sessions and workshops conducted by Wiley’s well-established team.
The deal between the DEAL Consortium and Wiley will support a new wave of the Open Access movement in Germany, which could prove immensely beneficial in achieving path-breaking results in the German research community.
The five-year agreement focuses on supporting both the publishers and the researchers associated with the OA movement. Wiley’s collaborative approach has been much appreciated by the DEAL Consortium, and the cooperation model is designed to be sustainable.
A partnership has been announced between ResearchGate and Taylor & Francis, marking a new development in STEM publishing. ResearchGate is a leading professional network for researchers that enables collaboration and the sharing of research publications.
Taylor & Francis is a renowned publisher of academic journals and books. Through this partnership, Taylor & Francis will provide researchers registered on ResearchGate with access to 200 high-quality journals.
ResearchGate has designed a new service, entitled “Journal Home,” to cater to the growing demands of journals. All 200 journals will get enhanced visibility on the ResearchGate platform: each journal will have a profile accessible to researchers throughout the platform, and every journal title will also have an article page that facilitates sharing, collaboration, and networking.
This development is interesting because it will make more than 100,000 version-of-record open access articles freely readable on the ResearchGate platform. In addition, all articles published henceforth in the 70 fully OA journals will be available for viewing on ResearchGate.
Taylor & Francis is an important name in academic publishing, with cutting-edge research published across disciplines such as the sciences, social sciences, and humanities. Since the ResearchGate platform has more than 25 million registered researchers, this partnership will benefit the publisher immensely by increasing readership and engaging the interest of new audiences.
The founders of ResearchGate have expressed their gratitude to Taylor & Francis as they embark on a new journey of making so many articles freely accessible. To this day, most scientific content remains hidden behind the paywalls of subscription journals. ResearchGate supports the OA model of academic publishing, paving a new way for researchers in developing countries.
A preclinical study was conducted on glioblastoma, the most common form of brain cancer in humans, by researchers at the University of Texas at Dallas and the UT Southwestern Medical Center. They developed a novel technique that delivers medication across the blood-brain barrier. The study was performed on mice and published in the journal Nature Communications.
Glioblastoma is the most aggressive form of brain cancer in the USA, where about 12,000 patients are diagnosed with it every year. The average survival of these patients is just 15 to 18 months post-diagnosis. Presently, conventional modes of treatment, such as chemotherapy, radiation, and surgery, are not very effective in most cases: most chemotherapeutic agents cannot pass through the blood-brain barrier, making them ineffective against glioblastoma tumors.
The primary function of the blood-brain barrier is to prevent substances in the bloodstream from entering the brain parenchyma. It is a highly selective filter that acts as a protective barrier for the human brain, and it is the biggest hindrance in the treatment of any brain-related disease. Although it is just 1 micron thick, it obstructs about 98% of molecules from entering the brain.
The study was conducted on experimental mice genetically engineered to carry human glioblastoma mutations. The researchers used gold nanoparticles to co-deliver medication and to target the blood vessels. The nanoparticles were injected into the bloodstream and, once they reached the mouse skull, were activated with non-invasive laser pulses.
The laser treatment generated thermomechanical waves in the brains of the mouse models, briefly making the blood-brain barrier permeable and enabling the medication to enter the target area. In this experiment, the researchers used a chemotherapeutic drug named paclitaxel, which is usually used to treat ovarian, lung, and breast cancers. On its own, the drug fails in patients with brain cancer because it cannot cross the blood-brain barrier.
The technique was successful because the novel drug delivery method permeated the barrier: the brain tumours shrank, and the overall survival rate of the experimental mice exceeded 50%. These preclinical results must now be followed by further studies in humans. This novel mode of treatment offers hope to patients with high-mortality diseases like brain cancer.
The UC Davis Comprehensive Cancer Center has conducted an in-depth research study focused on activating the programmed death of cancer cells. A crucial epitope that triggers cell death has been identified on the CD95 receptor. An epitope is a section of a protein that, when bound, activates the larger protein. With cell death now programmable, cancer treatment methods could become more effective.
In molecular biology, CD95 receptors, also known as Fas receptors or death receptors, are proteins present in the cell membrane. When activated, they release a signal that evokes the self-destruction of cells. By modulating the activity of Fas, the researchers extended its benefits to CAR T-cell therapy, which was effective in destroying solid tumours of ovarian cancer.
Managing cancer with better therapies
The conventional methods of treating cancer include chemotherapy, radiotherapy, and surgery. These methods are effective when cancer is diagnosed at an initial stage. However, cancers may relapse, especially when they are therapy-resistant. Recently, CAR T-cell immunotherapy and antibodies targeting immune checkpoint receptor molecules have shown promise as candidates for breaking the cycle of cancerous growth.
These immunotherapeutic agents are effective against only a few types of cancer, such as ovarian, breast, lung, and pancreatic cancer. In CAR T-cell therapy, researchers engineer a specific type of immune cell, the T cell, by grafting onto it a specific antibody that targets particular tumours. The engineered T cells are quite effective in battling leukaemia and other types of blood cancer.
However, the engineered T cells have not been effective against solid tumours: the microenvironment of these tumours drives off T cells and other immune cells, so they cannot exert a therapeutic effect. Although the immune receptor activates antibodies, the T cells cannot infiltrate the tumour.
The activity of death receptors
Now, let us understand the activity of death receptors. Through targeted therapy, they can be triggered to program the death of tumour cells; thus, chemotherapeutic drugs should be designed to induce death receptor activity. Many pharmaceutical companies have had modest success in targeting death receptor 5, but clinical trials of Fas agonists have failed.
Developing the right target
Fas effectively regulates the activity of immune cells. The researchers proposed that cancer cells could be targeted selectively if the correct epitope were identified. After identifying the target epitope, the researchers of this study designed a new type of antibody that selectively binds and activates Fas. With this strategy, specific tumor cells can be destroyed.
A new study has reported a promising approach for treating pancreatic cancer. The study was conducted on mice by researchers at Queen Mary University of London. They identified the cells that cause metastasis of pancreatic cancer and explained how a weakness of these cells could be exploited to treat the disease with existing drugs.
The researchers reported that cells known as amoeboid cells were present in most patients with pancreatic cancer. These fast-moving cells are aggressive and invasive, weakening the immune system of patients.
Amoeboid cells have also reportedly been found in patients with other types of cancer, such as liver, breast, and skin cancer, and the survival rate of such patients is poor. However, this is the first study to report these cells in pancreatic cancer patients.
The researchers also found high levels of expression of a molecule named CD73 in patients with pancreatic cancer. This molecule is believed to be produced by amoeboid cells, and it drives the metastasis of the cancer, thereby weakening the immune system. When the activity of CD73 was blocked, tumour tissue did not spread to the liver.
Amoeboid cells were detected in both early- and late-stage patients with pancreatic cancer. This implies that the activity of the CD73 molecule should be blocked at an early stage of the disease so that the aggressive behavior of amoeboid cells can be curtailed. Reducing the damage caused to the body in this way offers new hope for patients with pancreatic cancer.
Currently, survival rates and patient outcomes in pancreatic cancer are poor. Every year, more than 10,000 people are diagnosed with pancreatic cancer in the UK, and with conventional treatment just 7% of patients survive for five years after detection. Chemotherapy, radiotherapy, and surgery are not working well for most patients. This novel mode of treatment has given promising results in mice, and human clinical trials need to be conducted next.
The main goal of academics is to present their research work in prestigious journals. The process of publishing a paper is not just competitive but also laborious. Academic publishers like Elsevier receive about 2.6 million research papers each year, and editors and peer reviewers have to carefully select the most credible ones for publication.
Manuscript authoring with AI
The manuscript development process includes not only biomedical statistics and results but also editing, reviewing, and checking content for plagiarism. Although researchers find this process lengthy and time-consuming, the effort is worth it when million-dollar research grants are at stake. Many researchers speak English as a second language (ESL) and need to polish their manuscripts for English language errors.
Today, AI tools like Grammarly.com have made polishing research papers a very simple process. They not only rectify errors in grammar and punctuation but also use sophisticated algorithms to check terminology in context. Other algorithms check content for plagiarism; tools like iThenticate.com are built for this purpose.
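The core idea behind text-matching tools can be illustrated with a minimal sketch. This is not iThenticate's actual algorithm (commercial checkers match against vast databases and handle paraphrasing); it simply shows the classic building block of overlap detection, comparing word n-grams between two texts. All function names here are illustrative, not from any real tool.

```python
def ngrams(text, n=3):
    """Split text into a set of lowercase word n-grams."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(doc_a, doc_b, n=3):
    """Jaccard similarity of the two documents' n-gram sets:
    1.0 means identical n-gram content, 0.0 means no overlap."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

A score near 1.0 flags a passage for human review; real systems refine this with stemming, stop-word removal, and fuzzy matching to catch light paraphrasing.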
Manuscript Images with AI
Most biomedical research papers derive their scientific integrity from the professional images they publish, typically images of microbiology specimens and of techniques such as Western blotting. Because images are captured from magnified slides, sub-images may accidentally overlap. These images have to be stored carefully and their duplication prevented, because duplicated images in a paper can get it rejected by a journal.
A research paper can contain several sub-images, so researchers now use AI to compare all sub-images for authenticity, avoiding duplication with software that preserves image integrity. An automated tool named Proofig is an excellent example of AI applied to manuscript images: it uses computer vision to scan a manuscript, compares all the images within minutes, and flags any duplication.
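One common technique for near-duplicate image detection (not necessarily the one Proofig uses, which is proprietary) is perceptual hashing. The sketch below implements a simple "average hash" on a grayscale image represented as a 2D list of pixel values, then compares two images by the Hamming distance of their hashes; the function names and the threshold are illustrative assumptions.

```python
def average_hash(pixels):
    """Average hash: 1 where a pixel is brighter than the image
    mean, 0 otherwise, flattened row by row into a bit tuple."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Count the positions where two hashes differ."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def likely_duplicate(img_a, img_b, threshold=3):
    """Flag two same-sized images as near-duplicates when their
    hashes differ in at most `threshold` positions."""
    return hamming(average_hash(img_a), average_hash(img_b)) <= threshold
```

Production systems first downscale each image to a small fixed grid (e.g., 8×8) so the hash is robust to resizing and compression, and they also check rotated and cropped variants.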
Manuscript review process with AI
A peer reviewer has to carefully parse the content of a manuscript, identify the novelty of the research, and guide researchers in their work. Reviewers also check the accuracy of the results presented in the paper. This is a time-consuming process for many publishers.
ChatGPT, a recent AI tool based on the GPT-3 family of models, enables researchers to write papers efficiently by providing content relevant to their field of study. AI now supports the needs of researchers and editors, but scientific research is not completely automated by it; it still needs the efforts of scientists and publishers.
In thin-layer chromatography (TLC), an adsorbent material is spread on a sheet, and a solvent carries the components of a mixture up the sheet, separating them. Analytical chemists routinely perform TLC to determine how pure a sample is or to track the progress of a chemical reaction.
In the laboratory, scientists typically use TLC to separate the components of a black ink or to identify the chemicals present in a leaf extract. In other words, TLC has so far been a qualitative technique: it tells a chemist which compounds are present in a mixture, but not how much of each component the mixture contains.
An innovative app named qTLC has been developed by Stefan Guldin and his team of researchers. It uses photos taken with a smartphone to determine the quantity of each component present in a mixture. Combined with the TLC technique, the qTLC app thus turns a qualitative method into one that is both qualitative and quantitative, so compound concentrations can now be measured.
In this process, a chemist places three samples of known concentration alongside a sample of unknown concentration. All four samples are laid on the TLC plate, and the eluting agent is run over it. The coloured spots are then dried and visualized under UV light, and a smartphone photograph is taken for analysis in the app.
The app’s program constructs a calibration curve from the spots created by the three samples of known concentration and then estimates the unknown concentration of the fourth sample from that curve.
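The calibration step above can be sketched in a few lines. This is a minimal illustration of the general method (a linear least-squares fit of spot intensity against known concentration, then inversion of the line), not the qTLC app's actual code; the function names and the assumption of a linear intensity-concentration response are hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def estimate_concentration(known_conc, known_intensity, unknown_intensity):
    """Build the calibration line from the known samples, then
    invert it to recover the unknown sample's concentration."""
    slope, intercept = fit_line(known_conc, known_intensity)
    return (unknown_intensity - intercept) / slope
```

For example, if spots at concentrations 1, 2, and 3 give intensities 10, 20, and 30, an unknown spot with intensity 25 is estimated at a concentration of 2.5, which is why clean photos and an accurate fit matter so much in practice.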
The app automatically corrects for uneven illumination; nevertheless, the photos have to be very clear for the calibration curve to be accurate. The program also determines the size and intensity of each spot accurately, eliminating error and bias in quantification.
The healthcare industry has undergone a metamorphosis in recent times. The advent of artificial intelligence has dramatically changed predictive diagnosis and enabled personalized treatment plans, which has in turn improved patient outcomes. Medical processes are now channeled in the right direction. However, we must also consider the biomedical ethics of AI in the healthcare industry.
The delivery of healthcare goods and services, diagnostic procedures, and patient care are some of the avenues being revolutionized by the advent of AI. No other technology has tapped the power of medical data to this extent. Today, AI has transformed the way healthcare professionals deal with electronic patient records, diagnostic tools like imaging techniques, and genetic engineering techniques. Although medical data is vast, AI can parse it at a very rapid speed, uncovering complex patterns that have never been understood before.
As AI has transformed data analysis, patient diagnosis has become fast and accurate. Machine learning algorithms have been developed and trained on large volumes of patients’ medical data. Consequently, illnesses can be diagnosed at an early stage, improving treatment outcomes. In fact, personalized treatment plans are now possible due to AI, and patient prognosis has improved as well.
The entry of medical robots has improved the speed and precision with which surgery can be performed. Medical robots now assist surgeons in the operating theatre, and thanks to them, minimally invasive procedures are being performed with ease. The chances of complications are minimized, and patients are recovering faster.
AI has changed the way healthcare processes in call centers are carried out. Chatbots act as AI-driven virtual health assistants, boosting the prospects of telemedicine. Thanks to the digitalization of medical records and to chatbots, healthcare delivery has become highly accessible, and professionals are now moving beyond the confines of physical clinics.
AI-powered healthcare apps let patients manage their diseases by learning more about their illness in real time. Chatbots answer patient queries, and appointments with physicians can be scheduled at the patient’s convenience. Chronic conditions like diabetes now have self-care management apps.
In research and development, AI is breaking all barriers. Drug discovery has long been a laborious process in pharmaceuticals and healthcare; today, AI algorithms shorten this lengthy and costly procedure. AI now predicts the structures of organic molecules that can later be used as the active ingredient of a drug, allowing potential candidates to be devised as novel drugs. The interaction of drugs with biological systems is also better understood with AI tools.
First, let us list the flaws of the peer review process in academic publishing: it is slow and lengthy, and it lacks transparency. Because peer review is a voluntary service, there is a sharp shortfall in the number of reviewers working for a journal. Most academics have heavy workloads, and ever since the onset of the COVID-19 pandemic, many peer reviewers have stepped back from reviewing, bringing the process of academic publishing to an all-time low.
Although China has the highest number of papers published in international journals, most journals rely on the work and effort of Western peer reviewers. The quality of reviewing offered by American and European peer reviewers is still considered quite high compared with that in Asian countries like Japan, China, and Korea.
How can the peer review process be sped up to boost academic output? Many researchers have told academic publishers that the process should be accelerated by paying an honorarium to peer reviewers. Better incentives should be provided, as reviewing is a rigorous process that protects scientific accuracy and establishes facts.
Academic publishers have also been asked to share profits with the research departments of universities and institutes. Other path-breaking strategies include free journal subscriptions, publication vouchers, and so on. Ultimately, however, the quality of peer review rests on the scientific rigor of the reviewers.
If peer review becomes mandatory, universities would recommend only people with outstanding contributions to research. Conflicts of interest are another area that needs to be tackled. If academic publishers create a database of peer reviewers, authors can easily find experts related to their field of study.
The recruitment process for peer reviewers should also be improved, and the type of work academic publishers distribute should be examined thoroughly. The methodology used in a research study, or the content of its novel results, should be correlated with the scientific publications of a researcher. Thus, either content or methodology can be used as a criterion for identifying an expert reviewer.
Journals should send clear invitation letters to the selected reviewers, whose number may vary depending on the field of study. The process is simpler when journals ask reviewers to accept or decline the invitation to review. There are many independent researchers in industry who can ease the workload of academics; they too should be recruited. Finally, retired professors could form a valuable pool of peer reviewers.
Although double-blind peer review reduces biases related to researchers’ nationality, the open peer review process is also gaining ground. In open review, the identities of authors and reviewers are disclosed, which makes the process transparent and increases communication between authors and reviewers.
Consider the review process of Royal Society Open Science: the journal publishes the editors’ decisions and the review letters, and it requests its voluntary peer reviewers to disclose their identity. The Open Access movement is gaining ground in academic publishing, with greater emphasis now given to research studies with time-sensitive parameters.
In academic publishing, an article begins as a drafted manuscript that is carefully reviewed by scientists of a particular discipline and specialization. Their in-depth commentary identifies the flaws and highlights the strengths of the experimental study design and results. Receiving research grants and scholarships is impossible without getting a manuscript approved by a group of esteemed peer reviewers, who are usually mid-career researchers with an impressive track record of publications.
Most early career researchers are post-doctoral candidates who must have their work scrutinized by the eagle eyes of three to four peer reviewers. The authenticity of the research and its findings needs to be officially recognized by the peer reviewers. After the peer review process is completed, the article is polished by the academic publisher.
One flaw of the academic peer review process is that it is slow and lengthy, with the duration often determined by the type of peer review model a journal follows. Academics are overworked and review for journals on a volunteer basis. The process therefore seems exploitative, as it offers very little or no remuneration; the time and effort put into reviewing is an integral part of the publication process and should be compensated.
Another striking flaw of the peer review process is that it is becoming biased and lacks transparency. Most journals follow the double-blind peer review system: the names of the manuscript’s authors are concealed from the reviewers, while the names and credentials of the reviewers are likewise not furnished to the authors. Thus, the current system lacks transparency and acts as a “black box.”
Finally, the peer review process involves a long waiting time. Whenever a paper is submitted to a journal, it is first scrutinized for the novelty of its findings. Once the content is approved, the authors have to wait a long time before the paper is sent to a set of esteemed peer reviewers.
Once the peer review process is completed, the final publication process is initiated; here, the editors do not work in tandem with the reviewers. Usually, researchers get their work published in peer-reviewed journals within one or two years. Delays in the peer review process make policymakers rely on outdated scientific findings.
Early career researchers have to make their mark in scholarly publishing: they can secure post-doctoral research positions and professorships only on the basis of successful publications. Most post-doctoral researchers are very good at laboratory work, but writing experimental manuscripts is an art, one that must be learned from the constructive comments of peer reviewers.