José Luis Ortega
joseluisortega.bsky.social
Scientist at the Institute for Advanced Social Sciences (IESA) of the Spanish National Research Council (CSIC). Bibliometrics, the scholarly publishing system, and research integrity
RetractBASE (retractbase.csic.es), the largest open search engine specializing in retracted literature, makes its datasets publicly accessible here: osf.io/xtrsb/files/...
September 15, 2025 at 2:15 PM
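A minimal sketch of the kind of exploration these datasets allow, using synthetic stand-in records: the field names (`doi`, `year`, `retraction_notice`) are assumptions for illustration, not the confirmed column names in the OSF files.

```python
from collections import Counter

# Synthetic stand-ins for RetractBASE records; 'year' and 'retraction_notice'
# are assumed field names, not confirmed against the actual OSF files.
records = [
    {"doi": "10.1000/a1", "year": 2019, "retraction_notice": "10.1000/rn1"},
    {"doi": "10.1000/a2", "year": 2021, "retraction_notice": "10.1000/rn2"},
    {"doi": "10.1000/a3", "year": 2021, "retraction_notice": "10.1000/rn3"},
]

# Count retracted publications per year.
per_year = Counter(r["year"] for r in records)
print(sorted(per_year.items()))  # [(2019, 1), (2021, 2)]
```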
Today I presented retractbase.csic.es, our open database of retractions, at the STI-ENID Conference in Bristol @stienid2025.bsky.social, where it was very well received
September 5, 2025 at 4:33 PM
RetractBASE also incorporates author, organization and journal indexes to identify the entities with the most retractions and enrich the data exploration.
April 7, 2025 at 1:48 PM
It includes more than 120k publications from 2000 to 2024, all connected with their retraction notices, along with external links to the original publication, other databases, and @pubpeer.com to broaden the information about the reason for retraction.
April 7, 2025 at 1:48 PM
5/7 Regarding document types, all the databases have problems extracting and identifying entities in books and book chapters. Books show greater author variability (.31), and improper extraction is seen in affiliations (15.9%) and venues (44.7%)
January 15, 2025 at 3:22 PM
3/7 Crossref-based products such as Dimensions, Scilit, and The Lens identify more venues and publishers than the others, but have limitations in identifying authors and organizations. Among them, Dimensions performs best, ahead of Scilit and The Lens.
January 15, 2025 at 3:22 PM
2/7 Dimensions and OpenAlex are the best products for processing authors: they have the lowest percentage of authors with a single publication (>88%) and the lowest slope coefficients. They also show low average author variation (Dimensions, .12; OpenAlex, .17).
January 15, 2025 at 3:22 PM
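The two author metrics mentioned in this thread can be illustrated with a short sketch on synthetic data. The estimation method is an assumption on my part: a Lotka-style ordinary least-squares fit on the log-log publications-per-author distribution, which may differ from the paper's exact procedure.

```python
import math
from collections import Counter

# Synthetic publication counts, one entry per author (illustrative only).
pubs_per_author = [1, 1, 1, 1, 1, 1, 2, 2, 3, 5]

# Share of authors with exactly one publication.
single_share = pubs_per_author.count(1) / len(pubs_per_author)

# Slope of log(number of authors) vs log(publication count):
# a Lotka-style power-law fit via ordinary least squares.
freq = Counter(pubs_per_author)            # k -> authors with k publications
xs = [math.log(k) for k in freq]
ys = [math.log(n) for n in freq.values()]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

# A steeper (more negative) slope means fewer prolific authors.
print(round(single_share, 2), round(slope, 2))
```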
1/7 “Research entity information and coverage in eight free access scholarly databases” is our latest comparison of free-access scholarly databases (with Lorena Delgado), across four research entities (authors, organizations, venues, and disciplines). osf.io/preprints/so...
January 15, 2025 at 3:22 PM
9/9 However, even though pseudoscience communities may hold a marginal position in mainstream science, it is worrying that they achieve greater social impact than scientific results. This suggests that pseudoscience may be more acceptable to public opinion, which is a fundamental pillar of its prevalence.
January 9, 2025 at 3:04 PM
8/9 Regarding correction, there are no differences in the release of editorial notices, but the number of papers reported on PubPeer is considerably higher in pseudoscientific disciplines, indicating that their methodologies fall short of scientific standards and are therefore reported.
January 9, 2025 at 3:04 PM
7/9 Linked to this, professional associations and non-profit organizations have a strong presence in performing and funding pseudoscience studies, whereas universities and government agencies predominate in scientific fields, with the exception of integrative medicine.
January 9, 2025 at 3:04 PM
6/9 The most remarkable difference is the strong presence of practitioners and self-employed researchers in pseudoscientific studies (as authors and editorial board members), suggesting that these authors lack experience in research activities and perform studies to validate their own practices.
January 9, 2025 at 3:04 PM
4/9 The percentage of journal self-citations is higher in pseudoscience than in science papers, while the proportion of cross-group citations shows an inverse trend. This finding suggests that pseudosciences tend to be more isolated and more resistant to external ideas than scientific communities.
January 9, 2025 at 3:04 PM
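The two citation rates compared in that post can be sketched as follows. The citation tuples and the group labels are synthetic and purely illustrative; they are not data or field names from the paper.

```python
# Each citation: (citing_journal, cited_journal, citing_group, cited_group).
# Groups 'pseudo' vs 'science' and all journal names are illustrative only.
citations = [
    ("J-Homeo",  "J-Homeo",  "pseudo",  "pseudo"),
    ("J-Homeo",  "J-Homeo",  "pseudo",  "pseudo"),
    ("J-Homeo",  "J-Pharma", "pseudo",  "science"),
    ("J-Pharma", "J-Med",    "science", "science"),
    ("J-Pharma", "J-Pharma", "science", "science"),
]

def journal_self_citation_rate(cites, group):
    """Share of a group's outgoing citations that cite the citing journal itself."""
    out = [c for c in cites if c[2] == group]
    return sum(c[0] == c[1] for c in out) / len(out)

def cross_group_rate(cites, group):
    """Share of a group's outgoing citations that point to the other group."""
    out = [c for c in cites if c[2] == group]
    return sum(c[3] != group for c in out) / len(out)

# Two of the three pseudo citations are self-citations; one crosses groups.
print(journal_self_citation_rate(citations, "pseudo"))
print(cross_group_rate(citations, "pseudo"))
```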
1/9 “How does pseudoscience differ from science? A pair-wise bibliometric analysis” is my first approach to the study of pseudoscience from a bibliometric perspective, observing differences between pseudoscientific and scientific disciplines through publishing parameters doi.org/10.31235/osf...
January 9, 2025 at 2:57 PM
6/6 References from publications by Georg Thieme, Routledge and, to a lesser extent, Oxford University Press have trouble being indexed in all databases. This emphasizes the importance of book publishers providing references and ensuring their indexation in databases.
November 26, 2024 at 3:56 PM
5/6 The study of references has shown that specific document types, such as books and book chapters, present limitations when their references are extracted and indexed in scholarly databases. This limitation cuts across every database, suggesting that the issues originate with the publishers.
November 26, 2024 at 3:56 PM
4/6 The analysis of references has revealed that Scilit only extracts references with Digital Object Identifiers (DOIs) and that Semantic Scholar introduces significant problems when it adds references from external web versions. Microsoft Academic had not been updated for several years before it was discontinued.
November 26, 2024 at 3:56 PM
3/6 Publications in Crossref-based databases (Crossref, Dimensions, Scilit and The Lens) have similar citation counts, while in search engines (Google Scholar, Microsoft Academic and Semantic Scholar) they receive more citations, due to greater coverage of publications and the integration of web copies.
November 26, 2024 at 3:56 PM