Update: we've extracted and analyzed 428,799,920 citation statements targeting 31,024,840 different papers. #openscience
We added 19,696 citation statements yesterday from 994 preprints/articles on #COVID19. This makes it easy to see how these articles are being cited and whether there is supporting or contradicting evidence. Here's an example
Want to join a small but passionate team working to improve research? We're hiring a senior software developer! #hiring
We've put together an interactive citation network visualization of articles and preprints on COVID-19 to help you explore research more effectively. This is a feature we will make available for all research articles in the future. What do you think?
Thank you for an amazing conference and for selecting us as the 2019 Innovation in Publishing awardee! We're truly humbled, grateful, and excited to dig in and work with the community towards making research more reliable! #alpsp19
What if there were a systematic way to find out whether a science paper has been contradicted? They're developing it using deep learning #AI, now with 8 million publications and 280M citation statements
Happy July! 305,972,156 is the latest count of total citations we've extracted from millions of scientific articles. You'll soon see 63,715,449 more citation statements on !
We've added classification tallies to our search results page! Now you can see if a paper has been supported or contradicted before clicking into the report. Try it out!
Are you a PhD student with some extra time who wants to learn about academic publishing and startups, as well as help make better? Email us at [email protected]! We're looking for more interns! #openscience #scicomm #phdchat
Citations =/= endorsements. uses machine learning to determine whether citations support, contradict, or merely mention a study, giving a better indication of impact. It even has a neat Chrome extension. Give it a try at: #phdchat
We classify citations as supporting, contradicting, or mentioning. We also show where those citations come from (Introduction, Methods, Results, Discussion, or other). Want to see how a protocol was done in 10 different papers? Try !
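As a rough sketch of the data model such a tally implies (the field names and labels here are illustrative, not scite's actual schema), each citation statement pairs a text snippet with a classification and a section of origin, and the per-paper badge is just a count over classifications:

```python
from collections import Counter
from dataclasses import dataclass

# Illustrative label sets, mirroring the tweet's taxonomy.
CLASSIFICATIONS = ("supporting", "contradicting", "mentioning")
SECTIONS = ("introduction", "methods", "results", "discussion", "other")

@dataclass
class CitationStatement:
    citing_doi: str
    cited_doi: str
    snippet: str          # the sentence(s) surrounding the in-text citation
    classification: str   # one of CLASSIFICATIONS
    section: str          # one of SECTIONS

def tally(statements):
    """Count classifications for one cited paper, as a search-result badge might."""
    return Counter(s.classification for s in statements)

stmts = [
    CitationStatement("10.1/a", "10.2/x", "Consistent with X et al. ...", "supporting", "results"),
    CitationStatement("10.1/b", "10.2/x", "We could not reproduce ...", "contradicting", "results"),
    CitationStatement("10.1/c", "10.2/x", "X et al. reported ...", "mentioning", "introduction"),
]
print(tally(stmts))
```

Keeping the snippet alongside the label is what makes the "see how a protocol was done in 10 different papers" use case possible: the tallies summarize, the snippets explain.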
I came across a tool I didn't even know I needed: It gives a list of papers with a short fragment of text where your paper of interest is mentioned, so you don't have to open each paper to look for the citation context. Handy for writing reviews!
Interested in helping make science more reliable? Like working with a lot of data? Like working remotely? We're hiring! #remotejobs #openscience
Been looking at (), which attempts to classify claim sentiment (supporting, neutral, contradicting) in the text flanking article citations. It's probably several orders of magnitude more difficult than typical Twitter sentiment analysis. 1/5
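To see why this is harder than ordinary sentiment analysis, consider what a naive keyword heuristic over the flanking text looks like. This is a toy illustration only (the real system described above uses deep learning; these cue lists and the function name are invented for the example):

```python
# Toy cue lists; real citation contexts use far more varied, hedged language.
SUPPORT_CUES = ("consistent with", "confirms", "in agreement with", "replicates")
CONTRADICT_CUES = ("in contrast to", "contradicts", "fails to replicate",
                   "could not reproduce")

def classify_citation_context(text: str) -> str:
    """Classify one citation's flanking text by simple keyword matching."""
    t = text.lower()
    if any(cue in t for cue in CONTRADICT_CUES):
        return "contradicting"
    if any(cue in t for cue in SUPPORT_CUES):
        return "supporting"
    return "mentioning"  # default: a mere mention, no evidential claim

print(classify_citation_context("Our results are consistent with Smith et al. [12]."))
print(classify_citation_context("In contrast to [12], we observed no effect."))
```

Academic hedging ("is not inconsistent with [12]", "partially replicates [12] under different conditions") defeats keyword matching immediately, which is one reason the task demands learned models over the full context rather than cue words.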
Check out a new piece from on #covid19 academic articles, mapped based on which ones are supported or contradicted by subsequent research
1970 vs. 2019. Vision & Reality. #openscience
I would say that this is an issue, but the fact that it is so easily caught makes it tolerable to some extent. A much larger issue (imo) is that there is no easy way to tell how reliable any scientific publication is. Trying to change that at
If you haven't seen us already, you might be interested in checking out . We've ingested/analyzed 8M full-text articles and are currently processing ~300k PDFs a day. The technical challenges you mention are indeed very difficult, but not insurmountable.
A lot of scientific claims being made at #CollisionConf. Check if they are supported or contradicted at
Interesting: vs IF seems to show (noting caveats) that society journals receive on average more positive citations #alpsp19
Next up is Josh Nicholson, co-founder and CEO of , a deep learning platform that evaluates the reliability of scientific claims by citation analysis. #ALPSP19
New startup, assesses #preprints and #openaccess papers to show how often a paper has been supported or contradicted by the studies that cite it, as well as how many times it has simply been mentioned. This will be a game changer.
the holy trinity of scientific chrome extensions: (shows supporting/contradicting citations), pubpeer comments, zotero connector
Indeed, hence my "sentiment". By the way, steps are being taken on this front
Blog - Europe PMC: Europe PMC Integrates Smart Citations from .
(A brief break from Brexit focus.) Yuri Lazebnik kindly pointed me to , which I first heard of on . I'm afraid the example he sent makes me doubt the ability of text mining to identify which papers do and don't support prior ones.
do something like this, and a few years ago Springer experimented with it, but it didn't get very far.
Hi // #EBMLive2019: I am interested in citation context analysis in clinical trials. Come talk to me! A good starting point: a tool we developed that tells you HOW, not just HOW OFTEN, an article has been cited!