Datasaur.ai

Consensus X Datasaur

Introduction

Datasaur and Consensus began working together in mid-2021 to help Consensus build an application that lets people search for evidence-based answers. Because those answers are drawn from peer-reviewed research, they provide reliable information to everyone. Consensus partnered with Datasaur to train its ML model with ease and efficiency, and within a few months of working together, Consensus achieved highly accurate results.

The Story of Consensus

Consensus wanted to make it easy for people to find evidence-based information. The internet is a vast space filled with endless information, and it's nearly impossible to get easy, yet rigorous, answers. Consensus makes getting reliable answers as easy as a Google search by using AI to instantaneously extract information about your questions, directly from peer-reviewed scientific research.

Consensus plans to release an ad-free search engine in the summer of 2022. This will be a huge step forward for the company (and for users of the internet). The search engine uses natural language processing to supply users with aggregated scientific information from peer-reviewed sources.

Whether you are a student looking for sources for an essay, a parent searching for teaching techniques for your kids, or someone just trying to win an argument among friends, Consensus' search engine will help. Consensus is being built for you.

To learn more and sign up for the waitlist, visit https://consensus.app/

The Labeling Challenges for Consensus

The technology powering Consensus is a state-of-the-art machine learning system that extracts and analyzes data from scientific literature. As any company building an ML model knows, the quality of the datasets used to train the model is the most important factor.


In order to construct the required high-quality dataset, Consensus searched for an annotation product that:
- Could apply labels to scientific papers at multiple levels (entity, sentence, multiple sentences, etc.)
- Allowed the creation of custom label sets
- Enabled multiple annotators to work on the same file
- Could be used by people who had never used an annotation product before (in Consensus' case, PhDs and PhD students)
- Provided easy-to-use oversight tools for managing multiple projects with dozens of annotators simultaneously

Why Consensus Chose Datasaur

Consensus decided that Datasaur was the best product for a number of reasons:
1. Datasaur met all of the complex annotation requirements listed above
2. Datasaur very explicitly showed how it would solve Consensus' problem in a hands-on demo session
3. Datasaur offered best-in-class annotation oversight tools that could not be found in competing products
4. Datasaur offered a personal touch and continued support for a team that was unfamiliar with annotation tools

How Datasaur Responded

Datasaur onboarded Consensus, ensuring that the platform met every one of Consensus' annotation needs. Key stakeholders from both companies met virtually to make sure the projects were set up properly within Datasaur's UI. From there, Datasaur trained Consensus personnel on the best ways to use the platform and maximize feature functionality for their requirements.

The two teams collaborated at every step of the process, from data ingestion to export, to ensure that the training data was of high quality. Together, they worked out the best reviewing methods (including discussions of consensus standards), output formats, labeling techniques, and workflows.

“We [Consensus] had a very complex and specific set of annotation needs. Datasaur was able to address those needs efficiently and effectively all while maintaining the personal touch you would expect from a start-up. Working with Datasaur has enabled us to create state-of-the-art datasets of thousands of annotated scientific papers, from start to finish, in a matter of a few short months.”

Eric Olson, Co-Founder & CEO

The Result

In the end, Consensus achieved highly accurate results in a matter of a few months. Their first workflow was completed with such success that they are now using Datasaur for a second workflow, which uses an entirely different method of labeling and review.


Ready to improve your team’s labeling performance 10X?

Schedule your demo today!
Get Started