Two Hat Is Changing the Landscape of Content Moderation With New Image Recognition Technology

Two Hat, Creator of Community Sift, Acquires Image Recognition and Visual Search Company ImageVision

Leading AI technology company Two Hat, creator of the chat, image and video moderation solution Community Sift, announced that it has acquired ImageVision, an image recognition and visual search company. With the addition of ImageVision’s patented computer vision technology, Two Hat now provides the most accurate and efficient all-in-one content moderation solution for social platforms in the industry.

“ImageVision gives us access to over 10 years of research and experience and $10 million worth of technology,” says Two Hat CEO and founder Chris Priebe. “Anyone can train a system to filter 90% of items correctly. It’s only through blending multiple systems that we will finally see the 99% accuracy that makes technology invisible. And when it comes to protecting abused kids, spending money to solve the problem is our primary mission.”


Two Hat’s Community Sift currently provides content moderation solutions to clients like SuperAwesome and Kabam. Most recently, the company collaborated with Canadian law enforcement to develop CEASE.ai, an artificial intelligence model that detects new child sexual abuse material (CSAM) for law enforcement and social platforms. Now, with ImageVision’s computer vision and deep learning expertise boosting its existing technology, Two Hat is poised to provide the most accurate pornography and CSAM detector in the industry.


Every day, users share 3.2 billion images on social networks. With over 3 billion people now active on social media, that number is only projected to grow in 2019. As global companies like Facebook and Google face mounting scrutiny over their content moderation practices and hire tens of thousands of human moderators in response, many platforms are searching for scalable, AI-based solutions to protect their brand and users from NSFW (not safe for work) images. Priebe believes that 2019 is the year that image moderation becomes not just a nice-to-have, but a necessity for social platforms.

“The big players are recognizing the value of creating safe online spaces for all users,” Priebe said. “We’re excited to help bring that possibility to more sharing platforms this year. We can’t wait to bring this innovative technology to social networks.”

