Protect your platform from Child Sexual Abuse Material (CSAM).

Krunam provides breakthrough technology that identifies and classifies previously unknown CSAM images and video at scale. Our CSAM classifier protects your online community, your employees, your brand’s reputation, and, last but not least, millions of children around the world.

Book a consultation

What Krunam does

VigilAI CAID Classifier

Our classifier uses the latest 2D and 3D computer vision and deep learning to help your content moderation teams root out and eliminate CSAM on your platform faster. It is a fully GDPR- and CCPA-compliant on-premise solution, so all of the data stays on your own systems.

CSAM Content Moderation Risk Mitigation

Protecting your content moderators from psychological harm is critical in the fight against CSAM. Krunam can help you organize the data and set moderation approaches that increase throughput, all while taking steps to reduce the psychological toll on your employees – leading to more robust and faster content moderation.

CSAM Governance Expertise

Finding CSAM is one thing; navigating the tricky balance of public perception, ESG investor relations, potential reporting requirements, and survivor sensitivity and needs is another. Krunam’s multi-disciplinary team can help your Trust & Safety team find the right solution for your situation.

The challenge

In 2020 alone, over 60 million CSAM images and videos were reported by authorities across the globe. While the legal definition of CSAM varies by country, the common denominator is an image or video that shows a child who is engaged in, or is depicted as being engaged in, explicit sexual activity.

97%*

In 2020 there was a 97.5% increase in the number of CyberTipline reports of online enticement versus 2019.

82%*

Children of all ages appear in CSAM; 82% of those depicted are aged just 7-14.

92%*

Young girls are highly targeted by abusers, making up 92% of children appearing in CSAM.

Who is at risk?

Online Communities

Your online community is harmed when people accidentally encounter CSAM on your platform, which dramatically lowers trust in your services.

Businesses

Your business is at risk: the presence of CSAM damages the brand and online community you have spent years cultivating, and a major scandal on your platform could hurt your financial results.

Content Moderators

Content moderators and investigators suffer psychological harm while reviewing CSAM in order to remove it from your platform. Our tool can help mitigate this harm while increasing the speed of human moderation by an estimated 40%.

Survivors

Survivors of abuse are re-victimized by the distribution of CSAM, often years after the abuse occurred. Some survivors are now suing platforms for inaction in removing CSAM, exposing your platform to even more risk.

How it works

The World’s Pre-Eminent CSAM Dataset

Our VigilAI CAID Classifier is trained on the meticulously collected, law-enforcement-labeled Child Abuse Image Database (CAID) maintained by the UK’s Home Office. It is the pre-eminent privacy-compliant CSAM dataset in the world.

CSAM Recognition

Our algorithm is trained with the latest AI and deep learning techniques. This allows our technology to identify the “hallmarks” of CSAM, adult, and benign material, giving your team deep insight into six (and soon seven) classes of images and videos (see the sketch after this section).

State-of-the-art computer vision

Our VigilAI CAID Classifier uses state-of-the-art computer vision to evaluate images and videos at scale. The Krunam team includes AI and computer vision experts as well as child sexual assault investigators.

Identifies new imagery

Our breakthrough technology can detect previously unseen CSAM. For the first time, organizations holding third-party images and video can identify unknown CSAM via a fully GDPR- and CCPA-compliant, AI-trained classifier.
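For illustration only, here is a minimal sketch of how an on-premise, multi-class image classifier of this kind might be called from a moderation pipeline. The model file, input preprocessing, and class labels are hypothetical assumptions: Krunam does not publish its API, and the text above only states that there are six (soon seven) classes.

import numpy as np
import onnxruntime as ort
from PIL import Image

# Placeholder labels: the source describes six classes but does not name them.
CLASSES = ["class_0", "class_1", "class_2", "class_3", "class_4", "class_5"]

# Hypothetical on-premise model file, so no data ever leaves your systems.
session = ort.InferenceSession("csam_classifier.onnx")

def classify(path):
    # Resize and normalize into the NCHW float tensor most vision models expect.
    img = Image.open(path).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32) / 255.0
    x = np.transpose(x, (2, 0, 1))[np.newaxis, ...]
    (logits,) = session.run(None, {session.get_inputs()[0].name: x})
    # Softmax over the class scores so each image gets a probability per class.
    probs = np.exp(logits[0] - logits[0].max())
    probs /= probs.sum()
    return dict(zip(CLASSES, probs.round(4).tolist()))

A moderation queue could then route each item by its highest-scoring class, sending only likely CSAM to human reviewers.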

What makes Krunam different

The VigilAI CAID Classifier is the next generation of CSAM recognition, pushing beyond the boundaries of perceptual hashing, which can only identify previously known CSAM. Because it uses visual cues learned from training on the world-class CAID dataset, our classifier can identify newly generated and previously unknown CSAM.

In addition to identifying unknown images, Krunam has built in the critical capability of scanning video as well as still images, with over 60% of CSAM known to be video as of 2019. Live-streaming support will be available later in 2021.
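As a rough, hypothetical illustration of the difference between hash matching and classification (not Krunam’s actual pipeline): a perceptual-hash check only flags images whose hash is already on a known list, while a learned classifier can still score content it has never seen. Here, known_hashes is assumed to be a set of perceptual hashes of previously identified content, and classify is any callable returning per-class scores (for example, the sketch above).

import imagehash
from PIL import Image

def triage(path, known_hashes, classify):
    # Perceptual hashing: matches only if the image is near a hash already on the known list.
    h = imagehash.phash(Image.open(path))
    if any(h - k <= 4 for k in known_hashes):  # small Hamming distance => previously known content
        return {"route": "known_match"}
    # A learned classifier still produces class scores for previously unseen material.
    return {"route": "classifier", "scores": classify(path)}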

Our Partners

Vigil AI
Cutting-edge computer vision, AI, deep learning and machine learning technologists with CSAM domain expertise.

Just Business
Investing in and incubating profitable, forward-thinking ventures dedicated to positively impacting the world since 2007.

Not For Sale
A global NGO that is a leader in innovative solutions fighting human trafficking, with survivor support and alleviation programs on five continents.
