Announcing two new content detection engines: NSFW Language and NSFW Image!

In keeping with this year's theme of big announcements, the scanii team is ecstatic to announce the beta availability of not one but two brand new content detection engines: NSFW Image and NSFW Language.

NSFW Image

NSFW stands for "not safe for work," and NSFW Image is our brand new content detection engine, designed from the ground up to use the latest in artificial intelligence and computer vision to detect adult, offensive, or otherwise inappropriate images.

Worried about inappropriate images being accidentally shared by your users? Turn this detection engine on and say goodbye to that risk.

NSFW Language

Probably one of the most exciting features we've released this year, NSFW Language detection allows you to detect profane, offensive, or otherwise inappropriate language across virtually all file types, including images. That's right: you can send us a phone camera image of a document containing adult language and it will be detected accordingly. Oh, and we almost forgot to mention that this detection works across 23 languages.

And that's not even the most exciting part of this announcement: the NSFW Language database is open source, uses the amazing open source YARA format, and is available at https://github.com/uvasoftware/yara-language-nsfw.
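To give a feel for what the YARA format looks like, here is a minimal, purely illustrative rule. The rule name, tag, and strings below are hypothetical examples and are not taken from the actual database linked above:

```yara
// Hypothetical rule sketched in the YARA format; the name, tag,
// and placeholder strings are illustrative only.
rule nsfw_language_en_sample : english
{
    meta:
        description = "Sample rule flagging placeholder profanities"
        language = "en"
    strings:
        $word1 = "badword1" nocase
        $word2 = "badword2" nocase
    condition:
        any of them
}
```

Because rules like these are plain text, the community can review, extend, and translate the database just like any other open source project.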

How to get started

Our new content detection engines are available to everyone now; all you need to do is enable them for your API keys, details here. Oh, and there's no extra charge for using them :)
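Once the engines are enabled for your API keys, detections surface as findings on processed content. Here is a minimal sketch in Python of how client code might check for them; the response shape and the finding identifier shown are assumptions for illustration, not the documented API:

```python
import json

# Hypothetical API response for a processed file; the exact JSON shape
# and finding identifiers are assumptions for illustration only.
SAMPLE_RESPONSE = json.loads("""
{
  "id": "abc123",
  "findings": ["content.image.nsfw"]
}
""")


def has_nsfw_finding(response):
    """Return True if any finding in the response looks like an NSFW detection."""
    return any("nsfw" in finding for finding in response.get("findings", []))


print(has_nsfw_finding(SAMPLE_RESPONSE))   # the sample above contains an NSFW finding
print(has_nsfw_finding({"findings": []}))  # a clean result has no findings
```

In practice you would apply a check like this to the result of each file you submit, and quarantine or reject anything that comes back with an NSFW finding.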

Here's your obligatory celebration gif:


Onwards,

The scanii team.

Last updated on 08/05/2016.