
High risk screenshot inaccuracy

Screenshot risk rating engine explained

Last updated: October 30, 2023

Background

Truple can optionally scan each screenshot and assign it a risk rating. If a screenshot is deemed high risk, it is flagged and called out on the accountability report.

(In)accuracy

The ratings aren't perfect: sometimes the app flags things it shouldn't, and sometimes it misses things it should flag.

It's important to remember that computers "think" in 1s and 0s. The images we see on screen all boil down to a long string of 1s and 0s, and the algorithm has to take those 1s and 0s and make sense of them.

Think of all the different things that could show up in a "sexual" image. The individual(s) could be real people, hand-drawn cartoons, or computer-generated characters. They could be facing forward, lying down, or standing with their backs to the camera, shot from above, below, or the side. The image could be a close-up or a far-away, zoomed-out view. It could show people of different races, skin tones, hairstyles, makeup, and clothing. It could be in black and white, shot in low light, or under very bright lights. There could be other people around dressed normally, or the individual could be alone; there could be one person in the image or a dozen. The algorithm needs to distinguish someone swimming in a non-sexual way from someone posing in a scantily clad swimsuit. Now take all of that and put it in one corner of a screenshot, where the rest of the screenshot shows "normal" content. The algorithm has to take all of this in and pick out the sexual content. These are incredibly complex technical problems that haven't been solved perfectly.

Even the best algorithms, running on the most powerful servers, make mistakes. Truple runs its algorithm on your phone, where RAM, CPU, and battery power are limited. Running it on the phone lets us better protect your privacy and keep our operating costs low. Right now, a typical high-end (server-based) algorithm costs about $1.50 per thousand images. Many of our customers capture over a thousand screenshots per day per device... the high-end algorithm costs alone for a single one of these devices would be well over $40 per month.
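
For a rough sense of the math, assuming 1,000 screenshots per day on a single device:

1,000 screenshots/day × 30 days = 30,000 images/month
30,000 images × $1.50 per 1,000 images = $45/month for one device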

So instead we put together an algorithm that balances accuracy and efficiency. It's lightweight enough to run on modern devices without noticeably impacting performance, but the trade-off is some accuracy.
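
For readers curious what on-device classification looks like in general, here's a minimal sketch using TensorFlow Lite. This is purely illustrative and not Truple's actual code: we haven't described our implementation here, and the model file name and the single 0-to-1 score output below are assumptions made for the example.

    # Illustrative sketch only -- not Truple's actual code. Assumes a small
    # TensorFlow Lite image-classification model ("risk_model.tflite" is a
    # hypothetical file) that outputs a single risk score between 0 and 1.
    import numpy as np
    from tflite_runtime.interpreter import Interpreter

    interpreter = Interpreter(model_path="risk_model.tflite")  # hypothetical model
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    def risk_score(image: np.ndarray) -> float:
        """Score one preprocessed screenshot (e.g. a 224x224x3 float32 array)."""
        interpreter.set_tensor(input_details[0]["index"],
                               image[np.newaxis, ...].astype(np.float32))
        interpreter.invoke()  # inference runs entirely on the device
        return float(interpreter.get_tensor(output_details[0]["index"])[0][0])

The key point is the last comment: everything happens on the phone, so no screenshot ever needs to be sent to a server just to be scored.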

The algorithm will make mistakes. Sometimes it will miss sexual content, and sometimes it will flag normal content as "high risk". This is why we give you access to every screenshot. When time allows, report recipients should quickly scroll through all of the screenshots; humans are able to detect sexual content much more efficiently than computers. However, we understand that people's time is limited, so the algorithm is there to help. When you're short on time, at a minimum be sure to check the high-risk screenshots.

Adjust sensitivity levels

To complicate things a bit more, each person using Truple has their own opinion about what is and isn't sexual, as well as their own preference for how sensitive the algorithm should be. Because of this, we added the ability to adjust the algorithm's sensitivity level. If you're tired of false positives and are willing to accept some false negatives, you can decrease the sensitivity. If you'd prefer the algorithm be more aggressive in flagging content so it doesn't miss anything, you can increase the sensitivity. Conceptually, this moves the score threshold at which a screenshot gets flagged, as the sketch below illustrates.
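
Here's a rough sketch of that idea. The threshold values are made up for illustration, not Truple's actual settings:

    # Illustrative sketch only -- these thresholds are invented for the
    # example. A lower threshold flags more screenshots (more false
    # positives); a higher threshold flags fewer (more missed content).
    SENSITIVITY_THRESHOLDS = {
        "less sensitive": 0.9,  # flag only when the model is very confident
        "default": 0.7,
        "more sensitive": 0.5,  # flag anything even moderately suspicious
    }

    def is_high_risk(score: float, sensitivity: str = "default") -> bool:
        """Compare a 0..1 risk score against the user's chosen threshold."""
        return score >= SENSITIVITY_THRESHOLDS[sensitivity]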

You can adjust the algorithm's sensitivity level from the device settings page.

Still need help?
Email support@truple.io