<p>TikTok and the dating app Bumble have joined an initiative to prevent the sharing of non-consensual intimate images online.</p>
<p>The two platforms have partnered with StopNCII.org (Stop Non-Consensual Intimate Image Abuse), which hosts a tool developed in partnership with Meta.</p>
<p>TikTok, Bumble, Facebook and Instagram will detect and block any images that are included in StopNCII.org's bank of hashes, reports Engadget.</p>
<p>The website uses on-device hashing technology, through which people who are being threatened with intimate image abuse can create unique identifiers of their images (also known as 'hashes', or digital fingerprints).</p>
<p>This process takes place entirely on the person's device. To protect users' privacy, StopNCII.org uploads only a unique string of letters and numbers rather than the actual files, according to the report.</p>
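<p>A minimal sketch of that on-device step, using an ordinary cryptographic hash as a stand-in for whatever image-fingerprinting scheme the tool actually uses; the filename is hypothetical. The point being illustrated is the privacy property: only the resulting string of letters and numbers would ever be uploaded.</p>
<pre><code>import hashlib

def hash_image_on_device(path: str) -> str:
    """Compute a fixed-length fingerprint of an image file locally.

    Illustrative only: SHA-256 matches exact copies, whereas a real
    image-matching system would typically use a perceptual hash so
    that resized or re-encoded copies still produce a matching hash.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files need not fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Only this short string of letters and numbers would be uploaded;
# the image itself never leaves the device.
print(hash_image_on_device("photo.jpg"))  # hypothetical local file
</code></pre>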
<p>The hashes submitted to StopNCII.org are then shared with participating partner platforms.</p>
<p>If an image or video uploaded to TikTok, Bumble, Facebook, or Instagram matches a hash in the bank and "satisfies partner policy requirements", the file will be forwarded to the platform's moderation team.</p>
<p>If moderators find that the image violates the platform's rules, they will remove it, and the other partner platforms will block the image as well, said the report.</p>
<p>The tool has been available for a year, and over 12,000 people have used it to prevent intimate videos and images from being shared without permission.</p>
<p>Users have created more than 40,000 hashes to date, the report added.</p>
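<p>The platform-side flow the article describes can be pictured as a lookup against the shared hash bank, with a match routed to human review rather than removed automatically. A minimal sketch under the same stand-in hashing as above; the bank entry and routing labels are hypothetical.</p>
<pre><code>import hashlib

# Hypothetical hash bank shared with partner platforms; in production
# this would hold the fingerprints submitted via StopNCII.org.
SHARED_HASH_BANK = {
    # SHA-256 of the bytes b"test", standing in for a submitted hash.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def screen_upload(file_bytes: bytes) -> str:
    """Route a newly uploaded file based on the shared hash bank.

    A match does not delete the file automatically: it only forwards
    the file to human moderators, mirroring the flow in the article.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in SHARED_HASH_BANK:
        return "forward_to_moderation"  # moderators decide on removal
    return "allow"

print(screen_upload(b"test"))   # forward_to_moderation
print(screen_upload(b"hello"))  # allow
</code></pre>
<p>The design choice worth noting is that a hash match alone never removes content; it only flags the file for a moderation team, which guards against false positives before the other partner platforms block the image.</p>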