Leon County Sheriff’s Office Investigating Explicit Facebook Spam

By: WCTV Eyewitness News 

TALLAHASSEE, Fla. (WCTV) — Authorities across the country, including those in Leon County, are investigating a series of Facebook messages that contain images that show what appears to be a sexual encounter between an adult and a child.

WCTV has learned multiple law enforcement agencies across the country – including those in Columbus, Ohio; Chicago, Illinois; and Denver, Colorado – are investigating. The Leon County Sheriff’s Office has also launched an investigation into the images.

Columbus Police say they have received several calls about a video containing child pornography being sent via Facebook Messenger. Agencies in Missouri, Alabama and Tennessee have also received reports, and federal authorities have been notified. Police are warning recipients not to open or forward the link.

Multiple media outlets, including WCTV, have received the potential spam. Upon receipt, WCTV station management reported the messages to local authorities.

The message, which accompanies the explicit image, asks recipients to make the image “go viral” to get justice for the victim.

Law enforcement strongly urges anyone who receives the message to delete it without opening or forwarding it.

“We do not allow the sharing of child exploitative images on Facebook or Messenger — even to express outrage. Regardless of intention, sharing such imagery is harmful and illegal,” said a Facebook spokesperson in a statement to WCTV. “When we become aware of such images, we remove them and notify the National Center for Missing and Exploited Children. We urge people never to share such content and to report it to local authorities immediately.”

Facebook said it has not seen any instances of the image being shared with the intent to exploit; rather, users were sharing it to raise awareness and to try to help. Still, Facebook reminds users that sharing this content – no matter the intention – is illegal, and the company is encouraging people to report it to local authorities.

Facebook also says it has banked the image using its PhotoDNA technology to prevent future uploads.

More information on what to do if you have received the message can be found here.

Facebook’s full statement to WCTV can be read below:

At Facebook, we take these issues very seriously. There is nothing more important to us than the safety and security of our community, especially the most vulnerable.
We have zero tolerance for child exploitation images (CEI) being shared on Facebook or Messenger – regardless of intent – and we are extremely aggressive in preventing and removing child exploitative content.

  • Sometimes people may use our platform to share CEI in an attempt to draw attention to it and/or identify the perpetrators. While we recognize that the intent here may be good, sharing this imagery is illegal in the US and elsewhere.
    We flag all reports for child exploitation to the National Center for Missing and Exploited Children (NCMEC) in the US. As a US company, we are obliged to do this. NCMEC has a vast global network of relationships and will, in turn, work with local law enforcement teams around the world to bring people to justice.
  • As is always the case, if we have reason to believe a child is in immediate/imminent danger, we may proactively refer a case to local law enforcement (in addition to reporting it to NCMEC), to make sure the child is immediately safeguarded.
    Every image that is uploaded to our site is scanned using PhotoDNA technology, and has been since 2011. When an image is found to be of CEI, the content is immediately deleted, the account is taken down, and we report all instances of such content to the appropriate officials.
  • PhotoDNA is a technology that scans all images on Facebook and Instagram and flags known child exploitative material so that future uploads of that imagery are prevented from surfacing on the platform at all.
  • If someone does attempt to upload known CEI, our systems prevent upload, the account that attempted upload is immediately taken down, and we report all instances of such content to the NCMEC.
  • When we become aware of newly generated CEI, we hash the content to prevent further sharing, report it to the appropriate officials, and delete the account.
    We take child exploitation reports very seriously and we move swiftly.
  • We make it easy for people to use the ‘report’ button to report violations of our policy, and we prioritize reports of child sexual exploitation.
  • People can report instances of child exploitation content using any reporting flow on our site. All of our teams are trained to recognize this content and pass it to our team of child safety experts.
    We also work with NGOs and others to block people searching for known child exploitative terms. This list is updated on a regular basis.
    In addition, we have created shortcuts on Facebook and Instagram to provide education and additional resources (developed in conjunction with the National Human Trafficking Resource Center) to people who search for terms related to sex trafficking. These terms have been provided by internal and external experts and when someone searches for them on Facebook, we will have a pop-up that reminds them sex trafficking is illegal and violates our policies and shares resources for getting help.
    We educate our community about what to do if they come across CEI with extensive resources in our Help Center.
  • We emphasize the fact that people should not share, download, or comment on the content, reminding them that it can be criminal to share, or send messages with, photos and videos of children being sexually abused and exploited. And we tell them that they won’t be asked to provide a copy of the content in any report.
    Facebook is committed to a multistakeholder, comprehensive global effort to fight child exploitation and thwart the proliferation of CEI. We hold a leadership position on the board of the WePROTECT Global Alliance, an international multi-stakeholder initiative to end the sexual exploitation of children online. With support from the Fund to End Violence against Children, WePROTECT has brought governments, the technology industry and international and civil society organizations together to develop a comprehensive national model involving all stakeholders to tackle this crime and to help countries work toward the implementation of the model. Facebook’s role in this effort includes:
  • board membership
  • participation in the development of an international database (maintained by the Internet Watch Foundation) for hashed CEI images so we can use PhotoDNA to stop the proliferation of these images via the internet
  • work with our partners worldwide to help the investigation and prosecution of child sexual exploitation
  • trainings for smaller industry players interested in developing the internal technological and operational systems needed to stop the proliferation of CSE
  • webinar trainings for child helplines throughout the world
  • assistance in a much needed technology upgrade for the NCMEC to help them manage the increasing number of reports resulting from the growing global multi-stakeholder effort to thwart child sexual exploitation.
    We launched AMBER Alerts on Facebook in 2011 to help families and authorities successfully recover missing children and have since expanded the program to over 12 countries.
  • People in a designated search area where local law enforcement has activated an AMBER Alert will see the alert in their News Feed. The alert includes a photo of the missing child, a description, the location of the abduction, and any other pertinent, available information. Users can share the alert with friends to spread awareness, tapping into an organic desire to help.
  • We know the chances of finding a missing child increase when more people are on the lookout, especially in the critical first hours.
  • Our goal is to help get these alerts out quickly to the people who are in the best position to help.
    In May 2016, we invited over 75 engineers from across industry, including Microsoft and Google, as well as from child safety NGOs, such as NCMEC, Thorn and InHope, to the Facebook campus in San Francisco for the first-ever cross-industry child safety hackathon to develop tools and products that enhance child online safety (read more here). We hosted the hackathon again in 2017, and have now added the TechCoalition and Google as co-hosts to expand its scope and reach. One of the prototypes that came out of the hackathon is a tool that will enable people to match known photos of missing children against online trafficking ads.

(WCTV)