MENLO PARK, Calif. (KGO) -- Meta is responding to new claims it didn't do enough to stop child exploitation on its platforms.
According to a Wall Street Journal investigation, two separate teams inside Meta raised alarms about the issue in internal reports.
The teams found that hundreds of what Meta calls "parent-managed minor accounts" on Facebook and Instagram were using the subscription feature to sell exclusive content not available to non-paying followers.
The report says the content frequently featured young girls in bikinis and leotards and was sold to an overwhelmingly male audience, with some posts plastered with suggestive emojis and sexual comments.
RELATED: Meta collected children's data, refused to close under 13 Instagram accounts, court document alleges
The Journal reports that staffers found Meta's own algorithms were promoting child-modeling subscriptions to likely pedophiles.
According to the investigation, Meta staffers formally recommended that the company require accounts selling subscriptions to child-focused content to register themselves, so the company could monitor them.
Instead, Meta built an automated system intended to keep suspected pedophiles from being offered the option to subscribe to parent-run accounts, a system that didn't always work.
This investigation comes just weeks after Meta founder and CEO Mark Zuckerberg apologized to parents at a congressional hearing.
VIDEO: Meta CEO Mark Zuckerberg apologizes to families of kids harmed online as Senate grills tech CEOs
Meta released a statement to ABC7 News, saying in part:
"We launched creator monetization tools with a robust set of safety measures and multiple checks on both creators and their content."
The company added that its plans to prevent likely pedophiles from subscribing to children's accounts are "part of their ongoing safety work."