When the lip-syncing app Musical.ly first exploded in popularity nearly four years ago, it was best known for being a teen sensation. But according to the Federal Trade Commission, the app also illegally collected information from children under the age of 13. The agency announced Wednesday that Musical.ly, now known as TikTok, has agreed to pay a $5.7 million fine to settle the allegations, which the agency described as “the largest civil penalty ever obtained by the Commission in a children’s privacy case.” TikTok must also comply with the Children’s Online Privacy Protection Act, or COPPA, going forward and take down any videos uploaded by users under 13.
“This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law,” FTC chair Joe Simons said in a statement. The FTC’s complaint alleges that Musical.ly violated COPPA by failing to require parental consent for users under 13, neglecting to notify parents about how the app collected personal information on underage users, and not permitting parents to request to have their children’s data deleted.
TikTok subsequently announced on Wednesday that it was launching a separate portion of its app for children under 13, which “introduces additional safety and privacy protections designed specifically for this audience.”
By essentially combining Vine with Spotify, Musical.ly captured the attention of around 100 million finicky Generation Z consumers. In November 2017, it was bought by Chinese company ByteDance and later rebranded as TikTok. The app—which lets users share 15-second video clips set to music—has now been installed more than a billion times, including 96 million times in the United States, according to the research firm Sensor Tower. After receiving $3 billion from SoftBank and other investors in October, ByteDance is now considered one of the most valuable privately held startups in the world.
The FTC alleges that TikTok was aware that a “significant percentage” of its users were younger than 13, and that it received thousands of complaints from parents whose underage children had created accounts. Until April 2017, the app’s website even warned parents that “If you have a young child on Musical.ly, please be sure to monitor their activity on the App,” according to the FTC’s complaint. But the app didn’t start asking users to provide their age until later that year, the agency notes. Since then, the app has prevented anyone who indicated they were under 13 from creating an account, but its operators didn’t confirm existing users’ ages.
“Kids’ lives are increasingly lived online, and companies like TikTok have been all too eager to take advantage of child app users at every turn,” Ed Markey, a Democratic senator from Massachusetts and a crucial figure in COPPA’s passage 20 years ago, said in a statement. He and other lawmakers introduced legislation last year designed to update the law.
TikTok accounts are public by default; other people can see the content users post unless they adjust their privacy settings. But the FTC complaint alleges that even if users set their profiles to private, others could still message them. For years, a number of media outlets have reported that underage users were being solicited to send nude images on Musical.ly, and later on TikTok as well.
TikTok says it has now created a separate app experience for users under 13, which does not permit them to share personal information. It doesn’t allow uploading videos, commenting on others’ videos, messaging other users, or maintaining a profile or followers. In short, TikTok will now only allow young kids to consume content—not share it. Starting Wednesday, both new and existing TikTok users will be required to verify their birthday. They will then be directed to the kid-friendly portion of the app if they say they’re under 13. The company also launched a new tutorial series emphasizing privacy and security on its platform.
Two-tiered “mixed audience” systems like what TikTok is implementing were first permitted via an amendment made to COPPA in 2012, according to Dona Fraser, the director of the Children’s Advertising Review Unit of the Council of Better Business Bureaus. The FTC credited the unit for bringing attention to the TikTok case Wednesday. “It’s a great way to comply,” she says. “You create two products in one. [Children] get one environment that doesn’t include all the bells and whistles that will trigger COPPA.”
The change will likely have a large impact on TikTok’s community, where very young users have played a significant role since the beginning. Some of the platform’s largest stars, like Lauren Godwin—who has 12.3 million fans—have sung “duets” with kids who appear to be under 13. It’s not yet clear what the platform will do about these videos, which feature underage users but are not shared directly on their own profiles. A spokesperson for TikTok said some of the implementation details of the new system are still being finalized.
While the FTC voted 5–0 to approve the consent decree, some FTC officials believe TikTok should be required to do more than pay a fine. “FTC investigations typically focus on individual accountability only in certain circumstances—and the effect has been that individuals at large companies have often avoided scrutiny,” commissioners Rohit Chopra and Rebecca Kelly Slaughter wrote in a joint statement. “We should move away from this approach. Executives of big companies who call the shots as companies break the law should be held accountable.”