COPPA and YouTube - Is this a good decision by YouTube?

Paladin

On September 4, 2019, Google paid the US Federal Trade Commission (FTC) a $170 million fine over violations of COPPA, the Children's Online Privacy Protection Act, which states that websites may not collect personal information from anyone under 13 without explicit parental consent. But the FTC didn't stop at the fine; the settlement also required YouTube to change how it handles children's content. Two months later, on November 13, YouTube made a massive change.

If you upload a video, you are now required to mark it as "For Kids" or "Not For Kids."

If a video is marked For Kids, you lose up to 90% of your revenue, along with comments, end screens, notifications, and recommendations from YouTube's algorithm. So your video is essentially dead in the water.

If a video is marked Not For Kids, you retain all features. However, and here's the big part: if the FTC manually reviews your video and determines it is for kids when you've marked it Not For Kids, you, the content creator, may be fined up to $42,000 for each video (!!!)

Basically, Fortnite videos will lose all their comments, and anything that looks even vaguely child-directed will miss out on key features starting January 1, 2020.
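As an aside for anyone who manages uploads through scripts rather than YouTube Studio: the same designation is also exposed in the YouTube Data API v3 as the status.selfDeclaredMadeForKids flag. Below is a minimal Python sketch of flipping it on an existing video; it assumes you already have an authorized OAuth client in a creds variable, and VIDEO_ID is a placeholder.

# Minimal sketch (Python, google-api-python-client): setting the
# For Kids / Not For Kids designation via the YouTube Data API v3.
# Assumes `creds` is an already-authorized OAuth2 credential with the
# https://www.googleapis.com/auth/youtube scope; VIDEO_ID is a placeholder.
from googleapiclient.discovery import build

VIDEO_ID = "YOUR_VIDEO_ID"
youtube = build("youtube", "v3", credentials=creds)

# Fetch the current status first: videos.update overwrites every mutable
# field in the parts you send (privacyStatus, license, etc.).
video = youtube.videos().list(part="status", id=VIDEO_ID).execute()["items"][0]

# False = "Not For Kids", True = "For Kids".
video["status"]["selfDeclaredMadeForKids"] = False

updated = youtube.videos().update(part="status", body=video).execute()
print(updated["status"].get("selfDeclaredMadeForKids"))

In YouTube Studio itself this is just the "Audience" setting on each video, and there is also a channel-wide default you can set once.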

This is a very big decision, and one that is likely to result in a large number of people quitting YouTube.

xenonVirus. Let's discuss it. Do you agree with this or not? Why or why not? I'm interested to hear your opinions.
 
People are running around screaming that YouTube is on fire; I wonder how big of an issue this will be.
 

It is a pretty big issue, because if you set a video as Not For Kids and someone from the FTC thinks that it's for kids, your video will be deleted and you can be fined $42,000 per video.

Setting a video as For Kids disables comments, notifications, personalized ads, and end screens.

xenonVirus. I'm going to be honest. This is a pretty big issue.
 
If this was YouTube themselves investigating channels, I would be more concerned because of how they rely on the almighty algorithms, but the FTC, being who they are, will do this using human beings with more common sense. I'm not making light of people's concerns, but I don't think this is gonna be that big of a deal. YouTube channels that create content specifically for kids are gonna have it a bit rough in terms of monetization, but overall, gaming content, unless created specifically for kids, will mostly remain unaffected by this.
 

I guess an easy solution is to find a clip that is "not for kids" and just use it in all your videos.
 
I am sick and tired of those picture-match verification checks, and with this chore of having to avoid fines by manually marking every video as for kids or not, when there could be hundreds on a channel, I can see a lot of people leaving YouTube behind. I understand why these measures are in place, but they're still pretty tedious, especially when you use proxies.

I also hate receiving all these security alerts when I log in from a new location or device, but again, I know they're important in case someone else is using your account. I especially hate this feature on Facebook, where you're not doing anything wrong, then they suddenly disable your account and ask you to upload a photo to be able to proceed. It's not instant, either; you have to wait on somebody manually checking you over, and if it's not up to their standards, then it's goodbye to that profile. It's so annoying. There's also another block in place where you can only verify a mobile number a few times before you need a new one.
 

That's never happened to me (luckily), but why would Facebook do that when there are easier ways to verify that it's really you, like a phone number or an email account?

xenonVirus. I understand Facebook is paranoid about security, but I think they're taking this way too far.
 
I think it mostly occurs if you make a new account with the same IP address, but also if you access Facebook from a new location, like if you have a VPN enabled. It must think you're a bot or something. I don't know if this still happens, but there used to be profiles that sent friend requests through your contacts and tried to fool unsuspecting users into handing over personal information, but I don't think they were even real profiles, just fake ones with sexy-looking women. That's what used to happen on forums too, before CAPTCHA stopped it: bots can't auto-register when only a human user can complete the verification step.
 
