zlacker

[parent] [thread] 10 comments
1. restin+(OP)[view] [source] 2019-06-14 18:24:16
I don't think this is incentivizing bad behavior. It's merely showing the viewer more of what they are already watching, with a gradual introduction to broader material. The example of youtube serving content to "pedophiles" is borderline asinine. The neural network is just making suggestions on viewing; it's not telling people what to watch. As for the complaint that "adult" content is being served to adolescents, there is an option to filter out sensitive content altogether.

Also, as a parent of 4 children myself, the idea of letting my kids loose on the internet without any supervision is ridiculous. When did it become youtube's responsibility to parent the children in its audience? Should we also ban HBO, Amazon, and Netflix from providing recommendations because there might be a child in front of the screen?

This is just another pointed attempt to censor free speech via pressure on technology companies, the idea being that a platform will become more restrictive if it is constantly badgered about its content.

replies(3): >>maskli+04 >>hrktb+Ne >>SomeOl+5Q
2. maskli+04[view] [source] 2019-06-14 18:50:01
>>restin+(OP)
> with a gradual introduction to broader material.

It doesn't gradually introduce broader material, it gradually introduces more "engaging" material.

replies(1): >>restin+J4
3. restin+J4[view] [source] [discussion] 2019-06-14 18:54:52
>>maskli+04
I would argue that your point is a matter of semantics, but even so, you still have a choice of whether or not to watch the recommended, more "engaging" material. It doesn't change the overall point of my statement.
replies(1): >>tvanan+8a
4. tvanan+8a[view] [source] [discussion] 2019-06-14 19:33:10
>>restin+J4
I'd say it's quite a different point. My own experience has been that the recommended "engaging" material is something in the same genre as whatever I just saw, but with a clickbaitier title, flashier thumbnail, and overall poorer informational quality. It's the difference between saying "I see you enjoy sandwiches, maybe you would also enjoy salads or a plate of sushi" and "I see you enjoy sandwiches--here's a candy bar, an off-brand soda made with high-fructose corn syrup, and a carton of cheap grocery store ice cream."
replies(1): >>restin+hg
5. hrktb+Ne[view] [source] 2019-06-14 20:11:15
>>restin+(OP)
> just making suggestions on viewing, it’s not telling people to watch

I’m not sure I get the difference between suggesting content and telling people what content to watch. Were you trying to make a different point?

That aside, it seems your argument is that youtube being neutral in recommending videos shelters them from blame, while the article is basically about why being neutral is harmful.

I personally think anything dealing with human content can’t be left neutral, as we need a bias towards positivity. Just as we don’t allow generic tools to kill and save people in equal proportion, we want a clear net positive.

replies(1): >>restin+kh
6. restin+hg[view] [source] [discussion] 2019-06-14 20:24:02
>>tvanan+8a
The semantics argument I was pointing out was in regards to "broader" vs "engaging". That's not what my statement was about; it was that no matter what the algorithm recommends to you, you still have the choice of whether or not to watch it. The point you are making is purely anecdotal, as I assure you the neural network is not simply showing you

>same genre as whatever I just saw, but with a clickbaitier title, flashier thumbnail, and overall poorer informational quality

replies(1): >>Faark+kK
7. restin+kh[view] [source] [discussion] 2019-06-14 20:31:36
>>hrktb+Ne
To make my first point clear, here is a scenario:

I walk up to you on the street and suggest you give me a dollar.

vs

I walk up to you on the street and take a dollar from you by force.

Youtube is a platform; in order to remain a platform it MUST remain neutral. You cannot have an open forum with bias. There are certain mutually agreed upon rules (no nudity, extreme violence, etc.), and those limitations are more than enough to handle the vast majority of "negative" content.

I wholeheartedly disagree that we need a bias towards positivity. Who determines that definition? Something you see as negative, I might happen to enjoy. If Youtube begins to censor itself in that way, it is no longer a platform and is now responsible for ALL of its content.

replies(1): >>hrktb+ck
8. hrktb+ck[view] [source] [discussion] 2019-06-14 20:56:19
>>restin+kh
Thanks for the clarification on the first point. Won’t youtube effectively shove the next recommended video to a user as long as auto-play is activated?

Also, since they are the default view, I’d argue suggestions are a lot more than just “suggestions”. It would be akin to a restaurant “suggesting” their menu, where you’d need to interrogate the waiter to explore what else you could be served. For most people the menu is effectively the representation of the restaurant’s food.

For the neutrality, if you recognize there are agreed upon rules, as you point out, the next question becomes who agreed on these rules, and who made them?

Who agreed nudity should be banned? Which country? What counts as nudity? What about art? Educational content? Documentaries? At which point does it become nudity? The more we dig into it, the fuzzier it becomes; everyone’s boundary is different, and all the rules are like that.

Any rule in place is positive to one group and negative to another; for a rule to stay in place it needs to have more supporters than detractors, or, to put it another way, more positive impact than negative.

The current set of rules are the ones that were deemed worthwhile. I think it’s healthy to challenge them, or to push for other rules that could garner enough agreement to stay in place.

replies(1): >>restin+tl
9. restin+tl[view] [source] [discussion] 2019-06-14 21:09:48
>>hrktb+ck
> Won’t youtube effectively shove the next recommended video to a user as long as auto-play is activated?

You can very easily turn auto-play off. There is plenty of opportunity to switch videos. It would be different if youtube forced you to watch the next video in order to use the site.

>For the neutrality, if you recognize there are agreed upon rules, as you point out, the next question becomes who agreed on these rules, and who made them?

Youtube made them. Those are pre-conditions for uploading videos. They don't have to give any reason for making them; those are conditions that must be met in order to upload a video, so by uploading a video you are agreeing to them.

>Any rule in place is positive to a group and negative to another

I don't agree with this generality. However, this discussion is not about the legitimacy of the rules to use youtube; it is about whether or not youtube should censor videos that meet the basic rules of use. My opinion is no; yours, as you stated above, was:

>I personally think anything dealing with human content can’t be left neutral, as we need a bias towards positivity.

I agree with you that Youtube should routinely challenge their own rule sets. That is not the same as censoring their content, or in this case modifying their recommendation algorithm.

10. Faark+kK[view] [source] [discussion] 2019-06-15 03:22:22
>>restin+hg
You can keep telling yourself that you have a "choice", but in the end we are all just humans, with quite predictable behavior. Biased selection of content has always been one of the most effective ways of shaping opinion; politics fights hard on that front for a reason. For the first time ever, a very few algorithms are selecting content for millions of people, with apparently little human oversight. Yes, this should worry us. Simply assuming their results will benefit mankind, especially in the long term, would be foolish. It's not quite the usual AI-safety paperclip scenario, but by now it should be very obvious that optimizing watch time, even with current "AI", comes with significant unintended side effects and drawbacks.
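To illustrate the drift I mean, here is a toy simulation of greedy watch-time optimization. Everything in it is an illustrative assumption (the made-up `predicted_watch_time` model, the "extremeness" scale, the candidate set), not how YouTube's recommender actually works:

```python
# Toy model: a greedy recommender that always picks the candidate with the
# highest predicted watch time. We assume (for illustration only) that
# predicted watch time rises with how sensational ("extreme") an item is.

def predicted_watch_time(extremeness: float) -> float:
    """Hypothetical model: more sensational content holds attention longer."""
    return 1.0 + 2.0 * extremeness

def recommend(candidates: list[float]) -> float:
    """Greedy choice: maximize predicted watch time over the candidates."""
    return max(candidates, key=predicted_watch_time)

# Start from mild content (extremeness 0.1) and recommend five times.
current = 0.1
history = [current]
for _ in range(5):
    # Candidates cluster around what was just watched: the same item, a
    # slightly milder one, and a slightly more extreme one (capped at 1.0).
    candidates = [current, current * 0.9, min(1.0, current + 0.2)]
    current = recommend(candidates)
    history.append(current)

print(history)  # extremeness ratchets upward at every step
```

Because the optimizer never prefers the milder candidate, the sequence drifts monotonically toward the extreme end of the scale, which is the "unintended side effect" of optimizing a single engagement metric.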
11. SomeOl+5Q[view] [source] 2019-06-15 05:56:42
>>restin+(OP)
The broader material is the problem. It’s not a natural way of using recommendations: it’s just an ad at that point.