Elsagate: Revelations about Alphabet inc.

73  2017-11-18 by [deleted]

[deleted]

17 comments

Well, we know Alphabet is affiliated with the CIA, and we know the CIA runs all kinds of illegal schemes to make secret money (drugs, organs, human trafficking, etc.)... maybe Alphabet is the one making the videos? Or at least in on it in some way?

My hunch is the CIA and Alphabet collect the data from Elsagate and things like it: who posts, who watches, who those people are, and how that info can further their agenda.

It is a circle: bots are used and kids don't buy anything, so no wealth is being created. It is money laundering!

Fuck, how is it so obvious and I only just saw it!?

But realize Google doesn't give two shits about buyers; they make money from the ad when you click it. My son clicks ads on toothbrush videos on my phone a thousand times a day, and Google profits every time.

The company that placed that ad spent money on someone who will never buy the product (my son is 2).

One could also theorize that Google prefers ad clickers who don't buy, even adults. If I'm interested in hiking boots and click an ad but then don't buy the product, I'll keep looking and clicking more ads, generating more revenue for Google.

That’s what I mean. When it is obvious that the ad clicks are bots, someone is either being blatantly ripped off (which should make the ads far less valuable) or money is being cycled.

The ad companies are funneling the money to alphabet through "clicks." I think that's what they're getting at.

I didn't see it that way. Don't know if that's what's actually happening but that would be a workable way to launder money.

1.) How do they get so many views?

2.) Why is it such a bad problem when this kind of media has been around since Newgrounds.com?

What are you getting at?

I don't understand the hype behind it.

The content has ads... sometimes many per video. They are monetized.

I’ve been theorizing with a friend along the same lines. YouTube is making a killing with this.

Honestly, I think there's a pervert problem, and I think there's an us problem. Think about it: every time there's a post here regarding this shit, people click the links. Earlier there was a post that hit the front page about YouTube videos uploaded by kids and full of comments from predators. I'm not logged in, but some of the suggested videos are videos I've seen recently posted here as a problem.

So basically, here and anywhere else where people post links to videos that are questionable, you have people clicking links, and thereby linking what they click to previous links they've clicked. The more that people concerned about elsagate click other links to other weird shit, the more the algorithm will link all this weird shit.

What I think people are missing is that the videos (for the most part) aren't inherently sexual; they're all just weird. What makes it a problem is the comments.

There's a planet with over 7 billion people on it, and enough of them are perverts with enough time to watch weird-ass videos about Spiderman and Elsa. We know the videos get views because of autoplay, but the more people search for those videos, and then for other videos linked to actual kids, the more it changes the algorithm.

I hope someone is following what I'm saying at this point. Basically, the people concerned about this stuff in the first place are contributing to which videos get suggested alongside the videos you've watched. Clearing your search history doesn't change what the algorithm uses: your device is given an ID, and although that ID is scrubbed of your personal data, YouTube is still tracking what that ID does.

Simply put, the more people are alerted to all this weirdness, the more the videos they watch get linked to any other weirdness they watch. And in the end the perverts win, because YouTube feeds them a selection of the shit they're into, and we aren't actually helping.

Rather than reporting the videos, people should be reporting the accounts behind the comments that are obviously from perverts. And I mean to the FBI. Those people probably have hundreds of ways they're trying to connect with others like them, and cutting off those channels is what actually effects change.

Completely agree that the largest and most terrifying revelation is that they have to know. They know the subscribers. They know who watches. And what are they doing? Why are they not doing anything?

This is almost exactly what I was thinking. They didn't seem to have any problem being vigilant, with strict rules for so many other genres of video, so how could this one have gone untouched for so long? They're only doing something about it now that awareness of what's happening is starting to gain serious momentum.

There is a much easier way to sort these kinds of accounts FROM THE JUMP:

Cross-referencing.

YouTube can cross-reference Facebook, Gmail, Twitter, and Instagram.

Does the email have a verified Facebook account? What was the last activity on Facebook? Is their Twitter email verified?

So, if you start coming up with a bunch of no's...

You've cut a large majority of the users out.
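The cross-referencing idea above can be sketched as a simple scoring rule. This is purely illustrative: YouTube's internal tooling isn't public, so the signal names here (`has_verified_facebook`, etc.) are hypothetical stand-ins for the checks the comment lists.

```python
# Hypothetical sketch of the "count the no's" heuristic described above.
# None of these signals are real YouTube API fields.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    has_verified_facebook: bool
    recent_facebook_activity: bool
    twitter_email_verified: bool

def looks_suspect(signals: AccountSignals, max_nos: int = 2) -> bool:
    """Flag an account once it fails most of the cross-reference checks."""
    nos = sum(not flag for flag in (
        signals.has_verified_facebook,
        signals.recent_facebook_activity,
        signals.twitter_email_verified,
    ))
    return nos >= max_nos

# A throwaway account with no linked presence fails every check:
print(looks_suspect(AccountSignals(False, False, False)))  # True
```

The point of the sketch is just that "a bunch of no's" is cheap to compute: each check is a boolean, and the threshold decides how aggressive the filter is.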

Now: You can begin to sort them by title.

A lot of them have the acronym "IRL" in the title (normalizing chat-room references and vocabulary). Other telltale words appear as well.

You can also filter out videos whose thumbnails fall within certain RGB ranges, because a lot of these thumbnails use large areas of a single solid bright color.
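The title and thumbnail filters described above could look something like the following. This is a made-up sketch, not YouTube's actual pipeline: the term list and the dominance/brightness thresholds are invented examples of the heuristic.

```python
# Illustrative filters: flag a video if its title contains a known
# chat-room term, or if one bright color dominates its thumbnail.
from collections import Counter

FLAGGED_TERMS = {"irl"}  # plus whatever other words moderators collect

def title_is_suspect(title: str) -> bool:
    return any(term in title.lower().split() for term in FLAGGED_TERMS)

def thumbnail_is_suspect(pixels, dominance=0.5, brightness=200):
    """pixels: iterable of (r, g, b) tuples. Suspect if a single bright
    color covers more than `dominance` of the image."""
    counts = Counter(pixels)
    (r, g, b), n = counts.most_common(1)[0]
    bright = max(r, g, b) >= brightness
    return bright and n / sum(counts.values()) > dominance

# A thumbnail that is 80% hot pink trips the color check:
pink_heavy = [(255, 0, 255)] * 80 + [(10, 10, 10)] * 20
print(thumbnail_is_suspect(pink_heavy))  # True
print(title_is_suspect("Elsa and Spiderman IRL prank"))  # True
```

Real thumbnails would need decoding and downsampling first, but the core check, counting how much of the image one saturated color occupies, is this simple.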

A change to YouTube's algorithm isn't shit. It would not affect them whatsoever to clamp down on these videos.

They just refuse.

You would think Disney is also aware of how its property is being abused and monetized, and yet they've done nothing to stop it.