So yesterday, tragedy struck the YouTube headquarters in San Bruno, CA. A woman opened fire at YouTube’s main work campus, forcing hundreds of employees and managers to either flee the building or take cover under their desks. But who would do this? YouTube shooting: About the suspect.
So the suspect’s name was Nasim Aghdam. She was 39 years old and from San Diego, CA. I say ‘was’ because at the end of her shooting spree, she turned the gun on herself. Furthermore, we learned Aghdam was a YouTuber. In fact, she had over 10,000 subscribers at one time. She used the name ‘Nasime Sabz’. On her channel, she did everything from Middle Eastern dancing and speaking to crusading for animal rights. She also promoted her vegan diet and lifestyle.
But according to reports, YouTube recently stopped paying for her videos, which deeply cut into her primary income. Aghdam also complained YouTube added age restrictions and other filters, and it angered her when those filters brought her viewership down. Aghdam’s father saw how upset she was. That’s why he got so worried when she went missing. In fact, the day before this tragedy, her father warned police she might go to San Bruno with bad intentions. Nasim Aghdam’s page is no more; YouTube deleted it immediately following the shooting. This tragedy hit the whole IT service and IT support community.
The YouTube shooting rampage injured three people. One is in critical condition, one in serious, and one in fair condition. YouTube and their parent company, Google, assure us they’re cooperating with authorities and comforting those directly affected and their loved ones. This brings up YouTube headquarters’ security. In fact, a CBS journalist said that despite having to get a badge, he still just walked through without passing a metal detector. But what about the suspect? We should never blame YouTube or Google for this heinous act. Nasim Aghdam is to blame. She is the only one who deserves blame. The fact she took her own life rather than face justice or the victims is the sign of a sociopathic coward. I do feel pity for Aghdam’s father, but this cowardly, murderous act forever tarnished whatever legacy Nasim Aghdam tried to leave behind. Am I being too harsh here?
So yesterday, this country endured another mass shooting. This one took place at a high school in Parkland, Florida, a suburban town just west of Fort Lauderdale. Now, 17 are dead and 15 are still fighting for their lives. Social media reveals the Florida shooter’s mind.
On Instagram, there are photos of the Florida shooter, Nikolas Cruz, posing with guns and knives. Between 2015 and 2016, he also posted pictures and ads of weapons he wanted to buy. One such weapon was the Maverick 88 Slug. He even asked his followers for advice about gun costs and passing background checks. Then there are the knife pictures. In January 2016 alone, Cruz posted countless Instagram photos of himself holding knives like trophies.
I also noticed how in many of these photos, Cruz covered his face with a mask or bandanna. In fact, in most of his pictures, he covered his face. He posted photos of dead animals. When anybody talks about killing animals, or posts pictures of dead animals, that’s a huge sign of trouble. In fact, that’s how many serial killers get started. Furthermore, he commented on YouTube videos about killing people. Several months back, on an Antifa YouTube video, he commented, “…I wish to kill as many of you as I can”. Then there was a YouTube video about the 1966 sniper shooting at UT-Austin. He commented, “I’m going to do what [the shooter] did”.
IT support may have dropped the ball, but Cruz’s classmates and peers saw this coming. One tweeted, “Everyone predicted it”. Then, another student revealed Cruz always had guns on him. Even the mayor of Parkland confirmed Cruz was a mental health patient who underwent treatment. So IT service wasn’t the only one that dropped the ball here; the mental health facility did too. They should have kept him there. I’m not one for extreme gun control, but how did a boy in his late teens get access to so many guns and knives? Furthermore, when he made these disturbing, perverse posts, why didn’t anybody intervene? It seems the only ones hip to what this guy was about were his fellow teens. The adults in Cruz’s life seemed asleep at the wheel. Social media revealed what was on the Florida shooter’s mind. Why didn’t anybody listen?
We at Geek choice want to extend our thoughts, prayers, and condolences to all affected by this tragic shooting.
So yesterday, we talked about Logan Paul’s ad suspension over bad YouTube/Twitter behavior. But it seems YouTube isn’t stopping there, because they now have a new policy about what is good and what isn’t. YouTube’s new policy: Good rules or censorship?
So now, YouTube is ready to put suspensions and sanctions on any YouTuber who posts videos they call harmful to the YouTube community, whether that be viewers, vloggers, or advertisers. However, this may not mean outright banning. It could mean blocking sponsors. It could also mean keeping videos off the home page or trending tab. I guess it depends on how severe the offense is.
They are using humans to monitor and enforce these rules, but they’re also using AI/robot technology. What the AI will do is track what vloggers are posting. One spokesperson said that when somebody uploads something blatantly cruel, it can hurt a lot of people. He listed things like heinous pranks, promoting violence and hatred, and acts of cruelty. He says the damage can create real-life consequences. But he kinda assured us this isn’t about censorship. He said, let’s make sure a few don’t impact the 99.9 percent who are doing right.
I gotta admit, that’s great talk when it comes to YouTube’s new policy. But is it the truth or spin? What will they consider inappropriate, or hate speech? The rules may have been a little clearer a few decades ago, or even a few years ago. But this is 2018, a time when everybody seems to be offended by everything, one way or another. Look at somebody wrong and you’re probably going to offend them. That’s playing out in the IT support and computer servicing field as well. The other problem is the whole AI technology thing. So a robot is going to suspend somebody because their vlog was offensive? How will that work? YouTube’s new policy: Good rules or censorship?
So last month, popular YouTube star Logan Paul found himself in trouble, because a January 2018 video he posted showed a real dead body. He apologized and removed the video, but apparently, he didn’t learn much of a lesson. YouTube suspends Logan Paul…again.
So since that incident a month ago, Logan Paul’s antics have been just as inappropriate. Paul’s worst hits include tasering a rat, encouraging eating Tide Pods, and other violations. YouTube put its foot down. Today, they tweeted, “In response to Logan Paul’s recent pattern of behavior, we’ve temporarily suspended ads on his channels”. He can still post YouTube videos. However, he can’t make money from them or accept sponsorships from advertisers.
Here is what is interesting about this: the Tide Pod incident happened on Twitter, not YouTube. Paul tweeted he would swallow one Tide Pod for every retweet. I’m guessing Paul didn’t actually do this, because he’s still alive and well today. He has since deleted the tweets, but they were enough for YouTube to take notice. Yes, YouTube humans as well as machines; that’s another subject matter right there. YouTube is within its rights to either suspend advertisers or forbid the uploader from releasing YouTube videos altogether. If Paul doesn’t quit this, could that be next for him?
Now let’s bring up censorship. Like in many areas of life, that’s a big issue for IT support and computer service. Personally, I think YouTube made the right call in this situation. The Tide Pod challenge is already a disturbing and dangerous online trend. And for Logan Paul to encourage this behavior among young people is just as disturbing. It wasn’t on YouTube, but this man has over 20 million subscribers. Think about the influence Logan Paul has. Some may say, “If they censor him, they can censor anybody for anything!”. That’s a valid point. And I hate censorship, too. But with freedom comes responsibility. Obviously, this is a lesson Logan Paul hasn’t learned. Do you think this suspension is a good call? Or is it censorship?
So IT support can be a peculiar thing. Just three days into the new year, and YouTube is already in hot water. This is because of Logan Paul, a YouTube star with over 15 million subscribers. He’s also part of YouTube Red’s subscription service. Will he survive YouTube’s first 2018 scandal?
Paul put up a video of Japan’s ‘suicide forest’. However, this video included a suicide victim. I like to think this was just a lapse in judgment on Paul’s part and he didn’t realize the victim was there. He also took the video down, but not soon enough. Yes, he took it down 24 hours after he put it up. However, 24 hours was more than enough time for six million people to watch it. You guessed it; the outrage was intense and furious.
So what about YouTube? After all, this is YouTube’s first 2018 scandal. Somehow, this gory video passed moderation. Keep in mind this moderation forbids violent and gory real-life content that intends to shock and/or disrespect. To add insult to injury, many of Paul’s fans are minors. We don’t know if YouTube will take disciplinary action against Logan Paul’s channel. However, this appears to be his first strike. If Paul, or anybody, gets three strikes in a 90-day period, YouTube suspends their account. Logan Paul apologized profusely for this incident.
But I think this is YouTube’s fault just as much as Paul’s, maybe even a little bit more. This proves waiting 24 hours to delete a gory video is 23 hours too late. Also, consider how popular YouTube is getting. In my computer service shop, and in other places, many say they’re canceling cable and replacing it with YouTube’s new $35.00-a-month service. With this, one can watch their favorite shows and sports live, just like on cable TV. Furthermore, YouTube already has over a billion users watching billions of hours a day. This is a recipe for the wild, wild west of social media. YouTube promises AI moderation; goodness knows they need it all. What can be done to prevent the next YouTube scandal?
I love YouTube. In our computer service shop, sometimes that’s all I watch. But lately, many complain of inappropriate videos uploaded to YouTube, especially when it comes to children. So YouTube is doing something about it. YouTube increases moderation.
So according to YouTube CEO Susan Wojcicki, they’re hiring over 10,000 moderators in the near future. This is after a series of complaints that are now turning into outright scandals, including complaints about content aimed at kids. For instance, say you take your kids to a rated-G movie. Then you hear F-bombs and see blood and nudity, as do your kids, in this rated-G movie. Imagine the outrage.
Former YouTube show Toy Freaks found that out the hard way. YouTube canceled its channel after many complained the young daughter was put in inappropriate situations. They’re not the only ones feeling this kind of backlash. Last year, YouTube put in place new rules enforcing stricter policies against channels that aim blatantly inappropriate content at kids. In fact, they canceled thousands of videos and cut off advertising for over 50,000 channels. They already upped the policies, but apparently, they need more enforcement. That’s where the extra 10,000 moderators come in.
Sure, they have IT support tools like algorithms to moderate bad content. But let’s face it, human beings are the best way to regulate human beings. Personally, I’d like to see a ratings system like they have on movies and TV shows. Then the moderators can determine what is good for which age group. What if a kids’ show has something inappropriate for small children? Then a moderator can put a PG-13-like rating on the video and explain why the video is bad for small children. I have my reservations about canceling channels altogether. Just put a rating on them, then let the parents decide if it’s good for their kids or not. I totally understand parents want to keep their kids safe and innocent. YouTube increases moderation…can this lead to censorship?