
[Video: Shootings at New Zealand Mosques Kill at Least 49 People. Prime Minister Jacinda Ardern identified the shootings as a terrorist attack.]

Facebook said it struggled to identify the video of the New Zealand mosque shootings because the gunman used a head-mounted camera, which made it harder for its systems to automatically detect the nature of the video.

"This was a first-person shooter video, one where we have someone using a GoPro helmet with a camera focused from their perspective of shooting," Neil Potts, Facebook's public policy director, told British lawmakers Wednesday.

Terror footage from a first-person perspective "was a type of video we had not seen before," he added. Because of the nature of the video, Facebook's artificial intelligence, which is used to detect and prioritize videos likely to contain suicidal or harmful acts, did not work.

Potts was giving evidence Wednesday to a committee of senior lawmakers in the U.K. as part of a parliamentary inquiry into hate crime. Representatives for Twitter and Alphabet's Google and YouTube also gave evidence.

Social media platforms such as Facebook have been facing scrutiny after the shooter accused of killing dozens of people in two mosques in New Zealand live-streamed the murders over the internet. The social media company came under sharp criticism for not taking the video down fast enough and for letting it be circulated and uploaded to other platforms like YouTube.

At congressional hearings in the U.S. over the past two years, executives from Facebook and YouTube said they were investing heavily in artificial intelligence that would be able to find and block violent and graphic videos before anyone saw them.
In a blog post following the attack, Facebook said its AI systems rely on many thousands of examples of content to train a system to detect certain types of text, imagery, or video.

Potts was also chastised by the committee's chair, the Labour party's Yvette Cooper, for not knowing the senior officer in charge of counterterrorism policing in the U.K., Neil Basu.

"We've been told by the counter terrorism chief that social companies don't report to the police incidents that clearly are breaking the law," Cooper told Potts. "You may remove it, but you don't report it."

Potts responded that he was "not familiar with the person you mentioned, or his statement," and later apologized for not knowing him. He said, however, that Facebook doesn't report all crimes to police but does report "imminent threats."

"These are places where government could be giving us more guidance," Potts said.

The committee investigating hate crime is separate from the one that recently recommended the British government take tougher measures to keep technology companies like Facebook in check, following a year-long inquiry into fake news and its impact on elections.

Stephen Doughty, a Labour party lawmaker, directed broad and strongly worded criticism at all three witnesses.

"Your systems are simply not working and quite frankly it's a cesspit," he said, referring to the platforms' collective content. "It feels like your companies don't give a damn. You give a lot of rhetoric but you don't take action."

Marco Pancini, director of public policy for YouTube, responded that "we need to do a better job and we are doing a better job," adding that since an earlier hearing "we introduced a team that helps us better understand trends of violations of our policies by far-right organisations."

"That's all wonderful but they're clearly not doing a very good job," Doughty replied.
