T&S Early Warning News
Get ahead of new stories that are impacting the T&S industry.
Australian prime minister loses control of WeChat
ABC News | Jan 24, 2022
Company Listed: WeChat
Australian Prime Minister Scott Morrison has lost control of his account on the Chinese-owned social media platform WeChat and a lawmaker on Monday accused China's leaders of political interference.
Morrison’s 76,000 WeChat followers were notified his page had been renamed “Australian Chinese new life” earlier this month and his photograph had been removed, Sydney’s The Daily Telegraph newspaper reported. The changes were made without the government’s knowledge, the report said.
Morrison’s office declined to comment on the report.
In response to a question from The Associated Press, WeChat's parent company Tencent said there was “no evidence of any hacking or third-party intrusion” related to Morrison's account.
“Based on our information, this appears to be a dispute over account ownership,” the company said.
In accordance with Chinese regulations, Morrison’s public account was registered to a Chinese citizen and was later transferred to its current operator, the company said. It identified the present owner of the account only as a “technology services company,” adding that it would “continue to look into this matter further.”
When Your Doctor Is on TikTok
The Atlantic | Jan 23, 2022
Company Listed: TikTok
In the second week of March 2020, uncertainty ruled TikTok. Students shared clips of school PA systems announcing closures and cancellations. Travelers filmed their frantic efforts to return to the U.S. before President Donald Trump’s border restrictions went into effect. And yet many users speculated that warnings of a life-reordering pandemic were overblown. Comment sections seemed angsty, but conspiracies abounded, hinting at the diverging versions of reality that lay ahead.
As the first wave of coronavirus shutdowns began, an epidemiologist named Katrine Wallace joined TikTok to combat her boredom and isolation. Wallace, a professor at the University of Illinois at Chicago, watched as the viral dances and humor of “quarantine” took over the app, but she told me that she didn’t consider making any kind of video herself until the app’s algorithm began showing her coronavirus conspiracy theories.
“At that point in time, it was, like, the conspiracy that the disease wasn’t even real, like it was all something to throw the election, that everything was miscoded, that it was really the flu and we were pretending it was something else,” she told me. “Naively, I thought to myself, Well, I’m just gonna fix this on this app.”
Is Virtual Reality Safe for Kids?
VRTimes | Jan 23, 2022
Company Listed: Social Media
Virtual reality hardware is finally coming into the mainstream, and children are increasingly coming into contact with the experience. With VR being one of the coolest new technologies at the moment, parents are also buying virtual reality headsets as gifts for their kids. Just like other technologies, including the personal computer and the smartphone, VR presents both dangers and opportunities for children.
However, most adult buyers may not be aware that virtual reality headsets have been built primarily for adults and adult head sizes, and that most VR headset manufacturers actually prohibit their use by minors.
Many virtual reality headsets, including the Meta Quest 2, do not offer parental controls that would allow a responsible adult to block 18+ content as well as other types of content that may be harmful to kids.
Meta has set a minimum age of 13 for the use of its VR hardware and services, in accordance with global child protection regulations as well as COPPA, the U.S. law that protects the online privacy of children under the age of 13.
However, if you have tried some of the popular virtual reality hangouts such as VRChat, Rec Room, and many of the other online games, you will notice that this age restriction does not necessarily prevent kids from playing in virtual reality.
Blame Facebook? Black doctor’s lawsuit over New Brunswick ordeal takes aim at social media giant
Toronto Star | Jan 21, 2022
Company Listed: Facebook Meta
A Black doctor who says he was forced to leave New Brunswick after being publicly and wrongfully identified as the source of a COVID-19 outbreak has filed his lawsuit against the province and the RCMP.
But there was an unexpected, additional defendant included Thursday in Dr. Jean-Robert Ngola’s statement of claim: Facebook.
With that, Ngola’s lawsuit has become the latest legal action poised to explore the culpability of a social media platform in the messages of its users.
But experts say holding Facebook responsible may prove an uphill battle.
In late May 2020, Ngola, a Campbellton doctor, travelled to Montreal on a family emergency to pick up his daughter and bring her to New Brunswick.
According to his statement of claim, he confirmed public health guidelines with authorities on both sides of the border and took pains to minimize his exposure to the coronavirus during his trip.
A few days after his return, one of his patients tested positive for COVID. Ngola returned a positive test the following day. He immediately went into quarantine.
On the same day, May 27, 2020, New Brunswick Premier Blaine Higgs held a news conference where, according to the statement of claim, he singled out Ngola — referring to him as “one irresponsible individual” — as the cause of the outbreak of several cases of COVID in Campbellton.
DCMS Committee Finds Online Safety Bill Gets the Balance Wrong
ISPReview | Jan 24, 2022
Company Listed: Facebook Meta, Twitter
A new cross-party UK report from the DCMS Select Committee has warned that the Government’s new Online Safety Bill (OSB), which will task Ofcom with tackling “harmful” content online, fails to get the balance right and “neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.”
At present, much of the online content you see is governed by a self-regulatory approach, which has often struggled to keep pace with rapid online change. Examples of “harmful” content range from ISIS terrorist propaganda and child abuse to state-sponsored propaganda from hostile countries, online bullying, racism and the spread of COVID-19 conspiracy theories. Some of this is already illegal.
The big social media firms (Facebook, Twitter etc.) usually catch up in tackling such problems, but they’re often perceived as too slow or unwilling to act unless pressured to do so, while other websites seem to exist solely to promote the worst of humanity. The OSB appears set to pressure all sites, from big social media firms to small blogs, to act or face the consequences (e.g. access blocked by ISPs, huge fines and making some people liable for what others may say on their website).
Mother of 11-year-old who died by suicide sues social media firms Meta and Snap
The Washington Post | Jan 22, 2022
Company Listed: Facebook Meta, Snapchat
The mother of an 11-year-old child, Selena Rodriguez, from Enfield, Conn., has filed a lawsuit against two social media giants, arguing that a lack of adequate safeguards led her daughter to take her own life in July 2021.
A wrongful-death lawsuit against Snap Inc., which runs Snapchat, and Meta Platforms Inc. — the parent company of Facebook and Instagram — was filed Thursday at the U.S. District Court for the Northern District of California, San Francisco Division, by the Social Media Victims Law Center (SMVLC), a Seattle-based legal advocacy group.
The lawsuit alleges that Selena Rodriguez’s suicide was “caused by the defective design, negligence and unreasonably dangerous features of their products,” the SMVLC said in a statement.
The statement also says the tech giants “knowingly and purposefully designed, manufactured, marketed, and sold social media products that were unreasonably dangerous because they were designed to be addictive to minor users.”
Neither Meta nor Snap immediately responded to a request for comment from The Washington Post but have previously said they work to protect minors using their platforms and continually build features to better protect young users.
T&S Policies & Regulations
Regulatory news and policy decisions impacting the T&S ecosystem.
Online safety bill ‘a missed opportunity’ to prevent child abuse, MPs warn
The Guardian | Jan 24, 2022
Company Listed: Social Media
The sharing of some of the most insidious images of child abuse will not be prevented by a new government bill that aims to make the internet a safer place, MPs have said.
The draft online safety bill is not clear or robust enough to tackle some forms of illegal and harmful content, according to a report by the digital, culture, media and sport (DCMS) committee. The landmark bill places a duty of care on tech firms to protect users from harmful content or face substantial fines imposed by the communications regulator Ofcom.
“In its current form what should be world-leading, landmark legislation instead represents a missed opportunity,” said Julian Knight, the chair of the DCMS committee. “The online safety bill neither protects freedom of expression, nor is it clear nor robust enough to tackle illegal and harmful online content. Urgency is required to ensure that some of the most pernicious forms of child sexual abuse do not evade detection because of a failure in the online safety law.”
The report urges the government to tackle types of content that are technically legal such as “breadcrumbing”, where child abusers leave digital signposts for fellow abusers to find abuse content, and deepfake pornography, which it says are not covered by the bill currently, although creators of deepfake images can be prosecuted for harassment. On child sexual abuse, the committee said the bill should tackle behaviour by predators designed to evade content moderation.
Online Safety and Media Regulation Bill Published - Major Changes to Irish Media Regulation Ahead
Lexology | Jan 21, 2022
Company Listed: Social Media
The Online Safety and Media Regulation Bill (the “Bill”) was published on Wednesday 12 January 2022. The Bill will transpose the revised Audiovisual Media Services Directive (“AVMS”) into Irish law and in doing so will establish a new Irish regulator of traditional and online-only media, the Media Commission, with new criminal and civil enforcement powers.
The full consequences of the new Media Commission regime will not be clear until the Media Commission issues the below-described Rules and Codes, and we hope that these will be subject to industry consultation. However, there will be more near-term consequences, including an obligation for all providers of audio-visual on-demand services in the State to register with the Media Commission within 3 months of its establishment.
Now that it has been published, the Bill will begin its passage through the two Houses of the Oireachtas, which could take up to a couple of months. Once the Bill is voted through the Oireachtas, it will become law and the Media Commission, including an Online Safety Commissioner, will be established.
Transposition of the Revised Audiovisual Media Services Directive and Establishment of a Media Commission
By transposing the AVMS Directive, the Bill will replace the Broadcasting Authority of Ireland with a multi-personal Media Commission with the responsibility to oversee updated regulations for broadcasting and video on-demand services. This will update the way in which television broadcasting services and video on-demand services are regulated, to ensure greater regulatory alignment between traditional linear TV and video on-demand services, such as RTÉ Player and Apple TV.
Who will regulate the regulators? Big Tech and their influence on government policy
National Post | Jan 20, 2022
Company Listed: Amazon, Google and Facebook Meta
Amazon, Google and Facebook are just some of the corporate giants that have taken over the tech space. While it seems like technology has come to regulate people’s everyday lives, the question now being asked is who gets to regulate the tech industry?
The assumption is that regulators are unbiased and qualified individuals who care deeply about consumer safety, but that may not always be the case. According to findings from a new pilot research project, there are often revolving doors between staff in the private and public sectors that can leave consumers vulnerable to what is known as “regulatory capture.”
“When public policy is enacted in the interest of private industry, rather than in the public interest, that’s regulatory capture,” states the website for the Regulatory Capture Lab, which is aimed at revealing how decision-making works in Canada.
Even as the federal government starts to establish guidelines surrounding tech, conflicts of interest and revolving private-to-public career cycles have led experts like Jim Balsillie, the founder of the Center for Digital Rights and a collaborator on the Regulatory Capture Lab, to question whether Canadians are top of mind in decision-making.
T&S Careers & Jobs
Recently posted T&S jobs, often from within the last 24 hours, from companies looking for top talent.
NewsBreak | Mountain View, CA, USA
Posted: Jan 24, 2022
5+ years of experience in Trust & Safety
Familiar with the overall T&S product, policy (specifically regarding news/entertainment/social media content), processes, KPIs, and overall management
Bachelor's degree in public policy, politics, law, economics, communications, behavioral sciences or a related field
Excellent verbal and written communication skills, with the ability to translate complex challenges into simple and clear language and persuade across multiple levels
Strong analytical reasoning, execution, and project management skills
Experience working with international partners across different time zones and cultures
TikTok | Mountain View, CA, USA
Posted: Jan 24, 2022
Your background and professional expertise are grounded in areas related to online operations, consulting, or project management, and you have demonstrated this through a track record of achievements in helping launch major initiatives
5+ years of experience in a relevant field (e.g., policy implementation and/or enforcement, program management, consulting, operations)
Excellent cross-functional project management skills, good communication skills; a passion to unite people behind a common goal
Proven track record of pushing organizational or process changes
Baseline exposure to policy development and content policy issues
Ability to work in a fast-moving and ambiguous environment
Google | Sunnyvale, CA, USA
Posted: Jan 24, 2022
Bachelor's degree or equivalent practical experience
5 years of relevant work experience in operations management, policy enforcement, and project management
Experience working cross-functionally and interacting across all organizational levels
Ability to interact/build relationships with diverse technical and non-technical groups
Ability to manage multiple competing priorities in a fast-paced and constantly changing environment
Via | New York, NY, USA
Posted: Jan 24, 2022
Highly analytical and a good communicator
You can not only knock out a complex analysis, but also quickly distill key insights for various groups
Meticulous and vigilant, with a keen attention to detail
Collaborative and able to generate buy-in to get things done within a multi-team, multi-geography organization
Comfortable with analytical tools (e.g., Excel, Tableau, SQL, Python) and a willingness to further develop this skill set
5+ years of work experience, likely at a high-growth startup, established tech company, consulting, legal, or financial firm
TaskUs | United States
Posted: Jan 24, 2022
You need to have prior experience in Online Safety but, more importantly, you should be comfortable working in the "grey"
3+ years of experience as a Team Lead or in a similar role
Proven work experience in a User Safety or Research environment
Excellent communication skills, fluent English, and high-quality writing and proofreading skills
Outstanding organisational skills
Ability to work with limited guidance and adapt to changing priorities and directions
What is T&S?
Great question (and one the industry hasn’t fully agreed upon). Think of Trust & Safety as the organization that protects social media and Big Tech companies and their users from online abuse. As long as users are able to generate content online, time has told us that a minority of them will create content that is offensive, scandalous, fraudulent, disingenuous, misleading, hateful, spiteful, and harmful. T&S fights this with flexible policies, resilient operations, and informed detection.
Is T&S interesting?
That depends. If you enjoy stopping bad actors and keeping good people safe, then you’ll love T&S. It requires understanding new technologies and staying on top of breaking issues across society and the world. It also pays Big Tech salaries (no coding skills required).
How do you get into T&S?
That’s the tricky part. The short answer is: not easily. Most T&S orgs are part of Big Tech and (as a result) pay extremely well, so jobs are highly competitive. If you don’t know someone in the industry who can help refer you in, then you can learn more about T&S policies, current events, issues, regulations, and jobs at trustsafetyinstitute.com to improve your hiring chances and start doing meaningful work that helps people.