T&S Early Warning News
Get ahead of new stories that are impacting the T&S industry.
Australian prime minister loses control of WeChat
ABC News | Jan 24, 2022
Company Listed: WeChat
Australian Prime Minister Scott Morrison has lost control of his account on the Chinese-owned social media platform WeChat, and a lawmaker on Monday accused China's leaders of political interference.
Morrison’s 76,000 WeChat followers were notified his page had been renamed “Australian Chinese new life” earlier this month and his photograph had been removed, Sydney’s The Daily Telegraph newspaper reported. The changes were made without the government’s knowledge, the report said.
Morrison’s office declined to comment on the report.
In response to a question from The Associated Press, WeChat's parent company Tencent said there was “no evidence of any hacking or third-party intrusion” related to Morrison's account.
“Based on our information, this appears to be a dispute over account ownership," the company said.
In accordance with Chinese regulations, Morrison’s public account was registered with a Chinese citizen and was later transferred to its current operator, the company said. It identified the present owner of the account only as a “technology services company," adding that it would “continue to look into this matter further."
When Your Doctor Is on TikTok
The Atlantic | Jan 23, 2022
Company Listed: TikTok
In the second week of March 2020, uncertainty ruled TikTok. Students shared clips of school PA systems announcing closures and cancellations. Travelers filmed their frantic efforts to return to the U.S. before President Donald Trump’s border restrictions went into effect. And yet many users speculated that warnings of a life-reordering pandemic were overblown. Comment sections seemed angsty, but conspiracies abounded, hinting at the diverging versions of reality that lay ahead.
As the first wave of coronavirus shutdowns began, an epidemiologist named Katrine Wallace joined TikTok to combat her boredom and isolation. Wallace, a professor at the University of Illinois at Chicago, watched as the viral dances and humor of “quarantine” took over the app, but she told me that she didn’t consider making any kind of video herself until the app’s algorithm began showing her coronavirus conspiracy theories.
“At that point in time, it was, like, the conspiracy that the disease wasn’t even real, like it was all something to throw the election, that everything was miscoded, that it was really the flu and we were pretending it was something else,” she told me. “Naively, I thought to myself, Well, I’m just gonna fix this on this app.”
Is Virtual Reality Safe for Kids?
VRTimes | Jan 23, 2022
Company Listed: Social Media
Virtual reality hardware is finally coming into the mainstream, and children are increasingly coming into contact with it. With VR being one of the coolest new technologies at the moment, parents are also buying virtual reality headsets as gifts for their kids. Just like other technologies, including the personal computer and the smartphone, VR presents both dangers and opportunities for children.
However, most adult buyers may not be aware that virtual reality headsets have been built primarily for adults and adult head sizes, and that most VR headset manufacturers actually prohibit their use by minors.
Many virtual reality headsets, including Meta Quest 2, do not have the option for turning on parental controls that allow a responsible adult to block 18+ content as well as other types of content that may be harmful to kids.
Meta has set a minimum age of 13 for the use of its VR hardware and services, in line with child protection regulations worldwide and with COPPA, the U.S. data protection law that protects the online privacy of children under the age of 13.
However, if you have tried some of the popular virtual reality hangouts such as VRChat, Rec Room, and many of the other online games, you will notice that this age restriction does not necessarily prevent kids from playing in virtual reality.
Blame Facebook? Black doctor’s lawsuit over New Brunswick ordeal takes aim at social media giant
Toronto Star | Jan 21, 2022
Company Listed: Facebook Meta
A Black doctor who says he was forced to leave New Brunswick after being publicly and wrongfully identified as the source of a COVID-19 outbreak has filed his lawsuit against the province and the RCMP.
But there was an unexpected, additional defendant included Thursday in Dr. Jean-Robert Ngola’s statement of claim: Facebook.
With that, Ngola’s lawsuit has become the latest legal action poised to explore the culpability of a social media platform in the messages of its users.
But experts say holding Facebook responsible may prove an uphill battle.
In late May 2020, Ngola, a Campbellton doctor, travelled to Montreal on a family emergency to pick up his daughter and bring her to New Brunswick.
According to his statement of claim, he confirmed public health guidelines with authorities on both sides of the border and took pains to minimize his exposure to the coronavirus during his trip.
A few days after his return, one of his patients tested positive for COVID. Ngola returned a positive test the following day. He immediately went into quarantine.
On the same day, May 27, 2020, New Brunswick Premier Blaine Higgs held a news conference where, according to the statement of claim, he singled out Ngola — referring to him as “one irresponsible individual” — as the cause of the outbreak of several cases of COVID in Campbellton.
DCMS Committee Finds Online Safety Bill Gets the Balance Wrong
ISPReview | Jan 24, 2022
Company Listed: Facebook Meta, Twitter
A new cross-party UK report from the DCMS Select Committee has warned that the Government’s new Online Safety Bill (OSB), which will task Ofcom with tackling “harmful” content online, fails to get the balance right and “neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.”
Much of the online content you see today is governed by a self-regulatory approach, which has often struggled to keep pace with rapid online change. Examples of “harmful” content range from ISIS terrorist propaganda and child abuse material to state-sponsored propaganda from hostile countries, online bullying, racism and the spread of COVID-19 conspiracy theories. Some of this is already illegal.
The big social media firms (Facebook, Twitter etc.) usually catch up with such problems, but they’re often perceived to be too slow or unwilling to act unless pressured to do so, while other websites seem to exist solely to promote the worst of humanity. The OSB appears set to pressure all sites, from big social media firms to small blogs, to act or face the consequences (e.g. being blocked by ISPs, huge fines and making some people liable for what others may say on their website).
Mother of 11-year-old who died by suicide sues social media firms Meta and Snap
The Washington Post | Jan 22, 2022
Company Listed: Facebook Meta, Snapchat
The mother of an 11-year-old child, Selena Rodriguez, from Enfield, Conn., has filed a lawsuit against two social media giants, arguing that a lack of adequate safeguards led her daughter to take her own life in July 2021.
A wrongful-death lawsuit against Snap Inc., which runs Snapchat, and Meta Platforms Inc. — the parent company of Facebook and Instagram — was filed Thursday at the U.S. District Court for the Northern District of California, San Francisco Division, by the Social Media Victims Law Center (SMVLC), a Seattle-based legal advocacy group.
The lawsuit alleges that Selena Rodriguez’s suicide was “caused by the defective design, negligence and unreasonably dangerous features of their products,” the SMVLC said in a statement.
The statement also says the tech giants “knowingly and purposefully designed, manufactured, marketed, and sold social media products that were unreasonably dangerous because they were designed to be addictive to minor users.”
Neither Meta nor Snap immediately responded to a request for comment from The Washington Post, but both companies have previously said they work to protect minors using their platforms and continually build features to better protect young users.
T&S Policies & Regulations
Regulatory news and policy decisions impacting the T&S ecosystem.
Online safety bill ‘a missed opportunity’ to prevent child abuse, MPs warn
The Guardian | Jan 24, 2022
Company Listed: Social Media
The sharing of some of the most insidious images of child abuse will not be prevented by a new government bill that aims to make the internet a safer place, MPs have said.
The draft online safety bill is not clear or robust enough to tackle some forms of illegal and harmful content, according to a report by the digital, culture, media and sport (DCMS) committee. The landmark bill places a duty of care on tech firms to protect users from harmful content or face substantial fines imposed by the communications regulator Ofcom.
“In its current form what should be world-leading, landmark legislation instead represents a missed opportunity,” said Julian Knight, the chair of the DCMS committee. “The online safety bill neither protects freedom of expression, nor is it clear nor robust enough to tackle illegal and harmful online content. Urgency is required to ensure that some of the most pernicious forms of child sexual abuse do not evade detection because of a failure in the online safety law.”
The report urges the government to tackle types of content that are technically legal such as “breadcrumbing”, where child abusers leave digital signposts for fellow abusers to find abuse content, and deepfake pornography, which it says are not covered by the bill currently, although creators of deepfake images can be prosecuted for harassment. On child sexual abuse, the committee said the bill should tackle behaviour by predators designed to evade content moderation.