[-] [email protected] 3 points 1 day ago

I was in a gifted and talented program as a child, and my best friend had an Asperger’s diagnosis. His behavior was noticeably different from the other gifted children. In my experience, the term was not used in the manner you describe.

[-] [email protected] 7 points 1 day ago

IMO, there was a pretty big shift starting around 2005-2007, when it became possible to watch video pornography on common pocket-sized electronic devices. Before then, porn was accessible to varying degrees; afterward, it became ubiquitous.

[-] [email protected] 3 points 1 day ago

Married men. Lots of women think men looking at pornography is a form of unfaithfulness.

[-] [email protected] 2 points 1 day ago

In the ’90s, autism was associated with an IQ <70.

[-] [email protected] 10 points 1 day ago

There’s really no reason not to have midterms. Even if there is a blue tsunami and Congress impeaches Trump, that just empowers him further. It would finally precipitate the Constitutional crisis that solidifies his dictatorship. The Executive is the enforcement arm of the government, and it’s already been purged and filled with cultists. They will not remove him; they will not stop following his orders.

Cancelling the midterms would just cause a Constitutional crisis sooner, and there’s no reason to be hasty. The longer that event can be put off, the less likely people are to resist violently.

[-] [email protected] 8 points 1 day ago

The leadership purges are already done. They’re well into the rank-and-file purges. They’ve already done the transgendered; two days ago they started on Black servicemembers, with a change in grooming standards designed to kick out ~60% of Black personnel within a year.

Now they’re exploring how to recruit to fill the positions they’re opening up.

[-] [email protected] 8 points 3 days ago

> this feels like a bad time to have debt

It’s a great time to have debt. If you don’t end up imprisoned, killed, or a refugee, massive inflation will make repayment easy.
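The arithmetic behind that claim is simple deflation of a fixed nominal balance. A minimal sketch (the 25% inflation rate is an illustrative assumption, not a forecast):

```python
def real_value(nominal: float, annual_inflation: float, years: int) -> float:
    """Deflate a fixed nominal debt into today's purchasing power."""
    return nominal / (1 + annual_inflation) ** years

# $100,000 of fixed-rate debt under a hypothetical 25% annual inflation:
debt = 100_000
for years in (1, 3, 5):
    print(f"{years}y: ~${real_value(debt, 0.25, years):,.0f}")
# After 5 years the same balance costs only ~$32,768 in today's dollars.
```

The catch, of course, is that this only helps if the debt carries a fixed rate and your income keeps pace with prices.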

[-] [email protected] 4 points 3 days ago

No paywall for me.

OTTAWA — Newly released documents show federal public safety officials quietly expressed concern over the tech industry’s ability to curb the spread of extremist and terrorist content online after sector-wide layoffs.

The documents, released to the National Post under federal access-to-information legislation, were prepared ahead of a 2023 meeting with Google, which owns YouTube, as well as a meeting with X, formerly known as Twitter.

Officials specifically flagged the rise in terrorist and extremist material found on the platforms in the aftermath of the Oct. 7 attacks on Israel.

“The Israel-Hamas conflict has created an avalanche … with at times hundreds of thousands of graphic videos and images of mass shootings, kidnappings and other violence widely circulating on social media,” reads one of the briefing notes.

“The industry’s response has thus far been disappointing. While tech companies state that they are removing significant volumes of terrorist images and videos, many tens of thousands are still circulating.”

Officials tied the spread of this content to earlier layoffs at X and Google, even as the latter said that a majority of videos flagged in the immediate aftermath of the Oct. 7 attacks were removed before they reached 1,000 views.

“Violent extremist content is strictly prohibited on YouTube, and we continue to invest in the teams and technologies that allow us to remove this material quickly. Any notion that we’d compromise the safety of our platform is categorically false,” a spokesperson for YouTube said in a statement.

In late 2023, the then deputy minister of Public Safety Canada, who has since retired, met separately with officials from Google on the sidelines of the Halifax International Security Forum that November and, during a G7 Interior Ministers’ meeting the same year, with a senior representative from X, who has since left that position.

Max Watson, a spokesperson for the department, said while details of their discussions were private, they “focused on security issues,” which included “how to address online harms.”

The internal briefing documents detail how officials believed one of the reasons platforms and their companies were struggling to remove terrorist and extremist content was because thousands of staff responsible for content moderation had been laid off.

In early 2023, Google’s parent company, Alphabet, announced it was laying off some 12,000 staff, with cuts continuing across other major tech companies in the years since.

Around that time, reports suggested that roughly one-third of the staff at Jigsaw, a Google technology incubator that builds content-moderation tools and helps identify terrorist content for removal (separate from YouTube’s trust and safety division), had been cut.

The internal documents show officials suggested asking Google, which owns the video-sharing platform, about the possibility of rehiring some of the staff, and also asking how it was working to “mitigate the harms” related to violent and terrorist content, citing the “recent cuts in trust and safety seen across the industry.”

YouTube has said that, since the Oct. 7 attacks, it has removed tens of thousands of videos for violating its guidelines on violent extremism and hate speech.

It also says that more than 90 per cent of the content removed for violating its policy regarding violent extremism was taken down before it had reached 1,000 views, and that this was the case for roughly 96 per cent of the videos removed from October to December 2023.

In the internal documents from that year, public safety officials expressed concern that, even with removals on that scale, there were still “tens of thousands of videos and posts circulating, some getting millions of views.”

When it came to X, officials flagged reports that antisemitic content had surged by more than 900 per cent on the platform, while Islamophobic content had increased by around 400 per cent.

The briefing note said one of the purposes of the meeting with a representative from X was to express Canada’s concern, “with X seen as among the worst of the big companies.”

Officials also pointed out that since Elon Musk bought the company in October 2022, thousands of jobs had been cut, with the company doing away with its ethical AI team and laying off 15 per cent of its trust and safety department. Elsewhere in the briefing documents, they also discussed how X had “made decisions to deprioritize” the removal of this content, “often citing freedom of speech or public interest exceptions.”

X, Google and YouTube have all been involved in an international NGO founded in 2017 dedicated to preventing terrorists and extremists from spreading their content online.

In the internal documents, however, public safety officials suggested it was struggling to address the rise in terrorist and violent extremist content online, saying the Israel-Hamas conflict had “created a crisis online.”

In the department’s statement, spokesperson Max Watson said Canada remains concerned about the spread of this content, saying Canada was one of the first countries to sign onto an initiative struck after a gunman livestreamed a mass shooting that killed 51 people in Christchurch, New Zealand.

Reached for comment, a spokesman for Public Safety Minister Gary Anandasangaree declined to respond, pointing instead to the department’s response, which outlined the millions of dollars the government had dedicated towards preventing the spread of this content online.

Under former prime minister Justin Trudeau, the Liberals tabled a bill aimed at compelling social media companies to reduce the exposure of their users, in particular children, to different types of harmful content, including that which “incites violent extremism or terrorism.”

The legislation failed to pass by the time Prime Minister Mark Carney, who succeeded Trudeau in March, triggered a spring federal election.

While his government has said it plans to amend the Criminal Code to address the online exploitation and sextortion of children, it has not said whether it intends to revive efforts to regulate companies’ removal of harmful content.

[-] [email protected] 2 points 4 days ago

Looks like a truckstop.
