algorithmic amplification

What is the impact of constant misinformation on consumer social media? Dave Troy discusses the effects in a long Twitter thread:

“Disinformation is the operational end of a process designed to break down society and radicalize it into cultish forms. This process leads people away from truth. We can’t address this process by distributing truth; the cure for disinformation is not simply truth … Truth is, rather, a goal we must arrive at … We need to turn our attention to what is being lost: social ties, social trust, social capital … We don’t look enough at the relationship between identity, in-group, and belief. They are all reflections of the same thing and you can’t alter one without altering the others. This is why injecting garbage breaks down social ties and alters belief and identity. Sufficiently radicalized, people won’t recover their prior social connections, leaving them stranded on ‘islands of dissensus’. There is no natural pathway back from this. It’s a one way process. Throwing truth at them doesn’t restore lost social/family ties; it alienates them.” —Dave Troy

Twitter recently revealed — Examining algorithmic amplification of political content on Twitter — that the algorithm deciding what you see in your stream can have a social and political impact.

Read more

facebook is not a trusted space

The time has come. Facebook is in the news today, and not as the tech media darling it likes to portray itself to be.

“Former Facebook (FB.O) employee and whistleblower Frances Haugen will urge the U.S. Congress on Tuesday to regulate the social media giant, which she plans to liken to tobacco companies that for decades denied that smoking damaged health, according to prepared testimony seen by Reuters.” —Reuters 2021-10-04

In 2007, I asked if we need an alternative to Facebook — As we become more interconnected and use the Web for problem solving, finding love and sharing our sorrow, we should seriously consider public infrastructure as the backbone for social networking. Just as we have funded roads and airports, we need to provide safe and open platforms for online community forming.

Read more

people, not algorithms

Can an algorithm defeat an algorithm? One group of European researchers thinks it can be done. I have my doubts.

“The approach involves assigning numerical values to both social media content and users. The values represent a position on an ideological spectrum, for example far left or far right. These numbers are used to calculate a diversity exposure score for each user. Essentially, the algorithm is identifying social media users who would share content that would lead to the maximum spread of a broad variety of news and information perspectives.

Then, diverse content is presented to a select group of people with a given diversity score who are most likely to help the content propagate across the social media network—thus maximizing the diversity scores of all users.” —IEEE 2021-01-21
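As a rough illustration of the idea in that quote, here is a minimal sketch in Python. The user names, the ideology scale, and the use of standard deviation as the diversity score are all assumptions made for illustration, and the sketch skips the network propagation step, so it is not the researchers' actual algorithm.

```python
# A hypothetical sketch of the "diversity exposure" idea described above.
# All names, data, and formulas are assumptions for illustration, not the
# researchers' published algorithm.
from statistics import pstdev

# Ideological position of each user and each content item,
# on a scale from -1 (far left) to +1 (far right).
content_ideology = {"post_left": -0.9, "post_centre": 0.0, "post_right": 0.8}

# Content each user has already been exposed to.
exposure = {
    "alice": ["post_left"],
    "bob": ["post_left", "post_centre"],
    "carol": ["post_right"],
    "dan": ["post_centre"],
}

def diversity_score(user):
    """Spread of ideological positions a user has seen (0 = a one-sided diet)."""
    seen = [content_ideology[c] for c in exposure[user]]
    return pstdev(seen) if len(seen) > 1 else 0.0

def seed_users(item, k=2):
    """Pick the k users whose exposure would become most diverse if they were
    shown (and then shared) this item -- a crude stand-in for the selection
    step described in the quote."""
    def gain(user):
        seen = [content_ideology[c] for c in exposure[user]]
        return pstdev(seen + [content_ideology[item]]) - diversity_score(user)
    return sorted(exposure, key=gain, reverse=True)[:k]

print(seed_users("post_right"))  # ['alice', 'dan'] -- the most one-sided feeds gain the most
```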

I believe that any system that can be gamed, will be gamed. Adding another algorithmic layer on platforms designed to manipulate human behaviour will likely result in a continuing game of whack-a-mole, like search engine optimization (SEO). Humans are not machines, and machines (including software) are not humans.

Read more

human engagement counters misinformation

A recent study conducted by Facebook suggests that when it comes to vaccine doubts and misinformation, “a small group appears to play a big role in pushing the skepticism”.

“Some of the early findings are notable: Just 10 out of the 638 population segments contained 50 percent of all vaccine hesitancy content on the platform. And in the population segment with the most vaccine hesitancy, just 111 users contributed half of all vaccine hesitant content.” —WaPo 2021-03-14

Small groups of people can have influence beyond their numbers. For example, when a committed minority in society rises above 25%, there can be a tipping point. However, it only takes 10% if those people have an unshakeable belief in their cause. Meanwhile, inside an organization, there is usually a small group of people — 3% — who can influence up to 85% of its members. Find out more at — 25-10-3.

Read more

platforms and the precariat

Is it possible to be a musician today and earn a middle class income?

The music industry is fundamentally broken up into three separate arms: recorded music, music licensing and live music. Where recorded music — physical album sales — was once the bread-and-butter for musicians, first the advent of piracy platforms like Napster, and then the gradual shift to streaming services like Spotify, Amazon and Apple Music made that framework unsustainable.

“In order for me to earn a minimum wage, an annual minimum wage of $30,000, I need to gain six million streams at the average royalty rate of half a cent per listen,” [musician] Sainas said. “That’s unattainable.” —CBC 2021-03-11
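The arithmetic in that quote is easy to verify. A quick back-of-the-envelope check, using the average royalty rate cited above rather than any particular platform's figure:

```python
# Streams needed to reach a $30,000 annual income at $0.005 per stream.
target_income = 30_000      # annual minimum wage cited in the quote, in dollars
royalty_per_stream = 0.005  # "half a cent per listen"

streams_needed = target_income / royalty_per_stream
print(f"{streams_needed:,.0f} streams per year")  # 6,000,000 streams per year
```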

In 2005, the oft-quoted business guru Seth Godin suggested that the long tail would provide for middle-class entrepreneurs and musicians.

Read more

connect, challenge, create

Education won’t counter populism — changing education might

Slovakia’s president, Zuzana Caputova, was elected in March 2019 and surprisingly showed a way out of the populist quagmire in which many countries find themselves. The tribal affiliations retrieved by the previous corrupt government, particularly via social media, were what Caputova had to counter in order to get elected.

She addressed these tribes not by creating a new tribe, but by discounting the tribal perspective and focusing on the population’s common humanity instead. In this case, it worked. Understanding The Laws of Media, especially the retrieval quadrant, gives us a tool to counter the negative effects — or potential reversal — of new technologies like social media. This is real media literacy.

Read more

let’s stop the war of words

A November 2019 article in the British Medical Journal showed how difficult it is to change people’s minds, especially with regard to vaccinations. Facts don’t change people’s minds.

Lesson 2: don’t bring a fact to a narrative fight

Experts and health professionals can arm themselves with white papers, peer reviewed studies, and symposia; but if these are our only weapons, we will only ever get so far. In an era in which experts are increasingly distrusted, the “we know best” mindset is counterproductive.

Those wishing to encourage vaccination need to identify and amplify the stories that emerge from the real lives and lived experiences of people in their communities (to start, they need to listen for them). It is no coincidence that the most effective climate advocacy in the world right now comes from the improvisations and stories of a 16 year old girl rather than the strategic plans of a generations old institution. —BMJ: New Power versus Old

For example, a mandatory education class in Ontario, Canada — complete with videos and health care professionals to advise — has been useless in getting parents to accept vaccinations for their children.

Read more

a global clown show

In 2007 I was concerned that Facebook was selling personal data. That same year I asked if there could be a public alternative to Facebook. By 2010 I had left the platform.

This year, after our local newspaper closed, I commented that we are now dependent on this global corporation — which uses our data to manipulate us — as our main form of communication. It is as if we live in a company-owned town, buy all of our goods from the company store, and use a party telephone line that the bosses listen in on. This is directly the fault of government, organizational, and community leaders who have been lazy, ignorant, or perhaps malicious in promoting this control platform to engage others.

I have faulted our common natural stupidity for following along with the costly convenience of using Facebook as the default communications medium. Christopher Wylie, the whistle-blower for the Cambridge Analytica scandal, said that, “The internet is part and parcel of democracy now, whether you like it or not … Do we need rules that we as a society agree on, with independent regulators who are on our side, not on shareholders’ side?”

Read more

debunking handbook 2020

The Debunking Handbook 2020 has just been published and is an excellent free guide for addressing the massive amounts of misinformation, disinformation, and propaganda that flow through our digital communications every day and then influence real-life behaviours. I have discussed some of these phenomena previously, in confronting the post-truth machines and pre-bunking the conspiracy theorists.

The 19-page Handbook provides these handy definitions.

  • Misinformation: False information that is disseminated, regardless of intent to mislead.
  • Disinformation: Misinformation that is deliberately disseminated to mislead.
  • Fake news: False information, often of a sensational nature, that mimics news media content.
  • Continued influence effect: The continued reliance on inaccurate information in people’s memory and reasoning after a credible correction has been presented.
  • Illusory truth effect: Repeated information is more likely to be judged true than novel information because it has become more familiar.

Read more

non-violence + 3.5%

How many people does it take to change an organization or a society?

Minority groups need 25% to influence the majority in a society. But it only takes 10% if the group is committed with unshakeable belief. Inside an organization, it takes only the right 3% of people to influence 85% of their colleagues. There is more information about these figures here — 25-10-3.

Harvard University Professor Erica Chenoweth’s research is influencing protest movements with two key findings: first, that movements should be non-violent, and second, that roughly 3.5% of the population has to get actively involved. In the USA this would be just over 11 million people. In Canada it would be about 1.3 million.
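A rough check of those two numbers, assuming the 3.5% threshold and approximate recent population figures (both are assumptions for illustration):

```python
# 3.5% of each country's approximate population.
populations = {"USA": 331_000_000, "Canada": 38_000_000}

for country, people in populations.items():
    print(f"{country}: {people * 0.035:,.0f}")  # USA: 11,585,000  Canada: 1,330,000
```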

Read more