Politics: from Quantum to Thermodynamic

In the past, candidates campaigned with a quantum model of the voter. Each voter counted. The candidate needed to convince neighborhoods and each house on a block. It was quantum in the sense that individual action mattered: every atom/voter needed a packet of energy to jump from one candidate to another.

Now the quantum model of the voter can be replaced by a classical thermodynamic one. The campaign thinks in terms of the temperature of groups instead of individuals. Rather than counting individual votes, it thinks in percentages.

For many issues, opinions split nearly evenly, so moving a small fraction of the voters in the middle is enough to change the result. The effect is that the candidates don’t need to address the majority of the electorate at all. Those voters won’t change the outcome; their votes are free energy, collected without the campaign exerting any work.

Targeted advertising, such as that offered by Facebook, YouTube, and Google, allows candidates to focus their appeals on the subgroups in the middle. Adding heat to targeted parts of the pot can be more successful than heating the whole thing.

Changing the votes of one or two percent in the right demographic can be enough to win the election. Narrow campaigns targeted at subgroups can be more efficient than mass appeals through TV and radio. A campaign’s money stretches further when it changes the temperature of small groups rather than pushing individual votes one at a time.
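The arithmetic behind that claim can be sketched with a few lines of Python. All the numbers here are hypothetical — electorate size, vote split, and the size of the persuadable subgroup are invented purely to illustrate the scale of the effect:

```python
# Hypothetical election: a candidate trailing by 1 point flips the race
# by moving just 2% of a well-chosen persuadable subgroup.
electorate = 1_000_000
votes_a = 495_000          # our candidate, trailing 49.5% to 50.5%
votes_b = 505_000

target_group = 300_000     # persuadable subgroup reached by targeted ads
shift_rate = 0.02          # flip 2% of that subgroup from B to A

flipped = int(target_group * shift_rate)   # 6,000 voters change sides
votes_a += flipped
votes_b -= flipped

print(votes_a, votes_b)    # 501000 499000 -- the 2% shift wins it
```

Note that each flipped voter swings the margin by two votes (one gained, one lost by the opponent), which is why such a small shift closes a 10,000-vote gap.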

In a quantum model of campaigning, each person matters. When a candidate shifts to a classical, thermodynamic model, what matters is convincing the groups at the margins. People are reduced from individual human voters into inhuman mathematical abstractions.

Vulnerable

In her book “Alone Together: Why We Expect More from Technology and Less from Each Other,” Sherry Turkle writes, “With some exceptions, when we make ourselves vulnerable we expect to be nurtured” (p. 235). She’s referring to Erik Erikson and the expectations that come from basic trust. That expectation runs completely counter to the way the Internet often works. For the most part, any time you put yourself out there, you risk being attacked or ridiculed rather than built up and comforted.

One source of this risk is the very places that encourage intimacy. Trusting people isn’t easy to begin with, and letting your guard down out there in the social media dystopia isn’t always safe. Sharing something online that would be too hard to say in person can still end up hurting.

One reason for this is that anonymity allows people to be more negative and to exhibit the dark tetrad personality traits when, in person, they wouldn’t act out. To them, the idea that other people have feelings, or that those people are afraid of being humiliated, is alien. Often, the attacker feels good only by getting some lolz.

Sometimes you can find a community of like-minded people where you can be safe. This reflects Turkle’s comment that “Communities are places where one feels safe enough to take the good and the bad” (p. 238). I’ve found that some, like deviantART, are different from most social media. One reason is that to belong there, you have to put in some work. You can’t just repost an inane meme and belong. A member of dA is a creator; a pretender just looks around and lurks.

Every time you sit in front of a computer screen and post something on Twitter or Facebook, it’s possible to misstep and be catastrophically misunderstood.

The Facebook Experience

I used to have a facebook account, but I was very dissatisfied with it. I wasn’t comfortable with its addictive nature. Also, more often than not, I was self-conscious about adding information that didn’t fit social norms during times of stress.

It seemed that people preferred to post clever graphics and leave the real “them” out. Just put up a facade: all is well. The last straw for me was when facebook suggested that I might be employed by a fellowship I belong to.

The reason the facebook topic came up again is that a course at IUPUI I was thinking of taking included facebook postings as part of the coursework. I didn’t really want to get an account again. Hence the conflict.


Dr. BJ Fogg at Stanford University has studied persuasive technology, which he calls captology. His definition of persuasive is slightly different from the natural one: persuasive means causing a desired behavior, not the cognitive persuasion of getting someone to think about an issue a certain way. In his method, you pick a behavior you want to increase, make it easy to do, and then prompt the behavior. The behaviors can be tiny, such as clicking a “Like” button, or complex, such as having you log in and update your content.

Facebook uses persuasive technology to increase the company’s income. The engagement of facebook’s users is what encourages companies to advertise there. “Like” is a simple behavior that seems to indicate you’re engaged with a vendor’s products and services. A Veritasium video, https://www.youtube.com/watch?v=oVfHeWTKjag, suggests that a “like” may not be what it seems.

The part where I get uncomfortable is that facebook has covert information it can use to manipulate the interaction. People think of the website as interacting solely with them, but with billions of users, facebook knows how people act in aggregate and can notice changes whose impact on any one person is tiny but which are statistically significant across the whole population. By combining these impacts, facebook can manipulate without being detected. It can manipulate its users, and it can manipulate its advertisers.
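Why a tiny per-user effect becomes detectable at that scale can be illustrated with a standard two-proportion z-test. The click rates and sample sizes below are invented for the sketch; nothing here is real facebook data:

```python
# A sketch of why huge user counts make tiny effects statistically
# detectable: the same 0.1-point lift is noise at n=1,000 but
# overwhelming evidence at n=10,000,000.
from math import sqrt

def z_score(p1, p2, n):
    """Two-proportion z statistic for equal group sizes n."""
    p = (p1 + p2) / 2                  # pooled rate
    se = sqrt(p * (1 - p) * 2 / n)     # standard error of the difference
    return (p2 - p1) / se

# Hypothetical A/B test: a UI tweak nudges a 10.0% click rate to 10.1%.
small_site = z_score(0.100, 0.101, 1_000)
facebook_scale = z_score(0.100, 0.101, 10_000_000)

print(round(small_site, 2), round(facebook_scale, 2))   # 0.07 7.44
```

At a thousand users the z statistic is indistinguishable from zero; at ten million it is far past any conventional significance threshold, so the operator can confidently ship a change no individual user would ever notice.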

One can’t be naive and think that facebook does things solely for the benefit of its users. When one starts a post and then erases it, facebook’s software can notice. Since they know when this happens, they can find ways to encourage people to add content more freely. They also target what you see to what they know you are more likely to attend to, not to what you might value.

They knew that many of my friends belonged to a fellowship, so it was natural to blindly propose to me, who hadn’t listed an employer, that I might be employed in the same place.

facebook is capable of knowing more about you than you can imagine, and it uses that to make its shareholders wealthy. When I see an ad on YouTube, I know it is an advertisement; I can ignore it if I want. If you’re being persuaded to participate in advertising without knowing you are being marketed to, that’s where the facebook experience lets the smoke and mirrors conceal the real interaction. In “Captology and the Friendly Art of Persuasion” (*), Lynn Greiner comments that “Advertisers may, for example, be able to get away with sneaky and intrusive tactics” but that “facebook must play it straight.” However, with huge data resources and insatiable stockholders, facebook’s straight can be pretty crooked.

(*) Greiner, Lynn. “Captology and the Friendly Art of Persuasion.” netWorker, Fall 2009. doi:10.1145/1600303.1600306