Session 121
Tone provides a deeper sense of one’s attitude. Understanding tones and structures in a CARS passage will help you get a better sense of the passage and author.
As always, I’m joined by Jack Westin from JackWestin.com. Check out all their amazing free resources, including a free trial session of Jack’s full course, to see what it’s like to learn from Jack Westin himself.
Listen to this podcast episode with the player above, or keep reading for the highlights and takeaway points.
Link to the article:
https://reallifemag.com/more-than-a-feeling/
So many authorities want to use computational power to uncover how you feel. School superintendents have deputized “aggression detectors” to record and analyze voices of children. Human resources departments are using AI to search workers’ and job applicants’ expressions and gestures for “nervousness, mood, and behavior patterns.” Corporations are investing in profiling to “decode” customers, separating the wheat from the chaff, the wooed from the waste. Richard Yonck’s 2017 book Heart of the Machine predicts that the “ability of a car to read and learn preferences via emotional monitoring of the driver will be a game changer.”
Affective computing — the computer-science field’s term for such attempts to read, simulate, predict, and stimulate human emotion with software — was pioneered at the MIT Media Lab by Rosalind Picard in the 1990s and has since become wildly popular as a computational and psychological research program. Volumes like The Oxford Handbook of Affective Computing describe teams that are programming robots, chatbots, and animations to appear to express sadness, empathy, curiosity, and much more. “Automated face analysis” is translating countless images of human expressions into standardized code that elicits certain responses from machines. As affective computing is slowly adopted in health care, education, and policing, it will increasingly judge us and try to manipulate us.
Troubling aspects of human-decoding software are already emerging. Over 1,000 experts recently signed a letter condemning “crime-predictive” facial analysis. Their concern is well-founded. Psychology researchers have demonstrated that faces and expressions do not necessarily map neatly onto particular traits and emotions, let alone to the broader mental states evoked in “aggression detection.” Since “instances of the same emotion category are neither reliably expressed through nor perceived from a common set of facial movements,” the researchers write, communicative capacities of the face are limited. The dangers of misinterpretation are clear and present in all these scenarios.
Bias is endemic in U.S. law enforcement. Affective computing may exacerbate it. For example, as researcher Lauren Rhue has found, “Black men’s facial expressions are scored with emotions associated with threatening behaviors more often than white men, even when they are smiling.” Sampling problems are also likely to be rife. If a database of aggression is developed from observation of a particular subset of the population, the resulting AI may be far better at finding “suspect behavior” in that subset rather than others. Those who were most exposed to surveillance systems in the past may then be far more likely to suffer computational judgments of their behavior as “threatening” or worse. The Robocops of the future are “machine learning” from data distorted by a discrimination-ridden past.
[01:51] Tone is Important
Tone is very important for a student to latch onto. Imagine you’re having a conversation with someone where there’s no emotion. A friend calls you really upset. Maybe they just broke up with someone, had a really bad day, or got an F or a C on a test. But they talk to you in a flat, mundane manner. That’s not going to sound human. It sounds like a robot.
“Tone provides people with a deeper sense of your attitude and a deeper sense of your direction.”

A good author will inject tone because it enriches the experience for the reader. You feel more invested.
When someone’s sad, or when someone’s happy, you tend to follow their tone and their direction, their mood, and then you embody that mood. So a good author will also provide words that give you clues as to how they feel or how they want you to feel. And so, that gives you a better sense of what’s going on.
[04:06] Paragraph 1, Sentence 1
So many authorities want to use computational power to uncover how you feel.
Jack says:
Even in the first sentence, the author has given you a lot of information. First, you notice the extremeness. It encompasses too many people for us to even consider it realistic. And that tells you that this person may be emotional to some extent about this topic.
Second, they say “so many.” If someone told you this in person, they would probably sound annoyed by this fact, or opinion, whatever you want to call it. The author is trying to inject some annoyance or concern into the reader’s mind.
[05:29] Paragraph 1, Sentence 2
School superintendents have deputized “aggression detectors” to record and analyze voices of children.
Jack says:
Google already does this. When you search for something, they know exactly how you’re feeling.
[06:10] Paragraph 1, Sentence 3
Human resources departments are using AI to search workers’ and job applicants’ expressions and gestures for “nervousness, mood, and behavior patterns.”
Jack says:
These are some more examples of how people are using this computational power.
[06:32] Paragraph 1, Sentence 4
Corporations are investing in profiling to “decode” customers, separating the wheat from the chaff, the wooed from the waste.
Jack says:
Companies are using it, as well as HR departments and schools. Everyone’s trying to use it. The phrases “wheat from the chaff” and “wooed from the waste” could be confusing for students, but the pairing tells you what to look for. “Wooed” sounds like something good, and “waste” is probably not a good thing. So “wheat” is probably good and “chaff” is probably bad.

But none of that really matters. This paragraph is about uncovering how people feel, which corporations do by “decoding” customers. That’s what you need to care about.
[07:59] Paragraph 1, Sentence 5
Richard Yonck’s 2017 book Heart of the Machine predicts that the “ability of a car to read and learn preferences via emotional monitoring of the driver will be a game changer.”
Jack says:
It’s just one more example, here of how it could be used in cars as well.
[08:24] Paragraph 2, Sentence 1
Affective computing — the computer-science field’s term for such attempts to read, simulate, predict, and stimulate human emotion with software — was pioneered at the MIT Media Lab by Rosalind Picard in the 1990s and has since become wildly popular as a computational and psychological research program.
Jack says:
It’s a little history lesson here of where this started. This is weird because we were just talking about uncovering how you feel. And why are we talking about the history of affective computing now?
This is confusing for students. Many students reading this would not expect the author to backtrack to the history. But this is something we’ve also learned in the past: authors can write in whatever way they want and structure things however they like.
Usually, they would put the history before their concern, but they’ve actually reversed it here. They put the concern in the first paragraph, and now they’re backtracking to talk about its history.
So just go with the flow. Understand that the structure has changed. We may revisit the first paragraph. That’s typically what happens when you write an essay. You write about your thesis in the first paragraph, and then you expand on it as you keep writing. So it’s just a style of writing that students need to be aware of.
[10:24] Paragraph 2, Sentence 2
Volumes like The Oxford Handbook of Affective Computing describe teams that are programming robots, chatbots, and animations to appear to express sadness, empathy, curiosity, and much more.
Jack says:
It’s some more backstory of who’s doing what.
[10:46] Paragraph 2, Sentence 3
“Automated face analysis” is translating countless images of human expressions into standardized code that elicits certain responses from machines.
Jack says:
This is like in the Terminator future where the machines will know that we’re scared of them.
[11:07] Paragraph 2, Sentence 4
As affective computing is slowly adopted in health care, education, and policing, it will increasingly judge us and try to manipulate us.
Jack says:
It says that as this computing becomes more widely adopted, it will judge us and try to manipulate us. Now, the word “manipulate” doesn’t necessarily have to be negative, because you can manipulate an instrument. In that sense it’s not negative; you just change things.
However, given the context, the way that they’ve said this, the tone that they’ve injected gives manipulation a negative connotation.
'Words are meaningless without tone. You need to know the tone in order to understand the word, or at least how they want you to understand the word.'

How the author says “judge us and try to manipulate us” does sound very extreme, and we notice that extremeness because it’s so bold. So here, it’s negative. However, I would wait for an even more extreme word to come up, something like “it would ruin or destroy us.” Those are really powerful words that give the author’s sentiment away.
[12:58] Paragraph 3, Sentence 1
Troubling aspects of human-decoding software are already emerging.
Jack says:
Now it’s really clear, because the author gives it away by saying it’s troubling, which is not a good word. Students must be able to pick this up because the tone has been established.
[13:32] Paragraph 3, Sentence 2
Over 1,000 experts recently signed a letter condemning “crime-predictive” facial analysis.
Jack says:
It’s not good to use “crime-predictive” facial analysis. It’s creepy to have a camera always watching people, predicting whether someone is going to commit a crime.
[14:09] Paragraph 3, Sentence 3
Their concern is well-founded.
Jack says:
The author here is supporting these experts.
[14:15] Paragraph 3, Sentence 4
Psychology researchers have demonstrated that faces and expressions do not necessarily map neatly onto particular traits and emotions, let alone to the broader mental states evoked in “aggression detection.”
Jack says:
The author here is saying there’s research showing that facial detection doesn’t actually tell you what a person feels or what traits they have.
[14:41] Paragraph 3, Sentence 5
Since “instances of the same emotion category are neither reliably expressed through nor perceived from a common set of facial movements,” the researchers write, communicative capacities of the face are limited.
Jack says:
The researchers are saying the face really can’t communicate what you’re trying to draw from it. And this makes sense given where the author’s going. We can expect something like this to come up.
[15:25] Paragraph 3, Sentence 6
The dangers of misinterpretation are clear and present in all these scenarios.
Jack says:
Someone’s facial expression doesn’t necessarily reflect what they actually feel, which is why misinterpretation is such a danger.
[16:13] Paragraph 4, Sentence 1
Bias is endemic in U.S. law enforcement.
Jack says:
It’s a pretty straight-up statement here from the author talking about law enforcement and bias.
[16:24] Paragraph 4, Sentence 2
Affective computing may exacerbate it.
Jack says:
The author is saying maybe it’ll get worse with this.
[16:32] Paragraph 4, Sentence 3
For example, as researcher Lauren Rhue has found, “Black men’s facial expressions are scored with emotions associated with threatening behaviors more often than white men, even when they are smiling.”
Jack says:
It’s discussing racial disparities in this computing algorithm.
[16:57] Paragraph 4, Sentence 4
Sampling problems are also likely to be rife.
Jack says:
You may not necessarily know what “sampling problems” are or what “rife” means. So let’s move ahead with the paragraph.
[17:11] Paragraph 4, Sentence 5
If a database of aggression is developed from observation of a particular subset of the population, the resulting AI may be far better at finding “suspect behavior” in that subset rather than others.
Jack says:
You don’t have to know the details here. As long as you know that there are issues with affective computing, and that those issues persist or get worse when it goes into law enforcement, you’re fine.
If you’re stuck on these details, trying to memorize them or understand them to a really high degree, you’re going to lose the bigger purpose.
The structure of the passage has been established. The first paragraph brings up the concern in a subtle way. The second paragraph brings up how this computing started, but then ultimately brings up the major concern in an obvious way.
Then the third paragraph explains more of that concern, which now establishes the main idea. And now this last paragraph is just going on and talking about another field that also has the same issues with affective computing.
So by understanding the structure of the passage, we’ve done two things. We’ve found the main idea. And if they ever ask a question about any paragraph, we immediately know why that paragraph exists and can answer it based on that paragraph alone.
'The MCAT likes you to know the structure of the passage.'

So always read the whole passage, because if they didn’t want something there, they wouldn’t have put it in. They care about every paragraph, so you should read every paragraph, in the order it’s presented, because that gives you structure and balance.
[19:23] Paragraph 4, Sentence 6
Those who were most exposed to surveillance systems in the past may then be far more likely to suffer computational judgments of their behavior as “threatening” or worse.
Jack says:
It’s just continuing that sampling problem.
[19:42] Paragraph 4, Sentence 7
The Robocops of the future are “machine learning” from data distorted by a discrimination-ridden past.
Jack says:
This is a nice little visual to end there with Robocops of the future, which is a good, old movie.
[19:58] Main Idea
The main idea is the problems with affective computing. How affective computing can lead to more discrimination is just one point in the passage. The bigger picture is all of the problems associated with it.
So if you get a question asking for the main idea, don’t pick discrimination, because that appeared in only one paragraph. Pick something that applies to every paragraph.
'The general theme of the passage is what you should pick.'

That affective computing can lead to more discrimination is not wrong. It’s actually a true statement; that’s what they said in the passage. But it doesn’t answer the question.
The question is, what’s the main idea? What’s the central thesis of the passage? And discrimination was not a central thesis. It was not a consistent theme throughout the passage. It was only at the end as an example.
So knowing the difference and knowing the structure helps a ton with those kinds of questions.
Links:
Link to the article:
https://reallifemag.com/more-than-a-feeling/