Session 38
Today, we’re diving into another great passage about airplanes, technology, and accidents.
We’re joined by Jack Westin as we break down another passage to help you ace the CARS section of your MCAT.
Be sure to sign up for Jack’s free Daily CARS Passages sent right into your email inbox. Simply go to JackWestin.com.
Listen to this podcast episode with the player above, or keep reading for the highlights and takeaway points.
[02:08] A Brief Overview
This is a regular passage, but it carries a deeper perspective. That deeper layer can be the harder part, because a lot of students like to read only for the superficial stuff.
Sometimes the author says one thing on the surface, but there’s a deeper connection in what you’re reading, and you need to make that connection. It’s not that hard. However, if you keep your mind closed off to what you’re reading, it will be difficult.
Link to article:
https://www.theatlantic.com/ideas/archive/2019/04/why-accidents-like-notre-dame-fire-happen/587956/
Accidents are part of life. So are catastrophes. Two of Boeing’s new 737 Max 8 jetliners, arguably the most modern of modern aircraft, crashed in the space of less than five months. A cathedral whose construction started in the 12th century burned before our eyes, despite explicit fire-safety procedures and the presence of an on-site firefighter and a security agent. If Notre-Dame stood for so many centuries, why did safeguards unavailable to prior generations fail? How did modernizing the venerable Boeing 737 result in two horrific crashes, even as, on average, air travel is safer than ever before?
These are questions for investigators and committees. They are also fodder for accident theorists. Take Charles Perrow, a sociologist who published an account of accidents occurring in human-machine systems in 1984. Now something of a cult classic, Normal Accidents made a case for the obvious: Accidents happen. What he meant is that they must happen. Worse, according to Perrow, a humbling cautionary tale lurks in complicated systems: Our very attempts to stave off disaster by introducing safety systems ultimately increase the overall complexity of the systems, ensuring that some unpredictable outcome will rear its ugly head no matter what. Complicated human-machine systems might surprise us with outcomes more favorable than we have any reason to expect. They also might shock us with catastrophe.
When disaster strikes, past experience has conditioned the public to assume that hardware upgrades or software patches will solve the underlying problem. This indomitable faith in technology is hard to challenge—what else solves complicated problems? But sometimes our attempts to banish accidents make things worse.
In his 2014 book, To Save Everything, Click Here, the author Evgeny Morozov argues that “technological solutionism”—leaving the answer up to Silicon Valley—causes us to neglect other ways of addressing problems. In The Glass Cage, published the same year, Nicholas Carr points warily to “deskilling,” which occurs when the skills of human operators working a job begin to erode, as automation makes such capacities unnecessary. On average, automation is safer than error-prone humans, so a typical response to deskilling is “So what?”
The specter of airline pilots losing their manual flying skills—or being stripped of the ability to use them—brings to mind the tragedy of the Boeing 737 Max crashes. Investigators reviewing the crashes, which killed 189 people in Indonesia and 157 in Ethiopia, have zeroed in on a software problem in the maneuvering-characteristics augmentation system, or MCAS. MCAS is necessary for the Max, unlike its older brother, the 737-800, because the former sports a redesign that fits larger engines under the wings. The engine on the Max sits farther forward, creating a vulnerability to stalling from steeper climb rates on takeoff. MCAS simply pushes the nose down—and in the process, it transfers control away from the pilots. Pushing the nose down helps when averting a stall, but too much nose-down has fatal consequences.
[Related episode: MCAT CARS Skills—A Passage About Evolution and Technology]
[04:05] Paragraph 1, Sentences 1-2
Accidents are part of life. So are catastrophes.
Jack says:
The author starts the article by talking about accidents and catastrophes.
[04:21] Paragraph 1, Sentence 3
Two of Boeing’s new 737 Max 8 jetliners, arguably the most modern of modern aircraft, crashed in the space of less than five months.
Jack says:
This is another straightforward fact. The crashes are examples of accidents.
[05:00] Paragraph 1, Sentence 4
A cathedral whose construction started in the 12th century burned before our eyes, despite explicit fire-safety procedures and the presence of an on-site firefighter and a security agent.
Jack says:
This is another specific example of an accident. Even though Notre-Dame had all these fire-safety procedures and an on-site firefighter and security agent, there was still an accident. There was still a fire.
[05:38] Paragraph 1, Sentence 5
If Notre-Dame stood for so many centuries, why did safeguards unavailable to prior generations fail?
Jack says:
The author is raising the question here of why there was an accident in spite of the safeguards.
[06:03] Paragraph 1, Sentence 6
How did modernizing the venerable Boeing 737 result in two horrific crashes, even as, on average, air travel is safer than ever before?
Jack says:
Another question: why were there accidents when these planes have been made more modern?
Basically, the author is bringing up questions that are rhetorical in nature. The author is saying that this is the problem: we have modern things and modern ways of preventing accidents, but we still have accidents. Why?
[06:53] Paragraph 2, Sentence 1
These are questions for investigators and committees.
Jack says:
This is another straightforward, simple sentence.
[07:00] Paragraph 2, Sentence 2
They are also fodder for accident theorists.
Jack says:
You don’t have to worry about the word “fodder.” You can just blank that out. The sentence mentions accident theorists, so these people are probably interested in this in some way.
Accident theorists hypothesize and come up with reasons for why things happen. Fodder probably means food, sustenance, fuel for these accident theorists.
[08:00] Paragraph 2, Sentence 3
Take Charles Perrow, a sociologist who published an account of accidents occurring in human-machine systems in 1984.
Jack says:
Human-machine systems are probably something that humans make or that involve humans and machines.
[08:49] Paragraph 2, Sentence 4
Now something of a cult classic, Normal Accidents made a case for the obvious: Accidents happen.
Jack says:
We’re given the title of the book. Specifically, it says that accidents happen.
[09:13] Paragraph 2, Sentence 5
What he meant is that they must happen.
Jack says:
It means we can never get rid of them. They must happen.
[09:39] Paragraph 2, Sentence 6
Worse, according to Perrow, a humbling cautionary tale lurks in complicated systems: Our very attempts to stave off disaster by introducing safety systems ultimately increase the overall complexity of the systems, ensuring that some unpredictable outcome will rear its ugly head no matter what.
Jack says:
Perrow basically predicted the crash of these jetliners: the more safety systems you put in, the more accidents you’re going to have, because the overall system just becomes more complicated.
So the first paragraph introduced that accidents happen. But there was a deeper meaning behind that.
Superficially, it was telling you that accidents happen, but it was really trying to tell you something more important: the idea that the very complexity, or modernization, we use to try to prevent accidents is what causes accidents. This is the idea the author actually cares more about.
[11:04] Paragraph 2, Sentence 7
Complicated human-machine systems might surprise us with outcomes more favorable than we have any reason to expect.
Jack says:
This is the flip side: these complicated systems can also surprise us with outcomes even better than we have any reason to expect.
[11:55] Paragraph 2, Sentence 8
They also might shock us with catastrophe.
Jack says:
The author is leading us here: while these machines can be awesome, the author is also setting the scene for how they could end in disaster.
[12:20] Paragraph 3, Sentence 1
When disaster strikes, past experience has conditioned the public to assume that hardware upgrades or software patches will solve the underlying problem.
Jack says:
As people, we assume that upgrading the software and hardware can stop accidents from happening.
The author is leading us to the point that when we make things more complicated, we’re probably making them worse. The public at large is being told that these fixes will solve the problem. But are we really fixing the problem? Probably not.
[13:22] Paragraph 3, Sentence 2
This indomitable faith in technology is hard to challenge—what else solves complicated problems?
Jack says:
This is rhetorical. The author is saying that this is the problem we have: we try to solve problems by introducing more complicated things. That’s the issue with all this.
People are being told that there’s nothing else we can do to solve this problem except a patch or some other complicated solution. The author has some qualm or problem with this.
[14:30] Paragraph 3, Sentence 3
But sometimes our attempts to banish accidents make things worse.
Jack says:
This statement is just restating what Perrow was saying.
[15:00] Paragraph 4, Sentence 1
In his 2014 book, To Save Everything, Click Here, the author Evgeny Morozov argues that “technological solutionism”—leaving the answer up to Silicon Valley—causes us to neglect other ways of addressing problems.
Jack says:
You can assume this is probably related to technology. “Technological solutionism,” leaving the answer up to Silicon Valley, causes us to forget other ways of addressing problems. We’re focused entirely on technology to solve our problems, and the author is suggesting that this focus is itself the problem.
[16:08] Paragraph 4, Sentence 2
In The Glass Cage, published the same year, Nicholas Carr points warily to “deskilling,” which occurs when the skills of human operators working a job begin to erode, as automation makes such capacities unnecessary.
Jack says:
Carr is talking about what automation does to our skills. As we grow more dependent on technology, we use our own skills less and less often. Deskilling means losing our skills as we come to depend on technology.
[17:04] Paragraph 4, Sentence 3
On average, automation is safer than error-prone humans, so a typical response to deskilling is “So what?”
Jack says:
The typical response to deskilling is “So what?” So now we have two more names and two more books. Not only does technology introduce more problems for us by increasing the complexity of the system and of our solutions, it also keeps us from using our other skills, because we’re losing those skills.
[17:55] Paragraph 5, Sentence 1
The specter of airline pilots losing their manual flying skills—or being stripped of the ability to use them—brings to mind the tragedy of the Boeing 737 Max crashes.
Jack says:
Now we’re bringing it back to the specific example that was brought up earlier. Because of the technology in modern aircraft, the pilots are losing their manual flying ability. They have been deskilled at flying the airplane.
[18:28] Paragraph 5, Sentence 2
Investigators reviewing the crashes, which killed 189 people in Indonesia and 157 in Ethiopia, have zeroed in on a software problem in the maneuvering-characteristics augmentation system, or MCAS.
Jack says:
Tying into what Perrow said: as we make things more complicated in order to make them safer, we potentially have more issues.
Now we’re connecting this back to the initial paragraph, which discussed the plane accidents. The passage is suggesting that when you add more complexity, like software patches, more accidents are just going to happen.
[19:38] Paragraph 5, Sentence 3
MCAS is necessary for the Max, unlike its older brother, the 737-800, because the former sports a redesign that fits larger engines under the wings.
Jack says:
It’s saying the Max needs the MCAS system because it has larger engines.
[20:08] Paragraph 5, Sentence 4
The engine on the Max sits farther forward, creating a vulnerability to stalling from steeper climb rates on takeoff.
Jack says:
It’s just explaining why the plane needs this MCAS system. You don’t have to memorize all this; the bigger point is what matters. The exact mechanics of why this technology exists don’t really matter. What matters is that having the technology is what’s causing issues.
A question will pose something about these details to scare students, and students think they need to understand them at a very in-depth level. But you really don’t have to know anything about the engine or its capabilities. Again, it’s the bigger picture that matters. Obviously, you need to comprehend every sentence you read, but you’ve got to tie it back to the bigger picture.
[23:04] Paragraph 5, Sentence 5
MCAS simply pushes the nose down—and in the process, it transfers control away from the pilots.
Jack says:
The author is just giving more specifics.
[23:25] Paragraph 5, Sentence 6
Pushing the nose down helps when averting a stall, but too much nose-down has fatal consequences.
Jack says:
This connects us back to everything we were talking about in the middle of the passage. We have this technology to try to make the plane safer, but it adds more complexity, which makes the system prone to more accidents. And it takes control away from the pilots, so they’re being deskilled.
[25:00] Understanding What You’re Reading
As you get through the passage, you’re going to naturally get faster. If you start understanding what’s happening early on, making those connections, and seeing where the author is headed, then the last paragraph is just going to be a summary. It won’t be so hard, especially if you recognize that it’s connected to the earlier paragraphs.
You don’t have to skim, stop reading closely, or stop paying attention; you will naturally just go faster. There might be deeper connections, so push yourself to understand them and keep an open mind about them. This will allow you to not only understand what you’re reading, but also finish faster.
Links:
https://www.theatlantic.com/ideas/archive/2019/04/why-accidents-like-notre-dame-fire-happen/587956/