21 What surprised Gemma about the Vanguard Charter?
A the focus on passenger protection
B the rules for pedestrian priority
C the lack of clear programming guidelines
22 Gemma believes the biggest barrier to public acceptance is
A understanding the complex coding.
B exaggerated news reports.
C trusting the cameras in heavy rain.
23 For the introduction of their presentation, the students agree to use
A a short video clip.
B a famous quote.
C an interactive poll.
24 How does Callum feel about the ‘Moral Dial’ concept?
A It is too complicated for manufacturers.
B It is unfair to lower-income buyers.
C It is an unethical transfer of responsibility.
25 What is the students’ main goal for next Monday?
A to finalize the slides
B to practice their timing
C to submit the initial draft
Questions 26-30
What ethical priority does the Valerius Algorithm apply to each of the following crash scenarios?
Choose FIVE answers from the box and write the correct letter, A-G, next to Questions 26-30.
Ethical Priorities
A minimize injury to bystanders
B protect the youngest individuals
C distribute harm equally
D prioritize the vehicle owner
E follow local traffic laws strictly
F swerve into property rather than people
G maintain the current trajectory
Scenarios
26 The narrow bridge scenario
27 The sudden obstacle scenario
28 The medical emergency scenario
29 The animal crossing scenario
30 The system failure scenario
Keys
21 B
22 B
23 C
24 C
25 A
26 F
27 A
28 C
29 G
30 E
Transcripts
Part 3: You will hear two university students, Callum and Gemma, discussing a presentation on the ethics of autonomous vehicles.
CALLUM: Hi Gemma. Have you finished reading the Vanguard Charter chapter for our presentation?
GEMMA: Yes, I have. I expected it to focus heavily on passenger protection, keeping the people inside secure. But what really surprised me was its strict rules for pedestrian priority.
CALLUM: Exactly. I thought the lack of clear programming guidelines would be an issue, but they laid everything out. Moving on to public acceptance, I think getting people to understand the complex coding is the biggest hurdle.
GEMMA: I disagree. The math behind the algorithms isn’t what scares people. It’s the exaggerated news reports. Every time a self-driving car makes a tiny mistake, the media blows it completely out of proportion.
CALLUM: True. You don’t think trusting cameras in heavy rain is a bigger fear?
GEMMA: Not really. The sensationalism is the real barrier right now. Anyway, how should we start our talk? A video might take up too much time. How about using that famous quote by Dr. Vane instead?
CALLUM: Or we could engage the audience immediately with an interactive poll?
GEMMA: Oh, I love that idea. Let’s go with the poll. It will grab their attention right away.
CALLUM: Perfect. Now, about the ‘Moral Dial’, where buyers choose the car’s ethical settings… Do you think it’s too complicated for manufacturers to build?
GEMMA: No, the engineering is straightforward. Some critics argue it’s unfair to lower-income buyers. But my main problem is that it’s an unethical transfer of responsibility. The designers are passing the tough choices onto the consumer.
CALLUM: I completely agree. Looking at our timetable, we need to practice our timing at some point, and we still have to submit the initial draft.
GEMMA: Actually, the professor extended the draft deadline. So next Monday, our main goal is to finalize the slides. Then we can practice.
CALLUM: Sounds like a plan. Finalizing the slides it is. Now, we need to explain the Valerius Algorithm using some theoretical crash scenarios.
GEMMA: Good idea. Let’s start with the narrow bridge scenario. In this one, a sudden stop isn’t possible. Does it prioritize the vehicle owner?
CALLUM: No, it actually calculates the trajectory to swerve into property rather than people. It will aim for a fence to avoid hitting anyone.
GEMMA: Got it. What about the sudden obstacle scenario, like a boulder on the road?
CALLUM: The primary rule there is to minimize injury to bystanders. Even if it means a rough ride for passengers, it won’t swerve onto a crowded pavement.
GEMMA: That makes sense. Then there is the medical emergency scenario. Does it follow local traffic laws strictly?
CALLUM: Usually, yes. But in a crisis, it overrides those laws. I read earlier that it seeks to protect the youngest individuals if a crash is unavoidable.
GEMMA: Wait, let me check the text. Ah, no. It simply decides to distribute harm equally if a collision happens. It ensures no single person takes the full impact.
CALLUM: Oh, right! Thanks. Now, the animal crossing scenario. This is a common one. Does it swerve into the ditch?
GEMMA: No, the system is programmed to just maintain the current trajectory. Swerving causes worse accidents, so it just brakes straight ahead.
CALLUM: Brutal, but logical. Finally, the system failure scenario. If the computers completely freeze, what happens?
GEMMA: It defaults to a basic mechanical backup. The sole priority there is to follow local traffic laws strictly, staying in its lane and gradually coming to a halt.