Learn how to plan, assure, and control quality in predictive projects.
🎥 Watch PMP Exam Prep video series: https://www.youtube.com/playlist?list=PLaZjaTadwi1sDBAXtUd6JI5_FUsIJjpAT
How do predictive projects ensure deliverables meet standards while preventing costly rework? In this video, we dive into Waterfall Project Quality Management, covering how quality is planned, built into processes, verified through control activities, and continuously improved using structured frameworks and tools.
This is the sixth video in our 15-part Waterfall Review & Question series. You’ll learn the difference between Quality Assurance and Quality Control, how Cost of Quality drives decision-making, and how tools like control charts, Pareto charts, fishbone diagrams, and audits are applied in predictive environments. You’ll then test your understanding with 10 scenario-based practice questions (Questions 51–60) and detailed explanations.
✅ You’ll learn how to:
• Plan quality using standards, metrics, and checklists
• Distinguish between Manage Quality (QA) and Control Quality (QC)
• Apply Cost of Quality concepts, emphasizing prevention over inspection
• Interpret control charts, trends, and non-random patterns
• Use quality tools such as Pareto charts, histograms, and fishbone diagrams
• Apply continuous improvement frameworks like PDCA, Six Sigma, Lean, and TQM
By practicing these questions, you’ll strengthen your ability to prevent defects, analyze variation, and drive continuous improvement — critical skills for both the PMP® exam and real-world predictive projects.
Chapters:
0:00 Project Quality Management Overview
3:48 Question 51
The topic we'll cover is quality management: the processes that ensure project deliverables meet defined standards and satisfy stakeholder expectations.

In waterfall projects, quality is formally planned and documented. This begins with Plan Quality Management, where the project manager identifies the quality standards, metrics, and processes that the project will follow. The outputs include the quality management plan, quality metrics, and quality checklists that will guide the team throughout the project. Next comes Manage Quality, also known as quality assurance. This is an executing process that focuses on doing things the right way. The project manager ensures that processes are being followed correctly by using activities such as audits, process analysis, and preventive actions. The goal is to build quality into the work and enable continuous improvement rather than relying only on inspections at the end. Then we move to Control Quality, which is about verifying results. This monitoring process involves inspections, testing, and measuring deliverables against defined quality standards. While quality assurance ensures that the process is correct, quality control ensures that the final deliverables meet the required specifications.
The exam also emphasizes the concept of the cost of quality, which includes four categories. Prevention costs are incurred to avoid defects in the first place. Appraisal costs are for measuring and monitoring, such as inspections and tests. Internal failure costs are the costs of defects found before delivery, such as rework or scrap. External failure costs occur after delivery, such as warranty claims or recalls. Remember, PMI stresses that prevention is cheaper and preferred over inspection.

Quality management also ties into continuous improvement frameworks. One is Deming's Plan-Do-Check-Act (PDCA) cycle, which drives ongoing process improvement. You should also be aware of Six Sigma, which focuses on reducing defects, and Lean, which emphasizes eliminating waste.

On the exam, expect questions on quality tools. These include the cause-and-effect diagram, also known as the Ishikawa or fishbone diagram, which helps identify root causes of problems; the Pareto chart, which applies the 80/20 principle to highlight the vital few causes that generate most problems; histograms, which display data distributions; control charts, which track process variation and use concepts like upper and lower control limits, the rule of seven, and distinguishing assignable from common-cause variation; scatter diagrams, which show relationships between two variables; flowcharts, which map processes; and check sheets, which are simple tally forms for data collection.
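The control chart rules just described can be sketched in code. This is a minimal illustration, not a tool from the video; the data, function name, and limits are hypothetical. It flags points outside the control limits (possible assignable cause) and runs that trip the rule of seven:

```python
def control_chart_signals(samples, mean, ucl, lcl):
    """Flag control-chart signals: points outside the control limits,
    and the rule of seven (7+ consecutive points on one side of the mean)."""
    out_of_limits = [i for i, x in enumerate(samples) if x > ucl or x < lcl]
    rule_of_seven = []
    run = 0   # length of the current one-sided run
    side = 0  # +1 above the mean, -1 below, 0 on the mean
    for i, x in enumerate(samples):
        s = 1 if x > mean else -1 if x < mean else 0
        run = run + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
        side = s
        if run >= 7:
            rule_of_seven.append(i)  # a run of 7+ ends at index i
    return out_of_limits, rule_of_seven

# Seven consecutive points above the mean, all inside the limits:
data = [10.2, 10.4, 10.3, 10.5, 10.6, 10.4, 10.7]
outside, runs = control_chart_signals(data, mean=10.0, ucl=11.0, lcl=9.0)
print(outside, runs)  # no points outside the limits, but the rule of seven fires
```

This mirrors the exam point: a process can stay inside the limits yet still show a non-random pattern that warrants investigation.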
Other important concepts include benchmarking, where project processes are compared against best practices; audits, which ensure compliance and promote continuous improvement; and the distinction between prevention and inspection. Remember, in predictive projects prevention is always emphasized over inspection. Finally, predictive projects place a strong emphasis on formal quality planning, documented standards, and acceptance testing at the end of the project to confirm deliverables meet requirements.
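The prevention-over-inspection principle can be made concrete with a toy cost-of-quality comparison. All figures below are illustrative assumptions, not numbers from the video:

```python
# Toy cost-of-quality comparison: invest in prevention vs. rely on inspection.
units = 1000

# Strategy 1: invest in prevention (training, reviews); few defects occur.
prevention_cost = 20_000            # training, documentation, reviews
defect_rate_prevented = 0.01        # 1% of units defective
internal_failure_unit_cost = 500    # rework or scrap per defect caught in-house

# Strategy 2: rely on end-of-line inspection; more defects, some escape.
appraisal_cost = 8_000              # inspections and tests
defect_rate_inspected = 0.08        # 8% of units defective
external_escape_share = 0.25        # share of defects missed and shipped
external_failure_unit_cost = 2_000  # warranty claims and recalls per escape

coq_prevention = prevention_cost + units * defect_rate_prevented * internal_failure_unit_cost
coq_inspection = (appraisal_cost
                  + units * defect_rate_inspected * (1 - external_escape_share) * internal_failure_unit_cost
                  + units * defect_rate_inspected * external_escape_share * external_failure_unit_cost)

print(coq_prevention)   # 25000.0
print(coq_inspection)   # 78000.0
```

Under these assumed rates, the prevention-heavy strategy costs far less overall because external failures (warranty claims, recalls) dominate the inspection-heavy total, which is exactly PMI's point.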
So, quality management in predictive projects is about planning standards, assuring the process, controlling results, preventing defects, and driving continuous improvement using structured tools and techniques. Now, we'll go through practice questions that test your knowledge of quality assurance versus quality control, cost of quality, continuous improvement frameworks, and the use of quality tools in predictive environments. Let's get into the first question on this topic.

Question 51. A project manager is planning a large-scale infrastructure project using a predictive approach. The sponsor emphasizes minimizing costly rework and ensuring stakeholder satisfaction. During quality planning, the team struggles to agree on how to measure success and apply quality standards consistently across work packages. What should the project manager do?
A. Begin quality planning after scope and schedule baselines are finalized to avoid misalignment.
B. Use the requirements traceability matrix to define measurable quality goals linked to stakeholder expectations.
C. Facilitate a workshop to define quality metrics and integrate them into the quality management plan.
D. Allow each work package lead to define quality standards independently to improve ownership and accountability.
You can pause the video here if you need more time to work on the question. The correct answer is C. This question tests your understanding of planning quality in predictive environments, where quality must be defined early and clearly documented to guide execution and avoid costly rework. Choice C is the best option because facilitating a workshop promotes shared understanding and ensures quality metrics are agreed upon and integrated into the quality management plan. This promotes alignment and proactive quality control. Choice A is incorrect: waiting until the scope and schedule are baselined delays quality planning; PMI recommends starting quality planning in parallel with scope planning, not after. Choice B is incorrect: the requirements traceability matrix is useful for tracking stakeholder needs, but it's not the primary tool for defining project-wide quality metrics and standards. Choice D is incorrect: allowing different teams to define their own standards leads to inconsistency, which undermines quality control in predictive projects. Let's move on to the next question if you're ready.

Question 52. A predictive project is in execution, and the project manager notices that different teams are using slightly different procedures to complete similar tasks. Some completed deliverables are failing inspections due to inconsistencies in how quality standards were interpreted and applied. What should the project manager do next?
A. Conduct a root cause analysis on failed deliverables and update the issue log.
B. Assign inspection responsibilities to ensure objectivity during quality control.
C. Perform a process analysis and recommend preventive actions to align execution with quality standards.
D. Increase the frequency of quality inspections to identify nonconformance earlier.

You can pause the video here if you need more time to work on the question. The correct answer is C. This question tests your understanding of the difference between Manage Quality (QA) and Control Quality (QC) in predictive projects. It emphasizes how process issues, not just product defects, must be proactively addressed to prevent recurring quality failures. Choice C is the best option because process analysis and preventive actions are core activities of Manage Quality (QA); they help ensure that teams follow consistent, approved methods aligned with the quality management plan. Choice A is incorrect: root cause analysis is a useful technique, but it belongs more to Control Quality, and it focuses on identifying why deliverables failed, not on improving the processes themselves. Choice B is incorrect: assigning new inspectors may help reduce bias, but it does not address the underlying inconsistency in execution across teams. Choice D is incorrect: more frequent inspections might catch issues sooner, but this is reactive and doesn't solve the root cause, the lack of consistent quality process application. Let's move on to the next question if you're ready.
Question 53. During quality control on a predictive project, a control chart tracking manufacturing defects shows that several data points fall within the upper and lower control limits but consistently trend upward and approach the upper limit. No single point exceeds the control limits, but the pattern is persistent. What should the project manager do?
A. Take no action, since the process is still statistically in control.
B. Halt production and initiate a formal root cause analysis to address potential non-conformance.
C. Share the trend data with the sponsor and suggest increasing sampling frequency to ensure compliance.
D. Investigate the trend as a possible assignable cause and evaluate the need for preventive action.

You can pause the video here if you need more time to work on the question. The correct answer is D. This question tests your knowledge of control charts and early quality intervention in predictive projects. Identifying subtle signals before defects occur is key to preventing rework and protecting quality baselines. Choice D is the best option because, while the process appears in control, the upward trend suggests a possible assignable cause; investigating early allows the team to apply preventive actions and avoid future quality failures. Choice A is incorrect: although all points are inside the limits, the trend pattern violates control chart rules like the rule of seven, which may signal an underlying issue. Choice B is incorrect: halting production is extreme; there's no actual non-conformance yet, only a potential risk that should first be investigated. Choice C is incorrect: sharing data and increasing sampling may help with monitoring, but it delays taking proactive action to investigate and correct the potential root cause. Let's move on to the next question if you're ready.

Question 54. During planning for a highly regulated infrastructure project, the project manager is asked to justify the high budget allocated for team training, documentation, reviews, and preventive testing. A senior stakeholder suggests reducing these costs and increasing post-delivery inspections to save time upfront. What should the project manager do?
A. Accept the recommendation, as regulatory projects typically emphasize final acceptance inspections.
B. Explain that preventive costs reduce overall quality costs by minimizing rework and customer dissatisfaction.
C. Agree to reduce training and instead allocate more to defect tracking and reinspection.
D. Propose balancing prevention and inspection efforts to maintain stakeholder support and reduce initial planning effort.

You can pause the video here if you need more time to work on the question. The correct answer is B. This question tests your understanding of the cost of quality categories in predictive projects and PMI's emphasis on prevention over inspection to minimize total cost and improve long-term outcomes. Choice B is the best option because prevention costs like training and documentation reviews help avoid expensive rework, non-compliance, and customer dissatisfaction later; PMI recommends investing upfront in quality. Choice A is incorrect: while inspections are important, final acceptance is too late to catch many quality problems, and this approach increases the risk of external failure costs. Choice C is incorrect.
Shifting funds to defect tracking focuses on appraisal and internal failure costs, which are reactive, not preventive, and drive up the overall cost of quality. Choice D is incorrect: while it sounds balanced, compromising prevention efforts undermines quality planning and increases the risk of failures later in execution. Let's move on to the next question if you're ready.

Question 55. A project manager on a large multi-year predictive project is reviewing defect trends and delivery delays across multiple phases. The team previously applied corrective actions, but similar issues continue to recur. Leadership now expects a more structured approach to identify systemic inefficiencies and reduce variability in outcomes. What should the project manager do?
A. Implement fast-tracking in the schedule to make up for lost time and improve team motivation.
B. Recommend additional quality audits across all phases to ensure compliance with documented procedures.
C. Apply Lean principles to eliminate non-value-adding activities across the workflow.
D. Initiate a structured root cause investigation using the PDCA cycle and Six Sigma principles.

You can pause the video here if you need more time to work on the question. The correct answer is D. This question tests your understanding of structured continuous improvement frameworks such as PDCA and Six Sigma, which focus on root cause analysis and reducing process variation in predictive projects. Choice D is the best option because the repeated nature of the issues points to deeper systemic inefficiencies; using PDCA and Six Sigma provides a disciplined, data-driven approach to improve performance and prevent recurrence. Choice A is incorrect: fast-tracking addresses schedule delays, not quality issues or root causes, and it may even introduce more risk if underlying inefficiencies remain. Choice B is incorrect: audits can help identify compliance gaps, but they do not replace structured process improvement methods; the scenario needs more than procedural verification.
Choice C is incorrect: Lean can reduce waste, but the scenario also requires analytical depth to uncover why defects keep happening, not just to eliminate steps. Let's move on to the next question if you're ready.

Question 56. A quality audit on a predictive project reveals a spike in defect frequency related to electrical subsystem testing. The quality lead wants to determine if these defects share a common root cause and whether the trend suggests a deeper process failure that needs prevention. Which quality tools should the team use together to investigate and validate the source of variation?
A. Histogram and run chart.
B. Flowchart and control chart.
C. Pareto chart and control chart.
D. Fishbone diagram and histogram.

You can pause the video here if you need more time to work on the question. The correct answer is C. This question tests your ability to match the right quality tools to both the problem type and the intent of analysis, a key expectation in predictive project environments where data-driven decision-making is critical. Choice C is the best option because the Pareto chart helps the team prioritize the most frequent defects, applying the 80/20 principle, while the control chart enables them to see whether the process is statistically in control and whether variation is assignable or common cause. Used together, they provide both prioritization and trend validation.
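As a sketch of the Pareto side of that analysis, the snippet below ranks hypothetical defect categories by frequency and picks out the "vital few" that account for roughly 80% of all defects. The categories and counts are invented for illustration:

```python
# Pareto-style prioritization: rank defect categories by frequency and
# find the "vital few" that account for ~80% of all defects.
defects = {"wiring": 120, "solder": 90, "casing": 25, "labeling": 10, "packaging": 5}

ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)
total = sum(defects.values())

vital_few, cumulative = [], 0
for category, count in ranked:
    cumulative += count
    vital_few.append(category)
    if cumulative / total >= 0.8:
        break

print(vital_few)  # ['wiring', 'solder']: two causes cover 84% of the defects
```

This is the 80/20 principle in action: addressing just the top two categories would eliminate most of the defect volume, after which a control chart can confirm whether the remaining variation is common cause.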
Choice A is incorrect: while histograms and run charts show frequency and trend, they don't provide insight into process stability or root cause depth. Choice B is incorrect: flowcharts map processes and control charts detect variation, but this pair lacks prioritization of defect types and doesn't guide resolution effectively. Choice D is incorrect: a fishbone diagram supports brainstorming potential causes and a histogram shows defect frequency, but neither shows whether variation is systemic or under control. Let's move on to the next question if you're ready.

Question 57. Midway through a complex government infrastructure project using a predictive approach, the project manager notices inconsistent quality results across multiple work packages. To address the issue, the manager wants to ensure alignment with industry best practices while also identifying internal process inefficiencies. What should the project manager do?
A. Conduct a control quality inspection and update the quality metrics to reflect observed trends.
B. Initiate a lessons learned session with key team members and vendors to review execution gaps.
C. Perform benchmarking against similar high-performing projects and initiate a quality audit.
D. Present the issue to the sponsor and recommend increasing compliance reviews across the remaining work.

You can pause the video here if you need more time to work on the question. The correct answer is C. This question tests your understanding of proactive quality improvement tools in predictive projects, specifically how to use benchmarking and audits to identify gaps and align with best practices. Choice C is the best option because benchmarking allows the project manager to compare performance with external high-performing standards, while a quality audit helps evaluate the effectiveness of current processes and identify opportunities for improvement; both are proactive, structured tools for addressing systemic issues. Choice A is incorrect: while control quality inspections focus on verifying deliverables, they don't address root causes or broader process weaknesses. Choice B is incorrect: lessons learned are valuable, but they're generally captured after phases or at closure, not ideal for mid-project course correction. Choice D is incorrect: while escalating to the sponsor and increasing compliance reviews sounds reasonable, it's a reactive governance tactic, not a strategic improvement initiative like auditing and benchmarking. Let's move on to the next question if you're ready.
Question 58. During testing of a repetitive manufacturing process in a predictive project, a control chart for defect rates shows seven consecutive points plotted on the same side of the mean, all within control limits. The process owner insists that since the points are inside the limits, no action is required. What should the project manager do?
A. Take no action, as all points fall within upper and lower control limits.
B. Recalculate the control limits to account for the shift in performance trend.
C. Perform a root cause analysis and consider corrective action due to the indication of a non-random pattern.
D. Consult with the quality team to adjust sampling frequency and continue monitoring for additional trends.

You can pause the video here if you need more time to work on the question. The correct answer is C. This question tests your knowledge of statistical quality control using control charts, especially recognizing non-random patterns like the rule of seven, which is often misunderstood in predictive projects. Choice C is the best option because seven points on the same side of the mean, even within control limits, signal non-random variation; this violates process stability assumptions and calls for investigation and possible corrective action. Choice A is incorrect: while the points are within limits, ignoring the trend misses the early warning of process instability. Choice B is incorrect: recalculating control limits without understanding the cause could normalize abnormal behavior, masking issues. Choice D is incorrect: adjusting sampling frequency is premature; while it sounds reasonable, it doesn't address the non-random trend already present.
Let's move on to the next question if you're ready.

Question 59. A pharmaceutical manufacturing project following a predictive life cycle is in the execution phase. The quality assurance team recommends investing more time in process audits and preventive checks to ensure compliance, but some engineers argue that testing the final deliverables is more practical given tight timelines. What should the project manager do?
A. Support increased preventive activities to reduce the likelihood of defects and downstream rework.
B. Prioritize final product testing and inspections to ensure regulatory compliance.
C. Discuss the decision with the quality control lead to balance prevention with available resources.
D. Continue with inspections while adding defect tracking metrics to guide future quality improvements.

You can pause the video here if you need more time to work on the question. The correct answer is A. This question tests your understanding of PMI's emphasis on prevention over inspection, especially in highly regulated industries like pharmaceuticals, where quality issues can be extremely costly if detected late. Choice A is the best option because preventive actions such as audits and process checks reduce the risk of errors before they occur; in predictive environments, early quality planning and prevention help avoid costly downstream rework. Choice B is incorrect: while final inspections are important, relying on them reactively increases the risk of missing process-related defects and leads to a higher cost of quality. Choice C is incorrect: while discussing the matter is collaborative, it defers decision-making and lacks the proactive leadership expected from the project manager. Choice D is incorrect: adding defect metrics is helpful for trend analysis, but does not substitute for prevention; it's reactive rather than proactive. Let's move on to the next question if you're ready.

Question 60. A predictive infrastructure project is nearing completion. The project manager wants to implement lessons learned from quality reviews and embed a culture of continuous improvement into the organization's project practices. Which approach best supports this goal within a predictive project environment?
A. Use control charts and inspections to identify deliverable defects during testing.
B. Establish a root cause analysis framework and apply statistical process control post-project.
C. Conduct a quality audit and recommend Kaizen techniques for ongoing small improvements.
D. Use Total Quality Management (TQM) principles to embed organization-wide continuous improvement focused on processes and people.

You can pause the video here if you need more time to work on the question. The correct answer is D. This question tests your understanding of organizational-level quality frameworks and how they can be applied after project execution in a predictive environment. It also compares TQM with tactical methods like Kaizen and inspection tools. Choice D is the best option because TQM provides a structured, organization-wide approach to improving quality systems, aligning people, processes, and culture around continuous improvement; it's a strategic fit for post-project quality enhancement. Choice A is incorrect: control charts and inspections are tools used during execution, not suitable for establishing a long-term quality culture post-project. Choice B is incorrect: root cause analysis is useful for issue resolution, but lacks the systemic organizational focus needed to drive cultural change. Choice C is incorrect: Kaizen supports incremental improvements, but in this context, TQM is broader and more aligned with embedding quality throughout the organization.
Congratulations on completing all 10 questions for quality management in waterfall projects. You've now answered 60 questions in your PMP exam journey. If this content is helpful, please give this video a thumbs up and subscribe to PM Aspirant for helpful content like this. Keep up the great momentum and keep pushing forward.

