Foundations · Module 7

Human factors and phishing

If we design the system so the safe action is slow and awkward, people will route around it.

1.1 hours · 3 outcomes · Cybersecurity Foundations

Previously

Identity and access

Identity is where most real world attacks start because stolen access is cheaper than breaking encryption.

This module

Human factors and phishing

If we design the system so the safe action is slow and awkward, people will route around it.

Next

Privacy and everyday data protection

Privacy is not only a legal idea.

Progress

Mark this module complete when you can explain it without rereading every paragraph.

Why this matters

This module is about making security human, practical, and repeatable.

What you will be able to do

  1. Explain why insecure behaviour is often a design problem, not a moral failing
  2. Spot common pressure tactics used in phishing and social engineering
  3. Explain governance as ownership of trade offs and decisions

Before you begin

  • No previous technical background required
  • Read the section explanation before using tools

Humans are not the weakest link. Humans are the system. Most insecure behaviour is a rational response to a bad setup. If it takes five minutes to do the safe thing, people will do the fast thing. If approvals block urgent work, people will route around them. If security tools produce noise, people will ignore them.

Governance is decision making, not paperwork. It is how an organisation decides what it will accept, what it will fix, and who owns the trade offs. CISSP governance principles focus on accountability, risk ownership, and clear policy that matches reality.

Real organisational failures are often not technical. They are unclear ownership, unclear priorities, and no rehearsal for bad days. The paperwork shows up after the incident, usually with a new template and a tired team.

In real organisations, governance shows up in things like who is allowed to approve exceptions, how access is granted and removed, how incidents are escalated, and what gets funded. Good governance is boring in the best way. It makes security decisions repeatable instead of emotional.

Everyday example. If nobody is clearly responsible for locking up at night, it eventually becomes "someone will do it." That is not a plan. Governance is deciding who locks up, how you check, and what happens if it is missed.

Common mistake. Blaming individuals for predictable system failures. Another common mistake is writing policies that describe an ideal world and then punishing people for living in the real one.

Why it matters. Human factors and governance are what make security sustainable. Without them, controls decay, exceptions pile up, and the organisation only gets serious after harm has happened.

Mental model

Verification under pressure

Phishing works by using normal work habits against you. The defence is a simple decision path.

  1. Message arrives
  2. Check channel and intent
  3. Verify out of band
  4. Act or escalate
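The four-step path above can be sketched as a small function. This is a minimal illustration, not real tooling: the message fields, the signals checked, and the idea of a `verified_out_of_band` flag are all assumptions made for the example.

```python
# Sketch of the "verification under pressure" decision path.
# Field names and checks are illustrative assumptions only.

def handle_message(msg: dict) -> str:
    """Walk the path: arrive, check channel and intent, verify, act or escalate."""
    # Step 2: check channel and intent for pressure signals.
    suspicious = (
        msg.get("urgent", False)                    # urgency is a pressure tactic
        or msg.get("requests_credentials", False)   # risky request
        or msg.get("sender_domain") != msg.get("claimed_domain")  # spoofed channel
    )
    if not suspicious:
        return "act"
    # Step 3: verify out of band, e.g. call a number you already know.
    if msg.get("verified_out_of_band", False):
        return "act"
    # Step 4: anything unverified under pressure gets escalated, never actioned.
    return "escalate"

print(handle_message({"sender_domain": "corp.example",
                      "claimed_domain": "corp.example"}))   # act
print(handle_message({"urgent": True,
                      "sender_domain": "c0rp.example",
                      "claimed_domain": "corp.example"}))   # escalate
```

The point of the sketch is the shape of the decision, not the specific checks: urgency never shortcuts verification, and the only exit from an unverified suspicious message is escalation.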

Assumptions to keep in mind

  • Time pressure is part of the attack. Urgency is a tool. The safe move is to slow down.
  • Verification has a script. If you rely on vibes, you will lose. Use a repeatable verification step.

Failure modes to notice

  • Channel spoofing. Email, chat, and phone can all be faked. Trust the verified path, not the tone.
  • Approval bypass. Attackers push you around process. The defence is a hard rule: no exceptions under urgency.

Key terms

Governance
Governance is how decisions are made and owned, including who accepts risk, who funds controls, and who is accountable.

Check yourself

Quick check. Human factors


Why is 'someone clicked a link' not a full explanation?

Because systems, incentives, and processes shape behaviour. The fix is usually design and process, not blame.

What is a safe default action for urgent requests?

Pause and verify through an independent channel before taking action.

Name one common pressure tactic.

Urgency, secrecy, authority, or fear.

What does governance mean in simple terms?

Who owns decisions, who accepts risk, and how choices are made and checked.

Artefact and reflection

Artefact

A short verification checklist you can use for urgent requests

Reflection

Where in your work would the idea that insecure behaviour is often a design problem, not a moral failing, change a decision, and what evidence would make you trust that change?

Optional practice

Classify emails and learn practical signals: sender domains, urgency tactics, mismatched links, and risky requests.
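The signals listed above can be checked mechanically. The sketch below is an illustration of rule-based signal spotting, not a vetted detector: the keyword lists, the function name `phishing_signals`, and the link-matching heuristic are all assumptions made for this example.

```python
import re

# Illustrative keyword lists; a real detector would be far more thorough.
URGENCY_WORDS = {"urgent", "immediately", "now", "suspended", "final notice"}
RISKY_REQUESTS = {"password", "gift card", "wire transfer", "verify your account"}

def phishing_signals(sender: str, subject: str, body: str,
                     trusted_domains: set) -> list:
    """Return the list of signals found; more signals, more suspicion."""
    signals = []
    # Sender domain check: trust the verified path, not the display name.
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in trusted_domains:
        signals.append("unrecognised sender domain: " + domain)
    text = (subject + " " + body).lower()
    # Word-boundary matching so "noon" does not trigger on "now".
    if any(re.search(r"\b" + re.escape(w) + r"\b", text) for w in URGENCY_WORDS):
        signals.append("urgency language")
    if any(re.search(r"\b" + re.escape(w) + r"\b", text) for w in RISKY_REQUESTS):
        signals.append("risky request")
    # Mismatched link: visible URL text differs from the real href target.
    for href, visible in re.findall(
            r'<a href="https?://([^/"]+)[^"]*">\s*https?://([^<\s]+)', body):
        if not visible.lower().startswith(href.lower()):
            signals.append("mismatched link")
    return signals
```

For example, an email from `ceo@c0rp.example` with the subject "URGENT: wire transfer needed" would trigger the domain, urgency, and risky-request signals at once. Stacked signals like that are exactly the pattern the optional practice asks you to recognise by eye.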

Also in this module

Social engineering simulator

Practice recognising pressure tactics and choosing safe verification steps.

Source NIST Cybersecurity Framework (CSF) 2.0 (2024)
Source OWASP Top 10 (2025)
Source OWASP ASVS 5.0.0
Source ISO/IEC 27001:2022 Information security management systems