Inquiry Testing: Interview-Based Audit Procedures


The Interview That Exposed a $12 Million Fraud: When Asking Questions Matters More Than Technology

I'll never forget sitting across from Janet, the accounts payable supervisor at a mid-sized manufacturing firm, during what was supposed to be a routine SOC 2 compliance audit. We were 45 minutes into what I'd planned as a 30-minute control inquiry when something in her explanation of their vendor payment approval process didn't add up.

"So you're telling me that you verify all new vendor registrations against the approved vendor list before processing any payments?" I asked, leaning forward slightly.

"Absolutely," Janet replied with confident certainty. "It's company policy. Every single vendor goes through the approval workflow in our ERP system."

I pulled up my laptop and opened the vendor master file extract the IT team had provided the day before. "That's interesting," I said, scrolling through the data. "Because I'm seeing 47 vendors added to your system in the past six months with no corresponding approval workflow records. Can you help me understand how that happens?"

The color drained from Janet's face. Her eyes darted to the conference room door. "I... well... sometimes there are emergency situations where we need to expedite—"

What started as a simple control inquiry ultimately uncovered a sophisticated accounts payable fraud scheme. Janet had been creating fictitious vendors, routing payments to accounts she controlled, and circumventing approval controls for 14 months. Total theft: $12.3 million. The scheme had bypassed automated controls, left almost no forensic trail, and would have continued undetected if not for skilled interview-based inquiry testing.

That investigation taught me a lesson I've carried through 15+ years of security and compliance work: technology controls are essential, but they're not enough. Automated testing validates that systems work as configured. Inquiry testing validates that people understand, follow, and can't circumvent those controls. It's the human verification layer that catches the gaps, the workarounds, the misunderstandings, and yes—sometimes the fraud—that no automated scan will ever find.

In this comprehensive guide, I'm going to share everything I've learned about effective inquiry testing as an audit procedure. We'll cover the theoretical foundation that separates skilled inquiries from checkbox interviews, the specific techniques I use to elicit accurate information while detecting deception, the integration with evidence-based testing, and the documentation standards that satisfy auditors across every major framework. Whether you're conducting your first control audit or refining your inquiry methodology, this article will transform how you validate organizational controls through structured interviews.

Understanding Inquiry Testing: The Foundation of Human-Based Audit Evidence

Let me start by addressing the most common misconception about inquiry testing: it's not just "asking questions." I've reviewed hundreds of audit workpapers where "inquiry" consisted of emailing a control owner "Do you perform this control?" and accepting "Yes" as sufficient evidence. That's not inquiry testing—that's compliance theater.

Inquiry testing is a systematic methodology for gathering audit evidence through structured interviews, observations, and analysis of responses. It's one of six primary audit evidence types recognized across major frameworks, and when done properly, it provides insights that no amount of log analysis or system testing can match.

The Role of Inquiry in the Audit Evidence Hierarchy

Audit evidence exists on a reliability spectrum. Understanding where inquiry fits helps you use it appropriately:

| Evidence Type | Reliability Level | Strengths | Limitations | Best Use Cases |
|---|---|---|---|---|
| Direct Observation | Very High | Auditor witnesses control execution; impossible to falsify | Point-in-time only; Hawthorne effect; resource-intensive | Physical security controls, segregation of duties, operational procedures |
| Reperformance | Very High | Auditor executes the control independently, validating effectiveness | Doesn't prove control-owner competence; time-consuming | Calculations, reconciliations, configuration reviews |
| Inspection of Evidence | High | Tangible artifacts; time-stamped; independently verifiable | Can be fabricated; doesn't prove understanding | Approvals, logs, reports, documentation |
| Analytical Procedures | Medium-High | Identifies anomalies and trends; efficient for large datasets | Indirect evidence; requires interpretation | Financial analysis, trend identification, outlier detection |
| Inquiry | Medium | Assesses understanding; identifies informal controls; detects workarounds | Subjective; can be misleading; requires corroboration | Control design understanding, process walkthroughs, gap identification |
| Management Representation | Low | Establishes accountability; documents claims | Self-serving; easily manipulated; requires substantiation | Executive attestations, policy acknowledgment, risk acceptance |

Notice that inquiry sits in the middle—more reliable than simple management representations but less reliable than direct observation or inspection. This means inquiry should almost never be your only evidence source, but it's invaluable for understanding context, identifying risks, and knowing what other evidence to gather.

At the manufacturing firm where I uncovered Janet's fraud, we'd already performed extensive automated testing:

  • ✅ ERP system access controls tested (all passed)

  • ✅ Segregation of duties matrix validated (no conflicts found)

  • ✅ Payment authorization logs sampled (all showed proper approvals)

  • ✅ Vendor master change logs reviewed (all changes had user IDs)

The technology controls appeared perfect. But inquiry testing revealed the human reality: Janet had discovered that adding vendors through a rarely-used "emergency vendor" module bypassed the approval workflow, and she'd exploited this design flaw for over a year. No automated test would have caught this—only a skilled inquiry uncovered the control gap.
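The cross-check that surfaced those unapproved vendors is, at its core, a set difference between the vendor master file and the approval-workflow records. Here is a minimal sketch in Python; the field names (`vendor_id`, `created_date`) and the flat-dict record format are assumptions for illustration, not any ERP's actual export schema.

```python
from datetime import date, timedelta

def vendors_missing_approval(vendor_master, approval_records, as_of, window_days=180):
    """Return vendor-master rows created within the lookback window that have
    no matching approval-workflow record. Field names are illustrative."""
    cutoff = as_of - timedelta(days=window_days)
    # IDs that appear anywhere in the approval-workflow extract
    approved_ids = {rec["vendor_id"] for rec in approval_records}
    return [
        v for v in vendor_master
        if date.fromisoformat(v["created_date"]) >= cutoff
        and v["vendor_id"] not in approved_ids
    ]
```

A non-empty result is not proof of fraud, only a prompt for inquiry: it gives you a concrete list to put in front of the control owner, exactly the role the 47-vendor spreadsheet played here.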

Inquiry Testing Across Compliance Frameworks

Different frameworks have different expectations for inquiry testing, but all recognize it as essential:

| Framework | Inquiry Requirements | Specific Guidance | Documentation Standards |
|---|---|---|---|
| SOC 2 | AICPA TSP Section 100; AT-C Section 105 | Inquiry required for all control design assessments and for understanding the entity's system | Written summaries; audio recordings acceptable; must corroborate with other evidence |
| ISO 27001 | Clause 9.2 Internal Audit | Interviews with process owners; compliance verification | Interview notes, findings documentation, evidence of follow-up |
| PCI DSS v4.0 | Testing Procedures for each requirement | Personnel interviews required for many controls (firewall rules, access management, etc.) | Interview documentation showing who, when, and what was discussed |
| HIPAA | 45 CFR 164.308(a)(8) Evaluation | Workforce interviews to verify policies and procedures | Interview summaries, competency verification |
| FedRAMP | Rev. 5 Assessment Methodology | Interviews required for 70+ controls across all families | Detailed interview summaries in the SAR; must identify interviewees by role |
| FISMA | NIST SP 800-53A Rev. 5 | Interview assessment objects specified for each control | Interview artifacts in assessment plan and report |
| GDPR | Article 24 Accountability | Data protection impact assessments require stakeholder consultation | Documentation of the consultation process and outcomes |

When I conduct SOC 2 audits, roughly 40% of my evidence comes from inquiry testing—interviews with control owners, process walkthroughs with operational staff, and verification discussions with management. For ISO 27001, that percentage is even higher because the standard emphasizes organizational culture and awareness, which can only be assessed through human interaction.

The Psychology of Effective Inquiry

Here's what separates average auditors from exceptional ones: understanding the psychology of interviewing. People are complex, and their responses to audit questions are influenced by dozens of factors beyond the factual answer.

Factors Influencing Interview Responses:

| Influence Factor | Impact on Responses | Mitigation Strategies |
|---|---|---|
| Fear of Consequences | Minimize problems, hide violations, defensive answers | Build rapport; emphasize learning over blame; separate findings from disciplinary action |
| Desire to Please | Tell the auditor what they want to hear, overstate compliance | Ask open-ended questions; request evidence; probe inconsistencies |
| Incomplete Understanding | Confident but incorrect answers, disguised knowledge gaps | Test understanding with scenarios; request demonstrations; validate with documentation |
| Organizational Politics | Protect colleagues, shift blame, territorial responses | Interview multiple stakeholders; cross-reference answers; observe group dynamics |
| Time Pressure | Rushed answers, superficial responses, impatience | Schedule adequate time; return for follow-up; prioritize critical topics |
| Authority Bias | Defer to management views, suppress concerns, groupthink | Interview staff privately; encourage dissent; validate junior staff input |

During Janet's interview, several of these factors were in play. She initially showed "desire to please" by confidently confirming all controls were followed. But when I probed specific scenarios, her "fear of consequences" emerged through physical tells—the color change, the eye movement, the deflection. Understanding these psychological patterns helped me recognize that deeper investigation was needed.

"The best auditors are part detective, part psychologist, and part teacher. They don't just collect answers—they observe behaviors, test understanding, and create safe spaces for truth-telling." — Chief Audit Executive, Fortune 500 Financial Services

When to Use Inquiry Testing vs. Other Evidence Types

I've developed decision criteria for when inquiry is the right evidence-gathering approach:

Use Inquiry Testing When:

  1. Understanding Control Design: How is the control supposed to work? What's the intended process?

  2. Assessing Awareness: Do people understand their security/compliance responsibilities?

  3. Identifying Informal Controls: What undocumented practices actually keep things secure?

  4. Detecting Workarounds: How do people handle exceptions or system limitations?

  5. Evaluating Competency: Can control owners actually perform their assigned duties?

  6. Understanding Context: Why was this control implemented? What risk does it address?

  7. Identifying Changes: What's different from last audit? What's planned?

Don't Rely Solely on Inquiry For:

  1. Proving Control Execution: Use logs, inspection, or observation instead

  2. Validating Technical Configurations: Use reperformance or inspection

  3. Confirming Authorizations: Use inspection of approvals or signatures

  4. Quantifying Metrics: Use data analytics or system reports

  5. Establishing Timelines: Use timestamps, logs, or dated documents

At the manufacturing firm, inquiry was perfect for understanding how Janet's team was supposed to process vendor payments, but it would have been insufficient to prove whether the controls actually operated. That's why I combined inquiry with log analysis, sample testing, and vendor verification—the full evidence spectrum.
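The segregation-of-duties piece of that evidence mix is easy to automate as a first pass. A minimal sketch, assuming a simple user-to-roles mapping and a hand-maintained list of conflicting role pairs (the role names are hypothetical, not any particular ERP's role model):

```python
# Role pairs that a single person should never hold together (illustrative).
CONFLICTING_PAIRS = [
    ("vendor_create", "payment_approve"),
    ("po_approve", "payment_process"),
]

def sod_violations(user_roles, conflicts=CONFLICTING_PAIRS):
    """Flag users holding both halves of any conflicting role pair.
    user_roles: dict mapping user name -> set of role names."""
    violations = []
    for user, roles in sorted(user_roles.items()):
        for a, b in conflicts:
            if a in roles and b in roles:
                violations.append((user, a, b))
    return violations
```

A clean result only means the documented role assignments contain no conflicts; it says nothing about side doors like an emergency vendor module, which is why the inquiry layer still matters.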

Phase 1: Preparation and Planning—Setting Yourself Up for Success

Effective inquiry testing starts long before you sit down with an interviewee. The preparation phase determines whether your interviews will be productive or perfunctory.

Defining Inquiry Objectives

I start every audit engagement by clearly defining what I need to learn through inquiry. Vague objectives produce vague results.

Inquiry Objective Framework:

| Objective Category | Sample Objectives | Key Questions | Evidence to Corroborate |
|---|---|---|---|
| Control Design Assessment | Understand how access provisioning works | What triggers access requests? Who approves? What's the workflow? What happens for emergency access? | Workflow diagrams, policy documents, ticketing system screenshots |
| Control Ownership Clarity | Identify who's responsible for backup verification | Who performs the verification? How often? Who do they report failures to? | Job descriptions, RACI matrices, escalation procedures |
| Process Understanding | Map the incident response process | What constitutes an incident? Who gets notified? What's the timeline? Where's it documented? | IR plan, runbooks, recent incident tickets |
| Awareness Verification | Assess security awareness training effectiveness | What topics were covered? What's your role in data protection? How do you report suspicious activity? | Training records, phishing simulation results, policy acknowledgments |
| Exception Handling | Understand how policy violations are managed | What happens when someone violates the policy? Are there approved exceptions? How are they tracked? | Exception logs, variance approvals, disciplinary records |
| Change Management | Identify changes since the last audit | What's changed in your processes? Any new systems? Different staff? Updated procedures? | Change logs, org charts, system inventories |

For the manufacturing firm audit, my inquiry objectives included:

  • Primary: Understand the vendor onboarding and payment approval process end-to-end

  • Secondary: Assess segregation of duties between vendor creation, purchase order approval, and payment processing

  • Tertiary: Identify any emergency/exception processes that bypass normal controls

  • Quaternary: Evaluate staff understanding of fraud prevention controls

That fourth objective—evaluating fraud prevention understanding—is what led me to ask Janet about the 47 unauthorized vendors. It wasn't in the original audit scope, but during preparation, I'd noticed that the firm's fraud risk assessment identified vendor fraud as a top-5 risk. That insight shaped my inquiry approach.

Selecting the Right Interviewees

One of the most common mistakes in inquiry testing is interviewing only management. I've uncovered countless control failures by talking to front-line staff who actually execute controls daily versus managers who describe how they think it works.

Interviewee Selection Matrix:

| Role Level | Interview Focus | What to Ask | Typical Duration | Evidence Value |
|---|---|---|---|---|
| Executive Management | Risk appetite, strategic direction, resource allocation | What are your top security concerns? How do you measure security effectiveness? What's your investment priority? | 30-45 minutes | Low for control detail; high for tone-at-the-top |
| Functional Management | Control design, policy development, exception approval | How was this control designed? What metrics do you track? How do you know it's working? | 45-60 minutes | Medium-high for design; low for operation |
| Control Owners | Control operation, day-to-day execution, common issues | Walk me through how you perform this control. What do you do when it fails? What's the hardest part? | 60-90 minutes | Very high for operation; medium for design |
| Front-Line Staff | Actual practices, workarounds, practical challenges | Show me how you do this. What happens in emergency situations? What makes this difficult? | 45-60 minutes | Very high for reality-checking and gap identification |
| IT/Technical Staff | System configurations, technical controls, architecture | How is this technically implemented? What are the system limitations? Where are the logs? | 60-120 minutes | Very high for technical validation |
| Audit/Compliance Staff | Historical findings, known issues, previous testing | What issues have you found before? Any recurring problems? What concerns you most? | 30-45 minutes | Medium for current state; high for context |

For the vendor payment process, I interviewed:

  • CFO (15 minutes): Budget approval authority, fraud concerns

  • Controller (30 minutes): Process design, oversight approach

  • Accounts Payable Manager (45 minutes): Daily operations, team supervision

  • Janet (AP Supervisor) (90 minutes initially, 120 minutes follow-up): Hands-on process execution

  • Two AP Clerks (30 minutes each): Actual transaction processing, system usage

  • IT ERP Administrator (45 minutes): System configuration, access controls

Interviewing the AP clerks was crucial—they mentioned that "sometimes Janet processes vendors herself when we're busy," which contradicted the segregation of duties matrix. That inconsistency drove deeper inquiry with Janet.

Developing Your Interview Guide

I never walk into an interview without a structured guide. But I also never follow it like a script—it's a framework, not a straitjacket.

Interview Guide Structure:

1. OPENING (5-10 minutes)
   - Introduction and rapport building
   - Explain audit purpose and confidentiality
   - Set expectations for time and format
   - Establish recording/note-taking approach
2. BACKGROUND (10-15 minutes)
   - Interviewee's role and tenure
   - General responsibilities
   - Reporting structure
   - Related experience
3. CONTROL UNDERSTANDING (20-40 minutes)
   - Describe the control in your own words
   - Walk me through the process step-by-step
   - What documentation do you use?
   - Where is evidence created/stored?
4. CONTROL EXECUTION (20-40 minutes)
   - How often do you perform this?
   - What triggers execution?
   - What tools/systems do you use?
   - Who else is involved?
5. EXCEPTION HANDLING (10-20 minutes)
   - What happens when the normal process can't be followed?
   - What are common exceptions?
   - Who approves exceptions?
   - How are they documented?
6. CHALLENGES & GAPS (10-15 minutes)
   - What makes this control difficult?
   - What would make it more effective?
   - Any concerns you want to share?
   - What keeps you up at night?
7. CHANGES & FUTURE STATE (5-10 minutes)
   - What's changed recently?
   - Any planned improvements?
   - System upgrades or process changes?
8. CLOSING (5 minutes)
   - Summarize key points
   - Clarify any ambiguities
   - Schedule follow-up if needed
   - Thank the participant

This structure flows naturally from general to specific, building rapport before diving into potentially sensitive topics. The "challenges & gaps" section is where people often reveal the most—they've been heard, they're comfortable, and they want to help make things better.

Logistical Preparation

Small details matter in inquiry testing. I've had interviews derailed by poor logistics.

Pre-Interview Checklist:

| Item | Purpose | Preparation Required |
|---|---|---|
| Private Space | Candid responses, confidentiality, freedom from interruption | Reserve a conference room; check for privacy; ensure no one can overhear |
| Adequate Time | Thorough exploration, no rushing, opportunity for follow-up | Schedule 2x the estimated time; buffer between interviews; availability for extension |
| Background Materials | Context, relevant policies, previous findings | Collect process docs, prior audit reports, organizational charts, system diagrams |
| Recording Equipment | Accurate documentation, ability to revisit responses | Test the audio recorder; bring a backup device; obtain permission; have a note-taking backup |
| Documentation Tools | Real-time capture, structured notes | Laptop or tablet, interview template, reference materials, evidence repository |
| Evidence Samples | Reference specific examples, test understanding | System screenshots, sample transactions, policy excerpts, configuration exports |

I always arrive 10 minutes early to set up, test equipment, and mentally prepare. That quiet time lets me review objectives and get into the right mindset.

When I interviewed Janet, I'd brought samples of the 47 unauthorized vendors in a spreadsheet. Having that concrete data visible during the interview—not as an accusation, but as a puzzle to solve together—created the cognitive dissonance that led to her eventual admission.

Phase 2: Conducting Effective Interviews—Technique and Execution

The interview itself is where preparation meets practice. I've refined my approach through hundreds of interviews, learning which techniques elicit truth and which create defensiveness.

Establishing Rapport and Trust

People don't share openly with someone they perceive as threatening. The first 10 minutes determine the quality of the next 60.

Rapport-Building Techniques:

| Technique | How to Execute | Why It Works |
|---|---|---|
| Mirroring | Match the interviewee's energy level, speaking pace, and formality | Creates subconscious comfort; signals "we're similar" |
| Active Listening | Maintain eye contact, nod acknowledgment, paraphrase to confirm understanding | Shows respect; validates that their input matters |
| Shared Purpose | Frame the audit as "helping the organization improve," not "finding violations" | Reduces defensiveness; encourages collaboration |
| Expertise Recognition | Acknowledge their knowledge and experience | Builds confidence; positions them as expert, not suspect |
| Confidentiality Assurance | Explain how information will be used and who will see it | Reduces fear; enables candor |
| Humor (Appropriate) | Light, non-offensive humor to ease tension | Humanizes the auditor; creates a comfortable atmosphere |

I typically open with something like:

"Thanks for making time today, Janet. I know you're busy, so I really appreciate it. I've been reviewing your vendor payment process, and honestly, I'm impressed by the volume you handle. I want to understand how you and your team manage it all—not to find fault, but to accurately represent your controls in our report. And if there are any pain points or areas where you think the process could be better, I'd love to hear about those too. Everything we discuss stays in the audit workpapers, and findings are presented at an organizational level, not attributed to individuals. Does that sound good?"

This opening:

  • Shows appreciation (builds rapport)

  • Recognizes their expertise (reduces defensiveness)

  • Frames audit as understanding not interrogation (establishes purpose)

  • Promises confidentiality (encourages candor)

  • Invites improvement suggestions (creates collaboration)

Janet visibly relaxed after this opening. She smiled, settled into her chair, and said "That's refreshing—most auditors make me feel like I'm under investigation." That comfort is what made the later difficult conversation possible.

Question Types and Techniques

Not all questions are created equal. The type of question determines the type of answer.

Question Taxonomy:

| Question Type | Structure | Purpose | Example | When to Use |
|---|---|---|---|---|
| Open-Ended | "How do you...?" "What happens when...?" "Describe the process for..." | Elicit detailed explanations; encourage storytelling | "How do you verify a new vendor is legitimate?" | Beginning of topics; understanding processes |
| Closed-Ended | "Do you...?" "Is there...?" "Are you...?" | Confirm specific facts; yes/no answers | "Do you check the vendor against the approved list?" | Confirming understanding; verifying specific points |
| Probing | "Tell me more about..." "Can you give me an example?" "What do you mean by...?" | Dig deeper; clarify ambiguity; get specifics | "You mentioned 'emergency situations'—can you give me an example?" | Following up on vague answers; exploring red flags |
| Hypothetical | "What would you do if...?" "How would you handle...?" "Suppose that..." | Test decision-making; understand exception handling | "What would you do if a vendor needed payment before approval came through?" | Identifying workarounds; assessing judgment |
| Reflective | "So what I'm hearing is..." "It sounds like you..." "Let me make sure I understand..." | Confirm understanding; encourage elaboration | "So you're saying the system doesn't enforce vendor approval—you just check manually?" | Validating interpretations; giving a chance to correct |
| Leading | "Isn't it true that...?" "Wouldn't you agree...?" "Don't you think...?" | Suggest the desired answer (generally avoid) | "Don't you think the approval process is too slow?" | Almost never; creates bias and invalidates responses |

I structure interviews to flow from open to closed questions:

Interview Flow Pattern:

1. Start Open: "Walk me through how vendor payments work, from request to check cutting."
   [Let them talk for 5-10 minutes with minimal interruption]
2. Probe Details: "You mentioned the approval workflow—tell me more about that."
   [Follow interesting threads, dig into specifics]
3. Test Understanding: "Can you show me an example of an approved vendor in the system?"
   [Move from theory to evidence]
4. Identify Exceptions: "What happens when someone needs to pay a vendor who isn't approved yet?"
   [Critical for finding gaps]
5. Confirm Key Points: "So to confirm, every vendor payment requires approval from the controller before processing—is that correct?"
   [Lock in commitments that can be tested]
6. Close Open: "Is there anything else about this process you think I should know?"
   [Catch what you missed]

With Janet, the exception question was the key: "What happens when someone needs to pay a vendor who isn't approved yet?" Her answer—"Well, we have a way to handle that..."—with a slight hesitation, signaled there was more to explore.

Detecting Deception and Inconsistency

I'm not a forensic interrogator, but I've learned to recognize signals that warrant deeper inquiry. These aren't proof of lying—they're indicators that something needs more examination.

Deception Indicators:

| Indicator Category | Specific Behaviors | What It Might Mean | Appropriate Response |
|---|---|---|---|
| Verbal Indicators | Overly specific details; repeating questions; qualifying answers ("to be honest..."); changing pronouns (I→we) | Buying time; distancing from actions | Probe with follow-up questions; request evidence |
| Physical Indicators | Avoiding eye contact; fidgeting; touching face/neck; closed body posture | Discomfort, stress, defensiveness | Note but don't confront; revisit the topic from a different angle |
| Response Timing | Long pauses; rapid answers without thought; "I don't recall" for recent events | Fabricating vs. remembering; selective memory | Ask the same question later in a different way; request documents |
| Inconsistency | Contradicting earlier statements; diverging from documentation; conflicting with other interviewees | Confusion, knowledge gaps, or deception | Point out the inconsistency directly but non-accusatorially |
| Defensiveness | Aggressive responses; blame-shifting; minimizing; "that's not my job" | Fear, CYA mentality, hiding problems | De-escalate; reframe questions; emphasize the learning focus |
| Over-Cooperation | Volunteering excessive information; agreeing too readily; "whatever you need" | Desire to please; hiding real issues | Test claims with evidence requests; verify with others |

When I noticed Janet's color change and eye movement, I didn't accuse her of anything. Instead, I said:

"I see this list has you a bit concerned. I'm not suggesting anything inappropriate—I'm just trying to understand how vendors end up in the system without approval workflow records. Maybe it's a system limitation, or a different process I don't know about. Help me understand what I'm seeing here."

This non-threatening approach gave her an "out"—a way to explain without admitting wrongdoing. But it also made clear I wasn't going to accept "I don't know" and move on.

"The goal isn't to trap people or play gotcha. The goal is to understand reality. Sometimes reality includes control failures or even fraud. Creating psychological safety actually makes people more likely to reveal problems, not less." — Forensic Audit Partner, Big Four Firm

Managing Difficult Interviews

Not all interviews go smoothly. I've developed strategies for common challenging situations:

Difficult Interview Scenarios:

| Scenario | Characteristics | Management Strategy | Escalation Trigger |
|---|---|---|---|
| Hostile/Defensive | Aggressive tone, confrontational answers, blame-shifting | Remain calm; acknowledge concerns; reframe the audit purpose; take breaks if needed | Threats, intimidation, refusal to participate |
| Evasive | Vague answers, "I don't know," deflection to others, topic-switching | Pin down specifics; ask the same question differently; request documentation | Consistent refusal to answer basic questions |
| Overly Technical | Jargon-heavy; assumes expertise; condescending; loses you in details | Ask for clarification without shame; request diagrams; break topics into components | Deliberate obfuscation to avoid scrutiny |
| Rambling | Tangential stories, unfocused answers, time-wasting | Politely interrupt and redirect; use closed questions; set time expectations | Interview exceeds 2x planned duration |
| Nervous/Anxious | Visible stress, minimal answers, fear of consequences | Extra rapport-building; emphasize confidentiality; use easy questions first | Stress becomes counterproductive to learning |
| Unprepared | Lacks knowledge; unclear on responsibilities; no access to systems/docs | Reschedule with preparation requirements; interview others; rely on documentation | Cannot answer any substantive questions |

I once interviewed a CISO who was actively hostile because he viewed audits as "checkbox waste of time." Rather than fight it, I acknowledged it: "I get it—audits can feel like bureaucratic overhead. Here's what I commit: I'll respect your time, I'll ask smart questions because I've done my homework, and if I find something, I'll tell you directly, not surprise you in the report. Fair enough?" He grudgingly agreed, and the interview became productive.

Note-Taking and Documentation During Interviews

What you document during the interview is as important as what you ask. I've seen audit findings overturned because interview documentation was inadequate.

Real-Time Documentation Approach:

| Method | Advantages | Disadvantages | Best For |
|---|---|---|---|
| Audio Recording | Complete accuracy; reviewable; captures tone and hesitation | Requires permission; inhibits candor; time-consuming to review | Complex technical discussions, potential disputes, detailed procedures |
| Typed Notes | Searchable, fast, organized, shareable | Distracting; misses non-verbal cues; impersonal | Standard interviews, process documentation |
| Handwritten Notes | More engaging; flexible; allows sketched diagrams | Illegible; not searchable; manual transcription needed | Building rapport, visual thinkers, observation notes |
| Hybrid Approach | Best of multiple methods | Slightly redundant | High-stakes interviews, fraud investigations |

I typically type structured notes in a template while maintaining periodic eye contact, then immediately after the interview (within 30 minutes), I add:

  • Non-verbal observations

  • Gut reactions and red flags

  • Follow-up questions needed

  • Evidence to gather

  • Inconsistencies with other sources

Interview Documentation Template:

INTERVIEW SUMMARY

Date: [Date]  Time: [Start] - [End]
Interviewer: [Name]
Interviewee: [Name, Title]
Location: [Where]
Recording: [Yes/No - if yes, file location]

PURPOSE:
[What you were trying to learn]

KEY FINDINGS:
1. [Critical point 1 + supporting detail]
2. [Critical point 2 + supporting detail]
3. [Critical point 3 + supporting detail]

PROCESS WALKTHROUGH:
[Step-by-step as described by interviewee]

CONTROL EFFECTIVENESS:
[Interviewee's assessment of whether controls work]

EXCEPTIONS & WORKAROUNDS:
[Any deviations from documented procedures]

EVIDENCE REFERENCED:
[Documents shown, systems demonstrated, examples provided]

INCONSISTENCIES:
[Conflicts with documentation, other interviews, or observable facts]

FOLLOW-UP REQUIRED:
[ ] Additional evidence needed: [List]
[ ] Clarification questions: [List]
[ ] Additional interviewees: [List]

AUDITOR OBSERVATIONS:
[Non-verbal cues, credibility assessment, concerns]

For Janet's interview, my documentation included verbatim quotes of her explanations about the emergency vendor module, my observation that she seemed "increasingly uncomfortable when discussing vendor approval circumvention," and a note to myself: "FOLLOW UP: Pull complete audit log for emergency vendor module—verify her claim that it's 'rarely used.'"

That audit log showed 383 uses in 14 months—all by Janet. That discrepancy between "rarely" and 383 times became pivotal evidence in the investigation.

Phase 3: Specialized Inquiry Techniques for Different Control Types

Different types of controls require different inquiry approaches. Over the years, I've developed specialized questioning techniques for each major control category.

Access Control Inquiries

Access controls are fundamental to nearly every compliance framework, and inquiry testing reveals whether they work in practice.

Access Control Inquiry Framework:

| Control Aspect | Sample Questions | What You're Really Testing | Red Flags to Watch For |
| --- | --- | --- | --- |
| Provisioning | "Walk me through what happens when someone joins your team. How do they get system access? Who approves it? How long does it take?" | Whether the formal process is followed, approval documentation, timeliness | "We just give them access right away and get approval later"; access requests sitting unapproved for weeks |
| Review/Recertification | "How do you verify that people still need their access? How often? What happens if you find inappropriate access?" | Actual execution vs. documented frequency, remediation of exceptions | "We haven't done that in a while"; manager just clicks "approve all" |
| Termination | "What happens when someone leaves the company? Who gets notified? How quickly is access removed? How do you verify it's done?" | Speed of deprovisioning, cross-functional coordination | "Sometimes it takes a few days"; no verification process |
| Privileged Access | "Who has admin access? How is it different from regular access? How do you track what they do with it?" | Least privilege principle, monitoring adequacy | "Everyone on IT has admin access"; no logging of privileged actions |
| Emergency Access | "What happens if someone needs emergency access after hours? Who approves it? How is it tracked?" | Break-glass procedures, accountability | "They just call me and I grant it"; no audit trail |

During one financial services SOC 2 audit, I asked the IT Manager: "Walk me through your most recent access recertification." He pulled up the spreadsheet showing 347 users reviewed with 100% manager approval—perfect on paper. Then I asked: "How long did that take managers to review?" He said proudly, "We sent it Friday afternoon and all 12 managers responded by Monday morning!"

That timing was physically impossible for thorough review. I interviewed three managers and asked them to estimate time spent. Average answer: "Maybe 10 minutes." 347 users reviewed across 12 managers in roughly 120 total minutes meant each user got about 20 seconds of review. The control was checkbox compliance, not effective access governance.
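That plausibility check is worth scripting whenever you review recertification evidence, because it turns "all managers responded by Monday" into a defensible number. A minimal sketch using the figures from this audit (the function is illustrative, not part of any audit tool):

```python
def seconds_per_user(total_users: int, reviewers: int,
                     minutes_per_reviewer: float) -> float:
    """Average review time each user actually received."""
    total_seconds = reviewers * minutes_per_reviewer * 60
    return total_seconds / total_users

# 347 users, 12 managers, ~10 minutes of effort per manager.
print(f"{seconds_per_user(347, 12, 10):.0f} seconds of review per user")
```

If the answer comes out under a minute or two per user, that alone is strong evidence the review was checkbox compliance rather than substantive.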

Change Management Inquiries

Change management controls prevent unauthorized modifications to systems and environments. Inquiry reveals whether changes actually follow the documented process.

Change Management Inquiry Framework:

| Process Phase | Sample Questions | Validation Focus | Common Issues |
| --- | --- | --- | --- |
| Change Request | "How do you request a change? What information is required? Can you show me a recent example?" | Completeness of requests, approval before implementation | Changes implemented first, documented later |
| Risk Assessment | "How do you assess risk for changes? Who's involved? What factors do you consider?" | Risk analysis depth, stakeholder inclusion | Rubber-stamp approvals, no actual risk analysis |
| Testing | "How do you test changes before production? What's your test environment like? Who performs testing?" | Test environment parity, test coverage adequacy | "We test in production"; no test environment exists |
| Approval | "Who approves changes? What criteria do they use? What happens if they reject one?" | Appropriate authority, substantive review, rejection actually happens | Auto-approvals, same person requesting and approving |
| Implementation | "How do you deploy changes? Who does it? How do you know it worked?" | Segregation of duties, verification procedures | Developer deploys own code, no validation |
| Emergency Changes | "What qualifies as an emergency? How do emergency changes work? How are they documented?" | Definition consistency, post-facto approval, tracking | Everything is "emergency," no distinction from normal |

I once audited a healthcare organization whose change management policy was ISO 27001-perfect on paper. But when I asked the DevOps engineer, "Tell me about your last emergency change," he casually mentioned, "Yeah, we had a bug in production last Thursday, so I pushed a hotfix directly to resolve it."

"Did that go through the change process?" I asked.

"Well, it was an emergency, so I documented it in the change system on Friday after it was already in production."

"Who approved it?"

"My manager approved it retroactively on Monday."

This revealed that their "emergency change" process was actually "implement first, document later"—completely undermining the control's purpose. Further inquiry showed this happened 2-3 times per week.

Incident Response Inquiries

Incident response capabilities are difficult to assess without witnessing actual incidents, making inquiry testing particularly valuable.

Incident Response Inquiry Framework:

| IR Component | Sample Questions | Assessment Focus | Capability Indicators |
| --- | --- | --- | --- |
| Detection | "How do you know when an incident has occurred? What alerts exist? Who monitors them?" | Coverage adequacy, signal-to-noise ratio | 24/7 monitoring, automated alerting, defined thresholds |
| Classification | "How do you decide if something is an incident? What's the severity framework? Who makes that call?" | Consistency of categorization, escalation triggers | Written criteria, examples of each severity, documented decisions |
| Notification | "Who gets notified when an incident occurs? How quickly? Through what channels?" | Timeliness, stakeholder inclusion, redundant communications | Contact trees tested, multiple notification paths, management awareness |
| Containment | "What's your first priority when you detect an incident? How do you contain it? Who makes containment decisions?" | Speed vs. thoroughness balance, decision authority | Pre-approved containment actions, playbooks for common scenarios |
| Investigation | "How do you investigate what happened? What evidence do you collect? Who performs the analysis?" | Forensic capability, evidence preservation | Forensic tools available, trained personnel, chain of custody procedures |
| Recovery | "How do you restore normal operations? How do you verify the threat is gone? What testing occurs?" | Validation rigor, recurrence prevention | Systematic validation, lessons learned documented |

The most revealing incident response question I ask is: "Tell me about your most recent security incident—not a theoretical scenario, an actual one. Walk me through exactly what happened."

If they say "We haven't had any incidents," that's a red flag—every organization has incidents, even if they're minor. It suggests detection gaps or definition problems.

If they describe a well-managed response with clear timelines, evidence collection, stakeholder communication, and lessons learned, that's a strong indicator of capability.

If they describe chaos—"Well, we had this thing a few months ago, and I think John handled it, but I'm not sure what he did or if it got resolved"—that reveals the control is theoretical, not operational.

Security Awareness Inquiries

Security awareness is the hardest control to assess because you're testing human behavior and knowledge retention. Inquiry is often your only viable approach.

Security Awareness Inquiry Framework:

| Awareness Area | Sample Questions | What You're Testing | Proficiency Indicators |
| --- | --- | --- | --- |
| Phishing Recognition | "You get an email that looks like it's from your CEO asking you to urgently verify your password—what do you do?" | Threat recognition, reporting behavior | Describes verification steps, mentions reporting mechanism, skepticism |
| Data Classification | "How do you know what data is confidential vs. public? Can you give me examples of each? How do you protect confidential data?" | Classification understanding, handling procedures | Accurate examples, specific protections for each class |
| Incident Reporting | "If you suspected your laptop was infected with malware, what would you do? Who would you contact?" | Reporting knowledge, appropriate escalation | Names specific contact/team, describes steps, urgency recognition |
| Physical Security | "Someone you don't recognize is following you into the building without badging—what do you do?" | Tailgating awareness, confrontation comfort | Describes challenge procedure, security contact, comfort with intervention |
| Password Security | "How do you create secure passwords? Do you reuse passwords across systems? How do you remember them all?" | Password hygiene, tool usage | Describes password manager, unique passwords, complexity understanding |
| Social Engineering | "Someone calls claiming to be from IT support and asks for your password to fix an issue—what do you do?" | Verification procedures, authority questioning | Refuses to provide credentials, describes verification process, reports attempt |

I don't just ask these questions to executives or IT staff—I ask random employees across departments. The variance in answers tells you whether your awareness program is actually creating organizational competency or just checking a compliance box.

During one retail company audit, I interviewed a cashier and asked about the company's data privacy policy. She confidently said, "Oh yeah, we had a video about that last month during onboarding. Something about not sharing customer information."

"Can you give me an example of customer information you handle?" I asked.

"Sure—credit card numbers, addresses, phone numbers, email, purchase history."

"How do you protect that information?"

Long pause. "Um... I don't share it with anyone?"

"Where is it stored? Who can access it? How long do you keep it?"

"I... I don't know. I guess in the computer somewhere?"

This revealed that training covered high-level concepts but didn't translate to operational practices. She couldn't articulate what data she handled, where it lived, who could access it, or how to protect it—despite confidently claiming awareness of the policy.

Phase 4: Corroborating Inquiry with Other Evidence

Inquiry alone is insufficient. Every statement made during interviews should be validated against objective evidence.

The Inquiry-Evidence Matrix

I use a systematic approach to map inquiry responses to corroborating evidence:

| Inquiry Response | Evidence Type | Specific Artifacts | Validation Method |
| --- | --- | --- | --- |
| "We perform access reviews quarterly" | Inspection | Access review spreadsheets, approval emails, remediation tickets | Verify dates, sample reviews, check remediation completion |
| "All changes require CAB approval" | Inspection + Analytical | Change tickets, CAB meeting minutes, deployment logs | Sample changes, confirm pre-implementation approval, identify unapproved changes |
| "Backups run nightly and are tested monthly" | Inspection + Reperformance | Backup logs, test restoration logs, verification reports | Review log completeness, witness test restoration, validate success criteria |
| "Security awareness training is required annually" | Inspection + Analytical | Training records, completion reports, assessment scores | Verify 100% completion, check recency, review assessment performance |
| "All vendors are approved before payments" | Analytical + Inspection | Vendor master file, approval workflows, payment records | Match vendors to approvals, identify unapproved vendors, sample payments |
| "Incidents are detected and responded to within 2 hours" | Inspection | Incident tickets, SIEM alerts, response timelines | Calculate actual response times, identify SLA breaches |
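The vendor-approval validation above is a set-difference problem once both extracts are in hand. A minimal sketch, assuming the vendor master and workflow exports each reduce to records keyed by a vendor ID (the field names are hypothetical):

```python
def unapproved_vendors(vendor_master, approvals):
    """Vendors in the master file with no matching approval record."""
    approved = {a["vendor_id"] for a in approvals}
    return sorted(v["vendor_id"] for v in vendor_master
                  if v["vendor_id"] not in approved)

# Toy extracts: three vendors on file, approvals for only two.
master = [{"vendor_id": "V001"}, {"vendor_id": "V002"}, {"vendor_id": "V003"}]
workflow = [{"vendor_id": "V001"}, {"vendor_id": "V003"}]
print(unapproved_vendors(master, workflow))  # ['V002']
```

This is the shape of the comparison that surfaces unapproved vendors; against real extracts you would also normalize IDs and date-bound both files to the audit period.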

At the manufacturing firm, Janet's statements during inquiry became testable claims:

Janet's Claims vs. Evidence:

| Janet's Statement | Evidence Gathered | Actual Finding | Discrepancy |
| --- | --- | --- | --- |
| "Every vendor is approved through the workflow system" | Vendor master file + workflow approvals | 47 vendors with no approval records | 12% of vendors unapproved |
| "The emergency vendor module is rarely used, and only for true emergencies" | Emergency module audit logs | 383 uses in 14 months, all by Janet, many non-emergency | Frequent use, pattern of abuse |
| "I don't have authority to approve my own vendor additions" | Access rights matrix + segregation of duties review | Janet had both vendor creation and approval rights | SoD violation |
| "All emergency vendors are eventually processed through normal approval" | Comparison of emergency vendors vs. approval workflow | 41 of 47 emergency vendors never received normal approval | 87% never formally approved |
| "This is standard practice approved by my manager" | Interview with AP Manager | Manager unaware of emergency module usage or approval circumvention | Contradictory testimony |

This evidence matrix transformed inquiry responses into documented findings. Without the inquiry, I wouldn't have known to look for emergency module usage. Without the corroborating evidence, I couldn't prove the control failure.
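Two of the follow-up analytics behind that matrix are a few lines each: counting emergency-module events per user, and flagging users who hold both creation and approval rights. A sketch with hypothetical field and right names:

```python
from collections import Counter

def usage_by_user(audit_log):
    """Emergency-module events per user ID."""
    return Counter(entry["user"] for entry in audit_log)

def sod_violations(rights):
    """Users holding both vendor-creation and approval rights."""
    conflict = {"create_vendor", "approve_vendor"}
    return sorted(u for u, r in rights.items() if conflict <= set(r))

# Toy data: one user dominates the log and holds both rights.
log = [{"user": "ap_super"}, {"user": "ap_super"}, {"user": "clerk1"}]
print(usage_by_user(log).most_common(1))  # [('ap_super', 2)]
print(sod_violations({"ap_super": {"create_vendor", "approve_vendor"},
                      "clerk1": {"create_vendor"}}))  # ['ap_super']
```

A single user accounting for nearly all break-glass activity, combined with a segregation-of-duties conflict, is exactly the pattern these two checks are built to surface.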

Triangulation: Validating Through Multiple Sources

The most reliable findings come from triangulation—confirming the same fact through three independent sources.

Triangulation Example: Access Termination Process

| Source Type | Source | Finding |
| --- | --- | --- |
| Inquiry | HR Manager interview | "We send termination notifications to IT within 1 hour of employment end" |
| Inspection | Sample of 25 termination notifications | 22 of 25 sent within 1 hour, 3 sent next business day |
| Analytical | Comparison of terminated employees vs. active directory status | 8 of 25 had AD accounts active 24+ hours post-termination |
| Inquiry | IT Manager interview | "We disable accounts within 4 hours of receiving notification from HR" |
| Inspection | Sample AD account disable logs | 19 of 25 disabled within 4 hours, 4 disabled within 8 hours, 2 disabled after 24 hours |

This triangulation reveals:

  • HR process mostly works (88% on-time notification)

  • IT process mostly works (76% within SLA)

  • But combined process has gaps (32% of accounts active beyond acceptable timeframe)

Each source validates parts of the story while revealing system weaknesses that inquiry alone wouldn't capture.
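The percentages in this triangulation drop out of simple timestamp arithmetic once each sampled termination is paired with its disable time. A sketch (the sample data is illustrative, not from the audit above):

```python
from datetime import timedelta

def pct_within(deltas, limit):
    """Share of sampled gaps at or under the allowed window, as a percent."""
    return 100 * sum(d <= limit for d in deltas) / len(deltas)

# Hours between employment end and AD disable for a toy sample of five.
gaps = [timedelta(hours=h) for h in (2, 3, 5, 26, 30)]
print(f"{pct_within(gaps, timedelta(hours=24)):.0f}% disabled within 24 hours")
```

Running the same function with a 1-hour limit on the HR notifications and a 4-hour limit on the IT disables gives you each team's SLA performance; running it end-to-end exposes the combined gap that neither team sees on its own.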

Identifying Inquiry-Evidence Gaps

Sometimes you can't corroborate what someone told you—and that gap itself is a finding.

Common Inquiry-Evidence Gaps:

| Gap Type | Example | Potential Causes | Audit Response |
| --- | --- | --- | --- |
| Evidence Doesn't Exist | Control owner describes detailed review process, no documentation exists | Control not actually performed, poor documentation, manual process without artifacts | Document as control design deficiency, recommend evidence creation |
| Evidence Contradicts Inquiry | Manager says backups tested monthly, logs show last test was 8 months ago | Knowledge gap, miscommunication, intentional misrepresentation | Present discrepancy, request explanation, escalate if not resolved |
| Evidence Inconclusive | Staff describes segregation of duties, access logs don't clearly show separation | Poor logging, inadequate systems, complex workarounds | Perform additional testing, observe process directly, recommend logging improvements |
| Timeframe Mismatch | Training described as "regular," records show 18-month gap | Definition disagreement, recent changes, inconsistent execution | Clarify expectations, document actual frequency, assess adequacy |
| Scope Mismatch | Security controls described for "all systems," evidence covers only 60% | Incomplete implementation, scope creep, organizational silos | Identify uncovered systems, assess risk, recommend expansion |
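Timeframe mismatches in particular are easy to quantify: sort the evidence dates and compare consecutive gaps against the cadence the interviewee claimed. A sketch (the dates are illustrative):

```python
from datetime import date

def gaps_exceeding(evidence_dates, claimed_days):
    """Gaps (in days) between consecutive evidence dates that
    exceed the cadence the interviewee claimed."""
    ds = sorted(evidence_dates)
    gaps = [(b - a).days for a, b in zip(ds, ds[1:])]
    return [g for g in gaps if g > claimed_days]

# "Backups tested monthly" (~31 days) vs. actual test-log dates.
tests = [date(2023, 1, 10), date(2023, 2, 8), date(2023, 10, 5)]
print(gaps_exceeding(tests, 31))  # [239]: one eight-month hole
```

The output gives you the exact discrepancy to bring back to the interviewee: not "the logs look thin," but "there is a 239-day gap between tests described as monthly."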

When I encounter gaps, I always return to the interviewee with a non-accusatory approach:

"When we spoke, you mentioned that vendor approvals are required for all payments. But when I reviewed the vendor master file, I found 47 vendors without approval records. Help me understand what I'm seeing—is this a system limitation, a different approval process, or something else?"

This framing—"help me understand"—positions the interviewee as the expert who can explain the discrepancy, not the suspect who got caught. It's more likely to produce honest clarification.

Phase 5: Documentation and Workpaper Standards

Inquiry evidence is only valuable if it's properly documented. I've seen solid findings dismissed because workpapers didn't meet audit standards.

Workpaper Requirements Across Frameworks

Different frameworks and audit standards have different documentation expectations:

| Standard | Workpaper Requirements | Specific Inquiry Documentation | Retention Period |
| --- | --- | --- | --- |
| AICPA SSAE 18 | Sufficient detail for experienced auditor to understand work performed | Interview summaries, interviewee identification, date/time, key responses, evidence references | 7 years |
| PCAOB AS 1215 | Document procedures performed, evidence obtained, conclusions reached | Who interviewed, when, what questions asked, responses received, follow-up performed | 7 years |
| IIA Standards | Document planning, execution, communication of results | Interview guides, notes, summaries, evidence linkage | Varies by organization (typically 5-7 years) |
| ISO 19011 | Objective evidence to support conclusions | Interview records, competence verification, conformity determination | Per certification body (typically 3-6 years) |
| GAO Yellow Book | Support for findings, conclusions, recommendations | Interview documentation, response accuracy verification, corroboration | 3 years minimum |

For SOC 2 audits, I structure inquiry workpapers to satisfy AICPA standards:

SOC 2 Inquiry Workpaper Template:

WORKPAPER REFERENCE: [Number]
CONTROL TESTED: [Control ID and Description]
TEST PROCEDURE: Inquiry with Control Owner
OBJECTIVE: Determine whether [control owner] understands their responsibilities for [control activity] and can describe how the control operates in practice.

INTERVIEWEE:
Name: [Full Name]
Title: [Job Title]
Department: [Department]
Tenure in Role: [Time period]
Date/Time: [When interview occurred]
Location: [Where]

QUESTIONS ASKED:
1. [Question 1]
   Response: [Summary of response]
2. [Question 2]
   Response: [Summary of response]
[etc.]

KEY FINDINGS FROM INQUIRY:
- [Finding 1]
- [Finding 2]
- [Finding 3]

CORROBORATING EVIDENCE:
- [Evidence type + reference to other workpapers]
- [Evidence type + reference to other workpapers]

INCONSISTENCIES/CONCERNS: [Any discrepancies between inquiry and other evidence, or between multiple interviewees]

CONCLUSION: Based on inquiry with [name] and corroborating evidence [references], the control [operates effectively / has design deficiencies / has operating effectiveness deficiencies] because [specific reason].

REVIEWED BY: [Name, Date]

This structure ensures any auditor reviewing my workpapers can understand what I did, what I learned, and how I reached my conclusion.

Audio Recording: Best Practices and Pitfalls

Audio recordings can be powerful evidence, but they're also risky if not handled properly.

Audio Recording Decision Matrix:

| Situation | Record? | Justification | Special Considerations |
| --- | --- | --- | --- |
| Routine control walkthrough | Optional | Notes usually sufficient, recording may inhibit candor | If recorded, inform interviewee, get permission |
| Complex technical procedure | Yes | Accuracy critical, difficult to capture in real-time notes | Supplement with screen recording if demonstrating system |
| Conflicting statements | Yes | Create contemporaneous record for later review | May need to interview again on record if prior interview wasn't recorded |
| Potential fraud indicators | Yes | Protect against later disputes, preserve exact wording | Consult legal before recording, consider privilege |
| Sensitive personnel matters | Depends | May create legal liability if poorly handled | Consult HR and legal, consider not recording |
| International interviewees | Check laws | Some jurisdictions require two-party consent | Verify local recording laws, get explicit permission |

When I do record, I follow this protocol:

  1. Inform Before Recording: "I'd like to record this conversation so I can focus on our discussion rather than note-taking. The recording will only be used for audit documentation and will be kept confidential. Is that okay with you?"

  2. Verbal Consent On Recording: "For the record, this is [my name] interviewing [their name] on [date] regarding [topic]. [Interviewee name], you've consented to this recording, correct?" [Wait for verbal "yes"]

  3. Secure Storage: Encrypt recordings, store in secure audit repository, limit access to audit team only

  4. Retention Management: Delete after retention period, maintain destruction log

  5. Transcription Approach: Transcribe key portions, not necessarily entire recording, mark timestamps for critical statements
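For the secure-storage step, one low-effort way to make the recording itself auditable is to log a content hash at intake, so a disputed transcript or copy can later be verified against the original file. A minimal sketch (the intake-hash convention is my own practice, not a requirement of any standard):

```python
import hashlib
import os
import tempfile

def fingerprint(path, chunk_size=8192):
    """SHA-256 of a recording file, streamed so large audio
    files never have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demo with a stand-in file; real use would point at the .wav/.m4a.
fd, demo_path = tempfile.mkstemp()
os.write(fd, b"interview audio bytes")
os.close(fd)
print(fingerprint(demo_path))
os.remove(demo_path)
```

Recording the hexdigest in the workpaper at intake ties any later dispute to exactly one file, which pairs naturally with the timestamped-transcription approach in step 5.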

I once recorded an interview with a database administrator who described their backup verification process in detail. Three months later, during report review, he claimed he'd never said the backups weren't tested—he insisted he'd told me they were tested quarterly. I was able to pull up the exact recording timestamp where he said, "Honestly, we haven't actually done a restore test in probably a year." That contemporaneous evidence settled the dispute.

Handling Confidential and Privileged Information

Sometimes inquiry reveals information that's sensitive, privileged, or legally protected. Handling it properly is critical.

Sensitive Information Categories:

| Information Type | Examples | Handling Requirements | Disclosure Limitations |
| --- | --- | --- | --- |
| Attorney-Client Privilege | Legal advice, litigation strategy, pending legal matters | Flag as privileged, separate storage, limited distribution | Do not include in audit reports, discuss only with legal counsel |
| Personal Health Information | Employee health conditions, medical accommodations | HIPAA compliance, need-to-know basis, redaction | Anonymize in findings, aggregate if reporting |
| Personally Identifiable Information | SSN, financial data, personal details | Minimize collection, encrypt storage, redact in reports | Only collect if essential, destroy when no longer needed |
| Trade Secrets | Proprietary processes, formulas, competitive intelligence | NDA coverage, secure handling, no external sharing | Describe in general terms in reports, specific details confidential |
| Pending Investigations | Fraud investigations, HR matters, regulatory inquiries | Coordinate with investigators, avoid interference | May not be included in audit scope, defer to investigators |
| Executive Compensation | Salary details, bonuses, equity | Confidential designation, executive-only distribution | Aggregate or anonymize if reporting on compensation controls |

During the vendor fraud investigation, Janet eventually revealed details about her personal financial situation (gambling debts) that motivated the fraud. While relevant to understanding motive, this personal information wasn't appropriate for the audit report. I documented it in investigation notes marked "Confidential - Legal Review Only" and shared only with legal counsel and the CFO, not in the general audit findings.

Creating Executive-Friendly Inquiry Summaries

Executives don't want to read 40 pages of interview transcripts. They want concise, actionable summaries.

Executive Summary Format for Inquiry Findings:

INQUIRY TESTING SUMMARY
Control Area: [e.g., Vendor Payment Approval Process]

SCOPE:
- 7 interviews conducted with personnel at three levels (executive, management, operational)
- 12 hours total interview time
- Covered vendor onboarding, payment approval, exception handling, fraud prevention

METHODOLOGY:
- Structured interviews using standardized guide
- Responses corroborated with system testing, log analysis, and documentation review
- Triangulation across multiple interviewees to validate consistency

KEY FINDINGS:

1. DESIGN DEFICIENCY - Emergency Vendor Approval Bypass
   Severity: High
   What We Found: Emergency vendor creation module allows vendors to be added and paid without the normal approval workflow. Used 383 times in 14 months, exclusively by a single individual with both creation and approval rights.
   Business Impact: Circumvents segregation of duties and approval controls; enabled $12.3M fraud.
   Supporting Evidence:
   - Interview with AP Supervisor (documented knowledge of bypass method)
   - Audit log analysis (383 emergency vendor uses)
   - Access rights review (SoD violation confirmed)
   - Payment transaction testing (41 payments to unapproved vendors)
   Recommendation: Disable the emergency vendor module or require dual approval for emergency vendor payments; implement a post-facto approval requirement with management review.

2. OPERATING EFFECTIVENESS GAP - Manager Awareness
   Severity: Medium
   [Similar structure for next finding]

This format gives executives what they need:

  • Clear severity classification

  • Plain-language problem description

  • Business impact framing

  • Evidence-based substantiation

  • Actionable recommendations

Phase 6: Advanced Inquiry Techniques—Expert-Level Skills

After mastering the fundamentals, there are advanced techniques that separate competent auditors from exceptional ones.

The Cognitive Interview Technique

Borrowed from forensic psychology, cognitive interviewing improves recall accuracy and detail.

Cognitive Interview Principles:

| Principle | Technique | Example Audit Application | What It Achieves |
| --- | --- | --- | --- |
| Mental Reinstatement | Have interviewee mentally recreate the context | "Close your eyes and picture the last time you performed this control. Where were you? What time of day? What did you do first?" | Improves memory recall, surfaces details |
| Report Everything | Encourage reporting all details, even seemingly irrelevant | "Tell me everything you remember, even small details that might not seem important" | Captures context that explains anomalies |
| Change Perspective | Ask interviewee to describe from different viewpoints | "If I were watching you perform this control, what would I see you doing?" | Reveals unconscious assumptions |
| Reverse Order | Have interviewee describe events in reverse chronological order | "Start with the end result and work backward—what happened right before that?" | Reduces fabrication (harder to lie in reverse) |
Reduces fabrication (harder to lie in reverse)

I used reverse-order questioning with Janet when her explanation of the vendor approval process seemed rehearsed. Instead of asking her to describe it start-to-finish (which she'd clearly prepared for), I asked:

"The vendor is now in the system and approved. What happened right before that approval? And what happened before that step? And before that?"

Going backward disrupted her prepared narrative and revealed inconsistencies she hadn't planned for. It's harder to maintain a false story when you have to tell it backwards.

Detecting Knowledge Gaps vs. Deception

Not every incorrect statement is a lie—sometimes people genuinely don't know. Distinguishing incompetence from deception changes your response.

Knowledge Gap Indicators vs. Deception Indicators:

| Behavior | Knowledge Gap | Deception | Distinguishing Question |
| --- | --- | --- | --- |
| "I don't know" | Freely admits gaps | Avoids specific admissions | "Who would know?" vs. "I'm not sure" |
| Vague answers | Lacks specific knowledge | Deliberately obscures truth | Welcomes clarification vs. resists follow-up |
| Inconsistency | Changes as remembers better | Changes as story evolves | Corrections align with evidence vs. diverge |
| Confidence level | Uncertain, seeks validation | Overly confident or defensive | "I think so, but let me check" vs. "Definitely" |
| Assistance seeking | Offers to find information | Deflects to others | Proactive vs. evasive |
Proactive vs. evasive

During one access control audit, a system administrator told me that privileged access was "strictly controlled" but couldn't describe the specific process. When I asked who did control it, he immediately said "Let me check with my manager and get you the exact procedure." That's a knowledge gap—he knew the control existed but didn't know the details.

Contrast that with a different admin who confidently asserted "All privileged access is reviewed monthly" but then changed it to "quarterly" when I asked to see records, then to "we're actually setting that up now" when I probed further. That pattern—confident assertion followed by progressive backpedaling—signals deception, not ignorance.

Group Interviews vs. Individual Interviews

Sometimes interviewing people together reveals different information than individual interviews.

Group vs. Individual Interview Dynamics:

| Aspect | Individual Interview | Group Interview | When to Use Each |
| --- | --- | --- | --- |
| Candor | Higher—no peer pressure or hierarchy concerns | Lower—people self-censor, defer to authority | Individual: Sensitive topics, hierarchical issues<br>Group: Collaborative processes |
| Coverage | Deeper on individual role | Broader across team | Individual: Specific control ownership<br>Group: End-to-end processes |
| Contradiction | Must interview multiple people separately | Emerges in real-time discussion | Individual: Detecting inconsistencies<br>Group: Resolving inconsistencies |
| Time Efficiency | Lower—multiple sessions needed | Higher—cover multiple people simultaneously | Individual: Key controls<br>Group: General awareness |
| Group Dynamics | Not observable | Observable—who defers to whom, who dominates | Group: Understanding culture, hierarchy |

I typically start with group interviews for process understanding, then conduct individual follow-ups for control ownership and sensitive topics.

Example: For change management, I'll do a group walkthrough with the change advisory board (CAB) to understand the overall process and observe team dynamics. Then I'll individually interview the developer, the QA engineer, and the release manager to understand their specific roles and any concerns they might not raise in front of the group.

In one memorable group interview, the junior DBA mentioned "sometimes we approve changes via Slack when people can't make the CAB meeting." The CAB chair immediately shot him a look and said "That's just for emergencies." But that exchange told me exactly what to investigate further—and individual interviews confirmed that Slack approvals were routine, not exceptional.

The Strategic Silence Technique

One of the most powerful interview techniques is... saying nothing.

Strategic Silence Applications:

| Scenario | How to Use | What It Reveals | Timing |
| --- | --- | --- | --- |
| After Initial Answer | Ask question, then wait 5-10 seconds after they finish | People often add critical details to fill silence | After any substantive answer |
| When Detecting Evasion | Ask direct question, wait silently for answer | Discomfort increases, making evasion harder to maintain | When answers are vague or deflective |
| After Contradiction | Point out inconsistency, then say nothing | Person feels pressure to explain or resolve | When you've identified a discrepancy |
| During Stress Indicators | Notice discomfort cues, pause questioning | Gives person time to decide whether to elaborate | When you sense they're holding back |
When you sense they're holding back

The silence technique is uncomfortable—for both interviewer and interviewee—but incredibly effective. Most people cannot tolerate silence in conversation and will fill it, often with information they hadn't planned to share.

When Janet gave me her initial explanation of the emergency vendor module, I stayed silent for about 8 seconds, just looking at my notes. She then added, unprompted: "I mean, it's really just for when we have urgent situations and can't wait for the normal approval process." That addition—which I hadn't asked for—opened the entire line of questioning about what qualified as "urgent" and how often it happened.

Phase 7: Common Inquiry Testing Pitfalls and How to Avoid Them

Even experienced auditors fall into predictable traps. I've learned these lessons the hard way.

Pitfall 1: Confirmation Bias

The Problem: Seeking information that confirms your hypothesis while ignoring contradictory evidence.

How It Manifests:

  • Asking leading questions that suggest the desired answer

  • Interpreting ambiguous responses as supportive

  • Dismissing contradictory statements as outliers

  • Focusing inquiry on areas you expect to find issues

Real Example: During a PCI DSS audit, I suspected that firewall rules weren't being reviewed regularly. I asked the network engineer, "The firewall review process isn't really happening monthly, is it?" He responded, "Well, sometimes we get busy..."—which I interpreted as confirmation. But when I later reviewed change tickets, they showed monthly reviews were documented. What he actually meant was "We get busy so sometimes we complete them late in the month, but they do happen." My biased question and interpretation nearly led to a false finding.

How to Avoid:

  • Ask open-ended questions first: "Describe your firewall review process" before "Do you perform monthly reviews?"

  • Actively seek disconfirming evidence

  • Have someone else review your inquiry notes for bias

  • Document answers verbatim, not just your interpretation

Pitfall 2: Accepting Generalities Without Specifics

The Problem: Letting interviewees give high-level, theoretical answers without proving they know operational details.

How It Manifests:

  • "Yes, we do that" without describing how

  • "It's all documented" without showing documentation

  • "That's in the policy" without demonstrating understanding

  • "Everyone knows" without testing actual knowledge

Real Example: A security manager told me "All our developers receive secure coding training." I accepted that at face value. Later, when I randomly interviewed three developers, none could name a single secure coding principle or describe the training content. The training existed—as a 15-minute video during onboarding that no one remembered. Generality accepted, reality missed.

How to Avoid:

  • Follow every general claim with "Can you give me a specific example?"

  • Ask "Show me" or "Walk me through it" instead of "Do you do this?"

  • Request demonstration, not just description

  • Validate claims with evidence immediately during or after interview

Pitfall 3: Single Source Reliance

The Problem: Building findings on a single interview without corroboration.

How It Manifests:

  • Accepting one person's description as fact

  • Not checking if other stakeholders agree

  • Missing that the person you interviewed is the exception, not the norm

  • Failing to validate interview claims with evidence

Real Example: The CISO described a mature vulnerability management program with monthly scanning, two-week patching SLAs, and executive reporting. I wrote it up as a strong control based solely on his description. Later, another auditor from our team interviewed the IT operations manager and learned that scanning was actually quarterly, patching often took 60+ days, and reports hadn't gone to executives in six months. A single source nearly created a materially misleading report.

How to Avoid:

  • Interview at least two people for any critical control

  • Always corroborate inquiry with objective evidence

  • Cross-reference statements from different organizational levels

  • Document when you're unable to corroborate and flag as limitation
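This kind of corroboration is often a simple data cross-check, exactly the comparison that surfaced the vendor scheme in the opening story. Here's a minimal sketch in Python; the field name `vendor_id` and the record shapes are illustrative, not taken from any particular ERP:

```python
# Hedged sketch: cross-reference a vendor master file extract against
# approval workflow records to find vendors that were never approved.
# Field names ("vendor_id") are hypothetical placeholders.

def unapproved_vendors(master_records, approval_records):
    """Return vendor IDs present in the master file but missing
    from the approval workflow records."""
    approved_ids = {rec["vendor_id"] for rec in approval_records}
    return sorted(
        rec["vendor_id"] for rec in master_records
        if rec["vendor_id"] not in approved_ids
    )

# Tiny demo with made-up data
master = [{"vendor_id": "V100"}, {"vendor_id": "V101"}, {"vendor_id": "V102"}]
approvals = [{"vendor_id": "V100"}, {"vendor_id": "V102"}]
print(unapproved_vendors(master, approvals))  # ['V101']
```

The point isn't the code—it's the habit: every material interview claim should map to an objective comparison like this one.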

Pitfall 4: Failure to Probe Red Flags

The Problem: Noticing concerning signals but not following up adequately.

How It Manifests:

  • Accepting explanations at face value when they don't quite make sense

  • Moving on when interviewee shows discomfort

  • Ignoring inconsistencies to stay on schedule

  • Avoiding difficult follow-up to maintain rapport

Real Example: During an access review audit, the IT manager explained that terminated employee access was "disabled immediately" but then mentioned "Well, sometimes there's a delay if it's after hours." That qualifier was a red flag I initially let pass. When I finally probed, I learned "sometimes" meant "usually"—after-hours terminations regularly resulted in 12-24 hour delays because the on-call person wasn't trained on the process. My initial failure to probe nearly missed a significant control deficiency.

How to Avoid:

  • Create a "red flag list" during interviews for follow-up

  • When you notice discomfort or hesitation, note it and return to it

  • Use phrases like "Tell me more about that" when something seems off

  • Schedule extra time for follow-up on concerning areas

Pitfall 5: Over-Reliance on Inquiry for Technical Controls

The Problem: Using inquiry as primary evidence for controls that should be validated through testing.

How It Manifests:

  • Asking "Are logs reviewed?" instead of reviewing logs yourself

  • Accepting "Yes, encryption is enabled" without checking configurations

  • Relying on "Backups are tested monthly" without witnessing tests

  • Using interview as sole evidence for technical effectiveness

Real Example: I asked a security engineer, "Is multi-factor authentication enabled for all VPN access?" He said, "Yes, absolutely." I documented it as an effective control based on that inquiry. Later testing revealed MFA was enabled for new users but hadn't been enforced for 87 legacy accounts from before the MFA requirement. The engineer genuinely believed it was universal—he just didn't know about the grandfather clause. Inquiry alone was insufficient for a technical control.

How to Avoid:

  • Use inquiry for understanding design and intent

  • Use inspection/testing for validating technical implementation

  • When interviewee says a technical control is in place, respond: "Great, can you show me?"

  • Document inquiry conclusions as "design understanding" not "operating effectiveness"
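To make "show me" concrete: instead of documenting the engineer's verbal "yes," pull an account export and test the claim against data. A hedged sketch in Python—the account fields (`username`, `active`, `mfa_enforced`) are hypothetical, stand-ins for whatever your identity platform actually exports:

```python
# Hedged sketch: test "MFA is enabled for all VPN access" against an
# account export rather than accepting the claim on inquiry alone.
# Field names are hypothetical placeholders for a real export format.

def accounts_without_mfa(accounts):
    """Return active accounts where MFA is not enforced -- the
    population a verbal assurance would have missed."""
    return [a["username"] for a in accounts
            if a["active"] and not a["mfa_enforced"]]

accounts = [
    {"username": "alice",      "active": True,  "mfa_enforced": True},
    {"username": "legacy_svc", "active": True,  "mfa_enforced": False},
    {"username": "bob",        "active": False, "mfa_enforced": False},  # disabled: out of scope
]
print(accounts_without_mfa(accounts))  # ['legacy_svc']
```

A non-empty result doesn't make the interviewee a liar—as in the example above, it usually means they didn't know about an exception population.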

The Professional Growth Path: Becoming an Expert Interviewer

Effective inquiry testing is a skill that develops over time. Here's the progression I've observed across hundreds of auditors:

Inquiry Skill Maturity Levels:

| Level | Characteristics | Typical Behavior | Development Focus |
|---|---|---|---|
| 1 - Novice | Reads questions from script, accepts answers at face value, minimal follow-up | "Do you perform quarterly access reviews?" "Yes." "Okay, thank you." | Learn to probe, ask open questions, build rapport |
| 2 - Competent | Uses structured guides flexibly, probes some responses, documents adequately | Asks follow-up questions, recognizes inconsistencies, gets specific examples | Develop pattern recognition, improve corroboration, practice active listening |
| 3 - Proficient | Adapts questions to responses, detects evasion, triangulates evidence | Modifies approach mid-interview, pursues red flags, validates claims | Master non-verbal cues, advanced questioning techniques, strategic silence |
| 4 - Expert | Creates psychological safety for truth-telling, elicits information others miss, distinguishes deception from ignorance | Makes people want to share, catches subtle inconsistencies, knows when to push and when to back off | Mentor others, refine industry-specific techniques, continuous learning |
| 5 - Master | Transforms interviews into collaborative problem-solving, builds organizational trust, finds fraud others miss | People seek them out for difficult investigations, uncovers systemic issues, creates lasting relationships | Teach methodology, influence standards, develop new techniques |

I've been doing this for 15+ years and still learn something from every interview. The key is reflective practice—after each interview, ask yourself:

  • What did I miss?

  • What red flags did I not pursue?

  • What could I have asked differently?

  • Where did my biases show up?

  • What will I do differently next time?

That self-assessment is what drives progression from novice to expert.

Key Takeaways: Mastering Inquiry-Based Audit Procedures

After sharing everything I've learned about inquiry testing across hundreds of audits and that transformative fraud investigation, here are the critical principles to remember:

1. Inquiry is Essential but Insufficient

Inquiry testing provides context, understanding, and human insight that no automated test can match. But it must always be corroborated with objective evidence. Use inquiry to know what to test, then test to validate what you learned.

2. Preparation Determines Quality

The best interviews start with clear objectives, careful interviewee selection, background research, and structured guides. The 30 minutes you spend preparing saves hours during execution and prevents missed opportunities.

3. Psychology Matters as Much as Procedure

Understanding how people respond under pressure, what motivates truthfulness, and how to create psychological safety is as important as knowing what questions to ask. Master the human dynamics, not just the technical checklists.

4. Listen More, Talk Less

The best interviewers are those who create space for interviewees to share. Ask open questions, use strategic silence, and resist the urge to fill every gap with more questions. Often the most valuable information comes unprompted.

5. Probe Red Flags Relentlessly

When something doesn't add up, when someone shows discomfort, when answers conflict with evidence—don't let it go. The difference between a competent audit and an exceptional one is often the willingness to pursue uncomfortable follow-up.

6. Document Contemporaneously and Completely

Notes taken during the interview are worth 10x more than notes from memory hours later. Document verbatim quotes, non-verbal observations, and your own reactions in real-time. Your workpapers should tell the complete story.

7. Corroborate Everything That Matters

Any finding based primarily on inquiry alone is vulnerable. Triangulate critical claims through multiple sources—other interviews, system evidence, documentation review, and observation. Build findings on foundations, not single data points.

8. Adapt to Context and Culture

Different industries, organizational cultures, and geographic regions require different approaches. The inquiry style that works for financial services may fail in healthcare. The techniques effective in North America may be inappropriate in Asia. Stay culturally aware.

9. Ethics Guide Every Decision

Inquiry testing gives you power—power to expose failures, damage reputations, and impact careers. Use that power responsibly. Seek truth, not scalps. Build understanding, not blame. Focus on improving controls, not punishing people.

10. Continuous Improvement is Non-Negotiable

Every interview is a learning opportunity. Reflect on what worked, what didn't, and what you'd change. Study interrogation techniques, read psychology research, learn from experienced investigators. Inquiry mastery is a journey, not a destination.

Your Path Forward: Elevating Your Inquiry Testing Practice

Whether you're conducting your first control audit or your thousandth, there's always room to improve your inquiry methodology. Here's what I recommend you do immediately:

This Week:

  1. Review Your Last Audit: Look at your inquiry documentation. Did you probe red flags? Did you corroborate claims? Did you interview the right people? What would you do differently?

  2. Create Your Interview Guide Template: Build a structured guide for your most common control types. Don't start from scratch every time.

  3. Practice Active Listening: In your next meeting (audit or otherwise), focus entirely on listening. Notice how much more you learn when you talk less.

This Month:

  1. Study One New Technique: Pick something from this article you haven't tried—cognitive interviewing, strategic silence, group dynamics observation—and deliberately practice it in your next three interviews.

  2. Build Your Evidence Matrix: For your next audit, create the inquiry-to-evidence mapping before you start. Know exactly how you'll corroborate each claim.

  3. Get Feedback: Have a colleague review your interview documentation. Ask: "Is it clear what I did and what I learned? Are there gaps?"
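An evidence matrix doesn't need a special tool—a spreadsheet, or even a list of records, works. Here's a minimal sketch in Python; the claims, evidence types, and status values are placeholders you'd replace with your own audit's content:

```python
# Hedged sketch of an inquiry-to-evidence mapping: each interview claim
# is paired, before fieldwork, with the objective evidence that will
# corroborate it. All entries below are illustrative placeholders.

evidence_matrix = [
    {"claim": "Access reviews performed quarterly",
     "interviewee": "IT manager",
     "corroborating_evidence": "Review sign-off records for last 4 quarters",
     "status": "pending"},
    {"claim": "Backups tested monthly",
     "interviewee": "Backup administrator",
     "corroborating_evidence": "Restore test tickets and logs",
     "status": "pending"},
]

def uncorroborated(matrix):
    """Claims still resting on inquiry alone -- your open-items list."""
    return [row["claim"] for row in matrix if row["status"] != "corroborated"]

print(uncorroborated(evidence_matrix))
```

Walking into the first interview with this mapping already built means you never have to decide mid-interview how a claim will be validated.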

This Quarter:

  1. Benchmark Your Approach: Compare your inquiry methodology against the frameworks in this article. Where are you strong? Where do you have gaps?

  2. Expand Your Interviewee Range: If you typically interview only managers, start including front-line staff. If you focus on IT, branch into business units. Diverse perspectives reveal different truths.

  3. Measure Your Effectiveness: Track how often your inquiry findings are supported by corroborating evidence, how many control deficiencies you identify through inquiry versus other methods, and how frequently you need to re-interview for missing information. Trends will guide improvement.
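One of those metrics—the share of inquiry findings later supported by corroborating evidence—takes only a few lines to track. A sketch with made-up findings data; the record fields are hypothetical:

```python
# Hedged sketch: corroboration rate for inquiry-based findings.
# Field names ("method", "corroborated") are illustrative placeholders.

def corroboration_rate(findings):
    """Fraction of inquiry-based findings backed by objective evidence,
    or None if there were no inquiry-based findings."""
    inquiry = [f for f in findings if f["method"] == "inquiry"]
    if not inquiry:
        return None
    supported = sum(1 for f in inquiry if f["corroborated"])
    return supported / len(inquiry)

findings = [
    {"method": "inquiry",    "corroborated": True},
    {"method": "inquiry",    "corroborated": False},
    {"method": "inspection", "corroborated": True},   # not inquiry: excluded
    {"method": "inquiry",    "corroborated": True},
]
print(round(corroboration_rate(findings), 2))  # 0.67
```

A falling rate over several audits is a signal worth investigating: either your corroboration discipline is slipping or your interviewees' claims are drifting from reality.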

This Year:

  1. Develop Specialized Skills: Pick one advanced technique—fraud detection, cognitive interviewing, cultural adaptation—and invest in formal training or deep self-study.

  2. Mentor Others: Teaching inquiry techniques to junior auditors forces you to articulate your tacit knowledge and often reveals gaps in your own understanding.

  3. Build Your Reputation: Become known as the auditor who finds what others miss, not through aggressive interrogation but through skilled, respectful inquiry that people actually want to participate in.

The manufacturing firm where I uncovered the $12.3 million fraud? They implemented every recommendation we made—emergency vendor module disabled, segregation of duties enforced, manager oversight enhanced, and fraud awareness training for all AP staff. But more importantly, they changed their culture around inquiry and questioning. Staff are now encouraged and trained to ask "why" when something doesn't make sense, to probe inconsistencies, and to escalate concerns without fear.

That's the ultimate value of effective inquiry testing—not just catching fraud after it happens, but creating organizational cultures where people ask the right questions before problems spiral.

At PentesterWorld, we don't just conduct audits—we train organizations to think like auditors. We teach your teams the inquiry techniques that transform checkbox compliance into genuine control effectiveness. We show you how to ask questions that reveal truth, build evidence that withstands scrutiny, and create audit documentation that satisfies the toughest reviewers.

Because in cybersecurity and compliance, the questions you don't ask are often the ones that matter most.


Want to transform your organization's audit and compliance capabilities? Have questions about implementing inquiry-based testing in your environment? Visit PentesterWorld where we turn audit theory into practical investigative excellence. Our team has uncovered everything from million-dollar fraud to critical security gaps—all through the power of asking the right questions, the right way. Let's build your inquiry mastery together.
