SYSTEM ANALYSIS AND DESIGN: FACT FINDING TECHNIQUES

UNIT 1: FACT FINDING TECHNIQUES

1. What is Fact Finding?

Fact finding (also called information gathering or data collection) is the process of discovering and documenting facts about the current system, user needs, business processes, constraints, and stakeholder expectations.

Purpose: To replace assumptions, opinions, and guesses with verifiable truth before designing a new system.

1.1 Why Fact Finding is Critical

  • Avoids “analysis paralysis” – Structured techniques keep data gathering focused and time-bounded.
  • Reduces rework – Correct facts early prevent costly changes later.
  • Builds stakeholder trust – Users see that their input matters.
  • Supports all SAD phases – Feasibility, requirements, design, testing.

1.2 Types of Facts Needed

Fact Category | Examples
Organizational | Structure, culture, decision-making, policies
Process | Current workflows, transaction volumes, bottlenecks
User | Roles, skills, frustrations, workarounds
Technical | Hardware, software, networks, data formats
External | Regulations, competitors, customer expectations
Constraints | Budget, timeline, legal, security

[Figure: Fact Finding Triangulation Overview]

2. Common Fact Finding Techniques

There are seven core techniques. Each has strengths and weaknesses. Use a combination for triangulation.

Technique | Best For | Typical Output
1. Interview | In-depth understanding of opinions, reasons, history | Interview notes, transcripts
2. Questionnaire/Survey | Gathering quantitative data from many people | Statistical summaries
3. Observation | Seeing real behavior, not self-reported | Work logs, video, checklists
4. Document Analysis | Understanding existing rules, forms, reports | Process maps, data dictionaries
5. Sampling | Reducing volume when full data is too large | Representative data sets
6. Research / Benchmarking | Learning industry best practices | Reports, comparisons
7. Workshop / JAD | Reaching consensus quickly among stakeholders | Prioritized requirements, models

3. Technique 1: Interview

3.1 Definition

A face-to-face or remote conversation between an analyst and a stakeholder to gather information.

3.2 Types of Interviews

Type | Description | Example Question
Unstructured | Open-ended, free-flowing conversation. Good for exploring unknown areas. | “Tell me about your daily work.”
Structured | Predetermined questions, often closed-ended. Good for comparing answers. | “On a scale of 1–5, how satisfied are you with the current system?”
Semi-structured | Mix of both – core questions + follow-ups. Most common in SAD. | “What reports do you use? (Then probe: How often? Which fields are missing?)”

3.3 Step-by-Step Interview Process

  1. Planning

    • Define objective (e.g., “Understand inventory reorder process”)
    • Select interviewees (managers, operators, customers)
    • Schedule 30–60 minutes
    • Prepare question list
  2. Question Design (avoid common errors)

    • Open-ended: “Describe the steps when a customer returns a product.”
    • Closed-ended: “Do you use the return form? Yes/No.”
    • Probing: “You mentioned delays. Can you give an example?”
    • Avoid leading: ❌ “Don’t you think the system is slow?” → ✅ “How would you rate system response time?”
  3. Conducting the Interview

    • Start with rapport (thank, explain purpose, confidentiality)
    • Listen more than talk (80/20 rule)
    • Take notes or record (with permission)
    • Use active listening: “What I hear you saying is…”
  4. Post-Interview

    • Write summary within 24 hours
    • Send to interviewee for validation
    • Extract facts and requirements

[Figure: The 4-Step Interview Process]

3.4 Advantages & Disadvantages

Advantages | Disadvantages
Rich, detailed data | Time-consuming (schedule, travel, transcribe)
Can explore unexpected topics | Interviewer bias can affect responses
Builds relationships | May be intimidating for some users
Clarify ambiguity immediately | Scheduling multiple people is hard

3.5 Example Interview Question Set (for a Library System)

Objective: Understand book borrowing process.

  • Open: “Walk me through what happens when a member brings a book to the counter.”
  • Closed: “Does the system automatically calculate the due date? Yes/No.”
  • Probing: “You mentioned fines – how are they calculated? Are there exceptions?”
  • Opinion: “What is the biggest frustration with the current system?”

4. Technique 2: Questionnaire / Survey

4.1 Definition

A set of written questions distributed to many people (often geographically dispersed) to collect standardized data.

4.2 When to Use

  • Large number of stakeholders (100+)
  • Need for statistical validity
  • Sensitive topics where anonymity is desired
  • Preliminary fact gathering before interviews

4.3 Types of Questions

Type | Example
Multiple choice | “How often do you use the report? (Daily / Weekly / Monthly / Never)”
Likert scale (1–5) | “The system is easy to use: 1=Strongly disagree … 5=Strongly agree”
Rank order | “Rank the following features from most to least important: …”
Open-ended (short) | “What one thing would improve your work most?”
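Likert-scale answers are easy to summarize numerically. A minimal Python sketch with hypothetical responses (the data and the “agree = 4 or 5” cutoff are illustrative assumptions, not part of any standard):

```python
from statistics import mean

# Hypothetical Likert responses (1 = Strongly disagree ... 5 = Strongly agree)
# to the statement "The system is easy to use".
responses = [4, 5, 3, 2, 4, 4, 5, 3, 1, 4]

avg = mean(responses)                                              # central tendency
agree_pct = sum(r >= 4 for r in responses) * 100 / len(responses)  # % rating 4 or 5

print(f"Mean rating: {avg}, agreement: {agree_pct:.0f}%")
```

Reporting both the mean and the share of “agree” answers guards against a middling mean hiding a polarized user base.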

[Figure: Questionnaire Design Types]

4.4 Designing an Effective Questionnaire

Rules:

  • Keep it short (10–15 questions, <10 minutes)
  • Use simple, unambiguous language
  • Group related questions
  • Avoid double-barreled questions (e.g., “Do you like the speed and reliability?” – two separate issues)
  • Pilot test with 2–3 users

Example structure:

  • Demographics (role, department, experience)
  • Current system usage (frequency, tasks)
  • Satisfaction ratings (scale)
  • Problems (checkboxes + open “other”)
  • Future needs (priorities)

4.5 Advantages & Disadvantages

Advantages | Disadvantages
Can reach hundreds of people | Low response rate (10–30% typical)
Low cost per respondent | No follow-up or clarification
Easy to analyze quantitatively | Cannot observe tone or body language
Anonymous answers may be more honest | Poorly designed questions yield useless data

4.6 Response Rate Improvement Tips

  • Offer small incentive (gift card, donation)
  • Send reminder after 1 week
  • Keep it very short
  • Explain how results will be used
  • Get endorsement from a manager

5. Technique 3: Observation

5.1 Definition

Watching users perform their actual work in their real environment, without interfering (or with minimal interference).

5.2 Types of Observation

Type | Description | Example
Passive | Analyst watches but does not interact. Best for understanding natural behavior. | Sitting in a call center, silently watching an agent take calls.
Active | Analyst may ask occasional questions while observing. | “What are you doing now? Why?”
Participant | Analyst performs the work alongside users (if trained). | Working as a cashier for one shift to feel the pressure.

[Figure: Observation Techniques Overview]

5.3 Structured Observation Methods

  • Time and motion logs – Record how long each task takes.
  • Work sampling – Observe at random intervals (e.g., every 30 minutes) and note what the user is doing.
  • Video recording (with permission) – Allows review later.
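Work sampling works best when the worker cannot anticipate the observation times. A small Python sketch that draws a random observation schedule for one shift (the shift length, number of observations, and seed are assumptions for illustration):

```python
import random

def work_sampling_schedule(shift_minutes=480, n_observations=16, seed=42):
    """Pick random minutes within a shift at which to observe the worker.

    Random (rather than fixed) intervals prevent workers from anticipating
    the observer. All parameters here are hypothetical defaults.
    """
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    return sorted(rng.sample(range(shift_minutes), n_observations))

schedule = work_sampling_schedule()
# Format as clock times, assuming an 08:00 shift start.
print([f"{8 + m // 60:02d}:{m % 60:02d}" for m in schedule])
```

At each scheduled minute the analyst records what the user is doing; the proportion of observations per activity estimates how the shift is actually spent.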

5.4 Steps

  1. Get permission from management and users (explain purpose, confidentiality).
  2. Schedule observation during typical and peak periods.
  3. Prepare a checklist of what to look for (e.g., steps, delays, errors, workarounds).
  4. Observe without judging – users may be nervous.
  5. Take detailed notes – include timestamps, screens, physical actions.
  6. Follow up with users to clarify anything seen.

5.5 Advantages & Disadvantages

Advantages | Disadvantages
Captures what people actually do, not what they say they do | Hawthorne effect – users change behavior when watched
Reveals tacit knowledge (things users don’t think to mention) | Time-consuming; limited to small number of users
Discovers workarounds and inefficiencies | Only captures what happens during observation period (may miss rare events)
No reliance on memory | Cannot observe mental processes (decision-making, reasoning)

5.6 Example Observation Checklist (Warehouse Picking)

Activity | Observed? | Notes
Worker uses paper list or scanner? | |
Time to locate each item | |
Errors (wrong item picked) | |
Does worker need to walk back for missing info? | |
Any waiting time? | |

6. Technique 4: Document Analysis

6.1 Definition

Examining existing written materials that describe the current system, business rules, data, or procedures.

6.2 Types of Documents to Analyze

Document Type | What It Reveals
Organization charts | Reporting structure, decision authority
Policy manuals | Rules that must be enforced by the system
Procedure manuals | Step-by-step processes (often outdated!)
Forms and screens | Data fields, validations, workflows
Reports | Information needs, data sources, frequency
System documentation | Existing software design, database schemas
Meeting minutes | Past decisions, known issues
Audit logs / error logs | Actual system usage and problems
Job descriptions | User responsibilities

6.3 How to Perform Document Analysis

  1. Collect all relevant documents (ask users: “What forms do you fill out?”)
  2. Scan for obvious facts (data elements, rules, frequencies).
  3. Analyze in detail:
    • For a form: list every field, its source, validation rules, optional/mandatory.
    • For a report: who uses it, when, why, and what decisions are based on it.
  4. Look for discrepancies between what documents say and what users do (indicates outdated procedures).
  5. Extract requirements (e.g., “The form has a field for ‘Tax ID’ → system must capture tax ID.”)

6.4 Advantages & Disadvantages

Advantages | Disadvantages
Low cost (documents are often available) | Documents may be outdated or inaccurate
Provides historical context | Cannot reveal unwritten practices
Unobtrusive (no user time needed) | May be missing important information
Good starting point before interviews | Can be voluminous and time-consuming to review

6.5 Example: Analyzing an Expense Report Form

Form fields: Employee name, date, description, amount, client code, manager signature.
Findings:

  • “Client code” is a free text field but users often write “N/A” → missing validation.
  • Manager signature is handwritten → no digital approval workflow.
  • No field for receipt attachment → users keep paper receipts.
Requirements derived:

  • Client code must be selected from a validated list.
  • Digital approval with audit trail.
  • Receipt image upload.

7. Technique 5: Sampling

7.1 Definition

Selecting a representative subset of data or transactions to analyze, rather than examining every single instance.

7.2 When to Use

  • Large volume of transactions (e.g., 1 million invoices per year)
  • Need to understand error rates or patterns
  • Time or budget constraints

7.3 Sampling Methods

Method | Description | Example
Random sampling | Every item has equal chance of selection | Pick 100 invoice numbers using a random number generator.
Stratified sampling | Divide population into groups (strata), then sample from each | Sample 50 high-value orders + 50 low-value orders to see differences.
Systematic sampling | Pick every nth item | Every 100th customer transaction.
Convenience sampling | Choose easiest items (least reliable) | First 50 invoices on the pile – not recommended.
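The three reliable methods above can be sketched in a few lines of Python (the invoice data and the high/low split are hypothetical):

```python
import random

rng = random.Random(7)  # fixed seed so the sketch is reproducible

# Hypothetical population: 1,000 invoices, every 5th one high-value.
invoices = [{"id": i, "value": "high" if i % 5 == 0 else "low"}
            for i in range(1, 1001)]

# Random sampling: every invoice has an equal chance of selection.
random_sample = rng.sample(invoices, 100)

# Systematic sampling: every nth item (here n = 10).
systematic_sample = invoices[::10]

# Stratified sampling: sample separately from each stratum so both
# high- and low-value orders are guaranteed to be represented.
high = [inv for inv in invoices if inv["value"] == "high"]
low = [inv for inv in invoices if inv["value"] == "low"]
stratified_sample = rng.sample(high, 50) + rng.sample(low, 50)
```

Stratification matters when a small stratum (e.g., high-value orders) would otherwise be drowned out by the majority in a purely random draw.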

[Figure: Sampling Methods Overview]

7.4 Determining Sample Size

Rule of thumb for SAD:

  • For error rate estimation: at least 30–50 items.
  • For process discovery: 10–20 typical examples + 5 edge cases.
  • Use formula for confidence level: n = [Z^2 * p(1-p)] / e^2 (simplified: for 95% confidence, 5% margin, use ~400 for large populations).
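The confidence-level formula can be wrapped in a small Python helper (Z = 1.96 and p = 0.5 are the standard conservative defaults; the function name and signature are my own, not a library API):

```python
import math

def sample_size(confidence_z=1.96, p=0.5, margin=0.05):
    """n = Z^2 * p(1-p) / e^2 for a large population.

    p = 0.5 is the conservative (worst-case) proportion; Z = 1.96
    corresponds to 95% confidence; margin is the acceptable error.
    """
    return math.ceil(confidence_z**2 * p * (1 - p) / margin**2)

print(sample_size())  # 385, the source of the "~400" rule of thumb
```

Tightening the margin to 3% roughly triples the required sample, which is why most SAD surveys settle for 5%.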

7.5 Advantages & Disadvantages

Advantages | Disadvantages
Saves time and effort | Risk of unrepresentative sample
Practical for large datasets | Requires statistical knowledge to do properly
Can identify patterns and anomalies | May miss rare but critical events

7.6 Example: Sampling Customer Complaints

Population: 5,000 complaints last year.
Sample: 200 randomly selected.
Analysis reveals: 40% are about slow delivery, 30% about wrong items, 20% about website errors, 10% other.
Conclusion: Focus requirements on delivery tracking and inventory accuracy.
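The complaint analysis above boils down to counting categories. A sketch with a hypothetical 10-item sample, chosen so its proportions match the example:

```python
from collections import Counter

# Hypothetical category tags on 10 sampled complaints (the example above
# uses 200; only the proportions drive the conclusion).
sampled = ["delivery", "delivery", "wrong_item", "delivery", "website",
           "wrong_item", "delivery", "other", "wrong_item", "website"]

counts = Counter(sampled)
percentages = {cat: n * 100 / len(sampled) for cat, n in counts.items()}
print(percentages)
```

The same Counter pattern scales unchanged from 10 tags to 200 or 5,000.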

8. Technique 6: Research / Benchmarking

8.1 Definition

Studying external sources to learn about best practices, industry standards, available technologies, or competitor systems.

8.2 Sources of Research

Source | What It Provides
Industry reports (Gartner, Forrester) | Trends, benchmarks, vendor comparisons
Technical documentation (APIs, SDKs) | Feasibility of integration
Competitor websites / demos | Feature expectations
Professional associations (IEEE, PMI) | Standards and methodologies
Government regulations | Compliance requirements

8.3 Benchmarking Process

  1. Identify what to benchmark (e.g., order processing time, customer satisfaction score).
  2. Find comparable organizations (same industry, similar size).
  3. Collect data (surveys, public reports, site visits).
  4. Compare your current metrics to the benchmark.
  5. Set target requirements based on best-in-class.

8.4 Advantages & Disadvantages

Advantages | Disadvantages
Avoids reinventing the wheel | May not be applicable to your unique context
Justifies requirements with external evidence | Time-consuming to find good sources
Uncovers innovative solutions | Benchmarking partners may not share data

8.5 Example: Benchmarking for a Help Desk System

Benchmark data (industry average):

  • First response time: 2 hours.
  • Resolution time: 24 hours.
  • Customer satisfaction: 85%.
Current system: 8 hours first response, 72 hours resolution, 60% satisfaction.

Requirement: “First response time shall not exceed 2 hours for high-priority tickets.”
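Comparing current metrics against the benchmark is simple arithmetic; a Python sketch using the numbers from this example:

```python
# Metrics from the help desk example: industry benchmarks vs. current system.
benchmarks = {"first_response_h": 2, "resolution_h": 24, "satisfaction_pct": 85}
current = {"first_response_h": 8, "resolution_h": 72, "satisfaction_pct": 60}

# Gap = current minus benchmark: positive hours mean slower than benchmark,
# a negative satisfaction gap means below target.
gaps = {metric: current[metric] - benchmarks[metric] for metric in benchmarks}
print(gaps)
```

The gaps (6 hours, 48 hours, minus 25 points) are what justify the derived requirement to external stakeholders.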

9. Technique 7: Workshop / JAD (Joint Application Development)

9.1 Definition

A facilitated, structured meeting that brings together key stakeholders (users, managers, analysts, developers) for a short, intensive period (1–5 days) to define requirements or solve problems.

9.2 JAD Roles

Role | Responsibility
Sponsor | Executive who provides authority and resources
Facilitator | Neutral person who runs the workshop (not an analyst)
Scribe | Documents all decisions and action items
Users / Managers | Provide domain knowledge and decisions
Analysts | Translate discussions into models (use cases, DFDs)
Observers (optional) | Developers or QA who listen but don’t dominate

9.3 JAD Session Structure (Typical 3-day)

  • Day 1: Orientation, scope definition, high-level processes. Identify actors and use cases.
  • Day 2: Detail use cases, business rules, exception flows. Prototyping (paper or digital).
  • Day 3: Prioritize requirements (MoSCoW). Identify conflicts and resolve. Draft SRS outline.
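The Day 3 MoSCoW prioritization (Must / Should / Could / Won't) amounts to sorting requirements by category. A sketch with hypothetical workshop output (the requirement names are invented):

```python
# Hypothetical JAD output: requirements tagged with MoSCoW priorities.
requirements = [
    ("Auto-calculate due dates", "Must"),
    ("Email overdue reminders", "Should"),
    ("Dark-mode interface", "Could"),
    ("Social media sharing", "Wont"),   # "Won't have (this time)"
    ("Barcode check-in", "Must"),
]

order = {"Must": 0, "Should": 1, "Could": 2, "Wont": 3}
prioritized = sorted(requirements, key=lambda r: order[r[1]])
for name, prio in prioritized:
    print(f"{prio:>6}: {name}")
```

Sorting is stable, so requirements within the same category keep the order in which the workshop raised them.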

[Figure: JAD Roles and Session Structure]

9.4 Advantages & Disadvantages

Advantages | Disadvantages
Very fast (days instead of weeks) | Expensive (pulling people off work)
Builds consensus and ownership | Requires skilled facilitator
Reduces misunderstandings | Can be dominated by loud voices
Produces validated requirements | Not all stakeholders can attend

9.5 JAD Success Factors

  • Clear objectives and agenda shared beforehand.
  • Right participants (decision-makers must be present).
  • Neutral facilitator (not the analyst).
  • Comfortable, off-site location if possible.
  • Follow-up within 48 hours.

10. Selecting the Right Fact Finding Technique(s)

No single technique is sufficient. Use a multi-method approach.

10.1 Decision Matrix

Scenario | Recommended Techniques
You know nothing about the domain | Interviews + Document analysis + Observation
You need quantitative data from 500 users | Questionnaire + Sampling
Users give contradictory information | Observation (see what they actually do) + JAD (resolve conflicts)
You have very limited time | JAD workshop + Document analysis
The system is critical and high-risk | All techniques (triangulation)

10.2 Fact Finding Plan Template

Phase | Technique | Duration | Participants | Deliverable
Week 1 | Document analysis | 8 hours | Analyst alone | Document summary
Week 1–2 | Interviews (10 people) | 15 hours | Key users | Interview notes
Week 2 | Observation (2 days) | 16 hours | 3–4 users | Workflow log
Week 3 | Questionnaire (200 people) | 2 hours (design) + 1 week response | All staff | Statistical report
Week 4 | JAD workshop (2 days) | 16 hours | 12 stakeholders | Prioritized requirements

11. Documenting Fact Finding Results

Always organize findings. Common formats:

11.1 Fact Finding Summary Table

Fact ID | Fact Statement | Source | Date | Confidence (1–5)
F-001 | The warehouse processes 500 orders per day on average. | Interview: Warehouse Mgr | 10 Mar | 4
F-002 | Order entry takes 4 minutes for a 10-line order. | Observation (5 samples) | 12 Mar | 5
F-003 | 30% of users want a mobile interface. | Survey (n=200) | 15 Mar | 4

11.2 Requirements Log (Traceable to Facts)

Req ID | Requirement | Derived from Fact(s)
FR-12 | System shall display estimated delivery date at order entry. | F-001, F-003 (customer complaints)
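The fact-to-requirement traceability of 11.1 and 11.2 can be kept in simple data structures. A Python sketch using the example rows above (the field names are my own choices, not a standard schema):

```python
from dataclasses import dataclass, field

@dataclass
class Fact:
    fact_id: str
    statement: str
    source: str
    confidence: int  # 1 (low) to 5 (high)

@dataclass
class Requirement:
    req_id: str
    text: str
    derived_from: list = field(default_factory=list)  # list of Fact IDs

facts = {
    "F-001": Fact("F-001", "The warehouse processes 500 orders per day on average.",
                  "Interview: Warehouse Mgr", 4),
    "F-003": Fact("F-003", "30% of users want a mobile interface.",
                  "Survey (n=200)", 4),
}

req = Requirement("FR-12", "Display estimated delivery date at order entry.",
                  derived_from=["F-001", "F-003"])

# Traceability check: every cited fact must exist in the fact log.
assert all(fid in facts for fid in req.derived_from)
```

Keeping the check in code (or a spreadsheet formula) catches requirements that cite facts nobody recorded.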

11.3 Action Log for Follow-up

Issue | Technique Used | Person Responsible | Deadline
Clarify discount calculation rule | Interview (Finance) | Analyst | 20 Mar

12. Common Pitfalls in Fact Finding

  • Relying on only one technique – Biased or incomplete facts. Use triangulation (3+ methods).
  • Asking leading questions – False confirmation of your own assumptions. Use neutral, open-ended wording.
  • Not documenting as you go – Forgetting details; relying on memory.
  • Interviewing only managers – Missing operational realities. Always interview frontline users.
  • Observing only during normal periods – Missing peak load problems.
  • Ignoring “silent” stakeholders (e.g., IT operations) – Non-functional requirements missed.
  • No follow-up validation – Misinterpreted facts cause wrong requirements.

13. Summary Table – Fact Finding Techniques at a Glance

Technique | Best For | Time Cost | People Involved | Main Output
Interview | Deep, qualitative insights | Medium (1–2 hours per person) | 1 analyst + 1 user | Interview notes
Questionnaire | Quantitative data from many | Low per respondent | Many | Statistics, charts
Observation | Actual behavior, not reported | High (real-time) | 1 analyst + 1–3 users | Workflow logs
Document analysis | Existing rules, forms, data | Low to medium | Analyst alone | Document summary
Sampling | Large data volumes | Medium (analysis) | Analyst + data owner | Sample statistics
Research | External benchmarks | Variable | Analyst alone | Research report
JAD / Workshop | Fast consensus, resolution | High (group time) | 8–15 people | Validated requirements

14. Real-World Example: Fact Finding for a Hospital Patient Registration System

Context: A hospital wants to reduce patient wait time during registration. Current process is paper-based.

Technique | Details | Findings
Document analysis | Reviewed registration form (20 fields). | Found 5 redundant fields; inconsistent manual.
Observation | Watched 4 clerks for 2 hours each. | Avg. time = 7 mins; 40% of time spent calling insurance.
Interview | Talked to clerks, nurse, manager. | Clerks want auto-verification; nurse wants immediate data.
Sampling | Analyzed 200 completed forms. | 12% missing ID; 5% wrong name.
Research | Industry benchmarks. | Best-in-class = 2 mins; tablet check-in is a trend.
JAD workshop | Brought clerks, IT, billing, nurse. | Resolved conflict on necessary billing data.

Key requirements derived:

  • Auto-verification with insurance API.
  • Tablet self-check-in for returning patients.
  • Real-time patient data to nurse station.
  • Registration time ≤ 3 minutes for 90% of patients.
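The last requirement is directly testable: measure registration times and check the share completed within 3 minutes. A sketch with hypothetical post-implementation timings:

```python
# Hypothetical registration times (minutes) measured after the redesign.
times = [1.5, 2.0, 2.2, 2.5, 2.8, 2.9, 3.0, 2.4, 2.6, 3.8]

# "Registration time <= 3 minutes for 90% of patients" means at least
# 90% of observed registrations must finish within 3 minutes.
within_target = sum(t <= 3.0 for t in times) / len(times)
meets_requirement = within_target >= 0.9

print(f"{within_target:.0%} of patients registered within 3 minutes")
```

Phrasing the requirement as a percentile rather than an average is what makes it verifiable: one slow outlier cannot be hidden by many fast registrations.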

15. Key Takeaways

  • Fact finding is the foundation of all SAD work. Garbage in = garbage out.
  • Use multiple techniques to cross-check facts (triangulation).
  • Talk to all levels – managers, users, technical staff, and support.
  • Watch what people do, not just what they say (observation is powerful).
  • Document everything with traceability from fact to requirement.
  • Validate your findings with stakeholders before moving to design.