
Test Appendices

Parent Document: TEST_PLAN.md | Version: 1.2 | Date: January 5, 2026


Appendix A: Record of Changes

| Version | Date | Author | Description |
|---------|------|--------|-------------|
| 0.1 | 2025-11-15 | QA Lead | Initial draft |
| 0.2 | 2025-11-20 | QA Lead | Added sections 1-6 |
| 0.3 | 2025-11-25 | QA Lead | Added sections 7-12 |
| 1.0 | 2025-12-02 | QA Lead | Completed all sections |
| 1.1 | 2025-12-20 | QA Lead | Added Dust Ranger RLS policies testing |
| 1.2 | 2026-01-05 | QA Lead | Added Device Timeline Data Gaps test cases (TC-REPORT-021 to TC-REPORT-026) |

Appendix B: Glossary

| Term | Definition |
|------|------------|
| Acceptance Criteria | Conditions for feature completion |
| BOM | Bureau of Meteorology (Australia) |
| CI/CD | Continuous Integration/Deployment |
| DustRanger | PM monitoring device |
| E2E | End-to-End testing |
| LCP | Largest Contentful Paint |
| PII | Personally Identifiable Information |
| PM2.5/PM10 | Particulate Matter sizes |
| RLS | Row-Level Security |
| RTM | Requirements Traceability Matrix |
| SPA | Single Page Application |
| UAT | User Acceptance Testing |
| WCAG | Web Content Accessibility Guidelines |

Appendix C: Acronyms

| Acronym | Full Term |
|---------|-----------|
| API | Application Programming Interface |
| CSV | Comma-Separated Values |
| JWT | JSON Web Token |
| OWASP | Open Web Application Security Project |
| PDF | Portable Document Format |
| QA | Quality Assurance |
| RTO | Recovery Time Objective |
| RPO | Recovery Point Objective |
| SQL | Structured Query Language |
| SSL/TLS | Secure Sockets Layer / Transport Layer Security |
| URL | Uniform Resource Locator |
| VRR | Validation Readiness Review |
| IRR | Implementation Readiness Review |
| ORR | Operational Readiness Review |

Appendix D: Test Case Template

```markdown
# Test Case: TC-[MODULE]-[NUMBER]

## Metadata
| Field | Value |
|-------|-------|
| **Test Case ID** | TC-MODULE-XXX |
| **Test Case Name** | [Descriptive name] |
| **Priority** | P0/P1/P2/P3 |
| **Type** | Functional/Integration/E2E/Performance |
| **Requirement** | REQ-XXX |

## Objective
[Brief description of what is being tested]

## Preconditions
- [Prerequisite 1]
- [Prerequisite 2]

## Test Data
- [Input data requirements]

## Test Steps
| Step | Action | Expected Result |
|------|--------|-----------------|
| 1 | [Action] | [Expected] |
| 2 | [Action] | [Expected] |

## Expected Results
[Overall expected outcome]

## Execution (filled during testing)
- **Executed By:** [Name]
- **Date:** [YYYY-MM-DD]
- **Environment:** Dev/Staging/Production
- **Status:** ✅ Pass | ❌ Fail | ⏸️ Blocked
- **Defects:** [IDs if any]
- **Notes:** [Observations]
```

Appendix E: Defect Report Template

```markdown
# Defect Report

## Summary
[Brief description - 50 chars max]

## Environment
- **Browser:** Chrome/Firefox/Safari/Edge [version]
- **OS:** macOS/Windows/iOS/Android [version]
- **Environment:** Dev/Staging/Production
- **User Role:** Standard/Admin

## Classification
- **Severity:** S0 (Blocker) | S1 (Critical) | S2 (Major) | S3 (Minor) | S4 (Trivial)
- **Priority:** P0 (Critical) | P1 (High) | P2 (Medium) | P3 (Low)

## Steps to Reproduce
1. [Step 1]
2. [Step 2]
3. [Step 3]

## Expected Behavior
[What should happen]

## Actual Behavior
[What actually happens]

## Test Data
- **File:** [filename if applicable]
- **Account:** [test account used]
- **Date Range:** [if applicable]

## Attachments
[Screenshots, console logs, network traces]

## Additional Context
- **Frequency:** Always/Sometimes/Rarely
- **First Observed:** [date or version]
- **Workaround:** [if available]
- **Related Issues:** #XXX
```

Appendix F: Test Summary Report Template

```markdown
# Test Summary Report

**Project:** Dustac Environmental Monitoring Dashboard
**Release:** [Version]
**Date:** [YYYY-MM-DD]
**Author:** [QA Lead]

## Executive Summary
[Brief overview of testing activities and results]

## Test Scope
- Features tested: [list]
- Features not tested: [list with reasons]

## Test Execution Summary

| Metric | Value |
|--------|-------|
| Total Test Cases | XXX |
| Executed | XXX |
| Passed | XXX |
| Failed | XXX |
| Blocked | XXX |
| Pass Rate | XX% |

## Defect Summary

| Severity | Open | Closed | Total |
|----------|------|--------|-------|
| S0 - Blocker | X | X | X |
| S1 - Critical | X | X | X |
| S2 - Major | X | X | X |
| S3 - Minor | X | X | X |
| **Total** | X | X | X |

## Coverage Analysis
[Requirements coverage, code coverage metrics]

## Risks & Issues
[Outstanding risks, unresolved issues]

## Recommendations
[Go/No-Go recommendation with justification]

## Sign-Off
- QA Lead: _____________ Date: _______
- Dev Lead: _____________ Date: _______
- Product Owner: _____________ Date: _______
```

Appendix G: Risk Mitigation Strategies

| Risk ID | Risk | Mitigation |
|---------|------|------------|
| R-001 | Supabase Outage | Local instance for dev, monitor status |
| R-002 | Data Loss | Daily backups, soft-delete |
| R-003 | Performance Issues | Query optimization, caching |
| R-004 | CSV Format Issues | Robust validation, preview |
| R-005 | Security Breach | RLS, JWT, audits |
| R-006 | PDF Failure | Retry logic, timeouts |
| R-007 | External API Down | Graceful degradation, caching |
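
The "retry logic, timeouts" mitigation for R-006 can be sketched as a generic wrapper. This is illustrative only: the helper name `withRetry`, the attempt count, and the timeout value are assumptions, not the project's actual implementation.

```typescript
// Hypothetical sketch of R-006's mitigation: retry a flaky async operation
// (e.g. a PDF generation request) a bounded number of times, treating a
// hung call as a failure via a timeout race.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  timeoutMs = 30_000
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      // Race the operation against a timer so a hung request counts as a failure.
      return await Promise.race([
        fn(),
        new Promise<never>((_, reject) =>
          setTimeout(() => reject(new Error("timeout")), timeoutMs)
        ),
      ]);
    } catch (err) {
      lastError = err; // remember the failure and try the next attempt
    }
  }
  throw lastError;
}
```

A caller would wrap the flaky operation, e.g. `withRetry(() => generatePdf(reportId))` (hypothetical function name), and surface the final error to the user only after all attempts fail.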

Appendix H: Testing Tools Summary

| Tool | Purpose | License | Cost |
|------|---------|---------|------|
| Vitest | Unit testing | MIT | Free |
| Playwright | E2E testing | Apache 2.0 | Free |
| React Testing Library | Component testing | MIT | Free |
| MSW | API mocking | MIT | Free |
| axe-core | Accessibility | MPL 2.0 | Free |
| Lighthouse CI | Performance | Apache 2.0 | Free |
| k6 | Load testing | AGPL 3.0 | Free |
| Snyk | Security scanning | Proprietary | Free tier |
| GitHub Actions | CI/CD | - | Free tier |

Total Cost: $0/month (free tiers)


Appendix I: Device Timeline Data Gaps - Detailed Test Cases

TC-REPORT-021: Render Continuous Data Period

| Field | Value |
|-------|-------|
| Test Case ID | TC-REPORT-021 |
| Priority | P1 (High) |
| Type | Functional |
| Requirement | REQ-045 |

Objective: Verify device timeline chart correctly displays continuous data collection periods as single segments.

Preconditions:

  • User logged in with valid session
  • Upload session with single device data available
  • All measurements spaced no more than 1 hour apart (no gaps)

Test Data:

  • Device: TEST-DEVICE-001
  • Records: 100 measurements, 15-minute intervals
  • Duration: 25 hours continuous

Steps:

  1. Navigate to Reports page
  2. Select upload session with continuous data
  3. Generate report with Device Timeline chart enabled
  4. Verify timeline chart rendering

Expected Results:

  • Single continuous bar displayed for device
  • No gap indicators shown
  • dataGaps count equals 0
  • segments array contains exactly 1 segment
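
The segment/gap rule that TC-REPORT-021 through TC-REPORT-023 exercise can be sketched as a pure function over record timestamps. The names `buildSegments` and `GAP_THRESHOLD_MS` are assumptions for illustration, not the dashboard's actual code.

```typescript
// Illustrative sketch: split a device's sorted timestamps into continuous
// segments, opening a new segment only when the interval between consecutive
// records strictly exceeds the 1-hour threshold.
const GAP_THRESHOLD_MS = 60 * 60 * 1000; // 1 hour

interface Segment {
  start: number;   // epoch ms of first record in the segment
  end: number;     // epoch ms of last record in the segment
  records: number; // record count in the segment
}

function buildSegments(
  timestamps: number[]
): { segments: Segment[]; dataGaps: number } {
  if (timestamps.length === 0) return { segments: [], dataGaps: 0 };
  const sorted = [...timestamps].sort((a, b) => a - b);
  const segments: Segment[] = [{ start: sorted[0], end: sorted[0], records: 1 }];
  for (let i = 1; i < sorted.length; i++) {
    const current = segments[segments.length - 1];
    if (sorted[i] - current.end > GAP_THRESHOLD_MS) {
      // Interval strictly exceeds the threshold: start a new segment.
      segments.push({ start: sorted[i], end: sorted[i], records: 1 });
    } else {
      current.end = sorted[i];
      current.records++;
    }
  }
  // Each boundary between adjacent segments is one data gap.
  return { segments, dataGaps: segments.length - 1 };
}
```

Under this sketch, continuous 15-minute data yields one segment and `dataGaps === 0`, matching the expected results above; segment record counts always sum to the total record count.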

TC-REPORT-022: Detect Single Data Gap

| Field | Value |
|-------|-------|
| Test Case ID | TC-REPORT-022 |
| Priority | P1 (High) |
| Type | Functional |
| Requirement | REQ-045 |

Objective: Verify system correctly detects a gap when >1 hour exists between consecutive records.

Preconditions:

  • User logged in with valid session
  • Upload session with gap in data (>1 hour between records)

Test Data:

  • Device: TEST-DEVICE-002
  • Records: 50 measurements before gap, 50 after
  • Gap duration: 2 hours (exceeds 1 hour threshold)

Steps:

  1. Navigate to Reports page
  2. Select upload session with gap in data
  3. Generate report with Device Timeline chart enabled
  4. Verify gap detection in timeline

Expected Results:

  • Two separate segments displayed for device
  • Visual gap indicator between segments
  • dataGaps count equals 1
  • segments array contains exactly 2 segments
  • Gap location correctly corresponds to data discontinuity

TC-REPORT-023: Multiple Gaps Detection

| Field | Value |
|-------|-------|
| Test Case ID | TC-REPORT-023 |
| Priority | P1 (High) |
| Type | Functional |
| Requirement | REQ-045 |

Objective: Verify system correctly detects and displays multiple data gaps within a single device's timeline.

Preconditions:

  • User logged in
  • Upload session with multiple gaps in data

Test Data:

  • Device: TEST-DEVICE-003
  • 3 continuous periods separated by 2 gaps
  • Gap 1: 3 hours, Gap 2: 5 hours

Steps:

  1. Navigate to Reports page
  2. Select upload session with multiple gaps
  3. Generate report
  4. Inspect timeline chart rendering

Expected Results:

  • Three separate segments displayed
  • Two visual gap indicators
  • dataGaps count equals 2
  • segments array contains exactly 3 segments
  • Segment record counts sum to total record count

TC-REPORT-024: Multi-Device Display

| Field | Value |
|-------|-------|
| Test Case ID | TC-REPORT-024 |
| Priority | P1 (High) |
| Type | Functional |
| Requirement | REQ-045 |

Objective: Verify timeline chart correctly displays multiple devices with varying gap patterns.

Preconditions:

  • User logged in
  • Upload session with 3+ devices

Test Data:

  • Device A: Continuous data (0 gaps)
  • Device B: 1 gap (2 segments)
  • Device C: 3 gaps (4 segments)

Steps:

  1. Navigate to Reports page
  2. Select multi-device upload session
  3. Generate report
  4. Verify all devices displayed in timeline

Expected Results:

  • All devices displayed on separate rows
  • Devices sorted by firstSeen timestamp
  • Each device shows correct segment count
  • Gap counts accurate per device
  • Tooltip shows device summary (total records, duration, gaps)
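
The row-ordering expectation ("devices sorted by firstSeen timestamp") can be sketched as follows; the `DeviceSummary` shape and its field names are assumptions for illustration, not confirmed from the implementation.

```typescript
// Illustrative device summary shape (field names are assumed).
interface DeviceSummary {
  deviceId: string;
  firstSeen: number;    // epoch ms of the device's earliest record
  totalRecords: number;
  dataGaps: number;
}

// Timeline rows are ordered by each device's earliest record,
// matching the sorting expectation in TC-REPORT-024.
function orderTimelineRows(devices: DeviceSummary[]): DeviceSummary[] {
  return [...devices].sort((a, b) => a.firstSeen - b.firstSeen);
}
```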

TC-REPORT-025: Gap Threshold Boundary

| Field | Value |
|-------|-------|
| Test Case ID | TC-REPORT-025 |
| Priority | P2 (Medium) |
| Type | Boundary |
| Requirement | REQ-045 |

Objective: Verify 1-hour gap threshold is correctly applied.

Preconditions:

  • User logged in
  • Test data prepared with specific gap intervals

Test Data:

  • Scenario A: Gap of exactly 59 minutes (should NOT create gap)
  • Scenario B: Gap of exactly 61 minutes (SHOULD create gap)
  • Scenario C: Gap of exactly 60 minutes (boundary - should NOT create gap)

Steps:

  1. Upload test data for each scenario
  2. Generate reports for each
  3. Verify gap detection behavior

Expected Results:

  • Scenario A: dataGaps = 0 (under threshold)
  • Scenario B: dataGaps = 1 (exceeds threshold)
  • Scenario C: dataGaps = 0 (at threshold, not exceeded)
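
The boundary semantics above amount to a strictly-greater-than comparison; a minimal sketch (names illustrative):

```typescript
// A gap exists only when the interval STRICTLY exceeds the 1-hour threshold,
// so exactly 60 minutes does not count as a gap (Scenario C).
const THRESHOLD_MINUTES = 60;

function isGap(intervalMinutes: number): boolean {
  return intervalMinutes > THRESHOLD_MINUTES;
}
```

An off-by-one here (`>=` instead of `>`) would flip Scenario C, which is exactly why the boundary case is tested explicitly.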

TC-REPORT-026: ECharts Rendering Performance

| Field | Value |
|-------|-------|
| Test Case ID | TC-REPORT-026 |
| Priority | P2 (Medium) |
| Type | Performance |
| Requirement | REQ-045 |

Objective: Verify timeline chart renders efficiently with large datasets.

Preconditions:

  • User logged in
  • Large upload session available

Test Data:

  • 10 devices
  • 1000 records per device
  • Mixed gap patterns (0-5 gaps per device)

Steps:

  1. Navigate to Reports page
  2. Select large dataset upload
  3. Generate report with timeline chart
  4. Measure chart rendering time

Expected Results:

  • Chart renders within 3 seconds
  • No browser freezing or lag
  • All segments displayed correctly
  • Scrolling/zooming responsive
  • Memory usage remains stable
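
Render time can be captured with a small timing harness around the chart call; this is a sketch using the standard `performance.now()` clock, and the wiring to the actual chart component is assumed, not taken from the project.

```typescript
// Generic timing helper: run a (possibly async) operation and report how
// long it took, so the 3-second budget in TC-REPORT-026 can be asserted.
async function measure<T>(
  fn: () => T | Promise<T>
): Promise<{ result: T; ms: number }> {
  const start = performance.now();
  const result = await fn();
  return { result, ms: performance.now() - start };
}
```

In a Playwright test the same idea applies inside `page.evaluate`, asserting `ms < 3000` once the chart signals render completion.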

Appendix J: Document Revision History

| Version | Date | Sections Changed | Reason |
|---------|------|------------------|--------|
| 1.0 | 2025-12-02 | All | Initial test plan |
| 1.1 | 2025-12-20 | Section 6 | Dust Ranger RLS testing |
| 1.2 | 2026-01-05 | Sections 3, 4, 6, Appendix I | Device Timeline Data Gaps |

Review Schedule:

  • Quarterly reviews
  • Phase reviews at start of each major phase
  • Ad-hoc updates for significant changes

Next Scheduled Review: 2026-04-01


This document is confidential and proprietary to Dustac.