Test Cases Document
Parent Document: TEST_PLAN.md | Version: 1.2 | Date: January 5, 2026
4. Planned Tests
4.1 Development Testing
4.1.1 Unit Testing
Example Test Cases:
TC-UNIT-001: CSV Parser - Valid File with BOM
TC-UNIT-002: CSV Parser - Missing Required Columns
TC-UNIT-003: CSV Parser - Invalid Date Format
TC-UNIT-004: Date Formatter - UTC to Local Timezone
TC-UNIT-005: Report Aggregation - Daily Averages Calculation
TC-UNIT-006: Button Component - Click Handler Invocation
TC-UNIT-007: Form Validation - Required Field Enforcement
TC-UNIT-008: Device Timeline - Gap Detection Above 1 Hour Threshold
TC-UNIT-009: Device Timeline - Segment Count Calculation
TC-UNIT-010: Device Timeline - Multiple Devices Sorted by First Seen
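As a reference for the granularity expected at this level, the sketch below shows what TC-UNIT-008 might look like as a Vitest unit test. The `detectDataGaps` helper, its import path, and its options object are assumptions for illustration only; adapt them to the real timeline utility.

```typescript
// Hypothetical unit test sketch for TC-UNIT-008 (gap detection above the 1-hour threshold).
// `detectDataGaps` and its signature are assumed names, not the shipped API.
import { describe, it, expect } from 'vitest';
import { detectDataGaps } from '../utils/deviceTimeline';

describe('detectDataGaps', () => {
  it('counts a gap only when consecutive readings are more than 1 hour apart', () => {
    const base = Date.parse('2026-01-05T08:00:00Z');
    const timestamps = [
      new Date(base),                                        // 08:00
      new Date(base + 15 * 60 * 1000),                       // 08:15 (continuous)
      new Date(base + (2 * 60 + 15) * 60 * 1000),            // 10:15 (2-hour gap)
    ];

    const result = detectDataGaps(timestamps, { thresholdMinutes: 60 });

    expect(result.dataGaps).toBe(1);          // one interval exceeded 60 minutes
    expect(result.segments).toHaveLength(2);  // continuous run split into two segments
  });
});
```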
4.1.2 Component Integration Testing
Example Test Cases:
TC-COMP-001: Data View - Filter Selection to Display Update
TC-COMP-002: Report Generation - Date Selection to PDF Download
TC-COMP-003: Dashboard Filter - Apply Filter Updates All Charts
TC-COMP-004: Authentication Flow - Login to Protected Route Access
TC-COMP-005: Device Timeline - Render Segments with Data Gaps
TC-COMP-006: Device Timeline - Multi-Device Timeline Alignment
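A component integration test such as TC-COMP-001 could be written with React Testing Library along these lines. The `DataView` component, field labels, and test IDs are placeholders, not the actual component contract:

```typescript
// Sketch for TC-COMP-001: applying a device filter updates the displayed rows.
// `DataView`, the label text, and the `measurement-row` test ID are illustrative assumptions.
import { it, expect } from 'vitest';
import { render, screen, waitFor } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { DataView } from '../components/DataView';

it('updates the measurement table when a device filter is selected', async () => {
  render(<DataView />);

  // Select a device in the filter dropdown.
  await userEvent.selectOptions(screen.getByLabelText(/device/i), 'TEST-DEVICE-001');

  // The table should re-render showing only rows for the selected device.
  await waitFor(() => {
    const rows = screen.getAllByTestId('measurement-row');
    expect(rows.every((r) => r.textContent?.includes('TEST-DEVICE-001'))).toBe(true);
  });
});
```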
4.1.3 API Integration Testing
Test Scope:
- CRUD operations for all tables
- RLS policy enforcement
- JWT token management
- Storage operations (read/write)
- Edge Function invocations
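The RLS checks in particular are best run against a real Supabase instance with two seeded test accounts. A minimal sketch follows; the `measurements` table name, `user_id` column, and env-based credentials are assumptions:

```typescript
// Sketch of an RLS enforcement check: user A must not see user B's rows.
// Table/column names and the env-based test credentials are assumptions.
import { describe, it, expect } from 'vitest';
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!);

describe('RLS: measurements table', () => {
  it("does not return another user's rows", async () => {
    // Sign in as test user A; rows owned by test user B were seeded beforehand.
    await supabase.auth.signInWithPassword({
      email: process.env.TEST_USER_A_EMAIL!,
      password: process.env.TEST_USER_A_PASSWORD!,
    });

    const { data, error } = await supabase.from('measurements').select('id, user_id');
    expect(error).toBeNull();

    // Every returned row must belong to user A; user B's rows are filtered out by RLS.
    const { data: { user } } = await supabase.auth.getUser();
    expect(data!.every((row) => row.user_id === user!.id)).toBe(true);
  });
});
```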
4.2 Validation Testing
4.2.1 System Testing
| Test Area | Test Cases | Priority |
|---|---|---|
| Data Integration | TC-DATA-001 to TC-DATA-010 | P0-P1 |
| Report Generation | TC-REPORT-001 to TC-REPORT-026 | P0-P1 |
| Dashboard | TC-DASH-001 to TC-DASH-010 | P0-P1 |
| Authentication | TC-AUTH-001 to TC-AUTH-008 | P0 |
4.2.2 Performance Testing
| Metric | Test Case | Target |
|---|---|---|
| Dashboard Load | TC-PERF-001 | <2.5s LCP |
| API Response | TC-PERF-002 | <500ms p95 |
| PDF Generation | TC-PERF-003 | <60s |
| Concurrent Users | TC-PERF-004 | 50 users |
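A load-test skeleton for TC-PERF-002 at the TC-PERF-004 concurrency target could be a k6 script like the one below. The endpoint path, `BASE_URL`, and `TEST_USER_JWT` environment variables are placeholders; the threshold mirrors the p95 target above.

```typescript
// k6 sketch for TC-PERF-002 (API p95 < 500 ms) at 50 concurrent virtual users (TC-PERF-004).
// BASE_URL, the endpoint path, and the auth header are placeholders for the real API.
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  vus: 50,          // 50 concurrent virtual users
  duration: '2m',
  thresholds: {
    http_req_duration: ['p(95)<500'],  // fail the run if p95 latency exceeds 500 ms
  },
};

export default function () {
  const res = http.get(`${__ENV.BASE_URL}/rest/v1/measurements?limit=100`, {
    headers: { Authorization: `Bearer ${__ENV.TEST_USER_JWT}` },
  });
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```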
4.3 Implementation Testing
4.3.1 Regression Test Suite
TC-REG-001: User Authentication - Login Flow Works
TC-REG-002: Data Integration - Scraper Data Syncs Correctly
TC-REG-003: Dashboard - Display Measurements for All Devices
TC-REG-004: Report Generation - Generate and Download PDF
TC-REG-005: Multi-Site Management - Create Site and Assign Devices
TC-REG-006: Data Filtering - Filter by Date Range and Device
TC-REG-007: RLS Enforcement - Users Cannot Access Other Users' Data
TC-REG-008: Performance - Dashboard Loads in <2.5s
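Regression items like TC-REG-001 are intended to run headlessly in CI. A Playwright sketch is shown below; the `/login` route, field labels, and env-based credentials are assumptions:

```typescript
// Playwright sketch for TC-REG-001: login redirects to the protected dashboard.
// Route, field labels, and credentials are illustrative assumptions.
import { test, expect } from '@playwright/test';

test('user can log in and reach the dashboard', async ({ page }) => {
  await page.goto('/login');

  await page.getByLabel('Email').fill(process.env.TEST_USER_EMAIL!);
  await page.getByLabel('Password').fill(process.env.TEST_USER_PASSWORD!);
  await page.getByRole('button', { name: /sign in/i }).click();

  // Successful login should land on the dashboard, not bounce back to /login.
  await expect(page).toHaveURL(/\/dashboard/);
  await expect(page.getByRole('heading', { name: /dashboard/i })).toBeVisible();
});
```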
4.3.2 UAT Scenarios
TC-UAT-001: Daily Workflow - View Morning Dust Readings
TC-UAT-002: Compliance Report - Generate Monthly Report
TC-UAT-003: Data Analysis - Find Cause of Dust Spike
TC-UAT-004: Site Setup - Configure New Site with 5 Devices
TC-UAT-005: Weekly Report - Generate Field Report with Photos
TC-UAT-006: Water Tracking - Log Water Truck Refill Events
7. Test Case Management
7.1 Test Case Design Principles
- Requirement Traceability: Link to requirements/user stories
- Independence: No execution order dependencies
- Repeatability: Consistent results across runs
- Clarity: Clear, unambiguous steps
- Maintainability: Page Object Model for UI tests
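To make the maintainability principle concrete, UI tests should interact with pages through page objects rather than raw selectors. A minimal sketch, assuming Playwright and illustrative selectors:

```typescript
// Minimal Page Object Model sketch for the dashboard; selectors are illustrative assumptions.
import { Page, Locator, expect } from '@playwright/test';

export class DashboardPage {
  readonly page: Page;
  readonly deviceFilter: Locator;
  readonly dustChart: Locator;

  constructor(page: Page) {
    this.page = page;
    this.deviceFilter = page.getByLabel('Device');
    this.dustChart = page.getByTestId('dust-levels-chart');
  }

  async goto() {
    await this.page.goto('/dashboard');
  }

  async filterByDevice(deviceId: string) {
    await this.deviceFilter.selectOption(deviceId);
  }

  async expectChartVisible() {
    await expect(this.dustChart).toBeVisible();
  }
}
```

Tests then call methods like `dashboard.filterByDevice('TEST-DEVICE-001')` instead of embedding selectors, so a markup change touches one file rather than every test.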
7.2 Test Case Template
Test Case ID: TC-[MODULE]-[NUMBER]
Test Case Name: [Descriptive name in active voice]
Module/Feature: [Feature area]
Priority: P0 (Critical) | P1 (High) | P2 (Medium) | P3 (Low)
Type: Functional | Integration | E2E | Performance | Security
Preconditions:
• [Prerequisites]
Test Data:
• [Input files, accounts, config]
Test Steps:
1. [Step 1]
2. [Step 2]
Expected Results:
• [Expected outcome]
Actual Results: (filled during execution)
Pass/Fail: (filled during execution)
7.3 Module Prefixes
| Prefix | Module |
|---|---|
| DATA | Data integration (scrapers) |
| REPORT | PDF report generation |
| DASH | Dashboard and visualization |
| AUTH | Authentication |
| DUST | Dust levels monitoring |
| WEEKLY | Weekly field reports |
| FLOW | Flow meter tracking |
| CLIMATE | Climate data integration |
| SITES | Mine sites and devices |
7.4 Requirements Traceability Matrix
| Requirement | Test Cases | Status |
|---|---|---|
| REQ-001 | TC-AUTH-001, TC-AUTH-002 | Covered |
| REQ-012 | TC-DATA-001 to TC-DATA-010 | Covered |
| REQ-045 | TC-REPORT-001 to TC-REPORT-026 | Covered |
Device Timeline Data Gaps Test Cases
TC-REPORT-021: Render Continuous Data Period
Priority: P1 | Type: Functional
Objective: Verify timeline chart displays continuous data as single segment.
Test Data:
- Device: TEST-DEVICE-001
- Records: 100 measurements, 15-minute intervals
- Duration: 25 hours continuous
Expected Results:
- Single bar displayed
- dataGaps = 0
- segments array length = 1
TC-REPORT-022: Detect Single Data Gap
Priority: P1 | Type: Functional
Objective: Verify gap detection when >1 hour between records.
Test Data:
- Gap duration: 2 hours
Expected Results:
- Two segments displayed
- dataGaps = 1
- Visual gap indicator shown
TC-REPORT-023: Multiple Gaps Detection
Priority: P1 | Type: Functional
Objective: Verify detection of multiple data gaps.
Test Data:
- 3 continuous periods, 2 gaps (3h and 5h)
Expected Results:
- Three segments displayed
- dataGaps = 2
- Segment record counts sum to total
TC-REPORT-024: Multi-Device Display
Priority: P1 | Type: Functional
Objective: Verify timeline with multiple devices.
Test Data:
- Device A: 0 gaps
- Device B: 1 gap
- Device C: 3 gaps
Expected Results:
- All devices on separate rows
- Sorted by firstSeen
- Correct segment count per device
TC-REPORT-025: Gap Threshold Boundary
Priority: P2 | Type: Boundary
Objective: Verify 1-hour threshold behavior.
Scenarios:
- 59 min gap → dataGaps = 0
- 61 min gap → dataGaps = 1
- 60 min gap → dataGaps = 0 (at threshold)
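These boundary cases pin down the intended semantics: an interval counts as a gap only when it is strictly greater than 60 minutes. The sketch below shows a segment builder consistent with TC-REPORT-021 to TC-REPORT-025; the function name and data shapes are assumptions, not the shipped implementation.

```typescript
// Sketch of gap/segment computation consistent with TC-REPORT-021..025.
// An interval is a gap only when strictly greater than the 60-minute threshold,
// so a 60-minute spacing stays in the same segment (TC-REPORT-025).
interface Segment {
  start: Date;
  end: Date;
  recordCount: number;
}

const GAP_THRESHOLD_MS = 60 * 60 * 1000; // 1 hour

function buildSegments(timestamps: Date[]): { segments: Segment[]; dataGaps: number } {
  const sorted = [...timestamps].sort((a, b) => a.getTime() - b.getTime());
  const segments: Segment[] = [];

  for (const t of sorted) {
    const current = segments[segments.length - 1];
    if (current && t.getTime() - current.end.getTime() <= GAP_THRESHOLD_MS) {
      // Within the threshold: extend the current segment.
      current.end = t;
      current.recordCount += 1;
    } else {
      // First record, or an interval strictly above 1 hour: start a new segment.
      segments.push({ start: t, end: t, recordCount: 1 });
    }
  }

  // Segment record counts sum to the total; gaps are the breaks between segments.
  return { segments, dataGaps: Math.max(segments.length - 1, 0) };
}
```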
TC-REPORT-026: ECharts Rendering Performance
Priority: P2 | Type: Performance
Objective: Verify chart renders efficiently.
Test Data:
- 10 devices, 1000 records each
Expected Results:
- Renders within 3 seconds
- No browser freezing
- Responsive scrolling/zooming
Related Documents:
- TEST_PLAN.md - Main test plan
- TEST_SCOPE.md - Feature testing scope
- TEST_APPENDICES.md - Templates