System Testing Document
Parent Document: TEST_PLAN.md
Version: 1.2 | Date: January 5, 2026
1. Overview
1.1 Purpose
This document defines the system testing strategy for the Dustac Environmental Monitoring Dashboard. System tests validate the complete integrated system against functional and non-functional requirements.
1.2 Scope
- Functional testing of complete features
- Performance testing
- Security testing
- Usability testing
- Accessibility testing
- Compatibility testing
1.3 Environment
- Environment: Staging (production-equivalent)
- URL: https://staging.dustac.app
- Database: Supabase staging project
2. Functional Testing
2.1 User Authentication
TC-SYS-AUTH-001: User Registration
| Field | Value |
|---|---|
| Test ID | TC-SYS-AUTH-001 |
| Priority | P0 |
| Requirement | REQ-001 |
Preconditions:
- User not registered in system
- Valid @dustac.com.au email
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Navigate to /register | Registration form displayed |
| 2 | Enter valid email (test@dustac.com.au) | Email field accepts input |
| 3 | Enter password (min 8 chars, 1 number) | Password strength indicator shows "Strong" |
| 4 | Confirm password | Passwords match indicator shown |
| 5 | Click "Register" | Loading indicator shown |
| 6 | Check email | Confirmation email received |
| 7 | Click confirmation link | Account activated, redirected to login |
Pass Criteria: User successfully registered and can login
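Where browser automation is used for this flow, steps 1 through 5 can be scripted; the sketch below assumes Playwright as the test runner, and the selectors, field labels and on-screen copy are assumptions about the staging UI. The email confirmation (steps 6 and 7) is left to a manual check or a mailbox stub.

```js
// TC-SYS-AUTH-001 (steps 1-5), sketch only; selectors, labels and the
// confirmation message are assumptions about the staging UI.
import { test, expect } from '@playwright/test';

test('register a new @dustac.com.au user', async ({ page }) => {
  await page.goto('https://staging.dustac.app/register');
  await page.getByLabel('Email').fill('test@dustac.com.au');
  await page.getByLabel('Password', { exact: true }).fill('Passw0rdExample');
  await page.getByLabel('Confirm password').fill('Passw0rdExample');
  await page.getByRole('button', { name: 'Register' }).click();
  // Steps 6-7 (confirmation email, activation) are verified manually or via a mailbox stub.
  await expect(page.getByText(/check your email/i)).toBeVisible();
});
```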
TC-SYS-AUTH-002: User Login
| Field | Value |
|---|---|
| Test ID | TC-SYS-AUTH-002 |
| Priority | P0 |
| Requirement | REQ-002 |
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Navigate to /login | Login form displayed |
| 2 | Enter valid email | Email accepted |
| 3 | Enter valid password | Password masked |
| 4 | Click "Login" | JWT token set in storage |
| 5 | Verify redirect | Dashboard displayed |
| 6 | Check session | User info in header |
Pass Criteria: User authenticated and dashboard accessible
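The login path is a good candidate for automation; the sketch below assumes Playwright and a Supabase-style auth token persisted to localStorage (the exact storage key is an assumption).

```js
// TC-SYS-AUTH-002, sketch; selectors and the localStorage key pattern are assumptions.
import { test, expect } from '@playwright/test';

test('login stores a session token and shows the dashboard', async ({ page }) => {
  await page.goto('https://staging.dustac.app/login');
  await page.getByLabel('Email').fill('test@dustac.com.au');
  await page.getByLabel('Password').fill('Passw0rdExample');
  await page.getByRole('button', { name: 'Login' }).click();
  await expect(page).toHaveURL(/\/dashboard/);
  // Assumed key pattern: supabase-js keeps the JWT under a localStorage key
  // ending in "auth-token"; adjust to however the app actually persists the session.
  const hasToken = await page.evaluate(() =>
    Object.keys(localStorage).some((k) => k.includes('auth-token'))
  );
  expect(hasToken).toBe(true);
});
```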
2.2 CSV Upload
TC-SYS-UPLOAD-001: Upload Valid CSV
| Field | Value |
|---|---|
| Test ID | TC-SYS-UPLOAD-001 |
| Priority | P0 |
| Requirement | REQ-012 |
Test Data: valid-multi-device-weekly.csv (5 devices, 10,080 records)
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Navigate to /upload | Upload page displayed |
| 2 | Click "Select Files" | File browser opens |
| 3 | Select test CSV | File appears in list |
| 4 | Verify preview | Shows filename, row count, columns |
| 5 | Click "Upload" | Progress bar starts |
| 6 | Wait for completion | Success message: "10,080 records imported" |
| 7 | Check upload history | New upload session listed |
| 8 | Verify dashboard | New data visible in charts |
Pass Criteria: All records imported, data visible in dashboard
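The upload flow can be driven headlessly with a file-input call; a sketch assuming Playwright, with the selector, fixture path and success copy as assumptions (authentication setup omitted).

```js
// TC-SYS-UPLOAD-001, sketch; file-input selector, fixture path and success copy are assumptions.
import { test, expect } from '@playwright/test';

test('upload a valid weekly CSV', async ({ page }) => {
  await page.goto('https://staging.dustac.app/upload');
  await page.setInputFiles('input[type="file"]', 'fixtures/valid-multi-device-weekly.csv');
  await expect(page.getByText('valid-multi-device-weekly.csv')).toBeVisible();
  await page.getByRole('button', { name: 'Upload' }).click();
  // Import of 10,080 rows may take up to 30s per the performance target in section 3.1.
  await expect(page.getByText('10,080 records imported')).toBeVisible({ timeout: 30_000 });
});
```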
TC-SYS-UPLOAD-002: Upload Invalid CSV
| Field | Value |
|---|---|
| Test ID | TC-SYS-UPLOAD-002 |
| Priority | P0 |
| Requirement | REQ-012 |
Test Data: missing-columns.csv (missing PM2.5 column)
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Select invalid CSV | File appears in list |
| 2 | Verify preview | Warning indicator shown |
| 3 | Click "Upload" | Validation error displayed |
| 4 | Check error message | "Missing required column: numberconcentrations2p5" |
| 5 | Verify database | No partial data inserted |
Pass Criteria: Upload rejected with clear error, no data corruption
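The negative case follows the same shape; the error text below is the expected result from step 4, while the selector and fixture path are assumptions.

```js
// TC-SYS-UPLOAD-002, sketch; asserts the validation error from step 4 is shown.
import { test, expect } from '@playwright/test';

test('invalid CSV is rejected with a clear validation error', async ({ page }) => {
  await page.goto('https://staging.dustac.app/upload');
  await page.setInputFiles('input[type="file"]', 'fixtures/missing-columns.csv');
  await page.getByRole('button', { name: 'Upload' }).click();
  await expect(
    page.getByText('Missing required column: numberconcentrations2p5')
  ).toBeVisible();
  // Step 5 (no partial data inserted) is verified against the database directly,
  // e.g. by asserting no upload session row exists for this file.
});
```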
2.3 Report Generation
TC-SYS-REPORT-001: Generate PDF Report
| Field | Value |
|---|---|
| Test ID | TC-SYS-REPORT-001 |
| Priority | P0 |
| Requirement | REQ-045 |
Preconditions: Upload session with data exists
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Navigate to /reports/generate | Report generator displayed |
| 2 | Select upload session | Session details shown |
| 3 | Select date range | Charts preview updates |
| 4 | Enable all chart types | All chart options checked |
| 5 | Click "Generate PDF" | Progress indicator starts |
| 6 | Wait for completion (< 60s) | Download button appears |
| 7 | Download PDF | PDF file downloaded |
| 8 | Open PDF | All charts rendered, data correct |
Pass Criteria: PDF generated within 60s, all charts visible
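Steps 5 through 8 can be automated with a download listener and the 60-second budget as the timeout; a sketch assuming Playwright, with illustrative selectors and the content check on the downloaded PDF left to a separate helper.

```js
// TC-SYS-REPORT-001 (steps 5-8), sketch; selectors are assumptions.
import { test, expect } from '@playwright/test';

test('PDF report is generated and downloadable within 60s', async ({ page }) => {
  await page.goto('https://staging.dustac.app/reports/generate');
  // ...session, date range and chart-type selection as in steps 2-4...
  await page.getByRole('button', { name: 'Generate PDF' }).click();
  const downloadPromise = page.waitForEvent('download');
  await expect(page.getByRole('button', { name: 'Download' })).toBeVisible({ timeout: 60_000 });
  await page.getByRole('button', { name: 'Download' }).click();
  const download = await downloadPromise;
  expect(download.suggestedFilename()).toMatch(/\.pdf$/);
});
```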
TC-SYS-REPORT-002: Device Timeline with Data Gaps
| Field | Value |
|---|---|
| Test ID | TC-SYS-REPORT-002 |
| Priority | P1 |
| Requirement | REQ-045 |
Preconditions: Upload session with data gaps (>1 hour between records)
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Generate report with timeline chart | Timeline chart visible in preview |
| 2 | Verify gap visualization | Gaps shown as breaks in timeline bars |
| 3 | Hover over segment | Tooltip shows segment details |
| 4 | Verify gap count | Gap count matches expected |
| 5 | Generate PDF | Timeline chart in PDF matches preview |
Pass Criteria: Data gaps correctly visualized in both preview and PDF
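The expected gap count in step 4 can be derived from the fixture rather than hard-coded. A small helper along these lines, applied to one device's parsed timestamp column with the one-hour threshold from the precondition, gives the reference value:

```js
// Count data gaps (more than thresholdMs between consecutive records) for one
// device's timestamps; used to compute the expected gap count for TC-SYS-REPORT-002.
function countGaps(timestamps, thresholdMs = 60 * 60 * 1000) {
  const times = timestamps.map((t) => new Date(t).getTime()).sort((a, b) => a - b);
  let gaps = 0;
  for (let i = 1; i < times.length; i++) {
    if (times[i] - times[i - 1] > thresholdMs) gaps += 1;
  }
  return gaps;
}
```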
2.4 Dashboard
TC-SYS-DASH-001: Dashboard Data Display
| Field | Value |
|---|---|
| Test ID | TC-SYS-DASH-001 |
| Priority | P0 |
| Requirement | REQ-030 |
Test Steps:
| Step | Action | Expected Result |
|---|---|---|
| 1 | Login and navigate to dashboard | Dashboard loads |
| 2 | Verify KPI cards | All metrics displayed |
| 3 | Verify time series chart | Data plotted correctly |
| 4 | Apply date filter | Chart updates |
| 5 | Select different device | Data filters to device |
| 6 | Check responsiveness | Layout adapts to screen size |
Pass Criteria: Dashboard displays accurate data with working filters
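The responsiveness check in step 6 can reuse the device matrix from section 7.2 as viewports; a sketch assuming Playwright, with test ids as assumptions and authentication setup omitted.

```js
// TC-SYS-DASH-001 (step 6), sketch; viewports mirror the section 7.2 device matrix,
// test ids are assumptions.
import { test, expect } from '@playwright/test';

test('dashboard layout adapts to the tested viewports', async ({ page }) => {
  await page.goto('https://staging.dustac.app/dashboard');
  for (const viewport of [
    { width: 1920, height: 1080 },
    { width: 1366, height: 768 },
    { width: 1024, height: 768 },
    { width: 375, height: 667 },
  ]) {
    await page.setViewportSize(viewport);
    await expect(page.getByTestId('kpi-cards')).toBeVisible();
    await expect(page.getByTestId('time-series-chart')).toBeVisible();
  }
});
```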
3. Performance Testing
3.1 Performance Targets
| Metric | Target | Critical Threshold |
|---|---|---|
| Dashboard LCP | < 2.5s | < 4.0s |
| API Response (p95) | < 500ms | < 1000ms |
| PDF Generation | < 60s | < 120s |
| CSV Upload (10k rows) | < 30s | < 60s |
| Concurrent Users | 50 | 25 |
3.2 Performance Test Cases
TC-SYS-PERF-001: Dashboard Load Time
| Field | Value |
|---|---|
| Test ID | TC-SYS-PERF-001 |
| Priority | P0 |
| Type | Performance |
Test Steps:
- Clear browser cache
- Navigate to dashboard with Lighthouse CI
- Measure LCP, FID, CLS
- Repeat 5 times, calculate average
Pass Criteria:
- LCP < 2.5s
- FID < 100ms
- CLS < 0.1
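These thresholds can also be enforced automatically in CI; the lighthouserc.js sketch below maps them onto Lighthouse CI assertions. The dashboard URL is the staging one from section 1.3, authenticated collection would need additional setup, and note that Lighthouse asserts on Max Potential FID rather than field FID.

```js
// lighthouserc.js: sketch mapping the TC-SYS-PERF-001 targets onto Lighthouse CI assertions.
module.exports = {
  ci: {
    collect: {
      url: ['https://staging.dustac.app/dashboard'],
      numberOfRuns: 5, // matches "repeat 5 times" above
    },
    assert: {
      assertions: {
        'largest-contentful-paint': ['error', { maxNumericValue: 2500 }],
        'max-potential-fid': ['error', { maxNumericValue: 100 }],
        'cumulative-layout-shift': ['error', { maxNumericValue: 0.1 }],
      },
    },
  },
};
```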
TC-SYS-PERF-002: API Response Time
| Field | Value |
|---|---|
| Test ID | TC-SYS-PERF-002 |
| Priority | P0 |
| Type | Performance |
Test Script (k6):
import http from 'k6/http';
import { check } from 'k6';

export const options = {
  vus: 10,
  duration: '1m',
  thresholds: {
    http_req_duration: ['p(95)<500']
  }
};

export default function () {
  const res = http.get('https://staging.dustac.app/api/measurements');
  check(res, {
    'status is 200': (r) => r.status === 200,
    'response time < 500ms': (r) => r.timings.duration < 500
  });
}

Pass Criteria: p95 response time < 500ms
TC-SYS-PERF-003: Concurrent User Load
| Field | Value |
|---|---|
| Test ID | TC-SYS-PERF-003 |
| Priority | P1 |
| Type | Load Test |
Test Script (k6):
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  stages: [
    { duration: '2m', target: 10 }, // Ramp up
    { duration: '5m', target: 50 }, // Peak load
    { duration: '2m', target: 0 }   // Ramp down
  ],
  thresholds: {
    http_req_failed: ['rate<0.01'],    // <1% errors
    http_req_duration: ['p(95)<1000']  // <1s p95
  }
};

// Each virtual user repeatedly hits the measurements endpoint from TC-SYS-PERF-002;
// replace with the real scripted user journey as needed.
export default function () {
  http.get('https://staging.dustac.app/api/measurements');
  sleep(1);
}

Pass Criteria: System handles 50 concurrent users with <1% error rate
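Both k6 scripts are run against staging with k6 run <script>.js; a failed threshold produces a non-zero exit code, which can be used to fail the CI stage.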
4. Security Testing
4.1 OWASP Top 10 Coverage
| Vulnerability | Test Case | Method |
|---|---|---|
| SQL Injection | TC-SYS-SEC-001 | Automated scan |
| Broken Auth | TC-SYS-SEC-002 | Manual + automated |
| Sensitive Data | TC-SYS-SEC-003 | Manual review |
| XSS | TC-SYS-SEC-004 | Automated scan |
| Broken Access | TC-SYS-SEC-005 | Manual testing |
4.2 Security Test Cases
TC-SYS-SEC-001: SQL Injection Prevention
Test Steps:
- Input malicious SQL in search fields: '; DROP TABLE users; --
- Input SQL in URL parameters: /api/data?id=1 OR 1=1
- Verify parameterized queries in logs
Pass Criteria: All inputs sanitized, no SQL execution
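Because the backend is Supabase, queries should go through the client/query builder rather than concatenated SQL; a sketch of the safe pattern, with table and column names as assumptions:

```js
// Sketch: query-builder filters are parameterized by Supabase/PostgREST, so input such
// as "'; DROP TABLE users; --" is treated as data, not SQL. Table and column names
// ('measurements', 'device_id') are assumptions for illustration.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

export async function searchByDevice(userInput) {
  // Safe: the filter value is passed as a parameter, never interpolated into SQL.
  const { data, error } = await supabase
    .from('measurements')
    .select('*')
    .eq('device_id', userInput);
  if (error) throw error;
  return data;
}
```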
TC-SYS-SEC-002: Authentication Security
Test Steps:
- Attempt login with invalid JWT
- Test expired token handling
- Verify session timeout (24 hours)
- Test password reset token expiry
Pass Criteria: All unauthorized requests rejected with 401
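The token checks can be automated with plain HTTP calls; a sketch assuming Playwright's request fixture, reusing the measurements endpoint from section 3 and deliberately invalid token values.

```js
// TC-SYS-SEC-002, sketch; asserts that a missing or invalid bearer token is rejected.
import { test, expect } from '@playwright/test';

const API = 'https://staging.dustac.app/api/measurements';

test('requests without a valid JWT are rejected with 401', async ({ request }) => {
  const noToken = await request.get(API);
  expect(noToken.status()).toBe(401);

  const badToken = await request.get(API, {
    headers: { Authorization: 'Bearer not-a-real-jwt' },
  });
  expect(badToken.status()).toBe(401);
});
```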
TC-SYS-SEC-005: RLS Enforcement
Test Steps:
- Login as User A
- Attempt API call to User B's data
- Verify response is empty or 403
- Check direct database access blocked
Pass Criteria: Users cannot access other users' data
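RLS can also be exercised directly with the Supabase client: sign in as User A and query rows owned by User B; with row-level security enforced, the result set should be empty. Table and column names below are assumptions, and the script expects two seeded staging accounts.

```js
// TC-SYS-SEC-005, sketch; assumes a 'measurements' table with a user_id column
// protected by RLS, and two seeded staging accounts.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

async function checkRls() {
  await supabase.auth.signInWithPassword({
    email: 'user-a@dustac.com.au',
    password: process.env.USER_A_PASSWORD,
  });
  // Query rows belonging to User B while authenticated as User A.
  const { data, error } = await supabase
    .from('measurements')
    .select('id')
    .eq('user_id', process.env.USER_B_ID);
  if (error) throw error;
  console.log(data.length === 0 ? 'RLS OK: no rows visible' : 'RLS FAILURE: rows leaked');
}

checkRls();
```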
5. Usability Testing
5.1 Usability Heuristics
| Heuristic | Test Focus |
|---|---|
| Visibility | Status indicators, loading states |
| Match | Mining terminology, familiar patterns |
| Control | Undo, cancel, back navigation |
| Consistency | UI patterns, terminology |
| Error Prevention | Validation, confirmations |
| Recognition | Clear labels, tooltips |
| Flexibility | Shortcuts, customization |
| Aesthetics | Clean design, minimal clutter |
| Error Recovery | Helpful error messages |
| Help | Documentation, tooltips |
5.2 Usability Test Cases
TC-SYS-USE-001: First-Time User Onboarding
| Field | Value |
|---|---|
| Test ID | TC-SYS-USE-001 |
| Priority | P1 |
Scenario: New user completes first upload and generates report
Observations:
- [ ] User finds upload button easily
- [ ] File selection is intuitive
- [ ] Progress feedback is clear
- [ ] Report generation is discoverable
- [ ] PDF download is obvious
Pass Criteria: User completes flow without external help
6. Accessibility Testing
6.1 WCAG 2.1 AA Compliance
| Criterion | Test Method |
|---|---|
| Keyboard Navigation | Manual testing |
| Screen Reader | NVDA/VoiceOver |
| Color Contrast | axe-core |
| Focus Indicators | Visual inspection |
| Alt Text | Automated scan |
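The automated checks in the table above (colour contrast, alt text) can run inside the browser tests via axe-core; a sketch assuming Playwright and the @axe-core/playwright integration, with authentication setup omitted.

```js
// Automated WCAG 2.1 AA scan of the dashboard; sketch, auth setup omitted.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('dashboard has no detectable WCAG 2.1 AA violations', async ({ page }) => {
  await page.goto('https://staging.dustac.app/dashboard');
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();
  expect(results.violations).toEqual([]);
});
```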
6.2 Accessibility Test Cases
TC-SYS-A11Y-001: Keyboard Navigation
Test Steps:
- Navigate entire app using Tab key
- Verify all interactive elements focusable
- Test Enter/Space activation
- Verify Escape closes modals
Pass Criteria: All functionality accessible via keyboard
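Parts of the keyboard audit can be scripted; the sketch below (Playwright assumed, selectors illustrative) tabs through the page and checks that a dialog closes on Escape, though a full manual pass is still needed to judge focus order.

```js
// TC-SYS-A11Y-001, partial automation sketch; the dialog trigger is an assumption.
import { test, expect } from '@playwright/test';

test('interactive elements are reachable and Escape closes dialogs', async ({ page }) => {
  await page.goto('https://staging.dustac.app/dashboard');
  // Tab a few times and confirm focus lands on focusable elements, not <body>.
  for (let i = 0; i < 10; i++) {
    await page.keyboard.press('Tab');
    const tag = await page.evaluate(() => document.activeElement.tagName);
    expect(tag).not.toBe('BODY');
  }
  // Open an assumed in-app dialog (e.g. a date-range picker) via keyboard, then dismiss it.
  await page.getByRole('button', { name: 'Date range' }).focus();
  await page.keyboard.press('Enter');
  await expect(page.getByRole('dialog')).toBeVisible();
  await page.keyboard.press('Escape');
  await expect(page.getByRole('dialog')).toBeHidden();
});
```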
TC-SYS-A11Y-002: Screen Reader Compatibility
Test Steps:
- Enable NVDA/VoiceOver
- Navigate through dashboard
- Verify chart descriptions read aloud
- Test form field labels
- Verify error messages announced
Pass Criteria: All content accessible to screen reader users
7. Compatibility Testing
7.1 Browser Matrix
| Browser | Version | Priority |
|---|---|---|
| Chrome | Latest, Latest-1 | P0 |
| Firefox | Latest, Latest-1 | P0 |
| Safari | Latest | P1 |
| Edge | Latest | P1 |
7.2 Device Matrix
| Device | Resolution | Priority |
|---|---|---|
| Desktop | 1920x1080 | P0 |
| Desktop | 1366x768 | P0 |
| Tablet | 1024x768 | P1 |
| Mobile | 375x667 | P2 |
7.3 Compatibility Test Cases
TC-SYS-COMPAT-001: Cross-Browser Testing
Test Steps:
- Execute core workflow in each browser
- Verify chart rendering consistency
- Check CSS layout accuracy
- Test file upload functionality
- Verify PDF generation
Pass Criteria: Consistent behavior across all browsers
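With Playwright (assumed), the browser matrix in 7.1 maps directly onto test projects, so the same suite runs unchanged across engines; a configuration sketch, with WebKit standing in for Safari:

```js
// playwright.config.js: sketch covering the section 7.1 browser matrix.
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  use: { baseURL: 'https://staging.dustac.app' },
  projects: [
    { name: 'chromium', use: { ...devices['Desktop Chrome'] } },
    { name: 'firefox', use: { ...devices['Desktop Firefox'] } },
    { name: 'webkit', use: { ...devices['Desktop Safari'] } },
    { name: 'edge', use: { ...devices['Desktop Edge'], channel: 'msedge' } },
  ],
});
```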
8. Test Execution Summary
8.1 Test Case Count
| Category | Total | P0 | P1 | P2 |
|---|---|---|---|---|
| Functional | 25 | 12 | 10 | 3 |
| Performance | 8 | 4 | 3 | 1 |
| Security | 10 | 6 | 3 | 1 |
| Usability | 6 | 2 | 3 | 1 |
| Accessibility | 5 | 2 | 2 | 1 |
| Compatibility | 6 | 3 | 2 | 1 |
| Total | 60 | 29 | 23 | 8 |
8.2 Exit Criteria
- All P0 test cases passed
- ≥95% P1 test cases passed
- No open S0/S1 defects
- Performance targets met
- Security scan passed
Related Documents:
- TEST_PLAN.md - Main test plan
- TEST_E2E.md - End-to-end tests
- TEST_EXECUTION.md - Execution process