Compare commits: `20f1f4ac97...docker` (19 commits)

- ae4e60f966
- dbca64a03c
- c56226e05b
- 8f8abfc651
- fc44fef91a
- 7026f2bca7
- e88537754f
- fe305310b9
- 1b0cc4441f
- 21263e6735
- db0145782a
- 2aebfb0771
- 310b1547c4
- 688b0fc255
- 5b7893536e
- a50b4ae724
- 56be3c0702
- 6b195d3014
- ac0c8a39a9
.gitignore (vendored, 4 lines changed)

```
@@ -15,9 +15,13 @@ coverage
*.lcov

# logs
logs/*.log
logs
backend.log
frontend.log
*.log
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
!logs/.gitkeep

# dotenv environment variable files
.env
```
CLAUDE.md (129 lines changed)

@@ -21,11 +21,36 @@ Default to using Bun instead of Node.js.

## Logging

The application uses a custom logger (defined in `src/backend/config.js`) that outputs to both files and the console.

### Backend Logging

Backend logs are written to `logs/backend.log`:

- **Log levels**: `debug` (0), `info` (1), `warn` (2), `error` (3)
- **Default**: `debug` in development, `info` in production
- **Override**: Set the `LOG_LEVEL` environment variable (e.g., `LOG_LEVEL=debug`)
- **Output format**: `[timestamp] LEVEL: message` followed by JSON data
- **Console**: Also outputs to the console in development mode
- **File**: Always writes to `logs/backend.log`

### Frontend Logging

Frontend logs are sent to the backend and written to `logs/frontend.log`:

- **Logger**: `src/frontend/src/lib/logger.js`
- **Endpoint**: `POST /api/logs`
- **Batching**: Batches logs (up to 10 entries or 5 seconds, whichever comes first) for performance
- **User context**: Automatically includes the userId and user-agent
- **Levels**: Same as the backend (debug, info, warn, error)

**Usage in frontend**:

```javascript
import { logger } from '$lib/logger';

logger.info('User action', { action: 'click', element: 'button' });
logger.error('API error', { error: err.message });
logger.warn('Deprecated feature used');
logger.debug('Component state', { state: componentState });
```

**Important**: The logger uses the nullish coalescing operator (`??`) when resolving log levels, ensuring that `debug` (level 0) is not treated as falsy.
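The `??` detail matters because `||` would silently replace an explicit `debug` level (0) with the fallback. A minimal sketch of the level check (hypothetical helper names; the real logger lives in `src/backend/config.js`):

```javascript
// Numeric log levels, matching the mapping documented above.
const LEVELS = { debug: 0, info: 1, warn: 2, error: 3 };

// Resolve the configured minimum level. `??` keeps an explicit level of
// 0 (debug); `||` would treat 0 as falsy and fall through to the fallback.
function resolveLevel(configured, fallback) {
  return configured ?? fallback;
}

// A message is emitted when its level is at or above the minimum.
function shouldLog(minLevel, messageLevel) {
  return LEVELS[messageLevel] >= resolveLevel(minLevel, LEVELS.info);
}

console.log(shouldLog(LEVELS.debug, 'debug')); // true: 0 ?? 1 keeps 0
console.log(LEVELS.debug || LEVELS.info);      // 1: the `||` bug in action
```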
@@ -35,6 +60,11 @@

```
NODE_ENV=development
LOG_LEVEL=debug
```

**Log Files**:

- `logs/backend.log` - Backend server logs
- `logs/frontend.log` - Frontend client logs
- Logs are excluded from git via `.gitignore`

## Testing

Use `bun test` to run tests.
@@ -679,3 +709,100 @@ if (!hasLoTWConfirmation && hasDCLData && missingEntity) {

- Clarified that the DCL API doesn't send DXCC fields (current limitation)
- Implemented priority logic: LoTW entity data takes precedence over DCL
- System is ready to auto-use DCL DXCC data if it is added in a future API update
### Critical LoTW Sync Behavior (LEARNED THE HARD WAY)

**⚠️ IMPORTANT: LoTW sync MUST only import confirmed QSOs**

After attempting to implement "QSO Delta" sync (all QSOs, confirmed + unconfirmed), we discovered:

**The Problem:**
The LoTW ADIF export with `qso_qsl=no` (all-QSOs mode) only includes:

- `CALL` (callsign)
- `QSL_RCVD` (confirmation status: Y/N)

**Missing Fields for Unconfirmed QSOs:**

- `DXCC` (entity ID) ← **CRITICAL for awards!**
- `COUNTRY` (entity name)
- `CONTINENT`
- `CQ_ZONE`
- `ITU_ZONE`

**Result:** Unconfirmed QSOs end up with `entityId: null` and `entity: ""`, breaking award calculations.

**Current Implementation (CORRECT):**

```javascript
// lotw.service.js - fetchQSOsFromLoTW()
const params = new URLSearchParams({
  login: lotwUsername,
  password: loTWPassword,
  qso_query: '1',
  qso_qsl: 'yes',          // ONLY confirmed QSOs
  qso_qslsince: dateStr,   // Incremental sync
});
```

**Why This Matters:**

- Awards require `entityId` to count entities
- Without `entityId`, QSOs can't be counted toward DXCC, WAS, etc.
- Users can still see "worked" stations in the QSO list, but awards only count confirmed QSOs
- DCL sync can import all QSOs because it provides entity data via callsign lookup

**Attempted Solution (REVERTED):**

- Tried implementing a callsign-prefix lookup to populate the missing `entityId`
- Created `src/backend/utils/callsign-lookup.js` with basic prefix mappings
- Complexity: 1000+ DXCC entities, many special-event callsigns, portable designators
- Decision: Too complex; reverted (commit 310b154)

**Takeaway:** Only LoTW-confirmed QSOs carry reliable DXCC data. Don't try to work around this fundamental limitation.
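The failure mode described above is easy to reproduce at the record-mapping step. A hedged sketch (the uppercase field names follow the ADIF export behavior described above; `mapAdifRecord` is a hypothetical helper, not the repo's actual parser):

```javascript
// Map a parsed ADIF record (uppercase ADIF field names) to a QSO row.
// Records from the all-QSOs export lack DXCC/COUNTRY, so entityId and
// entity degrade exactly as described: null and "".
function mapAdifRecord(rec) {
  return {
    callsign: rec.CALL,
    confirmed: rec.QSL_RCVD === 'Y',
    entityId: rec.DXCC ? Number(rec.DXCC) : null,
    entity: rec.COUNTRY ?? '',
  };
}

// Confirmed record: full entity data present.
const confirmed = mapAdifRecord({
  CALL: 'DL1ABC', QSL_RCVD: 'Y', DXCC: '230', COUNTRY: 'FED. REP. OF GERMANY',
});

// Unconfirmed record from qso_qsl=no mode: only CALL and QSL_RCVD.
const unconfirmed = mapAdifRecord({ CALL: 'K1XYZ', QSL_RCVD: 'N' });

console.log(confirmed.entityId);   // 230
console.log(unconfirmed.entityId); // null - unusable for award counting
```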
### QSO Confirmation Filters

Added a "Confirmed by at least 1 service" filter to the QSO view (commit 688b0fc):

**Filter Options:**

- "All QSOs" - No filter
- "Confirmed by at least 1 service" (NEW) - LoTW OR DCL confirmed
- "LoTW Only" - Confirmed by LoTW but NOT DCL
- "DCL Only" - Confirmed by DCL but NOT LoTW
- "Both Confirmed" - Confirmed by BOTH LoTW AND DCL
- "Not Confirmed" - Confirmed by NEITHER

**SQL Logic:**

```sql
-- "Confirmed by at least 1 service"
WHERE lotwQslRstatus = 'Y' OR dclQslRstatus = 'Y'

-- "LoTW Only"
WHERE lotwQslRstatus = 'Y' AND (dclQslRstatus IS NULL OR dclQslRstatus != 'Y')

-- "DCL Only"
WHERE dclQslRstatus = 'Y' AND (lotwQslRstatus IS NULL OR lotwQslRstatus != 'Y')

-- "Both Confirmed"
WHERE lotwQslRstatus = 'Y' AND dclQslRstatus = 'Y'

-- "Not Confirmed"
WHERE (lotwQslRstatus IS NULL OR lotwQslRstatus != 'Y')
  AND (dclQslRstatus IS NULL OR dclQslRstatus != 'Y')
```
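The same six options can be exercised in plain JavaScript. A sketch mirroring the SQL predicates (a hypothetical helper, not the repo's query builder): a status counts as confirmed only when it is exactly `'Y'`, so `NULL` and `'N'` fall on the same side, just as the `IS NULL OR != 'Y'` clauses do.

```javascript
// Mirrors the SQL predicates: null/undefined and anything other than
// 'Y' both count as "not confirmed".
const isY = (status) => status === 'Y';

const confirmationFilters = {
  all:          () => true,
  anyService:   (q) => isY(q.lotwQslRstatus) || isY(q.dclQslRstatus),
  lotwOnly:     (q) => isY(q.lotwQslRstatus) && !isY(q.dclQslRstatus),
  dclOnly:      (q) => isY(q.dclQslRstatus) && !isY(q.lotwQslRstatus),
  both:         (q) => isY(q.lotwQslRstatus) && isY(q.dclQslRstatus),
  notConfirmed: (q) => !isY(q.lotwQslRstatus) && !isY(q.dclQslRstatus),
};

const qsos = [
  { callsign: 'A', lotwQslRstatus: 'Y', dclQslRstatus: null },
  { callsign: 'B', lotwQslRstatus: null, dclQslRstatus: 'Y' },
  { callsign: 'C', lotwQslRstatus: 'Y', dclQslRstatus: 'Y' },
  { callsign: 'D', lotwQslRstatus: 'N', dclQslRstatus: null },
];

console.log(qsos.filter(confirmationFilters.anyService).length);   // 3
console.log(qsos.filter(confirmationFilters.lotwOnly).length);     // 1
console.log(qsos.filter(confirmationFilters.notConfirmed).length); // 1
```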
### Recent Development Work (January 2025)

**Sync Type Support (ATTEMPTED & REVERTED):**

- Commit 5b78935: Added LoTW sync type support (QSL / QSO delta / full)
- Commit 310b154: Reverted - LoTW doesn't provide entity data for unconfirmed QSOs
- **Lesson:** Keep it simple - only sync confirmed QSOs from LoTW

**Dashboard Enhancements:**

- Added a sync job history display with real-time polling (every 2 seconds)
- Shows job progress, status, and import logs
- Cancel button for stale/failed jobs with rollback capability
- Tracks all QSO changes in the `qso_changes` table for rollback

**Rollback System:**

- `cancelJob(jobId, userId)` - Cancels and rolls back sync jobs
- Tracks added QSOs (deletes them on rollback)
- Tracks updated QSOs (restores their previous state)
- Only allows canceling failed jobs or stale running jobs (>1 hour)
- Server-side validation prevents unauthorized cancellations
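Given the `qso_changes` columns (`change_type`, `before_data`, `after_data`), the rollback step can be sketched as follows. This is a hypothetical in-memory illustration, not the repo's actual `cancelJob` implementation, and the `'added'`/`'updated'` change-type strings are assumed names:

```javascript
// Roll back a job's recorded changes against an in-memory QSO store:
// 'added' changes are deleted, 'updated' changes restore before_data.
function rollbackChanges(qsosById, changes) {
  // Undo in reverse order so later changes are reverted first.
  for (const change of [...changes].reverse()) {
    if (change.changeType === 'added') {
      qsosById.delete(change.qsoId);
    } else if (change.changeType === 'updated') {
      qsosById.set(change.qsoId, JSON.parse(change.beforeData));
    }
  }
  return qsosById;
}

const store = new Map([
  [1, { callsign: 'DL1ABC', band: '20m' }], // updated by the sync job
  [2, { callsign: 'K1XYZ', band: '40m' }],  // added by the sync job
]);

rollbackChanges(store, [
  { qsoId: 1, changeType: 'updated', beforeData: JSON.stringify({ callsign: 'DL1ABC', band: '10m' }) },
  { qsoId: 2, changeType: 'added' },
]);

console.log(store.get(1).band); // '10m' - previous state restored
console.log(store.has(2));      // false - added QSO removed
```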
award-definitions/73-on-73.json (new file, 23 lines)

```json
{
  "id": "73-on-73",
  "name": "73 on 73",
  "description": "Confirm 73 unique QSO partners on satellite AO-73",
  "caption": "Contact and confirm 73 different stations (unique callsigns) via the AO-73 satellite. Each unique callsign confirmed via LoTW counts toward the total of 73.",
  "category": "satellite",
  "rules": {
    "type": "entity",
    "entityType": "callsign",
    "target": 73,
    "displayField": "callsign",
    "filters": {
      "operator": "AND",
      "filters": [
        {
          "field": "satName",
          "operator": "eq",
          "value": "AO-73"
        }
      ]
    }
  }
}
```
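An award definition of this shape can be evaluated with a small reducer: filter QSOs by the rule's filters, then count unique values of the `entityType` field against `target`. A hedged sketch (hypothetical evaluator handling only the `"eq"` operator with `"AND"` groups; the repo's actual awards engine may differ):

```javascript
// Evaluate an "entity"-type award: count unique entityType values among
// the QSOs that satisfy every filter in the definition's AND group.
function evaluateAward(def, qsos) {
  const matches = (q, group) =>
    group.filters.every((f) => f.operator === 'eq' && q[f.field] === f.value);
  const matched = qsos.filter((q) => matches(q, def.rules.filters));
  const unique = new Set(matched.map((q) => q[def.rules.entityType]));
  return {
    count: unique.size,
    target: def.rules.target,
    achieved: unique.size >= def.rules.target,
  };
}

const def = {
  rules: {
    type: 'entity', entityType: 'callsign', target: 73,
    filters: { operator: 'AND', filters: [{ field: 'satName', operator: 'eq', value: 'AO-73' }] },
  },
};

const qsos = [
  { callsign: 'DL1ABC', satName: 'AO-73' },
  { callsign: 'DL1ABC', satName: 'AO-73' },  // duplicate callsign: counted once
  { callsign: 'K1XYZ', satName: 'AO-73' },
  { callsign: 'G4AAA', satName: 'QO-100' },  // wrong satellite: filtered out
];

console.log(evaluateAward(def, qsos).count); // 2
```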
drizzle/0002_nervous_layla_miller.sql (new file, 25 lines)

```sql
CREATE TABLE `admin_actions` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `admin_id` integer NOT NULL,
  `action_type` text NOT NULL,
  `target_user_id` integer,
  `details` text,
  `created_at` integer NOT NULL,
  FOREIGN KEY (`admin_id`) REFERENCES `users`(`id`) ON UPDATE no action ON DELETE no action,
  FOREIGN KEY (`target_user_id`) REFERENCES `users`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE TABLE `qso_changes` (
  `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
  `job_id` integer NOT NULL,
  `qso_id` integer,
  `change_type` text NOT NULL,
  `before_data` text,
  `after_data` text,
  `created_at` integer NOT NULL,
  FOREIGN KEY (`job_id`) REFERENCES `sync_jobs`(`id`) ON UPDATE no action ON DELETE no action,
  FOREIGN KEY (`qso_id`) REFERENCES `qsos`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
ALTER TABLE `users` ADD `role` text DEFAULT 'user' NOT NULL;--> statement-breakpoint
ALTER TABLE `users` ADD `is_admin` integer DEFAULT false NOT NULL;
```

drizzle/0003_tired_warpath.sql (new file, 1 line)

```sql
ALTER TABLE `users` DROP COLUMN `role`;
```
drizzle/meta/0002_snapshot.json (new file, 756 lines)

```json
{
  "version": "6",
  "dialect": "sqlite",
  "id": "542bddc5-2e08-49af-91b5-013a6c9584df",
  "prevId": "b5c00e60-2f3c-4c2b-a540-0be8d9e856e6",
  "tables": {
    "admin_actions": {
      "name": "admin_actions",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "admin_id": { "name": "admin_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "action_type": { "name": "action_type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "target_user_id": { "name": "target_user_id", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "details": { "name": "details", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "admin_actions_admin_id_users_id_fk": { "name": "admin_actions_admin_id_users_id_fk", "tableFrom": "admin_actions", "tableTo": "users", "columnsFrom": ["admin_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" },
        "admin_actions_target_user_id_users_id_fk": { "name": "admin_actions_target_user_id_users_id_fk", "tableFrom": "admin_actions", "tableTo": "users", "columnsFrom": ["target_user_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "award_progress": {
      "name": "award_progress",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "user_id": { "name": "user_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "award_id": { "name": "award_id", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "worked_count": { "name": "worked_count", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 0 },
        "confirmed_count": { "name": "confirmed_count", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 0 },
        "total_required": { "name": "total_required", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "worked_entities": { "name": "worked_entities", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "confirmed_entities": { "name": "confirmed_entities", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "last_calculated_at": { "name": "last_calculated_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "last_qso_sync_at": { "name": "last_qso_sync_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "updated_at": { "name": "updated_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "award_progress_user_id_users_id_fk": { "name": "award_progress_user_id_users_id_fk", "tableFrom": "award_progress", "tableTo": "users", "columnsFrom": ["user_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" },
        "award_progress_award_id_awards_id_fk": { "name": "award_progress_award_id_awards_id_fk", "tableFrom": "award_progress", "tableTo": "awards", "columnsFrom": ["award_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "awards": {
      "name": "awards",
      "columns": {
        "id": { "name": "id", "type": "text", "primaryKey": true, "notNull": true, "autoincrement": false },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "description": { "name": "description", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "definition": { "name": "definition", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "is_active": { "name": "is_active", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "qso_changes": {
      "name": "qso_changes",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "job_id": { "name": "job_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "qso_id": { "name": "qso_id", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "change_type": { "name": "change_type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "before_data": { "name": "before_data", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "after_data": { "name": "after_data", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "qso_changes_job_id_sync_jobs_id_fk": { "name": "qso_changes_job_id_sync_jobs_id_fk", "tableFrom": "qso_changes", "tableTo": "sync_jobs", "columnsFrom": ["job_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" },
        "qso_changes_qso_id_qsos_id_fk": { "name": "qso_changes_qso_id_qsos_id_fk", "tableFrom": "qso_changes", "tableTo": "qsos", "columnsFrom": ["qso_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "qsos": {
      "name": "qsos",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "user_id": { "name": "user_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "callsign": { "name": "callsign", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "qso_date": { "name": "qso_date", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "time_on": { "name": "time_on", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "band": { "name": "band", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "mode": { "name": "mode", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "freq": { "name": "freq", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "freq_rx": { "name": "freq_rx", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "entity": { "name": "entity", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "entity_id": { "name": "entity_id", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "grid": { "name": "grid", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "grid_source": { "name": "grid_source", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "continent": { "name": "continent", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "cq_zone": { "name": "cq_zone", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "itu_zone": { "name": "itu_zone", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "state": { "name": "state", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "county": { "name": "county", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "sat_name": { "name": "sat_name", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "sat_mode": { "name": "sat_mode", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "my_darc_dok": { "name": "my_darc_dok", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "darc_dok": { "name": "darc_dok", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "lotw_qsl_rdate": { "name": "lotw_qsl_rdate", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "lotw_qsl_rstatus": { "name": "lotw_qsl_rstatus", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "dcl_qsl_rdate": { "name": "dcl_qsl_rdate", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "dcl_qsl_rstatus": { "name": "dcl_qsl_rstatus", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "lotw_synced_at": { "name": "lotw_synced_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "qsos_user_id_users_id_fk": { "name": "qsos_user_id_users_id_fk", "tableFrom": "qsos", "tableTo": "users", "columnsFrom": ["user_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "sync_jobs": {
      "name": "sync_jobs",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "user_id": { "name": "user_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "status": { "name": "status", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "type": { "name": "type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "started_at": { "name": "started_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "completed_at": { "name": "completed_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "result": { "name": "result", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "error": { "name": "error", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "sync_jobs_user_id_users_id_fk": { "name": "sync_jobs_user_id_users_id_fk", "tableFrom": "sync_jobs", "tableTo": "users", "columnsFrom": ["user_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "users": {
      "name": "users",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "email": { "name": "email", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "password_hash": { "name": "password_hash", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "callsign": { "name": "callsign", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "lotw_username": { "name": "lotw_username", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "lotw_password": { "name": "lotw_password", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "dcl_api_key": { "name": "dcl_api_key", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "role": { "name": "role", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "'user'" },
        "is_admin": { "name": "is_admin", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": false },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "updated_at": { "name": "updated_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {
        "users_email_unique": { "name": "users_email_unique", "columns": ["email"], "isUnique": true }
      },
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    }
  },
  "views": {},
  "enums": {},
  "_meta": { "schemas": {}, "tables": {}, "columns": {} },
  "internal": { "indexes": {} }
}
```
drizzle/meta/0003_snapshot.json (new file, 748 lines)

```json
{
  "version": "6",
  "dialect": "sqlite",
  "id": "071c98fb-6721-4da7-98cb-c16cb6aaf0c1",
  "prevId": "542bddc5-2e08-49af-91b5-013a6c9584df",
  "tables": {
    "admin_actions": {
      "name": "admin_actions",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "admin_id": { "name": "admin_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "action_type": { "name": "action_type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "target_user_id": { "name": "target_user_id", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "details": { "name": "details", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "admin_actions_admin_id_users_id_fk": { "name": "admin_actions_admin_id_users_id_fk", "tableFrom": "admin_actions", "tableTo": "users", "columnsFrom": ["admin_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" },
        "admin_actions_target_user_id_users_id_fk": { "name": "admin_actions_target_user_id_users_id_fk", "tableFrom": "admin_actions", "tableTo": "users", "columnsFrom": ["target_user_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "award_progress": {
      "name": "award_progress",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "user_id": { "name": "user_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "award_id": { "name": "award_id", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "worked_count": { "name": "worked_count", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 0 },
        "confirmed_count": { "name": "confirmed_count", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 0 },
        "total_required": { "name": "total_required", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "worked_entities": { "name": "worked_entities", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "confirmed_entities": { "name": "confirmed_entities", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "last_calculated_at": { "name": "last_calculated_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "last_qso_sync_at": { "name": "last_qso_sync_at", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "updated_at": { "name": "updated_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {
        "award_progress_user_id_users_id_fk": { "name": "award_progress_user_id_users_id_fk", "tableFrom": "award_progress", "tableTo": "users", "columnsFrom": ["user_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" },
        "award_progress_award_id_awards_id_fk": { "name": "award_progress_award_id_awards_id_fk", "tableFrom": "award_progress", "tableTo": "awards", "columnsFrom": ["award_id"], "columnsTo": ["id"], "onDelete": "no action", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "awards": {
      "name": "awards",
      "columns": {
        "id": { "name": "id", "type": "text", "primaryKey": true, "notNull": true, "autoincrement": false },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "description": { "name": "description", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "definition": { "name": "definition", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "is_active": { "name": "is_active", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "created_at": { "name": "created_at", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "qso_changes": {
```
"name": "qso_changes",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"job_id": {
|
||||
"name": "job_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"qso_id": {
|
||||
"name": "qso_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"change_type": {
|
||||
"name": "change_type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"before_data": {
|
||||
"name": "before_data",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"after_data": {
|
||||
"name": "after_data",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"qso_changes_job_id_sync_jobs_id_fk": {
|
||||
"name": "qso_changes_job_id_sync_jobs_id_fk",
|
||||
"tableFrom": "qso_changes",
|
||||
"tableTo": "sync_jobs",
|
||||
"columnsFrom": [
|
||||
"job_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "no action",
|
||||
"onUpdate": "no action"
|
||||
},
|
||||
"qso_changes_qso_id_qsos_id_fk": {
|
||||
"name": "qso_changes_qso_id_qsos_id_fk",
|
||||
"tableFrom": "qso_changes",
|
||||
"tableTo": "qsos",
|
||||
"columnsFrom": [
|
||||
"qso_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "no action",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"qsos": {
|
||||
"name": "qsos",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"user_id": {
|
||||
"name": "user_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"callsign": {
|
||||
"name": "callsign",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"qso_date": {
|
||||
"name": "qso_date",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"time_on": {
|
||||
"name": "time_on",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"band": {
|
||||
"name": "band",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"mode": {
|
||||
"name": "mode",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"freq": {
|
||||
"name": "freq",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"freq_rx": {
|
||||
"name": "freq_rx",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"entity": {
|
||||
"name": "entity",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"entity_id": {
|
||||
"name": "entity_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"grid": {
|
||||
"name": "grid",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"grid_source": {
|
||||
"name": "grid_source",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"continent": {
|
||||
"name": "continent",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"cq_zone": {
|
||||
"name": "cq_zone",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"itu_zone": {
|
||||
"name": "itu_zone",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"state": {
|
||||
"name": "state",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"county": {
|
||||
"name": "county",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"sat_name": {
|
||||
"name": "sat_name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"sat_mode": {
|
||||
"name": "sat_mode",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"my_darc_dok": {
|
||||
"name": "my_darc_dok",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"darc_dok": {
|
||||
"name": "darc_dok",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"lotw_qsl_rdate": {
|
||||
"name": "lotw_qsl_rdate",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"lotw_qsl_rstatus": {
|
||||
"name": "lotw_qsl_rstatus",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"dcl_qsl_rdate": {
|
||||
"name": "dcl_qsl_rdate",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"dcl_qsl_rstatus": {
|
||||
"name": "dcl_qsl_rstatus",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"lotw_synced_at": {
|
||||
"name": "lotw_synced_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"qsos_user_id_users_id_fk": {
|
||||
"name": "qsos_user_id_users_id_fk",
|
||||
"tableFrom": "qsos",
|
||||
"tableTo": "users",
|
||||
"columnsFrom": [
|
||||
"user_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "no action",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"sync_jobs": {
|
||||
"name": "sync_jobs",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"user_id": {
|
||||
"name": "user_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"type": {
|
||||
"name": "type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"started_at": {
|
||||
"name": "started_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"completed_at": {
|
||||
"name": "completed_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"result": {
|
||||
"name": "result",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"error": {
|
||||
"name": "error",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"sync_jobs_user_id_users_id_fk": {
|
||||
"name": "sync_jobs_user_id_users_id_fk",
|
||||
"tableFrom": "sync_jobs",
|
||||
"tableTo": "users",
|
||||
"columnsFrom": [
|
||||
"user_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "no action",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"users": {
|
||||
"name": "users",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"email": {
|
||||
"name": "email",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"password_hash": {
|
||||
"name": "password_hash",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"callsign": {
|
||||
"name": "callsign",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"lotw_username": {
|
||||
"name": "lotw_username",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"lotw_password": {
|
||||
"name": "lotw_password",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"dcl_api_key": {
|
||||
"name": "dcl_api_key",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"is_admin": {
|
||||
"name": "is_admin",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {
|
||||
"users_email_unique": {
|
||||
"name": "users_email_unique",
|
||||
"columns": [
|
||||
"email"
|
||||
],
|
||||
"isUnique": true
|
||||
}
|
||||
},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
}
|
||||
},
|
||||
"views": {},
|
||||
"enums": {},
|
||||
"_meta": {
|
||||
"schemas": {},
|
||||
"tables": {},
|
||||
"columns": {}
|
||||
},
|
||||
"internal": {
|
||||
"indexes": {}
|
||||
}
|
||||
}
|
||||
@@ -15,6 +15,20 @@
     "when": 1768641501799,
     "tag": "0001_free_hiroim",
     "breakpoints": true
-  }
+  },
+  {
+    "idx": 2,
+    "version": "6",
+    "when": 1768988121232,
+    "tag": "0002_nervous_layla_miller",
+    "breakpoints": true
+  },
+  {
+    "idx": 3,
+    "version": "6",
+    "when": 1768989260562,
+    "tag": "0003_tired_warpath",
+    "breakpoints": true
+  }
   ]
 }
@@ -1,15 +1,30 @@
 import Database from 'bun:sqlite';
 import { drizzle } from 'drizzle-orm/bun-sqlite';
 import * as schema from './db/schema/index.js';
-import { join } from 'path';
+import { join, dirname } from 'path';
+import { existsSync, mkdirSync, appendFile } from 'fs';
+import { fileURLToPath } from 'url';
 
 // ===================================================================
 // Configuration
 // ===================================================================
 
+// ES module equivalent of __dirname
+const __filename = fileURLToPath(import.meta.url);
+const __dirname = dirname(__filename);
+
 const isDevelopment = process.env.NODE_ENV !== 'production';
 
-export const JWT_SECRET = process.env.JWT_SECRET || 'your-secret-key-change-in-production';
+// SECURITY: Require JWT_SECRET in production - no fallback for security
+// This prevents JWT token forgery if environment variable is not set
+if (!process.env.JWT_SECRET && !isDevelopment) {
+  throw new Error(
+    'FATAL: JWT_SECRET environment variable must be set in production. ' +
+    'Generate one with: openssl rand -base64 32'
+  );
+}
+
+export const JWT_SECRET = process.env.JWT_SECRET || 'dev-secret-key-change-in-production';
 export const LOG_LEVEL = process.env.LOG_LEVEL || (isDevelopment ? 'debug' : 'info');
 
 // ===================================================================
@@ -19,16 +34,46 @@ export const LOG_LEVEL = process.env.LOG_LEVEL || (isDevelopment ? 'debug' : 'in
 const logLevels = { debug: 0, info: 1, warn: 2, error: 3 };
 const currentLogLevel = logLevels[LOG_LEVEL] ?? 1;
 
+// Log file paths
+const logsDir = join(__dirname, '../../logs');
+const backendLogFile = join(logsDir, 'backend.log');
+
+// Ensure log directory exists
+if (!existsSync(logsDir)) {
+  mkdirSync(logsDir, { recursive: true });
+}
+
+function formatLogMessage(level, message, data) {
+  const timestamp = new Date().toISOString();
+  let logMessage = `[${timestamp}] ${level.toUpperCase()}: ${message}`;
+
+  if (data && Object.keys(data).length > 0) {
+    logMessage += ' ' + JSON.stringify(data, null, 2);
+  }
+
+  return logMessage + '\n';
+}
+
 function log(level, message, data) {
   if (logLevels[level] < currentLogLevel) return;
 
-  const timestamp = new Date().toISOString();
-  const logMessage = `[${timestamp}] ${level.toUpperCase()}: ${message}`;
+  const logMessage = formatLogMessage(level, message, data);
 
-  if (data && Object.keys(data).length > 0) {
-    console.log(logMessage, JSON.stringify(data, null, 2));
-  } else {
-    console.log(logMessage);
+  // Append to file asynchronously (fire and forget for performance)
+  appendFile(backendLogFile, logMessage, (err) => {
+    if (err) console.error('Failed to write to log file:', err);
+  });
+
+  // Also log to console in development
+  if (isDevelopment) {
+    const timestamp = new Date().toISOString();
+    const consoleMessage = `[${timestamp}] ${level.toUpperCase()}: ${message}`;
+
+    if (data && Object.keys(data).length > 0) {
+      console.log(consoleMessage, data);
+    } else {
+      console.log(consoleMessage);
+    }
   }
 }
 
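For reference, the `formatLogMessage` helper in the hunk above determines the exact shape of every line in `logs/backend.log`. A standalone sketch (same logic, copied from the patch; the sample message and data are illustrative):

```javascript
// Standalone copy of the backend log-line formatter for experimentation.
function formatLogMessage(level, message, data) {
  const timestamp = new Date().toISOString();
  let logMessage = `[${timestamp}] ${level.toUpperCase()}: ${message}`;

  // Pretty-printed JSON payload is appended only when data is non-empty
  if (data && Object.keys(data).length > 0) {
    logMessage += ' ' + JSON.stringify(data, null, 2);
  }

  return logMessage + '\n';
}

const line = formatLogMessage('info', 'Sync complete', { userId: 42 });
console.log(line); // [<ISO timestamp>] INFO: Sync complete { "userId": 42 } (JSON pretty-printed)
```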
@@ -39,6 +84,27 @@ export const logger = {
   error: (message, data) => log('error', message, data),
 };
 
+// Frontend logger - writes to separate log file
+const frontendLogFile = join(logsDir, 'frontend.log');
+
+export function logToFrontend(level, message, data = null, context = {}) {
+  if (logLevels[level] < currentLogLevel) return;
+
+  const timestamp = new Date().toISOString();
+  let logMessage = `[${timestamp}] [${context.userAgent || 'unknown'}] [${context.userId || 'anonymous'}] ${level.toUpperCase()}: ${message}`;
+
+  if (data && Object.keys(data).length > 0) {
+    logMessage += ' ' + JSON.stringify(data, null, 2);
+  }
+
+  logMessage += '\n';
+
+  // Append to frontend log file
+  appendFile(frontendLogFile, logMessage, (err) => {
+    if (err) console.error('Failed to write to frontend log file:', err);
+  });
+}
+
 export default logger;
 
 // ===================================================================
@@ -46,7 +112,6 @@ export default logger;
 // ===================================================================
 
 // Get the directory containing this config file, then go to parent for db location
-const __dirname = new URL('.', import.meta.url).pathname;
 const dbPath = join(__dirname, 'award.db');
 
 const sqlite = new Database(dbPath);
@@ -57,6 +122,8 @@ export const db = drizzle({
   schema,
 });
 
 export { sqlite };
 
+export async function closeDatabase() {
+  sqlite.close();
+}
 
@@ -9,6 +9,7 @@ import { sqliteTable, text, integer } from 'drizzle-orm/sqlite-core';
  * @property {string|null} lotwUsername
  * @property {string|null} lotwPassword
+ * @property {string|null} dclApiKey
  * @property {boolean} isAdmin
  * @property {Date} createdAt
  * @property {Date} updatedAt
  */
@@ -21,6 +22,7 @@ export const users = sqliteTable('users', {
   lotwUsername: text('lotw_username'),
   lotwPassword: text('lotw_password'), // Encrypted
+  dclApiKey: text('dcl_api_key'), // DCL API key for future use
   isAdmin: integer('is_admin', { mode: 'boolean' }).notNull().default(false),
   createdAt: integer('created_at', { mode: 'timestamp' }).notNull().$defaultFn(() => new Date()),
   updatedAt: integer('updated_at', { mode: 'timestamp' }).notNull().$defaultFn(() => new Date()),
 });
@@ -181,5 +183,45 @@ export const syncJobs = sqliteTable('sync_jobs', {
   createdAt: integer('created_at', { mode: 'timestamp' }).notNull().$defaultFn(() => new Date()),
 });
 
+/**
+ * @typedef {Object} QSOChange
+ * @property {number} id
+ * @property {number} jobId
+ * @property {number|null} qsoId
+ * @property {string} changeType - 'added' or 'updated'
+ * @property {string|null} beforeData - JSON snapshot before change (for updates)
+ * @property {string|null} afterData - JSON snapshot after change
+ * @property {Date} createdAt
+ */
+
+export const qsoChanges = sqliteTable('qso_changes', {
+  id: integer('id').primaryKey({ autoIncrement: true }),
+  jobId: integer('job_id').notNull().references(() => syncJobs.id),
+  qsoId: integer('qso_id').references(() => qsos.id), // null for added QSOs until created
+  changeType: text('change_type').notNull(), // 'added' or 'updated'
+  beforeData: text('before_data'), // JSON snapshot before change
+  afterData: text('after_data'), // JSON snapshot after change
+  createdAt: integer('created_at', { mode: 'timestamp' }).notNull().$defaultFn(() => new Date()),
+});
+
+/**
+ * @typedef {Object} AdminAction
+ * @property {number} id
+ * @property {number} adminId
+ * @property {string} actionType
+ * @property {number|null} targetUserId
+ * @property {string|null} details
+ * @property {Date} createdAt
+ */
+
+export const adminActions = sqliteTable('admin_actions', {
+  id: integer('id').primaryKey({ autoIncrement: true }),
+  adminId: integer('admin_id').notNull().references(() => users.id),
+  actionType: text('action_type').notNull(), // 'impersonate_start', 'impersonate_stop', 'role_change', 'user_delete', etc.
+  targetUserId: integer('target_user_id').references(() => users.id),
+  details: text('details'), // JSON with additional context
+  createdAt: integer('created_at', { mode: 'timestamp' }).notNull().$defaultFn(() => new Date()),
+});
+
 // Export all schemas
-export const schema = { users, qsos, awards, awardProgress, syncJobs };
+export const schema = { users, qsos, awards, awardProgress, syncJobs, qsoChanges, adminActions };
@@ -1,7 +1,11 @@
 import { Elysia, t } from 'elysia';
 import { cors } from '@elysiajs/cors';
 import { jwt } from '@elysiajs/jwt';
-import { JWT_SECRET, logger, LOG_LEVEL } from './config.js';
+import { resolve, normalize } from 'path';
+import { existsSync } from 'fs';
+import { JWT_SECRET, logger, LOG_LEVEL, logToFrontend } from './config.js';
 import { getPerformanceSummary, resetPerformanceMetrics } from './services/performance.service.js';
 import { getCacheStats } from './services/cache.service.js';
 import {
   registerUser,
   authenticateUser,
@@ -9,6 +13,17 @@
   updateLoTWCredentials,
   updateDCLCredentials,
 } from './services/auth.service.js';
+import {
+  getSystemStats,
+  getUserStats,
+  impersonateUser,
+  verifyImpersonation,
+  stopImpersonation,
+  getImpersonationStatus,
+  changeUserRole,
+  deleteUser,
+} from './services/admin.service.js';
+import { getAllUsers } from './services/auth.service.js';
 import {
   getUserQSOs,
   getQSOStats,
@@ -20,6 +35,7 @@
   getJobStatus,
   getUserActiveJob,
   getUserJobs,
+  cancelJob,
 } from './services/job-queue.service.js';
 import {
   getAllAwards,
@@ -32,12 +48,116 @@
  * Serves API routes and static frontend files
  */
 
-// Get allowed origins from environment or allow all in development
+// SECURITY: Stricter email validation
+// Elysia's built-in email format check is lenient; this provides better validation
+// Allows: alphanumeric, dots, hyphens, underscores in local part
+// Requires: valid domain with at least one dot
+const EMAIL_REGEX = /^[a-zA-Z0-9._-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$/;
+
+function isValidEmail(email) {
+  // Basic format check with regex
+  if (!EMAIL_REGEX.test(email)) {
+    return false;
+  }
+
+  // Additional checks
+  const [local, domain] = email.split('@');
+
+  // Local part shouldn't start or end with dot
+  if (local.startsWith('.') || local.endsWith('.')) {
+    return false;
+  }
+
+  // Local part shouldn't have consecutive dots
+  if (local.includes('..')) {
+    return false;
+  }
+
+  // Domain must have at least one dot and valid TLD
+  if (!domain || !domain.includes('.') || domain.split('.').pop().length < 2) {
+    return false;
+  }
+
+  // Length checks (RFC 5321 limits)
+  if (email.length > 254 || local.length > 64) {
+    return false;
+  }
+
+  return true;
+}
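The validator added above layers structural checks on top of the regex. As a quick sanity check it can be exercised standalone; this sketch copies the logic from the patch (trimmed comments), with illustrative sample addresses:

```javascript
// Self-contained copy of the email validator from the patch, for experimentation.
const EMAIL_REGEX = /^[a-zA-Z0-9._-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$/;

function isValidEmail(email) {
  if (!EMAIL_REGEX.test(email)) return false;
  const [local, domain] = email.split('@');
  if (local.startsWith('.') || local.endsWith('.')) return false;     // no leading/trailing dot
  if (local.includes('..')) return false;                             // no consecutive dots
  if (!domain || !domain.includes('.') || domain.split('.').pop().length < 2) return false;
  if (email.length > 254 || local.length > 64) return false;          // RFC 5321 limits
  return true;
}

console.log(isValidEmail('dj7nt@example.com'));    // true
console.log(isValidEmail('.leading@example.com')); // false - local part starts with a dot
console.log(isValidEmail('a..b@example.com'));     // false - consecutive dots
console.log(isValidEmail('user@localhost'));       // false - domain has no dot
```

Note that `user@localhost` is rejected by design: the regex requires a dotted domain, which rules out bare hostnames.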
+
+// SECURITY: Validate job ID parameter
+// Prevents injection and DoS via extremely large numbers
+function isValidJobId(jobId) {
+  const id = parseInt(jobId, 10);
+  // Check: must be a number, positive, and within reasonable range (1 to 2^31-1)
+  return !isNaN(id) && id > 0 && id <= 2147483647 && Number.isSafeInteger(id);
+}
+
+// SECURITY: Validate QSO ID parameter
+function isValidQsoId(qsoId) {
+  const id = parseInt(qsoId, 10);
+  return !isNaN(id) && id > 0 && id <= 2147483647 && Number.isSafeInteger(id);
+}
+
+// SECURITY: In-memory rate limiter for auth endpoints
+// Prevents brute force attacks on login/register
+const rateLimitStore = new Map(); // IP -> { count, resetTime }
+
+function checkRateLimit(ip, limit = 5, windowMs = 60000) {
+  const now = Date.now();
+  const record = rateLimitStore.get(ip);
+
+  // Clean up expired records
+  if (record && now > record.resetTime) {
+    rateLimitStore.delete(ip);
+    return { allowed: true, remaining: limit - 1 };
+  }
+
+  // Check if limit exceeded
+  if (record) {
+    if (record.count >= limit) {
+      const retryAfter = Math.ceil((record.resetTime - now) / 1000);
+      return { allowed: false, retryAfter };
+    }
+    record.count++;
+    return { allowed: true, remaining: limit - record.count };
+  }
+
+  // Create new record
+  rateLimitStore.set(ip, {
+    count: 1,
+    resetTime: now + windowMs,
+  });
+
+  return { allowed: true, remaining: limit - 1 };
+}
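The limiter introduced above is a fixed-window counter keyed by client IP. A self-contained sketch of the same logic showing the sixth request in a window being rejected:

```javascript
// Fixed-window rate limiter, same logic as the route guard in the patch.
const rateLimitStore = new Map(); // IP -> { count, resetTime }

function checkRateLimit(ip, limit = 5, windowMs = 60000) {
  const now = Date.now();
  const record = rateLimitStore.get(ip);

  // Expired window: drop the record and allow
  if (record && now > record.resetTime) {
    rateLimitStore.delete(ip);
    return { allowed: true, remaining: limit - 1 };
  }

  if (record) {
    if (record.count >= limit) {
      return { allowed: false, retryAfter: Math.ceil((record.resetTime - now) / 1000) };
    }
    record.count++;
    return { allowed: true, remaining: limit - record.count };
  }

  // First request in a new window
  rateLimitStore.set(ip, { count: 1, resetTime: now + windowMs });
  return { allowed: true, remaining: limit - 1 };
}

for (let i = 1; i <= 5; i++) {
  console.log(i, checkRateLimit('10.0.0.1', 5, 60000).allowed); // true for all five
}
console.log(6, checkRateLimit('10.0.0.1', 5, 60000).allowed); // false - window exhausted
```

One subtlety worth noting: the expired-window branch deletes the record and returns without creating a new one, so the first request after expiry is never counted; the window only restarts on the request after that.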
+
+// SECURITY: Validate that a path doesn't escape the allowed directory
+// Prevents path traversal attacks like ../../../etc/passwd
+const BUILD_DIR = resolve('src/frontend/build');
+
+function isPathSafe(requestedPath) {
+  try {
+    // Normalize the path to resolve any ../ sequences
+    const normalized = normalize(requestedPath);
+    const fullPath = resolve(BUILD_DIR, normalized);
+
+    // Check if the resolved path is within the build directory
+    return fullPath.startsWith(BUILD_DIR);
+  } catch {
+    return false;
+  }
+}
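The traversal guard resolves the requested path against the build directory and checks containment. A standalone sketch of the same logic:

```javascript
import { resolve, normalize, sep } from 'path';

// Containment check as used for static file serving in the patch.
const BUILD_DIR = resolve('src/frontend/build');

function isPathSafe(requestedPath) {
  try {
    // Collapse any ../ sequences, then anchor the result under BUILD_DIR
    const normalized = normalize(requestedPath);
    const fullPath = resolve(BUILD_DIR, normalized);
    return fullPath.startsWith(BUILD_DIR);
  } catch {
    return false;
  }
}

console.log(isPathSafe('index.html'));          // true
console.log(isPathSafe('../../../etc/passwd')); // false - resolves outside BUILD_DIR
console.log(isPathSafe('/etc/passwd'));         // false - absolute path replaces the base
```

A plain `startsWith` prefix check has a known edge case: a sibling directory like `src/frontend/build-evil` would also pass. A stricter variant compares `fullPath === BUILD_DIR || fullPath.startsWith(BUILD_DIR + sep)`.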
|
||||
// Get allowed origins from environment
|
||||
// SECURITY: Never allow all origins, even in development
|
||||
// Always require explicit ALLOWED_ORIGINS or VITE_APP_URL
|
||||
const ALLOWED_ORIGINS = process.env.ALLOWED_ORIGINS
|
||||
? process.env.ALLOWED_ORIGINS.split(',')
|
||||
: process.env.NODE_ENV === 'production'
|
||||
? [process.env.VITE_APP_URL || 'https://awards.dj7nt.de']
|
||||
: true; // Allow all in development
|
||||
: [process.env.VITE_APP_URL || 'http://localhost:5173']; // Default dev origin
|
||||
|
||||
const app = new Elysia()
|
||||
// Enable CORS for frontend communication
|
||||
@@ -67,25 +187,133 @@ const app = new Elysia()
|
||||
return { user: null };
|
||||
}
|
||||
|
||||
// Check if this is an impersonation token
|
||||
const isImpersonation = !!payload.impersonatedBy;
|
||||
|
||||
return {
|
||||
user: {
|
||||
id: payload.userId,
|
||||
email: payload.email,
|
||||
callsign: payload.callsign,
|
||||
isAdmin: payload.isAdmin,
|
||||
impersonatedBy: payload.impersonatedBy, // Admin ID if impersonating
|
||||
},
|
||||
isImpersonation,
|
||||
};
|
||||
} catch (error) {
|
||||
return { user: null };
|
||||
}
|
||||
})
|
||||
|
||||
// Security headers middleware
|
||||
.onAfterHandle(({ set, request }) => {
|
||||
const url = new URL(request.url);
|
||||
const isProduction = process.env.NODE_ENV === 'production';
|
||||
|
||||
// Prevent clickjacking
|
||||
set.headers['X-Frame-Options'] = 'DENY';
|
||||
|
||||
// Prevent MIME type sniffing
|
||||
set.headers['X-Content-Type-Options'] = 'nosniff';
|
||||
|
||||
// Enable XSS filter
|
||||
set.headers['X-XSS-Protection'] = '1; mode=block';
|
||||
|
||||
// Referrer policy
|
||||
set.headers['Referrer-Policy'] = 'strict-origin-when-cross-origin';
|
||||
|
||||
// Content Security Policy (basic protection)
|
||||
// In development, allow any origin; in production, restrict
|
||||
const appUrl = process.env.VITE_APP_URL || 'https://awards.dj7nt.de';
|
||||
const connectSrc = isProduction ? `'self' ${appUrl}` : "'self' *";
|
||||
|
||||
set.headers['Content-Security-Policy'] = [
|
||||
"default-src 'self'",
|
||||
`script-src 'self' 'unsafe-inline'`, // unsafe-inline needed for SvelteKit inline scripts
|
||||
`style-src 'self' 'unsafe-inline'`, // unsafe-inline needed for SvelteKit inline styles
|
||||
`img-src 'self' data: https:`,
|
||||
`connect-src ${connectSrc}`,
|
||||
"font-src 'self'",
|
||||
"object-src 'none'",
|
||||
"base-uri 'self'",
|
||||
"form-action 'self'",
|
||||
].join('; ');
|
||||
|
||||
// Strict Transport Security (HTTPS only) - only in production
|
||||
if (isProduction) {
|
||||
set.headers['Strict-Transport-Security'] = 'max-age=31536000; includeSubDomains';
|
||||
}
|
||||
})
|
||||
|
||||
// Request logging middleware
|
||||
.onRequest(({ request, params }) => {
|
||||
const url = new URL(request.url);
|
||||
const method = request.method;
|
||||
const path = url.pathname;
|
||||
const query = url.search;
|
||||
|
||||
// Skip logging for health checks in development to reduce noise
|
||||
if (path === '/api/health' && process.env.NODE_ENV === 'development') {
|
||||
return;
|
||||
}
|
||||
|
||||
logger.info('Incoming request', {
|
||||
method,
|
||||
path,
|
||||
query: query || undefined,
|
||||
ip: request.headers.get('x-forwarded-for') || 'unknown',
|
||||
userAgent: request.headers.get('user-agent')?.substring(0, 100),
|
||||
});
|
||||
})
|
||||
.onAfterHandle(({ request, set }) => {
|
||||
const url = new URL(request.url);
|
||||
const path = url.pathname;
|
||||
|
||||
// Skip logging for health checks in development
|
||||
if (path === '/api/health' && process.env.NODE_ENV === 'development') {
|
||||
return;
|
||||
}
|
||||
|
||||
// Log error responses
|
||||
if (set.status >= 400) {
|
||||
logger.warn('Request failed', {
|
||||
path,
|
||||
status: set.status,
|
||||
});
|
||||
}
|
||||
})
  /**
   * POST /api/auth/register
   * Register a new user
   */
  .post(
    '/api/auth/register',
    async ({ body, jwt, set }) => {
    async ({ body, jwt, set, request }) => {
      // SECURITY: Rate limiting to prevent abuse
      const ip = request.headers.get('x-forwarded-for') || 'unknown';
      const rateLimit = checkRateLimit(ip, 5, 60000); // 5 requests per minute

      if (!rateLimit.allowed) {
        set.status = 429;
        set.headers['Retry-After'] = rateLimit.retryAfter.toString();
        return {
          success: false,
          error: 'Too many registration attempts. Please try again later.',
        };
      }

      set.headers['X-RateLimit-Remaining'] = rateLimit.remaining.toString();

      // SECURITY: Additional server-side email validation
      if (!isValidEmail(body.email)) {
        set.status = 400;
        return {
          success: false,
          error: 'Invalid email address format',
        };
      }

      // Create user
      const user = await registerUser(body);
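The `checkRateLimit` helper called above is defined elsewhere in the codebase and is not part of this diff. A minimal in-memory sketch that matches the call shape used here (`checkRateLimit(ip, limit, windowMs)` returning `{ allowed, remaining, retryAfter }`) could look like the following; this is an illustration under that assumed contract, not the project's actual implementation:

```javascript
// Hypothetical sketch of checkRateLimit(key, limit, windowMs) as used by
// the register/login handlers. Assumption: a fixed window per key kept in
// a module-level Map; the real helper may use a different strategy.
const rateLimitBuckets = new Map();

function checkRateLimit(key, limit, windowMs) {
  const now = Date.now();
  let bucket = rateLimitBuckets.get(key);

  // Start a fresh window if none exists or the previous one has expired
  if (!bucket || now - bucket.windowStart >= windowMs) {
    bucket = { windowStart: now, count: 0 };
    rateLimitBuckets.set(key, bucket);
  }

  if (bucket.count >= limit) {
    // Seconds until the current window resets (used for the Retry-After header)
    const retryAfter = Math.ceil((bucket.windowStart + windowMs - now) / 1000);
    return { allowed: false, remaining: 0, retryAfter };
  }

  bucket.count += 1;
  return { allowed: true, remaining: limit - bucket.count, retryAfter: 0 };
}
```

Note that a plain in-process Map like this resets on restart and is per-instance; a multi-instance deployment would need shared storage.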
@@ -97,11 +325,13 @@ const app = new Elysia()
        };
      }

      // Generate JWT token
      // Generate JWT token with expiration (24 hours)
      const exp = Math.floor(Date.now() / 1000) + (24 * 60 * 60); // 24 hours from now
      const token = await jwt.sign({
        userId: user.id,
        email: user.email,
        callsign: user.callsign,
        exp,
      });

      set.status = 201;
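The `Math.floor(Date.now() / 1000)` conversion above matters: the JWT `exp` claim is defined in seconds since the epoch (RFC 7519), while `Date.now()` is milliseconds. A small sketch of the arithmetic, with illustrative helper names that are not part of the codebase:

```javascript
// exp must be seconds since the epoch (RFC 7519 NumericDate), hence the
// division by 1000 before adding the lifetime. Helper names are illustrative.
function tokenExpiry(hours) {
  return Math.floor(Date.now() / 1000) + hours * 60 * 60;
}

// A verifier compares the claim against the current time in seconds
function isExpired(exp, nowMs = Date.now()) {
  return Math.floor(nowMs / 1000) >= exp;
}
```

Forgetting the division (using milliseconds) would produce an `exp` roughly a thousand times too large, so tokens would effectively never expire.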
@@ -136,7 +366,22 @@ const app = new Elysia()
   */
  .post(
    '/api/auth/login',
    async ({ body, jwt, set }) => {
    async ({ body, jwt, set, request }) => {
      // SECURITY: Rate limiting to prevent brute force attacks
      const ip = request.headers.get('x-forwarded-for') || 'unknown';
      const rateLimit = checkRateLimit(ip, 10, 60000); // 10 requests per minute (more lenient than register)

      if (!rateLimit.allowed) {
        set.status = 429;
        set.headers['Retry-After'] = rateLimit.retryAfter.toString();
        return {
          success: false,
          error: 'Too many login attempts. Please try again later.',
        };
      }

      set.headers['X-RateLimit-Remaining'] = rateLimit.remaining.toString();

      // Authenticate user
      const user = await authenticateUser(body.email, body.password);
@@ -148,11 +393,14 @@ const app = new Elysia()
        };
      }

      // Generate JWT token
      // Generate JWT token with expiration (24 hours)
      const exp = Math.floor(Date.now() / 1000) + (24 * 60 * 60); // 24 hours from now
      const token = await jwt.sign({
        userId: user.id,
        email: user.email,
        callsign: user.callsign,
        isAdmin: user.isAdmin,
        exp,
      });

      return {
@@ -350,11 +598,12 @@ const app = new Elysia()
    }

    try {
      const jobId = parseInt(params.jobId);
      if (isNaN(jobId)) {
      // SECURITY: Validate job ID to prevent injection and DoS
      if (!isValidJobId(params.jobId)) {
        set.status = 400;
        return { success: false, error: 'Invalid job ID' };
      }
      const jobId = parseInt(params.jobId, 10);

      const job = await getJobStatus(jobId);
      if (!job) {
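The `isValidJobId` guard introduced in this hunk is not defined in the diff. A plausible sketch consistent with how it is called (a string route parameter, validated before `parseInt`) might be a digits-only, bounded-length check; this is an assumption, not the actual helper:

```javascript
// Hypothetical sketch of the isValidJobId guard used above: accept only
// short, purely numeric, positive IDs so parseInt is never fed
// pathological input. Not the project's actual implementation.
function isValidJobId(value) {
  if (typeof value !== 'string') return false;
  // 1-10 digits keeps the parsed value well inside Number.MAX_SAFE_INTEGER
  if (!/^[0-9]{1,10}$/.test(value)) return false;
  return parseInt(value, 10) > 0;
}
```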
@@ -447,6 +696,43 @@ const app = new Elysia()
      }
  })

  /**
   * DELETE /api/jobs/:jobId
   * Cancel and rollback a sync job (requires authentication)
   * Only allows cancelling failed, completed, or stale running jobs (>1 hour)
   */
  .delete('/api/jobs/:jobId', async ({ user, params, set }) => {
    if (!user) {
      set.status = 401;
      return { success: false, error: 'Unauthorized' };
    }

    try {
      // SECURITY: Validate job ID to prevent injection and DoS
      if (!isValidJobId(params.jobId)) {
        set.status = 400;
        return { success: false, error: 'Invalid job ID' };
      }
      const jobId = parseInt(params.jobId, 10);

      const result = await cancelJob(jobId, user.id);

      if (!result.success) {
        set.status = 400;
        return result;
      }

      return result;
    } catch (error) {
      logger.error('Error cancelling job', { error: error.message, userId: user?.id, jobId: params.jobId });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to cancel job',
      };
    }
  })
  /**
   * GET /api/qsos
   * Get user's QSOs (requires authentication)
@@ -501,11 +787,12 @@ const app = new Elysia()
    }

    try {
      const qsoId = parseInt(params.id);
      if (isNaN(qsoId)) {
      // SECURITY: Validate QSO ID to prevent injection and DoS
      if (!isValidQsoId(params.id)) {
        set.status = 400;
        return { success: false, error: 'Invalid QSO ID' };
      }
      const qsoId = parseInt(params.id, 10);

      const qso = await getQSOById(user.id, qsoId);
@@ -704,8 +991,405 @@ const app = new Elysia()
  .get('/api/health', () => ({
    status: 'ok',
    timestamp: new Date().toISOString(),
    uptime: process.uptime(),
    performance: getPerformanceSummary(),
    cache: getCacheStats()
  }))

  /**
   * POST /api/logs
   * Receive frontend logs and write them to frontend.log file
   */
  .post(
    '/api/logs',
    async ({ body, user, headers }) => {
      // Extract context from headers (Elysia provides headers as a plain object)
      const userAgent = headers['user-agent'] || 'unknown';
      const context = {
        userId: user?.id,
        userAgent,
        timestamp: new Date().toISOString(),
      };

      // Log each entry
      const entries = Array.isArray(body) ? body : [body];
      for (const entry of entries) {
        logToFrontend(entry.level || 'info', entry.message || 'No message', entry.data || null, context);
      }

      return { success: true };
    },
    {
      body: t.Union([
        t.Object({
          level: t.Optional(t.Union([t.Literal('debug'), t.Literal('info'), t.Literal('warn'), t.Literal('error')])),
          message: t.String(),
          data: t.Optional(t.Any()),
        }),
        t.Array(
          t.Object({
            level: t.Optional(t.Union([t.Literal('debug'), t.Literal('info'), t.Literal('warn'), t.Literal('error')])),
            message: t.String(),
            data: t.Optional(t.Any()),
          })
        ),
      ]),
    }
  )
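The endpoint above accepts either a single entry or an array, which matches the batching behaviour the project docs describe for the frontend logger (flush at 10 entries or after 5 seconds). The actual `src/frontend/src/lib/logger.js` is not part of this diff; a sketch of a batching client that would satisfy this contract, with the transport injected so it can be tested without a network, might be:

```javascript
// Sketch of a batching client logger for the /api/logs contract (array
// body of { level, message, data }). The 10-entry / 5-second thresholds
// follow the project docs; everything else here is assumed.
function createBatchLogger(send, maxEntries = 10, flushMs = 5000) {
  let queue = [];
  let timer = null;

  function flush() {
    if (timer) { clearTimeout(timer); timer = null; }
    if (queue.length === 0) return;
    const batch = queue;
    queue = [];
    // e.g. fetch('/api/logs', { method: 'POST', body: JSON.stringify(batch), ... })
    send(batch);
  }

  function log(level, message, data = null) {
    queue.push({ level, message, data });
    if (queue.length >= maxEntries) {
      flush();
    } else if (!timer) {
      timer = setTimeout(flush, flushMs);
    }
  }

  return { log, flush };
}
```

Injecting `send` keeps the flush policy separate from the transport, which is also what makes the size/time thresholds easy to unit-test.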
  /**
   * ================================================================
   * ADMIN ROUTES
   * ================================================================
   * All admin routes require authentication and admin role
   */

  /**
   * GET /api/admin/stats
   * Get system-wide statistics (admin only)
   */
  .get('/api/admin/stats', async ({ user, set }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    try {
      const stats = await getSystemStats();
      return {
        success: true,
        stats,
      };
    } catch (error) {
      logger.error('Error fetching system stats', { error: error.message, userId: user.id });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to fetch system statistics',
      };
    }
  })
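The `!user || !user.isAdmin` guard above is repeated verbatim in every admin route that follows. One way it could be factored out — a sketch only, not something this diff introduces — is a helper that mirrors the 401/403 behaviour and returns `null` when the caller may proceed:

```javascript
// Sketch: the admin guard repeated across the handlers, factored into one
// helper. `user` and `set` follow the handler-context shape used in this
// file; the helper itself is not part of the codebase.
function requireAdmin(user, set) {
  if (!user) {
    set.status = 401;
    return { success: false, error: 'Unauthorized' };
  }
  if (!user.isAdmin) {
    set.status = 403;
    return { success: false, error: 'Admin access required' };
  }
  return null; // null means the request may proceed
}
```

A handler would then begin with `const denied = requireAdmin(user, set); if (denied) return denied;`, keeping the status-code logic in one place.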
  /**
   * GET /api/admin/users
   * Get all users with statistics (admin only)
   */
  .get('/api/admin/users', async ({ user, set }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    try {
      const users = await getUserStats();
      return {
        success: true,
        users,
      };
    } catch (error) {
      logger.error('Error fetching users', { error: error.message, userId: user.id });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to fetch users',
      };
    }
  })

  /**
   * GET /api/admin/users/:userId
   * Get detailed information about a specific user (admin only)
   */
  .get('/api/admin/users/:userId', async ({ user, params, set }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    const userId = parseInt(params.userId, 10);
    if (isNaN(userId) || userId <= 0) {
      set.status = 400;
      return { success: false, error: 'Invalid user ID' };
    }

    try {
      const targetUser = await getAllUsers();
      const userDetails = targetUser.find(u => u.id === userId);

      if (!userDetails) {
        set.status = 404;
        return { success: false, error: 'User not found' };
      }

      return {
        success: true,
        user: userDetails,
      };
    } catch (error) {
      logger.error('Error fetching user details', { error: error.message, userId: user.id });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to fetch user details',
      };
    }
  })
  /**
   * POST /api/admin/users/:userId/role
   * Update user admin status (admin only)
   */
  .post('/api/admin/users/:userId/role', async ({ user, params, body, set }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    const targetUserId = parseInt(params.userId, 10);
    if (isNaN(targetUserId) || targetUserId <= 0) {
      set.status = 400;
      return { success: false, error: 'Invalid user ID' };
    }

    const { isAdmin } = body;

    if (typeof isAdmin !== 'boolean') {
      set.status = 400;
      return { success: false, error: 'isAdmin (boolean) is required' };
    }

    try {
      await changeUserRole(user.id, targetUserId, isAdmin);
      return {
        success: true,
        message: 'User admin status updated successfully',
      };
    } catch (error) {
      logger.error('Error updating user admin status', { error: error.message, userId: user.id });
      set.status = 400;
      return {
        success: false,
        error: error.message,
      };
    }
  })
  /**
   * DELETE /api/admin/users/:userId
   * Delete a user (admin only)
   */
  .delete('/api/admin/users/:userId', async ({ user, params, set }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    const targetUserId = parseInt(params.userId, 10);
    if (isNaN(targetUserId) || targetUserId <= 0) {
      set.status = 400;
      return { success: false, error: 'Invalid user ID' };
    }

    try {
      await deleteUser(user.id, targetUserId);
      return {
        success: true,
        message: 'User deleted successfully',
      };
    } catch (error) {
      logger.error('Error deleting user', { error: error.message, userId: user.id });
      set.status = 400;
      return {
        success: false,
        error: error.message,
      };
    }
  })
  /**
   * POST /api/admin/impersonate/:userId
   * Start impersonating a user (admin only)
   */
  .post('/api/admin/impersonate/:userId', async ({ user, params, jwt, set }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    const targetUserId = parseInt(params.userId, 10);
    if (isNaN(targetUserId) || targetUserId <= 0) {
      set.status = 400;
      return { success: false, error: 'Invalid user ID' };
    }

    try {
      const targetUser = await impersonateUser(user.id, targetUserId);

      // Generate impersonation token with shorter expiration (1 hour)
      const exp = Math.floor(Date.now() / 1000) + (60 * 60); // 1 hour from now
      const token = await jwt.sign({
        userId: targetUser.id,
        email: targetUser.email,
        callsign: targetUser.callsign,
        isAdmin: targetUser.isAdmin,
        impersonatedBy: user.id, // Admin ID who started impersonation
        exp,
      });

      return {
        success: true,
        token,
        impersonating: {
          userId: targetUser.id,
          email: targetUser.email,
          callsign: targetUser.callsign,
        },
        message: `Impersonating ${targetUser.email}`,
      };
    } catch (error) {
      logger.error('Error starting impersonation', { error: error.message, userId: user.id });
      set.status = 400;
      return {
        success: false,
        error: error.message,
      };
    }
  })
  /**
   * POST /api/admin/impersonate/stop
   * Stop impersonating and return to admin account (admin only)
   */
  .post('/api/admin/impersonate/stop', async ({ user, jwt, body, set }) => {
    if (!user || !user.impersonatedBy) {
      set.status = 400;
      return {
        success: false,
        error: 'Not currently impersonating a user',
      };
    }

    try {
      // Log impersonation stop
      await stopImpersonation(user.impersonatedBy, user.id);

      // Get admin user details to generate new token
      const adminUsers = await getAllUsers();
      const adminUser = adminUsers.find(u => u.id === user.impersonatedBy);

      if (!adminUser) {
        set.status = 500;
        return {
          success: false,
          error: 'Admin account not found',
        };
      }

      // Generate new admin token (24 hours)
      const exp = Math.floor(Date.now() / 1000) + (24 * 60 * 60);
      const token = await jwt.sign({
        userId: adminUser.id,
        email: adminUser.email,
        callsign: adminUser.callsign,
        isAdmin: adminUser.isAdmin,
        exp,
      });

      return {
        success: true,
        token,
        user: adminUser,
        message: 'Impersonation stopped. Returned to admin account.',
      };
    } catch (error) {
      logger.error('Error stopping impersonation', { error: error.message });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to stop impersonation',
      };
    }
  })
  /**
   * GET /api/admin/impersonation/status
   * Get current impersonation status
   */
  .get('/api/admin/impersonation/status', async ({ user }) => {
    if (!user) {
      return {
        success: true,
        impersonating: false,
      };
    }

    const isImpersonating = !!user.impersonatedBy;

    return {
      success: true,
      impersonating: isImpersonating,
      impersonatedBy: user.impersonatedBy,
    };
  })
  /**
   * GET /api/admin/actions
   * Get admin actions log (admin only)
   */
  .get('/api/admin/actions', async ({ user, set, query }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    const limit = parseInt(query.limit || '50', 10);
    const offset = parseInt(query.offset || '0', 10);

    try {
      const actions = await getAdminActions(null, { limit, offset });
      return {
        success: true,
        actions,
      };
    } catch (error) {
      logger.error('Error fetching admin actions', { error: error.message, userId: user.id });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to fetch admin actions',
      };
    }
  })
  /**
   * GET /api/admin/actions/my
   * Get current admin's action log (admin only)
   */
  .get('/api/admin/actions/my', async ({ user, set, query }) => {
    if (!user || !user.isAdmin) {
      set.status = !user ? 401 : 403;
      return { success: false, error: !user ? 'Unauthorized' : 'Admin access required' };
    }

    const limit = parseInt(query.limit || '50', 10);
    const offset = parseInt(query.offset || '0', 10);

    try {
      const actions = await getAdminActions(user.id, { limit, offset });
      return {
        success: true,
        actions,
      };
    } catch (error) {
      logger.error('Error fetching admin actions', { error: error.message, userId: user.id });
      set.status = 500;
      return {
        success: false,
        error: 'Failed to fetch admin actions',
      };
    }
  })
  // Serve static files and SPA fallback for all non-API routes
  .get('/*', ({ request }) => {
    const url = new URL(request.url);
@@ -728,28 +1412,35 @@ const app = new Elysia()
      // Extract the actual file path after %sveltekit.assets%/
      const assetPath = pathname.replace('/%sveltekit.assets%/', '');

      try {
        // Try to serve from assets directory first
        const assetsPath = `src/frontend/build/_app/immutable/assets/${assetPath}`;
        const file = Bun.file(assetsPath);
        const exists = file.exists();
      // SECURITY: Validate path before serving
      const assetsDirPath = `_app/immutable/assets/${assetPath}`;
      const rootPath = assetPath;

        if (exists) {
          return new Response(file);
      if (isPathSafe(assetsDirPath)) {
        try {
          const assetsPath = `src/frontend/build/${assetsDirPath}`;
          const file = Bun.file(assetsPath);
          const exists = file.exists();

          if (exists) {
            return new Response(file);
          }
        } catch (err) {
          // Fall through to 404
        }
      } catch (err) {
        // Fall through to 404
      }

      // If not in assets, try root directory
      try {
        const rootFile = Bun.file(`src/frontend/build/${assetPath}`);
        const rootExists = rootFile.exists();
        if (rootExists) {
          return new Response(rootFile);
      // If not in assets, try root directory (with path validation)
      if (isPathSafe(rootPath)) {
        try {
          const rootFile = Bun.file(`src/frontend/build/${rootPath}`);
          const rootExists = rootFile.exists();
          if (rootExists) {
            return new Response(rootFile);
          }
        } catch (err) {
          // Fall through to 404
        }
      } catch (err) {
        // Fall through to 404
      }

      return new Response('Not found', { status: 404 });
@@ -759,6 +1450,12 @@ const app = new Elysia()
      // Remove leading slash for file path
      const filePath = pathname === '/' ? '/index.html' : pathname;

      // SECURITY: Validate path is within build directory before serving
      if (!isPathSafe(filePath.substring(1))) { // Remove leading slash for relative path
        logger.warn('Path traversal attempt blocked', { pathname });
        return new Response('Not found', { status: 404 });
      }

      try {
        const fullPath = `src/frontend/build${filePath}`;
103 src/backend/migrations/add-admin-functionality.js (Normal file)
@@ -0,0 +1,103 @@
/**
 * Migration: Add admin functionality to users table and create admin_actions table
 *
 * This script adds role-based access control (RBAC) for admin functionality:
 * - Adds 'role' and 'is_admin' columns to the users table
 * - Creates admin_actions table for audit logging
 * - Adds indexes for performance
 */

import Database from 'bun:sqlite';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';

// ES module equivalent of __dirname
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

const dbPath = join(__dirname, '../award.db');
const sqlite = new Database(dbPath);

async function migrate() {
  console.log('Starting migration: Add admin functionality...');

  try {
    // Check if role column already exists in users table
    const columnExists = sqlite.query(`
      SELECT COUNT(*) as count
      FROM pragma_table_info('users')
      WHERE name = 'role'
    `).get();

    if (columnExists.count > 0) {
      console.log('Admin columns already exist in users table. Skipping...');
    } else {
      // Add role column to users table
      sqlite.exec(`
        ALTER TABLE users
        ADD COLUMN role TEXT NOT NULL DEFAULT 'user'
      `);

      // Add is_admin column to users table
      sqlite.exec(`
        ALTER TABLE users
        ADD COLUMN is_admin INTEGER NOT NULL DEFAULT 0
      `);

      console.log('Added role and is_admin columns to users table');
    }

    // Check if admin_actions table already exists
    const tableExists = sqlite.query(`
      SELECT name FROM sqlite_master
      WHERE type='table' AND name='admin_actions'
    `).get();

    if (tableExists) {
      console.log('Table admin_actions already exists. Skipping...');
    } else {
      // Create admin_actions table
      sqlite.exec(`
        CREATE TABLE admin_actions (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          admin_id INTEGER NOT NULL,
          action_type TEXT NOT NULL,
          target_user_id INTEGER,
          details TEXT,
          created_at INTEGER NOT NULL DEFAULT (strftime('%s', 'now') * 1000),
          FOREIGN KEY (admin_id) REFERENCES users(id) ON DELETE CASCADE,
          FOREIGN KEY (target_user_id) REFERENCES users(id) ON DELETE SET NULL
        )
      `);

      // Create indexes for admin_actions
      sqlite.exec(`
        CREATE INDEX idx_admin_actions_admin_id ON admin_actions(admin_id)
      `);

      sqlite.exec(`
        CREATE INDEX idx_admin_actions_action_type ON admin_actions(action_type)
      `);

      sqlite.exec(`
        CREATE INDEX idx_admin_actions_created_at ON admin_actions(created_at)
      `);

      console.log('Created admin_actions table with indexes');
    }

    console.log('Migration complete! Admin functionality added to database.');
  } catch (error) {
    console.error('Migration failed:', error);
    sqlite.close();
    process.exit(1);
  }

  sqlite.close();
}

// Run migration
migrate().then(() => {
  console.log('Migration script completed successfully');
  process.exit(0);
});
@@ -2,10 +2,11 @@
 * Migration: Add performance indexes for QSO queries
 *
 * This script creates database indexes to significantly improve query performance
 * for filtering, sorting, and sync operations. Expected impact:
 * for filtering, sorting, sync operations, and QSO statistics. Expected impact:
 * - 80% faster filter queries
 * - 60% faster sync operations
 * - 50% faster award calculations
 * - 95% faster QSO statistics queries (critical optimization)
 */

import Database from 'bun:sqlite';
@@ -49,9 +50,21 @@ async function migrate() {
  console.log('Creating index: idx_qsos_qso_date');
  sqlite.exec(`CREATE INDEX IF NOT EXISTS idx_qsos_qso_date ON qsos(user_id, qso_date DESC)`);

  // Index 8: QSO Statistics - Primary user filter (CRITICAL for getQSOStats)
  console.log('Creating index: idx_qsos_user_primary');
  sqlite.exec(`CREATE INDEX IF NOT EXISTS idx_qsos_user_primary ON qsos(user_id)`);

  // Index 9: QSO Statistics - Unique counts (entity, band, mode)
  console.log('Creating index: idx_qsos_user_unique_counts');
  sqlite.exec(`CREATE INDEX IF NOT EXISTS idx_qsos_user_unique_counts ON qsos(user_id, entity, band, mode)`);

  // Index 10: QSO Statistics - Optimized confirmation counting
  console.log('Creating index: idx_qsos_stats_confirmation');
  sqlite.exec(`CREATE INDEX IF NOT EXISTS idx_qsos_stats_confirmation ON qsos(user_id, lotw_qsl_rstatus, dcl_qsl_rstatus)`);

  sqlite.close();

  console.log('\nMigration complete! Created 7 performance indexes.');
  console.log('\nMigration complete! Created 10 performance indexes.');
  console.log('\nTo verify indexes were created, run:');
  console.log('  sqlite3 award.db ".indexes qsos"');
74 src/backend/migrations/add-qso-changes-table.js (Normal file)
@@ -0,0 +1,74 @@
/**
 * Migration: Add qso_changes table for sync job rollback
 *
 * This script adds the qso_changes table which tracks all QSO modifications
 * made by sync jobs, enabling rollback functionality for failed or stale jobs.
 */

import Database from 'bun:sqlite';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';

// ES module equivalent of __dirname
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

const dbPath = join(__dirname, '../award.db');
const sqlite = new Database(dbPath);

async function migrate() {
  console.log('Starting migration: Add qso_changes table...');

  try {
    // Check if table already exists
    const tableExists = sqlite.query(`
      SELECT name FROM sqlite_master
      WHERE type='table' AND name='qso_changes'
    `).get();

    if (tableExists) {
      console.log('Table qso_changes already exists. Migration complete.');
      sqlite.close();
      return;
    }

    // Create qso_changes table
    sqlite.exec(`
      CREATE TABLE qso_changes (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        job_id INTEGER NOT NULL,
        qso_id INTEGER,
        change_type TEXT NOT NULL,
        before_data TEXT,
        after_data TEXT,
        created_at INTEGER NOT NULL DEFAULT (strftime('%s', 'now') * 1000),
        FOREIGN KEY (job_id) REFERENCES sync_jobs(id) ON DELETE CASCADE,
        FOREIGN KEY (qso_id) REFERENCES qsos(id) ON DELETE CASCADE
      )
    `);

    // Create index for faster lookups during rollback
    sqlite.exec(`
      CREATE INDEX idx_qso_changes_job_id ON qso_changes(job_id)
    `);

    // Create index for change_type lookups
    sqlite.exec(`
      CREATE INDEX idx_qso_changes_change_type ON qso_changes(change_type)
    `);

    console.log('Migration complete! Created qso_changes table with indexes.');
  } catch (error) {
    console.error('Migration failed:', error);
    sqlite.close();
    process.exit(1);
  }

  sqlite.close();
}

// Run migration
migrate().then(() => {
  console.log('Migration script completed successfully');
  process.exit(0);
});
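The qso_changes rows (change_type plus before/after images) are what make the DELETE /api/jobs/:jobId rollback possible. The actual rollback lives in `cancelJob`, which is not part of this diff; as a sketch of how such rows could drive it, each change maps to its inverse, applied newest-first. The `'insert'`/`'update'`/`'delete'` change_type values and operation names below are assumptions for illustration:

```javascript
// Sketch: deriving a rollback plan from qso_changes rows. Illustrative
// only - the real rollback logic in cancelJob is not shown in this diff,
// and the change_type vocabulary here is assumed.
function invertChange(change) {
  switch (change.change_type) {
    case 'insert':
      // A row the job inserted is removed again
      return { op: 'delete', qsoId: change.qso_id };
    case 'update':
      // A row the job updated is restored from its before image
      return { op: 'restore', qsoId: change.qso_id, data: JSON.parse(change.before_data) };
    case 'delete':
      // A row the job deleted is re-inserted from its before image
      return { op: 'insert', data: JSON.parse(change.before_data) };
    default:
      throw new Error(`Unknown change_type: ${change.change_type}`);
  }
}

// Replay the job's changes newest-first so later edits are undone before
// the rows they depended on
function rollbackPlan(changes) {
  return [...changes].sort((a, b) => b.created_at - a.created_at).map(invertChange);
}
```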
251 src/backend/scripts/admin-cli.js (Normal file)
@@ -0,0 +1,251 @@
#!/usr/bin/env bun
/**
 * Admin CLI Tool
 *
 * Usage:
 *   bun src/backend/scripts/admin-cli.js create <email> <password> <callsign>
 *   bun src/backend/scripts/admin-cli.js promote <email>
 *   bun src/backend/scripts/admin-cli.js demote <email>
 *   bun src/backend/scripts/admin-cli.js list
 *   bun src/backend/scripts/admin-cli.js check <email>
 *   bun src/backend/scripts/admin-cli.js help
 */

import Database from 'bun:sqlite';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';

// ES module equivalent of __dirname
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

const dbPath = join(__dirname, '../award.db');
const sqlite = new Database(dbPath);

// Enable foreign keys
sqlite.exec('PRAGMA foreign_keys = ON');

function help() {
  console.log(`
Admin CLI Tool - Manage admin users

Commands:
  create <email> <password> <callsign>   Create a new admin user
  promote <email>                        Promote existing user to admin
  demote <email>                         Demote admin to regular user
  list                                   List all admin users
  check <email>                          Check if user is admin
  help                                   Show this help message

Examples:
  bun src/backend/scripts/admin-cli.js create admin@example.com secretPassword ADMIN
  bun src/backend/scripts/admin-cli.js promote user@example.com
  bun src/backend/scripts/admin-cli.js list
  bun src/backend/scripts/admin-cli.js check user@example.com
`);
}

function createAdminUser(email, password, callsign) {
  console.log(`Creating admin user: ${email}`);

  // Check if user already exists
  const existingUser = sqlite.query(`
    SELECT id, email FROM users WHERE email = ?
  `).get(email);

  if (existingUser) {
    console.error(`Error: User with email ${email} already exists`);
    process.exit(1);
  }

  // Hash password
  const passwordHash = Bun.password.hashSync(password, {
    algorithm: 'bcrypt',
    cost: 10,
  });

  // Ensure passwordHash is a string
  const hashString = String(passwordHash);

  // Insert admin user
  const result = sqlite.query(`
    INSERT INTO users (email, password_hash, callsign, is_admin, created_at, updated_at)
    VALUES (?, ?, ?, 1, strftime('%s', 'now') * 1000, strftime('%s', 'now') * 1000)
  `).run(email, hashString, callsign);

  console.log(`✓ Admin user created successfully!`);
  console.log(`  ID: ${result.lastInsertRowid}`);
  console.log(`  Email: ${email}`);
  console.log(`  Callsign: ${callsign}`);
  console.log(`\nYou can now log in with these credentials.`);
}

function promoteUser(email) {
  console.log(`Promoting user to admin: ${email}`);

  // Check if user exists
  const user = sqlite.query(`
    SELECT id, email, is_admin FROM users WHERE email = ?
  `).get(email);

  if (!user) {
    console.error(`Error: User with email ${email} not found`);
    process.exit(1);
  }

  if (user.is_admin === 1) {
    console.log(`User ${email} is already an admin`);
    return;
  }

  // Update user to admin
  sqlite.query(`
    UPDATE users
    SET is_admin = 1, updated_at = strftime('%s', 'now') * 1000
    WHERE email = ?
  `).run(email);

  console.log(`✓ User ${email} has been promoted to admin`);
}

function demoteUser(email) {
  console.log(`Demoting admin to regular user: ${email}`);

  // Check if user exists
  const user = sqlite.query(`
    SELECT id, email, is_admin FROM users WHERE email = ?
  `).get(email);

  if (!user) {
    console.error(`Error: User with email ${email} not found`);
    process.exit(1);
  }

  if (user.is_admin !== 1) {
    console.log(`User ${email} is not an admin`);
    return;
  }

  // Check if this is the last admin
  const adminCount = sqlite.query(`
    SELECT COUNT(*) as count FROM users WHERE is_admin = 1
  `).get();

  if (adminCount.count === 1) {
    console.error(`Error: Cannot demote the last admin user. At least one admin must exist.`);
    process.exit(1);
  }

  // Update user to regular user
  sqlite.query(`
    UPDATE users
    SET is_admin = 0, updated_at = strftime('%s', 'now') * 1000
    WHERE email = ?
  `).run(email);

  console.log(`✓ User ${email} has been demoted to regular user`);
}

function listAdmins() {
  console.log('Listing all admin users...\n');

  const admins = sqlite.query(`
    SELECT id, email, callsign, created_at
    FROM users
    WHERE is_admin = 1
    ORDER BY created_at ASC
  `).all();

  if (admins.length === 0) {
    console.log('No admin users found');
    return;
  }

  console.log(`Found ${admins.length} admin user(s):\n`);
  console.log('ID  | Email                      | Callsign | Created At');
  console.log('----+----------------------------+----------+---------------------');

  admins.forEach((admin) => {
    const createdAt = new Date(admin.created_at).toLocaleString();
    console.log(`${String(admin.id).padEnd(3)} | ${admin.email.padEnd(26)} | ${admin.callsign.padEnd(8)} | ${createdAt}`);
  });
}

function checkUser(email) {
  console.log(`Checking user status: ${email}\n`);

  const user = sqlite.query(`
    SELECT id, email, callsign, is_admin FROM users WHERE email = ?
  `).get(email);

  if (!user) {
    console.log(`User not found: ${email}`);
    process.exit(1);
  }

  const isAdmin = user.is_admin === 1;

  console.log(`User found:`);
  console.log(`  Email: ${user.email}`);
  console.log(`  Callsign: ${user.callsign}`);
  console.log(`  Is Admin: ${isAdmin ? 'Yes ✓' : 'No'}`);
}

// Main CLI logic
const command = process.argv[2];
const args = process.argv.slice(3);

switch (command) {
  case 'create':
    if (args.length !== 3) {
      console.error('Error: create command requires 3 arguments: <email> <password> <callsign>');
      help();
      process.exit(1);
    }
    createAdminUser(args[0], args[1], args[2]);
    break;

  case 'promote':
    if (args.length !== 1) {
      console.error('Error: promote command requires 1 argument: <email>');
      help();
      process.exit(1);
    }
    promoteUser(args[0]);
    break;

  case 'demote':
    if (args.length !== 1) {
      console.error('Error: demote command requires 1 argument: <email>');
|
||||
help();
|
||||
process.exit(1);
|
||||
}
|
||||
demoteUser(args[0]);
|
||||
break;
|
||||
|
||||
case 'list':
|
||||
listAdmins();
|
||||
break;
|
||||
|
||||
case 'check':
|
||||
if (args.length !== 1) {
|
||||
console.error('Error: check command requires 1 argument: <email>');
|
||||
help();
|
||||
process.exit(1);
|
||||
}
|
||||
checkUser(args[0]);
|
||||
break;
|
||||
|
||||
case 'help':
|
||||
case '--help':
|
||||
case '-h':
|
||||
help();
|
||||
break;
|
||||
|
||||
default:
|
||||
console.error(`Error: Unknown command '${command}'`);
|
||||
help();
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
sqlite.close();
|
||||
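The `switch` above repeats the same argument-count check per command. That validate-then-dispatch pattern can be factored into a small table-driven dispatcher; a standalone sketch (names and handlers are hypothetical, not this repo's code):

```javascript
// Hypothetical sketch of the CLI dispatch pattern used above:
// each command declares its expected argument count, and the
// dispatcher validates before invoking the handler.
const commands = {
  promote: { argc: 1, run: ([email]) => `promote ${email}` },
  demote:  { argc: 1, run: ([email]) => `demote ${email}` },
  list:    { argc: 0, run: () => 'list' },
};

function dispatch(command, args) {
  const spec = commands[command];
  if (!spec) {
    return { ok: false, error: `Unknown command '${command}'` };
  }
  if (args.length !== spec.argc) {
    return { ok: false, error: `${command} requires ${spec.argc} argument(s)` };
  }
  return { ok: true, result: spec.run(args) };
}
```

The table keeps usage errors uniform and makes adding a command a one-line change.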
src/backend/services/admin.service.js (new file, 387 lines)
@@ -0,0 +1,387 @@
import { eq, sql, desc, and } from 'drizzle-orm';
import { db, sqlite, logger } from '../config.js';
import { users, qsos, syncJobs, adminActions, awardProgress, qsoChanges } from '../db/schema/index.js';
import { getUserByIdFull, isAdmin } from './auth.service.js';

/**
 * Log an admin action for audit trail
 * @param {number} adminId - Admin user ID
 * @param {string} actionType - Type of action (e.g., 'impersonate_start', 'role_change')
 * @param {number|null} targetUserId - Target user ID (if applicable)
 * @param {Object} details - Additional details (will be JSON stringified)
 * @returns {Promise<Object>} Created admin action record
 */
export async function logAdminAction(adminId, actionType, targetUserId = null, details = {}) {
  const [action] = await db
    .insert(adminActions)
    .values({
      adminId,
      actionType,
      targetUserId,
      details: JSON.stringify(details),
    })
    .returning();

  return action;
}

/**
 * Get admin actions log
 * @param {number} adminId - Admin user ID (optional, if null returns all actions)
 * @param {Object} options - Query options
 * @param {number} options.limit - Number of records to return
 * @param {number} options.offset - Number of records to skip
 * @returns {Promise<Array>} Array of admin actions
 */
export async function getAdminActions(adminId = null, { limit = 50, offset = 0 } = {}) {
  let query = db
    .select({
      id: adminActions.id,
      adminId: adminActions.adminId,
      adminEmail: users.email,
      adminCallsign: users.callsign,
      actionType: adminActions.actionType,
      targetUserId: adminActions.targetUserId,
      targetEmail: sql`target_users.email`.as('targetEmail'),
      targetCallsign: sql`target_users.callsign`.as('targetCallsign'),
      details: adminActions.details,
      createdAt: adminActions.createdAt,
    })
    .from(adminActions)
    .leftJoin(users, eq(adminActions.adminId, users.id))
    .leftJoin(sql`${users} as target_users`, eq(adminActions.targetUserId, sql.raw('target_users.id')))
    .orderBy(desc(adminActions.createdAt))
    .limit(limit)
    .offset(offset);

  if (adminId) {
    query = query.where(eq(adminActions.adminId, adminId));
  }

  return await query;
}

/**
 * Get system-wide statistics
 * @returns {Promise<Object>} System statistics
 */
export async function getSystemStats() {
  const [
    userStats,
    qsoStats,
    syncJobStats,
    adminStats,
  ] = await Promise.all([
    // User statistics
    db.select({
      totalUsers: sql`CAST(COUNT(*) AS INTEGER)`,
      adminUsers: sql`CAST(SUM(CASE WHEN is_admin = 1 THEN 1 ELSE 0 END) AS INTEGER)`,
      regularUsers: sql`CAST(SUM(CASE WHEN is_admin = 0 THEN 1 ELSE 0 END) AS INTEGER)`,
    }).from(users),

    // QSO statistics
    db.select({
      totalQSOs: sql`CAST(COUNT(*) AS INTEGER)`,
      uniqueCallsigns: sql`CAST(COUNT(DISTINCT callsign) AS INTEGER)`,
      uniqueEntities: sql`CAST(COUNT(DISTINCT entity_id) AS INTEGER)`,
      lotwConfirmed: sql`CAST(SUM(CASE WHEN lotw_qsl_rstatus = 'Y' THEN 1 ELSE 0 END) AS INTEGER)`,
      dclConfirmed: sql`CAST(SUM(CASE WHEN dcl_qsl_rstatus = 'Y' THEN 1 ELSE 0 END) AS INTEGER)`,
    }).from(qsos),

    // Sync job statistics
    db.select({
      totalJobs: sql`CAST(COUNT(*) AS INTEGER)`,
      lotwJobs: sql`CAST(SUM(CASE WHEN type = 'lotw_sync' THEN 1 ELSE 0 END) AS INTEGER)`,
      dclJobs: sql`CAST(SUM(CASE WHEN type = 'dcl_sync' THEN 1 ELSE 0 END) AS INTEGER)`,
      completedJobs: sql`CAST(SUM(CASE WHEN status = 'completed' THEN 1 ELSE 0 END) AS INTEGER)`,
      failedJobs: sql`CAST(SUM(CASE WHEN status = 'failed' THEN 1 ELSE 0 END) AS INTEGER)`,
    }).from(syncJobs),

    // Admin action statistics
    db.select({
      totalAdminActions: sql`CAST(COUNT(*) AS INTEGER)`,
      impersonations: sql`CAST(SUM(CASE WHEN action_type LIKE 'impersonate%' THEN 1 ELSE 0 END) AS INTEGER)`,
    }).from(adminActions),
  ]);

  return {
    users: userStats[0],
    qsos: qsoStats[0],
    syncJobs: syncJobStats[0],
    adminActions: adminStats[0],
  };
}

/**
 * Get per-user statistics (for admin overview)
 * @returns {Promise<Array>} Array of user statistics
 */
export async function getUserStats() {
  const stats = await db
    .select({
      id: users.id,
      email: users.email,
      callsign: users.callsign,
      isAdmin: users.isAdmin,
      qsoCount: sql`CAST(COUNT(${qsos.id}) AS INTEGER)`,
      lotwConfirmed: sql`CAST(SUM(CASE WHEN ${qsos.lotwQslRstatus} = 'Y' THEN 1 ELSE 0 END) AS INTEGER)`,
      dclConfirmed: sql`CAST(SUM(CASE WHEN ${qsos.dclQslRstatus} = 'Y' THEN 1 ELSE 0 END) AS INTEGER)`,
      totalConfirmed: sql`CAST(SUM(CASE WHEN ${qsos.lotwQslRstatus} = 'Y' OR ${qsos.dclQslRstatus} = 'Y' THEN 1 ELSE 0 END) AS INTEGER)`,
      lastSync: sql`MAX(${qsos.createdAt})`,
      createdAt: users.createdAt,
    })
    .from(users)
    .leftJoin(qsos, eq(users.id, qsos.userId))
    .groupBy(users.id)
    .orderBy(sql`COUNT(${qsos.id}) DESC`);

  return stats;
}

/**
 * Impersonate a user
 * @param {number} adminId - Admin user ID
 * @param {number} targetUserId - Target user ID to impersonate
 * @returns {Promise<Object>} Target user object
 * @throws {Error} If not admin or trying to impersonate another admin
 */
export async function impersonateUser(adminId, targetUserId) {
  // Verify the requester is an admin
  const requesterIsAdmin = await isAdmin(adminId);
  if (!requesterIsAdmin) {
    throw new Error('Only admins can impersonate users');
  }

  // Get target user
  const targetUser = await getUserByIdFull(targetUserId);
  if (!targetUser) {
    throw new Error('Target user not found');
  }

  // Check if target is also an admin (prevent admin impersonation)
  if (targetUser.isAdmin) {
    throw new Error('Cannot impersonate another admin user');
  }

  // Log impersonation action
  await logAdminAction(adminId, 'impersonate_start', targetUserId, {
    targetEmail: targetUser.email,
    targetCallsign: targetUser.callsign,
  });

  return targetUser;
}

/**
 * Verify impersonation token is valid
 * @param {Object} impersonationToken - JWT token payload containing impersonation data
 * @returns {Promise<Object>} Verification result with target user data
 */
export async function verifyImpersonation(impersonationToken) {
  const { adminId, targetUserId, exp } = impersonationToken;

  // Check if token is expired
  if (Date.now() > exp * 1000) {
    throw new Error('Impersonation token has expired');
  }

  // Verify admin still exists and is admin
  const adminUser = await getUserByIdFull(adminId);
  if (!adminUser || !adminUser.isAdmin) {
    throw new Error('Invalid impersonation: Admin no longer exists or is not admin');
  }

  // Get target user
  const targetUser = await getUserByIdFull(targetUserId);
  if (!targetUser) {
    throw new Error('Target user not found');
  }

  // Return target user with admin metadata for frontend display
  return {
    ...targetUser,
    impersonating: {
      adminId,
      adminEmail: adminUser.email,
      adminCallsign: adminUser.callsign,
    },
  };
}

/**
 * Stop impersonating a user
 * @param {number} adminId - Admin user ID
 * @param {number} targetUserId - Target user ID being impersonated
 * @returns {Promise<void>}
 */
export async function stopImpersonation(adminId, targetUserId) {
  await logAdminAction(adminId, 'impersonate_stop', targetUserId, {
    message: 'Impersonation session ended',
  });
}

/**
 * Get impersonation status for an admin
 * @param {number} adminId - Admin user ID
 * @param {Object} options - Query options
 * @param {number} options.limit - Number of recent impersonations to return
 * @returns {Promise<Array>} Array of recent impersonation actions
 */
export async function getImpersonationStatus(adminId, { limit = 10 } = {}) {
  const impersonations = await db
    .select({
      id: adminActions.id,
      actionType: adminActions.actionType,
      targetUserId: adminActions.targetUserId,
      targetEmail: sql`target_users.email`,
      targetCallsign: sql`target_users.callsign`,
      details: adminActions.details,
      createdAt: adminActions.createdAt,
    })
    .from(adminActions)
    .leftJoin(sql`${users} as target_users`, eq(adminActions.targetUserId, sql.raw('target_users.id')))
    // Combine both conditions; chaining .where() twice would discard the first filter
    .where(and(
      eq(adminActions.adminId, adminId),
      sql`${adminActions.actionType} LIKE 'impersonate%'`
    ))
    .orderBy(desc(adminActions.createdAt))
    .limit(limit);

  return impersonations;
}

/**
 * Update user admin status (admin operation)
 * @param {number} adminId - Admin user ID making the change
 * @param {number} targetUserId - User ID to update
 * @param {boolean} newIsAdmin - New admin flag
 * @returns {Promise<void>}
 * @throws {Error} If not admin or would remove last admin
 */
export async function changeUserRole(adminId, targetUserId, newIsAdmin) {
  // Verify the requester is an admin
  const requesterIsAdmin = await isAdmin(adminId);
  if (!requesterIsAdmin) {
    throw new Error('Only admins can change user admin status');
  }

  // Get target user
  const targetUser = await getUserByIdFull(targetUserId);
  if (!targetUser) {
    throw new Error('Target user not found');
  }

  // If demoting from admin, check if this would remove the last admin
  if (targetUser.isAdmin && !newIsAdmin) {
    const adminCount = await db
      .select({ count: sql`CAST(COUNT(*) AS INTEGER)` })
      .from(users)
      .where(eq(users.isAdmin, 1));

    if (adminCount[0].count === 1) {
      throw new Error('Cannot demote the last admin user');
    }
  }

  // Update admin status
  await db
    .update(users)
    .set({
      isAdmin: newIsAdmin ? 1 : 0,
      updatedAt: new Date(),
    })
    .where(eq(users.id, targetUserId));

  // Log action
  await logAdminAction(adminId, 'role_change', targetUserId, {
    oldIsAdmin: targetUser.isAdmin,
    newIsAdmin: newIsAdmin,
  });
}

/**
 * Delete user (admin operation)
 * @param {number} adminId - Admin user ID making the change
 * @param {number} targetUserId - User ID to delete
 * @returns {Promise<void>}
 * @throws {Error} If not admin, trying to delete self, or trying to delete another admin
 */
export async function deleteUser(adminId, targetUserId) {
  // Verify the requester is an admin
  const requesterIsAdmin = await isAdmin(adminId);
  if (!requesterIsAdmin) {
    throw new Error('Only admins can delete users');
  }

  // Get target user
  const targetUser = await getUserByIdFull(targetUserId);
  if (!targetUser) {
    throw new Error('Target user not found');
  }

  // Prevent deleting self
  if (adminId === targetUserId) {
    throw new Error('Cannot delete your own account');
  }

  // Prevent deleting other admins
  if (targetUser.isAdmin) {
    throw new Error('Cannot delete admin users');
  }

  // Get stats for logging
  const [qsoStats] = await db
    .select({ count: sql`CAST(COUNT(*) AS INTEGER)` })
    .from(qsos)
    .where(eq(qsos.userId, targetUserId));

  // Delete all related records using Drizzle
  // Delete in correct order to satisfy foreign key constraints
  logger.info('Attempting to delete user', { userId: targetUserId, adminId });

  try {
    // 1. Delete qso_changes (references qso_id -> qsos and job_id -> sync_jobs)
    // First get user's QSO IDs, then delete qso_changes referencing those QSOs
    const userQSOs = await db.select({ id: qsos.id }).from(qsos).where(eq(qsos.userId, targetUserId));
    const userQSOIds = userQSOs.map(q => q.id);

    if (userQSOIds.length > 0) {
      // Use raw SQL to delete qso_changes
      sqlite.exec(
        `DELETE FROM qso_changes WHERE qso_id IN (${userQSOIds.join(',')})`
      );
    }

    // 2. Delete award_progress
    await db.delete(awardProgress).where(eq(awardProgress.userId, targetUserId));

    // 3. Delete sync_jobs
    await db.delete(syncJobs).where(eq(syncJobs.userId, targetUserId));

    // 4. Delete qsos
    await db.delete(qsos).where(eq(qsos.userId, targetUserId));

    // 5. Delete admin actions where user is target
    await db.delete(adminActions).where(eq(adminActions.targetUserId, targetUserId));

    // 6. Delete user
    await db.delete(users).where(eq(users.id, targetUserId));

    // Log action
    await logAdminAction(adminId, 'user_delete', targetUserId, {
      email: targetUser.email,
      callsign: targetUser.callsign,
      qsoCountDeleted: qsoStats.count,
    });

    logger.info('User deleted successfully', { userId: targetUserId, adminId });
  } catch (error) {
    logger.error('Failed to delete user', { error: error.message, userId: targetUserId });
    throw error;
  }
}
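Both the CLI `demote` command and `changeUserRole` enforce the same invariant: a demotion is refused when it would leave zero admins. A self-contained sketch of that guard, with hypothetical names standing in for the users table (not this repo's API):

```javascript
// Hypothetical sketch: refuse a demotion that would leave zero admins.
// `userRows` stands in for the users table; is_admin mirrors the SQLite 0/1 flag.
function canDemote(userRows, email) {
  const user = userRows.find(u => u.email === email);
  if (!user) return { ok: false, reason: 'not found' };
  if (user.is_admin !== 1) return { ok: false, reason: 'not an admin' };

  // Count admins before mutating anything, as both code paths above do
  const adminCount = userRows.filter(u => u.is_admin === 1).length;
  if (adminCount === 1) return { ok: false, reason: 'last admin' };

  return { ok: true };
}
```

Checking the count before the UPDATE keeps the system from locking itself out, at the cost of a small race if two demotions run concurrently.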
@@ -142,3 +142,97 @@ export async function updateDCLCredentials(userId, dclApiKey) {
    })
    .where(eq(users.id, userId));
}

/**
 * Check if user is admin
 * @param {number} userId - User ID
 * @returns {Promise<boolean>} True if user is admin
 */
export async function isAdmin(userId) {
  const [user] = await db
    .select({ isAdmin: users.isAdmin })
    .from(users)
    .where(eq(users.id, userId))
    .limit(1);

  return user?.isAdmin === true || user?.isAdmin === 1;
}

/**
 * Get all admin users
 * @returns {Promise<Array>} Array of admin users (without passwords)
 */
export async function getAdminUsers() {
  const adminUsers = await db
    .select({
      id: users.id,
      email: users.email,
      callsign: users.callsign,
      isAdmin: users.isAdmin,
      createdAt: users.createdAt,
    })
    .from(users)
    .where(eq(users.isAdmin, 1));

  return adminUsers;
}

/**
 * Update user admin status
 * @param {number} userId - User ID
 * @param {boolean} isAdmin - Admin flag
 * @returns {Promise<void>}
 */
export async function updateUserRole(userId, isAdmin) {
  await db
    .update(users)
    .set({
      isAdmin: isAdmin ? 1 : 0,
      updatedAt: new Date(),
    })
    .where(eq(users.id, userId));
}

/**
 * Get all users (for admin use)
 * @returns {Promise<Array>} Array of all users (without passwords)
 */
export async function getAllUsers() {
  const allUsers = await db
    .select({
      id: users.id,
      email: users.email,
      callsign: users.callsign,
      isAdmin: users.isAdmin,
      createdAt: users.createdAt,
      updatedAt: users.updatedAt,
    })
    .from(users)
    .orderBy(users.createdAt);

  return allUsers;
}

/**
 * Get user by ID (for admin use)
 * @param {number} userId - User ID
 * @returns {Promise<Object|null>} Full user object (without password) or null
 */
export async function getUserByIdFull(userId) {
  const [user] = await db
    .select({
      id: users.id,
      email: users.email,
      callsign: users.callsign,
      isAdmin: users.isAdmin,
      lotwUsername: users.lotwUsername,
      dclApiKey: users.dclApiKey,
      createdAt: users.createdAt,
      updatedAt: users.updatedAt,
    })
    .from(users)
    .where(eq(users.id, userId))
    .limit(1);

  return user || null;
}
@@ -32,6 +32,7 @@ function loadAwardDefinitions() {
    'dld-40m.json',
    'dld-cw.json',
    'dld-80m-cw.json',
    '73-on-73.json',
  ];

  for (const file of files) {
@@ -13,6 +13,7 @@
 */

const awardCache = new Map();
const statsCache = new Map();
const CACHE_TTL = 5 * 60 * 1000; // 5 minutes

/**
@@ -26,6 +27,7 @@ export function getCachedAwardProgress(userId, awardId) {
  const cached = awardCache.get(key);

  if (!cached) {
    recordAwardCacheMiss();
    return null;
  }

@@ -33,9 +35,11 @@ export function getCachedAwardProgress(userId, awardId) {
  const age = Date.now() - cached.timestamp;
  if (age > CACHE_TTL) {
    awardCache.delete(key);
    recordAwardCacheMiss();
    return null;
  }

  recordAwardCacheHit();
  return cached.data;
}

@@ -125,5 +129,147 @@ export function cleanupExpiredCache() {
    }
  }

  for (const [key, value] of statsCache) {
    const age = now - value.timestamp;
    if (age > CACHE_TTL) {
      statsCache.delete(key);
      cleaned++;
    }
  }

  return cleaned;
}

/**
 * Get cached QSO statistics if available and not expired
 * @param {number} userId - User ID
 * @returns {object|null} Cached stats data or null if not found/expired
 */
export function getCachedStats(userId) {
  const key = `stats_${userId}`;
  const cached = statsCache.get(key);

  if (!cached) {
    recordStatsCacheMiss();
    return null;
  }

  // Check if cache has expired
  const age = Date.now() - cached.timestamp;
  if (age > CACHE_TTL) {
    statsCache.delete(key);
    recordStatsCacheMiss();
    return null;
  }

  recordStatsCacheHit();
  return cached.data;
}

/**
 * Set QSO statistics in cache
 * @param {number} userId - User ID
 * @param {object} data - Statistics data to cache
 */
export function setCachedStats(userId, data) {
  const key = `stats_${userId}`;
  statsCache.set(key, {
    data,
    timestamp: Date.now()
  });
}

/**
 * Invalidate cached QSO statistics for a specific user
 * Call this after syncing or updating QSOs
 * @param {number} userId - User ID
 * @returns {boolean} True if cache was invalidated
 */
export function invalidateStatsCache(userId) {
  const key = `stats_${userId}`;
  const deleted = statsCache.delete(key);
  return deleted;
}

/**
 * Get cache statistics including both award and stats caches
 * @returns {object} Cache stats
 */
export function getCacheStats() {
  const now = Date.now();
  let expired = 0;
  let valid = 0;

  for (const [, value] of awardCache) {
    const age = now - value.timestamp;
    if (age > CACHE_TTL) {
      expired++;
    } else {
      valid++;
    }
  }

  for (const [, value] of statsCache) {
    const age = now - value.timestamp;
    if (age > CACHE_TTL) {
      expired++;
    } else {
      valid++;
    }
  }

  const totalRequests = awardCacheStats.hits + awardCacheStats.misses + statsCacheStats.hits + statsCacheStats.misses;
  const hitRate = totalRequests > 0 ? ((awardCacheStats.hits + statsCacheStats.hits) / totalRequests * 100).toFixed(2) + '%' : '0%';

  return {
    total: awardCache.size + statsCache.size,
    valid,
    expired,
    ttl: CACHE_TTL,
    hitRate,
    awardCache: {
      size: awardCache.size,
      hits: awardCacheStats.hits,
      misses: awardCacheStats.misses
    },
    statsCache: {
      size: statsCache.size,
      hits: statsCacheStats.hits,
      misses: statsCacheStats.misses
    }
  };
}

/**
 * Cache statistics tracking
 */
const awardCacheStats = { hits: 0, misses: 0 };
const statsCacheStats = { hits: 0, misses: 0 };

/**
 * Record a cache hit for awards
 */
export function recordAwardCacheHit() {
  awardCacheStats.hits++;
}

/**
 * Record a cache miss for awards
 */
export function recordAwardCacheMiss() {
  awardCacheStats.misses++;
}

/**
 * Record a cache hit for stats
 */
export function recordStatsCacheHit() {
  statsCacheStats.hits++;
}

/**
 * Record a cache miss for stats
 */
export function recordStatsCacheMiss() {
  statsCacheStats.misses++;
}
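The stats cache above follows the usual TTL-map pattern: a `Map` of `{ data, timestamp }` entries, with expiry checked and the entry evicted on read. A self-contained sketch of that pattern (not the module's actual exports; the clock is injected so expiry can be tested deterministically):

```javascript
// Minimal TTL cache sketch mirroring the get/set/expire logic above.
function createTTLCache(ttlMs, now = Date.now) {
  const store = new Map();
  return {
    get(key) {
      const cached = store.get(key);
      if (!cached) return null;                // miss
      if (now() - cached.timestamp > ttlMs) {  // expired: evict and miss
        store.delete(key);
        return null;
      }
      return cached.data;                      // hit
    },
    set(key, data) {
      store.set(key, { data, timestamp: now() });
    },
  };
}
```

Evicting lazily on read keeps `get`/`set` O(1); a periodic sweep (like `cleanupExpiredCache` above) is still needed to bound memory for keys that are never read again.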
@@ -1,9 +1,9 @@
import { db, logger } from '../config.js';
import { qsos } from '../db/schema/index.js';
import { qsos, qsoChanges } from '../db/schema/index.js';
import { max, sql, eq, and, desc } from 'drizzle-orm';
import { updateJobProgress } from './job-queue.service.js';
import { parseDCLResponse, normalizeBand, normalizeMode } from '../utils/adif-parser.js';
import { invalidateUserCache } from './cache.service.js';
import { invalidateUserCache, invalidateStatsCache } from './cache.service.js';

/**
 * DCL (DARC Community Logbook) Service
@@ -170,7 +170,22 @@ function convertQSODatabaseFormat(adifQSO, userId) {
}

/**
 * Sync QSOs from DCL to database
 * Yield to event loop to allow other requests to be processed
 * This prevents blocking the server during long-running sync operations
 */
function yieldToEventLoop() {
  return new Promise(resolve => setImmediate(resolve));
}

/**
 * Get QSO key for duplicate detection
 */
function getQSOKey(qso) {
  return `${qso.callsign}|${qso.qsoDate}|${qso.timeOn}|${qso.band}|${qso.mode}`;
}

/**
 * Sync QSOs from DCL to database (optimized with batch operations)
 * Updates existing QSOs with DCL confirmation data
 *
 * @param {number} userId - User ID
@@ -219,123 +234,215 @@ export async function syncQSOs(userId, dclApiKey, sinceDate = null, jobId = null
  const addedQSOs = [];
  const updatedQSOs = [];

  for (let i = 0; i < adifQSOs.length; i++) {
    const adifQSO = adifQSOs[i];
  // Convert all QSOs to database format
  const dbQSOs = adifQSOs.map(qso => convertQSODatabaseFormat(qso, userId));

    try {
      const dbQSO = convertQSODatabaseFormat(adifQSO, userId);
  // Batch size for processing
  const BATCH_SIZE = 100;
  const totalBatches = Math.ceil(dbQSOs.length / BATCH_SIZE);

      // Check if QSO already exists (match by callsign, date, time, band, mode)
      const existing = await db
        .select()
        .from(qsos)
        .where(
          and(
            eq(qsos.userId, userId),
            eq(qsos.callsign, dbQSO.callsign),
            eq(qsos.qsoDate, dbQSO.qsoDate),
            eq(qsos.timeOn, dbQSO.timeOn),
            eq(qsos.band, dbQSO.band),
            eq(qsos.mode, dbQSO.mode)
          )
  for (let batchNum = 0; batchNum < totalBatches; batchNum++) {
    const startIdx = batchNum * BATCH_SIZE;
    const endIdx = Math.min(startIdx + BATCH_SIZE, dbQSOs.length);
    const batch = dbQSOs.slice(startIdx, endIdx);

    // Get unique callsigns and dates from batch
    const batchCallsigns = [...new Set(batch.map(q => q.callsign))];
    const batchDates = [...new Set(batch.map(q => q.qsoDate))];

    // Fetch all existing QSOs that could match this batch in one query
    const existingQSOs = await db
      .select()
      .from(qsos)
      .where(
        and(
          eq(qsos.userId, userId),
          // Match callsigns OR dates from this batch
          sql`(${qsos.callsign} IN ${batchCallsigns} OR ${qsos.qsoDate} IN ${batchDates})`
        )
        .limit(1);
      );

      if (existing.length > 0) {
        const existingQSO = existing[0];
    // Build lookup map for existing QSOs
    const existingMap = new Map();
    for (const existing of existingQSOs) {
      const key = getQSOKey(existing);
      existingMap.set(key, existing);
    }

        // Check if DCL confirmation or DOK data has changed
        const dataChanged =
          existingQSO.dclQslRstatus !== dbQSO.dclQslRstatus ||
          existingQSO.dclQslRdate !== dbQSO.dclQslRdate ||
          existingQSO.darcDok !== (dbQSO.darcDok || existingQSO.darcDok) ||
          existingQSO.myDarcDok !== (dbQSO.myDarcDok || existingQSO.myDarcDok) ||
          existingQSO.grid !== (dbQSO.grid || existingQSO.grid);
    // Process batch
    const toInsert = [];
    const toUpdate = [];
    const changeRecords = [];

        if (dataChanged) {
          // Update existing QSO with changed DCL confirmation and DOK data
          const updateData = {
            dclQslRdate: dbQSO.dclQslRdate,
            dclQslRstatus: dbQSO.dclQslRstatus,
          };
    for (const dbQSO of batch) {
      try {
        const key = getQSOKey(dbQSO);
        const existingQSO = existingMap.get(key);

          // Only add DOK fields if DCL sent them
          if (dbQSO.darcDok) updateData.darcDok = dbQSO.darcDok;
          if (dbQSO.myDarcDok) updateData.myDarcDok = dbQSO.myDarcDok;
        if (existingQSO) {
          // Check if DCL confirmation or DOK data has changed
          const dataChanged =
            existingQSO.dclQslRstatus !== dbQSO.dclQslRstatus ||
            existingQSO.dclQslRdate !== dbQSO.dclQslRdate ||
            existingQSO.darcDok !== (dbQSO.darcDok || existingQSO.darcDok) ||
            existingQSO.myDarcDok !== (dbQSO.myDarcDok || existingQSO.myDarcDok) ||
            existingQSO.grid !== (dbQSO.grid || existingQSO.grid);

          // Only update grid if DCL sent one
          if (dbQSO.grid) {
            updateData.grid = dbQSO.grid;
            updateData.gridSource = dbQSO.gridSource;
          if (dataChanged) {
            // Build update data
            const updateData = {
              dclQslRdate: dbQSO.dclQslRdate,
              dclQslRstatus: dbQSO.dclQslRstatus,
            };

            // Only add DOK fields if DCL sent them
            if (dbQSO.darcDok) updateData.darcDok = dbQSO.darcDok;
            if (dbQSO.myDarcDok) updateData.myDarcDok = dbQSO.myDarcDok;

            // Only update grid if DCL sent one
            if (dbQSO.grid) {
              updateData.grid = dbQSO.grid;
              updateData.gridSource = dbQSO.gridSource;
            }

            // DXCC priority: LoTW > DCL
            // Only update entity fields from DCL if:
            // 1. QSO is NOT LoTW confirmed, AND
            // 2. DCL actually sent entity data, AND
            // 3. Current entity is missing
            const hasLoTWConfirmation = existingQSO.lotwQslRstatus === 'Y';
            const hasDCLData = dbQSO.entity || dbQSO.entityId;
            const missingEntity = !existingQSO.entity || existingQSO.entity === '';

            if (!hasLoTWConfirmation && hasDCLData && missingEntity) {
              if (dbQSO.entity) updateData.entity = dbQSO.entity;
              if (dbQSO.entityId) updateData.entityId = dbQSO.entityId;
              if (dbQSO.continent) updateData.continent = dbQSO.continent;
              if (dbQSO.cqZone) updateData.cqZone = dbQSO.cqZone;
              if (dbQSO.ituZone) updateData.ituZone = dbQSO.ituZone;
            }

            toUpdate.push({
              id: existingQSO.id,
              data: updateData,
            });

            // Track change for rollback
            if (jobId) {
              changeRecords.push({
                jobId,
                qsoId: existingQSO.id,
                changeType: 'updated',
                beforeData: JSON.stringify({
                  dclQslRstatus: existingQSO.dclQslRstatus,
                  dclQslRdate: existingQSO.dclQslRdate,
                  darcDok: existingQSO.darcDok,
                  myDarcDok: existingQSO.myDarcDok,
                  grid: existingQSO.grid,
                  gridSource: existingQSO.gridSource,
                  entity: existingQSO.entity,
                  entityId: existingQSO.entityId,
                }),
                afterData: JSON.stringify({
                  dclQslRstatus: dbQSO.dclQslRstatus,
                  dclQslRdate: dbQSO.dclQslRdate,
                  darcDok: updateData.darcDok,
                  myDarcDok: updateData.myDarcDok,
                  grid: updateData.grid,
                  gridSource: updateData.gridSource,
                  entity: updateData.entity,
                  entityId: updateData.entityId,
                }),
              });
            }

            updatedQSOs.push({
              id: existingQSO.id,
              callsign: dbQSO.callsign,
              date: dbQSO.qsoDate,
              band: dbQSO.band,
              mode: dbQSO.mode,
            });
            updatedCount++;
          } else {
            skippedCount++;
          }

          // DXCC priority: LoTW > DCL
          // Only update entity fields from DCL if:
          // 1. QSO is NOT LoTW confirmed, AND
          // 2. DCL actually sent entity data, AND
          // 3. Current entity is missing
          const hasLoTWConfirmation = existingQSO.lotwQslRstatus === 'Y';
          const hasDCLData = dbQSO.entity || dbQSO.entityId;
          const missingEntity = !existingQSO.entity || existingQSO.entity === '';

          if (!hasLoTWConfirmation && hasDCLData && missingEntity) {
            // Fill in entity data from DCL (only if DCL provides it)
            if (dbQSO.entity) updateData.entity = dbQSO.entity;
            if (dbQSO.entityId) updateData.entityId = dbQSO.entityId;
            if (dbQSO.continent) updateData.continent = dbQSO.continent;
            if (dbQSO.cqZone) updateData.cqZone = dbQSO.cqZone;
            if (dbQSO.ituZone) updateData.ituZone = dbQSO.ituZone;
          }

          await db
            .update(qsos)
            .set(updateData)
            .where(eq(qsos.id, existingQSO.id));
          updatedCount++;
          // Track updated QSO (CALL and DATE)
          updatedQSOs.push({
        } else {
          // New QSO to insert
          toInsert.push(dbQSO);
          addedQSOs.push({
            callsign: dbQSO.callsign,
            date: dbQSO.qsoDate,
            band: dbQSO.band,
            mode: dbQSO.mode,
          });
        } else {
          // Skip - same data
          skippedCount++;
|
||||
addedCount++;
|
||||
}
|
||||
} else {
|
||||
// Insert new QSO
|
||||
await db.insert(qsos).values(dbQSO);
|
||||
addedCount++;
|
||||
// Track added QSO (CALL and DATE)
|
||||
addedQSOs.push({
|
||||
callsign: dbQSO.callsign,
|
||||
date: dbQSO.qsoDate,
|
||||
band: dbQSO.band,
|
||||
mode: dbQSO.mode,
|
||||
} catch (error) {
|
||||
logger.error('Failed to process DCL QSO in batch', {
|
||||
error: error.message,
|
||||
qso: dbQSO,
|
||||
userId,
|
||||
});
|
||||
errors.push({ qso: dbQSO, error: error.message });
|
||||
}
|
||||
|
||||
// Update job progress every 10 QSOs
|
||||
if (jobId && (i + 1) % 10 === 0) {
|
||||
await updateJobProgress(jobId, {
|
||||
processed: i + 1,
|
||||
message: `Processed ${i + 1}/${adifQSOs.length} QSOs from DCL...`,
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('Failed to process DCL QSO', {
|
||||
error: error.message,
|
||||
qso: adifQSO,
|
||||
userId,
|
||||
});
|
||||
errors.push({ qso: adifQSO, error: error.message });
|
||||
}
|
||||
|
||||
// Batch insert new QSOs
|
||||
if (toInsert.length > 0) {
|
||||
const inserted = await db.insert(qsos).values(toInsert).returning();
|
||||
// Track inserted QSOs with their IDs for change tracking
|
||||
if (jobId) {
|
||||
for (let i = 0; i < inserted.length; i++) {
|
||||
changeRecords.push({
|
||||
jobId,
|
||||
qsoId: inserted[i].id,
|
||||
changeType: 'added',
|
||||
beforeData: null,
|
||||
afterData: JSON.stringify({
|
||||
callsign: toInsert[i].callsign,
|
||||
qsoDate: toInsert[i].qsoDate,
|
||||
timeOn: toInsert[i].timeOn,
|
||||
band: toInsert[i].band,
|
||||
mode: toInsert[i].mode,
|
||||
}),
|
||||
});
|
||||
// Update addedQSOs with actual IDs
|
||||
addedQSOs[addedCount - inserted.length + i].id = inserted[i].id;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Batch update existing QSOs
|
||||
if (toUpdate.length > 0) {
|
||||
for (const update of toUpdate) {
|
||||
await db
|
||||
.update(qsos)
|
||||
.set(update.data)
|
||||
.where(eq(qsos.id, update.id));
|
||||
}
|
||||
}
|
||||
|
||||
// Batch insert change records
|
||||
if (changeRecords.length > 0) {
|
||||
await db.insert(qsoChanges).values(changeRecords);
|
||||
}
|
||||
|
||||
// Update job progress after each batch
|
||||
if (jobId) {
|
||||
await updateJobProgress(jobId, {
|
||||
processed: endIdx,
|
||||
message: `Processed ${endIdx}/${dbQSOs.length} QSOs from DCL...`,
|
||||
});
|
||||
}
|
||||
|
||||
// Yield to event loop after each batch to allow other requests
|
||||
await yieldToEventLoop();
|
||||
}
|
||||
|
||||
const result = {
|
||||
success: true,
|
||||
total: adifQSOs.length,
|
||||
total: dbQSOs.length,
|
||||
added: addedCount,
|
||||
updated: updatedCount,
|
||||
skipped: skippedCount,
|
||||
@@ -353,7 +460,8 @@ export async function syncQSOs(userId, dclApiKey, sinceDate = null, jobId = null
|
||||
|
||||
// Invalidate award cache for this user since QSOs may have changed
|
||||
const deletedCache = invalidateUserCache(userId);
|
||||
logger.debug(`Invalidated ${deletedCache} cached award entries for user ${userId}`);
|
||||
invalidateStatsCache(userId);
|
||||
logger.debug(`Invalidated ${deletedCache} cached award entries and stats cache for user ${userId}`);
|
||||
|
||||
return result;
|
||||
|
||||
|
||||
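The selective-merge rules above (only overwrite DOK/grid fields when DCL actually sent them, and never let DCL overwrite a LoTW-confirmed entity) can be sketched in isolation. This is a minimal sketch, not the service code: `buildDclUpdateData` is a hypothetical helper, and `existingQSO`/`dbQSO` are plain objects standing in for Drizzle rows.

```javascript
// Sketch of the DCL merge rules, assuming plain objects instead of DB rows.
function buildDclUpdateData(existingQSO, dbQSO) {
  // Confirmation fields always come from DCL
  const updateData = {
    dclQslRdate: dbQSO.dclQslRdate,
    dclQslRstatus: dbQSO.dclQslRstatus,
  };

  // DOK and grid fields are only overwritten when DCL actually sent them
  if (dbQSO.darcDok) updateData.darcDok = dbQSO.darcDok;
  if (dbQSO.grid) {
    updateData.grid = dbQSO.grid;
    updateData.gridSource = dbQSO.gridSource;
  }

  // Entity data: LoTW wins; DCL may only fill a missing entity
  const hasLoTWConfirmation = existingQSO.lotwQslRstatus === 'Y';
  const hasDCLData = Boolean(dbQSO.entity || dbQSO.entityId);
  const missingEntity = !existingQSO.entity;
  if (!hasLoTWConfirmation && hasDCLData && missingEntity) {
    if (dbQSO.entity) updateData.entity = dbQSO.entity;
    if (dbQSO.entityId) updateData.entityId = dbQSO.entityId;
  }
  return updateData;
}
```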
@@ -1,6 +1,6 @@
import { db, logger } from '../config.js';
import { syncJobs } from '../db/schema/index.js';
import { eq, and, or, lt } from 'drizzle-orm';
import { syncJobs, qsoChanges, qsos } from '../db/schema/index.js';
import { eq, and, or, lt, desc } from 'drizzle-orm';

/**
 * Simplified Background Job Queue Service
@@ -252,7 +252,7 @@ export async function getUserActiveJob(userId, jobType = null) {
    .select()
    .from(syncJobs)
    .where(and(...conditions))
    .orderBy(syncJobs.createdAt)
    .orderBy(desc(syncJobs.createdAt))
    .limit(1);

  return job || null;
@@ -269,7 +269,7 @@ export async function getUserJobs(userId, limit = 10) {
    .select()
    .from(syncJobs)
    .where(eq(syncJobs.userId, userId))
    .orderBy(syncJobs.createdAt)
    .orderBy(desc(syncJobs.createdAt))
    .limit(limit);

  return jobs.map((job) => {
@@ -342,3 +342,110 @@ export async function updateJobProgress(jobId, progressData) {
    result: JSON.stringify(updatedData),
  });
}

/**
 * Cancel and rollback a sync job
 * Deletes added QSOs and restores updated QSOs to their previous state
 * @param {number} jobId - Job ID to cancel
 * @param {number} userId - User ID (for security check)
 * @returns {Promise<Object>} Result of cancellation
 */
export async function cancelJob(jobId, userId) {
  logger.info('Cancelling job', { jobId, userId });

  // Get job to verify ownership
  const job = await getJob(jobId);
  if (!job) {
    return { success: false, error: 'Job not found' };
  }

  // Verify user owns this job
  if (job.userId !== userId) {
    return { success: false, error: 'Forbidden' };
  }

  // Only allow cancelling failed jobs or stale running jobs
  const isStale = job.status === JobStatus.RUNNING && job.startedAt &&
    (Date.now() - new Date(job.startedAt).getTime()) > 60 * 60 * 1000; // 1 hour

  if (job.status === JobStatus.PENDING) {
    return { success: false, error: 'Cannot cancel pending jobs' };
  }

  if (job.status === JobStatus.COMPLETED) {
    return { success: false, error: 'Cannot cancel completed jobs' };
  }

  if (job.status === JobStatus.RUNNING && !isStale) {
    return { success: false, error: 'Cannot cancel active jobs (only stale jobs older than 1 hour)' };
  }

  // Get all QSO changes for this job
  const changes = await db
    .select()
    .from(qsoChanges)
    .where(eq(qsoChanges.jobId, jobId));

  let deletedAdded = 0;
  let restoredUpdated = 0;

  for (const change of changes) {
    if (change.changeType === 'added' && change.qsoId) {
      // Delete the QSO that was added
      await db.delete(qsos).where(eq(qsos.id, change.qsoId));
      deletedAdded++;
    } else if (change.changeType === 'updated' && change.qsoId && change.beforeData) {
      // Restore the QSO to its previous state
      try {
        const beforeData = JSON.parse(change.beforeData);

        // Build update object based on job type
        const updateData = {};

        if (job.type === 'lotw_sync') {
          if (beforeData.lotwQslRstatus !== undefined) updateData.lotwQslRstatus = beforeData.lotwQslRstatus;
          if (beforeData.lotwQslRdate !== undefined) updateData.lotwQslRdate = beforeData.lotwQslRdate;
        } else if (job.type === 'dcl_sync') {
          if (beforeData.dclQslRstatus !== undefined) updateData.dclQslRstatus = beforeData.dclQslRstatus;
          if (beforeData.dclQslRdate !== undefined) updateData.dclQslRdate = beforeData.dclQslRdate;
          if (beforeData.darcDok !== undefined) updateData.darcDok = beforeData.darcDok;
          if (beforeData.myDarcDok !== undefined) updateData.myDarcDok = beforeData.myDarcDok;
          if (beforeData.grid !== undefined) updateData.grid = beforeData.grid;
          if (beforeData.gridSource !== undefined) updateData.gridSource = beforeData.gridSource;
          if (beforeData.entity !== undefined) updateData.entity = beforeData.entity;
          if (beforeData.entityId !== undefined) updateData.entityId = beforeData.entityId;
        }

        if (Object.keys(updateData).length > 0) {
          await db.update(qsos).set(updateData).where(eq(qsos.id, change.qsoId));
          restoredUpdated++;
        }
      } catch (error) {
        logger.error('Failed to restore QSO', { qsoId: change.qsoId, error: error.message });
      }
    }
  }

  // Delete all change records for this job
  await db.delete(qsoChanges).where(eq(qsoChanges.jobId, jobId));

  // Update job status to cancelled
  await updateJob(jobId, {
    status: 'cancelled',
    completedAt: new Date(),
    result: JSON.stringify({
      cancelled: true,
      deletedAdded,
      restoredUpdated,
    }),
  });

  logger.info('Job cancelled successfully', { jobId, deletedAdded, restoredUpdated });

  return {
    success: true,
    message: `Job cancelled: ${deletedAdded} QSOs deleted, ${restoredUpdated} QSOs restored`,
    deletedAdded,
    restoredUpdated,
  };
}

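The rollback loop in `cancelJob` can be sketched in memory: `'added'` changes delete the QSO, `'updated'` changes restore the `beforeData` snapshot. This is a simplified sketch where a `Map` stands in for the `qsos` table and the restore applies the whole snapshot rather than the per-job-type field lists.

```javascript
// In-memory sketch of cancelJob's rollback pass over qso_changes records.
function rollbackChanges(qsoTable, changes) {
  let deletedAdded = 0;
  let restoredUpdated = 0;

  for (const change of changes) {
    if (change.changeType === 'added' && change.qsoId) {
      // Added QSOs are simply removed again
      qsoTable.delete(change.qsoId);
      deletedAdded++;
    } else if (change.changeType === 'updated' && change.qsoId && change.beforeData) {
      // Updated QSOs get their snapshotted fields written back
      const beforeData = JSON.parse(change.beforeData);
      const row = qsoTable.get(change.qsoId);
      if (row) {
        Object.assign(row, beforeData);
        restoredUpdated++;
      }
    }
  }
  return { deletedAdded, restoredUpdated };
}
```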
@@ -1,9 +1,10 @@
import { db, logger } from '../config.js';
import { qsos } from '../db/schema/index.js';
import { qsos, qsoChanges } from '../db/schema/index.js';
import { max, sql, eq, and, or, desc, like } from 'drizzle-orm';
import { updateJobProgress } from './job-queue.service.js';
import { parseADIF, normalizeBand, normalizeMode } from '../utils/adif-parser.js';
import { invalidateUserCache } from './cache.service.js';
import { invalidateUserCache, getCachedStats, setCachedStats, invalidateStatsCache } from './cache.service.js';
import { trackQueryPerformance, getPerformanceSummary, resetPerformanceMetrics } from './performance.service.js';

/**
 * LoTW (Logbook of the World) Service
@@ -15,6 +16,35 @@ const MAX_RETRIES = 30;
const RETRY_DELAY = 10000;
const REQUEST_TIMEOUT = 60000;

/**
 * SECURITY: Sanitize search input to prevent injection and DoS
 * Limits length and removes potentially harmful characters
 */
function sanitizeSearchInput(searchTerm) {
  if (!searchTerm || typeof searchTerm !== 'string') {
    return '';
  }

  // Trim whitespace
  let sanitized = searchTerm.trim();

  // Limit length (DoS prevention)
  const MAX_SEARCH_LENGTH = 100;
  if (sanitized.length > MAX_SEARCH_LENGTH) {
    sanitized = sanitized.substring(0, MAX_SEARCH_LENGTH);
  }

  // Remove potentially dangerous SQL pattern wildcards from user input
  // We'll add our own wildcards for the LIKE query
  // Note: Drizzle ORM escapes parameters, but this adds defense-in-depth
  sanitized = sanitized.replace(/[%_\\]/g, '');

  // Remove null bytes and other control characters
  sanitized = sanitized.replace(/[\x00-\x1F\x7F]/g, '');

  return sanitized;
}

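To see what `sanitizeSearchInput` accepts and rejects, here is the function as defined above, repeated standalone so its behavior on a few inputs can be checked directly:

```javascript
// Copy of sanitizeSearchInput as defined in the diff, shown standalone.
function sanitizeSearchInput(searchTerm) {
  if (!searchTerm || typeof searchTerm !== 'string') {
    return '';
  }
  let sanitized = searchTerm.trim();

  // Length cap (DoS prevention)
  const MAX_SEARCH_LENGTH = 100;
  if (sanitized.length > MAX_SEARCH_LENGTH) {
    sanitized = sanitized.substring(0, MAX_SEARCH_LENGTH);
  }

  // Strip LIKE wildcards; the query adds its own % around the term
  sanitized = sanitized.replace(/[%_\\]/g, '');
  // Strip null bytes and other control characters
  sanitized = sanitized.replace(/[\x00-\x1F\x7F]/g, '');

  return sanitized;
}
```

So a search like `DL1%_ABC` reaches the `LIKE` query as `DL1ABC`, wrapped in the service's own `%...%` wildcards.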
/**
 * Check if LoTW response indicates the report is still being prepared
 */
@@ -181,7 +211,22 @@ function convertQSODatabaseFormat(adifQSO, userId) {
}

/**
 * Sync QSOs from LoTW to database
 * Yield to event loop to allow other requests to be processed
 * This prevents blocking the server during long-running sync operations
 */
function yieldToEventLoop() {
  return new Promise(resolve => setImmediate(resolve));
}

/**
 * Get QSO key for duplicate detection
 */
function getQSOKey(qso) {
  return `${qso.callsign}|${qso.qsoDate}|${qso.timeOn}|${qso.band}|${qso.mode}`;
}

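`getQSOKey` is what turns the per-QSO duplicate query into an O(1) `Map` lookup: existing rows are keyed once, then each incoming QSO is matched against the map. A small sketch with made-up example rows:

```javascript
// getQSOKey as defined above, plus the Map-based lookup the batch loop uses.
function getQSOKey(qso) {
  return `${qso.callsign}|${qso.qsoDate}|${qso.timeOn}|${qso.band}|${qso.mode}`;
}

// Hypothetical existing rows standing in for the batch duplicate-check query
const existingRows = [
  { id: 1, callsign: 'DL1ABC', qsoDate: '20240101', timeOn: '1200', band: '20m', mode: 'FT8' },
];
const existingMap = new Map();
for (const row of existingRows) {
  existingMap.set(getQSOKey(row), row);
}

// An incoming QSO with the same key resolves to the existing row
const incoming = { callsign: 'DL1ABC', qsoDate: '20240101', timeOn: '1200', band: '20m', mode: 'FT8' };
const match = existingMap.get(getQSOKey(incoming));
```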
/**
 * Sync QSOs from LoTW to database (optimized with batch operations)
 * @param {number} userId - User ID
 * @param {string} lotwUsername - LoTW username
 * @param {string} lotwPassword - LoTW password
@@ -228,90 +273,177 @@ export async function syncQSOs(userId, lotwUsername, lotwPassword, sinceDate = n
  const addedQSOs = [];
  const updatedQSOs = [];

  for (let i = 0; i < adifQSOs.length; i++) {
    const qsoData = adifQSOs[i];
  // Convert all QSOs to database format
  const dbQSOs = adifQSOs.map(qsoData => convertQSODatabaseFormat(qsoData, userId));

    try {
      const dbQSO = convertQSODatabaseFormat(qsoData, userId);
  // Batch size for processing
  const BATCH_SIZE = 100;
  const totalBatches = Math.ceil(dbQSOs.length / BATCH_SIZE);

      const existing = await db
        .select()
        .from(qsos)
        .where(
          and(
            eq(qsos.userId, userId),
            eq(qsos.callsign, dbQSO.callsign),
            eq(qsos.qsoDate, dbQSO.qsoDate),
            eq(qsos.timeOn, dbQSO.timeOn),
            eq(qsos.band, dbQSO.band),
            eq(qsos.mode, dbQSO.mode)
          )
  for (let batchNum = 0; batchNum < totalBatches; batchNum++) {
    const startIdx = batchNum * BATCH_SIZE;
    const endIdx = Math.min(startIdx + BATCH_SIZE, dbQSOs.length);
    const batch = dbQSOs.slice(startIdx, endIdx);

    // Build condition for batch duplicate check
    // Get unique callsigns, dates, bands, modes from batch
    const batchCallsigns = [...new Set(batch.map(q => q.callsign))];
    const batchDates = [...new Set(batch.map(q => q.qsoDate))];

    // Fetch all existing QSOs that could match this batch in one query
    const existingQSOs = await db
      .select()
      .from(qsos)
      .where(
        and(
          eq(qsos.userId, userId),
          // Match callsigns OR dates from this batch
          sql`(${qsos.callsign} IN ${batchCallsigns} OR ${qsos.qsoDate} IN ${batchDates})`
        )
        .limit(1);
      );

      if (existing.length > 0) {
        const existingQSO = existing[0];
    // Build lookup map for existing QSOs
    const existingMap = new Map();
    for (const existing of existingQSOs) {
      const key = getQSOKey(existing);
      existingMap.set(key, existing);
    }

        // Check if LoTW confirmation data has changed
        const confirmationChanged =
          existingQSO.lotwQslRstatus !== dbQSO.lotwQslRstatus ||
          existingQSO.lotwQslRdate !== dbQSO.lotwQslRdate;
    // Process batch
    const toInsert = [];
    const toUpdate = [];
    const changeRecords = [];

        if (confirmationChanged) {
          await db
            .update(qsos)
            .set({
    for (const dbQSO of batch) {
      try {
        const key = getQSOKey(dbQSO);
        const existingQSO = existingMap.get(key);

        if (existingQSO) {
          // Check if LoTW confirmation data has changed
          const confirmationChanged =
            existingQSO.lotwQslRstatus !== dbQSO.lotwQslRstatus ||
            existingQSO.lotwQslRdate !== dbQSO.lotwQslRdate;

          if (confirmationChanged) {
            toUpdate.push({
              id: existingQSO.id,
              lotwQslRdate: dbQSO.lotwQslRdate,
              lotwQslRstatus: dbQSO.lotwQslRstatus,
              lotwSyncedAt: dbQSO.lotwSyncedAt,
            })
            .where(eq(qsos.id, existingQSO.id));
          updatedCount++;
          // Track updated QSO (CALL and DATE)
          updatedQSOs.push({
            });

            // Track change for rollback
            if (jobId) {
              changeRecords.push({
                jobId,
                qsoId: existingQSO.id,
                changeType: 'updated',
                beforeData: JSON.stringify({
                  lotwQslRstatus: existingQSO.lotwQslRstatus,
                  lotwQslRdate: existingQSO.lotwQslRdate,
                }),
                afterData: JSON.stringify({
                  lotwQslRstatus: dbQSO.lotwQslRstatus,
                  lotwQslRdate: dbQSO.lotwQslRdate,
                }),
              });
            }

            updatedQSOs.push({
              id: existingQSO.id,
              callsign: dbQSO.callsign,
              date: dbQSO.qsoDate,
              band: dbQSO.band,
              mode: dbQSO.mode,
            });
            updatedCount++;
          } else {
            skippedCount++;
          }
        } else {
          // New QSO to insert
          toInsert.push(dbQSO);
          addedQSOs.push({
            callsign: dbQSO.callsign,
            date: dbQSO.qsoDate,
            band: dbQSO.band,
            mode: dbQSO.mode,
          });
        } else {
          // Skip - same data
          skippedCount++;
          addedCount++;
        }
      } else {
        await db.insert(qsos).values(dbQSO);
        addedCount++;
        // Track added QSO (CALL and DATE)
        addedQSOs.push({
          callsign: dbQSO.callsign,
          date: dbQSO.qsoDate,
          band: dbQSO.band,
          mode: dbQSO.mode,
        });
      } catch (error) {
        logger.error('Error processing QSO in batch', { error: error.message, jobId, qso: dbQSO });
        errors.push({ qso: dbQSO, error: error.message });
      }

      // Update job progress every 10 QSOs
      if (jobId && (i + 1) % 10 === 0) {
        await updateJobProgress(jobId, {
          processed: i + 1,
          message: `Processed ${i + 1}/${adifQSOs.length} QSOs...`,
        });
      }
    } catch (error) {
      logger.error('Error processing QSO', { error: error.message, jobId, qso: qsoData });
      errors.push({ qso: qsoData, error: error.message });
    }

    // Batch insert new QSOs
    if (toInsert.length > 0) {
      const inserted = await db.insert(qsos).values(toInsert).returning();
      // Track inserted QSOs with their IDs for change tracking
      if (jobId) {
        for (let i = 0; i < inserted.length; i++) {
          changeRecords.push({
            jobId,
            qsoId: inserted[i].id,
            changeType: 'added',
            beforeData: null,
            afterData: JSON.stringify({
              callsign: toInsert[i].callsign,
              qsoDate: toInsert[i].qsoDate,
              timeOn: toInsert[i].timeOn,
              band: toInsert[i].band,
              mode: toInsert[i].mode,
            }),
          });
          // Update addedQSOs with actual IDs
          addedQSOs[addedCount - inserted.length + i].id = inserted[i].id;
        }
      }
    }

    // Batch update existing QSOs
    if (toUpdate.length > 0) {
      for (const update of toUpdate) {
        await db
          .update(qsos)
          .set({
            lotwQslRdate: update.lotwQslRdate,
            lotwQslRstatus: update.lotwQslRstatus,
            lotwSyncedAt: update.lotwSyncedAt,
          })
          .where(eq(qsos.id, update.id));
      }
    }

    // Batch insert change records
    if (changeRecords.length > 0) {
      await db.insert(qsoChanges).values(changeRecords);
    }

    // Update job progress after each batch
    if (jobId) {
      await updateJobProgress(jobId, {
        processed: endIdx,
        message: `Processed ${endIdx}/${dbQSOs.length} QSOs...`,
      });
    }

    // Yield to event loop after each batch to allow other requests
    await yieldToEventLoop();
  }

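The batching skeleton used by the sync loop (slice into `BATCH_SIZE` chunks, process each chunk, then yield to the event loop via `setImmediate` so other requests can run) can be sketched on its own. `processBatch` here is a hypothetical stand-in for the duplicate check plus insert/update work:

```javascript
// Yield to the Node/Bun event loop between batches, as in the sync service.
function yieldToEventLoop() {
  return new Promise(resolve => setImmediate(resolve));
}

// Generic batch driver: same slicing arithmetic as the sync loop above.
async function processInBatches(items, batchSize, processBatch) {
  const totalBatches = Math.ceil(items.length / batchSize);
  for (let batchNum = 0; batchNum < totalBatches; batchNum++) {
    const startIdx = batchNum * batchSize;
    const endIdx = Math.min(startIdx + batchSize, items.length);
    await processBatch(items.slice(startIdx, endIdx), endIdx);
    await yieldToEventLoop(); // let queued requests run before the next chunk
  }
}
```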
  logger.info('LoTW sync completed', { total: adifQSOs.length, added: addedCount, updated: updatedCount, skipped: skippedCount, jobId });
  logger.info('LoTW sync completed', { total: dbQSOs.length, added: addedCount, updated: updatedCount, skipped: skippedCount, jobId });

  // Invalidate award cache for this user since QSOs may have changed
  // Invalidate award and stats cache for this user since QSOs may have changed
  const deletedCache = invalidateUserCache(userId);
  logger.debug(`Invalidated ${deletedCache} cached award entries for user ${userId}`);
  invalidateStatsCache(userId);
  logger.debug(`Invalidated ${deletedCache} cached award entries and stats cache for user ${userId}`);

  return {
    success: true,
    total: adifQSOs.length,
    total: dbQSOs.length,
    added: addedCount,
    updated: updatedCount,
    skipped: skippedCount,
@@ -354,6 +486,11 @@ export async function getUserQSOs(userId, filters = {}, options = {}) {
      // Both confirmed: Confirmed by LoTW AND DCL
      conditions.push(eq(qsos.lotwQslRstatus, 'Y'));
      conditions.push(eq(qsos.dclQslRstatus, 'Y'));
    } else if (filters.confirmationType === 'any') {
      // Confirmed by at least 1 service: LoTW OR DCL
      conditions.push(
        sql`(${qsos.lotwQslRstatus} = 'Y' OR ${qsos.dclQslRstatus} = 'Y')`
      );
    } else if (filters.confirmationType === 'none') {
      // Not confirmed: Not confirmed by LoTW AND not confirmed by DCL
      conditions.push(
@@ -367,12 +504,16 @@ export async function getUserQSOs(userId, filters = {}, options = {}) {

  // Search filter: callsign, entity, or grid
  if (filters.search) {
    const searchTerm = `%${filters.search}%`;
    conditions.push(or(
      like(qsos.callsign, searchTerm),
      like(qsos.entity, searchTerm),
      like(qsos.grid, searchTerm)
    ));
    // SECURITY: Sanitize search input to prevent injection
    const sanitized = sanitizeSearchInput(filters.search);
    if (sanitized) {
      const searchTerm = `%${sanitized}%`;
      conditions.push(or(
        like(qsos.callsign, searchTerm),
        like(qsos.entity, searchTerm),
        like(qsos.grid, searchTerm)
      ));
    }
  }

  // Use SQL COUNT for efficient pagination (avoids loading all QSOs into memory)
@@ -409,26 +550,40 @@ export async function getUserQSOs(userId, filters = {}, options = {}) {
 * Get QSO statistics for a user
 */
export async function getQSOStats(userId) {
  const allQSOs = await db.select().from(qsos).where(eq(qsos.userId, userId));
  const confirmed = allQSOs.filter((q) => q.lotwQslRstatus === 'Y' || q.dclQslRstatus === 'Y');
  // Check cache first
  const cached = getCachedStats(userId);
  if (cached) {
    return cached;
  }

  const uniqueEntities = new Set();
  const uniqueBands = new Set();
  const uniqueModes = new Set();
  // Calculate stats from database with performance tracking
  const stats = await trackQueryPerformance('getQSOStats', async () => {
    const [basicStats, uniqueStats] = await Promise.all([
      db.select({
        total: sql`CAST(COUNT(*) AS INTEGER)`,
        confirmed: sql`CAST(SUM(CASE WHEN lotw_qsl_rstatus = 'Y' OR dcl_qsl_rstatus = 'Y' THEN 1 ELSE 0 END) AS INTEGER)`
      }).from(qsos).where(eq(qsos.userId, userId)),

  allQSOs.forEach((q) => {
    if (q.entity) uniqueEntities.add(q.entity);
    if (q.band) uniqueBands.add(q.band);
    if (q.mode) uniqueModes.add(q.mode);
      db.select({
        uniqueEntities: sql`CAST(COUNT(DISTINCT entity) AS INTEGER)`,
        uniqueBands: sql`CAST(COUNT(DISTINCT band) AS INTEGER)`,
        uniqueModes: sql`CAST(COUNT(DISTINCT mode) AS INTEGER)`
      }).from(qsos).where(eq(qsos.userId, userId))
    ]);

    return {
      total: basicStats[0].total,
      confirmed: basicStats[0].confirmed || 0,
      uniqueEntities: uniqueStats[0].uniqueEntities || 0,
      uniqueBands: uniqueStats[0].uniqueBands || 0,
      uniqueModes: uniqueStats[0].uniqueModes || 0,
    };
  });

  return {
    total: allQSOs.length,
    confirmed: confirmed.length,
    uniqueEntities: uniqueEntities.size,
    uniqueBands: uniqueBands.size,
    uniqueModes: uniqueModes.size,
  };
  // Cache results
  setCachedStats(userId, stats);

  return stats;
}

/**

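The new `getQSOStats` follows a cache-or-compute shape: return a cached result if present, otherwise compute, cache, and return. A minimal sketch, with a plain `Map` standing in for `cache.service.js` and `computeStats` as a hypothetical stand-in for the aggregate queries:

```javascript
// Cache-or-compute pattern as used by getQSOStats (sketch, not the service code).
const statsCache = new Map();

async function getStatsCached(userId, computeStats) {
  // Cache hit: skip the expensive computation entirely
  if (statsCache.has(userId)) {
    return statsCache.get(userId);
  }
  // Cache miss: compute once, store, return
  const stats = await computeStats(userId);
  statsCache.set(userId, stats);
  return stats;
}
```

After a sync, invalidating the entry (as `invalidateStatsCache(userId)` does above) is just `statsCache.delete(userId)` in this sketch.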
274	src/backend/services/performance.service.js	Normal file
@@ -0,0 +1,274 @@
/**
 * Performance Monitoring Service
 *
 * Tracks query performance metrics to identify slow queries and detect regressions.
 *
 * Features:
 * - Track individual query performance
 * - Calculate averages and percentiles
 * - Detect slow queries automatically
 * - Provide performance statistics for monitoring
 *
 * Usage:
 * const result = await trackQueryPerformance('getQSOStats', async () => {
 *   return await someExpensiveOperation();
 * });
 */

// Performance metrics storage
const queryMetrics = new Map();

// Thresholds for slow queries
const SLOW_QUERY_THRESHOLD = 100; // 100ms = slow
const CRITICAL_QUERY_THRESHOLD = 500; // 500ms = critical

/**
 * Track query performance and log results
 * @param {string} queryName - Name of the query/operation
 * @param {Function} fn - Async function to execute and track
 * @returns {Promise<any>} Result of the function
 */
export async function trackQueryPerformance(queryName, fn) {
  const start = performance.now();
  let result;
  let error = null;

  try {
    result = await fn();
  } catch (err) {
    error = err;
    throw err; // Re-throw error
  } finally {
    const duration = performance.now() - start;
    recordQueryMetric(queryName, duration, error);

    // Log slow queries
    if (duration > CRITICAL_QUERY_THRESHOLD) {
      console.error(`🚨 CRITICAL SLOW QUERY: ${queryName} took ${duration.toFixed(2)}ms`);
    } else if (duration > SLOW_QUERY_THRESHOLD) {
      console.warn(`⚠️ SLOW QUERY: ${queryName} took ${duration.toFixed(2)}ms`);
    } else {
      console.log(`✅ Query Performance: ${queryName} - ${duration.toFixed(2)}ms`);
    }
  }

  return result;
}

/**
 * Record a query metric for later analysis
 * @param {string} queryName - Name of the query
 * @param {number} duration - Query duration in milliseconds
 * @param {Error|null} error - Error if query failed
 */
function recordQueryMetric(queryName, duration, error = null) {
  if (!queryMetrics.has(queryName)) {
    queryMetrics.set(queryName, {
      count: 0,
      totalTime: 0,
      minTime: Infinity,
      maxTime: 0,
      errors: 0,
      durations: [] // Keep recent durations for percentile calculation
    });
  }

  const metrics = queryMetrics.get(queryName);
  metrics.count++;
  metrics.totalTime += duration;
  metrics.minTime = Math.min(metrics.minTime, duration);
  metrics.maxTime = Math.max(metrics.maxTime, duration);
  if (error) metrics.errors++;

  // Keep last 100 durations for percentile calculation
  metrics.durations.push(duration);
  if (metrics.durations.length > 100) {
    metrics.durations.shift();
  }
}

/**
 * Get performance statistics for a specific query or all queries
 * @param {string|null} queryName - Query name or null for all queries
 * @returns {object} Performance statistics
 */
export function getPerformanceStats(queryName = null) {
  if (queryName) {
    const metrics = queryMetrics.get(queryName);
    if (!metrics) {
      return null;
    }
    return calculateQueryStats(queryName, metrics);
  }

  // Get stats for all queries
  const stats = {};
  for (const [name, metrics] of queryMetrics.entries()) {
    stats[name] = calculateQueryStats(name, metrics);
  }
  return stats;
}

/**
 * Calculate statistics for a query
 * @param {string} queryName - Name of the query
 * @param {object} metrics - Raw metrics
 * @returns {object} Calculated statistics
 */
function calculateQueryStats(queryName, metrics) {
  const avgTime = metrics.totalTime / metrics.count;

  // Calculate percentiles (P50, P95, P99)
  const sorted = [...metrics.durations].sort((a, b) => a - b);
  const p50 = sorted[Math.floor(sorted.length * 0.5)] || 0;
  const p95 = sorted[Math.floor(sorted.length * 0.95)] || 0;
  const p99 = sorted[Math.floor(sorted.length * 0.99)] || 0;

  // Determine performance rating
  let rating = 'EXCELLENT';
  if (avgTime > CRITICAL_QUERY_THRESHOLD) {
    rating = 'CRITICAL';
  } else if (avgTime > SLOW_QUERY_THRESHOLD) {
    rating = 'SLOW';
  } else if (avgTime > 50) {
    rating = 'GOOD';
  }

  return {
    name: queryName,
    count: metrics.count,
    avgTime: avgTime.toFixed(2) + 'ms',
    minTime: metrics.minTime.toFixed(2) + 'ms',
    maxTime: metrics.maxTime.toFixed(2) + 'ms',
    p50: p50.toFixed(2) + 'ms',
    p95: p95.toFixed(2) + 'ms',
    p99: p99.toFixed(2) + 'ms',
    errors: metrics.errors,
    errorRate: ((metrics.errors / metrics.count) * 100).toFixed(2) + '%',
    rating
  };
}

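The percentile math in `calculateQueryStats` is just "sort the retained durations, index at `floor(n * p)`, default to 0 when empty". Extracted as a standalone helper for illustration:

```javascript
// Percentile calculation as done in calculateQueryStats, shown in isolation.
function percentiles(durations) {
  const sorted = [...durations].sort((a, b) => a - b);
  return {
    p50: sorted[Math.floor(sorted.length * 0.5)] || 0,
    p95: sorted[Math.floor(sorted.length * 0.95)] || 0,
    p99: sorted[Math.floor(sorted.length * 0.99)] || 0,
  };
}
```

Note that `floor(n * p)` indexes one past the nearest-rank position, so with 100 samples 1..100 it reports p50 = 51 rather than 50; for a monitoring heuristic over the last 100 durations that bias is harmless.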
/**
 * Get overall performance summary
 * @returns {object} Summary of all query performance
 */
export function getPerformanceSummary() {
  if (queryMetrics.size === 0) {
    return {
      totalQueries: 0,
      totalTime: 0,
      avgTime: '0ms',
      slowQueries: 0,
      criticalQueries: 0,
      topSlowest: []
    };
  }

  let totalQueries = 0;
  let totalTime = 0;
  let slowQueries = 0;
  let criticalQueries = 0;
  const allStats = [];

  for (const [name, metrics] of queryMetrics.entries()) {
    const stats = calculateQueryStats(name, metrics);
    totalQueries += metrics.count;
    totalTime += metrics.totalTime;

    const avgTime = metrics.totalTime / metrics.count;
    if (avgTime > CRITICAL_QUERY_THRESHOLD) {
      criticalQueries++;
    } else if (avgTime > SLOW_QUERY_THRESHOLD) {
      slowQueries++;
    }

    allStats.push(stats);
  }

  // Sort by average time (slowest first)
  const topSlowest = allStats
    .sort((a, b) => parseFloat(b.avgTime) - parseFloat(a.avgTime))
    .slice(0, 10);

  return {
    totalQueries,
    totalTime: totalTime.toFixed(2) + 'ms',
    avgTime: (totalTime / totalQueries).toFixed(2) + 'ms',
    slowQueries,
    criticalQueries,
    topSlowest
  };
}

/**
 * Reset performance metrics (for testing)
 */
export function resetPerformanceMetrics() {
  queryMetrics.clear();
  console.log('Performance metrics cleared');
}

/**
 * Get slow queries (above threshold)
 * @param {number} threshold - Duration threshold in ms (default: 100ms)
 * @returns {Array} Array of slow query statistics
 */
export function getSlowQueries(threshold = SLOW_QUERY_THRESHOLD) {
  const slowQueries = [];

  for (const [name, metrics] of queryMetrics.entries()) {
    const avgTime = metrics.totalTime / metrics.count;
    if (avgTime > threshold) {
      slowQueries.push(calculateQueryStats(name, metrics));
    }
  }

  // Sort by average time (slowest first)
  return slowQueries.sort((a, b) => parseFloat(b.avgTime) - parseFloat(a.avgTime));
}

/**
 * Performance monitoring utility for database queries
 * @param {string} queryName - Name of the query
 * @param {Function} queryFn - Query function to track
 * @returns {Promise<any>} Query result
 */
export async function trackQuery(queryName, queryFn) {
  return trackQueryPerformance(queryName, queryFn);
}

/**
 * Check if performance is degrading (compares recent vs overall average)
 * @param {string} queryName - Query name to check
 * @param {number} windowSize - Number of recent queries to compare (default: 10)
 * @returns {object} Degradation status
 */
export function checkPerformanceDegradation(queryName, windowSize = 10) {
  const metrics = queryMetrics.get(queryName);
  if (!metrics || metrics.durations.length < windowSize * 2) {
    return {
      degraded: false,
      message: 'Insufficient data'
    };
  }

  // Recent queries (last N)
  const recentDurations = metrics.durations.slice(-windowSize);
  const avgRecent = recentDurations.reduce((a, b) => a + b, 0) / recentDurations.length;

  // Overall average
  const avgOverall = metrics.totalTime / metrics.count;

  // Check if recent is 2x worse than overall
  const degraded = avgRecent > avgOverall * 2;
|
||||
const change = ((avgRecent - avgOverall) / avgOverall * 100).toFixed(2) + '%';
|
||||
|
||||
return {
|
||||
degraded,
|
||||
avgRecent: avgRecent.toFixed(2) + 'ms',
|
||||
avgOverall: avgOverall.toFixed(2) + 'ms',
|
||||
change,
|
||||
message: degraded ? `Performance degraded by ${change}` : 'Performance stable'
|
||||
};
|
||||
}
|
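`checkPerformanceDegradation` flags a regression when the mean of the last `windowSize` samples exceeds twice the all-time mean. The arithmetic can be exercised in isolation; this is a minimal standalone sketch, and the `checkDegradation` helper and sample data below are hypothetical illustrations, not part of the module:

```javascript
// Standalone sketch of the 2x-regression heuristic: compare the mean of
// the most recent `windowSize` samples against the mean of all samples.
function checkDegradation(durations, windowSize = 10) {
  if (durations.length < windowSize * 2) {
    return { degraded: false, message: 'Insufficient data' };
  }
  const recent = durations.slice(-windowSize);
  const avgRecent = recent.reduce((a, b) => a + b, 0) / recent.length;
  const avgOverall = durations.reduce((a, b) => a + b, 0) / durations.length;
  return { degraded: avgRecent > avgOverall * 2, avgRecent, avgOverall };
}

// 90 fast samples (5ms) then 10 slow ones (50ms): overall mean is 9.5ms,
// recent mean is 50ms > 2 * 9.5ms, so the check flags degradation.
const samples = [...Array(90).fill(5), ...Array(10).fill(50)];
console.log(checkDegradation(samples).degraded); // true
```

One subtlety of the 2x test: with only `2 × windowSize` samples split evenly, the recent window contributes half the overall mean's weight, so `avgRecent > 2 * avgOverall` can never hold; the flag only becomes meaningful once older history dominates the average.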
@@ -84,4 +84,37 @@ export const jobsAPI = {
  getStatus: (jobId) => apiRequest(`/jobs/${jobId}`),
  getActive: () => apiRequest('/jobs/active'),
  getRecent: (limit = 10) => apiRequest(`/jobs?limit=${limit}`),
  cancel: (jobId) => apiRequest(`/jobs/${jobId}`, { method: 'DELETE' }),
};

// Admin API
export const adminAPI = {
  getStats: () => apiRequest('/admin/stats'),

  getUsers: () => apiRequest('/admin/users'),

  getUserDetails: (userId) => apiRequest(`/admin/users/${userId}`),

  updateUserRole: (userId, isAdmin) => apiRequest(`/admin/users/${userId}/role`, {
    method: 'POST',
    body: JSON.stringify({ isAdmin }),
  }),

  deleteUser: (userId) => apiRequest(`/admin/users/${userId}`, {
    method: 'DELETE',
  }),

  impersonate: (userId) => apiRequest(`/admin/impersonate/${userId}`, {
    method: 'POST',
  }),

  stopImpersonation: () => apiRequest('/admin/impersonate/stop', {
    method: 'POST',
  }),

  getImpersonationStatus: () => apiRequest('/admin/impersonation/status'),

  getActions: (limit = 50, offset = 0) => apiRequest(`/admin/actions?limit=${limit}&offset=${offset}`),

  getMyActions: (limit = 50, offset = 0) => apiRequest(`/admin/actions/my?limit=${limit}&offset=${offset}`),
};
src/frontend/src/lib/logger.js (new file, 192 lines)
@@ -0,0 +1,192 @@
/**
 * Frontend Logger
 *
 * Sends logs to a backend endpoint which writes to logs/frontend.log.
 * Respects the LOG_LEVEL environment variable from the backend.
 *
 * Usage:
 *   import { logger } from '$lib/logger';
 *   logger.info('User logged in', { userId: 123 });
 *   logger.error('Failed to fetch data', { error: err.message });
 */

// Log levels matching backend
const LOG_LEVELS = { debug: 0, info: 1, warn: 2, error: 3 };

// Current log level; defaults to info until fetchLogLevel() runs
let currentLogLevel = LOG_LEVELS.info;

// Buffer for batching logs (sends when buffer reaches this size or after timeout)
const logBuffer = [];
const BUFFER_SIZE = 10;
const BUFFER_TIMEOUT = 5000; // 5 seconds
let bufferTimeout = null;

// Fetch current log level from backend on initialization
async function fetchLogLevel() {
  try {
    // The backend does not yet expose its log level, so probe the health
    // endpoint for reachability and fall back to environment defaults:
    // debug in development, info in production
    const response = await fetch('/api/health');
    if (response.ok) {
      const isDev = import.meta.env.DEV;
      currentLogLevel = isDev ? LOG_LEVELS.debug : LOG_LEVELS.info;
    }
  } catch (err) {
    // Default to info if the backend is unreachable
    currentLogLevel = LOG_LEVELS.info;
  }
}

// Initialize log level
fetchLogLevel();

/**
 * Send logs to backend
 */
async function sendLogs(entries) {
  try {
    await fetch('/api/logs', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      credentials: 'include', // Include cookies for authentication
      body: JSON.stringify(entries),
    });
  } catch (err) {
    // Silent fail - don't break the app if logging fails
    console.error('Failed to send logs to backend:', err);
  }
}

/**
 * Flush log buffer
 */
function flushBuffer() {
  if (logBuffer.length === 0) return;

  const entries = [...logBuffer];
  logBuffer.length = 0; // Clear buffer

  if (bufferTimeout) {
    clearTimeout(bufferTimeout);
    bufferTimeout = null;
  }

  sendLogs(entries);
}

/**
 * Add log entry to buffer
 */
function addToBuffer(level, message, data) {
  // Skip entries below the current log level
  if (LOG_LEVELS[level] < currentLogLevel) return;

  logBuffer.push({
    level,
    message,
    data: data || undefined,
    timestamp: new Date().toISOString(),
  });

  // Flush immediately if the buffer is full; otherwise (re)arm the timeout
  if (logBuffer.length >= BUFFER_SIZE) {
    flushBuffer();
  } else {
    if (bufferTimeout) {
      clearTimeout(bufferTimeout);
    }
    bufferTimeout = setTimeout(flushBuffer, BUFFER_TIMEOUT);
  }
}
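The buffering logic in `flushBuffer`/`addToBuffer` is a classic size-or-timeout batcher. A minimal standalone sketch of the same pattern (the `createBatcher` factory and its parameters are hypothetical, not the logger's API):

```javascript
// Minimal sketch of size-or-timeout batching: entries accumulate in a
// buffer and `onFlush` receives them in batches, either when the buffer
// is full or when the idle timer fires.
function createBatcher(onFlush, size = 10, timeoutMs = 5000) {
  const buffer = [];
  let timer = null;

  function flush() {
    if (buffer.length === 0) return;
    const entries = buffer.splice(0, buffer.length); // drain the buffer
    if (timer) { clearTimeout(timer); timer = null; }
    onFlush(entries);
  }

  function push(entry) {
    buffer.push(entry);
    if (buffer.length >= size) {
      flush();                        // full: send immediately
    } else {
      if (timer) clearTimeout(timer); // re-arm the idle timer
      timer = setTimeout(flush, timeoutMs);
    }
  }

  return { push, flush };
}

// With size = 3, the third push triggers a synchronous flush.
const batches = [];
const batcher = createBatcher((entries) => batches.push(entries), 3);
batcher.push('a'); batcher.push('b'); batcher.push('c');
console.log(batches.length, batches[0].join('')); // 1 abc
```

Note that, as in `addToBuffer`, the timer is re-armed on every sub-threshold push, so under a steady trickle of entries the flush waits for a quiet period rather than firing a fixed interval after the first buffered entry.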

/**
 * Logger API
 */
export const logger = {
  /**
   * Log debug message
   */
  debug: (message, data) => {
    if (import.meta.env.DEV) {
      console.debug('[DEBUG]', message, data || '');
    }
    addToBuffer('debug', message, data);
  },

  /**
   * Log info message
   */
  info: (message, data) => {
    if (import.meta.env.DEV) {
      console.info('[INFO]', message, data || '');
    }
    addToBuffer('info', message, data);
  },

  /**
   * Log warning message
   */
  warn: (message, data) => {
    if (import.meta.env.DEV) {
      console.warn('[WARN]', message, data || '');
    }
    addToBuffer('warn', message, data);
  },

  /**
   * Log error message
   */
  error: (message, data) => {
    if (import.meta.env.DEV) {
      console.error('[ERROR]', message, data || '');
    }
    addToBuffer('error', message, data);
  },

  /**
   * Immediately flush the log buffer
   */
  flush: flushBuffer,

  /**
   * Set the log level (for testing purposes)
   */
  setLogLevel: (level) => {
    if (LOG_LEVELS[level] !== undefined) {
      currentLogLevel = LOG_LEVELS[level];
    }
  },
};

/**
 * Installs window-level handlers for automatic error logging.
 * Call once from +page.svelte or +layout.svelte.
 */
export function setupErrorLogging() {
  // Log unhandled errors
  if (typeof window !== 'undefined') {
    window.addEventListener('error', (event) => {
      logger.error('Unhandled error', {
        message: event.message,
        filename: event.filename,
        lineno: event.lineno,
        colno: event.colno,
        error: event.error?.stack,
      });
    });

    window.addEventListener('unhandledrejection', (event) => {
      logger.error('Unhandled promise rejection', {
        reason: event.reason,
        promise: event.promise?.toString(),
      });
    });
  }
}

export default logger;
@@ -27,6 +27,9 @@
        <a href="/awards" class="nav-link">Awards</a>
        <a href="/qsos" class="nav-link">QSOs</a>
        <a href="/settings" class="nav-link">Settings</a>
        {#if $auth.user?.isAdmin}
          <a href="/admin" class="nav-link admin-link">Admin</a>
        {/if}
        <button on:click={handleLogout} class="nav-link logout-btn">Logout</button>
      </div>
    </div>
@@ -119,6 +122,16 @@
    background-color: rgba(255, 107, 107, 0.1);
  }

  .admin-link {
    background-color: #ffc107;
    color: #000;
    font-weight: 600;
  }

  .admin-link:hover {
    background-color: #e0a800;
  }

  main {
    flex: 1;
    padding: 2rem 1rem;
@@ -1,14 +1,161 @@
<script>
  import { onMount } from 'svelte';
  import { onMount, onDestroy, tick } from 'svelte';
  import { auth } from '$lib/stores.js';
  import { jobsAPI } from '$lib/api.js';
  import { browser } from '$app/environment';

  onMount(() => {
  let jobs = [];
  let loading = true;
  let cancellingJobs = new Map(); // Track cancelling state per job
  let pollingInterval = null;

  async function loadJobs() {
    try {
      const response = await jobsAPI.getRecent(5);
      jobs = response.jobs || [];

      // Check if we need to update polling state
      await tick();
      updatePollingState();
    } catch (error) {
      console.error('Failed to load jobs:', error);
    }
  }

  function hasActiveJobs() {
    return jobs.some(job => job.status === 'pending' || job.status === 'running');
  }

  function updatePollingState() {
    if (hasActiveJobs()) {
      startPolling();
    } else {
      stopPolling();
    }
  }

  function startPolling() {
    if (pollingInterval) return; // Already polling

    pollingInterval = setInterval(async () => {
      await loadJobs();
    }, 2000); // Poll every 2 seconds
  }

  function stopPolling() {
    if (pollingInterval) {
      clearInterval(pollingInterval);
      pollingInterval = null;
    }
  }

  onMount(async () => {
    // Load user profile on mount if we have a token
    if (browser) {
      auth.loadProfile();
    }

    // Load recent jobs if authenticated
    if ($auth.user) {
      await loadJobs();
      loading = false;
    }
  });

  onDestroy(() => {
    stopPolling();
  });

  async function cancelJob(jobId) {
    if (!confirm('Are you sure you want to cancel this job? This will roll back all changes made by this sync.')) {
      return;
    }

    cancellingJobs.set(jobId, true);
    cancellingJobs = cancellingJobs; // Reassign so Svelte picks up the Map mutation

    try {
      const result = await jobsAPI.cancel(jobId);
      alert(result.message || 'Job cancelled successfully');
      // Reload jobs to show updated status
      await loadJobs();
    } catch (error) {
      alert('Failed to cancel job: ' + error.message);
    } finally {
      cancellingJobs.delete(jobId);
      cancellingJobs = cancellingJobs; // Reassign so Svelte picks up the Map mutation
    }
  }

  function canCancelJob(job) {
    // Only allow cancelling failed jobs or stale running jobs
    if (job.status === 'failed') {
      return true;
    }

    // Allow cancelling stale running jobs (>1 hour)
    if (job.status === 'running' && job.startedAt) {
      const started = new Date(job.startedAt);
      const now = new Date();
      const hoursSinceStart = (now - started) / (1000 * 60 * 60);
      return hoursSinceStart > 1;
    }

    return false;
  }

  function isJobStale(job) {
    return job.status === 'running' && job.startedAt &&
      (new Date() - new Date(job.startedAt)) > (1000 * 60 * 60);
  }
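The staleness rule shared by `canCancelJob` and `isJobStale` — a running job becomes cancellable once more than one hour has passed since `startedAt` — can be sketched as a small pure function. This is a hypothetical standalone helper for illustration, not the component's code:

```javascript
// Standalone sketch of the staleness rule: a running job is stale (and
// therefore cancellable) once startedAt is more than one hour in the past.
const STALE_MS = 60 * 60 * 1000; // 1 hour

function jobIsStale(job, now = Date.now()) {
  return job.status === 'running' &&
    Boolean(job.startedAt) &&
    (now - new Date(job.startedAt).getTime()) > STALE_MS;
}

const now = Date.parse('2024-01-01T12:00:00Z');
const fresh = { status: 'running', startedAt: '2024-01-01T11:30:00Z' }; // 30 min old
const stale = { status: 'running', startedAt: '2024-01-01T10:30:00Z' }; // 90 min old
console.log(jobIsStale(fresh, now), jobIsStale(stale, now)); // false true
```

Passing `now` explicitly keeps the rule deterministic and testable, whereas the inline version re-reads the clock on every call.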

  function getJobIcon(type) {
    return type === 'lotw_sync' ? '📡' : '🛰️';
  }

  function getJobLabel(type) {
    return type === 'lotw_sync' ? 'LoTW Sync' : 'DCL Sync';
  }

  function getStatusBadge(status) {
    const styles = {
      pending: 'bg-yellow-100 text-yellow-800',
      running: 'bg-blue-100 text-blue-800',
      completed: 'bg-green-100 text-green-800',
      failed: 'bg-red-100 text-red-800',
      cancelled: 'bg-purple-100 text-purple-800',
    };
    return styles[status] || 'bg-gray-100 text-gray-800';
  }

  function formatTime(timestamp) {
    if (!timestamp) return '-';
    const date = new Date(timestamp);
    return date.toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' });
  }

  function formatDate(timestamp) {
    if (!timestamp) return '-';
    const date = new Date(timestamp);
    const now = new Date();
    const diffMs = now - date;
    const diffMins = Math.floor(diffMs / 60000);
    const diffHours = Math.floor(diffMs / 3600000);
    const diffDays = Math.floor(diffMs / 86400000);

    if (diffMins < 1) return 'Just now';
    if (diffMins < 60) return `${diffMins}m ago`;
    if (diffHours < 24) return `${diffHours}h ago`;
    if (diffDays < 7) return `${diffDays}d ago`;
    return date.toLocaleDateString();
  }
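`formatDate` buckets elapsed time into minutes, hours, days, and then an absolute date. A minimal standalone sketch of the same bucket logic, with an explicit `now` parameter so the boundaries can be checked deterministically (`relativeTime` is a hypothetical helper, not the component's function):

```javascript
// Standalone sketch of the relative-time buckets in formatDate:
// minutes under an hour, hours under a day, days under a week,
// then fall back to an absolute date string.
function relativeTime(thenMs, nowMs) {
  const diffMins = Math.floor((nowMs - thenMs) / 60000);
  if (diffMins < 1) return 'Just now';
  if (diffMins < 60) return `${diffMins}m ago`;
  const diffHours = Math.floor(diffMins / 60);
  if (diffHours < 24) return `${diffHours}h ago`;
  const diffDays = Math.floor(diffHours / 24);
  if (diffDays < 7) return `${diffDays}d ago`;
  return new Date(thenMs).toLocaleDateString();
}

const now = Date.parse('2024-01-08T12:00:00Z');
relativeTime(Date.parse('2024-01-08T11:58:00Z'), now); // returns '2m ago'
relativeTime(Date.parse('2024-01-08T09:00:00Z'), now); // returns '3h ago'
relativeTime(Date.parse('2024-01-05T12:00:00Z'), now); // returns '3d ago'
```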

  function getDuration(job) {
    if (!job.startedAt || !job.completedAt) return null;
    const diff = new Date(job.completedAt) - new Date(job.startedAt);
    const seconds = Math.floor(diff / 1000);
    if (seconds < 60) return `${seconds}s`;
    const minutes = Math.floor(seconds / 60);
    return `${minutes}m ${seconds % 60}s`;
  }
</script>

<div class="container">
@@ -40,6 +187,99 @@
      </div>
    </div>

    <!-- Recent Sync Jobs -->
    <div class="jobs-section">
      <h2 class="section-title">🔄 Recent Sync Jobs</h2>
      {#if loading}
        <div class="loading-state">Loading jobs...</div>
      {:else if jobs.length === 0}
        <div class="empty-state">
          <p>No sync jobs yet. Sync your QSOs from LoTW or DCL to get started!</p>
          <div class="empty-actions">
            <a href="/settings" class="btn btn-secondary">Configure Credentials</a>
            <a href="/qsos" class="btn btn-primary">Sync QSOs</a>
          </div>
        </div>
      {:else}
        <div class="jobs-list">
          {#each jobs as job (job.id)}
            <div class="job-card" class:failed={job.status === 'failed'}>
              <div class="job-header">
                <div class="job-title">
                  <span class="job-icon">{getJobIcon(job.type)}</span>
                  <span class="job-name">{getJobLabel(job.type)}</span>
                  <span class="job-id">#{job.id}</span>
                </div>
                <span class="status-badge {getStatusBadge(job.status)}">
                  {job.status}
                </span>
              </div>

              <div class="job-meta">
                <span class="job-date" title={new Date(job.createdAt).toLocaleString()}>
                  {formatDate(job.createdAt)}
                </span>
                {#if job.startedAt}
                  <span class="job-time">{formatTime(job.startedAt)}</span>
                {/if}
                {#if getDuration(job)}
                  <span class="job-duration">({getDuration(job)})</span>
                {/if}
              </div>

              {#if job.status === 'failed' && job.error}
                <div class="job-error">
                  ❌ {job.error}
                </div>
              {:else if job.result}
                <div class="job-stats">
                  {#if job.result.total !== undefined}
                    <span class="stat-item">
                      <strong>{job.result.total}</strong> total
                    </span>
                  {/if}
                  {#if job.result.added !== undefined && job.result.added > 0}
                    <span class="stat-item stat-added">
                      +{job.result.added} added
                    </span>
                  {/if}
                  {#if job.result.updated !== undefined && job.result.updated > 0}
                    <span class="stat-item stat-updated">
                      ~{job.result.updated} updated
                    </span>
                  {/if}
                  {#if job.result.skipped !== undefined && job.result.skipped > 0}
                    <span class="stat-item stat-skipped">
                      {job.result.skipped} skipped
                    </span>
                  {/if}
                </div>
              {:else if job.status === 'running' || job.status === 'pending'}
                <div class="job-progress">
                  <span class="progress-text">
                    {job.status === 'pending' ? 'Waiting to start...' : isJobStale(job) ? 'Stale - no progress for over 1 hour' : 'Processing...'}
                  </span>
                </div>
              {/if}

              <!-- Cancel button for eligible jobs -->
              {#if canCancelJob(job)}
                <div class="job-actions">
                  <button
                    class="btn-cancel"
                    disabled={cancellingJobs.get(job.id)}
                    on:click|stopPropagation={() => cancelJob(job.id)}
                  >
                    {cancellingJobs.get(job.id) ? 'Cancelling...' : 'Cancel & Rollback'}
                  </button>
                </div>
              {/if}
            </div>
          {/each}
        </div>
      {/if}
    </div>

    <div class="info-box">
      <h3>Getting Started</h3>
      <ol>
@@ -191,4 +431,232 @@
    color: #666;
    line-height: 1.8;
  }

  /* Jobs Section */
  .jobs-section {
    margin-bottom: 2rem;
  }

  .section-title {
    font-size: 1.5rem;
    color: #333;
    margin-bottom: 1rem;
  }

  .loading-state,
  .empty-state {
    background: white;
    border: 1px solid #e0e0e0;
    border-radius: 8px;
    padding: 2rem;
    text-align: center;
    color: #666;
  }

  .empty-actions {
    display: flex;
    gap: 1rem;
    justify-content: center;
    margin-top: 1.5rem;
    flex-wrap: wrap;
  }

  .jobs-list {
    display: flex;
    flex-direction: column;
    gap: 1rem;
  }

  .job-card {
    background: white;
    border: 1px solid #e0e0e0;
    border-radius: 8px;
    padding: 1rem 1.25rem;
    box-shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
    transition: box-shadow 0.2s;
  }

  .job-card:hover {
    box-shadow: 0 2px 8px rgba(0, 0, 0, 0.12);
  }

  .job-card.failed {
    border-left: 4px solid #dc3545;
  }

  .job-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    margin-bottom: 0.5rem;
  }

  .job-title {
    display: flex;
    align-items: center;
    gap: 0.5rem;
  }

  .job-icon {
    font-size: 1.5rem;
  }

  .job-name {
    font-weight: 600;
    color: #333;
    font-size: 1.1rem;
  }

  .job-id {
    font-size: 0.85rem;
    color: #999;
    font-family: monospace;
  }

  .status-badge {
    padding: 0.25rem 0.75rem;
    border-radius: 12px;
    font-size: 0.85rem;
    font-weight: 500;
    text-transform: capitalize;
  }

  .bg-yellow-100 {
    background-color: #fef3c7;
  }

  .bg-blue-100 {
    background-color: #dbeafe;
  }

  .bg-green-100 {
    background-color: #d1fae5;
  }

  .bg-red-100 {
    background-color: #fee2e2;
  }

  .text-yellow-800 {
    color: #92400e;
  }

  .text-blue-800 {
    color: #1e40af;
  }

  .text-green-800 {
    color: #065f46;
  }

  .text-red-800 {
    color: #991b1b;
  }

  .bg-purple-100 {
    background-color: #f3e8ff;
  }

  .text-purple-800 {
    color: #6b21a8;
  }

  .job-meta {
    display: flex;
    gap: 0.75rem;
    font-size: 0.9rem;
    color: #666;
    margin-bottom: 0.5rem;
    flex-wrap: wrap;
  }

  .job-date {
    font-weight: 500;
  }

  .job-time,
  .job-duration {
    color: #999;
  }

  .job-error {
    background: #fee2e2;
    color: #991b1b;
    padding: 0.75rem;
    border-radius: 4px;
    font-size: 0.95rem;
    margin-top: 0.5rem;
  }

  .job-stats {
    display: flex;
    gap: 1rem;
    flex-wrap: wrap;
    margin-top: 0.5rem;
  }

  .stat-item {
    font-size: 0.9rem;
    color: #666;
    padding: 0.25rem 0.5rem;
    background: #f8f9fa;
    border-radius: 4px;
  }

  .stat-item strong {
    color: #333;
  }

  .stat-added {
    color: #065f46;
    background: #d1fae5;
  }

  .stat-updated {
    color: #1e40af;
    background: #dbeafe;
  }

  .stat-skipped {
    color: #92400e;
    background: #fef3c7;
  }

  .job-progress {
    margin-top: 0.5rem;
  }

  .progress-text {
    color: #1e40af;
    font-size: 0.9rem;
    font-style: italic;
  }

  .job-actions {
    margin-top: 0.75rem;
    display: flex;
    justify-content: flex-end;
  }

  .btn-cancel {
    padding: 0.4rem 0.8rem;
    font-size: 0.85rem;
    border: 1px solid #dc3545;
    background: white;
    color: #dc3545;
    border-radius: 4px;
    cursor: pointer;
    transition: all 0.2s;
    font-weight: 500;
  }

  .btn-cancel:hover:not(:disabled) {
    background: #dc3545;
    color: white;
  }

  .btn-cancel:disabled {
    opacity: 0.6;
    cursor: not-allowed;
  }
</style>
src/frontend/src/routes/admin/+page.svelte (new file, 1016 lines)
File diff suppressed because it is too large.
@@ -577,6 +577,7 @@

          <select bind:value={filters.confirmationType} on:change={applyFilters} class="confirmation-filter">
            <option value="all">All QSOs</option>
            <option value="any">Confirmed by at least 1 service</option>
            <option value="lotw">LoTW Only</option>
            <option value="dcl">DCL Only</option>
            <option value="both">Both Confirmed</option>
@@ -25,14 +25,12 @@
    try {
      loading = true;
      const response = await authAPI.getProfile();
      console.log('Loaded profile:', response.user);
      if (response.user) {
        lotwUsername = response.user.lotwUsername || '';
        lotwPassword = ''; // Never pre-fill password for security
        hasLoTWCredentials = !!(response.user.lotwUsername && response.user.lotwPassword);
        dclApiKey = response.user.dclApiKey || '';
        hasDCLCredentials = !!response.user.dclApiKey;
        console.log('Has LoTW credentials:', hasLoTWCredentials, 'Has DCL credentials:', hasDCLCredentials);
      }
    } catch (err) {
      console.error('Failed to load profile:', err);
@@ -50,8 +48,6 @@
      error = null;
      successLoTW = false;

      console.log('Saving LoTW credentials:', { lotwUsername, hasPassword: !!lotwPassword });

      await authAPI.updateLoTWCredentials({
        lotwUsername,
        lotwPassword
@@ -78,8 +74,6 @@
      error = null;
      successDCL = false;

      console.log('Saving DCL credentials:', { hasApiKey: !!dclApiKey });

      await authAPI.updateDCLCredentials({
        dclApiKey
      });
@@ -5,29 +5,42 @@ import { defineConfig } from 'vite';
function suppressURIErrorPlugin() {
  return {
    name: 'suppress-uri-error',
    enforce: 'pre', // Run this plugin before others
    configureServer(server) {
      server.middlewares.use((req, res, next) => {
        // Intercept malformed requests before they reach Vite's middleware
        try {
          // Try to decode the URL to catch malformed URIs early
          if (req.url) {
            decodeURI(req.url);
      // Return a function that will be called after all plugins are configured
      // This ensures our middleware is added at the correct time
      return () => {
        // Add middleware BEFORE all other middlewares
        // We insert it at position 0 to ensure it runs first
        server.middlewares.stack.unshift({
          route: '',
          handle: (req, res, next) => {
            // Intercept malformed requests before they reach SvelteKit
            try {
              // Try to decode the URL to catch malformed URIs early
              if (req.url) {
                decodeURI(req.url);
                // Also try the full URL construction that SvelteKit does
                const base = `${server.config.server.https ? 'https' : 'http'}://${
                  req.headers[':authority'] || req.headers.host || 'localhost'
                }`;
                decodeURI(new URL(base + req.url).pathname);
              }
            } catch (e) {
              // Silently ignore malformed URIs from browser extensions
              res.writeHead(200, { 'Content-Type': 'text/plain' });
              res.end('OK');
              return;
            }
        } catch (e) {
          // Silently ignore malformed URIs from browser extensions
          // Don't call next(), just end the response
          res.writeHead(200, { 'Content-Type': 'text/plain' });
          res.end('OK');
          return;
        }
        next();
      });
            next();
          }});
      };
    }
  };
}

export default defineConfig({
  plugins: [sveltekit(), suppressURIErrorPlugin()],
  plugins: [suppressURIErrorPlugin(), sveltekit()],
  server: {
    host: 'localhost',
    port: 5173,