Compare commits

10 commits: `e09ab94e63` ... `0020f0318d`

- `0020f0318d`
- `af43f8954c`
- `233888c44f`
- `0161ad47a8`
- `645f7863e7`
- `9e73704220`
- `7f77c3adc9`
- `720144627e`
- `223461f536`
- `27d2ef14ef`
**.gitignore** (vendored, 1 line changed)

@@ -41,3 +41,4 @@ report.[0-9]_.[0-9]_.[0-9]_.[0-9]_.json
 *.db
 *.sqlite
 *.sqlite3
+sample
**CLAUDE.md** (210 lines changed)

@@ -19,6 +19,22 @@ Default to using Bun instead of Node.js.
 - Prefer `Bun.file` over `node:fs`'s readFile/writeFile
 - Bun.$`ls` instead of execa.
+
+## Logging
+
+The application uses a custom logger in `src/backend/config.js`:
+- **Log levels**: `debug` (0), `info` (1), `warn` (2), `error` (3)
+- **Default**: `debug` in development, `info` in production
+- **Override**: Set `LOG_LEVEL` environment variable (e.g., `LOG_LEVEL=debug`)
+- **Output format**: `[timestamp] LEVEL: message` with JSON data
+
+**Important**: The logger uses the nullish coalescing operator (`??`) to handle log levels. This ensures that `debug` (level 0) is not treated as falsy.
+
+Example `.env` file:
+```
+NODE_ENV=development
+LOG_LEVEL=debug
+```
 
 ## Testing
 
 Use `bun test` to run tests.
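The `??` distinction called out in the Logging section above can be demonstrated directly; `debug` maps to level 0, which `||` would silently discard:

```javascript
const logLevels = { debug: 0, info: 1, warn: 2, error: 3 };

// With ||, a configured LOG_LEVEL of 'debug' (level 0) is falsy and
// falls back to the default of 1 ('info') — the original bug.
const withOr = logLevels['debug'] || 1;      // 1 (wrong: debug discarded)

// With ??, only null/undefined trigger the fallback, so level 0 survives.
const withNullish = logLevels['debug'] ?? 1; // 0 (correct)

// Unknown level names still get the default either way.
const unknown = logLevels['verbose'] ?? 1;   // 1
```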
@@ -130,8 +146,10 @@ The award system is JSON-driven and located in `award-definitions/` directory. E
 2. **`dok`**: Count unique DOK (DARC Ortsverband Kennung) combinations
    - `target`: Number required
    - `confirmationType`: "dcl" (DARC Community Logbook)
+   - `filters`: Optional filters (band, mode, etc.) for award variants
    - Counts unique (DOK, band, mode) combinations
    - Only DCL-confirmed QSOs count
+   - Example variants: DLD 80m, DLD CW, DLD 80m CW
 
 3. **`points`**: Point-based awards
    - `stations`: Array of {callsign, points}
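Counting unique (DOK, band, mode) combinations reduces to deduplicating on a composite key. A minimal sketch of the `dok` type described above — the field names come from the notes, and treating `'Y'` as the confirmed status is an assumption based on the ADIF Y/N/? convention:

```javascript
// Count unique (DOK, band, mode) combinations among DCL-confirmed QSOs
// (sketch of the dok award type; 'Y' as confirmed status is assumed).
function countDOKCombinations(qsos) {
  const seen = new Set();
  for (const qso of qsos) {
    if (!qso.darcDok) continue;               // skip QSOs without a DOK
    if (qso.dclQslRstatus !== 'Y') continue;  // only DCL-confirmed QSOs count
    seen.add(`${qso.darcDok}/${qso.band}/${qso.mode}`);
  }
  return seen.size;
}
```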
@@ -165,17 +183,49 @@ The award system is JSON-driven and located in `award-definitions/` directory. E
 
 **ADIF Parser**: `src/backend/utils/adif-parser.js`
 - `parseADIF(adifData)`: Parse ADIF format into QSO records
+  - Handles case-insensitive `<EOR>` delimiters (supports `<EOR>`, `<eor>`, `<Eor>`)
+  - Uses `matchAll()` for reliable field parsing
+  - Skips header records automatically
 - `parseDCLResponse(response)`: Parse DCL's JSON response format `{ "adif": "..." }`
 - `normalizeBand(band)`: Standardize band names (80m, 40m, etc.)
 - `normalizeMode(mode)`: Standardize mode names (CW, FT8, SSB, etc.)
 - Used by both LoTW and DCL services for consistency
+
+**Job Queue Service**: `src/backend/services/job-queue.service.js`
+- Manages async background jobs for LoTW and DCL sync
+- `enqueueJob(userId, jobType)`: Queue a sync job ('lotw_sync' or 'dcl_sync')
+- `processJobAsync(jobId, userId, jobType)`: Process job asynchronously
+- `getUserActiveJob(userId, jobType)`: Get active job for user (optional type filter)
+- `getJobStatus(jobId)`: Get job status with parsed result
+- `updateJobProgress(jobId, progressData)`: Update job progress during processing
+- Supports concurrent LoTW and DCL sync jobs
+- Job types: 'lotw_sync', 'dcl_sync'
+- Job status: 'pending', 'running', 'completed', 'failed'
+
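The enqueue/process flow above can be sketched in memory. The real service persists jobs in the database, so this is only an illustration of the semantics (one active job per user and type, concurrent LoTW/DCL jobs, the four statuses); the `worker` callback stands in for the actual sync logic:

```javascript
// In-memory sketch of the job-queue semantics described above.
const jobs = new Map();
let nextId = 1;

function getUserActiveJob(userId, jobType = null) {
  for (const job of jobs.values()) {
    if (job.userId === userId &&
        (job.status === 'pending' || job.status === 'running') &&
        (jobType === null || job.type === jobType)) {
      return job;
    }
  }
  return null;
}

function enqueueJob(userId, jobType) {
  // Refuse to queue a second active job of the same type for the same user
  const existing = getUserActiveJob(userId, jobType);
  if (existing) return { success: false, existingJob: existing.id };
  const id = nextId++;
  jobs.set(id, { id, userId, type: jobType, status: 'pending', progress: null, result: null });
  return { success: true, jobId: id };
}

function updateJobProgress(jobId, progressData) {
  jobs.get(jobId).progress = progressData;
}

async function processJobAsync(jobId, worker) {
  const job = jobs.get(jobId);
  job.status = 'running';
  try {
    job.result = await worker();
    job.status = 'completed';
  } catch (err) {
    job.status = 'failed';
    job.result = { error: err.message };
  }
}
```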
+**Backend API Routes** (`src/backend/index.js`):
+- `POST /api/lotw/sync`: Queue LoTW sync job
+- `POST /api/dcl/sync`: Queue DCL sync job
+- `GET /api/jobs/:jobId`: Get job status
+- `GET /api/jobs/active`: Get active job for current user
+- `GET /*`: Serves static files from `src/frontend/build/` with SPA fallback
+
+**SPA Routing**: The backend serves the SvelteKit frontend build from `src/frontend/build/`.
+- Paths with file extensions (`.js`, `.css`, etc.) are served as static files
+- Paths without extensions (e.g., `/qsos`, `/awards`) are served `index.html` for client-side routing
+- Common missing files like `/favicon.ico` return 404 immediately
+- If the frontend build is missing entirely, returns a user-friendly 503 HTML page
+- Prevents ugly Bun error pages when accessing client-side routes via curl or non-JS clients
+
 **DCL Service**: `src/backend/services/dcl.service.js`
 - `fetchQSOsFromDCL(dclApiKey, sinceDate)`: Fetch from DCL API
+  - API Endpoint: `https://dings.dcl.darc.de/api/adiexport`
+  - Request: POST with JSON body `{ key, limit: 50000, qsl_since, qso_since, cnf_only }`
 - `parseDCLJSONResponse(jsonResponse)`: Parse example/test payloads
 - `syncQSOs(userId, dclApiKey, sinceDate, jobId)`: Sync QSOs to database
 - `getLastDCLQSLDate(userId)`: Get last QSL date for incremental sync
-- Ready for when DCL publishes their API
+- Debug logging (when `LOG_LEVEL=debug`) shows API params with redacted key (first/last 4 chars)
+- Fully implemented and functional
+- **Note**: DCL API is a custom prototype by DARC; contact DARC for API specification details
+
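The DCL request described above can be sketched with a plain `fetch` call. The endpoint and body fields come from the notes; splitting out a `buildDCLRequestBody` helper is my own structuring, and the YYYYMMDD formatting of `qsl_since` follows the incremental-sync note below:

```javascript
// Build the POST body for the DCL export API (field names as documented).
function buildDCLRequestBody(dclApiKey, sinceDate = null) {
  return {
    key: dclApiKey,
    limit: 50000,
    // qsl_since uses YYYYMMDD for incremental sync, null for a full export
    qsl_since: sinceDate ? sinceDate.toISOString().slice(0, 10).replaceAll('-', '') : null,
    qso_since: null,
    cnf_only: null,
  };
}

// Sketch of the fetch itself; parsing of the returned { adif } JSON is done
// by parseDCLResponse in src/backend/utils/adif-parser.js.
async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) {
  const res = await fetch('https://dings.dcl.darc.de/api/adiexport', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildDCLRequestBody(dclApiKey, sinceDate)),
  });
  if (!res.ok) throw new Error(`DCL export failed: ${res.status}`);
  return res.json();
}
```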
 ### DLD Award Implementation (COMPLETED)
 
@@ -214,6 +264,20 @@ The DLD (Deutschland Diplom) award was recently implemented:
 
 **Documentation**: See `docs/DOCUMENTATION.md` for complete documentation including DLD award example.
+
+**Frontend**: `src/frontend/src/routes/qsos/+page.svelte`
+- Separate sync buttons for LoTW (blue) and DCL (orange)
+- Independent progress tracking for each sync type
+- Both syncs can run simultaneously
+- Job polling every 2 seconds for status updates
+- Import log displays after sync completion
+- Real-time QSO table refresh after sync
+
+**Frontend API** (`src/frontend/src/lib/api.js`):
+- `qsosAPI.syncFromLoTW()`: Trigger LoTW sync
+- `qsosAPI.syncFromDCL()`: Trigger DCL sync
+- `jobsAPI.getStatus(jobId)`: Poll job status
+- `jobsAPI.getActive()`: Get active job on page load
+
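The 2-second polling loop described above can be sketched as follows. It assumes `jobsAPI.getStatus` resolves to an object with a `status` field matching the job-queue statuses; the exact response shape is an assumption:

```javascript
// Poll a job until it finishes, then report the final job object (sketch).
function pollJob(jobsAPI, jobId, onDone, intervalMs = 2000) {
  const timer = setInterval(async () => {
    const job = await jobsAPI.getStatus(jobId);
    if (job.status === 'completed' || job.status === 'failed') {
      clearInterval(timer);
      onDone(job); // e.g. show the import log and refresh the QSO table
    }
  }, intervalMs);
  return timer;
}
```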
 ### Adding New Awards
 
 To add a new award:
@@ -226,32 +290,107 @@ To add a new award:
 6. Update documentation in `docs/DOCUMENTATION.md`
 7. Test with sample QSO data
+
+### Creating DLD Award Variants
+
+The DOK award type supports filters to create award variants. Examples:
+
+**DLD on 80m** (`dld-80m.json`):
+```json
+{
+  "id": "dld-80m",
+  "name": "DLD 80m",
+  "description": "Confirm 100 unique DOKs on 80m",
+  "caption": "Contact 100 different DOKs on the 80m band.",
+  "category": "darc",
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "displayField": "darcDok",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "band", "operator": "eq", "value": "80m" }
+      ]
+    }
+  }
+}
+```
+
+**DLD in CW mode** (`dld-cw.json`):
+```json
+{
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "mode", "operator": "eq", "value": "CW" }
+      ]
+    }
+  }
+}
+```
+
+**DLD on 80m using CW** (combined filters, `dld-80m-cw.json`):
+```json
+{
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "band", "operator": "eq", "value": "80m" },
+        { "field": "mode", "operator": "eq", "value": "CW" }
+      ]
+    }
+  }
+}
+```
+
+**Available filter operators**:
+- `eq`: equals
+- `ne`: not equals
+- `in`: in array
+- `nin`: not in array
+- `contains`: contains substring
+
+**Available filter fields**: Any QSO field (band, mode, callsign, grid, state, satName, etc.)
+
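The operator list above can be implemented with a small recursive evaluator. This is a sketch, not the project's actual `applyFilters` implementation — its signature is inferred from the diff below, and support for an `OR` group operator is my own addition (only `AND` appears in the examples):

```javascript
// Evaluate a filter tree like { operator: "AND", filters: [...] } against a
// QSO object, using the operators documented above.
function matchesFilter(qso, node) {
  if (node.filters) {
    // Group node: combine child results with AND (OR support is assumed)
    const results = node.filters.map((child) => matchesFilter(qso, child));
    return node.operator === 'OR' ? results.some(Boolean) : results.every(Boolean);
  }
  const value = qso[node.field];
  switch (node.operator) {
    case 'eq': return value === node.value;
    case 'ne': return value !== node.value;
    case 'in': return node.value.includes(value);
    case 'nin': return !node.value.includes(value);
    case 'contains': return typeof value === 'string' && value.includes(node.value);
    default: return false;
  }
}

function applyFilters(qsos, filters) {
  return qsos.filter((qso) => matchesFilter(qso, filters));
}
```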
 ### Confirmation Systems
 
 - **LoTW (Logbook of The World)**: ARRL's confirmation system
   - Service: `src/backend/services/lotw.service.js`
+  - API: `https://lotw.arrl.org/lotwuser/lotwreport.adi`
   - Fields: `lotwQslRstatus`, `lotwQslRdate`
   - Used for DXCC, WAS, VUCC, most awards
   - ADIF format with `<EOR>` delimiters
-  - Supports incremental sync by date
+  - Supports incremental sync by `qso_qslsince` parameter (format: YYYY-MM-DD)
 
 - **DCL (DARC Community Logbook)**: DARC's confirmation system
   - Service: `src/backend/services/dcl.service.js`
+  - API: `https://dings.dcl.darc.de/api/adiexport`
   - Fields: `dclQslRstatus`, `dclQslRdate`
   - DOK fields: `darcDok` (partner's DOK), `myDarcDok` (user's DOK)
   - Required for DLD award
   - German amateur radio specific
-  - API in development (parser ready)
+  - Request format: POST JSON `{ key, limit, qsl_since, qso_since, cnf_only }`
   - Response format: JSON with ADIF string in `adif` field
-  - Supports DOK (DARC Ortsverband Kennung) data
+  - Supports incremental sync by `qsl_since` parameter (format: YYYYMMDD)
+  - Updates QSOs only if confirmation data has changed
 
 ### ADIF Format
 
 Both LoTW and DCL return data in ADIF (Amateur Data Interchange Format):
 - Field format: `<FIELD_NAME:length>value`
-- Record delimiter: `<EOR>` (end of record)
+- Record delimiter: `<EOR>` (end of record, case-insensitive)
 - Header ends with: `<EOH>` (end of header)
 - Example: `<CALL:5>DK0MU<BAND:3>80m<QSO_DATE:8>20250621<EOR>`
+- **Important**: Parser handles case-insensitive `<EOR>`, `<eor>`, `<Eor>` tags
 
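The field format and case-insensitive `<EOR>` handling above can be sketched with `matchAll()`. This is a simplified stand-in for `parseADIF` in `src/backend/utils/adif-parser.js`, not the actual implementation:

```javascript
// Minimal ADIF parse sketch: drop the header, split records on a
// case-insensitive <EOR>, then read <FIELD:length>value tokens per record.
function parseADIF(adifData) {
  const eoh = adifData.search(/<eoh>/i);
  const body = eoh === -1 ? adifData : adifData.slice(eoh + 5); // skip '<EOH>'
  const records = [];
  for (const chunk of body.split(/<eor>/i)) {
    const record = {};
    for (const m of chunk.matchAll(/<([A-Za-z_][A-Za-z_0-9]*):(\d+)>/g)) {
      const [tag, name, len] = m;
      const start = m.index + tag.length;
      record[name.toUpperCase()] = chunk.slice(start, start + Number(len));
    }
    if (Object.keys(record).length > 0) records.push(record);
  }
  return records;
}
```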
 **DCL-specific fields**:
 - `DCL_QSL_RCVD`: DCL confirmation status (Y/N/?)
@@ -262,10 +401,69 @@ Both LoTW and DCL return data in ADIF (Amateur Data Interchange Format):
 
 ### Recent Commits
 
+- `233888c`: fix: make ADIF parser case-insensitive for EOR delimiter
+  - **Critical bug**: LoTW uses lowercase `<eor>` tags, parser was splitting on uppercase `<EOR>`
+  - Caused 242K+ QSOs to be parsed as 1 giant record with fields overwriting each other
+  - Changed to case-insensitive regex: `new RegExp('<eor>', 'gi')`
+  - Replaced `regex.exec()` while loop with `matchAll()` for-of iteration
+  - Now correctly imports all QSOs from large LoTW reports
+- `645f786`: fix: add missing timeOn field to LoTW duplicate detection
+  - LoTW sync was missing `timeOn` in duplicate detection query
+  - Multiple QSOs with same callsign/date/band/mode but different times were treated as duplicates
+  - Now matches DCL sync logic: `userId, callsign, qsoDate, timeOn, band, mode`
+- `7f77c3a`: feat: add filter support for DOK awards
+  - DOK award type now supports filtering by band, mode, and other QSO fields
+  - Allows creating award variants like DLD 80m, DLD CW, DLD 80m CW
+  - Uses existing filter system with eq, ne, in, nin, contains operators
+  - Example awards created: dld-80m, dld-40m, dld-cw, dld-80m-cw
+- `9e73704`: docs: update CLAUDE.md with DLD award variants documentation
+- `7201446`: fix: return proper HTML for SPA routes instead of Bun error page
+  - When accessing client-side routes (like /qsos) via curl or non-JS clients, the server attempted to open them as static files, causing Bun to throw an unhandled ENOENT error that showed an ugly error page
+  - Now checks if a path has a file extension before attempting to serve it
+  - Paths without extensions are immediately served index.html for SPA routing
+  - Also improves the 503 error page with user-friendly HTML when frontend build is missing
+- `223461f`: fix: enable debug logging and improve DCL sync observability
+- `27d2ef1`: fix: preserve DOK data when DCL doesn't send values
+  - DCL sync only updates DOK/grid fields when DCL provides non-empty values
+  - Prevents accidentally clearing DOK data from manual entry or other sources
+  - Preserves existing DOK when DCL syncs QSO without DOK information
+- `e09ab94`: feat: skip QSOs with unchanged confirmation data
+  - LoTW/DCL sync only updates QSOs if confirmation data has changed
+  - Tracks added, updated, and skipped QSO counts
+  - LoTW: Checks if lotwQslRstatus or lotwQslRdate changed
+  - DCL: Checks if dclQslRstatus, dclQslRdate, darcDok, myDarcDok, or grid changed
+- `3592dbb`: feat: add import log showing synced QSOs
+  - Backend returns addedQSOs and updatedQSOs arrays in sync result
+  - Frontend displays import log with callsign, date, band, mode for each QSO
+  - Separate sections for "New QSOs" and "Updated QSOs"
+  - Sync summary shows total, added, updated, skipped counts
 - `8a1a580`: feat: implement DCL ADIF parser and service integration
   - Add shared ADIF parser utility (src/backend/utils/adif-parser.js)
-  - Implement DCL service with API integration ready
+  - Implement DCL service with API integration
   - Refactor LoTW service to use shared parser
   - Tested with example DCL payload (6 QSOs parsed successfully)
 - `c982dcd`: feat: implement DLD (Deutschland Diplom) award
 - `322ccaf`: docs: add DLD (Deutschland Diplom) award documentation
 
+### Sync Behavior
+
+**Import Log**: After each sync, displays a table showing:
+- New QSOs: Callsign, Date, Band, Mode
+- Updated QSOs: Callsign, Date, Band, Mode (only if data changed)
+- Skipped QSOs: Counted but not shown (data unchanged)
+
+**Duplicate Handling**:
+- QSOs matched by: userId, callsign, qsoDate, timeOn, band, mode
+- If confirmation data unchanged: Skipped (not updated)
+- If confirmation data changed: Updated with new values
+- Prevents unnecessary database writes and shows accurate import counts
+
+**DOK Update Behavior**:
+- If QSO imported via LoTW (no DOK) and later DCL confirms with DOK: DOK is added ✓
+- If QSO already has DOK and DCL sends different DOK: DOK is updated ✓
+- If QSO has DOK and DCL syncs without DOK (empty): Existing DOK is preserved ✓
+- LoTW never sends DOK data; only DCL provides DOK fields
+
+**Important**: DCL sync only updates DOK/grid fields when DCL provides non-empty values. This prevents accidentally clearing DOK data that was manually entered or imported from other sources.
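The DOK update rules above amount to a "prefer incoming non-empty values" merge. A sketch with field names from the notes — the helper itself is hypothetical, not the project's actual sync code:

```javascript
// Merge DCL-provided fields into an existing QSO, preserving existing
// DOK/grid data when DCL sends empty values (rules as documented above).
function mergeDCLFields(existing, incoming) {
  const merged = { ...existing };
  for (const field of ['darcDok', 'myDarcDok', 'grid']) {
    if (incoming[field]) merged[field] = incoming[field]; // only non-empty values win
  }
  return merged;
}
```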
**award-definitions/dld-40m.json** (new file, 19 lines)

@@ -0,0 +1,19 @@
+{
+  "id": "dld-40m",
+  "name": "DLD 40m",
+  "description": "Confirm 100 unique DOKs on 40m",
+  "caption": "Contact and confirm stations with 100 unique DOKs (DARC Ortsverband Kennung) on the 40m band. Only DCL-confirmed QSOs with valid DOK information on 40m count toward this award.",
+  "category": "darc",
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "displayField": "darcDok",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "band", "operator": "eq", "value": "40m" }
+      ]
+    }
+  }
+}
**award-definitions/dld-80m-cw.json** (new file, 20 lines)

@@ -0,0 +1,20 @@
+{
+  "id": "dld-80m-cw",
+  "name": "DLD 80m CW",
+  "description": "Confirm 100 unique DOKs on 80m using CW",
+  "caption": "Contact and confirm stations with 100 unique DOKs (DARC Ortsverband Kennung) on the 80m band using CW mode. Only DCL-confirmed QSOs with valid DOK information on 80m CW count toward this award.",
+  "category": "darc",
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "displayField": "darcDok",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "band", "operator": "eq", "value": "80m" },
+        { "field": "mode", "operator": "eq", "value": "CW" }
+      ]
+    }
+  }
+}
**award-definitions/dld-80m.json** (new file, 19 lines)

@@ -0,0 +1,19 @@
+{
+  "id": "dld-80m",
+  "name": "DLD 80m",
+  "description": "Confirm 100 unique DOKs on 80m",
+  "caption": "Contact and confirm stations with 100 unique DOKs (DARC Ortsverband Kennung) on the 80m band. Only DCL-confirmed QSOs with valid DOK information on 80m count toward this award.",
+  "category": "darc",
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "displayField": "darcDok",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "band", "operator": "eq", "value": "80m" }
+      ]
+    }
+  }
+}
**award-definitions/dld-cw.json** (new file, 19 lines)

@@ -0,0 +1,19 @@
+{
+  "id": "dld-cw",
+  "name": "DLD CW",
+  "description": "Confirm 100 unique DOKs using CW mode",
+  "caption": "Contact and confirm stations with 100 unique DOKs (DARC Ortsverband Kennung) using CW (Morse code). Each unique DOK on CW counts separately. Only DCL-confirmed QSOs with valid DOK information count toward this award.",
+  "category": "darc",
+  "rules": {
+    "type": "dok",
+    "target": 100,
+    "confirmationType": "dcl",
+    "displayField": "darcDok",
+    "filters": {
+      "operator": "AND",
+      "filters": [
+        { "field": "mode", "operator": "eq", "value": "CW" }
+      ]
+    }
+  }
+}
**src/backend/config.js**

@@ -17,7 +17,7 @@ export const LOG_LEVEL = process.env.LOG_LEVEL || (isDevelopment ? 'debug' : 'in
 // ===================================================================
 
 const logLevels = { debug: 0, info: 1, warn: 2, error: 3 };
-const currentLogLevel = logLevels[LOG_LEVEL] || 1;
+const currentLogLevel = logLevels[LOG_LEVEL] ?? 1;
 
 function log(level, message, data) {
   if (logLevels[level] < currentLogLevel) return;
**src/backend/index.js**

@@ -1,7 +1,7 @@
 import { Elysia, t } from 'elysia';
 import { cors } from '@elysiajs/cors';
 import { jwt } from '@elysiajs/jwt';
-import { JWT_SECRET, logger } from './config.js';
+import { JWT_SECRET, logger, LOG_LEVEL } from './config.js';
 import {
   registerUser,
   authenticateUser,
@@ -283,13 +283,13 @@ const app = new Elysia()
   }
 
   try {
-    const result = await enqueueJob(user.id);
+    const result = await enqueueJob(user.id, 'lotw_sync');
 
     if (!result.success && result.existingJob) {
       return {
         success: true,
         jobId: result.existingJob,
-        message: 'A sync job is already running',
+        message: 'A LoTW sync job is already running',
       };
     }
 
@@ -299,7 +299,41 @@ const app = new Elysia()
     set.status = 500;
     return {
       success: false,
-      error: `Failed to queue sync job: ${error.message}`,
+      error: `Failed to queue LoTW sync job: ${error.message}`,
+    };
+  }
+})
+
+/**
+ * POST /api/dcl/sync
+ * Queue a DCL sync job (requires authentication)
+ * Returns immediately with job ID
+ */
+.post('/api/dcl/sync', async ({ user, set }) => {
+  if (!user) {
+    logger.warn('/api/dcl/sync: Unauthorized access attempt');
+    set.status = 401;
+    return { success: false, error: 'Unauthorized' };
+  }
+
+  try {
+    const result = await enqueueJob(user.id, 'dcl_sync');
+
+    if (!result.success && result.existingJob) {
+      return {
+        success: true,
+        jobId: result.existingJob,
+        message: 'A DCL sync job is already running',
+      };
+    }
+
+    return result;
+  } catch (error) {
+    logger.error('Error in /api/dcl/sync', { error: error.message });
+    set.status = 500;
+    return {
+      success: false,
+      error: `Failed to queue DCL sync job: ${error.message}`,
     };
   }
 })
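A client can exercise the route above together with the job-status endpoint like so. This is a sketch: authentication (the JWT cookie or header) is omitted and assumed to be handled elsewhere, and the response shapes follow the route code above:

```javascript
// Queue a DCL sync and fetch the job's status once (sketch; auth omitted).
async function startDCLSync(baseUrl = '') {
  const res = await fetch(`${baseUrl}/api/dcl/sync`, { method: 'POST' });
  const { success, jobId, message } = await res.json();
  if (!success) throw new Error(message ?? 'sync failed');
  const status = await fetch(`${baseUrl}/api/jobs/${jobId}`);
  return status.json(); // job object with a 'pending'/'running'/'completed'/'failed' status
}
```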
@@ -646,13 +680,28 @@ const app = new Elysia()
 try {
   const fullPath = `src/frontend/build${filePath}`;
 
-  // Use Bun.file() which doesn't throw for non-existent files
+  // For paths without extensions or directories, use SPA fallback immediately
+  // This prevents errors when trying to open directories as files
+  const ext = filePath.split('.').pop();
+  const hasExtension = ext !== filePath && ext.length <= 5; // Simple check for file extension
+
+  if (!hasExtension) {
+    // No extension means it's a route, not a file - serve index.html
+    const indexFile = Bun.file('src/frontend/build/index.html');
+    return new Response(indexFile, {
+      headers: {
+        'Content-Type': 'text/html; charset=utf-8',
+        'Cache-Control': 'no-cache, no-store, must-revalidate',
+      },
+    });
+  }
+
+  // Try to serve actual files (with extensions)
   const file = Bun.file(fullPath);
   const exists = file.exists();
 
   if (exists) {
     // Determine content type
-    const ext = filePath.split('.').pop();
     const contentTypes = {
       'js': 'application/javascript',
       'css': 'text/css',
@@ -685,6 +734,7 @@ const app = new Elysia()
   }
 } catch (err) {
   // File not found or error, fall through to SPA fallback
+  logger.debug('Error serving static file, falling back to SPA', { path: pathname, error: err.message });
 }
 
 // SPA fallback - serve index.html for all other routes
@@ -696,10 +746,24 @@ const app = new Elysia()
       'Cache-Control': 'no-cache, no-store, must-revalidate',
     },
   });
-} catch {
-  return new Response('Frontend not built. Run `bun run build`', { status: 503 });
+} catch (err) {
+  logger.error('Frontend build not found', { error: err.message });
+  return new Response(
+    '<!DOCTYPE html><html><head><title>Quickawards - Unavailable</title></head>' +
+    '<body style="font-family: system-ui; max-width: 600px; margin: 100px auto; padding: 20px;">' +
+    '<h1>Service Temporarily Unavailable</h1>' +
+    '<p>The frontend application is not currently available. This usually means the application is being updated or restarted.</p>' +
+    '<p>Please try refreshing the page in a few moments.</p></body></html>',
+    { status: 503, headers: { 'Content-Type': 'text/html; charset=utf-8' } }
+  );
 }
 })
 
 // Start server - uses PORT environment variable if set, otherwise defaults to 3001
 .listen(process.env.PORT || 3001);
+
+logger.info('Server started', {
+  port: process.env.PORT || 3001,
+  nodeEnv: process.env.NODE_ENV || 'unknown',
+  logLevel: LOG_LEVEL,
+});
@@ -27,6 +27,10 @@ function loadAwardDefinitions() {
   'sat-rs44.json',
   'special-stations.json',
   'dld.json',
+  'dld-80m.json',
+  'dld-40m.json',
+  'dld-cw.json',
+  'dld-80m-cw.json',
 ];
 
 for (const file of files) {
@@ -173,9 +177,9 @@ export async function calculateAwardProgress(userId, award, options = {}) {
 async function calculateDOKAwardProgress(userId, award, options = {}) {
   const { includeDetails = false } = options;
   const { rules } = award;
-  const { target, displayField } = rules;
+  const { target, displayField, filters } = rules;
 
-  logger.debug('Calculating DOK-based award progress', { userId, awardId: award.id, target });
+  logger.debug('Calculating DOK-based award progress', { userId, awardId: award.id, target, hasFilters: !!filters });
 
   // Get all QSOs for user
   const allQSOs = await db

@@ -185,10 +189,17 @@ async function calculateDOKAwardProgress(userId, award, options = {}) {
 
   logger.debug('Total QSOs for user', { count: allQSOs.length });
 
+  // Apply filters if defined
+  let filteredQSOs = allQSOs;
+  if (filters) {
+    filteredQSOs = applyFilters(allQSOs, filters);
+    logger.debug('QSOs after DOK award filters', { count: filteredQSOs.length });
+  }
+
   // Track unique (DOK, band, mode) combinations
   const dokCombinations = new Map(); // Key: "DOK/band/mode" -> detail object
 
-  for (const qso of allQSOs) {
+  for (const qso of filteredQSOs) {
     const dok = qso.darcDok;
     if (!dok) continue; // Skip QSOs without DOK
**src/backend/services/dcl.service.js**

@@ -9,10 +9,18 @@ import { parseDCLResponse, normalizeBand, normalizeMode } from '../utils/adif-pa
  *
  * DCL Information:
  * - Website: https://dcl.darc.de/
- * - API: Coming soon (currently in development)
- * - ADIF Export: https://dcl.darc.de/dml/export_adif_form.php (manual only)
+ * - API Endpoint: https://dings.dcl.darc.de/api/adiexport
  * - DOK fields: MY_DARC_DOK (user's DOK), DARC_DOK (partner's DOK)
  *
+ * API Request Format (POST):
+ * {
+ *   "key": "API_KEY",
+ *   "limit": null,
+ *   "qsl_since": null,
+ *   "qso_since": null,
+ *   "cnf_only": null
+ * }
+ *
 * Expected API Response Format:
 * {
 *   "adif": "<ADIF_VER:5>3.1.3\\n<CREATED_TIMESTAMP:15>20260117 095453\\n<EOH>\\n..."
@@ -20,13 +28,11 @@ import { parseDCLResponse, normalizeBand, normalizeMode } from '../utils/adif-pa
|
|||||||
*/
|
*/
|
||||||
|
|
||||||
const REQUEST_TIMEOUT = 60000;
|
const REQUEST_TIMEOUT = 60000;
|
||||||
|
const DCL_API_URL = 'https://dings.dcl.darc.de/api/adiexport';
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Fetch QSOs from DCL API
|
* Fetch QSOs from DCL API
|
||||||
*
|
*
|
||||||
* When DCL provides their API, update the URL and parameters.
|
|
||||||
* Expected response format: { "adif": "<ADIF data>" }
|
|
||||||
*
|
|
||||||
* @param {string} dclApiKey - DCL API key
|
* @param {string} dclApiKey - DCL API key
|
||||||
* @param {Date|null} sinceDate - Last sync date for incremental sync
|
* @param {Date|null} sinceDate - Last sync date for incremental sync
|
||||||
* @returns {Promise<Array>} Array of parsed QSO records
|
* @returns {Promise<Array>} Array of parsed QSO records
|
||||||
@@ -37,30 +43,44 @@ export async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) {
     sinceDate: sinceDate?.toISOString(),
   });

-  // TODO: Update URL when DCL publishes their API endpoint
-  const url = 'https://dcl.darc.de/api/export'; // Placeholder URL
-
-  const params = new URLSearchParams({
-    api_key: dclApiKey,
-    format: 'json',
-    qsl: 'yes',
-  });
+  // Build request body
+  const requestBody = {
+    key: dclApiKey,
+    limit: 50000,
+    qsl_since: null,
+    qso_since: null,
+    cnf_only: null,
+  };

   // Add date filter for incremental sync if provided
   if (sinceDate) {
     const dateStr = sinceDate.toISOString().split('T')[0].replace(/-/g, '');
-    params.append('qsl_since', dateStr);
+    requestBody.qsl_since = dateStr;
   }

+  // Debug log request parameters (redact API key)
+  logger.debug('DCL API request parameters', {
+    url: DCL_API_URL,
+    method: 'POST',
+    key: dclApiKey ? `${dclApiKey.substring(0, 4)}...${dclApiKey.substring(dclApiKey.length - 4)}` : null,
+    limit: requestBody.limit,
+    qsl_since: requestBody.qsl_since,
+    qso_since: requestBody.qso_since,
+    cnf_only: requestBody.cnf_only,
+  });
+
   try {
     const controller = new AbortController();
     const timeoutId = setTimeout(() => controller.abort(), REQUEST_TIMEOUT);

-    const response = await fetch(`${url}?${params}`, {
+    const response = await fetch(DCL_API_URL, {
+      method: 'POST',
       signal: controller.signal,
       headers: {
+        'Content-Type': 'application/json',
         'Accept': 'application/json',
       },
+      body: JSON.stringify(requestBody),
     });

     clearTimeout(timeoutId);
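The request-body construction above (YYYYMMDD date filter, API key redacted for logging) can be sketched in isolation. This is a minimal sketch; `buildDCLRequestBody` and `redactKey` are illustrative names, not functions from the codebase:

```javascript
// Sketch of the request-body and key-redaction logic from the hunk above.
// Function names are illustrative, not taken from the codebase.
function buildDCLRequestBody(apiKey, sinceDate = null) {
  const body = { key: apiKey, limit: 50000, qsl_since: null, qso_since: null, cnf_only: null };
  if (sinceDate) {
    // ISO date with the dashes stripped, i.e. YYYYMMDD
    body.qsl_since = sinceDate.toISOString().split('T')[0].replace(/-/g, '');
  }
  return body;
}

function redactKey(apiKey) {
  // Keep only the first and last four characters when logging.
  return apiKey ? `${apiKey.substring(0, 4)}...${apiKey.substring(apiKey.length - 4)}` : null;
}

console.log(buildDCLRequestBody('abcdef123456', new Date('2024-03-01T12:00:00Z')).qsl_since); // "20240301"
console.log(redactKey('abcdef123456')); // "abcd...3456"
```

Leaving the unused filter fields as `null` rather than omitting them keeps the body shape identical to the documented request format.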
@@ -69,9 +89,10 @@ export async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) {
       if (response.status === 401) {
         throw new Error('Invalid DCL API key. Please check your DCL credentials in Settings.');
       } else if (response.status === 404) {
-        throw new Error('DCL API endpoint not found. The DCL API may not be available yet.');
+        throw new Error('DCL API endpoint not found.');
       } else {
-        throw new Error(`DCL API error: ${response.status} ${response.statusText}`);
+        const errorText = await response.text();
+        throw new Error(`DCL API error: ${response.status} ${response.statusText} - ${errorText}`);
       }
     }

@@ -82,7 +103,7 @@ export async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) {

     logger.info('Successfully fetched QSOs from DCL', {
       total: qsos.length,
-      hasConfirmations: qsos.filter(q => qso.dcl_qsl_rcvd === 'Y').length,
+      hasConfirmations: qsos.filter(q => q.dcl_qsl_rcvd === 'Y').length,
     });

     return qsos;
@@ -94,7 +115,6 @@ export async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) {

     logger.error('Failed to fetch from DCL', {
       error: error.message,
-      url: url.replace(/api_key=[^&]+/, 'api_key=***'),
     });

     throw error;
@@ -103,7 +123,7 @@ export async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) {

 /**
 * Parse DCL API response from JSON
- * This function exists for testing with example payloads before DCL API is available
+ * Can be used for testing with example payloads
 *
 * @param {Object} jsonResponse - JSON response in DCL format
 * @returns {Array} Array of parsed QSO records
@@ -232,16 +252,25 @@ export async function syncQSOs(userId, dclApiKey, sinceDate = null, jobId = null

         if (dataChanged) {
           // Update existing QSO with changed DCL confirmation and DOK data
+          // Only update DOK/grid fields if DCL actually sent values (non-empty)
+          const updateData = {
+            dclQslRdate: dbQSO.dclQslRdate,
+            dclQslRstatus: dbQSO.dclQslRstatus,
+          };
+
+          // Only add DOK fields if DCL sent them
+          if (dbQSO.darcDok) updateData.darcDok = dbQSO.darcDok;
+          if (dbQSO.myDarcDok) updateData.myDarcDok = dbQSO.myDarcDok;
+
+          // Only update grid if DCL sent one
+          if (dbQSO.grid) {
+            updateData.grid = dbQSO.grid;
+            updateData.gridSource = dbQSO.gridSource;
+          }
+
           await db
             .update(qsos)
-            .set({
-              dclQslRdate: dbQSO.dclQslRdate,
-              dclQslRstatus: dbQSO.dclQslRstatus,
-              darcDok: dbQSO.darcDok || existingQSO.darcDok,
-              myDarcDok: dbQSO.myDarcDok || existingQSO.myDarcDok,
-              grid: dbQSO.grid || existingQSO.grid,
-              gridSource: dbQSO.gridSource || existingQSO.gridSource,
-            })
+            .set(updateData)
             .where(eq(qsos.id, existingQSO.id));
           updatedCount++;
           // Track updated QSO (CALL and DATE)
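The conditional update object built in this hunk can be exercised standalone: confirmation fields are always written, while DOK and grid fields are copied only when the incoming record carries a value, so an update never blanks existing data. A minimal sketch (`buildUpdateData` is an illustrative name; the field names follow the hunk above):

```javascript
// Sketch of the conditional update-object logic from the hunk above.
// buildUpdateData is an illustrative name, not part of the codebase.
function buildUpdateData(incoming) {
  const updateData = {
    dclQslRdate: incoming.dclQslRdate,
    dclQslRstatus: incoming.dclQslRstatus,
  };
  // DOK fields are copied only when non-empty
  if (incoming.darcDok) updateData.darcDok = incoming.darcDok;
  if (incoming.myDarcDok) updateData.myDarcDok = incoming.myDarcDok;
  // Grid and its provenance travel together
  if (incoming.grid) {
    updateData.grid = incoming.grid;
    updateData.gridSource = incoming.gridSource;
  }
  return updateData;
}

// Empty DOK/grid values are skipped, so the UPDATE never overwrites
// existing columns with blanks.
const partial = buildUpdateData({ dclQslRdate: '20240101', dclQslRstatus: 'C', darcDok: '', grid: '' });
console.log(Object.keys(partial)); // ['dclQslRdate', 'dclQslRstatus']
```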
@@ -325,8 +354,6 @@ export async function syncQSOs(userId, dclApiKey, sinceDate = null, jobId = null
 /**
 * Get last DCL QSL date for incremental sync
 *
- * TODO: Implement when DCL provides API
- *
 * @param {number} userId - User ID
 * @returns {Promise<Date|null>} Last QSL date or null
 */
@@ -19,20 +19,21 @@ export const JobStatus = {
 const activeJobs = new Map();

 /**
- * Enqueue a new LoTW sync job
+ * Enqueue a new sync job
 * @param {number} userId - User ID
+ * @param {string} jobType - Type of job ('lotw_sync' or 'dcl_sync')
 * @returns {Promise<Object>} Job object with ID
 */
-export async function enqueueJob(userId) {
-  logger.debug('Enqueueing LoTW sync job', { userId });
+export async function enqueueJob(userId, jobType = 'lotw_sync') {
+  logger.debug('Enqueueing sync job', { userId, jobType });

-  // Check for existing active job
-  const existingJob = await getUserActiveJob(userId);
+  // Check for existing active job of the same type
+  const existingJob = await getUserActiveJob(userId, jobType);
   if (existingJob) {
-    logger.debug('Existing active job found', { jobId: existingJob.id });
+    logger.debug('Existing active job found', { jobId: existingJob.id, jobType });
     return {
       success: false,
-      error: 'A LoTW sync job is already running or pending for this user',
+      error: `A ${jobType} job is already running or pending for this user`,
       existingJob: existingJob.id,
     };
   }
@@ -42,16 +43,16 @@ export async function enqueueJob(userId) {
     .insert(syncJobs)
     .values({
       userId,
-      type: 'lotw_sync',
+      type: jobType,
       status: JobStatus.PENDING,
       createdAt: new Date(),
     })
     .returning();

-  logger.info('Job created', { jobId: job.id, userId });
+  logger.info('Job created', { jobId: job.id, userId, jobType });

   // Start processing asynchronously (don't await)
-  processJobAsync(job.id, userId).catch((error) => {
+  processJobAsync(job.id, userId, jobType).catch((error) => {
     logger.error(`Job processing error`, { jobId: job.id, error: error.message });
   });

@@ -68,15 +69,14 @@ export async function enqueueJob(userId) {
 }

 /**
- * Process a LoTW sync job asynchronously
+ * Process a sync job asynchronously
 * @param {number} jobId - Job ID
 * @param {number} userId - User ID
+ * @param {string} jobType - Type of job ('lotw_sync' or 'dcl_sync')
 */
-async function processJobAsync(jobId, userId) {
+async function processJobAsync(jobId, userId, jobType) {
   const jobPromise = (async () => {
     try {
-      // Import dynamically to avoid circular dependency
-      const { syncQSOs } = await import('./lotw.service.js');
       const { getUserById } = await import('./auth.service.js');

       // Update status to running
@@ -85,37 +85,72 @@ async function processJobAsync(jobId, userId) {
         startedAt: new Date(),
       });

-      // Get user credentials
-      const user = await getUserById(userId);
-      if (!user || !user.lotwUsername || !user.lotwPassword) {
-        await updateJob(jobId, {
-          status: JobStatus.FAILED,
-          completedAt: new Date(),
-          error: 'LoTW credentials not configured',
+      let result;
+
+      if (jobType === 'dcl_sync') {
+        // Get user credentials
+        const user = await getUserById(userId);
+        if (!user || !user.dclApiKey) {
+          await updateJob(jobId, {
+            status: JobStatus.FAILED,
+            completedAt: new Date(),
+            error: 'DCL credentials not configured',
+          });
+          return null;
+        }
+
+        // Get last QSL date for incremental sync
+        const { getLastDCLQSLDate, syncQSOs: syncDCLQSOs } = await import('./dcl.service.js');
+        const lastQSLDate = await getLastDCLQSLDate(userId);
+        const sinceDate = lastQSLDate || new Date('2000-01-01');
+
+        if (lastQSLDate) {
+          logger.info(`Job ${jobId}: DCL incremental sync`, { since: sinceDate.toISOString().split('T')[0] });
+        } else {
+          logger.info(`Job ${jobId}: DCL full sync`);
+        }
+
+        // Update job progress
+        await updateJobProgress(jobId, {
+          message: 'Fetching QSOs from DCL...',
+          step: 'fetch',
         });
-        return null;
-      }

-      // Get last QSL date for incremental sync
-      const { getLastLoTWQSLDate } = await import('./lotw.service.js');
-      const lastQSLDate = await getLastLoTWQSLDate(userId);
-      const sinceDate = lastQSLDate || new Date('2000-01-01');
-
-      if (lastQSLDate) {
-        logger.info(`Job ${jobId}: Incremental sync`, { since: sinceDate.toISOString().split('T')[0] });
+        // Execute the sync
+        result = await syncDCLQSOs(userId, user.dclApiKey, sinceDate, jobId);
       } else {
-        logger.info(`Job ${jobId}: Full sync`);
+        // LoTW sync (default)
+        const user = await getUserById(userId);
+        if (!user || !user.lotwUsername || !user.lotwPassword) {
+          await updateJob(jobId, {
+            status: JobStatus.FAILED,
+            completedAt: new Date(),
+            error: 'LoTW credentials not configured',
+          });
+          return null;
+        }
+
+        // Get last QSL date for incremental sync
+        const { getLastLoTWQSLDate, syncQSOs } = await import('./lotw.service.js');
+        const lastQSLDate = await getLastLoTWQSLDate(userId);
+        const sinceDate = lastQSLDate || new Date('2000-01-01');
+
+        if (lastQSLDate) {
+          logger.info(`Job ${jobId}: LoTW incremental sync`, { since: sinceDate.toISOString().split('T')[0] });
+        } else {
+          logger.info(`Job ${jobId}: LoTW full sync`);
+        }
+
+        // Update job progress
+        await updateJobProgress(jobId, {
+          message: 'Fetching QSOs from LoTW...',
+          step: 'fetch',
+        });
+
+        // Execute the sync
+        result = await syncQSOs(userId, user.lotwUsername, user.lotwPassword, sinceDate, jobId);
       }

-      // Update job progress
-      await updateJobProgress(jobId, {
-        message: 'Fetching QSOs from LoTW...',
-        step: 'fetch',
-      });
-
-      // Execute the sync
-      const result = await syncQSOs(userId, user.lotwUsername, user.lotwPassword, sinceDate, jobId);

       // Update job as completed
       await updateJob(jobId, {
         status: JobStatus.COMPLETED,
@@ -197,9 +232,10 @@ export async function getJobStatus(jobId) {
 /**
 * Get user's active job (pending or running)
 * @param {number} userId - User ID
+ * @param {string} jobType - Optional job type filter
 * @returns {Promise<Object|null>} Active job or null
 */
-export async function getUserActiveJob(userId) {
+export async function getUserActiveJob(userId, jobType = null) {
   const conditions = [
     eq(syncJobs.userId, userId),
     or(
@@ -208,6 +244,10 @@ export async function getUserActiveJob(userId) {
     ),
   ];

+  if (jobType) {
+    conditions.push(eq(syncJobs.type, jobType));
+  }
+
   const [job] = await db
     .select()
     .from(syncJobs)
@@ -241,6 +241,7 @@ export async function syncQSOs(userId, lotwUsername, lotwPassword, sinceDate = n
           eq(qsos.userId, userId),
           eq(qsos.callsign, dbQSO.callsign),
           eq(qsos.qsoDate, dbQSO.qsoDate),
+          eq(qsos.timeOn, dbQSO.timeOn),
           eq(qsos.band, dbQSO.band),
           eq(qsos.mode, dbQSO.mode)
         )
@@ -13,8 +13,10 @@
 */
 export function parseADIF(adifData) {
   const qsos = [];
-  // Split by <EOR> (end of record) - case sensitive as per ADIF spec
-  const records = adifData.split('<EOR>');
+  // Split by <EOR> (case-insensitive to handle <EOR>, <eor>, <Eor>, etc.)
+  const regex = new RegExp('<eor>', 'gi');
+  const records = adifData.split(regex);

   for (const record of records) {
     if (!record.trim()) continue;
@@ -26,10 +28,11 @@ export function parseADIF(adifData) {
     }

     const qso = {};
-    const regex = /<([A-Z0-9_]+):(\d+)(?::[A-Z]+)?>/gi;
-    let match;

-    while ((match = regex.exec(record)) !== null) {
+    // Use matchAll for cleaner parsing (creates new iterator for each record)
+    const matches = record.matchAll(/<([A-Z0-9_]+):(\d+)(?::[A-Z]+)?>/gi);
+
+    for (const match of matches) {
       const [fullMatch, fieldName, lengthStr] = match;
       const length = parseInt(lengthStr, 10);
       const valueStart = match.index + fullMatch.length;
@@ -38,9 +41,6 @@ export function parseADIF(adifData) {
       const value = record.substring(valueStart, valueStart + length);

       qso[fieldName.toLowerCase()] = value.trim();
-
-      // Update regex position to continue after the value
-      regex.lastIndex = valueStart + length;
     }

     // Only add if we have at least a callsign
@@ -74,6 +74,8 @@ export const qsosAPI = {

   syncFromLoTW: () => apiRequest('/lotw/sync', { method: 'POST' }),

+  syncFromDCL: () => apiRequest('/dcl/sync', { method: 'POST' }),
+
   deleteAll: () => apiRequest('/qsos/all', { method: 'DELETE' }),
 };

@@ -13,11 +13,20 @@
   let pageSize = 100;
   let pagination = null;

-  // Job polling state
-  let syncJobId = null;
-  let syncStatus = null;
-  let syncProgress = null;
-  let pollingInterval = null;
+  // Job polling state - LoTW
+  let lotwSyncJobId = null;
+  let lotwSyncStatus = null;
+  let lotwSyncProgress = null;
+  let lotwPollingInterval = null;
+
+  // Job polling state - DCL
+  let dclSyncJobId = null;
+  let dclSyncStatus = null;
+  let dclSyncProgress = null;
+  let dclPollingInterval = null;
+
+  // Sync result
+  let syncResult = null;

   // Delete confirmation state
   let showDeleteConfirm = false;
@@ -34,14 +43,17 @@
     if (!$auth.user) return;
     await loadQSOs();
     await loadStats();
-    // Check for active job on mount
-    await checkActiveJob();
+    // Check for active jobs on mount
+    await checkActiveJobs();
   });

-  // Clean up polling interval on unmount
+  // Clean up polling intervals on unmount
   onDestroy(() => {
-    if (pollingInterval) {
-      clearInterval(pollingInterval);
+    if (lotwPollingInterval) {
+      clearInterval(lotwPollingInterval);
+    }
+    if (dclPollingInterval) {
+      clearInterval(dclPollingInterval);
     }
   });

@@ -76,98 +88,169 @@
     }
   }

-  async function checkActiveJob() {
+  async function checkActiveJobs() {
     try {
       const response = await jobsAPI.getActive();
       if (response.job) {
-        syncJobId = response.job.id;
-        syncStatus = response.job.status;
-        // Start polling if job is running
-        if (syncStatus === 'running' || syncStatus === 'pending') {
-          startPolling(response.job.id);
+        const job = response.job;
+        if (job.type === 'lotw_sync') {
+          lotwSyncJobId = job.id;
+          lotwSyncStatus = job.status;
+          if (lotwSyncStatus === 'running' || lotwSyncStatus === 'pending') {
+            startLoTWPolling(job.id);
+          }
+        } else if (job.type === 'dcl_sync') {
+          dclSyncJobId = job.id;
+          dclSyncStatus = job.status;
+          if (dclSyncStatus === 'running' || dclSyncStatus === 'pending') {
+            startDCLPolling(job.id);
+          }
         }
       }
     } catch (err) {
-      console.error('Failed to check active job:', err);
+      console.error('Failed to check active jobs:', err);
     }
   }

-  async function startPolling(jobId) {
-    syncJobId = jobId;
-    syncStatus = 'running';
+  async function startLoTWPolling(jobId) {
+    lotwSyncJobId = jobId;
+    lotwSyncStatus = 'running';

-    // Clear any existing interval
-    if (pollingInterval) {
-      clearInterval(pollingInterval);
+    if (lotwPollingInterval) {
+      clearInterval(lotwPollingInterval);
     }

-    // Poll every 2 seconds
-    pollingInterval = setInterval(async () => {
+    lotwPollingInterval = setInterval(async () => {
       try {
         const response = await jobsAPI.getStatus(jobId);
         const job = response.job;

-        syncStatus = job.status;
-        syncProgress = job.result?.progress ? job.result : null;
+        lotwSyncStatus = job.status;
+        lotwSyncProgress = job.result?.progress ? job.result : null;

         if (job.status === 'completed') {
-          clearInterval(pollingInterval);
-          pollingInterval = null;
-          syncJobId = null;
-          syncProgress = null;
-          syncStatus = null;
+          clearInterval(lotwPollingInterval);
+          lotwPollingInterval = null;
+          lotwSyncJobId = null;
+          lotwSyncProgress = null;
+          lotwSyncStatus = null;

-          // Reload QSOs and stats
           await loadQSOs();
           await loadStats();

-          // Show success message
           syncResult = {
             success: true,
+            source: 'LoTW',
             ...job.result,
           };
         } else if (job.status === 'failed') {
-          clearInterval(pollingInterval);
-          pollingInterval = null;
-          syncJobId = null;
-          syncProgress = null;
-          syncStatus = null;
+          clearInterval(lotwPollingInterval);
+          lotwPollingInterval = null;
+          lotwSyncJobId = null;
+          lotwSyncProgress = null;
+          lotwSyncStatus = null;

-          // Show error message
           syncResult = {
             success: false,
-            error: job.error || 'Sync failed',
+            source: 'LoTW',
+            error: job.error || 'LoTW sync failed',
           };
         }
       } catch (err) {
-        console.error('Failed to poll job status:', err);
-        // Don't stop polling on error, might be temporary
+        console.error('Failed to poll LoTW job status:', err);
       }
     }, 2000);
   }

-  async function handleSync() {
+  async function startDCLPolling(jobId) {
+    dclSyncJobId = jobId;
+    dclSyncStatus = 'running';
+
+    if (dclPollingInterval) {
+      clearInterval(dclPollingInterval);
+    }
+
+    dclPollingInterval = setInterval(async () => {
+      try {
+        const response = await jobsAPI.getStatus(jobId);
+        const job = response.job;
+
+        dclSyncStatus = job.status;
+        dclSyncProgress = job.result?.progress ? job.result : null;
+
+        if (job.status === 'completed') {
+          clearInterval(dclPollingInterval);
+          dclPollingInterval = null;
+          dclSyncJobId = null;
+          dclSyncProgress = null;
+          dclSyncStatus = null;
+
+          await loadQSOs();
+          await loadStats();
+
+          syncResult = {
+            success: true,
+            source: 'DCL',
+            ...job.result,
+          };
+        } else if (job.status === 'failed') {
+          clearInterval(dclPollingInterval);
+          dclPollingInterval = null;
+          dclSyncJobId = null;
+          dclSyncProgress = null;
+          dclSyncStatus = null;
+
+          syncResult = {
+            success: false,
+            source: 'DCL',
+            error: job.error || 'DCL sync failed',
+          };
+        }
+      } catch (err) {
+        console.error('Failed to poll DCL job status:', err);
+      }
+    }, 2000);
+  }
+
+  async function handleLoTWSync() {
     try {
       const response = await qsosAPI.syncFromLoTW();

       if (response.jobId) {
-        // Job was queued successfully
-        startPolling(response.jobId);
+        startLoTWPolling(response.jobId);
       } else if (response.existingJob) {
-        // There's already an active job
-        startPolling(response.existingJob);
+        startLoTWPolling(response.existingJob);
       } else {
-        throw new Error(response.error || 'Failed to queue sync job');
+        throw new Error(response.error || 'Failed to queue LoTW sync job');
       }
     } catch (err) {
       syncResult = {
         success: false,
+        source: 'LoTW',
         error: err.message,
       };
     }
   }

-  let syncResult = null;
+  async function handleDCLSync() {
+    try {
+      const response = await qsosAPI.syncFromDCL();
+
+      if (response.jobId) {
+        startDCLPolling(response.jobId);
+      } else if (response.existingJob) {
+        startDCLPolling(response.existingJob);
+      } else {
+        throw new Error(response.error || 'Failed to queue DCL sync job');
+      }
+    } catch (err) {
+      syncResult = {
+        success: false,
+        source: 'DCL',
+        error: err.message,
+      };
+    }
+  }

   async function applyFilters() {
     currentPage = 1;
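The two near-identical polling loops in this hunk follow the same poll-until-terminal pattern: query the job status every two seconds, keep going through transient errors, and stop on `completed` or `failed`. A generic sketch of that pattern (`pollUntilDone` is an illustrative name; the real component code updates Svelte state rather than resolving a promise):

```javascript
// Generic sketch of the poll-until-terminal pattern used by both
// startLoTWPolling and startDCLPolling. pollUntilDone is an illustrative
// name, not part of the codebase.
function pollUntilDone(getStatus, intervalMs = 2000) {
  return new Promise((resolve) => {
    const timer = setInterval(async () => {
      try {
        const job = await getStatus();
        // Stop only on a terminal status
        if (job.status === 'completed' || job.status === 'failed') {
          clearInterval(timer);
          resolve(job);
        }
      } catch (err) {
        // Transient errors are ignored; keep polling.
      }
    }, intervalMs);
  });
}

// Usage: a fake status source that completes on the third poll.
let calls = 0;
pollUntilDone(async () => (++calls < 3 ? { status: 'running' } : { status: 'completed' }), 10)
  .then((job) => console.log(job.status, calls)); // "completed 3"
```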
@@ -263,31 +346,52 @@
       <button
         class="btn btn-danger"
         on:click={() => showDeleteConfirm = true}
-        disabled={syncStatus === 'running' || syncStatus === 'pending' || deleting}
+        disabled={lotwSyncStatus === 'running' || lotwSyncStatus === 'pending' || dclSyncStatus === 'running' || dclSyncStatus === 'pending' || deleting}
       >
         Clear All QSOs
       </button>
     {/if}
     <button
-      class="btn btn-primary"
-      on:click={handleSync}
-      disabled={syncStatus === 'running' || syncStatus === 'pending' || deleting}
+      class="btn btn-primary lotw-btn"
+      on:click={handleLoTWSync}
+      disabled={lotwSyncStatus === 'running' || lotwSyncStatus === 'pending' || deleting}
     >
-      {#if syncStatus === 'running' || syncStatus === 'pending'}
-        Syncing...
+      {#if lotwSyncStatus === 'running' || lotwSyncStatus === 'pending'}
+        LoTW Syncing...
       {:else}
         Sync from LoTW
       {/if}
     </button>
+    <button
+      class="btn btn-primary dcl-btn"
+      on:click={handleDCLSync}
+      disabled={dclSyncStatus === 'running' || dclSyncStatus === 'pending' || deleting}
+    >
+      {#if dclSyncStatus === 'running' || dclSyncStatus === 'pending'}
+        DCL Syncing...
+      {:else}
+        Sync from DCL
+      {/if}
+    </button>
   </div>
 </div>

-  {#if syncProgress}
+  {#if lotwSyncProgress}
     <div class="alert alert-info">
       <h3>Syncing from LoTW...</h3>
-      <p>{syncProgress.message || 'Processing...'}</p>
-      {#if syncProgress.total}
-        <p>Progress: {syncProgress.processed || 0} / {syncProgress.total}</p>
+      <p>{lotwSyncProgress.message || 'Processing...'}</p>
+      {#if lotwSyncProgress.total}
+        <p>Progress: {lotwSyncProgress.processed || 0} / {lotwSyncProgress.total}</p>
+      {/if}
+    </div>
+  {/if}
+
+  {#if dclSyncProgress}
+    <div class="alert alert-info">
+      <h3>Syncing from DCL...</h3>
+      <p>{dclSyncProgress.message || 'Processing...'}</p>
+      {#if dclSyncProgress.total}
+        <p>Progress: {dclSyncProgress.processed || 0} / {dclSyncProgress.total}</p>
       {/if}
     </div>
   {/if}
@@ -602,6 +706,23 @@
   .header-buttons {
     display: flex;
     gap: 1rem;
+    flex-wrap: wrap;
+  }
+
+  .lotw-btn {
+    background-color: #4a90e2;
+  }
+
+  .lotw-btn:hover:not(:disabled) {
+    background-color: #357abd;
+  }
+
+  .dcl-btn {
+    background-color: #e67e22;
+  }
+
+  .dcl-btn:hover:not(:disabled) {
+    background-color: #d35400;
   }

   .stats-grid {
@@ -194,8 +194,8 @@
     <div class="settings-section">
       <h2>DCL Credentials</h2>
       <p class="help-text">
-        Configure your DARC Community Logbook (DCL) API key for future sync functionality.
-        <strong>Note:</strong> DCL does not currently provide a download API. This is prepared for when they add one.
+        Configure your DARC Community Logbook (DCL) API key to sync your QSOs.
+        Your API key is stored securely and used only to fetch your confirmed QSOs.
       </p>

       {#if hasDCLCredentials}
@@ -220,7 +220,7 @@
           placeholder="Your DCL API key"
         />
         <p class="hint">
-          Enter your DCL API key for future sync functionality
+          Enter your DCL API key to sync QSOs
         </p>
       </div>

@@ -233,10 +233,10 @@
     <h3>About DCL</h3>
     <p>
       DCL (DARC Community Logbook) is DARC's web-based logbook system for German amateur radio awards.
-      It includes DOK (DARC Ortsverband Kennung) fields for local club awards.
+      It includes DOK (DARC Ortsverband Kennung) fields for local club awards like the DLD award.
     </p>
     <p>
-      <strong>Status:</strong> Download API not yet available.{' '}
+      Once configured, you can sync your QSOs from DCL on the QSO Log page.
       <a href="https://dcl.darc.de/" target="_blank" rel="noopener">
         Visit DCL website
       </a>