From 223461f5365940006b00c0bf1321c74ad8e7cc0d Mon Sep 17 00:00:00 2001 From: Joerg Date: Sun, 18 Jan 2026 07:02:52 +0100 Subject: [PATCH] fix: enable debug logging and improve DCL sync observability - Fix logger bug where debug level (0) was treated as falsy - Change `||` to `??` in config.js to properly handle log level 0 - Debug logs now work correctly when LOG_LEVEL=debug - Add server startup logging - Log port, environment, and log level on server start - Helps verify configuration is loaded correctly - Add DCL API request debug logging - Log full API request parameters when LOG_LEVEL=debug - API key is redacted (shows first/last 4 chars only) - Helps troubleshoot DCL sync issues - Update CLAUDE.md documentation - Add Logging section with log levels and configuration - Document debug logging feature for DCL service - Add this fix to Recent Commits section Note: .env file added locally with LOG_LEVEL=debug (not committed) Co-Authored-By: Claude Sonnet 4.5 --- CLAUDE.md | 104 +++++++- src/backend/config.js | 2 +- src/backend/index.js | 48 +++- src/backend/services/dcl.service.js | 11 + src/backend/services/job-queue.service.js | 122 ++++++--- src/frontend/src/lib/api.js | 2 + src/frontend/src/routes/qsos/+page.svelte | 241 +++++++++++++----- src/frontend/src/routes/settings/+page.svelte | 10 +- 8 files changed, 424 insertions(+), 116 deletions(-) diff --git a/CLAUDE.md b/CLAUDE.md index 5a75f8f..7bcaa97 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -19,6 +19,22 @@ Default to using Bun instead of Node.js. - Prefer `Bun.file` over `node:fs`'s readFile/writeFile - Bun.$`ls` instead of execa. +## Logging + +The application uses a custom logger in `src/backend/config.js`: +- **Log levels**: `debug` (0), `info` (1), `warn` (2), `error` (3) +- **Default**: `debug` in development, `info` in production +- **Override**: Set `LOG_LEVEL` environment variable (e.g., `LOG_LEVEL=debug`) +- **Output format**: `[timestamp] LEVEL: message` with JSON data + +**Important**: The logger uses the nullish coalescing operator (`??`) to handle log levels. This ensures that `debug` (level 0) is not treated as falsy. + +Example `.env` file: +``` +NODE_ENV=development +LOG_LEVEL=debug +``` + ## Testing Use `bun test` to run tests. @@ -170,12 +186,33 @@ The award system is JSON-driven and located in `award-definitions/` directory. E - `normalizeMode(mode)`: Standardize mode names (CW, FT8, SSB, etc.) 
- Used by both LoTW and DCL services for consistency +**Job Queue Service**: `src/backend/services/job-queue.service.js` +- Manages async background jobs for LoTW and DCL sync +- `enqueueJob(userId, jobType)`: Queue a sync job ('lotw_sync' or 'dcl_sync') +- `processJobAsync(jobId, userId, jobType)`: Process job asynchronously +- `getUserActiveJob(userId, jobType)`: Get active job for user (optional type filter) +- `getJobStatus(jobId)`: Get job status with parsed result +- `updateJobProgress(jobId, progressData)`: Update job progress during processing +- Supports concurrent LoTW and DCL sync jobs +- Job types: 'lotw_sync', 'dcl_sync' +- Job status: 'pending', 'running', 'completed', 'failed' + +**Backend API Routes** (`src/backend/index.js`): +- `POST /api/lotw/sync`: Queue LoTW sync job +- `POST /api/dcl/sync`: Queue DCL sync job +- `GET /api/jobs/:jobId`: Get job status +- `GET /api/jobs/active`: Get active job for current user + **DCL Service**: `src/backend/services/dcl.service.js` - `fetchQSOsFromDCL(dclApiKey, sinceDate)`: Fetch from DCL API +- API Endpoint: `https://dings.dcl.darc.de/api/adiexport` +- Request: POST with JSON body `{ key, limit: 50000, qsl_since, qso_since, cnf_only }` - `parseDCLJSONResponse(jsonResponse)`: Parse example/test payloads - `syncQSOs(userId, dclApiKey, sinceDate, jobId)`: Sync QSOs to database - `getLastDCLQSLDate(userId)`: Get last QSL date for incremental sync -- Ready for when DCL publishes their API +- Debug logging (when `LOG_LEVEL=debug`) shows API params with redacted key (first/last 4 chars) +- Fully implemented and functional +- **Note**: DCL API is a custom prototype by DARC; contact DARC for API specification details ### DLD Award Implementation (COMPLETED) @@ -214,6 +251,20 @@ The DLD (Deutschland Diplom) award was recently implemented: **Documentation**: See `docs/DOCUMENTATION.md` for complete documentation including DLD award example. 
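As a quick illustration of the queue-then-poll flow described under **Job Queue Service** and **Backend API Routes**, here is a minimal client sketch. It is illustrative only: the base URL, the bearer-token header, and the `runDclSync` helper name are assumptions, and the exact keys inside `job.result` may differ from the real implementation.

```js
// Minimal sketch (not part of the codebase): queue a DCL sync job, then poll it.
const BASE_URL = 'http://localhost:3001';   // assumption: default PORT from index.js
const TOKEN = process.env.API_TOKEN;        // assumption: a valid JWT for the user

async function api(path, options = {}) {
  const res = await fetch(`${BASE_URL}/api${path}`, {
    ...options,
    headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${TOKEN}` },
  });
  return res.json();
}

async function runDclSync() {
  // POST /api/dcl/sync returns a jobId (or existingJob if a DCL sync is already running)
  const queued = await api('/dcl/sync', { method: 'POST' });
  const jobId = queued.jobId ?? queued.existingJob;
  if (!jobId) throw new Error(queued.error || 'Failed to queue DCL sync job');

  // Poll GET /api/jobs/:jobId until the job reaches a terminal status
  while (true) {
    const { job } = await api(`/jobs/${jobId}`);
    if (job.status === 'completed') return job.result;        // e.g. added/updated/skipped counts
    if (job.status === 'failed') throw new Error(job.error || 'DCL sync failed');
    await new Promise((resolve) => setTimeout(resolve, 2000)); // same 2 s interval the frontend uses
  }
}
```

The frontend performs this same flow through `qsosAPI.syncFromDCL()`, `jobsAPI.getStatus(jobId)`, and `jobsAPI.getActive()`; the sketch only shows the raw HTTP shape.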
+**Frontend**: `src/frontend/src/routes/qsos/+page.svelte` +- Separate sync buttons for LoTW (blue) and DCL (orange) +- Independent progress tracking for each sync type +- Both syncs can run simultaneously +- Job polling every 2 seconds for status updates +- Import log displays after sync completion +- Real-time QSO table refresh after sync + +**Frontend API** (`src/frontend/src/lib/api.js`): +- `qsosAPI.syncFromLoTW()`: Trigger LoTW sync +- `qsosAPI.syncFromDCL()`: Trigger DCL sync +- `jobsAPI.getStatus(jobId)`: Poll job status +- `jobsAPI.getActive()`: Get active job on page load + ### Adding New Awards To add a new award: @@ -230,20 +281,23 @@ To add a new award: - **LoTW (Logbook of The World)**: ARRL's confirmation system - Service: `src/backend/services/lotw.service.js` + - API: `https://lotw.arrl.org/lotwuser/lotwreport.adi` - Fields: `lotwQslRstatus`, `lotwQslRdate` - Used for DXCC, WAS, VUCC, most awards - ADIF format with `` delimiters - - Supports incremental sync by date + - Supports incremental sync by `qso_qslsince` parameter (format: YYYY-MM-DD) - **DCL (DARC Community Logbook)**: DARC's confirmation system - Service: `src/backend/services/dcl.service.js` + - API: `https://dings.dcl.darc.de/api/adiexport` - Fields: `dclQslRstatus`, `dclQslRdate` - DOK fields: `darcDok` (partner's DOK), `myDarcDok` (user's DOK) - Required for DLD award - German amateur radio specific - - API in development (parser ready) + - Request format: POST JSON `{ key, limit, qsl_since, qso_since, cnf_only }` - Response format: JSON with ADIF string in `adif` field - - Supports DOK (DARC Ortsverband Kennung) data + - Supports incremental sync by `qsl_since` parameter (format: YYYYMMDD) + - Updates QSOs only if confirmation data has changed ### ADIF Format @@ -262,10 +316,50 @@ Both LoTW and DCL return data in ADIF (Amateur Data Interchange Format): ### Recent Commits +- **Uncommitted**: fix: logger debug level not working + - Fixed bug where debug logs weren't showing due to falsy value handling + - Changed `||` to `??` in logger config to properly handle log level 0 (debug) + - Added `.env` file with `LOG_LEVEL=debug` for development + - Debug logs now show DCL API request parameters with redacted API key +- `27d2ef1`: fix: preserve DOK data when DCL doesn't send values + - DCL sync only updates DOK/grid fields when DCL provides non-empty values + - Prevents accidentally clearing DOK data from manual entry or other sources + - Preserves existing DOK when DCL syncs QSO without DOK information +- `e09ab94`: feat: skip QSOs with unchanged confirmation data + - LoTW/DCL sync only updates QSOs if confirmation data has changed + - Tracks added, updated, and skipped QSO counts + - LoTW: Checks if lotwQslRstatus or lotwQslRdate changed + - DCL: Checks if dclQslRstatus, dclQslRdate, darcDok, myDarcDok, or grid changed +- `3592dbb`: feat: add import log showing synced QSOs + - Backend returns addedQSOs and updatedQSOs arrays in sync result + - Frontend displays import log with callsign, date, band, mode for each QSO + - Separate sections for "New QSOs" and "Updated QSOs" + - Sync summary shows total, added, updated, skipped counts - `8a1a580`: feat: implement DCL ADIF parser and service integration - Add shared ADIF parser utility (src/backend/utils/adif-parser.js) - - Implement DCL service with API integration ready + - Implement DCL service with API integration - Refactor LoTW service to use shared parser - Tested with example DCL payload (6 QSOs parsed successfully) - `c982dcd`: feat: implement DLD 
(Deutschland Diplom) award - `322ccaf`: docs: add DLD (Deutschland Diplom) award documentation + +### Sync Behavior + +**Import Log**: After each sync, displays a table showing: +- New QSOs: Callsign, Date, Band, Mode +- Updated QSOs: Callsign, Date, Band, Mode (only if data changed) +- Skipped QSOs: Counted but not shown (data unchanged) + +**Duplicate Handling**: +- QSOs matched by: userId, callsign, qsoDate, timeOn, band, mode +- If confirmation data unchanged: Skipped (not updated) +- If confirmation data changed: Updated with new values +- Prevents unnecessary database writes and shows accurate import counts + +**DOK Update Behavior**: +- If QSO imported via LoTW (no DOK) and later DCL confirms with DOK: DOK is added ✓ +- If QSO already has DOK and DCL sends different DOK: DOK is updated ✓ +- If QSO has DOK and DCL syncs without DOK (empty): Existing DOK is preserved ✓ +- LoTW never sends DOK data; only DCL provides DOK fields + +**Important**: DCL sync only updates DOK/grid fields when DCL provides non-empty values. This prevents accidentally clearing DOK data that was manually entered or imported from other sources. diff --git a/src/backend/config.js b/src/backend/config.js index 30bf0bb..d7f70e9 100644 --- a/src/backend/config.js +++ b/src/backend/config.js @@ -17,7 +17,7 @@ export const LOG_LEVEL = process.env.LOG_LEVEL || (isDevelopment ? 'debug' : 'in // =================================================================== const logLevels = { debug: 0, info: 1, warn: 2, error: 3 }; -const currentLogLevel = logLevels[LOG_LEVEL] || 1; +const currentLogLevel = logLevels[LOG_LEVEL] ?? 1; function log(level, message, data) { if (logLevels[level] < currentLogLevel) return; diff --git a/src/backend/index.js b/src/backend/index.js index 266cce9..8bb25da 100644 --- a/src/backend/index.js +++ b/src/backend/index.js @@ -1,7 +1,7 @@ import { Elysia, t } from 'elysia'; import { cors } from '@elysiajs/cors'; import { jwt } from '@elysiajs/jwt'; -import { JWT_SECRET, logger } from './config.js'; +import { JWT_SECRET, logger, LOG_LEVEL } from './config.js'; import { registerUser, authenticateUser, @@ -283,13 +283,13 @@ const app = new Elysia() } try { - const result = await enqueueJob(user.id); + const result = await enqueueJob(user.id, 'lotw_sync'); if (!result.success && result.existingJob) { return { success: true, jobId: result.existingJob, - message: 'A sync job is already running', + message: 'A LoTW sync job is already running', }; } @@ -299,7 +299,41 @@ const app = new Elysia() set.status = 500; return { success: false, - error: `Failed to queue sync job: ${error.message}`, + error: `Failed to queue LoTW sync job: ${error.message}`, + }; + } + }) + + /** + * POST /api/dcl/sync + * Queue a DCL sync job (requires authentication) + * Returns immediately with job ID + */ + .post('/api/dcl/sync', async ({ user, set }) => { + if (!user) { + logger.warn('/api/dcl/sync: Unauthorized access attempt'); + set.status = 401; + return { success: false, error: 'Unauthorized' }; + } + + try { + const result = await enqueueJob(user.id, 'dcl_sync'); + + if (!result.success && result.existingJob) { + return { + success: true, + jobId: result.existingJob, + message: 'A DCL sync job is already running', + }; + } + + return result; + } catch (error) { + logger.error('Error in /api/dcl/sync', { error: error.message }); + set.status = 500; + return { + success: false, + error: `Failed to queue DCL sync job: ${error.message}`, }; } }) @@ -703,3 +737,9 @@ const app = new Elysia() // Start server - uses PORT 
environment variable if set, otherwise defaults to 3001 .listen(process.env.PORT || 3001); + +logger.info('Server started', { + port: process.env.PORT || 3001, + nodeEnv: process.env.NODE_ENV || 'unknown', + logLevel: LOG_LEVEL, +}); diff --git a/src/backend/services/dcl.service.js b/src/backend/services/dcl.service.js index 42d96c5..ce76ac7 100644 --- a/src/backend/services/dcl.service.js +++ b/src/backend/services/dcl.service.js @@ -58,6 +58,17 @@ export async function fetchQSOsFromDCL(dclApiKey, sinceDate = null) { requestBody.qsl_since = dateStr; } + // Debug log request parameters (redact API key) + logger.debug('DCL API request parameters', { + url: DCL_API_URL, + method: 'POST', + key: dclApiKey ? `${dclApiKey.substring(0, 4)}...${dclApiKey.substring(dclApiKey.length - 4)}` : null, + limit: requestBody.limit, + qsl_since: requestBody.qsl_since, + qso_since: requestBody.qso_since, + cnf_only: requestBody.cnf_only, + }); + try { const controller = new AbortController(); const timeoutId = setTimeout(() => controller.abort(), REQUEST_TIMEOUT); diff --git a/src/backend/services/job-queue.service.js b/src/backend/services/job-queue.service.js index 143ed9e..651c5b9 100644 --- a/src/backend/services/job-queue.service.js +++ b/src/backend/services/job-queue.service.js @@ -19,20 +19,21 @@ export const JobStatus = { const activeJobs = new Map(); /** - * Enqueue a new LoTW sync job + * Enqueue a new sync job * @param {number} userId - User ID + * @param {string} jobType - Type of job ('lotw_sync' or 'dcl_sync') * @returns {Promise} Job object with ID */ -export async function enqueueJob(userId) { - logger.debug('Enqueueing LoTW sync job', { userId }); +export async function enqueueJob(userId, jobType = 'lotw_sync') { + logger.debug('Enqueueing sync job', { userId, jobType }); - // Check for existing active job - const existingJob = await getUserActiveJob(userId); + // Check for existing active job of the same type + const existingJob = await getUserActiveJob(userId, jobType); if (existingJob) { - logger.debug('Existing active job found', { jobId: existingJob.id }); + logger.debug('Existing active job found', { jobId: existingJob.id, jobType }); return { success: false, - error: 'A LoTW sync job is already running or pending for this user', + error: `A ${jobType} job is already running or pending for this user`, existingJob: existingJob.id, }; } @@ -42,16 +43,16 @@ export async function enqueueJob(userId) { .insert(syncJobs) .values({ userId, - type: 'lotw_sync', + type: jobType, status: JobStatus.PENDING, createdAt: new Date(), }) .returning(); - logger.info('Job created', { jobId: job.id, userId }); + logger.info('Job created', { jobId: job.id, userId, jobType }); // Start processing asynchronously (don't await) - processJobAsync(job.id, userId).catch((error) => { + processJobAsync(job.id, userId, jobType).catch((error) => { logger.error(`Job processing error`, { jobId: job.id, error: error.message }); }); @@ -68,15 +69,14 @@ export async function enqueueJob(userId) { } /** - * Process a LoTW sync job asynchronously + * Process a sync job asynchronously * @param {number} jobId - Job ID * @param {number} userId - User ID + * @param {string} jobType - Type of job ('lotw_sync' or 'dcl_sync') */ -async function processJobAsync(jobId, userId) { +async function processJobAsync(jobId, userId, jobType) { const jobPromise = (async () => { try { - // Import dynamically to avoid circular dependency - const { syncQSOs } = await import('./lotw.service.js'); const { getUserById } = await 
import('./auth.service.js'); // Update status to running @@ -85,37 +85,72 @@ async function processJobAsync(jobId, userId) { startedAt: new Date(), }); - // Get user credentials - const user = await getUserById(userId); - if (!user || !user.lotwUsername || !user.lotwPassword) { - await updateJob(jobId, { - status: JobStatus.FAILED, - completedAt: new Date(), - error: 'LoTW credentials not configured', + let result; + + if (jobType === 'dcl_sync') { + // Get user credentials + const user = await getUserById(userId); + if (!user || !user.dclApiKey) { + await updateJob(jobId, { + status: JobStatus.FAILED, + completedAt: new Date(), + error: 'DCL credentials not configured', + }); + return null; + } + + // Get last QSL date for incremental sync + const { getLastDCLQSLDate, syncQSOs: syncDCLQSOs } = await import('./dcl.service.js'); + const lastQSLDate = await getLastDCLQSLDate(userId); + const sinceDate = lastQSLDate || new Date('2000-01-01'); + + if (lastQSLDate) { + logger.info(`Job ${jobId}: DCL incremental sync`, { since: sinceDate.toISOString().split('T')[0] }); + } else { + logger.info(`Job ${jobId}: DCL full sync`); + } + + // Update job progress + await updateJobProgress(jobId, { + message: 'Fetching QSOs from DCL...', + step: 'fetch', }); - return null; - } - // Get last QSL date for incremental sync - const { getLastLoTWQSLDate } = await import('./lotw.service.js'); - const lastQSLDate = await getLastLoTWQSLDate(userId); - const sinceDate = lastQSLDate || new Date('2000-01-01'); - - if (lastQSLDate) { - logger.info(`Job ${jobId}: Incremental sync`, { since: sinceDate.toISOString().split('T')[0] }); + // Execute the sync + result = await syncDCLQSOs(userId, user.dclApiKey, sinceDate, jobId); } else { - logger.info(`Job ${jobId}: Full sync`); + // LoTW sync (default) + const user = await getUserById(userId); + if (!user || !user.lotwUsername || !user.lotwPassword) { + await updateJob(jobId, { + status: JobStatus.FAILED, + completedAt: new Date(), + error: 'LoTW credentials not configured', + }); + return null; + } + + // Get last QSL date for incremental sync + const { getLastLoTWQSLDate, syncQSOs } = await import('./lotw.service.js'); + const lastQSLDate = await getLastLoTWQSLDate(userId); + const sinceDate = lastQSLDate || new Date('2000-01-01'); + + if (lastQSLDate) { + logger.info(`Job ${jobId}: LoTW incremental sync`, { since: sinceDate.toISOString().split('T')[0] }); + } else { + logger.info(`Job ${jobId}: LoTW full sync`); + } + + // Update job progress + await updateJobProgress(jobId, { + message: 'Fetching QSOs from LoTW...', + step: 'fetch', + }); + + // Execute the sync + result = await syncQSOs(userId, user.lotwUsername, user.lotwPassword, sinceDate, jobId); } - // Update job progress - await updateJobProgress(jobId, { - message: 'Fetching QSOs from LoTW...', - step: 'fetch', - }); - - // Execute the sync - const result = await syncQSOs(userId, user.lotwUsername, user.lotwPassword, sinceDate, jobId); - // Update job as completed await updateJob(jobId, { status: JobStatus.COMPLETED, @@ -197,9 +232,10 @@ export async function getJobStatus(jobId) { /** * Get user's active job (pending or running) * @param {number} userId - User ID + * @param {string} jobType - Optional job type filter * @returns {Promise} Active job or null */ -export async function getUserActiveJob(userId) { +export async function getUserActiveJob(userId, jobType = null) { const conditions = [ eq(syncJobs.userId, userId), or( @@ -208,6 +244,10 @@ export async function getUserActiveJob(userId) { ), ]; + if 
(jobType) { + conditions.push(eq(syncJobs.type, jobType)); + } + const [job] = await db .select() .from(syncJobs) diff --git a/src/frontend/src/lib/api.js b/src/frontend/src/lib/api.js index 4612dac..786af04 100644 --- a/src/frontend/src/lib/api.js +++ b/src/frontend/src/lib/api.js @@ -74,6 +74,8 @@ export const qsosAPI = { syncFromLoTW: () => apiRequest('/lotw/sync', { method: 'POST' }), + syncFromDCL: () => apiRequest('/dcl/sync', { method: 'POST' }), + deleteAll: () => apiRequest('/qsos/all', { method: 'DELETE' }), }; diff --git a/src/frontend/src/routes/qsos/+page.svelte b/src/frontend/src/routes/qsos/+page.svelte index 610dc1e..72591ba 100644 --- a/src/frontend/src/routes/qsos/+page.svelte +++ b/src/frontend/src/routes/qsos/+page.svelte @@ -13,11 +13,20 @@ let pageSize = 100; let pagination = null; - // Job polling state - let syncJobId = null; - let syncStatus = null; - let syncProgress = null; - let pollingInterval = null; + // Job polling state - LoTW + let lotwSyncJobId = null; + let lotwSyncStatus = null; + let lotwSyncProgress = null; + let lotwPollingInterval = null; + + // Job polling state - DCL + let dclSyncJobId = null; + let dclSyncStatus = null; + let dclSyncProgress = null; + let dclPollingInterval = null; + + // Sync result + let syncResult = null; // Delete confirmation state let showDeleteConfirm = false; @@ -34,14 +43,17 @@ if (!$auth.user) return; await loadQSOs(); await loadStats(); - // Check for active job on mount - await checkActiveJob(); + // Check for active jobs on mount + await checkActiveJobs(); }); - // Clean up polling interval on unmount + // Clean up polling intervals on unmount onDestroy(() => { - if (pollingInterval) { - clearInterval(pollingInterval); + if (lotwPollingInterval) { + clearInterval(lotwPollingInterval); + } + if (dclPollingInterval) { + clearInterval(dclPollingInterval); } }); @@ -76,98 +88,169 @@ } } - async function checkActiveJob() { + async function checkActiveJobs() { try { const response = await jobsAPI.getActive(); if (response.job) { - syncJobId = response.job.id; - syncStatus = response.job.status; - // Start polling if job is running - if (syncStatus === 'running' || syncStatus === 'pending') { - startPolling(response.job.id); + const job = response.job; + if (job.type === 'lotw_sync') { + lotwSyncJobId = job.id; + lotwSyncStatus = job.status; + if (lotwSyncStatus === 'running' || lotwSyncStatus === 'pending') { + startLoTWPolling(job.id); + } + } else if (job.type === 'dcl_sync') { + dclSyncJobId = job.id; + dclSyncStatus = job.status; + if (dclSyncStatus === 'running' || dclSyncStatus === 'pending') { + startDCLPolling(job.id); + } } } } catch (err) { - console.error('Failed to check active job:', err); + console.error('Failed to check active jobs:', err); } } - async function startPolling(jobId) { - syncJobId = jobId; - syncStatus = 'running'; + async function startLoTWPolling(jobId) { + lotwSyncJobId = jobId; + lotwSyncStatus = 'running'; - // Clear any existing interval - if (pollingInterval) { - clearInterval(pollingInterval); + if (lotwPollingInterval) { + clearInterval(lotwPollingInterval); } - // Poll every 2 seconds - pollingInterval = setInterval(async () => { + lotwPollingInterval = setInterval(async () => { try { const response = await jobsAPI.getStatus(jobId); const job = response.job; - syncStatus = job.status; - syncProgress = job.result?.progress ? job.result : null; + lotwSyncStatus = job.status; + lotwSyncProgress = job.result?.progress ? 
job.result : null; if (job.status === 'completed') { - clearInterval(pollingInterval); - pollingInterval = null; - syncJobId = null; - syncProgress = null; - syncStatus = null; + clearInterval(lotwPollingInterval); + lotwPollingInterval = null; + lotwSyncJobId = null; + lotwSyncProgress = null; + lotwSyncStatus = null; - // Reload QSOs and stats await loadQSOs(); await loadStats(); - // Show success message syncResult = { success: true, + source: 'LoTW', ...job.result, }; } else if (job.status === 'failed') { - clearInterval(pollingInterval); - pollingInterval = null; - syncJobId = null; - syncProgress = null; - syncStatus = null; + clearInterval(lotwPollingInterval); + lotwPollingInterval = null; + lotwSyncJobId = null; + lotwSyncProgress = null; + lotwSyncStatus = null; - // Show error message syncResult = { success: false, - error: job.error || 'Sync failed', + source: 'LoTW', + error: job.error || 'LoTW sync failed', }; } } catch (err) { - console.error('Failed to poll job status:', err); - // Don't stop polling on error, might be temporary + console.error('Failed to poll LoTW job status:', err); } }, 2000); } - async function handleSync() { + async function startDCLPolling(jobId) { + dclSyncJobId = jobId; + dclSyncStatus = 'running'; + + if (dclPollingInterval) { + clearInterval(dclPollingInterval); + } + + dclPollingInterval = setInterval(async () => { + try { + const response = await jobsAPI.getStatus(jobId); + const job = response.job; + + dclSyncStatus = job.status; + dclSyncProgress = job.result?.progress ? job.result : null; + + if (job.status === 'completed') { + clearInterval(dclPollingInterval); + dclPollingInterval = null; + dclSyncJobId = null; + dclSyncProgress = null; + dclSyncStatus = null; + + await loadQSOs(); + await loadStats(); + + syncResult = { + success: true, + source: 'DCL', + ...job.result, + }; + } else if (job.status === 'failed') { + clearInterval(dclPollingInterval); + dclPollingInterval = null; + dclSyncJobId = null; + dclSyncProgress = null; + dclSyncStatus = null; + + syncResult = { + success: false, + source: 'DCL', + error: job.error || 'DCL sync failed', + }; + } + } catch (err) { + console.error('Failed to poll DCL job status:', err); + } + }, 2000); + } + + async function handleLoTWSync() { try { const response = await qsosAPI.syncFromLoTW(); if (response.jobId) { - // Job was queued successfully - startPolling(response.jobId); + startLoTWPolling(response.jobId); } else if (response.existingJob) { - // There's already an active job - startPolling(response.existingJob); + startLoTWPolling(response.existingJob); } else { - throw new Error(response.error || 'Failed to queue sync job'); + throw new Error(response.error || 'Failed to queue LoTW sync job'); } } catch (err) { syncResult = { success: false, + source: 'LoTW', error: err.message, }; } } - let syncResult = null; + async function handleDCLSync() { + try { + const response = await qsosAPI.syncFromDCL(); + + if (response.jobId) { + startDCLPolling(response.jobId); + } else if (response.existingJob) { + startDCLPolling(response.existingJob); + } else { + throw new Error(response.error || 'Failed to queue DCL sync job'); + } + } catch (err) { + syncResult = { + success: false, + source: 'DCL', + error: err.message, + }; + } + } async function applyFilters() { currentPage = 1; @@ -263,31 +346,52 @@ {/if} + - {#if syncProgress} + {#if lotwSyncProgress}

Syncing from LoTW...
- {syncProgress.message || 'Processing...'}
- {#if syncProgress.total}
- Progress: {syncProgress.processed || 0} / {syncProgress.total}
- {/if}
+ {lotwSyncProgress.message || 'Processing...'}
+ {#if lotwSyncProgress.total}
+ Progress: {lotwSyncProgress.processed || 0} / {lotwSyncProgress.total}
+ {/if}
+ {/if}
+
+ {#if dclSyncProgress}
+ Syncing from DCL...
+ {dclSyncProgress.message || 'Processing...'}
+ {#if dclSyncProgress.total}
+ Progress: {dclSyncProgress.processed || 0} / {dclSyncProgress.total}
{/if}
{/if} @@ -602,6 +706,23 @@ .header-buttons { display: flex; gap: 1rem; + flex-wrap: wrap; + } + + .lotw-btn { + background-color: #4a90e2; + } + + .lotw-btn:hover:not(:disabled) { + background-color: #357abd; + } + + .dcl-btn { + background-color: #e67e22; + } + + .dcl-btn:hover:not(:disabled) { + background-color: #d35400; } .stats-grid { diff --git a/src/frontend/src/routes/settings/+page.svelte b/src/frontend/src/routes/settings/+page.svelte index d0b8785..71e97b5 100644 --- a/src/frontend/src/routes/settings/+page.svelte +++ b/src/frontend/src/routes/settings/+page.svelte @@ -194,8 +194,8 @@

DCL Credentials
- Configure your DARC Community Logbook (DCL) API key for future sync functionality.
- Note: DCL does not currently provide a download API. This is prepared for when they add one.
+ Configure your DARC Community Logbook (DCL) API key to sync your QSOs.
+ Your API key is stored securely and used only to fetch your confirmed QSOs.
{#if hasDCLCredentials}
@@ -220,7 +220,7 @@
placeholder="Your DCL API key" />
- Enter your DCL API key for future sync functionality
+ Enter your DCL API key to sync QSOs
@@ -233,10 +233,10 @@
About DCL
DCL (DARC Community Logbook) is DARC's web-based logbook system for German amateur radio awards.
- It includes DOK (DARC Ortsverband Kennung) fields for local club awards.
+ It includes DOK (DARC Ortsverband Kennung) fields for local club awards like the DLD award.
- Status: Download API not yet available.{' '}
+ Once configured, you can sync your QSOs from DCL on the QSO Log page.
Visit DCL website