Syncing Maps Datasets API with Convex
Problem Statement
Challenge: how do you maintain bidirectional sync between the Convex database (live, real-time) and the Google Maps Datasets API (spatial rendering) while preserving GeoJSON files as the version-controlled source of truth?
Requirements:
- Changes in Convex → automatically update the Google Maps visualization
- Admin edits → persist to both Convex and the eventual GeoJSON export
- GeoJSON files → seed Convex on initialization
- Avoid sync conflicts and data loss
- Support incremental updates (don't re-upload entire dataset on every change)
- Enable multiple dataset types with different sync strategies
Solution: Three-layer sync architecture with scheduled jobs, mutation triggers, and manual export commands.
Sync Architecture
```
┌────────────────────────────────────────┐
│ Admin Edits (React Component)          │
│ • Update GPS object properties         │
│ • Adjust coordinates                   │
│ • Add new objects                      │
└────────────────────────────────────────┘
             ↓ (immediate)
┌────────────────────────────────────────┐
│ Convex Database (Live)                 │
│ • Real-time updates                    │
│ • Optimistic UI                        │
│ • Source for Maps API sync             │
└────────────────────────────────────────┘
             ↓ (debounced, batched)
┌────────────────────────────────────────┐
│ Google Maps Datasets API (Rendering)   │
│ • Spatial indexing                     │
│ • Vector tiles                         │
│ • Public visualization                 │
└────────────────────────────────────────┘
             ↓ (scheduled export)
┌────────────────────────────────────────┐
│ GeoJSON Files (Version Control)        │
│ • Git commits                          │
│ • Source of truth                      │
│ • Manual review before commit          │
└────────────────────────────────────────┘
```
Sync Strategies
Strategy 1: Full Dataset Replacement (Simple)
Use case: Small datasets (<1000 objects), infrequent updates
Flow:
- Admin makes changes in Convex
- Scheduled job (every 30 minutes) exports entire dataset to GeoJSON
- Upload GeoJSON to Google Cloud Storage
- Trigger Maps Datasets API import (replaces entire dataset)
Pros:
- Simple implementation
- Guaranteed consistency
- Easy to reason about
Cons:
- Inefficient for large datasets
- Higher API costs
- 5-10 minute propagation delay
Strategy 2: Incremental Updates (Advanced)
Use case: Large datasets (>1000 objects), frequent updates
Flow:
- Admin makes changes in Convex
- Change tracking table records deltas
- Scheduled job (every 10 minutes) processes deltas
- Batch update Maps Datasets API (add/update/delete specific features)
- Clear processed deltas
Pros:
- Efficient for large datasets
- Faster propagation
- Lower API costs
Cons:
- More complex implementation
- Requires change tracking
- Potential sync conflicts
Strategy 3: Manual Export (Research/Review)
Use case: Research datasets requiring manual review, sensitive data
Flow:
- Admin makes changes in Convex
- Review changes in admin UI
- Manual export command → generates GeoJSON
- Review GeoJSON diff in Git
- Commit to version control
- Manual Maps API upload (or automated on merge)
Pros:
- Full control over what syncs
- Research integrity
- Git audit trail
Cons:
- Requires manual intervention
- Slower propagation
- More workflow complexity
Implementation: Full Dataset Replacement
Step 1: Convex to GeoJSON Export
```typescript
// convex/sync/exportToGeoJSON.ts
import { mutation } from "../_generated/server";
import { v } from "convex/values";

export const exportDatasetToGeoJSON = mutation({
  args: { datasetId: v.id("gpsDatasets") },
  handler: async (ctx, args) => {
    // Get dataset metadata
    const dataset = await ctx.db.get(args.datasetId);
    if (!dataset) throw new Error("Dataset not found");

    // Get all GPS objects
    const objects = await ctx.db
      .query("gpsObjects")
      .withIndex("by_dataset", q => q.eq("datasetId", args.datasetId))
      .collect();

    // Convert to GeoJSON
    const geojson = {
      type: "FeatureCollection",
      features: objects.map(obj => ({
        type: "Feature",
        geometry: obj.geometry,
        properties: {
          name: obj.name,
          description: obj.description,
          ...obj.properties,
          // Internal metadata
          convexId: obj._id,
          convexCreatedAt: obj.createdAt,
          convexUpdatedAt: obj.updatedAt,
          tags: obj.tags,
          songId: obj.songId
        }
      }))
    };

    return geojson;
  }
});
```
Step 2: Write GeoJSON to Vault
```typescript
// scripts/sync-to-vault.ts
import { ConvexHttpClient } from "convex/browser";
import { api } from "../convex/_generated/api";
import fs from "fs/promises";
import path from "path";

const client = new ConvexHttpClient(process.env.CONVEX_URL!);

async function exportToVault(datasetId: string) {
  // Export from Convex
  const geojson = await client.mutation(api.sync.exportDatasetToGeoJSON, {
    datasetId
  });

  // Get dataset metadata for filename
  const dataset = await client.query(api.gpsDatasets.getDataset, { datasetId });

  // Write to vault
  const outputPath = path.join(process.env.VAULT_PATH!, dataset.geojsonPath);
  await fs.writeFile(outputPath, JSON.stringify(geojson, null, 2), "utf8");

  console.log(`✓ Exported ${geojson.features.length} objects to ${outputPath}`);
  return outputPath;
}

// Usage:
// npx tsx scripts/sync-to-vault.ts dataset_abc123
const datasetId = process.argv[2];
if (!datasetId) {
  console.error("Usage: npx tsx scripts/sync-to-vault.ts <datasetId>");
  process.exit(1);
}
exportToVault(datasetId);
```
Step 3: Upload to Google Cloud Storage
```typescript
// scripts/sync-to-gcs.ts
import { Storage } from "@google-cloud/storage";
import path from "path";

const storage = new Storage({
  keyFilename: process.env.GOOGLE_CLOUD_KEY_PATH
});

async function uploadToGCS(localPath: string, bucketName: string) {
  const bucket = storage.bucket(bucketName);
  const filename = path.basename(localPath);

  await bucket.upload(localPath, {
    destination: `gps-datasets/${filename}`,
    metadata: {
      contentType: "application/json",
      metadata: { uploadedAt: new Date().toISOString() }
    }
  });

  console.log(`✓ Uploaded to gs://${bucketName}/gps-datasets/${filename}`);
  return `gs://${bucketName}/gps-datasets/${filename}`;
}
```
Step 4: Trigger Maps Datasets API Import
```typescript
// scripts/sync-to-maps-api.ts
import { google } from "@googleapis/mapsplatformdatasets";

const auth = new google.auth.GoogleAuth({
  keyFile: process.env.GOOGLE_CLOUD_KEY_PATH,
  scopes: ["https://www.googleapis.com/auth/cloud-platform"]
});

const client = google.mapsplatformdatasets({ version: "v1alpha", auth });

async function importToMapsAPI(
  projectId: string,
  datasetId: string,
  gcsUri: string
) {
  const fullDatasetPath = `projects/${projectId}/datasets/${datasetId}`;

  // Trigger import
  const response = await client.projects.datasets.import({
    name: fullDatasetPath,
    requestBody: {
      inputFormat: "GEO_JSON",
      gcsSource: { uri: gcsUri }
    }
  });

  console.log(`✓ Triggered import for dataset ${datasetId}`);
  console.log(`Operation: ${response.data.name}`);

  // Wait for operation to complete (polling)
  const operationName = response.data.name;
  let complete = false;
  while (!complete) {
    await new Promise(resolve => setTimeout(resolve, 5000)); // Wait 5s

    const operation = await client.projects.datasets.operations.get({
      name: operationName
    });
    complete = operation.data.done ?? false;

    if (operation.data.error) {
      throw new Error(`Import failed: ${operation.data.error.message}`);
    }
  }

  console.log(`✓ Import completed successfully`);
}
```
Step 5: Combined Sync Script
```typescript
// scripts/full-sync.ts
import { ConvexHttpClient } from "convex/browser";
import { api } from "../convex/_generated/api";
// Assumes the helpers from the previous steps are exported from their scripts
import { exportToVault } from "./sync-to-vault";
import { uploadToGCS } from "./sync-to-gcs";
import { importToMapsAPI } from "./sync-to-maps-api";

const client = new ConvexHttpClient(process.env.CONVEX_URL!);

async function fullSync(datasetId: string) {
  console.log(`Starting full sync for dataset ${datasetId}...`);

  // 1. Export from Convex to GeoJSON file
  const geojsonPath = await exportToVault(datasetId);

  // 2. Upload to Google Cloud Storage
  const gcsUri = await uploadToGCS(geojsonPath, process.env.GCS_BUCKET!);

  // 3. Get Google Maps dataset ID from Convex
  const dataset = await client.query(api.gpsDatasets.getDataset, { datasetId });
  if (!dataset.googleDatasetId) {
    console.error("Dataset not yet registered with Google Maps API");
    console.log("Run: npm run create-google-dataset");
    process.exit(1);
  }

  // 4. Import to Maps Datasets API
  await importToMapsAPI(
    process.env.GOOGLE_PROJECT_ID!,
    dataset.googleDatasetId,
    gcsUri
  );

  // 5. Update Convex with sync timestamp
  await client.mutation(api.gpsDatasets.updateSyncTimestamp, {
    datasetId,
    timestamp: Date.now()
  });

  console.log(`✓ Full sync completed for ${dataset.name}`);
}
```
Scheduled Sync Job (Convex Cron)
```typescript
// convex/crons.ts
import { cronJobs } from "convex/server";
import { internal } from "./_generated/api";

const crons = cronJobs();

// Sync all datasets every 30 minutes
crons.interval(
  "sync-gps-datasets",
  { minutes: 30 },
  internal.sync.syncAllDatasets
);

export default crons;

// convex/sync.ts
import { internalMutation } from "./_generated/server";
import { internal } from "./_generated/api";

export const syncAllDatasets = internalMutation({
  handler: async (ctx) => {
    // Get all datasets marked for auto-sync
    const datasets = await ctx.db
      .query("gpsDatasets")
      .filter(q => q.eq(q.field("autoSync"), true))
      .collect();

    for (const dataset of datasets) {
      // Trigger the external sync (an action that calls the webhook/script);
      // the scheduler takes a function reference, not a string name
      await ctx.scheduler.runAfter(0, internal.sync.triggerExternalSync, {
        datasetId: dataset._id
      });
    }
  }
});
```
Implementation: Incremental Updates
Change Tracking Table
```typescript
// Add to schema.ts
gpsObjectChanges: defineTable({
  objectId: v.id("gpsObjects"),
  datasetId: v.id("gpsDatasets"),
  changeType: v.union(
    v.literal("created"),
    v.literal("updated"),
    v.literal("deleted")
  ),
  previousState: v.optional(v.any()), // For rollback
  changedAt: v.number(),
  synced: v.boolean(),
  syncedAt: v.optional(v.number())
})
  .index("by_dataset_unsynced", ["datasetId", "synced"])
  .index("by_object", ["objectId"])
```
Track Changes on Mutations
```typescript
// Wrap existing mutations
export const addGPSObject = mutation({
  args: { /* ... */ },
  handler: async (ctx, args) => {
    // Create object
    const objectId = await ctx.db.insert("gpsObjects", { /* ... */ });

    // Track change
    await ctx.db.insert("gpsObjectChanges", {
      objectId,
      datasetId: args.datasetId,
      changeType: "created",
      changedAt: Date.now(),
      synced: false
    });

    return objectId;
  }
});

export const updateGPSObject = mutation({
  args: {
    objectId: v.id("gpsObjects"),
    updates: v.any()
  },
  handler: async (ctx, args) => {
    const existing = await ctx.db.get(args.objectId);
    if (!existing) throw new Error("Object not found");

    // Track change
    await ctx.db.insert("gpsObjectChanges", {
      objectId: args.objectId,
      datasetId: existing.datasetId,
      changeType: "updated",
      previousState: existing, // For potential rollback
      changedAt: Date.now(),
      synced: false
    });

    // Apply update
    await ctx.db.patch(args.objectId, {
      ...args.updates,
      updatedAt: Date.now()
    });
  }
});
```
Process Incremental Changes
```typescript
// scripts/incremental-sync.ts
async function processIncrementalChanges(datasetId: string) {
  // Get unsynced changes
  const changes = await client.query(api.sync.getUnsyncedChanges, { datasetId });

  if (changes.length === 0) {
    console.log("No changes to sync");
    return;
  }

  console.log(`Processing ${changes.length} changes...`);

  // Group by change type
  const created = changes.filter(c => c.changeType === "created");
  const updated = changes.filter(c => c.changeType === "updated");
  const deleted = changes.filter(c => c.changeType === "deleted");

  // Process each group
  if (created.length > 0) {
    // Export new objects to GeoJSON
    const newObjects = await client.query(api.gpsObjects.getMany, {
      objectIds: created.map(c => c.objectId)
    });
    // Upload to Maps API (append to dataset)
    await appendToMapsDataset(datasetId, newObjects);
  }

  if (updated.length > 0) {
    // Similar process for updates
    await updateMapsDatasetFeatures(datasetId, updated);
  }

  if (deleted.length > 0) {
    // Delete from Maps API
    await deleteFromMapsDataset(datasetId, deleted);
  }

  // Mark changes as synced
  await client.mutation(api.sync.markChangesSynced, {
    changeIds: changes.map(c => c._id)
  });

  console.log("✓ Incremental sync completed");
}
```
Reverse Sync: GeoJSON → Convex
Seed from GeoJSON on Initialize
```typescript
// scripts/seed-from-geojson.ts
import { ConvexHttpClient } from "convex/browser";
import { api } from "../convex/_generated/api";
import fs from "fs/promises";

async function seedFromGeoJSON(geojsonPath: string, datasetId: string) {
  // Read GeoJSON file
  const content = await fs.readFile(geojsonPath, "utf8");
  const geojson = JSON.parse(content);

  console.log(`Seeding ${geojson.features.length} objects...`);

  // Batch insert
  const client = new ConvexHttpClient(process.env.CONVEX_URL!);

  for (const feature of geojson.features) {
    // Check if object already exists (by convexId in properties)
    const existingId = feature.properties.convexId;
    if (existingId) {
      const existing = await client.query(api.gpsObjects.get, {
        objectId: existingId
      });
      if (existing) {
        console.log(`Skipping existing object: ${feature.properties.name}`);
        continue;
      }
    }

    // Insert new object
    await client.mutation(api.gpsObjects.addGPSObject, {
      datasetId,
      geometry: feature.geometry,
      name: feature.properties.name,
      description: feature.properties.description,
      properties: feature.properties,
      tags: feature.properties.tags || []
    });
  }

  console.log("✓ Seeding completed");
}
```
Workflow Examples
Workflow 1: Field Recording Added
1. User records audio at Tijuana Jazz Club
2. React component creates GPS object in Convex
→ Includes coordinates, audio URL, metadata
3. GPS object saved (optimistic UI update)
4. Scheduled job (30 min later) exports to GeoJSON
5. GeoJSON uploaded to Google Cloud Storage
6. Maps Datasets API imports updated dataset
7. Map visualization updates automatically
8. Developer reviews GeoJSON diff
9. Git commit preserves change
Workflow 2: Manual Correction
1. Developer notices incorrect coordinates in Git
2. Edit GeoJSON file directly
3. Run seed script: npm run seed-from-geojson
4. Convex updated with corrected data
5. Maps API syncs on next scheduled job
6. All visualizations show corrected location
Workflow 3: Bulk Import
1. Receive CSV of 500 water monitoring sites
2. Convert CSV → GeoJSON with a script
3. Place GeoJSON in vault
4. Run seed script to populate Convex
5. Trigger full sync to Maps API
6. Map visualization updates with all 500 sites
7. Commit GeoJSON to Git
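Step 2 of this workflow is a small conversion script. Below is a minimal sketch, assuming the CSV has `name`, `lat`, and `lng` columns (the real monitoring-site CSV may differ) and that no field contains embedded commas:

```typescript
// Convert a simple CSV of sites into a GeoJSON FeatureCollection.
// Column names `name`, `lat`, `lng` are assumptions about the input.
function csvToGeoJSON(csv: string) {
  const [headerLine, ...rows] = csv.trim().split("\n");
  const headers = headerLine.split(",");

  const features = rows.map(row => {
    // Zip header names with cell values into a record
    const record: Record<string, string> = {};
    row.split(",").forEach((value, i) => {
      record[headers[i]] = value.trim();
    });

    return {
      type: "Feature" as const,
      geometry: {
        type: "Point" as const,
        // GeoJSON orders coordinates [longitude, latitude]
        coordinates: [Number(record.lng), Number(record.lat)]
      },
      properties: { name: record.name }
    };
  });

  return { type: "FeatureCollection" as const, features };
}
```

The output can be written to the vault and fed to the seed script from the previous section.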
Conflict Resolution
Handling Concurrent Edits
Scenario: User A edits object in React UI, User B edits same object simultaneously
Strategy:
- Convex handles concurrent mutations with last-write-wins
- Optimistic UI shows immediate feedback
- If conflict detected, show warning to user
- Allow manual merge or force overwrite
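The conflict-detection step can be sketched as a compare-and-set on the object's `updatedAt` timestamp. This is an illustrative plain-TypeScript helper, not Convex API; `GPSObjectDoc` is a reduced stand-in for the real schema:

```typescript
// Reduced document shape for illustration
interface GPSObjectDoc {
  name: string;
  updatedAt: number;
}

type ApplyResult =
  | { status: "applied"; doc: GPSObjectDoc }
  | { status: "conflict"; serverDoc: GPSObjectDoc };

// The client sends the `updatedAt` it last read. If the server copy has
// moved on since then, report a conflict instead of silently overwriting.
function applyIfUnchanged(
  serverDoc: GPSObjectDoc,
  expectedUpdatedAt: number,
  updates: Partial<GPSObjectDoc>
): ApplyResult {
  if (serverDoc.updatedAt !== expectedUpdatedAt) {
    return { status: "conflict", serverDoc };
  }
  return {
    status: "applied",
    doc: { ...serverDoc, ...updates, updatedAt: Date.now() }
  };
}
```

A Convex mutation would run the same comparison server-side before calling `ctx.db.patch`, returning the conflict to the client so the UI can offer merge-or-overwrite.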
Handling Sync Failures
Scenario: Maps API import fails (network error, quota exceeded)
Strategy:
- Scheduled job logs error to Convex error table
- Retry with exponential backoff
- Alert developer if failures persist >24 hours
- Manual intervention required for persistent failures
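The retry step might look like the sketch below. The base delay (1 s) and cap (60 s) are assumed values; the note does not specify them:

```typescript
// Compute the exponential backoff schedule: base * 2^attempt, capped at maxMs
function backoffDelays(attempts: number, baseMs = 1000, maxMs = 60000): number[] {
  const delays: number[] = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(baseMs * 2 ** i, maxMs));
  }
  return delays;
}

// Retry an async operation (e.g. the Maps API import), waiting the
// backoff delay after each failure; rethrow the last error if all fail
async function withRetry<T>(fn: () => Promise<T>, attempts = 5): Promise<T> {
  let lastError: unknown;
  for (const delay of backoffDelays(attempts)) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```

Each failure would also be logged to the `syncLogs` table described below, so the dashboard can surface persistent failures for manual intervention.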
Performance Optimization
Batching Changes
Instead of syncing every 30 minutes, batch by:
- Time window (accumulate 30 minutes of changes)
- Change count (sync when >100 unsynced changes)
- Manual trigger (admin clicks "Sync Now")
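The first two triggers collapse into a single predicate the scheduled job can check (the manual trigger simply bypasses it). Thresholds mirror the bullets above; the function name `shouldSync` is my own:

```typescript
// Sync when changes have piled up OR enough time has passed since last sync
function shouldSync(
  unsyncedCount: number,
  lastSyncedAt: number,
  now: number,
  maxChanges = 100,
  maxAgeMs = 30 * 60 * 1000 // 30-minute window
): boolean {
  return unsyncedCount > maxChanges || now - lastSyncedAt >= maxAgeMs;
}
```

The cron job would call this with the count from the `by_dataset_unsynced` index and skip datasets where it returns false, saving API quota.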
Caching Map Tiles
Google Maps caches vector tiles for performance. After syncing:
- Wait 5-10 minutes for propagation
- Force refresh map:
map.data.reload()
Debouncing User Edits
```typescript
// In React component
import { useDebouncedCallback } from 'use-debounce';

const debouncedUpdate = useDebouncedCallback(
  (objectId, updates) => {
    updateGPSObject({ objectId, updates });
  },
  1000 // Wait 1 second after user stops typing
);
```
Monitoring and Debugging
Sync Status Dashboard
```typescript
// convex/sync/status.ts
import { query } from "../_generated/server";
import { v } from "convex/values";

export const getSyncStatus = query({
  args: { datasetId: v.id("gpsDatasets") },
  handler: async (ctx, args) => {
    const dataset = await ctx.db.get(args.datasetId);

    const unsyncedChanges = await ctx.db
      .query("gpsObjectChanges")
      .withIndex("by_dataset_unsynced", q =>
        q.eq("datasetId", args.datasetId).eq("synced", false)
      )
      .collect();

    return {
      datasetName: dataset?.name,
      lastSyncedAt: dataset?.lastSyncedAt,
      unsyncedChanges: unsyncedChanges.length,
      nextSyncIn: calculateNextSync(dataset?.lastSyncedAt), // helper defined elsewhere
      googleDatasetId: dataset?.googleDatasetId
    };
  }
});
```
Logging
```typescript
// convex/schema.ts
syncLogs: defineTable({
  datasetId: v.id("gpsDatasets"),
  operation: v.union(
    v.literal("export"),
    v.literal("upload"),
    v.literal("import"),
    v.literal("seed")
  ),
  status: v.union(v.literal("success"), v.literal("error")),
  message: v.string(),
  details: v.optional(v.any()),
  timestamp: v.number()
}).index("by_dataset", ["datasetId"])
```
See Also
- GPS Dataset Architecture with Convex - Database schema
- Building a GPS Dataset Manager - React component with sync controls
- Maps Datasets API - Google's spatial database API
- Data Storage Architecture - NNT ecosystem sync patterns