# Using MCP Tools for Google Maps Integration
## Overview
The Google Maps MCP (Model Context Protocol) server provides simple function calls for common mapping operations - no API code needed. Perfect for data processing, automation scripts, and backend workflows.
**What you can do:**
- Search for places and get details
- Geocode addresses ↔ coordinates
- Calculate routes and travel times
- Get elevation data
- All without writing HTTP requests or parsing JSON
**Available MCP tools:**

- `maps_search_places` - Find venues, facilities, landmarks
- `maps_place_details` - Get detailed info (hours, reviews, photos)
- `maps_geocode` - Address → coordinates
- `maps_reverse_geocode` - Coordinates → address
- `maps_directions` - Routes with turn-by-turn directions
- `maps_distance_matrix` - Travel times between multiple points
- `maps_elevation` - Elevation at coordinates
**Time:** 1-2 hours to set up; instant use thereafter
## Setup: Docker MCP Gateway

### Prerequisites

- ✅ Docker installed and running
- ✅ Google Cloud project with APIs enabled
- ✅ Google Maps API key
- ✅ Basic command line knowledge

### Step 1: Configure MCP Server
The Google Maps MCP server is part of the Docker MCP Gateway. Configure it with your API key:
`~/.config/mcp/config.json`:

```json
{
  "mcpServers": {
    "googlemaps": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "GOOGLE_MAPS_API_KEY=YOUR_API_KEY_HERE",
        "mcp/googlemaps"
      ]
    }
  }
}
```
### Step 2: Test Connection

```python
# In Python (or your preferred language with MCP support)
from mcp import Client

client = Client()

# Test geocoding
result = client.call_tool("maps_geocode", {
    "address": "Tijuana, Mexico"
})
print(result)
# {'location': {'lat': 32.5331957, 'lng': -117.0192784}, ...}
```
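The same `call_tool` pattern works for every tool listed above. For example, reverse geocoding (coordinates → address) - a minimal sketch reusing the `client` from the test script; the `latitude`/`longitude` parameter names are assumptions, so check your server's tool schema:

```python
# Reverse geocode: coordinates → address.
# Parameter names below are assumptions - verify against the tool's schema.
result = client.call_tool("maps_reverse_geocode", {
    "latitude": 32.5331957,
    "longitude": -117.0192784
})
print(result)  # Expect a formatted address near central Tijuana
```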
## Common Workflows

### Workflow 1: Batch Geocoding Addresses

**Problem:** You have 100 addresses in a spreadsheet and need coordinates for mapping.

**Solution:**

```python
import csv

# Read addresses from CSV
with open('addresses.csv', 'r') as f:
    reader = csv.DictReader(f)
    locations = list(reader)

# Geocode each address
for location in locations:
    try:
        result = maps_geocode(location['address'])
        location['latitude'] = result['location']['lat']
        location['longitude'] = result['location']['lng']
        location['formatted_address'] = result['formatted_address']
        print(f"✓ {location['name']}: {location['latitude']}, {location['longitude']}")
    except Exception as e:
        print(f"✗ {location['name']}: {e}")
        location['latitude'] = None
        location['longitude'] = None
        location['formatted_address'] = None  # keep every row's columns identical for DictWriter

# Save updated CSV
with open('addresses_geocoded.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=locations[0].keys())
    writer.writeheader()
    writer.writerows(locations)

print(f"\nGeocoded {len(locations)} addresses")
```
**Real-world example:**

```text
# Input: addresses.csv
name,address
Tijuana Jazz Club,"Av. Revolución 1006, Tijuana"
Black Box,"Av. Revolución 1217, Tijuana"
WWTP La Morita,"Blvd. Olivos Nte. 38, Tijuana"

# Output: addresses_geocoded.csv (with lat/lng added)
name,address,latitude,longitude,formatted_address
Tijuana Jazz Club,"Av. Revolución 1006, Tijuana",32.5327,-117.0363,"Av. Revolución 1006, Zona Centro, 22000 Tijuana, B.C., Mexico"
...
```
### Workflow 2: Find All Venues in an Area

**Problem:** Build a database of all music venues in Tijuana.

**Solution:**

```python
import json

# Search for different types of music venues
queries = [
    "music venues Tijuana Mexico",
    "concert halls Tijuana Mexico",
    "jazz clubs Tijuana Mexico",
    "recording studios Tijuana Mexico",
    "live music bars Tijuana Mexico"
]

all_venues = []
for query in queries:
    results = maps_search_places(query)
    print(f"{query}: found {len(results['places'])} results")

    for place in results['places']:
        all_venues.append({
            'name': place['name'],
            'address': place['formatted_address'],
            'lat': place['location']['lat'],
            'lng': place['location']['lng'],
            'place_id': place['place_id'],
            'rating': place.get('rating'),
            'types': ','.join(place.get('types', []))
        })

# Deduplicate by place_id
unique_venues = list({v['place_id']: v for v in all_venues}.values())
print(f"\nFound {len(unique_venues)} unique venues")

# Get detailed info for each
detailed_venues = []
for venue in unique_venues:
    details = maps_place_details(venue['place_id'])
    venue['phone'] = details.get('formatted_phone_number')
    venue['website'] = details.get('website')
    venue['hours'] = details.get('opening_hours', {}).get('weekday_text', [])
    venue['reviews_count'] = len(details.get('reviews', []))
    detailed_venues.append(venue)
    print(f"✓ {venue['name']} - {venue['rating']}★")

# Export as JSON
with open('tijuana_venues.json', 'w') as f:
    json.dump(detailed_venues, f, indent=2)
```
### Workflow 3: Calculate a Travel-Time Matrix

**Problem:** Which water treatment plant is closest to each monitoring site?

**Solution:**

```python
# Treatment plants (origins)
plants = [
    "WWTP La Morita, Tijuana",
    "San Antonio de los Buenos Wastewater Treatment Plant, Tijuana",
    "CESPT Plant, Tijuana"
]

# Monitoring sites (destinations)
sites = [
    "Tijuana River Estuary",
    "32.4850,-116.9500",  # Can use coordinates too
    "32.5100,-116.9200",
    "Downtown Tijuana"
]

# Calculate all origin→destination times
matrix = maps_distance_matrix(
    origins=plants,
    destinations=sites,
    mode="driving"
)

# Find nearest plant for each site
print("Nearest plant to each monitoring site:\n")
for j, dest in enumerate(matrix['destination_addresses']):
    print(f"Site: {dest}")

    # Get travel time from each plant
    times = []
    for i in range(len(plants)):
        element = matrix['results'][i]['elements'][j]
        if element['status'] == 'OK':
            times.append({
                'plant': matrix['origin_addresses'][i],
                'duration_min': element['duration']['value'] / 60,
                'distance_km': element['distance']['value'] / 1000
            })

    # Find shortest
    nearest = min(times, key=lambda x: x['duration_min'])
    print(f"  → Nearest: {nearest['plant']}")
    print(f"    ({nearest['duration_min']:.0f} min, {nearest['distance_km']:.1f} km)\n")
```
**Output:**

```text
Nearest plant to each monitoring site:

Site: Tijuana River Estuary
  → Nearest: San Antonio de los Buenos Wastewater Treatment Plant
    (18 min, 12.3 km)

Site: 32.4850,-116.9500
  → Nearest: CESPT Plant
    (8 min, 5.7 km)
...
```
### Workflow 4: Generate Inspection Routes

**Problem:** You need to visit 10 facilities. What's the optimal route?

**Solution:**

```python
# Facilities to inspect
facilities = [
    "32.5679,-117.1241",  # Site 1
    "32.5450,-117.0850",  # Site 2
    "32.5200,-117.0300",  # Site 3
    "32.4850,-116.9500",  # Site 4
    "32.4650,-116.9300",  # Site 5
    "32.4552,-116.8605",  # Site 6
    "32.4698,-117.0828",  # Site 7
    "32.4569,-116.9080",  # Site 8
    "32.4400,-116.8800",  # Site 9
    "32.5100,-116.9200"   # Site 10
]

# Starting point (office/lab)
start = "32.5300,-117.0200"

# Calculate optimal route
route = maps_directions(
    origin=start,
    destination=start,  # Return to start
    waypoints="optimize:true|" + "|".join(facilities),
    mode="driving"
)

# Get optimized order
waypoint_order = route['routes'][0]['waypoint_order']

print("Optimal inspection route:")
print(f"1. Start: {start}")
for i, wp_index in enumerate(waypoint_order):
    print(f"{i+2}. Site {wp_index+1}: {facilities[wp_index]}")
print(f"{len(facilities)+2}. Return to start\n")

# Calculate total time and distance
legs = route['routes'][0]['legs']
total_time = sum(leg['duration']['value'] for leg in legs)
total_distance = sum(leg['distance']['value'] for leg in legs)

print(f"Total driving: {total_distance/1000:.1f} km in {total_time/60:.0f} minutes")
print("Add ~15 min per site for inspection")
print(f"Total estimated time: {total_time/60 + len(facilities)*15:.0f} minutes ({(total_time/60 + len(facilities)*15)/60:.1f} hours)")

# Export turn-by-turn directions
print("\n=== TURN-BY-TURN DIRECTIONS ===\n")
for i, leg in enumerate(legs):
    print(f"Leg {i+1}: {leg['start_address']} → {leg['end_address']}")
    print(f"Distance: {leg['distance']['text']}, Duration: {leg['duration']['text']}\n")

    for step_num, step in enumerate(leg['steps'], 1):
        # Remove HTML tags from instructions
        instruction = step['html_instructions'].replace('<b>', '').replace('</b>', '').replace('<div>', ' ').replace('</div>', '')
        print(f"  {step_num}. {instruction} ({step['distance']['text']})")
    print()
```
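The chained `.replace()` calls above only strip the tags shown; if you'd rather remove any HTML tag the API emits, a small regex helper works as a drop-in substitute (a sketch, not part of the original workflow):

```python
import re

def strip_tags(html_instructions):
    """Remove all HTML tags and collapse leftover whitespace."""
    text = re.sub(r'<[^>]+>', ' ', html_instructions)
    return re.sub(r'\s+', ' ', text).strip()
```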
### Workflow 5: Enrich a Dataset with Place Data

**Problem:** You have venue names but need addresses, hours, and ratings.

**Solution:**

```python
import csv

# Read venue names
with open('venue_names.csv', 'r') as f:
    reader = csv.DictReader(f)
    venues = list(reader)

# Enrich each venue
for venue in venues:
    # Search for place
    search = maps_search_places(f"{venue['name']} Tijuana Mexico")

    if len(search['places']) > 0:
        place = search['places'][0]

        # Get detailed info
        details = maps_place_details(place['place_id'])

        # Add to venue data
        venue['address'] = place['formatted_address']
        venue['lat'] = place['location']['lat']
        venue['lng'] = place['location']['lng']
        venue['rating'] = place.get('rating')
        venue['phone'] = details.get('formatted_phone_number', '')
        venue['website'] = details.get('website', '')
        venue['hours'] = '; '.join(details.get('opening_hours', {}).get('weekday_text', []))

        print(f"✓ {venue['name']} - Found and enriched")
    else:
        # Fill the enrichment columns so every row has the same keys for DictWriter
        for key in ('address', 'lat', 'lng', 'rating', 'phone', 'website', 'hours'):
            venue[key] = ''
        print(f"✗ {venue['name']} - Not found")

# Save enriched data
with open('venues_enriched.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=venues[0].keys())
    writer.writeheader()
    writer.writerows(venues)
```
### Workflow 6: Elevation Profile for Infrastructure

**Problem:** Verify sewer-line elevations (gravity flow should run downhill).

**Solution:**

```python
# Sewer line waypoints (upstream to downstream)
line_points = [
    {"lat": 32.4700, "lng": -116.8600, "name": "Upstream"},
    {"lat": 32.4680, "lng": -116.8620, "name": "Midpoint 1"},
    {"lat": 32.4660, "lng": -116.8640, "name": "Midpoint 2"},
    {"lat": 32.4640, "lng": -116.8660, "name": "Midpoint 3"},
    {"lat": 32.4620, "lng": -116.8680, "name": "Downstream"}
]

# Get elevation for each point
elevations = maps_elevation([
    {"latitude": p["lat"], "longitude": p["lng"]}
    for p in line_points
])

# Check gravity flow
print("Sewer line elevation profile:\n")
for i, result in enumerate(elevations['results']):
    point = line_points[i]
    elev = result['elevation']
    print(f"{point['name']}: {elev:.1f} m")

    if i > 0:
        prev_elev = elevations['results'][i-1]['elevation']
        drop = prev_elev - elev
        if drop > 0:
            print(f"  ↓ Drop: {drop:.1f} m (gravity OK)")
        else:
            print(f"  ⚠ RISE: {abs(drop):.1f} m (PUMPING REQUIRED)")
    print()

# Total elevation change
total_drop = elevations['results'][0]['elevation'] - elevations['results'][-1]['elevation']
print(f"Total elevation change: {total_drop:.1f} m")
# There is one fewer segment than points, so divide by (n - 1)
print(f"Average slope: {total_drop / (len(line_points) - 1):.2f} m per segment")
```
## Integration with Other Tools

### Export to GeoJSON for Maps Datasets API

```python
import json

# Use MCP tools to gather data
venues = []
for query in ["music venues Tijuana", "recording studios Tijuana"]:
    results = maps_search_places(query)
    venues.extend(results['places'])

# Convert to GeoJSON
features = []
for venue in venues:
    feature = {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            "coordinates": [venue['location']['lng'], venue['location']['lat']]
        },
        "properties": {
            "name": venue['name'],
            "address": venue['formatted_address'],
            "rating": venue.get('rating'),
            "place_id": venue['place_id']
        }
    }
    features.append(feature)

geojson = {
    "type": "FeatureCollection",
    "features": features
}

with open('venues.geojson', 'w') as f:
    json.dump(geojson, f, indent=2)

print(f"Exported {len(features)} venues to venues.geojson")
```
### Automate Daily Updates

```bash
#!/bin/bash
# daily_update.sh - Refresh venue database

# 1. Search for new venues
python3 scripts/search_venues.py > venues_raw.json

# 2. Geocode any missing coordinates
python3 scripts/geocode_addresses.py venues_raw.json venues_geocoded.json

# 3. Calculate distances to key locations
python3 scripts/calculate_distances.py venues_geocoded.json venues_final.json

# 4. Upload to Maps Datasets API
gsutil cp venues_final.json gs://your-bucket/
curl -X POST "[Datasets API import endpoint]" ...

# 5. Notify team
echo "Venue database updated: $(date)" | mail -s "Maps Update" [email protected]
```
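The helper scripts referenced above aren't shown in this guide; as an illustration only, `scripts/search_venues.py` could be a thin wrapper like this sketch (the query list and output shape are assumptions):

```python
#!/usr/bin/env python3
# Hypothetical sketch of scripts/search_venues.py:
# run a few place searches via the MCP tool and print raw JSON to stdout.
import json
import sys

QUERIES = [
    "music venues Tijuana Mexico",
    "recording studios Tijuana Mexico",
]

places = []
for query in QUERIES:
    results = maps_search_places(query)  # MCP tool call, as in the workflows above
    places.extend(results['places'])

json.dump(places, sys.stdout, indent=2)
```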
### Combine with n8n Workflows

```json
{
  "nodes": [
    {
      "name": "Trigger Daily",
      "type": "n8n-nodes-base.cron",
      "parameters": {"cronExpression": "0 6 * * *"}
    },
    {
      "name": "Get New Venues",
      "type": "n8n-nodes-base.executeCommand",
      "parameters": {
        "command": "python3 /scripts/search_venues.py"
      }
    },
    {
      "name": "Geocode Addresses",
      "type": "n8n-nodes-base.function",
      "parameters": {
        "functionCode": "// Call maps_geocode via MCP for each venue"
      }
    },
    {
      "name": "Update Database",
      "type": "n8n-nodes-base.postgres"
    }
  ]
}
```
## Best Practices

### Rate Limiting

```python
import time

def geocode_with_rate_limit(addresses, delay=0.1):
    """Geocode with rate limiting to avoid quota issues"""
    results = []
    for i, address in enumerate(addresses):
        try:
            result = maps_geocode(address)
            results.append(result)
            print(f"[{i+1}/{len(addresses)}] ✓ {address}")
        except Exception as e:
            print(f"[{i+1}/{len(addresses)}] ✗ {address}: {e}")
            results.append(None)

        # Rate limit
        time.sleep(delay)

    return results
```
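A fixed delay prevents most quota errors; for the ones that still slip through, retrying with exponential backoff is a common companion pattern. A sketch (what counts as a retryable error depends on how your MCP client surfaces quota failures):

```python
import time

def geocode_with_backoff(address, max_retries=4, base_delay=1.0):
    """Retry a geocode call, doubling the wait after each failure."""
    for attempt in range(max_retries):
        try:
            return maps_geocode(address)
        except Exception as e:
            if attempt == max_retries - 1:
                raise  # out of retries - let the caller handle it
            wait = base_delay * (2 ** attempt)
            print(f"Retry {attempt + 1} for '{address}' in {wait:.0f}s: {e}")
            time.sleep(wait)
```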
### Error Handling

```python
def safe_geocode(address, default_coords=None):
    """Geocode with fallback"""
    try:
        result = maps_geocode(address)
        return {
            'lat': result['location']['lat'],
            'lng': result['location']['lng'],
            'success': True
        }
    except Exception as e:
        print(f"Geocoding failed for '{address}': {e}")
        if default_coords:
            return {'lat': default_coords[0], 'lng': default_coords[1], 'success': False}
        else:
            return {'lat': None, 'lng': None, 'success': False}
```
### Caching Results

```python
import json
import os

CACHE_FILE = 'geocode_cache.json'

def load_cache():
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, 'r') as f:
            return json.load(f)
    return {}

def save_cache(cache):
    with open(CACHE_FILE, 'w') as f:
        json.dump(cache, f)

def geocode_cached(address):
    cache = load_cache()

    if address in cache:
        print(f"Cache hit: {address}")
        return cache[address]

    result = maps_geocode(address)
    cache[address] = result
    save_cache(cache)

    return result
```
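Usage is a drop-in replacement for `maps_geocode` in the batch loops above; repeated runs only spend quota on addresses that aren't cached yet:

```python
addresses = ["Av. Revolución 1006, Tijuana", "Blvd. Olivos Nte. 38, Tijuana"]
results = [geocode_cached(a) for a in addresses]
```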
## Troubleshooting

**MCP connection fails:**

- Check that Docker is running: `docker ps`
- Verify the API key is set correctly
- Test the API key directly:
  `curl "https://maps.googleapis.com/maps/api/geocode/json?address=Tijuana&key=YOUR_KEY"`
**Quota exceeded:**
- Check usage in Google Cloud Console
- Implement rate limiting (see above)
- Consider caching results
**Coordinates are wrong:**

- Some addresses return only approximate locations
- Use `location_type` in the response to check accuracy (see the sketch below):
  - `ROOFTOP` - Precise address
  - `RANGE_INTERPOLATED` - Approximate, interpolated along the street
  - `GEOMETRIC_CENTER` - Center of an area, not a specific address
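A minimal sketch of that check, assuming the `maps_geocode` result exposes a `location_type` field the way the underlying Geocoding API response does:

```python
# Hypothetical helper - the 'location_type' field name is an assumption
# about the MCP tool's response shape; adjust to what your server returns.
PRECISION = {"ROOFTOP": 0, "RANGE_INTERPOLATED": 1, "GEOMETRIC_CENTER": 2, "APPROXIMATE": 3}

def flag_imprecise(address, worst_acceptable="RANGE_INTERPOLATED"):
    """Geocode an address and warn when the match is less precise than desired."""
    result = maps_geocode(address)
    loc_type = result.get('location_type', 'APPROXIMATE')
    if PRECISION.get(loc_type, 3) > PRECISION[worst_acceptable]:
        print(f"⚠ '{address}' geocoded as {loc_type} - verify manually")
    return result
```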
**No results for search:**

- Try a broader query ("Tijuana venues" vs. "Tijuana jazz clubs") - see the sketch after this list
- Include the city and country in the query
- Check spelling and language
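A sketch of the broaden-the-query idea, using the same bare `maps_search_places` call style as the workflows above:

```python
def search_with_fallback(queries):
    """Try queries from most to least specific; return the first non-empty result."""
    for query in queries:
        results = maps_search_places(query)
        if results['places']:
            return results
    return None

venues = search_with_fallback([
    "jazz clubs Tijuana Mexico",   # specific
    "live music Tijuana Mexico",   # broader
    "Tijuana venues"               # broadest
])
```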
## Related Workflows
- Building an Audio Soundwalk Map - Use MCP tools for data prep
- Mapping Water Infrastructure with Custom Data - Backend processing
## Resources
- MCP Google Maps Server
- Docker MCP Gateway Documentation
- Google Maps Platform API documentation (for understanding responses)