Event Map using Google API

Rob shares all the tech you'll need to pull data from Google Drive and use it to drive content on your website.

With the launch of our new website, we moved almost everything to statically built pages. This included the event map in the footer, which was maintained in a YAML file that was assembled as part of our deploy process. This worked great for building the site, but the content management was a mess. We had one person updating a Google travel spreadsheet and a different person trying to figure out what changes were made in order to update the events content.

We wanted a better source for managing the event schedule than the build process for the rest of the site. Our first thought was to use Lanyrd.

We implemented a version using Lanyrd, but then realized that being restricted to Lanyrd's data didn't give us the fine-grained details we wanted. Talking this over with Emily and Rob, we decided that since we're already tracking our travel plans in a Google Drive spreadsheet, and there is an API for pulling data out of Drive, this seemed like a good solution. It allowed us the flexibility to define our own data model, to have updates not be tied to deploying the website, and to use the data we're already putting into the spreadsheet. That's what I like to call a win-win-win situation.

List of Technologies

Google Drive Spreadsheet

We already had a basic spreadsheet for tracking events. I only had to make a few changes for it to work for my purposes:

  1. I added a few dropdowns to make entering data more consistent.
  2. I used Google Spreadsheets' geocoding to automatically get the coordinates for each event to plot on the map.

    To make this work, I added the following script to my spreadsheet:

    // Geocode an address string and return "lat, lng" for the map.
    function geoCode(addr) {
      var gc = Maps.newGeocoder();
      var geoJSON = gc.geocode(addr);
      var lat = geoJSON.results[0].geometry.location.lat;
      var lng = geoJSON.results[0].geometry.location.lng;
      return lat + ", " + lng;
    }

    Then I just added a field that concatenated the address parts. Here is a sample:

    Geocoding example

  3. I made the Spreadsheet public so that the API can access it.
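The concatenation in step 2 is just a spreadsheet formula feeding the custom function. As a rough sketch (the column letters are assumptions, not our actual layout):

```
=geoCode(CONCATENATE(B2, ", ", C2, ", ", D2))
```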


I didn't want to mess with making a bunch of HTTP requests and deciphering responses from the Google API, so Jeremy turned me on to Tabletop.js, which made digesting the API really simple.

table = Tabletop.init
  key: "############################################"
  callback: (data) =>
    # do things


Exposing our API key and making a fresh call to the API for every request didn't seem like a good idea, so the pulling of data from Google Drive happens in a separate process. A simple Node script gets the data from the API, transforms it to the format the website expects for plotting the markers on the map, and then saves that data to a JSON file that the website loads when it needs the map.
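A minimal sketch of that transform step (the field names `event`, `city`, and `latlng` are assumptions about our spreadsheet columns, and the file name is illustrative):

```javascript
// Transform rows from the spreadsheet API into the marker format
// the website expects. The geocoded "lat, lng" string is split
// into numeric coordinates.
function rowsToMarkers(rows) {
  return rows.map(function (row) {
    var parts = row.latlng.split(",").map(function (s) {
      return parseFloat(s.trim());
    });
    return {
      title: row.event,
      city: row.city,
      lat: parts[0],
      lng: parts[1]
    };
  });
}

// The script would then write the result out for the site to load:
// var fs = require("fs");
// fs.writeFileSync("map-data.json", JSON.stringify(rowsToMarkers(rows)));
```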

UPDATE: After putting this in place, we found that occasionally the map markers would not show up on the website. Looking into it, I found that the latitude and longitude were sometimes coming back as [null, null]: a race condition between the geocoding and pulling the spreadsheet data down through the API. Now our Node script throws an error when the latitude or longitude is invalid. This way we keep the existing file and don't overwrite it with bad data, and we simply try again the next time the CRON job runs.
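A sketch of that guard (the marker shape here is an assumption about what our script produces, not the actual code):

```javascript
// Validate geocoded coordinates before overwriting the JSON file.
// Throwing aborts the script, so the previous good map-data.json
// stays in place until the next CRON run.
function assertValidMarkers(markers) {
  markers.forEach(function (m) {
    if (typeof m.lat !== "number" || isNaN(m.lat) ||
        typeof m.lng !== "number" || isNaN(m.lng)) {
      throw new Error("Invalid coordinates for event: " + m.title);
    }
  });
  return markers;
}
```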

Whenever Gem

Ryan Cromwell turned me on to the Whenever Gem for another project, and it fit here perfectly. It gives me a simple Ruby file to declare how often I want to run a job, and then it adds that to the crontab. This way, I can update the CRON job that runs the node script on deploys. It also handles making sure that the path is right for the environment that the website is running in (dev, prod, etc.).

set :output, "/log/cron_log.log"

every 1.hour do
  command "cd #{server_path} && coffee google.coffee map-data.json"
end


Running the Whenever gem updates the crontab with a job that runs at the given interval. So we can make changes to the spreadsheet and shortly after that they'll be live on the website. Since this data doesn't update all that frequently, and it isn't vital, updating once an hour seemed like a good way to be a good citizen of the Google Drive API.
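For reference, the crontab entry Whenever generates from the schedule above looks roughly like this (the deploy path is an assumption):

```
0 * * * * /bin/bash -l -c 'cd /var/www/site/current && coffee google.coffee map-data.json >> /log/cron_log.log 2>&1'
```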


Lastly, there is a bit of CoffeeScript which fires when you click on the "More Footer" button. It grabs the map-data.json file and parses the data into an event list that gets added to the page. It also uses that data to plot the markers onto our Mapbox map.
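The event-list half of that step might look something like this plain-JavaScript sketch (the JSON shape and element id are assumptions; the real code is CoffeeScript and also handles the Mapbox markers):

```javascript
// Turn the marker data from map-data.json into an HTML event list.
function buildEventList(markers) {
  return markers.map(function (m) {
    return "<li>" + m.title + ", " + m.city + "</li>";
  }).join("");
}

// In the click handler for the "More Footer" button, something like:
// fetch("map-data.json")
//   .then(function (res) { return res.json(); })
//   .then(function (markers) {
//     document.getElementById("event-list").innerHTML = buildEventList(markers);
//   });
```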

That's It

This was a fun little project that allowed us to keep things simple, reduce duplicate content by using a system that we were already using (Google Drive), and keep the website content up to date with minimal effort.