Sounds and Places: Experiencing Bluebrain’s ‘Central Park’

Bluebrain is not a startup. There's no technical co-founder, no venture capital dollars. In fact, they're not even a company. Bluebrain is a Washington, DC-based band, and they've created one of the most compelling location-based apps of the past twelve months. The app is a location-based album called Central Park (Listen to the Light), and it accomplishes what very few apps to date have managed: it actually augments reality vis-à-vis location. More than that, though, I'd like to suggest that Central Park (Listen to the Light) is a model for what the next generation of location-based apps should look like.

First, a couple of words about the project. Bluebrain is a two-person ensemble made up of brothers Ryan and Hays Holladay. Central Park (Listen to the Light) is their second location-based album, following the release of National Mall in May 2011. The concept behind the album is simple: it's an app that users download to their iPhone, designed to be listened to within a specific location — in this case, New York's Central Park. As the user moves through the park, the app "tracks their location via the iPhone's built-in GPS capabilities; the melody and rhythm of the music varies in accordance with the user's path." Central Park (Listen to the Light), in other words, is a soundtrack for your experience in Central Park. And it is, uniquely, your soundtrack — because although the music is contextually tied to your location in the park, the progression and sequencing of that music is determined by factors like the route and pace that led you to that spot.
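Bluebrain hasn't published how the app works under the hood, but one plausible way to tie music to a path is to define overlapping "audio zones" in the park, each carrying a musical layer, and raise a layer's volume as the listener nears that zone's center. The sketch below is purely illustrative: the zone names, coordinates, and radii are made up, and the real app may work quite differently.

```python
import math
from dataclasses import dataclass

@dataclass
class AudioZone:
    name: str        # hypothetical label for the musical layer in this area
    lat: float
    lon: float
    radius_m: float  # zone radius in meters

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance between two points (haversine)."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_zones(lat, lon, zones):
    """Return (zone name, volume) pairs for every zone containing the point.

    Volume fades linearly from 1.0 at the zone center to 0.0 at its edge,
    so walking between zones crossfades the layers.
    """
    result = []
    for z in zones:
        d = distance_m(lat, lon, z.lat, z.lon)
        if d < z.radius_m:
            result.append((z.name, 1.0 - d / z.radius_m))
    return result

# Invented example zones, roughly inside Central Park:
zones = [
    AudioZone("bethesda_theme", 40.7740, -73.9708, 250),
    AudioZone("ramble_theme",   40.7786, -73.9697, 200),
]
```

A listener standing a few meters from the first zone's center would hear `bethesda_theme` at nearly full volume and nothing from the second zone, since they're outside its radius.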

Central Park builds on the rich layer of metadata surrounding your location (route, speed, topography) to craft a wholly unique experience. But consider the effect: it’s a soundtrack for your exploration of the park. The app itself fades into the background of that exploration, augmenting it and making it new, but never becoming the focal point of your experience. “It’s a choose your own adventure album,” and the adventure is layered onto your reality. It might not qualify as capital-A Augmented Reality, but there’s no doubt that Central Park is a far more powerful user experience than most other Augmented Reality apps on the market. This, I want to suggest, is the real potential of “location-based apps” and “augmented reality” – the potential to layer brand new experiences on top of the rich data that exists within everything that we call “location.”
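That metadata layer doesn't require anything exotic: most of it can be derived from the stream of GPS fixes the phone already produces. As a hedged illustration (not Bluebrain's actual code), here is how raw fixes could be turned into total distance and average pace, values an app could then map onto tempo or layer intensity:

```python
import math

def _dist_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation: accurate enough at park scale.
    r = 6_371_000  # Earth radius in meters
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)

def path_metadata(fixes):
    """fixes: list of (timestamp_seconds, lat, lon), in time order.

    Returns (total_distance_m, average_speed_m_per_s) over the path,
    summing the leg between each consecutive pair of fixes.
    """
    dist = sum(
        _dist_m(a[1], a[2], b[1], b[2]) for a, b in zip(fixes, fixes[1:])
    )
    elapsed = fixes[-1][0] - fixes[0][0]
    return dist, (dist / elapsed if elapsed > 0 else 0.0)

# Two fixes a minute apart, walking north at a typical stroll (~1.4 m/s):
fixes = [(0, 40.7712, -73.9742), (60, 40.77195, -73.9742)]
dist, speed = path_metadata(fixes)
```

A brisk jog and a slow wander through the same spot would yield the same coordinates but very different speed values, which is exactly the kind of signal that lets the soundtrack respond to how you move, not just where you are.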

Compare Central Park’s use case to the broad category of apps we refer to as ‘location-based’: restaurant review apps, social networking apps, daily deals apps, and so on. By and large, these products understand location as a simple reference point (literally) of latitude and longitude. In these cases, location is something to be catalogued, not something to be interacted with. Rather than fading into the background, the app becomes the centerpiece of the experience, the user dutifully picking away at tiny keys to check in, update a status, or search. Rather than creating experience, the app is merely representing it. Central Park, on the other hand, understands that the app doesn’t need to be the center of the user’s attention to create a magical experience; it can instead use a variety of data about your location to embed itself into the experience itself.

Ultimately, I think, what we’re talking about are apps that implicitly use all of the rich metadata of location (route, environment, topography, proximity to other people, places, and things) to create magical, serendipitous experiences in day-to-day life. We’re talking about things like autonomous search. Serendipity engines. That kind of thing. Now, this isn’t to suggest that there aren’t beautiful technologies, well-designed apps, and very profitable business models being built on top of the location-as-reference-point model. But personally? I would love to see what’s possible when we consider all of the ways that location creates experience, when we think in terms of creating experience through location and not merely representing it: a next generation of location-based apps that draws on all of the rich data of location, rather than just a spot on a map, to create magical new experiences and to tell new stories.

Michael Fives is the organizer of the Location Based Apps meetup in NYC, a group of developers and entrepreneurs that meets monthly to talk all things location. Currently, Michael is a product manager at Meetup, focusing on mobile. He develops for web (front-end) and mobile (iOS) in his spare time.