Raindrop: Can You See Behind the Scenes?

We recently launched the Raindrop web application as part of FLOW: Can You See the River, a project conceived by Mary Miss. Our team started on the project about a year ago, when Mary and her studio began meeting with us and scientists from Butler University and Williams Creek Consulting to build an app illustrating the concept that “All property is riverfront property.” When Mary and I began discussing the project, we talked about the challenge of catching a person’s attention and then engaging them with a visual experience that could lead them to deeper levels of information and insight about the natural world. This is essentially what a good visualization does, so I was excited to be part of the team building this technological bridge between art and science.

Let’s begin with a tour of the functionality. When you start the app, it loads some resources while displaying the title screen, and then you have the chance to view an introduction or skip ahead to the map screen.

Because the project centers on the White River near Indianapolis, we assembled hydrological data only for the area around Marion County (I’ll go into more detail later). On the map screen, a prompt appears to inform you that tapping on the map will simulate a rain event. When the map is tapped, the app displays the series of streams, storm drains, and/or sanitary lines that would carry a raindrop from that location to the White River. It also displays the area (known as a drainage basin or watershed) from which other raindrops would follow the same path. Another prompt then appears to let you know that tapping on the raincloud icon allows for selection of storm intensity. As little as a quarter inch of rain can cause sewers in this area to overflow into streams, so when this option is selected, the displayed path changes to reflect that — and you can see where you don’t want to go fishing.

You can also toggle the display of the 100-year floodplain, which shows you where you can keep your feet dry during a big flood event. In addition to selecting a location on the map, pressing the compass icon locates your device via GPS, and typing in the address bar uses the Google Maps address look-up feature. Tapping on a question mark icon provides information about pollutants reported along the upstream path, as documented by the Indiana Department of Environmental Management.

Pressing the “i” icon at the top opens the informational menu. From here, you can learn more about the app, check current weather alerts and conditions, find out how weather differs from climate, get some tips on how to improve water quality, and visit the project website.

Now we can get into some behind-the-scenes stuff. We wanted to reach a broad audience with Raindrop, so we decided to put the time we had into developing a cross-platform mobile web application — a web app, in contrast to a native app. If we only had an iPhone native app in the App Store, people using Android phones wouldn’t be able to use it, and vice-versa. To handle cross-platform compatibility, we decided to build Raindrop using a framework called jQuery Mobile, which was in a very early stage when we started. It hasn’t quite had an official release yet (it’s in its third beta release at the moment), but it has become increasingly robust with each version.

As for the map, you might wonder how we figured out the path that raindrops take to get to the river. Our collaborator at Williams Creek combined information based on digital elevation models, which can be used to derive the boundaries of natural watersheds, with data from the city that indicates where all of the storm drains and sanitary systems are and which areas drain into them.
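The app itself just consumes the precomputed results, but the core idea behind deriving flow paths from a digital elevation model can be sketched as a steepest-descent walk across a grid of elevations: a raindrop repeatedly moves to its lowest neighboring cell until it can descend no further. The tiny elevation grid below is invented for illustration — real DEM processing (as in our collaborator’s work) also has to handle pits, flat areas, and the engineered drainage network, at a far larger scale.

```python
# Toy illustration of tracing a raindrop's path on a digital elevation
# model: from any cell, repeatedly step to the lowest neighboring cell
# (steepest descent, a simplified "D8"-style rule). The elevations are
# made up; a watershed is then just the set of cells whose paths end at
# the same outlet.

def trace_path(dem, start):
    """Follow steepest descent from `start` until no lower neighbor exists."""
    rows, cols = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        lowest = (r, c)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    if dem[nr][nc] < dem[lowest[0]][lowest[1]]:
                        lowest = (nr, nc)
        if lowest == (r, c):       # local minimum: the outlet cell
            return path
        r, c = lowest
        path.append((r, c))

dem = [
    [9, 8, 7],
    [8, 5, 4],
    [7, 4, 1],   # the bottom-right corner is the outlet
]
print(trace_path(dem, (0, 0)))  # -> [(0, 0), (1, 1), (2, 2)]
```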

We then wrote Python scripts to read the scientific data and generate KML geometry files and look-up tables. The application uses a spatial grid look-up to figure out which basin is tapped (so it’s not perfectly accurate, but not too slow either), and then loads the appropriate file with the graphics to display for the path and the basin. It also reads information from another table that has all the details about pollutants.
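To give a feel for the grid look-up, here is a minimal sketch of the general technique: precompute which basins overlap each cell of a coarse grid, then at tap time run a precise point-in-polygon test against only those few candidates instead of every basin. The basin shapes, names, and cell size below are made up for illustration — the real app indexes the Marion County drainage basins.

```python
# Spatial grid look-up sketch: coarse grid index + ray-casting
# point-in-polygon test. All geometry here is illustrative.

def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def build_grid(basins, cell_size):
    """Map each grid cell to the basins whose bounding box overlaps it."""
    grid = {}
    for name, poly in basins.items():
        xs = [p[0] for p in poly]
        ys = [p[1] for p in poly]
        for gx in range(int(min(xs) // cell_size), int(max(xs) // cell_size) + 1):
            for gy in range(int(min(ys) // cell_size), int(max(ys) // cell_size) + 1):
                grid.setdefault((gx, gy), []).append(name)
    return grid

def lookup(x, y, grid, basins, cell_size):
    """Find the basin containing a tap, testing only that cell's candidates."""
    cell = (int(x // cell_size), int(y // cell_size))
    for name in grid.get(cell, []):
        if point_in_polygon(x, y, basins[name]):
            return name
    return None

# Two toy square "basins" indexed on a 1-unit grid
basins = {
    "basin_a": [(0, 0), (2, 0), (2, 2), (0, 2)],
    "basin_b": [(2, 0), (4, 0), (4, 2), (2, 2)],
}
grid = build_grid(basins, cell_size=1.0)
print(lookup(0.5, 0.5, grid, basins, 1.0))  # -> basin_a
print(lookup(3.0, 1.0, grid, basins, 1.0))  # -> basin_b
```

The grid trades a little precision at cell boundaries for speed — exactly the “not perfectly accurate, but not too slow either” balance mentioned above.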

Along the way, we’ve combined this technology with graphical elements and design guidance provided by Mary’s team, and scientific guidance and content from Butler. The multidisciplinary process has really embodied the nature of Mary’s City as a Living Laboratory concept. And just as the aim is to lead curious folks from Mary’s eye-catching mirrors and markers along the river to the website and the web app to learn more, hopefully those who discover the project online will follow the raindrop and find their way down to experience the river as well.

Filed under: Technology
