A River of Data Runs Through Vancouver’s Aquatic Informatics
Feeling "up to my neck in it" is commonplace for many CEOs. For Aquatic Informatics' CEO Ed Quilty, the story starts with water up to his hips (literally).
He was originally a river ecologist, or as he put it, "not a software guy, not a business guy, but a guy wearing rubber boots, working in rivers, scraping slime off rocks and collecting bugs and water samples." Since then he's led Aquatic Informatics for the past 11 years, growing it to the point where thousands of scientists and over 400 agencies in 28 countries are using the software.
The software addresses critical water data management and analysis challenges for the environmental monitoring industry. The company works with a variety of customers, including federal, state/provincial, and local government departments, hydropower operators, mining companies, academic groups, and consulting organizations, all of whom collect, manage, and process large volumes of water quality or quantity data. Located in Vancouver, they've also recently been named a BC company that's "Ready to Rocket."
Quilty traces the genesis of the company back to the early '90s, while on a UVic co-op work term for the BC Ministry of the Environment. "They had this cutting-edge sensor for continuous water quality measurement. This was a big leap forward from putting samples in a bottle and shipping them off to a lab for analysis. The lab could tell you the amount of nutrients, organics, or heavy metals, but this was only a small snapshot, and the odds of missing some major event were always significant. With this sensor technology we were getting information every minute or fifteen minutes. It was a big change in the industry. We went from never having enough data to being overwhelmed with data."
On a Forest Renewal BC project focused on water quality inventory, he saw the value of sensing technology and data collection firsthand, along with the impact of the forestry industry on water.
With an overwhelming volume of data, Quilty realized that spreadsheets weren't cutting it. "That's when we started working on scripts to automate the data processing. It was trying to figure out how we'd manage all this data that really got things started. I was a biologist used to dealing with 30 or 40 samples a year, not per hour."
They built a very lightweight version of the initial application, one good enough to sell into Quilty's professional network in BC. The real break came when they caught the attention of the US Geological Survey.
At that time the product wouldn't scale to meet the needs of the largest water monitoring agency in the world, but they were invited to respond to an RFP to model water flow in rivers. Quilty said, "we bid, not thinking we'd win it, but to get the exposure and learning experience. Much to our surprise we won the $500,000 contract. We had six months to build it, and were only going to get paid at the end if we successfully delivered. Our chief scientist knocked it out of the park."
In 2006 all of the USGS hydrologists and flow specialists across the US were introduced to Aquarius.
The Aquatic Informatics application allows for the conversion of water level to water flow. This information is important for water allocations for irrigation, dam operations, drinking water, industry, and fisheries. "You have all of these competing interests, farmers, fishermen, environmentalists, cities, fighting over water, particularly in places like the Colorado River and other arid regions, and then you have issues around flooding," Quilty told BetaKit. "It's kind of like Goldilocks trying to find just the right balance."
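Converting water level (stage) to flow (discharge) is conventionally done with a rating curve fitted to field measurements. The article doesn't detail the models Aquarius uses, but a minimal sketch of the standard power-law form, with purely illustrative coefficients, looks like this:

```python
def discharge_from_stage(stage_m, c=12.0, h0=0.3, b=1.8):
    """Convert water level (stage, in metres) to flow (m^3/s) using
    a power-law rating curve: Q = c * (stage - h0) ** b.

    c, h0 (stage of zero flow), and b are illustrative values here;
    in practice they are fitted per gauge from paired stage and
    discharge measurements.
    """
    if stage_m <= h0:
        return 0.0  # at or below the point of zero flow
    return c * (stage_m - h0) ** b

# A continuous stage sensor reading of 1.5 m becomes a flow estimate:
flow = discharge_from_stage(1.5)
```

This is what lets a minute-by-minute level sensor stand in for costly direct flow measurements: measure stage continuously, fit the curve occasionally, and derive discharge everywhere in between.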
Think about the considerations of dam operators, where an inch of water can mean millions of dollars. “They’re constantly trying to do optimization of their reservoirs,” Quilty said. “They want to keep them as high as they can, but they have legal obligations to release water to people below, like fishermen and farmers. At the same time they’re trying to balance Mother Nature, throwing different scenarios at them like flood or drought conditions.”
Without data, without good science and quality information, good policy decisions can't be made. For Aquatic Informatics, it's about breaking down information silos; they're focused on facilitating better communication and creating more efficient networks. Quilty said that "often you see organizations collecting information from the same watersheds and not knowing it. You end up with dense networks in some areas, and very sparse ones in others."
The value of sharing this data easily is at the core of good environmental management. And without the data there’s no chance of managing our resources well. The opportunity to have a positive global impact is massive: imagine helping prevent resource abuses, like giant lakes and aquifers being drained, as highlighted in this New York Times article about Iran’s Lake Urmia.
Quilty isn't reserved in sharing the company's big, hairy, audacious goal: "it's about hosting and managing all of the planet's environmental data. We'll be bringing discrete water quality and groundwater information into the system, and moving into atmospheric, ocean, and soil data too. It's critical to be getting the right data from the right places, at the right time. I think about all of these sensors like EKGs measuring the heartbeat of the whole planet, and we want to be part of optimizing it."