Modernizing operations at the Water Survey of Canada: From field technologies to data production
André Bouchard, David Hutchinson, Brian Pessah, Jeff Woodward, Paul Campbell
A/Manager, Hydrologie et Écohydraulique, Service Météorologique du Canada
CWRA conference, Hamilton (ON), June 2014
Summary • The business of the Water Survey of Canada (WSC) • Where we were and where we are • Field technologies • Office technologies (data production and delivery) • Looking to the future • In the field • In the office
About the Water Survey of Canada • Started in 1908, with a federal government allocation of $10,000 • “The first appropriation made by Parliament for hydrographic work was in 1908… as this vote was not available until the season was too far advanced, only a part of it was used in purchasing equipment in 1909” • “In organizing the Hydrographic Surveys, it was realized that, with the funds available, it would be impossible to make complete investigations of the whole water supply” • First published data: “Report of Progress of Streamflow for Calendar Year, 1909” by P.M. Sauder, Chief Hydrographer, who had spent two years in Montana with the USGS
About the WSC • Currently, WSC is the major operator in a national network of some 2,650 stations, operated through cost-sharing agreements between the federal government and the 10 provinces and 3 territories • WSC has 28 offices nationally, ranging from one-person offices to offices with 25+ staff • A total of 220 staff, including technologists, engineers, scientists, and management
Where we were: field technologies • For almost all of the past 100 years, our primary velocity-measurement instruments have been mechanical current meters – the Price AA and the Pygmy meter – and all of our performance standards, procedures, and methodologies were built around that technology, its known limitations, and its safety hazards. • Related processes and tools were essentially manual and paper-based. • At the station level, technologies such as paper-based chart recorders were common, we made regular use of “observers”, data transmission was not automated, and operation of the network was centered around an annual publication process.
History of hydroacoustics at WSC • In the 1980s: AFFRA systems • In the late 1980s/early 1990s, WSC joined the USGS in investigating the use of acoustic Doppler technology in riverine environments • Early results showed great potential • Portable, usable in many streams, drastically reduced the time to obtain a measurement, could reduce the operational costs of a measurement, and could mitigate health and safety issues • But the instruments were physically large and expensive • WSC bought its first Broadband ADCPs in 1994, for ~$80K each • Early uses included: • Special surveys of the St. Lawrence River to calibrate hydrodynamic models, and to characterize the tidal cycle at Quebec City through measurements made every ½ hour, which could not have been done with traditional technology • Flow patterns and bathymetric surveys in northern rivers • Flow patterns past a beach in a river bay to determine the daily health of the beach
Moved from point measurements to… • Price AA: 20 panels and 1 or 2 velocity measurements per panel (the traditional mid-section computation is sketched below) • With ADCPs, given the right conditions…
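To make the contrast concrete, here is a minimal sketch of the classic mid-section velocity-area computation behind a Price AA gauging: the cross-section is split into panels (about 20 in practice), each panel gets one point velocity (0.6-depth method) or two (0.2/0.8-depth average), and discharge is the sum of panel area times panel velocity. This is an illustrative sketch, not WSC production code, and all station distances, depths, and velocities below are invented.

```python
# Mid-section (velocity-area) discharge sketch for a mechanical-meter gauging.
# Illustrative only: verticals, depths, and velocities are made-up numbers.

def panel_velocity(point_velocities):
    """Mean velocity for a vertical: one value = 0.6-depth method,
    two values = average of 0.2- and 0.8-depth readings."""
    return sum(point_velocities) / len(point_velocities)

def midsection_discharge(verticals):
    """verticals: list of (station_m, depth_m, [point velocities in m/s]),
    ordered across the section, including zero-depth water's-edge entries."""
    q_total = 0.0
    for i in range(1, len(verticals) - 1):
        x_prev, _, _ = verticals[i - 1]
        x, depth, vels = verticals[i]
        x_next, _, _ = verticals[i + 1]
        width = (x_next - x_prev) / 2.0          # mid-section panel width
        q_total += width * depth * panel_velocity(vels)
    return q_total

# Example: a small 5-vertical section (a real gauging would use ~20 panels).
section = [
    (0.0, 0.0, [0.0]),            # left water's edge
    (2.0, 0.8, [0.45]),           # 0.6-depth only (shallow vertical)
    (4.0, 1.5, [0.62, 0.48]),     # 0.2- and 0.8-depth readings
    (6.0, 0.9, [0.40]),
    (8.0, 0.0, [0.0]),            # right water's edge
]
print(f"Q ≈ {midsection_discharge(section):.2f} m³/s")
```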
Evolution of hydroacoustic use • “Non-Moving Parts Society” (figure)
Where Are We Today? • WSC has accepted for operational use: • ADCPs: TRDI (Rio Grande, RiverRay, StreamPro); SonTek (M9, S5 with interim procedures) • SonTek FlowTrackers • OceanScience tethered and remote-control boats • Hornet remote-control bank-operated cableways • Various GPS systems for positioning • Various ship-to-shore telecommunication systems • Moving-boat and section-by-section software
Future considerations (1) • Enhancement of instrument functionality and application • Shallower streams, wading, moving boat vs. in-situ (ADVM), reduction in size • Modernization of software • Section-by-section, QA/QC of results • Ensure proper data management • Transmission of data from instrument to computer • Determination of the instrument’s position, especially relative to the water’s edge • GPS, range finders • Tethered boats, remote-control boats, remote-control cableways, pitch-and-roll impacts • Providing feedback to manufacturers and improving procedures for auto-adapting ADCPs (RiverRay, M9) • Development and documentation of new standards, techniques, operating procedures, training • Training: instrument operation, acoustic Doppler theory, accreditation • Assess and define operational limitations (velocity, turbulence, suspended sediment)
Future considerations (2) • Moving-bed conditions (a simplified loop-test correction is sketched below) • Deployment platforms • Data integrity between traditional and hydroacoustic technologies • Impact of new technologies on product quality • Improving application under ice • Dealing with sediment transport • Integrating uncertainty into our vocabulary • Use of flotillas • Analysis of existing measurements • Great interest internationally (WMO CHy) • Optimization of configuration as a function of stream settings • Looking for technological improvements, but concern over the instruments becoming a “black box” • Re-appropriating the calculation of Q
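As one illustration of the moving-bed item above, the sketch below applies a widely documented loop-test correction: after a closed-loop transect with bottom-track referencing, any apparent upstream distance made good is attributed to a moving bed, and the measured discharge is scaled by the ratio of corrected to measured mean water velocity. This is a simplified sketch under that common formulation, not WSC's operational procedure, and all numbers are illustrative.

```python
# Simplified loop-test moving-bed correction (hedged sketch, not WSC procedure).
# A closed loop run with bottom-track referencing should end where it started;
# apparent upstream displacement is attributed to a moving bed.

def loop_correction(q_measured, mean_water_velocity, upstream_drift_m, loop_duration_s):
    """Return (moving_bed_velocity, corrected_discharge).

    upstream_drift_m : apparent upstream distance made good over the loop (m)
    loop_duration_s  : total duration of the loop (s)
    """
    v_moving_bed = upstream_drift_m / loop_duration_s
    # One common correction form: scale Q by (Vw + Vmb) / Vw
    ratio = (mean_water_velocity + v_moving_bed) / mean_water_velocity
    return v_moving_bed, q_measured * ratio

v_mb, q_corr = loop_correction(
    q_measured=152.0,            # m³/s from the moving-boat measurement (invented)
    mean_water_velocity=1.10,    # m/s mean water speed over the measurement (invented)
    upstream_drift_m=18.0,       # m of apparent upstream bottom-track drift (invented)
    loop_duration_s=600.0,       # 10-minute loop (invented)
)
print(f"moving-bed velocity ≈ {v_mb:.3f} m/s, corrected Q ≈ {q_corr:.1f} m³/s")
```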
Technologies at the station • From chart recorders and manual observations to pressure transducers and shaft encoders with digital loggers and real-time transmission (a simple pressure-to-stage conversion is sketched below) • Field computers to gather field data and to download logger data • Real-time data transmission (landline, GOES, radio) • Use of smartphones to transmit information from the field to the office and to clients • Extending the parameter suite to include water temperature • Assessing requirements (universities involved) • Cameras to assess conditions at sites (ice)
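For the pressure-transducer item above, here is a minimal sketch of how a vented (atmosphere-compensated) pressure reading is typically converted to stage against a station datum. The sensor offset, water density, and pressure value are illustrative assumptions, not WSC station configuration.

```python
# Converting a vented pressure-transducer reading to stage (illustrative sketch).
RHO_WATER = 999.7    # kg/m³ near 10 °C; temperature/sediment effects ignored here
G = 9.80665          # m/s², standard gravity

def stage_from_pressure(pressure_kpa, sensor_offset_m):
    """Stage above station datum from a vented (gauge) pressure sensor.
    sensor_offset_m is the surveyed elevation of the sensor orifice above datum."""
    depth_over_sensor = (pressure_kpa * 1000.0) / (RHO_WATER * G)
    return sensor_offset_m + depth_over_sensor

# Example: 12.35 kPa of water column over a sensor set 0.42 m above datum.
print(f"stage ≈ {stage_from_pressure(12.35, 0.42):.3f} m")
```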
Office technologies • Initially: manual processing of data from charts and observations, and publication in an annual paper report • Then manual digitizing of data from charts and processing on a mainframe with @WSC software • Still an annual publication, first on paper and then in digital form • Then on to full digital processing • New Leaf to acquire data • Compumod for data production • HYDAT and HYDEX for data dissemination • The advent of real-time data • Moving to automated, continuous data publication • Hydrometric Workstation software (Aquarius) – see the rating-curve sketch below
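Whatever the generation of software, the core of discharge data production is applying a stage-discharge rating to the continuous stage record. The sketch below uses the standard single-segment power-law form Q = C(h - h0)^b; the coefficients and stage values are invented, and operational ratings carry multiple segments, shift corrections, and ice corrections not shown here.

```python
# Applying a single-segment power-law rating Q = C * (h - h0)**b to a stage series.
# Coefficients and stage values are illustrative, not from any real station.

def rating_discharge(stage_m, C=24.0, h0=0.15, b=1.6):
    """Discharge (m³/s) from stage (m) for one rating segment."""
    effective_head = stage_m - h0
    if effective_head <= 0.0:
        return 0.0            # at or below the point of zero flow
    return C * effective_head ** b

hourly_stage = [0.92, 0.95, 1.01, 1.10, 1.08]   # m, example stage record
hourly_q = [rating_discharge(h) for h in hourly_stage]
print([f"{q:.1f}" for q in hourly_q])
```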
Our previous system (1995–2011) • Was a strong innovation back in the mid-1990s • By the late 2000s, however: • Architecture: 32 unconnected servers • Aging technology • Database • Interface • Conflicts with other software updates on Windows • Basically adapted to an annual data production process
Started with a concept • A concept paper led the way to a full-blown design session • Followed by an assessment of the environment (build or buy) • Decision to buy, with the possibility to customize • RFP in the late 2000s • National implementation now complete
User perspective: 2 interfaces • Whiteboard – technical interface for analysis and small-scale dev work • Springboard – web-based day-to-day work
Modern System Architecture • From 32 unconnected servers in each office to…
Other Developments • Water Office • HFC – modern software for field computers • Upcoming data mart to modernize the data-dissemination end of things (use of web services) • Moving to standard data exchange formats such as WaterML 2.0 • North America Water Watch • Impact of space-based technologies (SWOT) • Importance of site characterization as related to the application of available technologies • Radar to estimate surface velocities • Automated QA/QC routines for real-time data (a minimal screening sketch follows below) • Management of site-visit information with pictures and video
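For the automated QA/QC item above, here is a minimal, hypothetical screening pass combining a range check and a rate-of-change (spike) check on incoming real-time values. The thresholds and the sample series are invented; in practice such limits would be station-specific configuration.

```python
# Minimal real-time QA/QC screening sketch: range check + rate-of-change check.
# Thresholds are illustrative; real screening is station-specific and configurable.

def screen_realtime(values, min_ok, max_ok, max_step):
    """Flag each reading as 'ok', 'range', or 'spike'.
    values   : time-ordered readings (e.g. stage in m at a fixed interval)
    max_step : largest plausible change between consecutive readings
    """
    flags = []
    previous = None
    for v in values:
        if v < min_ok or v > max_ok:
            flags.append("range")
        elif previous is not None and abs(v - previous) > max_step:
            flags.append("spike")
        else:
            flags.append("ok")
        previous = v
    return flags

stage = [1.02, 1.03, 1.03, 4.87, 1.05, 1.06]     # 4.87 is an obvious sensor glitch
print(screen_realtime(stage, min_ok=0.0, max_ok=3.0, max_step=0.5))
```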
The Future • We’re now able to consider the eventual implementation of some of the original concepts, such as: • Multiple estimators of discharge (a toy combination sketch follows below) • Better connections between monitoring and modelling (a feedback loop between the two) • Better tools to understand what is actually going on at the sites in terms of physics • Dealing with backwater, ice, etc. • Extending the network by adding virtual sites? • To try to tackle this, we’re working to set up a dev team • Define a process to move from dev to ops (+ dev environment) • How to manage time series from external modelling systems • How to manage the models themselves • Develop standards and procedures to ensure data quality
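As a toy illustration of the "multiple estimators of discharge" concept, the sketch below combines independent estimates (for example a rating-curve value, a hydraulic-model value, and an index-velocity value) by inverse-variance weighting. The estimates and uncertainties are invented, and this is just one of many possible combination schemes, not a WSC method.

```python
# Inverse-variance weighting of independent discharge estimates (toy sketch).
# Estimates and standard uncertainties below are invented for illustration.

def combine_estimates(estimates):
    """estimates: list of (discharge_m3s, standard_uncertainty_m3s).
    Returns (combined_discharge, combined_uncertainty)."""
    weights = [1.0 / (u * u) for _, u in estimates]
    q = sum(w * q_i for w, (q_i, _) in zip(weights, estimates)) / sum(weights)
    u = (1.0 / sum(weights)) ** 0.5
    return q, u

candidates = [
    (148.0, 10.0),   # rating-curve estimate (hypothetical)
    (154.0, 12.0),   # hydraulic-model estimate (hypothetical)
    (150.0, 8.0),    # index-velocity estimate (hypothetical)
]
q, u = combine_estimates(candidates)
print(f"combined Q ≈ {q:.1f} ± {u:.1f} m³/s")
```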
An example • Sainte-Anne Channel and Vaudreuil Channel (figure)
The solution • Scripting toolbox • Whiteboard