Tidal Datum Description

Mean Low Tide (MLT)

MLT was established by the United States Army Corps of Engineers (USACE) in the 1930s. The USACE MLT datum exists only on benchmarks the Corps established along the Intracoastal Waterway at that time. Connecting these benchmarks to TCOON stations would be expensive and not particularly useful. The USACE Galveston District now uses TCOON's Mean Low Water (MLW) values as computed by NOAA's National Ocean Service (NOS).

For More Information

Navigating the DNR Tidal Datum page

Tidal Datum Schematic

Station Datum

All water elevations at a station are measured relative to the Station Datum, an arbitrary zero point set low enough that the water level will never fall below it; as a result, all water-level observations are positive numbers. Each station has its own unique Station Datum because of the physical conditions at the site.

A series of deep-driven-rod bench marks exists to maintain the station's zero point over time. If any of the benchmarks or the tide gauge itself were to move, the water-level measurements associated with that benchmark or tide gauge could be adjusted to coincide with the readings of the others that did not move. In the figure below, the station datum is labeled "STND" and is located at the bottom of the figure.

Primary Water Levels

On several of our stations we have more than one water level sensor; the “main” water level sensor (the one we trust most) is always the “primary water level” sensor.

Water elevations are measured at each station relative to a zero point (called the station datum) chosen such that none of the water level measurements will be below the zero point. Each station has its own station datum unrelated to the others, and we keep a series of deep-driven-rod bench marks to maintain the station’s zero point over time and to determine the conversions to other vertical datums.

Most of our stations use an Aquatrak sensor to compute “primary water level”. Each primary water level reading is computed by collecting 181 1-second water level measurements, computing the standard deviation of those measurements, discarding any “outlier” measurements more than three standard deviations from the mean, and reporting the mean of the remaining samples as the primary water level elevation for that period.
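The averaging scheme just described can be sketched as follows. This is an illustrative reimplementation, not CBI's production code, and the sample readings are made up:

```python
import statistics

def primary_water_level(samples, sigma_limit=3.0):
    """Outlier-trimmed mean, following the averaging scheme described
    above: discard samples more than `sigma_limit` standard deviations
    from the mean, then report the mean of the remaining samples."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return mean  # all samples identical; nothing to trim
    kept = [s for s in samples if abs(s - mean) <= sigma_limit * stdev]
    return statistics.fmean(kept)

# Example: 181 one-second readings (meters above station datum),
# including one obvious spike that the trimming step removes.
readings = [1.52] * 178 + [1.55, 1.49, 9.99]
level = primary_water_level(readings)
```

In this synthetic example the 9.99 m spike lies well beyond three standard deviations and is discarded, so the reported elevation is the mean of the remaining 180 samples.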

Station Configuration

typical station

CBI uses several different data collection systems to obtain environmental measurements. Each station has sensors for measuring the various environmental parameters, a data collection computer for controlling the sensors and temporarily storing data on-site, one or more telemetry devices for retrieving data from the station, and solar panels and batteries for power. The instrument configuration at each station is dependent on the purpose of the station and the local conditions. For example, the instruments used to measure parameters such as water current, salinity, and dissolved oxygen are expensive to install and maintain, thus they are only deployed at a limited number of sites. The presence of buildings or other physical structures near a data collection station may prevent accurate wind measurements; as a consequence, wind is not measured at such stations.

Most TCOON stations use a Next Generation Water Level Measurement System (NGWLMS) as designed by NOS. At the heart of the NGWLMS is a Sutron 9000 computer that controls the attached sensors, collects and stores environmental observations, and transmits the observations via satellite, radio, or telephone modem. The Sutron 9000 has been quite reliable for TCOON operations; however, these systems are beginning to deteriorate with age, and it is difficult to find repair or replacement parts. As a result, over the past several years CBI has gradually migrated to Vitel VX1100 Data Acquisition and Telemetry Unit computers for data collection at some stations. The VX1100 provides much of the same functionality and capability as the Sutron 9000 at a lower cost.

For other projects such as the Corpus Christi Real-Time Navigation System and the Freeport FlowInfo System, CBI has developed a data collection computer using industry-standard PC-104 computer components. A typical PC-104-based data collection computer consists of an Intel 486 or Pentium processor, 16 megabytes of RAM, and at least 20 megabytes of solid-state hard disk space. The data collection computer runs a modified form of the Linux operating system, which allows the use of a rich set of software development tools and provides a robust multitasking environment for controlling sensors, storing data, and performing on-site processing such as data compression for transmission over bandwidth-limited communications channels.

The sensors used to measure environmental parameters include acoustic transducers for measuring water elevations, wind anemometers for wind speed and direction, acoustic doppler current profiling instruments, multiparameter water-quality probes, and other similar instruments. Each of the sensors is interfaced to the data collection computer via serial communications ports or analog-to-digital conversion hardware. CBI has an operations staff that performs routine maintenance and calibration of each sensor.

All the stations in TCOON and related projects measure environmental parameters at some multiple of six-minute intervals. For example, water-level measurements are taken every six minutes, while other measurements such as salinity or barometric pressure may be made every thirty or sixty minutes. The data are stored on-site in the data collection computer and then transmitted to CBI over one or more communications channels, including satellite, spread-spectrum packet radio, or telephone modem. The choice of communication medium for a station depends on the availability of telephone or radio connections and the degree of need for real-time observations. For example, stations in the Corpus Christi Real-Time Navigation System and Freeport FlowInfo System use line-of-sight packet radios to allow frequent acquisition of data from the on-site data collection computers, typically downloading new data once every twelve minutes. At stations where radio or telephone connections are not available, satellite transmissions are used to transmit data at hourly or three-hourly intervals. Thus, the time from measurement to acquisition at CBI depends on the measurement interval and the communications medium used; the data arrive at CBI somewhere between six minutes and six hours after measurement.

Data Management

CBI maintains a central repository for all data collected by TCOON and other CBI environmental networks. The varied applications for CBI’s environmental data sets present diverse requirements for data management. For example, the computation of tidal datums used in determination of property boundaries requires detailed analysis of long-term data sets in accordance with NOS standards and sufficient audit capability to defend the accuracy of the datums in legal contexts. Recreational and lay users desire easy-to-understand presentations of data (e.g., graphics or summaries), while scientists need access to the raw data in a form that can be easily imported into models or research projects. Applications such as marine navigation and weather forecasting need near-real-time access to data sets and automated quality-control systems.

Since 1991, the Conrad Blucher Institute for Surveying and Science has treated data management as a "mission critical" component of its observation networks. CBI recognizes that the success of its observation network efforts depends on the quality of the end products. Because many of CBI's products are used to determine property boundaries and support engineering designs, these products may need to be defended in a court of law. Therefore, CBI maintains detailed records and audit trails for all of the steps used in the creation of its data products. Electronic data management and highly automated systems have been the keys that allow CBI to achieve these results within limited budgets.

The data management strategy used by CBI can be summarized by the design principles listed under "Data Management Design Principles" below.

Rigorous adherence to these design principles has produced a system that is robust, stable, and flexible enough to accommodate a wide variety of observational-data needs and changes in requirements. Since 1991, CBI’s data acquisition and reporting system has been able to quickly and cleanly adapt to changes in sensor packages, hardware environments, operating systems, database management systems, and communications environments. Our present data management system is running on a 1 GHz Pentium-processor based personal computer using the Linux operating system and open-source software packages such as Perl, Apache, and MySQL. The overall architecture of the data management system can be divided into loosely integrated subsystems.

Each business morning, one or more CBI personnel perform additional quality control by visually inspecting recently received data in the online database. This is facilitated by a Web-based interface that automatically graphs the previous fourteen days’ collected data for each station in the network. An analyst detecting a potential problem in the network can use this same interface to enter a message into the online system indicating that a problem or anomaly needs to be investigated and/or corrected. These quality-control messages are then distributed daily to field operations staff and management, who then arrange for necessary repairs and recovery of missing data. Operators also have the ability to suppress distribution of erroneous data that may not have been detected by the automated quality-control systems.

CBI’s extensive use of automation has resulted in a cost-effective, reliable, and flexible implementation of data management. Data acquisition, archiving, and distribution take place autonomously with only occasional operator intervention in cases of platform malfunction or data transmission errors. The daily data inspection results in timely platform repairs and excellent data quality. A CBI staff member can generally perform a complete inspection of the data from all stations in the network in less than an hour.

Furthermore, the use of automated systems for the majority of the data processing tasks makes it possible to provide environmental data to end-users in near-real time. Observations that pass the automated quality-control features of the system are generally made available to end-users within seconds of the data’s arrival at CBI. For stations equipped with radio transmission facilities, this means that data are typically available to end-users within fifteen minutes of the actual time of measurement.

Data Management Design Principles

  1. Preserve source data and annotate data instead of modifying or deleting. Because CBI’s products may be critically reviewed in legal proceedings, CBI discards as little information as possible. All source records from instruments are kept and archived in their original format, and these serve as the basis for all “derived” products. All changes (adjustments and deletions) are performed as annotations to the data sets without modifying the source data. In fact, it is possible for CBI to automatically regenerate the entire observation network database from the archived source documents.
  2. Automate as much as possible. Automation is key to all CBI operations. Automation increases the capacity to manage data, reduces the opportunities for human error, ensures a consistent method for managing and processing data, and documents the procedures used (in the form of programs) to derive the data products.
  3. Maintain a standard data interchange format. TCOON’s early data management systems explicitly recognized the importance of a standardized data format that is easy to produce and easy for humans and automated systems to process. All data entering the TCOON data management system are first converted into a standardized ASCII representation with common time references and units. In addition, all data files are tagged with sufficient metadata to allow a consumer to determine the source of the data and the processing that has been performed on them.
  4. Avoid complex or proprietary components. Loose integration of simple components provides a more flexible and robust system than a tight integration of more complex components. Since 1991 all TCOON data management components have been Unix based; since 1996 all user interfaces have been Web based, and since 1998 all TCOON data management components have utilized standard PC-based hardware systems and open-source software packages such as Linux and MySQL. Using widely available and standardized system components has provided flexibility and stability in the data management environment.
  5. Emphasize long-term reliability over short-term costs. Investments made in infrastructure design and implementation reap long-term benefits in the form of reduced operations costs and improved reliability. Maintenance and operation of a long-term observation network requires a long-term perspective for all of the day-to-day decisions that must be made. Software and procedural “quick fixes” are to be avoided, because a poor decision made in haste today often leads to an expensive repair in the future.
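As a purely illustrative sketch of the kind of tagged ASCII record described in principle 3, a data file might carry its metadata as header lines ahead of timestamped values. The field names, station ID, and layout here are assumptions for illustration, not TCOON's actual interchange format:

```text
# station: 014            (hypothetical station ID)
# sensor: pwl             (primary water level)
# units: meters above station datum
# time-reference: UTC
# processing: decoded, initial QC applied
2004-06-01T00:00  1.52
2004-06-01T00:06  1.54
2004-06-01T00:12  1.55
```

The point of such a format is that both a human and a simple parser can determine, from the file alone, where the data came from and what has been done to them.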

Data Management Integrated Subsystems

  1. The “data acquisition” subsystem is responsible for retrieving the data from sensor packages and observation platforms in the network. Because of the wide variety of communications links and platform hardware used for measuring environmental data, the data acquisition subsystem is actually accomplished by a small number of independent software modules that implement communication with specific platform hardware or specific communications media. As new platform hardware configurations or communications channels are added to the observation network, it is easy to develop the corresponding acquisition component and integrate it into the existing acquisition subsystem.
  2. The “data archival/decoding” subsystem is responsible for maintaining an archive of the source data. All incoming data are placed in a special directory known as the “inbox”, where an automated “clerk” process picks up the data for archiving and decoding. The clerk process then calls individual decoder programs that convert the raw data files into a standardized interchange format for storage in the MySQL database. The clerk and its decoder programs also perform quality-control data checks to prevent erroneous data from entering the online database system. Any errors found in the incoming data are flagged and the source data files are left in the inbox until an operator takes corrective actions to eliminate the errors.
  3. The MySQL database contains the decoded data as received from the observation platforms after they have passed through the initial decoder quality-control checks. Any further corrections or adjustments to the data are recorded in the database as separate “correction” records that are automatically applied to the data whenever they are extracted from the observations database.
  4. The “data extraction” subsystem provides an interface to allow users to query the observation database. Currently, the Hypertext Transfer Protocol (HTTP) and the Apache Web server serve as the primary query interface to the database. The query interface can retrieve data from the observation database in a number of different formats (e.g., graphs, text) and can automatically apply various transformations to the data to present them in the form most useful to the end-user.
  5. Surrounding the data extraction subsystem are a number of World Wide Web front ends that provide different views of the available data as well as ancillary information to assist the end-user in understanding the system.
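The inbox-and-clerk flow described in items 2 and 3 might be sketched as below. The file naming, record layout, and QC range are assumptions for illustration, not details of CBI's actual system:

```python
import shutil
from pathlib import Path

def decode(line):
    """Decode one record in an assumed 'station,iso_time,value' layout."""
    station, when, value = line.strip().split(",")
    return station, when, float(value)

def plausible(record, lo=-1.0, hi=15.0):
    """Initial QC check: water level (meters above station datum)
    must fall within a physically sane range."""
    return lo <= record[2] <= hi

def clerk_pass(inbox, archive, database):
    """One pass of the 'clerk': decode each raw file in the inbox, load
    records that pass QC into the online database, and move the source
    file to the archive. Files containing suspect records are left in
    the inbox for an operator to take corrective action."""
    for raw in sorted(Path(inbox).glob("*.dat")):
        lines = [ln for ln in raw.read_text().splitlines() if ln.strip()]
        records = [decode(ln) for ln in lines]
        if all(plausible(r) for r in records):
            database.extend(records)                      # into the online database
            shutil.move(str(raw), str(Path(archive) / raw.name))  # keep the source
        # else: the file stays in the inbox, flagged for review
```

Because the source file is archived rather than modified, the database can in principle be regenerated from the archive at any time, which is the property principle 1 above relies on.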


Data Applications

Tidal Datum Definition. Water-level datums have their origins in the need to reduce depth soundings taken at different stages of the tide during hydrographic surveys to a common level. In recent years, greater emphasis has been placed on the needs of land surveyors engaged in determining waterfront boundaries defined by tidal datums. NOS has established the procedures needed to compute tidal datums in normal tidal regimes. However, at many locations along the Texas coast, the astronomical tide is often masked by local meteorological conditions and long-term trends in the Gulf of Mexico. These conditions make the standard procedures difficult to apply and increase the labor required to produce a tidal datum. CBI and NOS have worked together to improve the applicability of the procedures to the Texas coast and to reduce the need for manual processing. CBI has developed Web-based software that automates the computation of datums from water-level data stored in the CBI database. This software, together with the daily quality-control procedures, allows CBI to quickly produce tidal datums with little manual intervention.
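As a greatly simplified illustration of what a low-water datum computation involves: pick the low waters out of a water-level series and average them. The real NOS procedure tabulates highs and lows per tidal cycle over a 19-year epoch and applies corrections; the series below is synthetic:

```python
def low_waters(levels):
    """Local minima of a water-level series -- a simplification of the
    NOS tabulation, which identifies one low water per tidal cycle."""
    return [b for a, b, c in zip(levels, levels[1:], levels[2:])
            if b < a and b <= c]

def mean_low_water(levels):
    """Mean Low Water as the average of the tabulated low waters."""
    lows = low_waters(levels)
    return sum(lows) / len(lows)

# Two synthetic tidal cycles (meters above an assumed station datum)
series = [1.0, 0.6, 0.2, 0.5, 0.9, 1.3, 0.8, 0.4, 0.7, 1.1]
mlw = mean_low_water(series)  # averages the two lows, 0.2 and 0.4
```

In a masked-tide regime like much of the Texas coast, the hard part is not this averaging but deciding which excursions count as tidal lows at all, which is where the automated procedures mentioned above come in.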

Littoral Boundary Definition. The tidal datums described above are used in the determination of littoral boundaries between submerged and privately owned lands. Bench-mark leveling is performed annually at each TCOON station to ensure station stability and to relate water-level information to reference points on land. CBI publishes bench-mark elevation sheets on its Web site that indicate the elevation of bench marks above tidal datum planes; surveyors then use these elevations for precisely locating littoral boundaries.

Navigation and Marine Safety. The ready availability of TCOON data has great value to navigation interests. Information provided by TCOON is augmented with current meter data from other systems to provide near-real-time reports of conditions in heavily trafficked shipping channels in several Texas ports. One such system is the Corpus Christi Real-Time Navigation System sponsored by the Port of Corpus Christi. CBI has installed acoustic doppler current instruments at key locations along the Corpus Christi Ship Channel; near-real-time data from these instruments and TCOON platforms allow ship pilots to better navigate large vessels in and out of Corpus Christi Bay. Access to the near-real-time data is provided via automated touch-tone voice response systems running on CBI computers; pilots call a local telephone number to receive a digitized voice summary of the latest current, water-level, and meteorological conditions for the stations in Corpus Christi Bay. The NOS-sponsored Houston/Galveston Physical Oceanographic Real-Time System (PORTS) makes similar use of TCOON stations to assist pilots in Galveston Bay, and the Freeport FlowInfo system monitors currents and water levels for Port Freeport.

In 2001 the Corpus Christi National Weather Service Office funded the installation of an offshore station for the collection of meteorological and wave data. Prior to the installation of this station, the local weather office had limited data in the 15- to 20-mile offshore region from which to produce its marine forecasts. The new offshore platform provided forecasters with vital information needed to produce accurate marine forecasts. Financial support for this installation ended in 2004, and the station is not presently functional.

Channel Dredging and Maintenance. The United States Army Corps of Engineers Galveston District uses TCOON data to plan and execute its maintenance of Federally authorized channels and waterways along the Texas coast, including ship channels and the Gulf Intracoastal Waterway. Navigation channel maintenance and operation activities require knowledge of water level, tidal datums, and other environmental parameters before, during, and after dredging. Convenient access to real-time and historical data is needed in order to conduct these activities.

Oil-Spill Response. One of the more significant applications of TCOON data uses its near-real time capabilities to calibrate circulation and oil-spill trajectory models with recently collected observations. The Texas Water Development Board has developed an automated system that models water currents in Corpus Christi Bay and Galveston Bay using TCOON data. Each night, the latest water level measurements are automatically downloaded from the TCOON Web site via the Internet, and these measurements are used to generate a new set of model calculations that predict the currents for the following three days. These current predictions are then fed as input to a trajectory model called SpillSim that predicts where oil will move if a spill occurs in these bays. Oil-spill emergency response teams use information from these models to deploy clean-up teams and other resources to minimize a spill’s impact.

Hurricane and Storm Preparation. An initial and ongoing application of CBI’s tide stations has been to provide timely water-level data to assist the City of Corpus Christi and the local Corpus Christi National Weather Service Office with hurricane and storm preparedness. Local officials use TCOON data as input for decisions regarding inundation of low-lying areas along the coast, road closures, and evacuations. Software has been developed to allow the local National Weather Service Office to seamlessly integrate near-real-time CBI observations into its weather forecasting systems to better predict the effects of an incoming storm. Because the A&M-CC campus is evacuated if a hurricane threatens the area, CBI can move its data-collection system to local emergency operations centers to continue collection and distribution of TCOON data during the storm event.

Recreation and Benefit to the General Public. As part of its public service mission, CBI has endeavored to find applications for TCOON that provide benefit to the general public; several of these applications are described here. A particularly successful application of TCOON data has been CBI’s WindInfo system. WindInfo is a telephone voice-response system (361-992-WIND) that provides the general public with wind and water conditions for any station in CBI’s observation networks. In 1993, WindInfo received over 60,000 calls from windsurfers, sailors, fishermen, and other recreational enthusiasts desiring near-real-time information along the coast. In addition to wind data, continuous salinity and water temperature observations from CBI monitoring stations in several Texas bay systems are very popular with the recreational and commercial fishing communities.