How advances in computer simulations can lead to more resilient coastlines
January 31, 2020
Advanced numerical modeling technology allows engineers to simulate complex and changing conditions for more successful and resilient projects
On any coastal engineering project, large or small, it is critical to have a solid understanding of the physical processes involved, such as tides, waves, and storm surge. If you’re lucky, maybe the National Oceanic and Atmospheric Administration (NOAA) or the United States Geological Survey (USGS) plunked down a tide gauge, flow meter, or wave buoy near the spot you’re examining a few decades ago, and now you’ve got a long-term, verified source of data at your site. This, of course, is incredibly unlikely.
You could deploy some instruments, but that comes with its own set of difficulties. You’re likely to end up with a short-term data set that samples only a specific set of conditions at a few point locations. And what if your area of interest is not just a single section of shoreline but, say, a 20,000-acre salt marsh needing coastal habitat restoration?
The solution is a numerical model. Over the past two decades, massive advances in computing power (and cost efficiency) have made this type of model easier to build and more common than ever.
Depending on the application, software can be run on a desktop computer or remotely by leasing time on a high-performance cloud computing cluster such as those hosted by Amazon or Google. Additionally, there is a wealth of freely available data from US government agencies and partners, such as NOAA, USGS, and the National Weather Service (NWS), that can be used to calibrate, validate, and drive these models.
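To make that last point concrete, here is a minimal sketch (in Python, using NOAA’s public CO-OPS data retrieval API) of pulling verified water levels from an existing gauge for later calibration. The station ID, date range, and datum below are illustrative placeholders rather than recommendations for any particular project.

```python
# Minimal sketch: retrieve verified water levels from a NOAA CO-OPS gauge
# for model calibration. Station ID, dates, and datum are illustrative.
import requests

API_URL = "https://api.tidesandcurrents.noaa.gov/api/prod/datagetter"

params = {
    "product": "water_level",      # verified 6-minute water levels
    "station": "8557380",          # example gauge on Delaware Bay (Lewes, DE)
    "begin_date": "20190101",
    "end_date": "20190131",
    "datum": "NAVD",               # vertical datum for returned elevations
    "units": "metric",
    "time_zone": "gmt",
    "format": "json",
    "application": "model_calibration_demo",
}

response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()
records = response.json()["data"]  # list of {"t": timestamp, "v": value, ...}

# Keep (time, water level) pairs for comparison against model output.
observed = [(rec["t"], float(rec["v"])) for rec in records if rec["v"]]
print(f"Retrieved {len(observed)} observations")
```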
In the coastal engineering world, a numerical model is a virtual representation of the body of water of interest that simulates the physics that drive water level fluctuations, currents, and circulation; generate and propagate waves; and transport sediment. The physical parameters can include tides, currents, river flows, salinity, temperature, waves, and even meteorological and solar influences.
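To give a feel for the water level piece, the tide at a site can be approximated as a sum of harmonic constituents. The toy sketch below synthesizes a water level signal from the two largest semidiurnal constituents; the constituent periods are standard, but the amplitudes and phases are invented for illustration. A real hydrodynamic model solves the governing flow equations rather than summing harmonics, so treat this strictly as a conceptual picture.

```python
# Toy tidal synthesis: water level as a sum of harmonic constituents.
# Constituent periods (M2, S2) are standard; amplitudes and phases are
# invented for illustration and do not represent any real station.
import math

CONSTITUENTS = [
    # (name, period in hours, amplitude in m, phase in radians)
    ("M2", 12.4206, 0.60, 0.0),   # principal lunar semidiurnal
    ("S2", 12.0000, 0.15, 1.0),   # principal solar semidiurnal
]

def water_level(t_hours: float) -> float:
    """Sum the constituent signals at time t (hours from an arbitrary origin)."""
    return sum(
        amp * math.cos(2.0 * math.pi * t_hours / period - phase)
        for _, period, amp, phase in CONSTITUENTS
    )

# Print a day of levels to show the semidiurnal rise and fall.
for hour in range(0, 25, 3):
    print(f"t = {hour:2d} h  eta = {water_level(hour):+.2f} m")
```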
Models can be one-, two-, or three-dimensional and cover short- or long-term timeframes. Both historical and hypothetical scenarios can be evaluated. The spatial domain can be large and coarsely refined, small and finely resolved, or both.
Modern hydrodynamic numerical models are extremely adaptable to the needs of a specific project, whether that is simulating storm surge from a hurricane in the western Atlantic Ocean or 3D temperature fluctuations in a pond in central Florida. Once a model is set up and validated, users have access to a complete physics-based simulation of all relevant parameters at thousands, or even millions, of points within an area of interest, all from the comfort of one’s desk.
In the past, these models were largely the realm of academia and were predictably difficult to use. Eventually, some of the more popular modeling tools, such as Delft3D and ADCIRC, were released to the public. But they often came with a significant monetary cost. In 2011, however, the tides shifted.
Delft3D went open source, leaving no barrier to entry for using a world-class model other than diving in and learning the software. Today, the graphical user interfaces (GUIs) are extremely user friendly, and users can still pop the hood and access less commonly used modules not available via the GUI if project needs require it.
The scalability of modern numerical models makes them suitable for nearly any project that requires an assessment of hydrodynamics: from a 1D wave model for a specific location and scenario that can be set up and executed in a day, to a large-scale, highly detailed flow and wave model covering an entire ocean and hundreds of storm events that requires months of setup and simulation time.
How can numerical models benefit project design?
With the evolution of this technology, advanced hydrodynamic modeling can bring great advantages to a project. Two projects from my work as a coastal engineer particularly come to mind: one is a classic coastal engineering application, while the other is decidedly not.
The first involves a multiyear post-Sandy recovery effort at Prime Hook National Wildlife Refuge (NWR) on the western shore of Delaware Bay. After Hurricane Sandy, this artificially impounded freshwater marsh was ravaged by saltwater intrusion after a number of breaches tore through the barrier island separating the marsh from the bay. Water flowing through the breaches led to marsh die-off and increased flooding of the local community. Working with the US Fish and Wildlife Service (USFWS), we used Delft3D to develop a highly detailed numerical model of the marsh and bay with nearly two million computational points.
The goal of this effort was to better understand the circulation within the refuge and its relationship with tidal and salinity patterns in Delaware Bay in order to investigate potential solutions to the problems caused by the barrier island destruction. As a result, the model incorporated tides, salinity, freshwater inflow, and wind data. It was then calibrated and validated for both normal conditions and extreme storm events. In an iterative process, we tested a number of physical modifications to the refuge, including breach stabilization, breach closure, beach nourishment, and channel modification.
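Calibration and validation here means quantitatively comparing modeled and observed time series at the gauges. Below is a minimal sketch of two measures commonly used for hydrodynamic models, root-mean-square error and the Willmott skill score; the short arrays are placeholders standing in for co-located model output and observations, not data from the Prime Hook effort.

```python
# Minimal sketch of comparing modeled vs. observed water levels at a gauge.
# RMSE and the Willmott (1981) skill score are common validation metrics;
# the short arrays below are placeholders, not project data.
import numpy as np

def rmse(model: np.ndarray, obs: np.ndarray) -> float:
    return float(np.sqrt(np.mean((model - obs) ** 2)))

def willmott_skill(model: np.ndarray, obs: np.ndarray) -> float:
    """Index of agreement: 1.0 is a perfect match, 0.0 is no skill."""
    obs_mean = obs.mean()
    numerator = np.sum((model - obs) ** 2)
    denominator = np.sum((np.abs(model - obs_mean) + np.abs(obs - obs_mean)) ** 2)
    return float(1.0 - numerator / denominator)

# Placeholder time series (e.g., hourly water levels in meters).
observed = np.array([0.12, 0.45, 0.71, 0.52, 0.10, -0.33, -0.58, -0.41])
modeled  = np.array([0.10, 0.41, 0.75, 0.55, 0.14, -0.30, -0.62, -0.38])

print(f"RMSE  = {rmse(modeled, observed):.3f} m")
print(f"Skill = {willmott_skill(modeled, observed):.3f}")
```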
Eventually, the model results showed us that it was possible to reestablish a salt marsh (the refuge’s natural condition before it was artificially impounded in the 1980s) by closing the breaches and restoring historical flow pathways. At the same time, this approach would facilitate ecosystem recovery and solve the flooding issue for the local community. The detailed results of the model were used to predict the types and locations of potential marsh regimes based on water level ranges and salinities, as shown in the graphic above. This level of detail would be difficult to achieve via an empirical analysis, and the model proved to be an invaluable tool for testing and modifying potential solutions.
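As a purely hypothetical illustration of that kind of post-processing, the sketch below maps each model output point to a marsh regime from its simulated salinity and inundation. The thresholds are invented for demonstration and are not the criteria used on the project.

```python
# Hypothetical post-processing sketch: classify each model output point into a
# marsh regime from simulated salinity and inundation. Thresholds are invented
# for illustration and are NOT the criteria used on the actual project.
from dataclasses import dataclass

@dataclass
class ModelPoint:
    mean_salinity_ppt: float       # long-term mean salinity at the point
    inundation_fraction: float     # fraction of time the point is submerged

def classify_regime(pt: ModelPoint) -> str:
    if pt.inundation_fraction > 0.8:
        return "open water / subtidal"
    if pt.mean_salinity_ppt > 18:
        return "salt marsh"
    if pt.mean_salinity_ppt > 5:
        return "brackish marsh"
    return "freshwater marsh"

points = [ModelPoint(25.0, 0.45), ModelPoint(8.0, 0.30), ModelPoint(1.5, 0.10)]
for pt in points:
    print(pt, "->", classify_regime(pt))
```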
Today, after an extensive beach nourishment and channel dredging effort, Prime Hook NWR is in a state of rapid recovery, and its success is a template for restoration efforts nationwide as we move towards more sustainable and resilient design in the face of climate change and sea level rise.
The second project illustrates that the use of hydrodynamic models is not restricted to coastal and estuarine areas. The same physics apply in a lake as in an ocean, so if your model is set up appropriately, it can accurately reproduce hydrodynamics in any body of water. In this case, a utility company in Florida needed help finding a solution for a power plant cooling pond that was not performing adequately during the hottest summer days.
A three-dimensional Delft3D model was developed, incorporating the power plant discharge and intake, water temperature, wind conditions, air temperature, humidity, rainfall, and solar radiation. Water temperature profiles were collected throughout the pond to calibrate and validate the model, then several scenarios were run to determine if the client’s proposed flow diversions could improve the cooling capacity of the pond.
In the end, none of the diversion scenarios proved effective because the pond lacked temperature stratification, and it was determined that the water would need to be cooled by mechanical means instead. The modeling effort allowed the client to eliminate potential alternatives without the risk of implementing an ineffective solution.
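The stratification finding comes straight from the measured and modeled temperature profiles. One simple way to quantify it is the top-to-bottom temperature difference at each profile, sketched below with placeholder numbers rather than project data.

```python
# Simple check of thermal stratification from a single temperature profile:
# the top-to-bottom difference. Values are placeholders, not project data.
profile = [
    # (depth in m below surface, temperature in deg C)
    (0.5, 33.1),
    (1.5, 32.9),
    (3.0, 32.7),
    (4.5, 32.6),
]

surface_temp = profile[0][1]
bottom_temp = profile[-1][1]
delta_t = surface_temp - bottom_temp

# A small difference (well-mixed pond) means flow diversions that draw from
# different depths cannot tap a reservoir of cooler water.
print(f"Top-to-bottom temperature difference: {delta_t:.1f} deg C")
print("Well mixed" if delta_t < 1.0 else "Stratified")
```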
The advantages of making hydrodynamic modeling a regular part of an engineering arsenal are manifold. With today’s abundance of freely available, trusted data (including digital elevation models, tide gauges, wave buoys, etc.), a calibrated and validated hydrodynamic model can be developed, often with minimal field effort. If a field effort is required, it isn’t always necessary to capture a specific scenario “in the wild,” because a properly validated model can be used to simulate a wide range of conditions with confidence.
Further, the model needs only to be validated at select points, and users can be largely confident that it is well-behaved and accurate throughout the domain. The results of the modeling effort are far more thorough and detailed than any field effort or empirical analysis, and the output can be presented with rich visualizations, helping the client and stakeholders better understand the conclusions and recommendations gleaned from the model.