\chapter*{Abstract}

% \noindent\emph{Nearshore bathymetry needs}

\noindent\textbf{Nearshore bathymetry needs} Knowledge of nearshore bathymetry is crucial for every aspect of the blue economy. Currently, it is expensive to obtain nearshore bathymetry at regional scales, so this data is unavailable for many coasts. The scarcity is most acute in the global south and in big ocean states, which are at the highest risk from climate change. At present, the only global bathymetric dataset, GEBCO, provides depth data at approximately 500m resolution. This data is of limited accuracy in the nearshore zone because its source data is sparse in nearshore areas. Recent research has found that NASA's ICESat-2, launched in 2018 to study the cryosphere, can incidentally capture bathymetric data. Some studies have been conducted at small sites, but the potential for a global product has not yet been investigated. This thesis proposes a method that uses GEBCO data as a starting point and incorporates ICESat-2 data via a Kalman filter. The result is a product with a finer, downscaled spatial resolution and improved accuracy, without requiring any in-situ data. If other local bathymetry data is available, it can be added as input to the Kalman filter or used for validation of the method.
% \noindent\emph{Lidar Satellite-derived bathymetry}

\noindent\textbf{Lidar satellite-derived bathymetry} Recent studies in lidar remote sensing have shown that ICESat-2 can capture nearshore bathymetry at depths of up to 20 meters along tracks spaced 0--3 km apart, provided atmospheric conditions are good and the water is sufficiently clear. Hence, this data could provide a source of high-resolution bathymetric depth profiles in tropical areas and possibly at higher latitudes. This research project proposes an automated processing chain \pdfcomment{to state that mine is an alternative, do I need to introduce the NASA chain in the abstract?}, written in Python, for extracting bathymetry points from the lidar data based on the density of the photon returns in the underwater zone. This method can reliably identify points containing bathymetric signal.
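The density-based extraction could be sketched as follows. This is a minimal illustration of the general idea only, not the thesis's actual processing chain: the function name, window sizes, and neighbor threshold are all hypothetical placeholders, and a real implementation would use an efficient spatial index rather than a brute-force loop.

```python
import numpy as np

def density_filter(along_track, elevation, radius=5.0, dz=0.5, min_neighbors=5):
    """Flag photons that are likely bathymetric signal.

    A photon is kept when enough other photons fall inside a small
    along-track/vertical window around it: signal returns cluster on
    the seabed, while noise photons are sparse. All parameter values
    here are illustrative, not the thesis's tuned settings.
    """
    x = np.asarray(along_track, dtype=float)
    z = np.asarray(elevation, dtype=float)
    keep = np.zeros(x.size, dtype=bool)
    for i in range(x.size):
        near = (np.abs(x - x[i]) < radius) & (np.abs(z - z[i]) < dz)
        keep[i] = near.sum() - 1 >= min_neighbors  # exclude the photon itself
    return keep
```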

% \noindent\emph{Data Assimilation via Kalman Updating}

\noindent\textbf{Data assimilation via Kalman updating} To downscale the data using a Kalman filter, the global GEBCO data is first clipped to the area of interest and then resampled bilinearly to 50m resolution. The ICESat-2 photon data for the area is then processed to generate point measurements of bathymetric depth. To fill the gaps between these point measurements, the bathymetric points are subsampled and interpolated to the same resolution as the GEBCO data using a universal kriging interpolator. The interpolator produces a raster of the estimated depth and a raster of the estimated uncertainty. To update the interpolated GEBCO grid, the Kalman gain is calculated for each raster cell, and a new bathymetry grid is produced using the Kalman state equation. If other data is available, the process can be applied recursively with additional depth and uncertainty grids, allowing the Bayesian combination of any number of bathymetry datasets for the site.
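The per-cell update described above amounts to a scalar Kalman update applied independently at every raster cell. A minimal sketch, assuming depth and variance grids as NumPy arrays (the function name and the NaN-gap handling are illustrative assumptions, not necessarily how the thesis implements it):

```python
import numpy as np

def kalman_update(prior_depth, prior_var, obs_depth, obs_var):
    """Scalar per-cell Kalman update of a bathymetry grid.

    prior_depth / prior_var: interpolated GEBCO depth and its variance.
    obs_depth / obs_var: kriged ICESat-2 depth and kriging variance.
    All inputs are same-shape arrays; NaN in obs_depth leaves the
    prior unchanged (no ICESat-2 coverage in that cell).
    """
    gain = prior_var / (prior_var + obs_var)  # Kalman gain K per cell
    posterior_depth = prior_depth + gain * (obs_depth - prior_depth)
    posterior_var = (1.0 - gain) * prior_var
    # keep the prior wherever the observation grid has gaps
    mask = np.isnan(obs_depth)
    posterior_depth = np.where(mask, prior_depth, posterior_depth)
    posterior_var = np.where(mask, prior_var, posterior_var)
    return posterior_depth, posterior_var
```

Because the output is again a depth grid with a variance grid, the same function can be applied recursively with any further dataset, which is the recursive Bayesian combination the paragraph describes.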

% \noindent\emph{Validation}

\noindent \textbf{Validation} To validate the method, the improvement in RMS error is calculated between the resulting bathymetry grid and previously validated, high-accuracy survey data. The validation has been applied at several test sites around the world to verify that the method generalizes to other regions. Validation sites are chosen based on the availability of validation data from USGS surveys, the Dutch Jarkus dataset, or survey data from Van Oord projects.

% \noindent\emph{Outlook}

\noindent\textbf{Outlook} The results of this research could enable easier characterization of nearshore bathymetry in remote areas. The method could also be extended to include temporal variation: as more bathymetric data becomes available, it could be used to measure the dynamic changes of coastal systems. Data with this temporal dimension could provide valuable validation data for coastal dynamics models. However, the method is limited by strict water clarity requirements and by missing data due to atmospheric conditions. Despite these limitations, it could potentially be scaled up to a global product for clear-water areas.