
Commit 8784cb9

committed
minor changes/typos mostly
1 parent d45cb62 commit 8784cb9


3 files changed (+5, -4 lines)


document/chapters/abstract.tex

+3-3
@@ -1,20 +1,20 @@
\chapter*{Abstract}
% \noindent\emph{Nearshore bathymetry needs}

- \noindent\textbf{Nearshore bathymetry needs} Knowledge of nearshore bathymetry is crucial for every aspect of the blue economy. Currently, it is expensive to obtain nearshore bathymetry at regional scales, and therefore this data is not available for many coasts. The data scarcity occurs especially in the global south and big ocean states that are at the highest risk from climate change. Currently the only global bathymetric dataset, GEBCO, provides depth data at approximately a 500m resolution. This data is of limited accuracy in the nearshore zone because the source data is sparse in nearshore areas. Recent research has found that NASA's ICESat-2, launched in 2018 to study the cryosphere, can incidentally capture bathymetric data. Some studies have been doing using small sites, but the potential for a global product has not yet been investigated. This thesis proposes a method of using GEBCO data as a starting point and incorporating ICESat-2 data via a Kalman filter. This results in a product with an upscaled spatial resolution and improved the accuracy without requiring any in-situ data. If other local bathymetry data is available, it can be added as input to the Kalman filter or used for validation of the method.
+ \noindent\textbf{Nearshore bathymetry needs} Knowledge of nearshore bathymetry is crucial for every aspect of the blue economy. Currently, it is expensive to obtain nearshore bathymetry at regional scales, and therefore this data is not available for many coasts. The data scarcity is especially acute in the global south and in big ocean states that are at the highest risk from climate change. Currently, the only global bathymetric dataset, GEBCO, provides depth data at a resolution of approximately 500 m. This data is of limited accuracy in the nearshore zone because the source data is sparse in nearshore areas. Recent research has found that NASA's ICESat-2, launched in 2018 to study the cryosphere, can incidentally capture bathymetric data. Some studies have been carried out at small sites, but the potential for a global product has not yet been investigated. This thesis proposes a method that uses GEBCO data as a starting point and incorporates ICESat-2 data via a Kalman filter. This results in a product with a downscaled (finer) spatial resolution and improved accuracy, without requiring any in-situ data. If other local bathymetry data is available, it can be added as input to the Kalman filter or used to validate the method.
% \noindent\emph{Lidar Satellite-derived bathymetry}

\noindent\textbf{Lidar satellite-derived bathymetry} Recent studies in lidar remote sensing have shown that ICESat-2 can capture nearshore bathymetry at depths of up to 20 meters along tracks spaced 0--3 km apart, provided atmospheric conditions are good and the water is sufficiently clear. Hence, this data could provide a source of high-resolution bathymetric depth profiles in tropical areas and possibly at higher latitudes. This research project proposes an automated processing chain \pdfcomment{to state that mine is an alternative, do I need to introduce the NASA chain in the abstract?}, written in Python, for extracting bathymetry points from the lidar data based on the density of the photon returns in the underwater zone. This method can reliably identify points containing bathymetric signal.

% \noindent\emph{Data Assimilation via Kalman Updating}

- \noindent\textbf{Data assimilation via Kalman Updating} To upscale the data using a Kalman filter, first global data from GEBCO is clipped to the area of interest, and then resampled bilinearly to 50m resolution. Then, the ICESat-2 photon data for the area is processed to generate point measurements of bathymetric depth. To fill in the gaps between these point measurements, the bathymetric points are subsampled and interpolated to the same resolution as the GEBCO data using a universal kriging interpolator. This interpolator results in a raster of the estimated depth, and a raster of the estimated uncertainty. To update the interpolated GEBCO grid, the Kalman gain is calculated for each raster cell in and using the Kalman state equation a new bathymetry grid is produced. If other data is available, the process can be applied recursively with other depth and uncertainty grids, allowing the Bayesian combination of any number of bathymetry datasets for the site.
+ \noindent\textbf{Data assimilation via Kalman Updating} To downscale the data using a Kalman filter, global data from GEBCO is first clipped to the area of interest and resampled bilinearly to 50 m resolution. Then, the ICESat-2 photon data for the area is processed to generate point measurements of bathymetric depth. To fill the gaps between these point measurements, the bathymetric points are subsampled and interpolated to the same resolution as the GEBCO data using a universal kriging interpolator. The interpolator produces a raster of the estimated depth and a raster of the estimated uncertainty. To update the interpolated GEBCO grid, the Kalman gain is calculated for each raster cell, and a new bathymetry grid is produced using the Kalman state equation. If other data is available, the process can be applied recursively with additional depth and uncertainty grids, allowing the Bayesian combination of any number of bathymetry datasets for the site.

% \noindent\emph{Validation}

\noindent \textbf{Validation} To validate the method, the improvement in RMS error is calculated between the resulting bathymetry grid and previously validated, high-accuracy survey data. The validation has been applied at several global test sites to verify that the method is generalizable to other regions. Validation sites are chosen based on the availability of validation data from USGS surveys, the Dutch JARKUS dataset, or survey data from Van Oord projects.

% \noindent\emph{Outlook}

- \noindent\textbf{Outlook} The results of this research could allow easier methods of characterizing nearshore bathymetry in remote areas. It could also be extended to include temporal variation – as more bathymetric data becomes available, it could be used to measure the dynamic changes of coastal systems. Data with this temporal dimension could provide valuable validation data for coastal dynamics models. However, the method is limited by strict water clarity requirements and missing data due to atmospheric conditions. Despite these limitations, it could potentially be upscaled to a global product for clear water areas.
+ \noindent\textbf{Outlook} The results of this research could enable easier characterization of nearshore bathymetry in remote areas. The method could also incorporate temporal variation – as more bathymetric data becomes available, it could be used to measure the dynamic changes of coastal systems. Data with this temporal dimension could provide valuable validation data for coastal dynamics models. However, the method is limited by strict water clarity requirements and by missing data due to atmospheric conditions. Despite these limitations, it could potentially be extended to a global product for clear-water areas.
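
The data-assimilation paragraph above mentions clipping GEBCO to the area of interest and resampling it bilinearly to 50 m. The following is a minimal sketch of that step, assuming a Python toolchain built on rasterio and a hypothetical, already-clipped file gebco_aoi.tif; it illustrates the idea rather than the thesis's actual processing chain.

import numpy as np
import rasterio
from rasterio.enums import Resampling

TARGET_RES = 50.0  # target cell size in the raster's CRS units (50 m on a projected grid)

# Hypothetical GEBCO subset, already clipped to the area of interest.
with rasterio.open("gebco_aoi.tif") as src:
    scale = src.res[0] / TARGET_RES  # upsampling factor from the native ~500 m cells to 50 m
    prior_depth = src.read(
        1,
        out_shape=(int(src.height * scale), int(src.width * scale)),
        resampling=Resampling.bilinear,  # bilinear resampling, as stated in the abstract
    ).astype(np.float64)
    # Rescale the affine transform so the finer grid keeps its georeferencing.
    transform = src.transform * src.transform.scale(
        src.width / prior_depth.shape[-1],
        src.height / prior_depth.shape[-2],
    )

The resulting prior_depth grid (together with an assumed prior uncertainty grid) would then serve as the Kalman filter's prior state.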

document/chapters/discussion.tex

+1-1
@@ -26,7 +26,7 @@ \section{2D interpolation via universal kriging}

The universal kriging approach proved effective for transforming the point estimates into a gridded estimate of seabed elevation and uncertainty.

- The largest practical limitation to the Bayesian updating method is the requirement to interpolate the data. Kriging interpolaters are ideal because they are robust to outliers and provide an uncertainty estimate as well as a depth estimate. The downside of the kriging interpolator is both the computational complexity and the requirement that points are not too close together. If points are too close to one another, the kriging matrix is not soluble, and the algorithm has a complexity of $\mathcal{O}(n^3)$ where $n$ is the number of points. This means that there is a relatively strict practical limitation to the number of points that can be used as input to the interpolator. The upper limit with a laptop with 32GB RAM was found to be approximately 2000 points without exceeding the available memory. One way to deal with this is to use a tiling strategy, and repeat the process using tiles which contain 2000 bathymetric points or less. This would need to be done adaptively since the bathymetric points are not evenly distributed. Using a larger number of smaller sub-sites would take longer to process, but it would at least be feasible using a consumer-grade computer.
+ The largest practical limitation of the Bayesian updating method is the requirement to interpolate the data. Kriging interpolators are ideal because they are robust to outliers and provide an uncertainty estimate as well as a depth estimate. The downsides of the kriging interpolator are its computational complexity and the requirement that points are not too close together. If points are too close to one another, the kriging matrix cannot be solved, and the algorithm has a complexity of $\mathcal{O}(n^4)$, where $n$ is the number of points being interpolated \parencite{}. This means that there is a relatively strict practical limit on the number of points that can be used as input to the interpolator. The upper limit on a laptop with 32 GB of RAM was found to be approximately 2000 points without exceeding the available memory. One way to deal with this is to use a tiling strategy and repeat the process on tiles that contain 2000 bathymetric points or fewer. This would need to be done adaptively, since the bathymetric points are not evenly distributed. Using a larger number of smaller sub-sites would take longer to process, but it would at least be feasible on a consumer-grade computer.

One disadvantage of ICESat-2 data is the uneven spatial distribution of the resulting bathymetric points. Due to the orbit of ICESat-2, the ATL03 data is available along transects oriented $\pm \ang{6}$ relative to north (see Figure \ref{fig:distribution-of-bathy-points-in-space}). Because of this pattern, even if there were perfect bathymetry data along every single ICESat-2 transect within the study area, the spatial distribution of the points would be anisotropic. Given that there are gaps in these transects where the bathymetric signal is weaker, the spatial distribution of the data can vary significantly by site. These gaps along transects can be seen in Figure \ref{fig:distribution-of-bathy-points-in-space}. The gaps of weaker signal can be caused by areas that are too deep or too turbid for the laser to reach the seabed, or by instrument and atmospheric issues (Section \ref{sec:discussion-photon-issues}).
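
The ≈2000-point ceiling discussed above applies to the universal kriging step. The sketch below shows how that step might look in Python; pykrige, the spherical variogram, and the input arrays are assumptions for illustration, not the confirmed implementation.

import numpy as np
from pykrige.uk import UniversalKriging

MAX_POINTS = 2000  # practical memory limit discussed above (~32 GB RAM)

def krige_depth_grid(x, y, depth, grid_x, grid_y, seed=0):
    """Subsample the ICESat-2 bathymetric points and interpolate them onto
    the target grid, returning a depth estimate and its kriging variance."""
    rng = np.random.default_rng(seed)
    if len(x) > MAX_POINTS:
        keep = rng.choice(len(x), size=MAX_POINTS, replace=False)
        x, y, depth = x[keep], y[keep], depth[keep]

    uk = UniversalKriging(
        x, y, depth,
        variogram_model="spherical",      # assumed; the thesis may fit a different model
        drift_terms=["regional_linear"],  # linear drift, a common choice for a sloping seabed
    )
    depth_grid, variance_grid = uk.execute("grid", grid_x, grid_y)
    return np.asarray(depth_grid), np.asarray(variance_grid)

Under the tiling strategy proposed above, this function would simply be called once per adaptively sized tile containing at most MAX_POINTS points.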

document/chapters/methodology.tex

+1
@@ -224,6 +224,7 @@ \subsection{Bayesian data assimilation using Kalman update equation}\label{subse

Equation \ref{eq:kalmangain} defines the Kalman gain, a measure of how strongly the new measurement is weighted relative to the prior estimate. For this case, it is assumed that the matrix $H$ is the identity matrix. Equations \ref{eq:kalmangain}, \ref{eq:new_state_measurement}, and \ref{eq:new_uncertainty} can then be simplified to

+ \pdfcomment{number these?}
$$ K = \frac{P_k}{P_k + R} $$

$$ \hat{x}_k = \hat{x}_{\bar{k}} + K(\hat{z}_k - \hat{x}_{\bar{k}}) $$
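
These simplified equations are applied cell-wise to the prior (interpolated GEBCO) grid and the measurement (kriged ICESat-2) grid. Below is a minimal NumPy sketch of that update with $H = I$; the variance update $(1 - K)P_k$ is the standard scalar form and is stated here as an assumption to be checked against Equation \ref{eq:new_uncertainty}.

import numpy as np

def kalman_update(prior_depth, prior_var, obs_depth, obs_var):
    """Cell-wise Kalman update with H = I.

    prior_depth, prior_var : prior depth grid and its variance (x_hat prior, P_k)
    obs_depth, obs_var     : observed (kriged) depth grid and its variance (z_hat, R)
    All inputs are 2-D arrays on the same grid.
    """
    K = prior_var / (prior_var + obs_var)                    # K = P_k / (P_k + R)
    new_depth = prior_depth + K * (obs_depth - prior_depth)  # x_hat_k = x_hat_prior + K (z_hat - x_hat_prior)
    new_var = (1.0 - K) * prior_var                          # assumed standard form of the updated uncertainty
    return new_depth, new_var

# Recursive use, as described in the abstract: each additional dataset updates
# the previous posterior.
# depth, var = kalman_update(gebco_depth, gebco_var, icesat2_depth, icesat2_var)
# depth, var = kalman_update(depth, var, other_depth, other_var)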
