Calculate Harmonic Parameters#
Besides using static harmonic parameters to derive the non-flood likelihood \(P(\sigma^0|nonflood)\), we can also calculate the parameters dynamically from past backscatter observations. In this notebook we show how to extract these coefficients, which describe seasonal patterns in Sentinel-1 radar backscatter variability.
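For orientation, the harmonic fit provides a pixel-wise seasonal expectation of \(\sigma^0\); one common way to turn such an expectation into a non-flood likelihood is a Gaussian density centred on it. The snippet below is only a minimal illustration with made-up numbers, not necessarily the exact formulation used inside dask_flood_mapper.
import numpy as np

# Assumed values for a single pixel (illustration only)
expected_sig0 = -15.2  # dB, seasonal expectation from the harmonic fit
residual_std = 1.8     # dB, spread of the residuals around the fit
observed_sig0 = -21.0  # dB, backscatter observed at the flood date

# Gaussian density of the observation under the non-flood model
p_nonflood = np.exp(-0.5 * ((observed_sig0 - expected_sig0) / residual_std) ** 2) / (
    residual_std * np.sqrt(2 * np.pi)
)
print(p_nonflood)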
import hvplot.xarray # noqa
import numpy as np
import xarray as xr
from dask.distributed import Client
from dask_flood_mapper import flood
from dask_flood_mapper.catalog import (format_datetime_for_xarray_selection,
initialize_catalog, initialize_search)
from dask_flood_mapper.harmonic_params import (
create_harmonic_parameters, process_harmonic_parameters_datacube)
from dask_flood_mapper.processing import prepare_dc, process_sig0_dc
client = Client(processes=False, threads_per_worker=2, n_workers=1, memory_limit="12GB")
client
As an example, we will select only a small region of interest contained within the Zingst case study.
time_range = "2022-10-11T05:25:26"
minlon, maxlon = 12.999, 13
minlat, maxlat = 53.999, 54
bounding_box = [minlon, minlat, maxlon, maxlat]
We will now load the sigma nought datacube from EODC. In order to fit the harmonic functions to the time series, the temporal range is extended to include the three years before the provided time range. This is done by setting the dynamic parameter of initialize_search.
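As a rough illustration of that extended search window, the sketch below derives a three-year lookback from the requested timestamp with pandas. The variable names and the interval format are assumptions for illustration only and do not reflect the internals of initialize_search.
import pandas as pd

# Hypothetical sketch of the extended temporal range used for fitting
flood_date = pd.Timestamp(time_range)
search_start = flood_date - pd.DateOffset(years=3)
extended_range = f"{search_start.isoformat()}/{flood_date.isoformat()}"
print(extended_range)  # 2019-10-11T05:25:26/2022-10-11T05:25:26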
eodc_catalog = initialize_catalog()
search = initialize_search(eodc_catalog, bounding_box, time_range, dynamic=True)
items_sig0 = search.item_collection()
sig0_dc = prepare_dc(items_sig0, bounding_box, bands="VV")
sig0_dc, orbit_sig0 = process_sig0_dc(sig0_dc, items_sig0, bands="VV")
sig0_dc
Calculate Harmonic Parameters#
This function fits sine and cosine functions, known as harmonic oscillators, to each pixel of the Sentinel-1 \(\sigma^0\) datacube. These seasonally varying curves can then be subtracted from the time series. What is left is the noise and transient events, for example flood events, superimposed on the seasonal trend.
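Concretely, the fitted seasonal expectation per pixel can be written as a mean plus three annual harmonics, a standard harmonic regression whose notation mirrors the parameter names M0, C1-C3 and S1-S3 used in the code below:
\[
\hat{\sigma}^0(t) = M_0 + \sum_{k=1}^{3} \left[ C_k \cos\!\left(\frac{2\pi k t}{n}\right) + S_k \sin\!\left(\frac{2\pi k t}{n}\right) \right],
\]
where \(t\) is the day of the year and \(n = 365\).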
# Reformat the time range for selection from the xarray datacube
datetime_xr = format_datetime_for_xarray_selection(search, time_range)
hpar_list = create_harmonic_parameters(sig0_dc)
__, hpar_dc, __ = process_harmonic_parameters_datacube(sig0_dc, datetime_xr, hpar_list)
hpar_dc
t = sig0_dc.time.dt.dayofyear
n = 365
# Evaluate the harmonic model: the k-th harmonic oscillates at frequency 2*pi*k/n
y = (
    hpar_dc.M0
    + hpar_dc.C1 * np.cos(2 * np.pi * t / n)
    + hpar_dc.S1 * np.sin(2 * np.pi * t / n)
    + hpar_dc.C2 * np.cos(2 * np.pi * 2 * t / n)
    + hpar_dc.S2 * np.sin(2 * np.pi * 2 * t / n)
    + hpar_dc.C3 * np.cos(2 * np.pi * 3 * t / n)
    + hpar_dc.S3 * np.sin(2 * np.pi * 3 * t / n)
)
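As a quick sanity check, the reconstructed curve can be sampled at the requested flood acquisition. This is just an illustrative nearest-neighbour lookup, assuming the timestamp falls within the loaded time axis.
# Seasonal expectation closest in time to the flood acquisition (illustrative)
seasonal_at_flood = y.sel(time=np.datetime64(time_range), method="nearest")
seasonal_at_flood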
Fit Harmonic Function to Original Data#
Finally, we merge the two datasets and superimpose the fitted harmonic function on the raw sigma nought data.
xr.merge([y.rename("pred"), sig0_dc]).squeeze().assign_coords(
x=range(len(sig0_dc.x)), y=range(len(sig0_dc.y))
).hvplot(x="time")
Calculation of the harmonic parameters can be selected during flood mapping by setting the dynamic parameter to True.
flood.decision(bbox=bounding_box, datetime=time_range, dynamic=True)
This setting theoretically frees the user from the need for a precomputed harmonic parameters dataset, thereby requiring only the Sentinel-1 sigma nought and PLIA datasets as static inputs. Bear in mind, however, that calculating harmonic parameters can be a computationally expensive operation.
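Because the dynamic option pulls in three extra years of imagery, it can help to keep the Dask dashboard open while the harmonic fits run and to trigger the computation explicitly. The sketch below assumes that flood.decision returns a lazily evaluated, Dask-backed xarray object; if the result is already materialized, the compute() call is simply a no-op.
# Hypothetical usage sketch for the dynamic mode
fd = flood.decision(bbox=bounding_box, datetime=time_range, dynamic=True)
print(client.dashboard_link)  # monitor the harmonic-parameter computation
fd = fd.compute()  # materialize the result once the task graph is built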