This repository has been archived by the owner on Apr 11, 2019. It is now read-only.

Move everything to the new directory structure
kwilcox committed Jun 10, 2014
1 parent e724106 commit 9b874fc
Showing 26 changed files with 675 additions and 97 deletions.
2 changes: 1 addition & 1 deletion Theme_2_Extreme_Events/README.md
@@ -1,3 +1,3 @@
# IOOS System Test - Theme 2 - Exteme Events
# IOOS System Test - Theme 2 - Extreme Events

The Extreme Events theme, as its name suggests, focuses on the analysis of data related to events that produce extreme adverse effects across a geographic area. Possible topics within this theme may include, but not be limited to, the following:
43 changes: 6 additions & 37 deletions Theme_2_Extreme_Events/Scenario_2A_Coastal_Inundation/README.md
@@ -1,42 +1,11 @@
# IOOS System Test - Theme 2 - Exteme Events
# IOOS System Test - Theme 2 - Extreme Events

## Scenario 2A - Coastal Inundation

### Requirements
A severe storm is approaching the U.S. East Coast and will result in inundation, flooding, and wind damage over an extensive area. Coastal emergency managers must prepare for and respond to flooding, as well as plan and implement evacuations. Emergency managers, forecasters, and researchers currently rely on a number of data sources to do their work, including observations and forecast models:

1. Using `pip`
```bash
pip install -r pip-requirements.txt
pip install git+https://github.com/wrobstory/folium.git#egg=folium
pip install git+https://github.com/SciTools/cartopy.git@v0.10.0
pip install git+https://github.com/SciTools/iris.git@v1.6.1
```
* Forecasters are interested to know how federal and non-federal models compare to observed waves, river flows, and water levels, throughout the storm.
* Coastal emergency managers use inundation and flood data to identify where first responders should dedicate resources.
* After the storm, researchers want to compare observed data to modeled data to identify model shortcomings and areas of improvement, as well as to quality control observing systems.

2. Using `conda`
```bash
conda install --file conda-requirements.txt
conda install -c https://conda.binstar.org/rsignell iris=v1.6.2_RPS
```
If you are using conda environments, be sure to specify the environment name:
```bash
conda install -n yourenvname --file conda-requirements.txt
conda install -n yourenvname -c https://conda.binstar.org/rsignell iris=v1.6.2_RPS
```

### Helper methods

Some helper functions have been moved into `utilities.py`
so that the IPython notebook stays readable.


**Note:** If your HDF5 and/or NETCDF4 libraries are in uncommon locations, you
may need to specify the paths when installing netCDF4.
```bash
HDF5_DIR=/your/path/to/hdf5 NETCDF4_DIR=/your/path/to/netcdf4 PIP_OR_CONDA_INSTALL_COMMAND
```

**Note:** If your `gdal-config` binary is in an uncommon location, you may need
to specify the path when installing.
```bash
PATH=/your/path/to/gdal/bin:$PATH PIP_OR_CONDA_INSTALL_COMMAND
```
The ability to quickly and easily integrate these complementary datasets and predictions into visualizations and analyses will help emergency managers and responders improve their ability to forecast, prepare for and respond to coastal storms.
@@ -0,0 +1,44 @@
# IOOS System Test - Theme 2 - Extreme Events

## Scenario 2A - Coastal Inundation

### Extreme Value Analysis - Inundation

#### Requirements

1. Using `pip`
```bash
pip install -r pip-requirements.txt
pip install git+https://github.com/wrobstory/folium.git#egg=folium
pip install git+https://github.com/SciTools/cartopy.git@v0.10.0
pip install git+https://github.com/SciTools/iris.git@v1.6.1
```

2. Using `conda`
```bash
conda install --file conda-requirements.txt
conda install -c https://conda.binstar.org/rsignell iris=v1.6.2_RPS
```
If you are using conda environments, be sure to specify the environment name:
```bash
conda install -n yourenvname --file conda-requirements.txt
conda install -n yourenvname -c https://conda.binstar.org/rsignell iris=v1.6.2_RPS
```

#### Helper methods

Some helper functions have been moved into `utilities.py`
so that the IPython notebook stays readable.


**Note:** If your HDF5 and/or NETCDF4 libraries are in uncommon locations, you
may need to specify the paths when installing netCDF4.
```bash
HDF5_DIR=/your/path/to/hdf5 NETCDF4_DIR=/your/path/to/netcdf4 PIP_OR_CONDA_INSTALL_COMMAND
```

**Note:** If your `gdal-config` binary is in an uncommon location, you may need
to specify the path when installing.
```bash
PATH=/your/path/to/gdal/bin:$PATH PIP_OR_CONDA_INSTALL_COMMAND
```
@@ -2,7 +2,9 @@

## Scenario 2A - Coastal Inundation

### Requirements
### Extreme Value Analysis - Waves

#### Requirements

1. Using `pip`
```bash
@@ -23,7 +25,7 @@
conda install -n yourenvname -c https://conda.binstar.org/rsignell iris=v1.6.2_RPS
```

### Helper methods
#### Helper methods

Some helper functions have been moved into `utilities.py`
so that the IPython notebook stays readable.
@@ -0,0 +1,13 @@
pandas
numpy
matplotlib
OWSLib
netCDF4
lxml
pyoos
git+https://github.com/wrobstory/folium.git#egg=folium
cython
pyshp
Pillow
cartopy
pyke
@@ -0,0 +1,15 @@
cython
numpy
scipy
pandas
matplotlib
OWSLib
netCDF4
lxml
pyoos
pyshp
Pillow
pyke
Shapely
biggus
prettyplotlib
@@ -1,24 +1,44 @@
"""
@file utilities.py
@brief utility functions for IOOS System Test notebook: Scenario_A_Model_Obs_Compare_Waves.ipynb
Constants and definitions.
Now we need to specify all the names we know for water level, names that
will get used in the CSW search, and also to find data in the datasets that
are returned. This is ugly and fragile. There hopefully will be a better
way in the future...
Standard Library.
"""

import cStringIO
from lxml import etree
import urllib2
from io import BytesIO
from warnings import warn
try:
from urllib.request import urlopen
except ImportError:
from urllib import urlopen

# Scientific stack.
import numpy as np
from IPython.display import HTML
from pandas import DataFrame, concat, read_csv

# Custom IOOS/ASA modules (available at PyPI).
from owslib import fes
from owslib.ows import ExceptionReport


def date_range(start_date='1900-01-01', stop_date='2100-01-01', constraint='overlaps'):
"""
Hopefully something like this will be implemented in fes soon.
"""
name_list = ['water level',
'sea_surface_height',
'sea_surface_elevation',
'sea_surface_height_above_geoid',
'sea_surface_height_above_sea_level',
'water_surface_height_above_reference_datum',
'sea_surface_height_above_reference_ellipsoid']

sos_name = 'water_surface_height_above_reference_datum'


def dateRange(start_date='1900-01-01', stop_date='2100-01-01',
constraint='overlaps'):
"""Hopefully something like this will be implemented in fes soon."""
if constraint == 'overlaps':
propertyname = 'apiso:TempExtent_begin'
start = fes.PropertyIsLessThanOrEqualTo(propertyname=propertyname,
@@ -36,46 +56,50 @@ def date_range(start_date='1900-01-01', stop_date='2100-01-01', constraint='over
return start, stop
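The `overlaps` branch above encodes a standard interval test as two FES property filters. As a sketch of the underlying logic (the `extents_overlap` helper below is hypothetical, not part of this repo):

```python
from datetime import date


def extents_overlap(extent_begin, extent_end, start_date, stop_date):
    """Return True when a dataset's temporal extent [extent_begin, extent_end]
    overlaps the query window [start_date, stop_date].

    This mirrors what the two fes filters express: the extent must begin
    on or before the window's stop AND end on or after the window's start.
    """
    return extent_begin <= stop_date and extent_end >= start_date


# A 2012 dataset overlaps a window covering late 2012 into 2013.
print(extents_overlap(date(2012, 1, 1), date(2012, 12, 31),
                      date(2012, 10, 1), date(2013, 1, 1)))  # True
```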


def get_NDBC_station_long_name(sta):
"""
Get longName for specific NDBC station
"""
url = ('http://sdf.ndbc.noaa.gov/sos/server.php?service=SOS&version=1.0.0&'
'request=DescribeSensor&version=1.0.0&outputFormat=text/xml;subtype="sensorML/1.0.1"&'
'procedure=urn:ioos:station:wmo:%s') % sta
tree = etree.parse(urllib2.urlopen(url))
def get_Coops_longName(station):
"""Get longName for specific station from COOPS SOS using DescribeSensor
request."""
url = ('http://opendap.co-ops.nos.noaa.gov/ioos-dif-sos/SOS?service=SOS&'
'request=DescribeSensor&version=1.0.0&'
'outputFormat=text/xml;subtype="sensorML/1.0.1"&'
'procedure=urn:ioos:station:NOAA.NOS.CO-OPS:%s') % station
tree = etree.parse(urlopen(url))
root = tree.getroot()
longName = root.xpath("//sml:identifier[@name='longName']/sml:Term/sml:value/text()",
namespaces={'sml': "http://www.opengis.net/sensorML/1.0.1"})
return longName
path = "//sml:identifier[@name='longName']/sml:Term/sml:value/text()"
namespaces = dict(sml="http://www.opengis.net/sensorML/1.0.1")
longName = root.xpath(path, namespaces=namespaces)
if len(longName) == 0:
longName = station
return longName[0]
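`get_Coops_longName` pulls the station's `longName` out of a SensorML response with an XPath query. A minimal, self-contained sketch of that extraction, using a hand-written XML fragment (illustrative only; real DescribeSensor responses are much larger) and the standard library instead of a live request:

```python
import xml.etree.ElementTree as ET

# Tiny invented SensorML fragment with the same structure the XPath targets.
SAMPLE = """<sml:identification xmlns:sml="http://www.opengis.net/sensorML/1.0.1">
  <sml:identifier name="longName">
    <sml:Term><sml:value>Duck, NC</sml:value></sml:Term>
  </sml:identifier>
</sml:identification>"""

ns = {'sml': 'http://www.opengis.net/sensorML/1.0.1'}
root = ET.fromstring(SAMPLE)
# Same path as the lxml version: identifier named 'longName' -> Term -> value.
node = root.find(".//sml:identifier[@name='longName']/sml:Term/sml:value", ns)
long_name = node.text
print(long_name)  # Duck, NC
```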


def coops2df(collector, coops_id, sos_name):
"""
Request CSV response from SOS and convert to Pandas DataFrames.
"""
"""Request CSV response from SOS and convert to Pandas DataFrames."""
collector.features = [coops_id]
collector.variables = [sos_name]
response = collector.raw(responseFormat="text/csv")
data_df = read_csv(cStringIO.StringIO(str(response)), parse_dates=True, index_col='date_time')
# data_df['Observed Data']=data_df['water_surface_height_above_reference_datum (m)']-data_df['vertical_position (m)']
data_df['Observed Wave Height Data'] = data_df['sea_surface_wave_significant_height (m)']
data_df['Observed Peak Period Data'] = data_df['sea_surface_wave_peak_period (s)']

a = get_NDBC_station_long_name(coops_id)
if len(a) == 0:
long_name = coops_id
else:
long_name = a[0]
long_name = get_Coops_longName(coops_id)

try:
response = collector.raw(responseFormat="text/csv")
data_df = read_csv(BytesIO(response.encode('utf-8')),
parse_dates=True,
index_col='date_time')
col = 'water_surface_height_above_reference_datum (m)'
if False:
data_df['Observed Data'] = (data_df[col] -
data_df['vertical_position (m)'])
data_df['Observed Data'] = data_df[col]
except ExceptionReport as e:
warn("Station %s is not NAVD datum. %s" % (long_name, e))
data_df = DataFrame()  # Assign an empty DataFrame for now.

data_df.name = long_name
return data_df
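`coops2df` hinges on `read_csv` with `parse_dates`/`index_col`. A sketch against an invented two-row response (real SOS CSV carries many more columns), using `StringIO` rather than the `BytesIO` round-trip above:

```python
from io import StringIO

from pandas import read_csv

# Invented two-row SOS CSV response (illustrative values).
CSV = ("date_time,water_surface_height_above_reference_datum (m)\n"
       "2014-06-01T00:00:00Z,1.21\n"
       "2014-06-01T00:06:00Z,1.25\n")

# parse_dates=True turns the date_time index into a DatetimeIndex.
df = read_csv(StringIO(CSV), parse_dates=True, index_col='date_time')
df['Observed Data'] = df['water_surface_height_above_reference_datum (m)']
print(df['Observed Data'].tolist())  # [1.21, 1.25]
```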


def mod_df(arr, timevar, istart, istop, mod_name, ts):
"""
Return time series (DataFrame) from model interpolated onto uniform time
base.
"""
"""Return time series (DataFrame) from model interpolated onto uniform time
base."""
t = timevar.points[istart:istop]
jd = timevar.units.num2date(t)

@@ -100,9 +124,7 @@ def mod_df(arr, timevar, istart, istop, mod_name, ts):


def service_urls(records, service='odp:url'):
"""
Extract service_urls of a specific type (DAP, SOS) from records.
"""
"""Extract service_urls of a specific type (DAP, SOS) from records."""
service_string = 'urn:x-esri:specification:ServiceType:' + service
urls = []
for key, rec in records.iteritems():
@@ -116,9 +138,7 @@ def service_urls(records, service='odp:url'):


def nearxy(x, y, xi, yi):
"""
Find the indices x[i] of arrays (x,y) closest to the points (xi, yi).
"""
"""Find the indices x[i] of arrays (x,y) closest to the points (xi, yi)."""
ind = np.ones(len(xi), dtype=int)
dd = np.ones(len(xi), dtype='float')
for i in np.arange(len(xi)):
@@ -129,20 +149,16 @@ def find_ij(x, y, d, xi, yi):


def find_ij(x, y, d, xi, yi):
"""
Find non-NaN cell d[j,i] that are closest to points (xi, yi).
"""
"""Find non-NaN cell d[j,i] that are closest to points (xi, yi)."""
index = np.where(~np.isnan(d.flatten()))[0]
ind, dd = nearxy(x.flatten()[index], y.flatten()[index], xi, yi)
j, i = ind2ij(x, index[ind])
return i, j, dd
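The `find_ij`/`nearxy` pair can be condensed into a few NumPy calls. A sketch with an invented 2x3 grid, one NaN cell, and a target point:

```python
import numpy as np

# A 2x3 grid of coordinates and a field with one NaN cell (invented values).
x = np.array([[0.0, 1.0, 2.0],
              [0.0, 1.0, 2.0]])
y = np.array([[0.0, 0.0, 0.0],
              [1.0, 1.0, 1.0]])
d = np.array([[1.0, np.nan, 1.0],
              [1.0, 1.0, 1.0]])

xi, yi = 0.9, 0.2  # target point

# Keep only non-NaN cells, find the closest one, then map the flat
# index back to (row, column) -- the same idea find_ij expresses.
valid = np.where(~np.isnan(d.ravel()))[0]
dist = np.hypot(x.ravel()[valid] - xi, y.ravel()[valid] - yi)
j, i = np.unravel_index(valid[np.argmin(dist)], d.shape)
print(i, j)  # 1 1
```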


def find_timevar(cube):
"""
Return the time variable from Iris. This is a workaround for iris having
problems with FMRC aggregations, which produce two time coordinates.
"""
"""Return the time variable from Iris. This is a workaround for iris having
problems with FMRC aggregations, which produce two time coordinates."""
try:
cube.coord(axis='T').rename('time')
except: # Be more specific.
@@ -152,24 +168,31 @@ def find_timevar(cube):


def ind2ij(a, index):
"""
Returns a[j, i] for a.ravel()[index].
"""
"""Returns a[j, i] for a.ravel()[index]."""
n, m = a.shape
j = np.int_(np.ceil(index//m))
i = np.remainder(index, m)
return i, j
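The index arithmetic in `ind2ij` matches NumPy's built-in `np.unravel_index`, which can serve as a cross-check:

```python
import numpy as np

a = np.arange(12).reshape(3, 4)  # 3 rows, 4 columns
index = 7                        # a.ravel()[7] == 7, i.e. row 1, column 3

# ind2ij's arithmetic: row = index // n_cols, column = index % n_cols...
m = a.shape[1]
j, i = index // m, index % m

# ...agrees with NumPy's built-in conversion.
assert (j, i) == np.unravel_index(index, a.shape)
print(i, j)  # 3 1
```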


def get_coordinates(bounding_box, bounding_box_type):
"""
Create bounding box coordinates for the leaflet map
"""
"""Create bounding box coordinates for the map."""
coordinates = []
if bounding_box_type == "box":  # '==' compares values; 'is' tests identity
coordinates.append([bounding_box[0][1], bounding_box[0][0]])
coordinates.append([bounding_box[0][1], bounding_box[1][0]])
coordinates.append([bounding_box[1][1], bounding_box[1][0]])
coordinates.append([bounding_box[1][1], bounding_box[0][0]])
coordinates.append([bounding_box[0][1], bounding_box[0][0]])
return coordinates
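With concrete (illustrative) corner values, the ring of `[lat, lon]` pairs that `get_coordinates` builds for a `"box"` looks like this:

```python
# Two corners of a box over the U.S. East Coast (invented values):
# [[lon_min, lat_min], [lon_max, lat_max]]
bounding_box = [[-77.0, 34.0], [-70.0, 40.0]]

# Walk the four corners as [lat, lon] pairs and repeat the first
# corner at the end so the ring is closed.
coordinates = [
    [bounding_box[0][1], bounding_box[0][0]],
    [bounding_box[0][1], bounding_box[1][0]],
    [bounding_box[1][1], bounding_box[1][0]],
    [bounding_box[1][1], bounding_box[0][0]],
    [bounding_box[0][1], bounding_box[0][0]],
]
print(len(coordinates))  # 5
```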


def inline_map(m):
"""From http://nbviewer.ipython.org/gist/rsignell-usgs/
bea6c0fe00a7d6e3249c."""
m._build_map()
srcdoc = m.HTML.replace('"', '&quot;')
embed = HTML('<iframe srcdoc="{srcdoc}" '
'style="width: 100%; height: 500px; '
'border: none"></iframe>'.format(srcdoc=srcdoc))
return embed
