"
+ ]
+ }
+ ],
+ "prompt_number": 1
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# IOOS System Test - Theme 3 - Scenario A - [Description](https://github.com/ioos/system-test/wiki/Development-of-Test-Themes#scenario-3a-assessing-seabird-vulnerability-in-the-bering-sea)\n",
+ "\n",
+ "## Assessing Seabird Vulnerability in the Bering Sea\n",
+ "\n",
+ "## Questions\n",
+ "1. Can we discover, access, and overlay Important Bird Area polygons (and therefore other similar layers for additional important resource areas) on modeled datasets in the Bering Sea?\n",
+ "2. Is the metadata for projected climate data layers and Important Bird Area polygons sufficient to determine a subset of polygons desired by a query?\n",
+ "3. Can a simple set of statistics (e.g., mean and standard deviation) be derived from multiple variables in each of the six models to characterize the forecast variability of climate conditions through time, through the end of the model runs (2003-2040)?\n",
+ "4. Can we create a standardized matrix or other display method for output variables that allows resource experts to easily assess projected changes in climate variables within given ranges of time, and to compare projected changes across multiple coupled oceanographic and climate models?\n",
+ "5. Can we develop a set of process-specific guidelines and a standardized set of outputs for a tool that would allow researchers to address a diversity of resource management questions relative to projected changes in climate for specific zones of interest?\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Q1 - Can we discover, access, and overlay Important Bird Area polygons (and therefore other similar layers for additional important resource areas) on modeled datasets in the Bering Sea?"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Discovery is not currently possible - the Important Bird Area polygons are not discoverable through any catalog at this time. They are, however, available from a GeoServer known to us. This should be fixed: the WFS service should be added to a queryable CSW."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Load 'known' WFS endpoint with Important Bird Area polygons"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "from owslib.wfs import WebFeatureService\n",
+ "known_wfs = \"http://solo.axiomalaska.com/geoserver/audubon/ows\"\n",
+ "wfs = WebFeatureService(known_wfs, version='1.0.0')\n",
+ "print sorted(wfs.contents.keys())"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "['audubon:audubon_all_seafloorsubstrate_project', 'audubon:audubon_beardedseal_polys', 'audubon:audubon_beluga_lines', 'audubon:audubon_beluga_polys', 'audubon:audubon_blgu_polys', 'audubon:audubon_bowhead_fall', 'audubon:audubon_bowhead_huntareas', 'audubon:audubon_bowhead_lines', 'audubon:audubon_bowhead_polys', 'audubon:audubon_bowhead_quiet', 'audubon:audubon_bowhead_spring', 'audubon:audubon_bowhead_summer', 'audubon:audubon_bowhead_winter', 'audubon:audubon_capelin_lines', 'audubon:audubon_capelin_polys', 'audubon:audubon_chumsalmon_polys', 'audubon:audubon_coei_polys', 'audubon:audubon_euphausiids_polys', 'audubon:audubon_falsecalunuscopepods_polys', 'audubon:audubon_gray_polys', 'audubon:audubon_ibas', 'audubon:audubon_kiei_polys', 'audubon:audubon_kimu_polys', 'audubon:audubon_ltdu_polys', 'audubon:audubon_murre_polys', 'audubon:audubon_nofu_polys', 'audubon:audubon_opiliotannercrab_polys', 'audubon:audubon_pacificherring_lines', 'audubon:audubon_pacificherring_polys', 'audubon:audubon_pinksalmon_polys', 'audubon:audubon_polarbear_polys', 'audubon:audubon_ribbonseal_polys', 'audubon:audubon_ringedseal_polys', 'audubon:audubon_rogu_polys', 'audubon:audubon_rtlo_polys', 'audubon:audubon_saffroncod_polys', 'audubon:audubon_shoals', 'audubon:audubon_spei_polys', 'audubon:audubon_spottedseal_polys', 'audubon:audubon_stei_polys', 'audubon:audubon_stsh_polys', 'audubon:audubon_walrus_lines', 'audubon:audubon_walrus_polys', 'audubon:audubon_yblo_polys']\n"
+ ]
+ }
+ ],
+ "prompt_number": 2
+ },
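The contents listing above mixes polygon, line, and point layers. A quick stdlib sketch of narrowing such a layer list by suffix (the short list below is a hypothetical stand-in for `wfs.contents.keys()`, not the full Audubon catalog):

```python
# Hypothetical subset of the layer names returned by the WFS GetCapabilities.
layers = ['audubon:audubon_ibas',
          'audubon:audubon_murre_polys',
          'audubon:audubon_walrus_lines',
          'audubon:audubon_bowhead_polys']

# Keep only the polygon layers, sorted for stable display.
poly_layers = sorted(l for l in layers if l.endswith('_polys'))
print(poly_layers)
# ['audubon:audubon_bowhead_polys', 'audubon:audubon_murre_polys']
```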
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### We already know that the 'audubon:audubon_ibas' layer contains the Important Bird Areas. Request a 'geojson' response from the layer"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "import geojson\n",
+ "geojson_response = wfs.getfeature(typename=['audubon:audubon_ibas'], maxfeatures=1, outputFormat=\"application/json\", srsname=\"urn:x-ogc:def:crs:EPSG:4326\").read()\n",
+ "feature = geojson.loads(geojson_response)"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [],
+ "prompt_number": 3
+ },
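The `geojson` package used above is a thin layer over the standard `json` module. When it is unavailable, a minimal sketch of pulling geometries out of a WFS GeoJSON response looks like the following (the inline FeatureCollection here is hypothetical, not the live Audubon response):

```python
import json

# A tiny stand-in for the FeatureCollection a WFS GetFeature request returns.
geojson_response = """
{"type": "FeatureCollection",
 "features": [
   {"type": "Feature",
    "geometry": {"type": "Polygon",
                 "coordinates": [[[-163.85, 67.04], [-163.31, 67.04],
                                  [-163.31, 67.23], [-163.85, 67.04]]]},
    "properties": {"name": "example IBA"}}]}
"""

feature = json.loads(geojson_response)
# Each feature carries its geometry as a plain dict, just like geojson.loads yields.
geometries = [f["geometry"] for f in feature["features"]]
print(geometries[0]["type"])  # Polygon
```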
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Convert to Shapely geometry objects"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "from shapely.geometry import shape\n",
+ "shapes = [shape(s.get(\"geometry\")) for s in feature.get(\"features\")]"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [],
+ "prompt_number": 4
+ },
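Shapely's `shape()` objects later supply `.bounds` for the CSW BBOX filter. The same `(minx, miny, maxx, maxy)` tuple can be derived from raw GeoJSON coordinates with nothing but the standard library; a sketch, using a hypothetical ring built from the extent seen later in the filter XML:

```python
def polygon_bounds(rings):
    """Return (minx, miny, maxx, maxy) for a GeoJSON Polygon coordinate
    array, mirroring what shapely's .bounds reports for the same shape."""
    xs = [x for ring in rings for x, y in ring]
    ys = [y for ring in rings for x, y in ring]
    return (min(xs), min(ys), max(xs), max(ys))

# Hypothetical exterior ring (lon, lat pairs).
ring = [[[-163.853365277, 67.041722233],
         [-163.312671675, 67.041722233],
         [-163.312671675, 67.2307384888],
         [-163.853365277, 67.041722233]]]
print(polygon_bounds(ring))
# (-163.853365277, 67.041722233, -163.312671675, 67.2307384888)
```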
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Map the geometry objects"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "import folium\n",
+ "map_center = shapes[0].centroid\n",
+ "mapper = folium.Map(location=[map_center.x, map_center.y], zoom_start=6)\n",
+ "for s in shapes:\n",
+ "    if hasattr(s.boundary, 'coords'):\n",
+ "        mapper.line(s.boundary.coords, line_color='#FF0000', line_weight=5)\n",
+ "    else:\n",
+ "        for p in s:\n",
+ "            mapper.line(p.boundary.coords, line_color='#FF0000', line_weight=5)\n",
+ "mapper._build_map()\n",
+ "\n",
+ "from IPython.core.display import HTML\n",
+ "HTML('<iframe srcdoc=\"{srcdoc}\" style=\"width: 100%; height: 500px; border: none\"></iframe>'.format(srcdoc=mapper.HTML.replace('\"', '&quot;')))"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "html": [
+ ""
+ ],
+ "metadata": {},
+ "output_type": "pyout",
+ "prompt_number": 7,
+ "text": [
+ ""
+ ]
+ }
+ ],
+ "prompt_number": 7
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Can we discover other datasets in this polygon area?"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Set up CSW filters to find models in the area of the Important Bird Area polygon"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "from owslib import fes\n",
+ "\n",
+ "# Polygon filters\n",
+ "polygon_filters = []\n",
+ "for s in shapes:\n",
+ "    f = fes.BBox(bbox=list(reversed(s.bounds)))\n",
+ "    polygon_filters.append(f)\n",
+ "# If we have more than one polygon filter, OR them together\n",
+ "if len(polygon_filters) > 1:\n",
+ "    polygon_filters = fes.Or(polygon_filters)\n",
+ "elif len(polygon_filters) == 1:\n",
+ "    polygon_filters = polygon_filters[0]\n",
+ "\n",
+ "# Name filters\n",
+ "name_filters = []\n",
+ "model_strings = ['roms', 'selfe', 'adcirc', 'ncom', 'hycom', 'fvcom', 'wrf']\n",
+ "for model in model_strings:\n",
+ "    title_filter = fes.PropertyIsLike(propertyname='apiso:Title', literal='*%s*' % model, wildCard='*')\n",
+ "    name_filters.append(title_filter)\n",
+ "    subject_filter = fes.PropertyIsLike(propertyname='apiso:Subject', literal='*%s*' % model, wildCard='*')\n",
+ "    name_filters.append(subject_filter)\n",
+ "# OR all of the name filters together\n",
+ "name_filters = fes.Or(name_filters)\n",
+ "\n",
+ "# Final filter: polygon AND name filters\n",
+ "filters = fes.And([polygon_filters, name_filters])"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [],
+ "prompt_number": 31
+ },
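Server-side, `fes.PropertyIsLike` with `wildCard='*'` behaves much like shell globbing. A stdlib sketch of the same `*roms*`-style matching, useful for sanity-checking the model-name patterns against titles before sending the query (case-insensitive matching is an assumption about the catalog's behavior):

```python
from fnmatch import fnmatch

model_strings = ['roms', 'selfe', 'adcirc', 'ncom', 'hycom', 'fvcom', 'wrf']
patterns = ['*%s*' % m for m in model_strings]

def matches_any(title, patterns):
    # Lowercase first, assuming the CSW's PropertyIsLike ignores case.
    return any(fnmatch(title.lower(), p) for p in patterns)

print(matches_any('U.S. Navy Coastal Ocean Model (NCOM): Global', patterns))  # True
print(matches_any('Nighttime Lights Annual Composites V4', patterns))         # False
```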
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### The actual CSW filters look like this"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "from owslib.etree import etree\n",
+ "print etree.tostring(filters.toXML(), pretty_print=True)"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "<ogc:Filter xmlns:gml=\"http://www.opengis.net/gml\" xmlns:ogc=\"http://www.opengis.net/ogc\">\n",
+ "  <ogc:And>\n",
+ "    <ogc:BBOX>\n",
+ "      <ogc:PropertyName>ows:BoundingBox</ogc:PropertyName>\n",
+ "      <gml:Envelope>\n",
+ "        <gml:lowerCorner>-163.312671675 67.2307384888</gml:lowerCorner>\n",
+ "        <gml:upperCorner>-163.853365277 67.041722233</gml:upperCorner>\n",
+ "      </gml:Envelope>\n",
+ "    </ogc:BBOX>\n",
+ "    <ogc:Or>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*roms*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*roms*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*selfe*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*selfe*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*adcirc*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*adcirc*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*ncom*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*ncom*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*hycom*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*hycom*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*fvcom*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*fvcom*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Title</ogc:PropertyName>\n",
+ "        <ogc:Literal>*wrf*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "      <ogc:PropertyIsLike wildCard=\"*\">\n",
+ "        <ogc:PropertyName>apiso:Subject</ogc:PropertyName>\n",
+ "        <ogc:Literal>*wrf*</ogc:Literal>\n",
+ "      </ogc:PropertyIsLike>\n",
+ "    </ogc:Or>\n",
+ "  </ogc:And>\n",
+ "</ogc:Filter>\n",
+ "\n"
+ ]
+ }
+ ],
+ "prompt_number": 32
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Find all models contained in all CSW endpoints"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "from owslib.csw import CatalogueServiceWeb\n",
+ "endpoints = ['http://www.nodc.noaa.gov/geoportal/csw',\n",
+ "             'http://www.ngdc.noaa.gov/geoportal/csw',\n",
+ "             'http://catalog.data.gov/csw-all',\n",
+ "             #'http://cwic.csiss.gmu.edu/cwicv1/discovery',\n",
+ "             'http://geoport.whoi.edu/geoportal/csw',\n",
+ "             'https://edg.epa.gov/metadata/csw',\n",
+ "             'http://cmgds.marine.usgs.gov/geonetwork/srv/en/csw',\n",
+ "             'http://cida.usgs.gov/gdp/geonetwork/srv/en/csw',\n",
+ "             'http://geodiscover.cgdi.ca/wes/serviceManagerCSW/csw',\n",
+ "             'http://geoport.whoi.edu/gi-cat/services/cswiso']"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [],
+ "prompt_number": 35
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Filter out CSW servers that do not support a BBOX query"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "bbox_endpoints = []\n",
+ "for url in endpoints:\n",
+ "    try:\n",
+ "        csw = CatalogueServiceWeb(url, timeout=20)\n",
+ "    except BaseException:\n",
+ "        print \"Failure - %s - Timed out\" % url\n",
+ "        continue\n",
+ "    if \"BBOX\" in csw.filters.spatial_operators:\n",
+ "        print \"Success - %s - BBOX Query supported\" % url\n",
+ "        bbox_endpoints.append(url)\n",
+ "    else:\n",
+ "        print \"Failure - %s - BBOX Query NOT supported\" % url"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "Success - http://www.nodc.noaa.gov/geoportal/csw - BBOX Query supported\n",
+ "Success - http://www.ngdc.noaa.gov/geoportal/csw - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - http://catalog.data.gov/csw-all - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - http://geoport.whoi.edu/geoportal/csw - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - https://edg.epa.gov/metadata/csw - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - http://cmgds.marine.usgs.gov/geonetwork/srv/en/csw - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - http://cida.usgs.gov/gdp/geonetwork/srv/en/csw - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - http://geodiscover.cgdi.ca/wes/serviceManagerCSW/csw - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n",
+ "Success - http://geoport.whoi.edu/gi-cat/services/cswiso - BBOX Query supported"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "\n"
+ ]
+ }
+ ],
+ "prompt_number": 36
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "for url in bbox_endpoints:\n",
+ "    try:\n",
+ "        csw = CatalogueServiceWeb(url, timeout=20)\n",
+ "        csw.getrecords2(constraints=[filters], maxrecords=1000, esn='full')\n",
+ "        for record, item in csw.records.items():\n",
+ "            print \"*\", item.title\n",
+ "    except BaseException as e:\n",
+ "        print \"* FAILED\", url, e.msg"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ "* Clean Catalog for NCOM forecast models/NCOM Region 1 Aggregation/NCOM Region 1 Best Time Series\n",
+ "* Clean Catalog for NCOM forecast models/NCOM Region 2 Aggregation/NCOM Region 2 Best Time Series\n",
+ "* Clean Catalog for NCOM forecast models/NCOM Region 6 Aggregation/NCOM Region 6 Best Time Series\n",
+ "* U.S. Navy Coastal Ocean Model (NCOM): Global\n",
+ "* GFDL CM2.0, 20C3M (run 3) climate of the 20th Century experiment (20C3M) output for IPCC AR4 and US CCSP\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual Experimental Forecasts CM2.1U_CDAef_v1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* model output prepared for GFDL Seasonal-Interannual experimental forecasts - CM2.1U_CDAef_V1.0\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* GFDL CM2.0, 20C3M (run 1) climate of the 20th Century experiment (20C3M) output for IPCC AR4 and US CCSP\n",
+ "* GFDL CM2.0, 20C3M (run 1) climate of the 20th Century experiment (20C3M) output for IPCC AR4 and US CCSP\n",
+ "* GFDL CM2.0, 20C3M (run 2) climate of the 20th Century experiment (20C3M) output for IPCC AR4 and US CCSP\n",
+ "* GFDL CM2.0, 20C3M (run 2) climate of the 20th Century experiment (20C3M) output for IPCC AR4 and US CCSP\n",
+ "* GFDL CM2.0, 20C3M (run 3) climate of the 20th Century experiment (20C3M) output for IPCC AR4 and US CCSP\n",
+ "* CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment\n",
+ "* HYCOM Region 6 Aggregation/Best Time Series\n",
+ "* HYCOM Surface Aggregation/Best Time Series\n",
+ "* NCOM Region 1 Aggregation/Best Time Series\n",
+ "* NCOM Region 2 Aggregation/Best Time Series\n",
+ "* NCOM Region 6 Aggregation/Best Time Series\n",
+ "* NCOM SFC 8 Aggregation/Best Time Series\n",
+ "* NCOM SFC8 Hindcast Aggregation/Best Time Series\n",
+ "*"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ " Multibeam collection for HLY0503: Multibeam data collected aboard Healy from 2005-08-04 to 2005-09-29, departing from Dutch Harbor, AK and returning to Tromso, Norway\n",
+ "* Nighttime Lights Annual Composites V4\n",
+ "* Multibeam collection for HLY0103: Multibeam data collected aboard Healy from 2001-10-27 to 2001-11-28, departing from Tromso, Norway and returning to Tromso, Norway\n",
+ "* Multibeam collection for HLY05TE: Multibeam data collected aboard Healy from 2005-09-29 to 2005-11-03, departing from Tromso, Norway and returning to Dublin, Ireland\n",
+ "* HYCOM GLBa0.08\n",
+ "* HYCOM Region 6 Aggregation/Best Time Series\n",
+ "* HYCOM Surface Aggregation/Best Time Series\n",
+ "* NCOM Region 1 Aggregation/Best Time Series\n",
+ "* NCOM Region 2 Aggregation/Best Time Series\n",
+ "* NCOM Region 6 Aggregation/Best Time Series\n",
+ "* NCOM SFC 8 Aggregation/Best Time Series\n",
+ "* NCOM SFC8 Hindcast Aggregation/Best Time Series\n",
+ "* HYCOM GLBa0.08\n",
+ "* DAP Server For RPSASA Environmental Data Server/Navy HYCOM Aggregation\n",
+ "*"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ " HYCOM GLBa0.08\n",
+ "*"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ " R2 & NE: County Level 2006-2010 ACS Income Summary\n",
+ "* R2 & NE: State Level 2006-2010 ACS Income Summary\n",
+ "* R2 & NE: Tract Level 2006-2010 ACS Income Summary\n",
+ "* FAILED"
+ ]
+ },
+ {
+ "output_type": "stream",
+ "stream": "stdout",
+ "text": [
+ " http://geodiscover.cgdi.ca/wes/serviceManagerCSW/csw ORA-00907: missing right parenthesis\n"
+ ]
+ },
+ {
+ "ename": "AttributeError",
+ "evalue": "CatalogueServiceWeb instance has no attribute 'records'",
+ "output_type": "pyerr",
+ "traceback": [
+ "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[1;31mAttributeError\u001b[0m Traceback (most recent call last)",
+ "\u001b[1;32m\u001b[0m in \u001b[0;36m\u001b[1;34m()\u001b[0m\n\u001b[0;32m 5\u001b[0m \u001b[1;32mexcept\u001b[0m \u001b[0mBaseException\u001b[0m \u001b[1;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 6\u001b[0m \u001b[1;32mprint\u001b[0m \u001b[1;34m\"* FAILED\"\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0murl\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0me\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mmsg\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 7\u001b[1;33m \u001b[1;32mfor\u001b[0m \u001b[0mrecord\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mitem\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mcsw\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mrecords\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mitems\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 8\u001b[0m \u001b[1;32mprint\u001b[0m \u001b[1;34m\"*\"\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mitem\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mtitle\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n",
+ "\u001b[1;31mAttributeError\u001b[0m: CatalogueServiceWeb instance has no attribute 'records'"
+ ]
+ }
+ ],
+ "prompt_number": 40
+ },
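The record titles above are heavily duplicated (one record per run of the same experiment). A stdlib sketch that collapses such a title list to unique titles with counts, preserving first-seen order (the short list below is a stand-in for the `csw.records` titles printed above):

```python
from collections import OrderedDict

# Stand-in subset of the titles printed by the CSW query above.
titles = [
    'CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment',
    'CLIVAR model output prepared for GFDL Seasonal-Interannual Experimental Forecasts Coupled Data Assimilation Experiment',
    'HYCOM GLBa0.08',
    'NCOM Region 1 Aggregation/Best Time Series',
    'HYCOM GLBa0.08',
]

# Count occurrences while keeping first-seen order.
counts = OrderedDict()
for t in titles:
    counts[t] = counts.get(t, 0) + 1

for title, n in counts.items():
    print('%3dx %s' % (n, title))
```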
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Get bounding polygons from each dataset "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "from paegan.cdm.dataset import CommonDataset\n",
+ "\n",
+ "dataset_polygons = {}\n",
+ "for dap in dap_urls:\n",
+ "    cd = CommonDataset.open(dap)\n",
+ "    var = cd.get_varname_from_stdname(standard_name=\"air_temperature\")\n",
+ "    # var = cd.get_varname_from_stdname(standard_name=\"sea_water_temperature\")\n",
+ "    if var:\n",
+ "        dataset_polygons[dap] = cd.getboundingpolygon(var=var)\n",
+ "    else:\n",
+ "        print \"No standard_name 'air_temperature' in\", dap"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "ename": "NameError",
+ "evalue": "name 'dap_urls' is not defined",
+ "output_type": "pyerr",
+ "traceback": [
+ "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[1;31mNameError\u001b[0m Traceback (most recent call last)",
+ "\u001b[1;32m\u001b[0m in \u001b[0;36m\u001b[1;34m()\u001b[0m\n\u001b[0;32m 2\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 3\u001b[0m \u001b[0mdataset_polygons\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;33m{\u001b[0m\u001b[1;33m}\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 4\u001b[1;33m \u001b[1;32mfor\u001b[0m \u001b[0mdap\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mdap_urls\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 5\u001b[0m \u001b[0mcd\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mCommonDataset\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mopen\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mdap\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 6\u001b[0m \u001b[0mvar\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mcd\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mget_varname_from_stdname\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mstandard_name\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;34m\"air_temperature\"\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n",
+ "\u001b[1;31mNameError\u001b[0m: name 'dap_urls' is not defined"
+ ]
+ }
+ ],
+ "prompt_number": 43
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "##### Overlay dataset polygons on top of Important Bird Area polygons"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [
+ "import random\n",
+ "for p in dataset_polygons:\n",
+ " color = \"%06x\" % random.randint(0,0xFFFFFF)\n",
+ " map.line(p.boundary.coords, line_color=color, line_weight=5)\n",
+ " \n",
+ "map._build_map()\n",
+ "\n",
+ "from IPython.core.display import HTML\n",
+ "HTML(''.format(srcdoc=map.HTML.replace('\"', '"')))"
+ ],
+ "language": "python",
+ "metadata": {},
+ "outputs": [
+ {
+ "ename": "NameError",
+ "evalue": "name 'dataset_polygons' is not defined",
+ "output_type": "pyerr",
+ "traceback": [
+ "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[1;31mNameError\u001b[0m Traceback (most recent call last)",
+ "\u001b[1;32m\u001b[0m in \u001b[0;36m\u001b[1;34m()\u001b[0m\n\u001b[0;32m 1\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mrandom\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 2\u001b[1;33m \u001b[1;32mfor\u001b[0m \u001b[0mp\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mdataset_polygons\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 3\u001b[0m \u001b[0mcolor\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;34m\"%06x\"\u001b[0m \u001b[1;33m%\u001b[0m \u001b[0mrandom\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mrandint\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m0\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;36m0xFFFFFF\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 4\u001b[0m \u001b[0mmap\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mline\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mp\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mboundary\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mcoords\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mline_color\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mcolor\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mline_weight\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m5\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 5\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n",
+ "\u001b[1;31mNameError\u001b[0m: name 'dataset_polygons' is not defined"
+ ]
+ }
+ ],
+ "prompt_number": 41
+ },
+ {
+ "cell_type": "code",
+ "collapsed": false,
+ "input": [],
+ "language": "python",
+ "metadata": {},
+ "outputs": []
+ }
+ ],
+ "metadata": {}
+ }
+ ]
+}
\ No newline at end of file
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/Scenario_3A_SeaBirds.py b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/Scenario_3A_SeaBirds.py
new file mode 100644
index 0000000..828979f
--- /dev/null
+++ b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/Scenario_3A_SeaBirds.py
@@ -0,0 +1,211 @@
+# -*- coding: utf-8 -*-
+# 3.0
+
+#
+
+from utilities import css_styles
+css_styles()
+
+#
+
+# # IOOS System Test - Theme 3 - Scenario A - [Description](https://github.com/ioos/system-test/wiki/Development-of-Test-Themes#scenario-3a-assessing-seabird-vulnerability-in-the-bering-sea)
+#
+# ## Assessing Seabird Vulnerability in the Bering Sea
+#
+# ## Questions
+# 1. Can we discover, access, and overlay Important Bird Area polygons (and therefore other similar layers for additional important resource areas) on modeled datasets in the Bering Sea?
+# 2. Is metadata for projected climate data layers and Important Bird Area polygons sufficient to determine a subset of polygons desired by a query?
+# 3. Can a simple set of statistics (e.g., mean and standard deviation) be derived from multiple variables in each of the six models to characterize the forecast variability of climate conditions through time, to the end of the model runs (2003-2040)?
+# 4. Can we create a standardized matrix or other display method for output variables that allows resource experts to easily assess projected changes in climate variables within given ranges of time, and to compare projected changes across multiple coupled oceanographic and climate models?
+# 5. Can we develop a set of process-specific guidelines and a standardized set of outputs for a tool that would allow researchers to address a diversity of resource management questions relative to projected changes in climate for specific zones of interest?
+
+#
+
+# ## Q1 - Can we discover, access, and overlay Important Bird Area polygons (and therefore other similar layers for additional important resource areas) on modeled datasets in the Bering Sea?
+
+#
+
+# Discovery is not possible - Important Bird Area polygons are not discoverable at this time. They are, however, available on a GeoServer known to us. This should be fixed: the WFS service should be added to a queryable CSW.
+
+#
+
+# ##### Load 'known' WFS endpoint with Important Bird Area polygons
+
+#
+
+from owslib.wfs import WebFeatureService
+known_wfs = "http://solo.axiomalaska.com/geoserver/audubon/ows"
+wfs = WebFeatureService(known_wfs, version='1.0.0')
+print sorted(wfs.contents.keys())
+
+#
+
+# ##### We already know that the 'audubon:audubon_ibas' layer contains the Important Bird Areas. Request a 'geojson' response from the layer
+
+#
+
+import geojson
+geojson_response = wfs.getfeature(typename=['audubon:audubon_ibas'], maxfeatures=1, outputFormat="application/json", srsname="urn:x-ogc:def:crs:EPSG:4326").read()
+feature = geojson.loads(geojson_response)
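
The returned document is plain GeoJSON, so the standard-library `json` module can parse it as well if the `geojson` package is unavailable. A minimal sketch with a hypothetical one-feature response (the feature name and coordinates are illustrative, not real WFS output):

```python
import json

# Hypothetical minimal WFS GeoJSON response
geojson_response = '''{"type": "FeatureCollection",
  "features": [{"type": "Feature",
                "properties": {"name": "Example IBA"},
                "geometry": {"type": "Polygon",
                             "coordinates": [[[-175.0, 52.0], [-160.0, 52.0],
                                              [-160.0, 58.0], [-175.0, 52.0]]]}}]}'''

feature = json.loads(geojson_response)
names = [f["properties"]["name"] for f in feature["features"]]
print(names)  # ['Example IBA']
```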
+
+#
+
+# ##### Convert to Shapely geometry objects
+
+#
+
+from shapely.geometry import shape
+shapes = [shape(s.get("geometry")) for s in feature.get("features")]
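
If Shapely is not at hand, the bounding information used later for the CSW BBox filters can also be derived directly from the raw GeoJSON dict. A sketch mirroring what `shape(...).bounds` returns (the `polygon_bounds` helper and the sample polygon are illustrative):

```python
def polygon_bounds(geometry):
    """Compute (minx, miny, maxx, maxy) from a GeoJSON Polygon dict,
    mirroring shapely's shape(...).bounds."""
    xs = [x for ring in geometry["coordinates"] for x, y in ring]
    ys = [y for ring in geometry["coordinates"] for x, y in ring]
    return (min(xs), min(ys), max(xs), max(ys))

# Hypothetical single-ring polygon in the Bering Sea area
geom = {"type": "Polygon",
        "coordinates": [[(-175.0, 52.0), (-160.0, 52.0),
                         (-160.0, 58.0), (-175.0, 58.0), (-175.0, 52.0)]]}
print(polygon_bounds(geom))  # (-175.0, 52.0, -160.0, 58.0)
```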
+
+#
+
+# ##### Map the geometry objects
+
+#
+
+import folium
+map_center = shapes[0].centroid
+mapper = folium.Map(location=[map_center.x, map_center.y], zoom_start=6)
+for s in shapes:
+    if hasattr(s.boundary, 'coords'):
+        mapper.line(s.boundary.coords, line_color='#FF0000', line_weight=5)
+    else:
+        # MultiPolygon: draw each member polygon's boundary
+        for p in s:
+            mapper.line(p.boundary.coords, line_color='#FF0000', line_weight=5)
+mapper._build_map()
+
+from IPython.core.display import HTML
+HTML(''.format(srcdoc=mapper.HTML.replace('"', '&quot;')))
+
+#
+
+# ### Can we discover other datasets in this polygon area?
+
+#
+
+# ##### Setup BCSW Filters to find models in the area of the Important Bird Polygon
+
+#
+
+from owslib import fes
+
+# Polygon filters
+polygon_filters = []
+for s in shapes:
+ f = fes.BBox(bbox=list(reversed(s.bounds)))
+ polygon_filters.append(f)
+# If we have more than one polygon filter, OR them together
+if len(polygon_filters) > 1:
+ polygon_filters = fes.Or(polygon_filters)
+elif len(polygon_filters) == 1:
+ polygon_filters = polygon_filters[0]
+
+# Name filters
+name_filters = []
+model_strings = ['roms', 'selfe', 'adcirc', 'ncom', 'hycom', 'fvcom', 'wrf']
+for model in model_strings:
+ title_filter = fes.PropertyIsLike(propertyname='apiso:Title', literal='*%s*' % model, wildCard='*')
+ name_filters.append(title_filter)
+ subject_filter = fes.PropertyIsLike(propertyname='apiso:Subject', literal='*%s*' % model, wildCard='*')
+ name_filters.append(subject_filter)
+# Or all of the name filters together
+name_filters = fes.Or(name_filters)
+
+# Final filters
+filters = fes.And([polygon_filters, name_filters])
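
The `list(reversed(s.bounds))` call above simply flips Shapely's `(minx, miny, maxx, maxy)` tuple end-to-end to match the axis order the CSW BBox query expects. With a hypothetical bounds tuple:

```python
# Shapely's .bounds is (minx, miny, maxx, maxy); reversed() flips it end-to-end.
bounds = (52.0, -175.0, 58.0, -160.0)  # hypothetical bounds tuple
bbox = list(reversed(bounds))
print(bbox)  # [-160.0, 58.0, -175.0, 52.0]
```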
+
+#
+
+# ##### The actual CSW filters look like this
+
+#
+
+from owslib.etree import etree
+print etree.tostring(filters.toXML(), pretty_print=True)
+
+#
+
+# ##### Find all models contained in the CSW endpoints
+
+#
+
+from owslib.csw import CatalogueServiceWeb
+endpoints = ['http://www.nodc.noaa.gov/geoportal/csw',
+             'http://www.ngdc.noaa.gov/geoportal/csw',
+             'http://catalog.data.gov/csw-all',
+             # 'http://cwic.csiss.gmu.edu/cwicv1/discovery',
+             'http://geoport.whoi.edu/geoportal/csw',
+             'https://edg.epa.gov/metadata/csw',
+             'http://cmgds.marine.usgs.gov/geonetwork/srv/en/csw',
+             'http://cida.usgs.gov/gdp/geonetwork/srv/en/csw',
+             'http://geodiscover.cgdi.ca/wes/serviceManagerCSW/csw',
+             'http://geoport.whoi.edu/gi-cat/services/cswiso']
+
+#
+
+# ##### Filter out CSW servers that do not support a BBOX query
+
+#
+
+bbox_endpoints = []
+for url in endpoints:
+    try:
+        csw = CatalogueServiceWeb(url, timeout=20)
+    except BaseException:
+        print "Failure - %s - Timed out" % url
+        continue
+    if "BBOX" in csw.filters.spatial_operators:
+        print "Success - %s - BBOX Query supported" % url
+        bbox_endpoints.append(url)
+    else:
+        print "Failure - %s - BBOX Query NOT supported" % url
+
+#
+
+from utilities import service_urls
+
+dap_urls = []
+for url in bbox_endpoints:
+    try:
+        csw = CatalogueServiceWeb(url, timeout=20)
+        csw.getrecords2(constraints=[filters], maxrecords=1000, esn='full')
+        for record, item in csw.records.items():
+            print "*", item.title
+        # Collect OPeNDAP endpoints for the next step (service_urls is defined in utilities.py)
+        dap_urls.extend(service_urls(csw.records, service='odp:url'))
+    except BaseException as e:
+        print "* FAILED", url, e
+
+#
+
+# ##### Get bounding polygons from each dataset
+
+#
+
+from paegan.cdm.dataset import CommonDataset
+
+dataset_polygons = {}
+for dap in dap_urls:
+    cd = CommonDataset.open(dap)
+    var = cd.get_varname_from_stdname(standard_name="air_temperature")
+    # var = cd.get_varname_from_stdname(standard_name="sea_water_temperature")
+    if var:
+        dataset_polygons[dap] = cd.getboundingpolygon(var=var)
+    else:
+        print "No standard_name 'air_temperature' in", dap
+
+
+#
+
+# ##### Overlay dataset polygons on top of Important Bird Area polygons
+
+#
+
+import random
+for p in dataset_polygons.itervalues():
+    color = "#%06x" % random.randint(0, 0xFFFFFF)
+    mapper.line(p.boundary.coords, line_color=color, line_weight=5)
+
+mapper._build_map()
+
+from IPython.core.display import HTML
+HTML(''.format(srcdoc=mapper.HTML.replace('"', '&quot;')))
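
Folium expects colors of the form `'#RRGGBB'`, so the zero-padded hex format used above matters: without the `06` padding, small random integers would yield strings too short to be valid colors. A quick check:

```python
# Zero-padded hex keeps six digits even for tiny random values,
# which is what valid '#RRGGBB' colors require.
assert "%06x" % 0x2A == "00002a"
assert "#%06x" % 0xFF0000 == "#ff0000"  # %x emits lowercase hex
```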
+
+#
+
+
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/hannahscontribution.ipynb b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/hannahscontribution.ipynb
similarity index 100%
rename from Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/hannahscontribution.ipynb
rename to Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/hannahscontribution.ipynb
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/pip-requirements.txt b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/pip-requirements.txt
similarity index 70%
rename from Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/pip-requirements.txt
rename to Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/pip-requirements.txt
index a565a6e..2691394 100644
--- a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/pip-requirements.txt
+++ b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/pip-requirements.txt
@@ -3,3 +3,5 @@ numpy
matplotlib
OWSLib
netCDF4
+geojson
+Shapely
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/utilities.py b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/utilities.py
new file mode 100644
index 0000000..ef703aa
--- /dev/null
+++ b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_Important_Bird_Area_Polygon_Analysis/utilities.py
@@ -0,0 +1,32 @@
+def service_urls(records, service='odp:url'):
+    """Extract service URLs of a specific type (DAP, SOS) from CSW records."""
+    service_string = 'urn:x-esri:specification:ServiceType:' + service
+    urls = []
+    for key, rec in records.iteritems():
+        # Iterate through the references until a match is found;
+        # if no match is found, next() returns the default value (here None).
+        url = next((d['url'] for d in rec.references if
+                    d['scheme'] == service_string), None)
+        if url is not None:
+            urls.append(url)
+    return urls
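
The generator-with-default pattern in `service_urls` can be exercised without a live CSW by mocking a record object. `FakeRec`, the scheme strings, and the URLs below are illustrative stand-ins, not real catalog data:

```python
class FakeRec(object):
    """Illustrative stand-in for an owslib CSW record."""
    def __init__(self, references):
        self.references = references

scheme = 'urn:x-esri:specification:ServiceType:odp:url'
records = {
    'rec1': FakeRec([{'scheme': scheme, 'url': 'http://example.com/dodsC/model'}]),
    'rec2': FakeRec([{'scheme': 'urn:x-esri:specification:ServiceType:wms:url',
                      'url': 'http://example.com/wms'}]),
}

# Same next()-with-default pattern as service_urls above:
urls = []
for rec in records.values():
    url = next((d['url'] for d in rec.references if d['scheme'] == scheme), None)
    if url is not None:
        urls.append(url)
print(urls)  # ['http://example.com/dodsC/model']
```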
+
+
+from IPython.core.display import HTML
+def css_styles():
+ return HTML("""
+
+ """)
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_SeaBirds.ipynb b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_SeaBirds.ipynb
deleted file mode 100644
index 3b8c37a..0000000
--- a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_SeaBirds.ipynb
+++ /dev/null
@@ -1,56 +0,0 @@
-{
- "metadata": {
- "name": "",
- "signature": "sha256:156e5805577dbc3d75b8a28697532c29a58d262099aee3520b7db20bf475f300"
- },
- "nbformat": 3,
- "nbformat_minor": 0,
- "worksheets": [
- {
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# IOOS System Test - Theme 3 - Scenario A - [Description](https://github.com/ioos/system-test/wiki/Development-of-Test-Themes#scenario-3a-assessing-seabird-vulnerability-in-the-bering-sea)\n",
- "\n",
- "## Assessing Seabird Vulnerability in the Bering Sea\n",
- "\n",
- "## Questions\n",
- "1. Can all the PMEL models and their corresponding variables be adequately discovered and accessed using IOOS tools?\n",
- "2. Can we discover, access, and overlay Important Bird Area polygons (and therefore other similar layers for additional important resource areas) on PMEL climate projections for the Bering Sea?\n",
- "3. Is metadata for projected climate data layers and Important Bird Area polygons sufficient to determine a subset of polygons desired by a query?\n",
- "4. Can a simple set statistics (e.g., mean and standard deviation) be derived from multiple variables in each of the six models to derive the forecast variability of climate conditions through time, through the end of the model runs (2003-2040)?\n",
- "5. Can we create a standardized matrix or other display method for output variables that allow resource experts to easily assess projected changes in climate variables, within given ranges of time, and compare projected changes across multiple coupled oceanographic and climate models?\n",
- "6. Can we develop a set of process-specific guidelines and a standardized set of outputs for a tool that would allow researchers to address a diversity of resource management questions relative to projected changes in climate for specific zones of interest?\n"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## Q1 - Can all the PMEL models and their corresponding variables be adequately discovered and accessed using IOOS tools?"
- ]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [],
- "language": "python",
- "metadata": {},
- "outputs": [],
- "prompt_number": 0
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [],
- "language": "python",
- "metadata": {},
- "outputs": []
- }
- ],
- "metadata": {}
- }
- ]
-}
\ No newline at end of file
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_SeaBirds.py b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_SeaBirds.py
deleted file mode 100644
index e0e6f1f..0000000
--- a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/Scenario_3A_SeaBirds.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# -*- coding: utf-8 -*-
-# 3.0
-
-#
-
-# # IOOS System Test - Theme 3 - Scenario A - [Description](https://github.com/ioos/system-test/wiki/Development-of-Test-Themes#scenario-3a-assessing-seabird-vulnerability-in-the-bering-sea)
-#
-# ## Assessing Seabird Vulnerability in the Bering Sea
-#
-# ## Questions
-# 1. Can all the PMEL models and their corresponding variables be adequately discovered and accessed using IOOS tools?
-# 2. Can we discover, access, and overlay Important Bird Area polygons (and therefore other similar layers for additional important resource areas) on PMEL climate projections for the Bering Sea?
-# 3. Is metadata for projected climate data layers and Important Bird Area polygons sufficient to determine a subset of polygons desired by a query?
-# 4. Can a simple set statistics (e.g., mean and standard deviation) be derived from multiple variables in each of the six models to derive the forecast variability of climate conditions through time, through the end of the model runs (2003-2040)?
-# 5. Can we create a standardized matrix or other display method for output variables that allow resource experts to easily assess projected changes in climate variables, within given ranges of time, and compare projected changes across multiple coupled oceanographic and climate models?
-# 6. Can we develop a set of process-specific guidelines and a standardized set of outputs for a tool that would allow researchers to address a diversity of resource management questions relative to projected changes in climate for specific zones of interest?
-
-#
-
-# ## Q1 - Can all the PMEL models and their corresponding variables be adequately discovered and accessed using IOOS tools?
-
-#
-
-
-#
-
-
diff --git a/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/utilities.py b/Theme_3_Species_Protection_and_Marine_Habitat_Conservation/Scenario_3A_Assessing_Seabird_Vulnerability_in_the_Bering_Sea/utilities.py
deleted file mode 100644
index e69de29..0000000