
Commit 59cc69c

Author: Matt Bertrand

- Reformatted processor descriptions
- Improved efficiency of a few geomosaic processors with many bands
- Renamed local_settings.py to local_settings.py.template to avoid errors when running unittests from geonode

1 parent 29b98f5 · commit 59cc69c

20 files changed: 260 additions & 232 deletions


.travis.yml (1 addition, 1 deletion)

@@ -20,7 +20,7 @@ install:
 - pip install -r dev-requirements.txt
 - pip install -e .
 - git clone -b 2.4.x https://github.com/GeoNode/geonode.git
-- cp local_settings.py geonode/geonode/.
+- cp local_settings.py.template geonode/geonode/local_settings.py
 - pip install -e geonode

 script:
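The rename above follows a common pattern: track only a `*.template` settings file in the repository and copy it into place at build time, so GeoNode's test runner never imports a stale `local_settings` module from this package. A minimal sketch of the pattern in a scratch directory (paths and file contents here are illustrative, not the real GeoNode settings):

```shell
# Only the template is tracked; the live settings file is generated at build time.
mkdir -p demo/geonode/geonode
echo "SITEURL = 'http://localhost:8000/'" > demo/local_settings.py.template

# Build step: materialize the real settings file from the template.
cp demo/local_settings.py.template demo/geonode/geonode/local_settings.py
cat demo/geonode/geonode/local_settings.py
```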

dataqs/airnow/airnow.py (25 additions, 25 deletions)

@@ -50,31 +50,31 @@ class AirNowGRIB2HourlyProcessor(GeoDataMosaicProcessor):
         "airnow_aqi_combined"]
     img_patterns = ["", "_pm25", "_combined"]
     layer_titles = ["Ozone", "PM25", "Combined Ozone & PM25"]
-    description = """
-    U.S. Environmental Protection Agency’s (EPA) nationwide, voluntary program,
-    AirNow(www.airnow.gov), provides real-time air quality data and forecasts to
-    protect public health across the United States, Canada, and parts of Mexico.
-    AirNowreceives real-time ozone and PM2.5data from over 2,000 monitors and
-    collects air quality forecasts for more than 300 cities.
-
-    As part of the Global Earth Observation System of Systems (GEOSS)
-    (www.epa.gov/geoss) program, the AirNow API system broadens access to AirNowdata
-    and data products. AirNow API produces data products in several standard data
-    formats and makes them available via FTP and web services. This document
-    describes the GRIB2 file formats.
-
-    All data provided by AirNow API are made possible by the efforts of more than
-    120 local, state, tribal, provincial, and federal government agencies
-    (www.airnow.gov/index.cfm?action=airnow.partnerslist). These data are not fully
-    verified or validated and should be considered preliminary and subject to
-    change. Data and information reported to AirNow from federal, state, local, and
-    tribal agencies are for the express purpose of reporting and forecasting the
-    Air Quality Index (AQI). As such, they should not be used to formulate or
-    support regulation, trends, guidance, or any other government or public
-    decision making. Official regulatory air quality data must be obtained from
-    EPA’s Air Quality System (AQS) (www.epa.gov/ttn/airs/airsaqs). See the AirNow
-    Data Exchange Guidelines at http://airnowapi.org/docs/DataUseGuidelines.pdf.
-    """
+    description = (
+        "U.S. Environmental Protection Agency’s (EPA) nationwide, voluntary "
+        "program, AirNow (www.airnow.gov), provides real-time air quality data "
+        "and forecasts to protect public health across the United States, "
+        "Canada, and parts of Mexico. AirNow receives real-time ozone and "
+        "PM2.5 data from over 2,000 monitors and collects air quality "
+        "forecasts for more than 300 cities.\n\nAs part of the Global Earth "
+        "Observation System of Systems (GEOSS) (www.epa.gov/geoss) program, "
+        "the AirNow API system broadens access to AirNow data and data "
+        "products. AirNow API produces data products in several standard data "
+        "formats and makes them available via FTP and web services. This "
+        "document describes the GRIB2 file formats.\n\nAll data provided by "
+        "AirNow API are made possible by the efforts of more than 120 local, "
+        "state, tribal, provincial, and federal government agencies "
+        "(www.airnow.gov/index.cfm?action=airnow.partnerslist). These data are "
+        "not fully verified or validated and should be considered preliminary "
+        "and subject to change. Data and information reported to AirNow from "
+        "federal, state, local, and tribal agencies are for the express "
+        "purpose of reporting and forecasting the Air Quality Index (AQI). As "
+        "such, they should not be used to formulate or support regulation, "
+        "trends, guidance, or any other government or public decision making. "
+        "Official regulatory air quality data must be obtained from EPA’s Air "
+        "Quality System (AQS) (www.epa.gov/ttn/airs/airsaqs). See the AirNow "
+        "Data Exchange Guidelines at http://airnowapi.org/docs/"
+        "DataUseGuidelines.pdf."
+    )

     def download(self, auth_account=AIRNOW_ACCOUNT, days=1):
         """

dataqs/cmap/cmap.py (53 additions, 52 deletions)

@@ -30,57 +30,54 @@ class CMAPProcessor(GeoDataMosaicProcessor):
     layer_name = 'cmap'
     bounds = '-178.75,178.75,-88.75,88.75'
     title = 'CPC Merged Analysis of Precipitation, 1979/01 - {}'
-    abstract = """The CPC Merged Analysis of Precipitation ("CMAP") is a
-    technique which produces pentad and monthly analyses of global precipitation
-    in which observations from raingauges are merged with precipitation estimates
-    from several satellite-based algorithms (infrared and microwave). The analyses
-    are are on a 2.5 x 2.5 degree latitude/longitude grid and extend back to 1979.
-    These data are comparable (but should not be confused with) similarly combined
-    analyses by the Global Precipitation Climatology Project which are described in
-    Huffman et al (1997).
-
-    It is important to note that the input data sources to make these analyses are
-    not constant throughout the period of record. For example, SSM/I (passive
-    microwave - scattering and emission) data became available in July of 1987;
-    prior to that the only microwave-derived estimates available are from the MSU
-    algorithm (Spencer 1993) which is emission-based thus precipitation estimates
-    are avaialble only over oceanic areas. Furthermore, high temporal resolution IR
-    data from geostationary satellites (every 3-hr) became available during 1986;
-    prior to that, estimates from the OPI technique (Xie and Arkin 1997) are used
-    based on OLR from polar orbiting satellites.
-
-    The merging technique is thoroughly described in Xie and Arkin (1997). Briefly,
-    the methodology is a two-step process. First, the random error is reduced by
-    linearly combining the satellite estimates using the maximum likelihood method,
-    in which case the linear combination coefficients are inversely proportional to
-    the square of the local random error of the individual data sources. Over global
-    land areas the random error is defined for each time period and grid location
-    by comparing the data source with the raingauge analysis over the surrounding
-    area. Over oceans, the random error is defined by comparing the data sources
-    with the raingauge observations over the Pacific atolls. Bias is reduced when
-    the data sources are blended in the second step using the blending technique of
-    Reynolds (1988). Here the data output from step 1 is used to define the "shape"
-    of the precipitation field and the rain gauge data are used to constrain the
-    amplitude.
-
-    Monthly and pentad CMAP estimates back to the 1979 are available from CPC ftp
-    server.
-
-    References:
-
-    Huffman, G. J. and co-authors, 1997: The Global Precipitation Climatology
-    Project (GPCP) combined data set. Bull. Amer. Meteor. Soc., 78, 5-20.
-
-    Reynolds, R. W., 1988: A real-time global sea surface temperature analysis. J.
-    Climate, 1, 75-86.
-
-    Spencer, R. W., 1993: Global oceanic precipitation from the MSU during 1979-91
-    and comparisons to other climatologies. J. Climate, 6, 1301-1326.
-
-    Xie P., and P. A. Arkin, 1996: Global precipitation: a 17-year monthly analysis
-    based on gauge observations, satellite estimates, and numerical model outputs.
-    Bull. Amer. Meteor. Soc., 78, 2539-2558.
-    """
+    abstract = (
+        "The CPC Merged Analysis of Precipitation ('CMAP') is a technique "
+        "which produces pentad and monthly analyses of global precipitation "
+        "in which observations from raingauges are merged with precipitation "
+        "estimates from several satellite-based algorithms (infrared and "
+        "microwave). The analyses are on a 2.5 x 2.5 degree latitude/"
+        "longitude grid and extend back to 1979.\n\nThese data are comparable "
+        "(but should not be confused with) similarly combined analyses by the "
+        "Global Precipitation Climatology Project which are described in "
+        "Huffman et al (1997).\n\nIt is important to note that the input data "
+        "sources to make these analyses are not constant throughout the "
+        "period of record. For example, SSM/I (passive microwave - scattering "
+        "and emission) data became available in July of 1987; prior to that "
+        "the only microwave-derived estimates available are from the MSU "
+        "algorithm (Spencer 1993), which is emission-based, thus "
+        "precipitation estimates are available only over oceanic areas. "
+        "Furthermore, high temporal resolution IR data from geostationary "
+        "satellites (every 3-hr) became available during 1986; prior to that, "
+        "estimates from the OPI technique (Xie and Arkin 1997) are used based "
+        "on OLR from polar orbiting satellites.\n\nThe merging technique is "
+        "thoroughly described in Xie and Arkin (1997). Briefly, the "
+        "methodology is a two-step process. First, the random error is "
+        "reduced by linearly combining the satellite estimates using the "
+        "maximum likelihood method, in which case the linear combination "
+        "coefficients are inversely proportional to the square of the local "
+        "random error of the individual data sources. Over global land areas "
+        "the random error is defined for each time period and grid location "
+        "by comparing the data source with the raingauge analysis over the "
+        "surrounding area. Over oceans, the random error is defined by "
+        "comparing the data sources with the raingauge observations over the "
+        "Pacific atolls. Bias is reduced when the data sources are blended in "
+        "the second step using the blending technique of Reynolds (1988). "
+        "Here the data output from step 1 is used to define the \"shape\" of "
+        "the precipitation field and the rain gauge data are used to "
+        "constrain the amplitude.\n\nMonthly and pentad CMAP estimates back "
+        "to 1979 are available from the CPC ftp server.\n\nSource: "
+        "http://www.esrl.noaa.gov/psd/data/gridded/data.cmap.html\n\nRaw data "
+        "file: ftp://ftp.cdc.noaa.gov/Datasets/cmap/enh/precip.mon.mean.nc"
+        "\n\nReferences:\n\nHuffman, G. J. and co-authors, 1997: The Global "
+        "Precipitation Climatology Project (GPCP) combined data set. Bull. "
+        "Amer. Meteor. Soc., 78, 5-20.\n\nReynolds, R. W., 1988: A real-time "
+        "global sea surface temperature analysis. J. Climate, 1, 75-86.\n\n"
+        "Spencer, R. W., 1993: Global oceanic precipitation from the MSU "
+        "during 1979-91 and comparisons to other climatologies. J. Climate, "
+        "6, 1301-1326.\n\nXie P., and P. A. Arkin, 1996: Global "
+        "precipitation: a 17-year monthly analysis based on gauge "
+        "observations, satellite estimates, and numerical model outputs. "
+        "Bull. Amer. Meteor. Soc., 78, 2539-2558."
+    )

     def download(self, url, tmp_dir=GS_TMP_DIR, filename=None):
         if not filename:
@@ -122,6 +119,7 @@ def run(self):
         cdf_file = self.convert(os.path.join(self.tmp_dir, ncfile))
         bands = get_band_count(cdf_file)
         img_list = self.get_mosaic_filenames(self.layer_name)
+        dst_files = []
         for band in range(1, bands + 1):
             band_date = re.sub('[\-\.]+', '', self.get_date(band).isoformat())
             img_name = '{}_{}T000000000Z.tif'.format(self.layer_name, band_date)
@@ -136,7 +134,10 @@ def run(self):
                 os.makedirs(dst_dir)
             if dst_file.endswith('.tif'):
                 shutil.move(os.path.join(self.tmp_dir, band_tif), dst_file)
-                self.post_geoserver(dst_file, self.layer_name)
+                dst_files.append(dst_file)
+        time.sleep(RSYNC_WAIT_TIME * 2)
+        for dst_file in dst_files:
+            self.post_geoserver(dst_file, self.layer_name, sleeptime=0)

         if not style_exists(self.layer_name):
             with open(os.path.join(script_dir,
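The run() change above is the efficiency fix from the commit message: instead of posting (and sleeping) once per band, destination files are collected during the move loop, a single rsync wait is taken, and every granule is then posted with sleeptime=0. A minimal sketch of that batching pattern, with a hypothetical post callable and wait constant standing in for the dataqs internals:

```python
import time

RSYNC_WAIT_TIME = 0.01  # hypothetical stand-in for the dataqs setting (seconds)


def post_granules(dst_files, post, wait=RSYNC_WAIT_TIME):
    """Wait once for rsync to settle, then post every granule with no
    per-file sleep: O(1) waits for n bands instead of O(n)."""
    time.sleep(wait * 2)
    for dst_file in dst_files:
        post(dst_file, sleeptime=0)


# Usage with a toy poster that just records the calls:
posted = []
post_granules(["cmap_b1.tif", "cmap_b2.tif", "cmap_b3.tif"],
              post=lambda f, sleeptime: posted.append((f, sleeptime)))
```

For a mosaic with many bands this turns dozens of fixed waits into one, which is why the commit calls it out for "geomosaic processors with many bands".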

dataqs/cmap/resources/cmap.sld (0 additions, 5 deletions)

@@ -9,11 +9,6 @@
       <sld:Rule>
         <sld:RasterSymbolizer>
           <Opacity>1.0</Opacity>
-          <ChannelSelection>
-            <GrayChannel>
-              <SourceChannelName>{latest_band}</SourceChannelName>
-            </GrayChannel>
-          </ChannelSelection>
           <ColorMap extended="true">
             <sld:ColorMapEntry color="#2b83ba" label="0 mm" opacity="1.0" quantity="0"/>
             <sld:ColorMapEntry color="#74b6ad" label="10 mm" opacity="1.0" quantity="10"/>

dataqs/forecastio/forecastio_air.py (10 additions, 7 deletions)

@@ -41,13 +41,16 @@ class ForecastIOAirTempProcessor(GeoDataMosaicProcessor):
     prefix = "forecast_io_airtemp"
     base_url = "http://maps.forecast.io/temperature/"
     layer_name = "forecast_io_airtemp"
-    description = """Project Quicksilver is an experimental new data product
-    that attempts to create the world's highest resolution real-time map of global
-    (near-surface) air temperature.\n\n
-    It is generated using the same source data models that power Forecast.io,
-    combined with a sophisticated microclimate model that adjusts the temperatures
-    based on the effects of elevation, terrain, proximity to water, foliage cover,
-    and other factors.\n\nSource: http://blog.forecast.io/project-quicksilver/"""
+    description = (
+        "Project Quicksilver is an experimental new data product that attempts "
+        "to create the world's highest resolution real-time map of global "
+        "(near-surface) air temperature.\n\nIt is generated using the same "
+        "source data models that power Forecast.io, combined with a "
+        "sophisticated microclimate model that adjusts the temperatures based "
+        "on the effects of elevation, terrain, proximity to water, foliage "
+        "cover, and other factors.\n\nSource: "
+        "http://blog.forecast.io/project-quicksilver/"
+    )

     def parse_name(self, img_date):
         imgstrtime = img_date.strftime("%Y-%m-%d %H:00")

dataqs/gdacs/gdacs.py (17 additions, 13 deletions)

@@ -39,19 +39,23 @@ class GDACSProcessor(GeoDataProcessor):
     prefix = "gdacs_alerts"
     layer_title = 'Flood, Quake, Cyclone Alerts - GDACS'
     params = {}
-    base_url = \
-        "http://www.gdacs.org/rss.aspx?profile=ARCHIVE&fromarchive=true&" + \
-        "from={}&to={}&alertlevel=&country=&eventtype=EQ,TC,FL&map=true"
-    description = """GDACS (Global Disaster and Alert Coordination System) is a
-    collaboration platform for organisations providing information on humanitarian
-    disasters. From a technical point of view, GDACS links information of all
-    participating organisations using a variety of systems to have a harmonized list
-    of data sources.In 2011, the GDACS platform was completely revised to collect,
-    store and distribute resources explicitly by events. The system matches
-    information from all organisations (by translating unique identifiers), and make
-    these resources available for GDACS users and developers in the form of GDACS
-    Platform Services. The GDACS RSS feed automatically include a list of available
-    resources.\n\nSource: http://www.gdacs.org/resources.aspx"""
+    base_url = (
+        "http://www.gdacs.org/rss.aspx?profile=ARCHIVE&fromarchive=true&"
+        "from={}&to={}&alertlevel=&country=&eventtype=EQ,TC,FL&map=true")
+    description = (
+        "GDACS (Global Disaster and Alert Coordination System) is a "
+        "collaboration platform for organisations providing information on "
+        "humanitarian disasters. From a technical point of view, GDACS links "
+        "information of all participating organisations using a variety of "
+        "systems to have a harmonized list of data sources. In 2011, the "
+        "GDACS platform was completely revised to collect, store and "
+        "distribute resources explicitly by events. The system matches "
+        "information from all organisations (by translating unique "
+        "identifiers), and makes these resources available for GDACS users "
+        "and developers in the form of GDACS Platform Services. The GDACS RSS "
+        "feed automatically includes a list of available resources.\n\n"
+        "Source: http://www.gdacs.org/resources.aspx"
+    )

     def __init__(self, *args, **kwargs):
         for key in kwargs.keys():

dataqs/gdacs/tests.py (2 additions, 2 deletions)

@@ -55,7 +55,7 @@ def test_download(self):
             body=response)
         rssfile = self.processor.download(self.processor.base_url.format(
             self.processor.params['sdate'], self.processor.params['edate']),
-            self.processor.prefix + ".rss")
+            filename=self.processor.prefix + ".rss")
         rsspath = os.path.join(
             self.processor.tmp_dir, rssfile)
         self.assertTrue(os.path.exists(rsspath))
@@ -74,7 +74,7 @@ def test_cleanup(self):
             body=response)
         rssfile = self.processor.download(self.processor.base_url.format(
             self.processor.params['sdate'], self.processor.params['edate']),
-            self.processor.prefix + ".rss")
+            filename=self.processor.prefix + ".rss")
         rsspath = os.path.join(
             self.processor.tmp_dir, rssfile)
         self.assertTrue(os.path.exists(rsspath))
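The test fix above passes filename as a keyword rather than as a positional argument. This guards against signature drift: if download() gains or reorders optional parameters, a keyword call still binds correctly while a positional one silently binds to the wrong parameter. A toy illustration (this download is a hypothetical stand-in, not the dataqs signature):

```python
def download(url, username=None, filename=None):
    """Toy stand-in: an optional parameter sits between url and filename,
    so a positional second argument binds to username by mistake."""
    return filename or url.rsplit("/", 1)[-1]


# Positional call misbinds: "gdacs_alerts.rss" lands in username,
# so filename stays None and the name is derived from the URL.
wrong = download("http://example.com/rss", "gdacs_alerts.rss")

# Keyword call binds correctly regardless of parameter order.
right = download("http://example.com/rss", filename="gdacs_alerts.rss")
```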

dataqs/gfms/gfms.py (15 additions, 12 deletions)

@@ -52,18 +52,21 @@ class GFMSProcessor(GeoDataProcessor):
     layer_future = "gfms_latest"
     layer_current = "gfms_current"
    prefix = 'Flood_byStor_'
-    description = u"""The GFMS (Global Flood Management System) is a NASA-funded
-    experimental system using real-time TRMM Multi-satellite Precipitation Analysis
-    (TMPA) precipitation information as input to a quasi-global (50°N - 50°S)
-    hydrological runoff and routing model running on a 1/8th degree latitude /
-    longitude grid. Flood detection/intensity estimates are based on 13 years of
-    retrospective model runs with TMPA input, with flood thresholds derived for
-    each grid location using surface water storage statistics (95th percentile plus
-    parameters related to basin hydrologic characteristics). Streamflow,surface
-    water storage,inundation variables are also calculated at 1km resolution.
-    In addition, the latest maps of instantaneous precipitation and totals from the
-    last day, three days and seven days are displayed.
-    \n\nSource: http://eagle1.umd.edu/flood/"""
+    description = (
+        u"The GFMS (Global Flood Management System) is a NASA-funded "
+        u"experimental system using real-time TRMM Multi-satellite "
+        u"Precipitation Analysis (TMPA) precipitation information as input to "
+        u"a quasi-global (50°N - 50°S) hydrological runoff and routing model "
+        u"running on a 1/8th degree latitude/longitude grid. Flood detection/"
+        u"intensity estimates are based on 13 years of retrospective model "
+        u"runs with TMPA input, with flood thresholds derived for each grid "
+        u"location using surface water storage statistics (95th percentile "
+        u"plus parameters related to basin hydrologic characteristics). "
+        u"Streamflow, surface water storage, and inundation variables are "
+        u"also calculated at 1km resolution. In addition, the latest maps of "
+        u"instantaneous precipitation and totals from the last day, three "
+        u"days and seven days are displayed.\n\n"
+        u"Source: http://eagle1.umd.edu/flood/"
+    )

     def get_latest_future(self):
         """
