Commit d9df075

Merge pull request #171 from KristinaRiemer/update_image_vignette
[WIP] update images vignette
2 parents 9e0a21e + eb3a4c0 commit d9df075

1 file changed

Lines changed: 18 additions & 72 deletions

File tree

vignettes/03-get-images-python.Rmd
@@ -27,12 +27,26 @@ All the `terrautils` functions are now available in Python, although we will onl
 In this section we retrieve the names of different sensor types that are available.
 This will allow you to understand what files may be available other than just those containing RGB image data.
 
+In order to run Python functions, including those from the `terrautils` library, within this R Markdown document, we have to install and set up `reticulate`.
+
+```{r get_pkgs, message=FALSE}
+if (!require(reticulate)) {
+  install.packages("reticulate")
+  reticulate::py_install("terrautils")
+}
+
+library(reticulate)
+use_virtualenv("r-reticulate")
+```
+
 We will first be using the `get_sensor_list` function to retrieve all the data on available sensors.
 We will then use the `unique_sensor_names` function to extract only the sensor names from the data we just retrieved.
 
-```{python py_get_sensor_data, eval=FALSE}
+```{python py_get_functions}
 from terrautils.products import get_sensor_list, unique_sensor_names
+```
 
+```{python py_get_sensor_data}
 url = 'https://terraref.ncsa.illinois.edu/clowder/'
 key = ''
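The hunk above splits the import into its own chunk and then calls `get_sensor_list` followed by `unique_sensor_names`. As a minimal offline sketch of the kind of de-duplication `unique_sensor_names` performs — the sample records and the `"sensor_name"` field are hypothetical stand-ins, since the real entries come back from the Clowder API:

```python
# Hypothetical sample of what a sensor listing might look like; the real
# records are returned by terrautils' get_sensor_list() from Clowder.
sample_sensors = [
    {"sensor_name": "RGB GeoTIFFs Datasets", "site": "Range 20 Column 6"},
    {"sensor_name": "Thermal IR GeoTIFFs Datasets", "site": "Range 20 Column 6"},
    {"sensor_name": "RGB GeoTIFFs Datasets", "site": "Range 21 Column 6"},
]

def unique_names(sensors):
    """Return each sensor name once, preserving first-seen order."""
    seen = []
    for record in sensors:
        name = record["sensor_name"]
        if name not in seen:
            seen.append(name)
    return seen

print(unique_names(sample_sensors))
```

This is illustrative only; in the vignette itself the extraction is done by `unique_sensor_names` on the live sensor data.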
@@ -61,7 +75,7 @@ If the `stream=True` parameter was omitted the file's entire contents would be i
 
 To illustrate how this might work we are going to pre-populate an array of file names and their associated Clowder IDs.
 
-```{python py_sample_images_data, eval=FALSE}
+```{python py_sample_images_data}
 files = [ {"id": "5c507cb74f0c4b0cbe6705f2",
            "filename": "rgb_geotiff_L1_ua-mac_2018-06-02__14-12-05-077_right.tif"},
           {"id": "5c507cb84f0cfd2aedf5a75a",
@@ -78,12 +92,12 @@ First we format the base URL for our query allowing us to reuse it for each file
 Next we loop through our array and create a customized URL while making the call to fetch the data using the `requests` interface.
 Finally we open the output file and use a loop to write the retrieved data.
 
-```{python py_fetch_image_files, eval=FALSE}
+```{python py_fetch_image_files}
 import requests
 from io import open
 
 # We are using the same `url` and `key` variables declared in the previous example above.
-filesurl = url + '/files/'
+filesurl = url + 'files/'
 params = { 'key': key }
 
 for f in files:
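Two things are worth noting in this hunk. First, `url` already ends in a slash, so the old `url + '/files/'` produced a double slash; the fix appends `'files/'` directly. Second, the download loop writes the streamed response in chunks. A minimal sketch of that chunked-write pattern, using an in-memory stream in place of a live Clowder response so it runs offline — `fake_response`, `chunk_size`, and the payload are illustrative stand-ins:

```python
import io

# Stand-in for a requests response opened with stream=True; with a real
# download the chunks would come from response.iter_content(chunk_size).
payload = b"pretend this is GeoTIFF image data" * 100
fake_response = io.BytesIO(payload)
chunk_size = 1024

# Stand-in for a local file opened in 'wb' mode.
out = io.BytesIO()
while True:
    chunk = fake_response.read(chunk_size)
    if not chunk:
        break
    out.write(chunk)  # each chunk is written as it arrives, never the whole file at once
```

Streaming this way keeps memory use bounded by `chunk_size`, which is the point of passing `stream=True` in the first place.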
@@ -108,71 +122,3 @@ Below are examples of images captured approximately one month apart [^1] [^2]
 
 [^1]: May 4, 2018 - rgb_geotiff_L1_ua-mac_2018-05-04__13-07-04-077_right.tif,rgb_geotiff_L1_ua-mac_2018-05-04__13-07-04-077_left.tif
 [^2]: Jun 2, 2018 - rgb_geotiff_L1_ua-mac_2018-06-02__14-12-05-077_right.tif,rgb_geotiff_L1_ua-mac_2018-06-02__14-12-05-077_left.tif
-
-
-<!-- The following is the 'correct' way of retrieving file names and IDs based upon a site and sensor name
-
-## Objective: To be able to demonstrate how to locate and retrieve RGB image files
-
-This vignette shows how to locate and retrieve image files from your site for a specific date range using Python.
-We will be searching for and retrieving publicly available images associated with growing Season 6 from the University of Arizona's [Maricopa Agricultural Center](http://cals-mac.arizona.edu/).
-Note that the search in this vignette will not return any results, but we will walk you through the process so that you can query your own data.
-The files are stored online on the data management system Clowder, which is accessed using an API.
-
-After completing this vignette it should be possible to search for and retrieve other files through the use of the API.
-
-As an added bonus we've also included an example of how to retrieve the list of available sensor names through the API.
-By using the sensor names returned, it's possible to retrieve other files containing the data the sensors have collected.
-
-**Requirements**
-
-* Python
-* the terrautils library
-  + this can be installed from PyPI by running `pip install terrautils` in the terminal
-* an API key to access these data
-
-The API key is a string that gets generated upon request through your Clowder account.
-Existing API keys will work with this vignette.
-The following steps can help you get a new API key:
-
-1. first register with Clowder at the "https://terraref.ncsa.illinois.edu/clowder/" site
-   a) click the `Login` button and wait for the login screen to appear
-   b) then select the `Sign up` button and enter an email address you have access to
-2. an email is sent to the entered address with instructions for completing the registration process
-3. once registration is complete
-   a) log into Clowder and select the `View profile` menu option from the drop-down that is near the search control
-   b) enter a name and click the `+ Add` button under the "User API Keys" heading in the profile page to generate a new key
-4. copy the key shown for use in this vignette
-
-## Locating the images
-
-To begin looking for files, a sensor name and site name are needed.
-We will be using 'RGB GeoTIFFs Datasets' as the sensor name and 'MAC Field Scanner Season 6 Range 20 Column 6' as the site name.
-Later in this vignette we show how to retrieve the list of available sensors.
-When searching for your images, you will want to use your site name.
-
-The URL string points to the API to query.
-In this case we'll be using "https://terraref.ncsa.illinois.edu/clowder/api".
-We will be using an empty key for this example since we are querying public data.
-You will want to use the key you created for your Clowder account when querying your data.
-
-As mentioned in the Objectives, we are limiting the search timeframe through the use of the `since` and `until` parameters.
-These parameters can be omitted to search for all available images (or all other files by changing the sensor).
-
-```{python py_get_file_listing, eval=FALSE}
-from terrautils.products import get_file_listing
-
-url = 'https://terraref.ncsa.illinois.edu/clowder/'
-key = ''
-sensor = 'RGB GeoTIFFs Datasets'
-sitename = 'MAC Field Scanner Season 6 Range 20 Column 6'
-files = get_file_listing(None, url, key, sensor, sitename,
-                         since='2018-05-01', until='2018-05-31')
-```
-
-The `files` variable now contains an array of all the files in the datasets that match the sensor in the plot for the month of May.
-As with this example, when performing your own queries it's possible that there are no matches found and the `files` array would be empty.
-
--->
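The removed (commented-out) section above limits results with the `since` and `until` parameters of `get_file_listing`, where the filtering happens server-side. As a purely client-side illustration of the same idea, the vignette's filenames embed a capture date (`rgb_geotiff_L1_ua-mac_YYYY-MM-DD__...`), so a listing can also be narrowed by parsing that stamp — the `files` sample and the `capture_date` helper here are illustrative, not part of terrautils:

```python
from datetime import date

# Hypothetical listing; real entries would come from get_file_listing.
files = [
    {"filename": "rgb_geotiff_L1_ua-mac_2018-05-04__13-07-04-077_right.tif"},
    {"filename": "rgb_geotiff_L1_ua-mac_2018-06-02__14-12-05-077_right.tif"},
]

def capture_date(record):
    """Parse the YYYY-MM-DD portion out of a ua-mac filename."""
    stamp = record["filename"].split("ua-mac_")[1][:10]
    year, month, day = (int(part) for part in stamp.split("-"))
    return date(year, month, day)

# Equivalent of since='2018-05-01', until='2018-05-31', done client-side.
since, until = date(2018, 5, 1), date(2018, 5, 31)
may_files = [f for f in files if since <= capture_date(f) <= until]
print([f["filename"] for f in may_files])
```

Prefer the server-side parameters when they are available; this sketch is only a fallback for listings you already hold locally.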
