The objectives for this lab were to become familiar with downloading data sets from several internet sources, importing that data into ArcGIS, joining the data, projecting the data into one coordinate system, and building a geodatabase to house the data sets. To do so we used Python scripts to automate the process of clipping and projecting the raster data sets we collected to the Trempealeau County boundary.
General methods
For the first part of the lab we downloaded five separate datasets from four online database resources supplied to us by our professor. After downloading the zip files to a temporary folder on our department drive (from which the zip files were later deleted), we unzipped them to our own working folder and then selected which rasters we would use for our geodatabase.
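A minimal sketch of that unzip step is shown below; the temporary and working folder paths are placeholders rather than the actual department drive locations.

```python
# Sketch of extracting the downloaded zip files into the working folder.
# Both folder paths are placeholders.
import os
import zipfile

temp_dir = r"Q:\Temp\lab_downloads"              # placeholder: where the zips were saved
working_dir = r"Q:\StudentFolder\Lab5\working"   # placeholder: personal working folder

for filename in os.listdir(temp_dir):
    if filename.lower().endswith(".zip"):
        # Each archive gets its own subfolder named after the zip file
        out_folder = os.path.join(working_dir, os.path.splitext(filename)[0])
        with zipfile.ZipFile(os.path.join(temp_dir, filename)) as archive:
            archive.extractall(out_folder)
```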
The data sets and websites we acquired them from are as follows:
1. National Land Cover Database 2011: Land cover (http://nationalmap.gov/viewers.html.)
2. National Elevation Dataset, 1/3 arc-second (http://nationalmap.gov/viewers.html.)
3. USDA Cropland Data Layer by state (http://datagateway.nrcs.usda.gov/)
4. Trempealeau County geodatabase (http://www.tremplocounty.com/landrecords/)
5. SSURGO soil survey (http://websoilsurvey.sc.egov.usda.gov/App/HomePage.htm)
After downloading and organizing the data sets, we created a Python script in PyScripter to clip and project the soils, land use, and elevation rasters. The script I created and ran followed the general pattern sketched below.
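In this sketch the workspace paths, geodatabase name, and output coordinate system (NAD 1983 HARN Wisconsin Transverse Mercator, EPSG 3070) are placeholders and assumptions, not the exact values from the lab.

```python
# Sketch of the clip-and-project script (arcpy). All paths are placeholders,
# and the output coordinate system is an assumption for Trempealeau County, WI.
import os
import arcpy
from arcpy import env

env.workspace = r"Q:\StudentFolder\Lab5\working"   # placeholder: folder of unzipped rasters
env.overwriteOutput = True

boundary = r"Q:\StudentFolder\Lab5\Trempealeau.gdb\county_boundary"  # placeholder
out_gdb = r"Q:\StudentFolder\Lab5\Trempealeau.gdb"                   # placeholder
out_sr = arcpy.SpatialReference(3070)  # NAD83(HARN) / Wisconsin TM (assumed)

for raster in arcpy.ListRasters():
    name = arcpy.Describe(raster).baseName
    clipped = os.path.join(out_gdb, name + "_clip")
    projected = os.path.join(out_gdb, name + "_project")

    # Clip each raster to the county boundary polygon
    arcpy.Clip_management(raster, "#", clipped, boundary, "", "ClippingGeometry")

    # Project the clipped raster into the common coordinate system
    # (NEAREST keeps categorical values intact; BILINEAR would suit the elevation data)
    arcpy.ProjectRaster_management(clipped, projected, out_sr, "NEAREST")

print("Finished clipping and projecting the rasters.")
```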
After the script finished running, I was able to create several maps showing attributes such as soil composition, landform classes, and digital elevation.
From the metadata, I gathered a number of data quality parameters and compiled them into a table. The exercise served a twofold purpose: it got us thinking about the specific data quality parameters we will work with as professionals when our datasets come from various sources, and it prepared us for working with more complex data sets in later classes. Knowing our data sets inside and out is key to understanding what we are working with and the limitations we may encounter.
| Data quality parameter | NASS CDL Land use | NED Elevation | NLCD Land cover | NTAD Rail lines | SSURGO Soil survey |
|---|---|---|---|---|---|
| Scale | 1:4,800 | N/A | N/A | 1:24,000 | N/A |
| Effective resolution | 30 meters | 10 meters | 30 meters | 12 meters | N/A |
| MMU | 22 meters | 10 meters | 30 meters | 12 meters | N/A |
| Planimetric coordinate accuracy | 10 meters | N/A | N/A | N/A | N/A |
| Lineage | Originator: Elecnor Deimos Imaging; Publisher: Astrium GEO Information Services; Publication place: Elecnor Deimos Imaging, Valladolid, Spain; Beginning date: 20121001; Ending date: 20131231 | Organization: U.S. Geological Survey; Beginning date: 19990201; Ending date: 20131101; Progress: In work; Maintenance and update frequency: As needed | Organization: U.S. Geological Survey; Beginning date: 20040409; Ending date: 20111111; Progress: In work; Maintenance and update frequency: Every 5 years | Raquel Hunt, Federal Railroad Administration | U.S. Department of Agriculture, NRCS |
| Temporal accuracy | 2013 growing season | 01/11/2013 | 2011 edition, amended 2014 | 03/21/2014 | 09/16/2014 |
| Attribute accuracy | 84.7% (Kappa 0.772) | N/A | N/A | N/A | N/A |
The data sets we worked with were quite interesting, and automating the process with Python code was a nice introduction to using Python as a tool in our everyday GIS repertoire. From the data table I constructed, it is evident that many data sources fall short when it comes to reliable metadata, and that having all the necessary components is very beneficial when working closely with certain databases. Now that I have gotten my feet wet with Python, I feel more confident in my ability to code for future projects and labs and to work closely with downloaded datasets.




