Midterm Review
Terms in this set (88)
T/F Topological Relationships are sensitive to geometric transformations
False
Name topological relationships
Dimensionality, adjacency, connectivity, containment, coincidence
Why would you want to geocode some data
Translate some description of a place to an unambiguous location on earth's surface
The ________ the score the ________ suitable the area can be
Higher; more
Complete and explain: Left House ______, ______ House Number
Number, Right
- defines the range of address numbers on each side of a street segment
(garbled practice numbers from an address-range interpolation example: 775 110 / 756 - 210 = / 876 210 / 645 / 546 / 666)
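The address-range idea can be sketched in code. This is a minimal illustration of linear address interpolation along a street segment; the helper name and the numbers are hypothetical, not taken from any geocoding library:

```python
def interpolate_address(house_number, from_number, to_number):
    """Estimate the fractional position (0.0 to 1.0) of a house number
    along a street segment whose stored address range runs from
    from_number to to_number (illustrative helper)."""
    if to_number == from_number:
        return 0.0
    return (house_number - from_number) / (to_number - from_number)

# e.g. house 160 on a segment whose range is 110-210:
pos = interpolate_address(160, 110, 210)
print(pos)  # 0.5 -> the point is placed halfway along the segment
```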
Affine Type transformations
Scaling, Skewing, Rotation, Translation, Projection
T/F: There can never be too many links with which to georeference an image
False
Interpolation can provide ____ given a set of known _____ for other locations
values and values
Higher order transformation can introduce ______ in some areas while improving ______ in others
uncertainty, accuracy
Name scope of map algebra operations
local, focal, zonal, global
Name the hard versions of weighted overlay
constraint/feasibility analysis
Name the soft versions of weighted overlay
possibility/suitability analysis
Kriging assumes distances or direction between sample points show a ______ ______ that help describe the surface
spatial correlation
Define cost surface
- measures the cost of traversing the physical distance
- uses a cost or impedance value to traverse each cell
- needs a cost grid as an input
- not analogous to network distance paths
Semivariogram
nugget: small y value (the y-intercept)
range: x value beyond which the curve levels off
sill: large y value (the plateau)
MUT
m = matched
u = unmatched
t = tied (multiple candidates)
convex hull
rubberband (This is taken as the convex polygon of smallest area that completely encloses the set)
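The "rubber band" definition can be made concrete with Andrew's monotone chain algorithm, one standard way to compute a convex hull. This is a generic sketch, not tied to any GIS package:

```python
def convex_hull(points):
    """Andrew's monotone chain: returns the vertices of the smallest
    convex polygon enclosing the point set, in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):  # z-component of (a - o) x (b - o)
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left-to-right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right-to-left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # endpoints shared, drop duplicates

# interior point (1, 1) is excluded from the hull:
print(convex_hull([(0, 0), (2, 0), (1, 1), (2, 2), (0, 2)]))
# [(0, 0), (2, 0), (2, 2), (0, 2)]
```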
3 key steps of kriging
1. exploratory statistical analysis of data
2. variogram modeling
3. creating the surface based on the variogram
Topology
- Defines spatial relationships between features
- Relationships survive stretching and warping (e.g. crossing lines)
- Topological relationships are insensitive to geometric transformations
- - Includes projections, stretching, and warping
How does spatial analysis rely on topological structures?
topology answers commonly asked questions in spatial analysis (adjacency, connectivity, containment, or coincidence at some point), making for a stronger representation.
Effective Topological Structures
?
How do shape-files deal with topological relationships?
?
Adjacency
defined as the touching of features such as land parcels, counties, and nation-states
-Compare common nodes and arcs
-Compute "on the fly"
-Compare nodes of adjacent features; no table needed
-Directionality is harder
-Adjacency Tables/Lists
-"Brute force" approach
-For every pair of features, if they intersect, store the features' index values in a list
General goal of geocoding
Translation of some description of a place to an unambiguous location on the earth's surface
Geocoding inputs
address list (table file, list)
place name list
xy coordinates
Geocoding Service
Needs rules for reading the address data, matching the address data to the reference data, and creating output
Address Locator
the tool run to provide location information, based on a reference
AL Style: template that translates between understood formats and your data
Can be created in database, file folder
Local use vs. Distributed use
Relies on a reference table, your own data with fields defined
Needs an address locator style
Reference Data
the GIS dataset that you find that will help build the locator
When is georeferencing required?
When there isn't spatial data (i.e. an old aerial photo)
Have a "key" layer that is already georeferenced in the desired output coordinate system
Geocoded feature class
output feature class, interactive
Bilinear interpolation
average value of 4 nearest cells in the untransformed data, weighted by distance to the transformed cell location.
Use for continuous data
"Smoothing" function for data
Used for elevation surfaces
Cubic convolution
average value of 16 nearest cells in the untransformed data weighted by distance to the transformed cell location.
Use for continuous data
"Smoothing" function for data, but sharper than bilinear
Used for aerial imagery
Nearest Neighbor
takes the value from the cell closest to the transformed cell.
Maintains original values, so good for categorical data
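The three resampling rules can be sketched for a single output cell. This is a generic illustration of the math, not any package's implementation; the grid values are invented:

```python
import numpy as np

def bilinear_sample(grid, x, y):
    """Bilinear resampling: the value at fractional position (x, y) is
    the distance-weighted average of the 4 surrounding cells. Suited to
    continuous data such as elevation."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1-dx)*(1-dy)*grid[y0, x0]   + dx*(1-dy)*grid[y0, x0+1] +
            (1-dx)*dy    *grid[y0+1, x0] + dx*dy    *grid[y0+1, x0+1])

def nearest_sample(grid, x, y):
    """Nearest neighbor: takes the closest cell's value unchanged,
    so categorical values are preserved."""
    return grid[int(round(y)), int(round(x))]

elev = np.array([[10.0, 20.0],
                 [30.0, 40.0]])
print(bilinear_sample(elev, 0.5, 0.5))  # 25.0 -- mean of the 4 cells
print(nearest_sample(elev, 0.4, 0.6))   # 30.0 -- snaps to cell (row 1, col 0)
```

Cubic convolution follows the same pattern but averages the 16 nearest cells with a sharper weighting kernel.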
Non-convex hull
An alternative bounding polygon, can be based on any criteria, but common methods are: expansion, contraction, and density contouring
Minimum Bounding Rectangle
Uses the minimum X-coordinate, Min Y-coordinate, Maximum X-Coordinate, and Maximum Y-coordinate to create a rectangle to include all the data.
Point-In-Polygon Operation
determines if point is within a polygon
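A common way to implement the point-in-polygon test is ray casting: cast a ray from the point and count edge crossings; an odd count means inside. A generic sketch (function name is illustrative):

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting point-in-polygon test.
    polygon is a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):                       # edge spans the ray
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:                             # crossing to the right
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True
print(point_in_polygon(5, 2, square))  # False
```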
Containment
when a point lies inside rather than outside an area
Main Transformation Applications
-Creation of a new feature class in another projection
-Changing datums
-Changing coordinate systems
-Changing units of measurement
-"Georeferencing" (Warping; Rubber-sheeting)
Affine Transformation Formula / 6 Coefficient locations / Function of formula
x' = Ax + By + C
y' = Dx + Ey + F
-x, y are coordinates of input layer
-x', y' are the transformed (new) coordinates
-A = x-scale; dimension of a pixel in map units in x direction
-B, D = rotation terms
-C, F = translation terms; x,y map coordinates of the center of the upper left pixel
-E = negative of y-scale; dimension of a pixel in map units in y direction
-Function: assigns input points new locations to carry out the transformation
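The six-coefficient formula maps directly onto a 3x3 matrix, so the transform is a single matrix product. The coefficient values below are invented for illustration:

```python
import numpy as np

# Affine transform x' = Ax + By + C, y' = Dx + Ey + F
A, B, C = 2.0, 0.0, 100.0   # x-scale, rotation term, x-translation
D, E, F = 0.0, -2.0, 500.0  # rotation term, negative y-scale, y-translation

M = np.array([[A, B, C],
              [D, E, F],
              [0, 0, 1]])   # homogeneous-coordinate form

def transform(x, y):
    xp, yp, _ = M @ np.array([x, y, 1.0])
    return xp, yp

# x' = 2*10 + 0 + 100 = 120, y' = 0 - 2*10 + 500 = 480
print(transform(10, 10))
```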
When do you use nearest neighbor, bilinear interpolation, or cubic convolution?
Nearest Neighbor: maintains original value, so good for categorical data
Bilinear Interpolation: used for continuous data that "smooths" the data over, so good for elevation data
Cubic convolution: also use for continuous data that "smooths" data (but sharper than bilinear), so good for aerial imagery.
Affine Transformation
A combination of linear transformations that converts digitizer coordinates into Cartesian coordinates.
Translation
The transformation that changes the origin of the polygon. Shape is kept, just moved.
Rotation
The transformation that rotates polygon about its origin.
Scaling
The transformation that expands or contracts the polygon.
2nd/3rd order polynomial transformations
?
Local Analysis / when to use
The value generated in the output raster is a function of cell values at the same location in the input layers. When you take the temperature average in each cell using two raster grids, this is an example of a local operation. (operation on one pixel at a time)
Use to combine single-theme grids, frequency, statistical analysis, change detections, boolean, and conversion of units.
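The temperature-average example is one line of array math: each output cell depends only on the co-located cells in the inputs. A sketch with invented values:

```python
import numpy as np

# Local operation: cell-by-cell mean of two temperature grids.
temp_jan = np.array([[ 2.0,  4.0], [ 6.0,  8.0]])
temp_jul = np.array([[22.0, 24.0], [26.0, 28.0]])

temp_mean = (temp_jan + temp_jul) / 2   # each output cell uses only
print(temp_mean)                        # the same location in the inputs
# [[12. 14.]
#  [16. 18.]]
```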
Focal Analysis / when to use
value of the output cell determined by the cells in a specified neighborhood around each input cell (group of pixels at a time/neighborhood operation)
Used in image processing (ex. smoothing, moving window, kernel)
Convolution, kernel and moving windows are examples of image processing techniques that use focal operations.
Zonal Analysis / when to use
value of each output cell determined by all the input cells of the same zone (operation on a zone/common group of pixels)
An example of a zone could be a watershed. When you want to calculate the total mean volume of precipitation in each watershed zone, this is an example of when you would use a zonal operation.
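The watershed example needs two layers, a zone layer and a value layer, and produces one statistic per zone. A sketch with invented zones and precipitation values:

```python
import numpy as np

# Zonal operation: one output value per zone, computed from all value
# cells inside that zone (e.g. mean precipitation per watershed).
zones  = np.array([[1, 1, 2],
                   [1, 2, 2]])                       # zone layer
precip = np.array([[10., 20., 30.],
                   [30., 40., 50.]])                 # value layer

zonal_mean = {int(z): float(precip[zones == z].mean())
              for z in np.unique(zones)}
print(zonal_mean)  # {1: 20.0, 2: 40.0}
```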
Operators you can use when constructing Map Algebra
?
Default Neighborhood Type when using focal analysis
rectangle, 3 by 3
Other Neighborhood types when using focal analysis
RECTANGLE (varying width and length),
CIRCLE (varying radius),
ANNULUS (varying inner radius and outer radius),
WEDGE (varying radius, start angle, and end angle)
Focal Operators
Sum: add all touching cells together
Mean: add all touching cells together, then divide by the number of cells
Minimum: the touching cell with the smallest value
Maximum: the touching cell with the largest value
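A focal mean with the default 3x3 rectangular neighborhood can be sketched as a moving window; this generic version simply shrinks the window at the grid edges (packages differ in how they handle edges):

```python
import numpy as np

def focal_mean(grid, size=3):
    """Focal (neighborhood) mean using a size x size rectangular window
    centered on each cell; edge windows are clipped to the grid."""
    rows, cols = grid.shape
    half = size // 2
    out = np.empty_like(grid, dtype=float)
    for r in range(rows):
        for c in range(cols):
            window = grid[max(r-half, 0):r+half+1, max(c-half, 0):c+half+1]
            out[r, c] = window.mean()
    return out

g = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
print(focal_mean(g)[1, 1])  # 5.0 -- mean of the full 3x3 window
```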
How are distances measured using raster analysis?
Distance Measure Operations calculate distances away from cells designated as source cells
Euclidean Operations calculates the shortest straight distance from each cell to its nearest source cell (EucDistance) and assigns each cell the value of its nearest source cell (EucAllocation)
Distances follow a node-link relationship where nodes are cell centroids and links connect nodes. Links can be lateral or diagonal.
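The EucDistance idea can be sketched with a brute-force computation: for each cell, take the minimum straight-line distance to any source cell. This is a teaching sketch, not the production algorithm any package uses:

```python
import numpy as np

def euclidean_distance(source_mask, cellsize=1.0):
    """For every cell, the straight-line distance (in map units) to the
    nearest cell flagged True in source_mask."""
    rows, cols = source_mask.shape
    src = np.argwhere(source_mask)           # (row, col) of each source cell
    rr, cc = np.mgrid[0:rows, 0:cols]
    # distance from each cell to every source; keep the minimum
    d = np.sqrt((rr[..., None] - src[:, 0])**2 +
                (cc[..., None] - src[:, 1])**2).min(axis=-1)
    return d * cellsize

mask = np.zeros((3, 3), dtype=bool)
mask[0, 0] = True                            # single source cell
print(euclidean_distance(mask)[2, 2])        # ~2.828 (= 2 * sqrt(2))
```

An EucAllocation-style output would instead record, per cell, which source was nearest.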
Reclassification and Weighted overlay difference, how do they work together?
Reclassification: the process of taking input cell values and replacing them with new output cell values. Reclassification is often used to simplify or change the interpretation of raster data by changing a single value to a new value, or by grouping ranges of values into single values (for example, assigning a value of 1 to cells with values of 1 to 50, 2 to cells from 51 to 100, and so on).
Weighted Overlay: A technique for combining multiple rasters by applying a common measurement scale of values to each raster, weighting each according to its importance, and adding them together to create an integrated analysis.
Weighted overlay operates on rasters that need integer data; if your data are continuous, use reclassification to convert them to integer classes, then proceed with the overlay.
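The two steps can be sketched together: reclassify each continuous raster to a shared integer scale, then take a weighted sum. The class breaks, weights, and example values below are all invented for illustration:

```python
import numpy as np

slope = np.array([[ 5., 25.],
                  [45., 10.]])                  # degrees (invented)
dist  = np.array([[900., 100.],
                  [400.,  50.]])                # meters to roads (invented)

# Reclassification: group value ranges into integer classes 1-3.
slope_cls = np.digitize(slope, bins=[15, 35]) + 1   # <15 ->1, 15-35 ->2, >=35 ->3
dist_cls  = np.digitize(dist,  bins=[200, 600]) + 1 # near ->1 ... far ->3

# Weighted overlay: weights express relative importance and sum to 1.
suitability = 0.6 * slope_cls + 0.4 * dist_cls
print(suitability)
```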
Weighted overlay and Site Suitability
-Usually, higher scores indicate "better"
-Make sure "Scale" values are representative of reclassified values
-More broadly, classification does involve some loss of data resolution
-All elevation values between 1500 and 2000 -> 1
-Would an aspect value of 180 have the same true suitability as an aspect value of 158?
-Only use Weighted Analysis if inputs carry uneven influence!
Map Algebra
algebraic expression of Spatial Analyst tools, operators, and functions to perform geographic analysis on raster data
Operator
allows mathematical, selection, or relational operations to be performed on rasters and scalars
Local Analysis
operation on one pixel at a time
Focal Analysis
group of pixels at a time, neighborhood operation
Zonal Analysis
operation on a zone (common group of pixels)
Zone Layer
defines zone
Value Layer
contains input cell values
Source Cell
cell designated the starting point in a distance operation
Destination cell
cell designated the end point in a distance operation
Cost Surface
number of units traversed (x-axis) × cost per unit k (y-axis)
Cost Distance
-Measures the cost for traversing the physical distance
-Uses a cost or impedance value to traverse each cell
-Need a cost grid as an input
-Not analogous to network distance paths
Accumulated Cost Surface
-Find the "least"
-Usually an iterative problem
-Spread algorithm, using immediate neighbors
-Queen contiguity/moves from cell (8 directions)
-Represent points and costs at cell centers
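The spread idea (iterating over immediate queen-contiguity neighbors until costs stop improving) is essentially Dijkstra's algorithm over an 8-connected grid. A sketch assuming a common convention: a move costs the mean of the two cells' impedance values times the move length (1 lateral, sqrt(2) diagonal):

```python
import heapq, math

def accumulated_cost(cost, sources):
    """Dijkstra-style spread: least accumulated cost from any source
    cell to every cell, with queen (8-direction) moves."""
    rows, cols = len(cost), len(cost[0])
    acc = [[math.inf] * cols for _ in range(rows)]
    pq = [(0.0, r, c) for r, c in sources]
    for _, r, c in pq:
        acc[r][c] = 0.0
    heapq.heapify(pq)
    while pq:
        d, r, c = heapq.heappop(pq)
        if d > acc[r][c]:                      # stale queue entry
            continue
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    # move length (1 or sqrt(2)) * mean impedance of both cells
                    step = math.hypot(dr, dc) * (cost[r][c] + cost[nr][nc]) / 2
                    if d + step < acc[nr][nc]:
                        acc[nr][nc] = d + step
                        heapq.heappush(pq, (d + step, nr, nc))
    return acc

grid = [[1, 2],
        [1, 4]]
acc = accumulated_cost(grid, sources=[(0, 0)])
print(acc[0][1])  # 1.5 -- lateral move costing (1 + 2) / 2
```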
General idea behind interpolation and when is it employed?
The estimation of surface values at unsampled points based on known surface values of surrounding points.
Interpolation can be used to estimate elevation, rainfall, temperature, chemical dispersion, or other spatially-based phenomena. Interpolation is commonly a raster operation, but it can also be done in a vector environment using a TIN surface model. There are several well-known interpolation techniques, including spline and kriging.
Required Inputs for Interpolation
1) estimated weights
2) known/measured values
Interpolation equation
zj = Σi λi·zi, where zj is the z-value to be estimated for location j, the λi are a set of estimated weights, and the zi are the known (measured) values at points (xi, yi). Because zj is a simple weighted average, an additional constraint is required to ensure the weights sum to 1: Σi λi = 1.
How does the sampling of points (and their known values) impact the output of interpolation?
?
Inverse distance weighting vs. natural neighbor
IDW: divides each of the observations by the distance it is from the target point raised to a power α (good because it's common, quick, and relatively easy to understand; BUT interpolated values are limited by the range of the data, and the output has a classic "bull's-eye" look)
Natural neighbor: uses areas of influence by generating Voronoi polygons around input points (generally smoother than IDW; instead of weights based on distance, weights are based on area coverage; "borrowed" values)
Kriging vs. deterministic methods
A geostatistical interpolation method to estimate unknown z-values from known z-values. The difference from IDW and natural neighbor is that you need to know the trends and patterns of the data, which requires some data exploration. (Kriging incorporates weights given by inverse distance AND spatial autocorrelation; sadly, you still HAVE TO KNOW about spatially correlated distance or directional bias in the data.)
Kriging Steps
1) Exploratory statistical analysis of data
2) Variogram modeling
3) Creating the surface based on variogram
Universal Kriging vs. Ordinary Kriging vs. Simple Kriging
Ordinary kriging: the constant mean is unknown, and there is no underlying trend
Universal kriging: there is a global trend in your data, and it is accounted for
Simple kriging: there is a known constant mean and no underlying trend
Surface Interpolation
?
Deterministic Methods
-Assign values to locations based on nearby point values and on a mathematical formula.
-Inverse Distance Weighting
-Natural Neighbor
Geostatistical methods
-Able to produce a prediction surface that incorporates spatial autocorrelation, along with a measure of accuracy/uncertainty
-Kriging
Inverse Distance Weighting
By inverting the distance among spatial features, and using that inverted value as a weight, near things have a larger weight or influence than things that are farther away.
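The inverted-distance weighting can be written out directly, following the weighted-average formula above (weights 1/d^α, normalized to sum to 1). A generic sketch with invented sample points:

```python
import numpy as np

def idw(xy_known, z_known, xy_target, power=2):
    """Inverse distance weighting: each known value is weighted by
    1 / distance**power, with the weights normalized to sum to 1."""
    d = np.linalg.norm(np.asarray(xy_known, float) - np.asarray(xy_target, float),
                       axis=1)
    if np.any(d == 0):                     # target coincides with a sample
        return float(z_known[int(np.argmin(d))])
    w = 1.0 / d**power
    return float(np.sum(w * np.asarray(z_known)) / np.sum(w))

pts = [(0, 0), (2, 0)]
z   = [10.0, 30.0]
print(idw(pts, z, (1, 0)))  # 20.0 -- equidistant samples, simple average
```

Note the "limited by the range of the data" drawback: the output can never fall outside [min(z), max(z)], which is what produces the bull's-eye pattern around extremes.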
Natural Neighbor
An interpolation method for multivariate data in a Delaunay triangulation. The value for an interpolation point is estimated using weighted values of the closest surrounding points in the triangulation. These points, the natural neighbors, are the ones the interpolation point would connect to if inserted into the triangulation.
Kriging
A geostatistical interpolation method to estimate unknown z-values from known z-values
Variogram
how variance increases as distance increases
The variogram is defined as the variance of the difference between two variables at two locations. The variogram generally increases with distance and is described by nugget, sill, and range parameters. If the data is stationary, then the variogram and the covariance are theoretically related to each other.
Semivariogram
The variogram divided by two.
Sill
the parameter of a variogram or semivariogram model that represents a value that the variogram tends toward when distances become large.
Range
A parameter of a variogram or semivariogram model that represents a distance beyond which there is little or no autocorrelation among variables.
Nugget
A parameter of a covariance or semivariogram model that represents independent error, measurement error, or microscale variation at spatial scales that are too fine to detect.
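The empirical semivariogram behind these parameters can be sketched from its definition, gamma(h) = (1 / 2N(h)) · Σ (zi − zj)² over the N(h) point pairs roughly h apart. A minimal sketch with invented sample points; the lag bins are arbitrary:

```python
import numpy as np

def empirical_semivariogram(xy, z, bins):
    """Mean squared difference (halved) between point pairs, grouped
    into lag-distance bins; returns one gamma value per bin."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    i, j = np.triu_indices(len(z), k=1)          # every unordered point pair
    h = np.linalg.norm(xy[i] - xy[j], axis=1)    # pair separation distance
    sq = (z[i] - z[j]) ** 2
    which = np.digitize(h, bins)
    return [float(sq[which == b].mean() / 2) if np.any(which == b) else np.nan
            for b in range(1, len(bins))]

xy = [(0, 0), (1, 0), (2, 0)]
z  = [1.0, 2.0, 4.0]
gamma = empirical_semivariogram(xy, z, bins=[0, 1.5, 2.5])
print(gamma)  # gamma = [1.25, 4.5] for the two lag bins
```

A model curve (spherical, exponential, etc.) fitted to these points is what yields the nugget, range, and sill used by kriging.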
anisotropy
how direction influences spatial autocorrelation, in addition to distance
A property of a spatial process, or data in which spatial dependence (autocorrelation) changes with both the distance and the direction between two locations.