Wednesday, December 21, 2016

Activity 12: Pix4D Mapper


Introduction
In order for the Pix4Dmapper software to process imagery, it needs matched keypoints: two points from two different images that represent the same location. The more matched keypoints there are, the more accurately 3D points can be computed, so a high level of overlap is required for an accurate output. The recommended overlap is 75% frontal and 60% side. For areas with little visual content, such as snow and sand, frontal overlap should be at least 85% and side overlap at least 70%. The exposure should also be set to capture as much contrast as possible. Rapid check is a processing option that quickly produces low-resolution outputs so image quality and completeness can be assessed while still onsite. Pix4Dmapper can process multiple flights; however, the pilot needs to maintain a similar altitude across flights. Oblique images can also be processed; this data should be collected with the camera at a 45° angle, with succeeding images taken higher and at a decreased angle. To combine oblique imagery with nadir images it is highly recommended to have GCPs and/or tie points, although GCPs are not required by Pix4D. After each step of processing a quality report is displayed. The quality report is a very comprehensive summary of how the process went; it includes a quality check, a preview of the images produced, the number of overlapping images throughout the output, and much more. This lab is designed to introduce these basic functions of the program using imagery of the Litchfield mine, southwest of Eau Claire (figure 1).
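To make the overlap guidance concrete, here is a minimal sketch of how capture spacing follows from a target overlap, assuming a simple nadir camera model; the sensor size, focal length, and altitude below are made-up illustrative numbers, not Phantom specifications.

```python
# Sketch: photo spacing needed to hit a target forward overlap,
# assuming a simple nadir camera model (hypothetical numbers).

def ground_footprint(sensor_dim_mm, focal_mm, altitude_m):
    """Ground coverage of one image dimension at a given altitude."""
    return sensor_dim_mm / focal_mm * altitude_m

def capture_spacing(footprint_m, overlap_frac):
    """Distance between exposures for the requested overlap fraction."""
    return footprint_m * (1.0 - overlap_frac)

footprint = ground_footprint(sensor_dim_mm=8.8, focal_mm=8.8, altitude_m=100)
print(round(capture_spacing(footprint, 0.75), 2))  # 75% frontal overlap
print(round(capture_spacing(footprint, 0.85), 2))  # 85% for snow/sand
```

Tightening the overlap from 75% to 85% shrinks the allowed distance between exposures, which is why low-texture areas such as snow require more photos over the same ground.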
Figure 1. The study area


Methods
For this lab, Pix4Dmapper Pro was used to create a 3D image of the Litchfield mine from a series of images collected with a Phantom UAV. To do this, a new project was started in Pix4D and the images from the flight were uploaded into the project. A quality report was automatically generated after processing finished. The image was then used to experiment with a few of the various tools available in Pix4Dmapper: a line measurement, a surface area, a volume calculation, and a video tour of the study area were all made. The image was also brought into ArcScene to create another 3D rendering of the imagery.

Results/Conclusions

Pix4Dmapper is a relatively user-friendly program. The import and manipulation of imagery is rather straightforward, though the quality report produced on import provides so much information that it can be a bit overwhelming. The measurements taken provide a reference for the overall size of the mine. Figure 2 displays the locations of the measurements: the line length is 19.5 m, the surface area measured is 66.23 m², and the volume measured is 32.93 m³ ± 1.44 m³. Figure 3 is the 3D rendering of the study area made in ArcScene, with the image itself set as the floating surface to improve the appearance. Finally, a video was made to give a virtual tour of the image produced in Pix4D, attached below.

Figure 2. Imagery with locations of measurements taken

Figure 3. ArcScene rendering of imagery





 






Tuesday, December 6, 2016

Field Activity 11: Point Features Survey Using a Dual Frequency GPS

Introduction
This lab is designed to provide experience using a survey-grade GPS, which is accurate to within centimeters. A mapping-grade GPS can have error of up to a meter, and some projects demand greater precision. For example, the mapping of a cemetery could not tolerate error of that extent, as overlap of burial plots could occur. For this lab the class surveyed a small section of the campus mall (figure 1) and mapped the results.

Figure 1. Reference map for survey area

Methods
To begin this lab, the components of the survey GPS were explained. The GPS itself is mounted at the top of a tripod apparatus, and a handheld component communicates with the GPS unit over Bluetooth. The class paired up and each pair took two sets of points. One member positioned the tripod where the point was to be recorded by staking the front leg into the ground and then letting out the length of the other two legs as needed to place them into the ground as well. The tripod was then leveled using a bubble level attached to the tripod at about waist height. When the tripod was in position, the partner holding the handheld device recorded the point (figure 2). This resulted in a total of 20 points collected in a random, stratified manner intended to emphasize the area of relief in the study area. The information collected was then transferred into a text file used to create a point feature class in ArcMap. The point feature class was then used to create a series of continuous surfaces using the various interpolation methods used in Field Activity 5.
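The transfer step above amounts to parsing the exported text file into coordinate records before building the feature class. A minimal sketch of that parsing is below; the column layout and the coordinate values are hypothetical, since the actual export format was not recorded.

```python
# Sketch: parsing a survey text file into point records before building
# a feature class. The column layout and values here are hypothetical.
import csv, io

sample = """point,easting,northing,elev
1,621405.12,4963210.55,247.81
2,621398.77,4963215.02,248.10
"""

def read_points(text):
    """Return a list of (id, easting, northing, elevation) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [(int(r["point"]), float(r["easting"]),
             float(r["northing"]), float(r["elev"])) for r in reader]

points = read_points(sample)
print(points[0])
```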

Figure 2. Data collection


Results/Discussion

In the sandbox survey performed in Field Activity 4, 434 points were collected in a one-square-meter area. Each of the interpolation methods resulted in an accurate portrayal of the surface, and the merits of each method could be evaluated based on those results. In this lab, only 20 points were collected over a much larger area. The interpolation methods IDW, kriging, and spline employ algorithms not recommended for random sampling schemes, as they will overrepresent areas more densely populated with survey points. The TIN interpolation method is not recommended for areas away from survey points. While none of the interpolation methods employed were advisable for the sampling scheme used, they should still have resulted in a surface representative of the actual one. Figure 3 depicts much of the survey area, which is mound shaped. When compared to figures 4, 5, 6, and 7, one can see that the lack of sufficient sample points resulted in inaccurate representations from every interpolation method. Perhaps if the number of points had merely been doubled to 40, an accurate continuous surface may have resulted. Furthermore, had it been included, the natural neighbor interpolation method would most likely have been the best fit for a stratified sampling scheme.
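The overrepresentation problem can be seen in the IDW formula itself: each estimate is a distance-weighted average of the known points, so clusters of nearby samples dominate. A minimal sketch (not the ArcMap implementation, which adds search radii and other options):

```python
# Minimal inverse-distance-weighted (IDW) interpolation sketch, illustrating
# why nearby (or densely clustered) samples dominate an estimate.
import math

def idw(known, x, y, power=2.0):
    """known: list of (x, y, z) samples. Returns the IDW estimate at (x, y)."""
    num = den = 0.0
    for kx, ky, kz in known:
        d = math.hypot(x - kx, y - ky)
        if d == 0:
            return kz  # exactly on a sample point
        w = 1.0 / d ** power
        num += w * kz
        den += w
    return num / den

samples = [(0, 0, 10.0), (10, 0, 20.0)]
print(idw(samples, 5, 0))  # midway between two equal-weight samples
```

With only 20 samples over a large area, most estimates sit far from any sample, so the weights are nearly uniform and fine relief is smoothed away.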

Figure 3. Surface of survey area





Figure 4. IDW interpolation
Figure 5. TIN interpolation




Figure 6. Kriging interpolation
Figure 7. Spline interpolation


Tuesday, November 29, 2016

Field Activity 10: Arc Collector 2

Introduction
A majority of US adults own a smartphone, which is capable of doing the same tasks a GPS is used for and can be online almost anywhere. The Arc Collector app utilizes these capabilities for accurate data collection in the field that can be shared online immediately. The previous lab dealt with data collection using a previously created database. In this lab the entire process will be done: first one must come up with a research question answerable using Arc Collector, then create a database, and then use it to conduct field research.
Information about people can be gathered by observing how they express themselves. People express themselves in many ways, from stickers on vehicles to the way they decorate their homes. One interesting method of self-expression is the use of lawn ornaments. In this lab, Arc Collector and ArcMap will be used to create a database with domains that aid in answering two questions: what proportion of homes in a given study area place lawn décor in their front yard as a method of self-expression, and of those, what percentage include animal lawn décor?

Study Area
The study area is a section in the southeast corner of the Third Ward neighborhood beginning at the corner of Roosevelt and Garfield. This section was selected because it is outside the college rental section of the neighborhood and a majority of the homes are permanent residences. It is assumed that homes with annually changing tenants are less likely to have personalized yards. The study was conducted on a weekday afternoon so as to arouse fewer suspicions, as a stop in front of, and observation of, each home would be required. The weather was warm but cloudy and misting; this was not planned, but it helped reduce the number of homeowners present during the survey.

Methods
After coming up with the research question, a polygon feature class of the study area can be digitized if necessary, and a list of attributes capable of answering the question needs to be made. This lab required at least three attribute fields: one text field for notes, one floating-point or integer field, and one with category items to choose from. The lawn décor survey includes four fields: two short integer, one category, and one text. The first short integer field indicates the presence or absence of lawn décor, in which 0 signifies absence and 1 signifies presence; this field allows calculation of the number of homes with lawn décor present. The other short integer field is for the number of items present, ranging from 0 to 20; it was assumed no front yard would exceed twenty items. The category field is for type, with options of animal, plant, other, combination (meant for yards with animals as well as other items), or none. This field is useful for calculating the percentage of homes with lawn décor that have at least one animal piece. Lastly, there is a text field for notes on anything the surveyor feels noteworthy. The ArcGIS for Collector web page has a very helpful tutorial with step-by-step instructions for creating the geodatabase with domains for each attribute, defining the feature class, setting up fields in the feature class to correspond with the previously created domains, and publishing the map as a service. After the process has been completed, one should be able to log onto ArcGIS Online, go to the My Content tab, find the map, and open it in Map Viewer. When the map is open, choose Save As and then share it with the class group. The map service should now appear in the list of options when the Arc Collector app is opened on a smartphone.
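The logic behind the domains can be sketched with plain Python: each field only accepts values from its allowed set. The field names below mirror this survey but are hypothetical, and the dict-based check is only an illustration of the coded-value idea, not the Esri API.

```python
# Sketch of the coded-value idea behind geodatabase domains: each field
# accepts only values from its allowed set. Field names are hypothetical.
domains = {
    "decor_present": {0, 1},                     # absence / presence
    "decor_count": set(range(0, 21)),            # 0-20 items assumed max
    "decor_type": {"animal", "plant", "other", "combination", "none"},
}

def validate(record):
    """Return the list of fields whose values fall outside their domain."""
    return [f for f, allowed in domains.items() if record.get(f) not in allowed]

print(validate({"decor_present": 1, "decor_count": 3, "decor_type": "animal"}))
print(validate({"decor_present": 1, "decor_count": 25, "decor_type": "deer"}))
```

Note that a free-text notes field has no sensible coded-value set, which is consistent with the problem described below when a domain was attached to the Notes attribute.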
It is a good idea to check that each attribute field is correct and functional (figure 1). For this survey the Notes attribute did not allow the actual insertion of notes. Upon further investigation it was found that, when domains are being created, the text field type only allows coded values, and no coded values were set for the Notes attribute. This was corrected by defining a new feature class and not associating a domain with the Notes attribute. The survey was then ready for data collection using Arc Collector. Only the front yards of houses on each street within the study area were surveyed. Items also serving a functional purpose, such as bird houses, planters, and chairs, were not included, nor were seasonal items. Wind chimes, while arguably functional, were included.
Figure 1.


Results
The published map created using Arc Collector was used to create a series of maps relating to the question of what percentage of people in a section of the Third Ward neighborhood have lawn décor pieces that are animals. A few survey points were taken just outside the study area, so the study area was corrected to include these points before the maps were created.

Fifty-two homes were surveyed in the study area, and of those 52, 21 homes had some type of lawn ornament present in their front yard (figure 2). Of the 21 homes with lawn décor, 52% had at least one piece in the form of an animal (figure 3). A graduated symbols map was made to reflect the quantity of lawn décor pieces at each house sampled (figure 4), and unique values were used to represent the type classification of each (figure 5).
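The percentages above follow directly from the reported counts, as this quick recomputation shows:

```python
# The survey proportions, recomputed from the counts reported above.
surveyed = 52
with_decor = 21
animal_share = 0.52          # 52% of decorated yards had an animal piece

print(round(with_decor / surveyed * 100, 1))  # % of homes with any lawn décor
print(round(animal_share * with_decor))       # approx. homes with animal décor
```

So roughly 40% of the surveyed homes displayed lawn décor, and about 11 of them included at least one animal piece.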
                
Figure 2.

             
Figure 3. 












Figure 4.
Figure 5.












Discussion

This turned out to be a very interesting study. The Third Ward is a peaceful, beautiful neighborhood with many qualities to be observed. In this surveyor's opinion, the best piece of lawn décor observed was a large stone turtle (figure 6). In addition to some interesting lawn décor pieces, there were also some eye-catching architecture and landscaping choices. There was also a section of the study area in which the homes' backyards became Putnam Park; from the road it could be observed that this made a most ideal backyard for someone who chooses to live in the city but wants to feel like they are in the country.
Figure 6.

Tuesday, November 15, 2016

Field Activity 9: Arc Collector

Introduction
A 2015 study by Pew Research Center found that 68% of U.S. adults owned a smartphone, 45% owned a tablet computer, and smartphone ownership rose to 86% among those 18-29 years old. The processing power of a smartphone or tablet is much greater than that of a GPS, so it makes sense that these convenient and almost omnipresent devices would make a good alternative to a standalone GPS device for GPS data collection. Applications such as Arc Collector even offer the option to collect data while offline, similar to a GPS: you can download a map or use a basemap for reference, and edits are updated once a connection has been reestablished. In this lab, groups will utilize the same map while online to collect micro-climate data that will be updated on the fly.

Study Area
The study area covers about 1.2 square kilometers and includes almost all of UWEC's main campus. It was divided into five zones: two zones were sampled by two groups each, one zone was sampled by one group, one zone was sampled by three groups, and one zone was not sampled at all (figure 1). There was a small amount of zone overlap, but this did not cause excess data collection in any of those areas. Some geographic features noteworthy for micro-climate assessment are the large ridge running along the southern border of zone three and along the borders of zones four and five (the basemap contains subtle contour lines that reveal this feature) and the two flowing bodies of water: Little Niagara Creek runs along the borders of zones two and three and then just within zone four before emptying into the Chippewa River, and the river itself constitutes a large area within zone one. There are also zones with many buildings, as labeled in figure 1, and an area within zone three that is forested.
 
Figure 1. Map of study areas with zones and group data collection

Methods
To begin this lab, everyone with a smartphone had to download the Arc Collector application, log into ArcGIS Online using a web browser to join the group containing the map necessary for data collection, and then log into the application to open the map in Arc Collector. The map included polygons dividing the study area into zones, and each group was assigned a zone in which to collect micro-climate attribute data at numerous locations. Each group received a Kestrel handheld ambient weather station (figure 2) and a compass to collect temperature, dew point, wind speed, and azimuth of wind direction. The groups consisted of two people: one for data collection and the other for input of data into Collector. Because everyone was online and data was being entered into a shared map, each group could see all points as they were added. The resulting point data as well as the zone polygons were then imported into ArcMap for analysis, and various interpolation methods were used to visualize variations between areas and predict values between points. The data was collected at random, with one zone almost completely lacking data points and the river constituting another area without them; the resulting dataset was not evenly spaced, with points clustered together in some areas and absent in others (figure 3).
Figure 2. Kestrel handheld weather station used to collect temperature, dew point, and wind speed
Figure 3. Map displaying distribution of data points collected
           


Results
Because the IDW interpolation method assigns values based on the values of known points nearby, it is not recommended for datasets with an uneven distribution of sample points such as this one. This is evident in the IDW interpolation of wind speed: the areas containing a higher density of samples are overrepresented, with excessive value variation radiating from areas of high density (figure 4). The anomaly found in the southeast section of the study area can be explained by wind direction and the presence of the steep ridge directly south of the points. The wind was generally blowing at an azimuth between 180 and 270 degrees, leaving the area of data collection protected from wind by the ridge. Spline is another interpolation method not recommended for data with over- and underrepresented areas; however, it does create a smooth transition between points and is ideal for gradually changing values. Temperature within a small study area like this changes only gradually with location, and the resulting interpolation contains relatively subtle change (figure 5). A few areas between points have been given values that are most likely inaccurate; these could be explained by temperatures taken near heat vents and temperatures taken in areas with lots of shade and water, making them cooler. The kriging method uses an algorithm similar to IDW to assign values to unknown points, but it also assumes correlation based on distance and direction from known points. This makes it more suitable for interpolating dew point, which is partially determined by the moisture in the air and will vary between points taken along the water, near the swampy wooded area, and on the higher elevation of upper campus; the distance between these points helps prevent spurious correlations.
The result is a gradually changing interpolation of dew point (figure 6), compared to the natural neighbor result, which produced distinct layers of change radiating from point clusters (figure 7).

Figure 4. IDW interpolation of wind speed, with azimuth compass
Figure 5. Spline interpolation of temperature
Figure 5. Spline interpolation of temperature        













Figure 6. Kriging interpolation of dew point
Figure 7. Natural neighbor interpolation of dew point













Conclusion

This lab offered a look at how data conveniently collected with Arc Collector can be easily analyzed and manipulated in ArcMap. The data analysis tools available in ArcMap make Arc Collector a very useful tool, but it is also useful for those who do not have access to ArcMap: Arc Collector itself can be used to create maps, collect data, and very accurately track where you've been.

Tuesday, November 8, 2016

Field Activity 8: Map and Compass Navigation

Introduction
As most people have already experienced, technology can fail and should not be relied upon exclusively. The ability to navigate with a map and compass is an essential skill for anyone hiking or backpacking through the wilderness. In this activity, groups will use the two navigation maps created in the previous week's lab and a compass to navigate to five points, given in Universal Transverse Mercator (UTM) coordinates, located in the wooded area surrounding UWEC's Priory. Both navigation maps contain two-meter contour lines and a grid; one uses the geographic coordinate system with a decimal degree grid, and the other is in a UTM projection with a 50-meter grid.
Study Area
The Priory is a residence hall and children's nature academy just over 3 miles south of UWEC's main campus, set on a 120-acre wooded lot containing a lot of relief. The navigation took place on a warm fall day, and the sky remained clear for the majority of the activity. Temperatures during the navigation remained at about 16° Celsius.

Methods
The class met at the Priory, and upon arrival each group received copies of their previously submitted maps, a course to navigate, a map compass, and a GPS unit. After a brief map compass tutorial (figures 1 and 2), each group member marked the five UTM coordinates on their map and compared points for accuracy. To assist in approximating distance traveled, the two group members who would assume the pace counter role took a pace count over a 100-meter stretch in the parking lot (the third group member had an injury that prevented hiking through some of the terrain). Before embarking on this adventure, three roles were defined: pace counter, azimuth control, and pace count recorder. The azimuth control would use the compass to determine the correct direction of travel and choose a landmark for the pace counter to travel to; the pace counter would then walk to the specified landmark while counting paces and call out the number taken for the recorder to document and keep a running total. The 100-meter pace count could be used to produce a rough estimate of the number of steps needed to reach each point based on the map's scale and the measured distance from point to point. It was now time to choose a starting point and determine the azimuth from it to the first point of the course. From a point easily locatable on the map, the map itself was laid on a flat surface, the compass was set on it and oriented true north by matching up the red arrow and the red outline below it, and the map was then rotated so that it was also oriented true north. The edge of the compass was then aligned with the anchor arrow at the starting point and the direction arrow aimed at the first destination; the compass housing was then rotated until the orienting lines lined up with the lines of longitude on the map, thus facing north.
The directional arrow now marked the bearing needed to reach the first point. After the pace counter reached each landmark, the azimuth control would walk to that location and reassess the bearing by holding the compass so that it faced due north; a new landmark could then be chosen based on where the directional arrow was pointing. The recorder kept a running total of steps taken, adding a rough estimate of extra distance to account for relief as it was traversed, and alerted group members when the point, marked by a pink ribbon (figure 3), should be nearby. The group member not responsible for pace counting would then conduct a brief reconnaissance to see if it could be located before sending the pace counter further. For each point the approximate bearing was recalculated and the process repeated. However, after successfully locating the first point, the second point proved elusive, and a pink ribbon was found on the ground some distance away from where the point was thought to be. In an attempt to move on, the bearing for point three was determined based on the assumed approximate location of point two. After searching a broad area around the assumed location of point three without success, the GPS was consulted to approximate the bearing and distance between the actual location and the marked location of point three. It was again assumed that another ribbon had been removed from its tree, and point four was approximated from the current location according to the GPS. After yet again failing to find a marked tree, class was almost over, and a hasty attempt to locate point five was made based on the current GPS location. These efforts were to no avail, and it was time to use the recently practiced map and compass navigation skills to return to the Priory, where the lone Dr. Hupy awaited our arrival.
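The pace-count method above is essentially dead reckoning: calibrate a pace length over a known distance, then turn each azimuth-and-paces leg into a position update. A minimal sketch, with made-up leg values rather than the actual course data:

```python
# Dead-reckoning sketch of the pace-count method. Azimuths are measured
# clockwise from north; the legs below are illustrative, not the real course.
import math

def pace_length(known_distance_m, paces):
    """Calibrated length of one pace from a measured stretch."""
    return known_distance_m / paces

def walk(start, legs, pace_m):
    """legs: list of (azimuth_deg, paces). Returns the final (x, y) position."""
    x, y = start
    for az, paces in legs:
        d = paces * pace_m
        x += d * math.sin(math.radians(az))   # easting component
        y += d * math.cos(math.radians(az))   # northing component
    return x, y

pl = pace_length(100.0, 125)                  # e.g. 125 paces over 100 m
print(walk((0.0, 0.0), [(90.0, 50), (0.0, 25)], pl))
```

Errors in pace length, compounded by relief, accumulate with every leg, which matches how quickly the group's position estimate drifted between points two and four.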


Figure 1. Compass similar to the one used



Figure 2. Compass with parts labeled









Figure 3. A point marked by a pink ribbon

Results/Discussion
The GPS track log mapped along with the plotted points reveals how accurately the course was navigated. Point one is the point furthest south, point two is the next one closest to it, and points three, four, and five follow in counter-clockwise order (figure 4). Oddly enough, it appears point one was missed, and perhaps the point located was part of a different course. It also seems point two would have been successfully located and was correctly assumed to have been removed. Navigation from point two to point three appears to be where things went wrong: the general direction of travel was accurate, but the distances traveled were not at all sufficient to reach points three or four. There are two likely explanations for this error: the approximation of point two's location may have been off, and the estimated additional steps based on relief were most likely off as well. Point five appears to have been in relatively close proximity; however, hasty searching that had extended beyond the class's meeting time may have caused the group to miss the final point. As for the navigation as a whole, another possible source of error is the overlooking of declination. A post-lab calculation for the day of the activity found it to be 1.14° W ± 0.39°, changing by 0.06° W per year. This is quite negligible and should not have altered the results greatly, as omission of declination should still have placed the group in close vicinity to each target.
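The declination correction, had it been applied, is a one-line adjustment; by the usual convention east declination is positive and west negative, so the 1.14° W figure above enters as -1.14:

```python
# Converting a magnetic azimuth to a true azimuth given declination
# (east declination positive, west negative).
def true_azimuth(magnetic_deg, declination_deg):
    return (magnetic_deg + declination_deg) % 360

print(true_azimuth(45.0, -1.14))   # 1.14° W declination, as computed post-lab
```

Over the distances walked here, a 1.14° bearing error shifts the endpoint by only about 2 m per 100 m traveled, supporting the conclusion that declination was not the main source of error.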

Figure 4. The course points mapped with the navigation track log

               
Conclusion

Of the two maps created, the UTM map was the most helpful. The feature of greatest assistance was the grid; the grid lines were left subtly visible on the map, making it easier to assess location. The points were plotted on this map, and the 50-meter intervals helped in assessing distance. After the GPS track log had been changed to this projection it also gave UTM coordinates, so the UTM map was also used in orienteering when a point seemed to elude the pace count method. After point one was located, all other points were within areas of great relief, and the contour lines were somewhat helpful when a point was thought to be nearby; they were used to decide whether a point should be at the base, summit, or side of a ravine. If given an opportunity to repeat the activity, subtle three-meter contour lines with a slightly more visible 25-meter grid might be of more use. Previous experience is also a very helpful asset in this sort of activity, and the knowledge to take distance change in relation to relief into account will be carried away from this lab. This lab was a fun, educational challenge that taught a valuable life skill to someone who can frequently be found hiking in the wilderness.

Tuesday, November 1, 2016

Field Activity 7: Development of a Navigation Map

Introduction
Stored within the hippocampus of the human brain is an inherent sense of direction that enables the process of navigation. A study done at University College London indicated that those who perform navigational tasks regularly have more developed hippocampi than those who do not (Maxwell, 2013). In order to utilize this inherent sense of direction to navigate successfully, a few additional tools are necessary. First, a location system is needed to identify location in reference to the surrounding area; location systems often employ a projected coordinate system to accomplish this, as such systems can be more precise than latitude and longitude for navigation at a large geographic scale. The second component for successful navigation is an actual navigational tool, such as GPS technology or a map. In this lab, two large-scale maps of UWEC's Priory on Eau Claire's southwest side (figure 1) will be created for future use in navigating the area. One map will use the Universal Transverse Mercator (UTM) coordinate system to give spatial information in meters, and the other will use the geographic coordinate system to display the same spatial information in decimal degrees. The UTM coordinate system is divided into 60 zones, each six longitudinal degrees wide (Esri). Each zone is then split at the equator to form north and south sections (figure 2); the navigation area falls into zone 15N. The UTM coordinate system is ideal for land navigation using large-scale maps because it is measured in meters and can be tied to a distance measuring system. The Transverse Mercator projection used by this coordinate system is a cylindrical projection that does not maintain direction on small-scale maps; however, it is appropriate here because the navigation map is large scale and falls within a single UTM zone.
The geographic coordinate system references latitude and longitude to identify location in decimal degrees. This is helpful information to have when using a GPS, as the technology uses this system to identify location.
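Because UTM zones are six degrees wide starting at 180° W, the zone number follows directly from longitude; Eau Claire sits near 91.5° W (an approximate value), which lands in zone 15 as stated above:

```python
# Computing a UTM zone number from longitude: zones are 6° wide,
# numbered 1-60 starting at 180° W.
def utm_zone(lon_deg):
    return int((lon_deg + 180) // 6) + 1

print(utm_zone(-91.5))   # Eau Claire area (approx. 91.5° W) -> zone 15
```

The N/S suffix (15N here) simply records which side of the equator the area falls on.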

Figure 1. Location of navigation activity from Google maps

Figure 2. UTM zones of the contiguous United States (nps.gov)


Methods
The area to be mapped was UWEC's Priory; a geodatabase with the navigational boundary, remotely sensed imagery, and contours was provided. Beginning with a blank ArcMap document, it is best to first copy and paste the geodatabase into a private folder connection where it can be altered. The first item added to the map was the navigation boundary, and its source information was inspected. The layer had a UTM projected coordinate system, making the UTM map the first one created. While creating these maps, it is important to bear in mind that they will be printed and used for navigation.
Before doing anything else, the layout was changed to an 11×17-inch landscape format. The next step was to create contour lines of the area; the land surrounding the Priory contains a lot of relief, and contour lines assist in spatial navigation. The existing 2-foot contour lines layer was added to the map for examination, but contours every two feet seemed a little excessive, so a new two-meter contour layer was created from one of the elevation model layers using the Contour tool. After this, each provided raster image of the navigation area was placed on the map to determine the best backdrop. One of the true color raster images seemed the best fit and was set to 40% transparency, allowing the contour lines to be more visible. The selected image then needed to be projected into the Transverse Mercator projection with the appropriate UTM coordinate system. With the navigation boundary, contour lines, and raster background in place, it was time to switch to layout view and add all the necessary elements. One of the trickier requirements for this map was the grid with appropriate labeling; it must be done in layout view and can be found in the data frame properties under the Grids tab. For the UTM map, make sure all layers are in the appropriate Transverse Mercator projection and that the grid is at 50-meter intervals on the X- and Y-axes. The remaining elements include a north arrow, scale, information about the projection and coordinate system, source information, and your name so no one else can take credit for all your hard work. The map with geographic coordinates in decimal degrees requires all the same elements; however, the appropriate projection tools must be used to change all layers to a geographic coordinate system. After this has been accomplished and all the UTM layers removed, a grid with decimal degree divisions can be placed over the map.
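Picking contour levels at a fixed interval is simple enough to sketch: take every multiple of the interval within the elevation range. This mirrors the idea behind the Contour tool, though the sample min/max below are made up rather than taken from the Priory elevation model.

```python
# Sketch of choosing 2 m contour levels from an elevation range
# (sample min/max values are hypothetical).
def contour_levels(z_min, z_max, interval=2.0):
    start = -(-z_min // interval) * interval    # first multiple >= z_min
    levels = []
    z = start
    while z <= z_max:
        levels.append(z)
        z += interval
    return levels

print(contour_levels(243.7, 252.1))   # -> [244.0, 246.0, 248.0, 250.0, 252.0]
```

Switching from 2-foot to 2-meter contours roughly triples the interval, which is why the coarser lines read more cleanly on a printed navigation map.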

Results/Discussion
Both maps were created with the goal of being user-friendly and usable by those without field navigation experience. The two resulting maps both include two-meter contours in hopes that relief can be used as a guide for spatial orientation in the field, and both include aerial imagery backgrounds to assist in orientation once on location. The UTM grid has finer spacing to facilitate tracking the distance navigated (figure 3). The GCS grid is not quite as fine, but given a current location in decimal degrees, one should be able to use the map to determine a relatively accurate location within the navigation area (figure 4).

Figure 3. Map with UTM grid
Figure 4. Map with GCS grid



Conclusion

The ability to perform a simple spatial navigation activity using a map is an important skill to have. Not only have studies shown that it promotes activity in the hippocampus, but they have also shown that this method of navigation may be better for the brain than reliance on a GPS (Maxwell, 2013). Furthermore, technology may fail, and one may need to rely on an inherent sense of direction to navigate out of a remote area, or to make it to a job interview using landmarks and directions. This lab challenges one to consider which map characteristics are useful for navigation and is an excellent precursor to an actual navigation activity.

Tuesday, October 25, 2016

Field Activity 6: Distance Azimuth Survey

Introduction
An accurate and sufficient survey of an area can sometimes be accomplished using a grid-based approach, but this method is not always ideal. Often the survey area is too large to effectively create an accurate grid for spatial sampling. An effective alternative is the use of a GPS receiver in conjunction with a total station, a surveying instrument containing an electronic distance meter as well as an electronic theodolite. A total station can measure very accurate horizontal and vertical angles and sloping distances, and can calculate the coordinates of an unknown point from a known point. While this combination of technology is a very accurate and efficient way to survey large areas, it is expensive and not always reliable. The price of a total station starts around $3,500, and a GPS is usually also needed to determine at least one known coordinate. Even if a person is fortunate enough to have access to such equipment, it should not be relied upon completely. Technology tends to fail on occasion, and it is important to have an alternative survey method. The distance azimuth method is one such alternative that can be done with less complex equipment. It requires at least one known coordinate point, and data is collected in relation to that known point; this is known as implicit data. With the coordinates of one point known, the distance azimuth method can be carried out using only a compass and a measuring tape. Accuracy and efficiency do, however, improve as the quality and availability of equipment increase.

Methods
This lab was performed in order to gain experience using the distance azimuth survey method. The study area was a section of Putnam Park on the UWEC campus (figure 1). A recreational-grade Bad Elf GPS was used to determine the coordinates of three known points within the study area. The implicit data gathered in this survey included azimuth and distance: from each given point, groups used a lensatic compass to determine azimuth and a TruPulse laser rangefinder to find the distance in meters to various trees in relatively close proximity. Other attribute data included tree species and diameter at breast height (DBH). To collect data efficiently while allowing each group member to utilize the equipment, the group delegated tasks and rotated through them. One group member selected a tree and measured DBH, one found the azimuth, another measured distance, another recorded the data, and tree species identification was a group effort.
In order to compile the data sets from all three coordinate points, the class created a shared Excel spreadsheet in which all groups could enter their collected data. The appropriate fields within the spreadsheet were set to a numerical data format, and the table was saved as a CSV file to facilitate the import into ArcMap. To begin mapping the survey in ArcMap, a new geodatabase was created and the table was imported into it. The table was then added to a map document as XY data and converted into a point feature class using the WGS84 coordinate system to coincide with the GPS coordinate values. When displayed over a world imagery basemap, two of the three points were not in their expected locations: point three was located near Interstate 94, and point two was about 30 yards north of its actual location. The points were corrected by using the basemap for reference and the editor to move them to more accurate positions. In order to map all the data collected, data management tools found in ArcToolbox were used. First, the Bearing Distance to Line tool was used to create lines from each origin to the surveyed trees (figure 2), and then the Feature Vertices to Points tool was used to create points at the vertices of those lines. The resulting points were then symbolized by tree species (figure 3). Finally, the created features were used to compile a map layout depicting all three study areas.
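The core geometry behind the Bearing Distance to Line step can be sketched in a few lines of Python. This is an illustrative stand-in for the ArcToolbox tool, not its actual implementation; the function name and sample values are assumptions, and it assumes a planar coordinate system such as UTM meters:

```python
import math

def destination_point(x0, y0, azimuth_deg, distance):
    """Endpoint of a survey shot given an origin, a compass azimuth
    (degrees clockwise from north), and a distance, in a planar
    coordinate system (e.g. UTM meters)."""
    az = math.radians(azimuth_deg)
    x = x0 + distance * math.sin(az)  # east component
    y = y0 + distance * math.cos(az)  # north component
    return x, y

# A 10 m shot due east (azimuth 90 degrees) moves only in x:
print(destination_point(0.0, 0.0, 90.0, 10.0))
```

Each tree point in the survey is essentially this calculation applied to the recorded azimuth and rangefinder distance from its origin point.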

Figure 1. Location of study area
Figure 2. Result of Bearing Distance to Line command
Figure 3. Result of Feature Vertices to Points command



Results/Discussion
A few issues were encountered during the initial import of the data into ArcMap. After creating a point feature class of the origin points, a world imagery basemap was added and zoomed to the study area's location, but the points were not visible within the study area. Using zoom to layer on the point feature class revealed the points to be located off the west coast of Africa; a fellow classmate's further investigation indicated the X and Y values had been transposed. After this error was corrected, the remaining inaccuracies could most likely be explained by GPS error. The study area was at the base of a large ridge to the south, which most likely interfered with the GPS. Previous groups participating in a similar lab used Google Earth to determine origin points and encountered the same trivial errors found in this lab once the transposed X and Y coordinates had been corrected. That error was fixed with the same level of ease as in this lab, by simply editing the point locations before running the other data management tools.
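Conceptually, the transposition fix is just a swap of the two coordinate fields before the feature class is re-created. A minimal Python sketch, with illustrative field names and coordinate values (not the actual survey data):

```python
# Latitude had been stored in the X (longitude) field and vice versa,
# which is why the points plotted off the west coast of Africa.
rows = [
    {"point_id": 1, "x": 44.87, "y": -91.50},  # values are illustrative
    {"point_id": 2, "x": 44.86, "y": -91.49},
]

# Swap the two fields in every record before importing as XY data.
for row in rows:
    row["x"], row["y"] = row["y"], row["x"]

print(rows[0])  # x now holds the longitude, y the latitude
```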
The features created by the data management tools were compiled and displayed over a world topographic basemap to create a final map layout depicting the origin of each survey within the study area as well as the distance to and species of each tree surveyed (figure 4). The data was also used to create a graduated symbols map based on the DBH of the trees (figure 5). The final results are quite accurate, but point three is slightly south of its actual location and some of the trees appear to be in the middle of the road. The result could have been more accurate if the initial points had been edited using the world topographic or streets basemap instead of the world imagery basemap.

Figure 4. Layout depicting tree species

Figure 5. Layout depicting tree DBH




Conclusion

In real-life surveying situations, the sample area is often too large for a grid-based survey. In these instances, surveyors often utilize expensive, high-tech equipment such as total stations. However, not all surveyors have access to this equipment, and when it is available it may fail to function properly. The distance azimuth survey proves to be an acceptable alternative when the study area is too large for a grid-based sampling method and resources are limited. As seen in technical reports from previous classes, the distance azimuth method can be utilized even when beginning a survey without known coordinates. This only requires that one be able to locate the origin(s) in Google Earth or some other application from which the coordinates of the location can be extracted.

Tuesday, October 18, 2016

Field Activity 5: Sandbox Survey Part 2, Visualizing and Refining a Terrain Survey

Introduction
Prior to this lab, each group was to create a terrain inside a sandbox of about one square meter (Figure 1) and was then tasked with devising a sampling scheme and technique for the creation of a digital elevation surface. Due to the small size of the study area, the grid scheme consisted of 6 x 6 cm squares, and systematic point sampling was used to gather elevation data at the X, Y intersections of each square. In cases of sharp elevation change within one square, two measurements were taken, and the X or Y coordinate of the extra point was given a 0.5 decimal place in the corresponding location. Data was manually recorded on a hand-drawn grid matching the one created over the sandbox. This method organized the data and assisted in data normalization as it was entered into an Excel spreadsheet. Data normalization involves the input of data into a database so that all related data is stored together, without any redundancy. Data in this Excel file was set to numeric (double) to accommodate the decimal values and keep all data in the same format. X, Y, and Z data for each grid cell was entered beginning at the first X column, from bottom to top and so forth; this pattern eliminated redundancy and kept all data inside one Excel file. In total, 434 data points were collected in a relatively uniform pattern throughout the entire study area, with some clustering of points in areas of rapid elevation change. These data points can be imported into ArcGIS or ArcScene, where interpolation can mathematically create a visual of the entire terrain based on the known values.
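The normalized records can be pictured as simple X, Y, Z triples, with half-step indices marking the extra measurements taken where elevation changed sharply. A minimal Python sketch of reading such records from a CSV export; the field names and values here are illustrative, not the actual survey data:

```python
import csv
import io

# Illustrative CSV export of the normalized spreadsheet; the row with
# x = 2.5 is an extra measurement inside a sharply changing square.
raw = """x,y,z
1,1,-4
2,1,-5
2.5,1,-9
3,1,-6
"""

points = [(float(r["x"]), float(r["y"]), float(r["z"]))
          for r in csv.DictReader(io.StringIO(raw))]
print(len(points), points[2])
```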

Figure 1


Methods
To begin this process, a geodatabase was created, and the Excel file, which had been set to a numerical data format, was imported into the geodatabase, brought into ArcMap as X, Y data, and then converted into a point feature class. After a point feature class has been made, the data can be used to create a continuous surface using interpolation methods. Four raster interpolation methods and one performed in a vector environment were tested to determine the ideal representation of the terrain. The four raster operations are: Inverse Distance Weighted (IDW), natural neighbors, kriging, and spline. The interpolation done in a vector environment is called TIN.
IDW and spline are deterministic methods: they are based directly on a set of surrounding points or on a mathematical calculation to produce a smooth continuous surface. IDW interpolation averages the values of known points in the area of the cell being calculated to assign it a value based on proximity to the known values; cells closer to known points are more influenced by the average than those farther away. The influence of known points can be controlled by defining a power: a larger power places more emphasis on near points and results in a more detailed map, while a lower power places emphasis on distant points as well, producing a smoother map. The produced values can also be varied by manipulating the search radius to control how far from the calculated cell points are considered. The IDW method would not be advantageous for random sampling, because areas with a higher density of samples would be better represented than areas with fewer points. Spline uses a mathematical function, and the resulting surface passes exactly through each sample point, making it an ideal method for large samples. The function creates a smooth transition from one point to the next and is well suited to interpolating gradually changing surfaces such as elevation, rainfall levels, and pollution concentration. Since this method passes the surface through each sample point, it is not ideal for sampling techniques that result in clustered areas or few sample points in proportion to the entire area, such as random and/or stratified sampling.
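The weighting idea behind IDW, including the effect of the power parameter, can be shown in a short Python sketch. This is a bare-bones illustration rather than ArcGIS's implementation, and it omits the search radius and the raster grid:

```python
import math

def idw(points, x, y, power=2.0):
    """Inverse Distance Weighted estimate at (x, y) from known
    (px, py, pz) samples: nearer samples get larger weights."""
    num = den = 0.0
    for px, py, pz in points:
        d = math.hypot(x - px, y - py)
        if d == 0:
            return pz  # exactly on a sample point
        w = 1.0 / d ** power
        num += w * pz
        den += w
    return num / den

samples = [(0, 0, 10.0), (10, 0, 0.0)]
print(idw(samples, 2, 0))           # power 2: strongly pulled toward the near sample
print(idw(samples, 2, 0, power=1))  # lower power: distant sample matters more
```

Raising the power makes the estimate at (2, 0) hug the nearby 10.0 sample more tightly, which is exactly the detail-versus-smoothness trade-off described above.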
Kriging is a multi-step, geostatistical method of interpolation that creates a prediction of a surface based on sample points and also produces a measure of the accuracy of those predictions. It assumes a correlation between points based on the distance and direction between them. Kriging is similar to IDW in that it weights points within a certain vicinity, but it goes beyond IDW and takes the spatial arrangement of the sample points into account, in a process called variography. While kriging goes a step beyond IDW to create a more accurate portrayal of a surface, like IDW it is not an advantageous method for sampling techniques that result in clusters of sample points, such as random sampling.
Natural neighbor interpolation utilizes a subdivision of the known points closest to the unknown point to interpolate, and it does not deduce patterns from the subdivision data. Hence, if the entered sample does not indicate the pattern of a ridge, valley, etc., the method will not create that feature. Natural neighbor interpolation is advantageous for samples collected using a stratified sampling method, due to the subdivisions within the study area. Random sampling, however, may poorly represent a specific surface, and natural neighbor would then produce a poor representation of that surface as well.
TIN, or triangulated irregular network, connects the sample points into a series of the smallest possible, non-overlapping triangles (a Delaunay triangulation, in which a circle drawn through the vertices of any triangle contains no other sample point). The triangle edges cause discontinuous slopes throughout the produced surface, which may result in a jagged appearance, and TIN is not an advisable interpolation method for areas far from the sample points.
Given the comprehensive and uniform collection of sample points in the terrain, each of these methods should produce a relatively accurate portrayal of it. However, the areas with more densely collected data may be over-represented by all interpolation methods except spline, because spline passes the surface through each data point and produces a smoothed surface in between the points. Further reading in ArcGIS Help indicates that spline is also ideal for large numbers of sample points and gradually changing surfaces, like elevation. This suggested it would be the most suitable method for the survey.

Results/Discussion
Prior to this activity, it was necessary to create a sandbox terrain and spatially sample it for elevation values. The entire terrain was only slightly larger than one square meter, so a systematic point sampling of the whole terrain was feasible. Systematic sampling within an X, Y plane consisting of 361 quadrats of 6 x 6 cm resulted in an Excel spreadsheet with 434 data points (Link to excel file). After the Excel file was imported into ArcGIS as X, Y data and converted into a point feature class (Figure 2A), it produced a collection of sample points that were uniform throughout most of the terrain, with areas of sharp elevation change more densely sampled. Each of the five interpolation methods described above was applied to the feature class to create a different continuous elevation surface, and each was analyzed for best fit to the survey.
For each method, the elevation from surfaces option was set to floating on a custom surface, with the custom surface set to the corresponding surface being interpolated, and each was set to shade areal features relative to the scene's light position. Furthermore, other than the TIN method, each scene was set to a stretched symbology with the same color ramp; the TIN method only offers elevation symbology and did not offer the same color ramp. First, IDW resulted in elevation changes that were relatively smooth and accurate, but the image appeared "dimply", similar to a zoomed-in view of a golf ball, across almost the entire image (Figure 3A). Kriging also represented elevation accurately, but it appeared very geometric; it would most likely resemble a fractal if each shape within the image were colored differently (Figure 3B). The natural neighbor method is smooth in most places, but the edges are rough and smaller elevation changes are less pronounced (Figure 3C). TIN interpolation produced a very detailed image; however, it is very "blocky", with angular curvature instead of a smooth surface appearance (Figure 3D). Finally, the spline interpolation method produced smooth elevation transitions, with the most pronounced representation of the different elevations throughout the terrain. Based on these results, the spline method appeared to be the interpolation most appropriate for the survey taken (Figure 2B).
Before applying the interpolation methods, having only researched sampling techniques and read a little about the first law of geography, it seemed the survey performed may have been slightly excessive. On the other hand, a larger survey is ideal, and the time taken to perform the survey was not all that long. After experimenting with and reading about each interpolation method, it can be concluded that the sampling technique and grid scheme were not too excessive. The results from each interpolation method captured most details of the original terrain, and it was clear what was being portrayed in each image.

Figure 2


Figure 3







Summary/Conclusions

Field-based surveys collect essential spatial data used to estimate values in unknown areas in proximity to the collected data, in order to establish an acceptable representation of the entire study area. This activity illustrated the basics of the survey process on a much smaller scale than usual. As in this survey, surveyors need to decide which sampling technique and tools are most suitable for the task. However, the tools used in most field-based studies go beyond string and thumbtacks; they include GPS receivers, total stations, 3D scanners, and UAVs. In larger survey areas, surveyors must also consider temporal aspects. If a survey is a follow-up to a previous one, the surveyors need to consider whether it should be done under the same temporal conditions as the previous survey. Alternatively, if a new survey is to be conducted, the surveyors need to determine the best temporal conditions for it. The level of detail given to this small-scale survey is not always feasible, resulting in a different collection of sample points; this is where the other interpolation methods would yield a more accurate representation of the survey in question.

Interpolation is a useful tool for visualizing many things beyond elevation. Other gradually changing phenomena, such as water table levels and rainfall, can be interpolated, as can demographic data, such as a survey of HIV distribution in Africa. Ecological surveys, such as forest type, could also be interpolated. Interpolation methods are crucial in many fields for extracting important information that would otherwise be extremely difficult or impossible to survey completely.

Tuesday, October 11, 2016

Field Activity 4: Sandbox Survey; Creation of a Digital Elevation Surface Using Critical Thinking Skills and Improvised Survey Techniques

Introduction
Waldo Tobler's First Law of Geography states that "Everything is related to everything else, but near things are more related than distant things" (Esri GIS Dictionary). This statement is the basis of sampling: data collection from a representative population or sample area used to investigate a whole population. Well-chosen spatial samples can be used to create an acceptable description of the Earth's surface. Many factors must be taken into consideration to create a well-chosen spatial sample. These include sample size (larger samples tend to be more representative of the whole), how to minimize bias when sampling, and which sampling technique is most appropriate for the area being sampled. A common scheme for spatial sampling is based on points within a grid framework, and there are three primary sampling techniques: random, systematic, and stratified. Each has benefits and disadvantages. Both random and systematic sampling can be sub-classified into point, line, and area sampling. Point sampling involves data collection at X, Y intersections or at the center of the grid square correlating to an X, Y intersection; line sampling involves data collection at points along a line; and area sampling involves data collection within grid squares. Random sampling removes bias but can result in a poorly representative sample due to clustering of sample points. Systematic sampling involves evenly distributed points; however, bias can lead to over- or under-representation of a certain pattern. Some study areas have a known proportion of specific subdivisions; in these cases, stratified sampling is the best technique. Stratified sampling distributes sample points in proportion to each subdivision, and the samples within each subdivision can be taken randomly or systematically.
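The contrast between systematic and random point sampling can be sketched quickly in Python; the grid dimensions and seed below are arbitrary, chosen only to illustrate the two techniques:

```python
import random

def systematic_points(n_per_side, step):
    """Evenly spaced points at grid intersections (systematic sampling)."""
    return [(i * step, j * step)
            for i in range(n_per_side) for j in range(n_per_side)]

def random_points(n, xmax, ymax, seed=42):
    """Unbiased points that may nonetheless cluster (random sampling)."""
    rng = random.Random(seed)
    return [(rng.uniform(0, xmax), rng.uniform(0, ymax)) for _ in range(n)]

sys_pts = systematic_points(5, 2.0)    # 25 evenly distributed points
rnd_pts = random_points(25, 8.0, 8.0)  # 25 random points over the same area
print(len(sys_pts), len(rnd_pts))
```

The systematic set guarantees even coverage of the study area, while the random set, though unbiased, can leave gaps or clusters, which is exactly the trade-off described above.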
The goal of this activity is to create a terrain in an approximately one square meter area and then determine the most fitting sampling scheme and technique to obtain sufficient data for the creation of an accurate digital elevation surface of the terrain.

Methods
It was decided that systematic point sampling, recording elevation (Z) at each X, Y intersection, was the most appropriate for this project, as the whole study area was only slightly larger than a square meter and could be sampled efficiently in a relatively short time frame. A similar alternative would have been to record Z at the center of each grid square; however, in some grid squares a sharp change in elevation near the center would make it difficult to determine which measurement to take. The stream box used was 114 cm x 114 cm, and from these measurements it was determined that a grid of 6 cm x 6 cm squares would capture sufficient data points for mapping. These divisions result in 361 squares measuring 36 cm2 each. Meter sticks were used as guides, thumbtacks were used to mark each 6 cm point on all four sides (figure 1), and string was wrapped around them to form a physical grid (figure 2) to ensure accurate location of the Z points recorded. The top of the stream box was designated as sea level; since the string grid sat at the same level as the top of the stream box, the grid could also serve as a guide for sea level during data collection. Where the terrain exceeded sea level, the string was pulled into the terrain without damaging the structures so that all string sat at sea level. After the grid system had been completed over the terrain, an origin was set at the southeast corner of the stream box. To facilitate data collection, a similar grid of smaller proportions was drawn on paper and the Z value for each X, Y coordinate recorded in its corresponding box. In cases where the terrain in an individual grid square changed sharply, two measurements were taken, and depending on which plane the change occurred along, the X or Y value was given a decimal value ending in .5 while the other remained a whole number.
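The grid arithmetic above is easy to verify; a small Python check of the numbers described (box size, cell size, square count, and area per square):

```python
# Dimensions from the survey described above.
box_cm = 114   # stream box side length
cell_cm = 6    # grid square side length

squares_per_side = box_cm // cell_cm   # 19 squares along each side
total_squares = squares_per_side ** 2  # 361 squares in the grid
square_area = cell_cm ** 2             # 36 cm^2 per square

print(squares_per_side, total_squares, square_area)
```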
The collected data was then entered into an Excel spreadsheet, beginning at the origin and working up each Y column along the X axis.



figure 2
figure 1



Results/Discussion
After grid squares with sharp contrasts in elevation were split into two separate Z values, the final number of data points collected was 434. The minimum Z value was -13, the maximum was 10, the mean was ~-2.151, and the standard deviation was ~4.146. The X and Y values both ranged from 1 to 19. The sampling method chosen at the beginning of this project and ultimately utilized for data collection appears to have been the best method; however, fewer points could have been collected. The primary issue turned out to be the grid squares with sharp changes in elevation; this was corrected by collecting two Z values within those squares, with the X or Y value given a decimal ending in .5 depending on which plane the change occurred along. Another issue was the quantity of data points to be collected; the process was expedited through the delegation of tasks: one person held the measuring stick, another read the value aloud, and a third recorded the data.

Conclusions
The sampling technique used in this project was slightly tedious, but the majority of the points taken each represent a 36 cm2 section of the whole study area, and the even distribution of points gives an unbiased, more accurate representation of the entire study area. Sampling is an effective way to create a reasonably accurate portrayal of spatial features on the Earth's surface because surfaces that are closer together tend to be related to one another, and data about one surface can be used to predict the nature of another surface close by. Sampling a surface created in a small area provides experience that would help with spatial sampling on a larger scale; it touches on the types of problems that could be encountered and the methods that can be used to solve them. After learning about the First Law of Geography, it seems the resulting dataset from this project was slightly excessive, and fewer data points could have been collected while still recreating the surface accurately. At the same time, the collection of data points was not costly and should result in a more accurate portrayal of the terrain that was created.