[{"id":0,"href":"/classes/geog526/labs/bib/","title":"Bibliography Guidelines","parent":"Labs","content":"Part of your grade is comprised of seven article reviews. These reviews will be turned in at the beginning of lab on designated days (see schedule on syllabus). Each article review is worth 8 points, for a total of 56 points.\n Bibliography Criteria Annotated Bibliography Assignment Requirements Example of very bad bibliography: Example of a good bibliography: Bibliography Criteria The following criteria are used for each of the bibliographies:\nBibliography #1 - 3: You must use a commercial journal (such as, Apogeo Spatial (Formerly Imaging Notes), GeoWorld, Geoscience and Remote Sensing Magazine, Geospatial Solutions, Earth Imaging Journal, etc.). However, the article MUST deal with REMOTE SENSING and not geographic information systems. The article must be at least 3 pages long. GIS World, GEO World, and GEO Info Systems are located in Anschutz; some articles from these journals can be accessed online.\nBibliography #4: This article must deal with some aspect of aerial photography and NOT satellite imagery. These articles must come from peer-reviewed journals (see example list below) and not commercial journals like GIS World or Earth Imaging Systems.\nExamples of Peer-Reviewed Journals (for Bib. 4-7):\n Photogrammetric Engineering and Remote Sensing (Engineering Library) Remote Sensing of Environment (Science Library) IEEE Transactions on Geoscience and Remote Sensing (Science Library) International Journal of Remote Sensing (Science Library) GIScience and Remote Sensing GeoCarto International (Science Library) Bibliography #5 - 7: These can be articles dealing with any aspect of remote sensing. However, they MUST come from peer-reviewed journals.\nExamples of ways to search for peer-reviewed articles include, but are not limited to, Web of Science (link through KU library), Google Scholar (Caution: not all articles through Google Scholar are peer-reviewed) and ScienceDirect.\nYou will also have a chance at the end of semester to submit Bibliography #8. The score for #8 will replace your lowest score in the same category.\nAnnotated Bibliography Assignment Requirements This assignment is meant to familiarize students with the remote sensing literature and applications. Grading will be based on quality of the citation and annotation and will include the following criteria: correct citation style, sentence structure, punctuation, grammar, proofreading, spelling, ability to remain on focus, and use of higher cognitive skills in analyzing and synthesizing the article. Each bibliography should be about ½ to ¾ page in length and single-spaced in 12 point type. You may submit these on blackboard under the respective section.\nCitation format should follow ACS style or similar (Zotero is your friend):\ne.g: Author 1, A.B.; Author 2, C.D. Title of the article. Abbreviated Journal Name Year, Volume, page range.\ne.x: Dunham, J., and K. Price. Comparison of nadir and off-nadir multi-spectral response patterns for six tallgrass prairie treatments in eastern Kansas. Photogrammetric Engineering and Remote Sensing 1996, 62(8): 961-967.\n Please write down Bibliography # and your name on the first line.\n Example of very bad bibliography: Kevin Price, detection of soil erosion within pinyon-juniper woodlands using thematic mapper (TM) data. Remote Sensing Journal page 233.\nI have never read such a cool story. 
It had lots of good stuff in it like how the trees reflected stuff back at the camera thing and it makes outrageous pictures. All my roommates thought it was a neat article too. I learned some good things in this magazine.\n This “annotated bibliography” received less than 1 point\n Example of a good bibliography: Coll - Bibliography #2\nPrice, K. P. Detection of soil erosion within Pinyon-Juniper woodlands using Thematic Mapper (TM) data. Remote Sensing of Environment 1993 45: 233-248.\nIn this study, the author tested the sensitivity of TM data to varying degrees of soil erosion in pinyon-juniper woodlands in central Utah. TM data were also evaluated as a predictor of the Universal Soil Loss Equation (USLE) Crop Management C factor for pinyon juniper woodlands. Multispectral measurements collected by Landsat Thematic Mapper (TM) were correlated with field measurements, direct soil loss estimates, and the Universal Soil Loss Equation (USLE) estimates.\nCorrelation analysis showed that TM band 4 (near infrared) accounted for 78% of the variability in percent trees (r = -0.88). In multiple regression, percent trees, total soil loss, and percent total nonliving cover together accounted for nearly 70% of the variability in TM bands 2, 3, 4, and 5. TM spectral data were consistently better predictors of soil erosion factors than any combination of field factors. TM data were more sensitive to vegetation variations than the USLE C factor. USLE estimates showed low annual rates of erosion that varied little among the study sites. Direct measurements of the rate of soil loss using the SEDIMENT (Soil Erosion DIrect measureMENT) technique, indicated high and varying rates of soil loss among the sites since tree establishment. Erosion estimates from the USLE and SEDIMENT methods suggest that erosion rates have been severe in the past, but because significant amounts of soil have already been eroded, and the surface is now armored by rock debris, present erosion rates are lower. Indicators of accelerated erosion, however, were still present on all sites suggesting that the USLE underestimated erosion within the study area. These findings indicate that remotely sensed multispectral data should be considered as input to future models for estimating soil loss within pinyon-juniper woodlands and probably other vegetation types distributed over broad geographic regions.\n This annotated bibliography received full credit (8 points)\n "},{"id":1,"href":"/classes/random/Install_Help/","title":"Installing GIS","parent":"Random bin","content":"Using GIS (or really any skill you are trying to pick up) at least twice a week greatly improves the time to learn, and the easiest way to do so is to have it readily available to you. Here I\u0026rsquo;ll walk through the ways to install the software for yourself.\nInstall instructions for: ArcMap/ArcPro QGIS ArcMap/ArcPro The ESRI suite of products tends to be a demanding set of programs, so you should have a fairly modern computer if you want to install them for yourself. The ArcGIS system requirements can be found here. I also recommend at least 64 gigs of free hard drive space at minimum. In addition to the hardware requirements, ArcGIS also requires the Windows operating system. Below find installation instructions for different platforms.\nThe Virtual option is by far the easiest option if you are a recently virtualized member of the KU community. The QGIS virtual option is what I use when I\u0026rsquo;m not wed to ArcMap. 
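Before committing to a local ESRI install, it is worth confirming your machine clears the basics just mentioned: Windows, plus roughly 64 GB of free disk space. Here is a minimal preflight sketch, assuming only Python 3 and its standard library; the 64 GB threshold is the figure suggested above, not an ESRI-published number.

```python
import platform
import shutil

MIN_FREE_GB = 64   # rough minimum suggested above; adjust to taste

def preflight():
    """Report whether this machine clears the basic ArcGIS Desktop prerequisites."""
    os_name = platform.system()
    root = "C:\\" if os_name == "Windows" else "/"
    free_gb = shutil.disk_usage(root).free / 1024**3

    os_ok = os_name == "Windows"
    space_ok = free_gb >= MIN_FREE_GB
    print(f"Operating system: {os_name} -> {'OK' if os_ok else 'ArcGIS Desktop needs Windows (or a VM)'}")
    print(f"Free space on {root}: {free_gb:.0f} GB -> {'OK' if space_ok else 'below the suggested minimum'}")

if __name__ == "__main__":
    preflight()
```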
Windows Before you install ArcGIS for Desktop To install ArcMap on windows, check the system requirements to make sure your computer has the hardware and software required. You should also close any other open programs or windows. Activate your authorization code Visit https://wwww.esri.com/educationedition to begin the process of activating and downloading your ArcGIS for Desktop Student Trial software. If you need an authorization code, you can use the free trial ESRI provides by sighing up at https://www.esri.com/en-us/arcgis/trial. Create or Log in using your ESRI account as necessary. You will need to use this log in again so remember what you enter here. Enter the authorization code and click Activate ArcGIS. If you received the ArcGIS for Desktop software from your instructor or license administrator, or will be installing from a network server, proceed to step 10. Click the button for the ArcGIS for Desktop software version being activated. Download and install ArcGIS for Desktop Student Trial If necessary, download the ArcGIS Uninstall Utility and uninstall previous versions of ArcGIS Desktop or Server. The software cannot be installed on a computer that has a previous version of ArcGIS for Desktop or ArcGIS for Server installed. It\u0026rsquo;s OK if the computer has ArcGIS Explorer installed. If necessary, install the Microsoft .NET Framework (version 3.5 Service Pack 1 or higher). However, if you have a current version of Windows installed it is highly likely this is already installed. If you are unsure if you have the .NetFramework installed, you can download a .net Version detector from http://www.asoft.be/prod_netver.html. Determine the location for the ArcGIS for Desktop software you wish to install and click the Download button. You can also download tutorial Data, if desired. Double-click ArcGIS_Desktop_10XXXXXXX.exe (Depending on the version you selected, the file extensions may be different) to extract the installation files. Locate and run Setup.exe to install ArcGIS for Desktop. The “Complete” installation is recommended. After the files are installed, the Authorization Wizard will open and prompt you to choose a product to authorize; select “ArcGIS Desktop Advanced (Single Use)” and click continue. The Authorization Wizard will prompt you for an authorization code; enter your activated code. Follow the prompts and the software will authorize and be ready for use. Note: leave the default option for the software extensions selected; they will be authorized automatically. Finally, restart your computer and try to open ArcMap. If the installation was successful, you should be greeted with the ArcMap template window. If the installation was unsuccessful, you may need to double check your license authorization. Search for the \u0026ldquo;ArcGIS Administrator\u0026rdquo; program which was installed alongside ArcMap, click on Desktop, select the Advanced (ArcInfo) Single Use radio button, and then the Authorize Now\u0026hellip; button. Follow the prompts and enter the necessary info.\n Mac \u0026amp; Linux If you are on a mac os or Linux distribution, there are two ways to install ArcMap. You may use Boot camp to dual boot windows, or you can use a virtualization software to launch a windows installation. In either case you will need a windows key to fully authorize the software. For more information see this ESRI post. 
Here I will walk you through a virtualization process suitable for either operating system.\nInstalling VirtualBox You should start by closing any other programs or windows, and then download VirtualBox here. Double click the VirtualBox exe and click through the guided installation. Installing Windows Windows provides unlicensed virtual environments for development, available here. Download the VirtualBox version, and unzip it. In Oracle VM, go to File\u0026gt;Import Virtual Appliance and point to the Windows ova. Click next. Most of the defaults are fine, but I like to create separate folders of each VM I import. To do this, use the drop down underneath the table to select \u0026ldquo;other\u0026hellip;\u0026rdquo; and create a new folder called Windows and click on the folder once to select it. Lastly, click Import. Connecting your hard drive Finally, lets set up the instance. Right click on the newly added machine and go to settings. Most of the time these are sensible defaults, although there are a few things we can tweak. under the System dialog, feel free to increase the Base Memory to a reasonable amount, bearing in mind that the more you allocate to the vm, the less you have available on the host. Feel free to explore the other options but don\u0026rsquo;t change them unless you know what you\u0026rsquo;re doing or can handle a crash or two :) Under Shared Folders, create a new folder (the icon with a plus) Point to a shared folder on your host machine (mine is called \u0026ldquo;hold\u0026rdquo; here) Select the Auto-mount option Change the Mount point to something sensible (like \\share) Installing ArcMap To launch Windows, click the imported Windows appliance and click start. You may close the Auto-capture alerts. To connect your folder simply navigate to This PC in the windows explorer and click on the share folder button on the VirtualBox window as needed to make the folder appear. Now that you have a version of Windows installed, you can follow the instructions under the Windows tab.\nA last note: As you may have noticed, when you downloaded the VM image, there was an expiration date ~60 days from the date you downloaded it. You will either need to activate this version of windows, or delete and reinstall as needed. Virtual options If you are a student at KU, IT offers virtualLabs, a remote desktop program you can use to access ArcMap. In order to access VirtualLabs you will need to install Citrix Workspace. Because ArcMap is running on a remote PC, the system requirements are far lower and largely related to browser compatibility. To install this, you should:\n Login to Virtual Lab using your KU Online ID and password. You will be prompted to download and install the “Citrix Receiver” application. Follow the installation instructions, including restarting your computer. and finally, re-log into https://virtuallab.ku.edu/vpn/index.html. You should see the storefront, where you can select ArcMap, which will launch through your browser. Click the Open Citrix Workspace Launcher. Finally, in order to transfer data you need to \u0026ldquo;Permit all access\u0026rdquo;, a prompt you can make appear when you attempt to connect to a folder. At no point did I have to log in via the actual citrix app. Feel free to close the prompt that looks like so: If you need more detailed instructions, you can find those here. Finally, as per the instructions\u0026hellip;\n When working in Virtual Lab, you must save files to your computer or mobile device. 
Virtual Lab does not provide storage space. Files edited with Virtual Lab applications are not backed up by KU and cannot be restored by KU if something happens to them. NOTE: After one hour of idle time, Virtual Lab will disconnect and any unsaved work will be lost. It is important to save files to your personal device before you disconnect from Virtual Lab.\n Finally, a few tips picked up along the way:\n File pathing is especially important when using this service. I would consider it best practice to make sure that you are saving all outputs to your local computer. These paths should always start with \\client... This is your local PC, so use your folder structure. I\u0026rsquo;ve found connecting to OneDrive to work some of the time, but be aware that you are going through three different APIs just to load a file. When in doubt, one of the first things I cross off my troubleshooting list is to place the files on my local disk. You will have to be exceptionally patient; this is by far one of the slowest ways I\u0026rsquo;ve ever used GIS. QGIS \u0026ldquo;QGIS is a user friendly Open Source Geographic Information System (GIS) licensed under the GNU General Public License. QGIS is an official project of the Open Source Geospatial Foundation (OSGeo). It runs on Linux, Unix, Mac OSX, Windows and Android and supports numerous vector, raster, and database formats and functionalities.\u0026rdquo; When I first encountered QGIS many moons ago, it was a cumbersome, difficult tool to use compared to ArcMap. However, in just a few short years QGIS has made massive improvements and can stand toe to toe with ESRI in many respects, and it is free and open source. Additionally, it can be even easier to install than ArcMap, and if you go with the virtual installation route provided by OSGeo below, you will have a platform-agnostic installation of many of the GIS tools and programs you could ever need (Python, R, PostGIS, GRASS, GDAL, etc.). As such, I heartily recommend the virtual option. Regardless of the version chosen, system requirements are minimal (1 GB RAM (2 GB is better for trying Java-based applications), 1 GHz i386- or amd64-compatible CPU) and most modern computers will meet them.\nOS independent The installation of QGIS is far less involved than its ESRI counterparts. The QGIS website has downloads separated by operating system here; simply select your operating system. New users are recommended to install the standalone installer of the latest release. All you need to do to get started is download and run the executable. Virtually See the OSGeo quickstart docs for a more verbose but slightly outdated set of instructions should this set of steps not do it for you.\nInstalling VirtualBox You should start by closing any other programs or windows, and then download VirtualBox here. Double-click the VirtualBox exe and click through the guided installation. Installing OSGeoLive Download and unzip the latest version of OSGeoLive; you probably want the .amd64.vmdk.7z version. Save that unzipped folder in your VirtualBox VMs folder. In Oracle VM, click Tools, then New. Name your new machine (I call mine FOSSGIS). You can leave the Machine Folder dialog on VirtualBox VMs. Change your Type to Linux and your Version to Ubuntu. You will most certainly want to increase your memory (RAM) beyond the half-gig default. Select \u0026ldquo;Use an existing virtual hard drive disk file\u0026rdquo; and point to the vmdk file. Finally, click Create.
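If you end up rebuilding this VM more than once (on a second machine, say, or after swapping disk images), the same GUI clicks can be scripted against VirtualBox's VBoxManage command-line tool. The sketch below is a rough equivalent of the steps above, assuming VBoxManage is on your PATH; the VM name, memory size, and vmdk path are placeholders to replace with your own.

```python
import subprocess

VM_NAME = "FOSSGIS"                                    # whatever you called yours
VMDK = r"C:\Users\you\VirtualBox VMs\osgeolive.vmdk"   # hypothetical path to the unzipped disk
RAM_MB = 4096                                          # well above the half-gig default

def run(*args):
    """Run a VBoxManage command and stop loudly if it fails."""
    subprocess.run(["VBoxManage", *args], check=True)

# Register a new Ubuntu-type machine, give it more memory,
# and attach the existing OSGeoLive disk instead of creating a new one.
run("createvm", "--name", VM_NAME, "--ostype", "Ubuntu_64", "--register")
run("modifyvm", VM_NAME, "--memory", str(RAM_MB))
run("storagectl", VM_NAME, "--name", "SATA", "--add", "sata")
run("storageattach", VM_NAME, "--storagectl", "SATA",
    "--port", "0", "--device", "0", "--type", "hdd", "--medium", VMDK)
```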
Setting up Finally, lets set share a folder. Right click on the newly added machine and go to settings. Under Shared Folders, create a new folder (the icon with a plus) Point to a shared folder on your host machine (mine is called \u0026ldquo;FOSSFlood-master\u0026rdquo; here) Select the Auto-mount option The shared folder is now connected! If you\u0026rsquo;re on a smaller keyboard you likely don\u0026rsquo;t have a right-ctrl key, change the host bind key to something else (I use right alt. File \u0026gt; Preferences \u0026gt; Input) Updating for the future While the above steps will serve you through the intro and intermediate classes, I also use this method almost exclusively for my python needs. Python can be a wonderful tool, but installation and update woes, along with less flexible graphical execution options means it often loses out to R when new geospatial programmers are looking for their first launch point. With this linux (ubuntu) installation, we bypass virtually all these issues (see note below). However, to get the most out of this way of accessing python, we need to set up a few things. Therefore, after setting up your virtual machine for the first time, I suggest you open up a terminal and run the following: sudo apt-get update -y\t# updates all sorts of goodies sudo apt install snapd\t# Installs snapd, used to install programs sudo snap install --classic notepadqq\t# Installs a better text editor (almost notepad++) sudo apt install python3-pip\t# Installs pip3 for python sudo apt-get install proj-bin\t# Installs and updates proj library pip3 install pysimplegui\t# Adds gui library for python scripts! pip3 install seaborn --upgrade\t# My favorite python plotting library, because matpltlib syntax is disgusting pip3 install jupyterlab\t# for notebook support # Only run if you want to get geopandas to run pip3 install setuptools==39.0.1 pip3 install pyproj==2.6.1 pip3 install pandas==0.23.3 pip3 install geopandas==0.5.1 When I say all, I mean all but the arcpy library :/ ESRI provides a tiny bit of help in this direction, but it seems as though their objective is to force you into the always on licensing of the new millennium. For what it\u0026rsquo;s worth, I don\u0026rsquo;t think I\u0026rsquo;ve encountered anything written in arcpy that wasn\u0026rsquo;t more straightforward to implement yourself, but of course your mileage will vary. "},{"id":2,"href":"/classes/geog111/","title":"Introduction to GEOINT -- GEOG111","parent":"Classes","content":"This is a living document. Changes will be announced in class. Instructor: Jim Coll\nOffice: 404C Lindley Hall\nEmail: jcoll@ku.edu\nOffice Hours: day from time-time \u0026amp; day from time-time or by appointment Class Meetings: Monday 2:00 — 4:30 pm\nClass Room: Lindley 228 Labs: Wednesday or Thursday 11:00 — 12:50 pm\nLab Room: Lindley 310 Announcements and mics: Hello all!\nMy name is Jim Coll and I am your instructor for the semester. A few notes for you pertain to how I run this course. Although the KU blackboard site will be the \u0026ldquo;official\u0026rdquo; site for this class and the place you submit all work to, I use this site here for the benefit of all and my own selfish desire to streamline my digital footprint. I will keep both sites as identical as possible when double posting material (e.g. the syllabus and course schedule), but in case of a conflict treat this version as the most recent. Use the navigation table to the left to find the syllabus, labs, and other documents. 
I have made myself as available to you as I can so feel free to find me via:\n Email Class slack channel Cornering me in the hallway Below you\u0026rsquo;ll find the course outline and slides as appropriate. I look forward to an exciting, productive, and stimulating semester with you all.\nBest,\nJim\nTentative Course Outline: To help us stay on track, I have pinned the semester schedule below with the relevant lectures, tasks, due dates, and other important institutional dates: Note that this is a living timeline subject to change. \r\r\rDate\rTopic\rWeekly Activity Due\rLab\rUniversity Dates\r\rWeek 1\r2/2\rSyllabus Day\rReading: Lecture notes\rSpatial skills pretest \u0026amp; Computer basics\r\u0026nbsp;\r\r2/4\rWhat is GIS \u0026amp; GEOINT\rAssignment: Current event report\r\u0026nbsp;\r\rWeek 2\r2/9\rGEOINT, GIS, and Ethics\rReading: Lecture notes\rIntro to Google Earth Pro\r\u0026nbsp;\r\r2/11\rGEOINT in US Government\rAssignment: Careers in GEO\r\u0026nbsp;\r\rWeek 3\r2/16\rDatums and Coordinate Systems\rReading: Lecture notes\rCoordinates and Position Measurements\r\u0026nbsp;\r\r2/18\rProjections and spatial references\rAssignment: Favorite projection\r\u0026nbsp;\r\rWeek 4\r2/23\rGeospatial data sources, data types\rReading: Lecture notes\rGPS\r\u0026nbsp;\r\r2/25\rData sources \u0026amp; collection methods\rAssignment: Field relevant data\r\u0026nbsp;\r\rWeek 5\r3/2\rReview\r\u0026nbsp;\rGIS introduction\r\u0026nbsp;\r\r3/4\rExam 1\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 6\r3/9\rIntroduction to GIS\r\u0026nbsp;\rGIS spatial analysis\r\u0026nbsp;\r\r3/11\rSpatial Analysis (vector)\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 7\r3/16\rSpatial Analysis (raster)\r\u0026nbsp;\rDigital Terrain Analysis\r\u0026nbsp;\r\r3/18\rTerrain Analysis\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 8\r3/23\rMap making and visualization\r\u0026nbsp;\rMap making\r\u0026nbsp;\r\r3/25\rCloud computation\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 9\r3/30\rReview\r\u0026nbsp;\rCloud GIS\r\u0026nbsp;\r\r4/1\rExam 2\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 10\r4/6\rIntroduction to remote sensing\r\u0026nbsp;\rVisual Imagery Interpretation\r\u0026nbsp;\r\r4/8\rRemote Sensing Principals\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 11\r4/13\rDigital Image Science\r\u0026nbsp;\rRemotely Sensed Imagery and Color Composites\r\u0026nbsp;\r\r4/15\rIntroduction to image interpretation\rAssignment: Current event report\r\u0026nbsp;\r\rWeek 12\r4/20\rAirborne systems\r\u0026nbsp;\rLandsat 8 Imagery\r\u0026nbsp;\r\r4/22\rSatellite systems\rAssignment: Image interpretation\r\u0026nbsp;\r\rWeek 13\r4/27\rRemote Sensing applications\r\u0026nbsp;\rEarth-Observing Missions Imagery\r\u0026nbsp;\r\r4/29\rRemote Sensing applications\rAssignment: Favorite satellite\r\u0026nbsp;\r\rWeek 14\r5/4\rFlex day\r\u0026nbsp;\rSpatial skills test\r\u0026nbsp;\r\r5/6\rReview\r\u0026nbsp;\r\u0026nbsp;\r\rFinals\r5/12\rIn person final\r\u0026nbsp;\r\u0026nbsp;\r\u0026nbsp;\r\r\r\r\r\r\r\r\r\r\r\r"},{"id":3,"href":"/classes/geog526/labs/lab10/","title":"Lab - Data Acquisition and Image Preprocessing","parent":"Labs","content":"Introduction This lab serves as an introduction to downloading remotely sensed data from one, of the several, online sources and includes some common remote sensing image preprocessing functions. This lab allows you to become familiar with the process of searching for and downloading a Landsat scene, and how to stack layers from the original products. 
You will also use multiple SPOT images to learn the image mosaic and subset processes and to subset the image to an area of interest. You need to submit your results to BB this lab. Because you only have one chance to submit your work, submit all the results after you finish the whole lab. There are totally 9 snapshots you need to submit, please name your file as #_your last name (# represents 1-9). Readings Textbook: Chapter 6 and Lab08 Guide Materials: SPOT6 MS images; ERDAS IMAGINE 2014\nPart1: Landsat Image Acquisition, Download and Import\nIn this portion of the lab you will search Earth Explorer to identify “suitable” Landsat 8 imagery, download a scene and import it into ERDAS Imagine.\nA suitable image is defined by the user’s needs. This includes the spatial extent, the spatial resolution, temporal resolution and the level of cloud contamination. You may only need a portion of a scene to be cloud-free. Or perhaps your application requires a very specific time stamp (data from a specific year, season or both), so you may be able to accept some cloud contamination. You may need to acquire imagery from different seasons to capture vegetation phenology and that may require the use of data spanning multiple years. Go to http://earthexplorer.usgs.gov/. Follow the instructions to create an online account. Once your account is created, log in to the website using your credentials.\nYou can search for data using various spatial parameters. For example, you can define using an Address/Place (street address, City, State, County), a Path/Row (such as the WRS-2, the unique scene identifier for Landsat discussed in Lab 6) or X, Y coordinates (which can be the location of one point, an intersect between two pairs of x,y coordinates or a box defined by x,y coordinates). You can also specify a date range for the search. Leave this part blank for now. Under the Search Criteria tab type in the location of an area you are interested in acquiring Landsat imagery. Click the Address/Place tab to define your location. Type in one or more of the following to your location: street address, city, county, state, country. Click the Show button. Then Click on an Address/Place in the list to show the location on the map and add coordinates to the Area of Interest Control.\n Where is your location of interest? ___________________________¬ (1 point)¬ Click the Data Sets tab and scroll through the list to see all the data available for download. Now Click on Landsat Archive. Then check the box L8 OLI/TIRS (i.e. you are searching Landsat 8 imagery only). How many datasets are available through Earth Explorer? _____________________ (1 point) Click the Additional Criteria tab. This tab allows you to specify additional search criteria including % of acceptable cloud cover, Data processing type (i.e. what level of processing has occurred on the data), path/row or even scene identification. Look at the options. Do not select anything under this tab, for this exercise we want to see all data available for your selected area regardless of cloud cover. Click the Results tab to view the list of images available for the area you specified. In the Search Results window, Click on Show Browse Overlay, the second option in this toolbar . It will display your image over the imagery in the main window. It will highlight the icon in green like shown above, when the image is displayed. Scroll through the available imagery and display images that look the most cloud free. 
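Earth Explorer also lets you export search results as metadata, and once scene records are in hand, picking the least cloudy one is a one-liner. A small illustration with made-up records; the field names and scene IDs below are hypothetical, not Earth Explorer's exact export schema.

```python
# Hypothetical scene metadata; real exports carry similar fields.
scenes = [
    {"scene_id": "SCENE_A_20100412", "date": "2010-04-12", "cloud_cover": 31.4},
    {"scene_id": "SCENE_B_20100615", "date": "2010-06-15", "cloud_cover": 2.1},
    {"scene_id": "SCENE_C_20100903", "date": "2010-09-03", "cloud_cover": 12.8},
]

# The least cloudy record is simply the minimum by the cloud_cover field.
best = min(scenes, key=lambda s: s["cloud_cover"])
print(f"Least cloudy: {best['scene_id']} ({best['cloud_cover']}% cloud) on {best['date']}")
```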
Find the most cloud-free image for the year, take a screenshot, and submit it to BB.\n What is the path/row and the date of the Landsat 8 image you found with the least amount of cloud cover? (2 points) Path/row: ______________ Date: ___________________________\n What is the estimated percent cloud cover over the entire scene (circle one)? (1 point)\nCloud-free \u0026lt;10% 10-20% 20-30% 30-40% 40-50% 50-60% 60-70% 70-80% 80-90% 90-100%\n If you were to download the image, you would click the Download icon on the image toolbar, click to download the Level 1 GeoTIFF Data Product, and save the image locally (this could take a few minutes). We are not downloading these data for this lab.\nPretend you downloaded your image. You would need to use 7-Zip or another software package to uncompress the file. Each band would be in a separate TIFF file. Navigate to G:/Fall-2015/G526/Lab08/Landsat8/ and copy the individual bands to your computer. You now need to stack the bands to produce one output image file. To create a layer-stacked image, open ERDAS Imagine 2014. Under the Raster tab click Spectral (located under the Resolution subtab) and select Layer Stack. For the input file, navigate to the location where you copied the TIFFs and select band 2 (*_B2). Click the Add button. Repeat the process for bands 3-7. Make sure to add the bands in sequential order. Once all bands are loaded in sequential order, specify an output file name and location and use the *.img file format for output. Then click OK to run the process. Once the process is complete, view the layer stack in ERDAS Imagine. Close ERDAS Imagine and delete the files you copied and created. Now we will do another search on Earth Explorer. You will use path/row and year information to identify the most suitable scenes for crop type mapping in Iowa. You will need to identify the best spring date, summer date and fall date to represent the seasonality of the growing season. For the purposes of this lab, spring is defined as March 1st – May 31st, summer as June 1st – August 31st and fall as September 1st – November 30th. Use these search criteria: Path/Row 25/31; Data set: Landsat 4-5; Dates: March 01, 2010 through November 1, 2010. 5.\tHow many images are there to choose from? ____________ (1 point)\na. The best spring image date is: ______________________. (2 points) b. The best summer image date is: ____________________. (2 points)\nc. The best fall image date is: ________________________. (2 points) Part 2. SPOT Data import and format conversion You will encounter a variety of data formats when you download remote sensing products. For example, HDF (Hierarchical Data Format) is the format for MODIS and ASTER. The data format of SPOT6 is JPEG 2000 (with the file extension .jp2). Most of the time, we want to keep all of the data in one format, which is more convenient to manage. In this part, we will learn how to use ERDAS 2014 to import the data and export it to the format you want. ERDAS IMAGINE 2014 has a default format: IMAGINE image (.img). Open ERDAS 2014, navigate to the “Manage Data” tab, click “Import Data” and you will see the “Import” window. In this exercise, we will import the SPOT data (in the folder SPOT6_MS) to the ERDAS format.\nChange the “Format” to JPEG 2000, select your SPOT data and then specify the output name and location (the default format is *.img). After you press the OK button, a new window will appear; click OK.\nRepeat the process to import all the .jp2 files in the folder.
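As the next step notes, the one-file-at-a-time Import dialog can also be replaced with a short batch script. Below is a hedged sketch using the GDAL Python bindings (bundled with the OSGeoLive environment described elsewhere on this site, and needing a JPEG 2000 driver such as OpenJPEG to read .jp2); GDAL's "HFA" driver writes the ERDAS IMAGINE .img format. The folder names are assumptions to adapt to your own directories.

```python
from pathlib import Path
from osgeo import gdal

gdal.UseExceptions()  # raise Python exceptions instead of failing silently

src_dir = Path("SPOT6_MS")        # folder holding the .jp2 files
out_dir = Path("SPOT6_MS_img")    # where the converted .img files will go
out_dir.mkdir(exist_ok=True)

for jp2 in sorted(src_dir.glob("*.jp2")):
    dst = out_dir / (jp2.stem + ".img")
    # "HFA" is GDAL's driver name for the ERDAS IMAGINE .img format.
    gdal.Translate(str(dst), str(jp2), format="HFA")
    print("wrote", dst)
```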
Or you can do batch by coding and import all the images at once. Part 3. Image mosaic Load all six SPOT6 images (*.img) into the same 2D view. Set the image “fit to frame” (we have talked about several ways to do it in previous labs), and then you should see a view like this.\nNavigate to “Raster” tab, click “Mosaic” and choose MosaicPro from 2D View. It will show a new window (MosiacPro). You also can add and delete the images. You can try different function buttons in the tool bar. For example, you can set the resample method by clicking Edit-\u0026gt; Image Resample Options… Then you click Process-\u0026gt;Run Mosaic…\n After the mosaic finishes, open the image in a new 2D View. Compare the views, what’s the differences between unmosaic view and mosaic view? (2 points)\n If you are a data manager of large database, discuss the advantages and disadvantages of storing your data as mosaiced images from the view of data management (you may start from the aspects of updating data, storage, retrieval and distribution of data, processing of data, etc.) and data applications. (3 points)\n Reload all the six original SPOT6 image in a new 2D view. And this time, you can try to play with create/edit/delete hole in the mosaic process . Create one hole (or several) in the image mosaic process, and then redo the mosaic process. Snapshot your final results and submit it to BB. (2 points) Typically adjacent images have overlap areas, e.g. the aerial photos (required about 30% side-overlap). In these cases, you may need to create a seam line or seam polygon. We will not cover this part in this lab. Part 3. Subset Imagery (see section 11.8 in your textbook for a complete description) Now make the mosaic result visible. Go to “Raster”, and click “Subset and Chip”. ERDAS provides several ways to subset images.\n A)\tChoose “Create Subset Image”.\nTry to subset the following features and save and submit the snapshots to BB. (hint: this is SPOT6, with blue band) (8 points) Ocean ( with true color combination) Airport ( with true color combination) Agricultural area (with False color combination(color infrared)) One feature you are interested in (with your interpretation result). B)\tChoose “Dice image” then the “Dice an image” window shows up. This function is to cut the image into several similar pieces, just like the original SPOT6 images.\nDice the mosaic image into 3 *2 sub-images (3 columns * 2 rows). Submit the snapshot of results to BB (true color combination) to the assignment. (4 points) The values you set for “Dimension in x dir.”________________ “Dimension in y dir.” _______________ (2 points)\nThe following figure is an example of the results in false color combination.\nC)\tCreate an irregular shape of subset. a)\tCreate an “AOI” (Area of Interest) layer. Right click on the image, and choose “New AOI Layer”. It will have a new layer in your active 2D view.\nb)\tDraw an AOI. Navigate to “Drawing” tab, and in the section of “Insert Geometry”. You can choose different shapes to draw an AOI.\nChoose “Polygon” button (see where the mouse is pointing above), then you can create an AOI over the area of interest you want to subset. To outline your AOI, position your mouse at the starting location and click, then move to the next area and click, repeat until you have delineated your subset area. After digitizing the AOI, go to File-\u0026gt; Save as -\u0026gt; AOI layer as… to save the AOI as a file. 
Save the aoi file to the Lab08 directory.\nc)\tThen go to the Raster-\u0026gt;Subset \u0026amp; Chip -\u0026gt; Create Subset Image. Now You DON’T need to define the “subset Definition” we set before, which was a rectangle shape. Click the “AOI” button. In the Window of “Choose AOI”, click AOI file and go to the Lab08 directory and select the AOI file you created before. After you give an output file name, click OK. You will get a subset output file like below.\nNow create the irregular shapes of AOI, one polygon, one ellipse (press shift when you draw an ellipse to create a circle), and subset the features you like. Submit two snapshots to BB. (6 points) You are done!!!\nWrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":4,"href":"/classes/geog526/labs/lab06/","title":"Lab - Introduction to ERDAS IMAGINE I","parent":"Labs","content":"Learning Objective The goals for you to aim for in this lab:\n Build simple and compound database queries Extract the results of a query into separate GIS layers Count the number of features within the boundary of another feature Overlay two features for spatial analysis Create and use buffers around objects for spatial analysis Outline: Learning Objective Submission requirements Guide of Bands: __________ Submission Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rGuide Contrast Stretching •\tImproving detectability of objects by increasing the variation of tones •\tRedistribution of DNs along the bit scale to utilize more of the gray scale •\tOriginal range of DNs expanded to utilize a fuller contrast range of display device\nContrastStrech.png •\tMany times DNs don’t occupy the full range o\tExample: DNs = 40, 45, 50 o\tAll look dark gray because they are so close on the 255 range o\tContrast stretching temporarily contrasts the numbers; doesn’t corrupt original values \t40 assigned 0 \t45 assigned 127 \t50 assigned 255 \tend up with 1/3 black, 1/3 gray, \u0026amp; 1/3 white\nERDAS IMAGINE •\twhen displaying image in IMAGINE, we can specify which bands we want to display with which color gun (Red, Green, Blue) o\tExample: SPOT Band 2 (red) can be displayed with green color gun •\tWhat the computer assigns Layer 1 may not correspond to Band # o\tExample: Layer 1 in SPOT image may not be SPOT Band 1\nStep 1: Copy files to your directory Log on to the machine. Go to your directory and create a Lab04 folder. Right click on the Lab04 folder and choose copy. Navigate to the folder that you just created, right click and choose paste.\nStep 2: Accessing ERDAS IMAGINE Click on the Start button and search ERDAS Imagine 2014. 
After a moment the IMAGINE logo should come up, there may be license expiration information, click “Continue”. Followed by the main IMAGINE menu, a panel across the top of the computer screen. A “viewer”, a black screen in which images are displayed, should also come up. You can resize these windows by grabbing the corners of the windows with your cursor, or you can move the windows around if you so desire.\nStep 3: Displaying an Image To display an image, right click on “2D View #1” under the contents menu on the left side of the screen and click on “Open Raster Layer…”, or simply click on the open file icon at the very top left of the screen. The Select Layer to Add window should now appear. Select your personal folder from the “Look in:” dropdown menu. In your directory, select (highlight) the file garden.img, but don’t click OK just yet. The data in this file are from a subset of SPOT’s multi-spectral sensor over western Kansas near Garden City. Look at the other information in the Select Layer to Add window under the Raster Options tab. The Display as should be on True Color. Switch back to the File tab. At the very bottom is additional information, including the number of columns and rows and the number of bands in the image.\n How many bands are available? What are the wavelengths represented by Spot’s Multi-spectral Scanner (XS) sensor (provide the band number, name, and wavelength)? (4 points) Use your textbook for this question. of Bands: __________ Band\tWave Length Name\n After answering the above questions, click OK. Once the image is displayed, you can retrieve information regarding various characteristics of the layer by clicking on the Metadata icon under the Home tab (the icon looks like a piece of paper with an “i” in the center). After the Metadata dialogue box opens (this may take a bit!) click on the General tab and find the Layer Info data box, in this look at the Height and Width numbers.\nHow many columns of data do we have? __________ Rows?_____________ (2 points) Step 4: Assigning Color Guns to Bands Reopen garden.img. Under the heading “Layers to Colors” on the Raster Options tab of the Open Raster Layer window, we can specify which computer color gun is used to display a certain band. Change it to Red-1, Green-2, and Blue-3.\nThis is telling you that we are assigning band 1 to the first \u0026ldquo;color gun\u0026rdquo;, which is red; band 2 to the second color gun, which is green; and band 3 to the third computer color gun, which is blue. If you entered layers (or bands) 3,2,1 in that order, it would assign band 3 to color gun 1 (red), band 2 to color gun 2 (green), and band 1 to color gun 3 (blue). Get it? You can change the assignment of bands to color guns by clicking on the Multispectral tab and go to Bands where you will see the three color guns to assign.\nIn the bottom of the same window, check Clear Display(this clears the data from the display window). Also click on the box Fit to Frame- you’ll want to do this every time you display an image. This displays the whole image in the window, rather than setting it to a pre-defined default zoom. Now click OK at the side of the screen. The Spot image should appear, covering an area of about 6 sections by 6 sections (generally, 4 center pivot plots cover a whole section).\nKeep in mind that these data have been \u0026ldquo;contrast stretched\u0026rdquo; to make it easier to interpret visually. This is done by the computer’s examination of a statistics file of the image to determine how it should be stretched. 
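To make the stretching idea concrete outside of IMAGINE: a linear stretch simply remaps the occupied part of the DN range onto the full 0-255 display range, exactly as in the 40/45/50 example in the guide above. A minimal numpy sketch follows; the percentile cutoffs are a common convention, not necessarily what IMAGINE's statistics-based stretch uses.

```python
import numpy as np

def linear_stretch(band, low_pct=2, high_pct=98):
    """Remap a band's DNs to 0-255 display values using a percentile stretch."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    scaled = (band.astype(float) - lo) / (hi - lo)        # occupied range -> 0..1
    return (np.clip(scaled, 0, 1) * 255).astype(np.uint8) # 0..1 -> 0..255

# The guide's toy example: DNs of 40, 45, 50 spread out to dark/mid/bright.
toy = np.array([40, 45, 50])
print(linear_stretch(toy, low_pct=0, high_pct=100))       # -> [  0 127 255]
```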
You will learn how to do this later.\n The image displayed is a color composite. What type of color composite is it? (1 point)\n Suppose we assigned band 3 to all color guns, what will this image look like? And WHY? (Explain why this image displays in this color scheme - remember that all three colors were assigned the same band of data). (3 points) Will it be a color composite or…?\n Step 5: SPOT Image Change the band combinations on the image, i.e. change the band-color gun assignments. On the main menu, click the Multispectral tab. Here there is a sub-menu called “Bands”. Change the band assignments to 3 to red, 3 to green, and 3 to blue. In effect, we are only displaying one band of the data,.\nLook at the image.\nKeep in mind that the bands of SPOT may or may not have been entered in the correct order. To investigate pixel values and the DN’s associated with them, press the Inquire “+” icon on the viewer menu under the Home tab. This brings up the “inquire cursor”. In the window that now comes up, examine the file File-pixel column. These are the reflectance values (PV’s, DN’s, etc.) for the pixel on which the cross-hairs of the cursor lie. You will see pixel values for each of the layers of your image. Move around the cursor with your mouse and watch the pixel values change. This should help you interpret which of the image layers actually correspond to which SPOT bands.\n What is the SPOT wavelength that is represented by this image (be careful!!)? (1 point) (Come back to this later if you don’t know)\n Look at the SPOT bands one at a time (by using Multispectral tab/Bands subtab and assigning a band to all color guns), considering how the composite image looked in question #3 and the color assignment information.\n Go back and look at the composite you used in #3-4 (bands 1,2,3 assigned to RGB color guns, respectively)\nAre the SPOT bands in their proper order? (is band 1 really the first band of SPOT data? In other words, is this really the shortest wavelength band of SPOT?). [HINT: Use the “inquire cursor”, examine pixel values, assume all center-pivots in the lower right are vegetated. Justify your answer and, if they are not in correct order, what order are they in? (3 points) Please see next page.\n Are they in Proper order: YES NO\rIndicate order: Band\r 1\t=\n2\t= 3\t=\nStep 7: MSS Image Let’s display another image. You can continue using the same viewer if you wish. Or, if you want to keep the garden.img up, you can open another viewer window, by clicking on the File tab, then click on the New option, and then click on the 2D View option (Do not worry about the additional options that appear).\nUse the open file button and find msslaw.img in your directory. Accept the default color assignments, but in Raster Options tab, click on Fit-to-Frame. Click OK\n This image is a Landsat MSS image of Lawrence. Comment on the spatial resolution with respect to garden.img. (2 points) Is the spatial resolution coarser or finer?\n Both the garden.img and msslaw.img are subsets (smaller portions) of the full SPOT and MSS scenes. Assume both subsets are same % of the total image. However, if both a full SPOT image and a full MSS image were displayed to the same dimensions, which would be smaller-scale and why? (3 points) Which full scene will cover the largest area?\n TM Image Now, let’s display a TM image. In a viewer window, click on the open file to display tmdata.img (the one saved in your directory) as a single band gray-scale image. This file contains TM data over Lawrence. 
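Before examining the TM bands one by one, it may help to see that "assigning bands to color guns" is nothing more than choosing which band fills the red, green, and blue channels of the displayed array. A toy numpy sketch with synthetic bands (the array sizes and values are made up purely for illustration):

```python
import numpy as np

# Pretend these are three bands of the same scene (rows x cols of DNs).
rng = np.random.default_rng(0)
b1, b2, b3 = (rng.integers(0, 256, (100, 100), dtype=np.uint8) for _ in range(3))

# Choosing which band feeds each color gun is just choosing the channel order.
composite   = np.dstack([b3, b2, b1])   # band 3 -> red gun, band 2 -> green, band 1 -> blue
single_band = np.dstack([b3, b3, b3])   # the same band in all three guns

# When R == G == B for every pixel there is no hue left, only gray levels.
print(composite.shape, single_band.shape)
print(np.array_equal(single_band[..., 0], single_band[..., 1]))  # True
```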
Let’s simply examine each band individually. This time, in the Open Raster Layer window, under Raster Options tab, find the Display As field and from the dropdown menu select Gray Scale, and select band (layer) 1. Before displaying it, answer the following:\nUnder Layers, how many bands (layers) are available? ___________________ (1 point) As I mentioned, IMAGINE performs an automatic contrast stretch. However, let’s take a look at what happens when we do not perform a contrast stretch. Under Raster Options, click on Fit-to-Frame and No Stretch. Click on OK.\nYou should be displaying only band 1 (without contrast stretching).\nSuppose we had chosen the True Color display option and assigned band 1 to all three color guns, what color would the image be displayed as? (see #4) (1 point) Look at the displayed image. Note that these data have not been contrast stretched! These are RAW data, with each pixel assigned a computer brightness value that matches their pixel value.\n Are all the land cover types clearly distinct from one another? (2 points)\n Which 3 land cover types have a higher spectral reflectance? (2 points) They are lighter/brighter in the image.\n Use the Panchromatic tab and examine each band individually (remember to click on Fit-to-Frame). Continue to do this until you have looked at each band (1-7). Close the Set Layer Combinations window when done.(Note: there is a drop down menu and each band is there)\nRedisplay the tmdata.img by using the open file and assign band 1 to all the color guns (again, using True Color and Fit-to-Frame, but this time do not click on the No Stretch box).\nIn what 2 ways is this image display different from the non-stretched one? (3 points) Again, use the Multispectral tab Bands subtab to look individually at all the bands, now in contrast-stretched form.\nWhich band is the Thermal band? How can you tell? (2 points) It should be obvious which one looks different. Look at your textbook for a description of how the thermal band differs. The bands 1-7 of tmdata.img match Landsat TM bands 1-7. Next you are going to display a color-IR false color composite, a digital image which mimics a Color Infrared photograph.\nRedisplay the image tmdata.img with the open file. Use True Color as the display, and for the color gun assignments, use 4,3,2, in that order. Also, let’s see what the image looks like without contrast stretching. Click on the No Stretch box and Fit-to-frame, then click on OK.\n Provide Band #, wavelength and word description for band-color gun assignments.(9 points) Look at your textbook for the information on the bands you displayed.\n Band #\tWavelength\tColor Gun Assignment\r Red color gun will display ______\t_____________\t___________________\nGreen color gun will display ______\t_____________\t___________________\nBlue color gun will display ______\t_____________\t___________________\nThis image lacks contrast (as did the single band “no stretch” images) because we did not let IMAGINE perform the automatic contrast stretching. Let’s try to manually change contrast. In the contents tab on the left, right-click on the current image and select Brightness/Contrast. Use the HELP button to find out what the two top slide bars refer to. Press File-Exit to close the help window. The contrast and brightness can also be found under the Multispectral tab\nTop slide bar refers to: (1 point)\nBottom slide bar refers to: (1 point)\nChange brightness to 60 and contrast to 75. Click Apply. How does this change the image? 
(2 points) Play around some more with the contrast tool. Alter the slider bar values. When you want to go back to the original image, just hit Reset. Just make sure you DO NOT hit SAVE on the Contrast Tool menu \u0026ndash; this will permanently change the file. Hit Close to exit from the Contrast Tool window.\nUse open file to redisplay the same 4,3,2 color composite of tmdata.img, only this time do not click on the No-Stretch box. Let IMAGINE perform its automatic contrast stretching.\nAfter you have looked at the automatic contrast-stretched image, go to the Multispectral tab\\Bands subtab to alter the band combinations of tmdata.img. Change color gun assignments to 3,2,1, red, green, blue (RGB) respectively. 17. What type of color composite is this? (Hint: Think theoretically what you are doing - what wavelengths are you assigning to which color guns?) (1 point)\nPlay around with other band combinations to form different color composites if you wish. When you are finished, press File Exit from the main IMAGINE panel menu. If you get the following prompts, “Are you Sure you want to quit Imagine?” YES, and “Do you want to print the log file?” NO. YOU\u0026rsquo;RE DONE!!!!!!\nSubmission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":5,"href":"/classes/geog526/labs/lab09A/","title":"Lab - Introduction to ERDAS IMAGINE (Part II)","parent":"Labs","content":"Guide Image Info •\tEnables you to view and edit many elements of a raster image file (.img), including statistics, map information, and projection information.\n•\tto access, click on the\ticon or right click on image name in contents and select metadata\nInquire Cursor •\tgives you individual pixel information by using a cursor which displays as a crosshair in the Viewer window •\tlists all of the available layers and indicates which color represents each individual layer •\tFor every layer, each pixel has a value representing reflectance •\tHigher values indicate brighter reflectance •\tto access, click on the\ticon\nDigital Number •\tNumber that represents reflectance value •\tAssigned to each pixel •\tAlso referred to as bit value (BV) or pixel value (PV)\nDN.png\nSkip Factor •\tTells the computer how many pixel values to look at when computing statistics •\tFor example if skip factor = o\t1 = look at every pixel o\t2 = every other pixel is sampled \u0026amp; used to calculate statistics))\nHistograms\nDNCurvesNormal.PNG\n•\tThis histogram tells you that there are 110 pixels with a DN = 127 and 20 pixels with a DN = 70\nBimodal Histogram •\tInfrared bands (such as band 4) are easily distinguishable from other bands because they are bimodal due to the drastically different reflectance of water and vegetation in IR\nDNCurves.PNG\nExercise. Copy the Lab07 folder from G:Fall-2015\\G526\\ . Right click on the Lab07 folder and choose copy. Navigate to a local directory, right click and choose paste.\nStart ERDAS IMAGINE and open the tmdata.img using the methods learned in Lab5. Begin by looking at each band of the tmdata.img individually (Gray-Tone), and a false color composite if you wish \u0026mdash; however, click on the No Stretch box when displaying. Use Raster Panchromatic (or Multispectral) tab to look at all bands. By using the No Stretch, it is almost as if no statistics have been created for the data. 
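The statistics IMAGINE builds in the next step (per-band minimum, maximum, mean, standard deviation, a histogram, and optionally a skip factor that thins the pixels sampled) can be reproduced directly, which helps demystify the numbers you will record below. A hedged sketch using the GDAL Python bindings and numpy, assuming 8-bit DNs and a GDAL-readable copy of tmdata.img:

```python
import numpy as np
from osgeo import gdal

gdal.UseExceptions()
ds = gdal.Open("tmdata.img")          # any GDAL-readable raster works here
skip = 2                              # "skip factor": sample every 2nd pixel in x and y

for b in range(1, ds.RasterCount + 1):
    band = ds.GetRasterBand(b).ReadAsArray()[::skip, ::skip]
    hist, _ = np.histogram(band, bins=256, range=(0, 256))  # one bin per 8-bit DN
    mode_dn = int(np.argmax(hist))    # DN with the most pixels (the histogram peak)
    print(f"band {b}: min={band.min()} max={band.max()} "
          f"mean={band.mean():.1f} std={band.std():.1f} "
          f"mode DN={mode_dn} ({hist[mode_dn]} px)")
```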
You will now go through the procedure to “build statistics” for tmdata.img.\nFrom the viewer menu, under the Home tab, click Met adata Tab (under the Information tab) and click on View/Edit Image Metadata. In the Image Metadata window that comes up, under the General tab, look in the Statistics Info section. This is what IMAGINE looks at to determine a stretch. Let’s pretend that we have no statistical information and we have to “build” or create the statistics.\nPress the “” icon. This button goes through the data and finds the pixel values and compiles statistics. Look at the options, but you need not change anything. Use the Help button to find out more about the options. Close Help when done. Press Ok to generate statistics.\n What does the \u0026ldquo;skip factor\u0026rdquo; refer to? (1point) After we have compiled statistics, we can look at histograms of the data. This is a common and useful exercise in remote sensing work, and it is used to get a “feel” for the data. In the Image Metadata window, press the icon that looks like a histogram. With the cursor, grab the corner of the histogram window and stretch it out so it’s nearly as wide as the screen. Look at the histograms for all bands. In the Image Metadata window, change the selected band and the histogram should automatically be updated.\nUsing the Statistics Info, fill in the following chart: (7 points) 1 2 3 4 5 6 7 Min. Data Value Max. Data Value Mean Standard Deviation Look at all the statistics information/histograms and answer the questions below:\nFind the pixel value (digital number) with the most pixels (points) for each band (look at the mode value). List what the pixel value is and the number of pixels (see the histogram tab and by putting your cursor over any pixel in the histogram, you get the pixel value on the x-axis and the frequency of that value on the y-axis): (7 points) TM Bands\n 1 2 3 4 5 6 7 Pixel Value # of Pixels Why are the pixel values with the largest number of pixels not the same as the mean values found in #2? (3 points)\n Look at the histograms for bands 4, 5, and 7. Describe the major difference between these histograms and the ones for 1, 2, and 6 (use your lab guide to help you answer this question). (3 points)\n What is causing this difference? (3 points)\n Close the histogram window and File Close from the Image Metadata window. Redisplay tmdata.img, looking at the same bands individually (in Gray Tone) as you viewed in the beginning of this lab, only this time do NOT click on the No stretch box. Does the image look the same as it did in the beginning? If not, how has the appearance changed? (3 points)\n Display tmdata.img again, but this time select the True Color display option and create a true color composite using bands 3, 2, and 1 (band 3 to RED, band 2 to GREEN, band 1 to BLUE).\nWhen the image is displayed, press the Inquire Icon “+” on the Viewer menu to bring up the Inquire Cursor. To zoom in press the magnifying glass with the “+” in it. To get back to the full image, make sure you have clicked on the arrow cursor in the viewer menu, and then hold the right mouse button and select “Fit to frame”, or simply click the “Fit to Frame” button under the home tab.\nFill in the chart below. Use values from the “File pixel” column. The values in the “LUT Value” are the computer brightness values assigned to the pixel after the raw DNs have been temporarily stretched for display/contrast enhancement purposes. 
In the Inquire Cursor window, in the upper left, change the coordinate type to File instead of Map. Note the “File X and Y” coordinates change when you move the cursor. (7 points) TM Bands\n Area 1 2 3 4 5 6 7 Clinton Lake, near center Kansas River, between I-70 \u0026amp; Mass. St. Bridge Lush, Agricultural Field Downtown Area I-70 (zoom in to find road) Some area that looks interesting; Describe: Compare these values to the histograms and see where the pixel values you observe in the Inquire cursor window fall in the histogram (if you could not answer question 4 before you may be able to do so now). Comment on the brightness values you have observed (I expect an in-depth answer here). Focus on one of the areas you looked at in #8. (10 points) Close all viewers when you are done and DO NOT save changes.\nExit ERDAS Imagine, you’re done!\n"},{"id":6,"href":"/classes/geog526/labs/lab05/","title":"Lab - INTRODUCTION TO IMAGE INTERPRETATION","parent":"Labs","content":"Learning Objective This exercise is designed to introduce you to procedures used in image analysis. The image diagnostics (or clues) that we use to identify and classify features that we see depicted on images are as follows:\n(S) Size (Sh) Shape (T) Tone or color (Tx) Texture (P) Pattern (St) Site or location (A) Association or context (Sd) Shadow (Te) Temporal characteristics\nDuring this lab you will identify features on different digital aerial imagery. Use one or more of the diagnostics, logic, and inference to answer the questions posed for each location. You may not always be certain that your answer is correct, but make a best guess in every case. Briefly explain your reasoning if asked to do so.\nNote: When asked to indicate diagnostics used in your feature identification, respond by using initials provided in parentheses above. When asked for the type of aerial imagery, respond with the appropriate initials indicated in parentheses as follows: (P) Black and white panchromatic aerial imagery (IR) Black and white infrared aerial imagery (C) Natural color aerial imagery (CIR) Color infrared aerial imagery\nOutline: Learning Objective Submission requirements Guide Tutorial Submission Submission requirements Materials\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab05Questions.docx\rHandout to turn in\r\r\rrawdata/ACSDT5Y2019.B01003_2021-04-18T051624.zip\rPopulation estimates for Kansas Counties from ACS - 2019\r\r\rrawdata/GOVTUNIT_Kansas_State_GDB.zip\rKansas boundaries from the National Map\r\r\rrawdata/STRUCT_Kansas_State_GDB.zip\rKansas structures from the National Map\r\r\rrawdata/TRAN_Kansas_State_GDB.zip\rKansas transportation layer from the National Map\r\r\rGuide Image Interpretation Image interpretation involves examining aerial photographs/images for the purpose of identifying objects and judging their significance. 
Novice photo interpreters often encounter difficulties when presented with their first aerial photograph. Aerial photographs are different from ‘regular’ photos in a variety of ways (perspective, scale, visible/infrared wavelengths). The following image diagnostics (or clues) can aid in identifying objects on aerial photographs.\nImage Diagnostics Size – the use of size as a diagnostic characteristic both the relative and absolute sizes of objects can be important (cars vs. trucks or buses; single family vs. multi-family residences, brush vs. trees, etc.). The size of objects must be considered in the context of the scale of a photograph. The scale will help you determine if an object is a stock pond or Clinton Lake.\nShape – the shape of objects/features can provide diagnostic clues that aid identification. For example, roads can have right angle turns while railroads can not. Man-made features have straight edges that natural features tend not to. Regular geometric shapes are usually indicators of human presence and use. Some objects can be identified almost solely on the basis of their shape (for example, the Pentagon).\nTone/Color – tone refers to the relative brightness or color of elements on a photograph. It is, perhaps, the most basic of the diagnostic elements because without tonal differences none of the other elements could be discerned.\nTexture – refers to the frequency of change or arrangement of tones. The impression of smoothness or roughness of image features is caused by the frequency of change of tone in photographs. The visual smoothness or roughness of an area can often be a valuable clue in image interpretation. Water bodies are typically fine textured, while grass is medium, and brush is rough, although there are always exceptions.\nPattern – refers to the spatial arrangement of objects. Pattern can be either man-made or natural. For example, consider the role of pattern as a diagnostic tool when distinguishing between an unmanaged area of trees (random pattern) and an orchard (evenly spaced rows).\nSite – refers to topographic or geographic location. This characteristic of photographs is especially important in identifying vegetation types and landforms. For example, large circular depressions in the ground are readily identified as sinkholes in central Florida, where the bedrock consists of limestone. This identification would make little sense, however, if the site were underlain by granite.\nAssociation – some objects are so commonly associated with one another that identification of one tends to indicate or confirm the existence of another. For example, smoke stacks, step buildings, cooling ponds, transformer yards, coal piles, railroad tracks = coal fired power plant.\nShadow - shadows aid interpreters in determining the height of objects in aerial photographs. However, they also obscure objects lying within them.\nTemporal characteristics – refer to the date/time at which the photo was acquired. Seasonal variations in reflectance aid in interpretation.\nTypes of film Standard Films – record the visible wavelengths (red, green, and blue) Panchromatic film – normal black and white film Natural color film – normal color film Infrared Films – record visible and infrared wavelengths Black \u0026amp; white infrared film – black and white film that detect IR reflectance Color infrared film – color film that detects IR reflectance\nWhy IR? 
Color infrared photography, often called \u0026ldquo;false color photography\u0026rdquo;, is widely used for the interpretation of natural resources. Both standard-color and color-infrared films are manufactured to have three distinct layers, or emulsions. Each layer is sensitive to different wavelengths or energy.\nGreen, healthy vegetation has a high reflection level of near-infrared wavelengths and appears red on the processed film; red objects with very low near-infrared reflection appear green; green objects with very low near-infrared reflection appear blue; and blue objects with very low near-infrared reflection appear black. The primary use of color infrared photography is vegetation studies. This is because healthy green vegetation is a very strong reflector of infrared radiation and appears bright red on color infrared photographs. For example, leaves of healthy, growing vegetation reflect a high level of near-infrared wavelengths and appear red on color-infrared film. Unhealthy or dormant vegetation may appear light red or a light shade of blue-green, depending on the plant\u0026rsquo;s degree of good health. These color distinctions make color-infrared photographs useful in assessing the health of plants.\nWater, on the other hand, absorbs near-infrared wavelengths and appears black in the image. Water with varying amounts of suspended particles appears as shades of blue. Shallow water would reflect the material present in its stream bottom. Bare soils appear as patches of white, blue, or green in most agricultural regions. Generally speaking, the moister the soil, the darker the soil color. Soil composition affects all color ranges shown on aerial photographs. Dry, sandy land will appear white in color. With the addition of moisture to this land, the white coloring turns into light gray or light tan. Soils composed of clay are darker in color than the sandy areas as well as tending toward more blue-green tones. Clay soils holding extreme moisture would resemble darker shades of the same colors. These identical soils, when high in organic matter, such as silt or loam, would be viewed darkest in the same corresponding color scheme.\nIR Quick Reference Summary\nIR Color Shift If feature reflects:\tBlue\tGreen (except veg)\tRed\tNearIR\nThen it appears:\tBlack\tBlue\tGreen\tRed\nVegetation Reflects more in nearer (see Spectral Reflectance Handout 2-35) Factors that influence IR response include: Different species (see Spectral Reflectance Handout 2-37) Temporal aspects (life cycle) Density of vegetation Health of vegetation (see Spectral Reflectance Handout 2-38)\nWater Stands out clearly on IR photos Because water absorbs IR, it will appear black (see Spectral Reflectance Handout 2-35) Silt-laden water may appear blue-green based on sediment reflectance (see Spectral Reflectance Handout 2-35)\nSoil Wet soil will absorb IR (because of water content) and appear dark Dry soil will reflect IR and appear lighter\nInstructions for connecting to DASC Image Server Click on the Add Data button , and set the Look in: menu option to GIS Servers.\nDouble-click on Add ArcGIS Server.\nSelect Use GIS Services and click Next.\nEnter http://imageserver.kansasgis.org/arcgis/services as the Server URL and click Finish.\nThis will create an ArcGIS Server connection called arcgis on imageserver.kansasgis.org. Double-click on the connection name to connect to the server. Once connected, you will see a list of folders containing map services available from the server. Click the Add Data icon . 
In the Look in: menu option, select GIS Servers and select arcgis on imageserver.kansasgis.org. It may take a minute for it to respond. Click on the Statewide folder and ask me what files to add for this lab.\nTutorial Copy the G:\\Fall-2015\\G526\\Lab04 folder from the server to your local hard drive. Open the Geog526_Lab4.mxd file in ArcGIS. This file contains a shapefile (Lab4_Features.shp) that outlines areas in multiple locations you will interpret. On the main menu, click on Bookmarks to see a list of locations you will use. There are multiple dates of aerial imagery for Kansas available at the Data Access and Support Center at the Kansas Geological Survey. You will access this data through their image server. If the imagery displays in ArcGIS upon opening the .mxd file, your computer has already been set up to access the image server. If you need to connect to the image server, follow the instructions in the guide.\nArcGIS Tips: Click a layer on or off to see the layers underneath. Use the zoom and pan tools to navigate the data. Use the Identify tool to view the attributes for a feature. Activate the tool by clicking on the icon in the toolbar. Then click on the feature (i.e. point, polygon) and a new window opens showing the attributes for that feature.\nTo view specific information about the imagery or layer, click on the file name in the Table of Contents, right-click, and select Properties. Under the Source tab you will find information such as the number of bands, cell size (spatial resolution), pixel type and projection information.\nLocation - Kansas City, KS Click on the Bookmark labeled Kansas City, KS Make the following layers visible:\nLab4_Features ortho_1-1-1m_ks209_2008_2 2003_NAIP_2m_1\n What type of aerial imagery is this (hint: consider the band combination)? (1 point) Right click on ortho_1-1-1m_ks209_2008_2 in the Table of Contents and click on Open Attribute Table. This is the flight line information for the aerial imagery flown for Wyandotte County (FIPS Code 209), where Kansas City, KS is located.\n Use the attribute table for ortho_1-1-1m_ks209_2008_2 to provide the full date range for the aerial imagery flown for Wyandotte County: (1 point) Identify the features indicated. Indicate diagnostics used in your identification and briefly explain your reasoning. (12 points)\r Location Identification Diagnostics Used Reasoning\n 1\n 2\n 3\n 4\n Look closely. What evidence do you see that indicates it might be possible to separate some species of trees? (1 point)\r Location - Garden City and vicinity, KS Click on the Bookmark labeled Garden City, KS Make the following layers visible:\nLab4_Features ortho_1-1-1m_ks055_2008_2 2003_NAIP_2m_1\nProvide the date range for the aerial imagery for this county (open the attribute table for ortho_1-1-1m_ks055_2008_2) (1 point)? ______________________\n If the average diameter of a circular center pivot irrigation system is one-half mile, what is the approximate scale of this photo? (2 points)\r centerpivotscale.png\n You probably know that this is a relatively dry part of Kansas. With that in mind, what do you think we have at location 5 (1 point)?\rHow do you know (1 point)? Identify feature 7. (1 point) ____________________________________________________\r Briefly explain your reasoning. (1 point)\n Identify feature 8. (1 point) ____________________________________________________\rBriefly explain your reasoning (1 point). Note that there are other similar features at other locations in the image.\r What features are located at 9, 10 and 11?
(1 point) _________________________________________\rHow are 9 and 10 different from 11 (What different use do they have)? (1 point)\rHow do you know? (1 point)\r How are 9 and 10 different? (1 point) _________________________________________\nHow do you know? (1 point)\r What is located at 12? (1 point) ________________________________________\nHow do you know that it is not an example of center pivot irrigation? (1 point)\r Location - Topeka, KS Click on the Bookmark labeled Topeka, KS Make the following layers visible:\nLab4_Features 2003_NAIP_2m_1\n Identify feature 13. (1 point) ____________________________________________________________\rExplain your reasoning. (1 point)\r Identify feature 14. (1 point) ____________________________________________________________\rExplain your reasoning. (1 point)\r Explain why pixel resolution is more important than representative fraction in the digital remote-sensing images. (2 points)\n What are the white and black features in the image below? What image diagnostics did you use to identify these features? (2 points)\n truecolorcloudshadow.png\nSubmission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":7,"href":"/classes/geog526/labs/lab11/","title":"Lab - INTRODUCTION TO MODIS TIME SERIES","parent":"Labs","content":"Learning Objective This lab serves as an introduction to MODIS data products, which are widely used in a variety of terrestrial, aquatic and atmospheric remote sensing research and applications. This lab will familiarize you with MODIS vegetation indices products and various land cover/land use interpretation tasks. In addition, you will become familiar with the characteristics of MODIS imagery and time-series.\nOutline: Learning Objective Submission requirements Guide Tutorial Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rGuide Tutorial Part1 and Part2 are optional, so there are no questions about these two parts. Part1. Download the MODIS data\n Go to the website https://lpdaac.usgs.gov/data_access, then you can find a list of links to access the data. Click the link of Reverb (Launch Reverb). We will be able to search and download the data from the Reverb system. (Or you can directly go to http://reverb.echo.nasa.gov). 1 choose the bounding box 2 Draw a box to select the area that you are interested. (Not necessary to be the same of our example) 3 Input your Key word: MODIS NDVI in the Search Terms section 4 Specify the start and the end of time. 5 Check the dataset you would like to download. In this example, we select MODIS/Aqua Vegetation Indices 16-Day L3 Global 250 m SIN Grid V005\n After finishing the five steps shown above, then you can go to the end of this page and click . 
You will see a new window about the data quality summaries for the dataset we chose before; click “Accept”.\n Three data files were selected. “Browse” shows a quick view of NDVI and EVI. Click this button and you will see more details. After checking the basic information of these files, click to add them to the cart (the color changes from blue to yellow after you add them). Finally, click .\n Now you can download the data by clicking at the end of the page. A window will appear to download the paths of the data files. Keep the default choices, click “Save”, and then choose a directory in which to save the *.txt file.\n Open the file you saved in the last step and copy the paths into your browser to download the files. Be patient; it will take several minutes to finish the download. e.g.\n Part2. Import the MODIS VI product in ERDAS IMAGINE 2014. Do you remember the data import process (import SPOT6 *.jp2 to *.img) from the last lab? We will do similar steps. Go to “Manage Data” and choose “Import Data”. Change the “format” to HDF in the Import window. Then you can import the data to the .img file format. This process will take a long time due to the large data size and heavy computation. You can stop this step. We will provide the data for later use.\nPS. NASA has its own tool for reprojecting and mosaicking MODIS data, called MRT (MODIS Reprojection Tool). It is very fast and convenient; you can download it and find further details at https://lpdaac.usgs.gov/tools/modis_reprojection_tool. We won’t cover it in this lab.\nPlease read the lab guide before you start the following sections. Part 3. Basic information for MODIS\n Fill in the table below for MODIS (combining the information from the guide and the link https://lpdaac.usgs.gov/dataset_discovery/modis/modis_overview). For the band range name, you can use what you learned from the previous labs (compared to TM/SPOT) or the definition in http://en.wikipedia.org/wiki/Electromagnetic_spectrum . Remember to write values with units. (6 pts) Band number\tWave range\tResolution\tBand range (e.g. red, blue, green, near-infrared, or infrared) 1\t______________\t______________\t______________ 2\t______________\t______________\t______________ 3\t______________\t______________\t______________ 4\t______________\t______________\t______________ 7\t______________\t______________\t______________ 31\t______________\t______________\t______________\nRead carefully about the MODIS Naming Conventions below, and then answer the questions. MODIS filenames follow a naming convention which gives useful information regarding the specific product. For example, the filename MOD09A1.A2006001.h08v05.005.2006012234657.hdf indicates: MOD09A1 - Product Short Name .A2006001 - Julian Date of Acquisition (A-YYYYDDD) .h08v05 - Tile Identifier (horizontalXXverticalYY) .005 - Collection Version .2006012234657 - Julian Date of Production (YYYYDDDHHMMSS) .hdf - Data Format (HDF-EOS) The MODIS Long Name convention also provides useful information.
For example, all products belonging to the MODIS/Terra Surface Reflectance 8-Day L3 Global 500m SIN Grid V005 collection have the following characteristics: MODIS/Terra - Instrument/Sensor Surface Reflectance - Geophysical Parameter 8-Day - Temporal Resolution L3 - Processing Level Global - Global or Swath 500m - Spatial Resolution SIN Grid - Gridded or Not V005 - Collection Version A)\tThe vegetation indices product file name is MOD13A2.A2013209.h10v05.005.2013226035948.hdf Please answer the following questions. (Hint: for the long name, you may search for the short name on the LP DAAC website; the title of its home page is the long name, and it will be very similar to the example format.) (4 points)\nProduct short name\t___________________________________ Product long name\t___________________________________ Geophysical Parameter\t___________________________________ Sensor name ___________________________________ Julian Date and the regular date of Acquisition\t________________and _______________ Collection Version\t___________________________________ Spatial resolution ___________________________________ Temporal resolution\t___________________________________\nB)\tThe MODIS/Aqua Land Surface Temperature and Emissivity (LST/E) products, MYD11C2.A2004150.004.2005226210814.hdf (4 points) In order to answer the questions below, you may need to search the Internet. (This is the link for MYD11C2 products: https://lpdaac.usgs.gov/dataset_discovery/modis/modis_products_table/myd11c2 )\nProduct short name\t___________________________________ Product long name\t___________________________________ Geophysical Parameter\t___________________________________ Sensor name ___________________________________ Julian Date and the regular date of Acquisition\t________________and _______________ Collection Version\t___________________________________ Spatial resolution ___________________________________ Temporal resolution\t___________________________________\n Estimate the NDVI from the figures below (the percentages indicate the reflectivity; the equation for NDVI is in the guide). The left tree is healthy and the right one has withered. Then compare the values between these two trees and explain why NDVI is helpful for showing the health of vegetation. (2 points (calculations) + 1 point (the comparison))\n Estimate the NDVI from the spectral profiles. The figure shows the profiles of green vegetation, dry vegetation, and soil. First, draw and label the band ranges which are required to calculate NDVI (using the MODIS band ranges) in this figure. Then approximate the band-averaged reflectance to calculate NDVI for each feature in the graph below. (Show ALL your work and think about what information NDVI provides.) (9 points)\n A)\tWhich bands are used for calculating NDVI, and what are their spectral ranges (wavelength, with units)? (2 points)\nBand Name\tSpectral range 1 _________________________________\t_________________________________ 2 _________________________________\t_________________________________\nB)\tDraw the spectral ranges of these two bands on the figure (1 point) C)\tEstimate the reflectivity for each feature below and calculate the NDVI (4.5 points) (show ALL your work!)\nSurface features\tReflectivity 1\tReflectivity 2\tNDVI Green Vegetation Dry Vegetation\tSoil\nD)\tCompare the NDVI values you calculated above: which feature has the highest NDVI? Why can NDVI distinguish vegetation from non-vegetation? (1.5 points)\nFrom G:/Fall-2015/GEO526, copy the Lab09 folder to your local drive.
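If you want to sanity-check your answers programmatically, the sketch below illustrates the two ideas just covered: splitting a MODIS filename into the components described in the naming convention, and computing NDVI from band-averaged red and NIR reflectance. It uses the worked-example filename from the convention text above (not the graded files), and the reflectance numbers in the example calls are made-up placeholders, not values read from the lab figures.

```python
# Illustrative sketch only: parse the worked-example MODIS filename from the text above
# and compute NDVI from band-averaged reflectance. Reflectance values are placeholders.
from datetime import datetime

def parse_modis_filename(name):
    # Note: some products (e.g. CMG grids) omit the tile field, so this simple
    # six-field split will not fit every MODIS filename.
    short_name, acq, tile, version, prod, ext = name.split(".")
    acq_date = datetime.strptime(acq[1:], "%Y%j").date()   # A-YYYYDDD -> calendar date
    return {
        "short_name": short_name,
        "acquisition_julian": acq[1:],
        "acquisition_date": acq_date,
        "tile": tile,                                       # hXXvYY
        "collection_version": version,
        "production_timestamp": prod,                       # YYYYDDDHHMMSS
        "format": ext,
    }

print(parse_modis_filename("MOD09A1.A2006001.h08v05.005.2006012234657.hdf"))
# -> short name MOD09A1, acquisition day 001 of 2006 (2006-01-01), tile h08v05, ...

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), using band-averaged reflectance."""
    return (nir - red) / (nir + red)

print(ndvi(red=0.05, nir=0.45))   # placeholder healthy-vegetation values -> about 0.8
print(ndvi(red=0.30, nir=0.35))   # placeholder withered/dry values -> about 0.08
```

Python's `%j` format code handles the Julian (day-of-year) conversion directly, which is handy when reading acquisition dates off MODIS filenames.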
Lab09/MYD13Q1 contains the original MODIS vegetation indices file (.hdf) and the reprojected files (.tif) exported from hdf file. Open ERDAS and load the tif files, display the images and answer the following questions. First, carefully read about the vegetation products available, including the layer information. https://lpdaac.usgs.gov/dataset_discovery/modis/modis_products_table/mod13q1 The units, bit types, and scale factors are essential for you to get the correct values in MODIS products.\nScience Data Sets (HDF Layers) (12)\tUNITS\tBIT TYPE\tFILL*\tVALID RANGE **\tSCALE FACTOR*** 250m 16 days NDVI\tNDVI\t16-bit signed integer\t-3000\t-2000, 10000\t0.0001 250m 16 days EVI\tEVI\t16-bit signed integer\t-3000\t-2000, 10000\t0.0001 250m 16 days VI Quality detailed QA\tBits\t16-bit unsigned integer\t65535\t0, 65534\tNA 250m 16 days red reflectance (Band 1)\tReflectance\t16-bit signed integer\t-1000\t0, 10000\t0.0001 250m 16 days NIR reflectance (Band 2)\tReflectance\t16-bit signed integer\t-1000\t0, 10000\t0.0001 250m 16 days blue reflectance (Band 3)\tReflectance\t16-bit signed integer\t-1000\t0, 10000\t0.0001 250m 16 days MIR reflectance (Band 7)\tReflectance\t16-bit signed integer\t-1000\t0, 10000\t0.0001 250m 16 days view zenith angle\tDegree\t16-bit signed integer\t-10000\t-9000, 9000\t0.01 250m 16 days sun zenith angle\tDegree\t16-bit signed integer\t-10000\t-9000, 9000\t0.01 250m 16 days relative azimuth angle\tDegree\t16-bit signed integer\t-4000\t-3600, 3600\t0.1 250m 16 days composite day of the year\tJulian day of year\t16-bit signed integer\t-1\t1, 366\tNA 250m 16 days pixel reliability summary QA\tRank\t8-bit signed integer\t-1\t0, 3\tNA\n*The values in the Fill column means that if the data are unavailable for a pixel (NoData), use the listed fill value and assign that pixel to that value. ** Valid range shows the meaningful values in the product. If the pixel value ranges between these two values, it is valid and can be converted to the real scale by multiplying by the “scale factor”. *** SCALE FACTOR is used to calculate the real scale pixel value. Multiplying valid range by the scale factor gives you the real scale value.\nThe next exercise is to calculate the real scale value for NDVI and EVI.\nA)\tWhat’s the real range for NDVI and EVI (note, it is not the valid range for the NDVI and EVI product list the table above)? There are four pixels representing the dense vegetation, deep water, bare soil, and NoData in the image, what’s the real scale NDVI if the pixel values are -3000, 5600, -1050, and 1? (Show all your steps, 4 points) And assign the surface feature to each NDVI. A: dense vegetation ; B: deep water; C: bare soil; D: NoData Pixel values (NDVI)\tScale factor\tThe real scale NDVI\tSurface Features (A, B, C, or D), choose from the list above -3000\t5600\t-1050\t1\nB)\tUse the inquire tool under Home tab/Information section (or select the tool from the right click menu on the image). Navigate the cross to the location in the inquire window below and find the pixel values of reflectance (band 1(Red), 2(NIR), 3(Blue), and 7(MIR)), NDVI and EVI, and calculate the real scale values for those points. 
(be sure you are clear about the fill value, valid range and multiply by scale factor for each layer) (12 points).\nFor example, change the first choice to “File”, then you can type in the file coordinates provided in the window below, then read the FILE PIXEL value in the window.\nLocation 1: File Location (X,Y) -\u0026gt; (5777,2424)\tPixel value (or File Pixel)\tScale factor\tReal scale value NDVI\tEVI\tRed\tNIR\tBLUE\tMIR\nLocation 2: File Location (X,Y) -\u0026gt; (5300,2928)\tPixel value (or File Pixel)\tScale factor\tReal scale value NDVI\tEVI\tRed\tNIR\tBLUE\tMIR\nC)\tUsing the two tables above, use the pixel values in NIR and RED to calculate NDVI, and compare the values with the NDVI provided in the products. (2 points)\nLocation 1:\tLocation 2:\nThink carefully about the differences between TM and MODIS. Discuss the pros and cons of the two datasets and their potential applications. (List at least 3 pros and cons and 2 suitable applications for each sensor) (5 points) Part 4: MODIS NDVI Temporal Profiles In a viewer open the MODIS time-series located at: G:/Fall-2015/Geo526/Lab09/Terra/terra_2012_subset.img. NDVI in this file is still in the valid range (-2000 – 10,000). The subset is of an agricultural area in western Kansas.\nEach layer is a 16-day composite of NDVI values. Assign the Red, Green and Blue color guns to layers 8, 13 and 18, respectively. Open the Spectral Profile tool under the Utilities subtab under Multispectral. Use the crosshairs to create multiple spectral profiles in one graph for one field for each color below. Examine the NDVI profiles over the year. Using the table below showing time periods and Julian days for 16-day composites, label the crop type you think each color is and the time of year the peak or maximum NDVI occurs. (6pts) Colors in MODIS Time-Series\tCrop Type\tDate of Maximum NDVI\n Red Royal blue White Green "},{"id":8,"href":"/classes/geog526/labs/lab07/","title":"Lab - LANDSAT TM \u0026 SPOT IMAGERY","parent":"Labs","content":"Learning Objective The purpose of Lab06 is to familiarize you with interpreting Landsat Thematic Mapper and SPOT imagery. Upon completion of this lab you should be aware of the usefulness of each TM band, and the similarities and differences between TM and SPOT data.\nOutline: Learning Objective Submission requirements Tutorial of Bands: 3 Wrapping up Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab08Questions.docx\rHandout to turn in\r\r\rTutorial Q1. Circle the nadir point in the air photo and discuss why. 
(2 points) NADIRimage.jpg\nPart 2: Flight Planning\nThis example illustrates the various calculations involved in preparing an aerial flight plan for an area of 80 mi².\nBasic information required is as follows: Desired photographic scale: 1,320ft/in Scale of base map: 1:62,500 or 1in = 5,208ft Size of area: 8 miles EW by 10 miles NS, or 42,240ft by 52,800ft Average ground elevation above mean sea level (MSL): 1,200ft Average forward overlap: 60% Sidelap: 15-45%, averaging approximately 30% Negative format: 9x9in, or 11,880ft by 11,880ft on the ground Camera focal length: 6in or .5ft\nItems to be computed in preparing the flight plan are as follows: Flying height above ground and height above mean sea level Direction and number of flight lines Ground distance between flight lines Actual percentage of sidelap Map distance between flight lines Ground distance between exposures on each line Map distance between exposures on each line Number of exposures on each line \u0026amp; total number of exposures\nFlight map computations:\nFrom #1 the desired scale is 1,320ft (GD) per inch (PD): 1,320ft * 12in = 15,840in, so the desired scale = 1:15,840\nA. Flying height above ground (AGL) and height above mean sea level Flying height above ground (AGL): AGL = focal length * desired scale denominator AGL = .5ft * 15,840 = 7,920ft above ground\n Flying height above mean sea level:\r7,920 + 1,200 (from #4) = 9,120ft\r B.\tDirection and Number of Flight Lines Direction of flight lines: N-S, following the long dimension of the tract. Number of flight lines: Assuming an average sidelap of 30% (#6), the lateral gain from one line to another is 70% of the print width, or Print width GD = negative format * scale denominator = 9in * 15,840 = 142,560in = 11,880ft (1 - Average sidelap) * Print width GD = 0.70 x 11,880ft = 8,316ft (lateral gain) between lines.\nThe number of intervals between lines is found by dividing the tract width (42,240ft)\rby the lateral gain of 8,316ft. The result is\rTract width / Lateral gain = 42,240ft / 8,316ft = 5.08, or 5 intervals and 6 flight lines. C. Ground Distance between Flight Lines Tract width / Intervals = 42,240ft / 5 = 8,448ft GD between flight lines\nD. Actual Percent of Sidelap Actual percent of sidelap, assuming exterior flight lines are centered over tract boundaries:\nSidelap% = (Print width GD (ft) - GD between flight lines (ft)) / (Print width GD (ft)) * 100\t Sidelap% = (11,880 - 8,448) / 11,880 * 100 = 28.9%\nE. Map Distance between Flight Lines Map scale = 1:62,500\n 1 / 62,500 = Map Distance / Ground Distance, so 1 inch / 5,208ft = x inches / 8,448ft\r= 1.62in between flight lines on the map\r F. Ground Distance between exposures on each line Assuming an average forward overlap of 60%, the spacing between successive exposures is 40% of the print width, or:\n 0.40 * 11,880ft = 4,752ft\r G. Map Distance between exposures on each line\n 1 / 62,500 = Map Distance / Ground Distance, so 1 inch / 5,208ft = x inches / 4,752ft = 0.91in between exposures on the map\r(OR: scale of base map = distance between exposures on map / (40% of print width))\r H. Number of exposures on each line and total number of exposures The number of intervals between exposures is found by dividing the tract length (52,800ft) by 4,752ft = 11.11 intervals. This would require 12 exposures inside the area, assuming that the first exposure is centered over one tract boundary.\nIn addition, two extra exposures are commonly made at the ends of each line.
Thus, a total of 12 + 2 + 2 = 16 exposures would be taken on each flight line.\rSo, the total number of exposures required to cover the entire tract is:\r6 lines * 16 exposures/line = 96 exposures\r Part 3\nAssume you get a job, after acing this class, and are a well-paid Projects Coordinator for a local photogrammetric consulting firm. Your task is to set up a flight plan for an over-flight near Lawrence, Kansas. Use the following information to plan the mission.\rDesired photographic scale: 1:24,000\rScale of base map: 1:100,000\rSize of area to be covered: 7 miles EW by 10.5 miles NS Average ground elevation above mean sea level (MSL): 450m\rAverage forward overlap: 65%\rSidelap: 15-45%, averaging 30%\rNegative format: 9x9in Camera focal length: 6in (.5ft)\r Q2: CALCULATE THE FOLLOWING: (2 points each, a total of 22) A. Flying height above ground (AGL) and height above mean sea level Flying height above ground (AGL): AGL = focal length * desired scale denominator AGL = .5ft * 24,000 = 12,000ft above ground\n Flying height above mean sea level:\r12,000 + 1,476.4 (450m converted to feet) = 13,476.4ft\r B.\tDirection and Number of Flight Lines Direction of flight lines: N-S, following the long dimension of the tract. Number of flight lines: Assuming an average sidelap of 30% (#6), the lateral gain from one line to another is 70% of the print width, or Print width GD = negative format * scale denominator = 9in * 24,000 = 216,000in = 18,000ft (1 - Average sidelap) * Print width GD = 0.70 x 18,000ft = 12,600ft (lateral gain) between lines.\nThe number of intervals between lines is found by dividing the tract width (36,960ft)\rby the lateral gain of 12,600ft. The result is\rTract width / Lateral gain = 36,960ft / 12,600ft = 2.93, or 3 intervals and 4 flight lines. C. Ground Distance between Flight Lines Tract width / Intervals = 36,960ft / 3 = 12,320ft GD between flight lines\nD. Actual Percent of Sidelap Actual percent of sidelap, assuming exterior flight lines are centered over tract boundaries:\nSidelap% = (Print width GD (ft) - GD between flight lines (ft)) / (Print width GD (ft)) * 100\t Sidelap% = (18,000 - 12,320) / 18,000 * 100 = 31.56%\nE. Map Distance between Flight Lines Map scale = 1:100,000\n 1 / 100,000 = Map Distance / Ground Distance, so 1 inch / 8,333.3ft = x inches / 12,320ft\r= 1.48in between flight lines on the map\r F. Ground Distance between exposures on each line Assuming an average forward overlap of 65%, the spacing between successive exposures is 35% of the print width, or:\n 0.35 * 18,000ft = 6,300ft\r G. Map Distance between exposures on each line\n 1 / 100,000 = Map Distance / Ground Distance, so 1 inch / 8,333.3ft = x inches / 6,300ft = 0.76in between exposures on the map\r(OR: scale of base map = distance between exposures on map / (35% of print width))\r H. Number of exposures on each line and total number of exposures The number of intervals between exposures is found by dividing the tract length (55,440ft) by 6,300ft = 8.8 intervals. This would require 9 exposures inside the area, assuming that the first exposure is centered over one tract boundary.\nIn addition, two extra exposures are commonly made at the ends of each line. Thus, a total of 9 + 2 + 2 = 13 exposures would be taken on each flight line.\rSo, the total number of exposures required to cover the entire tract is:\r4 lines * 13 exposures/line = 52 exposures\r Part 4 Introduction to ERDAS IMAGINE\nIntroduction This part will provide an introduction to the Earth Resources Data Analysis System (ERDAS) IMAGINE software that you will be using for the rest of the semester.\nStep 1: Copy files to your directory Log on to the machine.
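Before continuing with the ERDAS setup steps, here is a compact Python recap of the flight-plan arithmetic worked through in Parts 2 and 3 above. It is a sketch for checking hand calculations only; the rounding conventions (rounding the number of line intervals, rounding the exposure count up, and adding two extra frames at each end of a line) follow the worked example rather than any photogrammetric standard.

```python
# Illustration only: flight-plan arithmetic following the worked example above.
# The rounding and extra-exposure conventions mirror the example, not a standard.
import math

def flight_plan(scale_denom, focal_length_ft, width_mi, length_mi,
                elev_msl_ft, forward_overlap, sidelap, neg_format_in=9):
    width_ft, length_ft = width_mi * 5280, length_mi * 5280
    agl_ft = focal_length_ft * scale_denom                  # flying height above ground
    msl_ft = agl_ft + elev_msl_ft                           # height above mean sea level
    print_width_ft = neg_format_in * scale_denom / 12       # ground footprint of one frame
    lateral_gain_ft = (1 - sidelap) * print_width_ft        # gain per flight line
    intervals = round(width_ft / lateral_gain_ft)
    flight_lines = intervals + 1
    gd_between_lines_ft = width_ft / intervals
    actual_sidelap = (print_width_ft - gd_between_lines_ft) / print_width_ft
    exposure_spacing_ft = (1 - forward_overlap) * print_width_ft
    exposures_per_line = math.ceil(length_ft / exposure_spacing_ft) + 4  # +2 at each end
    return {
        "AGL (ft)": agl_ft, "MSL (ft)": msl_ft,
        "print width (ft)": print_width_ft,
        "flight lines": flight_lines,
        "GD between lines (ft)": round(gd_between_lines_ft),
        "actual sidelap (%)": round(actual_sidelap * 100, 1),
        "exposure spacing (ft)": exposure_spacing_ft,
        "exposures per line": exposures_per_line,
        "total exposures": flight_lines * exposures_per_line,
    }

# Part 2 example inputs: 1:15,840 scale, 8 x 10 mi tract, 1,200 ft MSL, 60% / 30% overlap
print(flight_plan(15840, 0.5, 8, 10, 1200, 0.60, 0.30))
```

Running it with the Part 2 inputs reproduces the values shown above (6 flight lines, 28.9% sidelap, 16 exposures per line, 96 total exposures); swapping in the Part 3 inputs lets you verify your own answers for Q2.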
Go to your directory under My Computer and create a lab04 folder. Download GEOG 526 Lab 4 Data to your Lab04 folder and unzip it.\nStep 2: Accessing ERDAS IMAGINE Click on the Start button and go to All Programs ERDAS Imagine 2016 ERDAS Imagine 2016. After a moment the IMAGINE logo should come up, followed by the main IMAGINE menu, a panel across the top of the computer screen. A “viewer”, a black screen in which images are displayed, should also come up. You can resize these windows by grabbing the corners of the windows with your cursor, or you can move the windows around if you so desire.\nStep 3: Displaying an Image To display an image, right click on “2D View #1” under the contents menu on the left side of the screen and click on “Open Raster Layer…”, or simply click on the open file icon at the very top left of the screen. The Select Layer to Add window should now appear. Select the E: drive from the “Look in:” dropdown menu, and double click on “Users” your directory (E:\\your_name\\GEOG526\\Lab05). In your directory, select (highlight) the file garden9182018.img, but don’t click OK just yet. The data in this file are from a subset of SPOT’s multi-spectral sensor over western Kansas near Garden City. Look at the other information in the Select Layer to Add window under the Raster Options tab. The Display should be on True Color. Change it to Gray Scale and Display Layer 1. This allows you to display one data layer (one band/ wavelength range) in the viewer. Switch back to the File tab. At the very bottom is additional information, including the number of columns and rows and the number of bands in the image.\n How many bands are available? What are the wavelengths represented by Spot’s Multi-spectral Scanner (XS) sensor (provide the band number, name, and wavelength)? (6 points) Use your textbook or the handout for this question. of Bands: 3 Band\tWave Length Name\tXS1\t0.50-0.59 um green XS2\t0.61-0.68 um red XS3\t0.79-0.89 um near IR\nAfter answering the above questions, click OK. Once the image is displayed, you can retrieve information regarding various characteristics of the layer by clicking on the Metadata icon under the Home tab (the icon looks like a piece of paper with an “i” in the center). After the Metadata dialogue box opens (this may take a bit!) click on the General tab and find the Layer Info data box, in this look at the Height and Width numbers.\nHow many columns of data do we have? 512 Rows? 512 (2 points) Step 4: Assigning Color Guns to Bands Reopen garden9182018.img. Under the heading Layers to Colors on the Raster Options tab of the Open Raster Layer window, we can specify which computer color gun is used to display a certain band. Make sure it is Red-3, Green-2, and Blue-1\nThis is telling you that we are assigning band 3 to the first \u0026ldquo;color gun\u0026rdquo;, which is red; band 2 to the second color gun, which is green; and band 1 to the third computer color gun, which is blue. If you entered layers (or bands) 1,2,3 in that order, it would assign band 1 to color gun 1 (red), band 2 to color gun 2 (green), and band 3 to color gun 3 (blue). Get it?\nIn the bottom of the same window, check Clear Display (this clears the data from the display window). Also click on the box Fit to Frame- you’ll want to do this every time you display an image. This displays the whole image in the window, rather than setting it to a pre-defined default zoom. Now click OK at the side of the screen. 
The color composite Spot image should appear.\nYou can change the assignment of bands to color guns by clicking on the toolbar on the top, Raster - Multispectral tab and go to Bands where you will see the three color guns to assign.\nKeep in mind that these data have been \u0026ldquo;contrast stretched\u0026rdquo; to make it easier to interpret visually. This is done by the computer’s examination of a statistics file of the image to determine how it should be stretched. You will learn how to do this later.\n The image displayed is a color composite. What type of color composite is it? Please refer to the spectral wavelength of the image bands (3 point)\n Color Gun\tImage Band No. Wavelength Name\tRed\t3\tnear IR Green\t2\tred\rBlue\t1\tgreen\r Step 5: SPOT Image Change the band combinations on the image, i.e. change the band-color gun assignments. On the main menu, click the Multispectral tab. Here there is a sub-menu called “Bands”. Change the band assignments to 3 to red, 3 to green, and 3 to blue.\nWhat will this image look like now? And WHY? (Explain why this image displays in this color scheme - remember that all three colors were assigned the same band of data). Is it a color composite or…? What information is highlighted by this image? (3 point) (Hint: With the color composite AND band 3 displayed, look at the features and determine if they are light or dark in band 3. If they are light, that means that type of feature reflects high in that band. Recall the vegetated spectral response curve and how it reflects high or low in portions of the electromagnetic spectrum (NIR vs visible). Look at vegetated features and non-vegetated areas (water and bare soil) to help answer this question.) Since all three color guns were assigned to the same band, the resulting image is a grayscale representation of Band 3 (near-IR), where high values (white) represent high near-IR reflectance. Near-IR reflectance highlights the vegetation.\nZoom in the image in the viewer until you can identify the cells/pixels. To investigate pixel values and the DN’s associated with them, press the Inquire “+” icon on the viewer menu under the Home tab. This brings up the “inquire cursor”. In the window that now comes up, examine the file File-pixel column. These are the reflectance values (PV’s, DN’s, etc.) for the pixel on which the cross-hairs of the cursor lie. You will see pixel values for each of the layers of your image. Move around the cursor with your mouse and watch the pixel values change.\nOpen garden.img with Layers to Colors: Red: 1, Green: 2 and Blue: 3. Don’t check Clear display (keep both garden.img and garden9182018.img in View 1) and order garden.img on top of garden9182018.img. Keep in mind that the bands of SPOT may or may not have been entered in the correct order, so always be aware of the wavelength range of each individual layer/spectral band you put into a gun. Image garden.img is the identical SPOT image with garden9182018.img with a different stacking order of image bands/ wavelengths. You may click on and off to check either image and change orders of bands to R,G, B guns from Raster – Multispectral for each to explore.\nWhich SPOT image has its bands in the proper order? Justify your answer and, if they are not in correct order, what order are they in? (3 points) Bands 1 \u0026amp; 3 are reversed between the two. In garden9182018.img, the bands are in the correct order of green, red, near IR, respectively. 
In garden.img the near IR is in band 1 (as evidenced by high values over vegetated areas when all three guns set to Band 1) and band 3 has been switched for the green band. Wrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":9,"href":"/classes/geog526/labs/lab09B/","title":"Lab - Remote Sensing (Assignment 9) NDVI","parent":"Labs","content":"Perform a Normalized Difference Vegetation Index (NDVI) on both change detection images; Tallahassee-97.img and Tallahassee-99.img.\nFrom the main horizontal menu bar select Raster\u0026gt; Unsupervised \u0026gt; Indices. The input file will be one of the two Tallahassee images (repeat the process for the other one), output file will be something like Tallahassee-97-ndvi, make sure the Sensor is highlighted as Landsat TM, and make sure the Select Function is highlighted to NDVI. Click OK.\nOnce Tallahassee-97-ndvi and Tallahassee-99-ndvi have been created display them side-by-side in two separate viewers. Ensure that you select Pseudo Color from Raster Options. Scale the images to fit their respective viewers. Bring up their Attributes table from Viewer \u0026gt; Raster and highlight all Values above 0.47 and change the color to something conspicuous like red. Do this for both images.\n| Year | Entire Image\nValue Total | Entire Image\nValue\nMaximum | Entire Image\nHistogram Total | \u0026gt; 0.47\nValue Total | \u0026gt; 0.47\nValue Maximum | \u0026gt; 0.47\n Histogram Total 1997 1999 "},{"id":10,"href":"/classes/geog526/labs/lab13/","title":"Lab - Remote Sensing with RADAR Imagery","parent":"Labs","content":"Learning Objective The purpose of Lab 11 is to familiarize you with interpreting radar imagery. Upon completion of this lab you should be aware of the usefulness and difficulties of radar imagery in remote sensing applications.\nOutline: Learning Objective Submission requirements Guide Wrapping up Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts and submit it to blackboard. 
Copy the folder Chapter 12, which contains the following GeoTIFF datasets from NASA’s Earth Observatory:\r* Bakken_vir_2012317_geo.tif, a VIIRS image showing a section of northwestern North Dakota at night\r* russia_tmo_2012170_fires_geo.tif, a MODIS image showing fires in Siberia\r* samerica_vir_2012202_geo.tif, a VIIRS image showing a section of the South American eastern coastline at night\r* irene_amo_2011238_geo.tif, a MODIS image showing Hurricane Irene\r* Newzealand_amo_2017317_geo.tif, a MODIS image showing an algal bloom off the coast of New Zealand\r* The folder also contains the following KML dataset from the University of Wisconsin’s Space Science and Engineering Center:\r* Daily_MODIS_May3, a series of MODIS images covering the United States from May 3, 2017\rGuide Background Information •\tAcronym – “Radio Detection And Ranging” •\t“Ranging” refers to measurement of time delay •\tActive system – generates own source of energy to acquire return •\tSignal representative of radar backscatter •\tLow backscatter = darker area •\tHigh backscatter = brighter area •\t3 types of active radar scanners: o\tDoppler o\tPlan Position Indicator o\tSLAR (this lab focuses on SLAR)\nCapabilities •\tCan measure distance from antenna to land surface features •\tAbility to detect frequency \u0026amp; polarization shifts •\tCan be used in almost any weather (helpful in tropical climates) •\tIndependent of solar illumination •\tCan penetrate clouds, some snow, some soil, some vegetation providing crisp topographic information\nWhat determines return signal? •\tWavelength used •\tPolarization •\tGeometry •\tSurface Characteristics (Book page 220, Section 7.10)\nWavelengths Used\nBand Designations\tWavelengths Ka\t.75 – 1.18 cm K\t1.18 – 1.67 cm Ku\t1.67 – 2.40 cm X\t2.40 - 3.75 cm C\t3.75 - 7.5 cm S\t7.5 - 15 cm L\t15 - 30 cm UHF\t30 - 100 cm P\t77- 107 cm **Radar originated as a military endeavor during WWII. Because of its military origins, the band designations were assigned arbitrarily to ensure security and therefore don’t hold any specific meaning.\nSLARgeom.png\nll radar collection occurs in slant range, therefore geometric errors exist\n•\tRadar layover o\tradar measures all distance with respect to time elapsed between transmission \u0026amp; reception o\tat near range, top of tall object closer to antenna than is its base o\ttop appears closer (appears to lean) o\tanalogous to relief displacement\nFrontSLARgeom.png\nGeometry of SLAR •\tSLAR = Side-Looking Airborne Radar •\tDepression angle – angle between upper edge of beam \u0026amp; horizontal extension of plane •\tNear range – edge nearest airplane (more scale compression) •\tFar range – edge farthest from airplane •\tSlant range – direct distance from object to antenna (high distortion; object appear curved) •\tGround range – represents correct scaling of distances; preference for interpretation because it minimizes distortion\nWrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":11,"href":"/classes/geog526/labs/lab12/","title":"Lab - THERMAL IMAGERY","parent":"Labs","content":"Learning Objective The purpose of Lab 10 is to familiarize you with interpreting thermal imagery. 
Upon completion of this lab you should be aware of the usefulness of thermal imagery in various applications.\nOutline: Learning Objective Submission requirements Exercise Submission Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rAdvanced Spaceborne Thermal Emission and Reflection Radiomter (ASTER): one of the sensors onboard the EOS satellites.\n5 Thermal Infrared (TIR) bands covering the 8-12µm portion of the EMS. Spatial resolution 90m x 90m\nLANDSAT TM Band 6 •\tspatial resolution – 120m x 120m (coarse relative resolution) •\tdetects emitted energy (not reflected energy) •\temitted radiation of earth’s surface reveals information concerning the thermal properties of materials •\tconsidered a ‘passive’ system in that it measures energy emitted from earth (as opposed to the sensor itself) •\tuncalibrated – relative temperatures (brightest tones generally indicate warmest surfaces) •\tprovides qualitative information instead of quantitative •\tscanners generate geometric errors, therefore can’t be used for accurate measurements\nThermal Properties •\tEmissivity – ability to absorb \u0026amp; radiate heat; good absorbers = high emissivity •\tConductivity – measure of the rate at which heat will flow through material •\tCapacity – ability of material to store heat •\tDiffusivity – measure of material’s internal heat transfer •\tSpecific Heat/Thermal inertia – amount of energy necessary to raise 1 gram of substance 1 degree C (water has high specific heat)\nTime of Day Considerations •\tTime of image acquisition is key information when interpreting thermal images •\tTime can be estimated by examining land/water comparisons •\tTypically: o\tgreatest contrast between features on a day image o\twater cool on day images (dark) \u0026amp; warm on night images (bright) o\tvegetation relatively cool on day images (due to plant transpiration) o\tpavement/building materials appear bright; absorb a lot of heat \u0026amp; have a high thermal capacity\nTime of Day Considerations •\tDifferential heating \u0026amp; cooling •\tPredawn \tShow residual heat remaining in objects at end of cooling period •\tPost-sunrise \tThermal shadows – cooler, darker areas on slopes away from sun •\tTemperature cross-over periods \tTwo periods (sunrise \u0026amp; sunset) when land/water have similar temps \tAvoided for remote sensing\nThermalCurves.png Thermal Interpretation •\tSurface winds o\tWind shadow – bright signature downwind from an obstruction to surface winds; caused by reduction of windchill in these areas o\tWind streaks – alternating light/dark lines that are parallel to wind direction •\tRain o\tCreates surface of cooler, more uniform temps •\tClouds \u0026amp; Fog o\tUsually masks thermal IR emissions from surface o\tResults in darker tones o\tCloud shadows: areas of 
surface that are in shadow of clouds; cooler, darker tones •\tGhosts o\tthermal impressions of object that had been moved (dark)\nApplications •\tHeat-loss analysis •\tThermal pollution monitoring •\tForest fire detection •\tWeather forecasting •\tVolcanic risk analysis ThermalExamples.png\nExercise Copy the Lab10 data to the local drive, and Open the ASTER images by using ERDAS 2014. Image 1 - Kansas City Open the image ast_l1b_00308302012041611_20120905142736_7327.img in ERDAS 2014. Locate each area using the inquire box . (Right click the image, you will find this tool in the list.) Change Units from Map to File. Location A\tLocation B\nLocation C\tLocation D\n Identify the features at the following locations: (4 points)\r A. ________________________________\nB. ________________________________\nC. ________________________________\nD. ________________________________\nOpen the ASTER daytime image in the same viewer (but don’t clear display) or in a separate view and link the two viewers.\n2. How have these features changed between the day and night images? Explain your reasoning using any thermal capacity property (9 points) Day\tNight\tReasoning B\tC\tD\n Using the night image and the file coordinates below, what feature do the white pixels represent? (1 point)\r Look at the features of D in Question 1. Compare the day and night images. Explain the reason behind what you see using your knowledge of thermal infrared energy interactions and physical geography. (1 point)\n What do bright features mean in the thermal images? Using your knowledge of thermal energy/matter interactions, explain the reason why feature B and D in Question 1 are bright, respectively. (Hint, the reasons for the B and D are different.) (2 point)\n Sacramento, CA UHIPP Image\nCalifornia’s capital city has diverse land cover types resulting in unique thermal regimes shown in the false-color infrared on the right. In this “quick look” image which has not been calibrated with ground sensor data or corrected for atmospheric interference – - dark red are hot, and blue and green are cool.\nThis thermal image was formed, possibly by density slicing the thermal data, in the following manner:\n Red Hot- About 122 deg. C (140 deg. F)\rOrange Yellow Green Blue Cool – About 29-36 deg. C (85-96 deg. F)\r Open the image sacramento.jpg (Choose file type “JFIF”) in ERDAS 2014, right click the image, and choose inquire box and to locate the land cover A by using the location information below. (i) What is the land cover labeled A (1 point)?\n(ii) Is this land cover cooler or warmer than the surrounding areas? Explain why (2 points)?\n(iii) If a similar image were acquired at night, approximately at 10:00 pm, what differences if any, would you expect in the thermal characteristics of the land cover labeled A? (1 point)\r(iv) What is the explanation for your answer in (iii) above? (1 point)\r Land cover types labeled B and D are cultural features while C is a natural feature. (3 points) (Note you are working in map coordinates ) B\tC\tD\n(i) Identify the features at the following locations:\rB. ________________________________\rC. ________________________________\rD. ________________________________\r (ii) Which land cover type(s) C and the features North (light blue) and east (dark blue) of C is the coolest and why? (2 points, hints: think about what the other two features are)\n(iii) Which is the warmest: paved road, park, or river? Discuss the function of water and vegetation to modulate the surface temperature? 
(2 points)\nInterstate 5, running north to south along the East side of the Sacramento River, and US 50, from left to the bottom right, are clearly visible on the images. (i)What surface material (Asphalt, Bitumen or Concrete) was used in the construction of the two roads? (1 point)\n(ii) Are these roads cooler or warmer than the area labeled as B? (1 point)\n(iii) Provide an explanation for your answer in (ii) above. (1 point)\nSubmission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":12,"href":"/classes/geog111/labs/lab01/","title":"Lab 01 - Computer basics","parent":"Labs","content":"Learning Objective This lab is not a beginner level \u0026ldquo;how to computer\u0026rdquo;, but we will cover a few critical PC setup tips that will only make your life easier, as well as a few lesser known tips and tricks that I\u0026rsquo;ve picked up and found useful.\nOutline: Learning Objective Submission requirements Tutorial Navigating folder structures Setting up Cloud storage Setting up Windows options Screenshots Virtual access Installing GIS Misc. \u0026amp; Neat tips Wrapping up Submission requirements There is nothing to submit for this week, but make sure you\u0026rsquo;ve followed along and set up your system correctly or you\u0026rsquo;ll have a rough time as we move through the semester.\nThis lab is almost exclusively designed for Windows although there are certainly these same settings in Apple and Linux systems. Some of these steps will also be specific to the lab PC\u0026rsquo;s only. Tutorial Navigating folder structures When working on a computer, it is vital that you know \u0026ldquo;where\u0026rdquo; on the hard drive you are working. This \u0026ldquo;where\u0026rdquo; is the folder path. For instance, when you open up a new word document and attempt to save it, it will likely kick you to a save prompt that has some sort of file path displayed, ex:\nIf you click More Save Options, you should end up at the Windows File Explorer view, and at the top in the address bar or over on the navigation pane, you\u0026rsquo;ll see a full path and visial hierarchy of files.\nMake sure you know where that is, both as a folder path and what drive you are on. Sometimes I may tell you that you need to be working \u0026ldquo;one level up\u0026rdquo;, which just means that you need to be back on folder in your folder tree.\nSetting up Cloud storage OneDrive OneDrive could have been great but poor implementation on the IT end means we are often saddled with a three legged work horse. While some fields may never encounter the issues we are about to uncover, Geographic Information Science \u0026amp; Systems are a branch of data science and that means we have some pretty intense formats to work with. When you first log on to a PC, the first thing you should check is to see where the OneDrive folder is attempting to sync to, and how healthy that drive is. The easiest way to get this info at a glance is to open the File Explorer (usually the pinned folder icon on the task bar), and then click on This PC to see all the drives attached and their space available. If you right click on the OneDrive icon and go to properties, you\u0026rsquo;ll see where it lives (in my case, it\u0026rsquo;s on the H:)\nIf you are working on the Lindley Hall PC\u0026rsquo;s, they are set up so that each user has the default OneDrive location on the C:, and these fill up fast. 
If you find yourself short on space, you\u0026rsquo;ll want to move this to the D: like so:\n Click on the OneDrive icon in the taskbar Select Help \u0026amp; Settings \u0026gt; Settings In the Account tab, select Unlink this PC, and when the OneDrive setup screen appears, you can close it. You can now either move or delete that original OneDrive folder (if you\u0026rsquo;ve just started out, the easiest thing to do is just delete it). The safest and most logical place to put the new folder is in your user profile on the D:, e.g. D:/Users/A123B456/OneDrive\n On the start bar search for OneDrive and open the app. On the OneDrive setup screen, select Get started, and then follow the instructions until you get to the screen where you can change your OneDrive folder\u0026rsquo;s location. Choose the new folder location, and then select OK. Wait for your files to sync and you should be all set. If you are using OneDrive through a company or through KU, you generally don\u0026rsquo;t get a whole lot of control over the name of the folder you choose. To make transferring files easier down the line, the first folder in my setup is always called \u0026ldquo;Root\u0026rdquo;, and everything lives under that. If you do have control over what your OneDrive folder is called, make sure it is short and does not contain spaces or special characters; that will save time down the line as well. Dropbox ToDo Setting up Windows options There are a few system settings that you should always have shown in Windows. If you search \u0026ldquo;Show File Extensions\u0026rdquo; in the Windows search bar, the Settings For developers tab should pop up. Click on the settings and set a few of these options. The key one we are looking for here is to make sure Hide Extensions for known file types is unchecked. You always want to know what it is you are clicking on. Other handy ones I use are the Display the full path in the title bar and the Show hidden files options.\nScreenshots Most screenshots look terribly unprofessional when simply pasted into a document, but they do convey the needed information with virtually no friction and are therefore pretty commonplace. For those times when creating a real image is more effort than it\u0026rsquo;s worth, the Snip tool (soon to be the Snip and Sketch tool) is your friend. Hit the Windows search bar and type in snip to get the snip window to pop up. There are a few ways to snip, including the option to add a delay if you need to navigate menus before the screenshot is taken.\nVirtual access Using GIS more than once a week will greatly increase your familiarity and learning. You don\u0026rsquo;t need to run out and buy a new computer if you don\u0026rsquo;t want to, though. If you are a KU student, you have a few means of remote access to compute resources. There is a large cluster of web based applications for use at http://virtualdesktop.ku.edu/, and you can log into the PCs in Lindley using http://virtuallab.ku.edu/. I recommend the latter (virtual labs), as the hardware is a little more robust than the virtual desktop options. The first time you attempt to log on it will walk you through installing Citrix Receiver, but you do not need to create an account or otherwise log into anything other than the KU logins.\nInstalling GIS If you have a PC capable of running GIS, there are instructions on how to go about installing it here.\nMisc. \u0026amp; Neat tips Copy-paste operations can get pretty heavy handed with how they handle formatting.
If you right click within any of the Microsoft Office products, there are generally paste options such as paste with formatting, paste and merge formatting, paste as values, etc., that are better behaved than the standard ctrl-v option. PowerPoint is a pretty magical program when you drill into all the capabilities it offers, but one of the neater and more accessible functions is the ability to quickly remove a solid background color from a picture. Just add the image to a blank slide, click on the Picture Format toolbar and then the Color \u0026gt; Set Transparent Color tool, and then click on the background color you want to remove. Find yourself working with tabular text data? A handy keyboard shortcut to highlight along a specific set of character columns is to hold Alt and click-drag (works in Notepad++ and Word). The example below shows how this might be useful if you wanted to quickly remove the first two digits off the year column.\n Wrapping up If you are on a PC in Lindley, make sure you save all your work and then log off. As the submission requirements section outlines, there is nothing to submit for this lab, but make sure you do these steps. Note that although these settings should follow you should you move from PC to PC, take a quick second each time you log in to make sure that it\u0026rsquo;s set up as you expect (the OneDrive on the D: is the biggie).\n"},{"id":13,"href":"/classes/geog558/labs/lab01/","title":"Lab 01 - Data survey and database building","parent":"Labs","content":"Learning Objective The first part of this lab requires writing a survey of the data and mapping technologies in the field that you are interested in (wildlife reserve, conservation, urban planning, forest resources, natural hazards, business, etc.), not limited to the data and mapping technologies covered in the lecture. This survey may include both traditional and newly emerging datasets and/or mapping technologies that are used in the field. The second part of the lab will have you download imagery and create a database that will be used for the next few labs.\nOutline: Learning Objective Submission requirements Tutorial Part 1 (70 points): A survey of data and mapping technologies Part 2 (30 points): Building a database Setting up folders and accounts Downloading data Wrapping up Submission requirements Submit the paper outlined in part 1, and the screenshot of the data you downloaded in part 2, to blackboard.\nTutorial Part 1 (70 points): A survey of data and mapping technologies You will write a survey on the data and mapping technologies in a field that you are interested in. The survey should be 2-4 single spaced pages using a font size of 12 points. Reference style should follow one of the GIS journal reference styles (e.g. Transactions in GIS, Cartography and Geographic Information Science, International Journal of Geographic Information Science, The Geographical Journal, and Computers \u0026amp; Geosciences, etc.). I also point you to several resources for reference management software in the syllabus.\nGrading for the survey is based on:\n (80%) Quality of the content: how many potential datasets and mapping methods are included; how you organize your content; whether the length of the survey is appropriate, with a proper level of synthesis and detail. (20%) Format: follow the requirements described above and check your grammar, spelling, and sentence structure before you submit the survey.
Part 2 (30 points): Building a database One of our goals over the next couple of labs is to recreate part of the analysis found in Frey et al., 2018, specifically Figure 2 and a lake volume estimation. We will use these data in the next couple of labs and I will not wait for you to download them next week, so be prepared to have them ready.\nSetting up folders and accounts \rSome sort of file management system is preferable to nothing. Time wasted looking for files can be better spent performing the analysis, other research, or discovering new music. Here I'll outline what I use, but if you already have a system that works for you feel free to deviate. Just as an aside, avoid file names with spaces, leading numbers, or strange characters.\rOn OneDrive, create a folder for classes, in that one for this class (GEOG558), and in that folder create one for labs. In that folder create one for Lab01, and one for base data (BaseData). In that, create two folders, one called Elevation and the other called Landsat. The Landsat tiles can get a little messy, so I create extra folders underneath the Landsat folder titled something along the lines of LS05_YYYY. A visual example of this setup is shown at the bottom of this page.\n We will use USGS Earth Explorer to download the requisite datasets, including Landsat scenes and elevation data. Before we start you'll need accounts at both https://earthexplorer.usgs.gov/ and https://urs.earthdata.nasa.gov/ \rDownloading data \rOpen up EarthExplorer. We first need to define our area of interest. Under the Search Criteria tab change the coordinate box to decimal degrees, add a coordinate, and copy/paste the following into their respective fields. Latitude: -9.397055203542507 Longitude: -77.38027425750124 Under the Data Sets tab, search for SRTM3 SRTMGL-1 and make sure the box next to the dataset is checked. Click results and then download the standard format (the HGT, not any of the metadata). \r There are a few useful buttons on a results screen. The footprint icon shows you the footprint of a tile, and the image icon will draw a jpg over the browser so you can see what the data you\u0026rsquo;re downloading looks like.\n The elevation data is distributed as a nested zip file, so you'll need to unzip it twice before you can use it in a program. Unzip (right click \u0026gt; 7-Zip \u0026gt; Extract to “FileNameHere”) the SRTMGL folder and move the HGT file (a fancy tif for SRTM data) into your folder structure. \rNext we want the Landsat scenes. We\u0026rsquo;ll simplify this somewhat by starting at 1995 and looking forward in time every 2 years. We\u0026rsquo;ll also use scenes taken in the winter, so that the lake is stable.\nUnder the Data Sets tab, navigate to Landsat \u0026gt; Landsat Collection 1 Level-1 and check [Landsat 4-5 TM C1 Level-1](https://www.usgs.gov/centers/eros/science/usgs-eros-archive-landsat-archives-landsat-4-5-thematic-mapper-tm-level-1-data?qt-science_center_objects=0#qt-science_center_objects). Set the Date Range dates to 06/01/1995 and 08/01/1995 respectively.\rClick on Additional Criteria and set the Land Cloud Cover to less than 10%. Click on Results; when the images load in you can explore them with the “Show Browser Overlay” button, and download the desired tiles (get the Level-1 GeoTIFF data products). Do this for every two years up to 2009 (change the Date Range on the Search Criteria tab).
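If you would rather script the folder scaffolding described above than click it out by hand, a short sketch along these lines will create the BaseData, Elevation, Landsat, and LS05_YYYY folders for each download year. The root path below is only a placeholder, so point it at your own OneDrive class folder.

```python
from pathlib import Path

# Placeholder root; point this at your own OneDrive folder for the class.
root = Path("D:/Users/A123B456/OneDrive/Root/Classes/GEOG558/Labs/Lab01/BaseData")

(root / "Elevation").mkdir(parents=True, exist_ok=True)
for year in range(1995, 2010, 2):  # 1995, 1997, ..., 2009
    (root / "Landsat" / f"LS05_{year}").mkdir(parents=True, exist_ok=True)
```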
The images I selected for this analysis were: YEAR SCENE NAME 1995 LT05_L1TP_008067_19950622_20170109_01_T1 1997 LT05_L1TP_008066_19970729_20161230_01_T1 1999 LT05_L1TP_008067_19990601_20161217_01_T1 2001 LT05_L1TP_008067_20010622_20161210_01_T1 2003 LT05_L1TP_008066_20030714_20161205_01_T1 2005 LT05_L1TP_008066_20050703_20161126_01_T1 2007 LT05_L1TP_008067_20070725_20161112_01_T1 2009 LT05_L1TP_008066_20090628_20161024_01_T1 The Landsat tiles are downloaded with extensions .tar.gz, they’ve also essentially been compressed twice. Unzip them the first time, open them up and unzip them a second time. Copy all the files into their respective folders.\rOpen up ArcMap and the ArcCatalog window, connect to the lab folder, and expand the BaseData file tree out a bit. Submit a screenshot of your final files (similar to the one below). Wrapping up As a reminder, you are submitting the paper written in part 1 and a screenshot of your downloaded landsat scenes.\n"},{"id":14,"href":"/classes/geog558/labs/lab02/","title":"Lab 02 - Cartographic Modeling","parent":"Labs","content":"Learning Objective This lab will introduce local operations using cartographic modeling (map algebra). Part 1 is intended to show you how you can organize and display raster datasets in a usable format. Part 2 will walk you through how to calculate NDWI and perform a water area analysis. Part 3 you will model the geographic distribution of Kudzu in the conterminous United States and then compare your model to the actual geographic distribution, using Map Algebra and attribute table manipulation. I\u0026rsquo;ve prefaced each lab with the learning objectives of the exercise, what you need to submit to blackboard, a set of materials for the lab as appropriate, and then the instructions for the lab.\nOutline: Learning Objective Submission requirements Tutorial What you need to submit: Part 1: Lake delineation and water classification Setting up the MXD Creating An Area of Interest (AOI) Calculating NDWI Calculating MNDWI Classifying and quantifying the change in water surface area Part 2: Simple Landsat Composites Creating a composite image Part 3: Cartographic Modeling Establishing the model parameters Running the model Assessing the model Wrapping up Submission requirements Tutorial What you need to submit: Hint: Copy and paste the questions below into a word document and submit it to blackboard\n *** # **Materials** [click to download](https://kansas-my.sharepoint.com/:f:/g/personal/j610c377_home_ku_edu/EuPaly9t2yxGrCruD_fcGakBSzTfsGwq5KC-iWYwU6GNog?e=YznxCc) .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rRelevant part\rData Name\rDescription\r\r\rPart 1\rBaseData\rData we downloaded as part of lab 1\r\r\rPart 3\rusanpre25k\rannual precipitation in mm (25000m grid)\r\r\rPart 3\rusfrost25k\rnumber of frost-free-days in conterminous US (25000m grid)\r\r\rPart 3\rsample1.shp\rpoint file showing locations of known 
presence/absence of Kudzu (to be used to make model)\r\r\rPart 3\rsample2.shp\rpoint file showing locations of known presence/absence of Kudzu (to be used for accuracy assessment)\r\r If you are short on space, you only need the .mpk from part 2, the LANDSAT tiles are ~20 gigs and are provided for reproducibility\u0026rsquo;s sake. Part 1: Lake delineation and water classification Setting up the MXD Open up a new ArcMap document. First, a few housekeeping items. Let’s take the training wheels off ArcMap and make sure we can overwrite data. Click Geoprocessing \u0026gt; Geoprocessing Options and make sure the top two boxes in this window are selected. You may also want to increase the time period results are stored to a month or so. From the 1995 year, add in the green, NIR, MIR, and MTL bands/images. Hint#1: When performing the NDWI calculation, we refer to bands based their wavelength (e.g. green, NIR). However, when distributing data the bands are referred to by their number based on sensor properties (so in our case, LANDSAT 5 TM has 7 bands and change). GEE provides a nice, concise overview of a remote sensing products bands, see https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LT05_C01_T1\nA note about pyramids: You will want to build pyramids when you add raster images into ArcMap. You will notice that when this is done, ArcMap will create a file with a .ovr extension (look using the Windows file browser). We\u0026rsquo;ll talk about pyramids in class more, but as a cursory introduction, think about Google Maps. When you zoom out to look at the entire planet, there is no physical way to represent 30 meters on the screen. Instead, we take a generalization of all the pixels underneath the region and display those values in their place. Because Google sets the standards for mapping, there are generally 21 levels of images that are loaded in, with the top most layer being zoom level 1, and increasing as you zoom in. See an interactive example of this concept in action at https://www.maptiler.com/google-maps-coordinates-tile-bounds-projection/ Select all 4 images using shift-click, and right click \u0026gt; Group to place them in a group. Slow click and rename this group to LS1995\n Finally, let\u0026rsquo;s find our lake using the go to xy tool and copy/pasting the lat/long in: -9.3970552,-77.3802742 Creating An Area of Interest (AOI) Note: Skill review from Lab 4\n We next need to create a shape representing the area of analysis. First, right click on your BaseData folder and create a new File Geodatabase. Rename the database to LakePalcacocha.gdb Right click on the new geodatabase and go to New \u0026gt; Feature Class Name the Feature Class AOI and set the type to polygon. Click next. Scroll to the bottom of the window, expand the Layers folder, and click on the only option (WGS_1984_UTM_Zone_18N | WKID: 32618). Click through the rest of the setup dialog to finish creating the shapefile. Assuming you were successful you should now have a new layer called AOI in your TOC Start an editing session (Editor toolbar \u0026gt; Editor \u0026gt; Start editing) Make sure you are editing the AOI layer/LakePalcacocha.gdb database. On the editor toolbar, click create features Click on AOI to bring up the construction tools Use the Circle tool to create a small circle that encompasses the entirety of the lake. It is alright for our purposes if we include land area, but do not make it so large that it might include nearby lakes. 
This should look something like so:\n Click on the editor and stop editing. Save your edits when it prompts you to. Calculating NDWI The NDWI is defined as the normalized difference of the Green and Near Infrared bands of the Thematic Mapper. (See the McFeeters paper in the International Journal of Remote Sensing.)\n$$NDWI = \\frac{Green-NIR}{Green+NIR}$$ We will use the raster calculator tool to accomplish this. At this point we can also use the environment settings to clip our data so we don’t process the whole scene. Go to Geoprocessing \u0026gt; Environments Under workspace, point the Current and Scratch workspaces to our LakePalcacocha.gdb Under processing extent, set the extent to “Same as Layer AOI”. Click OK Use the search bar to pull up the Raster Calculator and use the prompts to calculate the NDWI. Extension warnings? Did you take the time to read them? Warnings are a chance for programmers to communicate with you, and in this case you should learn that you need to turn on spatial analysis under Customize \u0026gt; Extensions. Go ahead and try to enter in the expression above. Save the resulting raster using the folder icon under the Output Raster dialog box, click on the name of one of the other bands (to preserve the LANDSAT naming convention), and change the end of the file to …NDWI.tif. You’ll notice that the resulting raster has a value of 0, and is entirely empty. This is because ArcMap likes to save memory and forces raster calculations to integers. Fixing this just means we need to be more verbose and run the calculations as floats using Float() under the Math section. This conversion to integer occurs in both the numerator and denominator, so the float needs to wrap both of them separately.\n Go to Geoprocessing \u0026gt; Results. If you double click on the last tool run, it will open the tool prompt as it was run. Change the expression to:\nFloat(\u0026ldquo;LT05…B2.TIF\u0026rdquo; - \u0026ldquo;LT05…B4.TIF\u0026rdquo;) / Float(\u0026ldquo;LT05…B2.TIF\u0026rdquo; + \u0026ldquo;LT05…B4.TIF\u0026rdquo;) Calculating MNDWI The MNDWI is defined as the normalized difference of the Green and Middle Infrared (MIR) bands of the Thematic Mapper. (See the Xu paper, also in the International Journal of Remote Sensing.) $$MNDWI = \\frac{Green-MIR}{Green+MIR}$$ Because this process is so similar to the NDWI (we use band 5, not band 4), it is easiest to pull up the results from our last raster calculator run and change a few characters. Go to Geoprocessing \u0026gt; Results and double click on the last tool run, then change all 4’s to 5’s and change the output file name to MNDWI. Classifying and quantifying the change in water surface area You’ve probably noticed by now that the NDWI and MNDWI calculations return an image with valid values from -1 to 1, but this doesn’t tell us what is and is not water. To do that we need to classify the image into a binary water/not water schema. There are a variety of ways to do this that range in sophistication/robustness, but for our purposes, a simple threshold will suffice. This means we want to classify all pixels with an NDWI/MNDWI value greater than (or less than) some value as water. Use the identify button and poke around the image to find a good threshold. I settled on 0.30. Use the raster calculator to select the cells greater than 0.30, and append the name of the NDWI layer with the threshold value (…NDWI_3.tif). Examine the outputs (right click on the resulting layer and open the attribute table).
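As a side note, if you ever want to sanity-check the raster calculator output outside of ArcMap, the same band math and threshold can be reproduced with rasterio and numpy. The sketch below is only illustrative: the band file names are placeholders for your actual Landsat 5 green (band 2) and NIR (band 4) tiles, and 0.30 is simply the threshold settled on above.

```python
import numpy as np
import rasterio

# Placeholder file names; substitute the green (B2) and NIR (B4) tiles from
# your scene. Swapping B4 for B5 (MIR) gives the MNDWI instead.
with rasterio.open("LT05_..._B2.TIF") as g_src, rasterio.open("LT05_..._B4.TIF") as n_src:
    green = g_src.read(1).astype("float32")
    nir = n_src.read(1).astype("float32")
    profile = g_src.profile

# Same idea as wrapping the expression in Float(): do the math as floats.
denom = green + nir
ndwi = np.where(denom == 0, 0, (green - nir) / denom)

# Threshold into a binary water / not-water raster (1 = water).
water = (ndwi > 0.30).astype("uint8")
print("water pixels:", int(water.sum()))

profile.update(dtype="uint8", count=1)
with rasterio.open("NDWI_3.tif", "w", **profile) as dst:
    dst.write(water, 1)
```

Multiplying the water pixel count by the cell area (30 m x 30 m for Landsat TM) gives the same lake area you will tabulate below.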
You now have enough information to answer the following questions. Question 1:\nBased on the results from 1995, which water delineation metric would you choose to perform this analysis, NDWI or MNDWI. Why? Question 2:\nWhat is the area of the lake in 1995? Question 3:\nCalculate the area of the lake for 2003 and 2009. Finish the table below and create a short graph of the area through time. Create a short map of the before/after.\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rYear # of Pixels 1995: xxx 1997: 100 1999: 143 2001: 184 2003: xxx 2005: 437 2007: 503 2009: xxx \r![imcenter](/classes/geog558/img/Lab2AnswerSheet1.png) \r\r Part 2: Simple Landsat Composites Creating a composite image One of the downfalls that plague many passive remote sensing platforms is the inability to penetrate cloud cover. Notice that although we didn’t happen to encounter cloud obfuscation in the first part of this lab, clouds are not uncommon. Let’s use water year 2005 as an example. I’ve taken the liberty of downloading and setting up your ArcMap document and data for you. Just download the Map package and double click to open.\n Because land/water has a lower reflectance across all bands as opposed to clouds, one of the simplest and most effective means of filtering out cloud cover is to take the local minimum from a raster stack. ArcMap has a tool called Cell Statistics which performs statistics for local operations. Either use the search tool or find it in Spatial Analyst Tools \u0026gt; local. We want to calculate the minimum of each band within the 4 seasons. Using band 4 for the winter as an example, the inputs should look like so: I have gone ahead and taken the minimum of 3 of the 4 seasons. So you only need to calculate the minimums for the spring months. Perform the NDWI calculation, thresholding, and area measurement as was performed above. Answer questions 4 and 5 below. Save and close Arcmap, we’re done looking at the lake for now. Question 4:\nCreate another graph of lake volume over time as was performed above. Question 5: What are some potential issues of using gap-filled imagery for scientific analysis? Part 3: Cartographic Modeling Establishing the model parameters Start ArcMap and add sanpre25k, usfrost25k, and sample1 from your lab02\\Part03 folder. extracting data for building Kudzu distribution model. Open the attribute table for sample1. The numbers in the “KUDZU” field represent presence or absence of Kudzu at that point (1 for presence, 0 for absence) You want to find the minimum annual precipitation (in mm) and the minimum frost-free days at each of these points. This information will be your source of input for modeling potential Kudzu distribution across the U.S. Close the attribute table. ArcToolbox contains an Extract values to points tool that will add the value of a raster to a set of points. It then creates a new point file with an attribute holding that raster value. 
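For anyone curious what this tool is doing under the hood, the equivalent sampling operation can be sketched in a few lines of Python with rasterio. The raster name and point coordinates below are made-up placeholders rather than the lab data, and the coordinates must be in the same coordinate system as the raster.

```python
import rasterio

# Hypothetical (x, y) pairs in the raster's coordinate system.
points = [(123456.0, 987654.0), (130000.0, 990000.0)]

with rasterio.open("usanpre25k.tif") as src:
    # sample() yields one array of band values per point, which is
    # essentially what ends up in the RASTERVALU field.
    for (x, y), value in zip(points, src.sample(points)):
        print(f"({x}, {y}) -> {value[0]}")
```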
In the Spatial Analyst Toolbox \u0026gt;Extraction \u0026gt; Extract values to points tool. This tool requires three inputs and has two optional ones. You can find out more information about each of the inputs (this applies to other ArcTools as well) by clicking on the Show Help button and then clicking on the field of interest. Set your Input point features to sample1 (You can either use the drop-down that shows all the point layers in the ArcMap Table of Contents or the Browse file button to navigate to the file). Set your Input raster to usanpre25k. Set Output point features so that it saves the new file called sample1precip to your lab02\\Part03 folder. Click OK (Be patient. This may take a while.). Your new point file will appear in the ArcMap Table of Contents. Open the attribute table for sample1precip. Notice the new variable called RASTERVALU that contains the annual precipitation amount for that point Repeat the above steps to create a similar point file containing a RASTERVALU attribute for frost-free days at each point. Call it sample1frost. Alternatively, there is an Extract multi values to points tool, which will do this whole process at once.\n By looking at the minimum precipitation and minimum number of frost-free days that Kudzu required for survival in the sample points, we can model (predict) its potential geographic expansion across the U.S. In order to do this, we need to get this information out of the datasets we just created. To find the minimum annual precipitation and minimum number of frost-free days required for Kudzu to survive:\n Open the attribute tables of your samples and perform an attribute query to select only those points with known Kudzu presence (“KUDZU” = 1) Show only the selected records using the table toggle button. Sort in ascending order the RASTERVALU field (R-Click on column heading). Question 6:\nThe minimum annual precipitation for Kudzu to survive is:________? Question 7:\nThe minimum frost free day for Kudzu to survive is:________? Running the model Create new grids based on the usanpre25k and usfrost25k grid using Raster Calculator usanpre25k \u0026gt;= minimum value** (**Minimum Value in question 6) From the usanpre25k, the new grid cells with an annual precipitation greater than or equal to the minimum annual precipitation will have a value of 1, while cells with an annual precipitation less than the minimum value will have a value of 0 Save the grid as precipitation in your lab02\\part03 folder usfrost25k \u0026gt;= minimum value** (**Minimum Value in question 7) From the usfrost25k, the new grid cells with a frost-free-days greater than or equal to the minimum frost-free-days will have a value of 1, while cells with a frost-free-days less than the minimum frost-free-days will have a value of 0 Save the grid as frost in your lab02\\part03 folder Predict the Kudzu U.S. distribution by multiplying the precipitation and frost grids. Save the output grid as kudzu_model. Question 8:\nThe total area (in m2) suitable for Kudzu to survive (according to your model) is:________ (m2) Assessing the model Add sample2 to your ArcMap Table of Contents Using the Extract values to points tool you now want to create a new point shapefile called sample2check that will have an attribute of actual presence (KUDZU) and one for predicted presence (RASTERVALU found by using kudzu_model as raster input) After you have created sample2check you can quantify your accuracy in one of many ways. 
Here are two suggestions:\n Add a new field named check and perform the following calculation: check = [RASTERVALU] * 10 + [KUDZU]. The Check field should have the following outputs: 0, 1, 10, and 11 (0 and 1 represent 00 and 01 respectively) Perform attribute query using the AND operator 4 times, once for each possible combination. (i.g. KUDZU = 1 AND RASTERVALUE = 0) Question 9:\nComplete the following table:\n.tg {border-collapse:collapse;border-spacing:0;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg .tg-baqh{text-align:center;vertical-align:top}\r.tg .tg-c3ow{border-color:inherit;text-align:center;vertical-align:top}\r.tg .tg-lqy6{text-align:right;vertical-align:top}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-dvpl{border-color:inherit;text-align:right;vertical-align:top}\r\r\r\rActual Kudzu Distribution\r\r\rNo Kudzu\rKudzu\r\r\rYour Model\rNo Kudzu\rD:(00)\rB:(10)\r\r\rKudzu\rC:(01)\rA:(11)\r\r\r Use your constructed confusion matrix and the accuracy metrics below to help answer question 10. For more about confusion matrices look at wikipedia:\nAccuracy - ((A+D)/(A+B+C+D)):\nFalse Positive Rate - (C/(A+C)):\nFalse Omission Rate - (B/(A+B)):\nMCC - (A*D-B*C)/sqrt((A+B)*(A+C)*(D+B)*(D+C)):\nQuestion 10:\nDiscuss some of the reasons for the inaccuracy of your model. In particular, examine why the error may be occurring. Also, looking at your kudzu_model layer, where geographically do you think this error is? (Hint: Consider the factors that your model did not include that could explain the geography of the model’s error) Need help answering question 10? Make a map like so! Useful tools: Raster to polygon, symbology as categories, rectangle as text, insert images. If performance is an issue, add basemaps last. Spend some time on the map and I\u0026rsquo;ll count that as a response to the question, so throw on a nice album (one of my current favorites) and cartography away! Wrapping up Feel free to save your map document\n"},{"id":15,"href":"/classes/geog111/labs/lab02/","title":"Lab 02 - Introduction to Google Earth Pro","parent":"Labs","content":" This lab is a gratefully modified version of lab 2 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies\n Learning Objective This lab provides an introduction on how to use Google Earth Pro and will help familiarize you with many of its features. Although we\u0026rsquo;ll touch on several more advanced software as the class moves on, Google Earth Pro is a really fast and useful arrow to have in your quiver and we\u0026rsquo;ll be back to use it more than once. The steps and analyses we\u0026rsquo;ll do in this introductory lab are pretty basic but foundational, and we\u0026rsquo;ll build on these as we move forward. 
The goals for you to take away from this lab are:\n Familiarize yourself with the Google Earth Pro (GEP) environment, basic functionality, and navigation using the software Use different GEP layers and features Outline: Learning Objective Submission requirements Tutorial Obtaining software Setting up GEP Navigating in GEP Exploring landscapes in GEP Using GEP to Save Imagery Using GEP to Make Placemarks and Measurements Exploring landscapes in GEP Other Features of Google Earth Pro Google Mars Google Sky Flight Simulator Wrapping up Submission requirements .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\r\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts, and submit it to blackboard when done.\rTutorial Obtaining software Google Earth can be found in many forms. You are more than likely familiar with Google Maps, an application centered on driving directions and location finding, but there is also Google Earth on the web, a Google Earth app for your phone, and Google Earth Pro for desktop (GEP from here on out), a simple but fully fledged geospatial software package. The lab computers already have this installed, but if you want to do this on your own PC you will need to download and install it.\nSetting up GEP If this is your first time opening GEP, the window will look like so:\nFeel free to take the tour to get a head start on the rest of this lab if you\u0026rsquo;ve never used Google Earth before, but we\u0026rsquo;ll walk through the most important steps below.\nOne of the first things to do in any software is to look at the options. To find these, go to Tools \u0026gt; Options. Take a second and read through the visualization options in the 3D View tab. Note that the option to change both the Show Lat/Long format and the Units of Measurement is in this tab. In the Navigation tab, a useful behavior you may want to turn off is the tilt on zoom option; the \u0026ldquo;Do not automatically tilt while zooming\u0026rdquo; option is the most intuitive zoom behavior and will save you from habitually having to reorient yourself when using GEP to explore. Back on the main application page, if you click View, you should make sure both the Toolbar and Sidebar boxes are checked. There are also several other map options here you might want to explore. Common ones to keep on are the Overview map and Scale Legend.\nOnce you have looked through the options and set some of the default behaviors, use the mouse to change your perspective and explore different areas of the globe.\nNavigating in GEP Being able to adeptly navigate around is critical in any geographic software. Nothing is more frustrating than knowing how you want to move around but not being able to do so.
Below are the navigation steps, read them and take the time to become proficient with them.\n To pan in any direction: Left-click and hold. Then, drag the cursor until you see the view you want. To return to the default view (reorient yourself so north is up and the camera angle is pointed straight down) - Click the map and press \u0026ldquo;r\u0026rdquo;. You can zoom in and out to see more or less of a map area. Use the scroll wheel on your mouse or mouse touchpad to zoom in and out. The map controls on the upper right hand side of the map can also be used to pivot, pan, and zoom using just the mouse. Finally, there is a search bar on the toolbar on the left that works just like the search bar on Google Maps. Move the mouse to the upper-right side of the screen, and a set of controls appear, should you want to use those instead. These controls fade out when not in use and reappear when the mouse rolls over them.\n The first button, an eye ball surrounded by four arrow marks with ‘N’ marked above the upward arrow, representing the ‘Look around’ function. Grab the N with the mouse (by placing the pointer on N and holding down the left mouse button) and drag the N (that is, slowly rotate it) around the ring (this is commonly called “grab and drag”). You see the view change (north will still be the direction in which the N is pointed). Clicking on the N returns the imagery so that the north direction is facing the top of the screen. The second button, a palm shaped symbol in the center surrounded by four arrow marks, representing the ‘Move function. By selecting one of the directional arrows and clicking it with the mouse, you can tilt your view around the terrain as if you were standing still and looking about. You can also grab the control and move it (by holding down the left button on your mouse) to simulate looking around in various directions. Recall from above that you can reset your view with the \u0026ldquo;r\u0026rdquo; key. You\u0026rsquo;ll find the Street View peg man button below that. This icon appears when Google Street View imagery is available to see on the ground. To use Street View, you would grab the icon with the mouse and drag it to a street that’s visible in the view to enter the Street View mode. This control is visible only if there are streets in the view through which you can enter Street View mode Below that, there is a zoom slider with ‘plus’ marked at the top and ‘minus’ marked at the bottom. The zoom slider is marked at the center of the slide. In the search box, type in University of Kansas. As GEP rotates zooms in, look to the Layers box. Make sure that both Photos, 3D Buildings, and Terrain are selected. This enables linking of locations on the ground to photos that users have taken. The locations that have linked photos have small blue and brown circles over the imagery. Click on the photo symbols on the imagery to see some photos.\n See if you can locate the building you are currently in by panning and zooming around the campus. If you got lost or aren\u0026rsquo;t on campus but are playing along, type in Lindley hall, KU into the search bar. Grab the Street View icon from the controls and drag it to the street right in front of the building. You’ll see large areas of the campus pathways and roads turn blue (which indicates that Street View imagery is available for this street). Release the icon on the street, and the view shifts from an aerial view to imagery that looks like you’re actually standing in that spot. 
Use a combination of the Street View and imagery to answer question 1:\n Question 1\nFrom viewing Lindley Hall from above and in front, what details from the aerial view can help identify what the building is and how to properly orient yourself? From the GEP Layers panel, make sure Roads is checked. You’ll see major roads (interstates, state highways) identifiable as their labels appear, and local roads will also be identifiable when you zoom in close. We’re going to plan a lunch trip to the greatest restaurant in Lawrence, Thai Diner. In the Search box, click on the Get Directions option. In the A option (this is where you’re traveling from), type in Lindley Hall. In the B option (this is where you’re traveling to), type in Thai Diner. Finally, click the Get Directions button. GEP zooms out to show you the path it calculated for driving distance between the two points, and the Search box fills with directions featuring stops and turns along the way. Question 2\nBy viewing the road network between our destinations, you’ll realize there are many possible routes between the two. Why do you suppose GEP chose this particular route? Question 3\nBased on the route that GEP calculated, what is the driving distance (and approximate time equivalent) to get to our destination? The capabilities to take a bunch of letters and numbers and turn them into a mapped location and to calculate the shortest driving distance between points are some of the core functions of GIS, and we will develop these later in the course. For now, you can click the X at the bottom of the Search box to clear the directions and remove the route\nExploring landscapes in GEP Kansas has some truly gorgeous landscapes, but it is mathematically flatter than a pancake, so let\u0026rsquo;s go explore a slightly more topographically diverse landscape. The closest National Park to Lawrence, the Badlands national park, is just to the northwest.\nTo see the boundaries of Badlands National Park, go to the Layers box and expand the option for More (click on the triangle to the left of the name). In the options that appear under the expanded heading, put a checkmark in the Parks/Recreation Areas option. Pan and Zoom out until you see the northernmost boundary of Badlands National Park highlighted in green. If you get lost remember you can use the search function. Once the park is centered in your view new icons for the locations of Visitors Centers and other features should appear. Pan over to the eastern edge of the park, and you see a large Question Mark icon indicating a park entrance as well as a green arrow indicating an overlook. Make sure that the option for Terrain in the Layers box is turned on. Zoom into the point representing the overlook. At the bottom of the main map you\u0026rsquo;ll see numbers representing the latitude and longitude of the point, as well as the real-world elevation of that spot.\nQuestion 4\nWhat is the elevation of this particular overlook in Badlands National Park? The imagery in GEP is placed on top of a model of Earth’s terrain. To get a better view of this extra dimension, use the Zoom Slider to tilt the view down (remember you can also hold down the Ctrl key and move the mouse to change your perspective) so that you can look around as if you were standing at the overlook point (in a perspective view of the planet). Once you’ve tilted all the way down, use the Look controls to examine the landscape. From here, use the Move controls to fly over the Badlands from this overlook point. 
Once you feel like you have a decent feeling of what the Badlands looks like, answer Question 5.\nQuestion 5\nHow does the terrain modeling (with the tilt function) aid in the visualization of the Badlands? This ability to model the peaks and valleys of the landscape with aerial imagery “draped” or “stretched” over the terrain for a more realistic appearance is often used with many aspects of geospatial technology, and we\u0026rsquo;ll explore some of these analyses in more depth later in the semester.\nUsing GEP to Save Imagery It’s time to continue to the next leg of our journey by heading to Mount Rushmore. Carved out of the side of a mountain in the Black Hills, Mount Rushmore National Memorial features the faces of presidents George Washington, Thomas Jefferson, Theodore Roosevelt, and Abraham Lincoln. For more information about Mount Rushmore.\nIn GEP’s Search box, type \u0026ldquo;Mount Rushmore\u0026rdquo;. GEP zooms around to an overhead view of the monument. Like the Badlands, Mount Rushmore is overseen by the National Park Service. Zoom out a little bit and you’ll see the extent of the memorial’s boundaries (still outlined in green). Let\u0026rsquo;s save an image of what’s being shown in the view. Although we could use the snip tool, GEP can take a “snapshot” and save it as a JPEG (.jpg) file. Position the view to see the full outlined extent of Mount Rushmore, select the File \u0026gt; Save \u0026gt; Save Image. You’ll see the two most common map elements appear, a box with a legend and a title box with Untitled Map in it. Click on that to give your image a name. If your text gets too long, you can enter a new line by pressing Shift-Enter. Once done, you\u0026rsquo;ll see at the top of the map window there are options including map options, image resolution, and he save button. Once you are happy with your setting hit the save button and save the image to your PC. Minimize GEP go to the location on your computer where you saved the image and open it (using a simple image viewer like Microsoft OfficePhotos). Question 6\nNote that even though the graphic contains the locations of Mount Rushmore, the outline of the park, and information concerning location (including latitude, longitude, and elevation) at the bottom, it doesn’t have any spatial reference for measurements. Why is this? Even though the saved image doesn’t have any spatial reference data, we\u0026rsquo;ll cover how to add that back in later in the class. For now, you can turn off the Parks/Recreation layer.\nUsing GEP to Make Placemarks and Measurements While you’re examining Mount Rushmore, you can set up some points of reference to which you can return. GEP allows you to create points of reference as placemarks. Let\u0026rsquo;s set up three points on the map: the top of the mountain, the amphitheater, and the parking area.\nFrom the GEP toolbar, select the Add Placemark button (1) A yellow pushpin (labeled “Untitled Placemark”) appears on the screen. Using your mouse, click on the Pushpin and drag it to the rear of the amphitheater so that the pin of the placemark is where the path meets the amphitheater (2) In the Placemark dialog box, type \u0026ldquo;Mount Rushmore Amphitheater\u0026rdquo; (3) Click the Placemark icon button next to where you typed the name, you can select an icon other than the yellow pushpin. Choose something more distinctive (4) When finished, click OK to close the dialog box. Put a second placemark at the top of the mountain by repeating this process. 
Name this new placemark \u0026ldquo;Top of the Memorial\u0026rdquo;. When done, position the two as tightly within the view as possible. Click on the Ruler tool on the toolbar. In the Ruler dialog box that appears, select Feet from the Map Length pull-down menu. Use the options on the Line tab, which allow you to compute the distance between two points. If you wanted to measure multiple points, you could do so from the Path tab. Using this tool, measure the distance between your two placemarks to answer the question below: Question 7\nWhat is the measured distance between the rear of the amphitheater and the top of the memorial? (Keep in mind that this is the ground distance, not a straight line between the two points.) When you’re done, click on Clear in the Ruler dialog box to remove the drawn line from the screen and then close the Ruler dialog box. These abilities to create points of reference (as well as lines and area shapes) and then compute the distances between them might seem trivial, but these process level functions form the heart of almost every GIS tool.\nFinally, let\u0026rsquo;s use the Tilt functions of GEP to get a perspective view on Mount Rushmore (as we did in the Badlands). Make sure that the option for Terrain in the Layers box is turned on and the 3D Buildings option is turned off. Note that although you can see the height and dimensions of the mountain, the famous four presidents’ faces can’t be seen. Question 8\nEven with Terrain turned on and the view switching over to a perspective, why can’t the presidents’ faces on the side of the memorial be seen? Now turn 3D Buildings on. Can you see them now?\nExploring landscapes in GEP Notice that next to the elevation value at the bottom of the GEP screen, there’s a set of coordinates for lat (latitude) and long (longitude). Move the mouse around the screen, and you see the coordinates change to reflect the latitude and longitude of whatever the mouse’s current location is.\nZoom in closely on the road that enters the parking structure area of Mount Rushmore. Question 9\nWhat are the latitude and longitude coordinates of the entrance to the Mount Rushmore parking area? You can also reference specific locations on Earth’s surface by their coordinates instead of by name. In the Search box, type the following coordinates: 43.836584, -103.623403. GEP rotates and zooms to this new location. Turn on Photos to obtain more information on what you’ve just flown to. You can also turn on the 3D Buildings layer (if necessary, again using the legacy 3D Buildings option, as you did with Mount Rushmore) to look at the location with a 3D version of it there.\nAnswer Questions 10 and 11, and then turn off the Photos (and the 3D Buildings option) when you’re done. Question 10\nWhat is located at the following geographic coordinates: latitude 43.836584, longitude -103.623403? Question 11\nWhat is located at the following geographic coordinates: latitude 43.845709, longitude -103.563499? Other Features of Google Earth Pro Google Earth Pro changes with new updates and features, but some of the more novel features include Google Mars, Google Sky, and Flight Simulator.\nGoogle Mars From the View pull-down menu, select Explore \u0026gt; Mars. The view shifts to the familiar-looking globe, but this time covered with imagery (from NASA and the USGS) from the surface of Mars. The usual controls work the same way as they do with GEP, and there’s a lot of Martian territory to be explored.
When you’re ready to return to GEP you use the same View \u0026gt; Explore menus and then select Earth.\nGoogle Sky From the View pull-down menu, select Explore \u0026gt; Sky. GEP’s view changes: Instead of looking down on Earth, you’re looking up to the stars, and instead of seeing remotely sensed aerial or satellite imagery, you’re looking at space telescope (including the Hubble) imagery. There’s plenty of imagery to be explored in this new mode. When you’re ready to return to GEP you use the same View \u0026gt; Explore menus and then select Earth.\nFlight Simulator From the Tools pull-down menu, select Enter Flight Simulator. A new dialog box appears, asking if you want to pilot an F-16 fighter jet or an SR-22 propeller airplane. Select whichever plane you want to fly; GEP switches to the view as seen out the cockpit of your chosen plane, and you can fly across South Dakota (and the rest of GEP). Try not to crash (although you can easily reset if you do).\nWrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":16,"href":"/classes/geog111/labs/lab03/","title":"Lab 03 - Coordinates and Position Measurements","parent":"Labs","content":" This lab is a gratefully modified version of lab 2 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies\n Learning Objective In this lab we\u0026rsquo;ll continue using Google Earth Pro (GEP) to examine coordinate systems and the relationships between various sets of coordinates and the objects they represent in the real world. In addition, we’ll make more measurements using GEP and compare measurements made by using different coordinate systems. The goals for you to take away from this lab:\n Set up a graticule of lines in GEP Locate places and objects based only on their coordinates Make measurements across long and short distances and then compare measurements with surface distance calculations Translate latitude/longitude coordinates into UTM Outline: Learning Objective Submission requirements Tutorial Examining Coordinates and Distance Measurements in Google Earth Pro Using UTM Coordinates and Measurements in Google Earth Pro Wrapping up Submission requirements .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab3Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts, and submit it to blackboard when done.\rTutorial Examining Coordinates and Distance Measurements in Google Earth Pro Start Google Earth Pro. Once Earth settles into view, scroll your mouse around Earth. 
You’ll see a set of latitude and longitude coordinates appear at the bottom of the view; these represent the coordinates assigned to your mouse’s location. By default, GEP uses the GCS coordinate system and the WGS84 datum. To examine the full graticule of latitude and longitude lines, from the View menu make sure Grid is selected. Some key GCS lines will be highlighted in yellow amid the web of lat/long lines—the Equator, the Prime Meridian, the Antimeridian (how Google Earth labels the 180th meridian), the Tropic of Cancer, and the Tropic of Capricorn. We’ll begin our adventure in Lawrence, KS. Type that into the Search box. GEP will rotate and zoom to the area. You’ll also see the spaces between the lat/long lines grow smaller and new values appear as GEP zooms in. Next, we’ll go a specific location in Lawrence. Type \u0026ldquo;38.9584517, -95.251449\u0026rdquo; into the search box. These are the decimal degree lat/long coordinates of the Lindley Hall. Click the Placemark button on the toolbar (Review Lab 1 if you need a quick refresher). The yellow pushpin is automatically placed at the lat/long coordinates you had GEP search for so there\u0026rsquo;s no need to manually place it. In the new placemark dialog box, type \u0026ldquo;Lindley Hall\u0026rdquo; and change the pin symbology to something you like. The coordinates for Lindley Hall are in decimal degrees, but other methods of displaying coordinates are available in GEP. From the Tools pull-down menu, select Options. In the Show Lat/Long options, select the radio button next to Degrees, Minutes, Seconds, click Apply and then OK. From the Places box in GEP, right-click on the placemark Lindley Hall and then select Properties. The coordinates are changed to degrees, minutes, and seconds (DMS). Answer Question 1. Close the Lindley Hall placemark dialog box.\nQuestion 1\nWhat are the coordinates for Lindley Hall, in degrees, minutes, and seconds? Decimal degree coordinates for Thai Diner are \u0026ldquo;38.9435517, -95.2424942\u0026rdquo;. Use the Search box to zoom to these coordinates and put a placemark at that spot. Name it Thai Diner, and answer Question 2. Question 2\nWhat are the coordinates for the Thai Diner, in degrees, minutes, and seconds? Adjust the view so you can see both placemarks at the edges of the view. Select the Ruler tool and compute the distance between the Lindley Hall and the Thai Diner (again, review Lab 1 if you need a quick refresher). Answer Question 3. Question 3\nAccording to GEP, what is the distance between Lindley Hall and Thai Diner? It’s such a short distance (relative to the size of the globe) from Lindley Hall to Thai Diner that the differences between measurements should be very small, so let’s look at some larger distances.\nLindley Hall (in London, England) is located at \u0026ldquo;51.4944692, -0.1344998\u0026rdquo;. Use the Search box to zoom to these coordinates. Once GEP arrives there, put a placemark at that spot, name it \u0026ldquo;Lindley Hall, London\u0026rdquo;, and then answer Question 4 Question 4\nWhat are the coordinates for Lindley Hall, London, in degrees, minutes, and seconds? Use the Ruler to compute the distance between Lindley Hall and Lindley Hall, London. You will have to pan and zoom to get the ruler to measure between both locations. Answer Questions 5 and 6. Close the Ruler dialog box when you’re done. Question 5\nAccording to your measurement in GEP, what is the computed distance (in miles) between the Lindley Hall and Lindley Hall, London? 
Question 6\nWhy is the line curved rather than straight? What kind of distance is being computed here? Of course, this is a rough estimate because the scale of the view and the imprecise nature of your point and click makes hitting the placemarks directly with the start point and the end point of the ruler difficult.\nTo check your measurements using a Web utility, go to https://www.movable-type.co.uk/scripts/latlong.html. This Website enables you to compute the surface (real-world) distance between two sets of lat/long coordinates. Type your answer to Question 1 as the degrees, minutes, and seconds for the latitude and longitude of Point 1, and your answer from Question 4 in Point 2. Note that you can simply write each of those coordinates with a space in between them; for instance, 40 degrees, 41 minutes, 52 seconds North latitude can be typed in the box as 40 41 52 N.\n Click the see it on a map button. The surface distance between the two points is computed in kilometers, and a zoomable map of the great circle distance from Lindley Hall to Lindley Hall, London is displayed for you. Finally, convert you answer from km to miles (a simple Google search will accomplish this, \u0026ldquo;### km to miles\u0026rdquo;). Answer Question 7. Question 7\nAccording to this website measurement, what is the computed surface distance (in miles) between the Lindley Hall and Lindley Hall, London? Your answers to Questions 5 and 7 should be relatively similar, given how they were both computed; if they’re way off, redo your Question 5 measurement and make sure you’re inputting the correct numbers for Question 7.\nUsing UTM Coordinates and Measurements in Google Earth Pro Universal Transverse Mercator (UTM) is a projected coordinate system, and GEP can track coordinates using this UTM system as well as lat/long. From Tools on the menu bar, select Options. In the Show Lat/Long options, select the radio button for Universal Transverse Mercator, click Apply and then OK. UTM coordinates are measured in meters rather than degrees of latitude or longitude, which enables an easier method of determining coordinates. Moving the mouse east increases your easting, and moving it west decreases the easting. The same holds true for northing: Moving the mouse north increases the value of northing, and moving it south decreases the northing value.\nZoom out to see the whole world. You’ll also see that the graticule has changed from a web of latitude/longitude lines to the UTM grid draped over Earth. You should see the UTM zones laid out across the world, numbered from 1 through 60. Answer Question 8. Question 8\nIn what UTM zone is Lindley Hall, London located? In what UTM zone is Lindley Hall located? Double-click on the Lindley Hall placemark(in the Places box), and GEP rotates and zooms back to the Lindley Hall. Scroll the mouse around the area. You\u0026rsquo;ll see new coordinates appear at the bottom of the view, but this time they are the zone, easting, and northing measurements. Open the Properties of The Lindley Hall placemark by right-clicking on it. The UTM easting, northing, and zone appear. Using these methods, answer questions 9 and 10. Question 9\nWhat are the UTM coordinates of the Lindley Hall? Question 10\nWhat are the UTM coordinates of the Thai Diner? Change the view so that you can see both Lindley Hall and Thai Diner. Using your answers for Questions 9 and 10 (and perhaps the Ruler tool or some basic arithmetic/trigonometry), answer Question 11. 
Question 11\nHow many meters away to the north and east is Lindley Hall from the Thai Diner? UTM can also be used to easily determine the locations of other objects. Return to the Lindley Hall placemark again. Answer Questions 12 and 13. Question 12\nWhat object is located approximately 66.69 meters north and 692 meters east of Lindley Hall? Question 13\nWhat are the full UTM coordinates of this object? Wrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":17,"href":"/classes/geog111/labs/lab04/","title":"Lab 04 - GPS","parent":"Labs","content":"Background This exercise provides an introduction to using a Global Positioning System (GPS) receiver to obtain coordinates and create a point shapefile. GPS is a system consisting of a network of satellites that orbit ~11,000 nautical miles from the earth in six different orbital paths. They are continuously monitored by ground stations located worldwide. The satellites transmit signals that can be detected with a GPS receiver. Using the receiver, you can determine your location with great precision through the trilateration (not triangulation!) of signals from at least 3 satellites, with each distance derived from the difference between the time a signal was sent and the time it was received.\nAlthough the system is very sophisticated, and atomic clocks are used by the satellites, there are multiple sources and types of errors involved in finding your location. As the GPS signal passes through the charged particles of the ionosphere and then through the water vapor in the troposphere, it slows slightly, which creates the same kind of error as an inaccurate clock. Also, if the satellites that are in your view at a particular moment are close together in the sky, the intersecting circles that define a position will cross at very shallow angles. That increases the error margin around a position. The kind of GPS receivers we will be using provide about 10-meter accuracy (which may improve to under 3 m if differential corrections like WAAS are used), depending on the number of satellites available and the geometry of those satellites.\nLearning Objective In this exercise we are going to collect the coordinates of some set of campus features using either the GPS units or your phones, generate a few shapefiles of those features, and then use the shapefile to make a map. The goals for you to take away from this lab:\n How to use your selected platform to collect data How to export that data How to view that data in GIS software Outline: Background Learning Objective Submission requirements Tutorial Notes before getting started Your mission How to collect data Getting your data onto a PC Open your features in Google Earth and make a quick map Submission requirements For this lab you are submitting a map (read: not a screenshot) of the GPS features you create in the field. Submit the created .jpg to blackboard.\nTutorial Notes before getting started If you have an Android phone, I highly recommend the Android-specific application; it is by far the most robust data collection application I\u0026rsquo;ve tested, and you will always have your phone on hand. The Apple/cross-platform application will work for the objectives and purposes of this lab, but you will likely be underwhelmed by the capabilities.
The handheld units are the most traditional (and perhaps most powerful) options, but unless you go out and buy a GPS unit for yourself you will likely never hold a GPS unit in your hands again. The apps outlined in this lab are all free to use, although many have the option of upgrading to a paid version. You do not need to pay for anything if you don\u0026rsquo;t want to. If you get confused as to what you\u0026rsquo;re looking at, spamming the back button should always take you to the main menus. Note: Make sure your unit is charged before going out on your adventure!\n Your mission Your goal in this lab is to demonstrate to me that you can collect the three different types of data: points, lines, and polygons. How you go about demonstrating that is up to you. Perhaps you want to go out and collect bus stops, a bus path, and the outline of a building? Maybe you would rather collect tree locations, water paths, and the outlines of grassy areas? Or sample all the coffee on campus and trace your path from your dorm to class? Whatever you decide to do, go out and collect these data and then view them in Google Earth to make a very simple map.\nHow to collect data Use the following instructions, your intuition as an individual born in the digital age, and trial and error to collect data for the above mission. Android If you are on an Android device there are a few options available to you. The one you will need for this lab is called Locus GIS. The advantages of this app (Locus) over the cross-platform one are several. Perhaps most critically, the app allows you to export databases of features at once without going through a paywall. This app also seems to forgo ads when collecting data. The forms to add data are also more robust, allowing you to add attributes and pictures to the feature. I also feel the interface is easier to use.\nInitial setup and clearing data LOCUS separates data by projects, so go to the menu and create a new project to get started. Make sure your new project coordinate system is set to 4326 (the EPSG code for WGS 84 lat/long).\nNavigate To go to a particular point, first add the point using the plus button, and manually enter the coordinates. After the point has been added, you can click on it, and at the top of the application, one of the options will be to navigate to the point using Google Maps.\nCollecting data Locus allows you to record points based on GPS, but also has a rich digital editor that allows you to draw GIS data. We\u0026rsquo;ll only cover recording data here, but feel free to explore! Collecting a point To create points, you need to first create a point file and fill out the fields as necessary. These can be edited in the app.\nCollecting a line or a polygon You likewise need to create new files for lines and polygons. To draw a line, start recording and stop or pause recording when you want to stop.\n Apple You are looking for the Fields Area Measure Free app. There is a guided tutorial built right into the app, and your intuitive knowledge as a member of the digital generation should carry you the rest of the way.\nInitial setup and clearing data Click the menu button (upper left corner), and go to saved measures. Click the three buttons next to the entries to delete them. To set the units, go to settings, and make sure the measurement system is set to metric. Navigate To go to a particular point, first add the point using the plus button, and manually enter the coordinates.
After the point has been added, you can click on it, and at the top of the application, one of the options will be to navigate to the point using Google Maps. Final notes There are a few caveats to this app. First, you may have to sit through an ad if you drop more than a few features in a session. Finally, the exporting of the data must be done individually (feature by feature), making the unpaid version of this unworkable for larger projects. Sorry, Apple :(\n Garmin eTrex 10 The lab has access to the eTrex handheld Garmin units. See the website for the user\u0026rsquo;s manual. Instructions on how to accomplish some of the most common tasks are included below:\nInitial setup and clearing data First, let\u0026rsquo;s make sure the unit is set up properly.\n From the main menu, go to setup | units and ensure distance is set to metric and elevation is in feet. From the setup menu, go to Position Format and make sure it\u0026rsquo;s set to hddd.ddddd° and that Map Datum and Map Spheroid are both set to WGS 84. Finally, from setup go to reset and make sure that you delete all waypoints and clear the current track. Navigate To go to a point, from the main menu go to where to | coordinates, and enter your coordinates, or choose a previously dropped waypoint. When done, the map will pop up, directing you to your chosen destination. Collecting data Collecting a point From the main menu, go to Mark waypoint and click Done.\nCollecting a line or a polygon Lines:\n Go to Main Menu | Tracks To begin a line: Clear | Yes (clear track log) | Track Log on To end a line: Track Log off | Save | Yes (save all tracks) Make sure you clear the track log between tracks! (This keeps each line separate.) Polygons:\n Use the same steps as lines, except you need to make sure that your line ends where it began. Garmin GPSMap 60CS The lab also has Garmin GPSMap 60CS handheld units. See the website for the user\u0026rsquo;s manual. Instructions on how to accomplish some of the most common tasks are included below:\n Note: Charge the unit with the provided USB cable before going out on your adventure!\n Initial setup and clearing data Make sure the GPS unit is configured to collect coordinates in decimal degrees and distance is in meters: Main Menu (press Menu twice) | Setup | Units | Position format = hddd.ddddd° | Distance/Speed = Metric Even though the map datum will be set to WGS 84, this will correlate fine with the NAD projection of our data later on. Next, let\u0026rsquo;s clear out the old data.\n Find | Waypoints | Menu | Delete… | All Symbols | Yes Main Menu | Tracks | Clear | Yes | Menu | Delete All Saved | Yes Collecting data Collecting a point Points:\n Press the Mark button. Click OK to save. To view a list of all of the waypoints you have taken, use the Find button (it leads to the Waypoints menu) next to the Mark button. Collecting a line or a polygon Lines:\n Go to Main Menu | Tracks To begin a line: Clear | Yes (clear track log) | Track Log on To end a line: Track Log off | Save | Yes (save all tracks) Make sure you clear the track log between tracks! (This keeps each line separate.) Polygons:\n Use the same steps as lines, except you need to make sure that your line ends where it began. Getting your data onto a PC Each method of acquiring GPS data can come with its own headache of data massaging. Our goal is getting the data off of the device and into the KML/KMZ format. Fortunately for us, all of the above methods of data collection facilitate this format natively, so we don\u0026rsquo;t have to do a weird data conversion dance to arrive at our desired endpoint.
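Since everything funnels into KML, it can help to know that a KML file is just plain XML text; if an app's export ever misbehaves, you can assemble a simple points-only KML yourself. The Python sketch below is an illustration only: the waypoint names and coordinates are made up, and note that KML stores coordinates in longitude,latitude order, the reverse of how most receivers display them.

# Illustration only: write a points-only KML file by hand. The waypoints below are
# made-up examples; KML expects longitude,latitude order inside <coordinates>.
waypoints = [
    ("Bus stop", -95.2514, 38.9585),
    ("Coffee", -95.2425, 38.9436),
]

placemarks = "".join(
    "  <Placemark><name>{}</name><Point><coordinates>{},{}</coordinates></Point></Placemark>\n".format(name, lon, lat)
    for name, lon, lat in waypoints
)

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
    "<Document>\n" + placemarks + "</Document>\n</kml>\n"
)

with open("my_waypoints.kml", "w") as f:   # open the result in Google Earth to check it
    f.write(kml)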
Use the following instructions to export your data.\nFrom LOCUS As mentioned, LOCUS makes this process painless: from your project, simply click the three-dot more button and then Export as KML.\n From Handhelds Click the Start/Windows button, click the little down arrow in the bottom left corner, and then click dnrgps (under #-Programs). Note: you can also download from the Minnesota Department of Natural Resources. However, apparently ftp is on the way out and it can be difficult to get it to download properly. I\u0026rsquo;ve saved a version here for archival purposes, but to get this to download for yourself (I\u0026rsquo;m assuming you are using Chrome) you need to:\n Type \u0026ldquo;chrome://flags\u0026rdquo; into the browser bar. Ctrl-F \u0026ldquo;ftp\u0026rdquo; to find the relevant flag, enable it, and refresh the browser. Attempt to redownload it. On the bottom download bar, an insecure-download warning appears; click the arrow and then Keep to finally retrieve the file. Connect the GPS unit to the computer using the provided USB cable. Turn the unit on, press the Page button until you reach the Satellite screen, press the Menu button and select Use with GPS off (this helps save the battery). If the application doesn’t find your GPS unit automatically, select GPS | Find GPS (or Connect to Default GPS). Download your features into the application by clicking the GPS menu and selecting Download All. Look at the table of waypoints. If you see any unwanted waypoints in the table, select them and click the red X on the left side of the screen. Let’s project our data before we go any further. Click File \u0026gt; Set Projection. On the Projection tab, set the POSC Code to \u0026ldquo;26915\u0026rdquo;, the datum to \u0026ldquo;NAD83\u0026rdquo;, and the projection to \u0026ldquo;UTM zone 15N\u0026rdquo; (or whatever is appropriate for your use case). When finished, click OK. Select all of the waypoints that you want to use and then click Edit \u0026gt; Project Coordinates. Click File \u0026gt; Save To \u0026gt; File…. Save your waypoints as a KML with a sensible naming convention (like \u0026ldquo;GEOG111_Lab4Waypoints\u0026rdquo;). Downloading tracks is similar to downloading waypoints. Click the Tracks tab, select all the records (rows) that correspond to your tracks, and click File \u0026gt; Save To \u0026gt; File…. Save your track as a KMZ with a sensible name. When it prompts for Save to Shape/GPS Types, be sure you select the appropriate one. When in doubt, Line is safest. When you’re done creating your KMLs, close the DNR GPS application, turn the GPS unit off, disconnect the unit from your computer, and put it back. From KML (Apple) Click on the Menu button to open the Saved measures. Long press on a saved set to open up the options at the top and click the share icon. Click on KML and then either save the measure to your device or share it with an application you have access to (emailing it to yourself is a common, if roundabout, way to do this). Open your features in Google Earth and make a quick map In Windows Explorer, double-click the file you just created. (If you don’t see it, look for KMZ File under the Type column.) Google Earth should open and zoom to the location where your points are. Are your points where they’re supposed to be? Common errors: \u0026ldquo;When I open my KML layer, Adobe pops up.\u0026rdquo; Adobe can be aggressive with its file extensions and may have been set as the default program to handle KML layers for some reason.
You can either right-click on the file and choose Open with \u0026gt; find program, or open Google Earth and go to File \u0026gt; Open. Find a cool view angle and zoom level that fully encompasses your data. Let\u0026rsquo;s export our map for viewing. Go to File \u0026gt; Save \u0026gt; Save Image\u0026hellip; This adds common map elements, including a title block and legend, as well as a north arrow. Fill out the pertinent information, including a relevant title, your name, and the data source. Finally, under Map Options at the top, you have the option of toning down the intensity of the base map. When done, export your map and submit it. Congratulations, you just made your first map! "},{"id":18,"href":"/classes/geog111/labs/lab05/","title":"Lab 05 - GIS introduction","parent":"Labs","content":" This lab is a gratefully modified version of lab 5 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies\n Learning Objective This lab introduces you to some of the basic features of GIS. You will be using a free open source program, QGIS, to navigate a GIS environment and begin working with geospatial data. The labs in Chapters 6 and 7 will show you how to utilize several more GIS features; the aim of this lab is to familiarize you with the basic GIS functions of the software. The goals for you to take away from this lab:\n Familiarize yourself with the GIS software environment of your choice, including basic navigation and tool use with both the Map and Browser Examine characteristics of geospatial data, such as the coordinate system, datum, and projection information Familiarize yourself with adding data and manipulating data layer properties, including changing the symbology and appearance of geospatial data Familiarize yourself with data attribute tables in QGIS Join a table of non-spatial data to a layer’s attribute table in QGIS Make measurements between objects in QGIS Outline: Learning Objective Submission requirements Tutorial Downloading data Data prep Working in GIS software Submission Submission requirements Materials\nData Name: Description\nGEOG111_Lab05Questions.docx: Handout to turn in\nrawdata/ACSDT5Y2019.B01003_2021-04-18T051624.zip: Population estimates for Kansas Counties from ACS - 2019\nrawdata/GOVTUNIT_Kansas_State_GDB.zip: Kansas boundaries from the National Map\nrawdata/STRUCT_Kansas_State_GDB.zip: Kansas structures from the National Map\nrawdata/TRAN_Kansas_State_GDB.zip: Kansas transportation layer from the National Map\nYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab.
Use full sentences as necessary to answer the prompts and submit it to blackboard.\nTutorial Downloading data In this lab we\u0026rsquo;ll use data from two of the most common sources in the US: the National Map and the Census Bureau. I\u0026rsquo;ve gone and downloaded the data for Kansas for you to use, but if you want to do this for a state you are more interested in\u0026hellip; How to download your own data ↕ We\u0026rsquo;ll get started in the National Map. This provides easy access to a number of data sets at useful geographic units. Start by defining an AOI (Area of Interest); the simplest is a point within the boundaries of your state. Next, select the Boundaries - National Boundary Dataset check box and within that the State and FileGDB radio buttons. Do the same for the Structures - National Structures Dataset and Transportation layers. After you\u0026rsquo;ve done that, click the Search products button at the top. This will filter the data down to the requested selections and take you to the Products tab. For each of the requested data sets, you can click the Expand View prompt, and then the Download ZIP prompt to get the data. You should save these in a new, separate folder. I typically create a folder called rawdata that I place data into. This folder serves as a place to keep data that comes directly from a provider, and I don\u0026rsquo;t touch it afterwards. Although this might seem a bit overkill, you\u0026rsquo;ll thank yourself down the road as you grow as a data scientist. Finally, we\u0026rsquo;d like to add population data, and for that, the Census is king. We cover census data in much more detail in future classes, but in essence, the census is a rich collection of data hidden behind difficult data conventions, obscure naming schema, and a poor interface that is frustrating for even advanced users to gather and process. Fortunately, grabbing population data is almost straightforward. Starting at the U.S. Census data portal, search for \u0026ldquo;Total Population\u0026rdquo; and locate the TOTAL POPULATION table from the American Community Survey. Click CUSTOMIZE TABLE to find table view, and you should be taken to this page. First, click on the Geo button. We want county populations within your selected state, and we want all counties, like so: We can turn off the Margin of Error, turn on Transpose, and then change the Product to the 5-year estimate. Finally, change the Year tab to the most recent year. You can now click Download and save the data to your relevant folder. Data prep As you become more versed in the geographic sciences, you\u0026rsquo;ll slowly learn how data needs to be formatted. One of those lessons is that the Census data we downloaded has an extra row, which we\u0026rsquo;ll need to remove before it\u0026rsquo;s ready to be used. To do this we first need to unzip the files we\u0026rsquo;ve downloaded. You can do this one at a time with the standard extract feature built into Windows, but I recommend 7-Zip. If you highlight all three and right-click \u0026gt; 7-Zip \u0026gt; Extract to \u0026ldquo;*\u0026rdquo;, it will put them in that same location with the same name. Recall that we\u0026rsquo;re working in the \u0026ldquo;rawdata\u0026rdquo; folder, but we are about to (irreversibly) modify the data, so we should create a new folder. In general, I create a \u0026ldquo;storedata\u0026rdquo; folder for these intermediate products. So, create that folder and then copy the ****data_with_overlays file into it.
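If you would rather script that extra-row cleanup than edit the file by hand (the Excel route is described next), here is an optional, minimal pandas sketch. The file path below is a placeholder; point it at the actual ...data_with_overlays .csv file you just copied into storedata.

# Optional alternative to the Excel edit described next. The path below is a
# placeholder; substitute the actual ...data_with_overlays .csv you copied into
# storedata. skiprows=[1] drops the second row (the long descriptive labels) while
# keeping the first row of machine-readable column names.
import pandas as pd

src = "storedata/ACS_data_with_overlays.csv"   # placeholder file name
df = pd.read_csv(src, skiprows=[1])
df.to_csv("storedata/ACS_population_clean.csv", index=False)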
Open it up (Excel is fine), remove the second row by right-clicking on the row number and pressing Delete, then save it and make sure it\u0026rsquo;s saved in .csv format. Working in GIS software Use the start menu to open up your software of choice. ArcPRO Working with a Project in ArcGIS Pro In the Windows search bar, type in \u0026ldquo;ArcGIS Pro\u0026rdquo; and open the program. In the top right corner, make sure you are signed in to your ESRI account. ArcGIS Pro opens and asks how you want to proceed. You can either open an existing project, begin using a blank project template, or start without a project template. In ArcGIS Pro, a “project” is the means by which you organize all the various layers, tables, maps, and so on of your work, and a “template” is how a project starts when you begin working with ArcGIS Pro. If you were just going to do something quick like examine a GIS layer, you wouldn’t need to use a template. However, as you’ll be doing several things in this lab and to get into the good habit of being digitally organized, click on Map in the New Blank Templates options. You are now prompted to give your project a name and indicate where it will be located. Let\u0026rsquo;s call this project \u0026ldquo;lab05\u0026rdquo; and save it in the lab05 folder. Finally, click OK. ArcGIS Pro opens to your project. The large area in the center is referred to as the View. Because you chose the Map template, ArcGIS Pro opened with a new Map for you to add your data to. You can see a new tab called Map in the View, and it contains a topographic map of the United States. In ArcGIS Pro, you add GIS data to a Map so you can work with it. You can have multiple different tabs in the View and switch back and forth between them as needed. For instance, you can have several Maps available at once, each containing different data. You can also have tabs open for one or more Layouts. For this lab, you’ll be using only this one Map. Along the top of the window you\u0026rsquo;ll find the toolbar area. This, along with the toolbox (which we\u0026rsquo;ll cover later), is the primary means of interacting with your data. You see a Catalog pane on the right-hand side of the screen and a Contents pane on the left-hand side of the screen (with the word Contents at the top). The Contents pane shows what is contained in the tab that is currently being used. As you add items to whatever you’re working with, you\u0026rsquo;ll see them available in the Contents pane. Don\u0026rsquo;t see them? Click on the View tab at the top and click Reset Panes \u0026gt; Reset Panes for Mapping (default). With several Maps or tabs open, it’s easy to lose track of what the various tabs contain. Even though you’re using only one Map in this exercise, for ease of use, you can change its name to something more distinctive than “Map.” With the Map tab in the View selected, return to the Contents pane. Right-click on Map and choose Properties.\nThe Map Properties dialog box opens. On the General tab, change the name of this Map from Map to \u0026ldquo;Kansas\u0026rdquo;. Then click OK to close the Map Properties dialog box. You should now see that the Map tab has changed to the Kansas tab, and the name has updated in the Contents pane as well. Click the x on the Kansas Map tab to close the map window. To get this back, in the Catalog pane, you see a Maps folder. Expand this, and you\u0026rsquo;ll see a list of all the Maps you have available in this particular project. You should have only one: the Kansas Map.
You can add this map back into the view by coming to this location and double-clicking the Map.\nAdd and manipulate basemap A fresh map template loads in a standard topographic map, but this can easily be changed on the Map tab under the Basemap icon. Take a second and explore the different options here. Navigate around map Navigating around the map is a critical skill you need in order to be a competent practitioner. Fortunately, Google-style controls have become pretty standard, and you likely know most of them at this point simply through intuition and general exposure. Manual controls While the Explore button is on, you can left-click to pan around the map. Both the right click and the scroll wheel zoom the map in and out. Does the zoom seem backwards to you? You can change this (the Project tab \u0026gt; Options \u0026gt; Navigation). There is also a Google Earth-style navigation compass: right-click in the map and select Navigator. This opens a navigator compass on the map which allows you to rotate and navigate around the map with a click. Clicking on the arrow reorients the view northward.\n Go To point\nAdd Vector Data In the Catalog pane, expand the Folders option, and you see the folders and data available for this project. If you explore your folders, you\u0026rsquo;ll notice that none of your data is visible. To add data without moving your data between folders manually, right-click on the Folders option and choose Add Folder Connection. From there, navigate to the folder you\u0026rsquo;ve stored your data in (in this case, the Lab05 folder).\n Expand the geodatabases to verify that they contain the three feature classes you’ll be working with in this lab: Trans_AirportPoint, GU_CountyOrEquivalent, and Trans_TrailSegment. Adding Data to ArcGIS Pro and Working with the Contents Pane Return to the Map. Its corresponding area in the Contents pane is empty, so let\u0026rsquo;s add some data to work with. Click on the Map tab at the top of the screen, and then within the Layer group, click on the Add Data icon and select Data. In the dialog box, navigate to the rawdata folder, then the GOVTUNIT_Kansas_State_GDB.gdb geodatabase, and then in GovernmentUnits you\u0026rsquo;ll find the GU_CountyOrEquivalent layer. Click OK (or double-click) to add it to the map.\n You now see the GU_CountyOrEquivalent feature class listed in the Layers panel and its contents (a set of polygons displaying each of the counties in Kansas) displayed in the Map View. In the Layers panel, you see a checkmark in the box next to the layer. When this checkmark is displayed, the layer is shown in the Map View, and when the checkmark is removed, the layer is not displayed. Let\u0026rsquo;s add two more layers from the CalData geodatabase: CalAirports (a point layer) and CalTrails (a line layer). All three of your layers are now in the Layers panel. You can manipulate the “drawing order” of items in the Layers panel by grabbing a layer with the mouse and moving it up or down in the Layers panel. Whatever layer is at the bottom is drawn first, and the layers above it in the Layers panel are drawn on top of it. Thus, if you move the CalBounds layer to the top of the Layers panel, the other layers are no longer visible in the Map View because the polygons of the CalBounds layer are being drawn over them. Add Tabular Data Look at attribute tables Joining vector and tabular data Selections Manual pick Select by attribute Select by Location saving
QGIS Open a map Add and manipulate basemap Navigate around map Manual controls Go To point Add Vector Data Add Tabular Data Look at attribute tables Joining vector and tabular data Selections Manual pick Select by attribute Select by Location saving An Introduction to QGIS Start QGIS, and it opens in its initial mode. The left-hand column (where you see the word “Layers”) is referred to as the Map Legend; it’s where you see a list of available data layers. The blank screen that takes up most of the interface, called the Map View, is where data will be displayed.
Important note: If you don’t see the Map Legend on the screen (or if other panels are open where the Map Legend should be), you can change which panels are available to view. From the View pull-down menu, select Panels, and then you can place an x next to the name of each panel that you want to see on the screen. To begin, make sure that the Layers panel has an x next to its name.\nImportant note: Before you begin adding and viewing data, you need to examine the data you have available to you. To do so in QGIS, you can use the QGIS Browser, a utility designed to enable you to organize and manage GIS data.\nIf the Browser panel is not open, from the View pull-down menu, select Panels and then place an x next to Browser Panel. The Browser panel can be used to manage your data and to get information about it. In the Browser tree (the section going down the left-hand side of the dialog box), navigate to the C: drive and open the Chapter5QGIS folder. Verify that you have the CalData.gdb folder (which is the file geodatabase) and the CAPop.csv file (which is the table containing population data). Next, expand the CalData.gdb geodatabase by clicking the arrow next to it. You have three feature classes available: CalAirports, CalBounds, and CalTrails. Adding Data to QGIS and Working with the Map Legend Next, you need to add some data to the Layers panel. The first layer you need to add is the CalBounds layer. You can do this in one of two ways: a. You can drag and drop the CalBounds layer directly from the Browser panel into the Layers panel. b. From the Layer pull-down menu, choose Add Layer and then choose Add Vector Layer. In the Add Vector Layer dialog box, click on the radio button for Directory, and from the Type pull-down menu choose OpenFileGDB. This will allow you to choose individual files from within the file geodatabase. Next, click Browse and navigate to the C:\Chapter5QGIS folder. In that folder click on the CalData folder and then click on Select Folder. Next, click on Add. You see a list of the different layers that are within the file geodatabase and from which you can choose. To begin, choose CalBounds and click OK. You now see the CalBounds feature class listed in the Layers panel and its contents (a set of polygons displaying each of the counties in California) displayed in the Map View. In the Layers panel, you see a checkmark in the box next to the CalBounds feature class. When the checkmark is displayed, the layer is shown in the Map View, and when the checkmark is removed, the layer is not displayed. Add two more layers from the CalData geodatabase: CalAirports (a point layer) and CalTrails (a line layer). All three of your layers are now in the Layers panel. You can manipulate the “drawing order” of items in the Layers panel by grabbing a layer with the mouse and moving it up or down in the Layers panel. Whatever layer is at the bottom is drawn first, and the layers above it in the Layers panel are drawn on top of it. Thus, if you move the CalBounds layer to the top of the Layers panel, the other layers are no longer visible in the Map View because the polygons of the CalBounds layer are being drawn over them. Next, you need to get some basic information about your GIS layers. Right-click on the CalBounds layer and choose Show Feature Count.
A number in brackets appears after the CalBounds name, indicating how many features are in that layer. If each feature in the CalBounds layer represents a county in California, then this is the number of counties in the dataset. Repeat step 5 for the CalAirports and CalTrails layers and examine their information. Question 1 How many airports are represented as points in the CalAirports dataset? (Remember that each feature is a separate record.) How many trails are represented as lines in the CalTrails dataset? Symbology of Features in QGIS Notice that the symbology generated for each of the three objects is simple: Points, lines, and polygons have been assigned a random color. You can significantly alter the appearance of the objects in the Map View to customize your maps. Right-click on the CalBounds layer and select Properties. In the Properties dialog box, select the Symbology tab. Here, you can alter the appearance of the counties by changing their style and color, as well as elements of how the polygons are outlined. Select one of the suggested appearances for your polygons. To select a different color for your layer, first click on Fill, then click on the solid color bar beside the Color pull-down menu and choose the color you want. Clicking once on Simple fill brings up more options for adjusting the color, border color, border thickness, and so on. These options allow you to alter the CalBounds layer to make it more appealing. After you’ve selected all the options you want, click Apply to make the changes and then click OK to close the dialog box. Repeat step 4 to change the appearances for the CalAirports and CalTrails feature classes. Note that you can also change the points for the airports into shapes such as stars, triangles, or a default airport symbol, and you can change the size and color of the airports. Several different styles are available for the lines of the trails as well. Obtaining Projection Information Chapter 2 discusses the importance of projections, coordinate systems, and datums. All of the information associated with these concepts can be accessed using QGIS. Right-click on the CalBounds feature class and select Properties. Click on the Information tab in the dialog box. Information about the layer (also known as the metadata, or “data about the data”) is shown. The top of the dialog box gives a variety of information, including some shorthand about the coordinate system being used. Carefully read through the abbreviated information to determine what units of measurement are being used for the CalBounds layer. Click OK to exit the CalBounds Properties dialog box. Question 2 What units of measurement (feet, meters, degrees, etc.) are being used in the CalTrails dataset? Question 3 What units of measurement (feet, meters, degrees, etc.) are being used in the CalAirports dataset? QGIS allows you to specify the projection information used by the entire project. Because your layers are in the U.S. National Atlas Equal Area projection, you need to tell QGIS to use this projection information while you’re using that data in this lab. To do this, select the Project pull-down menu and then choose Properties. In the Project Properties dialog box, select the CRS tab (which stands for Coordinate Reference System). In the filter box, type US National Atlas to limit your search to find the U.S. National Atlas Equal Area projection, which is just one of the hundreds of supported projections. In the bottom panel, select the US National Atlas Equal Area option.
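The same coordinate reference system information can also be inspected outside of QGIS, which is a handy cross-check when you are reading abbreviated metadata like this. The optional sketch below uses the pyproj package; EPSG:2163 is, to the best of my knowledge, the code associated with US National Atlas Equal Area, but treat that code as an assumption and confirm it against what QGIS shows in the CRS tab.

# Optional cross-check with the pyproj package (pip install pyproj).
# Assumption: EPSG:2163 corresponds to US National Atlas Equal Area; verify the
# code against the CRS tab in QGIS before relying on it.
from pyproj import CRS

crs = CRS.from_epsg(2163)
print(crs.name)                                      # human-readable projection name
print(crs.coordinate_operation.method_name)          # the projection method/"family"
print([axis.unit_name for axis in crs.axis_info])    # units of measurement for each axis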
Question 4 To which “family” of projections does the U.S. National Atlas Equal Area projection belong? Hint: QGIS shows a hierarchical system of projections that can help you answer this question. Click Apply to make the changes and then click OK to close the dialog box. Examining Attribute Tables Each of your three layers is represented by points (airports), lines (trails), and polygons (the counties), but each layer also has a corresponding set of attributes that accompanies the spatial features. As noted in the chapter, these non-spatial attributes are stored in an attribute table for each layer. To get started working with this non-spatial data, in the Layers panel, right-click on the CalBounds layer and choose Open Attribute Table. A new window opens, showing the CalBounds attribute table. Each of the rows of the table is a record representing an individual feature (in this case, a single polygon of a California county), and the columns of the table are the fields representing the attributes for each of those records. By perusing the attribute table, you can find information about a particular record. However, with 58 records, it could take some time to find specific information. One way to organize the table is to sort it: To do this, scroll across the CalBounds attribute table until you find the field called County_Name, click the County_Name table header, and the table sorts itself into alphabetical order by the name of the county. Click on the County_Name table header a second time, and the table sorts itself into reverse alphabetical order. Click it a third time to return the sorting of the table to alphabetical order. With the table sorted, scroll through it to find the record with the County_Name attribute San Diego. Each county has a FIPS code assigned to it; this is a numerical government designation used to uniquely identify things such as states, counties, and townships. The FIPS code information for each California county is located in this attribute table. Question 5 What is the county FIPS code for San Diego County? Joining a Table to an Attribute Table While there are a lot of records in the CalBounds attribute table, QGIS allows you to add further attributes to them. This can be done by taking another table that contains additional non-spatial information and joining it to the CalBounds attribute table. To begin, the first thing you need to do is add to QGIS a table of non-spatial data with which to work. From the Layer pull-down menu, choose Add Layer and then choose Add Delimited Text Layer. In the Data Source Manager dialog box, click the Browse button next to File Name, navigate to the C:\Chapter5QGIS folder and choose the CAPop.csv file, and click Open. Under File Format, select the radio button for CSV (comma separated values). Expand the options for Geometry definition and select the radio button for No geometry (attribute only table) because this is only a table of values, not something that will be a spatial layer. Leave the other settings as their defaults. At the bottom of the dialog box you can see a preview of what your added table will look like. If everything looks okay, click Add. A new item called CAPop is added to the Layers panel. This is a table that contains a new set of attributes. Click Close in the Data Source Manager dialog box if it is still open. Right-click on CAPop and choose Open Attribute Table.
You see the new non-spatial table open in QGIS; it contains a field with the county’s FIPS code, the name of that county, and the population of that county. Before you can join two tables together, they need to contain the same field; in this case, the CAPop table contains a field called N that contains the name of the county, and the CalBounds layer’s attribute table contains a field called County_Name that contains the same information. Because these two tables (the spatial data from CalBounds and the non-spatial data from CAPop) have a field in common (i.e., they both have a field containing the name of each county), they can be joined together. To perform the join, right-click on the CalBounds layer, select Properties, and select the Joins tab. At the bottom of the Joins dialog box, click the green plus button to indicate that you want to join a table to the layer with which you’re working. In the Add Vector Join dialog, for Join Layer, select CAPop (which is the table containing the information you want to join to the spatial layer). For Join Field, select N (which is the name of the county name field in the CAPop table). For Target Field, select County_Name (which is the name of the county name field in the CalBounds layer). Put a checkmark in the Joined Fields box and put checkmarks next to the CAFIPS, N, and Pop2010 options to join the data from the Pop2010 field from CAPop to CalBounds. Put a checkmark in the Custom Field Name Prefix box so that any of the joined fields will start with the prefix CAPop_, which will help you keep them separate. Leave the other options alone and click OK to perform the join. Then click OK in the Layer Properties dialog box to close it. Open the CalBounds layer’s attribute table and scroll across to the far right. You now see the population values from the CAPop table joined for each record in the attribute table (i.e., each county now has a value for population joined to it). Now that CAPop_Pop2010 is a field in the attribute table, you can sort it from highest to lowest populations (or vice versa) by clicking on the field header, just as you can do to sort any other field (in the same way you sorted the counties alphabetically by name). Keeping this ability in mind, answer Questions 6–9. Question 6\nWhat is the 2010 population of Los Angeles County? Question 7\nWhat is the 2010 population of the California county with FIPS code 065? Question 8\nWhat county in California has the lowest 2010 population, and what is that population value? Question 9\nWhat county in California has the third-highest 2010 population, and what is that population value? Navigating the Map View You’ll now just focus on one area of California: San Diego County. In the CalBounds attribute table, scroll through until you find San Diego County and then click on the header of the record on the far-left of the attribute table. Back in the Map View, you see San Diego County now highlighted in yellow. QGIS provides a number of tools for navigating around the data layers in the view. Click on the icon that shows a magnifying glass with three arrows to zoom the Map View to the full extent of all the layers. It’s good to do this if you’ve zoomed too far in or out in the Map View or need to restart. The other magnifying glass icons allow you to zoom in (the plus icon) or out (the minus icon). You can zoom by clicking in the window or by clicking and dragging a box around the area you want to zoom in on.
The hand icon, which is the Pan tool, allows you to "grab" the Map View by clicking on it and dragging the map around the screen for navigation. Use the Zoom and Pan tools to center the Map View on San Diego County so that you're able to see the entirety of the county and its airports and trails. Return to the CalBounds attribute table and hold down the Ctrl key on the keyboard while clicking the header record for the county again to remove the blue highlighting. Then close the attribute table. Interactively Obtaining Information Even with the Map View centered on San Diego County, there are an awful lot of point symbols there, representing the airports. You are going to want to identify and use only a couple of them. QGIS has a searching tool that allows you to locate specific objects in a dataset. Right-click on the CalAirports layer and select Open Attribute Table. The attribute table for the layer appears as a new window. You need to find out which of these 1163 records represents San Diego International Airport. To begin searching the attribute table, click on the button at the bottom that says Show All Features and choose the option Field Filter. A list of all of the attributes appears; choose the one called Name. You can now search through the attribute table on the basis of the Name field. At the bottom of the dialog box, type San Diego International Airport and press the Enter key on the keyboard. The record whose Name attribute is "San Diego International" appears as the only record in the attribute table. Click on the record header next to the Name field (the tab should be labeled 269), and you see the record highlighted in dark blue, just like when you selected San Diego County previously. This indicates that you've selected the record. Drag the dialog box out of the way so that you can see both it and the Map View at the same time. You can see that the point symbol for San Diego International Airport has changed to a yellow color. This means it has been selected by QGIS, and any actions performed on the CalAirports layer will affect only the selected items, not all the items in CalAirports. However, all that you've done so far is locate and select an object. To obtain information about it, you can use the Identify Features tool (the icon on the toolbar with a white cursor pointing to the letter i in a blue circle). Identify Features works only with the layer that is selected in the Map Legend, so make sure CalAirports (i.e., the layer in which you want to identify items) is highlighted in the Map Legend before choosing the tool. Select the Identify Features tool and then click on the point representing San Diego International Airport (zooming in as necessary). A new panel called Identify Results appears, listing all the attributes for the CalAirports layer. If you click on the View Feature Form button at the top of the panel, you can see all the field attribute information that goes along with the record/point representing San Diego International Airport. Question 10\nWhat is the three-letter FAA airport classification code for San Diego International Airport? Close the Identify Results panel after you've answered the question. Labeling Features Rather than dealing with several points on a map and trying to remember which airport is which, it's easier to simply label each airport so that its name appears in the Map View. QGIS gives you the ability to do this by creating a label for each record and allowing you to choose the field with which to create the label. 
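If you prefer to script it, the labeling described here can be set up in a few lines of PyQGIS. The sketch below is a minimal, hedged equivalent of the GUI steps that follow; it assumes the CalAirports layer is loaded under that name, the Name field comes from the lab data, and the font size is an arbitrary choice.
from qgis.core import (QgsProject, QgsPalLayerSettings, QgsTextFormat,
                       QgsVectorLayerSimpleLabeling)
airports = QgsProject.instance().mapLayersByName("CalAirports")[0]
settings = QgsPalLayerSettings()
settings.fieldName = "Name"          # label each point with its Name attribute
text_format = QgsTextFormat()
text_format.setSize(8)               # small font; adjust to taste
settings.setFormat(text_format)
airports.setLabeling(QgsVectorLayerSimpleLabeling(settings))
airports.setLabelsEnabled(True)
airports.triggerRepaint()
The GUI steps for the same task follow.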
To start, right-click on the CalAirports layer in the Map Legend and select Properties.\n Click on the Labels tab.\n From the pull-down menu at the top of the dialog box, choose Single labels.\n From the pull-down menu next to Label with, choose Name. For now, accept all the defaults and click Apply and then click OK. Back in the Map View, you see that labels of the airport names have been created and placed near the appropriate points.\n Change any of the options (font size, color, label placement, and even the angle) to set up the labels so that you can easily examine the map data.\n Measurements on a Map With all the airports now labeled, it's easier to keep track of them. Your next task is to make a series of measurements between points to find the Euclidean (straight-line) distance between airports around San Diego. Zoom in tightly so that your Map View contains the airports San Diego International and North Island Naval Air Station. Select the Measure Line tool from the toolbar (the one that resembles a gray ruler with a line positioned over the top). You see that the cursor has turned into a crosshairs. Place the crosshairs on the point representing San Diego International Airport and left-click the mouse. Drag the crosshairs south and west to the point representing North Island Naval Air Station and click the mouse there. The distance measurement appears in the Measure box. If you click on the Info text in the dialog box, you see that you are measuring "ellipsoidal" distance, that is, a "surface" distance measured along the curved surface of the Earth ellipsoid rather than on a flat plane. Note that each line of measurement you make with the mouse counts as an individual "segment," and the value listed in the box for "Total" is the sum of all segments. Question 11\nWhat is the ellipsoidal distance from San Diego International Airport to North Island Naval Air Station? Clear all the lines of measurement by clicking on New in the Measure box. Zoom out a bit so that you can see some of the other airports in the region, particularly Montgomery Field and Brown Field Municipal Airport. Question 12\nWhat is the total ellipsoidal distance in miles from San Diego International Airport to Montgomery Field, then from Montgomery Field to Brown Field Municipal Airport, and then finally from Brown Field Municipal Airport back to San Diego International Airport? Saving Your Work (and Working on It Later) When you're using QGIS, you can save your work at any time and then return to it later. When work is saved in QGIS, a QGZ file is written to disk. Later, you can reopen this file to pick up your work where you left off. Saving to a QGZ file is done by selecting the Project pull-down menu and then selecting Save. Files can be reopened by choosing Open from the Project pull-down menu. Exit QGIS by selecting Exit QGIS from the Project pull-down menu. Submission All you have to turn in on blackboard for this week is the final image you created above.\n"},{"id":19,"href":"/classes/geog111/labs/lab06/","title":"Lab 06 - spatial analysis","parent":"Labs","content":" This lab is a gratefully modified version of lab 6 from Bradley A. 
Shellito\u0026rsquo;s Introduction to Geospatial Technologies p442\n Learning Objective The goals for you to aim for in this lab:\n Build simple and compound database queries Extract the results of a query into separate GIS layers Count the number of features within the boundary of another feature Overlay two features for spatial analysis Create and use buffers around objects for spatial analysis Outline: Learning Objective Submission requirements Tutorial import and join data (refresher) Attribute queries Exporting Overlay Buffer Measuring Submission Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rTutorial ArcPRO QGIS import and join data (refresher) Attribute queries Exporting Overlay Buffer Measuring Submission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":20,"href":"/classes/geog111/labs/lab07/","title":"Lab 07 - Digital Terrain Analysis","parent":"Labs","content":" This lab is a gratefully modified version of lab 13 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies 934\n Learning Objective This chapter’s lab introduces some of the basics of digital terrain modeling: working with DTMs, slope, viewsheds, and imagery draped over the terrain model. You’ll be using the free Google Earth Pro for this lab. 
The goals for you to take away from this lab:\n Examine pseudo-3D terrain and navigate across it in Google Earth Pro Examine the effects of different levels of vertical exaggeration on the terrain Create a viewshed and analyze the result Create an animation that simulates flight over 3D terrain in Google Earth Pro Create an elevation profile for use in examining a DTM and slope information Outline: Learning Objective Submission requirements Tutorial Examining Landscapes and Terrain with Google Earth Pro Vertical Exaggeration and Measuring Elevation Height Values Working with Viewsheds in Google Earth Pro Flying and Recording Animations in Google Earth Measuring Profiles and Slopes in Google Earth Pro Wrapping up Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab08Questions.docx\rHandout to turn in\r\r\rTutorial Examining Landscapes and Terrain with Google Earth Pro Start Google Earth Pro (GEP). By default, GEP’s Terrain option is turned on for you. Look in the Layers box to ensure that there is a checkmark next to Terrain. This option drapes the imagery over a DTM (Digital Terrain Model). There are several sources for this terrain, so let\u0026rsquo;s make sure we are using the highest quality one. Because this lab is focused on digital terrain modeling, you should use the most detailed DTM possible, so from the Tools pull-down menu, select Options. In the Google Earth Options dialog box that appears, make sure the checkmark Use high quality terrain (disable for quicker resolution and faster rendering) is on. While in the options, click on the Navigation tab and set the Automatically tilt when zooming radio button. Click Apply and then OK. In the Search box, type \u0026ldquo;Zion National Park, UT\u0026rdquo; and GEP zooms down to the area of Zion National Park. In the Layers box, expand the More option and place a checkmark next to Parks/Recreation Areas. You should then see Zion outlined in green. Zoom out so that you can see the entire layout of Zion in the view. You see a question mark symbol labeled Visitor Center next to a label Park Headquarters. Center the view on this area of Zion and scroll in so your view tilts into a perspective from which you can see the sides of the mountains and canyons in Zion in pseudo-3D. There are two references to heights or elevation values in the bottom portion of the view:\n The heading marked “elev” shows the height of the terrain model (that is, the height above the vertical datum) where the cursor is placed. The heading marked “Eye alt” shows GEP’s measurement for how high above the terrain (or the vertical datum) your vantage point is. Use a combination of your mouse wheel and the Ctrl key to maneuver yourself into a southward facing direction at ~4200 ft, positioned just above the visitor center as shown below. Use this view to answer question 1. 
Question 1\nHow does the pseudo-3D view from this position and altitude aid in bringing out the terrain features of Zion (compared to what you originally saw in the overhead view)? Vertical Exaggeration and Measuring Elevation Height Values GEP allows you to alter the vertical exaggeration of the terrain layer. Vertical exaggeration changes the vertical scale but keeps the horizontal scale the same. This effect makes it easier to identify topographic variations, but distorts reality (more so than is already distorted through the computer screen) Therefore, it should be used for visualization purposes only. In GEP the settings to change this are located in the To look at different levels of vertical exaggeration by clicking Tools \u0026gt; Options.\n In the box marked Elevation Exaggeration, you can type a value (between 0.01 and 3) to vertically exaggerate GEP’s terrain. Play around, and then type a value of \u0026ldquo;2\u0026rdquo;, click Apply and OK, and then reexamine the area surrounding the park headquarters in Zion. Use this process to answer questions 2 and 3. Question 2\nHow did the vertical exaggeration value of 2 affect the view of Zion? Question 3\nTry the following values for vertical exaggeration: 0.5, 1, and 3. How did each value affect the visualization of the terrain? In addition to the value of 2 you examined in Question 13.2, which value of vertical exaggeration was the most useful for a visual representation of Zion and why? Reset the Elevation Exaggeration to a value of \u0026ldquo;1\u0026rdquo; when you’re done. From here, you’ll examine the elevation values of the terrain surface. Wherever you move the cursor on the screen, a new value for elevation is shown in the elevation at the bottom of the view. By zooming and placing the cursor on its symbol on a spot on the screen, you can determine the elevation value for that location. Question 4\nAt what elevation is the height of the park headquarters/visitor center? Working with Viewsheds in Google Earth Pro Now that you have a pretty good idea of how the landscape of Zion looks in the areas near the park headquarters, you can create a viewshed that will allow you to see what is visible and what is blocked from sight at a certain location. To begin, you’ll see what areas of Zion can be seen and cannot be seen from the park headquarters. Zoom in closely on the question mark symbol that GEP uses to mark the park headquarters. This symbol is what you’ll use as the location for creating a viewshed.\n Click the Add Placemark icon on the toolbar and put the placemark right on the park headquarters question mark symbol (Review Lab 1 if you need a quick refresher). Name this new placemark \u0026ldquo;Park HQ\u0026rdquo;. In the Places box, right-click the new Park HQ placemark and select Show Viewshed. If prompted about the placemark being too low, click on Adjust automatically. GEP will compute the viewshed. Zoom out a bit until you can see the extent of the viewshed; all of the areas covered in green are the places that can be seen from the location of the Park HQ placemark and the areas not in green cannot be seen from there. After zooming out, look at the areas immediately south and southeast of the Park HQ placemark to help you answer question 5. Question 5\nCan the ranger station, the campground, or the two picnic areas just south and southeast of the park headquarters be seen from the Park HQ vantage point? Click on Exit viewshed (in the upper left corner of the map) to remove the viewshed layer. 
About 2 miles to the northeast of the park headquarters is a scenic overlook. Its symbol in GEP is a small green arrow pointing to a green star. Move over to that location so that you can see it in the view and zoom in closely on the overlook symbol. Put a new placemark directly on the green star and name this placemark \u0026ldquo;Zion Overlook\u0026rdquo;. Create a viewshed for the Zion Overlook point and answer Questions 6 and 7. Exit the viewshed when you’re done. Question 6\nCan the area labeled as parking to the immediate north of your Zion Overlook point be seen from the overlook? Question 7\nWhat is blocking your view of the Sand Beach trail from the overlook? Be specific in your answer. Hint: You may want to zoom in closely to your Zion Overlook placemark and position the view as if you were standing at that spot and looking in the direction of the Sand Beach trail. Flying and Recording Animations in Google Earth GEP allows you to record high-definition videos of the areas you view. In this section, you’ll be capturing a video of the high-resolution imagery draped over the terrain to record a virtual tour of a section of Zion. Before recording, use the mouse and the Move tools to practice flying around the Zion area. You can fly over the terrain, dip in and out of valleys, and skim over the mountaintops. Don’t forget that you can also hold down the Ctrl key and move the mouse to tilt your view. It’s important to have a good feel of the controls, as any movements within the view will be recorded to the video. Therefore, the first few steps take you through a dry run. When you feel confident in your ability to fly over 3D terrain in Google Earth Pro, move on to the next step.\n The tour you will be recording will start at the park headquarters (your Park HQ placemark), move to the scenic overlook (your Zion Overlook placemark), and then finish at the lodging/parking area about a mile to the north of the overlook. Do a short dry run of this before you record. To begin, double-click on the Park HQ placemark in the Places box, and the view shifts there. Double-click on the Zion Overlook placemark in the Places box, and the view jumps to there. Use the mouse and Move tools to fly manually over the terrain a mile north of the overlook to the parking and lodging area and end the tour there. If you needed, prior to recording repeat this dry run maneuvering among the three points until you feel comfortable. When you’re ready to make your tour video, double-click on the Park HQ placemark in the Places box to return to the starting point of the tour. Also, take away the checkmarks next to the Park HQ and Zion Overlook placemarks in the Places box so that the two placemarks will not appear in the view (and thus not appear in the video) and so all you will see is the GEP imagery and terrain. On Google Earth’s toolbar, select the Record a Tour button. A new set of controls appears at the bottom of the view: To start recording the video, click the circular red record button. If you have a microphone hooked up to your computer, you can get really creative and narrate your tour; your narration or sounds will be saved along with your video. After showing the park headquarters in the view for a couple seconds, double-click on the Zion Overlook placemark in the Places box to jump to there. Show the overlook for a few seconds and then use the mouse and move commands to fly to the lodging/parking area to the north. When you’re done, click the circular red record button again to stop recording. 
A new set of controls appears at the bottom of the screen, and Google Earth Pro begins to play your video. Use the rewind and fast-forward buttons to skip around in the video, and also use the play/pause button to start or stop. You can click the button with the two arrows to repeat the tour or put it on a loop to keep playing. Once you\u0026rsquo;ve created a tour you are happy with, save it by clicking the Save Tour button. Call it \u0026ldquo;Ziontour\u0026rdquo; in the dialog box that opens. The saved tour is then added to your Places box (just like all the other GEP layers). Right now, the tour can only be played in GEP. You’ll want to export your tour to a high-definition video that can be watched by others or shared on the Web. To start this process, first close the DVR control box in the view by clicking on the x in the upper-right corner of the controls. Next, from the Tools pull-down menu, select Movie Maker. In the Movie Maker dialog box, choose 1280 × 720 (HD) for Video parameters to create a high-definition (HD) video of your tour. Next, under Record from choose the radio button for A saved tour and choose Ziontour (My Places). This will be the source of your video. From the pull-down menu next to File type, choose MJPEG (.mp4). Under Save to, use the browse button to navigate to the drive to which you want to save your video and call it \u0026ldquo;Ziontourmovie\u0026rdquo;. Leave the other defaults alone and click Create Movie. A dialog box appears, showing the conversion process involved in creating your video. Return to the folder where you saved your movie and view it by using your computer’s media player.\nQuestion 8\nSubmit the final video file of your video tour of Zion to your instructor, who will check it over for its quality and completeness for you to get credit for this question. Measuring Profiles and Slopes in Google Earth Pro Now let\u0026rsquo;s examine some profiles of the digital terrain model and find the slope information. To begin, return the view to the Park HQ placemark and zoom out so that you can see both the Park HQ and the Zion Overlook placemarks clearly in the view. To examine the terrain profile, you must first draw a path between the two points. Click the ruler tool on the GEP toolbar, and in the dialog box that opens, select the Path tab. In the view, you\u0026rsquo;ll see that your cursor has changed to a crosshairs symbol. click once on the Park HQ placemark location and then click once on the Zion Overlook placemark position. You should see a yellow line drawn between them, and the length of this line is shown in the Ruler dialog box. In the Ruler dialog box, click Save. A new dialog box opens, allowing you to name this path that you’ve drawn. Call it \u0026ldquo;ParkHQOverlookPath\u0026rdquo; and click OK in the naming dialog box. You see a new item called ParkHQOverlookPath added to the Places box. Right-click on it and select Show Elevation Profile. A new window opens below the view, showing the elevation profile between the two points. This profile shows you, in two dimensions, the outline and elevations of the terrain in between the two points. As you move your cursor along the outline of the terrain in the profile, information appears about the elevation and the slope at the location of your cursor. You also see the corresponding location highlighted in the view. The slope information is a positive percentage as you move uphill from the park headquarters to the overlook and a negative percentage as you move downhill. 
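If the slope percentages shown in the profile seem abstract, remember that a slope percent is simply rise over run times 100, with both measured in the same units. The tiny sketch below uses made-up numbers purely for illustration; the function name and values are hypothetical and are not the answers to Questions 9-11.
# Slope percent = rise / run * 100, with rise and run in the same units.
def slope_percent(start_elev_ft, end_elev_ft, horizontal_miles):
    rise_ft = end_elev_ft - start_elev_ft
    run_ft = horizontal_miles * 5280      # convert miles to feet
    return rise_ft / run_ft * 100
# Illustrative only: climbing 500 ft over half a mile is roughly a 19% grade.
print(round(slope_percent(4000, 4500, 0.5), 1))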
Carefully examine the profile (and the digital terrain) and answer Questions 9 - 11. Question 9\nWhat is the average slope for the 2.2-mile path between the park headquarters and the overlook (both uphill and downhill)? Question 10\nWhat is the steepest uphill slope between the two points, and where is it located? Question 11\nWhat is the steepest downhill slope between the two points, and where is this located? Select a small section of the path and examine only the profile of that small subset. To do so, click the mouse at a place in the profile where you want to begin. Hold down the left mouse button and move the mouse to the place in the profile where you want to end. Release the mouse button, and you see that small section of the profile you chose in a darker red. Statistics for the elevation and slope of that subset are shown in a red box at the top of the profile. To examine one small part of the profile, on the far right side of the profile (the terrain nearest the overlook), you can see that a section of the landscape drops off sharply. (This should be around the 1.6-mile mark between the Park HQ and the overlook.) Using the method described above, highlight the subset of the profile from the location at the bottom of the drop-off to the overlook and then answer Question 12\nFor this section of the terrain, what is the maximum uphill and downhill slope? What is the average uphill and downhill slope? Wrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\nIn lab 9 (~two weeks from now) we will use Google Earth Engine. This platform is still a beta product, and you will need to request access through your Google account here. These are still approved by hand and may take a few days to get. Once approved, you will need to follow the instructions in the email that is sent before you are able to access the platform. Therefore, be sure to do this step before you attempt to work though lab 9. "},{"id":21,"href":"/classes/geog111/labs/lab08/","title":"Lab 08 - Map making","parent":"Labs","content":" This lab is a gratefully modified version of lab 7 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies 529\n Learning Objective This lab introduces you to the concept of using GIS data to create a print quality map. This map should contain the following:\n The population per square kilometer for all counties in California, set up in an appropriate color scheme The data displayed in a projection with units of measurement other than decimal degrees (The default units used by the lab data are meters; make sure the scale bar reflects this information.) An appropriate legend (Make sure your legend items have regular names and that the legend is not called “legend.”) An appropriate title (Make sure your map title doesn’t include the word “map” in it.) A north arrow A scale bar Text information: your name, the date, and the sources of the data Appropriate borders, colors, and design layout (Your map should be well designed and should not look as if the map elements were thrown together randomly.) 
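For reference, the first requirement above (population per square kilometer) can be derived from a population count and each county's area. The Chapter 7 data already includes a ready-made PopDens field, which the tutorial below uses, so the following PyQGIS sketch is only meant to show how such a field could be reproduced; the field names Pop2010 and PopDensCalc are assumptions, and the layer is assumed to use meter-based units.
from qgis.core import QgsProject, QgsField, edit
from qgis.PyQt.QtCore import QVariant
counties = QgsProject.instance().mapLayersByName("CalBounds")[0]
# Add a density field (people per square kilometer) and populate it.
with edit(counties):
    counties.addAttribute(QgsField("PopDensCalc", QVariant.Double))
    counties.updateFields()
    for feat in counties.getFeatures():
        area_km2 = feat.geometry().area() / 1.0e6   # m^2 to km^2
        feat["PopDensCalc"] = feat["Pop2010"] / area_km2
        counties.updateFeature(feat)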
The goals for you to take away from this lab: Familiarize yourself with the Map Layout functions of QGIS Arrange and print professional-quality maps from geographic data using the various layout elements Outline: Learning Objective Submission requirements Tutorial Initial Pre-Mapping Tasks in QGIS Setting Graduated Symbology in QGIS The Map Layout in QGIS Choosing Landscape or Portrait Mode in QGIS Map Elements in QGIS Placing a Map into the Layout in QGIS Placing a Scale Bar into the Layout in QGIS Placing a North Arrow into the Layout in QGIS Placing a Legend into the Layout in QGIS Adding Text to the Layout in QGIS Other Map Elements in QGIS Printing the Layout in QGIS Submission Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rTutorial Initial Pre-Mapping Tasks in QGIS If needed, refer to Geospatial Lab Application 5.1: GIS Introduction: QGIS Version for specifics on how to do the following.\n Start QGIS. Add the CalBounds feature class from the CalMap geodatabase in the C:\\Chapter7QGIS folder to the Layers panel. Leave the CalBounds layer’s symbology alone for now; you’ll change it in the next step. Pan and zoom the Map View so that all of California fills up the Map View. Important note: The data used in this Geospatial Lab Application has already been projected to the U.S. National Atlas Equal Area projection for you to use. However, there are many more projections to choose from if you desire; you can change the projection of a data layer by right-clicking on it, selecting Export then Save Features As, and then choosing a projected coordinate system in which to change the layer.\n Set the properties of the project you’re working with so that QGIS will be able to render some of the map elements you’ll be working with (such as the scale bar) properly. From the Project pull-down menu, choose Properties. Select the CRS tab. In the box next to Filter, type US National Atlas Equal Area to search through all available projections and find the one you need. Under Coordinate reference systems of the world, click on the US National Atlas Equal Area option and then click Apply. You have now set the CRS of the project environment to the chosen projection. Click OK to close the dialog box. Setting Graduated Symbology in QGIS QGIS enables you to change a feature’s symbology from a single symbol to multiple symbols or colors and allows for a variety of different data classification methods. Right-click on CalBounds in the Map Legend and select Properties. Click on the Symbology tab. To display the states as graduated symbols, use the following settings (and leave all the other settings alone): From the pull-down menu to the top of the menu, select Graduated. For Column, select PopDens (each county’s population per square kilometer from the year 2010). 
Use 5 for the number of classes. Use Quantile (Equal Count) for the mode. For the color ramp options, use the pull-down menu to select an appropriate choice. When you have things arranged as you want them, click Apply to make the changes. Take a look at the map and make any further color changes you think are needed. Click OK to close the dialog box. The symbology of CalBounds has changed in the Map View, and the values that make up each of the breaks can be seen in the Layers panel. If the breaks and classes are not already displayed, you can show them by clicking the black arrow button to the left of the CalBounds layer in the Layers panel. The Map Layout in QGIS To begin laying out the print-quality version of the map, you need to begin working in the Map Layout (also known as the Map Composer). This mode of QGIS works like a blank canvas, allowing you to construct a map using various elements.\n To begin, from the Project pull-down menu choose New Print Layout. Before the Map Layout opens, you are prompted to give your layout a title. Type in a descriptive title, such as California Population Density 2010 or something similar. Click OK after you’ve entered the title. A new window, the Map Layout, opens. In the Map Layout, the screen represents the printed page of an 8½ × 11 piece of paper, so be careful when you’re working near the edges of the page and keep all elements of the map within that border. The layout has several toolbars horizontally across its top, with a new set of tools; locate and examine these navigation tools. (Note: Any of the toolbars can be turned on and off by choosing the View pull-down menu, choosing Toolbars, and placing a checkmark in the appropriate box. This section refers to the Layout, Navigation, and Actions toolbars.) Starting at the left and moving right, the tools and their uses are as follows: The blue disk is used to save a layout. The white paper creates a new layout. The white paper over the gray paper is used to create a duplicate layout. The white paper with the wrench opens the Layout Manager. The yellow folder allows you to add a template to the layout. The blue disk with the green bar allows you to save a template. The printer icon is used when you’re printing (see later in the lab).h. The next three icons allow you to export your layout to either (1) an image, (2) SVG format, or (3) PDF format. The two curved arrows allow you to either undo to the last change you made or redo the last change you made (which is useful when you need to back up a step in your map design). The second row of tools is as follows: The plus and minus magnifying glasses are used to zoom in and out of the layout. The magnifying glass with the 1:1 text zooms the map to 100%. The magnifying glass with three arrows zooms to the full extent. The twin curved blue lines icon are used to refresh the view. The lock and unlock icons allow you to fix items in place in the layout (lock) or remove this fix so they can be moved (unlock). The square and circle icons allow you to gather up several items and treat them as a single group (group items) or to turn a group of items back into individual items (ungroup). The last four icons with the blue and yellow boxes are used for raising, aligning, distributing, or resizing map elements to better fit into the layout. The vertical toolbar down the left-hand side of the screen is the Toolbox toolbar, and it provides additional tools: 1. 
The hand is used for panning and adjusting the content of a particular window—for instance, for moving the position of what’s showing on the map without changing the map itself.\r2. The magnifying glass is used to zoom in on elements on the map.\r3. The cursor pointing to the box is used to move map elements to different places in the layout.\r4. The icon with the blue arrows on a piece of paper is used to move the content of items.\r5. The icon with the hammer and the three blue points joined together is used for editing nodes.\r6. The other tools are used to add map elements to the layout, and you’ll use them later in the lab. You can use these icons to add a new map, an image, a new label, a new legend, and a new scale bar, as well as shapes, nodes, and arrow shapes for annotating the map, the attribute table of a layer, and an HTML frame for displaying Web content.\r Choosing Landscape or Portrait Mode in QGIS Before going any further with your map, you have to choose whether the map should be oriented in portrait or landscape mode. Portrait mode aligns the layout vertically (so that it’s longer than it is wide), while landscape mode aligns the layout horizontally (so that it’s wider than it is long). If you were mapping the United States, you’d likely choose landscape mode to place the entire country so it filled the page. However, when mapping California, you’d likely use portrait mode as it better fits the dimension of the state. To select whether you want to use portrait or landscape, right-click on the layout itself (the big blank white area) and from the pop-up menu that appears, choose Page properties. A new set of tabs appear on the right-hand side of the screen. Click on the Item Properties tab. Under Page Size, next to Orientation, choose Portrait. You see the layout shift to the vertically oriented Portrait mode. By selecting Landscape or Portrait from this option, you can change the orientation of the printed map. Map Elements in QGIS Again, think of the layout (or composition) as a blank sheet of paper that you can use to construct a map. Numerous map elements can be added, including the map canvas, scale bars, north arrows, and a legend. In the Items tab on the right-hand side of the layout, you can see a list of the elements you have added to the layout. When you choose an element in this Items tab, it is also chosen on the layout. You can also turn on an element by placing a checkmark in the box next to its name in the Items tab, and you can turn off an element by removing the checkmark next to it. Each element has properties (such as borders, fill colors, and size and position) that can be accessed by choosing the Item Properties tab on the right-hand side of the layout. You can move and resize elements by selecting them with the mouse and resizing as you would any other object. You can delete map elements by selecting them and pressing the Delete key on the keyboard. Placing a Map into the Layout in QGIS The first element to add to the layout is the default map—in this case, an element that shows all the visible layers in the Map Legend. Click on the Add New Map icon in the vertical Toolbox toolbar. You can then draw a box on the layout to display the current Map View. Draw a rectangular box on the layout to create the canvas. You can click and drag the canvas around the map or resize it as you see fit (using the four blue tabs at its corners), and you can treat the entire canvas as if it’s a single map element. 
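As an aside, the layout you are building interactively can also be created from the QGIS Python console. The sketch below is a minimal, hedged equivalent of the steps so far (a new layout, a US Letter portrait page, and one map item showing the current canvas extent); the layout name, positions, and sizes are placeholders, and iface is only predefined when you run this inside QGIS.
from qgis.core import (QgsProject, QgsPrintLayout, QgsLayoutItemMap,
                       QgsLayoutPoint, QgsLayoutSize, QgsUnitTypes)
project = QgsProject.instance()
layout = QgsPrintLayout(project)
layout.initializeDefaults()            # adds a single default page
layout.setName("California Population Density 2010 (scripted)")
project.layoutManager().addLayout(layout)
# US Letter in portrait orientation (the GUI page is 8.5 x 11 inches).
page = layout.pageCollection().page(0)
page.setPageSize(QgsLayoutSize(215.9, 279.4, QgsUnitTypes.LayoutMillimeters))
# A map item showing whatever is currently in the Map View.
map_item = QgsLayoutItemMap(layout)
map_item.setRect(20, 20, 20, 20)       # temporary rectangle; resized below
map_item.setExtent(iface.mapCanvas().extent())
layout.addLayoutItem(map_item)
map_item.attemptMove(QgsLayoutPoint(15, 25, QgsUnitTypes.LayoutMillimeters))
map_item.attemptResize(QgsLayoutSize(185, 230, QgsUnitTypes.LayoutMillimeters))
The interactive steps continue below and are all the lab asks for.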
Keep in mind that you’ll be adding several more map elements (such as a legend or a title) and adjust the size, spacing, and balance of the elements accordingly. You can’t manipulate individual layers (for instance, you can’t click and drag a particular county somewhere else), but by using the Move Item Content icon on the vertical toolbar, you can drag the contents of the map around and alter what’s displayed inside the canvas. To make further adjustments to the map (such as creating a border or altering the color of the border), select the Item Properties tab on the right side of the screen and scroll down to see what choices are available (such as expanding the Frame option to change the appearance of the map’s border or the Background option to change the fill color of the map). Investigate the other choices under Item Properties (such as the options under Main Properties for adjusting the scale of the layers being displayed on the map) to set up the canvas the way you’d like it to appear in the final printed layout. Placing a Scale Bar into the Layout in QGIS To add a scale bar to the map, select the Add New Scalebar icon on the toolbar and draw a box on the layout where you want the scale bar to be added. When the scale bar appears, click on it so that the four blue boxes appear at the corners. In the Item Properties box on the right-hand side of the screen, you can see several options for altering the scale bar. Use the options to change the following properties of the scale bar until it looks appropriate: The number of segments on the left or the right side of the 0 value on the scale bar The size of each segment The number of map units per bar units (to even out the breakpoints on the scale bar) The physical dimensions of the scale bar (height, line, width, etc.) The font and color used Placing a North Arrow into the Layout in QGIS To add a north arrow to the map, select the Add Picture icon from the toolbar and draw a box on the layout where you want the north arrow to be added and draw a box the size that you want to north arrow symbol to be. An empty box and several different graphics options appear in the Item Properties tab. The Add Picture option allows you to add a graphic of your own to the map or to select from several pre-made graphics, including several different north arrows. Expand the options under Search Directories and scroll partway down, and you see a number of options for north arrows. Select an appropriate north arrow, and it appears in the empty box on the layout. Use the cursor to position or resize the north arrow. Placing a Legend into the Layout in QGIS Each layer you use in QGIS (such as CalBounds) has a set of layer properties. Anything changed in the layer properties is reflected in changes to the layout. For instance, if you change the symbology of a layer in its properties, its appearance is transferred to a legend in the layout. Similarly, whatever name is given to a layer in the Map Legend carries over to a legend in the layout.\n Return to the regular QGIS window. Currently, the CalBounds layer is named “CalBounds.” However, you can change this name to something more descriptive before adding a legend to the layout. In the QGIS Layers panel, right-click on the name you want to change, CalBounds, and select Rename Layer. In the Layers panel, you can type a new name for the layer (for example, Population per Square Km) and press the Enter key. Return to the layout. 
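Continuing the scripted aside from earlier, legends, scale bars, and the final export can also be added programmatically. This sketch assumes a print layout with a map item already exists in the project; the legend title, positions, and output path are placeholders. The GUI steps below accomplish the same thing.
from qgis.core import (QgsProject, QgsLayoutItemMap, QgsLayoutItemLegend,
                       QgsLayoutItemScaleBar, QgsLayoutPoint, QgsUnitTypes,
                       QgsLayoutExporter)
project = QgsProject.instance()
layout = project.layoutManager().printLayouts()[0]   # assumes one layout exists
map_item = next(item for item in layout.items()
                if isinstance(item, QgsLayoutItemMap))
# Legend: it picks up layer names from the Layers panel, so rename layers first.
legend = QgsLayoutItemLegend(layout)
legend.setTitle("Population per Square Km")
legend.setLinkedMap(map_item)
layout.addLayoutItem(legend)
legend.attemptMove(QgsLayoutPoint(145, 200, QgsUnitTypes.LayoutMillimeters))
# Scale bar tied to the same map item.
scalebar = QgsLayoutItemScaleBar(layout)
scalebar.setStyle("Single Box")
scalebar.setLinkedMap(map_item)
scalebar.applyDefaultSize()
layout.addLayoutItem(scalebar)
scalebar.attemptMove(QgsLayoutPoint(15, 260, QgsUnitTypes.LayoutMillimeters))
# Export the finished layout to an image (placeholder path).
exporter = QgsLayoutExporter(layout)
exporter.exportToImage("C:/Chapter7QGIS/ca_pop_density.png",
                       QgsLayoutExporter.ImageExportSettings())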
To add a legend to the map, click the Add New Legend icon on the toolbar and draw a box on the layout where you want the legend to be added. A default legend is added to the map, consisting of all the layers in the Map Legend with whatever names are assigned to them. Use the cursor to move and resize the legend as needed. To make changes to the default legend, click on the legend itself, and you see several options available in the Item Properties tab. You can change the name of the legend (don’t just call it “Legend”), its font, symbol width, and appearance, as well as the spacing between items. Adding Text to the Layout in QGIS To add text to the map (such as a title or any other text you might want to add), select the Add New Label icon from the toolbar and then draw a box on the layout where you want the text to be added of the size you want the text to be. Note that when text is added, a small box filled with the words “Lorem ipsum” (which is standard placeholder text) will be added to the layout. The lettering in the box, as well as its font, size, and color, can be changed in the Item Properties tab. In the Item Properties tab, under the Label heading, type the text you want to appear in the text box on the map. To alter the font, color, and other properties of the text, click the Font and Font Color drop-downs and choose from the available options. Note that you can alter each text box separately, create larger fonts for titles and smaller fonts for type, and so on. You can also resize each text box and move it to a new position. Other Map Elements in QGIS Though they’re not used in this lab, other map elements can be added, including: A shape: You can add the outline of a shape (an ellipse, a rectangle, or a triangle) to the layout for additional graphics or annotation. Use the Add Ellipse icon to do this. An arrow: You can add a graphic of an arrow (not a north arrow) so that you can point out or highlight areas to add further annotation to your layout. Use the Add Arrow icon to do this. An attribute table: You can add a graphic of a layer’s attribute table to the map to enable the presentation of additional information. Use the Add Attribute Table icon to do this. An HTML frame: You can add a frame that will contain a Web page or other Web content. In the Item Properties tab, you can specify the URL for what you want to display inside the frame. Use the Add HTML Frame icon to do this. Printing the Layout in QGIS When you have constructed the map the way you want it, choose the Layout pull-down menu, select Page Setup, and specify whether you want the map printed in landscape or portrait format and, if desired, add margins for printing. When the layout is ready, select the Print icon from the toolbar. In the Print dialog box, click Print, and QGIS prints to your computer’s default printer. If you want to convert the layout to a digital format instead of printing it, click on the Layout pull-down menu and choose one of the available options: Export as Image (to convert the layout to a format such as JPG or PNG), Export as PDF, or Export as SVG Submission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":22,"href":"/classes/geog526/labs/lab08/","title":"Lab 08 - Map making","parent":"Labs","content":"Learning Objective The purpose of Lab06 is to familiarize you with interpreting Landsat Thematic Mapper and SPOT imagery. 
Upon completion of this lab you should be aware of the usefulness of each TM band, and the similarities and differences between TM and SPOT data.\nOutline: Learning Objective Submission requirements Guide Tutorial Submission Submission requirements Materials (click to download)\nData Name: GEOG111_Lab2Questions.docx – Description: Handout to turn in\nGuide Terminology electromagnetic energy – energy that is either reflected or emitted as radiation by objects. X-rays, ultraviolet light, visible light, near-infrared light, and heat (thermal) radiation are different wavelengths of electromagnetic energy electromagnetic spectrum – the continuum of electromagnetic energy from long wavelengths such as radio or radar, to medium wavelengths like visible light, to short wavelengths like X-rays panchromatic – sensitive to all or most of the visible electromagnetic spectrum color composite – putting 2 or 3 MSS bands together in a color image LANDSAT – a series of satellites managed as part of NASA's Earth Observing System that acquire imagery of the Earth from space. The images are used in the areas of global change, agriculture, geology, forestry, regional planning, education, and national security\nElectromagnetic Spectrum The electromagnetic spectrum includes visible light, radio waves, heat, X-rays, and other forms of electromagnetic radiation (EMR). A multispectral sensor samples EMR in various discrete wavelength ranges. A panchromatic sensor samples EMR over a single broad range of wavelengths. Multispectral imagery is useful for analyzing spectral signatures (patterns of reflectance in various wavelengths) of Earth features.\nEM_Spectrum_Properties_edit.png Types of Resolution Spatial Resolution • the size of a pixel, the smallest digital element that is displayed ("pixel" stands for "picture element") • the smaller the area each pixel covers, the higher the spatial resolution • higher spatial resolution means more detail is captured and displayed Spectral Resolution • how narrow a range of wavelengths each sensor band is sensitive to • more bands covering the same portion of the spectrum means a higher spectral resolution (e.g., 5 bands covering 0.3-0.7 µm have a higher spectral resolution than a dataset with 3 bands covering the same 0.3-0.7 µm) Temporal Resolution • how often the satellite passes over the same place on Earth, often called the revisit period. 
However, because some sensors acquire data off nadir, the revisit period may not always equate to a complete orbit cycle • increased observation frequency = higher temporal resolution Radiometric Resolution • the range of digital number values a sensor can record (don't worry about this for now)\nPixels and Digital Numbers • reflected light energy is converted into a digital number (DN) and stored in pixel format • digital number = a positive integer value proportional to reflectance • DNs are arranged in a column-by-row grid (see figure DNRep.png) • each cell is a pixel and has a DN • different satellites have different pixel sizes: MSS 79 m by 56 m; TM 30 m by 30 m (visible, near-infrared, and middle infrared); SPOT 5 10 m by 10 m • of these, SPOT has the highest spatial resolution\nLandsat Thematic Mapper (TM) • high spatial resolution; improved geometric accuracy; greater radiometric detail; more precise spectral resolution\nBand 1 Blue/Green 30 m x 30 m\nBand 2 Green 30 m x 30 m\nBand 3 Red 30 m x 30 m\nBand 4 Near IR 30 m x 30 m\nBand 5 Mid IR 30 m x 30 m\nBand 6 Far IR (Thermal) 120 m x 120 m\nBand 7 Mid IR 30 m x 30 m\n• Bands 1-5 and 7 detect reflected energy • Band 6 detects emitted radiation • Landsat passes over the US in the morning; keep in mind where the sun is at that time\nTM Data Characteristics • the band characteristics were selected to maximize the detection and monitoring of different types of resources • TM band 1 – penetrates water; useful for coastal studies, soil-vegetation differentiation, and forest type distinction • TM band 4 – detects the near-infrared reflectance peak of healthy vegetation; useful for delineating water/land interfaces • TM band 6 – used for thermal mapping\nTM False Color Composite • combination of TM2, TM3, and TM4 assigned to the blue, green, and red color guns, respectively • so that: o vegetation appears as shades of red (the brighter, the healthier) o soils and sparse vegetation range from white (sands) to greens and browns, depending on moisture and organic matter o water bodies appear blue; deep, clear water is dark blue to black, while sediment-laden water is lighter o urban areas appear blue-gray o clouds and snow appear bright white\nSPOT (Le Systeme Pour l'Observation de la Terre) • "Earth Observation System"; a French system • operates in 2 modes: panchromatic and multispectral • SPOT 4-5 multispectral mode: Band 1 Green; Band 2 Red; Band 3 Near IR; Band 4 SWIR (short-wave infrared); spatial resolution 10 m to 20 m • panchromatic mode: sensitive across a single broad spectral range (0.51 - 0.73 μm); spatial resolution 2.5 m to 5 m\nQuick Reference Comparison Chart (MSS / TM / SPOT 5)\nSpatial Resolution: 79 m x 56 m / 30 m x 30 m; 120 m x 120 m (TM6) / 10 m x 10 m; 20 m x 20 m\nNumber of bands: 4 / 7 / 4 + panchromatic\nTemporal Resolution (days): 18/16 / 16 / 26\nCost: Free / Free / Varies; a 10 m full scene is ~$4,500\nStart date: 1972 / 1982 / 2002 (SPOT 1 from 1986)\nRadiometric (DN range): 0 – 127 / 0 – 255 / 0 – 255\n** Radiometric Resolution – the range of values that a DN can be assigned\nSPOT launched SPOT 6 on September 9, 2012. You may find the details at http://www2.astrium-geo.com/files/pmedia/public/r12317_9_spot6-7_technical_sheet.pdf\nBesides MSS (Landsat 1-4) and TM (Landsat 5), the Landsat program has ETM+ (Landsat 7) and the recently launched Landsat 8 (February 11, 2013). WRS: Landsat imagery uses the World Reference System (WRS) to spatially index images. WRS-2 is a grid index of scenes acquired for Landsats 4, 5, & 7. The index consists of 233 paths and 248 rows. 
An individual scene location can be determined using the combination of path and row (path/row).\nTutorial Part 1: Landsat TM\n Fill in the table below (use the tables in your textbook): (7 points)\n TM Bands\t Band\tWave Length Pixel Size/Resolution\n1 ______________ _______________\n2 ______________ _______________\n3 ______________ _______________\n4 ______________ _______________\n5 ______________ _______________\n6 ______________ _______________\n7 ______________ _______________ Copy the TM scene (.img and .rrd file) from G:/ Fall-2015/G526/Lab06/. The image is called I5016037_03720050727.img \u0026amp; be sure to copy I5016037_03720050727.rrd as well. Note that this image has been processed at a Level 1T. See this link for a description of the processing. http://landsathandbook.gsfc.nasa.gov/data_prod/prog_sect11_3.html\n Open ERDAS Imagine 2014, and open the TM image listed above in a 2D Viewer.\nA) According to the website above, what projection should the data be in? _______________________ (1 point) B) Verify the projection by opening the metadata for the image in ERDAS (right click on the image name Metadataclick Projection tab). Is the image in the projection listed on the website? (Circle the answer, 1point) Yes No\nThis image was downloaded from the Global Land Cover Facility (GLCF) at the University of Maryland. They use a file naming convention that provides useful information about the scene. The following is what the file name convention consists of:\n [satellite][sensor][path][starting row]_[ending row][acquisition date].img Lengths: [1] [1] [3] [3] [3] [8]\r Provide the following information: (2 points)\nDate of acquisition: __________________________\nPath/Row: __________________________\n A) Which band is the thermal infrared band of TM (2 points)? Band______ Spectral range: ______________ μm\n B) How can you tell by looking at the imagery? (1 point)\nC) What 2 types of terrain features appear warmest? (2 points)\nD) What 2 types of terrain features appear coolest? (2 points)\nDisplay and Link two viewers in ERDAS Imagine: Open TM band 1in a viewer and Fit to Frame. Now go to File New 2D View and open TM band 4 and Fit to Frame. The two viewers should be displayed side by side. With 2D View #1 (displaying TM band 1) highlighted (the upper frame of the viewer will show in yellow), go to the frame of 2D View #2 and click on the Spatially link icon . This will spatially link the two viewers. You can zoom in one viewer and the area you zoom to will be outlined in the other viewer. Likewise you can zoom to the same spatial extent and pan in one viewer and the other viewer will update to show the same extent. This is helpful when comparing two bands or datasets.\n Using TM band 1 and TM band 4, examine the waterways surrounding Charleston. Comment on the differences you observe in the water surrounding the islands for each band and why you think those differences occur. (2 points)\n Open the Inquire tool. Type in the these coordinates (600474, 3618299) into the X:,Y: boxes, respectively. The crosshairs will move to these coordinates and the images will pan to the area of the X, Y. Make sure your viewers are still linked. Examine the land located in this area of TM band 1, located just left of the barrier islands. Describe this area’s appearance in band 1. Now describe the same area using TM band 4. What is causing the feature to appear differently between the two bands? (2 points)\n Use the Inquire tool to go to the area around the X, Y coordinates 589937, 3620917. 
The land should be located approximately between two rivers. Examine this land area for both TM band 1 and TM band 4. Describe and explain the differences in how and why they appear the way they do in each band. (2 points)\n What are 2 advantages TM data provide for preparing a land use/land cover map of the area? (2 points)\n What 2 disadvantages, or problems, might the TM data present? (2 points)\n Part 2: SPOT Image Copy the entire folder SPOT6_1.5m_Ortho_Product_Bundle from G:/Fall-2015/G526/Lab06/. This is an original SPOT ortho product (with georeference) with all the information in the folder. Use ERDAS 2014, Open the images and the information in the folder (or use your textbook,internet) to fill the information below. (10 points)\nMultispectral (MS)\tBand Name (e.g. red)\tWave Length\tPixel Size/Resolution 1\t_________\t_______________\t_______________ 2\t_________\t_______________\t_______________ 3\t_________\t_______________\t_______________ 4\t_________\t_______________\t_______________ Panchromatic (P)\t1\t_______________\t_______________\nOpen the image IMG_SPOT6_MS_201212071020271_ORT_605187101_R1C2.JP2 in the folder of SPOT6_1.5m_Ortho_Product_Bundle\\DS_SPOT6_201212071020271_E002N41_01983_1\\PROD_SPOT6_001\\VOL_SPOT6_001_A\\IMG_SPOT6_MS_001_A Answer the following questions:\nA)\tIdentify the number of rows and columns in this image (2 points): Rows ________ and Columns ______________ B)\tWhat’s the resolution of this image? (1 point)___________________\nC)\tHow many acres does this image cover? (Show your work, 2 points)\nD)\tDate of acquisition (1 point) ___________________\nYou will use ArcGIS for this section. Open the “Part3.mxd” in the folder of Lab06. Identify the following features: (8 points) A.\nB.\nC.\nD.\nE.\nF.\nG.\nH.\nThere are many differences between SPOT 5 and SPOT 6. Compare the information about SPOT 5 and the information you found about the SPOT 6 images. Discuss the major differences between them (resolution, band number, etc.) (4 points) Part 3. TM \u0026amp; SPOT Comparisons\nYou have now had a brief opportunity to examine Landsat and SPOT imagery. Use the available SPOT and TM images, the Lab06 handout, and your textbook for reference in answering the following questions.\n Consider 2 advantages and 2 disadvantages of using such products versus, for example, aerial photographs. Speculate on and list some potential applications of Landsat TM and SPOT imagery respectively: (6 points)\n Using a set of four criteria (spectral resolution, temporal resolution, thermal data, and geometric fidelity) contrast between TM and SPOT imagery. (e.g. what are the advantages and disadvantages of each, compared with the other) (4 points)\n Bonus question: Use the resources from the internet, and get the information on Landsat 8. Compare Landsat5 TM and Landsat 8. What are the major differences between these two sensors? (Resolutions, applications?) Your answer must be specific to get full credit (3 points)\nSubmission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":23,"href":"/classes/geog111/labs/lab09A/","title":"Lab 09A - 3D Modeling and Visualization","parent":"Labs","content":"Future content here\n"},{"id":24,"href":"/classes/geog111/labs/lab09B/","title":"Lab 09B - Cloud based GIS","parent":"Labs","content":" This lab is ungracefully ripped from my own tutorial on the UCGIS BoK entry for GEE. 
Please explore this version for more information and background.\n Google Earth Engine is still a beta product, and you will need to request access to the platform through your Google account here. These requests are still approved by hand and may take a few days to process. Once approved, you will need to follow the instructions in the email that is sent before you are able to access the platform. Therefore, be sure to do this step before you attempt to work through this lab. Learning Objective This lab serves as a highly hand-held introduction to Google Earth Engine and cloud-based geospatial operations. There will be a lot of code to look at here, but no prerequisite coding experience is necessary and the steps have been broken into small, digestible pieces. The goal by the end of this tutorial is not that you become an expert in JavaScript, but that you see the power and potential that cloud-based geospatial analyses offer, and that you come away with a deeper appreciation for the technology necessary to facilitate effective global-scale operations.\nOutline: Learning Objective Submission requirements Tutorial Background and outline Creating ImageCollections Joining the Collections Masking out large zenith angle pixels Reclassifying Calculating Snow Cover Frequency Trend Analysis Exporting the Results Building a web-based application Conclusions \u0026amp; Wrapping up Submission requirements You are submitting your work to blackboard; see the Question to submit section at the end of this tutorial for exactly what to include.\nTutorial Background and outline Snow cover frequency, or the number of days that a spot is covered by snow divided by the number of days it has a valid observation, is an important metric of the cryosphere and is critical to a number of biogeophysical processes. Using the daily MODIS snow cover dataset, we will calculate the snow cover frequency on a yearly basis for water years 2003 through 2018. A water year here is defined as October 1st to September 30th and is labeled as the year of the end date. For example, water year 2004 runs from Oct. 1, 2003 to Sept. 30, 2004. We will then find and map the linear trend of snow cover frequency at every pixel across the globe. To add a little more nuance to this analysis, we will take into account the sensor zenith angle of each MODIS observation. At the extremes of the MODIS instrument’s sensor zenith range, the pixel length has essentially doubled and the width has increased to more than 10 times that of the nadir pixel. These observations can skew areal coverage estimates, so we want to limit our daily snow cover dataset to just those observations that have a sensor zenith angle less than 25°, limiting pixels to 110% of the nominal area of a nadir pixel. To do this we need two daily datasets: the daily snow cover product MOD10A1 and the sensor properties contained in the MOD09GA dataset, both of which are available in GEE.\nTo accomplish this analysis, we will first need to join the two datasets (MOD10A1 and MOD09GA) to create a combined dataset. We then need to mask out those pixels with a high sensor zenith angle. After that, we need to reclassify the snow cover to a simple snow/no snow/missing dataset, which will make future calculations easier. After reclassifying the data, we want to count at each pixel the number of days that have a snow observation and the number of days that have a valid observation. We will then need to add a band representing the water year (time) and calculate the trend at each pixel before finally exporting our results. The steps of this process are shown in Figure 3.
We will wrap much of that processing up into functions, as outlined in the larger ovals of Figure 3.\n A schematic of data analysis flow.\n Creating ImageCollections Where we are in relation to the rest of our analysis. Here we explore the data and learn its associated structure in GEE.\n Looking at Figure 4, we will start by accessing the MOD10A1 dataset and examining its structure. You can search for datasets with the search bar at the top of the Code Editor and use the associated GEE data IDs to access them. We start by searching for MOD10A1 and importing it, as shown in Figure 5. Using just over 40 characters, we have access to the entire MOD10A1 dataset in less than 20 seconds. These static variables can also be saved as Imports, which sit in the Imports section above the code editor.\n The view of the Code Editor IDE and how to import an ImageCollection into the API for analysis. The code is available at https://code.earthengine.google.com/8039423066ab1f923a9ba9c7db01512e\n We can see that MOD10A1 is imported into your JavaScript as an ImageCollection. If you recall, the MOD10A1.006 data product is a daily global snow cover dataset, and each Image in the ImageCollection has 9 bands. The one we are interested in is the “NDSI_Snow_Cover” band, a fractional snow cover band with valid values from 0-100. Although the quality band has a high sensor zenith flag, the full sensor zenith angle data is only found in the MOD09GA.006 dataset. Therefore, our next goal is to join the two datasets together.\nJoining the Collections Where we are in relation to the rest of our analysis. Here we demonstrate how to join collections.\n After we import the MOD09GA dataset as another ImageCollection, our goal is to get the sensor zenith angle band from the MOD09GA ImageCollection to join with our snow cover data. To select a particular band to work with, we can use the .select() method, as shown in Figure 7.\n Example of using the .select() and .filterDate() methods in GEE to pull the images we need and examine their structure. The code is available at https://code.earthengine.google.com/585a7df6444cfa07d10a25a8dfbb337e\n Joining two collections requires a common property between them, just as it does in standard GIS software. In this case, the best way to join the collections is on their acquisition times, using the “system:time_start” property as the filter. If you look at the various types of joins in the documentation, you will see that there are a few ways to do this, but we will use ee.Join.inner(). The code to join the two datasets together is shown in Figure 8.\n The code used to join the two MODIS datasets together, which is available at https://code.earthengine.google.com/e63a22be50d2c017363d22d0d0d1c423\n We\u0026rsquo;ve just joined two MODIS collections together, and it took less than 20 lines of code to do so! However, if you print the resulting collection you will get something that looks like Figure 9.\n Printed results of the joined collection, itself the result of the previous code execution.\n The output collection from the join operation is a FeatureCollection, where the matching images are the \u0026ldquo;primary\u0026rdquo; and \u0026ldquo;secondary\u0026rdquo; properties of the features. In order to convert this to a format that we can work with, we need to run a function across the FeatureCollection, creating a new image that has the bands from the primary and secondary images.
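For reference, the import, band selection, and join steps described up to this point look roughly like the sketch below. The variable names are illustrative rather than the exact ones used in the linked scripts, and the catalog IDs shown are the standard GEE asset IDs for the version-006 MODIS products.

// Import the two daily MODIS collections and keep only the bands we need
// (a single water year is used here to keep the example small).
var snow = ee.ImageCollection('MODIS/006/MOD10A1')
    .filterDate('2002-10-01', '2003-09-30')
    .select('NDSI_Snow_Cover');
var zenith = ee.ImageCollection('MODIS/006/MOD09GA')
    .filterDate('2002-10-01', '2003-09-30')
    .select('SensorZenith');

// Join the two collections on their acquisition time.
var timeFilter = ee.Filter.equals({
  leftField: 'system:time_start',
  rightField: 'system:time_start'
});
var joined = ee.Join.inner().apply(snow, zenith, timeFilter);

// Printing the result shows a FeatureCollection whose features hold the
// matching images in their 'primary' and 'secondary' properties (Figure 9).
print(joined.first());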
You can visualize the conversion as the series of operations shown in Figure 10.\n The schematic of the MergeBands function we need to write to transform the results of our join into a more usable form.\n The code for the MergeBands function, and for applying it to each matching pair of images in the FeatureCollection, is shown in Figure 11. When you run the code, be sure to note the difference between the input and output collections.\n The code (available at https://code.earthengine.google.com/81bac4a77cf20821f86ad94b7611d457) for the MergeBands function and for applying the function to each pair of matched images, which converts the joined collection to a form more useful for subsequent analyses\n Now that we have joined the two collections, we need to remove the pixels that have a sensor zenith angle greater than 25°.\nMasking out large zenith angle pixels Where we are in relation to the rest of our analysis. Here we demonstrate two useful functions to create a sensor-filtered dataset.\n As shown in Figure 12, with snow cover and zenith angle joined into images as bands, we need to remove from each image the pixels that have a sensor zenith angle value larger than 2500 (the angle is stored multiplied by 100, so 2500 corresponds to 25°). To do this we will introduce two of the most useful functions in GEE, .map() and .mask(). We have already used the .map() method to apply the MergeBands function; it executes the same function across each element in a collection and returns a collection of the same depth. Masking is another helpful method we’ll use: the mask is applied to the image, and any pixel not included in the mask is converted to a null value. Figure 13 shows a simple example where we use the .mask() method to keep only the areas of the SRTM90 elevation dataset with an elevation greater than 2000 m.\n The code (available at: https://code.earthengine.google.com/60187ddb4b541fb209697f2c6820c65d) and results of an example .mask() method.\n Using the .mask() and .map() methods, we’ll modify our code to mask out any pixels with a sensor zenith angle greater than 25°. Because we want to do this for every image in the ImageCollection, we will use the .map() method with the MaskSensorPixels() function, as shown in the highlighted portion of Figure 14.\n The code for masking out large sensor zenith angle pixels, which is available at https://code.earthengine.google.com/bfed155a968160db67253474cd707c9f\n Reclassifying Where we are in relation to the rest of our analysis. Here we need to reclassify our snow band to a binary snow/no-snow dataset.\n As shown in Figure 15, our next step is to reclassify the pixels of each image to snow, no-snow, or missing. The “NDSI_Snow_Cover” band in MOD10A1.006 is a fractional band with valid values from 0-100, indicating the fractional snow coverage at a pixel. To perform a snow cover frequency analysis, we need to reclassify the band into snow/no-snow/missing categories, using 1 to represent “snow”, 0 for “no snow”, and null for \u0026lsquo;missing\u0026rsquo; values. To accomplish this, GEE has a remap() method, which takes a list of values and maps them to another list of values you specify. Because we need to reclassify every image in the collection, we will use the handy .map() method again.
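Put together, the merge, mask, and reclassify steps just described look roughly like the sketch below. It assumes the joined FeatureCollection from the earlier sketch; the function names follow the text, but the band logic (for example, the snow threshold) is a simplified stand-in for the code in the linked scripts.

// Convert each joined feature back into a single image carrying bands
// from both the primary (snow) and secondary (sensor zenith) images.
var MergeBands = function(feature) {
  var primary = ee.Image(feature.get('primary'));
  var secondary = ee.Image(feature.get('secondary'));
  return ee.Image(primary.addBands(secondary)
      .copyProperties(primary, ['system:time_start']));
};
var merged = ee.ImageCollection(joined.map(MergeBands));

// Keep only pixels observed at a sensor zenith angle below 25 degrees.
// The SensorZenith band stores the angle multiplied by 100, so 2500 = 25 degrees.
// (.updateMask() is used here; the text refers to this operation as .mask().)
var MaskSensorPixels = function(image) {
  var lowZenith = image.select('SensorZenith').lt(2500);
  return image.updateMask(lowZenith);
};

// Reclassify the fractional NDSI_Snow_Cover band (0-100) to snow (1) /
// no snow (0). A simple threshold stands in here for the remap() call in
// the linked scripts; pixels without a valid observation stay null.
var Reclassify = function(image) {
  var ndsi = image.select('NDSI_Snow_Cover');
  var snow = ndsi.gt(0).updateMask(ndsi.lte(100)).rename('remapped');
  return ee.Image(snow.copyProperties(image, ['system:time_start']));
};

var dailySnow = merged.map(MaskSensorPixels).map(Reclassify);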
We are now at the end of our first data processing chain, \u0026ldquo;Prepare MODIS Snow cover\u0026rdquo;, meaning we can wrap all the previous steps up into a single function, PrepareModisSnowCover(), that takes two dates, a StartDate and a StopDate. The code to do this is shown in Figure 16. As an aside, these last few steps could have been wrapped up in the same .map() call, but they are separated here for demonstration purposes.\n The code used to generate a year of sensor-zenith-angle filtered, reclassified snow cover data, available at https://code.earthengine.google.com/5e68e31ee7ead90a6d33b80ab8826e55\n Calculating Snow Cover Frequency Where we are in relation to the rest of our analysis. Here we need to create a function that generates a time series of images storing the snow cover frequency and the date of each image.\n Now that we have a function to create a processed MODIS snow cover dataset for a given time interval, we need to construct a function to generate a time series of snow cover frequency, as outlined in Figure 17. The PrepareModisSnowCover function returns a single band, called remapped, that represents sensor-adjusted and reclassified snow cover for each day of the collection. In order to calculate snow cover frequency, we need to know both the number of snow days and the number of days with valid observations. The .count() method in GEE counts the number of times a pixel has a valid value within an image collection, ignoring (not counting) images where the pixel is null, so it counts both snow and no-snow values in a pixel stack. We can count the number of snow days at a pixel by using the .sum() method, which adds up the values of a stack of pixels (each snow day contributes a 1). We will wrap this up in a function that takes a start date, an end date, and the number of intervals to advance. The code to accomplish this is shown in Figure 18 below: The code which calculates snow cover frequency within a certain time interval is available at https://code.earthengine.google.com/aeee1a828fb717397d38eaef236ed498\n Trend Analysis Where we are in relation to the rest of our analysis. Here we need to pass our final dataset to the linearFit reducer.\n As outlined in Figure 19, we have now created the necessary functions to create a time series of snow cover frequency, but to calculate a linear trend we also need a band that represents the time stamp. With a small modification to the GenerateSnowCoverFrequency function, we can add the EndDate as a numerical band, which gives us everything we need to calculate the linear trend. If we look at the documentation, ee.Reducer.linearFit() returns an image with two bands, one called \u0026lsquo;scale\u0026rsquo; and one called \u0026lsquo;offset\u0026rsquo;. These are Google\u0026rsquo;s terms for \u0026lsquo;slope\u0026rsquo; and \u0026lsquo;intercept\u0026rsquo;, so the value we are after is the scale. The code to do so is shown in Figure 20.\n The code to create snow cover frequency and linear trend is available at https://code.earthengine.google.com/c5dd98445765937e9ff1c1f4f6d24ec5
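As a rough sketch of what these two steps amount to, the snow cover frequency and trend calculation might look like the code below. It reuses the MergeBands, MaskSensorPixels, and Reclassify functions from the earlier sketch; the structure follows the text, but the details are simplified relative to the linked scripts (for example, the water year number is used as the time band rather than the full end date, and the interval argument is omitted).

// A compact stand-in for the PrepareModisSnowCover(StartDate, StopDate)
// function described above: the join/mask/reclassify chain for one interval.
var PrepareModisSnowCover = function(start, stop) {
  var snow = ee.ImageCollection('MODIS/006/MOD10A1')
      .filterDate(start, stop).select('NDSI_Snow_Cover');
  var zenith = ee.ImageCollection('MODIS/006/MOD09GA')
      .filterDate(start, stop).select('SensorZenith');
  var joined = ee.Join.inner().apply(snow, zenith, ee.Filter.equals({
    leftField: 'system:time_start', rightField: 'system:time_start'
  }));
  return ee.ImageCollection(joined.map(MergeBands))
      .map(MaskSensorPixels)
      .map(Reclassify);
};

// Snow cover frequency for one water year, plus the year as a band for the fit.
var GenerateSnowCoverFrequency = function(year) {
  var stop = ee.Date.fromYMD(year, 9, 30);
  var start = ee.Date.fromYMD(ee.Number(year).subtract(1), 10, 1);
  var daily = PrepareModisSnowCover(start, stop);
  var snowDays = daily.sum();      // number of days classified as snow
  var validDays = daily.count();   // number of days with a valid observation
  var frequency = snowDays.divide(validDays).rename('frequency');
  return frequency.addBands(ee.Image.constant(year).toFloat().rename('year'));
};

// Build the 2003-2018 time series and fit a line at each pixel.
var years = ee.List.sequence(2003, 2018);
var frequencySeries = ee.ImageCollection.fromImages(years.map(GenerateSnowCoverFrequency));

// linearFit() treats the first band as x and the second as y;
// 'scale' in the output is the slope of snow cover frequency over time.
var trend = frequencySeries.select(['year', 'frequency'])
    .reduce(ee.Reducer.linearFit());
var slope = trend.select('scale');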
Exporting the Results Where we are in relation to the rest of our analysis. Now we need to display and export our analysis.\n As we can see from Figure 21, we are at the end of our analysis. The final step is to visualize and export the analysis so that others can view it. The first thing we need to do is create color palettes. We then filter the slope based on a minimum snow cover frequency, using the .mask() function with a .min() reducer over the entire collection. This ensures that we are not displaying trend results for areas that have very little snow cover. We then use the Map.addLayer() method to add a layer to the map, setting the min and max values and the palette we want. Then we export our analysis as a GeoTIFF using Export.image.toDrive(). For a little more flair, we’ll also export the time series of snow cover frequency as a video using Export.video.toDrive(), using another .map() call to force the images into RGB space for video. The code to accomplish this is shown in Figure 22.\n The code to accomplish the display and export of our analysis, available at https://code.earthengine.google.com/3ac848247d0e231e50c5feefd7d23f6d\n Building a web-based application Finally, let’s capitalize on the new UI and app features in GEE to create an interactive web map so that anyone may view our analysis. The UI Examples provide several great frameworks for constructing an app, so we will borrow some of them to create an application using our new snow cover trend data. The code to do so is available at https://code.earthengine.google.com/bc3998bc98afca2007249eff2d8c6a1a. If you publish the app using the Apps button and following the prompts, anyone can access the analysis at a URL similar to https://JamesMColl.users.earthengine.app/view/bokdemo, as shown in Figure 23. An interactive web application based on this analysis, available at https://JamesMColl.users.earthengine.app/view/bokdemo\n Taking a step back to look at what we’ve accomplished in this tutorial really accentuates what can be done with GEE:\n We pulled in 16 water years of data from two daily MODIS datasets Joined them in space and time Filtered and reclassified the resulting dataset Calculated snow cover frequency Performed the linear fit with the snow cover frequency dataset Exported a GeoTIFF and video of the resulting slope map and created a web application so that anyone can interact with the analysis And the entire process probably took less time to perform than it would have taken to download a single day of MODIS data. This is just a small sample of the processing power and capabilities that Google Earth Engine has to offer, and an example of how this platform has revolutionized global-scale geospatial analyses. Conclusions \u0026amp; Wrapping up Taken as a whole, GEE represents the most advanced cloud-based geoprocessing platform to date. Although several other platforms encompass some of these aspects, no suite of tools currently available can replicate the access to geospatial data, relative simplicity of use, and sheer power of analysis that GEE offers. The platform is constantly under development, and new features and algorithms are continuously added. The GEE team and community are also incredibly approachable and helpful; GEE booths are commonly seen at many geospatial meetings, where the team demos use cases and announces new features.\nQuestion to submit\nUsing the code and app above, do the following.\n Navigate to someplace interesting in the world. Take a screenshot of the snow cover trend in that area and paste it at the top of your Word document. In 1-2 complete paragraphs below that, describe and interpret the trend and explain why it interests you. Why do you think this trend is occurring? What other information do you think you\u0026rsquo;d need to fully describe why the snow cover is changing in this way? 
"},{"id":25,"href":"/classes/geog111/labs/lab10/","title":"Lab 10 - Visual Imagery Interpretation","parent":"Labs","content":" This lab is a gratefully modified version of lab 9 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies\n Learning Objective This lab will help you start thinking about what objects look like from the sky rather than from the ground. You’ll be examining a series of images and applying the elements of visual image interpretation discussed in the chapter. As you do, you’ll need to think like a detective and search the image for clues to help figure out just what it is you’re looking at. You should be able to figure out exactly what you’re looking at or at least narrow down the possibilities to a short list before you turn to support data for help. Although the items you’ll be examining in this lab will all be large and prominent—buildings and other very visible features—the application of visual image interpretation elements for these simple (and hopefully fun) examples will help get you started looking at the world from above. The goals for you to take away from this exercise:\n Think of how objects look from an aerial perspective Apply the elements of visual image interpretation (as described in the chapter) to identify objects in the images Outline: Learning Objective Submission requirements Tutorial Viewing images Applying Elements of Visual Image Interpretation Visual Image Interpretation Wrapping up Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts and submit it to blackboard.\rTutorial Viewing images To examine a particular image with more control than your browser, you can download the zipped folder above, unzip it, and open an image using your photo viewer of choice. You can also open the image in either ArcGIS Pro or QGIS. These are .png images so they will load in and request to create pyrimids\nApplying Elements of Visual Image Interpretation Take a look at each of the images and try to determine what you’re looking at. There are questions related to each of them. Although everyone’s application of the elements may vary, there are some guidelines that you may find useful, as described in the following steps. Several images may contain multiple items, but all of them work together to help define one key solution. Let\u0026rsquo;s start with this sample image:\nFirst, look at the image as a whole for items to identify:\n The central object is obviously some sort of structure—and it is large, judging from the shadow being cast. 
The relative size of the structure (compared with the size of several of the cars visible throughout the image) suggests that it’s probably a building and not a monument of some kind. It’s set on a body of water (the tone of the water is different from the tone of the nearby concrete or greenery). Thus, the site can be fixed. One of the most notable features is the large triangular shape of the main portion of the building; note too that its texture is very different from the rest of the building (the three white sections). The shape of the entrance is also very distinctive: It is large and round, with a couple of concentric circles. Based on your interpretation of the initial items, start looking for specifics and relationships between the items in the image:\n The pattern of the concentric circles at the entrance plaza—two outer rings and a third inner ring with some sort of a design placed in it—is striking. Judging from the relative size of a person (pick out shadows near the plaza or on the walkway along the water), the plaza is fairly large. The pattern sort of resembles a giant vinyl record album—and the large curved feature that follows around the plaza on its right looks like the arm of an old record player. The texture of the triangular portions of the building suggests that they could be transparent, as if that whole portion of the structure were made of glass. There’s no parking lot associated with the building, so you’d have to park somewhere else and walk to this building. Further, there are no major roads nearby, again indicating that this is not some sort of destination that you just drive up to and park next to. The large body of water adjacent to large concrete walkways gives a clue that this building is in a large city. Put the clues together and come up with an idea of what the object or scene could be:\n The building is distinctive enough to be a monument, but it is probably too large to be one. It could be some sort of museum, casino, shopping center, or other attraction. However, the building is too small to be a casino or resort center, and it lacks the necessary parking to be a casino, an office building, or a shopping center. The very distinct “record” motif of the front plaza seems to indicate that music (and, more specifically, older music, given the “vinyl record” styling) plays a large part in whatever the building is. From the clues in the image, this could be some sort of music museum, like the Rock \u0026amp; Roll Hall of Fame, which is located in Cleveland, Ohio, right on the shore of Lake Erie. Finally, use auxiliary sources to verify your deduction or to eliminate some potential choices. By using Google Earth to search for the Rock \u0026amp; Roll Hall of Fame, you can confirm the identification. An image search on Google will also turn up some non-aerial pictures of the site to verify your conclusion. This lab does not provide such extensive details on all the images, and you will not necessarily use all the elements to examine every image. For instance, there may be one or two items in an image that will help you make a quick identification. However, there are enough clues in each image to figure out what you’re looking at—or at least narrow down the choices far enough that some research using other sources may be able to lead you to a positive identification. (For instance, by putting together all the clues from the sample image, you could start searching for large museums dedicated to music. 
Add the triangular glass structures as part of your search, and the Rock \u0026amp; Roll Hall of Fame will certainly come up.)\nVisual Image Interpretation Important note: All images used in this exercise are from areas within the United States.\n For each image, answer the questions presented below and then explain which of the elements of visual image interpretation led you to this conclusion and what they helped you decide. Writing “shadow and shape helped identify traits of the building” would be unacceptable. It would be much better to write something that answers the question “What was so special about the shadows in the scene or the shape of items that helped?” When you identify the items, be very specific (for instance, not just “a baseball stadium” but “Progressive Field” in Cleveland, Ohio). Use external sources for extra information, such as Websites, search engines, books, or maps. When you think you finally have an image properly identified, you may want to use something like Google Earth or Bing Maps to obtain a view of the object you think you’re looking at in the image to help verify your answer.\nQuestion 1\nExamine image 1. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? Question 2\nExamine image 2. There are many similar objects in this image, but you’ll notice some differences too. What (specifically) is being displayed in this image, and what is its location? What elements of visual image interpretation lead you to draw this conclusion? Question 3\nExamine image 3. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? Question 4\nExamine image 4. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? Question 5\nExamine image 5. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? Question 6\nExamine image 6. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? Question 7\nExamine image 7. There are several items in this image but they all add up to one specific thing. What, specifically, is this image showing, and what is its location? What elements of visual image interpretation lead you to draw this conclusion? Question 8\nExamine image 8. There are several items visible in the image, but there’s one that’s more prominent than the others. What, specifically, is this item? What elements of visual image interpretation lead you to draw this conclusion? Question 9\nExamine image 9. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? For instance, it’s obviously an island, but what type of features does the island have (or not have) that may aid in identifying what is being shown here? Question 10\nExamine image 10. What (specifically) is being displayed in this image? What elements of visual image interpretation lead you to draw this conclusion? Wrapping up There is no need to save anything from this lab, so when done you can simply close without saving. 
Submit your answers to the questions on blackboard.\n"},{"id":26,"href":"/classes/geog111/labs/lab11/","title":"Lab 11 - Remotely Sensed Imagery and Color Composites","parent":"Labs","content":" This lab is a gratefully modified version of lab 10 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies p752\n Learning Objective This chapter’s lab introduces some of the basics of working with multispectral remotely sensed imagery. The goals to take away from this exercise:\n Familiarize yourself with the basics of multispectral data manipulation in common geospatial softwares Load various bands into the color guns and examine the results Create and examine different color composites Compare the digital numbers of distinct water and environmental features in a remotely sensed satellite image in order to create basic spectral profiles Outline: Learning Objective Submission requirements Tutorial Submission Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts and submit it to blackboard.\rCopy the folder Chapter 10, which contains a Landsat 8 OLI/TIRS satellite image (called clevelandjuly.img) of Cleveland, Ohio, from July 18, 2018. This file shows a subset of a larger Landsat satellite image. We will discuss Landsat imagery in more detail in Chapter 11, but for now, you just need to know that the OLI and TIRS imagery bands refer to the following portions of the electromagnetic (EM) spectrum, in micrometers (µm):\r* Band 1: Coastal (0.43 to 0.45 µm)\r* Band 2: Blue (0.45 to 0.51 µm)\r* Band 3: Green (0.53 to 0.59 µm)\r* Band 4: Red (0.64 to 0.67 µm)\r* Band 5: Near infrared (0.85 to 0.88 µm)\r* Band 6: Shortwave infrared 1 (1.57 to 1.65 µm)\r* Band 7: Shortwave infrared 2 (2.11 to 2.29 µm)\r* Band 8: Panchromatic (0.50 to 0.68 µm)\r* Band 9: Cirrus (1.36 to 1.38 µm)\r* Band 10: Thermal infrared 1 (10.60 to 11.19 µm)\r* Band 11: Thermal infrared 2 (11.50 to 12.51 µm)\rKeep in mind that Landsat 8 imagery has a 30-meter spatial resolution (except for the panchromatic band, which is 15 meters). Thus, each pixel you will examine covers a 30 meter × 30 meter area on the ground.\nTutorial ArcPro Add image to the map Read in options Symbology options Under Channels, you see the three color guns available to you (red, green, and blue). Each color gun can hold one band. (See the “Lab Data” section for a list of the bands that correspond to the various parts of the electromagnetic spectrum.) 
The number listed next to each color gun in the dialog box represents the band being displayed with that gun.\n Display band 5 in the red color gun, band 4 in the green color gun, and band 3 in the blue color gun.\n Accept the other defaults for now and click OK.\n A new window appears, and the clevelandjuly image begins loading. It might take a minute or two to load completely.\n Zoom around the image, paying attention to some of the city, landscape, and water areas.\n Question 10.1 What wavelength bands were placed into which color guns? Question 10.2 Why are the colors in the image so strange compared to what you’re normally used to seeing in other imagery (such as Google Earth or Google Maps)? For example, why is most of the landscape red? Question 10.3 In this color composite, what colors are used to display the water, vegetated areas, and urban areas in the image?\nReopen the image, this time using band 4 in the red gun, band 3 in the green gun, and band 2 in the blue gun (referred to as a 4-3-2 combination). When the image reloads, pan and zoom around the image, examining the same areas you just looked at. Question 10.4 What kind of composite did you create in this step? How are the bands displayed in this color composite in relation to their guns? Question 10.5 Why can’t you always use the kind of composite from Question 10.4 when analyzing satellite imagery? Reopen the image yet again, this time using band 7 in the red gun, band 5 in the green gun, and band 3 in the blue gun (referred to as a 7-5-3 combination). When it reloads, pan and zoom around the image, examining the same areas you just looked at. Question 10.6 Once again, what kind of composite was created in this step? Question 10.7 How are vegetated areas being displayed in this color composite (compared with the arrangement in Question 10.4)? Why are they displayed in this color? Examining Color Composites and Color Formations Reopen the image one more time and return to the 5-4-3 combination (that is, band 5 in the red gun, band 4 in the green gun, and band 3 in the blue gun). You can close the other images, as you’ll be working with this one for the rest of the lab. Zoom and move around the image to find and examine Burke Lakefront Airport, From its shape and the pattern of the runways, you should be able to clearly identify it in the Landsat image. Examine the airfield and its surroundings. Question 10.8 Why do the areas in between the runways appear red? Open a new image, this time with a 4-5-3 combination. Examine Burke Lakefront Airport in this new image and compare it to the one you’ve been working with. Question 10.9 Why do the areas in between the runways now appear bright green? Open another new image, this time with a 4-3-5 combination. Examine Burke Lakefront Airport in this new image and compare it with the others you’ve been working with. Question 10.10 Why do the areas in between the runways now appear blue? At this point, keep only the 5-4-3 image open and close the other two. Examining Specific Digital Numbers and Spectral Profiles Regardless of how the pixels are displayed in the image, each pixel in each band of the Landsat 8 image has a specific digital number set in the 0–65535 range. By examining those pixel values for each band, you can chart a basic spectral profile of some features in the image.\n Zoom in to the area around Cleveland’s waterfront and identify the urban areas. (In the image, these will mostly be the white or cyan regions.) 
From the Window pull-down menu, select New Selection Graph. Another (empty) window (called Selection Graph) opens in MultiSpec. In the image, locate a pixel that’s a good example of an urban or developed area. When the cursor changes to a cross shape, click the pixel once more. Important note: Zoom in so that you are selecting only one pixel with the cursor. A chart appears in the Selection Graph window, showing graphically the DNs for each band at that particular pixel. (See the “Lab Data” section for a list of the bands that correspond to the various parts of the electromagnetic spectrum.) The band numbers are on the x-axis, and the DNs are on the y-axis. Expand or maximize the chart as necessary to better examine the values. The Selection Graph window now shows the data that can be used to compute a simplified version of a spectral profile for an example of the particular urban land use pixel you selected from the image. For the next question, you need to find a pixel that’s a good example of water and another pixel that’s a good example of vegetation. You also need to translate the data from the chart to a spectral profile for each example. In drawing the profiles from the information on the chart, keep two things in mind: a. First, the values at the bottom of the Selection Graph window represent the numbers of the bands being examined. On the chart below, the actual wavelengths of the bands are plotted, so be very careful to make sure you properly match up each band with its respective wavelength. Note that bands 10 and 11 are not charted because they measure emitted thermal energy rather than reflected energy (as in bands 1 through 9). Note that band 8 (panchromatic) is also not charted. b. Second, the values on the y-axis of the Selection Graph window are DNs, not the percentage of reflection, as seen in a spectral signature diagram. There are a number of factors involved in transforming DNs into percent reflectance values because a DN and the percentage of reflectance don’t have an exact one-to-one ratio, as there are other factors that affect the measurement at the sensor, such as atmospheric effects. However, in this simplified example, you need to chart just the DN for this spectral profile. Examine the image to find a good example of a water pixel and a good example of a vegetation pixel. Plot a spectral profile diagram for the water and vegetation pixels you chose on the following diagram. (Remember to plot values as calculated from the DNs.) Examine your two profiles and answer Questions 10.11 and 10.12. Question 10.11 What information can you gain from the spectral profile for water about the ability of water to reflect and absorb energy? That is, what types of energy are most reflected by water, and what types of energy are most absorbed by water? Question 10.12 What information can you gain from the spectral profile for vegetation about the ability of vegetation to reflect and absorb energy? That is, what types of energy are most reflected by vegetation, and what types of energy are most absorbed by vegetation?\n Exit MultiSpec by selecting Exit from the File pull-down menu. There’s no need to save any work in this exercise.\n QGIS TODO Submission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":27,"href":"/classes/geog111/labs/lab12/","title":"Lab 12 - Landsat 8 Imagery","parent":"Labs","content":" This lab is a gratefully modified version of lab 11 from Bradley A. 
Shellito\u0026rsquo;s Introduction to Geospatial Technologies p814\n Learning Objective This chapter’s lab builds on the remote sensing basics of Chapter 10 and returns to using the MultiSpec program. In this exercise, you’ll be starting with a Landsat 8 scene and creating a subset of it with which to work. During the lab, you’ll examine the uses for several Landsat 8 band combinations in remote sensing analysis. The goals for you to take away from this exercise:\n Familiarize yourself further and work with satellite imagery in MultiSpec Create a subset image of a Landsat 8 scene Examine different Landsat 8 bands in composites and compare the results Examine various landscape features in multiple Landsat 8 bands and compare them Apply visual image interpretation techniques to Landsat 8 imagery Outline: Learning Objective Submission requirements Open raster data Display in 5-4-3 Submission Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts and submit it to blackboard. Copy the folder Chapter 11, which contains a Landsat 8 OLI/TIRS satellite image (called LandsatJuly) of northeastern Ohio from July 18, 2018, which was constructed from data supplied via EarthExplorer. The Landsat 8 image bands refer to the following portions of the electromagnetic (EM) spectrum in micrometers (µm):\r* Band 1: Coastal (0.43 to 0.45 µm)\r* Band 2: Blue (0.45 to 0.51 µm)\r* Band 3: Green (0.53 to 0.59 µm)\r* Band 4: Red (0.64 to 0.67 µm)\r* Band 5: Near infrared (0.85 to 0.88 µm)\r* Band 6: Shortwave infrared 1 (1.57 to 1.65 µm)\r* Band 7: Shortwave infrared 2 (2.11 to 2.29 µm)\r* Band 8: Panchromatic (0.50 to 0.68 µm)\r* Band 9: Cirrus (1.36 to 1.38 µm)\r* Band 10: Thermal infrared 1 (10.60 to 11.19 µm)\r* Band 11: Thermal infrared 2 (11.50 to 12.51 µm)\rBands 1–9 are sensed by OLI, while bands 10 and 11 are sensed by TIRS. Keep in mind that Landsat 8 imagery has a 30-meter spatial resolution (except for the panchromatic band, which is 15 meters).\rOpen raster data Display in 5-4-3 11.2 Using Landsat 8 Imagery Bands The Landsat OLI/TIRS image has several different bands, each with its own use. (See Table 11.1 for information on which band represents which wavelengths.) For instance, looking at the entire Landsat scene now (the 5-4-3) combination, you have a broad overview of a large slice of northern Ohio using the near-infrared, red, and green bands.\n Vegetated areas (such as grass or trees) are reflecting a lot of near-infrared light in the red color gun, causing those areas to appear in shades of red. However, there are a lot of other things in the image as well. 
Examine the Landsat scene and zoom in on some of the cyan areas on the lakeshore of Lake Erie and then answer Question 11.1. Question 11.1 The features on the image in cyan are largely urbanized and developed areas. Why are they displayed in cyan on this image with the 5-4-3 band combination? Open the LandsatJuly image again—but this time use band 10 in the red color gun, band 10 again in the green color gun, and band 10 again in the blue color gun. These settings use band 10 in all three guns, so you see only this band in grayscale. This version of the LandsatJuly image loads in a separate window. Arrange the two windows (LandsatJuly in the 5-4-3 combination and LandsatJuly in the 10-10-10 combination) side by side so you can see both of them together. Keep in mind that band 10 in the Landsat 8 imagery is one of the thermal bands sensed by TIRS. Examine both of the Landsat scenes and answer Question 11.2. Question 11.2 What do the brighter places on the 10-10-10 image correspond with? Why do these places mostly appear brighter than their surroundings in the 10-10-10 image? Close the 10-10-10 version of LandsatJuly.5. Open the LandsatJuly image again, this time using a 9-9-9 combination (i.e., load the image with band 9 in the red gun, band 9 in the green gun, and band 9 in the blue gun). A new window opens with this image, which is band 9 in grayscale. Place this 9-9-9 image side by side with your original 5-4-3 image. Band 9 in the Landsat 8 imagery is designed for detecting cirrus clouds in imagery. Answer Question 11.3. Question 11.3 Where are the cirrus clouds in this section of northern Ohio in the image? Why are they so hard to see in the regular 5-4-3 image? Close the 9-9-9 image when you’re done so you’re only working with the 5-4-3 image. 11.3 Subsetting Images and Examining Landsat Imagery Right now, you’re working with an entire Landsat 8 scene, as shown on the next page, which is an area roughly 170 kilometers ×183 kilometers. For this lab, you will focus only on the area surrounding downtown Cleveland (i.e., path 18, row 31). You will have to create a subset; in essence, you will “clip” out the area that you’re interested in and create a new image from that Zoom to the part of the LandsatJuly image that shows Cleveland (as in the following graphic): 2. In the image, you should be able to see many features that make up downtown Cleveland —the waterfront area, a lot of urban development, major roads, and water features. 3. In order to create a new image that shows only Cleveland (a suggested region is shown in the graphic above), from the Processor pull-down menu, choose Reformat and then choose Change Image File Format. 4. You can draw a box around the area you want to subset by using the cursor, and the new image that’s created will have the boundaries of the box you’ve drawn on the screen. However, for the sake of consistency in this exercise, use the following values for Area to Reformat: a. Lines: ► Start 9015 ► End 10261 ► Interval 1 b. Columns: ► Start 847 ► End 2143 ► Interval 1 Leave the other defaults alone and click OK\n In the Save As dialog box that appears, save this new image in the Chapter 11 folder with the name clevsub.img. Choose Multispectral for the Save as Type option (from the pull-down menu options next to Save as Type). When you’re ready, click Save.\n Back in MultiSpec, minimize the window containing the LandsatJuly image.\n Open the clevsub image you just created in a new window. 
(In the Open dialog box, you may have to change the Files of Type that it’s asking about to All Files to be able to select the clevsub image option.)\n Open the clevsub image with a 5-4-3 combination (band 5 in the red gun, band 4 in the green gun, and band 3 in the blue gun).9. Use the other defaults for the Enhancement options: stretch set to Linear and Min-max set to Clip 2% of tails.\n In the Set Histogram Specifications dialog box that opens, select the Compute new histogram method and use the default Area to Histogram settings.\n Click OK when you’re done with the settings. The new subset image shows that the Cleveland area is ready to use.\n Zoom in on the downtown Cleveland area, especially the areas along the waterfront. Also open Google Earth Pro and compare the Landsat image to its very crisp resolution imagery. Answer Questions 11.4 and 11.5. Question 11.4 What kinds of features on the Cleveland waterfront cannot be distinguished at the 30-meter resolution you’re examining in the Landsat image? Question 11.5 Conversely, what specific features on the Cleveland waterfront are apparent at the 30-meter resolution you’re examining in the Landsat image? 11.4 Examining Landsat Bands and Band Combinations\n Zoom in on the Cleveland waterfront area in the clevsub image, so you can see FirstEnergy Stadium, home to the Cleveland Browns, and its immediate surrounding area.\n Open another version of the clevsub image using the 4-3-2 combination.\n Arrange the two windows on the screen so that you can examine them together, expanding and zooming in as needed to be able to view the stadium in both three-band combinations.Question 11.6 Which one of the two band combinations best brought the stadium and its field to prominence? Question 11.7 Why did this band combination best help in viewing the stadium and the field? (Hint: You may want to do some brief online research into the nature of the stadium and its field.)\n Return to the view of Cleveland’s waterfront area and examine the water features (particularly Lake Erie and the river). Paying careful attention to the water features, open four display windows and then expand and arrange them side by side so you can look at differences between them. Create the following image composites, one for each of the four windows: a. 4-3-2 b. 7-6-5 c. 5-3-2 d. 2-2-2 Question 11.8 Which band combination(s) is best for letting you separate water bodies from land? Why?\n Return to the view of Cleveland’s waterfront area. Focus on the urban features and vegetated features. (Zoom and pan where necessary to get a good look at urbanization.) Open new windows with the following new band combinations and note how things change with each combinations: a. 7-5-3 b. 2-3-4 c. 5-4-3\n Question 11.9 Which band combination(s) is(are) best for separating urban areas from other forms of land cover (i.e., vegetation, trees, etc.)? Why?\nSubmission All you have to turn into blackboard for this week is the final image you created above.\n"},{"id":28,"href":"/classes/geog111/labs/lab13/","title":"Lab 13 - Earth Observing Missions Imagery","parent":"Labs","content":" This lab is a gratefully modified version of lab 12 from Bradley A. Shellito\u0026rsquo;s Introduction to Geospatial Technologies p880\n Learning Objective This chapter’s lab introduces some of the basics of examining imagery from three different Earth-observing satellite missions: Terra, Aqua, and Suomi NPP. You will be examining data from MOPITT as well as many types of imagery from MODIS and VIIRS. 
You will also be using online resources from NASA and others in conjunction with Google Earth. The goals for you to take away from this lab:\n Utilize Google Earth Pro as a tool for examining Earth-observing satellite imagery as an overlay Examine the usage and functions of the day-night band imagery from VIIRS Examine the usage and functions of MODIS imagery for various applications Examine the output from MOPITT imagery of global carbon monoxide Examine environmental and climate applications of satellite imagery from MODIS for land surface temperature, sea surface temperature, and snow cover Outline: Learning Objective Submission requirements Google Earth Pro Viewing VIIRS Satellite Imagery Overlays with Google Earth Pro Wrapping up Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts and submit it to blackboard. Copy the folder Chapter 12, which contains the following GeoTIFF datasets from NASA’s Earth Observatory:\r* Bakken_vir_2012317_geo.tif, a VIIRS image showing a section of northwestern North Dakota at night\r* russia_tmo_2012170_fires_geo.tif, a MODIS image showing fires in Siberia\r* samerica_vir_2012202_geo.tif, a VIIRS image showing a section of the South American eastern coastline at night\r* irene_amo_2011238_geo.tif, a MODIS image showing Hurricane Irene\r* Newzealand_amo_2017317_geo.tif, a MODIS image showing an algal bloom off the coast of New Zealand\r* The folder also contains the following KML dataset from the University of Wisconsin’s Space Science and Engineering Center:\r* Daily_MODIS_May3, a series of MODIS images covering the United States from May 3, 2017\r##Tutorial\nGoogle Earth Pro Since we arn\u0026rsquo;t doing much quantitative analysis with this data\nViewing VIIRS Satellite Imagery Overlays with Google Earth Pro Start Google Earth Pro (GEP), and once loaded turn on the option Borders and Labels. From the File pull-down menu, choose Open. Under the pull-down menu where you can change the types of files to open in Google Earth, choose Images to allow GEP to open files of .tif file type. Open the file Bakken_vir_2012317_geo.tif. On the dialog box that pops up, click Crop. When prompted, click the mouse somewhere in the middle of North Dakota. The VIIRS imagery should appear on the screen, properly aligned with Google Earth. In the New Image Overlay dialog box, Google Earth gives you some information about the image. Click OK in this dialog box to close it. A new layer called bakken_vir_2012317_geo.tif is added to Google Earth. Although North Dakota is sparsely populated, a lot of lights can be seen by VIIRS at night. 
These lights are related to the gas and oil drilling sites of the Bakken shale in the region. You may have to zoom out and pan around the imagery to see the whole North Dakota region. More information about this is available through NASA’s Earth Observatory\n Question 1\nHow does the VIIRS imagery help determine where shale and oil drilling is occurring in this section of North Dakota? In Google Earth Pro, in the Places box, right-click on the Bakken_vir_2012317_geo.tif file and choose Delete to remove it. Open the next GeoTIFF file, samerica_vir_2012202_geo.tif, in Google Earth Pro. GEP rotates to the Atlantic coast of South America and shows you an outline box of where the image should be placed. In the dialog box that appears, again click Crop and click the mouse somewhere in the center of the outline box. The VIIRS image appears properly aligned in GEP. In the New Image Overlay dialog box, Google Earth gives you some information about the image. Click OK in that box to close it. You see a VIIRS image captured during the night in July 2012 showing part of the eastern coast of South America. (More information about this is available through NASA’s Earth Observatory at https://earthobservatory.nasa.gov/images/79822/city-lights-of-south-americas-atlantic-coast.) You may have to zoom out and pan around the imagery to see the whole region covered by the imagery. Question 2\nHow does the VIIRS imagery help determine the spread of cities and populations across this section of South America? How can this imagery be used to aid in measuring demographics such as population density? In Google Earth Pro, in the Places box, right-click on the samerica_vir_2012202_geo.tif file and choose Delete to remove it. 12.2 Viewing MODIS Satellite Imagery Overlays with Google Earth Pro\n Open the next GeoTIFF image to examine: russia_tmo_2012170_fires_geo.tif. GEP rotates to Russia and shows you an outline box of where the image should be placed. In the dialog box that appears, this time click Scale. The MODIS image appears properly aligned in GEP. In the New Image Overlay dialog box, Google Earth gives you some information about the image. Click OK in that box to close it. You see a MODIS image of wildfires burning in Siberia. (More information about this is available through NASA’s Earth Observatory, at https://earthobservatory.nasa.gov/images/78305/siberia-burns.) You may have to zoom out and pan around the imagery to see the whole region covered by the imagery. Question 3\nHow are the fires being shown in this MODIS imagery? How is the extent of the fires being tracked via MODIS? (Hint: What else is visible in the MODIS scene, aside from the fires?) In Google Earth Pro, in the Places box, right-click on the russia_tmo_2012170_fires_geo.tif file and choose Delete to remove it. Open the next GeoTIFF image to examine: newzealand_amo_2017317_geo.tif. This is a MODIS image of a phytoplankton bloom off the coast of New Zealand. (More information about this is available through NASA’s Earth Observatory Again, you may first have to zoom out to see the extent of the MODIS image and then zoom in to examine some of the details. Question 4\nHow is the phytoplankton bloom shown in this MODIS image (that is, what distinguishes the bloom from the surrounding ocean)? What is the approximate size of the bloom (in comparison with the New Zealand coastline)? In Google Earth Pro, in the Places box, right-click on the newzealand_amo_2017317_geo.tif file and choose Delete to remove it. 
Open the last of the GeoTIFF images to examine: irene_amo_2011238_geo.tif. In the dialog box that appears, click Scale. The MODIS image appears properly aligned in GEP. In the New Image Overlay dialog box, Google Earth gives you some information about the image. Click OK in that box to close it. This is a 2011 MODIS image of Hurricane Irene, which was a massively destructive storm. (More information about monitoring this storm is available at https://earthobservatory.nasa.gov/images/51931/hurricane-irene.) Again, you have to zoom out to see the extent of the imagery of Irene. Question 5\nWhy is MODIS used for monitoring weather and storms such as Hurricane Irene rather than other satellite systems we’ve discussed, such as WorldView-3 or Landsat 7 or 8? In Google Earth Pro, in the Places box, right-click on the irene_amo_2011238_geo.tif file and choose Delete to remove it. 12.3 Viewing Recent Earth-Observing Imagery with Google Earth Pro\n A system such as MODIS can image almost the entire Earth in a single day, and you can access very recent imagery from it. In Google Earth Pro, from the File pull-down menu select Open and then navigate to the Chapter 12 folder and open the KML file called Daily_MODIS_May3. When the file opens, GEP begins zooming very closely to Earth; you might want to stop it in mid-zoom and then zoom out to see the entire United States in Google Earth. This KML file consists of a series of MODIS images from May 3, 2017, that cover nearly all of the United States. Pan and zoom across the imagery to see the entire area covered by these MODIS images. Question 6\nBy examining the imagery, you can see that overlapping images were taken and that small pieces of imagery are missing. Approximately how much area of the United States is covered in one MODIS swath? Why do you think there are pieces missing from the imagery? These MODIS images were downloaded from the University of Wisconsin’s Space Science and Engineering Center’s MODIS Today online tool. To see what’s happening with current MODIS imagery, point your Web browser to http://ge.ssec.wisc.edu/modis-today. When the Website opens, you see the currently available real-time MODIS imagery. Click on the radio buttons for Terra and Aqua to see what type of coverage of MODIS imagery is currently available for each one (as both satellites carry MODIS instruments). Choose either Terra or Aqua, based on which one gives you the better overall coverage for today’s date. If neither is acceptable, click on the Previous Day button and try again with the two satellites. When you have a good MODIS image of the United States to work with, click on Open in Google Earth. If you are prompted with a dialog box, choose to open the file in Google Earth Pro. If not, save the KML to the Chapter 12 folder and then manually open it in Google Earth Pro. You’ll see in Google Earth Pro’s Places box, under Temporary Places, the name of the initial KML you added (t1.17122) placed at the top of the list and then the second KML you added directly from the Website (shown in the graphic above as aqua_today.kml) placed second at the bottom. When you add KML/KMZ files to Google Earth Pro, the order of priority for displaying them goes from top to bottom in the Places box. So, whatever is at the bottom of the stack gets displayed first, then the next up from the bottom is displayed on top of it, until whatever is at the top of the stack gets displayed over everything else. (This will be important in the next section of the lab.) 
You can turn layers on and off by checking and unchecking their boxes. For now, display only the new KML file you downloaded from the Website. Question 7\nBased on your examination of today’s MODIS imagery, are there any notable weather patterns you can see forming, such as large storms? How much of the United States is visible and not clouded over? Turn off both of the MODIS images. 12.4 Using the NASA Earth Observations (NEO) Web Resources\n In your Web browser, go to http://neo.sci.gsfc.nasa.gov. This is the Website for NEO (NASA Earth Observations), an online source of downloadable Earth observation satellite imagery. In this portion of the lab, you’ll be using NEO’s imagery in KMZ format in conjunction with Google Earth Pro. Rather than provide you with only a view of a flat satellite image, NEO gives you the option of examining the imagery draped across a global view in GEP. Click on the Atmosphere option at the top of the page. Select the option Carbon Monoxide. You see an image showing the global carbon monoxide concentration for one month, collected using the MOPITT instrument aboard Terra. (See the About this dataset section of the NEO Website for more detailed information.) Select 2016 for the year and then choose the option June 2016. Select Google Earth from the Downloads File Type pull-down menu and click on the option 1440 × 720 to begin the download of the KMZ file. If prompted, open the file in Google Earth Pro. If you had to download the KMZ file, locate it on your computer and manually open the file in GEP. Rotate Google Earth Pro to examine the MOPITT imagery draped over the globe. Make sure there’s a checkmark in Google Earth Pro’s Borders and Labels box. Question 8\nGeographically, where were the highest concentrations of carbon monoxide for this month? Turn off the MOPITT Carbon Monoxide image in Google Earth Pro. Back on the NEO Website, select the Energy tab and then select Land Surface Temperature [Day]. Choose the option for 2019 and then select February. Use Google Earth for the Downloads File Type option and click on the 1440 × 720 option. If prompted, open the file in Google Earth Pro. If you had to download the KMZ file, locate it on your computer and manually open the file in Google Earth Pro. Rotate Google Earth Pro to examine the new MODIS image. Question 9\nGeographically, where were the lowest daytime land surface temperatures for this month? Question 10\nGeographically, where specifically in South America are the lowest daytime land temperatures for this month? Why are temperatures so low here when the rest of the continent has higher daytime land surface temperatures? (Hint: You may want to zoom in on some of these areas and then examine them with and without the MODIS imagery turned on.) Turn off the MODIS Land Surface Temperature image in Google Earth Pro. Back on the NEO Website, select the Ocean tab and then select Sea Surface Temperature 2002 + (MODIS). Choose 2019 for the year and then select January. Use Google Earth for the Downloads File Type option and click on the 1440 × 720 option. If prompted, open the file in Google Earth Pro. If you had to download the KMZ file, locate it on your computer and manually open the file in Google Earth Pro. Rotate Google Earth Pro and zoom in to examine the new MODIS composite image. Question 11\nGeographically, where were the areas with the warmest sea surface temperatures on the planet for this month? 
(Hint: You may want to turn on the Grid functions in Google Earth Pro—under the View pull-down menu—to add further geographic context to help answer this question.) Question 12\nHow do the sea surface temperatures of the Atlantic Ocean off the coasts of Lisbon, Portugal; Saint Pierre, Newfoundland; London, England; and New York City, New York, all compare to one another (that is, rank them from warmest to coolest temperatures) for this month? In Google Earth Pro, turn off this MODIS image. Back on the NEO Website, select the Land tab and then select Snow Cover. Choose 2019 for the year and then select February. Use Google Earth for the Downloads File Type option and click on the 1440 × 720 option. If prompted, open the file in Google Earth Pro. If you had to download the KMZ file, locate it on your computer and manually open the file in Google Earth Pro. Rotate Google Earth Pro and zoom in to examine the new MODIS image, showing snow cover on Earth. Question 13\nGeographically, where were the greatest concentrations of snow cover in the southern hemisphere in February 2019 (other than Antarctica and the south polar region)? In Google Earth Pro, turn off this MODIS image. Back on the NEO Website, select the Land tab and then select Snow Cover. Choose 2018 for the year and then select August. Use Google Earth for the Downloads File Type option and click on the 1440 × 720 option. If prompted, open the file in Google Earth Pro. If you had to download the KMZ file, locate it on your computer and manually open the file in Google Earth Pro. Rotate Google Earth Pro and zoom in to examine this second MODIS image showing snow cover on Earth. Question 14\nGeographically, where were the greatest concentrations of snow cover in the northern hemisphere in August 2018 (other than the north pole and the Arctic)? At this point, you can exit Google Earth Pro by selecting Exit from the File pull-down menu. Wrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":29,"href":"/classes/geog111/labs/lab14/","title":"Lab 14 - Final Lab activity","parent":"Labs","content":"Future content in progress\n"},{"id":30,"href":"/classes/geog526/labs/lab14/","title":"Lab 14 - Final Lab activity","parent":"Labs","content":"TODO\n"},{"id":31,"href":"/classes/geog111/classoutline/","title":"Syllabus and course policies","parent":"Introduction to GEOINT -- GEOG111","content":"The creation of this course was funded through a course creation scholarship from the University of Kansas Intelligence Community Centers for Academic Excellence. This is a living document. Changes will be announced in class. The syllabus will supersede the blackboard course in the event of a conflict. Course Description Student Responsibilities Expectations Computers Computer recommendations Resources Academic Integrity Disabilities Course Evaluation Labs Exams Grade determination Course Description This course is an introduction to basic geospatial intelligence concepts and geospatial technologies. By blending broad brush conceptual lectures and hands-on experiences with mapping technologies, participants will learn how to identify, collect, and transform information about locations, people, objects, environments, events, and phenomena into digital representations of the world and generate end-products of geospatial analysis using modern (unclassified) tools. 
Overall, it contains four main parts: geospatial data and GPS, geographic information systems, remote sensing, and geospatial applications. Students will learn how to acquire and develop geospatial data as the sources for mapping, the skills of analyzing and interpreting spatial information, and how geovisualization can be used in addressing real-world problems in the Intelligence Community.\nSatisfies: Lab and Field Experiences (LFE), Natural Science (N), Part of the KU Core and the undergraduate certificate and minor in intelligence \u0026amp; national security studies \nStudent Responsibilities It is student’s responsibility to attend the lectures, do the readings, and finish lab assignments in a timely manner. All students should be prepared to participate in class discussion and answer questions when called upon. All students are expected to complete labs and exams on time. Labs are typically due at the beginning of the lab period one week after they are assigned, unless otherwise noted.\nExpectations Attend every Class/lab and ask questions! At times, the ArcGIS software can be very unforgiving, and GIS can seem like a foreign language. The last thing I want is for you to walk out of this class without understanding the material. I expect and hope that this will be a rewarding experience for everyone. We are available to answer questions and otherwise offer assistance during the office hours indicated above or by appointment. Additionally, if you have any questions, feel free to drop me an e-mail or use the slack channel (The link will be passed around on the signup sheet at the beginning of the year). I try to be diligent about responding to emails and by using slack you are likely to get a very rapid response.\nComputers The computers in the lab are not cheap to replace. Please do your part to lengthen their lives by not spilling coffee/soda/liquids of any kind on the computer or any of its parts. You do NOT need to go out and purchase a computer for this class, although I do recommend having a personal computer with ArcGIS installed.\nComputer recommendations ArcGIS is a demanding program, and a lower grade computer is not recommended. If you are looking to purchase a computer, I offer these general guidelines. You should aim for a 64-bit computer (x64), and aside from that I like to prioritize CPU \u0026amp; RAM \u0026gt; video \u0026amp; hard drive speed \u0026gt; hard drive space (External hard drives are really cheap, and as long as the computer has a USB port you are good to go). For the last few years I\u0026rsquo;ve deferred to the pre-built pc choices on https://buildmypc.net/ or https://pcpartpicker.com/. In general, I tend to stay away from the \u0026ldquo;gaming\u0026rdquo; laptops. For those prices, you can build a much more powerful desktop and get a lot more mileage out of it, but that\u0026rsquo;s just me. If you want to go that route however virtually all will meet your school related needs. My “Ideal” build (This is excessive, visualization grade)\nResources The lab is open for students whenever there is no scheduled class (see the calendar posted on the lab door. Additionally, ArcMap is available in the GIS lab on the second floor of Watson Library, and on the Architecture and Engineering computers if you have access to those. A signup sheet will also be passed around at the beginning of the semester for those who wish to install ArcGIS on your own personal computers.\nAcademic Integrity Cheating and/or plagiarism is not tolerated. 
Feel free to discuss the labs with your classmates, but all work you turn in should be your own, and it\u0026rsquo;s insert year here, how hard is it to give someone credit for their work? I use Zotero for my citation needs, as a KU student you have access to Endnote through Office 365, and in a pinch Cite This for Me will also push you across the finish line but you really should try to be more efficient with your time.\nDisabilities Anyone who has a disability that may prevent full demonstration of academic ability should contact me as soon as possible to ensure accommodations can be made to allow for full educational benefit.\nCourse Evaluation TODO - breakdown\nLabs There will be 14 computer labs which count for 50% of your final grade and each lab is equally weighted. Labs make up the bulk of your final grade because it is in the labs that you will gain the practical knowledge of how to use the geospatial technologies. Lab TA will help with your questions. Lab assignments are typically due in a week and should be submitted through Blackboard before the next lab meeting. Any assignment that is turned in after the due time is considered late. The penalty for a late assignment is based on the number of days late (including weekends) and will be penalized 10% per day.\nExams Mid-term exam will cover the materials from the first half of the class, and the final exam will focus on the second half. The final exam is not comprehensive, but since we will be building on concepts throughout the semester, materials from the first half of the semester will inevitably be part of the final exam. Either exam may include a practical portion. This practical portion will consist of instructions to prepare some sort of short analysis or demonstration of practical knowledge related to geointellegence. These will be posted either a week before the exam and due on the date of the exam, or posted the day of the exam and due a week after.\nGrade determination KU uses a 10 point grade scale with +/- options (sans A+). As material is graded you may use the following [grade sheet helper](TODO - URL here) to determine what your grade in the course is. At the end of the term I export the blackboard gradebook and put it into this sheet to submit grades, so you can periodically check your grade this way should you have concerns. Letter grades are determined based on the following grading scale:\n A A- B+ B B- C+ C C- D+ D D- F 100 - 93 92 - 90 89 - 87 86 - 83 82 - 80 79 - 77 76 - 73 72 - 70 69 - 67 66 - 63 62 - 60 \u0026lt;60 "},{"id":32,"href":"/classes/geog358/classoutline/","title":"Syllabus and course policies","parent":"Introduction to GIS -- GEOG358","content":"This is a living document. Changes will be announced in class. Course Description Student Responsibilities Expectations Computers Computer recommendations Resources Academic Integrity Disabilities Course Evaluation Labs Cartography primers Exams Grade determination Course Description This course provides an introduction to computer based analysis of spatial data. Topics covered include the fundamentals of geospatial thought, basic principles of collecting, storing, analyzing, and displaying spatial data, and cursory treatment of analysis and decision making. Emphasis is on problem solving activities using common spatial analytical techniques using typical applications as examples. The student will gain extensive hands on experience with state of the art GIS software in the lab. 
This course introduces the concepts and components of a geographic information system (GIS). It also teaches the essential skills of operating a functional GIS through the use of the ArcGIS software suite. By completing this course, students will understand the operational processes of spatial data acquisition, editing in ArcGIS Desktop with QA/QC, metadata development, geodatabase design, spatial query and display, spatial analysis and modeling, preliminary GIS application development, cartographic mapping and dynamic visualization, and GIS implementation basics. Students will also be exposed to Google Earth and common Free and Open Source Software (FOSS) tools, as well as the basic concepts of remote sensing and Global Positioning System (GPS). GIS technology has broad applications in natural and social sciences, humanities, environmental studies, engineering, and management. Examples include wildlife habitat study, urban and regional planning, contagious disease monitoring, agriculture and forestry, environmental quality assessment, emergency management, transportation planning, consumer and competitor analysis, and many more. This course will introduce a few selected use cases of GIS application in different disciplines. Finally, this class counts towards the Undergraduate certificate in GIS. I obviously recommend it, particularly if you intend to pursue GIS as a career path/toolset in any capacity. A certificate in GIS shows employers you have the critical skills necessary to perform in the public, private, and academic settings, and counts towards the professional certificate: the GISP (Geographic Information Systems Professional). Fun fact: To earn your GISP, you need 30 “points”. A BS earns you 20. A Masters or PhD earns 25. The GIS certificate is worth 5 and sets you up for success in the written part of the examination.\nSatisfies: Lab and Field Experiences (LFE), Natural Science (N)\nStudent Responsibilities It is student’s responsibility to attend the lectures, do the readings, and finish lab assignments in a timely manner. All students should be prepared to participate in class discussion and answer questions when called upon. All students are expected to complete labs and exams on time. Labs are typically due at the beginning of the lab period one week after they are assigned, unless otherwise noted.\nExpectations Attend every Class/lab and ask questions! At times, the ArcGIS software can be very unforgiving, and GIS can seem like a foreign language. The last thing I want is for you to walk out of this class without understanding the material. I expect and hope that this will be a rewarding experience for everyone. We are available to answer questions and otherwise offer assistance during the office hours indicated above or by appointment. Additionally, if you have any questions, feel free to drop me an e-mail or use the slack channel (The link will be passed around on the signup sheet at the beginning of the year). I try to be diligent about responding to emails and by using slack you are likely to get a very rapid response.\nComputers The computers in the lab are not cheap to replace. Please do your part to lengthen their lives by not spilling coffee/soda/liquids of any kind on the computer or any of its parts. You do NOT need to go out and purchase a computer for this class, although I do recommend having a personal computer with ArcGIS installed.\nComputer recommendations ArcGIS is a demanding program, and a lower grade computer is not recommended. 
If you are looking to purchase a computer, I offer these general guidelines. You should aim for a 64-bit computer (x64), and aside from that I like to prioritize CPU \u0026amp; RAM \u0026gt; video \u0026amp; hard drive speed \u0026gt; hard drive space (External hard drives are really cheap, and as long as the computer has a USB port you are good to go). For the last few years I\u0026rsquo;ve deferred to the pre-built pc choices on https://buildmypc.net/ or https://pcpartpicker.com/. In general, I tend to stay away from the \u0026ldquo;gaming\u0026rdquo; laptops. For those prices, you can build a much more powerful desktop and get a lot more mileage out of it, but that\u0026rsquo;s just me. If you want to go that route however virtually all will meet your school related needs. My “Ideal” build (This is excessive, visualization grade)\nResources The lab is open for students whenever there is no scheduled class (see the calendar posted on the lab door. Additionally, ArcMap is available in the GIS lab on the second floor of Watson Library, and on the Architecture and Engineering computers if you have access to those. A signup sheet will also be passed around at the beginning of the semester for those who wish to install ArcGIS on your own personal computers.\nAcademic Integrity Cheating and/or plagiarism is not tolerated. Feel free to discuss the labs with your classmates, but all work you turn in should be your own, and it\u0026rsquo;s insert year here, how hard is it to give someone credit for their work? I use Zotero for my citation needs, as a KU student you have access to Endnote through Office 365, and in a pinch Cite This for Me will also push you across the finish line but you really should try to be more efficient with your time.\nDisabilities Anyone who has a disability that may prevent full demonstration of academic ability should contact me as soon as possible to ensure accommodations can be made to allow for full educational benefit.\nCourse Evaluation Course grade will be evaluated based on two exams, pre-class review questions, ~ten lab exercises, and a final project. The exams will include materials from the textbooks, lecture slides, handouts, and labs. A breakdown of grade weights is as follows:\n Exams (17.5% * 2) 35% Pre-class review 5% Labs 40% Final Project 20% Labs This is not a cartography course; however, the final product of the majority of the labs is a map (or maps) of some sort. The grade you receive on the maps will be based primarily on the results of the relevant analysis, not cartographic technique (there is a whole class dedicated to this; GEOG 211). That said, some level of cartographic knowledge will be useful to effectively communicate with maps, and will be stressed in critiques of your submissions. As the semester wears on we expect improvements and may begin taking points off for excessively bad cartographic technique. Please refer to the cartography resources linked below for some basic guidelines or ask for feedback. In addition, each lab will specify the steps to follow with regards to what to submit for grading and how to do so. Please don’t make assumptions. Follow these instructions carefully; failure to submit the correct files or submission of files in an incorrect fashion will result in drastic loss of points. 
Don’t turn a potential A into an F due to failure to follow directions!\nCartography primers http://www.icsm.gov.au/mapping/cartographic.html https://frew.eri.ucsb.edu/private/ESM263/week/2/ESM-263-2017-02-Cartography_Basics.pdf http://colorbrewer2.org/#type=sequential\u0026scheme=BuGn\u0026n=3 Exams Mid-term exam will cover the materials from the first half of the class, and the final exam will focus on the second half. The final exam is not comprehensive, but since we will be building on concepts throughout the semester, materials from the first half of the semester will inevitably be part of the final exam. Either exam may include a practical portion. This practical portion will consist of instructions to prepare some sort of short analysis or demonstration of practical knowledge related to the use of ArcGIS. These will be posted either a week before the exam and due on the date of the exam, or posted the day of the exam and due a week after.\nGrade determination KU uses a 10 point grade scale with +/- options (sans A+). As material is graded you may use the following [grade sheet helper](TODO - URL here) to determine what your grade in the course is. At the end of the term I export the blackboard gradebook and put it into this sheet to submit grades, so you can periodically check your grade this way should you have concerns. Letter grades are determined based on the following grading scale:\n A A- B+ B B- C+ C C- D+ D D- F 100 - 93 92 - 90 89 - 87 86 - 83 82 - 80 79 - 77 76 - 73 72 - 70 69 - 67 66 - 63 62 - 60 \u0026lt;60 "},{"id":33,"href":"/classes/geog526/classoutline/","title":"Syllabus and course policies","parent":"Remote Sensing -- GEOG526","content":"This is a living document. Changes will be announced in class. Course Description Student Responsibilities Expectations Computers Computer recommendations Resources Academic Integrity Disabilities Course Evaluation Labs Annotated bibliography Writing assignment Exams Grade determination Course Description This course emphasizes the understanding of the aerospace remote sensing foundations and the use of remote sensor data and image interpretation and processing techniques for environmental and urban applications. Specifically, the course will cover concepts and foundations of remote sensing, aerial photography and photogrammetry, visual image interpretation, characteristics of various sensing systems (i.e., multispectral, thermal, hyperspectral), and an introduction to digital image processing techniques. The primary objective of this course is to provide students with the conceptual foundations and the technical skills to apply remote sensing for problem solving in environmental and cultural domains.\nStudent Responsibilities It is student’s responsibility to attend the lectures, do the readings, and finish lab assignments in a timely manner. All students should be prepared to participate in class discussion and answer questions when called upon. All students are expected to complete labs and exams on time. Labs are typically due at the beginning of the lab period one week after they are assigned, unless otherwise noted.\nExpectations Attend every Class/lab and ask questions! At times, the ArcGIS software can be very unforgiving, and GIS can seem like a foreign language. The last thing I want is for you to walk out of this class without understanding the material. I expect and hope that this will be a rewarding experience for everyone. 
We are available to answer questions and otherwise offer assistance during the office hours indicated above or by appointment. Additionally, if you have any questions, feel free to drop me an e-mail or use the slack channel (The link will be passed around on the signup sheet at the beginning of the year). I try to be diligent about responding to emails and by using slack you are likely to get a very rapid response.\nComputers The computers in the lab are not cheap to replace. Please do your part to lengthen their lives by not spilling coffee/soda/liquids of any kind on the computer or any of its parts. You do NOT need to go out and purchase a computer for this class, although I do recommend having a personal computer with ArcGIS installed.\nComputer recommendations ArcGIS is a demanding program, and a lower grade computer is not recommended. If you are looking to purchase a computer, I offer these general guidelines. You should aim for a 64-bit computer (x64), and aside from that I like to prioritize CPU \u0026amp; RAM \u0026gt; video \u0026amp; hard drive speed \u0026gt; hard drive space (External hard drives are really cheap, and as long as the computer has a USB port you are good to go). For the last few years I\u0026rsquo;ve deferred to the pre-built pc choices on https://buildmypc.net/ or https://pcpartpicker.com/. In general, I tend to stay away from the \u0026ldquo;gaming\u0026rdquo; laptops. For those prices, you can build a much more powerful desktop and get a lot more mileage out of it, but that\u0026rsquo;s just me. If you want to go that route however virtually all will meet your school related needs. My “Ideal” build (This is excessive, visualization grade)\nResources The lab is open for students whenever there is no scheduled class (see the calendar posted on the lab door. Additionally, ArcMap is available in the GIS lab on the second floor of Watson Library, and on the Architecture and Engineering computers if you have access to those. A signup sheet will also be passed around at the beginning of the semester for those who wish to install ArcGIS on your own personal computers.\nAcademic Integrity Cheating and/or plagiarism is not tolerated. Feel free to discuss the labs with your classmates, but all work you turn in should be your own, and it\u0026rsquo;s insert year here, how hard is it to give someone credit for their work? I use Zotero for my citation needs, as a KU student you have access to Endnote through Office 365, and in a pinch Cite This for Me will also push you across the finish line but you really should try to be more efficient with your time.\nDisabilities Anyone who has a disability that may prevent full demonstration of academic ability should contact me as soon as possible to ensure accommodations can be made to allow for full educational benefit.\nCourse Evaluation Evaluation for this course is split evenly between the lab and lecture sections and is distributed as follows\n Lecture sessions: 50% Two exams (17.5% * 2) 35% Writing assignment 15% Lab sessions: 50% 11 Lab assignments (3.19 * 11) 35% Annotated bibliography (2.15 * 7) 15% Labs There will be 11 computer labs which count for 50% of your final grade and each lab is equally weighted. Labs make up the bulk of your final grade because it is in the labs that you will gain the practical knowledge of how to use the geospatial technologies. Lab TA will help with your questions. Lab assignments are typically due in a week and should be submitted through Blackboard before the next lab meeting. 
Any assignment that is turned in after the due time is considered late. The penalty for a late assignment is based on the number of days late (including weekends) and will be penalized 10% per day.\nAnnotated bibliography The students will complete 7 annotated bibliographic references during the semester. Approximately every two weeks, one article review will be submitted to your lab instructor. The first 3 bibliographies can be from commercial journals or web resources. The last 4 bibliographies need to use articles from peer-reviewed journals. Grading will be based on the quality of the review. The length should be ~300 words.\nWriting assignment This is an open book take-home essay assignment covering materials learned during the semester including lectures and readings. These will be 4-5 essay questions. This assignment is required to be submitted 3 days after the questions are online.\nExams There are two non-cumulative in-class exams in this course. Examinations will only cover material pertaining to lectures and reading assignments in lectures. A calculator is required for the exams.\nGrade determination KU uses a 10 point grade scale with +/- options (sans A+). As material is graded you may use the following [grade sheet helper](TODO - URL here) to determine what your grade in the course is. At the end of the term I export the blackboard gradebook and put it into this sheet to submit grades, so you can periodically check your grade this way should you have concerns. Letter grades are determined based on the following grading scale:\n A A- B+ B B- C+ C C- D+ D D- F 100 - 93 92 - 90 89 - 87 86 - 83 82 - 80 79 - 77 76 - 73 72 - 70 69 - 67 66 - 63 62 - 60 \u0026lt;60 "},{"id":34,"href":"/classes/geog558/classoutline/","title":"Syllabus and course policies","parent":"Intermediate GIS -- GEOG558","content":"This is a living document. Changes will be announced in class. The syllabus will supersede the blackboard course in the event of a conflict. Course Description Student Responsibilities: Expectations: Computers: Computer recommendations Resources Academic Integrity Disabilities Course Evaluation Labs Cartography primers: Exams Grade determination Course Description Building on the introductory GIS class, this course focuses primarily on teaching advanced spatial analysis methods with applications in the mapping of surface water and the analysis of water related environmental issues. Major topics covered include map algebra (also called cartographic modeling), multi-criteria evaluation, terrain analysis, spatial statistics, and spatial interpolation. An additional objective of the course is to have students become familiar with various methods and tools through the use of GIS software. Continuing with the industry standard GIS software, this class expands on GIS platforms through the use of the ArcGIS software suite. Finally, this class counts towards the certificate in GIS.\nStudent Responsibilities: It is student’s responsibility to attend the lectures, do the readings, and finish lab assignments. All students should be prepared to participate in class discussion and answer questions when called upon. All students are expected to complete labs and exams on time. Labs are due at the beginning of the lab period one week after they are assigned, unless otherwise noted. Detailed lab rules will be given by the teaching assistant in the first lab.\nExpectations: Attend every Class/lab and ask questions! 
At times, the ArcGIS software can be very unforgiving, and GIS can seem like a foreign language. The last thing I want is for you to walk out of this class without understanding the material. I expect and hope that this will be a rewarding experience for everyone. We are available to answer questions and otherwise offer assistance during the office hours indicated above or by appointment. Additionally, if you have any questions, feel free to drop me an e-mail or use the slack channel (The link will be passed around on the sign up sheet at the beginning of the year). I try to be diligent about responding to emails and by using slack you are likely to get a very rapid response\nComputers: The computers in the lab are not cheap to replace. Please do your part to lengthen their lives by not spilling coffee/soda/liquids of any kind on the computer or any of its parts. You do NOT need to go out and purchase a computer for this class, although I do recommend having a personal computer with ArcGIS installed.\nComputer recommendations ArcGIS is a demanding program, and a lower grade computer is not recommended. If you are looking to purchase a computer, I offer these general guidelines. You should aim for a 64-bit computer (x64), and aside from that I like to prioritize CPU \u0026amp; RAM \u0026gt; video \u0026amp; hard drive speed \u0026gt; hard drive space (External hard drives are really cheap, and as long as the computer has a USB port you are good to go). For the last few years I\u0026rsquo;ve deferred to the pre-built pc choices on https://buildmypc.net/ or https://pcpartpicker.com/. In general, I tend to stay away from the \u0026ldquo;gaming\u0026rdquo; laptops. For those prices, you can build a much more powerful desktop and get a lot more mileage out of it, but that\u0026rsquo;s just me. If you want to go that route however virtually all will meet your school related needs. My “Ideal” build (This is excessive, visualization grade)\nResources The lab is open for students whenever there is no scheduled class (see the calendar posted on the lab door. Additionally, ArcMap is available in the GIS lab on the second floor of Watson Library, and on the Architecture and Engineering computers if you have access to those. A signup sheet will also be passed around at the beginning of the semester for those who wish to install ArcGIS on your own personal computers.\nAcademic Integrity Simply put, cheating and/or plagiarism is not tolerated. Feel free to discuss the labs with your classmates, but all work you turn in should be your own, and it\u0026rsquo;s insert year here, how hard is it to give someone credit for their work? I use Zotero for my citation needs, as a KU student you have access to Endnote through Office 365, and in a pinch Cite This for Me will also push you across the finish line but you really should try to be more efficient with your time.\nDisabilities Anyone who has a disability that may prevent full demonstration of academic ability should contact me as soon as possible to ensure accommodations can be made to allow for full educational benefit.\nCourse Evaluation Course grade will be evaluated based on two exams, ten lab exercises, and a final project. The exams will include materials from the textbooks, lecture slides, handouts, and labs. Mid-term exam will cover the materials from the first half of the class, and the final exam will focus on the second half. 
The final exam is not comprehensive, but since we will be building on concepts throughout the semester, materials from the first half of the semester will inevitably be part of the final exam. A breakdown of grade weights is as follows:\n Midterm exam 20% Final exam 20% Labs 40% Final Project 20% Labs This is not a cartography course; however, the final product of the majority of the labs is a map (or maps) of some sort. The grade you receive on the maps will be based primarily on the results of the relevant analysis, not cartographic technique (there is a whole class dedicated to this; GEOG 211). That said, some level of cartographic knowledge will be useful to effectively communicate with maps, and will be stressed in critiques of your submissions. As the semester wears on we expect improvements and may begin taking points off for excessively bad cartographic technique. Please refer to the cartography resources linked below for some basic guidelines or ask for feedback. In addition, each lab will specify the steps to follow with regards to what to submit for grading and how to do so. Please don’t make assumptions. Follow these instructions carefully; failure to submit the correct files or submission of files in an incorrect fashion will result in drastic loss of points. Don’t turn a potential A into an F due to failure to follow directions!\nCartography primers: http://www.icsm.gov.au/mapping/cartographic.html https://frew.eri.ucsb.edu/private/ESM263/week/2/ESM-263-2017-02-Cartography_Basics.pdf http://colorbrewer2.org/#type=sequential\u0026scheme=BuGn\u0026n=3 Exams Mid-term exam will cover the materials from the first half of the class, and the final exam will focus on the second half. The final exam is not comprehensive, but since we will be building on concepts throughout the semester, materials from the first half of the semester will inevitably be part of the final exam. Either exam may include a practical portion. This practical portion will consist of instructions to prepare some sort of short analysis or demonstration of practical knowledge related to the use of ArcGIS. These will be posted either a week before the exam and due on the date of the exam, or posted the day of the exam and due a week after.\nGrade determination KU uses a 10 point grade scale with +/- options (sans A+). As material is graded you may use the following [grade sheet helper](TODO - URL here) to determine what your grade in the course is. At the end of the term I export the blackboard gradebook and put it into this sheet to submit grades, so you can periodically check your grade this way should you have concerns. Letter grades are determined based on the following grading scale:\n A A- B+ B B- C+ C C- D+ D D- F 100 - 93 92 - 90 89 - 87 86 - 83 82 - 80 79 - 77 76 - 73 72 - 70 69 - 67 66 - 63 62 - 60 \u0026lt;60 "},{"id":35,"href":"/classes/random/cart_prime/","title":"Cartography primer","parent":"Random bin","content":" Some of this content is gratefully pilfered and modified for my own needs from Elizabeth Wesley.\n Cartography, the art, science, and technology of map making, is a sequence of courses and a profession unto itself. I won\u0026rsquo;t even attempt to do it real justice here, but I will try and make a short primer you can follow to make a more interesting and visually appealing map. 
Use these rough steps to help guide your process.\nStep 1) Decide what it is the map is intended to communicate As a (primarily) visual medium, maps can be an effective means of communicating spatial phenomena and relationships in a concise and attractive manner. Therefore, in order to create an effective map we have to understand what it means to be an effective communicator. Having a clear idea of the map’s message is essential. Before you can decide how to say something you need to know what you want to say. What is your message? Who are you talking to? Use the 7 C\u0026rsquo;s of communication from open.edu to help frame and guide your internal monolog.\nStep 2) Select your elements of design Element selection Maps are visual representations of often complex ideas and relationships. In contrast, the visual language syntax you can deploy to communicate those ideas is comparatively sparse. In English-speaking locales, our eyes are trained to read things from left to right and top to bottom, and this training loosely translates to visual interpretation. Therefore, you need to make conscious decisions on how you vary elements to guide the viewer through the map so that they interpret the map’s meaning in the way that you intend. A few of the most common elements are outlined below from https://splitcomplementary.blogspot.com/2012/08/new-and-improved-elements-and.html\nPrinciples of design Likewise, these elements and their variations follow a few core principles of design. The principles of design are the ways that you can use the elements of design; they are the ways that you will vary the elements in order to convey the message of your map.\nMap Most maps have many elements in common. While not all maps need to contain all the elements, they all require some. Map elements provide information essential to the interpretation of the map. All maps should be able to stand on their own, meaning that a viewer requires no additional information beyond what is available on the map itself. If someone found your map on the ground, would they know what it meant?\n from https://csiss.org/cookbook/images/elements.gif\n Basics of a good map Some of the most critical elements needed for an effective, attractive, and honest map:\n Have an informative title. If your map shows all the counties in Texas that are predominantly Democratic, don’t name it ‘Map of Texas’. Name it ‘Predominantly Democratic Texas Counties’! Your legend does not need to be titled ‘Legend’. If you are showing Democratic counties in blue and Republican counties in red, your legend might be named ‘Political party affiliation’. Often, the legend does not require a title at all. (This one is a biggie; although it might not be the most offensive thing you see in a day, a legend titled \u0026ldquo;Legend\u0026rdquo; will piss most cartographers and geographers off to no end.) Make sure that the legend entries have informative names. Seeing ‘tx_counties_dem’ looks sloppy and may be confusing. Rename layers sensibly so that there is no ambiguity. Make full use of the space on the page. While you don’t want your map to be crowded, there is no reason to leave excessive white space around the map figure. Zooming in on the data of interest is often a good place to start. Experiment with different arrangements of your map elements. Not only should they fit well together, there is an information hierarchy that should be visually enforced. The title and the data you want to emphasize are the most important; they should stand out! 
Figure 4 from https://gistbok.ucgis.org/bok-topics/visual-hierarchy-and-layout\n You can also change the page layout to be either landscape or portrait, and changing the dimensions of an image is also an option. Depending on the shape of your area of interest, one may be more appropriate than the other. Make sure that your data is visible. If you are mapping points that represent all the schools in a Texas city, don’t make the points so large that you can’t make out their location. Likewise, don’t make them so small that they are difficult to see. Most scale bars are adjustable. You can change the number of subdivisions as well as the units. Make them simple and legible! They should be easy to use. Also, depending on the map projection (if it does not preserve distance) and the purpose of the map, a scale bar may not be needed. Likewise, a north arrow is not always needed. If the map projection does not preserve direction, a north arrow is inappropriate as north varies over the map. It is a good practice to include the source of your data on the map. The text should be small and unobtrusive, but it is important that you be transparent and show that your message is trustworthy. CHECK YOUR SPELLING! Although it\u0026rsquo;s insert current year here, you\u0026rsquo;ll continually find yourself surprised by just how many different platforms don\u0026rsquo;t have even a basic form of spellcheck built in. When in doubt, write the text out in Word and copy-paste it over. Before you export your map as an image, make sure to clear any selections you have made. As we move forward by learning about different kinds of maps there will be many more things to add to this list. However, these are basic principles that apply to every map that you will make. Examples These are examples of a basic GIS map. The primary purpose of this kind of map is to relay information; however, maps must be readable to get their point across!\nA bad GIS map The standard map a student might submit the first time they\u0026rsquo;ve ever opened GIS. It\u0026rsquo;s classically bad, but you\u0026rsquo;re learning so it\u0026rsquo;s forgiven. If this is what your final map for the class looks like, you\u0026rsquo;re in for a rude awakening :)\nA bad (but better) map A bad (but better) map. Typically indicative of a map made in a rush simply because a simple screenshot was not appropriate. If this is what your final map for the class looks like, you\u0026rsquo;re in for a rude awakening :)\nA good map This map could certainly be improved but includes all the elements necessary to interpret it.\nGreat maps resources you should actually look at In no particular order, here are some online resources that may be helpful in developing your cartographic skill. Just like public figures employ speech writers to ensure that their message resonates, mapmakers need to employ cartographic skills to ensure that their message is properly interpreted. 
A quality map is not only informative but also beautiful and inspiring!\nA great companion page from ESRI for more info (https://www.esri.com/arcgis-blog/products/product/mapping/design-principles-for-cartography/)\nCartography and Visualization—UCGIS\nWhen Maps Lie—City Lab\nGood and Bad Maps—Penn State\nESRI Free Cartography Course (online)—April 22-June 3, 2020\nMaps We Love—ESRI\nEdward Tufte—A master of data visualization\nhttps://unearthlabs.com/blog/making-maps-101/\nFinally, a couple of my all-time favorite maps Although these are not the kind of informational maps that are usually made in a GIS, they certainly illustrate the possibilities of cartography!\nNapoleon\u0026rsquo;s March to Moscow, Charles Joseph Minard, 1869, https://www.edwardtufte.com/tufte/ Figure 9: Napoleon’s March to Moscow, Charles Joseph Minard, 1869, https://www.edwardtufte.com/tufte/\nThe alluvial valley of the lower Mississippi River, Harold Fisk, 1944, http://www.radicalcartography.net/index.html?fisk Figure 10: The alluvial valley of the lower Mississippi River, Harold Fisk, 1944, http://www.radicalcartography.net/index.html?fisk\n"},{"id":36,"href":"/classes/geog358/","title":"Introduction to GIS -- GEOG358","parent":"Classes","content":"This is a living document. Changes will be announced in class. Instructor: Jim Coll\nOffice: 404C Lindley Hall\nEmail: jcoll@ku.edu\nOffice Hours: Monday and Wednesday 10:00 — 11:30 or by appointment Class Meetings: Monday 2:00 — 4:30 pm\nClass Room: Lindley 228 Labs: Wednesday or Thursday 11:00 — 12:50 pm\nLab Room: Lindley 310 Announcements and misc: Hello all!\nMy name is Jim Coll and I am your instructor for the semester. A few notes for you about how I run this course: Although the KU blackboard site will be the \u0026ldquo;official\u0026rdquo; site for this class and the place you submit all work to, I use this site here for the benefit of all and my own selfish desire to streamline my digital footprint. I will keep both sites as identical as possible when double posting material (e.g. the syllabus and course schedule), but in case of a conflict, treat this version as the most recent. Use the navigation table to the left to find the syllabus, labs, and other documents. I have made myself as available to you as I can, so feel free to find me via:\n Email Class slack channel Cornering me in the hallway Below you\u0026rsquo;ll find the course outline and slides as appropriate. I look forward to an exciting, productive, and stimulating semester with you all.\nBest,\nJim\nTentative Course Outline: To help us stay on track, I have pinned the semester schedule below with the relevant tasks, due dates, and other important institutional dates. Note that this is a living timeline subject to change. 
| Week | Date | Topic | Weekly Activity Due | Lab | University Dates |
|------|------|-------|---------------------|-----|------------------|
| Week 1 | 8/27 | Syllabus Day | | Spatial skills pretest \u0026amp; Guided Walkthrough | Last day for a full refund 8⁄29 |
| | 8/29 | What is GIS | | | |
| Week 2 | 9/3 | Computer basics | | Introduction to GISystems | |
| | 9/7 | Digital representations of the world | | | |
| Week 3 | 9/10 | Coordinate systems | | Projections and Coordinate Systems | |
| | 9/12 | Basics of Field Measurement | | | |
| Week 4 | 9/17 | Introduction to GPS | | Using GPS for field data collection | |
| | 9/19 | GPS activity | | | |
| Week 5 | 9/24 | Georeferencing | | On-screen digitization and image referencing | |
| | 9/26 | Digitizing | | | |
| Week 6 | 10/1 | Cartography | | Building a GIS database | |
| | 10/3 | Types of Maps | | | |
| Week 7 | 10/8 | Midterm review | | No Lab | |
| | 10/10 | Midterm | | | |
| Week 8 | 10/15 | No Class | | Selections Queries and Joins | |
| | 10/17 | Joining Data | | | |
| Week 9 | 10/22 | Advanced Selections | | Overlay \u0026amp; site suitability analysis | |
| | 10/24 | Overlay analysis | | | |
| Week 10 | 10/29 | Site suitability \u0026amp; Intro to Map algebra | | Introduction to network analyst and ArcScene | |
| | 10/31 | Terrain analysis | | | |
| Week 11 | 11/5 | Network analysis | | Interpolation and Fire Hazard Modeling | |
| | 11/7 | Spatial interpolation | | | |
| Week 12 | 11/12 | Applications: Census mapping | | Hands - on | |
| | 11/14 | Applications: TIN editing | | | |
| Week 13 | 11/19 | Applications: HAND flood mapping | | Hands - on | |
| | 11/21 | Project Help | | | |
| Week 14 | 11/26 | No Class | | No Lab | |
| | 11/28 | No Class | | | |
| Week 15 | 12/3 | Final Project presentations | | Open Lab | |
| | 12/5 | Final Project presentations | | | |
| Week 16 | 12/10 | Practical Final on your own | | Spatial skills test | |
| | 12/12 | Practical Final on your own | | | |
| Finals | 12/17 | In person final | | | |
"},{"id":37,"href":"/classes/geog526/labs/lab01/","title":"Lab - Internet Resources Exploration","parent":"Labs","content":"Learning Objective The objective of this laboratory exercise is to get students familiar with using the Internet as a learning and research resource for remote sensing. Specifically, you will browse some web sites for free remotely sensed data, image providers, professional conferences, societies, and remote sensing software vendors. These should help you search for remotely sensed data required for specific research topics or learning objectives. 
You will need to write short paragraphs to summarize the websites you visit.\nOutline: Learning Objective Submission requirements Introduction Navigating folder structures Remotely Sensed Imagery Sites: Remote Sensing Data Providers Professional Societies in Remote Sensing and Photogrammetry Remote Sensing Software Vendors Wrapping up Submission requirements There is nothing to submit for this week, but make sure you\u0026rsquo;ve followed along and set up your system correctly or you\u0026rsquo;ll have a rough time as we move through the semester.\nIntroduction Navigating folder structures The Internet is an important learning resource for the study of remote sensing. A huge amount of information pertaining to remote sensing can now be found on the Internet, including satellites and sensors, imagery products, remotely sensed data analysis software and hardware, research papers, technical reports, conference proceedings, educational materials, and professional societies. It is essential for remote sensing students today to be able to search for resources available on the Internet, and to download them if necessary.\nRemotely Sensed Imagery Sites: Kansas Data Access \u0026amp; Support Center (https://www.kansasgis.org/) EROS Data Center at USGS (https://eros.usgs.gov/find-data) Reverb - ECHO.NASA (https://reverb.echo.nasa.gov/reverb) EARTHDATA (EOSDIS) (https://earthdata.nasa.gov/) EarthExplorer (EE) (https://earthexplorer.usgs.gov/) The Global Change Master Directory (GCMD) (http://gcmd.gsfc.nasa.gov/) Global Visualization Viewer (GloVis) (https://glovis.usgs.gov/) NASA Visible Earth (http://www.visibleearth.nasa.gov/) Google Earth (http://earth.google.com) Question 1\nSelect three of the above sites and provide a short description (about 100 words) for each of them. Pay attention to the types of data, noting their geographic coverage, source, scale, and format (jpg, GIF,\u0026hellip;). Remote Sensing Data Providers With the web browser open, visit the following sites:\n GeoEye (spaceimaging) (http://www.geoeye.com/) Digital Globe (http://www.digitalglobe.com/) USGS Landsat Program (http://landsat.usgs.gov/) SPOT (http://www.spot.com/) Eurimage (http://www.eurimage.com/) Sentinel (https://sentinels.copernicus.eu/web/sentinel/home) Question 2\nSelect two of the above sites and provide a short description (about 100 words) for each of them. Pay attention to the types of imagery, noting their geographic coverage, resolution, and price (you just need to rate it as cheap, moderate, or expensive). Professional Societies in Remote Sensing and Photogrammetry With the web browser open, visit the following sites:\n International Society for Photogrammetry and Remote Sensing (http://www.isprs.org/) American Society for Photogrammetry and Remote Sensing (http://www.asprs.org) AAG Remote Sensing Specialty Group (http://www.aagrssg.org/) Remote Sensing \u0026amp; Photogrammetry Society (http://www.rspsoc.org/) IEEE Geoscience \u0026amp; Remote Sensing Society (http://www.grss-ieee.org/) Question 3\nSelect two of the above sites and provide a short description (about 100 words) for each of them. Pay attention to the nature, history, and other issues you deem appropriate. 
Remote Sensing Software Vendors With the web browser open, visit the following sites:\n ERDAS Imagine (http://www.erdas.com) PCI Geomatics (http://www.pcigeomatics.com/) ENVI (http://www.ittvis.com/index.asp) IDRISI (http://www.clarklabs.org/) eCognition (http://www.definiens.com//) Google Earth Engine (GEE) (https://earthengine.google.com/) Question 4\nSelect two of the above sites and provide a short description (about 100 words) for each of them. Wrapping up If you are on a PC in Lindley, make sure you save all your work and then log off. As the submission requirements outlines, there was nothing to submit for this lab, but make sure you do these steps. Note that although these settings should follow you should you move from PC to PC, take a quick second each time you log in to make sure that it\u0026rsquo;s set up as you expect (the OneDrive on the D: is the biggie)\n"},{"id":38,"href":"/classes/geog358/labs/lab00/","title":"Lab 00 - Intro to GIS","parent":"Labs","content":"Learning Objective This lab is here more for me as we walk through how to open and manipulate the interface, the meaning of icons, and how to navigate the world without getting lost. The questions here are more to remind me to talk about things while we explore. The instructions are not meant to be followed. You should use this page more as an opportunity to learn how the labs will be laid out, how the sections are formatted, and how to navigate the site. I\u0026rsquo;ll update this to a real lab at a later date. These labs are largely written for ArcMap 10.6, though operations from ArcMap 10.2 through 10.6 all largely look the same, although some of the tools may have shifted. The first handful of labs are designed to be followed as is, but as the semester progresses I expect you to flex those critical thinking skills of yours. Take a step back, read the learning objectives and ask yourself what steps would be needed to get to that goal. When in doubt, Google it! 
This lab will walk though how to open and manipulate the ESRI suite of tools and applications, common access patters, opening and manipulating windows, site layout, and how to submit things for the class.\nWhat you need to submit Hint: Copy and paste the questions below into a word document and submit them on blackboard\n Lab 0: Answer Sheet\nName:\nQuestion 1:\nQuestion 2:\nQuestion 3:\nQuestion 4:\nQuestion 5: What is the datum used in the coordinate system for the Hydrology feature dataset?\nQuestion 6: What is the linear unit of measure for this feature dataset?\nQuestion 7: What is the spatial extent (listed under Domain) of this feature dataset?\nXmin:\nYmin:\nXmax:\nYmax:\nQuestion 8: Which is the only feature dataset without coordinate system information?\nMaterials .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rBaseData\rData we downloaded as part of lab 1\r\r Links on these pages are green, and most of the time it should be obvious that something is a link. So how many of you missed the fact that the materials was a link?\n Part 1: The ESRI suite of tools An introduction to the software suite.\nArcMap ArcCatalog is an application for managing spatial data. The interface is similar to Windows Explorer and is used to copy, move, and delete data. In addition, ArcCatalog is used to obtain information about spatial data such as projection parameters, feature type (point, line, polygon), and attributes. Never use.\nArcAdministrator is where you control all the software, including extensions and licensing. Rarely used.\nArcScene is a stand alone program for small scale GIS areas, and where you want to more explicitly show off 3D surfaces. Rarely used.\nArcGlobe is ESRI\u0026rsquo;s attempt at a Google Earth substitute. Rarely used.\nArcMap is the primary means of performing GIS. Use often.\n ArcPro ArcPro is the web enabled variant of ArcMap. Because modern problems require modern solutions. ESRI is pushing hard to stop supporting ArcMap in order to force people to adopt to ArcPro, and most places who actually use GIS in a significant capacity have likely already switched or are in the process of switching. QGIS QGIS The creation of metadata is an integral part of a GIS. Metadata is basically “data about data.” Information contained in metadata includes, but is not limited to, the creation and modification date, projection information, extent, source, and reliability. ArcCatalog offers several style sheets for the creation of metadata including the Federal Government Data Committee (FGDC) style. The FGDC coordinates the National Spatial Data Infrastructure (NSDI), which is designed to facilitate the sharing of geographic data. More information about FGDC can be found at http://www.fgdc.gov/index.html\n1) Starting ArcCatalog and Organizing Data First you need to create a personal folder on the D: drive. 
This is where you will store your data for the semester. On the desktop, you should see a folder in the upper left-hand corner with your name on it. Open it, and then create a new folder called G358. Start ArcCatalog by opening the #-Programs folder on the desktop and then double-clicking ArcCatalog 10.5. In ArcCatalog, you will first need to connect to a folder. Click on the ‘Connect to Folder’ icon, then navigate to the data you downloaded. Be sure that the G358 folder is highlighted before clicking OK. In the Catalog Tree (the leftmost panel of the window), click the plus sign next to the connection you just made. Now right-click on Lab01 and select Copy. Click the Connect to Folder icon again and create a connection to your personal lab folder. In the Catalog Tree, find the connection to your lab folder, right-click the folder, and then select Paste. Now you have a copy of the lab data in your own folder. Click the plus sign next to your Lab01 folder. 2) Getting Familiar with the layout of ArcCatalog In the Catalog Tree, click the plus sign next to Lab01 in your folder. The contents of this folder are listed in the Catalog Display window: A database file (GreenvalleyDB.mdb) – a data format used by ArcGIS based on a relational database that contains geographic information. A map document (Greenvalley.mxd) – a page layout for spatial data that contains annotations and graphic elements. The map document can either be printed or saved in a graphic format (.jpg, .tif, .eps). A layer file (Water Use.lyr) – contains symbology settings for water use in Greenvalley. These settings tell ArcMap how to display the data and can also be a shortcut to the original data stored elsewhere on the hard drive. Take a moment to familiarize yourself with the layout of ArcCatalog; you will want to refer back to it throughout the lab if you get confused. ArcCatalog is similar in structure to Windows Explorer \u0026ndash; on the left-hand side is a view of the Catalog \u0026ldquo;tree\u0026rdquo; showing how the data is organized. The right-hand side provides options for exploring the contents of the data shown in the Catalog tree. Outside of the areas for exploring the catalog folders and their contents, there are several menus and toolbars associated with ArcCatalog. We will explore a number of these in this lab; however, we will not examine all of them. Throughout the lab, it will be helpful if you spend a bit of time exploring these on your own as they become active. To find out what a particular button does, hold your mouse cursor over the button for several seconds. A note will appear telling you the function associated with the button. 3) Exploring the geographic data, map documents and layers Change the settings of Details view: The default settings for the Details view in the Catalog Contents window only show the Name and Type of feature. We want to obtain more information about the files we are using before continuing. Select Customize | ArcCatalog Options… and navigate to the Contents tab. Place check marks beside Size and Modified. Also, place a check mark beside # of Features under the metadata list box. Click Apply and then click OK. Question 1\nWhat is the function of each of the following buttons? (Use your mouse \u0026amp; the tool tips if you are unsure) Question 2\nWhat is the file size for Greenvalley.mxd? ______ KB Question 3\nHow many features are contained in the parks_polygon feature class? 
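If you are curious, the same feature count can be read without clicking through ArcCatalog. Below is a minimal arcpy sketch; the geodatabase path and the feature-dataset name are assumptions, so adjust them to wherever you copied the lab data.

import arcpy

# Assumed location of the lab geodatabase; point this at your own copy.
gdb = r"D:\YourName\G358\Lab01\GreenvalleyDB.mdb"

# The feature-dataset name (Parks) is assumed; check it in the Catalog Tree.
fc = gdb + r"\Parks\parks_polygon"
count = int(arcpy.management.GetCount(fc)[0])
print("parks_polygon contains {} features".format(count))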
In order to preview the map document, single-click the Map document (Greenvalley.mxd) in the Catalog Tree and select the Preview tab to view a thumbnail of page layout. Notice that there is only one option available from the Preview: drop-down menu (Geography). This is because we can only view the spatial information contained within the map document. To preview a layer, single-click the layer file (Water Use.lyr) in the Catalog Tree to preview this file. The Preview: option defaults to Geography. Change this drop-down menu to Table. A table is displayed in the Catalog Display window showing the attributes for the original data associated with the layer. Click the Contents tab and click on the geodatabase file named GreenvalleyDB.mdb To explore the geodatabase, click the plus sign next to the file GreenvalleyDB.mdb in the Catalog Tree. The contents of this geodatabase are now displayed in the Catalog Tree. The Hydrology, Parks, Public Buildings, etc… are called Feature Datasets and are simply a collection of Feature Classes (Points, Lines, Polygons) that share the same spatial reference. Explore feature datasets and feature classes. Click the plus sign next to Hydrology and note the content is a single Feature Class (in this case, a polygon) called flood_polygon. A thumbnail is displayed in the Catalog Display window. Question 4\nWhat is the projection used in the coordinate system (PCS) for the Hydrology feature dataset? Click the Preview tab to view the geographic information stored in the flood_polygon class. Switch to the Table option to see the attributes. Return to the Contents tab. In the Catalog Tree, select Public Utility. Notice that this Feature Dataset contains three Feature Classes (a point, line and polygon feature class). Click the plus sign for this Feature Dataset. Select the Preview tab and view the Geography and Table for each of these feature classes. 4) Finding coordinate system information Information regarding the projection and its parameters (coordinate system, datum, spheroid, etc…) can be obtained for each Feature Dataset. Right-click on the Hydrology Feature dataset and select Properties… from the drop-down menu. Scroll through this information to become familiar with some of the parameters that are stored within this Feature Dataset. Question 4\nWhat is the projection used in the coordinate system (PCS) for the Hydrology feature dataset? Close the Properties window. 5) Exploring and editing metadata ArcCatalog can also be used to create and search metadata. Under the Transportation feature dataset, single-click the streets_arc feature class. Select the Description tab to view the feature class metadata. To edit information, click the Edit button under the Description tab. Under each item, information can be added about the features contained within this feature class including tags, summary, description, etc. This can be used to identify the contents of the feature class, to locate the file, to find out who created it, and so on. Under the Tags section, add the following text: Street, street name Under the Summary section, add the following text: To store the street center lines of Greenvalley Click Save once you have made these changes. These changes should now be viewable under the Description tab Complete Question 10 on the assignment sheet Connecting to other sources of data Up to this point we’ve been accessing data that is stored on the lab computers. Now we’re going to use the Internet to access data from other sources. 
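(A quick aside before we go online: the projection, datum, and unit details you just looked up under the feature dataset Properties can also be read with a short arcpy sketch. The geodatabase path below is an assumption; substitute your own copy.)

import arcpy

# Assumed path to the Hydrology feature dataset inside the lab geodatabase.
hydro = r"D:\YourName\G358\Lab01\GreenvalleyDB.mdb\Hydrology"

sr = arcpy.Describe(hydro).spatialReference
print("Name:", sr.name)
print("Type:", sr.type)                  # "Projected" or "Geographic"
print("Datum:", sr.datumName)
print("Linear unit:", sr.linearUnitName)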
First we’re going to connect to the Kansas Data Access and Support Center (KansasGIS.org), a great service run by our friends over at the Kansas Geological Survey. Their data are stored on servers that run on ArcGIS Server. We can access these data in ArcCatalog by using an ArcGIS Server connection. In the Catalog Tree, about halfway down you’ll see GIS Servers. Click the plus sign next to it. Double-click Add ArcGIS Server. Make sure the radio button is set to Use GIS services. Click Next. In the Server URL field, type: http://services.kansasgis.org/arcgis/services Click Finish. Click the plus sign next to arcgis on services.kansasgis.org (user). Click the minus sign next to DASC_Boundaries. Click the plus sign next to KDOR. Click the first CountyOrkaStatus. Click the Full Extent button (the little globe near the top of the screen). Click the Identify button (the blue circle with an \u0026ldquo;i\u0026rdquo; in it). Click different counties to see their attributes. Scroll through the Identify Results window. Answer question 11 on the assignment sheet. Now we’re going to connect to a different server at a different organization through a different process. (It’s different, see?) This time we’re connecting to the servers at the National Oceanic and Atmospheric Administration, a government agency that runs the National Weather Service and other related services. Instead of using an ArcGIS Server connection, though, we’re going to use WMS (Web Map Service). According to Wikipedia, WMS is a “standard protocol for serving georeferenced map images over the Internet that are generated by a map server using data from a GIS database.” Unlike ArcGIS Server, WMS is an open source protocol, which means it’s not connected to any proprietary software.\n In the Catalog Tree, double-click Add WMS Server. In the URL field, type: http://maps.ngdc.noaa.gov/arcgis/services/hot_springs/MapServer/WMSServer? Click OK. Click the plus sign next to hot_springs on maps.ngdc.noaa.gov. Click the plus sign next to Thermal Springs in the United States. Click the plus sign next to Layers. Click Hot Springs. Click the Full Extent button (the little globe). Answer question 12 (last question!) on the assignment sheet. This final step contains detailed instructions for turning your work in. We will do this frequently throughout the semester so be sure to pay attention to the process. Scroll so that your screen looks like the following (so that it displays the changes you made in metadata). This also should bring the ArcCatalog to be captured to the foreground on your computer screen or make the ArcCatalog active window. Locate the PrtScrn key (Print Screen) on your keyboard (see sample keyboard for a guide). Press the Alt + PrtScrn key – this key is used to capture (or take a picture of) your screen. For the process described below, you can use the other image editor rather than Paint as you wish. Now open Paint by going to Start | All Programs | Accessories | Paint On the Paint toolbar, click the Paste button under the Home tab – this will paste a picture of your screen into Paint Now choose Save As… Navigate to your folder Under File Name: enter lab01yourname.png (i.e.: lab01Bruce.png) From the Save as Type drop down box, select PNG Close Paint and exit from ArcCatalog To upload the file, simply launch Microsoft Internet Explorer (or Firefox if available) and log into Blackboard. 
Click on Assignments | Lab 01 | Submit Lab 01 Fill in any appropriate information, and click Browse… to browse, find, and upload your .PNG file from your personal workspace. Finally, click Submit. I will now be able to view your .PNG through the Blackboard webpage. Question 11\nWhat is the 2010 population for Douglas County? Question 12\nThe dots signify hot springs on American soil. Where in the U.S. do hot springs seem to be concentrated? "},{"id":39,"href":"/classes/geog358/labs/lab01/","title":"Lab 01 - Intro to GISystems","parent":"Labs","content":" This lab is a gratefully modified version of a lab from Lee Hachadoorian\u0026rsquo;s github page\n Learning Objective To introduce ArcGIS and its basic data management, display, and analysis components. To learn how to work with map layers, query attribute tables and geographic features, create maps, and design and execute a simple GIS analysis. As a demonstration, you will map and analyze data from Philadelphia, Pennsylvania.\nOutline: Learning Objective Submission requirements Tutorial Acquiring the Data Opening ArcMap and Adding Map Layers What are ArcMap documents? (.mxd files) Saving a Map Document Opening your ArcMap Document (.mxd file) Remove Map Layers from Map Document Organizing GIS data Viewing Shapefiles between Folders Repairing Broken Data Links Using Relative Paths A handful of basic operations Changing the Symbology of Map Layers Change Color Change Point Glyph (Symbol) and Color Using the Tools Toolbar to Navigate and Get Information Zoom and Pan Full Extent, Previous Extent, and Next Extent Identify Features Measure Distances and Areas Selecting things in ArcMap Working with Attribute Tables Opening the Attribute Table Select by Attribute (Attribute Query) Manual Selection Attribute Query Graphical Selection Select by Location (Spatial Selection) Getting Statistics for Selected Records Using Selected Features in Another Selection Operation Creating a New Map Layer from a Selection of Features Designing and Exporting a Map Working in Layout View Exporting a Map Image Submission Submission requirements Tutorial Acquiring the Data Create a workspace, which is a technical way of referring to a project folder, the folder where you store your data files, including both inputs and outputs. Thus, when you see a reference to \u0026lsquo;your workspace\u0026rsquo; here, you can think of it as \u0026ldquo;your Lab1 data folder\u0026rdquo;. Using the term \u0026ldquo;workspace\u0026rdquo; is more concise, and when you become a more advanced GIS user, setting the workspace becomes an important part of automating complex processes. If you have a flash drive, create a new workspace folder named \u0026lsquo;Lab1\u0026rsquo;. If you do not have a flash drive, create a new folder in the Documents folder. First we will acquire the following spatial data layers for the city of Philadelphia: schools, bike routes, and neighborhoods. We will download data from PASDA (The Pennsylvania Geospatial Clearinghouse), which is an example of a geoportal, a website that collects geospatial data from many different sources organized around a particular theme or region of interest. Unsurprisingly, PASDA hosts data related to Pennsylvania, and is maintained by Pennsylvania State University. PASDA hosts data provided by local governments, federal and state agencies, nonprofits, and academic institutions. Go to http://www.pasda.psu.edu. Under SEARCH BY KEYWORD(S) enter \u0026lsquo;Philadelphia\u0026rsquo; and press submit. 
Find 2016 Philadelphia Planning - Schools, click on the link and press Download, which will download a file. In addition to schools, find and download the following: 2016 Philadelphia Streets - Bike Network \u0026amp; 2016 Philadelphia Planning - Neighborhoods (Check step) You should have downloaded and be able to locate 3 files: PhillyPlanning_Schools.zip PhillyPlanning_Neighborhoods.zip PhillyStreets_Bike_Network.zip Notice all 3 files are .zip files; a .zip file is a single file that bundles one or more individual files that have been compressed to a smaller storage size for convenience. The lab computers will automatically save the zip files to the Downloads folder. Copy or move these files to your Lab1 workspace folder. Unzip the archive. The lab computers have 7-Zip installed on them. In File Explorer, right-click the file and choose 7-Zip → Extract to \u0026ldquo;folder_name\u0026rdquo;. If you don\u0026rsquo;t have that, you really should download it, but for this lab you can get by with the default Windows extractor: right-click the file and choose Extract all.\n Unzip each of the .zip files using the 7-Zip application (or another appropriate application) and extract the files to your workspace folder. Opening ArcMap and Adding Map Layers ArcMap Open ArcMap. The New Document dialog will appear. Click OK to begin with a blank map. Note: When you are starting with a new map, it doesn\u0026rsquo;t really matter whether you close this dialog by clicking OK, Cancel, or the X at the upper right.\n Click the Add Data button. In the Add Data browser window click the \u0026lsquo;Connect to Folder\u0026rsquo; button, navigate to your workspace folder, and press OK. Navigate into your workspace folder and add the neighborhoods, schools, and bike network data layers to ArcMap. After adding the files, it should look something like this (the colors may be different): Table of Contents (TOC) The left panel in ArcMap is called the \u0026ldquo;Table of Contents\u0026rdquo;. The TOC lists all the map layers found in the map window and shows what the geographic features in each map layer represent (points, lines, polygons, images, tables). The table of contents helps you manage the display order of map layers and symbols used to represent layers, as well as set the display and other properties (e.g. colors, line thickness) for each map layer. It is also the place where you can easily see the drive paths where your files are physically located on the computer (e.g., C:\\temp\\mygisfiles).\nIt is very important to remember where your files are located when you are working on a GIS project. In a GIS environment you will be doing a lot of data processing which requires the creation of new files, which must be saved. If you don\u0026rsquo;t know where you saved a file, your instructor is not going to know either. Hint: click on Geoprocessing \u0026gt; Results and expand the topmost tool to see where ArcMap last wrote something out.\n At the top of the TOC is a toolbar that controls how the layers are listed, and what you can do to them. You can hover over the buttons to see tooltips for their function:\n List By Drawing Order List By Source List By Visibility List By Selection Options\n Select the List By Drawing Order button (if it is not already selected) Try changing the order of the map layers. Click, hold, and drag the neighborhoods layer to the top. Notice that now that it is \u0026lsquo;on top\u0026rsquo; it visually obscures the other layers (schools and bike network) below it. 
Drag and drop the schools and bike network layers to the top again. Select the List By Source button. Note the path, i.e. the drive and folder hierarchy, within which each layer is stored. You can also turn layers on and off (i.e. make them visible or not) by simply checking or unchecking the box next to the layer name in the Table of Contents. What are ArcMap documents? (.mxd files) ArcMap documents (which have the .mxd extension) allow desktop users to save and share their GIS project with other desktop users or reopen a project at a future time. After an ArcMap document is created and its various map properties are defined (map colors, projection), all of the properties of the GIS map are saved as part of the map document. Remember that the map document file (.mxd) only saves ArcMap document properties and the location of the map layers that are being used and does not save the geospatial data (that is, the data in the shapefiles or other geospatial data format). The ArcMap document will only reference the location where the data files are saved. If you move your data files to a different drive path or folder and then open your project, you will have to locate the files again.\nSaving a Map Document Click File, Save. Navigate to your workspace and save the project as Lab1.mxd. Click Save. Close the ArcMap software completely. Opening your ArcMap Document (.mxd file) As with Microsoft Word or other Windows applications, if you have an existing map document, you can open it directly from the desktop. You do not have to have ArcMap already open.\n In File Explorer, navigate to your map document file (Lab1.mxd). Double-click the file to launch ArcMap and open your existing document. Again, as with other Windows applications, you can open an existing map document from the File menu. The File menu will also list recently opened documents. Remove Map Layers from Map Document Right-click Neighborhoods and click Remove from the popup menu. Close ArcMap completely. Do not save your changes. Organizing GIS data One of the most challenging parts of learning GIS is organizing your files and understanding where they are stored. You will find out in the next few weeks that when you perform spatial operations and/or geoprocess your data, ArcGIS often creates new files. You need to stay organized (check your paths and know where the data are being stored). The faster you get this concept, the quicker you will learn the more important GIS concepts.\nThere are many different spatial data formats. Vector data (points, lines and polygons) are commonly stored in shapefiles. A shapefile is a simple, nontopological format (topology will be explained later this term) for storing the geometric location and attribute information of geographic features. Geographic features in a shapefile can be represented by points, lines, or polygons.\nThe shapefile format defines the geometry and attributes of geographically referenced features in three or more files with specific file extensions that should be stored in the same folder. Each file must have the same base name in order for ArcGIS to recognize them as part of the same spatial layer, for example, schools.shp, schools.shx, and schools.dbf.\nThe following are common shapefile extensions. The first three (.shp, .shx, .dbf) are required by the shapefile standard:\n .shp\u0026mdash;The main file that stores the feature geometry; required. .shx\u0026mdash;The index file that stores the index of the feature geometry; required. 
.dbf\u0026mdash;The dBASE table that stores the attribute information of features; required. .prj\u0026mdash;The file that stores the coordinate system information. All spatial data use some CRS (coordinate reference system), but shapefiles are sometimes distributed without the PRJ file. How to deal with that will be demonstrated in a future exercise. .sbn and .sbx\u0026mdash;The files that store the spatial index of the features. .xml\u0026mdash;Metadata (\u0026ldquo;data about data\u0026rdquo;) for ArcGIS\u0026mdash;stores information about the shapefile, such as the source of the data, accuracy, publication date, time period, etc. Viewing Shapefiles between Folders This section aims to show you that shapefiles are composed of many files that you must transport all together in one group to another location. Additionally, it is important to note where all the files are stored for each project that you work on.\n Open File Explorer and navigate to your workspace folder and examine the schools shapefile. There should be 8 files with the base name PhillyPlanning_Schools. These files must be kept together to be maintained. THE LOSS OF ANY ONE OF THESE FILES CAN BREAK THE SHAPEFILE. Open ArcCatalog and navigate to your workspace so you can view the schools shapefile. Note that it appears as a single file in ArcCatalog, even though it is actually composed of 8 individual files. In ArcCatalog you can rename, copy, and move shapefiles, just as you can in File Explorer, and it will automatically maintain the integrity of all files that compose the shapefile. It is recommended to use the ArcCatalog windows to rename, copy, and move GIS data files (though it is still possible to use File Explorer for this purpose). Close ArcCatalog. Repairing Broken Data Links ArcMap and ArcCatalog should be closed when you begin this section. Your Lab1 workspace folder should have a map document named Lab1.mxd which is linked to data in that folder. We are going to break that link and learn how to repair it.\n We are going to break the link by moving the map document. It could also be broken by moving the data to a new folder, or, if the data were not in the same folder as the map document, by renaming the data folder. If you don\u0026rsquo;t have relative paths set (see next section), it can also be broken just by plugging your flash drive into a new computer and having it mount on a different drive letter. All of these are very common occurrences, so it is very likely that you will encounter broken data links when working with GIS.\n In File Explorer, move Lab1.mxd to a new location (this can be the desktop for ease of use) Double-click Lab1.mxd to open ArcMap. Notice that the map canvas is empty, and all of the layers have a red exclamation mark next to them. This indicates that a layer is defined in the map document, but the source data is not at the expected location. Any layer properties, such as the color of polygon features, are preserved, even if ArcMap can\u0026rsquo;t find the polygons to display!\n Right-click on any of the layers in the TOC. From the popup menu select Data → Repair Data Source.\u0026hellip; Navigate to the correct shapefile for the layer you clicked on. Select it and hit the Add button. Notice that ArcMap has repaired the link for not just that layer, but for all the other layers as well. The Repair Data Source feature will check whether other map document layers come from the same data source. 
In this case, since all of the shapefiles are in the same folder, all of the data links were repaired. Close ArcMap but do not save your changes. Move Lab1.mxd back into the Lab1 folder. Using Relative Paths A file location can be stored using an absolute or a relative path. In the real world, an absolute path would be a location specified in some address system, like \u0026ldquo;1801 N. Broad Street\u0026rdquo;. A relative path would be a location specified in relation to your current location, like \u0026ldquo;walk two blocks, turn left, walk three blocks, and go into the third building on the right\u0026rdquo;. When you add geographic layers and tables to ArcMap and save the project (.mxd), the software does NOT store the data. It stores the paths to the data. When you use absolute paths, the data must be at the exact same location in the filesystem. If you work on a lab computer and store your map document and data in C:\\temp\\mystuff, but go home and copy the data to C:\\other_folder\\mystuff, ArcMap will not find the data, even though it is right there. If you work on a flash drive that is mounted as drive letter E:\\, but then go work on another computer (or even return to the same computer) that happens to mount the flash drive as drive letter F:\\, ArcMap will not find the data. In order to make sure that you keep your map document connected to your data, you must:\n Use relative paths for the map document. Copy your map document (MXD) and data as a unit. This will be easiest if you keep the data in the same folder as or an immediate subfolder (perhaps named \u0026ldquo;data\u0026rdquo;) of the map document. Relative or absolute paths is a property of the map document, so you may choose to have some that use relative paths and some that use absolute paths. Check the path settings as follows:\n Reopen your Lab1.mxd map document. Click File → Map Document Properties. Check \u0026ldquo;Store relative pathnames to data sources\u0026rdquo; (if it is not checked) and Press OK. Note: Usually, working with relative paths is preferable.\n A handful of basic operations Changing the Symbology of Map Layers In ArcGIS it is very easy to change the colors and symbols of map layers. Let\u0026rsquo;s try changing the color and symbols of the map layers in your Lab1 map document\nChange Color Turn off the bike network and schools layers so only neighborhoods is showing. Double click the neighborhoods layer in the TOC. (You can also right-click on the layer and select Properties from the popup menu.) Click on the Symbology tab.\n Click the colored polygon symbol. Select a new \u0026lsquo;fill\u0026rsquo; color and select a color from the color palette. Select \u0026lsquo;outline\u0026rsquo; color and select a new color. Make sure to choose a fill and outline color that makes it easy to see the individual neighborhoods. Click OK to see the new color settings. Change Point Glyph (Symbol) and Color Turn off all the layers except for the schools layer. Double click the schools layer. Click point symbol. Scroll down and change the glyph. Change the fill color. Make sure to choose a glyph and fill color that makes it easy to see the individual schools. Click OK. Using the Tools Toolbar to Navigate and Get Information ArcMap has many toolbars, which can be toggled on and off in the Customize menu. 
A default installation will start with the following two toolbars visible:\n Standard\u0026mdash;This has file management features and actions common to almost all Windows software, including opening and saving files, cut-copy-paste, and undo-redo. We have already used it to add spatial data to our map document. Tools\u0026mdash;This has a number of tools commonly and specifically used for working with geospatial data. We will explore it now. Hint: Hover the mouse over each of the different buttons to see their name and description.\n Zoom and Pan Using the \u0026lsquo;zoom in\u0026rsquo; or \u0026lsquo;zoom out\u0026rsquo; tool you can navigate to and from close-up views. The pan tool shifts the display in any direction without changing the scale of the map.\n Try the zoom tools Try the pan tool Full Extent, Previous Extent, and Next Extent The map extent tools allow you to navigate through map extents. A map extent defines the geographic boundaries for displaying map layers within a data frame. Try the different map extent tools.\nIdentify Features The identify tool allows you to quickly get attribute information about a geographic feature. Let\u0026rsquo;s try to get info about the neighborhoods.\n Turn off all the layers except the neighborhoods. Select the \u0026lsquo;Identify\u0026rsquo; tool and click on one of the neighborhoods. The Identify window will appear. This shows you the attribute values from the attribute table for that neighborhood, such as its name. If you are not getting info about the neighborhoods, then you will need to change the \u0026ldquo;Identify from\u0026rdquo; layer in the dropdown at the top of the window. Try using this tool on the other map layers. Note that when features are close together (e.g. a bike route near a school), the Identify tool may select several nearby features. Zoom in until you can select a single feature.\n Measure Distances and Areas Skill demonstration: Can you tell me how big Philadelphia is?\n Click the Measure tool on the Tools toolbar to open the Measure dialog box. When the measure menu opens, click on the units dropdown menu (its down arrow), select distance, and then miles. Click on the northeastern tip of Philadelphia, then drag the ruler to the southwestern tip and double click to take the measurement. Close the Measure tool. Selecting things in ArcMap Selecting and subsetting a database is one of the most fundamental operations one can do to data. In ArcMap there are several ways one can go about accomplishing this feat. This section will walk you through the three most useful ways to select something.\nWorking with Attribute Tables Along with feature classes (points, lines, and polygons) and image datasets (referred to more generally as raster datasets), tables represent one of the three key dataset types in GIS. Tables are used to store descriptive attributes about geographic features (e.g., census data for counties or census tracts, disease counts by health district).\nOpening the Attribute Table Turn off all layers except for schools. Right-click the schools layer in the TOC (table of contents) and select \u0026lsquo;Open Attribute Table\u0026rsquo; from the popup menu. The table containing the records for schools should open. The attribute field names appear at the top. Scroll across the table to view them all. 
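If the table is too wide to scroll through comfortably, you can also list every field name and type with a short arcpy sketch (the shapefile path below is an assumption; point it at your own workspace folder):

import arcpy

schools = r"C:\Users\you\Documents\Lab1\PhillyPlanning_Schools.shp"  # assumed path

# Print each attribute field and its data type.
for field in arcpy.ListFields(schools):
    print(field.name, field.type)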
Some important fields are: FACIL_NAME, which contains the name of the school; ENROLLMENT, which contains the number of students; GRADE_LEVE, which contains the grade level of the school; and TYPE_SPECI, which indicates whether the school is run by the\u0026hellip; District (Philadelphia School District), Private school, Charter school, Archdiocese (Catholic parish) school. Scroll down the table to view the records. Note the total number of records (rows) in the layer appears at the bottom (550 schools), along with a count of how many are currently selected, which at the moment should be 0. Open the attribute table for the bike network layer. Note the important fields: STREETNAME, which indicates the name of the street, and TYPE, which indicates the type of bike lane (buffered, conventional, sharrow, conventional with sharrow). Open the attribute table for the neighborhood layer. Note the important field: NAME, indicating the name of the neighborhood. Close all the attribute tables. Select by Attribute (Attribute Query) An attribute query helps users select or view certain data on the map based on that data\u0026rsquo;s attributes. It works the same way as a relational database like Microsoft Access.\n Structured Query Language (SQL) is a set of defined expressions and syntax used to query and manipulate data in relational database management systems (RDBMS). The American National Standards Institute (ANSI) defines a standard for SQL. Most RDBMSs use that standard and have extended it, making SQL syntax across different RDBMSs slightly different from one another. Query expressions are used in ArcGIS to select a subset of features and table records. Query expressions in ArcGIS adhere to standard SQL expressions.\n One way to select geographic features is directly from the attribute table. From a table, you can manually select records with the mouse pointer, or you could select those records that meet some criteria (SQL query). Once you\u0026rsquo;ve selected records in the table, you\u0026rsquo;ll see those features selected (highlighted) on your map.\nManual Selection Open the schools attribute table. Right click on the FACIL_NAME field and select \u0026lsquo;Sort Ascending\u0026rsquo;. This puts the schools in alphabetical order by name.\n Scroll down the table until you find the Dunbar, Paul Laurence school.\n Click on the row on the far left of the table to manually select that row (see circle in the picture below). Note the Dunbar school will be highlighted to show its selection \u0026ndash; in both the attribute table and in the map (it is adjacent to Temple University Main Campus). Note that at the bottom of the table it should indicate 1 out of 550 schools are selected.\n Clear the selection by clicking the Clear Selection button on the attribute table or in ArcMap.\n Attribute Query In the attribute table go to the Table Options button at the top left and choose Select by Attributes. A dialog box will pop up that allows you to make a query statement. We will create a query that states which schools will be selected based on some criteria, in this example, the school with the name Dunbar, Paul Laurence.\n In the Select by Attributes dialog, note that the beginning of a standard SQL query appears above the lower text box: SELECT * FROM PhillyPlanning_Schools WHERE. The WHERE keyword indicates that a criterion is about to be specified. 
The database engine will use this criterion to filter the records, that is, to select only those records which satisfy it.\n In the top box, double click \u0026quot;FACIL_NAME\u0026quot;. It should appear in the text box below.\n Click the equals (=) button, then click the Get Unique Values button.\n Scroll down in the list of school names and double click 'Dunbar, Paul Laurence'. Note that the column name appears in double quotes. Text values, such as the school name, appear in single quotes. You should form a query appearing in the text box at the bottom of the dialog box: Click Apply. The same school should be selected in the table and on the map.\n Close the Select by Attributes box and clear your selection.\n As another example, let\u0026rsquo;s select schools with enrollments over 500 students. Open Select by Attributes and delete the last query statement.\n Create a query statement by double clicking the ENROLLMENT field at the top, then the greater than (\u0026gt;) button, then type in 500. Note that 500 is a numeric value, and does not appear in quotes. Press Apply.\n The selection should show that 212 out of 550 schools have enrollments over 500.\n You can also create a compound query. Build a new query in the Select By Attributes dialog box query window by clicking the buttons so that it reads \u0026quot;TYPE_SPECI\u0026quot; = 'District' AND \u0026quot;ENROLLMENT\u0026quot; \u0026gt; 600. This will select only schools run by the Philadelphia School District (as opposed to charter, private, or Archdiocese schools) with enrollments greater than 600 students.\n Check: There should be 77 schools selected.\n Close the Select by Attributes box, clear your selection, and close the attribute table. Graphical Selection You can also select features interactively on the map. The Tools toolbar in ArcMap contains a tool for graphically selecting features. The \u0026lsquo;Select Features\u0026rsquo; tool works using a single graphic that you interactively digitize (draw) as part of the selection process.\n Click the Select Features button, then click \u0026lsquo;Select by Rectangle\u0026rsquo;. Using the cursor, draw a rectangle around a set of schools, say, the schools in West Philadelphia. Examine the selected records on the map and in the schools attribute table.\n You can also perform a graphical selection using a circle or other shapes.\n Experiment with different graphical selection options. When you are finished, clear your selection.\n Select by Location (Spatial Selection) You can also select features based on their spatial relationship to another feature, such as the distance between features or whether one feature contains another feature. As an example, we will select schools located within 250 feet of the bike network.\n Turn on the schools and bike network layers (neighborhoods should be off).\n From the menu at the top, choose Selection, then Select by Location. In the Select By Location dialog box choose:\n Selection method: Select features from Check the box next to PhillyPlanning_Schools (since you want to select schools) Source Layer: PhillyStreets_Bike_Network Spatial selection method for target layer feature(s): are within a distance of the source layer feature Apply a search distance: 250 Feet Press OK.\n You should see 140 of the 550 schools selected. 
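For reference, both kinds of selection you just performed (attribute and spatial) can be scripted. The sketch below is only a sketch: the shapefile paths are assumptions, and each call starts a fresh selection rather than combining the two.

import arcpy

# Assumed paths; point these at the shapefiles in your workspace folder.
schools = r"C:\Users\you\Documents\Lab1\PhillyPlanning_Schools.shp"
bikes = r"C:\Users\you\Documents\Lab1\PhillyStreets_Bike_Network.shp"

# Selections operate on layers, not directly on shapefiles.
schools_lyr = arcpy.management.MakeFeatureLayer(schools, "schools_lyr")

# Attribute query: the compound query above (district schools with more than 600 students).
arcpy.management.SelectLayerByAttribute(
    schools_lyr, "NEW_SELECTION", "\"TYPE_SPECI\" = 'District' AND \"ENROLLMENT\" > 600")

# Spatial query (starts a new selection): schools within 250 feet of the bike network.
arcpy.management.SelectLayerByLocation(
    schools_lyr, "WITHIN_A_DISTANCE", bikes, "250 Feet", "NEW_SELECTION")

print(int(arcpy.management.GetCount(schools_lyr)[0]), "schools selected")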
Open the attribute table for the schools layer to see how many schools are selected.\n Clear your selection so no schools are selected.\n Getting Statistics for Selected Records When exploring a table, you can immediately get statistics describing the numeric values in the columns. When you use the statistics tool you\u0026rsquo;ll see how many values the column has, as well as the sum, minimum, mean, maximum, and standard deviation of those values. A histogram is also provided showing how the column\u0026rsquo;s values are distributed.\nAs an example, we will get statistics from the schools attribute table:\n Clear all your selections so no schools are selected. Open the schools attribute table.\n Right-click the heading of the field Enrollment and select \u0026ldquo;Statistics...\u0026rdquo;\n On the Statistics dialog box, you\u0026rsquo;ll see information about the values in the field whose heading you clicked. Note the statistics, e.g. minimum, maximum, mean, etc. The sum is the sum of the row values in the Enrollment field for all the schools, i.e. the total number of students in all schools \u0026ndash; 231,981 students.\n Close the statistics box. Using Select by Attributes, select only the private schools. There should be 173 schools selected.\n Reopen the statistics for the Enrollment field. Note the Statistics tool operates only on selected rows (if any are selected). You can see the sum of enrollment for private schools is 15,860 \u0026ndash; this is the total number of students in private schools.\n Using Selected Features in Another Selection Operation You can also select features from a set of features which are already selected, or alternatively, select a set of features from a set of selected source features. We\u0026rsquo;ll use two examples for illustration.\nFirst, we will select features from a set of features \u0026ndash; charter schools (first selection) and enrollments over 1,000 students (second selection).\n Turn on only the schools and neighborhoods layers (turn off the bike network layer). Put the neighborhoods layer on the bottom of the drawing list so you can see both the schools and neighborhoods layers clearly.\n Using Select by Attributes, select only the charter schools. There should be 94 selected.\n Open the Select by Attribute dialog box again, but this time for Method: choose select from current selection. Then, of those selected schools, select those with enrollments greater than 1,000. There should be 15 schools selected, which are both charter schools and have enrollments greater than 1,000 students. Now, we will select a set of features from a set of selected source features \u0026ndash; the neighborhoods that contain the selected schools.\n Close the schools attribute table, but keep the 15 charter schools selected. Open Select By Location.\n For Selection method: choose select features from. For Target layer(s) check the box for the neighborhoods layer and uncheck all other boxes. For Source layer: choose the schools layer. Then check the Use selected features box. Under Spatial selection method for target layer feature(s): choose contain the source layer feature. This spatial relationship operator allows you to select based on spatial containment \u0026ndash; whether one feature contains another feature \u0026ndash; i.e. if a neighborhood contains a currently selected school. (Make sure the Apply a search distance box is NOT checked)\n Press OK. There should be 10 out of 48 neighborhoods selected. 
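The same selection chain can be expressed in a few lines of arcpy, which makes the logic (attribute selection, subset selection, then spatial containment) easy to see at a glance. This is only a sketch: the paths are assumptions, and the exact TYPE_SPECI value for charter schools should be checked with Get Unique Values before relying on it.

import arcpy

# Assumed paths to the shapefiles in your workspace folder.
schools = r"C:\Users\you\Documents\Lab1\PhillyPlanning_Schools.shp"
hoods = r"C:\Users\you\Documents\Lab1\PhillyPlanning_Neighborhoods.shp"

schools_lyr = arcpy.management.MakeFeatureLayer(schools, "schools_lyr")
hoods_lyr = arcpy.management.MakeFeatureLayer(hoods, "hoods_lyr")

# Charter schools first ('Charter' is an assumed attribute value; verify it),
# then narrow that selection to enrollments over 1,000.
arcpy.management.SelectLayerByAttribute(
    schools_lyr, "NEW_SELECTION", "\"TYPE_SPECI\" = 'Charter'")
arcpy.management.SelectLayerByAttribute(
    schools_lyr, "SUBSET_SELECTION", "\"ENROLLMENT\" > 1000")

# Neighborhoods that contain one of the currently selected schools.
arcpy.management.SelectLayerByLocation(
    hoods_lyr, "CONTAINS", schools_lyr, selection_type="NEW_SELECTION")

print(int(arcpy.management.GetCount(hoods_lyr)[0]), "neighborhoods selected")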
Creating a New Map Layer from a Selection of Features Frequently, you will want to save your selected subset of features as a separate, standalone data file. For instance, you may want to select a set of schools based some enrollment criteria, and save this smaller number of schools as its own data file, not simply as a selection within the original schools data file. Generally, as you proceed through a GIS analysis, it is helpful to regularly save your selected features in separate data file. You will have occasion to create new data files over and over again this term, and you must pay attention to where you are storing this data every single time. ArcGIS tries very hard to get you to save your data:\n in the \u0026ldquo;Geodatabase\u0026rdquo; format; and in a default location on your local hard drive. Additionally, it usually gives you unhelpful default file names, like \u0026ldquo;Export_Output\u0026rdquo;. Therefore, if you don\u0026rsquo;t pay attention when you do this step, you will end up with a file name that is unintelligible, in a format that you don\u0026rsquo;t want, in a location that you won\u0026rsquo;t be able to find. Do not do this. As an example of how to save a selected set of data to its own file, we will export the 10 selected neighborhoods from the last step in the tutorial.\n Make sure the neighborhoods layer is on and you have 10 neighborhoods selected.\n In the TOC, right-click on the neighborhoods layer, then choose Data \u0026gt; Export Data.\n Under Export: choose Selected features.\n Under Output feature class: click on the yellow folder icon and navigate to your Lab1 workspace folder.\n Under Save as type: choose Shapefile.\n Under Name: enter my_neighborhoods (or, whatever you want to name this file \u0026ndash; whether or not you leave .shp at the end doesn\u0026rsquo;t matter). Click Save. You\u0026rsquo;re actually not done yet, as this just pushes your settings to the Export Data dialog box. Click OK.\n A dialog box will appear asking whether you want to add your new layer to the current map. Click Yes. The layer will now appear in TOC.\n Clear your selection so no schools or neighborhoods are selected.\n Turn off the neighborhoods layer. You should be able to see the my_neighborhoods layer contains only those 10 formerly selected neighborhoods.\n Designing and Exporting a Map In this section we will see how to create a map layout and export the map.\nWorking in Layout View We have been working in a view called \u0026ldquo;Data View\u0026rdquo;, which is most useful for data exploration and analysis. When producing maps for export, we need to switch to \u0026ldquo;Layout View\u0026rdquo;. In this view, you will see a representation of a page, and the map (or maps) will appear on that page. By default, the page will be a standard 8½ by 11 inch page in portrait view. In Layout View you can also add other elements, such as titles, textboxes, and legends. It is possible to place elements \u0026ldquo;off\u0026rdquo; the page (in an area known as the \u0026ldquo;pasteboard\u0026rdquo; in desktop publishing lingo). Anything not on the page will not be included on the map when you export the image, so it can be a useful place to store items during the design process when you are trying out different ideas.\n Turn off the bike network layer and turn on all the other layers. 
Put them in the following drawing order:\n Schools (top) My_neighborhoods Neighborhoods (bottom) Make sure you are viewing the full extent of your data (press the full extent button to be sure). Your canvas should look something like this (your colors may be different): Switch to Layout View in one of the following ways:\n Click the Layout View button in the lower-left of the map canvas: Select View \u0026gt; Layout View from the menu. Note that you can toggle back and forth between the layout and data view windows.\n When you switched views, a new toolbar likely appeared: the Layout toolbar. Recall that if it did not, you can click up in the grey toolbar area to add it manually. This toolbar has pan and zoom tools, similar to the ones you have used before, but these will zoom into the page (not the map). Notice that each of these icons has a page behind it (e.g. magnifier on a page) to distinguish it from the pan and zoom tools on the Tools toolbar. Experiment with these tools to move around the page. When you are done, hit Zoom Whole Page (the fourth button on the Layout toolbar). The pan and zoom buttons on the Tools toolbar can be used to resize the map image on the page. Use the Zoom In button to make sure that Philadelphia fills most of the data frame. Note that if you zoom in too far, parts of Philadelphia will be \u0026ldquo;outside\u0026rdquo; the data frame, and will not appear on the page.\n The standard map elements are standard for a reason. Although it is possible to break the rules, you should know what you are doing before you snub 200+ years of cartographic theory. Use the Insert menu to add the following elements:\n Title\u0026mdash;Should default to top center, can be repositioned. Legend\u0026mdash;A dialog will appear offering many options for customizing the legend. Just keep hitting Next until the legend appears on your map. North Arrow. Scale Bar. Reposition the map elements to fill the white space. Try not to have any elements overlaying the Philadelphia neighborhoods. Your final layout may look something like this, but do ~NOT~ try to make it look the same. Just make sure it has all the requested elements: Once you have completed your map, save your .mxd document.\n Exporting a Map Image A finished map product can be exported into an image file that can be inserted into an MS Word document for your lab reports, or into web pages or other documents. This takes no extra effort compared to a screen shot, and is infinitely more professional looking. We will not accept screen shots of the ArcMap program, unless the directions specifically request a screen shot. Here, as an example, we will export the map using the PNG format.\n Go to File → Export Map\u0026hellip;\n Set the Save as type to \u0026quot;PNG (*.png)\u0026quot;.\n Navigate to your lab1 folder. As always, be clear about where you are saving your files.\n Press Save.\n Submission All you have to turn in to Blackboard for this week is the final image you created above.\n"},{"id":40,"href":"/classes/geog358/labs/lab02/","title":"Lab 02 - Projections and Coordinate Systems","parent":"Labs","content":" A note about this lab: Items in bold indicate buttons and/or menus you are looking for.\n Lab 2: Projections and Coordinate Systems Learning Objective An introduction to how ArcGIS handles geographic and projected coordinate systems. Users should understand the difference between geographic coordinate systems and projected coordinate systems. Identify appropriate projections for datasets. 
Become familiar with the ArcGIS toolbox\nTutorial Getting started Materials (click to download)\n conusa.shp – Polygons of the 49 States\n selecteduscities.shp – Selected Cities of the United States\n latlong.shp – Latitude-Longitude Lines for the United States\n kscities.shp – Cities in Kansas\n SanDiego.shp – unknown coordinate system, from San Diego Sector\n SanDiego.lyr – layer file used to display road type symbology\n ElCentro.shp – unknown coordinate system, from El Centro Sector\n ElCentro.lyr – layer file used to display road type symbology\n StudyArea.shp – GCS (NAD83), study area polygon\n CARDS.shp – GCS (NAD83), California roads\n CARDS.lyr – layer file used to display road type symbology\n USGS100k.shp – PCS (UTM NAD27), USGS DLG roads\n QuestionsSheet.docx – Handout to turn in\n Note: This data is in a zipped format. You are able to tell this because the folder will have a small zipper going down its side, and if you click into it, the top of the file explorer will have a pink bar with \u0026ldquo;Compressed Folder Tools\u0026rdquo; on the top. To access it you will need to\n Navigate to the folder with the zipped download Right click on the folder and choose 7-Zip \u0026gt; \u0026ldquo;extract to your folder name here\u0026rdquo;. Open ArcMap and close any windows that pop up. Save the empty map document to a personal folder, name it something meaningful like GEOG358_Lab02_YourLastName.mxd. Click the Add Data button, navigate to your downloaded data folder, hold down the Control key and click conusa.shp, selecteduscities.shp, and latlong.shp, and then click Add. Measuring in ArcMap Measure Locations in Latitude and Longitude Right-click Layers in the Table of Contents and select Properties. Under the General tab, change the Name to U.S. Cities. A little lower in the tab, under the Units heading, make sure Display is set to Decimal Degrees. Click OK. A quick-and-dirty way to estimate the latitude and longitude of a point is to hover the Select Elements tool over the point and then look at the numbers in the bottom right part of the window. Click the Full Extent button to put the entire map into focus. Use this technique on Los Angeles, Chicago, and Lawrence. Answer question 1 on the assignment sheet. Hint: did you look in the data downloads folder?\n Now let’s use a more accurate method to determine the latitude and longitude of those cities. Click the Identify button and then click on Lawrence. In the window that pops up, click Lawrence under selecteduscities. About an inch below are the city’s coordinates (listed after Location:). Use this method to answer question #2 on your assignment sheet. 
Unless your quick-and-dirty estimations were dead-on, you should see a small discrepancy between the coordinates in question #1 and question #2.\n Comparing distances in different projections Open the data frame’s Properties window. Select the Coordinate System tab. In the upper box, navigate to Projected Coordinate Systems \u0026gt; Continental \u0026gt; North America, click USA Contiguous Lambert Conformal Conic, make note of the map in the window, and then click Apply at the bottom of the window. Now the map is no longer (as) distorted! Click the Measure tool. In the Measure window, make sure Distance is set to Miles and the Measurement Type (the rightmost down arrow) is set to Planar. Hover the Measure tool over Lawrence. The cursor should snap to Lawrence and a circle should appear around the city. (If this doesn’t happen, click the Select Elements button, then click the Measure tool and try again.) Click on Lawrence and then hover over Chicago. As with Lawrence, the cursor should snap to the location and a circle should appear around it. This is your distance measurement. Using this technique, answer question #3 on your assignment sheet. Changing the map projection Open the Properties window, select the Coordinate System tab, and navigate to Projected Coordinate Systems \u0026gt; UTM \u0026gt; NAD 1983, click NAD 1983 UTM Zone 15N, and click Apply. As you did before, measure the distance from Lawrence to Chicago and from Lawrence to New York City. (Make sure the Measure tool is set to Miles and the Planar distance.) Answer questions #4 and #5 on your assignment sheet. Save your map document. Defining the coordinate system for a shapefile using ArcToolbox Click the ArcToolbox button (it looks like a red toolbox with a computer window in the background). Give it a minute to open.\n Go to Data Management Tools \u0026gt; Projections and Transformations, and then double-click Define Projection. We’re going to use this tool to assign a geographic coordinate system (GCS) to the kscities shapefile in our data folder.\n Click the folder icon —not the down arrow— to the right of Input Dataset or Feature Class. Add your kscities.shp to the map by navigating to your data folder, selecting kscities.shp and clicking Add. Notice the warning that pops up that indicates that the Coordinate System is Unknown. This is what we’re going to fix!\n Click on the pointing hand icon to right of Coordinate System. Just like we did in previous steps, select the following: Geographic Coordinate Systems \u0026gt; North America \u0026gt; NAD 1983. Don’t click OK just yet.\n Notice the components of the GCS, such as:\n Name: GCS_North American_1983 Angular Units: Degree Prime Meridian: Greenwich Datum: D_North American_1983 Semimajor Axis: 6378137 Semiminor Axis: 6356752 Now click OK. Click OK again. Notice that kscities has been added to the Table of Contents.\n NOTE: You can also define the coordinate system for a shapefile in ArcCatalog. Here’s how:\n In ArcCatalog, right click on the shapefile and select Properties. Click the XY Coordinate System tab. The rest will be similar as in ArcToolbox.) Project kscities.shp into Kansas State Plane Coordinate System (PCS) In ArcToolbox go to Data Management Tools \u0026gt; Projections and Transformations, and then double-click Project. This tool is different than the Define Projection tool we just used; it actually creates a new shapefile that is projected into the desired coordinate system. 
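The Define Projection / Project distinction is easier to remember once you have seen both as script steps. A minimal arcpy sketch, assuming the paths shown and that the State Plane name string matches what you see in the dialog:

import arcpy

kscities = r"C:\YourFolder\Lab02\kscities.shp"  # assumed path to your copy

# Define Projection only writes the missing coordinate-system information (.prj);
# the coordinate values themselves are untouched.
arcpy.management.DefineProjection(kscities, arcpy.SpatialReference(4269))  # GCS NAD 1983

# Project actually transforms the coordinates and writes a new shapefile.
out_fc = r"C:\YourFolder\Lab02\kscities_stateplane.shp"
stateplane = arcpy.SpatialReference("NAD 1983 StatePlane Kansas North FIPS 1501 (Meters)")
arcpy.management.Project(kscities, out_fc, stateplane)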
In the Input Dataset or Feature Class section, click the down-arrow —not the folder icon— and select kscities. Click the folder icon to the right of Output Dataset or Feature Class. Navigate to your data folder. In the Name field type kscities_stateplane.shp and then click Save. Click the pointing hand icon to the right of Output Coordinate System and then select the following coordinate system and make note that you can see its parameters: Projected Coordinate Systems \u0026gt; State Plane \u0026gt; NAD 1983 (Meters) \u0026gt; NAD 1983 StatePlane Kansas North FIPS 1501 (Meters).prj Click OK. Click OK again. The projected shapefile will be added to your Table of Contents automatically. Make a map! Turn off all layers except conusa and selecteduscities. Right-click conusa and select Zoom to Layer. Right-click selecteduscities, go to Selection, and select Make This The Only Selectable Layer. Click the Select Features button (it looks like a cursor in front of a tiny map). Draw a box around Lawrence. Now Lawrence is selected. Right-click selecteduscities, go to Selection, and select Create Layer from Selected Features. Now you have a layer of just Lawrence called selecteduscities selection. Click once in the middle of the selecteduscities selection layer name and type Lawrence. Turn off the selecteduscities layer. Right-click the Lawrence layer and select Properties. In the Symbology tab, you should see a big Symbol button with a dot in the middle of it. Click it. In the Symbol Selector window that pops up, scroll down a bit until you see the symbol Star 1. Click that symbol. To the right is a Size box. Type 30 in the box. Click the Color button above it and select a color that you think represents Lawrence. Click OK. Switch to Layout View (accessible via the View menu or a little button in the bottom left corner of the window). Create a map. All you need to include is the mapped area, a title (e.g., “Lawrence, Kansas”), and your name. Spend a couple minutes making it pretty. Since the contiguous U.S. is wider than it is tall, a landscape orientation is recommended. Also, make sure to zoom in close enough so that there isn’t an excessive amount of space. Export the map as a .png file called GEOG_358_Lab02_YourLastName.png. \u0026ldquo;Real World\u0026rdquo; Application The United States Border Patrol\u0026rsquo;s jurisdiction is divided into Regions and further subdivided into Sectors. In the Western Region, the San Diego and El Centro Sectors are responsible for patrolling approximately 130 miles of border between the State of California and Mexico. The San Diego Sector, headquartered in Chula Vista, California, has used GIS for several years. The El Centro Sector, east of San Diego, is in the process of implementing a GIS and is working closely with the San Diego Sector. Both Sectors are committed to standardizing GIS data collection, symbology, and analysis. Through data sharing exercises, the Border Patrol staff noticed that two similar sets of street data, one from the El Centro Sector office and another from the San Diego Sector office, were not aligning. They have asked you to figure out what datum the data is in.\nGetting started Open up a new ArcMap window. Click Cancel on the opening screen. Save the map document in your data folder as GEOG358_Lab02_YourLastNameApplication.mxd By default, there is only one data frame in the Table of Contents. For this exercise, though, we want three data frames. Go to Insert \u0026gt; DataFrame and create two more data frames. 
Rename the data frames as: Unknown Coordinate Systems GCS NAD83 PCS NAD27 Right click on each shapefile in the ArcCatolog window and determine the datum the shapefile is in Sort the shapefiles to their appropriate data frames. (For example, we will add the shapefiles without coordinate system information to the Unknown Coordinate Systems data frame.) Right-click each data frame, select Add Data…, and add the appropriate shapefile(s). Make sure you are adding shapefiles and not layer files [.lyr])\n When you are presented with a window that complains about an Unknown Spatial Reference, just press OK; this is the problem we are trying to fix. Right-click the Unknown Coordinate Systems data frame and select Activate. Double-click the Unknown Coordinate Systems data frame. Under the General tab, set the map units to Decimal Degrees and the display units to Meters. Click Apply and then OK. Make sure you’re zoomed in enough by right-clicking the SanDiego layer and selecting Zoom To Layer. Look at the map for a couple seconds and try not to develop a migraine. That offset between layers suggests datum problems. The ElCentro layer is probably in one datum and the SanDiego layer in another. Investigate the San Diego and El Centro datum problem Right-click the ElCentro layer, select Copy, right-click the GCS NAD83 data frame, and select Paste Layer(s). Do the same for the SanDiego layer. The GCS NAD83 data frame should have four layers now. Right-click the GCS NAD83 data frame and select Activate. Right-click the StudyArea layer and select Zoom To Layer. Turn off the ElCentro, SanDiego, and StudyArea layers by going to the Table of Contents by clicking the checkboxes next to them. The only visible layer should be cards. Let’s see if either the ElCentro layer or the SanDiego layer might have been created using the same datum as the cards layer. Zoom in closer to one of the roads in the middle part of the cards layer. Now turn the ElCentro and SanDiego on and off a few times. Does one of them seem to fit the cards layer better? If you don’t see any difference, look closer at the highway interchange in the right part of the screen. Define the coordinate systems for San Diego and El Centro street shapefiles We’re going to remove all the SanDiego and ElCentro layers from our map document. Right-click each one (there should be four of them total) and select Remove. Click the ArcToolbox icon and give it a few seconds to open. Go to Data Management Tools \u0026gt; Projections and Transformations and then double-click Define Projection. Click the folder icon to the right of Input Dataset or Feature Class and select SanDiego.shp from your folder. Click the hand icon to the right of Coordinate System and select Geographic Coordinate Systems \u0026gt; North America \u0026gt; NAD 1983. Click OK. Click OK again. This shapefile will automatically be added to the active data frame (GCS NAD83). note: If you received an error message from stating that projection didn’t work, make sure ArcCatalog is closed and try again.\n Double-click the SanDiego layer you just created and then click the Source tab. It should list the Geographic Coordinate System as GCS_North_American_1983. Click the General tab and rename the layer SanDiego_GCS_NAD83. Click OK. The ElCentro shapefile might be in GCS NAD27. Using the Define Projection process we just used, select the following coordinate system for ElCentro: Geographic Coordinate Systems \u0026gt; North America \u0026gt; NAD 1927. Rename the layer ElCentro_GCS_NAD27. 
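For reference, the same datum fixes can be scripted: the two Define Projection calls above plus the NAD27-to-NAD83 conversion that the next section walks through. This is only a rough arcpy sketch (paths are placeholders; it assumes ArcGIS Desktop with arcpy), not a required part of the exercise.

```python
import arcpy

nad83 = arcpy.SpatialReference(4269)   # GCS North American 1983
nad27 = arcpy.SpatialReference(4267)   # GCS North American 1927

# Label each street shapefile with the datum it was found to match
arcpy.management.DefineProjection(r"C:\data\SanDiego.shp", nad83)
arcpy.management.DefineProjection(r"C:\data\ElCentro.shp", nad27)

# The conversion covered in the next section: reproject ElCentro into NAD83
# using the NADCON geographic (datum) transformation
arcpy.management.Project(
    r"C:\data\ElCentro.shp",
    r"C:\data\ElCentro_GCS_NAD83.shp",
    nad83,
    "NAD_1927_To_NAD_1983_NADCON")
```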
The ElCentro layer certainly fits much better now, but that’s not good enough for purists like us. Convert El Centro street shapefile from NAD27 to NAD83 In ArcToolbox go to Data Management Tools \u0026gt; Projections and Transformations and then double-click Project. Click the down-arrow to the right of Input Dataset or Feature Class and select ElCentro_GCS_NAD27. Click the folder to the right of Output Dataset or Feature Class and save the file in your folder as ElCentro_GCS_NAD83.shp. Click the hand icon to the right of Output Coordinate System and select Geographic Coordinate Systems \u0026gt; North America \u0026gt; NAD 1983. Make sure the box below the Geographic Transformation (optional) field says NAD_1927_To_NAD_1983_NADCON. If it doesn’t, click the down-arrow to the right of Geographic Transformation (optional) and select NAD_1927_To_NAD_1983_ NADCON.\n Click OK. This layer will be added to the active data frame. Reproject the Usgs100k shapefile The Usgs100k layer in the PCS NAD27 data frame is a 1:100,000 scale digital line graph (DLG) that was downloaded from the USGS EROS website. It is in a projected coordinate system (NAD_1927_UTM_Zone_11N). Reproject this layer the same way you reprojected the ElCentro layer in the last section. When you’re done, drag the layer over to the GCS NAD83 data frame. How well does it fit the other layers? You may notice that the fit is better in the area’s center compared to its periphery. Add some pre-made layer Esri’s ArcGIS Resource Center defines a layer thusly: “Each layer references a dataset and specifies how that dataset is portrayed using symbols and text labels. When you add a layer to a map, you specify its dataset and set its map symbols and labeling properties.” We’re going to add a few layers that already have their map symbols and labeling properties set but need to be referenced back to their shapefiles.\n With the GCS NAD83 data frame still active, right-click the data frame and select Turn All Layers Off.\n Right-click the data frame, select Add Data, and add the following:\n cards.lyr ElCentro.lyr SanDiego.lyr Don’t see anything? That’s because the connection to the underlying data was severed. This is the case whenever you see grayed-out check boxes with red exclamation marks next to them. Let’s fix this. For each layer, double-click the layer, click the Source tab, click the Set Data Source… button, and then add its corresponding shapefile from your Lab04 folder (e.g., cards.shp for cards.lyr). This should make the gray part of the check boxes and the red exclamation marks disappear.\n Take a screen shot of ArcMap (all of it—including the Table of Contents, menus, etc., using the snip tool is fine in this case) and save it in your data folder as GEOG358_Lab2Application.png.\n Paste the image in the word document with the rest of your answers and submit it on blackboard.\n "},{"id":41,"href":"/classes/geog558/labs/lab03/","title":"Lab 03 - Focal Operations","parent":"Labs","content":"Learning Objective This lab will use neighborhood operators (i.e. focal operators) to examine an elevation surface, remove errors in a DEM, analyze wind exposure, and delineate the edge of our lake classified in Lab 2. In the first part, you will learn how to bring an ASCII into Arc as a grid and confirm your understanding of focal statistics. In the second, you will use the slope tool and focal statistics to blur out errors in the DEM and examine how changing the window results in a changing image. 
In Part 3, you will perform a site suitability analysis for wind turbines using the slope, aspect, hillshade, and focal wedge tools. Finally, we will use focal erosion to isolate the edge of our lakes classified in Lab 2. You can do all of these parts within the same or separate mxd’s, but be sure to save often.\nOutline: Learning Objective Submission requirements Materials Tutorial Part 1: Focal Operation Test Converting the ASCII file to a grid and run focal tools Part 2: Removing DEM Errors Convert DEM from feet to meters Locate Artifacts (Errors) in the DEM Removing DEM Errors Part 3: Wind Exposure Analysis Convert DEM from feet to meters Obtaining Areas with Good Elevation Generating a Hillshade raster layer Visualizing with hillshade Obtaining Areas with Good Aspect Obtaining Wind Exposure Values Isolate the best sites and those that meet all the criteria except wind exposure Part 4: Lake Edge Detection Calculate the boundary of the lake in 1995 using a focal erosion process Calculate the difference between the two surfaces Perform the above steps for both 2003 and 2009 Make a small map like so Submission requirements Materials click to download\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rRelevant part\rData Name\rDescription\r\r\rPart 1\rTest.txt\rASCII Grid with sample values\r\r\rPart 2\rsilvrsub_f\t\rDEM for Silverton, CO (30m cell size, elevation in feet)\r\r\rPart 3\rwnw\rDEM of NW part of Watauga Cnty (30m cell size, elevation in feet)\r\r\rPart 4\rBaseData\rData downloaded in the first lab\r\r\rPart 4\rNDWI_3.tif\rNDWI threshold maps you created in Lab 2\r\r Tutorial Part 1: Focal Operation Test Converting the ASCII file to a grid and run focal tools Open ArcMap and make sure that the Spatial Analyst extension is activated. Name the default data frame to ASCII Grid. Change the workspace for the Spatial Analyst Tools by going to Geoprocessing \u0026gt; Environments… Under Workspace change the Current Workspace to your Lab3 folder. Rename the data frame named ASCII Grid. Inspect the ASCII Grid file test.txt using a text editor such as notepad. We will use this small raster file to visually verify the results of several focal operations. Open ArcToolbox and Select Conversion Tools | To Raster | ASCII to Raster tool. Set test.txt as the input file and save the output raster as test in your lab03\\part01 folder. Leave the data type to default and Click OK. The test raster will be added to the Table of Contents automatically. You can ignore any projection warnings you receive when adding these rasters. Use the ArcToolbox | Spatial Analyst Tools | Neighborhood | Focal Statistics tool to answer Question #1. 
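If you want a way to double-check the grids you fill in by hand for Question #1, the conversion and the three focal runs can also be scripted. A minimal arcpy sketch, assuming the Spatial Analyst extension is available and using a placeholder workspace path (the GUI steps above are all the lab requires):

```python
import arcpy
from arcpy.sa import FocalStatistics, NbrRectangle

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Lab03\part01"   # placeholder workspace

# ASCII text file -> grid
arcpy.conversion.ASCIIToRaster("test.txt", "test", "INTEGER")

# 3x3 rectangular neighborhood, then focal sum / maximum / mean
nbr = NbrRectangle(3, 3, "CELL")
for stat, out_name in [("SUM", "test_sum"), ("MAXIMUM", "test_max"), ("MEAN", "test_mean")]:
    FocalStatistics("test", nbr, stat).save(out_name)
```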
hint #1: change your layer symbology to show unique values.\n Question 1:\nUse your test raster layer to run the following three FOCAL operations (with a rectangular neighborhood of 3 by 3, cell size 10) and fill in the respective grids:\nFocal Sum: test_sum:\n[blank answer grid to fill in]\nFocal Maximum: test_max:\n[blank answer grid to fill in]\nFocal Mean:
test_mean:\n[blank answer grid to fill in]\n Want to flex? You can label the rasters as shown below, and it\u0026rsquo;s good cartography practice. Useful tools: Raster to Points/Extract Values to Points, label placement and halos, field representation to round to two decimal places. Save your ArcMap document. Part 2: Removing DEM Errors Convert DEM from feet to meters First let’s lay down the foundation.\n Insert and rename a data frame to DEM Error Removal. Add the silvrsub_f grid to the data frame. This surface was created in rectangular patches and then mosaicked together, but the artifacts of mosaicking are not readily visible by simply viewing the DEM. Since the height is in feet and the cell sizes are in meters, you should put everything into the same unit using a local operation (i.e., Raster Calculator). Hint: 1 foot = 0.3048 meters.\n Save the output to your lab03\\part02 folder as: silversub_m. Locate Artifacts (Errors) in the DEM Let’s start by calculating the slope for the raster to more easily identify the artifacts.\n Open ArcToolbox | Spatial Analyst Tools | Surface | Slope, which is a focal operation we can use to find the linear artifacts in the silversub_m grid. In the slope dialog box, set the input raster as silversub_m. Leave the default values for output measurement and z factor. Save the output grid silver_slope in your lab03\\part02 folder. You should now be able to visually pick out the linear artifacts left from the mosaicking process. Using the measurement tool, determine the approximate size of the patches that were used to mosaic the image and answer Question #2 on your assignment sheet.\nRemoving DEM Errors Now try to eliminate the artifacts by smoothing the surface of silversub_m before deriving the slope. Go to ArcToolbox | Spatial Analyst Tools | Neighborhood | Focal Statistics to compute the mean elevation value in a rectangular neighborhood of 3x3 cells and another of 7x7 cells.\n Use silversub_m as the input in both cases, making sure to name the outputs appropriately, something like silver3x3 and silver7x7. Derive the slopes of these new surfaces as in step 2, and be sure to name them appropriately, something like slope3x3 and slope7x7 respectively. Answer Question #3 on your assignment sheet. Part 3: Wind Exposure Analysis Convert DEM from feet to meters First let’s lay down the foundation.\n Insert a new data frame and call it Wind Exposure Analysis. Add the wnw grid to the data frame.
This surface was created in rectangular patches and then mosaicked together, but the artifacts of mosaicking are not readily visible by simply viewing the DEM. Since the height is in feet and the cell sizes are in meters, you should put everything into the same unit, as was done in Part 2. Save the output to your lab03\\part03 folder as: wnw_m.\nObtaining Areas with Good Elevation Select the cells where the elevation in wnw_m is greater than or equal to 1000 m by creating the following expression in the Raster Calculator: “wnw_m” \u0026gt;= 1000 Make the output raster a permanent layer named GoodElevation Generating a Hillshade raster layer Using Spatial Analyst, create a hillshade raster from your wnw_m DEM using the ArcToolbox | Spatial Analyst Tools | Surface | Hillshade tool Set wnw_m as your input surface, leave the default settings for azimuth and altitude, but set the Z factor to 3 Save the output grid as hillshade in your lab03\\part03 folder Visualizing with hillshade Hillshades are useful for providing geographic context, but they can overwhelm your data, which defeats the purpose of making a map. One way to fix this is to mask out the NoData values in your output layer and make it slightly transparent. We\u0026rsquo;ll do this a few times in this lab, so remember this sequence:\n Display GoodElevation on top of hillshade Set the color of the bad elevations (0’s) to hollow (no color) by: R-clicking on GoodElevation Select the Symbology tab Check the box “display background value:” Proceed with: 0 symbol and selecting No Color. Change the Transparency value of GoodElevation to approximately 40% R-Click -\u0026gt; Properties under the Display tab Answer Question #4 on your assignment sheet.\nObtaining Areas with Good Aspect Calculate the aspect from the wnw_m DEM using the ArcToolbox | Spatial Analyst Tools | Surface | Aspect tool Save the output grid as aspect in your lab03\\part03 folder Notice the range of angles assigned to each direction in the table of contents. Aspect values begin at 0 at due north and increase clockwise. Using Raster Calculator, select the cells with an aspect between 225° and 315°. This will give us the westward-facing slopes. That expression should look like (225 \u0026lt;= “aspect”) \u0026amp; (“aspect” \u0026lt;= 315) Make the output raster a layer named GoodAspect Display GoodAspect on top of hillshade (set the symbology like you did for GoodElevation) and verify that it represents the western slopes. Answer Question #5 on your assignment sheet.\nObtaining Wind Exposure Values Create a neighbor mean raster of wnw_m using a FocalMean operation.\n ArcToolbox | Spatial Analyst Tools | Neighborhood | Focal Statistics with a wedge neighborhood (start angle 135, end angle 225, and a radius of 10 cells). It is important to note here that the wedge neighborhood is measured differently from the aspect. The aspect measurement begins at 0 degrees due north and increases as we move clockwise. However, when computing a wedge neighborhood, we start at 0 degrees due east and increase as we move counterclockwise. Thus, the degree range here (135-225) represents the same degree range as indicated in the aspect range (225-315). This is simply due to inconsistencies in the way the ArcGIS software handles radial measurements.
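The criteria built so far (unit conversion, GoodElevation, hillshade, aspect, and the wedge focal mean) can also be expressed as a short map-algebra script. This is only a hedged arcpy sketch for reference, assuming the Spatial Analyst extension and a placeholder workspace; the comparison against the neighbor mean and the final site combination continue in the GUI steps that follow.

```python
import arcpy
from arcpy.sa import Raster, Aspect, Hillshade, FocalStatistics, NbrWedge

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Lab03\part03"   # placeholder workspace

dem_m = Raster("wnw") * 0.3048                       # feet -> meters
good_elev = dem_m >= 1000                            # 1 where elevation >= 1000 m, else 0
aspect = Aspect(dem_m)
good_aspect = (aspect >= 225) & (aspect <= 315)      # west-facing slopes
hillshade = Hillshade(dem_m, 315, 45, "NO_SHADOWS", 3)   # default azimuth/altitude, Z factor 3

# Wedge focal mean: 135-225 degrees here corresponds to the 225-315 degree aspect range
wind_mean = FocalStatistics(dem_m, NbrWedge(10, 135, 225, "CELL"), "MEAN")
# the comparison of dem_m against wind_mean and the site combination follow in the next steps
```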
TODO: Add a nice image of this here.\n Save the output as WindMean10 In Raster Calculator, select the cells whose elevations are higher than their neighbor mean (i.e., WindMean10) wnw_m \u0026gt; WindMean10 Make the output a permanent raster named GoodWindExp10 Display GoodWindExp10 on top of hillshade Isolate the best sites and those that meet all the criteria except wind exposure Using Raster Calculator, select the cells with good elevation (1’s), aspect (1’s), and wind exposure (1’s). Your expression can look like: (GoodElevation == 1) \u0026amp; (GoodAspect == 1) \u0026amp; (GoodWindExp10 == 1), or more simply GoodElevation * GoodAspect * GoodWindExp10. Make the output a permanent raster named GoodSites10. Use Raster Calculator, select the cells with good elevation (1’s), good aspect (1’s) and bad wind exposure (0’s) with the following expression: (GoodElevation == 1) \u0026amp; (GoodAspect == 1) \u0026amp; (GoodWindExp10 == 0). Make this a permanent raster named Blocked10, which indicate cells where the wind is blocked because of hills to the west. Answer questions #6 - #8 on your assignment sheet (you will need to repeat the steps above twice more using different radius values) Part 4: Lake Edge Detection Open your Lab 2 Part 1 mxd or create a data frame/document. This document should have:\n AOI shapefile lake body classifications (…NDWI_#.TIF) )for 1995, 2003, and 2009. The RGB file for each year Calculate the boundary of the lake in 1995 using a focal erosion process Using the Focal statistics tool, calculate the focal minimum using a rectangular 3x3 grid. The input should be the …_NDWI_3.TIF created in Lab 2. Save the output in the 1995 BaseData folder and name the output as …_NDWI_FMin.TIF. Calculate the difference between the two surfaces Using raster calculator, subtract the focal min raster from the lake classification raster using an expression such as: \u0026ldquo;1995\\LT05_L1TP_\u0026hellip;_NDWI_3.TIF\u0026rdquo; - \u0026ldquo;1995\\LT05_L1TP_\u0026hellip;_NDWI_FMin.TIF\u0026rdquo;. Save the output of the map as LT05_\u0026hellip;_LakeBoundry.TIF. Perform the above steps for both 2003 and 2009 Make a small map like so Move to layout view Set the page dimensions to 11 in x 11 (File, page and print setup) Use guides to set up 3 inch grids (click on the guides) Right click on the data frames to copy them Paste them in the frame and rename them to the appropriate place Text can be rotated in the properties To add grid lines, change the frame properties or using lines by adding the drawing toolbar (right click in the grey area as necessary) Finally, export your map "},{"id":42,"href":"/classes/geog358/labs/lab03/","title":"Lab 03 - Using GPS for Field Data Collection","parent":"Labs","content":"Background This exercise provides an introduction to using a Global Positioning System (GPS) receiver to obtain coordinates and create a point shapefile. GPS is a system consisting of a network of satellites that orbit ~11,000 nautical miles from the earth in six different orbital paths. They are continuously monitored by ground stations located worldwide. The satellites transmit signals that can be detected with a GPS receiver. Using the receiver, you can determine your location with great precision through the trilateration (not triangulation!) 
of signals from at least 3 satellites, getting a distance from the difference between time measurements.\nAlthough the system is very sophisticated, and atomic clocks are used by the satellites, there are multiple sources and types of errors involved in finding your location. As the GPS signal passes through the charged particles of the ionosphere and then through the water vapor in the troposphere, this causes the signal to slow a bit, and this creates the same kind of error as bad clocks. Also, if the satellites that are in your view at a particular moment are close together in the sky, the intersecting circles that define a position will cross at very shallow angles. That increases the error margin around a position. The kind of GPS receivers we will be using provide about 10 meter accuracy (which may be reduced to under 3 m if differential corrections like WAAS are used), depending on the number of satellites available and the geometry of those satellites. If you want to learn more about GPS, I would recommend Garmin’s introduction (http://www.garmin.com/aboutGPS/manual.html) or Trimble’s tutorial (http://www.trimble.com/gps/).\nLearning Objective In this exercise we are going to collect the coordinates of some set of campus features using either the GPS units or your phones, generate a few shapefiles of those features, and then use the shapefile to make a map. The goals for you to take away from this lab:\n How to use your selected platform to collect data How to export that data How to view that data in GIS software Outline: Background Learning Objective Submission requirements Tutorial Notes before getting started How to collect data Getting your data onto a PC Working with your shapefiles in ArcMap Open your features in Google Earth Common errors: Submission requirements Materials (click to download)\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rgeog358_Lab3Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts and submit it to blackboard.\rTutorial Notes before getting started If you have an Android phone, I highly recommend the android specific application, it is by far the most robust data collection application I\u0026rsquo;ve tested, and you will always have your phone on hand. The Apple version will work for the objectives and purposes of this lab, but you will likely be underwhelmed by the capabilities. The handheld units are the most traditional (and perhaps most powerful) options, but unless you go out and buy a GPS unit for yourself, you will likely never hold a GPS unit in your hands again.\nThe apps outlined in this lab are all free to use, although many have the option of upgrading to a paid version. 
You do not need to pay for anything if you don\u0026rsquo;t want to. If in the event you get confused as to what your looking at, spamming the back button should always take you to the main menus. Note: Make sure your unit is charged before going out on your adventure!\n How to collect data Android If you are on an android device there are a few options available to you. The one you will need for this lab is called Locus GIS. The advantages of this app (Locus) over the cross platform one is several-fold. Perhaps most critically, the app allows you to export databases of features at once without going through a paywall. This app also seems to forgo ads when collecting data. The forms to add data are also more robust, allowing you to add attributes and pictures to the feature. I also feel the interface is easier to use.\nInitial setup and clearing data LOCUS separates data by projects, so to start a new project, go to the menu and create a new project. Make sure your new project coordinate system is set to 4326.\nNavigate To go to a particular point, first add the point using the plus button, and manually enter the coordinates. After the point has been added, you can click on it, and at the top of the application, one of the options will be to navigate to the point using Google Maps.\nCollecting data Locus allows you to record points based on GPS, but also has a rich digital editor that allows you to draw GIS data. We\u0026rsquo;ll only cover recording data here, but feel free to explore! Collecting a point To create points, you need to first create a point file and fill out the fields as necessary. These can be edited in app. Collecting a line or a polygon You likewise need to create new files for lines and polygons. To draw a line, start recording and stop or pause recording when you want to stop.\n Apple You are looking for the Fields Area Measure Free app. There is a guided tutorial built right into the app, and your intuitive knowledge as digital generation should carry you the rest of the way.\nInitial setup and clearing data Click the menu button (upper left corner), and go to saved measures. Click the three buttons next to the entries to delete them. To set the units, go to settings, and make sure the measurement system is set to metric Navigate To go to a particular point, first add the point using the plus button, and manually enter the coordinates. After the point has been added, you can click on it, and at the top of the application, one of the options will be to navigate to the point using Google Maps. Final notes There are a few caveats to this app. First, you may have to sit though an ad if you drop more than a few features in a session. Finally, the exporting of the data must be done individually (feature by feature), making the unpaid version of this unworkable for larger projects. Sorry apple :(\n Garminetrex10 The lab has access to the eTrex handheld garmin units. See the website for the users manual. 
Instructions on how to accomplish some of the most common tasks are included below:\nInitial setup and clearing data First, lets make sure the unit is set up properly.\n From the main menu, go to setup | units and ensure distance is set to metric and elevation is in feet from the setup menu, go to Position Format and make sure it\u0026rsquo;s set to hddd.ddddd°, Map Datum and Map Spheroid are both set to WGS 84 Finally, from setup go to reset and make sure that you delete all waypoints and clear the current track Navigate To go to a point, from the main menu go to where to | coordinates, and enter your coordinates, or choose a previously dropped waypoint. When done, the map will pop up, directing you to your chosen destination. Collecting data Collecting a point From the main menu, go to Mark waypoint and click done\nCollecting a line or a polygon Lines:\n Go to Main Menu | Tracks To begin line: Clear | Yes (clear track log) | Track Log on To end line: Track Log off | Save | Yes (save all tracks) Make sure you clear the track log between tracks! (This keeps each line separate.) Polygons:\n Use the same steps as lines, except you need to make sure that your line ends where it began. GarminGPSMap60CS The lab also has Garmin GPSMap 60CS handheld units. See the website for the users manual. Instructions on how to accomplish some of the most common tasks are included below:\n Note: Charge the unit with the provided usb cable before going out on your adventure!\n Initial setup and clearing data Make sure the GPS unit is configured to collect coordinates in decimal degrees and distance is in meters: Main Menu (press Menu twice) | Setup | Units | Position format = hddd.ddddd° | Distance/Speed = Metric Even though the map datum will be set to WGS 84, this will correlate fine with the NAD projection of our data later on. Next, lets clear out the old data.\n Find | Waypoints | Menu | Delete… | All Symbols | Yes Main Menu | Tracks | Clear | Yes | Menu | Delete All Saved | Yes Collecting data Collecting a point Points:\n Press the Mark button. Click OK to save. To view a list of all of the waypoints you have taken, use the Find button (it leads to Waypoints menu) next to the Mark button. Collecting a line or a polygon Lines:\n Go to Main Menu | Tracks To begin line: Clear | Yes (clear track log) | Track Log on To end line: Track Log off | Save | Yes (save all tracks) Make sure you clear the track log between tracks! (This keeps each line separate.) Polygons:\n Use the same steps as lines, except you need to make sure that your line ends where it began. Getting your data onto a PC Each method of acquiring GPS data can come with it\u0026rsquo;s own headache of data massaging. Our goal is getting the data into shapefile format, and although the LOCUS GIS app will let you export data in that form, none of the other methods play quite as nicely. There are two common means of exporting GPS data, via gpx format (both handhelds do this), or via kml.\nFrom LOCUS As mentioned, LOCUS makes this process painless, from your project simply click the 3 more buttons and then Export as SHP.\n From Handhelds Click the Start/Windows button, click the little down arrow in the bottom left corner, and then click dnrgps (under #-Programs). Note: you can also download from the Minnesota Department of Natural Resources\n Connect the GPS unit to the computer using the provided USB cable. 
Turn the unit on, press the Page button until you reach the Satellite screen, press the Menu button and select Use with GPS off (this helps save the battery). If the application doesn’t find your GPS unit automatically, select GPS | Find GPS (or Connect to Default GPS). Download your point features by clicking the Waypoint menu and selecting Download. Look at the table of waypoints. If you see any unwanted waypoints in the table, select them and click the red X on the left side of the screen. Let’s project our data before we go any further. Click File | Set Projection. On the Projection tab, set the POSC Code to 26915, the datum to NAD83, and the projection to UTM zone 15N. Click OK. Select all of the waypoints that you want to use and then click Edit | Project Coordinates. Click File | Save To | File…. Save your waypoints as an ESRI Shapefile (2D) with the name Lab05_waypoints. Then save your waypoints as a GPS Exchange Format (*.gpx) with a sensible naming convention (like GEOG358_Lab3Waypoints). Downloading tracks is similar to downloading waypoints, but it can be a bit trickier. Click Track | Download. Select all the records (rows) that correspond to your track—this is where things can get confusing—and then click File | Save To | File…. Save your track as an ESRI Shapefile (2D) with a sensible name. If it asks for a Shape Type, select Line. When you’re done creating your shapefiles, close the DNR GPS application, turn the GPS unit off, and disconnect the unit from your computer. From KML (apple) The Fields Area Measure app (and several other phone based apps), may allow you to only export data in a KML format. Although ArcGIS does not natively support KML (if you navigate to where the file is supposed to be in the catalog, you wont find it. However, by searching for KML in the search box, you will find the KML to Layer tool, which will convert the file into a layer in ArcMap (which can then be exported to a shapefile if you so desire.\n Working with your shapefiles in ArcMap Open ArcMap. Add the shapefiles you created. Click the little down arrow next to the Add Data button and select Add Basemap. Choose a basemap that you think would look good with your shapefiles. Spend some time playing around with different basemaps and different ways to symbolize your new features. Make a map and upload a PNG of it to Blackboard. Open your features in Google Earth Let’s look at our point shapefile in Google Earth, which uses a KML file format. As you may have seen in the apps, this is an export filetype but is behind a paywall. Fortunately ArcGIS can do the conversion for us. Open the ArcToolbox and select Conversion Tools \u0026gt; To KML \u0026gt; Layer To KML. Select the appropriate layers in the Layer dropdown menu. Click the folder button next to Output File, navigate to your personal lab folder, and save the file as MyFeatureName.kml. Change the Layer Output Scale to 1. Click OK. In Windows Explorer, double-click the file you just created. (If you don’t see it, look for KMZ File under the Type column.) Google Earth should open and zoom to the location where your points are. Are your points where they’re supposed to be? Find a cool view angle and zoom level Google Earth lets you export these as a neat map with the appropriate map elements. Fill in the pertinent data, save it, and upload it to Blackboard. need extra instructions? 
Check out https://jimcoll.github.io/courses/random/firstgoogleearthmap/\n Common errors: \u0026ldquo;When I open my KML layer adobe pops up.\u0026rdquo;\n Windows has set the default program to handle KML layers as adobe for some reason. You can either right click on the file and open with \u0026gt; find program, or open Google Earth and go to File \u0026gt; Open "},{"id":43,"href":"/classes/geog358/labs/lab04/","title":"Lab 04 - On-Screen Digitizing \u0026 Image Restoration","parent":"Labs","content":"**Lab 04:\t** Learning Objective This lab covers one of the most common tasks a starting GIS analysis will likely be paid to do, aligning and digitizing data.\nTutorial Getting started Materials\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rlawrencenorth.tif\rA scanned aerial photograph of north Lawrence\r\r\rnaip2004sub.img\rPortion of a National Agriculture Imagery Program (NAIP) image (image at ~2 meter resolution project in UTM Zone 15, NAD83)\r\r\rusgsLawEast.tif\rScanned USGS Lawrence East quad\r\r\rStudyArea.shp\rA polygon shapefile indicating the digitizing area\r\r\rPart 1: Image Georeferencing Introduction Image data are a common source of information and are particularly useful when mapping vegetation, wetlands, or other natural resources. Scanned photos have some geometric distortion, but this distortion may not be too great, depending on our error specifications and how the photo was taken (flying height, focal length of camera, etc.). Digital orthophotoquads (DOQs) are scanned photos for which most of these errors have been removed. DOQs are being developed for the entire U.S. by the US Geological Survey (USGS). DOQs are a common data source for resource management agencies. The problem is that scanned images will be in some arbitrary scanner or screen coordinate system. If we wish to convert the digitized vector features to a projected coordinate system, we must perform an image registration. An image registration converts the image from a file or scanner coordinate system to a projected map coordinate system. There are many forms of image registration, but the simplest is called a first-order or affine transformation. An affine transformation is appropriate when the terrain is flat and the photograph has been taken with a vertically oriented (mapping) camera; otherwise, a projective transformation should be used. In this lab exercise you will be taking a scanned DOQ over north Lawrence from 1976 and register it to an image from 2004 that already contains geographic coordinates and a projection. You will then digitize roads in a study area that represent the conditions in 1976 that you can then compare to the current road conditions in the more recent image.\n1. Georeference a scanned aerial photo Create a new map document in ArcMap. Save it in your OneDrive as GEOG358_Lab04_Part1.mxd. 
Add the naip2004sub.img from the data folder to the data frame. This image has a projected coordinate system (UTM Zone 15 NAD83). We will register the scanned image to this coordinate system, thus making it the “control layer.” Add the lawrencenorth.tif image to the data frame. If you see the warning about pyramids, recall what this is asking you about, and if you see an \u0026ldquo;unreferenced\u0026rdquo; warning, click OK; that\u0026rsquo;s what we are here to fix.\n Did you get a few warnings that looked like so?\nDid you Google it? Spaces in your path name are the most likely culprit. Keep this in mind, as it likely won\u0026rsquo;t be the last time this comes back to bite us. This is the image that we will register to a projected coordinate system, thus making it the “target layer.” Since the image doesn’t have a projection yet, it will not be immediately visible; right-click the layer and select Zoom to Layer. Examine each image in order to discover what features they have in common (e.g., bridges). Click the Customize menu, select Toolbars, and then Georeferencing. The Georeferencing toolbar will appear in ArcMap. Make sure Layer is set to lawrencenorth.tif in the Georeferencing toolbar. Also, make sure that lawrencenorth.tif is on top of naip2004sub.img in the table of contents. Now we must select a succession of control links between the target layer and the control layer. These will be based on features that the images have in common. Find a road intersection in the lawrencenorth.tif layer. Zoom in close enough so that you can clearly see the width of the road. Click the Add Control Points button on the Georeferencing toolbar and then click once right in the center of the intersection. Right-click the naip2004sub.img layer and select Zoom to Layer. Find the intersection that you clicked in the other layer. Zoom in and click in the center of the intersection. Repeat this procedure 11 more times in different parts of the image for a total of 12 control points. The more evenly you spread out your control points, the better. Note that after the second control point the images will overlap, so you will need to turn layers on and off in order to establish the remaining control points. (Fun!) If you make a mistake, you can delete it by clicking the View Link Table button, selecting the row of the point, and hitting Delete on your keyboard. With each successive control link, the image should line up a bit better. If all goes well, you should see the image shift slightly to match the control image. Don’t be alarmed if the images become temporarily distorted during this process. Click the View Link Table button. In the upper right-hand corner, next to the word “Forward” is your RMS (root mean square), which is a measure of the accuracy of your registration. If your RMS is higher than 15, bad news: You need to redo your control points, this time with more precision. Once your RMS is under 15, enlarge your Link Table so that all 12 points are listed and then take a screen shot. Save the image as GEOG358_Lab04_Part1.png. This should look like so. There is one last step in the georeferencing process: saving the reference for future use. Before proceeding, open the file explorer to the data folder and make note of the files in there.\n On the Georeferencing toolbar, click Update Georeferencing. If you were successful, you should now see a new .tfw file, and your image is now referenced.
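A quick aside on what that “Forward” RMS number actually is: the tool fits a first-order (affine) transformation, x' = A*x + B*y + C and y' = D*x + E*y + F, to your control links and then reports the root-mean-square of the leftover residuals. A tiny illustrative Python snippet (not part of the lab; the residual values are made up) showing the calculation:

```python
from math import sqrt

def rms_error(residuals):
    """residuals: list of (dx, dy) offsets between where each control point lands
    after the fitted affine transform and where you said it should be (map units)."""
    return sqrt(sum(dx * dx + dy * dy for dx, dy in residuals) / len(residuals))

# example with made-up residuals in meters
print(rms_error([(3.1, -2.0), (0.5, 4.2), (-1.8, 1.1)]))
```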
Huzzah!\n Part 2: On-Screen Digitizing Introduction In this part of the lab you will learn how to digitize features from a scanned USGS topographic map. Digitizing is the process of converting paper map or image data to vector digital data. In manual vector digitizing, you trace the lines or points from the source media. You control a cursor, usually with a mouse or digitizing puck, and select vertices to define the point, line, or polygonal features you wish to capture. The source media may be hardcopy (e.g., maps on a digitizing table), or softcopy (e.g., a digital image or scanned map). ArcGIS allows us to digitize using either hardcopy or softcopy sources. This lab will involve digitizing a set of features from a scanned USGS 1:24,000 topographic map.\nPreparation Open ArcMap. Close any alerts that pop up. Save the map document in an appropriate place as GEOG358_Lab04_Part2.mxd Click the Add Data button, navigate to the lab data and add usgsLawEast.tif and StudyArea.shp. Let’s change the symbology of the StudyArea layer so that the polygons are hollow. The easiest way to do this is to click the colored box under the layer name and then select “Hollow” from the list of symbols. While you’re there, change the Outline Width to 5 so we can see it better. Change the color of the outline if you wish. Right-click the StudyArea layer and select Zoom To Layer. Recognize this area? Click the Pan button (looks like a hand) and then drag the map around until you find the KU campus. (Hint: It’s to the southwest of the study area!) Create shapefiles for digitizing In the ArcCatalog window, connect to the lab data folder, Right-click in the folder, and select New | Shapefile…. Name the shapefile Buildings and make sure the Feature Type is set to Point. Click the Edit button to add a coordinate system. Select Projected Coordinate Systems \u0026gt; UTM \u0026gt; NAD 1983 \u0026gt; NAD 1983 UTM Zone 15N.prj Click OK. Click OK again. Repeat the last few steps to create two more shapefiles: A polyline shapefile called Streets and a polygon shapefile called Parks. Make sure you set their projections to the same one as the Buildings shapefile. Digitize point features Click the Customize menu, go to Toolbars, and select Editor. In the Editor toolbar, start an edit session by clicking the Editor button and selecting Start Editing. (If a dialog box comes up, click OK.) A Create Features window will appear. Notice that the Construction Tools section near the bottom is empty. In the Create Features window, click Buildings (not the heading, but the one with the point symbol to its left). Since Buildings is a point shapefile, there are now point-related tools in the Construction Tools section. Make sure that you have the Point tool selected. Right-click the StudyArea layer and select Zoom To Layer. Looking at the map, you will see little solid black shapes scattered about. These are buildings, and we are going to digitize them as points. Move your mouse to the center of each building in the study area—zoom in further if you need to—and simply click. These points will be added into the Buildings layer. If you make a mistake, click the Edit Tool (it looks like an arrowhead pointing northwest) on the Editor toolbar, click the point, and then drag it to the correct location or hit Delete. After you finish digitizing all the buildings, click the Editor button, and select Save Edits. Change the symbology of your new Buildings layer if you wish. Save your map document. 
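Creating empty shapefiles with the right geometry type and projection, as you just did for Buildings, Streets, and Parks, is another step that is easy to script if you ever need to do it in bulk. A minimal arcpy sketch, with the output folder as a placeholder (the ArcCatalog route above is all the lab requires):

```python
import arcpy

out_folder = r"C:\Lab04\data"                    # placeholder folder
utm15n = arcpy.SpatialReference(26915)           # NAD 1983 UTM Zone 15N

for name, geom in [("Buildings", "POINT"),
                   ("Streets", "POLYLINE"),
                   ("Parks", "POLYGON")]:
    arcpy.management.CreateFeatureclass(
        out_folder, name + ".shp", geom, spatial_reference=utm15n)
```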
Digitize line features – Point Mode In the Create Features window, click Streets. Make sure you have the Line tool selected below. Click the Editor button, select Snapping, and then Snapping Toolbar. Hover your mouse over a random section of the study area boundary. Notice how the cursor “snaps” to the line and says Study Area: Edge. This snapping function will help you connect your street lines properly. Find a street intersection on the northern edge of the study area. Click on the intersection— close enough to the study area so that the cursor snaps to it—to begin a line. Add more vertices to the line by clicking your mouse along the streets; as you might imagine, curved streets require more vertices. When you want to finish a line, just double-click where you want it to end. Use this method for roughly half of the streets in the study area. (We will finish the rest using a different method.) Don’t get too hung up on making perfect streets. Save your edits periodically by clicking the Editor button and selecting Save Edits. When you’re finish digitizing, save your edits again. Change the symbology of your new line layer to something more visible (e.g., make the lines wider, change the color, etc.). Save your map document. Digitize line features – Stream Mode In the last section we digitized streets using Point Mode, which requires you to select vertices to make lines. This is a precise way to digitize, but in some cases you might want a quicker method that doesn’t involve as much precision. Enter Stream Mode. In Stream Mode, you make one vertex and then simply trace over your feature (sans clicking) until you’re done. After you finish the feature, ArcMap automatically adds vertices at an interval of your choosing. This interval is called the stream tolerance, and you can change it at any time, including when you’re in the middle of digitizing a feature.\n In the Editor toolbar, click the Editor button and select Options…. Click the General tab. In the Stream Mode section, change the Stream tolerance to 0.1 map units. Just below that, change the number of points [grouped] together when streaming to 10. Click OK. Make sure you have the Line tool selected. Right-click anywhere on the map and select Streaming. When you’re ready to digitize, just click, trace your feature, and double-click. Digitize the rest of the study area streets using this method. Save your edits. End the edit session by clicking Editor and selecting Stop Editing. Save your map document. Editing the Streets attribute table In the Table of Contents, right-click your Streets layer and select Open Attribute Table. Click the Table Options button (upper left corner) and select Add Field. In the Name field type Name. (That’s not a typo!) Change the Type to Text. Under Field Properties change the Length to 25. Click OK. Create another field called Suffix. Change its Type to Text and change its Length to 4. Close the attribute table. Right-click the Streets layer, select Selection and then Make This The Only Selectable Layer. Start an edit session. Click the Select Features tool (five places to the right of the Full Extent button). Looking at the map, click the street feature that corresponds to Lincoln Street in the map. The street will turn blue-green if selected properly. Open the Streets attribute table again. The record for the street feature you selected will be highlighted. Click in that record’s Name field and enter Lincoln. Click in the Suffix field and enter St. Repeat the last couple steps to find and name 5th St. 
Click the Clear Selected Features button (to the right of the Select Features button). In the Streets attribute table, click the leftmost part of the Lincoln St. record. This will select the record in the table and the map. Resize and/or move your attribute table so you can see that the Lincoln St. line was selected. Close the attribute table and save your edits. By default, ArcMap will only display one field at a time, and so once we tell it to display the Streets labels, it will just say “Lincoln” and “5th” (i.e., without “St.”). Let’s fix that. In the Table of Contents, double-click the Streets layer and click the Labels tab. Change the Label Field to Name. Click the Expression button. In the Expression box add the character \u0026amp; after [Name], and then double-click Suffix in the Fields box above. The expression in the box should read [Name] \u0026amp; [Suffix]. Click OK. In the Text Symbol section, make the font bold, size 12, and a color that you think will stand out on the map. Click OK. In the Table of Contents, right-click the Streets layer and select Label Features. Do you see labels on the streets? You may notice the lack of space between the name and the suffix of streets. If you wish, you can fix this problem by going back to the Expression box and changing the text to [Name] \u0026amp; “ ” \u0026amp; [Suffix]. Save your map document. Digitize polygon features In the Create Features window, click Parks. Make sure you have the Polygon tool selected. Looking at the map, you should see two trailer parks. For each one, digitize a single polygon over the whole white-ish area. (The process is similar to creating line features in Point Mode.) Save your edits and end the edit session. Open the Parks attribute table. Using the same steps as the Streets layer, add a new field called Name, change its Type to Text, and change the Length to 25. Start an edit session. In the attribute table, enter names for the trailer parks (e.g., Northwest Trailer Park, Southeast Trailer Park). Save your edits, end the edit session, and save your map document. Make a map Make a quick map of the study area: Include the three layers you created. Don’t include the map image (usgsLawEast.tif). Label the Streets and Parks features. If you want to change the look of the labels, play around in the Labels tab in the layer’s Properties. Change the fill color of the StudyArea from Hollow to 10% gray, and make its outline thinner. Include a title, author, north arrow, scale, legend, and neatline. Export a PNG of the map. Call it GEOG358_Lab04_Part2.png. You should have two images, which you will submit on Blackboard.\n"},{"id":44,"href":"/classes/geog558/labs/lab04/","title":"Lab 04 - Zonal Operations","parent":"Labs","content":"Learning Objective In this lab, we are interested in assessing how much of Douglas County is covered by tornado sirens. In other words, we want to know how many people can actually hear a tornado siren when it goes off. While the sirens for Douglas County were designed to be heard at distances of up to 5,800 feet, there might be areas in which people can’t hear them.
Finally, we will use the zonal definitions of lake area created in lab 3 to calculate the height of the water from the DEM and turn the height and area into a volume estimation.\nWhat you need to submit Lab 4: Answer Sheet\nName:\nPart 1:\nTotal Douglas County population:\nNumber of people who can hear the tornado sirens:\nNumber of people who cannot hear the tornado sirens:\nPercentage of the population that can hear the tornado sirens:\nPercentage of total population that cannot hear the tornado sirens:\n Note: Your math should add up.\n Part 2:\nHow much water did the lake gain over the time period:\nWhich period did the volume grow the fastest, 1995-2003 or 2003-2009?\nWrite a short (2-4 healthy paragraphs) analysis regarding this process. Some topics to touch on include how this analysis was performed.\n Would you expect a significantly different answer if you tweaked the methods? What are the sources of error in this process? How can we address those, and the uncertainty they add to our analysis? Materials click to download\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rRelevant part\rData Name\rDescription\r\r\rPart 1\rcensusblk.shp\rCensus blocks for Douglas County, Kansas\r\r\rPart 1\rkansas2000censusblk.dbf\rDatabase file containing census data, including population per block group\r\r\rPart 1\rsirens.shp\rPoint shapefile with locations of tornado sirens in Douglas County, KS\r\r\rPart 1\rroads.shp\rDouglas County Roads\r\r\rPart 1\rcountybnd.shp\rDouglas County Boundary\r\r\rPart 2\rBaseData\rData downloaded in the first lab\r\r\rPart 2\r…LakeBoundry.tif\rNDWI threshold maps for 1995, 2003, and 2009\r\r Part 1: Tornado siren coverage Create Buffers In ArcMap, create a new empty map, and save your map document to your personal folder. Add countybnd.shp, roads.shp, censusblk.shp, and sirens.shp from your Part1 folder to the empty arcmap. Create buffers around the sirens using the buffer tool and the following settings: Input Features: sirens Output Features: sirens_buffer.shp (make sure to save in your Part1 folder) Linear Unit: 5800 feet Dissolve Type: None Arrange layers so that you can see the siren points and buffers on top. Let’s remove the common polygon lines in the buffers. In ArcToolbox go to Data Management Tools | Generalization | Dissolve. Use the following settings: Input Features: sirens_buffer Output Features: sirens_buffer_dissolve.shp (make sure to save in your Part1 folder) Change the symbology of the layers to the colors and symbols of your choice, and arrange the layers accordingly. Notice that you could have just used the dissolve type parameter in the buffer tool, but now you know where both are :)\n Finding the Percentage of Population Covered by Sirens Add kansas2000censusblk.dbf to the data frame. Right-click the censusblk layer, select Joins and Relates, and select Join. Join the layer to kansas2000censusblk.dbf using the STFID field. 
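Before moving on to checking the join below, note that this buffer-and-summarize workflow can also be scripted. The following is only a rough arcpy sketch, not the lab procedure: it assumes the census attributes have already been joined onto censusblk so that POP2000 is available on the layer, and the workspace path is a placeholder.

```python
import arcpy

arcpy.env.workspace = r"C:\Lab04\Part1"          # placeholder workspace

# Buffer the sirens at 5,800 ft; dissolve_option="ALL" merges the overlapping rings,
# so a separate Dissolve step is not needed in the scripted version
arcpy.analysis.Buffer("sirens.shp", "sirens_buffer_dissolve.shp",
                      "5800 Feet", dissolve_option="ALL")

# Select blocks completely within the buffer and total their population
blocks = arcpy.management.MakeFeatureLayer("censusblk.shp", "blocks_lyr").getOutput(0)
arcpy.management.SelectLayerByLocation(blocks, "COMPLETELY_WITHIN",
                                       "sirens_buffer_dissolve.shp")
covered = sum(row[0] for row in arcpy.da.SearchCursor(blocks, ["POP2000"]))
print("Population within siren range:", covered)
```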
Examine the attribute table of censusblk to make sure the join was successful. Do you see all sorts of demographic attributes? If so, the join was successful. NOTE: Need a refresher on joins? Try lab 6 from the intro class.\n Right-click the header of the POP2000 attribute and select Statistics. The sum value is the number of people in Douglas County. Record the value somewhere, as you will be entering this into the comments section of your Blackboard submission. Close the window. Use Select by Location… to select the census blocks that are within the buffer zone. Hint: Use the are completely within the source layer feature setting. The result should be 1,590 census blocks selected. Find the population in these census blocks by opening the censusblk attribute table, right- clicking the POP2000 attribute header, and selecting Statistics. (It will only give you statistics on the selected records.) Record this number. Find the population out of the range of sirens by clicking Table Options (upper left corner of the attribute table), selecting Switch Selection, and then using the Statistics tool again. Record this number. Switch the selection back (i.e., back to the blocks within range of the sirens) and close the attribute table. Answer the part 1 questions. Part 2: Lake Volume estimations Open your Lab 2 Part 1 mxd or create a data frame/document. This document should have:\n AOI shapefile. lake boundary classifications (…NDWI_#.TIF) )for 1995, 2003, and 2009. The RGB file for each year The elevation data Isolate the lake boundary As you have already noticed, we need to isolate the lake boundary from any errant, misclassified pixels. We can do that a few different ways you can do this. For each image, redefine an AOI and use the raster clip to isolate the lake boundary. This is time consuming unsalable, and poorly reproducible. Here we will deploy a focal approach. Note that here, your numbers will differ from mine, region groups applies group numbers in a quasi-random fashion, and you may or may not have the same number of groups that I do.\n The more preferable way to do this is to use region groups, a tool you will cover in much more depth later in the semester. For now, just know that it groups clusters of raster pixels into unique groups. Use the …LakeBoundary.tif as the input raster Save the output raster as …groups.tif Use Eight connectivity Use the identify tool to pick out the value of the lake edge Use the raster calculator to create a zone of just that value. Hint: mine looks like \u0026ldquo;LT05_L1TP_008067_19950622_20170109_01_T1_LakeBoundry_Group.TIF\u0026rdquo; == 10. Overwrite your lake boundary raster Calculate the height the water rises to on the terrain We will use zonal statistics and our newly created zone to calculate the (insert statistic here) value of the elevation data.\n Use the newly created lake boundary as our zonal definition. We want the statistics from the elevation data. Choose a statistic, I chose maximum but feel free to explore other options. Save the output as …LakeBoundary_ZonalStat.tif This output is the “elevation” of the water, we then need to place this value in a raster that we can subtract from the DEM to calculate a lake depth at each pixel\nCreate the lake surface elevation We need to use the same process as above to pick out just the lake.\n (\u0026quot;…LakeBoundry_Group.TIF\u0026quot; == 10) | (\u0026quot;…LakeBoundry_Group.TIF\u0026quot; == 12) NOTE: That\u0026rsquo;s an | above, not a /. 
The \u0026ldquo;pipe\u0026rdquo; is computer shortcode for \u0026ldquo;OR\u0026rdquo;. Shift backslash on most keyboards.\n Save the output as …LakeRaster.tif. We then need to use the reclassify tool to turn the binary lake classification into a lake elevation\n Turn 0’s into NoData. Turn 1’s into the value of the zonal statistic output calculated in the previous step. Save the output as …LakeSurface.tif.\n Calculate the lake depth and the lake volume \rWe now need to subtract our two surfaces Hint: RasterCalcuator(lakeSurface – Elevation) Save the output as …LakeDepth.tif. Examine the attribute table of the subtraction grid (**R-click** on LakeDepth | **Open Attribute Table**).\rClick the **Select by Attribute** button and select cells with an elevation difference greater than 0.\rExport selected cells **(Table Options | Export…)** as a dBASE Table file named LakeDepth1995.dbf.\rWhen asked if you want to add the table to your map, select Yes\r If there were no errors but you cannot see LakeDepth1995.dbf in your Table of Contents, you need to switch to List By Source view\n Open LakeDepth1995.dbf **(R-click | Open)**\rAdd a new field **(Table Options | Add Field)** and...\r set the name to Area type to float precision to 16 scale to 1 and then click OK Calculate the surface area of LakeDepth1995 using the following equation, area = # of cells (count) * length of cell * width of cell\r Hint #1: (R-click column to be calculated, choose Field Calculator… if asked about calculating outside an Editing session, click Yes) Hint #2: the length and width of the cell comes from the cell size of the grid, since the cells are square, the values are the same.\n Add a new field and...\r set the name to cell_vol type to float precision to 16 scale to 1 and then click OK Calculate the volume per cell of LakeDepth1995 using the following equation: cell_vol = depth (i.e., Value) * length of cell * width of cell\rAdd a new field and...\r set the name to total_vol type to float precision to 16 scale to 1 and then click OK Calculate the total volume of LakeDepth1995 using the following equation: total_vol = volume per cell (cell_vol) * total number of cells (count) You will need to perform this process twice more for 2003 and 2009, answer the part 2 questions, and submit both parts to Blackboard.\n"},{"id":45,"href":"/classes/geog358/labs/lab05/","title":"Lab 05 - Building a GIS database","parent":"Labs","content":"Learning Objective This lab will teach you how to build a GIS database for Douglas County, KS using existing digital data from the Internet. First you will download data from Kansas Geospatial Community Commons (provided by Data Access and Support Center, or DASC). Then you will define coordinate systems and project the data to Kansas State Plane Coordinate System NAD83 meters North Zone. You will then make a second map of a county of your choice using the USGS National Map Website as your data source.\nThe primary goal of this lab is to learn how to retrieve GIS data. Gathering data is often one of the more difficult challenges for beginning GIS users, but there is a wealth of data available on the Internet for free, it just takes a little Google-foo \u0026amp; massaging to get what you want.\nTutorial Part 1: Creating a GIS Database 1. Getting started: Create the Lab05 folder structure Create a Lab05 folder. Within that folder, create Part1 and Part2 folders. 
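(If you would rather script this folder structure, including the three data subfolders described next, here is a quick sketch in plain Python with the parent path as a placeholder:)

import os

root = r"C:\GIS\Lab05"   # placeholder; use your own personal folder
for part in ("Part1", "Part2"):
    for sub in ("DataIn", "DataClip", "DataProj"):
        path = os.path.join(root, part, sub)
        if not os.path.isdir(path):
            os.makedirs(path)   # creates Lab05\PartX\<sub> in one call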
Within the Part1 folder, create three new folders called… DataIn (this will be the folder you download data into) DataClip (this folder will be for data that has been clipped to the Douglas County boundaries), DataProj (this will be the folder in which you will save data that you have projected and renamed) Within the Part2 folder, create the same folder structure. 2. Download data from the Kansas Geospatial Community Commons As we saw in lab 1, a geoportal is a great place to find GIS data. For data related to Kansas, the Kansas Data Access Support Center (DASC) is the place to be.\n Go to DASC In order to download data, you need to create a username and password. To do this, click on “Register” in the upper right corner of the webpage. Feel free to give the KU Department of Geography’s address and phone number as your information: 1475 Jayhawk Blvd, 213 Lindley Hall, University of Kansas Lawrence, KS 66045-7613 (785) 864-5143 Once your account has been confirmed, click on “Catalog.” We are interested in the following data: Census blocks County boundaries Hydrography features Roads Let’s start with the census blocks. Click the Administrative Boundaries category and then click Blocks from the list. Click the File Downloads tab in the top panel of the web page. Right-click the TIGER2012_Blocks.zip (2012- Includes 2010 demographic data) link and select Save Link As… When prompted for a location, browse to your Part1 DataIn folder and save it there. Follow the same procedure to download Counties (TIGER2010_Census_County.zip (2010- Includes demographic data) located under Administrative Boundaries). Now we want to download hydrographic (streams and lakes) data. Under the Water Resources category in the Kansas Water Features data you will find NHD Flow Lines (NHD_FLOWLINES_SHP.zip) NHD Water Areas (NHD_AREA_SHP.zip) NHD Water Bodies (NHD_WATERBODIES_SHP.zip) which are part of the National Hydrography Dataset (NHD). Download them all to your Part1 DataIn folder. (Note: Make sure you’re downloading shapefiles—the first option listed—rather than file geodatabases.) These are big files, so be patient while they’re downloading. Lastly, download the TIGER Roadways – 2012 shapefile (TIGER2012_Roads_Info.zip under the Transportation category). Your final folder layout should look like so: 3. Unzipping data Using Windows Explorer, extract all the zipped files you downloaded. (One way to do this is to right-click each folder and select Extract All. Another way is to select all the folders, right-click and select 7-Zip, and then select Extract Here.) Open a new map in ArcMap and add all the data you just examined. Save your ArcMap document as GEOG358Lab05_Part1_YourLastName.mxd in your Lab05 folder. 4. Clipping data To clip data, we theoretically need two pieces of information, the extent we want to clip to and the features we want to clip. ArcMap follows this convention, although there are a number of caveats to how it is implemented, and there are several ways to accomplish a clip. Here we will select a single feature from a layer, and the extent of that selection will dictate the extent of the clip.\nSelecting an area to clip to: The layers in your map cover the entire state of Kansas. Since we are making a map of only Douglas County, we will clip off the rest of the data we don’t need. First, we need to select the boundaries to use in our clip analysis. 
Open the attribute table of Tiger2010_Census_County (Right-click | Open Attribute Table), click the Table Options button (upper left corner) and select Select By Attributes\u0026hellip;. In the box near the bottom of the window, use the mouse to enter the expression: “NAME10”=‘Douglas’ This “code” is a variation of SQL (Structured Query Language, remember this from Lab 2?) It really cares about syntax, so it’s usually best to have ArcMap build the code for you (by using the mouse). When that is done click the Verify button. If there are no problems, click Apply; if there are problems, double- check to make sure you typed the expression correctly. Douglas County should now be selected in your attribute table and map. Close the attribute table. Clipping features: Open ArcToolbox. Under Analysis Tools | Extract, double-click the Clip tool. From the dropdown menu (click the down arrow) for Input Features, select the Tiger2010_Roadways. From the dropdown menu for Clip Features, select Tiger2010_Census_County. For the Output Feature Class, browse to your Part1 DataClip folder and save the file as Tiger2010_Roadways_Clip.shp. Press OK to execute the tool. When the Clip tool has finished, turn off all layers in the table of contents except for Tiger2010_Census_County and Tiger2010_Roadways_Clip. If the roads are only shown in Douglas County, your clip was successful. Run the Clip tool again for the other layers: NHDArea, NHDWaterbody, NHD FlowLines and Tiger2012 _Blocks. (Use those layers as the Input Features; Tiger2010_Census_County will be your Clip Feature in every instance. Make sure all output goes into your Part1 DataClip folder.) These steps will take time, so please be patient! Finally, right-click the Tiger2010_Census_County layer, select Data, and select Export Data…. Make sure the Export drop-down menu is set to “Selected features”. Change the output destination to your DataClip folder, change the Save as type to Shapefile, and save the file as DouglasCounty.shp. (This will create a layer of just Douglas County.) Once all clipping is completed, remove the original (i.e., unclipped) layers from your table of contents. 5. Re-project the shapefiles to Kansas State Plane In your ArcMap document, you should now have all the newly clipped shapefiles in your Data Frame. Open ArcToolbox, go to Data Management Tools | Projections and Transformations, and double-click the Project tool. Re-project each of the clipped shapefiles to NAD 1983 StatePlane Kansas North FIPS 1501 (Meters). Projected Coordinate Systems -\u0026gt; State Plane -\u0026gt; NAD 1983 (Meters) -\u0026gt; NAD1983 StatePlane Kansas North FIPS 1501 (Meters).prj) Note: You can search WKID: 6466 Make sure you save the new files to your Lab05 DataProj folder. Remove all un-projected data from your map. Right-click the data frame (Layers) and select Properties…. Click the Coordinate System tab and change the coordinate system to NAD1983 State Plane Kansas North FIPS 1501 (meters).prj (the same as above). Zoom in to Douglas County, take a screenshot of your entire screen, and upload the screenshot to Blackboard. It should look something like: Part 2: Creating your own county map The following section involves obtaining spatial data from the United States Geological Survey National Map website. We will create another map of a different county. For this part, we will use the new National Map website to make a map of a county outside of Kansas. If you are not from here, you can select your home county, or pick one you like. 
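(Looking back at step 5 for a moment: the batch re-projection can also be scripted. A sketch, using the WKID the step mentions and hypothetical folder paths for the clipped shapefiles:)

import arcpy, os

arcpy.env.workspace = r"C:\GIS\Lab05\Part1\DataClip"   # hypothetical path
arcpy.env.overwriteOutput = True
out_dir = r"C:\GIS\Lab05\Part1\DataProj"

# NAD 1983 StatePlane Kansas North FIPS 1501 (Meters); 6466 is the WKID noted in step 5
ks_north = arcpy.SpatialReference(6466)

for shp in arcpy.ListFeatureClasses():   # every clipped shapefile in DataClip
    arcpy.Project_management(shp, os.path.join(out_dir, shp), ks_north)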
If you lack imagination, use one of the following:\n Aroostook County, in Northernmost Maine Miami-Dade Counties in Southernmost Florida Do not select Douglas County. 1. Download data from USGS National Map Go to The National Map\n Zoom into your area of interest until you can read city and county names on the map.\n Make sure the Current Extent button is selected at the top.\n Toggle on Boundaries and Transportation. Make note of the other types of layers that are available on this site. It is a great resource for your final project.\n Under the catagories you toggled on, make sure you are getting either shapefiles or GeoTIFF, using the correct extent, and then click Find Products\n If you did this right, your window should look like so: In the results tab that appears under each layer, you can…\n Look at the footprint of each layer Look at the metadata for the later Download it Go ahead and download the data into your Part 2 DataIn folder.\n This process will create separate files for each of the three datasets. Finally, unzip the data.\n 2. Download data from the Multi-Resolution Land Characteristics (MRLC) consortium There are several land cover datasets, but perhaps the most pertinent to the US is the National Land Cover Dataset(NLCD), produced by the MRLC Consortium . Some time last year they updated their data policies and, in what I can only surmise was an effort to streamline data distribution system, made it almost impossible to get their data in a usable form without diving into tools unsuited to an intro level class. There are two ways in which you can proceed. If you trust your PC, you can download the whole 1.4 GB NLCD database and then clip to your county. Alternatively, you can download the GAP/LANDFIRE land cover dataset by state and clip from there. Regardless of your final choice, after this step you should have an unzipped geotiff in your folder. 3. Add the data to a new data frame Insert \u0026gt; New data frame Rename the new data frame to part 2 Add the data to this data frame you should have the one raster (Land Cover) and two shapefiles (Boundaries and transportation) 4. Re-project the Layers to the appropriate UTM zone projection. To decide which UTM zone is appropriate, remember you have the world at your fingertips (Google it) Check the projection of your raster by inspecting the source tab in its properties window. If it is not in the UTM zone you intend to use, reproject it. Open ArcToolbox and use the Data Management | Projection and Transformations| Raster | Project Raster tool to re-project the rasters to Projected Coordinate systems \u0026gt; UTM \u0026gt; NAD1983 \u0026gt; UTM Zone ??? N.prj Project your shapefiles into the same UTM zone. Data Management | Projection and Transformations| Project. 5. Clip the layers to the area of interest At this point, let’s explore the search function. If we were to phrase what we want to accomplish using GIS terminology, we want to clip a raster to a shapefile. 
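(Looking back at step 4: the same UTM re-projection can be scripted once you know your zone. A sketch, with the zone and file names as placeholders only:)

import arcpy, os

arcpy.env.overwriteOutput = True
data_in  = r"C:\GIS\Lab05\Part2\DataIn"    # hypothetical paths
data_out = r"C:\GIS\Lab05\Part2\DataProj"

utm = arcpy.SpatialReference("NAD 1983 UTM Zone 14N")   # placeholder; substitute your county's zone

# Land cover is categorical, so keep nearest-neighbor resampling
arcpy.ProjectRaster_management(os.path.join(data_in, "landcover.tif"),
                               os.path.join(data_out, "landcover_utm.tif"),
                               utm, "NEAREST")

for shp in ("boundaries.shp", "transportation.shp"):    # hypothetical file names
    arcpy.Project_management(os.path.join(data_in, shp),
                             os.path.join(data_out, shp.replace(".shp", "_utm.shp")),
                             utm)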
We don’t really know if this is possible, but if we search for “clip” we get several variants including… Lo and behold, the third clip tool (in the Data Management) toolbox seems promising.\n Go ahead and click on the correct tool and set up the tool like so.\n Notes: Again, include .tif in the filename If your area is irregular and you want to make a tight clip (as opposed to an outer bounding box clip), check the “Use input clipping geometry (optional)” box 6 Make a map Produce a map (keeping in mind our cartographic rules) of the reprojected county data containing the following layers: roads on top of the LAND COVER raster. If you would like, the map can show only a zoomed in portion of your county, rather than the whole thing. For this map, you need to include your name, title, legend, scale bar and north arrow. Be sure to save your map. Now export it as an image named yournameLab5Part2.png. Save the .png in your Lab05 folder and upload it to blackboard. Spend, at the very minimum, ten minutes on this, a bad looking map is worse at communicating an idea than no map at all. As a reference, here is what 10 minutes of my time looks like. You should have two images, which you will submit on blackboard.\n"},{"id":46,"href":"/classes/geog558/labs/lab05/","title":"Lab 05 - Model Builder","parent":"Labs","content":"Lab 5 - Model Builder Learning Objective As you have likely noticed, the process of delineating a lake follows a predictable workflow that could easily be automated, and because time is money, it would behoove us to create a script one could run which would create the needed outputs for us without all the tedious clicking and typing that would otherwise have to occur. Fortunately, ArcGIS has a mode, ModelBuilder, which enables us to visually create a workflow to connect our data to a series of operations without the need to actually code in python. To do this effectively, we need to modify our process somewhat, so the first part of this lab will walk us through the steps needed to perform this analysis on a single year, and part 2 will introduce the model builder interface and how this process can be automated and applied to each year we downloaded in lab 1. By the end of this lab you should have a firm grasp on the local, focal and zonal operations we need to use to create the lake, and how model builder can be used in numerous applications to speed up your workflows.\nWhat you need to submit Submit an image of both your analysis and a screenshot of your model to blackboard. 
Then submit your screenshot to the Google Spreadsheet.\nMaterials .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rBaseData\rData we downloaded as part of lab 1\r\r *** # **Part 1: Performing our analysis for one year** # **Part 1A: Setting up Arcmap** ## **Getting set up** First let’s lay down the foundation.\r Open ArcMap and make sure that the Spatial Analyst extension is activated, and overwriting files is enabled. Customize \u0026gt; Extensions \u0026amp; ArcMap options. Rename the default data frame to Base Data and add in the AOI, Bands 2 and 4, and the MTL file for each year. To force our model to calculate everything using a consistent grid size, change the geoprocessing \u0026gt; Environments settings to the following: Workspace: Set both the Current workspace and Scratch workspaces to your lab05 folder. Processing Extent: Use the folder icon and point it to your AOI shapefile. Use the folder icon next to snap raster to select the elevation dataset. Raster Analysis: Select As specified below and type in 30. Click OK. As you discovered in lab 4, the region group tool was useful in creating a raster we could use to select the desired areas. However, the region values were quasi-random. Although we could in theory make a program and so this will not work if we are attempting to create a programmatic workflow. We will instead use a “seed” and the cost distance tool to classify which of the NDWI zones we want.\nCreate the LakeSeed shapefile In the ArcCatalogue window right click on your basedata geodatabase and create a new feature class. Call the class LakeSeed and make it a point feature. Click next. Use the Layers box to select the correct projection. Click next until you reach the Field Names category. In the first empty Field Name box, type in VALUE and change the Data Type to Short Integer. In the field properties you can set the default value to 1. Click Finish. The layer should be added to your window. On the Editor toolbar, click Editor \u0026gt; Start Editing. In the Create Features window, click LakeSeed and choose the point tool. Create a single feature that will “intersect” all lake layers through time, so the center of the smallest lake extent is best. When finished, click editor \u0026gt; stop editing and save edits.\n Part 1B: Isolate the lake: Calculate and threshold NDWI (in one expression) Recall from lab 2 that this expression needs appropriate parenthesis and the Float() attribute. However, we want to reverse the expression here so that we want to classify as a lake has a 0 value, and everything else has a value of 1. 
This way we can use the cost distance tool to traverse all connected lake pixels without inuring a cost.\n Hint: My expression looks like (Float(\u0026ldquo;1995\\LT05_\u0026hellip;_B2.TIF\u0026rdquo; - \u0026ldquo;1995\\LT05_\u0026hellip;_B4.TIF\u0026rdquo;) / Float(\u0026ldquo;1995\\LT05_\u0026hellip;_B2.TIF\u0026rdquo; + \u0026ldquo;1995\\LT05_\u0026hellip;_B4.TIF\u0026rdquo;)) \u0026lt;= 0.3 Call the output NDWI_yyyy.tif. Use the LakeSeed to isolate just the NDWI zone we want Use the search bar to find the Cost Distance tool. The source data is the LakeSeed. The input cost raster is the NDWI_yyyy.tif layer. Save the output as Cost_yyyy.tif. We want just the lake, so use raster calculator and get just the areas with 0 cost Hint: “Cost_yyyy.tif” ==0 Save the raster as Lake_yyyy.tif Part 1C: Lake volume calculation: Calculate the boundary of the lake in 1995 using a focal erosion process (as per lab 3)\n Using the Focal statistics tool, calculate the focal minimum using a rectangular 3x3 grid The input should be the Lake_yyyy.tif layer Save the output Lake_yyyy_Fmin.tif Using raster calculator, subtract the focal min raster from the lake classification raster to get just the boundary. \u0026ldquo;1995\\Lake_yyyy.tif\u0026rdquo; - \u0026ldquo;1995\\Lake_yyyy_Fmin.tif\u0026rdquo; Save the output of the map as lakeEdge_yyyy.tif Calculate the height the water rises to on the terrain As per lab 4 we will use zonal statistics as table and our newly created zone to calculate the (insert statistic here) value of the elevation data.\n Use the lake boundary (lakeEdge_1995.tif) as our zonal definition We want the statistics from the elevation data, so that goes in the “Input Value Raster” box Choose a statistic, I chose maximum but feel free to explore other options Save the output as …LakeBoundary_ZonalStat Create the lake surface raster Also from lab 4 use the reclass by table function (spatial analyst) to reclassify the lake_1995 raster into a lake surface raster.\n Your input raster is Lake_1995. Your remap table is the table created from zonal statistics as table (LakeBoundary_ZonalStat) The From, to, and output value should fill in automagically, but in case they don\u0026rsquo;t think about what you are trying to accomplish. The from and to value need to match, you are essentially joining tables together here, so both need to be VALUE The Output field is the value that you are reclassifying to, so this should be MAX Save the output as LakeSurface_yyyy.tif We need just the lake elevations, and given the tools shown so far we could accomplish that by multiplying them with the Lake_1995 raster using raster calculator (to set the values we no longer want to 0), and then setting the 0 values to NoData. We can do this in 1 step through raster calculator using the Set Null function. 
The Set Null(,) function takes two arguments, a mask which sets the values to null and a raster with the values to set otherwise.\n Hint: SetNull(\u0026ldquo;Lake_yyyy\u0026rdquo; ==0, \u0026ldquo;LakeSurface_yyyy.tif\u0026rdquo;)\n Save the output raster as LakeSurface Calculate the volume (as per lab 4) Subtract the DEM from the Lake Surface using the raster calculator tool Save the output as LakeVolume.tif Use the Add field tool (data management) to create a new field Call it volume Use the Calculate Field tool to calculate the volume of each cell Hint: Depth(Value) * area (30*30) * Count Use the summery table to create a table of volume (an optional step, otherwise just know that you\u0026rsquo;ll need to sum the field yourself each time. This process was time consuming, and the only thing that changes in this workflow (once we’ve chosen a place to analyses), is the input bands. This repetitious task is best completed using a function. In ArcMap, functions can take the form of tools, and the visual programming interface called Model Builder.\n PART 2: Creating the model Set up the folders for the lab In your lab folder in the ArcCatalog pane, create a new File Geodatabase In that same lab folder, Right click and create a new \u0026gt; toolbox. Name it LakeVolumeAnalysis.tbx Right click on that toolbox and create a new \u0026gt; model. A blank Model window will appear. Take a moment to explore the tools. You can go to ArcGIS Desktop Help to find more information about each of the tools. Close the Model window. Right click on the model and In the general tab rename the model LakeVolumeForAYear Re-open the Model by R-clicking and selecting Edit… Set up the Model Properties by going to Model | Model Properties Under the Environments tab, check the boxes for Processing Extent | Extent and for Raster Analysis | Cell Size, and for Workspace Click the Values… button, and set the values as in part 1A Remove all layers from the map Starting a Model \u0026amp; Setting up the Workflow You will create a workflow diagram in the Model window that will show all the steps and operations/tools needed to delineate the sea level change. We start the process by dragging tools from ArcToolbox into the Model window to perform a specific operation.\n Open ArcToolbox and set up your model window and ArcMap window so that you can see both ArcToolbox and the model window at the same time. Use the add data button to add in the band 2 and band 4 rasters Right click on the resulting oval and rename it to Band 2 and 4 respectively Right click on it and make it a parameter by checking Model Parameter option from its popup menu (right-click). The letter P appears beside the variable, indicating it is a model parameter. Insert a new variable with insert \u0026gt; create variable Select Double Double click on it and give it a default value of 0.3 Right click on it and make it a parameter as well. Use the search function to drag in the raster calculator tool from ArcCatalog into the model builder canvas Double click on the raster calculator tool and fill it out as appropriate using the steps above. Make sure you use the drop downs to populate the raster calculator, pointing at the data is not appropriate here. Hint: My expression looks like (Float(\u0026quot;%1995 B2%\u0026quot; - \u0026ldquo;%1995 B4%\u0026quot;) / Float(\u0026quot;%1995 B2%\u0026rdquo; + \u0026ldquo;%1995 B4%\u0026quot;)) \u0026lt;= float(%Double%) Name the output NDWI.tif The bubbles should color themselves in if you entered everything in correctly. 
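(As an aside, the NDWI threshold and seed/cost-distance isolation from Part 1B can also be written directly in Python, which is essentially what the %variable% expressions stand in for. A rough sketch, with the Landsat IDs abbreviated and the geodatabase name hypothetical:)

import arcpy
from arcpy.sa import Raster, Float, CostDistance

arcpy.CheckOutExtension("Spatial")
arcpy.env.overwriteOutput = True

b2 = Raster(r"C:\GIS\Lab05\1995\LT05_..._B2.TIF")   # green band (path abbreviated)
b4 = Raster(r"C:\GIS\Lab05\1995\LT05_..._B4.TIF")   # NIR band (path abbreviated)

# Reversed NDWI threshold: lake pixels end up 0, everything else 1
ndwi = (Float(b2 - b4) / Float(b2 + b4)) <= 0.3
ndwi.save(r"C:\GIS\Lab05\NDWI_1995.tif")

# Traverse the connected zero-cost (lake) pixels outward from the seed point
cost = CostDistance(r"C:\GIS\Lab05\BaseData.gdb\LakeSeed", ndwi)
lake = cost == 0
lake.save(r"C:\GIS\Lab05\Lake_1995.tif")

(Back in the model, the filled-in bubbles are still your main progress indicator.)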
This means that the tools executed successfully, but be careful. Just because it ran doesn’t mean it is right. Your model should now look like so: Build the rest of the workflow Build the rest of the workflow as defined above by dragging in tools using the same process. A few places of minor note. When you use functions in modelbuilder, you will most often want to use the layers which have a recycle symbol next to them. These are model variables, and in tool prompts they are bracketed by % signs. So for example when you add the Set Null(,) function, it should be parameterized as follows:SetNull(\u0026quot;%Lake_1995%\u0026rdquo; ==0, \u0026ldquo;%LakeSurface_yyyy.tif%\u0026quot;)\nYour final model should look like this: Once the model is up and running, you can feed it the appropriate bands for each year to rapidly perform a lake volume analysis. Create a chart/table of the lake volume change over time, mine looks like so: Finally, submit an image of both your analysis and your model to blackboard. Create a final map of this lake analysis. This will be the last time we look at this lake this semester, so spend some time saying your goodbyes. You\u0026rsquo;ll also share your map in the Google Spreadsheet (linked on bb) so we can critique them after spring break. Feel free to use mine as inspiration. "},{"id":47,"href":"/classes/geog558/labs/lab06/","title":"Lab 06 - Cost Distance, Region Groups, more Model Builder \u0026 Python","parent":"Labs","content":"Lab 6 - Cost Distance, Region Groups, more Model Builder \u0026amp; Python Learning Objective This lab is designed to further strengthen your understanding and ability to apply local, focal, and zonal operations to solve problems and comfortability with ModelBuilder, this time in the context of delineating sea level rise. In part 1, you will walk through 2 methods of using the point and click interface to accomplish your analysis. In part 2, you will use ModelBuilder to accomplish this same task, and in part 3 you will see how you can import your own python analysis into ArcMap as tools.\nWhat you need to submit Lab 6: Answer Sheet\nName:\nQuestion 1: How many pixels are inundated?\nQuestion 2: While you’re building your models with model builder, you create several variables called model parameters that are shown with P letter next to their variables. What happens if you didn’t create those model parameters? What is the benefit of having the model parameters?\nQuestion 3: In part 02 of the lab, list all the output layers after you Run Entire Model in the ‘InundationByRegionGroup’ model builder window.\nQuestion 4: In part 02, how many cells does the final output have the value of ‘1’?\nQuestion 5: Which of the two methods of delineating sea level rise, region group or cost distance, was the more efficient means of performing this analysis? Why do you think this is?\nQuestion 6: Submit a coherent map of the output. Use mine below as an example.\nQuestion 7: This time, double click the InundationByRegionGroupPython script tool in part 03. When the model tool dialog box is opened, fill the parameter boxes with the right layer names. Click OK to run the tool. How many output layers do you have? 
List them.
Question 8: In part 03, how many cells does the final output have the value of ‘1’?
Question 9: If the Sea Level Rise is set to 4, how many cells does the final output have the value of ‘1’?
Materials click to download
Relevant part | Data Name | Description
Part 1-3 | DEM | Raster dataset of the elevation of the area
Part 1-3 | Ocean | Raster dataset of the land cover types (Land & Ocean) over the area
Part 3 | SeaLevelRiseInundation.py | Sea Level Rise inundation delineation using region group and zonal max
Part 1: Delineating Sea Level Rise Inundation with ArcMap As you saw in class, it is possible to take two different approaches to map sea level rise in GIS. We can use cost distance to calculate the rise from an ocean source, or we can use region group to get the regions of inundation which include an ocean source. Here you need to pick one of the methods and walk through it using ArcMap.
Unzip the lab 6 folder into your OneDrive; in it you will find folders already set up for part1, part2, and part3. Go ahead and pick one of the two methods below, either cost distance or region groups, and perform a 2-meter sea level rise analysis.
Cost Distance:
1. Set the oceans to a one-value source.
   a. Use the Set Null tool on the ocean raster.
   b. "Value" = 0.
   c. Your false or constant raster can be either the ocean raster or a value of 1.
   d. Call the output "source.tif".
2. Set all elevations in the DEM which are above the sea level rise threshold to null.
   a. Set Null tool again, this time on the elevation.
   b. "Value" > %Sea Level Rise% (in the manual case, 2).
   c. Set the false raster to 1 (friction should be the same everywhere otherwise).
   d. Call the output "friction.tif".
3. Create the cost distance surface.
   a. Cost Distance tool…
   b. The cost is "friction"; the source is "source".
   c. Call the output "distance.tif".
4. Remove the source cells to create a raster of just inundation.
   a. Set Null again.
   b. "Value" = 0.
   c. Again, set the value to 1 otherwise.
   d. Call the output "SLRbyCost.tif".
Region Group:
1. Create a raster of all the areas less than the projected sea level rise.
   a. Hint: Use the Set Null tool to replace all values > sea level rise with null.
   b. Otherwise, set the value to 1.
   c. Call the output demlteslr.
2. Group the resulting regions using Region Group.
   a. Use eight connectivity and group using the within method.
   b. Call the output regions.
3. Use zonal statistics to identify the regions which contain an ocean source.
   a. We want the value of oceans (input value raster) using the regions as the zone data, and the maximum statistic.
   b. Call the output zonemax.
4. We then want to remove the oceans from our calculation.
   a. Hint: raster calculator ("%zonemax%" - "%ocean%").
   b. Call the output diff.
5. Finally, set everything less than or equal to 0 to null so that we have just the inundated pixels.
   a. Hint: Set Null tool again.
   b. Call the output SLRbyRegion.
Answer question 1
Part 2: Delineating Sea Level Rise Inundation with Model Builder In ArcGIS we can use ModelBuilder to build a visual model that represents an analysis workflow. A ModelBuilder model typically consists of a set of data connected by a series of GIS operations. In this part we will be using ModelBuilder to create a model to delineate sea level rise inundation. Using a ModelBuilder model is helpful in situations where the model can be applied to numerous datasets, as we saw in the last lab. Here you will use ModelBuilder to create both variants of the SLR workflow, one for Region Group and one for Cost Distance.
Set up the folders for the lab In your lab 6 part 2 folder, create a new File Geodatabase. R-click > New > File Geodatabase. This will be your new default workspace. Add the DEM and Ocean to ArcMap and examine them. When done, remove them so you have a blank map. It will also be helpful if you set overwrite to on: Geoprocessing > Geoprocessing Options > Overwrite the outputs… Creating a Toolbox and a Model R-click on your lab 6 part 2 folder and create a new toolbox. Name the new toolbox InundationTools.tbx. R-click on the toolbox and create a new model. Close the Model window. Rename the new Model (R-click | Rename) to InundationByRegionGroup. Re-open the Model by R-clicking and selecting Edit… Set up the Model Properties by going to Model | Model Properties. Under the Environments tab, check the boxes for Processing Extent | Extent and for Raster Analysis | Cell Size. Click the Values… button. Click the down arrows for each of the settings and make the Extent the same as the DEM raster in your data folder, and the Cell Size the same as the DEM as well (they both have the same extent and cell size, if you check the data’s properties). Click OK on the Environment Settings. Click OK on the InundationByRegionGroup Properties window. Setting up the Workflow & Building the InundationByRegionGroup Model You will create a workflow diagram in the Model window that will show all the steps and operations/tools needed to delineate the sea level change. 
We start the process by dragging tools from ArcToolbox into the Model window to perform a specific operation.\n Open ArcToolbox and set up your model window and ArcMap window so that you can see both ArcToolbox and the model window at the same time. Add the Set Null tool by finding it under ArcToolbox (Spatial Analyst Tools | Conditional) and then dragging it onto the model window. Before you set the options for the Set Null tool, create a Double variable from the Insert menu. Rename it as ‘Sea Level Rise’, give the value of ‘2’ to the variable, and set it as a Model Parameter by checking Model Parameter option from its popup menu (right-click). The letter P appears beside the variable, indicating it is a model parameter. Double click on the Set Null tool. The Set Null dialog will open, and it should be familiar. We can reuse the false raster we used in the first set null. Set the input conditional raster to DEM, and set the output to demlteslr and make sure it is saved in the appropriate place. As for the expression, type in \u0026ldquo;Value\u0026rdquo; \u0026gt; %Sea Level Rise%. Click OK. Finally, right-click the input raster (DEM) and check Model Parameter option. You now have a simple model. Right-click the Set Null and select Run to test your model. Add, examine, and then remove the created Raster from the ArcMap window Add a Region Group tool (Spatial Analyst Tools \u0026gt; Generalization) to your model. Set demlteslr as its input raster Set regions as an output. The number of neighboring cells to use is eight and use within for zone grouping method. Uncheck not to add link field to output and\u0026hellip; click OK to finish the Region Group dialog. Right-click the Region Group tool and select Run to test your model. Add examine, and then remove the created Raster from the ArcMap window Add a Zonal Statistics tool (Spatial Analyst Tools \u0026gt; Zonal) to your model. Set regions as its input Set VALUE as a zone field Use zonemax as an output. Set Ocean as an input value raster. Use maximum for statistics type, then click OK. Set Ocean as a model parameter. Right-click the Zonal Statistics and select Run to test your model. Add examine, and then remove the created Raster from the ArcMap window We now need to use the Raster Calculator to calculate the difference between the zonemax and the Ocean. Hint: (\u0026quot;%zonemax%\u0026quot; - \u0026ldquo;%Ocean%\u0026quot;). Save the output as diff. Right-click the Raster Calculator and select Run to test your model. We are going to add a Set Null again to get our final output Inundation. Create another Double model variable, to define the input false raster or constant value and give it a value ‘1.’ Set diff as the tool’s input and type in the proper expression to query the values less and equal to zero. The final layout should look like so: Right-click the Set Null and select Run to test your model. Saving and testing your whole model Click OK and Save your model. While you’re building your model, you can check if each tool (so far, we created 5 tools; Set Null, Set Null(2), Region Group, Zonal Statistics, and Raster Calculator for this model) you created works fine by right-clicking and running it (select Run). This time, let’s check if whole your model runs in one go. Close the model. After all tools are correctly established in the model, delete all of the output files (delteslr, regions, zonemax, diff, inundation) from your output folder. Run your model using the appropriate inputs. 
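(Part 3 below wraps essentially this logic in a script tool. For reference, the whole InundationByRegionGroup chain, written out in the Python window, might look like the sketch below; paths are hypothetical, and in the script tool the three inputs would come from arcpy.GetParameterAsText instead of being hard-coded.)

import arcpy
from arcpy.sa import SetNull, RegionGroup, ZonalStatistics, Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.overwriteOutput = True

dem   = Raster(r"C:\GIS\Lab06\Part2\DEM")     # hypothetical paths
ocean = Raster(r"C:\GIS\Lab06\Part2\Ocean")
slr   = 2.0                                   # the Sea Level Rise parameter

# 1. Keep only land at or below the projected sea level rise (everything higher -> NoData)
demlteslr = SetNull(dem, 1, "Value > {}".format(slr))

# 2. Group the low-lying cells into connected regions
regions = RegionGroup(demlteslr, "EIGHT", "WITHIN", "NO_LINK")

# 3. Flag regions that touch an ocean cell (zonal maximum of the Ocean raster)
zonemax = ZonalStatistics(regions, "VALUE", ocean, "MAXIMUM")

# 4. Drop the ocean itself, then 5. keep only the inundated pixels
diff = zonemax - ocean
inundation = SetNull(diff, 1, "Value <= 0")
inundation.save(r"C:\GIS\Lab06\Part2\inundation.tif")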
Now recreate the cost distance approach Create a new model using the cost distance approach to delineating sea level rise. Your final model should look something like so:\nAnswer question 2-5.\n Part 3: Sea Level Rise Inundation Delineation with Python Scripting In this part of the lab we will be using ArcGIS Python Command Window to do the same analysis we did in the first part. The main purpose is to show how such a model/analysis can be implemented by using a python script tool. We, therefore, will practice how to create ArcGIS Python tool using a python script.\nCreating a script tool Within ArcGIS, you can create a script tool inside a toolbox. A script tool is like any other tool - it can be opened and executed from the tool dialog box, used in models and the Python window, and called from other scripts and script tools. To create a script tool, you need three things; a script, a custom toolbox, and a precise definition of the parameters of your script. Creating a custom toolbox You can create your own toolbox called a custom toolbox and add tools and toolsets to it. Script and model tools that you create should be stored in a custom toolbox. Open ArcCatalog. In the Catalog Tree, navigate to the folder where you want the toolbox to be created. Right-click the folder and click New \u0026gt; Toolbox. Give the toolbox a new name InundationToolbox.tbx. To create a script tool, right-click your custom toolbox and click Add \u0026gt; Script. This opens the Add Script wizard that takes you step by step through the process of creating a script tool. Give a new script a name InundationByRegionGroupPython, and click next. Add a script file by specifying the location of the file InundationByRegionGroupPython.py. On the third step of the Add Script wizard, add parameters by typing the name and selecting the data type as described in the table below. As for the property, use the default options unless specifically mentioned in the table below. \rDisplay Name\rData Type\rProperty\r\r\rOceans\rRaster Dataset\r\r\r\rDEM\rRaster Dataset\r\r\r\rSea Level Rise\rDouble\rDefault: 2\r\r\rInundation\rRaster Dataset\rDirection: output\r\r After completing the steps, your toolbox will contain a new script tool. You can always modify properties (such as parameter names and data types) of this script tool by right-clicking the script tool and choosing Properties. Double click the script file you just added. The InundationByRegionGroupPython tool dialog will appear and it should look like this image below. Input ocean \u0026amp; dem for the first two input layers. Use the default value for Sea Level Rise. Save Inundation in your part02 folder as an output layer. Click OK to run the tool. Answer question 6-9. When finished, submit your work to blackboard.\n"},{"id":48,"href":"/classes/geog358/labs/lab06/","title":"Lab 06 - Selections Queries and Joins","parent":"Labs","content":"Learning Objective As is more often the case than not, the information we need to answer a question are in separate places (read: shapefiles). Therefore, we need to join that data together. However, the concept of joins are communicated, implemented, and executed very differently within a computer/GIS context than they are in everyday language. Database joins are their own class in computer science/engineering, and we won’t cover all of them in this class. For this lab we will use two, a left join and geographic joins. You can think about left joins as joining two excel sheets together based on a common (or key) field. 
Spatial joins append entries that match the geographic filter (interest, buffer, ect.)\nPart 1: Attribute \u0026amp; Spatial Queries In this part we will execute queries by attribute, queries by space, and spatial joins. We will also conduct a simple analysis using these functions.\nMaterials\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rbookstores.shp\rBookstores in South Carolina\r\r\rcounties.shp\rCounties in South Carolina\r\r\ristates.shp\rInterstates that pass through South Carolina\r\r\rshomes.shp\rSurveyed homeowners in Santa Clara County\r\r\rsszpoly.shp\rSan Andreas Fault Special Study Zone (SSZ) near Santa Clara\r\r\rinsustatus.dbf\rInsurance status of surveyed homeowners in Santa Clara County 1 = Insured; 2 = Not Insured\r\r\rGEOG358_Lab6AnswerSheet_YourLastName.docx\rLab handout\r\r\r1. Download the lab data Open ArcMap. Add bookstores.shp, counties.shp, and istates.shp from the Part01 folder to your data frame. Go to the data frame (Layers) properties and click on the General tab. Make sure the Map Units are set to Meters and the Display Units are set to Miles. Click OK. 2. Attribute Queries (Select by Attribute) Go to Selection | Select By Attribute Set the Layer to counties. Make sure the Method option is set to Create a new selection. The attributes for the counties layer (“FID”, “AREA”, …) are displayed in the upper half of the window. You can double click on the attributes of interest to make it part of the expression in the lower half of the window. In the middle are the various operators that you will use. On the right, you can click the Get Unique Values button, which will display all of the unique values for the attribute you selected. An example of an expression is (“countyname” = ‘Douglas’), where Douglas county would be selected if we had a Kansas county dataset. Let’s select the counties where populations in 1990 were greater than 30,000 people. In the box with the attributes, scroll down and double-click “POP1990”. Notice that it shows up in the lower box. Now click the \u0026gt; button. In the lower box, type 30000 (no commas!) so that your expression reads: \u0026ldquo;POP1990\u0026rdquo; \u0026gt; 30000. Note that you can always just type the entire expression yourself instead of double-clicking attribute names, etc.; the advantage of double- clicking attribute names, however, is that it reduces the probability of typos in your expression. Check your expression by clicking the Verify button. If everything is correct, the program will tell you that “The expression was successfully verified.” Click Apply once the expression is verified and then look at the map. You should see most of the counties selected. 2. Selection Statistics Go to Selection | Statistics. This function calculates statistics based on the attributes of the selected features. 
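(For the scripting-curious: the same attribute selection, plus a quick count or sum over the selected records, can be done from the Python window. A minimal sketch, assuming the counties shapefile has been made into a layer:)

import arcpy

counties = arcpy.MakeFeatureLayer_management("counties.shp", "counties_lyr")

# Equivalent of Select By Attributes with "POP1990" > 30000
arcpy.SelectLayerByAttribute_management(counties, "NEW_SELECTION", '"POP1990" > 30000')
print(arcpy.GetCount_management(counties).getOutput(0))   # how many counties matched

# Sum a field over just the selected records (the Statistics dialog's Sum)
total = sum(row[0] for row in arcpy.da.SearchCursor(counties, ["POP1990"]))
print(total)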
If you have five counties selected, for example, you can find out the average area of those counties by selecting Area in the Field menu and then looking at the statistics below it. Changing the attributes in the Field menu won’t change which features you have selected. Using the Select By Attributes and Statistics tools, answer Questions #1 through #5 on the assignment sheet. Note that the Select By Attributes and Statistics windows can be open at the same time. Clear the selected counties by going to Selection | Clear Select Features or by clicking the Clear Selected Features button. Close the Select By Attributes and Statistics windows. 3. Spatial Queries Open the counties layer properties. Click the Selection tab, click with this color, and select a bright red color. Click Apply and OK. Use the same procedure to make the bookstores layer selections green and the istates layer selections yellow. Use the Select By Attributes function to select Richland County. Hint: Think about which attribute would contain the names of counties, select it, and then use the Get Unique Values button to get the list of county names. Now go to Selection | Select by Location… with the following parameters: Selection method: select features from Target layer(s): bookstores Source layer: counties Use selected features: (yes) Spatial selection method…: are completely within the source layer feature Click Apply. All the bookstores in Richland County should now be green, indicating that they were properly selected. Go to Selection | Statistics to find how many bookstores are in Richland County. Hint: Make sure the bookstores layer is selected! Answer Question #6 on the assignment sheet. Clear the selected features. Use the Select By Attributes tool to select Interstate 26 in the istates layer. Hint: Don’t use the ISTATES_ or ISTATES_ID attributes. I-26 will be highlighted as yellow. Use the Select by Location tool with the following parameters: Selection method: select features from Target layer(s): counties Source layer: istates Use selected features: (yes) Spatial selection method…: intersect (3d) the source layer feature Click Apply. The counties that I-26 passes through will turn red. Use the Statistics tool to find the mean median home income for those counties. (Yes, the mean of the median.) Answer Question #7 on the assignment sheet. Clear the selected features. Let’s find bookstores that are within 15 miles of I-26. As before, select I-26 from the istates layer. Use the Select By Location tool with the following parameters: Selection method: select features from Target layer(s): bookstores Source layer: istates Use selected features: (yes) Spatial selection method: are within a distance of the source layer feature Apply a search distance (this will be grayed out): 15 miles Answer Question #8 on the assignment sheet. Clear selected features. Answer Question #9 on the assignment sheet. Save your map document. Part Two: Attribute Joins and Spatial Joins In this part we will execute a spatial join and then conduct a simple analysis. The Loma Prieta earthquake (Richter 7.1) occurred on October 17, 1989 at 5:04pm Pacific Time. The epicenter was 37° 02' 00\u0026quot; N and 121° 53' 23\u0026quot; W. A large number of residential homes were damaged by the earthquake, and many were destroyed. You will be investigating the distribution of damaged homes in relation to active surface faults.\n1. Join the insurance database to the spatial data Open a new ArcMap document or add a new data frame. 
Add shomes.shp, sszpoly.shp and insustatus.dbf from the Part02 folder to the workspace. If you get a warning about an Unknown Spatial Reference, just click OK. Set the data frame map units to Meters and display units to Miles. Open the attribute table of the shomes layer. Answer Question #10 on the assignment sheet. Right-click the shomes layer and select Joins and Relates | Join…. Use the following parameters: What do you want to join to this layer?: Join attributes from a table Choose the field in this layer\u0026hellip;: PID Choose the table to join\u0026hellip;: insustatus Choose the field in the table\u0026hellip;: ID Join Options: Keep all records Click Validate Join. This is similar to the Verify button we saw in the Select By Attributes tool; it makes sure that all the right conditions are in place for the join operation to execute successfully. When you become more comfertable with the workflow/system, this is something you may be tempted to skip, but unless you are sure that the join will work, it\u0026rsquo;s usually best to just check. If you are asked whether to create an index, click Yes. Open the attribute table of the shomes layer. Answer Question #11 on the assignment sheet. Use the rest of this handout to fill in Question #12 on the assignment sheet. Right-click on shomes and select Selection | Select All. Write down the total number of surveyed homes in the table on the assignment sheet. Hint: Use the Statistics* tool or just look at the bottom of the attribute table. Use Select By Attributes to select the homes that were insured before the Loma Prieta earthquake: Using the PRELOMAINS attribute, 1 = Insured and 2 = Not insured. Write this information down in the table. Do the same for homes that were insured after the earthquake (POSTLOMAIN). Clear the selected features. 2. Determine the distance of each home to San Andreas SSZ Right-click the shomes layer and select Joins and Relates | Remove Join(s) | insustatus. Right-click shomes again and select Joins and Relates | Join… and use the following parameters: What do you want to join to this layer?: Join data from another layer based on spatial location Choose the layer to join to this layer\u0026hellip;: sszpoly Each point will be given all the attributes of the polygon that: is closest to it (Don’t press OK yet.) Click on the open folder icon and save the output housepolysj.shp in your Part02 folder. Now you can click OK Open the attribute table of your new layer and notice the Distance attribute that was created. This tells you how many meters that particular point is away from the nearest polygon. Select all features in housepolysj and use Statistics to find the average (mean) distance between the homes and San Andreas Special Study Zone (SSZ). Write this down in the assignment sheet table. Right-click the housepolysj layer and select Joins and Relates | Join…. Use the following parameters: What do you want to join to this layer?: Join attributes from a table Choose the field in this layer\u0026hellip;: PID Choose the table to join\u0026hellip;: insustatus Choose the field in the table\u0026hellip;: ID Join Options: Keep all records Use Select By Attribute on housepolysj to determine the number of homes that are within 4 miles of San Andreas SSZ (1 mile = 1609.344 meters). Write this down in the table. Determine the average distance to the San Andreas SSZ for the homes that are within 4 miles of San Andreas SSZ. Write this down in the table. 
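(The "closest polygon" join in step 2 corresponds to the Spatial Join geoprocessing tool. A hedged sketch of the scripted equivalent, asking explicitly for the Distance field, with a hypothetical workspace path; the distance comes back in the data's map units, meters here:)

import arcpy
arcpy.env.workspace = r"C:\GIS\Lab06\Part02"   # hypothetical path
arcpy.env.overwriteOutput = True

# Give each surveyed home the attributes of the nearest SSZ polygon,
# plus a "Distance" field holding how far away that polygon is
arcpy.SpatialJoin_analysis("shomes.shp", "sszpoly.shp", "housepolysj.shp",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL",
                           match_option="CLOSEST",
                           distance_field_name="Distance")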
Open the housepolysj attribute table, click the Table Options button (upper left corner) and select Switch Selection. This will invert your selection to (logically) select everything greater than 4 miles from the SSZ. Note: There is also a button on the top of the mune that inverts your selection. Determine the number of homes that are farther than 4 miles of San Andreas SSZ. Write this down in the table. Determine the average distance to the San Andreas SSZ for the homes that are farther than 4 miles of San Andreas SSZ. Write this down in the table. Using similar steps as before to determine how many homes in the housepolysj layer were insured before and after the earthquake. Write this down in the table and then calculate the percentages for each catagory. Submit the answer sheet on blackboard.\n"},{"id":49,"href":"/classes/geog358/labs/lab07/","title":"Lab 07 - Overlay \u0026 site suitability analysis","parent":"Labs","content":"Learning Objective This lab will introduce the idea of overlay analysis in terms of a common application, site suitability. Site suitability can take a number of forms but in this lab we\u0026rsquo;ll approach these concepts in vector and raster form. You will also touch briefly on Terrain analysis, something we\u0026rsquo;ll see more of in the next lab.\nPart 1: Attribute \u0026amp; Spatial Queries A logging company has been given a license to develop and to cut down trees in the Oakwood area (see the figure to the right). However, there are restrictions on where the company can cut down trees.\nThe purpose of this exercise will be to select sites where the company can log. We will use the ArcGIS to select these sites according to the selection rules given below.\nLicense Restrictions:\n No trees may be cut down within 10km of the shrine, in order to preserve the aesthetic of the landscape. No trees may be cut down within 1km of the sea, the lake, or any river in order to help prevent land erosion. The logging sites must be within 5km of existing roads for easy access by heavy logging equipment, since conservation laws will not allow any new roads to be built in this area. Materials\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rforest.shp\rForest stands in the study area\r\r\rrivers.shp\rRivers in the study area\r\r\rwater.shp\rWater bodies in the study area (1 = no water, 10 = water)\r\r\rroads.shp\rRoads through the forest stands\r\r\rshrine.shp\rShrine in the study area\r\r\r1 Read through the Part 1 section At this point you should be able to discern how to accomplish what you want to accomplish. We want to show the logging company where they can and cannot log. As is the case with most things in life, there are a few ways we can do this, and the choice is yours (if you got this, you are my new favorite student). 
Below I walk you through one way to accomplish this, but you will gain far more if you design your own way to arrive at the same end goal. 2 Data Organization Start ArcMap, create a new empty map, and save your map document to your personal folder. Add forest.shp, rivers.shp, water.shp, roads.shp and shrine.shp, all of which are located in your Part01 folder. In the data frame properties, change the display units to kilometers. Arrange the layers correctly Make sure List By Drawing Order (upper left of the Table of Contents) is selected. Arrange the layers in this order (bottom to top): water, forest, roads, rivers, and shrine. Change the symbology of each layer: Make the rivers blue.\n Make the roads black.\n Make the shrine red.\n Make the water blue. This requires extra steps:\n In the Symbology tab, click Categories near the upper left part of the window.\n Select Unique Values.\n Select ISWATER as the Value Field.\n Click Add All Values.\n Uncheck the box to the left of \u0026lt;all other values\u0026gt;.\n Looking at the Value column, recall that water = 10 and everything else = 1, so double-click the color box next to 10 and change it to a blue color, and then click the color box next to 1 and change it to Hollow.\n Remember these steps; they will come in handy when making the final map for this analysis.\n Make the forests green. There are two kinds of trees in this layer, so use a different shade of green for each kind. To do so, follow similar steps as were just performed for the water layer.\n 3 Create Buffers for Roads, Rivers, and Shrine In ArcToolbox (the button with the little red box on it) go to Analysis Tools | Proximity | Buffer. Use the following settings: Input Features: roads Output Feature Class: roads_buffer.shp (make sure to save in your Part01 folder) Linear Unit: 5 km Dissolve Type: All Click OK and give it a minute to work. When it’s done it won’t look pretty, but we’ll change that later. Follow the same steps to create a 1 km buffer around the rivers layer and a 10 km buffer around the shrine layer. Make sure you give the output files descriptive names (e.g., rivers_buffer.shp and shrine_buffer.shp). 4 Create Sea and Lake Buffers We need to select the sea and lake polygons from the water layer. If we were to buffer the layer as-is, we would end up buffering both shapes in the layer. Recall that when tools run, they typically operate within the active data frame, and use the current selection (or the whole shapefile if nothing is selected). So to accomplish what we want to do, we want only the actual water features from the water layer. To do this, go to Select by Attributes… and use the expression \u0026ldquo;ISWATER\u0026rdquo; = 10. As you did with the other layers, create a 1 km buffer of the water features. (Note that if features are selected, the buffer tool will only create buffers for those selected features.) 5 Finding Suitable Logging Sites Now we will find potential logging areas based on access to roads. In ArcToolbox go to Analysis Tools | Overlay | Intersect. Use the following settings: Input: forest and roads_buffer (you will have to select each one individually) Output Feature Class: Solution_1.shp (make sure to save in your Part01 folder) Join Attributes: All Click OK. Now we will merge the rivers_buffer and Solution_1 layers. In ArcToolbox go to Analysis Tools | Overlay | Union. Use the following settings: Input: Solution_1 and rivers_buffer Output Feature Class: Temp_1.shp (make sure to save this in your Part01 folder) Join Attributes: All Click OK.
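If you want to sanity-check (or rerun) the buffer and overlay steps above without the GUI, a minimal arcpy sketch is below. It is only an illustration: the workspace path is a placeholder, and the water buffer would additionally need the ISWATER = 10 selection described in step 4 applied first.

```python
import arcpy

arcpy.env.workspace = 'C:/labs/Part01'  # placeholder path; use your own Part01 folder

# Buffers from the license restrictions (dissolved into single features).
arcpy.analysis.Buffer('roads.shp', 'roads_buffer.shp', '5 Kilometers', dissolve_option='ALL')
arcpy.analysis.Buffer('rivers.shp', 'rivers_buffer.shp', '1 Kilometers', dissolve_option='ALL')
arcpy.analysis.Buffer('shrine.shp', 'shrine_buffer.shp', '10 Kilometers', dissolve_option='ALL')

# Forest stands that fall within 5 km of a road.
arcpy.analysis.Intersect(['forest.shp', 'roads_buffer.shp'], 'Solution_1.shp', 'ALL')

# Union with the river buffer so the exclusion areas can be selected out in the next step.
arcpy.analysis.Union(['Solution_1.shp', 'rivers_buffer.shp'], 'Temp_1.shp', 'ALL')
```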
To find those areas from Solution_1 that are not within the river exclusion zone (i.e., buffer), go to Select by Attributes…, select the Temp_1 layer, and apply the expression “FID_rivers” = -1. In the Table of Contents, right-click the Temp_1 layer and select Data | Export Data…. Save the features as a shapefile called Solution_2.shp. Leave the other settings as they are. When asked if you want to add the exported data as a layer, click Yes. Click the Clear Selected Features button on the main menu. In ArcToolbox go to Analysis Tools | Overlay | Union. Use the following settings: Input: Solution_2 and shrine_buffer Output Feature Class: Temp_2.shp (make sure to save this in your Part01 folder) Join Attributes: All Click OK. To find those areas from Solution_2 that are not within the shrine exclusion zone (i.e., buffer), go to Select by Attributes…, select the Temp_2 layer, and apply the expression “FID_shrine” = -1. In the Table of Contents, right-click the Temp_2 layer and select Data | Export Data…. Save the features as a shapefile called Final.shp. Leave the other settings as they are. In the Table of Contents, remove all layers that have names that start with Temp or Solution or that end with buffer. The following layers should remain: shrine, rivers, roads, Final, forest, and water. 6 Create a map and export Using the map below as a rough guide, create a map that indicates which forest stands the company can legally log. Make sure your symbology is clear and that the crucial components of the map are not covered up by your Final layer. (Hint: Use patterns, and/or go to the Display tab of the layer’s Properties and change the transparency.) Use some creativity— don’t just copy the map below! Export a PNG of the map and paste it in the answer sheet. Part 2 In this part of the lab, we are interested in assessing how much of Douglas County is covered by tornado sirens. In other words, we want to know how many people can actually hear a tornado siren when it goes off. While the sirens for Douglas County were designed to be heard at distances of up to 5,800 feet, there might be areas in which people can’t hear them.\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rcensusblk.shp\rCensus blocks for Douglas County, Kansas\r\r\rkansas2000censusblk.dbf\rDatabase file containing census data, including population per block group\r\r\rsirens.shp\rPoint shapefile with locations of tornado sirens in Douglas County, KS\r\r\rroads.shp\rDouglas County Roads\r\r\rcountybnd.shp\rDouglas County Boundary\r\r\r1 Create Buffers In ArcMap, create a new empty map, and save your map document to your personal folder. Add countybnd.shp, roads.shp, censusblk.shp, and sirens.shp from your Part02 folder.
Create buffers around the sirens using the following settings: Input Features: sirens Output Features: sirens_buffer.shp (make sure to save in your Part02 folder) Linear Unit: 5800 feet Dissolve Type: All Arrange layers so that you can see the siren points and buffers on top. 2 Finding the Percentage of Population Covered by Sirens Add kansas2000censusblk.dbf to the data frame. Right-click the censusblk layer, select Joins and Relates, and select Join. Join the layer to kansas2000censusblk.dbf using the STFID field. Examine the attribute table of censusblk to make sure the join was successful. Do you see all sorts of demographic attributes? If so, the join was successful. Right-click the header of the POP2000 attribute and select Statistics. The sum value is the number of people in Douglas County. Record the value somewhere, as you will be entering this into the comments section of your Blackboard submission. Close the window. Use Select by Location… to select the census blocks that are within the buffer zone. Hint: Use the “are completely within the source layer feature” setting. The result should be 1,590 census blocks selected. Find the population in these census blocks by opening the censusblk attribute table, right-clicking the POP2000 attribute header, and selecting Statistics. (It will only give you statistics on the selected records.) Record this number. Find the population out of the range of sirens by clicking Table Options (upper left corner of the attribute table), selecting Switch Selection, and then using the Statistics tool again. Record this number in your answer sheet and answer the associated questions. Total Douglas County population: Number of people who can hear the tornado sirens: Number of people who cannot hear the tornado sirens: Percentage of the population that can hear the tornado sirens: Percentage of total population that cannot hear the tornado sirens: Note: Your math should add up. If it doesn’t, reevaluate.\n Switch the selection back (i.e., back to the blocks within range of the sirens) and close the attribute table. Take a screenshot of your entire screen and paste it in your answer document. Part 3: School siting You have been asked to help a small town in Vermont find a suitable location to build a new school. There are a few considerations to take into account, including the slope of the land surface, the land use type, and the distance to both recreation sites and existing schools. You will derive these datasets, then reclassify them to a common scale from 1-10.
You will then set the model up to weight them according to a percentage influence and combine them to produce a map displaying suitable locations for the new school.\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\relevation\rRaster dataset of the elevation of the area\r\r\rlanduse\rRaster dataset of the landuse types over the area\r\r\rrec_sites\rFeature dataset displaying point locations of recreation sites\r\r\rschools\rFeature dataset displaying point locations of existing schools\r\r\r\r### Open a new ArcMap and setup\r* While I would normally say it is ok to create a new data frame and import the data, in this case we will set a few environment settings, so for reproducibility\u0026rsquo;s sake we will import data into a new map entirely.\r* Bring in the data\r Hint: Getting a permissions error when you try to bring in the rasters? Did you Google it? https://community.esri.com/thread/223412-invalid-raster-dataset-failed-to-create-raster-layer-permission-denied-how-can-i-overcome-this-problem: Think back to the data health page.\r* Click on Geoprocessing Environments:\r* Click the down arrows for the processing extent and set both the processing extent and the Snap Raster to elevation\r* Click the Raster analysis section and set the cell size to the same as elevation.\r![](/geog358/media/Environs.png)\r Framing the work flow The town has asked you to identify some suitable areas in which to site a new school. More explicitly, our site suitability analysis will concern itself with 4 factors. We don\u0026rsquo;t want to spend a lot of money converting land cover into a developable state. We also don\u0026rsquo;t want to build on a particularly steep slope. Because of the lack of a robust transit system, we want to place this school further away from existing schools to maximize the accessibility of that school to the population. We also want to place that school close to existing recreational sites. Finally, we want to prioritize these factors in order of distance to a recreational facility \u0026gt; Distance to a school \u0026gt; Land use type = Slope.\nGiven that, we have all the information we need to successfully execute this analysis, but it will be helpful if we frame what we want to accomplish in a visual sense. ArcMap has a wonderful tool, Model Builder, that provides an easy means of visualizing a GIS work flow. Because these factors have different \u0026ldquo;units\u0026rdquo;, we need to normalize them to some common scale. Let\u0026rsquo;s pick the easiest one, and make them a simple scale from 1-10, with 10 being the best. The tool to do that is the reclassify tool. The next sections will walk you through the analysis.\n Land cover factor The first thing we will do is reclassify the land cover. Use the reclassify tool.
Set the input to the landuse raster Use the LANDUSE field as the reclass field Change the values to those shown below Click on the folder icon next to the dialog box, create a new geodatabase in your part 3 folder, and save all subsequent rasters in there. Call this Output raster LandC_R Slope factor If you use the search function, there are a few ways we can calculate slope. For this application we want the Slope tool from the Spatial Analyst toolbox. Set the input raster to elevation Save the output as Slope Leave the rest as defaults Your map should look something like this: Finally, we need to reclassify the slope to our common value range. Use the reclassify tool again. Set the input raster to Slope Set the reclass field to Value Click on the classify button Set the classification method to Equal Interval, and the number of intervals to 10 Call the output SlopeR School distance factor To calculate distance we will use the Euclidean Distance tool.\n Set the source features to Schools Call the output distance raster SchDis Then we need to reclassify this raster, in comes the handy reclassify tool.\n Set the input to SchDis Set the reclass field to Value Perform the same normalization as above (classify button \u0026gt; Equal Interval, intervals to 10) Save the output as SchDisR Your final map after this step should look something like so: Recreational site distance factor We need to calculate the distance from our features again, so re-run the Euclidean Distance tool with the following settings\n Set the source features to rec_sites Call the output distance raster recDis Then we need to reclassify this raster, in comes the handy reclassify tool again.\n Set the input to recDis Set the reclass field to Value Perform the same normalization as above (classify button \u0026gt; Equal Interval, intervals to 10) Save the output as RecDisR Because we want the area closer to the sites to have a higher value, we need to click the reverse New Values button Your final map after this step should look something like so: 7 Site suitability and thresholding With all of our standardized layers, we are ready to calculate our suitability raster. Search for the Raster Calculator Tool (Spatial Analyst Tools) Create an expression to weight each factor using the table below. Call the output SchSites .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rLayer\rWeight\r\r\rRecDisR\r0.5\r\r\rSchDisR\r0.25\r\r\rLandC_R\r0.125\r\r\rSlopeR\r0.125\r\r\r The results of the site suitability are displayed below. Let\u0026rsquo;s perform one last analysis, and find and highlight the areas with a site suitability score larger than 8. Use the raster calculator tool with the following expression: SchSites \u0026gt; 8, save the output as GoodSites. 8 What to submit First let\u0026rsquo;s change the symbology of our map. Open the attribute table of the GoodSites raster and record the number of cells which meet our criteria.
Set the GoodSites symbology so that 1 is hollow, and 0 is gray. Set the transparency of the layer to 50, making a poor man\u0026rsquo;s mask of the interesting areas. Turn off all layers other than GoodSites and SchSites. Take a screenshot of the map, and record the number of cells and area rated as good for school site development. Submit the answer sheet to blackboard, you\u0026rsquo;re done! "},{"id":50,"href":"/classes/geog558/labs/lab07/","title":"Lab 07 - Siting a New School with Model Builder and Fungus weight","parent":"Labs","content":"Lab 7 - Siting a New School with Model Builder and Fungus Dispersion Modeling Learning Objective In ArcGIS, we can build a model, based on a flow of data through a series of GIS operations, to arrive at a desired end. We will be using Model Builder to create a suitability layer for school siting in Stowe, Vermont in an automated way (yes, this was the same lab from intro class, we\u0026rsquo;re going to create the model this time). In essence, we will be setting up a diagram in Model Builder that will calculate the suitability raster with one click. Using Model Builder is helpful in situations where similar operations need to be applied in a programmatic way.\nWhat you need to submit Lab 7: Answer Sheet\nName:\nPart 1: Siting a New School with Model Builder:\nQuestion 1: In your model window, go to Model | Export | To Graphic…. Export the model as a png file and call it yournameLab7_1.png (e.g., rooseveltLab7_1.png). Double-check the image file to make sure all the objects are readable and paste it below.\nQuestion 2: Total area of land with suitability 8.5 or greater for new school: m2\nPart 2: Fungal Dispersion\nQuestion 3: Using your model, complete the table below. For each city, put an “X” in the box that corresponds to the first month the fungus “invaded” the city.\n.tg {border-collapse:collapse;border-spacing:0;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg .tg-lboi{border-color:inherit;text-align:left;vertical-align:middle}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r\r\rCITY\rJan\rFeb\rMar\rApril\rMay\rJune\rJuly\r\r\rSelma, AL\r\r\r\r\r\r\r\r\r\rMontgomery, AL\r\r\r\r\r\r\r\r\r\rPensacola, FL\r\r\r\r\r\r\r\r\r\rClarksdale, MS\r\r\r\r\r\r\r\r\r\rEnterprise, AL\r\r\r\r\r\r\r\r\r\rLawrenceburg, TN\r\r\r\r\r\r\r\r\r\rAlbany, GA\r\r\r\r\r\r\r\r\r\rStarkville, MS\r\r\r\r\r\r\r\r\r\rGreenwood, SC\r\r\r\r\r\r\r\r\r\rSt. Augustine, FL\r\r\r\r\r\r\r\r\r\rNatchez, MS\r\r\r\r\r\r\r\r\r\rWright, FL\r\r\r\r\r\r\r\r\r\rEufaula, AL\r\r\r\r\r\r\r\r\r\rMobile, AL\r\r\r\r\r\r\r\r\r\rQuestion 4: You must do one of the two following (do the other for extra credit!)\n Work on the Layout View to create a composite map showing the area invaded by the fungus by the end of each month. In other words, the legend should have 6 possible values, one for each month (including the origin). Make the colors contrasting enough so that you can see the difference between them. Be sure to add your name, a title and a legend. (See the sample map for a guide). -or-\n Submit an image of an equivalent model needed to reproduce the output of the second part of the lab Save your map as a .PNG file using the Export map option of the File menu.
Save it in your local lab07/Part2 folder and call it fungus_yourname.png. Upload the answers to the questions above along with your images to blackboard. The images are worth roughly 1/2 of this lab\u0026rsquo;s grade so don\u0026rsquo;t forget to include them!\n Materials click to download\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rRelevant part\rData Name\rDescription\r\r\rPart 1\relevation \rRaster dataset of the elevation of the area\r\r\rPart 1\rlanduse\rRaster dataset of the landuse types over the area\r\r\rPart 1\rrec_sites\rFeature dataset displaying point locations of recreation sites\r\r\rPart 1\rschools \rFeature dataset displaying point locations of existing schools\r\r\rPart 2\rpre_01, pre_02, etc.\rMonthly Precipitation (mm) for January through July\r\r\rPart 2\rus48mask\rGrid of 1s for states and no data for other cells\r\r\rPart 2\rfungus_origin\rOrigin of Fungus in Panama City, FL\r\r\rPart 2\rus48cities\rLocation of Major Cities in U.S. (Vector Data Model)\r\r NOTE: Each raster is in the Albers equal area map projection with a 10,000m x 10,000m cell size.\n Part 1: School Siting (Site Suitability Analysis) Creating a Toolbox and a Model Create a new toolbox (Lab7Tools), and a new model (SchoolSiting) in that toolbox. R-click on your lab07/Part1 folder and create a new toolbox. Name the new toolbox Lab7Tools.tbx. R-click on the toolbox and create a new model. Rename the new Model (R-click/slow-click | Rename) to SchoolSiting. Set up the Model Properties by going to Model | Model Properties. Under the Environments tab, check the boxes for Processing Extent | Extent and for Raster Analysis | Cell Size. Click the Values… button. Click the down arrows for each of the settings and make the Extent the same as the landuse raster in your data folder, and the Cell Size the same as elevation. Click OK on the Environment Settings. Click OK on the SchoolSiting Properties Window. Setting up the Workflow You will create a workflow diagram in the Model window that will show all the steps and operations/tools needed to get the suitable school sites. We start the process by dragging tools from ArcToolbox into the Model window to perform a specific operation.\n Set up your screen so you can see both ArcToolbox and the model window at the same time. Add the slope tool by finding it under ArcToolbox (Spatial Analyst Tools | Surface) and then dragging it onto the model window. Double click on the slope tool. The slope dialog will open, and will look familiar. Set the input to the elevation raster (from your data folder), and set the output to slope and make sure it is saved in the output folder in the Lab 7 folder. Leave the other parameters at their defaults. Click OK. You will see the diagram objects become colored. You now have a very simple model.
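As an aside, everything this model does can also be scripted with Spatial Analyst. The short sketch below mirrors the slope and straight-line distance derivations of this model (the distance tools are set up in the next steps); it is only an illustration, the workspace path is a placeholder, and it assumes the elevation, rec_sites, and schools datasets are reachable from that workspace.

```python
import arcpy
from arcpy.sa import Slope, EucDistance

arcpy.CheckOutExtension('Spatial')
arcpy.env.workspace = 'C:/labs/lab07/Part1'  # placeholder; use your own data/output folders

# Derive the three continuous inputs that the model reclassifies later.
slope = Slope('elevation')                      # slope in degrees by default
rec_dist = EucDistance('rec_sites', cell_size=30)
sch_dist = EucDistance('schools', cell_size=30)

slope.save('output/slope')
rec_dist.save('output/recDist')
sch_dist.save('output/schDist')
```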
Next, add the Euclidean (straight line) distance tool (Spatial Analyst Tools | Distance | Euclidean Distance) to create a distance layer from rec_sites. Set the Input to rec_sites (from your data folder), the Output to recDist in your output folder, and the Cell Size to 30. You don’t need a Maximum distance or a direction Raster. Click OK. Notice that an empty diagram object is there for direction raster. Since it is blank, it will be ignored, so we will leave it alone. Add and set up another straight line distance tool to calculate distance from schools (call the output schDist). Save your Model (Model | Save). Click the Run button. Check in ArcCatalog to make sure that these three layers are calculated correctly and saved in your output folder. Add a Reclassify tool (Spatial Analyst Tools | Reclass | Reclassify) to the Model window to reclassify the slope raster. Double click on the Reclassify tool in the Model window. Select slope as the input layer. This time, select the input layer from the dropdown in this dialog window, which shows the layers that are part of the model (We do this so that it will perform this next step from what we calculated in the previous one. In other words, it maintains a flow within the model). In order to create the new classes for the slope layer, click on the Classify button, select the Equal Interval Method and 10 classes. Click OK. It is preferable that the new school site be located on relatively flat ground. You will reclassify the slope layer, giving a value of 10 to the most suitable slopes (those with the lowest angle of slope) and 1 to the least suitable slopes (those with the steepest angle of slope). Click on the Reverse New Values button so the lowest slopes get a value of ten and the steepest slopes a value of 1. Name the output raster layer as slopeR and click OK. Add a new Reclassify tool and double click on it. Select the RecDist layer as the input raster from the dropdown. Reclassify distance to recreation sites raster (recDist) into 10 classes with new values from 1 to 10 using Equal Interval classification method. The school should be located near recreational facilities. You will reclassify this dataset, giving a value of 10 to areas closest to recreation sites (the most suitable locations), giving a value of 1 to areas far from recreation sites (the least suitable locations), and ranking the values in between. Name the output raster layer as recR and click OK. Add a third Reclassify tool and double click on it. Select the SchDist layer as the input raster from the dropdown. Reclassify distance to schools raster (schD) into 10 classes with new values from 1 to 10 using Equal Interval classification method. It is necessary to locate the new school away from existing schools in order to avoid encroaching on their catchment areas. You will reclassify the distance to schools layer, giving a value of 10 to areas away from existing schools (the most suitable locations), giving a value of 1 to areas near existing schools (least suitable locations), and ranking the values in between. Name the output raster layer schR and click OK. Save your Model. Add and set up a fourth Reclassify tool in order to reclassify the landuse raster. Select ‘LANDUSE’ as the ‘Reclass field’ using the dropdown menu. 
Use the following table to assign the new values to each of the land use categories: .tg {border-collapse:collapse;border-spacing:0;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg .tg-lboi{border-color:inherit;text-align:left;vertical-align:middle}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r\r\rLanduse type\rScore\r\r\rAgriculture\r10\r\r\rBarren land\r6\r\r\rBrush/Transitional\r5\r\r\rBuilt up\r3\r\r\rForest\r4\r\r\rWater\rNoData\r\r\rWetlands\rNoData\r\r\r Call the output landuseR. Click OK and Save your Model. After all four reclassify tools are correctly established in the model, click Run again and check your output folder to make sure everything was calculated correctly.\n Calculating a Suitability Raster With all of our standardized layers, we are ready to calculate our suitability raster. Add a Raster Calculator Tool (Spatial Analyst Tools | Map Algebra | Raster Calculator) to the Model window. Double click on this tool to set up the parameters. Create an expression to weight each factor using the table below. .tg {border-collapse:collapse;border-spacing:0;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:black;}\r.tg .tg-lboi{border-color:inherit;text-align:left;vertical-align:middle}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r\r\rLayer\rWeight\r\r\rReclass of distance to recreation sites (recR)\r0.5\r\r\rReclass of distance to schools (schR)\r0.25\r\r\rReclass of landuse (landuseR)\r0.125\r\r\rReclass of slope (slopeR)\r0.125\r\r * Check your expression against the image below.\r* Name your output _suitsite_ and save it in your output folder.\r* Click OK.\r![](/geog558/media/Lab07Picture05.png)\r* Your model is now complete. Save it by going to **Model | Save**\r* Close the model.\r* In ArcCatalog, delete all of the files you have created (_slope_, _RecDist_, _SchDist_, _slopeR_, _RecR_, _SchR_, _landuseR_) from your output folder.\r* Re-open the Model by R-clicking and selecting Edit…\r* Go to **Model | Run Entire Model**.\r* All of the layers, including your _suitsite_, will be in your output folder.\r* Open ArcMap.\r* Add _suitsite_.\r* Answer questions #1 and #2 on the answer sheet.\rPart 2: Fungus Diffusion Modeling (Cost Distance) Learning Objectives:\nThis lab will introduce you to the concept of modeling a diffusion process using friction surfaces that change in space and time. Your goal is to model the diffusion of a fungus from its introduction at a seaport in Panama City, FL on January 1, 1989 throughout seven months of the calendar year (i.e., January through July).\nGetting started Add all 7 precipitation grids, fungus_origin, us48mask, and us48cities. Go to Geoprocessing | Environments… and under Raster Analysis set mask to us48mask. Under Processing Extent set the Extent to Same as layer us48mask. Set the current workspace and scratch workspace to your lab07/Part2 folder. 
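These environment settings can also be set from a script; here is a minimal arcpy sketch (the folder path is a placeholder, and it assumes the us48mask grid is in, or has been added from, your Part2 folder):

```python
import arcpy

# Placeholder folder; point this at your own lab07/Part2 directory.
part2 = 'C:/labs/lab07/Part2'

arcpy.env.workspace = part2
arcpy.env.scratchWorkspace = part2
arcpy.env.mask = 'us48mask'                           # raster analysis mask
arcpy.env.extent = arcpy.Describe('us48mask').extent  # same as layer us48mask
```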
Rearrange (if necessary) the precipitation raster layers in chronological order. Save your project as lab7lastname.mxd (e.g.: lab7washington.mxd). Modeling the diffusion process You will now model a diffusion process across a changing velocity surface. You will implement the time steps sequentially. The procedure for modeling diffusion across a dynamically changing environment involves the following steps:\n\rEstablish the origin (for the first run, it will be fungus_origin).\r**Reclassify** the Nth month of precipitation to a velocity grid (units of meters per day) based on velocity for each precipitation range.\r Identify the maximum value of precipitation for that month. Hint: The table of contents shows the range at a glance. Use the Reclassify tool (ArcToolbox | Spatial Analyst Tools | Reclass | Reclassify) to reclassify the precipitation for that month (e.g., pre_01 for January) into a rate of movement according to the following assumption about the diffusion speeds. \rPrecipitation Amount\rRate of Movement\r\r\rLess than 100mm/month\r0m/day\r\r\r100 - 200mm/month\r4000m/day\r\r\r200mm/month and greater\r7000m/day\r\r * Name the output velocity_nn.\r* If writing out the table each time is too tedious for you, you can click “Save…” and save the reclassification look-up table as a table named diff_rate which you can reuse. Create a Time per Unit Distance grid (days per meter). Using the Raster Calculator, create the following expression: 1 / Float(“velocity_nn”) Name the output friction_nn.tif Model the possible diffusion range for the Nth month by using the **Cost Distance** (ArcToolbox | Spatial Analyst Tools | Distance | Cost Distance) operation. Name the output diffuse_nn.tif\r For example (this will change as you go): Input Raster: fungus_origin Input Cost raster: friction_01 Output Distance Raster: diffuse_01.tif Create a new origin raster by reclassifying the travel time raster (diffuse_nn) from the previous step based on the number of days in the month, again using the **Reclassify** tool. Your goal is to set all values less than or equal to the number of days in a particular month to a value of 1, and all other values to no data (make sure you set the proper maximum value for each month based on the number of days). Name the output raster as fungus_nn.tif (e.g., fungus_01.tif). \rMonth\rDays\r\r\rJan\r31\r\r\rFeb\r28\r\r\rMar\r31\r\r\rApr\r30\r\r\rMay\r31\r\r\rJun\r30\r\r\rJul\r31\r\r Redo these processes for each successive month using the new fungus distribution (e.g., fungus_01) as your new origin, and repeat this until you have modeled fungus diffusion through July (07). Save your map as Lab7yourname.mxd Complete questions #3 – #4 on the assignment sheet. "},{"id":51,"href":"/classes/geog558/labs/lab08/","title":"Lab 08 - Advanced Terrain Analysis","parent":"Labs","content":"Learning Objective In this lab we\u0026rsquo;ll cover a variety of terrain analysis techniques including shelter analysis and a suite of tools within the hydrology toolbox.\nWhat you need to submit Lab 8: Answer Sheet\nName:\nPart 1\nQuestion 1:\nWhat is the value of the majority of the cells on the focalcells raster layer?\nWhere are the cells with smaller values located?\nAnd why do they have smaller values?\nQuestion 2:\nWhat is the value range in the shelter analysis raster layer? ________ to ________\nA large positive value means:\nA value close to zero means:\nA small negative value means:\nQuestion 3:\nWhat is the interpretation of the sinks output?
What would the correct symbology of this layer look like?\nQuestion 4:\nPaste your final unit hydrograph here.\n Materials click to download\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rRelevant part\rData Name\rDescription\r\r\rPart01\rchase_dem\rDEM for Chase County, KS (30m cell size, elevation in feet)\r\r\rPart02\rPour_point\rA point feature layer that depicts the outlet downstream of the Little River where you'll create a unit hydrograph.\r\r\rPart02\rStowe_boundary\rA polygon feature layer that depicts the boundaries of Stowe, Vermont. This layer was derived from data available from [Vermont Center for Geographic Information (VCGI)](https://vcgi.vermont.gov/)\r\r\rPart02\rStowe_DEM\rA raster layer that depicts elevation within the study area. It also has a resolution of 30 meters. It was derived from data made available by the [United States Geological Survey (USGS)](https://www.usgs.gov/).\r\r\rPart02_worked\rStowe hydrology folder\rA worked version of part 2 in consideration of processing power\r\r This lab, and part 2 in particular, can take a while when CPU is a limiting concern. I can offer a few suggestions, and an alternative.\n Pick a smaller area to process. You can skip the snapping step if you are careful, as outlined in the instructions. If you pause drawing, that may speed up UI interactions. When not actively manipulating the tools you can multi-task. Take the time to understand the workflow, fire off a process, and move on to something else. Try not to have anything super intensive running in conjunction, but light Microsoft Office work shouldn\u0026rsquo;t break you. As a stopgap, if it\u0026rsquo;s taking too long, I have provided a copy of my part 2 for you to look over. Use this only if needed. A great album to jam to while working. Part 1: Terrain Exposure Analysis In this part of the lab, we will perform a terrain exposure analysis, referred to as shelter analysis. The outputs of this workflow are quite popular in archeology studies and inform us as to how protected a given location is. Terrain Exposure Analysis Procedure Project and normalize raster 1) Start ArcMap and rename the data frame to Terrain Exposure Analysis. Because we are working at a relatively small geographic scale, we should first reproject our raster from its native projection (GCS_North_America_1983) to a more appropriate one for this analysis (NAD_1983_StatePlane_Colorado_South_FIPS_0503 WKID: 26955).\n Open ArcToolbox and navigate to Data Management Tools | Projections and Transformations | Raster | Project Raster For your Input raster select your chase_dem from your lab08\part01 folder using the browse button. Here we use the browse button to avoid data locks on the elevation data and avoid inadvertently assigning a projection to the dataframe prematurely.
Call the Output raster proj_dem saved in your lab08\part01 folder. To select an Output coordinate system, click on the Spatial Reference Properties Button (the hand with the card) Define the coordinate system as: Projected Coordinate Systems \u0026gt; State Plane \u0026gt; NAD 1983 (meters) \u0026gt; NAD 1983 StatePlane Colorado South FIPS 0503 (meters).prj You can also use the search tool to find it, 26955. Click OK Set the Resampling technique to BILINEAR and click OK 2) Finally, convert the elevation from feet to meters.\n Raster calculator expression: proj_dem * 0.3048 Name the output chase_m Calculate the volume of earth around a window 1) First let\u0026rsquo;s calculate the volume of earth per cell.\n Raster calculator expression, we can round here: chase_m * 30 * 30 Call your output volpercell 2) Finally, use Focal Statistics to calculate the focal sum of this raster layer using a circular neighborhood with a radius of 2 cells.\n Save the output raster layer as earthvol Calculate the earth volume of a cylinder of earth around a window 1) Create a flat surface using Raster Calculator to create a constant raster layer on which all of the cells have a value of 1.\n (“chase_m” / “chase_m”) will do the trick. Make the output raster a permanent raster named constant 2) Use Focal Statistics to calculate the focal sum on the constant raster layer using a circular neighborhood with a radius of 2 cells.\n Save the output raster layer as focalcells 3) Calculate the earth volume within the circular neighborhood assuming all the cells inside the neighborhood have the same Z value as the focus (“volpercell” * “focalcells”)\n Save the output raster layer as flatearthvol Answer question #1 on your assignment sheet Calculate the shelter layer Calculate the shelter raster layer by subtracting flatearthvol from earthvol Save the output raster layer as shelter After some stylization, you might end up with a map that looks like so:\n Answer question #2 on your assignment sheet Part 2: Creating a Unit Hydrograph The instructions herein are guided by material gratefully pilfered from https://learn.arcgis.com/en/projects/predict-floods-with-unit-hydrographs/\n As is the case with many locations across the globe, when disaster strikes planners and residents realize they have data gaps that they cannot readily fill. For example, the town of Stowe, Vermont suffered considerably when the remnants of Hurricane Irene struck the Green Mountain region in August 2011. The Little River overflowed and washed out roads, bridges, and culverts. In an effort to learn more about the event, officials tried to recreate it using a number of techniques. One way to do this is to construct a hydrograph, a line graph estimating how much water a stream will discharge during a rainstorm. In this lesson, you\u0026rsquo;ll create those hydrographs using data provided. If you\u0026rsquo;re interested in trying this for yourself I encourage you to download and run an area meaningful to you. Refer back to lab 1 if you need a refresher on how to acquire elevation data. The Model Builder diagram is shown below:\nPrecondition the elevation model Either import the Stowe_DEM and Pour_point or download and select your own pour point for this analysis.\nIdentify and Fill Sinks 1) First, you\u0026rsquo;ll identify sinks in your DEM. Although your DEM was derived from reliable data provided by the USGS, sinks may still be present. In order to identify them we need to know which way the water flows over the surface.
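If you would rather script this DEM preconditioning (for instance, to rerun it on your own study area), a minimal Spatial Analyst sketch is below; it assumes the layer names used in the following steps and a placeholder workspace.

```python
import arcpy
from arcpy.sa import FlowDirection, Sink, Fill

arcpy.CheckOutExtension('Spatial')
arcpy.env.workspace = 'C:/labs/lab08/Part02/stowe.gdb'  # placeholder geodatabase path

# Flow direction on the raw DEM, sinks identified from it, then a filled DEM.
flow_dir = FlowDirection('Stowe_DEM')
sinks = Sink(flow_dir)
filled = Fill('Stowe_DEM')

flow_dir.save('Stowe_flow_direction')
sinks.save('Stowe_sinks')
filled.save('Stowe_fill')
```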
The first tool in the workflow is the Flow Direction tool.\n For Input surface raster, choose Stowe_DEM. save the output as Stowe_flow_direction in the geodatabase. The symbology of the resulting layer corresponds to the direction water is likely to flow. The layer\u0026rsquo;s appearance isn\u0026rsquo;t important to this analysis, we just need this layer to run the next step\u0026hellip;\n2) Open the sinks tool, the next step in the analysis.\n For Input D8 flow direction raster, choose Stowe_flow_direction. Save the output as Stowe_sinks. 3) The final step in the current task uses the Fill tool to remove sinks from a DEM.\n For Input surface raster, choose Stowe_DEM Name the output Stowe_fill. Delineate the Watershed Having now removed sinks from the DEM, it is considered \u0026ldquo;hydrologically conditioned\u0026rdquo; and is ready for hydrological analysis. We\u0026rsquo;ll use this DEM to determine the watershed area for the outlet point south of Stowe (or wherever you want to drop your point!). A watershed area is the area in which all flowing water will flow toward a certain point—in this case, your outlet. With the watershed area, you\u0026rsquo;ll be able to limit your subsequent analysis results to the relevant area for your outlet. Determining a watershed requires two components: a flow direction raster layer and an accurate outlet point. One of the most common reasons for a watershed analysis to fail is when the pour point is not adequately aligned to the flow grids. For that reason the Snap Pour Point tool is often included in a workflow. This step can be skipped if you are deliberate with your placement of the point, but I\u0026rsquo;ll walk through the workflow using it here. First, you\u0026rsquo;ll determine the stream\u0026rsquo;s exact location by calculating the areas where water accumulates the most. Then, you\u0026rsquo;ll snap the outlet point\u0026rsquo;s location to match the stream and verify it.\nAssess flow directions The first step to delineate a watershed is to determine the direction that water will flow in your DEM. That way, you can determine the areas where water will flow to the outlet. To do so, you\u0026rsquo;ll\u0026hellip;\n1) Create another flow direction raster layer, this time for your DEM with filled sinks.\n For Input surface raster, choose Stowe_fill. Name your output Stowe_fill_flow_direction. Snap outlet to stream 2) Open the Flow Accumulation tool. This tool creates a raster layer that indicates where water is most likely to accumulate. The flow accumulation for each cell is expressed as a numeric value based on the number of cells that flow into that cell. Cells with high values of accumulation often coincide with the locations of flowing water bodies. With your flow accumulation raster, you\u0026rsquo;ll identify the Little River that flows through Stowe.\n For your Input flow direction raster, choose Stowe_fill_flow_direction. Name your Output accumulation raster Stowe_flow_accumulation. You\u0026rsquo;ll leave the remaining parameters unchanged. The first of these parameters applies a weight to each cell, which is useful when you expect water to be distributed in ways that aren\u0026rsquo;t uniform (for instance, in larger study areas where precipitation levels may vary significantly). The second parameter determines whether accumulation will be measured with integers or floating points. Floating point, the default value, allows for the inclusion of decimal places, making it more accurate in most situations. 
The third parameter determines the input flow direction. The next step of the task opens the Measure Distance tool (which is a tool on the ribbon, not a geoprocessing tool). You\u0026rsquo;ll use this tool to measure the distance between the current outlet point and the Little River\u0026rsquo;s location as represented in the flow accumulation raster. You can use this distance to then snap the outlet point to the correct cell so that it precisely lines up with the stream. Be sure you\u0026rsquo;re as precise as feasible by zooming in closer to the outlet point. Click the outlet point. Then double-click the approximate center of the nearest cell with a high flow accumulation value. The Measure Distance tool\u0026rsquo;s window updates to include the length of the measurement. I measured approximately 50 meters, so I\u0026rsquo;ll round up to a distance of 60 meters when I snap the outlet point to the stream. A snapping distance that is slightly higher than the measured distance will help avoid ambiguity in the tool. When deciding on a snapping distance, be sure it doesn\u0026rsquo;t exceed the measured distance by too much, or the point may snap to an area further downstream.\n3) Open up the Snap Pour Point tool.\n For Input raster or feature pour point data, choose Pour_point. For Input accumulation raster, choose Stowe_flow_accumulation. Name the output raster Stowe_snapped_outlet. For Snap distance, type 60. The tool runs and the snapped outlet raster layer is added to the map. The raster displays a single cell that represents the outlet\u0026rsquo;s new location. In the example image, the snapped outlet is represented as a yellow cell.\nDelineate watershed upstream of outlet Now that you have both a flow direction layer and an accurate outlet layer, you can determine the watershed area upstream of the outlet.\n4) The final step of the task opens the Watershed tool.\n For Input D8 flow direction raster, choose Stowe_fill_flow_direction. For Input raster or feature pour point data, choose Stowe_snapped_outlet. For Output raster, set the name to Stowe_watershed. Create a velocity field Next, you\u0026rsquo;ll start determining how long it takes water to reach the outlet, allowing the town to better predict when flooding will occur during a hypothetical rainfall event. To determine the time it takes water to flow somewhere, you first need to determine how fast water flows. You\u0026rsquo;ll calculate the speed of flowing water with a velocity field, another type of raster layer. There are many types of velocity fields, and they can be calculated with a wide variety of mathematical equations. You\u0026rsquo;ll create a velocity field that is spatially variant, but time and discharge invariant. This means that your velocity field makes the following assumptions:\n Velocity is affected by spatial components such as slope and flow accumulation (spatially variant). Velocity at a given location does not change over time (time invariant). Velocity at a given location does not depend on the location\u0026rsquo;s rate of water flow (discharge invariant). In reality, velocity could be time variant and would definitely be discharge variant. However, incorporating these variants would require additional datasets that might not be available and use modeling techniques that might not be replicable in a GIS environment.
The spatially variant, time- and discharge-invariant velocity field will provide a generally accurate result, although it\u0026rsquo;s important to remember that any method will be only an approximation of observed phenomena. You\u0026rsquo;ll use a method for creating velocity fields first proposed by Maidment et al. (1996)1. In this method, each cell in the velocity field is assigned a velocity based on the local slope and the upstream contributing area (the number of cells that flow into that cell, or flow accumulation). They use the following equation:\n V = Vm * (s^b * A^c) / (s^b * A^c)m\n Where:\n V is the velocity of a single cell with a local slope of s and an upstream contributing area of A. Coefficients b and c can be determined by calibration. In this scenario, you\u0026rsquo;ll use the method\u0026rsquo;s recommended value of b = c = 0.5. Vm is the average velocity of all cells in the watershed. You\u0026rsquo;ll assume an average velocity of Vm = 0.1 m/s. (s^b * A^c)m is the average slope-area term throughout the watershed. To avoid results that are unrealistically fast or slow, you\u0026rsquo;ll set limits for minimum and maximum velocities. The lower limit will be 0.02 meters per second, while the upper limit will be 2 meters per second. This equation is only one of many ways that a velocity field can be calculated, and it comes with several assumptions and limitations of its own.\nCreate the slope raster The primary variables in the equation you\u0026rsquo;ll use for your velocity field are slope and upstream contributing area. You already have a raster layer for upstream contributing area: the flow accumulation layer you created above, but we don\u0026rsquo;t yet have a slope layer, so that\u0026rsquo;s the next one we\u0026rsquo;ll create.\n1) Open the Slope tool.\n For the Input raster, choose Stowe_DEM Name your Output raster Stowe_slope Calculate the slope-area term Now that we have a raster layer for both slope and flow accumulation area, we\u0026rsquo;ll calculate a new raster layer that combines them. This layer will show the slope-area term (the s^b * A^c term from the Maidment et al. equation).\n2) The next step can be accomplished using the Raster Calculator tool.\n Use the fields to create the expression: Power(\u0026quot;Stowe_flow_accumulation\u0026quot;,0.5) * Power(\u0026quot;Stowe_slope\u0026quot;,0.5) Name your output raster Stowe_slope_area_term. Finally, let\u0026rsquo;s use the environments of this tool to clip to our watershed at the same time: In the tool parameters, click Environments. Under Raster Analysis for Mask, choose Stowe_watershed. Your output should look like so:\nCalculate the velocity field 3) Now that you have the slope-area term, you can calculate a velocity field using the following equation:\n V = Vm * (s^b * A^c) / (s^b * A^c)m\n As mentioned previously,\n Vm is the average velocity of all cells in the watershed. You\u0026rsquo;ll use an assumed average value of Vm = 0.1, which is recommended by Maidment et al. Similarly, (s^b * A^c)m is the average slope-area term across the watershed. Because you\u0026rsquo;ve calculated the slope-area term for the watershed, we can simply pull the mean out of our raster.
To find it you need to\u0026hellip; 3A) Right-click Stowe_slope_area_term and choose Properties, then under the Source tab expand the Statistics heading.\n3B) Then open the Raster Calculator and enter the expression below, where [Mean slope-area term] is the value that you copied from the Layer Properties window:\n Your expression should look like:\n0.1 * (\u0026quot;Stowe_slope_area_term\u0026quot; / [Mean slope-area term]) Call that output Stowe_velocity_unlimited Limit the velocities The raster layer you created is a velocity field, but it has unrealistically high and low velocities. For instance, some values in the field have a velocity of 0 meters per second, which is unlikely during an extreme rainfall event. Likewise, the maximum value of approximately 7.5 meters per second is unrealistic even during a major flood. You\u0026rsquo;ll limit the velocity values with a lower limit of 0.02 meters per second and an upper limit of 2 meters per second. Bearing in mind that there are a number of ways to end up with the desired output, the most direct in my opinion is a nested Con expression in the Raster Calculator. Other alternatives include reclassifying, multiple Con tools, or a clamp tool to name a few.\n4) Open up a Raster Calculator and enter your expression.\n My expression looks like so: Con(\u0026quot;Stowe_velocity_unlimited\u0026quot; \u0026lt; 0.02,0.02,Con(\u0026quot;Stowe_velocity_unlimited\u0026quot; \u0026gt; 2,2,\u0026quot;Stowe_velocity_unlimited\u0026quot;)) Read it from left to right: \u0026ldquo;If \u0026ldquo;Raster\u0026rdquo; is less than x, put y; otherwise, if \u0026ldquo;Raster\u0026rdquo; is greater than w, put z; otherwise put \u0026ldquo;Raster\u0026rdquo;\u0026rdquo; Save your output as Stowe_velocity. Create an isochrone map Having now created a velocity field that predicts how fast water will flow throughout Stowe, we still need to know the time it takes to get to the pour point. In this step we\u0026rsquo;ll create an isochrone map, which maps the time it takes to reach a specified location from anywhere else in an area. To create the isochrone map, you\u0026rsquo;ll first create a weight grid. Then, you\u0026rsquo;ll assess the time it takes for water to reach the outlet.\nCreate a weight grid Flow time is calculated with a relatively simple equation: the length that water must flow divided by the speed at which it flows. While you know how fast water flows due to your velocity field, you don\u0026rsquo;t know flow length. To determine flow length, you need two variables: flow direction (which you know) and weight (which you don\u0026rsquo;t). Weight, in regard to flow, represents impedance. For instance, water flowing through forested land takes longer than water flowing over smooth rock because it\u0026rsquo;s impeded by terrain. While calculating weight may seem difficult without detailed terrain data, we can simplify our model by assuming that this weight is the inverse of the velocity we calculated. This is more formally notated as\n Weight [L^-1 T] = 1 / Velocity [L T^-1]\n 1) Thus, it is a simple matter to create our weights grid with a raster calculator.\n The raster calculator expression is:\n1 / \u0026quot;Stowe_velocity\u0026quot; Save the output as Stowe_weight Your output should look like so: Assess flow time to outlet pour point 2) The next step of the task uses the Flow Length tool. While this tool, as its name suggests, normally calculates flow length, it has an optional parameter to include a weight raster.
When a weight raster is included, the tool calculates flow time instead.\n For Input flow direction raster, choose Stowe_fill_flow_direction. For Output raster, name it Stowe_time. For Direction of measurement, confirm that Downstream is chosen. For Input weight raster, choose Stowe_weight. Finally, use the environments of the tool to clip to our watershed at the same time again. Your output should look like so, and has units of seconds. Create a unit hydrograph The last step we want to accomplish is to create a unit hydrograph. In essence, the unit hydrograph is built from a histogram of the flow-time layer: the area that falls within each specific time interval. Although conceptually simple, this can be quite difficult, as outlined in the source tutorial. Fortunately, simpler solutions exist, and the exact method will vary depending on your desired output. Here I will show how to create a \u0026ldquo;raw\u0026rdquo; unit hydrograph and manipulate it within ArcMap and Microsoft Excel.\nApproach 1) Visualize a \u0026ldquo;raw\u0026rdquo; unit hydrograph histogram and then Extract to create the desired output We will first\u0026hellip;\n1) Cast the raster to an int. Because our unit is seconds, truncating our numbers is a reasonable tradeoff, and has the added benefit of creating a workable attribute table for us to export.\n The input raster is Stowe_time Save the output as Stowe_time_int 2) Export our results. We can export the raster as a table and use another software package to make a graph, or we can use ArcMap\u0026rsquo;s built in graph maker to create a simple graph.\n2A) Open the attribute table and create a graph.\n2B) Set the axis value to Value and the number of bins to something sane (I started with 35). Click Next and click Finish.\n2C) Right click on the graph and change advanced properties, or export the image.\n2D) A final graph.\n3) Let\u0026rsquo;s now export this database and manipulate it within Excel to create a unit hydrograph. Here I\u0026rsquo;ll walk you through how to take the exported database into Excel and then transform it into a PivotTable for formatting and visualization.\n To export the database, click on the table options and export it to a dbase table on your hard drive (not in a geodatabase). From there you can open the table in Excel using File \u0026gt; Open. We first need to create a column for area: click in cell D2 and enter the formula =C2*30*30. Click the little black box at the lower right hand corner of the cell to fill the formula down. Go to Insert \u0026gt; PivotTable and choose the whole table. Dump the pivot table out in the same worksheet. Add Value and Sum of Area to Rows and Values respectively. Click in the first column of Value in the pivot table, and click Group Field in the PivotTable Analyze ribbon. Group the field into chunks of 1800\u0026hellip; Copy the table, and calculate the unit hydrograph using a time field and the area divided by the timestep. Approach 2) Bin the raster into Isochrones within ArcMap before extracting the graph 1) We can Reclassify our time raster into N isochrones (contours of time) and generate a graph from that. Here I\u0026rsquo;ll demonstrate a 30 minute timestep (1800 seconds).\n Click the classify button after adding the Stowe_time raster to the input, and set the interval size to 1800 (30 minutes in seconds) Name the output Stowe_1800_isochrone. 2) Now we\u0026rsquo;ll use these as the zones to calculate the ordinate of the unit hydrograph.
These "ordinates" are the sum of the area within that time step divided by the timestep. In essence, this is the area within the isochrone divided by the range of the isochrone. We will first add a new field to the Stowe_1800_isochrone and calculate the area within each isochrone. We will then normalize that area to the time range of our isochrone.\n Use the Add Field button in the table options of the Stowe_1800_isochrone raster Name the field time and set the type to double. Right-click on the newly created field and calculate the time (simply [Value] * 1800) Create a new field called area of type double. Right-click on the newly created field and calculate the area (simply [Count] * 30 * 30) To finish, create a last field called UH_Ord of type float. Your expression here is the area divided by the range of the isochrone ([Area]/1800) 3) Finally, create the graph.\n Explore the graphing options as before. Here I demonstrate how you may add two series to the same graph (here, the raw histogram and the aggregated, normalized unit hydrograph). Answer questions 3 and 4 related to this part of the lab. Submit your final answer sheet to blackboard.\n Maidment, D.R., Olivera, F., Calver, A., Eatherall, A. and Fraczek, W. (1996), Unit hydrograph derived from a spatially distributed velocity field. Hydrol. Process., 10: 831-844. doi:10.1002/(SICI)1099-1085(199606)10:6<831::AID-HYP374>3.0.CO;2-N ↩︎\n "},{"id":52,"href":"/classes/geog358/labs/lab08/","title":"Lab 08 - Introduction to network analyst and ArcScene","parent":"Labs","content":"Part 1: Network Analysis Objectives This part of the lab is designed to introduce some basic network analyses, including how to set up a network dataset, how to find efficient routes under different rules, and how to calculate the nearest facilities.\nMaterials\n
Data Name | Description
SanFranciscoTran.gdb | Various transportation and government data about San Francisco
Restricted Turns.lyr | Pre-defined layer of restricted turns along the network
1. Create a new network dataset There are quite a few steps here where you can make a wrong turn (ba dum tss). Read and follow them carefully, and take the time to read each window in ArcMap and think about what the different options might produce.\n Create a new data frame or start a new map document. Click Customize and select Extensions. Make sure the box next to Network Analyst is checked. Click Close. Click Customize, select Toolbars, and click Network Analyst. Click the ArcCatalog button (looks like a little filing cabinet). Locate your Part1 folder in the Catalog Tree of the ArcCatalog window. Expand the folder (i.e., click the plus sign next to it) and expand the SanFranciscoTrans geodatabase.
Right-click the Transportation feature dataset, select New, and click Network Dataset. Name the network dataset Streets_ND and then click Next. Click Next three more times. Under the Field column in the Using Elevation Fields section, click F_ELEV and change it to T_ELEV. Click Next. Click Next again. On the “Specify the attributes for the network dataset” page, click the Add button. Name the attribute RestrictedTurns, change the Usage Type to Restriction, and then click OK. Click the row you just created and then click Evaluators. Under the Type column, change RestrictedTurns to Constant and change the Value column to Use Restriction. Click OK. Click Next through the rest of the prompts and then click Finish. Click Yes on the window that pops up. Give it a minute to run. Click Close on the errors window that pops up. Click Yes to add the network layer to the map The resulting map should look like a chaotic mess of streets, junctions, signposts, etc.—but that’s OK for now. Right-click the RestrictedTurns layer and select Properties. Click the Symbology tab. Click Import. Click the little folder icon, select the file Restricted Turns.lyr in your Part1 folder, and click Add. Click OK. Click OK again. Click the down arrow next to the Add Data button and select Add Basemap. Click Streets and then click Add. Give it a minute to load and say Yes if the program asks about hardware acceleration. 2 Find the best route Click the Network Analyst button on the Network Analyst toolbar and select New Route. Turn off all layers except Route, Streets_ND, and Basemap. Zoom in close to downtown San Fran (the northeast tip of the peninsula). Notice how the colors indicate varying amounts of traffic. Take a screenshot of your entire screen and save it to your personal folder (that should look like this). We are now going to find the best route for a hypothetical shopping trip. Click the Find button (looks like a pair of binoculars). Click the Locations tab and Enter in the Single Line Input field: 301 Post St., San Francisco, CA 94108 Click Find. Right-click the first result listed and select Add as Network Analysis Object. Repeat the last two steps for the following addresses: 1405 Montgomery St., San Francisco, CA 94111 39 Pier 39 Concourse, San Francisco, CA 94133 3301 Lyon St., San Francisco, CA 94123 301 Post St., San Francisco, CA 94108 (…in order to make it a round trip!) Close the Find window. Double-click Route in the Table of Contents and in the Analysis Settings tab. Check the box next to Use Start Time. Change the Time of Day to 09:00 and the Day of Week to Monday. Click OK. Click the Solve button (looks like a grid with a line through it) on the Network Analyst toolbar. Click the Directions button (right next to the Solve button). Scroll down to the bottom of the window. It should list the time and distance of the trip. Move the Directions window so that it and the route are both visible. Take a screenshot of your entire screen and save it to your personal folder (this should look like so). Part 2: 3D in GIS Objectives This part of the lab will introduce you to the ArcScene environment of the 3D Analyst extension. First you will drape an image of Death Valley in ArcScene, and then you will explore the population of Douglas County using three dimensional symbology and Google Earth imagery.\nFrom Esri’s ArcGIS Resource Center: “ArcScene is a 3D viewer that is well suited to generating perspective scenes that allow you to navigate and interact with your 3D feature and raster data. 
Based on OpenGL, ArcScene supports complex 3D line symbology and texture mapping as well as surface creation and display of TINs. All data is loaded into memory, which allows for relatively fast navigation, pan, and zoom functionality. Vector features are rendered as vectors, and raster data is either downsampled or configured into a fixed number of rows/columns you set.”\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rDVIM3.tif\rSpaceborne radar imagery over Death Valley\r\r\rdvtin\rTriangulated Irregular Network (TIN) terrain model for portion of Death Valley\r\r\rDeath Valley Terrain.lyr\rLayer file of Death Valley, for display purpose only\r\r\rDougctyblks.shp\rDouglas county census block layer\r\r\r1 Draping an Image in ArcScene Open ArcScene. Give it a minute to load. Click Cancel on the window that pops up.\n As you would in ArcMap, add the Death Valley Terrain.lyr from your Part2 folder. Also add DVIM3.TIF, The image is drawn on a plane with a base elevation of zero. Turn off the Death Valley Terrain layer.\n Right-click DVIM3.TIF and select Properties. In the Base Heights tab, click the Floating on a custom surface: radio button. Because the dvtin TIN (from which the layer file was created) is the only surface model in the scene, it appears in the dropdown list. Click OK and observe what happens to the image.\n Draping the radar image over the terrain surface allows you to see the relationship between the general shape of the land surface and the texture of the rocks and sediment that make up the surface.\n Explore the image using the 3D tools\n Right-click Scene layers (similar to ArcMap’s data frame) and select Scene Properties. Under the General tab, change the Vertical Exaggeration to 2. Click OK. Notice that the apparent height of the terrain is now doubled.\n Click the familiar Full Extent button. Click the Set Observer button (it looks like a target with an eye in front of it). This tool allows you to click on a portion of the image to observe the landscape from that point of view. Use this tool to explore the different 3D views. (Beware that clicking on flat spots will point your view to the sky! If that happens, just click the Full Extent button.)\n Select a view that you find particularly interesting. Take a screenshot of your entire screen and save it to your personal folder.\n Save your 3D ArcScene document to your Part2 folder as deathvalley.sxd. 2 Displaying attribute values in 3D Start a new ArcScene document. Add dougctyblks.shp from your Part2 folder. Let’s set the color symbology to reflect population values.\n Open the Properties of the dougctyblks layer and click the Symbology tab. Click Quantities and select pop2k as your Value field. Increase the number of classes to 10. Click Apply. Now we will use extrusion to represent population. 
(According to Esri, extrusion is "the process of stretching a flat 2D shape vertically to create a 3D object.")\n Click the Extrusion tab. Check the box to Extrude features in layer and then click on the adorable little calculator. The expression we wish to use here is simply [pop2k], so just double-click pop2k in the Fields box and then click OK. Click the Apply extrusion by: dropdown menu and select adding it to each feature's base height. Click OK. You should see the extrusions. Since we've used the same variable for color and extrusion height, this means the darker the color, the taller the extrusion. Finally, we will make the layer translucent and load it into Google Earth. Open the Properties of the dougctyblks layer and click the Display tab. Set the transparency to 50%. Click OK.\n Open ArcToolbox and go to Conversion Tools > To KML > Layer to KML. Use the dropdown menu (not the little folder) to select dougctyblks as the input layer. Save the output as dougpop.kmz in your Part2 folder. Set your Layer Output Scale to 1. Click OK. Give the tool a minute to work. Save your 3D ArcScene document to your Part2 folder as dougpop.sxd. Close ArcScene. In Windows Explorer, navigate to your Part2 folder and then double-click your dougpop.kmz file to open it in Google Earth. Pan, zoom, and tilt in order to get a good look at the extruded layer. Settle on a view you like, use Save Image, add whichever map elements you want, and save the image to your personal folder. Part 3: Terrain Analysis Objectives This section will introduce you to basic terrain analysis in the raster data model. You will learn how to: 1 Create contours, profiles, and hillshades 2 Derive slope and aspect 3 Generate a viewshed\nMaterials\n
Data Name | Description
dem30 | USGS 7.5 minute DEM of Watauga County, North Carolina (cell size 30 m)
flattop.shp | A vista platform (data collected using Trimble GPS)
trail.shp | Flattop trail (data collected using Trimble GPS)
1 Generate contours Open ArcMap. Click Customize and select Extensions. Make sure the Spatial Analyst extension is checked. Add DEM30 and TRAIL.shp from your Part3 folder. In ArcToolbox, go to Spatial Analyst Tools | Surface | Contour and use the following settings: Input raster: DEM30 (select from the dropdown list) Output polyline features: (Save as contour in your Part02 folder) Contour interval: 20 Base contour: 400 Click OK. Right-click the contour layer and select Label Features. Drag the TRAIL layer to the top of the Table of Contents. Make the trail line thicker so that it's easier to see. Right-click the layer and select Zoom To Layer. Look at the trail. Click Fixed Zoom Out a few times and look at the trail again. 2 Create hillshades and derive slope and aspect Turn off the contour layer and click Full Extent. In ArcToolbox, go to Spatial Analyst Tools | Surface | Hillshade. The input layer should be DEM30.
Leave all the settings as their defaults and create an output file named hillshade in your Part3 folder. Repeat the last step but set the Z factor to 3 and save it as hillshade3z. Turn this new hillshade on and off to see the difference between it and the original. Create another hillshade and set Altitude to 60, the Z factor to 3, and save it as hillshade3z60. Turn this new hillshade on and off to see the difference between it and the original. Turn off all the hillshades. In ArcToolbox, go to Spatial Analyst Tools | Surface | Slope. Select DEM30 (from the drop-down menu) as the Input raster. Save the Output raster as slope. Click OK. In ArcToolbox, go to Spatial Analyst Tools | Surface | Aspect. Select DEM30 (from the drop-down menu) as the Input raster. Save the Output raster as aspect. Click OK. Spend a moment looking at your slope and aspect rasters. Turn off the slope and aspect layers. 3 Create trail profile Click Customize and select Extensions. Make sure 3D Analyst is checked. Click Customize, select Toolbars, and select 3D Analyst. Right-click TRAIL and select Zoom to Layer. On the 3D Analyst toolbar, make sure your Layer is set to DEM30. Click the Interpolate Line button. Now you will trace the trail to digitize a line. Simply click once to start and then click again along the path to create vertices. There’s no need to be ultra-precise. Double-click once you have completed the line. On the 3D Analyst toolbar, click the Profile Graph tool and select Profile Graph. (The line must be selected for this to work.) Right-click on the title bar of the profile graph and select Properties. In the Appearance tab, change the Title to Trail Elevation Profile and the Footer to your name. Click Apply and OK. Take a screenshot of the entire screen and save it to your personal folder. Close the Profile Graph. 4 Create a line-of-sight On the 3D Analyst toolbar, click the Create Line of Sight button. In the dialog box that appears, type 1.7 as the Observer offset. (This will show what is visible from the perspective of an observer 1.7 meters [5’7”] tall.) Leave the Target offset as it is. Keep the dialog box open. Zoom into an area of your choosing. Draw some lines by clicking once at a start point and clicking once at an end point. The green segments of the line are visible from the observer point, The red segments are not The blue dot represents the “point of obstruction from the observer to the target.” Close the line-of-sight dialog box. Select the lines-of-sight you created and delete them. 5 Create flattop viewshed Add the FLATTOP.shp from your Part02 folder. Open the layer’s attribute table. Add two new fields, OFFSETA and RADIUS2, both with a Type of Float, a Precision of 8, and a scale of 2. Hint: Start with the Table Options button! Start an edit session. Remember this? Hint: Open the Editor toolbar. In the attribute table of FLATTOP, enter 22.05 in the OFFSETA field and 8045 in the RADIUS2 field. Save your edits and end the edit session. In ArcToolbox, go to Spatial Analyst Tools | Surface | Viewshed. Use DEM30 as the Input raster, FLATTOP as the Input point, and save the output as viewshed in your Part02 folder. Click Full Extent to see your viewshed. Look at the Table of Contents to see which color corresponds to the visible sections. Take a screenshot of the entire screen and save it to your personal folder. Lab submission: You should submit your 5 screenshots to blackboard. 
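One aside before you finish: if you ever need to regenerate the surfaces from section 2 without clicking through the dialogs, the hillshade, slope, and aspect rasters can be produced with a few lines of arcpy. This is only a sketch, under the assumption that DEM30 sits in the (placeholder) workspace path shown and that Spatial Analyst is licensed; it is not part of the lab submission.

import arcpy
from arcpy.sa import Aspect, Hillshade, Slope

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\path\to\Part3"  # placeholder path

Hillshade("DEM30").save("hillshade")                                # defaults: azimuth 315, altitude 45
Hillshade("DEM30", 315, 45, "NO_SHADOWS", 3).save("hillshade3z")    # Z factor of 3
Hillshade("DEM30", 315, 60, "NO_SHADOWS", 3).save("hillshade3z60")  # altitude 60, Z factor 3
Slope("DEM30", "DEGREE").save("slope")                              # slope in degrees (the tool default)
Aspect("DEM30").save("aspect")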
Congratulations, you are finished.\n"},{"id":53,"href":"/classes/geog558/labs/lab09/","title":"Lab 09 - Descriptive Spatial Statistics and Point Pattern Analysis","parent":"Labs","content":"Lab 9: Descriptive Spatial Statistics and Point Pattern Analysis Learning Objective In this lab, we will use a tornado touchdown points database which spans from 1950 to 2018 to introduce you to some techniques of measuring geographic distributions. Through the use of yearly and monthly mean centers and standard deviation ellipses, well explore how tornado touchdowns are distributed and then use the tracking analyst to visualize how these change over time. Next, we\u0026rsquo;ll quantify the distribution of ad hoc subsets of the data using point pattern analysis.\nWhat you need to submit Lab 9: Answer Sheet\nName:\nQuestion 1:\nRepeat steps 4 – 7 using the monthly_SDellipse layer to create the temporal layer. Export the animation as yournamemonthly_ttmov and be sure it runs properly. Upload the .avi file (just the monthly ellipses) on Blackboard.\nWhat does this animation tell you about seasonal changes in tornado touchdown locations? Is there any obvious trend?\nQuestion 2:\nFill in the details on the following table (based on point extent): In the C/R/D column, indicate whether there is significant Clustering (z-value less than -1.96), significant Dispersion (z-value greater than 1.96) or neither (R for random) using an alpha of .05.\n Place \u0026amp; Time NNR Z-Score C/R/D Kansas 2005 Washington 2005 Alabama 2005 USA May 2007 USA Nov 2007 Question 3:\nDescribe the differences in the point patterns in Kansas, Washington, and Alabama in 2005. Which (if any) of the patterns was found to be significantly different from the random distribution? Assume alpha level = .05 (the z-value associated with 95% confidence interval is +/- 1.96).\nQuestion 4:\nWhat is the total area of the conterminous US in square meters?: _____ m2.\nQuestion 5:\nFill in the details on the following table (based on the total area of the three states and the conterminous 48 states). Refer to question #2 for how to fill in the C/R/D column.\n Place \u0026amp; Time NNR Z-Score C/R/D Kansas 2005 Washington 2005 Alabama 2005 USA May 2007 USA Nov 2007 Question 6:\nWhat is different about your values in the top table (the extent of the points) and the bottom table (including the area)? Examine the point patterns and speculate on what causes these differences. Read the help file on the Average Nearest Neighbor Distance tool for help in answering this.\nQuestion 7:\nCalculate the NNI for these two states in 2007 using both the point extent (do not indicate an area) and the area of each state. Fill in the tables below\u0026hellip;\n Based on point extent (no area indicated): Place \u0026amp; Time NNR Z-Score C/R/D Michigan 2007 South Dakota 2007 Based on area: Place \u0026amp; Time NNR Z-Score C/R/D Michigan 2007 South Dakota 2007 Getting set up Gathering and formatting data We\u0026rsquo;ll start by downloading data from the NOAA Storm Prediction Center Severe GIS page. There are a number of datasets here, but what we\u0026rsquo;re after here is the original csv data, here, under the Severe Weather Database Files (1950-2017) heading.\n1) Download (at minimum) the 2005-2007_torn.csv file, and place them in their own lab 9 folder.\n Files are listed at https://www.spc.noaa.gov/wcm/#data Be sure to read the user guide so that you understand what it is you\u0026rsquo;ve actually downloaded. 
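The options for merging the yearly files are walked through just below, but if you are already comfortable in Python, a short pandas script does the same job. A minimal sketch, assuming the yearly files sit together in your lab 9 folder and follow the *_torn.csv naming pattern:

import glob
import pandas as pd

# Stack every yearly file in the folder into one table, keeping a single header row
files = sorted(glob.glob("*_torn.csv"))
merged = pd.concat((pd.read_csv(f) for f in files), ignore_index=True)
merged.to_csv("combinedcsvs.csv", index=False)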
ArcMap spatial statistics, while not particularly more computationally demanding than the average GIS process, can be slow to run in performance-limited environments. Feel free to tailor this lab to your own situation by downloading as much or as little of the database as you want. Of course using the full database will provide you with a clearer picture and richer outputs, but all questions can be answered with the 2005-2007 data. Because the database is provided as separate CSV files, we need to merge them. As is the case with most things in life, there are a few ways we can do this, and the choice is yours.\nStarting with the folder of CSV files you have downloaded, you can…\n Copy and paste them together into a single file using Notepad or Excel (least graceful; brute force) You can use Windows PowerShell natively to accomplish this (preferred in most cases) Shift right-click > "Open PowerShell window here", and copy-paste the following into the terminal:

# Append every CSV in the folder, keeping the header row from the first file only
$getFirstLine = $true
get-childItem *.csv | foreach {
    $filePath = $_
    $lines = Get-Content $filePath
    $linesToWrite = switch($getFirstLine) {
        $true {$lines}
        $false {$lines | Select -Skip 1}
    }
    $getFirstLine = $false
    Add-Content "combinedcsvs.csv" $linesToWrite
}

You can merge them using R or Python (also perfectly viable; a pandas version appears in the sketch above) In R this would look something like:

# Point the working directory at your lab 9 folder, read every CSV, and stack them
setwd("/tmp")
filenames <- list.files(pattern = "*.csv", full.names=TRUE)
csvdata <- lapply(filenames, function(i){ read.csv(i, header=TRUE) })
df <- do.call(rbind.data.frame, csvdata)
write.csv(df, "combinedcsvs.csv", row.names=FALSE)

You can merge them in ArcMap. The Merge tool takes shapefiles as an input, so you'd need to convert them to shapefiles first (instructions below) and then merge them (inelegant and counterproductive). Obviously if you wanted the full database you could have just used the 1950-2018_all_tornadoes.csv or the shapefile on the landing page, but now you know.\n 2) Set up and import the data\n Before we import anything, let's set the projection of the data frame to an equal area projection so that we can accurately visualize these within the context of dispersion. Right-click in the white area, go to Properties > Coordinate Systems, and search for WKID: 102003 (USA Contiguous Albers Equal Area Conic). Importing the tornado data Let's next bring in the CSV. Drag the CSV file from the ArcCatalog panel into the data frame. Right-click on the CSV and "Display XY Data". Set X Field to slon and Y Field to slat, and then edit the coordinate system. Set the spatial reference of the data to WKID: 4326 (WGS 1984). OK your way through the tool. Finally, as you might have noticed with the warnings, this layer has restricted functionality. To resolve this, we just need to export the data by right-clicking on it and going to Data > Export. Export All records, use the same coordinate system as the data frame, save the output as a shapefile called Torn_yyyy_yyyy.shp, and click Yes to add the results to the map. Finally, remove everything from your data frame with the exception of Torn_yyyy_yyyy.shp Adding ancillary data Next add a state boundaries shapefile from your provider of choice. The quickest way to acquire this is to use the arrow dropdown next to Add Data and "Add Data from ArcGIS Online…" You will need to sign in to an ArcGIS account in the upper right-hand corner.
Search for "United States States", and make sure the layer you add is a feature layer. Select the lower 48 states (and DC) from the layer that loads in, and then re-export the layer as lower48.shp. If this fails for some reason, you can grab state boundaries from TIGER. Visualizing the distributions Calculating and Visualizing Descriptive Spatial Statistics 1) Open the Mean Center tool.\n In the Mean Center dialog box, select Torn_yyyy_yyyy.shp as your input feature class. Name the output feature class Torn_yyyy_yyyy_mcy.shp. Set the case field to yr. 2) Open the attribute table for Torn_yyyy_yyyy_mcy. Notice that you have n records, one for each year of data you imported. In order to use tracking analysis, the extension ArcMap uses to create videos, we need to create a new field and populate it using Field Calculator to create a date the tool can read. We'll assign a single day, for example, "7/1/1950" to represent 1950.\n In the table options, Add Field to create a new field named TA_YEAR Set the type to Text Set the length to 32 Right-click on the newly created field and use the Field Calculator Enter the following expression: "7/1/" & [yr] using VB Script. 3) Calculate monthly standard deviation ellipses using the Directional Distribution (Standard Deviational Ellipse) tool in the Spatial Statistics Tools -> Measuring Geographic Distributions toolbox. Afterwards, add the appropriate field for tracking analysis.\n Select Torn_yyyy_yyyy.shp as your input feature class. Name the output feature class Torn_yyyy_yyyy_sdm.shp. Leave the ellipse size at the default. Set the Case Field to mo. Click OK. Add a new field named TA_MONTH (Text type, length 32) and assign the date (use the expression: [mo] & "/1/05" for VB Script) 4) Create a temporal layer from the Torn_yyyy_yyyy_mcy points\n Activate the Tracking Analyst extension and add its toolbar. It might have been a while; where do you turn on extensions and add toolbars? (Hint: Customize > Extensions, and right-click on the gray area in the toolbar area)\n Leave the lower48 layer on and remove all the other layers. Click on the Add Temporal Data wizard button. You want to add a feature class or shapefile containing temporal data (first radio button); click on the browse button to select Torn_yyyy_yyyy_mcy.shp as the input. Select the appropriate date/time field (i.e., TA_YEAR) and click Next. Specify the date & time format (M/d/yyyy) and click Next, then Finish. A new "Temporal Layer" will be added to the data frame's table of contents. 5) Modify the symbology of the temporal layer\n Right-click on the Temporal Layer and go to Properties; notice the properties tabs are slightly different from those of a typical shapefile. In the Symbology tab, check the Time Window box in the Show: box under Events. The number of periods is the number of intervals in the series, so set it as appropriate. For Torn_2005_2007_mcy.shp, the number of periods is 3 and the units are Years. Click OK. You have set the temporal symbology (i.e., what symbols will display over time). 6) Visualize the temporal data\n Click on the Playback Manager button on the Tracking Analyst toolbar Hint: Hover over tools to see their names, remember?
Click on Options and set the temporal extent to the temporal layer (Torn_yyyy_yyyy_mcy). Change the playback rate to 1.00 years per second. Click Play and watch the animation. Write a film analysis detailing the evolution of the main character (Torn_yyyy_yyyy_mcy) with respect to the changing of the seasons. I kid… 7) Export the animation to a movie\n On the Tracking Analyst toolbar, go to Tracking Analyst -> Animation Tool. Change the frame size width to 320 and be sure Maintain Aspect Ratio is selected. Set the output file to yourname_yearly_ttmov. Click Generate. Go to Windows Explorer and launch your movie to make sure the animation was generated correctly. Answer question #1 on your answer sheet. Part Two: Nearest Neighbor Analysis 1) Select tornado touchdowns within Kansas in 2005.\n Starting with lower48 and Torn_yyyy_yyyy.shp, export the tornado touchdown data for 2005 (Select By Attributes on the input layer Torn_yyyy_yyyy.shp with "YEAR" = '2005', then R-Click the layer and Export Data…), call it tt_KS_2005.shp, and add it to the map. Turn off the Torn_yyyy_yyyy and tt_KS_2005 layers temporarily and use the select tool to select Kansas (alternatively you can use the Select By Attributes tool). Go to Selection > Select By Location, select features from tt_KS_2005 that intersect the features in lower48, and check Use selected features. It should inform you that there is one feature (the state of Kansas) selected. Click OK. Turn the tt_KS_2005 layer back on to verify that only touchdown points for the year 2005 in Kansas are highlighted. 2) Using the Average Nearest Neighbor tool in the Spatial Statistics Tools | Analyzing Patterns toolbox, calculate the Nearest Neighbor Index for Kansas in 2005.\n Input feature class: tt_KS_2005; use Euclidean distance and check Generate Report. To see the report, open 'NearestNeighbor_Result.html' in the Results window. Click OK and fill in the appropriate information in the table (question #2 on the answer sheet). 3) Calculate Nearest Neighbor Indices for Washington (the state) and Alabama in 2005\n Repeat parts of step 1 and step 2 above for both Washington and Alabama. Fill in the table in question #2 and answer question #3 on the answer sheet. 4) Calculating Nearest Neighbor Indices for two months (May and November) across the US in 2007\n Select the touchdown points that occurred in May of 2007 (Select By Attributes) and export the data, naming the output tt_May07. Do the same for November and name the exported dataset appropriately. Repeat step 2 to calculate the NNI for the two months. Fill in the table in question #2 on the answer sheet. 5) Calculate the total area of the 48 conterminous states and DC\n Open the attribute table for the lower48 shapefile. Add a new field named AreaSQM. Make it type Float. Right-click on the AreaSQM field and click on Calculate Geometry. Make the property Area and use the coordinate system of the data source. Keep the units as square meters and click OK. R-Click on the AreaSQM field, click on Statistics, and note the Sum. Answer question #4 on your answer sheet. 6) Use the total area to calculate the NNI of the tornado touchdowns in May and November of 2007\n Repeat step 2 but include the total area figure in the Area box. To make this easier, select the value from the Statistics window and hit CTRL-C to copy the text to the clipboard. Then click on the Area field in the Nearest Neighbor window and use CTRL-V to paste the number. Fill in the results in the table in question #5 on your answer sheet.
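If you would like to double-check the numbers you are copying into the answer tables, or batch the runs for several subsets at once, the same tool can be called from Python. This is a hedged sketch only: the shapefile names are the ones exported above, the area value is a placeholder for the sum you noted in step 5, and the derived-output order (ratio first, z-score second) follows the tool's documentation.

import arcpy

arcpy.env.workspace = r"C:\path\to\lab9"  # placeholder path
lower48_area = 7.7e12                      # placeholder: paste your AreaSQM sum here (square meters)

for fc in ("tt_May07.shp", "tt_Nov07.shp"):
    for area in (None, lower48_area):      # None lets the tool fall back to the extent of the points
        result = arcpy.stats.AverageNearestNeighbor(fc, "EUCLIDEAN_DISTANCE", "NO_REPORT", area)
        # Derived outputs: 0 = nearest neighbor ratio, 1 = z-score
        print(fc, area, result.getOutput(0), result.getOutput(1))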
7) Calculate the Nearest Neighbor Indices again, this time using each region's respective area.\n Repeat the above steps for the nearest neighbor analysis (you can copy the area value from the attribute table and paste it into the dialog box). Fill in the rest of the table in question #5. 8) Using what you've learned, answer questions 6 & 7 on your answer sheet.\nSubmit your final answer sheet to blackboard.\n"},{"id":54,"href":"/classes/geog358/labs/lab09/","title":"Lab 09 - Interpolation and Fire Hazard Modeling","parent":"Labs","content":"Part 1: Temperature Modeling and Interpolation Objectives\nThis part will introduce you to the basic operations of cartographic modeling (or map algebra). We will be calculating temperature across the U.S. using a Digital Elevation Model (DEM) and building a simple fire hazard model.\nMaterials\n
Data Name | Description
USDEM10K | DEM covering the conterminous US
US_TEMP.shp | Weather stations across the conterminous US
STAT48.shp | The 48 conterminous states in the US
1 Temperature estimation and comparison Open ArcMap. Add US_TEMP.shp, STAT48.shp and the USDEM10k grid (all in the Part01 folder) to the data frame. Ignore warnings about spatial information.\n The original DEM is recorded in feet. However, we need to convert it to meters. Open ArcToolbox and go to Spatial Analyst Tools | Map Algebra | Raster Calculator.\n If you get an error message about licensing, go to Customize | Extensions… and check the Spatial Analyst box. Double-click the "USDEM10K" layer, and then add * .3048 to the expression. (1 ft = 0.3048 m) Save the Output raster in your Part01 folder as DEM_meters. Your new DEM will appear in your table of contents with values ranging from 0 to ~4,091.\n Stat48grid will be a raster representation of the STAT48 polygon shapefile. In ArcToolbox, go to Conversion Tools | To Raster | Feature to Raster. Use the following settings:\n Input features: STAT48 Field: STAT48L Output raster: (Save as Stat48grid in your Part01 folder) Output cell size: 10000 Right-click the ArcToolbox node (the top of the ArcToolbox menu; scroll up if you can't see it) and click Environments….\n Click Workspace and set both the Current Workspace and the Scratch Workspace to your Part01 folder. (You may have to go up one level to get to the full path of your Part01 folder.)\n Scroll down, click Raster Analysis, and set Mask to Stat48grid.\n Click OK.\n 2 Using Global Lapse Rate to Model Temperature The airport in Columbia, South Carolina has an observed mean annual temperature of 17 °C with an elevation of approximately 65 m. The global lapse rate tells us that temperature decreases 6.5 °C for every 1,000 m increase in elevation. Let's calculate the elevation and temperature difference in each raster cell from the Columbia Airport.
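The three Raster Calculator runs below implement exactly this relationship; if it helps to see the whole model in one place first, here it is as a short arcpy map-algebra sketch. It is only illustrative, and it assumes DEM_meters already exists in the (placeholder) Part01 workspace and that Spatial Analyst is licensed.

import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\path\to\Part01"  # placeholder path

elev_diff = Raster("DEM_meters") - 65       # elevation difference from the Columbia airport (m)
elev_diff.save("elev_diff")

temp_diff = elev_diff * (6.5 / 1000.0)      # global lapse rate: 6.5 deg C per 1,000 m of elevation
temp_diff.save("temp_diff")

temp_model = 17 - temp_diff                 # estimated mean annual temperature (deg C)
temp_model.save("temp_model")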
In ArcToolbox go to Spatial Analyst Tools | Map Algebra | Raster Calculator. To calculate the elevation difference from Columbia Airport, use the expression: “DEM_meters” – 65 Save the output in your Part01 folder as elev_diff. Open the Raster Calculator again. To calculate the temperature difference from Columbia Airport based on the global lapse rate, use the expression: “elev_diff” * (6.5 / 1000) Save the output in your Part01 folder as temp_diff. Finally, using the mean annual temperature at Columbia Airport (17o C) and the difference in temperature at each cell from this location (temp_diff), we will estimate the temperature across the U.S. In the Raster Calculator, use the expression: 17 – “temp_diff” Save the output in your Part01 folder as temp_model. 3 Using Inverse Distance Weighting to Interpolate Temperature Now we’re going to interpolate a temperature grid from the US_TEMP shapefile using the Inverse Distance Weighting (IDW) method. In ArcToolbox, go to Spatial Analyst Tools | Interpolation | IDW and use the following settings: Input point features: US_TEMP Z value field: TEMPC Output raster: (Save as temp_IDW in your Part01 folder) Output cell size: 10000 Power: 2 The result is another version of temperature across the nation, but this time it’s based on interpolation between established weather stations. Using the Raster Calculator, calculate the temperature difference between estimated (based on global lapse) and interpolated temperature: “temp_model” – “temp_IDW” Save the output in your Part01 folder as temp_compare. Save your map document as YourLastName_temp_model.mxd. Take a screenshot of the entire ArcMap window and save it to your personal folder. Part 2: Fire Hazard Modeling Objectives\nIn areas of the United States where the population has expanded to rural and forested land covers, the risk of wildland fire is great. The term Wildland/Urban Interface (WUI) is used to describe these areas. Special mitigation and response strategies have been developed to manage risk in these areas. The National Fire Protection Association (NFPA) has developed a set of standards for evaluating risk in developing areas threatened by wildfire.\nThis part of the lab presents a portion of a very simple wildfire risk model. It shows how to use GIS to model just two of the many factors that influence wildfire risk: slope and fuel. Together these two factors make up one of the criteria specified in the NFPA 299 Standards for Protection of Life and Property from Wildfire. (A more realistic model of wildfire risk would use complex data that would include a very detailed breakdown of the vegetation into fuel classes and would model both slope and aspect. Additional data on weather history, the locations of structures, fire history, access, the evacuation routes available, and other factors that might affect fire ignition and spread would also be incorporated in the model.) You will convert digital elevation model (DEM) data and vector files to raster data and assign a fire risk value and percentage of influence. 
Then you will create a model that will generate a map showing overall fire hazard risk.\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\relevation.dem\rTerrain surface (.dem format)\r\r\rstudyarea\rExtent of the fire risk model\r\r\rvegetation\rVegetation types in the study area\r\r\r1 Building a fire hazard model Start a new ArcMap document (or new data frame) and add the vegetation and studyarea shapefiles from your Part02 folder. Ignore warnings about spatial reference. Right-click the ArcToolbox node (the top of the ArcToolbox) and click Environments…. Click Workspace and set both the Current Workspace and the Scratch Workspace to your Part02 folder. Click Processing Extent and change the Extent to Same as layer studyarea. Click Raster Analysis and set the Cell Size to Same as layer elevation.dem. Click OK. In ArcToolbox, go to Conversion Tools | To Raster | Feature to Raster and use the following settings: Input features: vegetation Field: VEGTYPE Output raster: (Save as Rveg in your Part02 folder) Output cell size: elev_raster (navigate to your Part02 folder) In ArcToolbox, go to Spatial Analyst Tools | Surface | Slope and use the following settings: Input raster: elevation.dem (again, navigate to your Part02 folder) Output raster: (Save as slope in your Part02 folder) Output measurement: PERCENT_RISE Z factor: 1 The weighted linear combination (overlay) process combines slope and vegetation data to assess the overall fire hazard. Because these layers do not contain similar data, the values in each layer must be converted to a common value scale. The common scale for this model will assign a value of 1 to areas with the lowest wildfire potential and 3 to areas with the highest wildfire potential.\n In ArcToolbox, go to Spatial Analyst Tools | Reclass | Reclassify and use the following settings: Input raster: slope Reclass field: VALUE Reclassification: Click Classify… to bring up the Classification window. Change the Classes to 3. Click OK. 
Change the table to reflect the following (note that you must use spaces in the Old values ranges, e.g., 0 - 30, not 0-30):
Old values | New values
0 - 30 | 1
30 - 60 | 2
60 - 300 | 3
Output raster: (Save as slopefactor in your Part02 folder) Use this same tool (Reclassify) to create a factor layer for vegetation: Input raster: Rveg Reclass field: VEGTYPE Reclassification: Change the table to reflect the following:
Old values | New values
Lodgepole pine | 3
Engelmann spruce | 2
Krummholz | 2
non-forest | 1
whitebark pine | 3
water | NoData
NoData | NoData
Output raster: (Save as vegfactor in your Part02 folder) 2 Linear Combination In the weighted linear combination process, weights are assigned to each factor based on the influence that factor has on the fire hazard assessment. In this model, slope is deemed a more important (0.75) factor than vegetation type (0.25).\n In ArcToolbox, go to Spatial Analyst Tools | Map Algebra | Raster Calculator and enter the following formula: (0.75 * "slopefactor") + (0.25 * "vegfactor")\n Save the output in your Part02 folder as fire_hazard.\n Right-click the ArcToolbox node (the top of the ArcToolbox) and click Environments….\n Click Raster Analysis (you may have to scroll down to see it) and set Mask to studyarea. Click OK.\n In ArcToolbox, go to Spatial Analyst Tools | Surface | Hillshade and use the following settings:\n Input raster: elev_raster Output raster: (Save as hillshade in your Part02 folder) Isolate the highest fire risk spot in the study area by entering the following into Raster Calculator (Spatial Analyst Tools | Map Algebra | Raster Calculator): "fire_hazard" == 3 (The pixels that meet this qualification will get a value of 1 and all others will get a value of 0.)\n Save the output in your Part02 folder as high_fire_haz.\n In the Table of Contents, move high_fire_haz on top of the hillshade to display the "hot spots" above the terrain surface.\n Go to the Properties of the high_fire_haz layer, click the Display tab, set the transparency to 50%, and click OK.
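To wrap up this section, here is the whole fire-hazard chain (rasterizing the vegetation, deriving slope, reclassifying both to the common 1-3 scale, and the weighted linear combination) as a single arcpy sketch. It is only illustrative: it assumes the Part02 layer and field names used above and a Spatial Analyst license, and it is not a required part of the lab.

import arcpy
from arcpy.sa import Reclassify, RemapRange, RemapValue, Slope

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\path\to\Part02"  # placeholder path
arcpy.env.extent = "studyarea.shp"
arcpy.env.mask = "studyarea.shp"

# Rasterize vegetation and derive percent slope, both at the DEM's cell size
arcpy.conversion.FeatureToRaster("vegetation.shp", "VEGTYPE", "Rveg", "elevation.dem")
slope = Slope("elevation.dem", "PERCENT_RISE", 1)

# Rescale both factors to the common 1 (low hazard) to 3 (high hazard) scale
slopefactor = Reclassify(slope, "VALUE", RemapRange([[0, 30, 1], [30, 60, 2], [60, 300, 3]]))
vegfactor = Reclassify("Rveg", "VEGTYPE", RemapValue([
    ["Lodgepole pine", 3], ["Engelmann spruce", 2], ["Krummholz", 2],
    ["non-forest", 1], ["whitebark pine", 3], ["water", "NODATA"]]))

# Weighted linear combination: slope carries three times the influence of vegetation
fire_hazard = 0.75 * slopefactor + 0.25 * vegfactor
fire_hazard.save("fire_hazard")

# Highest-risk cells only (1 where the combined score equals 3, 0 elsewhere)
(fire_hazard == 3).save("high_fire_haz")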
Turn all layers off except high_fire_haz and hillshade.\n Take a screenshot of the entire ArcMap window and save it to your personal folder.\n Upload both of your screenshots to Blackboard. There is nothing else to hand in this week.\n"},{"id":55,"href":"/classes/geog558/labs/lab10/","title":"Lab 10 - Spatial Interpolation","parent":"Labs","content":"Learning Objective When we deal with spatial analysis, one of the problems we have is getting the data we collect in the field into a form we can analyze. Most field data are collected at discrete points (e.g., rain gages, GPS point collection, address points, etc.). There are several interpolation methods that allow us to take this point data and turn it into a continuous surface. We will look at the Inverse Distance Weighting and Kriging methods in this lab.\nScenario:\nYou have been hired by KGS to perform an analysis of water levels in the High Plains Aquifer in Kansas. They have provided you with two shapefiles containing min, mean, and max water level depths from the surface for 1997 and 2017. Your job is to perform a volume comparison. You will need to interpolate a surface representing the water table elevation for each time period, determine the volume gained or lost over the 20-year period, create a map describing the status and change of the water table in Kansas, and write a short report of your findings.\nOutline: Learning Objective Submission requirements Materials Submission requirements Lab 10: Answer Sheet\nQuestion 1:\nWhat was your regression function?\nQuestion 2:\nWhat is the interpretation of your regression function?\nQuestion 3:\nWhat was your highest prediction standard error (hint: right-click on the GA layer and "Change output to predicted Standard Error")? How does this error compare with the distribution of the well heads?\nQuestion 4:\nSubtract the two 1997 surfaces. What is the difference in volume between the two interpolation methods? Given that we extract ~3,655,000 acre-feet of water a year, how important is the interpolation method in determining the change in volume?\nQuestion 5:\nPick one of the interpolations and submit your report below. This might include some or all of the following elements.\n A "Current" depth to water level map (example from Brownie Wilson of KGS below) A difference map describing the change in water level over the time period A short (1-2 paragraph) interpretation of the outcomes. This should address common questions an employer might ask, including an interpretation of the data, how the map was created, a statement of accuracy, and potential sources of error.
An example of how I might approach this question:\n Materials click to download\n
Data Name | Description
HPWells_1997.shp | Well depths for Kansas within the High Plains Aquifer from 1997
HPWells_2017.shp | Well depths for Kansas within the High Plains Aquifer from 2017
SelectedStates.shp | Boundaries for Kansas, Nebraska, and Colorado
KSCounties.shp | Kansas counties from the 2010 TIGER database
KSPLSSSections.shp | Kansas PLSS section shapefile
HPBounds_2010.shp | Outer boundary for the High Plains Aquifer
smalldata.zip | A subset of the data for Kansas Groundwater Management District 4
Interpolating surfaces can be one of the more processing-intensive operations. As an alternative, if you are in a processing-limited environment, feel free to perform the lab using the "smalldata" folder in place of the full database. I have subset the original assignment to the 6 counties that comprise Kansas Groundwater Management District 4. 1) Getting started\n Add the data into ArcMap and examine its properties. We will use the Geostatistical Analyst feature in ArcMap. There are many, many features of this tool that we will not explore here. You can use the ArcMap help system or Google "Using ArcGIS Geostatistical Analyst" online to explore this tool beyond what we will do here. If necessary, activate the Geostatistical Analyst extension (Customize | Extensions). Add the Geostatistical Analyst toolbar (Customize | Toolbars | Geostatistical Analyst). 2) Interpolating a surface using IDW\n Go to Geostatistical Analyst | Geostatistical Wizard. Highlight Inverse Distance Weighting. Select your Source Data (a well shapefile). We want to model a water surface, so set your Data Field to one of the depth attributes (I use the average depth). Click Next. If a "Handling Coincidental Samples" prompt pops up, decide how you would like to handle this occurrence. Since I prefer "worst case" scenarios, I choose "Use Maximum". Once you've chosen, click Next. This next window shows you how water levels will be modeled, but in graphic form. Take a look at the help file for the Geostatistical Wizard to determine how the various inputs affect the statistics performed. ArcMap is nice enough to include a power optimization button under general properties; go ahead and click that button. I also want to include more samples in my interpolation, so change maximum neighbors to 25 and minimum neighbors to 7. The next page shows you the cross validation of your surface. This means that each point is removed in turn and compared against the surface that would have been created without it. When presenting results, most publications generally eschew printing the nuances of how they ran an interpolation tool in favor of reporting the regression function. Answer questions 1 and 2 below.
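As an aside, if you later need to repeat this interpolation for both years without clicking through the wizard, the Spatial Analyst IDW tool can be scripted. Note that this is plain IDW, not the Geostatistical Wizard: you get no cross-validation report and no power optimization, so treat it only as a rough, hedged equivalent of the run described above. The field name and paths below are assumptions; adjust them to your shapefiles.

import arcpy
from arcpy.sa import Idw

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\path\to\lab10"  # placeholder path

for year in ("1997", "2017"):
    # 500 m cells, power of 2; "AVG_DEPTH" is an assumed name for the average depth field
    surface = Idw("HPWells_%s.shp" % year, "AVG_DEPTH", 500, 2)
    surface.save("IDW%s.tif" % year)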
Back in the Geostatistical Wizard, click Finish when you are done. The grid that the wizard spits out is in a format specific to the interpolation tools and is not as easily manipulated as the rasters you have grown used to. Export it as a raster by R-clicking on the layer and going to Data | Export to Raster… Change the cell size to 500 and save the output raster as IDWYYYY in your lab10 folder. 3) Interpolating a surface via Kriging\n Quick Kriging refresher here: https://gisgeography.com/kriging-interpolation-prediction/\n Go to Geostatistical Analyst | Geostatistical Wizard and choose Kriging / CoKriging. Make sure your type of Kriging is set to Simple Kriging and Normal Score, and make sure the Output Type is Prediction. Click Next. This next window is unique to Normal Score, walking you through the normalization parameters. ArcMap does all the heavy lifting for you, so take note of the parameters, look at the QQ plot, and then click Next. Again, the modeling window includes an "Optimize model" button; click that and then click Next. The next window again shows a graphical representation of the interpolated surface. Click on the inset map to move the cross hairs to different portions of the aquifer. The circle around the cross hairs shows which area the model will take into account for that location. Leave everything on this window at the default settings and click Next. This next window shows you the model and the Prediction Errors. You can save this validation result to a .dbf file (or a table in a shapefile) that you can open in Excel, which may be helpful depending on your purposes. Click Finish. Click OK on the Output Layer Information window and export the layer to a raster as above, using a name similar to KGYYYY. Answer question 3 below. 4) Clip the rasters to the extent of the Aquifer\n Hint: the Clip tool… make sure you use the check box (Use Input Features for Clipping Geometry)\n 5) Calculate the water volume difference between the two interpolation methods\n Hint: raster calculator… If you don't have an attribute table, you will need to do the following: Use Raster Calculator to first turn the raster into an integer while keeping some of our calculated precision. Hint: Int("KI1997Diff.tif" * 1000) Use the Build Raster Attribute Table tool to generate the attribute table. Create a new field called Volume and make it a float. Calculate volume using ((Value/1000)/3.28084)*Count*500*500. (Dividing by 1000 undoes the scaling from the Int step, dividing by 3.28084 converts feet to meters, and Count*500*500 is the area covered by that value class in square meters, so the result is a volume in cubic meters.) Answer question 4. 6) Pick an interpolation method and interpolate the 2017 data using the same methods and parameters as above. Repeat steps 4 and 5 and answer question 5 on the sheet.
Shellito's Introduction to Geospatial Technologies\n These labs use a few different software packages:\n Google Earth: a pretty straightforward installation with nothing special to it. ArcPro & QGIS: GIS software can be difficult to install correctly. This page will help get you over most of those hurdles should you encounter an error. Google Earth Engine, which requires Google approval and will need to be applied for prior to Lab 9. A photo viewer. Although the default photo viewer will suffice, if you want a more robust option I use IRFANVIEW (Windows only, sorry Mac). It will be covered more in class, but unless explicitly stated otherwise labs will be due a week after the lab session meets. Lab 01 - Computer basics Lab 02 - Introduction to Google Earth Pro Lab 03 - Coordinates and Position Measurements Lab 04 - GPS Lab 05 - GIS introduction Lab 06 - spatial analysis Lab 07 - Digital Terrain Analysis Lab 08 - Map making Lab 09A - 3D Modeling and Visualization Lab 09B - Cloud based GIS Lab 10 - Visual Imagery Interpretation Lab 11 - Remotely Sensed Imagery and Color Composites Lab 12 - Landsat 8 Imagery Lab 13 - Earth Observing Missions Imagery Lab 14 - Final Lab activity "},{"id":57,"href":"/classes/geog358/labs/","title":"Labs","parent":"Introduction to GIS -- GEOG358","content":"As an introductory class, the first half of the labs will be pretty explicitly hand-holdy, whereas the second half of the semester will be a chance to stretch those critical (spatial) thinking skills we're here to gain. Make sure you take a second and read the instructions and digest what it is they are asking you to do. If you are more computer savvy or otherwise technically inclined, feel free to try and guess or intuitively feel out the instructions instead. At the end of the day, these are just computers so the consequences of a bad input combination or operation are easily recoverable. These labs follow the class lecture, and only a few are topically cumulative, but they do progressively become more involved as the semester progresses.\nIf you need a refresher or have never touched GIS before, you might want to go back to the previous class and rework the labs, particularly 2, 5, 6, and 8. Lab 00 - Intro to GIS Lab 01 - Intro to GISystems Lab 02 - Projections and Coordinate Systems Lab 03 - Using GPS for Field Data Collection Lab 04 - On-Screen Digitizing & Image Restoration Lab 05 - Building a GIS database Lab 06 - Selections Queries and Joins Lab 07 - Overlay & site suitability analysis Lab 08 - Introduction to network analyst and ArcScene Lab 09 - Interpolation and Fire Hazard Modeling "},{"id":58,"href":"/classes/geog526/labs/","title":"Labs","parent":"Remote Sensing -- GEOG526","content":"This class is typically taken before or alongside the intermediate GIS class and after taking the intro class, and is largely focused around the use, interpretation, and manipulation of raster data from remotely sensed data sources. Although many of these labs will be hand-holdy, as an upper division course you will be expected to start stretching those critical (spatial) thinking skills we're here to gain.\n It will be covered more in class, but unless explicitly stated otherwise labs will be due a week after the lab session meets.
Bibliography Guidelines Lab - Data Acquisition and Image Preprocessing Lab - Introduction to ERDAS IMAGINE I Lab - Introduction to ERDAS IMAGINE (Part II) Lab - INTRODUCTION TO IMAGE INTERPRETATION Lab - INTRODUCTION TO MODIS TIME SERIES Lab - LANDSAT TM \u0026amp; SPOT IMAGERY Lab - Remote Sensing (Assignment 9) NDVI Lab - Remote Sensing with RADAR Imagery Lab - THERMAL IMAGERY Lab 08 - Map making Lab 14 - Final Lab activity Lab - Internet Resources Exploration Lab - Maps \u0026amp; Aerial Photography Lab - INTRODUCTION TO UNIT CONVERSION \u0026amp; SCALE PROBLEMS Lab - ELECTROMAGNETIC RADIATION PRINCIPLES "},{"id":59,"href":"/classes/geog558/labs/","title":"Labs","parent":"Intermediate GIS -- GEOG558","content":"This is an advanced GIS class which assumes you took and understood the introductory class (GEOG 358). Although the instructions attempt to be verbose, you will need to stretch your critical (spatial) thinking skills and pause to think about what it is we want to accomplish before you blindly start following the directions. Take a minute, read through the lab steps, and ask yourself why it is I ask you to do something. Can you think of other ways to get to the answers to the questions?\nThese labs are largely written for ArcMap 10.6. However, operations from ArcMap 10.2 through 10.6 and even up to Pro all largely look the same although some of the tools may have shifted or changed names just to keep you on your toes. Although these labs are written so that you can follow them step by step, I expect you to flex those critical thinking skills of yours. Take a step back, read the learning objectives and ask yourself what steps are needed to get to that goal. When in doubt, Google it! If you need a refresher from the intro class, feel free to go back and do the labs. All would be useful in brushing up, but particularly 2, 5, 6, 7 and 9. Lab 01 - Data survey and database building Lab 02 - Cartographic Modeling Lab 03 - Focal Operations Lab 04 - Zonal Operations Lab 05 - Model Builder Lab 06 - Cost Distance, Region Groups, more Model Builder \u0026amp; Python Lab 07 - Siting a New School with Model Builder and Fungus weight Lab 08 - Advanced Terrain Analysis Lab 09 - Descriptive Spatial Statistics and Point Pattern Analysis Lab 10 - Spatial Interpolation "},{"id":60,"href":"/classes/geog358/termpaper/","title":"Project","parent":"Introduction to GIS -- GEOG358","content":"Class project Objective: Identify a biophysical, human, or environmental problem that requires or would benefit from your newfound knowledge of spatial analysis using GIS. Describe the problem that you are interested in. Provide geographic context to the problem. Discuss what kind of GIS analysis you want to use and the implementation strategy you have chosen. Present simple diagrams or maps that illustrate your approach. Apply your approach to real-world data and present the results of your project in oral form. Students are also required to write a term paper based on the final project.\n The intermediate deliverables count towards 10% of the final project grade A presentation which will count towards 25%. the term paper counts towards the remaining 65% Note: the total project counts as 20% of your final grade, see the syllabus page for more details\n Intermediate deliverables: In order to help keep you accountable, there will be several intermediate deliverables due throughout the semester. This will be done in two stages:\nA proposal: All students are required to write a proposal for the final project. 
In the proposal, you will describe the project you are planning to carry out, the GIS datasets needed, and the GIS analysis functions that may be used in the project. You will also review the literature to see what methods and GIS analysis functions have been used to solve similar problems.\nA draft of your paper: To gauge your progress and allow time to provide feedback, you will submit a draft of your paper which should at minimum contain an introduction and methods sections.\nPresentation: Each student should plan on a 10-minute presentation of your final project to the class (~7 minutes, 2 minutes for questions and a minute for transitions). This can take several forms, including but not limited to:\n A slide deck A research poster A diorama Presentation order and grading rubric\n note: The breakdown of the presentation will be as follows: 40% will be the average of your classmates\u0026rsquo; grades 10% will be for correctly filling in the grading sheet (participation) the remaining 50% will be based on my grade.\n Term paper The paper should be far more comprehensive than your presentation, and reflective of your college education. Spelling, grammar, and coherency are expected, and the paper should include the following sections:\n Introduction The introduction should start broad and narrow in. Make sure you provide enough background on your topic so that a non-expert could pick up your paper and have enough context to understand what the problem is and why spatial analysis is needed to solve it. Review in the literature what methods and spatial analyses have been used to solve similar problems. Be sure to include a statement like \u0026ldquo;The objective of this project is to \u0026hellip;\u0026rdquo;. Methods Describe the strategy you use to solve the problem with GIS and any other necessary methods. Describe the statistical methods, computer models, and GIS analytic functions / operations you use in the problem-solving process. Because this is a GIS class, this section should include some of the finer details and methods used that would not normally be included in a peer-reviewed or technical article. Diagrams are especially helpful for understanding your strategy. This should include a dedicated Study Area and Analysis subsection where you describe the study area. Results Here you present your findings, including final figures and the results (but not the meanings) of any statistical analysis Discussion \u0026amp; Conclusions This portion should briefly summarize the topic and question you answered with this project, the methods you employed, and what your findings are. Discuss the findings from your problem-solving process. Evaluate your methodology. Discuss the usefulness and limitations of the analytic functions in GIS. Typically, this section also contains recommendations, both for what steps should be taken and the next research questions that should be addressed by future work. References Use whatever formatting you like. I HIGHLY recommend a citation manager (see syllabus). These papers will be graded using the following criteria, with A-F ranges translated into their respective numerical counterparts. Helpful? tips\nOne of the things that helps me approach papers like this is to find a similar paper in academic literature or a professional report and mirror its structure. Feel free to do the same. This should also go without saying, but if you turn in a paper with screenshots that are not pointing out UI features, I will automatically take 10 points off the paper grade. 
You are here to learn GIS; demonstrate you have done so.\nThe following are due throughout the semester (dates tentative and subject to change with notice):\nDue dates Proposal (1 ~ 3 pages, due Oct 10th)\nIntroduction (due Nov. 26th)\nPresentations (Week of Dec 3rd)\nFull Term Paper (6 ~ 10 pages, due on Dec 17th)\n"},{"id":61,"href":"/classes/geog558/termpaper/","title":"Project","parent":"Intermediate GIS -- GEOG558","content":"Class project Objective: Identify a biophysical, human, or environmental problem that requires or would benefit from your newfound knowledge of spatial analysis using GIS. Describe the problem that you are interested in. Provide geographic context to the problem. Discuss what kind of GIS analysis you want to use and the implementation strategy you have chosen. Present simple diagrams or maps that illustrate your approach. Apply your approach to real-world data and present the results of your project in oral form. Students are also required to write a term paper based on the final project.\n The intermediate deliverables count towards 10% of the final project grade A presentation which will count towards 25%. the term paper counts towards the remaining 65% Note: the total project counts as 20% of your final grade, see the syllabus page for more details\n Intermediate deliverables: In order to help keep you accountable, there will be several intermediate deliverables due throughout the semester. This will be done in two stages:\nA proposal: All students are required to write a proposal for the final project. In the proposal, you will describe the project you are planning to carry out, the GIS datasets needed, and the GIS analysis functions that may be used in the project. You will also review the literature to see what methods and GIS analysis functions have been used to solve similar problems.\nA draft of your paper: To gauge your progress and allow me time to provide you feedback, you will submit a draft of your paper which should at minimum contain an introduction and methods sections.\nPresentation: Each student should plan on a 10-minute presentation of your final project to the class (~7 minutes, 2 minutes for questions and a minute for transitions). This can take several forms, including but not limited to:\n A slide deck A research poster A diorama Presentation order and grading rubric\n note: The breakdown of the presentation will be as follows: 40% will be the average of your classmates\u0026rsquo; grades 10% will be for correctly filling in the grading sheet (participation) the remaining 50% will be based on my grade.\n Term paper The paper should be far more comprehensive than your presentation, and reflective of your college education. Spelling, grammar, and coherency are expected, and the paper should include the following sections:\n Introduction The introduction should start broad and narrow in. Make sure you provide enough background on your topic so that a non-expert could pick up your paper and have enough context to understand what the problem is and why spatial analysis is needed to solve it. Review in the literature what methods and spatial analyses have been used to solve similar problems. Be sure to include a statement like \u0026ldquo;The objective of this project is to \u0026hellip;\u0026rdquo;. Methods Describe the strategy you use to solve the problem with GIS and any other necessary methods. Describe the statistical methods, computer models, and GIS analytic functions / operations you use in the problem-solving process. 
Because this is a GIS class, this section should include some of the finer details and methods used that would not normally be included in a peer-reviewed or technical article. Diagrams are especially helpful for understanding your strategy. This should include a dedicated Study Area and Analysis subsection where you describe the study area. Results Here you present your findings, including final figures and the results (but not the meanings) of any statistical analysis Discussion \u0026amp; Conclusions This portion should briefly summarize the topic and question you answered with this project, the methods you employed, and what your findings are. Discuss the findings from your problem-solving process. Evaluate your methodology. Discuss the usefulness and limitations of the analytic functions in GIS. Typically, this section also contains recommendations, both for what steps should be taken and the next research questions that should be addressed by future work. References Use whatever formatting you like. I HIGHLY recommend a citation manager (see syllabus). These papers will be graded using the following criteria, with A-F ranges translated into their respective numerical counterparts. Helpful? tips\nOne of the things that helps me approach papers like this is to find a similar paper in academic literature or a professional report and mirror its structure. Feel free to do the same. This should also go without saying, but if you turn in a paper with screenshots that are not pointing out UI features, I will automatically take 10 points off the paper grade. You are here to learn ArcMap; demonstrate you have done so.\nThe following are due throughout the semester (dates tentative and subject to change with notice):\nDue dates Proposal (1 ~ 3 pages, due Oct 10th)\nIntroduction (due Nov. 26th)\nPresentations (Week of Dec 3rd)\nFull Term Paper (6 ~ 10 pages, due on Dec 17th)\n"},{"id":62,"href":"/classes/random/FOSS/","title":"FOSS","parent":"Random bin","content":"Free and Open Source Software: There are a whole host of questions I still have regarding the intersections of individuals, societies, economies, and technology. One of the more practically philosophical realizations of this intersection is FOSS. Free and Open Source Software has become one of my go-to tools in tackling a problem and has that special allure of feel-good contribution and intrinsic reproducibility to it. I\u0026rsquo;ve created a few tools that take a FOSS approach to GIS, and here I\u0026rsquo;d like to demonstrate how to create one.\nI\u0026rsquo;ll flesh this section out more at a later date, but here are some links I\u0026rsquo;ve leaned heavily on in the creation of FOSSFlood and ClusteR.\nSome useful implementation links: https://ficonsulting.com/filabs/RInno https://www.r-bloggers.com/deploying-desktop-apps-with-r/: The foundation/inspiration for this implementation. https://www.robvanderwoude.com/vbstech_hta.php: A great HTA resource page Shiny and shiny server docs https://stackoverflow.com/questions/24789746/portable-browser-issues-when-deploying-r-shiny-app https://stackoverflow.com/questions/30965410/creating-stand-alone-shiny-app-chrome-error/31272180#31272180 https://englianhu.files.wordpress.com/2018/10/web-application-development-with-r-using-shiny-build-stunning-graphics-and-interactive-data-visualizations-to-deliver-cutting-edge-analytics-3ed.pdf Leaflet resources Useful code bits There are some useful tidbits of code, mostly in R, that I\u0026rsquo;ve flagged specifically. 
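If you want to run the snippets below outside of FOSSFlood itself, note that they lean on a handful of spatial packages that are attached elsewhere in the project. The list below is my best guess at the required setup rather than a definitive manifest (in particular, the fips() helper looks like the cdlTools version, but verify against the project source):

# Best-guess setup for the FOSSFlood snippets below -- not a definitive list
library(tigris)    # counties() and roads() for TIGER/Line downloads
library(sp)        # spTransform(), CRS(), coordinates(), proj4string()
library(rgdal)     # readOGR() and writeOGR() for shapefile input/output
library(cdlTools)  # fips(..., to = 'Name' / 'Abbreviation') -- assumed source of the fips() calls
# quiet() is a small output-suppressing wrapper defined elsewhere in the FOSSFlood global file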
Many are part of FOSSFlood, but that global file is more than 1000 lines of code so it\u0026rsquo;s understandable if you didn\u0026rsquo;t see them.\n#/////////////////////////////////////\r# -- Grab backup roads and points #/////////////////////////////////////\r# TIGRIS Roads from Census\r# Dev notes: here I grab the counties and use roads package to download roads (from Census)\r# This is redundent with the OSM data I download for buildings defaults. # There is also no handling of zip codes which straddle state lines\rprint(\u0026quot;-- Base data collection - This may take upwards of 2 hours to run - Grabbing TIGER road files\u0026quot;)\rCountiesFIPS \u0026lt;- quiet(counties(cb=TRUE))\rCountiesFIPSProj \u0026lt;- sp::spTransform(CountiesFIPS, CRS(\u0026quot;+init=epsg:4326\u0026quot;))\raoiCounties \u0026lt;- CountiesFIPSProj[aoiZIPCodesProj, ]\r# Error checking: Download roads for AOI that interset more than one state or county\rif(length(unique(aoiCounties$COUNTYFP)) \u0026gt; 1) {\rmyHoldRoads \u0026lt;- quiet(roads(unique(aoiCounties$STATEFP),unique(aoiCounties$COUNTYFP)[1], year = 2014, refresh = TRUE)[NULL,])\rfor(i in unique(aoiCounties$COUNTYFP)) {\rmyRoads \u0026lt;- quiet(roads(unique(aoiCounties$STATEFP),i, year = 2014, refresh = TRUE))\rmyHoldRoads \u0026lt;- rbind(myHoldRoads, myRoads)\r}\rmyRoads \u0026lt;- myHoldRoads\r} else {\rmyRoads \u0026lt;- quiet(roads(unique(aoiCounties$STATEFP),unique(aoiCounties$COUNTYFP), year = 2014, refresh = TRUE))\r}\rmyRoadsProj \u0026lt;- sp::spTransform(myRoads, CRS(\u0026quot;+init=epsg:4326\u0026quot;))\rmyRoads_subset \u0026lt;- myRoadsProj[aoiZIPCodesProj, ]\rwriteOGR(obj=myRoads_subset, dsn=paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/roads\u0026quot;), layer=\u0026quot;tigerroads\u0026quot;, driver=\u0026quot;ESRI Shapefile\u0026quot;)\r# OSM Roads and points\rprint(\u0026quot;-- Base data collection - This may take upwards of 2 hours to run - Grabbing OSM data\u0026quot;)\rosmDataURL \u0026lt;- paste0(\u0026quot;https://download.geofabrik.de/north-america/us/\u0026quot;,tolower(gsub(\u0026quot; \u0026quot;, \u0026quot;-\u0026quot;, fips(aoiCounties$STATEFP[1], to = \u0026quot;Name\u0026quot;))),\u0026quot;-latest-free.shp.zip\u0026quot;)\rdownload.file(osmDataURL, paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/\u0026quot;,tolower(gsub(\u0026quot; \u0026quot;, \u0026quot;-\u0026quot;, fips(aoiCounties$STATEFP[1], to = \u0026quot;Name\u0026quot;))),\u0026quot;-latest-free.shp.zip\u0026quot;))\runzip(paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/\u0026quot;,tolower(gsub(\u0026quot; \u0026quot;, \u0026quot;-\u0026quot;, fips(aoiCounties$STATEFP[1], to = \u0026quot;Name\u0026quot;))),\u0026quot;-latest-free.shp.zip\u0026quot;), exdir = paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName, \u0026quot;/geo/tmp\u0026quot;))\rosmPoints \u0026lt;- readOGR(paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/gis_osm_pois_free_1.shp\u0026quot;), verbose = FALSE)\rosmPointsProj \u0026lt;- sp::spTransform(osmPoints, CRS(\u0026quot;+init=epsg:4326\u0026quot;))\rwriteOGR(obj=osmPointsProj[aoiZIPCodesProj, ], dsn=paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/addresses\u0026quot;), layer='OSMaddresses', driver=\u0026quot;ESRI Shapefile\u0026quot;)\rosmRoads \u0026lt;- 
readOGR(paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/gis_osm_roads_free_1.shp\u0026quot;), verbose = FALSE)\rosmRoadsProj \u0026lt;- sp::spTransform(osmRoads, CRS(\u0026quot;+init=epsg:4326\u0026quot;))\rwriteOGR(obj=osmRoadsProj[aoiZIPCodesProj, ], dsn=paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/roads\u0026quot;), layer='OSMroads', driver=\u0026quot;ESRI Shapefile\u0026quot;)\r# OpenAddresses points\rprint(\u0026quot;-- Base data collection - This may take upwards of 2 hours to run - Grabbing OpenAddresses data\u0026quot;)\rif(aoiCounties$STATEFP %in% c('09','23','25','33','44','50','34','36','42')) {\roaURL \u0026lt;- \u0026quot;https://data.openaddresses.io/openaddr-collected-us_northeast.zip\u0026quot;\r} else if(aoiCounties$STATEFP %in% c('18','17','26','39','55','19','20','27','29','31','38','46')) {\roaURL \u0026lt;- \u0026quot;https://data.openaddresses.io/openaddr-collected-us_midwest.zip\u0026quot;\r} else if(aoiCounties$STATEFP %in% c('10','11','12','13','24','37','45','51','54','01','21','28','47','05','22','40','48')) {\roaURL \u0026lt;- \u0026quot;https://data.openaddresses.io/openaddr-collected-us_south.zip\u0026quot;\r} else if(aoiCounties$STATEFP %in% c('04','08','16','35','30','49','32','56','02','06','15','41','53')) {\roaURL \u0026lt;- \u0026quot;https://data.openaddresses.io/openaddr-collected-us_west.zip\u0026quot;\r}\rdownload.file(oaURL, paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/OpenAddresses.zip\u0026quot;))\runzip(paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/OpenAddresses.zip\u0026quot;), exdir = paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName, \u0026quot;/geo/tmp\u0026quot;))\rFullOpenAddresses \u0026lt;- paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/tmp/us/\u0026quot;,tolower(fips(aoiCounties$STATEFP[1], to = \u0026quot;Abbreviation\u0026quot;)),\u0026quot;/statewide.csv\u0026quot;)\rFullOpenAddressesCSV\u0026lt;- read.csv(FullOpenAddresses, header=TRUE, stringsAsFactors = FALSE)\rcoordinates(FullOpenAddressesCSV)\u0026lt;- ~LON+LAT\rproj4string(FullOpenAddressesCSV) \u0026lt;- CRS(\u0026quot;+init=epsg:4326\u0026quot;)\rwriteOGR(obj=FullOpenAddressesCSV[aoiZIPCodesProj, ], dsn=paste0(basedir,\u0026quot;/AOI/\u0026quot;,UserZipCodeFileName,\u0026quot;/geo/addresses\u0026quot;), layer='OpenAddresses', driver=\u0026quot;ESRI Shapefile\u0026quot;)\r"},{"id":63,"href":"/classes/geog526/labs/lab02/","title":"Lab - Maps \u0026 Aerial Photography","parent":"Labs","content":"Learning Objective This lab provides an introduction on how to use Google Earth Pro and will help familiarize you with many of its features. Although we\u0026rsquo;ll touch on several more advanced software as the class moves on, Google Earth Pro is a really fast and useful arrow to have in your quiver and we\u0026rsquo;ll be back to use it more than once. The steps and analyses we\u0026rsquo;ll do in this introductory lab are pretty basic but foundational, and we\u0026rsquo;ll build on these as we move forward. 
The goals for you to take away from this lab are:\n Familiarize yourself with the Google Earth Pro (GEP) environment, basic functionality, and navigation using the software Use different GEP layers and features Outline: Learning Objective Submission requirements Guide Reading Topographic Maps United States Public Land Survey (USPLS)/Public Land Survey System (PLSS) Universal Transverse Mercator (UTM) Contour Lines Large scale/Small scale 8 Elements of Photo Interpretation Latitude/Longitude Exercize Wrapping up Submission requirements Materials USGS Lawrence East quadrangle Aerial photo 8900-67 Grid Overlay Aerial photo #35\n.tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts, and submit it to blackboard when done.\rGuide Reading Topographic Maps Topographic maps use a variety of symbols to represent human and physical features. The feature that distinguishes a topographic map from any other type of map are contour lines that show lines of equal elevation. This method proves to be a highly effective way of portraying three-dimensional landforms on a two-dimensional piece of paper. In addition, several different coordinate systems are shown on topographic maps. In addition to latitude and longitude, the base coordinates for the map, these maps show UTM grids, USPLS, and others. The wide range of information provided by topographic maps make them extremely useful to professional and recreational map users alike – they are used for a wide variety of tasks such as engineering, energy exploration, natural resource conservation, environmental management, public works design, commercial and residential planning, and outdoor activities like hiking, camping, and fishing.\nUse the following guidelines in order to read topographic maps:\n The title is located in the upper right and lower left of the map sheet The map scale is displayed in the center bottom portion of the map sheet in the form of a representative fraction. Therefore, if the scale is 1:24,000, then each inch on the map is equivalent to 24,000 inches on the ground Thin brown lines represent contours (points of equal elevation). The closer together they are, the steeper the terrain. Contour lines form ‘V’ shapes in valleys or along stream beds; the point of the ‘V’ points uphill Information concerning the meaning of colors and patterns can be found in the legend and on the Topographic Mapping Symbols handout. 
Generally – Blue – hydrologic features Green – vegetation Red – important roads, urban areas, survey \u0026amp; fence lines Black – roads, buildings, names Purple – photorevisions and/or additions to original publication White – cleared woodland Brown – contours, sand areas United States Public Land Survey (USPLS)/Public Land Survey System (PLSS) also referred to as the township and range system Ranges run vertically while townships run horizontally Each township = 36 mi² and is divided into 36 (1 mi², 640 acre) numbered sections Refer to the following diagrams in order to understand the correct notation of the USPLS system Universal Transverse Mercator (UTM) UTM coordinates simply measure in meters east and north from two perpendicular reference baselines. Coordinates are written along the sides of a map designating specific grid lines or tic marks.\n Globe broken down into 60 zones beginning at the International Date Line (180°) Each zone is 6° wide (Lawrence in zone 15; majority of Kansas in zone 14) Measured in northings (how far north of the equator) and eastings (distance east, measured from the zone\u0026rsquo;s central meridian with a 500,000 m false easting) Tic marks appear in blue along the side of the map sheet and are 1,000 m apart; abbreviated as follows on map: 425 = 425,000m Typical format for expressing UTM location: 4,308,500m N, 311,000m E Contour Lines All points on the same contour are at the same elevation above sea level All successive contours are the same vertical distance apart; this distance is known as the contour interval Every 5th contour line is printed heavier, marked with its elevation, and called an index contour Contour lines never intersect (except on vertical cliffs) Spacing dependent on slope characteristics – the steeper the slope, the closer the contour lines Irregular, jagged contours indicate rough terrain Large scale/Small scale Concept relates to the amount of land area shown on the map relative to the size of paper the map is on Determines the level of detail (# features, size of features) Large scale – small surface area in great detail; features appear large (1:50,000) Small scale – large surface area in limited detail; features appear small (1:250,000) Air Photo Interpretation\n Advantages – no omission of data; perspective; cheap Disadvantages – too much detail makes it difficult to interpret; distortion; shadow Marginal information on an aerial photo (sometimes in gauge format or in numeric format - see aerial photo nomenclature handout) Date (top L) ID#s (photo #, mission #) Altitude Focal length Level Fiducial marks 8 Elements of Photo Interpretation shape – external form association – context tone – spectral reflectance; lightness/darkness texture – frequency of tonal change; roughness/smoothness shadow – reveal or hide information; provides sun angle info pattern – overall spatial form; arrangement into distinctive form size – large/small; absolute/relative site – location in relation to environment; topographic position Latitude/Longitude Degrees are divided into minutes \u0026amp; seconds as follows:\n 1° = 60 min (60’) 1’ = 60 sec (60”) Topo map covers 7.5 minutes of area Lat/long given in 2.5 minute intervals on sides Calculating latitude and longitude requires interpolation; see the following example in order to answer Part 1, Numbers 2 and 3 Interpolation (Calculating Lat/Long)\n Find feature at given lat/long designation Ex: what is located at 38°57'29” N and 95°17'20” W? 
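A minimal sketch of the interpolation arithmetic in R (the ruler measurement here is hypothetical, made up purely for illustration; the same procedure applies to longitude):

# Locating a latitude of 38 deg 57 min 29 sec N on a 7.5-minute quad whose nearest
# tick lines sit at 38 deg 55.0 min and 38 deg 57.5 min (tick lines are 2.5 minutes apart)
dms_to_sec <- function(d, m, s) d * 3600 + m * 60 + s   # degrees/minutes/seconds to total arc-seconds
target     <- dms_to_sec(38, 57, 29)        # the latitude we want to locate
lower_tick <- dms_to_sec(38, 55, 0)         # the tick line just south of it
fraction   <- (target - lower_tick) / 150   # fraction of the 2.5-minute (150 arc-second) interval
tick_spacing_cm <- 19.3                     # hypothetical ruler distance between the two tick lines
offset_cm  <- fraction * tick_spacing_cm    # how far north of the lower tick line to measure
fraction                                    # ~0.99, i.e. just below the upper tick line
offset_cm

Measure that offset up from the lower tick line on the sheet, draw a light line, and the feature in question should sit on or very near it.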
\[\rf(x) = \\int_{-\\infty}^\\infty\\hat f(\\xi)\\,e^{2 \\pi i \\xi x}\\,d\\xi\r\\] Exercise Part 1 Obtain the United States Geological Survey (USGS) 7.5 minute topographic maps of the Lawrence East quadrangle, 1967 and 1978 editions.\n What is the Section-Township-Range description for Memorial Hospital? (Be sure to reference the Lab 01 Guide in order to give the designation in the proper format, i.e. Section, Township, Range). (1 point)\n What is the Section-Township-Range description for Haskell Institute? (Be sure to reference the Lab 01 Guide in order to give the designation in the proper format). (1 point)\n Approximate the latitude and longitude coordinates for the Lincoln School in degrees, minutes and seconds. (**Show your work here) (2 points)\n What feature is located at 38°54'15\u0026quot; N, 95°10'57\u0026quot; W? (**Show your work here. Name the feature and its location as ‘x’ in N or S of the closest Latitude, ‘x’ in E or W of the closest Longitude) (2 points)\n Coal Creek runs from south to north across the bottom of the map. If you wished to take water samples five miles further upstream, what adjacent map sheet would you want to acquire? (1 point)\n In Section 22, T.13S, R.20E is the Blue Mound, a famous landmark on the Oregon Trail. What is the elevation at the top of the mound? (1 point)\n0.7 miles southeast of Blue Mound is another hill. What is the elevation of its highest contour? (1 point)\nWhat are the UTM coordinates (unabbreviated) for the top of Blue Mound? (2 points)\n Locate the Union Pacific and Atchison, Topeka \u0026amp; Santa Fe railroad lines on the map. Briefly explain why the railroad engineers decided to lay their tracks down in these areas. What evidence does the map provide to support your hypothesis? (2 points)\n Some features on the map are purple in color. What does the color purple signify? (1 point)\n Note the differences between 1967 and 1978 in Section 2, T.13S, R.20E. List two major changes: (2 points)\n Part 2 Use the aerial photo 8900-67 and a grid overlay to answer the following questions.\n Locate the area covered by the photo on the map. List four features, by name, that are especially helpful in matching the photo to the map. (4 points)\n Two farm ponds appear in a dark tone on the northern portion of the photo. What is the Section-Township-Range designation of these ponds? (Be sure to reference the Lab 01 Guide in order to give the designation in the proper format). (1 point)\n What land cover type is located north of the river on the east side of the photo? (1 point)\n Lay the acetate grid over the photo so that • the origin (top left corner of cell 1-A) is positioned on the top left (NW) fiducial mark (+ mark) • the same vertical line passes through the southwest (bottom left) fiducial mark.\n If you have correctly positioned the grid, the KU football stadium should be located in cell H19 (approximately). The idea here is to identify features using both the photo AND the topographic map as resources; therefore, be as specific as possible when answering. For example, if the cell contains a school and the topographic map provides the name of that specific school, use the specific name (so instead of simply answering river or school, specify what river and what school if it is provided on the topographic map). 
Identify the major features located in close proximity to the following grid cell coordinates: (5 points)\nH-19\t___________________________________________\nN-6\t___________________________________________\nK-14\t___________________________________________\nU-24\t___________________________________________\nU-7 ___________________________________________\n Along the north bank of the Kansas River, just east of Massachusetts Street Bridge, a land feature is represented by a purple dot pattern. Find this same feature on the aerial photo. What land feature is represented by the dot pattern? (1 point)\n Compare the size difference between features shown on both the map and the aerial photograph. What is the scale of the map? (1 point)\n 1 inch =____________feet.\nOf the two, which is the largest scale? (that is, which portrays the smaller ground distance per inch of map/image distance?) (1 point)\nPart 3 For the following section, use aerial photo # 35. The number is located in the upper left (northwest) corner of the photo. Exercises in this part are designed to acquaint you with aerial photography annotation and terminology. Notice the difference between the position and format of the written information on the two photographs that you have used for this lab assignment. The information located around the sides of a photo is referred to as marginal information.\n The flight line for Photo 35 is oriented in a north/south direction. The marginal information has been printed along the north margin of the photo. Note the three gauges shown along the north edge of the photo. Starting on the left side, explain the function of each gauge. (3 points)\nGauge 1\t_________________________________\rGauge 2\t_________________________________\rGauge 3\t_________________________________\r Provide the following information: (4 points)\nDate of acquisition: ________________________\nNominal scale: ____________________________\nProject code: _____________________________\nRoll and Exposure Number: _________________\n Is the scale of this photo larger than the map? (1 point) _____________________\n Is it larger than 8900-67 used in Part 2? (1 point) __________________________\r Place the grid overlay on the photo so that cell 1-A on the grid is located on the fiducial printed at the upper left corner on the photo. Make sure that the far left vertical grid line 1-A also passes through the bottom left fiducial mark. If the grid is placed correctly, cell Q-5 will fall onto the middle of the Jayhawk football stadium. With the grid properly in place, provide the correct grid cell coordinates for the following campus features: (3 points)\nLindley Hall\t____________________\nKansas Union\t____________________\nSpencer Museum of Art ____________________\n Wrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":64,"href":"/classes/geog526/","title":"Remote Sensing -- GEOG526","parent":"Classes","content":"This class is in VERY rough shape and was transferred to this website so that eventually I can get around to formatting it into a workable class. Don\u0026rsquo;t expect to find much of worth here for a while yet.\n \\ SORRY /\r\\ /\r\\ This page does /\r] not exist yet. [ ,'|\r] [ / |\r]___ ___[ ,' |\r] ]\\ /[ [ |: |\r] ] \\ / [ [ |: |\r] ] ] [ [ [ |: |\r] ] ]__ __[ [ [ |: |\r] ] ] ]\\ _ /[ [ [ [ |: |\r] ] ] ] (#) [ [ [ [ :===='\r] ] ]_].nHn.[_[ [ [\r] ] ] HHHHH. 
[ [ [\r] ] / `HH(\u0026quot;N \\ [ [\r]__]/ HHH \u0026quot; \\[__[\r] NNN [\r] N/\u0026quot; [\r] N H [\r/ N \\\r/ q, \\\r/ \\\r Instructor: Jim Coll\nOffice: 404C Lindley Hall\nEmail: jcoll@ku.edu\nOffice Hours: day from time-time \u0026amp; day from time-time or by appointment Class Meetings: Monday 2:00 — 4:30 pm\nClass Room: Lindley 228 Labs: Wednesday or Thursday 11:00 — 12:50 pm\nLab Room: Lindley 310 Announcements and mics: Hello all!\nMy name is Jim Coll and I am your instructor for the semester. A few notes for you pertain to how I run this course. Although the KU blackboard site will be the \u0026ldquo;official\u0026rdquo; site for this class and the place you submit all work to, I use this site here for the benefit of all and my own selfish desire to streamline my digital footprint. I will keep both sites as identical as possible when double posting material (e.g. the syllabus and course schedule), but in case of a conflict treat this version as the most recent. Use the navigation table to the left to find the syllabus, labs, and other documents. I have made myself as available to you as I can so feel free to find me via:\n Email Class slack channel Cornering me in the hallway Below you\u0026rsquo;ll find the course outline and slides as appropriate. I look forward to an exciting, productive, and stimulating semester with you all.\nBest,\nJim\nTentative Course Outline: To help us stay on track, I have pinned the semester schedule below with the relevant tasks, due dates, and other important institutional dates: Note that this is a living timeline subject to change. \r\r\rDate\rTopic\rWeekly Activity Due\rLab\rUniversity Dates\r\rWeek 1\r8/20\rSyllabus Day\r\u0026nbsp;\rSpatial skills pretest\r\u0026nbsp;\r\r8/22\rRemote sensing and its history\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 2\r8/27\rRemote sensing and its history Pt. 2\r\u0026nbsp;\rOnline resources\r\u0026nbsp;\r\r8/29\rElectromagnetic Radiation Principles\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 3\r9/3\rLabor Day- No class\r\u0026nbsp;\rNo Lab\r\u0026nbsp;\r\r9/7\rEMR Principles Pt. 
2\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 4\r9/10\rMapping cameras\r\u0026nbsp;\rEMR\r\u0026nbsp;\r\r9/12\rDigital imagery\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 5\r9/17\rDigital imagery\r\u0026nbsp;\rBasic computations\r\u0026nbsp;\r\r9/19\rImage Resolution\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 6\r9/24\rImage Resolution\r\u0026nbsp;\rIntroduction to Erdas Imagine\r\u0026nbsp;\r\r9/26\rImage interpretation\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 7\r10/1\rImage interpretation\r\u0026nbsp;\rUnderstanding digital numbers\r\u0026nbsp;\r\r10/3\rMultispectral remote sensing\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 8\r10/8\rExam Review\r\u0026nbsp;\rNo Lab\r\u0026nbsp;\r\r10/10\rExam 1\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 9\r10/15\rFall Break-No class\r\u0026nbsp;\rUnderstanding resolutions\r\u0026nbsp;\r\r10/17\rMultispectral remote sensing\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 10\r10/22\rHyperspectral remote sensing\r\u0026nbsp;\rImage interpretation\r\u0026nbsp;\r\r10/24\rThermal, Microwave sensing\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 11\r10/29\rDigital image processing\r\u0026nbsp;\rThermal infrared image interpretation\r\u0026nbsp;\r\r10/31\rDigital image processing\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 12\r11/5\rImage classification\r\u0026nbsp;\rData acquisition \u0026amp; image preprocessing\r\u0026nbsp;\r\r11/7\rImage classification\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 13\r11/12\rAccuracy assessment\r\u0026nbsp;\rImage classification\r\u0026nbsp;\r\r11/14\rChange detection\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 14\r11/19\rUrban and vegetation\r\u0026nbsp;\rNo Lab\r\u0026nbsp;\r\r11/21\rThanksgiving -No class\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 15\r11/26\rWater and soil\r\u0026nbsp;\rNDVI\r\u0026nbsp;\r\r11/28\rLand cover mapping applications\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 16\r12/3\rWriting assignment\r\u0026nbsp;\rSpatial skills test\r\u0026nbsp;\r\r12/5\rExam Review\r\u0026nbsp;\r\u0026nbsp;\r\rFinals\r12/11\rIn person final\r\u0026nbsp;\r\u0026nbsp;\r\u0026nbsp;\r\r\r\r\r\r\r\r\r\r\r\r"},{"id":65,"href":"/classes/geog558/","title":"Intermediate GIS -- GEOG558","parent":"Classes","content":"This is a living document. Changes will be announced in class. Instructor: Jim Coll\nOffice: 404C Lindley Hall\nEmail: jcoll@ku.edu\nOffice Hours: Monday and Wednesday 10:00 — 11:30 or by appointment Class Meetings: Monday 2:00 — 4:30 pm\nClass Room: Lindley 228 Labs: Wednesday or Thursday 11:00 — 12:50 pm\nLab Room: Lindley 310 Announcements and mics: Hello all!\nMy name is Jim Coll and I am your instructor for the semester. A few notes for you pertain to how I run this course. Although the KU blackboard site will be the \u0026ldquo;official\u0026rdquo; site for this class and the place you submit all work to, I use this site here for the benefit of all and my own selfish desire to streamline my digital footprint. I will keep both sites as identical as possible when double posting material (e.g. the syllabus and course schedule), but in case of a conflict treat this version as the most recent. Use the navigation table to the left to find the syllabus, labs, and other documents. I have made myself as available to you as I can so feel free to find me via:\n Email Class slack channel Cornering me in the hallway Below you\u0026rsquo;ll find the course outline and slides as appropriate. 
I look forward to an exciting, productive, and stimulating semester with you all.\nBest,\nJim\nTentative Course Outline: To help us stay on track, I have pinned the semester schedule below with the relevant tasks, due dates, and other important institutional dates: Note that this is a living timeline subject to change. \r\r\rDate\rTopic\rWeekly Activity Due\rLab\rUniversity Dates\r\rWeek 1\rSyllabus Day \u0026amp; Overview of Spatial Analysis\r\r\u0026nbsp;\rSpatial skills pretest\r\u0026nbsp;\r\r1/27\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 2\rMapping and Representations\r\r\u0026nbsp;\rLab #1: Survey of data and mapping technologies and building a database\r\u0026nbsp;\r\r2/3\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 3\rMap Algebra (Local Operations)\r\r\u0026nbsp;\rLab #2: Making composite image, calculating normalized water index, and Kudzu suitability\r\u0026nbsp;\r\r2/10\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 4\rMap Algebra (Focal Operations)\r\r\u0026nbsp;\rLab #3: Segment waterbody and extract waterbody boundary\r\u0026nbsp;\r\r2/17\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 5\rMap Algebra (Zonal Operations)\r\r\u0026nbsp;\rLab #4: Estimate water surface elevation and water volume change\r\u0026nbsp;\r\r2/24\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 6\rReview for mid-term \u0026amp; Geoprocessing in ArcGIS\r\r\r\u0026nbsp;\rLab #5: Analyze long term lake water surface area dynamics (ModelBuilder \u0026amp; Python Scripting)\r\u0026nbsp;\r\r3/2\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 7\rSpring Break - No class\r\u0026nbsp;\rNo Lab\r\u0026nbsp;\r\r3/9\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 8\rMidterm exam\r\u0026nbsp;\rLab #6: Sea level rise inundation using region group and cost distance\r\u0026nbsp;\r\r3/16\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 9\rFundamental spatial relationships (distance)\r\r\u0026nbsp;\rLab #7: School siting and fungus dispersion modeling\r\u0026nbsp;\r\r3/23\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 10\rMulti-criteria evaluation and MA applications\r\r\u0026nbsp;\rLab #8: Viewshed, hydrological and NEXRAD blockage terrain analysis\r\u0026nbsp;\r\r3/30\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 11\rTerrain Analysis\r\r\u0026nbsp;\rAAG meeting (No Lab)\r\u0026nbsp;\r\r4/6\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 12\rCharacterize Spatial Pattern\r\r\u0026nbsp;\rLab #9: Tornado point pattern Analysis\r\u0026nbsp;\r\r4/13\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 13\rSpatial Interpolation/Prediction\r\r\u0026nbsp;\rLab #10: Groundwater interpolation\r\u0026nbsp;\r\r4/20\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 14\rReview for final exam \u0026amp; Final project presentation\r\r\u0026nbsp;\rOpen Lab\r\u0026nbsp;\r\r4/27\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 15\rFinal project presentation\r\u0026nbsp;\rSpatial skills test\r\u0026nbsp;\r\r5/4\r\u0026nbsp;\r\u0026nbsp;\r\rWeek 16\rFinal Exam 1:30 – 4:00 p.m.\r\u0026nbsp;\r\u0026nbsp;\r\u0026nbsp;\r\r5/14\r\u0026nbsp;\r\u0026nbsp;\r\r\r\r\r\r\r\r\r\r\r"},{"id":66,"href":"/classes/geog526/labs/lab03/","title":"Lab - INTRODUCTION TO UNIT CONVERSION \u0026 SCALE PROBLEMS","parent":"Labs","content":"Learning Objective Calculation of photo scale and the measurement of distances and areas are fundamental skills in remote sensing. However, the various combinations of available information (scale, height, distance, focal length, etc.) and the use of both metric and imperial units of measurement can be confusing. 
This lab is designed to familiarize you with the common scale and area computations performed in association with remote sensing.\nOutline: Learning Objective Submission requirements Guide Representative Fraction (RF) Aerial Photo Scale Determination without Maps Wrapping up Submission requirements .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rGEOG111_Lab2Questions.docx\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts, and submit it to blackboard when done.\rGuide Representative Fraction (RF) When looking at a paper map, probably the most important thing to bear in mind is the map scale. The scale represents the ratio of a distance on the map to the actual distance on the ground. Map scales are expressed in a variety of ways (RF, verbal, graphic). The representative fraction relates a unit of measure on a map to some number of the same units of measure on the earth\u0026rsquo;s surface. For instance, a map scale of 1:25,000, tells us that 1 unit of measure represents 25,000 of the same units on the earth\u0026rsquo;s surface. One inch on the map represents 25,000 inches on the earth\u0026rsquo;s surface. Often when using cartographic materials it is useful to convert from various units of measurement. If you have a good understanding of the concept of scale, the techniques are fairly simple. \tExample of various ways to interpret RF \t1:50,000 means that a length of 1 unit of measurement on the photograph represents 50,000 of the same units of distance on the ground \tRF can be restated by applying conversion factors Ex 1: 1:24,000 can be stated as 1in on the map = 2,000ft on the ground 24,000in * 1ft\t/ 12in = 2,000ft\nEx 2: 1:100,000 can be stated as 1cm = 1km 100,000cm * 1km / 100,000cm = 1km\nEx 3: 1:60,000 can be stated as 1in = .95 mi 60,000in * 1mi / 63,360in = .95mi\nAerial Photo Scale Determination without Maps Photogrammetry refers to the technique of obtaining reliable measurements of objects from photographic images. The scale of aerial photographs is a function of the focal length of the camera and the altitude of the camera. There are several pieces of information that are required in order to determine the scale of a photo if you do not have a map as reference. 
Need to know: Flying height of camera (altitude) Focal length – distance between the center of the lens and the film plane Common focal lengths:\t88mm (~3.5in) 152mm (~6in) 210mm (~8.5in) 305mm (~12in) 450mm (~18in)\nFocalLengthDefined.png Use the following formula to determine scale using focal length and flying height\nWhere h = terrain elevation H = height above ground (AGL) H1 = height above mean sea level (MSL) ***H \u0026amp; f must be in the same units\nEx: focal length = 6in = .5ft Flying height = 20,000ft 1/(H/f) = 1/ (20000/.5) = 1/40,000 = 1:40,000\nFocal Length Determination Ex: scale = 1:10,000 H = 5000 AGL 1/10000 = 1/(5000/f) 1/10000 * f = 1/5000 f * 10000 = 5000 f = 5000/10000 f = .5ft\nAerial Photo Scale Determination with Maps Incorporating reference maps of the photographed area allows you to determine photo scale by comparing distances on the map (with a known scale) with photo distance. Use the following two formulas (depending on given information) to determine scale.\nFormula 1 1/((MD*MSD)/PD)\tWhere\tMD = map distance MSD = map scale denominator PD = photo distance Ex: MD = 6in MSD = 240,000 PD = 9in 1/((6×240,000)/9)=1/(1,440,000/9)=1/160,000=1:160,000\nTo make sure the MD and PD have the same unit.\nFormula 2:\t1/(GD/PD)\twhere\tGD = ground distance PD = photo distance\nEx:\tDistance from point A to B on map = 2.5mi Distance from point A to B on photo = 1.28in 2.5mi = 2.5mi * 63,360 = 158,400in 1/ (158,400)/1.28 = 1/123,750 = 1:123,750\nPhoto Format Format of photo designates size of image acquired by the camera Common formats: 23x23cm (9x9in) 5.7x5.7cm (2.2x2.2in)\nSo, if the scale is 1:10,000 and the film format is 9x9in then: 1:10,000 1 side (9in) = 90,000in = 1.42mi therefore the total image covers area of (1.42)2 = 2.02mi2\nUseful Conversions\n A scale of 1 inch = 1 mile is the same as what RF? (1 point)\nIf a map has a scale of 1:120,000, approximately how many miles are represented by 1in?(1 point)\nA reference map has scale of 1:50,000. How many kilometers are represented by 1cm? (1 point)\n On a 1:24,000 topographic sheet, a segment of a highway measures 4.5in. What is its ground length in miles? (1 point)\nAn airfield measures 2.5cm on a 1:50,000 scale map. How long is it in kilometers? (1 point)\nHow long is it in miles? (1 point)\n If the RC-8A camera has a 6in focal length, what is its focal length in mm? (1 point)\nThe KA-88A camera has 23 x 23cm format. What is the format in inches? (1 point)\n Determine the RF of the following photographic prints by using the given aerial photographic information: (AGL = altitude above ground level) (2 points)\n A)\tAltitude = 100,000ft AGL Focal Length = 6in RF = _________________\nB)\tAltitude = 10mi AGL Focal Length = 254mm RF = _________________\nAn aircraft was taking black and white photographs, using a KC-4 Fairchild camera, while flying at an elevation of 15,000ft (H1) in the vicinity of Denver, CO. Suppose that the average ground elevation in the Denver area is 5,000ft (h) above mean sea level (MSL). (2 points) A)\tIf a camera on board had a focal length of 6in, what would be the scale of the photographs?\nB)\tThe KC-4 usually has a format size of 9x9in. How many square miles does a single frame of the photograph contain?\n A photograph was taken at 10,000ft above mean sea level over a land surface having an elevation of 2,000ft above mean sea level. The scale of the photograph is 1:16,000. What is the focal length (in inches) of the camera? 
(1 point)\n At 30,000ft AGL, what distance (in miles) would be covered by 3in on photography acquired with a 6in focal length camera? (1 point)\n You have acquired four different photos, each one having been taken at a different altitude (AGL), but all the photos were taken with a SOM Plate Camera which has a 152mm focal length (6in length). Please note that the numbers in parentheses indicate altitude – ensure that you response corresponds with the appropriate altitude.\n A)\tThe first photo was taken by the SCS (Soil Conservation Service) at an altitude of 5,000 ft AGL. What is the scale? (1 point)\n_______________ (5,000)\nB)\tThe other three photographs were taken by NASA from a U-2 aircraft at 10,000ft, 20,000ft, and 40,000ft. All these altitudes are AGL. What is the scale of each of these last three photos? (3 points)\n_______________ (10,000) _______________ (20,000) ____________ (40,000)\nC)\tIf each photograph is in a 9x9in format size, what is the linear distance (in miles) for each photo on one side (ground distance)? (4 points)\n__________ (5,000) _________ (10,000) _________ (20,000) _______ (40,000)\nD)\tHow many square miles does each photo cover? (4 points)\n__________ (5,000) _________ (10,000) _________ (20,000) _______ (40,000)\nE)\tHow does doubling the AGL affect the following: (3 points)\nScale?\nLinear distance?\nArea?\nGiven the following information, calculate the photo scale: (2 points) A)\tMap Distance = 12in Map Scale = 1:20,000 Photo Distance = 6in Photo Scale = ____________\nB)\tMap Distance = 3in Map Scale = 1:20,000 Photo Distance = 50.8mm Photo Scale = ____________\n You want to construct a grid cell overlay for your aerial photo, so that each cell is 8 acres in area. If the scale of your photo is 1:12,000, what is the length of a cell on the overlay to the nearest tenth of an inch? (2 points)\n Two aircraft equipped with KA-20B Hycon cameras (usually for reconnaissance) are flying over terrain in Oregon which is approximately 4,000ft above sea level. Aircraft A is flying at an elevation of 16,000ft and the camera on board the plane has a focal length of 6.0in. Meanwhile, Aircraft B is taking photographs of the same area as Aircraft A; however, B is flying at an elevation of 13,000ft with a camera having a focal length of 9in. Which photographs, from Plane A or B, will show the greater detail? Remember that large scales will show small areas in large detail. (1 point)\nAircraft (1 point) _____________\n Itek nine-lens multispectral aerial camera was used to take aerial photographs (nine spectral bands for each scene) of the northeast Kansas region. The plane flew at an altitude of 9,050ft above sea level, and the camera had a focal length of 152mm. If the average ground elevation is 1,025ft, what is the scale of the photo? (1 point)\n If the format size is 5.7x5.7cm, how many square miles does each photo cover? (1 point)\nYou have just acquired photography that was made from a Zeiss-A-15/23 single frame aerial camera with a 6in focal length at 20,000ft AGL. (5 points) A)\tWhat is the scale of the photograph? ________________\nB)\tWhat would it be at 40,000ft AGL? ________________\nC)\tAt 80,000ft AGL? ________________\nD)\tWhich of the above is the larger scale? ________________\nE)\tIf you measured 1in on the largest scale photo, what would be the ground length in miles?\nYou are doing a research project for NASA and they want you to make a general land use map of Kansas using Skylab photographs. 
A S-190A Skylab Multiband camera with a 150mm focal length was used by the astronauts in taking the pictures. The multiband camera was designed and built by the Itek corporation and the film used had a 70mm frame format. Skylab was orbiting around the earth at an average altitude (AGL) of 435km. A) What is the scale of these photos? (1 point) _______________________\nB) What is the linear ground distance across 1 side of the photo in km? (1 point)\nC) How many square miles does each photo cover? (1 point) _________________\nD)\tUse the information you calculated in Number 8 (for the 5,000 ft. AGL) and determine approximately how many SCS photographs would be needed to cover all the area in one Skylab photo? (1 point)\nE)\tWhat can you conclude about the purpose of differing scales of photography? (1 point)\n Two objects on a map are 2.25in apart. The same two objects are 3.25in apart on the photograph. The following bar scale goes with the map:\n 1 1 0 4 2 1 2\r|--|--|-----|-----------|\rMILES\r A)\tWhat is the RF scale of the map? (1 point)\nB)\tWhat is the ground distance between the two objects on the map? (1 point)\nC)\tWhat is the ground distance between the two objects on the photo? (1 point)\nD)\tWhat is the RF scale of the photo? (1 point)\nWrapping up There is no need to save anything from this lab, so when done you can simply close without saving. Submit your answers to the questions on blackboard.\n"},{"id":67,"href":"/classes/random/FirstGoogleEarthMap/","title":"A basic map in Google Maps","parent":"Random bin","content":"Learning Objective Google Earth is a great platform to start your exploration of the GISciences, even if it\u0026rsquo;s not a tool you\u0026rsquo;ll use often in \u0026ldquo;the real world\u0026rdquo;. There are a number of ways to access Google Earth, but this tutorial will cover Google Earth Pro on the Desktop (scroll to the bottom of the page). The objective of this exercise is to familiarize yourself with the numerous tools Google Earth has to offer. By the end of this tutorial, you should feel comfortable\u0026hellip;\n navigating around the world, and reorienting yourself if you get lost. adding data and exploring it\u0026rsquo;s format symbolizing features creating a map, adding common map elements, and exporting an image Outline: Learning Objective Submission requirements Tutorial Navigating around the world Change your point of view Explore options Tilt to view hills \u0026amp; mountains Google Earth Flight Simulator past imagery Street view Loading in data Formatting and symbolizing Making a map: Submission requirements .tg {border-collapse:collapse;border-spacing:0;border-color:#ccc;margin:0px auto;}\r.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#fff;}\r.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#ccc;color:#333;background-color:#f0f0f0;}\r.tg .tg-0pky{border-color:inherit;text-align:left;vertical-align:top}\r.tg .tg-btxf{background-color:#f9f9f9;border-color:inherit;text-align:left;vertical-align:top}\r\r\rData Name\rDescription\r\r\rtest\rHandout to turn in\r\r\rYou are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. 
Use full sentences as necessary to answer the prompts, and submit it to blackboard when done.\rTutorial Navigating around the world most of this section is gratefully pilfered from the official Google Earth help page\n Open Google Earth and spend some time practicing navigating around in the world.\nChange your point of view Use a mouse to change your perspective and explore different areas.\n To pan in any direction: Left-click and hold. Then, drag the cursor until you see the view you want. To return to the default view (reorient yourself so north is up and the camera angle is pointed straight down) - Click the map and press r. You can zoom in and out to see more or less of a map area. Use the scroll wheel on your mouse or mouse touchpad to zoom in and out. The map controls on the upper right hand side of the map can also be used to pivot, pan, and zoom using just the mouse. Finally, there is a search bar on the toolbar on the left that works just like the search bar on Google Maps. Explore options There are a number of options to explore in Google Earth, particularly under view and tools. For now, the only one you should turn on is the overlay map, which creates a small inset which shows where we are in relation to the rest of the globe. Tilt to view hills \u0026amp; mountains Google earth can be used to view elevation, make sure you turn on Terrain under Layers on the left first.\nWindows \u0026amp; Linux\n Press Shift + Left-click. Then, drag in any direction. on Mac\n Press and hold the scroll button. Then, move the mouse forward or backward. Press Shift and scroll forward or backward to tilt up and down. Google Earth Flight Simulator Google Earth has a build in flight simulator, and you can use either the keyboard and mouse, or a joystick!\n In the menu: Click Tools and then Enter Flight Simulator In the next screen, you can choose the plane you use, and your starting point The Heads up display has the following elements: Speed: current speed in knots Heading: direction the aircraft is pointed Bank angle: angle you’re using to slowly turn the plane in a new direction Vertical speed: rate of ascent or descent in feet per minute Exit flight simulator feature: click this button to exit the flight simulator Throttle: level of the engine’s power Rudder: angle of the vertical axis of the plane Aileron: angle of the plane when you roll or bank it Elevator: angle and lift of the plane’s wings Flap and gear indicators: where the flaps and gears are set Pitch angle: angle between where the airplane is pointed and the horizon in degrees Altitude: how many feet above sea level the plane is flying past imagery Google is one of if not the leading collector of earth observations, and that includes past satellite and aerial imagery. To view past images, scroll in close enough until the clock on the lower left of the map appears, and click on it. You can also use the slider at the top right of the map to page through imagery.\nStreet view Google earth also has a pegman you can drop to see street view imagery.\nLoading in data Google Earth can load in GIS data, and it\u0026rsquo;s native format is KML/KMZ. Here is some data we\u0026rsquo;ll use to create our map. Download it, and then add it to the map using file \u0026gt; open. Finally, turn off all layers except terrain. When finished, your screen should look like so.\nFormatting and symbolizing There are several means of symbolizing features in Google Earth. We can do so feature by feature, or in groups. 
If you right click on a feature and go to properties, you can change how it\u0026rsquo;s symbolized. If you change the properties for a folder, you apply that symbology to every feature within that folder. For now, clock on the Route1_Merge file and go to style. Lets change these bus routes to red, and the width to 2. Afterwards, right click on the folder for the bike routes, and change those properties to green and 1.5 width. When done, your map should look like so: The LFK_POI folder has several sub-folders. These folders are separated into the following categories:\n Living: all dorms on the Haskell and KU campuses LivingE: Dorms with a dining hall inside Dining: dining halls and other places around campus to eat GIS Buildings where GIS instruction occurs Busstops: these are bus stops\u0026hellip; If you open the properties on these, you can change the symbol used to show these features by clicking on the icon. Change these icons to something you feel is appropriate. When done, zoom into a part of campus you feel frames a nice area, your map may look like so: Making a map: Finally, let\u0026rsquo;s export our map for viewing. Go to File \u0026gt; Save \u0026gt; Save Image\u0026hellip; This adds common map elements including Title blocks and Legend, as well as a north arrow. Fill out the pertinent information, including a relevant title, your name, and the data source (this data comes from Douglas County GIS). Finally, under Map Options at the top, you have the option of toning down the intensity of the base map. When done, export your map and submit it. Congratulations, you just made your first map!\n"},{"id":68,"href":"/classes/geog526/labs/lab04/","title":"Lab - ELECTROMAGNETIC RADIATION PRINCIPLES","parent":"Labs","content":"Background This exercise provides an introduction to using a Global Positioning System (GPS) receiver to obtain coordinates and create a point shapefile. GPS is a system consisting of a network of satellites that orbit ~11,000 nautical miles from the earth in six different orbital paths. They are continuously monitored by ground stations located worldwide. The satellites transmit signals that can be detected with a GPS receiver. Using the receiver, you can determine your location with great precision through the trilateration (not triangulation!) of signals from at least 3 satellites, getting a distance from the difference between time measurements.\nAlthough the system is very sophisticated, and atomic clocks are used by the satellites, there are multiple sources and types of errors involved in finding your location. As the GPS signal passes through the charged particles of the ionosphere and then through the water vapor in the troposphere, this causes the signal to slow a bit, and this creates the same kind of error as bad clocks. Also, if the satellites that are in your view at a particular moment are close together in the sky, the intersecting circles that define a position will cross at very shallow angles. That increases the error margin around a position. The kind of GPS receivers we will be using provide about 10 meter accuracy (which may be reduced to under 3 m if differential corrections like WAAS are used), depending on the number of satellites available and the geometry of those satellites.\nLearning Objective In this exercise we are going to collect the coordinates of some set of campus features using either the GPS units or your phones, generate a few shapefiles of those features, and then use the shapefile to make a map. 
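To make the distance-from-time idea in the Background above a little more concrete, here is a minimal sketch in R (the travel time is a made-up but plausible value, not something you will measure in this lab):

# Range from signal travel time: distance = speed of light x travel time
c_mps       <- 299792458              # speed of light, in meters per second
travel_time <- 0.072                  # hypothetical one-way travel time, roughly 72 milliseconds
range_m     <- c_mps * travel_time    # pseudorange to one satellite, in meters
range_m / 1000                        # roughly 21,600 km
# A timing error of only ~3 nanoseconds shifts this range by about a meter, which is why the
# clock and atmospheric delays described above dominate the ~10 m accuracy of these receivers

With three or more such ranges (plus a correction for the receiver\u0026rsquo;s own clock offset), the receiver trilaterates its position.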
The goals for you to take away from this lab:\n How to use your selected platform to collect data How to export that data How to view that data in GIS software Outline: Background Learning Objective Submission requirements Tutorial Submission requirements You are answering the questions (laid out in the word doc above and also included in the tutorial below) as you work through the lab. Use full sentences as necessary to answer the prompts, and submit it to blackboard when done.\nTutorial For each of the following, provide the frequency or wavelength as appropriate. Also, specify the region of the electromagnetic spectrum (e.g., ultraviolet, blue, green, red, near-infrared, mid-infrared, thermal-infrared, microwave) (Note: 1 Hz = 1 cycle/sec = 1 sec⁻¹; the speed of light (c) is 3×10⁸ m/sec; 1 m = 10⁶ μm). (4×2=8 points) (a)\tWavelength = 0.460 μm Frequency =\nSpectral region =\n(b)\tWavelength =\nFrequency = 3.50 × 10¹³ Hz Spectral region =\n If we compare the radiation described in 1(a) and 1(b) above, which one has the higher energy content per quantum (or photon) (show computations for each)? In one sentence, explain the implications of this fact for remote sensing (10 points).\n We may treat a red-hot object as an 800 K blackbody.\n (a)\tWhat is the total amount of energy emitted by it per unit area?\n(b)\tWhat is the dominant wavelength of the red-hot object?\n(c)\tCompared to the Earth, which has a dominant wavelength of 9.66 µm, which one produces more radiant exitance? (10 points)
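\n A quick reference for these first questions: the standard relations, written as a short R sketch (the example wavelength and temperature below are illustrative values, not necessarily the ones in the assignment).\nc_ms \u0026lt;- 3.0e8 # speed of light, m/s\rh_js \u0026lt;- 6.626e-34 # Planck's constant, J s\rsigma \u0026lt;- 5.67e-8 # Stefan-Boltzmann constant, W m^-2 K^-4\rb_wien \u0026lt;- 2898 # Wien's displacement constant, µm K\rlambda_um \u0026lt;- 0.50 # an example wavelength in micrometers\rlambda_m \u0026lt;- lambda_um * 1e-6 # 1 m = 10^6 µm\rnu \u0026lt;- c_ms / lambda_m # frequency in Hz, from c = lambda * nu\rE \u0026lt;- h_js * nu # energy per photon in J, from E = h * nu\rT_k \u0026lt;- 300 # an example blackbody temperature in kelvin\rM \u0026lt;- sigma * T_k^4 # total radiant exitance in W/m^2 (Stefan-Boltzmann law)\rlambda_max \u0026lt;- b_wien / T_k # dominant wavelength in µm (Wien's law)\rAs a sanity check, 2898/300 gives roughly the 9.66 µm dominant wavelength quoted for the Earth above.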
\nRead the image below, which illustrates the different colors or vividness of forests close by and far away. Please explain it according to what we have talked about in lecture class. (Hint: optical effects of atmospheric haze) (3 points) Some streetlights are deliberately manufactured to provide illumination with a reddish color; can you suggest why? (3 points)\n The Earth\u0026rsquo;s land surface reflects about three percent of all incoming solar radiation back to space. The rest is either reflected by the atmosphere, or absorbed and re-radiated as infrared energy. The various objects that make up the surface absorb and reflect different amounts of energy at different wavelengths. The magnitude of energy that an object reflects or emits across a range of wavelengths is called its spectral response pattern or spectral reflectance curve.\nTable 1 gives you the spectral reflectance (%) for three different types of surface covers measured by a sensor with 24 bands (the mid value of band wavelength is given). (25 points)\n (a)\tBased on these data, construct spectral reflectance curves for these targeted objects (on the last page). These curves will help you answer questions 4b-4c. Please use a pencil and draw the lines clearly (10 points)\n(b)\tWhich band is better for discriminating vegetation from water: Band 6 versus Band 10? Why? (5 points)\n(c)\tWhich band is better for discriminating barren land from water: Band 1 versus Band 6? Why? (5 points)\n(d)\tWhich band is better for discriminating these three surface covers: Band 12 versus Band 24? Why? (5 points)\n| Band | Mid value of bandwidth (in 100×µm) | Reflectance (%) |\n| 1 | 20 | 26 |\n| 2 | 23 | 31 |\n| 3 | 26 | 40 |\n| 4 | 29 | 51 |\n| 5 | 32 | 58 |\n| 6 | 35 | 59 |\n| 7 | 38 | 58 |\n| 8 | 41 | 51 |\n| 9 | 44 | 45 |\n| 10 | 47 | 42 |\n| 11 | 50 | 37 |\n| 12 | 53 | 36 |\n| 13 | 56 | 30 |\n| 14 | 59 | 31 |\n| 15 | 62 | 36 |\n| 16 | 65 | 33 |\n| 17 | 68 | 28 |\n| 18 | 71 | 24 |\n| 19 | 74 | 25 |\n| 20 | 77 | 24 |\n| 21 | 80 | 22 |\n| 22 | 83 | 20 |\n| 23 | 86 | 18 |\n| 24 | 89 | 17 |\n(Blank plotting grid for question 4(a): Reflectance (%) on the y-axis from 3 to 78, Wavelength (100 × µm) on the x-axis from 17 to 92.)\n"},{"id":69,"href":"/classes/random/dronemapping/","title":"Drone Mapping","parent":"Random bin","content":"Need help passing your Part 107? I can recommend the following resources:\n https://www.youtube.com/watch?v=6_ucCKFJUCU\u0026t=1s https://3dr.com/faa/drone-practice-tests/ https://jrupprechtlaw.com/part-107-test-study-guide/ Drones for mapping and research One of the most exciting aspects of working in the Environmental Sciences is exploring how the application of new tools and technologies has allowed us to reimagine and streamline how we observe, process, understand, and communicate the state of the world. Take, for instance, the humble terrain surface. Although I’m not quite old enough to recall walking to school uphill both ways in a snowstorm, there was once a time when the USGS topo maps were the gold standard in terms of how closely we could replicate the undulations of landforms, and I can recall many a boy scout meeting where we used these maps to plan our next adventure.
![imcenter](/classes/random/img/KS_Lawrence West.jpg)\nIn early 2000, we brought these representations to the digital domain with the flight of the Shuttle Radar Topography Mission. This gave us a digital representation of the world at a ~25-meter resolution across much of the globe. Fantastic for most large-scale representations, this is still entirely too large to be relevant at a building scale. The next major milestone wouldn’t be seen until early 2016 with the USGS announcement of the [3D Elevation Program (3DEP)] (https://www.usgs.gov/core-science-systems/ngp/3dep). This program set standards for the collection and curation of Airborne LIDAR (Light Detection and Ranging). These data are high resolution point representations of elevation, with an average spacing of \u0026gt;4 points per square meter with less than 6 cm of variation across a flat surface.\nThis represents a major leap forward but is not without its drawbacks. Firstly, although LIDAR collects elevations, it does not collect imagery, which makes viewing the surface much less intuitive than it would otherwise be. Additionally, although the resulting elevations are accurate, they are still not dense enough or accurate enough to compete with high precision surveys, and depending on the use case, may be out of date by more than 3 years. In these instances, small Unmanned Aerial Systems, more commonly known as drones, offer a unique solution to some of these shortcomings. Using commercially available hardware and cutting-edge software, we are able to fly over an area and collect imagery which we use to create our own custom point clouds and 3D models. As part of my dissertation, I am interested in flying areas to determine what factors go into an effective terrestrial mapping effort. The above images are a small sampling of outputs I generate, but these can also include 3D point clouds, photo-realistic models, updated aerial maps, and standard aerial photography outputs. If you have an area you would like flown, particularly if it has interesting and complex geometry or if it’s near a body of water, please contact me at jcoll AT ku.edu and we’ll see if we can find a way to collaborate.\nKU Natural History Museum’s Science on Tap A small presentation for KU Natural History Museum’s Science on Tap | Free State Brewing CO.\n"},{"id":70,"href":"/classes/random/FOSS_RS/","title":"RS","parent":"Random bin","content":"Introduction to Remote Sensing Remote sensing is a course onto itself, but it’s worth introducing here as GIScience, GISystems, and remote sensing overlap more than they diverge. Remote sensing, as opposed to in-situ measurement, is the measurement of an object from a distance, most often a great distance.\nObjectives This lab should provide you with the bulk of the background knowledge needed to dive further into the field of remote sensing. We\u0026rsquo;ll cover common terminology and technology, some basic applications, and cover how R can be used to perform common GIScience and remote sensing operations.\nQuick vocab To start off, lets review some key vocabulary that often comes up when discussing remote sensing. These definitions are important when discussing concepts such as distortion and illumination angles but will only crop up sporadically.\n In physics, the nadir direction for a given location is the local vertical direction pointing in the direction the force of gravity acts at that location. In remote sensing this concept is, in essence, the point on a photo which is directly below the satellite. 
Its opposite, zenith, refers to the direction directly opposed to gravity. Julian days are equivalent to the day of the year, starting on January 1 of each year. Epoch time, also known as Unix time or POSIX time, is the number of seconds that have elapsed since January 1, 1970 at midnight GMT/UTC, and is often used to indicate the timestamp of the data. (handy web converter) Remote sensing data comes from instruments, or sensors, which in turn are housed aboard platforms, or satellites. The basics It’s difficult to learn about remote sensing without first recalling some basic physics. First, take a second and recall how eyesight and vision work, using the conceptual diagram below.\nRemote sensing is the measurement of the electromagnetic spectrum. Energy (not necessarily in the visible spectrum) from a source strikes the object and then travels to the eye. We’ll break this down a little further. There are several different flavors of remote sensing, but the two most common ones are:\n Active sensing: A sensor or pair of instruments that both emit and receive measurements. Common examples of active remote sensing include ground penetrating radar, sonar, and LIDAR. Passive sensing: A sensor which only receives measurements. These are some of the most common forms of sensors, examples of which include your eyes, cameras, and most earth-orbiting satellites. Next we need to introduce the most common properties used to differentiate and classify different remote sensing platforms. These properties are all in the context of scale and resolution.\nRadiometric resolution: Radiometric resolution is the number of brightness levels that can be detected by the sensor, also known as radiometric sensitivity, quantization, or simply the number of bits per pixel. Spectral resolution: Spectral resolution refers to the number and position of the sensor bands along the EM spectrum. Spatial resolution: Spatial resolution is the ground distance covered by a pixel, and is typically reported in meters at nadir at the equator. Temporal resolution: Temporal resolution, more directly referred to as revisit time, is, as the name suggests, the time between repeated observations of the same place on the globe. Temporal resolution is the result of a number of factors, but the two largest are the swath width and the orbit.\nSwath width is the width of a single image, and can either be small or large. The image below (gratefully pilfered from Figure 1 in https://doi-org.www2.lib.ku.edu/10.1016/j.rse.2019.111254) shows the swath of a Landsat-8 OLI orbit (185 km wide, left) compared to a Sentinel-2A MSI orbit (290 km wide, right). Orbit refers to the direction and nature of the travel path of satellites around a body. There are a number of common orbital patterns satellites exhibit. For more information, see an example of the different orbits and satellites, and explore satellite orbits at this ESRI site.\n Geostationary: The orbit of the sensor is such that it rotates at a speed which maintains the satellite’s position over the same spot on the earth. Most commonly seen with communication and weather satellites. Sun synchronous: The orbit of this satellite follows the sun, so it passes over each spot on the earth at roughly the same local solar time. An introduction to platforms As you might have guessed by now, a proper understanding of remote sensing hinges on a thorough understanding of the data structure and lineage.
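\nBefore digging into specific platforms, here is a small R sketch of the two timestamp conventions mentioned in the vocabulary above (the epoch value used here is arbitrary):\nts \u0026lt;- as.POSIXct(1438387200, origin = \u0026quot;1970-01-01\u0026quot;, tz = \u0026quot;UTC\u0026quot;) # epoch seconds to a date-time\rts # \u0026quot;2015-08-01 00:00:00 UTC\u0026quot;\ras.integer(format(ts, \u0026quot;%j\u0026quot;)) # day of year (the \u0026quot;Julian day\u0026quot; in the remote sensing sense), 213 here\r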
The Google Earth Engine Dataset catalog has an excellent overview of platforms and datasets available for end users, but here we’ll introduce two of my favorite platforms, MODIS and Landsat.\nMODIS MODIS (Moderate Resolution Imaging Spectroradiometer) is a key instrument aboard the Terra (originally known as EOS AM-1) and Aqua (originally known as EOS PM-1) satellites. These are passive remote sensing platforms, and so follow a sun synchronous orbit. Terra\u0026rsquo;s orbit around the Earth is timed so that it passes from north to south across the equator in the morning, while Aqua passes south to north over the equator in the afternoon. Together, Terra MODIS and Aqua MODIS view the entire Earth\u0026rsquo;s surface every 1 to 2 days, acquiring data in 36 spectral bands.\nLANDSAT The Landsat program is the longest-running enterprise for acquisition of satellite imagery of Earth. It is a joint NASA/USGS program. Back in July of 1972, the first satellite was launched, and eventually became known as Landsat 1. The most recent satellite, Landsat 8, was launched on February 11, 2013, and is still currently operational. A few different sensors have flown over the course of the program; they are outlined below. Landsat 1 through 5 carried the Landsat Multispectral Scanner (MSS). Landsat 4 and 5 carried both the MSS and Thematic Mapper (TM) instruments. Landsat 7 uses the Enhanced Thematic Mapper Plus (ETM+) scanner. Landsat 8 uses two instruments, the Operational Land Imager (OLI) for optical bands and the Thermal Infrared Sensor (TIRS) for thermal bands. Handy charts for these instruments can be found on the wikipedia page.\nLANDSAT distro Landsat is distributed in tiles (called scenes) that are identified by their path and row. Although these designations are somewhat arbitrary (in that they provide little geographic context; what does path 34, row 58 tell you?), they are a consistent means of identifying and disseminating tiles. https://www.usgs.gov/land-resources/nli/landsat/landsat-shapefiles-and-kml-files\nLANDSAT names With so much data being collected, you can imagine that a naming convention can get pretty convoluted, and if you don\u0026rsquo;t work with this data commonly, you\u0026rsquo;ll need to look up this information each time you need it. However, it\u0026rsquo;s worthwhile to introduce it here. Landsat Collection 1 Level-1 products are all identified using the following naming convention.\nLXSS_LLLL_PPPRRR_YYYYMMDD_yyyymmdd_CC_TX\nWhere:\nL = Landsat X = Sensor (“C”=OLI/TIRS combined, “O”=OLI-only, “T”=TIRS-only, “E”=ETM+, “T”=TM, “M”=MSS) SS = Satellite (“07”=Landsat 7, “08”=Landsat 8) LLLL = Processing correction level (L1TP/L1GT/L1GS) PPP = WRS path RRR = WRS row YYYYMMDD = Acquisition year, month, day yyyymmdd = Processing year, month, day CC = Collection number (01, 02, …) TX = Collection category (“RT”=Real-Time, “T1”=Tier 1, “T2”=Tier 2)\nExample: LC08_L1GT_029030_20151209_20160131_01_RT Means: Landsat 8; OLI/TIRS combined; processing correction level L1GT; path 029; row 030; acquired December 9, 2015; processed January 31, 2016; Collection 1; Real-Time
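\nBecause the convention is underscore-delimited, pulling the pieces back out of a product ID is straightforward. A minimal base R sketch, using the example ID above:\nid \u0026lt;- \u0026quot;LC08_L1GT_029030_20151209_20160131_01_RT\u0026quot;\rparts \u0026lt;- strsplit(id, \u0026quot;_\u0026quot;)[[1]]\rsensor_satellite \u0026lt;- parts[1] # \u0026quot;LC08\u0026quot;: OLI/TIRS combined, Landsat 8\rcorrection_level \u0026lt;- parts[2] # \u0026quot;L1GT\u0026quot;\rwrs_path \u0026lt;- substr(parts[3], 1, 3) # \u0026quot;029\u0026quot;\rwrs_row \u0026lt;- substr(parts[3], 4, 6) # \u0026quot;030\u0026quot;\racquired \u0026lt;- as.Date(parts[4], format = \u0026quot;%Y%m%d\u0026quot;) # 2015-12-09\rprocessed \u0026lt;- as.Date(parts[5], format = \u0026quot;%Y%m%d\u0026quot;) # 2016-01-31\rcollection \u0026lt;- parts[6] # \u0026quot;01\u0026quot;\rcategory \u0026lt;- parts[7] # \u0026quot;RT\u0026quot; = Real-Time\r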
\nLANDSAT anomalies There are quirks to every remote sensing product, but Landsat 7 has one of the most unique. Unlike other platforms, such as the Hubble Space Telescope, these remote sensing platforms do not have the benefit of maintenance after being placed in orbit. So, in June of 2003, when the Landsat 7 Scan Line Corrector (SLC) module failed, roughly half of the collected data was no longer viable. The scan line corrector compensated for the forward motion of the satellite as the sensor scanned back and forth across the earth. The result is a characteristic striping of the data. There have been numerous efforts to correct for this failure, but for now, just know that most Landsat 7 data collected since then will look rather zebra-esque. Image from https://www.researchgate.net/publication/236864661_Validating_gap-filling_of_Landsat_ETM_satellite_images_in_the_Golestan_Province_Iran\n A simple application There are a number of uses for remote sensing data, but one of the cornerstones of remote sensing is the normalized difference index. If we take two numbers, subtract them, and divide the difference by the sum of those same two numbers, we end up with a number that can range from -1 to 1. This normalization can tell us a lot about a process, and is performed for a wide variety of earth classification operations such as snow (NDSI), water (NDWI), and vegetative health (NDVI).\nThe latter, NDVI, is an often-performed analysis that gives us a rough, quantitative measure of the vegetative health of a system. When sunlight strikes objects, some wavelengths of the spectrum are absorbed and other wavelengths are reflected. The pigment in plant leaves, chlorophyll, strongly absorbs visible light (from 0.4 to 0.7 µm) for use in photosynthesis, whereas the cell structure of the leaves strongly reflects near-infrared light (from 0.7 to 1.1 µm). The more leaves a plant has, the more these wavelengths of light are affected. NDVI values are fairly relative (the difference between 0.5 and 0.6 is not generally quantifiable), but in general, anything above 0.2 is vegetation, and higher values are roughly equivalent to more vegetation.\n Image from https://eos.com/ndvi/\n Lab exercise Let\u0026rsquo;s use our newfound knowledge to perform one of the most common operations one might want to carry out: a change detection study. Let’s explore how NDVI changes by land cover type between summer and fall. We\u0026rsquo;ll use Landsat 8. If we look back at the Landsat 8 OLI sensor (copied below), we can see that those wavelengths are equivalent to bands 4 and 5.\n| Band number | Spectral resolution | Spatial resolution |\n| Band 1 | Visible (0.43 - 0.45 µm) | 30 m |\n| Band 2 | Visible (0.45 - 0.51 µm) | 30 m |\n| Band 3 | Visible (0.53 - 0.59 µm) | 30 m |\n| Band 4 | Red (0.64 - 0.67 µm) | 30 m |\n| Band 5 | Near-Infrared (0.85 - 0.88 µm) | 30 m |\n| Band 6 | SWIR 1 (1.57 - 1.65 µm) | 30 m |\n| Band 7 | SWIR 2 (2.11 - 2.29 µm) | 30 m |\n| Band 8 | Panchromatic (PAN) (0.50 - 0.68 µm) | 15 m |\n| Band 9 | Cirrus (1.36 - 1.38 µm) | 30 m |\nLet\u0026rsquo;s lay out our workflow conceptually. We\u0026rsquo;ll first establish an area over which we\u0026rsquo;ll perform the analysis. Next, we\u0026rsquo;ll scrape an Amazon web service to identify suitable imagery for our analysis, and grab land cover data as well. Afterwards, we\u0026rsquo;ll calculate NDVI and perform our zonal statistics using our land cover. Finally, we\u0026rsquo;ll create a pretty image as our output.\nYou may notice a slightly unique naming convention in the code:\n# Pick an AOI xx \u0026lt;- list() xx$AOI_center... I like to create an empty list and stuff all cartographic variables into a single place for easy access when it comes time to create outputs.
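\n For instance, a tiny sketch of that pattern (the names and values here are just placeholders):\nxx \u0026lt;- list() # one list to hold everything the final outputs will need\rxx$AOI_center \u0026lt;- c(-95.236, 38.972) # a hypothetical centroid (lon, lat)\rxx$map_title \u0026lt;- \u0026quot;Summer vs. fall NDVI by land cover\u0026quot; # a hypothetical title string\rxx$NLCD.pal \u0026lt;- c(\u0026quot;#5475A8\u0026quot;, \u0026quot;#85C77E\u0026quot;) # e.g., a couple of legend colors\rstr(xx, max.level = 1) # everything travels together as a single object\r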
This also makes it easer to create web apps (the global file then only has to return the list), but that will be a topic for a later date.\n First, let\u0026rsquo;s load in the requisite packages # Last build: 7/30/20\r# install.packages(\u0026quot;devtools\u0026quot;) # Will throw errors if RTools is not installed, you can ignore this for the remained of installation\r# install.packages(\u0026quot;geosphere\u0026quot;) # For geodesic buffer function\r# install.packages(\u0026quot;leaflet\u0026quot;) # For visualization\r# install.packages(\u0026quot;leafem\u0026quot;) # For visualization\r# install.packages(\u0026quot;raster\u0026quot;) # for raster vis\r# library(devtools, quietly = TRUE)\r# devtools::install_github(\u0026quot;ropensci/FedData\u0026quot;) # For easy NLCD download\r# devtools::install_github(c(\u0026quot;ropensci/getlandsat\u0026quot;)) # For easy landsat download\rlibrary(raster, quietly = TRUE) # vis\rlibrary(dplyr, quietly = TRUE) # Syntax sugar\rlibrary(lubridate, quietly = TRUE) # more syntax sugar\rlibrary(rgeos, quietly = TRUE) # second buffer function\rlibrary(jsonlite, quietly = TRUE) # API query\rlibrary(httr, quietly = TRUE) # web download\rlibrary(leaflet, quietly = TRUE) # vis\rlibrary(leafem, quietly = TRUE) # vis\rlibrary(htmltools, quietly = TRUE) # vis\rlibrary(FedData, quietly = TRUE) # Easy NLCD\rlibrary(getlandsat, quietly = TRUE) # Easy landsat\rDefine a few helper functions We\u0026rsquo;ll use a few helper functions to help us accomplish this task, one to plot a point in the world for use as the centroid of our AOI, and another to generate the approprite utm epsg for that point.\n## Helper functions\r## geocoding function using OSM Nominatim API -- details: http://wiki.openstreetmap.org/wiki/Nominatim\r## Inputs: a human readable address or colloquial name string\r## Outputs: success: a sf point with lat and long | Failure: an empty dataframe\r## modified from: D.Kisler @ https://datascienceplus.com/osm-nominatim-with-r-getting-locations-geo-coordinates-by-its-address/\rnominatim_osm \u0026lt;- function(address = NULL) {\r# Construct a url request\rd \u0026lt;- jsonlite::fromJSON( gsub('\\\\@addr\\\\@', gsub('\\\\s+', '\\\\%20', address), 'http://nominatim.openstreetmap.org/search/@addr@?format=json\u0026amp;addressdetails=0\u0026amp;limit=1'))\r# return parsed sf point\rreturn(\rsf::st_as_sf(d %\u0026gt;% select('lon','lat','display_name'),\rcoords=c(\u0026quot;lon\u0026quot;,\u0026quot;lat\u0026quot;), crs=sf::st_crs(4326),\rremove=TRUE)\r)\r}\r## geodesic buffer function -- details: http://wiki.openstreetmap.org/wiki/Nominatim\r## Inputs: a coordinate pair in EPSG:4326\r## Outputs: the equivelent UTM zone EPSG\r## all credit to Robin Lovelace, Jakub Nowosad, Jannes Muenchow @ https://geocompr.robinlovelace.net/reproj-geo-data.html\rlonlat2UTM = function(lonlat) {\rutm = (floor((lonlat[1] + 180) / 6) %% 60) + 1\rif(lonlat[2] \u0026gt; 0) {\rutm + 32600\r} else{\rutm + 32700\r}\r}\rDefine our AOI and a small processing area First, let\u0026rsquo;s define our AOI and a small buffer\n# set a working directory\rsetwd(\u0026quot;C:/Users/Cornholio/Desktop/coolclass\u0026quot;) # your own filepath here\r# Pick an AOI and (small) buffer distance\rxx \u0026lt;- list() xx$AOI_center \u0026lt;- nominatim_osm(\u0026quot;Lawrence, KS\u0026quot;) # xx$AOI_center \u0026lt;- sp::SpatialPoints(list(-95.23595,38.97194)) %\u0026gt;% sf::st_as_sfc() %\u0026gt;% sf::st_set_crs(4326) # or spesify a lon/lat point\rmybufferdist_m = 1200\r# Buffer AOI point\r# get UTM 
epsg\raoiUTMEPSG \u0026lt;- lonlat2UTM(sf::st_coordinates(xx$AOI_center))\r# reproject\rxx$AOI_center_utm \u0026lt;- sf::st_transform(xx$AOI_center,aoiUTMEPSG)\r# buffer (gBuffer requires a spatial object, not an sf), return to sf, reattach crs\rxx$AOI_buffer_utm \u0026lt;- rgeos::gBuffer(sp::SpatialPoints(sf::st_coordinates(xx$AOI_center_utm)),width = mybufferdist_m) %\u0026gt;%\rsf::st_as_sfc() %\u0026gt;%\rsf::st_set_crs(aoiUTMEPSG) xx$AOI_buffer_utm \u0026lt;- rgeos::gBuffer(sp::SpatialPoints(sf::st_coordinates(xx$AOI_center_utm)),width = mybufferdist_m) %\u0026gt;%\rsf::st_as_sfc()\rsf::st_crs(xx$AOI_buffer_utm) = aoiUTMEPSG\rxx$AOI_buffer \u0026lt;- sf::st_transform(xx$AOI_buffer_utm, 4326) # reproject back to wgs84\rxx$processbounds \u0026lt;- sf::st_bbox(xx$AOI_buffer) # st_bbox doesn't return a spatial object, so we'll correct for that here\rxx$AOI_bbox \u0026lt;- sf::st_bbox(xx$AOI_buffer) %\u0026gt;% sf::st_as_sfc() %\u0026gt;% sf::st_set_crs(4326)\r# Finally, let's make sure everything plots as expected\rleaflet() %\u0026gt;%\raddProviderTiles(providers$Stamen.TonerLite, group = \u0026quot;Base Map\u0026quot;) %\u0026gt;%\raddPolygons(data=xx$AOI_buffer,color=\u0026quot;red\u0026quot;)\r# Aside: was geodesic buffering necessary? Let's test using a \u0026quot;relative\u0026quot; simplification!\r# @ https://planetcalc.com/7721/ -- WGS 84 defines ellipsoid parameters as:\ra = 6378137.0 # Semi-major axis in meters\rb = 6356752.314245 # Semi-minor axis in meters\rl = sf::st_coordinates(xx$AOI_center)[2]\rradius_globe_meters \u0026lt;-\rsqrt((((((a^2)*cos(l))^2)+(((b^2)*sin(l))^2)))/((((a*cos(l))^2)+((b*sin(l))^2))))\rradius_degrees \u0026lt;- mybufferdist_m/radius_globe_meters\r# xx$AOI_badbuffer \u0026lt;- rgeos::gBuffer(sp::SpatialPoints(sf::st_coordinates(xx$AOI_center)),width = radius_degrees) %\u0026gt;% # sf::st_as_sfc() %\u0026gt;%\r# sf::st_set_crs(4326)\r# all credit to https://geocompr.robinlovelace.net/reproj-geo-data.html\rlonlat2UTM = function(lonlat) {\rutm = (floor((lonlat[1] + 180) / 6) %% 60) + 1\rif(lonlat[2] \u0026gt; 0) {\rutm + 32600\r} else{\rutm + 32700\r}\r}\raoiUTMEPSG \u0026lt;- lonlat2UTM(sf::st_coordinates(xx$AOI_center))\rxx$AOI_center_utm \u0026lt;- sf::st_transform(xx$AOI_center,newEPSG) xx$AOI_badbuffer \u0026lt;- rgeos::gBuffer(sp::SpatialPoints(sf::st_coordinates(xx$AOI_center_utm)),width = mybufferdist_m) %\u0026gt;% sf::st_as_sfc() %\u0026gt;%\rsf::st_set_crs(aoiUTMEPSG) xx$AOI_badbuffer \u0026lt;- sf::st_transform(xx$AOI_badbuffer,4326)\r# and check visually\rleaflet() %\u0026gt;%\raddProviderTiles(providers$Stamen.TonerLite, group = \u0026quot;Base Map\u0026quot;) %\u0026gt;%\raddPolygons(data=xx$AOI_buffer,color=\u0026quot;red\u0026quot;) %\u0026gt;%\raddPolygons(data=xx$AOI_badbuffer,color=\u0026quot;green\u0026quot;) # Conclusion: it's too late for me to be futzing around with this.... Any ideas?\rDiscover appropriate scenes for processing We next need to identify scenes for use. 
Landsat distriburts scenes by path and row more here\n# discover path and row\rhttr::GET('https://prd-wret.s3.us-west-2.amazonaws.com/assets/palladium/production/s3fs-public/atoms/files/WRS2_descending_0.zip', write_disk(paste0(getwd(), \u0026quot;/WRS2_descending_0.zip\u0026quot;)))\runzip(paste0(getwd(), \u0026quot;/WRS2_descending_0.zip\u0026quot;), exdir = getwd())\rxx$pathrow \u0026lt;- sf::read_sf(paste0(getwd(), \u0026quot;/WRS2_descending.shp\u0026quot;))\r# That pathrow file covers the entire globe, let's make this a tiny bit gentler to visualize\r# and subset that to just the pathrows that intersect our AOI\rxx$pathrow_subset \u0026lt;- xx$pathrow[xx$AOI_bbox,]\r# explore geographic context\rleaflet() %\u0026gt;%\raddProviderTiles(providers$Stamen.TonerLite, group = \u0026quot;Base Map\u0026quot;) %\u0026gt;%\raddPolygons(data=xx$AOI_buffer,color=\u0026quot;red\u0026quot;) %\u0026gt;%\raddPolygons(data=xx$pathrow_subset,\rcolor=\u0026quot;black\u0026quot;,\rfill = FALSE,\rlabel = mapply(function(x, y) {\rHTML(sprintf(\u0026quot;\u0026lt;em\u0026gt;Path/row:\u0026lt;/em\u0026gt;%s-%s\u0026quot;, htmlEscape(x), htmlEscape(y)))},\rxx$pathrow_subset$PATH, xx$pathrow_subset$ROW, SIMPLIFY = F),\rlabelOptions = lapply(1:nrow(xx$pathrow_subset), function(x) {\rlabelOptions(noHide = T)\r})) %\u0026gt;% fitBounds(as.numeric(floor(xx$processbounds$xmin)), as.numeric(floor(xx$processbounds$ymin)), as.numeric(ceiling(xx$processbounds$xmax)), as.numeric(ceiling(xx$processbounds$ymax)))\rDiscover appropriate scenes for processing part 2 That visualization was perhaps a bit overkill, all we needed to do here is ensure that our AOI is completely enclosed within a single path/row. We\u0026rsquo;ll cover mosaicking (stitching together multiple scenes at once) later in the course, but for now if you end up inside more than one, adjust your point until you fall squarely within a single tile (or entirely in a crossover section).\nHaving identified our path and row, let\u0026rsquo;s now explore the landsat catalog\n# Find the scenes for your path and row\rsummerdaterange \u0026lt;- interval(as.POSIXct(\u0026quot;2015-06-01 00:00:00\u0026quot;), as.POSIXct(\u0026quot;2015-09-01 00:00:00\u0026quot;)) summerscene \u0026lt;- getlandsat::lsat_scenes() %\u0026gt;% filter(row == xx$pathrow_subset$ROW) %\u0026gt;% # Filter for row...\rfilter(path == xx$pathrow_subset$PATH) %\u0026gt;% # and path\rfilter(acquisitionDate %within% summerdaterange) %\u0026gt;% # get our date\rarrange(cloudCover) %\u0026gt;% # put the least cloudy scene first\r.[1,] # and peel it off\rsummerscene\r# as we see, there is a suitable cloud free scene, let's look at it's formatting:\r# In addition to making a handy scene scrapper, the getLandsat package is also a wrapper to download scenes by name\rsummer_band4_path \u0026lt;- getlandsat::lsat_image(paste0(summerscene$entityId,\u0026quot;_B4.TIF\u0026quot;)) summer_band4 \u0026lt;- raster::raster(summer_band4_path)\rxx$summer_band4 \u0026lt;- raster::mask(summer_band4,sf::as_Spatial(xx$AOI_buffer_utm)) # and visualize\rleaflet() %\u0026gt;% addProviderTiles(providers$Stamen.TonerLite, group = \u0026quot;Base Map\u0026quot;) %\u0026gt;%\raddRasterImage(xx$summer_band4, layerId = \u0026quot;values\u0026quot;) %\u0026gt;% addMouseCoordinates() %\u0026gt;%\raddImageQuery(xx$summer_band4, type=\u0026quot;mousemove\u0026quot;, layerId = \u0026quot;values\u0026quot;)\r# Lets do the same for band 5\rsummer_band5_path \u0026lt;- 
getlandsat::lsat_image(paste0(summerscene$entityId,\u0026quot;_B5.TIF\u0026quot;)) summer_band5 \u0026lt;- raster::raster(summer_band5_path)\rxx$summer_band5 \u0026lt;- raster::mask(summer_band5,sf::as_Spatial(xx$AOI_buffer_utm)) Creating NDVI values Now, let\u0026rsquo;s perform our local operation\nsummer_ndvi \u0026lt;- raster::overlay(xx$summer_band5, xx$summer_band4, fun = function(r1, r2) { return( (r1 - r2)/(r1 + r2)) })\r# and visualize\rleaflet() %\u0026gt;% addProviderTiles(providers$Stamen.TonerLite, group = \u0026quot;Base Map\u0026quot;) %\u0026gt;%\raddRasterImage(summer_ndvi, layerId = \u0026quot;values\u0026quot;) %\u0026gt;% addMouseCoordinates() %\u0026gt;%\raddImageQuery(summer_ndvi, type=\u0026quot;mousemove\u0026quot;, layerId = \u0026quot;values\u0026quot;)\rGrabbing land cover Now that we have NDVI, we need a land cover dataset. more here about NLCD\nNLCD \u0026lt;- FedData::get_nlcd(template = sf::as_Spatial(xx$AOI_buffer_utm),\ryear = 2016,\rdataset = \u0026quot;Land_Cover\u0026quot;,\rlabel = \u0026quot;AOI\u0026quot;)\r# --------------------------------------------------------------------------------------------\r# backup, because MRLC service is down more often than it's up...\r# this may take a while because we're grabbing all of CONUS\rhttr::GET('https://s3-us-west-2.amazonaws.com/mrlc/NLCD_2016_Land_Cover_L48_20190424.zip', write_disk(paste0(getwd(), \u0026quot;/NLCD_2016_Land_Cover_L48.zip\u0026quot;)))\runzip(paste0(getwd(), \u0026quot;/NLCD_2016_Land_Cover_L48.zip\u0026quot;), exdir = getwd())\rNLCD \u0026lt;- raster::raster(paste0(getwd(),'/NLCD_2016_Land_Cover_L48_20190424.img'))\r# --------------------------------------------------------------------------------------------\rlandcover_utm \u0026lt;- NLCD %\u0026gt;%\rraster::projectRaster(crs = raster::crs(xx$summer_band4)) %\u0026gt;%\rraster::mask(sf::as_Spatial(xx$AOI_buffer_utm))\rlandcover_utm \u0026lt;- NLCD %\u0026gt;%\rraster::projectRaster(crs = raster::crs(xx$summer_band4)) %\u0026gt;%\rraster::mask(sf::as_Spatial(xx$AOI_buffer_utm))\r# Finally, we'll (visualize) write the raster to disk, garbage collect, and reload the layer as that was a fairly\r# expensive set of computations we just performed\r# Land cover color pallete values\rxx$NLCD.pal \u0026lt;- c(\u0026quot;#5475A8\u0026quot;,\u0026quot;#FFFFFF\u0026quot;,\u0026quot;#E8D1D1\u0026quot;,\u0026quot;#E29E8C\u0026quot;,\u0026quot;#FF0000\u0026quot;,\u0026quot;#B50000\u0026quot;,\u0026quot;#D2CDC0\u0026quot;,\u0026quot;#85C77E\u0026quot;,\u0026quot;#38814E\u0026quot;,\u0026quot;#D4E7B0\u0026quot;,\u0026quot;#AF963C\u0026quot;,\u0026quot;#DCCA8F\u0026quot;,\u0026quot;#FDE9AA\u0026quot;,\u0026quot;#D1D182\u0026quot;,\u0026quot;#A3CC51\u0026quot;,\u0026quot;#82BA9E\u0026quot;,\u0026quot;#FBF65D\u0026quot;,\u0026quot;#CA9146\u0026quot;,\u0026quot;#C8E6F8\u0026quot;,\u0026quot;#64B3D5\u0026quot;)\rxx$NLCD.val \u0026lt;- as.numeric(c(11,12,21,22,23,24,31,41,42,43,51,52,71,72,73,74,81,82,90,95))\rxx$NLCD.text \u0026lt;- c(\u0026quot;Open Water\u0026quot;,\u0026quot;Perennial Ice/Snow\u0026quot;,\u0026quot;Developed, Open Space\u0026quot;,\u0026quot;Developed, Low Intensity\u0026quot;,\u0026quot;Developed, Medium Intensity\u0026quot;,\u0026quot;Developed, High Intensity\u0026quot;,\u0026quot;Barren Land\u0026quot;,\u0026quot;Deciduous Forest\u0026quot;,\u0026quot;Evergreen Forest\u0026quot;,\u0026quot;Mixed Forest\u0026quot;,\r\u0026quot;Dwarf 
Scrub\u0026quot;,\u0026quot;Shrub/Scrub\u0026quot;,\u0026quot;Grassland/Herbaceous\u0026quot;,\u0026quot;Sedge/Herbaceous\u0026quot;,\u0026quot;Lichens\u0026quot;,\u0026quot;Moss\u0026quot;,\u0026quot;Pasture/Hay\u0026quot;,\u0026quot;Cultivated Crops\u0026quot;,\u0026quot;Woody Wetlands\u0026quot;,\u0026quot;Emergent Herbaceous Wetlands\u0026quot;)\rxx$NLCD.vis \u0026lt;- colorFactor(palette = xx$NLCD.pal,\rdomain = xx$NLCD.val)\rleaflet() %\u0026gt;%\raddProviderTiles(providers$Stamen.TonerLite, group = \u0026quot;Base Map\u0026quot;) %\u0026gt;%\raddRasterImage(NLCD,colors=xx$NLCD.vis) %\u0026gt;%\raddLegend(\u0026quot;bottomright\u0026quot;,\rtitle = \u0026quot;National Land Cover Data\u0026quot;, colors = xx$NLCD.pal,\rlabels = xx$NLCD.text)\rraster::writeRaster(landcover_utm,paste0(getwd(),\u0026quot;/AOI_LC.tif\u0026quot;))\rgc()\rxx$landcover \u0026lt;- raster::raster(paste0(getwd(),\u0026quot;/AOI_LC.tif\u0026quot;))\rzonal statistics We now have all the pieces we need to generate the desired statistics, our last step is to perform the zonal statistics\nxx$summer_ndvi \u0026lt;- raster::resample(summer_ndvi,xx$landcover,method=\u0026quot;bilinear\u0026quot;)\rxx$zonalstats \u0026lt;- raster::zonal(xx$summer_ndvi, xx$landcover, fun='mean', digits=0, na.rm=TRUE) Finishing up add here https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LC08_C01_T1_32DAY_NDVI\n"},{"id":71,"href":"/classes/random/GEE_intro/","title":"GEE","parent":"Random bin","content":"Introduction to Google Earth Engine: What is Google Earth Engine: Google Earth Engine (GEE) is a cloud-based data and analysis platform which combines more than 17 petabytes of geospatial data, analytic APIs, and a web based Integrated Development Environment in one package and runs on Google’s computational infrastructure, enabling interactive earth data analyses on scales not previously feasible. This platform was first introduced to the public in 2013 as a means of performing an analysis of global forest cover change, published in Science (Hansen et al., 2013). In this flagship application, the authors sought to quantify the spatial distribution and global state of forest loss and gain from 2000 to 2012 by blending more than 654,000 Landsat scenes across the 12 years of the analysis — the resultant analysis of 700 Terapixels of data took more than 1 million hours of computation and exported results in a comparatively trivial 4 days across the Google compute infrastructure.\nSuch analyses once used to sit behind such extreme barriers of entry that no one would have been able to produce, much less reproduce, these results.\n Accessing GEE: To use GEE, you will need to request a user account at https://signup.earthengine.google.com/#!/. Once approved, you have access to the datasets and the Google computation infrastructure. This access takes the form of a REST API. There are currently two means of doing so, one Python based, and the other JavaScript based. The less popular of the two methods, the Python library, allows users to interact with Earth Engine using the Python programming language. The Google Earth Engine API guide has a full walk through of how to install the needed libraries, and is available at https://developers.google.com/earth-engine/python_install. 
The more popular means of accessing GEE is through the JavaScript library which is accessed through a web-based IDE, more commonly called the Code Editor, by pointing a browser at https://code.earthengine.google.com/ and logging in with an authorized GEE account. Although not specifically required, it is recommended that you use Google Chrome, and as this is the most popular means of accessing GEE the rest of this entry will be written using the JavaScript IDE. Don’t worry if you are more familiar with Python, the transition to JavaScript is relatively painless and primers are available at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Introduction. One of the first steps for those new to GEE is to explore the features of the JavaScript API. In the upper right-hand corner of the code editor, click on help \u0026gt; Feature Tour to take a quick guided tour of the platform.\nGEE Operations: Broadly speaking there are 3 major types of operations one can perform in GEE: Methods, Algorithms, and Functions. The general structure of these are exemplified in Figure 1. Methods require an object to act on and take inputs, as exemplified in Figure 1, line 24, where we query the SRTM dataset for all values greater than 2000. Algorithms are objects themselves and take a specific input to return an object, as exemplified in Figure 1, line 27 where we use the Terrain algorithm to create a hillshade image from SRTM. Functions take an input and do something within the API, as exemplified in Figure 1, line 30, where we add the Hillshade layer to the map. Data Structures/Objects: Data objects in GEE are slightly different from more traditional desktop GIS software, but the concepts are very similar. These differences are most pronounced with the data structures outlined below in Figure 2. What GEE calls Images can be thought of as Rasters, and just as Rasters may have more than one band, so too may Images. What GEE refers to as an ImageCollection are simply a collection of Images. What GEE refers to as Features are analogous to features in a shapefile or geodatabase, so FeatureCollections are analogous to the larger shapefile or feature classes in a geodatabase. These data structures also borrow concepts from Object Oriented Programming (OOP) in that Feature can be thought of as the parent of Image and likewise ImageCollections are children of FeatureCollections. Therefore, many of the methods and operations available for Features and FeatureCollections are inherited by Images and ImageCollections. Asset Management: GEE hosts more than 17 petabytes of publicly available data which can be redistributed. However, users also have the option to upload their own data in popular GeoTIFF or shapefile formats for ingestion into the system as user assets. Assets are limited to 10 GB in size, and are counted as part of your shared Google storage quota (spread across Gmail, Google Drive, ect.). Once in the platform, they are treated the same as any other dataset in the platform and can be shared with other users.\nData Preparation: GEE handles many of the concerns typically associated with data preparation, such as storage, cataloging, and projecting the data appropriately. GEE has ingested many of the most popular remote sensing data from various satellite platforms such as LANDSAT, MODIS, and Sentinel to name just a few. The full collection can be browsed at https://developers.google.com/earth-engine/datasets/catalog/. 
When ingested, the data is stored in its native projection and format to preserve data integrity, and the metadata necessary to effectively use them is also included in each image as properties. Users may inspect, project, and resample the data as necessary, but because the GEE team takes care of these details when the data is ingested, the “time to science” is rapidly accelerated and users are free to spend that time on the analysis and presentation instead of the relatively canned portion of data preprocessing. By using the Search Box at the top of the IDE, you can find relevant datasets and explore how they can be used in GEE.\nMappers and Reducers: When working with data in GEE, there are two types of operations we can perform over the data, mapping and reducing. Mapping applies a function to each image in a collection, so a stack of n elements results in a transformed stack of n elements, and is useful when we need to do something to every image (for instance, calculating NDVI over a timeseries of images). Reducers on the other hand take one or more input features and return a reducer number of outputs. These often take the form of conventional map algebra expressions. These expressions can be as conceptually simple as pulling the maximum value of a stack of pixels in an ImageCollection to as complex as non-parametric trends of slope or regional reducers.\nJoins: Joining data is a hallmark of geospatial analysis, and GEE provides several ways in which different datasets may be joined together based on a specified condition. These conditions, or filters as they are called in GEE, can be spatial, tabular or temporal in nature. GEE applies the conditional filter, and items in the input collections that match the conditions are saved in the output collection depending what output the join wants to keep.\nImage Classification: GEE has several image classification methods built into the API. These include unsupervised and supervised classifiers. As of the time of this writing, these include: Cart, Naïve Bayes and Continuous Naïve Bayes, Decision Tree, Gmo Linear Regression and Max Ent, Ikpamir, Minimum Distance, Pegasos linear, and Polynomial, and Gaussian, Perceptron, Random Forest, Spectral Region, Scm, and Winnow. Additionally, if provided with training data, a confusion matrix can be calculated from that classification to present an accuracy assessment.\nExporting analyses: Although it is often more efficient to perform your desired analysis within GEE, exporting your results for further specialized analysis or publication can come in several forms depending on the desired purposes. This might include CSV files of time series, images, web map tiles, and videos. In addition, a suite of standard JavaScript User Interface (UI) items are available within the IDE which enables users to rapidly design and prototype user interfaces to include items such as combo and check boxes, sliders, and legend elements. These powerful features allow users to publish advanced and fully fleshed out web applications so that those who don’t have a GEE account can interact with or export analysis without looking at a single line of code.\nJavaScript: Example analysis: Let\u0026rsquo;s use GEE to perform one of the most common\n// Get an image using its ID var image = ee.Image(\u0026#39;LANDSAT/LT5_L1T/LT50260332010180EDC00\u0026#39;); //an image covering KC print(image) exploring properties' // Examine image properties/metadata // Get information about the bands as a list. 
var bandNames = image.bandNames(); print(\u0026#39;Band names: \u0026#39;, bandNames); // ee.List of band names // Get projection information from band 1. var b1proj = image.select(\u0026#39;B1\u0026#39;).projection(); print(\u0026#39;Band 1 projection: \u0026#39;, b1proj); // ee.Projection object // Get scale (in meters) information from band 1. var b1scale = image.select(\u0026#39;B1\u0026#39;).projection().nominalScale(); print(\u0026#39;Band 1 scale: \u0026#39;, b1scale); // ee.Number // Note that different bands can have different projections and scale. var b8scale = image.select(\u0026#39;B6\u0026#39;).projection().nominalScale(); print(\u0026#39;Band 6 scale: \u0026#39;, b8scale); // ee.Number // Get a list of all metadata properties. var properties = image.propertyNames(); print(\u0026#39;Metadata properties: \u0026#39;, properties); // ee.List of metadata properties // Get a specific metadata property. var cloudiness = image.get(\u0026#39;CLOUD_COVER\u0026#39;); print(\u0026#39;CLOUD_COVER: \u0026#39;, cloudiness); // ee.Number // Get the timestamp and convert it to a date. var date = ee.Date(image.get(\u0026#39;system:time_start\u0026#39;)); print(\u0026#39;Timestamp: \u0026#39;, date); // ee.Date // Display the image //set map center and zoom level Map.setCenter(-95,39,8); //add the image to the map Map.addLayer(image,{\u0026#39;bands\u0026#39;:[\u0026#39;B3\u0026#39;,\u0026#39;B2\u0026#39;,\u0026#39;B1\u0026#39;]},\u0026#39;Kansas City in natural color\u0026#39;); Map.addLayer(image,{\u0026#39;bands\u0026#39;:[\u0026#39;B4\u0026#39;,\u0026#39;B3\u0026#39;,\u0026#39;B2\u0026#39;]},\u0026#39;Kansas City in false color composite\u0026#39;); //import or use Landsat image collection // var lt5= ee.ImageCollection(\u0026#34;LANDSAT/LT5_L1T\u0026#34;); //print(\u0026#39;Landsat 5 collection\u0026#39;, lt5); // //Metadata about the collection // // Get the number of images. var count = lt5.size(); print(\u0026#39;Count: \u0026#39;, count); // Get the date range of images in the collection. var dates = ee.List(lt5.get(\u0026#39;date_range\u0026#39;)); var dateRange = ee.DateRange(dates.get(0), dates.get(1)); print(\u0026#39;Date range: \u0026#39;, dateRange); // Get statistics for a property of the images in the collection. var sunStats = lt5.aggregate_stats(\u0026#39;CLOUD COVER\u0026#39;); print(\u0026#39;CLOUD COVER statistics: \u0026#39;, sunStats); //filter images by time, location, and properties // // Load Landsat 5 data, filter by date and bounds. var ic1990s = lt5 .filterDate(\u0026#39;1990-01-01\u0026#39;, \u0026#39;1990-12-31\u0026#39;) .filterBounds(p69_135) //.filterMetadata(\u0026#39;CLOUD_COVER\u0026#39;,\u0026#39;less_than\u0026#39;,10); print(ic1990s) // Get the number of images. var count = ic1990s.size(); print(\u0026#39;Count: \u0026#39;, count); // Get the date range of images in the collection. var dates = ee.List(ic1990s.get(\u0026#39;date_range\u0026#39;)); var dateRange = ee.DateRange(dates.get(0), dates.get(1)); print(\u0026#39;Date range: \u0026#39;, dateRange); // Get statistics for a property of the images in the collection. var sunStats = ic1990s.aggregate_stats(\u0026#39;CLOUD_COVER\u0026#39;); print(\u0026#39;CLOUD COVER statistics: \u0026#39;, sunStats); ic1990s=ic1990s.filterMetadata(\u0026#39;CLOUD_COVER\u0026#39;,\u0026#39;less_than\u0026#39;,10); print(ic1990s); //Access individual images in the collection // var i1990 = ee.Image(ic1990s.first()); // Convert the collection to a list and get the number of images. 
var secondImage = ic1990s.toList(5).get(1); //print(ic1990s.toList(5)); print(\u0026#39;Second image: \u0026#39;, secondImage); // Sort by a cloud cover property, get the least cloudy image. var leastCloud = ee.Image(ic1990s.sort(\u0026#39;CLOUD_COVER\u0026#39;).first()); print(\u0026#39;Least cloudy image: \u0026#39;, leastCloud); // Limit the collection to the 10 most recent images. var mostRecent = ee.Image(ic1990s.sort(\u0026#39;system:time_start\u0026#39;, false).first()); print(\u0026#39;Most recent image: \u0026#39;, mostRecent); //add the data the map Map.centerObject(p69_135,12); //Map.addLayer(image); Map.addLayer(i1990,{\u0026#39;bands\u0026#39;:[\u0026#39;B4\u0026#39;,\u0026#39;B3\u0026#39;,\u0026#39;B2\u0026#39;]},\u0026#39;1990s\u0026#39;); // Load Landsat 5 data, filter by date and bounds. var ic2010s = ee.ImageCollection(\u0026#34;LANDSAT/LT5_L1T\u0026#34;) .filterDate(\u0026#39;2010-01-01\u0026#39;, \u0026#39;2010-12-31\u0026#39;) .filterBounds(p69_135) .filterMetadata(\u0026#39;CLOUD_COVER\u0026#39;,\u0026#39;less_than\u0026#39;,10); print(ic2010s) var i2010 = ee.Image(ic2010s.first()); //add the image Map.addLayer(i2010,{\u0026#39;bands\u0026#39;:[\u0026#39;B4\u0026#39;,\u0026#39;B3\u0026#39;,\u0026#39;B2\u0026#39;]},\u0026#39;2010s\u0026#39;); Homework: Exploring land cover/use change using Landsat 5 imagery available on GEE Decide one of your favorite places Select two images to show the changes Share your exploration (script) here\nCitations: Gorelick, N., Hancher, M., Dixon, M., Ilyushchenko, S., Thau, D., \u0026amp; Moore, R. (2017). Google Earth Engine: Planetary-scale geospatial analysis for everyone. Remote Sensing Of Environment, 202, 18-27. doi: 10.1016/j.rse.2017.06.031 Hansen, M. C., Potapov, P. V., Moore, R., Hancher, M., Turubanova, S. A., Tyukavina, A., … Townshend, J. R. G. (2013). High-Resolution Global Maps of 21st-Century Forest Cover Change. Science, 342(6160), 850–853. https://doi.org/10.1126/science.1244693 Coll, J. M. and Li, X. (2020). Google Earth Engine. The Geographic Information Science \u0026amp; Technology Body of Knowledge (1st Quarter 2020 Edition), John P. Wilson (ed.). DOI: 10.22224/gistbok/2020.1.9 "},{"id":72,"href":"/classes/random/website/oldwebsite/","title":"Old website tutorial","parent":"Creating a website","content":"Like apparently everything digital these days, not even a year later and most of the material I\u0026rsquo;ve written here is too out of date to be used as a direct tutorial (like nailing a flying carpet down, the reproducibility crisis might really be a thing). The academic theme has now moved to a quasi-GUI driven deployment system which both makes the site easier to create and deploy, but also strips it of the flexibility to modify it as I\u0026rsquo;ve outlined here. I am far too wed to this older format, and you could theoretically go back into github archives and download the older version that I use here. I\u0026rsquo;ll keep this page up as some of these tips are still relevant, but as always make sure you defer to the official documentation when learning how to do this. 
There are a lot of links in here, to highlight some key resources I used while I was learning how to do this.\n https://gohugo.io/ https://github.com/sourcethemes/academic-www https://github.com/gcushen/hugo-academic There are also several other tutorial linked at the end of this page that may be of use.\n The creation of this website was spurred by a desire to streamline my digital footprint and more easily facilitate sharing (Google Drive simply doesn\u0026rsquo;t cut it here, and OneDrive is a whole different beast). However, building a website of your own is not as straightforward or accessible as it otherwise could be. Somehow among the vast sea of ways to design and host a website, I found workable solution in the form of the HUGO.\nHUGO, a variation of the GO programming language, is touted as “The world’s fastest framework for building websites”. The HUGO documentation, particularly the hosting on github page, is a great place to start and is the “correct” way to go about creating a site. Below I will walk through how I managed to get it to work, your mileage may vary. I also recommend following the Academic theme instructions.\nA cautionary note:\nYour site doesn’t have to look like mine, feel free to pick any off the themes page. The theme you choose will do most of the heavy lifting, and hopefully you find one that does everything you want it to. If not, be prepared to spend a few days figuring out how HUGO and the themes work; there is no shortcut to creating new site behavior and I found it to be a bit unforgiving at times. In short, unless you REALLY want to sink time into this skillset, pick a theme and be happy with it 😊 \u0026ldquo;Step by step\u0026rdquo; instructions Create a github account: Create a Github account. I suppose in an ideal world your website name would be the same as your github username, but these can mismatch. I believe a free user account is restricted to one web page per account, and I haven’t bothered to test that limit.\n(if on windows) Install git for windows: Follow download and installation instructions at https://gitforwindows.org/.\nCreate the website url repository: It is possible to buy a domain to customize your url, but in my case I’m cheap, and jimcoll.github.io is perfectly fine with me. Note that here I deviate from the HUGO docs. I don\u0026rsquo;t use github the right way, and I don\u0026rsquo;t really have a reason to use a separate github repository to keep the site files, instead opting to keep them on my local hard drive.\nCreate a website folder: Next we need to create a folder where our website will live. This folder ideally shouldn’t be moved after you’ve created it, so place it somewhere useful (read: not the desktop). I gave into our digital overlords and moved most of my footprint to the cloud (OneDrive), so on my root folder I created a folder called myWebsite. Then right click on the folder and select the “Always keep it on this device”.\nInstalling HUGO: Next you need to install HUGO in the website directory. Download the appropriate windows install from the HUGO releases. Unzip the file and place the hugo.exe file in your myWebsite folder.\nCreate the website: This is where you need to start thinking for yourself. This process varies depending on which theme you end up choosing. Here I’ll just point you to the Academic doc pages. The raw doc files are also useful for formatting guidance.\nUpdating your website: When you want to \u0026ldquo;manage your content\u0026rdquo;, it’s easiest to test it locally first. 
Shift-right click in the myWebsite folder and “Open PowerShell window here”. Then, type .\\hugo server --disableFastRender. You can then open a browser and go to http://localhost:1313/. At this point, when you make and save changes to your files, HUGO will rebuild your site automagically and changes appear instantly on the browser. When you are happy with the edits you\u0026rsquo;ve made, you next need to generate your site. Again, from the myWebsite folder, open up PowerShell and generate the files with the .\\hugo command. This repopulates the myWebsite\\public folder. The last step is to push your new changes up to the repository, which you can do by right clicking on the public folder and selecting \u0026ldquo;Git GUI Here\u0026rdquo;, then\u0026hellip;\n Stage Changed files create a commit message Commit the changes, and Push the changes up to the repository. How this site was created: This site is a smashup of the aafu and Academic themes. Although the Academic theme has just about every feature you might need, I fell in love with the AAFU theme and the accordion effect and was too enthralled not to try and integrate. To do this, you need to understand a little about how HUGO creates a website. HUGO uses a tiered folder system to generate the site, and will look in folders following a defined lookup order. Therefore, we need to overwrite the academic home page theme with the aafu theme. To do this, I first install the Academic theme, and then replace the partials folder with the aafu theme. The last step is to reconfigure the config_default/ folder with the aafu theme markdown files. Most of these are straightforward, but within the config_default\\config.toml, we need to set the theme order as follows.\n# Configuration of Academic # Documentation: https://sourcethemes.com/academic/ # # This file is formatted using TOML syntax - learn more at https://learnxinyminutes.com/docs/toml/ # Each configuration section is defined by a name in square brackets (e.g. `[outputs]`). # Title of your site title = \u0026#34;Jim Coll\u0026#34; # The URL of your site. # End your URL with a `/` trailing slash, e.g. `https://example.com/`. baseurl = \u0026#34;https://jimcoll.github.io/\u0026#34; # Enter a copyright notice to display in the site footer. # To display a copyright symbol, type `\u0026amp;copy;`. For current year, type `{year}`. copyright = \u0026#34;\u0026#34; # Enable analytics by entering your Google Analytics tracking ID googleAnalytics = \u0026#34;\u0026#34; ############################ ## Advanced options below ## ############################ # Name of Academic theme folder in `themes/`. theme = [\u0026#34;academic\u0026#34;, \u0026#34;aafu\u0026#34;] defaultContentLanguageInSubdir = false removePathAccents = true # Workaround for https://github.com/gohugoio/hugo/issues/5687 # Get user avatars from Gravatar.com? (true/false) gravatar = false # Align the main menu to the right of the page? (true/false) menu_align_right = false # Show estimated reading time for posts? (true/false) reading_time = true # Display next/previous section pager? (true/false) section_pager = false docs_section_pager = true # Display pager in Docs layout (e.g. tutorials)? # Enable in-built social sharing buttons? (true/false) sharing = true paginate = 10 # Number of items per page in paginated lists. # Taxonomies. 
[taxonomies] tag = \u0026#34;tags\u0026#34; category = \u0026#34;categories\u0026#34; publication_type = \u0026#34;publication_types\u0026#34; author = \u0026#34;authors\u0026#34; [params] title = \u0026#34;Jim Coll\u0026#34; author = \u0026#34;Darshan Baral\u0026#34; description = \u0026#34;Jim Coll\u0026#34; copyright = \u0026#34;\u0026#34; The rest of that file is the remains of the aafu theme.\n Adding classes To get menus to work, my /config/_default/menus.toml file looks like:\n# Navigation Links # To link a homepage widget, specify the URL as a hash `#` followed by the filename of the # desired widget in your `content/home/` folder. # The weight parameter defines the order that the links will appear in. [[main]] name = \u0026#34;Main Website\u0026#34; url = \u0026#34;\u0026#34; weight = 1 [[main]] name = \u0026#34;GEOG 358\u0026#34; url = \u0026#34;/courses/geog358/\u0026#34; weight = 2 [[main]] name = \u0026#34;GEOG 558\u0026#34; url = \u0026#34;/courses/geog558/\u0026#34; weight = 3 [[main]] name = \u0026#34;Drone Mapping\u0026#34; url = \u0026#34;/courses/dronemapping/\u0026#34; weight = 4 [[main]] name = \u0026#34;Random\u0026#34; url = \u0026#34;/courses/random/\u0026#34; weight = 4 [[geog358]] url = \u0026#34;/courses/geog358\u0026#34; [[geog558]] url = \u0026#34;/courses/geog558\u0026#34; [[random]] url = \u0026#34;/courses/random\u0026#34; [[dronemapping]] url = \u0026#34;/courses/dronemapping\u0026#34; Note that you will also have to set up these links in the AAFU theme as well, these menues are for the academic theme.\n Make it your own One of the best ways I\u0026rsquo;ve found to help improve my understanding of the software/deployment was to change some of the layouts. To add your resume to your image card as a link, and not as an icon as I have, you need to tweak the footer of the aafu theme, which should be in the themes\\academic\\layouts\\partials\\footer.html. Mine now looks like so:\n\u0026lt;footer class=\u0026#34;mb-4\u0026#34;\u0026gt; \u0026lt;a href=\u0026#34;https://jimcoll.github.io/courses/random/media/JamesCollSharedWebResume.docx\u0026#34;\u0026gt;Download a \u0026#34;more traditional\u0026#34; resume \u0026lt;/a\u0026gt; \u0026lt;/br\u0026gt; powered by \u0026lt;a href=\u0026#34;https://gohugo.io/\u0026#34;\u0026gt;hugo\u0026lt;/a\u0026gt; \u0026amp; deployed on \u0026lt;a href=\u0026#34;https://github.com/\u0026#34;\u0026gt;github\u0026lt;/a\u0026gt; \u0026amp;middot; \u0026lt;i\u0026gt;\u0026lt;a href=\u0026#34;https://github.com/darshanbaral/aafu\u0026#34;\u0026gt;aafu\u0026lt;/a\u0026gt;\u0026lt;/i\u0026gt; by \u0026lt;a href=\u0026#34;https://www.darshanbaral.com/\u0026#34;\u0026gt;Darshan\u0026lt;/a\u0026gt; \u0026amp; \u0026lt;i\u0026gt;\u0026lt;a href=\u0026#34;https://github.com/gcushen/hugo-academic\u0026#34;\u0026gt;Academic\u0026lt;/a\u0026gt;\u0026lt;/i\u0026gt; by \u0026lt;a href=\u0026#34;https://georgecushen.com/\u0026#34;\u0026gt;George\u0026lt;/a\u0026gt; \u0026lt;/footer\u0026gt; Website width The Academic theme follows the \u0026ldquo;best\u0026rdquo; practice of a mobile first design, and therefore scales pages to an otherwise anemic width. This is functional, but I was unhappy with it. Following tips on this post, you can change the head of the file at themes\\academic\\assets\\sass\\academic_docs.scss to a larger max width. 
A small bump is all you need to make it look prettier (960 is about 12 point font on a word document, 1440 is also a solid choice).\n/*************************************************\r* Documentation layout\r**************************************************/\r.docs-article-container {\rmax-width: 960px;\r}\r Adding tabbed content Below is a code snipit that shows how to lay out material within a tabbed content box. Of note, in order to mix markdown into html elements, we need to append markdown=\u0026ldquo;1\u0026rdquo; to each div element, and DIV needs to be capitalized in order for this to render properly. I prefer removing indentations here, it makes writing the raw content easier and doesn\u0026rsquo;t break formatting on Chrome (other browsers untested). The \u0026amp;nbsp; in the headers are added for spacing. Unfortunately, having headers above level 4 within tabbed boxes breaks the page generated table of contents on the right hand side and so should be avoided.\n\u0026lt;ul class=\u0026quot;nav nav-tabs\u0026quot;\u0026gt;\r\u0026lt;li class=\u0026quot;active\u0026quot;\u0026gt;\u0026lt;a data-toggle=\u0026quot;tab\u0026quot; href=\u0026quot;#home\u0026quot;\u0026gt;\u0026amp;nbsp;\u0026amp;nbsp;Home\u0026amp;nbsp;\u0026amp;nbsp;\u0026lt;/a\u0026gt;\u0026lt;/li\u0026gt;\r\u0026lt;li\u0026gt;\u0026lt;a data-toggle=\u0026quot;tab\u0026quot; href=\u0026quot;#menu1\u0026quot;\u0026gt;\u0026amp;nbsp;\u0026amp;nbsp;Menu 1\u0026amp;nbsp;\u0026amp;nbsp;\u0026lt;/a\u0026gt;\u0026lt;/li\u0026gt;\r\u0026lt;li\u0026gt;\u0026lt;a data-toggle=\u0026quot;tab\u0026quot; href=\u0026quot;#menu2\u0026quot;\u0026gt;\u0026amp;nbsp;\u0026amp;nbsp;Menu 2\u0026amp;nbsp;\u0026amp;nbsp;\u0026lt;/a\u0026gt;\u0026lt;/li\u0026gt;\r\u0026lt;/ul\u0026gt;\r\u0026lt;DIV class=\u0026quot;tab-content\u0026quot; markdown=\u0026quot;1\u0026quot;\u0026gt;\r\u0026lt;DIV id=\u0026quot;home\u0026quot; class=\u0026quot;tab-pane fade show active\u0026quot; markdown=\u0026quot;1\u0026quot;\u0026gt;\r\u0026lt;h3\u0026gt;HOME tab is shown by default\u0026lt;/h3\u0026gt;\r\u0026lt;p\u0026gt;Some content.\u0026lt;/p\u0026gt;\r\u0026lt;/DIV\u0026gt;\r\u0026lt;DIV id=\u0026quot;menu1\u0026quot; class=\u0026quot;tab-pane fade\u0026quot; markdown=\u0026quot;1\u0026quot;\u0026gt;\r#### This is content in markdown\rNeat * List\r* More list\r\u0026lt;/DIV\u0026gt;\r\u0026lt;DIV id=\u0026quot;menu2\u0026quot; class=\u0026quot;tab-pane fade\u0026quot; markdown=\u0026quot;1\u0026quot;\u0026gt;\r\u0026lt;h3\u0026gt;Menu 2\u0026lt;/h3\u0026gt;\r\u0026lt;p\u0026gt;Some more content in menu 2.\u0026lt;/p\u0026gt;\r\u0026lt;/DIV\u0026gt;\r\u0026lt;/DIV\u0026gt;\rHow this renders in site:\n\u0026nbsp;\u0026nbsp;Home\u0026nbsp;\u0026nbsp;\r\u0026nbsp;\u0026nbsp;Menu 1\u0026nbsp;\u0026nbsp;\r\u0026nbsp;\u0026nbsp;Menu 2\u0026nbsp;\u0026nbsp;\r\rHOME tab is shown by default\rSome content.\n\r#### This is content in markdown\rNeat List More list \rMenu 2\rSome more content in menu 2.\n\r\r Extra Links George Cushen, in addition to the academic docs, has likewise created a tutorial on his personal site detailing this workflow. Leslie Myint\u0026rsquo;s website has a great post that effectively covers a large cross section of changes to the Academic theme. 
Some links to style formatting, more for me than anything\n https://github.com/gcushen/hugo-academic/issues/84 "},{"id":73,"href":"/classes/random/","title":"Random bin","parent":"Classes","content":"This is a place for me to dump some of my uncategorized work, links to other great content creators, and other neat resources I’ve found along my way.\n Educational-ish links Need data? Random education bits Tutorials: ClusteR The worst lines of code I\u0026rsquo;ve ever written: R Python GEE (JavaScript) Educational-ish links Dev tools\n A handy palette generator colorbrewer a great place to find good looking palette, needs no introduction color harmony identification is a tool to find GIS Resources\n GIS lounge has compiled a nice cross section of links to open course-ware across the academic disciplines, and a few GIS-centric resources. Earth Lab has a massive amount of R and Python tutorials related to Earth Data Science. Hydroshare: “github for water” is a sad understatement of the capabilities of the platform, developed by the amazing folks at CUAHSI. OpenTOPO: A great clearinghouse and web toolset for topo data. OSGeoLive: Want to run free? This VM has every open tool you’ll need to do GIScience. For use in VirtualBox. Next, a set of three resources from UCGIS: A compiled list of resoures to expand classes which serve as a great spot to find tutorials and other class material Recorded webinars from the UCGIS Webinars \u0026amp; Workshops series. The UCGIS Body of Knowledge is a great place to start for your broad GIS questions or to get a broad overview on a particular topic. https://geoscripting-wur.github.io/, A very neat implementation of a class. Geospatial Analysis—A comprehensive guide (6th edition) by de Smith, M. J., M. F. Goodchild, P. A. Longley (2018) may serve as a great guide to use as a course book should you want one. Finally, the official list of Google Earth Engine tutorials is a great place to start your self guided introduction to the platform. Next, a few R specific/centric resources:\n Tutorials covering a nice cross section of social GIS from the Center for Spatial Data Science (CSDS) An R based introduction to Geographic Information Science from Mike These R enabled books are excellent resources:\n Big Book of R compiled by Oscar Baruffa is a great catchall to start your R exploration Intro to GIS and Spatial Analysis by Manuel Gimond Geocomputation with R by Robin Lovelace, Jakub Nowosad \u0026amp; Jannes Muenchow An incomplete version of Data Visualization: A practical introduction by Kieran Healy Spatial Data Science by Edzer Pebesma and Roger Bivand Using Spatial Data with R by Claudia A Engel r-statistics.co by Selva Prabhakaran Web Application Development with R Using Shiny by Chris Beeley and Shitalkumar R. Sukhdeve R for Data Science by Garrett Grolemund blogdown: Creating Websites with R Markdown by Yihui Xie, Amber Thomas, and Alison Presmanes Hill Fun folks\n First, a few from John Nelson, a phenomenal cartographer and tutorial writer. https://adventuresinmapping.com/ https://www.esri.com/arcgis-blog/author/j_nelson/ https://www.youtube.com/channel/UCpdwmy5JTFNUkKknxHH9Dsg Mike Johnson: A co-conspirator, data wizard, and all around great dude. hydroblog from Josh Erickson has a neat assortment of his work for the US forest service (check out his github as well). ESRI educational manager Joseph Kerski, and the Spatial Reserves blog are a great example of what a skilled and broadly trained geographer is capable of accomplishing. 
Neat youtubes\n 3Blue1Brown is absolutely amazing. I wish this had been available when I was sitting through math, and consider this mandatory viewing. The Surface Dynamics Modeling Lab out of the University of Alabama has an excellent archive of invited speaker talks and tutorials. PBS Space Time puts together physics-centric videos that revolve around, well, space and time. Numberphile covers interesting or otherwise unique aspects of math in a casual but informative way. My inner film major loves the overly analyzed aspects of film covered in Every Frame a Painting. Captain Disillusion does a phenomenal job explaining digital concepts and demonstrating how modern CGI can be used. The CUAHSI channel features many of their recorded webinars. Finally, Apetor can teach us all a little something about the wonder of life. Thesis leftovers\n A misguided attempt at a website for my master's. Placed here more to remind me to do something with it one of these days: https://sites.google.com/site/globalsnowobservatory/ A post about Mann-Kendall and Sen's slope in GEE from me. At some point I'll get around to doing something with this too. Need data? Cartographic line data: http://www.projectlinework.org/\nBase imagery: https://www.naturalearthdata.com/downloads/10m-raster-data/\nTopo resources: http://www.earthpoint.us/TopoMap.aspx\nWater resources: https://www.hydroshare.org/landingPage/\nEarth as art: https://eros.usgs.gov/image-gallery/earth-as-art\n Random education bits Tutorials: GEE slides: https://docs.google.com/presentation/d/e/2PACX-1vS6dyB5JjpXAUpQGOj7gzlgpGxGBjAWLqkwizxEbblkzwzNjgPX-xDe69hvSJ_hxAv529ojzyeT1GVD/pub?slide=id.g7a554f672f_1_148 Other useful slide decks: https://agu2019.earthoutreach.org/booth ClusteR See the github: https://github.com/JimColl/clusteR or its companion on my website: https://jimcoll.github.io/random/\nAbout:\nA tool to add points to a map from a csv or shapefile and interactively cluster them as you see fit; this GUI driven geographic database display and clustering application is built on R. It is also a nearly minimal example of how to tie a map and a database together, how reactive shiny elements work, and how to print dashboards. A project that grew out of a request to see a database from a geographic perspective, it is a FOSS implementation of GIS for people who don't know GIS or don't have the time to learn it.\nFeatures\n GUI driven interface to select a csv or shapefile database. Use drop downs to choose how to cluster (currently broken), color, and label the points. Edit the table view of the database and optionally save the database back out. Click on a table entry to zoom to that location. Fully reactive map and editing\n How to run:\nThe "installation" of this program is trivially easy by design and requires no permissions. To start the process:\n Download and unzip this repository using the green "Clone or download" button. Select a csv or a shapefile, enter the LAT LON and EPSG fields as necessary. Replace spaces with. and hit run! A test dataset is included in the data folder. To "uninstall", just delete the folder. Pandoc and PhantomJS are also installed, and may be removed through the add or remove programs system dialog and the user/AppData/Roaming folder respectively.\nKnown bugs:\n Selecting fields which are too long or numeric will currently kill the crab. To recover, close the window and re-run the HTA. Map and data don't wrap smoothly.
Fly to (when you click on a row) will sometimes cause the server to lock up. To fix, zoom in a bit before attempting to fly. Legends can be too large for the window. Print button works but output is gross. Map legends are also not printed. Pandoc needs to restart after being installed, so attempting to print after the installation will break the server. This will only happen when you first run clusteR. To recover, close the window and re run the HTA. Editing the table is not fully tested. Shapefile editing broken. The worst lines of code I\u0026rsquo;ve ever written: A fun(?) section more for me than for anything else to showcase how grossly incompetent I am at actually getting a computer to do what I want.\nR mybbox \u0026lt;- raster::extent(sf::st_buffer( x=xx$mapindex.poly[subset(xx$address.point, eval(parse(text = names(xx$flood.grid)[timestep]))\u0026gt;0),][xx$mapindex.poly[subset(xx$address.point, eval(parse(text = names(xx$flood.grid)[timestep]))\u0026gt;0),]$`Index Label`==index,], dist=0.00005, nQuadSegs=1, endCapStyle=\u0026#34;SQUARE\u0026#34;, joinStyle=\u0026#34;ROUND\u0026#34;, mitreLimit=1, singleSide=FALSE)) The mapping side of FOSSFlood is a bit wild, but this line takes the lead. I needed a slightly larger bounding box from the index that a user wants to view so that I could fly to it. However, these indexes aren\u0026rsquo;t necessarily the same for each time step, so the subset got that eval(parse()) function, inside a buffer, and because it\u0026rsquo;s all temporary this made sense (at the time) to do in the single step you see here.\nPython for x in np.nditer(branch_GRIDCODES_lookup): for y in np.array([[x]])[0][0][0].split(\u0026#34;,\u0026#34;): if int(y)==mat_num: branch_num = np.array([[x]])[0][0][1] This chuck, called iteratively in a for loop, looks at all the grid codes for a given fldpln library and, for a given mat file returns the branch code for that mat file, all so I can eventually stich them together in a numpy array. I have a hate-love-hate relationship with Python\u0026hellip;.\nGEE (JavaScript) if(SpaceSelectValue == \u0026#34;stdDev\u0026#34;) { results = ee.ImageCollection(app.MODDef) .select(stringSeperator,LCtoUse) .map(function(image) { return image.reduceRegions({ collection: AOI.geometry(), reducer: ee.Reducer.stdDev().group({ groupField: 1, groupName: \u0026#34;code\u0026#34;, }), scale: resultsScale }).map(function(f) { // Process the grouped results from list of dictionaries to dictionary. var dict = ee.List(f.get(\u0026#34;groups\u0026#34;)).map(function(group) { group = ee.Dictionary(group).combine(ee.Dictionary({code:99,stdDev:[null]}), false); var code = ee.Number(group.get(\u0026#34;code\u0026#34;)).format(\u0026#34;code_%d\u0026#34;); var stdDev = group.get(\u0026#34;stdDev\u0026#34;); return [code, stdDev]; }); dict = ee.Dictionary(dict.flatten()); // Add a date property to each output feature. return f.set(\u0026#34;date YYYY-MM-dd\u0026#34;, image.date().format(\u0026#34;YYYY-MM-dd\u0026#34;)) .set(\u0026#34;system:time_start\u0026#34;, image.date().millis()) .set(dict) .set(\u0026#34;groups\u0026#34;, null); }); }); A lot of my JavaScript is pretty repetitive as a result of my poor skills and a sub par understanding of client-server interactions. 
Although this isn't the most egregious example, it is repeated in TEAM no less than 7 times, with minor changes for each statistic of aggregation.\n"},{"id":74,"href":"/classes/random/website/","title":"Creating a website","parent":"Random bin","content":" "Step by step" instructions Create a github account: (if on windows) Install git for windows: Create the website url repository: Create a website folder: Installing HUGO: Create the website: Updating your website: How this site was created: Make it your own Adding potree Extra Links I've kept an older version of this page just in case it helps anyone, but in most instances you should defer to this one. I'm sure a year from now this will all have changed again anyways… The creation of this website was spurred by a desire to streamline my digital footprint and more easily facilitate sharing. However, building a website of your own is not as straightforward or accessible as it otherwise could be. Somehow, among the vast sea of ways to design and host a website, I found a workable solution in the form of HUGO.\nHUGO, a static site generator written in the GO programming language, is touted as "The world's fastest framework for building websites". The HUGO documentation, particularly the hosting on github page, is a great place to start and is the "correct" way to go about creating a site. Below I will walk through how I managed to get it to work; your mileage may vary.\nA cautionary note:\nYour site doesn't have to look like mine; feel free to pick any theme off the themes page. The theme you choose will do most of the heavy lifting, and hopefully you find one that does everything you want it to. If not, be prepared to spend a few days figuring out how HUGO and the themes work; there is no shortcut to creating new site behavior and I found it to be a bit unforgiving at times. In short, unless you REALLY want to sink time into this skillset, pick a theme and be happy with it 😊 "Step by step" instructions Create a github account: Create a Github account. I suppose in an ideal world your website name would be the same as your github username, but these can mismatch. I believe a free user account is restricted to one web page per account, and I haven't bothered to test that limit.\n(if on windows) Install git for windows: Follow the download and installation instructions at https://gitforwindows.org/.\nCreate the website url repository: It is possible to buy a domain to customize your url, but in my case I'm cheap, and jimcoll.github.io is perfectly fine with me. Note that here I deviate from the HUGO docs. I don't use github the right way, and I don't really have a reason to use a separate github repository to keep the site files, instead opting to keep them on my local hard drive.\nCreate a website folder: Next we need to create a folder where our website will live. This folder ideally shouldn't be moved after you've created it, so place it somewhere useful (read: not the desktop). I gave in to our digital overlords and moved most of my footprint to the cloud (OneDrive), so in my root folder I created a folder called myWebsite. Then right click on the folder and select "Always keep it on this device".\nInstalling HUGO: Next you need to install HUGO in the website directory. Download the appropriate Windows release from the HUGO releases page; a PowerShell sketch of this download-and-unzip step follows, if you prefer the command line.
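Here is a rough PowerShell sketch of the download-and-unzip step. The release file name is a placeholder (substitute the actual version and asset you picked from https://github.com/gohugoio/hugo/releases), and the folder path is hypothetical:
# Move into your website folder (hypothetical path - use your own)
cd C:\Users\you\OneDrive\myWebsite
# Download the Windows release asset you chose (placeholder version number)
Invoke-WebRequest -Uri "https://github.com/gohugoio/hugo/releases/download/vX.Y.Z/hugo_X.Y.Z_windows-amd64.zip" -OutFile "hugo.zip"
# Extract into the current folder; depending on how the zip is laid out,
# you may need to move hugo.exe up into the myWebsite folder itself
Expand-Archive -Path "hugo.zip" -DestinationPath "."
Remove-Item "hugo.zip"
# Confirm hugo.exe runs
.\hugo.exe version
The prose steps that follow describe the same thing done by hand.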
Unzip the file and place the hugo.exe file in your myWebsite folder.\nCreate the website: This is where you need to start thinking for yourself. This process varies depending on which theme you end up choosing. Here I’ll just point you to the Academic doc pages. The raw doc files are also useful for formatting guidance.\nUpdating your website: When you want to \u0026ldquo;manage your content\u0026rdquo;, it’s easiest to test it locally first. Shift-right click in the myWebsite folder and “Open PowerShell window here”. Then, type .\\hugo server --disableFastRender. You can then open a browser and go to http://localhost:1313/. At this point, when you make and save changes to your files, HUGO will rebuild your site automagically and changes appear instantly on the browser. When you are happy with the edits you\u0026rsquo;ve made, you next need to generate your site. Again, from the myWebsite folder, open up PowerShell and generate the files with the .\\hugo command. This repopulates the myWebsite\\public folder. The last step is to push your new changes up to the repository, which you can do by right clicking on the public folder and selecting \u0026ldquo;Git GUI Here\u0026rdquo;, then\u0026hellip;\n Stage Changed files create a commit message Commit the changes, and Push the changes up to the repository. How this site was created: This site is a smashup of the aafu and geekdocs themes. Although the Academic theme has just about every feature you might need, I fell in love with the AAFU theme and the accordion effect and was too enthralled not to try and integrate. To do this, you need to understand a little about how HUGO creates a website. HUGO uses a tiered folder system to generate the site, and will look in folders following a defined lookup order. Therefore, we need to overwrite the academic home page theme with the aafu theme. To do this, I first install the Academic theme, and then replace the partials folder with the aafu theme. The last step is to reconfigure the config_default/ folder with the aafu theme markdown files. Most of these are straightforward, but within the config_default\\config.toml, we need to set the theme order as follows.\n Make it your own One of the best ways I\u0026rsquo;ve found to help improve my understanding of the software/deployment was to change some of the layouts. To add your resume to your image card as a link, and not as an icon as I have, you need to tweak the footer of the aafu theme, which should be in the themes\\academic\\layouts\\partials\\footer.html. 
Mine now looks like so:\n\u0026lt;footer class=\u0026#34;mb-4\u0026#34;\u0026gt; \u0026lt;a href=\u0026#34;https://jimcoll.github.io/random/media/JamesCollSharedWebResume.docx\u0026#34;\u0026gt;Download a \u0026#34;more traditional\u0026#34; resume \u0026lt;/a\u0026gt; \u0026lt;/br\u0026gt; powered by \u0026lt;a href=\u0026#34;https://gohugo.io/\u0026#34;\u0026gt;hugo\u0026lt;/a\u0026gt; \u0026amp; deployed on \u0026lt;a href=\u0026#34;https://github.com/\u0026#34;\u0026gt;github\u0026lt;/a\u0026gt; \u0026amp;middot; \u0026lt;i\u0026gt;\u0026lt;a href=\u0026#34;https://github.com/darshanbaral/aafu\u0026#34;\u0026gt;aafu\u0026lt;/a\u0026gt;\u0026lt;/i\u0026gt; by \u0026lt;a href=\u0026#34;https://www.darshanbaral.com/\u0026#34;\u0026gt;Darshan\u0026lt;/a\u0026gt; \u0026amp; \u0026lt;i\u0026gt;\u0026lt;a href=\u0026#34;https://github.com/gcushen/hugo-academic\u0026#34;\u0026gt;Academic\u0026lt;/a\u0026gt;\u0026lt;/i\u0026gt; by \u0026lt;a href=\u0026#34;https://georgecushen.com/\u0026#34;\u0026gt;George\u0026lt;/a\u0026gt; \u0026lt;/footer\u0026gt; Adding potree TODO\n Extra Links https://discourse.gohugo.io/t/two-themes-as-separate-hugo-directories-deployed-to-the-same-website/27899/4 https://conversiontools.io/convert/excel-to-html Some links to style formatting, more for me than anything\n https://github.com/gcushen/hugo-academic/issues/84 "},{"id":75,"href":"/classes/categories/","title":"Categories","parent":"Classes","content":""},{"id":76,"href":"/classes/","title":"Classes","parent":"","content":"The world presents us with its unbounded complexity, and it is up to each of us to deal with this complexity in our own way. While simplification may save us from sensory overload and decision paralysis, these same simplifications generate friction and discomfort. I truly believe that a well-trained geographer has many of the skills and toolsets needed to slide through that friction to address these complexities and answer the most pressing questions this world has to offer. GIScience and Technology is a critical subset of those tools, and a great way to help frame and analyze the space around you. I\u0026rsquo;ve gotten a lot of utility and benefit from learning, using, and teaching these topics and it is my hope that I can pass some of that knowledge along to you. I’ve placed a lions share of my teaching portfolio here for the benefit of all and my own selfish desire to streamline my digital footprint. Feel free to send me typos, mis-formatted pages, suggestions, edits, other resources, and otherwise follow along. I’ve found GIS to be a profoundly useful tool that helps me add structure, reason, and logic to this otherwise variable world, and my hope is that by the time you\u0026rsquo;ve left this site you\u0026rsquo;ll have taken away something useful as well.\nHow to use the site I\u0026rsquo;ve attempted to standardize my formatting but you have landed on the accumulation of almost every class I\u0026rsquo;ve taught, so there might be a few inconsistencies. I also write this site in notepad++, so spellchekc and local testing can only take me so far. If you find errors (spelling, formatting, or otherwise) or if something needs clarification or fixing please let me know.\nEach lab will start off with an objectives section that outlines the analyses or skills the lab aims to teach. Following this, the requisite data and question word document is attached. 
I also include a handy table of contents when appropriate.\n How to use the site Digestible steps Sub steps Lab help Teaching philosophy What I teach Beginning data science Grading After that, the tutorial starts in earnest. Major steps you take in an analysis will generally have a level 3 heading, and look like so:\nDigestible steps Substeps or other milestones are in a level 4 heading, like so:\nSub steps Writing for technical documentation can be a little awkward. If you need to click or select something, I attempt to Bold it. This includes toolbar and options clicks. If I want you to write something out explicitly I'll "typically quotation it". If you are clicking on an option or choosing settings or sub tabs I'll italicize it.\n Notes (formatted like this) typically serve as parentheticals, or image credits where appropriate.\n Questions in the word documents you have to answer are also repeated in the tutorials like so. Lab help The lab is a huge part of most of my teaching. Lectures can take you pretty far, but what you can accomplish on your own is most of what you will be hired for. I have made efforts to make these labs as consistent, organized, and followable as I can, but inevitably I will have missed something or you will encounter an error. Fortunately for you, you happen to be sitting in front of one of the most powerful tools the world has ever assembled. I refer, of course, to our overlords of Google, who make the internet searchable. Being able to effectively Google is critical to success, so if you ask me a question regarding what went wrong with your analysis, the conversation will generally play out like so:\n I found an error, what did I do wrong? What was the error? It was XYZ What did you Google? When in doubt, you can always try to execute something and then reverse engineer your way back. These are just PCs and nothing we'll do is mission critical. The worst thing you'll do is cause the computer to BSOD, and although those aren't great, they are not the end of the world and your work should already be backed up.\n It will be covered more in class, but unless explicitly stated otherwise labs will be due a week after the lab session meets. Teaching philosophy What I teach One of the foundational goals I have in teaching is to inspire and equip you to tackle your own questions. GISystems and GIScience are an accessible and relevant means of connecting spatial concepts to actionable practices and real-world and digital skills. Although software is the vehicle used to teach these concepts, it's those underlying skills and concepts I hope to transfer to you. This ability to think critically about space (critical spatial thinking) is one of the most in-demand skills you can possess and is widely considered to be a key facet of intelligence and a desirable cognitive capability. I define critical spatial thinking as follows:\n Critical spatial thinking is the ability to observe and form a query; place that query in the context of the core spatial concepts; devise an appropriate means of testing that hypothesis; execute that test accurately; communicate those results in textual, verbal, or visual form; and objectively reflect on that process.\n In teaching the acquisition of GIScience skills through GISystems, we can deploy two primary means: one can take a graphical user interface (GUI) based approach to introducing GIS, or take a more programmatic approach.
GUI GIS includes programs such as Google Maps and Earth, QGIS, and ESRI Arc products, whereas programmatic approaches to GIS include Python, R, JavaScript, and Matlab. There are tradeoffs to each, and it's worth peeling them apart.\nAnything built in a digital system is inherently less free and less flexible than what we can conceptualize internally. These limitations are not necessarily even imposed by hardware; some ideas are simply too unwieldy to implement in a digital environment. Likewise, a GUI driven implementation will always have fewer degrees of freedom than a programmatic implementation. This loss of flexibility is further compounded by the nature of a GUI driven system in general; one cannot expose too many of the underlying parameters without overwhelming the end user. Finally, although this is not limited to GUI programs, many of the more advanced GUI driven systems must be paid for, creating additional barriers to access and raising ethical and moral issues related to academia teaching a private company's software and building dependence on computational crutches.\nProgrammatic approaches, in contrast, require the ability to program, which takes quite a while to build competency in, and progress made while learning is not particularly obvious. Furthermore, it is often harder to find that initial motivation/inspiration in programming, whereas GUI driven systems provide immediate feedback and a tangible goal to visualize. I find it far easier to inspire the desire to program when individuals have a self-developed goal in mind; the learning will follow. However, a skilled GIScientist must learn to program eventually, and in that respect, a programmatic approach to learning GIS might save time in the long run. A strictly programmatic approach also foregoes the process of learning the common interface access paths of a GIS GUI, but such things are trivially straightforward to learn, and by the time one has acquired the ability to think critically, they also likely possess the capabilities for self-directed learning.\nThese differences in approach are summarized in the generalized conceptual diagram shown below. Users who start on the GUI driven track rapidly acquire skills in spatial literacy, but progress quickly plateaus while acquiring computational skills. This is due in part to the abstraction GUI driven programs provide over the computational domain and the inherent limitations of GUI platforms for designing custom tools. Therefore, these users need to spend additional time learning how to program, and in some cases unlearning poor habits that GUI driven tools can create. Eventually, users overcome this learning curve and move on to become critically spatially literate. In contrast, users who start learning GIS programmatically have a much slower rise to spatial literacy as they overcome the early hurdles associated with programming and visualization, and consequently acquire spatial literacy later than their GUI taught counterparts. However, they have none of the later learning curve, and rapidly transition from spatial literacy and using GIS as a tool to toolmaking. Consequently, they acquire critical spatial and data literacy somewhat sooner.\nOne more critical aspect left unaddressed between these two approaches to teaching GISystems is the retention rate and success (in terms of the number of students who matriculate and go on to practice GIScience).
To provide the most accessible experience and archive possible, this site and the classes within will include approaches to labs in ESRI (ArcMap and ArcPro), QGIS, Python, R, and Google Earth Engine as time and funding allow. Although I obviously hope you become a seasoned GIScience practitioner, my goal is that you leave my class with a deeper appreciation and understanding of the nuances of spatial analysis and spatial phenomena, and that you gain practical problem solving skills you can deploy in your own careers.\nBeginning data science One of the many skills I hope that you pick up as you progress through my class, and something I continually try to improve myself, is the ability to effectively manipulate and interpret data. You can't hope to do this when you have files scattered all over your desktop and hard drive with 7 file names which contain variations of "…final…". I don't have many regrets, but one of the largest is that I was not more organized digitally when I started my graduate career, and this is something you have the chance to avoid now. Use a file storage format and naming convention that 1) you will use consistently, and 2) makes sense to you. I use a variation of the following.\n I typically don't opt for a week by week folder structure; I am very unlikely to remember what week something happened even a month later, so it makes little sense in my mind to organize in that fashion. YMMV\n Grading I dislike touching on this subject as I find it a bit counterproductive to the goal behind attending university, but it seems warranted given that is likely how you've ended up on this page :) There are many things I love about teaching, but gatekeeping is not one of them. A 4 year degree is a great signpost on your resume that says you are a well rounded and capable individual with the requisite background and theoretical foundation necessary to excel in your chosen field. However, I dislike that my say so (in the form of a pass or fail grade) can act as that barrier to your perceived success or failure. Even more so, a grade is one of the last things you as a learner should be concerned about. If you find yourself chasing points, in my mind you've missed the whole reason to learn in the first place. You should be concerned about whether you understand the steps and rationale behind the material, and how well you are able to apply that understanding to new situations. Although I have no wish to contribute to grade inflation via grade leniency, grading these labs as laid out here is much like a positivistic science in that there is a right answer and a wrong answer, and I do not grade on a curve. I will of course push you to do your best and go the extra step, but many of the labs are cut and dry when it comes to assigning a points grade. If you all do well, you all get an A, and I'd rather not explore the alternative end of what that range is.\nFinally, although it should be obvious at this point in your academic careers, under no circumstances is cheating tolerated. This includes but is not limited to plagiarism in papers, using previous students' course material in quizzes and tests, and submitting other students' work as your own. Not only does this detract from the overall integrity of the department and the school (the lesser of the evils in my mind), but cheating in these classes sets you up for disappointment and misery further down the line.
You'll have failed to adequately learn the foundational concepts and advanced skills that employers are looking for, and the concepts here form the foundation for virtually all material as you move deeper into the field. In short, it is counterproductive to the very concept of attending college in the first place. I am always available through email, slack, and office hours (or by appointment) and am here to help, so don't do yourselves the disservice of cheating through what should otherwise be interesting and stimulating material.\n"},{"id":77,"href":"/classes/tags/","title":"Tags","parent":"Classes","content":""}]