This week's lab assignment complemented last week's LULC classification assignment.
Using aerial imagery, I compared how I classified features in the LULC assignment to what they actually are. There was a 50% producer's error. Most of my errors came from residential areas being classified as something else: either a school, a road, or vegetation.
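Producer's error falls straight out of a confusion matrix: for a given class, it is the share of reference (ground-truth) samples that were classified as something else. A minimal sketch with made-up counts for a residential class (the numbers are invented to match a 50% producer's error, not taken from my actual table):

```python
import numpy as np

# Hypothetical confusion matrix: rows = reference (ground truth),
# columns = how those samples were classified.
classes = ["residential", "school", "road", "vegetation"]
cm = np.array([
    [5, 2, 2, 1],   # true residential: half misclassified
    [0, 8, 1, 1],   # true school
    [1, 0, 9, 0],   # true road
    [0, 1, 0, 9],   # true vegetation
])

# Producer's accuracy = correctly classified / total reference samples
# of that class; producer's error is its complement.
producer_acc = np.diag(cm) / cm.sum(axis=1)
producer_err = 1 - producer_acc

print(f"residential producer's error: {producer_err[0]:.0%}")  # 50%
```

With these counts, 5 of the 10 true residential samples land in other classes, giving the 50% producer's error described above.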
Tuesday, April 11, 2017
Monday, April 10, 2017
This week in lab, I learned how to perform a supervised classification in ERDAS. The task involved collecting sets of pixels on an imported image to define spectral signatures and adding new signatures as needed. Using these signatures, I was able to classify an entire image.
- Create spectral signatures and AOI features
- Produce classified images from satellite data
- Recognize and eliminate spectral confusion between spectral signatures
- Recode an image
- Classify, then reclassify, an image
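The core idea of supervised classification is that each training signature summarizes a class, and every pixel is assigned to the closest signature. ERDAS offers several decision rules (maximum likelihood among them); a minimal minimum-distance sketch with invented three-band signatures shows the shape of the process:

```python
import numpy as np

# Hypothetical training pixels (rows) with 3 band values each, standing
# in for signatures collected from AOI features in ERDAS.
signatures = {
    "water":  np.array([[20, 30, 10], [22, 28, 12]]),
    "forest": np.array([[40, 90, 35], [42, 88, 33]]),
    "urban":  np.array([[120, 110, 100], [118, 112, 98]]),
}

# Each class is summarized by its mean spectral signature.
means = {c: px.mean(axis=0) for c, px in signatures.items()}

def classify(pixel):
    """Assign the class whose mean signature is nearest (minimum distance)."""
    return min(means, key=lambda c: np.linalg.norm(pixel - means[c]))

print(classify(np.array([21, 29, 11])))   # water
print(classify(np.array([119, 111, 99]))) # urban
```

Spectral confusion shows up here too: if two class means sit close together in band space, nearby pixels flip between them, which is why overlapping signatures have to be merged or removed.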
Thursday, March 23, 2017
Unsupervised Image Classification
In this week's lab, we used both ERDAS and ArcMap to perform an unsupervised classification. We determined what features the image classes represented, also known as reclassification, and then simplified them into a few feature types by recoding the image.
- Perform an unsupervised classification in both ArcMap and ERDAS
- Accurately classify images of different spatial & spectral resolutions
- Manually reclassify and recode images to simplify the data
We also calculated the permeable surfaces surrounding the University of West Florida's campus based on the unsupervised classification image. We started with 50 classes of pixels and recoded them into 5: grasses, shadows, trees, mixed, and buildings/roads.
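Recoding is just a lookup table: every one of the 50 spectral classes gets mapped to one of the 5 informational classes. A sketch of that step and the permeable-surface estimate, using a synthetic image and an invented class-to-class assignment (the real assignments came from visually interpreting each class):

```python
import numpy as np

# Synthetic 50-class unsupervised output (class IDs 1-50).
rng = np.random.default_rng(0)
img = rng.integers(1, 51, size=(100, 100))

# Recode table mapping original class IDs to 5 informational classes,
# mirroring the manual recode done in ERDAS (assignments invented here).
recode = np.zeros(51, dtype=int)
recode[1:11] = 1   # grasses
recode[11:21] = 2  # shadows
recode[21:31] = 3  # trees
recode[31:41] = 4  # mixed
recode[41:51] = 5  # buildings/roads

# Fancy indexing applies the lookup table to every pixel at once.
simplified = recode[img]

# Permeable surface estimate: everything except buildings/roads.
permeable = (simplified != 5).mean() * 100
print(f"{permeable:.1f}% permeable")
```

The impermeable share is then just the remaining buildings/roads fraction.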
Monday, March 6, 2017
Thermal & Multispectral Analysis
- Create composite multispectral images in both ERDAS and ArcMap
- Adjust image symbology and band combinations in ArcMap
- Interpret thermal infrared data
COMPARING BANDS/WAVELENGTHS IN NIR USING ARCMAP
The following two images show how to view vegetation in NIR in ArcMap; NIR is the best wavelength range for viewing vegetation. The colored image shows the RGB band combination 4,2,6, and the grey-scale image shows a stretched layer of a single band (6). You can see the clear difference and why thermal imaging is important for crops. Cool fact: thermal imaging can tell whether a plant is healthy or not. High NIR reflectance and low visible reflectance mean the vegetation is healthy (YAY!); low NIR reflectance and high visible reflectance mean it is unhealthy. The image shown would indicate that the vegetation is healthy. An explanation of the image shown below with the band combination RGB 426:
Vegetation: low temp and high NIR; appears red.
Urban area: high temp and low NIR; appears blue.
Turbid river: low temp and low NIR; appears green.
CREATING A STACKED LAYER IN ERDAS: TIFF TO .IMG
First, import the images (one at a time) and convert them to .img. Then use the Layer Stack tool (located under the Raster tab in the Resolution group). Add all the images (be sure to press Add after each .img).
Here is the final image of the layer stack I created using 7 images. The image is of Pensacola Beach, FL, and displays the barrier island, inlands, bay, and Gulf of Mexico.
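Conceptually, the Layer Stack step just combines the seven single-band .img files into one multiband cube. A sketch of the same idea with synthetic arrays standing in for the imported bands (a real workflow would read the TIFFs with a raster library such as rasterio or GDAL):

```python
import numpy as np

# Seven hypothetical single-band rasters with identical dimensions,
# standing in for the individually imported .img files.
bands = [np.full((4, 4), fill_value=i * 10, dtype=np.uint8)
         for i in range(1, 8)]

# Stacking along a new first axis yields a (bands, rows, cols) cube,
# the same multiband structure a Layer Stack produces.
stack = np.stack(bands, axis=0)
print(stack.shape)  # (7, 4, 4)
```

All bands must share the same dimensions, which is why each image is imported and converted before the stack is built.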
Now add a grey-scale image and link the two images together. Which feature appears warmer, the land or the water? And where is the land warmer, near the coast or farther north?
1. Shallow water in the bays and near the coast is much cooler than the land. However, water farther out into the Gulf of Mexico appears to have a temperature similar to the land. The land is warmer toward the coast, where high populations of humans and associated infrastructure (roads, buildings, etc.) are found. Land in the northern area is dominated by vegetation and is naturally cooler.
Finally, creating a map using thermal infrared to highlight a feature.
I chose to use the map of Pensacola Beach, FL, and displayed a TIR band combination. It was almost right for identifying my feature, but I needed to change the bands just a little. I wanted to compare temperature between vegetation and human infrastructure. Using the band combination RGB 647, I finally found good visible contrast: this combination made the vegetation a vibrant, bright green while making the human infrastructure bright blue. Many of the band combinations I tried created confusion, with the water being so bright and similar in temperature. The combination I finally chose made the water look black, so that confusion was eliminated.
Wednesday, February 22, 2017
Normalized Difference Vegetation Index
This lab proved to be both difficult and time-consuming, but I did it! Once I figured out exactly what I was doing, it was a breeze. The ultimate goal of this week's lab exercise was to identify features by interpreting digital data. I looked at multispectral bands and pixels to determine different water features on a map of Olympic National Forest in Washington State (the same map we have used in previous weeks).
Skills Obtained or Practiced in this Lab
- Identify features by interpreting digital data
- Explore histograms
- Interpret histogram data
- Understand multispectral bands
- Understand layers and pixels.
- Navigate ERDAS
- Connect ERDAS and ArcMap
Below are the three maps I created in ArcMap after identifying features in ERDAS. Once the images were imported into ArcMap, I changed bands to really make the feature pop while staying close to its normal color. The RGB band order is listed within each map, along with a description of the process and the reason for choosing that specific band combination.
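The index this post is named after, NDVI, is computed from the red and near-infrared bands: NDVI = (NIR − Red) / (NIR + Red). Healthy vegetation (high NIR, low red) pushes the value toward +1, while water pushes it negative. A minimal sketch with made-up reflectance arrays:

```python
import numpy as np

# Hypothetical red and NIR reflectance bands (0-1 scale).
red = np.array([[0.10, 0.30],
                [0.05, 0.25]])
nir = np.array([[0.60, 0.35],
                [0.70, 0.20]])

# NDVI = (NIR - Red) / (NIR + Red); high values -> healthy vegetation,
# values near zero -> bare soil/urban, negative values -> water.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))
```

This is the same high-NIR/low-visible logic used to judge vegetation health in the thermal lab above, folded into a single ratio per pixel.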
Wednesday, February 15, 2017
Introduction to ERDAS 2
ERDAS Imagine & Digital Data
During this exercise, many skills were practiced. I utilized tools and functions of ERDAS Imagine, interpreted the layer info of digital data, distinguished between the four types of resolution, and interpreted & analyzed thematic rasters.
(Figure 1 below)
Figure one shows the use of ERDAS Imagine to compare different resolutions. All data was located using the Metadata tab and by analyzing the figures.
What's the difference between the resolution types?
Spatial Resolution: This resolution describes the smallest feature a sensor can detect, and it has an inverse relationship with pixel size: the larger the pixel, the lower the spatial resolution.
Radiometric Resolution: This resolution is determined by the level of contrast in the image. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy. It may be hard to determine this resolution by eye; checking the metadata for bit depth is the best indicator. The higher the bit depth, the higher the radiometric resolution.
Temporal Resolution: This resolution is determined by how frequently an image of the same area can be taken. For example, Landsat 7 passes over the same area of the planet every 16 days, so it has a temporal resolution of 16 days.
Spectral Resolution: This resolution is determined by how well an image can distinguish between different wavelengths (or bands) of EMR (electromagnetic radiation). The more bands an image has, and the narrower their bandwidths, the higher the spectral resolution.
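The bit-depth rule for radiometric resolution is easy to make concrete: an n-bit sensor records 2^n distinct brightness levels, which is why bit depth in the metadata is the best indicator. A tiny sketch:

```python
def grey_levels(bits: int) -> int:
    """Number of distinguishable brightness values for an n-bit sensor."""
    return 2 ** bits

# Common bit depths and the brightness levels they allow.
for bits in (1, 8, 16):
    print(f"{bits}-bit -> {grey_levels(bits)} levels")
# 1-bit -> 2 levels, 8-bit -> 256 levels, 16-bit -> 65536 levels
```

So doubling the bit depth squares the number of levels, which is why even small increases in bit depth mean much finer sensitivity to differences in reflected or emitted energy.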
(Figure 2 Below)
Figure two shows the starting raster image used in the project. It is an image that contains data on soil type and location. A vector shapefile was added to the viewer, showing a hydrological
(Figure 3 below)
Figure three shows the completed project. The task was completed by analyzing this thematic raster image. New fields were added to the attribute table to calculate the area and percent coverage of each category. Then a query was built to include only Fine Humus soils, resulting in the final map.
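The area and percent-coverage fields come straight from the class pixel counts in the attribute table: area is count times cell area, and percent is count over the total. A sketch with invented counts and an assumed 30 m cell size (the actual raster's cell size may differ):

```python
# Hypothetical per-class pixel counts from a thematic raster's
# attribute table; class names and counts are invented.
pixel_size_m = 30  # assumed cell size
counts = {"Fine Humus": 4000, "Coarse Sand": 2500, "Clay": 3500}

total = sum(counts.values())
for soil, n in counts.items():
    area_ha = n * pixel_size_m ** 2 / 10_000  # pixel count -> hectares
    pct = 100 * n / total
    print(f"{soil}: {area_ha:.0f} ha, {pct:.1f}%")

# The query step: keep only the Fine Humus class for the final map.
fine_humus_pct = 100 * counts["Fine Humus"] / total
```

With these made-up numbers, Fine Humus covers 360 ha, or 40% of the mapped area; the real figures come from the raster's own attribute table.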