Spatially Enhanced Spectral Unmixing Through Data Fusion of Spectral and Visible Images from Different Sensors

dc.contributorHáskóli Íslandsen_US
dc.contributorUniversity of Icelanden_US
dc.contributor.authorKizel, Fadi
dc.contributor.authorBenediktsson, Jon Atli
dc.contributor.departmentRafmagns- og tölvuverkfræðideild (HÍ)en_US
dc.contributor.departmentFaculty of Electrical and Computer Engineering (UI)en_US
dc.contributor.schoolVerkfræði- og náttúruvísindasvið (HÍ)en_US
dc.contributor.schoolSchool of Engineering and Natural Sciences (UI)en_US
dc.date.accessioned2021-01-15T13:43:58Z
dc.date.available2021-01-15T13:43:58Z
dc.date.issued2020-04-16
dc.descriptionPublisher's version (published article)en_US
dc.description.abstractWe propose an unmixing framework for enhancing endmember fraction maps using a combination of spectral and visible images. The new method, data fusion through spatial information-aided learning (DFuSIAL), is based on a learning process for the fusion of a multispectral image of low spatial resolution and a visible RGB image of high spatial resolution. Unlike commonly used methods, DFuSIAL allows for fusing data from different sensors. To achieve this objective, we apply a learning process using automatically extracted invariant points, which are assumed to have the same land cover type in both images. First, we estimate the fraction maps of a set of endmembers for the spectral image. Then, we train a spatial-features aided neural network (SFFAN) to learn the relationship between the fractions, the visible bands, and rotation-invariant spatial features for learning (RISFLs) that we extract from the RGB image. Our experiments show that the proposed DFuSIAL method obtains fraction maps with significantly enhanced spatial resolution and an average mean absolute error between 2% and 4% compared to the reference ground truth. Furthermore, it is shown that the proposed method is preferable to other examined state-of-the-art methods, especially when data is obtained from different instruments and in cases with missing-data pixels.en_US
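The abstract describes a three-step pipeline: unmix the low-resolution spectral image into endmember fraction maps, extract rotation-invariant spatial features from the high-resolution RGB image, and train a neural network on automatically selected invariant points to predict fractions at the RGB resolution. The following is a minimal, illustrative Python sketch of such a pipeline, not the authors' implementation: it substitutes non-negative least squares unmixing for the paper's unmixing step, simple local statistics (mean, standard deviation, gradient magnitude) for the RISFL features, a generic scikit-learn MLP for the SFFAN, synthetic data for real imagery, and low-resolution pixel centres for the automatically extracted invariant points. All names and values in it are hypothetical.

# Illustrative DFuSIAL-style sketch (assumptions stated above; not the authors' code).
import numpy as np
from scipy.optimize import nnls
from scipy.ndimage import uniform_filter, sobel
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic low-resolution multispectral image and endmember library.
n_bands, n_endmembers = 6, 3
h_lo, w_lo = 20, 20
endmembers = rng.uniform(0.1, 1.0, size=(n_bands, n_endmembers))
true_fractions = rng.dirichlet(np.ones(n_endmembers), size=(h_lo, w_lo))
ms_image = true_fractions @ endmembers.T  # shape (h_lo, w_lo, n_bands)

# Step 1: estimate per-pixel fraction maps on the spectral image.
def unmix(pixel, E):
    """Non-negative least squares unmixing, renormalised to sum to one."""
    f, _ = nnls(E, pixel)
    s = f.sum()
    return f / s if s > 0 else f

fractions_lo = np.apply_along_axis(unmix, 2, ms_image, endmembers)

# Synthetic high-resolution RGB image (here simply upsampled abundances plus noise).
scale = 4
fractions_hi = np.kron(true_fractions, np.ones((scale, scale, 1)))
rgb = fractions_hi[..., :3] + 0.01 * rng.standard_normal(fractions_hi[..., :3].shape)

# Step 2: rotation-invariant spatial features from the RGB image
# (local mean, local standard deviation, gradient magnitude per band).
def spatial_features(img, size=5):
    feats = []
    for b in range(img.shape[2]):
        band = img[..., b]
        mean = uniform_filter(band, size)
        sq_mean = uniform_filter(band ** 2, size)
        std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0))
        grad = np.hypot(sobel(band, axis=0), sobel(band, axis=1))
        feats.extend([mean, std, grad])
    return np.stack(feats, axis=-1)

feats_hi = spatial_features(rgb)

# Step 3: train on surrogate "invariant points" (low-resolution pixel centres)
# and predict fraction maps at the RGB resolution.
rows = np.arange(h_lo) * scale + scale // 2
cols = np.arange(w_lo) * scale + scale // 2
rr, cc = np.meshgrid(rows, cols, indexing="ij")
n_feat = rgb.shape[2] + feats_hi.shape[2]
X_train = np.concatenate([rgb[rr, cc], feats_hi[rr, cc]], axis=-1).reshape(-1, n_feat)
y_train = fractions_lo.reshape(-1, n_endmembers)

mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)

X_all = np.concatenate([rgb, feats_hi], axis=-1).reshape(-1, n_feat)
fractions_pred = mlp.predict(X_all).reshape(rgb.shape[0], rgb.shape[1], n_endmembers)
print(f"MAE vs. synthetic reference: {np.mean(np.abs(fractions_pred - fractions_hi)):.3f}")

In the published method, the invariant points are extracted automatically from the two images and the network architecture is the paper's SFFAN; the sketch only mirrors the overall data flow under the stated assumptions.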
dc.description.sponsorshipThis research was partially funded by the Icelandic Research Fund through the EMMIRS project, and by the Israel Science Ministry and Space Agency through the Venus project.en_US
dc.description.versionPeer Revieweden_US
dc.format.extent1255en_US
dc.identifier.citationKizel F, Benediktsson JA. Spatially Enhanced Spectral Unmixing Through Data Fusion of Spectral and Visible Images from Different Sensors. Remote Sensing. 2020; 12(8):1255.en_US
dc.identifier.doi10.3390/RS12081255
dc.identifier.issn2072-4292
dc.identifier.journalRemote Sensingen_US
dc.identifier.urihttps://hdl.handle.net/20.500.11815/2384
dc.language.isoenen_US
dc.publisherMDPI AGen_US
dc.relation.ispartofseriesRemote Sensing;12(8)
dc.relation.urlhttps://www.mdpi.com/2072-4292/12/8/1255/pdfen_US
dc.rightsinfo:eu-repo/semantics/openAccessen_US
dc.subjectData fusionen_US
dc.subjectMultispectral imagesen_US
dc.subjectRemote sensingen_US
dc.subjectSpatial informationen_US
dc.subjectSpatial resolutionen_US
dc.subjectSpectral unmixingen_US
dc.subjectFjarkönnunen_US
dc.subjectMyndgreining (upplýsingatækni)en_US
dc.titleSpatially Enhanced Spectral Unmixing Through Data Fusion of Spectral and Visible Images from Different Sensorsen_US
dc.typeinfo:eu-repo/semantics/articleen_US
dcterms.licenseThis is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly citeden_US

Files

Original bundle

Name: Kizel-2020-Spatially-enhanced-spectral-unmixin.pdf
Size: 11.72 MB
Format: Adobe Portable Document Format
Description: Publisher's version
