Graph optimized locality preserving projection via heuristic optimization algorithms

dc.authorscopusid: 26665865200
dc.authorscopusid: 35105306400
dc.contributor.author: Ceylan, Oğuzhan
dc.contributor.author: Taşkin, G.
dc.date.accessioned: 2023-10-19T15:05:36Z
dc.date.available: 2023-10-19T15:05:36Z
dc.date.issued: 2019
dc.department-temp: Ceylan, O., Kadir Has University, Department of Management Information Systems, Istanbul, Turkey; Taşkin, G., Istanbul Technical University, Earthquake Engineering and Disaster Management Institute, Istanbul, Turkey
dc.description: The Institute of Electrical and Electronics Engineers, Geoscience and Remote Sensing Society (GRSS)
dc.description: 39th IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2019, 28 July 2019 through 2 August 2019
dc.description.abstract: Dimensionality reduction has been an active research topic in hyperspectral image analysis due to the complexity and nonlinearity of the hundreds of spectral bands. Locality preserving projection (LPP) is a linear extension of manifold learning and has been very effective for dimensionality reduction compared to other linear methods. However, its performance depends heavily on the construction of the graph affinity matrix, which has two parameters that need to be optimized: the k-nearest-neighbor parameter and the heat kernel parameter. These two parameters can be chosen by a simple grid search when a single representative kernel parameter is used for all features, but this is not feasible when a generalized heat kernel is used to construct the affinity matrix. In this paper, we propose to use heuristic methods, namely harmony search (HS) and particle swarm optimization (PSO), to explore the effect of the heat kernel parameters on embedding quality in terms of classification accuracy. Preliminary results of experiments on hyperspectral images show that HS performs better than PSO, and that the heat kernel with multiple parameters achieves better performance than the isotropic kernel with a single parameter. © 2019 IEEE.
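To make the approach described in the abstract concrete, the following Python sketch illustrates the general idea under stated assumptions; it is not the authors' code. It assumes a pixel matrix X (samples x bands) and labels y, builds the LPP affinity matrix from a k-nearest-neighbor graph with a generalized heat kernel (one parameter per band), and tunes those per-band parameters with a bare-bones PSO whose fitness is cross-validated k-NN classification accuracy. The function names, parameter ranges, PSO coefficients, and the choice of k-NN accuracy as the quality measure are illustrative assumptions; harmony search could replace the swarm loop in the same role.

```python
# Minimal sketch (assumptions noted above), not the published implementation.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import NearestNeighbors, KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def lpp_embedding(X, k, t, n_components=10):
    """LPP with a generalized heat kernel: W_ij = exp(-sum_d (x_id - x_jd)^2 / t_d)."""
    n = X.shape[0]
    # Symmetrized k-nearest-neighbor graph.
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nbrs.kneighbors(X)
    W = np.zeros((n, n))
    for i in range(n):
        for j in idx[i, 1:]:                      # skip the point itself
            w = np.exp(-np.sum((X[i] - X[j]) ** 2 / t))
            W[i, j] = W[j, i] = w
    D = np.diag(W.sum(axis=1))
    L = D - W                                     # graph Laplacian
    # Generalized eigenproblem X^T L X a = lambda X^T D X a; keep smallest eigenvalues.
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])   # small ridge for numerical stability
    _, vecs = eigh(A, B)
    return X @ vecs[:, :n_components]


def fitness(t, X, y, k=10):
    """Embedding quality measured as cross-validated k-NN classification accuracy."""
    Z = lpp_embedding(X, k, t)
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), Z, y, cv=3).mean()


def pso_search(X, y, n_particles=10, n_iter=20, t_lo=0.01, t_hi=10.0, seed=0):
    """Bare-bones PSO over the per-band heat kernel parameters t (illustrative settings)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.uniform(t_lo, t_hi, (n_particles, d))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p, X, y) for p in pos])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, d))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, t_lo, t_hi)
        f = np.array([fitness(p, X, y) for p in pos])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest, pbest_f.max()
```

Passing a scalar t to lpp_embedding reduces the kernel to the isotropic single-parameter case, which is the baseline the abstract compares against.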
dc.identifier.citation: 0
dc.identifier.doi: 10.1109/IGARSS.2019.8900479
dc.identifier.endpage: 3068
dc.identifier.scopus: 2-s2.0-85113877324
dc.identifier.scopusquality: N/A
dc.identifier.startpage: 3065
dc.identifier.uri: https://doi.org/10.1109/IGARSS.2019.8900479
dc.identifier.uri: https://hdl.handle.net/20.500.12469/4965
dc.identifier.volume: 2019-January
dc.identifier.wosquality: N/A
dc.khas: 20231019-Scopus
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.relation.ispartof: International Geoscience and Remote Sensing Symposium (IGARSS)
dc.relation.publicationcategory: Conference Item - International - Institutional Academic Staff
dc.rights: info:eu-repo/semantics/closedAccess
dc.subject: Dimensionality reduction
dc.subject: Harmony search
dc.subject: Manifold learning
dc.subject: Particle swarm optimization
dc.subject: Graph algorithms
dc.subject: Heuristic algorithms
dc.subject: Matrix algebra
dc.subject: Nearest neighbor search
dc.subject: Particle swarm optimization (PSO)
dc.subject: Remote sensing
dc.subject: Spectroscopy
dc.subject: Classification accuracy
dc.subject: Heuristic optimization algorithms
dc.subject: K-nearest neighbors
dc.subject: Kernel parameter
dc.subject: Linear extensions
dc.subject: Locality preserving projections
dc.subject: Multiple parameters
dc.subject: Heuristic methods
dc.title: Graph optimized locality preserving projection via heuristic optimization algorithms
dc.type: Conference Object
dspace.entity.type: Publication
relation.isAuthorOfPublication: b80c3194-906c-4e78-a54c-e3cd1effc970
relation.isAuthorOfPublication.latestForDiscovery: b80c3194-906c-4e78-a54c-e3cd1effc970

Files

Original bundle

Name: 4965.pdf
Size: 1.69 MB
Format: Adobe Portable Document Format
Description: Full Text