<div class="csl-bib-body">
<div class="csl-entry">Lan, L., Zhang, K., Ge, H., Cheng, W., Liu, J., Rauber, A., Li, X.-L., Wang, J., & Zha, H. (2017). Low-rank decomposition meets kernel learning: A generalized Nyström method. <i>Artificial Intelligence</i>, <i>250</i>, 1–15. https://doi.org/10.1016/j.artint.2017.05.001</div>
</div>
-
dc.identifier.issn
0004-3702
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/146919
-
dc.description.abstract
Low-rank matrix decomposition and kernel learning are two useful techniques in building advanced learning systems. Low-rank decomposition can greatly reduce the computational cost of manipulating large kernel matrices. However, existing approaches are mostly unsupervised and do not incorporate side information such as class labels, making the decomposition less effective for a specific learning task. On the other hand, kernel learning techniques aim at constructing kernel matrices whose structure is well aligned with the learning target, which improves the generalization performance of kernel methods. However, most kernel learning approaches are computationally very expensive. To obtain the advantages of both techniques and address their limitations, in this paper we propose a novel kernel low-rank decomposition formulation called the generalized Nyström method. Our approach inherits the linear time and space complexity of matrix decomposition, while fully exploiting (partial) label information to compute a task-dependent decomposition. In addition, the resultant low-rank factors generalize to arbitrary new samples, providing great flexibility in inductive learning scenarios. We further extend the algorithm to a multiple kernel learning setup. Experimental results on semi-supervised classification demonstrate the usefulness of the proposed method.
en
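For context, a minimal sketch of the *classical* Nyström low-rank approximation that the abstract's method generalizes: an n×n kernel matrix K is approximated from m ≪ n randomly chosen landmark columns as K ≈ C W⁺ Cᵀ. The RBF kernel and all names (`rbf_kernel`, `nystrom_approx`, landmark sampling) are illustrative assumptions, not taken from the paper, which adds supervised, task-dependent structure on top of this unsupervised scheme.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian RBF kernel between row-sample matrices X (n x d) and Y (m x d).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def nystrom_approx(X, m, gamma=0.5, seed=0):
    # Classical (unsupervised) Nystrom: pick m landmark samples uniformly,
    # form the n x m cross-kernel C and the m x m landmark kernel W,
    # and return the rank-m approximation C @ pinv(W) @ C.T.
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)
    W = C[idx]                      # rows of C at the landmark indices
    return C @ np.linalg.pinv(W) @ C.T

# Usage: compare the approximation against the full kernel matrix.
X = np.random.default_rng(1).normal(size=(200, 5))
K = rbf_kernel(X, X)
K_hat = nystrom_approx(X, m=50)
rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

Only C and W are ever materialized, which is the source of the linear time and space complexity the abstract refers to; the full K_hat above is formed only to measure the approximation error.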
dc.language.iso
en
-
dc.publisher
ELSEVIER
-
dc.relation.ispartof
Artificial Intelligence
-
dc.subject
Artificial Intelligence
-
dc.subject
Linguistics and Language
-
dc.subject
Language and Linguistics
-
dc.subject
Kernel learning
-
dc.subject
Nyström low-rank decomposition
-
dc.subject
Large-scale learning algorithms
-
dc.subject
Multiple kernel learning
-
dc.title
Low-rank decomposition meets kernel learning: A generalized Nyström method
en
dc.type
Artikel
de
dc.type
Article
en
dc.description.startpage
1
-
dc.description.endpage
15
-
dc.type.category
Original Research Article
-
tuw.container.volume
250
-
tuw.journal.peerreviewed
true
-
tuw.peerreviewed
true
-
wb.publication.intCoWork
International Co-publication
-
tuw.researchTopic.id
I1
-
tuw.researchTopic.name
Logic and Computation
-
tuw.researchTopic.value
100
-
dcterms.isPartOf.title
Artificial Intelligence
-
tuw.publication.orgunit
E194-01 - Forschungsbereich Software Engineering
-
tuw.publisher.doi
10.1016/j.artint.2017.05.001
-
dc.identifier.eissn
1872-7921
-
dc.description.numberOfPages
15
-
wb.sci
true
-
wb.sciencebranch
Informatik
-
wb.sciencebranch.oefos
1020
-
item.languageiso639-1
en
-
item.fulltext
no Fulltext
-
item.openairetype
research article
-
item.grantfulltext
restricted
-
item.openairecristype
http://purl.org/coar/resource_type/c_2df8fbb1
-
item.cerifentitytype
Publications
-
crisitem.author.dept
E194-04 - Forschungsbereich E-Commerce
-
crisitem.author.orcid
0000-0002-9272-6225
-
crisitem.author.parentorg
E194 - Institut für Information Systems Engineering