<div class="csl-bib-body">
<div class="csl-entry">Knees, P., Schedl, M., & Goto, M. (2020). Intelligent User Interfaces for Music Discovery. <i>Transactions of the International Society for Music Information Retrieval</i>, <i>3</i>(1), 165–179. https://doi.org/10.5334/tismir.60</div>
</div>
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/141371
-
dc.description.abstract
Assisting the user in finding music is one of the original motivations that led to the establishment of Music Information Retrieval (MIR) as a research field. This encompasses classic Information-Retrieval-inspired access to music repositories, aimed at meeting the information needs of expert users. Beyond this, however, music as a cultural art form is also connected to an entertainment need of potential listeners, requiring more intuitive and engaging means for music discovery. A central aspect in this process is the user interface.
In this article, we reflect on the evolution of MIR-driven intelligent user interfaces for music browsing and discovery over the past two decades. We argue that three major developments have transformed and shaped user interfaces during this period, each connected to a phase of new listening practices. Phase 1 saw the development of content-based music retrieval interfaces built upon audio processing and content-description algorithms, facilitating the automatic organization of repositories and the discovery of music according to sound qualities. These interfaces are primarily connected to personal music collections or (still) small commercial catalogs. Phase 2 comprises interfaces incorporating collaborative and automatic semantic description of music, exploiting knowledge captured in user-generated metadata. These interfaces are connected to collective web platforms. Phase 3 is dominated by recommender systems built upon the large-scale collection of online music interaction traces. These interfaces are connected to streaming services.
We review and contextualize work from all three phases and extrapolate current developments to outline possible scenarios of music recommendation and listening interfaces of the future.
en
dc.language.iso
en
-
dc.publisher
Ubiquity Press Ltd.
-
dc.relation.ispartof
Transactions of the International Society for Music Information Retrieval
-
dc.subject
recommender systems
-
dc.subject
user interfaces
-
dc.subject
music browsing
-
dc.subject
music access
-
dc.subject
content-based MIR
-
dc.subject
community metadata
-
dc.title
Intelligent User Interfaces for Music Discovery
en
dc.type
Artikel
de
dc.type
Article
en
dc.description.startpage
165
-
dc.description.endpage
179
-
dc.type.category
Original Research Article
-
tuw.container.volume
3
-
tuw.container.issue
1
-
tuw.journal.peerreviewed
true
-
tuw.peerreviewed
true
-
wb.publication.intCoWork
International Co-publication
-
tuw.researchTopic.id
I4a
-
tuw.researchTopic.id
I5
-
tuw.researchTopic.name
Information Systems Engineering
-
tuw.researchTopic.name
Visual Computing and Human-Centered Technology
-
tuw.researchTopic.value
90
-
tuw.researchTopic.value
10
-
dcterms.isPartOf.title
Transactions of the International Society for Music Information Retrieval
-
tuw.publication.orgunit
E194-01 - Forschungsbereich Software Engineering
-
tuw.publisher.doi
10.5334/tismir.60
-
dc.identifier.eissn
2514-3298
-
dc.description.numberOfPages
15
-
tuw.author.orcid
0000-0003-3906-1292
-
wb.sciencebranch
Informatik
-
wb.sciencebranch.oefos
1020
-
wb.facultyfocus
Information Systems Engineering (ISE)
de
wb.facultyfocus
Information Systems Engineering (ISE)
en
wb.facultyfocus.faculty
E180
-
item.openairecristype
http://purl.org/coar/resource_type/c_2df8fbb1
-
item.languageiso639-1
en
-
item.grantfulltext
none
-
item.fulltext
no Fulltext
-
item.openairetype
research article
-
item.cerifentitytype
Publications
-
crisitem.author.dept
E194-04 - Forschungsbereich Data Science
-
crisitem.author.dept
E185 - Institut für Computersprachen
-
crisitem.author.orcid
0000-0003-3906-1292
-
crisitem.author.parentorg
E194 - Institut für Information Systems Engineering