
ISMIR 2015 Unconference


Big data for musicology

  • Intro by Tillman
    • cannot apply close reading methods to hundreds of thousands of tracks; new methods are needed.
    • Challenge for technology: it should still be useful.
    • Musicological culture is very different from engineering
  • Discussion of challenges and opportunities from musicological viewpoints
  • Alan M:
    • An important issue in the slow uptake has to do with confidence in the results. Musicologists are trained to be skeptical (like all scholars): they know how to challenge results coming from other scholars, but do not know how to challenge machine results. What to do in response to them?
  • FW: transparency of software
  • Chris: what about standard evaluation methods from engineering?
  • AM: important to see how the result is attained, as for example in Sonic Visualiser


Big data for musicology 2

  • Berit J: interesting to be able to bundle many things in one interface
  • Jan v B: but SV is not suitable for big data
  • Johanna D: the way in which the information is encoded also helps for transparency
  • Berit: SV does not do so; Johanna: combine it with the score
  • Tillman: how to scale up the information conveyed in scores
  • Kevin P.: the score keeps precision but offers no reduction; big data is reduction and must be interpreted
  • ….
  • Tillman: keeping the musicologists closer to their comfort zone
  • ???: in Indian musicology, the most common pattern may not be the most interesting; granularity also matters


Big data for musicology 3

  • How do qualitative concepts translate to quantitative features?
  • Kevin: is a bit worried about this way of matching qualitative and quantitative properties of music
  • How necessary is a model?
  • Rafael Caro: musicologists do have models, but they are based on very deep expertise and thus may have a rather different nature
  • Kevin: the issue is thus: is there equivalence between those models? Is it the aim to find this match, or is there something more fuzzy between them?
  • Christoph Weiss: likes the interface, but the fit to the question should be made at a very early stage
  • Xavier Serra: cannot detach tool from the research questions. Research question comes first


Big data for musicology 4

  • XS: absolutely important that we do musicologically relevant work
  • Should we all become computational musicologists? In fact, a lot of us already are.
  • Tillman: some of us do develop tools for others
  • Johanna D: take into account some of the findings from FW’s survey about the values and technical skills of musicologists
  • Kevin: process of continuous adaptation of tools.
  • Tillman: requires continuous collaboration, but it’s not widely acknowledged as useful
  • Is our work presented at musicological conferences? Rafael's work was shown at musicological conferences. IAML-IMS is a very good case. Some people have found that they got better feedback from the musicologists


Big data for musicology 5

  • Others have had different experiences. Maybe they do not have the critical vocabulary that is needed for discussing the methodological contribution, so we should actually help them with that.
  • This brings us back full circle to the reliability of quantitative data
  • What can we then do to convince them…
  • Xavier: address relevant problems
  • Richard Lewis: have a session devoted to musicology at next ISMIR
  • Also demonstrate the difference between problems that are more suited to big data and those more suited to close reading
  • Johanna: why are some subcommunities more receptive? E.g. early music


Big data for musicology 6

  • Richard: because it’s source-based
  • Rafael: look at the debates
  • Frans: early music is a very big opportunity for musicology
  • Tim: performance studies is another, on the verge of breakthroughs. Bring in the performers, e.g. (but not exclusively) in early music
  • In India there is a big gap between musicology and performance. Musicology is very prescriptive in India (!?!?!)

(Indian issues mainly from Kaustuv Kanti Ganguli)


Transforming musicology: the afterparty (1)

  • abstract

This Unconference session is a complement to Tutorial 2, Addressing the music information needs of musicologists, presented by the researchers from the Transforming Musicology project on the first day of this ISMIR. The main message of the tutorial is that “contemporary musicology [is] a rich source of new and exciting challenges for MIR”. Specifically, the tutorial discusses issues such as:

  • the music information needs of music researchers
  • new musicologically motivated research questions suitable for MIR
  • new MIREX tasks that focus on musicological challenges

In this session we will review new work presented at this ISMIR in the light of these issues. First, two members of the T-Mus team, Kevin Page and Tim Crawford, will briefly report on what they consider the musicological highlights of this ISMIR. We will then have a general discussion about these, and any other insights that the attendees may want to present.


Transforming musicology: the afterparty (2)

  • many attended the tutorial
  • Tim: how well has this conference responded to the ideas raised in the tutorial?
    • Tim’s impression is historical: ISMIR was born from the OMRAS project, searching for music mostly in the symbolic domain. In 2000 the signal processing people suddenly amazed them by showing what would be possible in the audio domain.


Transforming musicology: the afterparty (3)

Libraries and musicologists became a minority and for a while they seemed to be marginalised. That tendency has been reversed, possibly (Emmanouil) because the baseline problems in DSP have been solved and musically more interesting questions are now being approached.

Focusing on this year's work: a lot of the work that the authors don't consider to be musicology is, in his view, a contribution to technology. Musicology is transformed by becoming wider.

Kevin Page: three important things:

  • glue that brings different types of information together
  • some software that brings things really close to being musicologically useful
  • hook analysis (Jan Van Balen et al.'s paper) is proper new musicological work

Tim: perspectives from other attendees on the tutorial?

Anja: in Brazil, engineers suddenly found musical understanding interesting. Is that part of this opening up?


Transforming musicology: the afterparty (4)

Tim: democratization of musicology. Of course there is a danger that the elite won't like it, but on the whole very positive. Web-based musicology

The tutorial presented a specific picture of musicology, but it is of course wider: anything that tells you about music.

Roger D.: there was a sort of middle period of the conference where it became more and more engineering. It was discovered that there were some stupid ideas, like playlist generation and genre classification. Looking at music understanding has become a major challenge.

Tim: also extension to non-standard music repertoires, largely through Xavier S's work. This also sheds a new light on musicology as a whole. The viewpoint that Western Music is special is more and more challenged by this, and that is good.

Folk song research is also a bit different from ethnomusicology. Tim: e.g. transcription. However, there is a fear that transcription is part of an empiricist's agenda rather than an anthropological one (Johanna Devaney).

Rafael: music as culture. Transcription has a specific goal


Transforming musicology: the afterparty (5)

Frans: contextualisation is also important; talk to people about how that could be done.

Tim: provide infrastructures that can be used in a new way.

Richard Lewis: Subject matter is wider than musical materials

Johanna: outcomes of big data may support close reading

Tim: this requires understanding of the techniques, so an educational change is needed.

Anja: question about context: interesting to investigate how far you can get with just the content.

Roger D.: lack of models of how music works, of how connections get formed at higher levels.

Tim: Music psychologists are not well represented at ISMIR.

Neither are organologists (Emmanouil).

Kaustuv Ganguli: musicology does not deal with the small embellishments etc.; here psychology is quite important.


WiMIR mentoring programme (page 1)

  • Agree on a goal during their first meeting: mentee + mentor agreement
  • 6-month programme leading up to the next ISMIR conference
  • Not a requirement to be present at the conference
  • For the first meeting: have a list of suggestions of what the goal of the mentoring programme for each pair of mentee-mentor could be
  • There should be a safe place (chat room?) to ask novice questions to the entire batch of mentors and mentees
  • Mentees: for the pilot programme → women only (we'll open it to men after the pilot)
  • Mentors: everyone (but mentees can express preferences)
  • Reuse external guidelines if we find some, otherwise try the pilot without any


WiMIR mentoring programme (page 2)

  • 1 (online) meeting per month → 1 hour (plus variable follow-up time by email)
  • Time zone / general availability could be part of the choice criteria
  • A third party or an algorithm would make the matching (see the sketch after this list)
  • A survey of how mentees feel about specific points → before and after the programme (to evaluate success), plus questions about how many meetings they actually had, the amount of interaction, etc.
  • Mentor training programme at ISMIR2016 (over lunch?)
  • First Google Hangout with everyone to get people started and excited
  • "Interview your mentor" activity: not compulsory but a suggestion
  • Have suggested themes for each meeting
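
A minimal sketch (in Python) of what such a matching algorithm could look like, assuming only time zones and mentee preferences are used as criteria; the Mentor/Mentee records, scoring weights and greedy assignment below are illustrative assumptions, not a decided procedure:

    # Hypothetical matcher: pairs mentees with mentors using the criteria
    # noted above (time zone / availability, mentee preferences).
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class Mentor:
        name: str
        utc_offset: int  # rough proxy for general availability

    @dataclass
    class Mentee:
        name: str
        utc_offset: int
        preferences: List[str] = field(default_factory=list)  # preferred mentor names

    def score(mentee: Mentee, mentor: Mentor) -> int:
        """Higher is better: honour explicit preferences, then time-zone proximity."""
        s = 10 if mentor.name in mentee.preferences else 0
        return s - abs(mentee.utc_offset - mentor.utc_offset)

    def match(mentees: List[Mentee], mentors: List[Mentor]) -> List[Tuple[str, str]]:
        """Greedy one-to-one matching; each mentor is assigned at most once."""
        free, pairs = list(mentors), []
        for mentee in mentees:
            if not free:
                break
            best = max(free, key=lambda m: score(mentee, m))
            pairs.append((mentee.name, best.name))
            free.remove(best)
        return pairs

    if __name__ == "__main__":
        mentors = [Mentor("A", utc_offset=1), Mentor("B", utc_offset=-5)]
        mentees = [Mentee("X", utc_offset=0, preferences=["B"]),
                   Mentee("Y", utc_offset=-6)]
        print(match(mentees, mentors))  # e.g. [('X', 'B'), ('Y', 'A')]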


Open Reviewing

Opaque System

  • The only way we can fix the problem at the group level is to open more of the guts to the group
  • How “we as a community are making judgments”
  • Only the scientific chair has the necessary perspective to fuse everyone’s reviews
  • Lack of institutional knowledge
  • “Bad reviewers can’t get better without feedback”

Parameters


Reviewer ID

  • Truly Anonymous
  • Hashed names (a sketch follows below)
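
A minimal sketch of what the "hashed names" option could mean in practice, assuming a per-conference secret salt (hypothetical here): an HMAC gives each reviewer a stable pseudonymous ID, so reviews could be published and cross-referenced without revealing who wrote them.

    # Hypothetical pseudonymisation of reviewer names; the salt would stay with
    # the programme chairs and never be published.
    import hashlib
    import hmac

    def reviewer_id(name: str, salt: bytes) -> str:
        """Same name + same salt always yields the same short pseudonymous ID."""
        digest = hmac.new(salt, name.encode("utf-8"), hashlib.sha256).hexdigest()
        return "rev-" + digest[:8]

    if __name__ == "__main__":
        salt = b"conference-secret-salt"  # placeholder value
        print(reviewer_id("Jane Doe", salt))  # e.g. rev-1a2b3c4d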

Author ticks public / non-public

Min word count

Reviewer authored abstract

Embrace positive points of the paper

Bibtex entries

Review / rebuttal

A couple of conferences have started changing their models, publishing reviews with papers.

Same way double-blind reviewing came up

Not going to happen for 2016, but worth discussing

Why

  • Increase transparency in the paper selection
  • Improve quality of reviews
  • helpful info for those who want to publish

Advantages

  • Transparency and accountability
  • Criteria for selection
  • More information is better; deeper insight into the selection process
  • Good examples to point to
  • Reviewers might be more polite
  • Cross-iteration comparison (cultural memory)

Disadvantages

  • Ethical considerations, e.g. reviews for rejected papers

Parameters

  • Optional / compulsory (would it affect submission rates?)
  • Magnitude (how much)
  • Rebuttal


Reproducible Software

Explicit Actions and Agenda

Tag submissions that are reproducible (have code, have data, have both)

Chair / overseer

Reproducibility award

Look into AES with QMUL

Identify folks in the community to act as leaders

Discussion list

Papers are insufficient to reproduce science

Reproducible research = data + software
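
A minimal sketch of the "data + software" point in practice: fix the random seed, checksum the input data, and record the software environment next to the results so a rerun can be checked against the original. The file paths below are hypothetical placeholders.

    # Illustrative provenance record for an experiment run.
    import hashlib
    import json
    import os
    import platform
    import random
    import sys

    SEED = 2015
    random.seed(SEED)  # fixed seed so reruns of the experiment give identical results

    def sha256(path):
        """Checksum the input data so readers can verify they use the same file."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def record_provenance(data_path, out_path):
        """Write the environment and data fingerprint alongside the results."""
        provenance = {
            "python": sys.version,
            "platform": platform.platform(),
            "seed": SEED,
            "data_sha256": sha256(data_path),
        }
        os.makedirs(os.path.dirname(out_path), exist_ok=True)
        with open(out_path, "w") as f:
            json.dump(provenance, f, indent=2)

    if __name__ == "__main__":
        record_provenance("data/train.csv", "results/provenance.json")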

Who has done it? No one!

PeerJ has these requirements

Allow for submissions

Would reviews do it?

Adapt guidelines -> Reviewing Guidelines

Wanting reproducibility versus achieving it

How to develop / encourage / support engineering skill acquisition and practice

Workshops / how useful are they really?

Values lead to action (educate the community)

Use society's resources to educate

Community guidelines (this is how we do it)

Code review at the community level

Recognize and value good code / practices

Tests, continuous integration, etc

Tech debt

Identify leaders in the community

Collaboration as a means to skill acquisition

DOIs for software

Communication channels

#mir IRC on freenode

discuss@ismir.net
