Music Notation Community Group

The Music Notation Community Group develops and maintains format and language specifications for notated music used by web, desktop, and mobile applications. The group aims to serve a broad range of users engaging in music-related activities involving notation, and will document these use cases.

The Community Group documents, maintains and updates the MusicXML and SMuFL (Standard Music Font Layout) specifications. The goals are to evolve the specifications to handle a broader set of use cases and technologies, including use of music notation on the web, while maximizing the existing investment in implementations of the current MusicXML and SMuFL specifications.

The group is developing a new specification to embody this broader set of use cases and technologies, under the working title of MNX. The group is proposing the development of an additional new specification to provide a standard, machine-readable source of musical instrument data.

Note: Community Groups are proposed and run by the community. Although W3C hosts these conversations, the groups do not necessarily represent the views of the W3C Membership or staff.

Final reports / licensing info

  • MusicXML Version 3.1 (licensing commitments)
  • SMuFL 1.3 (licensing commitments)
  • SMuFL 1.4 (licensing commitments)
  • MusicXML 4.0 (licensing commitments)

SMuFL 1.3 Draft Community Report Published

We are pleased to announce that a draft Community Report for SMuFL 1.3 has now been published, and we invite the Community Group to review the draft report so that we can move from draft status to a Final Community Report as soon as possible.

SMuFL 1.3 is the first release since the interim SMuFL 1.2 release in April 2016, which did not reach Community Report status. The main changes to SMuFL since the inception of the Community Group amount to the addition of some new ranges of characters – chief among them German organ tablature, Kahnotation dance notation, and supplemental ranges for clefs, chord symbols, octave lines, and time signatures – and the clarification of some aspects of the font metadata file specification.
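
For readers less familiar with the font metadata file, the sketch below shows how an application might read a few engraving defaults and a glyph bounding box from a SMuFL-style metadata file, using Python. The key names follow the structure of the published SMuFL metadata; the file name and the printed values are purely illustrative.

    import json

    # Load a SMuFL font metadata file (the file name is illustrative; fonts such
    # as Bravura ship a metadata JSON file alongside the font itself).
    with open("bravura_metadata.json", encoding="utf-8") as f:
        metadata = json.load(f)

    # Engraving defaults are expressed in staff spaces.
    defaults = metadata["engravingDefaults"]
    print("Staff line thickness:", defaults["staffLineThickness"])
    print("Stem thickness:", defaults["stemThickness"])

    # Glyph bounding boxes are keyed by canonical SMuFL glyph name.
    bbox = metadata["glyphBBoxes"]["gClef"]
    print("gClef bounding box NE/SW:", bbox["bBoxNE"], bbox["bBoxSW"])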

You can view all of the issues that have been addressed in the SMuFL 1.3 Milestone on GitHub, and read a detailed list of changes in the Version History page in the specification itself.

Depending on the nature of the feedback received, we intend to advance from draft status to publishing a Final Community Report in around two weeks, aiming to publish in the week beginning 18 February 2019.

Please read the draft Community Report, and if you have any feedback, please raise an issue in GitHub.

Meeting at NAMM Show 2019

Photo of Anaheim Convention Center during NAMM

The W3C Music Notation Community Group will once again have a meeting and dinner at the NAMM Show in Anaheim, California. Both events are scheduled for Friday, January 25. The meeting will be from 3:00 pm to 5:00 pm in Room 201C of the Anaheim Convention Center. This meeting room is on the second level of the main building, between Peavey and Mackie. Dinner will be at 7:30 pm at Thai Nakorn at 12532 W. Garden Grove Blvd. in Garden Grove.

Because this year’s meeting is in the Anaheim Convention Center, you will need a NAMM badge in order to attend. Please let us know if you still need a NAMM badge. We are grateful to the staff of the NAMM Show for their help with providing both meeting space and badges.

To help us prepare for the meeting, please fill out this sign-up form if you plan to attend the meeting, the dinner, or both:

https://goo.gl/forms/Sr27X0frty7kAes92

We will be discussing the current status and future plans for MNX, SMuFL, and MusicXML during this meeting. We look forward to seeing many of you there.

Co-Chair Announcement

We are pleased to announce the appointment of Adrian Holovaty, CEO and founder of Soundslice, as co-chair of the W3C Music Notation Community Group, joining existing co-chairs Michael Good of MakeMusic and Daniel Spreadbury of Steinberg. Joe Berkovitz, founder and CEO of Noteflight and more recently of Risible LLC, is stepping down as co-chair.

We would like to express our deepest thanks to Joe for his contributions. Joe was instrumental in founding the Community Group and bringing MakeMusic and Steinberg to the W3C, showing them the benefits of developing MusicXML and SMuFL here. He has also been the driving force in the development of the MNX spec to this point.

Since leaving Noteflight at the end of 2017, Joe has wanted to devote more of his time to his artistic pursuits and family matters. After he indicated to the other co-chairs that he planned to step down, Michael and Daniel discussed how best to keep the group’s efforts moving forward.

We are delighted that Adrian has agreed to help us drive MNX forward from here. Adrian has been an active member of the Community Group since its inception. He is not only the CEO of Soundslice but is also the co-creator of the Django Python framework. Adrian has invaluable experience in developing web-based music notation software and in shepherding large-scale projects. We are very fortunate to be able to call upon his expertise.

Although we believe most everybody in the community is already familiar with Adrian, we asked him to write a few words by way of introduction:

Hi everybody! Adrian here. I’m excited to play a bigger part in this community and help improve the lives of developers of music technologies around the world — and, most importantly, the musicians who use these technologies.

Professionally, I’ve been a full-time web developer since 2002. My largest contribution so far has been co-creating the open-source Django Web framework, used by many developers these days. I implemented much of the framework’s original code and helped build a thriving community of contributors.

Since 2012, I’ve been working full-time on Soundslice, a website that helps people learn and practice music. It has its own notation rendering engine and imports/exports various notation formats, so for better or worse I’ve acquired a deep technical knowledge of Western musical notation and tablature.

I live in Amsterdam and gig a few times a month with various gypsy-jazz bands. I also post videos of guitar performances to YouTube at youtube.com/adrianholovaty.

We can now turn our attention to our plans for 2019. In the immediate term, we will publish the final community report for SMuFL 1.3 (the change in version number from 1.2 to 1.3 is merely symbolic, and no significant new development work beyond what has been done to date will be undertaken). We are also in the process of organising community meetings at the NAMM Show in Anaheim, California at the end of January and at Musikmesse in Frankfurt at the beginning of April.

Our goal for 2019 is to deliver a draft 1.0 version of the MNX-Common and MNX-Generic specifications by the end of the year. We believe that we have a solid foundation in terms of the basic musical structure for CWMN, and we will next turn our attention to some specific representation challenges before we then attempt to fit other MusicXML elements for specific notations into the framework. These specific representation challenges include:

  • How pitch should be encoded, and whether MNX-Common documents should always be in written pitch or sounding pitch (a brief illustration follows this list).
  • How, or to what extent, MNX-Common documents should encode multiple presentations of the same musical material, e.g. the full score versus instrumental parts, and the differences between them, such as page and system breaks, differences in enharmonic spelling, information that should appear in only one or the other presentation, and so on.
  • What the role of profiles will be in terms of specifying which aspects of MNX-Common will be supported by different types of applications, and how to manage user and developer expectations around these differences.
  • How layout and performance data should be represented in MNX-Common, and how they should interact with the semantic data.
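
As a concrete illustration of the written versus sounding pitch question: a clarinet in B-flat sounds a major second lower than written, so a written C4 sounds as B-flat 3. The sketch below converts between the two using MIDI note numbers; note that a semitone offset alone cannot recover the enharmonic spelling, which is part of what makes the encoding question non-trivial. The function names are only illustrative.

    def written_to_sounding(written_midi: int, transposition_semitones: int) -> int:
        """Convert a written MIDI note number to the sounding pitch.

        transposition_semitones is the signed distance from written to sounding
        pitch, e.g. -2 for a clarinet in B-flat.
        """
        return written_midi + transposition_semitones

    def sounding_to_written(sounding_midi: int, transposition_semitones: int) -> int:
        """Inverse conversion, from sounding pitch back to written pitch."""
        return sounding_midi - transposition_semitones

    # Written C4 (MIDI 60) on a B-flat clarinet sounds as B-flat 3 (MIDI 58).
    assert written_to_sounding(60, -2) == 58
    assert sounding_to_written(58, -2) == 60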

Our medium-term goal is to present proposed solutions to these fundamental issues at our Community Group meeting at Musikmesse in April.

We welcome feedback from members of the Community Group about these changes and our plans for 2019. We look forward to hearing from you.

Daniel, Michael and Adrian

Musikmesse 2018 Meeting Minutes

The W3C Music Notation Community Group met in the Symmetrie 3 room (Hall 8.1) at Messe Frankfurt during the 2018 Musikmesse trade show, on Thursday 12 April 2018 between 2:30 pm and 4:30 pm.

CG co-chairs Joe Berkovitz, Daniel Spreadbury, and Michael Good chaired the meeting, and over 30 members of the CG and interested guests attended. A complete list of the attendees is included at the end of this report. The presentations from the meeting are posted at

W3C MNCG Musikmesse 2018 Presentations

Peter Jonas from MuseScore recorded the meeting and has posted the video on YouTube. The video starting times for each part of the meeting are included in the headings below.

The noise in the background is from the Musikmesse. The audio improves significantly after 5:20 when the microphone is moved closer to the speakers.

MuseScore Sponsor Introduction (1:27)

MuseScore sponsored this year’s meeting reception. Daniel Ray described how MuseScore was recently acquired by Ultimate Guitar. MuseScore will remain open source and free, but more resources are now available. MuseScore also wants to get more involved in standards activities and the community group.

Review Working Process to Date (2:40)

Daniel Spreadbury led a discussion of the current community group working process. Most of the discussion involved questions about using GitHub. Daniel Ray suggested that it would be good to share links for how to use GitHub. Joe Berkovitz suggested we should also do the same for Bikeshed, the tool that is used to maintain the MNX specification.

We also discussed the possibility of meeting more often, either online or face-to-face. One way to do this might be to have meetings with specific themes. Daniel Ray suggested that in addition to themes based on portions of the spec, we could have themes based on specific use cases. This could tie in with other events – for instance, a meeting focused on music education and performing ensemble use cases at The Midwest Clinic. A conference like Midwest could get participation from composers, teachers, and performers in a way that the NAMM and Musikmesse conferences do not.

We also had a brief discussion of issues to suggest for active review in the near future. Alexander Plötz suggested addressing issues of transpositions and scores in concert or notated pitch. Daniel Ray asked about a timeline for the specification. Our goal is to have a draft version of the specification in early 2019, which is aggressive.

MNX-Generic (32:39)

Joe Berkovitz led a discussion and demonstration of MNX-Generic, a format for generic music notation that coordinates SVG graphics with one or more performances. The demonstration begins at 39:22 in the video.

Discussion centered around synchronization and linking. These topics included synchronizing video as well as audio; synchronizing audio in music with repeat structures; and linking from MNX-Generic to a semantic format like MNX-Common.

Towards an MNX-Common Layout Model (1:14:39)

Joe Berkovitz led a discussion about whether it is possible or practical to standardize the layout of common Western music notation in a way that provides for better exchange between applications. This was one of the major requests to come from publishers during our NAMM meeting last January.

The presentation introduced some layout terminology and outlined five different levels at which we might standardize layout:

  1. The wild west
  2. Explicit positioning
  3. Explicit space requirements
  4. Algorithmic space requirements
  5. Algorithmic layout

The main point for discussion, besides clarifications of what was being proposed, was the reaction of group members to these proposals. Members of the group expressed several concerns.

Christof Schardt said “I think this is totally crazy!” because different applications have such different approaches to layout that bridging between individual applications and a standard layout algorithm would be very difficult. He suggested we could get a huge gain simply by making things more rigorous at the current level 2 (explicit positioning) and combining these improvements with MNX-Common’s improved musical data structures.

Reinhold Hoffmann mentioned that the publisher’s use case, where greater portability between applications is so important, is very different from the use case for his company’s everyday musician customers. He felt it essential to have this available as an option for applications, not a requirement.

Adrian Holovaty mentioned the differences between capturing tweaks that an engraver has made, and the end results of a particular notation program’s algorithm. The former could be useful for his application when doing reflow, but the latter is something his application really doesn’t care about. Distinguishing these can be tricky though, especially for applications that might have sub-par rendering by default and require more manual adjustments than other applications.

Martin Marris mentioned that the tighter the spacing gets, the more that rules are violated and the more that defaults are overridden. When working for Henle and other classical publishers that prefer tight spacing, Martin is overriding the application’s defaults all the time. How can this type of constant adjustment be better preserved across applications?

Peter Jonas also asked about how to capture the differences between changes that are made for semantic vs aesthetic reasons.

Joe Berkovitz also related the history of CSS styling, which started with a smaller subset of styling features and grew more comprehensive over time. We do not need to take an all-or-nothing approach, but can similarly start small and improve the layout specification over time.

Attendees

  • Dominique Vandenneucker, Arpege / MakeMusic
  • James Sutton, Dolphin Computing
  • Cyril Coutelier, Flat.io
  • Bob Hamblok, self
  • Christian Pörksen, hamburgmusicnotation.com
  • James Ingram, self
  • Robert Piéchaud, IRCAM
  • Mogens Lundholm, self
  • Michael Good, MakeMusic
  • Thomas Bonte, MuseScore
  • Daniel Ray, MuseScore
  • Peter Jonas, MuseScore / OpenScore
  • Eugeny Naidenov, MuseScore / Ultimate Guitar
  • Mikhail Trutnev, MuseScore / Ultimate Guitar
  • Paul Leverger, Newzik
  • Raphaël Schumann, Newzik
  • Reinhold Hoffmann, Notation Software
  • Martin Marris, Notecraft
  • Hiroyuki Koike, Piascore
  • Alexander Plötz, self
  • Christof Schardt, PriMus
  • Joe Berkovitz, Risible
  • Jan Rosseel, Scora
  • Dietmar Schneider, self
  • Dominik Svoboda, self
  • Martin Beinicke, SoundNotation
  • Simone Erli, SoundNotation
  • Adrian Holovaty, Soundslice
  • Frank Heckel, Steinberg
  • Daniel Spreadbury, Steinberg
  • Jonathan Kehl, Ultimate Guitar / MuseScore
  • Wido Weber, self

Musikmesse 2018 Meeting Agenda

We look forward to seeing many of you at the W3C Music Notation Community Group meeting at the Musikmesse fair in Frankfurt. The meeting will be held on Thursday, 12 April 2018 from 2:30 pm to 5:30 pm in the Symmetrie 3 meeting room at Hall 8.1. Note that this is a different date and location than past Musikmesse meetings.

Here is our agenda for the meeting. It will focus on the MNX specification, including MNX formats for generic music notation (MNX-Generic) and conventional Western music notation (MNX-Common).

  • 2:30 pm: Introduction, agenda, and sponsor message
  • 2:40 pm: Review working process to date
  • 3:00 pm: Solicit forthcoming issues for active review
  • 3:15 pm: MNX-Generic overview and demo
  • 3:30 pm: MNX-Generic discussion
  • 3:45 pm: MNX-Common layout discussion
  • 4:30 pm: Reception sponsored by MuseScore
  • 5:30 pm: End of reception

Please sign up on our Google form at https://goo.gl/forms/CoQowNTi8HQQqytf2 if you plan to attend the meeting. This will help ensure that we have enough room and refreshments for everyone.

You will need a Musikmesse trade visitor or exhibitor ticket to attend the meeting. Trade visitor day tickets cost 30 euros and are available online at www.musikmesse.com.

See you in Frankfurt!

Best regards,

Michael Good, Joe Berkovitz, and Daniel Spreadbury
W3C Music Notation Community Group co-chairs

Musikmesse Meeting on 12 April 2018

Musikmesse 2018 - It's my tune

We are pleased to announce that we will have a face-to-face meeting of the W3C Music Notation Community Group at the Musikmesse in Frankfurt. We look forward to this event each year as we usually have 40 to 50 music notation experts participating in the discussions.

This year’s meeting will be Thursday, 12 April 2018 from 2:30 pm to 5:30 pm in the Symmetrie 3 meeting room in Hall 8.1. Note that this is a different date and location than past Musikmesse meetings. Musikmesse was not able to provide us with a meeting room on Friday, so we needed to move the meeting to Thursday instead. We hope that we can maintain the same level of attendance and participation on this new date.

As in past years, we will have a 2-hour meeting followed by a 1-hour reception. This year’s reception will be sponsored by MuseScore.

We will be following up soon with more details about the meeting agenda. We wanted to get the notification of the confirmed date and time out to everyone as soon as possible, especially given the difference from our previous meetings.

You will need a Musikmesse trade visitor ticket to attend the meeting. These cost 30 euros and are available online at www.musikmesse.com.

Please sign up on our Google form at https://goo.gl/forms/CoQowNTi8HQQqytf2 if you plan to attend the meeting. This will help ensure that we have enough room and refreshments for everyone.

We look forward to seeing you in Frankfurt!

Best regards,

Michael Good, Joe Berkovitz, and Daniel Spreadbury
W3C Music Notation Community Group co-chairs

NAMM 2018 Meeting Minutes

The W3C Music Notation Community Group met in the TEC Tracks Meetup space in the Hilton Anaheim (Level 3, Room 7) during the 2018 NAMM trade show, on Friday, January 26, 2018 between 10:30 am and 12:00 noon.

The meeting was chaired by CG co-chairs Joe Berkovitz, Michael Good, and Daniel Spreadbury, and was attended by 20 members of the CG and interested guests. The handouts from the meeting can be found at

W3C MNCG NAMM 2018 Meeting Handout

Philip Rothman from the Scoring Notes blog recorded the meeting and has posted the video on YouTube. The video starting times for each part of the meeting are included in the headings below.

Introduction to the W3C MNCG (Starts at 0:41)

Michael Good introduced the W3C Music Notation Community Group. This meeting was part of NAMM’s TEC Tracks Meetup sessions, so several people attending were not members of the group.

Michael discussed the history of the group, its progress in 2017 in releasing MusicXML 3.1 as a Community Group Final Report, and its plans for 2018. The 2018 plans include work on the next-generation MNX project, as well as releasing a SMuFL update as a Community Group Final Report.

Group Introductions (Starts at 5:52)

We went around the room and each of the 20 attendees introduced themselves and their interest in the Music Notation Community Group. The attendees in order of their introduction on the video are:

  • Daniel Spreadbury, Steinberg (co-chair)
  • Jeff Kellem, Slanted Hall
  • Kevin Weed, self
  • Tom Nauman, Musicnotes
  • Jon Higgins, Musicnotes
  • Adrian Holovaty, Soundslice
  • Derek Lee, Groove Freedom
  • Philip Rothman, NYC Music Services
  • Jeremy Sawruk, J.W. Pepper
  • Bruce Nelson, Alfred
  • Mark Adler, MakeMusic
  • Steve Morell, NiceChart
  • Jon Brantingham, Art of Composing Academy
  • Evan Balster, Interactopia
  • Fabrizio Ferrari, Virtual Sheet Music
  • Simon Barkow-Oesterreicher, Forte Notation / Uberchord
  • Chris Koszuta, Hal Leonard
  • Doug LeBow, self
  • Joe Berkovitz, Risible (co-chair)
  • Michael Good, MakeMusic (co-chair)

These attendees covered a wide range of the music notation community. In addition to software developers there were composers, performers, music preparers and engravers, publishers, and publication and production directors.

MNX (Starts at 21:00)

Joe Berkovitz led a discussion of the current status and future directions for the next-generation MNX project. Given the variety of attendees, Joe tried to balance the discussion between the perspectives of both developers and users of music notation standards.

Currently there are three parts of MNX:

  1. CWMNX is the most familiar part for conventional Western music notation. We can think of this as the next generation of MusicXML, and hope that it will take the place of what would have been MusicXML 4.0.
  2. GMNX, a general music notation format. This emerged from the group’s discussions of how we could encode arbitrary music, not necessarily part of the Western music literature. There is a role for a literal format that encodes a linkage between arbitrary vector graphics and sound. Many applications for Western music notation could use it as well.
  3. The MNX Container covers the need to package an ensemble of files together in a way that reflects the needs of a compound document. This is currently in the most primitive state of the three and needs to be built out further.

Why Start Again and Work on MNX vs MusicXML? (Starts at 29:50)

MusicXML predates widespread Internet delivery of music; it was designed when print was still king. The MusicXML format includes several print-based assumptions, such as page breaks and credits (page-attached text), that cause problems for more flexible, mobile, and web-based ways of delivering music.

The success of MusicXML and the web has also created more music notation use cases that people want to address. A key one is for the model of the standard to be closer to the model that you would use for building an interactive notation program. Michael elaborated on why this was an explicit non-goal for MusicXML back in 2000, when MusicXML was trying to create a standard exchange format in the wake of unsuccessful prior efforts such as NIFF and SMDL.

Times have changed since then. We now have a product developer community that has seen the benefits of music notation exchange standards. We also have many more links to the music publisher community than what MusicXML had in 2000.

Where Are We Now? (Starts at 36:40)

We do not have very much yet for MNX. There is a draft specification, but it only covers perhaps 1/4 to 1/3 of what MusicXML does. There are no reference applications, there are not many examples, and there are lots of open issues.

The hope is to have a complete draft of the specification by the end of 2018, though that may be optimistic. At that point the vendor community will not be rushing to build MNX support, but we do expect to see experimental implementations. This is fine – if you don’t have implementations, you don’t learn.

Container Format (Starts at 41:17)

The MNX container format tries to do a better job of representing document hierarchies than MusicXML’s opus document type, which nobody appears to be using. Another goal is to provide a more solid approach to metadata compared to what we have today in MusicXML. Different score types can be included in the container, including CWMNX, GMNX, and other score types such as neumes that might be developed in the future.

Michael asked about using a zip file as an alternative or supplement to the XML format container. Joe replied that zip is just one of many ways we could package an archive, and Michael will file an issue on this.
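
For context, compressed MusicXML (.mxl) files already use a zip archive with a META-INF/container.xml manifest that points at the main score file. The sketch below packages a score along those lines in Python; the file names are illustrative, and the use of zip here simply illustrates the question raised above rather than any decision of the group.

    import zipfile

    # Manifest pointing at the main score file, following the pattern used by
    # compressed MusicXML (.mxl) archives.
    CONTAINER_XML = """<?xml version="1.0" encoding="UTF-8"?>
    <container>
      <rootfiles>
        <rootfile full-path="score.xml"/>
      </rootfiles>
    </container>
    """

    def package_score(archive_path: str, score_path: str) -> None:
        """Package a score file and its manifest into a zip-based container."""
        with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as archive:
            archive.writestr("META-INF/container.xml", CONTAINER_XML)
            archive.write(score_path, arcname="score.xml")

    # Example usage (assumes score.xml exists in the working directory):
    # package_score("example.mxl", "score.xml")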

Michael raised a second question about including digital rights management in the container format. Jeremy Sawruk replied that we should look at the HTML5 video debacle and not specify DRM ourselves. We should not preclude vendors adding DRM, but that should be done at the vendor level.

Doug LeBow raised an issue about being able to identify a creation and usage history for music within the metadata. In his experience with Disney, music gets repurposed and reused all the time, and people need to know where different parts came from. Joe suggested that Doug enter issues so that we can capture his knowledge of these use cases. Joe also mentioned that MNX intends for metadata to be present at any level in the document, not just at the score or collection level.

CWMNX Highlights (Starts at 50:35)

Sequences and directions are at the core of the new organization of musical material in CWMNX. In MusicXML you can hop back and forth between voices and times at will. CWMNX takes MusicXML’s cursor approach to ordering music and makes it much more constrained.

In CWMNX, music from a single voice is arranged into a sequence of events, including rests, notes, and chords. Directions are elements that are not events. Unlike events, they can have their own offsets into a container that they belong to. Dividing things into sequences and directions can make it easier to both encode and decode music notation. It provides a more natural mapping to data structures such as voices that are common among musical notation applications.
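
The sketch below is a purely illustrative Python model of that structure, not MNX syntax: each voice is an ordered sequence of events, while directions carry explicit offsets into the container they belong to.

    from dataclasses import dataclass, field
    from fractions import Fraction
    from typing import List

    @dataclass
    class Event:
        """A rest, note, or chord occupying a span of musical time."""
        duration: Fraction                                 # e.g. Fraction(1, 4) for a quarter
        pitches: List[str] = field(default_factory=list)   # an empty list means a rest

    @dataclass
    class Direction:
        """A non-event element (e.g. a dynamic) placed at an offset into its container."""
        offset: Fraction
        kind: str

    @dataclass
    class Sequence:
        """One voice: an ordered run of events, plus directions with explicit offsets."""
        events: List[Event] = field(default_factory=list)
        directions: List[Direction] = field(default_factory=list)

    # One measure of one voice: a quarter-note C4/E4 chord, a quarter rest,
    # and a dynamic placed at the start of the measure.
    voice = Sequence(
        events=[Event(Fraction(1, 4), ["C4", "E4"]), Event(Fraction(1, 4))],
        directions=[Direction(Fraction(0), "dynamic:p")],
    )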

MNX tries to make a clear distinction between semantic markup, such as “a C4 quarter note,” and presentation information. Presentation information could be left out and the application could still create readable music, though not necessarily looking as good as you might like. Examples of presentation information include fonts, changes from standard positioning, size, and color. Presentation information in CWMNX is referred to as styles, a clear reference to HTML styles and CSS.

A third category of data in CWMNX is interpretation. This is more general than MusicXML’s sound element. Interpretation can specify that irrespective of what the semantics indicate, here is how some music should be played, using a MIDI-like description.

Michael added that MusicXML handles some of MNX interpretation data not only with the sound element, but with pairs of elements that indicate what is played vs how music looks. One example is using the tie element for playback and the tied element for appearance. These paired elements are a common source of confusion among MusicXML developers. MNX can offer a more systematic approach to addressing the same underlying problem.
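
For readers who have not encountered this pairing, the sketch below shows a minimal MusicXML note fragment in which the tie element (a direct child of note) affects playback while the tied element (inside notations) affects appearance; the fragment omits many elements that a complete file would include.

    import xml.etree.ElementTree as ET

    # A single tied quarter note as MusicXML encodes it.
    NOTE_XML = """
    <note>
      <pitch><step>C</step><octave>4</octave></pitch>
      <duration>4</duration>
      <tie type="start"/>
      <type>quarter</type>
      <notations>
        <tied type="start"/>
      </notations>
    </note>
    """

    note = ET.fromstring(NOTE_XML)
    playback_tie = note.find("tie")            # drives sound
    visual_tie = note.find("notations/tied")   # drives the printed tie
    print(playback_tie.get("type"), visual_tie.get("type"))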

CWMNX includes the concept of profiles. A “standard” profile would cover the great majority, but not everything, of what is in conventional Western music notation. Multi-metric music is one of the biggest examples of something that would be in CWMNX but might not be in the standard profile.

We want to support the concept of house styles in CWMNX. This includes font, distance, and other layout information that applies across an entire score. We want to easily substitute one style for another depending on context, enabling responsive styling for music notation.

CWMNX Discussion (Starts at 1:03:00)

Joe asked the group how far CWMNX should go in describing a normative style of positioning for conventional Western music notation. Should it try to do this at all, and if so, how far should it go? What would the benefits and drawbacks be?

Daniel Spreadbury said that if we go in this direction, then we have to specify norms, and specify them quite thoroughly. That will be difficult to do.

Kevin Weed asked what happens if we don’t have these standards in MNX. What’s the alternative? The alternative is what happens now, where each application decides for itself how to interpret the formatting.

Doug LeBow referred to orchestrator use cases where people split up between Finale and Sibelius to write a single cue under high time pressure, with different people writing for different instruments. Without standards for appearance between applications you would lose control over quality and stylistic consistency in the final music product.

Chris Koszuta said that Hal Leonard has been trying to get their digital files to match the pristineness of the printed score. They have worked very hard to get to that point with MusicXML over the past several years, but are not quite there yet. To get the same control of the nuances in digital as you have in print, you need some agreed-upon standards. If not, when things fail and you have to go back to do additional work at the publisher, that’s tens of thousands of files, with all the time and money associated with that.

Hal Leonard has been converting into MusicXML over the past four years but still runs into customer problems because a digital service doesn’t do something quite right yet. Customers really do notice these details. Chris hopes we can get to some level of agreement and control where it’s fluid and things are fun, instead of being a lot of extra work to create the next step of interactive music notation. If we don’t lock things down now, we will be fiddling with these details for years and years ahead.

Tom Nauman said that a lot of Musicnotes’ use of MusicXML is inbound. Everything they import has to be tweaked to be satisfactory to the customer. Chris followed up that when Hal Leonard does content deals with partners, they don’t want to provide messy files where the partner has to do extra work.

Daniel said that if we do encode positioning information, we have to lock it down and agree. It will take a long time, but if we don’t do it and things aren’t absolutely black and white, applications won’t be predictable. In other aspects of MNX we are trying to have just one way to encode things, as with sequences. Positioning would be the same way.

Steve Morell raised the point that most developers focus on their MusicXML import, but MusicXML export has less attention paid to it. Is there any way to incentivize export as well as import quality? Doug agreed – there is so much back-and-forth exchange in today’s workflows for musicians that both directions need to work equally well. Joe replied that when we have widely adopted, free, probably open source MNX viewers in browsers, that would provide an incentive to improve export.

GMNX (Starts at 1:16:42)

GMNX is a “graphics plus media” type of format. The notation is an SVG file. Musical sound or performance is either an audio file or a MIDI-like list of timed events. The time relationships can then be linked between the graphics and sound, and applications don’t really need to know what the notation is. Many practice and performance applications don’t need more than this.
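
The sketch below is a conceptual illustration of that linkage, not GMNX syntax: a table of time spans, each pointing at the id of an SVG region, is enough for a player to highlight whatever region is sounding at a given moment of playback. The data is invented for the example.

    from bisect import bisect_right

    # (start_seconds, end_seconds, svg_element_id) triples linking performance
    # time to regions of the notation graphic.
    SPANS = [
        (0.0, 1.5, "measure-1"),
        (1.5, 3.1, "measure-2"),
        (3.1, 4.8, "measure-3"),
    ]
    STARTS = [start for start, _, _ in SPANS]

    def region_at(time_seconds: float) -> str | None:
        """Return the id of the SVG region sounding at the given playback time."""
        index = bisect_right(STARTS, time_seconds) - 1
        if index < 0:
            return None
        start, end, element_id = SPANS[index]
        return element_id if time_seconds < end else None

    # At 2.0 seconds of playback, highlight the second measure's region.
    assert region_at(2.0) == "measure-2"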

Joe has made GMNX demos available online for examples from Fauré, Hebrew cantillation, and aleatoric music from Lutosławski. GMNX might even be applied sooner than CWMNX since it is much simpler.

Adrian Holovaty asked how we could get performance synchronization points from GMNX into CWMNX. The synchronization feature in GMNX would be useful for applications that do know the semantics of music notation. Joe asked Adrian to file an issue so we can address this.

Evan Balster asked a question about longer-term intent and if MNX was something that could be embedded within HTML browser documents in the future, like math and SVG. Joe replied that there will be a namespace immediately, and it could be viewable in a browser once there is a decent JavaScript library that supports it.

Conclusion (Starts at 1:22:30)

At this point we concluded the meeting. We had productive discussions and look forward to these conversations continuing. We hope to figure out a way to have these conversations more often than our once- or twice-a-year meetings at NAMM and Musikmesse.

That evening we had a dinner at Thai Nakorn in Garden Grove. This photo of the dinner attendees is courtesy of Tom Nauman.

Photo of attendees at W3C MNCG Dinner at NAMM 2018

Attendees from bottom left, going clockwise: Matthew Logan, Michael Johnson, Philip Rothman, Adrian Holovaty, Jon Higgins, Joe Berkovitz, Doug LeBow, Tyler LeBow, Daniel Spreadbury, Vili Robert Ollila, John Barron, Michael Good, Jeff Kellem, Evan Balster, Jeremy Sawruk, Tom Nauman, Simon Barkow-Oesterreicher, Steve Morell, Kevin Weed, and Laura Weed.

CG Meeting at NAMM Show 2018

The co-chairs are hosting a meeting at The NAMM Show in Anaheim, CA for CG members and for any interested attendees of the show. The meeting will take place this Friday (January 26, 2018) from 10:30 am to 11:55 am, in the Hilton Anaheim in Room 7 on Mezzanine Level 3.  The discussion will focus on MNX, a next-generation markup language for encoding notated music (see the preceding post).

Details may be found on the NAMM website at: https://www.namm.org/nammu/w3c-music-notation-community-group-meetup

MNX Draft Specification Now Available

An early draft of the new MNX specification is now available for review and discussion. It can be viewed online here:  https://w3c.github.io/mnx/specification/

This is an important milestone for the group, and we’d like to thank everyone who has contributed to the many email threads and issues that helped move MNX forward so far. We’re excited at the prospect of moving the group’s work to a new level, one which can take a fresh look at some of the problems in music notation encoding.

Some of the significant ground covered in this draft includes:

  • A proposed semantic encoding for conventional Western music notation named “CWMNX”. This encoding takes MusicXML as a point of departure, but includes many improvements to syntax, style, content and structure. (See spec and examples.)
  • A new type of literal encoding called “GMNX”, which links SVG graphics to audio/performance data via the time dimension. This encoding is particularly suited to drive music practice and performance applications.  It also tries to remove bias towards notational idioms by avoiding the encoding of semantics: in GMNX, notations are just shapes and regions, and all audible content is encoded separately. A common timeline serves to connect notations and their audible counterparts. (See spec and examples.)

The group will be discussing MNX as well as other topics at the forthcoming NAMM Show in Anaheim, CA on Friday January 26, 2018; see this link for details.

We also expect to hold a meeting later in the year at Musikmesse in Frankfurt. Details forthcoming.