It seems like it could be a really good solution if Justin, schwa, and Tack opened up some dialogue about getting Reaticulate further developed and mainstreamed into Reaper. That seems like a possibility that could realistically happen.
Reaper development is really strong, and I can see it overtaking a lot of more expensive DAWs thanks to its release cycle and the aggressive pace both devs keep, but more help could really take Reaper to new levels. Not necessarily for the sake of money, which Justin may already have enough of and which doesn't seem to be his primary motive anyway, but to make this DAW even better in the spirit of excellence and service.
I've read Tack's source code and it's really well done. He's a good developer and he knows what he's doing. He could be a great addition to the Reaper team for this particular project.
Is this project still being considered or has it reached the status of "aborted"?
I think REAPER strongly needs a native articulation map integration in its MIDI editor, maybe together with MPE...
+1 here, been waiting patiently for 3+ years! I wish it were available as a beta feature we could mess around with again; the whole notation + MIDI editor solution was great when it was in beta last time!
+1000 here as well! Been using Reaticulate for ages now in my daily work and I absolutely love it, would be amazing to see it integrated as a native aspect of Reaper.
Both options need to be there because different virtual instruments/sample libraries switch articulations differently. A keyswitch can always be sent right before the note to switch the articulation, too, and it can all happen in the back end without you being aware of any of it...
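As a sketch of what that back-end switching could look like (this is not any actual REAPER or Reaticulate API; the `Note` type, the keyswitch pitches, and the lead time are all made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class Note:
    start: int          # position in MIDI ticks
    length: int
    pitch: int
    articulation: str   # e.g. "legato", "staccato"

# Hypothetical keyswitch layout; real mappings depend on the sample library.
KEYSWITCH_PITCH = {"legato": 24, "staccato": 25}
LEAD_TICKS = 10  # how far ahead of the note the keyswitch fires

def with_keyswitches(notes):
    """Return the event list with a keyswitch note inserted just before
    each note whose articulation differs from the previous one."""
    out, current = [], None
    for n in sorted(notes, key=lambda n: n.start):
        if n.articulation != current:
            ks_start = max(0, n.start - LEAD_TICKS)
            out.append(Note(ks_start, LEAD_TICKS,
                            KEYSWITCH_PITCH[n.articulation], n.articulation))
            current = n.articulation
        out.append(n)
    return out
```

The point is that the user only ever edits the per-note articulation field; the keyswitch events are derived automatically and never need to clutter the editor.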
Quote:
I think REAPER strongly needs a native articulation map integration in its MIDI editor, maybe together with MPE...
Definitely - especially the MPE! :-)
Ableton Live 11 has MPE now, as do Studio One and Cubase... and of course Bitwig.
Quote:
Both options need to be there because different virtual instruments/sample libraries switch articulations differently.
Definitely with the option of keyswitches tied to notes. To be able to trigger different articulation on simultaneous notes AND move notes while retaining the articulation would be a game changer for me.
Exactly!
Like this:
:-)
__________________
MacOS 10.15.7
Mac Pro 6-Core - 64GB ram
Motu M4
Quote:
Definitely with the option of keyswitches tied to notes.
If articulations are program change or MIDI CC events, shouldn't they move automatically with the notes anyway?
Some articulations should be tied to a note while others should not. It depends on the type of articulation and library. Some libraries allow for dynamic adjustments (like choking a cymbal, the choking effect would necessarily come after the note).
Also, workflow needs to be taken into account. It doesn’t make sense to have 10 notes in a row marked with the same articulation when a single articulation at the beginning accomplishes the same result.
There are many examples where articulation/technique becomes more robust when decoupled from the note.
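The "one marking at the beginning covers the following notes" behavior described above is just a last-event-wins lookup. A minimal sketch, with every name invented for illustration:

```python
import bisect

def resolve_articulations(note_starts, art_events, default="natural"):
    """For each note start time, find the most recent articulation event at
    or before it (last-event-wins). One marking at the start of a passage
    therefore covers every following note until the next marking."""
    events = sorted(art_events)            # (time, articulation) pairs
    times = [t for t, _ in events]
    resolved = {}
    for start in note_starts:
        i = bisect.bisect_right(times, start) - 1
        resolved[start] = events[i][1] if i >= 0 else default
    return resolved
```

Decoupled events like the cymbal choke above fit the same model: they live at their own position and simply aren't consulted by the note lookup.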
I don't understand the workflow problem of having 10 notes in a row with the same articulation when articulations are tied to notes. You just marquee-select or Ctrl+click them and assign a particular articulation.
If you're working from the notation view, the articulation entry on the left (one entry) is much easier to read and work with than the articulation entry on the right (per-note entry).
Quote:
You'd have to give me the more robust examples of the advantage of this decoupling.
Right now, I can see only advantages, like having multiple articulations on a note, or simultaneous notes with different articulations.
There's no reason why the decoupling option would prevent multiple articulations per note. Though you are correct that it would limit the ability for simultaneous notes to have different articulations. In my experience, simultaneous notes that require different articulations suggest separate tracks, but to each his own.
In addition to reducing clutter on the score, anything that would benefit from free positioning would benefit from decoupling.
In theory these could be tied to a single note, but I'd like to see how the events could be manipulated by the user via the midi editor and scripting before I get too excited about the idea.
Yes, I agree with you 100% when using the notation view, when reading is required by an instrument player. But is Reaper meant for that? For more in-depth editing, the MIDI editor is another approach entirely. There are so many things you can't do from a piano roll because it serves a different purpose.
Notation view in this case isn't meant strictly speaking for the player either. It's meant for the composer, arranger, orchestrator, etc. These are the most likely Reaper users in this context (composers, arrangers, and orchestrators making mock-ups). Ease of reading the notation view is a benefit to whomever uses it.
Back when the articulation maps were in pre-release (4 years ago now), they leveraged notation events to trigger MIDI changes.
So, yes, the concept of articulation mapping in Reaper is definitely meant to be used with the notation editor.
Also, I agree that some things are better left for the piano roll, but right now there is no native support for any kind of articulation mapping. So... something would be better than nothing.
Quote:
Back when the articulation maps were in pre-release (4 years ago now), they leveraged notation events to trigger MIDI changes. So, yes, the concept of articulation mapping in Reaper is definitely meant to be used with the notation editor.
Yes, it was leveraged. I had high hopes at the time; I thought I'd eventually be able to do everything MIDI-related from the notation view. So I fully understand what you're saying. There's a discussion about this with Tack at some point on the VI Control forum. In code terms, there are limits to what can be achieved if Reaper itself doesn't capture and expose the data. That was the conclusion.
I have my own script for articulation mapping from notation that I'll post soon.
I tried my hand at it some years ago, but the results were very lackluster. Tack's Reaticulate gave me some ideas, and I think I'll have a workable prototype in the next couple of weeks.
Last edited by pcartwright; 12-03-2020 at 04:45 PM.
I don't think there should be any difference between Score editor and Piano Roll (or list editor) in terms of articulations. Again my idea is that we should just be able to select notes and "attach" an articulation to them, regardless of where we're seeing those notes inside the software.
I'm a former Notion user myself. In fact, I was really active in their forum before Notion was acquired by PreSonus. In particular, I really pushed Notion to open up their custom rules process. Folks over there frowned at it at first (they looked at what I'd done more or less as hacking), but they eventually opened it up more. It's a shame; Notion had (and still has) a lot of potential, but the pace of development became very sluggish. The biggest pain point for me was the fact that, up until a relatively recent version, Notion only sent dynamics data to one MIDI channel. I don't imagine dedicated score editing is a huge money maker, so it's hard to blame them for the slowdown in development.
I've had my eyes on Dorico for a while, but the price tag is a bit much.
********************
********************
Here is what I've developed in my script so far:
- The script reads in a tab-delimited file. The file contains a list of rules that convert notation metadata to MIDI CC data.
- A GUI (thanks to Lokasenna) that allows the user to click a button to enter notation markings. If notes are selected, the notation/articulation is attached to them. If no notes are selected, the notation is entered as track text.
- A process to capture note-length metadata to be used by JSFX or other parts of the script to trigger certain effects (like short-note samples)
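For what it's worth, a rules file like the one described can be parsed with nothing but the standard library. The three-column layout below is my own guess for illustration, not the script's actual format:

```python
import csv
import io

# Hypothetical layout: notation marking, CC number, CC value.
SAMPLE_RULES = (
    "articulation\tcc\tvalue\n"
    "staccato\t58\t20\n"
    "legato\t58\t64\n"
)

def load_rules(text):
    """Parse tab-delimited rules mapping a notation marking to a CC event."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return {row["articulation"]: (int(row["cc"]), int(row["value"]))
            for row in reader}
```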
This script is entirely notation-driven, meaning that changes in notation metadata drive the changes to CC values. However, piano roll users shouldn't fret, because these entries can be made in the piano roll as well as in the notation view.
I'd be happy to offer the script up to testers if anyone is interested. Reply here or shoot me a PM.
Quote:
If you're working from the notation view, the articulation entry on the left (one entry) is much easier to read and work with than the articulation entry on the right (per-note entry). In addition to reducing clutter on the score, anything that would benefit from free positioning would benefit from decoupling.
I think the visualization is a secondary consideration, and it’s more useful to go back to the first principles.
In music no note can be without an articulation, since by definition making a sound with an instrument includes a playing technique. So I think what follows automatically is that each note needs to have its own articulation metadata. It should be no different in implementation from velocity, as both are essential properties in describing how to play a note.
Obviously in a notation view there are conventions such as not repeating "pizz." each time there's another note with pizzicato, so it's up to the notation view renderer to accommodate this convention. But I see no reason this should have any effect on the underlying metadata.
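That renderer-side convention is easy to sketch: collapse per-note metadata into labels that only appear where the articulation changes. Purely illustrative Python, not any real notation renderer:

```python
def notation_labels(per_note_articulations):
    """Collapse per-note articulation metadata for display: emit a label
    only where the articulation changes, as notation does with 'pizz.'."""
    labels, previous = [], None
    for art in per_note_articulations:
        labels.append(art if art != previous else "")
        previous = art
    return labels
```

The underlying metadata stays fully per-note; only the displayed labels are thinned out.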
Exactly!
As Phillip said, you really do need both. Articulation metadata attached to notes is obvious enough, but there are many use cases that need arbitrarily positioned articulation control events.
Reaper does support track notation events, but they aren't tied to MIDI channels, which means this isn't usable in practice for articulation control.
I wrote more about the myriad practical problems with using notation events in Reaticulate. Most of these are little tweaks, but they would require intervention by Justin or schwa to address properly. I think all of these things would need to be addressed anyway for usable native articulation maps.
Quote:
I'd be happy to offer the script up to testers if anyone is interested. Reply here or shoot me a PM.
I'd love to try the script out and give feedback if possible!
Hi folks. I’m just digging into Reaper’s handling of articulations right now. Can anybody tell me the current status of this project, and where I might get a hold of it to start getting familiar with it?
I'm also looking for a simple toolbar that will allow me to click, say, an icon for "ppp" and have it simultaneously insert the dynamic marking "ppp" into the score and a CC 11 event of, say, 8. Has anybody written such a thing, and if so, where might I find it? Or is there a more elegant way of "roughing in" dynamics during sketching?
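I haven't seen such a toolbar posted here, but the logic behind it would just be one lookup table feeding two insertions per click. A hedged sketch; apart from ppp = 8 (suggested above), the CC values are placeholders you would tune per library, and the event tuples are purely illustrative, not a REAPER API:

```python
# Illustrative dynamics curve; only ppp = 8 comes from the post above.
DYNAMIC_TO_CC11 = {
    "ppp": 8, "pp": 24, "p": 44, "mp": 60,
    "mf": 76, "f": 94, "ff": 110, "fff": 126,
}

def dynamic_events(marking, ppq_position):
    """Return the two events one toolbar click would insert at a position:
    the score text for the dynamic and the matching CC 11 event."""
    return [
        ("notation_text", ppq_position, marking),
        ("cc", ppq_position, 11, DYNAMIC_TO_CC11[marking]),
    ]
```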