04-28-2018, 05:00 AM
|
#321
|
Human being with feelings
Join Date: Apr 2014
Posts: 306
|
Quote:
Originally Posted by Sju
Hey guys. Doing a bit of research on articulation switching methods, and thought this would be a good place to get knowledgeable answers.
I'm wondering if key switching has any advantages compared to channel switching?
As I understand it, the advantage of key switching is real-time playability, but a sampler using channel switching could operate in the same manner with a preprocessor that switches the MIDI output channel via keyswitches?
Channel switching sampler would then have the sole advantage of being able to play multiple articulations at the same time (impossible with keyswitching as far as I understand).
Am I missing any other pros/cons?
Thanks in advance.
PS. thank you tack for that script, it looks really useful!
|
There are SWS actions for changing channel input, even if your midi input device/keyboard transmits channel one by default. (includes CC data as well.)
SWS actions "Map Selected tracks midi input to channel 1"
and so on for the other channels.
I use keystrokes 1-0 for MIDI channels 1-10, and Logo (Win key) + 1 to 6 for the rest.
So let's assume you have legato on channel 1, and Staccato on channel 2.
You press 1, and you hear the legato when you play your input device.
You press 2, and you hear the Staccato, even though you are inputting from your midi device which is transmitting on channel 1.
This is important as a "preview" mechanism, as you might for example prefer spiccato over staccato after trying them both for a particular passage, in tempo, and alongside any legato you've already recorded.
I work like this and it functions well, and importantly, is fast. (with sufficient practise).
A couple of caveats.
If you have the track FX window open, then the keystrokes won't work, because the arrange (main) window is not in focus.
I don't know if this is a caveat as i find it to be an asset, but if you have channel 2 selected, then any CC data you transmit will be for channel 2. So you can vary volume, velocity, expression directly for each articulation. I find this useful because most of the staccatos i have across my libs are a little "bright" compared to the legatos, so i'm likely to adjust them by default.
And, if you have more than one artic playing at once (because you can, using channels) then you can balance them really effectively.
If you wish to add the same CC data to all channels at once then there is another action to "Set selected tracks midi input to all channels." (Which i have mapped to Logo + 7)
This is all in the main window. I have the same keystrokes mapped to switch the same channels in the midi editor, if i wish to manually input notes out of tempo, for example.
Remember that CC data can be written per channel, but if you wish to edit the data in the ME, you need to select the correct channel to do so. (Which is entirely sensible.)
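Sju's idea of a preprocessor that turns keyswitches into channel selection is easy to sketch. Purely as an illustration (the keyswitch range, channel count, and class name below are made-up assumptions, not any particular sampler's layout), something like this would do the remapping:

```python
# Hypothetical sketch: a tiny MIDI preprocessor emulating the channel-switching
# workflow. Note-ons in a reserved keyswitch range select the output channel;
# everything else (notes and CCs) is re-sent on the currently selected channel.

KEYSWITCH_BASE = 24  # C1 (assumption); notes 24, 25, ... select channels 1, 2, ...

class ChannelSwitcher:
    def __init__(self, num_channels=16):
        self.num_channels = num_channels
        self.channel = 0  # 0-based wire value, i.e. MIDI channel 1

    def process(self, status, data1, data2):
        """Take one MIDI message (status, data1, data2) and return the message
        to forward, or None if it was consumed as a keyswitch."""
        kind = status & 0xF0
        in_ks_range = KEYSWITCH_BASE <= data1 < KEYSWITCH_BASE + self.num_channels
        # A note-on in the keyswitch range selects the output channel.
        if kind == 0x90 and data2 > 0 and in_ks_range:
            self.channel = data1 - KEYSWITCH_BASE
            return None
        # The matching note-off (or zero-velocity note-on) is also swallowed.
        if kind in (0x80, 0x90) and in_ks_range:
            return None
        # Everything else is rewritten onto the current channel.
        return (kind | self.channel, data1, data2)
```

The same idea works in reverse, which is why channel switching loses nothing compared to keyswitching in terms of playability.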
Hope this helps.
Keyswitching is a decision made by sample companies on the assumption that users will find it the friendliest way to handle artics that may seem like they're breeding in large numbers.
It is NOT the only way to work, and i suggest you set up a test track, assign the actions, and then practise channel switching for an afternoon, and draw your own conclusions.
I have 2 1st violin tracks to handle artics from my various sample libs, with all the "popular" artics on the first, and the less used artics on the other, saved as a 1st Violins track template.
As soon as we have some sort of track notes (text window) docked next to the track list that will save listed artics per track, as part of a track template, you'll have no need to open the FX window if you can't remember which artic you assigned to which channel, as it will be listed next to where you're working. We have the track notes window already (thanks again SWS), and it can be docked on the left, changing with selected tracks which is really cool imho, but they can only be saved with the project, for the moment.
If you have your tracks preloaded in a project template, then you're good to go.
Alex.
Last edited by alextone; 04-28-2018 at 05:16 AM.
09-25-2018, 02:08 PM
|
#322
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
What about the simple and elegant way of adding articulations through actions?
IF actions included:
"insert note at C-1" to "insert note at G9" (ALL the notes)
We could:
insert at least all the keyswitches through a comprehensive and custom floating toolbar.
If actions included:
"insert MIDI CC [all of them]", then the custom toolbar would be complete.
AND:
I could finally make my custom guitar fret toolbar in which I could combine actions to insert notes according to string and fret number.
BUT:
would require that the action allows auditioning of the inserted note.
Last edited by krahosk; 09-25-2018 at 02:50 PM.
09-25-2018, 09:13 PM
|
#323
Join Date: Jan 2009
Posts: 1,030
|
Quote:
Originally Posted by krahosk
What about the simple and elegant way of adding articulations through actions?
IF actions included:
"insert note at C-1" to "insert note at G9" (ALL the notes)
We could:
insert at least all the keyswitches through a comprehensive and custom floating toolbar.
If actions included:
"insert MIDI CC [all of them]", then the custom toolbar would be complete.
AND:
I could finally make my custom guitar fret toolbar in which I could combine actions to insert notes according to string and fret number.
BUT:
would require that the action allows auditioning of the inserted note.
|
The method above would require a user to input new keyswitches, CC values, and channel changes whenever he/she changed patches or instruments.
The key benefit of an articulation mapper is that you can mix and match instruments without having to redo every articulation. For example, if I write a melody line for a trumpet which contains staccatos, legato, and other articulations, I should be able to copy that MIDI data (including notation) into another part (say a violin section) and simply load the violin articulation map to trigger the right samples.
The other benefit is that the user can create the articulation mapper once and not have to worry about it again. The above method would require me to reference relevant keyswitches, CCs, etc. A mapper that does this for the user would be a huge time saver.
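As a rough sketch of what such a mapper's core data could look like (all instrument names, CC numbers, and keyswitch notes below are invented for illustration, not any real library's mapping):

```python
# Hypothetical sketch of the "mapper" idea: each instrument gets a table
# translating abstract articulation names into whatever events its library
# expects (keyswitch note, CC, etc.). The note data itself stays
# articulation-agnostic, so a trumpet line can be pasted onto a violin
# section and replayed through a different map.

ARTICULATION_MAPS = {
    "trumpet": {
        "legato":   [("note", 24)],    # keyswitch at C1 (invented)
        "staccato": [("note", 25)],
    },
    "violins": {
        "legato":   [("cc", 32, 20)],  # e.g. a UACC-style CC32 value (invented)
        "staccato": [("cc", 32, 40)],
    },
}

def events_for(instrument, articulation):
    """Return the raw MIDI events that select `articulation` on `instrument`."""
    return ARTICULATION_MAPS[instrument][articulation]
```

The point is that only the map changes when you swap instruments; the articulation names attached to the notes never do.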
09-26-2018, 04:09 AM
|
#324
Join Date: Oct 2008
Location: France
Posts: 3,701
|
Is the Cockos team still working on the Articulation Mapper?
Schwa's last message about it was just after the 5.32 release...
09-26-2018, 04:14 AM
|
#325
Join Date: Dec 2012
Posts: 13,334
|
Quote:
Originally Posted by benf
Is the Cockos team still working on the Articulation Mapper?
Schwa's last message about it was just after the 5.32 release...
|
It is/was in some pre-releases, but needed more love. It will probably appear when 6.0 is released.
09-26-2018, 07:23 AM
|
#326
Join Date: Aug 2015
Location: Florence, Italy
Posts: 463
|
Quote:
Originally Posted by vitalker
It is/was in some pre-releases, but needed more love. It will probably appear when 6.0 is released.
|
I very much hope so.
09-26-2018, 08:05 AM
|
#327
Join Date: Apr 2017
Location: Los Angeles, CA
Posts: 376
|
Quote:
Originally Posted by benf
Is the Cockos team still working on the Articulation Mapper?
Schwa's last message about it was just after the 5.32 release...
|
Use the Reaticulate script by @Tack. It's the best articulation mapper you could hope for.
09-26-2018, 08:08 AM
|
#328
Join Date: Jan 2009
Posts: 1,030
|
Nothing against Tack and his work, but his script doesn't read articulation information from the notation view.
09-26-2018, 09:43 AM
|
#329
Join Date: Oct 2008
Location: France
Posts: 3,701
|
Quote:
Originally Posted by robgb
Use Reaticulate script by @Tack. It's the best articulation mapper you could hope for.
|
I already got it, of course. But I still hope the Cockos team will put that functionality in Reaper itself.
09-26-2018, 05:57 PM
|
#331
Join Date: Jan 2012
Location: North East UK
Posts: 493
|
Quote:
Originally Posted by krahosk
|
Nope. MPE is for ROLI keyboards; it basically allows each note to have its own CCs. Nothing to do with switching articulations.
09-26-2018, 05:59 PM
|
#332
Join Date: Jan 2009
Posts: 1,030
|
It doesn't sound like it's designed for that purpose. More to the point, any MIDI standard would have to be adhered to by every synth/sample library (which isn't realistic).
10-21-2018, 09:30 AM
|
#333
Join Date: Jan 2009
Posts: 1,030
|
@justin and/or schwa
Now that there are pre-release and dev branches in LoL, would it make sense to test potential articulation mapper functionality in the dev branch? Even if we aren't in the V6 cycle yet?
10-21-2018, 03:20 PM
|
#334
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,787
|
Quote:
Originally Posted by reddiesel41264
Nope. MPE is for ROLI keyboards; it basically allows each note to have its own CCs. Nothing to do with switching articulations.
|
????
It can perfectly well be used independently of ROLI.
I recently did a proof of concept (see -> https://forum.cockos.com/showthread.php?t=211580 ) showing that MIDI MPE data can be created in Reaper and is acknowledged by a PianoTeq plugin.
There I preceded each note-on with an appropriate pitchbend message, and with that was able to create a microtonal scale.
-Michael
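The trick can be sketched in a few lines of Python (assuming a 14-bit pitch bend with a +/-2 semitone range, which is only an assumption about the receiving plugin; the function name is invented):

```python
# Sketch of the pitchbend-before-note-on idea: each note gets its own
# per-channel pitch bend first, so every note can be detuned independently
# (the MPE notion of one voice per channel).

BEND_RANGE = 2.0  # semitones covered by full bend in each direction (assumption)
CENTER = 8192     # 14-bit pitch-bend center

def microtonal_note_on(channel, note, velocity, detune_cents):
    """Return the (pitch bend, note on) message pair for one detuned note."""
    bend = CENTER + round((detune_cents / 100.0) / BEND_RANGE * 8192)
    bend = max(0, min(16383, bend))                       # clamp to 14 bits
    pb = (0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F)  # status, LSB, MSB
    on = (0x90 | channel, note, velocity)
    return [pb, on]
```

As long as each sounding note sits on its own channel, the bends never fight each other, which is exactly what makes the microtonal scale work.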
10-21-2018, 06:16 PM
|
#335
Join Date: Jan 2009
Posts: 1,030
|
The risk with using MPE for articulation mapping is that:
1. There could be conflicting messages if an MPE controller and/or VSTi were used on a track with articulations.
2. MPE (as I understand it) would necessarily be limited to 16-note polyphony (notes temporarily remapped to other channels). This is fine in most cases with monophonic instruments but could present a problem with polyphonic instruments.
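Point 2 can be illustrated with a toy per-note channel allocator (the class name, zone layout, and 0-based channel numbering are simplified assumptions):

```python
# Sketch of the polyphony ceiling: an MPE-style zone hands each sounding
# note its own member channel, so the zone size caps simultaneous notes.
# A 15-member zone (channels 1-15, with channel 0 as the master) is assumed.

class VoiceAllocator:
    def __init__(self, first=1, last=15):
        self.free = list(range(first, last + 1))  # available member channels
        self.active = {}                          # note number -> channel

    def note_on(self, note):
        if not self.free:
            return None  # out of member channels: the polyphony limit
        ch = self.free.pop(0)
        self.active[note] = ch
        return ch

    def note_off(self, note):
        ch = self.active.pop(note)
        self.free.append(ch)  # channel becomes reusable
        return ch
```

Once the zone is exhausted, a new note simply has nowhere to go until something is released.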
10-21-2018, 09:28 PM
|
#336
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,787
|
1) When using MPE, articulations are not polyphonic but per voice.
2) Correct for notes that are treated equally. But AFAIK you can dedicate one or more channels to notes without articulation and send as many "simple" notes on those channels as you want.
-Michael
Last edited by mschnell; 10-22-2018 at 07:56 AM.
10-22-2018, 06:49 PM
|
#337
Join Date: Jan 2009
Posts: 1,030
|
Quote:
Originally Posted by mschnell
1) when using MPE, articulations are not polyphonic but per voice.
|
That's not my point. The point is that if you start using MPE outside its intended function, you will likely experience issues if/when plugins start using MPE as part of their playback structure.
Here's a different analogy. Let's say Reaper uses CC32 for articulation mapping (UACC). We immediately run into issues and workarounds if the VSTi library uses CC32 for anything else. The same logical problem exists with trying to use MPE for articulation mapping. At some point you will run into issues with a VSTi that actually uses standard MPE.
10-22-2018, 09:38 PM
|
#338
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,787
|
Rather obviously, the method used by any of Reaper's internal (or affiliated extensions') algorithms needs to be specifically configured for each kind of target VST.
-Michael
10-23-2018, 08:25 AM
|
#339
Join Date: Dec 2015
Posts: 73
|
Quote:
Originally Posted by pcartwright
@justin and/or schwa
Now that there are pre-release and dev branches in LoL, would it make sense to test potential articulation mapper functionality in the dev branch? Even if we aren't in the V6 cycle yet?
|
+1 here! I would enjoy testing the articulation mapper before V6 enters pre-release builds. Combining CC / keyswitch / any other articulation-switching method with notation symbols via a native REAPER feature will be great to use and test!
01-20-2019, 09:46 AM
|
#340
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Quote:
Originally Posted by pcartwright
That's not my point. The point is that if you start using MPE outside its intended function, you will likely experience issues if/when plugins start using MPE as part of their playback structure.
Here's a different analogy. Let's say Reaper uses CC32 for articulation mapping (UACC). We immediately run into issues and workarounds if the VSTi library uses CC32 for anything else. The same logical problem exists with trying to use MPE for articulation mapping. At some point you will run into issues with a VSTi that actually uses standard MPE.
|
Hopefully, the upcoming MIDI 2.0 spec will handle such an articulation feature more easily:
- 16 bits of articulation data in note on/off
- 256 extended-resolution registered per-note controllers (32 bits)
- 256 extended-resolution assignable per-note controllers (32 bits)
- Per-note management message
- 32-bit poly and channel pressure and pitch bend
- 16,384 registered controllers (32 bits)
- 16,384 assignable controllers (32 bits)
- 128 control change messages (32 bits)
https://www.youtube.com/watch?v=ZAK62mn5-Yc
It's still in the prototyping phase.
01-28-2019, 07:00 AM
|
#341
Join Date: Apr 2014
Posts: 58
|
Hollywood Strings question
Has anyone made a rebank file for Hollywood Strings (or any Hollywood Sample)? I would like to see the header layout.
Tom Wagner
Win 7 Pro
32 GB memory
All Hollywood Sample files.
Thanks
Tom Wagner
tswagner.com
01-28-2019, 09:11 AM
|
#342
Join Date: Jan 2009
Posts: 1,030
|
Quote:
Originally Posted by krahosk
Hopefully, the upcoming MIDI 2.0 spec will handle such an articulation feature more easily:
- 16 bits of articulation data in note on/off
- 256 extended-resolution registered per-note controllers (32 bits)
- 256 extended-resolution assignable per-note controllers (32 bits)
- Per-note management message
- 32-bit poly and channel pressure and pitch bend
- 16,384 registered controllers (32 bits)
- 16,384 assignable controllers (32 bits)
- 128 control change messages (32 bits)
https://www.youtube.com/watch?v=ZAK62mn5-Yc
It's still in the prototyping phase.
|
The new MIDI specification won't help if prior VSTis don't update from the old specification. Still, I imagine a JSFX could be used to map from the new spec to MIDI data in the old spec (i.e. the 16 bit note-on with articulation could be mapped to a keyswitch or the like).
Personally, I hope the devs don't wait that long to start testing the articulation mapper.
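The JSFX down-mapping idea can be sketched roughly like this in Python (the attribute-to-keyswitch table, function name, and keyswitch "tap" velocity are all invented for illustration; MIDI 2.0 note-ons do carry a 16-bit attribute field, but a real translator would work on the full UMP packet format):

```python
# Sketch of mapping the new spec down to the old: turn a MIDI 2.0-style
# note-on attribute into a MIDI 1.0 keyswitch tap sent just before the note.

ATTRIBUTE_TO_KEYSWITCH = {
    1: 24,  # e.g. attribute 1 -> legato keyswitch at C1 (invented mapping)
    2: 25,  # attribute 2 -> staccato (invented)
}

def downmap_note_on(channel, note, velocity7, attribute):
    """Translate one MIDI 2.0-style note-on into a list of MIDI 1.0 messages."""
    out = []
    ks = ATTRIBUTE_TO_KEYSWITCH.get(attribute)
    if ks is not None:
        out.append((0x90 | channel, ks, 1))  # keyswitch note-on
        out.append((0x80 | channel, ks, 0))  # and immediate note-off
    out.append((0x90 | channel, note, velocity7))
    return out
```

An unknown or zero attribute simply passes the note through untouched, which keeps old-spec VSTis working as before.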
03-09-2019, 07:20 AM
|
#343
Join Date: Dec 2015
Posts: 73
|
Looking forward to testing this, and to a newer version being worked on in the future! The imminent features for 6.0 are looking awesome so far! This and envelopes for CC events are features that will definitely be used daily!
Will you be working on this feature in future pre-release dev builds in the near future @schwa?
Last edited by Audio_Birdi; 03-09-2019 at 07:25 AM.
03-21-2019, 01:47 PM
|
#344
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Can't wait for an Articulation Mapper for Reaper, because Cubase and Logic already have this:
https://www.babylonwaves.com/
03-22-2019, 08:01 AM
|
#345
Join Date: Apr 2008
Location: Brasov Romania
Posts: 140
|
Quote:
Originally Posted by krahosk
|
This has been possible in Reaper for quite some time with UACC and Reaticulate, on the condition that the different articulations don't perfectly overlap.
https://www.youtube.com/watch?v=1gDJUIrecE8
03-22-2019, 09:18 AM
|
#346
Join Date: Jan 2009
Posts: 1,030
|
Quote:
Originally Posted by mirceablue
|
Neither solution immediately integrates with Reaper notation AFAIK. That is a missing link for some users.
03-22-2019, 08:40 PM
|
#347
Join Date: Nov 2010
Posts: 2,436
|
Quote:
Originally Posted by krahosk
|
And if we can manage to convert these articulations to future REAPER native format, that would be f****** awesome.
03-23-2019, 02:11 AM
|
#348
Join Date: Apr 2008
Location: Brasov Romania
Posts: 140
|
Quote:
Originally Posted by pcartwright
Neither solution immediately integrates with Reaper notation AFAIK. That is a missing link for some users.
|
This is also possible; check this:
https://www.youtube.com/watch?time_c...&v=ByNrHGyxUiw
05-23-2019, 04:49 PM
|
#349
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Quote:
Originally Posted by mirceablue
|
MIDI Editor here. No notation.
05-24-2019, 05:29 AM
|
#350
Join Date: Oct 2008
Location: France
Posts: 3,701
|
Look at the first video. He makes a custom action with both the notation action and the related UACC script, which makes it act on both the notation editor and the piano roll.
07-10-2019, 03:28 PM
|
#351
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Devs, when can we expect the articulation mapper to roll out? It's been publicly announced as the plan since December 22, 2016.
07-29-2019, 07:19 AM
|
#352
Join Date: Jun 2010
Location: Texas
Posts: 357
|
I also humbly voice my burning desire for articulation mapping integrated with the notation editor. I'm currently using Reddiesel's system [THANK YOU DAVID], but having it fully integrated would be a game changer!
07-29-2019, 01:42 PM
|
#353
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Yes, thank you David Healey. I prefer this method to Reaticulate because of its notation support with musical symbols.
Last edited by krahosk; 07-30-2019 at 05:50 AM.
08-12-2019, 04:45 PM
|
#354
Join Date: Jul 2019
Posts: 39
|
Collab
Is it possible for a few people to come together to make an articulation extension for Reaper with everything anyone could ever want? Reaticulate is pretty awesome even as it is but Tack has said he has very limited time lately. Can the few people working on their own articulation solutions rally together to make this happen? Or would that just be too many cooks in the kitchen? Perhaps turning it into a community funded project would help? If say 500 people chipped in 10 bucks would that make a difference?
Last edited by bywaterandblood; 08-12-2019 at 05:04 PM.
08-12-2019, 04:52 PM
|
#355
Join Date: Jan 2014
Location: Ontario, Canada
Posts: 1,619
|
Quote:
Originally Posted by bywaterandblood
Reaticulate is pretty awesome even as it is but Tack has said he has very limited time lately.
|
Not to give the impression I'm not working on it: I am actually actively developing it, spending several hours per week. It's just that it demands more than that.
Quote:
Originally Posted by bywaterandblood
Can the few people working on their own articulation solutions rally together to make this happen? Or would that just be too many cooks in the kitchen?
|
There will definitely be the problem of too many cooks, plus the problem of different culinary expectations.
I'm also not so sure how many people are working on solutions in this space. When native articulation maps resurface here, I'll certainly have much to contribute based on my experience with Reaticulate's userbase.
Quote:
Originally Posted by bywaterandblood
If say 300 people chipped in 10 bucks would that make a difference?
|
It's not a problem of monetary compensation. Unless it compensates so well that I'm able to quit my day job. But then we're talking wayyyy more than 300 people contributing 10 bucks.
08-12-2019, 05:20 PM
|
#356
Join Date: Oct 2017
Location: Black Forest
Posts: 5,067
|
Reaticulate is a brilliant tool and I'm only starting to implement it in my projects/workflow. I planned to do that months ago, but never found a way to get started.
That being said, I think it's the most well-thought-out solution we will ever see, because Tack uses it himself and has a lot of insight into orchestral libraries and different use cases.
In its current state, Reaticulate is already totally usable. The future additions are very welcome, however, and will make creating and organizing banks much easier. But for now I'm a happy camper
08-19-2019, 01:29 PM
|
#357
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Yes, Reaticulate is a brilliant tool! Thank you Tack for all the time you spend on it. Extremely useful for me!
08-19-2019, 03:10 PM
|
#358
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
Speaking of articulations, how can I know what Reaper actually does and does not support in MusicXML?
08-22-2019, 10:06 AM
|
#359
Join Date: Apr 2010
Location: London (UK)
Posts: 412
|
Hi, hoping to chime in with some little input.
Not being able to program or to offer any technical suggestion, I'm happy to offer my opinion as a quite experienced user and "score-oriented" composer:
Articulations should be "glued" to the notes themselves, not another "layer" that has to be controlled separately.
The best articulation systems in existence so far are Logic's Articulation Sets (which use articulation IDs) and Cubase's Expression Maps (using only the "attribute" function, not the direction),
for the simple reason that you can select a bunch of notes, set them to "pizzicato" (for example), then move them around or copy them to a different track or position, and they will still preserve their articulation. Pretty much like in a dedicated notation software.
Hope this could be helpful to get some ideas out in the air.
Thanks developers!
All the best
-t
__________________
MacOS 10.15.7
Mac Pro 6-Core - 64GB ram
Motu M4
09-29-2019, 06:41 AM
|
#360
Join Date: Jul 2009
Location: Germany
Posts: 2,375
|
I agree with you tusitala.
I also think that articulations "glued" to notes should be identified on the notes themselves, for reading and editing convenience. I don't know how Cubase and Logic handle this. What I mean is that articulations should not be handled in a separate window (or pane, like the MIDI CC pane): identifying which note has which articulation there is not easy on the eye, nor convenient editing-wise.
Also, since Reaper has a notation feature, articulations should be assignable in both MIDI Editor mode and notation mode.
Like:
- MIDI editor notes can display either velocity or pitch name directly on them. It would be fantastic to display the articulation name on them too.
- The notation editor can display notation symbols. If those could be attributed to specific notes, it would be fantastic too!
Last edited by krahosk; 09-29-2019 at 07:14 AM.