Old 01-20-2022, 02:51 AM   #1921
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by Reaktor:[Dave] View Post
Thanks, that's working!

I'm trying to page through the parameters of my virtual instruments, but still with the conversion to MIDI CCs workflow. The basic setup looks like this:
1) one ReaLearn instance as InputFX, converting incoming MIDI messages to specific MIDI CCs that fit the current page of parameters of the track's VSTi. No Feedback is set up at this stage, this is purely done to reroute my fader messages to the respective CCs that a certain VSTi expects as input.
Ah okay, so here you use ReaLearn as a simple MIDI converter in order to conform to the VSTi's internal CC-to-parameter mapping (you can't use ReaLearn's own parameter mappings for this because they don't write automation envelopes, and therefore offline rendering wouldn't work). BTW, this part could also be done using JSFX.

Quote:
Originally Posted by Reaktor:[Dave] View Post
2) one ReaLearn instance before the VSTi, letting through matched and unmatched events, purely for creating feedback to my controller:
2.1) textual feedback to the channel's display line #1 displaying the plugin's parameter name
2.2) numeric feedback to the channel's display line #2 displaying the plugin's parameter value
What's the "Feedback output" of that instance?

Quote:
Originally Posted by Reaktor:[Dave] View Post
2.3) conversion from the MIDI CC message to the VSTi's parameter (purely for making feedback work for the two points above)
Mmh, this one I don't get. Why do you need that? Feedback for 2.1 and 2.2 should work without having any additional mapping?

Quote:
Originally Posted by Reaktor:[Dave] View Post
2.4) conversion from the MIDI CC message to the controller's original input message via "MIDI: Send message" to the feedback output (this is the one I'm having problems with)
This one I don't get either. Why not just use 2.1 and 2.2 and use your controller output device as "Feedback output"? Then you can put that ReaLearn instance actually anywhere. It doesn't have to be in the same chain.


Quote:
Originally Posted by Reaktor:[Dave] View Post
I'd expect this setup to take my controller's input, convert it so the messages arrive at the CC numbers expected by the VSTi (this one is working) and also send feedback to the faders of my control surface (this one only works partially). Feedback to the display is working fine (only when the track is record-enabled; then it works during playback, when stopped and when changing the parameter from within the VSTi). But feedback to the faders has some problems: it's always being sent during playback regardless of whether the track is actually record-enabled, and it's also not working when changing a parameter from within the VSTi's GUI.
Well, I'm not completely getting the feedback part, see my points above. There shouldn't be any MIDI CC conversion necessary for the feedback direction. You need that conversion only for control direction (for making offline rendering work).

Quote:
Originally Posted by mks View Post
I wanted to ask about one aspect that can hopefully find a solution with ReaLearn. Having a good time over here getting my controllers set up. It was two questions, but I solved the OSC text string situation.

Is there any way to have the track order follow either the mixer or arrange visibility? Specifically, ignoring folded folder tracks? Currently I’m playing with track selections and the dynamic assignment function. Combined with reaper track selection snapshots. Which may ultimately be more powerful, albeit a bit slower to work with. Perhaps there’s some use of “dynamic” that would follow track visibility of Reaper.

Thanks. Realearn has become really amazing!
Best open a FR at https://github.com/helgoboss/realearn/issues.

Quote:
Originally Posted by jimosity View Post
I've been playing with ReaLearn and I'm not quite sure if it can do this, and if so, how to get there.
I'd like to take an incoming midi note from a controller and have it map to a specific Program Change. Is that possible?
Quote:
Originally Posted by foxAsteria View Post
It's not a MIDI translator. MIDI-Ox can do that, though it's not the most user-friendly program. There are probably plugins that do as well.
ReaLearn can also act as a MIDI translator (target "MIDI: Send message"). If you want to map a specific note to a specific program change, no problem: "Learn source" the specific note, select the "MIDI: Send message" target, pick "Program change", e.g. "Channel 1", and set both Target Min and Max to the same value (the desired program change value).

If you need more complicated stuff, I can strongly recommend writing JSFX.

Both approaches keep everything nicely within the REAPER project (something you can't achieve with MIDI-Ox).
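
In case it helps to picture what that translation boils down to, here is a rough sketch of the logic in plain Lua (just an illustration of the MIDI bytes involved, not the JSFX/EEL2 you would actually write; the note and program numbers are made-up examples):
Code:
-- Illustration only (plain Lua 5.3, not JSFX/EEL2): translate one specific
-- note into one specific program change. TRIGGER_NOTE and TARGET_PROGRAM are
-- made-up example values.
local TRIGGER_NOTE = 60     -- hypothetical: middle C
local TARGET_PROGRAM = 5    -- hypothetical: program number 0-127

-- msg is a raw 3-byte MIDI message: { status, data1, data2 }
local function translate(msg)
  local status, note, velocity = msg[1], msg[2], msg[3]
  local is_note_on = (status & 0xF0) == 0x90 and velocity > 0
  if is_note_on and note == TRIGGER_NOTE then
    local channel = status & 0x0F
    return { 0xC0 | channel, TARGET_PROGRAM }  -- program change on the same channel
  end
  return nil  -- everything else: no translation
end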

Quote:
Originally Posted by gvanbrunt View Post
Some users have requested a way to be able to monitor which bank is active. I put together a script that allows this.
Cool! Let me know if you need deeper access to ReaLearn from ReaScript. I guess at the moment you just query VST parameter values, but there's this awesome feature "named parameters" of REAPER's VST extensions API, which lets a VST expose parameters that can have arbitrary names and arbitrary values, e.g. strings. If you need some, open a FR.

Quote:
Originally Posted by gvanbrunt View Post
I figure the issue I am running into stems from the fact that this Boss pedal is latching and therefore has discrete on/off states, unlike the sustain pedal. But I am not certain.
If this is the issue, you can choose source character "Toggle-only button (avoid!)" for that mapping. This source character interprets each incoming message as "On", even the "Off" message.

Quote:
Originally Posted by gvanbrunt View Post
So my question is: what configuration (if any) will allow me to run actions (1)-(3) above in the way I currently trigger them with my sustain pedal (i.e. double press, one press, one longer press)?


EDIT #2: are all my problems solved simply by returning this and getting the unlatched pedal instead? https://www.sweetwater.com/store/det...ing-footswitch

Please try the above setting first. The main idea of ReaLearn is to work with any controller, allowing reuse of older gear instead of buying new stuff, and saving the world from electronic waste
helgoboss is offline   Reply With Quote
Old 01-20-2022, 07:13 AM   #1922
Miscreant
Human being with feelings
 
Miscreant's Avatar
 
Join Date: Mar 2012
Posts: 376
Default

Quote:
Originally Posted by helgoboss View Post

If this is the issue, you can choose source character "Toggle-only button (avoid!)" for that mapping. This source character interprets each incoming messages as "On", even the "Off" message




Please try above setting first. The main idea of ReaLearn is to work with any controller, allowing reuse of older stuff, not buying new stuff and saving the world from electrical waste
Thanks, Helgo. I have been playing with the 'Toggle-only (avoid!)' setting, but unfortunately I am finding it unreliable.

For example, I can get it working most of the time to engage my 'play metronome' action with the 'fire on double press' setting. I say 'most of the time' because sometimes the double press doesn't engage and to get it working I have to re-learn the MIDI source. Similarly, if after engaging the action with the double press I press the foot controller once, then I cannot turn the action off with a double press. I have to stop the transport in other ways (e.g. with my typing keyboard).

For another, on 'Toggle-only (avoid!)' I cannot engage my two other assigned actions with one press timed between 100-500 ms, nor with one long press (>500ms). Neither setting gets any response in Reaper when I engage the footswitch.

Edit #1: I will add that one difference between the 'play metronome', which works, and the other two actions is that the 'play metronome' action is a toggle action (it records on/off states in Reaper). This might explain why the 'Toggle-only (avoid!)' setting does not work for these other two actions.

Last edited by Miscreant; 01-20-2022 at 07:20 AM.
Miscreant is offline   Reply With Quote
Old 01-20-2022, 07:36 AM   #1923
fbeauvaisc
Human being with feelings
 
Join Date: Nov 2018
Location: Montreal
Posts: 405
Default Icon Platform M+

Hi,

I've recently switched from an X-Touch compact to a Platform M+ controller. With the X-Touch, I was using the midi mode and it worked perfectly.

I only need a few motorized faders with MIDI feedback, mostly the selected track's volume and send levels. The rest is mapped to things like metronome volume, CC1, CC11, Last touch, etc.

I'm using an instance of ReaLearn 2 in the monitoring FX. I ran into issues with the "User Defined" mapping on the M+: the fader would go crazy if I mapped more than one parameter (in my case, selected track volume and send levels). With only one parameter mapped it works flawlessly, but mapping more than one makes the fader misbehave.

So after reading in the manual that it worked better with "Mackie Control", I went with that and downloaded the M+ preset from Reapack. It solved the fader going crazy thing and I can now map a bunch of faders with midi feedback and all works well.

My main issue is that now all faders use the motors and revert to zero when not receiving parameters. For example, I tried to map a fader to metronome volume using "Invoke REAPER action" and it works, but as soon as I let go of the fader, the motors pull it back to zero. Same goes with the "Global: Last touch" parameter.

Is there a way to disable the motors on some faders? Disabling the MIDI feedback on those specific mapped faders does nothing; they still physically move back to zero when I let go of them.

I've tried to look through the manual but I couldn't find a solution for this.

Any thoughts?

Thanks!
fbeauvaisc is offline   Reply With Quote
Old 01-20-2022, 09:46 AM   #1924
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by Miscreant View Post
Thanks, Helgo. I have been playing with the 'Toggle-only (avoid!)' setting, but unfortunately I am finding it unreliable.

For example, I can get it working most of the time to engage my 'play metronome' action with the 'fire on double press' setting. I say 'most of the time' because sometimes the double press doesn't engage and to get it working I have to re-learn the MIDI source. Similarly, if after engaging the action with the double press I press the foot controller once, then I cannot turn the action off with a double press. I have to stop the transport in other ways (e.g. with my typing keyboard).

Edit #1: I will add that one difference between the 'play metronome', which works, and the other two actions is that the 'play metronome' action is a toggle action (it records on/off states in Reaper). This might explain why the 'Toggle-only (avoid!)' setting does not work for these other two actions.
Source, Glue, Target ... these are totally independent sections in ReaLearn. It looks like you have configured your source section correctly if it works in general, so I wouldn't worry about it anymore. Worry about your target choice instead. "Project: Invoke REAPER action" provides different ways to trigger REAPER actions. But all REAPER actions work differently; this is not so much under ReaLearn's control. If you need an action that toggles, you should find an action that toggles by itself. I've explained this in quite some detail here in the user guide.


Quote:
Originally Posted by Miscreant View Post
For another, on 'Toggle-only (avoid!)' I cannot engage my two other assigned actions with one press timed between 100-500 ms, nor with one long press (>500ms). Neither setting gets any response in Reaper when I engage the footswitch.
There's no way measuring press duration can work with your button. If your button doesn't send a MIDI message when it's released, how can ReaLearn (or any other software) possibly know how long you have pressed it? If you need to distinguish between different press durations, you either need to configure your button to act as a momentary button (e.g. send 127 on press and 0 on release) or get a new controller which can do this. In both cases you don't need "Toggle-only (avoid!)" anymore (which is simply a hack for poor controllers that don't send messages on button release).
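
To illustrate why the release message matters, here is a minimal sketch in plain Lua (nothing ReaLearn-specific): press duration is simply "release time minus press time", and the 0.5 s threshold mirrors the long-press boundary from your post.
Code:
-- Minimal illustration (plain Lua, not ReaLearn code): without a release
-- message there is nothing to measure.
local press_started = nil

-- value: 127 on press, 0 on release (the "momentary" behavior described above)
local function on_button_message(value)
  if value > 0 then
    press_started = os.clock()
  elseif press_started then
    local duration = os.clock() - press_started
    press_started = nil
    if duration > 0.5 then
      return "long press"
    else
      return "short press"
    end
  end
  -- a latching button that never sends 0 never reaches the release branch
end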

Quote:
Originally Posted by fbeauvaisc View Post
Hi,

I've recently switched from an X-Touch compact to a Platform M+ controller. With the X-Touch, I was using the midi mode and it worked perfectly.

I only need a few motorized faders with midi feedback mostly selected track's volume and send levels. The rest is mapped to things like metronome volume, CC1, CC11, Last touch etc...

I'm using an instance of realearn 2 in the monitoring FX. I ran into issues with "User Defined" mapping with the M+. The fader would go crazy if I mapped more than 1 parameter. In my case, Selected Volume and Send Levels. With only 1 parameter mapped, it works flawlessly but mapping more than 1 makes the fader behave a little crazy.

So after reading in the manual that it worked better with "Mackie Control", I went with that and downloaded the M+ preset from Reapack. It solved the fader going crazy thing and I can now map a bunch of faders with midi feedback and all works well.

My main issue is that now, all faders use the motors and reverts to zero when not receiving parameters. For exemple, I tried to map a fader to metronome volume using "Invoke Reaper Action" and it works but as soon as I let go of the fader, the motors reverts to zero. Same goes with Global : Last touch parameter.

Is there a way to disable the motors on some faders. Disabling the midi feedback on those specific mapped faders does nothing. They still physically move back to zero when I let go of them.

I've tried to look through the manual but I couldn't find a solution for this.

Any thoughts?

Thanks!
Does it work when you map the faders to something "normal" (not REAPER actions, they are special and bad with feedback, see my reply to the other question), such as track volume? Do they stay at the position of the volume?
helgoboss is offline   Reply With Quote
Old 01-20-2022, 10:51 AM   #1925
fbeauvaisc
Human being with feelings
 
Join Date: Nov 2018
Location: Montreal
Posts: 405
Default

Quote:
Originally Posted by helgoboss View Post


Does it work when you map the faders to something "normal" (not REAPER actions, they are special and bad with feedback, see my reply to the other question), such as track volume? Do they stay at the position of the volume?
Yes it works perfectly fine with selected send levels and selected track volume.

For REAPER actions or the global last touched parameter, even when I deactivate MIDI feedback, the faders always want to physically go back to zero in Mackie Control mode. They do work in the sense that they send the right values to the right parameters, but the faders are always pulled back down by the motors.

Just to be clear, I don't want feedback on those functions like metronome volume, CC1, CC11 and the last touched parameter. I only want feedback on the selected track's sends and the selected track's volume, and that works well.

In MIDI mode the X-Touch works flawlessly, but the Platform M+ in user-defined mode has issues when more than one parameter is set for MIDI feedback.
fbeauvaisc is offline   Reply With Quote
Old 01-20-2022, 06:49 PM   #1926
fbeauvaisc
Human being with feelings
 
Join Date: Nov 2018
Location: Montreal
Posts: 405
Default

My bad, I got it working. I found out that faders react better to pitch bend than to CC, which makes sense. I can't remember if I ran into this with the X-Touch.

So I now drive the motorized functions with pitch bend messages and use CC for the passive faders, and it works perfectly.
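
I assume this is because Mackie-style faders ride on pitch bend, which carries a 14-bit value per channel instead of the 7 bits a single CC gives you. Here's a rough sketch of what building such a message looks like (plain Lua 5.3, made-up helper name, just for illustration):
Code:
-- Sketch only (plain Lua 5.3): a pitch bend message carries 14 bits (0-16383)
-- split over two 7-bit data bytes, versus a single 7-bit byte for a CC.
local function fader_to_pitch_bend(channel, position)  -- channel 0-15, position 0.0-1.0
  local value = math.floor(position * 16383 + 0.5)     -- 14-bit fader position
  local lsb = value & 0x7F                              -- low 7 bits
  local msb = (value >> 7) & 0x7F                       -- high 7 bits
  return { 0xE0 | channel, lsb, msb }                   -- pitch bend status byte + data
end

-- e.g. fader_to_pitch_bend(0, 0.5) --> { 0xE0, 0x00, 0x40 } (center position, channel 1)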

Thanks!
fbeauvaisc is offline   Reply With Quote
Old 01-20-2022, 08:36 PM   #1927
Fergler
Human being with feelings
 
Fergler's Avatar
 
Join Date: Jan 2014
Posts: 5,220
Default

Why is it that ReaLearn can see OSC from Lemur but Reaper can't? o.0
Fergler is offline   Reply With Quote
Old 01-22-2022, 05:43 AM   #1928
Reaktor:[Dave]
Human being with feelings
 
Reaktor:[Dave]'s Avatar
 
Join Date: Jun 2010
Location: Berlin
Posts: 563
Default

Quote:
Originally Posted by helgoboss View Post
Ah okay, so here you use ReaLearn as a simple MIDI converter in order to conform to the VSTi's internal CC-to-parameter mapping (you can't use ReaLearn because you don't write automation envelopes and therefore offline rendering wouldn't work). BTW, this part could also be done using JSFX.
True, this instance is just a MIDI converter. However, I want this to be paged, so clicking a button would jump to the next 8 parameters of the plugin that I have defined. I'm hoping to achieve this with ReaLearn.

Quote:
Originally Posted by helgoboss View Post
What's the "Feedback output" of that instance?
It's set to "D 400", the controller I'm using.

Quote:
Originally Posted by helgoboss View Post
Mmh, this one I don't get. Why do you need that? Feedback for 2.1 and 2.2 should work without having any additional mapping?
It's a Kontakt-specific (or maybe even a library-specific?) hack because Kontakt doesn't update its host parameters on incoming MIDI CCs. It isn't necessary for other plugins like u-he's plugins.

Quote:
Originally Posted by helgoboss View Post
This one I don't get either. Why not just use 2.1 and 2.2 and use your controller output device as "Feedback output"?
I still need to get feedback to the fader. 2.1 and 2.2 only target the controller's display. However, you're right, I don't need to convert the incoming MIDI CC data to the controller's feedback but I can create a mapping between the plugin's parameter and the virtual fader from the controller compartment. I just didn't think of that because at that stage, the controller's data has been converted already and needs to be converted back, but a mapping between the parameter and the controller compartment automatically does that!
Btw., I found a way to convert and feed back the incoming MIDI CC data to the controller by using "MIDI script (feedback only)" as source. But your suggestion is better!

Quote:
Originally Posted by helgoboss View Post
Then you can put that ReaLearn instance actually anywhere. It doesn't have to be in the same chain.
So I could put this instance in the monitor fx chain? How could I scale this approach to support multiple Kontakt libraries from that one instance?
Reaktor:[Dave] is offline   Reply With Quote
Old 01-22-2022, 04:13 PM   #1929
foxAsteria
Human being with feelings
 
foxAsteria's Avatar
 
Join Date: Dec 2009
Location: Oblivion
Posts: 10,271
Default

Quote:
Originally Posted by Fergler View Post
Why is it that ReaLearn can see OSC from Lemur but Reaper can't? o.0
The Reaper OSC system knows it's obsolete so it just doesn't even try anymore.
__________________
foxyyymusic
foxAsteria is offline   Reply With Quote
Old 01-23-2022, 06:13 PM   #1930
mks
Human being with feelings
 
Join Date: Dec 2011
Posts: 171
Default Parameter does not release write automation

I'm trying to write automation in touch mode, and it seems the parameter does not release once touched. Is this intended behaviour?
mks is offline   Reply With Quote
Old 01-24-2022, 01:28 AM   #1931
BenjyO
Human being with feelings
 
Join Date: Nov 2011
Posts: 308
Default

Is it possible to get textual feedback using only MIDI? I believe SysEx messages should be able to do it but I'm not sure. I've been banging my head over this for the last couple of days but I'm not getting any closer and feel like I'm way over my head. I'll appreciate any help.

Here's what I'm trying to achieve:
I'm using an iPad and the new TouchOSC app as my controller and I want to get textual feedback (e.g. track name or track's volume value) back to TouchOSC and display it in a "label control". I know this can be done with OSC but I want to avoid OSC for the following reason: I want to be able to use the ReaLearn mappings offline through a wired connection. TouchOSC supports wired connections but only for MIDI (through TouchOSC Bridge). So as far as I understand all of this, if I want to have an offline set-up, I need to figure out a way to do this over MIDI.

Here's what I've done so far:
I've managed to send SysEx messages from TouchOSC to ReaLearn by creating a "button control" in TouchOSC and setting its MIDI message to "F0 00 20 6B 7F 42 02 00 10 77 00 F7" SysEx message (following the first example in the ReaLearn user guide). I mapped that message via the "Raw MIDI / SysEx" control type to a track's volume fader. According to the user guide that message sends a 100 % value and when pressing that button in TouchOSC, the track's fader in Reaper really moves to its maximum value. I couldn't get any feedback however. I monitored through ReaLearn's own outgoing messages log and through TouchOSC's own messages log. I do have feedback enabled and I tried out all of the feedback options for this mapping but nothing worked. My next attempt would be to change that "button control" in TouchOSC to a "label control" and see where I can get from there but I got stuck here.

Also I'm using latest ReaLearn pre-release version.

So, does anyone have an idea of what I can try next or where I made a mistake? I feel like I'm missing something here.

Also: Where can I easily (or is that asking for too much?) learn what kind of SysEx messages to use in the first place and how to convert them into meaningful text or numbers in TouchOSC? The new TouchOSC has powerful Lua scripting capabilities which I've tried out already, so I believe translation of incoming SysEx messages should be possible. I read the section about this topic in ReaLearn's user guide but I don't get it ... :/
How does one know that "F0 00 20 6B 7F 42 02 00 10 77 00 F7" means a 100 % value? AFAIK SysEx messages are sort of like OSC: you - the manufacturer of a MIDI hardware device or the user of TouchOSC where you can set up the SysEx message - define what they mean.
__________________
Check out some of my music
BenjyO is offline   Reply With Quote
Old 01-24-2022, 03:48 AM   #1932
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by BenjyO View Post
Is it possible to get textual feedback using only MIDI? I believe SysEx messages should be able to do it but I'm not sure. I've been banging my head over this for the last couple of days but I'm not getting any closer and feel like I'm way over my head. I'll appreciate any help.

Here's what I'm trying to achieve:
I'm using an iPad and the new TouchOSC app as my controller and I want to get textual feedback (e.g. track name or track's volume value) back to TouchOSC and display it in a "label control". I know this can be done with OSC but I want to avoid OSC for the following reason: I want to be able to use the ReaLearn mappings offline through a wired connection. TouchOSC supports wired connections but only for MIDI (through TouchOSC Bridge). So as far as I understand all of this, if I want to have an offline set-up, I need to figure out a way to do this over MIDI.
I think you will keep banging your head much longer if you pursue this path. Unless TouchOSC is able to decode Mackie LCD text sys-ex data (or at least a sequence of ASCII bytes) and display it as text. Maybe it is (I haven't looked into the scripting). Let's see if anyone chimes in and proves that it can.

But I think even if you could get it to work in theory, the clean way would be to make OSC work over wire. This is not so much a feature that TouchOSC should support, but your iPad and your machine running REAPER. As soon as you manage to establish a wired network connection between iPad and Mac/PC, you have it solved. That's possible I think, just not done very often. You'll probably need an ethernet adapter with lightning connector. Try to find some tutorials.

BTW, if you find a good solution getting wired ethernet to work, let me know. I haven't tried yet but it's on my agenda to check how that works.

Quote:
Originally Posted by BenjyO View Post
Also: Where can I easily (or is that asking for too much?) learn what kind of SysEx messagest to use in the first place and how to convert them into meaningful text or numbers in TouchOSC? The new TouchOSC has powerful lua scripting capabilites which I've tried out already so I believe translation of incoming SysEx messages should be possible. I read the section about this topic in ReaLearn's user guide but I don't get it ... :/
How does one know that "F0 00 20 6B 7F 42 02 00 10 77 00 F7" means a 100 % value? AFAIK SysEx messages are sort of like OSC: you - the manufacturer of a MIDI hardware device or the user of TouchOSC where you can set up the SysEx message - define what they mean.
Text via MIDI is going to be difficult unless TouchOSC supports decoding ASCII (ReaLearn and all the other Mackie-LCD-compatible software sends text as ASCII-encoded strings).

Numeric values encoded within SysEx messages can be received/sent by ReaLearn quite easily, but you need to tell it how it should extract and interpret those values - because as you say, it's not standardized. It's basically just a sequence of arbitrary bytes. Described in great detail in the user guide: https://github.com/helgoboss/realear...aw-midi-source

Update: If you want to send/receive numeric values via MIDI and you are the one who decides which MIDI messages are used, don't use sys-ex; use plain old 3-byte MIDI messages, e.g. CC. Much easier.
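
For illustration, here is what the plain 3-byte route looks like next to sys-ex framing (a sketch in Lua with a hypothetical helper, just to show the bytes involved):
Code:
-- Sketch only (plain Lua 5.3): a CC message is one status byte plus two data
-- bytes - no sys-ex framing and no custom byte layout to document.
local function cc_message(channel, cc_number, value_normalized)  -- channel 0-15
  local value = math.floor(value_normalized * 127 + 0.5)         -- 7-bit value
  return { 0xB0 | channel, cc_number, value }
end

-- e.g. cc_message(0, 7, 1.0) --> { 0xB0, 0x07, 0x7F } (CC 7 at 100 % on channel 1)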

Last edited by helgoboss; 01-24-2022 at 04:05 AM.
helgoboss is offline   Reply With Quote
Old 01-24-2022, 03:51 AM   #1933
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by mks View Post
I'm trying to write automation in touch mode, and it seems the parameter does not release once touched. Is this intended behaviour?
What kind of parameter are you talking about?
helgoboss is offline   Reply With Quote
Old 01-24-2022, 04:47 AM   #1934
mks
Human being with feelings
 
Join Date: Dec 2011
Posts: 171
Default

Quote:
Originally Posted by helgoboss View Post
What kind of parameter are you talking about?
Seems like any parameter I've tried to target so far (various native plug-in parameters). As soon as I adjust the parameter value via the source or even the little slider in ReaLearn, it begins to write, but never lets go. This happens from a stopped state too. If I adjust the value, it latches and I need to start/stop to let it go.
mks is offline   Reply With Quote
Old 01-24-2022, 04:54 AM   #1935
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by mks View Post
Seems like any parameter I’ve tried to target so far (various native plug-in parameters). As soon as I adjust the parameter value via the source or even the little slider in Realearn: it begins to write, but never lets go. This happens from a stopped state too. If I adjust the value, it latches and and I need to start/stop to let it go.
Generally, in automation mode Touch, the DAW/ReaLearn needs a way of knowing when you release a fader/knob. For track volume/pan/width, you do this via the "Track: Set automation touch state" target. The preset combination "Mackie Control" + "DAW control" supports this already.

There's an open issue for adding a touch target for FX parameters, too: https://github.com/helgoboss/realearn/issues/474. Easy to add but just a bit lower on my priority list at the moment.
helgoboss is offline   Reply With Quote
Old 01-24-2022, 05:21 AM   #1936
mks
Human being with feelings
 
Join Date: Dec 2011
Posts: 171
Default

Quote:
Originally Posted by helgoboss View Post
Generally, in automation mode Touch, the DAW/ReaLearn needs a way of knowing when you release a fader/knob. For track volume/pan/width, you do this via the "Track: Set automation touch state" target. The preset combination "Mackie Control" + "DAW control" preset supports this already.

There's an open issue for adding a touch target for FX parameters, too: https://github.com/helgoboss/realearn/issues/474. Easy to add but just a bit lower on my priority list at the moment.
Ah. That makes sense. I got excited about the new version when I thought those were added. Ironically, I found a post I made about it when I first tried a few years ago, lol. No worries though, and you've been doing great work!! Unfortunately, the primary reason I really use controllers is to write automation, and almost always in touch mode. I'll keep an eye on that open issue and will integrate ReaLearn one day hopefully.

As an aside: it would be amazing if the (parameter and touch) mappings somehow could be automatically paired or at least visible in the same mapping. My dream is to be able to quickly add mappings to OSC controls as a mix happens and have touch mode work fairly quickly without separately configuring that for each parameter. But that's a totally different request. Just having them there would be amazing and a key feature for me.
mks is offline   Reply With Quote
Old 01-24-2022, 06:49 AM   #1937
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by mks View Post
Ah. That makes sense. I got excited about the new version when I thought those were added Ironically, I found a post I made about it when I first tried a few years ago. lol. No worries though and you've been doing great work!! Unfortunately, the primary reason I really use controllers is to write automation. And almost always in touch mode. I'll keep an eye on that open issue and will integrate Realearn one day hopefully.

As an aside: it would be amazing if the (parameter and touch) mappings somehow could be automatically paired or at least visible in the same mapping. My dream is to be able to quickly add mappings to OSC controls as a mix happens and have touch mode work fairly quickly without separately configuring that for each parameter. But that's a totally different request. Just having them there would be amazing and a key feature for me.
Best if you vote for that issue. That tells me someone is really interested and I might actually implement it sooner.
helgoboss is offline   Reply With Quote
Old 01-24-2022, 07:06 AM   #1938
mks
Human being with feelings
 
Join Date: Dec 2011
Posts: 171
Default

Quote:
Originally Posted by helgoboss View Post
Best if you vote for that issue. That tells me someone is really interested and I might actually implement it sooner.
Will do. Does Reaper have the equivalent of "TRACK_VOLUME_TOUCH" implemented via its OSC functionality? I don't see it in the default list but perhaps that's a new feature or unlisted? FX_PARAM_TOUCH isn't a thing, unfortunately.
mks is offline   Reply With Quote
Old 01-24-2022, 05:37 PM   #1939
Sound asleep
Human being with feelings
 
Sound asleep's Avatar
 
Join Date: Nov 2009
Location: Montreal, Canada
Posts: 9,073
Default

Just a small suggestion: there is a warning that comes up when you try to save a certain way, which is basically "what you're doing makes little sense for saving a patch you would use with many projects, do you want me to change things so they make more sense?" The default here is yes, which will change everything. This is, I guess, two suggestions in one. The first part would be to rephrase the question in such a way that "yes" saves it as originally requested. Maybe just "Do you want to save anyway? Choosing no will do the following instead." Something like that. And/or a little checkbox that says "don't ask me this again" would be cool.

The usual default in programs is that "yes" carries forward with the actual command, even if the results might not be desired - saving over the same filename is a very common example. Having it this way, people can easily skip reading it and just choose yes out of habit, thinking they're saving over the same thing; if they do that, it changes everything and writes over the old preset, which is a significant bummer.

Thankfully I have not yet done this lol. But I could definitely see myself doing it.
Sound asleep is offline   Reply With Quote
Old 01-24-2022, 05:46 PM   #1940
foxAsteria
Human being with feelings
 
foxAsteria's Avatar
 
Join Date: Dec 2009
Location: Oblivion
Posts: 10,271
Default

Quote:
Originally Posted by Sound asleep View Post
Just a small suggestion, there is a warning that comes up when you try to save a certain way
This comes up if you have added any mappings and not edited them. I already mentioned to helgoboss that new mappings create this issue, but check the bottom of your list and you can probably delete the unassigned actions and not be bothered with the popup. It won't change "everything" though if you click Yes.
__________________
foxyyymusic
foxAsteria is offline   Reply With Quote
Old 01-24-2022, 06:55 PM   #1941
Sound asleep
Human being with feelings
 
Sound asleep's Avatar
 
Join Date: Nov 2009
Location: Montreal, Canada
Posts: 9,073
Default

Quote:
Originally Posted by foxAsteria View Post
This comes up if you have added any mappings and not edited them. I already mentioned to helgoboss that new mappings create this issue, but check the bottom of your list and you can probably delete the unassigned actions and not be bothered with the popup. It wont' change "everything" tho if you click Yes.
I think this is something different than what you experienced. It is a warning that the method I used to assign tracks by ID will only work for this particular project. Like if I open a new project, those tracks won't be there. So it wants to assign them to track number, or something like that, I forget.

But I have track templates I use, so it will work as a preset with lots of other projects I have.
Sound asleep is offline   Reply With Quote
Old 01-24-2022, 11:52 PM   #1942
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,791
Default

Hi HelgoBoss.
Thanks for your great work!
Is there, or will there be, a version of ReaLearn for the RasPi?

Background: A friend of mine is building a guitar effect pedal or rack based on that hardware. Tests work with great success and very low latency. He might need to control the thing via OSC/WiFi.

Thanks again,
-Michael
mschnell is online now   Reply With Quote
Old 01-25-2022, 10:47 AM   #1943
foxAsteria
Human being with feelings
 
foxAsteria's Avatar
 
Join Date: Dec 2009
Location: Oblivion
Posts: 10,271
Default

Quote:
Originally Posted by Sound asleep View Post
I think this is something different than what you experienced. It is a warning that the method I used to assign tracks by ID will only work for this particular project.


That one? Yes it concerned me as well until I realized it was just the unedited mappings.
__________________
foxyyymusic
foxAsteria is offline   Reply With Quote
Old 01-25-2022, 10:55 AM   #1944
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by mschnell View Post
Hi HelgoBoss.
Thanks fpor your great Work !
Is there, will there bi a version of ReaLearn for the RasPi ?

Background: A friend of mine is building a Guitar Effect Pedal or Rack based on that hardware. Test work with great success and very low latency. He might need to control the thing via OSC/WiFi.

Thanks again,
-Michael
There is already, both 32-bit and 64-bit.

BTW, I will reply to the other questions later, not much time at the moment.
helgoboss is offline   Reply With Quote
Old 01-25-2022, 02:57 PM   #1945
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 14,791
Default

Quote:
Originally Posted by helgoboss View Post
There is already, both 32-bit and 64-bit.
GREAT !!!!
-Michael
mschnell is online now   Reply With Quote
Old 01-26-2022, 07:06 AM   #1946
yop22
Human being with feelings
 
Join Date: Mar 2020
Posts: 22
Default Glitches in touch mode

Does anyone else experience issues when recording automation in touch mode through ReaLearn?
As you can see in the picture, the recorded automation looks like a sawtooth.
I'm using TouchOSC and the issue occurs with both the MIDI and OSC protocols (it doesn't when recorded via Reaper's native OSC).
I've also tried from an Arturia controller and it does the same.
Recording automation in latch or write mode is fine.
Attached Images
File Type: jpg automation.JPG (36.1 KB, 77 views)
__________________
Win 10, Avid S3, RME UCX
yop22 is offline   Reply With Quote
Old 01-26-2022, 09:15 AM   #1947
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by yop22 View Post
Does anyone else experience issue when recording automation in touch mode through Realearn ?
As you can see on the picture, the recorded automation looks like a sawtooth.
I'm using TouchOSC and the issue occurs with MIDI and OSC protocols (it doesn't when recorded via Reaper native OSC).
I've also tried from an Arturia controller and it does the same.
Recording automation in latch or write mode is fine.
This happens when you haven't created a mapping with the target "Track: Set automation touch state". The whole point of automation mode "Touch" is that it allows you to tell REAPER when you release a control so it can stop writing automation. If you need that, create a mapping from the TouchOSC touch event to the mentioned target. If you don't need that, there's no reason to use automation mode Touch.
helgoboss is offline   Reply With Quote
Old 01-26-2022, 12:44 PM   #1948
Sound asleep
Human being with feelings
 
Sound asleep's Avatar
 
Join Date: Nov 2009
Location: Montreal, Canada
Posts: 9,073
Default

Quote:
Originally Posted by foxAsteria View Post


That one? Yes it concerned me as well until I realized it was just the unedited mappings.
That's the popup, ya. It might be coming up for similar reasons for you and me, but in my case choosing yes will change the way I have the tracks mapped. I currently select them to be mapped via ID. This way I have "Percussion", "Bass", "Guitar", "Piano", "Keys", "Orchestra", "Vocals", etcetera in a default project. So, every project I have is that way now. That gives me 9 general top-level categories, and keys 1-9 on my keyboard skip to each. I also have a MIDI controller that's mapped to each of these for mute and solo and level. So, when I'm at my piano, I can solo any group, mute any group, and change the volume for any group. Any group in any project. For record arm, I need to map it for each case, but that's ok. If I use "by position" or track number or whatever, these will be in different spots all the time, so it won't work. If I select "yes" on that dialog box, it's going to change all the mappings to position, or something like that. I'm pretty sure I've done it before.
Sound asleep is offline   Reply With Quote
Old 01-26-2022, 07:59 PM   #1949
foxAsteria
Human being with feelings
 
foxAsteria's Avatar
 
Join Date: Dec 2009
Location: Oblivion
Posts: 10,271
Default

Quote:
Originally Posted by Sound asleep View Post
choosing yes will change the way I have the tracks mapped. I currently select them to be mapped via ID.
But if you have a consistent naming system going in all your projects, why can't you just target by name? That's what I do for my VCA master tracks and I've got a similar setup as well.
__________________
foxyyymusic
foxAsteria is offline   Reply With Quote
Old 01-27-2022, 04:06 AM   #1950
BenjyO
Human being with feelings
 
Join Date: Nov 2011
Posts: 308
Default

Quote:
Originally Posted by helgoboss View Post
I think you will keep your head banging much longer if you pursue this way. Unless TouchOSC is able to decode Mackie LCD text sys-ex data (or at least a sequence of ASCII bytes) and display it as text. Maybe it is (I haven't looked into scripting). Let's see if anyone chimes in and proves that it can.
Thanks for your reply, helgoboss. I was being hardheaded and decided to continue pursuing this path anyway. I asked a similar question in a Facebook group named TouchOSC Templates Makers (there's also a Discord server named "TouchOSC"). There, several people confirmed that TouchOSC Bridge seems to have a bug: it doesn't send SysEx messages back to your device from your computer. They recommended I try another virtual MIDI cable named "RtpMIDI". I did, and it works.

Anyway, for anyone interested in this: I was able to decode Mackie LCD text sys-ex data with the help of this guide on GitHub: Understanding Mackie Control Protocol. Another issue arose, however. The TouchOSC manual states that you can only receive SysEx messages on the root level (the root control of the TouchOSC layout document) and they can only be processed with scripting (Lua). I managed to write a simple Lua snippet which prints out the decoded SysEx message in the Script tab of TouchOSC's message log:
Code:
function onReceiveMIDI(message, connections)
  -- 240 (0xF0) marks the start of a system exclusive message
  if (message[1] == 240) then
    local messageLen = #message
    local output = ""
    -- Mackie LCD sys-ex: bytes 1-7 are the header and display offset, the last
    -- byte is 0xF7, everything in between is the ASCII text payload
    for i=8, messageLen - 1, 1 do
      output = output .. string.char(message[i])
    end
    output = string.gsub(output, '^%s*(.-)%s*$', '%1') -- whitespace trim
    print(output)
  end
end
You can then use these values to dynamically update the text of Label controls, and I managed to do that as well, but it was too laggy for my taste - I think RtpMIDI induces more latency than TouchOSC Bridge. The fact that you then have to write a filter script on the root control for all of these SysEx messages is also not appealing ... So I guess that in the end I'll have to go the OSC route.
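
For completeness, the label-update step looked roughly like this (a rough sketch from memory; I'm assuming TouchOSC's root:findByName() and the text value on Label controls here, and "lcd_line_1" is just an example control name - double-check against the TouchOSC scripting docs):
Code:
-- Rough sketch only: assumes TouchOSC's scripting API exposes
-- root:findByName() and a "text" value on Label controls (verify against the
-- TouchOSC docs). "lcd_line_1" is a made-up control name.
local label = root:findByName("lcd_line_1")

local function showDecodedText(text)
  if label then
    label.values.text = text  -- push the decoded Mackie LCD string to the label
  end
end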

Quote:
Originally Posted by helgoboss View Post
But I think even if you could get it to work in theory, the clean way would be to make OSC work over wire. This is not so much a feature that TouchOSC should support, but your iPad and your machine running REAPER. As soon as you manage to establish a wired network connection between iPad and Mac/PC, you have it solved. That's possible I think, just not done very often. You'll probably need an ethernet adapter with lightning connector. Try to find some tutorials.

BTW, if you find a good solution getting wired ethernet to work, let me know. I haven't tried yet but it's on my agenda to check how that works.
I wanted to avoid buying adapters but now after I've learned all of the stuff I wrote above, I'll probably go down that route and I'll let you know if I succeed.

Quote:
Originally Posted by helgoboss View Post
Update: If you want to send/receive numeric values via MIDI and you are the one who decides over what MIDI messages are used, don't use sys-ex, use plain old 3-byte MIDI messages, e.g. CC. Much easier.
I want text values as well (track names, units etc.) so CC isn't an option. Thanks anyway!
__________________
Check out some of my music
BenjyO is offline   Reply With Quote
Old 01-27-2022, 04:40 AM   #1951
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by BenjyO View Post
You can then use these values then to dynamically update the text of Label controls and I managed to that as well but it was too laggy for my taste - I think RtpMIDI induces more latency than TouchOSC Bridge. The fact that you then have to write a filter script on the root control for all of these SysEx messages is also not appealing ... So I guess that in the end I'll have to go the OSC route.

I wanted to avoid buying adapters but now after I've learned all of the stuff I wrote above, I'll probably go down that route and I'll let you know if I succeed.
Good to know though that it's possible using MIDI for that, at least in theory. Pretty powerful, the new TouchOSC.

Quote:
Originally Posted by BenjyO View Post
I want text values as well (track names, units etc.) so CC isn't an option. Thanks anyway!

You could use sys-ex for text and short MIDI messages for the numeric stuff. But anyway, good that you will pursue the OSC route.
Looking forward to hearing what you find out.
helgoboss is offline   Reply With Quote
Old 01-27-2022, 07:37 AM   #1952
cjewellstudios
Human being with feelings
 
Join Date: Sep 2017
Posts: 998
Default

Regarding ethernet adapters:

For iPads running iOS 10 or later, you buy a USB camera connection kit (I think that's what they are called) and a regular USB to Ethernet adapter. You need iOS 10 or later because it will detect an Ethernet adapter, and some Ethernet settings will become available in the main settings menu. This allows you to see the IP easily, which is essential for two-way communication. I run an older iPad Air 2 this way and it's rock solid.
cjewellstudios is offline   Reply With Quote
Old 01-27-2022, 07:47 AM   #1953
BenjyO
Human being with feelings
 
Join Date: Nov 2011
Posts: 308
Default

I have a few new questions:

1. How can I convert an OSC message to a MIDI message with FX Output? The user guide has left me confused. It says that it only works if control input is set to <FX input>. If I do that however, my OSC device (iPad) doesn't connect with ReaLearn and no OSC messages can come in. How was this envisioned? I'm sure I'm missing something again.

2. Is it possible to get current region names as textual feedback (the region in which the edit or play cursor resides)? I tried the "Marker/region: Go to" target but the only textual feedback I receive is "On" or "Off" depending on whether the cursor is inside the region.
__________________
Check out some of my music
BenjyO is offline   Reply With Quote
Old 01-27-2022, 07:51 AM   #1954
BenjyO
Human being with feelings
 
Join Date: Nov 2011
Posts: 308
Default

Quote:
Originally Posted by cjewellstudios View Post
Regarding ethernet adapters:

For iPads running iOS 10 or later, you buy a USB camera connection kit (I think that's what they are called) and a regular usb to ethernet adapter. You need iOS 10 or later because it will detect an ethernet adapter some ethernet settings will be available in the main settings menu. This allows you to see the IP easily which is essential for two way communication. I run an older iPad air 2 this way and its rock solid.
That's good to know. I do have a lightning to USB adapter already and my iPad is running the latest OS so it should work. Thanks for pointing it out.
__________________
Check out some of my music
BenjyO is offline   Reply With Quote
Old 01-27-2022, 07:54 AM   #1955
Hartley Mays
Human being with feelings
 
Join Date: Sep 2009
Location: Cincinnati, Ohio
Posts: 307
Default Sysex bit testing/setting

I'm trying to figure out whether ReaLearn's sysex support can handle the sysex messages that my Rodgers organ uses to send and receive stop settings. The individual stops are assigned to specific bits in the 44-byte sysex message sent to/from Reaper to indicate the setting for that stop.

I can get the full 44 byte message into the pattern field using the Learn option. I can then set a variable value for the byte whose bits I want to test. But then I don't see how/where to specify or test (input) or set (output) the specific bit within that byte? Also, if there is a way to do this, will it work properly to set up multiple mappings for each of the bits in that byte, and then the other bytes which may also have stop values? Most of the 44 bytes in the message are fixed, but several are needed to hold all the stop values, and I would need to be able to test just one bit of one of the bytes for each mapping, ignoring the other variable values.

I've got some prototype code working to do this decidedly convoluted stuff with Bome Midi Translator, but would much prefer to use Realearn if possible.
Hartley Mays is offline   Reply With Quote
Old 01-27-2022, 09:12 AM   #1956
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by Reaktor:[Dave] View Post
True, this instance is just a MIDI converter. However, I want this to be paged, so clicking a button would jump to the next 8 parameters of the plugin that I have defined. I'm hoping to achieve this with ReaLearn.
This should be possible by using mode "Incremental button" and setting Speed Min to 8.
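
Just to illustrate the arithmetic (a plain Lua sketch, not ReaLearn code; it assumes the bank value is used directly as a parameter-index offset): with a step size of 8, each button press advances the offset by one page of 8 parameters.
Code:
-- Illustration only (plain Lua, not ReaLearn code): with a step size of 8,
-- page 0 covers parameters 1-8, page 1 covers 9-16, and so on.
local function page_range(page)        -- page = 0, 1, 2, ...
  local first = page * 8 + 1
  return first, first + 7              -- e.g. page_range(1) --> 9, 16
end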

Quote:
Originally Posted by Reaktor:[Dave] View Post
It's set to "D 400", the controller I'm using.
Good, I was fearing you had it set to FX output or something like that.

Quote:
Originally Posted by Reaktor:[Dave] View Post
I still need to get feedback to the fader. 2.1 and 2.2 only target the controller's display. However, you're right, I don't need to convert the incoming MIDI CC data to the controller's feedback but I can create a mapping between the plugin's parameter and the virtual fader from the controller compartment. I just didn't think of that because at that stage, the controller's data has been converted already and needs to be converted back, but a mapping between the parameter and the controller compartment automatically does that!
Btw., I found a way to convert and feedback the incoming MIDI CC data to the controller by using "MIDI script (feedback only)" as source. But your suggestion is better!
Great if it works, but probably what you say would also work in the main compartment. The only purpose of the controller compartment is to "abstract away" the controller, so that you can build main presets that work for any controller (which has the necessary buttons), not just one.

Oh, you have discovered MIDI script, congratulations! I already regret a bit that I introduced it; today I would probably choose Lua instead. If you can do without the MIDI script source, do without it.

Quote:
Originally Posted by Reaktor:[Dave] View Post
So I could put this instance in the monitor fx chain? How could I scale this approach to support multiple Kontakt libraries from that one instance?
Yes, the instance responsible for "feedback" can go on the monitoring FX chain if you would like to have it available in each project. The instance for "control" can't in your case, because you need to route track MIDI data to it.

How to scale to multiple Kontakt libraries ... it depends on what you mean by that. Do you have multiple Kontakt instances or just one? What criteria should decide which Kontakt instance is the active one (pushes out feedback) and which is not?

Quote:
Originally Posted by Sound asleep View Post
Just a small suggestion, there is a warning that comes up when you try to save a certain way which is basically "what you're doing makes little sense for saving a patch you would use with many projects, do you want me to change things so they make more sense?" The default here is yes, which will change everything. This is I guess 2 suggestions in one. First part would be to rephrase the question in such a way that "yes" saves it as originally requested. Maybe just "Do you want to save anyway? Choosing no will do the following instead." Something like that. And/or a little checkbox that says "don't ask me this again" would be cool.

The default usually in programs is that "yes" carries forward with the actual command, even though results might not be desired, like saving over the same filename is a very common one. So, having it this way, can easily have people quickly not reading it, and just choosing yes out of habit, knowing they're saving over the same thing, and if they do that, it changes everything, and writes over the old preset, which is a significant bummer.

Thankfully I have not yet done this lol. But I could definitely see myself doing it.
Mmh yes, I need to improve that one day in terms of usability. See https://github.com/helgoboss/realearn/issues/510 and discuss there.

Quote:
Originally Posted by cjewellstudios View Post
Regarding ethernet adapters:

For iPads running iOS 10 or later, you buy a USB camera connection kit (I think that's what they are called) and a regular usb to ethernet adapter. You need iOS 10 or later because it will detect an ethernet adapter some ethernet settings will be available in the main settings menu. This allows you to see the IP easily which is essential for two way communication. I run an older iPad air 2 this way and its rock solid.
Cool.

Quote:
Originally Posted by BenjyO View Post
I have a few new questions:

1. How can I convert an OSC message to a MIDI message with FX Output? The user guide has left me confused. It says that it only works if control input is set to <FX input>. If I do that however, my OSC device (iPad) doesn't connect with ReaLearn and no OSC messages can come in. How was this envisioned? I'm sure I'm missing something again.

2. Is it possible to get current region names as textual feedback (the region in which the edit or play cursor resides)? I tried the "Marker/region: Go to" target but the only textual feedback I receive is "On" or "Off" depending on whether the cursor is inside the region.
1. No, you are not missing anything. That's just how it is. If you want to convert OSC to MIDI, you can only send the MIDI directly to a hardware output. It turned out to be non-trivial to implement this for <FX output> as well, and I didn't need it. Also, both "MIDI: Send message" and "OSC: Send message" are nice bonus features rather than something that was part of a bigger vision. If you need that feature, please raise a feature request on ReaLearn's issue tracker.

2. Not at the moment, but it would be trivial to implement. I could add a textual feedback expression "target.bookmark.name". Please request it on GitHub, thanks.

Quote:
Originally Posted by Hartley Mays View Post
I'm trying to figure out whether Realearn sysex support can handle the sysex messages that my Rodgers organ uses to send and receive stop settings. The individual stops are assigned to specific bits in the 44 byte sysex message sent to/from Reaper to indicate the setting for that stop.

I can get the full 44 byte message into the pattern field using the Learn option. I can then set a variable value for the byte whose bits I want to test. But then I don't see how/where to specify or test (input) or set (output) the specific bit within that byte? Also, if there is a way to do this, will it work properly to set up multiple mappings for each of the bits in that byte, and then the other bytes which may also have stop values? Most of the 44 bytes in the message are fixed, but several are needed to hold all the stop values, and I would need to be able to test just one bit of one of the bytes for each mapping, ignoring the other variable values.

I've got some prototype code working to do this decidedly convoluted stuff with Bome Midi Translator, but would much prefer to use Realearn if possible.
This is explained here: https://github.com/helgoboss/realear...aw-midi--sysex

Read "Extracing and encoding a value".
helgoboss is offline   Reply With Quote
Old 01-27-2022, 11:39 AM   #1957
Hartley Mays
Human being with feelings
 
Join Date: Sep 2009
Location: Cincinnati, Ohio
Posts: 307
Default Sysex - determine changed value

Thanks for the quick response. I did read that section of the guide, but I need to explain the problem better. I understand how I could extract a given bit value as a variable from the input sysex message and use that value in an output message. But that doesn't answer the question of whether there's a way to test the value first and compare it in some fashion to its previous state, to determine whether it was the bit that changed - the one that caused the sysex to be sent and that corresponds to the specific stop that was turned on/off. I guess a new output message for each stop could be generated, but that doesn't seem like a practical solution.
Hartley Mays is offline   Reply With Quote
Old 01-27-2022, 12:54 PM   #1958
BenjyO
Human being with feelings
 
Join Date: Nov 2011
Posts: 308
Default

Quote:
Originally Posted by helgoboss View Post
1. No, you are not missing anything. That's just how it is. If you want to convert OSC to MIDI, you can only send the MIDI directly to a hardware output. Turned out to be not trivial to implement this for <FX output> as well and I didn't need it. Also, both "MIDI: Send message" and "OSC: Send message" are rather nice bonus features than something that was part of a bigger vision. If you need that feature, please raise a feature request on ReaLearn's issue tracker.

2. Not at the moment, but it would be trivial to implement. I could add a textual feedback expression "target.bookmark.name". Please request it on GitHub, thanks.
1. Hmm ... I still don't get it. If I set "control input" to <FX input> I effectively disable my source of OSC messages (iPad) and only MIDI messages are allowed to come in from the track. Right? Where do the source OSC messages come from then? From other OSC devices connected to Reaper which are not connected to ReaLearn?

Maybe explaining what I'd like to achieve will help clarify how I see this "problem" and where the faults in my reasoning or the gaps in my understanding lie. This is what I'd like to achieve:
- Control Reaper with TouchOSC via ReaLearn and be able to receive the textual and numeric feedback which ReaLearn offers;
- be able to send MIDI messages from TouchOSC to Reaper (as if using a MIDI grid-controller like a Launchpad or Maschine)
- BUT I'd like to achieve this using only one type of connection - MIDI or OSC - not both. I'd like to avoid having 2 types of connection active at the same time from the same device.

The last point should also help clarify why I was stubbornly trying to use SysEx messages for textual feedback (which still seems doable, but OSC performed better with textual feedback in ReaLearn, and it's much easier to set up than scripting in TouchOSC to distribute all incoming SysEx messages to the appropriate controls). If, however, I want to simulate grid controllers or keyboards with pads using only OSC, that means I have to convert dedicated incoming OSC messages to MIDI note messages. So I thought the "MIDI: Send message" target would allow me to do that. I know Reaper can do that with its native OSC implementation via the Virtual MIDI Keyboard, but I believe that then requires setting up two OSC connections: one to ReaLearn and another directly to Reaper.

I hope all of this makes some sense but if it doesn't, never mind. I'll find a way

2. Thank you for considering it. Such an option would definitely be helpful and I'll open a request on GitHub.
__________________
Check out some of my music
BenjyO is offline   Reply With Quote
Old 01-27-2022, 01:33 PM   #1959
helgoboss
Human being with feelings
 
helgoboss's Avatar
 
Join Date: Aug 2010
Location: Germany
Posts: 2,199
Default

Quote:
Originally Posted by Hartley Mays View Post
Thanks for the quick response. I did read that section of the guide, but I need to explain the problem better. I understand how I could extract a given bit value as a variable from the input sysex message and use that value in an output message. What that doesn't answer is whether there's a way to test the value first and compare it to its previous state, to determine whether that bit is the one that changed (i.e. caused the sysex to be sent) and therefore corresponds to the specific stop that was turned on or off. I suppose a new output message could be generated for each stop, but that doesn't seem like a practical solution.
There's no such way. This is quite elaborate and would require some kind of scripting. If you are interested in the feedback direction only, the "MIDI script" source might serve you. This gives you complete freedom over what feedback you send. But there's nothing comparable at the moment for the control direction. I want to provide this one day but I'm waiting for the right moment because this would be much better with some additions to the REAPER API.
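
Just to give a rough idea of what the "MIDI script" source looks like in the feedback direction, here is an untested Lua sketch from memory (the variable name y, the returned messages field and the CC/channel numbers are assumptions of mine, so double-check the MIDI script section of the user guide):

-- Untested sketch of a ReaLearn Lua MIDI script (feedback direction).
-- Assumption: the script receives the current feedback value as `y`
-- (nil or a number between 0 and 1) and returns a table whose
-- `messages` field lists the raw MIDI messages to send.
if y == nil then
  -- No feedback value available: send nothing.
  return { messages = {} }
end
-- Scale the normalized value to 7 bits and send it as CC 7 on channel 1.
local cc_value = math.floor(y * 127 + 0.5)
return {
  messages = {
    { 0xB0, 0x07, cc_value }
  }
}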

Quote:
Originally Posted by BenjyO View Post
1. Hmm ... I still don't get it. If I set "control input" to <FX input> I effectively disable my source of OSC messages (iPad) and only MIDI messages are allowed to come in from the track. Right? Where do the source OSC messages come from then? From other OSC devices connected to Reaper which are not connected to ReaLearn?
They don't come from anywhere. If you use <FX input>, ReaLearn simply won't receive any OSC messages because only MIDI streams can flow through REAPER's FX chains, not OSC.

Quote:
Originally Posted by BenjyO View Post
Maybe explaining what I'd like to achieve will help clarify how I see this "problem" and where the faults in my reasoning or the gaps in my understanding lie. This is what I'd like to achieve:
- Control Reaper with TouchOSC via ReaLearn and be able to receive the textual and numeric feedback which ReaLearn offers;
For this you must select the same OSC device for both control input and feedback output. You won't need "OSC: Send message" or "MIDI: Send message" at all. These are just small utility targets for doing some MIDI/OSC conversion stuff.

Quote:
Originally Posted by BenjyO View Post
- be able to send MIDI messages from TouchOSC to Reaper (as if using a MIDI grid-controller like a Launchpad or Maschine)
- BUT I'd like to achieve this using only one type of connection - MIDI or OSC - not both. I'd like to avoid having 2 types of connection active at the same time from the same device.
Then it's best to use OSC, e.g. addresses like /pad/0/0, /pad/0/1, etc.

Quote:
Originally Posted by BenjyO View Post
The last point should also help clarify why I was stubbornly trying to use SysEx messages for textual feedback (which still seems doable, but OSC performed better with textual feedback in ReaLearn, and it's much easier to set up than scripting in TouchOSC to distribute all incoming SysEx messages to the appropriate controls). If, however, I want to simulate grid controllers or keyboards with pads using only OSC, that means I have to convert dedicated incoming OSC messages to MIDI note messages. So I thought the "MIDI: Send message" target would allow me to do that. I know Reaper can do that with its native OSC implementation via the Virtual MIDI Keyboard, but I believe that then requires setting up two OSC connections: one to ReaLearn and another directly to Reaper.
Okey dokey, now I get it! You want to use the TouchOSC buttons to play an instrument (not to control some parameters), which is usually done using MIDI, not OSC. And this without 2 connections ... let me think. But you will have two connections anyway, no?

- TouchOSC ==OSC==> ReaLearn
- TouchOSC <==OSC== ReaLearn

What's so bad about the following then?

- TouchOSC ==OSC==> TouchOSC Bridge ==MIDI==> REAPER
- TouchOSC <==OSC== ReaLearn

Even if I made "MIDI: Send message" with output "<FX output>" compatible with OSC sources, I fear the timing might not be 100% satisfactory. On the other hand ... as long as you don't play Beethoven on TouchOSC, I guess that wouldn't matter much. I could give it a try ... feel free to open a FR.

May I know which virtual instruments you want to control with TouchOSC?
helgoboss is offline   Reply With Quote
Old 01-27-2022, 02:42 PM   #1960
BenjyO
Human being with feelings
 
Join Date: Nov 2011
Posts: 308
Default

Quote:
Originally Posted by helgoboss View Post
They don't come from anywhere. If you use <FX input>, ReaLearn simply won't receive any OSC messages because only MIDI streams can flow through REAPER's FX chains, not OSC.
Hmm ... So ReaLearn can't convert OSC to MIDI then? I was led to believe it can. The following statement in the user guide helped fuel that belief: "This target turns ReaLearn into a capable and convenient MIDI → MIDI and OSC → MIDI converter." But I think I'm misunderstanding something.

Quote:
Originally Posted by helgoboss View Post
Okey dokey, now I get it! You want to use the TouchOSC buttons to play an instrument (not to control some parameters), which is usually done using MIDI, not OSC. And this without 2 connections ... let me think. But you will have two connections anyway, no?

- TouchOSC ==OSC==> ReaLearn
- TouchOSC <==OSC== ReaLearn

What's so bad about the following then?

- TouchOSC ==OSC==> TouchOSC Bridge ==MIDI==> REAPER
- TouchOSC <==OSC== ReaLearn

Even if I made "MIDI: Send message" with output "<FX output>" compatible with OSC sources, I fear the timing might not be 100% satisfactory. On the other hand ... as long as you don't play Beethoven on TouchOSC, I guess that wouldn't matter much. I could give it a try ... feel free to open a FR.
Well, I want to accommodate both. Dedicated buttons would control Reaper in various ways, including FX parameters. Other dedicated buttons (a 4x4 button grid) would serve as a MIDI drum pad.
I meant 2 two-way connections. What you're referring to in your first example is what I thought of as 1 two-way connection.
The second example seems a bit off to me. AFAIK TouchOSC Bridge doesn't support OSC. It could work like this:
- TouchOSC ==MIDI==> TouchOSC Bridge ==MIDI==> REAPER
- TouchOSC <==OSC== ReaLearn
But that is what I meant by having 2 two-way connections: one would be an "IP: port" connection to accommodate OSC, and the other would be a TouchOSC Bridge connection. If I take that to the "offline" world with a wired-only connection (another goal I failed to mention in my previous post), wouldn't that mean I'd need 2 cables connecting my computer and my iPad? An Ethernet cable (for the OSC connection) and a USB-to-Lightning cable (for the MIDI connection via TouchOSC Bridge), plus all the adapters to enable such a marvelous "cable-dongle" feat. I hope this makes some sense.

I was under the impression that I could set OSC as the source and "MIDI: Send message" as the target in ReaLearn, then press my pads in TouchOSC, which would send OSC messages to ReaLearn, where they would be converted to MIDI and then sent on to a VST.

But since you say such an implementation could cause timing issues, I don't think it would be a good option. I'd like a decent timing representation of what was played.

Anyway, I shouldn't bother you with this anymore. Thanks for taking the time to talk this through, though, and of course for the tremendous work you've put into this plug-in. I appreciate it.

Quote:
Originally Posted by helgoboss View Post
May I know which virtual instruments you want to control with TouchOSC?
I would control and play RS5k instances, Battery 4 or other similar sample players - mainly triggering individual samples.

I guess I could best describe my goal as trying to emulate the workflow of the NI Maschine sampler or Ableton Push with the added DAW control (all with an iPad).
It is probably best if I start saving for a grid controller.
__________________
Check out some of my music
BenjyO is offline   Reply With Quote