Go Back   Cockos Incorporated Forums > REAPER Forums > REAPER for Live Use

Old 12-14-2015, 10:06 AM   #1
Tom Swirly
Human being with feelings
 
Join Date: Dec 2015
Posts: 6
Default Refugee from Ableton/Max-land wanting to port large JS codebase.

Greetings, fellow Reapees!

This post will be fairly long, as I have a fairly substantial task at hand - I both want to get your advice and "talk out" my problem.

I finally realized that Ableton was never going to fix the bugs that I and others have been complaining about for years so I decided to jump ship to Reaper.

I have a pretty large library of Javascript for Max/Max For Live that I wrote for my solo music show, where I play my songs and a few covers with DMX lighting controlled algorithmically from an OS X machine - it's here: https://github.com/rec/swirly

I started using Javascript to do this because programming with boxes and wires in Max makes generating large, stable programs very difficult - however, I see that Reaper preferentially supports C++ and Python, languages I know intimately and prefer (JS is a pretty neat language, mind you).

I got to a 1.0 state where I could do shows in the middle of this year and did a bunch. Then I started to rewrite it to be able to do longer and more complex shows - but version 2.0 got only half-done before I remembered how frustrating Ableton is - see here:
https://docs.google.com/document/d/1...qSoU8oaSw/edit

So it seems to me I have the following possibilities:

1. Finish version 2.0 in Javascript in Max, and communicate with Reaper using OSC.
2. Port the Javascript to run with Reaper.
3. Port to Reaper as Python.
4. Port to Reaper as C++.
5. Port to Reaper as C++ and Python (either separate modules or using Cython).

---

Now (thanks for listening) let me tell you more about what I'm doing.

I have a repertoire of songs (spacey weird pop songs, generally cheerful?) which I play solo. The computer sits where I cannot see it, and a show is arranged completely in advance from my collection of songs (actually, due to the limitations of Ableton, I generally played exactly the same show every time).

I own a bunch of controllers but at this time I use exactly one - a Yamaha WX-7 electronic wind controller which has a dedicated hardware sound generator and also sends MIDI to the computer.

I also have a microphone and I sing.

I have a small number of DMX lights and lasers that are synchronized to the music, but also triggered by my playing. The WX-7 sends MIDI notes, breath control, pitch bend (effectively only 6-bit precision, what can you do?) and program changes 1 through 5. A typical setting might be "note controls color, breath controls brightness".
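To make that concrete, here is a sketch of the kind of mapping I mean, in Python (the function and names are made up for illustration; it assumes breath arrives as a 0-127 CC value):

```python
import colorsys

def note_to_rgb(note, breath):
    """Map a MIDI note (0-127) to a hue and breath (0-127) to brightness,
    returning 8-bit RGB values suitable for a DMX color fixture."""
    hue = (note % 12) / 12.0   # pitch class picks a spot on the color wheel
    value = breath / 127.0     # breath pressure sets brightness
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return (round(r * 255), round(g * 255), round(b * 255))
```

That really is about the level of sophistication involved - the note picks a color, the breath fades it in and out.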

While you don't want the lights to lag _too_ far behind the audio, synchronization is really not an issue. I've done experiments where I deliberately introduced a lot of lag and jitter into the system and it was hard to notice unless I really turned it up. I've never had an issue with that.

The actual Javascript algorithms to map the MIDI to the DMX are really pretty primitive. More smarts goes into making the documents describing the scenes easy to read, and I have a mechanism to describe lighting instruments with documents like this one: https://github.com/rec/swirly/blob/m...efinition.json (yes, I know it's broken at the end, you can see the spot where I stopped and said, "I can't work with Ableton any more...")

And the key part is of course sequencing this to the music. I currently have little JSON documents describing the state at each scene, and then a complex Rube Goldberg mechanism of sends to and from multiple other Max For Live objects and the IAC bus to get around various issues described in that document. I could keep the JSON and reimplement the rest entirely - and perhaps with a lot less work.
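The scene documents are nothing fancy - something along these lines (the field names here are invented for illustration, not my actual schema):

```python
import json

# A hypothetical scene list: each scene names a MIDI->DMX mapping preset
# and any fixed DMX channel values to set when the scene is entered.
SCENES_JSON = """
[
  {"name": "intro",  "mapping": "note_to_color",   "static": {"1": 255}},
  {"name": "chorus", "mapping": "breath_to_strobe", "static": {"1": 0, "7": 128}}
]
"""

def load_scenes(text):
    scenes = json.loads(text)
    # Normalize channel keys to ints so a DMX sender can index directly.
    for scene in scenes:
        scene["static"] = {int(ch): val for ch, val in scene.get("static", {}).items()}
    return scenes

scenes = load_scenes(SCENES_JSON)
```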

---

So back to options 1 through 5. If I were advising someone else, I'd say, "Keep all the code you already have in Javascript".

But, well, I like to code. A lot of that Javascript was already ported from my own real-time Python libraries (https://github.com/rec/echomesh), particularly the color code. I have a handrolled unit test system in Javascript, but it's only sort of effective for what I'm doing. I'd love to throw that away.

Most of the Javascript is getting around the deficiencies of the Ableton/Max For Live ecosystem (the Love Canal of the DAW world). I'd have to throw that away anyway.

The one issue is that I only have a Max driver for my DMX interface. Now, this is a DMX USB Pro-compatible unit. Someone might very well have already written something I can use. OR, I could have a tiny standalone Max patch that simply receives OSC and sends it on to DMX. It's one more moving part, but sometimes you can't avoid moving parts.
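Actually, if the unit really speaks the Enttec widget serial protocol, the framing is simple enough that a bridge might not even need Max - a sketch of building the "Output Only Send DMX" message (label 6), as I understand that protocol:

```python
def enttec_dmx_packet(channels):
    """Frame DMX channel values (ints 0-255, up to 512 of them) as an
    Enttec DMX USB Pro 'Output Only Send DMX' message (label 6):
    0x7E, label, length LSB, length MSB, data, 0xE7."""
    data = bytes([0]) + bytes(channels)   # DMX start code 0, then channels
    n = len(data)
    return bytes([0x7E, 6, n & 0xFF, n >> 8]) + data + bytes([0xE7])
```

Writing that to the widget's serial port at 30-40 Hz would be the whole driver, more or less.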

--

So _if_ I can do this entirely in Python then I think it's the way to go. Unless there is some overwhelming reason that either C++ or Javascript is a more effective solution. Don't get me wrong - I love C++, it's what I do for a living, but I can develop significantly faster in Python than in C++ (though C++11 has definitely evened the score somewhat).

So is this attainable in Python? Reading a score and scenes from Json files and then sending out OSC as Reaper plays? Or is there some reason that doing it in C++ or JS would be better?
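(For what it's worth, OSC itself is simple enough to emit by hand if no Python OSC library fits - a minimal sketch of encoding a message with float arguments, per the OSC 1.0 layout of 4-byte-padded strings and big-endian floats:)

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message with float32 arguments, suitable for
    sending in a UDP datagram."""
    def osc_string(s):
        b = s.encode() + b"\x00"            # null-terminated...
        return b + b"\x00" * (-len(b) % 4)  # ...padded to a 4-byte boundary
    msg = osc_string(address) + osc_string("," + "f" * len(floats))
    for f in floats:
        msg += struct.pack(">f", f)         # big-endian float32
    return msg
```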

Thanks again for listening to me work this out!
Tom Swirly is offline   Reply With Quote
Old 12-14-2015, 10:21 AM   #2
Xenakios
Human being with feelings
 
Xenakios's Avatar
 
Join Date: Feb 2007
Location: Oulu, Finland
Posts: 7,711
Default

Your post is kind of long and would need to be addressed in detail... However, one thing to take note of immediately: Reaper's "JS" is not "JavaScript" but "JesuSonic". They are completely different languages. Reaper doesn't have any built-in support for JavaScript.
__________________
For info on SWS Reaper extension plugin (including Xenakios' previous extension/actions) :
http://www.sws-extension.org/
https://github.com/Jeff0S/sws
--
Xenakios blog (about HourGlass, Paul(X)Stretch and λ) :
http://xenakios.wordpress.com/
Xenakios is online now   Reply With Quote
Old 12-14-2015, 12:51 PM   #3
Tom Swirly
Human being with feelings
 
Join Date: Dec 2015
Posts: 6
Default

Excellent then. So much the better, it eliminates Javascript entirely! Thank you.

That sort of thing is really just what I'm looking for, rather than an in-detail analysis - I know it's too long. :-)

Any reason why Python might be problematic - features you can only get in C++, severe performance implications?
Tom Swirly is offline   Reply With Quote
Old 12-14-2015, 01:00 PM   #4
Xenakios
Human being with feelings
 
Xenakios's Avatar
 
Join Date: Feb 2007
Location: Oulu, Finland
Posts: 7,711
Default

Quote:
Originally Posted by Tom Swirly View Post
So is this attainable in Python? Reading a score and scenes from Json files and then sending out OSC as Reaper plays?
That isn't really going to work too well with any of Reaper's scripting language options (Python, EEL, Lua). The scripting languages don't have the means to hook into the audio engine in a way that would allow playing back timed events accurately. You could maybe achieve something approximate via the defer mechanism, which allows a scripting language function to be called repeatedly from a timer. You can't, however, control that timer interval yourself, nor will the time intervals be completely predictable. Well, you did mention the sync doesn't need to be very accurate anyway...? I am not sure what you can expect to get with the defer timer. Maybe around 50 function calls per second? (I've never extensively tested that.)
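To illustrate what that timer granularity costs: a pure-Python simulation (not actual ReaScript) of firing scheduled events from a polling timer - each event goes out on the first tick at or after its scheduled time, so the worst-case lateness is one whole timer interval:

```python
def simulate_defer(event_times, interval=0.030):
    """Simulate a defer-style polling timer firing scheduled events.
    Returns the worst-case lateness in seconds across all events."""
    worst = 0.0
    for t in event_times:
        ticks_needed = -(-t // interval)   # ceiling division: first tick >= t
        fired_at = ticks_needed * interval
        worst = max(worst, fired_at - t)
    return worst
```

With a ~30 ms interval the lateness stays under one frame of video, which matches the original poster's observation that lighting lag in that range is hard to notice.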

With C++ you can implement a subclass of Reaper's PCM_source that can result in more predictable and accurate timing, but even with that things might turn out tricky. (I've done such a subclass myself that was able to send OSC messages but I didn't develop that thing very far, it was just for some preliminary testing purposes.)
Xenakios is online now   Reply With Quote
Old 12-14-2015, 02:56 PM   #5
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 7,241
Default

Quote:
Originally Posted by Tom Swirly View Post
...
There are several points in your post that trigger my comments.

1) Javascript language: Reaper features a built-in scripting language called "EEL" that is different from, but similar in concept to, JavaScript (both are derived from C, but designed for a less "strict" programming style). So coming from JS, I suppose you will be up to speed with EEL in no time. Reaper uses EEL both for VST-like "JesuSonic" plugins (living alongside VSTs and other plugins in the tracks' effect chains, handling MIDI and audio in and out) and for "scripts" that influence the behavior of Reaper itself through an appropriate API. I did a number of EEL plugins for customizing the "live" use of Reaper, for myself (playing two master keyboards and a TEC "BBC" breath controller - I buried my old WX7 when I got <the brand new pre-release version of> the BBC) and for a friend who plays an EWI wind controller. I wrote a description of the set of EEL plugins I did for him -> http://www.bschnell.de/patch.pdf

2) Using Reaper "live": Here the free SWS "LiveConfigs" extension already provides great functionality without you needing to do any programming. Here is a revised manual for it -> http://www.bschnell.de/LiveConfigs_1.pdf. I mainly use Reaper for live playing with VST instruments and effects. Works great, awesome stability (Windows 7)!

3) DMX: Another friend of mine is about to release (sell) a little device called "ADMX/2" that (among other things) allows for attaching Reaper to DMX lighting equipment via EEL plugins that send out MIDI messages. The EEL plugins take (reprogrammed) Reaper "envelopes" or sound output from other tracks as input. (Of course MIDI output from tracks could be used as well.)

Please let me know if you have any specific questions on these issues.

-Michael

Last edited by mschnell; 12-14-2015 at 11:32 PM.
mschnell is offline   Reply With Quote
Old 12-14-2015, 03:57 PM   #6
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 7,241
Default

Quote:
Originally Posted by Xenakios View Post
The scripting languages don't have the means to hook into the audio engine in a way that would allow playing back timed events accurately.
I did not try this, but I understand that it should be possible to have MIDI sync messages sent to a JSFX (in fact I seem to have noticed that my keyboards do that), and a JSFX can generate MIDI messages based on that signal. The midisend function (e.g. in the @block section) provides an "offset" parameter to position the MIDI message at a timing point more accurate than the size of a sample block.
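The idea behind that offset parameter can be sketched like this (illustrated in Python rather than EEL, with names of my own invention): you convert the event's absolute time into a sample count within the current audio block, which gives sample-accurate placement instead of block-sized granularity.

```python
def block_offset(event_time, block_start_time, samplerate, block_size):
    """Convert an event's absolute time (seconds) into the sample offset
    within the current audio block, in the spirit of JSFX midisend()'s
    offset parameter."""
    offset = round((event_time - block_start_time) * samplerate)
    return max(0, min(block_size - 1, offset))  # clamp into this block
```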

-Michael
mschnell is offline   Reply With Quote
Old 12-14-2015, 04:20 PM   #7
Xenakios
Human being with feelings
 
Xenakios's Avatar
 
Join Date: Feb 2007
Location: Oulu, Finland
Posts: 7,711
Default

Quote:
Originally Posted by mschnell View Post
I did not try this, but I understand that it should be possible to have MIDI sync messages sent to a JSFX (in fact I seem to have noticed that my keyboards do that), and a JSFX can generate MIDI messages based on that signal. The midisend function (e.g. in the @block section) provides an "offset" parameter to position the MIDI message at a timing point more accurate than the size of a sample block.

-Michael
Sure, I was ignoring JesuSonic when writing about the scripting languages. The original poster also seemed to want OSC instead of MIDI and if I recall right, JS has no support for that. But yeah, if MIDI messages are enough, he can probably do something as a JS plugin.
Xenakios is online now   Reply With Quote
Old 12-14-2015, 05:44 PM   #8
cturner
Human being with feelings
 
cturner's Avatar
 
Join Date: Apr 2009
Location: GWB
Posts: 76
Default

Hey Swirly!

Good to hear from you here.

You've gotten some good advice above. What I might add is to think about Lua, which duplicates JS's data-as-code aspect, letting you create a bunch of Lua data files that can be loaded and run. Perhaps a simpler, better choice than Python.

Look for the "ReaScript documentation" in the Help menu.

Investigate Jesusonic and/or Eel as M4L substitutes to do the MIDI fiddling.

It may make sense to jettison the OSC for straight MIDI. (I'm not at all familiar with Reaper OSC capabilities.)

Port, port, port!

'Appy 'Olidays!

Tad
cturner is offline   Reply With Quote
Old 12-14-2015, 11:19 PM   #9
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 7,241
Default

Quote:
Originally Posted by Xenakios View Post
Sure, I was ignoring JesuSonic when writing about the scripting languages. The original poster also seemed to want OSC instead of MIDI and if I recall right, JS has no support for that. But yeah, if MIDI messages are enough, he can probably do something as a JS plugin.
Yep.

But I suppose he should re-think, as for live performance JesuSonic plugins make a lot more sense. Especially as he wants to control his setup with one or more Midi Controllers (WX7, ...).

I do this all the time. In fact, I never tried to do a "Reaper Script". I am really happy with what SWS LiveConfigs provides to control Reaper internals (muting/unmuting tracks, pushing Parameters onto plugins, executing Reaper actions (with this also ReaScript - I did not try this), ...).

All this is now controlled by MIDI CC messages. Once you find out that you can generate/modify these MIDI messages in any track's effect chain and direct them towards LiveConfigs via the MidiToReaControl plugin, there is no need to bother with OSC or ReaScript (and the associated timing issues) in a live-performance situation playing with MIDI controllers.

-Michael

Last edited by mschnell; 12-14-2015 at 11:34 PM.
mschnell is offline   Reply With Quote
Old 12-15-2015, 02:44 AM   #10
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 7,241
Default

BTW.:

The latest addition to my live setup (though not too specific to exactly this setup) is a small pure-MIDI JSFX plugin that comfortably allows transposing and adding a sub-octave, controlled purely via the MIDI input (through definable CCs from master-keyboard switches).

A rather simple thingy, but it might serve as a "getting started" example for realtime MIDI handling via JSFX for live usage - including somebody to ask questions about it.

Of course there are lots of MIDI and audio JSFX examples to look at, coming with Reaper out of the box.

-Michael
mschnell is offline   Reply With Quote
Old 12-15-2015, 03:18 AM   #11
Geoff Waddington
Human being with feelings
 
Geoff Waddington's Avatar
 
Join Date: Mar 2009
Location: Dartmouth, Nova Scotia
Posts: 3,026
Default

Some great advice here, I would add only one thing (especially since you say you like to code).

Whether to port or not.

From a "production coding" viewpoint, "keep what you already have" is certainly almost always right.

But through more "R&D" glasses and depending on overall code base size...

- Nice chance to redesign/refactor/whatever that section that has been driving you nuts, but you just haven't found the time to fix.

- Chance to experiment with different ways (MIDI vs. OSC, etc.) of implementing the previous design, with an eye toward better performance/stability/future upgrading - and toward something that can be important in real-time work: harmonization with Reaper's soul, if you will. Reaper is extremely stable; however, some approaches to interacting with the outside world may work better than others.

-- Well you know where the rest of this goes
__________________
CSI - You can donate here: geoffwaddington.ca
Pre alpha software: https://stash.reaper.fm/v/33037/CSI%20pre%20alpha.zip
Reaper forum thread: https://forum.cockos.com/showthread.php?t=183143
Geoff Waddington is offline   Reply With Quote
Old 12-17-2015, 02:49 PM   #12
Tom Swirly
Human being with feelings
 
Join Date: Dec 2015
Posts: 6
Default Thanks for all the good ideas!

I still need to digest all this information.

I'm not entirely opposed to learning a new language, but, for example, I have already written my color manipulation routines in three languages - C++, Python and Javascript - and I'd love to avoid doing a fourth. I do appreciate making the code better, but the reason I am here is that I was spending too much time programming and not enough making music! :-)


Timing isn't TOO critical, as I said. I'm relying on the sequencer for its timing for critical stuff. "Score" was probably a bad choice of name - it's more like a list of "scenes", each of which remaps MIDI onto lighting. But I would like to be doing more stuff with LFOs and lighting.

There are two reasons for using OSC.

One is just that there's a reasonable amount going on, and having human-readable messages is better. BUT I really haven't had a good system that used OSC before, and I've managed to encode everything through MIDI.

The other is more technical - MIDI is 7-bit and DMX is 8-bit. I've run into problems with this before, and it isn't just that you lose a bit of precision - it's that with a MIDI->DMX converter there are sometimes specific patches, settings or values that you need to hit exactly and cannot reach (though no instrument I own today has that issue).
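That reachability problem is easy to demonstrate - scale the 7-bit range onto the 8-bit range and half the DMX values simply can't be produced:

```python
def midi7_to_dmx8(v):
    """Scale a 7-bit MIDI value (0-127) onto the 8-bit DMX range (0-255)
    so that both endpoints are reachable."""
    return v * 255 // 127

# Only 128 of the 256 possible DMX values can ever be hit this way.
reachable = {midi7_to_dmx8(v) for v in range(128)}
```

So if a fixture needs, say, exactly 254 on some channel to select a mode, no MIDI value will get you there.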

So I have a plan - and my plan is (for a change) not to have a specific plan!

I'm not going to try to port my existing show for a while. Instead, I'm going to start writing new material and then try to hook in the lighting gear according to my existing plans, but coming fresh to Reaper with no preconceptions.

This means I can experiment, fail several times (something I have no real option to do if I'm porting an existing codebase), try things that I might not even want to use.

At some point, I'll have an idea of "best practices" in Reaper, and can move forward.

My theory is that I'll still end up with Max hanging around for quite a long time, if only because the super-standard, very solid driver for my specific DMX interface is in Max, and once that's there, I might as well use my existing work there.

It might well be that much of the logic stays in Max, then. My synchronization wants are primitive - if Max simply got a message every beat from Reaper and used free-running LFOs, your eyes would never be able to detect a few ms of latency or jitter in there.
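Such a beat-resynced free-running LFO is only a few lines - a hypothetical sketch (class and names are mine, for illustration):

```python
import math

class BeatSyncedLFO:
    """A free-running sine LFO whose phase is snapped back to zero each
    time a beat message arrives, so clock jitter never accumulates."""
    def __init__(self, freq_hz):
        self.freq = freq_hz
        self.phase = 0.0

    def on_beat(self):
        self.phase = 0.0               # hard resync on every beat message

    def sample(self, dt):
        """Advance by dt seconds and return a 0..1 control value."""
        self.phase = (self.phase + self.freq * dt) % 1.0
        return 0.5 + 0.5 * math.sin(2 * math.pi * self.phase)
```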


Thanks again, I'll keep you posted over the next few months. All of this will be open-source so you can point and laugh and hopefully avoid my mistakes.


(And for the programming geeks: I'm in a pretty happy state right now, because I'm using the same sort of functional-programming-with-side-effects style in all three of my languages - Python, C++ and Javascript - thanks to the miracle of C++11 and `std::function`...)
Tom Swirly is offline   Reply With Quote
Old 12-17-2015, 11:14 PM   #13
mschnell
Human being with feelings
 
mschnell's Avatar
 
Join Date: Jun 2013
Location: Krefeld, Germany
Posts: 7,241
Default

Quote:
Originally Posted by Tom Swirly View Post
I have written already in three languages - C++, Python and Javascript - I'd love to avoid doing a fourth one.
You can happily use C++ in standard VSTs (to be loaded in any DAW and similar software, living in the tracks' FX chains) and in dedicated Reaper plugins (which can use the VST API plus a proprietary Reaper API connecting them to the guts of Reaper).

BTW: there is an EEL-to-C converter (creating VSTs), but not vice-versa.


Quote:
Originally Posted by Tom Swirly View Post
it's that MIDI is 7-bit and DMX is 8-bit.
The Reaper-to-DMX system from the friend I mentioned sends MIDI SysEx messages over a standard MIDI channel via USB to the box that creates the DMX output. The SysEx messages hold 8-bit (or wider) values, plus information such as scenes and file names. BTW, he found that to do a really decent show, the DMX information needs to be created at a faster rate than just once per audio sample block.
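I don't know the ADMX/2's exact format, but one common scheme for carrying 8-bit data inside 7-bit-safe SysEx bytes looks like this (illustrative sketch in Python, not his actual protocol): for each group of up to 7 data bytes, send one byte collecting their MSBs, followed by the 7-bit remainders.

```python
def pack_sysex(data):
    """Pack 8-bit bytes into 7-bit SysEx-safe bytes: one MSB-collector
    byte per group of up to 7 data bytes, then the low 7 bits of each."""
    out = bytearray()
    for i in range(0, len(data), 7):
        group = data[i:i + 7]
        msbs = 0
        for j, b in enumerate(group):
            msbs |= ((b >> 7) & 1) << j   # stash each byte's top bit
        out.append(msbs)
        out.extend(b & 0x7F for b in group)
    return bytes(out)

def unpack_sysex(packed):
    """Invert pack_sysex, restoring the original 8-bit bytes."""
    out = bytearray()
    i = 0
    while i < len(packed):
        msbs = packed[i]
        group = packed[i + 1:i + 8]
        out.extend(b | (((msbs >> j) & 1) << 7) for j, b in enumerate(group))
        i += 1 + len(group)
    return bytes(out)
```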


-Michael
mschnell is offline   Reply With Quote