My MIDI Omni In module feeds a MIDI Filter module set to block a particular CC message. When I click the Learn button for a widget, however, the Learn function appears to be taking input directly from the MIDI In and bypassing the MIDI Filter. Since the MIDI Filter is presumably removing unwanted messages, I think the Learn function should be able to listen to the output of the filter, whether by default or as an option.
MIDI Learn always happens before any processing. You would run into many issues if that were not the case. For example, suppose you map CH1 input to CH10 output: what should happen when the widget has learned CH10 and you then change that mapping in the MIDI In block?
And when you take a look at the Rig Manager (highly recommended), it is completely independent of any MIDI In blocks used in rackspaces. That independence is absolutely necessary, because only this way can you change your hardware without having to reconfigure all your rackspaces.
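To make the ordering concrete, here is a minimal Python sketch of the CH1-to-CH10 example above. This is not Gig Performer code; all the names are invented for illustration:

```python
# Hypothetical sketch of "Learn taps the stream before any processing".
# None of these names come from Gig Performer's actual implementation.

def remap_channel(event, src=1, dst=10):
    """MIDI In block setting: remap events from channel `src` to `dst`."""
    if event["channel"] == src:
        return {**event, "channel": dst}
    return event

learned = {}  # widget -> (channel, cc) binding

def learn(widget, event):
    """Learn listens to the RAW event, before remapping."""
    learned[widget] = (event["channel"], event["cc"])

raw = {"channel": 1, "cc": 7, "value": 100}

# Learn before processing: the widget binds to the physical knob (CH1).
learn("volume_widget", raw)
assert learned["volume_widget"] == (1, 7)

# If Learn listened AFTER the remap instead, the widget would bind to
# CH10 -- and break as soon as you changed that setting in the MIDI In
# block, which is exactly the problem described above.
processed = remap_channel(raw)
assert processed["channel"] == 10
```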
More important from a conceptual perspective, the whole point of widgets is to separate what drives a widget (i.e., some MIDI event from an external device) from how that widget drives a parameter (host automation, or some completely different MIDI event). Widgets don’t see MIDI events once they enter the graph (layout view), and once a widget has learned an external MIDI event, that MIDI event will never appear in a MIDI In block in the graph.
I ran into this same issue a while back. I don’t think your conceptual perspective is very evident in the layout. We see a MIDI In block (I’ll call it ‘A’ in the signal chain). We see the ability to filter with a MIDI Filter block (I’ll call it ‘B’ in the signal chain). We connect that to a plugin (I’ll call it ‘C’ in the signal chain).
When we play, all the MIDI messages for ‘C’ are going through ‘A’ and then through ‘B’. We can assign widgets to various functions of A, B, and C independently—but unlike the actual playing, we can’t assign a widget to a function of C through B. That’s where the confusion lies.
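The A → B → C chain described above can be sketched in plain Python (invented names, purely illustrative) to show exactly where the confusion arises: Learn taps A’s raw output, not B’s filtered output:

```python
# Sketch of the A -> B -> C signal chain from the post above.
# All names are illustrative; this is not Gig Performer's API.

def midi_in_A(events):
    """A: MIDI In block passes the raw stream along."""
    return events

def midi_filter_B(events, blocked_cc=11):
    """B: MIDI Filter block set to block a particular CC message."""
    return [e for e in events if e.get("cc") != blocked_cc]

def plugin_C(events):
    """C: plugin receives whatever survives the filter."""
    return events

stream = [{"cc": 7, "value": 1}, {"cc": 11, "value": 2}]

# Playing: C sees only what survives B.
assert plugin_C(midi_filter_B(midi_in_A(stream))) == [{"cc": 7, "value": 1}]

# Widget Learn: listens to the raw stream from A, so the blocked CC 11
# is still visible to Learn even though B stops it from reaching C.
seen_by_learn = midi_in_A(stream)
assert any(e["cc"] == 11 for e in seen_by_learn)
```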
At present, something like Bome MIDI Translator is needed to accomplish what the OP is looking for. Might you be able to extend the functionality of the MIDI Filter blocks to provide more translation abilities as well?
Can you explain what you’re trying to do here? Maybe I misunderstood something in this topic
There are two kinds of widget learning:
- Widget learns an incoming MIDI event from a physical controller knob (say) so that you can turn that widget by turning the knob on your physical controller
- A widget can learn a single parameter of a plugin so that turning the widget (either with your mouse or via your physical controller) adjusts the value of that parameter
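As a rough sketch of why those are two independent mappings (invented names, not Gig Performer code): the incoming MIDI event resolves to a widget, and the widget separately resolves to a plugin parameter:

```python
# Two independent mappings, matching the two kinds of learning above.
# All names are invented for illustration.

# 1. External MIDI event -> widget (what drives the widget)
widget_bindings = {(1, 7): "volume_widget"}  # (channel, CC) -> widget

# 2. Widget -> plugin parameter via host automation (what the widget drives)
parameter_bindings = {"volume_widget": ("ReverbPlugin", "wet_level")}

def on_midi(channel, cc, value, plugin_params):
    """Route an incoming CC through both mappings."""
    widget = widget_bindings.get((channel, cc))
    if widget is None:
        return  # no widget learned this event
    target = parameter_bindings.get(widget)
    if target is not None:
        # Host automation: scale 0..127 to a normalized 0..1 parameter.
        plugin_params[target] = value / 127.0

params = {}
on_midi(1, 7, 127, params)
assert params[("ReverbPlugin", "wet_level")] == 1.0
```

Because the two tables are independent, re-learning the hardware side (mapping 1) never touches what the widget controls (mapping 2), and vice versa.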
So what is assigning a widget to a function of C through B?
That is absolutely on our list, though it will happen as a separate block.
I can’t recall the precise details of my example, but it involved two identical small foot controllers with no ability to change any parameters onboard. I would set up each MIDI In block to use different channels so that, for example, PC1 on Controller 1 wouldn’t activate any PC1 commands assigned to Controller 2. My hope was that since I had assigned different channels at the MIDI In block level, any widgets used would follow those same rules. (I may be hazy on the exact details, but it was something similar.)
I ended up getting Bome MIDI Translator to accomplish the task, but I was wondering why I had to. I understand your explanation of the design, but as I said, that architecture isn’t altogether evident from how the signal chains otherwise operate.
Glad to hear that you’ll incorporate that into subsequent versions. Being able to translate and filter MIDI signals for widgets at a higher or lower level in the architecture than currently applies will be useful, I believe.
I still don’t understand. If you have two foot controllers, why aren’t you just associating each one with its own corresponding widget and then using host automation to control whatever you need to control?