Multi-User Setup Recommendations?

Hi All.

Just recently bought into Gig Performer after trialing and tracking it for a year or so. Excellent product. Loving what this can do!

That said, I’m still a bit unclear on how to make my dream a reality. Here’s what I’m going for:

My 3-member rock band includes:
-Singing Drummer on eDrums with a MIDI foot pedal and a sidearm 25-key keyboard MIDI controller
-Singing Guitarist with MIDI foot pedal and a sidearm 44-key MIDI controller… occasionally switches to bass
-Singing Bassist (me) with three keyboards, foot controller… often plays basslines on synth with left hand and keys with right… occasionally switches to guitar or chapman stick
-Behringer X32 connected to network and MIDI for remote control
-OSC-controlled DMX interface for lighting control… with an HDMI output for future flat screen fun
-Band Helper on my iPhone for song selection
-Gig Performer on a PC tower connected to the X32 via USB

My concept would have each song called up via a single program change from my iPhone. This would call up virtual synths, key mappings, FX Plugins on mixer channels, lighting scenes, drum samples, and a simple mix scene (with, for example, the lead singer’s channel up by 8dB.)

It would also configure an instance containing a rackspace which would ‘re-interpret’ the functions assigned to each pedal of all three MIDI footswitches… to allow for the inclusion of song-specific actions (temporary song-specific effects via holding down of momentary switches, song-specific lighting scene changes at important parts of songs, triggered samples, etc) along with general actions. (Lights blackout, Intermission scenes with mixer muting, stage mute for talking to other band members privately through IEMs, Click/Track start/stop, Vocal and Instrumental ‘Solo’ controls for boosting channels at the mixer, etc.)

Most of the above functions have been tested on their own and seem to work just fine. (Although I haven’t figured out how to control an OSC slave from GP yet…) However, I have trouble figuring out how to handle multiple gig instances (i.e. Synth Bank / Drums / Channel FX / Show Control) from a single ‘Master’ instance.

So far, I’ve been able to grab incoming MIDI streams at the Master instance and use LoopMIDI to send them to other instances… but I haven’t figured out how to retain the nifty Rackspace/Song feature in EACH instance separately while controlling it from the master. That is to say, if I start a song on my standard Rhodes Patch, but my Drummer needs an 80’s gated snare kit used only for this song, and my guitarist needs to trigger a specific drum sample using a footpedal switch… All while calling up a laser show… How do I program the master instance to select these specific Rackspaces/Songs in each instance based on a single incoming PC message?

I’m sure there’s a solution… can’t wait to hear the responses.

Best,

Chris

Each song part can send multiple MIDI messages out when activated. These can go to different MIDI devices as well.

For example you could send something like

out1: xC0 0, out2: xC0 1

This would send a program change message 0 to “out1” and PC 1 to “out2”
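That shorthand can be read as raw MIDI bytes: 0xC0 is the Program Change status byte on channel 1, followed by the program number. A quick Python sketch of what each entry encodes (the port names are just labels here, not real device names):

```python
# Sketch of the "out1: xC0 0, out2: xC0 1" shorthand as raw MIDI bytes.
# A Program Change message is one status byte (0xC0 + channel) plus one
# data byte (the program number, 0-127).

def program_change(channel: int, program: int) -> bytes:
    """Build a raw MIDI Program Change message (channel 0-15, program 0-127)."""
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

# "out1: xC0 0" -> bytes C0 00, "out2: xC0 1" -> bytes C0 01
messages = {"out1": program_change(0, 0), "out2": program_change(0, 1)}
print({port: msg.hex() for port, msg in messages.items()})
```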

Maybe that will help.

Is it so that you simply need to send the appropriate PC to each instance and keep the instance you call « master » displayed, or do you need to sometimes also display the other instances to do some adjustments ?
Any « hidden » GP instance should properly respond to MIDI PC messages and change its Rackspace/variation accordingly.

For my own curiosity what is the DMX+HDMI device you want to use ? (I am curious to see what you can display with this).

So… in my case, each song part in the Master instance would be configured to send specific PC messages to each of the other instances via loopMIDI ports assigned to each other instance?

You replied with a code-style list: “out1: xC0 0, out2: xC0 1”. Is this a reference to GPScript, or just shorthand for assignments made in some other way?

How to send MIDI messages from song parts is described here:
https://gigperformer.com/docs/Userguide36/song_part_properties.html

Is it so that you simply need to send the appropriate PC to each instance and keep the instance you call « master » displayed, or do you need to sometimes also display the other instances to do some adjustments?

Nothing will be displayed during performance… the system will be operating on a headless PC, with remote desktop used when the instances need editing (between shows). I may keep the remote desktop link available on a tablet during performance (for troubleshooting if needed), but I don’t plan on interfacing with it other than through MIDI controllers and the iPhone Program Changes. In short, I can keep the Master instance visible if that helps my cause.

Any « hidden » GP instance should properly respond to MIDI PC messages and change its Rackspace/variation accordingly.

I would want each hidden GP instance to respond only to a dedicated PC message meant for that instance alone on a song-by-song basis. I’m confused as to where these PCs are configured and how they are sent to specific hidden instances.

For my own curiosity what is the DMX+HDMI device you want to use ? (I am curious to see what you can display with this).

I’ve got a Raspberry Pi 3 loaded with QLC+ wearing a DiscoHAT DMX interface for lighting control. QLC+ supports video playback via HDMI, although I may need to do some hacking to make it work as I envision on a Pi. This should give me the ability to call up video clips and time stamps therein using scenes and cues. In truth I’d rather have something capable of ‘iTunes Visualizer style’ RTA graphics generation, but the Pi is what I need for Lighting control.
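For what it’s worth, QLC+ can also be driven over plain OSC, and an OSC packet is simple enough to build by hand: an address pattern, a type-tag string, and big-endian arguments, each padded to 4-byte boundaries. A minimal sketch (the address below is an invented example, not QLC+’s actual OSC address scheme):

```python
# Minimal hand-rolled OSC message builder. The OSC 1.0 wire format is:
# padded address string, padded type-tag string (",f" = one float32),
# then big-endian argument bytes.

import struct

def osc_string(s: str) -> bytes:
    """Null-terminate a string and pad it to a 4-byte boundary."""
    data = s.encode() + b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Build an OSC message carrying a single float argument."""
    return osc_string(address) + osc_string(",f") + struct.pack(">f", value)

# Hypothetical cue trigger; send this over UDP to the Pi's OSC input port.
packet = osc_message("/qlcplus/cue/5", 1.0)
```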

Thank you Paul. I think this is what I was looking for.
I’ll summarize my understanding below… can you read through and confirm that I’ve got it?

  • Main instance maintains a Master song list
    *Each song will refer to the same ‘Master’ rackspace, which will handle routing and translation of incoming MIDI messages to other instances via LoopMIDI
  • Each ‘non-master’ instance will have a specific LoopMIDI port created for the exclusive purpose of receiving PC messages from the Master instance. (Does this port need to be added in the plugin field of every non-master instance rackspace for the instance to pick up the PC message? Or will it work on a ‘Gig’ level with no need to add the module repeatedly?)
  • Each ‘non-master’ instance will be configured to receive PC messages ONLY from its exclusive PC receive port
  • Main instance songs will be called up using PC messages from my iPhone.
  • Main instance songs will use the ‘Extra MIDI’ field within ‘song parts’ to send separate PC messages to each ‘non-master’ instance upon song part selection

In this way, my master routing rackspace can remain static and universal, while song selection from my iPhone controls the active Song/Part in each ‘non-master’ instance.
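If it helps to see the whole scheme at once, it can be sketched as a lookup table (all song numbers, instance names, and PC values below are invented placeholders): one incoming PC from the iPhone is translated into one PC per non-master instance, each destined for that instance’s dedicated loopMIDI port.

```python
# Sketch of the master-instance translation table. Keys are the incoming
# PC numbers from the iPhone; values map each non-master instance's
# loopMIDI port to the PC it should receive. All values are placeholders.

SONG_TABLE = {
    1: {"SynthBank": 4, "Drums": 12, "ChannelFX": 4, "ShowControl": 7},
    2: {"SynthBank": 5, "Drums": 3,  "ChannelFX": 1, "ShowControl": 8},
}

def pc_messages_for_song(incoming_pc: int) -> dict:
    """Translate one incoming PC into one raw PC message per instance port."""
    return {port: bytes([0xC0, pc & 0x7F])
            for port, pc in SONG_TABLE[incoming_pc].items()}
```

In Gig Performer itself this table lives in the song parts’ Extra MIDI fields rather than in code, but the mapping is the same idea.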

Sound right?

I try to understand what you want to achieve:
Select rackspaces via incoming PC messages.

Why do you need more than 1 instance?

In my bands I am using 1 instance and Ableton Live for backing tracks, sending programmed MIDI messages for DMX lighting, and sending MIDI messages via a virtual MIDI port to a VJ software called GrandVJ.
An OSC message sent when a rackspace is switched controls Ableton Live to select the correct scene and start/stop the scene when global play in Gig Performer is started/stopped.

You want to have an instance with rackspaces for each musician, right?
On your iPhone you select a song and in each instance the correct rackspace should be selected?

Why do you need more than one instance:

1 - I want to reuse synth, drums, and show control rackspaces for multiple songs, but in different combinations for each song… so I need separate control of multiple instances to keep from creating countless permutations
2 - I will be using my CPU to the max, and I believe multiple instances can make better use of my multi-core processor
3 - I cannot fit the plugins required for all of these controls within a single rackspace. Even my synth rackspaces alone take up the full window on some songs.

Your setup sounds amazing! I looked into GrandVJ based on an earlier post of yours. Very intriguing!

You want to have an instance with rackspaces for each musician, right?

Kind of. Rather than create separate synth instances for each musician I figured I’d create ONE synth bank to share. (The other members’ synth needs are minimal.) Likewise ONE set of mixer channel effects, one set of footpedal re-routings, and a Drum instance for our drummer, which also might be triggered now and again by another member’s foot pedal.

On your iPhone you select a song and in each instance the correct rackspace should be selected?

Yes. I tried out your suggestion from earlier today, and it seems to work! Although I assume that predictive loading might not work, as the patches to pre-load will be in a different instance?

Predictive load is a setting for each instance.
So your Master Instance could have predictive load unchecked, because it doesn’t need plugins.
It only fires PC messages to the other instances.
And to preserve memory you would use predictive load in the other instances.

…but this would require me to create and maintain song lists and re-ordered set lists within each instance and for each show… otherwise predictive loading will not work, correct?

I suppose I will also need to enable the foot switch MIDI ports for program changes in each instance, to allow for switching of variations and rackspaces by footswitch mid-song…

OK, predictive load is always working.
The only thing is, when you switch to a rackspace that is not within the predictive window, you have to wait until all of its plugins are loaded.

Sorry, I don’t completely understand what you want to achieve; the more you describe, the harder it is for me to follow.
I would make a Master Instance which reacts on your iPhone.
This way you select a song on your iPhone and the Master Instance is switched to the correct rackspace.
This rackspace contains all the plugins the keyboardist (for example) needs.
With variations bypassing plugins that aren’t needed, you can stay on 1 rackspace per song.
In Song Parts you reference rackspace variations, and you can send out PC messages to the other instances.
So when a song part is selected in the Master Instance, all other instances are synchronized; you decide whether a switch in the other instances has to be made or not.
With a little scripting you can start global play, send out a master click for all musicians, automate variation switches, trigger samples, play notes, etc.

Sounds a little bit complicated, but it is not - believe me.

Backing tracks I would use in a separate DAW (Logic/Cubase/BitWig… whatever you are familiar with).
I am using Ableton Live because of the very good OSC integration and because of their concept of Scenes and Clips.

And with OSC and Network MIDI (on Mac, MIDI support is excellent) you could use different machines, with everything synchronized from 1 Master.


Truly appreciate all of this, Paul. Thank you.

Is there a reason to make my synth bank (for example) the master instance, rather than a dedicated instance?

I ask because I currently have a bunch of routings defined in a separate instance which I wish to remain active at all times (but may want to alter in the future, either globally or within specific scenes). This routing must be in the “Master” instance, because only the main instance can receive MIDI… which the master instance must route to other instances using loopMIDI.

Here’s a snapshot of my master screen. The routings include re-organization of MIDI performance streams from device-port-based 16-channel groups to memorably-named virtual ports (I can use a virtual port called “Chris88key” instead of remembering “Bome-Chris-Net channel 3” when creating rackspaces in other instances). I’m also using it to consolidate all three MIDI pedals (which are on three separate devices and ports) into a single virtual port with three channels. Regardless of all this, I must at the least include loopMIDI ports for each actual port, to ensure that the performance data is routed to other instances.

As you can see, this takes up considerable real estate in the rackspace. It also must remain static across all songs and scenes.
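The pedal-consolidation part of that routing is essentially a channel-nibble rewrite: each source port gets its own MIDI channel on the merged virtual port. A small sketch of the idea (the port names are hypothetical):

```python
# Merge three pedal ports onto one virtual port by rewriting the low
# nibble (the channel) of each channel-voice status byte. Port names
# here are invented placeholders.

PEDAL_CHANNEL = {"PedalChris": 0, "PedalDrums": 1, "PedalGuitar": 2}

def remap(source_port: str, message: bytes) -> bytes:
    """Rewrite the channel of a channel-voice message (status 0x80-0xEF)."""
    status, *data = message
    new_status = (status & 0xF0) | PEDAL_CHANNEL[source_port]
    return bytes([new_status, *data])

# A sustain-pedal CC (0xB0 64 127) from the guitarist lands on channel 3
remapped = remap("PedalGuitar", bytes([0xB0, 64, 127]))
```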

Here is one of my synth rackspaces:

As you can see, I can’t fit both screens’ content within one rackspace. Neither do I wish to recreate identical copies of the master routing Rackspace content within each of my synth rackspaces.

For these reasons I suspect my Master instance must remain dedicated to routings. Seeing as this was not your recommendation, does this prohibit me from using predictive loading, etc.? Are there alternate solutions which make more sense?

Thanks Again

This is doable, and you don’t need to add a specific MIDI In block for this purpose in each rackspace of your “non-master” instances, but you will need one MIDI Out block with an assigned handle name in each rackspace of your “master” instance, because that’s how GP knows to which MIDI Out the PC change has to be sent.


It seems to be working without output blocks in my master instance… I’m just entering the virtual port name into the “Extra MIDI” field of the song part edit window. Fun times!

You are right, I had a typo in my MIDI out name: “LoopBe Internal MIDI” with a capital “B”!!! One point for you :stuck_out_tongue_winking_eye:
