Regarding multi-core support - that’s right, I just forgot that. Now:
If everything is off the CPU load is 18%.
Playing the lower keyboard (Omnisphere, Diva): CPU load 55%
Playing the upper keyboard (Lush and UVI with a sampled piano, no multilayer) with the effects: CPU load 61%; without the effect plugins: 60%.
Playing both together (or just holding the sustain pedal on one keyboard while playing the other): CPU 100% and glitches.
The audio buffer size is 1024 samples; the sample rate is 44.1 kHz.
The same preset in MainStage shows the following:
If everything is off: 15% CPU load.
Playing the lower keyboard: 68% CPU load
Playing the upper keyboard: 59% CPU load
Playing them together: max. CPU load is between 80 and 85%.
The sample rate is the same; the buffer size is 256, plus a safety buffer, so 512 in total. In GP I have 1024.
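For context, the buffer sizes mentioned above translate into latency roughly like this (a quick sketch assuming a 44.1 kHz sample rate; real round-trip latency also adds driver and converter overhead, so treat these as lower bounds):

```python
# Rough one-way buffer latency for the settings discussed above.
# Assumes 44.1 kHz; real-world latency is higher due to driver overhead.

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int = 44100) -> float:
    """Time (in ms) it takes to fill one audio buffer."""
    return buffer_samples / sample_rate_hz * 1000

print(f"GP @ 1024 samples: {buffer_latency_ms(1024):.1f} ms")  # ~23.2 ms
print(f"MS @  512 samples: {buffer_latency_ms(512):.1f} ms")   # ~11.6 ms
```

So the GP setup as described is running with roughly double the buffer latency of the MainStage setup, which also means GP gets twice as long per buffer to do its processing.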
Based on Gig Performer’s comparison chart, the CPU load should be drastically lower in GP. They say four instances of Omnisphere produce a CPU load of 60% in MainStage and only 20% in Gig Performer. Why then is it the opposite in my case? I understand that Diva eats a lot of CPU, but all the other plugins in my rackspace? They are no more than moderate.
Please remove the audio input in the audio settings of Gig Performer and try again.
This could save CPU usage.
Also, only activate the physical channels you really need.
I tried multi-instance support in Gig Performer.
I split your rackspace into two rackspaces.
One contains Diva, Omnisphere, and DS-Thorn; the other, the rest of the party.
Each instance of Gig Performer loads its own rackspace => CPU usage is really much lower.
Can you try that? This could be a really reliable way to run such massive rackspaces.
By the way: MainStage does not support multiple instances (as far as I know).
With Gig Performer you are not restricted to one or two instances; it is up to you how many you want to use.
And there are ways to synchronize the rackspace selection of the two instances.
And when you move a widget (for example, to control the overall output level), you can synchronize it with the corresponding widget in the second instance.
This way two keyboard players (for example; it could also be two guitar players, the bass player, the complete band, etc.) could use the same PC or Mac on stage.
A friend of mine uses Gig Performer to play bass, and the second instance is used as a mixer for the band.
He uses an RME UFX+ and is totally happy with this solution, because for each song all the effects are selected by switching to the correct rackspace.
And with Lemur on an iPad, the sound guy only has to control EQ and levels remotely.
No need for a real mixer.
I did the same as you, and it really solved the CPU issue.
On the other side: this means I have to open one instance of GP for each hardware controller. And I have no clue how to link/synchronize these instances in one setlist, or how to control the setlist during the gig. Where can I find advice?
Unfortunately my previous experience is with MainStage, and this seems so much easier. But I still want to try 😃. If it works, I will have to rebuild around 100 complete songs for GP.
I would only split the rackspaces that face CPU spikes, not all of them.
Synchronizing via IAC and MIDI is very easy.
You can send a PC (program change) message when a rackspace, song, or song part is switched.
And you can assign PC messages to variations or song parts so that they react.
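To make the PC mechanism concrete, a program change on the wire is just two bytes; this plain-Python sketch builds them (the program number 5 is only an example, and in practice GP constructs these messages for you):

```python
def program_change(program: int, channel: int = 0) -> bytes:
    """Raw MIDI Program Change: status byte 0xC0 | channel, then the program number.
    Both the program (0-127) and the channel (0-15) are 7-/4-bit values."""
    if not 0 <= program <= 127 or not 0 <= channel <= 15:
        raise ValueError("program must be 0-127, channel must be 0-15")
    return bytes([0xC0 | channel, program])

# e.g. the master instance could send PC 5 on MIDI channel 1
# when rackspace number 5 is selected (hypothetical mapping)
msg = program_change(5)
print(msg.hex())  # c005
```

The slave instance only has to listen for these two bytes on its IAC input and switch to the matching rackspace or variation.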
Widget sync is super easy with OSC.
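For the curious, this sketch shows what an OSC float message looks like on the wire between the two instances; the address `/MasterLevel/SetValue` is purely hypothetical (the real path depends on the OSC name you give your widget), and GP encodes all of this for you:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: address string, ',f' type tag,
    then the value as a big-endian 32-bit float. OSC strings are
    null-terminated and padded to a multiple of 4 bytes."""
    def pad(b: bytes) -> bytes:
        # always appends at least one NUL, then pads to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# hypothetical widget address; sets the synced level widget to 50%
packet = osc_message("/MasterLevel/SetValue", 0.5)
```

A tiny payload like this is why syncing a level widget across instances over the loopback interface is effectively free.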
Then it gets really complicated. With 30+ songs per gig, some of them with one instance, some with two: how do I manage this? I have no clue.
I understand that one instance should act as the master and the second one should be controlled by the first. And when I move to the next patch/rackspace in the master, the second instance should follow accordingly.
Logically, it seems easier to set up the instances according to the MIDI controllers in use. But how can I program this?
One remark: I don’t split songs into parts like intro, verse, chorus, etc. with different patches. It is too much hassle for me to switch between sounds during a song. I prefer to have one setup/patch per song, with all the necessary sounds (split, layered) across two master keyboards.
I read in one post that I have to work with the IAC driver and create an additional MIDI module. The problem is that I have never done this before, and to me it looks complicated. I need time to try it; I’m not sure I can do it next weekend, and I only have 11 days left in my trial. We’ll see.
Just to add to this, on OS X, it is really easy to communicate between multiple GP instances using IAC for MIDI and the awesome free Blackhole virtual device driver for audio.
So, for example, you could create an instance with a single rackspace holding your effects, and route audio to that instance from your main instance via BlackHole.
The only thing you must be careful about is that if you start using IAC, you must not use any MIDI In (OMNI) blocks, because you will create a MIDI feedback loop.
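The feedback-loop warning boils down to one rule: never echo back onto the IAC bus what arrived from it. A minimal sketch of that guard, with an illustrative port name (GP's OMNI block fails this check precisely because it listens to every port, including IAC):

```python
def should_forward(source_port: str, iac_port_name: str = "IAC Driver Bus 1") -> bool:
    """Avoid a MIDI feedback loop: only forward messages that did NOT
    arrive on the IAC bus itself. Port names are illustrative."""
    return iac_port_name.lower() not in source_port.lower()

assert should_forward("Keystation 61")         # hardware controller: forward
assert not should_forward("IAC Driver Bus 1")  # came in via IAC: drop it
```

Using dedicated MIDI In blocks per device, as suggested above, enforces this implicitly.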
Here are two gigs: inst1.gig is the master and inst2.gig is the slave.
As MIDI In I used the OMNI MIDI block; normally this should never be done, because an endless MIDI loop can occur.
So please replace that in the instances with the MIDI devices you actually use!
IAC as a MIDI Out block should work.
In the second instance you should enable this IAC port to accept PC messages.
With the program MIDI Monitor (or similar) you can check whether the PC messages are being sent.
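When watching the bus in MIDI Monitor, a program change shows up as a two-byte event whose status byte starts with 0xC. This little decoder (a sketch, not part of any GP API) shows how to read such an event by hand as a sanity check:

```python
def describe(msg: bytes) -> str:
    """Human-readable label for a raw MIDI event, as a PC-message sanity check.
    Status bytes 0xC0-0xCF are Program Change; low nibble is the channel."""
    status, data = msg[0], msg[1]
    if status & 0xF0 != 0xC0:
        return "not a program change"
    return f"Program Change ch{(status & 0x0F) + 1} program {data}"

print(describe(b"\xc0\x05"))  # Program Change ch1 program 5
```

If the slave instance does not react, checking the raw bytes this way quickly tells you whether the problem is on the sending or the receiving side.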