Question about CPU usage

Do I have to use ASIO4ALL instead of the RME ASIO drivers to do this? It would mean more latency, I think…

This has nothing to do with the RME or ASIO4ALL drivers. Of course it's another layer of computation, which in principle introduces some latency. I suppose it isn't much, but I don't have any numbers for you either. I suggest you just try it (for free), and if you want hard numbers, measure them with the GP latency measurement tool.


You get aggregate device support, but I'm not aware of built-in loopback. I use a third-party app from Rogue Amoeba (called Loopback) to do this kind of thing, although I am generally routing audio to/from Gig Performer and Max.

For me, the latency stuff is just way overblown…as mentioned many times, it’s worth reading our article about latency

Keep the RME drivers! If you need an audio loopback, it is preferable to use the RME built-in loopback.


Right - I got that mixed up with MIDI loopback :man_facepalming:t3: Thanks for correcting!

Yeah, even though TotalMix FX is really hard to understand, with the loopback function I can now sync multiple instances of Gig Performer and distribute the plugins across multiple cores better.


There’s no guarantee that that will actually happen — the OS may or may not use a different core for each instance and it will depend on what else happens to be going on on your machine.
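
For anyone who wants to see where the load actually lands on their own machine, here is a minimal sketch (assuming Python with the third-party psutil package installed; this is not a Gig Performer feature, just a generic monitor) that prints per-core CPU usage once a second while the instances are running:

```python
# Watch per-core CPU load while several GP instances are running.
# Requires: pip install psutil
import psutil

print("Per-core CPU usage in % (Ctrl+C to stop)")
try:
    while True:
        # percpu=True returns one utilisation figure per logical core,
        # sampled over the given interval
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print("  ".join(f"core{i}: {load:4.1f}" for i, load in enumerate(per_core)))
except KeyboardInterrupt:
    pass
```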

After some days of testing, I must say that I can now use a lot more plugins, because running multiple instances together makes my OS use several cores instead of only one.
But do all instances receive MIDI messages? I couldn't make it work so far!

PS: Sorry guys, found this post: Gig Performer | How to create two Gig Performer instances with a single client ASIO Driver on Windows

Thank you!

Yes, the second instance should receive MIDI, and this has nothing to do with ASIO.
What exactly is the problem?

Unfortunately, in the Windows world quite a lot of MIDI drivers unnecessarily lock their ports once they're opened, so a second application can't open them. I have no idea what those developers were thinking!
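
As an illustration of that locking behaviour, here is a small sketch (assuming Python with the mido and python-rtmidi packages; the port name is the controller mentioned later in this thread and would need to be adjusted). With a single-client Windows MIDI driver, the second application that tries to open the port typically gets an error instead of sharing it:

```python
# Try to open a MIDI input that may already be held by another application
# (for example a running GP instance). With single-client Windows MIDI
# drivers this usually fails instead of sharing the port.
# Requires: pip install mido python-rtmidi
import mido

PORT_NAME = "iCtrl MIDI"  # adjust: pick a name from the list printed below

print("Available inputs:", mido.get_input_names())
try:
    with mido.open_input(PORT_NAME):
        print(f"Opened '{PORT_NAME}' - the driver let this client in.")
except Exception as exc:  # the exact exception type depends on the backend
    print(f"Could not open '{PORT_NAME}': {exc}")
    print("The port is probably locked by another application.")
```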

I have 3 instances:

  • Front (plugins with stuff that guitarists usually put in front of the amp, like fuzz or wah)
  • Amp (amp sim + IR)
  • Ambients (delay, reverb…)

I have a MIDI controller connected straight to the computer via USB. It works with Gig Performer, but I don't know how to make the connections to sync the 3 instances.

MIDI In is iCtrl MIDI (my controller), that's the only thing I know lol. I downloaded LoopBe1 and I see that I can use it as a MIDI in or MIDI out.

Something like LoopBe1 can be used to send MIDI out of one GP instance into another GP instance. On a Mac, this can be done very elegantly without third-party software because macOS can create virtual MIDI ports at the OS level. On Windows, you have to use a third-party application.
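
To make the data flow concrete, here is a rough sketch of what that redirection amounts to (assuming Python with mido and python-rtmidi, a LoopBe1 port named "LoopBe Internal MIDI", and the controller port name from this thread; in practice GP itself does this forwarding once you wire the controller's MIDI In block to a LoopBe1 MIDI Out block in the first instance):

```python
# Copy everything arriving from the hardware controller to the LoopBe1
# virtual port, where any number of applications can pick it up.
# Requires: pip install mido python-rtmidi
import mido

CONTROLLER_IN = "iCtrl MIDI"          # hardware controller (adjust to your setup)
VIRTUAL_OUT = "LoopBe Internal MIDI"  # assumed LoopBe1 port name; check mido.get_output_names()

with mido.open_input(CONTROLLER_IN) as inport, \
     mido.open_output(VIRTUAL_OUT) as outport:
    print(f"Forwarding {CONTROLLER_IN} -> {VIRTUAL_OUT} (Ctrl+C to stop)")
    for msg in inport:      # blocks and yields each incoming message
        outport.send(msg)
```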

Using a small third-party application is no problem for me, but I haven't understood how it works.
Let's say I have a controller, which is my MIDI IN. Then I'll send it to LoopBe1 MIDI OUT (we're talking about the first instance).
Then should I choose LoopBe1 MIDI IN in the second instance and send it to LoopBe1 MIDI OUT? Do I need a MIDI out in the last instance? Sorry guys, I'm just trying to understand how the virtual cables work.

Whatever you do, don’t do that, you’ll set up an infinite feedback loop.

If you’re trying to control VSTs in multiple instances


Send your controller (MIDI in) to the LoopBe1 output in your first instance. Then, in your other instance, create a MIDI In block for LoopBe1 and connect that to your plugins.


Sorry for the late answer, I’m back from vacation and I really hope to solve my issue!
The point is that I couldn't connect the MIDI In block for LoopBe1 to my plugins, because I don't use synths but FX plugins (blue plugins, not green).

That's how I connected the MIDI in the first instance (which is the main one; then I have 2 other instances):

You'll want to control the plugin parameters via widgets, which are then linked to a MIDI controller. This would be a very straightforward process if most MIDI drivers on Windows didn't only allow a single application (for example a single GP instance) to connect to a port. That's where loopBe1 comes in: the loopBe1 MIDI port can be accessed by multiple applications at the same time, so the idea is to send the whole signal from your controller into loopBe1 and then, in every GP instance, just treat the loopBe1 port as if it were your controller's MIDI port: assign the widgets via MIDI learn to events on the loopBe1 port.
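
If it helps to see what the "MIDI learn on the loopBe1 port" step boils down to, here is a toy sketch (assuming Python with mido and python-rtmidi and the usual LoopBe1 port name; GP handles all of this internally, the code only illustrates the idea of remembering a CC number and scaling its 0-127 value to a widget value):

```python
# Toy model of MIDI learn: remember the first CC that arrives on the
# LoopBe1 port, then map that CC's 0-127 value to a 0.0-1.0 widget value.
# Requires: pip install mido python-rtmidi
import mido

PORT = "LoopBe Internal MIDI"  # assumed port name; verify with mido.get_input_names()

learned_cc = None
with mido.open_input(PORT) as inport:
    for msg in inport:
        if msg.type != "control_change":
            continue
        if learned_cc is None:
            learned_cc = msg.control       # "learn" the first CC we see
            print(f"Learned CC#{learned_cc}")
        if msg.control == learned_cc:
            value = msg.value / 127.0      # scale to a 0..1 widget value
            print(f"widget value = {value:.3f}")
```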

One more important thing: GP tries to open all available MIDI ports on startup, so instances started after the first one are not able to access the "real" port of your controller. Consequently, you have to set up the redirection from your controller to loopBe1 in the first GP instance you start.

I'd like to add a question here involving both MIDI and audio under Windows 10.

Here’s my usual setup:

  • Roland V-Drums -> USB MIDI -> GP -> Superior 3 -> Focusrite 18i20 out
  • Roland keyboard -> MIDI jack -> Focusrite MIDI port -> GP -> [bunch of VSTs] -> Focusrite 18i20
  • Guitars -> wireless -> Focusrite analog in -> GP -> various VSTs -> Focusrite 18i20

Once in a blue moon I throw a mic and vocal processing into that mix.

Now this isn’t a live on stage gigging setup. This is just people playing in my “home studio.” I had been doing this all through Reaper, but man is that cumbersome to use for just jamming and practicing or whatever.

Now I've read all the latency opinions, and it's just a fact that for me/us and this setup it becomes unpleasant to drum at higher buffer sizes. I find it unpleasant at 128 samples @ 44.1 kHz. When it's just me playing by myself I run 32 samples, because I aim for the lowest latency that won't glitch. But if I'm running all this stuff live on the same machine, I've generally been fine at 64 samples. Now that's in Reaper, with each instrument and its effects on their own tracks, and no fancy signal chains or anything.
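
For reference, those buffer sizes translate directly into buffer time (buffer size divided by sample rate). A quick back-of-the-envelope calculation, keeping in mind that real round-trip latency is higher because of input and output buffers plus converter and driver overhead:

```python
# One-way buffer time for the buffer sizes discussed above.
SAMPLE_RATE = 44_100  # Hz

for buffer_size in (32, 64, 96, 128):
    ms = buffer_size / SAMPLE_RATE * 1000
    print(f"{buffer_size:4d} samples @ 44.1 kHz = {ms:5.2f} ms per buffer")

# 32 samples ~ 0.73 ms, 64 ~ 1.45 ms, 96 ~ 2.18 ms, 128 ~ 2.90 ms
```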

I’ve never tried to analyze whether the VSTs on different tracks were getting sent to different cores. I just assumed they were. I never really cared much, as long as it all worked glitch free.

So now enter GP, which is infinitely more pleasant to use. I'm just getting started with it, though, and in the process of getting familiar with it I figure I might as well learn the details of the different trade-offs that will determine how I structure my Gigs/Racks/Variations and the signal routing.

First relevant thing I discovered is that both my Focusrite 18i20 and my RME Babyface support multiclient ASIO. Cool. So I can run multiple instances of GP. But, at anything less than 128 samples they glitch, even if I’m not taxing them in the slightest. Play an mp3 on one instance and Pianoteq in another and I’m glitching at 64 or 96. I didn’t play around with it any more than that, because I figured why add a second instance if just doing that creates glitches where there were none previously?

But now that I’m reading this it occurs to me that I have no idea what’s actually going on. Maybe it’s still sticking every VST on the same core (even though it’s two instances) or maybe the problem was that I’m routing everything to the same “software channel pair” in GP rather than having each GP instance route to its own software channel pair and then mix them on the audio interface.

And then I realized that I have absolutely no idea how those “software channels” (whether in the Focusrite console or RME TotalMix) actually work.

I'm picturing a couple of different scenarios in my head. The first, which is what I always thought was happening in Reaper, is that my drums track, my keyboard tracks, and my guitar track each get their own core, each is on its own isolated channel strip, and each of those channel strips is routed individually to the hardware interface (even though it's over the same USB). Even though I set channel levels and panning in Reaper, combining them into the monitor and headphone mixes happens in the interface hardware (and thus occupies no processor cycles).

A couple of different ideas come to mind about how this might happen in GP. Clearly, when I run everything in one instance, everything happens on one core, including the final mixdown, which I do in the 16-channel mixer module in GP. I now realize I'd want to change this if there were more than just me sitting in the room, because even when just jamming we often want different mixes in our headphones.
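
Just to pin down what "the final mixdown on the CPU" means versus letting TotalMix or Focusrite Control do it: a software mixdown is simply a per-block sum of the individual stereo buffers, something like the sketch below (assuming Python with numpy; purely conceptual, not how GP is implemented).

```python
# Conceptual software mixdown: sum the per-instrument stereo buffers on the
# CPU every audio block. Routing each instance to its own interface channels
# instead leaves this summing to the interface's mixer (TotalMix / Focusrite
# Control). Requires: pip install numpy
import numpy as np

BUFFER = 64  # samples per block, matching the ASIO buffer size
rng = np.random.default_rng(0)

# stand-ins for the drums / keys / guitar signals, each a (samples, 2) stereo block
drums, keys, guitar = (rng.standard_normal((BUFFER, 2)) * 0.1 for _ in range(3))

# the mixdown itself: one gain multiply and accumulate per source, every block
mix = 0.8 * drums + 0.6 * keys + 0.7 * guitar
mix = np.clip(mix, -1.0, 1.0)  # keep the summed signal inside [-1, 1]
print(mix.shape)               # (64, 2): one stereo block ready for the output
```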

But the good news is that when I tried to stress test this with some busy MIDI tracks, I was glitch-free at 64 samples while pushing more VSTs in parallel than I'd likely use in real life. (Although Echoson seemed to push me over the limit, but I just started messing with that, so it might be user error.)

And then there's my multi-instance GP scenario that caused crackling with just an mp3 and Pianoteq, which initially made me think that multi-instance is still jamming all the audio onto one core, and that on top of that I'm adding some kind of extra load by having to combine those two signals before they go to the audio interface.

Or was I “breaking it” by having both instances tied to the same “software channel” rather than running each to separate channels?

I hope that all makes sense. I’ll experiment more when I get a chance, but I feel like I’m trying to reverse engineer a mental model that you guys might be able to explain a lot better. (And I realize this is a bit outside the “normal” use case for GP.)

Test results (Windows 10, i7-8700K @ 4.8 GHz, 32 GB RAM):

  • Different instances of GP do get put on different cores.
  • But with two or more instances of GP running I get crackles (at 128 samples @ 44.1 kHz) even if the second instance is doing absolutely nothing and I have the output block on Bypass.
  • It doesn't make any difference whether I route audio out from the instances to the same software channels, or to different channels and mix them in the interface (RME TotalMix or Focusrite Control).
  • In brief, multi-instance distributes CPU load more evenly but produces crackles at much lower loads anyway.

I suspect the shortcoming here lies in the multiclient ASIO drivers. I've done less testing with the RME than the Focusrite, but I see similar crackle issues at low load on both at 128 samples @ 44.1 kHz.

My observation is that Reaper puts different tracks onto different cores (assuming you're not routing between them), much like GP puts different instances on different cores. The key difference (on my system) is that GP uses multiclient ASIO to do it, and somehow that alone leads to a lot of crackling at much lower loads, even with larger buffers. (Regardless of whether it's the Focusrite or the RME interface.)

From all my reading elsewhere, it seems Reaper is the "outlier" in terms of DAWs optimizing multicore utilization, so this shouldn't be viewed as a criticism of GP. That said, it seems to me that if GP did multi-instance* without using multiclient ASIO, it would probably achieve most of what Reaper does. [*I suspect this means running parallel gigs in one software instance, rather than true multi-instance.]
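
For what "parallel gigs in one software instance" could look like conceptually, here is a sketch (assuming Python with numpy; real audio engines do this in native real-time threads, so this is only the shape of the idea, not how GP or Reaper is implemented): one process, and therefore one ASIO client, processes independent chains on separate workers each block and then sums the results.

```python
# One process (one ASIO client) hands independent signal chains to separate
# worker threads for each audio block, then sums the results into one mix.
# Requires: pip install numpy
from concurrent.futures import ThreadPoolExecutor
import numpy as np

BUFFER = 64
rng = np.random.default_rng(1)

def process_chain(block):
    """Stand-in for an independent plugin chain (drums / keys / guitar)."""
    return np.tanh(block * 2.0)  # placeholder DSP

inputs = {name: rng.standard_normal((BUFFER, 2)) * 0.1
          for name in ("drums", "keys", "guitar")}

with ThreadPoolExecutor(max_workers=len(inputs)) as pool:
    futures = {name: pool.submit(process_chain, block)
               for name, block in inputs.items()}
    outputs = {name: fut.result() for name, fut in futures.items()}

mix = np.clip(sum(outputs.values()), -1.0, 1.0)
print(mix.shape)  # (64, 2): one mixed stereo block from a single driver client
```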

In your tests with Reaper, did you try having multiple plugins send their audio into the SAME effect (i.e., a third plugin in another bus)? What happens when you do that?