Best way to deal with latency

:slight_smile: Will do

Ok it would appear to me that I can’t perform that test because my audio interface does not have any inputs

How many ms of latency do you get with a 128-sample buffer size?

@LilyM

I also have a high sensitivity to latency (I’m a drummer and a keyboard player) and I also switch between a hardware organ and vsts. So I know what you mean.

I also used to use B3-X heavily… I still do occasionally but my hardware organ (Hammond XK-5) has all but replaced the B3-X. The B3-X is hands down the best sounding clonewheel VST but it is a resource hog. It has a lot of graphics and parameters. I did find that turning off the Leslie saves a lot of processing… but that is not something that sounds good. I think if you add anything else beyond the B3-X, at a buffer of 64 you are going to have issues.

I wonder if this is a use case for trying a second instance of GP that is dedicated to just the B3-X? I don’t use multiple instances but I guess it’s something I would test out.

Keeping the plugin’s GUI closed is my recommendation too… just opening that plugin’s GUI takes a big dose of RAM.

I had to net out at 128 for my setup. I occasionally revisit and try to tweak this but 128 seemed to be the lowest reliable threshold. The actual measured latency is higher than what GP reports as the latency you’ve set your audio I/O to.

I use the MOTU M4 and it’s decent. Maybe an RME Hammerfall PCIe in an 8x slot of a desktop-type machine would give you less latency. I would seriously doubt the RME Babyface USB interface would be appreciably faster than a Thunderbolt MOTU M4.

You could also try dropping the sample rate from 48 kHz to 44.1 kHz (CD quality) and see if that helps.
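For what it’s worth, here is a minimal sketch of the arithmetic behind that tip (Python, with a 64-sample buffer picked purely as an example): at the lower rate the same buffer spans slightly more time and the plugins have fewer samples per second to render, at the cost of a fraction of a millisecond more latency.

```python
# Why dropping the sample rate can help with crackles: the same buffer size
# spans more time, so plugins get a bit more headroom per buffer (and fewer
# samples per second to render). The 64-sample buffer is just an example value.
BUFFER = 64

for rate in (48_000, 44_100):
    ms_per_buffer = BUFFER / rate * 1000.0
    print(f"{rate} Hz: {ms_per_buffer:.2f} ms to fill each {BUFFER}-sample buffer")

# 48000 Hz: 1.33 ms to fill each 64-sample buffer
# 44100 Hz: 1.45 ms to fill each 64-sample buffer
```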

There is a whole optimization guide on here about how to tweak your laptop/NUC to get the best performance out of GP… I’d also spend some hours with that.

What kind of interface are you using?

Did you try this guide?


Mentioned earlier: Apogee Groove - It is a DAC.

I had not heard of this Groove but just spent a few minutes looking it up. As far as I can tell, this product is really intended as a way to listen to music at better fidelity than, say, just the headphone out jack.

The default latency is apparently set to a “safe” 16ms, which would not matter for playing back a song but is terrible for real-time interactive use.

Yes, it is just a DAC, but fairly high quality and has an Apogee designed ASIO driver. It seems to work very well actually. I am running at 64 samples and am only getting a very occasional crackle with loaded racks that include BX3, a pretty well known resource hog (and Brandon makes a good point above that it may be when I have the BX3 GUI open).

I am not really understanding one thing though…when you say it is just for playing music, isn’t that what I am doing? One of their marketed uses is for using a DAW away from the studio basically…

Whether you’re producing tracks in a hotel room on your laptop, auditioning a mix away from your studio, or any other situation that requires high-quality audio playback, you’ll be well served with the Apogee Groove

That would include playback from plugins in a track too, right? Isn’t that all I’m doing? … getting playback from the plugin in GP? Does the audio interface affect the MIDI coming from my controller to the plugin in GP? I would have thought it was just dealing with “playback” from the plugin. But I’m not too smart on these things.

I’m loving BX3, but was getting pops - not necessarily low-latency clicks, more like audio breakup. Running at a buffer size of either 64 or 96 samples.

Turns out it was the 1081 EQ that was the cause. Went through all my setlist patches, turned the EQ off, and no more audio glitches. Try giving that a shot. (I’m on a Mac, 10.13, NI Komplete USB interface.)


What he means…and from what I read in the promo material…the Groove is for making the music you are listening to on your computer sound better. Whether it be through GP, listening to MP3s, watching YouTube or whatever, it’s for listening to music. So, in order to do this it must process the music signals. This is where the latency happens. Have you tried listening through just the headphone jack of your computer to see if there is any difference?

Ok, still trying to understand. Ultimately the main point of the Groove (and all audio interfaces) is to take digital audio and convert it to analog. And also as you say with high fidelity.

Is it not the case that any audio interface is just taking the digital output from GP and converting it to analog…and generally doing it more efficiently and at a higher fidelity than say the headphone output?

Understandably, many people are also using the interface to input audio and maybe MIDI as well, but I have no need for those features of an audio interface and am only addressing the DAC aspect of an audio interface. And I am not understanding why the DAC of, say, an RME Babyface is different from the Groove. Perhaps it is better, has a better driver (which was my question in the beginning), but not any different in function.

As I noted earlier, the Groove is running GP just fine with heavy CPU hitters like Diva and BX3 at a buffer size of 64 (but I just recently started getting a very occasional crackle which I traced to BX3). Running through the headphone jack does not get me anywhere close to that.

Let me put it this way…let’s say I bought an RME Babyface Pro FS. I would not be using the pristine mic pres, I would not be using the internal effects, I would not be using 22 of the 24 channels and on and on. I would only be using the DAC feature of this interface. Can anyone explain how the DAC of the Babyface is different functionally from the Apogee Groove?

I cannot tell you what is different in the design, but with an RME you probably won’t need to reduce the buffer to 64 to feel comfortable with the latency (e.g. besides the DAC, they build their own USB hardware and drivers). I use an RME UCX with a buffer of 256 and even with percussive sounds I don’t feel better when reducing to 128. I neither feel nor hear the difference. It is difficult to say, but it could be that your Apogee “loses” time somewhere, which is why you need to reduce your buffer to 64 to feel right with the latency. But at 64 you have more chances of getting crackles.


@LilyM Several users have reported that, with some software, ASIO4ALL seems to work better than the Apogee ASIO driver. It might be worth trying, just in case.


Ah yes…marketing … I suppose I could claim that you can use a bicycle to get from New York to California … but wouldn’t you rather take an airplane? :slight_smile:

Here’s the thing. If you want to play some existing music (an MP3 file, say), then if there is a 16ms delay between the time that you press the PLAY button and you start to HEAR the music, you will not notice that delay.
Similarly, if you are working with a DAW that is playing back tracks, remember that other than the track that you’re actually recording, the information needed to play those tracks already exists and so the audio can be created in advance. In other words, a DAW can look forward to see what will need to be played and create the audio in advance so as to have it ready.

But Gig Performer is not a DAW. There are no tracks — it is intended for real-time use. That means that there is no way to look forward since there’s no way to know what to process until you actually press a key on your keyboard or pluck a note on your guitar. In that situation, anything more than a few milliseconds of delay (latency) will be noticed.

Now, there are several contributing factors to latency.

  1. The sample rate and buffer size define the inherent latency in the computer itself. For example, if you are using a sample rate of 44,100 Hz and a buffer size of 64, then the inherent latency is 64/44100, which is about 1.45 milliseconds (see the small calculation sketch after this list).
    What that means is that the entire collection of running plugins must generate 64 samples of audio within that 1.45 ms deadline. If your computer isn’t fast enough to do that, then you’ll hear clicks and pops and you have to either increase your buffer size or decrease your sample rate, the former being normally preferred to retain decent sound quality.
  2. The latency of the audio interface gets added to that inherent latency and that gives you the total latency (NB double the whole thing if you have audio input from a guitar or vocals that has to be processed). If the audio interface (and/or its drivers) adds significant latency here, that will add to the delay. The less latency introduced by the audio interface, the higher the buffer size you can use in your computer (for the same delay), giving the plugins more time to meet that deadline.
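
To put some numbers on points 1 and 2, here is a minimal sketch in Python. The 3 ms interface overhead is a made-up example, not a measured figure for any particular device; substitute what your own interface/driver reports.

```python
# Rough latency estimate: the buffer deadline plus whatever the interface
# and its driver add on top. The interface figure below is hypothetical -
# check what your own device/driver actually reports, or measure it.

def buffer_latency_ms(buffer_size: int, sample_rate: int) -> float:
    """Time available to render one audio buffer."""
    return buffer_size / sample_rate * 1000.0

buffer_ms = buffer_latency_ms(64, 44_100)   # ~1.45 ms
interface_ms = 3.0                          # hypothetical driver/converter overhead
output_path = buffer_ms + interface_ms      # keys -> plugin -> speakers
round_trip = 2 * output_path                # rough doubling if an audio input is processed too

print(f"buffer {buffer_ms:.2f} ms, output path {output_path:.2f} ms, round trip {round_trip:.2f} ms")
```

Run it with a 128-sample buffer instead and the buffer portion only grows to about 2.9 ms, which is why a low-overhead interface and driver often buys you more than shaving the buffer size does.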

Check out this blog article we wrote for more details on this topic.

https://gigperformer.com/audio-latency-buffer-size-and-sample-rate-explained/


If the awful ASIO4ALL works better than the Apogee driver, I worry about Apogee users
:grimacing:


I really appreciate the detailed responses. I do have a basic understanding of most of this, but I understand now that while the DAC aspect of any audio interface is indeed the same (they are all just playing back audio, generally at a “higher” fidelity), there is playing back audio and then there is playing back audio.

If I am correct in understanding, when you play back audio from a DAW that has MIDI tracks and plugins in it, the DAW can preload or look ahead to the notes and sounds from the plugins and be better “prepared” (is this really true in the case of plugins? It makes sense with audio). Therefore, if an audio device were marketed for basically this use, it would not need to be particularly low latency (which is what I assume a well-written native driver would provide).

Therefore the Apogee Groove probably does not have a well-written driver and is adding its own inherent latency to the equation, forcing me to go to 64 samples to compensate for that.

I did find the “safe” 16ms setting you referred to. This is what Apogee calls the USB device streaming settings. I was not aware of that, and unfortunately the laptop is in transit for a show tonight. I assume that is different from the ASIO settings, and if indeed the default is 16ms (Apogee shows these settings going all the way down to 1ms), then this could be of some real help to me. Thank you for that.

I cannot find that with a Google search for Apogee and ASIO4ALL.

No idea about that. I only reacted to what @Hermon said, as the ASIO4ALL driver introduces a lot of latency.