That's because they're like me: early adopters, willing to stretch pretty far to take advantage of new technology. Your average TMP/QC/Helix/AxeFX user is not like that. They come from a pretty static setup where they know, even in the dark, which pedal does what, and where placement doesn't usually change from song to song.
The last time I tried to challenge a standard it was a pretty brutal uphill battle, and that was with people who were at least willing to make music on computers. In this case you're talking about people so steeped in tradition that they hold onto ancient and unsupported beliefs about wood having magic electrical properties; it's not even a challenge worth taking on. They work a certain way, and good luck changing that.
We're building a product right now that will likely bring a lot of guitar players into the world of computer-based guitar, and as usual with that community, I will be the one who has to answer for everything, so I'm trying to get all my ducks in a row for how they would be using these things.
I hope everyone here understands: I really appreciate all the help you guys have given me. I'm really trying to pave the way to this side of the aisle for a somewhat specific community that has traditionally been pretty reluctant to deal with computers, for a wide variety of reasons. I will epically fail if I don't have an easy way to present them with a familiar environment and workflow. Had this been any earlier year I wouldn't even bother, except to be able to use this stuff for myself, but with the Helix Stadium promising to put your playback engineer onto your pedalboard, this really seems like the opportunity to bring this computer-averse community into the fold.
I have a little bit of help in that the live mixing community is beginning to accept native processing in the inserts of their consoles, in the form of Waves SuperRack, LiveProfessor, or even their DAWs, via things like ReaInsert in REAPER. There is a lot of crossover between those two communities, and the same YouTube influencers, "gurus," and such are pushing the guitar gear videos, live mixing videos, and recording studio how-tos.
I've dealt with these people for a long time, built products for them, done many different types of setups and consulting for them, and generally coexisted with them in a mutually beneficial way. But now I'm kind of playing in their wheelhouse of live guitar, where I'm not the expert, so I'm studying how they work and how they expect things to go, and trying very hard to make sure I have setups, templates, and roadmaps for them to be able to confidently take a computer to a gig and play their guitar through it.
I can sell it on the positives that their existing systems CAN'T do, but there are certainly parts of their existing workflow that are must-haves. Their pedals can't be moving around from place to place, any more than they'd put up with their wah pedal jumping from the left to the right side of their pedalboards.
I'm sure I'll eventually get this all figured out. It feels like I'm 99% there sometimes, and a bit further off at other times, but nothing seems insurmountable.
And then I gotta figure out all the showcase/playback-engineer features Gig Performer can do! I did try putting events in the streaming audio file player, and it actually worked more easily than I thought it would, so I'm not all that scared of that part of the journey.
Anyhow, thanks for all the help. I'm not trying to be obstinate, just realistic about what the community I'm bringing in expects. I'm sure I will have a billion more questions.
I'm sure something can be done with scripting.
Is your current setup now to use Setlist mode, and would it make more sense for the panel label widgets to display the song part names?
Gig Performer has some things going for it over SuperRack and LiveProfessor. I'm actually mixing shows tonight using REAPER for the inserts, as I'm also always doing a live recording, but maybe next week I'll give Gig Performer a try!
How are people controlling it remotely? That's one area where SuperRack really shines (assuming the copy protection doesn't pop up during the show). I'm using VNC on an iPad to control REAPER; I think I could do the same thing with Gig Performer.
For FoH purposes, GP has a feature to record all incoming audio channels and all outgoing audio channels.
Each time you press record (or schedule a recording), the individual WAV files are stored in a timestamped folder.
This feature was added in a very early version of Gig Performer. The idea was to make it trivial for a touring band (or their engineer) to just push a button every night, so that at the end of their tour they'd have a collection of WAV files from each performance, ready to drop into a DAW for post-production.
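If you want to gather those per-night folders into one archive at the end of a tour, a few lines of Python can do the sweep. A minimal sketch, assuming a recordings directory of your choosing and that each take gets its own timestamp-named folder (check where your GP version actually writes its recordings):

```python
from pathlib import Path
import shutil

# Assumed paths; point these at wherever GP writes its recordings
# on your machine and wherever you want the tour archive to live.
RECORDINGS_ROOT = Path.home() / "Documents" / "Gig Performer" / "Recordings"
TOUR_ARCHIVE = Path.home() / "Desktop" / "tour_archive"

def collect_shows(root: Path, dest: Path) -> None:
    """Copy each timestamped show folder of WAV files into one archive."""
    dest.mkdir(parents=True, exist_ok=True)
    for show_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        wavs = sorted(show_dir.glob("*.wav"))
        if not wavs:
            continue  # skip folders with no audio in them
        target = dest / show_dir.name  # timestamped name doubles as show ID
        target.mkdir(exist_ok=True)
        for wav in wavs:
            shutil.copy2(wav, target / wav.name)
        print(f"{show_dir.name}: archived {len(wavs)} channel files")

if __name__ == "__main__":
    collect_shows(RECORDINGS_ROOT, TOUR_ARCHIVE)
```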
Not for the plug-in GUIs. For transport control and moving faders, I can use TouchOSC on my phone with REAPER, but mapping the plug-in parameters would be a nightmare, if it's even possible. As much as I want to die when I use Waves products, they really got that right, and I would hate to see the Gig Performer team waste development time pulling off the remote thing. The people using it remotely must be running something that reasonably emulates a desktop; I'll try to figure out what.
It still sounds to me like you are trying to bypass the GP paradigm. In a live situation, you should have widgets to control the plugin parameters you'd want to change during a show. All other parameters should be preset, and for significantly different sounds one uses different rackspaces.
You can control almost any aspect of GP via OSC. That includes setting/getting widget values, switching songs/parts/rackspaces/variations, turning on recording, and even sending MIDI messages over OSC to actually play GP from an OSC device if desired. You can certainly control the parameters of all synth plugins directly from OSC (the synths themselves do not have to have their own OSC support).
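For example, here's a minimal Python sketch of what a remote controller could send, using the python-osc package. The IP address, port, and OSC address patterns below are illustrative assumptions; GP's listening port and a widget's OSC handle are set in GP's options and widget properties, so match them to your own configuration:

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Assumptions: the GP machine's IP, and the OSC port GP is
# configured to listen on (set in GP's OSC options).
client = SimpleUDPClient("192.168.1.50", 54344)

# Set a widget's value via its OSC handle. "CleanLevel" is a
# hypothetical handle assigned in the widget's properties.
client.send_message("/CleanLevel/SetValue", 0.8)

# Switch songs remotely; the exact address patterns live in the OSC
# section of the GP documentation, so treat this one as a placeholder.
client.send_message("/SetSongByIndex", 2)
```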
Yes, this is exactly what I've been talking about this whole thread. But the issue is, I don't want the pedal that calls up my clean sound to move around from song to song.
I wouldn't want to try to map thousands of parameters onto a bunch of knobs that in no way resemble the graphical user interfaces, meters, and on-screen measurements that make up the better-thought-out plugin GUIs; you give up tons of instant information and usability that way. For a simple example, think of a dynamic EQ where you want to see the RTA and the amount of gain reduction being applied in real time.
I put a little explanation here of the three most common ways I see people using multi-FX units, which I'm trying to translate into the Gig Performer paradigm. Hopefully it gives some insight into why I'm asking the questions I'm asking.
I'm hoping the grid selector rank13 is making can help with this problem, but it would still be nice if we could divorce these things from each other. Having parts and variations linked with an unbreakable bond in Setlist mode brings up a bunch of nightmare scenarios.