Whilst using GP4 on my live rig, which is a Lenovo M920 USFF PC with a touchscreen connected via USB-C, I have found certain actions difficult or impossible to achieve on a touchscreen. Please view this as a request to add capabilities that cater for using GP4 in Windows tablet mode and on touchscreens in general, along with some constructive feedback to inform that.
I do most of my patch creation on a desktop PC with mouse & keyboard, so no issues there at all, but on my live rig, making changes is very difficult. I cannot open a context menu on any element in the Wiring tab at all with a touch interface (tap & hold): when the item is touched, it moves slightly, which suppresses the context menu because the software is moving the item instead.
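For anyone curious why the menu gets suppressed: touch UIs typically use a "slop" radius so that small finger wobble still counts as a press, while movement beyond it cancels the press and starts a drag. The behaviour described above suggests the drag threshold wins before the long-press timer fires. A hypothetical sketch (not Gig Performer's actual code, and the threshold values are assumptions):

```python
import math

SLOP_RADIUS_PX = 10   # assumed tolerance; real toolkits scale this per DPI
LONG_PRESS_MS = 500   # assumed hold time before a context menu would open

class LongPressDetector:
    """Distinguish a long-press (context menu) from a drag (move item)."""

    def __init__(self, slop_px=SLOP_RADIUS_PX, hold_ms=LONG_PRESS_MS):
        self.slop_px = slop_px
        self.hold_ms = hold_ms
        self._down = None  # (x, y, t) of the initial touch

    def touch_down(self, x, y, t_ms):
        self._down = (x, y, t_ms)

    def touch_move(self, x, y, t_ms):
        """Return 'drag' once movement exceeds the slop radius, else None."""
        if self._down is None:
            return None
        x0, y0, _ = self._down
        if math.hypot(x - x0, y - y0) > self.slop_px:
            self._down = None  # press cancelled: the gesture becomes a drag
            return "drag"
        return None  # still within slop: the press survives

    def touch_up(self, x, y, t_ms):
        """Return 'long_press' if held long enough without drifting."""
        if self._down is None:
            return None
        _, _, t0 = self._down
        self._down = None
        return "long_press" if (t_ms - t0) >= self.hold_ms else "tap"

# Finger wobbles 3 px while holding for 600 ms: still a long press.
d = LongPressDetector()
d.touch_down(100, 100, 0)
assert d.touch_move(103, 100, 200) is None   # within slop, press survives
assert d.touch_up(103, 100, 600) == "long_press"

# Finger drifts 20 px: the press is cancelled and becomes a drag.
d.touch_down(100, 100, 0)
assert d.touch_move(120, 100, 200) == "drag"
assert d.touch_up(120, 100, 600) is None
```

A larger slop radius (or a shorter long-press timeout) would make the context menu win more often on touch, at the cost of making very small drags slightly laggier.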
Selecting a different physical MIDI input for a given Connection alias in the Rig Manager is almost impossible and requires many, many attempts to make each submenu open and allow selection of the next item… but with a mouse, it works impeccably.
Given the proliferation of touch devices, and the obvious advantages of using them in a live environment over a mouse & keyboard, are there plans to enhance the experience of the GP4 UI when working with a touch interface over and above the panel widgets??
Is it really requested to build rackspaces on a touchscreen?
I think using rackspaces on a touchscreen works without major issues, but building complex ones?
I would not use a touchscreen to do that?
I have the same request.
In my case, the Mac running Gig Performer is in the 80 lb band rack, and as such my programming must be completed when the rig is all connected and all MIDI inputs and network devices are present, etc. Once set up, I stand behind a dual-tier keyboard stand and access the computers via a Windows 10 touch screen PC and Dell pen via VNC (I am changing the connection from VNC to PI-KVM for a more responsive interface connection.)
As the OP opined, I have no issue building complex rigs at home… but the devices I need to program are not present.
And I have no problem controlling existing rackspaces live/in rehearsal… but the patches are not yet set up.
(The Dell Pen is a bit awkward, but it does solve the ‘precision’ problem and allows for opening context menus via holding a button while tapping… but the ability to double-tap/double-click (or tap and hold) for context menus would save time and frustration.)
In general, no I wouldn’t, but I found myself having to do that on my touchscreen PC in order to assign the physical MIDI connections to the Connection Aliases I created on my Studio PC; that had to be done from the live touchscreen PC so that the port names were correct.
But if I ever find myself having to knock up a patch when out on the road using just my live PC rig, then I would need to use the touch interface to achieve these things.
Have you tried to use two-finger touch and see if your device will recognize that as the contextual menu?
I just gave it a try with a two-finger touch, but it behaves the same as a one-finger tap+hold action.
Another scenario that is a bit hit-and-miss on a touchscreen is double-clicking an element on the wiring screen to bring up the editor window: sometimes the first tap of the double-click is processed as a “move” action, so the second tap does not complete the double-click… In a live situation, I may want to perform an edit to a plugin value, or change a split-point, a velocity curve, or maybe open an audio mixer panel to balance relative volumes. These are things I would very much be doing in a live situation when I am using the touch-screen interface.
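The hit-and-miss double-click makes sense if you assume a double-tap is only recognised when the second tap lands within both a time window and a small radius of the first; if the first tap drifts and gets consumed as a “move”, the pair is broken. A hypothetical sketch (again, not Gig Performer's actual code; the window and radius are assumptions):

```python
import math

DOUBLE_TAP_MS = 400      # assumed maximum gap between the two taps
DOUBLE_TAP_SLOP_PX = 15  # assumed radius around the first tap

class DoubleTapDetector:
    """Recognise two completed taps as a double-tap (open the editor)."""

    def __init__(self, window_ms=DOUBLE_TAP_MS, slop_px=DOUBLE_TAP_SLOP_PX):
        self.window_ms = window_ms
        self.slop_px = slop_px
        self._last = None  # (x, y, t) of the previous tap

    def tap(self, x, y, t_ms):
        """Feed completed taps; return True when a double-tap is detected."""
        if self._last is not None:
            x0, y0, t0 = self._last
            close = math.hypot(x - x0, y - y0) <= self.slop_px
            quick = (t_ms - t0) <= self.window_ms
            if close and quick:
                self._last = None  # consume the pair
                return True
        self._last = (x, y, t_ms)
        return False

d = DoubleTapDetector()
assert d.tap(50, 50, 0) is False
assert d.tap(52, 51, 250) is True    # close and quick: double-tap
assert d.tap(50, 50, 1000) is False
assert d.tap(50, 50, 2000) is False  # too slow: just two single taps
```

Note that the detector never even sees the first tap if it was already reinterpreted as a drag, which is exactly the failure mode described above; a slightly larger slop on touch input would help both cases.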
Again, please consider this as constructive feedback with the intention of improving the product for some specific scenarios. Overall, I am really very happy with GP4, and these are just minor issues in the grand scheme of things.
Hi @DaveBoulden, I think I’m one of the users who pushed the devs to make GP more “touch friendly”, and I think GP4 has positively evolved in that regard. On the other hand, I have a slightly different point of view from you: I think that when you prepare or adapt a gig file (e.g. when you need to access the Wiring tab), you can afford a small Bluetooth keyboard (e.g. I use a Logitech K830), and I think it is even more convenient to do so. As for modifying things live during the gig, then yes, I think everything should be possible with a touch screen. In that case I recognize that GP is not yet perfect, but the devs are open to well-argued suggestions and we have a category for that on this forum.
I completely agree, it is very much more a frustration than any kind of real issue. I had already surmised I will have to take a small keyboard and mouse out with me on the road, but that just irks me when I have a touchscreen!
I am a dev myself, so always try to make a well-reasoned and informed argument or request for something, especially given I am often on the receiving end of such requests.
The K840 has a touchpad, which is enough to get by on the road…
I guess part of the philosophy here was the assumption that live performing musicians will actually use their controllers (knobs/sliders on keyboards or control surfaces, MIDI foot pedals) and the computer will just sit on the side for the most part, perhaps in a “read-only” mode.
Absolutely, and for the most-part, that is exactly how I will use it, plus maybe some usage of a widget or two during pre-gig setup. It’s just those times when you need to quickly “throw something together” just before the gig and I can absolutely carry a small keyboard & mouse (or keyboard with trackpad), but I always strive to avoid carrying another piece of support gear for the occasional “what-ifs”.
Nowadays, part of the artillery is often a touch screen too; possibly that of an iPad with a remote mirrored display, or an OSC controller. One of the strengths of GP4 is its versatility. For example, I control my virtual mixer with a control surface, but when it’s not there, I use the widgets of my global rackspace directly on a touch screen. Random access to a rackspace is probably more difficult with a MIDI controller than with a touch screen (at least theoretically).
I use my Windows touchscreen very much out in front of me, actively using the setlist with the lyrics window, as well as turning widgets on and off for which I do not have other controls set up. I like to think of the UI as a control surface in and of itself. Why not? It’s a touchscreen after all. I definitely wouldn’t attempt heavy editing or building, and would like it a little more touch friendly for live performance, but I can make it work. For some actions I need to use the virtual touch pad on the screen. It helps with grabbing small things or double-clicking. My laptop is in portrait mode, so some menu items at the top get cut off in that position. I manage…
Yes, exactly what I do too. GP UI is the best touch control surface for GP
Yeah, some of the “usual” OS functions like moving, pinch/zoom and right mouse click would be great if they worked without the virtual touch pad (which is a nuisance). If that’s at all possible, of course. There’s gotta be a touch screen library somewhere, innit?
Which is not meant as criticism. Just saying what I’m thinking.