I recently did something very similar myself but the code ended up being quite customized for my needs. I should sometime try to break it out though and make a full article showing how I did it.
For a quick reference though, try looking into LibGDX's GestureDetector and GestureListener. I ended up using the "touchDown" event to capture when the user first touches the screen, then "pan" and "panStop" to watch for movement while still touching. I then created a window, added it to the main Stage, and used the gesture listener to capture events from it and set its location on the screen accordingly. I also use the same window to determine the "touch space" (for pan and panStop) allowed while it's displayed.
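Separated from the libGDX wiring, the window-positioning logic might look something like the sketch below. The class and field names are my own; the real GestureListener callbacks (touchDown, pan, panStop) carry extra pointer/button arguments that are omitted here for clarity.

```java
// Sketch of the floating-pad logic, kept free of libGDX types so the
// idea is clear. In a real project these methods would be called from
// GestureListener's touchDown/pan/panStop callbacks.
public class FloatingPadState {
    public float windowX, windowY;   // where the pad window is drawn
    public float knobDX, knobDY;     // knob offset from the touch origin
    private boolean active;

    // From GestureListener.touchDown: place the window at the touch point.
    public void touchDown(float x, float y) {
        windowX = x;
        windowY = y;
        knobDX = 0;
        knobDY = 0;
        active = true;
    }

    // From GestureListener.pan: move the knob relative to the origin.
    public void pan(float x, float y) {
        if (!active) return;
        knobDX = x - windowX;
        knobDY = y - windowY;
    }

    // From GestureListener.panStop: hide the pad and reset the knob.
    public void panStop() {
        active = false;
        knobDX = 0;
        knobDY = 0;
    }

    public boolean isActive() { return active; }
}
```

The knob offset (knobDX, knobDY) is what you would feed into your movement code each frame while the pad is active.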
Depending also on how you've got your stage/camera setup, you may also need to unproject your original x/y coordinates before using them for the calculations above.
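For reference, with an unrotated OrthographicCamera you'd normally just call camera.unproject(new Vector3(screenX, screenY, 0)). In that simple case the math boils down to something like the following sketch (my own helper, assuming no rotation and a viewport that fills the screen):

```java
// Hand-rolled screen-to-world conversion for the simple case of an
// unrotated orthographic camera whose viewport fills the screen.
// In practice you'd call camera.unproject(...) instead.
public class Unproject {
    public static float[] toWorld(float screenX, float screenY,
                                  float screenW, float screenH,
                                  float camX, float camY, float zoom) {
        // Screen y grows downward; world y grows upward, so flip it.
        float worldX = camX + (screenX - screenW / 2f) * zoom;
        float worldY = camY + (screenH / 2f - screenY) * zoom;
        return new float[] { worldX, worldY };
    }
}
```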
Hope that helps you get started.
Thanks, this covers a few of the things I was thinking of.
If memory serves, project/unproject is only critical if the touch is meant for the game's world coordinates, but not required if you're only interacting with the HUD in screen coordinates, right?
Finally got some crap graphics made and got the touchpad working on default, next is just to add the functionality I'm looking for, even though I know it's likely overkill.
I've been trying to figure out how to implement the libgdx Touchpad ... Seems straightforward enough: set up the skins, create the stage, add the pad to the stage, and then it's just a matter of updating.
I set up a touch pad about a year ago; I don't recall any hidden gotchas.
You're right, first time I tried using a stage, and I think I was just overcomplicating things in my mind.
... I was thinking that each screen should be in its own class, then each stage in its own class that gets called when the appropriate screen is loaded... in other words, the game starts and loads NewGameScreen, which creates a NewGameHUD class that creates and populates the stage, then disposes of the HUD along with the screen?
This sounds pretty standard. Considering that most menu screens are pretty straightforward, I typically use a vanilla stage rather than extending it for each screen. For the game screen I'll use two stages: one for the HUD and one for the world. This allows the use of two coordinate systems and means the widgets in the HUD, which tend to have static screen positions, don't need to be updated when the player/world moves.
That makes sense. For the game itself, I had set up a "world" that parses the map data, loads terrain and other objects, etc., and the game action takes place in world coordinates. Works great on PC, but I want things on Android, and until now the basic HUD I had started didn't even use a stage because there was no interaction...
(I say that as though I've got a lot done... really I'd still be too embarrassed to put it up as a WIP).
That makes sense to minimize the number of setups; if the layout of the buttons works for multiple screens, then you're right, there's no point in figuring it out many times, especially if all that's really needed is changing button captions.
... you touch in the area of the movement controls, which creates the start point where the "touchbackground" gets loaded with the knob at the spot where you touched, and then dragging moves the knob. This way, no matter where you touch on the screen (within the bounds of the touchpad area), the dragging is what gives the movement direction. Any ideas on how that might be implemented, or articles pointing to how this technique might be accomplished?
As an alternative to NetprogsGames' excellent recommendation to use libgdx's gesture detection, the stage's standard input listener could also be used. https://github.com/libgdx/libgdx/wiki/Mouse%2C-touch-%26-keyboard
I'd implement the following pseudocode in the touch portion of the input listener:

On touch down:
    startPoint = touchDownPoint
    draw touchpad at startPoint
On touch up or drag:
    endPoint = touchUpPoint
    result = fancyVectorMath(startPoint, endPoint)
Note: this will break in a multi-touch situation, so some additional code is required to handle that. Also, as long as startPoint and endPoint are generated in the same space and the camera hasn't been rotated, no unprojecting is required, and fancyVectorMath is simply endPoint - startPoint.
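In that simple case, fancyVectorMath reduces to a subtraction plus an optional normalization. A sketch (class and method names are mine):

```java
// endPoint - startPoint gives the drag vector; normalizing it gives a
// unit direction suitable for driving movement independently of how
// far the finger was dragged.
public class DragMath {
    public static float[] dragVector(float sx, float sy, float ex, float ey) {
        return new float[] { ex - sx, ey - sy };
    }

    public static float[] direction(float sx, float sy, float ex, float ey) {
        float dx = ex - sx, dy = ey - sy;
        float len = (float) Math.sqrt(dx * dx + dy * dy);
        if (len == 0) return new float[] { 0, 0 };
        return new float[] { dx / len, dy / len };
    }
}
```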
For sure, I know with multi-touch you have to track the indexes of the touch points or whatever... that pseudocode was pretty much exactly what I was hoping it would be.
Another idea I wanted to play with was to have a menuscreen where the player can customize the size of the actual touchpad to a size they would be comfortable with... if any of you have seen the snes emulator for android called "superretro16", this one allows you to customize the size and positioning of all the buttons to replicate a snes controller... is this doable with stage2d / libgdx? any advice here?
Changing the location, yes; all widgets come with position update methods. Changing the size, also yes (I've never done it for a touchpad, but I see no reason it wouldn't work). The line
touchpad.setBounds(15, 15, 200, 200);
in the example you linked should set the height and width to 200 px, which, if the texture is a different size, should cause some stretching; although you might need to set a fill or stretch flag. Making the knob and background 9patches would probably be beneficial.
In the interest of full disclosure, resizing objects in LibGdx never seems to work quite the way I think it should. So... YMMV...
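One way to drive setBounds from a user preference is to store the pad size as a fraction of the shorter screen edge, then compute pixel bounds at setup time. The scheme below is my own invention, not a libGDX API:

```java
// Compute touchpad bounds from a user-chosen size fraction so the pad
// scales consistently across resolutions. The result {x, y, w, h}
// would feed touchpad.setBounds(x, y, w, h).
public class PadLayout {
    public static float[] bounds(float screenW, float screenH,
                                 float sizeFraction, float margin) {
        // Base the size on the shorter edge so the pad fits in either
        // orientation; anchor it bottom-left with a margin (stage
        // origin is at the bottom-left in scene2d).
        float shortEdge = Math.min(screenW, screenH);
        float size = shortEdge * sizeFraction;
        return new float[] { margin, margin, size, size };
    }
}
```

The sizeFraction would be whatever the player picks on your customization screen, persisted in Preferences.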
That's what I thought, and in the little bit of testing I've had time to do, seems to be how it works...
I aim to use a combination of 9patches and multiple graphics scales to accommodate as many resolutions as possible. Thanks a lot for the help; it seems everything is as straightforward as I was hoping. Also, thanks for saving a lot of time I would have spent testing to figure out how it all works.