I can’t think of a title

Hello again!

In this post I'll try to explain in a bit more detail what has been done code-wise regarding the touch-based interface, since I probably won't be making any significant changes at this point:

Firstly, we needed a way for the widgets to handle touch events: in ScummVM, widgets and dialogs have a function handleMouseDown, which is called on the active dialog when we detect, for example, a mouse-down event (EVENT_LBUTTONDOWN). Now, all we do is change the Android, iPhone etc. backends so that instead of simulating mouse clicks and movements they send new events for finger taps, and add new handler functions for these (currently, only the Android code is present in my branch). “Hold it!” you might say, “this will break the games, since they expect mouse movements!” Fear not! This is where the “touchmapper” comes into play. It is an EventManager that listens for touch events and injects their counterpart mouse events if we are currently in a game (here we have two modes, touchpad mode and direct-input mode; in scummvm-master this is the responsibility of each port, but now it has been moved into this more general place). There are still some things that need to be addressed here, though, such as right-clicking: different ports will probably want to implement right-clicking in different ways (e.g. one port may want a two-finger tap, while another has no multitouch support, or has hardware modifiers that can be mapped).
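To make this concrete, here is a minimal sketch of the idea. These are not the actual ScummVM classes: the Event struct, the event type names and the mapEvent signature are simplified placeholders for illustration.

```cpp
// Simplified stand-ins for illustration; the real event types live in
// ScummVM's Common namespace, and the touch events are from my branch.
enum EventType {
    EVENT_LBUTTONDOWN, EVENT_LBUTTONUP, EVENT_MOUSEMOVE,
    EVENT_TAPDOWN, EVENT_TAPUP, EVENT_FINGERMOVE
};

struct Event {
    EventType type;
    int x, y;   // finger or cursor position
};

// Sketch of the touchmapper: while a game is running it turns touch
// events into the mouse events the game expects; in the GUI it stays
// out of the way, since the widgets handle touch events themselves.
class TouchMapper {
public:
    void setInGame(bool inGame) { _inGame = inGame; }

    // Returns true if a counterpart mouse event was produced in 'out'
    // and should be injected into the event queue.
    bool mapEvent(const Event &in, Event &out) const {
        if (!_inGame)
            return false;

        switch (in.type) {
        case EVENT_TAPDOWN:    out = { EVENT_LBUTTONDOWN, in.x, in.y }; return true;
        case EVENT_TAPUP:      out = { EVENT_LBUTTONUP,   in.x, in.y }; return true;
        case EVENT_FINGERMOVE: out = { EVENT_MOUSEMOVE,   in.x, in.y }; return true;
        default:               return false;
        }
    }

private:
    bool _inGame = false;
    // A real version would also track touchpad vs. direct-input mode;
    // in touchpad mode the finger's delta moves the cursor, instead of
    // the cursor jumping to the finger's absolute position.
};
```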

With our new touch events, the widgets can respond and set their states accordingly. There is also a new widget, a scrollable canvas, which can be dragged in all four directions and in which other widgets can be drawn while still remaining interactive. This is achieved in a similar fashion to the partial text drawing: the scrollable canvas has a “drawable area”, or viewport, which is usually its bounds in x and y. It applies the same viewport to its children: if a child ends up partly outside the viewport, it is redrawn on a separate buffer and only the contents inside the viewport are blitted back, and if it is completely outside we skip drawing it altogether.
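As a rough, self-contained sketch of that viewport logic (Rect here is a made-up type, not ScummVM's Common::Rect, and the draw/blit calls are placeholder comments):

```cpp
#include <algorithm>

struct Rect {
    int left, top, right, bottom;
    bool isEmpty() const { return right <= left || bottom <= top; }
};

// Intersection of two rectangles; empty if they don't overlap.
static Rect intersect(const Rect &a, const Rect &b) {
    return { std::max(a.left, b.left),   std::max(a.top, b.top),
             std::min(a.right, b.right), std::min(a.bottom, b.bottom) };
}

// Draw pass for one child of the canvas: skip it if it is fully outside
// the viewport, draw it normally if fully inside, and otherwise render
// it to a separate buffer and blit back only the visible part.
void drawChild(const Rect &child, const Rect &viewport) {
    Rect visible = intersect(child, viewport);
    if (visible.isEmpty())
        return;                              // completely outside: skip

    bool clipped = visible.left  != child.left  || visible.top    != child.top ||
                   visible.right != child.right || visible.bottom != child.bottom;
    if (clipped) {
        // drawToBuffer(child);              // placeholder: render off-screen
        // blitFromBuffer(visible);          // placeholder: copy the visible part
    } else {
        // drawDirectly(child);              // placeholder: normal full draw
    }
}
```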

Now, when a finger is dragged across the container, we loop through its children and set their positions according to the movement. Currently, we need to redraw all of its children after a movement: in scummvm-master, all widgets are pre-rendered and put into a buffer, so when e.g. the mouse cursor hovers over a button we redraw it in a hovered state, and when it leaves we restore the contents of the buffer. Since we change the positions of widgets we can’t do this, and therefore need to skip committing widgets in movable containers to the buffer for now. I will try to explore other ways of doing this in order to get some performance gains, but this is a relatively clean solution.
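A stripped-down sketch of that drag step (Widget and the redraw call are stand-ins, not the actual GUI classes):

```cpp
#include <vector>

struct Widget { int x, y; };    // stand-in for a real widget

struct ScrollableCanvas {
    std::vector<Widget *> children;

    // Called on a finger-move event with the finger's delta since the
    // last event: shift every child with the finger, then redraw.
    void handleFingerMove(int deltaX, int deltaY) {
        for (Widget *w : children) {
            w->x += deltaX;
            w->y += deltaY;
        }
        // The children moved, so any pre-rendered buffer is stale and
        // everything inside the canvas has to be redrawn.
        redrawAllChildren();
    }

    void redrawAllChildren() { /* draw each child clipped to the viewport */ }
};
```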

Anyway, that is the current state of the code in my branch. A small todo list now would be to:

  • Explore solutions to the above-mentioned points.
  • Allow scrollbars for the scrollable canvas; create a vertical scrollbar.
  • Create functions to populate the scrollable canvas with pictures for games.
  • Fix up bugs! (For some reason, dragging on a list sometimes causes it to spazz out; also, some time ago I seem to have broken tapping on dialogs, such as the About dialog and popup widgets. Whoops :/)
  • Add inertial scrolling (this seems easy enough for the Android port, since we can get the velocity of a flick directly from the API, but I am unsure how this would be implemented in other ports; see the sketch after this list).
  • And some small things, such as adding selection of direct-input and touchpad mode in the menus (with the possibility of changing sensitivity), adding some #ifdefs for the touchmapper, etc.
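For the inertial-scrolling item, here is a minimal sketch of one possible approach, assuming we can obtain the flick velocity from the platform API or estimate it from the last few finger-move deltas (the struct and its fields are made up for illustration):

```cpp
// After the finger is lifted, keep scrolling with the flick's velocity
// and let friction decay it each frame until the motion dies out.
struct InertialScroller {
    float velocityY = 0.0f;   // pixels per frame at the moment of release
    float friction  = 0.95f;  // decay factor per frame (tunable)

    // Called once per frame while no finger is down; returns the scroll
    // offset to apply this frame, or 0 when the motion has stopped.
    int update() {
        if (velocityY > -0.5f && velocityY < 0.5f)
            return 0;
        int delta = (int)velocityY;
        velocityY *= friction;
        return delta;
    }
};
```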

Anyway, that’s all for now!

Gestures! Widgets! And more! – Part 2

Hello, fellow readers!

What has been done since my last post: as mentioned, the widgets have been adapted to listen to touch input: right now they can listen for one finger up, one finger down and swiping with one finger. That means one can now scroll lists, such as the list of saves or folders, by dragging a finger over them, and the selection follows the finger perfectly, although there is no inertial scrolling right now. To accomplish smooth scrolling, I’ve added some commits from the previous GUI-Improvements work: the partial text rendering from before.

Also, there has been some work on a ScrollableCanvas widget. It basically works the same way as the partial text rendering, although a lot of changes were needed to make it work, since widgets in master always draw directly onto the screen’s surface. Now it is possible to draw them on a separate surface, so that only parts of them are rendered.

Hopefully, by the end of this week, I’ll have the scrollable canvas working flawlessly (that is, actually make it scrollable!). Picture widgets also need to be adapted to work with this widget (right now, only buttons work).

After that is done, there will be a lot more questions to tackle: in the mockups, the touch-based UI has a scrollable canvas with pictures representing each game. The scrollable canvas needs to be populated with these pictures, which will probably have to be supplied by the user. But I’ll tackle that problem when I get to it!

Until next time!

Gestures! Widgets! And more!

Hi everyone!

I’ve been away for a couple of days and haven’t been able to write a blog post, but now seems like a good time. For the last couple of days I’ve been working on refactoring the gesture code, so that the new (and old!) widgets can respond to gestures without resorting to ugly hacks and such.

The way this is implemented (currently; subject to change!) is by a couple of new events, such as EVENT_FINGERMOVE, EVENT_TAPDOWN, EVENT_TAPUP etc. Each event is also associated with one or more “fingers”, which contain information about the current position of each tracked finger, thereby enabling widgets at certain places to respond when a finger is tapped on them (right now I’m “reusing” the handleMouseDown function each widget has, but the plan is of course to add handleFingerDown, handleFingerMove and similar functions). Currently, I’ve coded support for these new events in the Android port, but other ports should be trivial to add. Also, since games still rely on mouse events, I’ve had to create an event listener and injector, a “TouchMapper”, into which all code regarding movement of the in-game cursor has been refactored (right now, one can use either direct mode or “touch-pad mode”, where dragging over the screen moves the cursor like a touchpad, tapping the left area simulates a left click and tapping the right area a right click).
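As a self-contained sketch of the touch-pad mode just described (Finger, TouchpadMode and their members are illustrative names, not the actual port code):

```cpp
struct Finger { int id; int x, y; };    // one tracked finger

struct TouchpadMode {
    int cursorX = 0, cursorY = 0;       // in-game cursor position
    int screenWidth = 640;              // assumed screen width

    // Dragging moves the cursor by the finger's delta, like a touchpad,
    // rather than jumping to the finger's absolute position.
    void onFingerMove(int deltaX, int deltaY) {
        cursorX += deltaX;
        cursorY += deltaY;
    }

    // A tap becomes a left click on the left half of the screen and a
    // right click on the right half; returns true for a left click.
    bool onTap(const Finger &f) const {
        return f.x < screenWidth / 2;
    }
};
```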

As usual, code can be found in my repository. Anyways, that’s all for now!