Figure 5 depicts the internal structure of UIIFace and the flow of events. Events are fired by the user's raw input. The Gesture Interpreter detects defined gestures (e.g., zoom or rotate) in the raw input. If no gesture is found, the Basic Interpreter maps Touch and Kinect events to basic cursor and keyboard events. Gestures, speech commands, and basic mouse and keyboard events are then synchronized in the Interaction Manager and forwarded as Combined Events to the Command Mapper, which maps the incoming events to the defined list of interaction commands that can be registered by any Web-based GUI.
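The paper does not specify the component interfaces, but the pipeline can be sketched as follows. All type and member names (RawEvent, CombinedEvent, interpret, dispatch, register) are illustrative assumptions, not the actual UIIFace API; the sketch only mirrors the event flow described above.

```typescript
// Illustrative sketch of the UIIFace event pipeline; names are assumptions.

type RawEvent = { source: "touch" | "kinect" | "speech"; payload: unknown };

interface CombinedEvent {
  kind: "gesture" | "cursor" | "keyboard" | "speech";
  name: string; // e.g. "zoom", "rotate", "click"
  timestamp: number;
}

// Detects defined gestures (e.g. zoom, rotate) in the raw input.
class GestureInterpreter {
  interpret(e: RawEvent): CombinedEvent | null {
    // Real gesture recognition would go here; null means "no gesture found".
    return null;
  }
}

// Fallback: maps Touch/Kinect input to basic cursor and keyboard events.
class BasicInterpreter {
  interpret(e: RawEvent): CombinedEvent {
    return { kind: "cursor", name: "move", timestamp: Date.now() };
  }
}

// Maps Combined Events onto interaction commands registered by a Web GUI.
class CommandMapper {
  private commands = new Map<string, () => void>();

  register(eventName: string, command: () => void): void {
    this.commands.set(eventName, command);
  }

  dispatch(e: CombinedEvent): void {
    this.commands.get(e.name)?.();
  }
}

// Synchronizes gestures, speech commands and basic events into one stream.
class InteractionManager {
  private gestures = new GestureInterpreter();
  private basic = new BasicInterpreter();

  constructor(private mapper: CommandMapper) {}

  onRawEvent(e: RawEvent): void {
    // Prefer a recognized gesture; otherwise fall back to basic events.
    const combined = this.gestures.interpret(e) ?? this.basic.interpret(e);
    this.mapper.dispatch(combined);
  }
}

// A Web-based GUI registers the interaction commands it supports:
const mapper = new CommandMapper();
mapper.register("zoom", () => console.log("zooming view"));
const manager = new InteractionManager(mapper);
manager.onRawEvent({ source: "touch", payload: {} });
```

In this reading, the Command Mapper decouples input modalities from the GUI: the GUI only registers named commands, so new input devices or gesture sets can be added without changing the application code.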