Supporting mouse and touch events



I am writing a game which should support both touch and mouse events.

The problem is that cocos2d-x automatically translates mouse events to touch events. This means that if I register both mouse and touch listeners, a single click delivers both events, which of course breaks my game.

Is there a way to keep mouse events from being propagated as touches?


Very odd… on what device would a user want to play with a mouse and touch at the same time?


I don’t want them to work at the same time, but the user should have the option to choose. Currently it just doesn’t work; the only workaround is to add a configuration screen, which I don’t want.

Assume I have a Windows machine with a touchscreen and a mouse (a very common setup). I don’t know which device the user will use; he can use his mouse or his touchscreen, and I don’t want him to configure anything. There is no reason to.

The problem is that if he decides to use the mouse, a single click generates two different events: one for the mouse and one for touch. The same click produces both a mouse event and a touch event, so the game treats it like a double click, for example.

I want to disable that. If the user clicks the mouse, I want a mouseDown event without also getting touchBegan/touchEnded.

The only solution I have found so far is to check whether the user has a mouse and, if so, not register for touch. But if the user has a mouse and would like to use his touchscreen, he can’t, unless I provide a configuration screen to choose between touch and mouse, which I don’t want.

This sounds like a very common issue in desktop games. The problem is that my input mechanism differs depending on whether touch or the mouse is used: touch works with swipes, while the mouse works with clicks.


And if you only listen to touchBegan/touchEnded, the player is free to choose the input device. Like @KAMIKAZE, I don’t see any advantage in restricting to one input device.

Windows handles the events itself as if they were click events from the mouse. On Android you have an OnClickListener, and if the user connects a mouse (via USB) to the phone, the system will call it. So there it’s the other way around.

Why do you want to block touches if the user uses a mouse? I don’t have a touchscreen Windows machine, but what do AAA games do? Do they block touch as well? I mean, if you play a shooter and touch the screen, will it shoot (because a left mouse click shoots)?


I don’t want to block anything, but when a user uses a mouse, my game’s input mechanism is different from when he uses touch.

When using a mouse I want the player to click on objects; when using touch I want him to swipe.

Because of this behavior I can’t do that, since I also get a touch event for every mouse event.


You should rethink your input concept. I would simply treat touches as left clicks, and if you want a swipe (gesture), check the touch/click’s starting x/y and how far that point moves.

The user won’t understand that a touch and a click behave differently, because his operating system doesn’t work your way. Touch == left click (at least on Windows and Android; for lack of touch devices I can’t speak for other systems).

PS: I can understand that, from the dev point of view, it would be cool to have different behaviors, but for your users it won’t be intuitive.


I understand what you are saying, but if you had seen the game it would make sense to you. With a mouse, precision is good, so clicking works. With touch, precision is impossible, so touch + swipe is the only way to make it work.

So I still don’t think it makes sense that registering for both mouse and touch gives me two events for a single mouse click. At the very least I would expect a way to disable converting mouse clicks to touches.


We’ve hashed over this issue at least once.

Don’t expect anything from the engine anytime soon. If you need this, you’ll either have to work within the current event system or, probably smarter, use the GLFW callbacks directly and write your own touch/mouse input system.

Basically there should be one input event that carries information on whether it came from touch, mouse, or another device, which buttons were pressed, and so on, so that you can handle everything with fully custom logic.

Most games only need the touch events, where either a touch or a mouse click triggers the event, so that a player can use either or both (at separate times) for input.

Most of the remaining games can probably get by treating left-click as touch, and then expanding capabilities with right-click, dragging, and other buttons.

The remaining few games, such as yours (it seems), are probably best off writing their own input system or, if you are just getting started, considering a different engine that fits your game’s very specific needs.


I think a very simple solution would be a setting on the event listener that states whether mouse events should be translated to touches.

I can offer a fix for that in the cocos2d-x library. Any guidance on how I can contribute? Or should I just open a pull request?


I’d probably just write it and use it in your own custom fork of the engine.
Feel free to submit a PR, but again, try not to have any expectations about it being accepted.