Input events for elements other than Layer

In v3, anything that derives from Node can at least handle touch events: Sprite, Label, etc.

In v2, yes, most people used Layer and then decided what was touched after the Layer accepted the event.

I’m currently using v3 and I can’t find any input event related methods in the Node class.

The Widget class though has:
void addClickEventListener(const ccWidgetClickCallback& callback);

Can any Node derived class really handle input events, if so what am I missing?

http://cocos2d-x.org/docs/programmers-guide/event_dispatch/index.html

I’ve reread the event dispatch article that you suggested, and it seems to be for receiving global input events. So if I were to create an EventListenerTouch, I could then manually handle any touch event that is detected; indeed, I’ve already done this and it works fine for that purpose.

I’m still missing the connection to the Node class on this. I can see you can give a Node an EventDispatcher which would notify on events. Would I be right in assuming that setting an EventDispatcher would mean you can then register for input events on that Node?

I guess I was looking for an analogue of addClickEventListener for Node: automatic input event handling, with just the need to register handlers. My apologies if I’m being dense, but how would I handle an input event on a Node, and what do I need to set up for that to work?

I’ve created an EventDispatcher and an EventListener and hooked these up to a variety of objects, but no events come through. It seems my guess was off; some guidance would be of great use.

Show your source.

I’m sure what I have below is the wrong approach, but it tests my assumption about how a Node might work with an EventDispatcher. It’s a minimal, self-contained example with the full set-up.


#include "cocos2dx/cocos2d.h"

#include <iostream>


namespace
{
  auto const WIDTH = 1024;
  auto const HEIGHT = 768;

  class Cocos2dxApp : public cocos2d::Application
  {
  public: // interface
    Cocos2dxApp()
    {
      cocos2d::FileUtils::getInstance()->setSearchPaths( {"some_search_path"} );
      auto director = cocos2d::Director::getInstance();
      auto eglView = cocos2d::GLViewImpl::create("test_view");
      eglView->setFrameSize(WIDTH, HEIGHT);
      director->setOpenGLView(eglView);
      eglView->setDesignResolutionSize(WIDTH, HEIGHT, ResolutionPolicy::EXACT_FIT);
      auto displayLayer = cocos2d::Layer::create();
      auto newScene = cocos2d::Scene::create();

      auto image = cocos2d::Sprite::create("some_example_image");
      image->setPosition(WIDTH/2, HEIGHT/2);

      auto dispatcher = new cocos2d::EventDispatcher();
      auto listener = new cocos2d::EventListenerMouse();

      listener->onMouseDown = [](cocos2d::EventMouse* e)
      {
        std::cout << "Mouse press inside image" << std::endl;
      };

      dispatcher->setEnabled(true);
      dispatcher->addEventListenerWithFixedPriority(listener, 1);
      image->setEventDispatcher(dispatcher);


      displayLayer->addChild(image, 1);


      director->runWithScene(newScene);
      newScene->addChild(displayLayer, 1);
    }

    bool applicationDidFinishLaunching() override { return true; }
    void applicationDidEnterBackground() override {}
    void applicationWillEnterForeground() override {}

  };
}

int main(int argc, char** argv)
{
  auto app = Cocos2dxApp();

  app.run();

  return 0;
}

This is an interesting approach. Why are you not starting with AppDelegate?

Probably because of ignorance. Is AppDelegate something that is generated as part of project set up? I think I remember seeing it, but I’ve learnt how to use the underlying classes so as to fully have a handle on how things are initialised and have minimal code to eliminate cruft.

Do you think my method of set up has something to do with my struggle at getting input events from Node objects?

I think you should start with AppDelegate and add a custom class there or just use HelloWorld to get something working. You can make a class that subclasses cocos2d::Node and do your work there.

cpp-tests has a ton of code.

I wrote this a long time ago.

I appreciate the advice on checking my assumptions, but it would be really great if you could give me a direct answer on how to set up input events for a Node. My approach below clearly doesn’t work; can you confirm whether it’s the right way of doing it, and if not, how should I be doing it?

  auto image = cocos2d::Sprite::create("some_example_image");
  image->setPosition(WIDTH/2, HEIGHT/2);

  auto dispatcher = new cocos2d::EventDispatcher();
  auto listener = new cocos2d::EventListenerMouse();

  listener->onMouseDown = [](cocos2d::EventMouse* e)
  {
   std::cout << "Mouse press inside image" << std::endl;
  };

  dispatcher->setEnabled(true);
  dispatcher->addEventListenerWithFixedPriority(listener, 1);
  image->setEventDispatcher(dispatcher);

You don’t need to create a new EventDispatcher. Get the existing one. Use AppDelegate and build on that. AppDelegate helps bring together everything the engine needs. If you want to start from scratch, I guess you can, but why re-invent it?

If you want to touch the image, why not attach the listener to it? The resources I have provided show you the correct way to do things: the Programmers Guide, the Wiki, and cpp-tests all have the answers.
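
To make that concrete, here is a minimal sketch of that advice (my own illustration, using the placeholder "some_example_image" name from the earlier post): reuse the Director's existing dispatcher rather than creating a new one, and attach a touch listener that hit-tests the sprite's bounds itself.

```cpp
#include "cocos2d.h"
#include <iostream>

// Sketch only: attach a touch listener for an existing sprite, reusing
// the engine's shared EventDispatcher instead of new'ing a second one.
void attachTouchListener(cocos2d::Sprite* image)
{
    auto listener = cocos2d::EventListenerTouchOneByOne::create();
    listener->setSwallowTouches(true);
    listener->onTouchBegan = [image](cocos2d::Touch* touch, cocos2d::Event*)
    {
        // Convert the touch into the sprite's parent space and hit-test
        // against the sprite's bounding box manually.
        auto point = image->getParent()->convertTouchToNodeSpace(touch);
        if (image->getBoundingBox().containsPoint(point))
        {
            std::cout << "Touch inside image" << std::endl;
            return true; // claim (and swallow) the touch
        }
        return false;
    };

    // Use the shared dispatcher already owned by the Director. Scene-graph
    // priority ties the listener's lifetime and ordering to the node.
    cocos2d::Director::getInstance()->getEventDispatcher()
        ->addEventListenerWithSceneGraphPriority(listener, image);
}
```

This needs a running engine (Director, GLView, scene) around it, so it is a fragment rather than a standalone program.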

Edit: Why doesn’t this approach work for you?

_mouseListener = EventListenerMouse::create();
_mouseListener->onMouseMove = CC_CALLBACK_1(MouseTest::onMouseMove, this);
_mouseListener->onMouseUp = CC_CALLBACK_1(MouseTest::onMouseUp, this);
_mouseListener->onMouseDown = CC_CALLBACK_1(MouseTest::onMouseDown, this);
_mouseListener->onMouseScroll = CC_CALLBACK_1(MouseTest::onMouseScroll, this);

_eventDispatcher->addEventListenerWithSceneGraphPriority(_mouseListener, this);

void MouseTest::onMouseDown(Event *event)
{
    // to illustrate the event....
    EventMouse* e = (EventMouse*)event;
    string str = "Mouse Down detected, Key: ";
    str += tostr(e->getMouseButton());
}

void MouseTest::onMouseUp(Event *event)
{
    // to illustrate the event....
    EventMouse* e = (EventMouse*)event;
    string str = "Mouse Up detected, Key: ";
    str += tostr(e->getMouseButton());
}

void MouseTest::onMouseMove(Event *event)
{
    // to illustrate the event....
    EventMouse* e = (EventMouse*)event;
    string str = "MousePosition X:";
    str = str + tostr(e->getCursorX()) + " Y:" + tostr(e->getCursorY());
}

void MouseTest::onMouseScroll(Event *event)
{
    // to illustrate the event....
    EventMouse* e = (EventMouse*)event;
    string str = "Mouse Scroll detected, X: ";
    str = str + tostr(e->getScrollX()) + " Y: " + tostr(e->getScrollY());
}

Edit 2: Touch events respond to the mouse too, so unless you really need mouse up and mouse down specifically, use touch events.

I’ve checked out the link you mentioned, and from what I can see it uses no built-in capability of Node to achieve the outcome. It registers an EventListener with the application-global EventDispatcher and manually checks each event to see whether the touch is inside the Node.

Is this as far as event handling goes for Node? I’m already doing it this way because there seemed to be no built-in mechanism, i.e. one where you don’t have to explicitly check the bounds, similar to what is available for the ui classes like ui::Text or ui::Button.

What built-in handling is there for getting touch events from a Node? There appears to be none, which is fine; I’d just like to know either way.

Thanks for your responses thus far, I realise you probably have very limited time.

Yes, but you can also attach a listener directly to the node if you want.

@grimfate @stevetranby what do you think is a solution here?

The code that you’ve posted does work fine for me, and it’s what I use to extract the global input events and then do something with them. However, this is different from the ui:: classes, where it’s possible to simply register for when something happens to them. Without creating any listeners or event dispatcher myself, a ui:: class could still tell me whether it was clicked.

I don’t think Node has any built-in ability to handle touch. You need to create an EventListener and add it to the EventDispatcher. Additionally, UI elements appear to use the exact same code for touch - creating an EventListener and adding it to the EventDispatcher - so they don’t have some special way of handling touch; it’s just that the touch code has already been written for you so you don’t have to write it.

Also, you mentioned Layer. The built-in touch handling of Layer appears to be deprecated, and Layer’s touch handling is the same EventListener code again anyway.

If you wanted a Node with built-in touch, you could create one yourself. Create a class that extends Node called something like “TouchableNode”, add the touch listener code to it, then have your nodes extend “TouchableNode”.
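
A rough sketch of what that could look like (the class and the onTouched() hook are hypothetical names of mine, not part of cocos2d-x):

```cpp
#include "cocos2d.h"

// Hypothetical "TouchableNode": a Node subclass that registers its own
// touch listener and does the bounds check, so subclasses only need to
// override onTouched().
class TouchableNode : public cocos2d::Node
{
public:
    CREATE_FUNC(TouchableNode);

    bool init() override
    {
        if (!Node::init())
            return false;

        auto listener = cocos2d::EventListenerTouchOneByOne::create();
        listener->onTouchBegan = [this](cocos2d::Touch* touch, cocos2d::Event*)
        {
            // Hit-test the touch against this node's bounds in parent space.
            auto point = this->getParent()->convertTouchToNodeSpace(touch);
            if (this->getBoundingBox().containsPoint(point))
            {
                this->onTouched();
                return true;
            }
            return false;
        };
        // Node already holds _eventDispatcher; scene-graph priority ties
        // the listener's lifetime to this node.
        _eventDispatcher->addEventListenerWithSceneGraphPriority(listener, this);
        return true;
    }

    // Override in subclasses to react to a touch inside the node's bounds.
    virtual void onTouched() {}
};
```

Again, this is a fragment that assumes a running engine; the point is just that the manual listener code lives in one base class instead of being repeated.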


There is no solution out of the box, as far as I know.

In one game we handle things as radman mentions, using a global mouse event handler on our map controller. We have double-trigger touch+mouse prevention, drag-map-to-pan, pinch-to-zoom (using touch), and then scroll wheel to zoom, hover, and drag for a selection box, etc. with mouse input.

We test entities using a precedence hierarchy, similar to scene graph priority (or order), but we allow some entities to trigger even if they are behind another. It’s a bit of an if/else “mess”, but it works; it’s not really a mess so much as a lot of state tracking that we haven’t refactored into higher-level abstractions (essentially hard-coded FSMs).

It wouldn’t be too “difficult” to add to the engine, if only because it’s been done many times by many engines/developers, but it would likely be time-consuming to write. Also, there are conflicting concerns that would need careful design (from scratch, or by looking at other engines) so that the engine’s touch-first approach wouldn’t break but mouse input could also be handled.

Unity might be a good example of two approaches (others probably as well: Urho3D, imGUI, Unreal, SDL_input, etc.). There’s the GUI system’s mouseover events (enter, over, exit) as well as the trigger events on regular game objects (with similar enter, over, exit).

I’d love for the engine to become more desktop/console friendly, but for now it requires a bit of custom work.

@grimfate @stevetranby Thanks for the help/clarification.


@grimfate, looks like my suspicions were correct; thanks very much for the confirmation. I also appreciated the observation of how ui::Widget implements its touch handling; taking a look at that was quite illuminating.

One thing that I had been somewhat assuming in the back of my mind was that there must be some sort of quad-tree-style optimisation going on in the backend of Cocos2d for the touch handling, but it looks like that’s not the case; it appears to be raw collision tests on an element-by-element basis.

@stevetranby, it’s good to hear the perspective from another (I assume) desktop developer. I agree it’d be great if there was stronger support for more desktop centric input, but what is there isn’t completely insignificant, definitely enough to work with. I’d love to have the time to extend the system properly but that’s not really on the cards unfortunately :frowning:. Thanks for the response.

@slackmoehrle, thanks for all the responses and marshaling towards a resolution, much appreciated :slight_smile: