Possible bug in Touch vs MouseEvent locations

I came across this while trying to do collision detection using mouse input.

Following an answer to a recent question of mine, I was trying to reproduce the ui::Widget collision-detection mechanism manually, i.e. the logic inside ui::Widget::onTouchBegan, which does collision detection based on the scene camera and the Widget object. I copied the relevant parts and fed in my EventMouse location, but it wasn't working.

After some digging I've found that Touch events appear to be in a different coordinate system from Mouse ones. Notably, Mouse events follow the documented coordinate system (Y axis 0 at the top), while Touch events have Y axis 0 at the bottom. It also appears that the ui::Widget::hitTest method only works correctly with Touch events, which is confusing, as they seem to be in the "wrong" coordinate system.
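To make the mismatch concrete: a point-in-rect test done in bottom-left-origin (GL) coordinates only succeeds if the input point is in that same space. This is a framework-free sketch with a hypothetical widget rect, ignoring the camera transform that the real ui::Widget::hitTest applies:

#include <cassert>

// Axis-aligned rect with a bottom-left origin, like cocos2d-x GL space.
struct Rect { double x, y, w, h; };

bool hitTest(const Rect& r, double px, double py) {
  return px >= r.x && px <= r.x + r.w &&
         py >= r.y && py <= r.y + r.h;
}

int main() {
  const double frameHeight = 768.0;
  // Hypothetical 100x50 widget sitting on the bottom edge of a 1024x768 view.
  Rect widget{490.0, 0.0, 100.0, 50.0};

  // Touch location (origin at the bottom): lands inside the widget.
  assert(hitTest(widget, 540.954, 3.62594));

  // The same physical click reported as a mouse event (origin at the top)
  // misses the widget when used as-is...
  assert(!hitTest(widget, 540.954, 764.377));

  // ...but hits once the Y axis is flipped into the bottom-origin space.
  assert(hitTest(widget, 540.954, frameHeight - 764.377));
  return 0;
}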

I've constructed a very simple test app to demonstrate the issue. For a mouse click at the bottom centre of my window (1024x768) I get the following output:

Touch:       {540.954, 3.62594}
Mouse Press: {540.954, 764.377}

Can anyone shed any light on whether this is correct behaviour and I'm just misunderstanding something, or whether it really is some sort of bug?

Minimal example code:

#include "cocos2d.h" // adjust to your project's include layout

#include <iostream>


namespace cocos2d
{
  std::ostream& operator<<(std::ostream& out, const cocos2d::Vec2& p)
  {
    out << "{" << p.x << ", " << p.y << "}";
    return out;
  }
}

namespace
{
  auto const WIDTH = 1024;
  auto const HEIGHT = 768;

  class Cocos2dxApp : public cocos2d::Application
  {
  public: // interface
    Cocos2dxApp()
    {
      auto director = cocos2d::Director::getInstance();
      auto eglView = cocos2d::GLViewImpl::create("test_view");
      eglView->setFrameSize(WIDTH, HEIGHT);
      director->setOpenGLView(eglView);
      eglView->setDesignResolutionSize(WIDTH, HEIGHT, ResolutionPolicy::EXACT_FIT);
      auto displayLayer = cocos2d::Layer::create();
      auto newScene = cocos2d::Scene::create();

      auto mouseListener = cocos2d::EventListenerMouse::create();
      auto touchListener = cocos2d::EventListenerTouchOneByOne::create();

      mouseListener->onMouseDown = [](cocos2d::EventMouse* e)
      {
        std::cout << "Mouse Press: " << e->getLocation() << std::endl;
      };
      touchListener->onTouchBegan = [](cocos2d::Touch* t, cocos2d::Event* e)
      {
        std::cout << "Touch: " << t->getLocation() << std::endl;
        return false;
      };

      director->getEventDispatcher()->addEventListenerWithFixedPriority(mouseListener, 1);
      director->getEventDispatcher()->addEventListenerWithFixedPriority(touchListener, 1);

      director->runWithScene(newScene);
      newScene->addChild(displayLayer, 1);
    }

    bool applicationDidFinishLaunching() override { return true; }
    void applicationDidEnterBackground() override {}
    void applicationWillEnterForeground() override {}

  };


}



int main(int argc, char** argv)
{
  auto app = Cocos2dxApp();

  app.run();

  return 0;
}

I don’t think it’s a bug, but I also have no idea why it is like that.

I just use e->getLocationInView() for the mouse. It adjusts the Y axis to match touch, i.e. 0 at the bottom. Strangely, the touch and mouse values are still off by less than a pixel, but at least they pretty much match. Also strangely, using e->getLocationInView() on touches adjusts the Y axis the other way, to work like the mouse.
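My mental model of the four accessors, written as plain functions. These are illustrative stand-ins, not the real cocos2d-x implementations, and they assume the only difference between the two conventions is a flip against the frame height:

#include <cassert>
#include <cmath>

const double FRAME_H = 768.0; // frame height from the test app above

// Assumed raw inputs: mouse Y arrives top-origin, touch Y bottom-origin.
double mouseGetLocationY(double rawY)       { return rawY; }           // top-origin, as observed
double mouseGetLocationInViewY(double rawY) { return FRAME_H - rawY; } // flipped to bottom-origin
double touchGetLocationY(double rawY)       { return rawY; }           // bottom-origin, as observed
double touchGetLocationInViewY(double rawY) { return FRAME_H - rawY; } // flipped to top-origin

int main() {
  const double rawMouseY = 764.377; // from the output above
  const double rawTouchY = 3.62594; // same physical click

  // getLocationInView on the mouse agrees with getLocation on the touch
  // (both bottom-origin), modulo the sub-pixel offset noted above.
  assert(std::abs(mouseGetLocationInViewY(rawMouseY) - touchGetLocationY(rawTouchY)) < 0.01);

  // getLocation on the mouse agrees with getLocationInView on the touch
  // (both top-origin).
  assert(std::abs(mouseGetLocationY(rawMouseY) - touchGetLocationInViewY(rawTouchY)) < 0.01);
  return 0;
}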

Not sure why the cocos team chose to have the origin at the bottom-left, or why mouse and touch work in opposite ways to each other. A complete guess would be that it has something to do with OpenGL (whose window coordinates also put the origin at the bottom-left), but I have no evidence for that.


Thanks for the clarification that it's (likely) not a bug. I can easily work around it; I just thought it was strange :stuck_out_tongue: