Multitouch Support?

Hello all, I have the following three handlers implemented on my LayerColor:

ccTouchesBegan: function (touches) {
    console.log(touches);
},

ccTouchesMoved: function (touches) {
},

ccTouchesEnded: function (touches) {
    console.log(touches);
}
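For context, here is a minimal sketch of how these handlers might be wired onto the layer. The exact enabling call varies across Cocos2d-HTML5 versions (setIsTouchEnabled vs. setTouchEnabled), so treat the names here as assumptions rather than my exact setup:

var MyMultitouchLayer = cc.LayerColor.extend({
    init: function () {
        this._super();
        this.initWithColor(new cc.Color4B(0, 0, 0, 255));
        /* Assumed API name; some builds use setTouchEnabled(true) instead. */
        this.setIsTouchEnabled(true);
        return true;
    }
    /* ...plus the three ccTouches* handlers shown above... */
});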

I then performed the following sequence and monitored my Web Inspector log.

First I touch on the left side of the screen. Web Inspector outputs:
<pre>[
Class
_point: Object
x: 180
y: 270
__proto__: Object
_prevPoint: Object
_viewId: 0
__proto__: Class
]</pre>
So far so good; we can see the coordinates match up. Next, while still holding the initial touch, I also touch on the right side of the screen. Web Inspector now shows:
<pre>[
Class
_point: Object
x: 181
y: 276
__proto__: Object
_prevPoint: Object
_viewId: 0
__proto__: Class
,
Class
_point: Object
x: 861
y: 276
__proto__: Object
_prevPoint: Object
_viewId: 0
__proto__: Class
]</pre>
This is the ccTouchesBegan function being triggered, but it is showing two touches. My understanding was that the touches functions are supposed to provide only the touches that actually triggered that event. Since I performed the touches in sequence, I was expecting just one item.
I then remove my finger on the left side of the screen. Web Inspector shows:
<pre>[
Class
_point: Object
x: 861
y: 278
__proto__: Object
_prevPoint: Object
_viewId: 0
__proto__: Class
]</pre>
Note that these coordinates correspond to the right-side touch point, but I just removed the left-side finger. Maybe I’m misunderstanding, and touches is actually whatever is currently active on screen? That would make sense, but then I would expect nothing to be returned when removing the last touch point. However…
When I remove the last, right-side touch point, Web Inspector shows:
<pre>[
Class
_point: Object
x: 860
y: 279
__proto__: Object
_prevPoint: Object
_viewId: 0
__proto__: Class
]</pre>
This indicates that I just removed the right-side touch, which is true; but that’s not what the previous event indicated (I removed the left touch, yet the touches variable contained the right one).
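To make the sequence above easier to reproduce, here is a small logging harness (a sketch only; it reads the private _point field that the dumps above expose, which is an implementation detail rather than public API):

/* Sketch: a shared helper called from each handler, e.g.
   ccTouchesEnded: function (touches) { dumpTouches('ended', touches); } */
function dumpTouches(label, touches) {
    var n, p;
    for (n = 0; n < touches.length; n = n + 1) {
        /* _point is the private field visible in the dumps above. */
        p = touches[n]._point;
        console.log(label + ' [' + n + '] x=' + p.x + ' y=' + p.y);
    }
}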

So, I’m hoping to get some guidance :) Maybe my understanding is wrong, or someone has gone through this themselves and can help me out. I’ve seen a touch sample application referenced in a few posts, but I can’t find the link. If anyone knows where it is, that’d be appreciated.

My ultimate goal is to have two joysticks (one on each side of the screen), along with a few buttons. I may be able to sneak the buttons in through the cc.Menu approach that people have mentioned, but for two joysticks I would absolutely need to differentiate between touch events.

If you’re still with me here, I appreciate you giving this a read. Any insight is more than welcome.

Cheers.

I tried to implement tracking of every touch during multitouch, but the browser doesn’t support it.
Take your test case:
the left side is located at [x: 180, y: 270] on touchstart, and then changes to [x: 181, y: 276] on touchmove. I can’t confirm that these are the same touch, because there is no identity marking the point [x: 180, y: 270] as the same touch as [x: 181, y: 276].
I tried printing Touch.identifier to the console, but the value is always 0; maybe the browser does not support this property.

So, I’m sorry to say that we can’t track multitouch in this version of Cocos2D-HTML5.
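For anyone who wants to check their own browser, a quick probe outside the framework (a sketch using only the standard Touch Events API):

document.addEventListener('touchstart', function (event) {
    var n, t;
    for (n = 0; n < event.changedTouches.length; n = n + 1) {
        t = event.changedTouches[n];
        /* If identifier is 0 for every finger, the browser gives us
           no way to track individual touches across events. */
        console.log('identifier=' + t.identifier +
                    ' x=' + t.clientX + ' y=' + t.clientY);
    }
}, false);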

I appreciate the feedback. The platform I’m developing for has its own touch implementation; I’ll see if I can leverage those APIs instead. I’m still new at this, so I just wanted to double-check that I wasn’t the only one. Cheers!

In case anyone hits this thread wondering the same thing, I’ve been able to work around this by implementing HTML5 touch events directly.

I’ve defined my LayerColor as follows:

init: function () {
    var canvas;

    /* Our Layer. */
    this._super();
    this.initWithColor(new cc.Color4B(0, 0, 0, 255));

    this.setPosition(new cc.Point(0, 0));

    /* ...usual object creation stuff goes here... */

    canvas = cc.$('#ccCanvas');
    canvas.addEventListener('touchstart', this.onTouchStart, false);

    return true;
},

onTouchStart: function (event) {
    event.preventDefault();
    console.log(event.changedTouches);
}

Now, Web Inspector is returning appropriate touches with valid identifiers.

First touch:

<pre>TouchList
0: Touch
clientX: 239
clientY: 227
constructor: TouchConstructor
identifier: 0
pageX: 239
pageY: 227
screenX: 239
screenY: 315
target: HTMLCanvasElement
webkitForce: 0
webkitRadiusX: 1072693248
webkitRadiusY: 0
webkitRotationAngle: 0
__proto__: TouchPrototype
constructor: TouchListConstructor
length: 1
__proto__: TouchListPrototype</pre>

And the second touch (while still holding the first touch):

<pre>TouchList
0: Touch
clientX: 770
clientY: 260
constructor: TouchConstructor
identifier: 1
pageX: 770
pageY: 260
screenX: 770
screenY: 348
target: HTMLCanvasElement
webkitForce: 0
webkitRadiusX: 1072693248
webkitRadiusY: 0
webkitRotationAngle: 0
__proto__: TouchPrototype
constructor: TouchListConstructor
length: 1
__proto__: TouchListPrototype</pre>

By leveraging the changedTouches property of the event, we get only the touches that affect this specific trigger, each with an appropriate identifier. It does require stepping outside the Cocos2d-X framework; however, it is the standard HTML5 implementation.
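As a quick illustration of the distinction (a sketch against the standard Touch Events API, reusing the canvas variable from the init above): event.touches lists every finger currently on the surface, while event.changedTouches lists only the fingers that triggered this particular event.

canvas.addEventListener('touchend', function (event) {
    /* On touchend, `touches` holds the fingers still down, while
       `changedTouches` holds only the finger(s) just lifted. */
    console.log('still down: ' + event.touches.length +
                ', just lifted: ' + event.changedTouches.length);
}, false);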

The bonus is that since we are implementing this as noted above, we have direct access to any of our Layer’s children as we normally would. If anyone sees any inherent flaws with this approach I’d be happy to hear them; it is a standard HTML5 implementation, but maybe there’s something I’m not seeing in terms of integration with the Cocos2d-X framework.

EDIT: One slight modification. Since the touch events are added to the canvas, we don’t have direct access to the layer: my Layer variable is undefined within the onTouchStart function. Once I resolve passing the Layer into the function, we should be rolling.
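(As an aside, one standard way around that scope problem, assuming Function.prototype.bind is available on the target browser, is to bind the layer instance when registering the listener; a sketch:)

canvas.addEventListener('touchstart', this.onTouchStart.bind(this), false);
/* Inside onTouchStart, `this` is now the layer instead of the canvas. */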

SOLUTION: Alright, I found a way to get around the variable scoping which is somewhat hackish, but works. The first change was in the node function: I added a self reference to the object we create:

MySecondApp.node = function () {
    var pRet;

    pRet = new MySecondApp();
    if (pRet && pRet.init()) {
        MySecondApp.self = pRet;
        return pRet;
    }
    return null;
};

Then, I grab that self reference in my HTML5 touch implementation:

onTouchStart: function (event) {
    var _this, touches, touch, location, n;
    event.preventDefault();

    _this = MySecondApp.self;
    touches = event.changedTouches;
    for (n = 0; n < touches.length; n = n + 1) {
        touch = touches[n];
        location = new cc.Point(touch.clientX, touch.clientY);

        if (location.x < 640) {
            if (_this.objectOnLeftSide.identifier === -1) {
                _this.objectOnLeftSide.identifier = touch.identifier;
                _this.objectOnLeftSide.onTouchStart();
            }
        } else {
            if (_this.objectOnRightSide.identifier === -1) {
                _this.objectOnRightSide.identifier = touch.identifier;
                _this.objectOnRightSide.onTouchStart();
            }
        }
    }
}

Quick explanation:
* We get only the changedTouches from the HTML5 implementation.
* We get the reference to self that we stored on creation.
* We divide the screen in half.
* We check if a touch is already assigned to the left/right objects.
* If no object is assigned, we assign and then call the corresponding object’s onTouchStart.

This has allowed me to create two objects (one on each half of the screen) that fade in when their respective halves are touched and fade out when released, regardless of the order of actions or where the touches move to.
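For completeness, a matching onTouchEnd along the same lines (a sketch; the objectOnLeftSide/objectOnRightSide fields are the same hypothetical objects used above, and their onTouchEnd methods are assumed counterparts to onTouchStart). It releases whichever object owns the ended touch’s identifier and resets it to -1:

onTouchEnd: function (event) {
    var _this, touches, touch, n;
    event.preventDefault();

    _this = MySecondApp.self;
    touches = event.changedTouches;
    for (n = 0; n < touches.length; n = n + 1) {
        touch = touches[n];
        /* Release whichever object was assigned this identifier. */
        if (_this.objectOnLeftSide.identifier === touch.identifier) {
            _this.objectOnLeftSide.identifier = -1;
            _this.objectOnLeftSide.onTouchEnd();
        } else if (_this.objectOnRightSide.identifier === touch.identifier) {
            _this.objectOnRightSide.identifier = -1;
            _this.objectOnRightSide.onTouchEnd();
        }
    }
}

This would be registered the same way as touchstart, via canvas.addEventListener('touchend', ...).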

Thanks very much!

I retried testing multitouch on an iPad and found that the “identifier” property is working.

But on an Android mobile, this property does not work.

Please tell me, what is your test device model?

Hi there, I’m testing on a BlackBerry PlayBook (OS 2.1, which uses a WebKit-based engine). The platform itself does support multitouch (I believe up to 4 touch points) via HTML5.

The original Web Inspector results were with the cc.TouchDelegate implementation, and the workaround used direct HTML5 touch events. I’m not sure how the TouchDelegate handles it now, but it seems the potential is there for multitouch support with the Cocos framework on the PlayBook.