DrawNode - each "drawDot" increases GL verts amount

Hi,
I want to use DrawNode to let the user draw on the screen, simply by calling drawDot. However, the more I draw, the more GL verts there are, and the fps drops very quickly.

That means DrawNode adds new verts each time I call drawDot. Can I somehow force it to use one texture and draw onto it? Then it wouldn’t matter how long the user draws; it should stay fast, because the GL vert count won’t increase.

Here’s my code:

auto listener = EventListenerTouchOneByOne::create();
listener->setSwallowTouches(true);

listener->onTouchBegan = [&](Touch* touch, Event* event){
    if(touchID == -1 && !won){
        Point locationInNode = root->convertToNodeSpace(touch->getLocation());
        drawNode->drawDot(locationInNode, DRAW_RADIUS, DRAW_COLOR);
        touchID = touch->getID();
        lastLocation = locationInNode;
        return true;
    }
    
    return false;
};

listener->onTouchMoved = [&](Touch* touch, Event* event){
    if(touch->getID() == touchID && !won){
        Point locationInNode = root->convertToNodeSpace(touch->getLocation());
        
        if(lastLocation.distance(locationInNode) > 5){ //less drawing
            drawLine(lastLocation.x, lastLocation.y, locationInNode.x, locationInNode.y); //connect point with previous one to create a smooth line
        }
        
        lastLocation = locationInNode;
    }
};

listener->onTouchEnded = [&](Touch* touch, Event* event){
    if(touch->getID() == touchID){
        touchID = -1;
    }
};

listener->onTouchCancelled = [&](Touch* touch, Event* event){
    if(touch->getID() == touchID){
        touchID = -1;
    }
};

_eventDispatcher->addEventListenerWithSceneGraphPriority(listener, this);

void MazeScene::drawLine(int x1, int y1, int x2, int y2){
    int dy = y2 - y1;
    int dx = x2 - x1;
    int stepx, stepy;
    
    if (dy < 0) { dy = -dy;  stepy = -1; } else { stepy = 1; }
    if (dx < 0) { dx = -dx;  stepx = -1; } else { stepx = 1; }
    dy <<= 1;        // dy is now 2*dy
    dx <<= 1;        // dx is now 2*dx
    
    drawNode->drawDot(Vec2(x1, y1), DRAW_RADIUS, DRAW_COLOR);
    
    if (dx > dy)
    {
        int fraction = dy - (dx >> 1);  // same as 2*dy - dx
        while (x1 != x2)
        {
            if (fraction >= 0)
            {
                y1 += stepy;
                fraction -= dx;          // same as fraction -= 2*dx
            }
            x1 += stepx;
            fraction += dy;              // same as fraction += 2*dy
            drawNode->drawDot(Vec2(x1, y1), DRAW_RADIUS, DRAW_COLOR);
        }
    } else {
        int fraction = dx - (dy >> 1);
        while (y1 != y2) {
            if (fraction >= 0) {
                x1 += stepx;
                fraction -= dy;
            }
            y1 += stepy;
            fraction += dx;
            drawNode->drawDot(Vec2(x1, y1), DRAW_RADIUS, DRAW_COLOR);
        }
    }
}

I know there’s OpenCV, but it seems like overkill for this purpose. Also, I didn’t find any good resources on how to use it with cocos2d-x v3.

You probably want to draw into a RenderTexture without clearing the texture every frame. There’s a test in the test project that you can see how this works.


Hi, thanks for help!

Here’s my code:

renderTexture = RenderTexture::create(1136, 768);
renderTexture->setPosition(480, 320);
renderTexture->beginWithClear(0, 0, 0, 0);
renderTexture->end();
drawingLayer->addChild(renderTexture);

drawNode = DrawNode::create();
drawNode->retain();
drawNode->setPosition(88, 64);

addChild(root);

touchID = -1;
won = false;

auto listener = EventListenerTouchOneByOne::create();
listener->setSwallowTouches(true);

listener->onTouchBegan = [&](Touch* touch, Event* event){
    if(touchID == -1 && !won){
        Point locationInNode = root->convertToNodeSpace(touch->getLocation());
        renderTexture->begin();
        drawNode->clear();
        drawNode->drawDot(locationInNode, DRAW_RADIUS, DRAW_COLOR);
        drawNode->visit();
        renderTexture->end();
        touchID = touch->getID();
        lastLocation = locationInNode;
        return true;
    }
    
    return false;
};

listener->onTouchMoved = [&](Touch* touch, Event* event){
    if(touch->getID() == touchID && !won){
        Point locationInNode = root->convertToNodeSpace(touch->getLocation());
        
        if(lastLocation.distance(locationInNode) > 5){ //less drawing
            renderTexture->begin();
            drawNode->clear();
            drawLine(lastLocation.x, lastLocation.y, locationInNode.x, locationInNode.y);
            drawNode->visit();
            renderTexture->end();
        }
        
        lastLocation = locationInNode;
    }
};

listener->onTouchEnded = [&](Touch* touch, Event* event){
    if(touch->getID() == touchID){
        touchID = -1;
    }
};

listener->onTouchCancelled = [&](Touch* touch, Event* event){
    if(touch->getID() == touchID){
        touchID = -1;
    }
};

_eventDispatcher->addEventListenerWithSceneGraphPriority(listener, this);

Works really fast! There’s only one issue: there’s no anti-aliasing. Found this topic:

Seems like there’s no way to do that…

You’d have to write an anti-aliasing shader. You could do this either on the resulting RenderTexture (RT) or on the DrawNode, depending on your desired output and visual look. I’d probably just attach a custom shader to the RT and, as a first draft, write a simple blur shader.

I haven’t done much work here as I’ve been working on pixel-art games and want the aliased look, but you may want to look into various full screen anti-aliasing techniques: MSAA, FSAA, SSAA, FXAA.

I thought so; I’m a newbie in shaders :confused: Here’s what I’ve found:

antialias.fsh:

#ifdef GL_ES
#extension GL_OES_standard_derivatives : enable

varying mediump vec4 v_color;
varying mediump vec2 v_texcoord;
#else
varying vec4 v_color;
varying vec2 v_texcoord;
#endif

void main()
{
#if defined GL_OES_standard_derivatives
    gl_FragColor = v_color*smoothstep(0.0, length(fwidth(v_texcoord)), 1.0 - length(v_texcoord));
#else
    gl_FragColor = v_color*step(0.0, 1.0 - length(v_texcoord));
#endif
}

And here’s what I’ve created (based on my old code, where it was used to add effect on sprite):

AntialiasedRenderTexture.h:

#ifndef __AntialiasedRenderTexture__
#define __AntialiasedRenderTexture__

#include "cocos2d.h"
#include "AntialiasEffect.h"

USING_NS_CC;

class AntialiasEffect;

class AntialiasedRenderTexture : public RenderTexture
{
public:
    static AntialiasedRenderTexture* create(int w, int h, Texture2D::PixelFormat format, GLuint depthStencilFormat);
    static AntialiasedRenderTexture* create(int w, int h, Texture2D::PixelFormat format);
    static AntialiasedRenderTexture* create(int w, int h);
    
    void draw(Renderer *renderer, const Mat4 &transform, uint32_t flags) override;
    
protected:
    AntialiasEffect* effect;
    
    ~AntialiasedRenderTexture();
};

#endif /* defined(__AntialiasedRenderTexture__) */

AntialiasedRenderTexture.cpp:

#include "AntialiasedRenderTexture.h"

AntialiasedRenderTexture* AntialiasedRenderTexture::create(int w, int h, Texture2D::PixelFormat eFormat){
    AntialiasedRenderTexture *ret = new (std::nothrow) AntialiasedRenderTexture();
    
    if(ret && ret->initWithWidthAndHeight(w, h, eFormat))
    {
        ret->autorelease();
        ret->effect = AntialiasEffect::create();
        return ret;
    }
    CC_SAFE_DELETE(ret);
    return nullptr;
}

AntialiasedRenderTexture* AntialiasedRenderTexture::create(int w, int h, Texture2D::PixelFormat eFormat, GLuint uDepthStencilFormat)
{
    AntialiasedRenderTexture *ret = new (std::nothrow) AntialiasedRenderTexture();
    
    if(ret && ret->initWithWidthAndHeight(w, h, eFormat, uDepthStencilFormat))
    {
        ret->autorelease();
        ret->effect = AntialiasEffect::create();
        return ret;
    }
    CC_SAFE_DELETE(ret);
    return nullptr;
}

AntialiasedRenderTexture* AntialiasedRenderTexture::create(int w, int h)
{
    AntialiasedRenderTexture *ret = new (std::nothrow) AntialiasedRenderTexture();
    
    if(ret && ret->initWithWidthAndHeight(w, h, Texture2D::PixelFormat::RGBA8888, 0))
    {
        ret->autorelease();
        ret->effect = AntialiasEffect::create();
        return ret;
    }
    CC_SAFE_DELETE(ret);
    return nullptr;
}

void AntialiasedRenderTexture::draw(Renderer *renderer, const Mat4 &transform, uint32_t flags)
{
    if (_autoDraw)
    {
        //Begin will create a render group using new render target
        begin();
        
        //clear screen
        _clearCommand.init(_globalZOrder);
        _clearCommand.func = CC_CALLBACK_0(AntialiasedRenderTexture::onClear, this);
        renderer->addCommand(&_clearCommand);
        
        QuadCommand q = QuadCommand();
        auto blendFunc = BlendFunc::ALPHA_PREMULTIPLIED;
        q.init(_globalZOrder, _texture->getName(), effect->getGLProgramState(), blendFunc, &_quad, 1, transform, flags); //<-- stuck here, what should I do with _quad parameter
        renderer->addCommand(&q);
        
        //! make sure all children are drawn
        sortAllChildren();
        
        for(const auto &child: _children)
        {
            if (child != _sprite)
                child->visit(renderer, transform, flags);
        }
        
        //End will pop the current render group
        end();
    }
}

AntialiasedRenderTexture::~AntialiasedRenderTexture() {
    CC_SAFE_RELEASE(effect);
}

AntialiasEffect.h:

#ifndef __AntialiasEffect__
#define __AntialiasEffect__

#include "cocos2d.h"
#include "AntialiasedRenderTexture.h"

USING_NS_CC;

class AntialiasedRenderTexture;

class AntialiasEffect : public Ref
{
public:
    CREATE_FUNC(AntialiasEffect);
    GLProgramState* getGLProgramState() const { return _glprogramstate; }
    virtual void setTarget(AntialiasedRenderTexture *t){}
    
protected:
    bool init();
    bool initGLProgramState(const std::string &fragmentFilename);
    AntialiasEffect();
    virtual ~AntialiasEffect();
    GLProgramState *_glprogramstate;
#if (CC_TARGET_PLATFORM == CC_PLATFORM_ANDROID || CC_TARGET_PLATFORM == CC_PLATFORM_WP8 || CC_TARGET_PLATFORM == CC_PLATFORM_WINRT)
    std::string _fragSource;
    EventListenerCustom* _backgroundListener;
#endif
};

#endif /* defined(__AntialiasEffect__) */

AntialiasEffect.cpp:

#include "AntialiasEffect.h"

bool AntialiasEffect::init() {
    return initGLProgramState("Shaders/antialias.fsh");
}

bool AntialiasEffect::initGLProgramState(const std::string &fragmentFilename)
{
    auto fileUtiles = FileUtils::getInstance();
    auto fragmentFullPath = fileUtiles->fullPathForFilename(fragmentFilename);
    auto fragSource = fileUtiles->getStringFromFile(fragmentFullPath);
    auto glprogram = GLProgram::createWithByteArrays(ccPositionTextureColor_noMVP_vert, fragSource.c_str());
    
#if (CC_TARGET_PLATFORM == CC_PLATFORM_ANDROID || CC_TARGET_PLATFORM == CC_PLATFORM_WP8 || CC_TARGET_PLATFORM == CC_PLATFORM_WINRT)
    _fragSource = fragSource;
#endif
    
    _glprogramstate = GLProgramState::getOrCreateWithGLProgram(glprogram);
    
    _glprogramstate->retain();
    
    return _glprogramstate != nullptr;
}

AntialiasEffect::AntialiasEffect()
: _glprogramstate(nullptr)
{
#if (CC_TARGET_PLATFORM == CC_PLATFORM_ANDROID || CC_TARGET_PLATFORM == CC_PLATFORM_WP8 || CC_TARGET_PLATFORM == CC_PLATFORM_WINRT)
    _backgroundListener = EventListenerCustom::create(EVENT_RENDERER_RECREATED,
                                                      [this](EventCustom*)
                                                      {
                                                          auto glProgram = _glprogramstate->getGLProgram();
                                                          glProgram->reset();
                                                          glProgram->initWithByteArrays(ccPositionTextureColor_noMVP_vert, _fragSource.c_str());
                                                          glProgram->link();
                                                          glProgram->updateUniforms();
                                                      }
                                                      );
    Director::getInstance()->getEventDispatcher()->addEventListenerWithFixedPriority(_backgroundListener, -1);
#endif
}

AntialiasEffect::~AntialiasEffect()
{
    CC_SAFE_RELEASE_NULL(_glprogramstate);
#if (CC_TARGET_PLATFORM == CC_PLATFORM_ANDROID || CC_TARGET_PLATFORM == CC_PLATFORM_WP8 || CC_TARGET_PLATFORM == CC_PLATFORM_WINRT)
    Director::getInstance()->getEventDispatcher()->removeEventListener(_backgroundListener);
#endif
}

I’m stuck in class AntialiasedRenderTexture.cpp on line:

q.init(_globalZOrder, _texture->getName(), effect->getGLProgramState(), blendFunc, &_quad, 1, transform, flags); //<-- what should I do with _quad parameter

It seems like you over-engineered this one… Creating a subclass and an effect class may end up giving you the best result, and it’s possible you need the renderer-recreated event listener, etc., but I’d first test without those.

I think you can just attach the shader to the RT’s rendered sprite.

renderTexture->getSprite()->setGLProgramState(_glprogramstate);

If that doesn’t work then the _quad in your version would likely be the RT’s sprite’s polyInfo.triangles and you’d probably want to use a triangles command (see: CCSprite.cpp).

Thanks! I didn’t know that was possible. Anyway, I tested it with a brightness-enhancing shader and it worked. But the antialias shader I posted didn’t. Nothing’s changing…

Hmmm, so can you attach a shader that paints non-transparent or non-black pixels? In other words, are you saying that some custom shader you have tried is working and you see the results, but your specific anti-alias shader is not working?

So, if you change your custom shader to just output red pixels does it work?

A simplistic blur method is to use the default technique by creating a 2x width/height resolution RenderTexture and then changing the sprite’s texture rect to the original width/height. You’ll get the ‘automatic’ blurring by using the GPU’s default interpolation.

The custom blur shader will probably need the resolution of the render texture so that you can calculate the sampling (u,v) for the neighboring pixels surrounding the current fragment’s pixel.

I’ve made an example shader:

#ifdef GL_ES
precision mediump float;
#endif

varying vec4 v_fragmentColor;
varying vec2 v_texCoord;

void main(void)
{
    vec4 c = texture2D(CC_Texture0, v_texCoord);
    gl_FragColor = vec4(1.0, 0.0, 0.0, c.a);
}

And it works (red color on non-transparent pixels).
I just cannot find a good and simple shader.
I’ve found a blur shader as you suggested:

I have a problem with parameters - I don’t really know what to input:

u_texture - what texture? I have no idea.
resolution - not sure about this, should it be width or height? Or maybe width*height?
radius - I gave 4 as an example; is that fine?
dir - vec2(1, 0) probably is a good parameter

I set these three parameters but not u_texture, because I don’t know what texture to pass. And as expected, it doesn’t do anything.

u_texture => CC_Texture0 in cocos2d terms (it’s just the sprite’s texture)
resolution => uniform vec2 { RT.width, RT.height }
radius => adjustable blur factor (how large area to use for sampling pixels)
dir => vec2(0,0) is default, no direction, but otherwise probably could give blur a motion

Look into cpp-test examples in the ShadersTest.cpp (or named something like that) which references vert/frag shaders in that project’s resources.

Thanks, but I don’t know what texture I should give.
Anyway, I found another blur shader in ShadersTest.cpp that doesn’t require an extra texture for who knows what.
But with a blur radius of just 2, fps drops from 60 to 2-3…

u_texture isn’t an additional texture, it’s just the Sprite’s own texture (see the standard color/texture/position shaders).

60 to 2-3 seems like something else is going on, unless you’re testing on a low-end device. Maybe you’ve attached the shader to every single draw?

One simple blur shader even on a full screen sized sprite shouldn’t give that drastic of a slowdown.

Good luck.

I’m testing on an iPad 3, so it’s not the newest device, but performance is actually pretty good (except for this shader).

2-3 fps is unacceptable even on older device.

I attach shader like this:

renderTexture = RenderTexture::create(1136, 768);
renderTexture->setPosition(480, 320);
renderTexture->beginWithClear(0, 0, 0, 0);
renderTexture->end();
auto antialiasEffect = AntialiasEffect::create(renderTexture->getSprite()->getTexture()->getContentSizeInPixels(), 2, 2.0f);
renderTexture->getSprite()->setGLProgramState(antialiasEffect->getGLProgramState());
drawingLayer->addChild(renderTexture);

drawNode = DrawNode::create();
drawNode->retain();
drawNode->setPosition(88, 64);

in AntialiasEffect::init:

    GLfloat gradius = blurRadius;
    GLfloat gsampleNum = sampleNum;
    
    _glprogramstate->setUniformVec2("resolution", resolution);
    _glprogramstate->setUniformFloat("blurRadius", gradius);
    _glprogramstate->setUniformFloat("sampleNum", gsampleNum);

then drawing:

 renderTexture->begin();
 drawNode->clear();
 drawNode->drawDot(locationInNode, DRAW_RADIUS, DRAW_COLOR);
 drawNode->visit();
 renderTexture->end();

works super fast (60fps), after attaching shader goes to 2-3 fps.

The second shader from here: https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson5
still doesn’t do anything. I give it params like this:

auto antialiasEffect = AntialiasEffect::create(renderTexture->getSprite()->getTexture(), renderTexture->getSprite()->getTexture()->getContentSizeInPixels().width * renderTexture->getSprite()->getTexture()->getContentSizeInPixels().height, 8, Vec2(1, 1));

I tried giving different radius and dir values… nothing. Fps doesn’t drop, but nothing changes.

in AntialiasEffect::init:

    GLfloat gresolution = resolution;
    GLfloat gradius = radius;
    
    _glprogramstate->setUniformTexture("u_texture", uTexture);
    _glprogramstate->setUniformFloat("resolution", gresolution);
    _glprogramstate->setUniformFloat("radius", gradius);
    _glprogramstate->setUniformVec2("dir", dir);

Damn, I’m clueless. I know it’s just a problem with the shader itself.

I’ll give it a try in the next couple days if you haven’t solved it in the meantime.

It’d be awesome!

Still no progress…

Can you share your HelloWorld, your AntialiasEffect class, and the shader files — or even the entire test project as full source if you can? That’d speed testing up. I’ve tried to write a simple test project based on your ideas and the few snippets, but haven’t found any perf issues (though I’ve not tested on an iPad 3 or a similarly slow device).

Could share code on https://gist.github.com/ or using this forum’s upload feature.

Pszczoly.zip (9.8 MB)

Here it is. I simplified it as much as possible. Everything’s going on in MazeScene. Basically there are two shaders; you have to uncomment code in “AntialiasEffect.cpp” to switch to the other one, because the parameters are different.

Like I said, the first shader works on iOS but is very slow, and it crashes the app on Android. The second doesn’t do anything.


FYI: the full-screen sampling might give you what you want without shaders. YMMV and you’ll have to test it yourself, but a possibility. I’ll still be testing out your version on android once I get back home where my actual devices are.

iOS OpenGL View setup: 3D Rolling Game

Android: MultisampleConfigChooser (source no longer avail, below are some related resources)

https://stackoverflow.com/questions/4934367/how-to-get-rid-of-jagged-edges-in-android-opengl-es

https://github.com/d3alek/TheHunt---Interactive-graphical-platform-for-AI-Experiments/blob/master/TheHunt/src/com/primalpond/hunt/MultisampleConfigChooser.java

https://stackoverflow.com/questions/7379710/how-to-do-multisampling-in-android-opengl-es

https://developer.android.com/reference/android/opengl/GLSurfaceView.EGLConfigChooser.html

Thanks for the answer. I tried enabling multisampling (yes, number of samples: 4); something changed in the overall look, but line edges are still jagged, and while I’m drawing fps drops from 60 to 20.
I don’t know how to enable that blending, because there’s no such method in the RenderTexture class.
I also tried “MultisampleConfigChooser”, but nothing changed. :confused: