I realized that a sprite whose texture has setAliasTexParameters() applied must be placed at integer positions to avoid pixel bleeding when moved.
That’s fine, but if I use a RenderTexture at screen size and scale it by a non-integer value (for example 1.5f), I get pixel bleeding again even though the position of the sprite is rounded.
I guess I have to take the scale factor of the RenderTexture into account when I calculate the position, but I have not found the correct formula yet.
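For what it’s worth, one formula people use for this (a sketch, not from this thread — the function name is just illustrative) is to snap positions to the screen-pixel grid of the scaled RenderTexture: multiply by the RT’s scale, round, divide back. Note that with a non-integer scale this aligns sprites to screen pixels but not to the RT’s own texel grid, which may be why jitter remains:

```cpp
#include <cmath>

// Snap a position given in RenderTexture coordinates so that, after the
// RT is scaled by 'rtScale' on screen, the sprite lands on a whole
// screen pixel. (Assumption: positions and scale are uniform per axis.)
float snapToScreenPixel(float rtPos, float rtScale)
{
    return std::round(rtPos * rtScale) / rtScale;
}
```

For example, with the 4.5f scale mentioned below, a sprite at x = 10.3 in RT coordinates would snap to 46/4.5 ≈ 10.222, i.e. screen pixel 46. With an integer scale of 4.0 the snapped value is again an exact RT coordinate, which is why integer scales behave so much better.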
The aim is to use a CRT filter, hqx, pixel-art shaders, etc.
So, at the moment I scale down the layer that has all sprites attached as children, so that I have a “design resolution” of 320x240 pixels. The RenderTexture has the same dimensions and is upscaled by the required scale factor to get a fullscreen result. When the screen resolution is 1920x1080, the upscaled dimension is 1440x1080 and the scale factor is 4.5f. Both the sprites and the RenderTexture use setAliasTexParameters().
Small note, I do not use glview->setDesignResolutionSize() in the AppDelegate class.
If I create a RenderTexture with the screen resolution (the Director’s visibleSize), I can round the sprite positions to integers and the output is clean and stable. That means some pixels are wider than expected, but they keep their widths and there is no pixel bleeding. As soon as the RenderTexture is scaled by a non-integer factor, the rounding no longer works.
This may or may not be of some help. It shows how to use a smaller frame buffer directly instead of relying on RenderTexture. Possibly it can be adapted to do what you want?
If you’re set on upscaling, you may just want to look at how others do it and essentially create your own RenderTexture class that’s focused only on rendering everything on screen into one buffer, then using that buffer for a final render pass that applies the CRT/hqx/custom pixel-art shader(s).
Cocos2d-x is an open architecture that lets you do anything you want by calling into OpenGL directly as desired, but out of the box the default use case for this engine is definitely not aimed at pixel art.
(ultimately you may prefer to not use cocos2d-x and instead use a “smaller” rendering framework, like BGFX, where you write your game directly on top of a thin wrapper around OpenGL (and Metal/Vulkan) … really depends on how big or long-term this project is)
This probably wouldn’t work, but maybe you could keep the native screen resolution (or a larger 1920x1080 design resolution) and render every node/sprite scaled up instead of rendering at 320x240. Then you could apply shaders that work directly on the textures. The CRT shader seems like something you’d either want to use frame buffers for, or, if the RenderTexture simply matches the native screen resolution, you apply the CRT shader to that and adjust the effect via shader uniforms (e.g. how many vertical pixels to erase on each CRT line).
You could also letterbox by only using integer scaling factors (2x, 3x, 4x, 5x); whatever resolution that ends up producing will have black bars on the left/right (most likely with widescreen monitor resolutions) or top/bottom. You could also choose a different “design resolution” … say 640x360 … which could give you better scaling on widescreen monitors.
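The integer-scale-plus-letterbox idea above can be sketched as a small helper (struct and function names are illustrative):

```cpp
#include <algorithm>

struct Letterbox
{
    int scale; // integer upscale factor
    int barX;  // black bar width on each side (left/right)
    int barY;  // black bar height on each side (top/bottom)
};

// Largest integer scale of the design resolution that fits the screen,
// plus the bar sizes needed to center the result.
Letterbox fitLetterbox(int screenW, int screenH, int designW, int designH)
{
    int scale = std::max(1, std::min(screenW / designW, screenH / designH));
    return { scale,
             (screenW - designW * scale) / 2,
             (screenH - designH * scale) / 2 };
}
```

For 320x240 on a 1920x1080 screen this gives a 4x scale (1280x960) with 320-pixel bars left/right and 60-pixel bars top/bottom; for 640x360 it gives an exact 3x fit (1920x1080) with no bars at all, which is the “better scaling on widescreen monitors” mentioned above.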
Otherwise there’s probably lower-level things you can try if you read up on G-Buffers, off-screen textures, and how other final screen post-processing works.
I’d have to go read the internal source code of RenderTexture to help any further with other ideas or solutions.
Please excuse the late reply. I am currently busy getting the game content finished, but I will come back to this topic afterwards.
The problem I have right now is that the shaders show pixel jittering when the RenderTexture is not scaled by an integer factor. So it seems to me that the RenderTexture does not produce the output the shaders require, and unfortunately the scaling must happen at the RenderTexture; it cannot be done by the shaders.
No worries. Yeah, if you’re scaling below 1x I’m not sure there are good ways to de-jitter.
If you’re scaling > 1x, but at non-integer scales (e.g. animated zoom in from 1x to 2x) then you should look further at that article I posted above (link below).
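One common technique for sharp non-integer upscaling (often called “sharp bilinear”; I can’t say whether it is exactly what the linked article describes) is to split the scale into an integer nearest-neighbor prescale pass followed by a small bilinear pass of less than 2x, which hides the jitter without blurring whole pixels:

```cpp
#include <algorithm>
#include <cmath>

struct TwoPassScale
{
    int   prescale; // integer nearest-neighbor upscale (first pass)
    float residual; // remaining bilinear scale in [1, 2) (second pass)
};

// Split a target scale > 1 into integer prescale * fractional residual.
TwoPassScale sharpBilinear(float targetScale)
{
    int pre = std::max(1, static_cast<int>(std::floor(targetScale)));
    return { pre, targetScale / pre };
}
```

For the 4.5f case from earlier in the thread this would mean a 4x nearest-neighbor pass into an intermediate buffer, then a 1.125x bilinear stretch to the screen.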
From here on out, based on what I’ve seen asked and answered in this community, I think you’re mostly on your own whenever you’re scaling a RenderTexture (or other off-screen frame buffer) that is then rendered onto the screen. Essentially a RenderTexture (RT) renders all commands into an off-screen frame buffer, and when the RT itself is rendered it basically runs a simple fragment shader to draw that buffer into the final visual screen/window frame buffer.
You could also consider sticking to integer scaling only and just adding black “letterbox bars” around the game RenderTexture (or make it a fun pixel-art “border”, the same way the Nintendo Switch does for its NES/SNES games).
So, for a 1920x1080 screen you figure out the maximum integer scale for the game render texture:
// RT = RenderTexture
// w_screen / h_screen = screen/window resolution (in points if high-dpi retina)
// w_RT / h_RT = width/height of the RT that fits inside the screen
int w_screen = 1920;
int h_screen = 1080;

// find the RT's largest integer scale that fits on screen
float dx = w_screen / 320.0f; // == 6.0
float dy = h_screen / 240.0f; // == 4.5
int scale_RT = (int) std::floor(std::min(dx, dy)); // == 4
int w_RT = 320 * scale_RT; // == 1280
int h_RT = 240 * scale_RT; // == 960

// find the offset to center the smaller RT inside the screen
// (note: subtract first, then halve)
int x_off = (w_screen - w_RT) / 2; // == 320
int y_off = (h_screen - h_RT) / 2; // == 60

// possibly you can just use the screen center instead of a
// top-left/bottom-left offset position ... but I don't remember
// offhand what setPosition() is anchored to by default with a
// RenderTexture
renderTexture->setPosition(x_off, y_off); // or whatever is correct
renderTexture->setScale(scale_RT);        // integer scale