I’m just learning Cocos2D myself, but the general way you go about it is:
-Render the entire scene to a separate render target. This draws the image into a texture instead of onto your screen, and you can use that texture later for whatever you want.
-Render a full-screen quad using a shader that takes the texture from the previous step as input. The quad covers the whole screen, so every pixel gets rasterized and the fragment shader runs once per pixel.
-Inside that fragment shader you can sample the texture and do whatever you want with it.
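A minimal sketch of the fragment shader for that full-screen pass might look like this (GLSL ES 2.0 style; the names u_texture and v_texCoord are my own placeholders, not necessarily what Cocos2D binds for you):

```glsl
#ifdef GL_ES
precision mediump float;
#endif

// v_texCoord: interpolated UVs from the full-screen quad's vertex shader.
varying vec2 v_texCoord;
// u_texture: the texture the scene was rendered into in step 1.
uniform sampler2D u_texture;

void main()
{
    vec4 color = texture2D(u_texture, v_texCoord);
    // Example effect: convert the scene to grayscale.
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(gray), color.a);
}
```

The grayscale line is just a stand-in; this is where you'd put whatever per-pixel effect you actually want.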
The reason somebody told you about the framebuffer_fetch extension is that it lets a fragment shader read from the framebuffer (where everything you draw is stored) at the same time as it's drawing to it, which would let you skip my first step.
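For reference, using that extension in a fragment shader looks roughly like this (a sketch; it only works on GPUs/drivers that actually expose EXT_shader_framebuffer_fetch, and the invert effect is just an example):

```glsl
#extension GL_EXT_shader_framebuffer_fetch : require
#ifdef GL_ES
precision mediump float;
#endif

void main()
{
    // gl_LastFragData[0] is the color already in the framebuffer
    // at this pixel, provided by the extension.
    vec4 dst = gl_LastFragData[0];
    // Example effect: invert whatever is underneath this fragment.
    gl_FragColor = vec4(vec3(1.0) - dst.rgb, dst.a);
}
```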
I don’t actually know how to do any of this using Cocos2D specifically. Probably someone else can chime in.