Unless I’m missing something, it seems like there’s a pretty big hole in the way HD resolutions are handled in Cocos2D-X. In Cocos2D, the use of a retina flag is also “wrong,” but it works for the iPhone. Cocos2D-X needs a more general solution, but the current method doesn’t work very well, IMO. The issue is that it has no concept of pixel density, so a 4" display at 960x640 would use the same assets as an 8" display at 960x640, which is not what you want in most circumstances.
What you want to do is use high-density assets for the 4" display, and low-density assets for the 8" display. The 8" display has more usable real estate but lower pixel density, and will likely use a very different layout.
I’m fine doing the math on all of this myself, but right now there’s no way that I can find to detect the difference between the two displays.
The size queries (getWinSize(), getWinSizeInPixels(), etc.) all return the exact same values, and getContentScaleFactor() always returns 1.
I get that the idea is we’re supposed to set the content scale factor, but without knowing the pixel density I can’t set it correctly.
What should be happening (IMO) is that on a high-density display the content scale factor should be initialized to a value that normalizes the pixels per inch (for retina, that would be 2 or 0.5, depending on which way you go), with suitable values for Android devices (there are APIs for querying pixel density there as well). With that in place, we could always work in points instead of pixels, determine which density of assets to load based on the scale factor, and determine which layouts to use based on points.
getWinSize() would return the points, and getWinSizeInPixels() would return pixels. This would also match the behavior of Cocos2D (Obj-C version).
Does that sound right, or am I missing a call somewhere that would get me the info I need?