any thoughts and plan to support?
The easy stuff is automatically supported – if you are providing images at Retina @1x/@2x/@3x scales, they will be targeted by app thinning in Cocos2d-x applications.
If you are building multiple binary code by architectures, the ones that are not used will be excluded by the app thinning process in iTunes Connect.
Apple’s On-Demand Resources are essentially a better designed version of Android’s Expansion Files.
Let's say I have the following assets folder structure. How should I use @1x/@2x/@3x for them, and what are the best practices for using the app thinning feature?
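One common cocos2d-x v3 pattern is to pick a resource folder and content scale factor from the device's frame size at startup. A minimal sketch, assuming hypothetical folder names (`res/sd`, `res/hd`, `res/xhd`) and width thresholds; adapt both to your own asset layout:

```cpp
#include <cassert>
#include <string>

// Hypothetical helper: choose a resource directory and scale factor from the
// frame width. The folder names and thresholds are assumptions for this
// sketch, not cocos2d-x conventions.
struct ResourceChoice {
    std::string dir;  // folder to add to the file search path
    float scale;      // value to pass to Director::setContentScaleFactor()
};

ResourceChoice pickResources(float frameWidth) {
    if (frameWidth > 1500.0f) return { "res/xhd", 3.0f };  // @3x devices
    if (frameWidth > 800.0f)  return { "res/hd",  2.0f };  // @2x devices
    return { "res/sd", 1.0f };                             // @1x devices
}
```

In `AppDelegate::applicationDidFinishLaunching()` you would then call `FileUtils::getInstance()->addSearchPath(choice.dir)` and `Director::getInstance()->setContentScaleFactor(choice.scale)`. Note that this only organizes loading on the device; as discussed below, app thinning itself strips unused variants only for resources placed in an asset catalog.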
I will have to do some research and testing to see if you can get the @2x and @3x images to load correctly. Are you using cocos2d-x v2 or v3?
I am using cocos2d-x v3 C++.
Based on the Apple docs, app thinning can ONLY be used with asset catalogs.
I am not sure the simple @2x/@3x suffixes will work.
From the documentation:
Image resources are sliced according to their resolution and device family. GPU resources are sliced according to device capabilities. For tvOS apps, assets in catalogs shared between iOS and tvOS targets are sliced and large app icons are removed. When the user installs an app, a variant for the user’s device is downloaded and installed.
Xcode simulates slicing during development so you can create and test variants locally. Xcode slices your app when you build and run your app on a device. When you create an archive, Xcode includes the full version of your app but allows you to export variants from the archive.
I’ve been able to get good results exporting variants and then examining the contents.
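Exporting variants for inspection can also be done from the command line. A sketch with placeholder paths; the `thinning` key and its values are the ones documented for `exportOptions.plist`, but check `xcodebuild -help` for the current list on your Xcode version:

```shell
# Export device-specific variants from an archive (paths are placeholders).
xcodebuild -exportArchive \
  -archivePath MyGame.xcarchive \
  -exportPath ./thinned \
  -exportOptionsPlist ExportOptions.plist

# ExportOptions.plist should contain, among the usual export keys:
#   <key>thinning</key>
#   <string>&lt;thin-for-all-variants&gt;</string>
# or a specific model identifier, e.g. "iPhone7,1", to export one variant.
```

Unzipping the exported .ipa files lets you compare exactly which resources each variant keeps.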
App thinning indeed requires using asset catalogs. With that said, if existing sprite atlases could be fed into an asset catalog, app thinning “should” “just work”, as we say in Apple land – at least for graphics assets. Then there is the binary and other resources, which might be another story.
I also want to implement thinning in our game. The problem is that the only assets that can be sliced by scale (1x/2x/3x) are plain images like .png. If you are using .pvr files with their corresponding .plist files, you have to use data sets instead, which offer very limited slicing options: you can select by target device memory and device type, but I am afraid that is not enough.
I do not understand, then, how to choose assets (our .pvr files) according to the target device's display. It makes no sense.
I would appreciate any suggestions.
So I've just found out that you cannot add a .pvr file to an image set unless you give it a .png/.pdf extension.
I believe now that I need to modify the loader to get the image/data from the catalog, and it also has to account for the added extensions.
Finally, I have thinning working with Cocos. It is quite simple in the end, although the technology is rather clunky: you have to create two data sets, one for the .plist file and one for the .pvr file.
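The two-data-set approach can be sketched as follows. Each data set in the asset catalog gets its own folder with a `Contents.json`; the file and data-set names here (`hero.pvr`, `hero.plist`) are hypothetical, and the exact JSON keys should be cross-checked against what Xcode generates when you add a Data Set in the asset catalog editor:

```json
{
  "data" : [
    {
      "filename" : "hero.pvr",
      "idiom" : "universal"
    }
  ],
  "info" : {
    "author" : "xcode",
    "version" : 1
  }
}
```

Slicing attributes such as target device memory or graphics feature set are added as extra keys on each `data` entry via the Xcode attributes inspector. At runtime the bytes are retrieved by data-set name (on iOS, `NSDataAsset` is the API for this) and handed to the engine's `SpriteFrameCache`/texture loader instead of reading the files from the bundle directly.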
Conclusion: thinning is a great technology, but Apple's implementation of it is a total mess.
Nice work. Could you provide some kind of sample project with a working thinning setup? I have already split my texture atlases into 2x and 3x versions…
Any new progress on the possibility of adding support for Apple's app thinning?
In our app we have .pvr.ccz, .json, .png, and .plist files, but we also use a bunch of JPEGs for background images. Our app is a children's book app, so we needed to keep the size down, and the JPEG backgrounds have been fine so far.
Has there been any improvement in implementing app thinning since 2015? Is Swap's technique the best approach? How is everyone implementing it? I'd think it would be a pretty common feature developers want to use.
We'll be trying to do the thinning pretty soon, so any help or guidance would be much appreciated.
Also waiting to hear if there is any news on this.