I’m no expert on this, but I’m currently building a universal app.
My approach: I ship one central set of best-quality images and scale them down in code for the other devices. This has drawbacks for download size and memory, but it works for me because my game is fairly small and doesn’t use that many images.
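To make the "one master image, scaled in code" idea concrete, here is a minimal sketch. The master width of 2048 px and the function name are my own illustrative assumptions, not anything from a real project; in an actual app you would derive the screen width from `UIScreen.main.bounds` and `UIScreen.main.scale` and apply the factor when creating your sprites or textures.

```swift
import Foundation

// Hypothetical: assume master art is authored at iPad-retina width (2048 px).
let masterWidth: Double = 2048.0

/// Returns the factor by which to scale the master art so it fits
/// a target screen width in pixels. Never scales up past 1.0,
/// since upscaling the master would only blur it.
func scaleFactor(forScreenWidth screenWidth: Double) -> Double {
    return min(1.0, screenWidth / masterWidth)
}

// e.g. a 640 px-wide retina iPhone screen gets factor 0.3125
let factor = scaleFactor(forScreenWidth: 640)
```

The memory cost mentioned above comes from the fact that the full-size master still has to be decoded before it can be scaled down, which is why this only suits small games.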
But for a game with separate iPhone and iPad versions, you can split the image sizes much more cleanly and relax a bit. You still have to prepare retina-size images for both of them, though; which approach you take there is another story.
I don’t know exactly how they did it, but here’s my guess. They may ship a reasonable-quality (or lowest-quality) set of images with the game; then, once the user has downloaded the app and starts playing, the code checks which device it’s running on and streams in higher-quality images on the fly (there’s a good chance the user is still on Wi-Fi, having just finished the download). If that fails for some reason, I’d expect a fallback: load whatever shipped with the game in the first place, which would mean the lowest-quality graphics.
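Here is a rough sketch of the fallback part of that guess. All the names and directory layout are hypothetical; the point is only that lookups prefer a previously streamed HD copy and silently fall back to the bundled low-res one, so a failed or skipped download never breaks the game:

```swift
import Foundation

// Hypothetical asset lookup for the "ship low-res, stream HD later" idea.
struct AssetStore {
    let hdDirectory: URL      // where streamed-in HD images get saved
    let bundledDirectory: URL // low-res images shipped inside the app

    /// Prefer a previously downloaded HD image; fall back to the
    /// bundled low-res copy if the download never happened or failed.
    func url(forImageNamed name: String) -> URL {
        let hd = hdDirectory.appendingPathComponent(name)
        if FileManager.default.fileExists(atPath: hd.path) {
            return hd
        }
        return bundledDirectory.appendingPathComponent(name)
    }
}
```

The actual streaming could be a plain `URLSession` download task kicked off on first launch; Apple’s On-Demand Resources feature also covers this use case these days.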
This is just my opinion; I’ll also wait for someone who knows what’s actually going on.
Thanks for the question!