FW iOS Starling integration, version 1

Currently I’m working on integrating FlashyWrappers into the “Hungry Hero” open source game. This is an important step towards testing the framework against popular AIR / Flash game engines, on iOS and beyond.

So the good news is: it works!

Check out Hungry Hero recorded on an iPad mini Retina, a fullscreen recording at 1024 x 768 @ 60fps. You can see the game’s fps in the lower left (it’s not too different from when I’m playing the game without recording):


The bad news is that the setup might be a bit trickier than I thought, and there are some cases where it seemingly doesn’t work unless the user knows what to do (which they don’t). Otherwise, they end up recording (and seeing) a black screen with the FlashyWrappers logo. Not too exciting!

Of course, I also found less serious but important things I didn’t think about when developing the first version of the iOS extension. Which is only a good thing, because all of the fixes to improve the extension will make it into the next release pretty quickly (about 1-2 weeks tops). For those who might want to experiment with FW 2.2 iOS & Starling:

1) You need depthAndStencil enabled, which means you need to supply your own Context3D to Starling

There is a nice tutorial here:

http://wiki.starling-framework.org/tutorials/combining_starling_with_other_stage3d_frameworks

You can pretty much copy the tutorial, except of course you want only one Starling instance, and you must set up the backbuffer with the depthAndStencil flag set to “true”:

mStage3D.context3D.configureBackBuffer(stage.stageWidth, stage.stageHeight, 0, true);
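
For context, here is a minimal sketch of what that setup can look like, loosely following the linked tutorial. The class and handler names are mine, and Game stands in for your Starling root class:

import flash.display.Sprite;
import flash.display.Stage3D;
import flash.events.Event;
import starling.core.Starling;

public class Main extends Sprite
{
    private var mStage3D:Stage3D;
    private var mStarling:Starling;

    public function Main()
    {
        // request our own Context3D instead of letting Starling create one
        mStage3D = stage.stage3Ds[0];
        mStage3D.addEventListener(Event.CONTEXT3D_CREATE, onContextCreated);
        mStage3D.requestContext3D();
    }

    private function onContextCreated(e:Event):void
    {
        // the crucial part: the 4th argument enables depthAndStencil
        mStage3D.context3D.configureBackBuffer(stage.stageWidth, stage.stageHeight, 0, true);

        // hand the prepared Stage3D over to Starling; with shareContext
        // we drive the rendering (and present()) ourselves
        mStarling = new Starling(Game, stage, null, mStage3D);
        mStarling.shareContext = true;
        mStarling.start();
    }
}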

2) You must call myEncoder.captureFullscreen() after calling context3D.present()

Otherwise you’ll most likely end up with a black screen. I’m still investigating whether anything else was causing the black screen issue I discovered today. In any case, the library must be more vocal when something happens out of order, if possible, or at least the manual needs to be updated about Stage3D-related issues.
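
Assuming the shared-context setup sketched in step 1 (where we drive the frame loop ourselves), the ordering would look something like this inside your ENTER_FRAME handler; myEncoder is your FW encoder instance:

private function onEnterFrame(e:Event):void
{
    mStage3D.context3D.clear();
    mStarling.nextFrame();          // render Starling's frame into the shared context
    mStage3D.context3D.present();   // push the backbuffer to the screen first...
    myEncoder.captureFullscreen();  // ...and only then let FW grab the frame
}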

3) I need to decouple encoding a frame from rendering a frame on iOS

The thing is: if your game is running at 60fps and you want to record at 20fps, you might think – simple, let’s just call captureFullscreen() every 3rd frame. But what you might not realize is that captureFullscreen() is also rendering AIR’s buffer content back to the screen (because in the meantime, AIR’s rendering was redirected). Without doing that, AIR’s rendering is literally crippled and doesn’t display anything. So if you call captureFullscreen() every 3rd ENTER_FRAME, you get recording AND rendering of your game only every 3rd frame (see the sketch below). This is obviously an issue, as your game seems to be “lagging”, even though in reality it’s running perfectly fine at 60fps. So what I need to do is perform captureFullscreen’s rendering on every frame, while doing captureFullscreen’s recording only once every game_fps / movie_fps frames.
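
To illustrate the trap, here is a sketch of the naive throttling approach described above, with the current API (frame counts assume 60fps gameplay and a 20fps recording):

private var frame:int = 0;

private function onEnterFrame(e:Event):void
{
    // ...Starling rendering + context3D.present() as usual...

    // Naive throttling: 60 / 20 = capture every 3rd frame. But since
    // captureFullscreen() is also what puts AIR's redirected output back
    // on screen, the game only becomes visible every 3rd frame too, so
    // it LOOKS like it runs at 20fps even though it updates at 60fps.
    if (frame % 3 == 0)
        myEncoder.captureFullscreen();
    frame++;
}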

There is more stuff I realized, but I won’t go into detail because it has to do with audio coming into play.

Nevertheless, that’s why fullscreen capturing is called “beta” 🙂

P.S.: I said AIR’s rendering was crippled… however, after calling encodeIt() the ANE hands rendering back over to AIR, so don’t worry about that too much 🙂
