
Playing movies with an alpha channel on the iPad

We hit a snag the other day with the development of Tiny Ears. Our strategy to save space on the animation in each storybook was to create lightweight movies to play at the appropriate moment, rather than implementing large quantities of frame animation with all of the assets involved (including Retina support for the new iPad). To save further space we decided to animate only those things that moved, by creating a video animation with an alpha channel so that the background shows through the video.

Problem was, when it came to implementing this strategy on the device itself, the transparent .mov file rendered with a black background. Why? AVPlayer and MPMoviePlayer do not have alpha channel support. Desperate not to have to change our strategy and recreate our animations complete with background, or return to frame animation, I spent some time researching a possible solution on the Internet. After a day of looking, the best I had come up with was a suggestion that we use OpenGL to sample the video as it played and turn certain colours transparent, and I was ready to give it up for lost.

Then, at last, I came across AVAnimator. Hidden in the depths of Google lay this single-page site detailing a wonderful library that seems to do pretty much everything you would want to do with movies in iOS but can’t. There is little documentation, and that single page is the only information that exists, but it was enough. Here was a native movie player for iOS with alpha channel support (and a lot more besides, but we won’t go into that now).

The code itself was really simple to implement, but in order to play the movies you have to do a little bit of transformation first to turn them into the .mvid format that AVAnimator requires. The tools you need are qtexportani and QTFileParser.

Unpack qtexportani. Open a terminal in that location and type the following:

./qtexportani <name>.mov

This will create a file in the same directory called export_<name>.mov.

Now unzip QTFile112.zip. Go into QTFileParser and open the Xcode project. Build & Archive the app and select Distribute. Select Save Built Products and choose somewhere to save it. Then, with the terminal in the same location as the app you just built, run the following command:

./qtaniframes -mvid export_<name>.mov

This will save a file called export_<name>.mvid.

At this point, don’t be afraid of the fact that your new .mvid file is substantially larger than the original .mov. We’re gonna 7Zip it to make it nice and small. The nice thing about AVAnimator is that you can 7Zip all of your .mvid media into one archive and use that in your app, giving all of your media a delightfully small footprint. I’m not gonna tell you how to 7Zip your files – you’re geeks, you should be able to handle that on your own. But at the end of it you should have something like <name>.7z that contains all of your .mvid media.
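If you have a folder full of movies to convert, the whole convert-and-pack pipeline can be sketched as a short shell script. This is just my sketch: it assumes qtexportani, qtaniframes and 7z are all sitting in (or reachable from) the current directory, and the mvid_name helper is mine, purely to show how the output names are derived:

```shell
#!/bin/sh
# Sketch: batch-convert every .mov in the current directory to .mvid,
# then pack the results into a single 7Zip archive for the app bundle.

# qtaniframes turns export_<name>.mov into export_<name>.mvid;
# this helper just derives that final name from the original file.
mvid_name() { printf 'export_%s.mvid' "${1%.mov}"; }

for f in *.mov; do
  [ -e "$f" ] || continue            # no .mov files? do nothing
  ./qtexportani "$f"                 # writes export_<name>.mov
  ./qtaniframes -mvid "export_$f"    # writes export_<name>.mvid
  echo "created $(mvid_name "$f")"
done

# pack every .mvid into one small archive
ls export_*.mvid >/dev/null 2>&1 && 7z a media.7z export_*.mvid
```

Run it in a directory of .mov files and you should end up with a media.7z ready to drop into your project.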

Now comes the fun bit. From the AVAnimator site I could not find a source-only download, but you can grab the library by downloading any of the example projects linked from the website. I grabbed StreetFighter cos that was the example app that did exactly what I wanted.

So, in your xcodeproj, import all of the files in the AVAnimator folder that you will find in your downloaded project. You will also need to import all the files inside the folder called LZMASDK. In the UIViewController where you want to play your animation, add the following code:
// create the animator view
AVAnimatorView *animationView = [AVAnimatorView aVAnimatorViewWithFrame:CGRectMake(0, 0, 1024, 768)];

// create a new object to store your media in
AVAnimatorMedia *media = [AVAnimatorMedia aVAnimatorMedia];

// create a 7Zip resource loader
AV7zAppResourceLoader *resLoader = [AV7zAppResourceLoader aV7zAppResourceLoader];

// tell the resource loader the name of your 7Zip archive, and the name
// of the media file inside it that you want to play
resLoader.archiveFilename = @"media.7z";
resLoader.movieFilename = @"export_video.mvid";
resLoader.outPath = [AVFileUtil getTmpDirPath:@"export_video.mvid"];

// tell the media object which resource loader to use
media.resourceLoader = resLoader;

// create a decoder that will generate frames from the .mvid data
AVMvidFrameDecoder *frameDecoder = [AVMvidFrameDecoder aVMvidFrameDecoder];
media.frameDecoder = frameDecoder;

media.animatorFrameDuration = AVAnimator30FPS;      // this is a constant I made for the frame rate

[media prepareToAnimate];

// request to be notified when the movie has finished playing
// (animatorDoneNotification: is a selector you implement yourself)
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(animatorDoneNotification:)
                                             name:AVAnimatorDoneNotification
                                           object:media];

// you have to add the AVAnimatorView to the superview before you can attach the AVAnimatorMedia
[self.view addSubview:animationView];
[animationView attachMedia:media];

// play the movie
[media startAnimator];
And that’s it. Everything you need to play a movie animation while preserving the alpha channel, creating a transparent animation, in a few short lines of code. Thank you AVAnimator, you wonderful thing you. It’s astounding that more people don’t know you exist.

Storybook Project Overview

I recently wrote a basic overview for the Storybook project that I’m taking to Chile, and I’ve realised that here is probably the best place to put it, so here it is.

Project Overview of Interactive Storybook for iPad
from Radical Robot

The Interactive Storybook for iPad project is designed to create an educational storybook for children between the ages of 4 and 7 that will help them learn to read sentences while providing an engaging and entertaining experience. The age range is designed to target children moving from reading single words, to sentences.

Many interactive storybooks are designed to entertain and tell a story, but provide little towards assisting the child to read the story for themselves. In most cases, this assistance is provided by parents or learning professionals, however these people are not always available to assist the child when they want. This project will utilise Speech Recognition technology to listen to the child as they read the story out loud, so as to provide encouragement, feedback, assistance and rewards at the point at which it is needed. Face detection will be used to determine whether or not the child is reading from the app rather than talking to someone off screen and will therefore serve to provide more accuracy to the speech recognition. Speech recognition can be disabled for adult led enjoyment.

As the child reads the story, the app will listen to their progress. When the child reads a word incorrectly, stumbles on pronunciation or takes a long time to read a word (the app only ‘listens’ when the child is looking at the app), it will step in and prompt the child with assistance using the Phonics learning system. When the child correctly pronounces the word, audible (‘Well done!’) and visual (animations) feedback is used to provide rewards.

The app will monitor the child’s progress over time (between readings of the story as well as within a reading session) so that feedback can be adjusted to the child’s progress. This means that words that are consistently mispronounced will receive more intervention and greater rewards for success than words that are more often correctly read. The actual form that the feedback/reward system will take is currently in development.

As additional rewards, at the end of every page, the story so far can be animated and interacted with. These animations can be expanded into games that will assist the child with learning the words that they have been struggling with in a fun and interactive way.

The app is designed to be a fun experience whether read together with a parent or alone. When read with a parent then the speech recognition and face detection can be disabled so that mum or dad can provide the learning value and assist their child in playing the games. However, whenever the child wants to play and mum’s busy, then the Speech Recognition can be activated to provide the required assistance and encouragement during self directed play.

This app is still in development, with a working prototype expected around March. Development at this time is focused around creating the speech recognition technology for children.

Creating an Intelligent, Interactive Storybook for Kids

Apparently, 70% of all parents with iPads let their kids use them. I’ve seen kids as young as 2 pick an iPad up and start using it with little or no direction. It seems that with this tablet there is a device that kids can use, regardless of their current educational state or coordination. iPad developers seem to have caught on to this and the kids app market is a steadily growing one, with educational and gaming apps appearing all the time.

I first encountered kids using iPads when I spent time with Ian’s sister and family in Canada earlier this year. I’d taken my iPad with me and downloaded a bunch of kids games for Olivia (3) and Chloe (1) to play with. One of the first things I noticed, however, was the poor quality of many of the apps, even those with big names like Peppa Pig behind them. Storybooks especially seemed to be nothing more than a digital version of the analogue book, with the added disadvantage of not being chewable or usable for hitting your little sister with.

I also looked at how the girls interacted with the educational apps that I downloaded. Often they were not engaging enough to retain interest for more than a minute or two at a time, or did not offer enough incentive to continue when things got hard. Apart from one game, a memory card game, the average time that Olivia engaged with an app was about 2 minutes. I started to wonder if I could do better and came up with the idea for an interactive storybook that helped kids between the ages of 4 and 7 learn to read.

I saw two problems with trying to create an educational storybook app. One was of entertainment and the other was feedback. While watching Olivia use the educational apps, I noticed that the amount of engagement she had with the app increased if an adult was there with her as she used the app and gave constant feedback and assistance when she needed it. However, the adults were not always available to give her the attention she needed and so she would put the iPad down and do something else instead.

I wondered if we could create the sense of feedback and support that an adult would normally provide by using Speech Recognition to listen to the child as they read the storybook and provide the feedback, encouragement, rewards and assistance at the moment it is required. We could use face detection on the front facing camera on the iPad 2 to determine whether or not the child was reading the book and use that as a guide to whether or not to listen or ignore what the app heard. Combine this with fun interactions, animations, multiple paths through the story and embedded games and I felt we could create an engaging, interactive learning experience.

Working with the Strong Steam team we are creating the Speech Recognition and Face Detection technology to use within the app and are in the process of putting together a prototype app. We’re looking for an early years learning professional with an interest in technology to help us out with how we make the app as engaging as possible while retaining the educational element. If you are one, or you know of one who might be interested, please do get in touch.

Radical Robot is off to Santiago for Startup Chile

Two weeks ago, Radical Robot received some fantastic news – we got accepted into the Startup Chile accelerator programme. This is very exciting, not only as it means that we get to spend 6 months in beautiful Chile, but it also means that we get a period of funded client-free time to work on our fantastic new product – an interactive educational children’s iPad storybook with speech recognition.

Since Radical Robot started a year ago we have become a successful mobile application agency, but we have always wanted to be more of a product house than a client-led agency. Social Ties was our first attempt at developing a product, but although we are very pleased with the app and are happy with our downloads since it went live in both the Android and iTunes app stores, we have realised that monetization of the app would lead us down a path that we were not that interested in. This realisation has allowed us to pursue a number of other product ideas, and we are happy that our favourite idea has been accepted into Startup Chile.

The idea is to create an interactive storybook for children aged 4-7, kids just learning to read sentences. This app will utilise Speech Recognition to listen to the child as they read the story out loud, enabling focussed feedback, assistance and support at the moment it is required. Along with story animations, positive feedback, reward systems and embedded games, this creates an educational and entertaining tool for early years language learning. The speech recognition can be disabled to provide a fun experience that parent and child can enjoy together, or left on to be used as a self directed learning tool for when mum and dad are busy.

Right now, the aim is to concentrate on developing and perfecting the speech recognition technology. We are using OpenEars / Pocket Sphinx as a basis and training it to deal with kids’ voices. Then we shall be looking for a story to form the basis of our first app. We would dearly love to meet someone with speech recognition experience to join our AI team as a consultant, and learning professionals who would like to get involved in creating this exciting, cutting edge learning experience. If you are interested, please contact me at Emily@radicalrobot.co.uk, on twitter @fluffyemily or call on 07768646287.

Also, we need to come up with a cool name for this new product. If you have any suggestions, please do let me know via the email address or twitter handle above. We’re very excited here and are looking forward to our adventure and getting stuck into this fascinating challenge.