We hit a snag the other day with the development of Tiny Ears. Our strategy for saving space on the animation in each storybook was to create lightweight movies to play at the appropriate moments, rather than implementing large quantities of frame animation with all of the assets that would involve (including, now, Retina support for the new iPad). To save further space we decided to animate only the things that actually moved, by creating video animations with an alpha channel so that the background would remain visible through the video.
The problem was that when it came to implementing this strategy on the device itself, the transparent .mov file rendered with a black background. Why? AVPlayer and MPMoviePlayer do not have alpha channel support. Desperate not to have to change our strategy and recreate our animations complete with backgrounds, or return to frame animation, I spent some time researching a possible solution on the Internet. After a day of looking, the best I had come up with was a suggestion to use OpenGL to sample the video as it played and turn certain colours transparent, and I was ready to give it up for lost.
Then, at last, I came across AVAnimator. Hidden in the depths of Google lay this single-page site detailing a wonderful library that seems to do pretty much everything you would want to do with movies on iOS but can’t. There is little documentation and that single page is the only information that exists, but it was enough. Here was a native movie player for iOS with alpha channel support (and a lot more besides, but we won’t go into that now).
The code itself was really simple to implement, but in order to play the movies you have to do a little bit of transformation first to turn them into the .mvid format that AVAnimator requires. The tools you need are qtexportani and QTFileParser.
Unpack qtexportani. Open a terminal in that location and type the following:
./qtexportani <name>.mov
This will create a file in the same directory called export_<name>.mov.
Now unzip QTFile112.zip. Go into QTFileParser and open the Xcode project. Build & Archive the app and select Distribute. Select Save Built Products and choose somewhere to save it. Then, with the terminal in the same location as the app you just built, run the following command:
./qtaniframes -mvid export_<name>.mov
This will produce a file called export_<name>.mvid.
At this point, don’t be afraid of the fact that your new .mvid file is substantially larger than the original .mov. We’re gonna 7Zip it to make it nice and small. The nice thing about AVAnimator is that you can 7Zip all of your .mvid media into one archive and use that in your app, giving all of your media a delightfully small footprint. I’m not gonna walk you through 7Zipping your files – you’re geeks, you should be able to handle that on your own. At the end of it you should have something like <name>.7z that contains all of your .mvid media.
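(If you want a starting point and have the p7zip command-line tools installed, a single command along these lines should do it – the archive name here just matches the one used in the code further down, and you can list as many .mvid files as you like:)
7za a media.7z export_<name>.mvid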
Now comes the fun bit. I could not find a source-only download on the AVAnimator site, but you can grab the library by downloading any of the example projects linked from their website. I grabbed StreetFighter cos that was the example app that did exactly what I wanted.
So, in your xcodeproj, import all of the files in the AVAnimator folder that you will find in your downloaded project. You will also need to import all the files inside the folder called LZMASDK. Then, in the UIViewController where you want to play your animation, add the following code:
// create the animator view
AVAnimatorView *animationView = [AVAnimatorView aVAnimatorViewWithFrame:CGRectMake(0,0,1024,768)];
// create a new object to store your media in
AVAnimatorMedia *media = [AVAnimatorMedia aVAnimatorMedia];
// Create a 7Zip resource loader
AV7zAppResourceLoader *resLoader = [AV7zAppResourceLoader aV7zAppResourceLoader];
// tell the resource loader what the name of your 7Zip archive is, and the name of the media file inside it you want to play
resLoader.archiveFilename = @"media.7z";
resLoader.movieFilename = @"export_video.mvid";
// outPath is where the extracted .mvid will be written; the tmp directory is fine for this
resLoader.outPath = [AVFileUtil getTmpDirPath:resLoader.movieFilename];
// tell the media holder which resource loader to use
media.resourceLoader = resLoader;
// Create decoder that will generate frames from Quicktime Animation encoded data
AVMvidFrameDecoder *frameDecoder = [AVMvidFrameDecoder aVMvidFrameDecoder];
media.frameDecoder = frameDecoder;
media.animatorFrameDuration = AVAnimator30FPS; // a constant I defined for the frame duration (1.0/30.0 for 30 FPS)
[media prepareToAnimate];
// request to be notified when the movie is played
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(animatorDoneNotification:)
name:AVAnimatorDoneNotification
object:media];
// you have to add the AVAnimatorView to the superview before you can attach the AVAnimatorMedia
[self.view addSubview:animationView];
[animationView attachMedia:media];
// play the movie
[media startAnimator];
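For completeness, here is a minimal sketch of the notification handler that the selector above points at – the method name just has to match the selector you registered, and what you do inside it (tidy up, remove the view, and so on) is entirely up to you:
// called when the animation has finished playing
- (void) animatorDoneNotification:(NSNotification*)notification {
  // stop observing once the animation is done
  [[NSNotificationCenter defaultCenter] removeObserver:self name:AVAnimatorDoneNotification object:nil];
  // remove the animation view from its superview here if you no longer need it
}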
And that’s it. Everything you need to play a movie animation while preserving the alpha channel, giving you a transparent animation in a few short lines of code. Thank you AVAnimator, you wonderful thing you. It’s astounding that more people don’t know you exist.
Hi there,
I got to this post after going through pretty much exactly what you went through, so first of all, thanks for taking the time to post it!
Can you please let me know how you are generating your MOV files? Any MOV files I generate through Motion / Final Cut Pro return an error when I try to use the qtaniframes command line tool. The error message that gets logged in the console is “found XXXX samples but only XX frames of video”.
Cheers,
Rogerio
Sorry Rogerio, I don’t generate my original MOV files – they are provided to me by my designers. I have no knowledge about that side of things.
Hi Emily
Thanks for sharing your trials and tips. I just stumbled over AVAnimator, and kinda felt “hmmm, this is a bit too good to be true” from a little page on the net – hence googling around further before ending up here!
I’m also astounded that more people don’t know it exists. This is one of the few posts by anyone discussing using it. (The looping/combining videos feature looks pretty handy also.)
Thanks again for sharing!
Hi guys
I am glad you are finding AVAnimator useful. To get a feel for how things work, you can also try the “APNG” app in the App Store; it is a free demo that shows how an animated PNG (.apng) can be rendered with AVAnimator.
@Rogerio
Re the problems importing the .mov file: run your movie through qtexportani before attempting to convert with qtaniframes, and that should fix the problem. If it does not, then there is a solution that works 100% of the time, which is to export the .mov as a series of .png images. Then create a new .mov from the series of images and make sure to save it with the Animation codec. Just be sure to write down the frame rate and import the images at the same frame rate, and you should be good to go.
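(For example, if you happen to have ffmpeg installed, a command along these lines should handle the image-series-to-.mov step – qtrle is ffmpeg’s name for the Animation codec and argb keeps the alpha channel; adjust the frame rate and the filename pattern to match your exported images:)
ffmpeg -framerate 30 -i frame%04d.png -c:v qtrle -pix_fmt argb output.mov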
Hello. Thanks for the great post. I’ve got this working fine on the simulator, but on devices I am getting errors in some of the assembly code (specifically, maxvid_decode_arm.s).
Have you been able to get this to work on the most current Xcode versions? I’m on a crunch-time project and I was really hoping this would have been the silver bullet for the problems I was having.
Hi Chuck.
I recently wrote to Mo, who’s responsible for AVAnimator, about this exact issue. Here is his response:
Actually, a much improved fix is described here:
http://stackoverflow.com/questions/12779502/xcode4-5-assembler-fails-to-compile-files-that-xcode4-4-handled-perfectly
Yes, Xcode 4.5 has made some changes that break compilation of ASM code. The fix is easy: just add the -no-integrated-as flag to the compilation options for the maxvid_decode_arm.s file in AVAnimator.
Like so:
Choose the project file in the left window that lists files (it is the one at the top with the blue icon).
Select your Target in the “TARGETS” list.
Select the “Build Phases” Tab.
Double click on maxvid_decode_arm.s, then add -no-integrated-as by pasting it into the list that comes up.
Can you link to the source for qtexportani and QTFileParser? Do you have any more information on these utilities, such as who made them?
Hi Emily!
Are you familiar with the openFrameworks toolkit for iOS?
Do you know if it is possible to use this library with an openFrameworks project?
cheers
Hi, I am familiar with openFrameworks but have not had the occasion to use it. I cannot see why AVAnimator would not work with it, but to be sure I suggest you contact Mo, the author of AVAnimator.
Looks like the lib is available on github:
https://github.com/mdejong/AVAnimator
I also had to set the deployment target to 10.7 from 10.8 when building qtaniframes because I’m running OS X 10.7. It would segfault every time I ran it otherwise.
I’m attempting to follow the tutorial posted here for a current project using the AVAnimator library. I’m not the most experienced Objective-C programmer, and I’m running into what is probably a very simple error to fix.
When I follow the example above line for line, changing only the name of my archive file, I get the error “AVAnimator must be loaded into a window before attaching media.” This is in Xcode 4.3, ARC enabled, Storyboard enabled. I know I’m including the correct code to load the AVAnimatorView into the superview. That is:
[self.view addSubview:animationView];
[animationView attachMedia:media];
And here’s where I get confused. If I start a new project without Storyboard enabled, and use exactly the same code, I get no error. Any idea what’s different about Storyboards that’s preventing this code from working?
AVAnimator, you saved my app.
So I finally got AVAnimator up and running. I had some difficulty initially, primarily because our app had been designed with ARC and Storyboard enabled. Storyboards leave no option to link your main window as an outlet (hence the “AVAnimator must be loaded into a window” error) and trying to mix ARC and non-ARC code leads to a nightmare of compiler flags and uncaught logic errors. However…
After a crash course in writing memory management code and a tutorial on how to work with a nib, I was able to integrate AVAnimator. This let me boost the frame rate for my animations from 11 frames a second to 30. I can also run two animations at the same time, both with alpha transparency, and without apparent memory leaks!
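(For anyone else hitting the “must be loaded into a window” error who doesn’t want to abandon Storyboards: one workaround I’ve seen suggested – assuming the cause really is just that the view isn’t in a window yet when attachMedia: runs – is to defer the attach until viewDidAppear:, roughly like this, where animationView and media are properties holding the objects from the post above; the property names are only for illustration:)
// viewDidAppear: runs once the view is actually in a window, so attaching here avoids the error
- (void) viewDidAppear:(BOOL)animated {
  [super viewDidAppear:animated];
  [self.view addSubview:self.animationView];
  [self.animationView attachMedia:self.media];
  [self.media startAnimator];
}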
I am curious though if anyone else has successfully managed to implement AVAnimator with ARC enabled.
@Jesse Laughlin
Are you able to run this on a project which uses Storyboards?
Is anyone able to run this with Storyboards? I am getting the error “AVAnimatorView must have been added to a window before media can be attached”.
@Rogerio
In that case just open the export_movie.mov with QuickTime, then re-export it. It works.
Hey everyone, just an FYI here. AVAnimator 2.0 was just released:
http://www.modejong.com/AVAnimator/
The biggest new feature is H.264 encoding for videos with an alpha channel (not possible with plain H.264). Also, memory usage is significantly improved by dynamically memory mapping just the frame being displayed. In addition, there is a new command line tool that makes it easier to deal with conversion to and from QuickTime.
Here is a blog post about h.264 with an alpha channel:
http://iosgraphicsanimation.wordpress.com/2013/06/05/h-264-video-with-an-alpha-channel/
The examples given on the AVAnimator site are all crappy movies. Is it possible to use good H.264 movies without losing any quality when viewing them on the iPad? I’m mostly worried about feathering edges to total transparency (this would allow playback of movies with masks that remove the square look from them), color gradients, and quality blending of this transparency with the background used.
The library provides code to enable playback of lossless video content with an alpha channel under iOS. Complaining about “crappy movies” is like saying you don’t want to use PNG images because you don’t like some example PNG images you saw online. You have to provide the content and verify that the quality and size work for your specific application. By definition lossless is higher quality than h.264 because h.264 is a lossy format like JPEG.
Hi, I am a Chinese developer. I want to render alpha video using AVVideoCompositionCoreAnimationTool. Can you help me? Thank you.
Hi!
I know that this tutorial is some years old, but I have the same problem now and can’t find a solution.
I downloaded qtexportani and QTFileParser, but when I try to run qtexportani on my video (a .mov file with alpha information) I get the following error:
rm: export_video.mov: No such file or directory
qt_export --audio=0 --video=rle,,100 --keyframerate=100000 video.mov export_video.mov
./qtexportani: line 31: qt_export: command not found
wrote export_video.mov
Do I need additional libraries? Or is there a newer and better solution now for playing video with a transparent background on iOS devices?
Thanks and best wishes!
I solved my problem using MvidMovieMaker (https://github.com/mdejong/MvidMovieMaker). Now on the simulator everything works perfectly fine. I can play transparent video and audio. But when I try to run the project on a device I get the following error:
*** Assertion failure in -[SegmentedMappedData mapSegment], /Users/Franzi/Documents/transparentVideo/transparentVideoTest/transparentVideoTest/AVAnimator/SegmentedMappedData.m:300
*** Terminating app due to uncaught exception ‘NSInternalInconsistencyException’, reason: ‘mmap result EINVAL’
Has anyone encountered the same problem and can give me some tips about possible reasons/solutions?
Thank you and best wishes!
Hi Franziska,
Can you give me some suggestions or steps for how to convert it?
Thanks,
You need this tool installed as well: http://www.omino.com/sw/qt_tools/
Just an FYI, but a new AVAnimator 3.0 release is now online. http://www.modejong.com/AVAnimator/