iOS - Mixing Images and Video using AVFoundation
I'm trying to splice images into a pre-existing video to create a new video file, using AVFoundation on the Mac.
So far I've read the Apple documentation example, and these posts:
AssetWriterInput for making Video from UIImages on iPhone Issues
Mix video with static image in CALayer using AVVideoCompositionCoreAnimationTool
AVFoundation Tutorial: Adding Overlays and Animations to Videos
and a few other links.
Now these have proved pretty useful at times, but my problem is that I'm not creating a static watermark or overlay; what I want is to put images in between parts of the video. So far I've managed to get the video, create the blank sections for these images to be inserted into, and export it.
My problem is getting the images to insert themselves into these blank sections. The only way I can see to do it feasibly is to create a series of layers that are animated to change their opacity at the correct times, but I can't seem to get the animation to work.
The code below is what I'm using to create the video segments and the layer animations.
// https://developer.apple.com/library/ios/documentation/audiovideo/conceptual/avfoundationpg/articles/03_editing.html#//apple_ref/doc/uid/tp40010188-ch8-sw7

// Let's start by making our video composition
AVMutableComposition* mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack* mutableCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableVideoComposition* mutableVideoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:gVideoAsset];

// If the first point's frame doesn't start at 0
if (gFrames[0].startTime.value != 0)
{
    DebugLog("Inserting vid at 0");

    // Add the video track to the composition track with a time range from 0 to the first point's startTime
    [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, gFrames[0].startTime) ofTrack:gVideoTrack atTime:kCMTimeZero error:&gError];
}

if (gError)
{
    DebugLog("Error inserting original video segment");
    GetError();
}

// Create our parent layer and video layer
CALayer* parentLayer = [CALayer layer];
CALayer* videoLayer = [CALayer layer];

parentLayer.frame = CGRectMake(0, 0, 1280, 720);
videoLayer.frame = CGRectMake(0, 0, 1280, 720);

[parentLayer addSublayer:videoLayer];

// Create the offset value that should be added to each point where a new video segment should go
CMTime timeOffset = CMTimeMake(0, 600);

// Loop through each additional frame
for (int i = 0; i < gFrames.size(); i++)
{
    // Create the layer for the animation and assign the frame's CGImage as its contents
    CALayer* frame = [CALayer layer];

    frame.contents = (__bridge id)gFrames[i].frameImage;
    frame.frame = CGRectMake(0, 720, 1280, -720);

    DebugLog("Inserting empty time range");

    // Insert an empty time range into the composition track, starting at this point's
    // start time (plus the offset), for the duration of the frame's animation
    [mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];

    // Update the time offset by the duration
    timeOffset = CMTimeAdd(timeOffset, gFrames[i].duration);

    // Make the layer transparent
    frame.opacity = 0.0f;

    // Create an animation setting the opacity to 0 at the start
    CABasicAnimation* frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
    frameAnim.duration = 1.0f;
    frameAnim.repeatCount = 0;
    frameAnim.autoreverses = NO;

    frameAnim.fromValue = [NSNumber numberWithFloat:0.0];
    frameAnim.toValue = [NSNumber numberWithFloat:0.0];

    frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero;
    frameAnim.speed = 1.0f;

    [frame addAnimation:frameAnim forKey:@"animateOpacity"];

    // Create an animation setting the opacity to 1 at the point's start time
    frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
    frameAnim.duration = 1.0f;
    frameAnim.repeatCount = 0;
    frameAnim.autoreverses = NO;

    frameAnim.fromValue = [NSNumber numberWithFloat:1.0];
    frameAnim.toValue = [NSNumber numberWithFloat:1.0];

    frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].startTime);
    frameAnim.speed = 1.0f;

    [frame addAnimation:frameAnim forKey:@"animateOpacity"];

    // Create an animation setting the opacity back to 0 at the point's end time
    frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
    frameAnim.duration = 1.0f;
    frameAnim.repeatCount = 0;
    frameAnim.autoreverses = NO;

    frameAnim.fromValue = [NSNumber numberWithFloat:0.0];
    frameAnim.toValue = [NSNumber numberWithFloat:0.0];

    frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].endTime);
    frameAnim.speed = 1.0f;

    [frame addAnimation:frameAnim forKey:@"animateOpacity"];

    // Add the frame layer to our parent layer
    [parentLayer addSublayer:frame];

    gError = nil;

    // If there's a point after this one
    if (i < gFrames.size() - 1)
    {
        // Add our video file to the composition with the range between this point's start time and the next point's start time
        [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(gFrames[i].startTime, CMTimeMake(gFrames[i+1].startTime.value - gFrames[i].startTime.value, 600)) ofTrack:gVideoTrack atTime:CMTimeAdd(gFrames[i].startTime, timeOffset) error:&gError];
    }
    // Else add our video file with the range between this point's start time and the video's duration
    else
    {
        [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(gFrames[i].startTime, CMTimeSubtract(gVideoAsset.duration, gFrames[i].startTime)) ofTrack:gVideoTrack atTime:CMTimeAdd(gFrames[i].startTime, timeOffset) error:&gError];
    }

    if (gError)
    {
        char errorMsg[256];
        sprintf(errorMsg, "Error inserting original video segment at: %d", i);
        DebugLog(errorMsg);
        GetError();
    }
}
Now in that segment the frame layer's opacity is set to 0.0f, but when I set it to 1.0f it just places the last of these frames on top of the video for the entire duration.
After that, the video is exported using an AVAssetExportSession, as shown below:
mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

// Create the layer instruction for our newly created animation tool
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:gVideoTrack];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[instruction setTimeRange:CMTimeRangeMake(kCMTimeZero, [mutableComposition duration])];
[layerInstruction setOpacity:1.0f atTime:kCMTimeZero];
[layerInstruction setOpacity:0.0f atTime:mutableComposition.duration];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];

// Set the instructions on our videoComposition
mutableVideoComposition.instructions = [NSArray arrayWithObject:instruction];

// Export the final composition to a video file.
// Convert the videoPath into a URL for our AVAssetWriter to create a file at
NSString* vidPath = CreateNSString(outputVideoPath);
NSURL* vidURL = [NSURL fileURLWithPath:vidPath];

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPreset1280x720];

exporter.outputFileType = AVFileTypeMPEG4;
exporter.outputURL = vidURL;
exporter.videoComposition = mutableVideoComposition;
exporter.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration);

// Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted)
        {
            DebugLog("!!!File Created!!!");
            _Close();
        }
        else if (exporter.status == AVAssetExportSessionStatusFailed)
        {
            DebugLog("Failed damn");
            DebugLog(cStringCopy([[[exporter error] localizedDescription] UTF8String]));
            DebugLog(cStringCopy([[[exporter error] description] UTF8String]));
            _Close();
        }
        else
        {
            DebugLog("NoIdea");
            _Close();
        }
    });
}];
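As an aside, if the export hits the AVAssetExportSessionStatusFailed branch above, one cheap thing to rule out is a preset/asset mismatch. A small check like this (my own addition, not part of the original code) can be run before creating the exporter:

// Optional sanity check (sketch): confirm the 1280x720 preset can be used with this composition
NSArray* compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:mutableComposition];
if (![compatiblePresets containsObject:AVAssetExportPreset1280x720])
{
    DebugLog("1280x720 export preset is not compatible with this composition");
}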
I have a feeling the animation is not being started, but I don't know. Am I going about the right way to splice image data into a video like this?
Any assistance would be greatly appreciated.
Well, I solved my issue, in a way. The animation route was not working, so my solution was to compile all of my insertable images into a temporary video file and use that video to insert the images into my final output video.
Starting with the first link I posted, AssetWriterInput for making Video from UIImages on iPhone Issues, I created the following function to create my temporary video:
void CreateFrameImageVideo(NSString* path)
{
    NSLog(@"Creating writer at path %@", path);
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                  error:&error];

    NSLog(@"Creating video codec settings");
    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:gVideoTrack.estimatedDataRate/*128000*/], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:gVideoTrack.nominalFrameRate], AVVideoMaxKeyFrameIntervalKey,
                                   AVVideoProfileLevelH264MainAutoLevel, AVVideoProfileLevelKey,
                                   nil];

    NSLog(@"Creating video settings");
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   codecSettings, AVVideoCompressionPropertiesKey,
                                   [NSNumber numberWithInt:1280], AVVideoWidthKey,
                                   [NSNumber numberWithInt:720], AVVideoHeightKey,
                                   nil];

    NSLog(@"Creating writer input");
    AVAssetWriterInput* writerInput = [[AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings] retain];

    NSLog(@"Creating adaptor");
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    [videoWriter addInput:writerInput];

    NSLog(@"Starting session");
    // Start the session
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CMTime timeOffset = kCMTimeZero; //CMTimeMake(0, 600);

    NSLog(@"Video width %d, height: %d, writing frames to video file", gWidth, gHeight);

    CVPixelBufferRef buffer;

    for (int i = 0; i < gAnalysisFrames.size(); i++)
    {
        while (adaptor.assetWriterInput.readyForMoreMediaData == NO)
        {
            NSLog(@"Waiting inside the loop");
            NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
            [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
        }

        // Write the samples
        buffer = pixelBufferFromCGImage(gAnalysisFrames[i].frameImage, gWidth, gHeight);
        [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];

        timeOffset = CMTimeAdd(timeOffset, gAnalysisFrames[i].duration);
    }

    while (adaptor.assetWriterInput.readyForMoreMediaData == NO)
    {
        NSLog(@"Waiting outside the loop");
        NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
        [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
    }

    buffer = pixelBufferFromCGImage(gAnalysisFrames[gAnalysisFrames.size()-1].frameImage, gWidth, gHeight);
    [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];

    NSLog(@"Finishing session");
    // Finish the session
    [writerInput markAsFinished];
    [videoWriter endSessionAtSourceTime:timeOffset];

    BOOL successfulWrite = [videoWriter finishWriting];

    // If the video failed to write
    if (!successfulWrite)
    {
        NSLog(@"Session failed with error: %@", [[videoWriter error] description]);

        // Delete the temporary file that was created
        NSFileManager *fileManager = [NSFileManager defaultManager];
        if ([fileManager fileExistsAtPath:path])
        {
            NSError *error;
            if ([fileManager removeItemAtPath:path error:&error] == NO)
            {
                NSLog(@"removeItemAtPath %@ error:%@", path, error);
            }
        }
    }
    else
    {
        NSLog(@"Session complete");
    }

    [writerInput release];
}
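The pixelBufferFromCGImage helper used above comes from the question linked earlier and isn't reproduced here. For reference, a minimal reconstruction of that kind of helper (my sketch, not the original) might look like:

// Sketch: draw a CGImage into a newly created CVPixelBuffer.
// The caller owns the returned buffer and should CVPixelBufferRelease it.
static CVPixelBufferRef pixelBufferFromCGImage(CGImageRef image, int width, int height)
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32ARGB,
                        (CFDictionaryRef)options, &pxBuffer); // add __bridge under ARC

    CVPixelBufferLockBaseAddress(pxBuffer, 0);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxBuffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxBuffer),
                                                 colorSpace, kCGImageAlphaNoneSkipFirst);

    // Draw the frame image into the buffer's backing memory
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pxBuffer, 0);

    return pxBuffer;
}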
After the video was created it was loaded as an AVAsset and its track extracted. The video was then inserted by replacing the following line (from the first code block in the original post):
[mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];
with:
[mutableCompositionTrack insertTimeRange:CMTimeRangeMake(timeOffset, gAnalysisFrames[i].duration) ofTrack:gFramesTrack atTime:CMTimeAdd(gAnalysisFrames[i].startTime, timeOffset) error:&gError];
where gFramesTrack is the AVAssetTrack created from the temporary frame video.
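For completeness, that track can be obtained along these lines (a sketch; the path variable name framesVideoPath is a stand-in, not from the original code):

// Sketch: load the temporary frame video written by CreateFrameImageVideo and pull out its video track
AVURLAsset* framesAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:framesVideoPath] options:nil];
AVAssetTrack* gFramesTrack = [[framesAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];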
All of the code relating to the CALayer and CABasicAnimation objects has been removed, as it was just not working.
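That said, for anyone who wants to keep trying the animation route: looking back at the first code block, all three opacity animations were added with the same key, @"animateOpacity", and addAnimation:forKey: removes any existing animation with that key, so only the last one should survive. An untested sketch (my speculation, not working code from this post) of a single timed "show" animation per layer, with a unique key per frame:

// With the layer's model opacity left at 0.0f, one animation pinning opacity to 1.0
// for [startTime, startTime + duration] shows the frame only in that window.
// removedOnCompletion = NO is generally needed for animations rendered by
// AVVideoCompositionCoreAnimationTool, so they aren't discarded when they finish.
CABasicAnimation* showAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
showAnim.fromValue = [NSNumber numberWithFloat:1.0f];
showAnim.toValue = [NSNumber numberWithFloat:1.0f];
showAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].startTime);
showAnim.duration = CMTimeGetSeconds(gFrames[i].duration);
showAnim.removedOnCompletion = NO;
[frame addAnimation:showAnim forKey:[NSString stringWithFormat:@"showFrame%d", i]]; // unique key per frame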
Not the most elegant solution, but it's one that at least works. I hope someone finds this useful.
This code also works on iOS devices (tested using an iPad 3).
Side note: the DebugLog function from the first post is just a callback to a function that prints out log messages; it can be replaced with NSLog() calls if need be.
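If you do swap it out, minimal stand-ins for the two helpers used above might look like this (my own sketch, assuming DebugLog takes a plain C string as in the code above; cStringCopy is a hypothetical reconstruction):

#include <stdlib.h>
#include <string.h>

// Route the logging callback straight to NSLog
#define DebugLog(msg) NSLog(@"%s", (msg))

// Hypothetical reconstruction of cStringCopy: returns a malloc'd copy of str
static char* cStringCopy(const char* str)
{
    char* copy = (char*)malloc(strlen(str) + 1);
    strcpy(copy, str);
    return copy;
}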
Tags: ios, osx, avfoundation, cgimage, caanimation