Photos and the various other apps that play video present playback correctly regardless of the recording camera orientation. Is there a simple way to accomplish the correct rotation for playback?
After a lot of digging in Stack Overflow and the Apple docs I finally got the orientation of video playback to work – but it feels too hard for something that's done all the time.
Here is my solution – is there a better way?
My app is locked to landscape full-screen orientation only. Following the scheme for video-to-Metal rendering with filters as outlined in the WWDC22 session “Display HDR video in EDR with AVFoundation and Metal”, each frame of video should be converted into a CIImage for input into CIFilters.
There are two steps:
- Determine the orientation of the recording camera
- Apply a corrective rotation for playback
Stack Overflow has several answers for the orientation of an AVAsset – the best answer seems to be from “How to detect if a video file was recorded in portrait orientation, or landscape in iOS”, where dhallman provides a function to answer both UIInterfaceOrientation and AVCaptureDevicePosition (Extract AVAsset Orientation and Camera Position).
The need for an app function to determine UIInterfaceOrientation and AVCaptureDevicePosition seems wrong – is this hidden in the metadata somewhere? One of the other issues that hobbled me is the iOS 16 change that makes loading of the videoTracks and the preferredTransform an async call. So in this extension of AVAsset, a function:
func videoOrientation() async -> PGLDevicePosition {
    var orientation: UIInterfaceOrientation = .unknown
    var device: AVCaptureDevice.Position = .unspecified
    var myVideoTracks: [AVAssetTrack]?
    var t: CGAffineTransform = CGAffineTransformIdentity
    do {
        myVideoTracks = try await loadTracks(withMediaType: .video)
    }
    catch {
        /// return init values of .unknown and .unspecified
        return PGLDevicePosition(orientation: orientation, device: device)
    }
    if let videoTrack = myVideoTracks?.first {
        do {
            t = try await videoTrack.load(.preferredTransform)
        }
        catch {
            /// return init values of .unknown and .unspecified
            return PGLDevicePosition(orientation: orientation, device: device)
        }
The async function must be wrapped into a task, such as this:
func getVideoPreferredTransform(callBack: @escaping (PGLDevicePosition) -> Void) {
    Task {
        let devicePosition = await avPlayerItem.asset.videoOrientation()
        callBack(devicePosition)
    }
}
Now, knowing the device (.front or .back) and the orientation, a switch statement determines the correct CGImagePropertyOrientation for the affine transform of the CIImage. The switch statement is:
var result = CGImagePropertyOrientation.up  // default
switch (imageOrientation.orientation, imageOrientation.device) {
    case (.unknown, .unspecified):
        result = CGImagePropertyOrientation.up
    case (.portrait, .front):
        result = CGImagePropertyOrientation.right
    case (.portraitUpsideDown, .front):
        result = CGImagePropertyOrientation.right
    case (.landscapeLeft, .front):
        result = CGImagePropertyOrientation.up
    case (.landscapeRight, .front):
        result = CGImagePropertyOrientation.up
    case (.portrait, .back):
        result = CGImagePropertyOrientation.right
    case (.portraitUpsideDown, .back):
        result = CGImagePropertyOrientation.left
    case (.landscapeLeft, .back):
        result = CGImagePropertyOrientation.down
    case (.landscapeRight, .back):
        result = CGImagePropertyOrientation.up
    default:
        return result // default .up
}
return result
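One piece not shown above is the glue between the two steps – how the async result ends up in the videoPropertyOrientation property used below. A rough sketch, where imagePropertyOrientation(for:) is an assumed helper wrapping the switch above and videoPropertyOrientation is a stored property the render loop reads each frame:
// Hypothetical glue, e.g. run when a new AVPlayerItem is loaded.
getVideoPreferredTransform { [self] devicePosition in
    videoPropertyOrientation = imagePropertyOrientation(for: devicePosition)
}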
Finally, the correction to the CIImage can be made. First the CIImage is created from the CVPixelBuffer as suggested in the WWDC22 session. Then the transform for the correct CGImagePropertyOrientation is applied to the CIImage.
let sourceFrame = CIImage(cvPixelBuffer: buffer)
let neededTransform = sourceFrame.orientationTransform(for: videoPropertyOrientation)
videoCIFrame = sourceFrame.transformed(by: neededTransform)
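To tie this back to the WWDC22 scheme, the oriented videoCIFrame is then fed into the filter chain before the Metal render step. A minimal sketch, where CIPhotoEffectMono stands in for whatever filter the app actually applies and renderer is an assumed hand-off object, not part of my posted code:
// Hypothetical continuation of the per-frame callback: feed the oriented
// frame into a Core Image filter before handing the output to the renderer.
let monoFilter = CIFilter(name: "CIPhotoEffectMono")   // stand-in example filter
monoFilter?.setValue(videoCIFrame, forKey: kCIInputImageKey)
if let filteredFrame = monoFilter?.outputImage {
    renderer.queue(filteredFrame)   // assumed hand-off to the Metal/CIRenderDestination step
}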
And we’re done… Seems way too complicated. Isn’t there a simpler way?