- The live stream playback is squished. How do I adjust the aspect ratio for that?
- I would really like to improve the live stream playback. How should I optimise the frame rate and latency for iOS devices? It can sometimes be laggy.
- How do I distinguish between a live stream that has ended and one that is paused?
Would appreciate any help on polishing the stream. Thanks a lot guys.
Hi Kevin,
MediaService uses the AVFoundation framework to work with media resources in the AUDIO_AND_VIDEO, ONLY_VIDEO, and ONLY_AUDIO modes (see MediaPublishOptions.h), so:
- You can use one of the video resolution options (MPMediaData.h in MediaLibiOS3x):
RESOLUTION_LOW, // 144x192px (landscape) & 192x144px (portrait)
RESOLUTION_CIF, // 288x352px (landscape) & 352x288px (portrait)
RESOLUTION_MEDIUM, // 360x480px (landscape) & 480x368px (portrait)
RESOLUTION_VGA, // 480x640px (landscape) & 640x480px (portrait)
which correspond to the AVFoundation constants:
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPreset640x480;
For the video frame to display properly, the aspect ratio of your view must match the chosen resolution.
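A minimal sketch of sizing the playback view so the frame is not squished, using AVFoundation's built-in aspect-fit helper. The view names (containerView, playbackView) are hypothetical placeholders for your own views:

```objc
#import <AVFoundation/AVFoundation.h>

// Fit a 640x480 frame (RESOLUTION_VGA, landscape) into an arbitrary
// container without distortion; the result is letterboxed as needed.
CGSize frameSize = CGSizeMake(640.0, 480.0); // match the chosen resolution
CGRect fitted = AVMakeRectWithAspectRatioInsideRect(frameSize,
                                                    containerView.bounds);
playbackView.frame = fitted; // aspect ratio preserved
```

The same approach works for any of the resolutions above; just swap in the corresponding dimensions.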
- AVFoundation sets the FPS automatically: 15 for AVCaptureSessionPresetLow, and 30 for the other presets. You can set the stream's video bitrate using the MediaPublisher method:
-(void)setVideoBitrate:(uint)bitRate;
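For example, lowering the bitrate can reduce lag on constrained networks. This is only a sketch: 'publisher' stands for your MediaPublisher instance, and it assumes the bitrate is given in bits per second (check the header for the actual unit):

```objc
// Drop to roughly 512 kbps for smoother playback over cellular.
[publisher setVideoBitrate:512 * 1000];
```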
- We are currently adding functionality to distinguish whether a live stream has ended or is paused.
In the CUSTOM_VIDEO and AUDIO_AND_CUSTOM_VIDEO modes (see MediaPublishOptions.h), you can use your own (1) frame resolution and (2) FPS and bitrate, using the MediaPublisher methods:
-(BOOL)sendFrame:(CVPixelBufferRef)pixelBuffer timestamp:(int)timestamp;
-(BOOL)sendSampleBuffer:(CMSampleBufferRef)sampleBuffer;
to deliver the video frames over the RTMP protocol.
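A rough sketch of feeding your own frames in CUSTOM_VIDEO mode from an AVCaptureVideoDataOutput delegate callback. 'publisher' is a hypothetical MediaPublisher instance, and the timestamp is assumed to be in milliseconds (verify against the header):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Option 1: hand the whole sample buffer to the publisher.
    [publisher sendSampleBuffer:sampleBuffer];

    // Option 2: extract the pixel buffer and supply your own timestamp.
    // CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    // int timestampMs = (int)(CMTimeGetSeconds(pts) * 1000.0);
    // [publisher sendFrame:pixelBuffer timestamp:timestampMs];
}
```

Use one of the two paths, not both, for each captured frame.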
Hi Kevin,
The problem (3) has been fixed - you can update the libs and try our VideoService sample from the GitHub repo: https://github.com/Backendless/iOS-Samples