Integrating OpenSDK For iOS Into Your Project
This article is for developers looking to integrate the Streamaxia OpenSDK into their iOS project.
The library captures audio and video from the device's camera and microphone and broadcasts them to an RTMP server. You will need the services of a CDN or media server, such as Amazon AWS, to receive the stream.
Set up OpenSDK in a Project
- Open an existing project or create a new one
- Add the provided framework (StreamaxiaSDK.framework) to the project
- Add the certificate files or config bundle to the project
- Import StreamaxiaSDK and use the streaming APIs:
#import <StreamaxiaSDK/StreamaxiaSDK.h>
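In Swift, the framework is imported as a module:

Swift:

import StreamaxiaSDK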
Initialize SDK Main Interface
To initialize the SDK, make sure the .config and .key files are added to the project. Alternatively, the .config and .key files can be bundled together (e.g. in certificate.bundle), in which case the SDK must be initialized using that bundle's URL.
Loading the SDK configuration from the standard URL:
Objective-C:

AXStreamaxiaSDK *sdk = [AXStreamaxiaSDK sharedInstance];
[sdk setupSDKWithCompletion:^(BOOL success, AXError *error) {
    [sdk debugPrintSDKStatus];
}];
Swift:

let sdk = AXStreamaxiaSDK.sharedInstance()!
sdk.setupSDK(completion: { (success, error) in
    sdk.debugPrintStatus()
})
Loading the SDK configuration from a custom URL (a custom bundle that contains the provided .config and .key files):
Objective-C:

NSURL *bundleURL = [[NSBundle mainBundle] URLForResource:@"certificate" withExtension:@"bundle"];
NSBundle *bundle = [NSBundle bundleWithURL:bundleURL];
[sdk setupSDKWithURL:bundle.bundleURL withCompletion:^(BOOL success, AXError *error) {
    [sdk debugPrintSDKStatus];
}];
Swift:

let bundleURL = Bundle.main.url(forResource: "certificate", withExtension: "bundle")
let bundle = Bundle.init(url: bundleURL!)
sdk.setupSDK(with: bundle?.bundleURL) { (success, error) in
    sdk.debugPrintStatus()
}
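In both cases, the completion block reports whether the setup succeeded. A minimal sketch of a more defensive Swift handler, using only the calls shown above (the print message is a placeholder):

Swift:

sdk.setupSDK(completion: { (success, error) in
    if success {
        // The SDK is ready; the stream info, settings and recorder can be created now
        sdk.debugPrintStatus()
    } else {
        // Setup failed; inspect the error before going any further
        print("SDK setup failed: \(String(describing: error))")
    }
})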
Settings
The stream info contains the settings for the streaming destination server; fill in the values corresponding to your account, formatted as below. Together, serverAddress, applicationName and streamName typically map to an RTMP endpoint of the form rtmp://<serverAddress>/<applicationName>/<streamName> (here, rtmp://23.235.227.213/test/demo).
The recorder settings contain the properties used to capture and stream the audio and video. Before changing a setting, it is recommended to check that the setting is supported.
Objective-C |
// The stream info AXStreamInfo *info = [AXStreamInfo streamInfo];
info.useSecureConnection = NO; info.serverAddress = @"23.235.227.213"; info.applicationName = @"test"; info.streamName = @"demo"; info.username = @""; info.password = @"";
// The default recorder settings AXRecorderSettings *settings = [AXRecorderSettings recorderSettings]; |
Swift:

// The stream info
let info = AXStreamInfo.init()
info.useSecureConnection = false
info.serverAddress = "23.235.227.213"
info.applicationName = "test"
info.streamName = "demo"
info.username = ""
info.password = ""

// The default recorder settings
let settings = AXRecorderSettings.init()
Recorder Setup
The recorder must be initialized with valid stream info and recorder settings, and the SDK must be properly configured beforehand; otherwise the recorder will fail to initialize. You can also choose to enable extra features such as adaptive bitrate or local save.
Objective-C:

AXRecorder *recorder = [AXRecorder recorderWithStreamInfo:info settings:settings];
recorder.recorderDelegate = self;

AXError *error;

// Enable adaptive bitrate
// Video quality will be adjusted based on available network and hardware resources
[recorder activateFeatureAdaptiveBitRateWithError:&error];
if (error) {
    // Handle error
} else {
    // Success
}

// Enable local save
// The broadcast will be saved to the user's camera roll when finished
error = nil; // Reset before the next check
[recorder activateFeatureSaveLocallyWithError:&error];
if (error) {
    // Handle error
} else {
    // Success
}
Swift:

if let recorder = AXRecorder.init(streamInfo: info, settings: settings) {
    recorder.recorderDelegate = self

    // Attach the preview and start the capture (covered in the sections below)
    recorder.setup(with: self.recorderView)
    recorder.prepareToRecord()

    var error: AXError?

    // Enable adaptive bitrate
    // Video quality will be adjusted based on available network and hardware resources
    recorder.activateFeatureAdaptiveBitRateWithError(&error)
    if error != nil {
        // Handle error
    } else {
        // Success
    }

    // Enable local save
    // The broadcast will be saved to the user's camera roll when finished
    error = nil // Reset before the next check
    recorder.activateFeatureSaveLocallyWithError(&error)
    if error != nil {
        // Handle error
    } else {
        // Success
    }
}
Video Preview
This step is optional and only needed if the capture output should be visible on screen. Streaming works without a capture preview.
Objective-C:

// Setup the preview
[recorder setupWithView:self.recorderView];
Swift:

// Setup the preview
recorder.setup(with: self.recorderView)
Streamer Setup
This step is mandatory before starting to stream. It starts the capture (and outputs it to the preview, if one is set up) and prepares the streamer for broadcasting.
Objective-C:

// Start the capture
[recorder prepareToRecord];
Swift:

// Start the capture
recorder.prepareToRecord()
Add Permissions to Info.plist
In order to successfully start broadcasting, you need to add two usage descriptions to your Info.plist file. Once you open the file in Xcode, right-click on the white background and select 'Show Raw Keys/Values'. You can add a new entry by right-clicking and choosing 'Add Row'. The two keys for which you need to create a usage description are NSCameraUsageDescription and NSMicrophoneUsageDescription, as in the example below.
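For reference, this is how the two entries look in the Info.plist source. The description strings are placeholders; use wording that explains to your users why the app needs the camera and microphone:

<key>NSCameraUsageDescription</key>
<string>This app needs camera access to capture video for live streaming.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access to capture audio for live streaming.</string>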
Start and Stop Stream
This should be called after prepareToRecord; it starts the stream broadcast. Streaming may fail to start for various reasons, so check the success flag and, in case of failure, the error, as in the sketch after the snippets below.
Objective-C |
// Start the streaming [self.recorder startStreamingWithCompletion:^(BOOL success, AXError *error) { // … }];
// Stop the streaming [self.recorder stopStreaming]; |
Swift:

// Start the streaming
self.recorder.startStreaming(completion: { (success, error) in
    // …
})

// Stop the streaming
self.recorder.stopStreaming()
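Since the broadcast may fail to start, the completion block is the right place to branch on the result. A short Swift sketch using only the call shown above (the print message is a placeholder):

Swift:

self.recorder.startStreaming(completion: { (success, error) in
    if success {
        // The broadcast is live; update the UI accordingly
    } else {
        // Startup failed; surface the error to the user
        print("Streaming failed to start: \(String(describing: error))")
    }
})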
Taking a Snapshot
You can take a snapshot at any point after the SDK has been configured, using the following API:
Objective-C:

// Take the snapshot
[self.recorder takeSnapshotWithCompletion:^(UIImage *snapshot, AXError *error) {
    // Do something with the image
}];
Swift:

// Take the snapshot
recorder.takeSnapshot { (image, error) in
    if let image = image {
        // Do something with the image
    }
}
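To recap, here is a sketch of the whole flow in one place, built only from the APIs shown in this article. The class name and the recorderView outlet are illustrative, the stream info values should be replaced with your own, and it is a starting point rather than a drop-in implementation:

Swift:

import UIKit
import StreamaxiaSDK

class BroadcastViewController: UIViewController {
    @IBOutlet weak var recorderView: UIView!
    var recorder: AXRecorder?

    override func viewDidLoad() {
        super.viewDidLoad()

        // 1. Initialize the SDK (certificate.bundle contains the .config and .key files)
        let sdk = AXStreamaxiaSDK.sharedInstance()!
        let bundleURL = Bundle.main.url(forResource: "certificate", withExtension: "bundle")
        sdk.setupSDK(with: bundleURL) { (success, error) in
            guard success else { return }

            // 2. Describe the destination server and use the default recorder settings
            let info = AXStreamInfo.init()
            info.useSecureConnection = false
            info.serverAddress = "23.235.227.213"
            info.applicationName = "test"
            info.streamName = "demo"
            let settings = AXRecorderSettings.init()

            // 3. Create the recorder, attach the preview, start the capture
            if let recorder = AXRecorder.init(streamInfo: info, settings: settings) {
                // Set recorder.recorderDelegate here if you adopt AXRecorderDelegate
                recorder.setup(with: self.recorderView)
                recorder.prepareToRecord()
                self.recorder = recorder

                // 4. Start broadcasting; check success and error as described above
                recorder.startStreaming(completion: { (success, error) in
                    // …
                })
            }
        }
    }
}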
We hope that you find the integration of the library straightforward. Get in touch with us to add live streaming capabilities to your app today!