How We Integrated a Video Library Into Our Live Video Streaming App

b.live is a live video streaming app developed by Agilie for iOS and Android platforms. The app lets people broadcast live videos to their followers within the app and to social media networks including Facebook and YouTube.

b.live is all about video, so one of the most important components of its technology stack is a library that handles all video and audio stream manipulation, including drawing on-screen and adding text and emoji during broadcasts. In this article, we present a detailed case study of how we tried out different video libraries for the Android version of b.live.

b.live app

Looking for a library

In our case, we needed a library to manage streaming video and audio data from the device, with the ability to add drawings, text, and emojis right on the video feed in real time.

b.live is not our first live streaming project, so we already had some technologies at our disposal. For example, in a previous project we used WebRTC. However, by the time we started developing the Android version of b.live, the iOS version had already been released, and our iOS team had built it using some newer technologies: the Real-Time Messaging Protocol (RTMP) was chosen as the protocol for streaming audio and video data, and the VideoCore library was successfully used to manipulate the video stream.

For the Android version, we originally intended to use the VideoCore library, too. In general, it matched all of our requirements: it supports streaming video and audio from the device's camera over RTMP. However, the Android version of this library had 'work in progress' and 'contributors welcome' statuses.

That was really bad news for us. Developing our own library from scratch wasn't an option either: it would have been too time-consuming (even if we had tried contributing to the VideoCore library). So we went looking for other solutions.

Trying JavaCV

The first library we got our hands on was JavaCV.

JavaCV provides a Java interface for OpenCV, an open-source computer vision library written in optimized C/C++. Along with OpenCV, JavaCV uses wrappers from the JavaCPP Presets for libraries commonly used by computer vision researchers, including FFmpeg, libdc1394, PGR FlyCapture, OpenKinect, and others.

For our project, however, we needed only the capabilities of OpenCV and FFmpeg.

Most developers have at least heard of FFmpeg, a solution for working with multimedia data that covers recording, converting, and streaming audio and video. What's more, FFmpeg provides a huge filtering toolset based on the libavfilter library.

JavaCV ships sample code showing how to record video right from the camera. It also includes the FFmpegFrameRecorder class, which already supports RTMP streaming, so we just had to use our RTMP link instead of the default "/mnt/sdcard/stream.flv" path.
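To give an idea of what that looks like, here's a rough sketch of setting up the recorder for RTMP; the URL is a placeholder, and the preset package names may differ between JavaCV versions:

import org.bytedeco.javacpp.avcodec
import org.bytedeco.javacv.FFmpegFrameRecorder

// imageWidth/imageHeight come from the camera preview size; the URL is a placeholder
val recorder = FFmpegFrameRecorder("rtmp://example.com/live/streamKey", imageWidth, imageHeight, 1)
recorder.format = "flv"                        // RTMP carries an FLV-muxed stream
recorder.videoCodec = avcodec.AV_CODEC_ID_H264
recorder.frameRate = 30.0
recorder.start()
// later: recorder.record(frame) for every captured Frame, and recorder.stop() when done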

When we added the camera-switching logic, we found that the streamed image was mirrored. We solved the problem by creating an FFmpegFrameFilter:

// Rotate the frame 90° clockwise
rotateClockwiseFilter = FFmpegFrameFilter("transpose=clock", imageWidth, imageHeight)
rotateClockwiseFilter?.pixelFormat = avutil.AV_PIX_FMT_NV21 // default camera preview format on Android
rotateClockwiseFilter?.start()

// Rotate the frame 90° counterclockwise
rotateCClockwiseFilter = FFmpegFrameFilter("transpose=cclock", imageWidth, imageHeight)
rotateCClockwiseFilter?.pixelFormat = avutil.AV_PIX_FMT_NV21
rotateCClockwiseFilter?.start()

// Downscale the frame before encoding
resizeFilter = FFmpegFrameFilter("scale=w=\'min(500\\, iw*3/2):h=-1\'", imageWidth, imageHeight)
resizeFilter?.pixelFormat = avutil.AV_PIX_FMT_NV21
resizeFilter?.start()

...and applying the filter to every frame:

filter?.push(yuvImage)            // feed the raw NV21 frame into the filter
val rotatedFrame = filter?.pull() // get the filtered frame back
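
The filtered frame then goes to the recorder. Assuming a recorder like the one sketched above, that's roughly:

// hand the filtered frame over to the (hypothetical) RTMP recorder from the earlier sketch
rotatedFrame?.let { recorder.record(it) }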

NOTE: the code examples provided in this article are written in Kotlin (Gradle configuration snippets are in Groovy).

Overall, JavaCV offers plenty of ways to work with video and audio streams, which was appealing to us: we planned to expand the app in the future and to use the library in other projects of ours. But…

Issues with JavaCV and how we solved them

However, after integrating the library into our app, we faced a problem on some devices that hadn't appeared in the compiled sample:

java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
at java.lang.Class.classForName(Native Method)
at java.lang.Class.forName(Class.java:324)
at org.bytedeco.javacpp.Loader.load(Loader.java:390)

For those who have never dealt with the Android NDK, the solution to this problem may not be obvious, so here's a bit of backstory.

There are over 10,000 Android-powered device models on the market, many of them with different hardware, including central processing units (CPUs). The mobile CPU lineup is constantly growing, with new, more powerful units introduced every year. Still, all existing CPUs can be grouped by processor architecture.

As a developer, you don't think about such things until you have to deal with code written in C/C++, and that is exactly our case. The Android NDK tools compile such code into prebuilt libraries in the .so format. Ideally, this is done for all existing processor architectures in order to optimize the binary code and, ultimately, boost performance. Each architecture has its own ABI (Application Binary Interface), which defines how the app's code interacts with the system at runtime: armeabi, armeabi-v7a, x86, mips, arm64-v8a, mips64, and x86_64. The .so files are compiled and placed into a folder with the corresponding name.
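If you're curious which ABIs a particular device supports, you can check at runtime (API 21+); a quick sketch:

import android.os.Build
import android.util.Log

// ABIs the current device can run, in order of preference;
// on an armeabi-v7a phone this typically prints "armeabi-v7a, armeabi"
val supportedAbis: Array<String> = Build.SUPPORTED_ABIS
Log.d("ABI", supportedAbis.joinToString())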

If you need to know more about ABIs and .so files, here’s a valuable guide. Also, check out Google’s official guide on ABI management.

It's worth noting that some libraries we use in our projects are platform-dependent and include .so files (for example, SQLCipher and the VLC Android SDK). In the final .apk, they usually look like this:

/lib/<some_abi>/superlib1.so
/lib/<some_abi>/superlib2.so
/lib/<some_other_abi>/superlib1.so
/lib/<some_other_abi>/superlib2.so

In our case, the file structure looked this way:

abi structure

Here, we have the libavutil.so file, but it isn't present in all of the folders, and that is why the error appeared on devices whose CPU's ABI is compatible with armeabi-v7a: the libavutil.so file simply hadn't been deployed together with the rest of the app's files. We can also see the libvlc.so and libvlcjni.so files there. Their names suggest they belong to the VLC Player library, which is used in the app for viewing live streams and is added as a Gradle dependency. Without the VLC Player library, the system would have fallen back to the armeabi folder, and there wouldn't have been any problem at all.

Here’s how we chose to solve it.

According to Bytedeco, the JavaCV developer, the libraries in question can be used for armeabi-v7a, too. So we copied all the .so files related to JavaCV (plus JavaCPP, FFmpeg, and OpenCV) from the armeabi folder and added them to the project's app/src/main/jniLibs/armeabi-v7a folder (the default path for .so libraries, which you can change).
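
If you'd rather keep prebuilt .so files in a different folder, you can point the build at it through sourceSets in the module's build.gradle (the path below is just an example):

android {
    sourceSets {
        main {
            // look for prebuilt .so libraries here instead of the default src/main/jniLibs
            jniLibs.srcDirs = ['libs/jni']
        }
    }
}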

During the Gradle build, all project and dependency files are merged into one folder, so the problem above never appears again.

Unfortunately, not all library developers make sure their libraries support every relevant processor architecture, and that can eventually lead your app development into a dead end. Needless to say, you'd better avoid such situations, but if you do get into one, you can assemble the libraries for the required CPU architectures yourself - all you need is the source code. Keep in mind, though, that for large-scale projects this may require a significant time investment.

You can also restrict the architectures your app supports right in your app module's build.gradle file:

buildTypes {
    debug {
        ...
        ndk {
            abiFilter "armeabi-v7a"
        }
        ...
    }
}

Why we dropped JavaCV

With the problem solved, the next thing we needed was to figure out how to add certain filters (fisheye, glow, grey, invert, sepia) that app users could apply to the video feed during broadcasts.

There is a list of effects that can be applied to the Camera, but it lacks the ones we needed. We also needed to learn how to overlay images on the camera feed. I'm sure FFmpeg's commands could accomplish all of those tasks, but that was the point at which we decided to look for a solution other than JavaCV.

By that moment, the app's size had gone beyond 150 MB because of the huge number of .so files, which was a little over the top. Of course, a device installs only the files relevant to its ABI, and Google Play lets us upload multiple APKs assembled for different CPU architectures… But the team felt we didn't need such a powerful tool for the task at hand, so we continued our investigation.
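
For reference, per-ABI APKs can be produced with the splits block in the module's build.gradle; the ABI list below is only an example:

android {
    splits {
        abi {
            enable true
            reset()
            include 'armeabi-v7a', 'x86'  // one APK per listed ABI
            universalApk false            // skip the universal APK containing all ABIs
        }
    }
}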


Next stop: GPUImage for Android

Eventually, we stumbled upon a project called GPUImage for Android. GPUImage is a popular open-source iOS framework for GPU-based image and video processing; it's used, for example, in the iOS version of Periscope. GPUImage for Android, in its turn, recreates the capabilities of the iOS framework and does it pretty well.

Alongside the variety of ready-made filters, the framework lets us easily implement our own custom filters - all it takes is some knowledge of OpenGL ES 2.0. Moreover, the nature of the library helps us avoid tons of boilerplate code.
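To illustrate the mechanics (the library already ships an invert filter, so this sketch is purely illustrative): a custom filter is a GPUImageFilter subclass with its own fragment shader. Note that the package path may differ between library versions:

import jp.co.cyberagent.android.gpuimage.GPUImageFilter

// A minimal custom filter: invert the colors of every pixel.
// NO_FILTER_VERTEX_SHADER is the library's default pass-through vertex shader.
class InvertColorFilter : GPUImageFilter(
    GPUImageFilter.NO_FILTER_VERTEX_SHADER,
    """
    varying highp vec2 textureCoordinate;
    uniform sampler2D inputImageTexture;

    void main() {
        lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
        gl_FragColor = vec4(1.0 - color.rgb, color.a);
    }
    """.trimIndent()
)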

Now we needed to find a lightweight library for streaming the audio and video feed over RTMP. There are a lot of RTMP clients for Android on GitHub; we set our sights on Yasea, SimpleRtmp, and Librtmp Client.

The ultimate choice: Librestreaming

Digging deeper, I accidentally stumbled upon a project called Librestreaming. It uses the Android MediaCodec API for audio and video encoding and the popular lightweight librtmp library for RTMP streaming. What's more, it lets you apply real-time effect filters between the camera capture and image encoding phases.

There are also so-called 'soft' mode (NV21 frame processing) and 'hard' mode (image texture rendering) filters. Another great thing is that the library lets you use GPUImageFilter from GPUImage for Android.

Originally, the library didn't separate the idle mode (camera preview display) from the streaming mode, so we added it to our project as a separate module in order to modify the code whenever we needed to. We also got rid of our own Camera-related code, since the library handles the camera on its own. To make the app run on Android 6.0 and later, we added runtime permission requests. For the RTMP player code compilation, we added the NDK to the project.
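The runtime permission request itself is standard Android; roughly like this (shown with the androidx compat helpers, although at the time it was the support library; the request code is an arbitrary constant of ours):

import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val REQUEST_STREAM_PERMISSIONS = 1
private val streamPermissions = arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)

// Ask for the camera and microphone permissions if they haven't been granted yet
fun ensureStreamPermissions(activity: Activity) {
    val missing = streamPermissions.any {
        ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing) {
        ActivityCompat.requestPermissions(activity, streamPermissions, REQUEST_STREAM_PERMISSIONS)
    }
}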

In the end, all we needed was to find the required filters and add them to the project:

var resClient: RESClient? = RESClient()
...
resClient?.setHardVideoFilter(effectFilter)

Adding drawings and emojis to a video stream

At the same time, we were developing a module for adding hand drawings, text, and emojis over the video preview. The module creates a bitmap, which is then overlaid on the video feed. We achieved that thanks to Librestreaming's TowInputFilterHard filter, which renders the bitmap.
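
The overlay itself is ordinary Android drawing. Here's a minimal sketch (the function name and layout values are ours, and the real module also renders hand drawings and emoji onto the same bitmap):

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

// Build a transparent bitmap the size of the video frame and draw a text caption on it
fun createOverlayBitmap(width: Int, height: Int, caption: String): Bitmap {
    val overlay = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(overlay)
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.WHITE
        textSize = 48f
    }
    canvas.drawText(caption, 32f, height - 64f, paint)
    return overlay // this bitmap is then passed to the two-input filter and rendered over the video
}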

Adding text

As in the case with JavaCV, the output image turned out to be mirrored, but we fixed the problem simply by adjusting the texture coordinates, which took only a few small code edits.

The lines:

protected static float texture2Vertices[] = {
            1.0f, 0.0f,
            1.0f, 1.0f,
            0.0f, 1.0f,
            0.0f, 0.0f};

should be changed to:

protected static float texture2Vertices[] = {
            0.0f, 0.0f,
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f};

OpenGL is generally too complex to explain in a few words, but here you can find out why the coordinates were changed this way.

To apply both an effect filter and a filter with a drawing at the same time, there's HardVideoGroupFilter, a filter that can combine as many filters as you want.

Here's an example of how we used it to combine the effect and drawing filters:

resClient?.setHardVideoFilter(HardVideoGroupFilter(listOf(effectFilter, drawingFilter)))


Drawing on-screen

Conclusion

In this article, we focused more on our research than on the development itself, but thorough research and trying out different solutions to find the most suitable one is an integral part of a successful product development process.

We had our ups and downs trying different technological solutions, but the result is definitely worth it: we've got b.live, a sleek app that lets users make live broadcasts, apply different video filters, draw right on the camera preview, and add text and emoji - all on the fly!

If you're interested in video streaming app development, contact us right away.

 
