Applying Android NDK Development Experience

Author: zcwfeng | Published 2020-11-12 13:57

Environment Setup

Writing CMakeLists.txt is mostly a matter of following the documented rules:

  • mind your paths

  • use the directives well

Configuring paths and targets:
include_directories, add_subdirectory,
add_library, target_link_libraries, set

Debug printing:
message

When importing library files, distinguish between static and dynamic libraries. Static libraries (or building the sources directly) can be split up, which helps shrink the package size.

  • Sub-modules and sub-libraries can be split across multiple CMakeLists.txt files
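The directives above can be combined into a minimal CMakeLists.txt sketch. Every path, module name, and library name below is a placeholder for illustration, not taken from the original project:

```cmake
cmake_minimum_required(VERSION 3.10)

# Make our headers visible to every target below
include_directories(${CMAKE_SOURCE_DIR}/include)

# Descend into a sub-module that has its own CMakeLists.txt
add_subdirectory(imageutils)

# Build the JNI entry library from source
add_library(native-lib SHARED native-lib.cpp VideoChannel.cpp)

# Import a prebuilt shared library (path is hypothetical)
add_library(opencv_java4 SHARED IMPORTED)
set_target_properties(opencv_java4 PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/libs/${ANDROID_ABI}/libopencv_java4.so)

# Print a value while debugging the build
message(STATUS "ANDROID_ABI = ${ANDROID_ABI}")

# Link everything, including the NDK's log and android libraries
target_link_libraries(native-lib imageutils opencv_java4 log android)
```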

Testing the Environment

The project wizard usually generates a native-lib module for you. First link the library you intend to use and call one simple function; if it compiles and runs, the setup works.

Porting Code

It is often easier to develop a simple demo on the desktop in CLion first, then port the code to the phone, splitting it up and encapsulating it by feature.

Development Approach

First define the JNI native methods and the related calling logic on the Java side.

The usual flow is: set up and initialize the library environment, call the concrete functionality, then release resources and stop.

Encapsulate by feature: for example, put live-streaming video logic in VideoChannel.cpp and image handling in ImageUtils.cpp.

When Java needs to drive concrete C++ functionality, the Java side can hold a long field that stores a native pointer and pass it through JNI, where it is cast back to the C++ object. For results flowing the other way, implement a Callback in the JNI layer, declared in a C++ header, that forwards data back to Java, much like a Java callback interface.

Design top-down: Java layer, JNI layer, C++ layer.

A development-flow template for video playback and streaming

import android.util.Log;
import android.view.TextureView;

import androidx.lifecycle.LifecycleOwner;

public class RtmpClient {

    private static final String TAG = "RtmpClient";

    static {
        System.loadLibrary("native-lib");
    }

    private LifecycleOwner lifecycleOwner;


    private int width;
    private int height;
    private boolean isConnect;
    private VideoChannel videoChannel;
    private AudioChannel audioChannel;

    public RtmpClient(LifecycleOwner lifecycleOwner) {
        this.lifecycleOwner = lifecycleOwner;
        nativeInit();
    }


    public void initVideo(TextureView displayer, int width, int height, int fps, int bitRate) {
        this.width = width;
        this.height = height;
        videoChannel = new VideoChannel(displayer, this, lifecycleOwner);
        initVideoEnc(width, height, fps, bitRate);
    }

    public void initAudio(int sampleRate, int channels) {
        audioChannel = new AudioChannel(sampleRate, channels, this);
        int inputByteNum = initAudioEnc(sampleRate, channels);
        audioChannel.setInputByteNum(inputByteNum);
    }


    public void startLive(String url) {
        connect(url);
    }

    public boolean isConnected() {
        return isConnect;
    }

    public void stopLive() {
        isConnect = false;
        audioChannel.stop();
        disConnect();
        Log.e(TAG, "Live streaming stopped ==========");
    }

    /**
     * Called back from native code over JNI.
     *
     * @param isConnect whether the RTMP connection succeeded
     */
    public void onPrepare(boolean isConnect) {
        this.isConnect = isConnect;
        audioChannel.start();
        Log.e(TAG, "Live streaming started ==========");
    }


    public void sendVideo(byte[] buffer) {
        nativeSendVideo(buffer);
    }

    public void sendAudio(byte[] buffer, int len) {
        nativeSendAudio(buffer, len);
    }

    public void toggleCamera() {
        videoChannel.toggleCamera();
    }

    public void release() {
        videoChannel.release();
        audioChannel.release();
        releaseVideoEnc();
        releaseAudioEnc();
        nativeDeInit();
    }

    public int getWidth() {
        return width;
    }

    public int getHeight() {
        return height;
    }


    private native void nativeInit();

    private native void connect(String url);

    private native void disConnect();

    private native void nativeDeInit();

    private native void nativeSendVideo(byte[] buffer);

    private native void nativeSendAudio(byte[] buffer, int len);

    private native void initVideoEnc(int width, int height, int fps, int bitRate);

    private native int initAudioEnc(int sampleRate, int channels);

    private native void releaseVideoEnc();

    private native void releaseAudioEnc();

}

Set up the environment: nativeInit() in the constructor
Connect: startLive(url)
Initialize the encoders: initVideoEnc(width, height, fps, bitRate)
Release: stopLive() / release()
Callback into Java: onPrepare(boolean isConnect)

A template for the OpenCV development pattern

import android.view.Surface;

public class FaceTracker {

    private long mNativeObj = 0;

    public FaceTracker(String model) {
        mNativeObj = nativeCreateObject(model);
    }

    public void setSurface(Surface surface) {
        nativeSetSurface(mNativeObj, surface);
    }
    
    public void release() {
        nativeDestroyObject(mNativeObj);
        mNativeObj = 0;
    }

    public void start() {
        nativeStart(mNativeObj);
    }

    public void stop() {
        nativeStop(mNativeObj);
    }

    // Other logic
    public void detect(byte[] inputImage, int width, int height, int rotationDegrees) {
        nativeDetect(mNativeObj, inputImage, width, height, rotationDegrees);
    }


    private static native long nativeCreateObject(String model);

    private static native void nativeDestroyObject(long thiz);

    private static native void nativeSetSurface(long thiz, Surface surface);

    private static native void nativeStart(long thiz);

    private static native void nativeStop(long thiz);

    private static native void nativeDetect(long thiz, byte[] inputImage, int width, int height, int rotationDegrees);
}

Common header and Java-to-C++ object mappings

#include <android/native_window_jni.h>

An ANativeWindow *window on the C++ side corresponds to a Java Surface; it is obtained with ANativeWindow_fromSurface(env, surface) and must later be freed with ANativeWindow_release.

Thread Synchronization

Much of this work is time-consuming, so Java may call into the C layer from different worker threads, and that requires synchronization.
Analogous to Java's synchronized, in C we take a pthread_mutex_t lock to serialize access.

A small tip: guard checks tend to pile up as repeated early returns, which looks ugly; a do { ... } while (0) loop cleans this up.

For example:

      pthread_mutex_lock(&mutex);
        if(!window) {
            pthread_mutex_unlock(&mutex);
            return;
        }
        ANativeWindow_setBuffersGeometry(window,img.cols,img.rows,WINDOW_FORMAT_RGBA_8888);
        ANativeWindow_Buffer buffer;
        ANativeWindow_lock(window,&buffer,0);
        pthread_mutex_unlock(&mutex);

Several of the calls in this block need their return values checked, each with an early exit. After rewriting with do while, a break handles the exit much more cleanly, and you no longer have to unlock before every return:

        pthread_mutex_lock(&mutex);
        do {
            if (!window) {
                break;
            }
            ANativeWindow_setBuffersGeometry(window, img.cols, img.rows, WINDOW_FORMAT_RGBA_8888);
            ANativeWindow_Buffer buffer;
            if (ANativeWindow_lock(window, &buffer, 0)) {
                ANativeWindow_release(window);
                window = 0;
                break;
            }
            uint8_t *dstData = static_cast<uint8_t *>(buffer.bits);
            uint8_t *srcData = img.data;
            int srclineSize = img.cols * 4;
            int dstlineSize = buffer.stride * 4;
            // Copy row by row, because the destination stride must be padded
            for (int i = 0; i < buffer.height; ++i) {
                memcpy(dstData + i * dstlineSize, srcData + i * srclineSize, srclineSize);
            }
        } while (0);
        pthread_mutex_unlock(&mutex);

This removes the repeated checks and duplicated unlock code.

Error Analysis

Print and analyze the logs.

When debugging you do not even need breakpoints: a C++ fault interrupts execution directly, and you can inspect the call stack to find the problem.

  1. Library-not-found errors are mostly search-path or compiler-path problems
1-12 12:49:15.487 9583-9583/? E/AndroidRuntime: FATAL EXCEPTION: main
    Process: top.zcwfeng.opencv, PID: 9583
    java.lang.UnsatisfiedLinkError: dlopen failed: library "libopencv_java4.so" not found

Decompile the APK and take a look. Analysis: our own code calls libopencv_java4.so.

Fix the location of libopencv_java4.so: do not put it in the jni directory, move it to jniLibs.

  2. A library that is never referenced on the surface cannot be found
    --------- beginning of crash
11-12 12:59:24.802 11388-11388/top.zcwfeng.opencv E/AndroidRuntime: FATAL EXCEPTION: main
    Process: top.zcwfeng.opencv, PID: 11388
    java.lang.UnsatisfiedLinkError: dlopen failed: library "libc++_shared.so" not found

dlopen failed: library "libc++_shared.so" not found occurs because libopencv_java4 uses it internally, which we had no way of knowing. The official NDK documentation covers this: although Gradle configures the build automatically, we still need to pass one argument, configured as follows:

        externalNativeBuild {
            cmake {
                cppFlags ""
                arguments "-DANDROID_STL=c++_shared"
                abiFilters 'armeabi-v7a'
            }
        }
        ndk {
            abiFilters 'armeabi-v7a'
        }


Original article: https://www.haomeiwen.com/subject/tjaubktx.html