## Preface

A recent project required encoding PCM into the ADPCM IMA QT format with ffmpeg, yet I could not find any material on the topic online (at least none that I came across), which was rather disheartening. The only reference was the official ffmpeg example, and that is just a generic template; understandably, the project cannot ship a dedicated sample for every codec. So I ran into quite a few challenges while writing the code, and I am sharing what I learned here.

The custom build of ffmpeg was covered in a previous post, which also explained how to decode ADPCM IMA QT audio into PCM.

This post explains how to encode PCM into a raw ADPCM IMA QT stream.
## Building ffmpeg

The previous post already explained in detail how to build a customized ffmpeg, so I will not repeat that here. Since this time we need to perform audio encoding, it is enough to build an ffmpeg that enables only the ADPCM IMA QT codec.
### Modify the build script

Edit the build script from before:

```shell
cd ffmpeg-4.4
vim build_android.sh
```

Its content is as follows:
```shell
#!/bin/bash
NDK_PATH=~/Library/Android/sdk/ndk/22.1.7171670
# linux-x86_64
HOST_TAG=darwin-x86_64
MIN_SDK_VER=21
# ==================================
TOOLCHAINS=${NDK_PATH}/toolchains/llvm/prebuilt/${HOST_TAG}
SYSROOT=${TOOLCHAINS}/sysroot

function build_one {
    if [ $ARCH == "arm" ]; then
        CROSS_PREFIX=$TOOLCHAINS/bin/arm-linux-androideabi-
    elif [ $ARCH == "aarch64" ]; then
        CROSS_PREFIX=$TOOLCHAINS/bin/aarch64-linux-android-
    elif [ $ARCH == "x86_32" ]; then
        CROSS_PREFIX=$TOOLCHAINS/bin/i686-linux-android-
    else
        CROSS_PREFIX=$TOOLCHAINS/bin/x86_64-linux-android-
    fi

    pushd ffmpeg-4.4
    ./configure \
        --prefix=$PREFIX \
        --extra-cflags="$OPTIMIZE_CFLAGS" \
        --cross-prefix=$CROSS_PREFIX \
        --sysroot=$SYSROOT \
        --enable-cross-compile \
        --target-os=android \
        --arch=$ARCH \
        --cc=${CC} \
        --cxx=${CC}++ \
        --ld=${CC} \
        --ar=${TOOLCHAINS}/bin/llvm-ar \
        --as=${CC} \
        --nm=${TOOLCHAINS}/bin/llvm-nm \
        --ranlib=${TOOLCHAINS}/bin/llvm-ranlib \
        --strip=${TOOLCHAINS}/bin/llvm-strip \
        --disable-everything \
        --disable-programs \
        --disable-x86asm \
        --disable-inline-asm \
        --disable-swresample \
        --disable-swscale \
        --disable-avfilter \
        --disable-avdevice \
        --disable-avformat \
        --disable-static \
        --enable-decoder=adpcm_ima_qt \
        --enable-encoder=adpcm_ima_qt \
        --enable-shared \
        --enable-small \
        --enable-pic
    make clean
    make -j6
    make install
    popd
}

# armeabi-v7a
ARCH=arm
OPTIMIZE_CFLAGS="-g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -march=armv7-a -mthumb -Wformat -Werror=format-security -Oz -DNDEBUG -fPIC --target=armv7-none-linux-androideabi$MIN_SDK_VER --gcc-toolchain=$TOOLCHAINS"
PREFIX=`pwd`/prebuilt/armeabi-v7a
export CC=$TOOLCHAINS/bin/armv7a-linux-androideabi$MIN_SDK_VER-clang
export CXX=$TOOLCHAINS/bin/armv7a-linux-androideabi$MIN_SDK_VER-clang++
build_one

# arm64-v8a
ARCH=aarch64
OPTIMIZE_CFLAGS="-g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -O2 -DNDEBUG -fPIC --target=aarch64-none-linux-android$MIN_SDK_VER --gcc-toolchain=$TOOLCHAINS"
PREFIX=`pwd`/prebuilt/arm64-v8a
export CC=$TOOLCHAINS/bin/aarch64-linux-android$MIN_SDK_VER-clang
export CXX=$TOOLCHAINS/bin/aarch64-linux-android$MIN_SDK_VER-clang++
build_one

# x86
# ARCH=x86_32
# OPTIMIZE_CFLAGS="-g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -mstackrealign -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -O2 -DNDEBUG -fPIC --target=i686-none-linux-android$MIN_SDK_VER --gcc-toolchain=$TOOLCHAINS"
# PREFIX=`pwd`/prebuilt/x86
# export CC=$TOOLCHAINS/bin/i686-linux-android$MIN_SDK_VER-clang
# export CXX=$TOOLCHAINS/bin/i686-linux-android$MIN_SDK_VER-clang++
# build_one

# x86_64
# ARCH=x86_64
# OPTIMIZE_CFLAGS="-g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -O2 -DNDEBUG -fPIC --target=x86_64-none-linux-android$MIN_SDK_VER --gcc-toolchain=$TOOLCHAINS"
# PREFIX=`pwd`/prebuilt/x86_64
# export CC=$TOOLCHAINS/bin/x86_64-linux-android$MIN_SDK_VER-clang
# export CXX=$TOOLCHAINS/bin/x86_64-linux-android$MIN_SDK_VER-clang++
# build_one
```
Run the script above to produce the `.so` files we need.
### Modify `Android.mk`

Modify the `Android.mk` file created previously so that it reads:
```makefile
LOCAL_PATH := $(call my-dir)
MY_PREBUILT := $(LOCAL_PATH)/prebuilt/$(TARGET_ARCH_ABI)

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := $(MY_PREBUILT)/lib/libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES := $(MY_PREBUILT)/lib/libavutil.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := adpcm-ima-qt-decoder
LOCAL_SRC_FILES := $(LOCAL_PATH)/adpcm_ima_qt_decoder/adpcm_ima_qt_decoder.cpp \
                   $(LOCAL_PATH)/adpcm_ima_qt_decoder/native_adpcm_ima_qt_decoder.cpp
LOCAL_CFLAGS :=
LOCAL_LDLIBS := -llog -ljnigraphics -lz -landroid -lm -pthread -L$(SYSROOT)/usr/lib
LOCAL_C_INCLUDES := $(LOCAL_PATH) \
                    $(LOCAL_C_INCLUDES) \
                    $(MY_PREBUILT)/include \
                    $(LOCAL_PATH)/adpcm_ima_qt_decoder
LOCAL_SHARED_LIBRARIES := libavcodec libavutil
LOCAL_DISABLE_FORMAT_STRING_CHECKS := true
LOCAL_DISABLE_FATAL_LINKER_WARNINGS := true
include $(BUILD_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := adpcm-ima-qt-encoder
LOCAL_SRC_FILES := $(LOCAL_PATH)/adpcm_ima_qt_encoder/adpcm_ima_qt_encoder.cpp \
                   $(LOCAL_PATH)/adpcm_ima_qt_encoder/native_adpcm_ima_qt_encoder.cpp
LOCAL_CFLAGS :=
LOCAL_LDLIBS := -llog -ljnigraphics -lz -landroid -lm -pthread -L$(SYSROOT)/usr/lib
LOCAL_C_INCLUDES := $(LOCAL_PATH) \
                    $(LOCAL_C_INCLUDES) \
                    $(MY_PREBUILT)/include \
                    $(LOCAL_PATH)/adpcm_ima_qt_encoder
LOCAL_SHARED_LIBRARIES := libavcodec libavutil
LOCAL_DISABLE_FORMAT_STRING_CHECKS := true
LOCAL_DISABLE_FATAL_LINKER_WARNINGS := true
include $(BUILD_SHARED_LIBRARY)
```
### Write the C++ files

In the app module of the Android project, create an `adpcm_ima_qt_encoder` folder under `src/main/jni`, and create the following files in it.

`native_adpcm_ima_qt_encoder.h`:
```cpp
#include <jni.h>

#ifndef NATIVE_ADPCM_IMA_QT_ENCODER_H
#define NATIVE_ADPCM_IMA_QT_ENCODER_H

#ifdef __cplusplus
extern "C" {
#endif

JNIEXPORT jint JNICALL init(JNIEnv *env, jobject obj, jint sampleRate, jint channels, jint bitRate);
JNIEXPORT void JNICALL release(JNIEnv *env, jobject obj);
JNIEXPORT void JNICALL encode(JNIEnv *env, jobject obj, jbyteArray pcmByteArray);
JNIEXPORT jstring JNICALL getVersion(JNIEnv *env, jobject thiz);

static void encodedAudioCallback(uint8_t *encodedAudioData, int encodedAudioLength);

#ifdef __cplusplus
}
#endif

#endif
```
`native_adpcm_ima_qt_encoder.cpp`:
```cpp
#include "native_adpcm_ima_qt_encoder.h"
#include "adpcm_ima_qt_encoder.h"
#include "logger.h"

#define ADPCM_PACKAGE_BASE "com/leovp/ffmpeg/audio/adpcm/"

AdpcmImaQtEncoder *pEncoder = nullptr;
JNIEnv *gEnv;
jobject gObj;

JNIEXPORT jint JNICALL init(JNIEnv *env, jobject obj, jint sampleRate, jint channels, jint bitRate) {
    if (nullptr == pEncoder) {
        pEncoder = new AdpcmImaQtEncoder(sampleRate, channels, bitRate);
        return 0;
    }
    return -1;
}

JNIEXPORT void JNICALL release(JNIEnv *env, jobject obj) {
    delete pEncoder;
    pEncoder = nullptr;
}

JNIEXPORT void JNICALL encode(JNIEnv *env, jobject obj, jbyteArray pcmByteArray) {
    gEnv = env;
    gObj = obj;
    int pcmLen = env->GetArrayLength(pcmByteArray);
    auto *pcm_uint8_t_array = new uint8_t[pcmLen];
    env->GetByteArrayRegion(pcmByteArray, 0, pcmLen, reinterpret_cast<jbyte *>(pcm_uint8_t_array));
    pEncoder->encode(pcm_uint8_t_array, pcmLen, encodedAudioCallback);
    delete[] pcm_uint8_t_array;
}

JNIEXPORT jstring JNICALL getVersion(JNIEnv *env, jobject thiz) {
    return env->NewStringUTF("0.1.0");
}

void encodedAudioCallback(uint8_t *encodedAudioData, int encodedAudioLength) {
    jbyteArray encoded_byte_array = gEnv->NewByteArray(encodedAudioLength);
    gEnv->SetByteArrayRegion(encoded_byte_array, 0, encodedAudioLength,
                             reinterpret_cast<const jbyte *>(encodedAudioData));
    jclass clazz = gEnv->GetObjectClass(gObj);
    jmethodID callbackMethod = gEnv->GetMethodID(clazz, "encodedAudioCallback", "([B)V");
    gEnv->CallVoidMethod(gObj, callbackMethod, encoded_byte_array);
}

static JNINativeMethod methods[] = {
        {"init",       "(III)I",               (void *) init},
        {"release",    "()V",                  (void *) release},
        {"encode",     "([B)V",                (void *) encode},
        {"getVersion", "()Ljava/lang/String;", (void *) getVersion},
};

JNIEXPORT jint JNI_OnLoad(JavaVM *vm, void *reserved) {
    JNIEnv *env;
    if (vm->GetEnv((void **) &env, JNI_VERSION_1_6) != JNI_OK) {
        LOGE("JNI_OnLoad GetEnv error.");
        return JNI_ERR;
    }
    jclass clz = env->FindClass(ADPCM_PACKAGE_BASE "AdpcmImaQtEncoder");
    if (clz == nullptr) {
        LOGE("JNI_OnLoad FindClass error.");
        return JNI_ERR;
    }
    if (env->RegisterNatives(clz, methods, sizeof(methods) / sizeof(methods[0]))) {
        LOGE("JNI_OnLoad RegisterNatives error.");
        return JNI_ERR;
    }
    return JNI_VERSION_1_6;
}
```
`adpcm_ima_qt_encoder.h`:
```cpp
#ifndef LEOANDROIDBASEUTIL_ADPCM_IMA_QT_ENCODER_H
#define LEOANDROIDBASEUTIL_ADPCM_IMA_QT_ENCODER_H

#include <jni.h>
#include <string>

#ifdef __cplusplus
extern "C" {
#endif
#include <libavcodec/avcodec.h>
#include <libavutil/channel_layout.h>
#include <libavutil/common.h>
#include <libavutil/frame.h>
#include <libavutil/samplefmt.h>
#ifdef __cplusplus
}
#endif

typedef void (*pCallbackFunc)(uint8_t *encodedAudioData, int encodedAudioLength);

class AdpcmImaQtEncoder {
private:
    AVCodecContext *ctx = nullptr;
    AVFrame *frame = nullptr;
    AVPacket *pkt = nullptr;

    static void do_encode(AVCodecContext *pCtx, AVFrame *pFrame, AVPacket *pPkt, pCallbackFunc callback);

public:
    AdpcmImaQtEncoder(int sampleRate, int channels, int bitRate);
    ~AdpcmImaQtEncoder();

    void encode(const uint8_t *pcmByteArray, int pcmLen, pCallbackFunc callback);
};

#endif
```
`adpcm_ima_qt_encoder.cpp`:
```cpp
#include "adpcm_ima_qt_encoder.h"
#include "logger.h"

AdpcmImaQtEncoder::AdpcmImaQtEncoder(int sampleRate, int channels, int bitRate) {
    LOGE("ADPCM encoder init. sampleRate: %d, channels: %d bitRate: %d", sampleRate, channels, bitRate);
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_ADPCM_IMA_QT);
    if (!codec) {
        LOGE("ADPCM IMA QT encoder not found");
        exit(-1);
    }
    ctx = avcodec_alloc_context3(codec);
    if (!ctx) {
        LOGE("Could not allocate audio encoder context");
        exit(-2);
    }
    ctx->sample_rate = sampleRate;
    ctx->bit_rate = bitRate;
    ctx->sample_fmt = AV_SAMPLE_FMT_S16P;
    ctx->channel_layout = channels == 2 ? AV_CH_LAYOUT_STEREO : AV_CH_LAYOUT_MONO;
    ctx->channels = av_get_channel_layout_nb_channels(ctx->channel_layout);

    int ret;
    if ((ret = avcodec_open2(ctx, codec, nullptr)) < 0) {
        LOGE("Could not open encoder");
        exit(ret);
    }

    pkt = av_packet_alloc();
    if (!pkt) {
        LOGE("Could not allocate the packet");
        exit(-3);
    }

    frame = av_frame_alloc();
    if (!frame) {
        LOGE("Could not allocate audio frame");
        exit(-4);
    }
    frame->nb_samples = ctx->frame_size;
    frame->format = ctx->sample_fmt;
    frame->channel_layout = ctx->channel_layout;

    ret = av_frame_get_buffer(frame, 0);
    if (ret < 0) {
        LOGE("Could not allocate audio data buffers");
        exit(-5);
    }
    LOGE("frame_size=%d linesize[0]=%d nb_samples=%d", ctx->frame_size, frame->linesize[0], frame->nb_samples);
}

AdpcmImaQtEncoder::~AdpcmImaQtEncoder() {
    if (ctx != nullptr) {
        avcodec_free_context(&ctx);
        ctx = nullptr;
    }
    if (frame != nullptr) {
        av_frame_free(&frame);
        frame = nullptr;
    }
    if (pkt != nullptr) {
        av_packet_free(&pkt);
        pkt = nullptr;
    }
    LOGE("ADPCM encoder released!");
}

void AdpcmImaQtEncoder::encode(const uint8_t *pcm_uint8_t_array, int pcmLen, pCallbackFunc callback) {
    bool isStereo = ctx->channels == 2;
    uint8_t *outs[ctx->channels];
    const int BUF_SIZE = frame->linesize[0] * ctx->channels;
    outs[0] = new uint8_t[BUF_SIZE];
    if (isStereo) outs[1] = new uint8_t[BUF_SIZE];
    const int loopStep = 2 * ctx->channels;
    int ret;
    for (int loop = 0; loop < pcmLen / BUF_SIZE; loop++) {
        ret = av_frame_make_writable(frame);
        if (ret < 0) {
            LOGE("av_frame_make_writable error. code=%d", ret);
            return;
        }
        // De-interleave: split the interleaved 16-bit PCM into one planar buffer per channel.
        for (int idx = 0; idx < BUF_SIZE / loopStep; idx++) {
            outs[0][idx * 2 + 0] = pcm_uint8_t_array[loop * BUF_SIZE + idx * loopStep + 0];
            outs[0][idx * 2 + 1] = pcm_uint8_t_array[loop * BUF_SIZE + idx * loopStep + 1];
            if (isStereo) {
                outs[1][idx * 2 + 0] = pcm_uint8_t_array[loop * BUF_SIZE + idx * loopStep + 2];
                outs[1][idx * 2 + 1] = pcm_uint8_t_array[loop * BUF_SIZE + idx * loopStep + 3];
            }
        }
        frame->data[0] = outs[0];
        if (isStereo) frame->data[1] = outs[1];
        do_encode(ctx, frame, pkt, callback);
    }
    delete[] outs[0];
    if (isStereo) delete[] outs[1];
}

void AdpcmImaQtEncoder::do_encode(AVCodecContext *pCtx, AVFrame *pFrame, AVPacket *pPkt, pCallbackFunc callback) {
    int ret = avcodec_send_frame(pCtx, pFrame);
    if (ret < 0) {
        LOGE("Error sending the frame to the encoder. code=%d", ret);
        return;
    }
    while (ret >= 0) {
        ret = avcodec_receive_packet(pCtx, pPkt);
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
            return;
        } else if (ret < 0) {
            LOGE("Error encoding audio frame. code=%d", ret);
            return;
        }
        callback(pPkt->data, pPkt->size);
        av_packet_unref(pPkt);
    }
}
```
Also create a `logger.h` file under `src/main/jni` in the app module, with the following content:
```cpp
#ifndef LEO_FFMPEG4ANDROID_LOGGER_H
#define LEO_FFMPEG4ANDROID_LOGGER_H

#define LOG_TAG "leo_ffmpeg_jni"

#ifdef ANDROID
#include <android/log.h>
#define LOGE(...) ((void)__android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__))
#define LOGW(...) ((void)__android_log_print(ANDROID_LOG_WARN, LOG_TAG, __VA_ARGS__))
#else
#include <cstdio>
#define LOGE(format, ...) printf(LOG_TAG format "\n", ##__VA_ARGS__)
#define LOGW(format, ...) printf(LOG_TAG format "\n", ##__VA_ARGS__)
#endif

#endif
```
### Build `libadpcm-ima-qt-encoder.so` with the NDK

Run `ndk-build` in the `jni` directory to generate the required `libadpcm-ima-qt-encoder.so`:
```shell
% ndk-build
[armeabi-v7a] Install        : libadpcm-ima-qt-decoder.so => libs/armeabi-v7a/libadpcm-ima-qt-decoder.so
[armeabi-v7a] Install        : libadpcm-ima-qt-encoder.so => libs/armeabi-v7a/libadpcm-ima-qt-encoder.so
[armeabi-v7a] Install        : libavcodec.so => libs/armeabi-v7a/libavcodec.so
[armeabi-v7a] Install        : libavutil.so => libs/armeabi-v7a/libavutil.so
[armeabi-v7a] Install        : libc++_shared.so => libs/armeabi-v7a/libc++_shared.so
[arm64-v8a] Install        : libadpcm-ima-qt-decoder.so => libs/arm64-v8a/libadpcm-ima-qt-decoder.so
[arm64-v8a] Install        : libadpcm-ima-qt-encoder.so => libs/arm64-v8a/libadpcm-ima-qt-encoder.so
[arm64-v8a] Install        : libavcodec.so => libs/arm64-v8a/libavcodec.so
[arm64-v8a] Install        : libavutil.so => libs/arm64-v8a/libavutil.so
[arm64-v8a] Install        : libc++_shared.so => libs/arm64-v8a/libc++_shared.so
```
The generated `so` files are located under `src/main/libs/`. Taking `arm64-v8a` as an example, copy the generated `src/main/libs/arm64-v8a` folder into your own project, after which you can start encoding audio.

With that, all the build work is finally done!
### Write the JNI wrapper

Create an `AdpcmImaQtEncoder` class in the `com.leovp.ffmpeg.audio.adpcm` package:
```kotlin
package com.leovp.ffmpeg.audio.adpcm

import com.leovp.ffmpeg.audio.base.EncodeAudioCallback

class AdpcmImaQtEncoder private constructor() {
    companion object {
        init {
            System.loadLibrary("adpcm-ima-qt-encoder")
            System.loadLibrary("avcodec")
            System.loadLibrary("avutil")
        }
    }

    constructor(sampleRate: Int, channels: Int, bitRate: Int) : this() {
        init(sampleRate, channels, bitRate)
    }

    var encodedCallback: EncodeAudioCallback? = null

    private external fun init(sampleRate: Int, channels: Int, bitRate: Int): Int
    external fun release()
    external fun encode(pcmBytes: ByteArray)
    external fun getVersion(): String

    fun encodedAudioCallback(encodeAudio: ByteArray) {
        encodedCallback?.onEncodedUpdate(encodeAudio)
    }
}
```
The `EncodeAudioCallback` interface it uses is defined as follows:
```kotlin
package com.leovp.ffmpeg.audio.base

interface EncodeAudioCallback {
    fun onEncodedUpdate(encodedAudio: ByteArray)
}
```
OK. Everything needed for encoding is now in place; all that remains is to call these methods on your audio data, which I will not walk through in detail here.
## Notes

I ran into a lot of problems during development. Newer ffmpeg versions differ substantially from the old ones, and almost all the material you can find online targets the old API, which made for a bumpy road. I spent a long time investigating with almost no progress, and at one point considered shelving the work until I was in a calmer frame of mind. Then, just a few days ago, something suddenly clicked, I made a major breakthrough, and the encoding finally worked. What a relief!
One thing deserves special attention: when encoding, each channel's PCM input must be exactly 128 bytes per frame, i.e. `frame->linesize[0]` bytes.
## Source Code

The full source is available on my GitHub: https://github.com/yhz61010/android