Can we Stream Live Audio from Android phone using FFmpeg?
Question
I'm using the ffmpeg_kit_flutter package to stream data to the RTSP server in Flutter.
iOS: working
Android: not working
Command used:
'ffmpeg -f avfoundation -i ":0" -acodec aac -f rtsp -rtsp_transport tcp "$Url"'
When I ran the 'ffmpeg -devices' command on Android, it returned the following response, from which I learned that Android does not support avfoundation, but it does have android_camera. Does android_camera support audio too?
Command: 'ffmpeg -devices'
Response:
I/flutter (10620): logs: libavutil 57. 28.100 / 57. 28.100
I/flutter (10620): logs: libavcodec 59. 37.100 / 59. 37.100
I/flutter (10620): logs: libavformat 59. 27.100 / 59. 27.100
I/flutter (10620): logs: libavdevice 59. 7.100 / 59. 7.100
I/flutter (10620): logs: libavfilter 8. 44.100 / 8. 44.100
I/flutter (10620): logs: libswscale 6. 7.100 / 6. 7.100
I/flutter (10620): logs: libswresample 4. 7.100 / 4. 7.100
I/flutter (10620): logs:Devices:
I/flutter (10620): D. = Demuxing supported
I/flutter (10620): .E = Muxing supported
I/flutter (10620): --
I/flutter (10620): logs: D android_camera
I/flutter (10620): logs: D lavfi
I/flutter (10620): logs: DE video4linux2,v4l2
Commands I tried on Android:
FFmpegKit.execute('-y -f android_camera -i 0:1 -r 30 -c:a aac -f rtsp -rtsp_transport tcp "$Url"');
FFmpegKit.execute('-y -f android_camera -i 0:1 -r 30 -c:a libmp3lame -qscale:a 2 "/storage/emulated/0/Download/androidvideo.mp3"');
FFmpegKit.execute('-y -f android_camera -i 0:0 -r 30 -c:a wavpack -b:a 64k "/storage/emulated/0/Download/androidvideo.wav"');
The following command records video, but there is no audio in it:
FFmpegKit.execute('-video_size hd720 -f android_camera -camera_index 1 -i anything -r 10 -t 00:00:15 "$dir/androidvideo.mp4"');
Answer 1
Score: 2
The android_camera input device in FFmpeg for Android does not support audio capture. It only captures video from the Android camera and does not capture audio from the device's microphone.
Answer 2
Score: 0
As seen in the following sources, ffmpeg for Android supports only video capture from the camera using the android_camera device:
- https://stackoverflow.com/questions/62194826/how-to-use-ffmpeg-read-realtime-microphone-audio-volume
- https://github.com/tanersener/mobile-ffmpeg/issues/153
On iOS, by contrast, avfoundation supports both video and audio.
As an option, the audio could be recorded by native means, for example with MediaRecorder, or with AAudio / OpenSL ES via JNI, and then muxed with the video recorded from the camera.
ffmpeg would then take the audio data from an audio buffer (a temp file as the simplest option, or an in-memory buffer, though how to configure this in the command string would need further investigation).
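The record-then-mux approach could be sketched in Dart roughly as below. This is an untested sketch, not a verified implementation: the file names, durations, and the native audio-recording step (which would need a plugin or a platform channel, not shown) are all assumptions; only the camera-capture pattern from the question and the standard ffmpeg mux options (-c:v copy, -map, -shortest) are taken from known usage.

```dart
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';

// Sketch: capture video-only with ffmpeg's android_camera device while
// audio is recorded natively, then mux the two files into one.
Future<void> recordThenMux(String dir) async {
  // 1. Start native audio recording (e.g. MediaRecorder via a platform
  //    channel) writing to "$dir/audio.m4a" -- not shown here.

  // 2. Record 15 s of video-only from camera 0 with ffmpeg.
  await FFmpegKit.execute(
      '-y -f android_camera -camera_index 0 -i anything '
      '-r 30 -t 00:00:15 "$dir/video.mp4"');

  // 3. Stop the native audio recording, then mux: copy the video stream
  //    as-is, encode the audio to AAC, and stop at the shorter input.
  await FFmpegKit.execute(
      '-y -i "$dir/video.mp4" -i "$dir/audio.m4a" '
      '-c:v copy -c:a aac -map 0:v:0 -map 1:a:0 -shortest '
      '"$dir/androidvideo.mp4"');
}
```

Note that keeping the two recordings in sync is its own problem; starting both as close together as possible (or trimming with -ss at mux time) would likely be needed.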