JavaCV error AAC with no global headers is currently not supported

Question


I'm trying to transcode dhav (a container format) to RTSP with JavaCV (FFmpegFrameGrabber + FFmpegFrameRecorder). Everything works when I transcode dhav to RTMP, but when I switch to RTSP, this error occurs:


Error: [rtsp @ 0000002318df7c30] AAC with no global headers is currently not supported.

Exception in thread "pool-1-thread-2" java.lang.RuntimeException: org.bytedeco.javacv.FFmpegFrameRecorder$Exception: avformat_write_header error() error -1094995529: Could not write header to 'rtsp://127.0.0.1:8554/myapp/orange2' (For more details, make sure FFmpegLogCallback.set() has been called.)
	at org.jfjy.jvc.GetBytes2PipedStreamAndPushRTMP$2.run(GetBytes2PipedStreamAndPushRTMP.java:116)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: org.bytedeco.javacv.FFmpegFrameRecorder$Exception: avformat_write_header error() error -1094995529: Could not write header to 'rtsp://127.0.0.1:8554/myapp/orange2' (For more details, make sure FFmpegLogCallback.set() has been called.)
	at org.bytedeco.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:969)
	at org.bytedeco.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:437)
	at org.bytedeco.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:432)
	at org.jfjy.jvc.GetBytes2PipedStreamAndPushRTMP.grabAndPush(GetBytes2PipedStreamAndPushRTMP.java:215)
	at org.jfjy.jvc.GetBytes2PipedStreamAndPushRTMP$2.run(GetBytes2PipedStreamAndPushRTMP.java:100)
	... 3 more
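As an aside, the numeric code in the trace can be decoded: FFmpeg packs error codes as the negation of a four-character tag (the FFERRTAG macro), and -1094995529 unpacks to "INDA", i.e. AVERROR_INVALIDDATA. A small sketch of the decoding:

```java
public class FFErrTag {
    // FFmpeg builds error codes as -MKTAG(a,b,c,d); reversing that
    // recovers the four-character tag, e.g. "INDA" = AVERROR_INVALIDDATA.
    static String decode(int err) {
        int tag = -err;
        StringBuilder sb = new StringBuilder(4);
        for (int i = 0; i < 4; i++) {
            sb.append((char) ((tag >> (8 * i)) & 0xFF)); // bytes are stored little-endian
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(decode(-1094995529)); // the code from the stack trace above
    }
}
```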

After googling, I tried the following:

  • set avFormatContext.flags(avformat.AVFMT_GLOBALHEADER); (no effect)

  • add "-flags +global_header" or "-rtpflags latm" on the command line, but I don't know how to do this in JavaCV.

  • set recorder.setAudioOption("flags", "+global_header"); (no effect)

  • set recorder.setAudioOption("flags", "global_header");
    recorder.setVideoOption("flags", "global_header");
    (no effect)

Could someone guide me on this? Any help is appreciated.

Here is some information about the dhav input (partial), which may help:

Input #0, dhav, from '.\videostream':
Duration: 00:00:25.00, start: 1689678599.000000, bitrate: 2360 kb/s
Stream #0:0: Audio: pcm_s16le, 16000 Hz, 1 channels, s16, 256 kb/s
Stream #0:1: Video: h264 (High), yuvj420p(pc, bt470bg/bt470bg/bt709), 720x1280, 25 fps, 50 tbr, 1k tbn
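As a sanity check on the stream info above, the 256 kb/s audio figure is exactly what uncompressed PCM at these parameters requires (sample rate × bits per sample × channels):

```java
public class PcmBitrate {
    // Bit rate of uncompressed PCM audio = sampleRate * bitsPerSample * channels.
    static int bitrate(int sampleRate, int bitsPerSample, int channels) {
        return sampleRate * bitsPerSample * channels;
    }

    public static void main(String[] args) {
        // 16000 Hz, s16 (16-bit), mono -> 256000 b/s, i.e. the 256 kb/s shown above
        System.out.println(bitrate(16000, 16, 1));
    }
}
```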

The key code is:

public static synchronized void grabAndPush(InputStream inputStream, String pushAddress, String pushPotocol) throws Exception {
        avutil.av_log_set_level(AV_LOG_DEBUG);
        FFmpegLogCallback.set();

        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputStream,0);

        long startTime = System.currentTimeMillis();
      
        grabber.start();

     

        AVFormatContext avFormatContext = grabber.getFormatContext();


        int streamNum = avFormatContext.nb_streams();

        if (streamNum < 1) {
            log.error("no media");
            return;
        }

        int frameRate = (int) grabber.getVideoFrameRate();
        if (0 == frameRate) {
            frameRate = 15;
        }
        log.info("frameRate[{}] duration[{}]s streams[{}]",
                frameRate,
                avFormatContext.duration() / 1000000,
                avFormatContext.nb_streams());

        for (int i = 0; i < streamNum; i++) {
            AVStream avStream = avFormatContext.streams(i);
            AVCodecParameters avCodecParameters = avStream.codecpar();
            log.info("stream index[{}] codec type[{}] codec ID[{}]", i, avCodecParameters.codec_type(), avCodecParameters.codec_id());
        }

        int frameWidth = grabber.getImageWidth();
        int frameHeight = grabber.getImageHeight();
        int audioChannels = grabber.getAudioChannels();

        log.info("frameWidth[{}] frameHeight[{}] audioChannels[{}]",
                frameWidth,
                frameHeight,
                audioChannels);

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(pushAddress,
                frameWidth,
                frameHeight,
                audioChannels);

        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setInterleaved(true);

        switch (pushPotocol) {
            case "rtsp" -> {
                recorder.setFormat("rtsp");
            }
            case "rtmp" -> {
                recorder.setFormat("flv");
            }
        }
        log.info("push protocol:{}| grabber format:{} | recorder format:{}",pushPotocol,grabber.getFormat(),recorder.getFormat());

        recorder.setFrameRate(frameRate);
        recorder.setAudioCodec(AV_CODEC_ID_AAC);
        log.info("grabber audio codec name :{}|recorder audio codec name :{}",grabber.getAudioCodecName(),recorder.getAudioCodecName());

        recorder.setGopSize(frameRate * 2);

        recorder.setAudioChannels(grabber.getAudioChannels());

        startTime = System.currentTimeMillis();


        avFormatContext.max_interleave_delta(0);
        avFormatContext.flags(avformat.AVFMT_TS_NONSTRICT);
        recorder.setTimestamp(0);
  
        recorder.start(avFormatContext);

     

        Frame frame;


        int videoFrameNum = 0;
        int audioFrameNum = 0;
        int dataFrameNum = 0;
    
        AVPacket packet;
        long lastDTS = 0;
        while ((packet = grabber.grabPacket()) != null) {
            if (packet.pts() == AV_NOPTS_VALUE) {
                if (packet.dts() != AV_NOPTS_VALUE) {
                    packet.pts(packet.dts());
                    lastDTS = packet.dts();
                } else {
                    packet.pts(lastDTS + 1);
                    packet.dts(packet.pts());
                    lastDTS = packet.pts();
                }
            } else {
                if (packet.dts() != AV_NOPTS_VALUE) {
                    if (packet.dts() < lastDTS) {
                        packet.dts(lastDTS + 1);
                    }
                    lastDTS = packet.dts();
                } else {
                    packet.dts(packet.pts());
                    lastDTS = packet.dts();
                }
            }

            if (packet.pts() < packet.dts()) {
                packet.pts(packet.dts());
            }


            recorder.recordPacket(packet);
            Thread.sleep(1);
        }

        log.info("push complete videoFrameNum[{}] audioFrameNum[{}] dataFrameNum[{}] elapsed[{}]s",
                videoFrameNum,
                audioFrameNum,
                dataFrameNum,
                (System.currentTimeMillis() - startTime) / 1000);

   
        recorder.close();
        grabber.close();
    }
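The pts/dts patch-up inside the grab loop above can be isolated as pure logic, which makes it easier to reason about and test. A minimal sketch (plain Java, no JavaCV types; FFmpeg's AV_NOPTS_VALUE is 0x8000000000000000, i.e. Long.MIN_VALUE):

```java
public class TimestampFixer {
    // FFmpeg's AV_NOPTS_VALUE is 0x8000000000000000L, which is Long.MIN_VALUE.
    static final long NOPTS = Long.MIN_VALUE;

    /** Mirrors the loop above: returns {pts, dts, newLastDts} for one packet. */
    static long[] normalize(long pts, long dts, long lastDts) {
        if (pts == NOPTS) {
            if (dts != NOPTS) {
                pts = dts;             // borrow dts when pts is missing
            } else {
                pts = lastDts + 1;     // both missing: synthesize monotonic values
                dts = pts;
            }
        } else if (dts == NOPTS) {
            dts = pts;                 // borrow pts when dts is missing
        } else if (dts < lastDts) {
            dts = lastDts + 1;         // enforce monotonically increasing dts
        }
        if (pts < dts) {
            pts = dts;                 // muxers require pts >= dts
        }
        return new long[]{pts, dts, dts};
    }
}
```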

Answer 1

Score: 1

OK, I figured it out: just add recorder.setAudioBitrate(grabber.getAudioBitrate()); and transcoding dhav to RTSP works.
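A plausible reading of the fix: the recorder's audio bitrate stays 0 unless set, and copying the grabber's bitrate gives the AAC encoder a fully specified configuration before the RTSP header is written. If the grabber itself might report 0, a defensive fallback helps; a minimal sketch (pickAudioBitrate is a hypothetical helper, not part of JavaCV):

```java
public class AudioBitrateFallback {
    /** Returns the grabbed bitrate if it looks valid, else a safe default (hypothetical helper). */
    static int pickAudioBitrate(int grabbedBitrate, int defaultBitrate) {
        return grabbedBitrate > 0 ? grabbedBitrate : defaultBitrate;
    }

    public static void main(String[] args) {
        // usage idea: recorder.setAudioBitrate(pickAudioBitrate(grabber.getAudioBitrate(), 128_000));
        System.out.println(pickAudioBitrate(0, 128_000));       // falls back to the default
        System.out.println(pickAudioBitrate(256_000, 128_000)); // passes the grabbed value through
    }
}
```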

huangapple
  • Posted on 2023-07-18 14:54:07
  • Please keep this link when reposting: https://go.coder-hub.com/76710189.html