
Android WebRTC cannot create offer due to invalid session options

Question

I have tried to create a video chat implementation in Android using the WebRTC SDK, with Firebase Firestore as the signalling mechanism. I followed some tutorials and was able to successfully create a peer-to-peer connection.

Initially I tried sending a video stream from one peer to the other, and it worked. Then, when I tried adding the audio track to the stream, the initiating peer could no longer create the offer (I found this out by debugging). As a result, the WebRTC connection can no longer be established, and I am stumped as to how this can happen.

Later I also found out that if I add only ONE track, either the audio track or the video track, my app works fine, but if I try adding both the audio and video tracks, the initiating peer can no longer create the offer.

Here is the full code of my CallActivity.java, which contains all of the WebRTC implementation. I have removed some statements from this code that I felt weren't needed.

public class CallActivity extends AppCompatActivity {
    private String userUid, friendUid;
    private FirebaseFirestore db = FirebaseFirestore.getInstance();
    private boolean isInitiator = false;

    //request codes
    private int CAMERA_PERMISSION_CODE = 0;

    //views
    SurfaceViewRenderer localVideoView, friendVideoView;

    //webrtc
    private EglBase rootEglBase;
    private PeerConnectionFactory factory;
    private PeerConnection peerConnection;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_call);
        userUid = getIntent().getStringExtra("userUid");
        friendUid = getIntent().getStringExtra("friendUid");
        isInitiator = getIntent().getBooleanExtra("initiator", false);
        checkPermissions();
        initialize();
        setupFireStoreListeners();
    }

    @Override
    public void onBackPressed() {
        hangup();
    }

    private void initialize() {
        //initialize views
        rootEglBase = EglBase.create();
        localVideoView = findViewById(R.id.localVideo);
        localVideoView.init(rootEglBase.getEglBaseContext(), null);
        localVideoView.setEnableHardwareScaler(true);
        localVideoView.setMirror(true);
        friendVideoView = findViewById(R.id.friendVideo);
        friendVideoView.init(rootEglBase.getEglBaseContext(), null);
        friendVideoView.setEnableHardwareScaler(true);
        friendVideoView.setMirror(true);

        //initialize peer connection factory
        PeerConnectionFactory.InitializationOptions initializationOptions = PeerConnectionFactory.InitializationOptions.builder(this)
                .setEnableInternalTracer(true)
                .setFieldTrials("WebRTC-H264HighProfile/Enabled/")
                .createInitializationOptions();
        PeerConnectionFactory.initialize(initializationOptions);
        PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
        options.disableEncryption = true;
        options.disableNetworkMonitor = true;
        factory = PeerConnectionFactory.builder()
                .setOptions(options)
                .setVideoDecoderFactory(new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext()))
                .setVideoEncoderFactory(new DefaultVideoEncoderFactory(rootEglBase.getEglBaseContext(), true, true))
                .createPeerConnectionFactory();

        //create video track from camera and show it
        VideoCapturer videoCapturer = createVideoCapturer();
        if (videoCapturer == null) {
            finish();
            return;
        }
        VideoSource videoSource = factory.createVideoSource(false);
        SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create(Thread.currentThread().getName(), rootEglBase.getEglBaseContext());
        videoCapturer.initialize(surfaceTextureHelper, localVideoView.getContext(), videoSource.getCapturerObserver());
        videoCapturer.startCapture(1240, 720, 30);
        VideoTrack localVideoTrack = factory.createVideoTrack("local", videoSource);
        localVideoTrack.addSink(localVideoView);

        //set ice candidates to null
        db.document("users/" + userUid).update("ice", null);
        db.document("users/" + friendUid).update("ice", null);

        //create peer connection
        ArrayList<PeerConnection.IceServer> iceServers = new ArrayList<>();
        iceServers.add(PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer());
        PeerConnection.Observer pcObserver = new SimplePeerConnectionObserver() {
            @Override
            public void onIceCandidate(IceCandidate iceCandidate) {
                Log.d("WEBRTCD", "Ice");
                db.runTransaction(new Transaction.Function<Void>() {
                    @Nullable
                    @Override
                    public Void apply(@NonNull Transaction transaction) throws FirebaseFirestoreException {
                        List<Map> iceList = (List<Map>) transaction.get(db.document("users/" + friendUid)).get("ice");
                        if (iceList == null) iceList = new ArrayList<>();
                        Map<String, Object> ice = new HashMap<>();
                        ice.put("label", iceCandidate.sdpMLineIndex);
                        ice.put("id", iceCandidate.sdpMid);
                        ice.put("sdp", iceCandidate.sdp);
                        iceList.add(0, ice);
                        transaction.update(db.document("users/" + friendUid), "ice", iceList);
                        return null;
                    }
                });
            }

            @Override
            public void onAddStream(MediaStream mediaStream) {
                VideoTrack remoteVideoTrack = mediaStream.videoTracks.get(0);
                if (mediaStream.audioTracks.size() > 0) {
                    AudioTrack remoteAudioTrack = mediaStream.audioTracks.get(0);
                    remoteAudioTrack.setEnabled(true);
                }
                remoteVideoTrack.setEnabled(true);
                remoteVideoTrack.addSink(friendVideoView);
            }
        };
        peerConnection = factory.createPeerConnection(iceServers, pcObserver);

        //create audio track
        MediaConstraints audioConstraints = new MediaConstraints();
        AudioSource audioSource = factory.createAudioSource(audioConstraints);
        AudioTrack localAudioTrack = factory.createAudioTrack("local", audioSource);
        peerConnection.setAudioRecording(true);
        peerConnection.setAudioPlayout(true);

        //add stream to peer connection
        MediaStream mediaStream = factory.createLocalMediaStream("local");
        //mediaStream.addTrack(localAudioTrack);
        mediaStream.addTrack(localVideoTrack);
        peerConnection.addStream(mediaStream);
        if (isInitiator) doCall();
        else doAnswer();
    }

    private void doCall() {
        db.document("users/" + friendUid).update("call", userUid);
        MediaConstraints mediaConstraints = new MediaConstraints();
        mediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
        mediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
        peerConnection.createOffer(new SimpleSdpObserver() {
            @Override
            public void onCreateSuccess(SessionDescription sessionDescription) {
                peerConnection.setLocalDescription(new SimpleSdpObserver(), sessionDescription);
                Map<String, String> sdp = new HashMap<>();
                sdp.put("type", "offer");
                sdp.put("desc", sessionDescription.description);
                db.document("users/" + friendUid).update("sdp", sdp);
            }
        }, mediaConstraints);
    }

    private void doAnswer() {
        db.document("users/" + friendUid).update("call", userUid);
        db.document("users/" + userUid).get().addOnCompleteListener(new OnCompleteListener<DocumentSnapshot>() {
            @Override
            public void onComplete(@NonNull Task<DocumentSnapshot> task) {
                if (task.isSuccessful() && task.getResult() != null) {
                    Map sdpData = (Map) task.getResult().get("sdp");
                    MediaConstraints mediaConstraints = new MediaConstraints();
                    mediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
                    mediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
                    peerConnection.setRemoteDescription(new SimpleSdpObserver(), new SessionDescription(SessionDescription.Type.OFFER, (String) sdpData.get("desc")));
                    peerConnection.createAnswer(new SimpleSdpObserver() {
                        @Override
                        public void onCreateSuccess(SessionDescription sessionDescription) {
                            peerConnection.setLocalDescription(new SimpleSdpObserver(), sessionDescription);
                            Map<String, String> sdp = new HashMap<>();
                            sdp.put("type", "answer");
                            sdp.put("desc", sessionDescription.description);
                            db.document("users/" + friendUid).update("sdp", sdp);
                        }
                    }, mediaConstraints);
                }
            }
        });
    }

    private void hangup() {
        db.document("users/" + friendUid).update("call", "hangup", "ice", null, "sdp", null);
        db.document("users/" + userUid).update("call", "hangup", "ice", null, "sdp", null);
    }

    private void setupFireStoreListeners() {
        //listen for ice candidates
        db.document("users/" + userUid).addSnapshotListener(this, new EventListener<DocumentSnapshot>() {
            @Override
            public void onEvent(@Nullable DocumentSnapshot value, @Nullable FirebaseFirestoreException error) {
                if (value != null && value.get("ice") != null) {
                    List<Map> iceList = (List<Map>) value.get("ice");
                    if (iceList == null) iceList = new ArrayList<>();
                    for (Map iceCandidate : iceList) {
                        Log.d("WEBRTCD", "Ice added");
                        peerConnection.addIceCandidate(new IceCandidate((String) iceCandidate.get("id"), Integer.parseInt(iceCandidate.get("label") + ""), (String) iceCandidate.get("sdp")));
                    }
                    //db.document("users/" + userUid).update("ice", null);
                }
            }
        });

        //listen for hangup
        db.document("users/" + userUid).addSnapshotListener(this, new EventListener<DocumentSnapshot>() {
            @Override
            public void onEvent(@Nullable DocumentSnapshot value, @Nullable FirebaseFirestoreException error) {
                if (value != null && value.get("call") != null && value.get("call").equals("hangup")) {
                    db.document("users/" + userUid).update("call", null);
                    endCall();
                }
            }
        });

        //listen for answer if initiator
        if (!isInitiator) return;
        db.document("users/" + userUid).addSnapshotListener(this, new EventListener<DocumentSnapshot>() {
            @Override
            public void onEvent(@Nullable DocumentSnapshot value, @Nullable FirebaseFirestoreException error) {
                if (value != null && value.get("sdp") != null) {
                    peerConnection.setRemoteDescription(new SimpleSdpObserver(), new SessionDescription(SessionDescription.Type.ANSWER, (String) ((Map) value.get("sdp")).get("desc")));
                    db.document("users/" + userUid).update("sdp", null);
                }
            }
        });
    }

    private void endCall() {
        peerConnection.close();
        super.onBackPressed();
    }

    private VideoCapturer createVideoCapturer() {
        VideoCapturer videoCapturer;
        CameraEnumerator enumerator;
        if (Camera2Enumerator.isSupported(this))
            enumerator = new Camera2Enumerator(this);
        else
            enumerator = new Camera1Enumerator(true);
        //prefer a front-facing camera
        for (String device : enumerator.getDeviceNames()) {
            if (enumerator.isFrontFacing(device)) {
                videoCapturer = enumerator.createCapturer(device, null);
                if (videoCapturer != null)
                    return videoCapturer;
            }
        }
        //fall back to any other camera
        for (String device : enumerator.getDeviceNames()) {
            if (!enumerator.isFrontFacing(device)) {
                videoCapturer = enumerator.createCapturer(device, null);
                if (videoCapturer != null)
                    return videoCapturer;
            }
        }
        return null;
    }
}

Please note this line

//mediaStream.addTrack(localAudioTrack);

towards the end of the initialize() method. With this line commented out, everything works fine, but if I uncomment it, WebRTC cannot create a connection because no offer is ever created successfully, and that is the problem I have no idea how to solve. I tried googling and found only some irrelevant questions about the WebRTC browser API. I don't understand how this one line can prevent the entire WebRTC connection from working properly. (I know WebRTC cannot create the offer because the onCreateSuccess() method passed to peerConnection.createOffer() never gets called if I uncomment this line.) Since everything works fine with this line commented out, I don't believe the Firestore code that performs the signalling is the problem.
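
One way to surface the underlying error, rather than inferring it from the missing onCreateSuccess() callback, would be to override onCreateFailure() on the observer passed to createOffer(). This is only a debugging sketch, assuming SimpleSdpObserver is a no-op implementation of org.webrtc.SdpObserver as elsewhere in the code:

// Debugging sketch (not in the original code): log the native error string
// that WebRTC reports when createOffer() is rejected.
peerConnection.createOffer(new SimpleSdpObserver() {
    @Override
    public void onCreateSuccess(SessionDescription sessionDescription) {
        Log.d("WEBRTCD", "Offer created");
        // ...set the local description and publish the offer as before...
    }

    @Override
    public void onCreateFailure(String error) {
        // Expected to fire in the failing case, with the reason for the rejection.
        Log.e("WEBRTCD", "createOffer failed: " + error);
    }
}, mediaConstraints);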

Here is the build.gradle file:

apply plugin: 'com.android.application'
apply plugin: 'com.google.gms.google-services'

android {
    compileSdkVersion 29
    buildToolsVersion "29.0.3"

    defaultConfig {
        applicationId "com.example.myapplication"
        minSdkVersion 19
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        multiDexEnabled true
        vectorDrawables.useSupportLibrary true
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    implementation 'com.google.firebase:firebase-firestore:21.5.0'
    implementation 'com.android.support:multidex:1.0.3'
    implementation 'com.google.firebase:firebase-auth:19.3.2'
    implementation 'com.google.firebase:firebase-storage:19.1.1'
    implementation 'androidx.navigation:navigation-fragment:2.3.0'
    implementation 'androidx.navigation:navigation-ui:2.3.0'
    implementation 'androidx.legacy:legacy-support-v4:1.0.0'
    implementation 'androidx.lifecycle:lifecycle-extensions:2.2.0'
    implementation 'com.google.firebase:firebase-messaging:20.2.4'
    testImplementation 'junit:junit:4.13'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    implementation 'com.google.android.material:material:1.2.0'
    implementation 'androidx.documentfile:documentfile:1.0.1'
    implementation "com.mikepenz:materialdrawer:6.1.2"
    implementation 'org.webrtc:google-webrtc:1.0.30039'
}

I have followed some online tutorials on WebRTC for Android, and all of them set up only the video stream (which works in my app). Since there is no official WebRTC documentation, I have no idea how to fix this. Please help!

Answer 1

Score: 1

OK, so I figured it out: the audio and video tracks must have unique IDs. In my case, both the audio and video tracks had the ID "local", which messed up WebRTC; they must have different IDs. Also, when on a call, the two clients should use different IDs for their streams.

In my case, I changed the audio and video track IDs to _audio and _video respectively, and it worked.
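
A minimal sketch of that fix, assuming the same factory, audioSource, and videoSource objects as in the question (prefixing with userUid is just one possible way to keep the IDs distinct between the two clients; the exact naming scheme is an assumption):

// Give every track, and the stream itself, a unique ID. The userUid prefix is
// an assumed convention so the two clients never publish identical IDs.
AudioTrack localAudioTrack = factory.createAudioTrack(userUid + "_audio", audioSource);
VideoTrack localVideoTrack = factory.createVideoTrack(userUid + "_video", videoSource);

MediaStream mediaStream = factory.createLocalMediaStream(userUid + "_stream");
mediaStream.addTrack(localAudioTrack); // adding both tracks now works
mediaStream.addTrack(localVideoTrack);
peerConnection.addStream(mediaStream);

With the IDs unique per track and per client, both the audio and the video track can be added to the stream before createOffer() is called.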
