Why does Sceneform 1.17.1 play only the audio of a video and not display it correctly?

Question
I am creating an AR app using Sceneform, and no matter what I have tried so far, Sceneform does not display a video correctly over an image. This is the result I get when the camera detects the image:

[Image: the video image that I get]

Here is the code I have in the corresponding class:
// Route the video through an ExternalTexture so its frames can feed the renderable's material.
texture = new ExternalTexture();
mediaPlayer = MediaPlayer.create(this, R.raw.video);
mediaPlayer.setSurface(texture.getSurface());
mediaPlayer.setLooping(true);

// Load the video screen model and bind the video texture and chroma-key color to its material.
ModelRenderable
        .builder()
        .setSource(this, Uri.parse("video_screen.sfb"))
        .build()
        .thenAccept(modelRenderable -> {
            renderable = modelRenderable;
            modelRenderable.getMaterial().setExternalTexture("videoTexture", texture);
            modelRenderable.getMaterial().setFloat4("keyColor",
                    new Color(0.01843f, 1f, 0.098f));
        })
        .exceptionally(throwable -> {
            Toast toast = Toast.makeText(this, "Unable to load video renderable", Toast.LENGTH_LONG);
            toast.setGravity(Gravity.CENTER, 0, 0);
            toast.show();
            return null;
        });

arFragment = (com.stratos.syrostownhall.CustomArFragment)
        getSupportFragmentManager().findFragmentById(R.id.arFragment);
if (arFragment != null) {
    scene = arFragment.getArSceneView().getScene();
}
scene.addOnUpdateListener(this::onUpdate);
}  // closes the enclosing method (presumably onCreate)

private void onUpdate(FrameTime frameTime) {
    if (isImageDetected)
        return;

    // Check the current ARCore frame for updated augmented images.
    Frame frame = arFragment.getArSceneView().getArFrame();
    Collection<AugmentedImage> augmentedImages =
            frame != null ? frame.getUpdatedTrackables(AugmentedImage.class) : null;
    for (AugmentedImage painting : Objects.requireNonNull(augmentedImages)) {
        if (painting.getTrackingState() == TrackingState.TRACKING) {
            if (painting.getName().equals("painting")) {
                // Start the video once the image named "painting" is being tracked.
                isImageDetected = true;
                playVideo(painting.createAnchor(painting.getCenterPose()), painting.getExtentX(),
                        painting.getExtentZ());
                break;
            }
        }
    }
}

private void playVideo(Anchor anchor, float extentX, float extentZ) {
    mediaPlayer.start();
    AnchorNode anchorNode = new AnchorNode(anchor);

    // Attach the renderable only after the first video frame is available, to avoid a black flash.
    texture.getSurfaceTexture().setOnFrameAvailableListener(surfaceTexture -> {
        anchorNode.setRenderable(renderable);
        texture.getSurfaceTexture().setOnFrameAvailableListener(null);
    });

    // Scale the node to the physical size of the detected image.
    anchorNode.setWorldScale(new Vector3(extentX, 1f, extentZ));
    scene.addChild(anchorNode);
}
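The code above assumes an image named "painting" has already been registered in an AugmentedImageDatabase by the custom fragment, which the question does not show. For reference, a minimal sketch of how such a CustomArFragment is typically written follows; the class body, the reference photo file name painting.jpg, and the log tag are assumptions, not taken from the question:

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Log;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;
import com.google.ar.sceneform.ux.ArFragment;
import java.io.IOException;

// Minimal sketch only -- assumes a reference photo named painting.jpg in src/main/assets.
public class CustomArFragment extends ArFragment {
    @Override
    protected Config getSessionConfiguration(Session session) {
        Config config = super.getSessionConfiguration(session);
        try {
            // Load the reference photo of the painting (hypothetical file name).
            Bitmap bitmap = BitmapFactory.decodeStream(
                    requireContext().getAssets().open("painting.jpg"));
            AugmentedImageDatabase database = new AugmentedImageDatabase(session);
            // The name registered here must match the getName() check in onUpdate().
            database.addImage("painting", bitmap);
            config.setAugmentedImageDatabase(database);
        } catch (IOException e) {
            Log.e("CustomArFragment", "Could not load the augmented image", e);
        }
        return config;
    }
}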
Also, I add the asset manually using the following syntax:
sceneform.asset('sampledata/video_screen.obj',
        'default',
        'sampledata/video_screen.sfa',
        'src/main/assets/video_screen')
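For completeness, the following is a sketch of a typical setup and not taken from the question: a sceneform.asset declaration like the one above normally sits in the app module's build.gradle, with the Sceneform Gradle plugin applied, roughly like this:

// app/build.gradle -- minimal sketch, assuming the standard Sceneform 1.17.1 plugin setup
// (the project-level build.gradle also needs classpath 'com.google.ar.sceneform:plugin:1.17.1')
apply plugin: 'com.android.application'
apply plugin: 'com.google.ar.sceneform.plugin'

dependencies {
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.17.1'
}

sceneform.asset('sampledata/video_screen.obj',   // source model
        'default',                               // material argument
        'sampledata/video_screen.sfa',           // generated .sfa description
        'src/main/assets/video_screen')          // generated .sfb, loaded as "video_screen.sfb"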
Answer 1

Score: 0
The problem was caused by the video_screen.sfb file.

If anyone else is facing the same issue, try using the following video_screen file:
https://github.com/heyletscode/Play-Video-On-Augmented-Image/blob/master/Project/app/src/main/assets/video_screen.sfb

(Presumably this .sfb works because the material baked into it exposes the videoTexture and keyColor parameters that the code above sets on the renderable.)