A data texture to read all vertex coordinates in the shader, but my DataTexture seems to be empty

I created a BufferGeometry from an array.

https://stackoverflow.com/questions/76318881/how-do-i-adjust-individual-vertices-in-the-vertexshader

It all works so far. Now I want to pass the vertex coordinates not only as an attribute but also as a data texture. The reason for this is that I need the vertex coordinates of neighboring points.
Near the end of init() I create a data texture, but it seems to be empty, and I still don't understand why. My goal is to build the geometry not from the vertices in the position attribute, but from the same coordinates stored in the data texture. For this I create a data texture from the positions array, and I also added a vindex attribute so that the shader can use it to read the coordinates of the respective vertex from the texture. Does anyone know why my data texture isn't working?
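For reference, the intended lookup (matching the wStep / hWStep math in the vertex shader below) maps each vertex index to a texel center in the one-row texture:

    u = (vindex + 0.5) / uTextureSize,   v = 0.5

so a neighboring vertex is simply one texel step of 1 / uTextureSize away along u. The full code follows.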

import * as THREE from "../resources/libs/three/build/three.module.js";
import { OrbitControls } from '../resources/libs/three/examples/jsm/controls/OrbitControls.js';
import WebGL from '../resources/libs/three/examples/jsm/WebGL.js';
const P = new THREE.Vector3();
const N1 = new THREE.Vector3();
const N2 = new THREE.Vector3();
const N3 = new THREE.Vector3();
const D1 = new THREE.Vector3();
const D2 = new THREE.Vector3();
const VS = `
precision highp float;
precision highp int;
precision highp sampler2D;

uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
uniform vec3 cameraPosition;
uniform sampler2D uSpatialTexture;
uniform float uTextureSize;
uniform float time;
uniform int point;

// Attributes
in vec3 position;
in vec3 normal;
in int vindex;

// Outputs
out vec3 vNormal;

void main() {
  float wStep = 1. / uTextureSize;
  float hWStep = wStep * 0.5;
  float t = float(vindex) * wStep + hWStep;
  vec3 coordFromTex = texture(uSpatialTexture, vec2(t, 0.5)).rgb;

  // just initial normals
  vNormal = normalize(normal);

  /*****************************************************************
  the goal is to get exactly the same result as with the normal way,
  but reading the vertex coordinates from the data texture
  *****************************************************************/
  //vec3 newPosition = coordFromTex + vNormal * 3.;
  //gl_Position = projectionMatrix * modelViewMatrix * vec4(coordFromTex, 1.0);

  // the normal way
  vec3 newPosition = position + vNormal * 3.;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  if(vindex == point){
    gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
  }
}`;
const FS = `
precision highp float;
precision highp int;
precision highp sampler2D;

uniform sampler2D uSpatialTexture;
uniform float uTextureSize;

in vec3 vNormal;

out vec4 out_FragColor;

void main() {
  out_FragColor = vec4(vec3(1., 0., 0.), 1.);
}`;
class Main {
  constructor(){
    this.init();
    this.animate();
  }

  init(){
    if (!WebGL.isWebGL2Available()) {return false;}

    const canvas = document.createElement('canvas');
    const context = canvas.getContext('webgl2');
    this.renderer = new THREE.WebGLRenderer({ canvas: canvas, context: context, antialias: true });
    this.renderer.setPixelRatio( window.devicePixelRatio );
    this.renderer.shadowMap.enabled = true;
    this.renderer.shadowMap.type = THREE.PCFSoftShadowMap;
    this.container = document.getElementById('container');
    this.renderer.setSize(this.container.clientWidth, this.container.clientHeight);
    this.container.appendChild( this.renderer.domElement );
    this.aspect = this.container.clientWidth / this.container.clientHeight;

    this.scene = new THREE.Scene();
    this.scene.background = new THREE.Color( 0x557799 );
    this.camera = new THREE.PerspectiveCamera( 50, this.aspect, 0.1, 10000000 );
    this.camera.position.set( 30, 20, -30 );
    this.controls = new OrbitControls( this.camera, this.renderer.domElement );
    this.controls.screenSpacePanning = true;
    this.controls.minDistance = 5;
    this.controls.maxDistance = 40;
    this.controls.target.set( 0, 2, 0 );
    this.controls.update();

    //************************************************************************
    this.params = {
      resolution: 5,
      width: 20,
    }
    const positions = [];
    const vertexIndex = [];
    const resolution = this.params.resolution;
    const width = this.params.width;
    const half = width / 2;
    let idx = 0;
    for (let x = 0; x <= resolution; x++) {
      const xp = width * x / resolution;
      for (let z = 0; z <= resolution; z++) {
        const zp = width * z / resolution;
        // Compute position
        P.set(xp - half, 0, zp - half);
        positions.push(P.x, P.y, P.z);
        vertexIndex.push(idx);
        idx += 1;
      }
    }
    // Generate indices and normals
    const indices = this.GenerateIndices();
    const normals = this.GenerateNormals(positions, indices);

    const bytesInFloat32 = 4;
    const bytesInInt32 = 4;
    const positionsArray = new Float32Array(new ArrayBuffer(bytesInFloat32 * positions.length));
    const normalsArray = new Float32Array(new ArrayBuffer(bytesInFloat32 * normals.length));
    const indicesArray = new Uint32Array(new ArrayBuffer(bytesInInt32 * indices.length));
    const vIndicesArray = new Uint32Array(new ArrayBuffer(bytesInInt32 * vertexIndex.length));
    positionsArray.set(positions, 0);
    normalsArray.set(normals, 0);
    indicesArray.set(indices, 0);
    vIndicesArray.set(vertexIndex, 0);

    var uniform = {
      point: {value: null},
      uSpatialTexture: {value: null},
      uTextureSize: {value: null},
    }
    this.material = new THREE.RawShaderMaterial({
      glslVersion: THREE.GLSL3,
      uniforms: uniform,
      vertexShader: VS,
      fragmentShader: FS,
      side: THREE.DoubleSide,
      wireframe: true,
    });
    this.geometry = new THREE.BufferGeometry();
    this.mesh = new THREE.Mesh(this.geometry, this.material);
    this.mesh.castShadow = false;
    this.mesh.receiveShadow = true;
    this.mesh.frustumCulled = false;
    this.mesh.position.set(0, 0, 0);
    this.mesh.rotation.x = Math.PI;

    this.geometry.setAttribute('position', new THREE.Float32BufferAttribute(positionsArray, 3));
    this.geometry.setAttribute('normal', new THREE.Float32BufferAttribute(normalsArray, 3));
    this.geometry.setAttribute('index', new THREE.Int32BufferAttribute(indicesArray, 1));
    this.geometry.setAttribute('vindex', new THREE.Int32BufferAttribute(vIndicesArray, 1));
    this.geometry.setIndex(new THREE.BufferAttribute(indicesArray, 1));
    this.geometry.attributes.position.needsUpdate = true;
    this.geometry.attributes.normal.needsUpdate = true;
    this.geometry.attributes.index.needsUpdate = true;
    this.geometry.attributes.vindex.needsUpdate = true;

    //here i store all vertex coordinates in a data texture
    this.tex = new THREE.DataTexture(
      positionsArray,
      vIndicesArray.length + 1,  //width
      1,                         //height
      THREE.RGBFormat,
      THREE.FloatType
    );
    this.mesh.material.uniforms.uSpatialTexture.value = this.tex;
    this.mesh.material.uniforms.uTextureSize.value = vIndicesArray.length + 1;
    this.mesh.material.uniformsNeedUpdate = true;
    this.scene.add( this.mesh );

    const ambientLight = new THREE.AmbientLight( 0xffffff, 0.2 );
    this.scene.add( ambientLight );
    const pointLight = new THREE.PointLight( 0xffffff, 0.8 );
    this.scene.add( this.camera );
    this.camera.add( pointLight );
  }//end init
  GenerateNormals(positions, indices) {
    const normals = new Array(positions.length).fill(0.0);
    for (let i = 0, n = indices.length; i < n; i += 3) {
      const i1 = indices[i] * 3;
      const i2 = indices[i+1] * 3;
      const i3 = indices[i+2] * 3;
      N1.fromArray(positions, i1);
      N2.fromArray(positions, i2);
      N3.fromArray(positions, i3);
      D1.subVectors(N3, N2);
      D2.subVectors(N1, N2);
      D1.cross(D2);
      normals[i1]   += D1.x;
      normals[i2]   += D1.x;
      normals[i3]   += D1.x;
      normals[i1+1] += D1.y;
      normals[i2+1] += D1.y;
      normals[i3+1] += D1.y;
      normals[i1+2] += D1.z;
      normals[i2+2] += D1.z;
      normals[i3+2] += D1.z;
    }
    return normals;
  }
  GenerateIndices() {
    const resolution = this.params.resolution;
    const indices = [];
    for (let i = 0; i < resolution; i++) {
      for (let j = 0; j < resolution; j++) {
        indices.push(
          i * (resolution + 1) + j,
          (i + 1) * (resolution + 1) + j + 1,
          i * (resolution + 1) + j + 1);
        indices.push(
          (i + 1) * (resolution + 1) + j,
          (i + 1) * (resolution + 1) + j + 1,
          i * (resolution + 1) + j);
      }
    }
    return indices;
  }
  animate(){
    //requestAnimationFrame( this.animate );
    requestAnimationFrame( this.animate.bind(this) );
    this.render();
  }//end animate

  render(){
    //this.controls.update();
    this.camera.updateMatrixWorld();
    this.camera.updateProjectionMatrix();
    this.renderer.render(this.scene, this.camera);
    var index = document.getElementById("testfield1").value;
    if(index == '') this.mesh.material.uniforms.point.value = 100000000000;  //means no vertex selected
    else this.mesh.material.uniforms.point.value = index;
    this.mesh.material.uniformsNeedUpdate = true;
  }//end render
}//end class

new Main();

//------------------------Update----------------------------

Problem solved: the DataTexture wants RGBA data, so I have to add an alpha channel even though I don't need it. Maybe I can use it for something at some point.
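For reference, a minimal sketch of what the RGBA version could look like, assuming the same positionsArray as in the code above (the repository may do this differently; the NearestFilter and needsUpdate lines are my own precautions, not something stated in the post):

// Pad the xyz positions into an RGBA float array (the A channel is unused and set to 1).
const vertexCount = positionsArray.length / 3;
const rgba = new Float32Array(vertexCount * 4);
for (let i = 0; i < vertexCount; i++) {
  rgba[i * 4 + 0] = positionsArray[i * 3 + 0]; // R = x
  rgba[i * 4 + 1] = positionsArray[i * 3 + 1]; // G = y
  rgba[i * 4 + 2] = positionsArray[i * 3 + 2]; // B = z
  rgba[i * 4 + 3] = 1;                         // A = padding
}
const tex = new THREE.DataTexture(rgba, vertexCount, 1, THREE.RGBAFormat, THREE.FloatType);
tex.minFilter = THREE.NearestFilter; // avoid interpolating float data (assumption, not from the post)
tex.magFilter = THREE.NearestFilter;
tex.needsUpdate = true;              // upload the data to the GPU (assumption, not from the post)

Note that the width here is the exact vertex count rather than the length + 1 used above; whatever width is chosen, uTextureSize has to match it, otherwise the texel-center math in the shader lands between texels.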

I uploaded a repository to GitHub. Here is the link:

https://github.com/Spiri0/BufferGeometry-from-DataTexture

But I have a new problem: if I set the resolution to 64, my geometry disappears. I guess this is because the DataTexture isn't ready yet. Does anyone know an elegant way to extend the repository so that it works with any resolution, e.g. even 500? I don't need such high resolutions at the moment, but it should work stably in general.
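I can't tell from the post what the final fix in the repository is, but one thing worth ruling out for high resolutions: with a one-row layout the texture width is (resolution + 1)^2 texels, which quickly runs into the GPU's maximum texture size (4225 texels at resolution 64 already exceeds a 4096 limit on some devices, and resolution 500 would need 251001). A sketch of a roughly square layout with the corresponding 2D lookup; this is purely my own suggestion, not necessarily what the repository does:

// Lay the vertex data out in a roughly square RGBA float texture
// so the width never exceeds the GPU's maximum texture dimension.
const count = positionsArray.length / 3;
const texWidth = Math.ceil(Math.sqrt(count));
const texHeight = Math.ceil(count / texWidth);
const data = new Float32Array(texWidth * texHeight * 4); // unused texels stay 0
for (let i = 0; i < count; i++) {
  data.set([positionsArray[i * 3], positionsArray[i * 3 + 1], positionsArray[i * 3 + 2], 1], i * 4);
}
const tex = new THREE.DataTexture(data, texWidth, texHeight, THREE.RGBAFormat, THREE.FloatType);
tex.needsUpdate = true;

// In the GLSL3 vertex shader the lookup then becomes two-dimensional, e.g.:
//   ivec2 uv = ivec2(vindex % texWidth, vindex / texWidth);
//   vec3  p  = texelFetch(uSpatialTexture, uv, 0).rgb;
// where texWidth is passed in as an additional int uniform.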

Answer 1

Score: 0

I got the solution thanks to prisoner849 from the three.js forum. Pointing out that my problem is solved seems sensible to me, so that it doesn't remain an open question forever. I also have a repository on GitHub if anyone is interested.

Here is the link:

https://github.com/Spiri0/BufferGeometry-from-DataTexture
