Expo Three and Camera

Expo Three version: 5.4.0
Expo Camera: 8.0.0
Expo GL: 8.0.0
Expo Asset Utils: 1.2.0
Platform: iOS/Android

Hi, I’ve been working on this issue for a few days and can’t determine whether it’s possible or not:
I need to change a plane’s texture at runtime with a picture taken by the camera. The picture-taking part runs smoothly, and I can display the photo in a regular Image component, but no matter what I try, I can’t get it to show up inside the Three.js scene.

I always use this code to take the picture:

this.camera.takePictureAsync({
  skipProcessing: true,
  base64: true,
  onPictureSaved: async (photo) => {
    // Move the captured photo out of the cache into the app's document directory
    const imageUri = `${FileSystem.documentDirectory}pics/Photo_${id}.jpg`;
    await FileSystem.moveAsync({
      from: photo.uri,
      to: imageUri,
    });
    // several tries here, see below
  },
});
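
For completeness, the pics/ folder is created once beforehand, otherwise the move fails; a minimal sketch of that setup, using expo-file-system’s makeDirectoryAsync:

// Run once, e.g. at startup: make sure the target folder exists (like mkdir -p)
await FileSystem.makeDirectoryAsync(`${FileSystem.documentDirectory}pics/`, {
  intermediates: true,
});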

I’ve tried several different ways to create the new texture, and got varying results:

Black texture:

plane.material.map = await ExpoTHREE.loadAsync(imageUri);
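
For context, here is that first attempt in full. It’s a sketch, and the material.needsUpdate line is only a guess that swapping the map might require re-flagging the material:

// Let expo-three resolve the file URI and create/upload the texture
const texture = await ExpoTHREE.loadAsync(imageUri);
plane.material.map = texture;
plane.material.needsUpdate = true; // guess: recompile the material now that it has a map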

WebGL Invalid Pixel Argument Error:

const loadingManager = new THREE.LoadingManager();
const imageLoader = new THREE.ImageLoader(loadingManager);
// Note: ImageLoader.load() returns the image object right away and fills it in
// asynchronously via its onLoad callback, so this await doesn't actually wait.
const image = await imageLoader.load(imageUri);

const texture = new THREE.Texture();
texture.image = image;
texture.format = THREE.RGBFormat;
texture.needsUpdate = true;

plane.material.map = texture;
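
In case the problem is just that the await above never actually waits for the image (and I’m not even sure ImageLoader works outside a browser, since it relies on a DOM image element), here is the callback form as a sketch:

// Same idea, but only build the texture once ImageLoader reports the image as loaded
imageLoader.load(
  imageUri,
  (loadedImage) => {
    const texture = new THREE.Texture(loadedImage);
    texture.needsUpdate = true;
    plane.material.map = texture;
    plane.material.needsUpdate = true;
  },
  undefined,
  (error) => console.warn('ImageLoader failed:', error)
);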

Among other attempts, I tried ExpoTHREE.loadTextureAsync({ asset: await AssetUtils.resolveAsync(imageUri) }) (spelled out in the sketch below), as well as a few other things using Asset Utils, but I never managed to display anything other than a black texture or an error…
I also tried passing the base64 data directly, but understood that isn’t supported by expo-gl.
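
For reference, the Asset Utils attempt written out in full (a sketch; plane and imageUri are the same variables as above):

// resolveAsync should hand back a downloaded Asset with localUri/width/height set
const asset = await AssetUtils.resolveAsync(imageUri);
plane.material.map = await ExpoTHREE.loadTextureAsync({ asset });
plane.material.needsUpdate = true;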

So here’s the question: how can I display an image taken at runtime by the camera inside a Three.js scene?
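
For reference, the scene itself is set up along these lines (a simplified sketch rather than the exact code; the Renderer call in particular may differ slightly depending on the expo-three version):

import { GLView } from 'expo-gl';
import * as ExpoTHREE from 'expo-three';

const { THREE } = ExpoTHREE;

// Rendered somewhere in the component:
// <GLView style={{ flex: 1 }} onContextCreate={this.onContextCreate} />

onContextCreate = async (gl) => {
  const renderer = new ExpoTHREE.Renderer({ gl });
  renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight);

  const scene = new THREE.Scene();
  const camera = new THREE.PerspectiveCamera(
    75,
    gl.drawingBufferWidth / gl.drawingBufferHeight,
    0.1,
    1000
  );
  camera.position.z = 3;

  // The plane whose material.map I try to swap once a photo is taken
  this.plane = new THREE.Mesh(
    new THREE.PlaneGeometry(2, 2),
    new THREE.MeshBasicMaterial({ color: 0xffffff })
  );
  scene.add(this.plane);

  const animate = () => {
    requestAnimationFrame(animate);
    renderer.render(scene, camera);
    gl.endFrameEXP();
  };
  animate();
};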
