Expo doesn’t seem to play nice with react-native-fetch-blob (requires linking). How can I upload image blobs to a cloud service?
The Blob web API isn’t supported yet, but we have a couple of pull requests open on React Native upstream to add support for it. That said, you can already upload images without using the Blob API by sending a multipart POST request with fetch. Closed-source tools like Firebase that depend on Blob won’t work until those pull requests land.
If you need to use Firebase or something like that, you could upload to S3 and store the URL in the database service.
I hope that helps!
Guess that settles which services I’m going to be using for the next few months. Thank you for the help!
Has there been any development on this since March? Just wondering if this is now possible.
It’s technically possible, you just need to write an HTTP POST request with fetch and use a little multipart-form magic.
@allpwrfulroot would be much appreciated if you could give an example of what you mean! Thanks.
Not a pure answer to the question, but here is my adaptation of notbrent’s examples and some GraphQL docs to fetch an image and upload it to a server. Getting the correct FormData takes some experimentation and depends on the server / service you’re using; if your docs provide a cURL example, that’s the place to start.
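A minimal sketch of the multipart approach described above, assuming an Expo ImagePicker result with a local `uri`. The endpoint URL and the field name `photo` are placeholders; match them to whatever your server or service expects:

```javascript
// Sketch: upload a local image URI with fetch + multipart/form-data.
// The field name ('photo') and endpoint are assumptions -- check your
// server's docs (or its cURL example) for the real values.
function buildImageForm(localUri) {
  // Derive a filename and a rough MIME type from the URI extension.
  const name = localUri.split('/').pop() || 'photo.jpg';
  const ext = name.split('.').pop();
  const type = ext === 'png' ? 'image/png' : 'image/jpeg';

  const form = new FormData();
  // In React Native, appending { uri, name, type } attaches the file.
  form.append('photo', { uri: localUri, name, type });
  return form;
}

async function uploadImageAsync(localUri, endpoint) {
  const response = await fetch(endpoint, {
    method: 'POST',
    body: buildImageForm(localUri),
    // Note: don't set Content-Type yourself; fetch fills in the
    // multipart boundary automatically when the body is FormData.
  });
  return response.json();
}
```

With an ImagePicker result this would be called roughly as `uploadImageAsync(pickerResult.uri, 'https://example.com/upload')`.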
Thanks! I will try to understand that code, but I think my comprehension of it is limited at best ATM
The example you gave was for uploading to S3; can you do the same thing with the new Firebase functions?
yup, you can. I don’t have a link to an example app that does it, but I’ve heard of Expo users doing it before. if you can create an example app for that, it would be neat!
You can upload with Firebase by converting the uri to a blob with XMLHttpRequest – only tested on iOS so far.
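A sketch of that conversion, assuming the environment supports `responseType = 'blob'` (which is what the Blob pull requests mentioned earlier add). The Firebase storage path in the usage comment is a placeholder:

```javascript
// Sketch: read a local file uri into a Blob via XMLHttpRequest,
// which can then be handed to Firebase Storage.
function uriToBlob(uri) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.responseType = 'blob'; // ask XHR for a Blob instead of text
    xhr.onload = () => resolve(xhr.response);
    xhr.onerror = () => reject(new Error('uriToBlob failed for ' + uri));
    xhr.open('GET', uri, true);
    xhr.send(null);
  });
}

// Usage (assumes an initialized firebase app; path is a placeholder):
// const blob = await uriToBlob(pickerResult.uri);
// await firebase.storage().ref('photos/pic.jpg').put(blob);
```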
That solution ultimately did not work, so I am just converting the base64 string to a byte array in code for both iOS and Android, and with that I can get file uploading working with Firebase – https://github.com/aaronksaunders/expo-rn-firebase-image-upload/blob/master/README.md
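The base64-to-byte-array step can be sketched as below, using the global `atob()` (available in browsers; in React Native it needs a polyfill, which is the pitfall discussed later in this thread). `convertToByteArray` is the helper name referenced in the posts that follow:

```javascript
// Sketch: decode a base64 string into a Uint8Array, which can then be
// uploaded with e.g. firebase.storage().ref(...).put(bytes).
function convertToByteArray(base64) {
  const binary = atob(base64);            // base64 -> binary string
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);      // one byte per char code
  }
  return bytes;
}

// Example: 'aGk=' is base64 for "hi"
// convertToByteArray('aGk=') -> Uint8Array [104, 105]
```

With ImagePicker this would be fed from `pickerResult.base64` (requested via `base64: true` in the picker options).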
Wow. I have just tried this and it works perfectly! Thank you.
I tried your example, and while it’s exactly what I’ve been looking for, I can’t seem to get the base64 from the ImagePicker. I’m passing base64: true in the options object for launchImageLibraryAsync, but for some reason no base64 comes back in my project, only in yours. My pickerResult.base64 is coming out null.
what sdk version are you using? this only works on SDK 18+
That’ll do it. Updating now. Fingers crossed for no breaking changes.
If I have a published version of my app on the app stores and I update from SDK 17 to 18, do I need to update the app through the app stores, or can I do it from Expo with publish?
it depends on when you did the standalone build – if you did it after we released SDK 18 then you should be fine. if you publish an SDK 18 version of your app and notice that it’s not being used when you restart the standalone app, then you need to rebuild it. see the limitations section of our publishing guide
Aaron, I tried your solution and everything seems to make a lot of sense, but for some reason when I pass the base64 string to this.convertToByteArray(pickerResult.base64), I get the following error: Error: ‘atob’ failed: The string to be decoded is not correctly encoded.
Do you have any idea why that’s happening?
Edit: I ended up ditching your atob function for the decode() method from the base-64 library and it all worked!
glad it worked for you! where is the base-64 library that you used? it might be helpful to others who run into issues with the implementation I presented here.