I’d like to build an audio visualizer app using React Native and Expo, but I’m not sure whether a React Native application can listen to the phone’s audio output. Is there a way to achieve this?
Additionally, if you can listen to the phone’s output audio, is it possible to edit it or apply filters to it? For example, an equalizer with presets for different types of audio?
Thanks in advance.
I’m not sure there’s actually a way to do this in general in iOS apps, Expo or not. Do you know of a way to pipe in the audio of other apps, even using iOS native code?
I’m kind of new to this myself, so I’m not sure I’m looking in the right place, but here are some brief notes from my research so far:
Android has a MODIFY_AUDIO_SETTINGS permission that:
Allows an application to modify global audio settings.
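For reference, requesting that permission is just a manifest entry. One caveat: it only unlocks global audio settings (volume, routing, and so on); on its own it does not let you capture audio, from your own app or anyone else’s:

```xml
<!-- AndroidManifest.xml -->
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```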
On iOS there is a concept of “Audio Units.” Several audio units are included with the system; here are a few that may be helpful:
iPod Equalizer Unit:
Provides the features of the iPod equalizer.
3D Mixer Unit:
Supports mixing multiple audio streams, output panning, sample rate conversion, and more.
Multichannel Mixer Unit:
Supports mixing multiple audio streams to a single stream.
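As a caveat to the above: as far as I can tell, these units only process audio your own app plays; iOS doesn’t let you tap the output of other apps. For equalization specifically, the modern, Swift-friendly wrapper seems to be `AVAudioUnitEQ` on top of `AVAudioEngine`, rather than the older Audio Unit C APIs. Here’s a rough sketch of what that might look like (untested; `song.mp3` is a hypothetical bundled file):

```swift
import AVFoundation

// Sketch: applying a two-band EQ to audio our own app plays.
// Note: this processes only this app's playback, not system-wide audio.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 2)

// Band 0: low-shelf bass boost.
let bass = eq.bands[0]
bass.filterType = .lowShelf
bass.frequency = 100   // Hz
bass.gain = 6          // dB
bass.bypass = false

// Band 1: high-shelf treble cut.
let treble = eq.bands[1]
treble.filterType = .highShelf
treble.frequency = 8_000
treble.gain = -3
treble.bypass = false

engine.attach(player)
engine.attach(eq)

do {
    // "song.mp3" is a placeholder for whatever audio the app bundles.
    let url = Bundle.main.url(forResource: "song", withExtension: "mp3")!
    let file = try AVAudioFile(forReading: url)

    // Signal chain: player -> EQ -> main mixer -> output.
    engine.connect(player, to: eq, format: file.processingFormat)
    engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)

    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
} catch {
    print("Audio setup failed: \(error)")
}
```

To use this from React Native you’d presumably still need a native module bridging to something like the above, since Expo’s JS APIs don’t expose EQ directly (at least, I haven’t found one that does).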
There’s a sample project in Apple’s developer resources that interfaces with the iPod EQ unit here: https://developer.apple.com/library/content/samplecode/iPhoneMixerEQGraphTest/Introduction/Intro.html#//apple_ref/doc/uid/DTS40009555
This person also seems to have used standard React Native to interface with audio and use equalization: https://www.youtube.com/watch?v=2n4feu3O_zU
That’s what I’ve found so far, but I’m not really sure I’m headed in the right direction.
This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.