Choose specific input channel as a mono input from USB device in AVAudioSession/AVAudioEngine
Question
I'm working on an audio recording app which uses an external USB audio interface (e.g., a Focusrite Scarlett Solo) connected to an iPhone.
When I run AVAudioSession.sharedInstance().currentRoute.inputs it returns the interface correctly:
    1 element
      - 0 : <AVAudioSessionPortDescription: 0x28307c650, type = USBAudio; name = Scarlett Solo USB; UID = AppleUSBAudioEngine:Focusrite:Scarlett Solo USB:130000:1,2; selectedDataSource = (null)>
Channels are returned correctly as well.

    po AVAudioSession.sharedInstance().currentRoute.inputs.first?.channels
    ▿ Optional<Array<AVAudioSessionChannelDescription>>
      ▿ some : 2 elements
        - 0 : <AVAudioSessionChannelDescription: 0x283070b60, name = Scarlett Solo USB 1; label = 4294967295 (0xffffffff); number = 1; port UID = AppleUSBAudioEngine:Focusrite:Scarlett Solo USB:130000:1,2>
        - 1 : <AVAudioSessionChannelDescription: 0x283070b70, name = Scarlett Solo USB 2; label = 4294967295 (0xffffffff); number = 2; port UID = AppleUSBAudioEngine:Focusrite:Scarlett Solo USB:130000:1,2>
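For reference, this is roughly how that route and channel information can be printed programmatically; a minimal sketch assuming the session is already configured and active:

    import AVFAudio

    // List every input port on the current route together with its channels.
    let session = AVAudioSession.sharedInstance()
    for port in session.currentRoute.inputs {
        print("port: \(port.portName) (\(port.portType.rawValue))")
        for channel in port.channels ?? [] {
            print("  channel \(channel.channelNumber): \(channel.channelName)")
        }
    }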
When I connect the inputNode to the mainMixerNode in AVAudioEngine, it uses multi-channel input, so the Line/Instrument input is on the right channel and the Microphone input is on the left.
How can I make it use only the 2nd channel (guitar) as a mono signal played back through both speakers?
I've been looking through some docs and discussions but could not find the answer.
I tried changing the channel count to 1 in the audio format; as expected, it plays the first channel in mono, but I can't select the 2nd channel to be played instead.
    let input = engine.inputNode
    let inputFormat = input.inputFormat(forBus: 0)

    let preferredFormat = AVAudioFormat(
        commonFormat: inputFormat.commonFormat,
        sampleRate: inputFormat.sampleRate,
        channels: 1,
        interleaved: false
    )!

    engine.connect(input, to: engine.mainMixerNode, format: preferredFormat)
EDIT:
As asked, I'm adding the channel mapping code I tried:

    let input = engine.inputNode
    let inputFormat = input.inputFormat(forBus: 0)

    engine.mainMixerNode.auAudioUnit.channelMap = [1]
    engine.connect(input, to: engine.mainMixerNode, format: inputFormat)
EDIT 2:
So the answer from bugix does work up to a point, but I had issues with panning. This is how I initially fixed panning, but then I noticed it was passing all the inputs through as mono. I've just tested it in an empty project in the app delegate, and I'm posting the whole code so you can run it and see it in action.
    @main
    class AppDelegate: UIResponder, UIApplicationDelegate {
        let engine = AVAudioEngine()

        var inputNode: AVAudioInputNode { engine.inputNode }
        var inputFormat: AVAudioFormat { inputNode.inputFormat(forBus: 0) }
        var outputNode: AVAudioOutputNode { engine.outputNode }
        var outputFormat: AVAudioFormat { outputNode.outputFormat(forBus: 0) }
        var mainMixerNode: AVAudioMixerNode { engine.mainMixerNode }
        let mixerNode: AVAudioMixerNode = .init()

        func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
            engine.attach(mixerNode)

            let layoutTag: AudioChannelLayoutTag = kAudioChannelLayoutTag_Mono
            let layout = AVAudioChannelLayout(layoutTag: layoutTag)!
            let monoFormat: AVAudioFormat = .init(standardFormatWithSampleRate: inputFormat.sampleRate, channelLayout: layout)

            engine.connect(inputNode, to: mixerNode, format: inputFormat)
            engine.connect(mixerNode, to: mainMixerNode, format: monoFormat)

            try! engine.start()
            mixerNode.pan = -0.4

            return true
        }
    }
Also, it didn't work when I tested with a 10-input-channel audio interface; it was only streaming the first two signals. I've tested the same audio interface with GarageBand and it works there. I wonder how Apple does it while at the same time not giving us a straightforward interface to do so. Ideally it should be possible to select the input channel via AVAudioSession, but it doesn't have that feature. I've looked through all the docs of AVAudioSession, AVAudioEngine and whatnot, but couldn't find anything there.
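For completeness, a minimal sketch of a typical session setup for a USB interface (not from the original post); it shows that AVAudioSession lets you prefer a whole port, but exposes nothing for picking one of its channels:

    import AVFAudio

    // Configure the session for recording and playback.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord)
    try session.setActive(true)

    // You can prefer the USB port as a whole, but there is no API here
    // to select an individual channel of that port.
    if let usb = session.availableInputs?.first(where: { $0.portType == .usbAudio }) {
        try session.setPreferredInput(usb)
    }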
Answer 1
Score: 1
So this technical note was the documentation that helped me understand many things. I'm not going to write out the details myself, as it's better if you also read it: https://developer.apple.com/library/archive/technotes/tn2091/_index.html
I can't post pictures yet, so I'd suggest checking out the signal flow diagram of AUHAL (on iOS it would be AURemoteIO) in the link above.
Apple states:
> When you want to get the audio device's input data, the connection should be Source: AUHAL (output scope, element 1) and Destination: Destination Unit (input scope, input element).
We can skip most of the steps, since AVAudioEngine does them at a higher level, but let's have a look at the Channel Mapping section.
We need to set a channel map on engine.inputNode.audioUnit, which is the unit closest to the hardware and therefore lower-level than auAudioUnit and avAudioUnit. In fact, auAudioUnit might still show the channel map as [0, 1], but after we set the property on audioUnit we could say that both 0 and 1 represent 1; if you have more channels it could be 2, 3, etc., or maybe even [4, 6] under the hood.
    import AVFAudio
    import AudioToolbox

    // Error type assumed; the original answer references these cases without defining them.
    enum MyError: Error {
        case sourceChannelCountIsGreaterThanDestChannelCount
        case invalidChannelIndexSent
        case failedToSetChannelMap
    }

    typealias AudioChannelCount = UInt32
    typealias AudioChannelsResolver = (AudioChannelCount) -> [UInt32]

    func mapChannels(
        _ channels: [UInt32],
        inputUnit: AudioUnit,
        format: AVAudioFormat,
        toChannels destinationChannelsResolver: AudioChannelsResolver
    ) throws {
        let numberOfChannels: UInt32 = format.channelCount
        // One Int32 slot per channel; mapSize is the property size in bytes.
        let mapSize: UInt32 = .init(MemoryLayout<Int32>.size * Int(numberOfChannels))
        let channelMap = UnsafeMutablePointer<Int32>.allocate(capacity: Int(numberOfChannels))
        defer { channelMap.deallocate() }

        let destinationChannels = destinationChannelsResolver(numberOfChannels)
        guard channels.count <= destinationChannels.count else {
            throw MyError.sourceChannelCountIsGreaterThanDestChannelCount
        }
        for (index, channel) in destinationChannels.enumerated() {
            guard numberOfChannels > channel else {
                throw MyError.invalidChannelIndexSent
            }
            if channels.indices.contains(index) {
                channelMap[Int(channel)] = Int32(channels[index])
            } else {
                channelMap[Int(channel)] = -1 // -1 leaves this destination channel silent
            }
        }

        // Assign the channel map to the audio unit's ChannelMap property.
        // When you want to get the audio device's input data, the connection should be:
        // AUHAL (output scope, element 1) -> Destination Unit (input scope, input element)
        // Input - 1, Output - 0
        // https://developer.apple.com/library/archive/technotes/tn2091/_index.html
        let status = AudioUnitSetProperty(
            inputUnit,
            kAudioOutputUnitProperty_ChannelMap,
            kAudioUnitScope_Output,
            1,
            channelMap,
            mapSize
        )
        if status != noErr {
            throw MyError.failedToSetChannelMap
        }
    }
    // Could do the same for iOS
    class AppDelegate: NSObject, NSApplicationDelegate {
        let engine = AVAudioEngine()

        func applicationDidFinishLaunching(_ notification: Notification) {
            let format = engine.inputNode.outputFormat(forBus: 0)
            let desiredFormat = AVAudioFormat(
                commonFormat: format.commonFormat,
                sampleRate: format.sampleRate,
                channels: 1,
                interleaved: format.isInterleaved
            )
            guard let inputUnit = engine.inputNode.audioUnit else { return }
            try! mapChannels(
                // The 2nd element doesn't really matter when we use `desiredFormat`,
                // which has 1 channel, so [1] would work as well. Or you could keep it
                // stereo, and both channels would still produce the same output.
                [1, 1],
                inputUnit: inputUnit,
                format: engine.inputNode.inputFormat(forBus: 0),
                toChannels: { count in
                    .init(0 ..< count)
                }
            )
            engine.connect(engine.inputNode, to: engine.mainMixerNode, format: desiredFormat)
            try! engine.start()
        }
    }
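To double-check that the map actually took effect, the property can be read back from the same unit; a small sketch (the helper name currentChannelMap is mine, not from the original answer):

    import AudioToolbox

    // Read the channel map back from the input unit (output scope, element 1),
    // mirroring the scope and element used in mapChannels above.
    func currentChannelMap(of unit: AudioUnit, channelCount: Int) -> [Int32]? {
        var size = UInt32(MemoryLayout<Int32>.size * channelCount)
        var map = [Int32](repeating: 0, count: channelCount)
        let status = AudioUnitGetProperty(
            unit,
            kAudioOutputUnitProperty_ChannelMap,
            kAudioUnitScope_Output,
            1,
            &map,
            &size
        )
        return status == noErr ? map : nil
    }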
Panning works in both cases - using a stereo format with a mapping like [1, 1] or [0, 0], or using the mono format. For example: engine.inputNode.pan = -0.7.
And most importantly, only the channels you map will produce sound. For example, with a [1, 1] channel map, if I switch the cable from channel 1 to channel 0, I get no sound.
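Likewise, with the mapChannels function above, mapping only one source channel and leaving the other destination slot unfilled mutes it (a hypothetical variation, using the same inputUnit as before):

    // Source channel 1 goes to destination 0; destination 1 receives -1 and stays silent.
    try mapChannels(
        [1],
        inputUnit: inputUnit,
        format: engine.inputNode.inputFormat(forBus: 0),
        toChannels: { count in .init(0 ..< count) }
    )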
macOS Note
Make sure you enable Audio Input under App Sandbox (the com.apple.security.device.audio-input entitlement) and add a microphone usage description (NSMicrophoneUsageDescription) to Info.plist, otherwise the engine won't produce sound.
Answer 2
Score: -1
You have to do the channel mapping on the input node like so:
    engine.inputNode.auAudioUnit.channelMap = [1, 1]
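For completeness, a minimal sketch of this one-liner in context (assuming channel index 1 carries the guitar); note that the question's EDIT 2 reports this mapping only partially worked:

    import AVFAudio

    let engine = AVAudioEngine()

    // Feed input channel 1 (the guitar) into both output channels of the input node.
    engine.inputNode.auAudioUnit.channelMap = [1, 1]

    engine.connect(engine.inputNode, to: engine.mainMixerNode,
                   format: engine.inputNode.inputFormat(forBus: 0))
    try! engine.start()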