ios – swift – How to get audio output from the mic on iPhone at an 8 kHz sample rate with a buffer size of 16 ms & 1 channel?


For a demo app on iPhone, I need to get audio output from the mic at 8 kHz with a buffer size of 16 ms, so that it gives me 128 bytes of frame length at a time (8000 samples/s × 0.016 s = 128 samples, i.e. 128 bytes at 1 byte per µ-law sample, 1 channel).

I’ve tried AVAudioEngine by installing a tap on the input node, but there is no way of setting the sample rate and buffer size there. Any suggestions on how to achieve the 128-byte output?
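One thing I’ve been considering (not sure whether it’s the right approach) is asking AVAudioSession for a preferred sample rate and IO buffer duration before starting the engine. As far as I understand these are only requests, so this is just a sketch and the actual values still have to be read back:

    // Sketch: request 8 kHz / 16 ms from the audio session before engine.start().
    // These are preferences only; the hardware may still report different values.
    // Assumes AVFoundation is imported and this runs in a throwing context.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .measurement, options: [])
    try session.setPreferredSampleRate(8000)
    try session.setPreferredIOBufferDuration(0.016) // 16 ms
    try session.setActive(true)
    print("actual sample rate:", session.sampleRate)
    print("actual IO buffer duration:", session.ioBufferDuration)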

Thanks

private func startRecording() throws {
    checkPermission()
    if !self.hasPermissionToRecord {
        return
    }
    if engine == nil {
        engine = AVAudioEngine()
    }
    let inputNode = engine.inputNode
    let microphoneOutputFormat: AVAudioFormat = inputNode.outputFormat(forBus: 0)
    let microphoneAudioStream = microphoneOutputFormat.formatDescription.audioStreamBasicDescription
    let microphoneSampleRate = microphoneAudioStream!.mSampleRate

    // Target format: 8 kHz, mono, 8-bit µ-law (1 byte per frame)
    var mulawDescription = AudioStreamBasicDescription(
        mSampleRate: 8000,
        mFormatID: kAudioFormatULaw,
        mFormatFlags: 0,
        mBytesPerPacket: 1,
        mFramesPerPacket: 1,
        mBytesPerFrame: 1,
        mChannelsPerFrame: 1,
        mBitsPerChannel: 8,
        mReserved: 0
    )

    let mulawFormat = AVAudioFormat(streamDescription: &mulawDescription)!
    converter = AVAudioConverter(from: microphoneOutputFormat, to: mulawFormat)

    // bufferSize is only a request; the system chooses the actual tap buffer size,
    // and the tap delivers buffers in the hardware input format.
    inputNode.installTap(onBus: 0, bufferSize: defaultUlawBufferSize, format: microphoneOutputFormat) { [self] (buffer: AVAudioPCMBuffer, time: AVAudioTime) in

        // How many 8 kHz frames this input buffer converts to
        let secondsInBuffer = Float64(buffer.frameLength) / microphoneSampleRate
        let ulawBufferSize = UInt32(8000 * secondsInBuffer) // e.g. 800 for a 100 ms input buffer

        // Conversion – downsample to 8 kHz µ-law
        guard let mulawBuffer = AVAudioPCMBuffer(pcmFormat: converter.outputFormat, frameCapacity: ulawBufferSize) else {
            return
        }
        var outError: NSError? = nil
        let microphoneInputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            outStatus.pointee = .haveData
            return buffer
        }
        self.converter.convert(to: mulawBuffer, error: &outError, withInputFrom: microphoneInputBlock)
        let mulawData = Data(buffer: mulawBuffer) // Data(buffer:) is my own extension that copies the buffer's bytes
        if !(receivedAudioCallback?(mulawData) ?? false) {
            stop()
        }
    }
    engine.prepare()
    try engine.start()
}

I’ve tried this code; after downsampling to 8 kHz it gives me 800 bytes at a time.
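Since 800 bytes of 8 kHz µ-law is 100 ms of audio, one idea I’m toying with is simply slicing the converted data into 128-byte (16 ms) chunks before handing them to the callback, something like the sketch below (using the same mulawData, receivedAudioCallback and stop() as above; a real version would have to carry leftover bytes over to the next tap callback):

    // Sketch: split the converted µ-law data (1 byte per frame) into 16 ms
    // chunks of 128 bytes each; leftover bytes are simply dropped here.
    let chunkSize = 128
    var offset = 0
    while offset + chunkSize <= mulawData.count {
        let chunk = mulawData.subdata(in: offset..<(offset + chunkSize))
        if !(receivedAudioCallback?(chunk) ?? false) {
            stop()
            break
        }
        offset += chunkSize
    }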
