How can I save the raw data (the buffer) captured by the microphone?


Denis Candido

May 29, 2017, 8:03:42 PM
to AudioKit Users Group
Hello,

My plan is to do some processing on the raw data after capturing it, saving it to a .raw file or something similar. Can I do this with current AudioKit support?

If possible, I also need to send this raw data via JSON to my Python server: some way to convert the buffer to plain binary data so I can send it over REST.
Is this possible with AudioKit, or will I have to descend into the deep and dark world of Apple's documentation?

//Denis
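For the Python-server side of the question above, one workable scheme (my own assumption; the thread never settles on a wire format) is to pack the float samples as little-endian 32-bit floats, base64-encode them into a JSON field, and unpack them on the server:

```python
import base64
import json
import struct

def decode_samples(payload: str) -> list[float]:
    """Decode a JSON message of the form {"samples": "<base64>"} back
    into a list of float samples (little-endian 32-bit floats)."""
    message = json.loads(payload)
    raw = base64.b64decode(message["samples"])
    count = len(raw) // 4  # 4 bytes per float32
    return list(struct.unpack("<%df" % count, raw))

# Round trip with a few fake samples (all exactly representable in float32).
samples = [0.0, 0.5, -0.25, 1.0]
raw = struct.pack("<%df" % len(samples), *samples)
payload = json.dumps({"samples": base64.b64encode(raw).decode("ascii")})
print(decode_samples(payload))  # [0.0, 0.5, -0.25, 1.0]
```

The Swift client would do the mirror image: copy the tap's samples into `Data` and base64-encode it into the JSON body of the POST.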

Aurelius Prochazka Ph.D.

May 29, 2017, 8:09:46 PM
to AudioKit Users Group
Take a look at how we tap the data off of nodes to create the plots.  You can probably post the data to your server instead of plotting it.  Or at least save it for sending on a separate thread.

Aure
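The "save it for sending on a separate thread" idea above can be sketched as a producer/consumer queue: the audio callback only enqueues buffers, and a worker thread drains the queue and does the slow network work. This is my own illustration in Python; the real app would do the equivalent in Swift (e.g. with a `DispatchQueue`):

```python
import queue
import threading

buffers = queue.Queue()  # audio callback pushes, worker pops
sent = []                # stands in for "data that reached the server"

def sender():
    # Worker thread: block on the queue and hand each buffer to the
    # network layer; a None sentinel shuts it down.
    while True:
        item = buffers.get()
        if item is None:
            break
        sent.append(item)

worker = threading.Thread(target=sender)
worker.start()

# Audio-callback side: enqueue captured blocks without blocking on I/O.
for block in ([0.1, 0.2], [0.3, 0.4]):
    buffers.put(block)

buffers.put(None)  # signal shutdown
worker.join()
print(sent)  # [[0.1, 0.2], [0.3, 0.4]]
```

The point of the pattern is that the tap callback never waits on the network; it only appends to the queue.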

Denis Candido

Jun 1, 2017, 11:39:59 PM
to AudioKit Users Group
Hello Aure,

I tried something like this:

override func viewWillAppear(_ animated: Bool) {
    mic = AKMicrophone()
    installTap(mic!)
}

override func viewWillDisappear(_ animated: Bool) {
    mic?.avAudioNode.removeTap(onBus: 0)
}

func installTap(_ input: AKNode) {
    input.avAudioNode.installTap(onBus: 0, bufferSize: 1024, format: AudioKit.format) { [weak self] (buffer, time) -> Void in
        self?.signalTracker(didReceivedBuffer: buffer, atTime: time)
    }
}

func signalTracker(didReceivedBuffer buffer: AVAudioPCMBuffer, atTime time: AVAudioTime) {
    let elements = UnsafeBufferPointer(start: buffer.floatChannelData?[0], count: 1024)

    print(elements)
    print(elements.count)
}

But it doesn't print what I expected... I expected an array of Floats, but what I get is this:

UnsafeBufferPointer(start: 0x00007fbf1f80bc00, count: 1024)
1024


Can you please help me with this?

Denis Candido

Jun 1, 2017, 11:40:01 PM
to AudioKit Users Group
Hello Aure,

Problem solved with:

func signalTracker(didReceivedBuffer buffer: AVAudioPCMBuffer, atTime time: AVAudioTime) {
    // Read only as many frames as the buffer actually holds; the tap
    // may deliver a different size than the one requested.
    let count = Int(buffer.frameLength)
    let elements = UnsafeBufferPointer(start: buffer.floatChannelData?[0], count: count)

    var sample = [Float]()

    for i in 0..<count {
        sample.append(elements[i])
    }

    print(sample)
    print(sample.count)
}


I saw it in AKFFTTap. Thanks.


Denis Candido

Jun 2, 2017, 1:03:04 PM
to AudioKit Users Group
Hello Aure,

Can you tell me the difference between these two snippets?

The way AKFFTTap.swift reads the buffer samples:

let elements = UnsafeBufferPointer(start: buffer.floatChannelData?[0], count: self.bufferSize)

for i in 0..<self.bufferSize {
    self.sample.append(elements[i])
}


The similar way used in AKAmplitudeTap.swift (similar, except that it goes on to calculate the RMS from the samples):

for i in 0 ..< Int(self.bufferSize) {
    self.sample.append(Float((buffer.floatChannelData?.pointee[i]) ?? 0.0))
}


Is there any difference between them?


Thanks,
Denis. 
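On the RMS that AKAmplitudeTap computes from these samples: over one block it is just the square root of the mean of the squared samples. A minimal illustration in Python (my own sketch, not AudioKit code):

```python
import math

def rms(samples: list[float]) -> float:
    """Root-mean-square amplitude of one block of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

print(rms([0.5, -0.5, 0.5, -0.5]))  # 0.5
```

For audio, RMS is the usual single-number amplitude for a block, since it is insensitive to the sign of each sample.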

Aurelius Prochazka Ph.D.

Jun 2, 2017, 6:08:07 PM
to AudioKit Users Group
AKFFTTap.swift in AudioKit 3.7 doesn't have that code. What version are you looking at? In any event, there may not have been any particular reason one way was chosen over the other; it could just come down to when each was written.

Aure