Re: Home Audio Project


Scott Baillie

11 Jun 2013, 03:32:20
to beagl...@googlegroups.com, jsm...@nc.rr.com

James, I think I can answer some of your questions, but not all of them yet.
I am doing something similar to you: I am using the BeagleBoard-xM to implement guitar effects.
In my case I am writing a valve amplifier preamp simulation, and I am using the method of
electronic circuit simulation.
The BeagleBoard-xM has a TPS65950 codec chip on it which is capable of a 48 kHz sample rate
and 16-bit sample resolution. The BeagleBoard-xM also has a Texas Instruments C64x+ fixed-point
DSP.
I have a working program now where the ARM processor reads audio data from the audio capture device
and sends it to the DSP; the DSP processes the data and sends it back to the ARM,
and the ARM then sends it to the audio playback device.
I am quite impressed with the performance of the DSP when I use fixed-point maths
for the calculations, but floating-point maths is fairly slow.
The biggest problem I am having is the latency introduced by the Linux audio driver on the ARM side.
I would guess that I am getting about 50-80 ms of latency, which is too much for a real-time guitar
effects application, because you hear a slight delay between playing a note and hearing it in
the speakers.
I am using the tidspbridge driver in the Linux kernel to communicate between Linux and the DSP.
There is a user-space library that goes with the tidspbridge driver, which I use on the ARM side, and
also a framework for building your DSP code using the Texas Instruments compiler, which is a free download.
So I don't have to worry about the code for communicating between the two processors; that
code is already written for me, I just have to write the DSP code for my algorithm.
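
For anyone curious, the ARM-side loop looks roughly like this (a trimmed sketch: the dsp_process_block() call is just a placeholder for the tidspbridge send/receive, and the device name, block size and latency request are only examples, not my exact settings):

/* Rough sketch of the ARM-side loop: capture a block with ALSA, hand it to
 * the DSP, play the processed block back. Error handling is trimmed. */
#include <alsa/asoundlib.h>
#include <stdint.h>

#define FRAMES 64
#define RATE   48000

/* Placeholder for the tidspbridge send/receive round trip to the C64x+. */
static void dsp_process_block(int16_t *buf, int frames) { (void)buf; (void)frames; }

int main(void)
{
    snd_pcm_t *cap, *play;
    int16_t buf[FRAMES * 2];                  /* stereo, 16-bit samples */

    snd_pcm_open(&cap,  "default", SND_PCM_STREAM_CAPTURE,  0);
    snd_pcm_open(&play, "default", SND_PCM_STREAM_PLAYBACK, 0);
    snd_pcm_set_params(cap,  SND_PCM_FORMAT_S16_LE, SND_PCM_ACCESS_RW_INTERLEAVED,
                       2, RATE, 1, 10000);    /* ask for ~10 ms of buffering */
    snd_pcm_set_params(play, SND_PCM_FORMAT_S16_LE, SND_PCM_ACCESS_RW_INTERLEAVED,
                       2, RATE, 1, 10000);

    for (;;) {
        if (snd_pcm_readi(cap, buf, FRAMES) < 0)
            snd_pcm_prepare(cap);             /* recover from an overrun */
        dsp_process_block(buf, FRAMES);       /* DSP round trip happens here */
        if (snd_pcm_writei(play, buf, FRAMES) < 0)
            snd_pcm_prepare(play);            /* recover from an underrun */
    }
}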

I am investigating the source of the latency in the Linux audio driver because I want to see if I can
improve it. If I can, then I would say the BeagleBoard-xM is suitable for real-time audio
processing; otherwise I would say it is just a little bit too slow.

Another possibility would be to investigate some of the other Texas Instruments evaluation boards
and see if there is another board with better specs, but I would expect that you would have to buy a
software development kit for the board, and it would probably only run on Windows, which is unacceptable
for me.


Vladimir Pantelic

11 Jun 2013, 03:33:40
to beagl...@googlegroups.com
jsm...@nc.rr.com wrote:
>
> I want to start a little home audio project. I effectively want to
> reproduce (simulate?) some guitar effects boxes I use. I'm not really
> interested in any open-source filter/effects software, because one
> main goal is to design and code that stuff myself (my graduate work
> was in digital signal processing). My biggest issue is figuring out
> the hardware to support the basic goal: An open source RTOS
> controlling platform and farming the actual processing to some kind of
> DSP-ish hardware. In researching this, I ended up at BeagleBoard.
>
> The first thing I want is some kind of open source OS that supports
> the primary controlling system. I want to develop, test and have some

You can use Linux on the Beaglebone.

> The second thing I would like to do is push the effects processing to
> some kind of DSP hardware. I can imagine some scenarios, but this is
> where I am really out of my depth. I have some superficial
> experience with DSP interaction, but the low-level details where
> hidden behind various APIs. So, I'm really hunting in the dark here
> because this seems like the really hard thing to do from scratch. How
> do I go about figuring out whether I can even put together the
> hardware so that I /could/ communicate with say a Motorola or Texas
> Instruments DSP from the RTOS? I'd love to find a setup where I
> could (or had to) write my own device drivers, but honestly I don't
> know if my understanding of Linux device drivers even translates to
> the DSP world. Do you flash code to the DSP? Would I have to write
> it in assembly? Assuming I can get the hardware cheaply, am I even
> going to be able to find open source environments to support
> communicating with it?

No need to flash anything; there are DSP frameworks that allow you to
load your code into the DSP at runtime.

There is no need to write any drivers either; you can use the existing
frameworks to write code for the DSP. You can start in C and move over
to assembly if needed.

Take note that the ARM side of the Beaglebone is already quite
powerful for audio processing.
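
To give an idea of what "start in C" means, this is the sort of routine you would feed to the TI compiler; just a block gain in Q15 fixed point, with all the framework glue (node setup, messaging) left out, and the function name invented for the example:

#include <stdint.h>

/* Apply a Q15 fixed-point gain to a block of 16-bit samples and saturate.
 * The C64x family handles this kind of integer loop very quickly. */
void apply_gain_q15(int16_t *buf, int n, int16_t gain_q15)
{
    int i;
    for (i = 0; i < n; i++) {
        int32_t y = ((int32_t)buf[i] * gain_q15) >> 15;  /* Q15 multiply */
        if (y >  32767) y =  32767;                      /* clamp to 16 bits */
        if (y < -32768) y = -32768;
        buf[i] = (int16_t)y;
    }
}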

Vladimir Pantelic

11 Jun 2013, 03:37:37
to beagl...@googlegroups.com
Scott Baillie wrote:

> I am investigating the source of the latency in the linux audio driver
> because I want to see if I can
> improve the latency, if I can then I would say that the beagleboard xm
> is suitable for real time audio
> processing but otherwise I would say its just a little bit too slow.

Another option is to make the DSP access the audio codec directly,
thus avoiding the latency of piping the audio data between the ARM and the
DSP. It involves writing a McBSP driver for the DSP side, but that
should be covered by existing code for TI DSPs...

> Another possibility would be to investigate some of the other texas
> instruments evaluation boards
> and see if there another board with better specs but I would expect
> that you would have to buy a
> software development kit for the board and it would only run on
> windows which is unacceptable

There are DSP-only boards with codecs, but that setup is close to
using the McBSP directly from the DSP on a BeagleBoard. Unless you
need a floating-point DSP, which the Beagle does not offer...



Scott Baillie

11 Jun 2013, 03:39:50
to beagl...@googlegroups.com, jsm...@nc.rr.com

Sorry, I have one more thought: if I can't improve the latency of the Linux audio driver,
I will see if I can get a USB audio device that has low latency. If I can find a low-latency USB audio
device for which a Linux driver exists, then I believe the BeagleBoard-xM would
be a good platform for digital guitar effects.

Also, I would prefer 24-bit sample resolution rather than 16-bit, so I will very
likely explore the USB audio device option.


Siji Sunny

11 Jun 2013, 03:47:35
to beagl...@googlegroups.com

I want to start a little home audio project.  I effectively want to reproduce (simulate?) some guitar effects boxes I use.  I'm not really interested in any open-source filter/effects software, because one main goal is to design and code that stuff myself (my graduate work was in digital signal processing).   My biggest issue is figuring out the  hardware to support the basic goal:  An open source RTOS controlling platform and farming the actual processing to some kind of DSP-ish hardware.  In researching this, I ended up at BeagleBoard.

The first thing I want is some kind of open source OS that supports the primary controlling system.  I want to develop, test and have some kind of graphical/plotting tool on a desktop box, so I'd abstract the physical controlling interface in a way that I could manage it from external software rather than physical hardware (knobs, switches, etc).  I'm also not too concerned with A/D conversion in real-time; in a first pass, I would be very happy to abstract the digital input/output interface and handle that externally too.  I have (or had) a lot of experience w/ VxWorks (on PowerPC) and ThreadX on ARM, but I'm thinking I shouldn't expect to find open source equivalents of their respective development environments.  I'm actually thinking that the main requirement here is going to be based on the second thing.

The second thing I would like to do is push the effects processing to some kind of DSP hardware.  I can imagine some scenarios, but this is where I am really out of my depth.  I have some superficial experience with DSP interaction, but the low-level details were hidden behind various APIs.  So, I'm really hunting in the dark here because this seems like the really hard thing to do from scratch.  How do I go about figuring out whether I can even put together the hardware so that I could communicate with, say, a Motorola or Texas Instruments DSP from the RTOS?  I'd love to find a setup where I could (or had to) write my own device drivers, but honestly I don't know if my understanding of Linux device drivers even translates to the DSP world.  Do you flash code to the DSP?  Would I have to write it in assembly?  Assuming I can get the hardware cheaply, am I even going to be able to find open source environments to support communicating with it?



I think you should explore the possibility of using GStreamer and the GStreamer APIs to achieve this.
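
For example, a minimal GStreamer capture-to-playback pipeline in C looks like this; effects would be inserted as extra elements (or a custom plugin) between the source and the sink, and the element names below are just the stock ALSA ones:

#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    /* alsasrc -> convert/resample -> alsasink; effect elements go in between */
    GstElement *pipeline = gst_parse_launch(
        "alsasrc ! audioconvert ! audioresample ! alsasink", NULL);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));   /* run until killed */
    return 0;
}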

Have I found the right place?

If you have any advice or you can recommend any pointers to other forums, tech sites, whatever, I'd really appreciate it.

Or if you can quickly and authoritatively squash my dreams with big dollar signs, that's cool too.

Thanks,

James Smyth




--
Siji Sunny

Scott Baillie

11 Jun 2013, 04:09:19
to beagl...@googlegroups.com

> Another option is to make the DSP access the audio codec directly,
> thus avoiding the latency of piping the audio data between the ARM and the
> DSP. It involves writing a McBSP driver for the DSP side, but that
> should be covered by existing code for TI DSPs...

Hi Vladimir,

That would be a great solution, but I don't see how to do this easily with
the DSP/BIOS version 5 that I am using. There is an API for the DSP that
comes with DSP/BIOS version 5, but it does not cover the things that I would
need to do in the driver. In order for the DSP to read and write audio data,
I would need to set up the DMA, the L3 and L4 interconnects, access the
registers on the McBSP peripheral, access the codec over the I2C bus,
and other things. While that is all possible in theory, I think it would be very
difficult to do.

I am quite happy to be corrected; if someone knows an easy way to do it
I would love to know.


Vladimir Pantelic

11 Jun 2013, 08:33:54
to beagl...@googlegroups.com
Scott Baillie wrote:
>
> another option is to make the DSP access the audio codec directly,
> thus avoiding the latency of piping the audio data between ARM and
> DSP. it involves writing a MsBSP driver for the DSP side, but that
> should be covered by existing code for TI DSPs...
>
>
> Hi Vladimir,
>
> That would be a great solution but I dont see how to do this easily with
> the dspbios version 5 that I am using. There is an API for the DSP that
> comes with dspbios version 5 but it does not cover the things that I would
> need to do in the driver. In order for the DSP to read and write audio
> data
> I would need to setup the DMA, the L3 and L4 interconnects, access the
> registers on the MCBSP peripheral , access the codec using the i2c bus
> and other things. While that is all possible in theory, I think it
> would be very
> difficult to do.

I don't know if anybody ever did that, but it should be possible; the McBSP
is one of the peripherals that can be mapped to the DSP side.

Also, setting up DMA on the DSP should be doable...

Maybe do the I2C setup of the codec on the Linux side for a start...

Scott Baillie

11 Jun 2013, 09:44:17
to beagl...@googlegroups.com

> I don't know if anybody ever did that, but it should be possible; the McBSP
> is one of the peripherals that can be mapped to the DSP side.
>
> Also, setting up DMA on the DSP should be doable...
>
> Maybe do the I2C setup of the codec on the Linux side for a start...

Thanks for the encouragement, Vladimir, but what you are suggesting is a huge project;
I estimate at least six man-months, and even if I did do all that I would still be stuck with the puny
16-bit A/D and D/A converters on the BeagleBoard-xM. You need 24-bit sample resolution to get
a good quality sound, so my plan is to try to find a low-latency USB audio device that is supported
by the Linux kernel.

Also, from what I can tell, GStreamer is not a valid option, because the focus of GStreamer is not
on latency but on throughput. A digital guitar effects processor has to have good throughput AND
good latency. The GStreamer approach still uses the ALSA audio driver to capture and play back
audio, so it will have the same poor latency that I am experiencing.


Vladimir Pantelic

11 Jun 2013, 10:06:12
to beagl...@googlegroups.com
Scott Baillie wrote:
>
> I don't know if anybody ever did that, but it should be possible
> McBSP
> is one of the peripherals that can be mapped to the DSP side.
>
> Also, setting up a DMA on the DSP should be doable...
>
> maybe do I2C setup of the codec on the Linux side for a start...
>
>
> Thanks for the encouragement Vladimir but what you are suggesting is a
> huge project,
> I estimate at least 6 man months and even if I did do all that I would
> still be stuck with the puny
> 16 bit A/D and D/A converters on the beagleboard xm. You need 24 bit
> sample accuracy to get
> a good quality sound, so my plan is to try and find a low latency USB
> audio device that is supported
> by the linux kernel.

USB audio will also go through ALSA, so you will add USB latency on
top of ALSA latency, no?

> Also. from what I can tell, gstreamer is not a valid option because
> the focus of gstreamer is not
> on latency but on throughput. A digital guitar effects processor has
> to have good throughput AND
> good latency. The gstreamer approach still makes use of the alsa audio
> driver to capture and playback
> audio so it will have the same poor latency that I am experiencing.

Maybe it would make sense, as a first step, to try to reduce ALSA latency with the
onboard codec as much as possible, since that will be
the baseline for all the other solutions.
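
Something along these lines would show how far down the onboard codec path will actually go; the device name, rate and frame counts below are only guesses, and the driver may silently round the requests up, which is exactly the thing to check:

#include <stdio.h>
#include <alsa/asoundlib.h>

/* Ask the PCM for small periods/buffers, then print what was really granted. */
static void request_small_buffers(snd_pcm_t *pcm)
{
    snd_pcm_hw_params_t *hw;
    snd_pcm_uframes_t period = 64, buffer = 128;  /* ~1.3 ms / ~2.7 ms at 48 kHz */
    unsigned int rate = 48000;
    int dir = 0;

    snd_pcm_hw_params_alloca(&hw);
    snd_pcm_hw_params_any(pcm, hw);
    snd_pcm_hw_params_set_access(pcm, hw, SND_PCM_ACCESS_RW_INTERLEAVED);
    snd_pcm_hw_params_set_format(pcm, hw, SND_PCM_FORMAT_S16_LE);
    snd_pcm_hw_params_set_channels(pcm, hw, 2);
    snd_pcm_hw_params_set_rate_near(pcm, hw, &rate, &dir);
    snd_pcm_hw_params_set_period_size_near(pcm, hw, &period, &dir);
    snd_pcm_hw_params_set_buffer_size_near(pcm, hw, &buffer);
    snd_pcm_hw_params(pcm, hw);

    /* See what the driver actually accepted. */
    snd_pcm_hw_params_get_period_size(hw, &period, &dir);
    snd_pcm_hw_params_get_buffer_size(hw, &buffer);
    printf("granted: period=%lu frames, buffer=%lu frames\n",
           (unsigned long)period, (unsigned long)buffer);
}

int main(void)
{
    snd_pcm_t *pcm;
    if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_CAPTURE, 0) == 0)
        request_small_buffers(pcm);
    return 0;
}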

Scott Baillie

11 Jun 2013, 11:23:49
to beagl...@googlegroups.com

> USB audio will also go through ALSA, so you will add USB latency on
> top of ALSA latency, no?

Well, no.

If I can find a low-latency USB audio device then my problems are solved.

The problem is not ALSA in general; the problem is the ALSA driver on the
BeagleBoard-xM. This specific driver has a high latency.
I believe the reasons for the high latency are the following:

1. The TPS65950 audio chip is connected to the DM3730 processor using
the I2C bus, which is an inherently slow interface.

2. The TPS65950 audio chip is connected to the McBSP2 interface of the
DM3730 processor, which has a massive hardware buffer of 5k.

jsm...@nc.rr.com

11 Jun 2013, 19:57:46
to Scott Baillie, beagl...@googlegroups.com
Scott,

Thanks for this, and I see there are some other comments on my original post. There's obviously a wealth of information here, and it never occurred to me that there are these eval board setups that do more or less what I'd worried about.

There is also some dude who has posted an entire class around the BeagleBoard, http://elinux.org/Category:ECE497. It looks like he is using the BeagleBone, but that sure seems helpful.

Thanks again,

James

Vladimir Pantelic

12 Jun 2013, 00:08:28
to beagl...@googlegroups.com
On 06/11/2013 05:23 PM, Scott Baillie wrote:
>
> usb audio will also go through alsa, so you will add usb latency on
> top of alsa latency, no?
>
>
> Well, no.
>
> If I can find a low latency USB audio device then my problems are solved.
>
> The problem is not alsa in general, the problem is the alsa driver on the
> beagleboard xm. This specific driver has a high latency.
> I believe that the reasons for the high latency are the following :
>
> 1. The TPS65950 audio chip is connected to the the DM3730 processor using
> the i2c bus which is inherently a slow interface.

I2C is only used to set up the codec, not for data transfer.


> 2. The TPS65950 audio chip is connected to the the MCBSP2 interface of the
> DM3730 processor which has a massive hardware buffer of 5k.

And are you sure that one always has to use the full size of that buffer?


Vladimir Pantelic

12 Jun 2013, 00:11:23
to beagl...@googlegroups.com
On 06/11/2013 10:09 AM, Scott Baillie wrote:
>
> another option is to make the DSP access the audio codec directly,
> thus avoiding the latency of piping the audio data between ARM and
> DSP. it involves writing a MsBSP driver for the DSP side, but that
> should be covered by existing code for TI DSPs...
>
>
> Hi Vladimir,
>
> That would be a great solution but I dont see how to do this easily with
> the dspbios version 5 that I am using. There is an API for the DSP that
> comes with dspbios version 5 but it does not cover the things that I would
> need to do in the driver. In order for the DSP to read and write audio data
> I would need to setup the DMA, the L3 and L4 interconnects, access the
> registers on the MCBSP peripheral , access the codec using the i2c bus
> and other things. While that is all possible in theory, I think it would
> be very
> difficult to do.

Another thought is to leave the I2C and McBSP handling on the Linux
side, but leave out user space: the kernel would set up the McBSP and DMA into
buffer(s), and the DSP would read/write directly from/to those buffer(s).
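
Purely as a thought experiment (none of this maps to a real memory layout or framework; the struct, names and sizes are invented), the DSP side of such a scheme could be as simple as servicing whichever half of a ping-pong buffer the DMA is not currently using:

#include <stdint.h>

#define BLOCK_FRAMES 64

/* Shared memory laid out by the kernel: two capture halves filled by DMA
 * from the McBSP, two playback halves drained by DMA to the McBSP. */
struct shared_audio {
    volatile uint32_t dma_half;          /* the half the DMA is working on now */
    int16_t rx[2][BLOCK_FRAMES];
    int16_t tx[2][BLOCK_FRAMES];
};

/* DSP side: process the half the DMA has just finished with. */
void dsp_service(struct shared_audio *sh, void (*fx)(int16_t *, int))
{
    uint32_t idle = sh->dma_half ^ 1u;   /* the half not being transferred */
    int i;

    fx(sh->rx[idle], BLOCK_FRAMES);      /* apply the effect in place */
    for (i = 0; i < BLOCK_FRAMES; i++)   /* hand the result to playback */
        sh->tx[idle][i] = sh->rx[idle][i];
}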

Scott Baillie

12 Jun 2013, 02:00:45
to beagl...@googlegroups.com


> I2C is only used to set up the codec, not for data transfer.

Yes, but what if the driver has to access this bus at a time-critical moment, such as
between filling the playback buffer and starting the PCM device? That delay forces
you to specify a larger playback buffer than you actually need, just so you can start the
PCM device.




> And are you sure that one always has to use the full size of that buffer?

It appears to always be full on the playback side, but not always full on the capture side.
A change to the driver should be able to change this behaviour.


Vladimir Pantelic

12 Jun 2013, 02:43:57
to beagl...@googlegroups.com
Scott Baillie wrote:
>
>
> i2c is only used to setup the codec, not for data transfer.
>
>
> Yes but what if the driver has to access this bus at a time critical
> time such as the time
> between filling the playback buffer and starting the PCM device. This
> delay forces
> you to specify a larger playback buffer than you actually need just so
> you can start the
> PCM device.

I don't see a problem here: you need I2C to set up stuff like word
length, sample rate and volumes; you don't need it during the actual
data transfer.

> and you are sure that one has to use the full size of that buffer
> always?
>
> It appears to be always full on the playback side , not always full on
> the capture side.
> A change to the driver should be able to change this behaviour.

Might be worth a look.

Scott Baillie

12 Jun 2013, 03:42:30
to beagl...@googlegroups.com
Hi Vladimir,

My observation comes from profiling the code and seeing what it actually does,
rather than guessing at what it should do. Perhaps you can help me
investigate the sources of latency in the ALSA driver code by examining and
profiling the code rather than just making a guess.

jsm...@nc.rr.com

18 Jun 2013, 13:27:16
to beagl...@googlegroups.com
Scott, I'm obviously new to this hardware, but I was comparing the BeagleBoard and BeagleBoard-xM specs and noticed that the xM added a USB hub.  It's not clear to me from your posts whether this would be a problem, but it seems like the kind of thing that could add some latency.

Scott Baillie

20 Jun 2013, 11:53:35
to beagl...@googlegroups.com, jsm...@nc.rr.com

I certainly hope it is not the USB hub causing the latency!

I hope I didn't offend Vladimir with my last post; I can see he knows a lot about the OMAP processor, and
I guess I got a bit frustrated with all the questions. At this point I don't truly know what the cause of
the latency is, so I have set it aside for now and just focused on the DSP algorithm.

The DSP is certainly not the problem. I am really impressed at how fast it operates.

My algorithm at the moment is a saturation curve: the guitar signal gets a constant gain for small signals, but
the gain reduces for larger signals. It actually sounds pretty good, and the DSP can process 64 samples
in about 300 microseconds, which I think is pretty good.
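
To illustrate the idea (this is not my exact curve, and the knee and slope values below are made up for the example), a per-sample saturation in Q15 fixed point looks something like this:

#include <stdint.h>

#define KNEE_Q15   8192      /* ~0.25 of full scale: linear region below this */
#define SLOPE_Q15  8192      /* gain of 0.25 applied above the knee           */

/* Unity gain for small inputs, reduced gain above the knee, hard clamp at
 * full scale. Continuous at the knee and symmetric for negative samples. */
static inline int16_t saturate_sample(int16_t x)
{
    int32_t ax = x < 0 ? -(int32_t)x : (int32_t)x;
    if (ax <= KNEE_Q15)
        return x;                                          /* constant-gain region */
    int32_t excess = ((ax - KNEE_Q15) * SLOPE_Q15) >> 15;  /* compressed part */
    int32_t y = KNEE_Q15 + excess;
    if (y > 32767) y = 32767;                              /* clamp to full scale */
    return (int16_t)(x < 0 ? -y : y);
}

void saturate_block(int16_t *buf, int n)
{
    for (int i = 0; i < n; i++)
        buf[i] = saturate_sample(buf[i]);
}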

Vladimir Pantelic

21 Jun 2013, 05:33:21
to beagl...@googlegroups.com
Scott Baillie wrote:
>
> I certainly hope it is not the USB hub causing the latency
> !!!!!!!!!!!!!!!!!!!!!
>
> I hope I didn't offend Vladimir with my last post, I can see he knows
> a lot about the OMAP processor and

not offended at all :)

> I guess I got a bit frustrated with all the questions. At this point
> in time, I don't truly know what the cause of
> the latency is and I have ignored it for now and just focused on the
> DSP algorithm.
>
> The DSP is certainly not the problem. I am really impressed at how
> fast it operates.
>
> My algorithm at the moment is a saturation curve. The guitar signal
> has a constant gain for small signals but
> the gain reduces for larger signals. It actually sounds pretty good
> and the DSP can process 64 samples
> in about 300 micro seconds I think is pretty good.

If you have some code to share, put it up on GitHub; I am sure others
would like to use the DSP too.

(And I like having a URL to point people at.)

David Witten

21 Jun 2013, 11:39:39
to beagl...@googlegroups.com
I am very interested in this thread.  

I've been considering using a TI OMAP or Sitara processor for a couple of similar projects for some time.  One is potentially a pure open-source project.  The other has some backers hoping to make something they might sell, but who don't mind if I contribute non-strategic bits back for others.

I share many of the concerns expressed here.  I am told that 50 ms of end-to-end latency is too much.  I am very interested in exploring all of these approaches to mitigating that problem.

I have been very happy so far using a BB-xM for rough prototyping, but I am concerned about expending a lot of effort on hardware that will not be available at satisfactory volumes for a niche manufacturer.  I want to work (if possible) with an ARM CPU tightly coupled to a floating-point DSP.

I am concerned that the only product within the OMAP line that is supported for my purposes seems to be the OMAP-L138.  Though overall performance of the ARM side is not the main concern, I am worried that its ARM9 core is a major step down in performance, and I wonder what other limitations it may have that I'm not considering.  I want to use the DSP for effects processing, but I need the control of the system, and ideally the drivers, to live in some kind of OS, preferably Linux.

Does anyone know whether this latter concern is reasonable?

I am also wondering whether the 'Programmable Real-Time Unit Subsystem' (PRUSS) processors can be used to reduce latency in a McBSP<->PRU<->DMA<->CPU chain on any of these processors?


Dave