I want to start a little home audio project. I effectively want to reproduce (simulate?) some of the guitar effects boxes I use. I'm not really interested in existing open-source filter/effects software, because one main goal is to design and code that stuff myself (my graduate work was in digital signal processing). My biggest issue is figuring out the hardware to support the basic goal: an open-source RTOS as the controlling platform, farming the actual processing out to some kind of DSP-ish hardware. In researching this, I ended up at BeagleBoard.
The first thing I want is some kind of open-source OS that supports the primary controlling system. I want to develop, test and have some kind of graphical/plotting tool on a desktop box, so I'd abstract the physical control interface in a way that lets me manage it from external software rather than physical hardware (knobs, switches, etc.). I'm also not too concerned with A/D conversion in real time; in a first pass, I would be very happy to abstract the digital input/output interface and handle that externally too. I have (or had) a lot of experience with VxWorks (on PowerPC) and ThreadX on ARM, but I'm thinking I shouldn't expect to find open-source equivalents of their respective development environments. I'm actually thinking that the main requirement here is going to be driven by the second thing.
The second thing I would like to do is push the effects processing to some kind of DSP hardware. I can imagine some scenarios, but this is where I am really out of my depth. I have some superficial experience with DSP interaction, but the low-level details were hidden behind various APIs. So I'm really hunting in the dark here, because this seems like the really hard thing to do from scratch. How do I go about figuring out whether I can even put together the hardware so that I could communicate with, say, a Motorola or Texas Instruments DSP from the RTOS? I'd love to find a setup where I could (or had to) write my own device drivers, but honestly I don't know whether my understanding of Linux device drivers even translates to the DSP world. Do you flash code to the DSP? Would I have to write it in assembly? Assuming I can get the hardware cheaply, am I even going to be able to find open-source environments to support communicating with it?
Have I found the right place?
If you have any advice or you can recommend any pointers to other forums, tech sites, whatever, I'd really appreciate it.
Or if you can quickly and authoritatively squash my dreams with big dollar signs, that's cool too.
Thanks,
James Smyth
--
For more options, visit http://beagleboard.org/discuss
---
You received this message because you are subscribed to the Google Groups "BeagleBoard" group.
Another option is to make the DSP access the audio codec directly, thus avoiding the latency of piping the audio data between the ARM and the DSP. It involves writing a McBSP driver for the DSP side, but that should be covered by existing code for TI DSPs...
I don't know if anybody has ever done that, but it should be possible; McBSP is one of the peripherals that can be mapped to the DSP side.
Also, setting up DMA on the DSP should be doable...
Maybe do the I2C setup of the codec on the Linux side for a start...
USB audio will also go through ALSA, so you will add USB latency on top of ALSA latency, no?
I2C is only used to set up the codec, not for data transfer.
And are you sure that one always has to use the full size of that buffer?