Rotary CNC + Bamboo + BBB + CRAMPS?


jonas hauptman

Feb 21, 2019, 10:31:53 PM
to Machinekit
Hi,

We are new to your group and to Machinekit, but hoping the community might have some feedback for us.  We are trying to develop a rotary 4-axis CNC router to machine bamboo poles into precise joints.  We believe this will require six motors and also a scanning function, as bamboo poles are highly irregular in size, shape, and straightness.  Our project goal is to democratize CNC rotary machining with a low-cost, high-performance machine for bamboo, a material that has huge environmental and mechanical upside for both the developed and developing world.  Presently it is difficult to use bamboo in a high-precision fashion, and we hope to change that.  Initially, we planned to use 3D-printer Arduino boards and Marlin to control the machine, but eventually realized we would have trouble independently controlling six motors and doing true 4-axis machining.  We have a little experience with LinuxCNC: I built a CNC Router Parts kit and outfitted it with a custom electronics bundle that Len from Probotix was kind enough to create for me around their standard control system (Unity).  I am a huge fan of the Probotix machines and controls, but we are trying to develop a machine that costs around $500 in total to build, including the computer, scanning camera, touch display, and the complete mechanical, electrical, and CNC system.  Our earlier prototypes used some open-source component designs and still share some common strategies with the Sienci Mill One Kit V3.  Realizing that the cost of a full computer and control system, even on Linux, was too expensive, and that Arduino with GRBL lacks the horsepower and software features we need, we are trying to develop our strategy and prototypes around the BeagleBone with a CRAMPS cape.

I am posting hoping to begin to build a community around our project, and I am looking for insights of any kind, especially around our need for a control system that supports 4-axis machining and our scanning needs.  I have attached a series of schematic and photographic summaries of our progress and look forward to input from the community.

Best regards,

Jonas Hauptman











VT BCNC.pdf

Charles Steinkuehler

Feb 24, 2019, 7:25:35 AM
to machi...@googlegroups.com
That looks like a very interesting project!

The BeagleBone should be able to handle the 4-axis machine control,
but I'm not sure about handling the vision pipeline. I know some
people have been doing machine vision projects with the BeagleBone,
but I have no personal experience in this area. I recommend asking
about machine vision on the BeagleBoard Google Group:

https://groups.google.com/forum/#!forum/beagleboard

A PC will give you *MUCH* better performance for the vision pipeline,
but then you will need something to move the motors, which means more
cost and electronics (Mesa hardware, Arduino, or even the BeagleBone).

If you're not really worried about speed, the BeagleBone will probably
be able to perform the vision tasks you need, just slowly.
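To give a concrete (and purely hypothetical) feel for what a slow scanning pass on the BBB might compute, here is a minimal Python sketch. The synthetic binary mask stands in for a thresholded camera frame, and `pole_width_per_row` is an invented helper, not part of Machinekit or any vision library; counting lit pixels per scan line gives a crude per-row estimate of the pole's apparent width.

```python
def pole_width_per_row(mask):
    """Count foreground pixels in each row of a binary image (list of lists)."""
    return [sum(row) for row in mask]

# Synthetic "pole": a vertical band whose width varies row to row,
# mimicking an irregular bamboo culm. A real frame would come from a camera.
WIDTH = 20
widths = [8, 9, 10, 9, 7]
mask = []
for w in widths:
    start = (WIDTH - w) // 2
    mask.append([1 if start <= col < start + w else 0 for col in range(WIDTH)])

print(pole_width_per_row(mask))  # -> [8, 9, 10, 9, 7]
```

On real hardware the mask would come from thresholding camera frames (e.g. via OpenCV), which is where most of the BBB's processing time would actually go.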
--
Charles Steinkuehler
cha...@steinkuehler.net

Charles Steinkuehler

Feb 24, 2019, 9:53:07 AM
to jonas hauptman, Machinekit Mailing List
I'd call 1-2 minutes for scanning slow, but that depends a bit on how
much processing you're trying to do. Given that sort of time frame, I
think the main problem you might have with the BBB is running out of
memory, but again that depends on what you're trying to do.

For the control, I'd suggest using a tablet/smart-phone and setting up a
remote interface using QtQuickVcp. You might also be able to use a
camera on the tablet for your scanning. I'm not sure if that would be
easier or harder than doing the scanning with the BBB.

NOTE: You can get a USB-OTG cable and connect most tablets directly to
the USB client port on the BBB. The tablet will see the gadget
Ethernet driver on the BBB and automatically set up networking, so you
don't have to communicate via WiFi. I do this with a 7" RCA Voyager
tablet I bought for ~$35. It's not a great tablet, but the
touch-screen works fine for a UI!

On 2/24/2019 7:33 AM, jonas hauptman wrote:
> Thanks!
>
> I am not worried if the vision scanning routine takes 1-2 minutes. Is that
> in the neighborhood of fast or slow in your opinion? Another thought would
> be to run an additional BeagleBone or Raspberry Pi to handle the human
> interface touch display and vision. It would still cost a lot less than a
> full-size PC and control system.
>
> What do you think?
>
> JH

ce...@tuta.io

Feb 24, 2019, 12:49:33 PM
to Machinekit
Stripping down the newspeak and translating to normal language, the main problem is the cost, right?

If so then maybe the BBB is not the best solution. It's certainly cool and I am also using it on my something-like-printer, but Machinekit (or LinuxCNC) can actually run on pretty much any electro-trash. Or if you cannot use trash, there are still J1900 boards in the same price range as the BBB which are quite good (or so they say) for running an RT system. You would have a whole "normal" powered computer with usable graphics. The only problem would be interfacing the steppers and the I/Os. If the parallel port is not good enough, you could, for example, go the way one user on the LinuxCNC forum went and build yourself UDP-connected MCU hardware (his project, for inspiration: http://erste.de/ethraw/ ). The CRAMPS is more for printer projects.
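The UDP-connected-MCU approach mentioned above can be sketched in miniature. The packet layout below (four signed 32-bit step deltas) is invented for illustration and is not what the ethraw project uses; a real link would also need sequencing, checksums, and position feedback.

```python
import socket
import struct

# Toy sketch of a "UDP-connected MCU": the host packs a per-axis step-count
# command into a small datagram and the MCU end unpacks it. The layout
# (X, Y, Z, A step deltas as little-endian signed 32-bit ints) is made up.
PACKET = struct.Struct("<4i")

def encode_steps(x, y, z, a):
    return PACKET.pack(x, y, z, a)

def decode_steps(data):
    return PACKET.unpack(data)

# Demonstrate round-tripping a command over a local UDP socket pair.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))              # let the OS pick a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(encode_steps(120, -45, 0, 3), rx.getsockname())
received = decode_steps(rx.recv(PACKET.size))
tx.close()
rx.close()
print(received)  # -> (120, -45, 0, 3)
```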

How many of these bamboo thingies do you want to build?

C.

On Friday, 22 February 2019 at 04:31:53 UTC+1, jonas hauptman wrote:

Oleg Gavavka

Feb 25, 2019, 4:23:05 AM
to Machinekit
Keep an eye on Revolve (https://www.thing-printer.com/revolve/).
When it is released, you could use the same system but at a better price, estimated in the $100-$120 range.



On Friday, 22 February 2019 at 05:31:53 UTC+2, jonas hauptman wrote:

Charles Steinkuehler

Feb 26, 2019, 11:15:52 AM
to machi...@googlegroups.com
FYI:
The BeagleBone-AI may be a good fit for your project:

https://www.facebook.com/photo.php?fbid=10218976824519992&set=a.2907631578284&type=3&theater

It should do machine vision _much_ better than the BBB. It's
basically the SoC from the X15 in a BeagleBone form factor. I'm
working on getting Machinekit working on this board and verifying
capes work as expected.

I couldn't say anything about it earlier, but now they've announced it
at Embedded World. :)

Charles Steinkuehler

Feb 26, 2019, 3:31:43 PM
to Chris Albertson, Machinekit Mailing List
Please keep replies on-list.

For pick-and-place, there are a few open-source projects working on
solving the problem, such as OpenPNP:

http://openpnp.org/

Often, the "back-end" performing the motion does use g-code.

On 2/26/2019 2:01 PM, Chris Albertson wrote:
> I just read TI's paper on this. They describe the workflow for using
> the machine learning and vision subsystem. And the workflow is not
> as super-horrible as I feared. In fact it seems straightforward if you are
> already familiar with machine learning and vision. For others, the
> 15-second summary is this:
>
> You develop your machine learning or vision system on high-end PC hardware
>> (at least an NVIDIA GTX 10xx GPU) under Linux. You use familiar tools like
>> TensorFlow and OpenCV. Then there is TI software that takes what you
>> have running on the PC and translates it to run on the much smaller device
>> on the BeagleBoard. The Beagle is about 100x slower, but speed is really
>> only needed for training a network, not needed to run it.
>
>
> For those not in the field: "Machine Learning" is almost a misnomer. The
> training and learning happens on the big PC, then we "freeze" a snapshot and
> move it to the tiny chip, where it never learns or changes behavior. The
> learning happens only in the lab.
>
>
> Now here is my question: The most simple use case of this that relates to
> Machinekit is a pick and place machine. This is a 2-axis machine that
> picks up a tiny part from a surface and drops it some other place on the
> surface. It does not even have a true z-axis. But the catch is finding
> the part and finding the *exact* place to drop the part. For that we need
> a camera. No one uses g-code for these machines because we can't know in
> advance how the machine is to move. So what we do is tell the machine
> where the parts are in general and where, relative to the final assembly,
> the part should go.
>
> It seems to me this machine would replace the g-code interpreter with
> different logic but otherwise could work exactly the same. Or perhaps the
> g-code is not read from a file, but there is a process that generates it in
> real time based on the camera.
>
> What I'd like is to use my Harbor Freight mini mill as a pick and place
> machine. This would be REALLY popular: simply place a little vacuum
> picker on the spindle and for very little cost you have a slow, slow pick
> and place machine suitable for hobby-level PCB assembly. The HF mill
> should be accurate enough.
>
> In any case, how were you planning to connect this vision and machine
> learning hardware to MK?


--
Charles Steinkuehler
cha...@steinkuehler.net

Chris Albertson

Feb 26, 2019, 3:58:49 PM
to Charles Steinkuehler, Machinekit Mailing List


On Tue, Feb 26, 2019 at 12:31 PM Charles Steinkuehler <cha...@steinkuehler.net> wrote:
Please keep replies on-list.

Sorry, I intended to.    

For pick-and-place, there are a few open-source projects working on
solving the problem, such as OpenPNP:


The problem is not just wanting a PnP machine but wanting to learn to use MK for a wider range of machines.  Yes, that is one solution: getting MK to read g-code from a pipe or socket and then having some process feed the pipe in real time.
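The pipe/socket idea could look roughly like this hypothetical generator, which turns camera-derived pick and place coordinates into a stream of g-code lines. The coordinates, feed rate, and the use of M64/M65 for a vacuum output are assumptions for illustration, not anyone's actual setup.

```python
def pnp_moves(pick_xy, place_xy, safe_z=5.0, work_z=0.0):
    """Yield g-code lines for one hypothetical pick-and-place cycle."""
    px, py = pick_xy
    qx, qy = place_xy
    yield f"G0 Z{safe_z:.3f}"               # retract to safe height
    yield f"G0 X{px:.3f} Y{py:.3f}"         # rapid to the detected part
    yield f"G1 Z{work_z:.3f} F200"          # descend onto the part
    yield "M64 P0"                          # assumed: vacuum on (digital out 0)
    yield f"G0 Z{safe_z:.3f}"
    yield f"G0 X{qx:.3f} Y{qy:.3f}"         # rapid to the placement target
    yield f"G1 Z{work_z:.3f} F200"
    yield "M65 P0"                          # assumed: vacuum off
    yield f"G0 Z{safe_z:.3f}"

# Positions here stand in for what the camera pipeline would report.
lines = list(pnp_moves((12.5, 30.0), (80.0, 42.25)))
print(lines[1])  # -> G0 X12.500 Y30.000
```

Each cycle could then be written to a FIFO that the interpreter reads, so the "file" is generated in real time.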

I'm still wanting to know what you are going to use the AI/vision hardware for.


--

Chris Albertson
Redondo Beach, California

Charles Steinkuehler

Feb 26, 2019, 4:31:44 PM
to machi...@googlegroups.com
On 2/26/2019 2:58 PM, Chris Albertson wrote:
> I'm still wanting to know what you are going to use the AI/vision hardware
> for.

*I* am not using vision for anything at the moment, so I'm assuming
you're asking the original poster. I'm not sure what they have
planned for vision, but I'd be interested in hearing more details
about it as well.

--
Charles Steinkuehler
cha...@steinkuehler.net

Chris Albertson

Feb 26, 2019, 7:17:36 PM
to Charles Steinkuehler, Machinekit

*I* am not using vision for anything at the moment, so I'm assuming
you're asking the original poster.  I'm not sure what they have
planned for vision, but I'd be interested in hearing more details
about it as well.

I did ask him. The answer was a little confusing.  They use vision to measure the bamboo, then put this information into a model in Fusion 360, and then output standard g-code to drive the mill.  So the vision task is complete before the g-code is written, and they don't run the mill and the vision at the same time.

The confusing part is that if they are running Fusion 360, they have a high-end PC or Mac available.  If it can run Fusion, it can run any vision software they would need.




--
website: http://www.machinekit.io blog: http://blog.machinekit.io github: https://github.com/machinekit
---
You received this message because you are subscribed to the Google Groups "Machinekit" group.
To unsubscribe from this group and stop receiving emails from it, send an email to machinekit+...@googlegroups.com.
Visit this group at https://groups.google.com/group/machinekit.
For more options, visit https://groups.google.com/d/optout.

Charles Steinkuehler

Feb 26, 2019, 9:24:42 PM
to jonas hauptman, Machinekit Mailing List
On 2/26/2019 7:27 PM, jonas hauptman wrote:
> Charles,
>
> Wow, sounds awesome. In the meantime, until we better understand the basics,
> we are trying your cape (once it shows up) for CNC, and we will build the
> scanning into a second microcomputer, ideally with the two computers able
> to talk to each other.
>
> Any idea what the new BB will cost?

Approx. $100

> Is it likely your CRAMPS cape will work
> as-is with it?

That's the plan. There may be some features that don't work, or work
differently, but those are likely to be things like reset and system
power rather than the signals used for CNC control.

--
Charles Steinkuehler
cha...@steinkuehler.net

Bradley Turner

Mar 6, 2019, 12:19:52 AM
to Machinekit
Fusion 360 does its calculations in the cloud, so it actually doesn't necessarily mean you need a high-end PC or Mac. However, the idea is that there will be a number of preset CAM files that the user of the CNC can choose from. After they choose one of the preset files (normally for a type of joint), the scanning takes place, and the CAM file is then altered to fit the bamboo workpiece. Does that help clarify? Sorry if our intentions aren't totally clear.
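The "alter the preset CAM file to fit the scanned piece" step might be sketched like this: scale every X word of a preset g-code program by the ratio of measured to nominal pole diameter. The regex, the choice of axis, and the diameters are purely illustrative; real joint geometry would likely need more than a uniform scale.

```python
import re

# Match an X word like "X10.000" or "X-2.5" in a g-code line.
_X_WORD = re.compile(r"X(-?\d+(?:\.\d+)?)")

def fit_to_pole(gcode_lines, nominal_dia, measured_dia):
    """Rescale X coordinates so a preset program fits the measured pole."""
    scale = measured_dia / nominal_dia
    def rescale(m):
        return f"X{float(m.group(1)) * scale:.3f}"
    return [_X_WORD.sub(rescale, line) for line in gcode_lines]

# A two-line stand-in for a preset joint program, assumed cut for a 50 mm pole.
preset = ["G0 X10.000 Y0", "G1 X20.000 A90.0 F300"]
fitted = fit_to_pole(preset, nominal_dia=50.0, measured_dia=55.0)
print(fitted)  # -> ['G0 X11.000 Y0', 'G1 X22.000 A90.0 F300']
```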

Thank you

Bradley Turner

Mar 6, 2019, 12:28:15 AM
to Machinekit
I am working with Jonas on this project as well. I successfully installed Machinekit on the BeagleBone with a CRAMPS cape attached. I was able to get into the desktop environment of the BeagleBone and launch the Machinekit "Mendel Max CRAMPS" setup. I got into the AXIS GUI and was able to turn the machine on/off, trigger the E-stop on/off, etc., but when I went to do a test move of the machine using the jogging commands in the AXIS GUI, the stepper motor would not turn. In the software, the preview of the machine moved exactly like it was supposed to, but the stepper would not move. I swapped out the stepper motor to make sure that wasn't the issue, and it was not. What could be causing this? I tried to scrub through the .ini and .hal files for the stepper driver configuration but was not able to find it. Also, is there a better Machinekit config out there for the CRAMPS board for use with a CNC? I have no problem using the Mendel Max setup; it is just geared towards a 3D printer, and I wasn't sure if there was a better option already configured.
Thank you for any help you can provide.

Damien.D

Mar 6, 2019, 6:56:42 AM
to Charles Steinkuehler, Machinekit
On Tue, Feb 26, 2019 at 5:15 PM Charles Steinkuehler <cha...@steinkuehler.net> wrote:
FYI:
The BeagleBone-AI may be a good fit for your project:

https://www.facebook.com/photo.php?fbid=10218976824519992&set=a.2907631578284&type=3&theater

It should do machine vision _much_ better than the BBB.  It's
basically the SoC from the X15 in a BeagleBone form factor.  I'm
working on getting Machinekit working on this board and verifying
capes work as expected.

I couldn't say anything about it earlier, but now they've announced it
at Embedded World.  :)

Sounds amazing!
The BBB with PRU is great for CNC, but I always thought I would have to switch to a PC + Mesa architecture to get (proper/usable) 3D/2D real-time vision. With that new board, that shouldn't be a problem anymore! Looking forward to trying out that board!!

@Charles Steinkuehler If you are allowed to talk about it, is the pinout similar to the BBB for the UART, SPI, eQEP pins, etc.?

Charles Steinkuehler

Mar 6, 2019, 7:41:54 AM
to machi...@googlegroups.com
Make sure:

* Your stepper driver is installed correctly (it's easy to install
them backwards or off-by-one on the pin alignment)

* You are using the proper axis; there are 6 to choose from and not
all are driven in the MendelMax config

* You have an appropriate power supply connected to the motor power
input on the CRAMPS board. There are several different power rails on
the CRAMPS (Motor, Bed, Extruder, and Aux) which provide flexibility,
but can make it confusing to wire up.
--
Charles Steinkuehler
cha...@steinkuehler.net

Charles Steinkuehler

Mar 6, 2019, 7:54:49 AM
to machi...@googlegroups.com
On 3/6/2019 5:56 AM, Damien.D wrote:
> On Tue, Feb 26, 2019 at 5:15 PM Charles Steinkuehler <
> cha...@steinkuehler.net> wrote:
>
> @Charles Steinkuehler <cha...@steinkuehler.net> If you are allowed to talk
> about it, is the pinout similar to the BBB for the UART, SPI, eQEP pins, ...?

There are a lot more features available on the AI. You do not lose
pins on P8/P9 for HDMI, and many of the expansion pins are tied to
more than one ball on the CPU to allow for lots of different choices
regarding I/O, with a focus on making more of the encoder and PRU pins
available for use.

There's no handy I/O spreadsheet available yet (and I don't have time
to make one), but it shouldn't be too hard to migrate capes. In
general, the primary pin functions (e.g. I2C, SPI, timer) match up;
it's the secondary uses (like encoder & PRU I/O) that might change.

--
Charles Steinkuehler
cha...@steinkuehler.net

Bradley Turner

Mar 7, 2019, 11:00:58 AM
to Machinekit
I had the power connected to the extruder power. I feel a little silly for that. Is there somewhere I can find a good wiring diagram of the CRAMPS board? I haven't been able to find a great one, and that may help me with easy answers like these. Thanks!

Bas de Bruijn

Mar 7, 2019, 11:09:07 AM
to Bradley Turner, Machinekit


On 7 Mar 2019, at 17:00, Bradley Turner <btromb...@gmail.com> wrote:

I had the power connected to the extruder power. Feel a little silly for that. Is there somewhere that I can find a good wire diagram of the CRAMPS board? I haven't been able to find a great one and that may help me with easy answers like these. Thanks!

https://reprap.org/wiki/CRAMPS

Charles Steinkuehler

Mar 7, 2019, 11:27:55 AM
to machi...@googlegroups.com
On 3/7/2019 10:07 AM, Bas de Bruijn wrote:
>
>> On 7 Mar 2019, at 17:00, Bradley Turner <btromb...@gmail.com> wrote:
>>
>> I had the power connected to the extruder power. Feel a little silly for that. Is there somewhere that I can find a good wire diagram of the CRAMPS board? I haven't been able to find a great one and that may help me with easy answers like these. Thanks!
>
> https://reprap.org/wiki/CRAMPS

There isn't really a "wiring diagram" (of the Fritzing variety)
available anywhere. Contributions welcome!

The schematics are on github, the reprap.org page above has links.

The page John Elson (of Pico Systems, who sells pre-made CRAMPS
boards) created might also be of some help:

http://pico-systems.com/osc2.5/catalog/links/cramps.html

--
Charles Steinkuehler
cha...@steinkuehler.net

jonas hauptman

Sep 28, 2019, 10:30:03 AM
to Machinekit
Hi,

We are curious: has anyone experimented with using the BB AI with Machinekit yet?  Either way, I was wondering, if one were also planning to do other operations such as in-browser 3D modeling (Onshape, Fusion 360, or Rhinoceros 7) as well as real-time part scanning, whether it is yet a good time to transition over.  Our project in general, at the non-machining-parts level, is working well.  Charles, thanks so much for your help!  But now we're about to size things up for longer axes of movement and more torque, so along with moving to all larger NEMA 23 motors we are interested in considering the BB AI, especially if it is a drop-in replacement and still works with the CRAMPS board and configuration, or if there is a new CRAMPS board that will be paired with it.  Anyway, if anyone has thoughts or insights, we are interested to hear them.  Our goal continues to be to develop a sub-$1000 (hopefully less) open-source 4-axis CNC for rotary machining of bamboo pole stock, to help others produce precise products and building-system components, and to democratize high craft for a highly sustainable material system.

Best regards,

Jonas Hauptman
Virginia Tech






Bas de Bruijn

Sep 28, 2019, 11:44:17 AM
to jonas hauptman, Machinekit


On 28 Sep 2019, at 16:30, jonas hauptman <jonash...@gmail.com> wrote:

Hi,

We are curious: has anyone experimented with using the BB AI with Machinekit yet?

I have tried updating to a realtime kernel, but that does not work out of the box as of yet. I ran MK on an X15, so I guess there should be no big problems.

Either way, I was wondering, if one were also planning to do other operations such as in-browser 3D modeling (Onshape, Fusion 360, or Rhinoceros 7) as well as real-time part scanning, whether it is yet a good time to transition over.

I would not do that from a BeagleBone. But I guess you could if you wanted to.

What do you mean by “part scanning real-time”?

Our project in general, at the non-machining-parts level, is working well.  Charles, thanks so much for your help!  But now we're about to size things up for longer axes of movement and more torque, so along with moving to all larger NEMA 23 motors we are interested in considering the BB AI.

Why the BB AI? What holds you back with the BBB?

Bas



justin White

Sep 29, 2019, 2:31:35 PM
to Machinekit


On Saturday, September 28, 2019 at 11:44:17 AM UTC-4, Bas de Bruijn wrote:


Why the BB AI? What holds you back with the BBB?

Bas

Kind of an old thread, but as he said above, he wants to use machine vision cameras with it. The BB AI still has minimal CPU and RAM, but apparently it has a good GPU and embedded vision engines, so it's meant for this type of thing.

Michael Brown

Sep 29, 2019, 3:03:34 PM
to Machinekit
Hi Jonas,
I know I'm late to this thread; I would just like to mention that Machinekit now has a new hardware option called the Ultra96.
This (SoC-FPGA) board is somewhat more expensive than the BBxx offerings, but it also has a lot more muscle (also in the CPUs).
It already has a vision recognition framework, described in the link below.

https://www.hackster.io/karl-nl/binary-neural-network-demonstration-on-ultra96-6b48e0

The final bits and pieces have just been put into place to make this board part of the Machinekit mksocfpga ecosystem.

Best wishes
Michael B