I'm working on a very simple micrometer driver. I need to start at an initial speed and accelerate over time. I ran into a problem with the motor decelerating after covering half the distance, so I added a check statement and doubled the target distance.
But my problem is that the initial speed I thought I set with setSpeed() is about 120 times lower than the value I actually measured. Is there something I'm missing in my code, or some math I'm not thinking of? The speeds should all be in steps/second and the acceleration in steps/second^2.
Any help would be great!
#include <AccelStepper.h>
// Create an instance of the stepper class. The first
// argument is the interface type (4 = FULL4WIRE), followed
// by the four pins the motor is attached to.
AccelStepper stepper(4, 4, 5, 6, 7);
long checkdist = 0;  // last known motor position, in steps
void setup()
{
  Serial.begin(9600);
  Serial.println("Stepper test!");
  stepper.setMaxSpeed(0.36666667);      // steps/s
  stepper.setSpeed(0.06087957);         // steps/s
  stepper.setAcceleration(0.00002231);  // steps/s^2
}
void loop()
{
  checkdist = stepper.currentPosition();
  if (checkdist >= 2921) {
    stepper.stop();
  }
  else {
    stepper.moveTo(6000);
    stepper.run();
  }
}