Thanks for the link. I would have lots of comments to share, but since the thread is from 2013 I guess it is too late ;-) I'll just add this here: Although I also use Python, I don't have any of the forum posters' problems, because I use a sound-card which essentially does all real-time processing for me. So the actual application does not require real-time performance. Needless to say, the motors of my i3 move smooth as butter when controlled with kamaq.
This is the most interesting system for controlling stepper motors and 3D printer kinematics that I have ever seen! I am just curious how you overcome the hard real-time constraints of stepper pulse generation using USB audio codecs. How do you manage to generate an exact number of stepper pulses in either direction, and how do you synchronize the pulses for two or more steppers?
Thanks for the compliment. To answer your question: The hard real-time constraints are not really overcome (that is theoretically impossible on a non-deterministic "soft real-time" OS such as Linux). But the fact that kamaq uses a sound-card makes this a non-issue. The sound-card, the audio drivers and the audio back-end of kamaq itself provide enough data buffering that latency hiccups of several milliseconds have no effect on the smoothness of the movement. You could say that the sound-card itself is doing the real-time processing. It works for the same reason that audio playback works: you don't get any hiccups when playing music either, do you?
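To make the idea concrete, here is a minimal sketch (not kamaq's actual code) of how exact step counts and multi-axis synchronization fall out of the audio approach for free: each step is rendered as a fixed number of samples, so the pulse count is fixed by construction rather than by real-time timing, and two motors driven from two channels of the same stream share one sample clock. The function name `step_pulse_train` and the 48 kHz rate are illustrative assumptions.

```python
import numpy as np

SAMPLE_RATE = 48000  # assumed audio sample rate in Hz


def step_pulse_train(n_steps, step_rate, sample_rate=SAMPLE_RATE):
    """Encode an exact number of step pulses as audio samples.

    Each step is one period of a square wave, so the number of
    pulses is determined when the buffer is built, not when it
    is played back.
    """
    samples_per_step = int(round(sample_rate / step_rate))
    half = samples_per_step // 2
    one_step = np.concatenate([
        np.ones(half),                        # pulse high
        -np.ones(samples_per_step - half),    # pulse low
    ])
    return np.tile(one_step, n_steps)


# Two motors, one channel each in the same stereo buffer:
x = step_pulse_train(200, 1000)   # 200 steps at 1 kHz
y = step_pulse_train(100, 500)    # 100 steps at 500 Hz
n = max(len(x), len(y))
stereo = np.zeros((n, 2))
stereo[:len(x), 0] = x
stereo[:len(y), 1] = y
# Both channels are clocked by the sound-card's sample clock,
# so the two motions stay aligned to within one sample period
# (about 21 microseconds at 48 kHz), no matter how late the OS
# delivers the next buffer.
```

Handing `stereo` to any buffered audio output is then just ordinary playback; as long as the application refills the buffer faster than the card drains it, millisecond-scale scheduling jitter never reaches the motors.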