Wireless Real-Time 2D Plotter

Justin Selig '17 (jss459)
Peter Moegenburg '17 (pbm53)

May 2017

The goal of this project was to design a wireless canvas and plotter system that translates real-time movements on a touch screen into motions of a 2D plotter. The plotter scales up the canvas image and replicates it on larger physical paper.



We began this project by exploring ways to let a user paint in a virtual environment and produce an image via a computer-aided system. We came across an old plotter and modified the device to meet our needs. Our final system consists of a wireless ‘canvas’ - a touch screen onto which a user draws an image - and a plotter that uses these movements to produce a scaled-up physical rendering of the drawing. The final prototype involves two Raspberry Pis: one that sits on the plotter and controls the motors, and a second handheld Pi with a touchscreen attached.

Figure 1: Final Design

System Outline

Figure 2: System Diagram

The system is built such that the user interacts with a single touch screen on which all controls are presented. When the user draws on the screen, the brush strokes are communicated wirelessly to a second system which acts as the mechanical interface. The second system consists of a plotter shell with a receiver which also controls a set of motors to manipulate the plotter head.

Design and Testing


Our initial idea involved constructing our own mechanical system that could move freely in two dimensions under the control of two independent motors.

Figure 3: Hardware Design Ideation

We then realized that our system already reflected that of a standard 2D plotter. While searching for a similar device, we came across a plotter generously donated by Prof. Bruce Land.

Figure 4: Old plotter before its gutting

The HIPLOT DMP-29 pen plotter undercarriage contained outdated electrical and mechanical components, which were removed. The original servo motor and drive gearing are highlighted in Figure 4.

The servo motors were replaced with NEMA 17 stepper motors, and the belts were replaced with toothed GT2 belts.

Each motor shaft was equipped with a GT2 pulley, which provided a no-slip connection to the drive belt. Structural plastic was carved out from the underside of the plotter to make way for custom brackets for the NEMA 17 motors.

Figure 5: Gutted Components
Figure 6: Custom geared belt and pulley components
Figure 7: NEMA 17 stepper motor with gear attached

Figure 8: Final gutted undercarriage

The solenoid controlling the ‘pen down’ rail (red arrow below) could not operate from our electronics' low-power 9 V supply, so we attempted to install a continuous-rotation servo motor (purple arrow below) to toggle the rail on and off. Unfortunately, the servo did not hold tension. This could perhaps be fixed using a positional-rotation servo motor.

Figure 9: Attempt to install servo



Aboard the plotter is a Raspberry Pi running the Debian-based “Jessie” release of embedded Linux. This single-board computer acts as a server which accepts a connection from the canvas over TCP. The server implements a series of methods that perform motor control to move the plotter head. The server runs a main thread, move_dispatch, which queries a custom queue data structure for motor movements. This data structure, move_queue, stores delta x and y values as tuples (dx, dy), which are used to independently move the two stepper motors attached to the mechanical belt system. When a new movement is read from the move_queue, the next element is popped off the queue and a thread is spawned which calls a stepper_worker to tick each motor in the forward or reverse direction for the specified number of deltas.

A second important method, move, is invoked via Remote Procedure Calls (RPC) from the client device. This method fills the move_queue by dispatching a thread that pushes the received delta x and y values onto the queue as tuples. Under the RPC system, the move method must terminate before the client is allowed to resume execution. We therefore found it necessary to dispatch a thread in this method to push each new tuple independently. This lets the client resume as quickly as possible while the server arbitrates the shared queue resource.
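As a minimal sketch, the producer/consumer structure described above can be expressed with Python's standard queue and threading modules; the motor object and its step() call are hypothetical placeholders for the real motor-hat interface, not the project's exact code:

```python
import queue
import threading

move_queue = queue.Queue()  # holds (dx, dy) tuples of pending motor deltas

def stepper_worker(motor, steps):
    """Tick one stepper `steps` times; the sign selects direction.
    `motor.step(direction)` is a hypothetical placeholder for the
    real motor-hat call."""
    direction = 1 if steps >= 0 else -1
    for _ in range(abs(steps)):
        motor.step(direction)

def move(dx, dy):
    """RPC-visible method: return quickly by handing the push off to a
    short-lived thread, so the client can resume immediately."""
    threading.Thread(target=move_queue.put, args=((dx, dy),)).start()

def move_dispatch(motor_x, motor_y):
    """Main server loop: pop the next delta and spawn one worker per motor."""
    while True:
        dx, dy = move_queue.get()  # blocks until a move arrives
        tx = threading.Thread(target=stepper_worker, args=(motor_x, dx))
        ty = threading.Thread(target=stepper_worker, args=(motor_y, dy))
        tx.start()
        ty.start()
        tx.join()  # wait for both motors before releasing the next move
        ty.join()
```

Spawning one worker per motor lets the x and y axes step concurrently, while joining both workers keeps successive queue entries strictly ordered.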

Figure 10: Raspberry Pi with Stepper Motor Hat on Plotter

The handheld device consists of a second Raspberry Pi which runs a program that reads in user interaction via a TFT touch-screen. Using the PyGame API, the program displays a number of buttons on the touch screen and detects particular motions to invoke appropriate procedures on the server. The Graphical User Interface (GUI) has several buttons that allow the user to:

  • Quit the program and close the server
  • Clear the TFT screen of any drawings
  • Pause the plotter while it’s in motion and resume the plotter from its position
  • Change colors on the touch screen
  • Manually move the plotter head up, down, left, or right
Figure 11: Handheld Raspberry Pi showing controls

The client program runs a main loop which polls for user ‘mouse clicks’ on the screen. If a single click is detected, a button on the navbar must have been pressed, so the associated function (listed above) is executed. When the user instead makes a brush-stroke movement, the stroke is drawn to the screen and the program invokes the remote procedure move on the server with dx and dy values calculated from the difference between the previous and current mouse positions.
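The delta calculation at the heart of that loop can be sketched as a small helper; the scale factor from touch-screen pixels to plotter steps is illustrative, since the value actually used is not given here:

```python
# Scale factor from touch-screen pixels to plotter steps.
# The project's actual value is not stated; 4 is illustrative.
SCALE = 4

def stroke_delta(prev_pos, curr_pos, scale=SCALE):
    """Convert two consecutive touch positions (x, y) into the (dx, dy)
    step deltas that are sent to the server's `move` RPC."""
    dx = (curr_pos[0] - prev_pos[0]) * scale
    dy = (curr_pos[1] - prev_pos[1]) * scale
    return dx, dy
```

In the real client, prev_pos and curr_pos come from successive PyGame mouse events while the user's finger is dragging.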

Figure 12: Some random scribbles on test screen

Interprocess Communication: Remote Procedure Calls

The communication protocol operates over TCP between the two host machines. To modularize and streamline operation of the motors, we used the msgpack-rpc API. MessagePack is an efficient binary serialization format (a compact, binary analogue of JSON), and msgpack-rpc layers a remote-procedure-call protocol on top of it. By establishing a TCP connection between the server and client, we are able to invoke procedures across hosts without building our own message-decoding system. The Python API, msgpack-rpc, allows a user to invoke server-side methods from the client using the syntax ‘client.call(server_method_name, arg1, arg2, ...)’. In our case, we used this mechanism to call the server method ‘move’ with dx and dy values for the x and y stepper motors, respectively.
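A minimal sketch of this arrangement, assuming the msgpack-rpc-python package; the port number, hostname, and handler body are illustrative, and the live server/client calls are shown as comments since each side runs on a different Pi:

```python
# Requires: pip install msgpack-rpc-python  (provides the `msgpackrpc` module)

class PlotterServer(object):
    """Handler object: any public method becomes remotely callable by name."""
    def move(self, dx, dy):
        # In the real system this dispatches a thread that pushes (dx, dy)
        # onto move_queue; returning here ends the RPC so the client resumes.
        return [dx, dy]

# Server side (plotter Pi):
#   import msgpackrpc
#   server = msgpackrpc.Server(PlotterServer())
#   server.listen(msgpackrpc.Address("0.0.0.0", 18800))  # port illustrative
#   server.start()   # blocks, serving requests
#
# Client side (handheld Pi):
#   client = msgpackrpc.Client(msgpackrpc.Address("plotter-pi.local", 18800))
#   client.call("move", 3, -2)   # invokes the remote `move` with dx=3, dy=-2
```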

Issues Faced

Plenty of issues arose in both the mechanical construction and software design of this project. First, since we were working with an old plotter, rusty parts made smooth movement of the plotter head nearly impossible. A lot of WD-40 did the trick (somewhat). Second, the stepper motors we intended to install didn’t fit the existing frame, so we machined custom L-brackets and modified the frame to fit our components. Lastly, on the hardware side, interfacing with the on-board solenoid that lifts and places the plotter’s pen head proved difficult: its power requirements were out of the scope of our design. Given time constraints, we decided to install our own external servo for lifting the pen head. However, as fate would have it, this also proved difficult, since the servo would not hold tension on the plotter head. We eventually shelved control over lifting and placing the plotter head for the time being.

As for software, there were issues in making the plotter operate in real time given the speed constraints of the stepper motors. Since the user moves more quickly than the motors can operate, it was necessary to build a custom queue data structure to hold user movements so the plotter could execute them sequentially. Additionally, when dispatching multiple stepper workers to execute motor operations, there was significant jitter from multiple threads trying to move the same motor at once. To resolve this, we added a busy-wait in the main loop of the server which prevents the next movement from being released from the move_queue until each motor has completed its previous movement.
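A minimal sketch of that busy-wait arbitration, using per-motor busy flags (the flag names and the tick callback are illustrative, not the project's exact code):

```python
import threading
import time

# One "busy" flag per motor axis. A new move is not released from the
# queue until both flags are clear, so no two threads ever drive the
# same motor at once (the source of the observed jitter).
motor_busy = {"x": threading.Event(), "y": threading.Event()}

def run_worker(axis, steps, tick):
    """Mark the axis busy, run the motor work, then mark it idle."""
    motor_busy[axis].set()
    try:
        tick(axis, steps)  # placeholder for the real stepper_worker
    finally:
        motor_busy[axis].clear()

def dispatch_move(dx, dy, tick):
    """Busy-wait until both motors are idle, then spawn one worker per axis."""
    while motor_busy["x"].is_set() or motor_busy["y"].is_set():
        time.sleep(0.001)  # brief sleep keeps the busy-wait from spinning hot
    tx = threading.Thread(target=run_worker, args=("x", dx, tick))
    ty = threading.Thread(target=run_worker, args=("y", dy, tick))
    tx.start()
    ty.start()
    return tx, ty
```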

Testing Performed

Our first goal was to operate the stepper motors independently of the plotter frame. This involved interfacing with the motor hat on the server side alone. Once we had written functions that performed motor movements immediately upon invocation on the server, we worked to incorporate them into the full system with the client. After building the GUI, we tuned the system so that the x and y motors would move independently according to the delta values from the user input. Once the system was fully integrated, we began tuning the motor speeds and mechanical parts to obtain the highest-fidelity image drawn to paper. This involved experimenting with different motor speeds, scale factors, and stepper_worker method calls. Once satisfied with the system, it was simply a matter of setting up the Pis so that each obtained a unique IP address and the two could communicate over the Cornell network.


We met our expected functionality, producing a system that replicates user movements on a touch screen as physical movements of a 2D plotter. A demonstration of the final working prototype can be seen in the video above.

Figure 13: Some random scribbles on plotter

Some shortcomings of the prototype system are as follows. Since the machine is quite old, the existing belt system occasionally experiences friction from rusted parts, which makes tracking the plotter head's position difficult. For the purposes of this project, the plotter is therefore only built to handle continuous user strokes. Additionally, since operating the solenoid which lifts the plotter head proved difficult given power constraints and concerns over isolation and voltage spiking, the pen head is held down throughout the drawing process. Nevertheless, our final system compensates by allowing a person to manually move the plotter head in all four directions in order to calibrate and center the machine. Our system also allows a user to swap out pens for different sizes and colors.


We discovered along the way that smooth operation of the mechanical portion of the system is vital to the final output image. Since the retro plotter shell worked against us because of its rusted interior, it was difficult to implement higher-order functionality. For instance, we wanted to allow the user to simply place a point on the touchscreen and have the plotter translate directly to that point. This proved difficult, however, because the accumulated steps in the x and y directions did not translate directly to the number of steps incurred by the motor across non-uniform stepping regions of the plotter surface. As for future improvements, there are several enhancements to our design that we would consider:

  • The plotter contains an internal solenoid which, when powered, engages a pulley system that retracts the pen head. This allows the plotter to lift the drawing mechanism in order to translate the plotter head to another location without writing over anything already drawn. After testing the solenoid, we discovered that it requires at least 12 V and non-trivial current draw, which, under time constraints, our power electronics couldn’t accommodate. Moving forward, we would utilize this component to allow the system to lift and place the pen head.
  • Our system allows for real-time drawing; however, we would add a feature that lets the plotter take in an image or G-code and use it as a template from which to sketch. This would require only a simple augmentation of our system beyond its current capabilities.
  • The plotter currently moves in a Manhattan grid-style when drawing diagonal lines. To make these lines smoother, we would interpolate between strokes to move directly between points instead of feeding delta values directly to the motors.
  • If we could refurbish the mechanical system, we’d also incorporate a feedback system to determine when the plotter head has hit a bound. This would ensure that the motors do not stall and draw too much power.
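The stroke-interpolation improvement mentioned above can be sketched as a simple DDA-style routine that breaks one (dx, dy) delta into near-unit micro-moves along a straight line; the function name and approach are illustrative, not the project's code:

```python
def interpolate_line(dx, dy):
    """Break one (dx, dy) move into a list of micro-moves, each at most
    one step per axis, distributed evenly along a straight line. Feeding
    these to the motors in order approximates a diagonal instead of the
    Manhattan-style 'all x, then all y' path."""
    steps = max(abs(dx), abs(dy))
    if steps == 0:
        return []
    moves = []
    x = y = 0
    for i in range(1, steps + 1):
        # Ideal position after i of `steps` increments, rounded to ints.
        nx = round(dx * i / steps)
        ny = round(dy * i / steps)
        moves.append((nx - x, ny - y))
        x, y = nx, ny
    return moves
```

The micro-moves always sum back to the original (dx, dy), so no steps are lost; only their ordering changes.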




We'd like to acknowledge Prof. Skovira and the course assistants for providing assistance where needed throughout the project. We'd also like to thank Prof. Bruce Land for donating the plotter used for this project. Lastly, we'd like to thank Curran Sinha and Keshav Varma for donating materials from their previous project.


Message Pack Website
Message Pack API
ECE 5725 Former Project
Adafruit Motor Hat
ECE 4760 Project Using Same Plotter

Bill of Materials

Item | Cost | Link | Note
Raspberry Pi Stepper Motor Hat | $24 | https://www.amazon.com/gp/offer-listing/B00TIY5JM8/ref=dp_olp_0?ie=UTF8&condition=all&qid=1492465072&sr=8-2 | Not purchased; used from previous project.
NEMA 17 Stepper Motor | $14 each x 2 | https://www.adafruit.com/product/324 | Not purchased; used from previous project.
Aluminum Belt Pulley | $7.89 | https://www.amazon.com/Qunqi-5packs-Aluminum-Timing-Printer/dp/B01IMPM44O |
Geared Belt | $20.88 | https://www.amazon.com/280-2GT-6-Timing-Belt-Closed-Loop/dp/B014SLWP68 |
HIPLOT DMP-29 Pen Plotter (1983) | Unknown | http://www.atarimagazines.com/creative/v9n10/59_Houston_Instrument_HiPlot.php | Provided by Prof. Bruce Land

Tasks Carried Out By Team Members
  • Justin: Server and client program implementations, debugging, electrical component installation, website design, and all website content besides hardware section.
  • Peter: Mechanical disassembly of plotter and machining of components; hardware section of website.