Conductive Paint Data Codes
Use Electric Paint to create data codes for your iPhone or tablet touch screen!
Conductive paint can be used to simulate fingers on a capacitive touch screen. I used this to create conductive paint data codes: unique paint patterns that can be read by a smartphone's touchscreen and work like a barcode or QR code.
When a card is pressed onto a capacitive touchscreen, the phone recognizes the pattern of contact points and identifies the corresponding code.
It’s interesting how my paper cards, or any other object equipped with conductive paint data codes, get a second, digital dimension. More information can be associated with a piece of paper than just the words or pictures on its surface.
And because the codes can only be read with a smartphone, the cards can also draw data from the device. Using the iPhone's sensors, such as GPS, a piece of paper suddenly gets an exact geolocation. This opens the door to a whole range of calculations and speculations about the object's environment.
Materials

What I used:
Bare Conductive's Electric Paint
pencil & ruler
device with a capacitive touchscreen (e.g. iPhone or iPod Touch)
Step 1: Creating Patterns
On the back of some paper business cards, I painted rectangles in a four-by-four grid, connected with thin lines of conductive paint. It is important that the circuit touches your hand at some point.

When a person holds the card, their hand connects the circuit to ground. When the card is then pressed onto a capacitive touchscreen, the phone registers the bigger rectangles as fingertips touching the screen, and the detected pattern can be compared against a database of known codes.
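The lookup step could be sketched like this. This is a hypothetical illustration, not the project's actual code: the function and card names are mine, and the touch coordinates are assumed to come from the platform's touch API as plain (x, y) points.

```python
# Hypothetical sketch: turn a set of detected touch points into a 4x4 grid
# code that can be looked up in a database of known cards. Coordinates are
# taken relative to the points' own bounding box, so the card's position
# on the screen does not matter.

def points_to_code(points, grid=4):
    """Snap each touch point to a cell of a grid x grid pattern."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    span_x = max(xs) - min_x or 1.0  # avoid division by zero
    span_y = max(ys) - min_y or 1.0
    cells = set()
    for x, y in points:
        col = min(int((x - min_x) / span_x * grid), grid - 1)
        row = min(int((y - min_y) / span_y * grid), grid - 1)
        cells.add(row * grid + col)
    return frozenset(cells)

# Example database mapping grid codes to card identities (made-up entries).
CARDS = {
    frozenset({0, 3, 12, 15}): "card-corners",
}
```

With four touches at the corners of the card, `points_to_code` yields the four corner cells of the grid, and the database lookup returns the matching card.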
It’s interesting that the ink circuits don’t have to be on the paper's surface in order to be recognized. The codes could also be printed between two layers of paper, creating an invisible alternative to barcodes or QR codes.
Step 2: Pattern Detection
In order to identify individual cards, I use an artificial neural network. According to Wikipedia:

"Artificial neural networks (ANNs) are computational models inspired by animals' central nervous systems (in particular the brain) that are capable of machine learning and pattern recognition."
Patterns are not identified by the absolute touch positions on the screen, but by the relations between the individual touches. This way, it doesn’t matter where on the screen or in which orientation the card is placed, which makes the detection more robust.
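One way to build such relational features is from the pairwise distances between touches, which don't change when the card is moved or rotated. The article feeds its features to a neural network; as a self-contained stand-in, the sketch below matches the same kind of invariant signature against stored cards by nearest match. All names here are my own illustration, not the project's code.

```python
import math
from itertools import combinations

def touch_signature(points):
    """Relational signature of a touch pattern: all pairwise distances,
    normalized by the largest one and sorted. Translation- and
    rotation-invariant, so card placement does not matter."""
    dists = [math.dist(a, b) for a, b in combinations(points, 2)]
    longest = max(dists)
    return sorted(d / longest for d in dists)

def closest_card(points, signatures):
    """Return the known card whose stored signature is nearest (L1 error).
    A trained neural network, as used in the article, would replace this."""
    sig = touch_signature(points)
    def error(item):
        return sum(abs(a - b) for a, b in zip(sig, item[1]))
    return min(signatures.items(), key=error)[0]
```

A card pattern slid to a different spot on the screen produces exactly the same signature, so it still matches the stored card.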