We developed a set of tools in TouchDesigner to help us map LED pixels with custom visual content. The TouchDesigner sketch is modular, which lets us plug in any content and sample it onto any shape of LED array. This is particularly useful when mapping pixels that are not laid out in an even grid, like our project Stratus above, which uses an animated noise map to drive a two-dimensional array of LEDs.

The two videos above show how the different sampled content on the left can be easily mapped to the three different LED arrays on the right.

The main network is broken up into a series of sub-networks: Image, Geometry, Image Sampling, and Instance Render.

The Image sub-network is where we input the 2D content we want to display across our LED array. Here we are using TouchDesigner’s native Noise and Ramp components to create 2D animations. You can also use an external image or movie file here.
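As a rough illustration of what this sub-network produces, here is a minimal Python/NumPy stand-in for the kind of animated 2D content the Noise and Ramp components generate. The function name and the sine-based pattern are our own placeholders, not TouchDesigner code; inside TouchDesigner the native TOPs do this work for you.

```python
import numpy as np

def make_frame(width=256, height=256, t=0.0):
    """Stand-in for the Noise/Ramp TOPs: a time-varying 2D RGB image.

    In TouchDesigner this content comes from native TOPs (or a movie file);
    here we fake an animated pattern purely to have something to sample."""
    u, v = np.meshgrid(np.linspace(0, 1, width), np.linspace(0, 1, height))
    r = 0.5 + 0.5 * np.sin(2 * np.pi * (u + 0.3 * t))            # scrolling horizontal ramp
    g = 0.5 + 0.5 * np.sin(2 * np.pi * (v * 3.0 - 0.2 * t))      # drifting vertical bands
    b = 0.5 + 0.5 * np.sin(2 * np.pi * (u * v * 4.0 + 0.1 * t))  # cross-term detail
    return np.stack([r, g, b], axis=-1)  # shape (height, width, 3), values in 0..1
```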

The Geometry sub-network is where we either import or create geometry that represents our LED array. Since we are using this geometry to describe LEDs, we are most interested in the points that make up the geometry. For instance, a straight line is normally best described by its start and end points, but because here that line represents a run of LEDs, we want it to carry a division (a point) for each LED along its length.
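The sketch below shows that idea in plain Python/NumPy: a two-point line is resampled into one point per LED. The function name and strip coordinates are hypothetical stand-ins for what a resample-style SOP would produce inside TouchDesigner.

```python
import numpy as np

def resample_line(p0, p1, num_leds):
    """Return num_leds evenly spaced 3D points between p0 and p1.

    Mirrors resampling a line inside TouchDesigner: the line itself is not
    what matters, the per-LED points along it are."""
    t = np.linspace(0.0, 1.0, num_leds)[:, None]  # parameter along the line, one row per LED
    return (1.0 - t) * np.asarray(p0, float) + t * np.asarray(p1, float)

# e.g. a 30-LED strip running diagonally through space
points = resample_line([0.0, 0.0, 0.0], [1.0, 2.0, 0.5], 30)  # shape (30, 3)
```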

In order to sample a 2D image with our geometry point positions, we need to normalize the point positions so that their x, y, and z values lie between 0 and 1. This is because we are going to use the point positions as UV values to sample the color of the image at each corresponding location. Inside this network we extract the point positions into 3 channels (tx, ty, and tz). Then we take the normalized values of the tx and ty channels and use them as U and V coordinates to sample the image, with (0,0) being the bottom left of the image and (1,1) being the top right. This returns 3 color channels (RGB) for each point. Now we can merge the RGB channels with the point position channels to use in the next part of the network, instance rendering.
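Outside of TouchDesigner's CHOP-based workflow, the same normalize-and-sample logic can be sketched in plain Python/NumPy as follows. The function name and the nearest-neighbour lookup are our own simplifications (TouchDesigner performs the actual UV lookup for you), and the image is assumed to be stored with row 0 at the top, as is conventional for image arrays.

```python
import numpy as np

def sample_image_at_points(image, points):
    """Normalize point positions to 0..1 and use (tx, ty) as UVs into image.

    image  : (H, W, 3) array; UV (0,0) maps to the bottom-left, (1,1) to the top-right
    points : (N, 3) array of LED positions (tx, ty, tz)
    Returns an (N, 6) array: tx, ty, tz, r, g, b per LED,
    i.e. the merged position + color channels used for instance rendering."""
    mins, maxs = points.min(axis=0), points.max(axis=0)
    span = np.where(maxs - mins == 0, 1.0, maxs - mins)   # avoid divide-by-zero on flat axes
    uvw = (points - mins) / span                          # normalized positions in 0..1

    h, w = image.shape[:2]
    px = np.clip((uvw[:, 0] * (w - 1)).astype(int), 0, w - 1)          # U -> column index
    py = np.clip(((1.0 - uvw[:, 1]) * (h - 1)).astype(int), 0, h - 1)  # V -> row index (flip: row 0 is top)
    rgb = image[py, px]                                   # nearest-neighbour color lookup

    return np.concatenate([points, rgb], axis=1)
```

Each row of the result pairs an LED's position with the color sampled underneath it, which is the channel set the Instance Render sub-network consumes.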