In the following examples, created in early 2015, we explore the design space beyond Instagram, where effects can be applied to specific areas of an image rather than to the whole frame. When a user tweets a selfie to a specific account, face-tracking algorithms dynamically locate the eyes, nose, and mouth, and graphics are composited, moved, scaled, rotated, and masked into place around those features, with the animations generated in code. Predefined hashtags select the effects, e.g., #blink, #smoking, #glasses, #tattoo, etc. Effects can also be applied to the entire image, e.g., #frame, #vignette, #filter.
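The hashtag-to-effect mapping described above can be sketched as a simple dispatch table. This is a minimal illustration, not the project's actual code: the hashtag names (#blink, #glasses) come from the text, but the registry, function signatures, and landmark format are hypothetical assumptions.

```python
import re

# Hypothetical effect registry: hashtag name -> effect function.
# The real bot's implementation and platform are not specified in the text.
EFFECTS = {}

def effect(tag):
    """Decorator that registers a function as the handler for a hashtag."""
    def register(fn):
        EFFECTS[tag] = fn
        return fn
    return register

@effect("blink")
def blink(image, landmarks):
    # In the real system this would animate the eyelids at the eye landmarks.
    return "blink applied near {}".format(landmarks["eyes"])

@effect("glasses")
def glasses(image, landmarks):
    # In the real system this would composite, scale, and rotate a glasses
    # graphic to match the line between the tracked eyes.
    return "glasses composited at {}".format(landmarks["eyes"])

def dispatch(tweet_text, image, landmarks):
    """Apply every recognized hashtag effect in a tweet, in order."""
    tags = re.findall(r"#(\w+)", tweet_text.lower())
    return [EFFECTS[t](image, landmarks) for t in tags if t in EFFECTS]

# Usage: landmarks would come from a face tracker; here they are stubbed.
results = dispatch("#glasses #blink please!", image=None,
                   landmarks={"eyes": (120, 80)})
```

A table-driven design like this keeps each effect self-contained, so adding a new hashtag is just registering one more function.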
The entire experience was co-developed by William Mincy and Dr. Woohoo!
The image on the left is the original post to the Twitter bot; the image on the right shows one of the steps in the automated process, in this case identifying and outlining the facial features.