Vol. 29 Issue 4 Reviews

softVNS 2 Real Time Video Processing and Tracking Software

softVNS 2.0 is listed for US$ 350. Current versions are softVNS 2.19d for Macintosh OS X, and softVNS 2.17 for Macintosh OS 9. Contact David Rokeby; electronic mail drokeby@sympatico.ca; Web homepage.mac.com/davidrokeby/softVNS.html.

Reviewed by Margaret Schedel
San Francisco, CA USA

David Rokeby developed the Very Nervous System in 1986 as an installation that enabled a computer to trigger sound and music in response to the movement of human bodies. His computer was not fast enough to analyze streaming video data, so he created a rudimentary camera out of sixty-four light sensors and a plastic lens. Using the Very Nervous System with its dedicated external digitizers to capture, convert, and extract motion information from live video, he has created installations for over a decade, adding hardware and software modules as needed. Since 1993 Mr. Rokeby has sold his technology as several generations of upgradable proprietary hardware. In 1999 he reworked the system to run under the Max programming environment and dubbed it softVNS; in July 2002 he released an updated and expanded version of this software: softVNS 2.

The integration of softVNS with Max is seamless—the software resides within the Max folder and is authorized by SoftVideo.key, a file that the developer sends to the user via electronic mail and that also resides within the Max folder. SoftVNS objects are denoted by a "v." prepend, similar to Jitter's "jit." nomenclature. Speaking of Jitter, Mr. Rokeby has written objects that translate from softVNS streams to Jitter matrices and back again. SoftVNS overlaps somewhat with Jitter in video playback and processing, but Jitter is designed for general data processing, including OpenGL geometry, audio, physical models, state maps, generative systems, text, or any other type of data that can be described in a matrix of values. SoftVNS is designed purely for processing video and contains a very useful set of objects for real-time video tracking, including presence and motion tracking, color tracking, head tracking, and object following.

The tracking algorithms in softVNS 2 are unbelievable. I'm sure given enough time I could build similar patches in Jitter, but Mr. Rokeby has had 20 years of experience tracking movement and it shows. The key technique of classic softVNS motion tracking, frame differencing, is implemented in v.motion: the current frame is subtracted from the previous one, and the differences are generally caused by movement in the image. There are many other tracking algorithms: v.edges shows the edges of movement; v.heads tracks multiple objects at once; v.track follows a specified small object; v.bounds draws a rectangle around the borders of an object and gives its center (see Figure 1); and v.centroid gives the center of gravity of an object. It is fascinating to compare the center of an object as measured by its boundaries with its center of gravity. Together with objects that massage the incoming video stream to make it easier to track, these tracking objects are the strongest reason to purchase softVNS 2 if you already own Jitter.


Figure 1
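
The frame-differencing idea behind v.motion, and the contrast between a bounding-box center (as in v.bounds) and a center of gravity (as in v.centroid), can be sketched in a few lines of Python. This is an illustrative toy, not Mr. Rokeby's implementation: the function names and the list-of-lists grayscale frame format are invented for the example.

```python
def motion_mask(prev, curr, threshold=30):
    """Mark pixels whose brightness changed by more than `threshold`
    between two frames (frames are lists of rows of 0-255 ints)."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def bounds_center(mask):
    """Center of the bounding rectangle around changed pixels."""
    ys = [y for y, row in enumerate(mask) if any(row)]
    xs = [x for row in mask for x, on in enumerate(row) if on]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

def centroid(mask):
    """Center of gravity of the changed pixels."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, on in enumerate(row) if on]
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

# Demo: an L-shaped blob of changed pixels. The bounding-box center and
# the center of gravity disagree, mirroring the v.bounds/v.centroid
# comparison described above.
prev = [[0] * 6 for _ in range(6)]
curr = [row[:] for row in prev]
for x, y in [(1, 1), (1, 2), (5, 1)]:
    curr[y][x] = 255
mask = motion_mask(prev, curr)
print(bounds_center(mask))  # rectangle center
print(centroid(mask))       # pulled toward the dense arm of the blob
```

For a symmetric object the two centers coincide; for an asymmetric one, the centroid is pulled toward the bulk of the moving pixels while the bounding-box center is not, which is why comparing the two can be so revealing.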

The other half of softVNS handles video playback and manipulation. In his own installations, Mr. Rokeby has always had video and sound reacting to body movement, and he developed softVNS before Jitter and Nato were available. Working with video artist Charlie Woodman, I created a patch that tracks the movement of a dancer, who controls over 60 parameters of the sound and 10 parameters of the video. Previously, I had created video patches for Mr. Woodman with Jitter, but he was more than happy with the video "effects" in softVNS 2. Although we did not end up needing the v.jit translation object, it works perfectly.

Getting started with softVNS 2 is pretty easy; there is no tutorial per se, but there is a patch called softVNS_2_Overview that introduces the softVNS 2 objects and allows easy access to example/help patches for each object. The objects are divided by type, including sources/capture/display, spatial transformation, and tracking/analysis. I found it much more useful simply to work my way down the help files in alphabetical order. It may sound absurd, but there are only 183 objects, and many of them are minor variations of one another, such as the arithmetic operators or QuickTime effects objects. The help files are extensive and show some of Mr. Rokeby's coding tricks for making the tracking objects work their best; my only complaint is that some of these files use bpatchers, making it a little less convenient to pirate sections for personal use.

Mr. Rokeby has been using softVNS in interactive video installations for the past two years. Some of these are permanent exhibits that have run all day, every day, for well over a year without downtime. My own experience with softVNS has been more performance oriented, and I have not experienced any crashes. There was, however, a noticeable slowdown when using several of the tracking objects at once on my 1.25-GHz PowerBook G4 with 1.5 GB of RAM. The computer could handle the tracking, but as soon as I added 60 streams of MIDI messages, things got ugly. The CPU load for the tracking objects is very high, but it is definitely worth it.

SoftVNS 2 is coded for the Macintosh G4's Velocity Engine and will not run on earlier hardware. It includes extensive support for Apple's iSight camera, enabling users to easily turn off the auto-focus and auto-exposure, which is necessary for accurate video tracking. It also includes intuitive controls for my interface of choice, the Imaging Source DFG-1394-1 video-to-FireWire converter, which streams uncompressed video directly into the computer from any composite or S-video source.

Motion tracking has become more common since the 1980s, but this quote from Mr. Rokeby's Web site is a striking description of the artistic impetus behind the development of his system:

Because the computer is purely logical, the language of interaction should strive to be intuitive. Because the computer removes you from your body, the body should be strongly engaged. Because the computer's activity takes place on the tiny playing fields of integrated circuits, the encounter with the computer should take place in a human-scaled physical space.

The developer deliberately set out to pit his art against accepted beliefs about what computers are good at, continuing a contrarian streak that began at the Ontario College of Art. He has pursued his dream, creating amazingly powerful tracking software—thankfully he has turned it into an affordable and well-documented product for all artists to use.