The invitation for an exhibit at the Frederieke Taylor Gallery offered an opportunity to research the generative aspects of integral architectures. The thought was to re-investigate the “Armature,” which in its true form originates as a modulated core for the GT residence; it functions not only as an infrastructural unit but also as a circulatory and generative core. Its organic shape distorts the geometry of the house as ‘pure box’: it softens, tilts, and fragments. The generative aspects of the armature are based on its performance and are distilled in its precise configuration and operation; it generates different environments around and adjacent to it. This generative aspect intrigued us, and the installation was a push to investigate this tension and the consequent mutation of the “real object” and the environment it creates. Could the object become the environment? As Husserl states, “here too, proceeding from the factual, an essential form becomes recognizable through a method of variation.” He calls these “thing-shapes.” Specific to these thing-shapes, in their relationship to the human body, are their “surfaces - more or less ‘smooth’, more or less perfect”: registrations that strive for gradual perfection. The generative aspects of integral architectures activate these surfaces into a hybrid system of layering. The politics of layering, creasing, and wrapping activate zones and environments where boundaries are negotiated and distinctions are blurred.
How does one abstract the object and generate a digital environment of light, speed, and sound? As in mathematics, a higher-dimensional object can only be analyzed by slicing and animating it in order to study its complex four- (or higher-) dimensional behavior in digital simulations. Hence the Armature was sliced, and animation software was used to program different levels of elasticity, speed, and deformation. The model had a built-in memory: the object always re-configures, and when ‘triggered’ the mutation would resume. The process of animation is in itself ‘objective’; the variation in repetition is not subjective or hierarchical. Data derives its meaning, time its form.
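The slicing idea above can be illustrated with a minimal sketch (a hypothetical example, not the installation's actual animation software): a four-dimensional form can only be studied in three dimensions by cutting cross-sections, and animating the slicing coordinate plays the higher-dimensional object back over time.

```python
import numpy as np

def slice_radius(R, w):
    """Radius of the 3D cross-section of a 4-sphere of radius R,
    sliced at height w along the fourth coordinate axis."""
    r2 = R**2 - w**2
    return float(np.sqrt(r2)) if r2 > 0 else 0.0

# Sweeping w from -R to R yields a sequence of 3D spheres that
# swell and shrink: the 4D object rendered as an animation.
R = 1.0
frames = [slice_radius(R, w) for w in np.linspace(-1.0, 1.0, 9)]
print([round(r, 3) for r in frames])
# -> [0.0, 0.661, 0.866, 0.968, 1.0, 0.968, 0.866, 0.661, 0.0]
```

Each frame here is one slice; in the installation the slices of the Armature were given their own elasticity, speed, and deformation values instead of a single radius.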
Sensors, or triggers, distributed in floor fields activate the projected construct as a dissection of an organic unit that expands, contracts, and envelops. This interaction challenges the relationship of viewer and object, constantly re-investigating its 'objectness'. The trigger is both an activator and a violator of the visitor’s interaction with the object. The sound is accelerated to have the same frequency as the animations; each distortion is sampled differently and has its own character. The sound technology further enhances the affect: localized hypersonic sound beams developed by Robotics International scramble sound until the sound waves hit a surface; there they re-organize, so the sound seems to be projected from that surface, creating a 3D sound space. An ambiguous animated environment ensues, enveloping the visitor and the gallery's confines.
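The trigger logic described above can be sketched as follows (zone names and parameter values are invented for illustration; this is not the installation's actual control code): each floor zone maps to its own distortion sample, so every activation has its own character, and a re-trigger resumes the mutation where it left off, which is the model's "built-in memory."

```python
# Hypothetical per-zone distortion parameters; each zone has its own character.
ZONES = {
    "entry":  {"elasticity": 0.3, "speed": 1.5, "sample": "contract"},
    "center": {"elasticity": 0.8, "speed": 0.7, "sample": "expand"},
    "rear":   {"elasticity": 0.5, "speed": 1.0, "sample": "envelop"},
}

class Armature:
    """Projected construct with built-in memory: triggering resumes
    the mutation rather than restarting it from frame zero."""

    def __init__(self):
        self.frame = 0  # persists between triggers

    def trigger(self, zone, frames=10):
        params = ZONES[zone]
        self.frame += frames  # mutation resumes where it left off
        return params["sample"], self.frame

a = Armature()
print(a.trigger("entry"))   # -> ('contract', 10)
print(a.trigger("center"))  # -> ('expand', 20)
```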
The “interactive floor” developed by the Context-Aware Computing Group at the MIT Media Lab became an integral part of the gallery installation and enhanced, or ‘triggered,’ the visitor’s interaction with the armature. Once triggered, the sensor data are fed back to the computer, which launches holographic animations and digital environments. A special prototype of a modular steel floor, sponsored by Steelcase, incorporates the MIT Media Lab sensor technology of the PIC microchip, which analyzes changes in the capacitance of the interactive floor due to the compression of the foam dielectric.
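The sensing principle can be sketched with the parallel-plate capacitor formula (a minimal model, not the Media Lab's firmware; plate area, permittivity, and threshold values are assumptions): a footstep compresses the foam dielectric, the plate separation d shrinks, and the capacitance C = ε₀·εᵣ·A/d rises above its unloaded baseline.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(area_m2, eps_r, thickness_m):
    """Parallel-plate capacitance with a dielectric of thickness d."""
    return EPS0 * eps_r * area_m2 / thickness_m

def step_detected(thickness_m, rest_thickness_m=0.01,
                  area_m2=0.09, eps_r=1.6, threshold=1.2):
    """Trigger when capacitance exceeds the unloaded baseline by 20%.
    All numeric defaults are illustrative assumptions."""
    baseline = capacitance(area_m2, eps_r, rest_thickness_m)
    return capacitance(area_m2, eps_r, thickness_m) / baseline > threshold

print(step_detected(0.010))  # foam at rest (10 mm) -> False
print(step_detected(0.007))  # compressed to 7 mm   -> True
```

Because C scales as 1/d, the ratio to the baseline equals the compression ratio (10/7 ≈ 1.43 here), which comfortably clears the 20% threshold.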
“From HardWare to SoftForm” is a 3D digital interactive installation of the ‘Armature’, exhibited at the Frederieke Taylor Gallery in Chelsea, NYC [September 2002], at the ‘Art & Idea’ Gallery, Mexico City [September 2004], and in Monterrey [November 2004].
HW&SF WAS SUPPORTED BY
The Netherlands Foundation for Visual Arts, Design and Architecture, Amsterdam, The Consulate General of The Netherlands in New York, Steelcase NYC, and the Context-Aware Computing Group at the MIT MediaLab, Cambridge.