DIGITAL MEDIA TOOLS, SYSTEMS & INSTALLATIONS

 

Concepts

Functional Art

The lighting in our sculptures can be driven by a number of different sources. An internet, sensor, or other procedural data feed can turn the sculpture into a beautiful readout, showing different data conditions with different sets of lighting. For example, the sculpture might display different lighting scenes depending on the local weather forecast. Another might show a building's current energy use compared against a target, visually involving its occupants in reducing their energy footprint. Internal logic can cycle through multiple display modes automatically or on user command.
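
As a minimal sketch of this idea (the scene names, colors, thresholds, and data feed below are hypothetical placeholders, not actual Lumieria scene definitions), a data-driven mode might map an incoming condition to a lighting scene like this:

    # Hypothetical scene table; a real installation would define its own
    # colors and animation patterns.
    WEATHER_SCENES = {
        "clear": {"color": (255, 180, 60), "pattern": "slow_pulse"},
        "rain":  {"color": (60, 120, 255), "pattern": "ripple"},
        "snow":  {"color": (220, 230, 255), "pattern": "drift"},
    }

    def scene_for_forecast(forecast: str) -> dict:
        """Pick a lighting scene for a forecast keyword, with a neutral fallback."""
        return WEATHER_SCENES.get(forecast.lower(),
                                  {"color": (255, 255, 255), "pattern": "steady"})

    def energy_scene(current_kwh: float, goal_kwh: float) -> dict:
        """Blend from green (at or under the goal) to red (double the goal or worse)."""
        overshoot = min(max(current_kwh / goal_kwh - 1.0, 0.0), 1.0)
        return {"color": (int(255 * overshoot), int(255 * (1.0 - overshoot)), 0),
                "pattern": "steady"}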

iOS multitouch control

With an interactive multi-touch application, we can quickly adjust our lighting. This controller is used in several different areas:
  • To control the final sculpture's lighting in real time, via a WiFi connection (a minimal sketch of such a command follows this list). This entertainment application is a crowd-pleaser and will help drive sales or rentals to venues such as restaurants, where patrons can download the control app and use their own iOS devices to play with the sculptures on the wall while they wait for service.
  • To control virtual, interactive previsualizations of the sculptural lighting, as if we were controlling the real thing. This helps us to quickly arrive at a beautiful lighting design.
  • To control the automated camera-capture rig that is used to generate previsualization images from photos of a prototype.
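
As a rough illustration of the kind of message the touch controller might send over WiFi (the packet layout, port, and function name here are assumptions, not the actual Lumieria control protocol):

    import socket
    import struct

    def send_light_command(host: str, light_index: int, r: int, g: int, b: int,
                           port: int = 5005) -> None:
        """Send one hypothetical light-update packet to the sculpture over UDP."""
        packet = struct.pack("!BBBB", light_index, r, g, b)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(packet, (host, port))

    # Example: set light 3 on the sculpture at 192.168.1.50 to warm white.
    # send_light_command("192.168.1.50", 3, 255, 200, 120)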

CAD-Driven, Automated Fabrication

Both 2D CNC cutting and 3D printing are used to manufacture the Wallships line. The 2D shapes and 3D models needed to drive this process are the designed byproducts of our early previsualization steps. As such, once we've used the previsualizations to arrive at a good design, the sculpture is ready for fabrication with very few additional steps. This allows for just-in-time production of sometimes quite expensive finished pieces, enabling us to sell products virtually with very low overhead.

'Napkin Sketches'

Most of our best ideas begin as quick, conceptual sketches (often on a napkin, if they happen while dining out). The conceptual sketches are scanned and imported into a 2D drawing program to provide the original shape reference, ensuring the spirit of the original design is carried into the computer-aided design process.

2D CAD Drawings

Using a 2D drawing or illustration application such as Adobe Illustrator or OmniGraffle, we convert our sketches into 2D shape curves. These curves are used to generate CNC cutting instructions for production, and also serve as reference shapes for our 3D modeling stages.
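
A minimal sketch of the cutting-instruction step, assuming the shape curve has already been flattened to a closed polyline of (x, y) points in millimeters (the cut depth and feed rate below are placeholders, not production values):

    def polyline_to_gcode(points, cut_depth=-3.0, feed_rate=600.0):
        """Emit simple G-code that traces a closed 2D outline at a single depth."""
        lines = ["G21 ; millimeters", "G90 ; absolute coordinates", "G0 Z5.0 ; lift"]
        x0, y0 = points[0]
        lines.append(f"G0 X{x0:.3f} Y{y0:.3f}")
        lines.append(f"G1 Z{cut_depth:.3f} F{feed_rate:.0f} ; plunge")
        for x, y in points[1:] + [points[0]]:  # trace the outline and close the loop
            lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed_rate:.0f}")
        lines.append("G0 Z5.0 ; retract")
        return "\n".join(lines)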

3D Modeling

Using our 2D shape curves as reference, the sculpture design is modeled in 3D. This produces shape definitions that the 3D printing stage can use directly to produce our non-visible, structural, and shadow-casting parts. The 3D model is also used to generate animated movies of the virtual sculpture for sales and marketing, and to render the images used for interactive previsualization.
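
As a small sketch of how a 2D outline becomes a printable shape definition (a bare-bones ASCII STL writer for the extruded side walls only; our actual parts are modeled in a 3D package rather than generated this way):

    def extruded_outline_to_stl(points, height, path):
        """Write the side walls of a vertically extruded closed outline as ASCII STL.
        Caps are omitted, so the result is not watertight; this only illustrates
        the kind of shape definition that drives the 3D printing stage."""
        with open(path, "w") as f:
            f.write("solid outline\n")
            n = len(points)
            for i in range(n):
                x0, y0 = points[i]
                x1, y1 = points[(i + 1) % n]
                # Two triangles per wall segment; normals left at zero so the
                # slicer recomputes them from the vertex winding.
                quads = (((x0, y0, 0.0), (x1, y1, 0.0), (x1, y1, height)),
                         ((x0, y0, 0.0), (x1, y1, height), (x0, y0, height)))
                for tri in quads:
                    f.write("  facet normal 0 0 0\n    outer loop\n")
                    for vx, vy, vz in tri:
                        f.write(f"      vertex {vx:.4f} {vy:.4f} {vz:.4f}\n")
                    f.write("    endloop\n  endfacet\n")
            f.write("endsolid outline\n")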

Virtual Lighting

In our 3D package, using custom Lumieria lighting rigs, we can quickly experiment with different lighting schemes until we arrive at a nice-looking, or otherwise functional, lighting design. Our LED light sources are accurately modeled using area lights, whose outputs are calibrated against reference images of real-world LEDs photographed in a physical configuration that matches their eventual placement in the sculpture.
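
One simple way to express that calibration step (a sketch only; the real adjustment happens inside the 3D package, and this assumes the reference photo and the test render are available as linear-color NumPy arrays):

    import numpy as np

    def calibration_gain(reference_img: np.ndarray, rendered_img: np.ndarray) -> np.ndarray:
        """Per-channel gain that scales a rendered area light's output so its
        average linear RGB response matches a reference photo of the real LED.
        Both images are float arrays of shape (H, W, 3) in linear color space."""
        ref_mean = reference_img.reshape(-1, 3).mean(axis=0)
        ren_mean = rendered_img.reshape(-1, 3).mean(axis=0)
        return ref_mean / np.maximum(ren_mean, 1e-6)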

Interactive Previsualization

Because nice renderings of our sometimes complex lighting schemes take a fair bit of computing time, we've developed a rapid-iteration workflow based on film-studio CG lighting techniques. Using a compositing tool such as Shake or Nuke, we automate the combination of pre-rendered, per-light images to quickly generate a final image of any given lighting configuration. This setup lets us adjust each light's color and intensity on the fly, enabling us to test different animation schemes and generate visualizations of them.
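
The same idea, expressed outside of Shake or Nuke as a sketch (assuming the per-light renders are available as linear-color NumPy arrays, each rendered with only that light on at full intensity):

    import numpy as np

    def composite_lighting(per_light_images, colors, intensities):
        """Combine pre-rendered, per-light images into one final frame.
        Because light adds linearly, any lighting configuration is just a
        weighted sum of the per-light renders, tinted and scaled per light."""
        result = np.zeros_like(per_light_images[0])
        for img, color, intensity in zip(per_light_images, colors, intensities):
            result += img * np.asarray(color) * intensity
        return np.clip(result, 0.0, None)

Dragging a color or intensity slider in the touch controller then only re-runs this weighted sum, rather than re-rendering the 3D scene.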

Automated Camera-Capture Rig

The Lumieria iOS lighting-control app has an automated 'capture' mode, which drives a camera to capture an image of each individual light at full intensity. By communicating with the sculpture over a WiFi connection, and with the camera via an integrated infrared LED remote controller, the full set of images needed for the interactive previsualization process (this can be many images) is captured automatically, without any manual intervention. This huge time-saver allows us to use our previsualization setup to preview how the lighting will look on a real sculpture. Additionally, the images are used to calibrate the rendering of our virtual lights to match the appearance of real-world lights.
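
The capture loop itself is conceptually simple. In sketch form (the sculpture and camera interface functions below are stand-ins, not the app's actual API):

    import time

    def capture_per_light_images(lights, set_light, trigger_shutter,
                                 settle_seconds=1.0, exposure_seconds=2.0):
        """Photograph each light in isolation at full intensity.
        `set_light(index, level)` stands in for the WiFi command to the sculpture,
        and `trigger_shutter()` for the infrared remote pulse to the camera."""
        for index in lights:
            # Turn everything off, then bring up only the light under test.
            for other in lights:
                set_light(other, 0.0)
            set_light(index, 1.0)
            time.sleep(settle_seconds)   # let the LEDs and camera metering settle
            trigger_shutter()
            time.sleep(exposure_seconds)  # wait for the exposure to complete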