Temporal Innovations
Lightning-swift, interactive video analysis
Navigate through terabytes of video data with ease
Fluidly explore forwards and backwards, from stop-frame to 1,000,000 times normal rate
Observe phenomena across time and space, behavior and place
Use diversely in training, research, presentations, and exhibits

Core Technology

FrameGlide video explorer software - past, present & future

The FrameGlide video explorer was originally developed for use within public museum exhibits. In this context, the technology has enabled visitors to fluidly explore diverse kinetic phenomena including: birds in flight, plants growing, explosive demolition of buildings, terabyte-size year-long live weather capture with live satellite data overlay, CT scan data, still images converted to virtual flipbooks, point-of-view hundred-mile river flyovers, muffins baking, high-speed video of ballistic impacts, decomposition, Olympic sports, and 1001 other subjects in between.
Often, the user interface chosen for this context is our ultra-rugged Spin Browser dial.

(When the FrameGlide application is used with the Spin Browser dial, we refer to the combined unit as the FG|SB.)

The system has also been used in the endoscopy suite as a clinical add-on and training tool, at trade shows to allow booth personnel to show prospects diverse company offerings, and at conferences to allow presenters fluid flexibility when responding to audience questions.

This site was created to introduce the technology to new markets and receive feedback regarding desired functionality.

Core concepts

Several principles guide the FrameGlide software design and ongoing evolution. These include:

  • Treat video content as data to be explored, not a produced experience. (One immediate result was that interactive exhibits based on our system went from the usual museum exhibit 1-5 minutes of content to over 24 hours at 30fps - an increase by a factor of over 1000.)
  • Focus on dynamic phenomena captured across frames, as opposed to static imagery within a frame.
  • Leverage human ability to perceive complex patterns and notice behaviors, by providing an environment with minimal barriers between the human user and the kinetics of the visual data. (This requires, among other things, realtime, immediate responsiveness to user actions.)
  • Allow temporal highlighting of image data based on internal image metrics (color, scene complexity, rate-of-change of brightness, etc.), and metrics derived from secondary data streams (audio, barometric pressure, heart rate, etc.)

FrameGlide evolution, Phase I - Extreme liquidity of perusal

The first focus was on providing ultra-smooth movement through video (no visual distortion, and no temporal stutter), forward and back, fast and slow. This liquidity extends through arbitrary frame paths determined on the fly. The system remains superlative in this area, allowing the exploration of multiple time-synchronized channels of HD from stop frame to over 1,000,000 times normal rate, running on nothing more than a conventional PC.

FrameGlide evolution, Phase II - Parallel live capture

The next major enhancement was the addition of a live camera feed. The system can currently capture to both disk and RAM, at frame rates ranging from timelapse to over 2000fps, across multiple HD cameras, with capture sets aggregated over years and terabytes in size (including dynamically pruned frame density as imagery ages) - all while allowing ultra-smooth parallel user review right up to the present moment.
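FrameGlide's actual pruning logic is not published; purely as a hedged illustration of the idea, a retention policy might keep every frame of recent footage and progressively thin older footage so that year-long captures fit in bounded storage. All names and thresholds below are hypothetical.

```python
def keep_frame(index, age_days, fps=30):
    """Illustrative retention policy: decide whether a captured frame is kept
    as it ages. Recent footage retains every frame; older footage is
    progressively thinned. (Hypothetical thresholds, not FrameGlide's own.)"""
    if age_days < 1:
        return True                      # keep every frame from the last day
    if age_days < 30:
        return index % fps == 0          # ~1 frame per second for the last month
    return index % (fps * 60) == 0       # ~1 frame per minute beyond that
```

A policy of this shape gives the "dynamically pruned frame density" effect: storage grows far slower than capture time, while the user can still scrub smoothly through the entire history.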

FrameGlide evolution, Phase III - Diverse features desirable for exhibit applications

We then added many more features relevant to our museum clients such as:

  • Sophisticated, client-configurable relationships between dial motion and advance through video, including a linear "gear ratio", threshold entry into an exponential mode, speed caps to prevent users from missing important sections, etc.
  • Custom, client-configurable exposure algorithms designed specifically for timelapse capture, including innovative methods to remove luminosity flicker.
  • Graphic and audio overlays whose appearance is cued to user dwell time in a particular area.
  • Realtime deinterlacing.
  • Realtime chroma and difference keying.
For a relatively complete feature list geared toward museum clients (we are never quite up to date, as the system functionality advances with every project), please see our Features and Advantages PDF.
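The dial-to-video mapping described in the first bullet above can be sketched as a simple function. This is not FrameGlide's actual algorithm, only a hedged model of the configurable pieces it names: a linear "gear ratio" region, a threshold beyond which advance grows exponentially, and a speed cap. All parameter names and values are hypothetical.

```python
def frames_to_advance(dial_velocity, gear_ratio=2.0,
                      exp_threshold=10.0, exp_base=1.5, speed_cap=300.0):
    """Map dial velocity (e.g., detents/second) to frames advanced per update.

    Hypothetical model: linear "gear ratio" up to exp_threshold, exponential
    growth beyond it, and a speed cap so users cannot skip past important
    sections. Sign preserves direction (forward/back)."""
    sign = 1 if dial_velocity >= 0 else -1
    v = abs(dial_velocity)
    if v <= exp_threshold:
        advance = gear_ratio * v                               # linear region
    else:
        advance = gear_ratio * exp_threshold * exp_base ** (v - exp_threshold)
    return sign * min(advance, speed_cap)                      # speed cap
```

With these example values, slow dial motion maps one-to-two (5 detents/s advances 10 frames), while any spin fast enough to exceed the cap advances at most 300 frames per update in either direction.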

Spin Browser dial hardware

We offer both dome-mount and panel-mount stainless steel Spin Browser dial navigators, with both USB (convenient) and RS232 serial (static-resistant) interfaces. These user interfaces significantly enhance the user experience when navigating through large amounts of content. They are ultra-rugged (coming out of TechnoFrolics' work in the interactive museum exhibit world) and come with a 3-year parts and labor warranty.

Offerings and work process

Basic systems

For configurations as used in museums to fluidly observe phenomena within large quantities of video using the dial interface, we have offerings essentially on-the-shelf and ready-to-go.

Data Filtering

[Interactive demo: Isolate Fleeting Phenomena (Night Storm) - sample frames showing the same storm footage filtered by storm, strobe, and complexity metrics]
Actual interactive data filtering is as quick as clicking between these sample images!

Find a needle in a haystack of time - here lightning bolts in hours of storm data.

The newest generation of the FrameGlide application extends the environment to:

  • Allow metric-based frame reordering, so that parameters other than time may be used to determine frame ordering/proximity - e.g., sort frames in ascending order of brightness, descending order of ambient temperature, etc.
  • Make filtering and reordering so fast and easy to configure that dozens of hypotheses may be tested in a single afternoon - during a meeting, even! FrameGlide's swiftness in filtering and reordering image data is unique in our experience.

    Regarding this point, once metric files are available, (reversibly) applying frame filters based upon boolean combinations of metrics typically takes just seconds for 100 gigabyte-sized video files. We contrast this with using more conventional means to achieve the same goal, where the time and effort would be measured in hours to weeks.
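To make the two bullets above concrete, here is a minimal sketch, in plain Python, of metric-based filtering and reordering over a per-frame metric table. The data, metric names, and `select_frames` helper are all hypothetical; the point is that a boolean predicate plus an optional sort key is enough to express both operations, and neither touches the underlying video.

```python
# Hypothetical per-frame metric table: one row per captured frame.
metrics = [
    {"frame": 0, "brightness": 0.12, "complexity": 0.40},
    {"frame": 1, "brightness": 0.95, "complexity": 0.10},
    {"frame": 2, "brightness": 0.88, "complexity": 0.75},
    {"frame": 3, "brightness": 0.05, "complexity": 0.60},
]

def select_frames(metrics, predicate, sort_key=None, descending=False):
    """Reversibly filter (and optionally reorder) frame indices by metrics.

    Only the playback order changes; the original capture is untouched,
    so the operation can be undone by reverting to time order."""
    chosen = [m for m in metrics if predicate(m)]
    if sort_key is not None:
        chosen.sort(key=sort_key, reverse=descending)
    return [m["frame"] for m in chosen]

# Boolean combination of metrics: bright AND complex frames
# (e.g., candidate lightning strikes in storm footage).
bright_and_complex = select_frames(
    metrics,
    predicate=lambda m: m["brightness"] > 0.5 and m["complexity"] > 0.5,
)

# Reorder all frames by ascending brightness rather than by time.
by_brightness = select_frames(metrics, predicate=lambda m: True,
                              sort_key=lambda m: m["brightness"])
```

Because the filter is just a predicate over precomputed metric rows, changing a hypothesis means changing one lambda - which is why iteration can happen in seconds rather than re-rendering video.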

With the addition of these capabilities, which provide powerful research and analysis functionality, we expect to attract clients in fields such as animal behavior, factory automation, sports, ballistics, surveillance, healthcare, meteorology, transportation, and biomechanics.

More On Metrics

Creating the metric files can take essentially no computational time if the data already exists - for example, wind speed data from NOAA. If a metric is generated from the video data itself - for example, frame brightness - then the export time depends on the capture file size and the complexity of the metric calculation. Once exported, however, hypotheses can be iterated at high speed.
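As a hedged sketch of the one-time export step for an image-derived metric, the following computes mean brightness per frame and writes it to a CSV "metric file". The function name, CSV layout, and the list-of-lists frame representation are all illustrative assumptions, not FrameGlide's actual file format.

```python
import csv

def export_brightness_metric(frames, csv_path):
    """One-time export of a per-frame mean-brightness metric to a CSV file.

    `frames` is assumed to be a sequence of 2-D grids of 8-bit pixel values
    (a stand-in for decoded video frames). Later metric queries read only
    the small CSV, never the video itself."""
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "brightness"])
        for index, frame in enumerate(frames):
            pixels = [p for row in frame for p in row]
            # Mean 8-bit pixel value, normalized to the range 0..1.
            writer.writerow([index, sum(pixels) / len(pixels) / 255.0])

# Two synthetic 2x2 grayscale "frames": one dark, one bright.
frames = [[[25, 25], [25, 25]], [[250, 250], [250, 250]]]
export_brightness_metric(frames, "brightness.csv")
```

The export cost is paid once per metric; every subsequent filter or reorder query runs against the compact metric file, which is what makes second-scale iteration over 100-gigabyte captures plausible.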

One useful way to think about the filtering aspect of the FrameGlide environment is as a "virtual strobe light", but where, unlike a normal strobe, the times between flashes can be measured in days as well as milliseconds (and thus can be used to see repetitive motions too slow to be seen with the naked eye), can be uneven in time, and can be keyed to external or within-image metrics.
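The "virtual strobe" idea can be sketched in a few lines: instead of flashing at a fixed interval, select frames whenever a metric condition fires, so the gaps between "flashes" are uneven and data-driven. The helper name and the barometric-pressure example below are hypothetical.

```python
def strobe_frames(timestamps, metric, trigger):
    """'Virtual strobe': select frame times whenever a metric condition fires.

    Unlike a physical strobe, intervals between flashes need not be even -
    they can span milliseconds or days, keyed to within-image or external
    metrics rather than a clock."""
    return [t for t, m in zip(timestamps, metric) if trigger(m)]

# Hypothetical example: flash at moments where barometric pressure
# drops below 990 hPa during a storm capture.
times = [0.0, 3600.0, 7200.0, 10800.0, 14400.0]   # seconds since start
pressure = [1012.0, 1001.0, 989.5, 985.0, 995.0]  # hPa, one reading per frame
flashes = strobe_frames(times, pressure, lambda p: p < 990.0)
```

Setting the trigger to a slow periodic condition instead (say, once per day at the same pressure reading) is how the same mechanism reveals repetitive motions far too slow for the naked eye.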

Examples of the kinds of things the system makes quick and easy may be seen in the bulleted list on our homepage.

Research-Related Offerings and Work Process

For complex data analysis projects, we recommend an arrangement that combines your domain-specific knowledge and data with our unique speed and flexibility. The most efficient process will likely be one in which we work together to extend and configure the system for your specific needs, assist with metric export and query construction, etc.

In the software arena, extensions might include new metrics, support for additional file types, or adding global system functionality.

In the hardware arena, extensions might include triggering capture based on project-specific sensors, integrating with novel user interfaces, capturing metric data from external sensors on the fly, etc.

Collaborations

Free analysis, barters, demos, and more

If you have a data set that is both broadly interesting and lends itself to analysis without undue custom development work on our part, we welcome discussion of a barter scenario: in exchange for access to your data (we are happy to sign a non-disclosure agreement), and your endorsement if we are able to tease out significant new information, we will "play" a bit to see what we can find for you.

Examples could include:

  • 100 hours of bird behavior video with accompanying time-synchronized audio chirps.
  • Tornado captures with accompanying wind speed and barometric pressure measurements.
  • A week's worth of traffic patterns at an intersection, ideally with accompanying metadata such as time of day, traffic light state, weather conditions, etc.

Finally, please be sure to check out our demos area.

Grants

We are eager and available to help write grant proposals for joint research and development efforts. For the "right" problems, where your needs and our offerings mesh, we can speed up your pace of discovery by over a factor of 1000. Going further, we can allow you to see patterns that might otherwise never be noticed at all.