playListNetWork authoring environment

screenshot of playfield

The playListNetWork is a distributed video editing database that allows multiple users in different locations to edit and annotate media clips and playlists simultaneously.

initial goal:
Our goal was to 'author media collaboratively' and retain the author's edit decisions, while displaying the results to the public from other perspectives as well as the author's own - image choice, thematic keyword choice, etc. Thus our media repository was built to aid the artists in navigating while composing, and to let the public use these same navigations - the artist's annotations - as ways into the complex work.

A) a database is used to retain:
media locations; auto-generated or 'personally entered' metadata; keyframe representations; the links and junctures between the clips and sequences that create runtimes and playlists; interaction commands for later viewing; and so on.

A full history is kept of all 'moves' for eventual statistical analysis and tracking.
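As a rough illustration only (the table and column names below are assumptions, not the project's actual schema), a minimal version of such a database could be sketched in Python with SQLite as follows:

import sqlite3

conn = sqlite3.connect("playlistnetwork.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS clips (
    clip_id       INTEGER PRIMARY KEY,
    name          TEXT,
    media_type    TEXT,            -- video, image or sound
    media_url     TEXT,            -- media location
    thumbnail_url TEXT,            -- keyframe representation
    start_time    REAL,
    end_time      REAL
);

-- free-form, auto-generated or author-entered metadata: any name, any value
CREATE TABLE IF NOT EXISTS clip_metadata (
    clip_id INTEGER REFERENCES clips(clip_id),
    key     TEXT,                  -- e.g. 'keyword single', 'SMIL transition'
    value   TEXT
);

-- junctures: the links between clips and sequences that build runtimes and playlists
CREATE TABLE IF NOT EXISTS junctures (
    runtime   TEXT,                -- letter code: 'A', 'B', 'C', ...
    position  INTEGER,             -- location point within the runtime
    from_clip INTEGER REFERENCES clips(clip_id),
    to_clip   INTEGER REFERENCES clips(clip_id)
);

-- full history of all 'moves' for later statistical analysis and tracking
CREATE TABLE IF NOT EXISTS history (
    ts      TEXT DEFAULT CURRENT_TIMESTAMP,
    user    TEXT,
    action  TEXT,                  -- e.g. 'create clip', 'edit metadata', 'link'
    details TEXT
);
""")
conn.commit()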

B) the playfield allows users to:
1.1. create "clips" on a visual "playfield" representing media assets like video, images and sounds

1.2. arrange these clips in playlists

1.3. add many types of properties (metadata) to clips under any name you like, and change properties that are already there

1.4. quickly switch between multiple views of the project, representing different searches or filters of the assets

1.5. link the sequences and media

1.6. export these playlists to be rendered (a rough sketch of these operations follows this list)
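The sketch below illustrates these playfield operations in Python; the class and method names, and the media URL, are hypothetical stand-ins rather than the actual playListNetWork API.

from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    media_url: str                                    # video, image or sound asset
    properties: dict = field(default_factory=dict)    # metadata under any name you like

@dataclass
class Playlist:
    name: str
    clips: list = field(default_factory=list)         # ordered arrangement of clips

    def export(self) -> dict:
        """Export the playlist so it can be rendered downstream."""
        return {"playlist": self.name,
                "items": [{"media": c.media_url, **c.properties} for c in self.clips]}

# 1.1 create a clip representing a media asset (URL is invented for illustration)
intro = Clip("intro", "rtsp://example.org/media/intro.rm")
# 1.3 add properties under any name, or change existing ones
intro.properties["keyword single"] = "archive"
intro.properties["SMIL transition"] = "fade"
# 1.2 arrange clips into a playlist
runtime_a = Playlist("runtime A", [intro])
# 1.4 a 'view' is effectively a search/filter over the asset pool
archive_view = [c for c in [intro] if c.properties.get("keyword single") == "archive"]
# 1.5 linking of sequences and media is handled through juncture records (see the database sketch above)
# 1.6 export for rendering
print(runtime_a.export())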

Metadata columns used in our example include:
name

date

media type

start and end time

duration

thumbnail

media location

runtime junction points

keyword single

keyword text

SMIL transition

descriptions

implementation comments

still file

real media file

audio file

audio file #2

VRML audio file

VRML animation instructions

and others...

Our 71 keywords (each of which can be a single word, a short phrase or a couple of words) are: archive, back, barriers to entry, bomb, bubble, cell, conscious, control, counter intelligence, cross pollination, data entry, deconstruct, discharge, disengage, diversion, down turn, dropped frame, feel, fluctuation, furl, game, gloss, guardian, hear, help, I/O, ignore, label, line, mar, meaning, mock up, mutation, nature, news cycle, nurture, obligate, online, park, pedestrian, petro chemical, pick me up, POV, point to point, pop up filter, proliferate, reproduce, saturated, scape, see, smear, smell, spam, spatial recognition, specials!, spin, split, stack heap overflow, subspace, symptom, tele commute, thought processed, time stamp, tomorrow, transmit, trust, two dimensional, vehicle, voice, wait, waste

Central runtimes use letter codes: A, B, C, etc.

click here for a more detailed technical overview of the playList authoring environment

disPlayList viewer's interface

The application called disPlayList is the public view and interface for a streaming media work authored with the playListNetWork software. It is a web application that runs in a browser, using various plugins to display media. As an interface it is used to visualize the multi-threaded playlists and provide navigation through their structure.

5. A) backend: takes commands associated with the information in the database and displays the streaming results. These can be based on words, images, pre-determined sequences, 'the look of an aspect of the display', etc.
The two links below show the generation of a playlist based on 1. a word search and 2. a position pointer referring directly to a clip.

Clicking the links below shows the generated XML code that is then used to stream the video with synchronized audio and keyframes.

Search Clips generates an XML document that uses a keyword search term as its start point. It then looks at all the associated keywords chosen and chooses at random an authored playlist that starts with one of the searched terms.
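A sketch of that selection step, assuming a database wrapper with hypothetical find_clips_with_keywords and all_playlists helpers:

import random

def search_clips(db, search_terms):
    # clips whose keyword metadata matches any of the searched terms
    matches = db.find_clips_with_keywords(search_terms)    # hypothetical helper
    # authored playlists that start with one of those clips
    candidates = [pl for pl in db.all_playlists()          # hypothetical helper
                  if pl.clips and pl.clips[0] in matches]
    if not candidates:
        return None
    # choose one authored playlist at random and hand it on for XML serialization
    return random.choice(candidates)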

Generate Sequence generates a SMIL document that uses a particular location point in a runtime as its start point. It then looks at all the associated junctures from that point onward and chooses one of the authored playlist paths at random. You can paste the .smi directly into RealPlayer to see the actual result.
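One way such a SMIL document might be assembled is sketched below; the juncture-walking helper, the clip and juncture fields, and the layout values are assumptions for illustration, not the actual implementation.

import random

def generate_sequence(db, runtime, position):
    # walk the junctures from the given location point, choosing one authored path at random
    path, point = [], (runtime, position)
    while True:
        junctures = db.junctures_from(point)        # hypothetical helper
        if not junctures:
            break
        j = random.choice(junctures)
        path.append(j.to_clip)                      # hypothetical juncture fields
        point = j.next_point
    items = "\n      ".join(
        f'<video src="{clip.media_url}" region="video"/>' for clip in path)
    # minimal SMIL document of the kind RealPlayer-era clients could play
    return f"""<smil>
  <head>
    <layout>
      <root-layout width="320" height="240"/>
      <region id="video" width="320" height="240"/>
    </layout>
  </head>
  <body>
    <seq>
      {items}
    </seq>
  </body>
</smil>"""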

Tracking shows the hit count for word and location searches.
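A minimal sketch of that hit counting, assuming a simple 'hits' table (the table and its columns are an assumption, not the actual implementation):

import sqlite3

def track_hit(conn: sqlite3.Connection, kind: str, term: str) -> None:
    # every word or location search bumps a counter that the Tracking page can read back
    conn.execute("""CREATE TABLE IF NOT EXISTS hits (
                        kind TEXT, term TEXT, count INTEGER DEFAULT 0,
                        UNIQUE (kind, term))""")
    conn.execute("""INSERT INTO hits (kind, term, count) VALUES (?, ?, 1)
                    ON CONFLICT(kind, term) DO UPDATE SET count = count + 1""",
                 (kind, term))
    conn.commit()

# example: track_hit(conn, "word", "archive") or track_hit(conn, "location", "A:3")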

disPlayList process illustrated

----------

Playlist Public Interface:
Two approaches to navigation are each visible in separate windows. They rely on open-source VRML to display navigable, animated 3D representations of the authored environment that are malleable and depend on the public user's search, camera choice, navigation and the keyframes chosen and 'clicked'.

5. B) visual display and visual navigation engine
allows the user to move through clip representations from various perspectives, and shows the clips in each sequence 'dance' into position once recalled.
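As a rough idea of how such a scene could be generated (the geometry, spacing and clip fields below are illustrative assumptions, not the project's actual files), each clip in a recalled sequence might become a textured quad in a VRML world:

def clips_to_vrml(clips, spacing=2.0):
    # one textured quad per clip, spaced along the x axis so the viewer can fly past them
    nodes = []
    for i, clip in enumerate(clips):
        nodes.append(f"""Transform {{
  translation {i * spacing} 0 0
  children [
    Shape {{
      appearance Appearance {{ texture ImageTexture {{ url "{clip.thumbnail_url}" }} }}
      geometry Box {{ size 1.6 1.2 0.02 }}
    }}
  ]
}}""")
    return "#VRML V2.0 utf8\n\n" + "\n\n".join(nodes)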

You can download movies, navigate the VRML files and look at screen grabs. These are examples and are not yet fully functional with the specific information in the database.

5. C) textual search engine - returns a media stream and presents an animated/interactive visualization of the playlist based on a keyword search, which can then be navigated visually. It includes any metadata columns, which could then be made interactive (words float over top).
It shows a thematic inter-relation between the parts of the work and comprises:

media stream (top left corner)

textual search engine (top right corner)

animated/interactive 3D visualization of the playlist (bottom half of screen)

To navigate, select a keyword - the search return brings up keyframes associated with the word, and clicking on one of them launches the stream in the top-left window frame. The VRML animation is then pulled along by communication with the video playback. This effect creates a map of the playlists.
These keyframes allow a thematic visualization of the authored work; rolling over any of them reveals a more detailed text that contextualizes the thematic keyword. These texts are derived from the comments the authors included with their clips and entered in the database while authoring the video structure.
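A small sketch of what such a search return might carry, assuming hypothetical field names and a hypothetical query helper: each keyframe is paired with the stream it launches and the author's comment text shown on rollover.

def keyword_search_response(db, keyword):
    results = []
    for clip in db.find_clips_with_keywords([keyword]):               # hypothetical helper
        results.append({
            "keyframe": clip.thumbnail_url,                           # plotted in the 3D map
            "rollover_text": clip.properties.get("descriptions", ""), # author's comment text
            "stream_url": clip.media_url,                             # launched in the top-left frame
        })
    return {"keyword": keyword, "results": results}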

Here we haven't 'shown' tracking of where and when people 'stop watching' by re-clicking or choosing again, but this information is available.

for flow charts and a technical overview of disPlayList see:

disPlayList INTERFACE TECHNOLOGY