Why Common Sense for Video Production? IC Technical Report 2002
Video cameras are becoming cheap, small and ubiquitous. With advances in memory, cameras will increasingly be designed to be always ready, always recording. When cameras are always ready, how will videographers -- professional and/or amateur -- decide what to shoot, when to shoot and how to index their video material to best support their communication requirements? In this paper, we describe an approach and early experiments that use a commonsense database and reasoning techniques to support a partnership between the camera and videographer during video capture. We describe a new paradigm for producing commonsense video metadata and describe how it can have a positive impact on video content capture, representation, and presentation.
Video Database Design: Convivial Storytelling Tools IC Technical Report 1994-04
Traditionally, video and film stories have been developed by a single author for a single-release movie. Increasingly, video databases will be constructed as content libraries. These libraries will be used to deliver personalized messages to people who know very little about video story construction or editing. The challenge in making these systems usable is to develop storytelling tools for those unsophisticated users. Story generation presumes some input by the user, first to create appropriate video descriptions and second to suggest a story to tell. This paper offers an overview of some methods of description which have been associated with particular types of video logging and databases in the past. A general problem with these systems has been how to develop video annotations with efficiency and consistency. A new approach, story-based annotation, is proposed. In this method, a tool set is used for creating a top-down story abstraction. Coupled with automatic database selection, this tool set allows the user to encode story-based annotations and expert knowledge about editing into the database while producing stories. As the database grows, it becomes structured and annotated by a process appropriate to the medium, namely storytelling. This structuring process optimizes the database for retrieval of video in story form.
LogBoy: A Graphical Interface for Narrative Databases IC Technical Report 1993-04
Filmmakers who create both interactive and linear video experiences are beginning to experiment with a new production technique using annotated video databases and description-based story-structures. This process calls for a new type of video logging tool which will serve the specific needs of the filmmaker, including temporal representation, sketchy description bases, contextual description generation and overhead views of descriptions. LogBoy is a prototype video logging tool which addresses these needs by centering the logging process around descriptive characteristics rather than the video itself. The paper describes the motivation for and design of LogBoy's graphical interface.
Maitre-D: A Site-based Agent for Web-Page Recommendation IC Technical Report 1997-01
This project explores website-based agents designed to provide direction and recommendations to visiting clients. Time, processing power, and intimate knowledge of the content being served are leveraged on the server side to allow suggestions and links to be pushed to users based on their current interests and browsing history.
Let's See That Again: A Multiuse Video Database Project IC Technical Report 1994
In the past, video production has had three distinct phases: content collection, logging, and video editing. Production was a fairly linear process, with little overlap or communication among the three phases. Typically, logging supported editing to produce single use structures. With the advent of digital video technology, we can imagine the production of multiuse video databases. These databases pose the problem of how to describe content for multiple use.
M-Views Evaluation IC Technical Report 2003-10
Our goal is to investigate how the audience will participate in mobile cinema and how the M-Views system will support both presentation and creation of mobile cinema content. We are going to conduct four evaluation studies: "MIT in Pocket", "15 Minutes", Media Lab tour production, and Media Lab tour presentation. Each study focuses on several technical, aesthetic, and methodological issues in mobile cinema...
Lurker: A Thinkie for the Society of Audience IC Technical Report 1995
This paper introduces the newest example of a form of interactive cinema called "thinkies." Thinkies use the medium of interaction along with cinema, to elicit a thought experience in an audience.
Multiscale Coding of Images: Approaching the Problem from a Filmmaker's Perspective IC Technical Report 1988
The use of the video coding system proposed by William J. Butera in the thesis "Multiscale Coding of Images" is explored in terms of its relevance to current and proposed work in the Film/Video Group. Also, utilizing knowledge of motion picture techniques, a strategy for the possible optimization of the encoding process is explored.
LogBoy and FilterGirl: Tools for Personalizable Movies IC Technical Report 1993-08
LogBoy and FilterGirl constitute a toolkit designed specifically for building personalizable movies which use a fluid interaction mode. Fluid interaction is an important interaction metaphor which relies on narrative playout that is uninterrupted by the interface. By avoiding periodic viewer queries, fluid interaction encourages reverie and continuity in storytelling. Fluid interaction requires not only an interactive story structure, but also a machine-readable representation of content. LogBoy and FilterGirl provide tools for these ends. LogBoy is a video database tool which allows the moviemaker to attach descriptions to video clips. LogBoy's interface provides a graphical "overhead" view of descriptions which aids in the creative process. FilterGirl provides a way for creators to quickly and easily create personalizable story structures. FilterGirl implements a filter-based story description language as well as a story structure previewing facility. Functionality of the tool set is discussed, as well as interface design and implementation details.
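The idea of filter-based clip selection can be sketched as follows. This is a minimal illustration only, not FilterGirl's actual language or implementation: clips are represented here as dictionaries of descriptive annotations, and filters as predicates that can be composed.

```python
# Sketch of filter-based clip selection (hypothetical names and data;
# FilterGirl's real representation is not specified in this abstract).

def has(key, value):
    """Filter passing clips whose annotation `key` equals `value`."""
    return lambda clip: clip.get(key) == value

def all_of(*filters):
    """Combine filters so a clip must satisfy every one of them."""
    return lambda clip: all(f(clip) for f in filters)

def select(clips, flt):
    """Return the clips that pass the given filter."""
    return [c for c in clips if flt(c)]

clips = [
    {"character": "Ann", "location": "kitchen"},
    {"character": "Ann", "location": "street"},
    {"character": "Bo", "location": "kitchen"},
]
ann_in_kitchen = all_of(has("character", "Ann"), has("location", "kitchen"))
print(select(clips, ann_in_kitchen))  # the single clip matching both filters
```

Composing small predicates like this lets a story structure be expressed declaratively, with the database queried at playout time rather than hand-edited in advance.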
The Galatea Network Video Device Control System, Version 2.5 Project Athena Technical Report 1991
"Galatea" is a network transparent video device control system, providing reliable access to various video devices in a distributed network environment. This paper describes many features of "Galatea", in addition to a description of the goals and strategies used in creating the system, and contains a manual for the C language programming interface.
The Combinatorics of Storytelling: "Mystery Train" Interactive IC Technical Report 1990-04
What is an interactive story? The traditional idea of a "story" is linear - it has a beginning, a middle, and an end. There is an apparent contradiction in the phrase "interactive story," because non-linearity is essential to interactivity.
Issues and Strategies for Computerized Delivery of Videodisk-Based Materials IC Technical Report 1987-05
A rapidly-growing "library" of movies and slide collections is available on CAV optical videodisks. This important resource is expanding even before delivery systems which take full advantage of its power are available. In the future, videodisk material will be delivered by sophisticated computers using a standard set of commands, data structures, and user-interface functions; in the meantime, chaos reigns.
Tired of Giving In: An Experiment in Narrative Unfolding Technical Report 1999
We present a model for structuring computer-based narratives so that viewers can interact without changing the story. Instead, interactions reveal the story through varying perspectives and at varying levels of detail. The story maintains its integrity while taking on progressively deeper meanings. Viewers are like members of a Greek chorus who query characters and comment on events. We describe lessons learned from a prototype that retells the story of the 1955 arrest of Rosa Parks and events leading to the Montgomery Bus Boycott. Lessons include observations about narrative structure, writing to support narrative unfolding through user interaction, designing the multimedia interface to communicate conditions for interactivity, and implementing the program so that information about viewer interactions helps to control the presentation. We include projections for using the chorus model to support multiuser interactions.
Moving Pictures: Looking out / looking in Tech Note, MLE, Dublin 2004-06
Moving Pictures, a concept we are currently developing, is an accessible, robust multiuser unit and a set of physical tools that enable young users to explore, manipulate, and share video content with others.
Digital Liberties Tech Note, MLE, Dublin 2004-07
We hypothesize that the relating, capturing, re-enacting, and re-distribution of historical and personal narratives as told by community members can play an important role in fostering a sense of place.
Emonic Environment Users Guide Tech Report 2006-01
The Emonic Environment (EE for short) allows you to create, browse, compose, and exchange audio, video, and text in an improvisational manner. It is designed to provide you with a new kind of interaction with a computer, one in which the computer takes an active role in determining the content of what is being created. It makes your media look, sound, and behave differently than it would have if you were the only “decision maker” in its design, using the computer for no more than executing ideas. In the EE, the computer contributes to the creation process in various ways, which are described later on.
Multiple Views of Digital Video Interactive Cinema Tech Report 1992-03
Recordings of moving pictures can be displayed in a variety of different ways to show what they hold. The historical and most absorbing way is to display the images through a rapid succession of full-screen frames. However, different forms of presentation can be used to emphasize different attributes. The video streamer positions frames of digital video sequentially in front of each other with a slight offset from frame to frame; visually this appears as a three-dimensional extrusion of the video stream in time, which emphasizes differences along the side and top edges of adjacent frames. In this way the video streamer helps us see characteristics between frames and across shots, such as transition types and cutting rhythms. While viewing the video stream, one can select bounds of interest in time; this selection can be adjusted using a rubbing motion along the stream. The micro-viewer shows more precise frame-to-frame relationships, based on the portion of the video stream currently selected. The shot parser uses a frame-differencing algorithm to offer a helpful element of machine-assisted abstract analysis.
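The frame-differencing idea behind a shot parser can be sketched as follows. This is a minimal illustration under stated assumptions, not the shot parser's actual implementation: frames are assumed to be grayscale NumPy arrays, and the threshold is a hypothetical value that would be tuned per source material.

```python
import numpy as np

def detect_cuts(frames, threshold=30.0):
    """Flag likely shot boundaries by comparing successive frames.

    frames: sequence of 2-D grayscale arrays of identical shape.
    threshold: mean absolute pixel difference above which a cut is
    reported (an assumed value; real material needs tuning).
    Returns the indices i where a cut occurs between frame i-1 and i.
    """
    cuts = []
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float))
        if diff.mean() > threshold:
            cuts.append(i)
    return cuts

# Synthetic example: two "shots" of uniform brightness joined end to end.
shot_a = [np.full((4, 4), 10.0) for _ in range(3)]
shot_b = [np.full((4, 4), 200.0) for _ in range(3)]
print(detect_cuts(shot_a + shot_b))  # reports the boundary between the shots
```

A fixed global threshold like this detects hard cuts but not gradual transitions such as dissolves, which is one reason the streamer's visual display of edge differences remains useful alongside automatic parsing.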
Comic Creation Game for StoryNet Tech Report 2004-06-21
The StoryNet project aims to collect information about stories to further machine understanding of the common sense aspect of human experience. A story by this definition is a causal series of events or states. The existing OpenMind website has been effective at gathering information about the relationships between everyday objects, such as “shampoo is found in the bathroom”. The Common Sense group also wishes to collect information about how events are normally sequenced, such as, “Eric was thirsty. Eric took milk out of the fridge. Eric drank the milk, and was satisfied.” Ministories such as these can teach a great deal about motivation and about combinations of actions leading to a result.
Hopstory II Tech Report 2004-07-04
Hopstory II is novel in the way it distributes narrative content in the same building where the story took place, applying narrative theory and new technologies to the practice of mobile distributed cinema.
The Slipstream Project Tech Report 1989-09
Since the beginning of cinema, motion picture production has evolved a methodology for communicating necessary data about the film to a wide range of production personnel whose actions will affect the final product. From initial conception and visualization to direction, editing, and final printing, much of the information which shapes the common vision for the story is visually represented and/or identified by location within the story. Traditionally, the data generated over the course of a film's realization has been isolated within specific stages of production: pre-production and planning, production and shooting, and post-production. Pre-production, for example, with its volumes of storyboards, notes, location photographs, conceptual art, and design, passes surprisingly little information on to the shooting crew and editorial. This isolation of data has been shaped in part by the nature of the technology used to record the information: paper, the usual recording and storage medium, is bulky, is easily damaged, and cannot be updated easily...