topical media & game development


Synchronized Multimedia


To enable simple authoring of TV-like multimedia presentations such as training courses on the Web, W3C has designed the Synchronized Multimedia Integration Language (SMIL, pronounced "smile"). The SMIL language is an easy-to-learn, HTML-like language, so SMIL presentations can be written using a simple text editor. A SMIL presentation can be composed of streaming audio, streaming video, images, text or any other media type.

For a more detailed description of the goals of the SMIL language, see the W3C Activity Statement on Synchronized Multimedia, a regularly updated report to W3C members that is also available to the public.

Work on Synchronized Multimedia is being managed as part of W3C's User Interface Domain.


  1. Introduction
  2. Current Situation
  3. Concepts simply explained
  4. What the future holds


W3C's Synchronized Multimedia Activity has focused on the design of a new language for choreographing multimedia presentations in which audio, video, text and graphics are combined in real time. The language, the Synchronized Multimedia Integration Language (SMIL), is written as an XML application and is currently a W3C Recommendation. Simply put, it enables authors to specify what should be presented and when, so that, for example, the moment a sentence is spoken can be made to coincide with the display of a given image on the screen.

Concepts -- sequential and parallel timing

The basic idea is to name media components for text, images, audio and video with URLs and to schedule their presentation either in parallel or in sequence.
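As a minimal sketch of this idea (the media URLs here are hypothetical), a presentation that first plays an introductory audio clip on its own, and then shows an image together with a video, could be written as:

```xml
<smil>
  <body>
    <seq>
      <!-- first: play the intro clip by itself -->
      <audio src="intro.mp3"/>
      <!-- then: show the image and the video at the same time -->
      <par>
        <img src="slide1.jpg"/>
        <video src="clip1.mpg"/>
      </par>
    </seq>
  </body>
</smil>
```

The SEQ element schedules its children one after another, while the PAR element schedules its children simultaneously; nesting the two gives the basic sequential/parallel timing model.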

SMIL presentation characteristics

The SMIL language has been designed so that it is easy to author simple presentations with a text editor. The key to success for HTML was that attractive hypertext content could be created without requiring a sophisticated authoring tool. The SMIL language achieves the same goal for synchronized hypermedia.

Example applications

      <a href="#Story"> <img src="button1.jpg"/> </a>
      <a href="#Weather"> <img src="button2.jpg"/> </a>
      <excl>
           <par id="Story" begin="0s">
             <video src="video1.mpg"/>
             <text src="captions.html"/>
           </par>
           <par id="Weather">
             <img src="weather.jpg"/>
             <audio src="weather_rpt.mp3"/>
           </par>
      </excl>
This example demonstrates the use of two tags, PAR (parallel) and EXCL (exclusive). The EXCL element was introduced in the SMIL 2.0 draft. In the example, two images are shown as buttons, and a selection is played only when the user clicks the corresponding button. If the user selects "Story" and then selects "Weather", "Story" is stopped and "Weather" is played. The EXCL tag makes it possible to play one selection (but not both) at a time.

Notice the use of the <par> tag to schedule captions for the story in parallel to the video, and a spoken weather report in parallel to the picture of the weather.

The SMIL 2.0 Working Draft proposes XML tags for controlling presentation of multimedia components in sequence, in parallel and also on an "exclusive" basis (as in the example above). The draft also defines a number of elements and attributes useful for controlling presentation, synchronization, and interactivity.
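Besides hyperlink activation, the SMIL 2.0 draft also allows a child of EXCL to be started by an interaction event, using an event value in its begin attribute. A hedged sketch of this (the element ids and media URLs below are made up for illustration) might look like:

```xml
<excl>
  <!-- each child starts when its button image is activated;
       starting one child stops whichever sibling is playing -->
  <par begin="storyButton.activateEvent">
    <video src="story.mpg"/>
  </par>
  <par begin="weatherButton.activateEvent">
    <audio src="weather_rpt.mp3"/>
  </par>
</excl>
```

Here storyButton and weatherButton would be the ids of the clickable images elsewhere in the presentation; the event-based begin values follow the timing syntax of the SMIL 2.0 Working Draft.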


Experience from both the CD-ROM community and the Web multimedia community suggested that it would be beneficial to adopt a declarative format for expressing media synchronization on the Web, as an alternative and complementary approach to scripting languages. Following a workshop in October 1996, W3C established a first working group on synchronized multimedia in March 1997. This group focused on the design of a declarative language, and its work led to SMIL 1.0 becoming a W3C Recommendation in June 1998.


You can watch a SMIL presentation by downloading a SMIL "player" to your PC. A list of SMIL players is available from the Synchronized Multimedia Home Page.

Synchronized Multimedia Working Group

Mission Statement

The mission of the SYMM working group is to continue W3C's work on synchronized multimedia that started with SMIL 1.0. The goal is to extend the development of SMIL as a declarative, XML-based timing and synchronization language, and advance the corresponding timing model.

Design goals

Among the design goals for SMIL 2.0 is to allow module-based reuse of SMIL syntax and semantics in other XML-based languages, in particular those that need to represent timing and synchronization.

The SYMM Working Group plans to achieve W3C Proposed Recommendation status for SMIL 2.0 in May 2001, and W3C Recommendation status by the end of June 2001.

(C) Æliens 04/09/2009

You may not copy or print any of this material without explicit permission of the author or the publisher. In case of other copyright issues, contact the author.