








Submitted To


Submitted in partial fulfillment of the requirements for the award of



This is to certify that the project report entitled "JAVA MEDIA PLAYER", submitted to the RAJIV GANDHI PROUDYOGIKI VISHWAVIDYALAYA, BHOPAL for the award of the DIPLOMA IN COMPUTER SCIENCE & ENGINEERING (C.S.E.), is project work carried out by JAY PRATAP SINGH, ROLL No. 08017C04056, under the guidance of:







DATE: JUNE 3, 2011
PLACE: GWALIOR (M.P.)

It is my esteemed pleasure to present this project work on JAVA MEDIA PLAYER. I wholeheartedly thank all those who assisted me in completing this task. First of all, I thank my project guide, Mr. MOHAN DHURVEY, who was the propelling force behind this endeavor. I also express my sincere sense of gratitude to our esteemed Head of Department, Mr. SAILENDRA SATYARTHI. Without Mr. MOHAN DHURVEY's sustained interest, unlimited patience, and sound counsel, this work would not have been possible; throughout the project he was a perpetual source of inspiration and advice.




The word PROJECT is a word of its own importance, and its meaning is explained below:

P stands for Planning
R stands for Resources
O stands for Operation
J stands for Joint Effort
E stands for Efficiency
C stands for Communication
T stands for Technique

1. It is said to plan the work first; without planning, work should not be started, so as to avoid the delays and confusion faced during the work.
2. While preparing a project, the available resources (e.g. employee records) should be well known.
3. It should be decided in advance which operations are to be performed and how they will be done.
4. For the proper working of a system, it should be ensured that there is proper coordination between the staff and students, each working with their own efforts.
5. The efficiency and ability of the group members matter.
6. There should be proper communication with all persons who join the particular task.

Media player is a term typically used to describe computer software for playing back multimedia files. While many media players can play both audio and video, others focus only on one media type or the other. Such players are known as either audio players or video players and often have a user interface tailored for the specific media type. Media players often display icons known from physical devices such as tape recorders and CD players, for example play, pause, and stop.

Many media players, especially those designed to play music, display available songs in a format known as a media library, which allows the user to organize their music by categories such as artist, album, genre, year, and rating. Examples of media players that include media libraries are Amarok, Clementine, Banshee, iTunes, Rhythmbox, Winamp, and Windows Media Player.


Microsoft Windows comes pre-loaded with Windows Media Player. Mac OS X comes pre-loaded with QuickTime Player and iTunes. Linux distributions come pre-loaded with various media players, including Amarok, Audacious, Banshee, MPlayer, Rhythmbox, Totem, VLC, and xine.

To be intuitive and innovative is the wit of the soul and mind of a human being. This is what engineering demands from us: that we must explore new avenues and dimensions by tapping our in-built capabilities. This was the main motivating factor behind the submitted project. The core purpose behind this team work was to utilize different minds in order to create a pragmatic innovation under the name of JAVA MEDIA PLAYER, using JAVA. We have decided to continue with the same spirit in the times ahead, in order to utilize our technical knowledge in the best possible manner, so that we can prove ourselves to be the most useful members of the engineering community.



Java is a programming language originally developed by James Gosling at Sun Microsystems (which is now a subsidiary of Oracle Corporation) and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode (class file) that can run on any Java Virtual Machine (JVM) regardless of computer architecture. Java is a general-purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible. It is intended to let application developers "write once, run anywhere". Java is currently one of the most popular programming languages in use, and is widely used from application software to web applications. The original and reference implementation Java compilers, virtual machines, and class libraries were developed by Sun from 1995. As of May 2007, in compliance with the specifications of the Java Community Process, Sun relicensed most of its Java technologies under the GNU General Public License. Others have also developed alternative implementations of these Sun technologies, such as the GNU Compiler for Java, GNU Classpath, and Dalvik.

Swing is a graphical user interface library for the Java SE platform. It is possible to specify a different look and feel through the pluggable look and feel system of Swing. Clones of Windows, GTK+ and Motif are supplied by Sun. Apple also provides an Aqua look and feel for Mac OS X. Where prior implementations of these looks and feels may have been considered lacking, Swing in Java SE 6 addresses this problem by using more native GUI widget drawing routines of the underlying platforms.


The Java Media Framework (JMF) is a Java library that enables audio, video and other time-based media to be added to Java applications and applets. This optional package, which can capture, play, stream, and transcode multiple media formats, extends the Java Platform, Standard Edition (Java SE) and allows development of cross-platform multimedia applications.



// JAVA MEDIA PLAYER
import java.awt.*;
import java.awt.event.*;
import java.awt.image.*;
import*;
import javax.imageio.*;
import*;   // JMF: Player, Manager, ControllerListener, ...
import javax.swing.*;

public class Media extends JFrame
{
    private Player player;
    private File file;
    JFrame frame2 = new JFrame();   // playlist window
    JPanel p = new JPanel();
    int i = 0;                      // snapshot counter

    public Media()
    {
        super( "JVIN PLAYER" );

        // "Media" menu: open a file, show the playlist, quit
        JMenu m1 = new JMenu( "Media" );
        JMenuItem i1 = m1.add( new JMenuItem( "Open File" ) );
        m1.add( new JSeparator() );
        JMenuItem i2 = m1.add( new JMenuItem( "Playlist" ) );
        m1.add( new JSeparator() );
        JMenuItem i3 = m1.add( new JMenuItem( "Quit" ) );

        // "View" menu: resize the window or take a snapshot
        JMenu m2 = new JMenu( "View" );
        JMenuItem i4 = m2.add( new JMenuItem( "Small Screen" ) );
        m2.add( new JSeparator() );
        JMenuItem i5 = m2.add( new JMenuItem( "Full Screen" ) );
        m2.add( new JSeparator() );
        JMenuItem i6 = m2.add( new JMenuItem( "Snapshot" ) );

        p.setLayout( new BoxLayout( p, BoxLayout.Y_AXIS ) );

        // "Info." menu: help and about dialogs
        JMenu m3 = new JMenu( "Info." );
        JMenuItem i7 = m3.add( new JMenuItem( "Help" ) );
        m3.add( new JSeparator() );
        JMenuItem i8 = m3.add( new JMenuItem( "About" ) );

        frame2.setTitle( "Playlist" );

        JMenuBar bar = new JMenuBar();
        bar.add( m1 );
        bar.add( m2 );
        bar.add( m3 );
        setJMenuBar( bar );
        setSize( 300, 300 );
        setDefaultCloseOperation( JFrame.EXIT_ON_CLOSE );
        setVisible( true );

        i1.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                openFile();
                createPlayer();
            }
        } );
        i2.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                openFile();
                playlist();
            }
        } );
        i3.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                System.exit( 0 );
            }
        } );
        i4.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                setLocation( 0, 0 );
                setSize( 300, 300 );
            }
        } );
        i5.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                setLocation( 0, 0 );
                setSize( 1000, 700 );
            }
        } );
        i6.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                snap();
            }
        } );
        i7.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                JOptionPane.showMessageDialog( null,
                    " Java Media Player \n"
                    + "\nTo increase or decrease the volume: right click on the sound icon"
                    + "\nTo increase or decrease the speed of the player: right click on the next icon"
                    + "\nTo view the properties of media files: left click on the icon",
                    "Help", JOptionPane.PLAIN_MESSAGE );
            }
        } );
        i8.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                JOptionPane.showMessageDialog( null,
                    "This program is made by JAY PRATAP, VINAY Kr. SHRIVASTAVA ",
                    "About", JOptionPane.PLAIN_MESSAGE );
            }
        } );
    }

    // Capture the whole screen and save it as d:/imageTest<i>.jpg
    private void snap()
    {
        try {
            Robot robot = new Robot();
            BufferedImage[] bi = new BufferedImage[ 10 ];
            bi[ i ] = robot.createScreenCapture(
                new Rectangle( Toolkit.getDefaultToolkit().getScreenSize() ) );
            ImageIO.write( bi[ i ], "jpg", new File( "d:/imageTest" + i + ".jpg" ) );
            i++;
        }
        catch ( AWTException e1 ) {
            e1.printStackTrace();
        }
        catch ( IOException e1 ) {
            e1.printStackTrace();
        }
    }

    // Let the user choose a media file
    private void openFile()
    {
        JFileChooser fileChooser = new JFileChooser();
        fileChooser.setFileSelectionMode( JFileChooser.FILES_ONLY );
        int result = fileChooser.showOpenDialog( this );
        if ( result == JFileChooser.CANCEL_OPTION )
            file = null;
        else
            file = fileChooser.getSelectedFile();
    }

    // Add the chosen file to the playlist window as a clickable button
    private void playlist()
    {
        if ( file == null )          // user cancelled the file chooser
            return;
        final JButton l1 = new JButton( file.toString() );
        frame2.remove( p );
        p.add( l1 );
        l1.addActionListener( new ActionListener() {
            public void actionPerformed( ActionEvent e ) {
                file = new File( l1.getText() );
                createPlayer();
            }
        } );
        p.add( Box.createVerticalGlue() );
        frame2.add( p );
        frame2.pack();
        frame2.setVisible( true );
    }

    // Create a JMF Player for the chosen file and start playback
    private void createPlayer()
    {
        if ( file == null )
            return;
        removePreviousPlayer();
        try {
            // File.toURL() is deprecated; go through toURI() instead
            player = Manager.createPlayer( file.toURI().toURL() );
            player.addControllerListener( new EventHandler() );
            player.start();
        }
        catch ( Exception e ) {
            JOptionPane.showMessageDialog( this, "Invalid file or location",
                "Error loading file", JOptionPane.ERROR_MESSAGE );
        }
    }

    // Remove the visual and control components of the previous player
    private void removePreviousPlayer()
    {
        if ( player == null )
            return;
        Component visual = player.getVisualComponent();
        Component control = player.getControlPanelComponent();
        Container c = getContentPane();
        if ( visual != null )
            c.remove( visual );
        if ( control != null )
            c.remove( control );
        player.close();
    }

    public static void main( String args[] )
    {
        Media app = new Media();
        app.addWindowListener( new WindowAdapter() {
            public void windowClosing( WindowEvent e ) {
                System.exit( 0 );
            }
        } );
    }

    // Once the player is realized, attach its video and control panels
    private class EventHandler implements ControllerListener
    {
        public void controllerUpdate( ControllerEvent e )
        {
            if ( e instanceof RealizeCompleteEvent ) {
                Container c = getContentPane();
                Component visualComponent = player.getVisualComponent();
                if ( visualComponent != null )
                    c.add( visualComponent, BorderLayout.CENTER );
                Component controlsComponent = player.getControlPanelComponent();
                if ( controlsComponent != null )
                    c.add( controlsComponent, BorderLayout.SOUTH );
                c.doLayout();
            }
        }
    }
}


In digital electronics, analogue electronics and entertainment, a media interface may include media controls or player controls, used to start, change, or adjust the process of watching a film or listening to audio.

SYMBOLS: These are common icons on physical devices and application software:

- Play
- Pause
- Stop

Options to increase or decrease speed, search and change chapters:

- Rewind or fast forward
- Skip to the start or end

Controls to record content or eject media from a device:

- Record
- Eject

Other symbols are Shuffle and Repeat.

TECHNOLOGY: There are many types of entertainment devices that include media controls; some are mobile:

- Blu-ray player
- CD player
- Computer
- DVD player
- Record player
- Remote control
- Tape player


James Gosling, Mike Sheridan, and Patrick Naughton initiated the Java language project in June 1991. Java was originally designed for interactive television, but it was too advanced for the digital cable television industry at the time. The language was initially called Oak, after an oak tree that stood outside Gosling's office; it went by the name Green later, and was finally renamed Java, from a list of random words. Gosling aimed to implement a virtual machine and a language that had a familiar C/C++ style of notation.

Sun Microsystems released the first public implementation as Java 1.0 in 1995. It promised "Write Once, Run Anywhere" (WORA), providing no-cost run-times on popular platforms. Fairly secure and featuring configurable security, it allowed network- and file-access restrictions. Major web browsers soon incorporated the ability to run Java applets within web pages, and Java quickly became popular.

With the advent of Java 2 (released initially as J2SE 1.2 in December 1998), new versions had multiple configurations built for different types of platforms. For example, J2EE targeted enterprise applications and the greatly stripped-down version J2ME targeted mobile applications (Mobile Java); J2SE designated the Standard Edition. In 2006, for marketing purposes, Sun renamed the new J2 versions as Java EE, Java ME, and Java SE, respectively.

In 1997, Sun Microsystems approached the ISO/IEC JTC1 standards body, and later Ecma International, to formalize Java, but it soon withdrew from the process. Java remains a de facto standard, controlled through the Java Community Process. At one time, Sun made most of its Java implementations available without charge, despite their proprietary software status. Sun generated revenue from Java through the selling of licenses for specialized products such as the Java Enterprise System.
Sun distinguishes between its Software Development Kit (SDK) and Runtime Environment (JRE), a subset of the SDK; the primary distinction involves the JRE's lack of the compiler, utility programs, and header files. On November 13, 2006, Sun released much of Java as open source software under the terms of the GNU General Public License (GPL). On May 8, 2007, Sun finished the process, making all of Java's core code available under free software/open-source distribution terms, aside from a small portion of code to which Sun did not hold the copyright. Sun's vice-president Rich Green has said that Sun's ideal role with regard to Java is as an "evangelist." Following Oracle Corporation's acquisition of Sun Microsystems in 2009-2010, Oracle has described itself as the "steward of Java technology with a relentless commitment to fostering a community of participation and transparency".

There were five primary goals in the creation of the Java language:

1. It should be "simple, object oriented and familiar".
2. It should be "robust and secure".
3. It should be "architecture-neutral and portable".
4. It should execute with "high performance".
5. It should be "interpreted, threaded, and dynamic".

One characteristic of Java is portability, which means that computer programs written in the Java language must run similarly on any supported hardware/operating-system platform. This is achieved by compiling the Java language code to an intermediate representation called Java bytecode, instead of directly to platform-specific machine code. Java bytecode instructions are analogous to machine code, but are intended to be interpreted by a virtual machine (VM) written specifically for the host hardware. End users commonly use a Java Runtime Environment (JRE) installed on their own machine for standalone Java applications, or in a Web browser for Java applets. Standardized libraries provide a generic way to access host-specific features such as graphics, threading, and networking. A major benefit of using bytecode is porting. However, the overhead of interpretation means that interpreted programs almost always run more slowly than programs compiled to native executables would, so just-in-time (JIT) compilers, which compile bytecode to machine code at runtime, were introduced from an early stage.

Sun Microsystems officially licenses the Java Standard Edition platform for Linux, Mac OS X, and Solaris. Although in the past Sun licensed Java to Microsoft, the license has expired and has not been renewed. Through a network of third-party vendors and licensees, alternative Java environments are available for these and other platforms. Sun's trademark license for usage of the Java brand insists that all implementations be "compatible". This resulted in a legal dispute with Microsoft after Sun claimed that the Microsoft implementation did not support RMI or JNI and had added platform-specific features of its own. Sun sued in 1997, and in 2001 won a settlement of US$20 million, as well as a court order enforcing the terms of the license from Sun. As a result, Microsoft no longer ships Java with Windows, and in recent versions of Windows, Internet Explorer cannot support Java applets without a third-party plugin. Sun, and others, have made available free Java run-time systems for those and other versions of Windows.

Platform-independent Java is essential to the Java EE strategy, and an even more rigorous validation is required to certify an implementation. This environment enables portable server-side applications, such as Web services, Java Servlets, and Enterprise JavaBeans, as well as embedded systems based on OSGi, using Embedded Java environments. Through the GlassFish project, Sun is working to create a fully functional, unified open-source implementation of the Java EE technologies. Sun also distributes a superset of the JRE called the Java Development Kit (commonly known as the JDK), which includes development tools such as the Java compiler, Javadoc, Jar, and a debugger.

Programs written in Java have a reputation for being slower and requiring more memory than those written in C. However, Java programs' execution speed improved significantly with the introduction of just-in-time compilation in 1997/1998 for Java 1.1, the addition of language features supporting better code analysis (such as inner classes, the StringBuffer class, optional assertions, etc.), and optimizations in the Java Virtual Machine itself, such as HotSpot becoming the default for Sun's JVM in 2000. Currently, Java code has approximately half the performance of C code. Some platforms offer direct hardware support for Java; there are microcontrollers that can run Java in hardware instead of a software JVM, and ARM-based processors can have hardware support for executing Java bytecode through the Jazelle option.


Java uses an automatic garbage collector to manage memory in the object lifecycle. The programmer determines when objects are created, and the Java runtime is responsible for recovering the memory once objects are no longer in use. Once no references to an object remain, the unreachable memory becomes eligible to be freed automatically by the garbage collector. Something similar to a memory leak may still occur if a programmer's code holds a reference to an object that is no longer needed, typically when objects that are no longer needed are stored in containers that are still in use. If methods for a nonexistent object are called, a "null pointer exception" is thrown. One of the ideas behind Java's automatic memory management model is that programmers can be spared the burden of having to perform manual memory management. In some languages, memory for the creation of objects is implicitly allocated on the stack, or explicitly allocated and deallocated from the heap. In the latter case the responsibility of managing memory resides with the programmer. If the program does not deallocate an object, a memory leak occurs. If the program attempts to access or deallocate memory that has already been deallocated, the result is undefined and difficult to predict, and the program is likely to become unstable and/or crash. This can be partially remedied by the use of smart pointers, but these add overhead and complexity. Note that garbage collection does not prevent "logical" memory leaks, i.e. those where the memory is still referenced but never used.
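The difference between memory that becomes eligible for collection and a "logical" leak can be sketched in a few lines of Java. The class and its cache list below are hypothetical examples for illustration, not part of any real API:

```java
import java.util.ArrayList;
import java.util.List;

public class LeakSketch {
    // Hypothetical cache: as long as it holds a reference,
    // the garbage collector must treat the object as live.
    static final List<byte[]> cache = new ArrayList<byte[]>();

    public static void main(String[] args) {
        byte[] local = new byte[1024];
        local = null; // no references remain: the array is now eligible for GC

        cache.add(new byte[1024]); // still referenced: a "logical" leak
                                   // if the program never uses it again

        cache.clear(); // dropping the reference is what fixes the leak
        System.out.println("cached blocks: " + cache.size());
    }
}
```

Note that the collector reclaims the memory at some unspecified later time; clearing the reference only makes the object *eligible* for collection.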

Garbage collection may happen at any time. Ideally, it will occur when a program is idle. It is guaranteed to be triggered if there is insufficient free memory on the heap to allocate a new object; this can cause a program to stall momentarily. Explicit memory management is not possible in Java.

Java does not support C/C++-style pointer arithmetic, where object addresses and unsigned integers (usually long integers) can be used interchangeably. This allows the garbage collector to relocate referenced objects and ensures type safety and security.

As in C++ and some other object-oriented languages, variables of Java's primitive data types are not objects. Values of primitive types are either stored directly in fields (for objects) or on the stack (for methods), rather than on the heap, as is commonly true for objects (but see escape analysis). This was a conscious decision by Java's designers for performance reasons. Because of this, Java was not considered to be a pure object-oriented programming language. However, as of Java 5.0, autoboxing enables programmers to proceed as if primitive types were instances of their wrapper classes.

Java contains multiple types of garbage collectors that can be used to manage the heap; HotSpot ships with several, including the Concurrent Mark Sweep (CMS) collector. For most Java applications, the default collector is good enough.
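The autoboxing behaviour mentioned above can be shown in a short sketch (the class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class AutoboxingSketch {
    static int demo() {
        // A List can only hold objects, yet an int can be added directly:
        List<Integer> values = new ArrayList<Integer>();
        values.add(42);           // autoboxing: int -> Integer

        // Reading back, the Integer is unboxed to a primitive int:
        return values.get(0) * 2; // unboxing: Integer -> int
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints 84
    }
}
```

Before Java 5.0, both conversions had to be written out by hand, e.g. `values.add(Integer.valueOf(42))` and `values.get(0).intValue()`.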

The syntax of Java is largely derived from C++. Unlike C++, which combines the syntax for structured, generic, and object-oriented programming, Java was built almost exclusively as an object-oriented language. All code is written inside a class, and everything is an object, with the exception of the primitive data types (integers, floating-point numbers, boolean values, and characters), which are not classes for performance reasons. Unlike C++, Java does not support operator overloading and multiple inheritance for classes in order to simplify the language and to prevent possible errors and anti-pattern design. Java uses similar commenting methods to C++. There are three different styles of comment: a single line style marked with two slashes (//), a multiple line style opened with a slash asterisk (/*) and closed with an asterisk slash (*/), and the Javadoc commenting style opened with a slash and two asterisks (/**) and closed with an asterisk slash (*/). The Javadoc style of commenting allows the user to run the Javadoc executable to compile documentation for the program.
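A minimal sketch showing the three comment styles side by side (the class itself is a made-up example):

```java
/**
 * Javadoc-style comment (slash, two asterisks): the javadoc tool
 * compiles these into HTML documentation for the program.
 */
public class CommentStyles {

    // Single-line comment: two slashes, running to the end of the line.

    /*
     * Multi-line comment: opened with a slash-asterisk and
     * closed with an asterisk-slash.
     */
    public static int add(int a, int b) {
        return a + b; // comments have no effect on the compiled bytecode
    }

    public static void main(String[] args) {
        System.out.println(add(2, 3)); // prints 5
    }
}
```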

Java applets are programs that are embedded in other applications, typically in a Web page displayed in a Web browser.

Java Servlet technology provides Web developers with a simple, consistent mechanism for extending the functionality of a Web server and for accessing existing business systems. Servlets are server-side Java EE components that generate responses (typically HTML pages) to requests (typically HTTP requests) from clients. A servlet can almost be thought of as an applet that runs on the server side, without a face.

JavaServer Pages (JSP) are server-side Java EE components that generate responses, typically HTML pages, to HTTP requests from clients. JSPs embed Java code in an HTML page by using the special delimiters <% and %>. A JSP is compiled to a Java servlet, a Java application in its own right, the first time it is accessed. After that, the generated servlet creates the response.


In 2004, generics were added to the Java language, as part of J2SE 5.0. Prior to the introduction of generics, each variable declaration had to be of a specific type. For container classes, for example, this is a problem because there is no easy way to create a container that accepts only specific types of objects. Either the container operates on all subtypes of a class or interface, usually Object, or a different container class has to be created for each contained class. Generics allow compile-time type checking without having to create a large number of container classes, each containing almost identical code.
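A small sketch of the difference generics make (the class and variable names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class GenericsSketch {
    // With generics, the compiler guarantees every element is a String,
    // so no cast is needed when reading from the list.
    static String first(List<String> names) {
        return names.get(0);
    }

    public static void main(String[] args) {
        // Pre-generics, a raw List accepted any Object and required a cast:
        List raw = new ArrayList();
        raw.add("hello");
        String s = (String) raw.get(0); // cast; fails at runtime if wrong type

        // With generics the element type is checked at compile time:
        List<String> names = new ArrayList<String>();
        names.add("hello");
        // names.add(42);               // would be a compile-time error
        System.out.println(first(names));
    }
}
```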

A number of criticisms have been leveled at Java programming language for various design choices in the language and platform. Such criticisms include the implementation of generics, the handling of unsigned numbers, the implementation of floating-point arithmetic, and security vulnerabilities.


Java Platform and Class libraries diagram


Java libraries are the compiled bytecodes of source code developed by the JRE implementor to support application development in Java. Examples of these libraries are:

The core libraries, which include:

- Collection libraries that implement data structures such as lists, dictionaries, trees, sets, queues and double-ended queues, or stacks
- XML processing (parsing, transforming, validating) libraries
- Security libraries
- Internationalization and localization libraries
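As a small illustration of the collection libraries, the sketch below uses java.util.ArrayDeque, one of the standard double-ended queue implementations, first as a LIFO stack and then as a FIFO queue (the class and method names are made up for the example):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class CollectionsSketch {
    // Pushed and popped at the head, a Deque behaves as a LIFO stack.
    static String lastIn() {
        Deque<String> stack = new ArrayDeque<String>();
        stack.push("first");
        stack.push("second");
        return stack.pop(); // returns "second"
    }

    // Added at the tail and polled from the head, the same class
    // behaves as a FIFO queue.
    static String firstIn() {
        Deque<String> queue = new ArrayDeque<String>();
        queue.addLast("first");
        queue.addLast("second");
        return queue.pollFirst(); // returns "first"
    }

    public static void main(String[] args) {
        System.out.println(lastIn() + " / " + firstIn());
    }
}
```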

The integration libraries, which allow the application writer to communicate with external systems. These libraries include:

- The Java Database Connectivity (JDBC) API for database access
- Java Naming and Directory Interface (JNDI) for lookup and discovery
- RMI and CORBA for distributed application development
- JMX for managing and monitoring applications

User interface libraries, which include:

- The (heavyweight, or native) Abstract Window Toolkit (AWT), which provides GUI components, the means for laying out those components and the means for handling events from those components
- The (lightweight) Swing libraries, which are built on AWT but provide (non-native) implementations of the AWT widgetry
- APIs for audio capture, processing, and playback

A platform-dependent implementation of the Java Virtual Machine (JVM), which is the means by which the bytecodes of the Java libraries and third-party applications are executed

Plugins, which enable applets to be run in Web browsers

Java Web Start, which allows Java applications to be efficiently distributed to end users across the Internet

Licensing and documentation.

Javadoc is a comprehensive documentation system, created by Sun Microsystems, used by many Java developers. It provides developers with an organized system for documenting their code. Javadoc comments have an extra asterisk at the beginning, i.e. the tags are /** and */, whereas the normal multi-line comments in Java and C are set off with /* and */.
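A hypothetical class documented in the Javadoc style might look like this; running the javadoc tool on the file would generate HTML documentation from these comments:

```java
/**
 * A minimal circle, used here only to demonstrate Javadoc comments.
 */
public class Circle {
    /** Radius of the circle, in arbitrary units. */
    private final double radius;

    /**
     * Creates a circle.
     *
     * @param radius the radius; assumed to be non-negative
     */
    public Circle(double radius) {
        this.radius = radius;
    }

    /**
     * Computes the area of the circle.
     *
     * @return the area, pi times the radius squared
     */
    public double area() {
        return Math.PI * radius * radius;
    }
}
```

Tags such as @param and @return become labeled sections in the generated pages.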

Sun has defined and supports four editions of Java targeting different application environments, and segmented many of its APIs so that they belong to one of the platforms. The platforms are:

- Java Card, for smartcards.
- Java Platform, Micro Edition (Java ME), targeting environments with limited resources.
- Java Platform, Standard Edition (Java SE), targeting workstation environments.
- Java Platform, Enterprise Edition (Java EE), targeting large distributed enterprise or Internet environments.

The classes in the Java APIs are organized into separate groups called packages. Each package contains a set of related interfaces, classes and exceptions. Refer to the separate platforms for a description of the packages available. The set of APIs is controlled by Sun Microsystems in cooperation with others through the Java Community Process program. Companies or individuals participating in this process can influence the design and development of the APIs. This process has been a subject of controversy. Sun also provided an edition called PersonalJava that has been superseded by later, standards-based Java ME configuration-profile pairings.

Swing is the primary Java GUI widget toolkit. It is part of Sun Microsystems' Java Foundation Classes (JFC), an API for providing a graphical user interface (GUI) for Java programs. Swing was developed to provide a more sophisticated set of GUI components than the earlier Abstract Window Toolkit. Swing provides a native look and feel that emulates the look and feel of several platforms, and also supports a pluggable look and feel that allows applications to have a look and feel unrelated to the underlying platform. It has more powerful and flexible components than AWT. In addition to familiar components such as buttons, checkboxes and labels, Swing provides several advanced components such as tabbed panels, scroll panes, trees, tables and lists. Unlike AWT components, Swing components are not implemented by platform-specific code. Instead, they are written entirely in Java and are therefore platform-independent. The term "lightweight" is used to describe such components.

The Internet Foundation Classes (IFC) were a graphics library for Java originally developed by Netscape Communications Corporation and first released on December 16, 1996. On April 2, 1997, Sun Microsystems and Netscape Communications Corporation announced their intention to incorporate IFC with other technologies to form the Java Foundation Classes. Swing introduced a mechanism that allowed the look and feel of every component in an application to be altered without making substantial changes to the application code. The introduction of support for a pluggable look and feel allows Swing components to emulate the appearance of native components while still retaining the benefits of platform independence. This feature also makes it easy to make an application written in Swing look very different from native programs if desired. Originally distributed as a separately downloadable library, Swing has been included as part of the Java Standard Edition since release 1.2. The Swing classes and components are contained in the javax.swing package hierarchy.


Swing is a platform-independent, model-view-controller (MVC) GUI framework for Java. It follows a single-threaded programming model, and possesses the following traits:

Swing is platform independent both in terms of its expression (Java) and its implementation (look and feel).

Swing is a highly partitioned architecture, which allows for the "plugging" of various custom implementations of specified framework interfaces: Users can provide their own custom implementation(s) of these components to override the default implementations. In general, Swing users can extend the framework by extending existing (framework) classes and/or providing alternative implementations of core components.

Swing is a component-based framework. The distinction between objects and components is a fairly subtle point: concisely, a component is a well-behaved object with a known, specified pattern of behaviour. Swing objects asynchronously fire events, have "bound" properties, and respond to a well-known set of commands (specific to the component). Specifically, Swing components are JavaBeans components, compliant with the JavaBeans component architecture specification.

Given the programmatic rendering model of the Swing framework, fine control over the details of rendering of a component is possible in Swing. As a general pattern, the visual representation of a Swing component is a composition of a standard set of elements, such as a "border", "inset", decorations, etc. Typically, users will programmatically customize a standard Swing component (such as a JTable) by assigning specific Borders, Colors, Backgrounds, opacities, etc., as the properties of that component. The core component will then use these properties (settings) to determine the appropriate renderers to use in painting its various aspects. However, it is also completely possible to create unique GUI controls with highly customized visual representation.

Swing's heavy reliance on runtime mechanisms and indirect composition patterns allows it to respond at runtime to fundamental changes in its settings. For example, a Swing-based application can change its look and feel at runtime. Further, users can provide their own look and feel implementation, which allows for uniform changes in the look and feel of existing Swing applications without any programmatic change to the application code.
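The runtime look-and-feel switch described above goes through the standard javax.swing.UIManager class. A minimal sketch (which look and feels are installed varies by platform; the class name below is ours):

```java
import javax.swing.UIManager;

public class LookAndFeelDemo {
    public static void main(String[] args) throws Exception {
        // Enumerate the look and feels bundled with this JRE.
        for (UIManager.LookAndFeelInfo info : UIManager.getInstalledLookAndFeels()) {
            System.out.println(info.getName() + " -> " + info.getClassName());
        }

        // Switch to the cross-platform (Metal) look and feel at runtime.
        UIManager.setLookAndFeel(UIManager.getCrossPlatformLookAndFeelClassName());
        System.out.println("Now using: " + UIManager.getLookAndFeel().getName());

        // In a running application, already-visible windows are refreshed
        // after the switch with SwingUtilities.updateComponentTreeUI(window).
    }
}
```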

Swing's configurability is a result of a choice not to use the native host OS's GUI controls for displaying itself. Swing "paints" its controls programmatically through the use of Java 2D APIs, rather than calling into a native user interface toolkit. Thus, a Swing component does not have a corresponding native OS GUI component, and is free to render itself in any way that is possible with the underlying graphics APIs. However, at its core every Swing component relies on an AWT container, since (Swing's) JComponent extends (AWT's) Container. This allows Swing to plug into the host OS's GUI management framework, including the crucial device/screen mappings and user interactions, such as key presses or mouse movements. Swing simply "transposes" its own (OS agnostic) semantics over the underlying (OS specific) components. So, for example, every Swing component paints its rendition on the graphic device in response to a call to component.paint(), which is defined in (AWT) Container. But unlike AWT components, which delegated the painting to their OS-native "heavyweight" widget, Swing components are responsible for their own rendering. This transposition and decoupling is not merely visual, and extends to Swing's management and application of its own OS-independent semantics for events fired within its component containment hierarchies. Generally speaking, the Swing Architecture delegates the task of mapping the various flavors of OS GUI semantics onto a simple, but generalized, pattern to the AWT container. Building on that generalized platform, it establishes its own rich and complex GUI semantics in the form of the JComponent model.


The Swing library makes heavy use of the Model/View/Controller software design pattern,[1] which conceptually decouples the data being viewed from the user interface controls through which it is viewed. Because of this, most Swing components have associated models (which are specified in terms of Java interfaces), and the programmer can use various default implementations or provide their own. The framework provides default implementations of model interfaces for all of its concrete components.

The typical use of the Swing framework does not require the creation of custom models, as the framework provides a set of default implementations that are transparently, by default, associated with the corresponding JComponent child class in the Swing library. In general, only complex components, such as tables, trees and sometimes lists, may require custom model implementations around application-specific data structures. To get a good sense of the potential that the Swing architecture makes possible, consider the hypothetical situation where custom models for tables and lists are wrappers over DAO and/or EJB services.

Typically, Swing component model objects are responsible for providing a concise interface defining events fired and accessible properties for the (conceptual) data model for use by the associated JComponent. Given that the overall MVC pattern is a loosely-coupled collaborative object relationship pattern, the model provides the programmatic means for attaching event listeners to the data model object. Typically, these events are model-centric (for example, a "row inserted" event in a table model) and are mapped by the JComponent specialization into a meaningful event for the GUI component. For example, JTable has a model called TableModel that describes an interface for how a table would access tabular data. A default implementation of this operates on a two-dimensional array.
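The TableModel idea can be sketched with a small custom model over a two-dimensional array. The column names and row data below are invented for illustration; AbstractTableModel is the standard javax.swing.table base class:

```java
import javax.swing.table.AbstractTableModel;

// A minimal TableModel over a two-dimensional array, similar in spirit to
// Swing's own DefaultTableModel. A JTable built on this model renders the
// data and reacts to fireTableRowsInserted()-style change events.
public class TrackTableModel extends AbstractTableModel {
    private final String[] columns = {"Title", "Duration"};
    private final Object[][] rows = {
        {"Track one", "3:41"},
        {"Track two", "4:05"},
    };

    @Override public int getRowCount()    { return rows.length; }
    @Override public int getColumnCount() { return columns.length; }
    @Override public String getColumnName(int c) { return columns[c]; }
    @Override public Object getValueAt(int r, int c) { return rows[r][c]; }

    public static void main(String[] args) {
        TrackTableModel model = new TrackTableModel();
        System.out.println(model.getRowCount() + " rows, "
                + model.getColumnCount() + " columns");
        // A view would be attached with: new javax.swing.JTable(model)
    }
}
```

The view never touches the array directly; it only asks the model, which is exactly the decoupling the MVC pattern describes.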
The view component of a Swing JComponent is the object used to graphically "represent" the conceptual GUI control. A distinction of Swing, as a GUI framework, is in its reliance on programmatically-rendered GUI controls (as opposed to the use of the native host OS's GUI controls). Prior to Java 6 Update 10, this distinction was a source of complications when mixing AWT controls, which use native controls, with Swing controls in a GUI (see Mixing AWT and Swing components). Finally, in terms of visual composition and management, Swing favors relative layouts (which specify the positional relationships between components) as opposed to absolute layouts (which specify the exact location and size of components). This bias towards "fluid"' visual ordering is due to its origins in the applet operating environment that framed the design and development of the original Java GUI toolkit. (Conceptually, this view of the layout management is quite similar to that which informs the rendering of HTML content in browsers, and addresses the same set of concerns that motivated the former.)


AWT and Swing class hierarchy

Since early versions of Java, a portion of the Abstract Window Toolkit (AWT) has provided platform-independent APIs for user interface components. In AWT, each component is rendered and controlled by a native peer component specific to the underlying windowing system. By contrast, Swing components are often described as lightweight because they do not require allocation of native resources in the operating system's windowing toolkit. The AWT components are referred to as heavyweight components.

Much of the Swing API is generally a complementary extension of the AWT rather than a direct replacement. In fact, every Swing lightweight interface ultimately exists within an AWT heavyweight component because all of the top-level components in Swing (JApplet, JDialog, JFrame, and JWindow) extend an AWT top-level container. Prior to Java 6 Update 10, the use of both lightweight and heavyweight components within the same window was generally discouraged due to Z-order incompatibilities. However, later versions of Java have fixed these issues, and both Swing and AWT components can now be used in one GUI without Z-order issues. The core rendering functionality used by Swing to draw its lightweight components is provided by Java 2D, another part of JFC.

The Standard Widget Toolkit (SWT) is a competing toolkit originally developed by IBM and now maintained by the Eclipse community. SWT's implementation has more in common with the heavyweight components of AWT. This confers benefits such as more accurate fidelity with the underlying native windowing toolkit, at the cost of an increased exposure to the native platform in the programming model.

The advent of SWT has given rise to a great deal of division among Java desktop developers, with many strongly favoring either SWT or Swing. Sun's development on Swing continues to focus on platform look and feel (PLAF) fidelity with each platform's windowing toolkit in the approaching Java SE 7 release (as of December 2006). There has been significant debate and speculation about the performance of SWT versus Swing; some hinted that SWT's heavy dependence on JNI would make it slower when the GUI component and Java need to communicate data, but faster at rendering when the data model has been loaded into the GUI, but this has not been confirmed either way. A fairly thorough set of benchmarks in 2005 concluded that neither Swing nor SWT clearly outperformed the other in the general case. SWT serves the Windows platform very well but is considered by some to be less effective as a technology for cross-platform development. By using the high-level features of each native windowing toolkit, SWT returns to the issues seen in the mid 1990s (with toolkits like zApp, Zinc, XVT and IBM/Smalltalk) where toolkits attempted to mask differences in focus behaviour, event triggering and graphical layout. Failure to match behavior on each platform can cause subtle but difficult-to-resolve bugs that impact user interaction and the appearance of the GUI.


The Abstract Window Toolkit (AWT) is Java's original platform-independent windowing, graphics, and user-interface widget toolkit. The AWT is now part of the Java Foundation Classes (JFC), the standard API for providing a graphical user interface (GUI) for a Java program. AWT is also the GUI toolkit for a number of Java ME profiles. For example, Connected Device Configuration profiles require Java runtimes on mobile telephones to support AWT.

JMF (JAVA MEDIA FRAMEWORK)
The JMF is a set of three new APIs being co-defined by the JMF Working Group members - Sun, Silicon Graphics, and Intel. These APIs eventually will include Java Media Player, Capture, and Conferencing. The first to be delivered, the Player API provides a framework for implementers to build media players and provide them in a standard way on all Java platforms. The JMF specification is flexible enough to allow developers to extend players by adding their own nodes (such as image filters, audio reverb effects, and so on) or to use the standard players without making any additions. Before the JMF Player API, multimedia playback support in Java was extremely limited. Programmers had to generate their own GUI controls. (JMF returns a standard set of controls in the form of a Control Panel and other Control objects.) The supported media types in the core Java API were limited (Sun's muLaw format for sound, and no media option for video), so developers were forced to implement their own players without any underlying framework to assist them. With the JMF Player API, however, Java programmers can implement support for almost any audio or video format by building upon an established media playback framework. In addition, standard implementations (see Resources for URLs pointing to more information on implementations from Intel, Silicon Graphics, and Sun) provide built-in support for common Web formats such as muLaw, Apple AIFF, and Microsoft PC WAV for audio, as well as Apple QuickTime video, Microsoft AVI video, and the Moving Picture Experts Group's MPEG-1 and MPEG-2 for video. MIDI currently is supported in the Silicon Graphics IRIX implementation and is slated for support in Intel's Windows implementation. If you want to use one of these standard Web-based formats, you are now able to easily integrate multimedia playback into applets and applications alike with only a few lines of code.
JMF allows player implementers to use native methods as need be underneath the covers for greater speed. This lets the implementers optimize performance on each platform. At the same time, the common Java Media Player API ensures that applets and standalone applications will run on any Java platform.


Installation of the JMF software is straightforward. You need only download a package containing the classes, documentation, and accompanying files for your platform and install it using the standard method. Implementations currently are available from Silicon Graphics and Intel for IRIX and Windows 95/NT, respectively. Sun currently is working on a Solaris implementation. Note that you can use JMF with Sun's Java Development Kit (JDK) or with a browser. Implementations that work for Netscape Navigator 3.01 are available on all platforms, and Intel's Windows 95/NT implementation also supports Microsoft Internet Explorer 3.01.

To interact with the JMF applets embedded in this article, you must have the JMF Player implementation for your platform.

Java Media players support a six-state model based on the two fundamental states Stopped and Started. This model is outlined in the accompanying state diagram, with Stopped states given in green and the Started state in white. The states are Unrealized, Realizing, Realized, Prefetching, Prefetched, and Started. Note that the transitions from Realizing to Realized and Prefetching to Prefetched are automatic (Realizing and Prefetching are transient states of indeterminate length). Other transitions are brought about by method calls, with some of the most important methods being given in the diagram.
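The six states and their ordering can be sketched as a simple enum. This is an illustration of the model only, not JMF code; in JMF itself the state constants live on the javax.media.Controller interface.

```java
// Illustrative sketch of the JMF player life cycle. The first five states
// are "Stopped" states; only the last is "Started". Realizing and
// Prefetching are transient: they complete on their own.
public enum PlayerState {
    UNREALIZED, REALIZING, REALIZED, PREFETCHING, PREFETCHED, STARTED;

    /** The state reached next along the normal forward path. */
    public PlayerState next() {
        return this == STARTED ? STARTED : values()[ordinal() + 1];
    }

    public boolean isStopped() {
        return this != STARTED;
    }

    public static void main(String[] args) {
        PlayerState s = UNREALIZED;
        while (s != STARTED) {
            System.out.println(s + " (stopped=" + s.isStopped() + ")");
            s = s.next();
        }
        System.out.println(s + " (stopped=" + s.isStopped() + ")");
    }
}
```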

State transitions are accompanied by the appropriate TransitionEvent being generated. Any interested class can implement the ControllerListener interface and use its controllerUpdate method to handle such TransitionEvents accordingly. A complete listing of TransitionEvents is available in the JMF Player API documentation. A JMF player fundamentally is an encapsulation of the multimedia component that allows for control of state transitions during playback. JMF players provide methods to query the current state, to acquire necessary resources, and to start, stop, and control the actual playback of the media file or stream. Read on for a brief description of how to create a player and control it. The example applet uses the media sample's URL to build a player using a player factory. This factory model is very similar to the connection factory used in JDBC and to other factories used throughout the Java APIs. The factory itself uses appropriate protocol handlers and content handlers to build and return the final media player. A player is built and returned as: Player myPlayer = Manager.createPlayer(myURL);

After being returned from the player factory, a player must be "Realized" and "Prefetched" before it can be started. Realization refers to the process of finding all resources the player will need to play, whereas prefetching actually loads the resources and readies the player to begin playing. Each of these state transitions is completed by making one call to the Player API. Note that the realize method is non-blocking, but players need to be realized before many of their methods (such as getVisualComponent, for example) can be used, so it is often useful to implement a blocking realize yourself to guarantee that you have a realized player. This example blocking realize works in cooperation with the controllerUpdate method and a boolean variable to ensure that a realized player is returned. Once prefetched, a player has the necessary resources to begin playback. A call to the start() method begins playback at the beginning of the media sample or at the appropriate point in a live multimedia stream. Note that if start() is called on an Unrealized player, the player first uses its realize() and prefetch() methods before starting. Similarly, calling start() on a realized player that is not yet prefetched will result in prefetching occurring before starting. Because JMF is not yet included in browser Java implementations by default, I've placed the applets on a separate page. After you have installed JMF on your platform, you can interact with the example players. Both applets on this page are instances of the same example applet, with a complete code listing available here. Note that TransitionEvents are printed to standard output, so if you open your Java Console before loading the example page, you can see events as the player transitions from state to state.
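A blocking realize is usually written with a monitor shared between the caller and controllerUpdate. The sketch below uses the real javax.media classes (Manager, Player, ControllerListener, RealizeCompleteEvent) and so assumes a JMF installation on the classpath; the class name BlockingRealize is ours.

```java
import java.net.URL;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.Manager;
import javax.media.Player;
import javax.media.RealizeCompleteEvent;

public class BlockingRealize implements ControllerListener {
    private final Object lock = new Object();
    private boolean realized = false;

    /** Create a player for the URL and block until it reaches Realized. */
    public Player realizedPlayer(URL mediaUrl) throws Exception {
        Player player = Manager.createPlayer(mediaUrl);
        player.addControllerListener(this);
        player.realize();                 // non-blocking in JMF
        synchronized (lock) {
            while (!realized) {
                lock.wait();              // woken by controllerUpdate()
            }
        }
        return player;                    // getVisualComponent() is now legal
    }

    @Override
    public void controllerUpdate(ControllerEvent event) {
        if (event instanceof RealizeCompleteEvent) {
            synchronized (lock) {
                realized = true;
                lock.notifyAll();
            }
        }
    }
}
```

The same pattern extends naturally to waiting for PrefetchCompleteEvent when a fully prefetched player is needed before calling start().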


JMF was developed for Java Development Kit (JDK) 1.0.2 compatibility, and follows JDK 1.0.2 design practices. For example, it employs its own event model rather than using the new JDK 1.1 event model based on java.util.EventObject. This is both a strength and a weakness, as it provides backward compatibility while failing to take advantage of some of the benefits available in the new JDK event model. The Player interface extends the Controller interface, which itself extends the Clock interface. The Clock interface specifies methods used for synchronization and timekeeping, while Controller adds variables and methods used to track state and state transitions. The separation of functionality between the Clock, Controller, and Player interfaces allows developers to implement portions of functionality without having to implement everything. For example, one could create a Controller without having to bother with providing an implementation for the methods in Player that a controller does not need. The JMF Player API allows for the creation of streaming media players as well as players for stored media files. For example, players can be created to display and integrate live broadcasts and large multimedia streams such as movies or music albums into the Web. Such players, combined with a high-performance media server such as Silicon Graphics Cosmo MediaBase, allow for the dissemination of pay-per-view content on the Web, training videos on corporate intranets, etc., all without the content provider having to worry about the client's platform (so long as the client has a Java-capable browser).

Note that while the long-term plan for JMF is possibly to provide Java implementations for decoders and lower-level framework components, current Player implementations use native methods for much of the lower-level processing. For example, Intel uses Microsoft's ActiveMovie while Silicon Graphics uses its own Digital Media libraries to provide the core decoders for movie playback. Though the lower-level native code is non-portable, this tradeoff is made for the sake of speed. A further disadvantage, however, is that native code complicates the debugging of applets and applications that use JMF players, as native method debugging is not yet well supported in most Java development environments. Other advanced features are provided for in the Player API. Any player may act as a controller for one or more other players. This synchronization is achieved through the use of TimeBase objects, which function as clocks in JMF. Players can synchronize with one another using methods such as getTimeBase and setTimeBase. The JMF Player API includes a CachingControl interface for use in building a CachingControl. This minimizes the impact of varying network performance by providing a buffer out of which the multimedia stream can be played while the network catches up to the player after heavy loading slows it down. Interfaces also are provided for GainControl and GainChangeListener so that multimedia samples with soundtracks can be better controlled. The API also contains packages to provide for reliable and unreliable, or streaming, media content (see the API documentation for packages) and to provide for file and HTTP protocols (outlined in the API documentation for the packages).


The JMF Player API currently is in beta, with a final implementation expected in the second quarter of 1997. Though the current JMF Player API supports only media display, future additions to JMF are intended to add support for media capture and conferencing. JavaSoft currently states that the Media Capture portion of the JMF will be available with a final implementation sometime during 1997 (the specific quarter is to be determined), while the Media Conference API's final implementation date is still to be determined. The JMF Player API will be integrated into the Java platform sometime subsequent to the current JDK 1.1 release. Whether JMF will be a core API or a standard extension is currently listed as "to be determined" by JavaSoft. Up-to-date information is available from the Java API Overview page at JavaSoft.

An example of JMF

VERSIONS AND LICENSING: An initial, playback-only version of JMF was developed by Sun Microsystems, Silicon Graphics, and Intel, and released as JMF 1.0 in 1997. JMF 2.0, developed by Sun and IBM, came out in 1999 and added capture, streaming, pluggable codecs, and transcoding. JMF is branded as part of Sun's "Desktop" technology of J2SE, as opposed to the Java server-side and client-side application frameworks. The notable exceptions are Java applets and Java Web Start, which have access to the full JMF in the web browser's or applet viewer's underlying JRE. JMF 2.0 originally shipped with an MP3 decoder and encoder. This was removed in 2002, and a new MP3 playback-only plug-in was posted in 2004. JMF binaries are available under a custom license, and the source is available under the SCSL. The current version ships with four JAR files and shell scripts to launch four JMF-based applications:

- JMStudio - a simple player GUI
- JMFRegistry - a GUI for managing the JMF "registry," which manages preferences, plug-ins, etc.
- JMFCustomizer - used for creating a JAR file that contains only the classes needed by a specific JMF application, which allows developers to ship a smaller application
- JMFInit

JMF is available in an all-Java version and as platform-specific "performance packs", which can contain native-code players for the platform and/or hooks into a multimedia engine specific to that platform. JMF 2.0 offers performance packs for Linux, Solaris (on SPARC) and Windows.

In January 2011, Tudor Holton of Bentokit Project released a Debian package for the JMF to alleviate difficulties that had arisen over time when installing the JMF on Debian and Ubuntu GNU/Linux. This package does not contain the JMF, but presents the user with the JMF License, retrieves it from the Oracle website, and then installs it. A similar Debian package installer for the JMF MP3 Plugin was also built in February 2011.

DESIGN CONCEPTS: JMF abstracts the media it works with into Data Sources (for media being read into JMF) and Data Sinks (for data being exported out). It does not afford the developer significant access to the particulars of any given format; rather, media is represented as sources (themselves obtained from URLs) that can be read in and played, processed, and exported (though not all codecs support processing and transcoding). A Manager class offers static methods that are the primary point of contact with JMF for applications.

CRITICISM AND ALTERNATIVES: Many JMF developers have complained that the JMF implementation supplied in up-to-date JREs supports relatively few up-to-date codecs and formats. Its all-Java version, for example, cannot play MPEG-2, MPEG-4, Windows Media, Real Media, most QuickTime movies, or Flash content newer than Flash 2, and needs a plug-in to play the ubiquitous MP3 format. While the performance packs offer the ability to use the native platform's media library, they are only offered for Linux, Solaris and Windows. In particular, Windows-based developers new to JMF often expect support for some newer formats on all platforms when such formats are, in fact, only supported on Windows. While JMF is considered a very useful framework, the freely-available implementation provided by Sun suffers from a lack of updates and maintenance. JMF does not get much maintenance effort from Sun; the API has not been enhanced since 1999, and the last news item on JMF's home page was posted in September 2008. While JMF is built for extensibility, there are few such third-party extensions. Furthermore, content editing functionality in JMF is effectively non-existent. You can do simple recording and playback for audio and video, but the implementation provided by Sun can do little else. Platforms beyond those that Sun supports are left to their corresponding JRE vendors. While Sun still provides a forum for discussion of its implementation, there have been several efforts to implement open-source alternatives.

ALTERNATIVES: Depending on a developer's needs, several other libraries may be more suitable than JMF. These include:

- Freedom for Media in Java (FMJ), an API-compatible alternative to JMF
- JavaSound
- QuickTime for Java
- IBM Toolkit for MPEG-4
- Jffmpeg
- jvlc
- vlcj
- gstreamer-java
- Cortado, a complete player for Ogg Vorbis and Theora in a Java applet
- Directshow <> Java Wrapper
- Fobs4JMF
- JLayer MP3 library
- Xuggler
- Video4Linux4Java

STAGES: The JMF architecture is organized into three stages: input, processing, and output.

During the input stage, data is read from a source and passed in buffers to the processing stage. The input stage may consist of reading data from a local capture device (such as a webcam or TV capture card), a file on disk or stream from the network. The processing stage consists of a number of codecs and effects designed to modify the data stream to one suitable for output. These codecs may perform functions such as compressing or decompressing the audio to a different format, adding a watermark of some kind, cleaning up noise or applying an effect to the stream (such as echo to the audio). Once the processing stage has applied its transformations to the stream, it passes the information to the output stage. The output stage may take the stream and pass it to a file on disk, output it to the local video display or transmit it over the network. For example, a JMF system may read input from a TV capture card from the local system capturing input from a VCR in the input stage. It may then pass it to the processing stage to add a watermark in the corner of each frame and finally broadcast it over the local Intranet in the output stage.
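The three stages can be pictured as a simple chain of transforms. The sketch below is a conceptual model in plain Java, not the JMF API; in real JMF the chain is assembled by a Processor from DataSource, Codec/Effect, and Renderer/Multiplexer plug-ins.

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.UnaryOperator;

// Conceptual model of the JMF input -> processing -> output pipeline.
// Each "effect" transforms a buffer and hands it to the next stage.
public class PipelineSketch {
    static byte[] run(byte[] input, List<UnaryOperator<byte[]>> effects) {
        byte[] buffer = input;                 // input stage: read the data
        for (UnaryOperator<byte[]> effect : effects) {
            buffer = effect.apply(buffer);     // processing stage: codecs/effects
        }
        return buffer;                         // output stage: file/screen/network
    }

    public static void main(String[] args) {
        // A toy "effect" that halves every sample, like a crude volume cut.
        UnaryOperator<byte[]> halve = in -> {
            byte[] out = new byte[in.length];
            for (int i = 0; i < in.length; i++) out[i] = (byte) (in[i] / 2);
            return out;
        };
        byte[] result = run(new byte[] {100, 40, -60}, List.of(halve));
        System.out.println(Arrays.toString(result)); // [50, 20, -30]
    }
}
```

The design point this illustrates is that stages only agree on a buffer format, so effects can be inserted, removed or reordered without touching the input or output code.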

COMPONENT ARCHITECTURE: JMF is built around a component architecture. The components are organized into a number of main categories:

- Media handlers
- Data sources
- Codecs/Effects
- Renderers
- Mux/Demuxes

MEDIA HANDLERS
Media Handlers are registered for each type of file that JMF must be able to handle. To support new file formats, a new Media Handler can be created.

DATA SOURCES
A Data Source handler manages source streams from various inputs. These can be for network protocols, such as HTTP or FTP, or for simple input from disk.

CODECS/EFFECTS
Codecs and Effects are components that take an input stream, apply a transformation to it and output it. Codecs may have different input and output formats, while Effects are simple transformations of a single input format to an output stream of the same format.

RENDERERS
A Renderer is similar to a Codec, but the final output is somewhere other than another stream. A Video Renderer outputs the final data to the screen, but another kind of Renderer could output to different hardware, such as a TV-out card.

MUX/DEMUXES
Multiplexers and Demultiplexers are used to combine multiple streams into a single stream, or vice versa, respectively. They are useful for creating and reading a package of audio and video for saving to disk as a single file, or for transmitting over a network.

PRESENTING DATA: The Java Media Framework provides a number of pre-built classes that handle the reading, processing and display of data. Using the Player, media can easily be incorporated into any graphical application (AWT or Swing). The Processor allows you to control the encoding or decoding process at a finer level than the Player, for example by adding a custom codec or effect between the input and output stages.


Supported Media Formats

JMF supports audio sample rates from 8 kHz to 48 kHz. Note that the cross-platform version of JMF supports only the following rates: 8, 11.025, 11.127, 16, 22.05, 22.254, 32, 44.1, and 48 kHz. The JMF 2.1.1 Reference Implementation supports the media types and formats listed below, in the Cross Platform version and in the Solaris/Linux and Windows Performance Packs. In this listing:

- D indicates the format can be decoded and presented.
- E indicates the media stream can be encoded in the format.
- read indicates the media type can be used as input (read from a file).
- write indicates the media type can be generated as output (written to a file).

AIFF (.aiff) - read/write in all versions: 8-bit mono/stereo linear (D,E), 16-bit mono/stereo linear (D,E), G.711 U-law (D,E), A-law (D), IMA4 ADPCM (D,E).

AVI (.avi) - read/write in all versions. Audio: 8-bit mono/stereo linear, 16-bit mono/stereo linear, DVI ADPCM compressed, G.711 (U-law), A-law, GSM mono, ACM**. Video: Cinepak, MJPEG (422), RGB, YUV, VCM**.

GSM (.gsm) - read/write in all versions: GSM mono audio (D,E).

HotMedia (.mvr) - read only in all versions: IBM HotMedia (D).

MIDI (.mid) - read only: Type 1 & 2 MIDI (D in the Performance Packs).

MPEG-1 Video (.mpg) - read only, Performance Packs only: multiplexed system stream (D), video-only stream (D).

MPEG Layer II Audio (.mp2) - read/write, Performance Packs only: MPEG layer 1, 2 audio (D,E).

QuickTime (.mov) - read/write in all versions. Audio: 8-bit mono/stereo linear, 16-bit mono/stereo linear, G.711 (U-law), A-law, GSM mono, IMA4 ADPCM. Video: Cinepak, H.261, H.263, JPEG (420, 422, 444), RGB.

Sun Audio (.au) - read/write in all versions: 8-bit mono/stereo linear (D,E), 16-bit mono/stereo linear (D,E), G.711 U-law (D,E), A-law (D).

Wave (.wav) - read/write in all versions. 8-bit mono/stereo linear, 16-bit mono/stereo linear, G.711 (U-law), A-law, GSM mono, DVI ADPCM, MS ADPCM, ACM**.

Decode/encode support for the individual AVI, QuickTime and Wave sub-formats differs between the Cross Platform version and the Performance Packs; the JMF 2.1.1 documentation gives the exact per-version matrix.

Notes:
ACM** - Windows Audio Compression Manager support. Tested for these formats: A-law, GSM610, MSNAudio, MSADPCM, Truespeech, MP3, PCM, Voxware AC8, Voxware AC10.
VCM** - Windows Video Compression Manager support. Tested for these formats: IV41, IV51, VGPX, WINX, YV12, I263, CRAM, MPG4.

RTP Formats

The JMF 2.1.1 Reference Implementation can receive and transmit the following RTP formats:

- R indicates that the format can be decoded and presented.
- T indicates that media streams can be encoded and transmitted in the format.

Media type (RTP payload) - Cross Platform / Solaris-Linux Performance Pack / Windows Performance Pack:

- Audio: G.711 (U-law) 8 kHz (payload 0) - R,T / R,T / R,T
- Audio: GSM mono (payload 3) - R,T / R,T / R,T
- Audio: G.723 mono (payload 4) - R / R,T / R,T
- Audio: 4-bit mono DVI 8 kHz (payload 5) - R,T / R,T / R,T
- Audio: 4-bit mono DVI 11.025 kHz
- Audio: 4-bit mono DVI 22.05 kHz
- Audio: MPEG Layer I, II
- Video: JPEG (420, 422, 444)* (payload 26)
- Video: H.261 (payload 31)
- Video: H.263** (payload 34) - Mode A only
- Video: MPEG-I*** (payload 32)



* JPEG/RTP can only be transmitted in video dimensions that are a multiple of 8 pixels.
** H.263/RTP can only be transmitted in 3 different video dimensions: SQCIF (128x96), QCIF (176x144) and CIF (352x288).
*** MPEG/RTP video can only be transmitted from pre-encoded MPEG content, i.e. from an MPEG-encoded file or an MPEG-enabled capture source. Real-time software MPEG encoding is not feasible for RTP transmission.

Capture Devices

The JMF 2.1.1 Reference Implementation supports SunVideo / SunVideoPlus capture devices on Solaris. On Windows, most capture devices that have VFW drivers are supported. On Linux, devices that have a Video4Linux driver are expected to work, but have not been extensively tested. The capture devices known to work with this release:

- JavaSound (16-bit; 44100, 22050, 11025 and 8000 Hz linear) - Cross Platform version (J2SE 1.3+), Solaris Performance Pack, Windows Performance Pack
- SunVideo, SunVideoPlus - Solaris Performance Pack
- VFW devices - Windows Performance Pack only: Intel Create & Share (Win9x), Diamond Supra Video Kit; Share (Win98), QuickCam VC camera (WinNT), e-cam camera (WinNT, 9X), Winnow Videum (WinNT, 9X), Creative Web Cam II (Win9X), Miro Video DC30 (Win9X), Iomega Buz (Win9X), QuickCam Home USB camera (Win98), Smart Video Recorder III (Win9X)

Audio is an electrical or other representation of sound. Audio is sound within the acoustic range available to humans. An audio frequency (AF) is an electrical alternating current within the 20 to 20,000 hertz (cycles per second) range that can be used to produce acoustic sound. In computers, audio is the sound system that comes with or can be added to a computer. An audio card contains a special built-in processor and memory for processing audio files and sending them to speakers in the computer.

An audio file is a record of captured sound that can be played back. Sound is a sequence of naturally analog signals that are converted to digital signals by the audio card, using a microchip called an analog-to-digital converter (ADC). When sound is played, the digital signals are sent to the speakers, where they are converted back to analog signals that generate varied sound. Audio files are usually compressed for storage or faster transmission. Audio files can be sent in short stand-alone segments, for example as files in the Wave file format. In order for users to receive sound in real time for a multimedia effect, for listening to music, or to take part in an audio or video conference, sound must be delivered as streaming sound. More advanced audio cards support wavetables, or pre-captured tables of sound. The most popular audio file format today is MP3 (MPEG-1 Audio Layer 3).
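The quantization step an ADC performs can be sketched in Java. This is a simplified illustration only; a real audio card does this in hardware, and the helper names below are invented for the example:

```java
public class AdcDemo {
    // Quantize an analog amplitude in [-1.0, 1.0] to a signed 16-bit PCM
    // sample, as an audio card's analog-to-digital converter (ADC) would.
    static short toPcm16(double amplitude) {
        double clamped = Math.max(-1.0, Math.min(1.0, amplitude));
        return (short) Math.round(clamped * 32767.0);
    }

    // Convert a 16-bit sample back to an analog-style amplitude
    // (the digital-to-analog, playback direction).
    static double toAnalog(short sample) {
        return sample / 32767.0;
    }

    public static void main(String[] args) {
        // Sample a 440 Hz sine wave at 8000 Hz (telephone quality).
        int sampleRate = 8000;
        short[] pcm = new short[sampleRate / 100]; // 10 ms of audio
        for (int n = 0; n < pcm.length; n++) {
            double t = (double) n / sampleRate;
            pcm[n] = toPcm16(Math.sin(2 * Math.PI * 440 * t));
        }
        System.out.println("first sample: " + pcm[0]);
    }
}
```

The rounding from a continuous amplitude to one of 65,536 levels is exactly the quantization loss mentioned above; more bits per sample means finer levels.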


An audio file format is a file format for storing digital audio data on a computer system. This data can be stored uncompressed, or compressed to reduce the file size. It can be a raw bitstream, but it is usually a container format or an audio data format with defined storage layer.

It is important to distinguish between a file format and an audio codec. A codec performs the encoding and decoding of the raw audio data, while the data itself is stored in a file with a specific audio file format. Most of the publicly documented audio file formats can be created with more than one encoder or codec. Although most audio file formats support only one type of audio data (created with an audio codec), a multimedia container format (such as Matroska or AVI) may support multiple types of audio and video data. There are three major groups of audio file formats:
- Uncompressed audio formats, such as WAV, AIFF, AU or raw header-less PCM;
- Formats with lossless compression, such as FLAC, Monkey's Audio (filename extension APE), WavPack (filename extension WV), Shorten (SHN), TTA, ATRAC Advanced Lossless, Apple Lossless (filename extension m4a), MPEG-4 SLS, MPEG-4 ALS, MPEG-4 DST, and Windows Media Audio Lossless (WMA Lossless);
- Formats with lossy compression, such as MP3, Vorbis, Musepack, AAC, ATRAC and Windows Media Audio Lossy (WMA lossy).


There is one major uncompressed audio format, PCM, which is usually stored in a .wav file on Windows or in a .aiff file on Mac OS. The AIFF format is based on the Interchange File Format (IFF). The WAV format is based on the Resource Interchange File Format (RIFF), which is similar to IFF. WAV and AIFF are flexible file formats designed to store more or less any combination of sampling rates or bit rates. This makes them suitable file formats for storing and archiving an original recording.

BWF (Broadcast Wave Format) is a standard audio format created by the European Broadcasting Union as a successor to WAV. BWF allows metadata to be stored in the file. See European Broadcasting Union: Specification of the Broadcast Wave Format (EBU Technical document 3285, July 1997). It is the primary recording format used in many professional audio workstations in the television and film industry. BWF files include a standardized timestamp reference which allows for easy synchronization with a separate picture element. Stand-alone, file-based, multi-track recorders from Sound Devices, Zaxcom, HHB USA, Fostex, and Aaton all use BWF as their preferred format.

The .cda (Compact Disc Audio Track) file is a small file that serves as a shortcut to the audio data for a track on a music CD. It does not contain audio data and is therefore not considered a proper audio file format.
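The RIFF layout mentioned above can be made concrete by building the canonical 44-byte header of a 16-bit PCM .wav file in Java. This is a simplified sketch of the standard header fields, not a full RIFF parser:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavHeader {
    // Build the canonical 44-byte RIFF/WAVE header for 16-bit PCM data.
    // All multi-byte fields are little-endian, as RIFF requires.
    static byte[] build(int sampleRate, int channels, int dataBytes) {
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        int byteRate = sampleRate * channels * 2;      // bytes per second
        b.put("RIFF".getBytes());
        b.putInt(36 + dataBytes);                      // size of the rest of the file
        b.put("WAVE".getBytes());
        b.put("fmt ".getBytes());
        b.putInt(16);                                  // fmt chunk size for PCM
        b.putShort((short) 1);                         // audio format 1 = PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) (channels * 2));            // block align
        b.putShort((short) 16);                        // bits per sample
        b.put("data".getBytes());
        b.putInt(dataBytes);
        return b.array();
    }

    // Read the sample rate back out of a header built above (offset 24).
    static int sampleRateOf(byte[] header) {
        return ByteBuffer.wrap(header, 24, 4).order(ByteOrder.LITTLE_ENDIAN).getInt();
    }

    public static void main(String[] args) {
        byte[] h = build(44100, 2, 1_000_000);
        System.out.println("sample rate: " + sampleRateOf(h));
    }
}
```

Because the sample rate, channel count and bit depth all live in the header, a player can configure its output before touching a single audio sample.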


A lossless compressed format stores data in less space by eliminating unnecessary data. It requires more processing power both to compress the data and to uncompress it for playback. Uncompressed audio formats encode both sound and silence with the same number of bits per unit of time: encoding an uncompressed minute of absolute silence produces a file of the same size as encoding an uncompressed minute of symphonic orchestra music. In a lossless compressed format, however, the music would occupy a smaller portion of the file and the silence would take up almost no space at all. Lossless compression formats enable the original uncompressed data to be recreated exactly. They include the common FLAC, WavPack, Monkey's Audio, and ALAC (Apple Lossless). They provide a compression ratio of about 2:1 (i.e. their files take up half the space of the originals). Development in lossless compression formats aims to reduce processing time while maintaining a good compression ratio.
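The point about silence can be demonstrated with a toy run-length encoder in Java. Real lossless codecs such as FLAC use linear prediction and entropy coding rather than plain RLE, so this is only an illustration of the principle that repetitive input collapses while varied input does not:

```java
import java.util.ArrayList;
import java.util.List;

public class SilenceRle {
    // Run-length encode an array of samples as (value, count) pairs.
    // A run of identical samples, such as digital silence, costs one
    // pair no matter how long it is.
    static List<int[]> encode(int[] samples) {
        List<int[]> runs = new ArrayList<>();
        int i = 0;
        while (i < samples.length) {
            int j = i;
            while (j < samples.length && samples[j] == samples[i]) j++;
            runs.add(new int[] { samples[i], j - i });
            i = j;
        }
        return runs;
    }

    public static void main(String[] args) {
        int[] silence = new int[1000];                    // constant zero samples
        int[] music = new int[1000];
        for (int n = 0; n < music.length; n++) music[n] = n % 97; // varied signal
        System.out.println("silence runs: " + encode(silence).size());
        System.out.println("music runs:   " + encode(music).size());
    }
}
```

The 1000 silent samples compress to a single run, while the varied "music" signal produces nearly one run per sample, mirroring the file-size behaviour described above.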


Lossy compression enables even greater reductions in file size by removing some of the data. A variety of techniques are used, mainly by exploiting psychoacoustics, to remove data with minimal reduction in the quality of reproduction. For many everyday listening situations, the loss in data (and thus quality) is imperceptible. The popular MP3 format is probably the best-known example, but Apple's AAC format is another common one. Most formats offer a range of degrees of compression, generally measured in bit rate. The lower the rate, the smaller the file and the greater the quality loss.


- 3gp - a multimedia container format; can contain proprietary formats such as AMR, AMR-WB or AMR-WB+, but also some open formats.
- act - a lossy ADPCM 8 kbit/s compressed audio format recorded by most Chinese MP3 and MP4 players with a recording function, and by voice recorders.
- aiff - the standard audio file format used by Apple. It could be considered the Apple equivalent of wav.
- aac - the Advanced Audio Coding format, based on the MPEG-2 and MPEG-4 standards. aac files are usually ADTS or ADIF containers.
- alac - Apple Lossless compression, a lossless compression format from Apple.
- amr - AMR-NB audio, used primarily for speech.
- atrac (.wav) - the older style Sony ATRAC format. It always has a .wav file extension. To open these files, install the ATRAC3 drivers.
- au - the standard audio file format used by Sun, Unix and Java. The audio in au files can be PCM or compressed with the µ-law, a-law or G.729 codecs.
- awb - AMR-WB audio, used primarily for speech; the same as the ITU-T's G.722.2 specification.
- dct - a variable codec format designed for dictation. It has dictation header information and can be encrypted (as may be required by medical confidentiality laws).
- dss - Digital Speech Standard files, an Olympus proprietary format. It is a fairly old and poor codec; gsm or mp3 are generally preferred where the recorder allows. It allows additional data to be held in the file header.
- dvf - a Sony proprietary format for compressed voice files; commonly used by Sony dictation recorders.
- flac - the file format for the Free Lossless Audio Codec, a lossless compression codec.
- gsm - designed for telephony use in Europe, gsm is a very practical format for telephone-quality voice. It makes a good compromise between file size and quality. Note that wav files can also be encoded with the gsm codec.
- iklax - an iKlax Media proprietary format, a multi-track digital audio format allowing various actions on musical data, for instance on mixing and volume arrangements.
- IVS - a proprietary version with Digital Rights Management developed by 3D Solar UK Ltd for use in music downloaded from their Tronme Music Store and interactive music and video player.
- mp4 - a proprietary version of AAC in MP4 with Digital Rights Management developed by Apple for use in music downloaded from their iTunes Music Store.
- mmf - a Samsung audio format that is used in ringtones.

- mpc - Musepack or MPC (formerly known as MPEGplus, MPEG+ or MP+), an open source lossy audio codec, specifically optimized for transparent compression of stereo audio at bitrates of 160-180 kbit/s.
- msv - a Sony proprietary format for Memory Stick compressed voice files.
- mxp4 - a Musinaut proprietary format allowing play of different versions (or skins) of the same song. It allows various interactivity scenarios between the artist and the end user.
- ogg - a free, open source container format supporting a variety of formats, the most popular of which is the audio format Vorbis. Vorbis offers compression similar to MP3 but is less popular.
- ra & rm - a RealAudio format designed for streaming audio over the Internet. The .ra format allows files to be stored in a self-contained fashion on a computer, with all of the audio data contained inside the file itself.
- ram - a text file that contains a link to the Internet address where the RealAudio file is stored. The .ram file contains no audio data itself.
- raw - a raw file can contain audio in any format, but is usually used with PCM audio data. It is rarely used except for technical tests.
- tta - The True Audio, a real-time lossless audio codec.
- vox - the vox format most commonly uses the Dialogic ADPCM (Adaptive Differential Pulse Code Modulation) codec. Similar to other ADPCM formats, it compresses to 4 bits. Vox files are similar to wave files except that vox files contain no information about the file itself, so the codec sample rate and number of channels must first be specified in order to play a vox file.
- wav - the standard audio file container format used mainly on Windows PCs. Commonly used for storing uncompressed (PCM), CD-quality sound files, which means that they can be large in size, around 10 MB per minute. Wave files can also contain data encoded with a variety of (lossy) codecs to reduce the file size (for example the GSM or MP3 formats). Wav files use a RIFF structure.
- wma - the popular Windows Media Audio format owned by Microsoft. Designed with Digital Rights Management (DRM) abilities for copy protection.


Video is the technology of electronically capturing, recording, processing, storing, transmitting, and reconstructing a sequence of still images representing scenes in motion.

Video technology was first developed for cathode ray tube television systems, but several new technologies for video display devices have since been invented. Charles Ginsburg led the research team at Ampex Corporation in developing the first practical videotape recorder (VTR). In 1951 the first VTR captured live images from television cameras by converting the information into electrical impulses and saving it onto magnetic tape. The machine sold for around $50,000 in 1956. Sony started to sell the first VCR tapes to the public in 1971. Since then computer technology has advanced enormously and can now be used to capture, store, edit and transmit film clips. Since the invention of the DVD in 1997 and the Blu-ray Disc in 2006, videotape sales have plummeted and tapes are often regarded as old-fashioned and obsolete.


Analog video standards worldwide

The term video ("video" meaning "I see", from the Latin verb "videre") commonly refers to several storage formats for moving pictures: digital video formats, including Blu-ray Disc, DVD, QuickTime, and MPEG-4; and analog videotapes, including VHS and Betamax. Video can be recorded and transmitted in various physical media: on magnetic tape when recorded as PAL or NTSC electric signals by video cameras, or in MPEG-4 or DV digital media when recorded by digital cameras. Quality of video essentially depends on the capturing method and the storage used. Digital television (DTV) is a relatively recent format with higher quality than earlier television formats and has become a standard for television video. (See List of digital television deployments by country.)

3D video, digital video in three dimensions, premiered at the end of the 20th century. Six or eight cameras with real-time depth measurement are typically used to capture 3D video streams. The format of 3D video is fixed in MPEG-4 Part 16 Animation Framework eXtension (AFX).

In the United Kingdom, Estonia, Australia, the Netherlands, Finland, Hungary and New Zealand, the term video is often used informally to refer to both videocassette recorders and video cassettes; the meaning is normally clear from the context.


Frame rate, the number of still pictures per unit of time of video, ranges from six or eight frames per second (frame/s) for old mechanical cameras to 120 or more frames per second for new professional cameras. PAL (Europe, Asia, Australia, etc.) and SECAM (France, Russia, parts of Africa, etc.) standards specify 25 frame/s, while NTSC (USA, Canada, Japan, etc.) specifies 29.97 frame/s. Film is shot at the slower frame rate of 24 frames/s, which slightly complicates the process of transferring a cinematic motion picture to video. The minimum frame rate to achieve the illusion of a moving image is about fifteen frames per second.
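The relationship between frame rate and per-frame duration is what a media player's scheduler actually works with; a small Java helper makes it concrete:

```java
public class FrameTiming {
    // Duration of one frame in milliseconds for a given frame rate.
    // A player presenting PAL video must display a new frame every 40 ms.
    static double frameMillis(double fps) {
        return 1000.0 / fps;
    }

    public static void main(String[] args) {
        System.out.printf("PAL  25    fps: %.2f ms/frame%n", frameMillis(25.0));
        System.out.printf("NTSC 29.97 fps: %.2f ms/frame%n", frameMillis(29.97));
        System.out.printf("Film 24    fps: %.2f ms/frame%n", frameMillis(24.0));
    }
}
```

The slightly different per-frame durations of film (about 41.67 ms) and NTSC (about 33.37 ms) are the reason film-to-video transfer needs pulldown techniques.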

Video can be interlaced or progressive. Interlacing was invented as a way to achieve good visual quality within the limitations of a narrow bandwidth. The horizontal scan lines of each interlaced frame are numbered consecutively and partitioned into two fields: the odd field (upper field) consisting of the odd-numbered lines and the even field (lower field) consisting of the even-numbered lines. NTSC, PAL and SECAM are interlaced formats. Abbreviated video resolution specifications often include an i to indicate interlacing. For example, the PAL video format is often specified as 576i50, where 576 indicates the vertical line resolution, i indicates interlacing, and 50 indicates 50 fields (half-frames) per second.

In progressive scan systems, each refresh period updates all of the scan lines. The result is a higher spatial resolution and a lack of various artifacts that can make parts of a stationary picture appear to be moving or flashing. A procedure known as deinterlacing can be used to convert an interlaced stream, such as analog, DVD, or satellite, for processing by progressive scan devices, such as TFT TV sets, projectors, and plasma panels. Deinterlacing cannot, however, produce a video quality equivalent to true progressive scan source material.
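The simplest deinterlacing method, "weave", just reassembles the two fields into one frame. The sketch below uses int arrays to stand in for scan lines of pixel data; real deinterlacers also blend or interpolate to hide motion artifacts between the two fields:

```java
public class Weave {
    // Weave two interlaced fields back into one progressive frame.
    // Using 0-based numbering, the upper field supplies lines 0, 2, 4, ...
    // and the lower field supplies lines 1, 3, 5, ...
    static int[][] weave(int[][] upperField, int[][] lowerField) {
        int height = upperField.length + lowerField.length;
        int[][] frame = new int[height][];
        for (int y = 0; y < height; y++) {
            frame[y] = (y % 2 == 0) ? upperField[y / 2] : lowerField[y / 2];
        }
        return frame;
    }

    public static void main(String[] args) {
        int[][] upper = { {1, 1}, {3, 3} };   // scan lines 0 and 2
        int[][] lower = { {2, 2}, {4, 4} };   // scan lines 1 and 3
        int[][] frame = weave(upper, lower);
        System.out.println("frame height: " + frame.length);
    }
}
```

Because the two fields were captured 1/50 or 1/60 of a second apart, weaving moving content produces the "combing" artifact that better deinterlacers exist to suppress.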


Common computer and TV display resolutions

The size of a video image is measured in pixels for digital video, or in horizontal scan lines and vertical lines of resolution for analog video. In the digital domain (e.g. DVD), standard-definition television (SDTV) is specified as 720/704/640×480i60 for NTSC and 768/720×576i50 for PAL or SECAM resolution. However, in the analog domain the number of visible scan lines remains constant (486 NTSC / 576 PAL) while the horizontal measurement varies with the quality of the signal: approximately 320 pixels per scan line for VCR quality, 400 pixels for TV broadcasts, and 720 pixels for DVD sources. Aspect ratio is preserved because of non-square "pixels".

New high-definition televisions (HDTV) are capable of resolutions up to 1920×1080p60, i.e. 1920 pixels per scan line by 1080 scan lines, progressive, at 60 frames per second. Video resolution for 3D video is measured in voxels (volume picture elements, each representing a value in three-dimensional space). For example, 512×512×512 voxel resolution, now used for simple 3D video, can be displayed even on some PDAs.


Comparison of common cinematography and traditional television (green) aspect ratios.

Many arcade games use 3:4 portrait mode to efficiently utilize the entire display area. Aspect ratio describes the dimensions of video screens and video picture elements. All popular video formats are rectilinear, and so can be described by a ratio between width and height. The screen aspect ratio of a traditional television screen is 4:3, or about 1.33:1. High-definition televisions use an aspect ratio of 16:9, or about 1.78:1. The aspect ratio of a full 35 mm film frame with soundtrack (also known as the Academy ratio) is 1.375:1. Ratios where the height is taller than the width are uncommon in general everyday use, but do have application in computer systems where the screen may be better suited to a vertical layout. The most common tall aspect ratio, 3:4, is referred to as portrait mode and is created by physically rotating the display device 90 degrees from its normal position. Other tall aspect ratios such as 9:16 are technically possible but rarely used. (For a more detailed discussion of this topic please refer to the page orientation article.)

Pixels on computer monitors are usually square, but pixels used in digital video often have non-square aspect ratios, such as those used in the PAL and NTSC variants of the CCIR 601 digital video standard, and the corresponding anamorphic widescreen formats. Therefore, an NTSC DV image which is 720 pixels by 480 pixels is displayed with an aspect ratio of 4:3 (the traditional television standard) if the pixels are thin, and with an aspect ratio of 16:9 (the anamorphic widescreen format) if the pixels are fat.
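The effect of non-square pixels can be checked with a short calculation. The pixel aspect ratios used below (10/11 for "thin" 4:3 pixels and 40/33 for "fat" 16:9 pixels, applied over the 704 active pixels of a 720-pixel NTSC line) are the commonly quoted CCIR 601 values; treat them as an assumption of this sketch rather than a definitive reference:

```java
public class PixelAspect {
    // Display aspect ratio = (active width * pixel aspect ratio) / height.
    static double displayAspect(int activeWidth, int height, double par) {
        return activeWidth * par / height;
    }

    public static void main(String[] args) {
        // NTSC DV stores 720x480 for both picture shapes; only the
        // pixel aspect ratio (PAR) differs.
        System.out.printf("4:3  material: %.4f%n", displayAspect(704, 480, 10.0 / 11.0));
        System.out.printf("16:9 material: %.4f%n", displayAspect(704, 480, 40.0 / 33.0));
    }
}
```

The same stored frame thus yields a 1.33:1 picture with thin pixels and a 1.78:1 picture with fat pixels, which is why a player must know the pixel aspect ratio before scaling video for a square-pixel monitor.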


Example of U-V color plane, Y value = 0.5

Color model name describes the video color representation. YIQ was used in NTSC television. It corresponds closely to the YUV scheme used in NTSC and PAL television and to the YDbDr scheme used by SECAM television. The number of distinct colors that can be represented by a pixel depends on the number of bits per pixel (bpp). A common way to reduce the number of bits per pixel in digital video is chroma subsampling (e.g. 4:4:4, 4:2:2, 4:2:0/4:1:1).
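The savings from chroma subsampling can be computed directly from the J:a:b notation. This sketch assumes 8 bits per sample and the usual interpretation of the notation over a J-pixel-wide, two-row reference block:

```java
public class ChromaSubsampling {
    // Average bits per pixel for Y'CbCr video with J:a:b subsampling.
    // In a J-wide, 2-row reference block there are 2*J luma samples and
    // (a + b) samples of each of the two chroma channels.
    static double bitsPerPixel(int j, int a, int b, int bitsPerSample) {
        double lumaSamples = 2.0 * j;
        double chromaSamples = 2.0 * (a + b);   // Cb and Cr together
        return (lumaSamples + chromaSamples) * bitsPerSample / (2.0 * j);
    }

    public static void main(String[] args) {
        System.out.println("4:4:4 -> " + bitsPerPixel(4, 4, 4, 8) + " bpp");
        System.out.println("4:2:2 -> " + bitsPerPixel(4, 2, 2, 8) + " bpp");
        System.out.println("4:2:0 -> " + bitsPerPixel(4, 2, 0, 8) + " bpp");
    }
}
```

Going from 4:4:4 to 4:2:0 halves the raw data rate (24 bpp down to 12 bpp) with little visible loss, because the eye is far less sensitive to chroma detail than to luma detail.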

Video quality can be measured with formal metrics like PSNR or with subjective video quality using expert observation. The subjective video quality of a video processing system may be evaluated as follows:

1. Choose the video sequences (the SRC) to use for testing.
2. Choose the settings of the system to evaluate (the HRC).
3. Choose a test method for how to present the video sequences to the experts and to collect their ratings.
4. Invite a sufficient number of experts, preferably not fewer than 15.
5. Carry out testing.
6. Calculate the average marks for each HRC based on the experts' ratings.

Many subjective video quality methods are described in ITU-R Recommendation BT.500. One of the standardized methods is the Double Stimulus Impairment Scale (DSIS). In DSIS, each expert views an unimpaired reference video followed by an impaired version of the same video.

The expert then rates the impaired video using a scale ranging from "impairments are imperceptible" to "impairments are very annoying".
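The final averaging step of the procedure above can be expressed directly in Java; the rating array below is invented sample data on a 5-point impairment scale (5 = imperceptible, 1 = very annoying):

```java
public class Dsis {
    // Mean opinion score for one HRC, computed from the experts' ratings.
    static double meanScore(int[] ratings) {
        double sum = 0;
        for (int r : ratings) sum += r;
        return sum / ratings.length;
    }

    public static void main(String[] args) {
        // Hypothetical ratings from the recommended minimum of 15 experts.
        int[] ratings = {5, 4, 4, 5, 3, 4, 4, 5, 4, 4, 5, 4, 3, 4, 4};
        System.out.printf("MOS: %.2f%n", meanScore(ratings));
    }
}
```

Comparing this mean score across different HRCs (system settings) is what turns the subjective viewing sessions into a usable quality measurement.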


A wide variety of methods are used to compress video streams. Video data contains spatial and temporal redundancy, making uncompressed video streams extremely inefficient. Broadly speaking, spatial redundancy is reduced by registering differences between parts of a single frame; this task is known as intraframe compression and is closely related to image compression. Likewise, temporal redundancy can be reduced by registering differences between frames; this task is known as interframe compression and includes motion compensation and other techniques. The most common modern standards are MPEG-2, used for DVD, Blu-ray and satellite television, and MPEG-4, used for AVCHD, mobile phones (3GP) and the Internet.
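A minimal illustration of why interframe compression works is to count how few pixels actually change between consecutive frames. Real codecs go much further (motion vectors, residual transforms, entropy coding), so this only shows the underlying idea:

```java
public class FrameDiff {
    // Count how many pixels changed between two frames of equal size.
    // Interframe codecs exploit the fact that this number is usually
    // small: only the changed regions need to be encoded.
    static int changedPixels(int[] previous, int[] current) {
        int changed = 0;
        for (int i = 0; i < current.length; i++) {
            if (current[i] != previous[i]) changed++;
        }
        return changed;
    }

    public static void main(String[] args) {
        int[] frame1 = new int[320 * 240];             // static background
        int[] frame2 = frame1.clone();
        for (int i = 0; i < 100; i++) frame2[i] = 255; // a small moving object
        System.out.println("changed pixels: " + changedPixels(frame1, frame2)
                + " of " + frame2.length);
    }
}
```

Here only 100 of 76,800 pixels differ, so an interframe encoder would spend bits on roughly 0.1% of the frame instead of retransmitting all of it.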


Bit rate is a measure of the rate of information content in a video stream. It is quantified in bits per second (bit/s or bps) or megabits per second (Mbit/s). A higher bit rate allows better video quality. For example, VideoCD, with a bit rate of about 1 Mbit/s, is lower quality than DVD, with a maximum bit rate of 10.08 Mbit/s for video. HD (high-definition digital video and TV) has still higher quality, with a bit rate of about 20 Mbit/s.

Variable bit rate (VBR) is a strategy to maximize the visual video quality while minimizing the bit rate. On fast-motion scenes, a variable bit rate uses more bits than it does on slow-motion scenes of similar duration, yet achieves a consistent visual quality. For real-time and non-buffered video streaming, when the available bandwidth is fixed (e.g. in videoconferencing delivered on channels of fixed bandwidth), a constant bit rate (CBR) must be used.
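The bit-rate figures above translate directly into storage requirements; a one-line Java calculation shows the file sizes implied by constant-bit-rate streams:

```java
public class Bitrate {
    // File size in megabytes for a constant-bit-rate stream:
    // size = bitrate (bit/s) * duration (s) / 8 bits per byte.
    static double sizeMegabytes(double bitsPerSecond, double seconds) {
        return bitsPerSecond * seconds / 8.0 / 1_000_000.0;
    }

    public static void main(String[] args) {
        // Figures quoted above: VideoCD about 1 Mbit/s, DVD up to 10.08 Mbit/s.
        System.out.printf("VideoCD, 1 min: %.2f MB%n", sizeMegabytes(1_000_000, 60));
        System.out.printf("DVD max, 1 min: %.2f MB%n", sizeMegabytes(10_080_000, 60));
    }
}
```

One minute of VideoCD works out to 7.5 MB against roughly 75.6 MB for DVD at its maximum rate, which is the ten-fold quality gap the paragraph describes.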

Stereoscopic video can be created using several different methods:

- Two channels: a right channel for the right eye and a left channel for the left eye. Both channels may be viewed simultaneously by using light-polarizing filters 90 degrees off-axis from each other on two video projectors. These separately polarized channels are viewed wearing eyeglasses with matching polarization filters.
- One channel with two overlaid color-coded layers. This left and right layer technique is occasionally used for network broadcast, or recent "anaglyph" releases of 3D movies on DVD. Simple red/cyan plastic glasses provide the means to view the images discretely to form a stereoscopic view of the content.
- One channel with alternating left/right frames for each eye, using LCD shutter glasses that read the frame sync from the VGA Display Data Channel to alternately cover each eye, so the appropriate eye sees the correct frame. This method is most common in computer virtual reality applications such as a Cave Automatic Virtual Environment, but it reduces the effective video frame rate to one-half of normal (for example, from 120 Hz to 60 Hz).

Blu-ray Discs greatly improve the sharpness and detail of the two-color 3D effect in color-coded stereo programs. See the articles Stereoscopy and 3-D film.
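The color-coded (anaglyph) method from the list above can be sketched at the pixel level in Java: the left image supplies the red channel and the right image supplies the cyan (green and blue) channels, so each filter of the glasses passes only "its" eye's picture:

```java
public class Anaglyph {
    // Combine a left-eye and right-eye pixel (packed as 0xRRGGBB) into one
    // red/cyan anaglyph pixel: red from the left image, green and blue
    // (together, cyan) from the right image.
    static int combine(int leftRgb, int rightRgb) {
        return (leftRgb & 0xFF0000) | (rightRgb & 0x00FFFF);
    }

    public static void main(String[] args) {
        int left = 0xCC8844, right = 0x113355;
        System.out.printf("anaglyph pixel: %06X%n", combine(left, right));
    }
}
```

Applying this per pixel over two full frames yields the single composite image that red/cyan glasses split back into a stereo pair.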

There are different layers of video transmission and storage, each with its own set of formats to choose from. For transmission, there is a physical connector and signal protocol ("video connection standard" below). A given physical link can carry certain "display standards" which specify a particular refresh rate, display resolution, and color space. There are a number of analog and digital tape formats, though digital video files can also be stored on a computer file system which have their own formats. In addition to the physical format used by the storage or transmission medium, the stream of ones and zeros that is sent must be in a particular digital video "encoding", of which a number are available.



See List of video connectors for information about physical connectors and related signal standards.


New formats for digital television broadcasts use the MPEG-2 video codec and include:

- ATSC - USA, Canada, Korea
- DVB - Europe
- ISDB - Japan
  - ISDB-Tb - uses the MPEG-4 video codec; Brazil, Peru
- DMB - Korea

Analog television broadcast standards include:

- FCS - USA, Russia; obsolete
- MAC - Europe; obsolete
- MUSE - Japan
- NTSC - USA, Canada, Japan
- PAL - Europe, Asia, Oceania
  - PAL-M - PAL variation; Brazil
  - PALplus - PAL extension; Europe
- RS-343 (military)
- SECAM - France, Former Soviet Union, Central Africa

An analog video format consists of more information than the visible content of the frame. Preceding and following the image are lines and pixels containing synchronization information or a time delay. This surrounding margin is known as a blanking interval or blanking region; the horizontal and vertical front porch and back porch are the building blocks of the blanking interval. Many countries are planning a digital switchover soon.

See Computer display standard for a list of standards used for computer monitors and comparison with those used for television.


- Phonovision
- Kinescope


- 1" Type B videotape (Bosch)
- 1" Type C videotape (Ampex and Sony)
- 2" Quadruplex videotape (Ampex)
- Ampex
- Betacam
- Betacam SP
- Betamax (Sony)
- S-VHS (JVC) (1987)
- W-VHS (JVC) (1994)
- U-matic (Sony)
- VCR, VCR-LP, SVR
- VERA (BBC experimental format ca. 1958)
- VHS (JVC)
- VHS-C (JVC)
- Video 2000 (Philips)


- Betacam IMX (Sony)
- D-VHS (JVC)
- D-Theater
- D1 (Sony)
- D2 (Sony)
- D3
- D5 HD
- Digital-S D9 (JVC)
- Digital Betacam (Sony)
- Digital8 (Sony)
- DV
- HDV
- ProHD (JVC)
- MicroMV
- MiniDV

Optical disc formats:

- Blu-ray Disc (Sony)
- CBHD
- DVD (was Super Density Disc, DVD Forum)
- UMD (Sony)
- Enhanced Versatile Disc (EVD, Chinese government-sponsored)
- HD DVD (NEC and Toshiba)
- HD-VMD
- Laserdisc (old, MCA and Philips)

Digital encoding formats:

- CCIR 601 (ITU-T)
- H.261 (ITU-T)
- H.263 (ITU-T)
- H.264/MPEG-4 AVC (ITU-T + ISO)
- M-JPEG (ISO)
- MPEG-1 (ISO)
- MPEG-2 (ITU-T + ISO)
- MPEG-4 (ISO)
- Ogg-Theora
- VC-1 (SMPTE)

Analog encoding formats:

- System M
- System B



Video storage formats

Videotape (analog): Quadruplex (1956), VERA (1958), Type A (1965), CV-2000 (1965), Akai (1967), U-matic (1969), EIAJ-1 (1969), Cartrivision (1972), Philips VCR (1972), V-Cord (1974), VX (1974), Betamax (1975), IVC (1975), Type B (1976), Type C (1976), VHS (1976), VK (1977), SVR (1979), Video 2000 (1980), CVC (1980), VHS-C (1982), M (1982), Betacam (1982), Video8 (1985), MII (1986), S-VHS (1987), S-VHS-C (1987), Hi8 (1989), W-VHS (1994)

Videotape (digital): D1 (1986), D2 (1988), D3 (1991), DCT (1992), Digital Betacam (1993), D5 (1994), DV (1995), Digital-S (D9) (1995), DVCPRO (1995), Betacam SX (1996), DVCAM (1996), HDCAM (1997), DVCPRO50 (1997), D-VHS (1998), Digital8 (1999), DVCPRO HD (2000), D6 HDTV VTR (2000), MicroMV (2001), HDV (2003), HDCAM SR (2003)

Videodisc (analog): Phonovision (1927), Ampex-HS (1967), TeD (1975), Laserdisc (1978), CED (1981), VHD (1983), Laserfilm (1984), CD Video (1987)

Videodisc (digital): VCD (1993), MovieCD (c. 1995), DVD/DVD-Video (1995), MiniDVD (c. 1995), CVD (1998), SVCD (1998), EVD (2003), XDCAM (2003), HVD (High-Definition Versatile Disc) (2004), FVD (2005), UMD (2005), VMD (2006)

Videodisc (high definition): HD DVD (2006), Blu-ray Disc (2006), HVD (Holographic Versatile Disc) (2007), CBHD (2008)

This program is to be developed in Java, which will be a great experience for me. The project will require hard work and devotion to meaningful objectives for its completion. In this project I will take care to keep all requirements as per the specification, striving for perfection.