
The Java Media Framework:

javax.media
• Access to devices & media files
• Resources
– Sun JMF web site
• http://java.sun.com/products/java-media/jmf/index.jsp
• http://java.sun.com/products/java-media/jmf/2.1.1/apidocs/
– Sample chapter from book, “Essential JMF - Java
Media Framework” Gordon & Talley, 1998
• http://www.pearsonptg.com/samplechapter/0130801046.pdf
– www.javaolympus.com
• http://www.javaolympus.com/J2SE/MEDIA/JMF/JMF.jsp
– Article “Image Capture From Webcams using the Java
Media Framework API”
• http://java.sun.com/dev/evangcentral/totallytech/jmf.html (stale)
– Terrazas et al. (2002). Java Media APIs. Sams Publishing.
Topics
• General contents of JMF
• Digital video primer
• Classes
– java.awt.image.BufferedImage
• Abstract Window Toolkit class
– java.awt.image.ColorModel
– javax.media.format.RGBFormat
• Pixel representation
• Classes
– javax.media.Format
– javax.media.MediaLocator
– javax.media.protocol.DataSource
– javax.media.CaptureDeviceManager
Digital Video Primer
• Cameras return “frames” of visual information
• Each frame is a 2D matrix
– Components of these arrays are pixels
• Usually integers
– Dimensions: height (rows) by width (columns)
• E.g., a 160 x 120 image has 120 rows and 160 columns
• Frames are returned from the camera at a
certain rate
– I.e., the frame rate
– E.g., 15 frames per second
• Frames are often represented in programs
as one-dimensional arrays
int[] frame; // in Java
Frames as 1D Arrays
int [] frame = null;
// code to get frame from camera…
int x = 10, y = 20; // get some pixel
// 0 <= x <= width-1; 0 <= y <= height-1
// (0, 0)-- upper left of image
int pixel = frame[y * width + x];
// assumes a row oriented representation
Class
java.awt.image.BufferedImage
• An Image with an accessible buffer of image data,
plus color model information
• Represents color or grayscale images in
various formats
– E.g., TYPE_BYTE_GRAY, TYPE_USHORT_GRAY,
TYPE_USHORT_565_RGB, TYPE_INT_RGB
• Methods include
public ColorModel getColorModel()
• ColorModel (java.awt.image.ColorModel)
describes the format of the image
public BufferedImage getSubimage(int x, int y, int w, int h)
public int[] getRGB(parameter list…)
Class
java.awt.image.ColorModel
• Abstract class used to construct
representations of colors (color
models) for images
• Methods include
– public abstract int getRed(int pixel)
– public abstract int getGreen(int pixel)
– public abstract int getBlue(int pixel)

Class
javax.media.format.RGBFormat
• The RGBFormat class of the
javax.media.format package
– Subclass of VideoFormat
(javax.media.format.VideoFormat)
• “32-bit packed RGB stored as 32-bit
integers would have the following
masks: redMask = 0x00FF0000,
greenMask = 0x0000FF00, blueMask
= 0x000000FF.” (From JMF API)
Pixel Representation
• Pixels as returned from the BufferedImage
class getRGB method
– Represented using the lower 24 bits of an int
• High order byte: Red
• Middle byte: Green
• Low-order byte: Blue
• This format is used because the Java programs we
are using set up the image formats this way
– RGBFormat (javax.media.format.RGBFormat)
image formats were obtained from the camera
Example
• 1) Given a pixel in RGBFormat as
described, use bit manipulation
operations to extract the R, G, and
B components
• 2) Use bit manipulation operations to
create a new RGBFormat pixel from
8-bit R, G, and B components

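A minimal sketch of both exercises, assuming the 24-bit packed layout described above (red in the high-order byte); the class and variable names are illustrative only:

public class RgbPixelOps {

    // 1) Extract the 8-bit R, G, and B components of a packed pixel
    static int red(int pixel)   { return (pixel >> 16) & 0xFF; }
    static int green(int pixel) { return (pixel >> 8) & 0xFF; }
    static int blue(int pixel)  { return pixel & 0xFF; }

    // 2) Pack 8-bit R, G, and B components into a new packed pixel
    static int pack(int r, int g, int b) {
        return ((r & 0xFF) << 16) | ((g & 0xFF) << 8) | (b & 0xFF);
    }

    public static void main(String[] args) {
        int pixel = 0x00A1B2C3; // example packed pixel
        System.out.printf("R=%d G=%d B=%d%n",
                red(pixel), green(pixel), blue(pixel));
        System.out.printf("repacked=0x%06X%n",
                pack(red(pixel), green(pixel), blue(pixel)));
    }
}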
Outline of Using JMF to
Access USB Camera Devices
• Determine available camera devices
– CaptureDeviceManager.getDeviceList
• Obtain a DataSource
– Use Manager to create this DataSource, based
on a MediaLocator
• Obtain a Processor
– From Manager based on the DataSource
• Obtain a PushBufferDataSource
– From the Processor

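A hedged sketch of these four steps end to end (the device index 0 and the crude polling wait are illustrative assumptions; each call is detailed on the following slides):

import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Controller;
import javax.media.Manager;
import javax.media.Processor;
import javax.media.format.RGBFormat;
import javax.media.protocol.DataSource;
import javax.media.protocol.PushBufferDataSource;

public class CaptureOutline {
    public static void main(String[] args) throws Exception {
        // 1) Determine available camera devices (here, RGB-capable ones)
        Vector devices = CaptureDeviceManager.getDeviceList(new RGBFormat());
        CaptureDeviceInfo info = (CaptureDeviceInfo) devices.elementAt(0);

        // 2) Obtain a DataSource from the device's MediaLocator
        DataSource source = Manager.createDataSource(info.getLocator());

        // 3) Obtain a Processor for that DataSource and bring it up
        Processor processor = Manager.createProcessor(source);
        processor.configure();
        waitFor(processor, Processor.Configured);
        processor.realize();
        waitFor(processor, Controller.Realized);
        processor.start();

        // 4) Obtain the PushBufferDataSource from the Processor
        PushBufferDataSource output =
            (PushBufferDataSource) processor.getDataOutput();
        System.out.println("Streams: " + output.getStreams().length);
    }

    // Crude polling wait; a ControllerListener (later slides) is the usual way
    private static void waitFor(Processor p, int state) throws InterruptedException {
        while (p.getState() < state) {
            Thread.sleep(50);
        }
    }
}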
Class
javax.media.CaptureDeviceManager
• Methods
public static CaptureDeviceInfo getDevice(java.lang.String deviceName)
• Get a device by name
public static java.util.Vector getDeviceList(Format format)
• Returns a Vector of CaptureDeviceInfo descriptions of the
devices available on the computer that support the given
format
– The formats reported for a device may include more than the one specified
– E.g., if a given device supports both YUV and RGB formats, then all
of its formats (including both YUV and RGB) will be returned in the
result of the getDeviceList call
• If format is null, returns the list of all available devices on the
system

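A small sketch that enumerates every registered capture device by passing a null format (the class name is illustrative; output depends on the machine):

import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Format;

public class ListDevices {
    public static void main(String[] args) {
        // null format => all registered capture devices
        Vector devices = CaptureDeviceManager.getDeviceList((Format) null);
        for (int i = 0; i < devices.size(); i++) {
            CaptureDeviceInfo info = (CaptureDeviceInfo) devices.elementAt(i);
            System.out.println(info.getName() + "  " + info.getLocator());
        }
    }
}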
Class
javax.media.CaptureDeviceInfo

• Methods include
public Format[] getFormats()
• Returns an array of Format objects
• The specific Formats available for the video device
public MediaLocator getLocator()

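A sketch of scanning the Formats a device reports for an RGBFormat (the helper class name is an assumption):

import javax.media.CaptureDeviceInfo;
import javax.media.Format;
import javax.media.format.RGBFormat;

public class FormatFinder {
    // Returns the first RGBFormat the device reports, or null if none
    public static Format findRgbFormat(CaptureDeviceInfo info) {
        Format[] formats = info.getFormats();
        for (int i = 0; i < formats.length; i++) {
            if (formats[i] instanceof RGBFormat) {
                return formats[i];
            }
        }
        return null;
    }
}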
Class
javax.media.Format
• Format class of javax.media package
• Constructor
Format(java.lang.String encoding)
• Device/media specific description of a media
format
– E.g., for a video (visual) device
• "RGB, 160x120, Length=38400, 16-bit,
Masks=31744:992:31, PixelStride=2,
LineStride=320, Flipped”
– E.g., for an audio device
• "PCM, 44.1 KHz, Stereo, Signed"
Class
javax.media.MediaLocator
• Constructor
– MediaLocator(java.lang.String locatorString)
• Locator strings are similar to URLs
• Example locatorString
– “vfw://0”
• This is for my USB web camera
• MediaLocators can also describe files, e.g.,
MediaLocator m = new
MediaLocator("file://media/example.mov");

Class
javax.media.Manager - 1
• Access point for obtaining system dependent
resources
– Players, DataSources, Processors, DataSinks, TimeBases
• DataSource
– Object used to deliver time-based multimedia
data that is specific to a delivery protocol.
• Examples of protocols: http, ftp, file
– Once you have a “usable” (formatted)
DataSource, you can display the media (via a
Player), or manipulate the information (via a
Processor)
Class javax.media.Manager - 2
• Manager methods include
public static DataSource createDataSource(MediaLocator sourceLocator)
Returns a data source for the protocol specified by the
MediaLocator.
When you call createDataSource on the media locator of a video
capture device (obtained from the CaptureDeviceManager), the
returned DataSource will implement the CaptureDevice interface
public static DataSource
createCloneableDataSource(DataSource source)
- creates a DataSource that can be cloned; this enables the
same DataSource to be processed by different tasks
• public static Processor
createProcessor(DataSource source)
- Processors are used in controlling the processing of
media
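A sketch of createCloneableDataSource, assuming the "vfw://0" locator used elsewhere in these slides; one Processor gets the cloneable source and a second gets a clone of it:

import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.protocol.DataSource;
import javax.media.protocol.SourceCloneable;

public class CloneExample {
    public static void main(String[] args) throws Exception {
        DataSource camera =
            Manager.createDataSource(new MediaLocator("vfw://0"));
        DataSource cloneable = Manager.createCloneableDataSource(camera);

        // The cloneable source feeds one Processor...
        Processor display = Manager.createProcessor(cloneable);

        // ...and clones of it can feed additional tasks
        DataSource clone = ((SourceCloneable) cloneable).createClone();
        Processor analysis = Manager.createProcessor(clone);
    }
}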
Class
javax.media.protocol.DataSource
• An abstraction for media protocol-handlers.
DataSource manages the life-cycle of the
media source by providing a simple
connection protocol.
• Methods include
– DataSource(MediaLocator source) (constructor)
– connect
– start, stop
• Start and stop data transfer

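A sketch of the connect/start/stop life cycle (the "vfw://0" locator is the one from the MediaLocator slide and may differ on another machine):

import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.protocol.DataSource;

public class DataSourceLifeCycle {
    public static void main(String[] args) throws Exception {
        DataSource source =
            Manager.createDataSource(new MediaLocator("vfw://0"));
        source.connect();    // acquire the device / open the connection
        source.start();      // begin data transfer
        // ... consume the data ...
        source.stop();       // halt data transfer
        source.disconnect(); // release the device
    }
}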
Interface CaptureDevice
• A capture device is a DataSource of type
PullDataSource, PullBufferDataSource,
PushDataSource or PushBufferDataSource.
It also implements the CaptureDevice
interface… A CaptureDevice DataSource
contains an array of SourceStream's.
These SourceStreams provide the
interface for the captured data
streams to be read. (From JMF API)
• Methods include
– public FormatControl[] getFormatControls()
Interface
FormatControl
• Objects implementing the
FormatControl interface can be used to
set the Format of the CaptureDevice to
the Format desired
• Methods
– public Format setFormat(Format format)
• Returns null if the format is not supported.
Otherwise, it (typically) returns the format that's
actually set.

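A sketch of requesting a specific format from a capture DataSource through its first FormatControl; the desired format would normally come from CaptureDeviceInfo.getFormats(), and the class name is illustrative:

import javax.media.Format;
import javax.media.control.FormatControl;
import javax.media.protocol.CaptureDevice;
import javax.media.protocol.DataSource;

public class SetCaptureFormat {
    // Returns the format actually set, or null if 'wanted' is unsupported
    public static Format request(DataSource source, Format wanted) {
        FormatControl[] controls = ((CaptureDevice) source).getFormatControls();
        Format actual = controls[0].setFormat(wanted);
        if (actual == null) {
            System.out.println("Requested format not supported");
        }
        return actual;
    }
}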
Interface
javax.media.Processor
• Processes and controls time-based media data.
Extends the Player interface. Unlike a Player,
which processes data as a "black box" and only
renders data to preset destinations, a Processor
supports a programmatic interface enabling
control over media data processing and access
to output data streams.
• Processing performed by a Processor is split
into three stages:
– Demultiplexing - into separate tracks of data
– Data transcoding - of each track into other formats
– Multiplexing - to form an interleaved stream
• The data transcoding and multiplexing processes are programmable.
Interface javax.media.Processor
• A Processor is a Player, which is a Controller
• Methods include
DataSource getDataOutput()
get the output DataSource from the Processor
public void addControllerListener(ControllerListener listener)
• From Controller interface
• Events get posted to the listener, i.e., the
public void controllerUpdate(ControllerEvent event)
method gets called for that listener
public void realize()
• Constructs media dependent portions of Controller.
public void prefetch()
public void start()
public int getState()
• States: Unrealized, Configuring, Configured, Realizing,
Realized, Prefetching, Prefetched, and Started.
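A sketch of driving a Processor through realize/prefetch/start with a ControllerListener instead of polling (the class name is illustrative):

import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.PrefetchCompleteEvent;
import javax.media.Processor;
import javax.media.RealizeCompleteEvent;

public class ProcessorDriver implements ControllerListener {
    private final Processor processor;

    public ProcessorDriver(Processor processor) {
        this.processor = processor;
        processor.addControllerListener(this);
        processor.realize();                  // asynchronous; events follow
    }

    // Posted ControllerEvents arrive here
    public void controllerUpdate(ControllerEvent event) {
        if (event instanceof RealizeCompleteEvent) {
            processor.prefetch();             // Realized -> start prefetching
        } else if (event instanceof PrefetchCompleteEvent) {
            processor.start();                // Prefetched -> Started
        }
    }
}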
Controller Life Cycle
• Controller’s have five resource-allocation states:
Unrealized, Realizing, Realized, Prefetching, and
Prefetched.
• These states provide programmatic control over
potentially time-consuming operations. For
example, when a Controller is first constructed, it's
in the Unrealized state. While Realizing, the
Controller performs the communication necessary
to locate all of the resources it needs to function
(such as communicating with a server, other
controllers, or a file system). The realize method
allows an application to initiate this potentially time-
consuming process (Realizing) at an appropriate
time.
Class
javax.media.protocol.PushBufferDataSource

• A data source that manages data in the form of push
streams. The streams from this DataSource are
PushBufferStreams and contain Buffer objects
• We use getDataOutput to get a DataSource
from our Processor, and this DataSource is
actually a PushBufferDataSource
• Methods
public abstract PushBufferStream[] getStreams()
• Return streams that this source manages. The collection of
streams is entirely content dependent. The ContentDescriptor
of this DataSource provides the only indication of what
streams can be available on this connection.
• For our USB camera capture device, there is only one
PushBufferStream
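A sketch of pulling the single camera stream out of the Processor's output DataSource (assumes the Processor has already been realized; the class name is illustrative):

import javax.media.Processor;
import javax.media.protocol.PushBufferDataSource;
import javax.media.protocol.PushBufferStream;

public class StreamAccess {
    public static PushBufferStream cameraStream(Processor processor) {
        PushBufferDataSource output =
            (PushBufferDataSource) processor.getDataOutput();
        PushBufferStream[] streams = output.getStreams();
        return streams[0]; // the USB camera exposes a single stream
    }
}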
Interface
javax.media.protocol.PushBufferStream
• A read interface that pushes data in the form
of Buffer objects. Allows a source stream to
transfer data in the form of a media object.
The media object transferred is the Buffer
object as defined in javax.media.Buffer. The
user of the stream will allocate an empty
Buffer object and pass this to the source
stream in the read() method. The source
stream allocates the Buffer object's data and
header, sets them on the Buffer and sends
them to the user.

Interface
javax.media.protocol.PushBufferStream
• Methods include
public Format getFormat()
public void
setTransferHandler(BufferTransferHandler transferHandler)
• Register an object to service data transfers to this stream.
public void read(Buffer buffer)
• Read from the stream without blocking.
• (The API does not document what happens if you read when
no data is available: is the buffer left empty, or is an
IOException raised?)

Interface
javax.media.protocol.BufferTransferHandler
• Defines the callback invoked by a
PushBufferStream.
• A PushBufferStream needs to notify the data
handler when data is available to be pushed.
• Methods include
public void
transferData(PushBufferStream stream)
• Notification from the PushBufferStream to the handler
that data is available to be read from stream. The data
can be read by this handler in the same thread or can be
read later.
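A sketch of a BufferTransferHandler that reads each pushed frame into a Buffer (the class name and the println are illustrative only):

import java.io.IOException;
import javax.media.Buffer;
import javax.media.protocol.BufferTransferHandler;
import javax.media.protocol.PushBufferStream;

public class FrameGrabber implements BufferTransferHandler {
    private final Buffer buffer = new Buffer();

    // Ask the stream to notify this handler when data is available
    public void register(PushBufferStream stream) {
        stream.setTransferHandler(this);
    }

    // Called by the stream whenever a new frame can be read
    public void transferData(PushBufferStream stream) {
        try {
            stream.read(buffer);   // non-blocking read of the frame
            System.out.println("Frame of " + buffer.getLength() + " bytes");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}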
Class
javax.media.Buffer
• A media-data container that carries media
data from one processing stage to the next
inside of a Player or Processor. Buffer
objects are also used to carry data between
a buffer stream and its handler.
• Maintains information including the time
stamp, length, and Format of the data it
carries, as well as any header information
that might be required to process the media
data.
Class
javax.media.util.BufferToImage
• A utility class to convert a video Buffer
object to an AWT Image object
– you can then render the Image to the user
display with AWT methods.
• Methods include
public BufferToImage(VideoFormat format)
public java.awt.Image createImage(Buffer buffer)
Converts the input Buffer to a standard
AWT image and returns the image.
(From Above)
Class
java.awt.image.BufferedImage
• An Image with an accessible buffer of image data, plus
color model information
– A subclass of java.awt.Image
• Represents color or grayscale images in various
formats
– E.g., TYPE_BYTE_GRAY, TYPE_USHORT_GRAY,
TYPE_USHORT_565_RGB, TYPE_INT_RGB
• Methods include
public ColorModel getColorModel()
• ColorModel (java.awt.image.ColorModel) describes the
format of the image
public BufferedImage getSubimage(int x, int y, int w, int h)
public int[] getRGB(parameter list…)
Capturing the Webcam Images
PushBufferStream PushBuffer; // created above
Buffer ImageBuffer = new Buffer();
BufferToImage ImageConverter =
new BufferToImage((VideoFormat) (PushBuffer.getFormat()));
// assume a new frame is ready
PushBuffer.read(ImageBuffer);
BufferedImage InputImage =
((BufferedImage) (ImageConverter.createImage(ImageBuffer)));
int[] CurrentFrame = InputImage.getRGB(0, 0,
VideoWidth, VideoHeight, null, 0, VideoWidth);
From ImageProcessingThread.java (Lab1)
