
© 2002 by Prentice-Hall, Inc.

22 Java Media Framework and Java Sound (on CD)

Objectives
• To understand the capabilities of the Java Media Framework (JMF).
• To understand the capabilities of the Java Sound API.
• To be able to play audio and video media with JMF.
• To be able to stream media over a network.
• To be able to capture, format and save media.
• To be able to play sounds with the Java Sound API.
• To be able to play, record, and synthesize MIDI with the Java Sound API.

TV gives everyone an image, but radio gives birth to a million images in a million brains.
Peggy Noonan

Noise proves nothing. Often a hen who has merely laid an egg cackles as if she had laid an asteroid.
Mark Twain, Following the Equator

A wide screen just makes a bad film twice as bad.
Samuel Goldwyn

Isn’t life a series of images that change as they repeat themselves?
Andy Warhol

jhtp4_22.FM Page 1236 Monday, July 16, 2001 11:29 AM

Chapter 22 Java Media Framework and Java Sound (on CD) 1237

22.1 Introduction

This chapter continues our multimedia discussions of Chapter 18 by introducing some of Java’s multimedia APIs that enable programmers to enhance applications with video and audio features. In recent years, the digital multimedia sector of the computer industry has experienced tremendous growth, as evidenced by the enormous quantity of multimedia content available on the Internet. Web sites have been transformed from text-based HTML pages to multimedia-intensive experiences. Advances in hardware and software technologies have allowed developers to integrate multimedia into the simplest applications. At the high end of multimedia applications, the video game industry has used multimedia programming to take advantage of the latest hardware technologies, such as 3D video cards that create virtual reality experiences for users.

Acknowledging that Java applications should support digital video and audio capabilities, Sun Microsystems, Intel and Silicon Graphics worked together to produce a multimedia API known as the Java Media Framework (JMF). The JMF API is one of several multimedia APIs in Java. Using the JMF API, programmers can create Java applications that play, edit, stream and capture many popular media types. The first half of this chapter discusses the JMF API.

IBM and Sun developed the latest JMF specification, version 2.0. Sun provides a reference implementation, JMF 2.1.1, of the JMF specification, which supports media file types such as Microsoft Audio/Video Interleave (.avi), Macromedia Flash 2 movies (.swf), Future Splash (.spl), MPEG Layer 3 Audio (.mp3), Musical Instrument Digital Interface (MIDI; .mid), MPEG-1 videos (.mpeg, .mpg), QuickTime (.mov), Sun Audio (.au), Wave audio (.wav), AIFF (.aiff) and GSM (.gsm) files. The JMF also supports media from capture devices such as microphones and digital cameras.

Outline

22.1 Introduction
22.2 Playing Media
22.3 Formatting and Saving Captured Media
22.4 RTP Streaming
22.5 Java Sound
22.6 Playing Sampled Audio
22.7 Musical Instrument Digital Interface (MIDI)
22.7.1 MIDI Playback
22.7.2 MIDI Recording
22.7.3 MIDI Synthesis
22.7.4 Class MidiDemo
22.8 Internet and World Wide Web Resources
22.9 (Optional Case Study) Thinking About Objects: Animation and Sound in the View

Summary • Terminology • Self-Review Exercises • Answers to Self-Review Exercises • Exercises


In addition to the sample media clips provided with this chapter’s examples on the CD, many Web sites offer an abundant supply of free-for-download audio and video clips. You can download media clips from these sites (and many others on the Internet) and use them to test the examples in this chapter. We present a list of sites here to get you started. Free Audio Clips (www.freeaudioclips.com) is an excellent site for various types of audio files. The 13 Even site (www.13-even.com/media.html) provides audio and video clips in many formats for your personal use. If you are looking for MIDI audio files for use in Section 22.7, check out the free MIDI clips at www.freestuffgalore.com/midi.asp. Funny Video Clips (www.video-clips.co.uk) offers entertaining material. Microsoft’s downloads site (msdn.microsoft.com/downloads) contains a multimedia section providing audio clips and other media.

Currently, JMF is available as an extension package separate from the Java 2 Software Development Kit. The CD that accompanies this book contains JMF 2.1.1. The most recent JMF implementation can be downloaded from the official JMF Web site:

java.sun.com/products/java-media/jmf

The JMF Web site provides versions of the JMF that take advantage of the performance features of the platform on which the JMF is running. For example, the JMF Windows Performance Pack provides extensive media and device support for Java programs running on Microsoft Windows platforms (Windows 95/98/NT 4.0/2000). JMF’s official Web site also provides continually updated support, information and resources for JMF programmers.

Portability Tip 22.1
Writing programs using JMF’s Windows Performance Pack reduces the portability of those programs to other operating systems.

The rest of this chapter discusses the Java Sound API and its extensive sound-processing capabilities. Internally, the JMF uses Java Sound for its audio functions. In Sections 22.5 through 22.7, we will demonstrate sampled audio playback and MIDI functionality using Java Sound, a standard extension of the Java 2 Software Development Kit.
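Because Java Sound ships with the JDK, you can explore its capabilities directly before installing JMF. The sketch below (our own illustration, not one of the chapter’s examples) asks the installed Java Sound implementation which audio file types it can write:

```java
// List the audio file types the local Java Sound implementation
// can write; standard JDKs report at least WAVE, AU and AIFF.
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioSystem;

public class SoundTypes {

   public static void main( String args[] )
   {
      AudioFileFormat.Type types[] = AudioSystem.getAudioFileTypes();

      for ( int i = 0; i < types.length; i++ )
         System.out.println(
            types[ i ] + " (." + types[ i ].getExtension() + ")" );
   }
}
```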

22.2 Playing Media

The JMF is commonly used to play back media clips in Java applications. Many applications such as financial managers, encyclopedias and games use multimedia to illustrate application features, present educational content and entertain users.

The JMF offers several mechanisms for playing media, the simplest of which is via objects that implement interface Player. Interface Player (package javax.media) extends Controller, which is a handler for JMF-supported media.

The following steps are needed to play a media clip:

1. Specify the media source.

2. Create a Player for the media.

3. Obtain the output media and Player controls.

4. Display the media and controls.

Class SimplePlayer (Fig. 22.1) is a simple Java media player program that demonstrates several common features of popular media players. The SimplePlayer demo can play most JMF-supported media files, with the possible exception of the latest versions of the formats. This application permits users to access files on the local computer that contain supported media types by clicking the Open File button. Clicking the Open Location button and specifying a media URL allows the user to access media from a media source, such as a capture device, a Web server, or a streaming source. A capture device (discussed in Section 22.3) reads media from audio and video devices such as microphones, CD players and cameras. A Real-Time Transport Protocol (RTP) stream is a stream of bytes sent over a network from a streaming server. An application buffers and plays the streaming media on the client computer.
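The client-side buffering just described can be sketched with the standard I/O classes. In the sketch below (our own simplification, not an RTP client), a ByteArrayInputStream stands in for the network stream and a BufferedInputStream plays the role of the player’s buffer:

```java
// Simulate buffered consumption of a media stream: a player reads
// the stream in chunks while a buffer smooths out delivery.
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class BufferDemo {

   public static void main( String args[] ) throws IOException
   {
      // stand-in for incoming network media data (hypothetical size)
      byte mediaData[] = new byte[ 10000 ];

      InputStream in = new BufferedInputStream(
         new ByteArrayInputStream( mediaData ), 4096 );

      byte chunk[] = new byte[ 1024 ];
      int total = 0;
      int read;

      // consume the stream chunk by chunk, as a player would
      while ( ( read = in.read( chunk ) ) != -1 )
         total += read;

      System.out.println( "played " + total + " bytes" );  // played 10000 bytes
   }
}
```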

1 // Fig. 22.1: SimplePlayer.java
2 // Opens and plays a media file from
3 // local computer, public URL, or an RTP session
4
5 // Java core packages
6 import java.awt.*;
7 import java.awt.event.*;
8 import java.io.*;
9 import java.net.*;

10
11 // Java extension packages
12 import javax.swing.*;
13 import javax.media.*;
14
15 public class SimplePlayer extends JFrame {
16
17    // Java media player
18    private Player player;
19
20    // visual content component
21    private Component visualMedia;
22
23    // controls component for media
24    private Component mediaControl;
25
26    // main container
27    private Container container;
28
29    // media file and media locations
30    private File mediaFile;
31    private URL fileURL;
32
33    // constructor for SimplePlayer
34    public SimplePlayer()
35    {
36       super( "Simple Java Media Player" );
37
38       container = getContentPane();
39
40       // panel containing buttons
41       JPanel buttonPanel = new JPanel();

Fig. 22.1 Playing media with interface Player (part 1 of 8).

jhtp4_22.FM Page 1239 Monday, July 16, 2001 11:29 AM

1240 Java Media Framework and Java Sound (on CD) Chapter 22

42       container.add( buttonPanel, BorderLayout.NORTH );
43
44       // opening file from directory button
45       JButton openFile = new JButton( "Open File" );
46       buttonPanel.add( openFile );
47
48       // register an ActionListener for openFile events
49       openFile.addActionListener(
50
51          // anonymous inner class to handle openFile events
52          new ActionListener() {
53
54             // open and create player for file
55             public void actionPerformed( ActionEvent event )
56             {
57                mediaFile = getFile();
58
59                if ( mediaFile != null ) {
60
61                   // obtain URL from file
62                   try {
63                      fileURL = mediaFile.toURL();
64                   }
65
66                   // file path unresolvable
67                   catch ( MalformedURLException badURL ) {
68                      badURL.printStackTrace();
69                      showErrorMessage( "Bad URL" );
70                   }
71
72                   makePlayer( fileURL.toString() );
73
74                }
75
76             } // end actionPerformed
77
78          } // end ActionListener
79
80       ); // end call to method addActionListener
81
82       // URL opening button
83       JButton openURL = new JButton( "Open Locator" );
84       buttonPanel.add( openURL );
85
86       // register an ActionListener for openURL events
87       openURL.addActionListener(
88
89          // anonymous inner class to handle openURL events
90          new ActionListener() {
91
92             // open and create player for media locator
93             public void actionPerformed( ActionEvent event )
94             {

Fig. 22.1 Playing media with interface Player (part 2 of 8).

jhtp4_22.FM Page 1240 Monday, July 16, 2001 11:29 AM

Chapter 22 Java Media Framework and Java Sound (on CD) 1241

95                String addressName = getMediaLocation();
96
97                if ( addressName != null )
98                   makePlayer( addressName );
99             }
100
101          } // end ActionListener
102
103       ); // end call to method addActionListener
104
105       // turn on lightweight rendering on players to enable
106       // better compatibility with lightweight GUI components
107       Manager.setHint( Manager.LIGHTWEIGHT_RENDERER,
108          Boolean.TRUE );
109
110    } // end SimplePlayer constructor
111
112    // utility method for pop-up error messages
113    public void showErrorMessage( String error )
114    {
115       JOptionPane.showMessageDialog( this, error, "Error",
116          JOptionPane.ERROR_MESSAGE );
117    }
118
119    // get file from computer
120    public File getFile()
121    {
122       JFileChooser fileChooser = new JFileChooser();
123
124       fileChooser.setFileSelectionMode(
125          JFileChooser.FILES_ONLY );
126
127       int result = fileChooser.showOpenDialog( this );
128
129       if ( result == JFileChooser.CANCEL_OPTION )
130          return null;
131
132       else
133          return fileChooser.getSelectedFile();
134    }
135
136    // get media location from user input
137    public String getMediaLocation()
138    {
139       String input = JOptionPane.showInputDialog(
140          this, "Enter URL" );
141
142       // if user presses OK with no input
143       if ( input != null && input.length() == 0 )
144          return null;
145
146       return input;
147    }

Fig. 22.1 Playing media with interface Player (part 3 of 8).


148
149    // create player using media's location
150    public void makePlayer( String mediaLocation )
151    {
152       // reset player and window if previous player exists
153       if ( player != null )
154          removePlayerComponents();
155
156       // location of media source
157       MediaLocator mediaLocator =
158          new MediaLocator( mediaLocation );
159
160       if ( mediaLocator == null ) {
161          showErrorMessage( "Error opening file" );
162          return;
163       }
164
165       // create a player from MediaLocator
166       try {
167          player = Manager.createPlayer( mediaLocator );
168
169          // register ControllerListener to handle Player events
170          player.addControllerListener(
171             new PlayerEventHandler() );
172
173          // call realize to enable rendering of player's media
174          player.realize();
175       }
176
177       // no player exists or format is unsupported
178       catch ( NoPlayerException noPlayerException ) {
179          noPlayerException.printStackTrace();
180       }
181
182       // file input error
183       catch ( IOException ioException ) {
184          ioException.printStackTrace();
185       }
186
187    } // end makePlayer method
188
189    // return player to system resources and
190    // reset media and controls
191    public void removePlayerComponents()
192    {
193       // remove previous video component if there is one
194       if ( visualMedia != null )
195          container.remove( visualMedia );
196
197       // remove previous media control if there is one
198       if ( mediaControl != null )
199          container.remove( mediaControl );
200

Fig. 22.1 Playing media with interface Player (part 4 of 8).


201       // stop player and return allocated resources
202       player.close();
203    }
204
205    // obtain visual media and player controls
206    public void getMediaComponents()
207    {
208       // get visual component from player
209       visualMedia = player.getVisualComponent();
210
211       // add visual component if present
212       if ( visualMedia != null )
213          container.add( visualMedia, BorderLayout.CENTER );
214
215       // get player control GUI
216       mediaControl = player.getControlPanelComponent();
217
218       // add controls component if present
219       if ( mediaControl != null )
220          container.add( mediaControl, BorderLayout.SOUTH );
221
222    } // end method getMediaComponents
223
224    // handler for player's ControllerEvents
225    private class PlayerEventHandler extends ControllerAdapter {
226
227       // prefetch media feed once player is realized
228       public void realizeComplete(
229          RealizeCompleteEvent realizeDoneEvent )
230       {
231          player.prefetch();
232       }
233
234       // player can start showing media after prefetching
235       public void prefetchComplete(
236          PrefetchCompleteEvent prefetchDoneEvent )
237       {
238          getMediaComponents();
239
240          // ensure valid layout of frame
241          validate();
242
243          // start playing media
244          player.start();
245
246       } // end prefetchComplete method
247
248       // if end of media, reset to beginning, stop play
249       public void endOfMedia( EndOfMediaEvent mediaEndEvent )
250       {
251          player.setMediaTime( new Time( 0 ) );
252          player.stop();
253       }

Fig. 22.1 Playing media with interface Player (part 5 of 8).


254
255    } // end PlayerEventHandler inner class
256
257    // execute application
258    public static void main( String args[] )
259    {
260       SimplePlayer testPlayer = new SimplePlayer();
261
262       testPlayer.setSize( 300, 300 );
263       testPlayer.setLocation( 300, 300 );
264       testPlayer.setDefaultCloseOperation( EXIT_ON_CLOSE );
265       testPlayer.setVisible( true );
266    }
267
268 } // end class SimplePlayer

Fig. 22.1 Playing media with interface Player (part 6 of 8).


Fig. 22.1 Playing media with interface Player (part 7 of 8).


A media clip must be processed before it is played. To process a media clip, the program must access a media source, create a Controller for that source and output the media. Prior to output, users may perform optional formatting, such as changing an AVI video to a QuickTime video. Although JMF hides low-level media processing (e.g., checking for file compatibility) from the programmer, both programmers and users can configure how a Player presents media. Section 22.3 and Section 22.4 reveal that capturing and streaming media follow the same guidelines. Section 22.8 lists several Web sites that offer JMF-supported media content.

Figure 22.1 introduces some key objects for playing media. The JMF extension package javax.media (imported in line 13) contains interface Player and other classes and interfaces needed for events. Line 18 declares a Player object to play media clips. Lines 30–31 declare the location of these clips as File and URL references.

Fig. 22.1 Playing media with interface Player (part 8 of 8).

Lines 21 and 24 declare Component objects for the video display and for holding the controls. Component mediaControl enables users to play, pause and stop the media clip. Component visualMedia displays the video portion of a media clip (if the media clip is a video). The JMF provides lightweight video renderers that are compatible with lightweight Swing components (see Chapter 13). Lines 107–108 in SimplePlayer’s constructor specify that the Player should draw its GUI components and video portion (if there is one) using lightweight renderers so that the media player will look like other GUIs with Swing components. By default, Player’s video components are heavyweight components, which may not display correctly when mixed with lightweight Swing GUI components.

Before playing the media, SimplePlayer displays an initial GUI consisting of two buttons, Open File and Open Locator, that enable users to specify the media location. The event handlers for these two buttons (lines 52–78 and lines 90–101) perform similar functions. Each button prompts users for a media resource, such as an audio or video clip, then creates a Player for the specified media. When the user clicks Open File, line 57 calls method getFile (lines 120–134) to prompt users to select a media file from the local computer. Line 63 calls the File method toURL to obtain a URL representation of the selected file’s name and location. Line 72 calls SimplePlayer method makePlayer (lines 150–187) to create a Player for the user-selected media. When users click Open Locator, line 95 invokes method getMediaLocation (lines 137–147), prompting users to input a String giving the media location. Line 98 calls SimplePlayer method makePlayer to create a Player for the media at the specified location.
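The File-to-URL conversion in line 63 can be tried in isolation. Note that File method toURL was deprecated in later JDKs; toURI().toURL() performs the same conversion with proper character escaping. The file name below is hypothetical, and the file need not exist to form the URL:

```java
// Convert a file name into a file: URL, as SimplePlayer does
// before handing the location to makePlayer.
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;

public class FileToUrl {

   public static void main( String args[] )
      throws MalformedURLException
   {
      // hypothetical media file; resolved against the working directory
      File mediaFile = new File( "clip.wav" );

      // equivalent to the deprecated mediaFile.toURL()
      URL fileURL = mediaFile.toURI().toURL();

      System.out.println( fileURL );  // e.g. file:/home/user/clip.wav
   }
}
```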

Method makePlayer (lines 150–187) makes the necessary preparations to create a Player of media clips. The String argument indicates the media’s location. Lines 153–154 invoke SimplePlayer method removePlayerComponents (lines 191–203) to remove the previous Player’s visual component and GUI controls from the frame before creating a new Player. Line 202 invokes Player method close to stop all player activity and to release system resources held by the previous Player.

Method makePlayer requires a pointer to the source from which the media is retrieved, which is accomplished by instantiating a new MediaLocator for the value given by the String argument (lines 157–158). A MediaLocator specifies the location of a media source, much like a URL typically specifies the location of a Web page. A MediaLocator can access media from capture devices and RTP sessions as well as from file locations. The MediaLocator constructor requires the media’s location as a String, so all URLs must be converted to Strings, as in line 72.

Method makePlayer instantiates a new Player with a call to Manager method createPlayer. Class Manager provides static methods that enable programs to access most JMF resources. Method createPlayer opens the specified media source and determines the appropriate Player for the media source. Method createPlayer throws a NoPlayerException if an appropriate Player cannot be found for the media clip. An IOException is thrown if there are problems connecting to the media source.

ControllerListeners listen for the ControllerEvents that Players generate to track the progress of a Player in the media-handling process. Lines 170–171 register an instance of inner class PlayerEventHandler (lines 225–255) to listen for certain events that player generates. Class PlayerEventHandler extends class ControllerAdapter, which provides empty implementations of methods from interface ControllerListener. Class ControllerAdapter facilitates implementing ControllerListener for classes that need to handle only a few ControllerEvent types.
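The adapter idiom that ControllerAdapter embodies appears throughout the AWT and JMF APIs. The self-contained sketch below recreates the pattern with a hypothetical listener interface of our own, so the mechanics are visible without JMF installed: the adapter supplies empty bodies, and a subclass overrides only the callback it cares about.

```java
// hypothetical listener interface with several callbacks, analogous
// to ControllerListener's many ControllerEvent notifications
interface TransferListener {
   void started();
   void progress( int percent );
   void finished();
}

// adapter provides empty implementations, analogous to ControllerAdapter
class TransferAdapter implements TransferListener {
   public void started() {}
   public void progress( int percent ) {}
   public void finished() {}
}

public class AdapterDemo {

   public static void main( String args[] )
   {
      // override only the one callback this client needs
      TransferListener listener = new TransferAdapter() {
         public void finished()
         {
            System.out.println( "transfer complete" );
         }
      };

      listener.started();   // inherited empty body: nothing happens
      listener.finished();  // prints "transfer complete"
   }
}
```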

Players confirm their progress while processing media via their state transitions. Line 174 invokes Player method realize to confirm that all resources necessary to play the media are available. Method realize places the Player in the Realizing state to indicate that it is connecting to and interacting with its media sources. When a Player completes realizing, it generates a RealizeCompleteEvent, a type of ControllerEvent that occurs when a Player completes its transition to state Realized. This state indicates that the Player has completed all preparations needed to start processing the media. The program invokes method realizeComplete (lines 228–232) when the Player generates a RealizeCompleteEvent.

Most media players have a buffering feature, which stores a portion of downloaded media locally so that users do not have to wait for an entire clip to download before playing it, because reading media data can take a long time. By invoking Player method prefetch, line 231 transitions the player to the Prefetching state. When a Player prefetches a media clip, the Player obtains exclusive control over certain system resources needed to play the clip. The Player also begins buffering media data to reduce the delay before the media clip plays.
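The state progression discussed in this and the preceding paragraphs (Unrealized, through Realizing and Realized, then Prefetching, Prefetched and Started) can be modeled as a simple state machine. The enum below is our own illustration of the legal transitions, not the JMF API itself:

```java
// illustrative model of the Controller state transitions described
// in the text; each operation advances the state only when legal
public class PlayerStates {

   enum State { UNREALIZED, REALIZED, PREFETCHED, STARTED }

   // realize: Unrealized -> Realized (via Realizing)
   static State realize( State s )
   {
      return s == State.UNREALIZED ? State.REALIZED : s;
   }

   // prefetch: Realized -> Prefetched (via Prefetching)
   static State prefetch( State s )
   {
      return s == State.REALIZED ? State.PREFETCHED : s;
   }

   // start: Prefetched -> Started
   static State start( State s )
   {
      return s == State.PREFETCHED ? State.STARTED : s;
   }

   public static void main( String args[] )
   {
      State s = State.UNREALIZED;
      s = realize( s );   // RealizeCompleteEvent fires here
      s = prefetch( s );  // PrefetchCompleteEvent fires here
      s = start( s );
      System.out.println( s );  // STARTED
   }
}
```

In the real API these transitions are asynchronous, which is why SimplePlayer advances through them in ControllerListener callbacks rather than in straight-line code.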

When the Player completes prefetching, it transitions to state Prefetched and is ready to play media. During this transition, the Player generates a ControllerEvent of type PrefetchCompleteEvent to indicate that it is ready to display media. The Player invokes PlayerEventHandler method prefetchComplete (lines 235–246), which displays the Player’s GUI in the frame. After obtaining the hardware resources, the program can get the media components it requires. Line 238 invokes method getMediaComponents (lines 206–222) to obtain the GUI’s controls and the media’s visual component (if the media is a video clip) and attach them to the application window’s content pane. Player method getVisualComponent (line 209) obtains the visual component of the video clip. Similarly, line 216 invokes Player method getControlPanelComponent to return the GUI’s controls. The GUI (Fig. 22.1) typically provides the following controls:

1. A positioning slider to jump to certain points in the media clip.

2. A pause button.

3. A volume button that provides volume control by right clicking and a mute function by left clicking.

4. A media properties button that provides detailed media information by right clicking and frame rate control by left clicking.

Look-and-Feel Observation 22.1
Invoking Player method getVisualComponent yields null for audio files, because there is no visual component to display.

Look-and-Feel Observation 22.2
Invoking Player method getControlPanelComponent yields different sets of GUI controls depending on the media type. For example, media content streamed directly from a live conference does not have a progress bar, because the length of the media is not predetermined.

After validating the new frame layout (line 241), line 244 invokes Player method start to begin playing the media clip.

Software Engineering Observation 22.1
If the Player has not prefetched or realized the media, invoking Player method start prefetches and realizes the media.


Performance Tip 22.1
Starting the Player takes less time if the Player has already prefetched the media before invoking start.

When the media clip ends, the Player generates a ControllerEvent of type EndOfMediaEvent. Most media players “rewind” the media clip after reaching the end so users can see or hear it again from the beginning. Method endOfMedia (lines 249–253) handles the EndOfMediaEvent and resets the media clip to its beginning position by invoking Player method setMediaTime with a new Time (package javax.media) of 0 (line 251). Method setMediaTime sets the position of the media to a specific time location in the media, and is useful for “jumping” to a different part of the media. Line 252 invokes Player method stop, which ends media processing and places the Player in state Stopped. Invoking method start on a Stopped Player that has not been closed resumes media playback.

Often, it is desirable to configure the media before presentation. In the next section, we discuss interface Processor, which has more configuration capabilities than interface Player. Processors enable a program to format media and to save it to a file.

22.3 Formatting and Saving Captured Media

The Java Media Framework supports playing and saving media from capture devices such as microphones and video cameras. This type of media is known as captured media. Capture devices convert analog media into digitized media. For example, a program that captures an analog voice from a microphone attached to a computer can create a digitized file from that recording.

The JMF can access video capture devices that use Video for Windows drivers. Also, JMF supports audio capture devices that use the Windows Direct Sound Interface or the Java Sound Interface. The Video for Windows driver provides interfaces that enable Windows applications to access and process media from video devices. Similarly, Direct Sound and Java Sound are interfaces that enable applications to access sound devices such as hardware sound cards. The Solaris Performance Pack provides support for Java Sound and SunVideo capture devices on the Solaris platform. For a complete list of devices supported by JMF, visit JMF’s official Web site.
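Java Sound can enumerate the installed audio devices directly, which is a convenient way to see what JMF’s Java Sound capture support would find on your machine. This sketch uses only the standard javax.sound.sampled package and may print an empty list on machines with no audio hardware:

```java
// Enumerate installed audio devices (mixers) via Java Sound -- the
// same devices JMF reaches through its Java Sound capture support.
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Mixer;

public class ListDevices {

   public static void main( String args[] )
   {
      Mixer.Info mixers[] = AudioSystem.getMixerInfo();

      for ( int i = 0; i < mixers.length; i++ )
         System.out.println( mixers[ i ].getName() + " - " +
            mixers[ i ].getDescription() );
   }
}
```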

The SimplePlayer application presented in Fig. 22.1 allowed users to play media from a capture device. A locator string specifies the location of a capture device that the SimplePlayer demo accesses. For example, to test the SimplePlayer’s capturing capabilities, plug a microphone into a sound card’s microphone input jack. Typing the locator string javasound:// in the Open Location input dialog specifies that media should be input from the Java Sound-enabled capture device. The locator string initializes the MediaLocator that the Player needs for the audio material captured from the microphone.

Although SimplePlayer provides access to capture devices, it does not format the media or save captured data. Figure 22.2 presents a program that accomplishes these two new tasks. Class CapturePlayer provides more control over media properties via class DataSource (package javax.media.protocol). Class DataSource provides the connection to the media source, then abstracts that connection to allow users to manipulate it. This program uses a DataSource to format the input and output media. The DataSource passes the formatted output media to a Controller, which formats it further so that it can be saved to a file. The Controller that handles media is a Processor, which extends interface Player. Finally, an object that implements interface DataSink saves the captured, formatted media. The Processor object handles the flow of data from the DataSource to the DataSink object.

JMF and Java Sound use media sources extensively, so programmers must understand the arrangement of data in the media. The header on a media source specifies the media format and other essential information needed to play the media. The media content usually consists of tracks of data, similar to tracks of music on a CD. Media sources may have one or more tracks that contain a variety of data. For example, a movie clip may contain one track for video, one track for audio, and a third track for closed-captioning data for the hearing impaired.
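The kind of format information a media header carries can be expressed with Java Sound’s AudioFormat class, which is part of the standard JDK. The values below describe CD-quality stereo audio; they are our own example and are not tied to any particular media file:

```java
// Describe CD-quality stereo PCM audio -- the sort of information
// a media source's header records for each audio track.
import javax.sound.sampled.AudioFormat;

public class FormatInfo {

   public static void main( String args[] )
   {
      AudioFormat format = new AudioFormat(
         AudioFormat.Encoding.PCM_SIGNED,
         44100.0f,  // sample rate in Hz
         16,        // bits per sample
         2,         // channels (stereo)
         4,         // frame size in bytes (2 channels x 2 bytes)
         44100.0f,  // frame rate
         false );   // little-endian byte order

      System.out.println( format );
   }
}
```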

1 // Fig. 22.2: CapturePlayer.java
2 // Presents and saves captured media
3
4 // Java core packages
5 import java.awt.*;
6 import java.awt.event.*;
7 import java.io.*;
8 import java.util.*;
9

10 // Java extension packages
11 import javax.swing.*;
12 import javax.swing.event.*;
13 import javax.media.*;
14 import javax.media.protocol.*;
15 import javax.media.format.*;
16 import javax.media.control.*;
17 import javax.media.datasink.*;
18
19 public class CapturePlayer extends JFrame {
20
21    // capture and save button
22    private JButton captureButton;
23
24    // component for save capture GUI
25    private Component saveProgress;
26
27    // formats of device's media, user-chosen format
28    private Format formats[], selectedFormat;
29
30    // controls of device's media formats
31    private FormatControl formatControls[];
32
33    // specification information of device
34    private CaptureDeviceInfo deviceInfo;
35
36    // vector containing all devices' information
37    private Vector deviceList;
38

Fig. 22.2 Formatting and saving media from capture devices (part 1 of 9).


39    // input and output data sources
40    private DataSource inSource, outSource;
41
42    // file writer for captured media
43    private DataSink dataSink;
44
45    // processor to render and save captured media
46    private Processor processor;
47
48    // constructor for CapturePlayer
49    public CapturePlayer()
50    {
51       super( "Capture Player" );
52
53       // panel containing buttons
54       JPanel buttonPanel = new JPanel();
55       getContentPane().add( buttonPanel );
56
57       // button for accessing and initializing capture devices
58       captureButton = new JButton( "Capture and Save File" );
59       buttonPanel.add( captureButton, BorderLayout.CENTER );
60
61       // register an ActionListener for captureButton events
62       captureButton.addActionListener( new CaptureHandler() );
63
64       // turn on light rendering to enable compatibility
65       // with lightweight GUI components
66       Manager.setHint( Manager.LIGHTWEIGHT_RENDERER,
67          Boolean.TRUE );
68
69       // register a WindowListener to frame events
70       addWindowListener(
71
72          // anonymous inner class to handle WindowEvents
73          new WindowAdapter() {
74
75             // dispose processor
76             public void windowClosing(
77                WindowEvent windowEvent )
78             {
79                if ( processor != null )
80                   processor.close();
81             }
82
83          } // end WindowAdapter
84
85       ); // end call to method addWindowListener
86
87    } // end constructor
88
89    // action handler class for setting up device
90    private class CaptureHandler implements ActionListener {
91

Fig. 22.2 Formatting and saving media from capture devices (part 2 of 9).


92       // initialize and configure capture device
93       public void actionPerformed( ActionEvent actionEvent )
94       {
95          // put available devices' information into vector
96          deviceList =
97             CaptureDeviceManager.getDeviceList( null );
98
99          // if no devices found, display error message
100          if ( ( deviceList == null ) ||
101             ( deviceList.size() == 0 ) ) {
102
103             showErrorMessage( "No capture devices found!" );
104
105             return;
106          }
107
108          // array of device names
109          String deviceNames[] = new String[ deviceList.size() ];
110
111          // store all device names into array of
112          // string for display purposes
113          for ( int i = 0; i < deviceList.size(); i++ ) {
114
115             deviceInfo =
116                ( CaptureDeviceInfo ) deviceList.elementAt( i );
117
118             deviceNames[ i ] = deviceInfo.getName();
119          }
120
121          // get vector index of selected device
122          int selectDeviceIndex =
123             getSelectedDeviceIndex( deviceNames );
124
125          if ( selectDeviceIndex == -1 )
126             return;
127
128          // get device information of selected device
129          deviceInfo = ( CaptureDeviceInfo )
130             deviceList.elementAt( selectDeviceIndex );
131
132          formats = deviceInfo.getFormats();
133
134          // if previous capture device opened, disconnect it
135          if ( inSource != null )
136             inSource.disconnect();
137
138          // obtain device and set its format
139          try {
140
141             // create data source from MediaLocator of device
142             inSource = Manager.createDataSource(
143                deviceInfo.getLocator() );
144

Fig. 22.2 Formatting and saving media from capture devices (part 3 of 9).


145             // get format setting controls for device
146             formatControls = ( ( CaptureDevice )
147                inSource ).getFormatControls();
148
149             // get user's desired device format setting
150             selectedFormat = getSelectedFormat( formats );
151
152             if ( selectedFormat == null )
153                return;
154
155             setDeviceFormat( selectedFormat );
156
157             captureSaveFile();
158
159          } // end try
160
161          // unable to find DataSource from MediaLocator
162          catch ( NoDataSourceException noDataException ) {
163             noDataException.printStackTrace();
164          }
165
166          // device connection error
167          catch ( IOException ioException ) {
168             ioException.printStackTrace();
169          }
170
171       } // end method actionPerformed
172
173    } // end inner class CaptureHandler
174
175    // set output format of device-captured media
176    public void setDeviceFormat( Format currentFormat )
177    {
178       // set desired format through all format controls
179       for ( int i = 0; i < formatControls.length; i++ ) {
180
181          // make sure format control is configurable
182          if ( formatControls[ i ].isEnabled() ) {
183
184             formatControls[ i ].setFormat( currentFormat );
185
186             System.out.println(
187                "Presentation output format currently set as " +
188                formatControls[ i ].getFormat() );
189          }
190
191       } // end for loop
192    }
193
194    // get selected device vector index
195    public int getSelectedDeviceIndex( String[] names )
196    {

Fig. 22.2 Formatting and saving media from capture devices (part 4 of 9).


197       // get device name from dialog box of device choices
198       String name = ( String ) JOptionPane.showInputDialog(
199          this, "Select a device:", "Device Selection",
200          JOptionPane.QUESTION_MESSAGE,
201          null, names, names[ 0 ] );
202
203       // if format selected, get index of name in array names
204       if ( name != null )
205          return Arrays.binarySearch( names, name );
206
207       // else return bad selection value
208       else
209          return -1;
210    }
211
212    // return user-selected format for device
213    public Format getSelectedFormat( Format[] showFormats )
214    {
215       return ( Format ) JOptionPane.showInputDialog( this,
216          "Select a format: ", "Format Selection",
217          JOptionPane.QUESTION_MESSAGE,
218          null, showFormats, null );
219    }
220
221    // pop up error messages
222    public void showErrorMessage( String error )
223    {
224       JOptionPane.showMessageDialog( this, error, "Error",
225          JOptionPane.ERROR_MESSAGE );
226    }
227
228    // get desired file for saved captured media
229    public File getSaveFile()
230    {
231       JFileChooser fileChooser = new JFileChooser();
232
233       fileChooser.setFileSelectionMode(
234          JFileChooser.FILES_ONLY );
235       int result = fileChooser.showSaveDialog( this );
236
237       if ( result == JFileChooser.CANCEL_OPTION )
238          return null;
239
240       else
241          return fileChooser.getSelectedFile();
242    }
243
244    // show saving monitor of captured media
245    public void showSaveMonitor()
246    {

Fig. 22.2 Formatting and saving media from capture devices (part 5 of 9).


247       // show saving monitor dialog
248       int result = JOptionPane.showConfirmDialog( this,
249          saveProgress, "Save capture in progress...",
250          JOptionPane.DEFAULT_OPTION,
251          JOptionPane.INFORMATION_MESSAGE );
252
253       // terminate saving if user presses "OK" or closes dialog
254       if ( ( result == JOptionPane.OK_OPTION ) ||
255          ( result == JOptionPane.CLOSED_OPTION ) ) {
256
257          processor.stop();
258          processor.close();
259
260          System.out.println( "Capture closed." );
261       }
262    }
263
264    // process captured media and save to file
265    public void captureSaveFile()
266    {
267       // array of desired saving formats supported by tracks
268       Format outFormats[] = new Format[ 1 ];
269
270       outFormats[ 0 ] = selectedFormat;
271
272       // file output format
273       FileTypeDescriptor outFileType =
274          new FileTypeDescriptor( FileTypeDescriptor.QUICKTIME );
275
276       // set up and start processor and monitor capture
277       try {
278
279          // create processor from processor model
280          // of specific data source, track output formats,
281          // and file output format
282          processor = Manager.createRealizedProcessor(
283             new ProcessorModel( inSource, outFormats,
284                outFileType ) );
285
286          // try to make a data writer for media output
287          if ( !makeDataWriter() )
288             return;
289
290          // call start on processor to start captured feed
291          processor.start();
292
293          // get monitor control for capturing and encoding
294          MonitorControl monitorControl =
295             ( MonitorControl ) processor.getControl(
296                "javax.media.control.MonitorControl" );
297
298          // get GUI component of monitoring control
299          saveProgress = monitorControl.getControlComponent();

Fig. 22.2 Formatting and saving media from capture devices (part 6 of 9).


300
301          showSaveMonitor();
302
303       } // end try
304
305       // no processor could be found for specific
306       // data source
307       catch ( NoProcessorException processorException ) {
308          processorException.printStackTrace();
309       }
310
311       // unable to realize through
312       // createRealizedProcessor method
313       catch ( CannotRealizeException realizeException ) {
314          realizeException.printStackTrace();
315       }
316
317       // device connection error
318       catch ( IOException ioException ) {
319          ioException.printStackTrace();
320       }
321
322    } // end method captureSaveFile
323
324    // method initializing media file writer
325    public boolean makeDataWriter()
326    {
327       File saveFile = getSaveFile();
328
329       if ( saveFile == null )
330          return false;
331
332       // get output data source from processor
333       outSource = processor.getDataOutput();
334
335       if ( outSource == null ) {
336          showErrorMessage( "No output from processor!" );
337          return false;
338       }
339
340       // start data writing process
341       try {
342
343          // create new MediaLocator from saveFile URL
344          MediaLocator saveLocator =
345             new MediaLocator( saveFile.toURL() );
346
347          // create DataSink from output data source
348          // and user-specified save destination file
349          dataSink = Manager.createDataSink(
350             outSource, saveLocator );
351

Fig. 22.2 Formatting and saving media from capture devices (part 7 of 9).


352          // register a DataSinkListener for DataSinkEvents
353          dataSink.addDataSinkListener(
354
355             // anonymous inner class to handle DataSinkEvents
356             new DataSinkListener() {
357
358                // if end of media, close data writer
359                public void dataSinkUpdate(
360                   DataSinkEvent dataEvent )
361                {
362                   // if capturing stopped, close DataSink
363                   if ( dataEvent instanceof EndOfStreamEvent )
364                      dataSink.close();
365                }
366
367             } // end DataSinkListener
368
369          ); // end call to method addDataSinkListener
370
371          // start saving
372          dataSink.open();
373          dataSink.start();
374
375       } // end try
376
377       // DataSink could not be found for specific
378       // save file and data source
379       catch ( NoDataSinkException noDataSinkException ) {
380          noDataSinkException.printStackTrace();
381          return false;
382       }
383
384       // violation while accessing MediaLocator
385       // destination
386       catch ( SecurityException securityException ) {
387          securityException.printStackTrace();
388          return false;
389       }
390
391       // problem opening and starting DataSink
392       catch ( IOException ioException ) {
393          ioException.printStackTrace();
394          return false;
395       }
396
397       return true;
398
399    } // end method makeDataWriter
400
401    // main method
402    public static void main( String args[] )
403    {
404       CapturePlayer testPlayer = new CapturePlayer();

Fig. 22.2 Formatting and saving media from capture devices (part 8 of 9).


405
406       testPlayer.setSize( 200, 70 );
407       testPlayer.setLocation( 300, 300 );
408       testPlayer.setDefaultCloseOperation( EXIT_ON_CLOSE );
409       testPlayer.setVisible( true );
410    }
411
412 } // end class CapturePlayer

Fig. 22.2 Formatting and saving media from capture devices (part 9 of 9).


Class CapturePlayer (Fig. 22.2) illustrates capturing, setting media formats and saving media from capture devices supported by JMF. The simplest test of the program uses a microphone for audio input. Initially, the GUI has only one button, which users click to begin the configuration process. Users then select a capture device from a pull-down menu dialog. The next dialog box has options for the format of the capture device and file output. The third dialog box asks users to save the media to a specific file. The final dialog box provides a volume control and the option of monitoring the data. Monitoring allows users to hear or see the media as it is captured and saved, without modifying it in any way. Many media capture technologies offer monitoring capabilities. For example, many video recorders have a screen attachment that lets users see what the camera is capturing without looking through the viewfinder. In a recording studio, producers can listen to live music being recorded through headphones in another room. Monitoring data is different from playing data in that it does not make any changes to the format of the media or affect the data being sent to the Processor.

In the CapturePlayer program, lines 14–16 import the JMF Java extension packages javax.media.protocol, javax.media.format and javax.media.control, which contain classes and interfaces for media control and device formatting. Line 17 imports the JMF package javax.media.datasink, which contains classes for outputting formatted media. The program uses the classes and interfaces provided by these packages to obtain the desired capture device information, set the format of the capture device, create a Processor to handle the captured media data, create a DataSink to write the media data to a file and monitor the saving process.

CapturePlayer is capable of setting the media format. JMF provides class Format to describe the attributes of a media format, such as the sampling rate (which controls the quality of the sound) or whether the media should be in stereo or mono format. Each media format is encoded differently and can be played only with a media handler that supports its particular format. Line 28 declares an array of the Formats that the capture device supports and a Format reference for the user-selected format (selectedFormat).
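Format attributes such as sampling rate, sample size and channel count directly determine how much data a capture produces. The following sketch is plain Java (not part of the JMF API) showing how those attributes translate into a raw PCM data rate:

```java
// Illustrative helper (not part of JMF): computes the data rate of
// uncompressed PCM audio from the attributes a Format describes.
public class PcmRate {

   // bytes per second = samples/second * bytes/sample * channels
   public static int bytesPerSecond(
      int sampleRate, int bitsPerSample, int channels )
   {
      return sampleRate * ( bitsPerSample / 8 ) * channels;
   }

   public static void main( String args[] )
   {
      // CD-quality stereo: 44100 Hz, 16-bit samples, 2 channels
      System.out.println( bytesPerSecond( 44100, 16, 2 ) );  // prints 176400
   }
}
```

Doubling the sampling rate or switching from mono to stereo doubles the data rate, which is why format selection matters before capturing begins.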

After obtaining the Format objects, the program needs access to the formatting controls of the capture device. Line 31 declares an array to hold the FormatControls which will set the capture-device format. Class CapturePlayer sets the desired Format for the media source through the device's FormatControls (line 31). The CaptureDeviceInfo reference deviceInfo (line 34) stores the capture device information, which will be placed in a Vector containing all of the device's information.

Class DataSource connects programs to media data sources, such as capture devices. The SimplePlayer of Figure 22.1 accessed a DataSource object by invoking Manager method createPlayer, passing in a MediaLocator. However, class CapturePlayer accesses the DataSource directly. Line 40 declares two DataSources—inSource connects to the capture device's media and outSource connects to the output data source to which the captured media will be saved.

An object that implements interface Processor provides the primary function that controls and processes the flow of media data in class CapturePlayer (line 46). The class also creates an object that implements interface DataSink to write the captured data to a file (line 43).

Clicking the Capture and Save File button configures the capture device by invoking method actionPerformed (lines 93–171) in private inner class CaptureHandler (lines 90–173). An instance of inner class CaptureHandler is registered to listen for ActionEvents from captureButton (line 62). The program provides users with a list of available capture devices when lines 96–97 invoke CaptureDeviceManager static method getDeviceList. Method getDeviceList obtains all of the computer's available devices that support the specified Format. Specifying null as the Format parameter returns a complete list of available devices. Class CaptureDeviceManager enables a program to access this list of devices.

Lines 109–119 copy the names of all capture devices into a String array (deviceNames) for display purposes. Lines 122–123 invoke CapturePlayer method getSelectedDeviceIndex (lines 195–210) to show a selector dialog with a list of all the device names stored in array deviceNames. The method call to showInputDialog (lines 198–201) has a different parameter list than earlier examples. The first four parameters are the dialog's parent component, message, title and message type, as earlier chapters use. The final three, which are new, specify the icon (in this case, null), the list of values presented to the user (deviceNames) and the default selection (the first element of deviceNames). Once users select a device, the dialog returns the string, which is used to obtain the integer index of the selected name in deviceNames. This index identifies the parallel element in deviceList. This element, which is an instance of CaptureDeviceInfo, is used to create and configure a new device from which the desired media can be recorded.
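The mapping from the selected name back to its array index can be isolated in a small sketch. Note that Arrays.binarySearch, which the listing uses, is only guaranteed to find the element when the array is sorted; the hypothetical linear-scan helper below works for any ordering:

```java
import java.util.Arrays;

// Illustrative sketch: mapping a user-selected name back to its index
// in a parallel array. Arrays.binarySearch (used in the listing)
// requires a sorted array; a linear scan works for any ordering.
public class DeviceIndex {

   public static int indexOf( String[] names, String selected )
   {
      for ( int i = 0; i < names.length; i++ )
         if ( names[ i ].equals( selected ) )
            return i;

      return -1;  // bad selection value, as in the listing
   }

   public static void main( String args[] )
   {
      String deviceNames[] = { "SoundCapture", "VideoCapture" };

      System.out.println( indexOf( deviceNames, "VideoCapture" ) );  // prints 1

      // binarySearch also yields 1 here because the input is sorted
      System.out.println( Arrays.binarySearch(
         new String[] { "A", "B", "C" }, "B" ) );  // prints 1
   }
}
```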

A CaptureDeviceInfo object encapsulates the information that the program needs to access and configure a capture device, such as location and format preferences. Calling methods getLocator (line 143) and getFormats (line 132) accesses these pieces of information. Lines 129–130 access the new CaptureDeviceInfo that the user specified in the deviceList. Next, lines 135–136 call inSource's disconnect method to disengage any previously opened capture devices before connecting the new device.

Lines 142–143 invoke Manager method createDataSource to obtain the DataSource object that connects to the capture device's media source, passing in the capture device's MediaLocator object as an argument. CaptureDeviceInfo method getLocator returns the capture device's MediaLocator. Method createDataSource in turn invokes DataSource method connect, which establishes a connection with the capture device. Method createDataSource throws a NoDataSourceException if it cannot locate a DataSource for the capture device. An IOException occurs if there is an error opening the device.

Before capturing the media, the program needs to format the DataSource as specified by the user in the Format Selection dialog. Lines 146–147 use CaptureDevice method getFormatControls to obtain an array of FormatControls for DataSource inSource. An object that implements interface FormatControl specifies the format of the DataSource. DataSource objects can represent media sources other than capture devices, so for this example the cast operator in line 146 manipulates object inSource as a CaptureDevice and accesses capture device methods such as getFormatControls. Line 150 invokes method getSelectedFormat (lines 213–219) to display an input dialog from which users can select one of the available formats. Line 155 calls method setDeviceFormat (lines 176–192) to set the media output format for the device to the user-selected Format. Each capture device can have several FormatControls, so setDeviceFormat uses FormatControl method setFormat to specify the format for each FormatControl object.


Formatting the DataSource completes the configuration of the capture device. A Processor (object processor) converts the data to the file format in which it will be saved. The Processor works as a connector between outSource and method captureSaveFile, since DataSource inSource does not play or save the media; it only serves to configure the capture device. Line 157 invokes method captureSaveFile (lines 265–322) to perform the steps needed to save the captured media in a recognizable file format.

To create a Processor, this program first creates a ProcessorModel, a template for the Processor. The ProcessorModel determines the attributes of a Processor through a collection of information which includes a DataSource or MediaLocator, the desired formats for the media tracks which the Processor will handle and a ContentDescriptor indicating the output content type. Line 268 creates a new Format array (outFormats) that represents the possible formats for each track in the media. Line 270 stores the user-selected format as the first element of the array. To save the captured output to a file, the Processor must first convert the data it receives to a file-enabled format. Lines 273–274 create a new QuickTime FileTypeDescriptor (package javax.media.format) to describe the content type of the Processor's output and store it in outFileType. Lines 282–284 use the DataSource inSource, the array outFormats and the file type outFileType to instantiate a new ProcessorModel.

Generally, Processors need to be configured before they can process media, but the one in this application does not, because lines 282–284 invoke Manager method createRealizedProcessor. This method creates a configured, realized Processor based on the ProcessorModel object passed in as an argument. Method createRealizedProcessor throws a NoProcessorException if the program cannot locate a Processor for the media or if JMF does not support the media type. The method throws a CannotRealizeException if the Processor cannot be realized. This may occur if another program is already using the media, thus blocking communication with the media source.

Common Programming Error 22.1
Be careful when specifying track formats. Incompatible formats for specific output file types prevent the program from realizing the Processor.

Software Engineering Observation 22.2
Recall that the Processor transitions through several states before being realized. Manager method createProcessor allows a program to provide more customized configuration before a Processor is realized.

Performance Tip 22.2
When method createRealizedProcessor configures the Processor, the method blocks until the Processor is realized. This may prevent other parts of the program from executing. In some cases, using a ControllerListener to respond to ControllerEvents may enable a program to operate more efficiently. When the Processor is realized, the listener is notified, so the program can begin processing the media.
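The listener-based alternative the tip describes can be sketched in plain Java with hypothetical names (JMF's ControllerListener delivers a RealizeCompleteEvent in the same spirit): a worker thread signals completion, and the main thread blocks only when it actually needs the realized resource:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Sketch of listener-style completion notification (hypothetical names;
// JMF's ControllerListener/RealizeCompleteEvent follow the same pattern).
public class RealizeNotifier {

   private final CountDownLatch realized = new CountDownLatch( 1 );

   // called by the worker thread when "realization" finishes
   public void fireRealizeComplete() { realized.countDown(); }

   // block only when the realized resource is actually needed;
   // returns true if realization completed within the timeout
   public boolean awaitRealized( long timeoutMillis )
      throws InterruptedException
   {
      return realized.await( timeoutMillis, TimeUnit.MILLISECONDS );
   }

   public static void main( String args[] ) throws InterruptedException
   {
      final RealizeNotifier notifier = new RealizeNotifier();

      // simulate asynchronous realization on another thread
      new Thread( new Runnable() {
         public void run() { notifier.fireRealizeComplete(); }
      } ).start();

      // main thread could do other work here before waiting
      System.out.println( notifier.awaitRealized( 5000 ) );  // prints true
   }
}
```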

Having obtained media data in a file format from the Processor, the program can make a "data writer" to write the media output to a file. An object that implements interface DataSink enables media data to be output to a specific location—most commonly a file.


Line 287 invokes method makeDataWriter (lines 325–399) to create a DataSink object that can save the file. Manager method createDataSink requires the DataSource of the Processor and the MediaLocator for the new file as arguments. Within makeDataWriter, line 327 invokes method getSaveFile (lines 229–242) to prompt users to specify the name and location of the file to which the media should be saved. The File object saveFile stores the information. Processor method getDataOutput (line 333) returns the Processor's output DataSource. Lines 344–345 create a new MediaLocator for saveFile. Using this MediaLocator and the DataSource, lines 349–350 create a DataSink object which writes the output media from the DataSource to the file in which the data will be saved, as specified by the MediaLocator. Method createDataSink throws a NoDataSinkException if it cannot create a DataSink that can read data from the DataSource and output the data to the location specified by the MediaLocator. This failure may occur as a result of invalid media or an invalid MediaLocator.

The program needs to know when to stop outputting data, so lines 353–369 register a DataSinkListener to listen for DataSinkEvents. DataSinkListener method dataSinkUpdate (lines 359–365) is called when each DataSinkEvent occurs. If the DataSinkEvent is an EndOfStreamEvent, indicating that the Processor has been closed because the capture stream connection has closed, line 364 closes the DataSink. Invoking DataSink method close stops the data transfer. A closed DataSink cannot be used again.
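The essential close-on-terminal-event logic can be separated from JMF in a small sketch (the Sink class and its String events below are hypothetical stand-ins): the listener reacts only to the terminal event and leaves the sink open for everything else:

```java
// Plain-Java sketch of close-on-end-of-stream dispatch (hypothetical Sink
// type; JMF's DataSinkListener checks for EndOfStreamEvent the same way).
public class Sink {

   private boolean closed = false;

   public boolean isClosed() { return closed; }

   // mimics dataSinkUpdate: only the terminal event closes the sink
   public void onEvent( String event )
   {
      if ( event.equals( "EndOfStream" ) )
         closed = true;  // a closed sink cannot be used again
   }

   public static void main( String args[] )
   {
      Sink sink = new Sink();

      sink.onEvent( "ProgressUpdate" );        // non-terminal: stays open
      System.out.println( sink.isClosed() );   // prints false

      sink.onEvent( "EndOfStream" );           // terminal: sink closes
      System.out.println( sink.isClosed() );   // prints true
   }
}
```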

Common Programming Error 22.2
The media file output with a DataSink will be corrupted if the DataSink is not closed properly.

Software Engineering Observation 22.3
Captured media may not yield an EndOfMediaEvent if the media's endpoint cannot be determined.

After setting up the DataSink and registering its listener, line 372 calls DataSink method open to connect the DataSink to the destination that the MediaLocator specifies. Method open throws a SecurityException if the DataSink attempts to write to a destination for which the program does not have write permission, such as a read-only file.

Line 373 calls DataSink method start to initiate data transfer. At this point, the program returns from method makeDataWriter back to method captureSaveFile (lines 265–322). Although the DataSink prepares itself to receive the transfer and indicates that it is ready by calling start, the transfer does not actually take place until the Processor's start method is called. The invocation of Processor method start begins the flow of data from the capture device, formats the data and transfers that data to the DataSink. The DataSink writes the media to a file, which completes the process performed by class CapturePlayer.

While the Processor encodes the data and the DataSink saves it to a file, CapturePlayer monitors the process. Monitoring provides a method of overseeing data as the capture device collects it. Lines 294–296 obtain an object that implements interface MonitorControl (package javax.media.control) from the Processor by calling method getControl. Line 299 calls MonitorControl method getControlComponent to obtain the GUI component that displays the monitored media. MonitorControls typically have a checkbox to enable or disable displaying media. Also, audio devices have a volume control and video devices have a control for setting the preview frame rate. Line 301 invokes method showSaveMonitor (lines 245–261) to display the monitoring GUI in a dialog box. To terminate capturing, users can press the OK button or close the dialog box (lines 254–261). For some video capture devices, the Processor needs to be in a Stopped state to enable monitoring of the saving and capturing process.

Thus far, we have discussed JMF's capabilities to access, present and save media content. Our final JMF example demonstrates how to send media between computers using JMF's streaming capabilities.

22.4 RTP Streaming

Streaming media refers to media that is transferred from a server to a client in a continuous stream of bytes. The client can begin playing the media while still downloading the media from the server. Audio and video media files are often many megabytes in size. Live events, such as concerts or football game broadcasts, may have indeterminable sizes. Users could wait until a recording of a concert or game is posted, then download the entire recording. However, at today's Internet connection speeds, downloading such a broadcast could take days, and typically users prefer to listen to live broadcasts as they occur. Streaming media enables client applications to play media over the Internet or over a network without downloading the entire media file at once.

In a streaming media application, the client typically connects to a server that sends a stream of bytes containing the media back to the client. The client application buffers (i.e., stores locally) a portion of the media, which the client begins playing after a certain portion has been received. The client continually buffers additional media, providing users with an uninterrupted clip, as long as network traffic does not prevent the client application from buffering additional bytes. With buffering, users experience the media within seconds of when the streaming begins, even though all of the material has not been received.
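The threshold behavior described above can be sketched in plain Java (an illustrative model, not JMF's internal buffering): playback begins only once enough bytes have accumulated, and further arrivals simply extend the buffered clip:

```java
// Illustrative sketch (not JMF): a client-side buffer that lets playback
// begin once a threshold of bytes has arrived, rather than waiting for
// the whole file to download.
public class StreamBuffer {

   private final int startThreshold;  // bytes needed before playback starts
   private int buffered = 0;          // bytes received so far
   private boolean playing = false;

   public StreamBuffer( int startThreshold )
   {
      this.startThreshold = startThreshold;
   }

   // called as each chunk of the stream arrives from the network
   public void receive( int bytes )
   {
      buffered += bytes;

      if ( !playing && buffered >= startThreshold )
         playing = true;  // enough data buffered; playback can begin
   }

   public boolean isPlaying() { return playing; }

   public static void main( String args[] )
   {
      StreamBuffer buffer = new StreamBuffer( 64 * 1024 );  // 64 KB threshold

      buffer.receive( 32 * 1024 );
      System.out.println( buffer.isPlaying() );  // prints false

      buffer.receive( 32 * 1024 );
      System.out.println( buffer.isPlaying() );  // prints true
   }
}
```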

Performance Tip 22.3
Streaming media to a client enables the client to experience the media faster than if the client must wait for an entire media file to download.

Demand for real-time, robust multimedia is rising dramatically as the speed of Internet connections increases. Broadband Internet access, which provides high-speed network connections to the Internet for home users, is becoming more popular, though the number of users remains relatively small compared to the total number of Internet users. With faster connections, streaming media can provide a better multimedia experience. Users with slower connections can still experience the multimedia, though with lesser quality. The wide range of applications that use streaming media is growing. Applications that stream video clips to clients have expanded to provide real-time broadcast feeds. Thousands of radio stations stream music continuously over the Internet. Client applications such as RealPlayer have focused on streaming media content with live radio broadcasts. Applications are not limited to audio and video server-to-client streaming. For example, teleconferencing and video conferencing applications increase efficiency in everyday business by reducing the need for business people to travel great distances to attend meetings.


JMF provides a streaming media package that enables Java applications to send and receive streams of media in some of the formats discussed earlier in this chapter. For a complete list of formats, see the official JMF Web site:

java.sun.com/products/java-media/jmf/2.1.1/formats.html

JMF uses the industry-standard Real-Time Transport Protocol (RTP) to control media transmission. RTP is designed specifically to transmit real-time media data.

The two mechanisms for streaming RTP-supported media are passing it through a DataSink and buffering it. The easier mechanism to use is a DataSink, which writes the contents of a stream to a host destination (i.e., a client computer), via the same techniques shown in Fig. 22.2 to save captured media to a file. In this case, however, the destination MediaLocator's URL would be specified in the following format:

rtp://host:port/contentType

where host is the IP address or host name of the server, port is the port number on which the server is streaming the media and contentType is either audio or video.
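A locator in this form can be pulled apart with the core class java.net.URI, which handles the generic scheme://host:port/path syntax. The sketch below (an illustrative helper, not part of JMF) extracts the three pieces:

```java
import java.net.URI;

// Sketch: decomposing an RTP locator of the form rtp://host:port/contentType
// using java.net.URI from the Java core libraries.
public class RtpLocator {

   public static String describe( String locator )
   {
      URI uri = URI.create( locator );

      // the content type is the path with its leading '/' removed
      String contentType = uri.getPath().substring( 1 );

      return uri.getHost() + " port " + uri.getPort() +
         " (" + contentType + ")";
   }

   public static void main( String args[] )
   {
      System.out.println( describe( "rtp://127.0.0.1:4000/audio" ) );
      // prints 127.0.0.1 port 4000 (audio)
   }
}
```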

Using a DataSink as specified allows only one stream to be sent at a time. To send multiple streams (e.g., as a karaoke video with separate tracks for video and audio would) to multiple hosts, a server application must use RTP session managers. An RTPManager (package javax.media.rtp) provides more control over the streaming process, permitting specification of buffer sizes, error checking and streaming reports on the propagation delay (the time it takes for the data to reach its destination).
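As the comments in Fig. 22.3 note, each RTP session occupies two ports (a data port plus the next port for control), so sessions for successive tracks are spaced two ports apart. The following sketch (a hypothetical helper illustrating that arithmetic, not JMF code) computes the port pair for each track from a base port:

```java
// Sketch (hypothetical helper): each RTP session uses a pair of ports --
// the data port plus the next port for control -- so sessions for
// successive tracks are spaced two ports apart.
public class RtpPorts {

   // data port for a given track, starting from basePort
   public static int dataPort( int basePort, int track )
   {
      return basePort + 2 * track;
   }

   // control port is always one above the data port
   public static int controlPort( int basePort, int track )
   {
      return dataPort( basePort, track ) + 1;
   }

   public static void main( String args[] )
   {
      // two tracks streamed from base port 4000:
      System.out.println( dataPort( 4000, 0 ) + "/" + controlPort( 4000, 0 ) );
      // prints 4000/4001
      System.out.println( dataPort( 4000, 1 ) + "/" + controlPort( 4000, 1 ) );
      // prints 4002/4003
   }
}
```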

The program in Fig. 22.3 and Fig. 22.4 demonstrates streaming using the RTP session manager. This example supports sending multiple streams in parallel, so separate clients must be opened for each stream. This example does not show a client that can receive the RTP stream. The program in Fig. 22.1 (SimplePlayer) can test the RTP server by specifying an RTP session address

rtp://127.0.0.1:4000/audio

as the location SimplePlayer should open to begin playing audio. To execute the streaming media server on a different computer, replace 127.0.0.1 with either the IP address or host name of the server computer.

1  // Fig. 22.3: RTPServer.java
2  // Provides configuration and sending capabilities
3  // for RTP-supported media files
4
5  // Java core packages
6  import java.io.*;
7  import java.net.*;
8
9  // Java extension packages

10 import javax.media.*;
11 import javax.media.protocol.*;
12 import javax.media.control.*;
13 import javax.media.rtp.*;
14 import javax.media.format.*;

Fig. 22.3 Serving streaming media with RTP session managers (part 1 of 7).


15
16 public class RTPServer {
17
18    // IP address, file or medialocator name, port number
19    private String ipAddress, fileName;
20    private int port;
21
22    // processor controlling data flow
23    private Processor processor;
24
25    // data output from processor to be sent
26    private DataSource outSource;
27
28    // media tracks' configurable controls
29    private TrackControl tracks[];
30
31    // RTP session manager
32    private RTPManager rtpManager[];
33
34    // constructor for RTPServer
35    public RTPServer( String locator, String ip, int portNumber )
36    {
37       fileName = locator;
38       port = portNumber;
39       ipAddress = ip;
40    }
41
42    // initialize and set up processor
43    // return true if successful, false if not
44    public boolean beginSession()
45    {
46       // get MediaLocator from specific location
47       MediaLocator mediaLocator = new MediaLocator( fileName );
48
49       if ( mediaLocator == null ) {
50          System.err.println(
51             "No MediaLocator found for " + fileName );
52
53          return false;
54       }
55
56       // create processor from MediaLocator
57       try {
58          processor = Manager.createProcessor( mediaLocator );
59
60          // register a ControllerListener for processor
61          // to listen for state and transition events
62          processor.addControllerListener(
63             new ProcessorEventHandler() );
64
65          System.out.println( "Processor configuring..." );
66

Fig. 22.3 Serving streaming media with RTP session managers (part 2 of 7).


67          // configure processor before setting it up
68          processor.configure();
69       }
70
71       // source connection error
72       catch ( IOException ioException ) {
73          ioException.printStackTrace();
74          return false;
75       }
76
77       // exception thrown when no processor could
78       // be found for specific data source
79       catch ( NoProcessorException noProcessorException ) {
80          noProcessorException.printStackTrace();
81          return false;
82       }
83
84       return true;
85
86    } // end method beginSession
87
88    // ControllerListener handler for processor
89    private class ProcessorEventHandler
90       extends ControllerAdapter {
91
92       // set output format and realize
93       // configured processor
94       public void configureComplete(
95          ConfigureCompleteEvent configureCompleteEvent )
96       {
97          System.out.println( "\nProcessor configured." );
98
99          setOutputFormat();
100
101          System.out.println( "\nRealizing Processor...\n" );
102
103          processor.realize();
104       }
105
106       // start sending when processor is realized
107       public void realizeComplete(
108          RealizeCompleteEvent realizeCompleteEvent )
109       {
110          System.out.println(
111             "\nInitialization successful for " + fileName );
112
113          if ( transmitMedia() == true )
114             System.out.println( "\nTransmission setup OK" );
115
116          else
117             System.out.println( "\nTransmission failed." );
118       }
119

Fig. 22.3 Serving streaming media with RTP session managers (part 3 of 7).


120       // stop RTP session when there is no media to send
121       public void endOfMedia( EndOfMediaEvent mediaEndEvent )
122       {
123          stopTransmission();
124          System.out.println( "Transmission completed." );
125       }
126
127    } // end inner class ProcessorEventHandler
128
129    // set output format of all tracks in media
130    public void setOutputFormat()
131    {
132       // set output content type to RTP capable format
133       processor.setContentDescriptor(
134          new ContentDescriptor( ContentDescriptor.RAW_RTP ) );
135
136       // get all track controls of processor
137       tracks = processor.getTrackControls();
138
139       // supported RTP formats of a track
140       Format rtpFormats[];
141
142       // set each track to first supported RTP format
143       // found in that track
144       for ( int i = 0; i < tracks.length; i++ ) {
145
146          System.out.println( "\nTrack #" +
147             ( i + 1 ) + " supports " );
148
149          if ( tracks[ i ].isEnabled() ) {
150
151             rtpFormats = tracks[ i ].getSupportedFormats();
152
153             // if supported formats of track exist,
154             // display all supported RTP formats and set
155             // track format to be first supported format
156             if ( rtpFormats.length > 0 ) {
157
158                for ( int j = 0; j < rtpFormats.length; j++ )
159                   System.out.println( rtpFormats[ j ] );
160
161                tracks[ i ].setFormat( rtpFormats[ 0 ] );
162
163                System.out.println( "Track format set to " +
164                   tracks[ i ].getFormat() );
165             }
166
167             else
168                System.err.println(
169                   "No supported RTP formats for track!" );
170
171          } // end if
172

Fig. 22.3 Serving streaming media with RTP session managers (part 4 of 7).


173    } // end for loop
174
175 } // end method setOutputFormat
176
177 // send media with boolean success value
178 public boolean transmitMedia()
179 {
180    outSource = processor.getDataOutput();
181
182    if ( outSource == null ) {
183       System.out.println( "No data source from media!" );
184
185       return false;
186    }
187
188    // rtp stream managers for each track
189    rtpManager = new RTPManager[ tracks.length ];
190
191    // destination and local RTP session addresses
192    SessionAddress localAddress, remoteAddress;
193
194    // RTP stream being sent
195    SendStream sendStream;
196
197    // IP address
198    InetAddress ip;
199
200    // initialize transmission addresses and send out media
201    try {
202
203       // transmit every track in media
204       for ( int i = 0; i < tracks.length; i++ ) {
205
206          // instantiate a RTPManager
207          rtpManager[ i ] = RTPManager.newInstance();
208
209          // add 2 to specify next control port number;
210          // (RTP Session Manager uses 2 ports)
211          port += ( 2 * i );
212
213          // get IP address of host from ipAddress string
214          ip = InetAddress.getByName( ipAddress );
215
216          // encapsulate pair of IP addresses for control and
217          // data with 2 ports into local session address
218          localAddress = new SessionAddress(
219             ip.getLocalHost(), port );
220
221          // get remoteAddress session address
222          remoteAddress = new SessionAddress( ip, port );
223
224          // initialize the session
225          rtpManager[ i ].initialize( localAddress );

Fig. 22.3 Serving streaming media with RTP session managers (part 5 of 7).


226
227          // open RTP session for destination
228          rtpManager[ i ].addTarget( remoteAddress );
229
230          System.out.println( "\nStarted RTP session: "
231             + ipAddress + " " + port );
232
233          // create send stream in RTP session
234          sendStream =
235             rtpManager[ i ].createSendStream( outSource, i );
236
237          // start sending the stream
238          sendStream.start();
239
240          System.out.println( "Transmitting Track #" +
241             ( i + 1 ) + " ... " );
242
243       } // end for loop
244
245       // start media feed
246       processor.start();
247
248    } // end try
249
250    // unknown local or unresolvable remote address
251    catch ( InvalidSessionAddressException addressError ) {
252       addressError.printStackTrace();
253       return false;
254    }
255
256    // DataSource connection error
257    catch ( IOException ioException ) {
258       ioException.printStackTrace();
259       return false;
260    }
261
262    // format not set or invalid format set on stream source
263    catch ( UnsupportedFormatException formatException ) {
264       formatException.printStackTrace();
265       return false;
266    }
267
268    // transmission initialized successfully
269    return true;
270
271 } // end method transmitMedia
272
273 // stop transmission and close resources
274 public void stopTransmission()
275 {
276    if ( processor != null ) {
277

Fig. 22.3 Serving streaming media with RTP session managers (part 6 of 7).


Class RTPServer’s purpose is to stream media content. As in previous examples, RTPServer (Fig. 22.3) sets up the media, processes and formats it, then outputs it. ControllerEvents and the various states of the streaming process drive this process. Processing of the media has three distinct parts—Processor initialization, Format configuration and data transmission. The code in this example contains numerous confirmation messages displayed at the command prompt and an emphasis on error checking. A problem during streaming will most likely end the entire process, and it will need to be restarted.

To test RTPServer, class RTPServerTest (Fig. 22.4) creates a new RTPServer object and passes its constructor (lines 35–40) three arguments—a String representing the media’s location, a String representing the IP address of the client and a port number for streaming content. These arguments contain the information class RTPServer needs to obtain the media and set up the streaming process. Following the general approach outlined in SimplePlayer (Fig. 22.1) and CapturePlayer (Fig. 22.2), class RTPServer obtains a media source, configures the source through a type of Controller and outputs the data to a specified destination.

RTPServerTest calls RTPServer method beginSession (lines 44–86) to set up the Processor that controls the data flow. Line 47 creates a MediaLocator and initializes it with the media location stored in fileName. Line 58 creates a Processor for the data specified by that MediaLocator.

278       // stop processor
279       processor.stop();
280
281       // dispose processor
282       processor.close();
283
284       if ( rtpManager != null )
285
286          // close destination targets
287          // and dispose RTP managers
288          for ( int i = 0; i < rtpManager.length; i++ ) {
289
290             // close streams to all destinations
291             // with a reason for termination
292             rtpManager[ i ].removeTargets(
293                "Session stopped." );
294
295             // release RTP session resources
296             rtpManager[ i ].dispose();
297          }
298
299    } // end if
300
301    System.out.println( "Transmission stopped." );
302
303 } // end method stopTransmission
304
305 } // end class RTPServer

Fig. 22.3 Serving streaming media with RTP session managers (part 7 of 7).


Unlike the program in Fig. 22.2, this Processor is not preconfigured by a Manager. Until class RTPServer configures and realizes the Processor, the media cannot be formatted. Lines 62–63 register a ProcessorEventHandler to react to the processor’s ControllerEvents. The methods of class ProcessorEventHandler (lines 89–127) control the media setup as the Processor changes states. Line 68 invokes Processor method configure to place the Processor in the Configuring state. During configuration, the Processor queries the system and the media for the format information it needs to perform the correct task. A ConfigureCompleteEvent occurs when the Processor completes configuration. The ProcessorEventHandler method configureComplete (lines 94–104) responds to this transition. Method configureComplete calls method setOutputFormat (lines 130–175), then realizes the Processor (line 103). When line 99 invokes method setOutputFormat, it sets each media track to an RTP streaming media format. Lines 133–134 in method setOutputFormat specify the output content type by calling Processor method setContentDescriptor. The method takes as an argument a ContentDescriptor initialized with constant ContentDescriptor.RAW_RTP. The RTP output content type restricts the Processor to support only RTP-enabled media track formats. The Processor’s output content type should be set before the media tracks are configured.

Once the Processor is configured, the media track formats need to be set. Line 137 invokes Processor method getTrackControls to obtain an array that contains the corresponding TrackControl object (package javax.media.control) for each media track. For each enabled TrackControl, lines 144–173 obtain an array of all supported RTP media Formats (line 151), then set the first supported RTP format as the preferred RTP-streaming media format for that track (line 161). When method setOutputFormat returns, line 103 in method configureComplete realizes the Processor.

As with any controller realization, the Processor can output media as soon as it has finished realizing itself. When the Processor enters the Realized state, the ProcessorEventHandler invokes method realizeComplete (lines 107–118). Line 113 invokes method transmitMedia (lines 178–271), which creates the structures needed to transmit the media produced by the Processor. This method obtains the DataSource from the Processor (line 180), then declares an array of RTPManagers that are able to start and control an RTP session (line 189). RTPManagers use a pair of SessionAddress objects with identical IP addresses but different port numbers—one for stream control and one for streaming media data. An RTPManager receives each IP address and port number as a SessionAddress object. Line 192 declares the SessionAddress objects used in the streaming process. An object that implements interface SendStream (line 195) performs the RTP streaming.

Software Engineering Observation 22.4
For videos that have multiple tracks, each SendStream must have its own RTPManager managing its session. Each Track has its own SendStream.

The try block (lines 201–248) of method transmitMedia sends out each track of the media as an RTP stream. First, managers must be created for the sessions. Line 207 invokes RTPManager method newInstance to instantiate an RTPManager for each track stream. Line 211 assigns the port number to be two more than the previous port number, because each track uses one port number for stream control and one to actually stream the data. Lines 218–219 instantiate a new local session address where the stream is located (i.e., the RTP address that clients use to obtain the media stream) with the local IP address and a port number as parameters. Line 219 invokes InetAddress method getLocalHost to get the local IP address. Line 222 instantiates the client’s session address, which the RTPManager uses as the stream’s target destination. When line 225 calls RTPManager method initialize, the method directs the local streaming session to use the local session address. Using object remoteAddress as a parameter, line 228 calls RTPManager method addTarget to open the destination session at the specified address. To stream media to multiple clients, call RTPManager method addTarget for each destination address. Method addTarget must be called after the session is initialized and before any of the streams are created on the session.
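The two-ports-per-track arithmetic described above can be sketched in plain Java. The helper below is hypothetical (the book’s RTPServer computes ports inline) and assumes the common RTP convention of an even data port paired with the next odd control port:

```java
// Hypothetical illustration of RTP port allocation;
// not part of the book's RTPServer class.
public class RtpPortPlan {

   // returns { dataPort, controlPort } for the given track index,
   // assuming each track consumes two consecutive ports
   public static int[] portsForTrack( int basePort, int trackIndex )
   {
      int dataPort = basePort + 2 * trackIndex;
      return new int[] { dataPort, dataPort + 1 };
   }

   public static void main( String args[] )
   {
      // a two-track movie streamed from base port 5000
      for ( int track = 0; track < 2; track++ ) {
         int[] ports = portsForTrack( 5000, track );
         System.out.println( "Track " + ( track + 1 ) + ": data port " +
            ports[ 0 ] + ", control port " + ports[ 1 ] );
      }
   }
}
```

Under this scheme, track 1 occupies ports 5000–5001 and track 2 occupies ports 5002–5003, so the server and client need only agree on the base port.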

The program can now create the streams on the session and start sending data. The streams are created in the current RTP session with the DataSource outSource (obtained at line 180) and the source stream index (i.e., the media track index) at lines 234–235. Invoking method start on the SendStream (line 238) and on the Processor (line 246) starts transmission of the media streams, which may cause exceptions. An InvalidSessionAddressException occurs when the specified session address is invalid. An UnsupportedFormatException occurs when an unsupported media format is specified or if the DataSource’s Format has not been set. An IOException occurs if the application encounters networking problems. During the streaming process, RTPManagers can be used with related classes in packages javax.media.rtp and javax.media.rtp.event to control the streaming process and send reports to the application.

The program should close connections and stop streaming transmission when it reaches the end of the streaming media or when the program terminates. When the Processor encounters the end of the media, it generates an EndOfMediaEvent. In response, the program calls method endOfMedia (lines 121–125). Line 123 invokes method stopTransmission (lines 274–303) to stop and close the Processor (lines 279–282). After calling stopTransmission, streaming cannot resume, because stopTransmission disposes of the Processor and the RTP session resources. Lines 288–297 invoke RTPManager method removeTargets (lines 292–293) to close streaming to all destinations. RTPManager method dispose (line 296) is also invoked, releasing the resources held by the RTP sessions. Class RTPServerTest (Fig. 22.4) explicitly invokes method stopTransmission when the user terminates the server application (line 40).

1  // Fig. 22.4: RTPServerTest.java
2  // Test class for RTPServer
3
4  // Java core packages
5  import java.awt.event.*;
6  import java.io.*;
7  import java.net.*;
8

Fig. 22.4 Application to test class RTPServer from Fig. 22.3 (part 1 of 6).


9  // Java extension packages
10 import javax.swing.*;
11
12 public class RTPServerTest extends JFrame {
13
14    // object handling RTP streaming
15    private RTPServer rtpServer;
16
17    // media sources and destination locations
18    private int port;
19    private String ip, mediaLocation;
20    private File mediaFile;
21
22    // GUI buttons
23    private JButton transmitFileButton, transmitUrlButton;
24
25    // constructor for RTPServerTest
26    public RTPServerTest()
27    {
28       super( "RTP Server Test" );
29
30       // register a WindowListener for frame events
31       addWindowListener(
32
33          // anonymous inner class to handle WindowEvents
34          new WindowAdapter() {
35
36             public void windowClosing(
37                WindowEvent windowEvent )
38             {
39                if ( rtpServer != null )
40                   rtpServer.stopTransmission();
41             }
42
43          } // end WindowAdapter
44
45       ); // end call to method addWindowListener
46
47       // panel containing button GUI
48       JPanel buttonPanel = new JPanel();
49       getContentPane().add( buttonPanel );
50
51       // transmit file button GUI
52       transmitFileButton = new JButton( "Transmit File" );
53       buttonPanel.add( transmitFileButton );
54
55       // register ActionListener for transmitFileButton events
56       transmitFileButton.addActionListener(
57          new ButtonHandler() );
58
59       // transmit URL button GUI
60       transmitUrlButton = new JButton( "Transmit Media" );
61       buttonPanel.add( transmitUrlButton );

Fig. 22.4 Application to test class RTPServer from Fig. 22.3 (part 2 of 6).


62
63       // register ActionListener for transmitUrlButton events
64       transmitUrlButton.addActionListener(
65          new ButtonHandler() );
66
67    } // end constructor
68
69    // inner class handles transmission button events
70    private class ButtonHandler implements ActionListener {
71
72       // open and try to send file to user-input destination
73       public void actionPerformed( ActionEvent actionEvent )
74       {
75          // if transmitFileButton invoked, get file URL string
76          if ( actionEvent.getSource() == transmitFileButton ) {
77
78             mediaFile = getFile();
79
80             if ( mediaFile != null )
81
82                // obtain URL string from file
83                try {
84                   mediaLocation = mediaFile.toURL().toString();
85                }
86
87                // file path unresolvable
88                catch ( MalformedURLException badURL ) {
89                   badURL.printStackTrace();
90                }
91
92             else
93                return;
94
95          } // end if
96
97          // else transmitMediaButton invoked, get location
98          else
99             mediaLocation = getMediaLocation();
100
101          if ( mediaLocation == null )
102             return;
103
104          // get IP address
105          ip = getIP();
106
107          if ( ip == null )
108             return;
109
110          // get port number
111          port = getPort();
112

Fig. 22.4 Application to test class RTPServer from Fig. 22.3 (part 3 of 6).


113          // check for valid positive port number and input
114          if ( port <= 0 ) {
115
116             if ( port != -999 )
117                System.err.println( "Invalid port number!" );
118
119             return;
120          }
121
122          // instantiate new RTP streaming server
123          rtpServer = new RTPServer( mediaLocation, ip, port );
124
125          rtpServer.beginSession();
126
127       } // end method actionPerformed
128
129    } // end inner class ButtonHandler
130
131    // get file from computer
132    public File getFile()
133    {
134       JFileChooser fileChooser = new JFileChooser();
135
136       fileChooser.setFileSelectionMode(
137          JFileChooser.FILES_ONLY );
138
139       int result = fileChooser.showOpenDialog( this );
140
141       if ( result == JFileChooser.CANCEL_OPTION )
142          return null;
143
144       else
145          return fileChooser.getSelectedFile();
146    }
147
148    // get media location from user
149    public String getMediaLocation()
150    {
151       String input = JOptionPane.showInputDialog(
152          this, "Enter MediaLocator" );
153
154       // if user presses OK with no input
155       if ( input != null && input.length() == 0 ) {
156          System.err.println( "No input!" );
157          return null;
158       }
159
160       return input;
161    }
162
163    // method getting IP string from user
164    public String getIP()
165    {

Fig. 22.4 Application to test class RTPServer from Fig. 22.3 (part 4 of 6).


166       String input = JOptionPane.showInputDialog(
167          this, "Enter IP Address: " );
168
169       // if user presses OK with no input
170       if ( input != null && input.length() == 0 ) {
171          System.err.println( "No input!" );
172          return null;
173       }
174
175       return input;
176    }
177
178    // get port number
179    public int getPort()
180    {
181       String input = JOptionPane.showInputDialog(
182          this, "Enter Port Number: " );
183
184       // return flag value if user clicks OK with no input
185       if ( input != null && input.length() == 0 ) {
186          System.err.println( "No input!" );
187          return -999;
188       }
189
190       // return flag value if user clicked CANCEL
191       if ( input == null )
192          return -999;
193
194       // else return input
195       return Integer.parseInt( input );
196
197    } // end method getPort
198
199    // execute application
200    public static void main( String args[] )
201    {
202       RTPServerTest serverTest = new RTPServerTest();
203
204       serverTest.setSize( 250, 70 );
205       serverTest.setLocation( 300, 300 );
206       serverTest.setDefaultCloseOperation( EXIT_ON_CLOSE );
207       serverTest.setVisible( true );
208    }
209
210 } // end class RTPServerTest

Fig. 22.4 Application to test class RTPServer from Fig. 22.3 (part 5 of 6).


22.5 Java Sound

Many of today’s computer programs capture users’ attention with audio features. Even basic applets and applications can enhance users’ experiences with simple sounds or music clips. With sound programming interfaces, developers can create applications that play sounds in response to user interactions. For example, in many applications, when an error occurs and a dialog box appears on the screen, the dialog is often accompanied by a sound. Users thus receive both audio and visual indications that a problem has occurred. As another example, game programmers use extensive audio capabilities to enhance players’ experiences.

Fig. 22.4 Application to test class RTPServer from Fig. 22.3 (part 6 of 6).


The Java Sound API is a simpler way to incorporate audio media into applications than the Java Media Framework. The Java Sound API is bundled with the Java 2 Software Development Kit version 1.3. The API consists of four packages—javax.sound.midi, javax.sound.midi.spi, javax.sound.sampled and javax.sound.sampled.spi. The next two sections focus on packages javax.sound.midi and javax.sound.sampled, which provide classes and interfaces for accessing, manipulating and playing Musical Instrument Digital Interface (MIDI) data and sampled audio. The packages ending in .spi provide developers with the tools to add Java Sound support for additional audio formats; those packages are beyond the scope of this book.

The Java Sound API provides access to the Java Sound Engine, which creates digitized audio and captures media from the supported sound devices discussed in Section 22.3. Java Sound requires a sound card to play audio. A program using Java Sound will throw an exception if it accesses audio system resources on a computer that does not have a sound card.

22.6 Playing Sampled Audio

This section introduces the features of package javax.sound.sampled for playing sampled audio file formats, which include Sun Audio (.au), Windows Waveform (.wav) and Macintosh Audio Interchange File Format (.aiff). The program in Fig. 22.5 and Fig. 22.6 shows how audio is played using these file formats.

When processing audio data, a line provides the path through which audio flows in a system. One example of a line is a pair of headphones connected to a CD player.

Class ClipPlayer (Fig. 22.5) is an example of how lines can be used. It contains an object that implements interface Clip, which in turn extends interface DataLine. A Clip is a line that processes an entire audio file rather than reading continuously from an audio stream. DataLines enhance Lines by providing additional methods (such as start and stop) for controlling the flow of data, and Clips enhance DataLines by providing methods for opening Clips and methods for precise control over playing and looping the audio.
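The interface relationships just described can be confirmed with a couple of reflection checks. This sketch is not part of the book’s example; it simply verifies that Clip is a DataLine and that DataLine is a Line:

```java
// Confirms the javax.sound.sampled interface hierarchy
// discussed in the text: Clip -> DataLine -> Line.
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Line;

public class LineHierarchyCheck {

   public static void main( String args[] )
   {
      // every Clip is a DataLine
      System.out.println(
         DataLine.class.isAssignableFrom( Clip.class ) );   // true

      // every DataLine is a Line
      System.out.println(
         Line.class.isAssignableFrom( DataLine.class ) );   // true
   }
}
```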

1  // Fig. 22.5: ClipPlayer.java
2  // Plays sound clip files of type WAV, AU, AIFF
3
4  // Java core packages
5  import java.io.*;
6
7  // Java extension packages
8  import javax.sound.sampled.*;
9

10 public class ClipPlayer implements LineListener {
11
12    // audio input stream
13    private AudioInputStream soundStream;
14
15    // audio sample clip line
16    private Clip clip;
17

Fig. 22.5 ClipPlayer plays an audio file (part 1 of 4).


18    // Audio clip file
19    private File soundFile;
20
21    // boolean indicating replay of audio
22    private boolean replay = false;
23
24    // constructor for ClipPlayer
25    public ClipPlayer( File audioFile )
26    {
27       soundFile = audioFile;
28    }
29
30    // open music file, returning true if successful
31    public boolean openFile()
32    {
33       // get audio stream from file
34       try {
35          soundStream =
36             AudioSystem.getAudioInputStream( soundFile );
37       }
38
39       // audio file not supported by JavaSound
40       catch ( UnsupportedAudioFileException audioException ) {
41          audioException.printStackTrace();
42          return false;
43       }
44
45       // I/O error attempting to get stream
46       catch ( IOException ioException ) {
47          ioException.printStackTrace();
48          return false;
49       }
50
51       // invoke loadClip, returning true if load successful
52       return loadClip();
53
54    } // end method openFile
55
56    // load sound clip
57    public boolean loadClip()
58    {
59       // get clip line for file
60       try {
61
62          // get audio format of sound file
63          AudioFormat audioFormat = soundStream.getFormat();
64
65          // define line information based on line type,
66          // encoding and frame sizes of audio file
67          DataLine.Info dataLineInfo = new DataLine.Info(
68             Clip.class, AudioSystem.getTargetFormats(
69                AudioFormat.Encoding.PCM_SIGNED, audioFormat ),

Fig. 22.5 ClipPlayer plays an audio file (part 2 of 4).


70             audioFormat.getFrameSize(),
71             audioFormat.getFrameSize() * 2 );
72
73          // make sure sound system supports data line
74          if ( !AudioSystem.isLineSupported( dataLineInfo ) ) {
75
76             System.err.println( "Unsupported Clip File!" );
77             return false;
78          }
79
80          // get clip line resource
81          clip = ( Clip ) AudioSystem.getLine( dataLineInfo );
82
83          // listen to clip line for events
84          clip.addLineListener( this );
85
86          // open audio clip and get required system resources
87          clip.open( soundStream );
88
89       } // end try
90
91       // line resource unavailable
92       catch ( LineUnavailableException noLineException ) {
93          noLineException.printStackTrace();
94          return false;
95       }
96
97       // I/O error during interpretation of audio data
98       catch ( IOException ioException ) {
99          ioException.printStackTrace();
100          return false;
101       }
102
103       // clip file loaded successfully
104       return true;
105
106    } // end method loadClip
107
108    // start playback of audio clip
109    public void play()
110    {
111       clip.start();
112    }
113
114    // line event listener method to stop or replay at clip end
115    public void update( LineEvent lineEvent )
116    {
117       // if clip reaches end, close clip
118       if ( lineEvent.getType() == LineEvent.Type.STOP &&
119          !replay )
120          close();
121

Fig. 22.5 ClipPlayer plays an audio file (part 3 of 4).


All Lines generate LineEvents, which can be handled by LineListeners. LineEvents occur when opening, closing, starting and stopping a Line object. Although a Line stops playback automatically when it reaches the end of an audio file, class ClipPlayer implements interface LineListener (line 10) so that it can close the Clip permanently or replay the Clip (discussed shortly). LineListeners are useful for tasks that must be synchronized with the LineEvent states of a line.

The Clip reads audio data from an AudioInputStream (a subclass of InputStream), which provides access to the stream’s data content. This example loads the audio data as a clip before attempting to play it, and therefore can determine the length of the clip in frames. Each frame represents the data at a specific time interval in the audio file. To play sampled audio files using Java Sound, a program must obtain an AudioInputStream from an audio file, obtain a formatted Clip line, load the AudioInputStream into the Clip line and start the data flow in the Clip line.
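The frame-based view of an AudioInputStream can be demonstrated without a sound file or a sound card by wrapping an in-memory byte array. This sketch is not from the book; the format values are arbitrary:

```java
// Builds an AudioInputStream over raw bytes and reports its
// length in frames, illustrating the frame concept.
import java.io.ByteArrayInputStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;

public class FrameLengthDemo {

   public static void main( String args[] )
   {
      // 8 kHz, 16-bit, mono, signed, little-endian:
      // each frame is 2 bytes
      AudioFormat format = new AudioFormat( 8000f, 16, 1, true, false );

      // one second of silence = 8000 frames * 2 bytes per frame
      byte[] samples = new byte[ 8000 * 2 ];

      AudioInputStream stream = new AudioInputStream(
         new ByteArrayInputStream( samples ), format,
         samples.length / format.getFrameSize() );

      System.out.println( "Frame size: " +
         format.getFrameSize() + " bytes" );          // 2
      System.out.println( "Clip length: " +
         stream.getFrameLength() + " frames" );       // 8000
   }
}
```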

122       // if replay set, replay forever
123       else
124
125          if ( lineEvent.getType() == LineEvent.Type.STOP &&
126             replay ) {
127
128             System.out.println( "replay" );
129
130             // replay clip forever
131             clip.loop( Clip.LOOP_CONTINUOUSLY );
132          }
133    }
134
135    // set replay of clip
136    public void setReplay( boolean value )
137    {
138       replay = value;
139    }
140
141    // stop and close clip, returning system resources
142    public void close()
143    {
144       if ( clip != null ) {
145          clip.stop();
146          clip.close();
147       }
148    }
149
150 } // end class ClipPlayer

Fig. 22.5 ClipPlayer plays an audio file (part 4 of 4).

To play back sampled audio, the audio stream must be obtained from an audio file. ClipPlayer method openFile (lines 31–54) obtains audio from soundFile (initialized in the ClipPlayer constructor at lines 25–28). Lines 35–36 call AudioSystem static method getAudioInputStream to obtain an AudioInputStream for soundFile. Class AudioSystem facilitates access to many of the resources needed to play and manipulate sound files. Method getAudioInputStream throws an UnsupportedAudioFileException if the specified sound file is a non-audio file or if it contains a format that is not supported by Java Sound.

Next the program must provide a line through which audio data can be processed. Line 52 invokes method loadClip (lines 57–106) to open a Clip line and load the audio stream for playback. Line 81 invokes AudioSystem static method getLine to obtain a Clip line for audio playback. Method getLine requires a Line.Info object as an argument, to specify the attributes of the line that the AudioSystem should return. The line must be able to process audio clips of all supported sampled audio formats, so the DataLine.Info object must specify a Clip data line and a general encoding format. A buffer range should also be specified so the program can determine the best buffer size. The DataLine.Info constructor receives four arguments. The first two are the format (of type AudioFormat.Encoding) into which the program should convert the audio data and the AudioFormat of the audio source. The AudioFormat sets the format supported by the line, according to the audio format of the stream. Line 63 obtains the AudioFormat of the AudioInputStream, which contains format specifications that the underlying system uses to translate the data into sounds. Lines 68–69 call AudioSystem method getTargetFormats to obtain an array of the supported AudioFormats. The third argument of the DataLine.Info constructor, which specifies the minimum buffer size, is set to the number of bytes in each frame of the audio stream. Line 70 invokes AudioFormat method getFrameSize to obtain the size of each frame in the audio stream. The maximum buffer size should be equivalent to two frames of the audio stream (line 71). Using the DataLine.Info object, line 74 checks whether the underlying audio system supports the specified line. If it does, line 81 obtains the line from the audio system.
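The DataLine.Info arguments can be exercised headlessly, without opening a real line. This sketch is not from the book; it uses an arbitrary CD-quality format rather than a file’s actual format:

```java
// Constructs a DataLine.Info for a Clip with frame-based
// buffer bounds, mirroring the pattern used in Fig. 22.5.
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;

public class DataLineInfoDemo {

   public static void main( String args[] )
   {
      // CD-quality stereo: 2 channels * 16 bits = 4-byte frames
      AudioFormat format = new AudioFormat( 44100f, 16, 2, true, false );

      int frameSize = format.getFrameSize();

      // buffer between one and two frames, as in Fig. 22.5
      DataLine.Info info = new DataLine.Info( Clip.class,
         new AudioFormat[] { format }, frameSize, frameSize * 2 );

      System.out.println( "Line class: " + info.getLineClass() );
      System.out.println( "Min buffer: " + info.getMinBufferSize() );
      System.out.println( "Max buffer: " + info.getMaxBufferSize() );
   }
}
```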

When an audio clip starts playing and when it finishes, the program needs to be alerted. Line 84 registers a LineListener for the Clip’s LineEvents. If a LineEvent occurs, the program calls LineListener method update (lines 115–133) to process it. The four LineEvent types, as defined in class LineEvent.Type, are OPEN, CLOSE, START and STOP. When the event type is LineEvent.Type.STOP and variable replay is false, line 120 calls ClipPlayer’s close method (lines 142–148) to stop audio playback and close the Clip. All audio resources obtained previously by the Clip are released when audio playback stops. When the event type is LineEvent.Type.STOP and variable replay is true, line 131 calls Clip method loop with parameter Clip.LOOP_CONTINUOUSLY, causing the Clip to loop until the user terminates the application. Invoking method stop of interface Clip stops data activity in the Line. Invoking method start resumes data activity.

Once the program finishes validating the Clip, line 87 calls Clip method open with the AudioInputStream soundStream as an argument. The Clip obtains the system resources required for audio playback. AudioSystem method getLine and Clip method open throw LineUnavailableExceptions if another application is using the requested audio resource. Clip method open also throws an IOException if the Clip cannot read the specified AudioInputStream. When the test program (Fig. 22.6) calls ClipPlayer method play (lines 109–112), the Clip method start begins audio playback.

Class ClipPlayerTest (Fig. 22.6) enables users to specify an audio file to play by clicking the Open Audio Clip button. When users click the button, method actionPerformed (lines 37–58) prompts the user for an audio file name and location (line 39) and creates a ClipPlayer for the specified audio file (line 44). Line 47 invokes ClipPlayer method openFile, which returns true if the ClipPlayer can open the audio file. If so, line 50 calls ClipPlayer method play to play the audio, and line 53 calls ClipPlayer method setReplay to indicate that the audio should not loop continuously.

Performance Tip 22.4
Large audio files take a long time to load, depending on the speed of the computer. An alternative playback form is to buffer the audio by loading a portion of the data to begin playback and continuing to load the remainder as the audio plays. This is similar to the streaming capability provided by JMF.

1  // Fig. 22.6: ClipPlayerTest.java
2  // Test file for ClipPlayer
3
4  // Java core packages
5  import java.awt.*;
6  import java.awt.event.*;
7  import java.io.*;
8
9  // Java extension packages

10 import javax.swing.*;
11
12 public class ClipPlayerTest extends JFrame {
13
14    // object to play audio clips
15    private ClipPlayer clipPlayer;
16
17    // constructor for ClipPlayerTest
18    public ClipPlayerTest()
19    {
20       super( "Clip Player" );
21
22       // panel containing buttons
23       JPanel buttonPanel = new JPanel();
24       getContentPane().add( buttonPanel );
25
26       // open file button
27       JButton openFile = new JButton( "Open Audio Clip" );
28       buttonPanel.add( openFile, BorderLayout.CENTER );
29
30       // register ActionListener for openFile events
31       openFile.addActionListener(
32
33          // inner anonymous class to handle openFile ActionEvent
34          new ActionListener() {
35
36             // try to open and play an audio clip file
37             public void actionPerformed( ActionEvent event )
38             {
39                File mediaFile = getFile();
40

Fig. 22.6 ClipPlayerTest enables the user to specify the name and location of the audio to play with ClipPlayer (part 1 of 3).


41                if ( mediaFile != null ) {
42
43                   // instantiate new clip player with mediaFile
44                   clipPlayer = new ClipPlayer( mediaFile );
45
46                   // if clip player opened correctly
47                   if ( clipPlayer.openFile() == true ) {
48
49                      // play loaded clip
50                      clipPlayer.play();
51
52                      // no replay
53                      clipPlayer.setReplay( false );
54                   }
55
56                } // end if mediaFile
57
58             } // end actionPerformed
59
60          } // end ActionListener
61
62       ); // end call to addActionListener
63
64    } // end constructor
65
66    // get file from computer
67    public File getFile()
68    {
69       JFileChooser fileChooser = new JFileChooser();
70
71       fileChooser.setFileSelectionMode(
72          JFileChooser.FILES_ONLY );
73       int result = fileChooser.showOpenDialog( this );
74
75       if ( result == JFileChooser.CANCEL_OPTION )
76          return null;
77
78       else
79          return fileChooser.getSelectedFile();
80    }
81
82    // execute application
83    public static void main( String args[] )
84    {
85       ClipPlayerTest test = new ClipPlayerTest();
86
87       test.setSize( 150, 70 );
88       test.setLocation( 300, 300 );
89       test.setDefaultCloseOperation( EXIT_ON_CLOSE );
90       test.setVisible( true );
91    }
92

Fig. 22.6 ClipPlayerTest enables the user to specify the name and location of the audio to play with ClipPlayer (part 2 of 3).


22.7 Musical Instrument Digital Interface (MIDI)

The Musical Instrument Digital Interface (MIDI) is a standard format for electronic music. MIDI music can be created through a digital instrument, such as an electronic keyboard, or through software. The MIDI interface allows musicians to create synthesized digital music that reproduces an actual performance. Then they can share their musical creations with music enthusiasts around the world. A MIDI synthesizer is a device that can produce MIDI sounds and music.

Programs can easily manipulate MIDI data. Like other types of audio, MIDI data has a well-defined format that MIDI players can interpret, play and use to create new MIDI data. The Complete MIDI 1.0 Detailed Specification provides detailed information on MIDI files. Visit the official MIDI Web site at www.midi.org for information on MIDI and its specification. Java Sound's MIDI packages (javax.sound.midi and javax.sound.midi.spi) allow developers to access MIDI data.

Interpretation of MIDI data varies between synthesizers, so a file may sound quite different when played on synthesizers other than the one on which it was created. Synthesizers support varying types and numbers of instrumental sounds and different numbers of simultaneous sounds. Usually, hardware-based synthesizers are capable of producing higher-quality synthesized music than software-based synthesizers.

Many Web sites and games use MIDI for music playback, as it enables developers to entertain users with lengthy, digitized music files that do not require a lot of memory; in comparison, sampled audio files can grow to be quite large. Package javax.sound.midi enables programs to manipulate, play and synthesize MIDI. Java Sound supports MIDI files with .mid and .rmf (Rich Music Format) extensions.

The example presented in Sections 22.7.1 through 22.7.4 covers MIDI synthesis, playback, recording and saving. Class MidiDemo (Fig. 22.10) is the main application class that utilizes classes MidiData (Fig. 22.7), MidiRecord (Fig. 22.8) and MidiSynthesizer (Fig. 22.9). Class MidiSynthesizer provides resources for generating

93 } // end class ClipPlayerTest

Fig. 22.6 ClipPlayerTest enables the user to specify the name and location of the audio to play with ClipPlayer (part 3 of 3).


sounds and transmitting them to other MIDI devices, such as recorders. Class MidiData handles MIDI playback, track initialization and event information. Class MidiRecord provides MIDI recording capabilities. Class MidiDemo ties the other classes together with an interactive GUI that includes a simulated piano keyboard, play and record buttons, and a control panel for configuring MIDI options. Class MidiDemo also uses MIDI event synchronization to play a MIDI file and highlight the appropriate piano keys, simulating someone playing the keyboard.

An integral part of this MIDI example is its GUI, which allows users to play musical notes on a simulated piano keyboard (see the screen capture in Fig. 22.10). When the mouse hovers over a piano key, the program plays the corresponding note. In this section, we refer to this as user synthesis. The Play MIDI button in the GUI allows the user to select a MIDI file to play. The Record button records the notes played on the piano (user synthesis). Users can save the recorded MIDI to a file using the Save MIDI button and play back the recorded MIDI using the PlayBack button. Users can click the Piano Player button to open a MIDI file, then play that file back through a synthesizer. The program signifies synchronization of notes and piano keys by highlighting the key that corresponds to the note number. This playback and synchronization ability is called the "piano player." While the "piano player" is running, users can synthesize additional notes, and record both the old audio material and the new user-synthesized notes by clicking the Record button. The JComboBox in the upper-left corner of the GUI enables users to select an instrument for synthesis. Additional GUI components include a volume control for user-synthesized notes and a tempo control for controlling the speed of the "piano player."

Testing and Debugging Tip 22.1
Testing the MIDI file playback functions requires a sound card and an audio file in MIDI format.

22.7.1 MIDI Playback

This section discusses how to play MIDI files and how to access and interpret MIDI file contents. Class MidiData (Fig. 22.7) contains methods that load a MIDI file for playback. The class also provides the MIDI track information required by the "piano player" feature. A MIDI sequencer is used to play and manipulate the audio data. Often, MIDI data is referred to as a sequence, because the musical data in a MIDI file is composed of a sequence of events. The steps performed in MIDI playback are accessing a sequencer, loading a MIDI sequence or file into that sequencer and starting the sequencer.

1  // Fig. 22.7: MidiData.java
2  // Contains MIDI sequence information
3  // with accessor methods and MIDI playback methods
4
5  // Java core package
6  import java.io.*;
7
8  // Java extension package
9  import javax.sound.midi.*;

10

Fig. 22.7 MidiData loads MIDI files for playback (part 1 of 5).


11 public class MidiData {
12
13    // MIDI track data
14    private Track track;
15
16    // player for MIDI sequences
17    private Sequencer sequencer;
18
19    // MIDI sequence
20    private Sequence sequence;
21
22    // MIDI events containing time and MidiMessages
23    private MidiEvent currentEvent, nextEvent;
24
25    // MIDI message usually containing sounding messages
26    private ShortMessage noteMessage;
27
28    // short, meta, or sysex MIDI messages
29    private MidiMessage message;
30
31    // index of MIDI event in track, command in MIDI message
32    private int eventIndex = 0, command;
33
34    // method to play MIDI sequence via sequencer
35    public void play()
36    {
37       // initiate default sequencer
38       try {
39
40          // get sequencer from MidiSystem
41          sequencer = MidiSystem.getSequencer();
42
43          // open sequencer resources
44          sequencer.open();
45
46          // load MIDI into sequencer
47          sequencer.setSequence( sequence );
48
49          // play sequence
50          sequencer.start();
51       }
52
53       // MIDI resource availability error
54       catch ( MidiUnavailableException noMidiException ) {
55          noMidiException.printStackTrace();
56       }
57
58       // corrupted MIDI or invalid MIDI file encountered
59       catch ( InvalidMidiDataException badMidiException ) {
60          badMidiException.printStackTrace();
61
62       }
63

Fig. 22.7 MidiData loads MIDI files for playback (part 2 of 5).


64    } // end method play
65
66    // method returning adjusted tempo/resolution of MIDI
67    public int getResolution()
68    {
69       return 500 / sequence.getResolution();
70    }
71
72    // obtain MIDI and prepare track in MIDI to be accessed
73    public boolean initialize( File file )
74    {
75       // get valid MIDI from file into sequence
76       try {
77          sequence = MidiSystem.getSequence( file );
78       }
79
80       // unreadable MIDI file or unsupported MIDI
81       catch ( InvalidMidiDataException badMIDI ) {
82          badMIDI.printStackTrace();
83          return false;
84       }
85
86       // I/O error generated during file reading
87       catch ( IOException ioException ) {
88          ioException.printStackTrace();
89          return false;
90       }
91
92       return true;
93
94    } // end method initialize
95
96    // prepare longest track to be read and get first MIDI event
97    public boolean initializeTrack()
98    {
99       // get all tracks from sequence
100      Track tracks[] = sequence.getTracks();
101
102      if ( tracks.length == 0 ) {
103         System.err.println( "No tracks in MIDI sequence!" );
104
105         return false;
106      }
107
108      track = tracks[ 0 ];
109
110      // find longest track
111      for ( int i = 0; i < tracks.length; i++ )
112
113         if ( tracks[ i ].size() > track.size() )
114            track = tracks[ i ];
115

Fig. 22.7 MidiData loads MIDI files for playback (part 3 of 5).


116      // set current MIDI event to first event in track
117      currentEvent = track.get( eventIndex );
118
119      // get MIDI message from event
120      message = currentEvent.getMessage();
121
122      // track initialization successful
123      return true;
124
125   } // end method initializeTrack
126
127   // move to next event in track
128   public void goNextEvent()
129   {
130      eventIndex++;
131      currentEvent = track.get( eventIndex );
132      message = currentEvent.getMessage();
133   }
134
135   // get time interval between events
136   public int getEventDelay()
137   {
138      // first event's time interval is its duration
139      if ( eventIndex == 0 )
140         return ( int ) currentEvent.getTick();
141
142      // time difference between current and next event
143      return ( int ) ( track.get( eventIndex + 1 ).getTick() -
144         currentEvent.getTick() );
145   }
146
147   // return if track has ended
148   public boolean isTrackEnd()
149   {
150      // if eventIndex is less than track's number of events
151      if ( eventIndex + 1 < track.size() )
152         return false;
153
154      return true;
155   }
156
157   // get current ShortMessage command from event
158   public int getEventCommand()
159   {
160      if ( message instanceof ShortMessage ) {
161
162         // obtain MidiMessage for accessing purposes
163         noteMessage = ( ShortMessage ) message;
164         return noteMessage.getCommand();
165      }
166
167      return -1;
168   }

Fig. 22.7 MidiData loads MIDI files for playback (part 4 of 5).


To play a MIDI file with a sequencer, a program must first obtain the MIDI sequence and check for compatibility issues. MidiData method initialize (lines 73–94) obtains a Sequence of MIDI data from a file with MidiSystem method getSequence (line 77). A Sequence contains MIDI tracks, which, in turn, contain MIDI events. Each event encapsulates a MIDI message of instructions for the MIDI devices. Individual tracks of a MIDI sequence are analogous to tracks on a CD. However, while CD tracks are played in order, MIDI tracks are played in parallel. A MIDI track is a recorded sequence of data. MIDI files usually contain multiple tracks. Method getSequence can also obtain a MIDI sequence from a URL or an InputStream. Method getSequence throws an InvalidMidiDataException if the MIDI system detects an incompatible MIDI file.

Portability Tip 22.2
Because of incompatible file parsers in different operating systems, sequencers may not be able to play RMF files.

After obtaining a valid MIDI sequence, the program must obtain a sequencer and load the sequence into the sequencer. Method play (lines 35–64) in class MidiData calls MidiSystem method getSequencer (line 41) to obtain a Sequencer to play the Sequence. Interface Sequencer, which extends interface MidiDevice (the superinterface for all MIDI devices), provides the standard sequencer device to play MIDI data. If another program is using the same Sequencer object, method getSequencer throws a MidiUnavailableException. Line 44 calls Sequencer method open to prepare to play a Sequence. Sequencer method setSequence (line 47) loads a MIDI Sequence into the Sequencer and throws an InvalidMidiDataException if the Sequencer detects an unrecognizable MIDI sequence. Line 50 begins playing the MIDI sequence by calling the Sequencer's start method.

In addition to MIDI playback methods, class MidiData also provides methods that enable a program to access the events and messages of a MIDI sequence. As we shall see, class MidiDemo (Fig. 22.10) uses class MidiData to access the data in a MIDI file for synchronizing the highlighting of piano keys. The MIDI events are stored in the MIDI file's

169
170   // get note number of current event
171   public int getNote()
172   {
173      if ( noteMessage != null )
174         return noteMessage.getData1();
175
176      return -1;
177   }
178
179   // get volume of current event
180   public int getVolume()
181   {
182      return noteMessage.getData2();
183   }
184
185 } // end class MidiData

Fig. 22.7 MidiData loads MIDI files for playback (part 5 of 5).


tracks, which are instances of class Track (package javax.sound.midi). MIDI events in MIDI tracks are represented by class MidiEvent (package javax.sound.midi). Each MIDI event contains an instruction and the time at which it should occur. The individual events in a track contain messages of type MidiMessage that specify the MIDI instructions for a MidiDevice. There are three types of MIDI messages: ShortMessage, SysexMessage and MetaMessage. ShortMessages are explicit musical instructions, such as the specific notes to play and pitch changes. The other two, less-used message types are SysexMessages, system-exclusive messages for MIDI devices, and MetaMessages, which may indicate to a MIDI device that the MIDI has reached the end of a track. This section deals exclusively with ShortMessages that play specific notes.

Next, the program must obtain the tracks and read their events. MidiData method initializeTrack (lines 97–125) invokes Sequence's getTracks method (line 100) to obtain all of the tracks in the MIDI sequence. Lines 108–114 determine the longest track in the MIDI sequence and set it as the one to play. Line 117 obtains the first MIDI event in the Track by invoking its get method with the index of the event in the track as the parameter. At this point eventIndex is set to 0 (line 32). Line 120 obtains the MIDI message from the MIDI event using method getMessage of class MidiEvent. To help a program step through each event in the track, the program can call MidiData method goNextEvent (lines 128–133) to load the next event and message. Method goNextEvent increments eventIndex in the loaded MIDI Track and obtains the next event's MidiMessage.
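The track-access pattern above can be exercised without any MIDI hardware, because Sequence, Track and MidiEvent are plain data classes. The following standalone sketch (the class name TrackWalk is ours, not part of the chapter's example) builds a two-event sequence in memory and steps through its events much as initializeTrack and goNextEvent do:

```java
import javax.sound.midi.*;

public class TrackWalk {

   public static void main( String args[] ) throws Exception
   {
      // build a small tempo-based sequence in memory
      Sequence sequence = new Sequence( Sequence.PPQ, 10 );
      Track track = sequence.createTrack();

      // add two NOTE_ON events at ticks 0 and 20
      ShortMessage on1 = new ShortMessage();
      on1.setMessage( ShortMessage.NOTE_ON, 0, 60, 64 );
      track.add( new MidiEvent( on1, 0 ) );

      ShortMessage on2 = new ShortMessage();
      on2.setMessage( ShortMessage.NOTE_ON, 0, 64, 64 );
      track.add( new MidiEvent( on2, 20 ) );

      // step through the events, reading each tick and message
      for ( int i = 0; i < track.size(); i++ ) {
         MidiEvent event = track.get( i );
         System.out.println( "tick " + event.getTick() + ": " +
            event.getMessage().getClass().getSimpleName() );
      }
   }
}
```

Note that in the reference implementation every new Track already contains an end-of-track MetaMessage, so track.size() reports one more event than the two we added.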

In addition to reading the events, the program must also determine how long each event lasts and the spacing between events. Method getEventDelay (lines 136–145) returns the duration of a MidiEvent as the time difference between two events in the MIDI sequence (lines 143–144). MidiEvent's getTick method provides the specific time when the event takes place (also called a time stamp). Lines 139–140 return the first MidiEvent's time stamp as that event's duration.

Class MidiData provides other methods to return the commands, the note numbers and the volume of note-related ShortMessages. Method getEventCommand (lines 158–168) determines the command number representing the command instruction. Line 160 of method getEventCommand determines whether the currently loaded MidiMessage is a ShortMessage. If so, line 163 assigns the ShortMessage to object noteMessage and line 164 returns the ShortMessage's command status byte by invoking ShortMessage's getCommand method. Method getEventCommand returns -1 if the event does not contain a ShortMessage. MidiData method getNote (lines 171–177) invokes ShortMessage method getData1 (line 174) to return the note number. Method getVolume (lines 180–183) invokes ShortMessage method getData2 to return the volume. Class MidiData also provides an indication of the end of a track in method isTrackEnd (lines 148–155), which determines whether the event index has reached the last event in the track (line 151).

22.7.2 MIDI Recording

A program can record MIDI using a sequencer. Class MidiRecord (Fig. 22.8) handles the recording functions of this MIDI demo, using an object that implements interface Sequencer as a MIDI recorder. As long as the MIDI devices are set up correctly, interface Sequencer provides simple methods for recording. Class MidiRecord has a constructor (lines 29–32) that receives an object that implements interface Transmitter as an


argument. A Transmitter sends MIDI messages to a MIDI device that implements interface Receiver. Think of Transmitters and Receivers as output and input ports, respectively, for MIDI devices.
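Because Receiver is simply an interface, the "input port" half of this wiring can be illustrated with a plain class and no sound hardware. The sketch below (class CountingReceiver is hypothetical, not part of the chapter's code) implements Receiver and counts the messages delivered to its send method, which is exactly the call a Transmitter makes on whatever object is installed with setReceiver:

```java
import javax.sound.midi.*;

// hypothetical Receiver that counts incoming MidiMessages;
// a Transmitter wired to it via setReceiver would deliver
// messages to its send method
public class CountingReceiver implements Receiver {

   private int messageCount = 0;

   // called by the transmitting device for each MidiMessage
   public void send( MidiMessage message, long timeStamp )
   {
      messageCount++;
   }

   // release resources (none are held here)
   public void close() {}

   public int getMessageCount()
   {
      return messageCount;
   }

   public static void main( String args[] ) throws Exception
   {
      CountingReceiver receiver = new CountingReceiver();

      // deliver one NOTE_ON message directly, as a transmitter would
      ShortMessage message = new ShortMessage();
      message.setMessage( ShortMessage.NOTE_ON, 0, 60, 93 );
      receiver.send( message, -1 );

      System.out.println( receiver.getMessageCount() ); // prints 1
   }
}
```

A custom Receiver like this could be installed on a real device's Transmitter to log or filter messages before they reach a sequencer.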

1  // Fig. 22.8: MidiRecord.java
2  // Allows for recording and playback
3  // of synthesized MIDI
4
5  // Java core packages
6  import java.io.*;
7
8  // Java extension package
9  import javax.sound.midi.*;

10
11 public class MidiRecord {
12
13    // MIDI track
14    private Track track;
15
16    // MIDI sequencer to play and access music
17    private Sequencer sequencer;
18
19    // MIDI sequence
20    private Sequence sequence;
21
22    // receiver of MIDI events
23    private Receiver receiver;
24
25    // transmitter for transmitting MIDI messages
26    private Transmitter transmitter;
27
28    // constructor for MidiRecord
29    public MidiRecord( Transmitter transmit )
30    {
31       transmitter = transmit;
32    }
33
34    // initialize recording sequencer, set up recording sequence
35    public boolean initialize()
36    {
37       // create empty MIDI sequence and set up sequencer wiring
38       try {
39
40          // create tempo-based sequence of 10 pulses per beat
41          sequence = new Sequence( Sequence.PPQ, 10 );
42
43          // obtain sequencer and open it
44          sequencer = MidiSystem.getSequencer();
45          sequencer.open();
46
47          // get receiver of sequencer
48          receiver = sequencer.getReceiver();

Fig. 22.8 MidiRecord enables a program to record a MIDI sequence (part 1 of 3).


49
50          if ( receiver == null ) {
51             System.err.println(
52                "Receiver unavailable for sequencer" );
53             return false;
54          }
55
56          // set receiver for transmitter to send MidiMessages
57          transmitter.setReceiver( receiver );
58
59          makeTrack();
60       }
61
62       // invalid timing division specification for new sequence
63       catch ( InvalidMidiDataException invalidMidiException ) {
64          invalidMidiException.printStackTrace();
65          return false;
66       }
67
68       // sequencer or receiver unavailable
69       catch ( MidiUnavailableException noMidiException ) {
70          noMidiException.printStackTrace();
71          return false;
72       }
73
74       // MIDI recorder initialization successful
75       return true;
76
77    } // end method initialize
78
79    // make new empty track for sequence
80    public void makeTrack()
81    {
82       // if previous track exists, delete it first
83       if ( track != null )
84          sequence.deleteTrack( track );
85
86       // create track in sequence
87       track = sequence.createTrack();
88    }
89
90    // start playback of loaded sequence
91    public void play()
92    {
93       sequencer.start();
94    }
95
96    // start recording into sequence
97    public void startRecord()
98    {
99       // load sequence into recorder and start recording
100      try {
101         sequencer.setSequence( sequence );

Fig. 22.8 MidiRecord enables a program to record a MIDI sequence (part 2 of 3).


The first step of recording MIDI data is similar to the playback mechanism in class MidiData. In addition to obtaining an empty sequence and a sequencer, a MIDI recording program needs to connect the transmitters and receivers. After successfully "wiring" the sequencer's receiver as the "IN PORT," the recorder loads the empty sequence into the sequencer to start recording to a new track in the sequence. The following discussion covers these steps.

102
103         // set track to recording-enabled and default channel
104         sequencer.recordEnable( track, 0 );
105
106         sequencer.startRecording();
107      }
108
109      // sequence contains bad MIDI data
110      catch ( InvalidMidiDataException badMidiException ) {
111         badMidiException.printStackTrace();
112
113      }
114
115   } // end method startRecord
116
117   // stop MIDI recording
118   public void stopRecord()
119   {
120      sequencer.stopRecording();
121   }
122
123   // save MIDI sequence to file
124   public void saveSequence( File file )
125   {
126      // get all MIDI supported file types
127      int[] fileTypes = MidiSystem.getMidiFileTypes( sequence );
128
129      if ( fileTypes.length == 0 ) {
130         System.err.println( "No supported MIDI file format!" );
131         return;
132      }
133
134      // write recorded sequence into MIDI file
135      try {
136         MidiSystem.write( sequence, fileTypes[ 0 ], file );
137      }
138
139      // error writing to file
140      catch ( IOException ioException ) {
141         ioException.printStackTrace();
142      }
143
144   } // end method saveSequence
145
146 } // end class MidiRecord

Fig. 22.8 MidiRecord enables a program to record a MIDI sequence (part 3 of 3).


Method initialize (lines 35–77) of class MidiRecord sets up the sequencer for recording. Line 41 of method initialize instantiates an empty sequence. MidiRecord will record data to the empty sequence once the transmitter is connected to the receiver. Line 48 obtains the recording sequencer's receiver, and line 57 specifies that transmitter will send its messages to receiver.

MIDI messages must be placed in a track, so method initialize invokes method makeTrack (lines 80–88) to delete any previously existing track (line 84) and to create an empty Track (line 87). Method makeTrack can also be called from an external class to record a new sequence without instantiating a new sequencer and a new sequence.

After setting up a sequencer and an empty sequence, calling MidiRecord method startRecord (lines 97–115) starts the recording process. Line 101 loads the empty sequence into the sequencer. Sequencer method recordEnable is called with the track object and a channel number as arguments (line 104), which enables recording on that track. Line 106 invokes Sequencer's startRecording method to start the recording of MIDI events sent from the transmitter. Sequencer's stopRecording method stops recording and is called in MidiRecord's stopRecord method (lines 118–121).

Class MidiRecord can also save a recorded sequence to a MIDI file using its saveSequence method (lines 124–144). Although most MIDI sequences can be saved as MIDI type 0 files (the most common type of MIDI file), the sequence should be checked for other supported file types. Line 127 obtains an array of MIDI file types supported by the system for writing the sequence to a file. The MIDI file types are represented by integer values of 0, 1 or 2. Using the first supported file type, MidiSystem writes the sequence to a specified File (line 136) passed into method saveSequence as an argument. MidiRecord's play method (lines 91–94) enables the program to play back the newly recorded sequence.
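The save logic can be exercised without a sound card, since MidiSystem.write is also overloaded to accept an OutputStream. The following sketch (the class name SaveSketch is ours) writes a minimal in-memory sequence to a byte array and shows that the result begins with the standard MIDI file header chunk "MThd":

```java
import java.io.ByteArrayOutputStream;
import javax.sound.midi.*;

public class SaveSketch {

   public static void main( String args[] ) throws Exception
   {
      // build a minimal one-track sequence in memory
      Sequence sequence = new Sequence( Sequence.PPQ, 10 );
      sequence.createTrack();

      // ask which standard MIDI file types (0, 1 or 2)
      // the system can write for this sequence
      int[] fileTypes = MidiSystem.getMidiFileTypes( sequence );

      // write to a byte stream instead of a File
      ByteArrayOutputStream out = new ByteArrayOutputStream();
      MidiSystem.write( sequence, fileTypes[ 0 ], out );

      byte[] bytes = out.toByteArray();

      // every standard MIDI file starts with the "MThd" chunk
      System.out.println( "" + ( char ) bytes[ 0 ] +
         ( char ) bytes[ 1 ] + ( char ) bytes[ 2 ] +
         ( char ) bytes[ 3 ] ); // prints MThd
   }
}
```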

22.7.3 MIDI Synthesis

This MidiDemo program provides an interactive piano that generates notes according to the keys pressed by the user. Class MidiSynthesizer (Fig. 22.9) generates these notes directly and sends them to another device. Specifically, it sends the notes to a sequencer's receiver through a transmitter to record the MIDI sequence. Class MidiSynthesizer uses an object that implements interface Synthesizer (a sub-interface of MidiDevice) to access the default synthesizer's sound generation, instruments, channel resources and sound banks. A SoundBank is the container for various Instruments, each of which instructs the computer how to make the sound of a specific note. Different notes made by various instruments can be played through MidiChannels on different tracks simultaneously to produce symphonic melodies.

1  // Fig. 22.9: MidiSynthesizer.java
2  // Accessing synthesizer resources
3
4  // Java extension package
5  import javax.sound.midi.*;
6

Fig. 22.9 MidiSynthesizer can generate notes and send them to another MIDI device (part 1 of 4).


7  public class MidiSynthesizer {
8
9     // main synthesizer accesses resources

10    private Synthesizer synthesizer;
11
12    // available instruments for synthesis use
13    private Instrument instruments[];
14
15    // channels through which notes sound
16    private MidiChannel channels[];
17    private MidiChannel channel; // current channel
18
19    // transmitter for transmitting messages
20    private Transmitter transmitter;
21
22    // receiver end of messages
23    private Receiver receiver;
24
25    // short message containing sound commands, note, volume
26    private ShortMessage message;
27
28    // constructor for MidiSynthesizer
29    public MidiSynthesizer()
30    {
31       // open synthesizer, set receiver,
32       // obtain channels and instruments
33       try {
34          synthesizer = MidiSystem.getSynthesizer();
35
36          if ( synthesizer != null ) {
37
38             synthesizer.open();
39
40             // get transmitter of synthesizer
41             transmitter = synthesizer.getTransmitter();
42
43             if ( transmitter == null )
44                System.err.println( "Transmitter unavailable" );
45
46             // get receiver of synthesizer
47             receiver = synthesizer.getReceiver();
48
49             if ( receiver == null )
50                System.out.println( "Receiver unavailable" );
51
52             // get all available instruments in default
53             // soundbank or synthesizer
54             instruments = synthesizer.getAvailableInstruments();
55
56             // get all 16 channels from synthesizer
57             channels = synthesizer.getChannels();
58

Fig. 22.9 MidiSynthesizer can generate notes and send them to another MIDI device (part 2 of 4).


59             // assign first channel as default channel
60             channel = channels[ 0 ];
61          }
62
63          else
64             System.err.println( "No Synthesizer" );
65       }
66
67       // synthesizer, receiver or transmitter unavailable
68       catch ( MidiUnavailableException noMidiException ) {
69          noMidiException.printStackTrace();
70       }
71
72    } // end constructor
73
74    // return available instruments
75    public Instrument[] getInstruments()
76    {
77       return instruments;
78    }
79
80    // return synthesizer's transmitter
81    public Transmitter getTransmitter()
82    {
83       return transmitter;
84    }
85
86    // sound note on through channel
87    public void midiNoteOn( int note, int volume )
88    {
89       channel.noteOn( note, volume );
90    }
91
92    // sound note off through channel
93    public void midiNoteOff( int note )
94    {
95       channel.noteOff( note );
96    }
97
98    // change to selected instrument
99    public void changeInstrument( int index )
100   {
101      Patch patch = instruments[ index ].getPatch();
102
103      channel.programChange( patch.getBank(),
104         patch.getProgram() );
105   }
106
107   // send custom MIDI messages through transmitter
108   public void sendMessage( int command, int note, int volume )
109   {

Fig. 22.9 MidiSynthesizer can generate notes and send them to another MIDI device (part 3 of 4).


MidiSynthesizer's constructor (lines 29–72) acquires the synthesizer and initializes related resources. Line 34 obtains a Synthesizer object from the MidiSystem and line 38 opens the Synthesizer. To enable sounds to be played and recorded at the same time, lines 41–47 obtain the Transmitter and Receiver of the Synthesizer. When a MIDI message is sent to the synthesizer's receiver, the synthesizer executes the message's instruction, generating notes, and the transmitter sends that message to designated Receivers of other MidiDevices.

Common Programming Error 22.3
A MidiUnavailableException occurs when a program attempts to acquire unavailable MidiDevice resources, such as synthesizers and transmitters.

MIDI messages are sent to the MidiSynthesizer from MidiDemo as a result of either pressing a piano key or a MidiEvent in the preloaded track of MidiData. A note can be generated by accessing the channels of the synthesizer directly. For simplicity, MidiSynthesizer uses only the first channel (out of a possible 16) to sound notes. Line 57 invokes Synthesizer method getChannels to obtain all 16 channels from synthesizer, and line 60 sets the default channel to the first channel. A MidiChannel sounds a note by calling its noteOn method with the note number (0–127) and a volume number as arguments. MidiChannel's noteOff method turns off a note with just the note number as an argument. MidiSynthesizer accesses these MidiChannel methods through method midiNoteOn (lines 87–90) and method midiNoteOff (lines 93–96), respectively.

A synthesizer can use its default instruments to sound notes. Line 54 obtains the default instruments available through the synthesizer or through a default sound bank by invoking Synthesizer method getAvailableInstruments. A sound bank usually has 128 instruments. The instrument in use can be changed by invoking MidiSynthesizer

110      // send a MIDI ShortMessage using this method's parameters
111      try {
112         message = new ShortMessage();
113
114         // set new message of command (NOTE_ON, NOTE_OFF),
115         // note number, volume
116         message.setMessage( command, note, volume );
117
118         // send message through receiver
119         receiver.send( message, -1 );
120      }
121
122      // invalid message values set
123      catch ( InvalidMidiDataException badMidiException ) {
124         badMidiException.printStackTrace();
125      }
126
127   } // end method sendMessage
128
129 } // end class MidiSynthesizer

Fig. 22.9 MidiSynthesizer can generate notes and send them to another MIDI device (part 4 of 4).


method changeInstrument (lines 99–105). Lines 103–104 invoke MidiChannel's programChange method to load the desired instrument program, with the bank and program number obtained from patch (line 104) as the parameters. A Patch is the location of a loaded instrument.

Performance Tip 22.5
A program can import more instruments by loading a customized sound bank through Synthesizer method loadAllInstruments with a SoundBank object.

By sending MidiMessages to a Synthesizer's Receiver, a program can invoke the synthesizer to sound notes without using its channels. Sending MidiMessages to a MidiDevice's Receiver also allows the device's Transmitters to send these messages to another MidiDevice's Receiver.

In MidiSynthesizer's sendMessage method (lines 108–127), lines 112–116 create a new ShortMessage from the parameters of method sendMessage and send the message to the synthesizer's receiver (line 119). Line 116 of method sendMessage invokes ShortMessage method setMessage to set the contents of the message's instructions using three int arguments: a command, the note to play and the volume of the note. Method setMessage throws an InvalidMidiDataException if the designated command and parameter values are invalid.

When creating a new ShortMessage using method setMessage, the meanings of the second and third arguments vary depending on the command. Command ShortMessage.NOTE_ON designates the second argument to be the note number and the third argument to be the velocity (i.e., volume) of the note. The ShortMessage.PROGRAM_CHANGE command designates the second argument as the instrument program to use and ignores the third argument.
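These argument conventions are easy to verify with ShortMessage alone, since building messages requires no MIDI device. A small sketch (the class name MessageSketch is ours) constructs both kinds of message and reads their fields back:

```java
import javax.sound.midi.*;

public class MessageSketch {

   public static void main( String args[] ) throws Exception
   {
      // NOTE_ON: second argument is the note number,
      // third is the velocity (volume)
      ShortMessage noteOn = new ShortMessage();
      noteOn.setMessage( ShortMessage.NOTE_ON, 60, 93 );

      System.out.println( noteOn.getCommand() ); // prints 144 (NOTE_ON)
      System.out.println( noteOn.getData1() );   // prints 60 (note)
      System.out.println( noteOn.getData2() );   // prints 93 (velocity)

      // PROGRAM_CHANGE: second argument selects the instrument
      // program; the third argument is ignored
      ShortMessage programChange = new ShortMessage();
      programChange.setMessage( ShortMessage.PROGRAM_CHANGE, 19, 0 );

      System.out.println( programChange.getCommand() ); // prints 192
      System.out.println( programChange.getData1() );   // prints 19
   }
}
```

The three-argument form of setMessage used here is the same overload that sendMessage in Fig. 22.9 calls.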

Line 119 sends the created ShortMessage to the synthesizer's receiver by calling Receiver method send with the MidiMessage and a time stamp as its arguments. MidiSynthesizer does not deal with the complexity of MIDI synthesis timing, so it passes a value of -1 for the time-stamp parameter to designate that the time stamp should be ignored. The sequence recorder in class MidiRecord takes care of timing issues when it receives the messages.

Up to this point, we have discussed the tools needed to create our MIDI piano. In brief synopsis, class MidiDemo (Fig. 22.10) uses class MidiSynthesizer to generate sounds and to access channels and instruments. MidiDemo uses MidiData to play back MIDI files and access MIDI track information. MidiRecord provides the recording function for MidiDemo, receiving messages from MidiSynthesizer.

22.7.4 Class MidiDemo

We now present class MidiDemo (Fig. 22.10), which provides the GUI for our piano, as well as other GUI components to control the capabilities of this example.

Using a for loop, utility method makeKeys (lines 86–155) in class MidiDemo creates 64 buttons that represent 64 different piano keys. Whenever the mouse hovers over a key, the program sounds the designated note. Method makeKeys arranges the keys at the bottom of the frame, using each button's setBounds method (line 106) to designate the location and size of the buttons. The program arranges the buttons horizontally according to their index in array noteButton.

jhtp4_22.FM Page 1299 Monday, July 16, 2001 11:29 AM

1300 Java Media Framework and Java Sound (on CD) Chapter 22

1   // Fig. 22.10: MidiDemo.java
2   // Simulates a musical keyboard with various
3   // instruments to play, also featuring recording, MIDI file
4   // playback and simulating MIDI playback with the keyboard
5
6   // Java core packages
7   import java.awt.*;
8   import java.awt.event.*;
9   import java.io.*;

10
11  // Java extension packages
12  import javax.swing.*;
13  import javax.swing.event.*;
14  import javax.sound.midi.*;
15
16  public class MidiDemo extends JFrame {
17
18     // recording MIDI data
19     private MidiRecord midiRecord;
20
21     // synthesize MIDI functioning
22     private MidiSynthesizer midiSynthesizer;
23
24     // MIDI data in MIDI file
25     private MidiData midiData;
26
27     // timer for simulating MIDI on piano
28     private Timer pianoTimer;
29
30     // piano keys
31     private JButton noteButton[];
32
33     // volume, tempo sliders
34     private JSlider volumeSlider, resolutionSlider;
35
36     // containers and panels holding GUI
37     private Container container;
38     private JPanel controlPanel, buttonPanel;
39
40     // instrument selector and buttons GUI
41     private JComboBox instrumentBox;
42     private JButton playButton, recordButton,
43        saveButton, pianoPlayerButton, listenButton;
44
45     // tempo, last piano key invoked, volume of MIDI
46     private int resolution, lastKeyOn = -1, midiVolume = 40;
47
48     // boolean value indicating if program is in recording mode
49     private boolean recording = false;
50
51     // first note number of first piano key, max number of keys
52     private static int FIRST_NOTE = 32, MAX_KEYS = 64;

Fig. 22.10 MidiDemo provides the GUI that enables users to interact with the application.


53
54     // constructor for MidiDemo
55     public MidiDemo()
56     {
57        super( "MIDI Demo" );
58
59        container = getContentPane();
60        container.setLayout( new BorderLayout() );
61
62        // synthesizer must be instantiated to enable synthesis
63        midiSynthesizer = new MidiSynthesizer();
64
65        // make piano keys
66        makeKeys();
67
68        // add control panel to frame
69        controlPanel = new JPanel( new BorderLayout() );
70        container.add( controlPanel, BorderLayout.NORTH );
71
72        makeConfigureControls();
73
74        // add button panel to frame
75        buttonPanel = new JPanel( new GridLayout( 5, 1 ) );
76        controlPanel.add( buttonPanel, BorderLayout.EAST );
77
78        // make GUI
79        makePlaySaveButtons();
80        makeRecordButton();
81        makePianoPlayerButton();
82
83     } // end constructor
84
85     // utility method making piano keys
86     private void makeKeys()
87     {
88        // panel containing keys
89        JPanel keyPanel = new JPanel( null );
90        container.add( keyPanel, BorderLayout.CENTER );
91
92        // piano keys
93        noteButton = new JButton[ MAX_KEYS ];
94
95        // add MAX_KEYS buttons and what note they sound
96        for ( int i = 0; i < MAX_KEYS; i++ ) {
97
98           final int note = i;
99
100          noteButton[ i ] = new JButton();
101
102          // setting white keys
103          noteButton[ i ].setBackground( Color.white );
104


105          // set correct spacing for buttons
106          noteButton[ i ].setBounds( ( i * 11 ), 1, 11, 40 );
107          keyPanel.add( noteButton[ i ] );
108
109          // register a mouse listener for mouse events
110          noteButton[ i ].addMouseListener(
111
112             // anonymous inner class to handle mouse events
113             new MouseAdapter() {
114
115                // invoke key note when mouse touches key
116                public void mouseEntered( MouseEvent mouseEvent )
117                {
118                   // if recording, send message to receiver
119                   if ( recording )
120                      midiSynthesizer.sendMessage(
121                         ShortMessage.NOTE_ON,
122                         note + FIRST_NOTE, midiVolume );
123
124                   // else just sound the note
125                   else
126                      midiSynthesizer.midiNoteOn(
127                         note + FIRST_NOTE, midiVolume );
128
129                   // turn key color to blue
130                   noteButton[ note ].setBackground(
131                      Color.blue );
132                }
133
134                // turn key note off when mouse leaves key
135                public void mouseExited( MouseEvent mouseEvent )
136                {
137                   if ( recording )
138                      midiSynthesizer.sendMessage(
139                         ShortMessage.NOTE_OFF,
140                         note + FIRST_NOTE, midiVolume );
141                   else
142                      midiSynthesizer.midiNoteOff(
143                         note + FIRST_NOTE );
144
145                   noteButton[ note ].setBackground(
146                      Color.white );
147                }
148
149             } // end MouseAdapter
150
151          ); // end call to addMouseListener
152
153       } // end for loop
154
155    } // end method makeKeys
156


157    // set up configuration controls
158    private void makeConfigureControls()
159    {
160       JPanel configurePanel =
161          new JPanel( new GridLayout( 5, 1 ) );
162
163       controlPanel.add( configurePanel, BorderLayout.WEST );
164
165       instrumentBox = new JComboBox(
166          midiSynthesizer.getInstruments() );
167
168       configurePanel.add( instrumentBox );
169
170       // register an ActionListener for instrumentBox events
171       instrumentBox.addActionListener(
172
173          // anonymous inner class to handle instrument selector
174          new ActionListener() {
175
176             // change current instrument program
177             public void actionPerformed( ActionEvent event )
178             {
179                // change instrument in synthesizer
180                midiSynthesizer.changeInstrument(
181                   instrumentBox.getSelectedIndex() );
182             }
183
184          } // end ActionListener
185
186       ); // end call to method addActionListener
187
188       JLabel volumeLabel = new JLabel( "volume" );
189       configurePanel.add( volumeLabel );
190
191       volumeSlider = new JSlider(
192          SwingConstants.HORIZONTAL, 5, 80, 30 );
193
194       // register a ChangeListener for slider change events
195       volumeSlider.addChangeListener(
196
197          // anonymous inner class to handle volume slider events
198          new ChangeListener() {
199
200             // change volume
201             public void stateChanged( ChangeEvent changeEvent )
202             {
203                midiVolume = volumeSlider.getValue();
204             }
205
206          } // end class ChangeListener
207
208       ); // end call to method addChangeListener


209
210       configurePanel.add( volumeSlider );
211
212       JLabel tempLabel = new JLabel( "tempo" );
213       configurePanel.add( tempLabel );
214
215       resolutionSlider = new JSlider(
216          SwingConstants.HORIZONTAL, 1, 10, 1 );
217
218       // register a ChangeListener slider for change events
219       resolutionSlider.addChangeListener(
220
221          // anonymous inner class to handle tempo slider events
222          new ChangeListener() {
223
224             // change resolution if value changed
225             public void stateChanged( ChangeEvent changeEvent )
226             {
227                resolution = resolutionSlider.getValue();
228             }
229
230          } // end ChangeListener
231
232       ); // end call to method addChangeListener
233
234       resolutionSlider.setEnabled( false );
235       configurePanel.add( resolutionSlider );
236
237    } // end method makeConfigureControls
238
239    // set up play and save buttons
240    private void makePlaySaveButtons()
241    {
242       playButton = new JButton( "Playback" );
243
244       // register an ActionListener for playButton events
245       playButton.addActionListener(
246
247          // anonymous inner class to handle playButton event
248          new ActionListener() {
249
250             // playback last recorded MIDI
251             public void actionPerformed( ActionEvent event )
252             {
253                if ( midiRecord != null )
254                   midiRecord.play();
255             }
256
257          } // end ActionListener
258
259       ); // end call to method addActionListener
260


261       buttonPanel.add( playButton );
262       playButton.setEnabled( false );
263
264       listenButton = new JButton( "Play MIDI" );
265
266       // register an ActionListener for listenButton events
267       listenButton.addActionListener(
268
269          // anonymous inner class to handle listenButton events
270          new ActionListener() {
271
272             // playback MIDI file
273             public void actionPerformed( ActionEvent event )
274             {
275                File midiFile = getFile();
276
277                if ( midiFile == null )
278                   return;
279
280                midiData = new MidiData();
281
282                // prepare MIDI track
283                if ( midiData.initialize( midiFile ) == false )
284                   return;
285
286                // play MIDI data
287                midiData.play();
288             }
289
290          } // end ActionListener
291
292       ); // end call to method addActionListener
293
294       buttonPanel.add( listenButton );
295
296       saveButton = new JButton( "Save MIDI" );
297
298       // register an ActionListener for saveButton events
299       saveButton.addActionListener(
300
301          // anonymous inner class to handle saveButton events
302          new ActionListener() {
303
304             // get save file and save recorded MIDI
305             public void actionPerformed( ActionEvent event )
306             {
307                File saveFile = getSaveFile();
308
309                if ( saveFile != null )
310                   midiRecord.saveSequence( saveFile );
311             }
312


313          } // end ActionListener
314
315       ); // end call to method addActionListener
316
317       buttonPanel.add( saveButton );
318       saveButton.setEnabled( false );
319
320    } // end method makePlaySaveButtons
321
322    // make recording button
323    private void makeRecordButton()
324    {
325       recordButton = new JButton( "Record" );
326
327       // register an ActionListener for recordButton events
328       recordButton.addActionListener(
329
330          // anonymous inner class to handle recordButton events
331          new ActionListener() {
332
333             // start or stop recording
334             public void actionPerformed( ActionEvent event )
335             {
336                // record MIDI when button is "record" button
337                if ( recordButton.getText().equals( "Record" ) ) {
338
339                   if ( midiRecord == null ) {
340
341                      // create new instance of recorder
342                      // by passing in synthesizer transmitter
343                      midiRecord = new MidiRecord(
344                         midiSynthesizer.getTransmitter() );
345
346                      if ( midiRecord.initialize() == false )
347                         return;
348                   }
349
350                   else
351                      midiRecord.makeTrack();
352
353                   midiRecord.startRecord();
354
355                   // disable playback during recording
356                   playButton.setEnabled( false );
357
358                   // change recording button to stop
359                   recordButton.setText( "Stop" );
360                   recording = true;
361
362                } // end if
363


364                // stop recording when button is "stop" button
365                else {
366                   midiRecord.stopRecord();
367
368                   recordButton.setText( "Record" );
369                   recording = false;
370
371                   playButton.setEnabled( true );
372                   saveButton.setEnabled( true );
373                }
374
375             } // end method actionPerformed
376
377          } // end ActionListener
378
379       ); // end call to method addActionListener
380
381       buttonPanel.add( recordButton );
382
383    } // end method makeRecordButton
384
385    // create Piano Player button and functionality
386    private void makePianoPlayerButton()
387    {
388       pianoPlayerButton = new JButton( "Piano Player" );
389
390       // register an ActionListener for pianoPlayerButton events
391       pianoPlayerButton.addActionListener(
392
393          // anonymous inner class to handle pianoPlayerButton
394          new ActionListener() {
395
396             // initialize MIDI data and piano player timer
397             public void actionPerformed( ActionEvent event )
398             {
399                File midiFile = getFile();
400
401                if ( midiFile == null )
402                   return;
403
404                midiData = new MidiData();
405
406                // prepare MIDI track
407                if ( midiData.initialize( midiFile ) == false )
408                   return;
409
410                if ( midiData.initializeTrack() == false )
411                   return;
412
413                // set initial resolution from MIDI
414                resolution = midiData.getResolution();
415


416                // new instance of timer for handling
417                // piano sounds and key pressing with tempo
418                pianoTimer = new Timer(
419                   midiData.getEventDelay() * resolution,
420                   new TimerHandler() );
421
422                listenButton.setEnabled( false );
423                pianoPlayerButton.setEnabled( false );
424                resolutionSlider.setEnabled( true );
425
426                pianoTimer.start();
427
428             } // end method actionPerformed
429
430          } // end ActionListener
431
432       ); // end call to method addActionListener
433
434       buttonPanel.add( pianoPlayerButton );
435
436    } // end method makePianoPlayerButton
437
438    // inner class handles MIDI timed events
439    private class TimerHandler implements ActionListener {
440
441       // simulate key note of event if present, jump to next
442       // event in track and set next delay interval of timer
443       // method invoked when timer reaches next event time
444       public void actionPerformed( ActionEvent actionEvent )
445       {
446          // if valid last key on, set it white
447          if ( lastKeyOn != -1 )
448             noteButton[ lastKeyOn ].setBackground(
449                Color.white );
450
451          noteAction();
452          midiData.goNextEvent();
453
454          // stop piano player when end of MIDI track
455          if ( midiData.isTrackEnd() == true ) {
456
457             if ( lastKeyOn != -1 )
458                noteButton[ lastKeyOn ].setBackground(
459                   Color.white );
460
461             pianoTimer.stop();
462
463             listenButton.setEnabled( true );
464             pianoPlayerButton.setEnabled( true );
465             resolutionSlider.setEnabled( false );
466
467             return;


468
469          } // end if isTrackEnd
470
471          // set interval before next sounding event
472          pianoTimer.setDelay(
473             midiData.getEventDelay() * resolution );
474
475       } // end actionPerformed method
476
477    } // end inner class TimerHandler
478
479    // determine which note to sound
480    // according to MIDI messages
481    private void noteAction()
482    {
483       // during Note On message, sound note and press key
484       if ( midiData.getEventCommand() ==
485          ShortMessage.NOTE_ON ) {
486
487          // make sure valid note is in range of keys
488          if ( ( midiData.getNote() >= FIRST_NOTE ) &&
489             ( midiData.getNote() < FIRST_NOTE + MAX_KEYS ) ) {
490
491             lastKeyOn = midiData.getNote() - FIRST_NOTE;
492
493             // set key color to red
494             noteButton[ lastKeyOn ].setBackground( Color.red );
495
496             // send and sound note through synthesizer
497             midiSynthesizer.sendMessage( 144,
498                midiData.getNote(), midiData.getVolume() );
499
500          } // end if
501
502          // else no last key pressed
503          else
504             lastKeyOn = -1;
505
506       } // end if
507
508       // receiving Note Off message will sound off note
509       // and change key color back to white
510       else
511
512          // if message command is note off
513          if ( midiData.getEventCommand() ==
514             ShortMessage.NOTE_OFF ) {
515
516             if ( ( midiData.getNote() >= FIRST_NOTE ) &&
517                ( midiData.getNote() < FIRST_NOTE + MAX_KEYS ) ) {
518


519                // set appropriate key to white
520                noteButton[ midiData.getNote() -
521                   FIRST_NOTE ].setBackground( Color.white );
522
523                // send note off message to receiver
524                midiSynthesizer.sendMessage( 128,
525                   midiData.getNote(), midiData.getVolume() );
526             }
527
528          } // end if
529
530    } // end method noteAction
531
532    // get save file from computer
533    public File getSaveFile()
534    {
535       JFileChooser fileChooser = new JFileChooser();
536
537       fileChooser.setFileSelectionMode(
538          JFileChooser.FILES_ONLY );
539       int result = fileChooser.showSaveDialog( this );
540
541       if ( result == JFileChooser.CANCEL_OPTION )
542          return null;
543
544       else
545          return fileChooser.getSelectedFile();
546    }
547
548    // get file from computer
549    public File getFile()
550    {
551       JFileChooser fileChooser = new JFileChooser();
552
553       fileChooser.setFileSelectionMode(
554          JFileChooser.FILES_ONLY );
555       int result = fileChooser.showOpenDialog( this );
556
557       if ( result == JFileChooser.CANCEL_OPTION )
558          return null;
559
560       else
561          return fileChooser.getSelectedFile();
562    }
563
564    // execute application
565    public static void main( String args[] )
566    {
567       MidiDemo midiTest = new MidiDemo();
568
569       midiTest.setSize( 711, 225 );
570       midiTest.setDefaultCloseOperation( EXIT_ON_CLOSE );


571       midiTest.setVisible( true );
572    }
573
574 } // end class MidiDemo

[Fig. 22.10, parts 13 and 14: screen captures of the running MidiDemo application.]

Look-and-Feel Observation 22.3
To arrange GUI components at specific locations without the help of layout managers, set the layout of the panel containing the components to null. By default, a JPanel sets a FlowLayout LayoutManager when the panel is instantiated with no arguments.
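The observation above can be sketched with a minimal panel builder (the class and method names are illustrative, not part of the chapter’s code):

```java
import java.awt.Color;
import javax.swing.JButton;
import javax.swing.JPanel;

public class NullLayoutSketch {

   // build a panel whose children are positioned absolutely;
   // passing null to the constructor disables the default FlowLayout
   public static JPanel buildKeyPanel( int keys )
   {
      JPanel keyPanel = new JPanel( null );

      for ( int i = 0; i < keys; i++ ) {
         JButton key = new JButton();
         key.setBackground( Color.white );

         // without a layout manager, each child needs explicit bounds:
         // x-position, y-position, width and height
         key.setBounds( i * 11, 1, 11, 40 );
         keyPanel.add( key );
      }

      return keyPanel;
   }

   public static void main( String args[] )
   {
      JPanel panel = buildKeyPanel( 3 );
      System.out.println( panel.getComponent( 2 ).getBounds() );
   }
}
```

With a null layout the panel never repositions its children, so each button stays exactly where setBounds placed it—the technique makeKeys relies on.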

Lines 110–151 register MouseListeners for each piano-key button. The program calls method mouseEntered (lines 116–132) when the mouse hovers over that button. If the program is not in recording mode, method mouseEntered directly accesses the channels in class MidiSynthesizer to sound the note (lines 125–127). Otherwise, method mouseEntered invokes MidiSynthesizer’s sendMessage method to send a note message to the synthesizer and to the recording device (lines 119–122). Lines 130–131 set the button’s background color to blue to indicate that the note is being played. When the mouse is no longer hovering over the button, the program calls method mouseExited (lines 135–147) to turn off the note and change the button’s background to its original color.

Of the 128 possible notes, only the middle 64 are accessible via the piano in this example. The range of notes can be changed by modifying the constants in line 52. Constant FIRST_NOTE is the value of the first key, and FIRST_NOTE + MAX_KEYS is one beyond the value of the last key. Constant MAX_KEYS specifies the number of piano keys to create.
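The key-to-note arithmetic implied by these constants can be checked with a short sketch (the helper names here are illustrative):

```java
public class NoteRangeSketch {

   // same values as line 52 of Fig. 22.10
   private static int FIRST_NOTE = 32, MAX_KEYS = 64;

   // key index i (0-based) sounds MIDI note i + FIRST_NOTE
   public static int keyToNote( int keyIndex )
   {
      return keyIndex + FIRST_NOTE;
   }

   // MIDI defines note numbers 0-127; a note is playable on the
   // simulated piano only within [FIRST_NOTE, FIRST_NOTE + MAX_KEYS)
   public static boolean inRange( int note )
   {
      return note >= FIRST_NOTE && note < FIRST_NOTE + MAX_KEYS;
   }

   public static void main( String args[] )
   {
      System.out.println( keyToNote( 0 ) );             // first key: 32
      System.out.println( keyToNote( MAX_KEYS - 1 ) );  // last key: 95
      System.out.println( inRange( 96 ) );              // just out of range
   }
}
```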

Class MidiDemo invokes method makeConfigureControls (lines 158–237) to set up the program’s MIDI controls, which consist of an instrument-selector JComboBox, a user-synthesis volume JSlider and a “piano player” tempo JSlider. When users select an instrument, the program calls instrumentBox’s actionPerformed method (lines 177–182) to change to the selected instrument program by invoking MidiSynthesizer method changeInstrument with the selected instrument index as the argument.

When users drag the volume slider, the program calls volumeSlider’s stateChanged method (lines 201–204) to change the volume. Note that changing the volume affects only the volume of user-synthesized MIDI notes. When users drag the tempo slider, the program calls resolutionSlider’s stateChanged method (lines 225–228) to set the tempo.
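The slider-to-state wiring used by both sliders follows one pattern, sketched below in isolation (the class name and the static midiVolume field are illustrative; the chapter stores the field on the MidiDemo instance instead):

```java
import javax.swing.JSlider;
import javax.swing.SwingConstants;
import javax.swing.event.ChangeEvent;
import javax.swing.event.ChangeListener;

public class VolumeSliderSketch {

   // state updated by the listener; initial value matches line 46
   private static int midiVolume = 40;

   public static JSlider buildVolumeSlider()
   {
      // horizontal slider: minimum 5, maximum 80, initial value 30,
      // matching lines 191-192 of Fig. 22.10
      final JSlider volumeSlider =
         new JSlider( SwingConstants.HORIZONTAL, 5, 80, 30 );

      // ChangeListener fires whenever the slider's value changes
      volumeSlider.addChangeListener(
         new ChangeListener() {
            public void stateChanged( ChangeEvent changeEvent )
            {
               midiVolume = volumeSlider.getValue();
            }
         }
      );

      return volumeSlider;
   }

   public static int getMidiVolume()
   {
      return midiVolume;
   }

   public static void main( String args[] )
   {
      JSlider slider = buildVolumeSlider();

      // a programmatic change also fires the listener synchronously
      slider.setValue( 55 );
      System.out.println( getMidiVolume() );
   }
}
```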

Invoking method makePlaySaveButtons (lines 240–320) sets up the Play MIDI, Playback and Save buttons. Clicking Play MIDI invokes the actionPerformed method of the listenButton (lines 273–288) to play back an opened MIDI file in its entirety using class MidiData. Line 275 obtains a file from a file-chooser dialog using MidiDemo method getFile (lines 549–562). Lines 280–287 initialize and play the MIDI file using the instantiated midiData object. When the user clicks Playback, line 254 plays the recorded MIDI. This button is enabled only if some recording has taken place. Clicking the Save button allows the user to save the recorded sequence to a file (lines 307–310).

Method makeRecordButton (lines 323–383) creates the Record button and a listener for it. Clicking the button when it is set to recording mode (line 337) creates a new recorder using class MidiRecord (lines 339–348). If a recorder has already been created, line 351 invokes MidiRecord’s makeTrack method to make a new track for object midiRecord. When recording starts (line 353), lines 356–360 turn the record button into a stop button and temporarily disable the playButton. When users stop the recording by clicking the button again, the GUI returns to its state prior to recording, and the user can play back and save the MIDI sequence (lines 365–373).


“Piano Player”
The “piano player” feature of this example synchronizes the playing of a note with the highlighting of the corresponding key. The program first obtains a MIDI track from a user-specified MIDI file using class MidiData. A Timer synchronizes the MIDI events in the MIDI sequence. When the Timer reaches a MIDI event’s time stamp, it resets its delay to the time stamp of the next event to stay synchronized. The delay of a timer is the period of time it waits before it fires an event.

The “piano player driver” responsible for synchronization is located in main class MidiDemo. Method makePianoPlayerButton (lines 386–436) loads the MIDI file and initializes the Timer that handles the timing of musical notes. As with the listenButton, method actionPerformed (lines 397–428) of the pianoPlayerButton uses class MidiData to load MIDI data. Lines 399–408 open a file from a file dialog box and load the MIDI data from the file. Line 410 invokes MidiData’s initializeTrack method to obtain the longest track from the loaded MIDI and the first MIDI event message from that track.

Line 414 invokes MidiData’s getResolution method (lines 67–70 in Fig. 22.7) to obtain the default tempo of the “piano player.” Lines 418–420 instantiate a new Timer object that drives the playing of notes at the time of each MIDI event. Line 419 sets the delay of the timer to the duration of the first MidiEvent in the track by invoking MidiData’s getEventDelay method. To allow user-specified tempo changes to the “piano player,” line 424 enables the tempo slider, and line 419 multiplies the timer’s delay by the value of resolution. The resolution variable, which specifies the tempo of the “piano player,” is multiplied by the event-interval time to obtain the new interval time used by pianoTimer. The pianoTimer’s delay is set initially only for the first event’s duration, to start the timer. Later, inner class TimerHandler’s (lines 439–477) actionPerformed method (lines 444–475) resets the timer’s delay to the next MidiEvent’s duration (lines 472–473). Line 426 starts the pianoTimer.
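The initial-delay and reset-delay steps described above can be sketched as follows. The helper method nextTimerDelay is hypothetical (the chapter’s MidiData supplies getEventDelay, which is not reproduced here):

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import javax.swing.Timer;

public class TempoDelaySketch {

   // the "piano player" waits eventDelay * resolution milliseconds
   // between MIDI events; a larger resolution slows the tempo
   public static int nextTimerDelay( int eventDelay, int resolution )
   {
      return eventDelay * resolution;
   }

   public static void main( String args[] )
   {
      // the timer's first firing uses the first event's scaled delay
      Timer pianoTimer = new Timer(
         nextTimerDelay( 120, 1 ),
         new ActionListener() {
            public void actionPerformed( ActionEvent actionEvent )
            {
               // a real handler would sound the event's note here,
               // then reset the delay for the next event
            }
         } );

      // before each subsequent event, reset the delay as lines
      // 472-473 of Fig. 22.10 do
      pianoTimer.setDelay( nextTimerDelay( 120, 4 ) );
      System.out.println( pianoTimer.getDelay() );
   }
}
```

Because setDelay affects only firings after the next one is scheduled, resetting it inside the handler (as TimerHandler does) keeps each interval matched to the upcoming event.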

At the time of the next MIDI event, class TimerHandler’s actionPerformed method synthesizes a note from the track and “presses” a key on the piano. Using pianoTimer and its event handler, the program iterates through the track, simulating note events when it encounters appropriate MidiEvents. Inner class TimerHandler drives the synchronization by iterating through all the MidiEvents on a given track and calling method actionPerformed to play the piano. Line 451 of method actionPerformed invokes utility method noteAction (lines 481–530) to sound the note and to change the color of the specific piano key, given that the event contains a note message and that the note is within the range designated for the piano keys in this example.

Lines 484–485 of method noteAction determine whether the current MidiEvent contains a note command by invoking method getEventCommand of class MidiData. Lines 488–489 invoke MidiData method getNote to determine whether the note number specified in the note message is a valid note within the range of possible piano keys, specified by constants FIRST_NOTE and MAX_KEYS (line 52).

If the command is a ShortMessage.NOTE_ON command and the note is within range of the piano keys, the synthesizer receives the specified note message (lines 497–498) and the corresponding piano key’s color becomes red for the duration of the event (line 494). Line 491 obtains the corresponding piano-button number (lastKeyOn) that should become red.


If the command is ShortMessage.NOTE_OFF and the note is within the allowable piano-key range (lines 513–517), lines 520–521 change the background of the specified piano key back to white. Lines 524–525 send the ShortMessage to the synthesizer so that the note stops sounding. Because not every NOTE_ON ShortMessage is followed by a NOTE_OFF ShortMessage, as one would expect, the program needs to change the last NOTE_ON key back to its original color at the time of the next event. For that purpose, method noteAction assigns lastKeyOn the number of the last piano button invoked. The lastKeyOn value, initialized to –1, remains –1 if the NOTE_ON command’s note is out of range. This limits access to keys in the range of our simulated keyboard. When pianoTimer reaches the next event, the program changes the background of the last “pressed” piano key back to white (lines 447–449).
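One common reason a NOTE_ON is not paired with a NOTE_OFF is a widely used MIDI convention (not discussed in the chapter’s code): many files encode “note off” as a NOTE_ON message with velocity 0. A sketch of a check for this convention (class and method names are illustrative):

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

public class NoteOffSketch {

   // treat both a true NOTE_OFF and a NOTE_ON with velocity 0
   // (a common MIDI file convention) as "note off"
   public static boolean isEffectiveNoteOff( ShortMessage message )
   {
      return message.getCommand() == ShortMessage.NOTE_OFF ||
         ( message.getCommand() == ShortMessage.NOTE_ON &&
           message.getData2() == 0 );
   }

   // build a NOTE_ON on channel 0 with the given velocity
   public static ShortMessage noteOn( int note, int velocity )
   {
      try {
         ShortMessage message = new ShortMessage();
         message.setMessage( ShortMessage.NOTE_ON, 0, note, velocity );
         return message;
      }
      catch ( InvalidMidiDataException invalidData ) {
         throw new IllegalArgumentException( invalidData.toString() );
      }
   }

   public static void main( String args[] )
   {
      System.out.println( isEffectiveNoteOff( noteOn( 60, 0 ) ) );
      System.out.println( isEffectiveNoteOff( noteOn( 60, 93 ) ) );
   }
}
```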

When the program finishes executing method noteAction, line 452 invokes MidiData method goNextEvent to transition to the next event in the track. Each time the handler’s actionPerformed method finishes loading the next event, line 455 determines whether that event is the last event in the track by invoking MidiData method isTrackEnd, assuming that the last event is the end-of-track MetaEvent. If it is, lines 457–459 change the background color of the last “pressed” key to white and line 461 stops the pianoTimer. Lines 463–465 re-enable the buttons that were disabled during the “piano player” feature.

22.8 Internet and World Wide Web Resources
This section presents several Internet and Web resources for the Java Media Framework and other multimedia-related sites.

java.sun.com/products/java-media/jmf/
The Java Media Framework home page on the Java Web site. Here you can download the latest Sun implementation of the JMF. The site also contains the documentation for the JMF.

www.nasa.gov/gallery/index.html
The NASA multimedia gallery contains a wide variety of images, audio clips and video clips that you can download and use to test your Java multimedia programs.

sunsite.sut.ac.jp/multimed/
The Sunsite Japan Multimedia Collection also provides a wide variety of images, audio clips and video clips that you can download for educational purposes.

www.anbg.gov.au/anbg/index.html
The Australian National Botanic Gardens Web site provides links to sounds of many animals. Try the Common Birds link.

www.midi.com
Midi.com is a MIDI resource site with a MIDI search engine, links to other MIDI sites, a list of MIDI-related books and other MIDI information.

www.streamingmedia.com
Streamingmedia.com provides many articles on the streaming-media industry and technical information on streaming-media technology.

www.harmony-central.com/MIDI/
Harmony Central’s MIDI resources section contains many useful MIDI documents, links and forums that can be useful to a MIDI programmer.


22.9 (Optional Case Study) Thinking About Objects: Animation and Sound in the View
This case study has focused mainly on the elevator-system model. Now that we have completed our design of the model, we turn our attention to the view, which provides the visual presentation of the model. In our case study, the view—called ElevatorView—is a JPanel object containing other JPanel “child” objects, each representing a unique object in the model (e.g., a Person, a Button, the Elevator). Class ElevatorView is the largest class in the case study. In this section, we discuss the graphics and sound classes used by class ElevatorView. We present and explain the remainder of the code in this class in Appendix I.

In Section 3.7, we constructed a class diagram for our model by locating the nouns and noun phrases in the problem statement of Section 2.7. We ignored several of these nouns because they were not associated with the model. Now, we list the nouns and noun phrases that apply to displaying the model:

• display

• audio

• elevator music

The noun “display” corresponds to the view, or the visual presentation of the model. As described in Section 13.17, class ElevatorView aggregates several classes comprising the view. The “audio” refers to the sound effects that our simulation generates when various actions occur—we create class SoundEffects to generate these sound effects. The phrase “elevator music” refers to the music played as the Person rides in the Elevator—we create class ElevatorMusic to play this music.

The view must display all objects in the model. We create class ImagePanel to represent stationary objects in the model, such as the ElevatorShaft. We create class MovingPanel, which extends ImagePanel, to represent moving objects, such as the Elevator. Lastly, we create class AnimatedPanel, which extends MovingPanel, to represent moving objects whose corresponding images change continuously, such as a Person (we use several frames of animation to show the Person walking, then pressing a button). Using these classes, we present the class diagram of the view for our simulation in Fig. 22.9.

The notes indicate the roles that the classes play in the system. According to the class diagram, class ElevatorView represents the view; classes ImagePanel, MovingPanel and AnimatedPanel relate to the graphics; and classes SoundEffects and ElevatorMusic relate to the audio. Class ElevatorView contains several instances of classes ImagePanel, MovingPanel and AnimatedPanel and one instance each of classes SoundEffects and ElevatorMusic. In Appendix I, we associate each object in the model with a corresponding class in the view.

In this section, we discuss classes ImagePanel, MovingPanel and AnimatedPanel to explain the graphics and animation. We then discuss classes SoundEffects and ElevatorMusic to explain the audio functionality.

ImagePanel
The ElevatorView uses objects of JPanel subclasses to represent and display each object in the model (such as the Elevator, a Person, the ElevatorShaft, etc.).


Class ImagePanel (Fig. 22.10) is a JPanel subclass capable of displaying an image at a given screen position. The ElevatorView uses ImagePanel objects to represent stationary objects in the model, such as the ElevatorShaft and the two Floors. Class ImagePanel contains an integer attribute—ID (line 16)—that defines a unique identifier used to track the ImagePanel in the view if necessary. This tracking is useful when several objects of the same class exist in the model, such as several Person objects. Class ImagePanel contains Point2D.Double object position (line 19) to represent the ImagePanel’s screen position. We will see later that MovingPanel, which extends ImagePanel, defines velocity with doubles—using type double yields a highly accurate velocity and position. We cast the position coordinates to ints to place the ImagePanel on screen (Java represents screen coordinates as ints) in method setPosition (lines 90–94). Class ImagePanel also contains an ImageIcon object called imageIcon (line 22)—method paintComponent (lines 54–60) displays imageIcon on screen. Lines 41–42 initialize imageIcon using a String parameter holding the name of the image. Lastly, class ImagePanel contains Set panelChildren (line 25), which stores any child objects of class ImagePanel (or objects of a subclass of ImagePanel). The child objects are displayed on top of their parent ImagePanel—for example, a Person riding inside the Elevator. The first method add (lines 63–67) appends an object to panelChildren. The second method add (lines 70–74) inserts an object into panelChildren at a given index. Method setIcon (lines 84–87) sets imageIcon to a new image. Objects of class AnimatedPanel use method setIcon repeatedly to change the image displayed, which creates the animation in the view—we discuss animation later in this section.
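The double-precision position with int truncation described above can be sketched on its own (the class and method names are illustrative, not the chapter’s ImagePanel):

```java
import java.awt.geom.Point2D;

public class PositionSketch {

   // positions accumulate as doubles for accuracy; the screen
   // location is obtained by truncating to ints, because Java
   // represents screen coordinates as ints
   public static int[] toScreen( Point2D.Double position )
   {
      return new int[] {
         ( int ) position.getX(), ( int ) position.getY() };
   }

   public static void main( String args[] )
   {
      Point2D.Double position = new Point2D.Double( 0, 0 );

      // fractional velocity increments would be lost if the
      // position itself were stored as ints
      position.setLocation(
         position.getX() + 2.75, position.getY() + 1.5 );

      int[] screen = toScreen( position );
      System.out.println( screen[ 0 ] + ", " + screen[ 1 ] );
   }
}
```

Keeping the authoritative position in doubles and truncating only at display time is why MovingPanel can move smoothly at sub-pixel velocities.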

Fig. 22.9 Class diagram of elevator simulation view.

[Diagram: the view package contains ElevatorView, which is composed of ImagePanel, MovingPanel and AnimatedPanel objects (package graphics) and of SoundEffects and ElevatorMusic objects (package audio).]


1 // ImagePanel.java
2 // JPanel subclass for positioning and displaying ImageIcon
3 package com.deitel.jhtp4.elevator.view;
4
5 // Java core packages
6 import java.awt.*;
7 import java.awt.geom.*;
8 import java.util.*;
9

10 // Java extension packages
11 import javax.swing.*;
12
13 public class ImagePanel extends JPanel {
14
15    // identifier
16    private int ID;
17
18    // on-screen position
19    private Point2D.Double position;
20
21    // imageIcon to paint on screen
22    private ImageIcon imageIcon;
23
24    // stores all ImagePanel children
25    private Set panelChildren;
26
27    // constructor initializes position and image
28    public ImagePanel( int identifier, String imageName )
29    {
30       super( null ); // specify null layout
31       setOpaque( false ); // make transparent
32
33       // set unique identifier
34       ID = identifier;
35
36       // set location
37       position = new Point2D.Double( 0, 0 );
38       setLocation( 0, 0 );
39
40       // create ImageIcon with given imageName
41       imageIcon = new ImageIcon(
42          getClass().getResource( imageName ) );
43
44       Image image = imageIcon.getImage();
45       setSize(
46          image.getWidth( this ), image.getHeight( this ) );
47
48       // create Set to store Panel children
49       panelChildren = new HashSet();
50
51    } // end ImagePanel constructor
52

Fig. 22.10 Class ImagePanel represents and displays a stationary object from the model (part 1 of 3).


53    // paint Panel to screen
54    public void paintComponent( Graphics g )
55    {
56       super.paintComponent( g );
57
58       // if image is ready, paint it to screen
59       imageIcon.paintIcon( this, g, 0, 0 );
60    }
61
62    // add ImagePanel child to ImagePanel
63    public void add( ImagePanel panel )
64    {
65       panelChildren.add( panel );
66       super.add( panel );
67    }
68
69    // add ImagePanel child to ImagePanel at given index
70    public void add( ImagePanel panel, int index )
71    {
72       panelChildren.add( panel );
73       super.add( panel, index );
74    }
75
76    // remove ImagePanel child from ImagePanel
77    public void remove( ImagePanel panel )
78    {
79       panelChildren.remove( panel );
80       super.remove( panel );
81    }
82
83    // sets current ImageIcon to be displayed
84    public void setIcon( ImageIcon icon )
85    {
86       imageIcon = icon;
87    }
88
89    // set on-screen position
90    public void setPosition( double x, double y )
91    {
92       position.setLocation( x, y );
93       setLocation( ( int ) x, ( int ) y );
94    }
95
96    // return ImagePanel identifier
97    public int getID()
98    {
99       return ID;
100    }
101
102    // get position of ImagePanel
103    public Point2D.Double getPosition()
104    {

Fig. 22.10 Class ImagePanel represents and displays a stationary object from the model (part 2 of 3).


MovingPanel
Class MovingPanel (Fig. 22.11) is an ImagePanel subclass capable of changing its screen position according to its xVelocity and yVelocity (lines 20–21). The ElevatorView uses MovingPanel objects to represent moving objects from the model, such as the Elevator.

105       return position;
106    }
107
108    // get imageIcon
109    public ImageIcon getImageIcon()
110    {
111       return imageIcon;
112    }
113
114    // get Set of ImagePanel children
115    public Set getChildren()
116    {
117       return panelChildren;
118    }
119 }

1 // MovingPanel.java
2 // JPanel subclass with on-screen moving capabilities
3 package com.deitel.jhtp4.elevator.view;
4
5 // Java core packages
6 import java.awt.*;
7 import java.awt.geom.*;
8 import java.util.*;
9

10 // Java extension packages
11 import javax.swing.*;
12
13 public class MovingPanel extends ImagePanel {
14
15    // should MovingPanel change position?
16    private boolean moving;
17
18    // number of pixels MovingPanel moves in both x and y values
19    // per animationDelay milliseconds
20    private double xVelocity;
21    private double yVelocity;
22

Fig. 22.11 Class MovingPanel represents and displays a moving object from the model (part 1 of 3).

Fig. 22.10 Class ImagePanel represents and displays a stationary object from the model (part 3 of 3).


23    // constructor initializes position, velocity and image
24    public MovingPanel( int identifier, String imageName )
25    {
26       super( identifier, imageName );
27
28       // set MovingPanel velocity
29       xVelocity = 0;
30       yVelocity = 0;
31
32    } // end MovingPanel constructor
33
34    // update MovingPanel position and animation frame
35    public void animate()
36    {
37       // update position according to MovingPanel velocity
38       if ( isMoving() ) {
39          double oldXPosition = getPosition().getX();
40          double oldYPosition = getPosition().getY();
41
42          setPosition( oldXPosition + xVelocity,
43             oldYPosition + yVelocity );
44       }
45
46       // update all children of MovingPanel
47       Iterator iterator = getChildren().iterator();
48
49       while ( iterator.hasNext() ) {
50          MovingPanel panel = ( MovingPanel ) iterator.next();
51          panel.animate();
52       }
53    } // end method animate
54
55    // is MovingPanel moving on screen?
56    public boolean isMoving()
57    {
58       return moving;
59    }
60
61    // set MovingPanel to move on screen
62    public void setMoving( boolean move )
63    {
64       moving = move;
65    }
66
67    // set MovingPanel x and y velocity
68    public void setVelocity( double x, double y )
69    {
70       xVelocity = x;
71       yVelocity = y;
72    }
73

Fig. 22.11 Class MovingPanel represents and displays a moving object from the model (part 2 of 3).


Method animate (lines 35–53) moves the MovingPanel according to the current values of attributes xVelocity and yVelocity. If boolean variable moving (line 16) is true, lines 38–44 use attributes xVelocity and yVelocity to determine the next location for the MovingPanel. Lines 47–52 repeat the process for any children. In our simulation, ElevatorView invokes method animate and method paintComponent of class ImagePanel every 50 milliseconds. These rapid, successive calls move the MovingPanel object.
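The 50-millisecond update cycle can be imitated with a javax.swing.Timer. The class below is a hypothetical stand-in for MovingPanel's position logic, not the book's code—it only shows the shape of a timer-driven animate loop:

```java
import javax.swing.Timer;

// Illustrative sketch: a Timer invokes animate() every 50 ms,
// the way ElevatorView repeatedly drives its panels.
public class AnimationTimerSketch {
    private double x = 0;
    private final double xVelocity = 2.5;

    // one animation step: corresponds to MovingPanel's animate(),
    // which adds the velocity to the current position
    public void animate() { x += xVelocity; }

    public double getX() { return x; }

    public static void main( String[] args ) {
        AnimationTimerSketch sketch = new AnimationTimerSketch();

        // every 50 ms, advance the position; a real view would
        // also call repaint() here to trigger paintComponent
        Timer animationTimer =
            new Timer( 50, event -> sketch.animate() );
        animationTimer.start();
    }
}
```

Keeping the movement step in a separate animate method, as the book does, makes the per-tick logic reusable and easy to test apart from the timer.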

AnimatedPanel
Class AnimatedPanel (Fig. 22.12), which extends class MovingPanel, represents an animated object from the model (i.e., a moving object whose corresponding image changes continuously), such as a Person. The ElevatorView animates an AnimatedPanel object by changing the image associated with imageIcon.

74    // return MovingPanel x velocity
75    public double getXVelocity()
76    {
77       return xVelocity;
78    }
79
80    // return MovingPanel y velocity
81    public double getYVelocity()
82    {
83       return yVelocity;
84    }
85 }

1 // AnimatedPanel.java
2 // MovingPanel subclass with animation capabilities
3 package com.deitel.jhtp4.elevator.view;
4
5 // Java core packages
6 import java.awt.*;
7 import java.util.*;
8
9 // Java extension packages

10 import javax.swing.*;
11
12 public class AnimatedPanel extends MovingPanel {
13
14    // should ImageIcon cycle frames
15    private boolean animating;
16
17    // frame cycle rate (i.e., rate advancing to next frame)
18    private int animationRate;

Fig. 22.12 Class AnimatedPanel represents and displays an animated object from the model (part 1 of 4).

Fig. 22.11 Class MovingPanel represents and displays a moving object from the model (part 3 of 3).


19    private int animationRateCounter;
20    private boolean cycleForward = true;
21
22    // individual ImageIcons used for animation frames
23    private ImageIcon imageIcons[];
24
25    // storage for all frame sequences
26    private java.util.List frameSequences;
27    private int currentAnimation;
28
29    // should loop (continue) animation at end of cycle?
30    private boolean loop;
31
32    // should animation display last frame at end of animation?
33    private boolean displayLastFrame;
34
35    // helps determine next displayed frame
36    private int currentFrameCounter;
37
38    // constructor takes array of filenames and screen position
39    public AnimatedPanel( int identifier, String imageName[] )
40    {
41       super( identifier, imageName[ 0 ] );
42
43       // creates ImageIcon objects from imageName string array
44       imageIcons = new ImageIcon[ imageName.length ];
45
46       for ( int i = 0; i < imageIcons.length; i++ ) {
47          imageIcons[ i ] = new ImageIcon(
48             getClass().getResource( imageName[ i ] ) );
49       }
50
51       frameSequences = new ArrayList();
52
53    } // end AnimatedPanel constructor
54
55    // update icon position and animation frame
56    public void animate()
57    {
58       super.animate();
59
60       // play next animation frame if counter > animation rate
61       if ( frameSequences != null && isAnimating() ) {
62
63          if ( animationRateCounter > animationRate ) {
64             animationRateCounter = 0;
65             determineNextFrame();
66          }
67          else
68             animationRateCounter++;
69       }
70    } // end method animate

Fig. 22.12 Class AnimatedPanel represents and displays an animated object from the model (part 2 of 4).


71
72    // determine next animation frame
73    private void determineNextFrame()
74    {
75       int frameSequence[] =
76          ( int[] ) frameSequences.get( currentAnimation );
77
78       // if no more animation frames, determine final frame,
79       // unless loop is specified
80       if ( currentFrameCounter >= frameSequence.length ) {
81          currentFrameCounter = 0;
82
83          // if loop is false, terminate animation
84          if ( !isLoop() ) {
85
86             setAnimating( false );
87
88             if ( isDisplayLastFrame() )
89
90                // display last frame in sequence
91                currentFrameCounter = frameSequence.length - 1;
92          }
93       }
94
95       // set current animation frame
96       setCurrentFrame( frameSequence[ currentFrameCounter ] );
97       currentFrameCounter++;
98
99    } // end method determineNextFrame
100
101    // add frame sequence (animation) to frameSequences ArrayList
102    public void addFrameSequence( int frameSequence[] )
103    {
104       frameSequences.add( frameSequence );
105    }
106
107    // ask if AnimatedPanel is animating (cycling frames)
108    public boolean isAnimating()
109    {
110       return animating;
111    }
112
113    // set AnimatedPanel to animate
114    public void setAnimating( boolean animate )
115    {
116       animating = animate;
117    }
118
119    // set current ImageIcon
120    public void setCurrentFrame( int frame )
121    {
122       setIcon( imageIcons[ frame ] );

Fig. 22.12 Class AnimatedPanel represents and displays an animated object from the model (part 3 of 4).


Class AnimatedPanel chooses the ImageIcon object to be drawn on screen from among several ImageIcon objects stored in array imageIcons (line 23). Class AnimatedPanel determines the ImageIcon object according to a series of frame sequence references, stored in List frameSequences (line 26). A frame sequence is an array of

123    }
124
125    // set animation rate
126    public void setAnimationRate( int rate )
127    {
128       animationRate = rate;
129    }
130
131    // get animation rate
132    public int getAnimationRate()
133    {
134       return animationRate;
135    }
136
137    // set whether animation should loop
138    public void setLoop( boolean loopAnimation )
139    {
140       loop = loopAnimation;
141    }
142
143    // get whether animation should loop
144    public boolean isLoop()
145    {
146       return loop;
147    }
148
149    // get whether to display last frame at animation end
150    private boolean isDisplayLastFrame()
151    {
152       return displayLastFrame;
153    }
154
155    // set whether to display last frame at animation end
156    public void setDisplayLastFrame( boolean displayFrame )
157    {
158       displayLastFrame = displayFrame;
159    }
160
161    // start playing animation sequence of given index
162    public void playAnimation( int frameSequence )
163    {
164       currentAnimation = frameSequence;
165       currentFrameCounter = 0;
166       setAnimating( true );
167    }
168 }

Fig. 22.12 Class AnimatedPanel represents and displays an animated object from the model (part 4 of 4).


integers holding the proper sequence in which to display the ImageIcon objects; specifically, each integer represents the index of an ImageIcon object in imageIcons. Figure 22.13 demonstrates the relationship between imageIcons and frameSequences (this is not a UML diagram). For example, frame sequence number

2 = { 2, 1, 0 }

refers to { imageIcon[ 2 ], imageIcon[ 1 ], imageIcon[ 0 ] }, which yields the image sequence { C, B, A }. In the view, each image is a unique .png file. Method addFrameSequence (lines 102–105) adds a frame sequence to List frameSequences. Method playAnimation (lines 162–167) starts the animation associated with the parameter frameSequence. For example, assume an AnimatedPanel object called animatedPanel in class ElevatorView. The code segment

animatedPanel.playAnimation( 1 );

would generate the { A, B, D, B, A } image sequence, using Fig. 22.13 as a reference. Method animate (lines 56–70) overrides method animate of superclass MovingPanel. Lines 61–69 determine the next frame of animation depending on attribute animationRate, which is inversely proportional to the animation speed—a higher value for animationRate yields a slower frame rate. For example, if animationRate is 5, animate moves to the next frame of animation every fifth time it is invoked. Using this logic, the animation rate maximizes when animationRate has a value of 1, because the next frame is determined each time animate runs.

Method animate calls determineNextFrame (lines 73–99) to determine the next frame (image) to display—specifically, it calls method setCurrentFrame (lines 120–123), which sets imageIcon (the current image displayed) to the image returned from the current frame sequence. Lines 84–92 of determineNextFrame are used for “looping” purposes in the animation. If loop is false, the animation terminates after one iteration. The last frame in the sequence is displayed if displayLastFrame is true, and the first frame in the sequence is displayed if displayLastFrame is false. We explain in greater detail in Appendix I how ElevatorView uses displayLastFrame for the Person and Door AnimatedPanels to ensure the proper display of the image. If loop is true, the animation repeats until stopped explicitly.
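The frame-sequence indexing and the animationRate pacing can be sketched with plain arrays and counters. The class below is hypothetical and only mirrors the counter logic of lines 61–69 and the index lookup of determineNextFrame:

```java
// Illustrative sketch of AnimatedPanel's two core ideas:
// (1) a frame sequence is an array of indices into imageIcons;
// (2) animationRate throttles how often the frame advances.
public class FrameSequenceSketch {

    // images A-D stand in for the four ImageIcon objects
    static final String[] IMAGE_ICONS = { "A", "B", "C", "D" };

    // map a frame sequence (indices into IMAGE_ICONS) to the
    // image sequence it displays
    public static String imageSequence( int[] frameSequence ) {
        StringBuilder shown = new StringBuilder();
        for ( int frame : frameSequence )
            shown.append( IMAGE_ICONS[ frame ] );
        return shown.toString();
    }

    // count frame advances over a number of animate calls, using
    // the same counter test as AnimatedPanel's animate method
    public static int advances( int animationRate, int calls ) {
        int counter = 0, advanced = 0;
        for ( int call = 0; call < calls; call++ ) {
            if ( counter > animationRate ) {
                counter = 0;
                advanced++;   // corresponds to determineNextFrame()
            }
            else
                counter++;
        }
        return advanced;
    }

    public static void main( String[] args ) {
        // frame sequence 2 from Fig. 22.13
        System.out.println( imageSequence( new int[] { 2, 1, 0 } ) ); // CBA

        // a larger animationRate yields fewer advances per call
        System.out.println( advances( 5, 60 ) );
        System.out.println( advances( 1, 60 ) );
    }
}
```

Running the pacing counter confirms the inverse relationship the text describes: with the same number of animate calls, animationRate 5 advances the frame far less often than animationRate 1.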

Sound Effects
We now discuss how we generate audio in our elevator simulation. We divide audio functionality between two classes—SoundEffects and ElevatorMusic (these classes are not part of the standard Java packages, although SoundEffects uses the java.applet package and ElevatorMusic uses the javax.sound.midi package). Class SoundEffects (Figure 22.14) transforms audio (.au) and wave (.wav) files, containing such sounds as the bell ring and the person's footsteps, into java.applet.AudioClip objects. In Appendix I, we list all AudioClips used in our simulation. Class ElevatorMusic (Fig. 22.15) plays a MIDI (.mid) file when the person rides the elevator. The ElevatorView object will play the AudioClip and ElevatorMusic objects to generate sound. All sound files are in the directory structure

com/deitel/jhtp4/elevator/view/sounds


(i.e., in the sounds directory where the classes for the view are located in the file system). In our simulation, we use sounds and MIDI files provided free for download by Microsoft at the Web site:

msdn.microsoft.com/downloads/default.asp

To download these sounds, click on “Graphics and Multimedia,” “Multimedia (General),” then “Sounds.”

Fig. 22.13 Relationship between array imageIcons and List frameSequences.

[Diagram: imageIcons holds four ImageIcons at indices 0–3, depicting images A, B, C and D. Each entry in frameSequences is an array of indices into imageIcons; e.g., frame sequence 2 = { 2, 1, 0 } produces the image sequence { C, B, A }.]

1 // SoundEffects.java
2 // Returns AudioClip objects
3 package com.deitel.jhtp4.elevator.view;
4
5 // Java core packages
6 import java.applet.*;
7
8 public class SoundEffects {
9

10    // location of sound files
11    private String prefix = "";
12
13    public SoundEffects() {}
14
15    // get AudioClip associated with soundFile
16    public AudioClip getAudioClip( String soundFile )
17    {
18       try {
19          return Applet.newAudioClip( getClass().getResource(
20             prefix + soundFile ) );
21       }
22
23       // return null if soundFile does not exist
24       catch ( NullPointerException nullPointerException ) {
25          return null;
26       }
27    }

Fig. 22.14 Class SoundEffects returns AudioClip objects (part 1 of 2).


Class SoundEffects contains method getAudioClip (lines 16–27), which uses static method newAudioClip (of class java.applet.Applet) to return an AudioClip object for the soundFile parameter. Method setPathPrefix (lines 30–33) allows for changing the directory of a sound file (useful if we want to partition our sounds among several directories).

28
29    // set prefix for location of soundFile
30    public void setPathPrefix( String string )
31    {
32       prefix = string;
33    }
34 }

1 // ElevatorMusic.java
2 // Allows for MIDI playing capabilities
3 package com.deitel.jhtp4.elevator.view;
4
5 // Java core packages
6 import java.io.*;
7 import java.net.*;
8
9 // Java extension packages

10 import javax.sound.midi.*;
11
12 public class ElevatorMusic implements MetaEventListener {
13
14    // MIDI sequencer
15    private Sequencer sequencer;
16
17    // should music stop playing?
18    private boolean endOfMusic;
19
20    // sound file name
21    private String fileName;
22
23    // sequence associated with sound file
24    private Sequence soundSequence;
25
26    // constructor opens a MIDI file to play
27    public ElevatorMusic( String file )
28    {
29       // set sequencer
30       try {
31          sequencer = MidiSystem.getSequencer();
32          sequencer.addMetaEventListener( this );
33          fileName = file;
34       }

Fig. 22.15 Class ElevatorMusic plays music when a Person rides in the Elevator (part 1 of 3).

Fig. 22.14 Class SoundEffects returns AudioClip objects (part 2 of 2).


35
36       // handle exception if MIDI is unavailable
37       catch ( MidiUnavailableException midiException ) {
38          midiException.printStackTrace();
39       }
40    } // end ElevatorMusic constructor
41
42    // open music file
43    public boolean open()
44    {
45       try {
46
47          // get URL for media file
48          URL url = getClass().getResource( fileName );
49
50          // get valid MIDI file
51          soundSequence = MidiSystem.getSequence( url );
52
53          // open sequencer for specified file
54          sequencer.open();
55          sequencer.setSequence( soundSequence );
56       }
57
58       // handle exception if URL does not exist
59       catch ( NullPointerException nullPointerException ) {
60          nullPointerException.printStackTrace();
61          return false;
62       }
63
64       // handle exception if MIDI data is invalid
65       catch ( InvalidMidiDataException midiException ) {
66          midiException.printStackTrace();
67          soundSequence = null;
68          return false;
69       }
70
71       // handle IO exception
72       catch ( java.io.IOException ioException ) {
73          ioException.printStackTrace();
74          soundSequence = null;
75          return false;
76       }
77
78       // handle exception if MIDI is unavailable
79       catch ( MidiUnavailableException midiException ) {
80          midiException.printStackTrace();
81          return false;
82       }
83
84       return true;
85    }
86

Fig. 22.15 Class ElevatorMusic plays music when a Person rides in the Elevator (part 2 of 3).


As we discussed in Section 22.7, Java 2 offers MIDI support. Class ElevatorMusic uses the javax.sound.midi package to play the MIDI file. Class ElevatorMusic listens for a MetaMessage event from the MIDI file; the sequencer generates this MetaMessage event. Class ElevatorMusic's constructor (lines 27–40) initializes the system's MIDI sequencer and registers class ElevatorMusic for MetaMessage events from the sequencer. Method open (lines 43–85) opens the sequencer for a specified file and ensures the MIDI data is valid. Method play (lines 88–92) starts the sequencer and plays the MIDI file.
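Beyond loading a file, javax.sound.midi can also build MIDI data entirely in memory with no audio device present, which clarifies the Sequence and Track structures that ElevatorMusic obtains from MidiSystem.getSequence. The class below is an illustrative sketch, not part of the book's code; the note and velocity values are arbitrary:

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.MidiEvent;
import javax.sound.midi.Sequence;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Track;

// Hypothetical sketch: construct a one-note MIDI Sequence in memory.
public class MidiSequenceSketch {

    public static Sequence buildNote() throws InvalidMidiDataException {
        // a Sequence with PPQ timing, 4 ticks per quarter note
        Sequence sequence = new Sequence( Sequence.PPQ, 4 );
        Track track = sequence.createTrack();

        // note-on then note-off for middle C (key 60) on channel 0
        ShortMessage noteOn = new ShortMessage();
        noteOn.setMessage( ShortMessage.NOTE_ON, 0, 60, 93 );
        track.add( new MidiEvent( noteOn, 0 ) );

        ShortMessage noteOff = new ShortMessage();
        noteOff.setMessage( ShortMessage.NOTE_OFF, 0, 60, 0 );
        track.add( new MidiEvent( noteOff, 16 ) );

        return sequence;
    }

    public static void main( String[] args )
        throws InvalidMidiDataException
    {
        Sequence sequence = buildNote();

        // a Sequencer (as in ElevatorMusic) could now receive this
        // via setSequence( sequence ) and play it with start()
        System.out.println( sequence.getTracks().length );
        System.out.println( sequence.getTracks()[ 0 ].size() );
    }
}
```

Note that a Track always carries an end-of-track MetaMessage (type 47), which is exactly the message ElevatorMusic's meta method watches for to stop the sequencer.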

Conclusion
You have completed a substantial object-oriented design (OOD) process that was intended to help prepare you for the challenges of “industrial-strength” projects. We hope you have found the optional “Thinking About Objects” sections informative and useful as a supplement to the material presented in the chapters. In addition, we hope you have enjoyed the experience of designing the elevator system using the UML. The worldwide software industry has adopted the UML as the de facto standard for modeling object-oriented software.

Although we have completed the design process, we have merely “scratched the surface” of the implementation process. We urge you to read Appendices G, H and I on the accompanying CD, which fully implement the design. These appendices translate the UML diagrams into a 3,465-line Java program for the elevator simulation. In these appendices, we present all code that we did not cover in the “Thinking About Objects” sections and a complete “walkthrough” of this code.

1. Appendix G presents the Java files that implement events and listeners

2. Appendix H presents the Java files that implement the model

87    // play MIDI track
88    public void play()
89    {
90       sequencer.start();
91       endOfMusic = false;
92    }
93
94    // get sequencer
95    public Sequencer getSequencer()
96    {
97       return sequencer;
98    }
99
100    // handle end of track
101    public void meta( MetaMessage message )
102    {
103       if ( message.getType() == 47 ) {
104          endOfMusic = true;
105          sequencer.stop();
106       }
107    }
108 }

Fig. 22.15 Class ElevatorMusic plays music when a Person rides in the Elevator (part 3 of 3).


3. Appendix I presents the Java files that implement the view

We do not introduce an abundance of new material or UML design in these appendices—they simply implement the UML-based design presented in previous chapters as a fully functional program. Studying the implementation in the appendices should hone the programming skills you have developed throughout the book and reinforce your understanding of the design process.

SUMMARY
• Through the JMF API, programmers can create Java applications that play, edit, stream and capture many popular and high-quality media types.

• JMF 2.1.1 supports popular media file types such as Microsoft Audio/Video Interleave (.avi), Macromedia Flash 2 movies (.swf), MPEG Layer 3 Audio (.mp3), Musical Instrument Digital Interface (MIDI; .mid), MPEG-1 videos (.mpeg, .mpg), QuickTime (.mov) and Sun Audio (.au).

• The Java Sound API provides extensive sound-processing capabilities. Java Sound is a lower-level API that supports many of the JMF's internal audio capabilities.

• A Player is a type of Controller in JMF that can process and play media clips. Playing media clips with interface Player can be as simple as specifying the media source, creating a Player for the media, obtaining the output media and GUI control components from the Player and displaying them. In addition, Players can access media from a capture device such as a microphone and from a Real-Time Transport Protocol (RTP) stream—a stream of bytes sent over a network that can be buffered and played on the client computer.

• Playing media involves accessing the media, creating a Controller for the media and outputting the media. Before outputting the media, there is the option of formatting it.

• JMF provides lightweight video renderers compatible with Swing GUI components; specify one using Manager method setHint with parameters Manager.LIGHTWEIGHT_RENDERER and Boolean.TRUE.

• A MediaLocator is similar to a URL, but it also supports RTP streaming session addresses and capture device locations.

• Invoke Manager method createPlayer to create a Player object that references a media player. Method createPlayer opens the specified media source and determines the appropriate player for the media source. A NoPlayerException occurs if no appropriate player can be found for the media clip.

• Class Manager provides static methods that enable programs to access most JMF resources.

• Throughout the media-handling process, Players generate ControllerEvents for which ControllerListeners listen. Class ControllerAdapter implements the methods of interface ControllerListener, so a listener subclass can override only the event handlers it needs.

• Controllers use state transitions to confirm their position in the media processing algorithm.

• Invoke Player method realize to confirm all resources necessary to play the media. Method realize places the Player in a Realizing state, in which the player interacts with its media sources. When a Player completes realizing, it generates a RealizedCompleteEvent—a type of ControllerEvent that occurs when a Player completes its transition to state Realized.

• Player method prefetch causes the Player to obtain hardware resources for playing the media and begin buffering the media data. Buffering the media data reduces the delay before the media clip plays, because media reading can take a long time.


• Invoke Player method getVisualComponent to obtain the visual component of a video. Invoke Player method getControlPanelComponent to obtain the player's GUI controls.

• When the media clip ends, the Player generates a ControllerEvent of type EndOfMediaEvent.

• Player method setMediaTime sets the position of the media to a specific time in the media.

• Invoking Player method start starts media playback. It also buffers and realizes the player ifthat has not been done.

• Capture devices such as microphones have the ability to convert analog media into digitized media. This type of media is known as captured media.

• Class DataSource abstracts the media source to allow a program to manipulate it, and provides a connection to the media source.

• Interface Processor allows a program to manipulate data at the various processing stages. It extends interface Player and provides more control over media processing.

• Monitoring allows you to hear or see the captured media as it is captured and saved. A MonitorControl and other control objects are obtained from a Controller by invoking method getControl.

• JMF provides class Format to describe the attributes of a media format, such as the sampling rate (which controls the quality of the sound) and whether the media should be in stereo or mono format. FormatControl objects allow a program to set the format of objects that support format controls.

• Class CaptureDeviceManager enables a program to access capture device information.

• A CaptureDeviceInfo object provides the essential information about a capture device's DataSource.

• Invoke Manager method createDataSource to obtain the DataSource object for a given media location.

• Manager method createRealizedProcessor creates a realized Processor object that can start processing media data. The method requires as an argument a ProcessorModel object containing the specifications of the Processor.

• Use a ContentDescriptor to describe the content type of output from a Processor. A FileTypeDescriptor specifies a file-based media content type.

• Call Processor method getTrackControls to get each track’s controls.

• An object that implements interface DataSink enables media data to be output to a specific location—most commonly a file. Manager method createDataSink receives the DataSource and MediaLocator as arguments to create a DataSink object.

• Register a DataSinkListener to listen for DataSinkEvents generated by a DataSink. The DataSink invokes DataSinkListener method dataSinkUpdate when each DataSinkEvent occurs. A DataSink generates an EndOfStreamEvent when the capture stream connection closes.

• Streaming media refers to media that is transferred from a server to a client in a continuous stream of bytes. Streaming media technology loads media data into buffers before displaying the media.

• JMF provides a streaming media package that enables Java applications to send and receive streams of media in the formats discussed earlier in this chapter. JMF uses the industry-standard Real-Time Transport Protocol (RTP) to control media transmission. RTP is designed specifically to transmit real-time media data.

• Use a DataSink or an RTPManager to stream media. RTPManagers provide more control and versatility for the transmission. If an application sends multiple streams, the application must have


an RTPManager for each separate streaming session. Both require the DataSource obtained from Processor's getOutput method.

• An RTP stream URL has the format: rtp://<host>:<port>/<contentType>

• Formatting the media can be done only after the Processor has been configured. Register a ControllerListener to notify the program when configuration completes. A ConfigureCompleteEvent occurs when a Processor completes configuration.

• Processor method setContentDescriptor sets the stream to an RTP-enabled format when given the ContentDescriptor.RAW_RTP parameter.

• Interface TrackControl allows the formats of the media tracks to be set.

• A SessionAddress contains an IP address and a port number used in the streaming process. RTPManagers use SessionAddresses to stream media.

• Invoke RTPManager method initialize to initialize the local streaming session, with the local session address as the parameter. Invoke RTPManager method addTarget to add the destination session address as the client recipient of the media stream. To stream media to multiple clients, call RTPManager method addTarget for each destination address.

• RTPManager method removeTargets closes streaming to specific destinations. RTPManager method dispose releases the resources held by the RTP sessions.

• The Java Sound API provides classes and interfaces for accessing, manipulating and playing Musical Instrument Digital Interface (MIDI) and sampled audio.

• A sound card is required to play audio with Java Sound. Java Sound throws exceptions when it accesses audio system resources to process audio on a computer that does not have a sound card.

• Programmers can use package javax.sound.sampled to play sampled audio file formats, which include Sun Audio (.au), Wave (.wav) and AIFF (.aiff).

• To process audio data, we can use a Clip line, which converts the flow of raw digital data into audio data we can hear.

• An AudioInputStream object points to the audio stream. Class AudioInputStream (a subclass of InputStream) provides access to the audio stream contents.

• The length of video and audio clips is measured in frames. Each frame represents data at a specific time interval in the audio file.

• The algorithm for playing sampled audio supported by Java Sound is as follows: obtain an AudioInputStream from an audio file, obtain a formatted Clip line, load the AudioInputStream into the Clip line, then start the data flow in the Clip line.

• All Lines generate LineEvents that can be handled by a LineListener. The first step in sampled audio playback is obtaining the audio stream from an audio file.

• Class AudioSystem enables a program to access many audio system resources required to play and manipulate sound files.

• Method getAudioInputStream throws an UnsupportedAudioFileException if the specified sound file is a non-audio file or contains a sound clip format that is not supported by Java Sound.

• Method getLine requires a Line.Info object as an argument, which specifies the attributes of the line the AudioSystem should obtain.

• We can use a DataLine.Info object that specifies a Clip data line, a general encoding format and a buffer range. We need to specify a buffer range so the program can determine the best buffer size given a preferred range.

jhtp4_22.FM Page 1334 Monday, July 16, 2001 11:29 AM

Chapter 22 Java Media Framework and Java Sound (on CD) 1335

• DataLine.Info objects specify information about a Clip line, such as the formats supported by the Clip. The DataLine.Info object constructor receives as arguments the Line class, the line's supported AudioFormats, the minimum buffer size and the maximum buffer size in bytes.

• AudioSystem method getLine and Clip method open throw LineUnavailableExceptions if another application is using the requested audio resource. Clip method open also throws an IOException if open is unable to read the specified AudioInputStream.

• Invoke Clip method start to begin audio playback.

• When a LineEvent occurs, the program calls LineListener method update to process the event. The four LineEvent types are defined in class LineEvent.Type. The event types are CLOSE, OPEN, START and STOP.

• Method close of class Line stops audio activity and closes the line, which releases any audio resources obtained previously by the Line.

• Clip method loop can be called with parameter Clip.LOOP_CONTINUOUSLY to replay the audio clip forever.

• Invoking method stop of interface Clip only stops data activity in the Line. Invoking method start resumes data activity.

• MIDI (Musical Instrument Digital Interface) music can be created through a digital instrument, such as an electronic keyboard (synthesizer), or through packaged software synthesizers. A MIDI synthesizer is a device that can produce MIDI sounds and music.

• The MIDI specification provides detailed information on the formats of a MIDI file. For detailed information on MIDI and its specification, visit the official Web site at www.midi.org. Java Sound's MIDI package allows developers to access the data that a MIDI file specifies, but it does not provide support for the specification itself.

• Interpretation of MIDI data varies between synthesizers and will sound different with different instruments. Package javax.sound.midi enables programs to manipulate, play and synthesize MIDI. There are three MIDI types: 0 (the most common), 1 and 2. Java Sound supports MIDI files with .mid extensions and .rmf (Rich Music Format) files.

• Some file parsers in various operating systems are unable to interpret the MIDI file as a MIDI file that Java can play.

• MIDI playback is accomplished by a MIDI sequencer. Specifically, sequencers can play and manipulate a MIDI sequence, which is the data format that tells a device how to handle the MIDI data.

• Often, MIDI is referred to as a sequence, because the musical data in MIDI is composed of a sequence of events. The simplicity of MIDI data enables us to view each event individually and learn the purpose of each event. The process of MIDI playback involves accessing a sequencer, loading a MIDI sequence or a MIDI file into the sequencer and starting the sequencer.
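The three playback steps can be sketched with the javax.sound.midi classes discussed below; the file name in main is hypothetical, and a MIDI-capable system is required at run time.

```java
import java.io.File;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequence;
import javax.sound.midi.Sequencer;

// Minimal sketch of MIDI playback: access a sequencer, load a
// sequence into it and start the sequencer.
public class MidiPlaybackSketch {

   public static void play( File midiFile ) throws Exception
   {
      // step 1: access a sequencer
      Sequencer sequencer = MidiSystem.getSequencer();
      sequencer.open();

      // step 2: load a MIDI sequence from the file into the sequencer
      Sequence sequence = MidiSystem.getSequence( midiFile );
      sequencer.setSequence( sequence );

      // step 3: start the sequencer
      sequencer.start();
   }

   public static void main( String args[] ) throws Exception
   {
      play( new File( "tune.mid" ) );  // hypothetical file name
   }
}
```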

• Method getSequence also can obtain a MIDI sequence from a URL or an InputStream. Method getSequence throws an InvalidMidiDataException if the MIDI system detects an incompatible MIDI file.

• Interface Sequencer, which extends interface MidiDevice (the super-interface for all MIDI devices), represents the standard device to play MIDI data.

• Sequencer's open method prepares to play a Sequence. Sequencer method setSequence loads a MIDI Sequence into the Sequencer and throws an InvalidMidiDataException if the Sequencer detects an unrecognizable MIDI sequence. Sequencer method start begins playing the MIDI sequence.


• A MIDI track is a recorded sequence of data; MIDI files usually contain multiple tracks. MIDI tracks are similar to CD tracks, except that the music data in MIDI tracks are played simultaneously. Class Track (package javax.sound.midi) provides access to the MIDI music data stored in the MIDI tracks.

• MIDI data in MIDI tracks are represented by MIDI events. MIDI events are the holders of the MIDI action and the time when the MIDI command should occur. There are three types of MIDI message: ShortMessage, SysexMessage and MetaMessage. ShortMessages provide instructions, such as which specific notes to play, and can configure options, such as when a MIDI sequence starts. The other two, less-used message types are the exclusive system messages called SysexMessages and the MetaMessages, which may tell a device that the MIDI has reached the end of a track. This section deals exclusively with ShortMessages that play specific notes. Each MidiMessage is encapsulated in a MidiEvent, and a sequence of MidiEvents forms a MIDI track.

• Each MidiEvent’s getTick method provides the time when the event takes place (timestamp).

• ShortMessage method getCommand returns the command integer of the message. ShortMessage method getData1 returns the first data byte of the message. ShortMessage method getData2 returns the second data byte. The first and second data bytes vary in interpretation according to the type of command in the ShortMessage.
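For example, a loaded Track can be inspected event by event. The method calls below are from javax.sound.midi; the dump format itself is our own illustration.

```java
import javax.sound.midi.MidiEvent;
import javax.sound.midi.MidiMessage;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Track;

// Print the tick, command and data bytes of every ShortMessage in a track.
public class TrackDump {

   public static void dump( Track track )
   {
      for ( int i = 0; i < track.size(); i++ ) {
         MidiEvent event = track.get( i );       // i-th event in the track
         MidiMessage message = event.getMessage();

         if ( message instanceof ShortMessage ) {
            ShortMessage shortMessage = ( ShortMessage ) message;
            System.out.printf( "tick %d: command %d, data1 %d, data2 %d%n",
               event.getTick(), shortMessage.getCommand(),
               shortMessage.getData1(), shortMessage.getData2() );
         }
      }
   }
}
```

For a NOTE_ON command, data1 is the note number and data2 is the velocity.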

• General MIDI recording is accomplished through a sequencer. Interface Sequencer provides simple methods for recording, assuming the transmitters and receivers of MIDI devices are "wired" correctly.

• After setting up a sequencer and an empty sequence, a Sequencer object can invoke its startRecording method to enable and start recording on the empty track. Method recordEnable of interface Sequencer takes a Track object and a channel number as the parameters to enable recording on a track.

• Method write of class MidiSystem writes the sequence to a specified file.

• An alternative way to record MIDI without having to deal with transmitters and receivers is to create events from ShortMessages. The events should be added to a track of a sequence.
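A sketch of this approach, ending with the MidiSystem.write call mentioned above. The helper method name and the output file name are our own illustrative choices.

```java
import java.io.File;
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.MidiEvent;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequence;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Track;

// Record MIDI without transmitters and receivers: create events from
// ShortMessages, add them to a track, then write the sequence to a file.
public class MidiEventBuilder {

   // append a ShortMessage to the track at the given tick
   public static void addEvent( Track track, int command, int note,
      int velocity, long tick ) throws InvalidMidiDataException
   {
      ShortMessage message = new ShortMessage();
      message.setMessage( command, 0, note, velocity );
      track.add( new MidiEvent( message, tick ) );
   }

   public static void main( String args[] ) throws Exception
   {
      Sequence sequence = new Sequence( Sequence.PPQ, 480 );
      Track track = sequence.createTrack();

      addEvent( track, ShortMessage.NOTE_ON, 60, 100, 0 );   // middle C on
      addEvent( track, ShortMessage.NOTE_OFF, 60, 0, 480 );  // middle C off

      // write the sequence using a file type the system supports
      int types[] = MidiSystem.getMidiFileTypes( sequence );
      MidiSystem.write( sequence, types[ 0 ], new File( "one-note.mid" ) );
   }
}
```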

• Interface Synthesizer is a MidiDevice interface that enables access to MIDI sound generation, instruments, channel resources and sound banks.

• A SoundBank is the container for various Instruments, which are programmed algorithms of instructions that tell the computer how to sound a specific note. Different notes on various instruments are played through MidiChannels on different tracks simultaneously to produce symphonic melodies.

• Acquiring any MIDI resource throws a MidiUnavailableException if the resource is unavailable.

• Invoke Synthesizer's getChannels method to obtain all 16 channels from the synthesizer. A MidiChannel can sound a note by calling its noteOn method with the note number (0–127) and velocity as parameters. MidiChannel's noteOff method turns off a note with just the note-number parameter.
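A minimal sketch of sounding a note on the first channel; the note number and duration are illustrative, and a synthesizer must be available on the system.

```java
import javax.sound.midi.MidiChannel;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Synthesizer;

// Sound a single note on the first of the synthesizer's 16 channels.
public class NoteSketch {

   public static void playNote( int note, int velocity, long duration )
      throws Exception
   {
      Synthesizer synthesizer = MidiSystem.getSynthesizer();
      synthesizer.open();

      MidiChannel channels[] = synthesizer.getChannels(); // all 16 channels
      channels[ 0 ].noteOn( note, velocity );  // start the note
      Thread.sleep( duration );                // let the note sound
      channels[ 0 ].noteOff( note );           // stop the note

      synthesizer.close();
   }

   public static void main( String args[] ) throws Exception
   {
      playNote( 60, 93, 1000 ); // middle C for about one second
   }
}
```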

• Synthesizer's getAvailableInstruments method obtains the default instrument programs of a synthesizer. One can also import more instruments by loading a customized sound bank through method loadAllInstruments(SoundBank) in interface Synthesizer. A sound bank usually has 128 instruments. MidiChannel's programChange method loads the desired instrument program into the synthesizer.

• Invoke Receiver's send method with a MidiMessage and a time stamp as its parameters to deliver a MIDI message to the receiving device.


TERMINOLOGY
addControllerListener method of Controller
addDataSinkListener method of DataSink
addLineListener method of Line
addMetaEventListener method of Sequencer
addTarget method of RTPManager
AudioFormat class
AudioFormat.Encoding.PCM_SIGNED
AudioInputStream class
AudioSystem class
Boolean.TRUE
broadband
CannotRealizeException class
capture
capture device
captured media
CaptureDevice interface
CaptureDeviceInfo class
CaptureDeviceManager class
Clip interface
Clip.class
Clip.LOOP_CONTINUOUSLY
Clock interface
close method of Controller
close method of Line
close method of MidiDevice
configure method of Processor
configureComplete method of ControllerAdapter
ConfigureCompleteEvent class
Controller.Prefetching
Controller.Prefetched
Controller.Realized
Controller.Realizing
Controller.Started
Controller.Stopped
ControllerAdapter class
ControllerEvent class
ControllerListener interface
createDataSink method of Manager
createDataSource method of Manager
createPlayer method of Manager
createProcessor method of Manager
createRealizedProcessor method of Manager
createSendStream method of RTPManager
createTrack method of Sequence
DataLine interface
DataLine.Info class
DataSink interface
DataSinkEvent class
DataSinkListener interface
dataSinkUpdate method of DataSinkListener
DataSource class
deleteTrack method of Sequence
device ports
Direct Sound
dispose method of RTPManager
encoding
endOfMedia method of ControllerAdapter
EndOfMediaEvent class
EndOfStreamEvent class
FileTypeDescriptor class
FileTypeDescriptor.QUICKTIME
Format class
FormatControl interface
frames
get method of Track
getAudioInputStream method of AudioSystem
getBank method of Patch
getChannels method of Synthesizer
getCommand method of ShortMessage
getControlComponent method of Control
getControlPanelComponent method of Player
getData1 method of ShortMessage
getData2 method of ShortMessage
getDataOutput method of Processor
getDeviceList method of CaptureDeviceManager
getFormat method of AudioFormat
getFormat method of FormatControl
getFormatControls method of CaptureDevice
getFrameLength method of AudioInputStream
getFrameSize method of AudioFormat
getLine method of AudioSystem
getLocator method of CaptureDeviceInfo
getMessage method of MidiEvent
getMidiFileTypes method of MidiSystem
getPatch method of Instrument
getProgram method of Patch
getReceiver method of MidiDevice
getResolution method of Sequence
getSequence method of MidiSystem
getSequencer method of MidiSystem
getSupportedFormats method of FormatControl
getSynthesizer method of MidiSystem
getTargetFormat method of AudioSystem
getTick method of MidiEvent
getTrackControls method of Processor
getTracks method of Sequence
getTransmitter method of MidiDevice
getType method of LineEvent
getVisualComponent method of Player
initialize method of RTPManager
Instrument class
InvalidMidiDataException class
InvalidSessionAddress class
isEnabled method of FormatControl
isLineSupported method of AudioSystem
Java Media Framework
Java Sound
javax.media package
javax.media.control package
javax.media.datasink package
javax.media.format package
javax.media.protocol package
javax.media.rtp package
javax.sound.midi package
javax.sound.sampled package
JOptionPane.CLOSED_OPTION
JOptionPane.DEFAULT_OPTION
JOptionPane.OK_OPTION
Line interface
LineEvent class
LineEvent.Type.STOP
LineListener interface
LineUnavailableException class
loop method of Clip
Manager class
Manager.LIGHTWEIGHT_RENDERER
media
media clip
media location
media tracks
MediaLocator class
MIDI (Musical Instrument Digital Interface)
MIDI Specification
MidiChannel interface
MidiEvent class
MidiMessage class
MidiSystem class
MidiUnavailableException class
mixers
MonitorControl interface
monitoring
MP3
MPEG-1
network ports
newInstance method of RTPManager
NoDataSinkException class
NoDataSourceException class
NoPlayerException class
NoProcessorException class
noteOff method of MidiChannel
noteOn method of MidiChannel
open method of Clip
open method of DataSink
open method of MidiDevice
output format
packetized data
Patch class
pitch
Player interface
pre-buffer
prefetchComplete method of ControllerAdapter
PrefetchCompleteEvent class
pre-process
Processor interface
Processor.Configured
Processor.Configuring
ProcessorModel class
propagation delay
protocol
QuickTime
realize method of Controller
realizeComplete method of ControllerAdapter
RealizeCompleteEvent class
Receiver interface
recordEnable method of Sequencer
removeTargets method of RTPManager
RMF (Rich Music Format)
RTP (Real-time Transport Protocol)
RTPManager class
SecurityException class
send method of Receiver
SendStream interface

jhtp4_22.FM Page 1338 Monday, July 16, 2001 11:29 AM

Chapter 22 Java Media Framework and Java Sound (on CD) 1339

Sequence class
Sequence.PPQ
Sequencer interface
SessionAddress class
SessionEvent class
SessionListener interface
setContentDescriptor method of Processor
setFormat method of FormatControl
setHint method of Manager
setMediaTime method of Clock
setMessage method of ShortMessage
setReceiver method of Transmitter
setSequence method of Sequencer
ShortMessage class
ShortMessage.NOTE_OFF
ShortMessage.NOTE_ON
ShortMessage.PROGRAM_CHANGE
simulation
size method of Track
SoundBank interface
start method of DataLine
start method of DataSink
start method of Player
start method of SendStream
start method of Sequencer
startRecord method of Sequencer
stop method of DataLine
stop method of DataSink
stop method of Sequencer
stopRecord method of Sequencer
streaming media
streams
synchronization
synthesis
Synthesizer interface
teleconferencing
tempo
time stamp
Track class
TrackControl interface
Transmitter interface
UnsupportedAudioFileException class
UnsupportedFormatException class
video conference
Video for Windows
write method of MidiSystem

SELF-REVIEW EXERCISES
22.1 Fill in the blanks in each of the following:
a) Class ________ provides access to many JMF resources.
b) In addition to locations of media files stored on the local computer, a ________ can also specify the location of capture devices and RTP sessions.
c) Class ________ provides access to sampled audio system resources, while class ________ provides access to MIDI system resources.
d) An event of type ________ indicates that a Controller has established communications with the media source.
e) Method createRealizedProcessor takes a ________ as an argument.
f) In order, the Processor's states are: Unrealized, ________, ________, ________, ________, ________ and Started.
g) Constant ________ specifies that the Processor should output media in QuickTime format.
h) To stream media, we can use a ________ or a ________.
i) ________ objects set the stream formats for capture devices.
j) Invoking Clip method ________ with constant ________ as an argument replays a sampled audio file continuously.
k) A MIDI ________ contains multiple tracks, which contain a sequence of MIDI ________ that each encapsulate a MIDI ________.

22.2 State whether each of the following is true or false. If false, explain why.
a) Manager method setHint can be used to specify that the visual component of a media clip should be rendered using lightweight GUI components.
b) A ControllerListener handles events generated by a DataSink.
c) Only objects that implement interface Processor can play media.


d) A Player cannot access media from capture devices; a Processor must be used for this purpose.
e) A Clip plays MIDI Sequences.
f) MIDI playback stops automatically when the Sequencer reaches the end of a MIDI Sequence.
g) An RTPManager can stream an entire media file regardless of the number of tracks in the file.
h) Method createPlayer throws a NoDataSourceException if it is unable to locate the specified media data source.

ANSWERS TO SELF-REVIEW EXERCISES
22.1 a) Manager. b) MediaLocator. c) AudioSystem, MidiSystem. d) RealizeCompleteEvent. e) ProcessorModel. f) Configuring, Configured, Realizing, Realized, Prefetching, Prefetched. g) FileTypeDescriptor.QUICKTIME. h) DataSink, RTPManager. i) FormatControl. j) loop, Clip.LOOP_CONTINUOUSLY. k) Sequence, events, message.

22.2 The answers to Self-Review Exercise 22.2 are as follows:
a) True.
b) False. A DataSinkListener handles DataSinkEvents generated by a DataSink.
c) False. Objects that implement Player or Processor can play media.
d) False. Both a Processor and a Player can access media from capture devices.
e) False. A Sequencer plays MIDI sequences.
f) True.
g) False. Each RTPManager can stream only one track.
h) False. Method createPlayer throws a NoPlayerException if it is unable to locate the specified media data source.

EXERCISES
22.3 Wave audio clips are commonly used to play sounds that alert the user of a problem in a program. Typically, such sounds are accompanied by error-message dialogs. Modify the DivideByZeroTest example of Fig. 14.1 to play an error-message sound (in addition to displaying an error-message dialog) if the user enters an invalid integer or attempts to divide by zero. Preload a compatible sound clip using a Clip line as demonstrated in Fig. 22.5. The Clip line needs to support only the format of the chosen sound clip. There should be a separate method that invokes the playback of the clip. When the program detects an exception, it should call this method to play the error-message sound. After each clip playback, the program needs to rewind the clip by invoking Clip's method setFramePosition with the frame position as the argument, so that the clip can replay from its beginning position.

22.4 Incorporate MIDI file playback capabilities, as demonstrated in class MidiData (Fig. 22.7), into the ClipPlayer demo. Class ClipPlayer should have separate methods for obtaining MIDI sequence data and for playing back the sequence with a sequencer.

22.5 The SimplePlayer demo (Fig. 22.1) demonstrated JMF's media playback (videos, capture media) capabilities using interface Player. Using the SimplePlayer demo as a guideline, develop a karaoke application in which one portion of the program plays a music/video file (preferably without lyrics) while another portion of the program simultaneously captures the user's voice. The program should start playback and capture as soon as it obtains the media. It is important that the program allow the user to control both the capture and music, so the control GUIs of each media should be displayed. When the media file finishes playing, the voice capture should cease and the program


should reset the music to the beginning. The program should close all Player-related resources when the user terminates the program.

22.6 Modify your solution to Exercise 22.5 by implementing the program using interface Processor. Create a Processor that is ready to display media with no format or output specifications. The voice capture should not end and the media should not rewind when the media finishes playing. Deallocate Processor-related resources when the user opens a new file or closes the program. All other program details remain the same as specified in Exercise 22.5.

22.7 Referring to the file-saving process demonstrated in class CapturePlayer (Fig. 22.2), modify your solution to Exercise 22.6 by saving both audio streams to two separate QuickTime files. Specify the media tracks to be in AudioFormat.IMA4 encoding. The program should display file-saver dialogs for each audio file saver and a message when the saving completes or stops. There should be separate data writers for each audio stream. The program should close the data writers when there is no more data to process or when the user terminates the program. DataSinkEvent method getSourceDataSink is available to obtain the DataSink that is generating the DataSinkEvent. Use MonitorControls to monitor both audio streams, so there is no need to display video or default user controls. MonitorControl method setEnabled is available to enable monitoring of the audio streams. Display one of the MonitorControls in a dialog box. Make sure to close Processor resources when there is no more data from the media file or when the user opens another file or terminates the program.

22.8 Modify class MidiSynthesizer (Fig. 22.9) into an application where the user can play violin notes by pressing the computer keyboard keys. Use the virtual key code of the keys to specify the note number that the synthesizer should play. The violin's program number is 40. Use the first and ninth channels to sound the notes at the same time. The ninth channel may generate a different version of the violin notes.

SPECIAL SECTION: CHALLENGING MULTIMEDIA PROJECTS
The preceding exercises are keyed to the text and designed to test the reader's understanding of fundamental JMF and Java Sound concepts. This section includes a collection of advanced multimedia projects. The reader should find these problems challenging, yet entertaining. The problems vary considerably in difficulty. Some require an hour or two of program writing and implementation. Others are useful for lab assignments that might require two or three weeks of study and implementation. Some are challenging term projects. [Note: Solutions are not provided for these exercises.]

22.9 Modify the ClipPlayer demo (Fig. 22.5, Fig. 22.6) to provide a replay checkbox that allows the user to replay the sampled audio file.

22.10 Modify the RTPServer demo (Fig. 22.3) to enable transmission of the audio portions of media files to two clients. The application testing class should have twice the number of IP and port number inquiry dialog boxes. The program can check for audio formats by matching track-control formats of the media to be instances of AudioFormat (class AudioFormat).

22.11 Many Web sites are able to play video clips. The SimplePlayer (Fig. 22.1) program can be an applet. Simplify the SimplePlayer program to an applet that plays a preloaded media clip on a web page. Insert this applet tag into your HTML file:

<applet code = "AppletName.class" width = "#" height = "#">
   <param name = "file" value = "sample.mov">
</applet>

22.12 Modify class MidiRecord (Fig. 22.8) and class MidiData (Fig. 22.7) to create a new application class that duplicates MIDI files by recording the sequence to a new file. Play the sequence


using MidiData and use its Transmitter to transmit MIDI information to the Receiver of MidiRecord.

22.13 Modify your solution to Exercise 22.6 to create a streaming karaoke application in which the application streams only the video portion of a music video and the sound stream is replaced by a voice-capture stream. [Note: If the video format contains only one track for both audio and video, the application cannot choose to stream only the video portion of the track.]

22.14 Modify class MidiData (Fig. 22.7) to load all tracks of a MIDI file and revise class MidiDemo (Fig. 22.10) to enable the user to select the playback of each individual track displayed in a JList selector panel. Allow the user to replay the sequence forever.

22.15 Implement an MP3 player with a file list window using Vectors and a JList.

22.16 Modify class MidiRecord (Fig. 22.8) and class MidiDemo (Fig. 22.10) to allow the user to record MIDI to individual tracks stored in a Vector. Playback of the recorded MIDI should play all MIDI tracks simultaneously.

22.17 Currently the MidiDemo program (Section 22.7) records synthesized music with the first available instrument (i.e., Grand Piano). Modify class MidiDemo so that music will be recorded with a user-selected instrument, and allow the user to change the instrument during recording. Also allow the user to import their own sound banks. Make changes to classes MidiSynthesizer, MidiData and MidiRecord as needed. (Hint: The command parameter to change instruments is ShortMessage.PROGRAM_CHANGE.)

22.18 Modify the SimplePlayer demo (Fig. 22.1) to support multiple media players. Present each media clip in its own JInternalFrame. The program needs to create separate Players for each media clip and should register a ControllerListener for each player. ControllerEvent method getSourceController is available to obtain the controller generating the ControllerEvent. Implement the program using a dynamic data structure such as a Vector to store the multiple Players.

22.19 Modify your solution to Exercise 22.7 to save both media streams to one file and play the combined stream. Use Manager method createMergingDataSource, which receives an array of DataSource objects, to save both the capture stream and the music stream into one stream, whose content-type will be MIXED. The program should obtain the output DataSources from the Processors as the DataSource objects to be merged. The program must also obtain a duplicate DataSource (of the merged DataSource) for creating the Player for that DataSource. To do this, use Manager method createCloneableDataSource to create a cloneable DataSource with the merged DataSource as the argument. Duplicate the DataSource for the player by invoking method createClone of interface Cloneable on the DataSource (similar to obtaining the FormatControls of a CaptureDevice DataSource in the CapturePlayer demo (Fig. 22.2)).

22.20 A program can record MIDI without the use of transmitters and receivers by manually creating MidiMessages, placing them in MidiEvents and adding these events to a track. In addition to the MidiMessage argument, a program must specify a time stamp to create the MidiEvent, expressed in ticks (i.e., milliseconds of long type), so the program must obtain the system's current time in milliseconds, which can be obtained from System method currentTimeMillis. Track method add is available to add events to a track. Create an acoustics table (e.g., drums, cymbals, etc.) where the user can select a sequence of instruments to play. Allow the user to save the recorded MIDI sequence to a file.

22.21 Create a peer-to-peer teleconferencing kit that enables users to talk to and hear each other. To listen to each other, each user must open an RTP session for the capture stream. A program can open an RTP stream with a MediaLocator specifying the RTP session address. Then, the program


can use the MediaLocator to create a Player for the RTP stream. The program can send the voice capture as demonstrated in the RTPManager demo (Fig. 22.3). To send the voice capture to more than two people, use an RTPManager for each separate session and call its addTarget method to add each recipient's session address as a destination of the capture stream. This is referred to as multicast-unicast sessions.

22.22 Using the "piano player" driver (Section 22.7.4 discusses the driver) from class MidiDemo (Fig. 22.10) and image animation functions from Chapter 18, write a program that displays a bouncing ball whose pinnacle height is specified by the note numbers (of a ShortMessage in a MidiEvent) of a loaded track in a MIDI file. The program should use the duration ticks of the MidiEvent as the duration of the bounce.

22.23 Most karaoke music videos are in MPEG format, for which JMF provides an MpegAudioControl interface to control the audio channels. For multilingual and karaoke MPEG videos, channel layers direct the main audio to one of two audio streams (e.g., an English-dubbed audio stream) or both audio streams (e.g., both music and song channels turned on in karaoke videos). As an additional feature for MPEG videos in your solution to Exercise 22.6, obtain and display the control GUI component from an MpegAudioControl, if any, or display a customized radio-button GUI selector to let the user select the channel layout of the MPEG audio. Interface MpegAudioControl provides method setChannelLayout to set the audio channel layout for MPEG videos, with the channel layout as the argument. (Refer to the use of MonitorControl in Fig. 22.2.)

22.24 Create a multimedia-rich Tic-Tac-Toe game that plays sampled audio sounds when a player makes a valid move or makes an invalid move. Use a Vector to store preloaded AudioInputStreams representing each audio clip. Use a Clip line to play sounds in response to the user's interactions with the game. Play continuous MIDI background music while the game is in progress. Use JMF's interface Player to play a video when a player wins.

22.25 Java Sound's MIDI package can access MIDI software devices as well as hardware devices. If you have a MIDI keyboard that the computer can detect or one that can be plugged into a MIDI IN port of a sound card, Java Sound can access that keyboard. Use synthesizers, receivers, transmitters and sequencers to allow the user to record MIDI synthesized by the electronic keyboard. MidiSystem method getMidiDeviceInfo is available to obtain all detectable MIDI devices' information (an array of MidiDevice.Info objects). Use MidiSystem method getMidiDevice with a MidiDevice.Info argument to obtain the specified MIDI device resource.

22.26 Enhance the MidiDemo program to allow for more configurations, such as tempo, control and replay. Sequencers can implement MetaEventListeners to handle MetaMessages and ControllerEventListeners to handle ShortMessage.CONTROL_CHANGE commands (see the MIDI specification for the types of changes and the various MetaEvents). Interface Sequencer also provides many configuration and control methods that affect sequencer playback.

22.27 Enhance the RTPServer program (Fig. 22.3, Fig. 22.4) into a video distribution server where a live video feed from a video capture device (e.g., VFW TV card, digital cameras) is broadcast to everyone in the network. Use the IP address ending with .255 (i.e., to create a SessionAddress) to broadcast to everyone in the network subnet. A server program should have access to stream transmission status, error controls and delay management. Package javax.media.rtp provides many interfaces to handle stream configurations and statistics. Package javax.media.rtcp allows the user to access RTP session reports. Package javax.media.rtp.event contains many events generated during an RTP session that can be used to perform RTP enhancements at those stages in the session. Package javax.media.control provides several control interfaces useful in RTP sessions.

22.28 Enhance the media player application solution to Exercise 22.18 by adding editing features to the program. First add a replay checkbox. Although a Processor is more suitable for control tasks than a Player is, one can also use both interfaces by creating a Player for the output DataSource from the Processor. Control features should include track formatting, frame-control setting (interface FrameRateControl), buffer control (interface BufferControl) and quality control (interface QualityControl). Include a program option that allows the user to save media clips given the settings of these controls. For capture devices, there is a PortControl interface available to control the ports of devices. These interfaces are in package javax.media.control. In package javax.media, there are other interfaces, such as Codec and Effect, that enable further rendering and processing of media to specific media formats. Allow the user to import new codecs. Also implement an editing feature that enables the user to extract certain portions of a media clip. (Hint: Set the media position of a media clip and obtain the Processor output at the marked positions.)

22.29 Package javax.media.sound offers many audio system resources. Use this package to create a sound capture program that allows the users to save the capture stream in their desired formats, bit rates, frequencies and encodings.

22.30 Create a visualization studio that displays graphics bars that synchronize with sampled audio playback.

22.31 Extend MP3 playback support for class ClipPlayer (Fig. 22.5) using classes in package javax.media.sound.spi. The MP3 encoding process uses the Huffman algorithm.

22.32 (Story Teller) Record audio for a large number of nouns, verbs, articles, prepositions, etc. Then use random number generation to form sentences and have your program speak the sentences.

22.33 (Project: Multimedia Authoring System) Develop a general-purpose multimedia authoring system. Your program should allow the user to form multimedia presentations consisting of text, audios, images, animations and, eventually, videos. Your program lets the user weave together a presentation consisting of any of these multimedia elements that are selected from a catalog your program displays. Provide controls to allow the user to customize the presentation dynamically as the presentation is delivered.

22.34 (Video Games) Video games have become wildly popular. Develop your own Java video-game program. Have a contest with your classmates to develop the best original video game.

22.35 (Physics Demo: Bouncing Ball) Develop an animated program that shows a bouncing ball. Give the ball a constant horizontal velocity. Allow the user to specify the coefficient of restitution; e.g., a coefficient of restitution of 75% means that after the ball bounces, it returns to only 75% of its height before it was bounced. Your demo should take gravity into account, which will cause the bouncing ball to trace a parabolic path. Track down a "boing" sound (like a spring bouncing) and play the sound every time the ball hits the ground.
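The physics behind the animation can be checked separately from the graphics. The sketch below uses the exercise's height-based definition of restitution; because rebound height scales with the square of the launch speed (h = v²/2g), recovering a fraction r of the height means multiplying the impact speed by √r. The class and method names are illustrative.

```java
public class BounceHeights {
    // Successive rebound peak heights, where restitution is the fraction
    // of height recovered per bounce (0.75 means the ball rises to 75%
    // of its previous peak, as the exercise defines it).
    public static double[] peakHeights(double initialHeight, double restitution, int bounces) {
        double[] heights = new double[bounces + 1];
        heights[0] = initialHeight;
        for (int i = 1; i <= bounces; i++)
            heights[i] = heights[i - 1] * restitution;
        return heights;
    }

    // Vertical speed just after impact: since h = v^2 / (2g), recovering
    // restitution * h corresponds to multiplying the speed by sqrt(restitution).
    public static double reboundSpeed(double impactSpeed, double restitution) {
        return impactSpeed * Math.sqrt(restitution);
    }

    public static void main(String[] args) {
        for (double h : peakHeights(100.0, 0.75, 4))
            System.out.printf("%.2f%n", h);
    }
}
```

In the animation loop, the vertical position would follow y = y0 + vt - gt²/2 each frame, with reboundSpeed applied (and the "boing" clip played) whenever the ball reaches the ground.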

22.36 (Physics Demo: Kinetics) If you have taken physics, implement a Java program that will demonstrate concepts like energy, inertia, momentum, velocity, acceleration, friction, coefficient of restitution, gravity and others. Create visual effects and use audio for emphasis and realism.

22.37 (Project: Flight Simulator) Develop your own flight simulator Java program. This is a verychallenging project. It is also an excellent candidate for a contest with your classmates.

22.38 (Towers of Hanoi) Write an animated version of the Towers of Hanoi problem we presented in Exercise 6.37. As each disk is lifted off a peg or slid onto a peg, play a "whooshing" sound. As each disk lands on the pile, play a "clunking" sound. Play some appropriate background music.
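A convenient design is to separate the recursive solver from the animation: the solver records each move, and the animation consumes the move list, triggering the sounds where the comment indicates. A minimal sketch (names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class HanoiMoves {
    // Solve the puzzle and record each move as "disk: from -> to".
    public static List<String> solve(int disks, int from, int to, int spare) {
        List<String> moves = new ArrayList<>();
        move(disks, from, to, spare, moves);
        return moves;
    }

    private static void move(int n, int from, int to, int spare, List<String> moves) {
        if (n == 0) return;
        move(n - 1, from, spare, to, moves);      // clear the smaller disks
        moves.add(n + ": " + from + " -> " + to); // play "whoosh" then "clunk" here
        move(n - 1, spare, from, to, moves);      // restack the smaller disks
    }

    public static void main(String[] args) {
        solve(3, 1, 3, 2).forEach(System.out::println);
    }
}
```

For n disks the list always contains 2^n - 1 moves, a useful sanity check before wiring up the animation.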

22.39 (Tortoise and the Hare) Develop a multimedia version of the Tortoise and Hare simulation we presented in Exercise 7.41. You might record an announcer's voice calling the race: "The contenders are at the starting line." "And they're off!" "The Hare pulls out in front." "The Tortoise is coming on strong." etc. As the race proceeds, play the appropriate recorded audio. Play sounds to simulate the animals' running, and don't forget the crowd cheering! Do an animation of the animals racing up the side of the slippery mountain.

jhtp4_22.FM Page 1344 Monday, July 16, 2001 11:29 AM

Chapter 22 Java Media Framework and Java Sound (on CD) 1345

22.40 (Knight’s Tour Walker) Develop multimedia-based versions of the Knight’s Tour programs you wrote in Exercises 7.22 and 7.23.

22.41 (Pinball Machine) Here’s another contest problem. Develop a Java program that simulates a pinball machine of your own design. Have a contest with your classmates to develop the best original multimedia pinball machine. Use every possible multimedia trick you can think of to add “pizzazz” to your pinball game. Try to keep the game mechanisms close to those of real pinball games.

22.42 (Roulette) Study the rules for the game of roulette and implement a multimedia-based version of the game. Create an animated spinning roulette wheel. Use audio to simulate the sound of the ball jumping the various compartments that correspond to each of the numbers. Use audio to simulate the sound of the ball falling into its final slot. While the roulette wheel is spinning, allow multiple players to place their bets. When the ball lands in its final slot, you should update the bank accounts of each of the players with the appropriate wins or losses.

22.43 (Craps) Simulate the complete game of craps. Use a graphical representation of a craps table. Allow multiple players to place their bets. Use an animation of the player who is rolling the dice, and show the animated dice rolling eventually to a stop. Use audio to simulate some of the chatter around the craps table. After each roll, the system should update the bank accounts of each of the players depending on the bets they have made.

22.44 (Morse Code) Modify your solution to Exercise 10.26 to output the Morse code using audio clips. Use two different audio clips for the dot and dash characters in Morse code.
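The translation step can be sketched as below; a full solution would walk the resulting string and play the dot clip for each '.' and the dash clip for each '-'. The class name is illustrative, and only the letters A-Z are handled here.

```java
import java.util.HashMap;
import java.util.Map;

public class MorseCoder {
    private static final Map<Character, String> MORSE = new HashMap<>();
    static {
        // International Morse code for the letters A through Z, in order.
        String[] codes = { ".-", "-...", "-.-.", "-..", ".", "..-.", "--.",
            "....", "..", ".---", "-.-", ".-..", "--", "-.", "---", ".--.",
            "--.-", ".-.", "...", "-", "..-", "...-", ".--", "-..-", "-.--", "--.." };
        for (int i = 0; i < codes.length; i++)
            MORSE.put((char) ('A' + i), codes[i]);
    }

    // Translate text to Morse, one space between letters; the player would
    // then queue the dot clip for '.' and the dash clip for '-'.
    public static String encode(String text) {
        StringBuilder result = new StringBuilder();
        for (char c : text.toUpperCase().toCharArray())
            if (MORSE.containsKey(c))
                result.append(MORSE.get(c)).append(' ');
        return result.toString().trim();
    }

    public static void main(String[] args) {
        System.out.println(encode("SOS")); // ... --- ...
    }
}
```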
