
Creating a Stream Archive System in Macromedia MX

Chapter Description

Learn how to create a FlashCom application and client that enable two users to conduct an audio/videoconference, which is recorded by the FlashCom application.

Building The Retrieval Client

You've come a long way since the beginning of this chapter. You have created the Flash client movie, which publishes live streams from two participants, and you have developed the server-side ActionScript code necessary to record the streams and session information for each application instance. In this section, you will create the final element of the application: the retrieval client movie. This movie allows a user to view any previously recorded conference session, as you saw at the beginning of this chapter.

Creating the Interface

In the following steps, you will build the elements of the retrieval client interface. Most of this client's interface resembles the chat frame of the confRecord_100.fla document.

TO BUILD THE RETRIEVAL MOVIE INTERFACE:

  1. In Macromedia Flash MX, create a new document (File > New). Save this document as confRetrieve_100.fla.

  2. Change the document's width and height to match the recording client's dimensions. Choose Modify > Document, and specify a width of 425 px and a height of 300 px.

  3. Copy the heading and frame artwork from the confRecord_101.fla document into the confRetrieve_100.fla document. Keep the artwork and text on individual layers, as shown in Figure 13.22. Change the Static text to Two-Party Audio Video Conference :: Retrieval.

    Figure 13.22 The heading and frame layers in the new document for the retrieval client are replicated.


  4. Create a new layer named login_mc. On frame 1 of this layer, place an instance of the SimpleConnect component. Place the instance near the top-left corner of the stage (Figure 13.23). In the Property inspector, name the instance login_mc. In the Application Directory field, type the URI of the conference application on your FlashCom server. If the Flash movie and FlashCom application are hosted on the same server, type rtmp:/conference. Do not specify an application instance name in the URI.

    Figure 13.23 An instance of the SimpleConnect component is placed near the top-left corner of the stage.


  5. Create a new layer named connLight_mc. On frame 1 of this layer, place an instance of the ConnectionLight component. Name this instance connLight_mc in the Property inspector. Place the instance to the right of the login_mc instance.

  6. Make another layer, and name it confList_cb. On frame 1 of this layer, place an instance of the ComboBox component below the login_mc instance, as shown in Figure 13.24. This combo box displays the titles of every recorded session from the conference application. Name the new ComboBox instance confList_cb in the Property inspector. You may want to stretch the width of the instance with the Free Transform tool to accommodate long session titles.

    Figure 13.24 An instance of the ComboBox component is positioned below the login_mc instance.


  7. Add a new layer named textfields. On frame 1 of this layer, use the Text tool to add the text Choose a conference call in the list below: above the confList_cb instance (Figure 13.25).

    Figure 13.25 Descriptive Static text is added above the confList_cb instance.


  8. Now, you're ready to build the MovieClip objects that display the recorded streams from a session. Choose Insert > New Symbol (Ctrl+F8). Name the symbol speakerWin, choose the Movie Clip behavior, and click the OK button.

  9. On the speakerWin timeline, rename Layer 1 to userName_txt. On frame 1 of this layer, use the Text tool to create a Dynamic text field with an instance name of userName_txt. Disable the Show border option for the field. Place the top-left corner of the field at the registration point of the symbol (0, 0). See Figure 13.26.

    Figure 13.26 In the speakerWin symbol, a Dynamic text field named userName_txt is created.


  10. Open the Library panel (F11) for the confRetrieve_100.fla document. In the options menu (located at the top-right corner of the panel), choose New Video. A new Embedded Video symbol appears in the panel. On the speakerWin timeline, create a new layer named videoWin. On frame 1 of this layer, drag an instance of the new Embedded Video symbol to the stage. Place the instance below the userName_txt field. In the Property inspector, name the instance videoWin and change the Width and Height values to 120. (These are the dimensions of the stream recorded by the AVPresence component.) See Figure 13.27.

    Figure 13.27 An instance of the Embedded Video symbol is placed below the userName_txt instance, and the dimensions of the instance are changed in the Property inspector.


  11. Create a new layer named frame, and place this layer above the videoWin layer. On frame 1 of the frame layer, use the Rectangle tool to create a non-filled square above the videoWin instance. Assign the same dimensions to the square artwork in the Property inspector. Convert the artwork into a graphic symbol named frame (do not include the videoWin instance in the new symbol).

  12. Go back to the Main Timeline (that is, Scene 1) of the confRetrieve_100.fla document. Create a new layer named speaker_1_mc. On frame 1 of this layer, drag an instance of the speakerWin symbol from the Library panel to the stage. Place the instance on the left half of the stage, below the confList_cb instance. Name the new instance speaker_1_mc in the Property inspector (Figure 13.28).

    Figure 13.28 The speaker_1_mc instance is placed in the lower-left half of the stage.

  13. Create another layer and name it speaker_2_mc. Make a copy of the speaker_1_mc instance, and paste it in frame 1 of the speaker_2_mc layer. In the Property inspector, name the instance speaker_2_mc. Place the instance to the right of the speaker_1_mc instance.

  14. The last interface elements for the retrieval client are two Dynamic text field instances that display the current running time of the recording session and the overall length of the recording session. Using the Text tool, create a Dynamic text field instance named confTime_txt on frame 1 of the textfields layer. Disable the Show border option for this field. Place the instance to the right of the speaker_2_mc instance. Duplicate the confTime_txt instance, and place the new copy below the original. Rename this instance totalTime_txt in the Property inspector. (Figure 13.29)

    Figure 13.29 Two Dynamic text fields are created to display the elapsed time and total time of the conference session.

  15. The user interface for the retrieval client is now complete. Save the document. Before you can use the client, you will create the client-side ActionScript code that retrieves the data from the recorded sessions.


    The completed document, confRetrieve_100.fla, is located in the chapter_13 folder of this book's CD-ROM.

Connecting to the savedCalls Data

When the retrieval client loads and establishes a connection to the default instance of the conference application, the savedCalls data created during prior recording sessions is available. In the following steps, you will enable the retrieval client to access the session data stored with the savedCalls SharedObject.

TO CONNECT TO THE savedCalls DATA:

  1. Open the confRetrieve_100.fla document from the previous section, and save it as confRetrieve_101.fla.

  2. Create a new layer, and name it actions. Place this layer at the top of the layer stack.

  3. Construct an object that establishes a connection to the savedCalls data stored in the default instance of the conference application. This object should have a connect() method, so it can be easily integrated with the SimpleConnect component instance login_mc. Select frame 1 of the actions layer, and open the Actions panel (F9). Add the code in Listing 13.12 to the actions list:

    Listing 13.12 Frame 1 actions of the retrieval movie

    1. Stage.scaleMode = "noScale";
    2. initSO = {};
    3. initSO.connect = function(nc) {
    4.    savedCalls_so = SharedObject.getRemote("savedCalls", nc.uri, true);
    5.    savedCalls_so.onSync = function(list) {
    6.       for (var i in list) {
    7.          trace(i + ": name:" + list[i].name + " code:" + list[i].code);
    8.          var propName = list[i].name;
    9.          var itemLabel = this.data[propName].confTitle;
    10.         confList_cb.addItem(itemLabel, propName);
    11.      }
    12.    };
    13.    savedCalls_so.connect(nc);
    14.    confList_cb.setChangeHandler("openSession");
    15. };

    Line 1 sets the scaleMode of the Stage object to "noScale", preventing the movie from scaling beyond 100 percent in a Web browser or the stand-alone player.

    Line 2 creates the initSO object as an Object instance, and lines 3-15 define a connect() handler for the object. This handler will be invoked by the login_mc instance once a successful connection is made to the conference application. The argument nc represents the NetConnection object created by the login_mc instance.

    In line 4, the nc argument is used to specify the location of the savedCalls data for the savedCalls_so instance. Lines 5-12 define the onSync() handler for this instance. Whenever an update to the savedCalls data occurs, the for loop in lines 6-11 adds each property name (propName) of the savedCalls SharedObject as the data property of a new item to the confList_cb instance. The confTitle property from each conference session is used as the label property (itemLabel) of each new item as well.

    Line 13 connects the savedCalls_so instance to the conference application. Line 14 defines the change handler for the confList_cb instance. The change handler of a ComboBox instance is invoked when the user chooses an item in the drop-down menu. This handler, openSession(), will be discussed in the next section.
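    Because ActionScript 1 is ECMAScript-based, the onSync() pattern can be mocked in plain JavaScript outside Flash. In this sketch, the session property names and titles are hypothetical, and an items array stands in for the confList_cb ComboBox:

    ```javascript
    // Mock of the onSync() handler pattern from Listing 13.12. The data
    // object and info list below are hypothetical stand-ins for the remote
    // savedCalls SharedObject and the list argument FlashCom passes in.
    var data = {
      "room1_2003_06_15": { confTitle: "Weekly status call" },
      "room2_2003_06_16": { confTitle: "Design review" }
    };
    var items = []; // stands in for the confList_cb ComboBox

    function onSync(list) {
      // Each info object carries the changed property name and an event code.
      for (var i in list) {
        var propName = list[i].name;
        var itemLabel = data[propName].confTitle;
        items.push({ label: itemLabel, data: propName }); // addItem(label, data)
      }
    }

    onSync([
      { name: "room1_2003_06_15", code: "change" },
      { name: "room2_2003_06_16", code: "change" }
    ]);
    ```

    Storing the property name as each item's data value is what later lets openSession() look the session back up in savedCalls_so.data.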

  4. Add the names of the connLight_mc and initSO instances to the Communication Components list parameter of the login_mc instance. Select the login_mc instance. In the Property inspector, click the Communication Components field to open the Values dialog box. In this box, add the instance names connLight_mc and initSO (Figure 13.30).

    Figure 13.30 The instance names are added to the Communication Components list.


  5. Save the Flash MX document and test it (Control > Test Movie). When the movie loads and successfully connects to the conference application, the confList_cb instance populates with the titles from each recorded session (Figure 13.31).

    Figure 13.31 The confList_cb instance displays the confTitle property values stored in the savedCalls data.


    The completed version of this document, confRetrieve_101.fla, can be found in the chapter_13 folder of this book's CD-ROM.

Retrieving Each Stream's Data

After the titles and property names from the savedCalls data have been added to the confList_cb instance, the user can choose one of the sessions from the list. When this selection has occurred, the retrieval client needs to access the two other persistent remote SharedObjects, the time trackers, which were created during the recording session. In the following steps, you will add the client-side ActionScript code necessary to retrieve this data.

TO RETRIEVE THE TRACKING INFORMATION FOR EACH STREAM:

  1. In Macromedia Flash MX, open the confRetrieve_101.fla document created in the previous section and save it as confRetrieve_102.fla.

  2. Create a new layer and name it functions. Place this layer at the top of the layer stack.

  3. Define a function that can convert seconds into a time format of mm:ss, where mm represents minutes and ss represents seconds. This function is used to convert the running time and total time of each conference session to the mm:ss format. This time format is shown in the confTime_txt and totalTime_txt fields. Select frame 1 of this layer and open the Actions panel (F9). Add the code in Listing 13.13 to the actions list.

    Listing 13.13 calculateTime() function

    1. function calculateTime(seconds) {
    2.    var newTime = Math.floor(seconds);
    3.    var newMinutesExact = newTime/60;
    4.    var newMinutesWhole = Math.floor(newMinutesExact);
    5.    var newSeconds = Math.floor((newMinutesExact - newMinutesWhole)*60);
    6.    if (newSeconds<10)
    7.       newSeconds = "0" + newSeconds.toString();
    8.    if (newMinutesWhole<10)
    9.       newMinutesWhole = "0" + newMinutesWhole.toString();
    10.   return newMinutesWhole + ":" + newSeconds;
    11. }

    The calculateTime() function takes one argument, seconds. In line 2, the seconds value is rounded down. In lines 3 and 4, the number of minutes is determined. In line 5, the difference between newMinutesExact and newMinutesWhole is used to determine the number of seconds remaining. Lines 6-9 add a leading 0 to each value if the value is less than 10. Line 10 returns the values in the mm:ss format.
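    Because ActionScript 1 shares its core syntax with ECMAScript, the conversion logic from Listing 13.13 runs unchanged as JavaScript, which makes it easy to verify outside Flash:

    ```javascript
    // JavaScript re-expression of Listing 13.13; the logic is identical
    // to the ActionScript version, only the host environment differs.
    function calculateTime(seconds) {
      var newTime = Math.floor(seconds);
      var newMinutesExact = newTime / 60;
      var newMinutesWhole = Math.floor(newMinutesExact);
      var newSeconds = Math.floor((newMinutesExact - newMinutesWhole) * 60);
      // Pad single-digit values with a leading zero.
      if (newSeconds < 10) newSeconds = "0" + newSeconds.toString();
      if (newMinutesWhole < 10) newMinutesWhole = "0" + newMinutesWhole.toString();
      return newMinutesWhole + ":" + newSeconds;
    }

    console.log(calculateTime(125)); // 125 seconds -> "02:05"
    console.log(calculateTime(9));   // -> "00:09"
    ```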

  4. Define the openSession() function, which is used as the change handler of the confList_cb instance. This function determines which item in the confList_cb instance was selected by the user, retrieves the data associated with the item's corresponding recorded session, and sets up the NetStream instances, which will play the recorded streams. After the calculateTime() function declaration, add the code in Listing 13.14.

    Listing 13.14 openSession() function

    1. function openSession(obj) {
    2.    var item = obj.getSelectedItem();
    3.    var sessionObj = savedCalls_so.data[item.data];
    4.    _global.confLength = sessionObj.confLength;
    5.    totalTime_txt.text = "Length:\t" + calculateTime(confLength/1000);
    6.    var confDate = sessionObj.confDate;
    7.    var confInstance = sessionObj.confInstance;
    8.    var nc = login_mc.main_nc;
    9.    for(var i=1; i<=2; i++){
    10.      var speaker = "speaker_" + i;
    11.      var avNum = "av_" + i;
    12.      var basePath = confInstance + "_" + avNum + "_" + confDate;
    13.      this[speaker + "_ns"] = new NetStream(nc);
    14.      var speakerStream = this[speaker + "_ns"];
    15.      speakerStream.pathToStream = "callStreams/" + basePath;
    16.      speakerStream.pathToSO = basePath;
    17.      this[speaker + "_mc"].videoWin.attachVideo(speakerStream);
    18.      this[speaker + "_so"] = SharedObject.getRemote(speakerStream.pathToSO, nc.uri, true);
    19.      var speakerSO = this[speaker + "_so"];
    20.      speakerSO.num = i;
    21.      speakerSO.onSync = speakerSyncHandler;
    22.      speakerSO.connect(nc);
    23.   }
    24. }

    The openSession() function uses one argument, obj, which represents the confList_cb instance. In line 2, an item variable points to the current item that the user selected in the combo box. Each item in a ComboBox instance is an Object instance, with label and data properties. In line 3, the data property of the item object is used to retrieve the session information stored in the savedCalls SharedObject. The local variable sessionObj represents the same information stored in the server-side sessionObj instance created in the shutdown.asc document.

    In line 4, a global variable named confLength is created and assigned the value of the confLength property stored in the sessionObj instance. In line 5, the totalTime_txt field displays the total minutes and seconds of the recording session. The confLength value is stored in milliseconds; therefore, the value is divided by 1000 and sent to the calculateTime() function. In line 6, the date string, confDate, is retrieved from the sessionObj instance. Then, the instance name of the conference application that recorded the session is retrieved (line 7). The nc variable in line 8 points to the NetConnection object, login_mc.main_nc, of the SimpleConnect component. These values are used to create the persistent remote SharedObjects and NetStream instances, as shown in the for loop (lines 9-23). This for loop re-creates the objects similar to the server-side objects created during the recording session.

    Line 13 creates the NetStream instances, one for each potential stream recorded by the AVPresence components in the recording client. In lines 15 and 16, two vital properties are created for each NetStream instance: pathToStream and pathToSO. These values indicate the stream and remote SharedObject names associated with each recorded stream. Line 17 attaches the streams to the videoWin instances inside the speaker_1_mc and speaker_2_mc instances.

    Line 18 creates the SharedObject instances, speaker_1_so and speaker_2_so. These instances connect to the time tracker data created by the server-side av_1_timeTracker_so and av_2_timeTracker_so instances during the recording session.

    Line 20 sets a num property for each client-side SharedObject instance. This property indicates which stream number the instance represents. Line 21 defines the onSync() handler for each SharedObject instance. The handler uses a function named speakerSyncHandler(), which will be discussed in the next step.

    Line 22 establishes the actual connection to the SharedObject data within the default instance of the conference application.
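    The naming scheme built in lines 10-16 of Listing 13.14 can be isolated as a small helper. The following JavaScript sketch (the instance name and date string are hypothetical sample values) shows how each speaker's stream path and SharedObject name are derived:

    ```javascript
    // Sketch of the per-speaker naming scheme from Listing 13.14.
    // basePath = confInstance + "_" + avNum + "_" + confDate, where
    // avNum is "av_1" or "av_2".
    function buildStreamPaths(confInstance, confDate) {
      var paths = [];
      for (var i = 1; i <= 2; i++) {
        var basePath = confInstance + "_av_" + i + "_" + confDate;
        paths.push({
          pathToStream: "callStreams/" + basePath, // recorded stream name
          pathToSO: basePath                       // matching time-tracker SO
        });
      }
      return paths;
    }

    var p = buildStreamPaths("room1", "2003_06_15");
    // p[0].pathToStream -> "callStreams/room1_av_1_2003_06_15"
    ```

    Keeping the stream name and SharedObject name derived from the same basePath is what guarantees each recorded stream can be paired with its time-tracker data on retrieval.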

  5. Define the speakerSyncHandler() function, which is used as the onSync() handler for each stream's SharedObject instance. After the openSession() function declaration, add the code in Listing 13.15.

    Listing 13.15 speakerSyncHandler() function

    1. function speakerSyncHandler(list) {
    2.    var currentStream = _root["speaker_" + this.num + "_ns"];
    3.    for (var i in list) {
    4.       trace(i + ": name: " + list[i].name + " code: " + list[i].code);
    5.       if (list[i].name == "recordTimes" && list[i].code == "change") {
    6.          currentStream.syncList = this.data.recordTimes.slice(0);
    7.          startPresentation();
    8.          this.close();
    9.       }
    10.   }
    11.   if (this.data.recordTimes == null) {
    12.      currentStream.syncList = [];
    13.      startPresentation();
    14.      this.close();
    15.   }
    16. }

    When the connection to the SharedObject data for each instance is made (see line 22 of the openSession() function), the onSync() handler is invoked. In line 2, a reference named currentStream is created, pointing to the appropriate NetStream object. Each SharedObject instance is matched to a NetStream instance in the openSession() function.

    When the onSync() handler is invoked for the first time, the if condition in line 5 evaluates to true. At this time, each NetStream instance is assigned a syncList property, which is a copy of the recordTimes array created for each stream during the recording session.

    In line 7, a function named startPresentation is invoked. This function begins playback of each NetStream object. This function will be discussed in the next section.

    In line 8, the connection to the SharedObject data is closed. Once the recordTimes information has been retrieved, the connection is no longer necessary.

    Lines 11-15 are processed if either stream does not have any recordTimes data. This situation can occur if only one user logged into a conference session and published a stream. As you will learn in the next section, each NetStream instance must have a syncList property in order for a session to begin playback via the startPresentation() function. Line 12 assigns an empty array to the syncList property of the NetStream instance, and line 13 invokes the startPresentation() function. Line 14 then closes the connection to the SharedObject data.
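    One detail worth noting is the slice(0) call in line 6 of Listing 13.15: it produces a private copy of the recordTimes array, which matters because playStreams() later consumes syncList with shift(). A quick JavaScript illustration (the segment values are hypothetical):

    ```javascript
    // slice(0) returns a shallow copy, so shift() consumes the copy
    // while the original recordTimes array is left intact.
    var recordTimes = [
      { start: 0,    end: 4000, user: "joey" }, // hypothetical segments
      { start: 6500, end: 9000, user: "joey" }
    ];
    var syncList = recordTimes.slice(0);

    var first = syncList.shift(); // consume the first segment
    // recordTimes still holds both segments; syncList holds one.
    ```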

  6. Save the Flash MX document.

Playing a Recorded Session

You're approaching the final steps to complete the retrieval client. After each conference stream's recordTimes data has been accessed and assigned to its syncList property, all the data for the recorded session is available to begin playing the actual saved streams on the FlashCom server.

TO INITIATE PLAYBACK OF A RECORDED SESSION:

  1. Continue to work with the confRetrieve_102.fla document from the last section. Now, define the startPresentation() function, which is invoked by each SharedObject instance's onSync() handler after the recordTimes data has been loaded. Select frame 1 of the functions layer and open the Actions panel (F9). After the speakerSyncHandler() function declaration, add the following code:

    function startPresentation() {
       if (speaker_1_ns.syncList && speaker_2_ns.syncList) {
          startTime = getTimer();
          playID = setInterval(playStreams, 10);
       }
    }

    The if expression checks for the existence of the syncList property on each NetStream instance. If both properties have been set, the nested code is processed. A variable named startTime is initialized, using the current time of the Flash movie as returned by the getTimer() function. The startTime variable serves the same role as the application.startTime property in the application instance; startTime creates a reference point to determine how much time has elapsed. Then, a function named playStreams is invoked with the setInterval() function. The playStreams() function is discussed in the next step. The ID for this interval is playID, and the playStreams() function is invoked every 10 ms to constantly track the time of the conference session.

  2. Define the playStreams() function as shown in Listing 13.16, which constantly monitors the playback of the recorded streams and the elapsed time of the Flash retrieval client. After the startPresentation() function declaration, add the following code:

    Listing 13.16 playStreams() function

    1. function playStreams() {
    2.    var elapsedTime = getTimer() - startTime;
    3.    confTime_txt.text = "Time:\t" + calculateTime(elapsedTime/1000);
    4.    for (var i = 1; i <= 2; i++) {
    5.       var currentStream = _root["speaker_" + i + "_ns"];
    6.       var syncList = currentStream.syncList;
    7.       var activeStart = syncList[0].start;
    8.       var speakerWin = _root["speaker_" + i + "_mc"];
    9.
    10.      if (elapsedTime >= activeStart && syncList.length > 0) {
    11.         currentStream.activeSeek = syncList.shift();
    12.         currentStream.nextStop = currentStream.activeSeek.end;
    13.         currentStream.checkStop = true;
    14.
    15.         var userName = currentStream.activeSeek.user;
    16.         trace("userName = " + userName);
    17.         speakerWin.userName_txt.text = userName;
    18.
    19.         var seekTime = (!currentStream.nextSeek) ? 0 : currentStream.nextSeek;
    20.         trace("speaker_" + i + "_ns seekTime = " + seekTime);
    21.         currentStream.play(currentStream.pathToStream, seekTime);
    22.      }
    23.      if (currentStream.checkStop) {
    24.         if (elapsedTime >= currentStream.nextStop) {
    25.            trace("speaker_" + i + "_ns has paused or stopped.");
    26.            currentStream.checkStop = false;
    27.            currentStream.play(false);
    28.            var activeSeek = currentStream.activeSeek;
    29.            currentStream.nextSeek = (activeSeek.end - activeSeek.start) / 1000;
    30.         }
    31.      }
    32.   }
    33.   if (elapsedTime >= confLength) {
    34.      if(playID){
    35.         clearInterval(playID);
    36.         delete playID;
    37.      }
    38.      trace("---Conference session has finished playing.");
    39.   }
    40.   updateAfterEvent();
    41. }

    The purpose of the playStreams() function is to check the elapsed time of the conference playback and to control a speaker's stream whenever a start or stop time matches the elapsed time. In line 2, the elapsedTime variable is determined by subtracting the startTime value from the current time retrieved by getTimer(). In line 3, the confTime_txt field is updated with this new value, and the calculateTime() function is used to format this time correctly.

    Lines 4-32 use a for loop to compare the elapsedTime with the current start and stop values stored in the syncList array for each NetStream instance. In line 10, the elapsedTime value is compared to the activeStart value of the current stream. activeStart is determined in line 7, from the first index element of the current stream's syncList property.

    If the elapsedTime value is equal to or greater than the activeStart value and elements remain in the syncList array, lines 11-21 are processed. In line 11, the current first index element of the syncList array is removed and set equal to an activeSeek property of the currentStream instance. The activeSeek property contains an object with start, user, and end properties, as saved in the recordTimes array during the live conference recording. A property named nextStop is added to the currentStream instance, indicating the time value when the stream should stop playing. A checkStop property is also added to the currentStream instance, indicating that an end value (via nextStop) should be checked in line 23.

    In lines 15-17, the user property is retrieved from the current activeSeek property. This variable, userName, is then used to set the userName_txt field of the speakerWin instance (that is, either the speaker_1_mc or speaker_2_mc instance), which is determined in line 8.

    In lines 19-21, playback of the recorded stream begins with a seek time determined by the existence (and value) of the NetStream instance's nextSeek property, which is set later in line 29. When the first recorded segment plays, the nextSeek property does not exist and seekTime is set to 0, which indicates the beginning of the stream. In line 21, the play() method is invoked on the NetStream instance, using the instance's pathToStream property as the name of the stream to play.

    Lines 23-31 handle the detection of the stop time for each recording segment. If the checkStop property of the current NetStream instance has been set to true (as shown in line 13), the if statement in line 24 compares the elapsedTime value to the nextStop value of the NetStream instance. If elapsedTime is greater than or equal to the value of nextStop, lines 25-29 are processed. Here, the checkStop property is set to false (line 26), the stream stops playing (line 27), and the nextSeek point is calculated (lines 28 and 29).

    Lines 33-39 are invoked if the elapsedTime value is greater than or equal to the length of the conference session, confLength. If this occurs, the setInterval() ID is cleared and deleted (lines 35 and 36).

    The updateAfterEvent() function in line 40 allows faster execution of the setInterval() function and improves the time accuracy of the playStreams() function.
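    The play/stop decisions made by playStreams() can be simulated without Flash. This JavaScript sketch mirrors the scheduling logic of Listing 13.16 for a single stream, replacing the NetStream calls with a log so the decisions can be inspected (the segment times are hypothetical):

    ```javascript
    // Flash-free simulation of the scheduler in Listing 13.16 for one
    // stream. Property names mirror the listing; play()/stop calls are
    // replaced by log entries.
    function makeStream(syncList) {
      return { syncList: syncList.slice(0), checkStop: false, log: [] };
    }

    function step(stream, elapsedTime) {
      // Start a segment once the elapsed time reaches its start value.
      if (stream.syncList.length > 0 && elapsedTime >= stream.syncList[0].start) {
        stream.activeSeek = stream.syncList.shift();
        stream.nextStop = stream.activeSeek.end;
        stream.checkStop = true;
        var seekTime = (!stream.nextSeek) ? 0 : stream.nextSeek;
        stream.log.push("play from " + seekTime + "s");
      }
      // Stop the segment once the elapsed time reaches its end value,
      // and remember how far into the recorded stream to seek next.
      if (stream.checkStop && elapsedTime >= stream.nextStop) {
        stream.checkStop = false;
        stream.log.push("stop");
        stream.nextSeek = (stream.activeSeek.end - stream.activeSeek.start) / 1000;
      }
    }

    var s = makeStream([
      { start: 0,    end: 2000, user: "joey" }, // hypothetical segments
      { start: 5000, end: 7000, user: "joey" }
    ]);
    step(s, 0);    // first segment begins: play from 0s
    step(s, 2000); // first segment ends: stop; nextSeek becomes 2s
    step(s, 5000); // second segment begins: play from 2s into the stream
    ```

    The simulation makes the seek arithmetic visible: because a stopped stream's playhead does not advance, each new segment must seek past the total duration already played, which is exactly what the nextSeek calculation provides.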

  3. Now, you need to create a function, as shown in Listing 13.17, that checks whether a conference session has already been retrieved and is still playing when the openSession() function is invoked by the confList_cb instance.

    Listing 13.17 checkPlayback() function

    function checkPlayback(){
       if(playID || startTime != null){
          clearInterval(playID);
          startTime = null;
          for(var i = 1; i<=2; i++){
             var speaker = "speaker_" + i;
             delete _root[speaker + "_so"];
             with(_root[speaker + "_mc"]){
                videoWin.clear();
                userName_txt.text = "";
             }
             _root[speaker +"_ns"].close();
             delete _root[speaker + "_ns"];
          }
       }
    }

    If the playID variable exists or if startTime has a value other than null, the interval ID is cleared, and the startTime is set to null. Each speaker SharedObject is deleted, and each speakerWin instance is cleared. The NetStream instances are also closed and deleted.

  4. Invoke the checkPlayback() function at the start of the openSession() function, as shown in the following code. Note that only the first few lines of the function are shown; the rest of the function remains the same.

    function openSession(obj) { 
    
       checkPlayback();
       var item = obj.getSelectedItem();
       ... 
  5. Save the Flash document. You're now ready to test the Flash retrieval client. Choose File > Publish Settings, and in the Formats tab, clear the Use default names check box. Specify a Flash movie name of confRetrieve.swf and an HTML name of confRetrieve.html. Click OK. Now, choose File > Publish Preview > Default - (HTML). In the Web browser window, the retrieval client should show the titles of each recorded session. (You may want to run the record client a few times to add more recorded sessions.) Choose one of the sessions in the combo box. As soon as you release the mouse button after making the choice, the confTime_txt field begins to display the elapsed time of the session. One of the streams should begin to play in a speakerWin instance (Figure 13.32). When the session finishes, the confTime_txt field stops updating the elapsed time.

    Figure 13.32 Recorded streams from an earlier session can now be viewed with the retrieval client.


    You can find the confRetrieve_102.fla document in the chapter_13 folder of this book's CD-ROM.

As you have seen, the development of this application was not an easy or short task, despite the simplicity of some of the features as seen from the user's point of view. You now have an understanding of how detailed client-side and server-side operations complement one another to create a fully functional application.
