Images, Bitmaps, Videos, Sounds: Chapter 8 - Flex 3 Cookbook

Its parameters are as follows:

sourceBitmapData:BitmapData
The input bitmap image to use. The source image can be a different BitmapData object or the current BitmapData object.
sourceRect:Rectangle
A rectangle that defines the area of the source image to use as input.
destPoint:Point
The point within the destination image (the current BitmapData instance) that corresponds to the upper-left corner of the source rectangle.
redMultiplier:uint
A uint value, from 0 to 0x100 (256), by which to multiply the red channel value.
greenMultiplier:uint
A uint value, from 0 to 0x100 (256), by which to multiply the green channel value.
blueMultiplier:uint
A uint value, from 0 to 0x100 (256), by which to multiply the blue channel value.
alphaMultiplier:uint
A uint value, from 0 to 0x100 (256), by which to multiply the alpha transparency value.
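These multipliers blend the source and destination pixels channel by channel. A minimal sketch of the blend arithmetic for a single red-channel value, as described for BitmapData.merge() (the sample values here are hypothetical):

```actionscript
// new red = ((source red * redMultiplier) +
//            (destination red * (256 - redMultiplier))) / 256
var sourceRed:uint = 200;
var destRed:uint = 100;
var redMultiplier:uint = 0x80; // 128: an even 50/50 blend
var mergedRed:uint = ((sourceRed * redMultiplier) +
                      (destRed * (256 - redMultiplier))) / 256; // 150
```

A multiplier of 0x100 keeps the source pixel's channel entirely; 0 keeps the destination's.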

A complete code listing follows with modifiable controls to alter the values of the ColorTransform:

<mx:VBox xmlns:mx="http://www.adobe.com/2006/mxml" width="500" height="550" 
creationComplete="imgMod()">
    <mx:Script>
        <![CDATA[
            import mx.core.BitmapAsset;
            import mx.controls.Image;


            [Embed(source="../assets/bigshakey.png")]
            private var shakey:Class;

            [Embed(source="../assets/mao.jpg")]
            private var mao:Class;

            //superimpose the two images together
            //using the vslider data
            private function imgMod():void
            {
                var maoData:BitmapData = new BitmapData(firstImg.width, 
firstImg.height);
                var shakeyData:BitmapData = new BitmapData(secondImg.width, 
secondImg.height);
                maoData.draw(firstImg);
                shakeyData.draw(secondImg);
                maoData.colorTransform(new Rectangle(0, 0, maoData.width, 
maoData.height), new ColorTransform(redSlider.value/10, greenSlider.value/10,
blueSlider.value/10,alphaSlider.value/10));
                //map the slider's 0-10 range onto the 0-256 multiplier range
                var red:uint = uint((redSlider.value / 10) * 256);
                var green:uint = uint((greenSlider.value / 10) * 256);
                var blue:uint = uint((blueSlider.value / 10) * 256);
                var alpha:uint = uint((alphaSlider.value / 10) * 256);
                shakeyData.merge(maoData, new Rectangle(0, 0, shakeyData.width, 
shakeyData.height), new Point(0, 0), red, green, blue, alpha);
                mainImg.source = new BitmapAsset(shakeyData);
            }

        ]]>
    </mx:Script>
    <mx:HBox>
        <mx:Image id="firstImg" source="{mao}" height="200" width="200"/>
        <mx:Image id="secondImg" source="{shakey}" height="200" width="200"/>
    </mx:HBox>
    <mx:HBox>
        <mx:Text text="Red"/>
        <mx:VSlider height="100" id="redSlider" value="5.0" change="imgMod()"/>
        <mx:Text text="Blue"/>
        <mx:VSlider height="100" id="blueSlider" value="5.0" change="imgMod()"/>
        <mx:Text text="Green"/>
        <mx:VSlider height="100" id="greenSlider" value="5.0" change="imgMod()"/>
        <mx:Text text="Alpha"/>
        <mx:VSlider height="100" id="alphaSlider" value="5.0" change="imgMod()"/>
    </mx:HBox>
    <mx:Image id="mainImg"/>
</mx:VBox>

Section 8.6: Apply a Convolution Filter to an Image

Problem

You want to allow users to alter the colors, contrast, or sharpness of an image.

Solution

Create an instance of a ConvolutionFilter and bind the properties of the matrix within the ConvolutionFilter to text inputs that the user can alter. Then push the filter onto the image's filters array to apply the filter.

Discussion

ConvolutionFilter is one of the most versatile and complex filters in the flash.filters package. It can be used to emboss, detect edges, sharpen, blur, and perform many other effects. The effect is controlled by a matrix, commonly three by three, that is passed to the filter's constructor as an array of numeric values. Conceptually, the ConvolutionFilter goes through each pixel in the source image one by one and determines the final color of that pixel by using the value of the pixel and its surrounding pixels; the matrix indicates to what degree the value of each neighboring pixel affects the final result. The constructor is shown here:

ConvolutionFilter(matrixX:Number = 0, matrixY:Number = 0, matrix:Array = null, 
divisor:Number = 1.0, bias:Number = 0.0, preserveAlpha:Boolean = true,
clamp:Boolean = true, color:uint = 0, alpha:Number = 0.0)

Its parameters are as follows:

matrixX:Number (default = 0)
The x dimension of the matrix (the number of columns in the matrix). The default value is 0.
matrixY:Number (default = 0)
The y dimension of the matrix (the number of rows in the matrix). The default value is 0.
matrix:Array (default = null)
The array of values used for matrix transformation. The number of items in the array must equal matrixX * matrixY.
divisor:Number (default = 1.0)
The divisor used during matrix transformation. The default value is 1. A divisor that is the sum of all the matrix values evens out the overall color intensity of the result. A value of 0 is ignored and the default is used instead.
bias:Number (default = 0.0)
The bias to add to the result of the matrix transformation. The default value is 0.
preserveAlpha:Boolean (default = true)
A value of false indicates that the alpha value is not preserved and that the convolution applies to all channels, including the alpha channel. A value of true indicates that the convolution applies only to the color channels. The default value is true.
clamp:Boolean (default = true)
For pixels that are off the source image, a value of true indicates that the input image is extended along each of its borders as necessary by duplicating the color values at the given edge of the input image. A value of false indicates another color should be used, as specified in the color and alpha properties. The default is true.
color:uint (default = 0)
The hexadecimal color to substitute for pixels that are off the source image.
alpha:Number (default = 0.0)
The alpha of the substitute color.

Some common effects for the ConvolutionFilter are as follows:

new ConvolutionFilter(3,3,new Array(-5,0,1,1,-2,3,-1,2,1),1)
Creates an edge-detected image, where only areas of greatest contrast remain.
new ConvolutionFilter(3,3,new Array(0,20,0,20,-80,20,0,20,0),10)
Creates a black-and-white outline.
new ConvolutionFilter(5,5,new Array(0,1,2,1,0,1,2,4,2,1,2,4,8,4,2,1,2,4,2, 1,0,1,2,1,0),50);
Creates a blur effect.
new ConvolutionFilter(3,3,new Array(-2,-1,0,-1,1,1,0,1,2),0);
Creates an emboss effect.
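To make the matrix concrete, here is a sketch of the per-pixel arithmetic the filter performs for a single color channel (the neighborhood values are hypothetical):

```actionscript
// red-channel values of a pixel and its eight neighbors,
// listed row by row, with the center pixel in the middle
var neighborhood:Array = [10, 20, 10,
                          20, 40, 20,
                          10, 20, 10];
var matrix:Array = [0, 1, 0,
                    1, 4, 1,
                    0, 1, 0]; // a mild blur kernel
var divisor:Number = 8;
var bias:Number = 0;
var sum:Number = 0;
for (var i:int = 0; i < matrix.length; i++)
{
    sum += neighborhood[i] * matrix[i];
}
var result:Number = sum / divisor + bias; // 30
```

The filter clamps each resulting channel value to the 0 to 255 range before writing it to the destination pixel.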

The complete code listing is shown here:

<mx:VBox xmlns:mx="http://www.adobe.com/2006/mxml" width="450" height="550">
    <mx:Script>
        <![CDATA[
            import mx.core.BitmapAsset;

            [Embed(source="../assets/mao.jpg")]
            private var mao:Class;

            private function convolve():void
            {
                var asset:BitmapAsset = new mao() as BitmapAsset;
                //the number of matrix values must equal matrixX * matrixY,
                //so set the sliders to match the six inputs (for example, 3 x 2)
                var matrix:Array = [Number(input1.text), Number(input2.text),
                    Number(input3.text), Number(input4.text), Number(input5.text),
                    Number(input6.text)];
                var convolution:ConvolutionFilter = new ConvolutionFilter(
                    matrixXSlider.value, matrixYSlider.value, matrix,
                    divisorSlider.value, biasSlider.value, true);
                asset.filters = [convolution];
                img.source = asset;
            }

        ]]>
    </mx:Script>
    <mx:Button click="convolve()" label="convolve away"/>
    <mx:HBox>
        <mx:Text text="Matrix X"/>
        <mx:VSlider height="100" id="matrixXSlider" value="5.0" change="convolve()"/>
        <mx:Text text="Matrix Y"/>
        <mx:VSlider height="100" id="matrixYSlider" value="5.0" change="convolve()"/>
        <mx:Text text="Divisor"/>
        <mx:VSlider height="100" id="divisorSlider" value="5.0" change="convolve()"/>
        <mx:Text text="Bias"/>
        <mx:VSlider height="100" id="biasSlider" value="5.0" change="convolve()"/>
        <mx:VBox>
            <mx:TextInput id="input1" change="convolve()" width="40"/>
            <mx:TextInput id="input2" change="convolve()" width="40"/>
            <mx:TextInput id="input3" change="convolve()" width="40"/>
            <mx:TextInput id="input4" change="convolve()" width="40"/>
            <mx:TextInput id="input5" change="convolve()" width="40"/>
            <mx:TextInput id="input6" change="convolve()" width="40"/>
        </mx:VBox>
    </mx:HBox>
    <mx:Image id="img"/>
</mx:VBox>

Section 8.7: Send Video to an FMS Instance via a Camera

Problem

You want to send a stream from the user's camera to a Flash Media Server (FMS) instance for use in a chat or other live media application.

Solution

Capture the user's camera stream by using the flash.media.Camera.getCamera method and then attach that camera to a NetStream that will be sent to the Flash Media Server instance. Use the publish method of the NetStream class to send the stream with a specified name to the application that will handle it.

Discussion

The publish method indicates to a Flash Media Server instance, connected to via the NetConnection class, that the NetStream will be sending information to the server. What the server does with that information depends on the application, but flags set in the publish method indicate to the server and the Flash Player what should be done with the streamed information. The publish method has the following signature:

publish(name:String = null, type:String = null):void

Its parameters are as follows:

name:String (default = null)
A string that identifies the stream. If you pass false, the publish operation stops. Clients that subscribe to this stream must pass this same name when they call NetStream.play.
type:String (default = null)
A string that specifies how to publish the stream. Valid values are record, append, and live (the default). If you pass record, Flash Player publishes and records live data, saving the recorded data to a new FLV file with a name matching the value passed to the name parameter. The file is stored on the server in a subdirectory within the directory that contains the server application. If the file already exists, it is overwritten. If you pass append, Flash Player publishes and records live data, appending the recorded data to an FLV file with a name that matches the value passed to the name parameter, stored on the server in a subdirectory within the directory that contains the server application. If no file with a name matching the name parameter is found, a file is created. If you omit this parameter or pass live, Flash Player publishes live data without recording it. If a file with a name that matches the value passed to the name parameter exists, the file is deleted.

When you record a stream by using Flash Media Server, the server creates an FLV file and stores it in a subdirectory in the application's directory on the server. Each stream is stored in a directory whose name matches the application instance name passed to NetConnection.connect. The server creates these directories automatically; you don't have to create one for each application instance. For example, the following code shows how you would connect to a specific instance of an application stored in a directory named lectureSeries in your application's directory. A file named lecture.flv is stored in a subdirectory named /yourAppsFolder/lectureSeries/streams/Monday:

  var myNC:NetConnection = new NetConnection();
  myNC.connect("rtmp://server.domain.com/lectureSeries/Monday");
  var myNS:NetStream = new NetStream(myNC);
  myNS.publish("lecture", "record");

If you don't pass a value for the instance name, the stream is stored in a subdirectory named /yourAppsFolder/appName/streams/_definst_ (for default instance).

This method can dispatch a netStatus event with several different information objects. For example, if someone is already publishing on a stream with the specified name, the netStatus event is dispatched with a code property of NetStream.Publish.BadName. For more information, see the netStatus event.
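A sketch of checking for that code (assuming ns is the publishing NetStream from the snippet above):

```actionscript
ns.addEventListener(NetStatusEvent.NET_STATUS,
    function(event:NetStatusEvent):void
    {
        if(event.info.code == "NetStream.Publish.BadName")
        {
            trace("another client is already publishing under this name");
        }
    });
```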

In the following example, the connection to the server is established, and the data from the camera is streamed to the server:

<mx:Canvas xmlns:mx="http://www.adobe.com/2006/mxml" width="400" height="500" 
creationComplete="setUpCam()">
    <mx:Script>
        <![CDATA[

            private var cam:Camera;
            private var nc:NetConnection;
            private var ns:NetStream;

            private function setUpCam():void
            {
                trace(Camera.names.join(","));
                //I'm doing this only because it's the only way the
                //flash player will pick up the camera on my MacBook
                cam = flash.media.Camera.getCamera("2");
                vid.attachCamera(cam);
                nc = new NetConnection();
                nc.addEventListener(NetStatusEvent.NET_STATUS, netStatus);
                //Flash Media Server connections use the RTMP protocol
                nc.connect("rtmp://localhost:3002");
            }

            private function netStatus(event:NetStatusEvent):void
            {
                switch(event.info.code)
                {
                    case "NetConnection.Connect.Success":
                        ns = new NetStream(nc);
                        ns.attachCamera(cam);
                        ns.attachAudio(Microphone.getMicrophone());
                        ns.publish("appname", "live");
                        break;
                }
            }

        ]]>
    </mx:Script>
    <mx:VideoDisplay id="vid" width="360" height="320"/>
</mx:Canvas>

Section 8.8: Access a User's Microphone and Create a Sound Display

Problem

You want to access a user's microphone and use the sound level of the microphone to draw a sound level.

Solution

Access the microphone by using the Microphone.getMicrophone method. Then read the sound level that the microphone detects by polling the activityLevel property of the Microphone class on a regular interval.

Discussion

The Microphone class provides access to a user's microphone, and the user must allow the Flash Player application access before you can use the class. The Microphone class reports the level of sound that the microphone is detecting, and dispatches events when sound begins and when there has not been any sound for a given period of time.

Three properties of the Microphone class monitor and control the detection of activity. The read-only activityLevel property indicates the amount of sound the microphone is detecting, on a scale from 0 to 100. The silenceLevel property specifies the amount of sound needed to activate the microphone and dispatch an ActivityEvent.ACTIVITY event. The silenceLevel property also uses a scale from 0 to 100, and the default value is 10. The silenceTimeout property describes the number of milliseconds that the activity level must stay below the silence level before an ActivityEvent.ACTIVITY event is dispatched to indicate that the microphone is now silent. The default silenceTimeout value is 2000. Although both Microphone.silenceLevel and Microphone.silenceTimeout are read-only, you can change their values by using the Microphone.setSilenceLevel() method.
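For example, to require a louder signal and a longer quiet period before the microphone is considered silent (the values here are arbitrary):

```actionscript
// require an activity level of at least 25 (of 100) to activate the
// microphone, and 5 seconds of quiet before it is considered silent
mic.setSilenceLevel(25, 5000);
```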

The following example creates a Microphone object, which prompts the user to allow or deny the Flash Player application access to the microphone. Then, after microphone activity is detected via the ActivityEvent.ACTIVITY event, an enter-frame event listener is added that draws the microphone's activityLevel into a Canvas.

<mx:VBox xmlns:mx="http://www.adobe.com/2006/mxml" width="400" height="300" 
creationComplete="createMic()">

<mx:Script>
    <![CDATA[
    import flash.media.Microphone;
    import flash.events.ActivityEvent;
    import flash.events.Event;
    import flash.events.StatusEvent;

      public var mic:Microphone;

      public function createMic():void
      {
        mic = Microphone.getMicrophone();
        mic.setLoopBack(true);
        mic.addEventListener(ActivityEvent.ACTIVITY, activity);
        mic.addEventListener(StatusEvent.STATUS, status);
        mic.addEventListener(Event.ACTIVATE, active);
      }

      private function active(event:Event):void
      {
        trace(' active ');
      }

      private function status(event:StatusEvent):void
      {
        trace("status");
      }

      private function activity(event:ActivityEvent):void
      {
        trace("active ");
        addEventListener(Event.ENTER_FRAME, showMicLevel);
      }

      private function showMicLevel(event:Event):void
      {
        trace(mic.gain+" "+mic.activityLevel+" "+mic.silenceLevel+" "+mic.rate);
        level.graphics.clear();
        level.graphics.beginFill(0xccccff, 1);
        //scale the 0-100 activity level to the 300-pixel-wide canvas
        level.graphics.drawRect(0, 0, (mic.activityLevel * 3), 50);
        level.graphics.endFill();
      }

    ]]>
</mx:Script>
<mx:Canvas width="300" height="50" id="level"/>
</mx:VBox>

Section 8.9: Smooth Video Displayed in a Flex Application

Problem

You need to control the smoothing of a video that is played back in an application.

Solution

Create a custom component that contains the flash.media.Video component, and then set Video's smoothing property to true.

Discussion

To smooth video, that is, to make the video look less pixelated, you need to access the flash.media.Video object. Video smoothing, like image smoothing, requires more processing power than unsmoothed playback and can slow playback for large or extremely high-quality videos.

The Flex VideoDisplay component does not allow you to set the smoothing property of the flash.media.Video object that it contains, so you must create a separate component that adds the lower-level Flash Video component and set the smoothing property:

<mx:Canvas xmlns:mx="http://www.adobe.com/2006/mxml" width="400" height="300" 
creationComplete="setup()">
    <mx:Script>
        <![CDATA[

            private var vid:Video;

            private var nc:NetConnection;
            private var ns:NetStream;
            private var metaDataObj:Object = {};

            private function setup():void {
                vid = new Video(this.width, this.height);
                vid.smoothing = true;
                this.rawChildren.addChild(vid);
                vid.y = 50;
                this.invalidateDisplayList();
            }

            private function startVid():void {
                nc = new NetConnection();
                nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
                nc.connect(null);
            }

            private function netStatusHandler(event:NetStatusEvent):void {
                if(event.info.code != "NetConnection.Connect.Success") {
                    return;
                }
                ns = new NetStream(nc);
                metaDataObj.onMetaData = this.onMetaData;
                ns.client = metaDataObj;
                vid.attachNetStream(ns);
                ns.play("http://localhost:3001/Trailer.flv");
            }

            private function onMetaData(obj:Object):void 
            {
                for(var prop:String in obj)
                {
                    trace(prop + "  :  " + obj[prop]);
                }
                trace(obj.duration+" "+obj.framerate+" "+obj.bitrate);
            }

        ]]>
    </mx:Script>
    <mx:Button  click="startVid()" label="load" x="50"/>
    <mx:Button click="ns.resume()" label="resume" x="120"/>
    <mx:Button click="ns.pause()" label="pause" x="190"/>
</mx:Canvas>

Section 8.10: Check Pixel-Level Collisions

Problem

You need to check whether images with alpha transparency regions are colliding with other images.

Solution

Draw the data of both images to a BitmapData object and use the BitmapData.hitTest method.

Discussion

The BitmapData object possesses a hitTest method that works similarly to the hitTest method defined by DisplayObject, with one notable exception: whereas DisplayObject's hitTest method returns true if the given point intersects the bounds of the object, BitmapData's hitTest method returns true only if the pixel at the given point is above a certain threshold of alpha transparency. The signature of the method is shown here:

public function hitTest(firstPoint:Point, firstAlphaThreshold:uint, 
secondObject:Object, secondBitmapDataPoint:Point = null, 
secondAlphaThreshold:uint = 1):Boolean

If an image is opaque, it is considered a fully opaque rectangle for this method. Both images must be transparent to perform pixel-level hit testing that considers transparency. When you are testing two transparent images, the alpha threshold parameters control what alpha channel values, from 0 to 255, are considered opaque. The method's parameters are as follows:

firstPoint:Point
The position of the upper-left corner of the BitmapData image in an arbitrary coordinate space. The same coordinate space is used in defining the secondBitmapDataPoint parameter.
firstAlphaThreshold:uint
The highest alpha channel value that is considered opaque for this hit test.
secondObject:Object
A Rectangle, Point, Bitmap, or BitmapData object.
secondBitmapDataPoint:Point (default = null)
A point that defines a pixel location in the second BitmapData object. Use this parameter only when the value of secondObject is a BitmapData object.
secondAlphaThreshold:uint (default = 1)
The highest alpha channel value that's considered opaque in the second BitmapData object. Use this parameter only when the value of secondObject is a BitmapData object and both BitmapData objects are transparent.
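As a minimal sketch of a BitmapData-to-BitmapData test (the sizes, positions, and thresholds here are arbitrary):

```actionscript
var bmpA:BitmapData = new BitmapData(100, 100, true, 0xFF990000);
var bmpB:BitmapData = new BitmapData(100, 100, true, 0x00000000);
//both points are expressed in the same arbitrary coordinate space;
//here bmpB is imagined at (50, 50) relative to bmpA at (0, 0)
var hit:Boolean = bmpA.hitTest(new Point(0, 0), 0x80,
                               bmpB, new Point(50, 50), 0x80);
trace(hit); //false: bmpB is fully transparent at every pixel
```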

In the following code sample, each corner of a rectangular image is checked for collisions against a PNG file with alpha transparency:

<mx:Canvas xmlns:mx="http://www.adobe.com/2006/mxml" width="1500" height="900">
    <mx:Script>
        <![CDATA[
            import flash.display.BlendMode;

            private var mainBmp:BitmapData;
            private var dragBmp:BitmapData;
            private var hasDrawn:Boolean = false;

            private function loaded():void{
                if(!hasDrawn){
                    mainBmp = new BitmapData(mainImg.width, mainImg.height, 
true, 0x00000000);
                    dragBmp = new BitmapData(dragImg.width, dragImg.height, 
true, 0x00000000);
                    hasDrawn = true;
                    this.addEventListener(Event.ENTER_FRAME, showHits);
                }
            }

            private function showHits(event:Event):void
            {
                mainBmp.draw(mainImg);
                dragBmp.draw(dragImg);
                if(mainBmp.hitTest(new Point(0,0), 0xff, dragImg.getBounds(this).
topLeft)){
                    trace(" true ");
                    return;
                }
                if(mainBmp.hitTest(new Point(0,0), 0xff, dragImg.getBounds(this).
bottomRight)){
                    trace(" true ");
                    return;
                }
                if(mainBmp.hitTest(new Point(0,0), 0xff, new Point(dragImg.getBounds
(this).left, dragImg.getBounds(this).bottom))){
                    trace(" true ");
                    return;
                }
                if(mainBmp.hitTest(new Point(0,0), 0xff, new Point(dragImg.getBounds
(this).right, dragImg.getBounds(this).top))){
                    trace(" true ");
                    return;
                }
                trace(" false ");
            }

        ]]>
    </mx:Script>
    <mx:Image id="mainImg" source="../assets/alphapng.png" cacheAsBitmap="true"/>
    <mx:Image cacheAsBitmap="true" id="dragImg" mouseDown="dragImg.startDrag(false, 
this.getBounds(stage)); loaded()" rollOut="dragImg.stopDrag()"
mouseUp="dragImg.stopDrag()" source="../assets/bigshakey.png"/>

</mx:Canvas>
