Saturday, February 12, 2011

Taking a picture - Base64 encoding in JavaMe

It's been a while since my last post; January was a busy month, but here I am again.

In this post, we are going to do something really interesting: take a snapshot with the phone's camera and send it to the server in Base64 format.

If you haven't already, you should read the post about Base64 encoding and decoding in JavaMe (J2ME).


Taking the snapshot is quite easy; you just have to be careful about blocking calls. The real problem comes when uploading the image to the server. I tried several scenarios using Glassfish 3.0.1 and Tomcat 6.0.32, and none of them worked on both servers... It was very frustrating, because the code seemed fine, yet each version worked on Glassfish or on Tomcat, but never on both.

Let's start with the general code, assume the following is inside the MIDlet class:

  
     //inside MIDlet class...

     /** Commands to control the application flow*/
     private Command commandShoot = null;
     private Command commandExit = null;
     private Command commandSend = null;
     private Command commandBack = null;
     private Display display = null;

     /** flags to indicate whether the phone 
      *  supports png or jpeg encoding for snapshots
      */
     private boolean png = false;
     private boolean jpeg = false;

     /** Canvas where the video is going to be played*/
     private VideoCanvas videoCanvas = null;

     /** Canvas where the snapshot is going to be shown*/
     private ImageCanvas imgCanvas = null;


     public void startApp() {
        display = Display.getDisplay(this);

        commandExit = new Command("Exit", Command.EXIT, 0);
        commandShoot = new Command("Shoot", Command.SCREEN, 0);
        commandSend = new Command("Send", Command.SCREEN, 0);
        commandBack = new Command("Back", Command.BACK, 0);

        //Verifies if the application can use the camera
        //and encoding
        if (checkCameraSupport() && 
            checkSnapShotEncodingSupport()) {

            //this is the canvas where the video will render
            videoCanvas = new VideoCanvas(this);
            videoCanvas.addCommand(commandExit);
            videoCanvas.addCommand(commandShoot);
            videoCanvas.setCommandListener(this);

            display.setCurrent(videoCanvas);
        } else {
            System.out.println("NO camera support");
        }
    }

    /**
     * Checks whether the phone has camera support
     * @return true if and only if the phone has camera
     * support
     */
    private boolean checkCameraSupport() {
        String propValue = System.getProperty
                           ("supports.video.capture");
        return (propValue != null) && propValue.equals("true");
    }
    
    /**
     * Checks if the phone has png or jpeg support.
     * @return true if and only if the phone supports
     * png or jpeg encoding
     */
    private boolean checkSnapShotEncodingSupport() {
        String encodings = System.getProperty
                           ("video.snapshot.encodings");
        png = (encodings != null) 
              && (encodings.indexOf("png") != -1);

        jpeg = (encodings != null) 
               && (encodings.indexOf("jpeg") != -1);

        return png || jpeg;
    }

The previous methods tell you whether the phone has camera support and whether it can encode snapshots in either png or jpeg format. The +startApp():void method starts the application, checks for camera support, and shows the canvas where the video will render.
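
On a real device, the `video.snapshot.encodings` property is typically a space-separated list such as `encoding=png encoding=jpeg ...` (the exact value is implementation dependent). The indexOf checks above can be exercised in plain Java; this is just a standalone sketch with a hypothetical property value, not device code:

```java
public class EncodingCheck {

    /**
     * Mirrors the indexOf check from checkSnapShotEncodingSupport():
     * returns true if the encodings property value mentions the
     * requested format anywhere in the list.
     */
    public static boolean supports(String encodings, String format) {
        return (encodings != null) && (encodings.indexOf(format) != -1);
    }

    public static void main(String[] args) {
        // Hypothetical value of "video.snapshot.encodings" on a
        // device that supports both formats:
        String encodings = "encoding=png encoding=jpeg encoding=gray8";

        System.out.println(supports(encodings, "png"));   // true
        System.out.println(supports(encodings, "jpeg"));  // true
        System.out.println(supports(encodings, "bmp"));   // false
        System.out.println(supports(null, "png"));        // false
    }
}
```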

Next is the method that actually encodes an array of bytes. Later we will see how to pass the snapshot to this method as a byte array in order to get it in Base64 format:

  
     //inside MIDlet class...

    /**
     * Encodes an array of bytes
     * @param imgBytes Array of bytes to encode
     * @return String representing the Base64 format of
     * the array parameter
     */
    public String encodeImage(byte[] imgBytes) {
        byte[] coded = Base64.encode(imgBytes);
        return new String(coded);
    }

As we saw in the Base64 encode-decode in JavaMe (J2ME) post, the previous method encodes using the Bouncy Castle library.
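
On the server side you are not limited to Bouncy Castle: its Base64.encode uses the standard alphabet, so a Java SE servlet can decode the string with the JDK's own `java.util.Base64` (Java 8+). A minimal round-trip sketch (class and method names are mine):

```java
import java.util.Arrays;
import java.util.Base64;

public class Base64RoundTrip {

    /** Plays the role of encodeImage() in the MIDlet, using the JDK encoder. */
    public static String encodeImage(byte[] imgBytes) {
        return Base64.getEncoder().encodeToString(imgBytes);
    }

    /** What the servlet would do with the received Base64 string. */
    public static byte[] decodeImage(String base64) {
        return Base64.getDecoder().decode(base64);
    }

    public static void main(String[] args) {
        // A few fake "snapshot" bytes standing in for real image data
        byte[] fakeSnapshot = {(byte) 0x89, 'P', 'N', 'G', 0, 1, 2, 3};

        String encoded = encodeImage(fakeSnapshot);
        byte[] decoded = decodeImage(encoded);

        System.out.println(Arrays.equals(fakeSnapshot, decoded)); // true
    }
}
```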

The following methods go inside the MIDlet class as well; they are responsible for starting the canvas that takes the snapshot and for passing the snapshot to another canvas that shows it to the user:

  
    //inside MIDlet...

    /**
     * Starts the Video Canvas to take the snapshot
     */
    public void snapShot() {

        if (png) {
            videoCanvas.startSnapShot("png");
        } else if (jpeg) {
            videoCanvas.startSnapShot("jpeg");
        } else {
            videoCanvas.startSnapShot(null);
        }
    }

    /**
     * Shows the snapshot in a Canvas
     * @param bytes Array of bytes representing the
     * snapshot
     */
    public void showSnapShot(byte[] bytes) {
        if (bytes != null) {

            imgCanvas = new ImageCanvas(bytes);
            imgCanvas.addCommand(commandSend);
            imgCanvas.addCommand(commandBack);
            imgCanvas.setCommandListener(this);

            display.setCurrent(imgCanvas);
        }
    }

OK, that's pretty much the general code for the MIDlet; now let's see what happens inside the VideoCanvas class. The following code goes in that class, with the constructor starting the video player:

        
    //inside VideoCanvas class...

    /** Application MIDlet */
    ImageCaptureMidlet midlet = null;

    /** Control for the video*/
    VideoControl videoControl = null;

    public VideoCanvas(ImageCaptureMidlet mid) {
        midlet = mid;
        Player player = null;

        try {
            player = Manager.createPlayer("capture://video");
            player.realize();
            videoControl = (VideoControl) player.getControl
                           ("VideoControl");
            videoControl.initDisplayMode
                       (VideoControl.USE_DIRECT_VIDEO, this);
            videoControl.setDisplayFullScreen(true);
            videoControl.setVisible(true);

            player.start();

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

So, when the canvas is created it starts the player and the video starts playing inside the canvas. When the MIDlet invokes the +snapShot():void method, the following piece of code is executed inside the VideoCanvas class:

    //inside VideoCanvas class...

    /**
     * Starts a thread to take the snapshot. 
     * Some devices will take the snapshot without 
     * the need of a thread, but others 
     * won't (including my emulator). 
     * So start a new Thread...
     * @param encoding String representing the encoding
     * to use when taking the snapshot
     */
    public void startSnapShot(final String encoding) {
        new Thread(new Runnable() {

            public void run() {
                try {
                    byte[] rawBytes = null;
                    if (encoding != null) {
                        //take the snapshot using the encoding
                        rawBytes = videoControl.getSnapshot
                                   ("encoding=" + encoding);

                    } else {
                        //take the snapshot using the best
                        //possible encoding. 
                        //Implementation dependent
                        rawBytes = videoControl.getSnapshot
                                   (null);
                    }
                    //ask the midlet to show the snapshot
                    midlet.showSnapShot(rawBytes);
                } catch (MediaException ex) {
                    ex.printStackTrace();
                }
            }
        }).start();
    }

You may notice that the method starts a new Thread. That's because not all devices take the snapshot right away; starting the camera and capturing the image can take a moment, and a blocking call like this shouldn't run on the UI event thread. That's why we start a new Thread. Also note that the +VideoControl.getSnapshot(String):byte[] method accepts a null parameter, which lets the implementation choose the best available encoding for the snapshot. Finally, once the snapshot is taken, the method asks the application MIDlet to show it.

We have previously seen the +showSnapShot(byte[]):void method in the MIDlet class, so let's look at the ImageCanvas class where the snapshot is shown:

    //inside ImageCanvas class...

    /** Snapshot to render*/
    private Image image = null;

    /** bytes of the snapshot*/
    private byte[] imageBytes = null;

    public ImageCanvas(byte[] bytes) {
        imageBytes = bytes;

        //creates an Image using the byte array 
        image = Image.createImage(imageBytes, 0, 
                                  imageBytes.length);
    }

    public void paint(Graphics g) {
        int width = getWidth();
        int height = getHeight();
        g.setColor(0x000000);
        g.fillRect(0, 0, width, height);

        //render the snapshot
        g.drawImage(image, getWidth() / 2, getHeight() / 2, 
                    Graphics.HCENTER | Graphics.VCENTER);
    }

    //Getters and Setters...


OK, up until now we have seen how to take a snapshot and render it on a Canvas. That's the general code we talked about at the beginning of this post. Now let's see how to send the snapshot to the server. Here we present two scenarios: one for the Glassfish server and another for the Tomcat server.

Glassfish Server:
The following method, inside the MIDlet, sends the snapshot to a servlet deployed on Glassfish Server 3.0.1:

    //inside MIDlet class

    /**
     * Sends the snapshot to the server
     */
    public void send() {
        new Thread(new Runnable() {

            public void run() {

                String imageEncoded = 
                   encodeImage(imgCanvas.
                               getImageBytes());                

                String format = png ? "png" : (jpeg ? "jpeg" : "");
                
                //This is my servlet’s URL
                String URL = 
                           "http://localhost:8080/" + 
                           "Base64ExampleServlet_v2/" +
                           "ImageServlet";

                HttpConnection http = null;
                OutputStream os = null;
                DataOutputStream dout = null;
                try {
                    //Open HttpConnection using POST
                    http = (HttpConnection) Connector.open(URL);
                    http.setRequestMethod(HttpConnection.POST);
                    //Content-Type is a must when passing
                    //parameters in a POST request
                    http.setRequestProperty(
                         "Content-Type", 
                         "application/x-www-form-urlencoded");
                   
                    os = http.openOutputStream();

                    ByteArrayOutputStream bout = new 
                                       ByteArrayOutputStream();
                    dout = new DataOutputStream(bout);
                    dout.writeUTF(imageEncoded);
                    dout.writeUTF(format);
                    os.write(bout.toByteArray());
                    
                    os.flush();  
                } catch (IOException ex) {
                    ex.printStackTrace();
                } 
                //whatever happens, close the streams 
                //and connections
                finally {
                    if (os != null) {
                        try {
                            os.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (dout != null) {
                        try {
                            dout.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (http != null) {
                        try {
                            http.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                }
            }
        }).start();
    }

As you can see, the method calls the encoding method, passing the snapshot as a byte array, then opens an HttpConnection to the servlet deployed on the Glassfish server. It opens the OutputStream and writes the image in Base64 format, followed by the encoding (format) used. On the server side, you should read the stream the same way you wrote it, that is, first the image in Base64 format and then the encoding used (format).
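
Since the MIDlet writes the body with DataOutputStream.writeUTF, the servlet has to mirror it with DataInputStream.readUTF in the same order. Below is a standalone sketch of just the stream handling (class and method names are mine; in a real servlet the InputStream would come from request.getInputStream()):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class UtfBody {

    /** What the MIDlet writes: the Base64 image first, then the format. */
    public static byte[] write(String imageEncoded, String format)
            throws IOException {
        ByteArrayOutputStream bout = new ByteArrayOutputStream();
        DataOutputStream dout = new DataOutputStream(bout);
        dout.writeUTF(imageEncoded);
        dout.writeUTF(format);
        dout.flush();
        return bout.toByteArray();
    }

    /** What the servlet's doPost would do: read in the same order. */
    public static String[] read(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        String image = din.readUTF();   // first field: the Base64 image
        String format = din.readUTF();  // second field: "png" or "jpeg"
        return new String[] { image, format };
    }

    public static void main(String[] args) throws IOException {
        byte[] body = write("aGVsbG8=", "png");
        String[] fields = read(new ByteArrayInputStream(body));
        System.out.println(fields[0]); // aGVsbG8=
        System.out.println(fields[1]); // png
    }
}
```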

Tomcat Server:
The following method, inside the MIDlet, sends the snapshot to a servlet deployed on Tomcat 6.0.32:

    //inside MIDlet class

    /**
     * Sends the snapshot to the server
     */
    public void send() {
        new Thread(new Runnable() {

            public void run() {

                String imageEncoded = 
                   encodeImage(imgCanvas.
                               getImageBytes());                

                String format = png ? "png" : (jpeg ? "jpeg" : "");
                
                //This is my servlet’s URL
                String URL = 
                           "http://localhost:8080/" + 
                           "Base64ExampleServlet_v2/" +
                           "ImageServlet";

                HttpConnection http = null;
                OutputStream os = null;
                try {
                    //Open HttpConnection using POST
                    http = (HttpConnection) Connector.open(URL);
                    http.setRequestMethod(HttpConnection.POST);
                    //Content-Type is a must when passing
                    //parameters in a POST request
                    http.setRequestProperty(
                         "Content-Type", 
                         "application/x-www-form-urlencoded");
                   
                    os = http.openOutputStream();

                    //IMPORTANT, when writing Base64 format
                    //there are chars like '+' that  
                    //must be replaced when sending.
                    imageEncoded = 
                        imageEncoded.replace('+', '-');
                    StringBuffer params = new StringBuffer();
                    params.append("image" + "=" + imageEncoded);
                    params.append("&" + 
                                  "format" + "=" + format);
                    os.write(params.toString().getBytes());
                    
                    os.flush();  
                } catch (IOException ex) {
                    ex.printStackTrace();
                } 
                //whatever happens, close the streams 
                //and connections
                finally {
                    if (os != null) {
                        try {
                            os.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (http != null) {
                        try {
                            http.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                }
            }
        }).start();
    }

This method differs from the Glassfish one in the way it writes the parameters: it sends them as key/value pairs. Notice one important point: Base64 output contains chars like '+' that can be misinterpreted in the HTTP protocol, because in a form-urlencoded body a '+' is decoded as a space. So before sending the image in Base64, replace the '+' chars with '-' chars, and in the servlet (server side), convert them back before writing the image to disk. This is only needed in the Tomcat variant.
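
The replace-and-restore step can be sketched in plain Java (class and method names are mine); the server simply reverses the substitution before decoding. Since '-' never appears in the standard Base64 alphabet, the reverse mapping is unambiguous:

```java
public class ParamSafety {

    /** Client side: make the Base64 string safe for a form-urlencoded body. */
    public static String toSafe(String base64) {
        return base64.replace('+', '-');
    }

    /** Server side: restore the original Base64 before decoding. */
    public static String fromSafe(String received) {
        return received.replace('-', '+');
    }

    public static void main(String[] args) {
        String base64 = "ab+cd+ef=="; // hypothetical Base64 chunk with '+'

        String wire = toSafe(base64);
        System.out.println(wire);            // ab-cd-ef==
        System.out.println(fromSafe(wire));  // ab+cd+ef==
    }
}
```

Note that standard Base64 also uses '/', which can cause similar trouble in URLs; the URL-safe Base64 alphabet from RFC 4648 (which uses '-' and '_' instead of '+' and '/') avoids both problems at once.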

Wow, loooooong post... Again, my advice is to be careful when deciding which method to use with which server. Sometimes your code may be perfectly well written and still fail on a given server... so you have to try another way of writing it until it works.

That's it for now, hope this helps you to write applications that use the device's camera.

see ya soon!


References:

Taking Pictures with MMAPI. 2011. Oracle [online].
Available on Internet: http://developers.sun.com/mobility/midp/articles/picture/
[accessed on February 01 2011].

Java Tips - Capturing Video on J2ME devices. 2011. Java Tips [online].
Available on Internet: http://www.java-tips.org/java-me-tips/midp/capturing-video-on-j2me-devices.html
[accessed on February 01 2011].

J2ME Sample Codes: Image Capturing in J2ME. 2011. J2ME Sample Codes [online].
Available on Internet: http://j2mesamples.blogspot.com/2009/06/image-capturing-in-j2me.html
[accessed on February 01 2011].

Embedded Interaction - Working with J2ME - Picture transmission over HTTP. 2011. Embedded Interaction [online].
Available on Internet: http://www.hcilab.org/documents/tutorials/PictureTransmissionOverHTTP/index.html
[accessed on February 01 2011].

Base64. 2010. Wikipedia [online].
Available on Internet: http://en.wikipedia.org/wiki/Base64
[accessed on December 28 2010].

The Legion of the Bouncy Castle. bouncycastle.org [online].
Available on Internet: http://www.bouncycastle.org/
[accessed on December 28 2010].
