Image Effects for Android using OpenCV: Animated GIFs

Working with image effects that require multiple images

OpenCV is a powerful tool for performing intensive operations in real time, even on resource-limited mobile devices. Over a series of tutorials, we're creating an Android app that applies various effects to images.

The Android app, as built so far, has 3 limitations:

  1. There is no space for adding buttons for new effects.
  2. The user cannot select images to apply effects—only the resource images packaged within the app can be used.
  3. For effects working with multiple images, all images must be of the same size.

This tutorial overcomes these limitations. By the end of this tutorial, the user will be able to select one or more of their own images rather than being limited to a single resource image packaged with the app.

Secondly, the selected images don't have to be of the same size: we're going to ensure they're resized, if needed, before using them. And lastly, we're going to edit the app layout to make room for more buttons.

The sections covered in this tutorial are as follows:

  • Using GridLayout
  • Selecting a Single Image
  • Selecting More than One Image
  • Reading One Image for Single Image Effects
  • Reading Multiple Images for Multi-Image Effects
  • Creating an Animated GIF
  • Building the Android App

The source code of this tutorial is available in the accompanying GitHub project.

Let’s get started.

Using GridLayout

The Button views in the layout of the Android app created in the previous tutorial take up a lot of space on the screen, as shown in the figure below. Because each button spans the entire width of the screen, there's no room for adding more effects. This section organizes the buttons using GridLayout with 2 columns, so that each row holds 2 buttons.

The modified XML layout of the app is listed below. As usual, a LinearLayout is used as the root view. It has 3 child views: 2 GridLayout views and a single ImageView. The ImageView just displays the result of each effect. Let's discuss the 2 GridLayout views.

The previous tutorial added all Button views directly inside the root LinearLayout. In this tutorial, the Button views are grouped into the first GridLayout inside the LinearLayout. The GridLayout has an attribute called columnCount that specifies how many views can be placed within the same row. It's set to 2, which means 2 Button views are placed in each row. The orientation attribute of the GridLayout is set to horizontal so that views fill each row before moving to the next.

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <GridLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:columnCount="2"
        android:orientation="horizontal">

        <Button
            android:id="@+id/stitchHorizontal"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="stitchHorizontal"
            android:text="Stitch Horizontally" />

        <Button
            android:id="@+id/stitchVertical"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="stitchVertical"
            android:text="Stitch Vertically" />

        <Button
            android:id="@+id/reduceColors"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="reduceImageColors"
            android:text="Reduce Colors" />

        <Button
            android:id="@+id/reduceColorsGray"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="reduceImageColorsGray"
            android:text="Reduce Colors Gray" />

        <Button
            android:id="@+id/medianFilter"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="medianFilter"
            android:text="Median Filter" />

        <Button
            android:id="@+id/adaptiveThreshold"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="adaptiveThreshold"
            android:text="Adaptive Threshold" />

        <Button
            android:id="@+id/cartoon"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="cartoonImage"
            android:text="Cartoon Image" />

        <Button
            android:id="@+id/transparency"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="blendImages"
            android:text="Image Blending" />

        <Button
            android:id="@+id/regionBlending"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="blendRegions"
            android:text="Region Blending" />

    </GridLayout>

    <GridLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:columnCount="2"
        android:rowCount="1"
        android:orientation="horizontal">

        <Button
            android:id="@+id/selectImage"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:background="#FFCCE5"
            android:onClick="selectImage"
            android:text="Select Image(s)" />

        <Button
            android:id="@+id/saveImage"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:background="#FFCCE5"
            android:layout_gravity="center"
            android:onClick="saveImage"
            android:text="Save Image" />
    </GridLayout>

    <ImageView
        android:id="@+id/opencvImg"
        android:layout_width="match_parent"
        android:layout_height="wrap_content" />

</LinearLayout>

The second GridLayout holds 2 Button views. The first Button view is for selecting images. When clicked, the selectImage() callback method is called, which will be implemented in this tutorial.

In the previous tutorial, the result of each effect was saved automatically. If the user is working with multiple effects, then all images resulting from these effects will be automatically saved in the device storage.

In this tutorial, however, the second Button view in the second GridLayout saves the result of the last applied effect. Thus, the user controls when the result is saved. When clicked, a callback method named saveImage() is called, which will also be implemented in this tutorial.

The app screen is shown in the next figure. The 2 buttons used for selecting images and saving the result have different colors.

The next section discusses the details of implementing the selectImage() method for selecting a single image.

Selecting a Single Image

The implementation of the selectImage() callback method is listed below. It simply creates an Intent for selecting images. The Intent is started in the last line, where a file chooser is opened to allow the user to select an image.

public void selectImage(View v) {
    Intent intent = new Intent();
    intent.setType("*/*");
    intent.setAction(Intent.ACTION_GET_CONTENT);
    startActivityForResult(Intent.createChooser(intent, "Select Picture"), 0);
}

Once the Intent returns, the onActivityResult() callback method is called; it's implemented below. An if statement checks that the user successfully selected an image. The URI of the selected image is then retrieved using the getData() method.

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    try {
        if (requestCode == 0 && resultCode == RESULT_OK && null != data) {
            if (data.getData() != null) {
                Uri uri = data.getData();
                String currentImagePath = getPath(getApplicationContext(), uri);
            }
        } 
    } catch (Exception e) {
        e.printStackTrace();
    }
}

To be able to read the images from device storage, we need to convert each URI into an actual file path. This is done using the getPath() method, which is implemented in this Stack Overflow answer.
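The full getPath() implementation in that answer handles several URI authorities (documents, downloads, media). To give an idea of its core mechanism, the sketch below queries the content resolver for the MediaStore DATA column. This is an illustration only, not the complete Stack Overflow version, which should be used in the app:

```java
import android.content.Context;
import android.database.Cursor;
import android.net.Uri;
import android.provider.MediaStore;

public class PathUtil {
    // Minimal sketch of URI-to-path conversion: ask the content resolver for
    // the _data column of the selected item. The complete Stack Overflow
    // version additionally handles DocumentsContract URIs (external storage,
    // downloads, and media documents) before falling back to this query.
    public static String getPath(Context context, Uri uri) {
        if ("content".equalsIgnoreCase(uri.getScheme())) {
            String[] projection = {MediaStore.Images.Media.DATA};
            Cursor cursor = context.getContentResolver().query(uri, projection, null, null, null);
            if (cursor != null) {
                try {
                    if (cursor.moveToFirst()) {
                        int columnIndex = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA);
                        return cursor.getString(columnIndex);
                    }
                } finally {
                    cursor.close();
                }
            }
        } else if ("file".equalsIgnoreCase(uri.getScheme())) {
            // A file:// URI already carries the path directly.
            return uri.getPath();
        }
        return null;
    }
}
```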

After selecting a single image, the next step is to allow users to select multiple images.

Selecting More than One Image

To select multiple images, the selectImage() method needs to be modified as shown below. The only new line adds an extra to the Intent, setting EXTRA_ALLOW_MULTIPLE to true, which allows the user to select multiple images. Everything else is identical to the previous implementation.

public void selectImage(View v) {
    Intent intent = new Intent();
    intent.setType("*/*");
    intent.putExtra(Intent.EXTRA_ALLOW_MULTIPLE, true);
    intent.setAction(Intent.ACTION_GET_CONTENT);
    startActivityForResult(Intent.createChooser(intent, "Select Picture"), 0);
}

To retrieve all selected images, the onActivityResult() method is modified as shown below. In the previous implementation, the getData() method returned the URI of the single selected image. When multiple images are selected, their URIs are retrieved through the getClipData() method, which returns an instance of the ClipData class holding the details of all selected image files.

To get the URI of each file, a for loop iterates over the items within the ClipData instance, where the number of selected files is returned by the getItemCount() method.

For each selected image, the getItemAt() method accepts the file's index and returns an instance of the ClipData.Item class; the URI is then retrieved using the getUri() method. As before, each URI is converted into a path using the getPath() method.

ArrayList<String> selectedImagesPaths; // Paths of the image(s) selected by the user.
boolean imagesSelected = false; // Whether the user selected at least one image.

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    try {
        if (requestCode == 0 && resultCode == RESULT_OK && null != data) {
            // When a single image is selected.
            String currentImagePath;
            selectedImagesPaths = new ArrayList<>();
            if (data.getData() != null) {
                Uri uri = data.getData();
                currentImagePath = getPath(getApplicationContext(), uri);
                selectedImagesPaths.add(currentImagePath);
                imagesSelected = true;
            } else {
                // When multiple images are selected.
                if (data.getClipData() != null) {
                    ClipData clipData = data.getClipData();
                    for (int i = 0; i < clipData.getItemCount(); i++) {

                        ClipData.Item item = clipData.getItemAt(i);
                        Uri uri = item.getUri();

                        currentImagePath = getPath(getApplicationContext(), uri);
                        selectedImagesPaths.add(currentImagePath);
                        imagesSelected = true;
                    }
                }
            }
            Toast.makeText(getApplicationContext(), selectedImagesPaths.size() + " Image(s) Selected.", Toast.LENGTH_LONG).show();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Because the images are selected once and then used multiple times, the paths of the selected images must be kept in class variables. This is why 2 class variables are defined.

The first is selectedImagesPaths, a Java ArrayList of Strings that holds the paths of the selected image(s). The second is a Boolean named imagesSelected, which is false by default and becomes true once at least one image is selected. The number of selected images is then printed in a Toast message.

When the user clicks the “Select Image(s)” button, the file chooser opens to allow multiple images to be selected, as shown below.

After returning to the application, the number of selected images is printed in a Toast message, as shown below.

After selecting the images, the next step is to edit the callback methods associated with the buttons that apply the effects. They currently use the resource images; we need them to use the images whose paths are stored in the selectedImagesPaths ArrayList.

Reading One Image for Single Image Effects

For the simplest case, let's start by editing the callback methods of the effects applied to just a single image, beginning with medianFilter(), the callback method of the “Median Filter” button. Its current implementation is shown below: a resource image is decoded into a Bitmap saved in the original variable, the Bitmap is converted to an OpenCV Mat, and then the medianBlur() method applies the median filter.

    public void medianFilter(View view) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);
        Mat medianFilter = new Mat();
        Imgproc.cvtColor(img1, medianFilter, Imgproc.COLOR_BGR2GRAY);

        Imgproc.medianBlur(medianFilter, medianFilter, 15);

        resultBitmap = Bitmap.createBitmap(medianFilter.cols(), medianFilter.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(medianFilter, resultBitmap);
        resultName = "median_filter";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

To use the image selected in the selectedImagesPaths ArrayList, the method is modified as shown below. The Bitmap is now returned by a method named returnSingleImageSelected(). If this method returns null, the user did not select any image, and medianFilter() simply returns.

    public void medianFilter(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);
        Mat medianFilter = new Mat();
        Imgproc.cvtColor(img1, medianFilter, Imgproc.COLOR_BGR2GRAY);

        Imgproc.medianBlur(medianFilter, medianFilter, 15);

        resultBitmap = Bitmap.createBitmap(medianFilter.cols(), medianFilter.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(medianFilter, resultBitmap);
        resultName = "median_filter";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

The implementation of the returnSingleImageSelected() method is listed below. As its name implies, it returns a single image from the selectedImagesPaths ArrayList. An if statement checks the imagesSelected Boolean class variable to determine whether the user selected an image. If it's false, a Toast message appears requesting that an image be selected, and the method returns null.

If it's true, at least one image was selected. Even if more than one image was selected, the path of the first one is returned by passing index 0 to the get() method of the selectedImagesPaths ArrayList.

Bitmap returnSingleImageSelected(ArrayList<String> selectedImages) {
    if (imagesSelected) {
        return BitmapFactory.decodeFile(selectedImagesPaths.get(0));
    } else {
        Toast.makeText(getApplicationContext(), "No Image Selected. You have to Select an Image.", Toast.LENGTH_LONG).show();
        return null;
    }
}

When the user clicks on the “Median Filter” button without selecting an image, the Toast message appears, as shown below.

When an image is selected, the filter will be applied over it:

Again, this is the approach for applying an effect to a single selected image; all other single-image effects read the image the same way. In the next section, we'll edit the effects that work with multiple images so they use the images selected by the user.

Reading Multiple Images for Multi-Image Effects

For effects using multiple images, multiple resource images are read as Bitmaps before the effect takes place. blendImages(), shown below, is the callback method that blends 2 images. The 2 resource images are read and converted into 2 Mat arrays named img1 and img2, which are then fed to the imageBlending() method, where the blending takes place.

After this method returns successfully, 2 class variables are updated. The first is resultBitmap, which holds the Bitmap returned by the last applied effect. The second is resultName, which holds the initial file name of the image in case the user wants to save it by clicking the “Save Image” button.

public void blendImages(View view) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inScaled = false; // Leaving it to true enlarges the decoded image size.
    Bitmap img1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im1, options);
    Bitmap img2Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im2, options);

    Mat img1 = new Mat();
    Mat img2 = new Mat();
    Utils.bitmapToMat(img1Bitmap, img1);
    Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);
    Utils.bitmapToMat(img2Bitmap, img2);
    Imgproc.cvtColor(img2, img2, Imgproc.COLOR_BGRA2BGR);


    Mat result = imageBlending(img1, img2, 128.0);

    resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
    Utils.matToBitmap(result, resultBitmap);
    resultName = "image_blending";

    ImageView imageView = findViewById(R.id.opencvImg);
    imageView.setImageBitmap(resultBitmap);
}

To allow the method to read the images selected in the selectedImagesPaths ArrayList, it's modified as given below. A method named returnMultipleSelectedImages() returns the required number of images as a Java List of Mat arrays. This method accepts 3 arguments:

  1. ArrayList<String> selectedImages: ArrayList of paths of all selected images.
  2. int numImagesRequired: Number of images required by the effect.
  3. boolean moreAccepted: Whether the effect can work with more images than the number specified in numImagesRequired.

If the user selects the number of images required by the effect, those images are returned as a Java List of Mat arrays. If no images are selected, or fewer than required, the method returns null, in which case blendImages() simply returns.
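The null-return contract can be summarized as a small, self-contained helper. The sketch below is hypothetical (the class and method names are ours, and the app's actual returnMultipleSelectedImages() additionally loads the images as Mat arrays); it only captures the validation rules just described:

```java
import java.util.Arrays;
import java.util.List;

public class SelectionCheck {

    // Hypothetical helper mirroring only the validation rules: a selection is
    // rejected when it's missing or smaller than numImagesRequired. Returns
    // null when the selection is acceptable, otherwise the message to show.
    public static String validateSelection(List<String> paths, int numImagesRequired, boolean moreAccepted) {
        if (paths == null || paths.isEmpty()) {
            return "No Images Selected.";
        }
        if (paths.size() < numImagesRequired) {
            // The wording depends on whether extra images are accepted.
            return moreAccepted
                    ? "Select at Least " + numImagesRequired + " Images."
                    : "Select Exactly " + numImagesRequired + " Images.";
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(validateSelection(null, 2, false));
        System.out.println(validateSelection(Arrays.asList("a.png"), 2, true));
        System.out.println(validateSelection(Arrays.asList("a.png", "b.png"), 2, false));
    }
}
```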

    public void blendImages(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap img1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im1, options);
//        Bitmap img2Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im2, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Utils.bitmapToMat(img1Bitmap, img1);
//        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);
//        Utils.bitmapToMat(img2Bitmap, img2);
//        Imgproc.cvtColor(img2, img2, Imgproc.COLOR_BGRA2BGR);

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, false);
        if (imagesMatList == null) {
            return;
        }

        Mat result = imageBlending(imagesMatList.get(0), imagesMatList.get(1), 128.0);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "image_blending";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

The implementation of the returnMultipleSelectedImages() method is listed below. Using a series of if statements, it makes sure the user selected the required number of images before going ahead. If not, a Toast message explains the problem and the method returns null.

If the required number of images is selected, an empty List of Mat arrays is created and the images are read into it inside a for loop. The loop starts from index 1 rather than 0 because the first image is handled separately.

Effects that use multiple images require them to be of the same size, so the first image is read outside the loop and every other image is resized to the first image's size. If the current image's size is identical to the first image's size, it's inserted directly into the List; otherwise, it's resized first and then inserted.

At the end of the method, the List is returned where all images are read as Mat arrays.

List<Mat> returnMultipleSelectedImages(ArrayList<String> selectedImages, int numImagesRequired, boolean moreAccepted) {
    if (selectedImages == null) {
        Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select More than 1 Image.", Toast.LENGTH_LONG).show();
        return null;
    } else if (selectedImages.size() == 0 && moreAccepted) {
        Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select at Least " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
        return null;
    } else if (selectedImages.size() == 0 && !moreAccepted) {
        Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select Exactly " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
        return null;
    } else if (selectedImages.size() < numImagesRequired && moreAccepted) {
        Toast.makeText(getApplicationContext(), "Sorry. You have to Select at Least " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
        return null;
    } else if (selectedImages.size() < numImagesRequired && !moreAccepted) {
        Toast.makeText(getApplicationContext(), "Sorry. You have to Select Exactly " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
        return null;
    }

    List<Mat> imagesMatList = new ArrayList<>();
    Mat mat = Imgcodecs.imread(selectedImages.get(0));
    Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2RGB);
    imagesMatList.add(mat);

    for (int i = 1; i < selectedImages.size(); i++) {
        mat = Imgcodecs.imread(selectedImages.get(i));
        Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2RGB);
        if (imagesMatList.get(0).size().equals(mat.size())) {
            imagesMatList.add(mat);
        } else {
            Imgproc.resize(mat, mat, imagesMatList.get(0).size());
            imagesMatList.add(mat);
        }
    }
    return imagesMatList;
}

Following the same process, the images are read in the callback methods of all effects that require more than one image.
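For example, the “Stitch Horizontally” button's callback could pass true for moreAccepted so the user may stitch 2 or more images. The following is only a sketch, not the app's actual implementation (it assumes OpenCV's Core.hconcat() performs the stitching; the tutorial's real stitching code may differ):

```java
public void stitchHorizontal(View view) {
    // At least 2 images are required, but more are accepted (moreAccepted = true).
    List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, true);
    if (imagesMatList == null) {
        return;
    }

    // Core.hconcat() concatenates the Mats side by side. This works because
    // returnMultipleSelectedImages() already resized every image to the size
    // of the first one, so all rows match.
    Mat result = new Mat();
    Core.hconcat(imagesMatList, result);

    resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
    Utils.matToBitmap(result, resultBitmap);
    resultName = "stitch_horizontal";

    ImageView imageView = findViewById(R.id.opencvImg);
    imageView.setImageBitmap(resultBitmap);
}
```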

After editing the callback methods so they read the selected images from the selectedImagesPaths ArrayList, the main activity of the Android app is listed below.

The implementation of the saveImage() method is given in the code below. It's very simple: it checks that the class variable resultName isn't null and then saves the image using the saveBitmap() method.

package com.example.imageeffectsopencv;

import android.content.ClipData;
import android.content.ContentUris;
import android.content.Context;
import android.content.Intent;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.MediaScannerConnection;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.provider.DocumentsContract;
import android.provider.MediaStore;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.EditText;
import android.widget.ImageView;
import android.widget.Toast;

import com.example.imageeffectsopencv.R;

import org.opencv.android.OpenCVLoader;
import org.opencv.android.Utils;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.io.File;
import java.io.FileOutputStream;
import java.lang.reflect.Array;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import java.util.Locale;

import static org.opencv.core.Core.LUT;
import static org.opencv.core.CvType.CV_8UC1;

public class MainActivity extends AppCompatActivity {

    final int SELECT_MULTIPLE_IMAGES = 1;
    ArrayList<String> selectedImagesPaths; // Paths of the image(s) selected by the user.
    boolean imagesSelected = false; // Whether the user selected at least one image.

    Bitmap resultBitmap; // Result of the last operation.
    String resultName = null; // File name to save the result of the last operation.

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        OpenCVLoader.initDebug();
    }

    public void saveImage(View v) {
        if (resultName == null) {
            Toast.makeText(getApplicationContext(), "Please Apply an Operation to Save its Result.", Toast.LENGTH_LONG).show();
            return;
        }
        saveBitmap(resultBitmap, resultName);
        Toast.makeText(getApplicationContext(), "Image Saved Successfully.", Toast.LENGTH_LONG).show();
    }

    public void selectImage(View v) {
        Intent intent = new Intent();
        intent.setType("*/*");
        intent.putExtra(Intent.EXTRA_ALLOW_MULTIPLE, true);
        intent.setAction(Intent.ACTION_GET_CONTENT);
        startActivityForResult(Intent.createChooser(intent, "Select Picture"), SELECT_MULTIPLE_IMAGES);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        try {
            if (requestCode == SELECT_MULTIPLE_IMAGES && resultCode == RESULT_OK && null != data) {
                // When a single image is selected.
                String currentImagePath;
                selectedImagesPaths = new ArrayList<>();
                if (data.getData() != null) {
                    Uri uri = data.getData();
                    currentImagePath = getPath(getApplicationContext(), uri);
                    Log.d("ImageDetails", "Single Image URI : " + uri);
                    Log.d("ImageDetails", "Single Image Path : " + currentImagePath);
                    selectedImagesPaths.add(currentImagePath);
                    imagesSelected = true;
                } else {
                    // When multiple images are selected.
                    // Thanks to Laith Mihyar for this Stack Overflow answer: https://stackoverflow.com/a/34047251/5426539
                    if (data.getClipData() != null) {
                        ClipData clipData = data.getClipData();
                        for (int i = 0; i < clipData.getItemCount(); i++) {

                            ClipData.Item item = clipData.getItemAt(i);
                            Uri uri = item.getUri();

                            currentImagePath = getPath(getApplicationContext(), uri);
                            selectedImagesPaths.add(currentImagePath);
                            Log.d("ImageDetails", "Image URI " + i + " = " + uri);
                            Log.d("ImageDetails", "Image Path " + i + " = " + currentImagePath);
                            imagesSelected = true;
                        }
                    }
                }
                Toast.makeText(getApplicationContext(), selectedImagesPaths.size() + " Image(s) Selected.", Toast.LENGTH_LONG).show();
            } else {
                Toast.makeText(this, "You haven't Picked any Image.", Toast.LENGTH_LONG).show();
            }
        } catch (Exception e) {
            Toast.makeText(this, "Something Went Wrong.", Toast.LENGTH_LONG).show();
            e.printStackTrace();
        }

        super.onActivityResult(requestCode, resultCode, data);
    }

    public void blendRegions(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap img1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im1, options);
//        Bitmap img2Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im2, options);
//        Bitmap img1MaskBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.mask_im1, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Mat img1Mask = new Mat();
//
//        Utils.bitmapToMat(img1Bitmap, img1);
//        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);
//
//        Utils.bitmapToMat(img2Bitmap, img2);
//        Imgproc.cvtColor(img2, img2, Imgproc.COLOR_BGRA2BGR);
//
//        Utils.bitmapToMat(img1MaskBitmap, img1Mask);

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 3, false);
        if (imagesMatList == null) {
            return;
        }

        Mat img1Mask = imagesMatList.get(imagesMatList.size() - 1);
        Imgproc.cvtColor(img1Mask, img1Mask, Imgproc.COLOR_BGRA2BGR);
        Imgproc.cvtColor(img1Mask, img1Mask, Imgproc.COLOR_BGR2GRAY);
        Imgproc.threshold(img1Mask, img1Mask, 200, 255.0, Imgproc.THRESH_BINARY);

        Mat result = regionBlending(imagesMatList.get(0), imagesMatList.get(1), img1Mask);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "region_blending";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void blendImages(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap img1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im1, options);
//        Bitmap img2Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im2, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Utils.bitmapToMat(img1Bitmap, img1);
//        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);
//        Utils.bitmapToMat(img2Bitmap, img2);
//        Imgproc.cvtColor(img2, img2, Imgproc.COLOR_BGRA2BGR);

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, false);
        if (imagesMatList == null) {
            return;
        }

        Mat result = imageBlending(imagesMatList.get(0), imagesMatList.get(1), 128.0);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "image_blending";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void cartoonImage(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);
        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);

        Mat result = cartoon(img1, 80, 15, 10);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "cartoon";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void reduceImageColors(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);

        Mat result = reduceColors(img1, 80, 15, 10);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "reduce_colors";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    Bitmap returnSingleImageSelected(ArrayList<String> selectedImages) {
        if (imagesSelected) {
            return BitmapFactory.decodeFile(selectedImages.get(0));
        } else {
            Toast.makeText(getApplicationContext(), "No Image Selected. You have to Select an Image.", Toast.LENGTH_LONG).show();
            return null;
        }
    }

    public void reduceImageColorsGray(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);

        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGR2GRAY);
        Mat result = reduceColorsGray(img1, 5);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "reduce_colors_gray";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void medianFilter(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);
        Mat medianFilter = new Mat();
        Imgproc.cvtColor(img1, medianFilter, Imgproc.COLOR_BGR2GRAY);

        Imgproc.medianBlur(medianFilter, medianFilter, 15);

        resultBitmap = Bitmap.createBitmap(medianFilter.cols(), medianFilter.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(medianFilter, resultBitmap);
        resultName = "median_filter";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void adaptiveThreshold(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat adaptiveTh = new Mat();
        Utils.bitmapToMat(original, adaptiveTh);
        Imgproc.cvtColor(adaptiveTh, adaptiveTh, Imgproc.COLOR_BGR2GRAY);

        Imgproc.medianBlur(adaptiveTh, adaptiveTh, 15);

        Imgproc.adaptiveThreshold(adaptiveTh, adaptiveTh, 255, Imgproc.ADAPTIVE_THRESH_MEAN_C, Imgproc.THRESH_BINARY, 9, 2);

        resultBitmap = Bitmap.createBitmap(adaptiveTh.cols(), adaptiveTh.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(adaptiveTh, resultBitmap);
        resultName = "adaptive_threshold";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    List<Mat> returnMultipleSelectedImages(ArrayList<String> selectedImages, int numImagesRequired, boolean moreAccepted) {
        if (selectedImages == null) {
            Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select More than 1 Image.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() == 0 && moreAccepted == true) {
            Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select at Least " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() == 0 && moreAccepted == false) {
            Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select Exactly " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() < numImagesRequired && moreAccepted == true) {
            Toast.makeText(getApplicationContext(), "Sorry. You have to Select at Least " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() < numImagesRequired && moreAccepted == false) {
            Toast.makeText(getApplicationContext(), "Sorry. You have to Select Exactly " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        }

        List<Mat> imagesMatList = new ArrayList<>();
        Mat mat = Imgcodecs.imread(selectedImages.get(0));
        Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2RGB);
        imagesMatList.add(mat);

        for (int i = 1; i < selectedImages.size(); i++) {
            mat = Imgcodecs.imread(selectedImages.get(i));
            Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2RGB);
            if (imagesMatList.get(0).size().equals(mat.size())) {
                imagesMatList.add(mat);
            } else {
                Imgproc.resize(mat, mat, imagesMatList.get(0).size());
                imagesMatList.add(mat);
            }
        }
        return imagesMatList;
    }

    public void stitchVectical(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap im1 = BitmapFactory.decodeResource(getResources(), R.drawable.part1, options);
//        Bitmap im2 = BitmapFactory.decodeResource(getResources(), R.drawable.part2, options);
//        Bitmap im3 = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Mat img3 = new Mat();
//        Utils.bitmapToMat(im1, img1);
//        Utils.bitmapToMat(im2, img2);
//        Utils.bitmapToMat(im3, img3);

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, true);
        if (imagesMatList == null) {
            return;
        }

        resultBitmap = stitchImagesVectical(imagesMatList);
        resultName = "stitch_vertical";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void stitchHorizontal(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap im1 = BitmapFactory.decodeResource(getResources(), R.drawable.part1, options);
//        Bitmap im2 = BitmapFactory.decodeResource(getResources(), R.drawable.part2, options);
//        Bitmap im3 = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Mat img3 = new Mat();
//        Utils.bitmapToMat(im1, img1);
//        Utils.bitmapToMat(im2, img2);
//        Utils.bitmapToMat(im3, img3);

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, true);
        if (imagesMatList == null) {
            return;
        }

        resultBitmap = stitchImagesHorizontal(imagesMatList);
        resultName = "stitch_horizontal";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    Mat regionBlending(Mat img, Mat img2, Mat mask) {
        Mat result = img2.clone(); // Copy img2 so the input Mat is not modified in place.

        for (int row = 0; row < img.rows(); row++) {
            for (int col = 0; col < img.cols(); col++) {
                double[] img1Pixel = img.get(row, col);
                double[] binaryPixel = mask.get(row, col);
                if (binaryPixel[0] == 255.0) {
                    result.put(row, col, img1Pixel);
                }
            }
        }
        return result;
    }

    Mat imageBlending(Mat img, Mat img2, double alpha) {
        Mat result = img.clone(); // Copy img so the input Mat is not modified in place.

        if (alpha == 0.0) {
            return img2;
        } else if (alpha == 255.0) {
            return img;
        }

        for (int row = 0; row < img.rows(); row++) {
            for (int col = 0; col < img.cols(); col++) {
                double[] pixel1 = img.get(row, col);

                double[] pixel2 = img2.get(row, col);

                double fraction = alpha / 255.0;

                pixel1[0] = pixel1[0] * fraction + pixel2[0] * (1.0 - fraction);
                pixel1[1] = pixel1[1] * fraction + pixel2[1] * (1.0 - fraction);
                pixel1[2] = pixel1[2] * fraction + pixel2[2] * (1.0 - fraction);

                result.put(row, col, pixel1);
            }
        }
        return result;
    }

    Mat cartoon(Mat img, int numRed, int numGreen, int numBlue) {
        Mat reducedColorImage = reduceColors(img, numRed, numGreen, numBlue);

        Mat result = new Mat();
        Imgproc.cvtColor(img, result, Imgproc.COLOR_BGR2GRAY);
        Imgproc.medianBlur(result, result, 15);

        Imgproc.adaptiveThreshold(result, result, 255, Imgproc.ADAPTIVE_THRESH_MEAN_C, Imgproc.THRESH_BINARY, 15, 2);

        Imgproc.cvtColor(result, result, Imgproc.COLOR_GRAY2BGR);

        Log.d("PPP", result.height() + " " + result.width() + " " + result.type() + " " + result.channels());
        Log.d("PPP", reducedColorImage.height() + " " + reducedColorImage.width() + " " + reducedColorImage.type() + " " + reducedColorImage.channels());

        Core.bitwise_and(reducedColorImage, result, result);

        return result;
    }

    Mat reduceColors(Mat img, int numRed, int numGreen, int numBlue) {
        Mat redLUT = createLUT(numRed);
        Mat greenLUT = createLUT(numGreen);
        Mat blueLUT = createLUT(numBlue);

        List<Mat> BGR = new ArrayList<>(3);
        Core.split(img, BGR); // splits the image into its channels in the List of Mat arrays.

        LUT(BGR.get(0), blueLUT, BGR.get(0));
        LUT(BGR.get(1), greenLUT, BGR.get(1));
        LUT(BGR.get(2), redLUT, BGR.get(2));

        Core.merge(BGR, img);

        return img;
    }

    Mat reduceColorsGray(Mat img, int numColors) {
        Mat LUT = createLUT(numColors);

        LUT(img, LUT, img);

        return img;
    }

    Mat createLUT(int numColors) {
        // When numColors=1, the LUT will only have 1 color, which is black.
        if (numColors < 1 || numColors > 256) {
            System.out.println("Invalid Number of Colors. It must be between 1 and 256 inclusive.");
            return null;
        }

        Mat lookupTable = Mat.zeros(new Size(1, 256), CV_8UC1);

        int startIdx = 0;
        for (int x = 0; x < 256; x += 256.0 / numColors) {
            lookupTable.put(x, 0, x);

            for (int y = startIdx; y < x; y++) {
                if (lookupTable.get(y, 0)[0] == 0) {
                    lookupTable.put(y, 0, lookupTable.get(x, 0));
                }
            }
            startIdx = x;
        }
        return lookupTable;
    }

    Bitmap stitchImagesVectical(List<Mat> src) {
        Mat dst = new Mat();
        Core.vconcat(src, dst); //Core.hconcat(src, dst);
        Bitmap imgBitmap = Bitmap.createBitmap(dst.cols(), dst.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(dst, imgBitmap);

        return imgBitmap;
    }

    Bitmap stitchImagesHorizontal(List<Mat> src) {
        Mat dst = new Mat();
        Core.hconcat(src, dst); //Core.vconcat(src, dst);
        Bitmap imgBitmap = Bitmap.createBitmap(dst.cols(), dst.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(dst, imgBitmap);

        return imgBitmap;
    }

    void saveBitmap(Bitmap imgBitmap, String fileNameOpening) {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy_MM_dd_HH_mm_ss", Locale.US);
        Date now = new Date();
        String fileName = fileNameOpening + "_" + formatter.format(now) + ".jpg";

        FileOutputStream outStream;
        try {
            // Get a public path on the device storage for saving the file. Note that the word external does not mean the file is saved in the SD card. It is still saved in the internal storage.
            File path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);

            // Creates a directory for saving the image.
            File saveDir = new File(path + "/HeartBeat/");

            // If the directory is not created, create it.
            if (!saveDir.exists())
                saveDir.mkdirs();

            // Create the image file within the directory.
            File fileDir = new File(saveDir, fileName); // Creates the file.

            // Write the Bitmap content into the image file.
            outStream = new FileOutputStream(fileDir);
            imgBitmap.compress(Bitmap.CompressFormat.JPEG, 100, outStream);

            MediaScannerConnection.scanFile(this.getApplicationContext(),
                    new String[]{fileDir.toString()}, null,
                    new MediaScannerConnection.OnScanCompletedListener() {
                        public void onScanCompleted(String path, Uri uri) {
                        }
                    });

            // Close the output stream.
            outStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Implementation of the getPath() method and all its requirements is taken from the StackOverflow Paul Burke's answer: https://stackoverflow.com/a/20559175/5426539
    public static String getPath(final Context context, final Uri uri) {

        final boolean isKitKat = Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT;

        // DocumentProvider
        if (isKitKat && DocumentsContract.isDocumentUri(context, uri)) {
            // ExternalStorageProvider
            if (isExternalStorageDocument(uri)) {
                final String docId = DocumentsContract.getDocumentId(uri);
                final String[] split = docId.split(":");
                final String type = split[0];

                if ("primary".equalsIgnoreCase(type)) {
                    return Environment.getExternalStorageDirectory() + "/" + split[1];
                }

                // TODO handle non-primary volumes
            }
            // DownloadsProvider
            else if (isDownloadsDocument(uri)) {

                final String id = DocumentsContract.getDocumentId(uri);
                final Uri contentUri = ContentUris.withAppendedId(
                        Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));

                return getDataColumn(context, contentUri, null, null);
            }
            // MediaProvider
            else if (isMediaDocument(uri)) {
                final String docId = DocumentsContract.getDocumentId(uri);
                final String[] split = docId.split(":");
                final String type = split[0];

                Uri contentUri = null;
                if ("image".equals(type)) {
                    contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
                } else if ("video".equals(type)) {
                    contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
                } else if ("audio".equals(type)) {
                    contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
                }

                final String selection = "_id=?";
                final String[] selectionArgs = new String[]{
                        split[1]
                };

                return getDataColumn(context, contentUri, selection, selectionArgs);
            }
        }
        // MediaStore (and general)
        else if ("content".equalsIgnoreCase(uri.getScheme())) {
            return getDataColumn(context, uri, null, null);
        }
        // File
        else if ("file".equalsIgnoreCase(uri.getScheme())) {
            return uri.getPath();
        }

        return null;
    }

    public static String getDataColumn(Context context, Uri uri, String selection,
                                       String[] selectionArgs) {

        Cursor cursor = null;
        final String column = "_data";
        final String[] projection = {
                column
        };

        try {
            cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs,
                    null);
            if (cursor != null && cursor.moveToFirst()) {
                final int column_index = cursor.getColumnIndexOrThrow(column);
                return cursor.getString(column_index);
            }
        } finally {
            if (cursor != null)
                cursor.close();
        }
        return null;
    }

    public static boolean isExternalStorageDocument(Uri uri) {
        return "com.android.externalstorage.documents".equals(uri.getAuthority());
    }

    public static boolean isDownloadsDocument(Uri uri) {
        return "com.android.providers.downloads.documents".equals(uri.getAuthority());
    }

    public static boolean isMediaDocument(Uri uri) {
        return "com.android.providers.media.documents".equals(uri.getAuthority());
    }

}

After making these edits to the project, the next step is to add a new effect for creating animated GIF images.

Create Animated GIFs

First, the XML layout of the app is edited to create a new Button view for the new effect. The new file is listed below. The new Button has the text GIF and calls the createAnimatedGIF() method when clicked.

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <GridLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:columnCount="2"
        android:orientation="horizontal">

        <Button
            android:id="@+id/stitchHorizontal"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="stitchHorizontal"
            android:text="Stitch Horizontally" />

        <Button
            android:id="@+id/stitchVertical"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="stitchVectical"
            android:text="Stitch Vertically" />

        <Button
            android:id="@+id/reduceColors"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="reduceImageColors"
            android:text="Reduce Colors" />

        <Button
            android:id="@+id/reduceColorsGray"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="reduceImageColorsGray"
            android:text="Reduce Colors Gray" />

        <Button
            android:id="@+id/medianFilter"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="medianFilter"
            android:text="Median Filter" />

        <Button
            android:id="@+id/adaptiveThreshold"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="adaptiveThreshold"
            android:text="Adaptive Threshold" />

        <Button
            android:id="@+id/cartoon"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="cartoonImage"
            android:text="Cartoon Image" />

        <Button
            android:id="@+id/transparency"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="blendImages"
            android:text="Image Blending" />

        <Button
            android:id="@+id/regionBlending"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="blendRegions"
            android:text="Region Blending" />

        <Button
            android:id="@+id/animatedGIF"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="createAnimatedGIF"
            android:text="GIF" />

    </GridLayout>

    <GridLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:columnCount="2"
        android:rowCount="1"
        android:orientation="horizontal">

        <Button
            android:id="@+id/selectImage"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:background="#FFCCE5"
            android:onClick="selectImage"
            android:text="Select Image(s)" />

        <Button
            android:id="@+id/saveImage"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:background="#FFCCE5"
            android:layout_gravity="center"
            android:onClick="saveImage"
            android:text="Save Image" />
    </GridLayout>

    <ImageView
        android:id="@+id/opencvImg"
        android:layout_width="match_parent"
        android:layout_height="wrap_content" />

</LinearLayout>

The next figure shows the app screen after adding the new Button view.

The implementation of the createAnimatedGIF() method is shown below. As usual, it retrieves the selected images as a List of Mat arrays by calling the returnMultipleSelectedImages() method. This list is then passed to the createGIF() method to create the GIF image.

The GIF image is stored in the class variable named GIFImageByteArray, which is of type ByteArrayOutputStream. The file name used when saving the GIF is stored in the resultName class variable.

public void createAnimatedGIF(View view) {
    List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, false);
    if (imagesMatList == null) {
        return;
    }

    GIFImageByteArray = createGIF(imagesMatList, 150);
    resultName = "animated_GIF";
    GIFLastEffect = true;
}

Because saving the GIF image is different from saving the other Bitmap images, a new method named saveGif() is created for that purpose. It's similar to the saveBitmap() method, except that it writes a variable of type ByteArrayOutputStream rather than a Bitmap.
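The heart of that difference can be sketched in plain Java. This is a hedged sketch: writeGif() is a hypothetical stand-in that omits the directory and file-naming logic saveBitmap() already uses, and a few literal GIF header bytes stand in for real encoder output.

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class GifSaveSketch {
    // Hypothetical helper: unlike saveBitmap(), there is no Bitmap.compress() step.
    // The AnimatedGifEncoder already produced complete GIF bytes, so they are written as-is.
    static void writeGif(ByteArrayOutputStream imageByteArray, File target) throws IOException {
        try (FileOutputStream outStream = new FileOutputStream(target)) {
            outStream.write(imageByteArray.toByteArray());
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        bytes.write(new byte[]{'G', 'I', 'F', '8', '9', 'a'}, 0, 6); // GIF header bytes as stand-in data
        File out = File.createTempFile("animated_GIF", ".gif");
        writeGif(bytes, out);
        System.out.println(out.length()); // prints 6
    }
}
```

The try-with-resources block also closes the stream automatically, which the manual outStream.close() call in saveBitmap() does by hand.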

When the user clicks the “Save Image” button, the saveImage() method is called. Within this method, saveGif() must be called rather than saveBitmap() when working with GIF images. The modified method is listed below.

There’s a class variable named GIFLastEffect that’s set to true when the user creates a GIF and to false when any other effect is applied. When this variable is true, the saveGif() method is called; otherwise, the saveBitmap() method is called.

public void saveImage(View v) {
    if (resultName == null) {
        Toast.makeText(getApplicationContext(), "Please Apply an Operation to Save its Result.", Toast.LENGTH_LONG).show();
        return;
    }

    if (GIFLastEffect == true) {
        saveGif(GIFImageByteArray, "animated_GIF");
    } else {
        saveBitmap(resultBitmap, resultName);
    }
    Toast.makeText(getApplicationContext(), "Image Saved Successfully.", Toast.LENGTH_LONG).show();
}

The implementation of the createGIF() method is listed below. It accepts 2 arguments: the List of Mat arrays holding the frames, and the delay in milliseconds between every 2 frames.

At first, it creates an encoder of type AnimatedGifEncoder. The implementation of this class is found on this page. Using a for loop, the Bitmap images are added as frames to the encoder using the addFrame() method. After images are added to the encoder, the finish() method is called to indicate its end.

The AnimatedGifEncoder class supports several useful methods to specify how the GIF image is produced, including:

  • setDelay(): Accepts an integer representing the delay in milliseconds between every 2 frames. In our case it’s set to 150 milliseconds.
  • setFrameRate(): Accepts a float number representing the number of frames per second.

ByteArrayOutputStream createGIF(List<Mat> imagesMatList, int delay) {
    ByteArrayOutputStream imageByteArray = new ByteArrayOutputStream();
    // Implementation of the AnimatedGifEncoder.java file: https://gist.githubusercontent.com/wasabeef/8785346/raw/53a15d99062a382690275ef5666174139b32edb5/AnimatedGifEncoder.java
    AnimatedGifEncoder encoder = new AnimatedGifEncoder();
    encoder.start(imageByteArray);

    AnimationDrawable animatedGIF = new AnimationDrawable();

    for (Mat img : imagesMatList) {
        Bitmap imgBitmap = Bitmap.createBitmap(img.cols(), img.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(img, imgBitmap);
        encoder.setDelay(delay); // setDelay() applies to the frames added after it.
        encoder.addFrame(imgBitmap);

        animatedGIF.addFrame(new BitmapDrawable(getResources(), imgBitmap), delay);
    }

    encoder.finish();

    ImageView imageView = findViewById(R.id.opencvImg);
    imageView.setBackground(animatedGIF); // attach animation to a view
    animatedGIF.run();

    return imageByteArray;
}
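The setDelay() and setFrameRate() settings listed above are alternatives; only one is needed. As a rough sketch of how they relate (assuming setDelay() takes milliseconds, so a frame rate of fps corresponds to a per-frame delay of about 1000/fps ms), with frameRateToDelayMs() being a hypothetical helper rather than part of AnimatedGifEncoder:

```java
public class FrameTiming {
    // Hypothetical helper: the per-frame delay in milliseconds that
    // corresponds to a given frame rate.
    static int frameRateToDelayMs(float fps) {
        return Math.round(1000f / fps);
    }

    public static void main(String[] args) {
        System.out.println(frameRateToDelayMs(10f)); // prints 100
        // The 150 ms delay used in this tutorial is roughly 6.7 frames per second.
    }
}
```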

When images are added to the encoder, they are also added, at the same time, as frames to an instance of the AnimationDrawable class, which is responsible for animating the images on the ImageView. Images are added to the AnimationDrawable instance using its addFrame() method. After all images are added, the instance is set as the background of the ImageView, and the animation is started using the run() method.

The createGIF() method finally returns the ByteArrayOutputStream.

Building Android App

After building the code for creating GIF images, here’s the new implementation of the main activity of the Android app.

package com.example.imageeffectsopencv;

import android.content.ClipData;
import android.content.ContentUris;
import android.content.Context;
import android.content.Intent;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.drawable.AnimationDrawable;
import android.graphics.drawable.BitmapDrawable;
import android.media.MediaScannerConnection;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.provider.DocumentsContract;
import android.provider.MediaStore;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.EditText;
import android.widget.ImageView;
import android.widget.Toast;

import org.opencv.android.OpenCVLoader;
import org.opencv.android.Utils;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import java.util.Locale;

import static org.opencv.core.Core.LUT;
import static org.opencv.core.CvType.CV_8UC1;

public class MainActivity extends AppCompatActivity {

    final int SELECT_MULTIPLE_IMAGES = 1;
    ArrayList<String> selectedImagesPaths; // Paths of the image(s) selected by the user.
    boolean imagesSelected = false; // Whether the user selected at least one image.

    Bitmap resultBitmap; // Result of the last operation.
    String resultName = null; // File name to save the result of the last operation.

    boolean GIFLastEffect = false;
    ByteArrayOutputStream GIFImageByteArray;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        OpenCVLoader.initDebug();
    }

    public void createAnimatedGIF(View view) {
        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, false);
        if (imagesMatList == null) {
            return;
        }

        GIFImageByteArray = createGIF(imagesMatList, 150);
        resultName = "animated_GIF";
        GIFLastEffect = true;
    }

    ByteArrayOutputStream createGIF(List<Mat> imagesMatList, int delay) {
        ByteArrayOutputStream imageByteArray = new ByteArrayOutputStream();
        // Implementation of the AnimatedGifEncoder.java file: https://gist.githubusercontent.com/wasabeef/8785346/raw/53a15d99062a382690275ef5666174139b32edb5/AnimatedGifEncoder.java
        AnimatedGifEncoder encoder = new AnimatedGifEncoder();
        encoder.start(imageByteArray);

        AnimationDrawable animatedGIF = new AnimationDrawable();

        for (Mat img : imagesMatList) {
            Bitmap imgBitmap = Bitmap.createBitmap(img.cols(), img.rows(), Bitmap.Config.ARGB_8888);
            Utils.matToBitmap(img, imgBitmap);
            encoder.setDelay(delay);
            encoder.addFrame(imgBitmap);

            animatedGIF.addFrame(new BitmapDrawable(getResources(), imgBitmap), delay);
        }

        encoder.finish();

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setBackground(animatedGIF); // attach animation to a view
        animatedGIF.run();

        return imageByteArray;
    }

    void saveGif(ByteArrayOutputStream imageByteArray, String fileNameOpening) {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy_MM_dd_HH_mm_ss", Locale.US);
        Date now = new Date();
        String fileName = fileNameOpening + "_" + formatter.format(now) + ".gif";

        FileOutputStream outStream;
        try {
            // Get a public path on the device storage for saving the file. Note that the word external does not mean the file is saved in the SD card. It is still saved in the internal storage.
            File path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);

            // Creates a directory for saving the image.
            File saveDir = new File(path + "/HeartBeat/");

            // If the directory is not created, create it.
            if (!saveDir.exists())
                saveDir.mkdirs();

            // Create the image file within the directory.
            File fileDir = new File(saveDir, fileName); // Creates the file.

            // Write into the image file by the BitMap content.
            outStream = new FileOutputStream(fileDir);
            outStream.write(imageByteArray.toByteArray());

            MediaScannerConnection.scanFile(this.getApplicationContext(),
                    new String[]{fileDir.toString()}, null,
                    new MediaScannerConnection.OnScanCompletedListener() {
                        public void onScanCompleted(String path, Uri uri) {
                        }
                    });

            // Close the output stream.
            outStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void saveImage(View v) {
        if (resultName == null) {
            Toast.makeText(getApplicationContext(), "Please Apply an Operation to Save its Result.", Toast.LENGTH_LONG).show();
            return;
        }

        if (GIFLastEffect) {
            saveGif(GIFImageByteArray, "animated_GIF");
        } else {
            saveBitmap(resultBitmap, resultName);
        }
        Toast.makeText(getApplicationContext(), "Image Saved Successfully.", Toast.LENGTH_LONG).show();
    }

    public void selectImage(View v) {
        Intent intent = new Intent();
        intent.setType("*/*");
        intent.putExtra(Intent.EXTRA_ALLOW_MULTIPLE, true);
        intent.setAction(Intent.ACTION_GET_CONTENT);
        startActivityForResult(Intent.createChooser(intent, "Select Picture"), SELECT_MULTIPLE_IMAGES);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        try {
            if (requestCode == SELECT_MULTIPLE_IMAGES && resultCode == RESULT_OK && null != data) {
                // When a single image is selected.
                String currentImagePath;
                selectedImagesPaths = new ArrayList<>();
                if (data.getData() != null) {
                    Uri uri = data.getData();
                    currentImagePath = getPath(getApplicationContext(), uri);
                    Log.d("ImageDetails", "Single Image URI : " + uri);
                    Log.d("ImageDetails", "Single Image Path : " + currentImagePath);
                    selectedImagesPaths.add(currentImagePath);
                    imagesSelected = true;
                } else {
                    // When multiple images are selected.
                    // Thanks to Laith Mihyar for this Stack Overflow answer: https://stackoverflow.com/a/34047251/5426539
                    if (data.getClipData() != null) {
                        ClipData clipData = data.getClipData();
                        for (int i = 0; i < clipData.getItemCount(); i++) {

                            ClipData.Item item = clipData.getItemAt(i);
                            Uri uri = item.getUri();

                            currentImagePath = getPath(getApplicationContext(), uri);
                            selectedImagesPaths.add(currentImagePath);
                            Log.d("ImageDetails", "Image URI " + i + " = " + uri);
                            Log.d("ImageDetails", "Image Path " + i + " = " + currentImagePath);
                            imagesSelected = true;
                        }
                    }
                }
                Toast.makeText(getApplicationContext(), selectedImagesPaths.size() + " Image(s) Selected.", Toast.LENGTH_LONG).show();
            } else {
                Toast.makeText(this, "You haven't Picked any Image.", Toast.LENGTH_LONG).show();
            }
        } catch (Exception e) {
            Toast.makeText(this, "Something Went Wrong.", Toast.LENGTH_LONG).show();
            e.printStackTrace();
        }

        super.onActivityResult(requestCode, resultCode, data);
    }

    public void blendRegions(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap img1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im1, options);
//        Bitmap img2Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im2, options);
//        Bitmap img1MaskBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.mask_im1, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Mat img1Mask = new Mat();
//
//        Utils.bitmapToMat(img1Bitmap, img1);
//        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);
//
//        Utils.bitmapToMat(img2Bitmap, img2);
//        Imgproc.cvtColor(img2, img2, Imgproc.COLOR_BGRA2BGR);
//
//        Utils.bitmapToMat(img1MaskBitmap, img1Mask);

        GIFLastEffect = false;

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 3, false);
        if (imagesMatList == null) {
            return;
        }

        Mat img1Mask = imagesMatList.get(imagesMatList.size() - 1);
        Imgproc.cvtColor(img1Mask, img1Mask, Imgproc.COLOR_BGRA2BGR);
        Imgproc.cvtColor(img1Mask, img1Mask, Imgproc.COLOR_BGR2GRAY);
        Imgproc.threshold(img1Mask, img1Mask, 200, 255.0, Imgproc.THRESH_BINARY);

        Mat result = regionBlending(imagesMatList.get(0), imagesMatList.get(1), img1Mask);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "region_blending";
        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void blendImages(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap img1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im1, options);
//        Bitmap img2Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.im2, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Utils.bitmapToMat(img1Bitmap, img1);
//        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);
//        Utils.bitmapToMat(img2Bitmap, img2);
//        Imgproc.cvtColor(img2, img2, Imgproc.COLOR_BGRA2BGR);

        GIFLastEffect = false;

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, false);
        if (imagesMatList == null) {
            return;
        }

        Mat result = imageBlending(imagesMatList.get(0), imagesMatList.get(1), 128.0);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "image_blending";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void cartoonImage(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        GIFLastEffect = false;

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);
        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGRA2BGR);

        Mat result = cartoon(img1, 80, 15, 10);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "cartoon";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void reduceImageColors(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        GIFLastEffect = false;

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);

        Mat result = reduceColors(img1, 80, 15, 10);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "reduce_colors";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    Bitmap returnSingleImageSelected(ArrayList<String> selectedImages) {
        if (imagesSelected) {
            return BitmapFactory.decodeFile(selectedImages.get(0));
        } else {
            Toast.makeText(getApplicationContext(), "No Image Selected. You have to Select an Image.", Toast.LENGTH_LONG).show();
            return null;
        }
    }

    public void reduceImageColorsGray(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        GIFLastEffect = false;

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);

        Imgproc.cvtColor(img1, img1, Imgproc.COLOR_BGR2GRAY);
        Mat result = reduceColorsGray(img1, 5);

        resultBitmap = Bitmap.createBitmap(result.cols(), result.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(result, resultBitmap);
        resultName = "reduce_colors_gray";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void medianFilter(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        GIFLastEffect = false;

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat img1 = new Mat();
        Utils.bitmapToMat(original, img1);
        Mat medianFilter = new Mat();
        Imgproc.cvtColor(img1, medianFilter, Imgproc.COLOR_BGR2GRAY);

        Imgproc.medianBlur(medianFilter, medianFilter, 15);

        resultBitmap = Bitmap.createBitmap(medianFilter.cols(), medianFilter.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(medianFilter, resultBitmap);
        resultName = "median_filter";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void adaptiveThreshold(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap original = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);

        GIFLastEffect = false;

        Bitmap original = returnSingleImageSelected(selectedImagesPaths);
        if (original == null) {
            return;
        }

        Mat adaptiveTh = new Mat();
        Utils.bitmapToMat(original, adaptiveTh);
        Imgproc.cvtColor(adaptiveTh, adaptiveTh, Imgproc.COLOR_BGR2GRAY);

        Imgproc.medianBlur(adaptiveTh, adaptiveTh, 15);

        Imgproc.adaptiveThreshold(adaptiveTh, adaptiveTh, 255, Imgproc.ADAPTIVE_THRESH_MEAN_C, Imgproc.THRESH_BINARY, 9, 2);

        resultBitmap = Bitmap.createBitmap(adaptiveTh.cols(), adaptiveTh.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(adaptiveTh, resultBitmap);
        resultName = "adaptive_threshold";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    List<Mat> returnMultipleSelectedImages(ArrayList<String> selectedImages, int numImagesRequired, boolean moreAccepted) {
        if (selectedImages == null) {
            Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select More than 1 Image.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() == 0 && moreAccepted) {
            Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select at Least " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() == 0 && !moreAccepted) {
            Toast.makeText(getApplicationContext(), "No Images Selected. You have to Select Exactly " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() < numImagesRequired && moreAccepted) {
            Toast.makeText(getApplicationContext(), "Sorry. You have to Select at Least " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        } else if (selectedImages.size() < numImagesRequired && !moreAccepted) {
            Toast.makeText(getApplicationContext(), "Sorry. You have to Select Exactly " + numImagesRequired + " Images.", Toast.LENGTH_LONG).show();
            return null;
        }

        List<Mat> imagesMatList = new ArrayList<>();
        Mat mat = Imgcodecs.imread(selectedImages.get(0));
        Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2RGB);
        imagesMatList.add(mat);

        for (int i = 1; i < selectedImages.size(); i++) {
            mat = Imgcodecs.imread(selectedImages.get(i));
            Imgproc.cvtColor(mat, mat, Imgproc.COLOR_BGR2RGB);
            if (imagesMatList.get(0).size().equals(mat.size())) {
                imagesMatList.add(mat);
            } else {
                Imgproc.resize(mat, mat, imagesMatList.get(0).size());
                imagesMatList.add(mat);
            }
        }
        return imagesMatList;
    }

    public void stitchVectical(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap im1 = BitmapFactory.decodeResource(getResources(), R.drawable.part1, options);
//        Bitmap im2 = BitmapFactory.decodeResource(getResources(), R.drawable.part2, options);
//        Bitmap im3 = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Mat img3 = new Mat();
//        Utils.bitmapToMat(im1, img1);
//        Utils.bitmapToMat(im2, img2);
//        Utils.bitmapToMat(im3, img3);

        GIFLastEffect = false;

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, true);
        if (imagesMatList == null) {
            return;
        }

        resultBitmap = stitchImagesVectical(imagesMatList);
        resultName = "stitch_vertical";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    public void stitchHorizontal(View view) {
//        BitmapFactory.Options options = new BitmapFactory.Options();
//        options.inScaled = false; // Leaving it to true enlarges the decoded image size.
//        Bitmap im1 = BitmapFactory.decodeResource(getResources(), R.drawable.part1, options);
//        Bitmap im2 = BitmapFactory.decodeResource(getResources(), R.drawable.part2, options);
//        Bitmap im3 = BitmapFactory.decodeResource(getResources(), R.drawable.part3, options);
//
//        Mat img1 = new Mat();
//        Mat img2 = new Mat();
//        Mat img3 = new Mat();
//        Utils.bitmapToMat(im1, img1);
//        Utils.bitmapToMat(im2, img2);
//        Utils.bitmapToMat(im3, img3);

        GIFLastEffect = false;

        List<Mat> imagesMatList = returnMultipleSelectedImages(selectedImagesPaths, 2, true);
        if (imagesMatList == null) {
            return;
        }

        resultBitmap = stitchImagesHorizontal(imagesMatList);
        resultName = "stitch_horizontal";

        ImageView imageView = findViewById(R.id.opencvImg);
        imageView.setImageBitmap(resultBitmap);
    }

    Mat regionBlending(Mat img, Mat img2, Mat mask) {
        Mat result = img2; // Note: result aliases img2, so img2 is modified in place.

        for (int row = 0; row < img.rows(); row++) {
            for (int col = 0; col < img.cols(); col++) {
                double[] img1Pixel = img.get(row, col);
                double[] binaryPixel = mask.get(row, col);
                if (binaryPixel[0] == 255.0) {
                    result.put(row, col, img1Pixel);
                }
            }
        }
        return result;
    }

    Mat imageBlending(Mat img, Mat img2, double alpha) {
        Mat result = img; // Note: result aliases img, so img is modified in place.

        if (alpha == 0.0) {
            return img2;
        } else if (alpha == 255.0) {
            return img;
        }

        for (int row = 0; row < img.rows(); row++) {
            for (int col = 0; col < img.cols(); col++) {
                double[] pixel1 = img.get(row, col);

                double[] pixel2 = img2.get(row, col);

                double fraction = alpha / 255.0;

                pixel1[0] = pixel1[0] * fraction + pixel2[0] * (1.0 - fraction);
                pixel1[1] = pixel1[1] * fraction + pixel2[1] * (1.0 - fraction);
                pixel1[2] = pixel1[2] * fraction + pixel2[2] * (1.0 - fraction);

                result.put(row, col, pixel1);
            }
        }
        return result;
    }

    Mat cartoon(Mat img, int numRed, int numGreen, int numBlue) {
        Mat reducedColorImage = reduceColors(img, numRed, numGreen, numBlue);

        Mat result = new Mat();
        Imgproc.cvtColor(img, result, Imgproc.COLOR_BGR2GRAY);
        Imgproc.medianBlur(result, result, 15);

        Imgproc.adaptiveThreshold(result, result, 255, Imgproc.ADAPTIVE_THRESH_MEAN_C, Imgproc.THRESH_BINARY, 15, 2);

        Imgproc.cvtColor(result, result, Imgproc.COLOR_GRAY2BGR);

        Log.d("PPP", result.height() + " " + result.width() + " " + result.type() + " " + result.channels());
        Log.d("PPP", reducedColorImage.height() + " " + reducedColorImage.width() + " " + reducedColorImage.type() + " " + reducedColorImage.channels());

        Core.bitwise_and(reducedColorImage, result, result);

        return result;
    }

    Mat reduceColors(Mat img, int numRed, int numGreen, int numBlue) {
        Mat redLUT = createLUT(numRed);
        Mat greenLUT = createLUT(numGreen);
        Mat blueLUT = createLUT(numBlue);

        List<Mat> BGR = new ArrayList<>(3);
        Core.split(img, BGR); // splits the image into its channels in the List of Mat arrays.

        LUT(BGR.get(0), blueLUT, BGR.get(0));
        LUT(BGR.get(1), greenLUT, BGR.get(1));
        LUT(BGR.get(2), redLUT, BGR.get(2));

        Core.merge(BGR, img);

        return img;
    }

    Mat reduceColorsGray(Mat img, int numColors) {
        Mat LUT = createLUT(numColors);

        LUT(img, LUT, img);

        return img;
    }

    Mat createLUT(int numColors) {
        // When numColors=1 the LUT will only have 1 color, which is black.
        // numColors must be at least 1; a value of 0 would cause a division by zero below.
        if (numColors < 1 || numColors > 256) {
            System.out.println("Invalid Number of Colors. It must be between 1 and 256 inclusive.");
            return null;
        }

        Mat lookupTable = Mat.zeros(new Size(1, 256), CV_8UC1);

        int startIdx = 0;
        for (int x = 0; x < 256; x += 256.0 / numColors) { // Note: the step 256.0/numColors is truncated to an int on each addition.
            lookupTable.put(x, 0, x);

            for (int y = startIdx; y < x; y++) {
                if (lookupTable.get(y, 0)[0] == 0) {
                    lookupTable.put(y, 0, lookupTable.get(x, 0));
                }
            }
            startIdx = x;
        }
        return lookupTable;
    }

    Bitmap stitchImagesVectical(List<Mat> src) {
        Mat dst = new Mat();
        Core.vconcat(src, dst); //Core.hconcat(src, dst);
        Bitmap imgBitmap = Bitmap.createBitmap(dst.cols(), dst.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(dst, imgBitmap);

        return imgBitmap;
    }

    Bitmap stitchImagesHorizontal(List<Mat> src) {
        Mat dst = new Mat();
        Core.hconcat(src, dst); //Core.vconcat(src, dst);
        Bitmap imgBitmap = Bitmap.createBitmap(dst.cols(), dst.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(dst, imgBitmap);

        return imgBitmap;
    }

    void saveBitmap(Bitmap imgBitmap, String fileNameOpening) {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy_MM_dd_HH_mm_ss", Locale.US);
        Date now = new Date();
        String fileName = fileNameOpening + "_" + formatter.format(now) + ".jpg";

        FileOutputStream outStream;
        try {
            // Get a public path on the device storage for saving the file. Note that the word external does not mean the file is saved in the SD card. It is still saved in the internal storage.
            File path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);

            // Creates a directory for saving the image.
            File saveDir = new File(path + "/HeartBeat/");

            // If the directory is not created, create it.
            if (!saveDir.exists())
                saveDir.mkdirs();

            // Create the image file within the directory.
            File fileDir = new File(saveDir, fileName); // Creates the file.

            // Write into the image file by the BitMap content.
            outStream = new FileOutputStream(fileDir);
            imgBitmap.compress(Bitmap.CompressFormat.JPEG, 100, outStream);

            MediaScannerConnection.scanFile(this.getApplicationContext(),
                    new String[]{fileDir.toString()}, null,
                    new MediaScannerConnection.OnScanCompletedListener() {
                        public void onScanCompleted(String path, Uri uri) {
                        }
                    });

            // Close the output stream.
            outStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Implementation of the getPath() method and all its requirements is taken from the StackOverflow Paul Burke's answer: https://stackoverflow.com/a/20559175/5426539
    public static String getPath(final Context context, final Uri uri) {

        final boolean isKitKat = Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT;

        // DocumentProvider
        if (isKitKat && DocumentsContract.isDocumentUri(context, uri)) {
            // ExternalStorageProvider
            if (isExternalStorageDocument(uri)) {
                final String docId = DocumentsContract.getDocumentId(uri);
                final String[] split = docId.split(":");
                final String type = split[0];

                if ("primary".equalsIgnoreCase(type)) {
                    return Environment.getExternalStorageDirectory() + "/" + split[1];
                }

                // TODO handle non-primary volumes
            }
            // DownloadsProvider
            else if (isDownloadsDocument(uri)) {

                final String id = DocumentsContract.getDocumentId(uri);
                final Uri contentUri = ContentUris.withAppendedId(
                        Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));

                return getDataColumn(context, contentUri, null, null);
            }
            // MediaProvider
            else if (isMediaDocument(uri)) {
                final String docId = DocumentsContract.getDocumentId(uri);
                final String[] split = docId.split(":");
                final String type = split[0];

                Uri contentUri = null;
                if ("image".equals(type)) {
                    contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
                } else if ("video".equals(type)) {
                    contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
                } else if ("audio".equals(type)) {
                    contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
                }

                final String selection = "_id=?";
                final String[] selectionArgs = new String[]{
                        split[1]
                };

                return getDataColumn(context, contentUri, selection, selectionArgs);
            }
        }
        // MediaStore (and general)
        else if ("content".equalsIgnoreCase(uri.getScheme())) {
            return getDataColumn(context, uri, null, null);
        }
        // File
        else if ("file".equalsIgnoreCase(uri.getScheme())) {
            return uri.getPath();
        }

        return null;
    }

    public static String getDataColumn(Context context, Uri uri, String selection,
                                       String[] selectionArgs) {

        Cursor cursor = null;
        final String column = "_data";
        final String[] projection = {
                column
        };

        try {
            cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs,
                    null);
            if (cursor != null && cursor.moveToFirst()) {
                final int column_index = cursor.getColumnIndexOrThrow(column);
                return cursor.getString(column_index);
            }
        } finally {
            if (cursor != null)
                cursor.close();
        }
        return null;
    }

    public static boolean isExternalStorageDocument(Uri uri) {
        return "com.android.externalstorage.documents".equals(uri.getAuthority());
    }

    public static boolean isDownloadsDocument(Uri uri) {
        return "com.android.providers.downloads.documents".equals(uri.getAuthority());
    }

    public static boolean isMediaDocument(Uri uri) {
        return "com.android.providers.media.documents".equals(uri.getAuthority());
    }

}

The next figure shows a GIF image created by the app as a combination of 8 images.

Conclusion

This tutorial introduced a new effect: creating animated GIF images in the Android app. It also made some useful edits to the app so the user can load multiple custom images.

The app now works with images of any sizes, because all images are resized to a common size before any effect that works with multiple images is applied. The next tutorial in the series continues adding more effects to the Android app.
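The app's size guarantee comes from `returnMultipleSelectedImages()`: the first selected image fixes the target size, and every later image is resized to match it if needed. The following is a minimal sketch of that idea in plain Java, using `int[][]` grayscale arrays and nearest-neighbor sampling in place of OpenCV's `Imgproc.resize`; the names `ResizeSketch`, `resize`, and `matchToFirst` are hypothetical and are not part of the app's code.

```java
import java.util.ArrayList;
import java.util.List;

class ResizeSketch {

    // Nearest-neighbor resize of a grayscale image to rows x cols.
    static int[][] resize(int[][] src, int rows, int cols) {
        int[][] dst = new int[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // Map each destination pixel back to the nearest source pixel.
                int srcR = r * src.length / rows;
                int srcC = c * src[0].length / cols;
                dst[r][c] = src[srcR][srcC];
            }
        }
        return dst;
    }

    // Mirrors the guarantee in returnMultipleSelectedImages(): the first image
    // fixes the target size; later images are resized to match it if needed.
    static List<int[][]> matchToFirst(List<int[][]> images) {
        List<int[][]> result = new ArrayList<>();
        int rows = images.get(0).length;
        int cols = images.get(0)[0].length;
        for (int[][] img : images) {
            boolean sameSize = img.length == rows && img[0].length == cols;
            result.add(sameSize ? img : resize(img, rows, cols));
        }
        return result;
    }
}
```

Because every output image has the first image's dimensions, pixel-wise operations such as `imageBlending()` and `regionBlending()` can safely index both images with the same row and column.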

Fritz

Our team has been at the forefront of Artificial Intelligence and Machine Learning research for more than 15 years and we're using our collective intelligence to help others learn, understand and grow using these new technologies in ethical and sustainable ways.
