Android app keeps crashing: Caused by: java.lang.NullPointerException: Attempt to invoke virtual method

I’m trying to create an application that can detect emotions through face, voice, and text. To do so, I created three buttons that navigate to the corresponding Activities.

I’m using Azure Cognitive Services to detect faces.

However, my app keeps crashing whenever I click the first button, "Face", instead of opening FaceEmotionActivity.

The following is the error:

AndroidRuntime: FATAL EXCEPTION: main
    Process: com.example.epp, PID: 19422
    java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.epp/com.example.epp.FaceEmotionActivity}: java.lang.NullPointerException: Attempt to invoke virtual method 'void android.widget.Button.setOnClickListener(android.view.View$OnClickListener)' on a null object reference
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3511)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3650)
        at android.app.servertransaction.LaunchActivityItem.execute(LaunchActivityItem.java:83)
        at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:151)
        at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:111)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2213)
        at android.os.Handler.dispatchMessage(Handler.java:107)
        at android.os.Looper.loop(Looper.java:238)
        at android.app.ActivityThread.main(ActivityThread.java:7864)
        at java.lang.reflect.Method.invoke(Native Method)
        at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:492)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:998)
     Caused by: java.lang.NullPointerException: Attempt to invoke virtual method 'void android.widget.Button.setOnClickListener(android.view.View$OnClickListener)' on a null object reference
        at com.example.epp.FaceEmotionActivity.onCreate(FaceEmotionActivity.java:86)
        at android.app.Activity.performCreate(Activity.java:7967)
        at android.app.Activity.performCreate(Activity.java:7956)
        at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1320)
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3482)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3650) 
        at android.app.servertransaction.LaunchActivityItem.execute(LaunchActivityItem.java:83) 
        at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:151) 
        at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:111) 
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2213) 
        at android.os.Handler.dispatchMessage(Handler.java:107) 
        at android.os.Looper.loop(Looper.java:238) 
        at android.app.ActivityThread.main(ActivityThread.java:7864) 
        at java.lang.reflect.Method.invoke(Native Method) 
        at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:492) 
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:998) 
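From reading about this exception, my understanding is that `findViewById` returns null when the requested id is not present in the layout passed to `setContentView`, and invoking `setOnClickListener` on that null reference throws exactly this NullPointerException. A minimal, non-Android sketch of the pattern (`FakeButton`, `ClickListener`, and this `findViewById` are stand-ins I wrote for illustration, not real Android classes):

```java
// Non-Android sketch of the failure mode: looking up a view id that is not
// in the inflated layout yields null, and calling a method on it throws NPE.
public class NpeSketch {
    interface ClickListener { void onClick(); }

    static class FakeButton {
        void setOnClickListener(ClickListener l) { /* store listener */ }
    }

    // Stand-in for Activity.findViewById: returns null when the id is
    // missing from the layout that was passed to setContentView.
    static FakeButton findViewById(boolean idExistsInInflatedLayout) {
        return idExistsInInflatedLayout ? new FakeButton() : null;
    }

    public static void main(String[] args) {
        FakeButton btn = findViewById(false); // id not in the inflated layout
        try {
            btn.setOnClickListener(() -> {});
        } catch (NullPointerException e) {
            System.out.println("NullPointerException: setOnClickListener on a null object reference");
        }
    }
}
```

So the question is why the lookup at FaceEmotionActivity.java:86 would come back null in my case.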

I searched for a solution but could not find a proper answer.

My MainActivity.java is:

package com.example.epp;

import android.Manifest;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.widget.Button;

import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

public class MainActivity extends AppCompatActivity {


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // Check for camera permission
        if (ContextCompat.checkSelfPermission(getApplicationContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(MainActivity.this, new String[]{Manifest.permission.CAMERA}, 110);
        }

        // Three button objects, looked up by id
        Button faceButton = findViewById(R.id.facebtn);
        Button voiceButton = findViewById(R.id.voicebtn);
        Button textButton = findViewById(R.id.textbtn);

        // Face button: open FaceEmotionActivity
        faceButton.setOnClickListener(v -> {
            Intent faceIntent = new Intent(MainActivity.this, FaceEmotionActivity.class);
            startActivity(faceIntent);

        });
        // Voice button: open VoiceActivity
        voiceButton.setOnClickListener(v -> {
            Intent voiceIntent = new Intent(MainActivity.this, VoiceActivity.class);
            startActivity(voiceIntent);
        });
        // Text button: open TextActivity
        textButton.setOnClickListener(v -> {
            Intent textIntent = new Intent(MainActivity.this, TextActivity.class);
            startActivity(textIntent);
        });


    }
}

The XML for activity_main is:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="wrap_content"
    android:layout_height="match_parent"
    android:textAlignment="center"
    tools:context=".MainActivity">

    <Button
        android:id="@+id/facebtn"
        android:layout_width="match_parent"
        android:layout_height="78dp"
        android:layout_alignTop="@+id/voicebtn"
        android:layout_alignParentStart="true"
        android:layout_alignParentEnd="true"
        android:layout_centerHorizontal="true"
        android:layout_marginStart="-3dp"
        android:layout_marginTop="-149dp"
        android:layout_marginEnd="2dp"
        android:layout_marginBottom="75dp"
        android:background="#8E2121"
        android:text="FACE"
        android:textSize="30sp"
        app:layout_constraintBottom_toTopOf="@+id/textbtn"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintHorizontal_bias="0.505"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toBottomOf="@+id/textView2" />

    <Button
        android:id="@+id/voicebtn"
        android:layout_width="match_parent"
        android:layout_height="75dp"
        android:layout_alignTop="@+id/textbtn"
        android:layout_alignParentStart="true"
        android:layout_alignParentEnd="true"
        android:layout_marginStart="-2dp"
        android:layout_marginTop="-140dp"
        android:layout_marginEnd="0dp"
        android:layout_marginBottom="75dp"
        android:background="#8E2121"
        android:text="VOICE"
        android:textSize="30sp"
        app:layout_constraintBottom_toTopOf="@+id/textbtn"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintHorizontal_bias="0.505"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toBottomOf="@+id/textView2" />

    <Button
        android:id="@+id/textbtn"
        android:layout_width="match_parent"
        android:layout_height="75dp"
        android:layout_alignParentStart="true"
        android:layout_alignParentBottom="true"
        android:layout_marginStart="0dp"
        android:layout_marginTop="348dp"
        android:layout_marginBottom="149dp"
        android:background="#8E2121"
        android:text="TEXT"
        android:textSize="30sp"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintHorizontal_bias="0.494"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toBottomOf="@+id/textView2" />

</RelativeLayout>

FaceEmotionActivity.java, where the Face button is supposed to navigate:

package com.example.epp;

import android.annotation.SuppressLint;
import android.app.ProgressDialog;
import android.content.ActivityNotFoundException;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.os.AsyncTask;
import android.os.Bundle;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.Toast;

import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;

import com.microsoft.projectoxford.face.FaceServiceClient;
import com.microsoft.projectoxford.face.FaceServiceRestClient;
import com.microsoft.projectoxford.face.contract.Face;
import com.microsoft.projectoxford.face.contract.FaceRectangle;

import org.json.JSONObject;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.util.Arrays;

import static android.content.ContentValues.TAG;


public class FaceEmotionActivity extends AppCompatActivity {

    private final FaceServiceClient faceServiceClient = new FaceServiceRestClient("https://emotiondet.cognitiveservices.azure.com//face/v1.0/", "529b219bc4b94cfe9f97e20fa3f13b51");

    JSONObject jsonObject, jsonObject1;
    ImageView imageView;
    Bitmap mBitmap;
    boolean takePicture = false;

    private ProgressDialog detectionProgressDialog;
    Face[] facesDetected;
    static final int REQUEST_IMAGE_CAPTURE = 1;


    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
            assert data != null;
            Bitmap bitmap = (Bitmap) data.getExtras().get("data");
            imageView.setImageBitmap(bitmap);
            detectAndFrame(bitmap);
        }
    }
    

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        detectionProgressDialog = new ProgressDialog(this);

        jsonObject = new JSONObject();
        jsonObject1 = new JSONObject();
        imageView = findViewById(R.id.imageView);
        Toast.makeText(getApplicationContext(), "Press the Detect Button to take a picture. Press Identify to identify the person.", Toast.LENGTH_LONG).show();

        Button btnDetect = findViewById(R.id.faceDetectBtn);

        btnDetect.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
                try {
                    startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
                } catch (ActivityNotFoundException e) {
                    // display error state to the user
                }
            }
        });
    }


    private void detectAndFrame(final Bitmap imageBitmap) {
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        imageBitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
        ByteArrayInputStream inputStream =
                new ByteArrayInputStream(outputStream.toByteArray());

        @SuppressLint("StaticFieldLeak") AsyncTask<InputStream, String, Face[]> detectTask =

                new AsyncTask<InputStream, String, Face[]>() {
                    String exceptionMessage = "";

                    @SuppressLint("DefaultLocale")
                    @Override
                    protected Face[] doInBackground(InputStream... params) {
                        try {
                            publishProgress("Detecting...");
                            Face[] result = faceServiceClient.detect(
                                    params[0],
                                    true,         // returnFaceId
                                    false,        // returnFaceLandmarks
                                    // returnFaceAttributes:
                                    new FaceServiceClient.FaceAttributeType[]{
                                            FaceServiceClient.FaceAttributeType.Emotion,
                                            FaceServiceClient.FaceAttributeType.Gender}
                            );

                            for (int i = 0; i < result.length; i++) {
                                jsonObject.put("happiness", result[i].faceAttributes.emotion.happiness);
                                jsonObject.put("sadness", result[i].faceAttributes.emotion.sadness);
                                jsonObject.put("surprise", result[i].faceAttributes.emotion.surprise);
                                jsonObject.put("neutral", result[i].faceAttributes.emotion.neutral);
                                jsonObject.put("anger", result[i].faceAttributes.emotion.anger);
                                jsonObject.put("contempt", result[i].faceAttributes.emotion.contempt);
                                jsonObject.put("disgust", result[i].faceAttributes.emotion.disgust);
                                jsonObject.put("fear", result[i].faceAttributes.emotion.fear);
                                Log.e(TAG, "doInBackground: " + jsonObject.toString());

                                jsonObject1.put((String.valueOf(i)), jsonObject);
                            }
                            runOnUiThread(() -> Toast.makeText(FaceEmotionActivity.this, "DATA" + jsonObject1.toString(), Toast.LENGTH_LONG).show());

                            Log.e("TAG", "doInBackground: " + "   " + result.length);
                            publishProgress(String.format(
                                    "Detection Finished. %d face(s) detected",
                                    result.length));

                            return result;
                        } catch (Exception e) {
                            exceptionMessage = String.format(
                                    "Detection failed: %s", e.getMessage());
                            return null;
                        }
                    }

                    @Override
                    protected void onPreExecute() {
                        detectionProgressDialog.show();
                    }

                    @Override
                    protected void onProgressUpdate(String... progress) {
                        detectionProgressDialog.setMessage(progress[0]);
                    }

                    @Override
                    protected void onPostExecute(Face[] result) {
                        detectionProgressDialog.dismiss();

                        facesDetected = result;

                        
                        if (result == null) {
                            // showError("No faces detected");
                        }
                        Log.e("TAG", "onPostExecute: " + Arrays.toString(facesDetected));

                        ImageView imageView = findViewById(R.id.imageView);
                        imageView.setImageBitmap(
                                drawFaceRectanglesOnBitmap(imageBitmap, result));
                        imageBitmap.recycle();
                        Toast.makeText(getApplicationContext(), "Now you can identify the person by pressing the \"Identify\" Button", Toast.LENGTH_LONG).show();
                        takePicture = true;
                    }
                };

        detectTask.execute(inputStream);
    }

    private static Bitmap drawFaceRectanglesOnBitmap(
            Bitmap originalBitmap, Face[] faces) {
        Bitmap bitmap = originalBitmap.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas = new Canvas(bitmap);
        Paint paint = new Paint();
        paint.setAntiAlias(true);
        paint.setStyle(Paint.Style.STROKE);
        paint.setColor(Color.RED);
        paint.setStrokeWidth(9);
        if (faces != null) {
            for (Face face : faces) {
                FaceRectangle faceRectangle = face.faceRectangle;
                canvas.drawRect(
                        faceRectangle.left,
                        faceRectangle.top,
                        faceRectangle.left + faceRectangle.width,
                        faceRectangle.top + faceRectangle.height,
                        paint);
            }
        }
        return bitmap;
    }
}

The XML for FaceEmotionActivity.java is activity_face_emotion.xml:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <ImageView
        android:id="@+id/imageView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <LinearLayout
        android:id="@+id/groupButton"
        android:layout_alignParentBottom="true"

        android:gravity="center_vertical"

        android:layout_width="match_parent"
        android:layout_height="wrap_content">

        <Button
            android:id="@+id/faceDetectBtn"
            android:layout_weight="3"
            android:text="Detect Face"
            android:layout_width="0dp"
            android:layout_height="wrap_content" />


    </LinearLayout>


</RelativeLayout>

I haven’t created the other two activities yet, since they aren’t needed right now.

Thank you

Source: Android Questions
