How to Access Android Camera Using Qt/C++

In this article I am going to describe the steps required for accessing the Android camera (the default camera interface) using Qt. Unfortunately OpenCV does not provide a reliable way of connecting to the camera on Android, so you have to go for a method like this if you intend to write an Android application that uses OpenCV and Qt together. I strongly recommend that you first read this article (which describes how to access the Android Gallery from Qt) and also this article (which shows how to mix Java and C++ code in Qt) and then return here, because I will be assuming that you are familiar with those processes. So if you can already access the Android Gallery using Qt, continue with the steps described below.

Almost all of the code and the overall approach for accessing the camera is the same as for accessing the Gallery, apart from a few differences that start in the AndroidManifest.xml file. You need to add the camera permission and feature to your Qt Android project by adding the following lines to AndroidManifest.xml:

<uses-permission android:name="android.permission.CAMERA"/>

<uses-feature android:name="android.hardware.camera" android:required="false"/>

Next, you need a simple function in your Qt code to initiate the camera (capture) function. The capture function itself will be written in Java, but you need to call it from your Qt C++ code. Something similar to this:

QAndroidJniObject::callStaticMethod<void>("com/amin/QtAndroidCameraApp/QtAndroidCameraApp",
	"captureAnImage",
	"()V");

Note that QtAndroidCameraApp is just a name I have given it; you should use your own package and class name instead.

Again, just like accessing the Gallery, you need a JNI function which receives the file name returned from the Java code and stores it in the selectedFileName variable. Like this:

#include <jni.h>
#include <QAndroidJniObject>

// Called from Java with the captured image's file name; stores it in selectedFileName
extern "C" JNIEXPORT void JNICALL
Java_com_amin_QtAndroidCameraApp_QtAndroidCameraApp_fileSelected(JNIEnv * /*env*/,
	jobject /*obj*/,
	jstring results)
{
	selectedFileName = QAndroidJniObject(results).toString();
}

Finally, you need the following in your Java code.

The captureAnImage function is the one that will be called from C++. It needs to be defined as static so that it can be called from C++ without an instance of the class:

static void captureAnImage()
{
	m_instance.dispatchTakePictureIntent();
}
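
m_instance is not defined in the snippets above; it is simply a static reference to the activity itself, so that static methods called from C++ can reach instance methods. A minimal sketch of how it might be declared and set (assuming your activity class is named QtAndroidCameraApp, as in the C++ call earlier):

// Static handle to the running activity, assigned when the activity is constructed
public static QtAndroidCameraApp m_instance;

public QtAndroidCameraApp()
{
	m_instance = this;
}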


m_instance, which always points to the activity instance ("this"), calls dispatchTakePictureIntent to start the camera interface:

private void dispatchTakePictureIntent()
{
	Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);

	// Ensure that there's a camera activity to handle the intent
	if (takePictureIntent.resolveActivity(getPackageManager()) != null)
	{
		// Create the File where the photo should go
		File photoFile = null;
		try
		{
			photoFile = m_instance.createImageFile();
		}
		catch (IOException ex)
		{
			// Error occurred while creating the File
			Toast.makeText(QtAndroidCameraApp.this, ex.getLocalizedMessage(), Toast.LENGTH_LONG).show();
			Toast.makeText(QtAndroidCameraApp.this, ex.getMessage(), Toast.LENGTH_LONG).show();
		}
		// Continue only if the File was successfully created
		if (photoFile != null)
		{
			takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));

			lastCameraFileUri = photoFile.toString();

			startActivityForResult(takePictureIntent, REQUEST_CAPTURE_IMAGE);
		}
	}
	else
	{
		Toast.makeText(QtAndroidCameraApp.this, "Problems with your camera?!", Toast.LENGTH_SHORT).show();
	}
}

Updated: Added missing function createImageFile

The createImageFile function mentioned above needs to be defined as follows:

private File createImageFile() throws IOException
{
	// Create an image file name
	String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
	String imageFileName = "MYAPPTEMP_" + timeStamp + "_";
	File storageDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
	File image = File.createTempFile(imageFileName, /* prefix */
		".jpg", /* suffix */
		storageDir /* directory */
	);

	return image;
}
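
For reference, the Java snippets in this article rely on a number of framework classes; based on the code shown here, the imports at the top of your Java source would roughly be the following (a sketch, not necessarily complete for your project):

import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;

import android.content.Intent;
import android.net.Uri;
import android.os.Environment;
import android.provider.MediaStore;
import android.widget.Toast;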

Even though it might seem a little complex at first, it's quite simple: you start the camera interface and wait for the result. If the result is a valid image file, you return its path and catch it in your C++ code; if it's not, you just show some error toasts and exit. You might have noticed that there are some variables and functions used in dispatchTakePictureIntent which are described below.

The first one is lastCameraFileUri, which holds the path of the file the camera will write the captured image to:

public String lastCameraFileUri;

Next is REQUEST_CAPTURE_IMAGE, a request code you define so that, when the activity result arrives, you can tell it belongs to the image captured by the camera:

static final int REQUEST_CAPTURE_IMAGE = 1;


And finally you need to override the onActivityResult function as seen below:

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data)
{
	if (resultCode == RESULT_OK)
	{
		if (requestCode == REQUEST_CAPTURE_IMAGE)
		{
			String filePath = lastCameraFileUri;
			fileSelected(filePath);
		}
	}
	else
	{
		fileSelected(":(");
	}

	super.onActivityResult(requestCode, resultCode, data);
}

fileSelected is a function that needs to be declared as native because, as you saw above, its body is implemented in your C++ Qt code; Java calls it to pass the captured file name back to the C++ side. You only need to write the declaration, as seen below:

public static native void fileSelected(String fileName);

That's it; you should now be able to access the Android camera using the approach described here. An example of this implementation can be found in my Android app for Fourier transform analysis, called Image Transformer.

Important Note: With the most recent Android update (6.0, API 23) there seem to be some errors and crashes that might be related to this approach, possibly because of the runtime permission model introduced in that release, but you should be able to fix them yourself if you have understood the steps described here.
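
If the crashes you see on Android 6.0 are permission related, one thing worth trying is requesting the CAMERA permission at runtime before starting the capture. The following is only a sketch of what that might look like in the activity's onCreate (REQUEST_CAMERA_PERMISSION is a request code I am introducing here, and it also needs android.Manifest, android.content.pm.PackageManager, android.os.Build and android.os.Bundle added to the imports listed earlier):

static final int REQUEST_CAMERA_PERMISSION = 2;

@Override
protected void onCreate(Bundle savedInstanceState)
{
	super.onCreate(savedInstanceState);

	// From API 23 onwards, dangerous permissions declared in the manifest
	// also have to be granted by the user at runtime
	if (Build.VERSION.SDK_INT >= 23
		&& checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED)
	{
		requestPermissions(new String[] { Manifest.permission.CAMERA }, REQUEST_CAMERA_PERMISSION);
	}
}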

As always, you can post your comments and questions below. I will also be glad if someone lets me know of a better method or reports any errors that might be caused by using this method.

Good luck! 🙂

7 Replies to “How to Access Android Camera Using Qt/C++”

  1. Hi, were you using OpenCV in this post? I didn't see any OpenCV-related function in it. Do you know if OpenCV supports using Qt for Android to develop an Android app that opens the camera and reads frames? I succeeded in developing for my Android device (Android version 4.2) with the lib libnative_camera_r4.2.0.so, and now I want to use my program on Android version 5.1, but I can't find a suitable native camera lib and it fails to run. Do you have any idea? Thanks.
    I installed OpenCV 3.1 and OpenCV 2.4.11

    1. No, this is not related to OpenCV. This is an example of using JNI and a mixture of C++/Java code to access the Android camera. Unfortunately there is no stable camera lib in OpenCV for Android that you can use without worrying about anything, so you have no choice but to go for the built-in gallery and/or camera.

      1. Thanks for your reply.
        Do you know if there is any method to access the Android camera and get the frame data using Qt? I am developing an app with a 'Video Chat' feature. I tried QCamera and the QML Camera, but the received frames can't be output smoothly on the Android device. If you have any ideas, can you add my Skype at ******, so we can discuss this?

        1. I'm quite sure this was not at all possible with previous versions of Qt (and OpenCV), but I haven't tried recent versions (such as 5.7).
          Perhaps I'll write a tutorial on this after I give it a try myself.

  2. Thanks for the tutorial. I have a problem with this line:
    photoFile = m_instance.createImageFile();
    It says: cannot find symbol.
    I already ran the GalleryImage example successfully, but not this one.
    Another question: can we open the camera interface and get frames in order to process them using OpenCV?

    1. Thanks for letting me know about the missing function. I just updated the article and added the createImageFile function. It needs to be defined like this:

      private File createImageFile() throws IOException
      {
          // Create an image file name
          String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
          String imageFileName = "MYAPPTEMP_" + timeStamp + "_";
          File storageDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
          File image = File.createTempFile(imageFileName, /* prefix */
              ".jpg", /* suffix */
              storageDir /* directory */
          );

          return image;
      }

      And about your other question, do you mean processing video frames in real time, or recording videos and processing them afterwards? If you meant the latter, then it's quite simple, almost the same as this post. But if you mean processing video frames on the fly, then you have to override the camera interface. Technically speaking it is possible (technically speaking nothing is impossible!), but in practice it needs quite a lot of effort, especially if you intend to work with Qt and write native code.
