Camera example

Updated: October 28, 2024

The Camera example application shows you how to work with the Camera library by letting you use the cameras connected to your reference image.

You must run the application using a configuration that includes cameras. The application is useful for testing whether your connected cameras work, and for understanding their settings and how to use their various features.

This application is meant to be a reference for learning how to use the Camera library API to build an application. The Camera example provides the following options:
Camera viewfinder
This option allows you to show image buffers captured from a camera on the display. If your camera is configured correctly, you should see at least one camera connected to your system. (A minimal sketch of the underlying API sequence follows this list.)
Record video to a file
This option allows you to record the image buffers from a camera to a file in an uncompressed format (MOV/UCV). You can record from only one of the cameras connected to your system, including file cameras (i.e., simulated cameras). If you record multiple times, multiple files are created. The files are recorded in the camera roll, which is located in the roll directory specified at the start of the Sensor service.
Camera stream
Nothing is displayed, but the console shows that frames from the image buffers are being delivered, along with a printout indicating the current frame rate. The code for this option demonstrates how to get buffers from a camera using the Screen Graphics Subsystem.
Multiple camera video
Show video from one or more cameras. You can choose the cameras from which to stream video as well as encode or record the video to an uncompressed video file format (MOV/UCV). This option synchronizes the start time so that all cameras start providing the output at the same time. This option also stops the output of all cameras at the same time.
EGL viewfinder
This option allows you to show image buffers captured from a camera on the display, but the images are shown in a texture using the GPU.
Event example
Show the video using event mode. The example sets up a pulse that is delivered every time a buffer is received. The buffer is both readable and writable. After the pulse is received, the buffer is acquired and processed by inverting only the luma component, which makes the video show objects in their original color with the brightness inverted. For example, a red light remains red in the video, as shown in the following illustration; however, it appears dimmer than its surroundings rather than brighter. (A sketch of the luma-inversion step follows this list.)


Figure 1. Inverted luma.
Exit the example
Stop running the application.
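
The viewfinder option corresponds to the most basic use of the Camera library API: open a camera unit, select a viewfinder mode, start the viewfinder, and later stop it and close the unit. The following C sketch outlines that sequence. It is a minimal outline only; it assumes the camera/camera_api.h prototypes documented for the Sensor Framework, so verify the exact function names, flags, and callback parameters against the headers installed with your SDP.

/* Minimal viewfinder sketch. The calls below assume the Sensor Framework
 * camera/camera_api.h prototypes; check your installed headers for the
 * exact signatures before relying on this outline. */
#include <stdio.h>
#include <unistd.h>
#include <camera/camera_api.h>

/* Invoked for every image buffer the viewfinder delivers. */
static void vf_callback(camera_handle_t handle, camera_buffer_t *buffer, void *arg)
{
    (void)handle;
    (void)buffer;
    (void)arg;
    printf("received a viewfinder frame\n");
}

int main(void)
{
    camera_handle_t handle = CAMERA_HANDLE_INVALID;

    /* Open the first camera unit with read/write access. */
    if (camera_open(CAMERA_UNIT_1, CAMERA_MODE_RW, &handle) != CAMERA_EOK) {
        fprintf(stderr, "camera_open() failed\n");
        return 1;
    }

    /* Select the video viewfinder mode (the CAMERA_VFMODE_VIDEO choice
     * offered by the example's menu). */
    if (camera_set_vf_mode(handle, CAMERA_VFMODE_VIDEO) != CAMERA_EOK) {
        fprintf(stderr, "camera_set_vf_mode() failed\n");
        camera_close(handle);
        return 1;
    }

    /* Start streaming; buffers are delivered to vf_callback(). */
    if (camera_start_viewfinder(handle, vf_callback, NULL, NULL) != CAMERA_EOK) {
        fprintf(stderr, "camera_start_viewfinder() failed\n");
        camera_close(handle);
        return 1;
    }

    sleep(10);                      /* stream for a short while */

    camera_stop_viewfinder(handle); /* stop the flow of buffers */
    camera_close(handle);           /* release the camera unit  */
    return 0;
}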
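
The luma inversion performed by the Event example is ordinary per-frame pixel processing and does not depend on the Camera API itself: once a writable buffer has been acquired, only the Y (luma) plane is rewritten. Below is a standalone sketch of that step, assuming an NV12 frame whose luma plane starts at the beginning of the buffer and occupies stride bytes per row; the pulse handling and buffer acquisition are omitted, and the function name and parameters are illustrative rather than taken from the example's source.

#include <stddef.h>
#include <stdint.h>

/* Invert only the luma (Y) plane of an NV12 frame in place.
 * The chroma (UV) plane is left untouched, so objects keep their
 * original color while bright areas become dark and vice versa. */
static void invert_luma_nv12(uint8_t *frame, uint32_t width,
                             uint32_t height, uint32_t stride)
{
    for (uint32_t row = 0; row < height; row++) {
        uint8_t *line = frame + (size_t)row * stride;   /* start of this Y row */
        for (uint32_t col = 0; col < width; col++) {
            line[col] = 255 - line[col];                /* invert brightness   */
        }
    }
}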

Run the Camera example application

To run the Camera example application (camera_example), do the following:
  1. Connect to your image using ssh.
  2. If another example application is running, type slmctl "stop example_app" in your ssh session to stop it; for instance:
    # slmctl "stop camera_mux"
    The application should stop running and you should see a blank display.
  3. Type camera_example to run the application:
    # camera_example
After you run the command, a menu appears. To exit the application, press Ctrl-C. If you are using the default configuration, enter the following menu commands to see the video for the file camera:
  • 1 (Camera viewfinder)
  • 1 (CAMERA_UNIT_1)
  • 2 (CAMERA_VFMODE_VIDEO)
  • n
  • n

Examples

Show video from a single camera

Here's how you can run the application to show video from one USB camera on the display:
# camera_example  
Select which example you want to run: 
	1) Camera viewfinder
	2) Record video to a file
	3) Camera stream
	4) Multiple camera video
	5) EGL viewfinder
	6) Event example
	x) Exit the example
1
Select which of the following cameras you want to use:
	1) CAMERA_UNIT_1
Enter Choice:
1
Select which of the following VF modes you want to use:
	2) CAMERA_VFMODE_VIDEO
Enter Choice:
2
Do you want to modify the viewfinder configuration (y/n)?
n
Choose from the following options:
	e) Modify exposure
	i) Modify image attributes
	l) Lock 3A
	v) Modify viewfinder configuration
	w) Modify whitebalance
	x) Exit the example
x

Modify the exposure for one camera

Here's how you can modify the exposure on a supported camera:
# camera_example  
Select which example you want to run: 
	1) Camera viewfinder
	2) Record video to a file
	3) Camera stream
	4) Multiple camera video
	5) EGL viewfinder
	6) Event example
	x) Exit the example
1
Select which of the following 1 cameras you want to use:
	1) CAMERA_UNIT_1
Enter Choice:
1
Select which of the following VF modes you want to use:
	2) CAMERA_VFMODE_VIDEO
Enter Choice:
2
Do you want to modify the viewfinder configuration (y/n)?
n
Do you want to modify image attributes (y/n)?
n
Choose from the following options:
	e) Modify exposure
	i) Modify image attributes
	l) Lock 3A
	v) Modify viewfinder configuration
	w) Modify whitebalance
	x) Exit the example
e
Select the desired exposure mode:
	1) Default
	2) Auto
	3) ISO Priority
	4) Shutter Priority
	5) ISO Shutter Priority
Enter Choice:
4
Select the desired shutter speed value within range of 0.000011 to 0.033249
0.032
Select the desired EV offset within range of -7.584959 to 2.413635
0
Choose from the following options:
	e) Modify exposure
	i) Modify image attributes
	l) Lock 3A
	v) Modify viewfinder configuration
	w) Modify whitebalance
	x) Exit the example
x               

Build the Camera example application

The source code for the Camera example application is provided on your host computer. You can build the source code and deploy the resulting binary onto your target.

  1. A ZIP file installed with the QNX SDP 7.1 Sensor Framework Base package is located at installation_directory/source/sf-camera-examples-version.zip. You must extract it to access the source code for the camera examples.
  2. In the extraction_directory/source_package_sf_camera directory, there's a README.txt that you can follow to build the application.
  3. Create a backup of the original camera_example file on your target. This file is found in the target's /usr/bin/ directory.
  4. Deploy your newly built version of the Camera example application by navigating to extraction_directory/source_package_sf_camera/apps/sensor/camera_example/nto/aarch64/so.le on your host and transferring the camera_example file to the /usr/bin/ directory on the target. You can use the QNX Momentics IDE to transfer the file.
  5. Connect to your target through a terminal connection and run the Camera example application.