Functions to set up a Raspberry Pi Camera v2 for scientific use.
This module provides slower, simpler functions to set the gain, exposure, and white balance of a Raspberry Pi camera, using the picamera2 Python library. It's mostly used by the OpenFlexure Microscope, though it deliberately has no hard dependencies on said software, so that it's useful on its own.
There are three main calibration steps:
- Setting exposure time and gain to get a reasonably bright image.
- Fixing the white balance to get a neutral image.
- Taking a uniform white image and using it to calibrate the Lens Shading Table.
The most reliable way to do this, avoiding any issues relating to "memory" or nonlinearities in the camera's image processing pipeline, is to use raw images. This is quite slow, but very reliable. The three steps above can be accomplished by:
    picamera = picamera2.Picamera2()
    adjust_shutter_and_gain_from_raw(picamera)
    adjust_white_balance_from_raw(picamera)
    lst = lst_from_camera(picamera)
    picamera.lens_shading_table = lst
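In practice the camera must be configured with a raw stream and started before the raw-based routines can capture images. The following is a minimal sketch of that setup using standard picamera2 calls; the import path for the calibration functions and the choice to request a raw stream at full sensor resolution are assumptions, not part of this module's documented API.

    from picamera2 import Picamera2
    # The calibration helpers are assumed to be importable from this module;
    # adjust the import to wherever the package is installed.

    picamera = Picamera2()
    config = picamera.create_still_configuration(
        raw={"size": picamera.sensor_resolution},  # make sure a raw stream is available
    )
    picamera.configure(config)
    picamera.start()
    try:
        adjust_shutter_and_gain_from_raw(picamera, target_white_level=700)
        adjust_white_balance_from_raw(picamera)
        lst = lst_from_camera(picamera)  # luminance/chrominance lens shading tables
    finally:
        picamera.stop()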
| Class | | Record the results of testing the camera's current exposure settings. |
| Function | adjust_shutter_and_gain_from_raw | Adjust exposure and analog gain based on raw images. |
| Function | adjust_white_balance_from_raw | Adjust the white balance in a single shot, based on the raw image. |
| Function | as… | Flatten array, round, and then convert to list. |
| Function | channels… | Given the 'array' from a PiBayerArray, return the 4 channels. |
| Function | check… | Check whether the brightness is within the specified target range. |
| Function | copy… | Copy the rpi.alsc algorithm from one tuning to another. |
| Function | downsampled_channels | Generate a downsampled, un-normalised image from which to calculate the LST. |
| Function | get… | Compresses a channel down to a 16x12 grid - from libcamera. |
| Function | get… | Get the rpi.ccm section of a camera tuning dict. |
| Function | grids… | Convert from a luminance/chrominance dict to four RGGB channels. |
| Function | index… | Find the index of an algorithm's section in the tuning file. |
| Function | load… | Load the default tuning file for the camera. |
| Function | lst_from_camera | Acquire a raw image and use it to calculate a lens shading table. |
| Function | lst… | Given the 4 Bayer colour channels from a white image, generate a LST. |
| Function | lst_from_grids | Given 4 downsampled grids, generate the luminance and chrominance tables. |
| Function | lst… | Whether the lens shading table is set to static. |
| Function | raw… | Acquire a raw image and return a 4xNxM array of the colour channels. |
| Function | recreate… | Delete and recreate the camera manager. |
| Function | set… | Enable manual exposure, with low gain and shutter speed. |
| Function | set… | Update the rpi.alsc section of a camera tuning dict to use a static correction. |
| Function | set… | Update the rpi.geq section of a camera tuning dict. |
| Function | set… | Update the rpi.alsc section of a camera tuning dict to use a static correction. |
| Function | test… | Evaluate current exposure settings using a raw image. |
| Function | upsample… | Zoom an image in the last two dimensions. |
| Type Alias | | Undocumented |
| Function | _geq… | Whether the green equalisation is set to static. |
def adjust_shutter_and_gain_from_raw(camera: Picamera2, target_white_level: int = 700, max_iterations: int = 20, tolerance: float = 0.05, percentile: float = 99.9) -> float:
Adjust exposure and analog gain based on raw images.
This routine is slow but effective. It uses raw images, so we are not affected by white balance or digital gain.
| Parameters | |
| camera: Picamera2 | A Picamera2 object. |
| target_white_level: int | The raw, 10-bit value we aim for. The brightest pixels should be approximately this bright. The maximum possible is about 900; 700 is reasonable. |
| max_iterations: int | We will terminate once we have performed this many iterations, whether or not we have converged. More than 10 shouldn't be needed. |
| tolerance: float | How close to the target value we consider "done". Expressed as a fraction of target_white_level, so 0.05 means +/- 5%. |
| percentile: float | Rather than using the maximum value for each channel, we calculate a percentile. This makes us robust to single pixels that are bright or noisy. 99.9% still picks the top of the brightness range, but seems much more reliable than just np.max(). |
| Returns | |
| float | Undocumented |
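As a usage sketch with the documented parameters (the camera is assumed to be configured and started; what the returned float represents is not documented here):

    result = adjust_shutter_and_gain_from_raw(
        picamera,
        target_white_level=700,   # aim for raw values around 700 on the 10-bit scale
        max_iterations=20,        # give up after 20 adjustment rounds
        tolerance=0.05,           # accept anything within +/- 5% of the target
        percentile=99.9,          # judge brightness by the 99.9th-percentile pixel value
    )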
def adjust_white_balance_from_raw(camera: Picamera2, percentile: float = 99, luminance: np.ndarray | None = None, Cr: np.ndarray | None = None, Cb: np.ndarray | None = None, luminance_power: float = 1.0, method: Literal["percentile", "centre"] = "centre") -> tuple[float, float]:
Adjust the white balance in a single shot, based on the raw image.
NB: if channels_from_raw_image is broken, this will go haywire. We should probably have better logic to verify that the channels really are BGGR...
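A usage sketch, assuming the returned pair is the (red, blue) colour gains; whether the function also applies the gains itself is not stated here, so setting them explicitly through the standard picamera2 control is shown as a possibly redundant extra step.

    red_gain, blue_gain = adjust_white_balance_from_raw(picamera, method="centre")
    # If the function does not apply the gains itself, they can be set manually:
    picamera.set_controls({"AwbEnable": False, "ColourGains": (red_gain, blue_gain)})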
Copy the rpi.alsc algorithm from one tuning to another.
This is done in-place, i.e. modifying to_tuning.
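A minimal sketch of such an in-place copy, assuming version-2 tuning dicts that keep an "algorithms" list of single-key {name: config} entries; the helper names here are hypothetical, not this module's API.

    import copy

    def copy_alsc_section_example(from_tuning: dict, to_tuning: dict) -> None:
        def index_of(tuning: dict, name: str) -> int:
            # Find the position of the named algorithm in the tuning's algorithm list.
            return next(i for i, alg in enumerate(tuning["algorithms"]) if name in alg)

        src = from_tuning["algorithms"][index_of(from_tuning, "rpi.alsc")]
        to_tuning["algorithms"][index_of(to_tuning, "rpi.alsc")] = copy.deepcopy(src)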
Generate a downsampled, un-normalised image from which to calculate the LST.
TODO: blacklevel probably ought to be determined from the camera...
Compresses a channel down to a 16x12 grid - from libcamera.
This is taken from https://git.linuxtv.org/libcamera.git/tree/utils/raspberrypi/ctt/ctt_alsc.py for consistency.
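A simplified illustration of that downsampling by block averaging (the libcamera version also copes with images whose dimensions are not exact multiples of the grid):

    import numpy as np

    def downsample_to_16x12(channel: np.ndarray) -> np.ndarray:
        # Crop so the image tiles exactly into 12 rows x 16 columns of blocks,
        # then average each block to produce the (12, 16) grid.
        h, w = channel.shape
        h, w = h - h % 12, w - w % 16
        blocks = channel[:h, :w].reshape(12, h // 12, 16, w // 16)
        return blocks.mean(axis=(1, 3))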
Convert from a luminance/chrominance dict to four RGGB channels.
Note that these will be normalised - the maximum green value is always 1. Also, note that the channels are BGGR, to be consistent with the channels_from_raw_image function. This should probably change in the future.
Load the default tuning file for the camera.
This will open and close the camera to determine its model. If you are using a model that's supported by picamera2 it should have a tuning file built in. If not, this will probably crash with an error.
Error handling for unsupported cameras is not something we are likely to test in the short term.
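A sketch of that approach using public picamera2 APIs: briefly open the camera to read the sensor model, then load the matching built-in tuning file. Whether this matches the module's implementation exactly is an assumption.

    from picamera2 import Picamera2

    picamera = Picamera2()
    model = picamera.camera_properties["Model"]   # e.g. "imx219" for a Pi Camera v2
    picamera.close()
    # Raises an error if picamera2 has no built-in tuning file for this model.
    tuning = Picamera2.load_tuning_file(f"{model}.json")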
Given the 4 Bayer colour channels from a white image, generate a LST.
Internally, it just calls downsampled_channels and lst_from_grids.
Given 4 downsampled grids, generate the luminance and chrominance tables.
The grids are the four Bayer channels (RGGB).
The LST format has changed with picamera2 and now uses a fixed resolution, and is in luminance, Cr, Cb format. This function returns three ndarrays of luminance, Cr, Cb, each with shape (12, 16).
Enable manual exposure, with low gain and shutter speed.
We set the exposure mode to manual, the analog and digital gain to 1, and the shutter speed to the minimum (8us for the Pi Camera v2).
Note that ISO is left at auto, because this is needed for the gains to be set correctly.
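A sketch of what this might look like with plain picamera2 controls (the module's own function may differ in detail; digital gain is managed by the pipeline rather than set directly):

    picamera.set_controls({
        "AeEnable": False,      # disable auto-exposure so manual values stick
        "AnalogueGain": 1.0,    # unity analogue gain
        "ExposureTime": 8,      # microseconds; roughly the minimum for the Camera v2
    })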
…(tuning: dict, col_corr_matrix: tuple[float, float, float, float, float, float, float, float, float]):
Update the rpi.ccm section of a camera tuning dict to use a static colour correction matrix.
tuning will be updated in-place to use the given matrix, disabling any adaptive adjustment by the algorithm.
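One plausible way to make such an in-place edit, assuming a version-2 tuning dict; the function name, the colour temperature used, and the exact layout of the rpi.ccm section are assumptions.

    def set_static_ccm_example(tuning: dict, col_corr_matrix: tuple) -> None:
        assert len(col_corr_matrix) == 9
        for algorithm in tuning["algorithms"]:
            if "rpi.ccm" in algorithm:
                # A single entry leaves the pipeline nothing to interpolate between,
                # so the same matrix is used at every colour temperature.
                algorithm["rpi.ccm"]["ccms"] = [{"ct": 4500, "ccm": list(col_corr_matrix)}]
                return
        raise KeyError("rpi.ccm section not found in tuning")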
Update the rpi.geq section of a camera tuning dict.
| Parameters | |
tuning: dict | The Raspberry Pi tuning file. This will be updated in-place to set the geq offset to the given value. |
offset: int | The desired green equalisation offset. The default, 65535, is the maximum allowed value, which keeps the brightness below the threshold at which averaging is used. We make this the default because we always need green equalisation to average the green pixels in the red and blue rows, due to the chief ray angle compensation issue that arises when the stock lens is replaced by an objective. |
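A short sketch of the corresponding in-place update, under the same assumed version-2 tuning layout as above:

    def set_geq_offset_example(tuning: dict, offset: int = 65535) -> None:
        for algorithm in tuning["algorithms"]:
            if "rpi.geq" in algorithm:
                algorithm["rpi.geq"]["offset"] = offset
                return
        raise KeyError("rpi.geq section not found in tuning")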
Update the rpi.alsc section of a camera tuning dict to use a static correction.
tuning will be updated in-place to set its shading to static, and disable any adaptive tweaking by the algorithm.
Evaluate current exposure settings using a raw image.
CAMERA SHOULD BE STARTED!
We will acquire a raw image and calculate the given percentile of the pixel values. We return a dictionary containing the percentile (which will be compared to the target), as well as the camera's shutter and gain values.
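A rough sketch of that measurement, assuming a started camera configured with an unpacked raw stream (the metadata keys used are standard libcamera/picamera2 names):

    import numpy as np

    def test_exposure_example(picamera, percentile: float = 99.9) -> dict:
        raw = picamera.capture_array("raw").view(np.uint16)  # unpack to 16-bit values
        metadata = picamera.capture_metadata()
        return {
            "percentile": float(np.percentile(raw, percentile)),
            "shutter": metadata["ExposureTime"],    # microseconds
            "gain": metadata["AnalogueGain"],
        }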