class StreamingPiCamera2(BaseCamera):
Constructor: StreamingPiCamera2(camera_num)
A Thing that provides an interface to the Raspberry Pi Camera.
Currently the Thing only supports the PiCamera v2 board. This needs generalisation.
| Method | __enter__ | Start streaming when the Thing context manager is opened. |
| Method | __exit__ | Close the picamera connection when the Thing context manager is closed. |
| Method | __init__ | Initialise the camera with the given camera number. |
| Method | analogue | The analogue gain applied by the camera sensor. |
| Method | analogue | Undocumented |
| Method | auto | Adjust exposure until the target white level is reached. |
| Method | calibrate | Take an image and use it for flat-field correction. |
| Method | calibrate | Correct the white balance of the image. |
| Method | capture | Acquire one image from the camera and return as an array. |
| Method | capture | Acquire one image from the camera. |
| Method | capture | Acquire one image from the camera as a JPEG. |
| Method | colour | Undocumented |
| Method | colour | The red and blue colour gains, must be between 0.0 and 32.0. |
| Method | colour | Undocumented |
| Method | correct | Correct white balance gains for the effect of lens shading. |
| Method | discard | Discard frames so that the next frame captured is fresh. |
| Method | exposure | The camera exposure time in microseconds. |
| Method | exposure | Undocumented |
| Method | flat | Disable flat-field correction. |
| Method | flat | Disable flat-field correction. |
| Method | full | Perform a full auto-calibration. |
| Method | get | Return the active tuning algorithm settings for the given algorithm. |
| Method | lens | Set the lens shading tables. |
| Method | reset | Overwrite the colour correction matrix in camera tuning with default values. |
| Method | reset | Revert to default lens shading settings. |
| Method | save | Override save_settings to ensure that camera properties don't recurse. |
| Method | sensor | Change the sensor mode used. |
| Method | set | Set the green equalisation to a static value. |
| Method | start | Start the MJPEG stream. This is where persistent controls are sent to the camera. |
| Method | stop | Stop the MJPEG stream. |
| Class Variable | mjpeg | Bitrate for MJPEG stream (None for default). |
| Class Variable | stream | Resolution to use for the MJPEG stream. |
| Instance Variable | camera | Undocumented |
| Instance Variable | camera | Undocumented |
| Instance Variable | colour | The colour_correction_matrix from the tuning file. |
| Instance Variable | default | Undocumented |
| Instance Variable | stream | Whether the MJPEG stream is active. |
| Instance Variable | tuning | The Raspberry PiCamera Tuning File JSON. |
| Property | camera | The "configuration" dictionary of the picamera2 object. |
| Property | capture | Return the metadata from the camera. |
| Property | lens | Whether the lens shading is static. |
| Property | lens | The current lens shading (i.e. flat-field correction). |
| Property | manual | The camera settings to expose as property controls in the settings panel. |
| Property | primary | The calibration actions for both calibration wizard and settings panel. |
| Property | secondary | The calibration actions that appear only in settings panel. |
| Property | sensor | The intended sensor mode of the camera. |
| Property | sensor | All the available modes the current sensor supports. |
| Property | sensor | The native resolution of the camera's sensor. |
| Property | streaming | True if the camera is streaming. |
| Method | _get | Undocumented |
| Method | _initialise | Acquire the picamera device and store it as self._picamera. |
| Method | _streaming | Lock access to picamera and return the underlying Picamera2 instance. |
| Class Variable | _sensor | Undocumented |
| Instance Variable | _analogue | Undocumented |
| Instance Variable | _colour | Undocumented |
| Instance Variable | _exposure | Undocumented |
| Instance Variable | _picamera | Undocumented |
| Instance Variable | _picamera | Undocumented |
| Instance Variable | _sensor | Undocumented |
| Instance Variable | _setting | Undocumented |
Inherited from BaseCamera:
| Method | background | The data for each background detector, used to save to disk. |
| Method | background | Set the data for each detector. Only to be used as settings are loaded from disk. |
| Method | capture | Capture an image and save it to disk. |
| Method | capture | Acquire one image from the camera, downsample, and return as an array. |
| Method | capture | Capture an image to memory. This can be saved later with save_from_memory. |
| Method | clear | Clear all images in memory. |
| Method | detector | The name of the active background selector. |
| Method | detector | Validate and set detector_name. |
| Method | grab | Acquire one image from the preview stream and return as an array. |
| Method | grab | Acquire one image from the preview stream and return as a blob of JPEG data. |
| Method | grab | Acquire one image from the preview stream and return its size. |
| Method | image | Label the current image as either background or sample. |
| Method | kill | Kill the streams now as the server is shutting down. |
| Method | save | Save an image that has been captured to memory. |
| Method | set | Grab an image, and use its statistics to set the background. |
| Method | settle | Sleep for the settling time, ready to provide a fresh frame. |
| Method | update | Update the settings of the current detector. |
| Class Variable | downsampled | The downsampling factor when calling capture_downsampled_array. |
| Class Variable | lores | Undocumented |
| Class Variable | mjpeg | Undocumented |
| Class Variable | settling | The settling time when calling the settle() method. |
| Instance Variable | background | Undocumented |
| Property | active | The active background detector instance. |
| Property | background | The status of the active detector for the UI. |
| Method | _robust | Capture an image in memory and return it with metadata. |
| Method | _save | Save the captured image and metadata to disk. |
| Class Variable | _memory | Undocumented |
| Instance Variable | _detector | Undocumented |
Start streaming when the Thing context manager is opened.
This opens the picamera connection, initialises the camera, sets the sensor_modes property, and then starts the streams.
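A minimal usage sketch based on the constructor and the context-manager protocol documented here:

    # Streaming starts when the context manager is entered and the picamera
    # connection is closed again on exit.
    with StreamingPiCamera2(camera_num=0) as cam:
        ...  # interact with the camera while the stream is running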
Initialise the camera with the given camera number.
This makes no connection to the camera (except to get the default tuning file).
| Parameters | |
| camera_num: int | The number of the camera. This should generally be left as 0 as most Raspberry Pi boards only support 1 camera. |
auto_expose_from_minimum(target: int = 700, percentile: float = 99.9)
Adjust exposure until the target white level is reached.
Starting from the minimum exposure, gradually increase exposure until the image reaches the specified white level.
| Parameters | |
| target: int | The target 10-bit white level. 10-bit data has a theoretical maximum of 1023, but with black level correction the true maximum is about 950. Default is 700 as this is approximately 70% saturated. |
| percentile: float | The percentile to use instead of the maximum. Default 99.9. When calculating the brightest pixel, a percentile is used rather than the maximum in order to be robust to a small number of noisy/bright pixels. |
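As an illustration of the approach described above (a sketch only, with hypothetical attribute names, not the class's actual implementation):

    import numpy as np

    def auto_expose_from_minimum_sketch(camera, target=700, percentile=99.9):
        # Start from the shortest exposure the sensor allows (hypothetical attributes).
        camera.exposure_time = camera.minimum_exposure_time
        while camera.exposure_time < camera.maximum_exposure_time:
            frame = camera.capture_raw_array()  # hypothetical 10-bit raw capture
            # Use a high percentile rather than the maximum, so a few hot
            # pixels do not terminate the ramp early.
            if np.percentile(frame, percentile) >= target:
                break
            camera.exposure_time *= 1.5  # increase gradually and re-measure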
Take an image and use it for flat-field correction.
This method requires an empty (i.e. bright) field of view. It will take a raw image and effectively divide every subsequent image by the current one. This uses the camera's "tuning" file to correct the preview and the processed images. It should not affect raw images.
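To make the flat-field idea concrete, here is a hedged sketch (illustrative names, not this class's internal code) of turning one raw colour channel from an empty field of view into a coarse gain table:

    import numpy as np

    def lens_shading_gains_sketch(raw_channel, table_shape=(12, 16)):
        # Downsample the channel to the lens-shading-table resolution by
        # block-averaging, then turn the brightness profile into per-cell gains.
        th, tw = table_shape
        h, w = raw_channel.shape
        ch, cw = h - h % th, w - w % tw  # crop so the blocks divide evenly
        blocks = raw_channel[:ch, :cw].reshape(th, ch // th, tw, cw // tw).mean(axis=(1, 3))
        return blocks.max() / blocks     # brightest cell gets gain 1.0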
calibrate_white_balance(method: Literal['percentile', 'centre'] = 'centre', luminance_power: float = 1.0)
Correct the white balance of the image.
This calibration requires a neutral image, such that the 99th centile of each colour channel should correspond to white. We calculate the centiles and use this to set the colour gains. This is done on the raw image with the lens shading correction applied, which should mean that the image is uniform, rather than weighted towards the centre.
If method is "centre", we will correct the mean of the central 10% of the image.
(stream_name: Literal['main', 'lores', 'raw', 'full'] = 'main', wait: float | None = 0.9) -> ArrayModel
Acquire one image from the camera and return as an array.
This function will produce a nested list containing an uncompressed RGB image. It's likely to be highly inefficient - raw and/or uncompressed captures using binary image formats will be added in due course.
| Parameters | |
| stream_name | (Optional) The PiCamera2 stream to use; should be one of ["main", "lores", "raw", "full"]. Default = "main". |
| wait | (Optional, float) Set a timeout in seconds. A TimeoutError is raised if this time is exceeded during capture. Default = 0.9 s, lower than the 1 s timeout default in the picamera YAML settings. |
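Because the return value is a nested list rather than a binary array, a client will usually convert it back to a numpy array. A sketch, assuming an open instance cam and that the array-capture method is named capture_array (the name is truncated in the table above):

    import numpy as np

    frame = np.asarray(cam.capture_array(stream_name="main", wait=0.9), dtype=np.uint8)
    print(frame.shape)  # e.g. (height, width, 3) for an RGB "main" stream image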
(stream_name: Literal['main', 'lores', 'raw'] = 'main', wait: float | None = 0.9) -> Image
Acquire one image from the camera.
Return it as a PIL Image.
| Parameters | |
| stream_name | (Optional) The PiCamera2 stream to use; should be one of ["main", "lores", "raw"]. Default = "main". |
| wait | (Optional, float) Set a timeout in seconds. A TimeoutError is raised if this time is exceeded during capture. Default = 0.9 s, lower than the 1 s timeout default in the picamera YAML settings. |
(...: lt.deps.GetThingStates, resolution: Literal['lores', 'main', 'full'] = 'main', wait: float | None = 0.9) -> JPEGBlob
Acquire one image from the camera as a JPEG.
The JPEG will be acquired using Picamera2.capture_file. If the resolution parameter is main or lores, it will be captured from the main preview stream, or the low-res preview stream, respectively. This means the camera won't be reconfigured, and the stream will not pause (though it may miss one frame).
If full resolution is requested, we will briefly pause the MJPEG stream and reconfigure the camera to capture a full resolution image.
| Parameters | |
| wait | (Optional, float) Set a timeout in seconds. A TimeoutError is raised if this time is exceeded during capture. Default = 0.9 s, lower than the 1 s timeout default in the picamera YAML settings. |
Note that this always uses the image processing pipeline - to bypass this, you must use a raw capture.
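A sketch of how a client might request a full-resolution JPEG and write it to disk; the method name, the injected GetThingStates dependency, and the blob API are assumptions rather than details confirmed on this page:

    # Hypothetical usage with an open camera instance `cam`.
    blob = cam.capture_jpeg(resolution="full", wait=2.0)  # allow extra time for reconfiguration
    with open("snapshot.jpg", "wb") as f:
        f.write(blob.content)  # assumes the JPEGBlob exposes its bytes as .content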
(...: tuple[float, float]) -> tuple[float, float]
Correct white balance gains for the effect of lens shading.
The white balance algorithm we use assumes the brightest pixels should be white, and that the only thing affecting the colour of said pixels is the colour_gains.
The lens shading correction is normalised such that the minimum gain in the Cr and Cb channels is 1. The white balance assumption above requires that the gain for the brightest pixels is 1. The solution might be that, when calibrating, we note which pixels are brightest (usually the centre) and explicitly use the LST values at that location. However, for now we assume that we need to normalise by the maximum of the Cr and Cb channels, which is correct the majority of the time.
The camera exposure time in microseconds.
When setting this property the camera will adjust the set value to the nearest allowed value that is lower than the current setting.
Disable flat-field correction.
This method will set a completely flat lens shading table. It is not the same as the default behaviour, which is to use an adaptive lens shading table.
This flat table is used to take an image with no lens shading so that the correct lens shading table can be calibrated.
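For reference, a completely flat table applies unity gain in every cell, so images pass through unchanged; using the 16x12 table dimensions described later on this page, a minimal sketch is:

    # One 12x16 table of unity gains per corrected channel (illustrative layout).
    flat_table = [[1.0] * 16 for _ in range(12)]
    luminance = cr = cb = flat_table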
Disable flat-field correction.
This method will set the chrominance of the lens shading table to be flat, i.e. we'll correct vignetting of intensity, but not any change in colour across the image.
Perform a full auto-calibration.
This function will call the other calibration actions in sequence (see the sketch after this list):
- flat_lens_shading to disable flat-field
- auto_expose_from_minimum
- set_static_green_equalisation to set geq offset to max
- calibrate_lens_shading
- reset_ccm
- calibrate_white_balance
- set_background
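Equivalently, a client could run the individual steps by hand. The method names below are those listed above; the arguments are illustrative defaults only:

    cam.flat_lens_shading()                     # start from a flat field correction
    cam.auto_expose_from_minimum()
    cam.set_static_green_equalisation(65535)    # maximum green equalisation offset
    cam.calibrate_lens_shading()
    cam.reset_ccm()
    cam.calibrate_white_balance()
    cam.set_background()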
(...: str, raise_if_missing: Literal[True]) -> dict
(...: str, raise_if_missing: bool) -> dict | None
Return the active tuning algorithm settings for the given algorithm.
| Returns | |
dict | None | The algorithm dictionary if found; None if no tuning data is loaded or if the tuning algorithm is not found. |
| Raises | |
MissingCalibrationError | If raise_if_missing is true and no tuning file is available, or the requested algorithm is not present. |
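For context, Raspberry Pi tuning files store per-algorithm settings under keys such as "rpi.alsc" or "rpi.geq". A hedged sketch of looking one up (this is not necessarily how the method above is implemented, and the tuning attribute name is assumed):

    def get_algorithm_sketch(tuning: dict, name: str):
        # Newer tuning files hold a list of single-key dicts under "algorithms";
        # older files use top-level keys. Handle both layouts.
        for algo in tuning.get("algorithms", []):
            if name in algo:
                return algo[name]
        return tuning.get(name)

    alsc = get_algorithm_sketch(cam.tuning, "rpi.alsc")  # lens shading settings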
Overwrite the colour correction matrix in camera tuning with default values.
These values are from the Raspberry Pi Camera Algorithm and Tuning Guide, page 45.
Revert to default lens shading settings.
This method will restore the default "adaptive" lens shading method used by the Raspberry Pi camera.
Override save_settings to ensure that camera properties don't recurse.
This method is run by any Thing when a ThingSetting is saved. However, the method reads the thing_setting. As reading the thing setting talks to the camera and calls save_settings if the value is not as expected, this could cause recursion. Also this means that saving one setting causes all others to be read each time.
Set the green equalisation to a static value.
Green equalisation avoids the debayering algorithm becoming confused by the two green channels having different values. This is a problem when the chief ray angle isn't what the sensor was designed for, as is the case in e.g. a microscope using Camera Module v2.
A value of 0 here does nothing, a value of 65535 is maximum correction.
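In the tuning file this corresponds to the "rpi.geq" algorithm. A sketch of what a static setting might look like (key names follow the Raspberry Pi tuning guide; check your tuning file before relying on them):

    # Illustrative only: a static green-equalisation entry for the "rpi.geq" block.
    static_geq = {"offset": 65535, "slope": 0.0}  # maximum offset, no signal-dependent term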
(main_resolution: tuple[int, int] = (820, 616), buffer_count: int = 6)
Start the MJPEG stream. This is where persistent controls are sent to camera.
Sets the camera resolutions based on input parameters, and sets the low-res resolution to (320, 240). Note: (320, 240) is a standard from the Pi Camera manual.
Create two streams:
- lores_mjpeg_stream for autofocus at low-res resolution
- mjpeg_stream for preview. This is the main_resolution if this is less than (1280, 960), or the low-res resolution if above. This allows for high resolution capture without streaming high resolution video.
| Parameters | |
| main_resolution | The resolution for the main configuration. Defaults to (820, 616), 1/4 sensor size. |
| buffer_count | The number of frames to hold in the buffer. Higher uses more memory, lower may cause dropped frames. Value must be between 1 and 8. Defaults to 6. |
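For context, the underlying Picamera2 configuration described above would look roughly like the following, written with the standard picamera2 API rather than this class's internal code:

    from picamera2 import Picamera2

    picam2 = Picamera2()
    config = picam2.create_video_configuration(
        main={"size": (820, 616)},   # default main resolution, 1/4 sensor size
        lores={"size": (320, 240)},  # low-res stream used for autofocus
        buffer_count=6,
    )
    picam2.configure(config)
    picam2.start()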
tuple[float, float, float, float, float, float, float, float, float]
The colour_correction_matrix from the tuning file.
This is broken out into its own property for convenience and compatibility with the micromanager API
It is a 9-value tuple used to specify the 3x3 matrix that the GPU pipeline uses to convert from the camera R,G,B vector to the standard R,G,B.
See the Raspberry Pi Camera Algorithm and Tuning Guide, page 45.
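As an illustration, an identity matrix (no colour correction) is written as a flat 9-tuple in row-major order; applying it through picamera2's control interface might look like this (a sketch only; this class may expose it differently):

    from picamera2 import Picamera2

    picam2 = Picamera2()
    picam2.start()
    identity_ccm = (1.0, 0.0, 0.0,
                    0.0, 1.0, 0.0,
                    0.0, 0.0, 1.0)  # row-major 3x3 identity: no colour correction
    picam2.set_controls({"ColourCorrectionMatrix": identity_ccm})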
The "configuration" dictionary of the picamera2 object.
The "configuration" sets the resolution and format of the camera's streams. Together with the "tuning" it determines how the sensor is configured and how the data is processed.
Note that the configuration may be modified when taking still images, and this property refers to whatever configuration is currently in force - usually the one used for the preview stream.
Whether the lens shading is static.
This property is true if the lens shading correction has been set to use a static table (i.e. the number of automatic correction iterations is zero). The default LST is not static, but all the calibration controls will set it to be static (except "reset").
The current lens shading (i.e. flat-field correction).
Return the current lens shading correction, as three 2D lists each with dimensions 16x12, if a static lens shading table is in use.
Return None if:
- adaptive control is enabled
- multiple LSTs are in use (for different colour temperatures)
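A hedged sketch of reading the value back (the property name is truncated to "lens" in the table above, so the name below is hypothetical):

    lst = cam.lens_shading  # hypothetical property name
    if lst is not None:
        luminance, cr, cb = lst      # three 2D lists, 12 rows of 16 values each
        centre_gain = cr[6][8]       # inspect a chrominance gain near the centre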
Acquire the picamera device and store it as self._picamera.
This duplicates logic in Picamera2.__init__ to provide a tuning file that will be read when the camera system initialises.
(...) -> Iterator[Picamera2]
Lock access to picamera and return the underlying Picamera2 instance.
Optionally the stream can be paused to allow updating the camera settings.
| Parameters | |
| pause | If False, the Picamera2 instance is simply yielded. If True, the stream is paused so that camera settings can be updated, as described above. |
| Returns | |
| Iterator[Picamera2] | Undocumented |
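A sketch of how such a locking context manager is typically used; the method's full name is truncated on this page, so the name below is hypothetical:

    # Pause the MJPEG stream while changing camera controls, then resume.
    with cam._streaming_lock(pause=True) as picam2:
        picam2.set_controls({"AnalogueGain": 2.0})  # AnalogueGain is a standard libcamera control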