The postings in this thread give recommendations on using 3D cameras from Automation Technology (AT) with the software Common Vision Blox (CVB). First programming steps and commonly asked questions are covered. However, this thread does not cover the installation itself or specific features of the devices. For this kind of information please refer to the corresponding sensor manuals, application notes or the manufacturer's FAQ.
This is a closed thread. For programming questions regarding the AT cameras please start a new thread in the appropriate language category.
The explanations of the following posts are valid for all kinds of 3D cameras from Automation Technology. Currently, this includes the following camera series:
C2
C4
C5
C5-CS
The camera types (standalone 3D cameras and compact sensors) differ only in their hardware setup and configuration. The way to connect and control them with Common Vision Blox is identical.
Firmware Version
Please be aware, however, that different firmware versions exist for the different camera series and models. Hence, not all cameras might have all the mentioned parameters and algorithms available. If in doubt, please refer to the mentioned sensor manuals for a detailed listing of the available parameters, or check the Firmware Release Notes to see whether your camera runs the latest firmware.
All Automation Technology 3D cameras come with a free CVB CameraSuite license based on their internal MAC address. The license is valid as long as at least one AT camera fulfilling one of the following criteria is connected to the system:
- MAC address lies within one of the valid license ranges:
  - 00:50:C2:xx:xx:xx
  - 70:B3:D5:02:80:00 to 70:B3:D5:02:8F:FF
- Manufacturer Name/Device Vendor Name of the camera starts with the string "AT-Automation Technology GmbH"
- Device CVB License (camera register 0xCFFC or nodemap parameter "DeviceCVBLIC", GURU view) contains a value != 0
To verify that the CVB CameraSuite is successfully licensed on your system, please open the CVB License Manager and go to "Task 2 - Licensing" and the "Serial Numbers" panel. In case of a successful licensing you will find an entry "GigE Vision CameraSuite" with additional information.
For additional information regarding the licensing principles of CVB please refer to this application note.
Troubleshooting
If the AT camera is connected and the CameraSuite license should be valid based on the points above, but CVB applications still show a watermark (e.g. in the Device Configurator of the Management Console or in the GenICam Browser), the following reasons could cause this:
Check in the CVB License Manager whether the CVB Management Service and the CodeMeter Runtime Service are running. If not, start the services and set their Startup Type to Automatic so that they are always running.
Add a firewall exception for the CVMgmtSvc.
Normally, a prompt on startup of the Management Console gives the user the choice to add an exception to the firewall. It could be that the user canceled this prompt or a specific firewall setting prevents the prompt from popping up.
It is crucial to correctly set up a number of networking parameters when using the high-performance 3D cameras of Automation Technology (and GigE Vision cameras in general) in order to guarantee the expected performance in your software. These parameters must be set on both sides, network card and camera.
The required network efficiency parameters inside the camera are volatile and are lost when the camera is powered off. Hence, it is necessary to set these parameters after every camera reboot. This can be accomplished manually via the GenICam Grid (e.g. in the GenICam Browser), programmatically via the GenApi, or automatically by the CVB GenICam driver.
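For illustration, restoring the two most important values via the GenApi could look like the following CVB.Net sketch. GevSCPSPacketSize and GevSCPD are the standardized GigE Vision features for packet size and inter-packet delay; error handling is omitted:

// Minimal sketch (CVB.Net): re-apply the volatile network parameters after a reboot
using (Device camera = DeviceFactory.Open("GenICam.vin"))
{
    var nodeMap = camera.NodeMaps[NodeMapNames.Device];
    var packetSize = nodeMap["GevSCPSPacketSize"] as IntegerNode;
    packetSize.Value = 8192;        // NIC and switch must be set to jumbo frames as well
    var interPacketDelay = nodeMap["GevSCPD"] as IntegerNode;
    interPacketDelay.Value = 2000;  // in timestamp ticks; for AT cameras 1 tick = 10 ns
}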
The GenICam driver can be configured to propagate important camera TL settings to the device at driver loading time. It also has capabilities to determine these parameters automatically by querying the device. We recommend, however, setting them statically in case of non-default configurations. Additionally, the driver has some more parameters which don't affect the camera operation.
To modify the respective values, open the file GenICam.ini, which is to be found in the Drivers directory of your CVB installation (%CVB%\Drivers).
When using an AT 3D camera, we recommend setting the parameters as listed below:

| Parameter | Value | Description |
|---|---|---|
| Color Mode | Raw | The raw color mode ensures that all information is transmitted from the camera and no auto-mapping to 8 bit is done on the images. Mandatory if you want to work on the original 16 bit range maps. |
| Packet Size | 8192 | Using jumbo frames reduces overhead and increases the data rate. Notice: the network adapter (and switch) must be set to jumbo packets as well! |
| Interpacket Delay | 2000 | In case of corrupted frames, increase the inter-packet delay; lowering the value might be necessary when the data rate is insufficient. The inter-packet delay is usually specified in GenICam timestamp counter ticks, which commonly differ between camera models. For AT cameras one tick is 10 ns. |
All 3D cameras from Automation Technology follow the GenICam standard, which means that image acquisition with the AT cameras in CVB is no different than with any other GenICam device.
C-style API
Within the installation directory of CVB you can find a variety of example tutorials for the different programming languages that demonstrate the main concepts of CVB, including how to:
run multiple cameras in parallel (e.g. Image Manager → CSharp → MultiCam)
set parameters of the camera (e.g. Hardware → GenICam → GenICamExample)
Please take a look at the corresponding section in the CVB User Guide.
Object Oriented APIs
New users are encouraged to use the object oriented APIs of CVB. Getting Started Guides include basic image acquisition concepts and can be found for CVB.Net and CVBpy in this forum. Tutorials for CVB++ can be found within the directory of current CVB installations.
However, please be aware that not all functionality of the C-API is covered by the new APIs yet! If you encounter missing tools, you may have to combine the object oriented wrappers with CVB's classic C-API.
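For orientation, a minimal acquisition loop with the object oriented API could look like this CVB.Net sketch (error handling omitted; complete examples are in the Getting Started Guides mentioned above):

// Minimal CVB.Net acquisition sketch for a GenICam device such as the AT cameras
using (Device camera = DeviceFactory.Open("GenICam.vin"))
{
    var stream = camera.Stream;
    stream.Start();
    for (int i = 0; i < 10; i++)
    {
        using (StreamImage image = stream.Wait())
            Console.WriteLine($"Frame {i}: {image.Width} x {image.Height}");
    }
    stream.Stop();
}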
With AT's 3D cameras, several kinds of image data can be obtained and transmitted to the host: images containing range information, intensity information or laser line thickness information. All output channels can be selected individually and in combination. Each enabled data channel (DC) is written to a separate image row, resulting in multi-information images which must be split on the host side afterwards if multiple DCs are enabled.
For detailed information regarding the data channel assignment DC0-DC2 please take a look into the manufacturer's sensor manual.
The following two functions demonstrate one way (C#) to split an image consisting of multiple DCs into its sub-images, depending on the bit depth.
Split 8Bit Image
private static bool Split8BitImage(Cvb.Image.IMG cameraImg, ref Cvb.Image.IMG[] singleImages)
{
    // Define variables
    int numSingleImages = singleImages.GetLength(0);
    IntPtr baseIn;
    int xIncIn, yIncIn;
    int imageWidth = Cvb.Image.ImageWidth(cameraImg);
    int imageHeight = Cvb.Image.ImageHeight(cameraImg);
    // The image height might not be a multiple of the requested number of images:
    // trim it to the largest processable multiple
    if (imageHeight % numSingleImages != 0)
        imageHeight = (imageHeight / numSingleImages) * numSingleImages;
    // Get linear access to the base image
    Cvb.Utilities.GetLinearAccess(cameraImg, 0, out baseIn, out xIncIn, out yIncIn);
    // Init a pointer array for every sub-image
    IntPtr[] baseOut = new IntPtr[numSingleImages];
    int xIncOut = 0, yIncOut = 0;
    // Create the sub-images and get linear access to them
    for (int i = 0; i < numSingleImages; i++)
    {
        // 8 bit images
        Cvb.Image.CreateGenericImageDT(1, imageWidth, imageHeight / numSingleImages, 8, out singleImages[i]);
        Cvb.Utilities.GetLinearAccess(singleImages[i], 0, out baseOut[i], out xIncOut, out yIncOut);
    }
    // Split the interleaved rows into the sub-images
    unsafe
    {
        for (int y = 0; y < imageHeight; y++)
        {
            for (int x = 0; x < imageWidth; x++)
            {
                // Source pixel address (byte pointers keep the arithmetic 64 bit safe)
                byte* grayval = (byte*)baseIn + x * xIncIn + y * yIncIn;
                // Target pixel address in the sub-image of the current DC
                byte* pixel = (byte*)baseOut[y % numSingleImages] + x * xIncOut + (y / numSingleImages) * yIncOut;
                // Copy the gray value
                *pixel = *grayval;
            }
        }
    }
    return true;
}
Split 16Bit Image
private static bool Split16BitImage(Cvb.Image.IMG cameraImg, ref Cvb.Image.IMG[] singleImages)
{
    // Define variables
    int numSingleImages = singleImages.GetLength(0);
    IntPtr baseIn;
    int xIncIn, yIncIn;
    int imageWidth = Cvb.Image.ImageWidth(cameraImg);
    int imageHeight = Cvb.Image.ImageHeight(cameraImg);
    // The image height might not be a multiple of the requested number of images:
    // trim it to the largest processable multiple
    if (imageHeight % numSingleImages != 0)
        imageHeight = (imageHeight / numSingleImages) * numSingleImages;
    // Get linear access to the base image
    Cvb.Utilities.GetLinearAccess(cameraImg, 0, out baseIn, out xIncIn, out yIncIn);
    // Init a pointer array for every sub-image
    IntPtr[] baseOut = new IntPtr[numSingleImages];
    int xIncOut = 0, yIncOut = 0;
    // Create the sub-images and get linear access to them
    for (int i = 0; i < numSingleImages; i++)
    {
        // 16 bit images
        Cvb.Image.CreateGenericImageDT(1, imageWidth, imageHeight / numSingleImages, 16, out singleImages[i]);
        Cvb.Utilities.GetLinearAccess(singleImages[i], 0, out baseOut[i], out xIncOut, out yIncOut);
    }
    // Split the interleaved rows into the sub-images
    unsafe
    {
        for (int y = 0; y < imageHeight; y++)
        {
            for (int x = 0; x < imageWidth; x++)
            {
                // grayval points at the 16 bit source pixel (the increments are in
                // bytes, therefore the offsets are added on a byte pointer)
                ushort* grayval = (ushort*)((byte*)baseIn + x * xIncIn + y * yIncIn);
                // pixel points at the target position in the sub-image of the current DC
                ushort* pixel = (ushort*)((byte*)baseOut[y % numSingleImages] + x * xIncOut + (y / numSingleImages) * yIncOut);
                // Copy the gray value
                *pixel = *grayval;
            }
        }
    }
    return true;
}
Please note that these functions require prior knowledge of the selected number of DCs. This information can be determined by a GenICam query on the values of the specific features (EnableDC0-EnableDC2).
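As an illustration, the number of enabled DCs could be determined like this in CVB.Net (a sketch which assumes that all three features EnableDC0-EnableDC2 exist on the device; this may differ per model and firmware):

// Sketch: count the enabled data channels via the GenApi before splitting
var nodeMap = camera.NodeMaps[NodeMapNames.Device];
int numDCs = 0;
foreach (var feature in new[] { "EnableDC0", "EnableDC1", "EnableDC2" })
{
    // Assumes the feature exists on this model; otherwise guard this access
    if (nodeMap[feature] is BooleanNode dc && dc.Value)
        numDCs++;
}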
CVB offers functionality to monitor connected GenICam devices in real time. This allows notifications as soon as cameras get disconnected from or reconnected to the host. It can be very useful in automated processes if the communication with a device is temporarily lost, e.g. due to a power-off, and needs to be re-established automatically.
Connection monitoring in CVB is realized via the INotify interface of the driver with its DEVICE_DISCONNECTED and DEVICE_RECONNECT events. Sample code for the C-API (C++, C#) can be found in the corresponding section of the CVB User Guide.
GigE Vision events are typically used to synchronize the host application with events happening in the device. A typical use case in machine vision applications is a host that waits to be notified in real time of the sensor's exposure end in order to move the inspected part on a conveyor belt.
This post describes how to register a callback function to monitor an event of an AT 3D camera with CVB.
For AT cameras there are a number of different events available:

| Event Name | Event ID | Description |
|---|---|---|
| AcquisitionStart | 36882 | Frame acquisition is started |
| AcquisitionEnd | 36883 | Frame acquisition is terminated |
| TransferStart | 36884 | Frame transfer from the camera is started |
| TransferEnd | 36885 | Frame transfer is terminated |
| AOITrackingOn | 36886 | The AOI tracking process is started and the laser line image is valid for AOI alignment |
| AOITrackingOff | 36887 | The AOI tracking process is stopped and the AOI position is not updated anymore |
| AOISearchFailed | 36888 | AOI search failed to detect the laser line |
| AutoStarted | 36889 | Frame acquisition is initiated through AutoStart |
The number of events might increase for newer firmware versions. Please refer to the current sensor manual for all supported events.
Implementation
As @parsd already pointed out in this posting, there are two ways to register events with CVB: the preferred way via the GenApi with NRegisterUpdate(), or via the INotify interface. Which one to take depends on whether the camera supports event handling via the GenApi.
In general, the 3D cameras of AT support the standardized way of event handling. However, this requires a current firmware version running on the sensor that includes timestamps for each event. Please refer to the Firmware Release Notes to check if your camera supports the "EventNotification via GenICam node access".
For the object oriented CVB-APIs there is another (and even easier!) example to register an Event Callback (CVB++) to a node. It can be found in this post.
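For a rough orientation, registering to a GenICam event via node updates could look like the following CVB.Net sketch. EventSelector and EventNotification are the standardized SFNC features; the name of the event data node (here EventAcquisitionStart) is an assumption and must be checked against the sensor manual:

// Hedged sketch (CVB.Net): enable event notification for AcquisitionStart and
// react on updates of the corresponding event node
var nodeMap = device.NodeMaps[NodeMapNames.Device];
(nodeMap["EventSelector"] as EnumerationNode).Value = "AcquisitionStart";
(nodeMap["EventNotification"] as EnumerationNode).Value = "On";
var eventNode = nodeMap["EventAcquisitionStart"]; // assumed SFNC-style node name
eventNode.Updated += (sender, args) =>
    Console.WriteLine("AcquisitionStart event received");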
When working with AT cameras it is important to understand the influence of the camera's subpixel parameter on the measurement precision and its calibration. Changing the number of subpixels can result in wrong metric values if the calibration file isn't changed accordingly.
How to choose a matching number of subpixels
Range maps from AT cameras are transmitted with a fixed bit depth (8 bit or 16 bit). In most applications 16 bit is the recommended data format; however, the choice of format is a trade-off between height resolution and data rate.
The bit depth of a range map limits the amount of information that can be stored for each pixel (height), where each height is described by the number of the row in the sensor and its subpixel position. The number of bits needed to unambiguously describe the sensor row as an integer value depends on the height of the AOI set in the camera (see table below).
Once the AOI height is set, it defines the number of bits per pixel left for the subpixel information.
The parameter NumSubPixels has a valid range between 0 and 6 bits. This value describes the maximum possible resolution for the calculation of the laser line positions in the COG and FIR-Peak modes. More precisely, the resolution of those 3D modes is 1/(2^n) pixels, where n is the number of subpixel bits.
| max. AOI height | max. subpixels (8 bit) | resolution (8 bit) | max. subpixels (16 bit) | resolution (16 bit) |
|---|---|---|---|---|
| 3 | 6 | 0.015625 | 6 | 0.015625 |
| 7 | 5 | 0.03125 | 6 | 0.015625 |
| 15 | 4 | 0.0625 | 6 | 0.015625 |
| 31 | 3 | 0.125 | 6 | 0.015625 |
| 63 | 2 | 0.25 | 6 | 0.015625 |
| 127 | 1 | 0.5 | 6 | 0.015625 |
| 255 | 0 | 1 | 6 | 0.015625 |
| 511 | 0 | bit overflow | 6 | 0.015625 |
| 1023 | 0 | bit overflow | 6 | 0.015625 |
| 2047 | 0 | bit overflow | 5 | 0.03125 |
| 4095 | 0 | bit overflow | 4 | 0.0625 |
Conclusion: the smaller you can set your AOI height, the more subpixels you can use and therefore the more precisely you can theoretically calculate the heights. Using more rows than allowed will lead to bit overflow and ambiguity. Note that the sum of the camera's AOIHeight + AOIOffsetY must not exceed the maximum AOI height.
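The relationship behind the table can also be computed directly. This small C# sketch yields the maximum usable number of subpixel bits for a given AOI height and range map bit depth (NumSubPixels is additionally capped at 6 by the camera):

// Sketch: maximum usable subpixel bits for a given AOI height and bit depth
private static int MaxSubpixelBits(int aoiHeight, int bitDepth)
{
    // Number of bits needed to describe the sensor row unambiguously as an integer
    int integerBits = 0;
    for (int rows = aoiHeight; rows > 0; rows >>= 1)
        integerBits++;
    return Math.Min(6, Math.Max(0, bitDepth - integerBits));
}
// Examples matching the table: MaxSubpixelBits(255, 8) == 0, MaxSubpixelBits(4095, 16) == 4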
Calibration File
The knowledge of the number of used subpixels is required when applying a set of calibration parameters to a range map. For this reason it is included as a parameter in the calibration file. Hence, it is important that the number of used subpixels set in the camera matches the number of subpixels set in the calibration file!
This parameter can be changed in the different data formats of the calibration files as follows:
It is important to understand that all 3D sensors from AT output uncalibrated 2D range maps with no relation to real-world units. However, it is possible to easily get calibrated metric 3D points by applying a set of calibration parameters to such a range map.
When using an AT Compact Sensor (CS), these calibration parameters are calculated beforehand by the manufacturer and provided together with the camera. They are stored in the sensor's memory and can be downloaded into a calibration file (.dat, .xml) with the manufacturer's cxExplorer software (Device → Load/Save Calibration Metric…).
Alternatively, it is possible to download the parameters in the .dat format programmatically from the camera using CVB:
Load Calibration File from AT camera (cvb.NET)
// load driver
Device camera = DeviceFactory.Open("GenICam.vin");
NodeMap cameraNodeMap = camera.NodeMaps[NodeMapNames.Device];
// get calibration file from camera
var files = cameraNodeMap.GetAvailableFiles();
if (files.Contains("UserData"))
cameraNodeMap.DownloadFile("CalibrationFile.dat", "UserData");
When using a modular AT camera setup, the user has to perform a calibration manually in order to obtain these parameters. In this case the CVB Metric Tool (part of the Foundation package) can be used. Please refer to the specific calibration post or the CVB documentation for further information.
Reconstructing Point Cloud
With the acquired rangemap and the matching calibration parameters it is possible to reconstruct the metrically correct 3D points. This can be done either with one of the existing GUI tutorials from CVB (e.g. VBCore3D) or programmatically in CVB:
Sample Code: Classic API (C#)
// load calibration file
Cvb.SharedCalibrator calibrator = null;
if (Cvb.Core3D.LoadCalibrator(fileName, out calibrator) < 0)
MessageBox.Show("Error loading calibration file");
else
{
// reconstruct point cloud
Cvb.SharedComposite pointCloud = null;
Cvb.Core3D.CreateCalibratedPointCloudFromRangeMap(rangeMap, calibrator, out pointCloud);
}
For better troubleshooting during the development of an application or while running live, it is recommended to add a few additional lines to your code.
Acquisition Health
When streaming with a GenICam compliant camera you can query statistics via the transport layer of Stemmer Imaging to monitor your acquisition health. This also applies to the AT GigE cameras.
With these statistics a user can easily detect when e.g. packets or images got lost during transmission. In a couple of posts @parsd described how to use them in CVB.Net: https://forum.commonvisionblox.com/t/getting-started-with-cvb-net/246/13?u=moli
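Sketched for CVB.Net, such a check could look like the lines below; treat the StreamInfo member names as an assumption and compare them with the linked post and the Stemmer.Cvb documentation:

// Sketch: query transfer statistics of a running stream (CVB.Net)
var stream = camera.Stream;
Console.WriteLine($"Buffers delivered: {stream.Statistics[StreamInfo.NumBuffersDelivered]}");
Console.WriteLine($"Buffers lost:      {stream.Statistics[StreamInfo.NumBuffersLost]}");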
Poll Specific Camera Parameters
Another way to monitor your device is to continuously check health data by polling the camera's parameters.
The manufacturer recommends that the temperature does not exceed 65 °C while measuring. Furthermore, keep in mind that the dark current and noise performance of CMOS sensors degrade at higher temperatures.
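A polling sketch in CVB.Net, assuming the device exposes the standard DeviceTemperature feature (the exact node name must be verified in the sensor manual):

// Sketch: poll the sensor temperature once per second
var nodeMap = device.NodeMaps[NodeMapNames.Device];
var temperature = nodeMap["DeviceTemperature"] as FloatNode;  // assumed SFNC name
while (true)
{
    Console.WriteLine($"Sensor temperature: {temperature.Value:F1} °C");
    System.Threading.Thread.Sleep(1000);
}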
The AT sensors feature a Chunk Data mode for providing additional information with the acquired image data. The chunk data generated by the camera has the following format:
ChunkImage
1…N x ChunkAcqInfo
ChunkImageInfo
Depending on the camera mode (image or 3D), the ChunkData block ("ChunkAcqInfo") can be sent as follows:
In image mode, the camera can send only one ChunkAcqInfo block per image frame.
In 3D mode, the camera can send one ChunkAcqInfo block either per 3D frame ("OneChunkPerFrame") or per 3D profile ("OneChunkPerProfile")
The "ChunkImageInfo" is the last chunk data sent by the camera and contains the following data:
Number of valid rows in ChunkImage
Number of valid ChunkAcqInfo blocks
Flags identifying the current frame as "Start" or "Stop" and the buffer status in AutoStart mode
The ChunkAcqInfo block consists of 32 bytes in total, containing the following data:
64 bit timestamp
32 bit frame counter
32 bit trigger coordinate
8 bit Trigger status
32 bit I/O Status
72 bit AOI information
The data of timestamp, frame counter, trigger coordinate, trigger status and I/O status are assigned at the start of every image integration.
When ChunkMode is disabled, the camera uses the "regular" GEV image protocol, in which the optional transfer of frames with variable height and payload is supported.
When ChunkMode is enabled, the camera sends the full payload even if the ChunkImage or ChunkAcqInfo blocks contain only partially valid data. The number of valid ChunkImage rows and ChunkAcqInfo blocks can be read from ChunkImageInfo. For example, in Start/Stop mode with instant frame transmission, the camera stops the frame acquisition as soon as the stop trigger occurs and transfers the complete contents of the internal image buffer; using the ChunkImageInfo data block it is possible to detect how many image rows and ChunkAcqInfo blocks are valid in the payload buffer.
The tag of chunk data has big endian byte order, the data itself has little endian byte order. An endian converter for chunk data is not provided.
The following sample (CVB.Net) activates the chunk mode and reads the frame counter from the ChunkAcqInfo block:
using (var device = DeviceFactory.Open("GenICam.vin"))
{
var nodemap = device.NodeMaps[NodeMapNames.Device];
var chunkModeActive = nodemap["ChunkModeActive"] as BooleanNode;
chunkModeActive.Value = true;
var stream = device.Stream;
stream.Start();
try
{
stream.Wait();
var deviceImage = device.DeviceImage;
var chunks = GevChunkParser.Parse(deviceImage);
var info = DereferenceAcqInfoOn(deviceImage, chunks.First(chunk => chunk.ID == ATC5AcqInfoChunk.ID));
Console.WriteLine($"FrameCount: {info.FrameCount}");
}
finally
{
stream.Stop();
}
}
With this helper method and struct:
private static unsafe ATC5AcqInfoChunk DereferenceAcqInfoOn(DeviceImage image, GevChunk chunk)
{
Debug.Assert(chunk.ID == ATC5AcqInfoChunk.ID);
Debug.Assert(chunk.Length >= sizeof(ATC5AcqInfoChunk));
var chunkPtr = new IntPtr(image.GetBufferBasePtr().ToInt64() + chunk.Offset);
return *(ATC5AcqInfoChunk*)chunkPtr;
}
[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct ATC5AcqInfoChunk
{
public const uint ID = 0x66669999;
public uint TimeStampLow;
public uint TimeStampHigh;
public uint FrameCount;
public int TriggerCoordinate;
public byte TriggerStatus;
public ushort DAC;
public ushort ADC;
public byte IntIdX;
public byte AoiIdX;
public ushort AoiYs;
public ushort AoiDy;
public ushort AoiXs;
public ushort AoiThreshold;
public byte AOIAlgorithm;
}
For this to work you need these three new classes @parsd put on github:
Hi,
there is an easier way to split the image: using CreateImageMap() with start points at line 0, line 1 and line 2, we can easily get the DC0, DC1 and DC2 images, like below:
Extracting the information from TriggerStatus as a byte will result in a decimal value between 0 and 255.
From this, the information on the 6 status elements can already be read unambiguously. For easier interpretation it can help to convert the decimal number to a binary number.
In our example we receive the decimal byte value 193 from the TriggerStatus chunk value.
Decoded to binary, the value reads:
11000001
Each bit now describes the status of one of the 6 elements. The logic is the following:
Out1 = high, Out0 = high, In1 = low, In0 = low, -, -, EncoderStatus = off/back, TriggerOverrun = true.
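A small C# sketch decoding these bits (bit 7 = Out1 down to bit 0 = TriggerOverrun, as in the example above; verify the assignment against your sensor manual):

// Sketch: decode the TriggerStatus byte of a ChunkAcqInfo block
byte triggerStatus = 193;                            // 0b11000001
bool out1           = (triggerStatus & 0x80) != 0;   // bit 7
bool out0           = (triggerStatus & 0x40) != 0;   // bit 6
bool in1            = (triggerStatus & 0x20) != 0;   // bit 5
bool in0            = (triggerStatus & 0x10) != 0;   // bit 4
// bits 3 and 2 are unused
bool encoderForward = (triggerStatus & 0x02) != 0;   // bit 1: encoder status (off/back if 0)
bool triggerOverrun = (triggerStatus & 0x01) != 0;   // bit 0
Console.WriteLine($"Out1={out1}, Out0={out0}, In1={in1}, In0={in0}, " +
                  $"EncoderForward={encoderForward}, TriggerOverrun={triggerOverrun}");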