There are metadata controls in the properties of a Dalsa Nano-M2590, but it’s not clear to me whether they are sent in the image itself, alongside the image in the RingBuffer, or whether one has to query the camera separately. Is this something that is exposed via the API?
(I’m looking into this because we’ve had difficulty in the past correlating images with triggers (positions), especially as we’re doing HDR by cycling exposures, so it’s important to group the right images together.)
The metadata of the Nano GigE series are included in the image as “GenICam Chunk Data”. These data do not overwrite the image data (as we discussed for the Linea camera); they are sent with each image, and you are able to access them via software.
(device.NodeMaps[NodeMapNames.Device]["Std::ChunkDataControl"] as CategoryNode)
But strangely, all of the sub-nodes are “<n/a>”, even though the values are visible in the GenICam Browser. So I’m pretty close, but I think I’m still missing something. (I did set nodeMap.Set("Std::ChunkModeActive", true); first.)
For the AT cameras at least, converting the pixel format (so effectively using anything but raw) made the chunk data go to 0. I’d be very surprised if that only held true for AT cameras.
Converting the pixel types seems to affect the chunk data appended to them as well.
Well, it’s definitely not all zero, but I’m having difficulty working out what it is! I’ve managed to parse out the tag, and it says that the length is 136 bytes, which divided by 17 (the number, I think, of properties I see in the GenICam Browser) gives exactly 8 bytes each. But currently, if I map a struct of 17 longs/doubles onto that, I get weird numbers (and I’m guessing the order, too).
Now I’m going to try capturing several images in a row - where only the timestamp, cycling preset and exposure time should change - and see which bytes move. I didn’t think I’d have to get this down and dirty.
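Diffing consecutive payloads can be done with a few lines of plain C# (a sketch; ChunkDiff and ChangedOffsets are my own names, not part of any SDK):

```csharp
using System;
using System.Collections.Generic;

static class ChunkDiff
{
    // Returns the byte offsets at which two raw chunk payloads differ.
    // Offsets clustering in 8-byte groups hint at which 64-bit field moved.
    public static List<int> ChangedOffsets(byte[] a, byte[] b)
    {
        var changed = new List<int>();
        int n = Math.Min(a.Length, b.Length);
        for (int i = 0; i < n; i++)
            if (a[i] != b[i])
                changed.Add(i);
        return changed;
    }
}
```

Feed it the 136-byte chunk payload of two frames where only the timestamp, cycling preset and exposure time change, and the returned offsets should fall inside exactly those fields.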
OK, a few pointers for any other poor mortal who bites off more than they can chew.
Do I have to use DeviceImage, or can I use StreamImage?
This question relates to the code here, which uses DeviceImage. Given that the only ‘added value’ of DeviceImage is that it contains a reference to Device, I refactored the three classes from GitHub, which means that the signature of GetChunk becomes GetChunk(Image image, Device device).
Do I really have to set PixelFormat to Raw to get the metadata?
It doesn’t look like it.
Could someone point me towards documents which could at least give me the memory layout, offsets etc.?
I didn’t find any documents, but I’ve managed to get this far (as it’s all I need):
[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct NanoInfoChunk
{
public const uint ID = 0xCD000001;
public long Dummy;
public long ChunkExposureTime;
public long ChunkCyclingPresetCurrentActiveSet;
}
If you want to work on the rest, you can add the three middle lines to DereferenceNanoInfoOn (still inspired by the code posted on the other topic):
private static unsafe NanoInfoChunk DereferenceNanoInfoOn(Image image, GevChunk chunk)
{
var chunkPtr = new IntPtr(image.GetBufferBasePtr().ToInt64() + chunk.Offset);
byte[] bytes = new byte[chunk.Length];
Marshal.Copy(chunkPtr, bytes, 0, (int)chunk.Length);
System.Diagnostics.Debug.WriteLine(BitConverter.ToString(bytes));
return *(NanoInfoChunk*)chunkPtr;
}
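For anyone who wants to avoid the unsafe pointer cast, the same three leading fields can be read with BitConverter from the copied byte array (a sketch; NanoChunkParser is my own name, and it assumes the little-endian, 8-byte field layout of the struct above):

```csharp
using System;

static class NanoChunkParser
{
    // Reads the first three 8-byte fields of the raw Nano chunk payload,
    // mirroring NanoInfoChunk: Dummy, ChunkExposureTime,
    // ChunkCyclingPresetCurrentActiveSet.
    public static (long Dummy, long ExposureTime, long CyclingPreset) ParseHead(byte[] bytes)
    {
        if (bytes.Length < 24)
            throw new ArgumentException("Chunk payload too short", nameof(bytes));
        return (BitConverter.ToInt64(bytes, 0),
                BitConverter.ToInt64(bytes, 8),
                BitConverter.ToInt64(bytes, 16));
    }
}
```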
- The AT cameras don’t use GenApi chunks, so you need to parse their chunk data manually.
- GenApi chunks are supported even when converting pixel formats.
- If GenApi chunk nodes are available, use these instead of parsing the raw data.
If, as with the Nano, you want to use GenApi chunks, you need to activate them manually, sometimes even per value: first set the selector (the parent node in the grid), then enable it.
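In code, that selector-then-enable sequence might look like this (a sketch based on the nodeMap.Set calls used earlier in this thread; "ExposureTime" is an example selector value, and the exact chunk names depend on your camera's XML):

```csharp
// Assumes 'device' is an open CVB device, as in the snippets above.
var nodeMap = device.NodeMaps[NodeMapNames.Device];

// Enable chunk mode globally.
nodeMap.Set("Std::ChunkModeActive", true);

// Per value: first set the selector (the parent node in the grid)...
nodeMap.Set("Std::ChunkSelector", "ExposureTime");
// ...then enable that chunk.
nodeMap.Set("Std::ChunkEnable", true);
```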
Then, after calling .Wait, the data is updated.
Regarding the last question: the DeviceImage is always the last acquired image. It is nice to have for display purposes, but not for processing, as it is also updated with the .Wait call.
I am struggling with reading out the chunk data. I do select the values before trying to read the specific node, e.g. ExposureTime. In the GenICam Browser the values are shown, even though the fields are gray. With the API the nodes are not readable. Did I miss something, or do I have to parse the payload manually?
Did you activate chunk parsing for the GenICam.vin? This can be done either in the camera section of the GenICam.ini in %CVBDATA%\Drivers:
AttachChunk = 1
or via discovery:
var foundDevices = DeviceFactory.Discover();
var firstDeviceInfo = foundDevices[0]; // we simply assume we found at least one camera
firstDeviceInfo.SetParameter("AttachChunk", "1");
using (var device = DeviceFactory.Open(firstDeviceInfo))
{
// ...
}
This is off by default as parsing this costs performance.
Chunk nodes are read-only (this is why they are gray in the GenApi grid). When you used them in the GenICam Browser: did you enable the features you wanted? This also needs to be done in your application, as it is not persisted.
Both the GenICam Browser and your application via our API use the same underlying technology. It is not standardized what happens when you disconnect from a device: some settings might still be active in your camera, others not. So the safe way is always to set up your camera as you want it to be in your application.
By enabling, you mean selecting the metadata field with the ChunkSelector and enabling it with ChunkEnable before reading the field, I assume? I did, but this triggers an error saying that the ChunkEnable node is not writable. This field is also gray in the GenICam Browser for each chunk field I select.
I do get the raw metadata block for each frame, so it is just a matter of interpreting it. I could mess around with it to extract the values, but I’d like to do it based on the node maps.
If ChunkEnable is not writable, it sounds like you started the acquisition, which locks all frame-buffer-related operations (the buffer size would change). Can you check whether enabling it before Stream.Start helps?
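That is, roughly this ordering (a sketch combining the discovery snippet above with the selector/enable steps; configure the chunks while the stream is stopped, then start acquisition):

```csharp
using (var device = DeviceFactory.Open(firstDeviceInfo))
{
    var nodeMap = device.NodeMaps[NodeMapNames.Device];

    // Configure chunks BEFORE starting the stream, while the
    // frame buffers can still be resized.
    nodeMap.Set("Std::ChunkModeActive", true);
    nodeMap.Set("Std::ChunkSelector", "ExposureTime"); // example chunk
    nodeMap.Set("Std::ChunkEnable", true);

    device.Stream.Start();
    // ... acquire and read chunk nodes here ...
    device.Stream.Stop();
}
```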
One last thing I can think of: check whether ChunkEnable’s selector is set correctly. If that doesn’t help, please contact support for in-depth consultation.