Bit depth detection changed in 14.01

Hello,

Our software uses the CVB parameters DataType and BitsPerPixel to determine the camera's pixel format.
It seems the CVB behaviour has changed between 14.0 (and earlier) and 14.1.

PixelFormat in-camera | CVB 14.0                          | CVB 14.1
mono8                 | DataType = 8,  BitsPerPixel = 8   | DataType = 8,  BitsPerPixel = 8
mono10                | DataType = 10, BitsPerPixel = 10  | DataType = 16, BitsPerPixel = 16
mono12                | DataType = 12, BitsPerPixel = 12  | DataType = 16, BitsPerPixel = 16

As you can see, in 14.1 the bit depth is no longer reported correctly (or at least not in the same way as before).
Am I missing something? Is this the intended way of handling this, or is there a better approach or a newly added function in CVB 14.1?

Best regards.

Hi @whisper, could you add some information about how and in which context you are fetching DataType and BitsPerPixel? Which API/language and which application are you using?

AFAIK we are using the (deprecated) .vin stack with C.
In 14.1 the images are transferred fine; only the detection of the bit depth has changed compared to previous versions.

Thank you, I will dig into what you are reporting here, but it would be great if you could say more about how you access the values you call “CVB parameters”. There is an export in iCVCImg.h called ImageDatatype that returns a cvbdatatype_t for an image (IMG). Do you mean this one?

Affirmative.
The functions ImageDatatype and BitsPerPixel from iCVCImg.h.
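
For reference, a minimal sketch of how those two exports are typically combined, assuming the usual iCVCImg.h signatures (ImageDatatype taking an IMG handle and a plane index, BitsPerPixel taking the returned cvbdatatype_t descriptor); please verify against the installed header:

    /* Read the reported bit depth of plane 0 of an existing IMG handle. */
    #include "iCVCImg.h"

    long reported_bits_per_pixel(IMG img)
    {
      cvbdatatype_t dt = ImageDatatype(img, 0); /* datatype descriptor of plane 0 */
      return (long)BitsPerPixel(dt);            /* bit depth encoded in the descriptor */
    }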

Thanks, that helps a lot. Could you also point out where you get the IMG from?

The devs tell me the software is using the function LoadImageFile.
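
Putting the pieces together, a minimal sketch of that path, under the assumption that the C-style API is called from C++ (LoadImageFile and ReleaseObject take reference parameters in the stock headers) and with a placeholder file name; running it against a mono10 or mono12 image under 14.0 and 14.1 should reproduce the values in the table above:

    /* Load an image file and print the DataType/BitsPerPixel it reports. */
    #include <stdio.h>
    #include "iCVCImg.h"
    #include "iCVCUtilities.h"

    int main()
    {
      IMG img = NULL;
      if (!LoadImageFile("mono12_test.tif", img)) /* placeholder file name */
      {
        printf("could not load image\n");
        return 1;
      }

      cvbdatatype_t dt = ImageDatatype(img, 0);
      printf("DataType = %ld, BitsPerPixel = %ld\n",
             (long)dt, (long)BitsPerPixel(dt));

      ReleaseObject(img); /* assumed to be the matching release call from iCVCUtilities.h */
      return 0;
    }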
