MATLAB fails reading TIFF files that it has written

To work around IMREAD/IMWRITE failing on large image files, I tried using the underlying Tiff library to write out an array of 1s as a BIGTIFF and then read it back:
% create test data
test = ones(37899, 38687, 3, 'uint8');
% write it out as a BigTIFF ('w8' mode)
t = Tiff('test.tiff', 'w8');
setTag(t, 'Photometric', Tiff.Photometric.RGB);
setTag(t, 'PlanarConfiguration', Tiff.PlanarConfiguration.Chunky);
setTag(t, 'BitsPerSample', 8);
setTag(t, 'SamplesPerPixel', 3);
setTag(t, 'ImageLength', size(test, 1));
setTag(t, 'ImageWidth', size(test, 2));
setTag(t, 'Compression', Tiff.Compression.LZW);
write(t, test);
close(t);
% read it back
t2 = Tiff('test.tiff');
test2 = read(t2);
close(t2);
The test.tiff file looks fine when I open it in a third-party program, but it is always blank (all zeros) when read back in MATLAB:
>> test2(1)
ans =
uint8
0
>> test(1)
ans =
uint8
1
If I shrink the data to less than 2 GB, it reads back fine. Am I doing something wrong, or is MATLAB's TIFF read function not 64-bit clean? I don't see anything in the documentation about this, and MATLAB appears to be able to write large TIFF files without a problem.

Accepted Answer

mgiacomelli
mgiacomelli on 12 Sep 2018
I spoke with support, and after some back and forth we traced this problem down to the RowsPerStrip parameter, which is not well explained in the documentation. For anyone else using MATLAB with BigTIFF files: the default value of this parameter seems to work for input data up to 2 GB. Above that, the default produces a file that MATLAB cannot read back, and you must explicitly set RowsPerStrip to a smaller value if you want to open the file in MATLAB later.
I suspect this is actually a memory corruption bug in MATLAB's Tiff read function (which would explain why it happens at the 32-bit signed integer boundary), but it is easy to work around, so I did not dig into it further.
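A minimal sketch of the workaround, based on the description above: write the same data but explicitly set RowsPerStrip so that each strip is small. The strip height of 16 rows here is an illustrative choice, not a value recommended by support.

```matlab
% Workaround sketch: set RowsPerStrip explicitly so individual strips
% stay small, keeping the file readable by MATLAB even when the total
% image data exceeds 2 GB. The value 16 is an arbitrary example.
test = ones(37899, 38687, 3, 'uint8');
t = Tiff('test.tiff', 'w8');            % 'w8' = BigTIFF write mode
setTag(t, 'Photometric', Tiff.Photometric.RGB);
setTag(t, 'PlanarConfiguration', Tiff.PlanarConfiguration.Chunky);
setTag(t, 'BitsPerSample', 8);
setTag(t, 'SamplesPerPixel', 3);
setTag(t, 'ImageLength', size(test, 1));
setTag(t, 'ImageWidth', size(test, 2));
setTag(t, 'Compression', Tiff.Compression.LZW);
setTag(t, 'RowsPerStrip', 16);          % the key addition
write(t, test);
close(t);
```

With this tag set, reading the file back with Tiff/read should return the original data rather than zeros.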


Release: R2018a
