It is possible to craft a TIFF document in which the IFD list is circular,
leading to an infinite loop while traversing the chain. The libtiff
directory reader has a failsafe that breaks out of this loop after
reading 65535 directory entries, but it then keeps going, consuming
time and resources on what is essentially a bogus TIFF document.
This change fixes that behaviour by breaking out of processing when
a TIFF document has >= 65535 directories and terminating with an error.
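For illustration only, a minimal sketch of that kind of directory-count cap,
written against the public libtiff API (count_directories is a hypothetical
helper, not part of this change):

    #include <tiffio.h>

    /* Walk the IFD chain, giving up with an error once the count reaches
       the same 65535 limit used as the reader's failsafe. */
    static int count_directories(TIFF *tif)
    {
        int n = 0;
        do {
            if (++n >= 65535) {
                TIFFError("count_directories",
                          "Too many directories; the IFD chain is likely circular");
                return -1;
            }
        } while (TIFFReadDirectory(tif));
        return n;
    }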
From https://github.com/facebook/zstd
"Zstandard, or zstd as short version, is a fast lossless compression
algorithm, targeting real-time compression scenarios at zlib-level
and better compression ratios. It's backed by a very fast entropy stage,
provided by Huff0 and FSE library."
We require libzstd >= 1.0.0 so as to be able to use streaming compression
and decompression methods.
The default compression level we have selected is 9 (the range goes from 1 to 22),
which experimentally offers an equivalent or better compression ratio than
the default deflate/ZIP level of 6, with much faster compression.
For example on a 6600x4400 16bit image, tiffcp -c zip runs in 10.7 seconds,
while tiffcp -c zstd runs in 5.3 seconds. Decompression time for zip is
840 ms, and for zstd 650 ms. File size is 42735936 for zip, and
42586822 for zstd. Similar findings on other images.
On a 25894x16701 16bit image:

                   Compression time   Decompression time   File size
    ZSTD                 35 s                3.2 s         399 700 498
    ZIP/Deflate        1 m 20 s              4.9 s         419 622 336
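For illustration, a minimal sketch of the libzstd (>= 1.0.0) streaming
interface mentioned above, compressing a single in-memory buffer at level 9
(buffer handling is simplified to a one-shot call and a bail-out on error):

    #include <zstd.h>

    static size_t compress_with_zstd_stream(void *dst, size_t dstCapacity,
                                            const void *src, size_t srcSize)
    {
        ZSTD_CStream *zcs = ZSTD_createCStream();
        ZSTD_inBuffer in = { src, srcSize, 0 };
        ZSTD_outBuffer out = { dst, dstCapacity, 0 };
        size_t remaining;

        if (zcs == NULL || ZSTD_isError(ZSTD_initCStream(zcs, 9)))
            goto error;
        /* A real codec would feed the input chunk by chunk in a loop. */
        if (ZSTD_isError(ZSTD_compressStream(zcs, &out, &in)) || in.pos != in.size)
            goto error;
        remaining = ZSTD_endStream(zcs, &out);     /* flush the epilogue */
        if (ZSTD_isError(remaining) || remaining != 0)
            goto error;                            /* nonzero: dst too small */
        ZSTD_freeCStream(zcs);
        return out.pos;                            /* compressed size */
    error:
        ZSTD_freeCStream(zcs);
        return 0;
    }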
Fix for http://bugzilla.maptools.org/show_bug.cgi?id=2704
This vulnerability - at least for the supplied test case - arises because we
assume that a TIFF will only have one transfer function, the same
for all pages. This is not required by the TIFF standard.
We then read the transfer function for every page. Depending on the
transfer function, we allocate either 2 or 4 bytes to the XREF buffer.
We allocate this memory after we read in the transfer function for the
page.
For the first exploit - POC1 - the file has 3 pages. For the first page
we allocate 2 extra XREF entries, then for the next page 2 more
entries, and then for the last page the transfer function changes and we
allocate 4 more entries.
When we read the file into memory, we assume we have 4 extra bytes for
each and every page (as per the last transfer function we read). That
is not correct: we only have 2 extra bytes for the first 2 pages. As a
result, we end up writing past the end of the buffer.
This change also fixes some related issues. For example,
TIFFGetField() can return uninitialized pointer values, and the logic to
detect an N=3 vs. N=1 transfer function seemed rather strange.
It is also strange that we declare the transfer functions to be of type
float, when the standard says they are unsigned 16-bit values. This is
fixed in another patch.
This patch checks that the N value of the transfer function is the
same for every page; if it changes, we abort with an
error. In theory we should perhaps check that the transfer function
itself is identical for every page, but we don't do that because of the
confusion about the type of the data in the transfer function.
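A rough sketch of the per-page check described above, using only the public
TIFFGetField() API (the helper name is hypothetical, and the "distinct second
pointer" heuristic for N=3 is an illustrative assumption, not necessarily the
exact test the patch uses):

    #include <tiffio.h>

    /* Returns 3 or 1 for the current page's transfer-function component
       count, or 0 when the tag is absent. The caller remembers the value
       seen on the first page and aborts with an error as soon as a later
       page reports a different one. */
    static int transfer_function_components(TIFF *input)
    {
        uint16 *tf[3] = { NULL, NULL, NULL };
        if (!TIFFGetField(input, TIFFTAG_TRANSFERFUNCTION, &tf[0], &tf[1], &tf[2]))
            return 0;
        /* Count three components only when the second pointer is a real,
           distinct array rather than NULL or a duplicate of the first. */
        return (tf[1] != NULL && tf[1] != tf[0]) ? 3 : 1;
    }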
program. This is in response to the report associated with
CVE-2017-16232 but does not solve the extremely high memory usage
with the associated POC file.
mode (i.e. default) when there is both a StripOffsets and a TileOffsets
tag, or a StripByteCounts and a TileByteCounts tag.
Fixes http://bugzilla.maptools.org/show_bug.cgi?id=2689
* tools/tiff2ps.c: call TIFFClose() in error code paths.
writeBufferToSeparateStrips(), writeBufferToContigTiles() and
writeBufferToSeparateTiles() that could cause heap buffer overflows.
Reported by Henri Salo from Nixu Corporation.
Fixes http://bugzilla.maptools.org/show_bug.cgi?id=2592
compressed images. Reported by Tyler Bohan of Cisco Talos as
TALOS-CAN-0187 / CVE-2016-5652.
Also prevents writing 2 extra uninitialized bytes to the file stream.
t2p_readwrite_pdf_image_tile(), causing a crash when reading a
JPEG-compressed image whose TIFFTAG_JPEGTABLES length is one.
Reported as MSVR 35101 by Axel Souchet and Vishal Chauhan from
the MSRC Vulnerabilities & Mitigations team.
required tags. Found on test case of MSVR 35100.
* tools/tiffcrop.c: fix read of undefined buffer in
readContigStripsIntoBuffer() due to uint16 overflow. Probably not a
security issue, but I could be wrong. Reported as MSVR 35100 by Axel
Souchet from the MSRC Vulnerabilities & Mitigations team.
in heap or stack allocated buffers. Reported as MSVR 35093,
MSVR 35096 and MSVR 35097. Discovered by Axel Souchet and Vishal
Chauhan from the MSRC Vulnerabilities & Mitigations team.
* tools/tiff2pdf.c: fix out-of-bounds write vulnerabilities in
a heap-allocated buffer in t2p_process_jpeg_strip(). Reported as MSVR
35098. Discovered by Axel Souchet and Vishal Chauhan from the MSRC
Vulnerabilities & Mitigations team.
* libtiff/tif_pixarlog.c: fix out-of-bounds write vulnerabilities
in heap allocated buffers. Reported as MSVR 35094. Discovered by
Axel Souchet and Vishal Chauhan from the MSRC Vulnerabilities &
Mitigations team.
* libtiff/tif_write.c: fix issue in error code path of TIFFFlushData1()
that didn't reset the tif_rawcc and tif_rawcp members. I'm not
completely sure if that could happen in practice outside of the odd
behaviour of t2p_seekproc() in tiff2pdf. The report points out that a
better fix could be to check the return value of TIFFFlushData1() in
the places where it isn't done currently, but this patch seems to be enough.
Reported as MSVR 35095. Discovered by Axel Souchet & Vishal Chauhan &
Suha Can from the MSRC Vulnerabilities & Mitigations team.
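The reset in question amounts to something like the following inside the
failing branch of TIFFFlushData1() (a paraphrase of the described change, not
the verbatim patch):

    /* Even when the strip/tile could not be appended, drop the buffered
       bytes so that a later flush does not try to write them again from a
       stale position. */
    tif->tif_rawcc = 0;                  /* no pending raw bytes */
    tif->tif_rawcp = tif->tif_rawdata;   /* reset cursor to buffer start */
    return (0);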
buffer, when -b mode is enabled, that could result in an out-of-bounds
write. Based initially on patch tiff-CVE-2016-3945.patch from
libtiff-4.0.3-25.el7_2.src.rpm by Nikola Forro, with correction for
invalid tests that rejected valid files.
on a tiled separate TIFF with more than 8 samples per pixel.
Reported by Kaixiang Zhang of the Cloud Security Team, Qihoo 360
(CVE-2016-5321, bugzilla #2558)
ras2tiff, sgi2tiff, sgisv, and ycbcr are completely removed from
the distribution. The libtiff tools rgb2ycbcr and thumbnail are
only built in the build tree for testing. Old files are put in a
new 'archive' subdirectory of the source repository, but not in
distribution archives. These changes are made in order to lessen
the maintenance burden.
Roger Leigh (via tiff mailing list on 2015-09-01) to fix an issue
with BSD make and to make use of cmake in the 'distcheck' target
conditional on whether cmake is available.
libtiff mailing list on Mon, 22 Jun 2015 21:21:01 +0100. Several
corrections to ensure that the autotools build still works were
added by me. I have not yet tested the build using 'cmake' or
MSVC with 'nmake'.
definitions that configure produces, including for WIN64. Still
needs to be tested.
'lld' is not assured by the run-time DLLs and so GCC warns.
Add TIFF_SIZE_T and TIFF_SIZE_FORMAT to provide a type definition
and printf format specifier to deal with printing values of
'size_t' type. In particular, this was necessary for WIN64.
Added a configure test for whether the system headers provide 'optarg'
(the normal case) and blocked out the many explicit 'extern' statements
in the utilities. This was found to be necessary under Windows
when getopt is in a DLL and the symbols are already imported with
dllimport via standard header files.
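In the utilities this boils down to guarding the externs with the macro that
the configure check produces; a sketch, assuming the conventional
AC_CHECK_DECLS() result macro HAVE_DECL_OPTARG:

    #include <unistd.h>   /* getopt()/optarg on POSIX systems */

    #if !HAVE_DECL_OPTARG
    /* Only declare these ourselves when the system headers do not; under
       Windows the symbols may already be imported via dllimport. */
    extern int optind;
    extern char *optarg;
    #endif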
An Ubuntu user noticed that tiffgt was not appropriately calling glFlush(),
which was causing tiffgt to open blank windows in some cases. I'm not sure
what the cases are, but my system happened to be one that saw blank windows,
and the one-line fix the user provided was sufficient to solve it in my case.
1. libtiffcrop-fix.patch fixes a small problem in tiffcrop: it seems it
was incorrectly using TIFFSetField instead of CopyField.
And in libtiff-correct-fax-scaling.patch we have some other changes:
2. I had to remove a check in main() that didn't allow maxPageWidth to
be bigger than pageWidth.
3. [ Omitted due to question on universality ]
4. The pagewidth variable was being set to the maxpagewidth instead,
which threw off all the calculations. That made sense while the check in
point 2 was in place, but not any more. I've modified it so that
pagewidth is set to the specified page width when maxpagewidth is
bigger.
5. The remaining lines of the patch - in exportMaskedImage() -
basically fix the scaling.
http://bugzilla.maptools.org/show_bug.cgi?id=2078#c9
The problem is that the TIFF library attempts to write the TIFF header as soon as the
tiff2pdf utility initializes the library. Fortunately, the library contains an
I/O abstraction feature, so there are no hardcoded writes to a file descriptor
anywhere. In particular, it appears that the utility's output suppression
feature can be used to suppress the initial write of the header.
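A sketch of how that suppression can look through the client I/O hooks passed
to TIFFClientOpen() (the structure and field names here are hypothetical, not
tiff2pdf's actual ones):

    #include <stdio.h>
    #include <tiffio.h>

    typedef struct { FILE *fp; int disabled; } pdf_output;

    /* Write hook: while 'disabled' is set, claim success without touching
       the stream, so libtiff's own TIFF header never reaches the PDF. */
    static tmsize_t
    suppressible_write(thandle_t handle, void *data, tmsize_t size)
    {
        pdf_output *out = (pdf_output *) handle;
        if (out->disabled)
            return size;
        return (tmsize_t) fwrite(data, 1, (size_t) size, out->fp);
    }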
There is a lot of code like this:
buflen=snprintf(buffer, sizeof(buffer), "%lu", (unsigned long)number);
written += t2pWriteFile(output, (tdata_t) buffer, buflen );
in tiff2pdf. This is seriously broken: when the formatted string is larger than
the buffer, the snprintf return value is >= sizeof(buffer) [current standard] or -1
[legacy], and in case of other errors snprintf returns -1.
Either way, the result is reading unallocated memory and a possible SIGSEGV (DoS).
I doubt it is really exploitable (to begin with, in most cases the buffer was
large enough and the sprintf->snprintf change was pure paranoia, IMO), but /if/ you
decided the previous code was not safe and snprintf is necessary, /then/ you MUST
check its return value.
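A sketch of the checked pattern being asked for (buffer, number and output
are the placeholders from the snippet quoted above):

    buflen = snprintf(buffer, sizeof(buffer), "%lu", (unsigned long) number);
    if (buflen < 0 || (size_t) buflen >= sizeof(buffer)) {
        TIFFError("tiff2pdf", "Formatted value does not fit in the local buffer");
        return 0;   /* propagate an error instead of reading past the buffer */
    }
    written += t2pWriteFile(output, (tdata_t) buffer, buflen);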
value as argument".
(checksignature): Fix Coverity 1024894 "Ignoring number of bytes
read".
(readextension): Fix Coverity 1024893 "Ignoring number of bytes
read".
(readgifimage): Fix Coverity 1024890 "Ignoring number of bytes
read".
(readraster): Fix Coverity 1024891 "Ignoring number of bytes
read".
(readgifimage): Fix Coverity 1024892 "Ignoring number of bytes
read".
value from library".
(guessSize): Fix Coverity 1024888 "Unchecked return value from
library".
(guessSize): Fix Coverity 1214162 "Ignoring number of bytes read".
(guessSize): Fix Coverity 1024889 "Unchecked return value from
library".
as argument".
(main): Fix Coverity 1024678 "Unchecked return value from
library".
(main): Fix Coverity 1024679 "Unchecked return value from
library".
(main): Fix Coverity 1214160 "Ignoring number of bytes read".
Based on patch by Tomasz Buchert (http://bugzilla.maptools.org/show_bug.cgi?id=2480)
Description: fix for Debian bug #741451
tiffcp crashes when converting a JPEG-encoded TIFF to a different
encoding (like none or lzw). For example, this will probably fail:
tiffcp -c none jpeg_encoded_file.tif output.tif
The reason is that when the input file contains JPEG data,
the tiffcp code forces conversion to RGB space. However,
the output normally inherits YCbCr subsampling parameters
from the input, which leads to a smaller working buffer
than necessary. The buffer is subsequently overrun inside
cpStripToTile() (called from writeBufferToContigTiles).
Note that the resulting TIFF file would be scrambled even
if tiffcp didn't crash, since the output file would contain
RGB data interpreted as subsampled YCbCr values.
This patch fixes the problem by forcing RGB space on the output
TIF if the input is JPEG-encoded and the output is *not* JPEG-encoded.
Author: Tomasz Buchert <tomasz.buchert@inria.fr>
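The forcing described above amounts to something along these lines in
tiffcp's per-directory setup (a sketch with variable names approximating
tiffcp's, not the verbatim patch):

    if (input_compression == COMPRESSION_JPEG && compression != COMPRESSION_JPEG) {
        /* The JPEG codec hands tiffcp RGB samples, so the output must not
           inherit PhotometricInterpretation = YCbCr (and its subsampling)
           from the input directory. */
        TIFFSetField(out, TIFFTAG_PHOTOMETRIC, PHOTOMETRIC_RGB);
    }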
* libtiff/tif_dir.c: TIFFSetField(): refuse to set negative values for
TIFFTAG_XRESOLUTION and TIFFTAG_YRESOLUTION that cause asserts when writing
the directory (see the sketch after this list).
* libtiff/tif_dirread.c: TIFFReadDirectory(): refuse to read ColorMap or
TransferFunction if BitsPerSample has not yet been read, otherwise reading
it later will cause user code to crash if BitsPerSample > 1
* libtiff/tif_getimage.c: TIFFRGBAImageOK(): return FALSE if LOGLUV with
SamplesPerPixel != 3, or if CIELAB with SamplesPerPixel != 3 or BitsPerSample != 8
* libtiff/tif_next.c: in the "run mode", use tilewidth for tiled images
instead of imagewidth to avoid crash
* tools/bmp2tiff.c: fix crash due to int overflow related to input BMP dimensions
* tools/tiff2pdf.c: fix crash due to invalid tile count (should likely be checked by
libtiff too). Detect invalid settings of BitsPerSample/SamplesPerPixel for CIELAB / ITULAB
* tools/tiffcrop.c: fix crash due to invalid TileWidth/TileHeight
* tools/tiffdump.c: fix crash due to overflow of entry count.
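For the resolution item at the top of this list, the effect is roughly the
following (a user-level sketch with a hypothetical wrapper, not the in-library
patch itself):

    #include <tiffio.h>

    /* Refuse negative resolutions up front instead of letting them trip an
       assert later, when the directory is written. */
    static int set_resolution_checked(TIFF *tif, float xres, float yres)
    {
        if (xres < 0.0f || yres < 0.0f) {
            TIFFError("set_resolution_checked", "negative resolution rejected");
            return 0;
        }
        return TIFFSetField(tif, TIFFTAG_XRESOLUTION, xres)
            && TIFFSetField(tif, TIFFTAG_YRESOLUTION, yres);
    }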
tag can return one channel, with the other two channels set to
NULL. The tiff2pdf code was expecting that the other two channels
were duplicate pointers in the case where there is only one
channel. Detect this condition in order to avoid a crash, and
presumably perform correctly with just one channel.
sp->dec_codetab in LZWPreDecode (bug #2459)
* libtiff/tif_read.c: in TIFFReadBufferSetup(), avoid passing -1 size
to TIFFmalloc() if passed user buffer size is 0 (bug #2459)
* libtiff/tif_ojpeg.c: make Coverity happier (not a bug, #2459)
* libtiff/tif_dir.c: in _TIFFVGetField() and _TIFFVSetField(), make
Coverity happier (not a bug, #2459)
* libtiff/tif_dirread.c: in TIFFFetchNormalTag(), make Coverity happier
(not a bug, #2459)
* tools/tiff2pdf.c: close PDF file (bug #2479)
* tools/fax2ps.c: check malloc()/realloc() result (bug #2470)
* tools/tiffdump.c: detect cycle in TIFF directory chaining (bug #2463)
and avoid passing a NULL pointer to read() if seek() failed before (bug #2459)
* tools/tiffcrop.c: fix segfault if bad value passed to -Z option
(bug #2459) and add missing va_end in dump_info (#2459)
* tools/gif2tif.c: apply patch for CVE-2013-4243 (#2451)
Date: Thu, 18 Jul 2013 14:36:47 -0400
Here's a patch to correct an issue with creating G4-compressed PDFs.
The issue is caused by == being used to compare bitfields when only
one bit is intended to be compared. Some of the tiffs I have had both
T2P_CS_ICCBASED and T2P_CS_BILEVEL set; therefore, the current code
will fail, producing certain pages that are inverted.
The patch follows, and is also attached.
--David
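To make the nature of the change concrete (a paraphrased sketch, not the
attached patch itself): the test needs to mask the one bit of interest rather
than compare the whole bitfield for equality.

    /* before: fails whenever another colorspace bit (e.g. T2P_CS_ICCBASED)
       is set alongside T2P_CS_BILEVEL */
    if (t2p->pdf_colorspace == T2P_CS_BILEVEL) { /* ... */ }

    /* after: checks only the intended bit */
    if (t2p->pdf_colorspace & T2P_CS_BILEVEL) { /* ... */ }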