
Add option to ignore tags exceeding a byte-size threshold to avoid OutOfMemoryErrors on large files #60


Description

@GoogleCodeExporter
I'm running into a problem where I get an OutOfMemoryError while processing a large TIFF file (a little under 500 MB). I'm not sure whether metadata-extractor is designed to handle files this large.

Here is the stack trace:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
  at com.drew.lang.RandomAccessFileReader.getBytes(Unknown Source)
  at com.drew.metadata.exif.ExifReader.processTag(Unknown Source)
  at com.drew.metadata.exif.ExifReader.processDirectory(Unknown Source)
  at com.drew.metadata.exif.ExifReader.extractIFD(Unknown Source)
  at com.drew.metadata.exif.ExifReader.extractTiff(Unknown Source)
  at com.drew.imaging.tiff.TiffMetadataReader.readMetadata(Unknown Source)
  at com.drew.imaging.ImageMetadataReader.readMetadata(Unknown Source)
  at com.drew.imaging.ImageMetadataReader.readMetadata(Unknown Source)
  at digitalfusion.util.TestMetadataReader.main(TestMetadataReader.java:19)

The method calling the above just looks like this:

public static void main(String[] args) throws Exception {
  Metadata metadata = ImageMetadataReader.readMetadata(new File("/path/to/myfile.tif"));
}
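
From the stack trace, the allocation happens when processTag reads the full tag value via getBytes. A guard of the kind the issue title proposes might look roughly like the sketch below; TagByteLimit, TagByteSource, and maxTagByteCount are hypothetical stand-ins for illustration, not metadata-extractor's actual API.

public final class TagByteLimit {

    /** Hypothetical source of raw tag bytes (stands in for the file reader). */
    public interface TagByteSource {
        byte[] getBytes(long offset, int count) throws java.io.IOException;
    }

    private final long maxTagByteCount;

    public TagByteLimit(long maxTagByteCount) {
        this.maxTagByteCount = maxTagByteCount;
    }

    /**
     * Returns the tag's bytes, or null to signal that the tag should be skipped
     * because its value exceeds the configured threshold.
     */
    public byte[] readTagValue(TagByteSource source, long offset, int byteCount)
            throws java.io.IOException {
        if (maxTagByteCount > 0 && byteCount > maxTagByteCount) {
            return null; // skip oversized tags instead of allocating a huge array on the heap
        }
        return source.getBytes(offset, byteCount);
    }
}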

I'm using version 2.6.2 with Java 1.6 on OS X. The problem also happens on our Ubuntu servers (also Java 1.6).
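
In the meantime, a caller-side check like the sketch below avoids the crash by skipping extraction for oversized files. This is an application-level workaround rather than a library feature, and the 100 MB cap is arbitrary.

import java.io.File;

import com.drew.imaging.ImageMetadataReader;
import com.drew.metadata.Metadata;

public class SafeMetadataRead {
    // Arbitrary cap; anything larger is skipped rather than risking an OutOfMemoryError.
    private static final long MAX_FILE_BYTES = 100L * 1024 * 1024;

    public static Metadata readIfSmallEnough(File file) throws Exception {
        if (file.length() > MAX_FILE_BYTES) {
            return null; // caller decides how to handle skipped files
        }
        return ImageMetadataReader.readMetadata(file);
    }
}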

Please contact me if you would like a copy of the file for testing.

Thanks,
Michael

Original issue reported on code.google.com by [email protected] on 25 Oct 2012 at 6:04
