Context:
I frequently work with very large images at work, using a microscope with the Olympus Stream software. I generally export as JPEG2000, since the format offers excellent lossless compression, support for HUGE images, and good software interoperability... usually. Unfortunately, the software's export engine writes every JPEG2000 as a single full-resolution "tile", rather than as a grid of smaller tiles the way most other JP2 libraries do. I have had a massive headache trying to work with these monolithic JP2 files, especially the very high-resolution ones.
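In case it helps with debugging, here is a rough sketch of how the single-tile layout can be confirmed with OpenJPEG's opj_dump tool (the file name below is just a placeholder, and I'm assuming opj_dump is on the PATH):

```python
# Sketch: dump the JPEG2000 codestream headers of one of the problem files
# so the image size and tile size can be compared. Assumes OpenJPEG's
# opj_dump tool is installed and on PATH; "big_scan.jp2" is a placeholder.
import subprocess

result = subprocess.run(
    ["opj_dump", "-i", "big_scan.jp2"],
    capture_output=True,
    text=True,
    check=True,
)

# In the dumped header info, if the reported tile width/height equals the
# full image width/height, the whole image is encoded as one monolithic tile.
print(result.stdout)
```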
Problem:
- Often (30-60% of the time) these massive JP2 files crash XnConvert when I try to add them to the queue, or they fail with an error when I try to export/process them.
- I have attached one of the problematic files here.
- If you are able to fix this, I would be very grateful! I suspect the issue is a bug in the JP2 decoder used internally.
- I did have very good success converting these files with the "OpenJPEG" library directly via the command line (rough sketch of my workaround after this list).
- I still had problems with some files when using "ImageMagick" via the command line (which apparently also uses the OpenJPEG engine internally?).
- Kakadu's JPEG2000 software handles these files fine, but that is $$$$$$ software.
- I didn't try the much-vaunted Grok JPEG2000 library, but it is another open-source alternative, and they claim very high performance.
- I didn't try the older open-source JasPer JPEG2000 decoder, but I'm guessing XnConvert uses that one internally, since the JasPer website lists XnView as a client.
- I have 128GB of RAM on my PC.
- Previously I used XnConvert mostly for personal use, but I just bought an XnConvert license: if this issue gets fixed, I would love to use it heavily for my work too ♥
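For reference, this is roughly the command-line workaround I mentioned above, sketched as a small Python wrapper around OpenJPEG's opj_decompress (the folder names are placeholders, not my real paths):

```python
# Sketch: batch-convert the monolithic JP2 files to TIFF using OpenJPEG's
# opj_decompress tool. Assumes opj_decompress is installed and on PATH;
# the input/output folder names are placeholders.
import subprocess
from pathlib import Path

src_dir = Path("slides_jp2")   # placeholder: folder of exported JP2 files
dst_dir = Path("slides_tif")   # placeholder: folder for converted output
dst_dir.mkdir(exist_ok=True)

for jp2_file in sorted(src_dir.glob("*.jp2")):
    tif_file = dst_dir / (jp2_file.stem + ".tif")
    # opj_decompress infers the output format from the file extension.
    subprocess.run(
        ["opj_decompress", "-i", str(jp2_file), "-o", str(tif_file)],
        check=True,
    )
    print(f"converted {jp2_file.name} -> {tif_file.name}")
```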