Should "DCT Method" be fast or slow?


artistgrrl
Posts: 8
Joined: Sun Jan 15, 2012 5:25 am

Should "DCT Method" be fast or slow?

Post by artistgrrl »

Hi, I have been using the basic XnView for two years now, mainly for converting default BMP files to optimized JPGs. Lately on my art blog I have been noticing a very slight loss of image quality, even when the image is uploaded as a regular PNG file instead of a JPG. It could just be the nature of some images, as I do both photography and abstract art, but quality does seem to improve by a hair when the image is left as a PNG file. I am a stickler about quality since my work contains a lot of detail.

Aside from that, a friend of mine commented on the slow, blurry upload online, and I told her that this is due to the DCT Method, though I am not even sure exactly what that is. Mine is set to "slow", but I wonder what would happen if it were set to "fast"? Would that help my image quality, besides causing images to load and clear a lot faster? Or would it just make things worse? I await your sage advice... :D

thank you

artistgrrl
cday
XnThusiast
Posts: 3891
Joined: Sun Apr 29, 2012 9:45 am
Location: Cheltenham, U.K.

Re: Should "DCT Method" be fast or slow?

Post by cday »

artistgrrl wrote:... a friend of mine commented on the slow blurry upload online ...
When an image is downloaded it is blurred at first, then becomes sharp after a short delay? That is characteristic of a format that supports progressive download: a low-resolution version of the image is downloaded first so that something can be viewed quickly, higher-quality versions are displayed as more image data arrives, and the full-quality image appears once all the data is received.

The common image formats that support progressive display on the web are progressive JPEG and interlaced PNG, and I suspect that the images you upload are being converted to one of those before being made available for download; if your images remained as baseline JPEGs or non-interlaced PNGs I don't think you would see the effect you describe. JPEG 2000 also supports progressive decoding, but it is little used in browsers.

If images you upload as PNGs, a lossless format, show the slight loss of quality you've noticed, that indicates the loss is occurring after you upload them.

With regard to the JPEG DCT method, the Slow option produces better quality than the Fast option, at the cost of taking slightly longer to save the file. There is also a third option, Float, which in principle could produce even better quality while taking longer still, although any benefit is reportedly slight.
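For anyone curious what those options actually trade off: the DCT (discrete cosine transform) converts each 8×8 block of pixels into frequency coefficients, and the Fast method approximates the cosine arithmetic with low-precision integers, while Slow and Float use more exact arithmetic. A rough illustrative sketch in Python of why lower-precision cosines cost a little accuracy (this is not the actual IJG algorithm, just the idea):

```python
import math

def dct_1d_float(samples):
    """Reference 8-point DCT-II using double-precision cosines."""
    n = len(samples)
    out = []
    for k in range(n):
        c = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(c * sum(s * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                           for i, s in enumerate(samples)))
    return out

def dct_1d_fixed(samples, bits=8):
    """Crude fixed-point DCT: cosines rounded to `bits` fractional bits.
    (Illustrative only -- real fast integer DCTs such as AAN are cleverer.)"""
    n = len(samples)
    scale = 1 << bits
    out = []
    for k in range(n):
        c = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        acc = sum(s * round(c * math.cos(math.pi * (2 * i + 1) * k / (2 * n)) * scale)
                  for i, s in enumerate(samples))
        out.append(acc / scale)
    return out

block = [52, 55, 61, 66, 70, 61, 64, 73]   # one row of 8-bit pixel values
exact = dct_1d_float(block)
approx = dct_1d_fixed(block)
max_err = max(abs(a - b) for a, b in zip(exact, approx))
print(max_err)  # small but nonzero: the cost of the integer shortcut
```

The error is roughly a single grey level here, which is why the visible difference between Fast and Slow is subtle at high quality settings but can show up as extra blockiness at low ones.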

The JPEG saving dialog also contains 'Smoothing factor' and 'SubSampling factor' options; you haven't asked about their effect on quality, but I would be interested in knowing the answer myself!

So, the slight loss of quality you've noticed is probably not related to your saving settings, and sadly is largely outside your control...
cday
XnThusiast
Posts: 3891
Joined: Sun Apr 29, 2012 9:45 am
Location: Cheltenham, U.K.

Re: Should "DCT Method" be fast or slow?

Post by cday »

Looking at your previous posts, I notice that you were working with very large pixel dimension images. Could the slight loss of quality that you've noticed recently be due to your images being downsampled to a lower resolution after you upload them?

To check for possible conversion to a file format that supports progressive download, and for any change in pixel dimensions, are you able to provide a link to a typical image?
XnTriq
Moderator & Librarian
Posts: 6259
Joined: Sun Sep 25, 2005 3:00 am
Location: Ref Desk

Re: Should "DCT Method" be fast or slow?

Post by XnTriq »

Wikibooks » GIMP (Saving as JPEG » Advanced Settings » [url=http://en.wikibooks.org/wiki/GIMP/Saving_as_JPEG#Lossy]Lossy[/url] » DCT Method) wrote:The 'Floating' DCT method produces slightly better results than the 'Integer' method with a slight cost to speed. 'Fast Integer' should only be used where speed is imperative.
Wikibooks » GIMP (Saving as JPEG » Advanced Settings » [url=http://en.wikibooks.org/wiki/GIMP/Saving_as_JPEG#Lossless]Lossless[/url] » Progressive) wrote:Selecting Progressive will change the encoding to display the image at increasingly higher quality levels until the image is fully loaded. Progressive encoding also benefits the image's compression.
Leaving this option unchecked will switch to Standard encoding, where the image is displayed in rows from top to bottom.
spacemarine ([url=http://newsgroup.xnview.com/viewtopic.php?t=13602&p=53760#p53760]XnView uses same JPEG encoder as GIMP ?[/url]) wrote:I would strongly recommend to make Float the new default,
as speed is no issue on today's PCs IMHO.

I would also recommend to make the descriptions a little more
user-friendly:

Fast --> Fast (worst but fastest)
Slow --> Slow
Float --> Float (best but slowest)

The same goes for Subsampling:

2x2,1x1,1x1 ----> 2x2,1x1,1x1 (default)
2x1,1x1,1x1 (4:2:2) ----> 2x1,1x1,1x1 (4:2:2)
1x1,1x1,1x1 ----> 1x1,1x1,1x1 (best quality)
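To make that subsampling notation concrete: 2x2,1x1,1x1 (i.e. 4:2:0) keeps luminance at full resolution but stores each chroma channel at half resolution in both directions, averaging every 2×2 block into one sample. A minimal sketch, assuming a chroma plane represented as a plain list of rows:

```python
def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane into one sample (4:2:0).
    `chroma` is a list of rows with even dimensions."""
    out = []
    for y in range(0, len(chroma), 2):
        row = []
        for x in range(0, len(chroma[0]), 2):
            total = (chroma[y][x] + chroma[y][x + 1] +
                     chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(total / 4.0)
        out.append(row)
    return out

cb = [[100, 102, 98, 96],
      [101, 103, 97, 95],
      [110, 112, 90, 92],
      [111, 113, 91, 93]]
print(subsample_420(cb))  # -> [[101.5, 96.5], [111.5, 91.5]]
```

This is why 1x1,1x1,1x1 (no subsampling) gives the best quality: nothing in the colour channels is averaged away before compression even starts.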
RGBA ([url=http://newsgroup.xnview.com/viewtopic.php?t=15865&p=65783#p65783]DCT Float in JPEG, Who use this?[/url]) wrote:I just wanted to ask you, did you ever use the floating-point DCT method?
The following differences are just compared between DCT float and DCT int (slow) method.

Pro:
- more accurate (depend on CPU)
- the difference should not be visible
- lower size
XnTriq ([url=http://newsgroup.xnview.com/viewtopic.php?t=22499&p=93079#p93079]Adjust-Automatic Levels / filesize[/url]) wrote:
Drahken wrote:That being said, if you check the "estimate quality" box on that dialog, the program will do its best to guess, although it will still be somewhat larger than the original.
The problem with this setting ...
  • Tools » Options » General » Read/Write » Write » JPEG » Parameters » Use estimated original quality if possible
... is that XnView only takes the compression level (“Q factor”) into account, ...
Calvin Hass ([url=http://www.impulseadventure.com/photo/jpeg-compression.html]JPEG Compression, Quality and File Size[/url] » Where does the error come from?) wrote:By far the biggest contributor to the error (ie. file size savings) in the JPEG algorithm is the quantization step. This is also the step that allows tuning by the user. A user may choose to have a slightly smaller file while preserving much of the original (ie. high quality, or low compression ratio), or a much smaller file size with less accuracy in matching the original (ie. low quality, or high compression ratio). The tuning is simply done by selecting the scaling factor to use with the quantization table.
... but ignores other criteria such as chroma sub-sampling.
Calvin Hass ([url=http://www.impulseadventure.com/photo/jpeg-compression.html]JPEG Compression, Quality and File Size[/url] » Where does the error come from?) wrote:The act of rounding the coefficients to the nearest integer results in a loss of image information (or more specifically, adds to the error). With larger quality scaling factors (ie. low image quality setting or high numbers in the quantization table), the amount of information that is truncated or discarded becomes significant. It is this stage (when combined with the Run Length Encoding that compresses the zeros) that allows for significant compression capabilities.

There are other contributors to the compression error, such as the color space conversions, but the quantization step is the most important.
IMHO, the default settings should be:
  • Optimize Huffman table = activated
  • DCT Method = Float (best but slowest)
Someone correct me if I'm wrong ;-)
Gordon Richardson (Photo.net » Learn About Photography » Jpeg Compression » [url=http://www.photo.net/learn/jpeg/index.html#qual]Jpeg Quality Settings[/url]) wrote:Typically the only thing that the user can control in Jpeg compression is the quality setting (and rarely the chroma sub-sampling). The value chosen is used in quantisation stage above, where less common values are discarded by using tables tuned to visual perception. This reduces the amount of information while preserving the perceived quality. Chroma sub-sampling settings are dealt with separately (below).

Ranges of quality settings differ in each implementation, but the IJG values range from 99 (best) to 1 (worst). Please note that these are not percentages, nor is there a direct correlation with the final file size. The example at the top of the page uses an IJG quality setting of 50, and has a file size ratio of roughly 20:1. Anytime you read that an image has been compressed with 10:1 Jpeg quality, you should know that this is slightly misleading (see digital cameras below).

Jpeg is a discrete algorithm, and for a given quality setting different input images may give widely differing file sizes. An image with lots of texture and fine detail will produce a large Jpeg file, while one consisting only of blue sky will be very small. Choosing an appropriate Jpeg quality setting is a subjective decision, with no hard rules. I personally use IJG quality settings of 75 to 50 depending on the subject.
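To make the link between the quality setting and the quantisation stage concrete, here is a sketch of the IJG scaling formula (from the library's jcparam.c): the base quantisation table from Annex K of the JPEG spec is scaled by a factor derived from the quality value, and larger divisors mean coarser rounding of the DCT coefficients, hence more loss and smaller files.

```python
def scale_quant_table(base_table, quality):
    """Scale a base quantisation table by an IJG-style quality factor (1-100),
    mirroring the formula in the IJG library's jcparam.c."""
    quality = max(1, min(100, quality))
    scale = 5000 // quality if quality < 50 else 200 - 2 * quality
    return [max(1, min(255, (q * scale + 50) // 100)) for q in base_table]

# First row of the standard JPEG luminance table (Annex K of the spec)
base = [16, 11, 10, 16, 24, 40, 51, 61]
print(scale_quant_table(base, 90))  # -> [3, 2, 2, 3, 5, 8, 10, 12]  (mild)
print(scale_quant_table(base, 25))  # -> [32, 22, 20, 32, 48, 80, 102, 122]  (coarse)
```

Note that quality 50 returns the base table unchanged, which is why the unscaled Annex K tables correspond to a mid-range setting rather than a "best" one.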
Links of interest regarding interlaced and progressive image rendering:
XnTriq ([url=http://newsgroup.xnview.com/viewtopic.php?t=24905&p=100364#p100364]Newbie somewhat confused, needing some basic pointers[/url]) wrote:
About.com ([url=http://graphicssoft.about.com/od/formatsjpeg/a/jpegmythsfacts_2.htm]JPEG Myths and Facts[/url]) wrote:Progressive JPEGs display gradually as they download, so they will appear initially at a very low quality and slowly become clearer until the image is fully downloaded. On a slow dial-up connection, this may give the illusion of a faster download, but usually a progressive JPEG is larger in file size and requires more processing power to decode and display. Also, some software is incapable of displaying progressive JPEGs, most notably, the free Imaging program bundled with many versions of Windows.
XnTriq ([url=http://newsgroup.xnview.com/viewtopic.php?p=75783#p75783]Some pictures edited in XnView are incompatible with...[/url]) wrote:
eddyad (AVForums.com: [url=http://www.avforums.com/forums/dvd-blu-ray-recorders-recording-media/694771-help-jpg-reading-panasonic-ex77.html#post6282648]Help!! JPG reading on Panasonic EX77[/url]) wrote:The simplest reason is that your images are 'progressive' JPGs.
There are three ways of saving JPGs:
1. Baseline
2. Baseline Optimized (in Photoshop - I'm not sure how it differs from Baseline)
3. Progressive - essentially designed for the dark days before broadband. The image is built in 'layers' so that users of slow lines would see it appear gradually 'all over' instead of very slowly from top to bottom.
Image editors usually give you a choice of JPG format. I don't know of anything which lumbers you with Progressive, like it or not

Generally, DVD players will not read (3) but are happy with (1); I don't know about (2).
TGB_72 ([url=http://newsgroup.xnview.com/viewtopic.php?t=12052]progressive to baseline jpeg[/url]) wrote:I've many pictures that were compressed as "progressive jpeg" but my pioneer dvd player can't play them, only can play baseline jpeg pictures, so I would like to know if a lossless jpeg conversion from progressive to baseline is possible with xnview.
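As a footnote on baseline vs. progressive: the difference is recorded in the file itself, in its start-of-frame marker (0xFFC0 for baseline, 0xFFC2 for progressive, per the JPEG spec). A minimal sketch that walks the marker segments of a JPEG byte string to report which kind it is; the two test streams below are fabricated stubs, not complete files:

```python
def jpeg_frame_type(data):
    """Walk the marker segments of a JPEG byte string and report its
    frame type: 'baseline' (SOF0), 'progressive' (SOF2), or None."""
    i = 2  # skip the SOI marker (FF D8)
    while i + 1 < len(data):
        if data[i] != 0xFF:
            return None  # not where a marker should be
        marker = data[i + 1]
        if marker == 0xC0:
            return "baseline"
        if marker == 0xC2:
            return "progressive"
        if 0xD0 <= marker <= 0xD9:
            i += 2  # standalone markers (RSTn, SOI, EOI) have no length field
            continue
        if i + 3 >= len(data):
            return None
        seg_len = (data[i + 2] << 8) | data[i + 3]  # length includes itself
        i += 2 + seg_len
    return None

# Fabricated stubs: SOI, a minimal APP0 segment, then an SOF marker
baseline = bytes([0xFF, 0xD8, 0xFF, 0xE0, 0x00, 0x02, 0xFF, 0xC0])
progressive = bytes([0xFF, 0xD8, 0xFF, 0xE0, 0x00, 0x02, 0xFF, 0xC2])
print(jpeg_frame_type(baseline))     # -> baseline
print(jpeg_frame_type(progressive))  # -> progressive
```

A tool that converts progressive to baseline (losslessly, by re-entropy-coding the same coefficients) only needs to rewrite this framing, which is why such conversion is possible without recompressing the image data.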
JohnFredC
XnThusiast
Posts: 2010
Joined: Wed Mar 17, 2004 8:33 pm
Location: Sarasota Florida

Re: Should "DCT Method" be fast or slow?

Post by JohnFredC »

Wow. What an incredible post!

The "final word".

Thank you, XnTriq.
John