Tuesday 27 May 2014

Finally the FPS changes!

In my previous post I mentioned how it appeared that the JPEG encoder was working at 100% encoding quality. Turns out it is true: I hardcoded the core to run at 50% quality and at 100% quality, and there was a clear change in the fps. At 100%, the fps was the same as the normal HDMI2USB frame rate, and at 50%, the frame rate jumps to around 20 fps.
Now, 100% quality encoding is not good. Why? In the quantisation step, each sample of an 8x8 block is divided by a number that depends on the encoding quality and on the frequency component the sample represents. The human eye is sensitive to low frequencies, so the quantisation process preserves low frequencies and suppresses high ones. But at 100% quality all components are preserved, so the output file is large, yet to the human eye it looks as good as, say, 75% quality.
Plus, at 100% quality the clock cycles spent on quantisation are wasted, as we are basically dividing by 1.
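To see why 100% quality degenerates to dividing by 1, here is a minimal sketch of the widely used IJG (libjpeg) quality-scaling rule. The base luminance table is the example from Annex K of the JPEG spec; note this is the IJG formula, which may not be exactly what the HDMI2USB core's tables use:

```python
# Sketch of the IJG (libjpeg) quality scaling, illustrating why 100% quality
# makes quantisation a no-op. Base 8x8 luminance table from JPEG spec Annex K.

BASE_LUMA = [
    16, 11, 10, 16, 24, 40, 51, 61,
    12, 12, 14, 19, 26, 58, 60, 55,
    14, 13, 16, 24, 40, 57, 69, 56,
    14, 17, 22, 29, 51, 87, 80, 62,
    18, 22, 37, 56, 68, 109, 103, 77,
    24, 35, 55, 64, 81, 104, 113, 92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103, 99,
]

def scaled_table(quality):
    """Scale the base table for a quality setting in 1..100 (IJG rule)."""
    scale = 5000 // quality if quality < 50 else 200 - 2 * quality
    return [min(max((q * scale + 50) // 100, 1), 255) for q in BASE_LUMA]

print(scaled_table(100)[:8])  # [1, 1, 1, 1, 1, 1, 1, 1] -- dividing by 1
print(scaled_table(50)[:8])   # identical to the base table
```

At quality 100 the scale factor is 0 and every divisor clamps to 1, so every clock cycle spent on quantisation changes nothing; at quality 50 the base table is used unchanged.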

 (^ 100% quality: 11.50 fps)
(^ 50% quality: 19 fps)

I will look up some material on chrominance and luminance tables and see what is a good trade-off between quality and frame rate.


5 comments:

  1. Couple of things;
    * Is there a possibility of doing less work at 100% quality and getting a speed-up? i.e. if you removed the stages of the pipeline which are effectively no-ops?
    * I don't quite understand why the quality level changes the throughput. Is it because the slow stages now have less data to work on?
    * Is it possible that this could be software configurable? Could we set the quality via the control port? (Then we could dynamically change the fps versus quality trade-off.)
    * Are you sure that the USB speed isn't having an impact here? Reducing the size would mean more frames are able to be shipped up the USB system. Is the jpeg encoder throttled by the USB speed in any way? Could you record the output rate at the jpeg encoder and send it out the control port or via some other way?

    ReplyDelete
    Replies
    1. * It is possible to do less work when 100% encoding quality is required without compromising speed: if the core is set to 100%, a mux (a fairly complex one) could be used to bypass the quantisation step. But the point is that there is no real need for 100% quality, because what matters is visual perception. Quantisation tables are made in such a way that the data gets compressed but there is no visible change to the picture.
      * I can think of two possible reasons for the improvement in speed when the quality level changes:
      1) Steps after quantisation become more efficient: run-length encoding is more efficient when there is repetition in the data. Quantisation zeroes the high-frequency samples, so RLE improves.
      2) Smaller size: I suspect the blocks after the JPEG core are unable to consume the data fed to them at 100% quality. Hence when the size of the frames reduces, the frame rate increases.

      * There is a jpeg_cmd signal which the top module of the JPEG core receives from the USB block. This signal selects the encoding quality by choosing the appropriate quantisation tables, so it could be used to change the fps dynamically. I guess currently jpeg_cmd is always "00", which means 100% quality.
      * Yes, I suspect this too. I will follow up on it.
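      The RLE point in 1) above can be illustrated with a toy sketch. The coefficient values are made up, and the real core's entropy coding is more involved (zig-zag order, Huffman coding of the pairs), but the effect of trailing zeros is the same:

```python
# Toy run-length encoding over an already zig-zag-ordered block, showing that
# quantisation which zeroes high frequencies produces fewer (run, value)
# symbols. Coefficient values below are illustrative, not from the core.

def rle(coeffs):
    """Encode as (zero_run, value) pairs, JPEG-style (simplified)."""
    out, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            out.append((run, c))
            run = 0
    if run:
        out.append((0, 0))  # end-of-block marker for trailing zeros
    return out

full = [90, 3, -2, 1, 1, -1, 1, 1]      # 100% quality: nothing zeroed
quantised = [90, 3, -2, 1, 0, 0, 0, 0]  # lower quality: trailing zeros
print(len(rle(full)), len(rle(quantised)))  # 8 symbols vs 5 symbols
```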

      Delete
    2. As JPEG compression works poorly under a number of conditions we care about (i.e. high-contrast text), we probably want to support 100% quality. Whether we support it via different loadable firmware or via a controllable flag is something we need to figure out.

      Delete
  2. Great work! I think this backs up our suspicion that the MJPEG core is producing faster frame rates than the PC sees, and that we are being held back by transferring the data out of the system.

    As a next move I would disconnect the MJPEG core from any outputs that could slow it down, and measure how fast it goes.

    ReplyDelete
    Replies
    1. Also, the fact that changing the operating frequency from 78 to 90 has absolutely no effect on fps adds fuel to the suspicion.

      Delete