In reference to article https://software.intel.com/en-us/node/503871
I don't think the reason why we need gamma correction on current-generation panels is correctly described in the article above. Snippet from the article: "The luminance intensity generated by most displays is not a linear function of the applied signal but is proportional to some power (referred to as gamma) of the signal voltage. As a result, high intensity ranges are expanded and low intensity ranges are compressed. This nonlinearity must be compensated to achieve correct color reproduction."
The above explanation is valid for CRT monitors, but the response of modern panels is essentially linear in the applied signal. The real reason we need gamma is that the human eye can differentiate tonal differences in low light much better than in bright light. Gamma encoding allocates more bits to the darker region, where the eye is more sensitive, and fewer to the brighter region. Without it, more bits per channel would be required to achieve the same perceptual quality. When these gamma-encoded pixels reach the panel, they are gamma-decoded back to linear values before being rendered to the screen.
So, in essence, gamma encoding/decoding stores perceptually more information in a smaller number of bits.
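The bit-allocation argument above can be sketched numerically. This is a minimal illustration using a simple power-law gamma of 2.2; real sRGB uses a slightly different piecewise curve with a linear toe, so the exact counts are assumptions of this simplified model. It counts how many of the 256 codes of an 8-bit channel fall in the darkest 10% of linear light, with and without gamma encoding:

```python
# Illustrative power-law gamma (approximation; sRGB's actual transfer
# function is piecewise and differs slightly from a pure power curve).
GAMMA = 2.2

def gamma_encode(linear):
    """Map linear light [0, 1] to a gamma-encoded signal [0, 1]."""
    return linear ** (1.0 / GAMMA)

def gamma_decode(encoded):
    """Map a gamma-encoded signal back to linear light (what the panel does)."""
    return encoded ** GAMMA

# How many of the 256 8-bit codes represent the darkest 10% of linear light?
codes_dark_linear = sum(1 for c in range(256) if c / 255 <= 0.1)
codes_dark_gamma = sum(1 for c in range(256) if gamma_decode(c / 255) <= 0.1)

print(codes_dark_linear)  # 26 codes cover the dark range when stored linearly
print(codes_dark_gamma)   # 90 codes cover the same dark range with gamma encoding
```

With linear storage only about 26 of 256 codes describe the darkest tones, where the eye is most sensitive; gamma encoding devotes roughly 90 codes to that same range, which is why 8 bits per channel are perceptually sufficient.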
ps: The fact that the CRT's nonlinear response and the gamma decode function are nearly the same is coincidental.
Thank you very much for your comments! I agree with you that the statement above describes one application of gamma correction rather than the more general explanation. I will forward your comment to our documentation team. Thank you!