There are too many situations where ICC workflows don't produce results as good as one would hope for and rightfully expect. What accounts for these less-than-expected results?


Answers


1. Quantization
The term comes from the world of analog-to-digital conversion, where analog data has to be broken into digital representations with fixed precision and range, and is hence "quantized". In colour management the term refers to truncation, or loss of data: dropping the lower-order (rightmost) digits of a result so that it can be represented in a certain number of binary digits. The computer industry standardized on the 8-bit byte as a basic storage and computational unit, and with 8 bits one can represent or encode 256 distinct values. For many situations, 256 levels or variations per colour channel matches the human visual system reasonably closely. However, if values are coded into 8-bit bytes without regard for how they spread themselves across the coding scheme, the 8-bit representation can become very lossy when representing colour values.
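A minimal sketch (not from the original answer) of the basic effect: round-tripping a continuous value through a 256-level encoding cannot recover the original exactly, and the error is bounded by half a quantization step.

```python
# Illustrative 8-bit quantization round trip. The encoding scheme here
# (uniform scaling to 0..255) is an assumption for demonstration only.

def encode_8bit(x: float) -> int:
    """Map a value in [0.0, 1.0] to one of 256 integer levels."""
    return round(x * 255)

def decode_8bit(code: int) -> float:
    """Map an 8-bit level back to [0.0, 1.0]."""
    return code / 255

x = 0.3333
code = encode_8bit(x)            # the nearest representable level
recovered = decode_8bit(code)
error = abs(x - recovered)

# The recovered value differs from the original by at most half a
# quantization step, i.e. 1/510:
assert error <= 1 / 510
```

A non-uniform distribution of source values (for example, values clustered near black) makes the effective loss worse, because many distinct inputs collapse onto the same level.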

2. Gamut Mapping
Since the most notorious gamut compressions reduce the total number of colours that can be represented, going back to the original values is usually impossible: what has been lost is lost. It is therefore best to delay any step where gamut compression can occur, to avoid reducing the gamut of the data we have. Compression must be done for output to gamut-reduced devices, but the data should be reduced in this way only in the last step.
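The irreversibility can be sketched in a few lines (a hypothetical illustration, treating gamut mapping as simple per-channel clipping, which is only one of several mapping strategies): once two different out-of-gamut colours are mapped to the same in-gamut colour, no later step can tell them apart.

```python
# Hypothetical gamut compression by clipping each channel into [0, 1].

def clip_to_gamut(rgb, lo=0.0, hi=1.0):
    """Clamp every channel of an RGB triple into the target gamut."""
    return tuple(min(max(c, lo), hi) for c in rgb)

wide = (1.2, 0.5, -0.1)          # channel values outside the gamut
clipped = clip_to_gamut(wide)    # (1.0, 0.5, 0.0)

# A different out-of-gamut colour collapses onto the same clipped value,
# so the original cannot be recovered from the result:
assert clip_to_gamut((1.4, 0.5, -0.3)) == clipped
```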

3. White Point/Black Point
The International Colour Consortium (ICC) is now a group of more than 60 companies working on, and agreeing to, a standard profile specification, and implicitly through that, a standard way to perform colour management and colour conversion.
The clarity of these agreements hasn't always been perfect, and there are several places where the specification's explanation and wording could be greatly improved. There is also longer-term work within the ICC to change some of the fundamental assumptions of the current architecture, in hopes of making bigger advances.

4. Bad Arithmetic
Any time one does computation on a computer, one has to worry about loss of significance and magnified error creeping into the computations.
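Two classic examples of this (a sketch added for illustration, not part of the original answer) in standard 64-bit floating point: a tiny quantity can vanish entirely when added to a much larger one, and rounding errors accumulate across repeated operations.

```python
# Loss of significance: a value below the machine epsilon of 1.0 is
# absorbed and then lost when we subtract 1.0 back out.
tiny = 1e-16
assert (1.0 + tiny) - 1.0 == 0.0   # tiny has vanished entirely

# Accumulated rounding error: 0.1 has no exact binary representation,
# so summing it ten times does not give exactly 1.0.
total = sum([0.1] * 10)
assert total != 1.0
assert abs(total - 1.0) < 1e-9     # close, but not exact
```

In a colour pipeline, many small errors like these can compound across successive conversions into visible banding or hue shifts.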

5. Interpolation Errors
Interpolation is the process of estimating a result value intermediate between two known values. For smooth functions it is a very effective way to reduce table sizes when functions are computed by simply looking up the answer in a table. For many functions, a complete table would require an impractical number of entries, so tables with fewer entries are provided and the smoothness of the function is relied upon to make computation of intermediate results meaningful.
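As a sketch of the trade-off (the gamma-like curve `x ** 2.2` and the 17-entry table size are assumptions for illustration): a coarse lookup table plus linear interpolation gives answers that are close to the true function, but not exact, between the sampled points.

```python
def make_table(f, n):
    """Sample f at n evenly spaced points on [0, 1]."""
    return [f(i / (n - 1)) for i in range(n)]

def interp_lookup(table, x):
    """Linearly interpolate f(x) for x in [0, 1] from a table of samples."""
    pos = x * (len(table) - 1)
    i = int(pos)
    if i >= len(table) - 1:        # x == 1.0 lands on the last entry
        return table[-1]
    frac = pos - i
    return table[i] * (1 - frac) + table[i + 1] * frac

def gamma(x):
    return x ** 2.2                # a smooth, gamma-like tone curve

table = make_table(gamma, 17)      # only 17 entries instead of thousands
approx = interp_lookup(table, 0.53)
exact = gamma(0.53)

# Between grid points the interpolated value is close, but not exact:
assert approx != exact
assert abs(approx - exact) < 0.01
```

The residual error between grid points is exactly the "interpolation error" the answer refers to; it shrinks as the table grows, at the cost of storage.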



lydiajane74 answered the question on May 11, 2018 at 18:37

