Professional graphic arts monitors not only support higher bit depths (14 is common) but also a much wider color gamut: usually 100+% of Adobe RGB and 140+% of sRGB (sRGB being what your 'kit' monitor shows).
Any pro-grade printer can easily out-resolve any of this.
The real issue, however, is precision for calculations.
Even though your bank cannot write a check for fractional pennies, if it is competent its precision in all intermediate stages of calculation is much higher (typically 32 bit floating point binary); otherwise cumulative rounding errors will result in real money errors, misplacing real pennies even in processes as simple as compound interest. The results are rounded (sort of) at the end of the calculation, using methods that ensure the bank's books actually crossfoot; otherwise auditors get grumpy. However, never store money (or anything else) as 'float' in any database... trust me on this.
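The cumulative-rounding point is easy to demonstrate yourself. A minimal Python sketch (the amounts are made up; Python's `float` is 64-bit binary, but the failure mode is the same at any binary float width):

```python
# Add one cent ten thousand times, once as binary float and once as Decimal.
from decimal import Decimal

as_float = 0.0
as_decimal = Decimal("0.00")
for _ in range(10_000):
    as_float += 0.01                 # 0.01 has no exact binary representation
    as_decimal += Decimal("0.01")    # exact decimal arithmetic

print(as_float)      # drifts away from 100.0 by accumulated rounding error
print(as_decimal)    # exactly 100.00
```

This is why money belongs in decimal (or integer-cents) types, never binary float, even though intermediate math at higher precision is fine.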
The calculations involved in image processing are much more complex and recursive than any financial calculations.
Any decent editor works in 16 bit precision.
The assertion that some may not care about the difference, while true, does not mean these differences are not real, nor that others should or should not care.
90% of the world's 'photographers' are happy with their cellphones and do not understand why any of us care about any issue on this forum... so what?
Happy New Year .... H
D4 & D7000 | Nikon Holy Trinity Set + 105 2.8 Micro + 200 F2 VR II | 300 2.8G VR II, 10.5 Fish-eye, 24 & 50 1.4G, 35 & 85 1.8G, 18-200 3.5-5.6 VR I SB-400 & 700 | TC 1.4E III, 1.7 & 2.0E III, 1.7 | Sigma 35 & 50 1.4 DG HSM | RRS Ballhead & Tripods Gear | Gitzo Monopod | Lowepro Gear | HDR via Promote Control System |
The issue isn't how many colors we can see, about 10 million on average, but how many colors we need in order to do reasonable post-processing and colorspace manipulations. You can quickly show that with 8-bit color per channel (24 bit color, or 16 million colors) you start to posterize as soon as you start any colorspace manipulations. Laura does a pretty good job here: http://laurashoe.com/2011/08/09/8-versus-16-bit-what-does-it-really-mean/
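The posterization effect Laura demonstrates can be sketched without an image at all: push every level of an 8-bit channel through a tone curve, requantize, and count how many distinct tones survive. This is an illustrative toy (a plain gamma curve standing in for "any colorspace manipulation"), not her actual demo:

```python
# Apply a gamma curve to one channel and count the distinct output tones.
gamma = 2.2

# 8-bit working space: 256 input levels collapse into fewer output levels.
out_8bit = {round(255 * (v / 255) ** (1 / gamma)) for v in range(256)}
print(len(out_8bit))    # fewer than 256 distinct tones -> visible banding

# The same curve in a 16-bit working space keeps far more tones apart.
out_16bit = {round(65535 * (v / 65535) ** (1 / gamma)) for v in range(65536)}
print(len(out_16bit))
```

Every edit re-rounds the values, so in 8-bit the gaps and merges compound with each adjustment.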
If what we talked about is correct, how can any editor see the difference?
It was in response to your comment. And you, my friend, the professional photographer who worked 2 days perfecting that shot in the magazine: sorry, but I looked at your image for 25 seconds before flipping over, so I couldn't detect those few hundred different shades of green you put in there!
I don't know how, but I know from experience that magazine editors can. I think Golf007sd has given the explanation.
If you are a professional, that's the only opinion that matters: the person who writes the cheque.
(I wondered what had happened to our conversation)
All very interesting. I suspect many of us will seek out higher-bit monitors when we purchase new ones. Monitor bit depth could be the new megapixel war!
If I understand it correctly, 24-bit monitors are fine, but we need more bits in the raw file to avoid rounding errors in the processing. It is the same in music: you record at a higher resolution than the final mastered file.
Yes! You are correct. And video editing works the same way. You need to work in a higher resolution and color space so that when you prepare for final output it looks great. If you haven't looked at Laura's page, please do so: http://laurashoe.com/2011/08/09/8-versus-16-bit-what-does-it-really-mean/
@haroldp Any decent editor works in 16 bit precision.
I guess that refers to guys at advert/fashion agencies, print houses, magazines, etc. So the 14-bit files are utilized fully by them, because they can actually see on their screens the color tones they are working on.
For those of us here though, amateur or professional (I believe 99% of whom do their own editing at home and do not own 12/14-bit monitors), the 14-bit file option makes sense only in the expectation that, in the very near future, standard screens will support those bits. Not because they are workable now.
Someone did say in the previous topic that there was more to 14 bits than just color depth - I don't remember who... So what else is there? I read that dynamic range is one, but there seems to be no consensus on that, as some say DR is not related (it is an interesting topic if anyone wants to shed some light on it).
Adobe Lightroom, Photoshop (CS4 and up), and Photoshop Elements (13 is 16-bit for most functions) are all 16-bit capable, and that is their default unless you give them a JPG. Those editors probably account for 80-90% of anyone who uses any editor; add Nikon Capture NX 2 and NX-D, Capture One, DxO, and SilkyPix and you are up to 99%. If those don't work, then GIMP and RPP are free, as is Linux to run them on (they also work fine on Mac); all of them are 16-bit.
It is perfectly valid for you not to care about these increments of quality, without requiring the rest of us to join you in apathy, or convincing yourself that all who do not share your view are delusional.
For reference, my camera equipment is mostly in my signature.
I edit on a NEC PA271W monitor, which has an Adobe RGB color gamut and is 14-bit. It is 27 in., 2560 resolution, and under USD $1,000 in cost. Calibrated with a Spyder 4 ($79).
I edit on a Mac Pro (2013, 6-core 3.5 GHz) using PS CS6, Nikon Capture, or DxO 10, depending on which camera and what I am trying to do.
This is hardly the professional equipment that you would see in the Cosmo editorial rooms, and I would bet it is quite similar to what many of the serious members of this forum use.
Wishing a happy new year to all .... H
D810, D3x, 14-24/2.8, 50/1.4D, 24-70/2.8, 24-120/4 VR, 70-200/2.8 VR1, 80-400 G, 200-400/4 VR1, 400/2.8 ED VR G, 105/2 DC, 17-55/2.8. Nikon N90s, F100, F, lots of Leica M digital and film stuff.
+1 to @haroldp. I just started using a 4K UHDTV as my primary monitor. It's 12-bit capable, but I'm only driving it at 10 bits right now. I now have 2 times as many colors as I used to, but I am looking forward to 4x the colors.
For those of us here though, amateur or professional (I believe 99% of whom do their own editing at home and do not own 12/14-bit monitors), the 14-bit file option makes sense only in the expectation that, in the very near future, standard screens will support those bits. Not because they are workable now.
Incorrect. Without editing in a 16-bit space, you will suffer image degradation quite quickly. Please actually read Laura's blog that I posted above and you will understand why.
Someone did say in the previous topic that there was more to 14 bits than just color depth - I don't remember who... So what else is there? I read that dynamic range is one, but there seems to be no consensus on that, as some say DR is not related (it is an interesting topic if anyone wants to shed some light on it).
Also incorrect. In a RAW file, each pixel is represented by a 12- or 14-bit value. This information is all that is needed to reproduce a picture.
Harold & Ironheart, you've proved me wrong in my assumption that 99% here would not own 12/14-bit monitors :-S . I must admit I was carried away by the fact that the Apple 27", which I always thought was "the" monitor for the pro photographer, is limited to 8 bits.
Ironheart, I do get the fact that 14-bit offers more leeway in editing - I did mention that before. I was thinking more about the part where you edit colors without actually being able to see them (on an 8-bit screen)... I can't help but agree with you that 14 bits gives guys like you more room to play in post-processing...
Also incorrect. In a RAW file, each pixel is represented by a 12- or 14-bit value. This information is all that is needed to reproduce a picture.
This I don't get - what am I incorrect on/about, when I am expressing 2 opposing opinions, neither of which is mine...?
Paperman: It is easy to see how one could come to that conclusion.
Apple monitors in general are optimized for movie/TV viewing, since many people no longer have a separate TV and stream their video.
I would actually like a Thunderbolt monitor, since I use a Mac Pro, which is basically built around Thunderbolt, but Apple's monitors that I have seen are not up to the best graphics monitors in several respects, color gamut being the most important.
Real commercial editors will use monitors that emulate the CMYK gamut to best 'soft proof' before letting the presses run. These are hideously expensive, by any standard.
They typically want the raw files from photographers because they do not trust us (for good reason) to profile the offset presses they use.
That being said, for my personal use I do not work too hard to try to get 'accurate' color, but strive for believable color.
When running a fashion advert for several million copies however, the fabric color had better be 'right'.
We are all on this forum trying to learn.
Happy New Year ... H
D810, D3x, 14-24/2.8, 50/1.4D, 24-70/2.8, 24-120/4 VR, 70-200/2.8 VR1, 80-400 G, 200-400/4 VR1, 400/2.8 ED VR G, 105/2 DC, 17-55/2.8. Nikon N90s, F100, F, lots of Leica M digital and film stuff.
Yep, continuing to learn, Harold... Thanks for the clear picture (& for shattering my plans on the Mac :-) )
Just read Nasim's blog (https://photographylife.com/what-is-30-bit-photography-workflow) - thanks, Knockknock... Quite some surprises for me there.
It lists the conditions/hardware one needs for a 10-bit (it calls it 30-bit) workflow. Man, I thought it was limited to the monitor!! It says the Mac OPERATING system won't even support it - one has to have Windows!! TRUE 10-bit monitors are rare and VERY EXPENSIVE (as you said), and only a few are out there (ever heard of Eizo??). The other so-called 10-bit monitors on the market are actually 8-bit plus something else (sorry, Ironheart), and so forth...
Which brings me to the question... What happens if you edit a 10/12/14-bit image on hardware (like a Mac or Windows Vista) and software (like Lightroom) that does not support it? Do you still keep the advantage of 'having more leeway to edit' and just put up with not being able to see the color depths of 10/12/14 bits, or is it just like working in 8 bits?
Ironheart, I do get the fact that 14-bit offers more leeway in editing - I did mention that before. I was thinking more about the part where you edit colors without actually being able to see them (on an 8-bit screen)... I can't help but agree with you that 14 bits gives guys like you more room to play in post-processing...
Also incorrect. In a RAW file, each pixel is represented by a 12- or 14-bit value. This information is all that is needed to reproduce a picture.
This I don't get - what am I incorrect on/about, when I am expressing 2 opposing opinions, neither of which is mine...?
Heh, sorry, I will try to be more clear :-/. The 12/14 bits per pixel represent all of the information available, and that includes DR and everything else. I'm skipping over a ton of technical details, like the demosaicing process, etc... But once you get to the point where you have your RGB values on a per-pixel basis, the DR is "contained" within those values. I'll take a simple example, using 8-bit numbers for simplicity's sake (happy new year!):
In RGB, black is 1,1,1 (I'm skipping zero on purpose, as sometimes it is reserved for transparency)
50% gray is 127,127,127
Bright white is 255,255,255
You can easily guess that 25% gray would be 64,64,64, etc...
Thinking DR for the black, gray, and white values, there would be a dynamic range of 255 steps.
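One way to picture "the DR is contained within those values" is to count stops: each doubling of the brightest-to-darkest ratio is one stop. This is a back-of-envelope sketch assuming a purely linear encoding (real cameras and displays apply curves, so these are not sensor specs):

```python
# Stops of dynamic range a linear N-bit code could span, at most.
from math import log2

def linear_stops(bits):
    brightest = 2 ** bits - 1    # e.g. 255 for 8-bit
    darkest = 1                  # dimmest nonzero code
    return log2(brightest / darkest)

for bits in (8, 12, 14, 16):
    print(f"{bits}-bit linear: about {linear_stops(bits):.1f} stops")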
... Which brings me to the question... What happens if you edit a 10/12/14-bit image on hardware (like a Mac or Windows Vista) and software (like Lightroom) that does not support it? Do you still keep the advantage of 'having more leeway to edit' and just put up with not being able to see the color depths of 10/12/14 bits, or is it just like working in 8 bits?
Internally, Photoshop and Lightroom work in 16 bits; it doesn't matter what the display does. You can override this and work in 8-bit, or any other depth, but you would need a good reason to do so.
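A toy round trip shows why the working depth matters even with an 8-bit display: darken one stop, then brighten one stop back, once with an 8-bit intermediate and once with a 16-bit one. The pipeline below is illustrative, not how Photoshop actually computes:

```python
# Hypothetical edit pipeline: 8-bit source -> working depth -> 8-bit display.
def edit(levels, working_bits):
    working_max = 2 ** working_bits - 1
    scale = working_max / 255
    out = []
    for v in levels:
        w = round(v * scale)                      # promote to working depth
        w = round(w * 0.5)                        # darken one stop
        w = min(working_max, round(w * 2.0))      # brighten one stop back
        out.append(round(w / scale))              # requantize for display
    return out

src = list(range(256))
print(len(set(edit(src, 8))))     # many levels merge: information destroyed
print(len(set(edit(src, 16))))    # all 256 levels survive the round trip
```

The display never shows more than 8 bits in either case; the 16-bit intermediate just stops the edits themselves from destroying levels.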
Which brings me to the question... What happens if you edit a 10/12/14-bit image on hardware (like a Mac or Windows Vista) and software (like Lightroom) that does not support it? Do you still keep the advantage of 'having more leeway to edit' and just put up with not being able to see the color depths of 10/12/14 bits, or is it just like working in 8 bits?
Paperman
Mac OS X definitely supports all editing operations at whatever bit depth the tools will use (currently all consumer-available apps are 16-bit).
The 8-bit limitation people are talking about is that of the DVI and Thunderbolt video outputs to the monitors.
Graphics monitors try to compensate for this by using a LUT (look-up table) to interpolate denser bit patterns.
You are perfectly safe using a Mac for editing and manipulation and will not lose any bit depth, except on the final monitor display output, where it matters the least.
You can print 16-bit from a Mac if your print driver can handle TIFFs.
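The monitor LUT idea can be sketched as a toy: each 8-bit input code indexes a table of higher-precision output levels, letting the panel place the 256 steps more accurately on its native response curve. The gamma value and bit depths here are illustrative, not any particular monitor's:

```python
# Build a toy 8-bit-in, 14-bit-out monitor look-up table.
def build_lut(gamma=2.2, in_bits=8, out_bits=14):
    out_max = 2 ** out_bits - 1
    in_max = 2 ** in_bits - 1
    # one high-precision output level per possible input code
    return [round(out_max * (i / in_max) ** gamma) for i in range(2 ** in_bits)]

lut = build_lut()
print(len(lut), lut[0], lut[-1])   # 256 entries spanning the 14-bit range
```

The extra output precision doesn't add input colors; it just lets the monitor render each of the 256 incoming steps without its own rounding errors compounding the 8-bit link.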
.... H
D810, D3x, 14-24/2.8, 50/1.4D, 24-70/2.8, 24-120/4 VR, 70-200/2.8 VR1, 80-400 G, 200-400/4 VR1, 400/2.8 ED VR G, 105/2 DC, 17-55/2.8. Nikon N90s, F100, F, lots of Leica M digital and film stuff.
I am still catching up on all this, but in plain English, with all the technical jargon aside: the color, hue, saturation, delta, gamma, etc. - all the data needs to be stored in bits. We know that for a fact.
We also know that each person's eyesight is going to be marginally different; no one has 100% perfect eyesight, and eyes are analog.
Dynamic range may not directly be affected by bits or bit depth, but that color value has to be stored somewhere... presumably in a "bit" (I know this is not correct, but you know what I mean).
If I have 24 bits of red, 24 bits of green, 24 bits of blue, 24 bits of gamma, 24 bits of depth in delta, and so on...
the "place holders" for that color data are there... whether the value is 0 or 16 billion or some other value.
I get that monitors that can display that level of "color fidelity" are either super expensive or don't exist at the moment, but there is no reason 32-bit color should not be around now.
I remember using Photoshop on an Apple Macintosh SE/30 with floppy drives... 4-bit, 8-bit, 16-bit.
I remember when the first "32-bit" graphics cards came out... 32-bit graphics cards on 16-bit PCs and Macs...
There is no reason, with 20 years of "32-bit" graphics cards, that monitors, still cameras, and video cameras should not be producing 32-bit color images with dynamic ranges we never thought possible.
That detail was there in film... it was analog. Part of the difference now is that we have storage options for files with that much data in them.
As for the music example, I have taken an MP3 ripped from a CD at 96 kbit/s, re-ripped the track at 320 kbit/s, and then did a third rip of the track using 24-bit FLAC at 2000 kbit/s, put them on an iPod with the cheap Apple headphones, and people asked if it was a different song because they could tell the difference.
I'm certainly not an audiophile, but 96 kbit/s MP3s often sound like shit. It depends largely on the music, though. 192 kbit/s is good enough for most kinds of music. I rip my music as OGG with variable bitrate (average 200 kbit/s).
I would expect the difference to be similar between a 12-bit, 14-bit, 16-bit, or say a 24-bit raw file for a photo.
When it comes to cameras, the only time you actually need 14 bits is when shooting at the lowest possible ISO with a full-frame camera. Other than that, the DR fits nicely inside 12 bits.
Nikon D7100 with Sigma 10-20 mm, Nikon 16-85 mm, Nikon 70-300 mm, Sigma 150-500 mm, Nikon 28 mm f/1.8G and Nikon 50 mm f/1.8G. Nikon1 J3 with 10-30 mm and 10 mm f/2.8
I don't consider myself an audiophile either, and everyone's hearing is different; no two people are the same. Some people hear the difference and some don't.
My dad, at over 70 years old, is experiencing hearing loss. He has an iPod with the standard Apple ear buds; I plugged in a set of Shure SE530 ear buds, which most people would find overpriced and a luxury item.
He could not believe the difference. He had no idea what kind of ear buds they were, what they cost, etc...
He is not an "audiophile". I have had similar experiences with other people.
Then you take a 96 kbit/s MP3, which is missing so much data/so many bits/whatever we want to call it, and play the same song as a lossless, uncompressed 24-bit 2000 kbit/s file, and they will hear things they did not in the MP3. Even with the cheap iPod headphones.
Saying you don't hear a difference is just as funny as saying you do, to people on either end of the spectrum.