So I'd be cautious: the issue could come from the web players or from the lighting of the editing environment, but something is almost certainly not set up correctly somewhere in the chain. IMHO.
All fine and good points... and understood.
Here's my question -> why, in 2012, is this so complicated? With stills I can shoot raw and convert the file into a known color space. I can shoot into a known color space. I can convert from one color space to another based on the end use of the image. I can assign a color space if an image profile is missing. I can pretty much predict how an image will look and get consistency in that look as it moves across platforms.
Why no equivalent with video? Why can't support at Vimeo, YouTube, etc., provide an answer or a workflow that ensures what you see in your NLE is what is seen on their hosting platforms (assuming it's viewed on the same monitor, etc.)? With all the gurus out there leading seminars, blogs, workshops, and bootcamps for the masses, why can't anyone address this issue in a clear manner? With all of the blogs that descend on NAB, CineGear, and PhotoPlus and take the time to shoot and edit reports on equipment, why can't one stop for a moment and provide a roadmap for ensuring consistent color & tone from capture to output with video? How is it that Hollywood movie trailers look great on a PC, a Mac, an iPhone, Vimeo, YouTube, etc.? What's the secret sauce? Are they providing customized versions for each?
If the answer is a higher-end NLE or a customized computer setup, why is that not definitively stated? e.g., if you edit in ____ NLE, on ____ computer system, and avoid QuickTime, then your hosted film on Vimeo, YouTube, etc. will always match the film as seen in the NLE.
Looking at Sareesh's comments above:
"What are the values of color bit depth, color space, encoded gamma, compression codec, chroma subsampling, viewing LUT, display gamma for the following:
1. Source footage (+intermediate codec if used)
2. Project settings
3. Display card+monitor+connection protocol
4. Render settings+output file
5. Vimeo's settings "
For most of us #1 is set (e.g. H.264 on a Canon, MXF on a C300, etc.). Depending upon the NLE, #2 may be set. #3 is usually set as well (other than calibrating your monitor), and #5 is set. "Set" meaning it's locked in and we have no control over it.
So, the only step where we have control over the parameters is #4. How should it be set differently? For Vimeo I follow their suggested settings but get the results the OP relates. Consistently. Should the gamma in #4 be altered? If that's the case then why doesn't someone who knows and understands what is going on under the hood with steps #1-#5 just say that?
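To make the "should the gamma be altered?" question concrete, here is a toy sketch (my own illustration, not something from this thread) of why a gamma mismatch anywhere in steps #1-#5 shows up as a visible brightness shift. It uses simple power-law curves; the real Rec.709 and sRGB transfer functions are piecewise, but the effect is the same:

```python
def encode(linear, gamma):
    """Simple power-law encode: scene light -> code value."""
    return linear ** (1.0 / gamma)

def decode(code, gamma):
    """Simple power-law decode: code value -> displayed light."""
    return code ** gamma

mid_grey = 0.18  # 18% scene reflectance

# Footage encoded assuming a gamma-2.2 display, but decoded by a
# player assuming gamma 1.8 (the classic old-Mac/QuickTime mismatch):
code = encode(mid_grey, 2.2)
shown = decode(code, 1.8)
print(f"encoded for 2.2, shown at 1.8: {shown:.3f} vs intended {mid_grey:.3f}")
# Mid-grey comes back noticeably brighter than intended, i.e. the
# familiar "washed out after upload/playback" look.
```

The point is that no export setting in #4 can fix this for every viewer at once; if the encoded gamma tag and the player's assumed display gamma disagree, the picture shifts.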
Or should you edit in your NLE, create a duplicate project at the end, then look at the waveform/parade and set the range between certain IRE levels... and will doing that ensure an accurate rendition on third-party hosting services?
The questions above are meant to be rhetorical; you need not answer them. It just seems like the world of color management with HD video is the Wild West and no one is providing a way out.
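For what it's worth, the waveform/IRE range-setting idea above largely comes down to levels scaling. A minimal sketch (my own, assuming 8-bit Rec.709 conventions): full-range code values (0-255) map to "legal" video range (16-235 for luma). If one step in the chain applies this scaling and another step applies it again, or skips it, blacks lift or whites crush, which looks exactly like the NLE-vs-Vimeo mismatch being described:

```python
def full_to_legal(v):
    """Map an 8-bit full-range luma value (0-255) to legal range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def legal_to_full(v):
    """Inverse mapping: legal range (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / (235 - 16))

# A single, correctly matched round trip is lossless at the endpoints:
print(full_to_legal(0), full_to_legal(255))   # black and white land at 16 and 235
print(legal_to_full(16), legal_to_full(235))  # and map back to 0 and 255

# But if the scaling is applied twice (e.g. by the NLE export AND the
# player), black is no longer black:
print(full_to_legal(full_to_legal(0)))  # lifted well above 16
```

So setting the waveform between the "right" IRE levels only helps if every downstream step interprets the range the same way; it isn't a universal fix on its own.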