Jeremy, give me a day or two to get some evidence together for you. I'm in the midst of a string of 16-hour days here. But do note that my ETTR advice isn't a yes/no thing; it's a matter of knowing how much additional margin you need to give the exposure depending on the color.
There are six basic colors we need to consider: Red, Green, Blue, Cyan, Magenta and Yellow. Along with those we have combined luminance. None of these clip at the same point.
Consider what happens with the color yellow. By the time you bring yellow to its maximum value, you've most likely clipped the red sensels without knowing it. Lexa on the RawDigger blog happened to write about this two days ago, so this isn't just me talking. The problem is that if you look at the red-channel histogram in the camera or in a typical raw converter, you won't see that the red sensels clipped, because the histogram is built from the processed image, and the processed yellow shows no clipping.
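To make this concrete, here's a toy sketch of the mechanism. All the numbers (saturation point, white-balance multipliers, output scaling) are invented for illustration and are not any real camera's calibration; the point is only that raw red can be at saturation while every processed value stays below the top of the histogram.

```python
# Toy illustration: a bright yellow patch can clip the red sensels
# in the raw data while the processed (white-balanced, scaled) image
# shows no clipping at all. All numbers are made up for this sketch.

RAW_SAT = 4095                          # hypothetical 12-bit raw saturation
WB = {"R": 2.0, "G": 1.0, "B": 1.5}     # hypothetical white-balance multipliers
EXPOSURE_SCALE = 0.4                    # hypothetical tone/brightness scaling

def processed_value(raw, channel):
    """White-balance and scale one raw sensel value to an 8-bit output level."""
    return min(255, round(raw / RAW_SAT * WB[channel] * 255 * EXPOSURE_SCALE))

# A bright yellow patch: lots of red and green light, little blue.
raw = {"R": 4095, "G": 3000, "B": 400}  # the red sensel is AT saturation

clipped_in_raw = {ch: v >= RAW_SAT for ch, v in raw.items()}
out = {ch: processed_value(v, ch) for ch, v in raw.items()}

print("raw clipping:", clipped_in_raw)  # red is clipped in the raw data
print("processed 8-bit values:", out)   # yet nothing reaches 255 here
```

So the in-camera histogram, which looks at values like `out`, reports a clean exposure even though the raw red channel is already gone.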
So, what about Olympus and Panasonic? Instead of three types of sensels with varying saturation points, you have four that feed the conversion matrix. (Some algorithms use just three, and those produce artifacts with four-color arrays.)
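Here's a minimal sketch of what a four-channel conversion looks like. The 3x4 matrix and the raw values below are hypothetical, invented purely to show the shape of the operation; real cameras use calibrated matrices, and the two green sensels (G1, G2) may respond slightly differently.

```python
# Toy sketch: converting four raw channels (R, G1, G2, B) to output RGB
# with a 3x4 matrix. The matrix values are invented for illustration.

# Rows produce output R, G, B; columns weight raw R, G1, G2, B.
M = [
    [ 1.6, -0.2, -0.2, -0.2],
    [-0.3,  0.8,  0.8, -0.3],
    [-0.1, -0.2, -0.2,  1.5],
]

def convert(raw4):
    """Apply the 3x4 matrix to one (R, G1, G2, B) quad of raw values."""
    return [sum(m * v for m, v in zip(row, raw4)) for row in M]

# G1 and G2 differ slightly here; a 3x3 algorithm that averages them
# into a single "G" first throws that difference away, which is one
# way artifacts appear on four-color arrays.
rgb = convert([0.5, 0.42, 0.38, 0.2])
print([round(c, 3) for c in rgb])
```

A three-channel algorithm effectively collapses the G1 and G2 columns into one before converting, so any information in their difference is lost.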
There are two topics of discussion here: one is specific to Olympus/Panasonic sensors, and the other has to do with derived colors. The two topics merge on a single point: no two types of cameras behave the same.
Ken