I suspect that once everyone agrees on how to map the so-called gamut of a camera, we'd find large areas of each that can't plausibly be claimed either way. And if a device captures data we can't see, is it really producing colour, and is it something we should plot at all?
The ProPhoto RGB primaries were chosen to include the gamut of real-world surface colours without taking in large areas of encoding space outside the spectrum locus (as happens with CIE XYZ).
Some primaries have to be a compromise; otherwise, how could we justify where the blue primary falls, other than by the need for a really big space?
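To make the compromise concrete, here's a minimal sketch of the standard derivation of an RGB-to-XYZ matrix from the published ProPhoto (ROMM) primaries and its D50 white point. Note the blue primary's y chromaticity of 0.0001: it sits essentially on the x-axis, well outside the spectrum locus, i.e. it's an imaginary "colour" chosen purely to enclose the space.

```python
import numpy as np

# ProPhoto RGB (ROMM RGB) chromaticities and D50 white point.
xy = {"R": (0.7347, 0.2653), "G": (0.1596, 0.8404), "B": (0.0366, 0.0001)}
wp = (0.3457, 0.3585)  # D50

def xy_to_XYZ(x, y):
    # Chromaticity -> tristimulus with Y normalised to 1.
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

# Columns are the unscaled XYZ of each primary.
P = np.column_stack([xy_to_XYZ(*xy[c]) for c in "RGB"])
W = xy_to_XYZ(*wp)

# Scale each primary so that RGB = (1, 1, 1) maps to the white point.
S = np.linalg.solve(P, W)
M = P * S  # RGB -> XYZ matrix

print(M)
```

Running this reproduces (to rounding) the ROMM RGB matrix published in the spec; the same recipe works for any RGB space defined by three chromaticities and a white point.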