Let's please stop focusing on the monitor for a second and focus on exposure and the print.
I understand your frustration here, and I believe I commented on this aspect in my second post. Since you indicated earlier in the thread that your monitor is showing a reasonable approximation of the real world, it's reasonable to assume that a difference between your monitor and the print is similar to the difference between the real world and your print.
Do we or don't we agree on the following? If I take a correctly exposed shot of objects against a wall, immediately print that shot without any post-processing on reasonably white, reflective paper (whatever the right term is, I hope you understand what I'm trying to illustrate), and immediately put that print next to those same objects on the same wall under the same light, should I expect the print to look close to the scene in brightness, rather than _significantly_ darker?
I believe everyone in this thread is saying, and agreeing, that it will be significantly darker, where "significantly" can mean as much as a full stop (or more) of exposure adjustment required.
My second blog entry is probably the most useful one to look at, and it has nothing to do with monitors. Each histogram in that entry is from an image that one of my clients insisted was properly exposed and didn't need exposure adjustments prior to printing. In every case they were incorrect, and to get a print they were happy with, they had to make exposure adjustments (including +1.25 stops in one case!) to match their expectations.
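For anyone wondering what an adjustment like "+1.25 stops" actually does to the pixel data: each stop is a factor of 2 in linear light, so the correction is just a power-of-two multiplier applied before gamma encoding. Here's a minimal sketch (my own illustration, not code from the blog post, and real raw converters do this with more sophistication):

```python
import numpy as np

def apply_exposure(linear_rgb, stops):
    """Brighten (or darken) a linear-light image by a number of stops.

    One stop = a factor of 2 in linear light, so +1.25 stops
    multiplies every pixel value by 2**1.25 (about 2.38).
    Values are clipped to the [0, 1] range afterward.
    """
    return np.clip(linear_rgb * (2.0 ** stops), 0.0, 1.0)

# An 18%-gray patch (middle gray in linear light), pushed +1.25 stops:
patch = np.full((2, 2, 3), 0.18)
brighter = apply_exposure(patch, 1.25)
```

That 2.38x multiplier is why the difference between the adjusted and unadjusted prints is so obvious side by side.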
If I could invite you over to my office for a minute, I could show you side-by-side comparisons of this. I have example prints of each image mentioned in the blog article printed 2-up on a single sheet of paper, one without the exposure adjustment and one with. It's a shocking difference and really makes the point clear.
You have to adjust your images. That's the joy of printing on paper.