One reason I even popped into this discussion is that the OP is making some assumptions which I don't believe hold up. Retina displays are not new, and I have been using one in the field and elsewhere for nearly two years now.
I think your main argument against 4/5k is that you'd rather use it as a 2.5k display! If that's the case, then it seems sensible to say that you'd be better off with a 2.5k display.
One of the main consequences of using a HiDPI display is that there really isn't a native resolution anymore, in the sense that the pixel density is so fine the native resolution is effectively unusable for the interface. That's just how they work. The advantage is that certain elements can exploit the increased density for improved clarity. I'm not sure I see a real advantage for photographers who are after the best, because gamut and color reliability are much bigger issues. I'm not giving up my NEC 302w until someone (probably not Apple) comes up with a 4K or 5K display that can equal its gamut and rendering. Now, if I were into 4K video, that would probably be a different story.
However, if the 5k iMac screens are just a doubling in each direction, all a person has to do is view any image at 200% and they're emulating the 2.5k display.
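To sanity-check the doubling claim, here's a minimal Python sketch. The panel figures (a 27" 5120x2880 panel vs. a 27" 2560x1440 panel) are my own assumed numbers for illustration, not anything from Apple or Adobe:

```python
import math

def panel_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of given pixel dimensions and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def on_screen_width_in(image_px, zoom, ppi):
    """Physical width (inches) an image spans on screen at a given zoom factor."""
    return image_px * zoom / ppi

# Assumed panels: 27" 5K (5120x2880) vs. 27" 2.5k (2560x1440)
hidpi = panel_ppi(5120, 2880, 27)   # roughly 218 PPI
lodpi = panel_ppi(2560, 1440, 27)   # roughly 109 PPI

# An 800 px wide image at 200% on the 5K panel spans the same
# physical width as the same image at 100% on the 2.5k panel.
print(on_screen_width_in(800, 2.0, hidpi))  # ~7.35 in
print(on_screen_width_in(800, 1.0, lodpi))  # ~7.35 in
```

Because the 5K panel has exactly twice the linear pixel density, the two widths come out identical, which is the sense in which 200% "emulates" the 2.5k display.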
Yeah, it seems pretty simple, but when I first had the computer I resized some images to web size in Photoshop, used the 200% view to judge final sharpening, then saved that out as a JPEG at quality 12 and opened the image in a browser. There seemed to be some very slight anomalies in the Photoshop view when comparing to the web view. Probably I was just looking too hard, but it may come from running the Retina display not at 50% of native size but at a smaller percentage than that, something I believe most people do and will also do on the new Retina display (at that scale the menu bar would be 0.2 inches tall).
I prefer to emulate the previous MacBook Pro glossy screen by using 1680x1050, which was the native resolution of the previous MacBook I had (the native here is 3840x2400, so 50% size is 1920x1200). But you are correct that within Photoshop the image area operates as you describe, so a 200% view should be pretty straightforward. This means working on a Retina display in Photoshop is similar to working on other displays at 50%, but perhaps the biggest benefit is that output sharpening, which was always best judged at 50% in Photoshop before, is now judged pretty effectively at 100% on Retina displays.
The argument about not being able to see the cursor/panels/menus is surely a red herring, because presumably the makers of any OS intended for use with a high-res screen will simply enlarge the number of pixels making up those cursors, panels, and menus, and again you're back where you started (cf. Retina iPads).
As I mentioned, this already happens and is a complete non-issue: you choose the resolution you want the interface to work at, but an application can still leverage the full pixel density for elements within it. Both Photoshop and Lightroom already do this.
Sure, it will need a new way of organising OS elements, but as Butch pointed out, the main thrust of my argument is that high-res displays will surely necessitate a better and faster LR, and I'd hope that this comes sooner rather than later.
The other part is that, as things currently stand, it seems to me LR has an upper limit for grid images that is measured in pixels. If those pixels are being used at a higher density, then Grid mode becomes crippled, because it will only show very small images (perhaps so small that you'd see 10 to a row at 5k). Because Grid mode is probably the primary mode for many people, using LR on a 4/5k display may, as things currently stand, be a step backwards for many of them.
All of this is on the assumption that you choose to use the display at an incredibly high resolution for everything. Even with good glasses, I have to get to about 15” from the screen of the MacBook Pro to have any hope of clicking on many elements. The Mac menu bar is 22 pixels high, and at full resolution this means on the new iMac it would be 0.10 inches high. And yes, the grid will be too small, with too many pictures. But then the entire thing is unusable anyway. The solution already exists, and LR doesn't need to figure out how to enlarge its interface elements: that happens by simply changing the interface resolution in the OS. This also solves the problem of how many images show up in the grid, because the grid itself works based on the interface resolution. Whether the image area within each grid cell works at the full display resolution I don't know, but I don't think it matters, because the only real time the full resolution would be useful is in areas within LR where you are at 100%, and LR already does this.
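The menu-bar arithmetic above is easy to reproduce. A minimal sketch, assuming the 27" Retina iMac panel is 5120x2880 (these dimensions are my assumption for illustration):

```python
import math

# Assumed 27" Retina iMac panel: 5120 x 2880 pixels
WIDTH_PX, HEIGHT_PX, DIAGONAL_IN = 5120, 2880, 27

# Pixels per inch, computed from the diagonal pixel count
ppi = math.hypot(WIDTH_PX, HEIGHT_PX) / DIAGONAL_IN   # roughly 218 PPI

# A 22 px menu bar rendered at full native resolution
menu_bar_in = 22 / ppi
print(f"{menu_bar_in:.2f} in")  # prints "0.10 in"
```

At roughly 218 PPI, any element sized in raw pixels shrinks to about half its familiar physical size, which is exactly why the OS-level interface scaling matters more than anything LR itself could do.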
Now, I don't have a 27” Retina iMac, and certainly there may be users who will choose to use a finer resolution than the current iMac's (I do so on my MacBook Pro), so maybe the Lightroom team will tweak the grid a little for new, larger HiDPI displays like the new iMac.
To try and sum up my point of view: whatever Lightroom's performance issues are (and certainly there seem to be areas which can be improved), I don't believe they are affected by the use of a Retina display, and users of a Retina display don't/won't see any real difference in usability. Retina displays are not new, and the issues mentioned do not seem to be plaguing those currently using them.