Oddly enough, your best bet for predicting these things may be to explain the details to a bunch of domain experts, ask them to make their best guess, and average the results. There are some pretty deep reasons why this works unreasonably well.
Here's a story from the Cold War: a US submarine went down someplace in the Atlantic. Information was sketchy, but there was a little data. Some clever bugger shared the data with a bunch of submarine guys and asked them where they thought the wreck was. The answers clustered around two locations, let's call them A and B.
So this clever bugger looked roughly halfway between A and B, where exactly zero of the experts had guessed, and by god the wreck was right there, within some idiotically tiny margin, a few hundred yards or something.
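If you want to see the arithmetic behind that, here's a toy sketch (the coordinates are entirely made up, nothing to do with the actual search): two camps of experts cluster around A and B, and the plain average of everyone's guesses lands in the no-man's-land between them, at a spot nobody individually picked.

```python
import random

random.seed(0)

# hypothetical cluster centers, in arbitrary map units
guess_a = (10.0, 40.0)
guess_b = (30.0, 60.0)

# each expert sides with one camp and scatters a bit around its favorite spot
experts = []
for _ in range(20):
    cx, cy = random.choice([guess_a, guess_b])
    experts.append((cx + random.gauss(0, 1.5), cy + random.gauss(0, 1.5)))

# the crowd's estimate is just the mean of all the guesses
mean_x = sum(x for x, _ in experts) / len(experts)
mean_y = sum(y for _, y in experts) / len(experts)

print(f"cluster A: {guess_a}, cluster B: {guess_b}")
print(f"average of all guesses: ({mean_x:.1f}, {mean_y:.1f})")  # lands between the camps
```

Whether the real search used a flat average or something fancier, the punchline is the same: the aggregate can sit where no single expert does.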
Economies, climatological systems, ecologies, agricultural systems: all these things are insanely hard to model. The smallest accurate model of the damned thing is generally the thing itself. Chaos, dynamical systems, blah blah blah. There are reasons. You can make fairly accurate short-term predictions, and you can make some broad-strokes guesses in the longer term, but those tend to amount to "my best guess is X, but honestly? Anything is possible."
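To put a number on "chaos, dynamical systems, blah blah blah": in a system where errors compound, nudge the starting point by a hair and the forecast stays usable for a little while, then falls apart completely. A minimal sketch, using the logistic map as a stand-in (my choice of toy, not a model of any economy or climate):

```python
# logistic map at r = 4: a one-line system that's fully chaotic
def step(x, r=4.0):
    return r * x * (1.0 - x)

# two starting states that differ by one part in a million
a, b = 0.400000, 0.400001

for n in range(1, 31):
    a, b = step(a), step(b)
    if n in (1, 5, 10, 20, 30):
        print(f"step {n:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```

The gap roughly doubles each step, so the two runs agree nicely for the first handful of steps and have nothing to do with each other by step twenty or so. That's the shape of the problem: short-term prediction works, long-term prediction degrades into "anything is possible."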
Note, however, that "well, we can't reliably predict what the result of <massive change> will be" isn't a justification for making that massive change. Normal people think that's actually a worse scenario than being able to predict the result. But the climate change debate isn't based on rational argument or reasoning of any kind. It's entirely political, and anything that looks like an argument is just an attempt to rationalize a position held for purely social and emotional reasons.