> I think it is very easy: just use a RAW developer that only applies white balance and demosaicing, e.g. DCRAW, or RawTherapee and ACR with all settings at 0.
There are no true "zeros" in a raw processor/viewer. Every "zero" or default slider position is an active choice made by an image-processing engineer. There are some established standards (e.g. what RGB value middle gray should map to) and some loosely followed conventions (e.g. how strong an s-curve to apply to the linear output of the sensor), but both the goals and the methods of reaching them are entirely up to the developers of the processor/viewer; they are proprietary, often patented, and vary from one product to another in surprisingly large ways.
What's worse, I feel this misconception hurts photographers. In the days of film, photographers would select a film and a processing lab based on the subject, the intention of the shoot, client needs, and intended style; that is, they'd pick them for their look (color, tonality, range, grain, etc.) and technical attributes. Now photographers often shoot digital raw files starting from the assumption that the "default" settings (whether the in-camera LCD/JPEG preview or their preferred raw processor's defaults) are a sort of "truth".
One negative impact of this mentality is that photographers assume different raw processors are just different interfaces onto the same one "true" image, when in fact even the "default" look can vary considerably between processors. Everything about the image can differ from one processor to the next: the amount of absolute detail, the amount and structure of the grain, the way tonal transitions are rendered, the way highlights and shadows roll off (i.e. the placement, severity, and shape of the toe and shoulder), and the color of the image. The fact is that those "default" settings and "0" slider positions are largely arbitrary.
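A minimal sketch of how two equally defensible "zeroed" renderings can diverge: below, two tone curves are applied to the same linear sensor values. The sRGB transfer function is a genuine published standard; the "filmic" variant and its 0.08 strength are invented purely to illustrate how one engineer's default can bake in a different toe and shoulder than another's.

```python
import numpy as np

def srgb_gamma(x):
    # The standard sRGB transfer function (a real published standard).
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

def filmic_default(x):
    # A hypothetical competing "default": the same gamma plus a gentle
    # s-curve (deeper toe, brighter highlights). The shape and the 0.08
    # strength are invented for illustration only.
    s = srgb_gamma(x)
    return s - 0.08 * np.sin(2 * np.pi * s)

linear = np.array([0.01, 0.05, 0.18, 0.50, 0.90])   # linear sensor values
print(np.round(srgb_gamma(linear), 3))      # processor A's "zeroed" output
print(np.round(filmic_default(linear), 3))  # processor B's "zeroed" output
```

Both curves map black to black and white to white, yet every tone in between lands in a different place; neither rendering is more "true" than the other.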
Moreover, the math inside any given processor varies from one camera to the next: both the slider positions apparent to the user and the fundamental behind-the-scenes math. For example, Capture One does not apply the same underlying math to an IQ180 raw as it does to a Canon G10 raw, and it would be foolish to do so; the sensors (and the characteristics of the data they put out) are radically different and demand to be treated differently.
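To illustrate that per-camera dispatch, here is a hypothetical sketch of how a converter might key its math off the camera model. The structure is the point; the matrices, gamma values, and numbers are invented for illustration and are not Capture One's actual internals.

```python
import numpy as np

# Invented per-camera calibration data. Real converters ship tables like
# this (camera-to-working-space matrices, base curves, noise models);
# these particular numbers are made up.
CAMERA_PROFILES = {
    "Phase One IQ180": {
        "color_matrix": np.array([[ 1.79, -0.61, -0.18],
                                  [-0.22,  1.46, -0.24],
                                  [ 0.04, -0.37,  1.33]]),
        "base_curve_gamma": 1 / 2.3,
    },
    "Canon G10": {
        "color_matrix": np.array([[ 2.01, -0.83, -0.18],
                                  [-0.15,  1.38, -0.23],
                                  [ 0.07, -0.45,  1.38]]),
        "base_curve_gamma": 1 / 2.1,
    },
}

def develop(rgb_linear, camera_model):
    """Apply a camera-specific color matrix and base curve to linear RGB."""
    profile = CAMERA_PROFILES[camera_model]
    mapped = rgb_linear @ profile["color_matrix"].T
    return np.clip(mapped, 0.0, 1.0) ** profile["base_curve_gamma"]

pixel = np.array([0.30, 0.25, 0.20])        # identical linear data in...
print(develop(pixel, "Phase One IQ180"))    # ...two different renderings out
print(develop(pixel, "Canon G10"))
```

The same linear pixel comes out with different color and tonality depending solely on which camera's table was consulted, before the user has touched a single slider.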
I think cooking is a good analogy. Give two chefs the same ingredients, tell them to cook an Ohio-style beef casserole, and you'll get two different meals. Even if you give them a specific point of reference, say, your grandma's casserole (e.g. Fuji Astia with a 1/3-stop pull), you won't get the same meal.
From unscientific surveys at raw-processing workshops I've taught, I'd guess that roughly half of serious photographers have never bothered to open a raw file in more than one processor to compare the results. Compare that with the percentage of film photographers who only ever shot one emulsion and used one lab, and I hope you'll agree that we've lost something; I attribute much of that loss to the misconception that "defaults = true image".