Scanner software and LR don’t give you true RAW processing, but you can get close: the less processing applied to your scanner’s CCD output, the closer to RAW it is. For example, if you output aRGB, that is a step away from your scanner’s native color space and therefore a step away from RAW. (I’m not saying there is anything wrong with outputting aRGB; it’s just an example of a departure from RAW.)
Just to be clear: as long as your output color space isn’t smaller than your scanner’s native color space, it probably makes no difference in LR whether you work with aRGB output from your scanner or the 16-bit linear output from your scanner’s CCD (provided you properly color manage that linear output). My scanner’s color space is larger than aRGB, so I use ProPhoto. It’s likely that your Epson’s native color space is also larger than aRGB, but I don’t know whether that difference would be significant.
With my scanner, I can take the same 16-bit linear file and make two versions of it. For one, I assign my scanner’s linear profile but otherwise leave the file untouched. The data is still linear, which the histogram shows clearly, even though the photo now looks normal. For the other version, I go one step further and convert to ProPhoto: I first assign my scanner’s linear profile and then convert to ProPhoto. Now not only does the photo look normal, so does the histogram. If I then process both versions in LR, using all the same settings, the results are identical.
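A quick sketch of why the two histograms differ, for anyone curious. This is not anyone’s actual scanner pipeline, just an illustration: “assigning” a profile leaves the pixel values untouched (only their interpretation changes), while “converting” remaps them, here approximated by ProPhoto’s roughly gamma-1.8 encode on fake 16-bit linear data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake 16-bit linear scan data, bunched in the shadows as linear data is.
linear = (rng.random(100_000) ** 2 * 65535).astype(np.uint16)

# "Assign a profile": the pixel values are untouched; only the
# interpretation (which color space they're read in) changes.
assigned = linear.copy()

# "Convert to ProPhoto": the values are actually remapped. A plain
# gamma-1.8 encode stands in for the real ICC conversion here.
converted = (np.power(linear / 65535.0, 1 / 1.8) * 65535).astype(np.uint16)

# The assigned version's histogram still piles up in the shadows,
# while the converted version's spreads out and "looks normal".
print(int(np.median(assigned)), int(np.median(converted)))
```

The assigned file looks normal on screen only because the monitor pipeline applies the profile at display time; the numbers underneath never moved, which is why its histogram still looks linear.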
I haven’t run the same test with aRGB, so it’s possible that using aRGB could make a difference beyond the simple gamut difference between my scanner’s native color space and aRGB.