Some people believe that using Photoshop to change a photograph is altering what is real into something that does not exist. The truth is that every exposure in your camera is only a partial interpretation of what you are seeing. And of course, if you have studied perception, you know that what you are seeing is only a small bit of what is really there. I'd like to explore this using three pictures of the same landscape, taken the same day.
Here's the first one, taken and processed in Lightroom (click for a larger view):
I underexposed this to hold the cloud detail, then used the curves adjustment to brighten the rock and trees as best I could. Sharpened and added clarity as usual. This does not really look like what I saw that day, because the camera simply can't capture the range of tones my eye and brain were seeing.
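Lightroom's curves tool is interactive, but the underlying idea of lifting an underexposed shot can be sketched with a simple gamma curve: midtones get pulled up while pure black and pure white stay anchored, much like dragging up the middle of a curves adjustment. This is only an illustrative sketch, not Lightroom's actual algorithm, and the `gamma` value is my assumption:

```python
# Minimal sketch of a brightening tone curve (not Lightroom's actual
# algorithm): a gamma < 1 lifts midtones while leaving pure black (0.0)
# and pure white (1.0) fixed, similar in spirit to raising the middle
# of a curves adjustment.
def brighten(pixels, gamma=0.6):
    """Apply a gamma curve to pixel values normalized to [0, 1]."""
    return [min(max(p, 0.0), 1.0) ** gamma for p in pixels]

# An underexposed midtone at 0.25 is lifted to roughly 0.44,
# while the endpoints 0.0 and 1.0 do not move.
print(brighten([0.0, 0.25, 0.5, 1.0]))
```

The endpoints staying fixed is the point: you brighten the shadows and midtones without clipping the highlights you underexposed to protect.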
The next photo shows the same scene, but I've put a 10X neutral density filter (actually slightly warm) on the lens and exposed on manual for about 60 seconds. This has the effect of smoothing out the water and losing detail from the sky. The bonus is that the rock seems to almost float, even though it is very grounded. So is this version less real than the first one? Again, what I saw didn't look like this, because my eye can't merge 60 seconds of viewing into a single image.
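ND filter labels are ambiguous: "10X" could mean a 10× light reduction (a bit over 3 stops) or a 10-stop filter (often sold as ND1000, about a 1000× reduction). The arithmetic below assumes the 10-stop reading, since that's what turns a normal daylight exposure into roughly a minute; the 1/15 s base exposure is my illustrative assumption, not a figure from this post:

```python
# Sketch of ND filter exposure arithmetic: each stop of neutral
# density halves the light, so the exposure time doubles per stop.
def long_exposure(base_seconds, stops):
    """Exposure time needed after adding an ND filter of the given stops."""
    return base_seconds * 2 ** stops

# A 10-stop filter multiplies exposure time by 2**10 = 1024,
# so a 1/15 s base exposure stretches to about 68 seconds.
print(long_exposure(1 / 15, 10))
```

This is why a 10-stop filter is the classic tool for silky water: it buys you a shutter speed long enough to average the waves away, even in daylight.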
The last example looks closest to what I saw. It is an HDR image made by merging three exposures (light, medium, dark) in the app Photomatix, then sharpened and adjusted in Lightroom. As you can see, we have good detail in the clouds, the reflection, and the rock. This is the kind of result Ansel Adams worked toward by dodging and burning in his darkroom, long before digital capture was invented. If he were alive today, I don't think he would hesitate for a moment to use any tool that helped him achieve his vision.
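Photomatix's processing is proprietary, but the core idea of merging bracketed exposures can be sketched simply: weight each pixel by how close it sits to mid-gray, so whichever bracket exposed that pixel best dominates, and clipped shadows or blown highlights contribute least. A minimal single-channel sketch, with values in [0, 1]; the tent-shaped weighting function is my illustrative assumption, not Photomatix's algorithm:

```python
def merge_exposures(exposures):
    """Merge aligned single-channel images (lists of values in [0, 1]),
    weighting each pixel toward whichever exposure captured it closest
    to mid-gray, so clipped shadows and highlights contribute least."""
    merged = []
    for pixels in zip(*exposures):
        # Weight peaks at 1.0 for mid-gray (0.5) and falls toward a
        # small floor of 0.05 near pure black or pure white.
        weights = [max(1.0 - 2.0 * abs(p - 0.5), 0.05) for p in pixels]
        total = sum(weights)
        merged.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return merged

# Three brackets of the same three pixels: a deep shadow, a midtone,
# and a blown highlight. The merge keeps detail from whichever bracket
# exposed each pixel best, instead of clipping to 0.0 or 1.0.
dark   = [0.02, 0.20, 0.55]
medium = [0.10, 0.45, 0.90]
light  = [0.30, 0.75, 1.00]
print(merge_exposures([dark, medium, light]))
```

This is, in miniature, what dodging and burning did in the darkroom: give each region of the print the exposure it needs, rather than one global exposure for the whole frame.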