Matthew Newton , Mar 01, 2010; 09:27 a.m.
Interesting. I can certainly see the applications, and they definitely mentioned them. The only issue I see is the computing power required. I agree that for some things, such as space probe image capture, it could prove invaluable and revolutionary. Same with image enhancement in general. However, I don't see this becoming a common thing for digital cameras or for 'compression'. As mentioned, it takes a couple of hours of computer time to enhance a single MRI image. Not sure of the resolution of an MRI image, but I'd imagine a 12mp image is at least on par.
I really don't want to snap a couple of thousand pictures at 'lower resolutions' to save battery power and memory card space, and then have to spend the next year processing them 24/7 to bring them up to spec.
Computing power only roughly doubles every 18-20 months. Going from a couple of hours per image down to maybe 10 seconds for post-capture enhancement is about a 720x speedup, which works out to roughly 9-10 doublings, or something like 15 years of computing improvements, and considerably more than that if the workload grows with bigger output images. Of course, by then you might still have a 20mp sensor, but you'd be creating 200mp images in the end because of enhancement. Personally I see this as a wonderful replacement for upresing an image when you need to print really big.
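Running the doubling math as a quick sanity check in Python (the 2-hour figure is from the MRI example above; the 10-second target and the 18-20 month doubling period are assumptions):

```python
import math

# How many doublings of computing power to go from ~2 hours per
# image down to ~10 seconds per image, and how long would that take?
current_seconds = 2 * 60 * 60   # ~2 hours per image today (MRI example)
target_seconds = 10             # assumed acceptable per-image time

speedup = current_seconds / target_seconds   # 720x faster needed
doublings = math.log2(speedup)               # ~9.5 doublings

for months_per_doubling in (18, 20):
    years = doublings * months_per_doubling / 12
    print(f"{months_per_doubling} months/doubling -> ~{years:.0f} years")
# prints:
# 18 months/doubling -> ~14 years
# 20 months/doubling -> ~16 years
```

Note this assumes the per-image workload stays fixed; if the enhanced output keeps growing (say 200mp instead of 12mp), the wait gets correspondingly longer.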
I do wonder what the limit of the algorithm is for upresing or other enhancement before the error introduced actually becomes significant. A 4x enhancement? 10x? 100x? 1,000x? I can't imagine it is truly infinite. How could you turn a 50 pixel 'image' into a 12mp image with reasonable accuracy (or, for that matter, how could you do it without a few trillion petaflops of processing)? The answer might be that you can't.
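Just to put a number on that extreme case (the 50-pixel and 12mp figures are from the hypothetical above):

```python
# Scale of the enhancement factor in the extreme hypothetical:
# turning a 50-pixel 'image' into a 12 megapixel one.
source_pixels = 50
target_pixels = 12_000_000  # 12mp

factor = target_pixels / source_pixels
print(f"50 px -> 12mp is a {factor:,.0f}x enhancement")
# prints: 50 px -> 12mp is a 240,000x enhancement
```

That's a 240,000x factor, which is a long way past the 4x or 10x range where this sort of enhancement seems plausible.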