So I wanted to see how my camera compared to a scanner and to an image from the internet, all of the same thing. Here is a CD cover that has a wide range of color.
First, the scanner:
The Panasonic DMC-FS15
It's a little blurry because I'm holding the camera by hand.
Off an internet site
http://www.sonymusic.com.tr/images/albu ... RICANA.gif
I was surprised by how bad the scanned image looks. The internet file and the camera shot look similar.
All in all, I am starting to like the Panasonic DMC-FS15; the only problem I see with it is its lack of focus lock.
Image Comparison Test
Moderator: peterZ
-
- Posts: 596
- Joined: 06 Jun 2009, 23:57
Re: Image Comparison Test
A scan of a halftone image (an image composed of little "dots" of pigment on a regular grid) usually results in aliasing artifacts such as those in your first example. Even though an image taken by a digital camera is technically a scan too, I suspect the fact that it has less resolution than the flatbed scan makes such artifacts less likely.
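If anyone wants to see that numerically, here's a little numpy sketch. The cosine grid is just a made-up stand-in for a real halftone screen, and the dot pitch and sampling step are arbitrary: point-sampling the fine dot pattern (roughly what scanning at the wrong DPI does) leaves large low-frequency beats, while averaging over each sample cell first flattens them.

```python
import numpy as np

def halftone(n, period):
    """Synthesize an n x n stand-in for a halftone: dots on a regular grid."""
    y, x = np.mgrid[0:n, 0:n]
    return 0.5 + 0.5 * np.cos(2 * np.pi * x / period) * np.cos(2 * np.pi * y / period)

img = halftone(600, period=7)   # dot pitch of 7 px at "scan" resolution

# Point sampling every 10th pixel beats the 7 px dot grid against the
# 10 px sample grid, leaving slow ripples (the aliasing artifacts).
naive = img[::10, ::10]

# Averaging each 10x10 cell instead acts as a low-pass filter first.
filtered = img.reshape(60, 10, 60, 10).mean(axis=(1, 3))

print(naive.std(), filtered.std())   # the ripple is far stronger in `naive`
```

The standard deviation is a rough proxy for how much spurious pattern survives; both outputs should be nearly flat grey, but only the averaged one actually is.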
Re: Image Comparison Test
On the other hand, depending on what resolution you end up photographing at and the halftone resolution, you're equally likely to end up with moiré patterns. I've had that happen to me a few times. For example:
- Insufficient resolution to resolve the crosshatching within the Nyquist limit results in rainbow patterns
(Cover page from the Brant County Illustrated atlas, 1875)
- The wrong resolution for the halftone results in strange dot patterns in what should be one continuous colour tone
(Crop from the cover to a Tears For Fears concert laserdisc)
When you're using a flatbed scanner, there are descreening techniques that help to reduce halftone artifacts; those would likely make up the quality difference you're seeing. Not that you can't get great quality with a good camera! When I haven't run afoul of moiré, I have definitely had images that rival flatbed quality.
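Descreening is essentially low-pass filtering at the screen pitch before you shrink the image. Here's a toy numpy version, with a plain box blur standing in for the fancier filters real scanner software uses, and a synthetic period-6 screen rather than a real scan:

```python
import numpy as np

def box_blur(img, k):
    """Separable k x k box blur -- a crude descreening low-pass filter."""
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

# A screen of period-6 "dots"; blurring at exactly the screen pitch
# averages each dot cell to its mean tone and flattens the screen.
y, x = np.mgrid[0:120, 0:120]
screen = 0.5 + 0.5 * np.cos(2 * np.pi * x / 6) * np.cos(2 * np.pi * y / 6)
descreened = box_blur(screen, k=6)

core = descreened[6:-6, 6:-6]        # ignore the zero-padded border
print(screen.std(), core.std())      # screen pattern vs. nearly flat tone
```

Matching the blur width to the screen period is the whole trick; too small and dots survive, too large and real detail goes with them.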
The opinions expressed in this post are my own and do not necessarily represent those of the Canadian Museum for Human Rights.
- daniel_reetz
- Posts: 2812
- Joined: 03 Jun 2009, 13:56
- E-book readers owned: Used to have a PRS-500
- Number of books owned: 600
- Country: United States
- Contact:
Re: Image Comparison Test
Digital cameras have an "Optical Low Pass Filter" (sometimes called an AA or Anti-Aliasing filter) right on top of the sensor. Basically, the OLPF is a sheet of glass that's slightly blurry. Because the sensor of the camera is a regular grid, you would otherwise get moiré all over the place from cloth and other patterned surfaces (here is a famous example featuring Bush). The OLPF, by blurring the image slightly, prevents those high-frequency patterns from causing aliasing at lower harmonics (in this case, color moiré patterns). That's why Misty's example is so perfect -- fine, linear detail like those hatch marks reliably causes aliasing.
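The OLPF is easy to fake in numpy. Below I model it crudely as a two-point split one pixel pitch apart (real filters use birefringent plates and a four-way split, so this is only a sketch, and the 0.55 cycles/pixel figure is invented): a fabric-like pattern just past the Nyquist limit aliases at full strength when sampled raw, but mostly cancels after the split.

```python
import numpy as np

pitch = 10                                   # fine grid samples per sensor pixel
j = np.arange(8000)
# Pattern at 0.55 cycles per sensor pixel, just above the 0.5 Nyquist limit.
image = np.cos(2 * np.pi * 0.55 * j / pitch)

raw = image[::pitch]                         # no filter: aliases to 0.45 cyc/px

# Crude OLPF: each ray is split into two copies one pixel pitch apart,
# which nearly cancels detail near Nyquist before the sensor samples it.
olpf = 0.5 * (image + np.roll(image, pitch))
filtered = olpf[::pitch]

print(raw.std(), filtered.std())             # aliased energy vs. suppressed
```

The raw samples carry the full pattern amplitude (just at a false, lower frequency), while the split knocks it down by roughly a factor of six here.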
The tradeoff is that none of our cameras resolve as much detail as they potentially could, because the OLPF blurs things a bit. Generally, it's not an issue. But it bothers some people. For some cameras, it is possible to remove the filter. This company will do it for you. They have some great example pictures.
Newer cameras with high-performance sensors employ a much less aggressive OLPF. You don't need it as much with better RAW conversion algorithms. Another approach to eliminating moiré is to make more of the sensor actually sense light.
Right now, the majority of cameras we are using have sensors for which most of the sensor surface is not light-sensitive. In fact, the little light-sensing elements (photosites) are actually surrounded and obscured by a bunch of microscopic wires and other wafer-level features. (Incidentally, this is true of your eyes, too -- your rods and cones are actually facing backwards, and you see light through a mass of veins and membranes.) Anyway, the little photosites on your sensor sit in little holes, so they don't see much light. The spacing between each little photosite is actually rather large.
There are two approaches. One is to use microlenses. Each photosite gets a lens over its surface. The lens helps to gather more light onto each element, increasing the total surface area of the sensor that is used for light gathering. Since the gaps between light-gathering areas are much smaller, moiré is reduced.
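Here's a toy numpy illustration of that fill-factor point (the 20% and 100% figures are invented for the sketch, not real sensor specs): a pattern past Nyquist comes through almost untouched when each pixel is nearly a point sample, and is knocked down hard when the pixel integrates light over its whole pitch.

```python
import numpy as np

pitch = 10                                     # fine grid samples per pixel pitch
# Detail at 0.9 cycles per pixel, well past the 0.5 Nyquist limit.
detail = np.cos(2 * np.pi * 0.9 * np.arange(6000) / pitch)

def expose(img, fill):
    """Each pixel integrates over the central `fill` fraction of its pitch."""
    k = max(1, int(round(fill * pitch)))
    lo = (pitch - k) // 2
    return img.reshape(-1, pitch)[:, lo:lo + k].mean(axis=1)

low_ff = expose(detail, 0.2)    # tiny photosite: nearly a point sample
high_ff = expose(detail, 1.0)   # with microlenses: averages the whole pitch

print(low_ff.std(), high_ff.std())   # aliased energy drops with fill factor
```

The larger aperture acts like a built-in box blur, which is exactly why filling more of the pixel with light-gathering area tames moiré.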
Another approach, that's just starting to be employed, is called "back side illumination". Ironically, BSI is equivalent to flipping the photosites the right way around. It makes the sensor many times more sensitive and increases the light-gathering area. In conjunction with microlenses, it makes for a really awesome sensor with lots of desirable properties. Casio is coming out with a new high-speed camera called the FH100 that uses this technology. I am going to sell my old Casio and buy it to test out this new sensor tech.
Blah blah. I love cameras and wish I were an optical engineer.
Re: Image Comparison Test
Interesting, I didn't know about those low-pass filters. Thanks for the explanation, Dan!
The opinions expressed in this post are my own and do not necessarily represent those of the Canadian Museum for Human Rights.
- IcantRead
- Posts: 95
- Joined: 17 Sep 2009, 02:56
- Number of books owned: 0
- Country: United States
- Location: Arizona
Re: Image Comparison Test
Wow, I didn't expect to get such great, in-depth responses to this post.