Lenses

Camera obscura

 

The easiest way to project images is the camera obscura, which is nothing more than a dark space with a tiny hole in one of the walls; the hole projects an upside-down and mirrored image of the world outside onto the opposite wall. In the scheme below on the left, a box with a tiny hole is used, which gives a projection on the inside of the box.

If the opposite wall is some sort of sensor, then the picture can also be captured, like in the picture below to the right, which was taken with my digital camera covered with a piece of aluminum foil with a very small hole.







Schematic depiction of a camera obscura

Comparing a lens with a pinhole => [taken with a pinhole] [taken with a lens]

 

The focal length of the pinhole setup (meaning the distance between the sensor and the aluminum foil) was measured to be about 47-48 mm, corresponding well with the 48 mm focal length of the lens used for the comparison photo. Furthermore, the pinhole photo needed a shutter speed of 4 seconds, while the photo taken with a lens only needed 0.05 seconds at an aperture of f/13. So the pinhole photo needed a shutter speed that was 80 times longer, which equals about 6.32 stops:

$$\frac{log{(\frac{4}{0.05})}}{log{(2)}} =\frac{log{(80)}}{log{(2)}} = 6.32$$

 

Since each stop corresponds to a factor of √2 in the aperture number (more about that here and here), this means that the pinhole had an aperture of about f/116:

$$13×{(\sqrt{2})^{6.32}}=116$$

 

Knowing that the focal length of the setup was 48 mm, this in turn means that the aperture for the pinhole set-up (the hole in the aluminum foil) was about 0.41 mm in diameter:

$$\frac{48}{116} = 0.41$$
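For those who want to check the numbers, here is a minimal Python sketch of the same arithmetic; the function and variable names are mine, and the values are simply the measured ones quoted above.

```python
import math

f_lens = 13          # aperture number used for the lens photo
t_lens = 0.05        # shutter speed with the lens (s)
t_pinhole = 4.0      # shutter speed with the pinhole (s)
focal_length = 48.0  # sensor-to-foil distance (mm)

stops = math.log(t_pinhole / t_lens, 2)     # ~6.32 stops more exposure needed
n_pinhole = f_lens * math.sqrt(2) ** stops  # ~f/116 for the pinhole
diameter = focal_length / n_pinhole         # ~0.41 mm hole diameter

print(round(stops, 2), round(n_pinhole), round(diameter, 2))
```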

 

But it's clear that there is room for improvement in the pinhole photo and that's where lenses enter the picture; they are sharper, collect more light and are more versatile in terms of depth of field. Lenses in their simplest form are made of just one element, but the lenses used for photography consist of various elements.

 

Focal length

 

Schematic depiction of a focussed lens =>
[subject at infinity] [subject at closer distance]

 

Schematic depiction of a lens system => [thin lens] [lens system]

The focal length of a lens is the distance at which the image of a subject at infinity is formed, as is shown on the right, where f represents the focal length. However, if we want to focus on something at a closer distance, then the image is formed at a larger distance than the focal length, and the lens optics need to be adjusted for that by increasing the distance from the lens to the image plane.

For a thin lens, the amount of adjustment needed can be calculated by the thin lens formula:

$$\frac{1}{f} = \frac{1}{u}+\frac{1}{v}$$

 

  f = focal length

  u = subject distance

  v = image distance

 

For a lens with multiple elements, this is a bit different, as it has two hypothetical planes (the principal planes) instead of just one, and the distance between these two planes is ignored in this formula (in the picture to the right, the combined lens elements are represented as the thick lens).

 

For a subject at infinity (so u = ∞) the image distance is approximately equal to the focal length:

$$\frac{1}{v} = \frac{1}{f}-\frac{1}{u} = \frac{1}{f}-\frac{1}{\infty} \approx \frac{1}{f}$$

 

But with a 50 mm lens and a subject at 5 meters distance (= 5000 mm), the image distance will be:

$$\frac{1}{v}=\frac{1}{50} - \frac{1}{5000}$$

so:

$$v=\frac{1}{\frac{1}{50} - \frac{1}{5000}}=50.5$$

 

So the focal plane is shifted 0.5 mm (50.5 - 50.0) backwards and, by focusing our lens, we shift the whole lens system 0.5 mm forwards, so that the focal plane coincides with our sensor.
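A small Python sketch of this thin-lens calculation (the function name is just illustrative; everything is in mm):

```python
def image_distance(f, u):
    """Image distance v from the thin-lens formula 1/f = 1/u + 1/v."""
    return 1 / (1 / f - 1 / u)

v = image_distance(50, 5000)          # 50 mm lens, subject at 5 m
print(round(v, 1), round(v - 50, 1))  # 50.5 mm image distance, so 0.5 mm extension
```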

 

From this formula, it can also be deduced that the shortest possible focus distance (the distance between subject and sensor, so u + v) is 4 times the focal length. This shortest focus distance is reached when u and v are equally long, so v = u:

$$u=v$$

so:

$$\frac{1}{f}=\frac{1}{u} + \frac{1}{v}=\frac{1}{v} + \frac{1}{v}=\frac{2}{v}$$

so:

$$v=u=2f$$

 

So the shortest focus distance is u + v = 2f + 2f = 4f. At this point the magnification is 1:1, and a greater magnification can only be obtained by extending the lens further from the sensor, for example with extension tubes (which again gives a longer focus distance).
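As a quick numerical check of this claim, here is a small Python sketch (thin lens assumed, distances in mm, variable names are mine):

```python
f = 50.0
totals = []
for i in range(101, 1001):          # subject distances u from just above f up to 10f
    u = f * i / 100
    v = 1 / (1 / f - 1 / u)         # thin-lens formula
    totals.append((u + v, u))
print(min(totals))                  # ~(200.0, 100.0): the minimum of u + v is 4f, at u = 2f
```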

 

In general, a focal length equal to the diagonal of the sensor is regarded as normal, so in the case of full frame format this would be 43 mm (in practice 50 mm is taken instead, as fixed 43 mm lenses are rare). A 50 mm lens is generally regarded as having approximately the same perspective as the human eye. This is, however, something different from the field of view, as the field of view of the human eye resembles that of a 22 mm lens.

 

Lenses with a focal length shorter than the long side of the sensor are considered wide angle, and if the focal length is shorter than the short side of the sensor, then we're dealing with ultra wide angle lenses. (A special type of ultra wide angle lens is the fisheye lens, which produces a curvilinear, often circular, image instead of a rectilinear one.) For full frame, the different focal lengths are summarized in the table below. Wide angle lenses often give an exaggerated feeling of perspective, while photos taken with telephoto lenses have a flatter feeling.

< 24 mm   ultra wide angle
24 mm - 35 mm   wide angle
50 mm   normal
80 mm - 300 mm   tele
> 300 mm   super tele

 

In photography there are two types of lenses: fixed lenses (also called prime lenses), which have a fixed focal length, and zoom lenses, which cover a range of focal lengths. Fixed lenses are simpler in design and as a result they usually give better results than zoom lenses (although the price tag is also of big importance: a cheap fixed lens will be inferior to an expensive zoom lens). For zoom lenses there is a so-called zoom factor, which is the maximum focal length divided by the minimum focal length. So a zoom lens with a range from 25 mm to 75 mm has a zoom factor of 3 (= 75/25). The higher the zoom factor, the more problems the lens designers face in terms of corrections and, as a rule of thumb, a larger zoom factor often means a lens with decreased quality.

 

Two different types of lens design are very common:

• The first is the retrofocus design. In this case, a diverging lens is used as the first component, which makes it possible to place a wide angle lens further away from the sensor than in a conventional design. This is a big advantage when using a wide angle lens on an SLR camera, as the mirror needs more space to move than a conventional wide angle design would leave. An extra advantage of this design is that the light reaches the sensor at a smaller angle, which decreases vignetting. This does come at a price though: because of the large front element, a retrofocus lens is heavy, and it will often show barrel distortion.

• The inverse of this principle is used for long focal lengths in order to make tele lenses more practical, a lens design known as telephoto. This means that a 300 mm telephoto lens can be significantly shorter than a conventional 300 mm tele lens. Like the retrofocus design, this also comes at a price, as telephoto lenses often suffer from pincushion distortion.

 

Aperture

 

Different aperture sizes => f/4, f/5.6, f/8, f/11, f/16, f/22, f/32.

The aperture is the lens diaphragm that regulates how much light is transmitted to the sensor. It is usually a construction of several blades as can be seen in the example on the right, where f/4 is wide open.

But more important than regulating the amount of light, the aperture is crucial for controlling the depth of field (more about the depth of field here). A small aperture gives a large depth of field, whereas a large aperture gives a narrow depth of field. So if the depth of field is important for your picture, then the aperture should be used to get the right depth of field and the shutter speed should be adjusted accordingly. If you don't care about the depth of field and shutter speed matters more to you, then using the aperture to change the exposure is an option.

 

Below are two examples showing the change in depth of field for different apertures (the first one is at a focal length of 20 mm and the second one at 200 mm). In both cases, focus was on the trees in the foreground. A large aperture (small aperture number) gives a small depth of field, whereas a small aperture (large aperture number) gives a large depth of field.







The difference between a large and small aperture => [large (f/4)] [small (f/22)]

The difference between a large and small aperture =>
[large (f/2.8)] [small (f/22)]

 

The aperture number is a relative figure and is obtained by dividing the focal length by the apparent diameter of the aperture (the apparent diameter is the size of the aperture as we see it when looking into the lens, which is what counts). Thus, a 100 mm lens with an aperture number of f/4 has an aperture of 25 mm, whereas a 16 mm lens at f/4 has an aperture of only 4 mm. That's why it's written as f/4: the f stands for focal length, and the point of this notation is that an aperture of f/4 passes the same amount of light regardless of the focal length. If we did not write it like this, we would have to either remember that a 4 mm aperture at 16 mm is equal to a 25 mm aperture at 100 mm, or calculate this for every instance.

 

For the aperture, every stop represents a factor of 1.4 (= √2) in the aperture number and, as a result, a factor of 2 (= 1.4 × 1.4) means a difference of 2 stops. This is because the amount of light which passes through an aperture is directly related to its area, which in turn is related to the square of the diameter. So by going from f/8 to f/5.6 you increase the exposure by 1 stop, going from f/8 to f/4 increases the exposure by 2 stops, going from f/8 to f/2.8 increases the exposure by 3 stops, etc.

Focal length (mm):

Aperture number (f/):

Size of aperture (mm):

Calculate!
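As a rough sketch of what this calculator does, here is a short Python version (function names are just illustrative):

```python
import math

def aperture_diameter(focal_length_mm, f_number):
    """Physical aperture diameter: focal length divided by the f-number."""
    return focal_length_mm / f_number

def stop_difference(n_from, n_to):
    """Exposure difference in stops between two f-numbers (each stop is a factor sqrt(2))."""
    return 2 * math.log(n_from / n_to, 2)

print(aperture_diameter(100, 4))  # 25.0 mm
print(aperture_diameter(16, 4))   # 4.0 mm
print(stop_difference(8, 2.8))    # ~3 stops more light at f/2.8 than at f/8
```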

 

Aperture - diffraction

 

Schematic depiction of diffraction => [small aperture] [large aperture]

Diffraction is the effect of light spreading out when squeezed through a small opening, and in our case the small opening is the aperture. The diffraction causes a so-called Airy disk to form, which becomes larger with increasing diffraction. Diffraction occurs at any given aperture, but for the larger apertures (small f-numbers) it is insignificant. At smaller apertures, diffraction increases and at a certain point (when the Airy disks start to approach the size of the circle of confusion) it starts to degrade image sharpness. So if you want a large depth of field in your photo, don't just use the smallest aperture, as you will end up with decreased overall sharpness due to diffraction. Calculate, or even better, test at which aperture sharpness starts to degrade due to diffraction.

Examples of how diffraction ruins sharpness at small apertures => [f/16] [f/32] [f/64]

 

The size of the Airy disk can be calculated with the formula below:

$$a = {2.44×λ×N×(1+m)}$$

 

  a = Airy disk diameter

  λ = wavelength of the light

  N = aperture number

  m = magnification

This formula can be simplified by using green light (546 nm wavelength), at which point the formula becomes approximately (the result is then in mm; for most conditions, the magnification will be small and can be ignored):

$$a = \frac{N×(1+m)}{750}$$

 

  a = Airy disk diameter (mm)

  N = aperture number
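A minimal Python sketch of both versions of the formula (the names are illustrative, and the 0.03 mm circle of confusion mentioned in the comment is just a typical full-frame value):

```python
def airy_disk_mm(wavelength_nm, f_number, magnification=0.0):
    """Full formula: 2.44 * wavelength * N * (1 + m), converted from nm to mm."""
    return 2.44 * wavelength_nm * 1e-6 * f_number * (1 + magnification)

def airy_disk_green_mm(f_number, magnification=0.0):
    """Simplified green-light (546 nm) version."""
    return f_number * (1 + magnification) / 750

# At f/16 the disk is roughly 0.021 mm, already close to a typical
# full-frame circle of confusion of 0.03 mm.
print(round(airy_disk_mm(546, 16), 4), round(airy_disk_green_mm(16), 4))
```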

 

 

Some examples of how diffraction affects sharpness are above on the left; all are 100% crops. At f/16 the picture looks sharp, but notice how sharpness decreases significantly due to diffraction at the smaller apertures.

 

The table below shows, for various wavelengths (400, 450, 500, 550, 600 and 650 nm), whether the apertures are diffraction limited or not, as well as the size of the Airy disk for that aperture and wavelength. If they are diffraction limited, it means that sharpness decreases due to diffraction at that aperture. The focal length of the lens is of no importance here: a lens with a longer focal length has a larger aperture, which would decrease the diffraction, but the light also has a longer way to travel from aperture to sensor, which increases the effect. These two factors more or less cancel each other out.

 

Note that the results of this calculator should be taken with a small pinch of salt. Diffraction will most of the time be hardly visible in real life for the first two or maybe even three aperture settings where diffraction is occurring according to this calculator. That's why testing your lens is the best way to see at which aperture diffraction starts to degrade image quality.

Circle of confusion (mm):

Magnification*:

 

Wavelength (nm):


Limit (f/):

Calculate!

*Only of importance when taking pictures at high magnifications. For more on this, see lenses/magnification and macro photography/magnification.

 

Angle of view (AOV) and field of view (FOV)

 

Schematic depiction of the FOV for various focal lengths

The focal length of a lens, together with the size of the sensor, determines the angle of view that the lens covers, and the picture on the right shows some examples of the horizontal angle of view for various focal lengths based on full frame. For example, a focal length of 16 mm gives a horizontal angle of 97°, whereas a focal length of 200 mm gives an angle of 10°.

 

The picture below on the left was taken at 16 mm, and by hovering over the focal lengths you can see what the angle of view would have been if I had used that focal length.

 

Do keep in mind that the angle of view changes slightly when adjusting the focus, an effect which is also called "focus breathing". Especially when working at really close focus distances, like in macro photography, this is readily observable (for more on this, see macro photography/magnification). In the example below on the right, focus is on the front part of the lichen. When looking at the second picture, where the focus is on the rear part of the lichen, we see that the picture has a slightly wider angle of view. This is the reason why the calculator below has a checkbox for "take magnification into account".

Focal lengths compared => [16 mm] [25 mm] [50 mm] [100 mm] [200 mm]

Changing the focus also changes the angle of view => [focus on front part] [focus on rear part]

 

The angle of view can be calculated according to the formula below:

$$aov = {2×arctan\left(\frac{w}{2×f×(1+m)}\right)}$$

 

  aov = angle of view

  w = sensor width

  f = focal length

  m = magnification

 

If you know the focus distance then you can also calculate the field of view at the point of focus with this formula:

$$fov = {\frac{w×d}{f×(1+m)}}$$

 

  fov = field of view

  w = sensor width

  d = focus distance

  f = focal length

  m = magnification
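A small Python sketch of these two formulas (illustrative names; all lengths are in mm, so the focus distance is given in mm here as well):

```python
import math

def angle_of_view_deg(sensor_width, focal_length, magnification=0.0):
    return math.degrees(2 * math.atan(sensor_width / (2 * focal_length * (1 + magnification))))

def field_of_view(sensor_width, focus_distance, focal_length, magnification=0.0):
    return sensor_width * focus_distance / (focal_length * (1 + magnification))

print(round(angle_of_view_deg(36, 16), 1))  # ~96.7° horizontally for 16 mm on full frame
print(field_of_view(36, 5000, 50) / 1000)   # ~3.6 m wide at 5 m with a 50 mm lens (m ignored)
```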

 

The field of view can be calculated below.

Focal length (mm):

   

Focus distance* (m):

Take magnification into account**:

Sensor dimensions (w×h, mm):

×

 

Horizontal AOV (°):

Horizontal FOV (m):

Vertical AOV (°):

Vertical FOV (m):

Diagonal AOV (°):

Diagonal FOV (m):

Calculate!

*Focus distance is the distance between sensor and subject, should at least be 4 × focal length (only angular values will be given if no value for focus distance is submitted).
**Only of importance when taking pictures at high magnifications. For more information see lenses/magnification.
Note: the pupil factor (defined as the ratio between the apparent exit pupil diameter and the entrance pupil diameter) is not taken into account in this calculator; it only starts to have an influence at smaller focus distances.

 

Magnification

 

Different magnifications => [situation 1] [situation 2] [situation 3] [situation 4] [situation 5]

 

 

Situation 1: u > 2 × f → mag < 1:1, focus distance > 4 × focal length

Situation 2: u = 2 × f → mag = 1:1, focus distance = 4 × focal length

Situation 3: u < 2 × f, but u > f → mag > 1:1, focus distance > 4 × focal length

Situation 4: u = f → no image formed, no sharp picture recorded

Situation 5: u < f → image projected in front of the lens, no sharp picture recorded

The magnification is the relative size of the subject on the sensor. For example, a focal length of 100 mm and a focus distance (the distance between subject and sensor) of 0.4 m gives a magnification of 1:1, which means that the subject is projected life size on the sensor (the size of the sensor plays no role in magnification). Lenses which can produce magnifications of 1:1 or larger are considered macro lenses.

 

The scheme on the right shows the relationship between magnification, focal length and subject distance (the subject distance, u, is the distance between lens and subject). Blue is the subject, red is the formed image, f is the focal length.

 

If the focus distance and the focal length are known, the magnification can be calculated with the formula below.

$$m = \frac{\frac{d}{2}-\sqrt{\frac{d^2}{4}-f×d}}{\frac{d}{2}+\sqrt{\frac{d^2}{4}-f×d}}$$

 

  m = magnification

  d = focus distance

  f = focal length

 

 

The calculator below calculates what magnification you are working with for a known focal length and focus distance. When working at higher magnifications, the effective aperture changes significantly (more on that under macro photography/magnification), and you should take that into account when calculating certain other parameters. For this reason some of the other calculators on this site have an input field for magnification.

(The effective aperture can be calculated by multiplying the aperture number by a factor of 1 + magnification. E.g. a lens set at f/8 will have an effective aperture of f/16 when the magnification is 1. This correction is mostly of importance in macro photography, where magnifications are high.)
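As a sketch of what this calculator computes, including the effective-aperture correction from the note above (names and example numbers are illustrative; focal length and focus distance in mm):

```python
import math

def magnification(focus_distance, focal_length):
    half = focus_distance / 2
    root = math.sqrt(half ** 2 - focal_length * focus_distance)
    return (half - root) / (half + root)

def effective_aperture(f_number, m):
    return f_number * (1 + m)

m = magnification(400, 100)      # 100 mm lens with a focus distance of 0.4 m
print(round(m, 2))               # 1.0, i.e. 1:1
print(effective_aperture(8, m))  # a lens set at f/8 then behaves like f/16
```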

Focal length (mm):

Focus distance* (m):

Magnification:

Calculate!

* Focus distance is the distance between sensor and subject, should at least be 4 × focal length.

 

35 mm equivalent and crop factor

 

The crop factor of a camera is used to describe the size of a sensor relative to full frame format. Full frame is 36 × 24 mm, so if a camera sensor is 24 × 16 mm, then it will have a crop factor of 1.5. In the same way a camera with a 12 × 8 mm sensor has a crop factor of three.

If you are using a lens on a camera with a crop factor, then this changes the angle of view of your picture compared to using the same lens on full frame format. A so-called 35 mm equivalent of a focal length can be calculated by multiplying the focal length by the crop factor. This does not mean that the focal length of that lens changes, it is just a way to compare the result to the result on a full frame camera. So a 50 mm lens will still be a 50 mm lens when used on a camera with a crop factor of 1.5, but due to the cropping caused by the smaller sensor, the resulting angle of view of your image is comparable to using a 75 mm lens on a full frame camera.

 

If you take a picture of the reindeer below with a full frame camera equipped with a 200 mm lens, then you end up with the picture on the right. However, if you take the picture with the same lens on a camera with a crop factor of 1.5, then the picture will be the one for the blue outline. Effectively, this is the same angle of view as a 300 mm lens would have given you on a full frame camera, but it's still a 200 mm lens! In the same way, crop factors of two and four give the same angle of view as a 400 mm lens and an 800 mm lens would give on a full frame camera, respectively.

How the angle of view changes with a changing crop factor (CF) =>
[CF 1 (full frame)] [CF 1.5 (blue outline)] [CF 2 (red outline)] [CF 4 (green outline)]

 

The 35 mm equivalent can be calculated with this calculator.

Focal length (mm):

Crop factor:

35 mm equivalent (mm):

Calculate!

 

To calculate the crop factor based on the size of your sensor, use this calculator. If the horizontal and vertical crop factors are not identical, then the aspect ratio of the sensor differs from that of the 35 mm film format (aspect ratio 3:2), in which case the diagonal crop factor is often used. Be sure to fill in the sensor dimensions in the correct order (width × height).
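A sketch of the crop-factor arithmetic used by these calculators (illustrative names; sensor width and height in mm, full frame taken as 36 × 24 mm):

```python
import math

def crop_factors(width, height):
    horizontal = 36 / width
    vertical = 24 / height
    diagonal = math.hypot(36, 24) / math.hypot(width, height)
    return horizontal, vertical, diagonal

def equivalent_focal_length(focal_length, crop_factor):
    return focal_length * crop_factor

print(crop_factors(24, 16))              # (1.5, 1.5, 1.5) for a 24 x 16 mm sensor
print(equivalent_focal_length(50, 1.5))  # a 50 mm lens frames like a 75 mm on full frame
```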

Sensor dimensions (w × h, mm):

×

Crop factor vertical:

Crop factor horizontal:

Crop factor diagonal:

Calculate!

 

This one calculates the sensor dimensions of your camera based on the crop factor. It assumes that your camera has an aspect ratio of 3:2.

Crop factor:

Sensor dimensions (w × h, mm):

×

Calculate!

 

Lens characteristics

 

Below is a list of some of the most common lens characteristics.

 

Lens characteristics - spherical aberration

 

Example of spherical aberration => [f/4] [f/5.6] [f/8] [f/11]

Spherical aberration is the phenomenon that light rays passing through the edge of a spherical lens (meaning that the lens surface is evenly curved) are not focused at the same point as light passing through the center of the lens.

 

This effect is especially observed at large apertures where pictures can be a bit blurry, but stopping down will minimize the spherical aberration. To reduce this effect, lenses often have aspherical lens elements, which means that for these elements, the curve changes from center to edge.

 

An example of the influence of spherical aberration at larger apertures is shown on the right. In this case sharpness suffers visibly at f/4 compared to f/8 or f/11. Enlargements are at 100%.

Schematic depiction of spherical aberration

Simulation => [without] [with]

 

Lens characteristics - coma

 

Coma is related to spherical aberration, but in this case the rays enter the lens at an angle (oblique rays). The result is that points along the optical axis will be sharp (if we ignore any spherical aberration), but points toward the edges will have a comet shape. Stopping down can decrease coma.

Schematic depiction of coma

Simulation => [without] [with]

 

Some examples of coma. The first one shows a crop from the left part of a star picture, and shows how the stars are shaped somewhat like a comet. The pictures on the right show an extreme example of coma, and how stopping down minimizes coma. These are crops from the lower left corner.

A clear example of coma in a star photo

Example of coma at different apertures => [f/1.8] [f/2.2] [f/2.8] [f/4.0]

 

Lens characteristics - chromatic aberration

 

There are two main types of chromatic aberration:

• Longitudinal or axial chromatic aberration is caused by the effect that shorter wavelengths are focused closer to the lens than longer wavelengths. Together with spherical aberration, longitudinal chromatic aberration is one of the few on-axis aberrations in this list (meaning that it also occurs in the center of the picture). Longitudinal chromatic aberration is most obvious in areas with great contrast, such as dark objects against a light background, and the effect can be reduced by stopping down the lens.

Schematic depiction of longitudinal chromatic aberration

Simulation => [without] [with]

 

• Lateral or transverse chromatic aberration is caused by the effect that when light enters the lens at an angle, shorter wavelengths will focus at a longer distance from the optical axis than longer wavelengths. Like longitudinal chromatic aberration, lateral chromatic aberration is most obvious in areas with great contrast, such as dark objects against a light background. However, in this case the effect cannot be reduced by stopping down the lens.

Schematic depiction of lateral chromatic aberration

Simulation => [without] [with]

 

The picture below to the left is a nice example of lateral chromatic aberration, taken from the left upper corner of the original picture. The picture below to the right is an example of longitudinal chromatic aberration and shows how stopping down can decrease this (f/3.5 is wide open).

Example of lateral chromatic aberration

Example of longitudinal chromatic aberration => [f/3.5] [f/4.5] [f/5.6]

 

Many lenses have special constructions to minimize the effect of chromatic aberrations, such as achromats and apochromats:

• In achromats, red and blue light focus in the same plane while green focuses somewhat differently, thereby reducing primary chromatic aberration (chromatic aberration of the primary colors). But there is still secondary chromatic aberration left which can be seen as green and magenta edges (magenta = blue and red together).

• In apochromats, red, green and blue all focus in the same plane, giving very little chromatic aberrations.

 

However, as a result of correcting lenses for these two aberrations, a different chromatic aberration becomes more noticeable: spherochromatism. Spherochromatism is the result of lenses being corrected for in-focus areas, but not for out-of-focus areas. Like longitudinal chromatic aberration, spherochromatism is caused by the variation of focus distance with wavelength, and thus is also an on-axis aberration. Spherochromatism is a bigger problem in faster lenses, but it can be reduced by stopping down the lens.

The result of spherochromatism is green out of focus highlights in the background and magenta out of focus highlights in the foreground. Examples of this can be seen below. The second example is not as clear as the first one, but most out of focus branches have colored edges, where the ones behind the owl are green, and the ones in front of the owl are magenta.

Example of spherochromatism

Example of spherochromatism

 

Lens characteristics - field curvature

 

Field curvature is caused by the fact that oblique rays focus slightly in front of the sensor, yielding a curved image plane.

Schematic depiction of field curvature

Simulation => [without] [with (ex 1)] [with (ex 2)]

Example of field curvature => [whole picture] [center] [corner]

 

 

 

On the photo this will result in objects at the edges of the photo being blurred to a certain extent while objects in the center are sharp (example 1). Or, when the focus is on the edge of the picture, the rays going through the center will focus behind the sensor, resulting in the center of the picture being blurred (example 2).

 

On the right is a picture of a flat rock surface covered with lichen. The first picture shows the whole frame, while the other two show enlarged parts of the center and the lower right corner, respectively. It is clear that the part from the center is a lot sharper than the corner part, which is mostly due to curvature of field.

 


Lens characteristics - astigmatism

 

Astigmatism is the effect that a lens does not focus at the same point in the tangential/meridional (the wheel) and sagittal (the spokes) orientations.

This is never a problem in the center of the image, as both orientations focus at the same point along the optical axis, but towards the edges of the picture this will change the shape of details.

Example of tangential astigmatism

 

 

 

The exact shape of the details depends on how the details are blurred in both directions. If there is more blurring in the tangential direction than in the sagittal direction, then tangential astigmatism is observed. Similarly, if there is more blurring in the sagittal direction, then sagittal astigmatism is observed. Astigmatism can be decreased by stopping down.

 

An example of astigmatism is shown to the left, where the out of focus parts towards the edges of the picture are elongated outwards (tangential astigmatism).

 


Lens characteristics - distortion

 

There are a few types of distortion and the most common are shown below on the left. They are most obvious when straight lines are present in the picture.

Below on the right is an example of distortion, where the lens suffers from quite a bit of moustache distortion. In the corrected example all the distortion is removed. In nature photography, distortion is not that often a big issue, since there are not that many straight lines in nature. In the uncorrected example it is hardly noticeable that there is such a big amount of distortion, until you compare it to the corrected picture!




Different types of distortions =>
[normal rectilinear] [pincushion] [barrel] [moustache]

Example of distortion => [original] [corrected]

 

Lens characteristics - flare

 

When light from a bright light source (either inside or outside the frame) enters the lens, it can reflect between the different elements of the lens, which will form bright spots, called flares. When the light is reflected in such a way that it reaches the sensor as a haze, it is called veiling glare, which will lower the contrast of the scene. Filters in front of the lens are extra surfaces on which light can reflect, and they can therefore increase flaring.

The brightest flares become visible when looking through the viewfinder and pushing the depth of field preview button (if the camera has one). This stops down the aperture to the value you have chosen for the picture and this will usually reveal flares.

 

The picture below on the left shows a classic example of flare caused by the sun, whereas the one on the right shows a type of flare with a more peculiar shape. In the enlarged version it can be seen that these flares have a strange shape, and these types of flares can be found on the internet, where people claim them to be evidence of UFOs, which is pretty hilarious. I am quite sure I did not see anything flying around when I took this picture! Notice that these flares are caused by the lamps of the factory, and if you draw lines between the lamps and their corresponding flares, they all meet in the middle of the picture. Often when the source of the flare is in the picture, its corresponding flare is on the opposite side of the center of the picture (this rule only applies to flares that are caused by internal reflections in the lens, not to flares that are caused by, for example, dust on the lens).

Example of flares caused by the sun

Flares caused by factory lamps => [original] [enlarged] [with lines]

 

When the light source that causes flaring is outside the frame, it can be removed by simply blocking that light source. This can be done by using a proper lens hood, but this only works well for prime lenses, since for a zoom lens a hood can only be optimal at the widest focal length. Another way is using your hand or something else to block the light, but care has to be taken not to enter the frame with the "blocking device". I never carry lens hoods with me, so my hand is my trusted "blocking device". The examples below show how carefully using a hand can be very effective against flaring. In the first case on the left you can see a green regular flare and a massive amount of veiling glare, and both are removed by using my hand. In the second example on the right, the lighter circles that can be seen on the left side of the photo are caused by light reflected by dust particles on my lens (I know, shame on me for not cleaning my lens....). Furthermore, there is a large amount of veiling glare, but, once again, my hand comes to the rescue and gives me a clean picture.

Example of preventing flares => [unprotected] [sun blocked with my hand]

Example of preventing flares => [unprotected] [sun blocked with my hand]

 

It is of course difficult to block the light source when the light source is part of the composition. In the case of flares caused by dust on the lens, simply cleaning your lens does the job, as can be seen in the pictures below on the left.

For flares that are caused by internal reflections in the lens, things become a bit more complicated, but there is a workaround. The solution is to take two pictures, one normal and one where the light source is blocked. These two can then be combined afterwards on the computer to give a clean picture. In the example below to the right, several flares can be seen throughout the photo, caused by the sun at the top of the frame. By taking another picture while blocking the sun, and combining that one with the original, you get a flare-free photo.







Cleaning a lens will prevent flares => [dusty lens] [clean lens]

Removing flares with multiple exposures =>
[unprotected] [sun blocked with my hand] [combined]

 

Lens characteristics - vignetting

 

There are several types of vignetting, all caused differently. I will discuss four of them here:

• Optical vignetting is caused by the effect that the lens itself blocks part of the light traveling at an angle, which is demonstrated by the pictures below on the left, where the bright white area in the lens represents the entrance pupil. The wide open apertures show a significant decrease in area when seen from an angle instead of head-on, and this causes the vignetting. Because light travelling at an angle is blocked to a certain extent, a gradual darkening towards the edges of the picture is observed. The smaller apertures have the same size, whether they are seen from the optical axis or from an angle. As a result, this type of vignetting can be decreased by stopping down.

The example on the right demonstrates how stopping down decreases optical vignetting. Stopping down one or two stops usually suffices to suppress most of this type of vignetting.







Wide open => [head-on] [at an angle]
Stopped down => [head-on] [at an angle]

A large aperture will cause vignetting =>
[wide open (f/2.8)] [stopped down (f/4)]

 

An example of mechanical vignetting

• Mechanical vignetting is the simplest type of vignetting and it can be caused by a number of things, like when your lens hood is too long, or when you stack multiple filters on top of each other. This will cause them to enter the field of view, giving rise to dark corners in your picture. It is very easy to prevent this type by using the proper lens hood and not stacking too many filters, especially in the case of wide angle lenses. If you want to be sure, you can check it by just looking through your viewfinder (although keep in mind that many viewfinders don't show the full 100%) to see if you have some mechanical vignetting. On the right is an example, where the upper corners have turned black because of a filter blocking the way.

 

• Natural vignetting is caused by multiple factors which together add up to the cos^4 law, where it is the angle that the light makes relative to the sensor which matters. The first factor is the inverse square law, which says that light that has to travel farther decreases in intensity. In our case, the light travelling to the edge of the sensor travels a longer distance than light going to the center of the sensor. The loss of intensity this causes follows a cos^2 factor.

Second, light hitting a surface at an angle is spread out over a larger area than light hitting the surface straight on. This is the same principle by which an evening sun feels cooler than a midday sun: its light is spread out over a larger area. In our case, it is again the edges of the sensor which suffer from this, as they receive their light at an angle, which introduces another cos factor.

Third, a circle seen from an angle does not look like a circle but elliptical, and that same principle applies to our aperture. Light entering from an angle will not "see" the aperture as a circle but as an elliptical shape, which effectively reduces the area of the aperture and thereby blocks part of the light. So again it is the edge of the picture which is affected, with yet another cos factor.

These three factors combined give the cos^4 factor for natural vignetting, and there is no remedy for this type of vignetting (a small numeric sketch of this fall-off follows after the last bullet below).

 

• Pixel vignetting only applies to digital cameras, and it is in some way related to natural vignetting. It is caused by the fact that a sensor records light as slightly less bright when it hits the sensor at an angle, which it does towards the edges. This can to a certain extent be minimized by placing microlenses on top of the pixels.
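As mentioned above, here is a small Python sketch of the cos^4 law, assuming a simple symmetric lens focused at infinity and a full-frame sensor (real lens designs, such as retrofocus ones, deviate from this; the names are illustrative):

```python
import math

def corner_falloff_stops(focal_length_mm, half_diagonal_mm=21.6):
    """Light loss in stops at the corner of a full-frame sensor from the cos^4 law."""
    angle = math.atan(half_diagonal_mm / focal_length_mm)  # off-axis angle to the corner
    return -math.log(math.cos(angle) ** 4, 2)

print(round(corner_falloff_stops(24), 2))   # ~1.7 stops darker corners at 24 mm
print(round(corner_falloff_stops(100), 2))  # ~0.13 stops at 100 mm
```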

 

Lens characteristics - bokeh and cat's eye effect

 

Bokeh is a term used to describe the quality of the out of focus elements, but it is hard to define what is good bokeh and what is bad bokeh; it's mostly up to one's personal taste. The shape of out of focus objects is determined by the aperture, lens construction, lens aberrations and lens aberration corrections. When the aperture is not wide open, the out of focus elements will take the shape of the aperture, which depends on the number of blades used in the lens. The picture below to the left is an example where bokeh can be seen in the background.

 

The same principle that causes optical vignetting (see lenses/vignetting) is also the cause of the so-called cat's eye effect, which is seen at larger apertures. It's the effect that the bokeh on the optical axis is round, whereas it has the shape of a cat's eye toward the edges. This is caused by the cat's eye shape of the entrance pupil when seen from an angle.

 

Below to the right is an example where the cat's eye effect is very obvious around the edges. This also shows how stopping down decreases this effect, as it's almost gone at f/2.8, and completely gone at f/4. Note that, except for f/1.8, which is wide open, all apertures give heptagonal shapes, which result from the seven blades of the aperture.

Example of bokeh in the background







Cat's eye bokeh at large apertures => [f/1.8] [f/2.8] [f/4.0]

 

Lens characteristics - sensor reflection

 

Part of the light reaching the sensor is reflected back towards the lens, and when that light in turn is reflected back to the sensor by the rear element of the lens, it will show up on the photo as a distinctive pattern of colored dots (which is in fact the pattern of the sensor pixels). In normal scenes, this is nothing to worry about, but when taking a photo of, for example, the sun, this might become a problem. To prevent this from happening, manufacturers coat the rear elements of lenses with an anti-reflective coating. But sometimes the sensor reflection will show up anyway, and unfortunately, it does not look very pretty in a picture. Below are two examples where you can see colored dots scattered around in the sun star.

Example of sensor reflection







Example of sensor reflection

 

Vibration reduction (VR)

 

Some lenses are equipped with something called Vibration Reduction (VR) or Image Stabilization (IS), which is a mechanism that reduces blur caused by shaking the camera. I'm sure many love it, but I love my tripod and see no need for Vibration Reduction for my photography. So I was a bit disappointed when Nikon released the 16-35mm with Vibration Reduction, but I like that lens a lot so I purchased it anyway. However, I soon found out that there was a problem with the lens when doing night photography.

The examples below to the left show a red glow and some other red structures as well, and I can say it took me a long time to realize that the Vibration Reduction was the culprit. Below to the right are some tests I did at home with a lens cap on the lens to test the Vibration Reduction, and indeed the same structures appear on the photo. The structure changes and becomes less noticeable when changing the focal length to 35 mm, but the red glow is still there. Changing to a good old nifty-fifty 50 mm lens removed almost all of the red glow. The only glow visible in that image is caused by my sensor getting really warm after several long exposures (~7 minutes each) at room temperature.

After scouring the internet looking for confirmation, I read that there is an infrared sensor in the lens which is used to control the Vibration Reduction, and that this is what causes the red glow. And in case you wondered, turning off the Vibration Reduction does not make a difference; it was off during all the photos below. As far as I know, this problem is not consistent; some people seem to have no problems with their lenses.

So there you go, one more reason for me to dislike Vibration Reduction. But I still really like this lens, I just have to use another wide-angle for night photography.

Red glow caused by the VR => [example 1] [example 2]

Red glow caused by the VR => [16-35 mm lens @ 16 mm] [16-35 mm lens @ 35 mm] [50 mm lens]

 

Tilt-shift lenses

 

A tilt-shift lens gives a lot of possibilities to play with perspective and sharpness, which can be really useful in landscape photography. Unfortunately, they are also really expensive.... Since I did not want to empty my bank account before having experimented with the tilt and shift effects, I made one myself by using a Mamiya 645 45 mm lens. More information and examples of that set-up can be found here.

 

Tilt-shift lenses - shift

 

By employing the shift function of your lens, you can play around with the perspective. In the example below on the left, the subject is a piece with a red part and a blue part, and it is important to notice that both parts are equally long. If you can take the picture directly in front of the subject, then that will give the first "normal" example, where the red and blue parts end up equally large on the photo as well.

But if you can't stand in front of the subject, then you can rotate your camera, which gives the second "rotated" example. This also captures the same scene, but now the red and blue parts will not end up equally large on the photo! The blue part will be larger than the red part, and it is this effect that causes perspective distortion in pictures that are taken at an angle. (It's the same effect that causes the edges of photos taken with a wide angle lens to be distorted.) This effect can be prevented by shifting your lens. So the camera stays at the same place and angle as in the "normal" setup, but the lens is shifted to the left until the same composition is obtained, and this will give a photo that is free from perspective distortion.

 

On the right is an example of this effect. The normal photo is taken with the camera pointing upwards to get the desired composition. But this results in a slight keystone effect, making the trees appear to bend inwards at the top of the photo. To prevent this, the camera was placed horizontally and the lens was instead shifted upwards to achieve the same composition as in the first photo. Now you can see that the trees no longer appear to bend inwards.









Schematic depiction of a shifted lens => [normal] [rotated] [shifted]

Example of a shifted lens => [normal] [lens shifted upwards]

 

Example of a shifted lens => [normal] [rotated] [shifted]

Because the shifted images are free from distortion, they can be a good way of making photos that are to be stitched together. The only problem is that it will be limited to a stitch of about three pictures (shifted to the left, middle, shifted to the right), since there is a limit to how far a lens can be shifted. So it will not work for a very wide panorama or something similar.

 

To the left is an example of some pieces of wood on the ground which I photographed from a relatively close distance, and which I wanted to stitch together. The first picture is taken directly above the wood, with the camera parallel to the ground. Then I took a second picture with the camera rotated to the left, where you can see that the overlap between the two photos is pretty bad. The perspective is very different from the first picture, which will give the stitching software a hard time. However, with the camera still parallel to the ground and the lens shifted to the left, the overlap between the photos is very good. This will make stitching these photos into a panorama a lot easier.

 

Removing reflections => [normal] [lens shifted to the right and setup moved to the left]

 

 

In architecture photography, the shift function of tilt-shift lenses is often used to take pictures of shiny buildings, where you don't want your reflection to show up on the photo.

On the right is an example of something reflective that I photographed. In the first picture, which is taken with the camera directly in front of it, you can clearly see my reflection. So I moved my setup to the left, and shifted the lens to the right, which gave a similar picture, but without my reflection.

 


Tilt-shift lenses - tilt

 

By tilting a lens, you can play with sharpness, and it is possible to both increase and decrease the depth of field. The scheme on the left shows how the depth of field depends on the angle of tilt. In each case, the bright green line is the plane of sharp focus, and an approximation of the depth of field is represented by the two outer, dimmer green lines. Focus in these examples is not on infinity, which is why the distance between sensor (the black bar) and lens is slightly larger than the focal length f.

This principle of the plane of sharp focus changing because of the tilting of the lens is called the Scheimpflug principle. The degree to which it is tilted is determined by two intersection lines, called the Scheimpflug line and the Hinge line, which are shown in the picture below on the right. By connecting these two lines, you get the plane of sharp focus (the green line).

Since the plane of sharp focus gets tilted in the case of a tilted lens, the depth of field also gets tilted. Moreover, with a tilted lens, the depth of field is no longer bounded by two parallel planes, but has the shape of a wedge, with its apex at the Hinge line.

Schematic depiction of a tilted lens => [no tilt] [small angle] [medium angle] [large angle]

Schematic depiction of a tilted lens

Smaller angle => [focus on nearby] [focus on infinity]
Larger angle => [focus on nearby] [focus on infinity]

 

Focussing with a tilted lens is a bit trickier than usual. First of all, tilt-shift lenses are manual focus, and second, focussing them is an iterative process. As can be seen below on the left, near focus is determined only by the angle of the lens (since that determines the distance J), and not by focussing the lens. So the first thing to do is to find the right angle to get the foreground in focus. After that, you focus the lens with the focus ring to achieve far focus. But, as can be seen from the difference between close focus and focus on infinity in the scheme below on the left, focussing also changes the position of the Scheimpflug line, which means that the angle of the lens has to be adjusted slightly to correct the near focus. But that also changes the far focus, so that has to be slightly corrected as well. After a couple of iterations, everything should be sharp.

 

As is the case with regular lenses, the depth of field can be regulated with the aperture, as is shown in the scheme below on the left.

Below on the right is a scheme depicting what the influence of the focal length is, while maintaining the same level of tilt.

Focus on nearby => [larger aperture] [smaller aperture]
Focus on infinity => [larger aperture] [smaller aperture]

The influence of focal length =>
[shorter focal length] [medium focal length] [longer focal length]

 

These are some real life examples of using the tilt function, all taken at f/2.8. The first shows how you can achieve a very narrow depth of field by having the lens tilted. On the other hand, the second example shows how you can increase depth of field by using tilt. The camera was facing downwards at an angle of approximately 45 degrees and the lens was tilted upwards.

Example of a tilted lens => [normal] [tilted]

 

The Hinge line to camera distance (J) and the angle of the plane of sharpest focus (ψ) can be calculated with the formulas below, where θ is the angle of tilt.

Schematic depiction of the various symbols

$$J = {\frac{f}{sin(θ)}}$$

 

  J = Hinge line to camera distance

  f = focal length

  θ = angle of tilt

$$ψ = {arctan\left(\frac{d}{J}\right)}$$

 

  ψ = angle of plane of focus

  d = focus distance

  J = Hinge line to camera distance

 

Focal length (mm):

θ (°):

Focus distance* (m):

J (m):

ψ (°):

Calculate!

* Focus distance is the distance between sensor and subject, should at least be 4 × focal length.
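A small Python sketch of these two tilt formulas (illustrative names; the 45 mm focal length and 5° tilt are example values chosen only for illustration):

```python
import math

def hinge_distance_m(focal_length_mm, tilt_deg):
    """J = f / sin(theta), converted to metres."""
    return focal_length_mm / math.sin(math.radians(tilt_deg)) / 1000

def plane_of_focus_angle_deg(focus_distance_m, hinge_m):
    """psi = arctan(d / J)."""
    return math.degrees(math.atan(focus_distance_m / hinge_m))

J = hinge_distance_m(45, 5)                        # 45 mm lens tilted by 5 degrees
print(round(J, 2))                                 # J ~0.52 m
print(round(plane_of_focus_angle_deg(2.0, J), 1))  # psi ~75.5 degrees for focus at 2 m
```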

 
