
Sun and moon photography

 

An average-sized full moon

The size of the sun and the moon on the sensor (both span roughly 0.5° in the sky) can be estimated by dividing the focal length by 110, which gives the diameter on the sensor in mm. Do keep in mind that especially the moon changes in apparent size because of its elliptical orbit around the Earth (which is why the calculator below gives a range instead of a single value). The difference between the moon's largest and smallest apparent size (at perigee and apogee respectively) is about 14%, whereas the sun varies considerably less.
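The rule of thumb above is easy to put into a few lines of code. Below is a minimal sketch in Python (not the calculator used on this page); the 0.52° angular diameter is an assumed average value for both the sun and the moon.

# Size of the sun/moon on the sensor for a given focal length.
# The factor 110 in the rule of thumb is simply 1 / tan(0.52 degrees).
import math

def disk_size_on_sensor(focal_length_mm, angular_diameter_deg=0.52):
    """Diameter of the sun or moon on the sensor, in mm."""
    return 2 * focal_length_mm * math.tan(math.radians(angular_diameter_deg) / 2)

# The example from the text: 210 mm lens, 16 mm sensor height (crop factor 1.5)
size_mm = disk_size_on_sensor(210)    # ~1.9 mm, close to 210 / 110
percent = 100 * size_mm / 16          # ~12 % of the sensor height
print(f"{size_mm:.2f} mm, {percent:.0f} % of sensor height")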

 

The picture to the right was taken at 210 mm with a crop factor of 1.5 (giving a sensor height of 16 mm). The moon is approximately 310 pixels in diameter, while the height of the photo is 2592 pixels, so the moon takes up about 12% of the frame height, which corresponds with the outcome of the calculator. Looks like the moon was average in size on that day.

 

[Interactive calculator]
Inputs: Focal length (mm); Sensor height (mm)
Outputs (each given as a range): Size of the moon on sensor (mm); Size of the moon (% of sensor height); Size of the sun on sensor (mm); Size of the sun (% of sensor height)

 

Sun and moon photography - solar filter

 

Since the sun has such a high intensity, you'll need special filters to prevent overexposure if you want to capture the details in the sun. I use a Baader solar filter, a very thin foil which blocks 99.999% of the light. The type I ordered comes as A4-sized sheets, and I simply taped a piece of it between two bits of plastic and attached a Cokin P system adapter ring to that. This way, the filter can be screwed on top of your lens, since you don't want to run the risk of the filter falling off while looking through the viewfinder (which might permanently damage your eyes). The filter construction can be seen below on the left. It's not pretty, but it works perfectly! On the right is a photo of a solar eclipse which I took with this filter (taken at f/7.1, ISO 400 and 1/1600 s).

A homemade solar filter

A photo taken with the homemade solar filter of a partially eclipsed sun

 

Sun and moon photography - solar latitude

 

This calculator gives the solar latitude in degrees for a given time and date. The solar latitude (also known as the solar altitude or elevation) is the angle which the sun makes with the horizon; a negative value means that the sun is below the horizon. A rough sketch of the underlying calculation is shown after the calculator.

[Interactive calculator]
Inputs: Year/month/day (today); Latitude (°)*; Longitude (°)**; Time zone***; Time (hour : min); Daylight saving time
Outputs: Solar latitude (°) at the given time; Maximum solar latitude on this day (°) and the time at which it is reached; Solar latitude at winter solstice (°); Solar latitude at summer solstice (°)

* North of the equator is positive, south of the equator is negative.
** East of Greenwich is positive, west of Greenwich is negative.
*** Fill in the time zone relative to UTC (overview). Examples: Hawaii is -10, the UK is 0, CET is +1, New Zealand is +12.
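For those who want to do this calculation themselves, below is a minimal sketch in Python of the kind of computation behind such a calculator (a low-precision approximation, good to roughly a degree; it is not the code used on this page). It works in UTC, so the time zone and daylight saving time corrections are left out.

import math
from datetime import datetime, timezone

def solar_elevation(dt_utc, lat_deg, lon_deg):
    """Angle of the sun above the horizon in degrees (negative = below the horizon)."""
    n = dt_utc.timetuple().tm_yday
    hours = dt_utc.hour + dt_utc.minute / 60
    # fractional year in radians
    g = 2 * math.pi / 365 * (n - 1 + (hours - 12) / 24)
    # solar declination (Spencer's approximation)
    decl = (0.006918 - 0.399912 * math.cos(g) + 0.070257 * math.sin(g)
            - 0.006758 * math.cos(2 * g) + 0.000907 * math.sin(2 * g)
            - 0.002697 * math.cos(3 * g) + 0.00148 * math.sin(3 * g))
    # equation of time in minutes
    eqtime = 229.18 * (0.000075 + 0.001868 * math.cos(g) - 0.032077 * math.sin(g)
                       - 0.014615 * math.cos(2 * g) - 0.040849 * math.sin(2 * g))
    # true solar time in minutes, then hour angle in radians
    tst = hours * 60 + eqtime + 4 * lon_deg
    ha = math.radians(tst / 4 - 180)
    lat = math.radians(lat_deg)
    elevation = math.asin(math.sin(lat) * math.sin(decl)
                          + math.cos(lat) * math.cos(decl) * math.cos(ha))
    return math.degrees(elevation)

# Example: noon UTC on the June solstice at 52° N, 5° E gives roughly 61°
print(solar_elevation(datetime(2024, 6, 21, 12, 0, tzinfo=timezone.utc), 52.0, 5.0))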

 

Night photography - stars

 

The shape of stars at different exposure times => [8 seconds] [15 seconds] [30 seconds]

It is hardly noticeable to the unaided eye, but since the Earth rotates, the stars appear to move across the sky when observed from Earth. The "rule of 600" is often used to estimate the longest exposure at which stars do not end up as stripes in the picture. It works by dividing 600 by the focal length (35 mm equivalent for cameras with smaller sensors), which gives the time in seconds. This means that a 50 mm lens allows an exposure time of about 12 seconds or less to capture the stars more or less as points. However, this is a really optimistic way of determining the maximum exposure, and I often find the stars too stripy when using this rule. Even the "rule of 500", which is often cited as being more accurate, is often a bit too optimistic.

That's why I based the calculator below on the circle of confusion; that way you know exactly which exposure time will render the stars as dots. And by prolonging the exposure time by a known amount, it is also possible to make a pretty accurate guess about how much striping that will cause. If we, for example, double the exposure time given by the calculator, then we know that the fastest stars will travel two times the circle of confusion.

 

Look at the examples on the right, which show the stars of Orion. These were taken with a 16 mm lens, which means that the 'rules of 500/600' would predict a maximum shutter speed of 31 seconds and 38 seconds respectively. The crops are all at 100%, so I'm going to be really picky here and set my circle of confusion for this example at the pixel pitch of my camera (0.0085 mm), which gives a shutter speed of 7.3 seconds according to the calculator below.

 

Star trails revolving around the celestial pole

8 seconds (close enough to 7.3 seconds) clearly gives the stars as dots, whereas they start to be egg shaped at 15 seconds. That's two times the calculated exposure time, but the result is still perfectly acceptable. But at 30 seconds they really are visible as stripes, even though 30 seconds is below the values predicted by the "rule of 500/600"!

 

A good thing to know is that, as soon as stars start to move in your picture, you've reached a point where longer exposures no longer give brighter stars! This is because their light will reach a different part of the sensor once they have moved. So beyond that point, longer exposures will only brighten the background, and only a larger aperture or a higher ISO value will increase the brightness of the stars.

 

The amount of striping depends not only on exposure time and focal length, but also on the stars themselves. Stars near the celestial poles don't move very fast, while the stars at the celestial equator move the fastest, which is clearly visible in the example on the left (the green in the picture is the result of some auroral activity). The stars at the bottom move fast (longest streaks) while the stars at the top hardly move; that is the celestial north pole, and it is where Polaris (the North Star) is located. During long exposures, the stars are captured circling around the celestial poles.

 

This calculator gives the maximum exposure time at which stars are still captured as dots, based on the circle of confusion. As stated above, stars move at different speeds depending on their declination (the angular distance of a star from the celestial equator). 0 degrees is for stars on the celestial equator, where stars move the fastest, and 90 degrees is for the celestial poles, where stars hardly move. If you want to be sure, just leave it at 0; that way you calculate the value for the fastest stars. A small sketch of the underlying formula is shown after the calculator.

[Interactive calculator]
Inputs: Focal length (mm); Circle of confusion (mm); Declination* (°)
Output: Maximum exposure (s)

*Declination should be between 0 and 90
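For reference, below is a minimal sketch in Python of the calculation this calculator appears to perform (it reproduces the 7.3-second Orion example above): a star at declination d drifts at the sidereal rate of about 15 arcseconds per second times cos(d), and the drift on the sensor is the focal length times that rate (in radians) times the exposure time.

import math

SIDEREAL_RATE = 2 * math.pi / 86164   # Earth's rotation, rad/s (~15 arcsec/s)

def max_exposure_s(focal_length_mm, coc_mm, declination_deg=0):
    """Longest exposure (s) before a star's trail exceeds the circle of confusion."""
    drift = SIDEREAL_RATE * math.cos(math.radians(declination_deg))  # rad/s on the sky
    return coc_mm / (focal_length_mm * drift)                        # mm / (mm/s) = s

# The Orion example above: 16 mm lens, circle of confusion = 0.0085 mm pixel pitch
print(f"{max_exposure_s(16, 0.0085):.1f} s")   # ~7.3 s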

 

Night photography - star trails

 

In the analog days, shooting star trails "simply" meant opening your shutter for a long period, but with digital sensors that causes some problems. A sensor gets warm while collecting light, which increases noise and amplifier glow, so star trails (or any long exposure) are best shot on cold nights, when the low temperature keeps the sensor cool and the noise down. In the ten-minute exposure on the left, the amplifier glow can be seen as the purple artifacts in the upper corners (in all fairness, this was done with a relatively old camera, and modern cameras are a lot better at this).

To prevent this from happening, multiple shots can be taken with shorter exposure times (typically between thirty seconds and several minutes per exposure), and these are then combined afterwards on the computer. This method also makes it possible to shoot star trails in areas with more light pollution. See for example the picture on the right, which compares a single ten-minute exposure with a picture where twenty photos are combined into one. These twenty photos each had a thirty-second exposure time, which adds up to a total of approximately ten minutes as well.

Lots of amplifier glow in the upper corners

Single image vs combining images => [single exposure] [twenty exposures combined]

 

Combining the pictures into one can be done in two different ways. The easiest one is to add them as separate layers in Photoshop and set the blend mode to "lighten" for each layer except the first one. This way, Photoshop selects the lightest pixel at each position from all layers and combines those into the final output (which also decreases noise!). There are even several free programs (Startrails and StarStaX, to name a few) which will do exactly this for you. A small sketch of the same idea outside Photoshop is shown after the table below.

Layer nr:      Blend mode:
4              lighten
3              lighten
2              lighten
1              lighten
background     normal
→ combine all layers to a single picture
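For anyone who prefers to do this outside Photoshop, below is a minimal sketch in Python of what the "lighten" stacking boils down to: for every pixel position, keep the brightest value found in any frame. The file names are placeholders, and 8-bit images are assumed.

import glob
import numpy as np
from PIL import Image

frames = sorted(glob.glob("startrail_*.tif"))       # placeholder file names
stack = np.array(Image.open(frames[0]))
for name in frames[1:]:
    frame = np.array(Image.open(name))
    stack = np.maximum(stack, frame)                # "lighten" = per-pixel maximum

Image.fromarray(stack).save("startrail_lighten.tif")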

 

Different blending methods => [lighten method] [lighten/screen method]

 

 

This method works well with slow-moving stars, but as soon as the stars move faster, a problem arises, which can be seen in the examples on the right (crops are at 100%): the trails are not continuous lines, but fragmented! To circumvent this, the pictures need to be blended in a more complex way, making use of two different blending modes. It is a bit of a tedious process, especially when working with lots of photos. Furthermore, you lose some intensity of the stars, but that can be fixed with some post-processing if desired. But, in the end, this gives continuous trails, which is what matters most (for me at least).

 

Since the screen blending mode brightens the picture, it is important to first decrease the exposure of all pictures by about one stop. If you don't do this, the picture will turn out way too light.

The next step is to duplicate all layers except the first and the last one and blend the layers as shown in the table below. Once you've obtained all these combined layers, you can then combine them in the same way as described above (a rough sketch of the whole procedure is shown after the table).

 

Layer nr:     Blending mode:                               Combined layer:     Blending mode:
4             screen      → combine to single layer →      3 copy + 4          lighten
3 copy        normal
3             screen      → combine to single layer →      2 copy + 3          lighten
2 copy        normal
2             screen      → combine to single layer →      1 copy + 2          lighten
1 copy        normal
1             screen      → combine to single layer →      background + 1      normal
background    normal
→ the combined layers on the right are then combined to a single picture, just like in the table higher up
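Below is a rough sketch in Python of the two-step blend from the table, for those who want to automate it instead of doing it by hand in Photoshop. It darkens every frame by roughly one stop, "screens" each frame with the previous one (which bridges the gap between two consecutive trail segments), and then combines all the pairs with "lighten". The file names are placeholders, and 8-bit images are assumed.

import glob
import numpy as np
from PIL import Image

def load(name):
    # load as 0..1 floats and darken by roughly one stop
    return np.array(Image.open(name)).astype(np.float32) / 255.0 * 0.5

def screen(a, b):
    # Photoshop's "screen" blend mode
    return 1.0 - (1.0 - a) * (1.0 - b)

frames = sorted(glob.glob("startrail_*.tif"))        # placeholder file names
previous = load(frames[0])
result = previous
for name in frames[1:]:
    current = load(name)
    pair = screen(previous, current)                 # e.g. "1 copy" + "2" screened together
    result = np.maximum(result, pair)                # "lighten" the pairs into one image
    previous = current

Image.fromarray((result * 255).astype(np.uint8)).save("startrail_continuous.tif")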

 

It's pretty obvious I guess, but make sure your tripod is set up really well for star trails. Below on the left is a 100% crop which I took while convinced that my setup was steady as a rock, but the resulting picture proves I was once again wrong.... The star trails curve at the top end, which means that my camera slowly moved for about five to seven minutes before it eventually became stable. I guess I failed to tighten one of the legs of my tripod.

 

Another enemy of star trails is condensation on your lens. There are ways to prevent it, which mostly involve heating your lens in one way or another. I've never tried that myself though; most of the time there is at least some wind, which reduces the risk of condensation.

Below on the right is an example where I tried to make a star trail photo which included the Milky Way, consisting of six four-minute exposures. There is already a small amount of condensation in the first picture, but the sixth is worthless since all detail is lost! It is still possible to make a star trail photo from this, but when looking closely it gives the effect that you can see in the detailed example, which is at 50%: the stars look more like comets than like star trails....







Curved star trails due to a failing tripod set-up (I lost the original file of this photo, which is why
the image quality is pretty crappy)

Condensation ruining a star trail photo =>
[first photo] [sixth photo] [combined] [detailed]

 

Night photography - directions for star trail photos

 

The shape of star trails in the final picture depends on the latitude you are at, and on the direction in which you are looking. Below are some schematic examples of how they will approximately end up (the zenith is the point in the sky which is directly overhead).

 

For an observer at the equator: [schematic diagrams for North, Northeast, East, Southeast, South, Southwest, West, Northwest and the zenith]

For an observer at ~30 degrees latitude north: [schematic diagrams for the same eight directions and the zenith (facing north)]

For an observer at ~30 degrees latitude south: [schematic diagrams for the same eight directions and the zenith (facing south)]

For an observer on the poles: [schematic diagrams for the same eight directions and the zenith]

 

Night photography - photographing stars with a tracker

 

If you don't want to get star trails, then one option is to take many short exposures and align and combine them afterwards on the computer. But that eats up a lot of space on your computer, and it can be a lot of work to align them. The other option is to let the camera follow the stars as they travel across the sky. In that case the land will of course move in your photo, so if you want to include both the sky and the land in the photo, you need to take separate photos for them and combine them afterwards.

Tracking the stars can be done by using an equatorial mount, but these things don't come cheap, so I built my own motorized so-called Scotch mount (also known as a barn door tracker), which can be seen below on the left (more pictures and information about building it here). This does the same thing, but of course it's not as stable and easy to use. It does work pretty well though, and it's fun to build! The large bent rod to the left regulates the rotation of the camera, and the smaller rod to the right is there to make it easier to set the hinge at the right angle.

The principle is that as long as the hinge is aimed at the celestial pole and the motor runs at the right speed, the camera will move at the same pace as the stars, and you can aim your camera anywhere you want, giving pinpoint stars. The example below on the right was taken at 200 mm with a shutter speed of 2 minutes, and the difference is pretty clear.
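To give an idea of the numbers involved, here is a small sketch in Python of the drive-rate calculation for a simple tangent-type barn door tracker. The dimensions are made up for the example; they are not the measurements of the mount in the photo.

import math

SIDEREAL_RATE = 2 * math.pi / 86164   # rad/s: one revolution per sidereal day

def motor_rpm(hinge_to_rod_mm, thread_pitch_mm):
    """Approximate motor speed needed so the door opens at the sidereal rate."""
    rod_speed = hinge_to_rod_mm * SIDEREAL_RATE    # mm/s the drive rod must advance
    return rod_speed / thread_pitch_mm * 60        # screw revolutions per minute

# Example: drive rod 228.5 mm from the hinge, M6 threaded rod (1.0 mm pitch)
print(f"{motor_rpm(228.5, 1.0):.2f} rpm")          # ~1 rpm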

My homemade star tracker







Example of using a star tracker => [tracker off] [tracker on]

 

Night photography - removing airplanes

 

Removing airplanes => [original] [dark line] [result after combining]

Man, I really hate seeing airplanes in my night pictures! Every time I notice an airplane flying through my picture while I am taking a night photo, I raise my fist and curse it. But in the end there is nothing you can do about them; airplanes are almost everywhere on Earth and it is impossible to plan around them.... So the only options are accepting them or removing them. I for sure can't accept them, because I think they seriously degrade any night picture, so I remove them.

 

If the picture is composed of several photos (like in the case of a star trail photo), then you can either remove the airplanes from every single photo that you took, or you can do it afterwards on the completed picture. I usually do it on the completed picture.

On the right is an example of removing an airplane, with the pictures at 100%. I draw a black line with a thickness of 1 pixel along the airplane trail (more often than not the airplane did not fly along a straight line, and I need several lines, as in this case). With a relatively strongly colored background, for example an aurora, a slightly colored line instead of a pure black one works better. So if the background is green(ish), like in this case, then choosing a very dark green for the line usually works better. Then I apply a Gaussian blur of ~0.2-1 px to this line and adjust its opacity until the airplane is (almost) gone. Sometimes some parts need a bit more opacity than others, but in general this is how I do it, and it has been working pretty well so far.
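Below is a rough sketch in Python of this retouching step, for those who want to script it; it mimics what I describe doing by hand in Photoshop. The file name, the line coordinates, the blur radius and the 70% opacity are all made-up example values.

from PIL import Image, ImageDraw, ImageFilter

img = Image.open("night_photo.tif").convert("RGB")           # placeholder file name

# 1-pixel line along the airplane trail, in a very dark green (aurora background)
line = [(1200, 300), (1450, 520)]                            # example coordinates
overlay = Image.new("RGB", img.size, (0, 0, 0))
ImageDraw.Draw(overlay).line(line, fill=(0, 12, 0), width=1)
overlay = overlay.filter(ImageFilter.GaussianBlur(radius=0.6))

# mask with the same line and blur, scaled down to ~70 % opacity
mask = Image.new("L", img.size, 0)
ImageDraw.Draw(mask).line(line, fill=255, width=1)
mask = mask.filter(ImageFilter.GaussianBlur(radius=0.6)).point(lambda v: int(v * 0.7))

Image.composite(overlay, img, mask).save("night_photo_no_airplane.tif")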

 

Night photography - meteors

 

Besides airplanes, there are other objects that will leave their marks on star photos, and meteors are one of them. Meteors occur when meteoroids (debris from outer space) enter our atmosphere and heat up due to friction with the atmosphere, emitting light as they do so. Most do not survive this process, but some do and eventually land on Earth, at which point they are called meteorites. Below are two examples of meteors in photos, where the first one is an Orionid. The second one was a larger meteoroid, giving a much brighter result.

An Orionid in the sky

A meteor

 

Night photography - meteor showers

 

A special type of meteor display is the meteor shower, which is caused by comets, which are made up of ice, smaller rocks and sand. During its flight through space, the heat of the sun evaporates some of this ice, leaving the rocks and sand behind in the orbit of the comet. The closer the comet is to the sun, the more intense this evaporation will be.

Correcting meteor showers => [corrected] [uncorrected]

When the Earth travels through this trail of rocks and sand, they enter the atmosphere and burn up just like any other meteoroid. The frequency of meteoroids is just a lot higher, and they all appear to radiate from the same spot in the night sky. This radiant point is what gives each meteor shower its name. For example, the Orionids are called that because they radiate from a point in the constellation of Orion.

 

Since it takes the Earth a year to orbit the sun, meteor showers occur annually around the same time of the year, when the Earth is in the same spot in space. Intensities do change from year to year though, depending on the varying density of meteoroids in the comet's trail.

 

On the left is an example of a photo of a meteor shower, in this case the Geminids, which occur in mid-December. To get a picture like this, you put your camera with a wide-angle lens on a sturdy tripod and let it take pictures continuously for a given period of time (these were the brightest meteors over a period of 1.5 hours). You can then look up the pictures with meteors in them and combine the meteors into a single photo. However, since the sky rotates during the night, these combined meteors will most likely not seem to radiate from a single point, since that point moves across the sky over time. So to get the meteors to radiate from a single point, you need to correct for this and align each meteor to the right point in the sky.
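A minimal sketch in Python of that correction: rotate each frame containing a meteor around the celestial pole by the angle the sky has turned since a reference frame, and only then blend the meteors together. The pixel position of the pole, the file names and the timestamps are made-up examples, and the sign of the rotation depends on the hemisphere and on how the frame is oriented (flip it if the meteors diverge instead of converge).

from datetime import datetime
from PIL import Image

SIDEREAL_DEG_PER_SEC = 360 / 86164         # how fast the sky rotates

pole_xy = (1830, 410)                       # celestial pole in the frame (example)
reference_time = datetime(2023, 12, 14, 22, 0)

frames = {                                  # placeholder file names and capture times
    "geminid_01.tif": datetime(2023, 12, 14, 22, 17),
    "geminid_02.tif": datetime(2023, 12, 14, 23, 5),
}

for name, taken in frames.items():
    elapsed = (taken - reference_time).total_seconds()
    angle = elapsed * SIDEREAL_DEG_PER_SEC  # degrees the sky turned since the reference
    Image.open(name).rotate(angle, center=pole_xy).save("aligned_" + name)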

 

 

Night photography - Iridium flares

 

An Iridium flare

Iridium flares are reflections of sunlight off one of the satellites of the Iridium system (which is why they are called Iridium flares). Since the positions of these satellites are known, it is possible to predict when and where Iridium flares will occur (there are websites online where this information can be found, like heavens-above). These Iridium flares are always oriented in a north-south direction, due to the orbits of these satellites. To the right is an example of an Iridium flare (the photo is greenish due to some auroral activity).

Night photography - viewfinder cover

 

Example of covering the viewfinder => [viewfinder unprotected] [viewfinder protected]

During long exposures, there is the possibility that light leaks in via the viewfinder, and since night photography often means long exposures and relatively dark pictures, this might ruin your picture. The simple solution is to cover the viewfinder, and some cameras are actually equipped with a viewfinder cover. On the left is a test I did on my camera, where I darkened the room, covered the lens with the lens cap and a black shirt, and exposed at ISO 800 for a long time while shining a light directly at the viewfinder. Then, I repeated the test with the viewfinder covered. Clearly, light finds its way to the sensor when the viewfinder is unprotected! Now, this is of course a situation that will most likely not happen in real life, but it's better to be safe than sorry, so always protect your viewfinder under these conditions! A streetlight or accidentally pointing your headlamp at the camera might be enough to cause artifacts!

Night photography - dark frames

 

Example of dark frame subtraction => [original] [dark frame] [dark frame subtracted]

As explained under bits & bytes/noise, there is a type of noise called fixed pattern noise. This is noise which always appears at the same spot and is pretty stable when conditions are kept the same. The best example is hot pixels, which can be seen in the picture on the right (shown at 100%). These are bound to appear in your picture when making night photos, since both long exposures and higher ISO values enhance hot pixels.

 

Fortunately, there are several ways to remove them. You can clone them out, but that is a tedious thing to do, so a better option is the use of dark frames. Dark frames are pictures which are taken under the exact same conditions as your regular picture, but this time with the lens cap on, so the sensor will not receive any light. Therefore, the only things appearing in this photo are the hot pixels, and by subtracting this dark frame from the real picture, you get a cleaner picture. Most of the time this does not remove the hot pixels completely (look at the corrected example on the right and you will still see the hot pixels, but to a lesser extent), but the result is acceptable!
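Doing the subtraction yourself on the computer only takes a few lines; below is a minimal sketch in Python (8-bit images and placeholder file names assumed; this is just the idea, not what the camera does internally).

import numpy as np
from PIL import Image

light = np.array(Image.open("night_photo.tif")).astype(np.int16)   # the real picture
dark = np.array(Image.open("dark_frame.tif")).astype(np.int16)     # same settings, lens cap on

clean = np.clip(light - dark, 0, 255).astype(np.uint8)             # hot pixels mostly cancel out
Image.fromarray(clean).save("night_photo_dark_frame_subtracted.tif")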

When dark frame subtraction fails =>
[original] [dark frame] [dark frame subtracted]

 

 

 

 

 

However, in lighter parts of the picture this process can sometimes give some "inverse hot pixels", so it is always good to check for those in your end result. On the left is an example, at 100%. After subtraction of the dark frame, a darker spot has taken the place of the nearly invisible original hot pixel, which can hardly be called an improvement...

 

Dotted star trails when long exposure noise reduction is activated

 

 

If you don't want to do the dark frame subtraction yourself on the computer, there is a function in most DSLRs called "long exposure noise reduction" (LENR). This means that, after you have taken a picture, the camera automatically takes another picture under the same conditions, but without opening the shutter. It then subtracts the second picture from the first one. This does mean that taking a picture takes twice as long as the shutter speed, so after a 10-second exposure it will take about 20 seconds before you can take another picture.

 

LENR does have two disadvantages. First, as stated above, the subtraction of a dark frame can give artifacts, and once your camera has done the dark frame subtraction, there is no way back. Second, making a blended star trail photo with this function activated will give results like the picture on the right, so don't forget to turn it off when planning a blended star trail photo! Luckily for me, I only made this mistake once! It's amazing how easily these kinds of things stick in your memory after discovering you've been freezing your ass off at -18°C for over two hours, only to have dotted star trails as the result (this was taken during my first winter living in Sweden, so I did not yet have the right type of clothing for these kinds of temperatures)....

 

 

Night photography - stacking

 

When taking pictures of specific objects in the sky (deep space objects, planets, comets, etc.) you'll need a lens with a long focal length. But that means you can only take pictures with relatively short exposure times, due to the rotation of the sky as explained higher up on this page. And using short exposure times with higher ISO values gives noisy pictures, which is not what you want.

 

Stacking photos to eliminate noise => [five photos stacked] [single photo]

One solution is of course to use the aforementioned star tracker, but if you don't have one or can't take it with you, then stacking photos can be the solution to minimize noise. As explained here, a lot of noise can be removed by stacking photos, since the end product will be the average of the stacked photos, eliminating random noise.

 

To use this for distant objects in the night sky, you put your camera on a sturdy tripod and have it take a sequence of photos with an appropriate shutter speed that keeps the stars as points. Of course, the sky in the photos will shift a little bit from photo to photo since the sky rotates, so you have to align them prior to stacking. When using long lenses, this aligning is often very easy, since there is hardly any distortion, as there would have been with, for example, a wide-angle lens.

Once you have all the photos aligned, you set the opacity for each layer as 100/"nr of the layer" in percent: layer 1: 100% opacity, layer 2: 50% opacity, layer 3: 33% opacity, layer 4: 25% opacity, etc.
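Below is a minimal sketch in Python of why that opacity trick works: compositing layer n at 100/n % opacity on top of the result of the layers below it is the same as taking the plain average of all the aligned frames, and averaging is what removes the random noise. File names are placeholders.

import glob
import numpy as np
from PIL import Image

frames = sorted(glob.glob("aligned_comet_*.tif"))     # placeholder file names

result = np.array(Image.open(frames[0])).astype(np.float64)       # layer 1 at 100 %
for n, name in enumerate(frames[1:], start=2):
    layer = np.array(Image.open(name)).astype(np.float64)
    opacity = 1.0 / n                                  # 50 %, 33 %, 25 %, ...
    result = opacity * layer + (1 - opacity) * result  # running average of the frames

Image.fromarray(np.round(result).astype(np.uint8)).save("comet_stacked.tif")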

 

On the left is an example of this principle. The greenish object is comet Wirtanen, photographed with a 200 mm lens (and cropped for this example). I used a shutter speed that was a bit too long, since the stars are a bit elongated, but this still perfectly shows the principle of stacking. This was only a combination of five photos, and the more photos you stack, the more you reduce the noise.

 

Night photography - light pollution

 

Light pollution is the mostly orange glow that can turn up in night photos, and it is caused by light from buildings and street lights being reflected by particles in the sky. It sure is a mood killer in night photos for me, and, unfortunately, it is pretty hard to get rid of it completely; even cities at distances of 30-40 km can still ruin photos. Especially when the air is humid, light pollution can travel long distances, even though the sky might look clear with stars and all. And when the sky is cloudy, things get even worse. The solution is to shoot on nights with low humidity, and preferably not in the direction of populated areas. A very good tool to check the light pollution at specific locations is the website lightpollutionmap.info. Do remember that this website does not take clouds and humidity into consideration.

Several sources of light pollution in the distance

Light pollution in the distance

 

Aurora photography

 

An approximation of how concentric rings might look when photographing the aurora with a filter on your camera

When taking pictures of the aurora, it is important to take off any filters you may have left on your lens. This is because the two surfaces of the filter can act as an interferometer for the near-monochromatic emission of the aurora, which results in concentric rings in the middle of your picture. This effect occurs any time a filter is used (also with daytime pictures), but normally there are so many different colors (= different wavelengths) that their corresponding interference patterns cancel each other out. The aurora, however, emits light at just a few very specific wavelengths, which is why this interference only shows up in aurora photos.

I've never experienced this effect myself as I read about it before I had a chance of seeing the aurora, so I made sure not to have any filter on my lens, but on the right is what it approximately looks like (this is not a real life example, but some Photoshop magic applied to a real aurora picture).

 

As far as the exposure is concerned, it all depends on the aurora itself, which can be very weak or very strong. I usually start with 15 or 20 seconds at ISO 1600 or 3200 and somewhere between f/2.8 and f/4, and judge on the LCD how to alter the exposure. This is trickier than it sounds, because the LCD will look really bright when it's dark outside, and the photos will look a lot brighter than they would in daylight. It wouldn't be the first time that my aurora photos look great on the LCD, but turn out underexposed once I view them on the computer. The trick is to look at the histogram, and not at the photo itself.

 

Aurora photography - when to go outside

 

Auroras can occur at any time, but the activity of the sun goes through a cycle of about 11 years, and auroral activity is higher when the sun is at its activity peak. Especially when the sun emits a coronal mass ejection (CME) towards Earth, auroras can be expected within 1-2 days.

 

For reasons that are not entirely understood, auroral activity is also higher around the equinoxes (the two times a year when the sun is exactly above the equator, which is in March and September). Even small amounts of charged particles will cause auroras during these periods.

 

It is impossible to predict the aurora exactly, but several sites on the internet give a forecast, like the Geophysical Institute at the University of Alaska. This is only an indication of the chance of seeing an aurora. I have spent nights outdoors watching the aurora when no high auroral activity was forecast, and experienced the other way around more often than I would like to admit. But quite often the forecast does a good job, although you still need luck and a lot of patience. I don't have a lot of patience, so I just spend a lot of nights outdoors, thereby increasing my chances of seeing the aurora!

 

Aurora photography - what is the aurora?

 

The aurora is caused by charged particles from the sun colliding with molecules and atoms in our atmosphere. The more particles that are expelled by the sun, the more intense the aurora will be. The auroras are concentrated in rings around the magnetic poles, which roughly coincide with the polar circles. This means that the further away from these rings the observer is, the lower on the horizon the aurora will be seen. It occurs in both the northern hemisphere (where it is called aurora borealis) and the southern hemisphere (where it is called aurora australis). The aurora can have all kinds of shapes and can change in a matter of seconds, but it can also be motionless for a really long time (minutes to hours).

 

The different colors in the aurora are caused by different processes at different heights:

•  at ~100 km: Nitrogen molecules give red/purple/blue light.

•  at 100-180 km: Oxygen atoms give the green/yellowish light which is most commonly seen.

•  180 km and higher: Oxygen atoms give the rarer red aurora.

•  at very high altitudes: Molecular nitrogen gives purple/blue light.

 

When excited particles collide with other particles, they lose their energy and will not emit any light. The higher up you go in the atmosphere, the lower the density and the greater the distance between particles, which decreases the chance of them colliding with each other.

When nitrogen is excited, it emits light almost instantaneously (in a matter of microseconds), which means the chance of an excited particle colliding with another particle before emitting light is small. Oxygen, however, takes a relatively long time to emit light, about 0.7 s for the green light. For that reason, the green aurora does not occur below 100 km; the density there is simply too high, and the excited particles collide with other particles before they have a chance to emit light.

For the red oxygen aurora, the delay is even longer, just under two minutes, which means that it only occurs even higher up in the atmosphere. Most auroras take the shape of the field lines of Earth's magnetic field, but the red emission takes so long that the excited particles have already moved around by the time they finally emit their light, which is why this red aurora has a faint and gradual appearance.

 

Aurora photography - eyes vs camera

 

Every now and then I get the comment that aurora photos are "fake" because things do not look like the photos in real life, which I frankly think is bogus. The fact is that these colors are really there and our cameras capture them. There is nothing fake about them; our eyes are just very bad in low light conditions and can't really be compared to a camera, but that doesn't affect the reality of the aurora being there.

The difference between eyes and a camera => [camera] [eyes]

The way I see it, photography isn't just about showing things exactly the way we see them, it's also about using the camera as a tool to show other less visible things that are equally interesting, otherwise for example infrared photography and macro photography should also be classified as fake.

 

There is a reason why we don't see colors during the night: the light-sensitive cells in our eyes can be divided into two groups, cones and rods. Cones detect color, but are less sensitive to light. Rods don't detect color, but are about 1000 times more sensitive than cones. That's why we hardly (or not at all) see colors under very low light conditions, like at night. For this reason, auroras are perceived as greyish when the intensity is not that high (our eyes are more sensitive to green than to other colors, so the green aurora will often look more like greenish grey instead). Higher-intensity auroras can be seen in muted color though. Fortunately, cameras are just as color sensitive at night as during the day, and they capture all the colors of the aurora, even at low intensities.

 

On the left is an approximation of the difference between how a camera registered it (unedited RAW photo) and how I saw it. Of course, the intensity of the aurora on the picture depends very much on the chosen camera settings. The photo is a long exposure, whereas our eyes are not able to gather light over a long period of time to convert that into the equivalent of a "long exposure".

 

Airglow

 

Airglow as red and green bands

Airglow is a process which is pretty similar to the aurora, but it has a different cause and, unlike the aurora, can be seen from any place on Earth. I mistakenly regarded it as noise from my camera for quite some years before finding out about airglow.

 

It is caused by ultraviolet radiation from the sun breaking down molecules and ionizing atoms in the atmosphere during the day, and, at night, the products recombine again, emitting light as they do so, which we see as airglow. The colors are mainly green and red and often a banded pattern can be seen. This pattern is caused by gravity waves and it often moves slowly in time.

 

The strength of the airglow varies quite a bit. Sometimes my night photos are nearly black, and sometimes there is a lot of color to be seen. The photo on the right of the Milky Way is the strongest display I have seen so far.

 

Airglow is weaker than the aurora, and most of the time its colors cannot be seen with the naked eye. Also, you'll capture it best during a moonless night and with higher sensitivities on your camera (this photo was taken at 15 s, f/4 and ISO 6400).

 

 

 

 

Noctilucent clouds

 

Noctilucent clouds (NLCs) occur at a height of about 80 km and consist of very tiny ice crystals. In order for NLCs to form, the temperature has to drop below -120 °C, which only happens for some weeks around the summer solstice. Because of their extreme height, NLCs still receive light from the sun after sunset and before sunrise. This is why they look like white shapes against the dark blue background.

Noctilucent clouds in the sky

In this case, the noctilucent clouds are the bright clouds and the dark clouds are "normal" clouds

 

NLCs are only visible during the summer months (May-August in the northern hemisphere, November-February in the southern hemisphere), and only at latitudes of about 50-65°. At latitudes above 65°, the sun does not get far enough below the horizon, and the noctilucent clouds will not show against the relatively light background. At latitudes below 50°, the air is not cold enough for NLCs to form.

 

NLCs become visible at about the same time as the brightest stars and are best visible when the sun is between 6° and 16° below the horizon. The calculator below estimates the times when the sun is between these values, and whether there is a chance of seeing NLCs. However, the chance it gives should only be used as a rough indication, since the limits aren't that sharp of course. A rough sketch of the underlying estimate is shown after the calculator.

[Interactive calculator]
Inputs: Year/month/day (today); Latitude (°)*; Longitude (°)**; Time zone***; Daylight saving time
Outputs: the times at which the sun sets below and rises back above 6° and 16° under the horizon, and the chance of seeing NLCs

* North of the equator is positive, south of the equator is negative.
** East of Greenwich is positive, west of Greenwich is negative.
*** Fill in the time zone relative to UTC (overview). Examples: Hawaii is -10, the UK is 0, CET is +1, New Zealand is +12.
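Below is a rough sketch in Python of what the calculator estimates. It reuses the solar_elevation() sketch from the solar latitude section above (so that function has to be defined first), works in UTC, and simply steps through the night in one-minute steps to find when the sun sits between 6° and 16° below the horizon.

from datetime import datetime, timedelta, timezone

def nlc_window(year, month, day, lat_deg, lon_deg):
    """UTC minutes during this night when the solar elevation is between -16° and -6°."""
    start = datetime(year, month, day, 12, tzinfo=timezone.utc)   # scan from noon to noon
    window = []
    for minute in range(24 * 60):
        t = start + timedelta(minutes=minute)
        # solar_elevation() is the sketch from the solar latitude section above
        if -16 <= solar_elevation(t, lat_deg, lon_deg) <= -6:
            window.append(t)
    return window

# Example: a June night at 58° N, 12° E
times = nlc_window(2024, 6, 20, 58.0, 12.0)
if times:
    print("Possible NLC window (UTC):", times[0].strftime("%H:%M"), "-", times[-1].strftime("%H:%M"))
else:
    print("The sun never gets into the 6-16 degree range below the horizon this night.")

Note that at lower latitudes this window will normally split into an evening and a morning part, with the sun more than 16° below the horizon in between.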

 
