A question and not a criticism

Ken Rennie

Well-Known Member
I notice in many of the astro shots posted that the brighter stars often appear as disks and not as pinpoints. The images themselves are usually stunning. Is the disk shape just a form of flare? It doesn't look like lack of focus. I realise that a great deal of manipulation of files is going on, and this may just be a consequence of that, or it may be something that I am in total ignorance of. Ken
 

JimFox

Moderator
Staff member
Hey Ken, any way you can grab a sample image and crop in so we can see what you are talking about? You have me curious now. It doesn't sound like you are talking about slight star movement.
 

Jameel Hyder

Moderator
Staff member
If this is around the outer edges of the frame, it's most likely coma (comatic aberration). Most wide lenses (those not specifically designed for astro) suffer from it. Rokinons, surprisingly, control it a lot better than others.
 

Ken Rennie

Well-Known Member
[attached image]

This is a lovely image and is not the best example, but it is close at hand. Bang in the middle, 1/3 of the way up: not streaking or coma. I will see if I can find better candidates.
 

Kyle Jones

Moderator
I'm pretty sure it is Jupiter.

Unless you are using a tracker, there is always some streaking, which turns things into ovals (even if only slightly). I'd blame lens aberrations for the rest.
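
To put a very rough number on that streaking, here is a back-of-envelope sketch in Python. The sensor assumptions (a 24 MP full frame, so roughly 6 µm pixels) and the helper function are mine, purely for illustration:

```python
import math

# Earth's rotation: ~15 arcsec of sky per second at the celestial equator.
SIDEREAL_RATE_ARCSEC_PER_SEC = 15.04

def drift_pixels(exposure_s, focal_length_mm, pixel_pitch_um=6.0, declination_deg=0.0):
    """Approximate star drift in pixels for an untracked exposure."""
    # Plate scale: arcseconds of sky covered by one pixel (206265 arcsec per radian).
    plate_scale = 206.265 * pixel_pitch_um / focal_length_mm
    drift_arcsec = SIDEREAL_RATE_ARCSEC_PER_SEC * exposure_s * math.cos(math.radians(declination_deg))
    return drift_arcsec / plate_scale

# A 20 s untracked exposure at 24 mm: roughly 6 px of trailing near the celestial equator.
print(f"{drift_pixels(20, 24):.1f} px")
```

So even a fairly short exposure smears a star across several pixels before any lens aberrations get involved.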
 

JimFox

Moderator
Staff member
And of course, we are shooting fast and wide open or close to it, so any lens issues will show up even more.
 

JimFox

Moderator
Staff member
Ken, as a side note there is very little manipulation of the files going on. The sky in the photo you copied had no manipulation whatsoever. It was simply processed in ACR and then had a light pass of Topaz Clarity run on it to add a slight bit more contrast and detail. That's it.
 

Ken Rennie

Well-Known Member
We may have to disagree about Topaz Clarity being no manipulation. I find it gives images a strong signature, certainly in landscapes, although I don't know what it is like with astro shots. I don't usually take astro shots, but since I now have a Sony 16-35 f2.8 and live close to dark sky areas I will give it a go this winter. I will continue to look for images that show stars as disks, although I will say that if this is a planet then it still looks too large to my eyes. Ken
 

JimFox

Moderator
Staff member
Definitely point it out in the future if you see something like you are describing. I wouldn't be surprised if it's mostly planets, but it would be a great exercise.

I totally agree that Topaz Clarity modifies the image; I prefer the word modify to manipulate. To me, manipulate implies an unnatural modification. It certainly sounds harsher, and as someone who tries to keep as much reality in my photo processing as possible, to manipulate a photo sounds like someone needs to slap me on my hands. :)

As to the amount of Topaz Clarity I used, I would say the change to the image is very slight and probably no different than if I had used the Clarity slider in ACR, which is why I said there was very little manipulation, not no manipulation. And again, I would not describe post processing as manipulation, though perhaps it's a cultural use of words?
 

Ken Rennie

Well-Known Member
Jim, I think that this is another case of two countries separated by a common language. Manipulate the electorate = bad; manipulate your spine as a chiropractor = good. Manipulate an image is, for me, neutral, but implies a greater degree of change than modify. I have just "modified" a Milky Way image from New Zealand using only ACR, but it required large movements of sliders that in a daylit landscape or portrait would produce a total mess, so for me this would be manipulation, but still not negative. Ken
 

JimFox

Moderator
Staff member
Yeah, I figured it was going to be like that. Thanks Ken for explaining that.
 

Mike Lewis

Staff member
OK, so here is my take on Ken's excellent question.

Of course, stars are so far away that, as Ken says, they are optical point sources. But when one starts to take longer exposures, shooting with a digital sensor, and dealing with the seeing effects of imaging through the atmosphere, those point sources start to bloom out. Some of it is just the integration effect of the scintillation caused by the seeing (the turbulence of the atmosphere that the telescope has to look through from the surface of the Earth), and I think some of it is a saturation effect of the digital sensor in the presence of a strong signal. If you take a sufficiently short exposure, you can make almost all the visible stars in the image look much more like point sources. But those will then be only the brightest stars, and a number of the dimmer stars will not be visible.

Lastly, for at least many of the images of deep sky objects, these images have been stretched (had the intensity curve manipulated) to make what was linear data very non-linear. That is to say, the very dim areas have had their exposure values increased a lot more than the very brightest values. Even with this non-linear stretch, it is usually hard not to saturate a number of the brightest stars.

That is why some of the more complex techniques actually separate the starfield from the nebulosity, to allow for a more aggressive stretching of the nebulosity and a less aggressive stretch of the stars. This tends to minimize the star sizes compared to more conventional processing, and also has the generally pleasant side effect of showing more of the natural star colors, instead of having many of the stars be a blown-out white. But of course, the technique is generally a PITA to complete, and it introduces its own set of issues that must be dealt with. I have only just started to use this technique on some of my images. I like the results, but not the amount of time it takes to get an acceptable result. And at least so far for me, it does not seem to work well with every set of data.
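
To make the "stretch" idea concrete, here is a minimal sketch in Python/numpy. It is not my actual pipeline (that lives in dedicated astro software), and the strength values are invented, but it shows how faint signal gets boosted far more than bright signal, which is exactly what bloats and saturates the brightest stars:

```python
import numpy as np

def asinh_stretch(linear, strength=100.0):
    """Map linear data in [0, 1] through a non-linear asinh curve."""
    return np.arcsinh(strength * linear) / np.arcsinh(strength)

faint_nebula, bright_star = 0.001, 0.5
print(asinh_stretch(faint_nebula))  # ~0.019: the faint signal is boosted ~19x
print(asinh_stretch(bright_star))   # ~0.87: the star is boosted <2x, yet is nearly saturated
```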

For illustration purposes, here are a few representative images:

Here is a smaller version of my M81-M82 image in its final form. It was processed in a conventional fashion, with no separation between the stars and any other objects (in this case galaxies). I like it, but it does exhibit the effect Ken asked about: many of the brighter stars look larger (bloated) compared to the dimmer ones:

[attached image: M81-M82, final version]


Here is what one is up against with an image like this. This is what the luminance data for this image looked like after all the luminance frames had the dark signal and noise subtracted and had been registered and stacked. In this case, this was the result of stacking and combining frames totaling 2.5 hours' worth of integration time on this target. Should be amazing, right?

[attached image: M81-M82 luminance master, linear/unstretched]
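
As a quick aside, "dark-subtracted, registered and stacked" sounds exotic, but the core operation is simple. Here is a bare-bones sketch, assuming the frames are already registered and loaded as numpy arrays; real tools (PixInsight, DeepSkyStacker, etc.) add rejection algorithms, weighting and much more:

```python
import numpy as np

def calibrate_and_stack(light_frames, dark_frames):
    """Subtract a master dark from each light frame, then median-combine."""
    master_dark = np.median(np.stack(dark_frames), axis=0)
    calibrated = [light - master_dark for light in light_frames]
    # A median combine also rejects outliers such as hot pixels and satellite trails.
    return np.median(np.stack(calibrated), axis=0)
```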


Yikes! Where are the galaxies? Well, they are in there, but they are much fainter than the stars (which is why many of the stars can be seen by the naked eye but the galaxies in this image cannot). This is the linear, un-stretched data: not very impressive. So to illustrate the point, here is the exact same frame with a brute-force stretch applied to exaggerate the star bloating (and, in this case, partially blow out some galaxy details). This is what is required to show the galaxies in the image. As one can see, the stars are also unavoidably brighter (and unfortunately larger looking in many cases) too.

[attached image: the same luminance master after a brute-force stretch]


This frame also shows a bunch of other defects that have yet to be fixed (mis-registration at the edges, gradients across the frame, vignetting, etc.). An even worse problem is that this bloating is not always uniform color-wise, and so in the case of my system I seem to get lots of bloated blue stars. Many of the blue stars are very hot and therefore also pretty bright, but I think there are also some other effects from the filters and the atmosphere that make the blue bloat more. In any case, one can see the problem in the data even at this early processing step.

Lastly, here is a reduced-size image of a different target, M16, where the stars were removed from the nebulosity early on. The nebula was then processed without stars until the end, when a separate image of the stars, exposed for a shorter duration and stretched less aggressively, was added back in. In this case the stars are much smaller, with very few looking bloated, and, as a side effect, more colorful as well. This might have produced stars that almost look 'too small', but after 20+ hours of processing I was happy enough with this result not to go back far enough in the processing to mess with that :)

[attached image: M16, processed with stars separated from the nebulosity]
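
Conceptually, that separation workflow boils down to something like the sketch below. The crude brightness threshold is only a stand-in for a real star-removal tool such as StarNet++, and the stretch strengths are invented; the point is simply "two different stretches, then recombine":

```python
import numpy as np

def split_stretch_recombine(img, star_threshold=0.2):
    """Stretch starless and stars-only layers separately, then add them back together."""
    star_mask = img > star_threshold                  # crude stand-in for StarNet++
    starless = np.where(star_mask, 0.0, img)
    stars_only = np.where(star_mask, img, 0.0)

    nebula = np.arcsinh(200 * starless) / np.arcsinh(200)  # aggressive stretch
    stars = np.arcsinh(20 * stars_only) / np.arcsinh(20)   # gentle stretch
    return np.clip(nebula + stars, 0.0, 1.0)
```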



So hopefully that is not just a long-winded answer, but also a somewhat informative one.

ML
 

Ken Rennie

Well-Known Member
Thank you for the answer Mike. From two readings, I take it that the apparent size increase of bright stars is due to many things, but firstly and mainly atmospheric conditions causing the light to spread; Rayleigh scattering would occur even in a pristine atmosphere. You mention the saturation effect on sensors, and I had initially thought that this might be the primary cause. It does happen: as we reach full well capacity, adjacent pixels start to see an increase. But the nature of all camera sensors is such that this would appear as a vertical or horizontal streak, not a circular bloom.

One other cause that you fail to mention, or I fail to spot in your excellent explanation, is veiling flare in the lens: each individual star causes its own little pool of very dim brightness around it, which, although near invisible, will be magnified many fold by the non-linear way we then treat the data. However, it is the way that we treat the data that leads to the apparent increase in size.

I have been doing back-of-envelope calculations of the apparent size of planets (in pixels) using a 24mm lens and a 24MP full-frame camera. The largest, in both apparent and actual size, is Jupiter, which at its closest is between 30 and 50 arcseconds in diameter; at 24mm it would be approx 1.5px wide. I know that Venus can be up to 66 arcseconds across, but it is a very thin crescent at this time, and the images all show planets as spheres, so I am assuming that it is not Venus; but even if it were, it would still only be approx 2px wide. Ken
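
For anyone who wants to check that arithmetic, here is the same back-of-envelope sum as a few lines of Python. It assumes a 6 µm pixel pitch (6000 px across a 36 mm full-frame sensor); with that assumption Jupiter lands around 1px and Venus around 1.3px, the same ballpark: a planet should only ever span a pixel or two at 24mm.

```python
def planet_width_px(diameter_arcsec, focal_length_mm, pixel_pitch_um=6.0):
    """Apparent planet diameter in pixels for a given lens and sensor."""
    # Plate scale: arcseconds of sky per pixel (206265 arcsec per radian).
    plate_scale = 206.265 * pixel_pitch_um / focal_length_mm
    return diameter_arcsec / plate_scale

print(f"Jupiter: {planet_width_px(50, 24):.1f} px")  # ~1.0 px at 24 mm
print(f"Venus:   {planet_width_px(66, 24):.1f} px")  # ~1.3 px at 24 mm
```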
 