
NormalizeScaleGradient Script


John Murphy


9 hours ago, zerolatitude said:

I did normalise with PI, but used the LN files generated by NSG.

 

I was under the impression that LN does not affect the registered files, but only generates additional data?

 

Okay, will try again w/o PI LN and see.

 

Thanks

This one is with LN only in NSG. Still a band. Any suggestions?

 

Thanks

NSG_SingleLN_Band.png


7 minutes ago, zerolatitude said:

This one is with LN only in NSG. Still a band. Any suggestions?

 

Thanks

NSG_SingleLN_Band.png

OK, that does look like a bug. Can you reproduce it with a small dataset? Please send me a link to the registered images, let me know what reference image you were using, and I will try to reproduce it.

Thanks for reporting it, John Murphy


12 hours ago, John Murphy said:

OK, that does look like a bug. Can you reproduce it with a small dataset? Please send me a link to the registered images, let me know what reference image you were using, and I will try to reproduce it.

Thanks for reporting it, John Murphy

Okay let me see. This is 191 files, so will try with a subset.


On 5/6/2022 at 5:13 AM, zerolatitude said:

Okay let me see. This is 191 files, so will try with a subset.

Let's take this offline. I have sent you a message.

Regards, John Murphy


On 5/6/2022 at 5:13 AM, zerolatitude said:

Okay let me see. This is 191 files, so will try with a subset.

I have just sent you a link to NSG 2.1.7 and NSGXnml 1.0.3

Regards, John Murphy


Hi

 

I am having a problem since installing the purchased version. Everything appears to be installed correctly, and I used the NSGXnml process to register the license. I set up the NSG batch process by adding all my registered subframes, selected the reference image correctly, set up the output directory, and selected the process icon for image integration. However, when I run it I get the following error saying it can't find the reference image. I can't seem to resolve this, and I can confirm that the reference file is located in the correct directory.

thanks for the help

Al

 

NSGXnml: Global context

Creating xnml for E:/Pixinisight Working/Normalized scale/xnml_w067_2022-05-06_21-39-35__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0010_c_d_r_nsg.xisf

**ERROR** NSGXnml cannot find reference file:

E:/Pixinisight Working/Aligned - Normalized Scale/2022-05-09_00-07-22__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0059_c_d_r.xisf

23.005 ms

Failed to normalize '2022-05-06_21-39-35__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0010_c_d_r' [1/385]

 

Normalized [0/385] total time 12 s

 

Summary

Using noise estimates from FITS header: NOISE00 NOISE01 NOISE02

[1] Failed to create .xnml file [E:/Pixinisight Working/Normalized scale/2022-05-06_21-39-35__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0010_c_d_r_nsg_w067.xnml]

Aborted ...

======= NSG HAS FINISHED =======


5 hours ago, John Murphy said:

I have just sent you a link to NSG 2.1.7 and NSGXnml 1.0.3

Regards, John Murphy

Thanks. Will run and get back with results


6 hours ago, Al Ros said:

Hi

 

I am having a problem since installing the purchased version. Everything appears to be installed correctly, and I used the NSGXnml process to register the license. I set up the NSG batch process by adding all my registered subframes, selected the reference image correctly, set up the output directory, and selected the process icon for image integration. However, when I run it I get the following error saying it can't find the reference image. I can't seem to resolve this, and I can confirm that the reference file is located in the correct directory.

thanks for the help

Al

 

NSGXnml: Global context

Creating xnml for E:/Pixinisight Working/Normalized scale/xnml_w067_2022-05-06_21-39-35__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0010_c_d_r_nsg.xisf

**ERROR** NSGXnml cannot find reference file:

E:/Pixinisight Working/Aligned - Normalized Scale/2022-05-09_00-07-22__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0059_c_d_r.xisf

23.005 ms

Failed to normalize '2022-05-06_21-39-35__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0010_c_d_r' [1/385]

 

Normalized [0/385] total time 12 s

 

Summary

Using noise estimates from FITS header: NOISE00 NOISE01 NOISE02

[1] Failed to create .xnml file [E:/Pixinisight Working/Normalized scale/2022-05-06_21-39-35__M 51_9.25 EdgeHD 0.7 Reducer_LIGHT_101_10_-10.00_90.00s_0010_c_d_r_nsg_w067.xnml]

Aborted ...

======= NSG HAS FINISHED =======

Hi Al

Yes, this is a known problem that occurs when filenames contain many spaces. Two consecutive spaces are a particular problem. I have already fixed this, and will be releasing the new version within a few days. You will then be able to run this dataset without any problems.

Thanks for your patience, the new version should be worth the wait!

Regards, John Murphy


1 hour ago, John Murphy said:

Hi Al

Yes, this is a known problem that occurs when filenames contain many spaces. Two consecutive spaces are a particular problem. I have already fixed this, and will be releasing the new version within a few days. You will then be able to run this dataset without any problems.

Thanks for your patience, the new version should be worth the wait!

Regards, John Murphy

Thanks John!


Hi 

 

From above, I changed my naming convention to eliminate spaces. One more question, please.

 

My normal process has been to initially blink the subframes to remove obvious bad ones (movement-blurred images etc.). Then, after debayering, I have been using SubframeSelector to eliminate further frames based on eccentricity and FWHM (from light clouds etc.), and thereafter I rank the subframes and weight them.

 

With NSG, I am blinking the images as before, but I am not sure if I can skip the SubframeSelector step or should continue with it to eliminate frames that are above an eccentricity / FWHM level. Does NSG consider these quality elements, or is it mostly concerned with background and gradients only?

 

thanks very much

Al


Hi Al,

John is the authority, but I have been using NSG and following its development, as well as the new PI weightings of PSF Power and PSF Signal. NSG will give you weighting very accurately. This is in addition to correcting each sub's background to have the same gradient as your selected reference image (the one with the simplest gradient).

 

For me, I use NSG's NWEIGHT for the weighting algorithm, and cancel those near the bottom if they are significantly lower than the best NWEIGHTs. In SubframeSelector you can see both PSF Power and PSF Signal and sort them. Then you can compare them to NWEIGHT; they will follow each other quite well. You can then look at some different subs (Blink) and see if the results visually match what you see on screen.

 

Roger

 


On 5/13/2022 at 12:48 PM, Al Ros said:

My normal process has been to initially blink the subframes to remove obvious bad ones (movement-blurred images etc.). Then, after debayering, I have been using SubframeSelector to eliminate further frames based on eccentricity and FWHM (from light clouds etc.), and thereafter I rank the subframes and weight them.

 

With NSG, I am blinking the images as before, but I am not sure if I can skip the SubframeSelector step or should continue with it to eliminate frames that are above an eccentricity / FWHM level. Does NSG consider these quality elements, or is it mostly concerned with background and gradients only?

 

thanks very much

Al

What follows is my personal view on this tricky area.

 

For me, astrophotography is all about revealing the maximum detail within very faint objects. However, resolution is not just dependent on sharpness. For faint objects, it is more dependent on the signal to noise ratio.

1920px-Photon-noise.jpg

This image is from the Wikipedia Shot Noise page: https://en.wikipedia.org/wiki/Shot_noise

 

(1) The signal to noise ratio of these images improves from top left to bottom right. Each image has 10 x the number of detected photons as the previous. You can clearly see that no matter how sharp the images on the top row are, they cannot match the detail or resolution of the images on the bottom row.

(2) Stars are very unforgiving of any aberration, tracking error or seeing. This is because they are bright point sources of light. During an exposure, seeing or tracking errors may cause the star to look elongated or enlarged. However, it is usually an incredibly tiny proportion of the light that falls in the wrong place. It is only easily visible because we stretch the images. The astronomical objects are very faint in comparison, so the tiny proportion of their light in the wrong place will usually be completely undetectable.

(3) During ImageIntegration, data rejection will reject detected light that only appears in a small proportion of the images. This does a surprisingly good job removing the seeing and tracking errors contained within the images that you would otherwise have discarded. 

 

Putting (1), (2) and (3) together, it can be seen that by rejecting images that contain stars with higher FWHM, you will only make a very small difference to the star resolution, but if you reject too many images it will hurt the resolution of the galaxy or nebula. If your image is all about the galaxy / nebula, then I feel that this would be a shame. I think a better strategy would be to fix the stars in post processing.
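To put rough numbers on point (1): for shot-noise-limited data the SNR grows as the square root of the number of detected photons, so each 10x step in that Wikipedia figure improves the SNR by a factor of about 3.2. A tiny illustrative calculation (just the physics, not NSG code):

    import math

    # Shot noise: for N detected photons the noise is sqrt(N), so SNR = N / sqrt(N) = sqrt(N).
    # Each 10x increase in photons therefore improves the SNR by sqrt(10) ~ 3.16x.
    for n_photons in [10, 100, 1000, 10000, 100000, 1000000]:
        print(f"{n_photons:>8} photons/pixel -> SNR ~ {math.sqrt(n_photons):.0f}")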

 

If you want to preserve the maximum signal to noise ratio, the weights must be based on the physics of shot noise; the signal to noise ratio squared. If the weights deviate from this, those images will actually damage the final SNR. Not only must you use the correct formula, you must also have an accurate measure of the (brightness) scale factor. Measuring this scale factor within astronomical images is harder than you might expect. In my experience, the only accurate way to do this is to use differential stellar photometry between a reference and the target images. This is the method that NormalizeScaleGradient uses. I am not going to just assert that the measured scale is accurate. I will share with you the test results:

[Attached graph: brightness scale factor measured by NSG vs APT]

This graph shows the brightness scale measured by NSG (the blue line) and the scale measured by APT, a photometry program that is used by professional astronomers (the red line). I used APT to measure 15 stars in the unregistered reference and in each target image. I plotted the average, and calculated the standard deviation to determine the accuracy. APT provided an impressive accuracy of +/- 0.3%. NSG's accuracy was +/- 0.5%. This is still impressively accurate, and more accurate than needed. This test data was taken in a range of conditions. Images from 1 to 8 suffered from extra light pollution and a strong gradient; both due to either twilight or moonlight. Note that light pollution does not change the air transparency! The measured scale factor stays approximately constant. At the end of the night, light cloud formed, causing the transparency to drop. This shows that NSG is very sensitive to changes in transparency. You should also notice that there is very little scatter between consecutive images. High scatter would indicate an inaccurate, inconsistent scale factor.
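For anyone curious what differential stellar photometry means in practice, here is a toy sketch of the general idea (the star fluxes are invented, and this is not NSG's actual implementation, which does its own star detection, aperture photometry and outlier handling): measure the same stars in the reference and the target frame, then take a robust average of the flux ratios.

    import numpy as np

    # Toy example: background-subtracted fluxes of the same stars measured in the
    # reference frame and in one target frame (values invented for illustration).
    ref_star_flux    = np.array([15400.0,  9800.0, 22100.0, 5300.0, 12700.0])
    target_star_flux = np.array([13100.0,  8300.0, 18800.0, 4500.0, 10800.0])

    ratios = target_star_flux / ref_star_flux
    scale = np.median(ratios)                         # robust average of the flux ratios
    accuracy = np.std(ratios) / np.sqrt(len(ratios))  # standard error of the estimate

    print(f"measured scale factor ~ {scale:.3f} +/- {accuracy:.3f}")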

 

[Attached graph: NSG calculated weight]

This graph shows the NSG calculated weight. When the images contained a bright gradient due to moonlight or twilight, this increased the noise in the images. When this is combined with the scale factor, it produces the correct weight. Notice that my test data spans many different situations - from high light pollution, strong gradients and light clouds. In all cases, NSG has produced the correct result with very high accuracy.
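To make the relationship explicit: the weight follows the shot-noise physics mentioned above, i.e. it is proportional to the signal to noise ratio squared, where the signal is scaled by the measured scale factor and the noise is estimated per image. A minimal sketch of that relationship (my illustration, not NSG's code; the inputs are assumed values):

    # Relative weight of a target image versus the reference, assuming
    # weight ~ (signal-to-noise ratio)^2. 'scale' is the photometric scale factor
    # and 'noise' the background noise estimate; both are assumed inputs here.
    def relative_weight(scale, noise, ref_noise):
        snr_target = scale / noise        # target signal is scaled relative to the reference
        snr_reference = 1.0 / ref_noise   # the reference defines scale = 1
        return (snr_target / snr_reference) ** 2

    # Example: 80% transparency and 20% more noise than the reference -> weight ~ 0.44
    print(relative_weight(scale=0.8, noise=0.0012, ref_noise=0.0010))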

 

My recommendation is to use NSG to do both the image weighting and to reject images when their weight is too low. I have just released NSG 2.2.0, which now has the ability to display a transmission graph (shows which images were affected by clouds) and a weight graph. These graphs can be used to reject images. The rejected images can be moved to a reject sub folder.

 

Thanks for reading! John Murphy

https://sites.google.com/view/normalizescalegradient

 

 

Edited by John Murphy

Hi John,

Excellent and clear description of your methodology and results. Very factual and convincing. 

From your understanding of the science you have described, would you expect the duration of subs taken to impact NWeight?

-- For example, if you set up your imaging system to alternately take two 100 sec subs, then one 200 sec sub, repeatedly (50 times), should the NWeight be higher for the individual 200 sec subs if all subs are evaluated in NSG in one pass against the same reference image?

Note: I alternated so the imaging conditions should be the same. Images were properly calibrated and  registered with cosmetic correction prior to NSG. Tracking was good. Cooled, low readout noise camera (like CMOS).

-- If I run NSG on the 100 x 100 sec subs and then run NSG on the 50 x 200 sec subs, and integrate each set separately using NWeight, would (should) the resulting two images be the same quality? I guess I mean signal to noise.

 

Your thoughts appreciated!

     Roger

 

 

 


Thanks so much for the detailed explanation! It makes a lot of sense.

 

So it follows that selecting the correct reference image will be the key step as NSG takes over from there.

 

So are you generally selecting the potential reference frames based on NSG's initial rating when the frames are added, i.e. based on noise and altitude? Or are you blinking all the subframes and choosing the frames with the darkest background (i.e. no haze, clouds etc.) and your visual impression of the simplest gradient? I feel that I may not be selecting the best reference based on Blink because the screen transfer stretch seems different between different nights.

 

thanks so much

Al

Edited by Al Ros

On 5/17/2022 at 4:37 AM, rockenrock said:

Hi John,

Excellent and clear description of your methodology and results. Very factual and convincing. 

From your understanding of the science you have described, would you expect the duration of subs taken to impact NWeight?

-- For example, if you set up your imaging system to alternately take two 100 sec subs, then one 200 sec sub, repeatedly (50 times), should the NWeight be higher for the individual 200 sec subs if all subs are evaluated in NSG in one pass against the same reference image?

Note: I alternated so the imaging conditions should be the same. Images were properly calibrated and  registered with cosmetic correction prior to NSG. Tracking was good. Cooled, low readout noise camera (like CMOS).

-- If I run NSG on the 100 x 100 sec subs and then run NSG on the 50 x 200 sec subs, and integrate each set separately using NWeight, would (should) the resulting two images be the same quality? I guess I mean signal to noise.

 

Your thoughts appreciated!

     Roger

Hi Roger,

In theory, with a perfect zero noise camera, this relationship will hold. Provided that the total exposure time is the same, if the exposure is halved, the weight for each image should also halve. However, there are twice as many of them, so the sum of all the weights will be the same as for the longer exposures. If the total exposure is the same, the final SNR will be the same, irrespective of the individual exposure times.

 

In practice, all cameras add some noise to the image. If this camera noise is significant compared to the other sources of noise (for example light pollution, and even the shot noise from the nebula / galaxy that you are imaging), the relationship breaks down as the exposures get very short. For example, halving the exposure may result in 1/3 of the weight instead of 1/2. When this starts to happen depends on how dark your observing site is, whether you are imaging with narrowband filters, and how noisy your camera is.
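As a rough illustration of both points (a toy sky-limited model with invented numbers, not NSG output): with negligible read noise the weight scales almost exactly with exposure time, but once the read noise becomes comparable to the accumulated sky signal, halving the exposure gives noticeably less than half the weight.

    def weight(exposure_s, sky_rate=2.0, read_noise=3.0):
        """Toy weight ~ SNR^2 for a faint-object pixel.

        Assumes object and sky signals grow linearly with time and that the
        variance is sky electrons + read_noise^2 (object shot noise neglected
        because the object is much fainter than the sky).
        """
        sky = sky_rate * exposure_s
        return exposure_s ** 2 / (sky + read_noise ** 2)  # SNR^2 up to a constant factor

    for t in (200, 100, 10, 5):
        ratio = weight(t) / weight(2 * t)
        print(f"{t:>4} s sub has {ratio:.2f} x the weight of a {2 * t} s sub")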

 

I have tested NormalizeScaleGradient with different exposure lengths (taken in the same conditions) to test that its weight algorithm produces the correct answer. The NSG weight algorithm does not know the exposure time, so getting the right result depends entirely on the correctness of the algorithm. Provided the exposures were long enough that the camera noise was not significant, the weights correctly matched the expected relationship. Not a great surprise. NSG has always passed every test with great accuracy.

 

NormalizeScaleGradient 2.2.0 can now display a transmission and weight graph after it is run:

[Attached graph: transmission]

 

[Attached graph: weight]

 

These graphs are used to reject images that were too badly affected by a drop in atmospheric transparency (for example, due to cloud), and those too badly affected by light pollution (Weight graph). NSG can then either move the rejected images into an 'NSG_Reject' sub folder, or automatically deselect them in ImageIntegration.

 

Adam Block had previously made an excellent suggestion to me: a weight rejection criterion should be relative to the exposure time. For example, we might want to reject images that have half their expected weight (perhaps due to a passing cloud). If we have 10 minute and 5 minute exposures, we need to ensure that this rejection criterion does not reject all the good 5 minute exposures; we only want the images affected by cloud to be rejected.

 

For this reason, the values displayed in these graphs are compensated for their exposure. A 5 minute and 10 minute exposure taken in identical conditions should appear in the graph with the same exposure compensated weight. The exposure compensated weight is only used for display and rejection. The uncompensated, real weight, is passed to ImageIntegration. During integration, we need the good 5 minute exposures to be worth half as much as the 10 minute ones.
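In case it helps to see the idea concretely, here is one way the compensation could be expressed (my reading of the description above, not NSG's actual code): divide each weight by its exposure time and normalise to the best value, use that only for the graphs and rejection, and still pass the raw weight to ImageIntegration.

    # Exposure-compensated weights for display/rejection; raw weights still go to
    # ImageIntegration. Names and numbers below are invented for illustration.
    subs = [
        {"name": "sub_600s_clear", "exposure": 600, "weight": 1.00},
        {"name": "sub_300s_clear", "exposure": 300, "weight": 0.50},
        {"name": "sub_600s_cloud", "exposure": 600, "weight": 0.45},
    ]

    per_second = [s["weight"] / s["exposure"] for s in subs]
    best = max(per_second)
    for s, ps in zip(subs, per_second):
        s["compensated"] = ps / best

    for s in subs:
        print(f'{s["name"]}: raw={s["weight"]:.2f}  compensated={s["compensated"]:.2f}')
    # The good 300 s sub is compensated to 1.00 (so it is not rejected), while the
    # cloudy 600 s sub sits at 0.45 and would fall below a 0.5 rejection limit.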

 

If you wish to graph the actual weights that NSG has calculated, you can do this by creating a CSV file (available in NSG 2.2.0, in the Output Files section). The CSV file provides a permanent record of your imaging session. By default it sorts the rows in order of descending weight. To create a weight graph in your favorite spreadsheet program, you first need to sort the rows on the 'DATE-OBS' column, then graph the weights.
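If you prefer to graph it with a script rather than a spreadsheet, a short sketch along these lines should work (the 'DATE-OBS' column name comes from the description above; the file name and weight column name here are guesses, so check the actual header in your CSV file):

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("nsg_session.csv")     # the CSV exported by NSG (sorted by weight)
    df = df.sort_values("DATE-OBS")         # re-sort into chronological order

    plt.plot(pd.to_datetime(df["DATE-OBS"]), df["WEIGHT"], marker="o")
    plt.xlabel("DATE-OBS")
    plt.ylabel("Weight")
    plt.title("NSG weights over the imaging session")
    plt.show()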

 

Regards, John Murphy

Edited by John Murphy
'CSV file' option can be used to graph statistics in a spreadsheet.
Link to comment
Share on other sites

19 hours ago, John Murphy said:

 

Adam Block had previously made an excellent suggestion to me: a weight rejection criterion should be relative to the exposure time. For example, we might want to reject images that have half their expected weight (perhaps due to a passing cloud). If we have 10 minute and 5 minute exposures, we need to ensure that this rejection criterion does not reject all the good 5 minute exposures; we only want the images affected by cloud to be rejected.

 

For this reason, the values displayed in these graphs are compensated for their exposure. A 5 minute and 10 minute exposure taken in identical conditions should appear in the graph with the same exposure compensated weight. The exposure compensated weight is only used for display and rejection. The uncompensated, real weight, is passed to ImageIntegration. During integration, we need the good 5 minute exposures to be worth half as much as the 10 minute ones.

 

Regards, John Murphy

 Hi John,

You exactly answered the questions I had in mind. As I was watching (in April 2022) Adam Block's Fundamentals comparison video on how the 3 weighting algorithms work (PSFSNR, PSFSW, and NSG - Part 3), he seemed to reach the quandary of how the sub's duration should (?) be part of the weight. That was at 8:30 into the video. Well, maybe I was in the quandary, not him.

 

If a weighting scheme did not reduce the weight of a shorter sub relative to a longer one, then we could all compress our imaging into one 10 min exposure plus 60 x 1 sec exposures for a night's total of 11 minutes, with the 1 sec images having close to the same weight as the 10 min sub. Of course that does not make any logical sense.

 

So your and Adam's 'rejection criterion', compensated so that a good shorter sub is not rejected, really makes logical sense, while the raw NWEIGHT is still used during integration. After all, you need more short exposures to equal the duration of the longer exposures.

 

I have really learned a lot from you, and from watching so many of Adam Block's videos. I am really glad you guys collaborate and move the industry forward. Maybe next year you and Adam can hook up and do some presentations together at AIC in San Jose, Ca!

 

Roger 


NormalizeScaleGradient 2.2.0 is now available from the NSG repository. See the NSG website for details:

https://sites.google.com/view/normalizescalegradient

 

It works particularly well with PixInsight 1.8.9-1, which has just been released.

 

How to upgrade to NSG 2.2.0

NormalizeScaleGradient 2.2.0 is available for PixInsight 1.8.9 and 1.8.9-1. PixInsight 1.8.8-12 is no longer supported. If you are running 1.8.8-12, please update your copy of PixInsight. NormalizeScaleGradient 2.2.0 is installed by using the NSG repository https://nsg.astropills.it

 

If this is the first time you are using the NSG repository, and you have previously manually installed NSGXnml, you will need to uninstall NSGXnml from PixInsight first. The NSG website describes how to do this.

 

For Windows and Linux users, the optional NSGXnml 1.0.4 will be installed automatically. Mac OS users will need to install this manually. The NSG website describes how to do this.

 

What's new

(1) Start up delay

Previously, if your last NSG run included a large number of target images, the next time NSG was started there could be a long delay while NSG read FITS headers from all the target images. To avoid this frustration, NSG now caches the data needed to display all the target image information. FITS headers will only be read if an image has been modified, so NSG start up is now always very quick.

 

(2) Bug fix: Previously, spaces in filenames could occasionally cause problems. This has now been fixed.

 

(3) Automatic Image Rejection

[Attached screenshot: image rejection settings]

It is now possible to reject images based on how transparent the atmosphere was, and on the calculated weight.

 

Sudden drops in atmospheric transparency indicate clouds. It is usually worth rejecting these images. The Minimum transmission should usually be set to between 0.75 and 0.9. The Transmission graph shows how the transparency varied during your imaging session, and allows you to set the lower limit. Points that were rejected due to the transmission cut-off are displayed with a blue 'X'. In this example, the last image was affected by cloud.

[Attached graph: transmission with rejection cut-off]

 

It is also possible to reject images based on their weight (the weight is the signal to noise ratio squared). The weight graph shows the effect of both changes in transmission and changes in light pollution. The minimum weight should usually be set to between 0.2 and 0.5.

[Attached graph: weight with rejection cut-off]

Images that are rejected due to the weight cut-off are displayed with a red 'X'. In this example, the rejected point was rejected due to its reduced transmission (cloud), so it is shown with a blue 'X'.
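Putting the two cut-offs together, the rejection logic described above amounts to something like this (example threshold values taken from the ranges given; this is a sketch of the behaviour, not the NSG source):

    MIN_TRANSMISSION = 0.85   # usually between 0.75 and 0.9
    MIN_WEIGHT = 0.3          # usually between 0.2 and 0.5, relative to the best image

    def classify(transmission, relative_weight):
        # Transmission is tested first, so a cloud-affected sub gets a blue 'X'
        # even if its weight is also below the weight cut-off.
        if transmission < MIN_TRANSMISSION:
            return "rejected on transmission (blue 'X')"
        if relative_weight < MIN_WEIGHT:
            return "rejected on weight (red 'X')"
        return "accepted"

    print(classify(0.70, 0.25))   # cloud-affected sub
    print(classify(0.95, 0.22))   # heavily light-polluted sub
    print(classify(0.97, 0.80))   # good sub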

 

In these graphs, the reference image is displayed with a green 'X'. Click on a point to display the image name in the title bar. The vertical scale is relative to the highest transmission or weight. The plotted points are also compensated for exposure time. My message from yesterday explains this:

Adam Block had previously made an excellent suggestion to me: a weight rejection criterion should be relative to the exposure time. For example, we might want to reject images that have half their expected weight (perhaps due to a passing cloud). If we have 10 minute and 5 minute exposures, we need to ensure that this rejection criterion does not reject all the good 5 minute exposures; we only want the images affected by cloud to be rejected.

 

(4) Drizzle Integration and rejected images

ImageIntegration adds extra information to the '.xdrz' drizzle data files, which DrizzleIntegration will need. If images were rejected by NSG, these images will not be processed by ImageIntegration. It is therefore essential that these rejected images are NOT added to DrizzleIntegration. Previously, the user would need to very carefully select which files to add to DrizzleIntegration. This could be a major headache if there were hundreds of images...

 

I have added the ability to move the images that were rejected by NSG to an 'NSG_Reject' sub folder. This solves the problem safely and completely.

 

(5) Added 'DATE-OBS' header to the target image table. This allows the images to be sorted by date and time. The transmission and weight graphs are sorted by 'DATE-OBS' (or filename if the header does not exist).

 

(6) Faster! The performance improvements are greatest when running on PixInsight 1.8.9-1, when using the optional NSGXnml C++ module. This also significantly reduces file I/O and requires less disk space.

 

(7) NWEIGHT or PSF Scale SNR (PixInsight 1.8.9-1)?

[Attached screenshot: ImageIntegration weight setting]

When using the optional NSGXnml module, the Weight is now set to use 'PSF Scale SNR' instead of NWEIGHT. This new PixInsight weighting method uses exactly the same algorithm that NSG has always used. The NSG calculated scale (which is extremely accurate) is passed to ImageIntegration (and hence the weighting algorithm) via the '.xnml' data files, which ensures that the calculated PSF Scale SNR weight is exactly the same as NWEIGHT.

 

(8) Compatibility with SubFrameSelector (PixInsight 1.8.9-1)

When using the optional NSGXnml module, SubFrameSelector can be used between NSG and ImageIntegration. To do this, add the NSG generated '.xnml' data files to SubFrameSelector. The NSG weight is then displayed as 'PSF Scale SNR'.

 

(9) Photometry region of interest

[Attached screenshot: photometry region of interest]

If you are imaging a small object within a large field of view, it is worth using a region of interest that covers the object and its immediate neighborhood. NSG will run faster, and the average scale will be calculated for the significant area, which will improve accuracy - especially if some images are partially affected by light cloud.

 

In this example, the region of interest should not be used because this galaxy covers the whole of the field of view. However, when imaging planetary nebulae or remote galaxies, the region of interest is often very useful. 

 

Note that the whole of the image still gets corrected for both the relative gradient and the average scale.

 

(10) CSV file

It is now possible to export data to a CSV file to create a permanent record of your observing session. This can then be displayed in your favorite spreadsheet program. The data is sorted by decreasing weight. If you wish to graph the data, you should first sort on the 'DATE-OBS' column.

 

(11) No need to create an ImageIntegration process icon (PixInsight 1.8.9-1)

In the latest version of PixInsight, some restrictions were removed. This allows NSG to display ImageIntegration directly. You only need to specify an ImageIntegration template icon if you wish to specify the ImageIntegration settings yourself.

 

(12) Updated the Reference Documentation to v2.2.0

 

(13) The repository and NSG script are both signed to provide extra security.

[Attached screenshot: signed repository]

 

As you can see, I have been very busy. I hope you find these improvements useful!

Regards, John Murphy

Edited by John Murphy

On 5/17/2022 at 3:20 PM, Al Ros said:

So it follows that selecting the correct reference image will be the key step as NSG takes over from there.

 

So are you generally selecting the potential reference frames based on NSG's initial rating when the frames are added, i.e. based on noise and altitude? Or are you blinking all the subframes and choosing the frames with the darkest background (i.e. no haze, clouds etc.) and your visual impression of the simplest gradient? I feel that I may not be selecting the best reference based on Blink because the screen transfer stretch seems different between different nights.

 

thanks so much

Al

Hi Al,

This highlights a vitally important point. NormalizeScaleGradient is not designed to remove all gradients. Instead, it is used to match all the target images so that they have the same gradient and (brightness) scale as the chosen reference image. Whatever gradients the reference image has will be reproduced in all of the target images.

 

So if NSG does not entirely remove all gradients, why use it?

  • The best image usually has a simple gradient, which should be much easier to remove in a background extraction process (for example DBE).
  • Once all the target images have the same gradient and scale, the variation between the frames is entirely due to noise, and stuff that should not be there (satellite trails, cosmic ray strikes, hot pixels...). This makes it much easier for ImageIntegration to use statistical analysis to determine what to reject. There are two things to notice about this process.
    • The first is that data rejection should only be used to remove the stuff that should not be there. The noise should not be rejected. It is mostly shot noise due to the random arrival of light photons. This is real signal, which will contribute to the final stacked image.
    • If really low weight images are included, although in theory they may slightly improve the final signal to noise ratio, the high level of noise will confuse the data rejection statistical analysis. This will make it difficult for ImageIntegration to remove the stuff that should not be there. Using a low weight cut off solves this problem. See the previous post for details on how to do this.
  • If ImageIntegration is given images that have not been accurately normalized and these images contain significant gradients, it will actually reject some of the real signal.
  • NSG calculates an extremely accurate scale factor, which results in very accurate weights. This accuracy has been confirmed with solid test results. Accurate weights are really important. Errors here can significantly reduce the signal to noise ratio of the final image.

For many of these issues, the choice of reference frame is not too critical. The vital thing is that all the images are consistent. But the choice of reference frame does completely determine the gradient in the final stacked image. Hence it is useful to choose a reference image that has the simplest and smallest gradient. Hopefully a gradient that will be easy to remove.

 

So, to answer Al's question, how should you choose the reference frame?

  • For a first approximation, I would sort the target images on NSG's Noise column. The images with the lowest level of noise will have the least light pollution, and the light pollution gradient is less likely to be significant. 
  • If you wish to go further, you should visually compare the images with the least noise. You can double click on images in NSG's target list to display the images (then exit NSG to compare them). Or you can display them in Blink. Either way, you will want to apply the same stretch to these short listed images. A visual inspection is usually the best strategy.
  • Alternatively, in NSG, use an initial run to determine the images with the best transmission and the best weights: 
    • Create a NSG process icon to save the current NSG target image list and settings. Then to reduce the test run's processing time:
    • Remove all images that don't have low noise levels, and set the Gradient smoothness to maximum (+4). Set the reference to the image with the least noise.
    • After this quick run, inspect the Transmission and Weight graphs to determine the best reference image. Choose an image with both high transmission and high weight. Note that lower transmission indicates clouds.
    • Exit NSG and restart from the process icon. Set the reference image.
  • Consider taking a single reference image on a particularly good clear moonless night, when the object is high in the sky.

Hope this helps, John Murphy

https://sites.google.com/view/normalizescalegradient

Edited by John Murphy

Thanks.

 

The latest version has a lot of new features which are proving extremely helpful.

 

I don't want to get into software comparisons and all that, but I do have a question. If PI 1.8.9-1 has the PSF Scale SNR weighting and is also doing photometry-based measurements, is there any significant difference in the calculations between NSG and the latest PI default?

 

Clarifications:

1. I say latest as earlier versions obviously had a completely different approach.

2. I'm referring specifically to the algorithm, not the UI / UX features.

 

Thanks.

Edited by zerolatitude

On 5/5/2022 at 9:18 PM, zerolatitude said:

This one is with LN only in NSG. Still a band. Any suggestions?

 

Thanks

NSG_SingleLN_Band.png

Hi John,

 

Just referring back to this issue and our private messages, the problem seems to have been resolved by redoing the calibration, registration etc from scratch with PI 1.8.9-1. I have now tested it using this approach with all the data sets and reference frames which were creating these bands, and all are okay now.

 

On examining the data set, I realized it had been pre-processed with PI 1.8.8, and it looks like the changes in the PI algorithms in 1.8.9 were incompatible, so reprocessing required a complete redo from calibration onwards.

 

An interesting learning experience.

 

 

Edited by zerolatitude

NormalizeScaleGradient Reviews

 

There are now quite a few NSG reviews available. As with all reviews, it is worth being a discerning consumer! I am going to try to give some pointers that I hope will help you get the most out of these reviews.

  1. Does the reviewer demonstrate that he understands how to use NSG? Has he gone through the most important controls and described what they do and how to use them? If he has only displayed the stacked results, it is possible that he is not an expert user (it's hard to be an expert at everything!); the results may not be optimum.
  2. Does the reviewer understand what programs like Local Normalization and NSG are trying to achieve? It is a common misconception that the aim is to remove all gradients. This is not the case. The aim is to make all the target images match the reference image. If this is successfully done, any gradient in the reference image will also be replicated in all the target images. Hence it is no good comparing two stacks and simply saying "I prefer this image to that one". The differences will most likely be due to which reference frame was chosen. Optimum settings also help!

The most important aspect of a normalization program is how accurately it can match the reference image's gradient without adding artifacts. As far as I am aware, when used correctly, NSG is (and always has been) very resilient against artifacts.

 

There are some very good reviews and video tutorials on the internet. Do go and watch them!

 

A final note:

It is easy to check how well NSG is performing by using its ability to blink between the reference image and a corrected target image (Gradient graph dialog). The gradient smoothness can then be adjusted to provide a correction that is as accurate as the user wishes. Please read the Quick Start Guide in the NormalizeScaleGradient Reference Documentation to learn how to get the best out of NSG.

 

Regards, John Murphy


When using the rejection, is it possible to display how many images have been rejected / are left?

 

Thanks


5 minutes ago, zerolatitude said:

When using the rejection, is it possible to display how many images have been rejected / are left?

Yes, I am thinking of doing this for the next version. It could show the number of images that remain, the total exposure time, and the weight adjusted total exposure time.

Edited by John Murphy

6 hours ago, John Murphy said:

Yes, I am thinking of doing this for the next version. It could show the number of images that remain, the total exposure time, and the weight adjusted total exposure time.

Thanks.

