Astrobdlbug Posted February 8, 2021

Hi, I am not normally someone who starts threads like this - I'm usually in the background somewhere trying to learn and grow my knowledge - but hopefully there are views out there that can advise whether this is an acceptable way to process image data and present it to the community.

What this is about: I have generally been using my Samyang 135mm lens and ASI1600 camera - good widefield views at about 6.5 arcsec/pixel - so it is straightforward to guide and get 'good' star shapes. I also have an AT106EDT APO refractor that I use with the same ASI1600 camera and a FF/FR, imaging at about 1.6 arcsec/pixel.

I recently posted a widefield of the Rosette in narrowband HOO, but last year - in fact exactly a year ago - I imaged the Rosette with the AT106EDT/ASI1600 and an Ha filter. So I can use a tool like RegiStar to register the high-res Ha image onto the widefield image, enhancing the detail in the Rosette while leaving the widefield dust and stars as they are.

The images to compare are below, side by side or top/bottom depending on your device: the Samyang 135mm is the top or left image, and the Samyang 135 + AT106EDT combination is the bottom or right image.

What is the general consensus regarding this technique? The image contains 'real' data - it's almost like an HDR merge for image scale, rather than the usual HDR compressing the dynamic range of lightness. Anyway, if you have a view it would be good to discuss. I do this with a number of my widefield images when I have supporting data from the AT106EDT, but have not posted such images in any forums. I did an Orion widefield and merged the higher angular-resolution data with the widefield to bring out more detail in HH and M42.

Bryan
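(For readers who want to try the registration step without RegiStar: below is a minimal Python sketch using astropy and the reproject package. It assumes both stacks are single-channel, plate-solved FITS files whose headers carry valid WCS solutions; the file names are placeholders, not Bryan's actual files.)

```python
# Rough stand-in for the RegiStar step: resample the high-resolution Ha stack
# onto the pixel grid of the widefield image using the WCS in each header.
from astropy.io import fits
from reproject import reproject_interp

wide_hdu = fits.open("rosette_samyang135_hoo.fits")[0]   # widefield reference frame
ha_hdu = fits.open("rosette_at106_ha.fits")[0]           # high-res Ha stack

# Map the Ha data onto the widefield frame's projection and pixel scale.
# 'footprint' marks which output pixels received valid Ha data.
ha_on_wide, footprint = reproject_interp(ha_hdu, wide_hdu.header)

# Save the registered Ha layer; it now overlays the widefield image pixel-for-pixel.
fits.PrimaryHDU(ha_on_wide, header=wide_hdu.header).writeto(
    "rosette_ha_registered.fits", overwrite=True
)
```

The registered Ha frame can then be blended into the widefield image as a layer, as discussed later in the thread.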
Gina Posted February 8, 2021

WOW!!! I like that. Never used the technique myself but it certainly seems to work. That second image is a real beauty! Well done! 👍👍 👍👍
AstronomyUkraine Posted February 8, 2021

It is quite an acceptable practice to merge images at two different image scales. I have never done it myself, but it seems a straightforward process in PI, by registering one of the images to the other. It definitely improves the end result - there is a lot more detail than in the original wide-angle image.

Brian
Astrobdlbug Posted February 8, 2021 (Author)

Thanks Gina - it was something I started to experiment with a few years ago when I bought RegiStar and realised it had more applications than just aligning RGB integrations.

@AstronomyUkraine - Brian - interesting. I've only just started using PI more since I began narrowband imaging, so I didn't realise the technique could be used as you describe - I typically use PS layers and masks to merge the two image scales. It's good that you see it as acceptable practice, as I was concerned I was misrepresenting the image. I guess, like all things, as long as you are transparent about how the image was acquired and processed, it can be viewed on its own merit.

Bryan
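(A rough numpy sketch of that "layers and masks" blend, for anyone working outside Photoshop. It treats both images as single-channel - in practice you would blend the Ha into the matching widefield channel - and the threshold and feather values are arbitrary starting points to tune by eye; file names are placeholders.)

```python
# Masked blend: the registered high-res Ha drives the bright nebulosity,
# while the widefield layer keeps the surrounding dust and stars.
import numpy as np
from astropy.io import fits
from scipy.ndimage import gaussian_filter

wide = fits.getdata("rosette_samyang135_hoo_lum.fits").astype(np.float64)
ha = fits.getdata("rosette_ha_registered.fits").astype(np.float64)
ha = np.nan_to_num(ha)   # reprojection leaves NaNs outside the Ha field of view

# Crude rescale of both layers to 0..1 so the blend weights mean something
def norm(img):
    img = img - img.min()
    return img / img.max()

wide, ha = norm(wide), norm(ha)

# Soft mask built from the Ha signal itself: ~1 over the bright nebula,
# ~0 in the background, with a gaussian feather to hide the seam.
lo, hi = np.percentile(ha, [70.0, 99.5])
mask = np.clip((ha - lo) / (hi - lo), 0.0, 1.0)
mask = gaussian_filter(mask, sigma=15)

# "Top layer through a mask" blend, with the widefield base layer underneath
blended = mask * ha + (1.0 - mask) * wide
fits.PrimaryHDU(blended).writeto("rosette_masked_blend.fits", overwrite=True)
```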
AstronomyUkraine Posted February 8, 2021

I see no difference between this method and adding Ha to an RGB image. Anyone who uses the Hubble palette is misrepresenting the image. 😀 I can imagine many astrophotographers have two imaging rigs and shoot the same target to gather enough data more quickly. In PI, I guess you would use your wide-angle image as the reference when stacking the Ha images, and PI would do the rest. Maybe you can try it out and let us know.

Brian
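(Outside PI, that "stack against the widefield reference" idea might look roughly like the sketch below: reproject each calibrated, plate-solved Ha sub onto the widefield grid, then integrate. This is only an illustration - paths are placeholders, and a real integration would use outlier rejection rather than a plain mean.)

```python
# Integrate Ha subs directly onto the widefield image's pixel grid.
import glob
import numpy as np
from astropy.io import fits
from reproject import reproject_interp

wide_header = fits.getheader("rosette_samyang135_hoo.fits")   # reference grid

resampled_subs = []
for path in sorted(glob.glob("ha_subs/*.fits")):
    sub_hdu = fits.open(path)[0]            # assumes each sub carries a WCS solution
    resampled, _ = reproject_interp(sub_hdu, wide_header)
    resampled_subs.append(resampled)

# Simple mean stack, ignoring the NaNs each sub leaves outside its footprint
ha_stack = np.nanmean(np.stack(resampled_subs), axis=0)
fits.PrimaryHDU(ha_stack, header=wide_header).writeto(
    "ha_integrated_on_widefield_grid.fits", overwrite=True
)
```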
Carastro Posted February 8, 2021

I do it - in fact I ran a dual rig for some time with different cameras and scopes on each. This year I did an image with my Samyang/Atik460: https://www.astrobin.com/full/ps6p3q/E/

And added half of it to this image I did in 2019, where I hadn't collected enough data: https://www.astrobin.com/419048/0/?image_list_page=2&nc=

and ended up with this, which I thought was a big improvement: https://www.astrobin.com/full/zeqq2c/0/

I think so long as it is all your own data, it is fine. Combining can be a bit problematic, but I simply use RegiStar to register them, and then paste one over the other in Photoshop, giving 50% opacity to the top layer in Layers.

Carole
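(Carole's 50%-opacity paste has a simple numpy equivalent once both images have been registered onto the same pixel grid; the file names below are placeholders.)

```python
# Equal-weight blend: the "normal" blend mode at 50% opacity.
import numpy as np
from astropy.io import fits

base = fits.getdata("widefield_registered.fits").astype(np.float64)
top = fits.getdata("closeup_registered.fits").astype(np.float64)

blend = 0.5 * top + 0.5 * base        # 50% top layer over the base layer
fits.PrimaryHDU(blend).writeto("combined_50pct.fits", overwrite=True)
```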
ApophisAstros Posted February 9, 2021

10 hours ago, AstronomyUkraine said: "Anyone who uses the Hubble palette is misrepresenting the image. 😀"

Surely as long as it's your own acquired data you can process it in any colour palette you like - I don't see that as "misrepresenting" the image?

Roger
ApophisAstros Posted February 9, 2021

Definitely agree the third combined image is improved 😄

Roger
AstronomyUkraine Posted February 9, 2021

7 minutes ago, ApophisAstros said: "Surely as long as it's your own acquired data you can process it in any colour palette you like..."

That was my point. Maybe I should have put quotation marks around the word misrepresentation. After calibration and stacking, it's all down to the artist's prerogative.

Brian
TerryMcK Posted February 9, 2021

I think it is a fantastic technique. I've not been able to do it yet, but that second image is brilliant.
MarkAR Posted February 9, 2021

Works a treat. Superb improvement in the second image.
Astrobdlbug Posted February 9, 2021 (Author)

Thanks everyone for your replies - it would appear the consensus is that this is an accepted process in developing astro images, which I am very pleased to hear. I've a few more like this in the wings; I may dig them out and share them sometime.

Bryan
Gina Posted February 9, 2021

Looking forward to seeing them, particularly if they are as good as your present post!! 😀
peter shah Posted February 9, 2021

Superb... I find using multiple scopes with different focal lengths works very well; it can bring colour and detail into places of interest.