Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Use this section to discuss "standard" Baader/Coronado/ Lunt SolarView/ Daystar, etc… filters, cameras and scopes. No mods, just questions/ answers and reviews.
Nicksolar
Oh, I get it now!
Posts: 23
Joined: Fri Apr 29, 2016 8:09 am
Has thanked: 7 times
Been thanked: 20 times

Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by Nicksolar »

Hi Guys,

I'm tired of capturing big video files and discarding 80% of them, so I came up with this idea while processing white-light solar images taken with our solar wedge. In bad conditions I use maybe 10% of my frames, so each file is roughly 10x larger than it needs to be if you think of it that way. Basically, I want my Hypercam to capture only the frames I like, not the bad ones. That would stop my hard drive filling up and speed up processing considerably. Hopefully it would work not just for solar but for planetary/lunar imaging too.

Currently one can measure "seeing quality" (the amount of atmospheric distortion) by measuring the contrast of a video frame, just as you would with focus detection. The higher the contrast, theoretically the more detail is visible, so the less atmospheric distortion or blurring. I guess the trick is doing it fast enough to capture a frame, analyse it, and move on to the next frame. Also, how do we know what "good" looks like for a particular imaging session? Seeing is pretty variable. That's why they call it "Lucky Imaging". Well, I hope we can take the luck out of lucky imaging :)

Initially, I envisaged that you, the end user, would do something like this (with my reasoning in brackets):

1) Choose a detection area (so you can focus on a feature of interest within the camera frame, for example a sunspot; many frames are sharp in some areas and not in others, so picking an interesting feature helps, especially with big-aperture scopes in bad seeing).
2) Check the "typical seeing quality level" in a graphical display over time (just to get a feel for the conditions and where to set the bar to trigger the camera).
3) Set the "seeing quality" (atmospheric distortion) level at which you want the camera to trigger a capture (tune your capture parameters to the prevailing conditions).
4) Define the time period over which the capture process will run; for solar I would choose, say, 30 seconds (any longer and surface details blur), but for lunar you could get away with minutes (before shadows change too much at your resolution).
5) Click "start" and sit back to watch it trigger the camera when the seeing gets above the defined score.
6) Process the resulting (smaller!) video file in Autostakkert as usual but faster with less hassle!
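The trigger logic behind steps 1-5 could be sketched roughly as follows. This is a minimal Python/NumPy illustration only: the function names, the RMS-contrast score, and the threshold value are my own choices for the sketch, not anything SharpCap actually implements.

```python
import numpy as np

def frame_quality(frame, roi=None):
    """Score a frame by its RMS contrast; higher = sharper.
    `roi` is an optional (row_slice, col_slice) detection area (step 1)."""
    if roi is not None:
        frame = frame[roi]
    return float(frame.std())

def lucky_capture(frames, threshold, roi=None):
    """Keep only frames whose quality exceeds `threshold` (steps 3-5)."""
    return [f for f in frames if frame_quality(f, roi) > threshold]

# Simulated session: one detailed, noisy frame among featureless ones.
rng = np.random.default_rng(0)
flat = [np.full((64, 64), 128.0) for _ in range(9)]
sharp = np.full((64, 64), 128.0) + rng.normal(0, 20, (64, 64))
kept = lucky_capture(flat + [sharp], threshold=5.0)
print(len(kept))  # → 1, only the high-contrast frame survives
```

In a real implementation the threshold in step 3 would come from watching the rolling quality graph in step 2 for a while first.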

Anyway, I realised that SharpCap PRO already has some of these features working (focus assistance works on surfaces), so I spoke to Robin G about this and he tells me a Beta should be around soon based on detection across the whole frame. ROI might take more time to implement as there are some pipeline changes. SharpCap PRO users will get the Beta to play with soon (you get a SharpCap PRO licence with every Altair camera starting in 2017 as a special offer), so I hope we'll get some good Beta feedback on the SharpCap Group or the Altair Google Group.

Kind of fun, harks back to my old product design days.

It would be good to hear what you think of the idea.

(I'll post this in the UKAI and Altair Facebook groups to get opinions from other folks there too, but it's really high-resolution solar with the Altair Hypercam 174M and 178M which got me thinking this way.)

Best, Nick (Altair).


Spectral Joe
Ohhhhhh My!
Posts: 120
Joined: Wed Mar 14, 2012 4:34 am
Location: Livermore, California

Re: Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by Spectral Joe »

Not quite real time, but try converting TIFF files to JPEG or BMP: the more detail (better focus or seeing, for example), the less compression you get, so larger files indicate sharper frames. Maybe someone can automate the process. On the other hand, the scintillation seeing monitor with FireCapture support thread in the Solar Scope Modifications forum covers this: viewtopic.php?f=9&t=16746
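For anyone wanting to automate that, the compressed-size trick can be approximated in a few lines. This sketch uses lossless zlib as a stand-in for JPEG (my substitution, to keep it dependency-free); the principle is the same: frames with more fine detail compress worse.

```python
import zlib
import numpy as np

def compressed_size(frame):
    """More high-frequency detail compresses worse, so a larger
    compressed size suggests a sharper (or better-seeing) frame."""
    return len(zlib.compress(frame.astype(np.uint8).tobytes()))

rng = np.random.default_rng(1)
flat = np.full((128, 128), 128, dtype=np.uint8)        # featureless frame
detailed = rng.integers(0, 256, (128, 128), np.uint8)  # lots of detail
print(compressed_size(detailed) > compressed_size(flat))  # → True
```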


Observing the Sun with complex optical systems since 1966, and still haven't burned, melted or damaged anything.
Not blind yet, either!
Light pollution? I only observe the Sun, magnitude -26.74. Pollute that!
Nicksolar
Oh, I get it now!
Posts: 23
Joined: Fri Apr 29, 2016 8:09 am
Has thanked: 7 times
Been thanked: 20 times

Re: Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by Nicksolar »

I think the difference is that this idea doesn't sense the whole Sun, just a part of it within the camera's FOV. Also, it doesn't look at brightness (though I guess it could) but at contrast, i.e. the relative brightness between features, as a way to judge the "quality" of each frame on a per-frame basis. So rather than external analysis, this is internal to the camera. Hope that makes more sense.


GreatAttractor
Almost There...
Posts: 964
Joined: Sat Jun 01, 2013 1:04 pm
Location: Switzerland
Has thanked: 747 times
Been thanked: 753 times

Re: Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by GreatAttractor »

Hi Nick,

That's a good idea; implementing it should be straightforward.
Nicksolar wrote:
Currently one can measure "seeing quality" (amount of atmospheric distortion) by measuring the contrast of a video frame, just like you would with focus detection. The higher the contrast, theoretically the more detail is visible, so the less atmospheric distortion or blurring. I guess the trick is doing it fast enough to capture a frame, analyse it, and move onto the next frame.
This kind of "frame quality" is calculated e.g. by Stackistry; the values track human-perceived blurriness very well. No need to worry about speed; it's very fast. Basically, I calculate a blurred version of the frame (by applying a fast box filter three times, i.e. a near-Gaussian) and subtract it from the original. The "quality" is the sum of pixel values of the difference (in other words, the sum of the high-frequency component).
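A NumPy-only sketch of that metric, reconstructed from the description above (not Stackistry's actual code; the box size is an arbitrary choice):

```python
import numpy as np

def box_blur(img, size=7):
    # Separable box filter: 1-D average along rows, then along columns
    # ('same' output size, zero-padded edges).
    k = np.ones(size) / size
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, img)

def quality(frame, size=7):
    # Blur three times (three box passes approximate a Gaussian),
    # subtract from the original, and sum the high-frequency residual.
    blurred = frame.astype(np.float64)
    for _ in range(3):
        blurred = box_blur(blurred, size)
    return float(np.abs(frame - blurred).sum())

# A sharp checkerboard scores higher than a blurred copy of itself.
sharp = (np.indices((64, 64)).sum(axis=0) % 2) * 255.0
soft = box_blur(sharp, 5)
print(quality(sharp) > quality(soft))  # → True
```

A production version would of course use a running-sum box filter rather than a convolution per row, but the ranking behaviour is the same.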

A couple of notes though:
Nicksolar wrote:
I think the difference is that with this idea it doesn't sense the whole sun, just a part of it within the camera's FOV.
Doesn't seem to make a difference. The Sun is just 0.5° in diameter, I'd imagine that the seeing characteristics over a few-second duration are more or less uniform within the subtended angle.
Nicksolar wrote:
Also it doesn't look at brightness (though I guess it could) but instead contrast
As shown on statistical grounds by Dr. Seykora (the author of the simple SSM circuit folks have been building here) in his solar scintillation article, it's the same thing; the variability of the whole Sun's illumination corresponds directly to seeing values (which can also be estimated by contrast analysis, as above).

Anyway, I agree it's easier to analyze a frame than build an SSM, but the end result should be the same (and it's easier to bring a few SSMs on a hike when scouting new solar observatory locations ;)).


My software:
Stackistry — an open-source cross-platform image stacker
ImPPG — stack post-processing and animation alignment
My images

SW Mak-Cass 127, ATM Hα scopes (90 mm, 200 mm), Lunt LS50THa, ATM SSM, ATM Newt 300/1500 mm, PGR Chameleon 3 mono (ICX445)
christian viladrich
Way More Fun to Share It!!
Posts: 2145
Joined: Sun Jun 14, 2015 4:46 pm
Location: France
Has thanked: 1 time
Been thanked: 2703 times

Re: Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by christian viladrich »

Hi,
Unfortunately, there is a major limitation in what is suggested. Let's go back to how AutoStakkert (or AviStack) operates:
- the image is first divided into N sub-arrays. The size of each sub-array is given by the AP size. For example, if you choose AP size = 50 pixels, each frame is divided into sub-arrays (or tiles) of roughly 50 x 50 pixels,
- the quality of each sub-array of each frame is measured,
- so AS does not select the best frames; it selects the best sub-arrays among all frames.
All of this is because image quality is not uniform over the surface of the frame (things are of course much worse if you take the full solar disk). Typically, on my 300 mm solar telescope, good seeing occurs in patches of 30 x 30 pixels.
So basing the selection of frames on the global quality of the frame, or on the quality of a small Region of Interest, is not a good criterion. A lot of good sub-arrays would be lost.
AS gives good results because the selection is based on sub-arrays, not on the global frame.
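The per-tile selection described above can be illustrated with a rough NumPy sketch. The std-based tile score and the function names are my own for illustration; AS's real quality estimator differs.

```python
import numpy as np

def tile_scores(frame, ap=50):
    # Split the frame into ap x ap tiles and score each tile separately
    # (here with RMS contrast), instead of scoring the whole frame.
    rows, cols = frame.shape[0] // ap, frame.shape[1] // ap
    return np.array([[frame[i*ap:(i+1)*ap, j*ap:(j+1)*ap].std()
                      for j in range(cols)] for i in range(rows)])

def best_frame_per_tile(frames, ap=50):
    # For each tile position, pick the index of the frame that is
    # sharpest there, which is the per-tile selection AS performs.
    stack = np.stack([tile_scores(f, ap) for f in frames])
    return stack.argmax(axis=0)

# Frame 0 is "sharp" only in the top-left tile; frame 1 everywhere else.
rng = np.random.default_rng(2)
f0 = np.zeros((100, 100)); f0[:50, :50] = rng.normal(0, 10, (50, 50))
f1 = rng.normal(0, 10, (100, 100)); f1[:50, :50] = 0.0
print(best_frame_per_tile([f0, f1]))  # → [[0 1]
                                      #    [1 1]]
```

Selecting on a single global score would have thrown away one frame entirely, losing its good tile, which is exactly the objection above.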

Analysing the quality of many sub-arrays in real time could be a bit challenging. If we assume it is feasible, it then remains to re-assemble the different sub-arrays (or tiles) to build a complete frame, which basically means doing the entire job of AS from start to end.

However, the idea could be a good one for triggering acquisition automatically based on the quality of a Region of Interest (or of the global image).

Just some ideas ...


Christian Viladrich
Co-author of "Planetary Astronomy"
http://planetary-astronomy.com/
Editor of "Solar Astronomy"
http://www.astronomiesolaire.com/
RTJoe
Oh, I get it now!
Posts: 24
Joined: Tue Jun 16, 2015 8:26 am
Location: Germany
Been thanked: 1 time

Re: Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by RTJoe »

Hi,
Basically, Nick's idea could be implemented as a FireCapture plugin, very similar to the SSM plugin. The only problem might be how to choose the detection area, because the plugin interface currently doesn't offer a way to get something like a region of interest.

On the other hand, Christian's objections are correct (thank you for the explanation of how AS works!). Real-time analysis of the whole image (with sub-arrays) seems impossible with current hardware, at least at high frame rates. Analysing a small region of interest should be possible.

BTW, @GreatAttractor: have you ever thought about porting some Stackistry algorithms to something like OpenCL and running the calculations on a GPU?


GreatAttractor
Almost There...
Posts: 964
Joined: Sat Jun 01, 2013 1:04 pm
Location: Switzerland
Has thanked: 747 times
Been thanked: 753 times

Re: Realtime Seeing Filter idea - to benefit Altair GPCAM / Hypercam owners and others

Post by GreatAttractor »

Joe, yes I have. At least the quality analysis phase would be straightforward (it's just an iterated box blur filter and a subtraction). Though I'd rather use GLSL (OpenGL shaders), which is widely supported; there's no OpenCL support yet in the Radeon open-source drivers on Linux (which is what I use).

