I have owned my Celestron Origin for almost a year. Beyond being able to “observe” so many targets that I’ve never seen before, I would like to give back to the scientific community, if I can.
I’ve experimented with ASTAP, and I know how to retrieve my raw .FITS files from the Origin, calibrate them with the master Flat, Dark, and Bias files, and then get the astrometric solution.
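(For context, the calibration step I let ASTAP handle amounts to roughly the following; this is only a sketch in Python with placeholder file names, and it assumes the master dark already contains the bias signal.)

    # Rough equivalent of the calibration ASTAP performs for me.
    # File names are placeholders; assumes the master dark includes the bias.
    import numpy as np
    from astropy.io import fits

    raw  = fits.getdata("raw_frame.fits").astype(np.float32)
    dark = fits.getdata("master_dark.fits").astype(np.float32)
    bias = fits.getdata("master_bias.fits").astype(np.float32)
    flat = fits.getdata("master_flat.fits").astype(np.float32)

    flat_norm  = (flat - bias) / np.median(flat - bias)  # normalize the bias-subtracted flat
    calibrated = (raw - dark) / flat_norm                # dark-subtract, then flat-field
    fits.writeto("calibrated_frame.fits", calibrated, overwrite=True)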
I’ve been using the AAVSO DSLR Observing Manual. My issue so far has been that the easy beginner variables saturate in my 10-second images, which doesn’t help with photometry.
I saw that the AAVSO has a tool here called VPHOT. I would like to use that tool if I can, but I’m getting stuck creating a new telescope setup. The Origin is a 6-inch f/2.2 RASA, and the camera has a Sony IMX178LQJ sensor. I can find that it has 2.4 µm x 2.4 µm pixels, but I cannot find the Gain and Linearity Limit values needed for the setup.
Has anyone here created a setup for a Celestron Origin? Can anyone here point me to where I can obtain this info? Do I have to join as an official AAVSO member to get this kind of assistance while I am still learning as a beginner?
Thank you very much for listening. I would greatly appreciate any advice or assistance that you may be able to provide!
I can’t answer your question about gain and linearity in the Celestron Origin directly, but I wonder if you could determine the gain yourself. I understand from the manual that you can select different gain levels (via ISO settings) and sub-exposure lengths up to a maximum of 30 seconds.
Maybe it would be possible to use those functions to determine the gain of your sensor at each setting, using the procedure described by Craig Stark in the attached article. The procedure uses flats, and the description starts near the bottom of page 2. I used it some years ago to measure the performance of my DSLR camera.
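In case it helps, the flat-pair calculation in that procedure boils down to something like the following (my paraphrase rather than a quote from the article; file names are just illustrative):

    import numpy as np
    from astropy.io import fits

    # Two flats at the same exposure/gain setting, plus two bias frames.
    f1 = fits.getdata("flat1.fits").astype(np.float64)
    f2 = fits.getdata("flat2.fits").astype(np.float64)
    b1 = fits.getdata("bias1.fits").astype(np.float64)
    b2 = fits.getdata("bias2.fits").astype(np.float64)

    # Differencing two frames cancels the fixed pattern (vignetting, dust),
    # so the variance of the difference reflects shot noise plus read noise.
    signal_adu   = (f1.mean() + f2.mean()) - (b1.mean() + b2.mean())
    variance_adu = np.var(f1 - f2) - np.var(b1 - b2)
    gain_e_per_adu = signal_adu / variance_adu
    print(f"Estimated gain: {gain_e_per_adu:.3f} e-/ADU")

Repeating this at each ISO setting would give you the gain for each setting.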
To open and read the attachment in my last message I found I had to download the file, go to the download directory, then double click to open it. Maybe it will be easier when you try.
Hi, Stephen:
I’m part of the AAVSO’s Smart Telescope Working Group, where we’re in the process of putting together a software processing pipeline to make it easy to extract photometry from smart telescope images. (But we’re not there, yet – still a work in progress.)
One of the things we noticed early on is how high most of the smart telescope vendors have set their camera gain. At this point we’ve pulled photometry from thousands of smart telescope images, and one common problem is the need to be extra vigilant about saturation – and it’s particularly common on the Origin. I urge you to set the gain setting on the control app to its lowest setting – at some point I hope we can provide more specific suggestions based on telescope type, but today, that seems to be the best way to avoid problems with 10 second exposures.
We also need more information about the relationship between the cameras’ “gain setting” and the actual system gain that results from that gain setting. For standalone astro cameras, the camera vendors are quite good about providing gain curve graphs that show this relationship, but the smart telescope vendors aren’t making it easy to figure that out. Craig Stark’s paper has a good explanation of how to measure gain. That procedure can be somewhat tricky on a smart telescope, though, because some of the smart telescopes make it really hard to take exposures that have no stars in them. If you can go through that procedure, measure your system gain, and then post the results here, you’ll be doing us a great favor (and helping speed the development and refinement of our analysis pipeline).
Please keep us posted on your progress!
– Mark Munkacsy (MMU)
Good morning folks, and many thanks for your responses regarding my issue here.
Mark, thank you very much for your detailed explanation of what is going on and what I am experiencing. This is tremendously appreciated!
For now, I’ll read the Craig Stark article, and see if I can coax the Origin to take images with no stars - without ruining my current master flat, dark, and bias files.
The Celestron Origin is designed for the mass consumer market. It’s frustrating to find that flat frame creation appears to be an automated or default process. ‘How to’ videos on the Internet seem to take the viewer to that default end point.
I wonder if the way to take flat frames for measurements such as Craig Stark’s procedure for determining sensor properties would be to create them manually in snapshot mode and save the images to a computer. Maybe something like that would be the only way to determine the linearity limit?
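To illustrate what I mean, the linearity check itself would be straightforward once you had flats at a range of exposure times; something like this (file names and exposure times are made up):

    import numpy as np
    from astropy.io import fits
    import matplotlib.pyplot as plt

    exposures = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0]           # seconds (hypothetical)
    files     = [f"flat_{t}s.fits" for t in exposures]   # hypothetical file names

    medians = [np.median(fits.getdata(name)) for name in files]

    # The signal should rise linearly with exposure; where the curve
    # flattens out marks the linearity limit (in ADU).
    plt.plot(exposures, medians, "o-")
    plt.xlabel("Exposure time (s)")
    plt.ylabel("Median signal (ADU)")
    plt.show()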
Disclaimer: I don’t own a Celestron Origin, so the above may be incorrect. I’m just interested in the issues involved here.
I went into my garage and operated my Origin this afternoon. When powered up, the Origin immediately tries to initialize itself. Of course, indoors with the plastic lens cap on, it fails. At that point, the Origin will not allow one to hit the Start Imaging button. I put the Origin in manual operation, and when I selected Snapshot mode it allowed me to take 0.1-second snaps. It defaulted to ISO 200, but I was able to select ISO 100 (from the three available: 100, 200, 2000).
I took two snaps, one at the default ISO 200 and the second at ISO 100. The Origin saves the snaps directly to Photos as TIFF files; it appears that no raw .FITS file is saved, and nothing is saved onboard the Origin. These snaps are obviously Dark files, but it appears that I will be able to rig up something to take the needed Flat files.
I found and downloaded the ImageJ application, and it is able to open the TIFF files and operate on them. I’m not yet fully familiar with ImageJ of course, but it appears to be able to split the TIFF file into individual Red, Blue, and Green files.
Do I have to split the TIFF file and then use only the Green file for the Flat file process described in the article, or could I simply use the TIFF files?
From an Internet search, it looks like you need to select Save Raw Images from one of the Origin menus. It seems that the files are then saved in .FITS format, which is what you need.
I should have answered your question. If you plan to use VPhot, what you do with the RGB .FITS files (if you can save them) will be part of the workflow. I’m not familiar with that aspect of VPhot.
Now it’s my turn to apologize for not being clear.
When I’m doing my normal “observing” with the Origin, it does indeed save all the individual raw frames as .FITS files onboard the Origin. As well, it saves the Final Stacked Master .FITS file before it does any of its normal image processing. I can then download the folder full of files, including the Master Dark, Master Flat, and Master Bias files that were used, onto a USB memory stick for transfer into my computer for image processing.
When taking a snapshot, the .TIFF file is saved directly into Photos on my iPad, but nothing is saved on the Origin, so I don’t get the raw .FITS file.
If I cannot use the .TIFF file, it may be that I’ll have to get the Origin properly initialized before I can trick it into taking a series of Dark files or Flat files.
Can the .TIFF file be turned back into a .FITS file, or would that lose information from the image?
Could I separate only the green channel from the .TIFF file and then convert that into a .FITS file?
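If it helps to be concrete, this is the sort of thing I’m imagining (just a sketch in Python; I haven’t verified the channel order or bit depth against the Origin’s TIFFs):

    import numpy as np
    from astropy.io import fits
    import tifffile

    rgb   = tifffile.imread("snapshot.tif")   # expecting shape (rows, cols, 3)
    green = rgb[..., 1]                       # channel order assumed to be R, G, B
    fits.writeto("snapshot_green.fits", green, overwrite=True)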
I looked this morning, and ASTAP will load a .TIFF file, and then save it as a .FITS file.
The two files are the same size, 37.6 MB each, so it doesn’t look like I lost anything.
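A quick sanity check beyond file size would be to compare the pixel values themselves; something along these lines (file names are placeholders, and the colour axis may need reordering depending on how ASTAP writes the FITS):

    import numpy as np
    from astropy.io import fits
    import tifffile

    tif_data  = tifffile.imread("snapshot.tif")
    fits_data = fits.getdata("snapshot.fits")

    print("TIFF:", tif_data.shape, tif_data.dtype)
    print("FITS:", fits_data.shape, fits_data.dtype)
    # If the shapes match after moving the colour axis, compare values directly:
    print(np.array_equal(np.moveaxis(tif_data, -1, 0), fits_data))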
So, now I can get Dark files, and I have to move on to getting the needed Flat files.
Hi, Stephen (et al): This year’s SAS Symposium includes a workshop on “science with smart telescopes”, focusing on photometry. Richard Berry will talk about his experience with the Origin; and Mark M. will also be one of the speakers. For details, see: https://socastrosci.org/symposium/
Bob Buchheim
I just successfully processed some Seestar .FITS files with ASTAP. I converted them to tri-green (TG) files, selected a check star and a comparison star, and it returned results that compared very well with the AAVSO V and visual light curves. Since then I have found an excellent step-by-step PDF in the Facebook Seestar (Official ZWO) group (Guy Boistel, April 20, 2025). I just took a course on VPHOT, and it concentrated entirely on a mono camera with BVRI filters. It might handle color, but there was no mention of how to do it.
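For anyone wondering what the tri-green step amounts to under the hood, it is essentially isolating the green pixels of the Bayer mosaic. A rough Python sketch (not the ASTAP procedure itself, and the RGGB layout is an assumption to check against your camera):

    import numpy as np
    from astropy.io import fits

    bayer = fits.getdata("light_0001.fits").astype(np.float32)  # hypothetical undebayered frame
    g1 = bayer[0::2, 1::2]    # green pixels on even rows (RGGB assumption)
    g2 = bayer[1::2, 0::2]    # green pixels on odd rows
    green = 0.5 * (g1 + g2)   # average the two green sub-mosaics
    fits.writeto("light_0001_TG.fits", green, overwrite=True)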
It looks like that procedure uses ASTAP. Another app that can be used for photometry with images from RGB sensors is Tycho Tracker, which Andrew Pearce has used very successfully with the Seestar; Andrew has posted his procedure previously. I’ve finally got round to reading the Tycho Tracker manual. It calibrates images, debayers them, determines transformation coefficients from AAVSO standard fields, applies the TCs, allows stacking of every n images (where n is a number chosen by the user), plots light curves, and saves the final data in the AAVSO Extended Format. From what I’ve read in the ASTAP manual, its default operation is to produce TG magnitudes. Although ASTAP allows for transformations, the procedure seems to apply to one channel only and is based on stars in the image automatically selected by the app, which is not as comprehensive or standard as the Tycho Tracker transformation procedure.