Release v1.1.0: Improve interoperability between pipeline, database and webpage
Showing 34 changed files with 1,346 additions and 443 deletions.
@@ -0,0 +1,148 @@
# Flows Pipeline
[![DOI](https://zenodo.org/badge/241705955.svg)](https://zenodo.org/badge/latestdoi/241705955)
[![Tests](https://github.com/SNflows/flows/actions/workflows/tests.yml/badge.svg?branch=devel)](https://github.com/SNflows/flows/actions/workflows/tests.yml)
[![Code Coverage](https://codecov.io/github/SNflows/flows/branch/devel/graph/badge.svg?token=H8CQGPG0U6)](https://codecov.io/github/SNflows/flows)
[![Hits-of-Code](https://hitsofcode.com/github/SNflows/flows?branch=devel)](https://hitsofcode.com/view/github/SNflows/flows?branch=devel)
[![License](https://img.shields.io/github/license/SNflows/flows.svg)](https://github.com/SNflows/flows/blob/devel/LICENSE)

## Installation instructions
Go to the directory where you want the Python code installed and download it or clone it via *git*:

```shell
git clone https://github.com/SNflows/flows.git
cd flows
```

Required dependencies can be installed with the following commands. It is recommended to do this in a dedicated [virtualenv](https://virtualenv.pypa.io/en/stable/) or similar:

```shell
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```

In addition, to run tests and do development, do:

```shell
pip install -r dev_requirements.txt  # in the same virtual environment as above
_pyver=$(find env/lib/ -mindepth 1 -maxdepth 1 -type d | cut -d '/' -f 3)  # e.g. "python3.10"
ln -s ../env/lib/${_pyver}/site-packages/tendrils/utils/config.ini flows/config.ini
wget -O tests/input/2020aatc/SN2020aatc_K_20201213_495s.fits.gz https://anon.erda.au.dk/share_redirect/FJGx69KFvg
wget -O tests/input/2020lao/59000.96584_h_e_20200531_33_1_1_1_2020lao_LT_gp.fits.gz https://anon.erda.au.dk/share_redirect/E98lmqOVWf
wget -O tests/input/2020lao/subtracted/59000.96584_h_e_20200531_33_1_1_1_2020lao_LT_gpdiff.fits.gz https://anon.erda.au.dk/share_redirect/bIxyzrRXbg
wget -O tests/input/2021wyw/ADP.2021-10-15T11_40_06.553.fits.gz https://anon.erda.au.dk/share_redirect/Gr8p2K7ph5
```
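
Optionally, you can check that the downloaded test data are intact before running the tests (a simple sanity check, assuming `gzip` is available on your system):

```shell
# gzip -t exits with status 0 only if the archive is valid
gzip -t tests/input/2020aatc/SN2020aatc_K_20201213_495s.fits.gz && echo "OK"
```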

TODO: Reformulate the following bullet point and check/decide on which config paths are/should be available:

* **Changed with the `tendrils` API.** If using `tendrils`, follow the steps below, but then let `tendrils` know the location of the config file. Alternatively, individual config file elements can be set programmatically using `tendrils` and will be saved to a config file automatically. The last step is to create a config file: create a file named "config.ini", place it in the "flows" directory, and make sure that the file can only be read by you (`chmod 0600 config.ini`)!
This file can contain all the settings for running the pipeline. A minimal file for working with the pipeline is:

```ini
[api]
token = <my api token>

[TNS]
api_key = <AUFLOWS_BOT API key>
```

where your API token can be found on the Flows webpage.
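
As an illustration, the minimal file can also be generated from Python with the standard-library `configparser`; the token values below are placeholders, and if you use `tendrils` its own helpers may be preferable:

```python
import configparser
import os

# Build the minimal config described above (token values are placeholders).
config = configparser.ConfigParser()
config["api"] = {"token": "<my api token>"}
config["TNS"] = {"api_key": "<AUFLOWS_BOT API key>"}

path = os.path.join("flows", "config.ini")
with open(path, "w", encoding="utf-8") as fh:
    config.write(fh)
os.chmod(path, 0o600)  # readable and writable by you only, as required above
```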

## How to run tests

You can test your installation by going to the root directory where you cloned the repository and running the command:

```shell
pytest
```
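
During development it can be handy to run only part of the suite. These are standard `pytest` options; the selection substring below is a hypothetical example:

```shell
pytest -v             # verbose output, one line per test
pytest -k "catalogs"  # run only tests whose names match the substring
pytest -x             # stop at the first failure
```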

## Full configuration file

Text coming soon...

```ini
##########################################################
### --Tendrils configurations used at FLOWS run-time-- ###
### Configurations without a leading '#' are required  ###
### and must be specified by the user.                 ###
### Configurations with a leading '#' are optional,    ###
### and their assigned values are documentation of     ###
### their default values in the FLOWS pipeline.        ###
### Default values of optional configurations with a   ###
### leading '$' signify environment variables resolved ###
### at run-time; their fallbacks are given following   ###
### a '/'.                                              ###
##########################################################

[api]
# photometry_cache = None
# pipeline = False
token = None

# casjobs:
# wsid and password required for run_catalogs.py,
# user registration at
# https://galex.stsci.edu/casjobs/CreateAccount.aspx
# wsid can be found at
# https://galex.stsci.edu/casjobs/changedetails.aspx
# after login
[casjobs]
# wsid = $CASJOBS_WSID/None
# password = $CASJOBS_PASSWORD/None

# database:
# username and password required for run_catalogs.py,
# the user is a registered user in the flows database
# with access to the 'adastra' schema
[database]
# username = $AUDBUsername/None
# password = $AUDBPassword/None

[photometry]
archive_local = None
# output = .

# TNS:
# api_key required for run_querytns.py,
# user registration at
# https://www.wis-tns.org/user
# api_key is that of a TNS bot; ask a flows group
# member for one
# if user_id and user_name are not given, fallback
# to a TNS bot's bot_id and bot_name, which must
# match with api_key
[TNS]
# api_key = None
# bot_id = 191396
# bot_name = AUFLOWS_BOT2
# user_id = None
# user_name = None

[URL]
# base_url = https://flows.phys.au.dk/api/
# catalogs_url = reference_stars.php
# catalogs_missing_url = catalog_missing.php
# cleanup_photometry_status_url = cleanup_photometry_status.php
# datafiles_url = datafiles.php
# filters_url = filters.php
# lightcurves_url = lightcurve.php
# photometry_upload_url = upload_photometry.php
# photometry_url = download_photometry.php
# set_photometry_status_url = set_photometry_status.php
# sites_url = sites.php
# targets_post_url = targets_add.php
# targets_url = targets.php
# verify_ssl = True

[ztf]
# output_photometry = .
```
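
The `$ENVVAR/fallback` notation in the banner is documentation only, but the intended resolution can be sketched in a few lines of Python (the helper below is illustrative, not part of the pipeline):

```python
import os

def resolve_default(spec: str):
    """Resolve a '$ENVVAR/fallback' default as described in the banner above.

    '$CASJOBS_WSID/None' -> the value of $CASJOBS_WSID if set, else None.
    Specs without a leading '$' are returned unchanged.
    """
    if not spec.startswith("$"):
        return spec
    envvar, _, fallback = spec[1:].partition("/")
    value = os.environ.get(envvar)
    if value is not None:
        return value
    return None if fallback == "None" else fallback

# Example with the default documented in the [casjobs] section:
print(resolve_default("$CASJOBS_WSID/None"))
```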

## Making a release

- Bump the semantic version in the file `VERSION` (e.g. `v1.0.0`) when `devel` is ready to merge: check out `devel`, edit `VERSION`, and push `devel`.
- Merge `devel` into `master` (create a PR from `devel` -> `master`) and wait until tests are passing; create issues if they are not. Then merge.
- Create a tag on `master` corresponding to the new semantic version: check out `master`, pull it locally, create a tag named `v1.0.0` (or whatever the version is), and push the local tag to GitHub (see the sketch after this list).
- Merge `master` into `devel` to propagate the tag (create a PR on GitHub).
- Create a release on the GitHub releases tab if all tests are passing.
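
For reference, the tagging step with standard git commands might look like this (the version string is an example):

```shell
git checkout master
git pull                       # make sure the local branch matches GitHub
git tag -a v1.0.0 -m "v1.0.0"  # annotated tag named after the new version
git push origin v1.0.0         # push only the new tag to GitHub
```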
@@ -0,0 +1,120 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import os.path
import getpass
import numpy as np
from astropy.table import Table
from astropy.units import Quantity
import sys
if sys.path[0] != os.path.abspath(os.path.join(os.path.dirname(__file__), 'flows')):
    sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), 'flows')))
from flows.aadc_db import AADC_DB
from tendrils.utils import load_config

if __name__ == '__main__':
    with AADC_DB() as db:

        #db.cursor.execute("SELECT fileid,path,targetid FROM flows.files LEFT JOIN flows.photometry_details d ON d.fileid_phot=files.fileid WHERE files.datatype=2 AND d.fileid_phot IS NULL;")
        #
        #for row in db.cursor.fetchall():
        #
        #    fileid_phot = row['fileid']
        #    filepath = os.path.join('/archive/photometry/', row['path'])
        #
        #    tab = Table.read(filepath, format='ascii.ecsv')
        #
        #    indx_raw = (tab['starid'] == 0)
        #    indx_sub = (tab['starid'] == -1)
        #    indx_ref = (tab['starid'] > 0)
        #
        #    phot_details = {
        #        'fileid_phot': fileid_phot,
        #        'fileid_img': int(tab.meta['fileid']),
        #        'fileid_template': tab.meta['template'],
        #        'fileid_diffimg': None if 'diffimg' not in tab.meta else tab.meta['diffimg'],
        #        'obstime': tab.meta['obstime-bmjd'],
        #        'mag_raw': float(tab[indx_raw]['mag']),
        #        'mag_raw_error': float(tab[indx_raw]['mag_error']),
        #        'mag_sub': None if not any(indx_sub) else float(tab[indx_sub]['mag']),
        #        'mag_sub_error': None if not any(indx_sub) else float(tab[indx_sub]['mag_error']),
        #        'zeropoint': float(tab.meta['zp']),
        #        'zeropoint_error': None if 'zp_error' not in tab.meta else float(tab.meta['zp_error']),
        #        'zeropoint_diff': None if 'zp_diff' not in tab.meta else float(tab.meta['zp_diff']),
        #        'fwhm': None if 'fwhm' not in tab.meta else tab.meta['fwhm'],
        #        'seeing': None if 'seeing' not in tab.meta else tab.meta['seeing'],
        #        'references_detected': int(np.sum(indx_ref)),
        #        'used_for_epsf': int(np.sum(tab['used_for_epsf'])),
        #        'faintest_reference_detected': float(np.max(tab[indx_ref]['mag'])),
        #        'pipeline_version': tab.meta['version'],
        #    }
        #
        #    for key in ('fwhm', 'seeing'):
        #        if isinstance(phot_details[key], Quantity):
        #            phot_details[key] = float(phot_details[key].value)
        #
        #    print(phot_details)
        #    print(row['targetid'])
        #
        #    db.cursor.execute("""INSERT INTO flows.photometry_details (
        #        fileid_phot,
        #        fileid_img,
        #        fileid_template,
        #        fileid_diffimg,
        #        obstime_bmjd,
        #        mag_raw,
        #        mag_raw_error,
        #        mag_sub,
        #        mag_sub_error,
        #        zeropoint,
        #        zeropoint_error,
        #        zeropoint_diff,
        #        fwhm,
        #        seeing,
        #        references_detected,
        #        used_for_epsf,
        #        faintest_reference_detected,
        #        pipeline_version
        #    ) VALUES (
        #        %(fileid_phot)s,
        #        %(fileid_img)s,
        #        %(fileid_template)s,
        #        %(fileid_diffimg)s,
        #        %(obstime)s,
        #        %(mag_raw)s,
        #        %(mag_raw_error)s,
        #        %(mag_sub)s,
        #        %(mag_sub_error)s,
        #        %(zeropoint)s,
        #        %(zeropoint_error)s,
        #        %(zeropoint_diff)s,
        #        %(fwhm)s,
        #        %(seeing)s,
        #        %(references_detected)s,
        #        %(used_for_epsf)s,
        #        %(faintest_reference_detected)s,
        #        %(pipeline_version)s
        #    );""", phot_details)
        #
        #db.conn.commit()

db.cursor.execute("SELECT fileid,path,targetid FROM flows.files LEFT JOIN flows.photometry_details d ON d.fileid_phot=files.fileid WHERE files.datatype=2 AND d.faintest_reference_detected='NaN'::real;") | ||
for row in db.cursor.fetchall(): | ||
|
||
fileid_phot = row['fileid'] | ||
filepath = os.path.join('/archive/photometry/', row['path']) | ||
|
||
tab = Table.read(filepath, format='ascii.ecsv') | ||
print(len(tab)) | ||
|
||
indx_ref = (tab['starid'] > 0) | ||
|
||
frd = float(np.nanmax(tab[indx_ref]['mag'])) | ||
if np.isnan(frd): | ||
frd = None | ||
|
||
print(fileid_phot, frd) | ||
|
||
db.cursor.execute("UPDATE flows.photometry_details SET faintest_reference_detected=%s WHERE fileid_phot=%s;", [frd, fileid_phot]) | ||
|
||
db.conn.commit() |