Projects¶
Machine-Learning Prototype¶
Overview¶
- This project aims to develop a prototype machine-learning (ML) framework to study and reproduce human expert decision-making in data analysis.
- Possible partnership with industry
- The architecture of the prototype will be developed using the AT2025ulz dataset
Motivation¶
During the AT2025ulz follow-up campaign, several limitations of automated processing were identified, including template-subtraction artifacts caused by galaxy contamination and degraded image quality when the source was faint. Experts therefore ran the pipeline manually, adjusting parameters and selectively overriding defaults, which produced a significantly cleaner and more reliable light curve.
This project is motivated by the need to formalize, quantify, and learn from these human interventions, with the long-term goal of improving automated processing and informing future ML-assisted analysis strategies.
Objectives¶
- Extract the human-tuned parameters from the different STDWeb tasks
- Re-run STDPipe with default settings
- Compare human-tuned processing with a purely machine approach
- Identify which aspects of human-tuned STDWeb/STDPipe processing most strongly affect the final photometric outcomes
- Construct a database of all relevant parameters to be used as input for the ML experiments
Scope & Methodology¶
- Formalization of Human Interventions: Identify and categorize manual actions performed during expert runs, such as:
- Template-subtraction adjustments
- Image-quality overrides
- Parameter tuning in detection and photometry steps
- Parameter Availability Assessment: Evaluate which of these decisions are already logged or recoverable from STDPipe databases and task configurations.
- Parameter Schema Definition: Define a sufficient parameter schema (per image / per epoch) describing how each analysis was performed.
- Dataset Construction: Generate parallel datasets comparing:
- Expert-tuned runs
- Default STDPipe runs
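As an illustration of what such a per-image parameter schema and paired dataset could look like, the sketch below uses a Python dataclass with hypothetical field names (`template_source`, `detection_threshold`, etc. are placeholders, not actual STDWeb/STDPipe configuration keys):

```python
from dataclasses import dataclass

# Hypothetical per-image parameter record; the field names are
# illustrative placeholders, not real STDWeb/STDPipe configuration keys.
@dataclass
class RunParameters:
    image_id: str               # unique identifier of the processed image
    run_type: str               # "expert" or "default"
    template_source: str        # e.g. "PS1" or "custom"
    detection_threshold: float  # detection S/N threshold used
    aperture_radius: float      # photometric aperture radius (pixels)
    subtraction_enabled: bool   # whether image subtraction was applied

def pair_runs(expert_runs, default_runs):
    """Pair expert-tuned and default runs on image_id for comparison."""
    defaults = {r.image_id: r for r in default_runs}
    return [(e, defaults[e.image_id])
            for e in expert_runs if e.image_id in defaults]

# Toy example of building a paired dataset record.
expert = [RunParameters("img001", "expert", "custom", 3.0, 6.5, True)]
default = [RunParameters("img001", "default", "PS1", 5.0, 5.0, True)]
pairs = pair_runs(expert, default)
```

Each pair then provides one row of the "human-tuned vs default" dataset, which can later be flattened into tabular form for the ML experiments.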
Timeline & Milestones¶
- Short-term (by January 2026):
- Formalize list of human interventions to be modeled
- Define initial parameter schema
- Extract a first subset of task configurations
- Mid-term:
- Construct paired datasets (human-tuned vs default)
- Validate consistency and completeness of extracted parameters
- Later stages:
- ML experimentation and evaluation
- Assessment of implications for automated STDWeb/STDPipe workflows
Participants & Coordination¶
Key contributors:
- S. Antier, M. Pillas
- S. Karpov, D. Akl
Coordination is handled through the STDWeb group communication channels.
Automation of Data Reduction via OwnCloud Integration¶
Overview¶
This project aims to implement a fully automated data-reduction workflow for GRANDMA observations, using OwnCloud as the ingestion interface and STDWeb/STDPipe as the processing engine. The objective is to enable reliable, low-latency processing from image deposition to delivery of photometric data points to SkyPortal, with minimal manual intervention.
Motivation¶
GRANDMA follow-up campaigns routinely involve large data volumes from multiple telescopes operating under heterogeneous conditions. Manual triggering and supervision of data reduction is not scalable during high-priority or time-critical campaigns. This project addresses the need for a robust, automated, and monitored processing chain that ensures consistency, traceability, and timely availability of results.
Objectives¶
The main objectives of this project are:
- Automate the execution of STDWeb/STDPipe tasks upon image deposition on OwnCloud
- Standardize data ingestion across telescopes and observing programs
- Ensure consistent production of reduced products and photometric measurements
- Integrate automated uploads of results into SkyPortal
- Define monitoring and flagging mechanisms for unreliable or marginal results.
Scope & Methodology¶
The automated workflow is designed as follows:
OwnCloud Repository Initialization
OwnCloud repositories are created and organized by telescope and/or observing program (the exact folder-creation logic is still to be defined).
Automated Task Triggering
When a user deposits an image into the appropriate OwnCloud folder, the action triggers STDWeb processing.
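One minimal way to realize this trigger, assuming the OwnCloud folder is synced or mounted to a local path, is to poll it for new FITS files and hand each one to a submission routine. The `submit_to_stdweb` function below is a placeholder, since the actual STDWeb task-submission call (endpoint, authentication) is deployment-specific:

```python
import tempfile
from pathlib import Path

def new_images(watch_dir: Path, seen: set) -> list:
    """Return FITS files not yet processed, and mark them as seen."""
    fresh = sorted(p for p in watch_dir.glob("*.fits") if p not in seen)
    seen.update(fresh)
    return fresh

def submit_to_stdweb(image_path: Path) -> None:
    # Placeholder for the actual STDWeb task submission; the real call
    # is deployment-specific and not shown here.
    print(f"submitting {image_path.name} to STDWeb")

# Demonstration with a temporary directory standing in for the
# locally synced OwnCloud folder.
with tempfile.TemporaryDirectory() as tmp:
    watch = Path(tmp)
    (watch / "obs_001.fits").touch()   # user deposits an image
    seen: set = set()
    fresh = new_images(watch, seen)    # first poll picks it up
    for img in fresh:
        submit_to_stdweb(img)
    again = new_images(watch, seen)    # second poll finds nothing new
```

In production, a push-based mechanism (e.g. OwnCloud webhooks or filesystem notifications) would likely replace polling, but the dedup-and-submit logic stays the same.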
Pipeline Execution
STDWeb executes the relevant processing steps, including:
- Astrometry
- Photometry
- Additional tasks (e.g. transient detection, image subtraction), depending on folder configuration. These are still to be defined based on target and need; they may require dedicated flags on the folders and will be a second step, implemented after the initial automation of astrometry and forced photometry.
- Reduced images, measurements, and diagnostic outputs are produced
- Photometric results are uploaded automatically to SkyPortal: Monitoring is required to ensure correct association with source coordinates, instrument metadata, and filter definitions, particularly for tasks with unspecified or uncertain coordinates (e.g. poorly localized sources).
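A sketch of the upload step, assuming SkyPortal's REST API is used: the helper below assembles a payload in the general shape expected by SkyPortal's `/api/photometry` endpoint. The field names should be verified against the deployed SkyPortal version, and the `obj_id`, `instrument_id`, and filter values are illustrative only:

```python
def build_photometry_payload(obj_id, mjd, mag, magerr, filt, instrument_id,
                             limiting_mag=None, magsys="ab"):
    """Assemble a photometry payload in the general shape of SkyPortal's
    /api/photometry endpoint (verify field names against the deployed
    SkyPortal version)."""
    payload = {
        "obj_id": obj_id,              # SkyPortal source identifier
        "mjd": mjd,                    # epoch of the measurement
        "instrument_id": instrument_id,
        "filter": filt,                # bandpass name, e.g. "sdssr"
        "magsys": magsys,
    }
    if mag is not None:                # detection: magnitude + error
        payload["mag"] = mag
        payload["magerr"] = magerr
    if limiting_mag is not None:       # non-detection / upper limit
        payload["limiting_mag"] = limiting_mag
    return payload

# Illustrative values; obj_id and instrument_id are placeholders.
payload = build_photometry_payload("AT2025ulz", 60550.123, 19.2, 0.08,
                                   "sdssr", 7)
# The actual upload would be an authenticated POST, e.g.:
#   requests.post(f"{host}/api/photometry", json=payload,
#                 headers={"Authorization": f"token {api_token}"})
```

Keeping payload construction separate from the HTTP call makes it easy to validate the metadata association (coordinates, instrument, filter) before anything reaches SkyPortal.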
Quality Flagging
Selected detections (e.g. marginal measurements or low-quality results) will need to be flagged for manual review.
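Such flagging could start as simple threshold checks on standard quality metrics. The sketch below uses illustrative cuts on signal-to-noise, seeing, and zero-point scatter; the threshold values are placeholders, not adopted GRANDMA criteria:

```python
def quality_flags(snr, fwhm_arcsec, zp_rms,
                  snr_min=5.0, fwhm_max=5.0, zp_rms_max=0.1):
    """Return quality flags for a measurement. All thresholds are
    illustrative placeholders, not adopted GRANDMA criteria."""
    flags = []
    if snr < snr_min:
        flags.append("low_snr")            # marginal detection
    if fwhm_arcsec > fwhm_max:
        flags.append("poor_seeing")        # degraded image quality
    if zp_rms > zp_rms_max:
        flags.append("unstable_zeropoint") # unreliable calibration
    return flags

# A marginal measurement with a noisy photometric calibration.
flags = quality_flags(snr=3.8, fwhm_arcsec=2.1, zp_rms=0.15)
```

Any measurement with a non-empty flag list would then be routed to the manual-review queue instead of being published automatically.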
Participants & Coordination¶
This project is coordinated within the STDWeb/STDPipe group, with close interaction between developers, telescope teams, and SkyPortal contributors.
Key contributors:
- S. Karpov, D. Akl
- TBD