# Welcome to astroNN’s documentation!

astroNN is a Python package for building various kinds of neural networks, with targeted applications in astronomy. It uses the Keras API for model and training prototyping while taking advantage of the flexibility of TensorFlow or PyTorch.

For non-astronomy applications, astroNN contains custom loss functions and layers compatible with Keras v3. The custom loss functions are mostly designed to deal with incomplete labels. astroNN also contains a demo of implementing a Bayesian neural network with dropout variational inference, from which you can get reasonable uncertainty estimates, as well as demos of other neural networks.
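As an illustration of the idea behind those incomplete-label losses, here is a minimal plain-Python sketch, not the actual astroNN implementation (which operates on Keras tensors), of a mean squared error that skips entries flagged with a missing-label sentinel. The sentinel value used below is an assumption for the example:

```python
# Illustrative sketch of a loss that ignores incomplete labels, in the
# spirit of astroNN's magic-number handling. MAGIC_NUMBER is an assumed
# sentinel marking a missing label in this example.
MAGIC_NUMBER = -9999.0

def masked_mse(y_true, y_pred):
    """Mean squared error computed over labelled entries only."""
    pairs = [(t, p) for t, p in zip(y_true, y_pred) if t != MAGIC_NUMBER]
    if not pairs:  # every label missing: contribute nothing to the loss
        return 0.0
    return sum((t - p) ** 2 for t, p in pairs) / len(pairs)

print(masked_mse([1.0, MAGIC_NUMBER, 3.0], [1.5, 0.0, 2.0]))  # 0.625
```

The missing entry contributes neither to the sum nor to the denominator, so a partially labelled sample still produces a meaningful gradient from its labelled components.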

For astronomy applications, astroNN contains tools to deal with APOGEE, Gaia and LAMOST data. astroNN is mainly designed for applying neural networks to APOGEE spectra analysis and for predicting luminosity from spectra using Gaia parallax data, with reasonable uncertainties from a Bayesian neural network. More generally, astroNN can also handle 2D images, both grayscale and colored. astroNN is currently developed by the main author to facilitate his research on deep learning applications in stellar and Galactic astronomy using SDSS APOGEE, Gaia and LAMOST data.
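To make the luminosity target concrete, here is a small self-contained sketch (plain Python, not astroNN API) of how an apparent magnitude and a Gaia parallax combine into an absolute magnitude, with first-order propagation of the parallax uncertainty:

```python
import math

# Illustrative only: the absolute-magnitude relation for a parallax in
# milliarcseconds is M = m + 5*log10(parallax_mas) - 10, and to first
# order the parallax error propagates as sigma_M = (5/ln 10) * (sigma/parallax).
def absolute_magnitude(app_mag, parallax_mas, parallax_err_mas):
    """Return (M, sigma_M) for a parallax given in milliarcseconds."""
    abs_mag = app_mag + 5.0 * math.log10(parallax_mas) - 10.0
    sigma_m = (5.0 / math.log(10.0)) * (parallax_err_mas / parallax_mas)
    return abs_mag, sigma_m

M, sigma = absolute_magnitude(10.0, 10.0, 0.5)
print(M)      # 5.0  (a star at 100 pc)
print(sigma)  # ~0.109 mag from a 5% parallax error
```

A Bayesian network predicting luminosity from spectra effectively learns this target while also reporting its own predictive uncertainty alongside the propagated observational one.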

For learning purposes, astroNN includes a deep learning toy dataset for astronomers - the Galaxy10 DECaLS dataset.

## Acknowledging astroNN

If you use astroNN in your research, please cite: **Deep learning of multi-element abundances from high-resolution spectroscopic data** [arXiv:1808.04428] [ADS]

Here is a list of publications using `astroNN`:

- Publications using astroNN

## Indices, tables and astroNN structure

- Getting Started
- Contributor and Issue Reporting guide
- Changelog
- Publications using astroNN
- Loss Functions and Metrics
  - Correction Term for Magic Number
  - Mean Squared Error
  - Mean Absolute Error
  - Mean Error
  - Regression Loss and Predictive Variance Loss for Bayesian Neural Net
  - Mean Squared Logarithmic Error
  - Mean Absolute Percentage Error
  - Mean Percentage Error
  - Categorical Cross-Entropy
  - Binary Cross-Entropy
  - Categorical Cross-Entropy and Predictive Logits Variance for Bayesian Neural Net
  - Binary Cross-Entropy and Predictive Logits Variance for Bayesian Neural Net
  - Categorical Classification Accuracy
  - Binary Classification Accuracy
  - Zeros Loss

- Layers
  - Monte Carlo Dropout Layer
  - Monte Carlo Spatial Dropout Layer
  - Monte Carlo Gaussian Dropout Layer
  - Error Propagation Layer
  - KL-Divergence Layer for Variational Autoencoder
  - Mean and Variance Calculation Layer for Bayesian Neural Net
  - Repeat Vector Layer for Bayesian Neural Net
  - Fast Monte Carlo Integration Layer for Keras Model
  - Gradient Stopping Layer
  - Boolean Masking Layer
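The Monte Carlo dropout layers above all rest on one idea: keep dropout active at inference time, run many stochastic forward passes, and read the spread of the outputs as model uncertainty. A conceptual plain-Python sketch (the tiny "network" below is a stand-in for illustration, not an astroNN model):

```python
import random
import statistics

# Conceptual sketch of Monte Carlo Dropout inference. A real astroNN /
# Keras model would do these stochastic passes on tensors; here a single
# linear "layer" with inverted dropout stands in for the network.
def stochastic_forward(x, weights, drop_rate=0.5, rng=random):
    kept = [w for w in weights if rng.random() >= drop_rate]
    scale = 1.0 / (1.0 - drop_rate)  # inverted-dropout rescaling
    return sum(w * x for w in kept) * scale

def mc_predict(x, weights, n_samples=1000, seed=0):
    """Predictive mean and standard deviation over stochastic passes."""
    rng = random.Random(seed)
    samples = [stochastic_forward(x, weights, rng=rng) for _ in range(n_samples)]
    return statistics.mean(samples), statistics.stdev(samples)

mean, std = mc_predict(2.0, [0.1, 0.4, -0.2, 0.3])
```

With enough samples the mean approaches the deterministic prediction, while the standard deviation gives the dropout-variational uncertainty estimate.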

- Callbacks and Utilities
- Neural Nets Classes and Basic Usage
  - Available astroNN Neural Net Classes
  - NeuralNetMaster Class API
  - Workflow of Setting up astroNN Neural Nets Instances and Training
  - Load astroNN Generated Folders
  - Load and Use Multiple astroNN Generated Folders
  - Workflow of Testing and Distributing astroNN Models
  - Creating Your Own Model with astroNN Neural Net Classes
  - NeuralNetMaster Class

- Mini Tools for APOGEE data
- Mini Tools for LAMOST data
- Mini Tools for Gaia data
- APOGEE Spectra with Convolutional Neural Net - **ApogeeCNN**
- APOGEE Spectra with Bayesian Neural Net - **ApogeeBCNN**
- APOGEE Spectra with Censored Bayesian NN - **ApogeeBCNNCensored**
- APOGEE Spectra with Bayesian NN and Gaia offset calibration - **ApogeeDR14GaiaDR2BCNN**
- Convolutional Variational Autoencoder - **ApogeeCVAE**
- Encoder-decoder for APOGEE and Kepler - **ApokascEncoderDecoder**
- StarNet (arXiv:1709.09182)
- Cifar10 with astroNN