Callbacks and Utilities
A callback is a set of functions in the astroNN.nn.callbacks and astroNN.nn.utilities modules to be applied at given stages of the training procedure. astroNN provides some customized callbacks built on tensorflow.keras; you can treat astroNN customized callbacks as conventional Keras callbacks. astroNN also contains some handy utilities for data processing.
Virtual CSVLogger (Callback)

class astroNN.nn.callbacks.VirutalCSVLogger(filename='training_history.csv', separator=',', append=False)

A modification of Keras' CSVLogger that does not actually write a file until a method is called to save it.

Parameters:
filename (str) – filename of the log, default is 'training_history.csv'
separator (str) – string used to separate elements in the csv file
append (bool) – whether to append to the file if it already exists

Returns: callback instance
Return type: VirutalCSVLogger
History:
2018-Feb-22 - Written - Henry Leung (University of Toronto)
2018-Mar-12 - Update - Henry Leung (University of Toronto)
VirutalCSVLogger is basically Keras's CSVLogger without Python 2 support; it won't write the file to disk until the savefile() method is called after training, whereas Keras's CSVLogger writes to disk immediately.
VirutalCSVLogger can be imported by
from astroNN.nn.callbacks import VirutalCSVLogger

It can be used with Keras; you just have to import the class from astroNN:
def keras_model():
    # Your keras_model is defined here
    return model

# Create a VirutalCSVLogger instance first
csvlogger = VirutalCSVLogger()

# Default filename is training_history.csv
# You have to set the filename first before passing it to Keras
csvlogger.filename = 'training_history.csv'

model = keras_model()
model.compile(...)
model.fit(..., callbacks=[csvlogger])

# Save the file to the current directory
csvlogger.savefile()

# OR save the file to another directory
csvlogger.savefile(folder_name='some_folder')

Raising Error on NaN (Callback)

class astroNN.nn.callbacks.ErrorOnNaN

Callback that raises an error when a NaN loss is encountered.

Returns: callback instance
Return type: ErrorOnNaN
History: 2018-May-07 - Written - Henry Leung (University of Toronto)
ErrorOnNaN is basically Keras's TerminateOnNaN but raises a ValueError on NaN. It is useful for Python unittest, where you can catch the error and know something is wrong.
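The NaN check at the heart of such a callback can be sketched in plain Python (a hypothetical stand-in to illustrate the behaviour, not astroNN's actual callback code):

```python
import math

def check_loss(loss):
    """Raise ValueError on a NaN loss, mimicking what ErrorOnNaN
    does after every training batch."""
    if math.isnan(loss):
        raise ValueError(f"Batch loss is {loss}, terminating training")
    return loss

# A normal loss passes through unchanged
print(check_loss(0.25))

# A NaN loss raises, so a unit test can catch it
try:
    check_loss(float("nan"))
except ValueError as err:
    print("caught:", err)
```

Because a ValueError is raised rather than training merely stopping, a test suite can assert on the failure directly.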
Normalizer (Utility)
The astroNN Normalizer is called when the train() method is called, as part of the pre_training_checklist_master() method defined in the NeuralNetMaster class. The Normalizer will not normalize data/labels equal to the magicnumber defined in the configuration file, so that the astroNN loss functions can recognize those missing/bad data.
The Normalizer has a few modes you can choose from; in general, a mode subtracts a mean from the data and divides by a standard deviation.
Mode 0 means normalizing data with mean=0 and standard deviation=1 (same as doing nothing)
# If we have some data
data = np.array([[1,2,3], [9,8,7]])

# The normalized data by this mode
norm_data = array([[1,2,3], [9,8,7]])

# the mean and standard deviation used to do the normalization
mean = [0.]
std = [1.]

Mode 1 means normalizing data with a single mean and a single standard deviation of the data
# If we have some data
data = np.array([[1,2,3], [9,8,7]])

# The normalized data by this mode
norm_data = array([[-1.28653504, -0.96490128, -0.64326752], [1.28653504, 0.96490128, 0.64326752]])

# the mean and standard deviation used to do the normalization
mean = [5.0]
std = [3.11]

Mode 2 means normalizing data with pixelwise means and pixelwise standard deviations of the data
# If we have some data
data = np.array([[1,2,3], [9,8,7]])

# The normalized data by this mode
norm_data = array([[-1., -1., -1.], [1., 1., 1.]])

# the mean and standard deviation used to do the normalization
mean = [5., 5., 5.]
std = [4., 3., 2.]

Mode 3 means normalizing data with featurewise means and standard deviation=1 (i.e. only centering the data); it is useful for normalizing spectra
# If we have some data
data = np.array([[1,2,3], [9,8,7]])

# The normalized data by this mode
norm_data = array([[-4., -3., -2.], [4., 3., 2.]])

# the mean and standard deviation used to do the normalization
mean = [5., 5., 5.]
std = [1.]

Mode 3s means normalizing data with featurewise means and standard deviation=1 (i.e. only centering the data), then applying a sigmoid for normalization (or the sigmoid inverse for denormalization). It is useful for normalizing spectra for a Variational Autoencoder with a Negative Log Likelihood objective.
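Mode 3s can be sketched in NumPy as featurewise centering followed by a sigmoid, with the logit (inverse sigmoid) undoing it (a conceptual sketch under those assumptions, not astroNN's actual implementation):

```python
import numpy as np

def norm_mode3s(data):
    """Featurewise centering, then a sigmoid squashing into (0, 1)."""
    mean = data.mean(axis=0)
    centered = data - mean
    return 1.0 / (1.0 + np.exp(-centered)), mean

def denorm_mode3s(norm_data, mean):
    """Inverse sigmoid (logit), then add the featurewise mean back."""
    return np.log(norm_data / (1.0 - norm_data)) + mean

data = np.array([[1., 2., 3.], [9., 8., 7.]])
norm, mean = norm_mode3s(data)
recovered = denorm_mode3s(norm, mean)
print(np.allclose(recovered, data))  # the round trip recovers the input
```

The sigmoid keeps every normalized value strictly in (0, 1), which is what makes this mode convenient for a Negative Log Likelihood objective.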
Mode 255 means normalizing data with mean=127.5 and standard deviation=127.5; this mode is designed to normalize 8-bit images
# If we have some data
data = np.array([[255,125,100], [99,87,250]])

# The normalized data by this mode
norm_data = array([[1., -0.01960784, -0.21568627], [-0.22352941, -0.31764706, 0.96078431]])

# the mean and standard deviation used to do the normalization
mean = [127.5]
std = [127.5]

You can set the mode from an astroNN neural net instance before calling the train() method by

# To set the normalization mode for input and labels
astronn_neuralnet.input_norm_mode = ...
astronn_neuralnet.labels_norm_mode = ...

You can use Normalizer() independently to take advantage of the fact that it won't touch data equal to the magicnumber. Normalizer() always returns the normalized data along with the mean and standard deviation used to do the normalization.
from astroNN.nn.utilities.normalizer import Normalizer
import numpy as np

# Make some data up
data = np.array([[1.,2.,3.], [9.,8.,7.]])

# Setup a normalizer instance with a mode, let's say mode 1
normer = Normalizer(mode=1)

# Use the instance method normalize to normalize the data
norm_data = normer.normalize(data)
print(norm_data)
>>> array([[-1.28653504, -0.96490128, -0.64326752], [1.28653504, 0.96490128, 0.64326752]])
print(normer.mean_labels)
>>> 5.0
print(normer.std_labels)
>>> 3.1091263510296048

# You can use the same instance (with the same mean, std and mode) to denormalize data
denorm_data = normer.denormalize(norm_data)
print(denorm_data)
>>> array([[1.,2.,3.], [9.,8.,7.]])

Useful Handy Tensorflow function - astroNN.nn

astroNN.nn.reduce_var(x, axis=None, keepdims=False)

Calculate variance using Tensorflow (as opposed to tf.nn.moments, which returns both variance and mean)

Parameters:
x (tf.Tensor) – Data
axis (int) – Axis
keepdims (boolean) – Whether to keep the reduced variance dimension in the output or not

Returns: Variance
Return type: tf.Tensor
History: 2018-Mar-04 - Written - Henry Leung (University of Toronto)
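The semantics mirror NumPy's np.var, which makes the axis and keepdims behaviour easy to see without Tensorflow:

```python
import numpy as np

# np.var reduces the same way reduce_var does: over all elements by
# default, or along an axis, optionally keeping the reduced dimension
x = np.array([[1., 2., 3.], [9., 8., 7.]])

print(np.var(x))                         # one scalar over all elements
print(np.var(x, axis=0))                 # per-column variances
print(np.var(x, axis=0, keepdims=True))  # same values, but shape (1, 3)
```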

astroNN.nn.intpow_avx2(x, n)

Calculate the integer power of a float (including negative powers) even with Tensorflow compiled with AVX2, since the --fast-math compiler flag (common alongside the AVX2 flag) aggressively optimizes float operations

Parameters:
x (tf.Tensor) – input tensor
n (int) – an integer power (a float will be casted to integer!!)

Returns: powered float(s)
Return type: tf.Tensor
History: 2018-Aug-13 - Written - Henry Leung (University of Toronto)
from astroNN.nn import intpow_avx2
import tensorflow as tf

print(intpow_avx2(tf.constant([1.2]), 2))
>>> tf.Tensor([1.44], shape=(1,), dtype=float32)

print(tf.pow(tf.constant([1.2]), 2))
# if your tensorflow is compiled with AVX2 or fast-math
>>> tf.Tensor([nan], shape=(1,), dtype=float32)
# if your tensorflow is NOT compiled with AVX2 or fast-math
>>> tf.Tensor([1.44], shape=(1,), dtype=float32)

NumPy Implementation of Tensorflow function - astroNN.nn.numpy
astroNN has handy NumPy implementations of a number of tensorflow functions. The available functions are listed below.

astroNN.nn.numpy.kl_divergence(x, y)

NumPy implementation of tf.distributions.kl_divergence

Either both x and y are ndarrays or both are astropy Quantity; the return value carries no astropy units in all cases
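For discrete distributions the divergence is the sum of x·log(x/y); a minimal NumPy sketch assuming plain ndarray inputs (not astroNN's actual implementation, which also handles astropy Quantity):

```python
import numpy as np

def kl_div(x, y):
    """KL divergence between two discrete probability distributions."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.sum(x * np.log(x / y))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(kl_div(p, q))  # positive, since p differs from q
print(kl_div(p, p))  # 0.0 for identical distributions
```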

astroNN.nn.numpy.mean_absolute_error(x, y, axis=None)

NumPy implementation of tf.keras.metrics.mean_absolute_error with the capability to deal with magicnumber and astropy Quantity

Either both x and y are ndarrays or both are astropy Quantity; the return value carries no astropy units in all cases

Raises: TypeError when only either x or y contains astropy units. Both x and y should carry/not carry astropy units at the same time
Returns: Mean Absolute Error
Return type: Union[ndarray, float]
History: 2018-Apr-11 - Written - Henry Leung (University of Toronto)
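The magic-number handling can be sketched in NumPy by masking those entries out of the mean (a conceptual sketch, not astroNN's actual code; -9999. is used here as an assumed placeholder value for the configurable magicnumber):

```python
import numpy as np

MAGIC_NUMBER = -9999.  # assumed placeholder marking missing/bad data

def mae_with_magic(x, y):
    """Mean absolute error that ignores entries flagged with the magic number."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mask = y != MAGIC_NUMBER      # keep only valid entries
    return np.mean(np.abs(x[mask] - y[mask]))

pred = np.array([1.0, 2.0, 3.0])
truth = np.array([1.5, MAGIC_NUMBER, 2.0])
print(mae_with_magic(pred, truth))  # averages only the two valid entries
```

The same masking idea carries over to the percentage and median variants below.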

astroNN.nn.numpy.mean_absolute_percentage_error(x, y, axis=None)

NumPy implementation of tf.keras.metrics.mean_absolute_percentage_error with the capability to deal with magicnumber and astropy Quantity

Either both x and y are ndarrays or both are astropy Quantity; the return value carries no astropy units in all cases

Raises: TypeError when only either x or y contains astropy units. Both x and y should carry/not carry astropy units at the same time
Returns: Mean Absolute Percentage Error
Return type: Union[ndarray, float]
History: 2018-Apr-11 - Written - Henry Leung (University of Toronto)

astroNN.nn.numpy.median_absolute_error(x, y, axis=None)

NumPy implementation of a median version of tf.keras.metrics.mean_absolute_error with the capability to deal with magicnumber and astropy Quantity

Either both x and y are ndarrays or both are astropy Quantity; the return value carries no astropy units in all cases

Raises: TypeError when only either x or y contains astropy units. Both x and y should carry/not carry astropy units at the same time
Returns: Median Absolute Error
Return type: Union[ndarray, float]
History: 2018-May-13 - Written - Henry Leung (University of Toronto)

astroNN.nn.numpy.median_absolute_percentage_error(x, y, axis=None)

NumPy implementation of a median version of tf.keras.metrics.mean_absolute_percentage_error with the capability to deal with magicnumber and astropy Quantity

Either both x and y are ndarrays or both are astropy Quantity; the return value carries no astropy units in all cases

Raises: TypeError when only either x or y contains astropy units. Both x and y should carry/not carry astropy units at the same time
Returns: Median Absolute Percentage Error
Return type: Union[ndarray, float]
History: 2018-May-13 - Written - Henry Leung (University of Toronto)