AIMET TensorFlow Cross Layer Equalization Primitive API
Introduction
If you want to modify the order of Cross Layer Equalization, skip certain steps, or manually tweak the list of layers that need to be equalized, the following APIs can be used.
The higher level APIs can be used to apply one or more steps one after the other. They automatically find the layers to be folded or scaled.
The lower level APIs can be used to manually specify the list of layers to be folded or scaled. The layers must be passed in the order in which they appear in the model.
Note: Before using High Bias Fold, Cross Layer Scaling (CLS) needs to be applied, and the scaling factors obtained from CLS need to be passed to High Bias Fold. In addition, if the model contains batchnorm layers, they need to be folded first and the folding information saved so it can be passed to the High Bias Fold API (see the code examples below).
Higher Level APIs for Cross Layer Equalization
API for Batch Norm Folding
- aimet_tensorflow.batch_norm_fold.fold_all_batch_norms(sess, input_op_names, output_op_names)[source]
- Fold all batch_norm layers in a model into corresponding conv layers
- Parameters
  - sess (Session) – active tf.compat.v1.Session
  - input_op_names (Union[str, List[str]]) – Name of the starting op in the given graph, or a list of names in the case of a multi-input model
  - output_op_names (Union[str, List[str]]) – List of output op names of the model, used to help ConnectedGraph determine valid ops (to ignore training ops, for example). If None, all ops in the model are considered valid.
- Return type
  - Tuple[Session, List[Tuple[Operation, Operation]]]
- Returns
  - A new session with the edited graph, and a list of pairs of layers [(Conv/Linear, BN layer that got folded)]
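A minimal usage sketch for this API is shown below. The session handling and the 'input_1' / 'fc1000/Softmax' op names are assumptions borrowed from the ResNet50 code examples further down.
import tensorflow as tf
from aimet_tensorflow.batch_norm_fold import fold_all_batch_norms

def fold_all_batch_norms_example(sess: tf.compat.v1.Session):
    """ Sketch only: assumes a ResNet50-style graph with ops named 'input_1' and 'fc1000/Softmax' """
    after_bn_fold_sess, folded_pairs = fold_all_batch_norms(sess, "input_1", "fc1000/Softmax")
    # use the returned session for subsequent steps; keep folded_pairs for HighBiasFold.bias_fold()
    return after_bn_fold_sess, folded_pairs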
 
API for Cross Layer Scaling
- aimet_tensorflow.cross_layer_equalization.CrossLayerScaling.scale_model(sess, input_op_names, output_op_names)
- Uses cross-layer scaling to scale all applicable layers in the given model
- Parameters
  - sess (Session) – Session containing the graph to scale
  - input_op_names (Union[str, List[str]]) – Names of starting ops in the model
  - output_op_names (Union[str, List[str]]) – List of output op names of the model, used to help ConnectedGraph determine valid ops (to ignore training ops, for example). If None, all ops in the model are considered valid.
- Return type
  - (Session, List[ClsSetInfo])
- Returns
  - updated session, CLS information for each CLS set
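A minimal usage sketch, assuming the same ResNet50-style op names as the examples further down:
import tensorflow as tf
from aimet_tensorflow.cross_layer_equalization import CrossLayerScaling

def scale_model_example(sess: tf.compat.v1.Session):
    """ Sketch only: sess is expected to hold a graph whose batchnorm layers have already been folded """
    after_cls_sess, cls_set_info_list = CrossLayerScaling.scale_model(sess, "input_1", "fc1000/Softmax")
    # keep cls_set_info_list so it can be passed to HighBiasFold.bias_fold()
    return after_cls_sess, cls_set_info_list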
 
API for High Bias Folding
- aimet_tensorflow.cross_layer_equalization.HighBiasFold.bias_fold(sess, folded_pairs, cls_set_info_list)
- Folds bias values greater than 3 * sigma to the next layer's bias
- Parameters
  - sess (Session) – Current session
  - folded_pairs (List[Tuple[Operation, Operation]]) – List of (Conv/Linear layer, corresponding folded BN layer) pairs
  - cls_set_info_list (List[ClsSetInfo]) – List of info elements for each CLS set
- Return type
  - Session
- Returns
  - updated session after graph updates from high bias fold
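The sketch below shows how the outputs of the previous two steps feed into this API; the variable names are assumptions chosen to match the code example that follows.
import tensorflow as tf
from aimet_tensorflow.cross_layer_equalization import HighBiasFold

def bias_fold_example(sess: tf.compat.v1.Session, folded_pairs, cls_set_info_list):
    """ Sketch only: folded_pairs comes from fold_all_batch_norms() and
    cls_set_info_list from CrossLayerScaling.scale_model() """
    after_hbf_sess = HighBiasFold.bias_fold(sess, folded_pairs, cls_set_info_list)
    return after_hbf_sess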
 
Code Examples for Higher Level APIs
Required imports
import tensorflow as tf
from tensorflow.keras.applications.resnet50 import ResNet50
from aimet_tensorflow.cross_layer_equalization import GraphSearchUtils, CrossLayerScaling, HighBiasFold
from aimet_tensorflow.batch_norm_fold import fold_all_batch_norms
Perform Cross Layer Equalization in auto mode step by step
def cross_layer_equalization_auto_stepwise():
    """ Individual api calls to perform cross layer equalization one step at a time"""
    # load a model
    tf.keras.backend.clear_session()
    _ = ResNet50(weights='imagenet', input_shape=(224, 224, 3))
    sess = tf.compat.v1.keras.backend.get_session()
    # get starting op name to invoke api for cle
    start_op_name = 'input_1'
    output_op_name = 'fc1000/Softmax'
    with sess.as_default():
        # replace any ReLU6 layers with ReLU
        graph_util = GraphSearchUtils(sess.graph, start_op_name, output_op_name)
        after_relu_replace_sess = graph_util.find_and_replace_relu6_with_relu(sess)
        # fold batchnorm layers
        after_bn_fold_sess, folded_pairs = fold_all_batch_norms(after_relu_replace_sess, start_op_name, output_op_name)
        # perform cross-layer scaling on applicable layer groups
        after_cls_sess, cls_set_info_list = CrossLayerScaling.scale_model(after_bn_fold_sess, start_op_name, output_op_name)
        # perform high bias fold
        # use the session after high bias fold returned for further evaluations on TF graph
        after_hbf_sess = HighBiasFold.bias_fold(after_cls_sess, folded_pairs, cls_set_info_list)
    sess.close()
Lower Level APIs for Cross Layer Equalization
API for Batch Norm Folding on subsets of convolution-batchnorm layer pairs
- aimet_tensorflow.batch_norm_fold.fold_given_batch_norms(sess, input_op_names, output_op_names, layer_pairs)[source]
- Api to fold a custom set of bn layers in a model
- Parameters
  - sess (Session) – active tensorflow session
  - input_op_names (Union[str, List[str]]) – starting op in the model, or a list of starting ops in the model
  - output_op_names (Union[str, List[str]]) – List of output op names of the model, used to help ConnectedGraph determine valid ops (to ignore training ops, for example).
  - layer_pairs (List[Tuple[Operation, Operation, bool]]) – List of tuples with conv and bn op layers as tf.Operation, plus a flag to indicate fold upstream or downstream
- Return type
  - Session
- Returns
  - updated session after fold
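A minimal sketch of building the layer_pairs argument is shown below; the op names are assumptions matching the ResNet50 helper functions further down.
import tensorflow as tf
from aimet_tensorflow.batch_norm_fold import fold_given_batch_norms

def fold_selected_batch_norms_example(sess: tf.compat.v1.Session):
    """ Sketch only: fold a single, manually chosen conv/bn pair """
    conv_op = sess.graph.get_operation_by_name('res2a_branch2a/Conv2D')
    bn_op = sess.graph.get_operation_by_name('bn2a_branch2a/cond/FusedBatchNorm_1')
    # each entry is (conv op, bn op, flag indicating fold upstream or downstream)
    layer_pairs = [(conv_op, bn_op, True)]
    after_fold_sess = fold_given_batch_norms(sess=sess, input_op_names="input_1",
                                             output_op_names="fc1000/Softmax",
                                             layer_pairs=layer_pairs)
    return after_fold_sess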
 
API for Cross Layer Scaling on subset of conv layer groups
- aimet_tensorflow.cross_layer_equalization.CrossLayerScaling.scale_cls_sets(sess, cls_sets)
- Scale multiple CLS sets
- Parameters
  - sess (Session) – Current session
  - cls_sets (List[Union[Tuple[Operation, Operation], Tuple[Operation, Operation, Operation]]]) – List of CLS sets
- Return type
  - List[Union[ndarray, Tuple[ndarray]]]
- Returns
  - Scaling factors calculated and applied for each CLS set, in order
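A minimal sketch of building the cls_sets argument; the op names are assumptions matching the ResNet50 helpers further down.
import tensorflow as tf
from aimet_tensorflow.cross_layer_equalization import CrossLayerScaling

def scale_selected_cls_sets_example(sess: tf.compat.v1.Session):
    """ Sketch only: a CLS set is a tuple of two or three consecutive conv ops (after batchnorm fold) """
    conv_a = sess.graph.get_operation_by_name('res2a_branch2a/Conv2D')
    conv_b = sess.graph.get_operation_by_name('res2a_branch2b/Conv2D')
    conv_c = sess.graph.get_operation_by_name('res2a_branch2c/Conv2D')
    cls_sets = [(conv_a, conv_b, conv_c)]
    # returns the scaling factors computed and applied for each CLS set, in order
    scaling_factor_list = CrossLayerScaling.scale_cls_sets(sess, cls_sets)
    return scaling_factor_list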
 
API for High Bias Folding
- aimet_tensorflow.cross_layer_equalization.HighBiasFold.bias_fold(sess, folded_pairs, cls_set_info_list)
- Folds bias values greater than 3 * sigma to the next layer's bias
- Parameters
  - sess (Session) – Current session
  - folded_pairs (List[Tuple[Operation, Operation]]) – List of (Conv/Linear layer, corresponding folded BN layer) pairs
  - cls_set_info_list (List[ClsSetInfo]) – List of info elements for each CLS set
- Return type
  - Session
- Returns
  - updated session after graph updates from high bias fold
 
Custom Datatype used
- class aimet_tensorflow.cross_layer_equalization.ClsSetInfo(cls_pair_1, cls_pair_2=None)[source]
- This class holds information about the layers in a CLS set, along with the corresponding scaling factors for the CLS set layers
 - class ClsSetLayerPairInfo(layer1, layer2, scale_factor, relu_activation_between_layers)[source]
 - Models a pair of layers that were scaled using CLS, and related information.
 - Parameters
   - layer1 (Operation) – layer as tf.Operation
   - layer2 (Operation) – layer as tf.Operation
   - scale_factor (ndarray) – scale factors as np.ndarray
   - relu_activation_between_layers – flag indicating whether there is a Relu activation between the layers
 - static map_cls_sets_to_new_session(tf_names_op_dict, cls_set_info_list)[source]
 - Helper function to update the ops stored during CLS, so they can be used by high bias fold with an updated session.
 - Parameters
   - tf_names_op_dict (Dict[str, Operation]) – map of tf op names to ops
   - cls_set_info_list – list of ClsSetInfo type
 - Returns
   - None; cls_set_info_list is updated in place
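The sketch below illustrates how a ClsSetInfo could be assembled by hand from the signatures above; the op names and scale factor are placeholders, and treating relu_activation_between_layers as a single boolean is an assumption.
import numpy as np
import tensorflow as tf
from aimet_tensorflow.cross_layer_equalization import ClsSetInfo

def build_cls_set_info_example(sess: tf.compat.v1.Session, scale_factor: np.ndarray):
    """ Sketch only: describe one scaled pair of layers for later use by HighBiasFold.bias_fold() """
    conv_a = sess.graph.get_operation_by_name('res2a_branch2a/Conv2D')
    conv_b = sess.graph.get_operation_by_name('res2a_branch2b/Conv2D')
    # True is assumed here to mean a Relu activation sits between the two layers
    pair_info = ClsSetInfo.ClsSetLayerPairInfo(conv_a, conv_b, scale_factor, True)
    # a two-layer CLS set needs only cls_pair_1; a three-layer set would also supply cls_pair_2
    return ClsSetInfo(pair_info)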
 
 
Code Example for Lower Level APIs
Required imports
import tensorflow as tf
from tensorflow.keras.applications.resnet50 import ResNet50
from aimet_tensorflow.batch_norm_fold import fold_given_batch_norms
from aimet_tensorflow.cross_layer_equalization import GraphSearchUtils, CrossLayerScaling, HighBiasFold
from aimet_tensorflow.utils.graph_saver import save_and_load_graph
from aimet_tensorflow.utils.op.conv import BiasUtils
Perform Cross Layer Equalization in manual mode
def cross_layer_equalization_manual():
    """ perform cross layer equalization using manual api"""
    # load a model
    tf.keras.backend.clear_session()
    _ = ResNet50(weights='imagenet', input_shape=(224, 224, 3))
    sess = tf.compat.v1.keras.backend.get_session()
    with sess.as_default():
        # Batch Norm Fold
        # pick potential pairs of conv and bn ops for fold
        layer_pairs = get_layer_pairs_Resnet50_for_folding(sess)
        # fold given layer
        after_fold_sess = fold_given_batch_norms(sess=sess, input_op_names="input_1", output_op_names="fc1000/Softmax",
                                                 layer_pairs=layer_pairs)
        # replace any ReLU6 layers with ReLU
        graph_search = GraphSearchUtils(after_fold_sess.graph, "input_1", "fc1000/Softmax")
        after_relu_replace_sess = graph_search.find_and_replace_relu6_with_relu(after_fold_sess)
        # Cross Layer Scaling
        # Create a list of consecutive conv layers to be equalized
        consecutive_layer_list = get_consecutive_layer_list_from_resnet50_for_scaling(after_relu_replace_sess)
        # invoke api to perform scaling on given list of cls pairs
        scaling_factor_list = CrossLayerScaling.scale_cls_sets(after_relu_replace_sess, consecutive_layer_list)
        # get info from bn fold and cross layer scaling in format required for high bias fold
        after_cls_sess, folded_pairs, cls_set_info_list = format_info_for_high_bias_fold(after_relu_replace_sess,
                                                                                         layer_pairs,
                                                                                         consecutive_layer_list,
                                                                                         scaling_factor_list)
        # perform high-bias fold
        after_hbf_sess = HighBiasFold.bias_fold(after_cls_sess, folded_pairs, cls_set_info_list)
    sess.close()
Example helper methods to perform CLE in manual mode
Helper to pick layers for batchnorm fold
def get_layer_pairs_Resnet50_for_folding(sess: tf.compat.v1.Session):
    """
    Helper function to pick example conv-batchnorm layer pairs for folding.
    :param sess: tensorflow session as tf.compat.v1.Session
    :return: pairs of conv and batchnorm layers for batch norm folding in Resnet50 model.
    """
    # pick conv and bn op pairs
    conv_op_1 = sess.graph.get_operation_by_name('res2a_branch2a/Conv2D')
    bn_op_1 = sess.graph.get_operation_by_name('bn2a_branch2a/cond/FusedBatchNorm_1')
    conv_op_2 = sess.graph.get_operation_by_name('res2a_branch2b/Conv2D')
    bn_op_2 = sess.graph.get_operation_by_name('bn2a_branch2b/cond/FusedBatchNorm_1')
    conv_op_3 = sess.graph.get_operation_by_name('res2a_branch2c/Conv2D')
    bn_op_3 = sess.graph.get_operation_by_name('bn2a_branch2c/cond/FusedBatchNorm_1')
    # make a layer pair list with the conv op and bn op pairs, along with a flag
    # to indicate whether the given bn op can be folded upstream or downstream.
    # an example with three pairs of conv and bn ops is shown below
    layer_pairs = [(conv_op_1, bn_op_1, True),
                   (conv_op_2, bn_op_2, True),
                   (conv_op_3, bn_op_3, True)]
    return layer_pairs
Helper to pick layers for cross layer scaling
def get_consecutive_layer_list_from_resnet50_for_scaling(sess: tf.compat.v1.Session):
    """
    helper function to pick example consecutive layer list for scaling.
    :param sess: tf.compat.v1.Session
    :return: sample layers for scaling as consecutive_layer_list from Resnet50 model
    """
    conv1_op = sess.graph.get_operation_by_name('res2a_branch2a/Conv2D')
    conv1_depthwise_op = sess.graph.get_operation_by_name('res2a_branch2b/Conv2D')
    conv1_pointwise_op = sess.graph.get_operation_by_name('res2a_branch2c/Conv2D')
    # conv layers for scaling (after bn fold)
    consecutive_layer_list = [(conv1_op, conv1_depthwise_op, conv1_pointwise_op)]
    return consecutive_layer_list
Helper to format data from batchnorm fold and cross layer scaling for usage by high bias fold
def format_info_for_high_bias_fold(sess, layer_pairs, consecutive_layer_list, scaling_factor_list):
    """
     Helper function that formats data from cross layer scaling and bn fold for usage by high bias fold.
    :param sess: tf.compat.v1.Session type
    :param layer_pairs: info obtained after batchnorm fold.
    :param consecutive_layer_list: info obtained after cross layer scaling
    :param scaling_factor_list: scaling params corresponding to consecutive_layer_list
    :return: data formatted for high bias fold.
    """
    # convert info after batch norm fold and cross layer scaling for usage by high bias fold api
    folded_pairs = []
    for (conv_op, bn_op, _fold_upstream_flag) in layer_pairs:
        folded_pairs.append((conv_op, bn_op))
    # List holding a boolean per cross layer scaling set, indicating whether there are relu activations between its layers
    is_relu_activation_in_cls_sets = []
    # Note the user is expected to fill in this list manually
    # Convert to a list of cls-set-info elements
    cls_set_info_list = CrossLayerScaling.create_cls_set_info_list(consecutive_layer_list,
                                                                   scaling_factor_list,
                                                                   is_relu_activation_in_cls_sets)
    # load and save the updated graph after scaling
    after_cls_sess = save_and_load_graph('./temp_cls', sess)
    return after_cls_sess, folded_pairs, cls_set_info_list