Multi Mat LaueNN scripts

In this repository we have included a few scripts that demonstrate the capability of LaueNN to detect many phases (here 4 phases). The scripts can be modified to detect more or fewer materials.

Example scripts for using the LaueNN module from Python are presented below.

There are three basic steps to launch the LaueNN module. Step 1 and Step 2 need to be run only once per material/case.

  1. Generation of the training dataset and training of the neural network:

    #!/usr/bin/env python
    # coding: utf-8
    
    # # Notebook script for generation of the training dataset (supports n-phase materials)
    # 
    # ## For the case of one or two phases, the GUI can be used
    # 
    # ## The different steps of data generation are outlined in this notebook (the LaueToolsNN GUI does the same thing)
    # 
    # ### Define material of interest
    # ### Generate class hkl data for Neural Network model (these are the output neurons)
    # ### Clean up generated dataset
    
    # In[1]:
    
    if __name__ == '__main__':     # guard required because multiprocessing spawns fresh interpreter processes
    
        ## If the material key does not exist in the lauetoolsnn dictionary,
        ## you can modify its JSON materials file before import or before starting the analysis
        import json
        
        ## Load the json of material and extinctions
        with open(r'C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools\material.json','r') as f:
            dict_Materials = json.load(f)
    
        with open(r'C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools\extinction.json','r') as f:
            extinction_json = json.load(f)
            
        ## Modify the dictionary values to add new entries
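        ## Each entry is: [material key, [a, b, c, alpha, beta, gamma] (angstroms, degrees),
        ##                 extinction rule or space-group key defined in extinction.json]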
        dict_Materials["ZrO2_mono"] = ["ZrO2_mono", [5.1471,  5.2125, 5.3129, 90, 99.23, 90], "VO2_mono"]
        dict_Materials["ZrO2_tet"] = ["ZrO2_tet", [3.64,  3.64, 5.27, 90, 90, 90], "VO2_mono2tet"]
        dict_Materials["ZrO2_cub"] = ["ZrO2_cub", [4.625,  4.625, 4.625, 90, 90, 90], "225"]
        dict_Materials["Zr_alpha"] = ["Zr_alpha", [3.23,  3.23, 5.15, 90, 90, 120], "hcp"]
        dict_Materials["Zr_beta"] = ["Zr_beta", [3.62,  3.62, 3.62, 90, 90, 90], "229"]
        dict_Materials["Nb_beta"] = ["Nb_beta", [3.585,  3.585, 3.585, 90, 90, 90], "229"]
        dict_Materials["Zr_Nb_Fe"] = ["Zr_Nb_Fe", [4.879,  4.879, 7.992, 90, 90, 120], "hcp"]
        dict_Materials["Cr"] = ["Cr", [2.87,  2.87, 2.87, 90, 90, 90], "229"]
        dict_Materials["C14"] = ["C14", [5.05,  5.05, 8.24, 90, 90, 120], "hcp"]
        dict_Materials["C15"] = ["C15", [7.15,  7.15, 7.15, 90, 90, 90], "227"]
        
        extinction_json["VO2_mono2tet"] = "VO2_mono2tet"
        extinction_json["VO2_mono"] = "VO2_mono"
        extinction_json["hcp"] = "hcp"
        extinction_json["229"] = "229"
        extinction_json["227"] = "227"
        extinction_json["225"] = "225"
        
        ## dump the json back with new values
        with open(r'C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools\material.json', 'w') as fp:
            json.dump(dict_Materials, fp)
    
        with open(r'C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools\extinction.json', 'w') as fp:
            json.dump(extinction_json, fp)
    
        ## Import modules used for this Notebook
        import os
        import numpy as np
        import _pickle as cPickle
        import itertools
        from keras.callbacks import EarlyStopping, ModelCheckpoint
        import matplotlib.pyplot as plt
        from tqdm import trange
        ## if LaueToolsNN is properly installed
        try:
            from lauetoolsnn.utils_lauenn import generate_classHKL, generate_multimat_dataset, \
                                            rmv_freq_class_MM, get_multimaterial_detail,\
                                            array_generator, array_generator_verify, vali_array
            from lauetoolsnn.NNmodels import model_arch_general_optimized, LoggingCallback,\
                                        model_arch_general_onelayer, model_arch_CNN_DNN_optimized
        except ImportError:
            # else import from a path where LaueToolsNN files are
            import sys
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn")
            from utils_lauenn import generate_classHKL, generate_multimat_dataset, \
                                    rmv_freq_class_MM, get_multimaterial_detail,\
                                    array_generator, array_generator_verify, vali_array
            from NNmodels import model_arch_general_optimized, LoggingCallback,\
                                        model_arch_general_onelayer, model_arch_CNN_DNN_optimized
                                    
                                    
        
        # ## step 1: define material and other parameters for simulating Laue patterns
            
        # In[2]:
        # =============================================================================
        ## User Input dictionary with parameters
        ## In the case of a single phase/material, provide lists with a single entry
        # =============================================================================
        input_params = {
                        # =============================================================================
                        #       GENERATION OF DATASET              
                        # =============================================================================
                        "prefix" : "",
                        "material_": ["Zr_alpha",
                                      "ZrO2_mono",
                                      ],             ## same key as used in dict_LaueTools
                        "symmetry": ["hexagonal",
                                     "monoclinic",
                                     ],           ## crystal symmetry of material_
                        "SG": [194,
                               14,
                               ],                     ## Space group of material_ (None if not known)
                        "hkl_max_identify" : [6,
                                              8,
                                              ],        ## Maximum hkl index to classify in a Laue pattern
                        "nb_grains_per_lp" : [4,
                                              4,
                                              ],        ## max grains to be generated in a Laue image
                        "grains_nb_simulate" : 500,    ## Number of orientations to generate (takes advantage of crystal symmetry)
                        "maximum_angle_to_search":120, ## Angle of radial distribution to reconstruct the histogram (in deg)
                        "step_for_binning" : 0.1,      ## bin width of the angular radial distribution, in degrees
                        # =============================================================================
                        #        Detector parameters (roughly) of the Experimental setup
                        # =============================================================================
                        ## Sample-detector distance, X center, Y center, two detector angles
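                        ## (distance in mm; beam centers in pixels; angles in degrees)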
                        "detectorparameters" :  [79.553,979.32,932.31,0.37,0.447], 
                        "pixelsize" : 0.0734,          ## Detector pixel size
                        "dim1":2018,                   ## Dimensions of detector in pixels
                        "dim2":2016,
                        "emin" : 5,                    ## Minimum and maximum energy to use for simulating Laue Patterns
                        "emax" : 22,
                        # =============================================================================
                        #       Training parameters             
                        # =============================================================================
                        "freq_rmv_classhkl" : [100,
                                               100,
                                               ],
                        "keep_length_classhkl" : [50,
                                                  100,
                                                  ],
                        "list_hkl_keep" : [
                                            [(0,0,1)],
                                            [(0,0,0)]                        
                                            ],
                        "batch_size":50,               ## batches of files to use while training
                        "epochs":30,   
                        }
        
        generate_data = False
        train_model = True
        
        # ### Number of files that will be generated for training
        nb_grains_list = []
        for ino, imat in enumerate(input_params["material_"]):
            nb_grains_list.append(list(range(input_params["nb_grains_per_lp"][ino]+1)))
        list_permute = list(itertools.product(*nb_grains_list))
        list_permute.pop(0)
        print(len(list_permute)*input_params["grains_nb_simulate"])
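        # e.g. with nb_grains_per_lp = [4, 4] there are 5*5 = 25 grain-count combinations;
        # dropping the all-zero case leaves 24, so 24 * 500 = 12000 training files are expected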
        # ## Step 2: Get material parameters 
        # ### Creates a folder with the material name and gets the material unit cell parameters 
        # ### and symmetry object from the get_multimaterial_detail function
        
        # In[3]:
         
        material_= input_params["material_"]
        n = input_params["hkl_max_identify"]
        maximum_angle_to_search = input_params["maximum_angle_to_search"]
        step_for_binning = input_params["step_for_binning"]
        nb_grains_per_lp = input_params["nb_grains_per_lp"]
        grains_nb_simulate = input_params["grains_nb_simulate"]
        detectorparameters = input_params["detectorparameters"]
        pixelsize = input_params["pixelsize"]
        emax = input_params["emax"]
        emin = input_params["emin"]
        symm_ = input_params["symmetry"]
        SG = input_params["SG"]
        
        if len(material_) > 1:
            prefix_mat = material_[0]
            for ino, imat in enumerate(material_):
                if ino == 0:
                    continue
                prefix_mat = prefix_mat + "_" + imat
        else:
            prefix_mat = material_[0]
        
        save_directory = os.getcwd()+"//"+prefix_mat+input_params["prefix"]
    
        print("save directory is : "+save_directory)
        if not os.path.exists(save_directory):
            os.makedirs(save_directory)
        
        ## get unit cell parameters and other details required for simulating Laue patterns
        rules, symmetry, lattice_material, \
            crystal, SG = get_multimaterial_detail(material_, SG, symm_)
        
        
        # ## Step 3: Generate Neural network output classes (Laue spot hkls) using the generate_classHKL function
        
        # In[4]:
        if generate_data:
            ### generate_classHKL_multimat
            ## procedure for generation of GROUND TRUTH classes
            # general_diff_cond = True will eliminate hkl indices that do not satisfy the general reflection conditions
            for ino in trange(len(material_)):
                generate_classHKL(n[ino], rules[ino], lattice_material[ino], \
                                  symmetry[ino], material_[ino], \
                                  crystal=crystal[ino], SG=SG[ino], general_diff_cond=False,
                                  save_directory=save_directory, write_to_console=print, \
                                  ang_maxx = maximum_angle_to_search, \
                                  step = step_for_binning)
            
            # ## Step 4: Generate Training and Testing dataset only for the output classes (Laue spot hkls) calculated in the Step 3
            # ### Uses multiprocessing library
            
            # In[5]:
            ############ GENERATING MULTI MATERIAL TRAINING DATA ##############
            # data_realism = True will introduce noise and partial Laue patterns in the training dataset
            # modelp can be either "random" for random orientation generation or "uniform" for uniform orientation generation
            # include_scm (if True, the misorientation_angle parameter needs to be defined): introduces misoriented crystals
            # of a specific angle along a crystal axis in the training dataset    
            generate_multimat_dataset(material_=material_, 
                                     ang_maxx=maximum_angle_to_search,
                                     step=step_for_binning, 
                                     nb_grains=nb_grains_per_lp, 
                                     grains_nb_simulate=grains_nb_simulate, 
                                     data_realism = True, 
                                     detectorparameters=detectorparameters, 
                                     pixelsize=pixelsize, 
                                     type_="training_data",
                                     var0 = 1, 
                                     dim1=input_params["dim1"], 
                                     dim2=input_params["dim2"], 
                                     removeharmonics=1, 
                                     save_directory=save_directory,
                                     write_to_console=print, 
                                     emin=emin, 
                                     emax=emax, 
                                     modelp = "random",
                                     general_diff_rules = False, 
                                     crystal = crystal,)
            
            ############ GENERATING TESTING DATA ##############
            factor = 5 # validation split for the training dataset  --> corresponds to 20% of total training dataset
            generate_multimat_dataset(material_=material_, 
                                     ang_maxx=maximum_angle_to_search,
                                     step=step_for_binning, 
                                     nb_grains=nb_grains_per_lp, 
                                     grains_nb_simulate=grains_nb_simulate//factor, 
                                     data_realism = True, 
                                     detectorparameters=detectorparameters, 
                                     pixelsize=pixelsize, 
                                     type_="testing_data",
                                     var0 = 1, 
                                     dim1=input_params["dim1"], 
                                     dim2=input_params["dim2"], 
                                     removeharmonics=1, 
                                     save_directory=save_directory,
                                     write_to_console=print, 
                                     emin=emin, 
                                     emax=emax, 
                                     modelp = "random",
                                     general_diff_rules = False, 
                                     crystal = crystal,)
            
            #%%# Updating the ClassHKL list by removing the non-common or less frequent HKLs from the list
            ## The non-common HKLs can occur as a result of the detector position and energy used
            # freq_rmv: remove an output hkl if the training dataset has fewer than 100 occurrences of it (one value per material)
            # Weights (penalty during training) are also calculated based on the occurrence
            
            freq_rmv = input_params["freq_rmv_classhkl"]
            elements = input_params["keep_length_classhkl"]
            list_hkl_keep = input_params["list_hkl_keep"]
            
            rmv_freq_class_MM(freq_rmv = freq_rmv, elements = elements,
                              save_directory = save_directory, material_ = material_,
                              write_to_console = print, progress=None, qapp=None,
                              list_hkl_keep = list_hkl_keep)
            
            
            ## End of data generation for Neural network training: all files are saved in the same folder 
            ## to be later used for training and prediction
        
            # ## Load the necessary files generated during data generation
            # ### Loading the output classes and ground truth
    
        # In[3]:
        if train_model:
            
            npz_file = np.load(save_directory+"//MOD_grain_classhkl_angbin.npz")
            classhkl = npz_file["arr_0"]
            angbins = npz_file["arr_1"]
            loc_new = npz_file["arr_2"]
            with open(save_directory+"//class_weights.pickle", "rb") as input_file:
                class_weights = cPickle.load(input_file)
            class_weights = class_weights[0]
            n_bins = len(angbins)-1
            n_outputs = len(classhkl)
            print(n_bins, n_outputs)
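            # n_bins is the length of the input feature vector: with maximum_angle_to_search
            # = 120 deg and step_for_binning = 0.1 deg, this is roughly 120/0.1 = 1200 bins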
            
            # ## Training  
                
            # In[5]:
        
            epochs = input_params["epochs"]
            batch_size = input_params["batch_size"] 
            
            # model save directory and filename
            if len(material_) > 1:
                prefix_mat = material_[0]
                for ino, imat in enumerate(material_):
                    if ino == 0:
                        continue
                    prefix_mat = prefix_mat + "_" + imat
            else:
                prefix_mat = material_[0]
                
            model_name = save_directory+"//model_"+prefix_mat
            
            # Define the model and train it
            # neurons_multiplier is a list with the number of neurons per layer: the first 
            # value is the input shape, the last value is the output shape, and the values 
            # in between are the number of neurons of the hidden layers
            
            # model = model_arch_general_optimized(  n_bins, n_outputs,
            #                                         kernel_coeff = 1e-5,
            #                                         bias_coeff = 1e-6,
            #                                         lr = 1e-3,
            #                                         )
            # model = model_arch_general_onelayer(  n_bins, n_outputs,
            #                                         kernel_coeff = 1e-5,
            #                                         bias_coeff = 1e-6,
            #                                         lr = 1e-3,
            #                                         )
            
            model = model_arch_CNN_DNN_optimized(
                                                    (n_bins, 1), 
                                                    layer_activation="relu", 
                                                    output_activation="softmax",
                                                    dropout=0.2,
                                                    stride = [5,2],
                                                    kernel_size = [10,3],
                                                    pool_size=[2,1],
                                                    CNN_layers = 2,
                                                    CNN_filters = [128,128],
                                                    DNN_layers = 0,
                                                    DNN_filters = [1000,500],
                                                    output_neurons = n_outputs,
                                                    learning_rate = 0.001,
                                                    output="DNN"
                                                    )
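            # optional sanity check of the architecture and parameter count:
            # model.summary()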
            # Save model config and weights
            model_json = model.to_json()
            with open(model_name+".json", "w") as json_file:
                json_file.write(model_json)  
        
            ## temp function to quantify the spots and classes present in a batch
            trainy_inbatch = array_generator_verify(save_directory+"//training_data", batch_size, 
                                                    len(classhkl), loc_new, print)
            print("Number of spots in a batch of %i files : %i" %(batch_size, len(trainy_inbatch)))
            print("Min, Max class ID is %i, %i" %(np.min(trainy_inbatch), np.max(trainy_inbatch)))
            
            ## Batch loading of numpy grain files (keep batch_size low to avoid overloading the RAM)
            nb_grains_list = []
            for ino, imat in enumerate(material_):
                nb_grains_list.append(list(range(nb_grains_per_lp[ino]+1)))
            list_permute = list(itertools.product(*nb_grains_list))
            list_permute.pop(0)
            steps_per_epoch = len(list_permute)*(grains_nb_simulate)//batch_size        
            val_steps_per_epoch = int(steps_per_epoch / 5)
            if steps_per_epoch == 0:
                steps_per_epoch = 1
            if val_steps_per_epoch == 0:
                val_steps_per_epoch = 1 
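            # e.g. with the settings above: 24 grain-count combinations * 500 orientations
            # // batch_size 50 = 240 steps per epoch and 240 // 5 = 48 validation steps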
                
            ## Load generator objects from filepaths (iterators for Training and Testing datasets)
            training_data_generator = array_generator(save_directory+"//training_data", batch_size,                                          
                                                      len(classhkl), loc_new, print)
            testing_data_generator = array_generator(save_directory+"//testing_data", batch_size,                                           
                                                     len(classhkl), loc_new, print)
            
            
            ######### TRAIN THE DATA
            es = EarlyStopping(monitor='val_accuracy', mode='max', patience=2)
            ms = ModelCheckpoint(save_directory+"//best_val_acc_model.h5", monitor='val_accuracy', 
                                  mode='max', save_best_only=True)
            lc = LoggingCallback(None, None, None, model, model_name)
            ## Fitting function
            stats_model = model.fit(
                                    training_data_generator, 
                                    epochs=epochs, 
                                    steps_per_epoch=steps_per_epoch,
                                    validation_data=testing_data_generator,
                                    validation_steps=val_steps_per_epoch,
                                    verbose=1,
                                    class_weight=class_weights,
                                    callbacks=[es, ms, lc]
                                    )          
            # serialize weights to HDF5
            model.save_weights(model_name+".h5")
            print("Saved model to disk")
            
            print( "Training Accuracy: "+str( stats_model.history['accuracy'][-1]))
            print( "Training Loss: "+str( stats_model.history['loss'][-1]))
            print( "Validation Accuracy: "+str( stats_model.history['val_accuracy'][-1]))
            print( "Validation Loss: "+str( stats_model.history['val_loss'][-1]))
            
            # Plot the accuracy/loss v Epochs
            epochs = range(1, len(model.history.history['loss']) + 1)
            fig, ax = plt.subplots(1,2)
            ax[0].plot(epochs, model.history.history['loss'], 'r', label='Training loss')
            ax[0].plot(epochs, model.history.history['val_loss'], 'r', ls="dashed", label='Validation loss')
            ax[0].legend()
            ax[1].plot(epochs, model.history.history['accuracy'], 'g', label='Training Accuracy')
            ax[1].plot(epochs, model.history.history['val_accuracy'], 'g', ls="dashed", label='Validation Accuracy')
            ax[1].legend()
            plt.savefig(save_directory+"//loss_accuracy_"+prefix_mat+".png", bbox_inches='tight',format='png', dpi=1000)
            plt.close()
            
            text_file = open(save_directory+"//loss_accuracy_logger_"+prefix_mat+".txt", "w")
            text_file.write("# EPOCH, LOSS, VAL_LOSS, ACCURACY, VAL_ACCURACY" + "\n")
            for inj in range(len(epochs)):
                string1 = str(epochs[inj]) + ","+ str(model.history.history['loss'][inj])+\
                                ","+str(model.history.history['val_loss'][inj])+","+str(model.history.history['accuracy'][inj])+\
                                ","+str(model.history.history['val_accuracy'][inj])+" \n"  
                text_file.write(string1)
            text_file.close() 
            
            
            # ## Stats on the trained model with sklearn metrics
            # In[6]:
            from sklearn.metrics import classification_report
            ## verify the statistics
            x_test, y_test = vali_array(save_directory+"//testing_data", 50, len(classhkl), loc_new, print)
            y_test = np.argmax(y_test, axis=-1)
            y_pred = np.argmax(model.predict(x_test), axis=-1)
            print(classification_report(y_test, y_pred))
    
    
    
    
    
    
  2. Generation of simulated Laue images of a multi-phase material:

    #!/usr/bin/env python
    # coding: utf-8
    
    # # Notebook script for generating simulated Laue patterns, used to verify the prediction of Laue spot hkl (step 3) with the trained model from step 1
    # 
    # ### Define the material of interest for which the simulated data is generated (angular coordinate data is generated based on the defined detector geometry)
    # ### Simulate Laue patterns of the required complexity
    
    # In[1]:
    if __name__ == '__main__':     # guard required because multiprocessing spawns fresh interpreter processes
    
    
        ## Import modules used for this Notebook
        import os
        import numpy as np
        import random
        from random import random as rand1
        from math import acos
        from tqdm import trange
        
        ## if LaueToolsNN is properly installed
        try:
            from lauetoolsnn.utils_lauenn import get_multimaterial_detail, Euler2OrientationMatrix
            from lauetoolsnn.lauetools import dict_LaueTools as dictLT
            from lauetoolsnn.lauetools import IOLaueTools as IOLT
            from lauetoolsnn.lauetools import lauecore as LT
            from lauetoolsnn.lauetools import CrystalParameters as CP
            from lauetoolsnn.lauetools import generaltools as GT
        except ImportError:
            # else import from a path where LaueToolsNN files are
            import sys
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn")
            from utils_lauenn import get_multimaterial_detail, Euler2OrientationMatrix
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools")
            import dict_LaueTools as dictLT
            import IOLaueTools as IOLT
            import lauecore as LT
            import CrystalParameters as CP
            import generaltools as GT
        
        
        # ## step 1: define material and other parameters
        
        # In[2]:
        # =============================================================================
        ## User Input dictionary with parameters
        ## In the case of a single phase/material, provide lists with a single entry
        # =============================================================================
        input_params = {
                        # =============================================================================
                        #       GENERATION OF DATASET              
                        # =============================================================================
                        "material_": ["Zr_alpha",
                                      "ZrO2_mono",
                                      "ZrO2_tet",
                                      "ZrO2_cub",
                                      "Zr_Nb_Fe",
                                      ],             ## same key as used in dict_LaueTools
                        "prefix" : "",                 ## prefix for the folder to be created for training dataset
                        "symmetry": ["hexagonal",
                                     "monoclinic",
                                     "tetragonal",
                                     "cubic",
                                     "hexagonal",
                                     ],           ## crystal symmetry of material_
                        "SG": [194,
                               14,
                               137,
                               225,
                               194,
                               ],                     ## Space group of material_ (None if not known)
                        # =============================================================================
                        #        Detector parameters (roughly) of the Experimental setup
                        # =============================================================================
                        ## Sample-detector distance, X center, Y center, two detector angles
                        "detectorparameters" :  [79.553,979.32,932.31,0.37,0.447], 
                        "pixelsize" : 0.0734,          ## Detector pixel size
                        "dim1":2018,                   ## Dimensions of detector in pixels
                        "dim2":2016,
                        "emin" : 5,                    ## Minimum and maximum energy to use for simulating Laue Patterns
                        "emax" : 22,
                        # =============================================================================
                        #       simulation parameters             
                        # =============================================================================
                        "experimental_directory": "",
                        "experimental_prefix": "",
                        "use_simulated_dataset": True,  ## use the simulated dataset (generated at step 3a) in case there is no experimental data to verify the trained model
                        "grid_size_x" : 25,            ## Grid X and Y limit to generate the simulated dataset (a rectangular scan region)
                        "grid_size_y" : 25,  
                        "grains_max" :[1,
                                       1,
                                       1,
                                       1,
                                       1,
                                       ], ## maximum grains to simulate per phase (a random count between 0 and grains_max is drawn; draws with zero grains in all phases are repeated)
                        }
        
        # ## Step 2: Get material parameters 
        
        # In[3]:
        material_= input_params["material_"]
        detectorparameters = input_params["detectorparameters"]
        pixelsize = input_params["pixelsize"]
        emax = input_params["emax"]
        emin = input_params["emin"]
        dim1 = input_params["dim1"]
        dim2 = input_params["dim2"]
        symm_ = input_params["symmetry"]
        SG = input_params["SG"]
        grains_sim = input_params["grains_max"]
        grid = input_params["grid_size_x"]*input_params["grid_size_y"] 
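        ## total number of simulated Laue patterns; here 25 * 25 = 625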
        
        if len(material_) > 1:
            prefix_mat = material_[0]
            for ino, imat in enumerate(material_):
                if ino == 0:
                    continue
                prefix_mat = prefix_mat + "_" + imat
        else:
            prefix_mat = material_[0]
        
        save_directory = os.getcwd()+"//"+prefix_mat+input_params["prefix"]
    
        save_directory_sim_data = save_directory + "//simulated_dataset"
        
        print("save directory is : "+save_directory)
        print("Simulated data save directory is : "+save_directory_sim_data)
        if not os.path.exists(save_directory):
            os.makedirs(save_directory)
        if not os.path.exists(save_directory_sim_data):
            os.makedirs(save_directory_sim_data)
        
        ## get unit cell parameters and other details required for simulating Laue patterns
        rules, symmetry, lattice_material, \
            crystal, SG = get_multimaterial_detail(material_, SG, symm_)
    
        
        # ## Step 3: Generate Laue patterns 
        
        # In[4]:
        
        text_file = open(save_directory_sim_data+"//filecreation_stats_"+prefix_mat+"_v2.txt", "w")
        
        detector_label = "sCMOS" ## by default (no need to modify; used later to get detector bounds)
        for ii in trange(grid):
            
            #time.sleep(1) ## 1-second pause to replicate the experiment time, in case 
                           ## we want to do live prediction while the data is being written
            
            ## draw a random grain count per phase; redraw until at least one phase
            ## contributes a grain (otherwise the Laue pattern would be empty)
            grains_to_sim = [random.randint(0,igrain) for igrain in grains_sim]
            while True:
                if np.all(np.array(grains_to_sim) == 0):
                    grains_to_sim = [random.randint(0,igrain) for igrain in grains_sim]
                else:
                    break
                    
            l_tth, l_chi, l_miller_ind, l_posx, l_posy, l_E, l_intensity = [],[],[],[],[],[],[]
            detectordiameter = pixelsize * dim1
            prefix_cor_header = []
            nbgrains = []
            g = []
            for no, i in enumerate(grains_to_sim):
                if i != 0:
                    nbgrains.append(i)
                    for igr in range(i):
                        prefix_cor_header.append(material_[no])
                        # random Euler angles: acos(2u-1) draws the middle (polar) angle with
                        # the sin weighting needed for uniformly distributed orientations
                        phi1 = rand1() * 360.
                        phi = 180. * acos(2 * rand1() - 1) / np.pi
                        phi2 = rand1() * 360.
                        UBmatrix = Euler2OrientationMatrix((phi1, phi, phi2))
                        g.append(UBmatrix)
                        
                        grain = CP.Prepare_Grain(material_[no], UBmatrix)
                        s_tth, s_chi, s_miller_ind, \
                            s_posx, s_posy, s_E= LT.SimulateLaue_full_np(grain, emin, emax,
                                                                        detectorparameters,
                                                                        pixelsize=pixelsize,
                                                                        dim=(dim1, dim2),
                                                                        detectordiameter=detectordiameter,
                                                                        removeharmonics=1)
                        ## tag each spot's hkl with its material index (extra last column)
                        s_miller_ind = np.c_[s_miller_ind, np.ones(len(s_miller_ind))*no]
                        s_intensity = 1./s_E
                        l_tth.append(s_tth)
                        l_chi.append(s_chi)
                        l_miller_ind.append(s_miller_ind)
                        l_posx.append(s_posx)
                        l_posy.append(s_posy)
                        l_E.append(s_E)
                        l_intensity.append(s_intensity)
            # flatten the per-grain lists into single per-image arrays
            s_tth = np.array([item for sublist in l_tth for item in sublist])
            s_chi = np.array([item for sublist in l_chi for item in sublist])
            s_miller_ind = np.array([item for sublist in l_miller_ind for item in sublist])
            s_posx = np.array([item for sublist in l_posx for item in sublist])
            s_posy = np.array([item for sublist in l_posy for item in sublist])
            s_E = np.array([item for sublist in l_E for item in sublist])
            s_intensity=np.array([item for sublist in l_intensity for item in sublist])
            
            #sortintensity
            indsort = np.argsort(s_intensity)[::-1]
            s_tth=np.take(s_tth, indsort)
            s_chi=np.take(s_chi, indsort)
            s_miller_ind=np.take(s_miller_ind, indsort, axis=0)
            s_posx=np.take(s_posx, indsort)
            s_posy=np.take(s_posy, indsort)
            s_E=np.take(s_E, indsort)
            s_intensity=np.take(s_intensity, indsort)
            
            # considering all spots
            allspots_the_chi = np.transpose(np.array([s_tth/2., s_chi]))
            tabledistancerandom = np.transpose(GT.calculdist_from_thetachi(allspots_the_chi, allspots_the_chi))
            # ground truth
            hkl_sol = s_miller_ind
            
            framedim = dictLT.dict_CCD[detector_label][0]
            dict_dp={}
            dict_dp['kf_direction']='Z>0'
            dict_dp['detectorparameters']=detectorparameters
            dict_dp['detectordistance']=detectorparameters[0]
            dict_dp['detectordiameter']=pixelsize*dim1
            dict_dp['pixelsize']=pixelsize
            dict_dp['dim']=framedim
            dict_detector = detectorparameters
            CCDcalib = {"CCDLabel":"cor",
                        "dd":dict_detector[0], 
                        "xcen":dict_detector[1], 
                        "ycen":dict_detector[2], 
                        "xbet":dict_detector[3], 
                        "xgam":dict_detector[4],
                        "pixelsize": pixelsize}
    
            IOLT.writefile_cor(save_directory_sim_data+"//"+prefix_mat+"_"+str(ii), s_tth, s_chi, s_posx, s_posy, s_intensity,
                               param=CCDcalib, sortedexit=0)    
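            ## writefile_cor stores, per spot: 2theta, chi, pixel X, pixel Y and intensity,
            ## together with the detector calibration passed via CCDcalib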
            
            text_file.write("####### File : "+save_directory_sim_data+"//"+prefix_mat+"_"+str(ii) + ".cor generated \n")
            for ino, rm in enumerate(g):
                if np.all(rm == 0):
                    continue
                text_file.write("# Phase "+prefix_cor_header[ino]+":  "+str(nbgrains[ino]) + " grains \n")
                temp_ = rm.flatten()
                string1 = "[["+str(temp_[0])+","+str(temp_[1])+","+str(temp_[2])+"]," + \
                          "["+str(temp_[3])+","+str(temp_[4])+","+str(temp_[5])+"]," + \
                          "["+str(temp_[6])+","+str(temp_[7])+","+str(temp_[8])+"]]" + " \n"
                text_file.write(string1)
            text_file.write("# ********** \n \n")
        text_file.close()
        
        
        # ## Write a calibration file (calib.det) for the simulated dataset, to be used with the GUI
        
        # In[6]:
        
        ## make calib text file
        calib_file = save_directory_sim_data+"//calib.det"
        text_file = open(calib_file, "w")
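        ## the values below mirror input_params: detector parameters, pixelsize, dim1 and dim2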
        
        text_file.write("79.553, 979.32, 932.31, 0.37, 0.447, 0.07340000, 2018, 2016 \n")
        text_file.write("Sample-Detector distance(IM), xO, yO, angle1, angle2, pixelsize, dim1, dim2 \n")
        text_file.write("Calibration done with Ge at Wed Sep 22 14:31:38 2021 with LaueToolsGUI.py \n")
        text_file.write("Experimental Data file: G:\\bm32\\SH1\\Ge_0001_LT_2.dat \n")
        text_file.write("Orientation Matrix: \n")
        text_file.write("[[0.0969250,0.6153840,-0.7822455],[-0.9616391,0.2605486,0.0858177],[0.2566238,0.7439200,0.6170310]] \n")
        text_file.write("# Material : Ge \n")
        text_file.write("# dd : 79.553 \n")
        text_file.write("# xcen : 979.32 \n")
        text_file.write("# ycen : 932.31 \n")
        text_file.write("# xbet : 0.37 \n")
        text_file.write("# xgam : 0.447 \n")
        text_file.write("# pixelsize : 0.0734 \n")
        text_file.write("# xpixelsize : 0.0734 \n")
        text_file.write("# ypixelsize : 0.0734 \n")
        text_file.write("# CCDLabel : cor \n")
        text_file.write("# framedim : (2018, 2016) \n")
        text_file.write("# detectordiameter : 162.93332000000004 \n")
        text_file.write("# kf_direction : Z>0")
        text_file.close()
        
     
    # In[ ]:
    
    
    
    
    
  3. Verification of the neural network prediction and indexation of multi-phase Laue images:

    #!/usr/bin/env python
    # coding: utf-8
    
    # # Notebook script for prediction of Laue spot hkl using the trained model from step 1 (supports n-phase materials)
    # # This notebook also includes the complete indexation process starting from the predicted spot hkl
    # 
    # ## The different steps, from loading the model to predicting the hkl of spots, are outlined in this notebook (the LaueToolsNN GUI does the same thing)
    # 
    # ### Define the material of interest and the path to the experimental data; the path to the trained model is derived automatically by default
    # ### Load the trained model 
    # ### Predict the hkl of the Laue spots 
    # ### Construct orientation matrices from the predicted hkl (i.e. index the Laue patterns)
    
    
    # =============================================================================
    # As of 12/04/2022, the Multi_mat scripts only support the "slow" mode of UB matrix computation
    # =============================================================================
    
    # In[1]:
    if __name__ == "__main__":    
    
        ## Import modules used for this Notebook
        import numpy as np
        import os
        import multiprocessing
        from multiprocessing import cpu_count
        import time, datetime
        import glob, re
        import configparser
        from itertools import accumulate
        ## if LaueToolsNN is properly installed
        try:
            from lauetoolsnn.utils_lauenn import get_multimaterial_detail, new_MP_multimat_function, resource_path, global_plots_MM
            from lauetoolsnn.lauetools import dict_LaueTools as dictLT
            from lauetoolsnn.NNmodels import read_hdf5
        except ImportError:
            # else import from a path where LaueToolsNN files are
            import sys
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn")
            from utils_lauenn import get_multimaterial_detail, new_MP_multimat_function, resource_path, global_plots_MM
            from NNmodels import read_hdf5        
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools")
            import dict_LaueTools as dictLT
        
        import _pickle as cPickle
        from tqdm import tqdm
        
        ncpu = cpu_count()
        print("Number of CPUs available : ", ncpu)
        
        # ## step 1: define material and path to data and trained model
        # =============================================================================
        ## User Input dictionary with parameters
        ## In the case of a single phase/material, provide lists with a single entry
        # =============================================================================
        input_params = {
                        # =============================================================================
                        #       GENERATION OF DATASET              
                        # =============================================================================
                        "material_": ["Zr_alpha",
                                      "ZrO2_mono",
                                      ],             ## same key as used in dict_LaueTools
                        "prefix" : "",                 ## prefix for the folder to be created for training dataset
                        "symmetry": ["hexagonal",
                                     "monoclinic",
                                     ],           ## crystal symmetry of material_
                        "SG": [194,
                               14,
                               ],                     ## Space group of material_ (None if not known)
                        # =============================================================================
                        #        Detector parameters (roughly) of the Experimental setup
                        # =============================================================================
                        ## Sample-detector distance, X center, Y center, two detector angles
                        "detectorparameters" :  [79.553,979.32,932.31,0.37,0.447], 
                        "pixelsize" : 0.0734,          ## Detector pixel size
                        "dim1":2018,                   ## Dimensions of detector in pixels
                        "dim2":2016,
                        "emin" : 5,                    ## Minimum and maximum energy to use for simulating Laue Patterns
                        "emax" : 22,
                        # =============================================================================
                        #       Prediction parameters             
                        # =============================================================================
                        "experimental_directory": r"C:\Users\purushot\Desktop\Guillou_Laue\scan_0012",
                        "experimental_prefix": "ech1_map2D_3_",
                        "grid_size_x" : 1,            ## X and Y extent of the scan grid (number of images = grid_size_x * grid_size_y)
                        "grid_size_y" : 5,  
                        "UB_tolerance": [0.6,
                                         0.6,
                                         ],
                        "tolerance_strain": [
                                            [0.6,0.55,0.5,0.45,0.4,0.35,0.3,0.25,0.2,0.15],
                                            [0.6,0.55,0.5,0.45,0.4,0.35,0.3,0.25,0.2,0.15],
                                            ],
                        "strain_free_parameters": ["b","c","alpha","beta","gamma"],
                        "material_ub_limit": [10,
                                              10,
                                              ],
                        
                        "UB_matrix_detect": 12,
                        "material_phase_always_present": [1,1,1,1,1,1,1,1,1,2,2,2],
                        ## in case one phase is always present in a Laue pattern (useful for substrate cases)
                        ## or for pretty plots; one entry per UB-matrix slot (here the first nine slots are
                        ## assigned to phase 1 and the last three to phase 2)
                        }
        
        #%% ## Step 2: Get material parameters 
        # ### Get model and data paths from the input
        # ### User input parameters for various algorithms to compute the orientation matrix
    
        material_= input_params["material_"]
        detectorparameters = input_params["detectorparameters"]
        pixelsize = input_params["pixelsize"]
        emax = input_params["emax"]
        emin = input_params["emin"]
        dim1 = input_params["dim1"]
        dim2 = input_params["dim2"]
        symm_ = input_params["symmetry"]
        SG = input_params["SG"]
        tolerance = input_params["UB_tolerance"]
        tolerance_strain = input_params["tolerance_strain"]
        strain_free_parameters = input_params["strain_free_parameters"]
        material_limit = input_params["material_ub_limit"]
        material_phase_always_present = input_params["material_phase_always_present"]
        model_annote = "from_file"
        
        if len(material_) > 1:
            prefix_mat = material_[0]
            for ino, imat in enumerate(material_):
                if ino == 0:
                    continue
                prefix_mat = prefix_mat + "_" + imat
        else:
            prefix_mat = material_[0]
        
        model_direc = os.getcwd()+"//"+prefix_mat+input_params["prefix"]
        
        if not os.path.exists(model_direc):
            print("The directory doesn't exist; please verify the path")
        else:
            print("Directory where trained model is stored : "+model_direc)
            
        ## get unit cell parameters and other details required for simulating Laue patterns
        rules, symmetry, lattice_material, \
            crystal, SG = get_multimaterial_detail(material_, SG, symm_)
    
        if input_params["experimental_directory"] == "" and input_params["experimental_prefix"] == "":
            filenameDirec =  model_direc + "//simulated_dataset"
            experimental_prefix = prefix_mat+"_"
            lim_x, lim_y = input_params["grid_size_x"], input_params["grid_size_y"]
            format_file = "cor"
        else:
            filenameDirec = input_params["experimental_directory"]
            experimental_prefix = input_params["experimental_prefix"]
            lim_x, lim_y = input_params["grid_size_x"], input_params["grid_size_y"] 
            format_file = dictLT.dict_CCD["sCMOS"][7]
        
        hkl_all_class0 = []
        for ino, imat in enumerate(material_):
            with open(model_direc+"//classhkl_data_nonpickled_"+imat+".pickle", "rb") as input_file:
                hkl_all_class_load = cPickle.load(input_file)[0]
            hkl_all_class0.append(hkl_all_class_load)
            
        ## Experimental peak search parameters in case of RAW LAUE PATTERNS from detector
        intensity_threshold = 150
        boxsize = 10
        fit_peaks_gaussian = 1
        FitPixelDev = 18
        NumberMaxofFits = 2000 ### Max peaks per LP
        bkg_treatment = "A-B"
    
        ## Requirements
        ubmat = input_params["UB_matrix_detect"] # How many orientation matrix to detect per Laue pattern
        mode_spotCycle = "graphmode" ## mode of calculation
        use_previous_UBmatrix_name = False ## Try previous indexation solutions to speed up the process
        strain_calculation = True ## Strain refinement is required or not
        ccd_label_global = "sCMOS"
    
        ## Parameters to control the orientation matrix indexation
        softmax_threshold_global = 0.80 # softmax_threshold of the Neural network to consider
        mr_threshold_global = 0.70 # match rate threshold to accept a solution immediately
        cap_matchrate = 0.20 * 100 ## any UB matrix providing MR less than this will be ignored
        coeff = 0.10            ## coefficient to calculate the overlap of two solutions
        coeff_overlap = 0.10    ## 10% spot overlap is allowed with an already indexed orientation
        
        ## Additional parameters to refine the orientation matrix construction process
        use_om_user = "false"
        nb_spots_consider = 500
        residues_threshold=0.5
        nb_spots_global_threshold=8
        option_global = "v2"
        additional_expression = ["none"] # for strain assumptions, like a==b for HCP
        
        config_setting = configparser.ConfigParser()
        filepath = resource_path('settings.ini')
        print("Writing settings file in " + filepath)
        config_setting.read(filepath)
        config_setting.set('CALLER', 'residues_threshold',str(residues_threshold))
        config_setting.set('CALLER', 'nb_spots_global_threshold',str(nb_spots_global_threshold))
        config_setting.set('CALLER', 'option_global',option_global)
        config_setting.set('CALLER', 'use_om_user',use_om_user)
        config_setting.set('CALLER', 'nb_spots_consider',str(nb_spots_consider))
        config_setting.set('CALLER', 'path_user_OM',"none")
        config_setting.set('CALLER', 'intensity', str(intensity_threshold))
        config_setting.set('CALLER', 'boxsize', str(boxsize))
        config_setting.set('CALLER', 'pixdev', str(FitPixelDev))
        config_setting.set('CALLER', 'cap_softmax', str(softmax_threshold_global))
        config_setting.set('CALLER', 'cap_mr', str(cap_matchrate/100.))
        config_setting.set('CALLER', 'strain_free_parameters', ",".join(strain_free_parameters))
        config_setting.set('CALLER', 'additional_expression', ",".join(additional_expression))
        with open(filepath, 'w') as configfile:
            config_setting.write(configfile)
        
        ## load model related files and generate the model
        classhkl = np.load(model_direc+"//MOD_grain_classhkl_angbin.npz")["arr_0"]
        angbins = np.load(model_direc+"//MOD_grain_classhkl_angbin.npz")["arr_1"]
        ind_mat_all = np.load(model_direc+"//MOD_grain_classhkl_angbin.npz",allow_pickle=True)["arr_5"]
        ind_mat = [len(inni) for inni in ind_mat_all]
        ind_mat = list(accumulate(ind_mat))  ## cumulative boundaries of the output classes per material
        
        # json_file = open(model_direc+"//model_"+prefix_mat+".json", 'r')
        load_weights = model_direc + "//model_"+prefix_mat+".h5"
        wb = read_hdf5(load_weights)
        temp_key = list(wb.keys())
    
        ct = time.time()
        now = datetime.datetime.fromtimestamp(ct)
        c_time = now.strftime("%Y-%m-%d_%H-%M-%S")   
        
        #%% ## Step 3: Initialize variables and prepare arguments for multiprocessing module
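        ## one container per UB-matrix slot; each holds one entry per scan pixel (lim_x*lim_y points)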
        
        col = [[] for i in range(int(ubmat))]
        colx = [[] for i in range(int(ubmat))]
        coly = [[] for i in range(int(ubmat))]
        rotation_matrix = [[] for i in range(int(ubmat))]
        strain_matrix = [[] for i in range(int(ubmat))]
        strain_matrixs = [[] for i in range(int(ubmat))]
        match_rate = [[] for i in range(int(ubmat))]
        spots_len = [[] for i in range(int(ubmat))]
        iR_pix = [[] for i in range(int(ubmat))]
        fR_pix = [[] for i in range(int(ubmat))]
        mat_global = [[] for i in range(int(ubmat))]
        best_match = [[] for i in range(int(ubmat))]
        spots1_global = [[] for i in range(int(ubmat))]
        for i in range(int(ubmat)):
            col[i].append(np.zeros((lim_x*lim_y,3)))
            colx[i].append(np.zeros((lim_x*lim_y,3)))
            coly[i].append(np.zeros((lim_x*lim_y,3)))
            rotation_matrix[i].append(np.zeros((lim_x*lim_y,3,3)))
            strain_matrix[i].append(np.zeros((lim_x*lim_y,3,3)))
            strain_matrixs[i].append(np.zeros((lim_x*lim_y,3,3)))
            match_rate[i].append(np.zeros((lim_x*lim_y,1)))
            spots_len[i].append(np.zeros((lim_x*lim_y,1)))
            iR_pix[i].append(np.zeros((lim_x*lim_y,1)))
            fR_pix[i].append(np.zeros((lim_x*lim_y,1)))
            mat_global[i].append(np.zeros((lim_x*lim_y,1)))
            best_match[i].append([[] for jk in range(lim_x*lim_y)])
            spots1_global[i].append([[] for jk in range(lim_x*lim_y)])
    
        # =============================================================================
        #         ## Multi-processing routine
        # =============================================================================        
        ## Number of files to generate
        grid_files = np.zeros((lim_x,lim_y))
        filenm = np.chararray((lim_x,lim_y), itemsize=1000)
        grid_files = grid_files.ravel()
        filenm = filenm.ravel()
        count_global = lim_x * lim_y
        list_of_files = glob.glob(filenameDirec+'//'+experimental_prefix+'*.'+format_file)
        ## natural sort: split filenames into digit and non-digit runs so that, e.g., file_2 sorts before file_10
        list_of_files.sort(key=lambda var:[int(x) if x.isdigit() else x for x in re.findall(r'[^0-9]|[0-9]+', var)])
        
        if len(list_of_files) == count_global:
            for ii in range(len(list_of_files)):
                grid_files[ii] = ii
                filenm[ii] = list_of_files[ii]     
            print("expected "+str(count_global)+" files based on the XY grid ("+str(lim_x)+","+str(lim_y)+") defined by user")
            print("and found "+str(len(list_of_files))+" files")
        else:
            print("expected "+str(count_global)+" files based on the XY grid ("+str(lim_x)+","+str(lim_y)+") defined by user")
            print("But found "+str(len(list_of_files))+" files (either all data is not written yet or the XY grid definition is incorrect)")
            digits = len(str(count_global))
            digits = max(digits,4)
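            # NOTE: `digits` is computed but not used below; the zero-padding is
            # hardcoded to 4 digits (5 from index 10000 onwards)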
            # Temp fix
            for ii in range(count_global):
                text = str(ii)
                if ii < 10000:
                    string = text.zfill(4)
                else:
                    string = text.zfill(5)
                file_name_temp = filenameDirec+'//'+experimental_prefix + string+'.'+format_file
                ## store it in a grid 
                filenm[ii] = file_name_temp
        
        check = np.zeros((count_global,int(ubmat)))
        # =============================================================================
        blacklist = None
        
        ### Create a COR directory to be loaded in LaueTools
        cor_file_directory = filenameDirec + "//" + experimental_prefix+"CORfiles"
        if list_of_files[0].split(".")[-1] in ['cor',"COR","Cor"]:
            cor_file_directory = filenameDirec 
        if not os.path.exists(cor_file_directory):
            os.makedirs(cor_file_directory)
        
        try_prevs = False
        files_treated = []
        
        valu12 = [[ filenm[ii].decode(), 
                    ii,
                    rotation_matrix,
                    strain_matrix,
                    strain_matrixs,
                    col,
                    colx,
                    coly,
                    match_rate,
                    spots_len, 
                    iR_pix, 
                    fR_pix,
                    best_match,
                    mat_global,
                    check,
                    detectorparameters,
                    pixelsize,
                    angbins,
                    classhkl,
                    hkl_all_class0,
                    emin,
                    emax,
                    material_,
                    symmetry,
                    lim_x,
                    lim_y,
                    strain_calculation, 
                    ind_mat, 
                    model_direc, 
                    tolerance,
                    int(ubmat), ccd_label_global, 
                    None,
                    float(intensity_threshold),
                    int(boxsize),
                    bkg_treatment,
                    filenameDirec, 
                    experimental_prefix,
                    blacklist,
                    None,
                    files_treated,
                    try_prevs, ## whether to retry previously found solutions if the indexing gets stuck (False here)
                    wb,
                    temp_key,
                    cor_file_directory,
                    mode_spotCycle,
                    softmax_threshold_global,
                    mr_threshold_global,
                    cap_matchrate,
                    tolerance_strain,
                    NumberMaxofFits,
                    fit_peaks_gaussian,
                    FitPixelDev,
                    coeff,
                    coeff_overlap,
                    material_limit,
                    use_previous_UBmatrix_name,
                    material_phase_always_present,
                    crystal,
                    strain_free_parameters,
                    model_annote] for ii in range(count_global)]
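        ## each entry of valu12 above packs one file name and its index together with all the
        ## shared parameters consumed by the worker function new_MP_multimat_function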
        
        # start_time = time.time()
        # # Launch on single file to verify
        # results = new_MP_multimat_function(valu12[0])
        # print('Took ',time.time()-start_time, "seconds")
        
        # print("matching rate")
        # temp_0 = [results[6][i][0][0] for i in range(len(results[6]))]
        # print(temp_0)
        
        # print("Orientation metrix")
        # temp_0 = [results[2][i][0][0].ravel() for i in range(len(results[6]))]
        # for itemm in temp_0:
        #     print(",".join(str(x) for x in itemm))
            
        #% Launch multiprocessing prediction     
        if 1:
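            ## zip wraps each entry of valu12 in a 1-tuple so that pool.starmap
            ## unpacks it and calls new_MP_multimat_function with a single argument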
            args = zip(valu12)
            with multiprocessing.Pool(ncpu) as pool:
                results = pool.starmap(new_MP_multimat_function, tqdm(args, total=len(valu12)))
                
                for r in results:
                    r_message_mpdata = r
                    strain_matrix_mpdata, strain_matrixs_mpdata, rotation_matrix_mpdata, col_mpdata,\
                    colx_mpdata, coly_mpdata, match_rate_mpdata, mat_global_mpdata,\
                        cnt_mpdata, meta_mpdata, files_treated_mpdata, spots_len_mpdata, \
                            iR_pixel_mpdata, fR_pixel_mpdata, best_match_mpdata, check_mpdata = r_message_mpdata
            
                    for i_mpdata in files_treated_mpdata:
                        files_treated.append(i_mpdata)
            
                    for intmat_mpdata in range(int(ubmat)):
                        check[cnt_mpdata,intmat_mpdata] = check_mpdata[cnt_mpdata,intmat_mpdata]
                        mat_global[intmat_mpdata][0][cnt_mpdata] = mat_global_mpdata[intmat_mpdata][0][cnt_mpdata]
                        strain_matrix[intmat_mpdata][0][cnt_mpdata,:,:] = strain_matrix_mpdata[intmat_mpdata][0][cnt_mpdata,:,:]
                        strain_matrixs[intmat_mpdata][0][cnt_mpdata,:,:] = strain_matrixs_mpdata[intmat_mpdata][0][cnt_mpdata,:,:]
                        rotation_matrix[intmat_mpdata][0][cnt_mpdata,:,:] = rotation_matrix_mpdata[intmat_mpdata][0][cnt_mpdata,:,:]
                        col[intmat_mpdata][0][cnt_mpdata,:] = col_mpdata[intmat_mpdata][0][cnt_mpdata,:]
                        colx[intmat_mpdata][0][cnt_mpdata,:] = colx_mpdata[intmat_mpdata][0][cnt_mpdata,:]
                        coly[intmat_mpdata][0][cnt_mpdata,:] = coly_mpdata[intmat_mpdata][0][cnt_mpdata,:]
                        match_rate[intmat_mpdata][0][cnt_mpdata] = match_rate_mpdata[intmat_mpdata][0][cnt_mpdata]
                        spots_len[intmat_mpdata][0][cnt_mpdata] = spots_len_mpdata[intmat_mpdata][0][cnt_mpdata]
                        iR_pix[intmat_mpdata][0][cnt_mpdata] = iR_pixel_mpdata[intmat_mpdata][0][cnt_mpdata]
                        fR_pix[intmat_mpdata][0][cnt_mpdata] = fR_pixel_mpdata[intmat_mpdata][0][cnt_mpdata]
                        best_match[intmat_mpdata][0][cnt_mpdata] = best_match_mpdata[intmat_mpdata][0][cnt_mpdata]
                    
            #% Save results
            save_directory_ = filenameDirec+"//results_"+prefix_mat+"_"+c_time
            if not os.path.exists(save_directory_):
                os.makedirs(save_directory_)
            
            ## intermediate saving of compressed numpy arrays with the results
            np.savez_compressed(save_directory_+ "//results.npz", 
                                best_match, mat_global, rotation_matrix, strain_matrix, 
                                strain_matrixs, col, colx, coly, match_rate, files_treated,
                                lim_x, lim_y, spots_len, iR_pix, fR_pix,
                                material_)
            ## intermediate saving of pickle objects with results
            with open(save_directory_+ "//results.pickle", "wb") as output_file:
                cPickle.dump([best_match, mat_global, rotation_matrix, strain_matrix, 
                              strain_matrixs, col, colx, coly, match_rate, files_treated,
                              lim_x, lim_y, spots_len, iR_pix, fR_pix,
                              material_, lattice_material,
                              symmetry, crystal], output_file)
            print("data saved in ", save_directory_)
    
            try:
                global_plots_MM(lim_x, lim_y, rotation_matrix, strain_matrix, strain_matrixs, 
                             col, colx, coly, match_rate, mat_global, spots_len, 
                             iR_pix, fR_pix, save_directory_, material_,
                             match_rate_threshold=5, bins=30)
            except Exception as e:
                print("Error in the global plots module: ", e)
    
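     The results written at the end of the script can be reloaded later for post-processing without rerunning the prediction. Below is a minimal sketch, assuming it is run from inside the results directory created above and that the default file names (results.npz and results.pickle) were kept; the unpacking order follows the cPickle.dump call of the script:

        import numpy as np
        import _pickle as cPickle

        ## the npz archive contains object arrays, hence allow_pickle=True
        data = np.load("results.npz", allow_pickle=True)
        print(data.files)   ## arrays are stored positionally as arr_0, arr_1, ...

        ## the pickle restores the full python objects in the order they were dumped
        with open("results.pickle", "rb") as input_file:
            (best_match, mat_global, rotation_matrix, strain_matrix,
             strain_matrixs, col, colx, coly, match_rate, files_treated,
             lim_x, lim_y, spots_len, iR_pix, fR_pix,
             material_, lattice_material, symmetry, crystal) = cPickle.load(input_file)
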
  4. Multi-material prediction is also available through the prediction window in the GUI:

    #!/usr/bin/env python
    # coding: utf-8
    
    # # Notebook script for prediction of Laue spot hkl using the trained model from step 2 (supports n-phase materials)
    # # This notebook also includes the complete indexation process starting from the predicted spot hkl
    # 
    # ## The different steps, from loading the model to predicting the hkl of spots, are outlined in this notebook (the LaueToolsNN GUI does the same thing)
    # 
    # ### Define the material of interest and the path to experimental data; by default the path to the trained model is derived automatically
    # ### Load the trained model 
    # ### Predict the hkl of Laue spots 
    # ### Construct orientation matrices from the predicted hkl (i.e. index the Laue patterns)
    
    
    # =============================================================================
    # As of 12/04/2022 the Multi_mat scripts only support the "slow" mode of UB matrix computation
    # =============================================================================
    
    # In[1]:
    if __name__ == "__main__":    
        ## Import modules used for this Notebook
        import os
        from multiprocessing import cpu_count
        import configparser
        try:
            from lauetoolsnn.utils_lauenn import get_multimaterial_detail, resource_path
            from lauetoolsnn.lauetools import dict_LaueTools as dictLT
            from lauetoolsnn.GUI_multi_mat_LaueNN import start
        except:
            import sys
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn")
            from utils_lauenn import get_multimaterial_detail, resource_path
            from GUI_multi_mat_LaueNN import start
            sys.path.append(r"C:\Users\purushot\Desktop\github_version_simple\lauetoolsnn\lauetools")
            import dict_LaueTools as dictLT
        
        ncpu = cpu_count()
        print("Number of CPUs available : ", ncpu)
        
        # ## step 1: define material and path to data and trained model
        # =============================================================================
        ## User input dictionary with parameters
        ## Each material-related key below is a list; add one entry per phase/material to detect
        # =============================================================================
        input_params = {
                        # =============================================================================
                        #       GENERATION OF DATASET              
                        # =============================================================================
                        "prefix" : "",
                        "material_": ["Al2TiO5",
                                      ],             ## same key as used in dict_LaueTools
                        "symmetry": ["orthorhombic",
                                     ],           ## crystal symmetry of material_
                        "SG": [63,
                               ],                     ## Space group of material_ (None if not known)
                        # =============================================================================
                        #        Detector parameters (roughly) of the Experimental setup
                        # =============================================================================
                        ## Sample-detector distance, X center, Y center, two detector angles
                        "detectorparameters" :  [79.50800, 977.8200, 931.9600, 0.3600000, 0.4370000], 
                        "pixelsize" : 0.0734,          ## Detector pixel size
                        "dim1":2018,                   ## Dimensions of detector in pixels
                        "dim2":2016,
                        "emin" : 5,                    ## Minimum and maximum energy to use for simulating Laue Patterns
                        "emax" : 22,
                        # =============================================================================
                        #       Prediction parameters             
                        # =============================================================================
                        "experimental_directory": r"F:\bm32_Rene_Al2O3\VF_P1\VF_P1_R4",
                        "experimental_prefix": "VF_P1_R4_",
                        "grid_size_x" : 31,           ## Grid X and Y limit to generate the simulated dataset (a rectangular scan region)
                        "grid_size_y" : 141,  
                        "UB_tolerance": [
                                         0.5,
                                         ],
                        "tolerance_strain": [
                                            [0.6,0.5,0.4,0.3,0.2],
                                            ],
                        "strain_free_parameters": ["b","c","alpha","beta","gamma"],
                        "material_ub_limit": [
                                              1,
                                              ],
                        "UB_matrix_detect": 1,
                        "material_phase_always_present": [1] ## in case if one phase is always 
                                                                    # present in a Laue pattern (useful for substrate cases)
                                                                    # or for pretty plots
                        }
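        ## Hypothetical example: to detect a second phase, extend each material-related
        ## list with one entry per phase, e.g. using the ZrO2_tet material defined in step 1:
        ##   "material_": ["Al2TiO5", "ZrO2_tet"],
        ##   "symmetry": ["orthorhombic", "tetragonal"],
        ##   "SG": [63, 137],
        ##   "UB_tolerance": [0.5, 0.5],
        ##   "tolerance_strain": [[0.6,0.5,0.4,0.3,0.2], [0.6,0.5,0.4,0.3,0.2]],
        ##   "material_ub_limit": [1, 1],
        ## and increase "UB_matrix_detect" accordingly.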
        #%% ## Step 2: Get material parameters 
        # ### Get model and data paths from the input
        # ### User input parameters for various algorithms to compute the orientation matrix
        material_= input_params["material_"]
        detectorparameters = input_params["detectorparameters"]
        pixelsize = input_params["pixelsize"]
        emax = input_params["emax"]
        emin = input_params["emin"]
        dim1 = input_params["dim1"]
        dim2 = input_params["dim2"]
        symm_ = input_params["symmetry"]
        SG = input_params["SG"]
        tolerance = input_params["UB_tolerance"]
        tolerance_strain = input_params["tolerance_strain"]
        strain_free_parameters = input_params["strain_free_parameters"]
        material_limit = input_params["material_ub_limit"]
        material_phase_always_present = input_params["material_phase_always_present"]
        # =============================================================================
        # Detector label: use "cor" for a simulated dataset or "sCMOS" for experimental data
        # =============================================================================
        ccd_label_global = "sCMOS"
        ## Experimental peak search parameters in case of RAW LAUE PATTERNS from detector
        intensity_threshold = 150
        boxsize = 10
        fit_peaks_gaussian = 1
        FitPixelDev = 15
        NumberMaxofFits = 2000 ### Max peaks per LP
        bkg_treatment = "A-B"
        
        ## Requirements
        ubmat = input_params["UB_matrix_detect"] # How many orientation matrix to detect per Laue pattern
        mode_spotCycle = "graphmode" ## mode of calculation
        use_previous_UBmatrix_name = False ## Try previous indexation solutions to speed up the process
        strain_calculation = True ## whether strain refinement is required
        
        ## Parameters to control the orientation matrix indexation
        softmax_threshold_global = 0.80 # minimum softmax probability of the neural network prediction to be considered
        mr_threshold_global = 0.90 # match rate threshold to accept a solution immediately
        cap_matchrate = 0.45 * 100 ## any UB matrix providing MR less than this will be ignored
        coeff = 0.10            ## coefficient to calculate the overlap of two solutions
        coeff_overlap = 0.10   ##10% spots overlap is allowed with already indexed orientation
    
        ## Additional parameters to refine the orientation matrix construction process
        use_om_user = "false"
        nb_spots_consider = 350
        residues_threshold=0.35
        nb_spots_global_threshold=8
        option_global = "v2"
        additional_expression = ["none"] # for strain assumptions, like a==b for HCP
        # =========================================================================
        # END OF USER INPUT    
        # =========================================================================
        ## concatenate the material names to build the model folder prefix
        prefix_mat = "_".join(material_)
        
        model_direc = os.getcwd()+"//"+prefix_mat+input_params["prefix"]
        model_weights = model_direc + "//model_"+prefix_mat+".h5"
        json_file = model_direc + "//model_"+prefix_mat+".json"
        model_annote = "CNN"
        
        if not os.path.exists(model_direc):
            print("The directory doesn't exists; please veify the path")
        else:
            print("Directory where trained model is stored : "+model_direc)
            
        ## get unit cell parameters and other details required for simulating Laue patterns
        rules, symmetry, lattice_material, \
            crystal, SG = get_multimaterial_detail(material_, SG, symm_)
    
        if input_params["experimental_directory"] == "" and input_params["experimental_prefix"] == "":
            filenameDirec =  model_direc + "//simulated_dataset"
            experimental_prefix = prefix_mat+"_"
            lim_x, lim_y = input_params["grid_size_x"], input_params["grid_size_y"]
            format_file = "cor"
        else:
            filenameDirec = input_params["experimental_directory"]
            experimental_prefix = input_params["experimental_prefix"]
            lim_x, lim_y = input_params["grid_size_x"], input_params["grid_size_y"] 
            format_file = dictLT.dict_CCD["sCMOS"][7]   ## file extension associated with the sCMOS detector in LaueTools
        
        config_setting = configparser.ConfigParser()
        filepath = resource_path('settings.ini')
        print("Writing settings file in " + filepath)
        config_setting.read(filepath)
        config_setting.set('CALLER', 'residues_threshold',str(residues_threshold))
        config_setting.set('CALLER', 'nb_spots_global_threshold',str(nb_spots_global_threshold))
        config_setting.set('CALLER', 'option_global',option_global)
        config_setting.set('CALLER', 'use_om_user',use_om_user)
        config_setting.set('CALLER', 'nb_spots_consider',str(nb_spots_consider))
        config_setting.set('CALLER', 'path_user_OM',"none")
        config_setting.set('CALLER', 'intensity', str(intensity_threshold))
        config_setting.set('CALLER', 'boxsize', str(boxsize))
        config_setting.set('CALLER', 'pixdev', str(FitPixelDev))
        config_setting.set('CALLER', 'cap_softmax', str(softmax_threshold_global))
        config_setting.set('CALLER', 'cap_mr', str(cap_matchrate/100.))
        config_setting.set('CALLER', 'strain_free_parameters', ",".join(strain_free_parameters))
        config_setting.set('CALLER', 'additional_expression', ",".join(additional_expression))
        with open(filepath, 'w') as configfile:
            config_setting.write(configfile)
        
        ## flag passed to the GUI indicating whether strain refinement is enabled
        strain_label_global = "YES" if strain_calculation else "NO"

        ## Start the GUI plots
        start(        
                model_direc,
                material_,
                emin,
                emax,
                symmetry,
                detectorparameters,
                pixelsize,
                lattice_material,
                mode_spotCycle,
                softmax_threshold_global,
                mr_threshold_global,
                cap_matchrate,
                coeff,
                coeff_overlap,
                fit_peaks_gaussian,
                FitPixelDev,
                NumberMaxofFits,
                tolerance_strain,
                material_limit,
                use_previous_UBmatrix_name,
                material_phase_always_present,
                crystal,
                strain_free_parameters,
                additional_expression,
                strain_label_global, 
                ubmat, 
                boxsize, 
                intensity_threshold,
                ccd_label_global, 
                experimental_prefix, 
                lim_x, 
                lim_y,
                tolerance, 
                filenameDirec, 
                model_weights,
                model_annote
                )
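
     Before launching the GUI, the script persists its runtime options in settings.ini. If the GUI does not behave as expected, the written values can be inspected with configparser; a minimal sketch using the same resource_path helper imported above:

        import configparser
        from lauetoolsnn.utils_lauenn import resource_path

        config = configparser.ConfigParser()
        config.read(resource_path('settings.ini'))
        ## print every option written by the script into the CALLER section
        for key, value in config['CALLER'].items():
            print(key, "=", value)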