ISSN: 2169-0022
Journal of Material Sciences & Engineering

Simulation of Two Dimensional Raceway Using Back Propagation Neural Networks

Gupta P* and Varshney RG

MSc Engineering, Graphic Era University, Dehradun, Uttarakhand, India

*Corresponding Author:
Punit Gupta
MSc Engineering
Graphic Era University
Dehradun, Uttarakhand, India
Tel: +918449385221
E-mail: punit.gupta71@gmail.com

Received Date: April 29, 2017; Accepted Date: May 08, 2017; Published Date: May 18, 2017

Citation: Gupta P, Varshney RG (2017) Simulation of Two Dimensional Raceway Using Back Propagation Neural Networks. J Material Sci Eng 6: 342. doi: 10.4172/2169-0022.1000342

Copyright: © 2017 Gupta P, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Abstract

The raceway phenomenon of the iron-making blast furnace has been studied in two dimensional cold models. The effects of tuyere diameter, blast velocity, particle diameter and particle density have been physically modeled and a semi-empirical correlation has been developed. The data collected from the experimental study was simulated, for the first time, using a three layer back propagation neural network. The effects of the network design parameters on the performance of the network have been studied and the optimum values of these parameters have been found. Sensitivity analysis has been carried out to establish the effect of tuyere diameter, blast velocity, particle density and particle diameter on the raceway diameter. The performance of the ANN has been compared with the semi-empirical correlation developed from the experimental data, and the ANN has been found to be superior.

Keywords

Raceway; Cold model; Artificial neural networks; Back propagation; Preprocessing; Regression; Sensitivity analysis

Introduction

When a high velocity blast of gas is introduced into a packed bed through a hollow tube, a cavity with moving particles at its periphery is generally formed in front of the tube nose. The cavity that forms in the iron-making blast furnace in front of the hot blast tuyeres is known as the raceway [1]. The air velocity through the tuyeres is high (around 200 m/s) and this causes the coke particles to circulate in a rotating flow field around 0.75 m deep. Due to the extreme conditions involved (temperatures in excess of 2000°C, high pressure, etc.), direct experimentation is very limited [2-7], and a number of cold model studies [8-13] have been performed in the past, with their results presented in the form of empirical correlations. Experimental studies to determine the shape of the raceway have also been performed [14,15]. Extensive mathematical models have also been developed to quantify this phenomenon [11,16,17], and the effect of liquid flow in the raceway zone has been studied and simulated [18]. The existing correlations, however, differ considerably in the dependence of the raceway diameter on fundamental quantities such as blast momentum, particle size and bed height. In the present study, experiments have been performed in two dimensional models to simulate the gas-solid interactions. The effect of the blast and bed parameters on the raceway dimension has been studied and the back propagation ANN (BPNN) technique has been used to simulate the results. The main advantage of the neural network modeling technique over statistical techniques is that no functional form of the correlation is assumed a priori. Sensitivity analysis has also been performed to establish the effects of the inputs on the raceway size. The performance of the ANN model and that of the semi-empirical correlation developed from the experimental data have been compared.

Experimental Set up

Experiments were performed in transparent two dimensional models of 15 mm depth, 700 mm height and 280 mm width, with sidewalls made of glass. A schematic of the experimental setup is shown in Figure 1. The tuyere (hollow tube), in the form of a rectangular slot, was placed horizontally 85 mm above the base of the model. The effect of the following blast and bed parameters was studied:


Figure 1: Schematic diagram of the experimental set up (1) Gas Line; (2) Thermometer; (3) Valve; (4) Pressure Gauge; (5) Rotameter; (6) Pressure Gauge; (7) Tuyere; (8) Glass model.

(1) Blast velocity (2) Tuyere diameter (3) Bed height (4) Particle size (5) Particle density.

The blast velocities varied from 15 to 75 m/s, and the solid materials used were quartz and ragi (a food grain). In the present report, data has been considered only for those bed heights where the raceway dimensions become independent of the bed height, as this is the case closest to actual operating conditions.

Back-propagation ANN Model

An artificial neural network (ANN) is a computational model inspired by the structure and functional aspects of biological neurons, such as those found in the human brain. In general, the network model consists of a weighted interconnection of input nodes, hidden nodes and output nodes, together with biases and activation functions [19]. The most widely used neural network for function approximation and regression analysis is based on the Back Propagation (BP) algorithm, whose details are described by Rumelhart and Demuth [20,21]. The design considerations are to find the number of hidden layers, the number of nodes in each layer, the interconnection weights and biases, and the activation functions that best fit the problem at hand (Figure 2).


Figure 2: Back propagation network architecture.

Mathematical Formulation

Figure 3: Epochs vs. Number of nodes in first hidden layer for different number of nodes in second hidden layer.

Consider a BP network with d nodes in the input layer, M nodes in the first hidden layer, S nodes in the second hidden layer and one node in the output layer, as shown in Figure 2. During the forward propagation phase, the processing from the input layer to the jth neuron in the first hidden layer is:

$$y_{j}^{1} = f_{1}\!\left(\sum_{i=1}^{d} w_{ji}^{1}\, x_{i} + b_{j}^{1}\right), \qquad j = 1, \ldots, M \qquad (1)$$

Similarly, the processing from the first hidden layer to the kth neuron in the second hidden layer is:

$$y_{k}^{2} = f_{2}\!\left(\sum_{j=1}^{M} w_{kj}^{2}\, y_{j}^{1} + b_{k}^{2}\right), \qquad k = 1, \ldots, S \qquad (2)$$

The final output for the pth pattern of data is:

$$\hat{o}_{p} = f_{3}\!\left(\sum_{k=1}^{S} w_{k}^{3}\, y_{k}^{2} + b^{3}\right) \qquad (3)$$

During training of the network, this output is compared with the target to give the error:

$$E_{p} = \tfrac{1}{2}\left(t_{p} - \hat{o}_{p}\right)^{2} \qquad (4)$$

for the pth data. For one complete cycle of inputs, this error is summed up to give the error function

$$E = \frac{1}{P}\sum_{p=1}^{P} E_{p} \qquad (5)$$

During the weight update phase (the back propagation phase), the objective is to update the weights so as to minimize the above error function, also known as the mean square error (MSE). The commonly used optimization techniques for this, such as Gauss-Newton, gradient descent and Levenberg-Marquardt, can be found in refs. [22,23]. All of these techniques compute the gradients ∂E/∂wij in their own optimized manner. The update rule for the momentum back propagation technique, with learning rate η and momentum parameter μ, is:

$$\Delta w_{ij}(n) = -\,\eta\,\frac{\partial E}{\partial w_{ij}} + \mu\,\Delta w_{ij}(n-1) \qquad (6)$$

Similarly, the bias terms are also updated after each training cycle until the performance criterion (number of training epochs, MSE or mean absolute error) is met.
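To make the flow of equations (1)-(6) concrete, the sketch below implements the forward pass and the momentum back propagation update for a d-M-S-1 network with tanh (tansig-like) hidden layers and a linear output node in Python. The layer sizes, learning rate and momentum value are illustrative assumptions, not the values used in the study.

```python
import numpy as np

# Sketch of equations (1)-(6): a d-M-S-1 network with tanh hidden layers,
# a linear output node, and momentum gradient-descent weight updates.
# Layer sizes, learning rate and momentum below are illustrative only.
rng = np.random.default_rng(0)
d, M, S = 4, 16, 16
W1, b1 = rng.normal(0, 0.1, (M, d)), np.zeros(M)
W2, b2 = rng.normal(0, 0.1, (S, M)), np.zeros(S)
W3, b3 = rng.normal(0, 0.1, (1, S)), np.zeros(1)
eta, mu = 0.01, 0.9                                   # learning rate, momentum
velocity = [np.zeros_like(p) for p in (W1, b1, W2, b2, W3, b3)]

def forward(x):
    """Forward propagation, equations (1)-(3)."""
    y1 = np.tanh(W1 @ x + b1)          # first hidden layer, eq. (1)
    y2 = np.tanh(W2 @ y1 + b2)         # second hidden layer, eq. (2)
    o = W3 @ y2 + b3                   # linear output node, eq. (3)
    return y1, y2, o

def train_cycle(X, t):
    """One cycle over all patterns: accumulate the gradients of the error
    function (5) and apply the momentum update rule (6)."""
    params = [W1, b1, W2, b2, W3, b3]
    grads = [np.zeros_like(p) for p in params]
    for x, tp in zip(X, t):
        y1, y2, o = forward(x)
        d3 = o - tp                                   # from E_p, eq. (4)
        d2 = (W3.T @ d3) * (1.0 - y2 ** 2)            # back-propagated errors
        d1 = (W2.T @ d2) * (1.0 - y1 ** 2)
        for g, dg in zip(grads, (np.outer(d1, x), d1,
                                 np.outer(d2, y1), d2,
                                 np.outer(d3, y2), d3)):
            g += dg
    for i, (p, g) in enumerate(zip(params, grads)):
        velocity[i] = -eta * g / len(X) + mu * velocity[i]   # eq. (6)
        p += velocity[i]                              # in-place weight update
    # mean square error over the cycle, eq. (5)
    return np.mean([(forward(x)[2] - tp) ** 2 for x, tp in zip(X, t)])
```

Each call to train_cycle corresponds to one training epoch; calling it repeatedly until the MSE goal is reached mirrors the training loop described above.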

Results and Discussion

Preprocessing data

The raw data is preprocessed to remove any duplicate sets and to condition the data for processing by the network. Multilayer networks can be trained to generalize well within the range of inputs for which they have been trained. However, they do not have the ability to extrapolate accurately beyond this range, so it is important that the training data span the full range of the input space.

The main consideration in scaling the inputs and outputs is the choice of transfer functions.

The commonly used transfer functions such as logsig and tansig are sensitive to inputs only within certain ranges. For example, the tansig transfer function is sensitive to inputs in (-1,1) and saturates outside this interval. If saturation occurs at the beginning of the training process, the gradients will be very small and network training will be very slow. Two preprocessing techniques were used and their performance studied for the BP networks. In the first technique, the inputs and targets were normalized to have zero mean and unit standard deviation. In the second, the inputs and targets were scaled to fall in the range (-1,1). The performance of the first technique was poorer than that of the second, and so the latter was adopted for the simulations.
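As an illustration of the two scaling schemes, the sketch below applies them column-wise with NumPy; the array `raw` is a hypothetical stand-in for the experimental inputs and targets, not the actual data set.

```python
import numpy as np

def standardize(raw):
    """First technique: scale each column to zero mean and unit standard deviation."""
    return (raw - raw.mean(axis=0)) / raw.std(axis=0)

def scale_to_range(raw, lo=-1.0, hi=1.0):
    """Second technique (adopted here): linearly map each column into (-1, 1)."""
    col_min, col_max = raw.min(axis=0), raw.max(axis=0)
    return lo + (hi - lo) * (raw - col_min) / (col_max - col_min)

# Dummy example: 107 rows of 4 inputs, as in the experimental data set.
raw = np.random.default_rng(1).uniform(10.0, 75.0, size=(107, 4))
scaled = scale_to_range(raw)
assert scaled.min() >= -1.0 and scaled.max() <= 1.0
```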

Results

Simulations on different architectures were carried out with the neural network toolbox in MATLAB [24]. The models used had one input layer with four inputs, two hidden layers with a variable number of nodes, and one neuron in the output layer to characterize the raceway size. The effect of the learning rate, momentum term, training algorithm, number of neurons in each layer and transfer functions on the number of epochs needed to meet the performance goal (MSE kept constant at 10⁻⁸) was studied. The dataset had 107 data points in all, of which 64 were kept for training, 22 for validation and 21 for testing. It was observed during the trials that the convergence criterion is not met in exactly the same number of epochs, and the network weights and biases do not always converge to the same values even when initialized to exactly the same constants (zeros/+1/-1). For each architecture, after conducting a number of trials and considering the average and standard deviation of each set, the statistics for twenty runs were taken as representative. The fastest convergence was observed with the Levenberg-Marquardt algorithm, which is in accordance with other findings [25]. The activation function chosen was tansig for both hidden layers, and the purelin function performed best for the output node. In order to arrive at the optimum architecture, trials were made only with the training data by keeping a constant number of nodes in one layer and varying the number of nodes in the other layer; the results are shown graphically in Figures 3 and 4.


Figure 4: Epochs vs. Number of nodes in second hidden layer for different number of nodes in first hidden layer.
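The architecture search can be sketched as follows. scikit-learn's MLPRegressor does not offer the Levenberg-Marquardt algorithm, so the 'lbfgs' solver is used here purely as a stand-in, and the training arrays are placeholders; the point is the procedure of averaging the iteration count over twenty random initializations for each (M, S) pair, as in Figures 3 and 4.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def average_epochs(X_train, y_train, M, S, trials=20):
    """Train a (M, S) two-hidden-layer network from several random
    initializations and return the mean and spread of iterations to converge."""
    iters = []
    for seed in range(trials):
        net = MLPRegressor(hidden_layer_sizes=(M, S), activation='tanh',
                           solver='lbfgs', tol=1e-8, max_iter=5000,
                           random_state=seed)
        net.fit(X_train, y_train)
        iters.append(net.n_iter_)
    return np.mean(iters), np.std(iters)

# Sweep one hidden layer while holding the other fixed (placeholder data).
X_train = np.random.default_rng(2).uniform(-1, 1, size=(64, 4))
y_train = np.random.default_rng(3).uniform(-1, 1, size=64)
for M in (4, 8, 12, 16, 20):
    mean_it, std_it = average_epochs(X_train, y_train, M, S=16)
    print(f"M={M:2d}, S=16: {mean_it:.0f} ± {std_it:.0f} iterations")
```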

Based on these results, a network architecture of 16 by 16 by 1 was chosen for validation. The data set was partitioned into 60% training, 20% validation and 20% test. A typical performance graph is shown in Figure 5. It can be seen from this figure that when the training goal of 10⁻⁸ (mean square error, MSE) is met at epoch 10, the test and validation MSEs are also at their minimum.

The regression statistics of this trial are shown in Figure 6. It can be seen that the regression coefficient is very close to 1 for all datasets.


Figure 5: Performance graph of a typical run of the entire dataset.


Figure 6: Regression plot of training, validation, test and all datasets.
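A minimal sketch of the final training and evaluation step is given below: the data set is split 60/20/20, a 16-16-1 network is fitted, and the MSE and regression coefficient R are reported for each subset. The placeholder arrays, and the use of the 'lbfgs' solver in place of Levenberg-Marquardt, are assumptions for illustration only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, size=(107, 4))        # placeholder scaled inputs
y = rng.uniform(-1, 1, size=107)             # placeholder scaled raceway diameters

# 60% training, 20% validation, 20% test
X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5,
                                            random_state=0)

net = MLPRegressor(hidden_layer_sizes=(16, 16), activation='tanh',
                   solver='lbfgs', tol=1e-8, max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)

for name, Xs, ys in (("train", X_tr, y_tr), ("validation", X_val, y_val),
                     ("test", X_te, y_te)):
    pred = net.predict(Xs)
    mse = np.mean((pred - ys) ** 2)
    r = np.corrcoef(pred, ys)[0, 1]          # regression coefficient
    print(f"{name:>10}: MSE = {mse:.3e}, R = {r:.3f}")
```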

For comparison, the semi-empirical correlation for the raceway diameter developed from the experimental data is:

equation (7)

where Dr is the raceway diameter (mm); dt the tuyere diameter (mm); ρg the gas density (kg/m³); ρs the solid density (kg/m³); vb the blast velocity (m/s); dp the particle diameter (m); g = 9.81 m/s²; and Heff the effective bed height (mm), equal to 400 mm.

For the semi-empirical correlation, a regression coefficient of 0.79 was obtained and the MSE was also higher (1.49 × 10⁻²). It can thus be concluded that the BPNN model is superior. Sensitivity analysis was performed by perturbing each variable over its entire range while holding the other variables constant. The average rate of change of the raceway diameter with each of the four variables is presented in Table 1.

Variable             Slope
Tuyere diameter      1.40
Blast velocity       1.00
Particle diameter   -0.4
Particle density    -0.82

Table 1: Slope of the variables after sensitivity analysis.

The negative slopes for particle diameter and particle density signify an inverse relationship with the raceway diameter.
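The sensitivity analysis can be sketched as follows: each input is swept over its range while the others are held at their mid-range values, and the average slope of the predicted raceway diameter over the sweep is taken as the sensitivity. The `predict` argument stands for any trained model mapping the four scaled inputs to the scaled raceway diameter; the names, ranges and grid size are illustrative assumptions.

```python
import numpy as np

def sensitivity(predict, ranges, names, n_points=50):
    """Average slope of the prediction with respect to each input, one at a time."""
    mid = np.array([(lo + hi) / 2.0 for lo, hi in ranges])
    slopes = {}
    for i, (lo, hi) in enumerate(ranges):
        sweep = np.tile(mid, (n_points, 1))
        sweep[:, i] = np.linspace(lo, hi, n_points)   # perturb one variable only
        pred = predict(sweep)
        # average rate of change over the sweep (least-squares slope)
        slopes[names[i]] = np.polyfit(sweep[:, i], pred, 1)[0]
    return slopes

# Usage with scaled inputs, where every variable spans (-1, 1):
# slopes = sensitivity(net.predict, [(-1, 1)] * 4,
#                      ["tuyere diameter", "blast velocity",
#                       "particle diameter", "particle density"])
```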

Conclusion and Further Study

The raceway has been simulated in a cold model and, from the data collected, the raceway diameter has been simulated by a three layer (16 × 16 × 1) back propagation artificial neural network, for the first time. The network performance is optimum when the inputs are scaled to the range (-1,1) and the Levenberg-Marquardt algorithm is used for training, with tansig transfer functions in both intermediate layers and purelin for the output layer. The BPNN model was found to be superior to the semi-empirical correlation developed. Sensitivity analysis has been carried out to establish the effect of the tuyere diameter, blast velocity, particle density and particle diameter on the raceway diameter. In a forthcoming study, pseudo 2D modeling data will be used and radial basis networks will also be used for simulation.

Acknowledgments

The authors would like to thank Dr. A.K. Lahiri, Retired Professor, Department of Materials Engineering, I.I.Sc., Bangalore for his guidance and support.

References
