Modeling of the Phosphate Beneficiation Process as a Forerunner to Adaptive Control

02-103-132 Final

Executive Summary

Phosphate flotation plants in the United States are not currently operating at peak performance; this inadequacy must be rectified if America is to continue competing in the world market. The biggest improvements in the efficiency of existing phosphate processing plants in the near future are likely to come from improved process control. Many of the most effective process control systems currently being developed and implemented in industry are model-following control systems. Development of these model-following control systems requires fast and accurate computer models of the system to be controlled. This project was undertaken to provide a suite of computer models that can be used to develop intelligent, adaptive controllers for phosphate flotation systems.

The computer modeling software was written in the C++ computer language and is object-oriented for maintainability. The system includes a graphical user interface so that the user does not have to be intimately familiar with the computer modeling methods, search algorithms, and error reduction techniques used in the models. The modeling software and the proposed control software are both very complex. A blackboard system was developed to manage the complexity. This blackboard system coordinates the efforts of the various software modules.
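To make the blackboard idea concrete, the following is a minimal C++ sketch of that style of architecture; the class names, the knowledge sources, and the numerical values are hypothetical and are not taken from the actual BCD software.

    // Minimal sketch of a blackboard architecture: a shared store of named
    // facts plus modules that fire when their inputs are available.
    // All names and values here are illustrative placeholders.
    #include <functional>
    #include <iostream>
    #include <map>
    #include <string>
    #include <vector>

    // The blackboard: a shared store that all modules read from and post to.
    class Blackboard {
    public:
        void post(const std::string& key, double value) { facts_[key] = value; }
        bool has(const std::string& key) const { return facts_.count(key) > 0; }
        double get(const std::string& key) const { return facts_.at(key); }
    private:
        std::map<std::string, double> facts_;
    };

    // A knowledge source: a model or algorithm that runs when its inputs
    // are present on the blackboard and posts its own results.
    struct KnowledgeSource {
        std::string name;
        std::vector<std::string> inputs;
        std::function<void(Blackboard&)> run;
    };

    int main() {
        Blackboard bb;
        bb.post("feed_grade", 12.5);  // hypothetical plant measurement

        std::vector<KnowledgeSource> modules = {
            {"flotation_model", {"feed_grade"},
             [](Blackboard& b) { b.post("predicted_recovery",
                                        0.9 * b.get("feed_grade") / 12.5); }},
            {"controller", {"predicted_recovery"},
             [](Blackboard& b) { b.post("reagent_setpoint",
                                        1.0 + (0.9 - b.get("predicted_recovery"))); }},
        };

        // Simple control shell: fire any module whose inputs are ready.
        for (int pass = 0; pass < 2; ++pass) {
            for (const auto& ks : modules) {
                bool ready = true;
                for (const auto& in : ks.inputs) ready = ready && bb.has(in);
                if (ready) ks.run(bb);
            }
        }
        std::cout << "reagent setpoint: " << bb.get("reagent_setpoint") << "\n";
    }

The value of this arrangement is that each model or controller only needs to know what it reads from and writes to the blackboard, not how the other modules work.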

Six computer models have been developed to simulate a phosphate flotation system. These models have been tested on plant data obtained from the Florida phosphate industry. The six computer models include both first principle and empirical models. First principle models are based on the actual physics of the system being modeled. They allow the user to investigate the effects of altering parameters outside of the region for which data is available. Empirical models are not necessarily concerned with the physics of the system being modeled; they are concerned only with accurately modeling the response of the system.

The first computer model discussed is a first principle model originally developed by Jordan and Spears (1990). This model is in effect a standard model for flotation in conventional cells because it is based on a few decades of worldwide research. This computer model uses particle sizes and densities, bubble sizes, relative velocities, and induction time to compute three probabilities generally associated with flotation: (1) the probability of a bubble-particle collision, (2) the probability of a particle adhering to a bubble, and (3) the probability of a particle remaining attached to a bubble. These probabilities are used in conjunction with two additional measured probabilities to compute an expected recovery.
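The sketch below illustrates only the probability-product structure described above; the functional forms and numerical values are simplified placeholders and do not reproduce the actual Jordan and Spears (1990) correlations.

    // Illustrative sketch of combining flotation probabilities into an
    // expected recovery. Placeholder forms only, not the published model.
    #include <algorithm>
    #include <cmath>
    #include <iostream>

    // Probability of bubble-particle collision: grows with the ratio of
    // particle diameter to bubble diameter (placeholder form).
    double p_collision(double d_particle, double d_bubble) {
        double r = d_particle / d_bubble;
        return std::min(1.0, 1.5 * r * r);
    }

    // Probability of adhesion: falls off as induction time approaches the
    // contact time of the particle sliding over the bubble (placeholder form).
    double p_adhesion(double induction_time, double contact_time) {
        return std::exp(-induction_time / contact_time);
    }

    // Probability of remaining attached: decreases with particle size,
    // density, and relative velocity in the cell (placeholder form).
    double p_retention(double d_particle, double density, double rel_velocity) {
        double detachment = density * d_particle * rel_velocity;  // arbitrary scale
        return 1.0 / (1.0 + 0.001 * detachment);
    }

    int main() {
        // Hypothetical inputs: 150-micron particle, 1-mm bubble.
        double Pc = p_collision(150e-6, 1e-3);
        double Pa = p_adhesion(30e-3, 80e-3);
        double Pr = p_retention(150e-6, 2900.0, 0.2);

        // Two additional measured probabilities (taken from plant data in
        // the actual model; fixed here for illustration).
        double P_meas1 = 0.95, P_meas2 = 0.90;

        double expected_recovery = Pc * Pa * Pr * P_meas1 * P_meas2;
        std::cout << "expected recovery per bubble encounter: "
                  << expected_recovery << "\n";
    }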

The second and third computer models are statistical models. These models utilize various statistical tools to analyze the data obtained from the flotation system. These tools include regression analysis, factor analysis, cluster analysis, and projection to latent structures. The statistical models provide very little insight into the physics of the flotation system. However, they are very useful in exploring the structure of data. They can identify different physical mechanisms contributing to the data elements and allow insight into the extent and nature of the error associated with the data.
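As a small illustration of the simplest of these tools, the sketch below fits an ordinary least-squares regression line to a handful of hypothetical plant observations; the data values are invented for illustration only.

    // Minimal sketch of a regression analysis: least-squares fit of
    // recovery against fatty acid dosage. Data values are hypothetical.
    #include <iostream>
    #include <vector>

    int main() {
        // Hypothetical plant observations: (fatty acid dosage, % recovery).
        std::vector<double> dosage   = {0.6, 0.8, 1.0, 1.2, 1.4};
        std::vector<double> recovery = {72.0, 78.0, 83.0, 86.0, 88.0};

        // Slope and intercept of the least-squares line
        // recovery = a + b * dosage.
        double n = dosage.size(), sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (size_t i = 0; i < dosage.size(); ++i) {
            sx  += dosage[i];
            sy  += recovery[i];
            sxx += dosage[i] * dosage[i];
            sxy += dosage[i] * recovery[i];
        }
        double b = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double a = (sy - b * sx) / n;

        std::cout << "recovery ~= " << a << " + " << b << " * dosage\n";
    }

Factor analysis, cluster analysis, and projection to latent structures extend the same idea to many correlated variables at once.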

The fourth computer model is a neural network model. Neural networks are approximate models of the human brain in which computational elements called neurons are placed in a network of weighted connections called synapses. These networks are then trained on existing data to accurately reproduce the response of the system. Neural networks produce very powerful and very flexible models. Unfortunately, they are the ultimate “black box”; they give no indication of why a particular input produces a given output.
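The sketch below shows only the feedforward structure of such a network with one hidden layer of sigmoid neurons; in the real model the weights would be learned by training on plant data, whereas here they are arbitrary values chosen for illustration.

    // Minimal sketch of a one-hidden-layer feedforward network mapping
    // plant inputs to a predicted recovery. Weights are illustrative
    // placeholders standing in for trained values.
    #include <cmath>
    #include <iostream>
    #include <vector>

    double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

    // One layer: out_j = sigmoid(bias_j + sum_i w[j][i] * in_i).
    std::vector<double> layer(const std::vector<double>& in,
                              const std::vector<std::vector<double>>& w,
                              const std::vector<double>& bias) {
        std::vector<double> out(w.size());
        for (size_t j = 0; j < w.size(); ++j) {
            double sum = bias[j];
            for (size_t i = 0; i < in.size(); ++i) sum += w[j][i] * in[i];
            out[j] = sigmoid(sum);
        }
        return out;
    }

    int main() {
        // Hypothetical scaled inputs: feed grade, reagent dosage, pulp density.
        std::vector<double> inputs = {0.6, 0.4, 0.7};

        // Arbitrary weights standing in for trained values.
        std::vector<std::vector<double>> w_hidden = {{0.8, -0.3, 0.5},
                                                     {-0.2, 0.9, 0.1}};
        std::vector<double> b_hidden = {0.1, -0.1};
        std::vector<std::vector<double>> w_out = {{1.2, 0.7}};
        std::vector<double> b_out = {-0.5};

        std::vector<double> hidden = layer(inputs, w_hidden, b_hidden);
        std::vector<double> output = layer(hidden, w_out, b_out);
        std::cout << "predicted (scaled) recovery: " << output[0] << "\n";
    }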

The fifth computer model is based on fuzzy logic and genetic algorithms. Fuzzy logic is a technique that allows computers to manipulate subjective, linguistic concepts like those commonly used in human decision-making. Genetic algorithms are search algorithms based on the mechanics of genetics. They are robust algorithms that have been used successfully to solve a wide range of problems. Together, fuzzy logic and genetic algorithms can accurately model phosphate flotation plants from relevant plant data. Although the performance of this model is similar to that of the neural network model, it is not a black box. The fuzzy model gives a clear explanation of its reasoning in the form of linguistic rules.
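The sketch below shows how a single linguistic rule of the form "IF feed grade is LOW AND recovery is LOW THEN increase fatty acid" can be evaluated with fuzzy logic; the membership functions, rule consequent, and plant values are hypothetical placeholders for the parameters that a genetic algorithm would tune from plant data.

    // Minimal sketch of evaluating one fuzzy linguistic rule.
    // All membership shapes and numbers are illustrative placeholders.
    #include <algorithm>
    #include <cmath>
    #include <iostream>

    // Triangular membership function: degree to which x belongs to a fuzzy
    // set centered at c with half-width w.
    double tri(double x, double c, double w) {
        return std::max(0.0, 1.0 - std::fabs(x - c) / w);
    }

    int main() {
        double feed_grade = 10.0;   // hypothetical feed grade
        double recovery   = 74.0;   // hypothetical % recovery

        // Fuzzify the crisp measurements into linguistic terms.
        double grade_low    = tri(feed_grade, 8.0, 5.0);
        double recovery_low = tri(recovery, 70.0, 10.0);

        // Rule: IF grade is LOW AND recovery is LOW THEN increase fatty acid
        // by a LARGE amount. AND is taken as the minimum of the memberships.
        double firing = std::min(grade_low, recovery_low);
        double rule_output = 0.4;   // consequent a GA would tune in practice

        // With a single rule, the defuzzified output is just the consequent
        // weighted by the rule's firing strength.
        double fatty_acid_change = firing * rule_output;
        std::cout << "suggested fatty acid change: " << fatty_acid_change << "\n";
    }

Because the rule is stated in words, an operator can read exactly why the model recommends a given reagent change, which is the transparency the neural network lacks.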

The last computer model is a rate constant model. The rate constant model, like the first principles model, consists of a collection of simple first-order differential equations. But instead of basing the computation of the rate constants for the differential equations on a model of the underlying physical processes, the rate constant model fits them statistically from plant operating data. The only independent variables in this model’s differential equations are the concentrations of mineral species in the feed slurry. This model requires the plant data observations to be partitioned into clusters, each representing the same kind of operating conditions. Rate constants are then computed for each cluster. Thus, the rate constant model consists of several sets of rate constants along with a rule for choosing the appropriate set for any given operating state for the plant.
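The sketch below shows only the skeleton of this approach: a first-order rate equation integrated over the residence time, with the rate constant selected by a rule that stands in for the cluster assignment. The constants, threshold, and species are hypothetical.

    // Minimal sketch of the rate constant model's structure.
    // Constants and the cluster-selection rule are illustrative only.
    #include <cmath>
    #include <iostream>

    // Recovery after residence time t for a first-order rate constant k:
    // R = 1 - exp(-k*t), obtained by integrating dC/dt = -k*C.
    double recovery(double k, double t) { return 1.0 - std::exp(-k * t); }

    // Choose the rate constant for the current operating state. In the real
    // model a cluster rule selects among several statistically fitted sets of
    // constants; a single feed-grade threshold stands in for that rule here.
    double phosphate_rate_constant(double feed_grade) {
        return (feed_grade > 10.0) ? 0.55 : 0.35;   // per minute, illustrative
    }

    int main() {
        double feed_grade = 12.0;   // hypothetical feed grade
        double residence  = 4.0;    // minutes in the flotation cell

        double k = phosphate_rate_constant(feed_grade);
        std::cout << "predicted phosphate recovery: "
                  << 100.0 * recovery(k, residence) << " %\n";
    }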

During the development of the models it became apparent that not all of the parameters that are important to the system were being measured. Unmeasured factors such as the clay content of the material going to the conditioner and/or particle size dramatically affect the short-range response of the system. For example, for an identical recovery, fatty acid requirements may vary by a factor of two. BCD researchers have developed a model-following control system based on fuzzy logic and genetic algorithms that can overcome a partial lack of sensory data from the system. This method has been tested on helicopter flight control, the formation of hexamine, and pH control by titration. Based on this experience we believe that the six models can be used to model existing plants from operational data. If the data is inadequate for the control methodology, the models can indicate precisely what additional data is needed.

The programs that implement the various models, and any additional control software used to control the plant or a model of the plant, are of necessity complex. In addition, the programs rely on a large number of concepts that may be unfamiliar to most users. For this reason, we have developed a prototype interface that stands between the user and the program complexity. The interface can be used to design and test circuits using the models embedded in it.

Bernard J. Scheiner - BCD Technologies; Charles L. Karr and Donald A. Stanley - University of Alabama