
PROCESS DESIGN TRENDS

Process Information: Achieving a Unified View

Thomas F. Edgar, The University of Texas at Austin

Advances in computer hardware and software, coupled with better fundamental models, will change the way processes are designed and plants are operated.

Digital computers have a major influence on our lives, and the development of computers is the most important technological advance over the past 40 years. In 1982, John Naisbitt wrote the book "Megatrends" (1), which predicted that the U.S. would be transformed from an industrial to an information society, which, in fact, has happened. The costs and capabilities of computers in 1960 (see Table 1) provide an interesting reference point for the significant changes in digital technology, and illustrate how affordable computers have become. The IBM 7090, for example, cost $55,000/month in 1960 to lease, not own. Yet, this computer was less powerful than the microprocessor in today's $500 video camera.

Table 1. Key computer benchmarks in 1960.

Computer     Average Monthly Rental, 1960 $    Maximum Core Storage Capacity, 1,000 bits    Add Time, micro-s    Cards Read, per min
IBM 7090     55,000                            160                                          0.004                250
CDC 1604     34,000                            32                                           0.005                1,300
DEC PDP-1    2,200                             4                                            0.010                Tape input

In the field of computing, Moore's Law has become a frequently quoted benchmark. It states that the speed of computers will double every 18 months; in other words, every 10 years the speed of computing will essentially increase by about a factor of 100. Moore's Law is driven by the growing number of transistors per chip, which is a measure of the chip's computing power. That trend has persisted since the 1960s, and technology leaders predict that Moore's Law will hold for at least another 10–20 years.

Despite the remarkable advances achieved by computers, we also know that computers sometimes can behave strangely or unpredictably. Even so, the reliability of computers has gone up by at least an order of magnitude during the past decade, and computers now are widely used in mission-critical activities such as process control, automation, and manufacturing.

Meanwhile, mathematical modeling and simulation have become important and useful areas of computer application in the chemical process industries (CPI). In the mid-1970s, engineers were skeptical of simulation as a valid way to solve manufacturing problems; few people then believed accurate predictions could be obtained from mathematical models. Today, however, the prevailing view in industry is that it is much less expensive and more reproducible to run simulation experiments than to perform repeated experiments involving actual equipment. The confidence level in what can be done with simulation has risen considerably, and this is having a profound influence on the practice of process engineering.
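The factor-of-100 figure follows directly from compounding: ten years contain 120/18, or about 6.7, doubling periods, and 2^6.7 is roughly 100. The short sketch below illustrates the arithmetic (an illustration only; the 18-month doubling period quoted above is its single input).

```python
# Compound the 18-month doubling period cited in the text to verify the
# "factor of 100 per decade" claim. Illustrative sketch, not a forecast.

def speedup(years: float, doubling_months: float = 18.0) -> float:
    """Relative computing speedup after a given number of years."""
    return 2.0 ** (years * 12.0 / doubling_months)

print(f"Speedup over 10 years: {speedup(10):.0f}x")  # about 100x
print(f"Speedup over 20 years: {speedup(20):.0f}x")  # about 10,000x
```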

Paradigms for process engineering

The traditional paradigm for process engineering as practiced over the last 50 years by chemical engineers is shown in Table 2. There are several stages involved with developing and commercializing a process. The research and development (R&D) stage involves discovery, usually by chemists. It is followed by the pilot-plant stage, where chemical engineers scale up the process to higher production levels. Scale-up can be performed repetitively, incrementally moving to higher production levels until eventually there is enough confidence to design the commercial facility. After the commercial facility is designed and constructed, process optimization may commence. In this traditional paradigm, the engineer usually waits until the scale-up process is completed, then a process engineering analysis is carried out via detailed simulation.

Table 2. The current paradigm of process engineering.

Stage    Activity
1.       Research and development
2.       Scale-up, scale-up, …
3.       Design of commercial facility
4.       Optimization

A new paradigm is emerging, however, where simulation is performed even at the discovery phase (2). Here, an engineer enters approximate equipment sizes into a flowsheeting package to build a model of the process and its economics. This allows the engineer to decide whether further scale-up is warranted. A few companies are using this procedure now. As scale-up progresses, the process model is updated, based on the data acquired. This approach may yield a considerable savings both in time to commercialization and in total costs.

Figure 1 illustrates the relationships between the various steps in the new paradigm. The addition of the simulation block in the research, development, and commercialization progression demonstrates its importance. The advent of faster computers will enable more-detailed modeling that can be performed interactively with advanced analysis and visualization tools. As shown in Figure 1, simulation not only is embedded in the experimental phase, but also is an integral part of theories that rely on intensive computations. In many future engineering calculations (such as for separation or reaction), structure/property relationships will be derived from first principles that use computer simulation.

■ Figure 1. The new paradigm for process engineering: simulation links theory, experiment (laboratory, pilot plant), and process design.

Commercial computational fluid dynamics (CFD) packages, such as FLUENT, FIDAP, and NEKTON, can perform a rigorous simulation of flow patterns in a process unit (for instance, a reactor or exchanger) by solving time-dependent, three-dimensional equations of change to analyze and understand the system behavior at a microscopic level. Coupled heat- and mass-transfer problems also can be solved with CFD codes. DuPont recently reported an application of CFD in a piece of equipment operating in a highly erosive environment that consumed large amounts of steam (3). CFD modeling showed how the equipment failed on occasion; redesign of the internal components extended the useful life and greatly reduced energy consumption. Tools such as CFD and molecular modeling someday will realize the holy grail called ab initio process design, which means design of an entire chemical plant from first principles. This is a fairly radical notion, but one that might actually come to pass in 10–15 years. While it is easy to be skeptical about such predictions, remember that no one in the computer business four years ago foresaw the stunning impact of the World Wide Web.
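To make "solving the equations of change" concrete, here is a deliberately miniature sketch in the same spirit: an explicit finite-difference solution of one-dimensional transient heat conduction. It is an illustration only (all parameter values are invented), not a representation of what packages like FLUENT do internally, which involves the full coupled, three-dimensional equations on complex geometries.

```python
import numpy as np

# Toy "equation of change": 1-D transient diffusion, dT/dt = alpha * d2T/dx2,
# marched forward with an explicit finite-difference scheme.

alpha = 1.0e-5              # thermal diffusivity, m^2/s (illustrative value)
L, n = 0.1, 51              # domain length (m) and number of grid points
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha    # time step chosen for stability (r = 0.4 <= 0.5)

T = np.full(n, 300.0)       # initial temperature field, K
T[0], T[-1] = 400.0, 300.0  # fixed-temperature boundary conditions

for _ in range(2000):       # march forward in time
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

print(f"Midpoint temperature after {2000 * dt:.0f} s: {T[n // 2]:.1f} K")
```

The same marching idea, applied to momentum, heat, and species balances in three dimensions, is what consumes the computing horsepower discussed below.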
In ab initio process design, you stipulate a product with certain target properties. Various flowsheets can be synthesized to find a scheme with predictable environmental impact and minimal cost. In other words, you begin with a set of desired properties and then reverse-engineer the process chemistry and process design to attain those desired properties. Many biotechnology companies now use molecular modeling tools in drug design to answer questions about electronic structures, chemical properties of candidate molecules, locations of active sites, and how certain drugs will operate. While I am not suggesting experiments can be eliminated altogether, replacing repetitive experiments with simulation will drive the cost of process development down.

Examples of the utility of molecular modeling include the prediction of thermochemistry to determine feasible reaction pathways, prediction of spectroscopic properties to enable identification of chemical species that might be involved in determining environmental impact, and electronic structure calculations to provide insights on bonding energies. A number of researchers now are using molecular simulation to predict some vapor/liquid equilibrium properties from first principles; determining such property values normally requires experimental data or empirical correlations.
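For contrast, the sketch below shows the conventional, correlation-based route that such first-principles predictions would supplant: an ideal-solution bubble-point estimate built on the empirical Antoine equation. The Antoine constants are representative literature values for benzene and toluene; treat them, and the whole calculation, as illustrative.

```python
# Empirical-correlation route to a simple VLE property: Raoult's-law
# bubble-point pressure with Antoine vapor pressures,
# log10(P[mmHg]) = A - B / (C + T[degC]). Constants are representative
# literature values, quoted here for illustration only.

ANTOINE = {
    "benzene": (6.90565, 1211.033, 220.790),
    "toluene": (6.95464, 1344.800, 219.482),
}

def psat_mmhg(species: str, t_c: float) -> float:
    """Pure-component vapor pressure from the Antoine correlation."""
    a, b, c = ANTOINE[species]
    return 10.0 ** (a - b / (c + t_c))

def bubble_pressure(x_benzene: float, t_c: float) -> float:
    """Ideal-solution (Raoult's law) bubble-point pressure, mmHg."""
    return (x_benzene * psat_mmhg("benzene", t_c)
            + (1.0 - x_benzene) * psat_mmhg("toluene", t_c))

print(f"Bubble P, 40% benzene at 90 degC: {bubble_pressure(0.40, 90.0):.0f} mmHg")
```

A molecular simulation aims to produce the same numbers with no fitted constants at all, which is why it demands so much more computing power.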
Application of molecular modeling approaches is limited now, however, by the availability of fast computers (in some cases supercomputers) to carry out rigorous computations. To fully develop the potential of molecular modeling, a petaflop (10^15 floating point operations per second, or flops) machine may be ultimately needed. The fastest machine available this year at the San Diego Supercomputer Center, one of two leading academic facilities for computing, will be around one teraflop (10^12 flops). Assuming that Moore's Law continues to hold, supercomputers should exceed the petaflop benchmark by 2020, while engineering workstations may approach one teraflop in speed then. A 233-MHz Pentium PC available today has a rating of about 50 megaflops, or 0.05 gigaflop (5 × 10^-5 teraflop or 5 × 10^-9 petaflop); so, a cluster of 1,000 personal computers (PCs) run in a parallel computing mode could approach the computing power of what are called supercomputers today, and provide the computational horsepower needed to solve such modeling problems.
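The back-of-the-envelope arithmetic behind those projections is summarized below; the per-PC rating and the roughly one-teraflop 2000-era baseline are the figures quoted above, and the extrapolation simply compounds the 18-month doubling.

```python
# Aggregate rating of a 1,000-PC cluster, plus a Moore's Law projection
# of the fastest machines. Figures from the text; sketch only.

pc_flops = 50e6                  # ~50 megaflops for a 233-MHz Pentium
cluster_flops = 1_000 * pc_flops
print(f"1,000-PC cluster: {cluster_flops / 1e9:.0f} gigaflops")  # 50 Gflops

top_2000 = 1e12                  # ~1 teraflop (San Diego Supercomputer Center)
for year in (2010, 2020):
    projected = top_2000 * 2 ** ((year - 2000) * 12 / 18)
    print(f"Projected fastest machine, {year}: {projected / 1e15:.1f} petaflops")
```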

Challenges in process design

As Keller and Bryan note in their article on p. 41 (2), for the CPI to be globally competitive in the 21st century, process engineers must reduce raw material costs, capital investment, plant energy consumption, inventory in the plant, and the amount of pollutants generated. They also stress that the CPI in the future will need improved process flexibility, safety, and control technology.

Keller and Bryan estimate that, for smaller-sized plants (typically batch, semi-batch, and campaign operations), raw material costs account for 5–20% of a product's selling price, and capital costs for 5–30%. In larger plants, which generally are continuous, the percentages change quite significantly. For these plants, raw-material and capital costs dominate the sales price. In batch plants, labor and inventory-storage costs are responsible for a much larger portion of the selling price. A fuller cost breakdown appears in their Table 2 (p. 42). They also analyze the possibilities for achieving savings in both raw materials and investment.

New technologies can provide significant cost savings for some processes. For instance, Keller and Bryan point to the use of reactive distillation, where the two unit operations take place in a single vessel. There already have been a number of notable successes with this technology, such as for making tert-amyl methyl ether (TAME), but it is unclear how many other processes could benefit from reactive distillation. At UT Austin, we are carrying out two experimental projects using reactive distillation for production of TAME and detergent alkylate. We are investigating challenging control problems that come about when separation and reaction are combined in the same vessel. Table 3 compares the effects of three typical disturbances — changes in feed flow, feed enthalpy, and column pressure — on conventional and reactive distillation columns. When using reactive distillation, new control problems may arise. A variation in feed flow changes the system's residence time, which, in turn, alters the reaction characteristics. This must be dealt with through some sort of advanced model-based control strategy.

Table 3. The impact of disturbances on conventional distillation vs. reactive distillation.

Disturbance              Conventional Distillation                      Reactive Distillation
Feed flow change         Effectively handled by manipulating            Change in residence time over
                         variables ratioed to the feed rate             catalysts must be addressed
Feed enthalpy change     May significantly alter internal liquid/       Low reflux ratios employed
                         vapor ratio for low reflux columns             under kinetic control
Column pressure change   Affects relative volatility, usually           Pressure directly affects temperature
                         with negligible effect                         profile in reaction zone

Pressure swings are usually not too difficult to handle in a conventional column but, in a reactive distillation column, the temperature profile may change markedly. As these examples illustrate, new process technology will require new operating strategies and the resulting plants may be more difficult to control.

Additional savings will accrue from cutting inventories and adopting just-in-time production at batch operations (2). But, keeping inventory at a minimum will force these plants to be much more agile in carrying out process changes and switching from one product slate to another. And, manufacturing products with a short cycle time will require an increased emphasis on modeling and control in batch processing.

Plant operations and information technology

The next 20 years undoubtedly will see a greater emphasis in the use of information technology in plant operations (4). This will be abetted by a new stage in the evolution of plant information and control architectures that now is emerging. Progress in computer control during the last 20 years has been spurred by wide acceptance by vendors of the distributed-control hub system, which was pioneered during the 1970s by Honeywell. A distributed control system (DCS) employs a hierarchy of computers. A single microcomputer controls 8–16 individual loops; more-detailed calculations are performed using workstations that receive information from those lower-level devices. In this scheme, set points, often determined by real-time optimization, are sent from the higher-level computers to the lower-level microprocessors. Now, with the focus on enterprise integration, some automation vendors are implementing Windows NT as the new solution for process control, utilizing PCs in a client-server architecture rather than the hub-centric approach. This promotes an open application environment (open control system) and makes accessible the wide variety of PC object-oriented software tools that are now available.

The demand for smart field devices, particularly digital ones, is rising rapidly. It clearly is desirable to be able to query a remote instrument to determine if it is functioning properly. In addition, smart instruments can perform self-calibration and fault detection/diagnosis. An instrument using digital signals has the key advantage that data can be transmitted (even by wireless means) without the normal degradation experienced with analog signals. In smart valves, proportional/integral/derivative (PID) control resides right in the instrument; this can permit the central computers to do more-advanced process control and information management. It is projected that installation of smart instruments can reduce instrumentation costs by up to 30% compared to conventional approaches.

There has been much recent activity in defining standards for the digital, multidrop (connection) communications protocol among sensors, actuators, and controllers; in the U.S., the concept is called fieldbus control. Vendors and users have been working together to develop and test interoperability standards via several commercial implementations.

When data become readily available at a central point, it will be easier to apply advanced advisory systems (e.g., expert systems) to monitor the plant for performance, as well as to detect and diagnose faults. Recent efforts have built upon the traditional single-variable statistical-process-control (SPC) approach, extending it to multivariable problems (that is, ones with many process variables and sensors) using multivariate statistics and such tools as principal component analysis. These techniques can handle sensor validation to determine if a given sensor has failed or exhibits bias, drift, or lack of precision.
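The sketch below illustrates the principal-component idea in miniature: fit a PCA model to data from normal operation, then flag samples whose residual (squared prediction error, SPE) is abnormally large, which is one simple route to catching a biased sensor. The data are synthetic and no formal control limit is computed; industrial implementations add proper limits on the T-squared and SPE statistics.

```python
import numpy as np

# PCA-based sensor validation in miniature: model the correlation
# structure of normal operation, then score new samples by how badly
# they violate it (squared prediction error, SPE). Synthetic data.

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 6))                 # 500 samples, 6 sensors
normal[:, 1] = normal[:, 0] + 0.1 * normal[:, 1]   # sensor 2 tracks sensor 1

mean = normal.mean(axis=0)
X = normal - mean
_, _, vt = np.linalg.svd(X, full_matrices=False)
P = vt[:2].T                                       # retain two components

def spe(sample: np.ndarray) -> float:
    """Squared prediction error of one sample against the PCA model."""
    x = sample - mean
    residual = x - P @ (P.T @ x)                   # part the model can't explain
    return float(residual @ residual)

good = normal[0]
faulty = good.copy()
faulty[1] += 5.0                                   # simulate a biased sensor 2
print(f"SPE normal: {spe(good):.2f}, SPE faulty: {spe(faulty):.2f}")
```

The biased sensor breaks the learned correlation between sensors 1 and 2, so its residual, and hence its SPE, jumps even though each reading alone looks plausible, which is exactly what single-variable SPC misses.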


Plant maintenance will be integrated with operational activities, using sophisticated tools such as data mining to improve reliability. Maintenance databases will contain full details of all equipment, and knowledge-based computers will monitor routine and preventive repairs, utilizing expert systems to prevent failure through the proper level of maintenance.

In the area of process modeling, industrial groups are beginning to examine whether it is possible to achieve a seamless transition between models used for design and simulation and models used for control. The CAPE-OPEN industrial consortium in Europe and groups in the U.S. are working toward an open architecture for commercial simulators. Standard data exchange is needed so that models in a library can come from different sources. This will allow users to "plug and play" company-specific libraries, such as physical property packages, into any compliant simulator.

These steady-state flowsheet simulators now are being extended to handle dynamic cases (e.g., linking Aspenplus to Speedup). The goal is to have models for real-time control that run at 50–500 times real time. This, however, will require increased computational efficiency and, perhaps, application of parallel computing. Models for any use should be derivable from a single source, and dynamic models should be an extension of steady-state equations. Models should be robust and insensitive to starting conditions, and should match the full operating range of the plant.

A new generation of model-based control theory that is tailored to the successful operation of modern plants has emerged. These advanced algorithms include model predictive control (MPC), robust control, and adaptive control, where a mathematical model is explicit in developing a control strategy. The success of MPC in solving large, multivariable industrial control problems is impressive (5), perhaps even reaching the status of a "killer application." MPC of units with as many as 10 inputs and 10 outputs already is established in industrial practice. Computing power is not causing a critical bottleneck in process control, but larger MPC implementations and faster sample rates will probably accompany faster computing. For the next several years, better algorithms could easily have more impact than improved hardware. MPC will appear at the lowest level in the DCS, thus reducing the number of PID loops implemented.
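To show the core receding-horizon idea without the industrial machinery (constraints, disturbance models, many inputs and outputs), here is a toy unconstrained MPC loop for a first-order process. The model, horizon, and setpoint are all invented for illustration.

```python
import numpy as np

# Toy unconstrained MPC: predict the output over a horizon with a linear
# model, solve least squares for the future input moves, apply only the
# first move, then repeat at the next sample (receding horizon).

a, b = 0.9, 0.1        # first-order discrete model: y[k+1] = a*y[k] + b*u[k]
N = 10                 # prediction (and control) horizon
ysp = 1.0              # setpoint

# Prediction matrices: y_future = F*y0 + G @ u_future
F = np.array([a ** (i + 1) for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b

y = 0.0
for k in range(20):                       # closed-loop simulation
    target = ysp * np.ones(N) - F * y     # correction needed over the horizon
    u_future, *_ = np.linalg.lstsq(G, target, rcond=None)
    u = u_future[0]                       # apply the first move only
    y = a * y + b * u                     # "plant" (here identical to the model)

print(f"Output after 20 steps: {y:.3f}")  # approaches the setpoint of 1.0
```

Industrial MPC solves a constrained version of this least-squares problem, for tens of inputs and outputs, every control interval.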
The combination of process simulation, optimization, and control into one software package will be a near-term reality, at least if some corporate mergers and acquisitions are any indication. For instance, AspenTech's acquisition of Dynamic Matrix Control, Setpoint, and Neuralware will provide it the ability to offer integrated technology with a set of consistent models across R&D, engineering, and production stages, with increased emphasis on rigorous dynamic models and the best control solutions. Similarly, Shell Oil and Simulation Sciences have developed a new modeling software system called rigorous on-line modeling and equation-based optimization (ROMEO). Simulation Sciences is now part of Invensys, which also owns Foxboro, historically one of the stronger process-control vendors. As a result of such mergers, software users will be able to optimize plant-wide operations using real-time data and current economic objectives. Software will determine the location and cause of operating problems and provide a unified framework for data reconciliation and parameter estimation in real time.

Computer-integrated manufacturing

Further movement toward computer-integrated manufacturing (CIM) is expected. CIM is defined as a unified network of computer hardware, software, and manufacturing systems that combines business and process functions (such as administration, economic analysis, scheduling, design, control, and operations). This includes interactions among suppliers, different plants, distribution sites, transportation networks, and customers. Operations will be guided by complete information throughout the supply chain — that is, integration of sales, marketing, manufacturing, supply, and R&D data, which seamlessly flow along the whole supply chain. CIM provides general access to a common database and produces reports for managers, engineers, and operations staff, so that optimum decisions can be made and executed in a timely and efficient manner. The CPI recognize CIM as an important tool for improving competitiveness, but it is not yet implemented in a significant number of plants. Cooperation among computer vendors is required to develop a satisfactory computer/communication/software system.

Supply chain management poses difficult decision-making problems, because of the wide-ranging temporal and geographical scales, along with a high level of uncertainty due to changing market factors and plant availability. A successful package of tools must anticipate customer requirements, commit to customer orders, procure new materials, allocate capacity, plan production, and schedule delivery. In today's manufacturing environment, most of these decisions are made in management "silos" — marketing, procurement, manufacturing, distribution, shipping, and sales all operate fairly independently. Successful supply-chain management will require breaking down or re-engineering these silos. A company that is "able to promise" to its customer base will have a competitive advantage in the future. Technology tools required to achieve this "able to promise" state include forecasting, optimization, simulation, and expert systems. Manufacturing operations will respond to customer demand that comes either by pull or by make-to-order. This demand will drive scheduling, raw material supply, distribution, deployment of manufacturing resources, and production. The system will operate automatically without manual intervention; logic and expert systems will make the routine decisions.

In the factory of the future, process control will be carried out in a different environment than it is today. In fact, some forward-thinking companies believe that the operator in tomorrow's plant may need to be an engineer (as is the case now in Europe), with economic and safety responsibilities comparable to those of an airline pilot. Because of greater integration of plant equipment, tighter quality specification, and more emphasis on maximum profitability while maintaining safe operating conditions, the importance of process control will increase. Plant personnel will rely on very sophisticated computer-based tools. Controllers will be self-tuning, operating conditions will be optimized frequently, fault-detection algorithms will deal with abnormal events, total plant control will be implemented using a hierarchical (distributed) multivariable strategy, and expert systems will help the plant engineer make intelligent decisions.

Plant data will be analyzed continuously and reconciled via material and energy balances and nonlinear programming; unmeasured variables will be reconstructed using parameter estimation techniques (soft sensors).
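Data reconciliation has a particularly clean linear special case, sketched below: adjust noisy flow measurements as little as possible, in a weighted least-squares sense, so that they exactly satisfy the material balance. The numbers are invented; energy balances and nonlinear models require the nonlinear programming mentioned above.

```python
import numpy as np

# Linear data reconciliation: find the smallest weighted adjustment to
# the measurements x_meas such that the balance A @ x = 0 holds exactly.
# The closed-form projection below is the classical linear result.

# Splitter: stream 1 in, streams 2 and 3 out -> x1 - x2 - x3 = 0
A = np.array([[1.0, -1.0, -1.0]])
x_meas = np.array([100.0, 61.0, 42.0])   # raw flows (kg/h); imbalance = -3
S = np.diag([1.0, 4.0, 4.0])             # variances (trust stream 1 most)

residual = A @ x_meas                              # how far off balance we are
correction = S @ A.T @ np.linalg.solve(A @ S @ A.T, residual)
x_hat = x_meas - correction                        # reconciled estimates

print("Reconciled flows:", np.round(x_hat, 2))
print("Balance check:", float(A @ x_hat))          # ~0
```

The less-trusted streams absorb most of the adjustment, which is exactly the behavior a plant-wide reconciliation package generalizes to hundreds of streams.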


Digital instrumentation will be more reliable and self-calibrating, while composition measurements heretofore unavailable will be produced on-line. Many plants already have incorporated several of these ideas, but no plant has reached the highest level of sophistication over the total spectrum of control activities.

Figure 2 (6) illustrates a possible hierarchical CIM structure that could be used in merging business optimization with plant and process operations and control. Each layer will have different models and time scales and will include checks of the model against data obtained by the computer systems. Consistent, robust models will serve as the central repository of process knowledge. This will require advances in model building, which is a major obstacle because of the level of expertise required to formulate and use mathematical models.

■ Figure 2. One possible structure for model-based computer-integrated manufacturing. (The hierarchy runs from enterprise data, through planning [supply, logistics, capacity, manpower], scheduling, plant-wide management [plant-wide optimization], unit management [process control, automation, transient automation, abnormal situation management, intelligent decisions], and knowledge rectification [soft sensors, statistical analysis], down to process monitoring [sensors, on-line measurement, redundancy], all tied to consistent, robust models.)

As I've already mentioned, future batch plants will adopt just-in-time production. The plant computer system will monitor the status and availability of equipment, people, and materials, as well as maintenance needs and other elements necessary to support automated scheduling. Both continuous and batch processes will have fully automated handling systems for raw materials, package components, and finished products. Robotics will be used extensively to replace manual labor. This, combined with complete process automation, will minimize the need for manual intervention.

At the planning and scheduling level, multiperiod linear programming tools capable of handling large-scale systems have been used since the 1970s, especially in the petroleum/petrochemical sector. Now, real-time, plant-wide optimization applications based on steady-state process models are stimulating the use of nonlinear programming tools. The methodology for scheduling of multipurpose batch and continuous production facilities has been under investigation for over 20 years; it has progressed from rule-based and heuristic randomized search methods to mixed-integer programming methods.
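A multiperiod planning model in miniature: the sketch below uses SciPy's general-purpose linprog solver on a two-period, one-product problem in which limited capacity forces production ahead of demand. All of the data are invented; refinery-scale models have thousands of variables and, for scheduling, integer decisions as well.

```python
from scipy.optimize import linprog

# Two-period production planning LP: choose production p_t (capacity-
# limited) and inventory i_t so demand is met at minimum production-
# plus-holding cost. Variables: x = [p1, p2, i1, i2]. Invented data.

cost = [10.0, 12.0, 1.0, 1.0]      # production cost per period, then holding

demand = [80.0, 120.0]
A_eq = [[1.0, 0.0, -1.0, 0.0],     # inventory balance: p1 - i1 = d1
        [0.0, 1.0, 1.0, -1.0]]     # inventory balance: p2 + i1 - i2 = d2
b_eq = demand

bounds = [(0, 100), (0, 100), (0, None), (0, None)]   # capacity 100/period

res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
p1, p2, i1, i2 = res.x
print(f"Produce {p1:.0f} then {p2:.0f}; carry {i1:.0f} units into period 2")
```

Because period-2 demand exceeds capacity, the optimum builds 20 units ahead; adding yes/no decisions (which product runs on which unit) turns this LP into the mixed-integer programs the scheduling literature has converged on.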
An excellent video on how real-time decisions in manufacturing will be made in the 21st Century was issued by DuPont and Digital Equipment in 1991 (7). Although this video is almost nine years old, it captures many of the elements of process engineering discussed above and illustrates the effect of globalization on future manufacturing issues. It also demonstrates how modeling and simulation can assist in dealing with issues of off-specification products, improved control strategies, and abnormal situation management.

The growth in complexity of the automated plant environment can be mitigated through the use of artificial intelligence (AI) tools to assist operators and engineers. The concept of the thinking computer, popularized 32 years ago by Stanley Kubrick with the indelible image of HAL in the film "2001: A Space Odyssey," has inspired numerous predictions. While there have been significant improvements in AI technology since 1967, they have been modest compared with some of the predictions made then. Even so, in 1997 the IBM Deep Blue chess computer defeated one of the best chess players in the world (Garry Kasparov). Speculation certainly remains active. Kurzweil contends in his latest book (8) that, with the dramatic advances in computing power, machines will surpass humans in intelligence at some point in the next century, threatening the human race as a species. Human cognition will be augmented by downloading thoughts and memories into computers, moving to some form of machine consciousness. Such ideas are being hotly debated now, and it will be interesting to see whether the predictions from this latest group of futurists are more accurate than those made 30 years ago.



Implications for academia

There is no doubt that the revolution in computing and information technology during the past 40 years has changed the industrial world and process engineering. In contrast, the typical engineering educator, rather than being on the cutting edge of those developments, has been slow to incorporate new computer-based ideas into curriculum, teaching methodologies, and educational materials. With a few notable exceptions, computing and information technologies have not had a major impact on the chemical engineering canon. At most institutions, thermodynamics and transport phenomena are not taught any differently than they were 30 years ago. It is apparent that, in the eyes of a large number of faculty, the investment made by students to learn computer languages and programming does not yield any discernible advantage in the training of chemical engineers.

The questions of how extensively computers should be used and which computing skills should be taught in the undergraduate chemical engineering curriculum are difficult to answer for several reasons:

1. There is no generally agreed upon "core" set of computing skills or professional software tools necessary for being a productive engineer, either in academia or in industry.

2. Providing and maintaining a state-of-the-art facility for engineering computation is expensive, both in terms of capital and human resources.

3. Development of high-quality computer-based lessons is quite costly in terms of faculty and staff time, and faculty are challenged to stay up-to-date with the latest advances in information technology.

Recent improvements in professional software may help faculty stay up-to-date. For instance, a so-called meta-computing approach uses software modules, does not require student or faculty knowledge of FORTRAN, and is interactive via a graphical user interface (GUI). MATLAB is a good example of this approach. Automatic generation of html statements is now possible with Microsoft Word, and incorporation of audio and video via the World Wide Web will become quite straightforward in the future, making distance learning at the desktop a real possibility.

To keep up-to-date computer tools in the hands of faculty, staff, and students, educational institutions and their departments must adopt life cycle planning for hardware and software. This differs from the approach today at many universities — faculty and departments must fend for themselves due to inconsistent funding models for new technology. Information technology costs are approaching 10–20% of total costs in corporations today, and university cost structures are expected to follow suit. Training must be given a high priority and encouraged as part of faculty development programs. Universities also must provide release time for educators to develop content for technology-enhanced learning. Administrators must adjust the faculty reward system to offer incentives for introducing technology in teaching. The World Wide Web gives educators a new publishing vehicle for their lecture material and, eventually, for research publications in peer-reviewed on-line journals. These topics deserve considerable discussion, but that is the subject of another presentation (9). CEP

Literature Cited

1. Naisbitt, J., "Megatrends," Warner, New York (1982).
2. Keller, G. E., and P. F. Bryan, "Process Engineering: Moving in New Directions," Chem. Eng. Progress, 96 (1), pp. 41–50 (Jan. 2000).
3. Trainham, J. A., "Benefits of Fundamental Modeling and the Need for Improved Simulation Tools and Modeling Environment," internal report, DuPont, Wilmington, DE (July 1996).
4. "Technology Vision 2020: The U.S. Chemical Industry," AIChE, New York (1996). (Available on the Web at http://www.chem.purdue.edu/v2020.)
5. Qin, S. J., and T. A. Badgwell, "An Overview of Industrial Model Predictive Control Technology," AIChE Symp. Ser., 93 (316), pp. 232–256 (1997).
6. Edgar, T. F., G. V. Reklaitis, and D. A. Dixon, "Computational Needs of the Chemical Industry," Proc., Workshop on Impact of Advances in Comp. and Comm. Techn. on Chem. Sci. and Tech., Nat. Acad. Press, Washington, DC (1999).
7. "The Edge," video developed by DuPont and Digital Equipment (1991).
8. Kurzweil, R., "The Age of Spiritual Machines: When Computers Exceed Human Intelligence," Viking Press, New York (1999).
9. Edgar, T. F., "Information Technology in Chemical Engineering Education: Evolution or Revolution," Chem. Acad. Lecture, Univ. of Missouri, Rolla (Apr. 1999).

Related Web Site

Technology Vision 2020: www.chem.purdue.edu/v2020.

Acknowledgment

This article is based on the 1999 Phillips Lecture, sponsored by Phillips Petroleum Co., at Oklahoma State University.

T. F. EDGAR is Associate Vice President of Academic Computing and Instructional Technology Services at the University of Texas at Austin, and holds the George T. and Gladys H. Abell Chair in Engineering (Edgar@mail.utexas.edu). He has concentrated his academic work in process modeling, control, and optimization, and has written three books and over 200 articles and book chapters. He received a BS from the Univ. of Kansas, and a PhD from Princeton Univ., both in chemical engineering. A Fellow of AIChE, he served as the Institute's President in 1997, and has received AIChE's Colburn Award and Computing in Chemical Engineering Award.
