Processes, Volume 5, Issue 3 (September 2017) – 19 articles

Cover Story: Datasets with missing values arising from causes such as sensor failure, inconsistent sampling rates, and merging data from different systems are common in the process industry. Methods for handling missing data are typically applied during data pre-processing, but can also be applied during model building. This article considers missing data within the context of principal component analysis (PCA), a method that has widespread industrial application for multivariate statistical process control. Algorithms for applying PCA to datasets with missing values are reviewed, and a case study using the Tennessee Eastman process is presented to demonstrate the performance of the algorithms.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
Article
On-Line Dynamic Data Reconciliation in Batch Suspension Polymerizations of Methyl Methacrylate
by Jamille C. Coimbra, Príamo A. Melo, Diego M. Prata and José Carlos Pinto
Processes 2017, 5(3), 51; https://doi.org/10.3390/pr5030051 - 5 Sep 2017
Cited by 6 | Viewed by 5996
Abstract
A phenomenological model was developed to describe the dynamic evolution of the batch suspension polymerization of methyl methacrylate in terms of reactor temperature, pressure, concentrations and molecular properties of the final polymer. Then, the phenomenological model was used as a process constraint in dynamic data reconciliation procedures, which allowed for the successful monitoring of reaction variables in real-time and on-line. The obtained results indicate that heat transfer coefficients change significantly during the reaction time and from batch to batch, exerting a tremendous impact on the process operation. Obtained results also indicate that it can be difficult to attain thermodynamic equilibrium conditions in this system, because of the continuous condensation of evaporated monomer and the large mass transfer resistance offered by the viscous suspended droplets. Full article
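As a quick illustration of the reconciliation idea described above, the sketch below performs a steady-state, weighted least-squares data reconciliation against a single linear mass balance. It is only a minimal illustration: the flows, variances, and constraint are hypothetical, and the paper itself uses a dynamic phenomenological polymerization model as the constraint.

```python
import numpy as np

# Minimal steady-state data reconciliation sketch (not the paper's dynamic
# MMA-polymerization model): adjust measured flows so they satisfy a linear
# mass balance A x = 0, weighting the adjustments by measurement variance.
# The flows, variances, and the single mixing-node balance are hypothetical.

y = np.array([101.3, 48.2, 151.1])          # measured flows: F1, F2, F3 (kg/h)
V = np.diag([2.0, 1.0, 3.0]) ** 2           # measurement variances
A = np.array([[1.0, 1.0, -1.0]])            # constraint: F1 + F2 - F3 = 0

# Closed-form solution of  min (x - y)' V^-1 (x - y)  subject to  A x = 0
x_hat = y - V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ y)

print("reconciled flows:", x_hat)                   # satisfies A @ x_hat ~ 0
print("constraint residual:", (A @ x_hat).item())
```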
Review
Synthesis of Water-Soluble Group 4 Metallocene and Organotin Polyethers and Their Ability to Inhibit Cancer
by Charles E. Carraher, Michael R. Roner, Jessica Frank, Alica Moric-Johnson, Lindsey C. Miller, Kendra Black, Paul Slawek, Francesca Mosca, Jeffrey D. Einkauf and Floyd Russell
Processes 2017, 5(3), 50; https://doi.org/10.3390/pr5030050 - 1 Sep 2017
Cited by 2 | Viewed by 6216
Abstract
Water-soluble metallocene and organotin-containing polyethers were synthesized employing interfacial polycondensation. The reaction involved various chain lengths of poly(ethylene glycol), and produced water-soluble polymers in decent yield. Commercially available reactants were used to allow for easy scale up. The polymers exhibited a decent ability to inhibit a range of cancer cell lines, including two pancreatic cancer cell lines. This approach should allow the synthesis of a wide variety of other water-soluble polymers. Full article
(This article belongs to the Special Issue Water Soluble Polymers)
Article
Optimal Experimental Design for Parameter Estimation of an IL-6 Signaling Model
by Andrew Sinkoe and Juergen Hahn
Processes 2017, 5(3), 49; https://doi.org/10.3390/pr5030049 - 1 Sep 2017
Cited by 18 | Viewed by 5802
Abstract
IL-6 signaling plays an important role in inflammatory processes in the body. While a number of models for IL-6 signaling are available, the parameters associated with these models vary from case to case as they are non-trivial to determine. In this study, optimal experimental design is utilized to reduce the parameter uncertainty of an IL-6 signaling model consisting of ordinary differential equations, thereby increasing the accuracy of the estimated parameter values and, potentially, the model itself. The D-optimality criterion, operating on the Fisher information matrix and, separately, on a sensitivity matrix computed from the Morris method, was used as the objective function for the optimal experimental design problem. Optimal input functions for model parameter estimation were identified by solving the optimal experimental design problem, and the resulting input functions were shown to significantly decrease parameter uncertainty in simulated experiments. Interestingly, the determined optimal input functions took on the shape of PRBS signals even though there were no restrictions on their nature. Future work should corroborate these findings by applying the determined optimal experimental design on a real experiment. Full article
(This article belongs to the Special Issue Biological Networks)
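For readers unfamiliar with the D-optimality criterion mentioned in the abstract, the sketch below scores two candidate sampling schedules by the log-determinant of a Fisher information matrix built from parameter sensitivities. The toy two-parameter model, schedules, and noise level are hypothetical, not the IL-6 ODE system used in the paper.

```python
import numpy as np

# Minimal D-optimality sketch: rank two candidate sampling schedules for a toy
# response y = a*(1 - exp(-b*t)) by log det of the Fisher information matrix
# assembled from analytic parameter sensitivities. Everything here is a
# hypothetical stand-in for the paper's IL-6 signaling model.

def sensitivities(t, a=2.0, b=0.5):
    # dy/da and dy/db for the toy model
    return np.column_stack([1.0 - np.exp(-b * t), a * t * np.exp(-b * t)])

def log_det_fim(t, sigma=0.1):
    S = sensitivities(np.asarray(t, dtype=float))
    fim = S.T @ S / sigma**2
    return np.linalg.slogdet(fim)[1]

design_a = [1, 2, 3, 4, 5]        # evenly spaced samples
design_b = [0.5, 1, 2, 6, 10]     # samples covering the transient and plateau

for name, d in [("evenly spaced", design_a), ("spread", design_b)]:
    print(f"{name:>14}: log det(FIM) = {log_det_fim(d):.2f}")
```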
Review
Perspectives on Resource Recovery from Bio-Based Production Processes: From Concept to Implementation
by Isuru A. Udugama, Seyed Soheil Mansouri, Aleksandar Mitic, Xavier Flores-Alsina and Krist V. Gernaey
Processes 2017, 5(3), 48; https://doi.org/10.3390/pr5030048 - 21 Aug 2017
Cited by 26 | Viewed by 9428
Abstract
Recovering valuable compounds from waste streams of bio-based production processes is in line with the circular economy paradigm, and is achievable by implementing “simple-to-use” and well-established process separation technologies. Such solutions are acceptable from industrial, economic and environmental points of view, implying relatively easy future implementation on pilot- and full-scale levels in the bio-based industry. Reviewing such technologies is therefore the focus here. Considerations about technology readiness level (TRL) and Net Present Value (NPV) are included in the review, since TRL and NPV contribute significantly to the techno-economic evaluation of future and promising process solutions. Based on the present review, a qualitative guideline for resource recovery from bio-based production processes is proposed. Finally, future approaches and perspectives toward identification and implementation of suitable resource recovery units for bio-based production processes are discussed. Full article
(This article belongs to the Special Issue Biofilm Processes)
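Since the review leans on Net Present Value (NPV) for ranking candidate recovery technologies, a minimal NPV helper is sketched below; the discount rate, capital cost, and cash flows are hypothetical placeholders rather than figures from the review.

```python
# Minimal NPV sketch for screening a resource-recovery unit. All numbers are
# hypothetical placeholders, not values from the review.

def npv(rate, capital_cost, annual_cash_flows):
    """Net present value of an up-front investment followed by yearly cash flows."""
    return -capital_cost + sum(
        cf / (1.0 + rate) ** (year + 1)
        for year, cf in enumerate(annual_cash_flows)
    )

# e.g., a 1.2 M EUR unit returning 250 k EUR/year for 10 years at a 10% discount rate
print(f"NPV = {npv(0.10, 1.2e6, [250e3] * 10):,.0f} EUR")
```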
Article
A Long-Short Term Memory Recurrent Neural Network Based Reinforcement Learning Controller for Office Heating Ventilation and Air Conditioning Systems
by Yuan Wang, Kirubakaran Velswamy and Biao Huang
Processes 2017, 5(3), 46; https://doi.org/10.3390/pr5030046 - 18 Aug 2017
Cited by 113 | Viewed by 14487
Abstract
Energy optimization in buildings by controlling the Heating Ventilation and Air Conditioning (HVAC) system is being researched extensively. In this paper, a model-free actor-critic Reinforcement Learning (RL) controller is designed using a variant of artificial recurrent neural networks called Long-Short-Term Memory (LSTM) networks. Optimization of thermal comfort alongside energy consumption is the goal in tuning this RL controller. The test platform, our office space, is designed using SketchUp. Using OpenStudio, the HVAC system is installed in the office. The control schemes (ideal thermal comfort, a traditional control and the RL control) are implemented in MATLAB. Using the Building Control Virtual Test Bed (BCVTB), the control of the thermostat schedule during each sample time is implemented for the office in EnergyPlus alongside local weather data. Results from training and validation indicate that the RL controller improves thermal comfort by an average of 15% and energy efficiency by an average of 2.5% as compared to other strategies mentioned. Full article
(This article belongs to the Collection Process Data Analytics)
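The controller described above combines an actor-critic reinforcement learning scheme with LSTM networks. The sketch below shows only the general shape of such an LSTM actor (state history in, bounded setpoint adjustment out) in PyTorch; the layer sizes, input variables, and action scaling are hypothetical and do not reproduce the architecture reported in the paper.

```python
import torch
import torch.nn as nn

# Minimal sketch of an LSTM-based actor network of the kind used in
# actor-critic HVAC control. Sizes and inputs are hypothetical.

class LSTMActor(nn.Module):
    def __init__(self, n_state=4, hidden=32, n_action=1):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_state, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_action)

    def forward(self, state_seq):
        # state_seq: (batch, time, n_state), e.g. zone temp, outdoor temp,
        # occupancy, current setpoint over the last few sample times
        out, _ = self.lstm(state_seq)
        last = out[:, -1, :]                  # hidden state at the latest sample
        return torch.tanh(self.head(last))    # bounded setpoint adjustment in [-1, 1]

actor = LSTMActor()
history = torch.randn(1, 12, 4)               # one rollout: 12 past samples
print(actor(history))                         # e.g. tensor([[0.07]])
```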
Article
Characterizing Gene and Protein Crosstalks in Subjects at Risk of Developing Alzheimer’s Disease: A New Computational Approach
by Kanchana Padmanabhan, Kelly Nudelman, Steve Harenberg, Gonzalo Bello, Dongwha Sohn, Katie Shpanskaya, Priyanka Tiwari Dikshit, Pallavi S. Yerramsetty, Rudolph E. Tanzi, Andrew J. Saykin, Jeffrey R. Petrella, P. Murali Doraiswamy, Nagiza F. Samatova and Alzheimer’s Disease Neuroimaging Initiative
Processes 2017, 5(3), 47; https://doi.org/10.3390/pr5030047 - 17 Aug 2017
Cited by 2 | Viewed by 6979
Abstract
Alzheimer’s disease (AD) is a major public health threat; however, despite decades of research, the disease mechanisms are not completely understood, and there is a significant dearth of predictive biomarkers. The availability of systems biology approaches has opened new avenues for understanding disease mechanisms at a pathway level. However, to the best of our knowledge, no prior study has characterized the nature of pathway crosstalks in AD, or examined their utility as biomarkers for diagnosis or prognosis. In this paper, we build the first computational crosstalk model of AD incorporating genetics, antecedent knowledge, and biomarkers from a national study to create a generic pathway crosstalk reference map and to characterize the nature of genetic and protein pathway crosstalks in mild cognitive impairment (MCI) subjects. We perform initial studies of the utility of incorporating these crosstalks as biomarkers for assessing the risk of MCI progression to AD dementia. Our analysis identified Single Nucleotide Polymorphism-enriched pathways representing six of the seven Kyoto Encyclopedia of Genes and Genomes pathway categories. Integrating pathway crosstalks as a predictor improved the accuracy by 11.7% compared to standard clinical parameters and apolipoprotein E ε4 status alone. Our findings highlight the importance of moving beyond discrete biomarkers to studying interactions among complex biological pathways. Full article
(This article belongs to the Special Issue Biological Networks)
Article
Data Visualization and Visualization-Based Fault Detection for Chemical Processes
by Ray C. Wang, Michael Baldea and Thomas F. Edgar
Processes 2017, 5(3), 45; https://doi.org/10.3390/pr5030045 - 14 Aug 2017
Cited by 6 | Viewed by 7635
Abstract
Over the years, there has been a consistent increase in the amount of data collected by systems and processes in many different industries and fields. Simultaneously, there is a growing push towards revealing and exploiting the information contained therein. The chemical process industry is one such field, with high-volume and high-dimensional time series data. In this paper, we present a unified overview of the application of recently developed data visualization concepts to fault detection in the chemical industry. We consider three common types of processes and compare visualization-based fault detection performance to currently used methods. Full article
(This article belongs to the Collection Process Data Analytics)
Review
Comparison of CO2 Capture Approaches for Fossil-Based Power Generation: Review and Meta-Study
by Thomas A. Adams II, Leila Hoseinzade, Pranav Bhaswanth Madabhushi and Ikenna J. Okeke
Processes 2017, 5(3), 44; https://doi.org/10.3390/pr5030044 - 14 Aug 2017
Cited by 63 | Viewed by 12892
Abstract
This work is a meta-study of CO2 capture processes for coal and natural gas power generation, including technologies such as post-combustion solvent-based carbon capture, the integrated gasification combined cycle process, oxyfuel combustion, membrane-based carbon capture processes, and solid oxide fuel cells. A literature survey of recent techno-economic studies was conducted, compiling relevant data on costs, efficiencies, and other performance metrics. The data were then converted in a consistent fashion to a common standard (such as a consistent net power output, country of construction, currency, base year of operation, and captured CO2 pressure) such that a meaningful and direct comparison of technologies can be made. The processes were compared against a standard status quo power plant without carbon capture to compute metrics such as cost of CO2 emissions avoided to identify the most promising designs and technologies to use for CO2 emissions abatement. Full article
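The key comparison metric in this kind of meta-study, the cost of CO2 emissions avoided, relates the extra levelized cost of a capture plant to the emissions it saves relative to a reference plant without capture. A minimal sketch with hypothetical numbers (not values from the paper):

```python
# Cost of CO2 emissions avoided: compare a capture plant to a reference plant
# without capture. The example values below are hypothetical placeholders.

def cost_of_co2_avoided(lcoe_capture, lcoe_ref, emis_capture, emis_ref):
    """$/tonne CO2 avoided, from levelized costs ($/MWh) and emission
    intensities (tonne CO2/MWh) of the capture and reference plants."""
    return (lcoe_capture - lcoe_ref) / (emis_ref - emis_capture)

# e.g. a post-combustion capture plant vs. a coal reference plant
print(cost_of_co2_avoided(lcoe_capture=105.0, lcoe_ref=70.0,
                          emis_capture=0.10, emis_ref=0.80))   # -> 50.0 $/t
```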
Article
Rheology of Green Plasticizer/Poly(vinyl chloride) Blends via Time–Temperature Superposition
by Roya Jamarani, Hanno C. Erythropel, Daniel Burkat, James A. Nicell, Richard L. Leask and Milan Maric
Processes 2017, 5(3), 43; https://doi.org/10.3390/pr5030043 - 7 Aug 2017
Cited by 22 | Viewed by 8361
Abstract
Plasticizers are commonly added to poly(vinyl chloride) (PVC) and other brittle polymers to improve their flexibility and processing properties. Phthalate plasticizers such as di(2-ethylhexyl phthalate) (DEHP) are the most common PVC plasticizers and have recently been linked to a wide range of developmental and reproductive toxicities in mammals. Our group has developed several replacement compounds that have good biodegradation kinetics, low toxicity profiles, and comparable plasticizer properties to DEHP. Knowledge of the rheology of PVC–plasticizer blends at elevated temperatures is crucial for understanding and predicting the behavior of the compounds during processing. In this work, the time–temperature profiles of PVC blended with our replacement green plasticizers—succinates, maleates, and dibenzoates, of varying alkyl chain length—are compared to blends prepared with DEHP and a commercially available non-phthalate plasticizer, di(isononyl cyclohexane-1,2-dicarboxylate) (Hexamoll® DINCH®). The relationship between the plasticizer molecular structure and viscoelastic response was examined by applying time–temperature superposition. All compounds except the diethyl esters showed a comparable viscoelastic response to DEHP and Hexamoll® DINCH®, and dihexyl succinate exhibited the most effective reduction of the storage modulus G′. All of the dibenzoate blends exhibited a lower stiffness than the DEHP blends. These experiments help to show that the green plasticizers described herein are viable replacements for DEHP, providing a less toxic alternative with comparable processing and rheological performance. Full article
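Time–temperature superposition, as used above, collapses isothermal frequency sweeps onto a master curve at a reference temperature by applying horizontal shift factors. The sketch below uses WLF-form shift factors with the classic "universal" constants; the constants and the G′ data are hypothetical and are not the PVC/plasticizer measurements from this study.

```python
import numpy as np

# Minimal time-temperature superposition sketch: shift isothermal G' sweeps
# onto a master curve at a reference temperature using WLF-type shift factors.
# Constants and data are hypothetical, not from this study.

def wlf_shift(T, T_ref=100.0, C1=8.86, C2=101.6):
    """log10 of the horizontal shift factor a_T (WLF form)."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

freqs = np.logspace(-1, 2, 5)                  # rad/s, same grid at each T
sweeps = {                                     # temperature (deg C) -> measured G' (Pa)
    90.0:  np.array([1.2e5, 3.0e5, 7.5e5, 1.6e6, 2.9e6]),
    100.0: np.array([6.0e4, 1.5e5, 4.0e5, 9.0e5, 1.8e6]),
    110.0: np.array([3.0e4, 8.0e4, 2.2e5, 5.0e5, 1.1e6]),
}

for T, G_prime in sweeps.items():
    reduced_freq = freqs * 10.0 ** wlf_shift(T)    # shift onto the 100 deg C curve
    print(T, np.round(np.log10(reduced_freq), 2), np.round(np.log10(G_prime), 2))
```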
Review
Design of Experiments for Control-Relevant Multivariable Model Identification: An Overview of Some Basic Recent Developments
by Shobhit Misra, Mark Darby, Shyam Panjwani and Michael Nikolaou
Processes 2017, 5(3), 42; https://doi.org/10.3390/pr5030042 - 3 Aug 2017
Cited by 4 | Viewed by 6076
Abstract
The effectiveness of model-based multivariable controllers depends on the quality of the model used. In addition to satisfying standard accuracy requirements for model structure and parameter estimates, a model to be used in a controller must also satisfy control-relevant requirements, such as integral controllability. Design of experiments (DOE) that produces data from which control-relevant models can be accurately estimated may differ from standard DOE. The purpose of this paper is to emphasize this basic principle and to summarize some fundamental results obtained in recent years for DOE in two important cases: accurate estimation of the order of a multivariable model, and efficient identification of a model that satisfies integral controllability; both are important for the design of robust model-based controllers. For both cases, we provide an overview of recent results that can be easily incorporated by the end user in related DOE. Computer simulations illustrate outcomes to be anticipated. Finally, opportunities for further development are discussed. Full article
(This article belongs to the Collection Process Data Analytics)
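One control-relevant requirement named in the abstract, integral controllability, can be screened from steady-state gains: a commonly used necessary condition is that the eigenvalues of the true plant gain matrix multiplied by the inverse of the identified model's gain matrix have positive real parts. The sketch below applies that check to two hypothetical 2x2 gain matrices; it illustrates the condition only and is not taken from the paper.

```python
import numpy as np

# Minimal integral-controllability screening sketch: with steady-state gains G
# (plant) and G_hat (identified model), require the eigenvalues of
# G @ inv(G_hat) to have positive real parts. The matrices are hypothetical.

def integral_controllable(G, G_hat):
    eig = np.linalg.eigvals(G @ np.linalg.inv(G_hat))
    return bool(np.all(eig.real > 0)), eig

G_true = np.array([[2.0, 1.5],
                   [1.8, 2.2]])
G_good = np.array([[2.1, 1.4],      # model with small gain errors
                   [1.7, 2.3]])
G_bad = np.array([[2.0, 1.5],       # model with a badly estimated second row
                  [2.4, 1.6]])

for name, G_hat in [("good model", G_good), ("bad model", G_bad)]:
    ok, eig = integral_controllable(G_true, G_hat)
    print(f"{name}: eigenvalues {np.round(eig, 2)} -> integral controllable: {ok}")
```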
Article
Effects of Inoculum Type and Aeration Flowrate on the Performance of Aerobic Granular SBRs
by Mariele K. Jungles, Ángeles Val del Río, Anuska Mosquera-Corral, José Luis Campos, Ramón Méndez and Rejane H. R. Costa
Processes 2017, 5(3), 41; https://doi.org/10.3390/pr5030041 - 19 Jul 2017
Cited by 8 | Viewed by 6171
Abstract
Aerobic granular sequencing batch reactors (SBRs) are usually inoculated with activated sludge, which sometimes implies long start-up periods and high solids concentrations in the effluent due to the initial wash-out of the inoculum. In this work, the use of mature aerobic granules as inoculum to improve the start-up period was tested, but no clear differences were observed compared to a reactor inoculated with activated sludge. The effect of the aeration rate on both the physical properties of the granules and the reactor performance was also studied in a stable aerobic granular SBR. Increasing the aeration flow rate decreased the average diameter of the granules, which enhanced the COD and ammonia consumption rates due to the higher DO level and the larger aerobic fraction of the biomass. However, it also reduced the nitrogen removal efficiency, because the larger aerobic fraction worsened the denitrification capacity. Full article
Article
Development of Molecular Distillation Based Simulation and Optimization of Refined Palm Oil Process Based on Response Surface Methodology
by Noree Tehlah, Pornsiri Kaewpradit and Iqbal M. Mujtaba
Processes 2017, 5(3), 40; https://doi.org/10.3390/pr5030040 - 16 Jul 2017
Cited by 21 | Viewed by 10215
Abstract
The deodorization step of the refined palm oil process is simulated here using ASPEN HYSYS. In the absence of a library molecular distillation (MD) unit in ASPEN HYSYS, first, a single flash vessel is used to represent a falling-film MD process, which is simulated for a binary system taken from the literature, and the model predictions are compared with published work based on ASPEN PLUS and DISMOL. Second, the developed MD process is extended to simulate the deodorization process. A parameter estimation technique is used to estimate the Antoine parameters from literature data in order to calculate the pure-component vapor pressures. The model predictions are then validated against patented results for refining edible oil rich in natural carotenes and vitamin E, and the simulation results were found to be in good agreement, within a ±2% error of the patented results. Third, Response Surface Methodology (RSM) is employed to develop a non-linear second-order polynomial model for the deodorization process, and the effects of various operating parameters on the performance of the process are studied. Finally, an optimization framework is developed to maximize the concentrations of beta-carotene, tocopherol and free fatty acid while optimizing the feed flow rate, temperature and pressure subject to process constraints. The optimum feed flow rate, temperature, and pressure were determined as 1291 kg/h, 147 °C and 0.0007 kPa, respectively, and the concentration responses of beta-carotene, tocopherol and free fatty acid were found to be 0.000575, 0.000937 and 0.999840, respectively. Full article
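The Antoine-parameter estimation step mentioned above amounts to fitting A, B and C in log10(P) = A - B/(T + C) to vapor-pressure data. A minimal sketch with SciPy is shown below; the data points and initial guesses are hypothetical, not the carotene/tocopherol data used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal Antoine-parameter fit: estimate A, B, C in log10(P) = A - B/(T + C)
# from vapor-pressure data. The data points and initial guess are hypothetical.

def antoine(T, A, B, C):
    return A - B / (T + C)                        # returns log10(P)

T_data = np.array([120.0, 140.0, 160.0, 180.0, 200.0])        # deg C
P_data = np.array([0.0004, 0.0016, 0.0055, 0.0165, 0.0450])   # kPa

params, _ = curve_fit(antoine, T_data, np.log10(P_data),
                      p0=(5.0, 2000.0, 200.0), maxfev=10000)
A, B, C = params
print(f"A = {A:.2f}, B = {B:.1f}, C = {C:.1f}")
print("predicted P(150 deg C) =", 10 ** antoine(150.0, A, B, C), "kPa")
```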
Article
Big Data Analytics for Smart Manufacturing: Case Studies in Semiconductor Manufacturing
by James Moyne and Jimmy Iskandar
Processes 2017, 5(3), 39; https://doi.org/10.3390/pr5030039 - 12 Jul 2017
Cited by 200 | Viewed by 40495
Abstract
Smart manufacturing (SM) is a term generally applied to the improvement in manufacturing operations through integration of systems, linking of physical and cyber capabilities, and taking advantage of information including leveraging the big data evolution. SM adoption has been occurring unevenly across industries, thus there is an opportunity to look to other industries to determine solution and roadmap paths for industries such as biochemistry or biology. The big data evolution affords an opportunity for managing significantly larger amounts of information and acting on it with analytics for improved diagnostics and prognostics. The analytics approaches can be defined in terms of dimensions to understand their requirements and capabilities, and to determine technology gaps. The semiconductor manufacturing industry has been taking advantage of the big data and analytics evolution by improving existing capabilities such as fault detection, and supporting new capabilities such as predictive maintenance. For most of these capabilities: (1) data quality is the most important big data factor in delivering high quality solutions; and (2) incorporating subject matter expertise in analytics is often required for realizing effective on-line manufacturing solutions. In the future, an improved big data environment incorporating smart manufacturing concepts such as digital twin will further enable analytics; however, it is anticipated that the need for incorporating subject matter expertise in solution design will remain. Full article
(This article belongs to the Collection Process Data Analytics)
Article
Principal Component Analysis of Process Datasets with Missing Values
by Kristen A. Severson, Mark C. Molaro and Richard D. Braatz
Processes 2017, 5(3), 38; https://doi.org/10.3390/pr5030038 - 6 Jul 2017
Cited by 39 | Viewed by 12228
Abstract
Datasets with missing values arising from causes such as sensor failure, inconsistent sampling rates, and merging data from different systems are common in the process industry. Methods for handling missing data typically operate during data pre-processing, but can also occur during model building. This article considers missing data within the context of principal component analysis (PCA), which is a method originally developed for complete data that has widespread industrial application in multivariate statistical process control. Due to the prevalence of missing data and the success of PCA for handling complete data, several PCA algorithms that can act on incomplete data have been proposed. Here, algorithms for applying PCA to datasets with missing values are reviewed. A case study is presented to demonstrate the performance of the algorithms and suggestions are made with respect to choosing which algorithm is most appropriate for particular settings. An alternating algorithm based on the singular value decomposition achieved the best results in the majority of test cases involving process datasets. Full article
(This article belongs to the Collection Process Data Analytics)
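To give a flavor of the alternating, SVD-based family of algorithms that performed best in this study, the sketch below fills missing entries, fits a low-rank PCA model by SVD, re-estimates the missing entries from the model, and iterates. It is a generic illustration under simple assumptions, not a reproduction of any specific algorithm from the paper.

```python
import numpy as np

# Minimal alternating, SVD-based PCA-with-missing-values sketch: fill missing
# entries, fit a rank-k model, re-fill from the model, and iterate. This is a
# generic illustration, not any specific algorithm reviewed in the paper.

def pca_with_missing(X, n_components=2, n_iter=50):
    X = np.array(X, dtype=float)
    missing = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[missing] = np.take(col_means, np.nonzero(missing)[1])    # initial fill

    for _ in range(n_iter):
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mu
        X[missing] = X_hat[missing]            # update only the missing entries
    scores = (X - X.mean(axis=0)) @ Vt[:n_components].T
    return scores, Vt[:n_components], X

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 5))
data[rng.random(data.shape) < 0.1] = np.nan    # knock out ~10% of the entries
scores, loadings, completed = pca_with_missing(data)
print(loadings.shape, np.isnan(completed).any())   # (2, 5) False
```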
Article
Reduction of Dust Emission by Monodisperse System Technology for Ammonium Nitrate Manufacturing
by Maksym Skydanenko, Vsevolod Sklabinskyi, Saad Saleh and Shahzad Barghi
Processes 2017, 5(3), 37; https://doi.org/10.3390/pr5030037 - 3 Jul 2017
Cited by 14 | Viewed by 10464
Abstract
Prilling is a common process in the fertilizer industry, where the fertilizer melt is converted to droplets that fall, cool down and solidify in a countercurrent flow of air in a prilling tower. A vibratory granulator was used to investigate liquid jet breakup into droplets. The breakup of liquid jets subjected to a forced perturbation was investigated in the Rayleigh regime, where a mechanical vibration was applied in order to achieve the production of monodispersed particles. Images of the jet trajectory, breakup, and the formed drops were captured using a high-speed camera. A mathematical model for the liquid outflow conditions based on a transient two-dimensional Navier–Stokes equation was developed and solved analytically, and the correlations between the process parameters of the vibrator and the jet pressure that characterize their disintegration mode were identified. The theoretical predictions obtained from the correlations showed good agreement with the experimental results. Results of the experiments were used to specify the values of the process parameters of the vibration system, and to test them in the production environment in a mode of monodispersed jet disintegration. The vibration frequency was found to have a profound effect on the production of monodispersed particles. The results of experiments in a commercial-scale plant showed that the granulator design based on this study provided prills with a narrower size range compared to the conventional granulators, which resulted in a substantial reduction in dust emission. Full article
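As a worked example of the Rayleigh-regime numbers behind forced jet breakup, the sketch below computes the most-unstable disturbance wavelength (about 4.5 jet diameters), the vibration frequency that excites it, and the resulting drop size from a simple mass balance. The jet diameter and velocity are hypothetical, not the operating values from the paper.

```python
# Rayleigh-regime forced jet breakup: optimal disturbance wavelength, the
# forcing frequency that excites it, and the resulting drop diameter from a
# mass balance over one wavelength of jet. Jet values are hypothetical.

d_jet = 1.2e-3        # jet (orifice) diameter, m
v_jet = 2.0           # jet velocity, m/s

lam_opt = 4.508 * d_jet                    # Rayleigh optimum wavelength, m
f_opt = v_jet / lam_opt                    # forcing frequency for monodispersity, Hz

# one wavelength of jet collapses into one drop: (pi/4) d^2 lam = (pi/6) D^3
d_drop = (1.5 * d_jet**2 * lam_opt) ** (1.0 / 3.0)

print(f"optimal wavelength  = {lam_opt*1e3:.2f} mm")
print(f"forcing frequency   = {f_opt:.0f} Hz")       # ~370 Hz for these values
print(f"resulting drop size = {d_drop*1e3:.2f} mm")  # ~1.89 * d_jet
```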
Opinion
On the Use of Multivariate Methods for Analysis of Data from Biological Networks
by Troy Vargason, Daniel P. Howsmon, Deborah L. McGuinness and Juergen Hahn
Processes 2017, 5(3), 36; https://doi.org/10.3390/pr5030036 - 3 Jul 2017
Cited by 17 | Viewed by 7831
Abstract
Data analysis used for biomedical research, particularly analysis involving metabolic or signaling pathways, is often based upon univariate statistical analysis. One common approach is to compute means and standard deviations individually for each variable or to determine where each variable falls between upper and lower bounds. Additionally, p-values are often computed to determine if there are differences between data taken from two groups. However, these approaches ignore that the collected data are often correlated in some form, which may be due to these measurements describing quantities that are connected by biological networks. Multivariate analysis approaches are more appropriate in these scenarios, as they can detect differences in datasets that the traditional univariate approaches may miss. This work presents three case studies that involve data from clinical studies of autism spectrum disorder that illustrate the need for and demonstrate the potential impact of multivariate analysis. Full article
(This article belongs to the Special Issue Biological Networks)
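The core argument above, that correlated measurements can look unremarkable one variable at a time while differing clearly in combination, is easy to demonstrate. The sketch below compares per-variable t-tests with a two-sample Hotelling T² on synthetic, strongly correlated data; the data are hypothetical and unrelated to the clinical studies analyzed in the paper.

```python
import numpy as np
from scipy import stats

# Univariate t-tests vs. a two-sample Hotelling T^2 on a correlated pair of
# synthetic "measurements". The group shift runs against the correlation, so
# the variables typically look similar one at a time but differ jointly.

rng = np.random.default_rng(1)
cov = np.array([[1.0, 0.95], [0.95, 1.0]])          # strongly correlated pair
group_a = rng.multivariate_normal([0.0, 0.0], cov, size=40)
group_b = rng.multivariate_normal([0.15, -0.15], cov, size=40)

for j in range(2):
    t, p = stats.ttest_ind(group_a[:, j], group_b[:, j])
    print(f"variable {j}: univariate p = {p:.3f}")

# two-sample Hotelling T^2 with pooled covariance (p = 2 variables)
na, nb = len(group_a), len(group_b)
diff = group_a.mean(axis=0) - group_b.mean(axis=0)
pooled = ((na - 1) * np.cov(group_a.T) + (nb - 1) * np.cov(group_b.T)) / (na + nb - 2)
t2 = (na * nb) / (na + nb) * diff @ np.linalg.solve(pooled, diff)
f_stat = (na + nb - 3) / (2 * (na + nb - 2)) * t2    # T^2 converted to an F statistic
p_multi = stats.f.sf(f_stat, 2, na + nb - 3)
print(f"Hotelling T^2: p = {p_multi:.4f}")
```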
Article
Industrial Process Monitoring in the Big Data/Industry 4.0 Era: From Detection, to Diagnosis, to Prognosis
by Marco S. Reis and Geert Gins
Processes 2017, 5(3), 35; https://doi.org/10.3390/pr5030035 - 30 Jun 2017
Cited by 227 | Viewed by 22230
Abstract
We provide a critical outlook on the evolution of Industrial Process Monitoring (IPM) since its introduction almost 100 years ago. Several evolution trends that have been structuring IPM developments over this extended period of time are briefly reviewed, with more focus on data-driven approaches. We also argue that, besides such trends, the research focus has also evolved. The initial period was centred on optimizing IPM detection performance. More recently, root cause analysis and diagnosis gained importance, and a variety of approaches were proposed to expand IPM with this new and important monitoring dimension. We believe that, in the future, the emphasis will be to bring yet another dimension to IPM: prognosis. Some perspectives are put forward in this regard, including the strong interplay of the Process and Maintenance departments, hitherto managed as separate silos. Full article
(This article belongs to the Collection Process Data Analytics)
Review
Review of Field Development Optimization of Waterflooding, EOR, and Well Placement Focusing on History Matching and Optimization Algorithms
by Jackson Udy, Brigham Hansen, Sage Maddux, Donald Petersen, Spencer Heilner, Kevin Stevens, David Lignell and John D. Hedengren
Processes 2017, 5(3), 34; https://doi.org/10.3390/pr5030034 - 27 Jun 2017
Cited by 49 | Viewed by 10160
Abstract
This paper presents a review of history matching and oil field development optimization techniques with a focus on optimization algorithms. History matching algorithms are reviewed as a precursor to production optimization algorithms. Techniques for history matching and production optimization are reviewed including global and local methods. Well placement, well control, and combined well placement-control optimization using both secondary and tertiary oil production techniques are considered. Secondary and tertiary recovery techniques are commonly referred to as waterflooding and enhanced oil recovery (EOR), respectively. Benchmark models for comparison of methods are summarized while other applications of methods are discussed throughout. No single optimization method is found to be universally superior. Key areas of future work are combining optimization methods and integrating multiple optimization processes. Current challenges and future research opportunities for improved model validation and large scale optimization algorithms are also discussed. Full article
Article
Techno-Economic Assessment of Benzene Production from Shale Gas
by Salvador I. Pérez-Uresti, Jorge M. Adrián-Mendiola, Mahmoud M. El-Halwagi and Arturo Jiménez-Gutiérrez
Processes 2017, 5(3), 33; https://doi.org/10.3390/pr5030033 - 23 Jun 2017
Cited by 29 | Viewed by 11662
Abstract
The availability and low cost of shale gas has boosted its use as fuel and as a raw material to produce value-added compounds. Benzene is one of the chemicals that can be obtained from methane, and represents one of the most important compounds in the petrochemical industry. It can be synthesized via direct methane aromatization (DMA) or via indirect aromatization (using oxidative coupling of methane). DMA is a direct-conversion process, while indirect aromatization involves several stages. In this work, an economic, energy-saving, and environmental assessment for the production of benzene from shale gas using DMA as a reaction path is presented. A sensitivity analysis was conducted to observe the effect of the operating conditions on the profitability of the process. The results show that production of benzene using shale gas as feedstock can be accomplished with a high return on investment. Full article
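The profitability and sensitivity reasoning behind this assessment can be illustrated with a simple return-on-investment sweep over the feedstock price, sketched below. All throughputs, prices, and costs are hypothetical placeholders, not results from the paper.

```python
# Minimal return-on-investment sensitivity sketch: sweep the feedstock (shale
# gas) price and report ROI. All numbers are hypothetical placeholders.

def roi(benzene_t_per_y, benzene_price, gas_t_per_y, gas_price,
        other_opex, capital_investment):
    annual_profit = (benzene_t_per_y * benzene_price
                     - gas_t_per_y * gas_price - other_opex)
    return annual_profit / capital_investment

for gas_price in (120.0, 160.0, 200.0):        # $/tonne feedstock
    r = roi(benzene_t_per_y=100_000, benzene_price=900.0,
            gas_t_per_y=260_000, gas_price=gas_price,
            other_opex=25e6, capital_investment=250e6)
    print(f"gas at {gas_price:.0f} $/t -> ROI = {r:.1%}")
```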