Planning a Plant

2003 ◽  
Vol 125 (02) ◽  
pp. 42-45 ◽  
Author(s):  
Jean Thilmany

This article focuses on efficient planning of a manufacturing plant. Factory simulation software gives a sense of whether a bottleneck would occur if the factory process were laid out as planned: it extrapolates the problems of a particular line layout and forecasts the cost of a problem such as a bottleneck. Digital factory software allows manufacturers to simulate the factory layout digitally, in order to see how the plant would function under the proposed arrangement and to identify potential problems on the line. In addition to robots, employees themselves can be represented in the digitized version. In this way, engineers can work out where employees should stand on the line and design workstations that both optimize their movements and ensure they are not under any kind of ergonomic stress. Virtual factories can be planned jointly by people in different locations with the help of software that allows fly-throughs.
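A minimal sketch of the kind of capacity check such simulation tools automate is shown below; the station names, cycle times, and demand figure are invented for illustration and are not from the article.

```python
# Illustrative bottleneck check for a proposed line layout.
# Station names, cycle times, and the demand rate are assumed for this sketch.

from dataclasses import dataclass

@dataclass
class Station:
    name: str
    cycle_time_s: float   # seconds per unit at this station

def check_bottlenecks(stations, demand_rate_per_hr):
    """Compare each station's hourly capacity with the planned demand rate."""
    results = []
    for st in stations:
        capacity_per_hr = 3600.0 / st.cycle_time_s
        utilization = demand_rate_per_hr / capacity_per_hr
        results.append((st.name, capacity_per_hr, utilization))
    return results

if __name__ == "__main__":
    line = [Station("press", 45.0), Station("weld", 70.0), Station("paint", 50.0)]
    demand = 55.0  # planned units per hour
    for name, cap, util in check_bottlenecks(line, demand):
        flag = "BOTTLENECK" if util > 1.0 else "ok"
        print(f"{name:6s} capacity={cap:6.1f}/hr  utilization={util:5.2f}  {flag}")
```

Any station whose utilization exceeds 1.0 cannot keep up with the planned rate, which is the kind of finding the simulation surfaces before the layout is built.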

2020 ◽  
Vol 36 (2) ◽  
pp. 49-57
Author(s):  
István Vajna ◽  
Anita Tangl

The case study shows the re-optimization of an initial new factory layout design with Value Stream Design (VSD). VSD is a quantitative method whose final goal is a waste-free, optimized material flow. The primary goals of the arrangement are to reduce transportation distances and frequencies and to optimize human workload. Initially, the whole factory shop floor layout had already been designed around a push concept. The plans were made by the production management, logistics, and engineering departments at the headquarters of the multinational automotive company, based on the VDI 2870 holistic concept linking strategy to tactics and operations. On the first layout (v1), hundreds of machines were placed and arranged by CAD (computer-aided design) engineers to fit the space. The factory building offers 15,000 m2 of empty shop floor awaiting the final decisions on equipment. The factory production area was divided into six main production areas (P1-P6), corresponding to the complexity of the product families. The output of each production area can be a finished product (FP) or a semi-finished product (SFP) for the next production area. To validate the whole factory layout it was necessary to involve lean experts, who identified disadvantages and constraints. Without lean implementation, the company's transportation waste would have been 49% higher per year. The importance of Value Stream Design is rising to a higher level now that the whole global business has changed, the labor force fluctuates, and reductions in cost and delivery time play a vital role in the company's profit and future. The research shows that when decisions are based on real data and facts, controlling and management can act in time. Using VSD and re-evaluating transportation routes, frequencies, and costs is the first step toward a smooth, low-cost material flow (v2). This development enabled the company to move from push to pull production through a mixed production system. Originally, the production flow ran clockwise; it was changed step by step to mixed production by eliminating work-in-process storage and by implementing FIFO lanes, milk runs, and kanban. The total annual transportation distance was reduced from 4,905,000 m between the ramp-up and serial production periods. The warehouse storage size was reduced to 50% and the implementation cost from €75,000 to €32,500. Eliminating work-in-process storage along the production lines made it possible to open a new two-way transportation road that will also serve AGV operations in Industry 4.0 projects. Due to the decreased lead time, logistics labor productivity increased by 45%. Besides taking measurements for the VSD, Value Stream Mapping was used as a lean tool, together with a custom-designed VSD evaluation and simulation software. The VSD team's cooperative actions reduced the evaluation and validation time by 65% compared with the initial plan. The implementations were evaluated from the ramp-up phase to the first serial production runs, and the results were confirmed by controlling and management.
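As a rough illustration of the distance-and-frequency evaluation that underlies the v1/v2 comparison, the sketch below totals annual transportation effort for two layout variants; the routes, distances, trip frequencies, and working days are placeholder assumptions, not the study's data.

```python
# Illustrative comparison of annual transportation distance for two layout
# variants (v1 push layout vs. v2 re-optimized flow). All routes, distances,
# and frequencies below are placeholders, not figures from the case study.

def annual_transport_distance(routes, working_days=250):
    """Sum distance (m) x trips per day x working days per year over all routes."""
    return sum(dist_m * trips_per_day * working_days
               for dist_m, trips_per_day in routes.values())

# route: (one-way distance in m, trips per day)
v1 = {"P1->WH": (120, 40), "WH->P2": (200, 30), "P2->P3": (90, 60)}
v2 = {"P1->P2": (60, 40), "P2->P3": (40, 60)}   # direct FIFO lanes, no warehouse detour

d1 = annual_transport_distance(v1)
d2 = annual_transport_distance(v2)
print(f"v1: {d1:,.0f} m/yr, v2: {d2:,.0f} m/yr, saving: {100 * (d1 - d2) / d1:.0f}%")
```

Re-running such a tally for every candidate layout is the quantitative core of comparing transportation waste before and after the VSD re-design.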


Author(s):  
J.N. Ramsey ◽  
D.P. Cameron ◽  
F.W. Schneider

As computer components become smaller, the analytical methods used to examine them and the material handling techniques must become more sensitive and more sophisticated. We have used microbulldozing and microchiseling in conjunction with scanning electron microscopy, replica electron microscopy, and microprobe analysis to study actual and potential problems with developmental and pilot-line devices. Foreign matter, corrosion, etc., in specific locations are mechanically loosened from their substrates, removed by “extraction replication,” and examined in the appropriate instrument. The mechanical loosening is done in a controlled manner using a microhardness tester; we use the attachment designed for our Reichert metallograph. The working tool is a pyramid-shaped diamond (a Knoop indenter) which can be pushed into the specimen with a controlled pressure and in a specific location.


1991 ◽  
Vol 4 (02) ◽  
pp. 38-45 ◽  
Author(s):  
F. Baumgart

Summary
The so-called “mixing” of implants and instruments from different producers entails certain risks. The use of standardized implant materials (e.g. stainless steel ISO 5832/1) from different producers is necessary, but it is not sufficient to justify the use of an osteosynthesis plate from one source and a bone screw from another. The design, dimensions, tolerances, manufacturing procedure, quality controls, and application technique of the instruments and implants also vary according to make. This can lead to damage, failure, or fracture of the biomechanical system called “osteosynthesis” and hence to the failure of the treatment undertaken. In the end, it is the patient who pays for these problems. Some examples also illustrate the potential problems for the staff and institutions involved. The use of a unique, consistent, well-tested, and approved set of implants and instruments is strongly recommended to avoid any additional risk.


TAPPI Journal ◽  
2014 ◽  
Vol 13 (8) ◽  
pp. 65-78 ◽  
Author(s):  
W.B.A. (SANDY) SHARP ◽  
W.J. JIM FREDERICK ◽  
JAMES R. KEISER ◽  
DOUGLAS L. SINGBEIL

The efficiencies of biomass-fueled power plants are much lower than those of coal-fueled plants because they restrict their exit steam temperatures to inhibit fireside corrosion of superheater tubes. However, restricting the temperature of a given mass of steam produced by a biomass boiler decreases the amount of power that can be generated from this steam in the turbine generator. This paper examines the relationship between the temperature of superheated steam produced by a boiler and the quantity of power that it can generate. The thermodynamic basis for this relationship is presented, and the value of the additional power that could be generated by operating with higher superheated steam temperatures is estimated. Calculations are presented for five plants that produce both steam and power. Two are powered by black liquor recovery boilers and three by wood-fired boilers. Steam generation parameters for these plants were supplied by industrial partners. Calculations using thermodynamics-based plant simulation software show that the value of the increased power that could be generated in these units by increasing superheated steam temperatures 100°C above current operating conditions ranges between US$2,410,000 and US$11,180,000 per year. The costs and benefits of achieving higher superheated steam conditions in an individual boiler depend on local plant conditions and the price of power. However, the magnitude of the increased power that can be generated by increasing superheated steam temperatures is so great that it appears to justify the cost of corrosion-mitigation methods such as installing corrosion-resistant materials costing far more than current superheater alloys; redesigning biomass-fueled boilers to remove the superheater from the flue gas path; or adding chemicals to remove corrosive constituents from the flue gas. The most economic pathways to higher steam temperatures will very likely involve combinations of these methods. Particularly attractive approaches include installing more corrosion-resistant alloys in the hottest superheater locations, and relocating the superheater from the flue gas path to an externally-fired location or to the loop seal of a circulating fluidized bed boiler.
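A back-of-envelope version of the steam-temperature-to-power-value relationship can be sketched as follows; the steam flow, enthalpy values, work-recovery fraction, operating hours, and power price are illustrative assumptions, not figures from the paper or its industrial partners.

```python
# Rough estimate of the extra power and revenue available from a higher
# superheated steam temperature. Steam flow, enthalpies (approximate steam-table
# values at 6 MPa), work-recovery fraction, and power price are assumptions.

def extra_power_value(m_dot_kg_s, h_low_kj_kg, h_high_kj_kg,
                      work_fraction, power_price_usd_mwh, hours_per_year=8000):
    """Value of raising turbine inlet enthalpy by superheating to a higher temperature."""
    delta_h = h_high_kj_kg - h_low_kj_kg                       # extra inlet enthalpy, kJ/kg
    extra_mw = m_dot_kg_s * delta_h * work_fraction / 1000.0   # kW -> MW
    return extra_mw, extra_mw * power_price_usd_mwh * hours_per_year

if __name__ == "__main__":
    # ~60 kg/s of steam at 6 MPa, inlet enthalpy rising from roughly 3420 kJ/kg
    # (about 500 degC) to roughly 3660 kJ/kg (about 600 degC), ~75% of the extra
    # enthalpy recovered as electrical output, power valued at US$60/MWh.
    mw, usd = extra_power_value(60.0, 3420.0, 3660.0, 0.75, 60.0)
    print(f"extra output ~{mw:.1f} MW, worth ~US${usd:,.0f}/yr")
```

With these placeholder inputs the estimate lands at roughly US$5 million per year, the same order of magnitude as the range reported for the five plants studied.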


TAPPI Journal ◽  
2009 ◽  
Vol 8 (1) ◽  
pp. 4-11
Author(s):  
MOHAMED CHBEL ◽  
LUC LAPERRIÈRE

Pulp and paper processes frequently present nonlinear behavior, which means that process dynamics change with the operating points. These nonlinearities can challenge process control. PID controllers are the most popular controllers because they are simple and robust. However, a fixed set of PID tuning parameters is generally not sufficient to optimize control of the process. Problems related to nonlinearities such as sluggish or oscillatory response can arise in different operating regions. Gain scheduling is a potential solution. In processes with multiple control objectives, the control strategy must further evaluate loop interactions to decide on the pairing of manipulated and controlled variables that minimizes the effect of such interactions and hence optimizes the controller’s performance and stability. Using the CADSIM Plus™ commercial simulation software, we developed a Jacobian simulation module that enables automatic bumps on the manipulated variables to calculate process gains at different operating points. These gains can be used in controller tuning. The module also enables the control system designer to evaluate loop interactions in a multivariable control system by calculating the Relative Gain Array (RGA) matrix, of which the Jacobian is an essential part.
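The RGA step described above can be sketched in a few lines: collect the bump-test steady-state gains into a Jacobian matrix G and form the RGA as the elementwise product of G with the transpose of its inverse. The 2x2 gain values below are hypothetical placeholders, not output from CADSIM Plus.

```python
# Relative Gain Array (RGA) from a process gain (Jacobian) matrix.
# Gain values here are illustrative placeholders; in the module described
# above they would come from automated bump tests on the simulated process.

import numpy as np

def relative_gain_array(G):
    """RGA = G elementwise-multiplied by the transpose of its inverse."""
    G = np.asarray(G, dtype=float)
    return G * np.linalg.inv(G).T

# Hypothetical 2x2 steady-state gains: rows = controlled variables,
# columns = manipulated variables.
G = [[2.0, 0.5],
     [0.8, 1.5]]
print(relative_gain_array(G))
# Diagonal RGA elements near 1 suggest pairing CV1 with MV1 and CV2 with MV2,
# i.e. limited loop interaction for that pairing.
```

Each RGA row sums to 1, so elements far from 1 on the chosen pairing flag interactions the designer should account for.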


Author(s):  
Kazunori KADOWAKI ◽  
Daiko MIUMA ◽  
Keiichiro MIZUTA ◽  
Takanori MIHARA
