Application and Evaluation of an SPD-Based Logistics Management Model for Medical Consumables in Clinical Nursing Departments

Author(s):  
Chai YANG ◽  
Wei GU ◽  
Tongzhu LIU

Background: The supply, processing, and distribution (SPD) model is rarely used in hospitals in China. We evaluated its effects on the management efficiency, quality control, and operating costs of medical consumables (MCs) in the clinical nursing setting of a single Chinese hospital, Anhui Provincial Hospital, from 2014 to 2015. Methods: Amount-based package (ABP) and procedure-based package (PBP) models were created and, together with quick response (QR) code scanning, introduced for use in clinical nursing departments (CNDs). Questionnaires were prepared with reference to previous literature, refined through repeated rounds of the Delphi method, and then further discussed and formalized. Partial results of the formal questionnaire were analyzed using SPSS. Results: The frequency of MC claims decreased, and 70% of CNDs no longer needed to claim MCs at all. The average time spent on inventory per week decreased, and the time required to procure MCs was reduced. Moreover, the average satisfaction score for MC management increased, reaching 100%. The average space occupied by MCs decreased significantly, by 1.2444 m³. Overall, 100% of respondents concluded that the management of MCs had improved effectively and that the inventory turnover rate had accelerated. The cost of MCs decreased by 15% despite an increase of more than 10% in in-hospital volume, and the average daily cost of MCs also decreased. Conclusion: The SPD model can improve the efficiency of MC management in CNDs, reduce medical risks and disputes, save hospital operating costs, and decrease capital occupation.
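
As an illustration of the scan-driven replenishment logic an SPD system relies on, the sketch below decrements departmental stock when a package QR code is scanned and queues a restocking request once stock falls below a par level. This is a minimal, hypothetical sketch: the package names, thresholds, and workflow are assumptions, not details taken from the study.

```python
from dataclasses import dataclass, field

@dataclass
class ConsumableStock:
    """Department-level stock of one amount-based package (ABP) of consumables."""
    item: str
    on_hand: int
    par_level: int                      # replenish when stock falls below this level
    pending_orders: list = field(default_factory=list)

    def scan_out(self, qr_package_id: str, quantity: int = 1) -> None:
        """Record consumption when a package QR code is scanned in the ward."""
        self.on_hand -= quantity
        if self.on_hand < self.par_level:
            self.request_replenishment(qr_package_id)

    def request_replenishment(self, qr_package_id: str) -> None:
        """Queue an automatic restocking request to the central SPD warehouse."""
        self.pending_orders.append({"item": self.item,
                                    "qty": self.par_level - self.on_hand,
                                    "triggered_by": qr_package_id})

# Hypothetical usage: a dressing-change package is scanned twice in one shift.
stock = ConsumableStock(item="dressing change ABP", on_hand=6, par_level=5)
stock.scan_out("QR-000123")
stock.scan_out("QR-000124")
print(stock.on_hand, stock.pending_orders)
```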

2021 ◽  
Vol 108 (Supplement_2) ◽  
Author(s):  
A Rogers ◽  
R Ramasubbu ◽  
B Ramasubbu

Abstract Introduction The NHS's move towards increasing digitisation is limited by inadequate resourcing. It is estimated that 70% of a junior doctor's time is spent completing computer-based administrative work. Ageing and insufficient equipment lead to inefficiency. The objective of this study was to investigate the hidden cost of insufficient and poorly performing computer technology. Method Surveys were disseminated to doctors, and data were collected regarding designation, ward, salary, and estimated 'minutes waiting' for computers to become free (CF) and to load (CL). Results 33 surveys were completed. The hospital-wide averages for CF and CL were 25 minutes and 31.06 minutes respectively, with corresponding average daily costs per doctor of £10.16 (CF) and £12.63 (CL), totalling £22.79/doctor/day. In the highest-expense ward, CF (31.66 minutes) and CL (38.33 minutes) equated to £30.28/doctor/day. Following acquisition of new hardware and re-audit, CL was significantly reduced to 20.4 minutes (p = 0.0142). Conclusions This study highlights the hidden cost of insufficient, poorly performing hardware. Every day, the total cost of time wasted greatly surpasses the cost of a single computer unit, illustrating the false economy of reduced capital investment in computer technology.
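
The reported figures follow from simple arithmetic: minutes wasted per day multiplied by a per-minute pay rate derived from salary. The sketch below shows that calculation; the salary and working-time assumptions are illustrative, not the survey's actual figures.

```python
def daily_waiting_cost(minutes_waiting: float, annual_salary_gbp: float,
                       working_days: int = 225, hours_per_day: float = 8.0) -> float:
    """Approximate daily cost of a doctor's time spent waiting for computers."""
    per_minute_rate = annual_salary_gbp / (working_days * hours_per_day * 60)
    return minutes_waiting * per_minute_rate

# Hypothetical example: 25 min waiting for a free computer plus 31.06 min of load time.
for label, minutes in [("CF", 25.0), ("CL", 31.06)]:
    print(label, round(daily_waiting_cost(minutes, annual_salary_gbp=44_000), 2))
```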


1989 ◽  
Vol 21 (S10) ◽  
pp. 127-136 ◽  
Author(s):  
S. R. Daga

Newborn infants are among those who generate the highest health care costs. For instance, the cost of hospital care until discharge was assessed at US$14,200 (Boyle et al., 1983) for babies weighing 1000–1499 g at birth. The average hospital stay for a baby weighing less than 1500 g at birth in 1981 was 100 days, at an average daily cost of US$898 (Stahlman, 1984). Achievements in neonatal survival, especially of extremely low birth weight babies, have necessitated frequent revision of the definition of viability. However, modern neonatal intensive care cannot be regarded as appropriate for developing countries, as it cannot be made accessible to all at an affordable cost.


TAPPI Journal ◽  
2012 ◽  
Vol 11 (7) ◽  
pp. 29-35 ◽  
Author(s):  
PETER W. HART ◽  
DALE E. NUTTER

During the last several years, the increasing cost and decreasing availability of mixed southern hardwoods have resulted in financial and production difficulties for southern U.S. mills that use a significant percentage of hardwood kraft pulp. Traditionally, in the United States, hardwoods are not plantation grown because of the growth time required to produce a quality tree suitable for pulping. One potential method of mitigating the cost and supply issues associated with the use of native hardwoods is to grow eucalyptus in plantations for the sole purpose of producing hardwood pulp. However, most of the eucalyptus species used in pulping elsewhere in the world are not capable of surviving in the southern U.S. climate. This study examines the potential of seven different cold-tolerant eucalyptus species to be used as replacements for, or supplements to, mixed southern hardwoods. The laboratory pulping and bleaching aspects of these seven species are discussed, along with pertinent mill operational data. Selected mill trial data also are reviewed.


2019 ◽  
Vol 290 ◽  
pp. 02007
Author(s):  
Radu Dan Paltan ◽  
Cristina Biriş ◽  
Loredana Anne-Marie Rădulescu

Of the many techniques used to optimize production and costs, studies conducted within a profile company led us to test the 6Sigma method (the method most widely used in the automotive industry) for its economic efficiency when applied in a wood industry company. The method measures how many defects exist in a process and determines systematically how to improve it through technical overhaul, eliminating or minimizing sources of inefficiency. This article surveys the state of research on optimizing the production process, through technical overhaul, for panels reconstituted from solid wood, and on ways to make production more efficient by cutting costs through technical overhaul. From preliminary research, we estimate that the issues identified so far, together with others that will emerge from further research, will lead to a significant decrease in production costs, which will be reflected in the cost of the finished product and consequently increase the company's yield by maximizing its profit. At the same time, this work may serve as the basis for future research studies in the field. The lower the operating costs and the higher the rate of return on investment, the easier it is to maximize profit; the result is a change in operating mode: "working smarter, not harder".
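
For context on what "measuring how many defects exist in a process" means in 6Sigma terms, the standard metric is defects per million opportunities (DPMO), conventionally converted to a sigma level with a 1.5-sigma shift. The sketch below is a generic illustration; the defect counts are hypothetical and not data from the study.

```python
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Conventional short-term sigma level, including the customary 1.5-sigma shift."""
    yield_fraction = 1 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + 1.5

# Hypothetical panel line: 38 defects found in 1,200 panels, 5 defect opportunities each.
d = dpmo(defects=38, units=1200, opportunities_per_unit=5)
print(round(d), round(sigma_level(d), 2))   # roughly 6333 DPMO, about a 4-sigma process
```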


1992 ◽  
Vol 22 (7) ◽  
pp. 980-983 ◽  
Author(s):  
Richard G. Oderwald ◽  
Elizabeth Jones

Formulas are derived for determining the total number of sample points and the number of volume points for a point double sample, with a ratio-of-means estimator, to replace a point sample and achieve the same variance. A minimum ratio of the cost of measuring volume to the cost of measuring basal area at a point is determined, above which the point double sample is less costly, in terms of the time required to measure points, than the point sample.
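
For readers unfamiliar with the setup, the following is a minimal sketch in standard two-phase (double) sampling notation from the survey-sampling literature; the symbols and the approximate variance form are generic, not the paper's own derivation. Basal area \(x\) is measured at a large sample of \(n'\) points, volume \(y\) at a subsample of \(n\) of them, and the ratio-of-means estimator of mean volume is

\[
  \hat{\bar{Y}} \;=\; \frac{\bar{y}_n}{\bar{x}_n}\,\bar{x}_{n'},
  \qquad
  \operatorname{Var}\!\bigl(\hat{\bar{Y}}\bigr)
  \;\approx\; \frac{S_d^{2}}{n} + \frac{S_y^{2}-S_d^{2}}{n'},
  \qquad
  S_d^{2} = S_y^{2} - 2R\,S_{xy} + R^{2}S_x^{2},
  \quad R = \bar{Y}/\bar{X}.
\]

Setting this approximate variance equal to the variance \(S_y^{2}/m\) of an ordinary point sample of \(m\) points fixes admissible pairs \((n, n')\); with per-point measurement costs \(c_x\) for basal area and \(c_y\) for volume, the double sample is cheaper whenever its total cost \(c_x n' + c_y n\) falls below the point-sample cost, which happens once the cost ratio \(c_y/c_x\) exceeds a threshold of the kind derived in the paper.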


2020 ◽  
Vol 4 (1) ◽  
pp. 42-49
Author(s):  
Ajeng Eka Pratama ◽  
Muhaimin Dimyati ◽  
Yanna Eka Pratiwi

This study aims to determine the effect of working capital turnover, the operating cost ratio, and inventory turnover on the performance of UD. Firmansyah. The data used are the financial statements for the 2015–2018 period, giving 48 observations, which were analyzed using multiple linear regression. Partially, the results show that working capital turnover and inventory turnover do not have a significant effect on company performance, while the operating cost ratio does. Simultaneously, working capital turnover, the operating cost ratio, and inventory turnover together have a significant effect on company performance. The coefficient of determination is 0.165, meaning that 16.5% of the variation in company performance can be explained by the working capital turnover, operating cost ratio, and inventory turnover variables, while the remaining 83.5% is explained by other factors not included in this study.
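
A minimal sketch of the kind of multiple linear regression described above, with partial and simultaneous significance and the coefficient of determination. The CSV file and column names are hypothetical placeholders, not the study's actual dataset.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical observations derived from the 2015-2018 financial statements.
df = pd.read_csv("ud_firmansyah_observations.csv")  # hypothetical file

X = sm.add_constant(df[["working_capital_turnover",
                        "operating_cost_ratio",
                        "inventory_turnover"]])
y = df["company_performance"]

model = sm.OLS(y, X).fit()
print(model.rsquared)   # coefficient of determination (reported as 0.165 in the study)
print(model.f_pvalue)   # simultaneous (joint) significance of all three regressors
print(model.pvalues)    # partial significance of each regressor
```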


Quantum ◽  
2018 ◽  
Vol 2 ◽  
pp. 78 ◽  
Author(s):  
M. B. Hastings

We give a quantum algorithm to exactly solve certain problems in combinatorial optimization, including weighted MAX-2-SAT as well as problems where the objective function is a weighted sum of products of Ising variables, all terms of the same degree D; this problem is called weighted MAX-ED-LIN2. We require that the optimal solution be unique for odd D and doubly degenerate for even D; however, we expect that the algorithm still works without this condition, and we show how to reduce to the case without this assumption at the cost of an additional overhead. While the time required is still exponential, the algorithm provably outperforms Grover's algorithm assuming a mild condition on the number of low-energy states of the target Hamiltonian. The detailed runtime analysis depends on a tradeoff between the number of such states and algorithm speed: fewer such states allow a greater speedup. This leads to a natural hybrid algorithm that finds either an exact or an approximate solution.
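
To make the problem statement concrete, the objective for weighted MAX-ED-LIN2 with D = 2 is a weighted sum of products of pairs of Ising (±1) variables. The sketch below evaluates such an objective and maximizes it by exhaustive classical search, the 2^n baseline against which Grover-style 2^(n/2) search and the paper's algorithm are compared; the instance is hypothetical, and the quantum algorithm itself is not reproduced here.

```python
from itertools import product

def objective(spins, terms):
    """Weighted sum of degree-2 products of Ising variables: sum_k w_k * s_i * s_j."""
    return sum(w * spins[i] * spins[j] for (i, j, w) in terms)

def brute_force_max(n, terms):
    """Exhaustive search over all 2^n assignments (the classical baseline)."""
    return max((objective(s, terms), s) for s in product((-1, 1), repeat=n))

# Hypothetical 4-variable instance: (i, j, weight) terms, all of degree D = 2.
terms = [(0, 1, 2.0), (1, 2, -1.0), (2, 3, 3.0), (0, 3, 1.5)]
best_value, best_spins = brute_force_max(4, terms)
print(best_value, best_spins)
```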


Author(s):  
Suzanne Tsacoumis

High-fidelity measures have proven to be powerful tools for measuring a broad range of competencies, and their validity is well documented. However, their high-touch nature is often a deterrent to their use because of the cost and time required to develop and implement them. In addition, given the increased reliance on technology to screen and evaluate job candidates, organizations continue to search for more efficient ways to gather the information they need about candidates' capabilities. This chapter describes how innovative, interactive rich-media simulations that incorporate branching technology have been used in several real-world applications. The main focus is on describing the nature of these assessments and highlighting potential solutions to the unique measurement challenges associated with these types of assessments.


2019 ◽  
Vol 52 (2) ◽  
pp. 191-212
Author(s):  
Christian Kalhoefer ◽  
Guenter Lang

Abstract Governments worldwide reacted swiftly to the global financial crisis with tougher regulation. This paper investigates the impact of the regulatory environment on operating costs using panel data on 2,200 German banks over the period from 1999 to 2014. We estimate cost functions with and without proxies for regulation and analyze the results with respect to period, bank size, and group affiliation. Our results show that regulatory costs peaked in 2001, 2008, and, most recently, from 2012 onward. Most interesting, however, is the asymmetry of regulation: whereas the cost effects were symmetric for all banks until 2003, the last ten years were different. Larger institutions and savings banks could neutralize the impact of increasing regulation on operating costs. In contrast, smaller banks, especially cooperative banks, faced significant cost increases. We therefore expect unintended structural shifts, such as a reduction in the diversity of banks, which are negative for competition, service quality, and the stability of the financial system. Summary Worldwide, regulation of the financial sector was tightened in the wake of the global financial crisis. This paper examines the consequences of these regulatory measures for operating costs in the banking business. Based on panel data on 2,200 banks active in Germany over the period from 1999 to 2014, we estimate cost functions with and without proxies for regulation and evaluate the results by year of observation, bank size, and group affiliation. Our results show cost peaks in 2001, 2008, and most recently since 2012. Most interesting, however, are the asymmetric effects of banking regulation: whereas our models indicate nearly uniform cost burdens up to and including 2003, this changed markedly from 2004 onward. In contrast to large institutions and savings banks, which were largely able to neutralize the regulatory costs, small institutions and cooperative banks faced substantial cost increases. As a consequence of these asymmetric cost effects of state banking regulation, we expect unintended structural changes, such as concentration processes, which will have a negative effect on competition, service quality, and ultimately on the stability of the financial system as a whole. JEL Classification: G21, G38
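
A minimal sketch of a cost-function regression with a regulation proxy of the kind described above, allowing the cost effect to differ by bank size. The file name, variable names, and the regulation proxy are hypothetical assumptions; the paper's actual specification is not reproduced here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical bank-year panel, 1999-2014: log operating costs, log output,
# log input prices, a regulation-intensity proxy, and bank/size identifiers.
df = pd.read_csv("german_banks_panel.csv")  # hypothetical file

# Log-linear cost function with bank and year fixed effects (via dummies, which is
# fine as a sketch; a dedicated panel estimator would be used in practice).
# Interacting the regulation proxy with a size class captures the asymmetry the
# paper reports: large banks absorbing regulatory costs, small banks not.
spec = ("ln_cost ~ ln_output + ln_wage + ln_funding_rate "
        "+ regulation_proxy * size_class + C(bank_id) + C(year)")
fe_model = smf.ols(spec, data=df).fit(cov_type="cluster",
                                      cov_kwds={"groups": df["bank_id"]})
print(fe_model.params.filter(like="regulation_proxy"))
```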


2011 ◽  
Vol 133 (07) ◽  
pp. 46-53
Author(s):  
Burton Dicht

This article analyzes the decisions and technological challenges that drove the Space Shuttle’s development. The goal of the Shuttle program was to create a reusable vehicle that could reduce the cost of delivering humans and large payloads into space. Although the Shuttle was a remarkable flying machine, it never lived up to the goals of an airline-style operation with low operating costs. In January 2004, a year after the Columbia accident, President George W. Bush unveiled the “Vision for U.S. Space Exploration” to guide the U.S. space effort for the next two decades. A major component of the new vision, driven by the recommendations of the Columbia Accident Investigation Board, was to retire the Space Shuttle fleet as soon as the International Space Station assembly was completed. With cancellation of the Constellation program in 2010, the planned successor to the Shuttle, the U.S. space program is now in an era of uncertainty.

