average complexity
Recently Published Documents


TOTAL DOCUMENTS

86
(FIVE YEARS 22)

H-INDEX

8
(FIVE YEARS 1)

2021 ◽  
Author(s):  
Johan Westerman ◽  
Dirk Witteveen ◽  
Erik Bihagen ◽  
Roujman Shahbazian

There is a widespread idea that contemporary careers continue to become ever more complex. Pioneering research on full-career complexity has shown that work lives have indeed become more complex, though at a modest pace. This paper examines whether career complexity continues to increase using Swedish registry data across an exceptionally long time period, including younger cohorts than in previous research: up to those born in 1983. The full early- and mid-careers of the selected birth cohorts cover several macroeconomic booms and downturns, a long period of upskilling of the Swedish labor force, and the convergence of working hours between women and men. The following conclusions are drawn using state-of-the-art methods of measuring career complexity. For early careers, an increasing complexity trend is evident between the 1950s and 1960s birth cohorts, yet complexity fluctuates around a stable trend for the 1970s birth cohorts onward. For mid-careers, which are considerably more stable on average, complexity decreased among women born between the 1930s and the early 1950s. However, the opposite trend holds for men, resulting in a gender convergence of complexity. We observe a standstill of the mid-career complexity trend across both genders, followed by a modest decline for the last observed cohorts. Subsequent analyses point to educational expansion as an important driver of the initial increase in early-career complexity. Taken together, our analysis affirms an initial shift toward more career complexity in the 20th century, yet we find no unidirectional trend toward greater career complexity over recent decades.
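A common sequence-analysis measure in this literature is the complexity index of Gabadinho et al. (the geometric mean of the normalised transition count and the normalised Shannon entropy, as implemented in TraMineR). Whether this paper uses exactly that measure is an assumption; the sketch below is illustrative only.

```python
import math
from collections import Counter

def complexity_index(seq):
    """sqrt( (transitions / (n-1)) * (entropy / max entropy) )."""
    n = len(seq)
    transitions = sum(a != b for a, b in zip(seq, seq[1:]))
    counts = Counter(seq)
    entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
    # Normalising by the observed alphabet here; TraMineR uses the full state space.
    max_entropy = math.log(len(counts)) if len(counts) > 1 else 1.0
    return math.sqrt((transitions / (n - 1)) * (entropy / max_entropy))

# A toy yearly employment career: E = employed, U = unemployed, S = studying.
print(complexity_index(list("EEEUUESSEEEE")))   # higher = more complex
```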


2021 ◽  
Vol 11 (24) ◽  
pp. 11589
Author(s):  
Mattia Morri ◽  
Cristiana Forni ◽  
Andrea Evangelista ◽  
Tania Broggian ◽  
Elisa Ambrosi ◽  
...  

The aim of this work was to measure healthcare outcomes for patients undergoing surgery for femur fractures during the second wave of the COVID-19 pandemic, in orthopaedic surgery units operating under pandemic conditions, and to compare them with pre-pandemic outcomes. A retrospective observational study was conducted. The incidence of pressure ulcers and the time to recovery of ambulation were the main outcomes. The pre-pandemic group consisted of 108 patients and the second-wave pandemic group included 194 patients. The incidence of pressure ulcers increased from 10% in the pre-pandemic period to 21% in the second wave (p = 0.016), with a crude relative risk (RR) of 2.06 (p = 0.023). Recovery of ambulation showed no significant difference in the number of days needed to walk for the first time (3 days vs. 2 days; p = 0.44). During the second wave of COVID-19, the risk of pressure ulcers for patients undergoing femur fracture surgery increased significantly. This variation could be explained by the absence of a caregiver for these patients and by the increased average complexity of the patients managed in the orthopaedic setting. Hospital management should take these aspects into account when restoring normal surgical activities.
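For readers who want to retrace the crude relative risk, here is a minimal sketch. The event counts are back-computed from the reported incidences (about 10% of 108 and 21% of 194), so they are approximations, which is why the result only roughly matches the published RR of 2.06.

```python
import math

def relative_risk(events_exposed, n_exposed, events_control, n_control):
    rr = (events_exposed / n_exposed) / (events_control / n_control)
    # 95% CI on the log scale
    se = math.sqrt(1/events_exposed - 1/n_exposed + 1/events_control - 1/n_control)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

print(relative_risk(41, 194, 11, 108))   # ~21% vs ~10% -> RR about 2.1
```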


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4018
Author(s):  
Bong-seok Kim ◽  
Youngseok Jin ◽  
Jonghun Lee ◽  
Sangdong Kim

This paper proposes a high-efficiency super-resolution frequency-modulated continuous-wave (FMCW) radar algorithm based on coarse estimation by fast Fourier transform (FFT). In FMCW radar systems, the maximum number of samples is generally determined by the maximum detectable distance. However, targets are often closer than the maximum detectable distance, in which case their ranges can be estimated from fewer samples without degrading performance. Based on this property, the proposed algorithm adaptively selects the number of samples used as input to the super-resolution algorithm, depending on the target ranges coarsely estimated by the FFT. In other words, instead of the maximum number of samples set by the maximum detectable distance, the proposed algorithm feeds the super-resolution stage only the reduced number of samples warranted by the FFT-based range estimate. By doing so, the proposed algorithm matches the performance of the conventional multiple signal classification (MUSIC) algorithm, a representative super-resolution algorithm, at a fraction of the cost. Simulation results demonstrate the feasibility and the improvement provided by the proposed algorithm: it achieves an average complexity reduction of 88% compared to the conventional MUSIC algorithm while achieving similar performance. Moreover, this improvement was verified under practical conditions, as evidenced by our experimental results.
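The adaptive-sample idea lends itself to a compact illustration. Below is a minimal Python sketch, assuming a single-chirp FMCW beat signal; all parameters (sample rate, bandwidth, target ranges), the sample floor, and the sliding-window covariance are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T, B, c = 1e6, 1e-3, 150e6, 3e8    # sample rate, chirp time, bandwidth, light speed
N_max = int(fs * T)                     # samples set by the maximum detectable range
t = np.arange(N_max) / fs

# Two targets; a beat frequency f_b = 2*B*R/(c*T) encodes each range R.
targets = [30.0, 40.0]                                         # metres (illustrative)
x_full = sum(np.exp(2j*np.pi*(2*B*R/(c*T))*t) for R in targets)
x_full += 0.05 * (rng.standard_normal(N_max) + 1j*rng.standard_normal(N_max))

# Step 1: coarse range from the FFT peak (a real system would track the farthest target).
spec = np.abs(np.fft.fft(x_full))[:N_max//2]
r_coarse = np.argmax(spec) * (fs/N_max) * c*T/(2*B)

# Step 2: keep only the samples the coarse range warrants.
r_max = (fs/2) * c*T/(2*B)                       # range at the Nyquist beat frequency
N = max(256, int(N_max * r_coarse / r_max))      # floor keeps MUSIC well-conditioned
x = x_full[:N]

# Step 3: MUSIC on the reduced samples (sliding-window covariance, known source count).
m = N // 2
X = np.array([x[i:i+m] for i in range(N - m + 1)]).T
R_cov = X @ X.conj().T / X.shape[1]
_, V = np.linalg.eigh(R_cov)
En = V[:, :-len(targets)]                        # noise subspace (eigh sorts ascending)
f_grid = np.linspace(0, fs/2, 4096)
A = np.exp(2j*np.pi*np.outer(np.arange(m), f_grid/fs))
P = 1.0 / np.sum(np.abs(En.conj().T @ A)**2, axis=0)
# NB: naive top-k picking; a real implementation should search for local maxima.
est = np.sort(f_grid[np.argsort(P)[-2:]]) * c*T/(2*B)
print(f"coarse {r_coarse:.1f} m, samples {N}/{N_max}, MUSIC ranges {est}")
```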


2021 ◽  
Vol 39 ◽  
Author(s):  
Alexandria Connor ◽  
Resad Pasic ◽  
Amira Quevedo ◽  
Petra Chamseddine ◽  
...  

Introduction: Robotic systems provide a platform for surgeons to expand their capabilities, allowing them to perform complex procedures safely and efficiently. Within the field of benign gynecology, this has become an increasingly popular option since receiving Food and Drug Administration (FDA) approval in 2005. However, the appropriate indications for robotic versus laparoscopic surgery continue to be debated. Materials and Methods: The literature was reviewed to provide a comprehensive, evidence-based evaluation of the advantages and pitfalls of robotic surgery, the applications of robotic surgery to benign gynecologic procedures in comparison with conventional laparoscopy, and the role of robotic surgery as an educational tool. Results: Robotic surgery has favorable outcomes for surgeons in the areas of ergonomics, dexterity, and fatigue. Cost comparisons are widely varied and elaborate. Most patient outcomes are comparable between robotic and laparoscopic hysterectomies and endometriosis resections. In patients with a body mass index >30 kg/m2 and uteri >750 g, hysterectomy outcomes are improved when surgery is performed robotically. The use of the robotic system may be beneficial for patients undergoing myomectomy. Robotic surgery confers advantages for trainees and novice surgeons. There is no consensus on a standardized curriculum for robotic training or on a credentialing process for experienced surgeons. Conclusion: Robotic surgery has distinct features that make it a valuable tool for gynecologic surgeons. There are no clear indications for when a robotic route should be chosen, but it could be considered when above-average complexity is anticipated and when training new surgeons.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
M. D. V. Bodmer ◽  
P. M. Wheeler ◽  
P. Anand ◽  
S. E. Cameron ◽  
Sanni Hintikka ◽  
...  

Abstract: When Caribbean long-spined sea urchins, Diadema antillarum, are stable at high population densities, their grazing facilitates scleractinian coral dominance. Today, populations remain suppressed after a mass mortality in 1983–1984 caused a loss of their ecosystem functions, and led to widespread declines in ecosystem health. This study provides three lines of evidence to support the assertion that a lack of habitat complexity on Caribbean coral reefs contributes to their recovery failure. Firstly, we extracted fractal dimension (D) measurements, used as a proxy for habitat complexity, from 3D models to demonstrate that urchins preferentially inhabit areas of above average complexity at ecologically relevant spatial scales. Secondly, controlled behaviour experiments showed that an energetically expensive predator avoidance behaviour is reduced by 52% in complex habitats, potentially enabling increased resource allocation to reproduction. Thirdly, we deployed a network of simple and cost-effective artificial structures on a heavily degraded reef system in Honduras. Over a 24-month period the adult D. antillarum population around the artificial reefs increased by 320% from 0.05 ± 0.01 to 0.21 ± 0.04 m−2 and the juvenile D. antillarum population increased by 750% from 0.08 ± 0.02 to 0.68 ± 0.07 m−2. This study emphasises the important role of habitat structure in the ecology of D. antillarum and as a barrier to its widespread recovery.
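Since the study's complexity proxy is the fractal dimension D extracted from 3D models, a box-counting sketch may help fix ideas. The occupancy-grid input, box sizes, and synthetic surface below are assumptions for illustration, not the authors' photogrammetry pipeline.

```python
import numpy as np

def box_count_dimension(occ, sizes=(2, 4, 8, 16, 32)):
    """Estimate D as the slope of log N(s) versus log(1/s)."""
    counts = []
    for s in sizes:
        trim = [d - d % s for d in occ.shape]          # make each axis divisible by s
        g = occ[:trim[0], :trim[1], :trim[2]]
        blocks = g.reshape(trim[0]//s, s, trim[1]//s, s, trim[2]//s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3, 5))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy usage: a noisy height field rasterised as a thin surface shell.
rng = np.random.default_rng(1)
h = np.cumsum(rng.standard_normal((128, 128)), axis=0)
h = (h - h.min()) / (h.max() - h.min()) * 63
zz = np.arange(64)[None, None, :]
occ = np.abs(zz - h[..., None]) < 1.0                  # cells near the surface
print(box_count_dimension(occ))                        # rough surfaces give D > 2
```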


2021 ◽  
Vol 7 ◽  
pp. e406
Author(s):  
Luca Ardito ◽  
Luca Barbato ◽  
Riccardo Coppola ◽  
Michele Valsesia

Rust is an innovative programming language, initially implemented by Mozilla and developed to ensure high performance, reliability, and productivity. The purpose of this study is to apply a set of common static software metrics to programs written in Rust to assess the verbosity, understandability, organization, complexity, and maintainability of the language. To that end, nine different implementations of algorithms available in different languages were selected. We computed a set of metrics for Rust, comparing them with the ones obtained from C and from a set of object-oriented languages: C++, Python, JavaScript, and TypeScript. To parse the software artifacts and compute the metrics, we leveraged a tool called rust-code-analysis, which we extended with a software module, written in Python, to uniform and compare the results. The Rust code had average verbosity in terms of raw code size, and it exhibited the most structured source organization in terms of the number of methods. Rust code had better Cyclomatic Complexity, Halstead metrics, and Maintainability Index values than C and C++, but performed worse than the other considered object-oriented languages. Lastly, the Rust code exhibited the lowest cognitive complexity of all the languages. The collected measures show that the Rust language has average complexity and maintainability compared to a set of popular languages. It is more easily maintainable and less complex than the C and C++ languages, which can be considered syntactically similar. These results, paired with the memory-safety and safe-concurrency characteristics of the language, can encourage a wider adoption of Rust in place of C in both open-source and industrial environments.
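As a point of reference for the maintainability comparison, one classic formulation of the Maintainability Index combines Halstead volume, cyclomatic complexity, and source lines of code. rust-code-analysis and other tools use close variants, so treat the sketch below as illustrative rather than the exact formula used in the study.

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, sloc):
    # Classic Oman-Hagemeister formulation; higher values mean more maintainable.
    return (171
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(sloc))

# Hypothetical per-function numbers: lower volume and complexity raise the MI.
print(maintainability_index(halstead_volume=1200.0, cyclomatic_complexity=8, sloc=60))
```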


2020 ◽  
Vol 7 (2) ◽  
pp. 107-112
Author(s):  
Annisa Heparyanti Safitri ◽  
Muhammad Ainul Yaqin ◽  
Adi Heru Utomo

Abstract— In an organization, several problems often arise, one of which lies in the complexity of business process modeling. Business processes with high complexity values are difficult to analyze and maintain as a whole, so a method is needed to break a business process down into smaller parts called process model fragments. Therefore, a decomposition was carried out to break the process model down and make it simpler. The benefit of decomposition is that it makes it easier for users to compose the required business process model. We used three different scenarios for the TMA process model to analyze each fragment: process models whose scenarios tend toward sequence, multi-branching, and nested branching. Furthermore, to support the RPST results, the average complexity value of each process model fragment was calculated with the Yaqin Complexity formula, together with its standard deviation. Our experiments found that the structure of the RPST tree affects the number of fragments, and that the deeper the tree, the higher the average complexity value. The scenario that tends toward sequence has the lowest average complexity value, 22, with a standard deviation of 5.567, while the highest values occur in the scenario with nested branching and a repetition process, with an average complexity value of 29.8 and a standard deviation of 13.405. Keywords— Process Model, RPST, Decomposition, Complexity Matrix, Standard Deviation.
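The fragment-level statistics reported here reduce to the mean and sample standard deviation over per-fragment complexity scores; a minimal sketch follows, with wholly hypothetical scores (the Yaqin Complexity formula itself is not reproduced here).

```python
import statistics

fragment_scores = [18, 22, 25, 30, 16]           # one hypothetical score per RPST fragment
print(statistics.mean(fragment_scores))          # average complexity of the model
print(statistics.stdev(fragment_scores))         # sample standard deviation
```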


2020 ◽  
pp. short76-1-short76-13
Author(s):  
Roman Chernyak ◽  
Roman Meshcheryakov

This paper considers the use of an application map method together with the Noise Suppression Filter (NSF) in-loop filter for video compression. The application map method is described in detail, and simulation results are provided for the basic NSF as well as for the NSF augmented with the application map method. It is demonstrated that the application map method significantly improves the objective performance of the basic NSF and additionally decreases average decoder complexity. As a result, the average BD-rate saving of the NSF with the application map reaches 2.6% for the luma component in the Low Delay P coding configuration compared to the Versatile Video Coding reference implementation (VTM) version 1.0.
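The core idea of an application map, a per-block on/off mask that enables an in-loop filter only where it actually reduces distortion, can be sketched compactly. The block size, the SSE criterion, and the function below are illustrative assumptions; the real NSF and its VTM signalling are considerably more involved.

```python
import numpy as np

def build_application_map(orig, recon, filtered, block=64):
    """Enable the filter per block only where it lowers SSE against the original."""
    h, w = orig.shape
    amap = np.zeros((h // block, w // block), dtype=bool)
    out = recon.copy()
    for by in range(h // block):
        for bx in range(w // block):
            sl = (slice(by * block, (by + 1) * block),
                  slice(bx * block, (bx + 1) * block))
            sse_off = np.sum((orig[sl].astype(np.int64) - recon[sl]) ** 2)
            sse_on = np.sum((orig[sl].astype(np.int64) - filtered[sl]) ** 2)
            if sse_on < sse_off:                  # filter helps here: flag it on
                amap[by, bx] = True
                out[sl] = filtered[sl]
    return amap, out                              # the map would be signalled to the decoder
```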


Author(s):  
Tomoyuki Fujita ◽  
Takashi Kakuta ◽  
Naonori Kawamoto ◽  
Yusuke Shimahara ◽  
Shin Yajima ◽  
...  

Abstract OBJECTIVES To determine whether robotic mitral valve repair can be applied to more complex lesions compared with minimally invasive direct mitral valve repair through a right thoracotomy. METHODS We enrolled 335 patients over a 9-year period; 95% of the robotic surgeries were performed after experience had been gained performing direct mitral valve repair. RESULTS The mean age in the robotic versus thoracotomy repair groups was 61 ± 14 vs 55 ± 11 years, respectively (P < 0.001); 97% vs 100% of the patients, respectively, had degenerative aetiologies. Repair complexity was simple in 106 (63%) vs 140 (84%), complex in 34 (20%) vs 20 (12%) and most complex in 29 (17%) vs 6 (4%) patients undergoing robotic versus thoracotomy repair, respectively. The average complexity score with robotic repair was significantly higher than with thoracotomy repair (P < 0.001). The robotic group underwent more chordal replacement using polytetrafluoroethylene and fewer resections. All patients underwent ring annuloplasty. Cross-clamp time did not differ between the groups, and no strokes or deaths occurred. More patients undergoing robotic repair underwent concomitant procedures than in the thoracotomy group (30% vs 14%, respectively; P < 0.001). The overall repair rate was 100%, with no early mortality or strokes in either group. Postoperative mean residual mitral regurgitation was 0.3 in both groups, and the mean pressure gradient through the mitral valve was 2.4 vs 2.7 mmHg (robotic versus thoracotomy repair, respectively; P = 0.031). CONCLUSIONS Robotic surgery can be applied to repair more complex mitral lesions, with excellent early outcomes.

