Articles Signifying Renogram Processing Practices - A Systematic Review

Author(s):  
Pradnya Gokhale ◽  
Babasaheb Rajaram Patil

Abstract Renography records the time-activity curve obtained by measuring activity in the kidneys after injection of a radiolabelled tracer (e.g. 99mTc-DTPA, 99mTc-MAG3). Interpretation of this renal scan helps to diagnose whether drainage from the kidney is normal or abnormal. The tracer data are processed with mathematical models and data-processing techniques, such as the Rutland-Patlak and deconvolution methods, to produce the renogram. This study reviews previously published research articles covering the various methods, their applications, and the image-processing algorithms and techniques that have been applied to renal radiotracer transit-time data. The review summarises the types, advantages, gaps and possible scope of existing renogram data-processing techniques. After analysing 142 articles, it was found that most of them concern renal-scan processing methods limited to particular categories of renal disease and lack quantitative measurement and study of the parenchymal tracer transit time along the path from the renal cortex to the renal pelvis, while only a limited number of articles deal purely with algorithms for detecting the level of obstruction, and these do so qualitatively.
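The abstract repeatedly refers to the Rutland-Patlak technique. As a rough illustration of the kind of processing involved (a sketch only, not a method taken from any of the reviewed articles; the curve variables below are hypothetical), a Rutland-Patlak uptake estimate in Python might look like this:

```python
import numpy as np

def rutland_patlak_slope(t, kidney, blood):
    """Estimate the tracer uptake rate constant from renogram curves
    using a Rutland-Patlak plot (early uptake phase only).

    t      : acquisition times in seconds
    kidney : background-subtracted kidney ROI counts
    blood  : blood-pool (e.g. heart ROI) counts
    """
    t, kidney, blood = (np.asarray(a, dtype=float) for a in (t, kidney, blood))
    # Cumulative (trapezoidal) integral of the blood curve at each frame
    cum_blood = np.concatenate(
        ([0.0], np.cumsum(np.diff(t) * 0.5 * (blood[1:] + blood[:-1]))))
    x = cum_blood / blood        # integral(B)/B(t)
    y = kidney / blood           # R(t)/B(t)
    # Linear fit over the uptake phase; the slope approximates the uptake constant
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept
```

In practice the fit would be restricted to the uptake phase of the renogram rather than the whole acquisition, as the reviewed methods generally do.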

2019 ◽  
Vol 6 (1) ◽  
Author(s):  
Noussair Fikri ◽  
Mohamed Rida ◽  
Noureddine Abghour ◽  
Khalid Moussaid ◽  
Amina El Omri

Abstract In this paper we propose an adaptive, real-time approach to resolving the latency and semantic-heterogeneity problems of real-time financial data integration. Because of constraints we have faced in projects that require real-time integration and analysis of massive financial data, we decided to follow a new approach that combines a hybrid financial ontology, resilient distributed datasets and real-time discretized streams. We create a real-time data-integration pipeline that avoids the main problems of classic Extract-Transform-Load (ETL) tools: data-processing latency, functional misunderstandings and metadata heterogeneity. The approach is intended to improve reporting quality and availability within short time frames, which is the reason for using Apache Spark. We studied ETL concepts, data-warehousing fundamentals, big-data processing techniques and container-oriented clustering architectures in order to replace the classic data-integration and analysis process with our new concept, resilient distributed DataStream for online analytical processing (RDD4OLAP) cubes, which are consumed using Spark SQL or Spark Core primitives.
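As a rough illustration of the kind of pipeline described (a sketch only; the socket feed, record format and column names below are hypothetical, and the RDD4OLAP cube construction itself is not reproduced), a discretized-stream ingestion step feeding Spark SQL could be written in Python as follows:

```python
from pyspark.sql import SparkSession
from pyspark.streaming import StreamingContext

# Sketch: a discretized stream of ';'-separated trade records is normalised
# to a common schema (standing in for the ontology-driven transformation)
# and each micro-batch is exposed to Spark SQL for analytical queries.
spark = SparkSession.builder.appName("financial-stream-sketch").getOrCreate()
ssc = StreamingContext(spark.sparkContext, 5)     # 5-second micro-batches

lines = ssc.socketTextStream("localhost", 9999)   # hypothetical feed

def normalise(line):
    source, instrument, price, volume = line.split(";")
    return (source, instrument, float(price), int(volume))

def analyse(rdd):
    if not rdd.isEmpty():
        df = spark.createDataFrame(rdd, ["source", "instrument", "price", "volume"])
        df.createOrReplaceTempView("trades")
        spark.sql("SELECT instrument, avg(price) AS avg_price "
                  "FROM trades GROUP BY instrument").show()

lines.map(normalise).foreachRDD(analyse)
ssc.start()
ssc.awaitTermination()
```

The micro-batch model is what gives the "short time frame" reporting the authors aim for: each batch is immediately queryable rather than waiting for a nightly ETL run.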


2017 ◽  
Vol 22 (1) ◽  
Author(s):  
Oey Hannes Widjaya ◽  
Louis Utama

This study was conducted to describe the systems and procedures for recruiting candidate human resources at PT Hero Supermarket, to identify weaknesses in those recruitment systems and procedures, and to determine which of them have caused problems for the company so that an appropriate solution can be found. The object of the study is PT Hero Supermarket Tbk., a company in the retail supermarket industry established in 1971 and located at Jalan Gatot Subroto Kav 64 177 A, South Jakarta, Indonesia. The research subject is the analysis of the company's systems and procedures for candidate HR recruitment. Data were collected through interviews and observation and processed using editing, coding, tabulation and computerisation. The analysis and discussion produced computerised flowcharts for HR candidate recruitment at PT Hero Supermarket Tbk: the recruitment filing flowchart, the recruitment planning flowchart, the recruitment advertisement flowchart and the candidate cover-letter selection flowchart.


1990 ◽  
Vol 29 (04) ◽  
pp. 170-176 ◽  
Author(s):  
M. V. Yester ◽  
Eva Dubovsky ◽  
C. D. Russell

Renal parenchymal transit time of the recently introduced radiopharmaceutical 99mTc-MAG3 (mercaptoacetylglycylglycylglycine) was measured in 37 kidneys, using factor analysis to separate parenchymal activity from that in the collecting system. A new factor algorithm was employed, based on prior interpolative background subtraction and use of the fact that the initial slope of the collecting-system factor time-activity curve must be zero. The only operator intervention required was selection of a rectangular region enclosing the kidney (by identifying two points at opposite corners). Transit time was calculated from the factor time-activity curves both by deconvolution of the parenchymal factor curve and by measuring the appearance time of collecting-system activity from the collecting-system factor curve. There was substantial agreement between the two methods. Factor analysis led to a narrower range of normal values than a conventional cortical region-of-interest method, presumably by decreasing crosstalk from the collecting system. In preliminary trials, the parenchymal transit time did not clearly separate four obstructed from seventeen unobstructed kidneys, but it successfully (p <0.05) separated six transplanted kidneys with acute rejection or acute tubular necrosis from 10 normal transplants.
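The paper derives the transit time by deconvolving the parenchymal factor curve. The factor-analysis step itself is not reproduced here, but a generic matrix-deconvolution sketch in Python (assuming hypothetical, already-smoothed input and parenchymal curves) illustrates how a retention function and mean transit time can be obtained:

```python
import numpy as np

def mean_transit_time(input_curve, kidney_curve, dt):
    """Recover the retention function h(t) from kidney = input (*) h by
    matrix deconvolution, then estimate the mean parenchymal transit time.

    input_curve  : vascular input time-activity curve (1-D)
    kidney_curve : parenchymal time-activity curve, same sampling
    dt           : frame duration in seconds
    """
    a = np.asarray(input_curve, dtype=float)
    r = np.asarray(kidney_curve, dtype=float)
    n = len(a)
    # Lower-triangular convolution matrix: r[i] = dt * sum_j a[i-j] * h[j]
    A = np.array([[a[i - j] if i >= j else 0.0 for j in range(n)]
                  for i in range(n)]) * dt
    # Least-squares solution; real renogram data would need smoothing or
    # regularisation before this step to keep h well behaved.
    h, *_ = np.linalg.lstsq(A, r, rcond=None)
    return np.sum(h) * dt / h[0]   # MTT = area under h / h(0)
```

This is only the standard deconvolution relationship; the contribution of the paper lies in how the parenchymal curve is isolated by factor analysis before this step.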


2006 ◽  
Vol 46 (9) ◽  
pp. S693-S707 ◽  
Author(s):  
P Varela ◽  
M.E Manso ◽  
A Silva ◽  
the CFN Team ◽  
the ASDEX Upgrade Team

2020 ◽  
Vol 14 ◽  
pp. 174830262096239 ◽  
Author(s):  
Chuang Wang ◽  
Wenbo Du ◽  
Zhixiang Zhu ◽  
Zhifeng Yue

With the wide application of intelligent sensors and the Internet of Things (IoT) in the smart job shop, a large volume of real-time production data is collected. Accurate analysis of these data can help producers make effective decisions. Compared with traditional data-processing methods, artificial intelligence, as the main big-data analysis method, is increasingly applied in the manufacturing industry. However, different AI models differ in their ability to process real-time data from smart job-shop production. On this basis, a real-time big-data processing method for the job-shop production process based on Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks is proposed. The method uses historical production data extracted from the IoT-enabled job shop as the original data set and, after preprocessing, trains LSTM and GRU models to predict real-time job-shop data. The models are described, implemented and compared with K-nearest neighbour (KNN), decision tree (DT) and traditional neural-network models. The results show that, for real-time big-data processing of the production process, the LSTM and GRU models outperform the traditional neural network, KNN and DT, and that, while the GRU achieves performance similar to the LSTM, its training time is much lower than that of the LSTM model.
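A minimal sketch of the LSTM-versus-GRU comparison described above, written in Python with Keras; the window length, feature count, target variable and random stand-in data are all assumptions, since the abstract does not specify them:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def make_model(cell, window=30, n_features=8):
    """Small recurrent regressor; `cell` is layers.LSTM or layers.GRU."""
    return keras.Sequential([
        layers.Input(shape=(window, n_features)),
        cell(64),
        layers.Dense(1),   # e.g. predicted cycle time or throughput
    ])

# Hypothetical preprocessed sensor windows: (samples, timesteps, features)
X = np.random.rand(1000, 30, 8).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

for name, cell in [("LSTM", layers.LSTM), ("GRU", layers.GRU)]:
    model = make_model(cell)
    model.compile(optimizer="adam", loss="mse")
    history = model.fit(X, y, epochs=5, batch_size=64,
                        validation_split=0.2, verbose=0)
    print(name, "final val_loss:", history.history["val_loss"][-1])
```

Timing the two fit calls on the same data is the simplest way to reproduce the training-time comparison the abstract reports; the GRU's smaller gate structure is what typically makes it faster to train.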


2011 ◽  
Vol 121-126 ◽  
pp. 3195-3199
Author(s):  
Li Feng Yang ◽  
Jun Yuan ◽  
Wei Na Liu ◽  
Xiu Ming Nie ◽  
Xue Liang Pei

KingView is used to acquire and display the real-time performance parameters of a centrifugal pump; the collected experimental data are stored in an Access database, and VB is used to read the database and, with its drawing functions, process the data and plot the relationship curves of the performance parameters.
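The original workflow is KingView plus Access plus VB; as a loose analogue only, assuming the stored test data were exported to a CSV file with hypothetical column names, the performance curves could be plotted in Python as follows:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV export of the stored test data
# (the original work reads an Access database from VB).
df = pd.read_csv("pump_test.csv")   # columns: flow, head, efficiency

fig, ax = plt.subplots()
ax.plot(df["flow"], df["head"], label="Head (m)")
ax.plot(df["flow"], df["efficiency"], label="Efficiency (%)")
ax.set_xlabel("Flow rate (m³/h)")
ax.set_title("Centrifugal pump performance curves")
ax.legend()
plt.show()
```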

