Python programming
Recently Published Documents

TOTAL DOCUMENTS: 630 (FIVE YEARS: 477)
H-INDEX: 13 (FIVE YEARS: 5)

Author(s):  
Prof. F. S. Ghodichor

Abstract: Counterfeit money has always been an issue that causes many problems in the market. Technological development has made it easier to produce counterfeit notes, which circulate in the market and harm the global economy. Banks and trading institutions have equipment to check the authenticity of money, but the average person does not have access to such systems, so software is needed that lets ordinary people detect counterfeit money. The proposed system uses image processing to determine whether a banknote is real or fake. The system is built entirely in the Python programming language and performs steps such as grayscale conversion, edge detection, and segmentation using appropriate methods. Keywords: counterfeit currency, image processing, Python programming language, grayscale conversion, edge detection, segmentation.
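As a rough illustration of the preprocessing steps the abstract names (grayscale conversion, edge detection, segmentation), a minimal sketch using OpenCV might look like the following; the file name and thresholds are placeholders, and this is not the authors' implementation.

```python
# Minimal sketch of grayscale conversion, edge detection, and segmentation
# for a banknote image, assuming OpenCV. "note.jpg" and the Canny/Otsu
# thresholds are illustrative placeholders.
import cv2

image = cv2.imread("note.jpg")                      # load the banknote image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)      # grayscale conversion
blurred = cv2.GaussianBlur(gray, (5, 5), 0)         # reduce noise before edges
edges = cv2.Canny(blurred, 50, 150)                 # Canny edge detection

# simple Otsu threshold-based segmentation of the note from the background
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print(f"{len(contours)} segmented regions found")
```

A real detector would then compare security features extracted from these regions against those of a genuine note.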


Author(s):  
Harsha Vardhan Peela ◽  
Tanuj Gupta ◽  
Nishit Rathod ◽  
Tushar Bose ◽  
...  

Credit risk management in banks centers on determining the probability of a customer's default or credit deterioration and how costly it will be if it happens. It is important to consider the major factors and predict in advance the probability of a consumer defaulting given their circumstances. This is where a machine learning model comes in handy, allowing banks and major financial institutions to predict whether the customer they are lending to will default. This project builds a machine learning model with the best accuracy possible using Python. First, we load and view the dataset. The dataset contains a mix of numerical and non-numerical features, with values from various ranges and a number of missing entries. We preprocess the dataset to ensure the machine learning model we choose can make good predictions. Once the data is in good shape, some exploratory data analysis is done to build our intuitions. Finally, we build a machine learning model that can predict whether an individual's application for a credit card will be accepted, and we use various tools and techniques to try to improve its accuracy. This project uses a Jupyter notebook for Python programming to build the machine learning model. Using data analysis and machine learning, we attempted to determine the most essential parameters for obtaining credit card acceptance. The model we built gave 86% accuracy for predicting whether a credit card application will be approved, considering the various factors in the applicant's file. Although we achieved 86% accuracy, we conducted a grid search to see if we could increase the performance even further; however, with both machine learning models, random forest and logistic regression, the best we could get from this data was 86 percent.
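A minimal sketch of the described pipeline with scikit-learn could look as follows; the file name, the "approved" target column, and the parameter grids are assumptions, not the authors' notebook.

```python
# Illustrative sketch of the pipeline described above: imputation, encoding,
# scaling, then logistic regression and random forest with a grid search.
# The CSV name, target column, and grids are placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("credit_card_applications.csv")
df = df.fillna(df.mode().iloc[0])           # impute missing entries with the mode
y = df.pop("approved")                      # hypothetical target column
X = pd.get_dummies(df, drop_first=True)     # encode non-numerical features

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

scaler = StandardScaler().fit(X_train)      # rescale values from various ranges
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

for model, grid in [
    (LogisticRegression(max_iter=1000), {"C": [0.01, 0.1, 1, 10]}),
    (RandomForestClassifier(random_state=42), {"n_estimators": [100, 300]}),
]:
    search = GridSearchCV(model, grid, cv=5).fit(X_train, y_train)
    print(type(model).__name__, "test accuracy:", search.score(X_test, y_test))
```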


2022 ◽  
pp. 37-44
Author(s):  
DMІTRIY NOVAK ◽  
KATERYNA MARYNIAKA

Purpose. Creation of software for evaluating the uniformity of filler distribution in a polyethylene matrix.

Methodology. The software was developed in the Python programming language using the PIL, NumPy, Matplotlib, and XlsxWriter libraries. Its suitability for use was determined by verification, during which polyethylene compositions filled with colloidal graphite and pressed into films were evaluated. To obtain these compositions, P6006AD grade polyethylene and C-1 colloidal graphite were chosen. Samples were obtained in two stages: 1) obtaining a strand by extrusion; 2) additional mixing of the composition on a disc mixer and pressing the obtained compositions into a film.

Findings. Software was developed to assess the uniformity of filler distribution in a polyethylene matrix. Using it, data were obtained on the dependence of the heterogeneity coefficient of polyethylene compositions on the colloidal graphite content: increasing the filler content decreases the heterogeneity. This effect can be explained by the structuring of the filler in the polyethylene matrix: despite the formation of aggregates, a significant number of small colloidal graphite particles lie in the space between aggregates, which evens out the concentration across the film and reduces its inhomogeneity.

Scientific novelty. The influence of colloidal graphite content on the homogeneity of polyethylene compositions is determined. It is shown that with an increase in graphite content from 0 to 20 vol.%, the heterogeneity coefficient of the composition decreases from 5.3% to 3.9%, which is due to the structuring of the filler in the polyethylene matrix.

Practical value. The developed software makes it possible to evaluate the uniformity of filler particle distribution in a polymer matrix and can be used to study the mixing quality of polymer composite materials.
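The abstract does not give the formula for the heterogeneity coefficient, but a plausible sketch using the PIL and NumPy libraries it names is to take the coefficient of variation of tile-mean intensities across the film image; the tile size and the coefficient definition below are assumptions, not the authors' method.

```python
# One plausible way to estimate a heterogeneity coefficient of filler
# distribution from a film photograph, using PIL + NumPy. The 32-pixel
# tile size, the CV-of-tile-means definition, and "film.png" are all
# illustrative assumptions.
import numpy as np
from PIL import Image

def heterogeneity_coefficient(path, tile=32):
    gray = np.asarray(Image.open(path).convert("L"), dtype=float)
    # crop to a whole number of tiles, then split into (tile x tile) blocks
    h = (gray.shape[0] // tile) * tile
    w = (gray.shape[1] // tile) * tile
    blocks = gray[:h, :w].reshape(h // tile, tile, w // tile, tile)
    means = blocks.mean(axis=(1, 3))             # mean intensity per tile
    return 100.0 * means.std() / means.mean()    # CV of tile means, in %

print(f"heterogeneity: {heterogeneity_coefficient('film.png'):.1f}%")
```

A perfectly uniform filler distribution would give near-identical tile means and a coefficient close to zero; aggregates raise the spread and hence the coefficient.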


2022 ◽  
Vol 14 (4) ◽  
pp. 114-121
Author(s):  
Julia Sergeevna Shevnina ◽  
Larisa Gagarina ◽  
Andrey Chirkov ◽  
Nikolay Mironov

Within the framework of this work, the subject area of exchange rate management software was studied and several software solutions were comparatively analyzed. The Python programming language was chosen to implement the server part of the exchange rate management software module (PM UKB), with the Django framework as its basis. To implement the client, the Jinja template engine was used for assembling HTML pages, the Bootstrap framework for working with the grid and styles, and the JavaScript language for interactivity. The paper also presents a general scheme of the algorithm in graphical form. The article then considers the program blocks for authentication, data unloading, switching the exchange on and off, collecting modified data, adding the control data block to the point exchange rate management page, updating data in the database, and updating the data of specific rates.
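As a hypothetical sketch of two of the server-side blocks listed above (switching the exchange on and off, and updating rates in the database), assuming a Django app with an ExchangePoint model; all names here are illustrative, not the authors' code.

```python
# Hypothetical Django views for toggling an exchange point and updating its
# rates. ExchangePoint and its fields (is_active, buy_rate, sell_rate) and
# the "point_detail" URL name are assumed for illustration.
from django.contrib.auth.decorators import login_required
from django.shortcuts import get_object_or_404, redirect

from .models import ExchangePoint  # hypothetical model

@login_required
def toggle_exchange(request, point_id):
    """Switch an exchange point on or off."""
    point = get_object_or_404(ExchangePoint, pk=point_id)
    point.is_active = not point.is_active
    point.save(update_fields=["is_active"])
    return redirect("point_detail", pk=point.pk)

@login_required
def update_rates(request, point_id):
    """Update buy/sell rates for a specific point from a POST form."""
    point = get_object_or_404(ExchangePoint, pk=point_id)
    point.buy_rate = request.POST["buy_rate"]
    point.sell_rate = request.POST["sell_rate"]
    point.save(update_fields=["buy_rate", "sell_rate"])
    return redirect("point_detail", pk=point.pk)
```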


2022 ◽  
Vol 8 (2) ◽  
pp. 99-114
Author(s):  
Lamyaa Gamal EL-Deen Taha ◽  
Manar A. Basheer ◽  
Amany Morsi Mohamed

Nowadays, desertification is one of the most serious environmental and socioeconomic issues, and sand dune advance is a major cause of desertification. Wadi El-Rayan is one of the areas facing severe dune migration, so it is important to monitor desertification and study sand dune migration there. Image differencing between the years 2000 (Landsat ETM+) and 2019 (OLI images) and bi-temporal layer stacking were performed. Image differencing proved superior to the visual method (bi-temporal layer stacking) for detecting changes in the study area. This research develops a quantitative technique for desertification assessment by deriving indicators from Landsat images. The spatial distribution of sand dune movement was studied using several spectral indices (NDVI, BSI, LDI, and LST), and a Python script was developed to calculate them. The results show that the NDVI and BSI indices are the best for identifying and detecting vegetation. Mobile sand dunes on the southern side of the lower Wadi El-Rayan Lake were found to have filled a large part of the lower lake. The indices show that sand movement has decreased the size of the lower Wadi El-Rayan Lake and that there are reclamation activities to the west of the lower lake. The results show that the developed code achieved good results compared to ready-made software (ENVI 5).
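A minimal sketch of how such a Python script might compute the NDVI and BSI indices from Landsat 8 (OLI) bands is shown below; the rasterio dependency and the band file names are assumptions, not the authors' code.

```python
# Illustrative NDVI and BSI computation from Landsat 8 OLI bands
# (B2 blue, B4 red, B5 NIR, B6 SWIR1). File names are placeholders.
import numpy as np
import rasterio

def read_band(path):
    with rasterio.open(path) as src:
        return src.read(1).astype(float)

blue  = read_band("LC08_B2.tif")
red   = read_band("LC08_B4.tif")
nir   = read_band("LC08_B5.tif")
swir1 = read_band("LC08_B6.tif")

eps = 1e-10                                # avoid division by zero
ndvi = (nir - red) / (nir + red + eps)     # vegetation index, range [-1, 1]
bsi = ((swir1 + red) - (nir + blue)) / ((swir1 + red) + (nir + blue) + eps)
print("mean NDVI:", np.nanmean(ndvi), "mean BSI:", np.nanmean(bsi))
```

Differencing these index rasters between the 2000 and 2019 scenes then highlights where vegetation was lost or bare sand advanced.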


2022 ◽  
Author(s):  
Michael S. Horn ◽  
Melanie West ◽  
Cameron Roberts

2022 ◽  
pp. 88-102
Author(s):  
Basetty Mallikarjuna ◽  
Anusha D. J. ◽  
Sethu Ram M. ◽  
Munish Sabharwal

An effective video surveillance system is a challenging task in the COVID-19 pandemic. This work builds a model that checks for the proper way of wearing a mask and for maintaining a social distance of at least six feet (one to two meters) using a CNN approach. The video surveillance system works with the help of TensorFlow, Keras, and Pandas, libraries of the Python programming language used in deep learning. The proposed model improves the CNN approach in the area of deep learning and is named the Ram-Laxman algorithm. To build the optimized approach, the convolutional layers are grouped as 'Ram' and the fully connected layers as 'Laxman'. The results convey that the Ram-Laxman model is easy to apply to CCTV footage.
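A small Keras model reflecting the described grouping, convolutional layers ('Ram') followed by fully connected layers ('Laxman'), might be sketched as below; the layer sizes and the two-class mask/no-mask output are assumptions, since the paper's exact architecture is not given here.

```python
# Illustrative Keras CNN with the Ram/Laxman grouping described above.
# Input size, filter counts, and the 2-class output are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    # "Ram": convolutional feature extractor
    layers.Input(shape=(128, 128, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    # "Laxman": fully connected classifier
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(2, activation="softmax"),   # mask / no mask
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In a deployed system, the trained model would be applied frame by frame to detected faces in the CCTV stream, with a separate geometric check for the distance between people.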


2022 ◽  
Vol 7 (4) ◽  
pp. 5871-5894
Author(s):  
Daniel Clemente-López ◽  
Esteban Tlelo-Cuautle ◽  
Luis-Gerardo de la Fraga ◽  
José de Jesús Rangel-Magdaleno ◽  
...  

The optimization of fractional-order (FO) chaotic systems is challenging when simulating a considerable number of cases for long times, where the primary problem is verifying whether the given parameter values will generate chaotic behavior. In this manner, we introduce a methodology for detecting chaotic behavior in FO systems through the analysis of Poincaré maps. The optimization process is performed applying differential evolution (DE) and accelerated particle swarm optimization (APSO) algorithms for maximizing the Kaplan-Yorke dimension ($ D_{KY} $) of two case studies: a 3D and a 4D FO chaotic system with hidden attractors. These FO chaotic systems are solved applying the Grünwald-Letnikov method, and the Numba just-in-time (jit) compiler is used to improve the optimization process's execution time in the Python programming language. The optimization results show that the proposed method efficiently optimizes FO chaotic systems with hidden attractors while saving execution time.
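A minimal sketch of a Grünwald-Letnikov solver accelerated with Numba's jit compiler, as the abstract describes, might look like the following; the right-hand side below is a classic Genesio-Tesi-type placeholder, not the paper's 3D/4D hidden-attractor systems.

```python
# Grünwald-Letnikov scheme for a fractional-order system, jit-compiled with
# Numba. The system f() and all parameter values are illustrative only.
import numpy as np
from numba import njit

@njit
def f(x, out):
    # placeholder Genesio-Tesi-type right-hand side (not the paper's system)
    out[0] = x[1]
    out[1] = x[2]
    out[2] = -6.0 * x[0] - 2.92 * x[1] - 1.2 * x[2] + x[0] ** 2

@njit
def gl_solve(alpha, h, n, x0):
    # binomial coefficients: c_0 = 1, c_j = (1 - (1 + alpha)/j) * c_{j-1}
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = (1.0 - (1.0 + alpha) / j) * c[j - 1]
    dim = x0.shape[0]
    x = np.zeros((n + 1, dim))
    x[0] = x0
    rhs = np.empty(dim)
    h_alpha = h ** alpha
    for k in range(1, n + 1):
        f(x[k - 1], rhs)
        mem = np.zeros(dim)
        for j in range(1, k + 1):        # memory term over the full history
            mem = mem + c[j] * x[k - j]
        x[k] = rhs * h_alpha - mem
    return x

traj = gl_solve(0.98, 0.005, 2000, np.array([0.1, 0.1, 0.1]))
print(traj[-1])
```

The full-memory sum makes each run O(n²), which is exactly why jit compilation matters when an optimizer such as DE or APSO must evaluate thousands of candidate parameter sets.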


2021 ◽  
Vol 3 (1) ◽  
pp. 61-66
Author(s):  
Ihor Farmaha ◽  
Viktor Hadomskyi ◽  

This paper is devoted to the development of software for time series forecasting using the Python programming language. The SARIMA model was used to develop the system.
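A minimal sketch of SARIMA-based forecasting with statsmodels, the model family named above; the input series and the (p, d, q)(P, D, Q, s) orders are illustrative placeholders, not the authors' configuration.

```python
# Illustrative SARIMA forecast with statsmodels. "series.csv" and the model
# orders are placeholders; in practice the orders are chosen from the data.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

series = pd.read_csv("series.csv", index_col=0, parse_dates=True).squeeze()
model = SARIMAX(series, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
fitted = model.fit(disp=False)
forecast = fitted.forecast(steps=12)    # forecast the next 12 periods
print(forecast)
```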


2021 ◽  
Vol 15 (3) ◽  
pp. 205-215
Author(s):  
Gurjot Singh Mahi ◽  
Amandeep Verma

Web crawlers are as old as the Internet and are most commonly used by search engines to visit websites and index them into repositories. They are not limited to search engines but are also widely utilized to build corpora in different domains and languages. This study developed a focused set of web crawlers for three Punjabi news websites. The web crawlers were developed to extract quality text articles and add them to a local repository to be used in further research. The crawlers were implemented using the Python programming language and were utilized to construct a corpus of more than 134,000 news articles in nine different news genres. The crawler code and extracted corpora were made publicly available to the scientific community for research purposes.
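A hypothetical sketch of such a focused crawler using requests and BeautifulSoup follows; the URL and CSS selectors are placeholders, since each of the authors' three crawlers targets a specific site's page structure.

```python
# Hypothetical focused news crawler: collect article links from a listing
# page, then fetch each article's title and body. The domain, selectors,
# and politeness delay are illustrative placeholders.
import time
import requests
from bs4 import BeautifulSoup

def crawl_article_links(listing_url):
    """Collect article URLs from a news listing page."""
    html = requests.get(listing_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.select("a.article-link") if a.get("href")]

def extract_article(url):
    """Fetch one article and return its title and body text."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.select_one("h1").get_text(strip=True)
    body = " ".join(p.get_text(strip=True) for p in soup.select("div.story p"))
    return {"url": url, "title": title, "text": body}

corpus = []
for link in crawl_article_links("https://example-news-site.example/latest"):
    corpus.append(extract_article(link))
    time.sleep(1)    # politeness delay between requests
print(len(corpus), "articles collected")
```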

