Six-in-a-Row Artificial Intelligence Design and Its Implementation Based on Java

2021 ◽  
Vol 14 (4) ◽  
pp. 47
Author(s):  
Liujue Zhang

Using the Java language on the Eclipse platform, we designed a chess-style board game with a new rule, "Six in a Row," which is more complex than Five-in-a-Row, and added a man-machine battle mode to the program. Based on practical programming experience, this paper introduces the algorithm behind the game. The basic idea of the algorithm is to create a scoring table and to score every point on the chessboard by traversal; the machine then places its piece on the point with the highest score. The algorithm is straightforward to implement, and the machine achieves a high rate of winning when playing against a human.
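The traversal-and-scoring idea described above can be sketched roughly as follows. This is a minimal Java illustration only: the 19 × 19 board size, the weight table indexed by the number of stones already in line, the attack-plus-defence scoring, and all class and method names are placeholder assumptions, not the paper's actual scoring table or implementation.

```java
/**
 * Minimal sketch of the scoring approach described above: every empty point on the
 * board is scored by traversal and the machine plays the highest-scoring point.
 * Board size, weights and names are illustrative assumptions.
 */
public class SixInARowAI {
    static final int SIZE = 19;                    // assumed board size
    static final int EMPTY = 0, HUMAN = 1, MACHINE = 2;
    // Assumed weights indexed by the number of same-colour stones already in line.
    static final int[] WEIGHT = {1, 10, 100, 1_000, 10_000, 100_000, 1_000_000};

    final int[][] board = new int[SIZE][SIZE];

    /** Traverse every empty point, score it, and return the best move as {row, col}. */
    int[] bestMove() {
        int bestScore = -1;
        int[] best = null;
        for (int r = 0; r < SIZE; r++) {
            for (int c = 0; c < SIZE; c++) {
                if (board[r][c] != EMPTY) continue;
                // Score both colours so the machine attacks and defends at once.
                int score = scorePoint(r, c, MACHINE) + scorePoint(r, c, HUMAN);
                if (score > bestScore) {
                    bestScore = score;
                    best = new int[]{r, c};
                }
            }
        }
        return best;
    }

    /** Score one point for one colour by counting stones along the four line directions. */
    int scorePoint(int row, int col, int colour) {
        int[][] dirs = {{0, 1}, {1, 0}, {1, 1}, {1, -1}};
        int score = 0;
        for (int[] d : dirs) {
            int count = countDir(row, col, d[0], d[1], colour)
                      + countDir(row, col, -d[0], -d[1], colour);
            score += WEIGHT[Math.min(count, 6)];
        }
        return score;
    }

    /** Count consecutive stones of the given colour starting next to (row, col). */
    int countDir(int row, int col, int dr, int dc, int colour) {
        int count = 0, r = row + dr, c = col + dc;
        while (r >= 0 && r < SIZE && c >= 0 && c < SIZE && board[r][c] == colour) {
            count++;
            r += dr;
            c += dc;
        }
        return count;
    }
}
```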

2019 ◽  
Vol 6 (2) ◽  
pp. 127-134
Author(s):  
Takudzwa Fadziso

Agriculture has a critical role to play in the financial domain. Likewise, the automation of multiple processes in agriculture has been a great concern as well as an alarming subject across the world. The world's population is growing at a high rate, and with this increase, the demand for agriculture and its jobs is also growing exponentially. The usual techniques used by farmers are not efficient enough to meet these requirements, so new digital techniques have been introduced. These new strategies support the proper management of agricultural products and services so that farmers can make the most of technology to increase their profit rates. AI has initiated a revolutionary change in the agricultural landscape. It has guarded harvest yields against declining factors such as environmental changes, overpopulation, dynamic business demands, and food safety issues. By using artificial intelligence, we can foster smart farming practices that limit farmers' losses and give them higher returns. Using artificial intelligence platforms, one can collect an enormous amount of information from government and public sites, or monitor and collect different information in real time by utilizing the Internet of Things (IoT); this information can then be analyzed with precision to empower farmers to resolve the issues they face in the agricultural area. This research was conducted to help local farmers everywhere in the world manage their agricultural practices more effectively. The strategy discussed in this paper leverages the waterfall methodology to plan and create a sufficiently smart system through a sequential cycle that starts with data collection, requirement analysis, design, coding, and testing, and finally implements the system as a whole. This system can also be used to foster ideas for managing common issues in agricultural information systems, to improve policy programs, augmentation, and analysis practices, and to manage data on agriculture. Finally, conclusions about agricultural information systems are discussed, and suggestions for the further development of agricultural data systems are presented.


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Hassaan Haider Syed ◽  
Muhammad Attique Khan ◽  
Usman Tariq ◽  
Ammar Armghan ◽  
Fayadh Alenezi ◽  
...  

The excessive number of COVID-19 cases reported worldwide so far, supplemented by a high rate of false alarms in diagnosis using the conventional polymerase chain reaction method, has led to an increased number of high-resolution computed tomography (CT) examinations. Manual inspection of these scans, besides being slow, is susceptible to human error, especially because of the uncanny resemblance between CT scans of COVID-19 and those of pneumonia, and therefore demands a proportional increase in the number of expert radiologists. Artificial intelligence-based computer-aided diagnosis of COVID-19 using CT scans has recently been proposed and has proven effective in terms of accuracy and computation time. In this work, a similar framework for the classification of COVID-19 using CT scans is proposed. The proposed method includes four core steps: (i) preparing a database of three classes: COVID-19, pneumonia, and normal; (ii) modifying three pretrained deep learning models (VGG16, ResNet50, and ResNet101) for the classification of COVID-19-positive scans; (iii) proposing an activation function and improving the firefly algorithm for feature selection; and (iv) fusing the optimal selected features using a descending-order serial approach and classifying them with multiclass supervised learning algorithms. We demonstrate that, on a publicly available dataset, this system attains an improved accuracy of 97.9% with a computational time of approximately 34 seconds.
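Step (iv) is described only as a "descending-order serial approach." One plausible reading is sketched below in Java: each selected feature vector is sorted in descending order and the sorted vectors are concatenated serially into one fused vector before being handed to a multiclass classifier. The class name, helper methods, and toy values are assumptions for illustration, not the authors' code.

```java
import java.util.Arrays;
import java.util.Comparator;

/**
 * Illustrative sketch of descending-order serial feature fusion: feature vectors
 * selected from the different networks are each sorted in descending order and then
 * serially concatenated. This is one interpretation of the step, not the exact method.
 */
public class SerialFeatureFusion {

    /** Return a copy of the vector sorted in descending order. */
    static double[] sortDescending(double[] features) {
        Double[] boxed = Arrays.stream(features).boxed().toArray(Double[]::new);
        Arrays.sort(boxed, Comparator.reverseOrder());
        double[] sorted = new double[boxed.length];
        for (int i = 0; i < boxed.length; i++) sorted[i] = boxed[i];
        return sorted;
    }

    /** Serially concatenate the descending-sorted vectors from each model. */
    static double[] fuse(double[]... selectedFeatureSets) {
        int total = Arrays.stream(selectedFeatureSets).mapToInt(f -> f.length).sum();
        double[] fused = new double[total];
        int offset = 0;
        for (double[] set : selectedFeatureSets) {
            double[] sorted = sortDescending(set);
            System.arraycopy(sorted, 0, fused, offset, sorted.length);
            offset += sorted.length;
        }
        return fused;   // the fused vector would then be fed to a multiclass classifier
    }

    public static void main(String[] args) {
        // Toy vectors standing in for features selected from VGG16 and ResNet50.
        double[] vggFeatures = {0.2, 0.9, 0.5};
        double[] resNetFeatures = {0.7, 0.1};
        System.out.println(Arrays.toString(fuse(vggFeatures, resNetFeatures)));
        // -> [0.9, 0.5, 0.2, 0.7, 0.1]
    }
}
```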


Author(s):  
Youngsang Kim ◽  
Hoonsik Yoo

We analyzed international differences in preferences for the two-dimensional (2D) versus three-dimensional (3D) and male versus female external appearance of artificial intelligence (AI) agents for use in self-driving automobiles. We recruited 823 participants in five countries (South Korea, the United States, China, Russia, and Brazil), who completed a survey. South Korean, Chinese, and North American respondents preferred a 2D appearance of the AI agent, which appears to result from the religious or philosophical views held in countries with a large or growing number of Christians, whereas Brazilian and Russian respondents preferred a 3D appearance. Brazilian respondents' high rate of functional illiteracy may explain this finding; however, it was difficult to identify the reason for the Russian preference. Furthermore, men in all five countries preferred female AI agents, and South Korean, Chinese, and Russian women also preferred female agents, whereas women in the United States and Brazil preferred male agents. These findings may offer valuable guidelines for the design of personalized AI agent appearance, taking into account differences in preferences between countries and by gender.


Author(s):  
Jose L. Salmeron ◽  
Cristina Lopez

Many uncertainties can influence the success of Information Technology (IT) and Information Systems (IS) projects, which are characterized as highly complex and risky, among other issues. These features explain the high rate of failure in this kind of project. Thus, if practitioners want to prevent undesired outcomes in their IT/IS projects, they have to manage the risks within them continuously. In particular, practitioners should monitor the impact of risks on IT/IS project success. However, the methods currently used for this have several limitations that can be overcome by employing artificial intelligence techniques. Based on fuzzy theory, this chapter proposes the use of fuzzy approaches to model the effects of risks on IT/IS project success measures. Their applicability is presented through an illustrative case. The findings highlight that the proposed method gives project managers insight into the causes of failure or delay in their IT/IS projects, so that they can develop effective strategies.
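One widely used fuzzy approach to modelling how risk concepts causally affect success measures is a fuzzy cognitive map. The chapter's exact formulation is not given in the abstract, so the Java sketch below only illustrates that general idea: concept values in [0, 1] are propagated through a causal weight matrix and squashed with a sigmoid over a fixed number of update steps. The concepts, weights, and squashing parameter are invented for the example.

```java
/**
 * Illustrative fuzzy-cognitive-map style sketch of propagating risk effects onto
 * project-success concepts. All concepts and weights are assumptions for illustration.
 */
public class FuzzyRiskModel {

    // Concepts: 0 = scope-creep risk, 1 = staff-turnover risk, 2 = schedule success, 3 = budget success.
    static final double[][] WEIGHTS = {
        {0.0, 0.3, -0.7, -0.5},   // scope creep reinforces turnover, hurts schedule and budget
        {0.0, 0.0, -0.4, -0.3},   // turnover hurts schedule and budget
        {0.0, 0.0,  0.0,  0.4},   // schedule success supports budget success
        {0.0, 0.0,  0.0,  0.0}
    };

    /** Common sigmoid squashing function (steepness lambda = 5). */
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-5.0 * x));
    }

    /** Update the map for a fixed number of steps (a convergence check could replace this). */
    static double[] run(double[] initial) {
        double[] state = initial.clone();
        for (int step = 0; step < 50; step++) {
            double[] next = new double[state.length];
            for (int j = 0; j < state.length; j++) {
                double sum = state[j];                       // keep the concept's own memory
                for (int i = 0; i < state.length; i++) {
                    sum += state[i] * WEIGHTS[i][j];
                }
                next[j] = sigmoid(sum);
            }
            state = next;
        }
        return state;
    }

    public static void main(String[] args) {
        // High scope-creep risk and moderate turnover risk at the project start.
        double[] end = run(new double[]{0.8, 0.5, 0.5, 0.5});
        System.out.printf("schedule success: %.2f, budget success: %.2f%n", end[2], end[3]);
    }
}
```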


2021 ◽  
Vol 11 (9) ◽  
pp. 886
Author(s):  
Ken Asada ◽  
Masaaki Komatsu ◽  
Ryo Shimoyama ◽  
Ken Takasawa ◽  
Norio Shinkai ◽  
...  

The coronavirus disease 2019 (COVID-19) pandemic began at the end of December 2019, giving rise to a high rate of infections and causing COVID-19-associated deaths worldwide. It was first reported in Wuhan, China, and since then, not only global leaders, organizations, and pharmaceutical/biotech companies, but also researchers, have directed their efforts toward overcoming this threat. The use of artificial intelligence (AI) has recently surged internationally and has been applied to diverse aspects of many problems. The benefits of using AI are now widely accepted, and many studies have shown great success in medical research on tasks such as the classification, detection, and prediction of disease, or even of patient outcomes. In fact, AI technology has been actively employed in various ways in COVID-19 research, and several clinical applications of AI-equipped medical devices for the diagnosis of COVID-19 have already been reported. Hence, in this review, we summarize the latest studies that use AI for medical imaging analysis, drug discovery, therapeutics such as vaccine development, and public health decision-making. This survey clarifies the advantages of using AI in the fight against COVID-19 and provides future directions for tackling the pandemic using AI techniques.


2021 ◽  
Vol 11 (10) ◽  
pp. 4624
Author(s):  
Andrea Zingoni ◽  
Juri Taborri ◽  
Valentina Panetti ◽  
Simone Bonechi ◽  
Pilar Aparicio-Martínez ◽  
...  

Specific learning disorders affect a significant portion of the population. About 80% of these cases involve dyslexia, which causes significant difficulties in learning skills related to reading, memorization, and the exposition of concepts. Whereas great efforts have been made to diagnose dyslexia and to mitigate its effects at primary and secondary school, little has been done at the university level. This has resulted in a considerably high rate of dropout or even of failure to enroll. The VRAIlexia project was created to face this problem by creating and popularizing an innovative method of teaching that is inclusive for dyslexic students. The core of the project is BESPECIAL, a software platform based on artificial intelligence and virtual reality that is capable of understanding the main issues experienced by dyslexic students and of providing them with ad hoc digital support methodologies in order to ease the difficulties they face in their academic studies. The aim of this paper is to present the conceptual design of BESPECIAL, highlighting the role of each module that composes it and the potential of the whole platform to fulfil the aims of VRAIlexia. Preliminary results obtained from a sample of about 700 dyslexic students are also reported, which clearly show the main issues and needs that dyslexic students experience; these will be used as guidelines for the final implementation of BESPECIAL.


Author(s):  
L. E. Murr ◽  
G. Wong

Palladium single-crystal films have been prepared by Matthews in ultra-high vacuum by evaporation onto (001) NaCl substrates cleaved in situ and maintained at ∼350°C. Murr has also produced large-grained and single-crystal Pd films by high-rate evaporation onto air-cleaved (001) NaCl substrates at 350°C. In the present work, very large (∼3 cm²), continuous single-crystal films of Pd have been prepared by flash evaporation onto air-cleaved (001) NaCl substrates at temperatures at or below 250°C. Evaporation rates, estimated to be ≥2000 Å/sec, were obtained by effectively short-circuiting 1-mil tungsten evaporation boats in a self-regulating system that maintained an optimum load current of approximately 90 amperes, corresponding to a current density through the boat of ∼4 × 10⁴ amperes/cm².
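As a back-of-the-envelope consistency check of the figures quoted above (our own calculation, not part of the original abstract), the stated current and current density imply a boat cross-section of

$$A = \frac{I}{J} \approx \frac{90\ \mathrm{A}}{4 \times 10^{4}\ \mathrm{A/cm^{2}}} \approx 2.3 \times 10^{-3}\ \mathrm{cm^{2}},$$

which, for a 1-mil (≈2.5 × 10⁻³ cm) foil thickness, corresponds to a boat width of roughly 0.9 cm, a plausible size for a tungsten evaporation boat.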


Author(s):  
A. Elgsaeter ◽  
T. Espevik ◽  
G. Kopstad

The importance of a high rate of temperature decrease ("rapid freezing") when freezing specimens for freeze-etching has long been recognized [1]. The two basic methods for achieving rapid freezing are: 1) dropping the specimen onto a metal surface at low temperature, and 2) bringing the specimen instantaneously into thermal contact with a liquid at low temperature and subsequently maintaining a high relative velocity between the liquid and the specimen. Over the last couple of years, the first method has received strong renewed interest, particularly as a result of a series of important studies by Heuser and coworkers [2,3]. In this paper we will compare these two freezing methods theoretically and experimentally.

