Introduction to Video Game Engine Development

2021
Author(s):  
Victor G Brusca
2014
Vol 169
pp. 443-453
Author(s):  
Jeremiah J. Shepherd
Lingxi Zhou
William Arndt
Yan Zhang
W. Jim Zheng
...  

Growing evidence indicates that the 3D conformation of eukaryotic genomes is a critical part of genome function. However, due to the lack of accurate and reliable 3D genome structural data, this information is largely ignored, and most studies have to rely on information systems that represent DNA as a linear structure. Visualizing genomes in real-time 3D can give researchers more insight, but doing so runs into hardware limitations, since each element carries vast amounts of information that cannot be processed on the fly. Using a game engine and sophisticated video game visualization techniques enables us to construct a multi-platform, real-time 3D genome viewer. The game-engine-based viewer achieves much better rendering speed and handles much larger amounts of data than our previous OpenGL implementation. Combining this viewer with 3D genome models derived from experimental data could provide unprecedented opportunities to gain insight into the conformation–function relationships of a genome.
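
The abstract does not name the specific visualization techniques borrowed from games; one standard trick for keeping very large scenes interactive is distance-based level of detail (LOD), where far-away geometry is drawn with coarser meshes. The Python sketch below illustrates only that general idea; the ChromatinSegment class, the distance thresholds, and the mesh names are hypothetical and not taken from the published viewer.

```python
# Distance-based LOD selection: a minimal sketch, not the published viewer's code.
import math
from dataclasses import dataclass

@dataclass
class ChromatinSegment:
    """One bead/segment of a 3D genome model (hypothetical structure)."""
    x: float
    y: float
    z: float
    # Pre-built meshes at decreasing detail: index 0 = full, 2 = coarsest.
    lod_meshes: tuple = ("high", "medium", "low")

def select_lod(segment: ChromatinSegment, cam_x: float, cam_y: float, cam_z: float,
               thresholds=(50.0, 200.0)) -> str:
    """Pick a mesh resolution from camera distance, so distant segments
    cost far fewer triangles per rendered frame."""
    d = math.dist((segment.x, segment.y, segment.z), (cam_x, cam_y, cam_z))
    if d < thresholds[0]:
        return segment.lod_meshes[0]
    if d < thresholds[1]:
        return segment.lod_meshes[1]
    return segment.lod_meshes[2]

# Example: a nearby segment renders at full detail, a distant one coarsely.
segments = [ChromatinSegment(0, 0, 10), ChromatinSegment(0, 0, 500)]
print([select_lod(s, 0, 0, 0) for s in segments])  # ['high', 'low']
```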


Author(s):  
Sarika Chaudhary
Shalini Bhaskar Bajaj
Aman Jatain
Pooja Nagpal

Game controllers have been designed and refined over the years to be as easy to use as possible. A game controller is a device used with a game or entertainment system to provide input to a video game, typically to control an object or character in the game. Input devices that have been classified as game controllers include keyboards, mice, gamepads, joysticks, and so on. Some controllers are deliberately designed to suit one type of game, for example, steering wheels for driving games, dance pads for dancing games, and light guns for shooting games. The aim here is to create a virtual environment in which the user interacts with a gaming application through various gesture controls. A gesture is an action that can be perceived by another party (here, a PC) and conveys some piece of information. To create a virtual gaming environment, we first need a real-time gaming application. We design our 2D and 3D gaming applications in the Unity 3D video game engine. The data used in this project come primarily from the EgoHands dataset. After an input has been taken and the consequent action has been performed, we use this activity for further development of the model with TensorFlow. Input is captured through the PC's webcam, which is accessed and connected to the gaming application and the hands dataset via WebGL. WebGL is a JavaScript API for rendering interactive 2D and 3D graphics within any compatible web browser without the use of plug-ins.
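
As a rough illustration of the capture-and-classify loop described above, the Python sketch below grabs webcam frames and runs them through a Keras classifier. It is a minimal sketch under stated assumptions: the model file "hand_gesture_model.h5", the 224x224 input size, and the gesture labels are placeholders, and the authors' actual pipeline runs in the browser through WebGL rather than through OpenCV.

```python
# Minimal webcam-to-gesture loop; file name, input size and labels are placeholders.
import cv2
import numpy as np
import tensorflow as tf

GESTURES = ["open_palm", "fist", "point"]                 # hypothetical label set
model = tf.keras.models.load_model("hand_gesture_model.h5")

cap = cv2.VideoCapture(0)                                 # default webcam
for _ in range(300):                                      # process ~300 frames, then stop
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalise the frame to the classifier's expected input.
    x = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    probs = model.predict(x[np.newaxis], verbose=0)[0]
    gesture = GESTURES[int(np.argmax(probs))]
    print("forward to the game:", gesture)                # would be sent to the Unity app
cap.release()
```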


2021
pp. 1-18
Author(s):  
Kelvin Sung
Jebediah Pavleas
Matthew Munson
Jason Pace

2020
Vol 79 (17-18)
pp. 12307-12328
Author(s):  
Miguel Chover
Carlos Marín
Cristina Rebollo
Inmaculada Remolar

2021
Author(s):
Mohsin Ali

The technology of today, such as the Oculus Rift, can provide immersion in ways that were unachievable in the past. The Oculus Rift is a virtual reality headset that allows the user to see a three-dimensional world without the use of a traditional monitor. Unlike television, computer, and mobile screens, a virtual reality headset digitally transports the user into the environment. Functionality such as depth tracking and rotational head tracking provides immersion unlike anything experienced to date. My interest is to investigate interactive storytelling in combination with the Oculus Rift, to determine whether virtual reality headsets can enrich storytelling experiences. This will be achieved by developing an application in which interactive storytelling is compatible with the Oculus Rift, and by testing that application with participants. Finally, a conclusion will be drawn from the data collected from participants. Alongside the written thesis, a digital application will be produced in Unreal Engine 4 (a video game engine). The application will be an Oculus Rift-driven experience, meaning that users can only experience it through an Oculus Rift. The application will have an interactive plot, which allows the user to influence the storyline. The design will be iterative and will be refined after each user-testing session. The application is intended to support the theories and concepts presented in the written section of the thesis.


Author(s):  
Kelvin Sung
Jebediah Pavleas
Fernando Arnez
Jason Pace

2014
Vol 2014
pp. 1-12
Author(s):  
Dario Maggiorini
Laura Anna Ripamonti
Federico Sauro

Video games are (also) real-time interactive graphic simulations: hence, providing a convincing physics simulation for each specific game environment is of paramount importance for achieving a satisfying player experience. While existing game engines appropriately address many aspects of physics simulation, others still need improvement. In particular, several specific physical properties of bodies not usually involved in the main game mechanics (e.g., properties useful to represent systems composed of soft bodies) are often poorly rendered by general-purpose engines. This issue may limit game designers when imagining innovative and compelling video games and game mechanics. For this reason, we dug into the problem of appropriately representing soft bodies. Subsequently, we extended the approach developed for soft bodies to rigid ones, proposing and developing a unified approach in a game engine: Sulfur. To test the engine, we also designed and developed "Escape from Quaoar," a prototype video game whose main game mechanic exploits an elastic rope, and a level editor for the game.
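
The abstract does not detail how Sulfur represents soft bodies internally; a common way to approximate an elastic rope like the one in "Escape from Quaoar" is a chain of particles joined by distance constraints and advanced with Verlet integration. The Python sketch below shows that generic technique under those assumptions; it is not Sulfur's implementation, and the constants (particle count, rest length, iteration counts) are illustrative.

```python
# Generic particle-and-constraint rope: a sketch, not Sulfur's actual code.
import numpy as np

N = 20                                   # number of rope particles
REST = 0.5                               # rest length between neighbours
DT = 1.0 / 60.0                          # one physics step per rendered frame
GRAVITY = np.array([0.0, -9.81])
ANCHOR = np.array([0.0, 0.0])            # the rope hangs from this point

pos = np.array([[i * REST, 0.0] for i in range(N)], dtype=float)
prev = pos.copy()                        # previous positions drive Verlet integration

def step():
    """Advance the rope by one Verlet step plus constraint relaxation."""
    global pos, prev
    new = 2 * pos - prev + GRAVITY * DT ** 2
    prev, pos = pos.copy(), new
    for _ in range(10):                  # relax distance constraints a few times
        pos[0] = ANCHOR                  # keep the first particle pinned
        for i in range(N - 1):
            delta = pos[i + 1] - pos[i]
            dist = np.linalg.norm(delta)
            if dist == 0.0:
                continue
            correction = 0.5 * (dist - REST) / dist * delta
            pos[i] += correction         # pull both particles toward the rest length
            pos[i + 1] -= correction
    pos[0] = ANCHOR

for _ in range(120):                     # simulate two seconds of rope motion
    step()
print("rope tip after 2 s:", pos[-1])
```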


2021
Vol 8 (Supplement_1)
pp. S576-S577
Author(s):  
Gregory E Souza
Flávio Henrique Batista de Souza
Marconi A Aguiar dos Reis
Raoni A Dorim
Elisângela C Cristine Oliveira Gonçalves
...  

Background: Brazilian authorities reported a total of 16.3 million cases and 454,000 deaths during the COVID-19 pandemic in Brazil by May 2021. It became necessary to educate healthcare professionals on diagnosis and treatment of the syndrome. Game-based learning surfaced as an effective alternative, since it promotes critical thinking and problem-solving skills. A team of Brazilian and Peruvian students, physicians, designers, and programmers gathered to create a decision-based computer game that simulates a hospital scenario and allows medical students to analyze, make decisions, and receive feedback. This work describes the creative process and showcases the initial version of the software.
Methods: Professors and students of Medicine, Information Technology (IT), Design, and Architecture from Brazil and Peru assembled a team to develop the computer game. Clinical cases were created by the medical students and professors, comprising medical procedures for the treatment and management of COVID-19, and a video game script was developed exploring the gamification principles of challenge, objectivity, persistence, failure, reward, and feedback. Algorithms (image 1) were created, under the supervision of professors of Medicine, to define possible courses of action and outcomes (e.g., gain or loss of points, improvement or worsening of the patient). Students of Design created artistic elements, and IT students programmed the game with a game engine. The flowchart, written in Portuguese, describes in detail all the possible courses of action that can be exercised by the player. It was created by a team of professors of Medicine and medical students, in accordance with evidence-based guidelines, and primarily guides the programmers and designers throughout the development phase of the game.
Results: Initially, an expandable minimum viable product was obtained. The game, visualized in image 2, consists of a non-playable character and a playable character (i.e., the doctor), with a scenario and a dialogue script simulating a clinical examination of a COVID-19 patient. The player can interact with certain elements within the game, e.g., the computer and other characters, to retrieve test results or start dialogues with relevant information. The hospital scenario and the dialogue window between doctor (player, in black) and patient (non-playable character) are displayed in the game engine (Unity 2D). On the bottom half of the screen, the dialogue box allows the player to collect the patient's medical history. The player can interact with certain elements to obtain relevant information, make decisions, and progress in the game.
Conclusion: The game allows medical students to practice diagnosis and treatment of COVID-19. Future versions will include assessment reports of the player's actions, and a new scoring system will be implemented. New diseases will be incorporated into the gameplay to match the variety of scenarios offered by real hospitals and patients. Artificial intelligence will be employed to optimize gameplay, feedback, and learning.
Disclosures: All authors: no reported disclosures.
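
As a sketch of the branching structure such a decision-based clinical case might use, the Python fragment below models each step as a node whose choices carry a score change and a next node, mirroring the gain/loss-of-points outcomes described above. The class name, example choices, and point values are illustrative placeholders, not the team's actual algorithm or clinical guidance.

```python
# Decision-tree sketch for a branching clinical case; names and values are illustrative.
from dataclasses import dataclass, field

@dataclass
class DecisionNode:
    prompt: str
    # Maps a player choice to (score change, next node).
    choices: dict = field(default_factory=dict)

ending_good = DecisionNode("Patient improves and is discharged.")
ending_bad = DecisionNode("Patient worsens and needs intensive care.")
triage = DecisionNode(
    "Patient reports fever, cough and low oxygen saturation. What do you do?",
    {
        "request chest imaging and start oxygen": (+10, ending_good),
        "discharge with symptomatic treatment": (-10, ending_bad),
    },
)

def play(node: DecisionNode, picks):
    """Walk the decision tree with a scripted list of choices."""
    score = 0
    for pick in picks:
        delta, node = node.choices[pick]
        score += delta
        print(f"-> {node.prompt} (score {score:+d})")
    return score

play(triage, ["request chest imaging and start oxygen"])
```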


2018
Author(s):  
Marcus R. Watson
Benjamin Voloh
Christopher Thomas
Asif Hasan
Thilo Womelsdorf

Background: There is a growing interest in complex, active, and immersive behavioral neuroscience tasks. However, the development and control of such tasks present unique challenges.
New Method: The Unified Suite for Experiments (USE) is an integrated set of hardware and software tools for the design and control of behavioral neuroscience experiments. The software, developed using the Unity video game engine, supports both active tasks in immersive 3D environments and the static 2D tasks used in more traditional visual experiments. The custom USE SyncBox hardware, based around an Arduino Mega 2560 board, integrates and synchronizes multiple data streams from different pieces of experimental hardware. The suite addresses three key issues in developing cognitive neuroscience experiments in Unity: tight experimental control, accurate sub-millisecond timing, and accurate gaze target identification.
Results: USE is a flexible framework for realizing experiments, enabling (i) nested control over complex tasks, (ii) flexible use of 3D or 2D scenes and objects, (iii) touchscreen-, button-, joystick- and gaze-based interaction, and (iv) complete offline reconstruction of experiments for post-processing and temporal alignment of data streams.
Comparison with Existing Methods: Most existing experiment-creation tools are not designed to support the development of video-game-like tasks. Those that do are built on older or less popular game engines and are neither as feature-rich as USE nor able to control timing as precisely.
Conclusions: USE provides an integrated, open-source framework for a wide variety of active behavioral neuroscience experiments using human and nonhuman participants, as well as artificially intelligent agents.
Glossary
Active task: An experimental task involving some combination of realistic (usually moving) stimuli, continuous opportunities for action, ecologically valid tasks, and complex behaviors. Here, contrasted with static tasks (see below).
Arduino: A multi-purpose generic microprocessor board, here used to control inter-device communication and time synchronization.
Raycast: A game-engine method that sends a vector between two points in a virtual three-dimensional environment and returns the first object it hits. Often used to determine whether a character in a game can see or shoot another character.
State machine (also finite state machine): A way of conceptualizing and implementing control in software such that, at any moment, the software is in one, and only one, state. In hierarchical state machines, as used in the present suite, states are organized into levels: each level can be in only one state, but a state can pass control to a lower level.
Static task: An experimental task like those traditionally used in the cognitive neurosciences: simple, usually stationary, stimuli, limited opportunities for action, and simple behaviors. Here, contrasted with active tasks (see above).
Unity: One of the most popular video game engines. Freely available.
Video game engine: A software development kit designed to handle many of the common issues involved in creating video games, such as interfacing with controllers and simulating physical collisions and lighting.
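
As a rough Python sketch of the hierarchical state machine idea described in the glossary (each level is in exactly one state, and a state can hand control to a lower level), the fragment below steps a toy session/block/trial hierarchy one leaf state per tick. The class, method, and state names are illustrative assumptions, not part of the USE API.

```python
# Toy hierarchical state machine: one active state per level, leaves run one per tick.
class State:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []   # ordered child states (a lower level)
        self.index = 0                    # which child is currently active

    def tick(self):
        """Run one update; return True while this state still has work to do."""
        if not self.children:             # leaf state: do its work once
            print("running", self.name)
            return False
        # Exactly one child is active at a time; advance when it finishes.
        if not self.children[self.index].tick():
            self.index += 1
        return self.index < len(self.children)

# Session > block > trial hierarchy, as in a typical behavioral task.
def trial(i):
    return State(f"trial {i}", [State("fixation"), State("stimulus"), State("feedback")])

session = State("session", [State("block 1", [trial(1), trial(2)])])
while session.tick():                     # each tick runs the next leaf state
    pass
```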


Author(s):  
Cecile Meier
Jose Luis Saorín
Alejandro Bonnet de León
Alberto Guerrero Cobos

This paper describes an experience of incorporating virtual routes around the sculptural heritage of a city into the classroom by developing a simulation of the urban environment with a video game engine. Video game engines allow not only the creation of video games but also the creation and navigation of interactive three-dimensional worlds. For this research, Roblox Studio has been used, a simple and intuitive program for which no previous programming skills are required. During the 2018/2019 academic year, a pilot experience was carried out with 53 secondary school students, who were given the task of designing a virtual environment in which they had to include 3D models of the sculptural heritage of the city of Santa Cruz de Tenerife. Before starting the experience, the participants answered a questionnaire to gauge their prior knowledge about the creation of video games. Once the activity was finished, and in order to evaluate its results, the participants answered a final questionnaire. The students emphasized that after the activity they were more aware of the sculptural heritage of Santa Cruz and considered themselves capable of creating their own interactive worlds with Roblox.

