Bifurcations in valveless pumping techniques from a coupled fluid-structure-electrophysiology model in heart development

BIOMATH ◽  
2017 ◽  
Vol 6 (2) ◽  
pp. 1711297 ◽  
Author(s):  
Nicholas Anthony Battista ◽  
Laura Ann Miller

We explore an embryonic heart model that couples electrophysiology and muscle-force generation to fluid flow using a two-dimensional fluid-structure interaction framework based on the immersed boundary method. The propagation of action potentials is coupled to muscular contraction and hence to the overall pumping dynamics. In contrast to previous models, the electro-dynamical model does not use prescribed motion to initiate pumping; instead, the pumping dynamics are fully coupled to an underlying electrophysiology model governed by the FitzHugh-Nagumo equations. Perturbing the diffusion parameter in the FitzHugh-Nagumo model leads to a bifurcation in the dynamics of action potential propagation. This bifurcation captures a spectrum of pumping regimes, with dynamic suction pumping and peristaltic-like pumping at the extremes. We find that more bulk flow is produced in the peristaltic-like pumping regime.
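The diffusion-parameter bifurcation described above can be illustrated with a minimal sketch, not the authors' model: a 1D FitzHugh-Nagumo cable integrated with explicit Euler. All parameter values (`a`, `eps`, `gamma`, the stimulus, and the grid) are illustrative assumptions; the point is only that the diffusion coefficient `D` controls how far an action potential propagates along the fibre.

```python
import numpy as np

def fhn_wave(D, n=200, dx=0.5, dt=0.01, steps=10000,
             a=0.1, eps=0.01, gamma=0.5):
    """Integrate a 1D FitzHugh-Nagumo cable with explicit Euler.

    v is the activation (voltage-like) variable, w the recovery
    variable. Returns the final voltage profile; a larger diffusion
    coefficient D lets the excitation wave travel farther.
    """
    v = np.zeros(n)
    w = np.zeros(n)
    v[:10] = 1.0                          # stimulate the left end
    for _ in range(steps):
        lap = np.zeros(n)                 # no-flux Laplacian
        lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / dx**2
        lap[0] = (v[1] - v[0]) / dx**2
        lap[-1] = (v[-2] - v[-1]) / dx**2
        dv = D * lap + v * (v - a) * (1.0 - v) - w
        dw = eps * (v - gamma * w)
        v += dt * dv
        w += dt * dw
    return v
```

Comparing the position of the wavefront (the rightmost point with `v > 0.5`) for a large versus a small `D` shows the qualitative change in propagation that, in the full coupled model, separates the different pumping regimes.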

1980 ◽  
Vol 84 (1) ◽  
pp. 119-136 ◽ 
Author(s):  
D. Mellon ◽  
J. E. Treherne ◽  
N. J. Lane ◽  
J. B. Harrison ◽  
C. K. Langley

Intracellular recordings demonstrated a transfer of impulses between the paired giant axons of Sabella, apparently along narrow axonal processes contained within the paired commissures which link the nerve cords in each segment of the body. This transfer appears not to be achieved by chemical transmission, as has previously been supposed. This is indicated by the spread of depolarizing and hyperpolarizing voltage changes between the giant axons, by the insensitivity of impulse transmission to changes in the concentrations of external divalent cations, and by the effects of hyperpolarization in reducing the amplitude of the depolarizing potential which precedes the action potentials in the follower axon. The ten-to-one attenuation of electrotonic potentials between the giant axons argues against an exclusively passive spread of potential along the axonal processes which link the axons. Observation of impulse traffic within the nerve cord commissures indicates, on the other hand, that transmission is achieved by conduction of action potentials along the axonal processes which link the giant axons. At least four pairs of intact commissures are necessary for inter-axonal transmission; the overall density of current injected at multiple sites on the follower axon is, it is presumed, sufficient to overcome the reduction in safety factor imposed by the geometry of the system in the region where the axonal processes join the giant axons. The segmental transmission between the giant axons ensures effective synchronization of impulse traffic initiated in any region of the body and thus co-ordination of muscular contraction during the rapid withdrawal responses of the worm.


2011 ◽  
Vol 21 (12) ◽  
pp. 2523-2550 ◽  
Author(s):  
DANIELE BOFFI ◽  
NICOLA CAVALLINI ◽  
LUCIA GASTALDI

The Immersed Boundary Method (IBM) was designed by Peskin for the modeling and numerical approximation of fluid-structure interaction problems in which flexible structures are immersed in a fluid. In this approach, the Navier–Stokes equations are considered everywhere, and the presence of the structure is taken into account by means of a source term which depends on the unknown position of the structure. These equations are coupled with the condition that the structure moves at the same velocity as the underlying fluid. Recently, a finite element version of the IBM has been developed, which offers interesting features both for the analysis of the problem under consideration and for the robustness and flexibility of the numerical scheme. Initially, we considered structure and fluid with the same density, as is often the case when dealing with biological tissues. Here we study the case of a structure whose density can be higher than that of the fluid. The higher density of the structure is taken into account as an excess of Lagrangian mass located along the structure, and can be dealt with in a variational way in the finite element approach. The numerical procedure to compute the solution is based on a semi-implicit scheme. In fluid-structure simulations, non-implicit schemes often produce instabilities when the density of the structure is close to that of the fluid. This is not the case for the IBM approach. In fact, we show that the scheme enjoys the same stability properties as in the case of equal densities.
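The source term mentioned above is, in Peskin's classical (finite difference) formulation, built by spreading each Lagrangian point force onto the Eulerian grid through a regularized delta function. A minimal 1D sketch of that spreading step, using Peskin's standard 4-point delta (the grid, positions, and forces below are illustrative, and this is the classical discrete formulation rather than the variational finite element one the paper develops):

```python
import numpy as np

def peskin_delta(r):
    """Peskin's 4-point regularized delta function (argument in grid units)."""
    r = np.abs(r)
    phi = np.zeros_like(r)
    m1 = r < 1.0
    m2 = (r >= 1.0) & (r < 2.0)
    phi[m1] = (3.0 - 2.0 * r[m1] + np.sqrt(1.0 + 4.0 * r[m1] - 4.0 * r[m1]**2)) / 8.0
    phi[m2] = (5.0 - 2.0 * r[m2] - np.sqrt(-7.0 + 12.0 * r[m2] - 4.0 * r[m2]**2)) / 8.0
    return phi

def spread_force(X, F, n, h):
    """Spread Lagrangian point forces F at positions X onto an
    n-point periodic Eulerian grid with spacing h, producing the
    Eulerian source term f with integral sum(f) * h == sum(F)."""
    x = np.arange(n) * h
    f = np.zeros(n)
    for Xk, Fk in zip(X, F):
        r = (x - Xk) / h                # separation in grid units
        r -= np.round(r / n) * n        # periodic wrap
        f += Fk * peskin_delta(r) / h   # delta_h carries units 1/h
    return f
```

The 4-point delta is constructed so that the discrete sums reproduce the total force and its first moment exactly, which is what makes the spread source term consistent with the continuous delta-function forcing.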


2018 ◽  
Vol 21 (16) ◽  
pp. 813-823 ◽  
Author(s):  
John T. Wilson ◽  
Lowell T. Edgar ◽  
Saurabh Prabhakar ◽  
Marc Horner ◽  
Raoul van Loon ◽  
...  

Author(s):  
Fande Kong ◽  
Xiao-Chuan Cai

Fluid-structure interaction (FSI) problems are computationally very challenging. In this paper we consider the monolithic approach for solving the fully coupled FSI problem. Most existing techniques, such as multigrid methods, do not work well for the coupled system, since it combines elliptic, parabolic, and hyperbolic components. Other approaches based on direct solvers do not scale to large numbers of processors. In this paper, we introduce a multilevel unstructured mesh Schwarz preconditioned Newton–Krylov method for the implicitly discretized, fully coupled system of partial differential equations consisting of the incompressible Navier–Stokes equations for the fluid flow and the linear elasticity equation for the structure. Several meshes are required to make the solution algorithm scalable: a fine mesh to guarantee the solution accuracy, and a few isogeometric coarse meshes to speed up the convergence. Special attention is paid to constructing and partitioning the preconditioning meshes so that the communication cost is minimized when the number of processor cores is large. We show numerically that the proposed algorithm is highly scalable in terms of both the number of iterations and the total compute time on a supercomputer with more than 10,000 processor cores, for monolithically coupled three-dimensional FSI problems with hundreds of millions of unknowns.
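The monolithic idea above, placing fluid and structure unknowns in one vector and solving a single coupled nonlinear residual with Newton–Krylov, can be sketched on a toy 1D problem. This is emphatically not the paper's 3D solver: the "fluid" below is a scalar nonlinear diffusion field, the "structure" a linear bar, coupled through continuity and flux conditions at an interface, and SciPy's `newton_krylov` stands in for the authors' parallel Schwarz-preconditioned solver (in practice a preconditioner would be supplied via its `inner_M` argument).

```python
import numpy as np
from scipy.optimize import newton_krylov

n = 50
h = 1.0 / (n - 1)

def residual(z):
    """Monolithic residual: z stacks the 'fluid' field u (first n
    entries) and the 'structure' field d (last n entries), so Newton
    updates both fields simultaneously rather than alternating."""
    u, d = z[:n], z[n:]
    ru = np.empty(n)
    rd = np.empty(n)
    # fluid proxy: -u'' + u^3 = 1, u(0) = 0
    ru[0] = u[0]
    ru[1:-1] = -(u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2 + u[1:-1]**3 - 1.0
    ru[-1] = u[-1] - d[0]                        # interface continuity
    # structure proxy: -d'' = 1, d(1) = 0
    rd[0] = (d[1] - d[0]) / h - (u[-1] - u[-2]) / h   # interface flux balance
    rd[1:-1] = -(d[2:] - 2.0 * d[1:-1] + d[:-2]) / h**2 - 1.0
    rd[-1] = d[-1]
    return np.concatenate([ru, rd])

# One Newton-Krylov solve for the fully coupled system.
sol = newton_krylov(residual, np.zeros(2 * n), f_tol=1e-9)
```

Because both fields sit in one residual, the interface conditions are satisfied at convergence rather than iterated between separate fluid and structure solves, which is the essential contrast with partitioned FSI schemes.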

