A stress net is simply a graphical depiction of principal stress directions (or other directions derived from them, such as rotations by 45 degrees to obtain the maximum-shear lines).

# Tag Archives: fracture

# Publication: A model for statistical variation of fracture properties in a continuum mechanics code

H.W. Meyer Jr. and R.M. Brannon

[This post refers to the original online version of the publication. The final (paper) version with page numbers and volume is found at http://dx.doi.org/10.1016/j.ijimpeng.2010.09.007. Some further details and clarifications are in the 2012 posting about this article.]

Continuum mechanics codes modeling failure of materials historically have considered those materials to be homogeneous, with all elements of a material in the computation having the same failure properties. This is, of course, unrealistic but expedient. But as computer hardware and software have evolved, the time has come to investigate a higher level of complexity in the modeling of failure. The Johnson-Cook fracture model is widely used in such codes, so it was chosen as the basis for the current work. The CTH finite difference code is widely used to model ballistic impact and penetration, so it also was chosen for the current work. The model proposed here does not consider individual flaws in a material, but rather varies a material's Johnson-Cook parameters from element to element to achieve inhomogeneity. A Weibull distribution of these parameters is imposed, in such a way as to include a size effect factor in the distribution function. The well-known size effect on the failure of materials must be physically represented in any statistical failure model not only for the representations of bodies in the simulation (e.g., an armor plate), but also for the computational elements, to mitigate element resolution sensitivity of the computations. The statistical failure model was tested in simulations of a Behind Armor Debris (BAD) experiment, and found to do a much better job at predicting the size distribution of fragments than the conventional (homogeneous) failure model. The approach used here to include a size effect in the model proved to be insufficient, and including correlated statistics and/or flaw interactions may improve the model.
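The element-to-element parameter seeding described above can be illustrated with a minimal sketch. This is not the paper's implementation; it only shows the standard Weibull weakest-link scaling in which the strength scale parameter shrinks as element volume grows, so larger elements are statistically weaker. The function name and parameters are illustrative.

```python
import numpy as np

def weibull_strengths(n_elements, volumes, sigma0, m, v_ref, rng=None):
    """Draw a spatially variable strength for each computational element.

    Strengths follow a Weibull distribution whose scale parameter is
    reduced for elements larger than the reference volume, reproducing
    the classic size effect: mean strength ~ (v_ref / V)**(1/m).

    sigma0 : reference strength at the reference volume v_ref
    m      : Weibull modulus (larger m -> less scatter)
    """
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n_elements)              # one uniform draw per element
    scale = sigma0 * (v_ref / np.asarray(volumes, dtype=float)) ** (1.0 / m)
    # Inverse-CDF sampling of a Weibull variate with the scaled parameter
    return scale * (-np.log(1.0 - u)) ** (1.0 / m)

# Example: four elements, the last one 8x larger and hence weaker on average
strengths = weibull_strengths(4, [1.0, 1.0, 1.0, 8.0],
                              sigma0=300.0, m=10.0, v_ref=1.0)
```

Seeding each element independently in this way mitigates (but, as the abstract notes, does not fully resolve) mesh-resolution sensitivity, since the draw is tied to element volume rather than to a fixed mesh.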

Available Online:

http://www.mech.utah.edu/~brannon/pubs/7-2011MeyerBrannon_IE_1915_final_onlinePublishedVersion.pdf

http://www.sciencedirect.com/science/article/pii/S0734743X10001466

# Research: Weibull fragmentation in the Uintah MPM code

Below are links to two simulations of disks colliding. The first is elastic and the second uses a fracture model with spatially variable strength based on a scale-dependent Weibull realization. Both take advantage of the automatic contact property of the MPM.

WeibConstMovie: disks colliding without fracture

WeibPerturbedGood: disks colliding with heterogeneous fracture

This basic capability to support statistically variable strength in a damage model has been extended to the Kayenta plasticity model in Uintah.

# Publication: Variability and Scale Dependence of Brittle Failure Strength Attributable to Uncertainty in Crack Orientations

# Publication: Survey of Four Damage Models for Concrete

R.M. Brannon and S. Leelavanichkul

Four conventional damage plasticity models for concrete, the Karagozian and Case model (K&C), the Riedel-Hiermaier-Thoma model (RHT), the Brannon-Fossum model (BF1), and the Continuous Surface Cap Model (CSCM), are compared. The K&C and RHT models have been used in commercial finite element programs for many years, whereas the BF1 and CSCM models are relatively new. All four models are essentially isotropic plasticity models for which plasticity is regarded as any form of inelasticity. All of the models support nonlinear elasticity, but with different formulations. All four models employ three shear strength surfaces. The yield surface bounds an evolving set of elastically obtainable stress states. The limit surface bounds stress states that can be reached by any means (elastic or plastic). To model softening, it is recognized that some stress states might be reached once, but, because of irreversible damage, might not be achievable again. In other words, softening is the process of collapse of the limit surface, ultimately down to a final residual surface for fully failed material. The four models being compared differ in their softening evolution equations, as well as in their equations used to degrade the elastic stiffness. For all four models, the strength surfaces are cast in stress space. For all four models, it is recognized that scale effects are important for softening, but the models differ significantly in their approaches. The K&C documentation, for example, mentions that a particular material parameter affecting the damage evolution rate must be set by the user according to the mesh size to preserve energy to failure. Similarly, the BF1 model presumes that all material parameters are set to values appropriate to the scale of the element, and automated assignment of scale-appropriate values is available only through an enhanced implementation of BF1 (called BFS) that regards scale effects to be coupled to statistical variability of material properties.
The RHT model appears to similarly support optional uncertainty and automated settings for scale-dependent material parameters. The K&C, RHT, and CSCM models support rate dependence by allowing the strength to be a function of strain rate, whereas the BF1 model uses Duvaut-Lions viscoplasticity theory to give a smoother prediction of transient effects. During softening, all four models require a certain amount of strain to develop before allowing significant damage accumulation. For the K&C, RHT, and CSCM models, the strain-to-failure is tied to fracture energy release, whereas a similar effect is achieved indirectly in the BF1 model by a time-based criterion that is tied to crack propagation speed.
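The remark that a damage-evolution parameter "must be set by the user according to the mesh size to preserve energy to failure" is the crack-band regularization idea, and a minimal sketch makes it concrete. The formula below assumes linear strain softening and illustrative parameter names; the surveyed models each implement this differently.

```python
def strain_to_failure(g_f, f_t, h):
    """Element-size-dependent softening strain for linear strain softening.

    To keep the energy dissipated per unit crack area equal to the
    fracture energy g_f regardless of mesh size, the softening strain
    must scale inversely with the element characteristic length h:
    under linear softening the stress-strain triangle dissipates
    (1/2) * f_t * eps_f * h per unit area, so eps_f = 2 g_f / (f_t h).

    g_f : fracture energy (energy per unit area, e.g. N/m)
    f_t : tensile strength (stress at onset of softening)
    h   : element characteristic length
    """
    return 2.0 * g_f / (f_t * h)

# Halving the element size doubles the allowed softening strain,
# so the energy to fully fail a one-element-wide band stays constant.
```

Without a rescaling of this kind, refining the mesh concentrates softening in ever-smaller bands and the computed energy to failure spuriously drops toward zero, which is the severe mesh dependence criticized throughout this survey.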

Available Online:

http://www.mech.utah.edu/~brannon/pubs/7-2009BrannonLeelavanichkulSurveyConcrete.pdf

# Publication: The Use of Sphere Indentation Experiments to Characterize Ceramic Damage Models

R.B. Leavy; R.M. Brannon; O.E. Strack

Sphere impact experiments are used to calibrate and validate ceramic models that include statistical variability and/or scale effects in strength and toughness parameters. These dynamic experiments supplement traditional characterization experiments such as tension, triaxial compression, Brazilian, and plate impact, which are commonly used for ceramic model calibration. The fractured ceramic specimens are analyzed using sectioning, X-ray computed tomography, microscopy, and other techniques. These experimental observations indicate that a predictive material model must incorporate a standard deviation in strength that varies with the nature of the loading. Methods of using the spherical indentation data to calibrate a statistical damage model are presented in which it is assumed that variability in strength is tied to microscale stress concentrations associated with microscale heterogeneity.

Available Online:

http://www.mech.utah.edu/~brannon/pubs/7-2009-LeavyBrannonStrack-IJACT.pdf

http://dx.doi.org/10.1111/j.1744-7402.2010.02487.x

# Research: Radial cracking as a means to infer aleatory uncertainty parameters

Aleatory uncertainty in constitutive modeling refers to the intrinsic variability in material properties caused by differences in micromorphology (e.g., grain orientation or size, microcracks, inclusions, etc.) from sample to sample. Accordingly, a numerical simulation of a nominally axisymmetric problem must be run in full 3D (non-axisymmetric) mode if there is any possibility of a symmetry-breaking bifurcation.

Dynamic indentation experiments, in which a spherical ball impacts the top free surface of a cylindrical specimen, nicely illustrate that fracture properties must have spatial variability. In fact, the intrinsic instability that leads to radial cracking is regarded by the Utah CSM group as a potentially inexpensive means of inferring the spatial frequency of natural variations in material properties.

# Merits and shortcomings of conventional smeared damage models

Four classical damage models for concrete (three of which are available in commercial codes) have been compared and critiqued, showing that they all share the notion of a "teardrop" yield surface that can harden and (for some models) translate until reaching a three-invariant fracture limit surface that then collapses to account for softening (i.e., permanent loss of strength). Practical engineering models for rock and ceramics are similar. The common drawbacks of these models (primarily severe mesh dependence) can be mitigated, though not eliminated, by seeding their material properties in the simulation with spatial variability (aleatory uncertainty) and by using appropriate scale effects for the strength and failure-progression properties.

# Computational approaches for dynamically loaded low-ductility metals

Eulerian simulations of un-notched Charpy impact specimens provide unsatisfactory results in that the experimentally observed bend angle, absorbed energy, and fracture mode are not reproduced. The Utah CSM group is independently confirming poor simulation fidelity using conventional constitutive models. From there, we aim to identify the cause and investigate solutions using capabilities in the Kayenta material framework.

**UofU Contributors/collaborators:**

Krishna Kamojjala (PhD student, Mech. Engr., UofU)

Scot Swan (MS student, Mech. Engr., UofU)