RESEARCH PAPERS: Offshore Mechanics

Three-Dimensional Analysis of Ice Sheet Indentation: Lower-Bound Solutions

Author and Article Information
D. G. Karr

Massachusetts Institute of Technology, Cambridge, Mass. 02139

J. Offshore Mech. Arct. Eng. 110(1), 81-86 (Feb 01, 1988) (6 pages); doi:10.1115/1.3257128. History: Received June 17, 1987; Online October 30, 2009

Abstract

The methods of plastic limit analysis are used to determine the indentation pressures of a flat rigid punch on a columnar ice sheet. The ice sheet is idealized as a semi-infinite layer of elastic-perfectly plastic material. Representative strength parameters of columnar sea ice are used to define anisotropic yield criteria for the ice sheet. The anisotropic yield criteria reflect the variations in mechanical properties caused by the horizontal orientation of the c-axis of sea ice in the columnar zone. Numerical results are obtained by applying the lower-bound theorem of plastic limit analysis. A three-dimensional stress field is optimized for a given ice condition for various indentor sizes. The effects of varying the aspect ratio (defined as the ratio of indentor width to ice thickness) are then addressed. A comparison of results for intermediate aspect ratios to results for extremely high (plane stress) and extremely low (plane strain) aspect ratios is presented. It is found that the transition from plane stress to plane strain is governed by the tensile strength of the ice medium.
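The lower-bound procedure invoked in the abstract (construct a statically admissible stress field that nowhere violates the yield criterion, then optimize the applied load) can be illustrated with a deliberately simplified example. The sketch below is not the paper's three-dimensional anisotropic formulation: it assumes plane strain, an isotropic Tresca yield criterion with shear strength k, and a classic two-zone constant-stress field beneath a rigid punch, and it poses the lower-bound optimization as a small linear program (Python with scipy, an assumed toolchain). It recovers the textbook bound p = 4k, whereas the paper optimizes a three-dimensional stress field under anisotropic yield criteria for columnar sea ice.

# Minimal lower-bound limit-analysis sketch (NOT the paper's 3-D anisotropic
# formulation): a two-zone, plane-strain statically admissible stress field
# for a rigid punch on a Tresca half-space, posed as a linear program.
# Unknowns: indentation pressure p and the horizontal stresses in zone A
# (under the punch) and zone B (under the adjacent free surface); compression
# is taken positive. The optimum recovers the textbook bound p = 4k.
import numpy as np
from scipy.optimize import linprog

k = 1.0  # shear yield strength (illustrative units)

# Unknowns: x = [p, sxA, sxB]
c = np.array([-1.0, 0.0, 0.0])          # maximize p  ->  minimize -p

# Yield (Tresca, linearized): |sigma_z - sigma_x| <= 2k in each zone.
# Zone A: sigma_z = p, sigma_x = sxA.   Zone B: sigma_z = 0, sigma_x = sxB.
A_ub = np.array([
    [ 1.0, -1.0,  0.0],   #  p - sxA <= 2k
    [-1.0,  1.0,  0.0],   # -p + sxA <= 2k
    [ 0.0,  0.0,  1.0],   #  sxB     <= 2k
    [ 0.0,  0.0, -1.0],   # -sxB     <= 2k
])
b_ub = 2.0 * k * np.ones(4)

# Traction continuity across the vertical interface between zones: sxA = sxB.
A_eq = np.array([[0.0, 1.0, -1.0]])
b_eq = np.array([0.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None), (None, None), (None, None)])
print(f"lower-bound indentation pressure: p = {res.x[0]:.3f} (expected 4k = {4 * k})")

The same structure (stress unknowns, equilibrium and continuity equalities, linearized yield inequalities, applied load maximized) carries over to richer three-dimensional stress fields and anisotropic criteria, at the cost of many more variables and constraints.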

Copyright © 1988 by ASME