Efficient prediction and uncertainty propagation of correlated loads

I. Tartaruga, J. E. Cooper, P. Sartor, M. Lowenberg, S. Coggon, Y. Lemmens

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Aircraft structural design is influenced by the static and dynamic loads resulting from flight manoeuvres, gust/turbulence encounters and ground manoeuvres; the identification of such loads is therefore crucial for the development and structural analysis of aircraft, and requires the solution of the aeroelastic dynamic response. Numerical aeroelastic models are used to predict a large number (1000s) of “Interesting Quantities” (IQs), and for aircraft design the identification of the worst cases for each IQ is very important but involves significant computational effort. Of particular interest are the so-called correlated loads, in which coincident values of pairs of IQs are plotted against each other. This paper demonstrates how to reduce the computational burden of determining the behaviour of the correlated loads envelopes with little loss of accuracy, and also how to quantify the effects of uncertainty across a range of different parameters. The methodology is demonstrated on a numerical aeroelastic wing model of a civil jet airliner.
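The idea of a correlated loads envelope can be illustrated with a short sketch. The function below is a hypothetical example (not the paper's actual surrogate-based method): given two coincident IQ time histories, e.g. bending moment and torque at one wing station, it computes the convex envelope of the scatter plot using the monotone-chain convex hull algorithm. Function and variable names are illustrative assumptions.

```python
def correlated_load_envelope(iq_a, iq_b):
    """Convex envelope of two coincident IQ time histories.

    Hypothetical illustration of a correlated-loads plot: each sample
    is a point (iq_a[k], iq_b[k]); the envelope is the convex hull of
    those points, returned as counter-clockwise vertices.
    """
    pts = sorted(set(zip(iq_a, iq_b)))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    # Build lower then upper hull (Andrew's monotone chain)
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Drop the duplicated endpoints when concatenating the two chains
    return lower[:-1] + upper[:-1]

# Illustrative use: four extreme load cases plus one interior point;
# the interior point does not contribute to the envelope.
mx = [0.0, 1.0, 1.0, 0.0, 0.5]  # e.g. bending moment samples
my = [0.0, 0.0, 1.0, 1.0, 0.5]  # e.g. coincident torque samples
envelope = correlated_load_envelope(mx, my)
```

Reducing the computational burden then amounts to predicting which load cases contribute vertices to this envelope without simulating every case in full.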

Original language: English
Title of host publication: 56th AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference
Publisher: American Institute of Aeronautics and Astronautics Inc. (AIAA)
ISBN (Print): 9781624103421
Publication status: Published - 2015
Event: 56th AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference 2015 - Kissimmee, United States
Duration: 5 Jan 2015 – 9 Jan 2015

Conference

Conference: 56th AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference 2015
Country: United States
City: Kissimmee
Period: 5/01/15 – 9/01/15

