Q: Aviation safety (Answered, 5 out of 5 stars, 4 Comments)
Question  
Subject: Aviation safety
Category: Miscellaneous
Asked by: lhunt-ga
List Price: $100.00
Posted: 18 Dec 2005 11:15 PST
Expires: 17 Jan 2006 11:15 PST
Question ID: 607122
I'm interested in any reference material or literature that discusses
a type of pilot error sometimes referred to as "landing
expectations", where the pilot proceeds with a flight plan in
deteriorating conditions because of their expectation of landing
according to their original flight plan. I'm told it's one of the
principal risks in general aviation. Other names for the same
phenomenon would be acceptable.
Answer  
Subject: Re: Aviation safety
Answered By: hummer-ga on 18 Dec 2005 16:56 PST
Rated: 5 out of 5 stars
 
Hi lhunt,

I believe I have found just what you are looking for. Not only are the
articles terrific, but their reference lists should give you enough
material to keep you busy for a while (hopefully your library will be
able to supply you with some of the journals). I've copied and pasted
excerpts from the articles below, but please click on the links for
full details.

As for what to call the phenomenon, "plan continuation error"
(PCE) is the term used in safety studies.

Risk Factors Associated with Weather-Related General Aviation Accidents
"'Other researchers have attempted to characterize the types of
decision-making errors that lead pilots to make unsafe decisions. One
class of decision-making error attributed to pilots in weather
accidents is known as a plan continuation error. A plan continuation
error is defined as ?failure to revise a flight plan despite emerging
evidence that suggests it is no longer safe.'  (14) For example,
rather than revising the intended route of flight by changing course
or altitude, deviating to an alternate airport, or returning to the
departure airport, pilots may opt to press on into deteriorating
weather."
(14) J. Orasanu, L. Martin and J. Davison, ?Cognitive and Contextual
Factors in Aviation Accidents,? in E. Salas and G.A. Klein (Eds.),
Linking Expertise and Naturalistic Decision Making (Mahwah, NJ:
Lawrence Erlbaum Associates: 2001), pp. 209-225.
http://www.ntsb.gov/publictn/2005/SS0501.pdf

I like this one: "get-there-itis".

PCE: "The continuation of an original plan even with the availability
of information that suggests that the plan should be abandoned." In
other words: a growing commitment to a chosen course of action,
get-there-itis.
http://www.psychol.ucl.ac.uk/ljdm/talkppt/BolandLJDM.pdf

Here's a good quote for you:

Reality:
"When something changes, recognize it; don't deny it. Deal with things
as they are - not as they are on your plan."
https://www.avemco.com/briefingroom/2003%20edition%20.pdf


CHANGE DETECTION IN A FLIGHT PLANNING TASK ENVIRONMENT:
LINKING PLANNING ERRORS TO BIASES IN PLAN MONITORING
Emily K. Muthard & Christopher D. Wickens
University of Illinois, Aviation Research Lab
Savoy, Illinois
"The present study investigated a link between plan continuation
errors and plan monitoring. Pilots were asked to execute a flight plan
that traversed through hazardous airspace and then monitor the success
of the plan by seeking and detecting changes in the airspace that
could affect the safety of the plan. Following change detection,
pilots had the opportunity to revise these plans. In nearly one-third
of trials, pilots failed to revise flight plans, thereby committing a
plan continuation error, and were more likely to do so when plan
monitoring was inadequate. Overall, more than half of changes went
undetected, though detection response times were improved when changes
were relevant to the flight planning task or when traffic aircraft
were changed rather than weather systems. Findings imply that plan
monitoring is less than perfect, which may be a substantial
contributing factor to plan continuation errors."
REFERENCES
- Burian, B. K., Orasanu, J., & Hitt, J. (2000). Weather-related
decision errors: Differences across flight types.
Proceedings of the IEA 2000 and HFES 2000 Congress, 22-25.
- Einhorn, H. J., & Hogarth, R. M. (1978). Confidence in judgment:
Persistence of the illusion of validity.
Psychological Review, 85, 395-416.
- Endsley, M. R. (1988). Design and evaluation for situation awareness
enhancement. In Proceedings of the 32nd
Annual Human Factors Society Meeting (pp. 97-101). Santa Monica, CA:
Human Factors Society.
- Goh, J., & Wiegmann, D. A. (2001a). Visual flight rules (VFR) flight
into instrument meteorological conditions
(IMC): A review of the accident data. Proceedings of the 11th
International Symposium on Aviation Psychology. Columbus, OH: Ohio
State University.
- Goh, J., & Wiegmann, D. A. (2001b). Visual flight rules flight into
instrument meteorological conditions: An empirical investigation of
the possible causes. The International Journal of Aviation Psychology,
11, 359-379.
- Goh, J., & Wiegmann, D. (2001c). An investigation of the factors
that contribute to pilots' decisions to continue visual flight rules
flight into adverse weather. Proceedings of the Human Factors and
Ergonomics Society 45th Annual Meeting, 26-29.
- McCoy, C. E., & Mickunas, A. (2000). The role of context and
progressive commitment in plan continuation error. Proceedings of the
IEA 2000/HFES 2000 Congress, 26- 29.
- Mumaw, R. J., Sarter, N. D., & Wickens, C. D. (2001). Analysis of
pilots' monitoring and performance on an automated flight deck. In
Proceedings of the 11th International Symposium on Aviation
Psychology. Columbus, OH: The Ohio State University
- Nikolic, M. I., & Sarter, N. B. (2001). Peripheral visual feedback:
A powerful means of supporting effective attention allocation in
event-driven data-rich environments. Human Factors, 43, 30-38.
- Orasanu, J., Martin, L., & Davison, J. (2001). Cognitive and
contextual factors in aviation accidents: Decision errors. In E. Salas
& G. A. Klein (Eds.), Linking expertise and naturalistic decision
making (pp. 209-225). Mahwah, NJ: Lawrence Erlbaum.
- Podczerwinski, E., Wickens, C. D., & Alexander, A. L. (2001).
Exploring the "out of sight, out of mind" phenomenon in dynamic
settings across electronic map displays (Tech. Rep.
ARL-01-8/NASA-01-4). Savoy, IL: University of Illinois at
Urbana-Champaign, Aviation Research Lab.
- Pringle, H. L., Irwin, D. E., Kramer, A. F., & Atchley, P. (2001).
The role of attentional breadth in perceptual change detection.
Psychonomic Bulletin and Review, 8, 89-95.
- Rensink, R. A. (2002). Change detection. Annual Review of
Psychology, 53, 245-277.
- Rensink, R. A., O'Regan, J. K., & Clark, J. J. (1997). To see or not
to see: The need for attention to perceive changes in scenes.
Psychological Science, 8, 368-373.
- Scholl, B. J. (2000). Attenuated change blindness for exogenously
attended items in a flicker paradigm. Visual Cognition, 7, 377-396.
- Senders, J. W. (1967). On the distribution of attention in a dynamic
environment. Acta Psychologica, 27, 349-354.
- Simons, D. J. (2000). Current approaches to change blindness. Visual
Cognition, 7, 1-15.
- Thomas, L. C., & Wickens, C. D. (2000). Effects of display frames of
reference on spatial judgments and change detection (Technical Report
ARL-00-14/FED-LAB-00-4). Savoy, IL: University of Illinois, Aviation
Research Lab.
- Wason, P. C. (1960). On the failure to eliminate hypotheses in a
conceptual task. Quarterly Journal of Experimental Psychology, 12,
129-140.
- Wason, P. C., & Johnson-Laird, P. N. (1972). Psychology of
reasoning: Structure and content. London: Batsford.
- Wickens, C. D. (2001). Attention to safety and the psychology of
surprise. Keynote address at the 11th Annual International Symposium
on Aviation Psychology. Columbus, OH: Ohio State University.
- Wickens, C. D., Xu, X., Helleberg, J., & Marsh, R. (2001). Pilot
visual workload and task management in freeflight: A model of visual
scanning. In Proceedings of the 11th Annual International Symposium on
Aviation Psychology. Columbus, OH: Ohio State University.
- Wilson, D. R., & Fallshore, M. (2001). Optimistic and ability biases
in pilots? decisions and perceptions of risk regarding VFR into IMC.
Proceedings of the 11th Annual International Symposium on Aviation
Psychology. Columbus, OH: Ohio State University.
http://www.humanfactors.uiuc.edu/Reports&PapersPDFs/humfac02/muthardhf02.pdf

FACTORS THAT MEDIATE FLIGHT PLAN MONITORING AND ERRORS IN PLAN REVISION:
PLANNING UNDER AUTOMATED AND HIGH WORKLOAD CONDITIONS
Emily K. Muthard
Christopher D. Wickens
University of Illinois Aviation Human Factors Division
Savoy, Illinois
"An experiment was conducted to explore the effects of automation and
task loading on aviation plan monitoring and errors in plan revision.
Pilots were asked to select one of two flight paths that traversed
through hazardous airspace and to then monitor the safety of the path
by seeking and reporting changes in dynamic traffic aircraft and
weather systems. Following change detection, pilots were given the
opportunity to revise their flight plan as a result of the changes.
Attention guidance automation, which was reliable for plan selection,
but failed to highlight a critical change that threatened safety after
plan selection, was present on half of trials. Automation improved
planning accuracy and confidence in high workload conditions. However,
in nearly one third of trials, pilots failed to revise the flight
plans as a result of a change, and were more likely to do so with
imperfect automation in high workload."
REFERENCES
- Billings, C. E. (1991). Human-centered aircraft automation: A
concept and guidelines (NASA Technical Memorandum 103885). Moffett
Field, CA: NASA Ames Research Center.
- Endsley, M. (2000). Theoretical underpinnings of situation
awareness: A critical review. In M.R. Endsley & D.J. Garland (Eds.),
Situation Awareness Analysis  and Measurement. Mahwah, NJ: Lawrence
Erlbaum Associates.
- Koslowski, B., & Maqueda, M. (1993). What is the
confirmation bias and when do people actually have it?
Merrill-Palmer Quarterly, 39, 104-130.
- Layton, C.F., Smith, P.J., & McCoy, C.E. (1993). Design of a
cooperative problem-solving system for enroute flight planning: An
empirical evaluation. Human Factors, 36, 94-119.
- Mosier, K.L., Palmer, E.A., & Degani, A. (1992). Electronic
checklists: Implications for decision making. Proceedings of the Human
Factors Society 36th Annual  Meeting (pp. 7-11). Santa Monica, CA:
Human Factors Society.
- Muthard, E.K., & Wickens, C.D. (2002). Factors  that Mediate Flight
Plan Monitoring and Errors in Plan  Revision: An Examination of
Planning under Automated  Conditions (Tech. Rep.
AFHD-02-11/NASA-02-8). University of Illinois, Institute of Aviation:
Savoy, IL.
- National Transportation Safety Board (NTSB) (1994). Safety study: A
review of flightcrew-involved  major accidents of U. S. air carriers,
1978 through 1990 (NTSB/SS-94/01). Springfield, VA: National Technical
Information Service.
- Riley, V., Lyall, B., & Wiener, E. (1993). Analytic methods for
flight-deck automation design and evaluation. Phase two report: Pilot
use of automation (Technical Report). Minneapolis, MN: Honeywell
Technology Center.
- Ward, G., & Allport, A. (1997). Planning and problem-solving using
the five-disc Tower of London task. The Quarterly Journal of
Experimental Psychology, 50A, 49-78.
- Wiener, E.L. (1985). Cockpit automation: In need of a philosophy.
Proceedings of the 1985 Behavioral  Engineering Conference (pp.
369-375). Warrendale, PA: Society of Automotive Engineers.
- Xiao, Y., Milgrim, P., & Doyle, D.J. (1997). Planning behavior and
its functional role in interactions with complex systems. IEEE
Transactions on Systems, Man, and Cybernetics-Part A: Systems and
Humans, 27, 313-324.
- Yeh, M., & Wickens, C.D. (2001). Display signaling in augmented
reality: Effects of cue reliability and image realism on attention
allocation and trust calibration. Human Factors, 43, 455-465.
http://www.humanfactors.uiuc.edu/Reports&PapersPDFs/isap03/mutwic.pdf


Additional Links of Interest:

Association of Aviation Psychology (AAP)
http://www.avpsych.org/

THE INTERNATIONAL JOURNAL OF AVIATION PSYCHOLOGY  (subscription)
https://www.erlbaum.com/shop/tek9.asp?pg=products&specific=1050-8414

TRACKING OPERATOR ACTIVITIES IN COMPLEX SYSTEMS:
AN EXPERIMENTAL EVALUATION USING BOEING 757 PILOTS
http://human-factors.arc.nasa.gov/IHpersonnel/tc/OSU_paper/OSU-paper.html

Aviation Decision Making
How to save gyro pilot lives - stop Pilot Error
http://www.pra73.net/Downloads/DecisionMaking.pdf

Flight Cognition Laboratory - Publications/Presentations
http://human-factors.arc.nasa.gov/flightcognition/publications.html

FAA: The Human Factors Research and Engineering Division
http://www.hf.faa.gov/

Errors in Aviation Decision Making: A Factor in Accidents and Incidents
Judith Orasanu
Lynne Martin
NASA-Ames Research Center
http://www.dcs.gla.ac.uk/~johnson/papers/seattle_hessd/judithlynne-p.pdf

IT'S HUMAN NATURE!
By Loukia D. Loukopoulos
http://human-factors.arc.nasa.gov/flightcognition/hottopic/article1.htm

Books
http://shop.pilotwarehouse.co.uk/category26023.html

I hope this is just what you were hoping for when you posted your
question. If you need further assistance, please post a clarification
request and wait for me to respond before closing/rating my answer.

Thank you,
hummer

Google Search Terms Used: 

"pilot errors" in making "flight plan"
plan revisions "pilot errors" in "plan revisions"
pilot continues with "flight plan" despite problems
causes of "pilot error"
"pilot error" continues "flight plan" deteriorating conditions
"plan continuation error"
lhunt-ga rated this answer: 5 out of 5 stars and gave an additional tip of $10.00
Excellent answer. The only question I would ask is whether there had
been any reference to the term "landing expectations."

Comments  
Subject: Re: Aviation safety
From: hummer-ga on 19 Dec 2005 11:47 PST
 
lhunt, thank you for the tip, I really appreciate it! Researching your
question was interesting and I'm glad to hear that you are happy with
the links that I provided. No, "landing expectations" or something
similar didn't come up - I'd stick with PCE.

Wishing you a safe holiday season with no PCEs!
hummer
Subject: Re: Aviation safety
From: tugtug1-ga on 02 Jan 2006 19:00 PST
 
Excellent answer. I am a professional pilot. Good stuff. But let me
add some practical advice. This problem is most prevalent among those
types of personalities that think they get to play by a different set
of rules than the rest of us. Take John F. Kennedy Jr. Nothing
against him, rest his soul, but he is a well-known example. He saw
his family members, and he himself, habitually get out of trouble with
money, contacts, and influence. In my opinion, he had a mindset that
since he had always gotten out of jams, he always would. But the laws
of physics apply to all. We all have limits. And sometimes we should
take the bus, no matter who we are, how successful we are, or what
job we hold.
Subject: Re: Aviation safety
From: lhunt-ga on 04 Jan 2006 10:52 PST
 
Thanks, hummer-ga. Adds another human element to the principle. I'm
looking at how the principle that was first described to me as
"landing expectations", and later "get-home-itis", really affects us
all in many endeavors, from investing to personal life management.
For instance, not selling a stock when the indications are clear that
the original "flight plan" has changed. Or not quitting a job under
the same scenario. You've added another component. Thanks,

Hunt
Subject: Re: Aviation safety
From: myoarin-ga on 04 Jan 2006 16:44 PST
 
LHunt,
You said it with your comment; we all do it, personally and
professionally:  your example of sticking with a disappointing stock
investment; anyone who continued their vacation plans to New Orleans
two days before the hurricane; etc., etc.
Tugtug's comment is also very enlightening, but I think it also has a
lot to do with the prior emotional, planning and financial investment
in the project, even in acknowledgement of the risks.  This could be a
scenario such as:  "Dammit, we planned this vacation for years," (New
Orleans or unexpected financial problems); or "But I promised my kid
xyz,"  when it has subsequently become obvious that for the child or
parent xyz should be postponed (kid has an exam the next day, dad
shouldn't spend the money or knows that taking the day off is much
more inopportune than anticipated).

When I read Hummer's great answer, my first thought was:
"Plan Continuation Error" sounds like what happens to many
business and government projects. Once the boss or the town council
says "we'll do it," the project plows on even when it immediately
becomes evident that the plan was half-baked, or other factors arise
that would (should?) have led to another decision, maybe costing much
more, maybe being entirely pointless (say, a local garbage disposal
facility when a regional one that the town will have to join is
already in the planning; garbage is a good example here).

In the business and political area, something similar to what Tugtug
describes occurs:  the higher the decision maker, the more difficult
it is to reverse the decision.  This is where good project planning
proves itself (also for that flight plan).  The plan should call for
ongoing checks of feasibility that require re-approval of the project.

(Iraq only occurred to me after my abstract thought, but it seems to
fit the description, from the ill-founded assumptions about a potential
immediate bio/chemical threat onwards.  But I don't want to change the
subject.)

Thanks for the question and the answer,  Myoarin

Important Disclaimer: Answers and comments provided on Google Answers are general information, and are not intended to substitute for informed professional medical, psychiatric, psychological, tax, legal, investment, accounting, or other professional advice. Google does not endorse, and expressly disclaims liability for any product, manufacturer, distributor, service or service provider mentioned or any opinion expressed in answers or comments. Please read carefully the Google Answers Terms of Service.

If you feel that you have found inappropriate content, please let us know by emailing us at answers-support@google.com with the question ID listed above. Thank you.