Saturday, October 07, 2006



This is the second in my four-article series
on the FALSE SENSE OF SAFETY in Air Traffic Control
systems. The results and advice, however not solid,
may be considered valid for many large systems,
such as nuclear power reactors, railway
transport systems, etc...

The hardcore large-systems people may consider
these points unnecessary or invalid, but the
developments in psychology and neurology
over the last ten years have made the old
books out of date. These developments alone
are reason enough to discuss
these issues, however subjective or not
solidly founded they may be. The high rate of
mental illness among people working on
large systems cannot be and should not be
hidden any more... Also, finding the
reasons for these mental breakdowns
may give us the chance to build even
more complex systems. We are building
complex systems that cannot be controlled
healthily, namely with respect to the operator's
health and the mission's healthy success.

In this article, I will compare the
"Disorientation" section of an internet article
named "Night Operations" with the safety
situation of an Air Traffic Control system. I will
also make a few references to the Swiss ATC accident
that caused the deaths of 45 Ukrainian children.

"Disorientation, or vertigo, is actually a state
of temporary spatial confusion resulting from
misleading information sent to the brain by
various sensory organs."

Everyone, including the operationally responsible
person in an ATC system, assumes that an accident
is inevitable. The technical staff tries to
improve the system so that the possible date of
this inevitable accident is postponed into the
future. So, the direction of progress is the
direction in which a possible accident moves
along the time axis.

How can disorientation happen in the minds
of the technical staff in an ATC system?
Just like it happened in Switzerland.
It was night, the most appropriate time to
make technical changes. Apparently, there was
technical maintenance going on in the system. The
operational management had assumed that
the changes they made and the operation itself
would push the inevitable accident away...
But the facts were: the telephones were not working,
the short-term conflict alert was not working on
the radar, etc... The operational management
was fully disoriented...

An accident does not happen easily in a large
system. There was also another disorientation
in the design of the conflict-alert systems
of the airplanes. They were designed to
help the pilots, but in practice the procedures
were not clear on how to use them. The pilots
became totally disoriented, and one of them acted
according to the advice of the controller rather
than the instrument, which is a typical vertigo
mistake. The pilot, although he died, had acted
correctly according to the rules. The real vertigo
was in the minds of the Swiss operational management.

"The most difficult adjustment that you must
make as you acquire flying skill is a willingness
to believe that, under certain conditions, your
senses can be wrong." The fact is, many large
systems were state-owned and have recently been
privatized. There is a strong hierarchy in
these organisations. Unfortunately,
hierarchy not only gives the operational
management the power to make singular individual
decisions but may also slow down the information
flow coming from the engineers working on the
maintenance and enhancement floor.
Even Lars Fredholm, of the Swedish Fire
Department, states: "The problem concerns the
capacity to make co-ordinated decisions at
different levels of management. In a
static situation you have time to follow the
sequence planning, executing, evaluating.
In a dynamic situation the sequence is disturbed
by the dynamics of the emergency. The co-ordination
of decision making at different levels of
management has to be more dynamic and flexible."
(Report 3111, Lund 1999)... That night,
the Swiss operations management was in a dynamic
situation, I believe, based on my
experience at Karlsruhe UIR.

"If the rate of directional change is quite
small - and not confirmed by the eyes - the change
will be virtually undetectable and you will
probably not sense any motion whatsoever."
When many successive changes are realized
successfully, a sense of false safety forms.
Things are going well, we did this, let's add
this modification too... without really
evaluating the possibility of falling into a
situation where the system fails and must fall
back to the backups, where not all the
modifications may have been applied. More
importantly, the current controller staff may have
difficulty remembering which version has which
new functions, etc...
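The fallback risk described above can be made concrete with a minimal sketch. All names and modification labels here are hypothetical, invented for illustration; the idea is simply to keep an explicit record of which modifications exist on the primary system versus its backup, so that a fallback never comes as a surprise:

```python
# Illustrative sketch (hypothetical names): track the modifications applied
# to the primary ATC system versus its backup, and report which functions
# would suddenly disappear if the system failed and fell back.

def fallback_gap(primary_mods, backup_mods):
    """Return the modifications present on the primary system but
    missing from the backup -- the capabilities operators would
    lose, mid-shift, on a fallback."""
    return [m for m in primary_mods if m not in backup_mods]

primary = ["phone-routing-fix", "stca-tuning", "new-label-format"]
backup = ["phone-routing-fix"]

missing = fallback_gap(primary, backup)
print(missing)  # modifications not yet applied to the backup
```

A list like `missing` printed at every shift handover would at least make the version confusion visible, instead of leaving the controller staff to remember which version has which functions.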

"Here's where trouble begins! Inside the airplane,
if you are unable to see the ground and establish
visual reference, you are just seconds away
from the famous graveyard spiral."
The writer then tells the solution.
There must be reference points... There must be
metric values carefully and insistently used
in an ATC system. Ten years ago, at Karlsruhe,
we only had retrospective reliability measures.
In other words, how long the system had worked
in the last year, and for how long it had worked
without interruption. Nothing prospective...
There must be measures like those used by
space-shuttle and aerospace companies... to
predict possible problems... It is a shame if
an ATC center is still not using these measures.
I will write my next article on this matter, on
the measures that could have saved the lives of
the Ukrainian children.
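As a minimal sketch of the retrospective measures mentioned above, assuming nothing more than a log of last year's interruption durations (all figures are made up, not from any real ATC center): availability and mean time between failures (MTBF) are exactly the backward-looking numbers described, "how long the system has worked, and for how long without interruption":

```python
# Illustrative sketch: retrospective reliability measures -- availability
# and mean time between failures (MTBF) -- computed from one year's
# interruption log. All figures are invented for illustration.

HOURS_PER_YEAR = 365 * 24

def retrospective_measures(outage_hours):
    """outage_hours: durations (in hours) of each interruption last year."""
    downtime = sum(outage_hours)
    uptime = HOURS_PER_YEAR - downtime
    availability = uptime / HOURS_PER_YEAR
    # MTBF: average running time between successive interruptions.
    mtbf = uptime / len(outage_hours) if outage_hours else float("inf")
    return availability, mtbf

availability, mtbf = retrospective_measures([2.0, 0.5, 1.5])
print(f"availability = {availability:.4%}, MTBF = {mtbf:.0f} h")
```

These numbers only describe the past. A prospective measure, of the kind the article calls for, would use trends in such figures, together with metrics on ongoing maintenance work, to predict problems before they occur.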

Metric values about the ongoing maintenance
and enhancement activities, and even about the
performance of the controlling and technical
personnel, can give a strong indication, and
maybe the AWARENESS, that could help the
operational management rise out of their
FALSE SENSE OF REALITY when things seem to be
going on in a routine manner, in the "automatic
processing" of ATC systems using pre-scheduled
"schemata"...


Ali Riza SARAL