The BEMOSA consortium will hold a special workshop in Brussels on July 10, 2012 focusing on the managerial implications of the intermediate findings of extensive research conducted in European airports.
The workshop is the third and last in a series of events devoted to applying human factors and the principles of social networking to airport security. At the first workshop the general conclusions of the study were presented. At the second workshop specific case studies were presented and discussed, with particular attention to their implications for airport security operations in general, and for false alarms and manager-employee relations in particular.
The third workshop will discuss the implications of the BEMOSA findings for daily airport management and for training programs aimed at enhancing security decisions by airport employees.
The workshop will focus on translating the groundbreaking findings into guidelines for a novel training program for airport personnel. The training program will be based upon unique behaviour models developed by BEMOSA.
The BEMOSA evidence-based simulation and training program is expected to lead to reduced false alarms, increased passenger satisfaction and improved labor relations, ultimately resulting in improved profitability.
The workshop will be held in the offices of DG Research of the European Commission in Rue du Champs de Mars 21 in Brussels. The event is free of charge but registration is obligatory.
Tuesday, June 19, 2012
Thursday, June 7, 2012
Florida airport’s TSA firings send workers the wrong message
By Prof. Alan (Avi) Kirschenbaum*
The latest episode of being negatively rewarded for not following airport security rules has hit the headlines once again with reports on the firing of Transportation Security Administration (TSA) workers at Southwest Florida International Airport for failing to perform random screenings.
This, of course, will lead to security being tightened, not out of concern for aircraft and passenger safety but out of fear of being fired. The result: longer lines, more frustrated passengers and loss of revenue for airports.
So what is the big fuss about if, as TSA spokesman David Castelveter said, “It’s the random secondary that did not happen. At no time was a traveler’s safety at risk and there was no impact on flight operations”? The logic of this somehow escapes me, but it does point toward an inherent problem in how security is viewed and practiced.
Just as we found in the BEMOSA Project, bending, breaking and ignoring the rules is part of the normative behavior of security employees. In this case, only 15 percent of the roughly 280 TSA employees at the airport were caught and either fired or suspended. And what was the reason? They did not perform “random checks” of passengers which, as we have heard, do nothing to enhance security but might make it “a little more difficult” for the bad guys to disrupt air transportation. Of course, this raises the question of “how much more difficult” weighed against decimating your work force and increasing the frustration of passengers.
Here again we see the overdone imposition of rules as against the judgment and experience of security personnel. Rather than embed and enrich employees with skills that go far beyond what technology or its related rules can offer, we are sending the message to “keep a low profile,” “don't rock the boat,” “don't take initiative” – just be the robot the engineers and security managers have designed you to be.
And why is this critical to making security decisions?
By treating passengers as sterile cogs who are seen as a threat, we logically must apply the engineering model of the airport designed as a mass-production, high-risk facility (akin to a prison). But what if we start to view airports as service providers where passengers are customers?
Well, not unexpectedly, the BEMOSA results have revealed an important clue. Those security employees who considered the security and safety of passengers to be high on their priority list were the ones who tended to bend and break the rules. Those who really didn’t care that much about passengers were the compliant bureaucratic rule keepers.
Just ask yourself: to whom would you rather trust your lives?
*The writer is the initiator and coordinator of BEMOSA (Behavioral Modeling of Security in Airports).