The State Department’s OIG recently released its Evaluation of the Antiterrorism Assistance (ATA) Program for Countries Under the Bureaus of Near Eastern Affairs and South and Central Asian Affairs (Report Number AUD/MERO-12-29, April 2012).
How much, and where did it go?
- From FYs 2002 through 2010, the Bureau of Diplomatic Security’s Office of Antiterrorism Assistance (DS/T/ATA) and the Bureau of Counterterrorism (CT) were provided nearly $1.4 billion for ATA programs worldwide, with approximately 65 percent of that assistance ($873.3 million) going to programs in North Africa, the Middle East, South Asia, and Central Asia.
- In FY 2011, the ATA program’s budget request was $205 million, with approximately $125 million designated for the 22 countries in North Africa, the Middle East, Central Asia, and South Asia.
- In FY 2010 (or FY 2009?), the ATA program expended approximately $62 million and trained nearly 2,700 participants from countries in North Africa, the Middle East, South Asia, and Central Asia.
- The average training course lasted 13 days and was attended by 21 students, which works out to approximately $23,000 per student per class, or roughly $1,800 per student per day of training (see the quick arithmetic check below).
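For what it’s worth, the report’s cost figures do hang together. Here’s a minimal back-of-the-envelope check using only the approximate numbers cited above; the small gap from the quoted $1,800 comes from running the math on rounded inputs.

```python
# Sanity check of the OIG's per-student cost figures,
# using the approximate numbers cited in the report.

total_spent = 62_000_000   # FY 2010 expenditure (dollars)
participants = 2_700       # trainees from North Africa, Middle East, South/Central Asia
avg_course_days = 13       # average length of a training course

cost_per_student = total_spent / participants               # ~ $23,000 per student per class
cost_per_student_day = cost_per_student / avg_course_days   # ~ $1,800 per student per day

print(f"Cost per student per class: ${cost_per_student:,.0f}")
print(f"Cost per student per day:   ${cost_per_student_day:,.0f}")
```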
The OIG report did not say where the training sessions were held, but seriously, how do you rack up $1,800 a day per trainee for training? Oops, sorry, how quickly we forget. That’s almost as bad as the GSA scandal, which cost federal taxpayers nearly $2,750 per person.
Something about objectives, indicators and lots of strategeries:
- OIG found that for 20 of the 22 countries, CT and DS/T/ATA did not develop specific or measurable strategic or performance objectives in the Country Assistance Plans.
- OIG found that for eight of the 22 countries, CT provided broad strategic objectives that were vague or included an inordinate number of goals.
- OIG found that nearly all of the performance indicators and targets used to define success or failure of a country program were ambiguous, were not measurable, or lacked meaning.
Let’s have some examples:
Lebanon: The strategic objectives for Lebanon directed the ATA program to help modernize and professionalize security forces “through basic and advanced training and equipment and operation upgrades.”
India: The strategic objectives for India directed the program to emphasize critical incident response; post-incident investigation; human rights; border security; international threat finance; extradition and prosecution; and the protection of critical infrastructure, including port, rail, and airport security.
Bahrain and Morocco: A performance objective for both Bahrain and Morocco is to enhance the country’s “capability in investigating, and responding to terrorism.”
Nepal: The two program objectives for Nepal are “to enhance the capabilities of Nepalese police to utilize ATA training” and to “improve capabilities of the Nepalese police to counter and respond to terrorism.”
And the Success Measurement Award goes to ATA Bangladesh, where one performance indicator for measuring success in increasing protection capabilities for Bangladeshi leaders was “regular updates from U.S. Embassy, ATA program visits, and feedback from Bangladesh’s law enforcement community on enhanced institutional management and procedures developed through ATA training to protect national leaders.”
If that’s a measure of success, we’d hate to see what failure is like.
So, c’mon, is this program effective?
“Since 1983, DS/T/ATA has provided ATA program training to participants from North Africa, the Middle East, Central Asia, and South Asia. However, DS/T/ATA could not determine the program’s effectiveness because it had not developed specific, measurable, and outcome-oriented program objectives or implemented a mechanism for program evaluation. In addition, DS/T/ATA and CT were not consulting with DRL when selecting partner countries or when determining the assistance to be provided to those countries because DS/T/ATA and CT officials stated they were unaware of the requirement. As a result, the Department has no assurance that the ATA program is achieving its intended statutory purposes or that the overall or individual programs are successful. Further, DS/T/ATA has no basis for determining when partner countries are capable of sustaining their own ATA program without U.S. support.”
Bottom line answer? Since 1983, who the heck knows?
But you’ll be pleased to know that this has not kept State from pouring more money into a program whose effectiveness has never been proven, since it has no idea how to measure that effectiveness in the first place.
Why don’t we just add the disbursement of funds as an indicator of success and make it easy on everyone?