Audit of the VA's Project Management Accountability System (PMAS) Implementation
Here’s some light reading: the U.S. Department of Veterans Affairs’ Office of Inspector General’s “Audit of the Project Management Accountability System Implementation.” Known as “PMAS,” the system was put in place in 2009 to provide better oversight of the VA’s troubled IT development projects, in light of a history of cost overruns and failed IT projects at the department.
Having been a VA consultant (in the area of social media), and having done my own detailed review of PMAS’s initial guidebook issued by the VA’s CIO, I have some familiarity with the environment and the implementation challenges PMAS faced.
The VA’s expenditures on system development are huge and complex, and PMAS looked like an overdue dose of oversight and transparency. Having been responsible for developing project management offices myself, though, I could sense some of the difficulties to come. Unfortunately, as the report documents in some very damning findings (without naming names), only two people were assigned to implement and manage PMAS operations. These operations apparently centered on a rather traditional “stoplight” format dashboard.
Most troubling, according to the report, was the unreliability of the data feeding the dashboard, and the fact that baseline data could be replaced with current data, a practice that totally defeated any ability to track performance and accountability.
What’s the moral of this story? Evidently, providing serious oversight of multimillion-dollar programs requires much more time, attention, and resources than the VA was able (or willing) to devote to the problem. Nor can we tell what political forces, if any, also played a role in undermining the PMAS effort.
For example, were the staff and managers of individual projects always completely forthcoming and cooperative in providing input data to the PMAS staff? The report describes incidents in which troubled (and ultimately cancelled) projects were consistently shown as “green” despite repeated problems.
Also, was it appropriate to create a “dashboard” process which, by definition, assumes the availability of appropriate and timely data? Or would it have been more appropriate and effective to start out with less formalized, more personal reporting processes that take advantage of modern telecommunication and conferencing, so that program and project managers could interact spontaneously in real time? Such processes are time-consuming and weigh heavily on senior management, but perhaps they should be attempted, at least as an interim measure.

Even more significantly, were the managers of targeted projects collaboratively involved in developing the PMAS approach? Had they been, would they have raised issues concerning the level of effort involved in implementing and operating PMAS?
Copyright (c) 2011 by Dennis D. McDonald