Friday, April 10, 2015

Open System Acquisition via PlugFest Plus

The Secretary of the Air Force’s Bending the Cost Curve (BTCC) initiative recognizes that weapon system cost and schedule overruns are ubiquitous and unsustainable. It asks the question: “Is there dumb stuff we can beat out of the current approach?”




PlugFest Plus (PFP) is the subset of BTCC focused on Open System Acquisition (OSA).

Fixing broken Defense weapon system acquisition generally, and information systems acquisition in particular, has been tried before… We’ll need pragmatic answers to some basic questions if we’re going to make more progress this time…





How will we experiment with better acquisition processes?  We will apply the scientific method, posing and testing the following hypothesis:

If we incentivize industrial innovation by streamlining the procurement process and lowering barriers to entry via vehicles such as Other Transaction Agreements (OTAs), wherein the industrial “performer” is actually a not-for-profit consortium open to any and all potential solution providers…

…and…

We provide a readily accessible developers’ virtual, distributed “sandbox” that provisions processes and tools for agile “plug-and-play” open system engineering, testing, and certification.

…then…

We will steadily, and measurably, increase ROI per defense dollar invested. 




Many enlightened government studies, roadmaps, and watchdog reports clearly articulate the issues associated with Defense acquisition.  These studies typically also identify the desired, improved to-be state.  They even describe the steps necessary to get from the as-is to the to-be.  However, they inevitably assume that the required remedial action can take place within the same Pentagon processes that created the problem in the first place.  This satisfies Einstein’s definition of insanity.




“You cannot solve a problem with the same thinking that created it”
- Einstein
(1879-1955)


This time around we will carefully analyze both past successes and failures.  We’ll learn lessons from the success cases, and we’ll apply them to avoid stepping on the same rakes we’ve stepped on in the past. 

It’s not all bad news.  Throughout history, Government has frequently had profound success in influencing industrial innovation to support policy objectives. For example:

✓ The IRS spawned a thriving on-line marketplace of tax services by giving away computerized tax codes, and providing low-barrier certification against IRS eFile open standards to service providers.  Taxes get filed and refunds get received measurably faster, better, and cheaper than ever before.

✓ The National Weather Service (NWS) has catalyzed a market of value-added weather service providers by investing in meteorological research, and freely providing the resulting trusted data in open standard formats.

✓ DARPA invented and shared the technology, and supported the tradition-bucking community, that launched the open-standard-based Internet.

✓ In a truly Joint effort, the DoD invented the Global Positioning System (GPS).  Enlightened federal policy makes trusted GPS precise time and position data ubiquitously available in open standard formats for commercial use.

Study of these and other success cases reveals a pattern of effective governmental behavior:

  1. Invest in basic research on the hard problem of interest to the government.
  2. Make the resulting intellectual property broadly available to potential industrial innovators.
  3. Reduce industrial risk through a government stamp of approval that acts as a metaphorical “Good Housekeeping Seal of Approval.”





What will we measure? 

Utility-per-cost-per-time… where…

Utility = demonstrated improvement over baseline values of operational capability
Cost = lifecycle cost
Time = development time + testing time + certification time + deployment time

We will define the appropriate measures of effectiveness and measures of performance for all of this, and use them as the basis of PFP solicitations.
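The metric above can be made concrete with a small sketch.  The following is illustrative only: the field names, units, and the particular combination rule (utility divided by cost times time) are assumptions for the sake of example, not the official PFP measures of effectiveness.

```python
# Illustrative sketch of a utility-per-cost-per-time score.
# All names, units, and the scoring rule are assumptions, not the PFP formula.
from dataclasses import dataclass

@dataclass
class PlugfestEntry:
    utility: float         # demonstrated improvement over baseline (e.g., 1.3 = +30%)
    lifecycle_cost: float  # total lifecycle cost, $M
    dev_time: float        # months of development
    test_time: float       # months of testing
    cert_time: float       # months of certification
    deploy_time: float     # months to deployment

    def total_time(self) -> float:
        # Time = development + testing + certification + deployment, per the text.
        return self.dev_time + self.test_time + self.cert_time + self.deploy_time

    def score(self) -> float:
        # Utility per dollar per month: higher is better.
        return self.utility / (self.lifecycle_cost * self.total_time())

# A modest, fast, cheap increment can outscore a bigger, slower, costlier one.
a = PlugfestEntry(utility=1.3, lifecycle_cost=4.0,
                  dev_time=3, test_time=1, cert_time=1, deploy_time=1)
b = PlugfestEntry(utility=1.5, lifecycle_cost=10.0,
                  dev_time=6, test_time=2, cert_time=2, deploy_time=2)
best = max([a, b], key=PlugfestEntry.score)
```

Under this (hypothetical) rule, candidate `a` wins despite lower raw utility, because its score rewards speed and frugality as much as capability.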

How will we measure?  We will adapt best practices from commercial processes like App Store developers’ portals and bake them into a virtual, distributed plugtest system.

I.e., the PFP plugtest system will be informed by the successful commercial rapid-evolutionary, plug-and-play “product line” approach to crowdsourcing technological innovation used by, for example, Apple’s App Store, Android phones, and Microsoft Windows.

Certainly this commercial product line approach includes ruthlessly enforcing compliance with open standard technical interfaces.  We’ll do that too.  However, in successful IT product lines, the specific choice of IT standards follows careful business case analysis aimed at optimizing objectively defined, customer-centric value chains for the enterprise of interest.  Those choices are different for Apple, Microsoft, and the New York Stock Exchange -- because their business models are different.

Further, before would-be IT solution engineers log into commercial application developers’ environments, they must agree to specified standard intellectual property rights agreements, standard profit sharing models, and standard security domains.   In this way, as soon as a technology is successfully verified and validated as “pluggable” into the technical and business architecture, the provider can deploy it and start making money, and consumers can start using it and harvesting value. 

Government “open standard” IT initiatives inevitably fail to address these hard-nosed business issues built into the “app store” model.   Rather, they apparently count on the mere existence of new, abstract, open standard philosophies to inspire good outcomes within the old acquisition processes.  The PFP plugtest approach follows the successful commercial techniques by specifying “plugs” that address open standard business process as well as open standard interoperable technology. 



So… PFP plugtests will answer these questions:

1. Does it Plug-and-Play?

✓ Interoperability?
   - How long does it take to configure?
   - Does the license model support sharing?
✓ Sustainability?
   - What are the lifecycle costs?
   - Will the technology be regularly refreshed?
   - Does the government retain appropriate IP rights?
✓ Security?
   - Does it inherit reciprocal IA and CDS controls?
   - Is the software assured?

2. Does it improve operational outcomes?

✓ Better probability of detection?
✓ Better probability of interdiction?
✓ More accuracy?
✓ Shorter planning cycles?
✓ Less logistic delay time?

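The two question sets above amount to pass/fail gates that a plugtest could evaluate mechanically.  The sketch below is a hypothetical encoding of those bullets; the criterion names and the all-criteria-must-pass rule are illustrative shorthand, not an official PFP rubric.

```python
# Hypothetical encoding of the plugtest questions as pass/fail gates.
# Criterion names and the gating rule are assumptions for illustration.

plug_and_play = {
    "interoperability": True,
    "rapid_configuration": True,
    "license_supports_sharing": True,
    "lifecycle_cost_acceptable": True,
    "regular_tech_refresh": True,
    "government_ip_rights": True,
    "reciprocal_ia_cds_controls": True,
    "software_assurance": True,
}

operational_improvement = {
    "probability_of_detection": True,
    "probability_of_interdiction": True,
    "accuracy": True,
    "planning_cycle_time": True,
    "logistics_delay_time": False,  # example: one target not yet met
}

def gate(criteria: dict) -> bool:
    """A candidate passes a gate only if every criterion is satisfied."""
    return all(criteria.values())

eligible = gate(plug_and_play)                         # must pass outright
improvements = sum(operational_improvement.values())   # count of operational wins
```

One plausible design choice, reflected here, is to treat the plug-and-play questions as a hard gate while counting operational improvements as a graded result that feeds the utility score.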

Again… PFP OTAs will pay for crowdsourcing of requirements broadly across innovative COTS communities, and for demonstrated improvement in utility-per-cost-per-time.  That is, the “performer” on PFP OTAs will be a not-for-profit consortium open to any qualified vendor, with low barriers to membership…

So, having defined the measurable and testable parameters associated with any particular PFP solicitation…

… And having provisioned the virtual, distributed, plugtest system that is readily accessible to OTA consortium members….

PFP sponsors will use objective plugtests to down-select the most qualified vendor teams, set targets for incremental capability improvement, and then immediately execute funding via a pre-greased OTA fiscal process.  This process will take days and weeks rather than months and years…




We will learn by crawling before walking before running.  That is, PFP has initially identified a single capability requirement and modest funding to support it.  The first solicitation will use an existing Army Contracting Command (ACC) OTA vehicle and consortium, namely the C5 OTA and C5 Technologies Consortium.  The first award will be made using a prototype version of the distributed plugtest system in May 2015.  More RFPs will be announced at the event accompanying that first award.



The second batch of PFP awards will be made using an incrementally improved plugtest system in August.  Again, more RFPs will be announced.  By then the Air Force will have created its own tailored OTA… which will be available to accept end-of-fiscal-year 2015 funding…

Meanwhile the capability developed under the first award will be deployed, at least as an operational prototype…

We’ll iterate a few more times…

By the end of FY 16 the PFP process will be well established as a preferred option for program managers across the USAF and joint acquisition community. 

What does PFP success look like? 





… A thriving subset of the COTS IT marketplace aligned with the highest priority USAF information sharing requirements!  

USAF funds issued via OTA will incentivize COTS developers to evolve their next releases to include features identified by USAF operators as the most potentially useful…

Plugtests will verify and validate those COTS products against functionality, interoperability, sustainability, and security targets.

Technology gaps will be identified in this process… Larger S&T projects designed to develop big bang game changers will be spawned…

Meanwhile…

Successfully V&V’d incrementally improved products get placed on “Approved Product Lists”… 

These products become readily available via low barrier procurement vehicles such as the GSA schedule…
 
Pre-approved products, i.e., pre-certified products, that are easily procured get reused by other programs and other operational folks…

Operational folks share great new ideas, which spawn new solicitations…
… And the virtuous cycle continues!

