blog

See You on LinkedIn

John Hollmann is the Manager of the CENTER FOR COST ENGINEERING group on LinkedIn. Check out the posts there.

New Book!: Project Risk Quantification

The topics in the blog posts below are now covered in great detail in this new book. It covers the full, integrated project risk quantification process that results in the industry’s first “universal” capex risk quantification (useful for NPV analysis). The methods flowchart is shown here. A website (www.validrisk.com) has also been launched to cover services related to this topic. This is what the publisher has to say about the text: “Project Risk Quantification presents the most practical, realistic, and integrated approach to project cost and schedule risk quantification that is available today! It offers proven, empirically valid methods and tools applicable to projects of all types and at all decision gates. The text is written for both the manager and the risk analysis practitioner. It will bring reliable accuracy and contingency determination to your capital project organization.”

May 2015: Simple Systemic Risk and Contingency Model

Risk analysis and contingency estimating can be a daunting subject for the uninitiated. Therefore, I wrote an article for a general engineering audience in Chemical Engineering Magazine (Dec 2014). Its title is “Improve Your Contingency Estimates for More Realistic Project Budgets”. The article includes a discussion of “systemic” risks. These risks are artifacts of company and project “systems”; i.e., the culture, process maturity, biases, complexity, technology and so on. We know from historical research how these system attributes drive cost uncertainty, we know these risks are dominant (they cause more cost and schedule impact than the usual suspects in the risk register), and we can rate these attributes objectively. The best method to quantify them is parametric risk modeling (AACE RP 43R-08). However, to create a model one needs historical data, which is hard to come by. The good news is that there is 50 years of industry research to borrow from. The article includes the following tabular version of my systemic contingency model, which borrows from published research by John Hackney, Rand Corp and others. One can use it to get a quick contingency estimate and a range (e.g., the p10/p90 range is about -0.5X/3X the contingency shown).

The table is used by rating your project’s attributes and then finding the corresponding contingency percent. For example, for a Class 3 estimate with medium complexity and technology, a starting contingency of 18% is suggested. To this one must make the “Adjustments” as appropriate. Finally, an allowance for project-specific risk contingency using the Expected Value method must be added (see the full article or contact John Hollmann for more information).
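To make the arithmetic concrete, here is a minimal Python sketch of how the table lookup, the -0.5X/3X range factors, and the Expected Value allowance fit together. Only the Class 3 / medium complexity / medium technology value of 18% and the approximate p10/p90 factors come from the article; the table structure, adjustments, and the example risk probabilities and impacts are hypothetical placeholders, not values from the published model.

# Illustrative sketch only: lookup of a systemic contingency plus an
# Expected Value allowance for project-specific risks. Only the 18%
# (Class 3, medium, medium) cell and the ~-0.5x/3x p10/p90 factors come
# from the article; all other numbers are hypothetical placeholders.

# Hypothetical lookup: (estimate class, complexity, technology) -> starting contingency
SYSTEMIC_CONTINGENCY = {
    (3, "medium", "medium"): 0.18,   # value cited in the article
    # ... the remaining cells of the published table would go here
}

P10_FACTOR = -0.5   # p10 is roughly -0.5x the contingency shown
P90_FACTOR = 3.0    # p90 is roughly 3x the contingency shown

def systemic_contingency(estimate_class, complexity, technology, adjustments=0.0):
    """Return (p50, p10, p90) systemic contingency as fractions of base cost."""
    base = SYSTEMIC_CONTINGENCY[(estimate_class, complexity, technology)]
    return base + adjustments, base * P10_FACTOR, base * P90_FACTOR

def expected_value(project_risks):
    """Expected Value of project-specific risks: sum of probability x impact."""
    return sum(prob * impact for prob, impact in project_risks)

p50, p10, p90 = systemic_contingency(3, "medium", "medium")
# Hypothetical project-specific risks: (probability, cost impact as fraction of base)
ev = expected_value([(0.3, 0.05), (0.1, 0.20)])
print(f"Systemic contingency p50={p50:.0%}, range p10={p10:.0%} to p90={p90:.0%}")
print(f"Project-specific EV allowance: {ev:.1%}; total p50 contingency: {p50 + ev:.1%}")

Run as-is, this prints an 18% p50 with a roughly -9% to 54% range, plus a 3.5% project-specific allowance, illustrating the two-part structure (systemic model plus Expected Value) described above.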

April 2015: Estimate Accuracy: Dealing (Poorly) With Reality

In 2012, my meta-analysis of empirical estimate accuracy research for the process industries was published in AACE’s Cost Engineering Journal. In the same journal issue, a parallel IPA, Inc. paper appeared based on their large industry database. Together, these papers form an empirical (real measurements) stake-in-the-ground as to the state of process industry cost performance and an indictment of our poor risk analyses and contingency and reserve quantification. I have presented this paper, including IPA’s learnings, many times since then in North America, South America, Europe and Asia. It might be wishful thinking, but I see evidence of more realistic, empirically valid risk quantification going on by owners and major EPCs. Hopefully, before too long, “line-item ranging” (LIR; putting high/low percentages on estimate line items and running Monte Carlo) will be a thing of the past and everyone will use risk-driven approaches.

The study (see link) found that the average large project (funded at a Class 4-ish level of scope definition and planning) overran its board-sanctioned amount by about 20% (1.2 on the chart), and about 10 percent of projects overran cost by 70% (1.7 on the chart) or more; i.e., blowouts. This outcome has been consistent for at least 50 years; our practices are not improving. I know from past benchmarking experience that most projects have contingency based on rule-of-thumb or LIR risk analysis methods. We know from research that these methods fail to properly consider the “reality” of known risks, both systemic and project-specific (LIR produces the same curve every time; i.e., the as-estimated red line on the chart). I will not review the RPs in this posting, but AACE has documented best practice risk quantification methods. They are all risk-driven. The approaches I use are parametric analysis for systemic risk and expected value for project-specific risks (see products). If your company is using LIR or rules of thumb, I strongly recommend looking into the AACE Recommended Practices. Contact John Hollmann for further information.
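For readers unfamiliar with why LIR produces the same narrow curve every time, the following minimal Python sketch (with made-up line items and ranges, not any particular project or AACE method) shows the effect: when many line items are ranged independently and summed in a Monte Carlo, the variation largely cancels out, yielding a total distribution far tighter than the empirical outcomes described above.

# Minimal sketch of why line-item ranging (LIR) understates risk.
# Line-item values and the -10%/+25% ranges are made up for illustration.
import random

N_ITEMS = 50              # hypothetical estimate: 50 line items of 100 each
LOW, HIGH, MODE = -0.10, 0.25, 0.0   # per-item ranging applied independently
N_TRIALS = 10_000

totals = []
for _ in range(N_TRIALS):
    total = sum(100 * (1 + random.triangular(LOW, HIGH, MODE)) for _ in range(N_ITEMS))
    totals.append(total / (100 * N_ITEMS) - 1)   # cost growth vs. base estimate

totals.sort()
p10, p50, p90 = (totals[int(q * N_TRIALS)] for q in (0.10, 0.50, 0.90))
print(f"LIR total cost growth: p10={p10:+.1%}, p50={p50:+.1%}, p90={p90:+.1%}")
# The p10-p90 spread comes out at only a few percent around the mean: far
# narrower than the ~20% average overrun (and ~10% of projects at +70% or
# more) seen empirically, because systemic and project-specific risks do
# not behave like independent line-item noise.

The point of the sketch is the portfolio effect: independent line-item ranges average out as they are summed, which is why risk-driven methods (parametric models for systemic risk, expected value for project-specific risk) are needed to reproduce the fat-tailed outcomes the research shows.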
© Validation Estimating LLC