WSRcg - What We Do As Software Project Failure Expert Witnesses

On this webpage we present:

  • Software & System Failure Litigation
  • Typical Issues/Charges between Parties in Software Project Failure Litigation
  • Going Deeper into Software Project Failure
  • Selected WSRcg ERP Litigation Experience
  • Background Statistics on Software Project Failure
  • What Can Be Done? Why Are Almost 30% of Projects “Successful”?

Software & System Failure Litigation

WSRcg provides expert witness services for litigation involving large-scale system project and software failures. We have provided opinions or testimony in such matters in North America, Europe, and Asia, in courts at all levels (including US Federal Courts, Superior Courts, and the Court of Federal Claims), and in arbitration and mediation. As pioneers in the areas of systems development methodology, systems testing and acceptance, systems contracts, and CPR (Cooperative Project Renewal – A Methodology for Resuscitating Drowning Projects), WSRcg experts use their particularly rich experience and diverse expertise to quickly uncover the root causes of a system and/or project failure in terms of quality, cost, schedule, functionality, and performance.

Over the past 20 years, in more than 100 IT, computer, software failure, and software project failure matters in North America, Asia, and Europe, WSRcg has learned that the same eleven attributes (i.e., underlying causes) appear as important claims in the pleadings/complaints prepared by both parties. These eleven items weigh heavily as potential root causes in virtually all such failures (and successes).

While in most cases multiple parties contribute to a failure, courts typically hold one party more responsible than the others, and the determination of liability can often be counter-intuitive. To help the court correctly attribute primary blame for a software or software project failure, it is important to carefully isolate and identify the root cause of the failure, which in turn depends on the degree to which each of the eleven forces, and the related decisions, played a role in shaping the events of the project.


Typical Issues/Charges between Parties in Software Project Failure Litigation

While the IT, computer, and software industries have changed dramatically over the past twenty years (the flattening world’s adoption of the Internet and the World Wide Web, 4th generation languages, Sarbanes-Oxley, smart phones, mobile computing, Web 2.0, outsourcing, SaaS, hosting, cyber terrorism, identity theft, Bill Gates’ retirement, and ever more complicated and sometimes overreaching ERP software solutions), computer and software failures still abound in near-record numbers. The new 3G iPhone cannot be activated for hours, and in some cases days. ERP software failures and software project failures are reported each week. Over $55 billion was written off for failed and cancelled computer, software, and systems projects in the USA alone a few years ago, and that is only what was reported!

Over the past 20 years, Warren S. Reid, Managing Director of WSR Consulting Group, LLC, has compiled, updated (although little has changed), and published what he believes are the recurring main complaints between the constituents of software projects: (1) Acquirers (i.e., in-house counsel, CXOs, Board members), (2) Vendors/Integrators/Consultants, (3) User Departments and Users, and (4) In-house IT departments. These claims and charges are universal. Warren Reid’s widely published work, “He Said ... She Said ...”, is summarized below:

The “HE SAID… SHE SAID…” Chart for Failed Software Projects

(For each “-ability” below, the first statement is the customer/acquirer’s charge; the second is the vendor/integrator’s reply.)

1. Feasibility
   Customer: “The system doesn’t work; it’s not what we wanted.”
   Vendor: “You changed your minds; you don’t know what you want or need.”

2. Capability
   Customer: “You delivered limited functionality.”
   Vendor: “You continually changed project scope.”

3. Compatibility
   Customer: “The system failed in the field and in production.”
   Vendor: “You didn’t perform the required ‘business process reengineering’ to make it work.”

4. Credibility
   Customer: “Your software, services, and expertise were oversold.”
   Vendor: “You conducted your reference checks and due diligence; what didn’t you know?”

5. Usability
   Customer: “No one can use the system! Poor training.”
   Vendor: “‘Required staff’ never came to primary training or refresher training.”

6. Stability
   Customer: “The system is ‘fundamentally flawed.’”
   Vendor: “We only need two months to fix the issues.”

7. Culpability
   Customer: “You never told us that! You gave poor advice!”
   Vendor: “You didn’t follow our recommendations; you changed or delayed making decisions.”

8. Reliability
   Customer: “The system is full of bugs! Bad data conversion and interfaces caused problems.”
   Vendor: “Systems always have bugs!”

9. Responsibility
   Customer: “You failed as Systems Integration Project Manager (SIPM).”
   Vendor: “No! YOU failed as the SIPM. That role was not my job!”

10. Availability
   Customer: “You baited and switched! You provided unqualified, unstable, uncommitted staff, project manager(s), and Steering Committee.”
   Vendor: “YOU baited and switched! You provided unqualified, unstable, uncommitted staff, project manager(s), and Steering Committee.”

11. Suitability
   Customer: “You abandoned good Project Management and System Development Life Cycle (SDLC) methodologies.”
   Vendor: “You were unwilling to comply with agreed-to, promised, and necessary methods (to save $$ without the associated risk).”

While it is often true that all parties could have contributed to the causes of a software or system project failure in a given area, it is our experience that:

  • One party usually contributes more to the failure and its root causes; and
  • One party will have done something (or failed to do something it was supposed to, or should have, done) that set the wheels in motion, or the dominoes falling, toward the root causes of project failure, although this often does not become evident until our investigation and review of the discovered materials.

As you will see below, however, these eleven attributes do not tell the trier of fact the whole story of project failure. WSRcg has identified additional levels of hidden internal conflicts that go beyond the eleven attributes. These, too, doom projects, sometimes even before they are started!


Going Deeper into Software Project Failure

Warren Reid’s experiences in software implementation and litigation matters have led him to further investigate and deconstruct the business, corporate cultures, and profit models of each constituent, and the impact these factors have on software development, implementation, and maintenance projects, AND on new computer product launches. The new model provides a better understanding, at a visceral level, not only of the clashes between the constituent parties, but also of the conflicts and divergent goals within each of them. What WSRcg’s expert witnesses know is that, in many cases, computer/software projects fail even where constituents employ strong SDLC, requirements elicitation, staffing, scope control, testing, and project management processes. Software projects will continue to fail as long as these internal conflicts and divergent goals are not surfaced, addressed, and/or mitigated.

This insight and knowledge can then be useful and instructive in:

  • Project success: developing concrete and practical recommendations to help assure project and product success, and
  • Litigation testimony: uncovering the hidden and obscure meanings and purposes of specific party actions that damaged the chances of project or product success. Understanding and explaining the self-interested attitudes, policies, strategies, and even intentions behind the behavior, decisions, and actions taken and not taken by the parties is instructive to the triers of fact. Litigators have told us that it is often helpful to the jury’s understanding and appreciation of the facts to know not only what was done, but why something was (or might have been) done, and how that directly contributed to cost overruns, undelivered functionality, schedule delays, unacceptable project risks, and unhappy customers; or, alternatively, why the customer is being unreasonable in not accepting a “successful” system delivered per the contract or industry standards.

WSRcg believes that uncovering these internal party conflicts requires special additional expert witness skills and experience above and beyond software project, technical and IT leadership expertise. These areas delve into each constituent’s internal conflicts, divergent goals, company culture, level of risk tolerance, and business and product life cycle goals and positioning.

This knowledge must then be integrated with an understanding of the larger business climate/risks facing constituents including worldwide technology disruptors, the general economy, competitive pressures, industry consolidation, and near and longer term business intelligence. Together, all of these data points and inferences will tell a story of why executives, technologists, department heads and professionals made the decisions they made. It is our experience that these types of analyses require new and creative document discovery approaches and requests, and deposition questioning to unearth and prove these hidden realities in litigation. In a few of our cases, the final verdicts/judgments were dependent on these discoveries and a clear and simple presentation of these facts and opinions to the triers of fact.


Warren S. Reid reserves all rights for this adaptation only. The author of the original view of these risks is unknown to me, but I suspect it could be either Ed Yourdon or Capers Jones.

The first level of the model (Model 1 below), describing the clashes between constituents, was originally developed by Prof. Barry Boehm and his graduate students in the Ph.D. program at the University of Southern California. WSRcg has built upon this foundation to create Model 2 below, which identifies many of the internal conflicts and divergent goals affecting each project constituent internally. Warren Reid, Managing Director of WSRcg, will be guest lecturing on his “conflicts/goals divergence” additions to the model next semester in USC’s Graduate School of Engineering program. Mr. Reid has already presented his new views, experience, and materials to CIO groups in Southern California to an enthusiastic and thoughtful response.

WSRcg believes that these new concepts and findings will not only help in developing and presenting a true and persuasive story to the triers of fact of what really happened, but will also lead to better ways to improve IT contracting and to identify, mitigate, and manage all levels of software, corporate, and industry risk. Ultimately, they will help to assure software and computer systems project success.

Below are two work-in-process models. The first one, based on the work of Dr. Barry Boehm and his USC Software Engineering Ph.D. Students, presents the “Clashes” between the IT and software project constituents.

The second model, shown below, is based on Warren Reid’s experience and expertise on large-scale software and ERP projects and as an expert witness in computer, software and ERP project failure litigation. This model explores the underlying “Internal Conflicts and Goals Divergence” within each constituent.

For a full-size image and discussion of these models, please contact Warren Reid.


Selected WSRcg ERP Litigation Experience

Format: Case Reference – Case Type

Engaging counsel’s client (Plaintiff or Defendant) – Party’s role in matter

Very Large e-Commerce Retailer v ERP Vendor – ERP systems project and unsuitable software failure
PLAINTIFF – CUSTOMER

Dispute: A customer licenses e-commerce software from an ERP vendor, only to find that the vendor’s Platinum Implementation Partner is unable to configure, customize, and install the system per the agreement. Additionally, the system has insurmountable scalability problems and cannot meet the customer’s required volumes. The Plaintiff sues for fraud.

Our Role: WSRcg’s Expert Witness was deposed and testified at trial about: the failure of the vendor to properly estimate the project and the special customization requirements of the customer; the unsuitability of the software for the job; serious strategic software design and architecture decisions made by the vendor regarding the software and the suite that would prevent a “successful” implementation without much more investment and cost – information that the vendor allegedly withheld from the customer.

Result: Pending.

 

Health Care Provider v ERP Vendor – ERP computer project failure
JOINT DEFENDANTS – CUSTOMER & VENDOR

Dispute: An integrator/outsourcer sues the hospital defendant for $10 million, claiming it was ready to go live with the system several months earlier but for delays caused by the vendor and the customer.

Our Role: Both Warren Reid and Randy Brown (of WSR Consulting Group, LLC) were deposed and testified in arbitration that the Integrator had: poorly staffed this mission-critical project; discarded reasonable project management practices, standards, and tools for a project of this size and nature; abandoned the industry-standard, contractually promised SDLC methodology; failed to execute a reliable test strategy/plan; never stabilized the system’s infrastructure; and contributed to concurrent delays.

Result: Plaintiff wins. Judgment being appealed.

 

Big 5 Consulting Firm – systems integrator role in ERP failure
DEFENDANT - INTEGRATOR

Dispute: The Fortune 500 customer alleged that for the $50 million fee, the integrator failed to properly develop and implement a large ERP software project.

Our Role: Our expert report covered: role of System Integration Project Manager; control over subs; system stability, performance, functionality; scope creep; database design; go-live readiness.

Result: Settlement favorable to Integrator.

 

Multiple Insurance Companies
DEFENDANT - INSURERS

Dispute: Plaintiffs claimed “sue and labor” coverage in Y2K matters. One case alone was for $74 million.

Our Role: To determine to what extent the remedial costs expended by the insureds were reasonable, were paid for “fortuitous” events, and really saved the insurers money. We also defined and applied new technology and meaning to the terms “destruction, distortion and corruption of data” in time-honored insurance coverage policy interpretation.

Result: All cases were settled or won with no insurer liability or payout.



Very large Internet Service and Content Provider in a Class Action Lawsuit
DEFENDANT – DEVELOPER

Dispute: Stockholders claimed that the company failed to adequately disclose major changes in its technology strategy in its prospectus.

Our Role: We demonstrated that the changes were evolutionary, concurrent with and compatible with the development of the Internet, and were disclosed consistent with common industry practices.

Result: Arbitrators awarded most favorable result to defendant.

 

International Medical Company
PLAINTIFF - CUSTOMER

Dispute: The matter involved a system, developed by a major outsourcer, that failed to perform.

Our Role: Our research, report, and testimony helped show that the system developed was not of workmanlike quality, was not built using the outsourcer’s “System Life Cycle Methodology” or any industry standards, and did not work.

Result: Judgment of fraud against defendant.

 

U.S. Government and President of the United States
DEFENDANTS – CUSTOMER

Dispute: The President, as head of the Base Realignment and Closure (BRAC) Commission, agreed to close down a computerized robotics Army base that had failed to meet contractual performance requirements during Desert Storm. The government refused to pay the $150 million claimed by the plaintiff software, robot, and smart-building developer.

Our Role: With secret clearance, our team visited the base site, reviewed and analyzed performance history and stats of the base during the war, and developed a 3-D simulation model which proved that there were fundamental and pervasive design and implementation flaws and decisions unilaterally made by the plaintiff developer that caused the system (and the robots) to crash.

Result: Overwhelming and complete exoneration for defendants

 

Developer of ERP/Wholesale Distribution Software – misappropriation of trade secrets and bad faith
PLAINTIFF - DEVELOPER

Dispute: Misappropriation of trade secrets and bad faith: a large company pretended to perform due diligence, but in the end rejected the purchase of the Plaintiff company in bad faith for alleged poor performance, and instead very quickly developed and released a competing system with a different GUI.

Our Role: We demonstrated that the Defendant’s system had a virtually identical suite of functionality and underlying algorithms, and opined that the Defendant had stolen these trade secrets. We also showed that the Defendant stacked the due diligence testing to fail on purpose by applying unrealistic data volumes and criteria.

Result: The Defendant settled in favor of the Plaintiff the day before the trial.

 

Malaysian Stock Exchange
PLAINTIFF - CUSTOMER

Dispute: Issues included as-promised design vs. as-produced design, project management and budgeting/estimation issues, poor system quality, readiness to go live, and contract interpretation.

Our Role: We were engaged to determine whether or not the exchange had a viable claim against its computer systems vendors, developers, and integrators.

Result: Case settled in favor of Plaintiff

 

Large Magazine Subscription Company
PLAINTIFF - CUSTOMER

Dispute: The Integrator/Developer failed to properly install a new system involving very complicated database requirements.

Our Role: Our work involved source code issues, quality of design and implementation, the Year 2000 issue, recovery and restart failures, performance, and other issues.

Result: Case settled at very favorable terms to plaintiff.

 

Canadian Government
DEFENDANT – CUSTOMER

Dispute: The Government terminated a contract with a Big 5 consulting firm in a project to build a fully automated, state-of-the-art federal employee payroll system.

Our Role: Our expert reports showed failings in project management, risk management, quality management, configuration management, estimation, design and staffing – all of which caused delays that triggered the termination for cause.

Result: Consulting Firm plaintiff settled on terms favorable to the government.

 

International Fast Food Company
DEFENDANT - CUSTOMER

Dispute: The developer sued over the customer’s refusal to install and pay for an integrated point-of-sale (POS) system.

Our Role: We testified and demonstrated that the system was fundamentally flawed, would never have worked, was not fault tolerant, could not handle peak loads, and was inappropriate for the intended environment.

Result: Defendant exonerated – no damages.

 

Manufacturer of Hi-Tech Products
PLAINTIFF – DEVELOPER

Dispute: The right to import into the U.S. hi-tech products that permit the unlocking of certain security devices (hardware locks) on PCs.

Our Role: Delivered written testimony and proofs that there exists a "substantial legitimate commercial purpose" for the product.

Result: Products allowed for import.

 

State Government with Large Dispatch System
DEFENDANT - CUSTOMER

Dispute: The State Government hired a large integrator to install a complex computer-aided dispatch system, but cancelled the contract after many delays and poor testing results. The Integrator sued for $100 million.

Our Role: Our expert analysis and report revealed that the integrator had grossly underestimated the project due to its inexperience with these types of systems, compounded by poor project management tools, poor collection of user requirements, insufficiently qualified staff, and inadequate testing approaches.

Result: The jury rejected the lawsuit and awarded the Defendant $1 million for legal costs.


Background Statistics on Software Project Failure

The IT/software industry has continued to struggle to bring software development and/or implementation projects in on schedule, on budget, on point, and on target. The Standish Group’s Chaos Reports and analyses show this rather clearly, as does the work of Capers Jones and others.


What Can Be Done? Why Are Almost 30% of Projects “Successful”?

How to Make Your Project A Success and Run with the Winning 30%
  • it’s about employing good standards, training, tools, methodologies, and techniques; and recognizing they are still evolving,
  • it’s about considering alternatives in context (i.e., which SDLC approach is best for your project and organization now: evolutionary/spiral, prototype, planned, agile?); different projects may be better served by different methodologies,
  • it’s about identifying, sharing, managing and mitigating known and likely risks,
  • it’s about planning, executing, and reporting progress in the way the different stakeholders need to see such information. It’s about all constituents knowing where each stands in terms of actual progress, critical path, estimates to complete, and earned value, and estimating what it will take to complete its part of the project and the project as a whole (see the earned value sketch after this list),
  • it’s about open, honest, measurable, and metric defined progress reports, prioritized work assignments, balanced staff loading, and assigning staff with the right skills,
  • it’s about knowing and evaluating the risks associated with deviating from agreed to promised and/or “industry standard” plans and methodologies.
  • it’s about making sure the earlier deliverables are of a quality and scope that provide the foundation for the rest of the software and system to be built upon.
  • it’s about being tough, clear AND flexible – all at the same time.
  • it’s about testing, testing, and more testing -- but it must be the right kinds of non-duplicative testing! It’s about functional, end-to-end, use case, and usability tests; it’s about building the system with the user in mind, i.e., with the acceptance test criteria and process agreed to at or near project start,
  • it’s about software/product performance, including: security/privacy, reliability, portability, load testing, stress testing, backup, recovery/restart, archiving, maintainability, regression and static tests (e.g., reviews, structured walk-throughs, inspections), and quality assurance and control,
  • it’s about not taking your eye off the ball – it’s about having your key stakeholders agree to a measurable definition of project “success” & software acceptance/readiness to go-live in measurable ways so there is something to aim for and to know when you are done,
  • it’s about anticipating and absorbing staff turnover, lack of required skills,
  • it’s about getting your facility ready and your networks, wiring, specialty furniture and HVAC requirements installed; it’s about redundancy in design and back-up procedures in the event of component or power failures,
  • it’s about ordering, deploying, attaching, and testing the hardware, network, and system control & command software, and getting the help desk hooked up and tested in place,
  • it’s about training your users and providing refresher training for those who forgot what they learned or missed the training,
  • it’s about deploying security codes and documenting security-level policy regarding access to different software functionalities,
  • it’s about catching, reporting, isolating, fixing, testing, and regression testing before putting code back into the library. It’s about only promoting software and software fixes to production after all the proper reviews and signoffs have been performed
  • it’s about users, super-users, technicians, data base administrators, performance engineers, trainers, help desk staff and systems, and documentation being in place to permit smooth management, operation, maintenance and control of the system, the on-line and batch processes, archiving, tuning, etc.
  • it’s about creating service pack standards and procedures, processes and controls for downloading/uploading software patches and versions more easily and correctly – and without the need for special techs on site.
  • it’s about people, processes, new roles & responsibilities, working with new reports & information in new formats to achieve the benefits, competitive advantage, growth and cost savings that made up the business case for going ahead with the project.
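
To make the earned value bullet above concrete, here is a minimal sketch, written in Python with hypothetical numbers and a function name of our own choosing; it is an illustration only, not part of any WSRcg methodology. It simply applies the standard earned value management formulas (SPI = EV/PV, CPI = EV/AC, EAC = BAC/CPI) to show how actual progress, cost performance, and estimates to complete can be reported to all constituents in measurable terms:

    # Minimal earned value management (EVM) sketch: standard formulas,
    # hypothetical numbers chosen only for illustration.

    def earned_value_report(bac, planned_pct, earned_pct, actual_cost):
        """Compute basic EVM metrics for a project at a status date.

        bac         -- budget at completion (total approved budget)
        planned_pct -- fraction of work scheduled to be done by now
        earned_pct  -- fraction of work actually completed by now
        actual_cost -- money actually spent so far
        """
        pv = bac * planned_pct   # planned value
        ev = bac * earned_pct    # earned value
        ac = actual_cost
        spi = ev / pv            # schedule performance index (<1.0 = behind schedule)
        cpi = ev / ac            # cost performance index (<1.0 = over budget)
        eac = bac / cpi          # estimate at completion (CPI-based forecast)
        etc = eac - ac           # estimate to complete
        return {"PV": pv, "EV": ev, "AC": ac,
                "SPI": round(spi, 2), "CPI": round(cpi, 2),
                "EAC": round(eac), "ETC": round(etc)}

    # Example: a $10M ERP rollout with 50% of the work scheduled, 40% done, $5.5M spent.
    print(earned_value_report(10_000_000, 0.50, 0.40, 5_500_000))
    # -> SPI 0.8 (behind schedule), CPI ~0.73 (over budget), EAC ~ $13.75M

An SPI or CPI drifting well below 1.0 early in a project is exactly the kind of measurable, reportable warning sign the list above calls for surfacing to every constituent.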



Call Warren S. Reid at (818) 986-8842 if you have any questions or wish to discuss your case.


WSR Consulting Group LLC · Los Angeles · (818) 986-8842 · www.wsrcg.com · wsreid@wsrcg.com
Copyright © 2004, 2005, 2006 WSR Consulting Group, LLC. All rights reserved