
The Security System Cybersecurity Never Built

The discipline inherited its working model from financial audit, never matured past the prescriptive rule, and now asks its management systems to govern something that was never engineered. The breaches we keep being surprised by are the consequence.


There is a quiet contradiction at the centre of modern cybersecurity. Organizations score well on framework after framework. Their controls operate as designed. Their audit reports come back clean. Their ISO 27001 information security management system (ISMS) is certified. And then they get breached — sometimes catastrophically, often through pathways their control catalogues said were covered.


The standard explanations point at threat sophistication, talent shortages, or budget. None of these reaches the actual cause. The primary reason security doesn't deliver is structural: it is the legacy of how the discipline was built.



The Audit Inheritance


The audit-and-controls tradition grew up around financial reporting. Its purpose was external assurance, and its mechanism was a defined set of controls (reconciliations, approvals, segregations of duty), executed against transactional data, tested by independent auditors, and reported through a standardized opinion. In that domain, the model worked. Controls operated on the same data the outcome was measured against. "The control operated effectively" and "the outcome was achieved" were tightly coupled.


In the 1990s, the same logic was extended to information technology. ISACA built CISA around the auditor's view of IT. COBIT formalized control objectives. When information security needed an assurance vocabulary in the early 2000s, it borrowed the one already in use. SOC 2 extended into security. ISO 27001 framed information security as a management system audited against documented controls. NIST publications enumerated controls and assessment procedures. CIS produced a prioritized catalogue. The frameworks travelled, the certifications travelled — CISSP, CISA, CISM became the credentials of the profession, all of them oriented toward audit and management.


What did not travel was the coupling between control and outcome that made the audit model work in finance. A financial control operates on the ledger; the ledger is the outcome. A cybersecurity control operates on a digital system being actively probed by adaptive adversaries, embedded in an environment that changes daily, and connected to consequences that emerge from interactions across the enterprise. "The control operated as designed" no longer implies "the outcome was achieved." The language continued to suggest it did.


The profession was built on a model designed for transactional assurance and applied to a domain that requires regulatory engineering.


Arrested at the Prescriptive Rule


The clearest sign of the inheritance is that cybersecurity has never matured past prescriptive regulation. Prescriptive regulation specifies what to do — do this, maintain that, implement these. Performance-based regulation specifies what outcome to achieve and leaves the means to the regulated party. Risk-based regulation requires the regulated party to design controls proportionate to its actual risks and to demonstrate their adequacy. Outcome-based regulation holds the regulated party accountable for the goal itself.


The CIS Controls are prescriptive: establish and maintain the secure configuration of enterprise assets and software. NIST 800-53 is prescriptive. ISO 27002 is prescriptive. PCI DSS is prescriptive. The maturity discussions — IG1 to IG3, managed to optimized — are about how thoroughly the prescribed things get done, not about whether the controls work together to deliver requisite security — the actual outcome the business depends on.


Other safety-critical disciplines moved off pure prescription decades ago. Process safety matured from boiler codes and electrical codes into functional safety. Aviation moved from prescriptive airworthiness to performance-based navigation and safety management systems. Each of these disciplines was forced to grow up by the inadequacy of prescription against systems too complex, dynamic, and consequential for recipe-following.


Cybersecurity is now in exactly that position and has not made the journey. The audit legacy is why — audit verifies prescription well and verifies almost nothing else well.


The Missing System


The deeper consequence is that the discipline never built the system that should be doing the regulating.

Cybersecurity has the parts — SOCs, EDRs, SIEMs, identity management, firewalls, vulnerability scanners, incident response teams, analysts, procedures, policies. It has frameworks that catalogue these parts. It has audits that verify the parts were procured, staffed, and are being maintained. What it does not have is an engineered security system — an integrated arrangement of people, process, and technology, designed and integrity-rated, whose function is to keep the business and operations within a defined secure envelope.


The parts exist. The system does not. There is no integrated design basis that spans the human, procedural, and technical layers together. There are no integrity ratings on end-to-end security functions that include the people performing them and the processes governing them. There is no equivalent of a SIL-3 detection function or a SIL-2 containment function — where the rating covers the detection technology, the analyst competency, the response procedure, and the management arrangements as a single engineered whole.


The components are procured separately, staffed separately, trained separately, operated separately, measured separately, audited separately. They are an inventory, not a system.


The ISMS Confusion


The most consequential confusion in the discipline is the assumption that an ISMS is the security system. It is not.


An ISMS is a management system. It manages the activities of doing security — the policies, responsibilities, planning, resourcing, documentation, review cycles, corrective actions, continual improvement. ISO 27001 is explicit about this: it describes how to manage an organization's approach to information security. The artifacts are management artifacts — a Statement of Applicability, a risk treatment plan, internal audits, management reviews. What the standard does not describe, and what no ISMS standard describes, is the engineered system that actually delivers security.


The functional safety parallel makes the gap obvious. A process plant has two distinct systems: a safety management system (ISO 45001, the OSHA PSM management elements) and a safety system (the engineered protective arrangement governed by ISA 84 / IEC 61511). The safety system is not just instrumented functions — it integrates the protective technology with the operators who respond to alarms, the procedures that govern safe shutdown, the engineers who maintain integrity through the lifecycle, and the competency management that keeps the human layer reliable. The SIL rating of a safety function covers the integrated performance, not just the technology. The management system governs the lifecycle of the safety system. The safety system delivers the protection. A plant with an excellent management system and an inadequate safety system is unsafe.


Cybersecurity has the management system and not the system being managed. What the ISMS manages, in the absence of an engineered security system, is the procurement and operation of individual components according to prescriptive frameworks. The management system manages the parts.

This is why a mature ISMS does not prevent breaches at the rate boards expect. The ISO 27001 audit opinion says: the management system is operating as designed, the controls listed are implemented, the cycle is functioning. It does not say — and cannot say, because the standard does not define it — that the security system is delivering the security outcomes the business depends on. Boards hear "we have a certified ISMS" and interpret it as "we have a system that secures our information." What they have is a management system governing activities related to information security. The system that would deliver the security — engineered, integrated, integrity-rated — is not what the certificate covers.


What Functional Safety Got Right


Process safety solved the equivalent problem a generation ago. Hazard analysis (HAZOP, What-If, FMEA) starts with consequences and traces backward to initiating events. Safeguards are derived from the analysis, not assembled from a catalogue. Layers of Protection Analysis makes regulation additive and quantified, with each independent layer reducing consequence frequency by a defined order of magnitude. Bow-tie analysis traces the full causal chain from initiating event through preventive and mitigative barriers to the consequence. Safety Integrity Levels tie required performance to required risk reduction — a SIL-3 function has a quantified probability of failure on demand and is engineered, tested, and maintained against that target.
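The LOPA arithmetic behind those statements is simple enough to sketch. The following is an illustrative example, not a real analysis — the initiating event, the tolerable frequency, and every PFD value are invented for the sake of the walkthrough:

```python
# LOPA sketch: mitigated event frequency and required risk reduction.
# All numbers are illustrative assumptions, not from any real plant.

initiating_event_frequency = 0.1   # events per year (e.g. loss of cooling)
tolerable_frequency = 1e-6         # tolerable events per year for this consequence

# Probability of failure on demand (PFD) for each layer.
# Credit is taken only for layers that are independent of each other.
layer_pfds = {
    "basic process control system": 0.1,   # one order of magnitude credit
    "operator response to alarm":   0.1,
    "relief valve":                 0.01,
}

mitigated_frequency = initiating_event_frequency
for layer, pfd in layer_pfds.items():
    mitigated_frequency *= pfd             # independent layers multiply

print(f"mitigated frequency: {mitigated_frequency:.1e} / year")

# If the existing layers are not enough, the gap defines the required
# risk reduction factor (RRF) for an added safety instrumented function,
# which maps onto a SIL band (per IEC 61511): SIL 1 = RRF 10-100,
# SIL 2 = RRF 100-1,000, SIL 3 = RRF 1,000-10,000.
if mitigated_frequency > tolerable_frequency:
    required_rrf = mitigated_frequency / tolerable_frequency
    print(f"required RRF for a new safety function: {required_rrf:.0f}")
```

With these made-up numbers the existing layers leave a residual frequency of 1e-5 per year against a target of 1e-6, so a new function with an RRF of 10 — the bottom of the SIL 1 band — closes the gap. The point is not the numbers; it is that the discipline has an agreed way to compute them.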


The translation to cybersecurity is direct. For each element of process safety, there is a cybersecurity equivalent that a mature discipline would have built:


  • Safety system → Security system, engineered and distinct from IT operations

  • Safety management system → ISMS, managing the lifecycle of the security system

  • HAZOP / What-If / FMEA → Threat modelling traced to business consequence

  • LOPA → Quantified defense-in-depth with independent layer credit

  • Bow-tie analysis → Threat, preventive controls, compromise event, mitigative controls, consequence

  • Safe operating envelope → Defined secure operating envelope

  • Safety Integrity Level → Required integrity level for a security function, derived from risk reduction needed

  • Management of Change → Security impact of change, integrated with change management

  • PHA revalidation → Periodic re-evaluation of threat model and security system adequacy


Every one of these is a regulatory artifact that sister safety-critical disciplines treat as a minimum requirement. None of them is exotic. All of them are within reach.
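To make the LOPA translation concrete, the same layer-credit arithmetic can be sketched for a cyber scenario — say, credential phishing leading to data exfiltration. Every number below is an illustrative assumption: cybersecurity has no agreed failure-on-demand figures for these layers today, which is part of the point.

```python
# Sketch of LOPA-style layer credit applied to a hypothetical cyber scenario.
# All frequencies and per-layer failure probabilities are invented for
# illustration; no published dataset supplies them.

threat_frequency = 50.0        # credential-phishing attempts reaching users, per year
tolerable_frequency = 0.01     # tolerable major-exfiltration events per year

# Probability that each layer fails to stop the event, with credit
# claimed only for layers that are independent of one another.
preventive_layers = {
    "email filtering":          0.3,
    "user reporting/training":  0.5,
    "phishing-resistant MFA":   0.05,
}
mitigative_layers = {
    "detection and containment before exfiltration": 0.2,
}

residual = threat_frequency
for pfd in {**preventive_layers, **mitigative_layers}.values():
    residual *= pfd            # independent layers multiply

print(f"residual frequency: {residual:.3f} / year")

gap = residual / tolerable_frequency
if gap > 1:
    print(f"required further risk reduction: {gap:.1f}x")
else:
    print("within target")
```

With these invented numbers the residual frequency is 0.075 events per year against a 0.01 target — a quantified, auditable gap of 7.5x that an engineered security function would have to close. That is the kind of integrity claim the article argues the discipline has never built the machinery to make.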


The Question for Boards


Every executive in a process industry can answer "what is your safety system?" — they can describe its architecture, its integrity basis, its accountability, and how they know it is working. Almost no executive in any industry can answer the equivalent question "what is your security system?" in those terms. They can describe their tools, their team, their framework conformance, their certified ISMS. They cannot describe their security system as an engineered arrangement with a design basis and an integrity claim, because they don't have one. The discipline has not produced one for them to procure or build.


The questions to ask are different.


Not "are our controls operating effectively?" but "is our security system actually providing the security the business needs — does it work?" 


Not "is our ISMS certified?" but "what does our management system manage, and is the thing being managed adequate to the threats we face?" 


Not "did we pass our audit?" but "if we removed each component tomorrow, what consequences would re-emerge — and is that an acceptable answer?"


This is engineering work. Engineering work belongs to engineers. The Professional Digital Engineering discussion underway in Canada and elsewhere is not separate from this conversation — it is the same conversation. The protective systems that keep municipalities, utilities, hospitals, and supply chains operating in the digital era are engineered regulatory systems, and they require the discipline, accountability, and regulatory recognition that other engineered protective systems have received.


The Path Forward


Cybersecurity does not need to abandon what it has built. The frameworks, controls, assessments, and management systems are helpful. However, they are not sufficient. What the profession needs is to build the system that should be at the centre of all of it — the engineered security system, integrating people, process, and technology, that keeps the business and operations within a defined secure envelope, against the threat conditions the organization actually faces.


Process safety made this transition over a generation. It moved from prescriptive codes to engineered protective systems with management systems above them, and the result is an industry where boards can answer the question "what is your safety system?" and have a real answer. Cybersecurity is at the same crossroads, with the same legacy problem and the same path available.


The cyber risk terrain has outgrown the audit-derived foundations the profession was built on. It is time for cybersecurity to grow up — to mature past the prescriptive rule, to build the security system the discipline never built, and to become accountable for the outcomes its name has always claimed.



Raimund (Ray) Laqua, P.Eng., PMP is the founder of Lean Compliance Consulting, Inc. He works with organizations to design compliance programs that deliver on their promises, and advocates for Professional Digital Engineering as a regulated discipline in Canada.


Can your compliance deliver on obligations?

The Compliance Capability Assessment gives you an honest picture of where your program stands — and a strategic conversation about what to do next.
