Cyber in the light of Kabul – uncertainty, speed, and assumptions.
By Dr. Jan Kallberg, US Army Cyber Institute and the United States Military Academy
Aug 25, 2021

The Taliban's swift assumption of power in Afghanistan holds lessons that apply equally to intelligence and cyber operations. It illustrates the consequences of faulty assumptions and the difficulty of staying inside the opponent's OODA loop, especially when one's own concepts remain unclear.

There is a similarity between the cyber community and the Intelligence Community (IC): both operate in a denied environment where the adversary must be assessed from limited verifiable information. The recent events in Afghanistan, with the Afghan government and its military imploding and everything that followed, ran against the prevailing assumptions. The assumptions were off, and events unfolded at unprecedented speed. The Afghan security forces evaporated in ten days in the face of a far smaller enemy, and a humanitarian crisis followed.

There is no blame to be cast in any direction: it is evident that this was not the expected trajectory of events. But still, in my view, there is a lesson to be learned from the events in Kabul that applies to cyber.   

Operating under conditions of uncertainty.

The high degree of uncertainty, the speed with which events unfold in both Southwest Asia and cyberspace, and our reliance on assumptions (not always vetted beyond our inner circles) make the analogy work. According to the media, in Afghanistan there was no clear strategy to reach a decisive outcome. You could say the same about cyber. What is a decisive cyber outcome at a strategic level? Are we just staring at tactical noise, from ransomware to unsystematic intrusions, when we should try to figure out the big picture instead? 

Cyber is loaded with assumptions that we have, over time, come to accept. Those assumptions become a path-dependent trajectory, and in the absence of a grand nation-state-on-nation-state cyber conflict, they remain intact. The only reason cyber’s failed assumptions have not yet surfaced is the absence of a full cyber engagement during a conflict. One creeping assumption is that senior leaders will lead future cyber engagements; meanwhile, the data show that increased velocity in engagements could nullify the time window for leaders to lead. Why do we want cyber leaders to lead? Because that is how we have always done business; it is why we traditionally have senior leaders.

John Boyd’s OODA loop (Observe, Orient, Decide, Act) has had a renaissance in cyber over the last three years. Increased velocity, driven by more capable hardware, machine learning, artificial intelligence, and massive data utilization, makes it questionable whether there is time for senior leaders to lead in the traditional way. The risk is that senior leaders get stuck in the first O of the OODA loop, just observing, or at best in the second O, orienting. There may simply be no time to lead because events unfold faster than our leaders can decide and act. The way technology is developing, I find it hard to believe there will be any significant senior-leader input at critical junctures, because the time window is so narrow.

Leaders will always lead by expressing intent, and that might be the only thing left to them. Instead of relying on precise orders, do we train leaders and subordinates to lead and be led by intent, as a form of decentralized mission command?

Critical infrastructure: high-value or high-payoff?

Another dominant cyber assumption is that critical infrastructure is the probable attack vector. For the last five years, the default assumption has been that critical infrastructure constitutes a tremendous national cyber risk. That might be correct, but there are numerous other risks. In 1983, the Congressional Budget Office (CBO) defined critical infrastructure as “highways, public transit systems, wastewater treatment works, water resources, air traffic control, airports, and municipal water supply.” By the Patriot Act of 2001, the scope had grown to include “systems and assets, whether physical or virtual, so vital to the United States that the incapacity or destruction of such systems and assets would have a debilitating impact on security, national economic security, national public health or safety, or any combination of those matters.” By 2013, in Presidential Policy Directive 21 (PPD-21), the scope widened even further; by now it encompasses almost all of society. That concession stands at ballparks today count as critical infrastructure, alongside thousands of other non-critical functions, shows a mission drift that undermines a national cyber defense. There is no guidance on what to prioritize and what we might have to live without at a critical juncture.

The question is whether critical infrastructure matters to our potential adversaries as an attack vector, or whether it is critical simply because it matters to us. Suppose a potential adversary wants to attack infrastructure around American military facilities and slow the movement from bases to the port of embarkation (POE) in order to delay the arrival of U.S. troops in theater. That adversary might just as well make a different assessment: tampering with the American homeland only strengthens the American will to fight and popular support for a conflict. Or a potential adversary might use our critical infrastructure as a capture-the-flag training ground for its offensive teams, with no further strategic intent behind the activity.

As broad as the definition is today, the focus on critical infrastructure likely reflects what concerns us rather than what the adversary considers essential to reach strategic success. So as we witness the unprecedented events in Afghanistan, where our assumptions appear to have been off, it is worth keeping in mind that cyber is heavy with untested assumptions. In cyber, what we know about the adversary and their intent is limited. We make assumptions based on potential adversaries’ behavior and doctrine, but they are still assumptions.

The failure to correctly assess Afghanistan should be a wake-up call for the cyber community, which also relies on unvalidated information.

A note on the author.

Jan Kallberg, Ph.D., is a research scientist at the Army Cyber Institute at West Point and an assistant professor at the U.S. Military Academy. The views expressed are those of the author and do not reflect the official policy or position of the CyberWire, the Army Cyber Institute at West Point, the U.S. Military Academy, the Department of Defense, or the U.S. Government.