Questions to Ponder on Cybersecurity

By Sam Schwartz, Founder, The University Matters

The reality is that your university will face an impactful, trust-eroding cyber incident at some point, whether an intentional attack or an operational disaster. The question is not if, but when, and there needs to be a plan for when it happens. How often does your governing board (and relevant shared governance bodies) discuss cybersecurity with your university's CTO? Is that discussion scheduled on a regular, predictable basis, or is it ad hoc?

I assert that there are four major risk areas associated with IT failure, whether due to unintentional operational failure or intentional cyberattack: reputational risk, financial loss, data loss, and system downtime. Given the digital assets under your university's purview, how do you plan to mitigate each of these four risk areas once an attack or failure has occurred? Specifically:

  • Regarding reputational risk: What is the plan to respond to a major cybersecurity attack or system failure? Who is the point person for the press? Who briefs senior leaders and this board? Who briefs internal stakeholder groups? What do your crisis planning drills look like? How often do they take place? What is the plan to regain the public's trust when hacks and system failures inevitably occur?

  • Regarding financial risk: What controls are in place to immediately halt transfers into and out of your university's accounts? What policies exist to guide decision-making in ransomware situations?

  • Regarding data loss: Are data backed up automatically, and where are those backups stored? Does the backup or recovery process itself create security risk, for example by exposing data in transit? (One common mitigation, client-side encryption, is sketched after this list.) Are there digital "capture the flag" games, "red team/blue team" exercises, or other penetration-testing sessions on a periodic basis?

  • Regarding system downtime: In a worst-case scenario, how long would it take to get back up and running? Are there systems we know will not fail gracefully? What do your technical disaster recovery drills look like?
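
On the data-loss question, one common answer to the "does the backup itself expose data?" worry is to encrypt on the client before anything leaves the machine, so that neither the transport nor the storage provider ever sees plaintext. Below is a minimal sketch of that pattern, assuming Python and the third-party cryptography package; the file names are purely illustrative, and in practice the key would live in a key-management system rather than be generated inline.

```python
from pathlib import Path

from cryptography.fernet import Fernet  # pip install cryptography


def encrypt_for_backup(src: Path, dest: Path, key: bytes) -> None:
    """Encrypt a file locally so the backup pipeline only ever handles ciphertext."""
    dest.write_bytes(Fernet(key).encrypt(src.read_bytes()))


def restore_from_backup(src: Path, dest: Path, key: bytes) -> None:
    """Decrypt a previously encrypted backup artifact."""
    dest.write_bytes(Fernet(key).decrypt(src.read_bytes()))


if __name__ == "__main__":
    key = Fernet.generate_key()  # illustrative; store keys in a key-management system
    encrypt_for_backup(Path("grades.db"), Path("grades.db.enc"), key)
    restore_from_backup(Path("grades.db.enc"), Path("grades_restored.db"), key)
```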

The answers to each of these questions will depend on the specifics of your university. When thinking specifically about intentional cyberattack, I find that the parable of "How to Outrun a Pack of Wolves" rings true: more often than not, you don't need to outrun the wolves; you just need to outrun the person next to you. Asking these "Questions to Ponder" every year or so, and having answers to them at all, means that your university leaves far easier targets for the wolves to pounce on.

Of course, it is best if there is no successful attack in the first place. Nearly all successful cyberattacks arise from one of two vulnerabilities:

  1. Old software that hasn't been updated recently.

  2. Old-fashioned deception of an innocent human being. The tools of deception may be modern. The practice of deception, however, is as old as humanity itself.

Regarding (1) old software, university IT leadership should be able to answer the following questions:

  • What are the digital assets and tools under your purview, and what is their quality? Digital assets can include (but are not limited to) source code, compiled or installable software, personal data, and multimedia. How often do you inventory these assets and tools? Do they uplift people through their user-facing identification and other elements? (Nobody likes to be treated as a faceless number by an impersonal system.)

  • What specific digital assets have concerning levels of technical debt? That is to say, how old/in need of updating are your assets? Relatedly, what are the external or third-party software, middleware, or hardware dependencies required for your organization's technology stack to function, and are any of them obsolete?

  • What steps do you take to safeguard all of your digital assets and tools? Have you done third-party or internal "war games" testing of your systems to identify weaknesses in your safeguards, such as phishing simulations, penetration testing, or "capture the flag" exercises? What monitoring and logging tools do you employ to detect problems, and are they adequate? (A toy example of what log monitoring can catch appears below.)
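
To make the monitoring question concrete: even a few lines of scripting can surface brute-force login attempts in an authentication log. The sketch below assumes OpenSSH-style log lines and an arbitrary alert threshold; a real deployment would rely on a SIEM or a tool such as fail2ban rather than a standalone script.

```python
import re
from collections import Counter

# Matches OpenSSH-style failures, e.g.:
# "Oct  3 12:01:22 host sshd[311]: Failed password for alice from 203.0.113.9 port 54021 ssh2"
FAILED = re.compile(r"Failed password for .* from (\d{1,3}(?:\.\d{1,3}){3})")


def flag_brute_force(log_lines, threshold: int = 10) -> dict[str, int]:
    """Return source IPs with at least `threshold` failed login attempts."""
    counts = Counter()
    for line in log_lines:
        match = FAILED.search(line)
        if match:
            counts[match.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}


if __name__ == "__main__":
    with open("/var/log/auth.log") as log:  # path varies by distribution
        for ip, n in flag_brute_force(log).items():
            print(f"ALERT: {n} failed logins from {ip}")
```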

Regarding (2) deception, I ask the following questions:

  • What automated tools are you using to help users recognize whether a communication is suspicious? How are you ensuring that blame for mistakes doesn't fall on individuals, but instead building an ecosystem with no single point of failure, for example by requiring two-factor authentication? (A sketch of how time-based two-factor codes are verified appears after this list.)

  • What is done to ensure security policies are user friendly? For example, if users are forced to update their passwords so frequently, or to satisfy so many unwieldy composition rules, that some write the password on a sticky note posted to their keyboard, that is an ineffective security policy, not a failure of the employee. (A toy password check in this spirit follows the list.)

  • What steps do you take to ensure users have the correct level of access and computational resources for their role? How, and how often, is this verified?
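
On the two-factor point: the codes produced by most authenticator apps are time-based one-time passwords (TOTP, RFC 6238), which a server can verify with nothing more than a shared secret and a clock. Here is a minimal sketch using only the Python standard library; the example secret, 30-second step, and one-step drift window are illustrative defaults, not anyone's production configuration.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, at: int | None = None, step: int = 30, digits: int = 6) -> str:
    """Compute the TOTP code for a base32 shared secret at a given Unix time."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = (int(time.time()) if at is None else at) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def verify(secret_b32: str, submitted: str, window: int = 1) -> bool:
    """Accept codes from adjacent time steps to tolerate clock drift."""
    now = int(time.time())
    return any(
        hmac.compare_digest(totp(secret_b32, at=now + drift * 30), submitted)
        for drift in range(-window, window + 1)
    )


if __name__ == "__main__":
    secret = "JBSWY3DPEHPK3PXP"  # example base32 secret, not a real credential
    print(verify(secret, totp(secret)))  # True
```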
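
And on password policy: current NIST guidance (SP 800-63B) points in exactly the user-friendly direction argued above, favoring length and breach-list screening over composition rules and forced rotation. A toy sketch, with a stand-in denylist in place of a real breach corpus:

```python
# A stand-in for a real breached-password corpus such as the Pwned Passwords list.
COMMON_PASSWORDS = {"password", "123456", "letmein", "qwerty"}


def acceptable(password: str, min_length: int = 12) -> tuple[bool, str]:
    """Length-first password check: no composition rules, no forced rotation."""
    if len(password) < min_length:
        return False, f"Use at least {min_length} characters; a passphrase works well."
    if password.lower() in COMMON_PASSWORDS:
        return False, "That password appears in known breach lists; pick another."
    return True, "OK"


if __name__ == "__main__":
    print(acceptable("correct horse battery staple"))  # (True, 'OK')
    print(acceptable("qwerty"))                        # (False, ...)
```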

To this last point, I note that many sufficiently large private companies, nonprofits, and governments operate with a least-privilege mindset toward end users: grant only the access and computational resources strictly necessary for the role. Acceptable use agreements are written to protect the firm or agency, not the regular user. While this makes a lot of sense for a company or government unit, I dissent from this mindset in the university setting.

Universities, particularly public universities, should allow their end users (e.g., students, employees, community members) to operate in digital spaces with the greatest possible freedom. As we in higher education structure cybersecurity, privacy, and acceptable use policies that set a cultural precedent and can stand the test of time, I look to the Chicago Principles and other existing university speech policies for inspiration. As universities think about guiding principles for acceptable user conduct in digital spaces, I submit the following as a starting point for discussion:

As a public institution, the university will sustain a higher and more open standard for freedom of behavior in digital spaces than may be expected or preferred in private settings. Freedom of action in digital spaces is a cornerstone of an academic institution committed to the creation and transfer of knowledge. The expression of diverse actions in digital spaces is crucial for learning and understanding, not solely for those who present and defend a way of digital life but for those who observe, agree, disagree, and pass judgment on those lifestyles. The belief that an action in digital spaces is pernicious, despicable, detestable, or offensive cannot be grounds for its suppression.

Freedom of action in digital spaces does not, of course, mean that individuals may do whatever they wish, wherever they wish. The University may restrict actions that violate the law, that falsely defame or digitally limit others, that constitute a genuine threat or harassment, that unjustifiably invade substantial privacy or confidentiality interests, or that are otherwise directly incompatible with the functioning of the University, including undue risk in the four areas of reputational risk, financial risk, data loss, or system downtime. But these are narrow exceptions to the general principle of freedom of action in digital spaces, and it is vitally important that these exceptions never be used in a manner inconsistent with the University's commitment to a completely free and open digital ecosystem.


Sam Schwartz

Founder
The University Matters

Sam is the founder of UMatters.org. He is an assistant professor of computer science at the University of Wisconsin–Eau Claire. Views are his own.