Zero Trust technology works; excuses don’t

Zero Trust technology works; excuses don’t:

Pointing to culture as being the “problem” is a cop-out and shows a lack of tenacity and fortitude. If security is to be put in place, then the culture must come along and accept that, if it wants to survive in today’s threat environment, a degree of discomfort is tolerable.

Leadership needs to make sure everyone knows that:

  • They will be watching the network.
  • All users will be monitored, all the time.
  • Users will have to authenticate to every asset.
  • It’s not their data; it’s the company’s, so the company controls it.
  • Security isn’t optional.

Users need to learn to deal with security — it’s a way of life now (or at least, it should be). If that’s not going to work for some folks, then tell them to go somewhere else and be their security problem — or make the choice to allow them to hinder security and be ready to be part of a breach. Tell the board or shareholders that, thanks to the groans of a few individuals, you have chosen to allow “culture” to threaten the bottom line of the company.

In today’s world, it is no longer acceptable to allow a few individuals’ fears and unfounded concerns about monitoring and security operations to impede a secure digital future for the majority.

(Via Latest Topic for ZDNet in security)

I got out of the habit of posting about this type of #content, but I think this one strikes most of the right notes.

IMHO there is too much hand-wringing and pandering to millennials, more specifically to the idea of what it means to be a millennial, in order to hire and keep them in the workforce. To that end, I like four of the five points above.

It’s the middle one, about users having to authenticate to every asset, that can be problematic. Depending on how zero trust is implemented, that requirement could introduce a significant amount of friction into business processes.
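To make the per-asset requirement concrete, here is a minimal sketch of a zero-trust-style check, in which every request to every asset is verified rather than trusted by network location. Everything here (the HMAC-token scheme, the names, the secret) is hypothetical and purely illustrative:

```python
import hashlib
import hmac

SECRET = b"demo-key"  # hypothetical shared secret, for illustration only

def issue_token(user: str, asset: str) -> str:
    """Issue a per-user, per-asset token (sketched here as an HMAC)."""
    return hmac.new(SECRET, f"{user}:{asset}".encode(), hashlib.sha256).hexdigest()

def authorize(user: str, asset: str, token: str) -> bool:
    """Zero-trust style check: every request to every asset is verified,
    regardless of where on the network it originates."""
    return hmac.compare_digest(issue_token(user, asset), token)

# A token for one asset grants nothing on any other asset:
token = issue_token("alice", "payroll-db")
print(authorize("alice", "payroll-db", token))  # True
print(authorize("alice", "hr-wiki", token))     # False
```

Even in this toy form the friction is visible: a token for one asset grants nothing on any other, so every step of a workflow carries its own authentication cost.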

Friction costs money. Security breaches do, too, so the organization needs to do a serious, objective calculus to make sure the balance is properly struck. Also, regardless of generation, if security adds layers of frustration, that can have a tremendous impact on morale.
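That calculus can at least be started on the back of an envelope. The sketch below uses the standard Annualized Loss Expectancy formula (ALE = Single Loss Expectancy × Annual Rate of Occurrence); every number in it is invented purely for illustration:

```python
# Hypothetical inputs, invented for illustration only.
employees = 1_000
seconds_lost_per_day = 90        # extra authentication friction per employee
working_days = 230
loaded_cost_per_hour = 60.0      # fully loaded hourly labor cost

# Annual cost of the friction the control introduces.
friction_cost = employees * seconds_lost_per_day / 3600 * working_days * loaded_cost_per_hour

# Annualized Loss Expectancy: SLE x ARO, with and without the control.
sle = 2_000_000.0   # estimated cost of a single breach
aro_without = 0.20  # estimated breach probability per year without the control
aro_with = 0.05     # estimated breach probability per year with the control
risk_reduction = sle * (aro_without - aro_with)

print(f"Annual friction cost:  ${friction_cost:,.0f}")
print(f"Annual risk reduction: ${risk_reduction:,.0f}")
```

With these particular made-up numbers, the friction cost slightly exceeds the risk reduction, which is exactly the kind of result that should prompt a conversation before a rollout rather than after.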

The success of zero trust depends on the security team(s), IT, and Risk Management working together to provide value to the business. Respond on social media with your thoughts.


NCSC: Time for Boards to Get Cyber Literate

NCSC: Time for Boards to Get Cyber Literate:

During the speech, Martin posed five basic questions board members should be asking of their technical teams.

These cover: how the organization deals with phishing, privileged IT accounts, software and device patching, supply chain security and authentication.

“Crucially, we are also telling you what to look for in the response,” he added.

“If the answer is: ‘We have hired X and bought Y to address the problem,’ ask the question again. You need to understand what is actually happening — not what activity has been bought.”

(Via Infosecurity)

Couldn’t agree more.

Martin admitted that the government’s strategy on providing businesses with cybersecurity advice and best practice hasn’t worked out as expected, with organizations focusing on good governance and simply outsourcing expertise.

Focusing on good governance is not a bad thing. Many organizations don’t do it well, if at all. However, it might not help much independent of other activities.

Outsourcing expertise also isn’t a bad thing, but boards need to know that they cannot outsource ownership and responsibility. Finding a “trusted security advisor” is a great move, and any worth their salt will help educate the board.

Ultimately, this is the key take-away:

… board members can’t manage risk they don’t understand, so they must become more cyber-literate …



Maybe the National Risk Management Center Will Combat Critical Infrastructure Hacks

The National Risk Management Center Will Combat Critical Infrastructure Hacks:

At a cybersecurity summit Tuesday, Homeland Security secretary Kirstjen Nielsen announced the creation of the National Risk Management Center, which will focus on evaluating threats and defending US critical infrastructure against hacking. The center will focus on the energy, finance, and telecommunications sectors to start, and DHS will conduct a number of 90-day “sprints” throughout 2018 in an attempt to rapidly build out the center’s processes and capabilities.

“We are reorganizing ourselves for a new fight,” Nielsen said on Tuesday, who described the new center as a “focal point” for cybersecurity within the federal government. Nielsen also noted that DHS is working with members of Congress on organizational changes that can be mandated by law to improve DHS’s effectiveness and reach.

(Via Security Latest)

Based on the recent news from the Boston Globe about the TSA wasting resources on zero-value “security”, I am skeptical of how useful this will be in the U.S. Government’s security efforts. I seem to recall something similar was in the works over a decade ago.

However, Secretary Nielsen seems to have said the right things in her talk:

  • Risk-based approach
  • Threat evaluation versus threat chasing
  • Focused on specific critical industries
  • Taking an agile development approach to building out capabilities
  • Working with Congress
  • Being the focal point for government

There are unanswered questions. We will get more answers as the process moves along.

I sincerely hope this isn’t another Security Theater opportunity to waste time and taxpayer resources.


Don’t Panic: Hackers Can Now Steal Data Even From Faraday Cage Air-Gapped Computers

From Hacker News:

A team of security researchers—which majorly focuses on finding clever ways to get into air-gapped computers by exploiting little-noticed emissions of a computer’s components like light, sound and heat—have published another research showcasing that they can steal data not only from an air gap computer but also from a computer inside a Faraday cage.

Fascinating research for sure. If you happen to be one of the few working in an environment where air-gapping and Faraday cages are common, this highlights that they are not 100% effective in isolation (no pun intended). This is a reminder of the value of good security hygiene, physical and analog and digital, and occasional validation of assumptions.

For the other 99.999% of security professionals, there are more practical and pragmatic risks requiring addressing with a higher return on investment. This is a reminder of the value of good security hygiene, physical and analog and digital, and occasional validation of assumptions.

See what I did there?


The Answers Are Out There: Disaster Preparedness

The Answers Are Out There:

At two minutes to noon on Sept. 1, 1923, the ground began to tremble in Tokyo and nearby Yokohama. A 7.9 magnitude earthquake had struck Japan. The shaking lasted for nearly five minutes, causing gas stoves to topple, which in turn ignited thousands of wooden buildings. The fires eventually claimed more lives than the quake itself — more than 140,000 people died in all. Although Japan had experienced earthquakes in the past, this one was different and for a singularly important reason: It inspired the Japanese to focus intently on disaster preparedness.

Almost nine decades later, that readiness was put to the test in extreme fashion. On March 11, 2011, a 9.1 magnitude earthquake struck Japan. Within 10 minutes, a tsunami — which in some places towered as high as a 10-story building — crashed into the coast and swept as far as six miles inland. Unlike in 1923, however, this time Japan’s government and its citizens were ready.

(Via Foreign Policy)

Check it: after natural disasters, the Japanese assess and make adjustments. In between, they practice. They train. They analyze.

How about Bangladesh?

Because of the country’s susceptibility to frequent flooding, it is also vulnerable to the spread of diarrheal diseases, such as cholera. When flooding struck in 1988, such illnesses caused 27 percent of the resulting deaths in one rural area in the country. Yet when Bangladesh was hit by unprecedented floods in August 2017, which damaged or destroyed nearly 700,000 homes, there were virtually no deaths from diarrheal diseases, according to the website Third Pole. The reason? More effective public health measures, including better-equipped medical facilities and greater awareness of the need for preventive action.

And France?

Having learned a bitter lesson in 2003, when the worst heat wave since 1540 killed some 15,000 people there, the country was prepared when a heat wave nicknamed Lucifer struck Europe in August. Temperatures reached a record-breaking 106.9 degrees Fahrenheit in parts of southern France.

There were no reported deaths in France during the Lucifer heat wave, and the United Nations has cited France as a model for how other nations should respond when temperatures spike.

The article lists many examples of how the French learned and managed.

I love this line about Morocco’s efforts:

In an effort to reduce its vulnerabilities, the country has taken a different but equally important approach: focusing on financing risk reduction rather than recovery.

They don’t waste time or effort on theater, the act where governments and organizations decide they want people to feel safer rather than actually be safer.

I wonder what lessons my governments learned and what adjustments they’re making in the wake of the natural disasters we’ve faced in the United States & Assoc.

Bringing it around to the professional, how is your organization preparing to avoid and manage risk in advance of weather-related issues?

Board of Directors – InfoSec Ostriches?

There’s no group of people in an organization whose understanding of the value of Information Security (InfoSec) is more critical than the Board of Directors (BoD).

Dark Reading posted a thought-provoking article about BoDs and how they may think the company’s security posture is better than the reality.

Are they nothing more than InfoSec ostriches, burying their heads in the sand?

The author listed four items in support of this argument:

  1. Lack of baselines
  2. Overconfidence
  3. Don’t know about security incidents
  4. Don’t ask for metrics
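The first and last items are the most mechanical to address: a baseline can start as a single number the board asks for every quarter. A hypothetical sketch, with data invented purely for illustration:

```python
from datetime import date
from statistics import median

# Hypothetical vulnerability records: (disclosed, patched) date pairs,
# invented for illustration only.
critical_vulns = [
    (date(2018, 1, 3), date(2018, 1, 10)),
    (date(2018, 2, 14), date(2018, 3, 30)),
    (date(2018, 4, 2), date(2018, 4, 9)),
    (date(2018, 5, 20), date(2018, 6, 12)),
]

def days_to_patch(records):
    """One board-ready baseline: median days from disclosure to patch."""
    return median((patched - disclosed).days for disclosed, patched in records)

print(f"Median days to patch a critical vulnerability: {days_to_patch(critical_vulns)}")
```

Tracked over time, even one metric like this gives a board something concrete to question, covering both the missing baseline and the metrics nobody asked for.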

I pose additions to the list.

I’d add a general lack of understanding. Boards often see InfoSec as overhead and not as core to the business, even though every company is effectively open 24 by 7, whether in actually completing transactions (retail, banking, etc.) or from a reputation perspective (web site, social media, etc.). Boards often don’t know the terms and acronyms (Security Operations Center [SOC], Security Event & Incident Management [SIEM], Virtual Private Network [VPN], firewall, Identity & Access Management [IAM], Governance, Risk management & Compliance [GRC], etc.), and only the smart, confident board members will ask.

Boards often don’t know or understand security projects, their objectives (what they aim to solve), or their positive impact on the business. This falls squarely on the Chief Security Officer (CSO) and/or the Chief Information Security Officer (CISO), but security managers and leaders can help with this, too.

BoDs lack awareness of security risks. I find this most common in older companies that don’t possess a mature governance and oversight culture. The typical refrain is that “we’re flexible and move quickly; if we had a mature GRC with security-based risk management, we’d lose that flexibility”.

Do you have anything to add to the list? What are some ways of combating the Board’s security ignorance?

Or do you completely disagree?

Over on PVCSec we discuss this topic. Check out the show.


The Determination of Idiots

It is hard, if not impossible, to stop a determined idiot.

This axiom came to mind as I read the story of the man who leapt from a zoo’s train. His athletic jump cleared a sixteen-foot-tall protective fence and landed him right in the pen of a Siberian tiger. The tiger severely mauled the man.

I’ve been trying to think of valid reasons why the man would do that. From the reports I read, the man seemed intent upon clearing the safety fence, so this was no accident. He wasn’t trying to escape a giant space scorpion, an icky bug, or a girlfriend he wanted to break up with. Even if he was trying to commit suicide, I imagine there are ample other avenues available, ones less elaborate and ultimately more effective. This seems to me more in the realm of a frat prank, a dare, drunken misplaced determination, or a desperate ploy to get famous.

I think the guy is an idiot. If one applies Hanlon’s Razor, i.e. “never attribute to malice that which is adequately explained by stupidity”, he’s an idiot. Even if he’s not, even if there’s a valid reason for this bizarre chain of events, let’s assume for the sake of this post that he is an idiot.

We’re left with an idiot who did, deliberately and with malice aforethought, jump into a Siberian tiger’s pen. The tiger did what tigers do: the man was mauled. The man was rescued by zoo staff.

I’m sure that some people read the stories and were appalled that a tiger would do such a thing. They might demand the animal’s destruction for daring to harm a Homo sapiens in such a blatant way.

Others might read that story and say, “How can we let people jump from a moving train over a 16-foot fence into a wild animal’s pen? We must implement laws and elaborate security systems to prevent this.”

The zoo did neither. The zoo’s director, Jim Breheny, handled the situation, in my humble opinion, appropriately based on the actual risk.

The Associated Press quoted Breheny as saying, “When someone is determined to do something harmful to themselves, it’s very hard to stop that. … The tiger did nothing wrong in this episode”.

The most telling part from both a risk management and incident handling perspective is the other statement from the Associated Press article I read: “Zoo officials said they would review safety procedures but stressed that the situation was unusual.” By the way, they’re not going to euthanize the tiger.

“We review everything, but we honestly think we provide a safe experience,” Breheny said in the Associated Press article. “And this is just an extraordinary occurrence. … Somebody was deliberately trying to endanger themselves.”

The lesson: don’t make more out of an isolated incident than is there. If the particular ingredients of a problem are all unlikely or rare, then the response should be proportional. This isn’t to say that there aren’t lessons to be learned from every event. Rather, a rational and judicious response should be scaled to the actual risk.

The corollary: given enough resources, a determined person can defeat security measures. As my Dad said to me after my childhood home was broken into, “If someone wants to get in bad enough, they’ll find a way”.

The same is true for determined idiots.