We have a few fascinating stories from the trenches for you today as we delve into our crypt of memories! We do a lot of penetration testing and cybersecurity reviews at Samurai, and, unfortunately, people often ignore our advice. This is quite strange when you consider that we are paid to dispense that advice; for it not to be taken to heart seems counter-intuitive.
During a risk assessment many years ago, we assessed a client’s systems, processes and policies, and discovered several things that were not right. We compiled and presented a report listing about ten things our client needed to address. They failed to address them and were breached via something at the very top of the list. The client called to inform us they had been breached, and we advised that a plan should be put in place immediately to fix everything as a preventative measure. Unfortunately, they only wanted to fix the issue that caused the breach, not everything on the list.
We went ahead and fixed the main issue. A few weeks went by, and the same client was breached again via something further down the list. The client put the second breach down to being incredibly unlucky! We fixed the second problem, and they left things at that. Despite warnings and recommendations from Samurai, yet another breach occurred!
Amazingly, people ignore our advice, pay the price for it, and continue to ignore it. And so the vicious circle continues. It seems to be a very difficult lesson for people to learn. It may be down to some odd cognitive bias, or perhaps the ‘ostrich effect’ of burying your head in the sand. Why it happens is unclear, but it happens too often to be anything other than a human trait. Pointing a finger at yourself and identifying your own faults can be quite difficult…
So clients are just ignoring the results entirely?
We have had situations with penetration tests where we found a few things wrong and were ignored, even after emphasising the gravity of the situation. And, of course, there were breaches off the back of that. Maybe the worst of those was a massive organisation we did a test for years ago. We reviewed their system for free, and we discovered that SMB version 1 was open and exposed to the internet. This was before the Shadow Brokers leaked ‘EternalBlue’, and at the time there was no WannaCry. However, we still knew that having SMB exposed to the internet was a dumb idea, and we made that very clear. Several months later WannaCry hit, and catastrophe struck many organisations. And this is just one example of many where good advice was ignored. Some companies would run a test and then a retest three to four months later, and wait that entire gap before making any change. Their main focus is to ensure they pass the next test, not to improve their security in between. We try to raise awareness about it, but it is always a challenge.
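As a minimal sketch of the kind of check anyone can run from outside their network: a completed TCP connection to port 445 on a public address is a red flag that SMB is internet-facing. (The host address below is a documentation placeholder, and a successful connection says nothing about the SMB version in use — that requires a protocol negotiation to determine.)

```python
# Sketch: test whether TCP 445 (SMB) accepts connections on a given host.
# A successful connect from the internet means SMB is exposed and should
# be investigated immediately, regardless of version.
import socket

def smb_port_open(host: str, port: int = 445, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example against a hypothetical internet-facing address:
# if smb_port_open("203.0.113.10"):
#     print("SMB reachable from the internet -- close it off")
```

Purpose-built scanners report far more detail, but even this one-function check would have flagged the exposure in the story above.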
At Samurai, we don’t just highlight the bad; we also like to highlight the good when we do these assessments. Part of the problem is that people do not like to accept bad news. As humans, we are programmed to avoid it. During stock market downturns, studies showed that people stopped checking their portfolios when the market was down, and checked more often when it ticked up. We go into denial over bad news.
Has anyone ever hidden anything from us to stop us from finding it?
Yes, and it is as sneaky as it is pointless. We had a situation where we did a penetration test, and after the test the company was breached via a particular service. The rebuttal is always the same: you did a penetration test, and you did not find this problem! A retest then uncovered the real story. We discovered that the service had not existed when we did the initial test; after reviewing the logs in further detail, we realised it had been turned off before the penetration test and switched back on afterwards. They intentionally turned it off so the service would not show up in the test, and they got breached through it.
We did a test for another client and found BitTorrent files. When we informed the IT manager, he responded rather strangely to the news. As we continued our tests, we could see the BitTorrent data shrinking as someone moved it and hid it. It turned out to be the same IT manager moving the files. Great lesson here: just because someone works in IT does not mean they are always doing the right thing…
We have had situations during incident response where we tried to work out why a particular kind of breach had occurred, and we discovered that the logs had been cleared after the breach happened. Without the logs, it is very difficult to see what went on. After a bit of probing, we discovered it was someone quite senior in the organisation who had deleted them. This puts the organisation in an uncomfortable position: without that information, you cannot tell what measures to put in place to ensure the breach does not happen again. Nor does it protect the person who deleted the logs. We will find you!
Once, one of our clients did not like the report we produced because it was too damning. They wanted us to write a more glowing report, and we refused to do it. We actually lost some money because of our refusal. They got another company in to tell them what they wanted to hear. But that does not mean that they were secure. It is our top priority to keep any organisation safe and secure. All we want to do is help, and make sure the same problems do not arise in the future. But ignorance is bliss.
What about the use of terrible passwords?
We have had loads of interesting issues with passwords. We worked on network intrusion detection for an organisation once, a massive company, a very long time ago, and found their private SNMP community string in clear text on the network. The community string was ‘nimda’, which is ‘admin’ spelled backwards. Worryingly, even after we told them, they did not grasp the gravity of the situation until we blurted the actual password out on a call with many participants from the organisation. You could hear a pin drop after that!
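A brief sketch of why this finding was so serious: an SNMPv1 community string acts as a password, yet the protocol sends it completely unencrypted. Hand-encoding just the opening fields of an SNMPv1 message (in BER, per the protocol's ASN.1 definition) shows the string sitting in the packet as plain bytes, readable by anyone sniffing the network:

```python
# Sketch: BER-encode the version and community fields that begin every
# SNMPv1 message, and show the community string appears verbatim.
def snmp_v1_header(community: str) -> bytes:
    """Encode the version + community prefix of an SNMPv1 message (BER)."""
    version = b"\x02\x01\x00"                        # INTEGER 0 = SNMPv1
    comm = community.encode()
    community_tlv = bytes([0x04, len(comm)]) + comm  # OCTET STRING tag
    body = version + community_tlv
    return bytes([0x30, len(body)]) + body           # outer SEQUENCE tag

packet = snmp_v1_header("nimda")
print(b"nimda" in packet)  # True: the 'password' travels on the wire as-is
```

SNMPv3 added authentication and encryption precisely because of this; v1 and v2c offer neither.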
We have found many domain admin passwords that were derivatives of the word ‘admin’, with very little difference.
Out of all the experiences, what is the biggest clanger you’ve come across?
There are a few. During a recent hospital visit, the specialist consultant had to confer with colleagues and walked out of the room, leaving his PC on. He was logged in to the central database, and his login ID card was still in the machine. I could easily have looked at patient records if I had wanted to.
The worst one was at a large organisation where we did a very light-touch test. We had permission to see whether a particular web application was secure, but we were not allowed to be intrusive or risk breaking it. The web app contained confidential personal information on many of us, and the test enabled us to see everybody’s information. When this came to a head, it emerged that the company had employed a software firm whose product was unsecured. When we reported the findings, they were dismissed as test data, not live data. We double-checked the records, verified facts on the internet, and found matching data on Google and Facebook. The organisation ended up defending the software company. People manipulate the truth all the time to save face, and it is the wrong thing to do because it puts everyone at risk.
When we look at why this happens, a lot of it comes down to cybersecurity’s ‘box-ticking’ culture. Ticking all the boxes and getting the certification becomes more important than being secure and protecting the data of the people you work with. Standards are brilliant when they are applied properly, not subjectively interpreted by an individual. Compliance does not equal security. And when something bad happens, no one wants to admit they made a mistake; they would rather do research and present findings that support their decision.
On a lighter note…
There are a couple of funny stories from over the years. We were once told by a company that they were unhackable and indestructible. That, of course, was the perfect red rag to a bull for us. They were so confident that they placed a bet that Samurai could not breach their organisation. We were unable to hack them by any conventional means, so we moved to social engineering, creating email accounts that looked identical to theirs and sending an employee satisfaction survey out to the company’s email database. To monitor activity, we had a script running that alerted our mobiles if anyone completed the survey. The alert came through at 7 pm one evening, and we struck gold: intellectual property relating to their system architecture, as well as system passwords. We won the bet!
We once got locked in the office of a well-known seller of automobiles. This was in Samurai’s earlier days. It was the end of the day and we were finishing off tests when we realised the floor was empty and the alarm light was flashing. We were locked in! We could not get hold of anyone, so we made a run for it. The alarm went off as we headed for the fire escape. We made it outside, but we could not get past the metal gates and barriers around the car park. A rescue team came to help, and in the end we got out of there two hours after we had finished testing.
The moral of the story is: don’t ever ignore our advice. You will always pay the price in the end, and it will probably be higher than it would have been had you listened to us.
Please note: This article has been transcribed and summarised from our podcast of the same title.