Another massive data breach

I disagree, John. No software is 100% secure. It’s written by humans and, despite our best efforts, we’re not yet infallible. Plus those financial organisations you use are reliant upon software that isn’t even written by them, whether that be the OS, middleware apps, the backend systems, the open source libraries used by a widget in their website, the firmware running on their network infrastructure, the apps on their employees’ phones, etc… It all contains vulnerabilities of one sort or another.

Some vulnerabilities are, obviously, less severe than others, but you only have to look at the number of vulnerabilities disclosed year after year after year (the Common Vulnerabilities and Exposures, or CVE, list maintained by MITRE, and the National Vulnerability Database that NIST in the US builds on top of it, are good places to start) to understand that after all these years software is still as buggy as hell.
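
If you fancy seeing the trend for yourself, here's a minimal Python sketch (not production code, and the parameter details are from memory of NIST's public NVD 2.0 REST API docs, so do check them) that totals up the CVEs published in a given year. It assumes the documented 120-day cap on each publication-date window, hence the quarter-by-quarter loop:

```python
# Minimal sketch: count CVEs published in 2023 via NIST's public NVD 2.0 API.
# Assumes the API caps each query at a ~120-day publication window, so we
# query quarter by quarter and sum the totals. Not production code.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def cves_published(start: str, end: str) -> int:
    """Return the number of CVEs published between two ISO-8601 timestamps."""
    resp = requests.get(
        NVD_URL,
        params={"pubStartDate": start, "pubEndDate": end, "resultsPerPage": 1},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["totalResults"]

quarters = [
    ("2023-01-01T00:00:00.000", "2023-03-31T23:59:59.999"),
    ("2023-04-01T00:00:00.000", "2023-06-30T23:59:59.999"),
    ("2023-07-01T00:00:00.000", "2023-09-30T23:59:59.999"),
    ("2023-10-01T00:00:00.000", "2023-12-31T23:59:59.999"),
]
print("CVEs published in 2023:", sum(cves_published(s, e) for s, e in quarters))
```

Swap the dates for earlier years and you can chart the year-on-year picture yourself (mind the API's rate limits if you do).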

Even if we assume for a minute that an organisation did somehow have control over all of the code it uses, I'd argue that it still wouldn't be 100% secure because its customers wouldn't tolerate it. Daniel Miessler puts it well in this article:

[two screenshots quoted from the article]
source: https://danielmiessler.com/blog/the-reason-software-remains-insecure/

2 Likes

Plus, if software is ever 100% secure then I’ll be out of a job :smiley:

1 Like

I agree Gareth, software is full of holes, and despite being a brilliant Assembler programmer :wink: I dug some of them in my time, in application and systems software.

(We won’t discuss a certain rounding error in a tricky interest calculation (or algorithm as it would be called now) back in the seventies :face_with_hand_over_mouth: A fixed vs floating point issue that meant adding a penny on at the end was the only way I could get the bloody thing to balance. I sometimes wonder how that multiplied up over the years. Maybe I was really the cause of the 2008 banking crisis :joy:)

But software vulnerabilities aren’t my problem anymore. I would guess mainframes are still at the core of most large banks and I would guess my old firm still provides the H/W, OS and middleware at the heart of those systems. As an ex-Sysprog, though obviously not up to date, I still know how those systems hang together and how difficult it is to make sure dodgy bits of code, running on dodgy bits of code, are secure.

But the CEOs of the financial institutions whose infrastructure I ran wouldn’t have been in the least interested in those excuses had there been a data breach or incursion on my watch. They wanted 100% and I, now as an end user, want the same. Aiming for anything less is unacceptable.

(Reminds me of the old joke about the Japanese supplier who, when asked by British Leyland Purchasing if they could achieve the 97% defect-free component delivery demanded, responded, “we can add 3% defects if you insist”.)

I’m sure you’ll agree that all companies should strive for 100% security. It’s not just their data, it’s also our data that’s at risk. There will be failures, but there should be mind-focusing disincentives.

I agree fully that overall the balance between S/W utility and the damage caused is firmly on the side of utility, but I think that differs depending on the application. I think email, for example, is fast approaching the tipping point. I’m careful with my addresses but some of them are bombarded with spam. That’s down to data leaks from companies, at least some of which IMO have inadequate and/or sloppy security - companies that have made the decision not to invest and to take the risk because it’s not them that will suffer the consequences. Hence big fines are required.

I think it’s also down to Outlook and Gmail and other email providers not providing better security, for example a “blue tick” verified approach to email addresses. Plus whatever internet service providers are hosting the dodgy sites.

I think it would be pretty easy to do an 80:20 job (more like 99:01 IMHO) on SPAM with appropriate legislation and international agreements. Shutting down the internet connection of that Nigerian politician who wants to transfer me a million dollars could halve it overnight :joy:

1 Like

Yes, striving for 100% security is commendable even if I don’t think it’s achievable :wink:

There’s a difference between an organisation being insecure and being wilfully insecure. The ones who knowingly don’t or won’t attempt to fix issues deserve to be punished harshly (and I agree with your other point that CEOs should be heavily fined if found to have been negligent by choosing not to invest appropriately… I’d even go so far as to threaten some with imprisonment, like the Americans can do for Sarbanes-Oxley breaches).

So I still think no company can ever be 100% secure. But if they can evidence that they identify threats and vulnerabilities, that they perform risk assessments and that those risk assessments demonstrate that there is a valid case to not mitigate the risk, then I’m fairly comfortable with that. It’s all about being proportionate at the end of the day.

1 Like

Personal emails were allowed…hence (presumably) the heavy emphasis on preventing/detecting phishing attacks… I guess there is always this eternal debate on striking the right balance in maximising information security for the firm/its clients vs convenience for its partners and staff (in using emails). It seems that unlike in Government or Banking, there is no custom or practice of banning personal emails for staff in the accountancy profession (or I’d imagine many other professions). Staff would no doubt mutiny at the prospect of being unable to use personal emails, and it would hardly be a great recruitment angle! Not an easy issue to resolve.

If by personal emails we mean emails of an apparently non-work-related nature, then I see no additional risk compared to work-only emails. If it means no emails addressed to me as an individual, then that might reduce the risk of phishing.

Most of the emails I see that are attempts at phishing have a work/business connection and few are outside of that context. We have regular tests using mocked-up phishing emails here: Outlook has a ‘report phishing’ button for any email we may think is malicious, and there’s normally a congratulations message when one of the test messages is detected. Following the phishing link in a test message earns the person extra training in how to spot a fake email.

Over the last couple of years we’ve seen a big rise in emails purporting to come from senior executives, asking people connected to them to buy gift cards etc. and send the codes to specific places because ‘the executive’ is travelling and has been unable to do this themselves. It’s not an IT security risk as such, but the connections made in the content of the emails between senior and junior staff sometimes show a surprising degree of intelligence.

I wouldn’t have had it, George. The email system is a company asset and IMO should only be used for company business. It’s not as if staff couldn’t organise their own email addresses. I think a strict line between what is company and what is personal is vital; any blurring at the edges will only lead to unnecessary problems. One would have hoped our auditors (PwC) and tax advisors (KPMG) agreed, but there you go :face_with_hand_over_mouth:

Times have changed over personal stuff. The company I worked for in the 90s had a policy of no personal equipment in the labs - that included radios, Walkmans and mobile phones. The threat was that anything found would leave through the autoclave (a kind of sterilising pressure cooker). It never did happen, but it WAS a possibility. Now everyone has smartphones and in-ears for music, and maintains email contact while working, etc. Generally the UK corporate environment is much less controlling over personal stuff.

1 Like

Yes, I think we really do need some verified mechanism for emails.

I worked with a software engineer who seemed to have to do this sort of ‘bodge’ all the time. He became a manager :grin:.
Said software engineer, to fix a reported bug, added about 1,000 lines of complicated code that basically tore down and rebuilt a chunk of the software stack whilst keeping other things alive. It sort of worked. I was asked to look at it and changed a ‘==’ to a ‘!=’ in a single line of code, which was the real issue.
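
For anyone who hasn’t had the pleasure, here’s a contrived Python sketch of the shape of that sort of bug; the names and logic are invented purely for illustration (the real fix was a single comparison operator in the original code):

```python
# Hypothetical illustration only: a huge "tear down and rebuild" workaround
# versus the one-token fix to an inverted condition.

def handle_link_change(link_is_up: bool, rebuild_stack) -> None:
    # Buggy version: rebuilds the whole stack when the link comes UP,
    # because the test is the wrong way round.
    if link_is_up:               # BUG: should fire when the link goes DOWN
        rebuild_stack()

def handle_link_change_fixed(link_is_up: bool, rebuild_stack) -> None:
    # The actual fix: invert the comparison and leave everything else alone.
    if not link_is_up:           # rebuild only when the link has dropped
        rebuild_stack()
```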

My former employer permitted ‘reasonable personal use’ of emails at work… ‘Reasonable’ was (probably intentionally) undefined, but was intended to prevent excessive personal use, domestic emergencies apart, that would significantly reduce the time spent actually working. People tended to voluntarily work quite long hours, if needed, and it seemed only fair to turn a blind eye to sensible personal email usage by way of quid pro quo. Professionals often have to justify their use of time via detailed timesheets, and it soon becomes clear who is abusing the reasonable usage rule.

1 Like

Yes, how the corporate network looks has changed massively in recent years too. Previously it was relatively easy to control access to resources… you just restricted it to stuff within the network perimeter.

When I first started out, I worked for a fairly small company where our servers (mostly IBM AS/400s… I’m showing my age!) sat on the same network within the office. We basically had a very flat network with no separation at all between devices, users, etc…

The next big change was the introduction of colocation hosting in data centres, which moved the servers out of the office building and into a server rack inside a secure data centre that provided resiliency, redundancy, economies of scale, etc… Our network just stretched from the office to the DC over the internet.

Then, due to the increase in Managed Service Providers, the network had to be extended to include suppliers’ networks. As this was typically via site-to-site VPNs between the colo DC and the supplier’s DC it was still manageable.

Then, with the adoption of cloud service providers like AWS, Azure & GCP, the on-prem network that connected via VPN to MSPs’ networks now needed to also connect to your Virtual Private Cloud in Amazon’s or Google’s data centres… things were starting to get a lot more complex.

Now with Covid and the explosion of teleworking, the on-prem network has been loosened to include employees’ home networks too (again mostly via VPNs). But now we also have to cater for people using their mobile phone to access emails, or wanting to use their own laptop instead of the corporate battle brick.

In the space of 25 years, we’ve loosened the definition of the network to include networks that are trusted but aren’t managed by our own teams (MSPs, suppliers, etc…)… and even those that we know are being used for dodgy stuff by other people (our colleagues’ teenage kids downloading dodgy torrents or playing cracked games on home wifi networks that our colleagues are also using to connect to our work network).

2 Likes

[quote=“hairbear, post:30, topic:43687”]
He became a manager
[/quote]

So did I, but not as a result of my technical incompetence :face_with_hand_over_mouth: It was greed :joy: I really did enjoy programming. I never liked application programming and got out of it quickly - too many awkward users. I preferred being a systems programmer, which had the added advantage that nobody understood what you were doing but they knew they needed you. Assembler was my preferred language because an instruction was an instruction and you could do weird and wonderful things. I especially liked self-modifying code, where, depending on what you were trying to do, you could modify machine code instructions further down the pipeline. God help anybody subsequently trying to debug it from a core dump, but the programs were tight as hell and used very little memory. You download some trivial app now and it’s megabytes and megabytes for not very much. Probably dragging runtime libraries and who knows what tat along with it.

I keep promising myself to do a bit more programming; I think it could be a good way to keep the old brain ticking over. I’ve set up VM/370 (my favourite OS) running under (or is it “on”?) Hercules on a Mac before. May do that again and write some 370 code for a laugh. Sort of regressing towards childhood.

1 Like

Oof… kudos to you. I’m not a natural programmer so I struggle with much easier-to-understand languages. I don’t even pretend to understand assembler.

Haha… very true! :smiley:

1 Like

Good description of the current challenge.

My first reasonable network was an IBM 4341 mainframe based in Grenoble with S/34s (and you thought AS/400s, their grandsons, showed your age :joy:) in half a dozen countries around Europe. I had an IBM 3705 front-end processor with a 9600-baud line to each of them. That’s it in the top left. The line reliability was woeful.

1 Like

Actually, I think it’s the simplest of all. Each instruction does what it says on the tin (or in the manual in this case), just one thing. A move is a move, an add is an add and a shift left logical is, well, you get my drift. What’s not to love about the test under mask instruction, and what you could achieve with boolean algebra at the bit level was awesome. You could cram loads of information into a few bytes. It was all a very pleasant mental exercise. You could even tell who had written certain bits of code through their style. The one thing I was always careful with was documenting my programs. Without that, coming back to modify one after any period of time, you wouldn’t be able to figure out what was going on.
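
For anyone who never had the pleasure, here’s a rough Python sketch of the idea behind Test Under Mask: pack several flags into a single byte and test them with one AND against a mask. The flag names are invented for the example; on the 370 the condition code told you whether the selected bits were all zero, mixed or all ones.

```python
# Pack four boolean flags into a single byte - the sort of thing TM was made for.
FLAG_OPEN      = 0b1000_0000
FLAG_DIRTY     = 0b0100_0000
FLAG_READ_ONLY = 0b0010_0000
FLAG_EOF       = 0b0001_0000

status = FLAG_OPEN | FLAG_DIRTY          # two flags crammed into one byte

# TM-style tests against a mask of the bits we care about.
mask = FLAG_OPEN | FLAG_READ_ONLY
any_set = (status & mask) != 0           # True:  OPEN is on ("mixed" on the 370)
all_set = (status & mask) == mask        # False: READ_ONLY is off

print(f"status={status:08b} any_set={any_set} all_set={all_set}")
```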

The higher the level of the language, the more remote I feel from the machine. I never made the leap to C, and structured programming caused my ears to bleed. I was a great user of macros and subroutines and always had a lot of structure in my own programs, but I wasn’t comfortable with formulaic methodologies. So as they started to take over and the era of the freewheeling programmer started to pass, I went into management :slightly_smiling_face: My technical background helped me all the way through my career as I understood just what my techies were talking about.

1 Like

In connection with network discussions this just amused me.

1 Like

You must have hated the invention of pipelines and caches then :slight_smile:

ISTR that some games used self-modifying code knowing that it would have no effect when run on “bare metal” but would frustrate anyone trying to run the code in a debugger.

1 Like

I think that jumper was delivered with the mainframe - many of my colleagues seemed to wear similar :slight_smile:

5 Likes

Similar for me. My first 10 years employed as a programmer were all in assembler, including writing our own microkernel and working at the hardware level. Even when it all moved to C, I worked exclusively at the hardware driver level. No middleware or UI work, which I didn’t like at all.

1 Like