Nobody is held liable when breaches occur and your PII gets stolen for the fifth time in a single year.
And then we read the inevitable report that it was a third-party managed system, six months behind on patches, that got popped. Or it was a risk assessment finding that they said they would "get to eventually" and never did.
You start throwing executives in cuffs for failing to do their duty and sure as shit things would start changing.
@da_667 I don't particularly want them in cuffs for failing to patch because it just strengthens the paternalistic forced patching bs.
I want them in cuffs for possession of PII we never consented for them to collect or store in the first place.
-
@da_667 I’ve been working at improving this inside companies for decades. Automated scans in pipelines and IDE tools make things a lot better for those who care. I’ve worked with many developers who take pride in their work and just need guidance. Reported vulnerabilities motivate many internally to improve not just the one problem, but the whole system involved.
It still takes time to change (though far less now). Sprints are measured in weeks, and work needs justification (like a report).
-
@da_667 isn't the whole fucking reason credit scores even exist to circumvent anti-discrimination laws and rules by using arbitrary numbers that just so happen to "correlate well" with race?
-
@da_667 Twice this year I received a stack of letters to every member of my family from the medical practice my family uses. The first was from a data breach and the second a different ransomware attack.
-
@drewdaniels @da_667 why do these all seem to happen after the fact? Why (especially if they are known) is it not mandatory to have them in place prior to developing/building any new products/facilities?
@Rickd6 @da_667 many of them do. Justification helps. Organizations have many people.
You’re right that it’s not perfect, and I could tell many stories.
There are many people trying to do the right things. Most individuals take pride in their work.
I hear you that things are getting worse, and it’s hard to see any improvements.
-
@dalias I absolutely want executives in cuffs for failing to secure data that I have no choice but to trust to them, data that is mostly immutable. They get paid ridiculous sums of money for the job, but there are zero consequences for that failure. And if that means an executive gets jail time for failing to patch a box, I would welcome it. At the same time, I would absolutely welcome them getting imprisoned for the collection of PII, especially biometric data that they historically never needed.
When I acquired my credit card in the early 2000s, I never once needed to take a picture of my license, or take a picture of myself for some credit card company to verify my identity. They tell you that the data isn't stored, but if it isn't, then why did they need it in the first place?
-
@da_667 "..that I have no choice but to trust to them.."
This is exactly what "never consented" means.
-
@dalias really?
-
@da_667 Yes. Something handed over under duress is not consented to.
-
I'm going to say something that's been festering in my mind for a while now. In my two decades of practice in information security, I have yet to see responsible disclosure result in measurably better security posture.
Code quality hasn't improved. Patch management hasn't improved. Minimum viable product hasn't improved. Automated security updates, especially for IoT devices... Jesus Fucking Christ, haven't improved. The cost of failure for organizations losing your data through gross negligence has in no way improved. Why should responsibility be the domain of the security researcher when nobody else is willing to share in it?
I'm half-tempted to say that if you have 0-days, you might as well get paid for them rather than disclose them responsibly. Because even with a tilted playing field, nothing has measurably improved since I've been here, and I would argue that with "vibe coding" and the tech industry's view of "let the AI handle it," software quality is the worst it has been since the 90s. I lived through Windows Millennium Edition. I've seen shit you wouldn't believe.
"Hardware's fucked because we can't buy any, software is fucked because the LLMs trained by reddit and stack overflow are in charge now. You might as well fucking guess at this point."
@da_667 I believe that if we could make the C-suite personally liable AND enforce it, many problems would sort themselves out.
-