Should we reconsider our reliance on computer systems?

  • 38 Replies
  • 1271 Views
*

markjo

  • Content Nazi
  • The Elder Ones
  • 42955
Re: Should we reconsider our reliance on computer systems?
« Reply #30 on: August 04, 2024, 06:12:29 PM »
What failed here was redundancy, testing, backups, and a failover mechanism.


Disaster recovery also failed.
As I see it, it was mostly a failure of CrowdStrike not properly testing the patch, and of the customers failing to patch in a test environment first.  This is why I never trust a dot-zero release of anything from anyone, including updates and patches.  Let the other guys beta test them; if everything is okay, only then do I update.  At least it wasn't too bad once they figured out that you just had to reboot a bunch of times or boot into safe mode.  No backups or disaster recovery to mess with.  Just way too much downtime.
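
A minimal sketch of that "let the other guys beta test it" policy, assuming a hypothetical version string and release date for the patch; the function and its thresholds are illustrative, not any real patching tool's API:

Code: [Select]
from datetime import date, timedelta

def should_apply(version: str, released: date, today: date,
                 soak_days: int = 14) -> bool:
    """Hypothetical update gate: skip dot-zero releases and anything
    that hasn't soaked in the wild long enough."""
    parts = version.split(".")
    is_dot_zero = len(parts) >= 2 and parts[-1] == "0"
    soaked = (today - released) >= timedelta(days=soak_days)
    return soaked and not is_dot_zero

# A fresh 7.0 release is held back; an older 7.11 goes through.
print(should_apply("7.0", released=date(2024, 7, 19), today=date(2024, 7, 20)))   # False
print(should_apply("7.11", released=date(2024, 6, 1), today=date(2024, 7, 20)))   # True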
Science is what happens when preconception meets verification.
Quote from: Robosteve
Besides, perhaps FET is a conspiracy too.
Quote from: bullhorn
It is just the way it is, you understanding it doesn't concern me.

*

Username

  • Administrator
  • 17873
  • President of The Flat Earth Society
Re: Should we reconsider our reliance on computer systems?
« Reply #31 on: August 05, 2024, 12:47:51 AM »
What failed here was redundancy, testing, backups, and a failover mechanism.


Disaster recovery also failed.
As I see it, it was mostly a failure of CrowdStrike not properly testing the patch, and of the customers failing to patch in a test environment first.  This is why I never trust a dot-zero release of anything from anyone, including updates and patches.  Let the other guys beta test them; if everything is okay, only then do I update.  At least it wasn't too bad once they figured out that you just had to reboot a bunch of times or boot into safe mode.  No backups or disaster recovery to mess with.  Just way too much downtime.
So, testing? And putting off the responsibility to a centralized third party?
"You are a very reasonable man John." - D1

"The lunatic, the lover, and the poet. Are of imagination all compact" - The Bard

*

markjo

  • Content Nazi
  • The Elder Ones
  • 42955
Re: Should we reconsider our reliance on computer systems?
« Reply #32 on: August 05, 2024, 02:48:40 PM »
So, testing? And putting off the responsibility to a centralized third party?
Yes, software developers should be testing their software before it gets released.  If the developers want to use a third party to help with the testing, then that would probably be a good thing, assuming that they can afford the service. 

Large organizations should also bear some responsibility to test all new software (including updates and patches) in an isolated test environment before rolling out to the live environment.  Even then, Murphy's law says that some bugs and/or malware will eventually slip through and potentially wreak havoc.
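
A minimal sketch of that "isolated test environment first, then live" idea as a ring-based rollout; the ring names, host lists, and function are all hypothetical, not any real deployment tool's configuration:

Code: [Select]
# Hypothetical ring-based rollout: a patch only reaches the live ring
# after it has survived the earlier rings.
ROLLOUT_RINGS = [
    ("isolated-test", ["test-vm-01", "test-vm-02"]),
    ("canary",        ["branch-office-pc-01"]),
    ("live",          ["hq-pc-001", "hq-pc-002", "pos-terminal-17"]),
]

def next_targets(patch_id: str, passed_rings: set[str]) -> list[str]:
    """Return the hosts in the first ring the patch hasn't cleared yet."""
    for ring_name, hosts in ROLLOUT_RINGS:
        if ring_name not in passed_rings:
            print(f"{patch_id}: deploying to ring '{ring_name}'")
            return hosts
    return []  # every ring has signed off

# The patch has cleared the test ring, so it goes to the canary ring next.
print(next_targets("patch-2024-07", {"isolated-test"}))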
Science is what happens when preconception meets verification.
Quote from: Robosteve
Besides, perhaps FET is a conspiracy too.
Quote from: bullhorn
It is just the way it is, you understanding it doesn't concern me.

*

Lorddave

  • 18453
Re: Should we reconsider our reliance on computer systems?
« Reply #33 on: August 06, 2024, 04:10:06 AM »
What nonsense.

A laptop, running unconnected to a power cord (on battery), using high-quality disks of some sort.
How would you hack that? Short of breaking into where it's  set up, I don't see how.

The internet is like a giant open door to remote access. And anything that can be changed remotely by a third party can be tampered with. The best way to stop that is to shut the door.

I could very easily design a computer system to do just that, if I had a computer engineer.
1. Implement offline redundancy by making a secure core system which can then transfer to delivery systems. That is, if all gas stations are on the same system, you have a system that will backup without being hacked. And ideally, you have a few of these, in case one fails.
2. Cut out all these "data centers." Environmentally, they are bloated by being constantly online like that. They use a lot of heating and cooling that wouldn't be needed for a once-a-day update load. Our neighboring town, for instance, is getting its water stolen for a data center. It is far more resource intensive to manage that information online than to have local updates that are manually logged each day. It's also more secure.
3. Make secure (if possible, underground) facilities for such data. The harder it is to locate such a facility, the harder physical hacks (such as via Ethernet taps or RF signals) are to pull off.
The first three are just good practice. But here's where design and engineering comes in.
4. Make an ID chip for these computers. Make the system based on this for upload. The data cannot be updated except one way, meaning an asshole/idiot cannot tamper with it from their own computer; only the person with that computer hardware can log in and change things. Even with the same password, any other computer will not log in.
5. If an outside computer tries to figure out the ID, all they will see is a one-time dummy code. The ID-chip computer logs in to update the server, which in turn communicates with the rest of the network. By the way, the ID chip should not be shared with any computer but the server, in close proximity. Basically, it's StreetPass technology. Not exactly, but the feed is basically an ultra-short-range signal.
6. The two ID chips are lock and key, so if the upload computer ever gets worn out, you have to swap out the ID chips on both computers.

Wow.

Just wow... You managed to make things worse while also trying to invent things we've done for decades.

1. These exist.  But to do it, ya gotta take a backup from something online and move it to that offline storage.  And restoring requires going the opposite way.  In the case of CrowdStrike... they did that.  An offline fix (delete a file) applied manually to every single PC.  Which was a lot of PCs.

2. Data centers are more efficient.  Consider this: a building housing its own little datacenter to handle the logins, data storage, programs, servers, etc .. for an entire company OR.... a bunch of individual physical servers all sitting in various buildings, each one doing a single thing.  In my company that would require a few hundred physical machines, each using nearly 500W, plus dedicated power and cooling and additional networking, which requires more network devices to manage the machines.  Plus each one having to run its OS ALL THE TIME. 
Whereas a datacenter does all that in one spot, which cuts down on the network devices needed, improves access times, and allows for easier recovery.  Like say your email server in Nevada died.  Now you gotta go drive out there and fix it so people in Nevada can have email.  In a datacenter, if a server fails, it fails over to another one, using the exact same data and programs on new hardware, automatically.  Not so easy with a bunch of dedicated servers spread across the world.

Also, this forum is hosted on a datacenter.  Just saying...
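
A minimal sketch of the failover being described, with invented host names and a bare-bones TCP health check standing in for a real one:

Code: [Select]
import socket
from typing import Optional

# Hypothetical replicas of the same mail service; the host names are made up.
REPLICAS = ["mail-dc1.example.internal", "mail-dc2.example.internal"]

def is_healthy(host: str, port: int = 25, timeout: float = 2.0) -> bool:
    """Stand-in health check: can we open a TCP connection at all?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_server() -> Optional[str]:
    """Fail over to the first replica that answers."""
    for host in REPLICAS:
        if is_healthy(host):
            return host
    return None  # nothing healthy -- time to page a human

print(pick_server())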


3. Datacenters are among the most secure places in the world.  If the apocalypse happens, head to a datacenter.  You'll survive most things.

4. 5. 6.
Yes, we have various methods for this.  Like encryption, system passwords, firewalls, security keys, MAC whitelisting, subnetting, etc...
You can't just hack a system that's on the internet.  Not if it has decent security.  Your home Wi-Fi is probably fine. 

Most hacking is done via social engineering or by exploiting flaws in systems, like SQL injection or buffer overflows.  But that fails if you can't get in.  And backups are not open to the public net.
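
On the SQL injection point, the standard defence is parameterized queries rather than pasting input into the SQL text; a minimal sketch using Python's built-in sqlite3 module, with a made-up table and attacker string:

Code: [Select]
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

attacker_input = "' OR '1'='1"

# Vulnerable: the input is pasted into the SQL text, so the OR clause
# runs as SQL and returns every row.
unsafe = f"SELECT name FROM users WHERE name = '{attacker_input}'"
print(conn.execute(unsafe).fetchall())        # [('alice',), ('bob',)]

# Parameterized: the driver passes the input as data, not SQL,
# so no rows match the literal string.
safe = "SELECT name FROM users WHERE name = ?"
print(conn.execute(safe, (attacker_input,)).fetchall())  # []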

So yeah, your ideas are good and we've been doing it this whole time. >_>
You have been ignored for common interest of mankind.

I am a terrible person and I am a typical Blowhard Liberal for being wrong about Bom.

*

Username

  • Administrator
  • 17873
  • President of The Flat Earth Society
Re: Should we reconsider our reliance on computer systems?
« Reply #34 on: August 06, 2024, 01:15:49 PM »
So, testing? And putting off the responsibility to a centralized third party?
Yes, software developers should be testing their software before it gets released.  If the developers want to use a third party to help with the testing, then that would probably be a good thing, assuming that they can afford the service. 

Large organizations should also bear some responsibility to test all new software (including updates and patches) in an isolated test environment before rolling out to the live environment.  Even then, Murphy's law says that some bugs and/or malware will eventually slip through and potentially wreak havoc.
Yeah, the organizations using it should have tested it in an isolated way, and they shouldn't be using services that allow a third party to push updates. That's the screw-up as far as cause goes, but the other things I mentioned are how they fucked up in system design before it happened and in their response after.
"You are a very reasonable man John." - D1

"The lunatic, the lover, and the poet. Are of imagination all compact" - The Bard

*

JackBlack

  • 23174
Re: Should we reconsider our reliance on computer systems?
« Reply #35 on: August 06, 2024, 03:59:01 PM »
A laptop, running unconnected to a power cord (on battery), using high-quality disks of some sort.
How would you hack that? Short of breaking into where it's  set up, I don't see how.
How are you getting information on and off that?
Because if you don't have any way to do so, that device is useless.

Unless you plan on having a person manually enter all the data (which still opens you up to social engineering attacks), you are going to be using some device to transfer data, and that device can be hacked.
USB drives and even floppies can contain viruses.

And there are some ways to get information across that air gap.
What you need is a Faraday cage to encapsulate that computer, ideally surrounded by something giving off lots of random noise.
This would not be environmentally friendly at all.

The best way to stop that is to shut the door.
No it isn't.
That is like saying some foods could poison you and make you throw up, so the best way to prevent that is to seal your mouth shut and never eat again, which kills you.

In reality, selecting the best way involves understanding the risks of each option and their consequences, including flow-on effects.

I could very easily design a computer system to do just that, if I had a computer engineer.
You mean you could come up with a bunch of requirements with no concern for the actual implementation of it.

Implement offline redundancy by making a secure core system which can then transfer to delivery systems. That is, if all gas stations are on the same system, you have a system that will backup without being hacked. And ideally, you have a few of these, in case one fails.
How?
How is this system backed up? How does it transfer?

2. Cut out all these "data centers." Environmentally, they are bloated by being constantly online like that. They use a lot of heating and cooling that wouldn't be needed for a once-a-day update load. Our neighboring town, for instance, is getting its water stolen for a data center. It is far more resource intensive to manage that information online than to have local updates that are manually logged each day. It's also more secure.
Data centres can be more environmentally friendly than the alternative.
Any system needs to be built to handle its peak load, not a constant load.
A data centre allows different companies to share resources, so each can cover its peak load, and those peaks occur at different times for different companies.
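
A toy illustration of that peak-sharing point, using made-up hourly loads for three tenants whose peaks land at different times of day: the shared capacity needed is well below the sum of the individual peaks.

Code: [Select]
# Hypothetical hourly load (arbitrary units) for three tenants.
loads = {
    "retailer":  [2, 2, 3, 9, 9, 3, 2, 2],   # daytime peak
    "streaming": [3, 3, 2, 2, 3, 8, 9, 8],   # evening peak
    "batch":     [9, 8, 8, 2, 2, 2, 2, 3],   # overnight peak
}

sum_of_peaks = sum(max(hours) for hours in loads.values())
combined = [sum(hour) for hour in zip(*loads.values())]
peak_of_sum = max(combined)

print(f"dedicated capacity needed: {sum_of_peaks}")  # 27
print(f"shared capacity needed:    {peak_of_sum}")   # 14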

3. Make secure (if possible, underground) facilities for such data. The harder it is to locate such a facility, the harder physical hacks (such as via Ethernet taps or RF signals) are to pull off.
And such facilities would require a lot of cooling.
The infrastructure around them would likely give them away.

4. Make an ID chip for these computers. Make the system based on this for upload. The data cannot be updated except one way, meaning an asshole/idiot cannot tamper with it from their own computer; only the person with that computer hardware can log in and change things. Even with the same password, any other computer will not log in.
And then if the ID chip is lost or damaged, all the data is as well.
But that ID chip is effectively just a fancy password anyway.
You have the possibility of that chip being stolen or cloned, or just the technology behind it being circumvented.

Also, this is effectively what is already used by SSH.
For SSH you generate a key pair: a private key, which should be stored securely on your computer, and a public key, which goes to the device you want to access.
That private key is used just like your chip to prove you are allowed to access the computer. But there are flaws in the implementation which sometimes allow other people to get in.

So other than these being made as chips, we already have this technology, and know its limitations.
What makes you think your chip will be magically perfect, and not suffer from any of the flaws of SSH?
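
A minimal sketch of that private-key/public-key idea, assuming the third-party Python cryptography package is installed; the "ID chip" here is just an Ed25519 private key, and the implementation flaws mentioned above sit outside this happy path:

Code: [Select]
import os
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# "Enrolment": the client keeps the private key; the server only ever
# sees the public key, much like an authorized SSH key.
client_private = ed25519.Ed25519PrivateKey.generate()
server_known_public = client_private.public_key()

# Login: the server issues a random challenge, the client signs it,
# and the server checks the signature against the stored public key.
challenge = os.urandom(32)
signature = client_private.sign(challenge)

try:
    server_known_public.verify(signature, challenge)
    print("access granted")
except InvalidSignature:
    print("access denied")

# An impostor with a different key fails the same check.
impostor = ed25519.Ed25519PrivateKey.generate()
try:
    server_known_public.verify(impostor.sign(challenge), challenge)
    print("access granted")
except InvalidSignature:
    print("access denied")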

*

markjo

  • Content Nazi
  • The Elder Ones
  • 42955
Re: Should we reconsider our reliance on computer systems?
« Reply #36 on: August 06, 2024, 05:00:00 PM »
A laptop, running unconnected to a power cord (on battery), using high-quality disks of some sort.
How would you hack that? Short of breaking into where it's  set up, I don't see how.
How are you getting information on and off that?
Sneakernet, obviously.
Science is what happens when preconception meets verification.
Quote from: Robosteve
Besides, perhaps FET is a conspiracy too.
Quote from: bullhorn
It is just the way it is, you understanding it doesn't concern me.

*

disputeone

  • 25603
  • Or should I?
Re: Should we reconsider our reliance on computer systems?
« Reply #37 on: August 06, 2024, 07:17:28 PM »
Crouton's computer network at Langley is air-gapped. They consider it safe because of that. At least, safe from everyone but them.

Theoretically a device on a LAN with no other connectivity options in its hardware should be safe. You'd have to have an insider, like in the case of the Vault 7 leaks.

North Korea uses an intranet for this reason, although I'm quite sure Crouton and his workmates have access to it. Most people have a price. When you can pay billions of dollars in taxpayer money to hack your own taxpayers' devices with zero-day exploits, you can buy a disillusioned North Korean official.

That's why the US is banning Kaspersky, isn't it? Not because they think Russia will hack the NSA or the CIA, but because Kaspersky exposed Pegasus and isn't as easily paid off or threatened as Western AV companies. It's not that they don't want people to be hacked, it's that they only want the US to be able to hack them.

Edit.
3. Make secure (if possible, underground) facilities for such data. The harder it is to locate such a facility, the harder physical hacks (such as via Ethernet taps or RF signals) are to pull off.
And such facilities would require a lot for cooling.
The infrastructure around it will likely give it away.

Like Pine Gap.
The solution is seemingly a large foreign military presence to keep taxpayers ignorant of what they are paying for.

Military-grade AI systems used mainly for domestic propaganda, along with signals intelligence allowing Israel to bomb children with greater precision.
https://michaelwest.com.au/six-eyes-australias-secret-support-for-the-israeli-assault-on-gaza-through-pine-gap/

I'm sorry why do we pay tax again?
Oh right, the roads and public transport.
Wait, then why does car registration and train tickets cost so much?
Questions for later.
« Last Edit: August 06, 2024, 07:24:25 PM by disputeone »
Why would that be inciting terrorism?  Lorddave was merely describing a type of shop we have here in the US, a bomb-gun shop.  A shop that sells bomb-guns. 

*

JackBlack

  • 23174
Re: Should we reconsider our reliance on computer systems?
« Reply #38 on: August 07, 2024, 02:10:38 AM »
Theoretically a device on a lan network with no other connectivity options in its hardware should be safe. You'd have to have an insider, like in the case of the vault 7 leaks.
Assuming there really are no other connectivity options. So no USB drives or anything like that.
All data manually entered, or manually transferred onto paper to take out.
And again, ideally in a Faraday cage surrounded by noise generators, with a power supply that is likewise gapped (e.g. alternating sets of batteries, where the set in use is never the one being charged, and changeover is done with manual electrical disconnection, so power usage can't be analysed).