Delegating trust is really, really, really hard (infosec edition): Who knows what secrets lurk in your browser's root certificate store?

CORRECTION: A previous version of this thread reported that Trustcor has the same officers as Packet Forensics; they do not. They have the same officers as Measurement Systems.

There's a joke about physicists, that all of their reasoning begins with something they know isn't true: "Assume a perfectly spherical cow of uniform density on a frictionless surface…" The world of information security has a lot of these assumptions, and they get us into trouble.

Take internet data privacy and integrity – that is, ensuring that when you send some data to someone else, the data arrives unchanged and no one except that person can read it.

In the earliest days of the internet, we operated on the assumption that the major threat here was technical: our routers and wires might corrupt or lose the data on the way. The solution was the ingenious error-correction system built into packet-switching, which let the sender verify that the recipient had gotten every part of the transmission and resend the parts that disappeared en route.

This took care of integrity, but not privacy.

The problem is, it's trust all the way down. Some infosec pros go so far as to say "trust no one," a philosophy more formally known as "Zero Trust," which holds that certain elements of your security should never be delegated to any third party.

Say you maintain your own cryptographic keys on your own device. How do you know the software you use to store those keys is trustworthy? Well, maybe you audit the source-code and compile it yourself.

But how do you know your compiler is trustworthy? When Unix/C co-creator Ken Thompson received the Turing Award, he either admitted or joked, in his "Reflections on Trusting Trust" lecture, that he had hidden back doors in the compiler he'd written, which was used to compile all of the other compilers.

OK, say you whittle your own compiler out of a whole log that you felled yourself in an old-growth forest that no human had set foot in for a thousand years. How about your hardware? Back in 2018, Bloomberg published a blockbuster story claiming that the server infrastructure of the biggest cloud companies had been compromised with tiny hardware interception devices.

The authors claimed to have verified their story in every conceivable way. The companies whose servers were said to have been compromised rejected the entire story. Four years later, we still don't know who was right.

How do we trust the Bloomberg reporters? How do we trust Apple? If we ask a regulator to investigate their claims, how do we trust the regulator? Hell, how do we trust our senses? And even if we trust our senses, how do we trust our reason? I had a lurid, bizarre nightmare last night in which the most surreal events seemed perfectly reasonable (tldr: while trying to order a paloma at the DNA Lounge, I was mugged by invisible monsters, who stole my phone and then a bicycle I had rented from the bartender).

If you can't trust your senses, your reason, the authorities, your hardware, your software, your compiler, or third-party service-providers, well, shit, that's pretty frightening, isn't it (paging R. …)?
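For the technically curious, a few toy sketches follow. First, the ack-and-resend idea described above: here's a minimal Python sketch of a stop-and-wait-style transfer, where the sender numbers each chunk and retransmits until the recipient acknowledges it. This is an illustration, not a real protocol – actual schemes like TCP add windows, timeouts, and checksums – and the function name and the simulated lossy channel are invented for this example.

```python
import random

def send_with_retransmit(message: str, chunk_size: int = 4, loss_rate: float = 0.3) -> str:
    """Toy stop-and-wait transfer: number every chunk, resend until it's acknowledged."""
    chunks = [message[i:i + chunk_size] for i in range(0, len(message), chunk_size)]
    received = {}  # sequence number -> chunk, as recorded by the "recipient"

    for seq, chunk in enumerate(chunks):
        while seq not in received:
            if random.random() > loss_rate:   # the wire delivered this packet
                received[seq] = chunk
                print(f"seq {seq}: delivered, ACKed")
            else:                             # the wire ate the packet; send it again
                print(f"seq {seq}: lost en route, resending")

    # The recipient reassembles the chunks in order: integrity is guaranteed,
    # but privacy is not -- anyone tapping the wire saw every chunk in the clear.
    return "".join(received[i] for i in range(len(chunks)))

assert send_with_retransmit("trust, but verify") == "trust, but verify"
```

Note that nothing in this loop hides the data; it only guarantees that all of it arrives. That's exactly the integrity-without-privacy gap the post describes.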
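Thompson's compiler trick is easier to see in code. Below is a toy model of the "Reflections on Trusting Trust" attack, with Python strings standing in for source code and binaries; every name here is hypothetical. The point it illustrates is that the back door never appears in any source you can audit: the compiler injects it into login programs, and injects the injection logic itself into any compiler it compiles.

```python
BACKDOOR = '\nif user == "ken": grant_root()   # never present in any source you audited'

def evil_compile(source: str) -> str:
    """Toy model of the trusting-trust attack; strings stand in for real binaries."""
    binary = source  # stand-in for honest code generation
    if "def login(" in source:
        # Attack 1: recognize the login program and trojan its compiled output.
        binary += BACKDOOR
    if "def compile(" in source:
        # Attack 2: recognize a compiler and re-insert this whole trick into it,
        # so the back door survives even after the evil source is deleted.
        binary += "\n# [self-replicating injection logic re-inserted here]"
    return binary

print(evil_compile("def login(user, password): ..."))
# The compiled output grants ken root, even though the login source was clean.
```

Auditing the login program's source tells you nothing; auditing the compiler's source tells you nothing either, because the malice lives only in the compiler binary that compiled it.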
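And as for what lurks in your root certificate store: you can inspect the certificate authorities your own system trusts with a few lines of Python's standard ssl module. A caveat: on some platforms (for instance, Linux systems that load CAs lazily from a directory) get_ca_certs() may return an empty list, so treat this as a sketch rather than a complete audit.

```python
import ssl

# Every root CA in this store can vouch for any website's identity on your behalf.
ctx = ssl.create_default_context()   # loads the platform's default trust store

for cert in ctx.get_ca_certs():
    # Each entry is a dict; 'subject' is a tuple of relative distinguished names.
    subject = dict(rdn[0] for rdn in cert["subject"])
    print(subject.get("organizationName", "?"), "| expires:", cert.get("notAfter", "?"))
```

Scroll that list sometime: each line is an organization you are trusting, mostly without ever having chosen to.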