Talk:Backdoor (computing)
From Wikipedia, the free encyclopedia
Dumbness
I removed a paragraph from the article, after reading through the cited paper and rewriting the paragraph for clarity. Here's my clarified version:
- The paper Countering Trusting Trust through Diverse Double-Compiling[1] points out that it is possible to verify that an untrusted compiler is free of Trusting Trust exploits, if the attacker has access to both the source code of the untrusted compiler, and the machine code of a trusted compiler or cross-compiler. (The trusted compiler need not have any relationship to the untrusted compiler; for example, it might produce less efficient code, or code for a different platform.)
At first glance, this attack seems worthless: if I have a trusted compiler, why am I bothering with the untrusted compiler?
But at second glance, things look up: perhaps the untrusted compiler is much better than the trusted one, and we'd prefer to use it, if only we could prove it was trustworthy. (So I added that last parenthetical sentence to the paragraph.) All we have to do, according to the cited paper, is compile the untrusted compiler's source with our trusted compiler, giving C = U_T, and then compare the given binary U_U to U_C (since U and C should be functionally equivalent). If the binaries match, then U_U is trustworthy; otherwise, we can't say anything one way or the other. (For example, real-world compilers usually add filesystem-dependent debugging information or timestamps to their output, so they wouldn't be considered trustworthy by this approach.)
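The double-compiling check described above can be sketched in a few lines of Python. This is only a toy model (compilers are modeled as pure string-to-string functions, and all names are made up for illustration), not the paper's actual procedure or notation for real toolchains:

```python
# Toy model of diverse double-compiling (DDC): a "binary" is a string,
# and "compiling" is a pure function from source text to binary text.
# All names here are hypothetical illustrations.

U_SOURCE = "source-of-U"               # published (clean) source of U
T = "bin(source-of-T)"                 # trusted compiler binary, no backdoor

def run_compiler(compiler_binary, source):
    # A trusting-trust-infected compiler re-inserts its backdoor
    # whenever it recognizes the clean compiler source being compiled.
    out = "bin(" + source + ")"
    if "BACKDOOR" in compiler_binary and source == U_SOURCE:
        out += "+BACKDOOR"
    return out

clean_U    = "bin(source-of-U)"            # what U_U should look like
infected_U = "bin(source-of-U)+BACKDOOR"   # a compromised U

def ddc_check(suspect_binary):
    stage1 = run_compiler(T, U_SOURCE)        # C = U_T
    stage2 = run_compiler(stage1, U_SOURCE)   # U_C
    return suspect_binary == stage2           # bit-for-bit comparison

print(ddc_check(clean_U))     # True: match, so the binary is trustworthy
print(ddc_check(infected_U))  # False: mismatch exposes the exploit
```

Note that the check only works because the toy compilers are deterministic; as the paragraph above says, real compilers that embed timestamps would fail the bit-for-bit comparison even when clean.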
Okay, but look again! At third glance, we realize that our "diverse double-compiling" approach involves the production of U_C, which is a version of the untrusted compiler compiled from trusted source code using a trusted compiler. (In fact, it's supposed to be bit-for-bit identical to the original compiler!) So at this point, why are we bothering with U_U at all? (And remember, we have no reason to think that our untrusted compiler really is U_U at all; it might be U_W, and again that's usually true for real-world compilers.)
So after all that doubletalk about "diverse double-compiling", what we've come away with is this:
- If you compile trusted source code with a trusted compiler, then you can trust the binary.
Well, of course you can (modulo Thompson's caveat about compromised loaders)! We don't need a thirteen-page PDF to tell us that! --Quuxplusone 08:48, 16 February 2007 (UTC)
Confusion about Dumbness
You're right about one thing: why bother with U_U at all, if you have the source for U and expect it to compile properly on T?
This approach would be useful for testing whether an existing binary compiler you already have may have been compromised, and finding out if it has been. In other words, it's rootkit detection. It makes no sense, however, if you can simply compile U_T out of the box, and then compile U_(U_T) (should read U sub U sub T, but nested sub tags don't seem to get along with Wikimedia, or perhaps even HTML) -- after all, presumably U produces better binaries somehow, so why wouldn't you make the compiler itself "better"?
But therein lies the problem: why bother with U at all, unless it somehow produces better binaries? If it does produce better binaries, those better binaries will be different anyway, so the whole test is pointless anytime you'd want to use it. I suppose one compiler might have better error reporting, or it might run faster, but how often do we see changes like that happen to modern compilers, versus optimization?
After all, if you're using a compiler and a compiled language, chances are you don't care much how good your compiler or language are, other than how fast your program will run after it's compiled.
--(This section will be signed when I bother getting an account. Feel free to butcher at will; I haven't touched "Dumbness" as I don't have a clue about the right way to do this...) —The preceding unsigned comment was added by 63.162.81.179 (talk) 12:08, 12 March 2007 (UTC).
Intel plans (remote KVM improvement)
What about writing a few words about Intel's plans to "improve" the Centrino Pro platform? Here are a few of the (some already implemented?) features (source):
- NIC-based TCP/IP filters, configurable remotely
- Handy magic bypass for the TCP/IP filters
- Remote BIOS updates over the network
- Remote IDE redirection, as in booting off a CD-ROM over the network
- Persistent storage, even if you change hard disks
- Authentication can be done via Kerberos
- Built-in web interface on every machine (port 16994)
- Handy, well-documented SDK for building whatever you need to interact with this
Gaz v pol 15:13, 12 June 2007 (UTC)
Excess
Instead of removing the hard disk physically, it's easier to write your own disassembler. Compile it with the infected compiler: it won't recognize it as "the" disassembler it was taught to infect. If nobody points out a flaw in my reasoning, I'll remove that last part about removing the hard disk. --euyyn (talk) 00:25, 8 February 2008 (UTC)
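The point above can be illustrated with the same kind of toy model as in the "Dumbness" thread: an infected compiler that pattern-matches on one known disassembler's source has no way to sabotage a freshly written disassembler it has never seen. This is only a hypothetical sketch with made-up names, not how a real Thompson-style exploit is written:

```python
# Toy sketch: a trusting-trust compiler that targets one known
# disassembler by matching its source text. A homegrown disassembler
# falls outside the signature and compiles clean. Names are hypothetical.

KNOWN_DISASM_SRC = "source-of-standard-disassembler"

def infected_compile(source):
    out = "bin(" + source + ")"
    if source == KNOWN_DISASM_SRC:        # crude signature match
        out += "+LIE-ABOUT-BACKDOOR"      # tampered so it hides the exploit
    return out

print(infected_compile(KNOWN_DISASM_SRC))       # tampered output
print(infected_compile("my-homegrown-disasm"))  # compiles clean
```

Of course, a sufficiently paranoid exploit could try to recognize "any disassembler" semantically rather than by signature, which is where the argument gets harder.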

