Talk:Backdoor (computing)

Wiki Education Foundation-supported course assignment[edit]

This article is or was the subject of a Wiki Education Foundation-supported course assignment. Further details are available on the course page. Student editor(s): Alicevoe.

Above undated message substituted from Template:Dashboard.wikiedu.org assignment by PrimeBOT (talk) 15:11, 16 January 2022 (UTC)[reply]

Dumbness[edit]

I removed a paragraph from the article after reading through the cited paper and rewriting the paragraph for clarity. Here's my clarified version:

The paper Countering Trusting Trust through Diverse Double-Compiling[1] points out that it is possible to verify that an untrusted compiler is free of Trusting Trust exploits, if the defender has access to both the source code of the untrusted compiler and the machine code of a trusted compiler or cross-compiler. (The trusted compiler need not have any relationship to the untrusted compiler; for example, it might produce less efficient code, or code for a different platform.)

At first glance, this attack seems worthless — if I have a trusted compiler, why am I bothering with the untrusted compiler?

But at second glance, things look up — perhaps the untrusted compiler is much better than the trusted one, and we'd prefer to use it, if only we could prove it was trustworthy. (So I added that last parenthetical sentence to the paragraph.) All we have to do, according to the cited paper, is compile the untrusted compiler's source with our trusted compiler, giving C = U_T, and then compare the given binary U_U to U_C (since U and C should be functionally equivalent). If the binaries match, then U_U is trustworthy; otherwise, we can't say anything one way or the other. (For example, real-world compilers usually add filesystem-dependent debugging information or timestamps to their output, so they wouldn't be considered trustworthy by this approach.)
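
(To make the recipe concrete, here is a minimal sketch of that check in Python. The file names trusted-cc, u.c, and untrusted-cc-binary are hypothetical placeholders, and the hash comparison is the crude bit-for-bit test described above, not Wheeler's actual tooling.)

    import hashlib
    import subprocess

    def compile_with(compiler, source, output):
        # Invoke a compiler binary on the given source file.
        subprocess.run([compiler, source, "-o", output], check=True)

    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Stage 1: C = U_T, the untrusted compiler's source built by the trusted compiler T.
    compile_with("./trusted-cc", "u.c", "u_T")

    # Stage 2: U_C, the same source built by C.
    compile_with("./u_T", "u.c", "u_C")

    # Compare U_C against the vendor-supplied binary (claimed to be U_U).
    # A match vouches for the binary; a mismatch proves nothing by itself,
    # since timestamps or debug info may legitimately differ.
    if sha256("u_C") == sha256("untrusted-cc-binary"):
        print("Match: the untrusted binary is as trustworthy as T.")
    else:
        print("Mismatch: inconclusive.")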

Okay, but look again! At third glance, we realize that our "diverse double-compiling" approach involves the production of U_C, which is a version of the untrusted compiler compiled from trusted source code using a trusted compiler. (In fact, it's supposed to be bit-for-bit identical to the original compiler!) So at this point, why are we bothering with U_U at all? (And remember, we have no reason to think that our untrusted compiler really is U_U at all — it might be U_W, and again that's usually true for real-world compilers.)

So after all that doubletalk about "diverse double-compiling", what we've come away with is this:

If you compile trusted source code with a trusted compiler, then you can trust the binary.

Well, of course you can (modulo Thompson's caveat about compromised loaders)! We don't need a thirteen-page PDF to tell us that! --Quuxplusone 08:48, 16 February 2007 (UTC)[reply]

Quuxplusone is completely right. The "diverse double-compiling" is too trivial to even mention. The "Reflections on Trusting Trust" stuff is _also_ trivial compared to some CS topics (only the trivial CS topics are on Wikipedia), but non-trivial enough and useful as a reference to make people aware of these issues. The fact that people bothered to publish it only shows how bad the journal it was published in is. I am not going to delete the paragraph, because some idiot will probably see it as destroying the article, but I would rather have it removed. It would not surprise me if the person who wrote that paragraph is related to the author of the paper, because nobody with any common sense would think it is interesting. It is not worthy of mention. —Preceding unsigned comment added by 194.109.21.4 (talk) 21:43, 27 November 2010 (UTC)[reply]

Here's what Bruce Schneier has to say on the topic:

Now you might read this and think: "What's the big deal? All I need to test if I have a trusted compiler is...another trusted compiler. Isn't it turtles all the way down?" Not really. You do have to trust a compiler, but you don't have to know beforehand which one you must trust. If you have the source code for compiler T, you can test it against compiler A. Basically, you still have to have at least one executable compiler you trust. But you don't have to know which one you should start trusting.

[1]

In short: as long as there is at least one single uncompromised binary of a C compiler, no matter which one (disregarding compilers written using their own language extensions), one can recover. That _is_ a defence against the trusting trust attack, and as far as I know, the only practical one. I'd say it's worth mentioning in the article. Mreftel (talk) 18:29, 19 October 2013 (UTC)[reply]
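
(A sketch of that recovery idea, in the same hypothetical style as the sketch above: try every compiler binary you can lay hands on as the trusted seed, and any seed whose two-stage rebuild reproduces the published binary vouches for it — you never had to decide in advance which seed to trust. All file names are placeholders.)

    import hashlib
    import subprocess

    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def two_stage_rebuild(seed, source):
        # Stage 1: compile the compiler's source with the candidate seed.
        subprocess.run([seed, source, "-o", "stage1"], check=True)
        # Stage 2: let the stage-1 result compile the same source again.
        subprocess.run(["./stage1", source, "-o", "stage2"], check=True)
        return sha256("stage2")

    published = sha256("./cc-published-binary")
    seeds = ["./old-cc", "./cross-cc", "./other-vendor-cc"]  # placeholder candidates

    for seed in seeds:
        verdict = "vouches for" if two_stage_rebuild(seed, "cc.c") == published else "disagrees with"
        print(seed, verdict, "the published binary")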

Confusion about Dumbness[edit]

You're right about one thing: Why bother with U_U at all, if you have the source for U and expect it to compile properly on T?

This approach would be useful for testing whether an existing binary compiler, which may or may not be compromised, actually has been. In other words, it's rootkit detection. It makes no sense, however, if you can simply compile U_T out of the box, and then compile U_{U_T} (should read U sub U sub T, but U<sub>U<sub>T</sub></sub> doesn't seem to get along with Wikimedia, or perhaps even HTML) -- after all, presumably U produces better binaries somehow, so why wouldn't you make the compiler itself "better"?
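
(The self-compilation step described here is essentially the classic multi-stage bootstrap; a sketch with hypothetical file names, comparing the last two stages the way GCC's bootstrap build does:)

    import filecmp
    import subprocess

    def build(compiler, source, output):
        # Each stage compiles the same compiler source with the previous stage.
        subprocess.run([compiler, source, "-o", output], check=True)

    build("./trusted-cc", "u.c", "stage1")  # U_T: U built by the trusted compiler
    build("./stage1", "u.c", "stage2")      # U_{U_T}: U built by itself
    build("./stage2", "u.c", "stage3")      # one more round for the fixed-point test

    # Once the compiler is self-hosting, another round should change nothing:
    # stage2 and stage3 were produced by functionally identical compilers.
    print("bootstrap converged:", filecmp.cmp("stage2", "stage3", shallow=False))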

But therein lies the problem: Why bother with U at all, unless it somehow produces better binaries? If it does produce better binaries, those better binaries will be different anyway, so the whole test is pointless anytime you'd want to use it. I suppose one compiler might have better error reporting, or it might operate faster, but how often do we see changes like that happen to modern compilers, versus optimization?

After all, if you're using a compiler and a compiled language, chances are you don't care much how good your compiler or language are, other than how fast your program will run after it's compiled.

--(This section will be signed when I bother getting an account. Feel free to butcher at will, haven't touched "Dumbness" as I don't have a clue about the right way to do this...) —The preceding unsigned comment was added by 63.162.81.179 (talk) 12:08, 12 March 2007 (UTC).[reply]

Intel plans (remote KVM improvement)[edit]

What about writing a few words about Intel's plans to "improve" the Centrino Pro platform? Here are a few of the (some already implemented?) features (source):

  • NIC-based TCP/IP filters, configurable remotely
  • Handy magic bypass for the TCP/IP filters
  • Remote BIOS updates over the network
  • Remote IDE redirection, as in booting off a CD-ROM over the network
  • Persistent storage even if you change hard disks
  • Authentication can be done via Kerberos
  • Built-in web interface on every machine (port 16994; see the probe sketch after this list)
  • Handy, well-documented SDK for building whatever you need to interact with this
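
(For what it's worth, the web interface in the port-16994 item is easy to probe for; a minimal sketch with a placeholder address, which only reports whether something answers on the management port and says nothing about how it is configured:)

    import socket

    def amt_port_open(host, port=16994, timeout=2.0):
        # A plain TCP connect tells us whether anything is listening
        # on the AMT management port; nothing more.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(amt_port_open("192.0.2.10"))  # placeholder address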

Gaz v pol 15:13, 12 June 2007 (UTC)[reply]


Hardware backdoor - A few years ago I read about a young kid (early teens) who disassembled the CPU microcode (Intel, I believe) and discovered a backdoor. Never heard anything else since. If you want your data/work to be secure you had better unplug your PC, etc., and put gum in the USB ports. 73.149.116.253 (talk) 00:36, 9 July 2015 (UTC)[reply]

Excess[edit]

Instead of removing the hard disk physically, it's easier to write your own disassembler. Compile it with the infected compiler; it won't recognize it as "the" disassembler it was taught to infect. If nobody points out a flaw in my reasoning, I'll remove that last part about removing the hard disk. --euyyn (talk) 00:25, 8 February 2008 (UTC)[reply]
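
(The cross-check described above can be sketched: disassemble the suspect binary with an independent library the infected toolchain has never been taught about, and diff the result against the system disassembler. A rough illustration using the capstone Python bindings plus objdump/objcopy, all assumed to be installed; suspect-cc is a placeholder, and the whitespace-stripped substring comparison is deliberately crude:)

    import subprocess
    from capstone import CS_ARCH_X86, CS_MODE_64, Cs

    # Pull the .text section out of the suspect binary as raw bytes.
    subprocess.run(["objcopy", "-O", "binary", "--only-section=.text",
                    "suspect-cc", "text.bin"], check=True)
    with open("text.bin", "rb") as f:
        code = f.read()

    # Disassembly 1: an independent library.
    md = Cs(CS_ARCH_X86, CS_MODE_64)
    ours = [f"{i.mnemonic} {i.op_str}".strip() for i in md.disasm(code, 0)]

    # Disassembly 2: the system disassembler, which a hypothetical
    # backdoor might have been taught to lie to.
    listing = subprocess.run(["objdump", "-d", "-M", "intel", "suspect-cc"],
                             capture_output=True, text=True, check=True).stdout

    # Flag instructions the independent disassembler sees but the
    # system listing omits (whitespace differences normalized away).
    norm = lambda s: "".join(s.split())
    flat = norm(listing)
    missing = [insn for insn in ours if insn and norm(insn) not in flat]
    print(len(missing), "instructions absent from objdump's listing")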


A few years ago a young kid disassembled his PC chip - BIOS, etc - and discovered that a backdoor was part of the hardware from the factory. After a brief hubbub I have never heard anything more. The software backdoors are the least of your problems - the hardware method is undetectable and unstoppable - unless you have one machine that never hooks up to the internet, or use a very old or custom machine. 159.105.80.141 (talk) 12:46, 8 February 2010 (UTC) Passing data on a disk is iffy.[reply]

To eliminate the possibility of a hardware backdoor one has to control the entire system life-cycle, including one's own silicon foundry. Some remedies: use some old stuff, programmable ICs, log everything at the moment it comes into being, program inhi^ AC. — Preceding unsigned comment added by 99.90.197.87 (talk) 02:55, 16 May 2012 (UTC)[reply]

hard bd[edit]

easier lit: [1] — Preceding unsigned comment added by 99.90.197.87 (talk) 20:13, 25 May 2012 (UTC)[reply]

W32/Induc-A[edit]

Induc was actually discovered by a Delphi programmer. This blog post (in Russian) was written on August 12, and the article on the Sophos website appeared six days later. — Preceding unsigned comment added by 159.224.121.67 (talk) 13:57, 11 December 2012 (UTC)[reply]

Note to Safinaskar regarding "same"[edit]

"same" should be there, but on further reflection the quotes should be removed. Sorry, bad monitor, the emphasis should stay.

Wheeler's model does not require that the second compiler be free of backdoors "that affect the process". If a backdoor is present that affects the process it will be detected, unless it is the same backdoor as is in the original compiler. Thus, Wheeler only requires that the backdoors be either absent or different.
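
(A toy model of that claim, with compilers boiled down to pairs; purely illustrative, my own simplification rather than Wheeler's formalism. A "binary" here is just (source it was built from, payload it carries), and running a compiler binary on the compiler's own source propagates the payload, as in Thompson's attack.)

    # A "binary" is modeled as (source_it_was_built_from, payload_it_carries).
    def run_compiler(binary, source):
        _, payload = binary
        return (source, payload)  # the payload self-propagates

    def ddc_vouches(suspect, trusted, compiler_source="cc.c"):
        stage1 = run_compiler(trusted, compiler_source)  # U_T
        stage2 = run_compiler(stage1, compiler_source)   # U_{U_T}
        return stage2 == suspect  # a match vouches for the suspect

    clean      = ("cc.c", None)
    backdoor_x = ("cc.c", "X")
    backdoor_y = ("cc.c", "Y")

    print(ddc_vouches(backdoor_x, clean))       # False: mismatch exposes the backdoor
    print(ddc_vouches(backdoor_x, backdoor_y))  # False: a DIFFERENT backdoor is also exposed
    print(ddc_vouches(backdoor_x, backdoor_x))  # True: only the SAME backdoor slips through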

Did that help?

GaramondLethe 02:17, 13 January 2013 (UTC)[reply]

"unless it is the same backdoor as is in the original compiler". Are you sure? Wheeler's paper doesn't say this. So this is just your opinion, and thus WP:OR. Your arguments seem right, and I probably agree with you. But we should not write our own arguments into Wikipedia; we should only include arguments which are present in WP:IRS. Safinaskar (talk) 14:51, 20 January 2013 (UTC)[reply]
The philosopher John Wilkins describes a Principle of Charity that ought to govern how we read a text. He quotes (among others) the Oxford Companion to Philosophy:

In its simplest form, it holds that (other things being equal) one’s interpretation of another speaker’s words should minimize the ascription of false beliefs to that speaker.

If I am to read Wheeler's use of "independent" compilers as not accounting for the possibility of the same backdoor being present in both compilers, then I have to conclude that not only did Wheeler make an obvious error in his dissertation but that his dissertation committee wasn't competent enough to spot the error either. The more charitable reading is that Wheeler intended "independent" to be read as "not having the same backdoors", in which case both Wheeler and his dissertation committee can be seen as competent. This is not a common use of the term "independent" and so spelling it out in the article makes sense. GaramondLethe 16:49, 20 January 2013 (UTC)[reply]
1. Search Wheeler's text for the string "independen". You will see that it appears only 14 times. These appearances are:
1.1. 6 appearances are quotes. So, Wheeler himself doesn't like this term; he just quotes it from other authors.
1.2. 1 appearance is a comment on a quote. The quote is: "if an independent (whatever that is) implementation of the specification will generate...", and Wheeler's comment is: "However, it does not explain what independence would mean..." (section 2.2). So, Wheeler notes that the word "independence" doesn't have a precise meaning.
1.3. 7 appearances are not about "independence of compilers". For example, "even those systems whose defenders perform independent source code analysis" (section 3.1).
So, Wheeler doesn't introduce or use the term "independence" in the sense of "independence of compilers".
2. But Wheeler's text uses the word "trusted" a lot. So, what does "trusted" mean? We see in section 4.3: "in this dissertation, something is “trusted” if we have justified confidence that it does not have triggers and payloads that would affect the results of DDC". As you can see, Wheeler doesn't say anything about same backdoors. He just says: "does not have triggers and payloads that would affect the results of DDC".
3. What do you mean when you say "same backdoors"? Can different compilers (for example, gcc and MSVC) have the same backdoors? What would "same backdoors" mean in this case? Exactly the same backdoor code? Or a similar backdoor idea? As you can see, Wheeler's text says nothing about this. All this is your own idea.
4. "I have to conclude that not only did Wheeler make an obvious error". Please name the exact section and phrase which (in your opinion) contains an error when interpreted literally, say what the literal (and, simultaneously, "wrong") understanding of this phrase is, and what your suggested "right" understanding is. I think that Wheeler's text has no important errors.
5. Imagine a compiler A which has some backdoor, and a compiler B which in fact is not a compiler at all: it just generates random binaries. They, of course, do not contain "same backdoors" in your terminology. But imagine the following: we "compile" the source code of compiler A (without the backdoor) using compiler B, and B accidentally generates A-with-backdoor. Then we compile the source of A (without the backdoor) using this binary (A with backdoor) and, of course, we get A with backdoor again. So, A and B do not contain the same backdoors, but A contains a backdoor, and DDC doesn't detect it :) Of course, the probability of such a scenario is vanishingly small. But it is possible (in the mathematical sense). So you are wrong. Proved. But see the following paragraph:
6. As you can see, this question is controversial. So, we should just say what Wheeler said, and we should not create something new. So, we should not speak about "same backdoors" (they are your idea). We should not try to "fix" Wheeler's errors. We should not argue about this.
7. You pointed to some "Principle of Charity". Is this a Wikipedia rule? :) Safinaskar (talk) 22:58, 21 January 2013 (UTC)[reply]

Hi Safinaskar.

Based on the above I'm having a hard time not coming to the conclusion that you have no idea what you're talking about. To take just one example: yes, exactly the same backdoor can be found across different compilers and different operating systems, and the exact same source code can be used in all of them. This is really obvious if you've written a compiler or even studied them (or even if you understood "Reflections on Trusting Trust"). Compilers are just programs, and for the most part they're written in C. The backdoor would also be written in C (in this example) and when the compiler was compiled the backdoor would be incorporated into the binary. If the backdoor was in the front end then it would only affect the languages handled by that front end; if it was in the back end it would affect programs written in any language the compiler understood, but only for a particular architecture; and if the backdoor affected the IR then it would affect everything.
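
(A toy sketch of that scoping argument, with no resemblance to any real compiler: a shared IR means an IR-level payload reaches every source language and every target, while front-end or back-end payloads have narrower reach.)

    # Toy pipeline: two front ends and two back ends sharing one IR,
    # with an optional backdoor injected at the IR level.
    def front_c(src):
        return ["ir:" + s for s in src.split(";") if s]

    def front_pascal(src):
        return ["ir:" + s for s in src.splitlines() if s]

    def back_x86(ir):
        return ["x86:" + i for i in ir]

    def back_arm(ir):
        return ["arm:" + i for i in ir]

    def translate(src, front, back, ir_backdoor=False):
        ir = front(src)
        if ir_backdoor:
            ir = ir + ["ir:payload"]  # an IR-level backdoor hits everything
        return back(ir)

    # The same IR-level payload lands in a C-to-x86 build and a
    # Pascal-to-ARM build alike; a front-end backdoor would reach only
    # one source language, a back-end one only one target.
    print(translate("a;b", front_c, back_x86, ir_backdoor=True))
    print(translate("x\ny", front_pascal, back_arm, ir_backdoor=True))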

This isn't the kind of thing that gets put into a dissertation because committee members get really annoyed when you spend a lot of time stating the obvious. (Yes, my Ph.D. is in computer science. And yes, I've hacked on gcc and gotten a conference and journal paper out of it.)

So rather than me trying to give you a graduate compilers course on the talk page, let's step back and look at the bigger picture. Wheeler's dissertation has only been cited 7 times according to Google Scholar. Given the wealth of academic work in detecting (statically and dynamically) back doors in compilers and other applications, I think this is a clear case of WP:UNDUE. (Compare this to "Reflections on Trusting Trust", which has been cited 549 times.)

What's your best argument for keeping the Wheeler paragraph?

GaramondLethe 00:00, 22 January 2013 (UTC)[reply]

I think we should keep the paragraph about Wheeler's paper, but it is better to remove it than to keep it in your version. We should keep the Wheeler paragraph because it states that it is possible to detect backdoors and gives a good method for detecting them. If we remove it, the reader may think that backdoors are powerful and invulnerable.
Again, I think all the important facts in Wheeler's paper are right. If you think Wheeler made some error, then please point to the exact place of this error.
I know compiler internals. I know what a front end, a back end, etc. are.
I gave you my main argument for keeping my version: we should just reproduce Wheeler's ideas and not create something new (OR). Safinaskar (talk) 21:58, 22 January 2013 (UTC)[reply]
As you didn't address my concerns about undue weight being given to Wheeler's work, I think it best that the paragraph on Wheeler remain deleted. GaramondLethe 23:06, 22 January 2013 (UTC)[reply]
Wheeler's paper says how to detect the "trusting trust" attack, so it is important here. Of course, there are a lot of papers which describe how to detect the attack, too. But (according to Wheeler's paper) Wheeler's paper is the best of them :) Also, the paper is a monograph and a PhD dissertation. It is a scientific paper. So, of course, it has big weight. Safinaskar (talk) 17:03, 23 January 2013 (UTC)[reply]
Also, you are speaking about UNDUE. But this rule just says (as far as I know) which sources have enough weight to construct a neutral point of view (WP:NPV). Of course, the "Trusting trust" section is neutral, so the UNDUE rule is unnecessary for us. (But if you bring your own ideas into the article (not taken from sources), such as the word "same", then, of course, the section will not be neutral.) Safinaskar (talk) 17:14, 23 January 2013 (UTC)[reply]
WP:UNDUE means viewpoints must be represented in "proportion to the prominence of each viewpoint in the published, reliable sources". Rather than saying "of course it has big weight" I can actually measure that weight. Thompson's paper has been cited in the peer-reviewed literature over 500 times. The Petersen paper in the article has been cited 95 times and gets a single sentence. The RAND report has been cited 35 times and gets a single sentence. We could probably stand to include a pointer to snort (cited 2093 times) and a general model of intrusion detection (cited 2838 times). And while we're at it, we might as well include a more up-to-date definition (cited 230 times). (Citation counts per google scholar.)
Wheeler's dissertation has been cited 7 times and you're asking for a long paragraph.
I understand you didn't realize that the "weight" of a scientific work could be derived empirically, and you also had no idea of both the breadth and depth of research in this area. You made the understandable error of thinking that if an article was scientific, that gave it sufficient weight to be included here. However, the weight of scientific articles exists on a continuum, and most have no business being cited on wikipedia, much less being given a full paragraph of discussion. WP:UNDUE was created precisely to keep minor works like Wheeler's out of wikipedia articles.
You don't have to take my word for this. If you want to take this to dispute resolution (WP:3RD, WP:DRN, or whatever) I'm happy to do so.
GaramondLethe 18:29, 23 January 2013 (UTC)[reply]
Okay, you are right. Let's keep this paragraph removed. (Also, surprisingly, Wheeler is a Wikipedia admin, but of course, this doesn't change anything.) Safinaskar (talk) 19:37, 23 January 2013 (UTC)[reply]
<Grin> Nice catch! GaramondLethe 19:42, 23 January 2013 (UTC)[reply]

As it stands right now, the article sort of implies that the state of the art includes no way to detect such a backdoor. If a whole paragraph would be undue weight, couldn't we at least have one sentence that the attack is detectable? It'd go something like "David A. Wheeler described a method to detect this sort of tampering through the use of an independently written compiler.<ref>a citation</ref>" --Damian Yerrick (talk | stalk) 05:49, 27 January 2013 (UTC)[reply]

Hi Damian. Two points. First, the standard way of detecting tampered binaries is a combination of checksums and/or cryptographic signatures. This is far more convenient than requiring source code and covers all binaries, not just the compiler (tripwire is one solution among many). Second, I don't know that we want to extend this article to cover compiler compromises in general. I'm only aware of three incidents: Thompson, Win32.Induc.A (which didn't result in a back door) and (maybe) Stuxnet (which also didn't result in a back door). So I'm a little reluctant to add a poorly-cited academic paper on detecting compromised compilers to an article focused on what happens after you have the compromise. That said, I'm happy to continue the discussion. GaramondLethe 21:49, 27 January 2013 (UTC)[reply]
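
(For illustration, a minimal tripwire-style check, assuming a baseline manifest recorded while the system was known-good and, crucially, stored somewhere the attacker can't rewrite; the paths and file names are placeholders:)

    import hashlib
    import json
    import sys
    from pathlib import Path

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def snapshot(root):
        # Record a digest for every file under root: the trusted baseline.
        return {str(p): sha256(p) for p in Path(root).rglob("*") if p.is_file()}

    def verify(baseline):
        # Report every file whose current digest differs from the baseline.
        return [path for path, digest in baseline.items()
                if not Path(path).is_file() or sha256(path) != digest]

    if sys.argv[1:] == ["init"]:
        json.dump(snapshot("/usr/bin"), open("baseline.json", "w"))
    else:
        tampered = verify(json.load(open("baseline.json")))
        print("modified or missing:", tampered or "nothing")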

Off-topic content in the article[edit]

I've already started one discussion about this[2] and now I'm starting a second. For some strange reason, people are adding content to the article which has nothing to do with the article's topic. Case in point: "South African police team fly to Apple HQ to crack Oscar Pistorius iPhone code."[3] Nowhere in the cited reference does it say anything about backdoors. Second case in point: Apple's goto fail bug. The first two references[4][5] don't even mention anything about backdoors, and the third reference[6] does but calls it a "nefarious conspiracy", yet our article describes it as if it were a fact. It's not. We should not be going around spreading conspiracy nonsense as if it were fact. A Quest For Knowledge (talk) 11:03, 31 March 2014 (UTC)[reply]

Thanks for describing your position. I've had some experience elsewhere on wiki with similar difficulties. Your position can roughly be summarised as "if the source doesn't use the word 'X', we cannot post that reference under topic 'X'." Mine is admittedly a broader view, because not every writer on the planet describes 'X' under those terms. English is a more fluid language than French or other languages that have 'official' dictionaries and people who are trained to look after the lexicon. So in my view, a writer who is unconscious of the finer points of wikipedian discussion of 'X' may describe his subject as 'A', especially if s/he uses English as a second language. The Samsung backdoor is a useful case in point. If that does not explain what wikipedia considers to be a 'backdoor', what is the use of wikipedia? I challenge you to find a more appropriate term than 'backdoor' in the case of Samsung. 69.60.247.253 (talk) 16:53, 31 March 2014 (UTC)[reply]

Beast 2.07 Remote Administration Tool[edit]

I downloaded Beast on my computer in safe mode with networking so it doesn't find its way into my file system. I purposely hacked my brother's computer and he got really mad. How do I fix this?

  • Description-
  • Removal-
  • Reading help content-

♣♠♣♦♦♥ ♠₮৳ — Preceding unsigned comment added by 2601:1:8801:B20E:80E3:40FE:CB2F:BE53 (talk) 21:15, 15 February 2015 (UTC)[reply]


"of no concern"[edit]

That's a lol since Mr Ed Snowden. Quote: "in high-security settings, where such attacks are a realistic concern." — Preceding unsigned comment added by 91.60.132.129 (talk) 01:08, 19 February 2015 (UTC)[reply]

Good list of backdoors[edit]

Let us expand this section with the GNU list: https://www.gnu.org/proprietary/proprietary-back-doors.en.html

Zezen (talk) 05:52, 22 December 2018 (UTC)[reply]
The problem with that list is that some might argue that some of the mentioned "backdoors" are not really "backdoors", but actually intended features or other trade-offs. (See "It's a feature, not a bug.")
The current list in Wikipedia only mentions backdoors that are verified to be nefarious backdoors for some actor. Things like "unintended backdoors" contradict themselves, as a backdoor is by definition intended by someone; otherwise it is just an (admittedly very big/bad) vulnerability. --rugk (talk) 17:38, 22 December 2018 (UTC)[reply]