\begin{kaupaper}[
    author={%
      \textbf{Rasmus Dahlberg},
      Tobias Pulls,
      Tom Ritter, and
      Paul Syverson
    },
    title={%
      Privacy-Preserving \& Incrementally-Deployable Support for Certificate Transparency in Tor
    },
    reference={%
      PETS (2021)
    },
    summary={%
      One deployment challenge of Certificate Transparency is to ensure that
      monitors and end-users are engaged in gossip-audit protocols.  This is
      particularly difficult for end-users because such engagement can harm
      privacy.  For example, verifying that a certificate is included by
      fetching an inclusion proof from a log reveals which website was visited.
      We propose a gradual roll-out of Certificate Transparency in Tor Browser
      that preserves privacy \emph{due to} the anonymity network Tor and \emph{how
      we use} it.  The complete design holds log operators accountable for
      certificates they promise to append by having Tor relays fetch inclusion
      proofs against the same view agreed upon by directory authorities in Tor's
      consensus.  Any issues found are reported to trusted auditors.  The
      incremental design side-steps much of the practical deployment effort by
      replacing the audit-report pattern with cross-logging of certificates in
      independent logs, thus assuming that at least one log is honest, whereas
      the complete design assumes no honest log.  All Tor Browser needs to do
      is verify
      log signatures and then submit the encountered certificates to randomly
      selected Tor relays.  Such submissions are probabilistic to balance
      performance against the risk of eventual detection of log misbehavior.
      Processing of the submitted certificates is also randomized to reduce
      leakage of real-time browsing patterns, something Tor Browser cannot do on
      its own due to criteria like disk avoidance and the very threat model that
      motivates Certificate Transparency in the first place.  We provide a
      security sketch and estimate performance overhead based on Internet
      measurements.
    },
    participation={\vspace{-.25cm}
      I had the initial idea and was the main driver to move the work forward,
      first in discussion with Tobias and then together with Tom and Paul.
    },
    label={paper:ctor},
]
  \maketitle
  \begin{abstract}
    \input{src/ctor/src/abstract}
  \end{abstract}
  
  \input{src/ctor/src/introduction}
  \input{src/ctor/src/background}
  \input{src/ctor/src/adversary}
  \input{src/ctor/src/design}
  \input{src/ctor/src/analysis}
  \input{src/ctor/src/cross-logging}
  \input{src/ctor/src/performance}
  \input{src/ctor/src/privacy}
  \input{src/ctor/src/related}
  \input{src/ctor/src/conclusion}

  \input{src/ctor/src/acknowledgements}
  
  \bibliographystyle{plain}
  \bibliography{src/ctor/src/ref}

  \begin{appendices}
    \input{src/ctor/src/appendix}
  \end{appendices}
\end{kaupaper}