Proof of knowledge

Class of interactive proof
In cryptography, a proof of knowledge is an interactive proof in which the prover succeeds in 'convincing' a verifier that the prover knows something. What it means for a machine to 'know something' is defined in terms of computation. A machine 'knows something' if this something can be computed, given the machine as an input. As the program of the prover does not necessarily spit out the knowledge itself (as is the case for zero-knowledge proofs [1]), a machine with a different program, called the knowledge extractor, is introduced to capture this idea. We are mostly interested in what can be proven by polynomial-time bounded machines. In this case the set of knowledge elements is limited to a set of witnesses of some language in NP.

Let x be a statement of language L in NP, and W(x) the set of witnesses for x that should be accepted in the proof. This allows us to define the following relation: R = {(x, w) : x ∈ L, w ∈ W(x)}.

A proof of knowledge for relation R with knowledge error κ is a two-party protocol with a prover P and a verifier V with the following two properties:

  1. Completeness: If (x, w) ∈ R, then the prover P who knows witness w for x succeeds in convincing the verifier V of his knowledge. More formally, Pr(P(x, w) ↔ V(x) → 1) = 1, i.e. given the interaction between the prover P and the verifier V, the probability that the verifier is convinced is 1.
  2. Validity: The success probability of a knowledge extractor E in extracting the witness, given oracle access to a possibly malicious prover P̃, must be at least as high as the success probability of the prover P̃ in convincing the verifier. This property guarantees that no prover that does not know the witness can succeed in convincing the verifier. (A concrete relation instantiating this definition is sketched after this list.)
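For concreteness, the display below (an illustration added here; it anticipates the discrete-logarithm example discussed later in this article) spells out the language, the witness sets and the relation for elements of a cyclic group with generator g whose discrete logarithm the prover claims to know:

    \[
      L = \{\, y \mid \exists w:\ y = g^{w} \,\}, \qquad
      W(y) = \{\, w \mid y = g^{w} \,\}, \qquad
      R = \{\, (y, w) \mid y = g^{w} \,\}.
    \]

Completeness says that a prover who actually holds such a w always convinces the verifier; validity says that from any prover that convinces the verifier sufficiently often, an extractor can compute some w with y = g^w.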

Details on the definition

This is a more rigorous definition of validity: [2]

Let R be a witness relation, W(x) the set of all witnesses for public value x, and κ the knowledge error. A proof of knowledge is κ-valid if there exists a polynomial-time machine E, given oracle access to P̃, such that for every P̃, it is the case that E^{P̃(x)}(x) ∈ W(x) ∪ {⊥} and Pr(E^{P̃(x)}(x) ∈ W(x)) ≥ Pr(P̃(x) ↔ V(x) → 1) − κ(x).

The result ⊥ signifies that the Turing machine E did not come to a conclusion. The knowledge error κ(x) denotes the probability that the verifier V might accept x even though the prover does in fact not know a witness w. The knowledge extractor E is used to express what is meant by the knowledge of a Turing machine. If E can extract w from P̃, we say that P̃ knows the value of w.

This definition of the validity property is a combination of the validity and strong validity properties of [2]. For small knowledge errors κ(x), such as 2^{-80} or 1/poly(|x|), it can be seen as being stronger than the soundness of ordinary interactive proofs.
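As a worked illustration (the numbers are chosen here for illustration and are not taken from the article): suppose a possibly cheating prover P̃ convinces the verifier with probability 2/3, and the knowledge error is κ(x) = 2^{-80}. The validity bound then guarantees

    \[
      \Pr\bigl(E^{\tilde{P}(x)}(x) \in W(x)\bigr)
      \;\ge\; \Pr\bigl(\tilde{P}(x) \leftrightarrow V(x) \rightarrow 1\bigr) - \kappa(x)
      \;=\; \tfrac{2}{3} - 2^{-80} \;\approx\; \tfrac{2}{3},
    \]

so essentially whenever this prover convinces the verifier, a witness can be extracted from it. Conversely, for a prover whose success probability is at most κ(x) the bound is vacuous, which is why κ(x) is interpreted as the probability of convincing the verifier without knowing a witness.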

Relation to general interactive proofs

In order to define a specific proof of knowledge, one need not only define the language, but also the witnesses the verifier should know. In some cases proving membership in a language may be easy, while computing a specific witness may be hard. This is best explained using an example: Let ⟨g⟩ be a cyclic group with generator g in which solving the discrete logarithm problem is believed to be hard. Deciding membership of the language L = {x ∣ g^w = x} is trivial, as every x is in ⟨g⟩. However, finding the witness w such that g^w = x holds corresponds to solving the discrete logarithm problem.
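The following minimal Python sketch (illustrative only; the tiny parameters are hypothetical and chosen so that the brute-force search terminates) contrasts the two tasks: deciding membership in ⟨g⟩ is a single exponentiation, while recovering the witness w amounts to solving a discrete logarithm.

    # Hypothetical toy parameters: a prime p with a subgroup of prime order q.
    p = 1019          # small prime modulus (illustrative only)
    q = 509           # prime order of the subgroup, q divides p - 1
    g = 4             # generator of the order-q subgroup of Z_p^*

    def is_member(x: int) -> bool:
        """Deciding membership in <g> is easy: x lies in the subgroup iff x^q = 1 (mod p)."""
        return pow(x, q, p) == 1

    def find_witness(x: int) -> int:
        """Finding w with g^w = x (mod p) is the discrete logarithm problem; brute force here."""
        for w in range(q):
            if pow(g, w, p) == x:
                return w
        raise ValueError("x is not in <g>")

    x = pow(g, 123, p)        # an element whose (secret) witness is w = 123
    print(is_member(x))       # True  -- membership is immediate to check
    print(find_witness(x))    # 123   -- feasible only because q is tiny here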

Protocols

Schnorr protocol

One of the simplest and frequently used proofs of knowledge, the proof of knowledge of a discrete logarithm, is due to Schnorr. [3] The protocol is defined for a cyclic group G_q of order q with generator g. In order to prove knowledge of x = log_g y, the prover interacts with the verifier as follows:

  1. In the first round the prover commits himself to randomness r; therefore the first message t = g^r is also called the commitment.
  2. The verifier replies with a challenge c chosen at random.
  3. After receiving c, the prover sends the third and last message (the response) s = r + c·x.

The verifier accepts if g^s = t·y^c. We can see that this is a valid proof of knowledge because it has an extractor that works as follows:

  1. Simulate the prover to output t = g^r. The prover is now in state Q.
  2. Generate a random value c_1 and input it to the prover. It outputs s_1 = r + c_1·x.
  3. Rewind the prover to state Q. Now generate a different random value c_2 and input it to the prover. It outputs s_2 = r + c_2·x.
  4. Output (s_1 − s_2)·(c_1 − c_2)^(-1).

Since s_1 − s_2 = (r + c_1·x) − (r + c_2·x) = x·(c_1 − c_2), the output of the extractor is precisely x. This protocol happens to be zero-knowledge, though that property is not required for a proof of knowledge.
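A minimal Python sketch of the protocol and of the rewinding extractor is given below. It is illustrative only: the group parameters are hypothetical toy values, rewinding is simulated simply by reusing the prover's randomness r with two different challenges, and a real implementation would use a cryptographically large group.

    import secrets

    # Hypothetical toy group: subgroup of prime order q in Z_p^* (illustrative only).
    p, q, g = 1019, 509, 4

    # Secret witness x and public value y = g^x.
    x = 123
    y = pow(g, x, p)

    # --- one interactive run of the Schnorr protocol ---
    r = secrets.randbelow(q)          # prover's randomness
    t = pow(g, r, p)                  # 1. commitment t = g^r
    c = secrets.randbelow(q)          # 2. verifier's random challenge
    s = (r + c * x) % q               # 3. response s = r + c*x (reduced mod q)
    assert pow(g, s, p) == (t * pow(y, c, p)) % p   # verifier checks g^s = t * y^c

    # --- knowledge extractor: rewind and answer two different challenges ---
    c1, c2 = 7, 301                   # two distinct challenges (arbitrary here)
    s1 = (r + c1 * x) % q             # prover's answer to c1 from state Q
    s2 = (r + c2 * x) % q             # prover's answer to c2 after rewinding to Q
    extracted = ((s1 - s2) * pow(c1 - c2, -1, q)) % q
    assert extracted == x             # the extractor recovers the witness x
    print("extracted witness:", extracted)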

Sigma protocols

Protocols which have the above three-move structure (commitment, challenge and response) are called sigma protocols. [citation needed] The Greek letter Σ visualizes the flow of the protocol. Sigma protocols exist for proving various statements, such as those pertaining to discrete logarithms. Using these proofs, the prover can not only prove knowledge of the discrete logarithm, but also that the discrete logarithm is of a specific form. For example, it is possible to prove that two logarithms of y_1 and y_2 with respect to bases g_1 and g_2 are equal or fulfill some other linear relation. For a and b elements of Z_q, we say that the prover proves knowledge of x_1 and x_2 such that y_1 = g_1^{x_1} ∧ y_2 = g_2^{x_2} and x_2 = a·x_1 + b. Equality corresponds to the special case where a = 1 and b = 0. As x_2 can be trivially computed from x_1, this is equivalent to proving knowledge of an x such that y_1 = g_1^x ∧ y_2 = (g_2^a)^x · g_2^b. This is the intuition behind the following notation, [4] which is commonly used to express what exactly is proven by a proof of knowledge.

PK{(x) : y_1 = g_1^x ∧ y_2 = (g_2^a)^x · g_2^b},

states that the prover knows an x that fulfills the relation above.
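A minimal Python sketch of a sigma protocol for this statement follows (illustrative only; the toy parameters are hypothetical, and it runs the same commit-challenge-respond pattern as the Schnorr protocol above in parallel for both bases). Writing h = g_2^a, the prover shows knowledge of x with y_1 = g_1^x and y_2·g_2^{-b} = h^x, using a single response s for both equations.

    import secrets

    # Hypothetical toy group (same illustrative parameters as above).
    p, q = 1019, 509
    g1, g2 = 4, 16            # two bases in the order-q subgroup
    a, b = 3, 5               # public coefficients of the linear relation x_2 = a*x_1 + b

    x = 42                                          # secret witness x (= x_1)
    y1 = pow(g1, x, p)                              # y_1 = g1^x
    y2 = (pow(g2, a * x, p) * pow(g2, b, p)) % p    # y_2 = (g2^a)^x * g2^b

    h = pow(g2, a, p)                               # combined base h = g2^a
    z = (y2 * pow(pow(g2, b, p), -1, p)) % p        # z = y_2 * g2^(-b), so z = h^x

    # Commitment: one randomness r, commitments under both bases.
    r = secrets.randbelow(q)
    t1, t2 = pow(g1, r, p), pow(h, r, p)

    # Challenge and response.
    c = secrets.randbelow(q)
    s = (r + c * x) % q

    # Verifier checks both equations with the *same* response s,
    # which is what ties the two discrete logarithms together.
    assert pow(g1, s, p) == (t1 * pow(y1, c, p)) % p
    assert pow(h, s, p) == (t2 * pow(z, c, p)) % p
    print("proof verifies")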

Applications

Proofs of knowledge are useful tools for the construction of identification protocols and, in their non-interactive variant, signature schemes such as the Schnorr signature. They are also used in the construction of group signature and anonymous digital credential systems.
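As an illustration of the non-interactive variant, the verifier's random challenge can be replaced by a hash of the commitment and the message (the Fiat-Shamir heuristic, named here explicitly; it is not discussed above), turning the Schnorr protocol into a signature-like scheme. The Python sketch below is hypothetical and uses the same toy parameters as before, not a production implementation.

    import hashlib, secrets

    # Toy parameters reused from the Schnorr sketch above (illustrative only).
    p, q, g = 1019, 509, 4
    x = 123                      # signer's secret key
    y = pow(g, x, p)             # signer's public key

    def hash_challenge(t: int, msg: bytes) -> int:
        """Derive the challenge from the commitment and the message (Fiat-Shamir)."""
        digest = hashlib.sha256(str(t).encode() + msg).digest()
        return int.from_bytes(digest, "big") % q

    def sign(msg: bytes) -> tuple[int, int]:
        r = secrets.randbelow(q)
        t = pow(g, r, p)                 # commitment
        c = hash_challenge(t, msg)       # challenge is computed, not received
        s = (r + c * x) % q              # response
        return t, s

    def verify(msg: bytes, sig: tuple[int, int]) -> bool:
        t, s = sig
        c = hash_challenge(t, msg)
        return pow(g, s, p) == (t * pow(y, c, p)) % p

    sig = sign(b"hello")
    print(verify(b"hello", sig))         # True
    print(verify(b"tampered", sig))      # False, except with probability ~1/q for these toy parameters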

See also

References
