Stay up to date on all things crypto and blockchain
Token Daily is a place to discover trending news and products in crypto and blockchain.
This is so cool, but also I think the primary application is going to be making us need new devices that can hardware accelerate it, making even more DRM, moving more stuff to the cloud, etc.
Also more than likely Ethereum style stuff that runs every program on thousands of nodes, which will then be the next big thing integrated with every P2P app...
I'm sure there's going to be a lot of great uses, but also some really annoying ones.
What a crap article.
We already do what's supposedly so great with this "crown jewel" (wtf).
Time to make offline backups of what's been discovered so far before it mysteriously disappears...
Hi Jason_Protell, your submission has been removed for the following reason(s):
It has a sensationalized, editorialized, or biased headline and is therefore in violation of Submission Rule #4. Please read our headline rules and consider reposting with a more appropriate title.
If you feel this was done in error, or would like further clarification, please don't hesitate to message the mods.
Ah yes, just like Time AI - sounds like snake oil to me. I read most of it and it's just garbage vague statements. If they're gonna publish an article on technical stuff they should try to take these complex ideas and make them understandable to non-math experts. Even for math experts, this reads like nonsense.
So, from the sound of it, this would be usable for DRM which would be much more difficult or actually impossible to break.
I'm not enthusiastic about that.
Is there a slightly more technical article anywhere? I can't really understand what's going on from this oversimplification.
And we'll all come back here in 3 months when these scientists die from strange but absolutely normal circumstances
unhappy day for governments around the world who want to ban end to end encryption
While the protocol is far from ready to be deployed in real-world applications, (...)
If something isn't ready for real-world application yet, in 9 out of 10 cases it actually means the military of at least several nations is using that something in covert ops.
Bitcoin is a patchwork of tech which already existed and was used for various civilian purposes. Had Bitcoin included in its workings some really exotic tech not yet declassified for civilian use, Satoshi Nakamoto would've been suicided a long time ago, before any of the source code ever left his computer.
I honestly have no clue what I just read, what this actually means, or what any real practical implementations are
And I can't tell if it's because this is too vague or I'm not familiar enough with the terminology
From the paper:
In this work, we show how to construct indistinguishability obfuscation from subexponential hardness of four well-founded assumptions. We prove:
Theorem (Informal). Let τ ∈ (0,∞), δ ∈ (0,1), ε ∈ (0,1) be arbitrary constants. Assume sub-exponential security of the following assumptions, where λ is a security parameter, and the parameters ℓ, k, n below are large enough polynomials in λ:
- the SXDH assumption on asymmetric bilinear groups of a prime order p = O(2^λ),
- the LWE assumption over Z_p with subexponential modulus-to-noise ratio 2^(k^ε), where k is the dimension of the LWE secret,
- the LPN assumption over Z_p with polynomially many LPN samples and error rate 1/ℓ^δ, where ℓ is the dimension of the LPN secret,
- the existence of a Boolean PRG in NC0 with stretch n^(1+τ),
Then, (subexponentially secure) indistinguishability obfuscation for all polynomial-size circuits exists.
Further, assuming only polynomial security of the aforementioned assumptions, there exists collusion resistant public-key functional encryption for all polynomial-size circuits.
Attempting to translate:
the SXDH assumption on asymmetric bilinear groups of a prime order p = O(2^λ),
SXDH = Symmetric External Diffie-Hellman assumption. This assumption is that there is some pair of cyclic groups G_1 and G_2 such that there is an efficiently computable bilinear map G_1 × G_2 → G_T, while the 'easiest' variant of the Diffie-Hellman problem is intractable in both G_1 and G_2. The 'easiest' variant, called DDH, is: given a generator g and the two values g^x and g^y, with x and y unknown, have a decision procedure that distinguishes g^(xy) from a random group element with better-than-chance advantage. (This entails the discrete logarithm problem being intractable, as well as the weaker CDH assumption; both of those are explicitly but redundantly called out in the definition of SXDH. One other variant of the DH problem also needs to be intractable, co-CDH. I'm pretty sure, though not certain, that the truth of co-CDH is also entailed by DDH.)
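To make the DDH game concrete, here's a toy Python sketch over Z_p* (a plain multiplicative group, not the bilinear groups the paper needs, and with hopelessly insecure parameters; it only shows the shape of the distinguishing problem). The generator choice is an illustrative assumption, not a vetted parameter:

```python
# Toy illustration of the DDH game over Z_p* -- NOT a bilinear group,
# and far too small to be secure; just the shape of the problem.
import random

p = 2**61 - 1  # a Mersenne prime; real groups have ~2^256-size orders
g = 3          # assumed generator, for illustration only

def ddh_instance(real: bool):
    """Return (g^x, g^y, Z) where Z = g^(xy) if real, else g^r for random r."""
    x = random.randrange(1, p - 1)
    y = random.randrange(1, p - 1)
    gx, gy = pow(g, x, p), pow(g, y, p)
    z = pow(g, x * y, p) if real else pow(g, random.randrange(1, p - 1), p)
    return gx, gy, z

# The DDH assumption says no efficient algorithm, given (g^x, g^y, Z),
# can tell the real=True case from real=False with advantage
# noticeably better than guessing.
```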
- the LWE assumption over Z_p with subexponential modulus-to-noise ratio 2^(k^ε), where k is the dimension of the LWE secret,
The LWE assumption is the assumption that the Learning With Errors problem is intractable. This is the problem of discerning the coefficients of a secret linear function f over a finite ring (Z_p here), given a set of samples (x_i, y_i) where y_i = f(x_i) + σ_i and σ_i is a noise term, drawn from a known noise distribution, that is zero or small with high probability. In this case the modulus-to-noise ratio is 2^(k^ε), i.e. the noise magnitude is bounded by roughly p/2^(k^ε), ε being one of the arbitrary constants from the open unit interval.
Some other sources I looked at said that LWE specifically mandates Gaussian noise, but how that squares with "y_i = f(x_i) with high probability" I don't understand.
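A toy Python sketch of LWE sample generation may make the shape of the problem clearer. The parameters and the three-valued noise are illustrative assumptions only; real schemes use discrete Gaussian noise and much larger dimensions:

```python
# Toy LWE sample generator: y = <s, x> + e (mod q), with small noise e.
# Parameters are illustrative and nowhere near secure.
import random

q = 3329  # small prime modulus, chosen arbitrarily for the toy
k = 4     # dimension of the secret (tiny, for illustration)

def lwe_samples(s, m):
    """Generate m samples (x, y) with y = <s, x> + e mod q, e in {-1, 0, 1}."""
    samples = []
    for _ in range(m):
        x = [random.randrange(q) for _ in range(k)]
        e = random.choice([-1, 0, 0, 0, 1])  # mostly zero, occasionally +/-1
        y = (sum(si * xi for si, xi in zip(s, x)) + e) % q
        samples.append((x, y))
    return samples

# The LWE assumption says that, for suitable parameters, recovering s
# (or even distinguishing these pairs from uniform ones) is hard.
```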
- the LPN assumption over Z_p with polynomially many LPN samples and error rate 1/ℓ^δ, where ℓ is the dimension of the LPN secret,
LPN = "Learning Parity with Noise", a very similar problem to LWE. In the classic version the range of f is restricted to a single bit (arithmetic mod 2), and the error rate is the fraction of samples whose output bit will be flipped (Bernoulli noise); the paper's variant works over Z_p, with errors nonzero on only a 1/ℓ^δ fraction of samples.
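Here's a toy Python sketch of the classic mod-2 version of LPN (not the paper's Z_p variant); the dimension and error rate are illustrative assumptions:

```python
# Toy LPN sample generator (classic version over GF(2)):
# y = <s, x> mod 2, with the output bit flipped with probability eta.
import random

n = 16       # dimension of the secret (tiny; real instances are much larger)
eta = 0.125  # Bernoulli error rate: expected fraction of flipped samples

def lpn_samples(s, m):
    """Generate m samples (x, y) with y = <s, x> mod 2, flipped w.p. eta."""
    samples = []
    for _ in range(m):
        x = [random.randrange(2) for _ in range(n)]
        y = sum(si & xi for si, xi in zip(s, x)) % 2
        if random.random() < eta:
            y ^= 1  # the Bernoulli noise: flip the parity bit
        samples.append((x, y))
    return samples
```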
- the existence of a Boolean PRG in NC^0 with stretch n^(1+τ),
PRG = pseudorandom generator; NC^0 = the class of functions computable by constant-depth, bounded fan-in circuits (each output bit depends on only constantly many input bits). Stretch being n^(1+τ) means that the PRG's output is polynomially longer than its seed, e.g. if τ=0.01, 100 bits of seed generate 104 bits of output sequence and 1000 bits of seed generate 1071 bits of output. τ can be arbitrarily small so long as it's positive, so this is requiring any superlinear polynomial stretch.
AFAICT the strongest known NC^0 PRGs have only linear stretch, so this assumption is by far the shakiest.
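For concreteness, the arithmetic behind those stretch figures, as a quick sanity check (nothing more):

```python
# Sanity-checking the stretch numbers: output length = n^(1 + tau).
tau = 0.01

for seed_bits in (100, 1000):
    out_bits = int(seed_bits ** (1 + tau))
    print(f"{seed_bits} seed bits -> {out_bits} output bits")
# With tau = 0.01: 100 -> 104 and 1000 -> 1071, matching the figures above.
```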
Alright, this is cool and all, but I can't help offering a contrasting voice on a claim made in the article. This whole thing about a 'crown jewel' and 'building crypto up' from this notion...
...techniques can be new or cool without magically superseding everything. I think it's a bit misleading to make this seem so universal. Of course it could be a big deal if it works out in practice, but the 'in practice' is part of cryptography, and the whole of cryptography cannot be derived from obfuscation.
Hiding the implementation of any program does not always meet your security goals!
Found one of their published pieces of research. I don’t understand this shit past at most page 15 lol. https://arxiv.org/pdf/2008.09317.pdf
Pretty much the main use of such a thing is uncrackable DRM systems, which we absolutely do not need in this world.
> But iO is stronger than it sounds. For example, suppose you have a program that carries out some task related to your bank account, but the program contains your unencrypted password, making you vulnerable to anyone who gets hold of the program. Then — as long as there is some program out there that could perform the same task while keeping your password hidden — an indistinguishability obfuscator will be strong enough to successfully mask the password. After all, if it didn’t, then if you put both programs through the obfuscator, you’d be able to tell which obfuscated version came from your original program.
Why does existence of another mask the secret? Even if you couldn't distinguish the programs, couldn't you just try both possible secrets?
Reads article
Yup! Gosh, I can't wait for it to become remotely practical.
It's been a few years -- wow, almost a decade -- but I remember a paper that claimed to prove that "no fully homomorphic program can be written such that the secret key was guarded in all cases," i.e. it's theoretically possible to trick a FHE program into giving up its own key, unless your program is basically trivial.
It was one of those papers that you run across, then never see again, and wistfully wish you'd saved a reference to. Tantalizing idea, if true, but probably not true.
EDIT: I just went back to finish the article, and they linked to it: https://link.springer.com/chapter/10.1007/3-540-44647-8_1 Welp. Looks like it was a good paper!
>For decades, computer scientists wondered if there is any secure, all-encompassing way to obfuscate computer programs, allowing people to use them without figuring out their internal secrets.
Obfuscate executable code? Somehow I don't think the original paper is talking about that.
<skims through the beginning of the paper>
They are?
But... how do you execute it then?
I wonder if this will be the Achilles heel. If there were a way to perform a task without needing a secret, you would not be using homomorphic encryption; you could just do regular computation, since there is no secret. However, if the only way to perform the computation is with the secret, then indistinguishability obfuscation won't help you. To me it seems that iO is useful only for computations that didn't need homomorphic encryption in the first place, because they already had a way to do the computation without the secret.
If you're going to spend 9 paragraphs talking about the controversy of whether such a thing is possible or not, dedicate at least 1 sentence at the start to explain it. Otherwise it's just confusing because as a reader, I have no idea why I should care or what's even worth caring about here.
Recommendation to anyone who hasn't read the article yet: start with the part called "The Crown Jewel".
As someone who works in cryptography, this isn't "The Crown Jewel" of Cryptography.
Cryptography to me is about helping queer people live their best life even in countries where they're hated.
Cryptography to me is about providing assurance that the software you're running was produced by the people you thought produced it, and that the built binary is reproducible from the source code, which technical people can also inspect to assert the absence of backdoors.
Obfuscation and copy-protection are non-goals for me. Fully homomorphic encryption has uses in machine learning and cloud services, but I find them at odds with the goals I care about.