But it could definitely use more visibility, given that Jitsi is a direct Zoom rival and nothing else out there offers free, client-side, end-to-end encrypted, open source videoconferencing (for now, while groups like Signal are working on it...)
Wait wait, open source E2EE videoconferencing?
I only know of this free E2EE tech:
Signal - working on videoconferencing.
Wire - can do it to an extent, but there are other issues with Wire.
Jitsi - getting there, as we see, and testing it openly as linked above.
Jami - I hear it's not as stable, but yeah, that's an option.
Duo - not open source, though E2E, and working hard to grow its numbers to chase FaceTime and Hangouts.
FaceTime - not open source.
Do you know of more?
Jami is the big one I would recommend; it's stable and I use it pretty regularly. But there's also Tox, Linphone, and technically MegaChat (it is open source, though I'd approach it cautiously). And of course the others that you listed.
Edit: it's worth mentioning, in case anyone gets the wrong idea, that I love Jitsi, and this is fantastic. I use many different solutions depending on need, and was only responding to give more info on some others. I'm a big Jitsi user, along with Signal, Riot/Matrix, CryptPad, Jami, and many others. I really only refuse to use WhatsApp because I have negative trust for Facebook. I actively DISTRUST them, disbelieve anything they say, and won't use any of their products. Facebook could tell me that the sky is still blue, and I think I'd still go check to make sure.
I don't disagree, but I do want to mention that that's a more recent semantic distinction. Those concepts (to my knowledge, and I've tried to educate myself on this) didn't exist back around the early nineties when, for instance, the Linux kernel was made available. I should have been more specific: the source is available for review, not necessarily licensed for you to take and re-implement. Thank you for bringing the distinction to the attention of others who may have read my post.
Sorry if I was too tongue-in-cheek 😂😂.
I do disagree that it is just semantics. As you said, Mega is only open for review; people can't modify the code or run it themselves, which is the core of the FOSS philosophy. I think the term originated with Microsoft trying to subvert the OSS movement by releasing some code under really restrictive licenses.
Wow, you've just listed a lot of privacy-friendly sites that I didn't know about. I used to take notes in Google Docs, but since I became a "privacy junkie" I've just used the notepad. I'm going to give CryptPad a try.
Another thing: if you had to choose between Signal and Jami, which one would you pick?
A lot depends on your use case. Signal is quite helpful for asynchronous messaging, which Jami does not support (unless that changed extremely recently). What this means in practice is that you can't leave offline messages for users. Jami is, however, strongly anonymous and distributed, and is useful for multi-participant conference calls without depending on any centralized server. This may or may not result in better performance (and may result in worse if the path between participants is complicated). If you do decide to use Jami, immediately make a backup of your account and add a second device, or you will lose it some day. I use both, though I use Signal much more often, simply due to the offline messaging support.
Agree, 1000%. I can't wait till this works in Firefox...
So, not sure if you're keeping up with this, but as of this post I just heard this might work in Nightly. I'm grabbing it right now to test.
Also, a dev showed up in the thread about this, so you could bug him. But I've been hearing they're moving fast to add this functionality to Firefox, and have been adding it, so keep an eye on this.
Update #2: So while others might be getting it to work, I grabbed Nightly and enabled subgrid, and did not see the option on the alpha site or the main site. Note: using Chrome's bleeding-edge pre-Dev version, Canary, I now see the option on the main site as well, so there's no need to use the alpha.jitsi.net site; [meet.jit.si](https://meet.jit.si) is now functional.
Sidenote: I looked at this. The WebRTC devs are the same teams that work on Chrome, Hangouts, Duo, etc. (Funny enough, according to Twitter they're looking at using this new development to enhance Duo, which is funny since Duo is already E2E and supports up to 12 people, though they want it bigger.)
So I don't know if this is a matter of work needed on the Firefox team's side or on the Jitsi team's side.
I guess a random stream being interpreted as MPEG wouldn't look like evenly distributed noise, that's probably more an effect of the codec.
As it says in the article, they've not gotten to key exchange yet; they'll probably use DH.
Provided you arrange secure multiparty key distribution ahead of time, which is far from a trivial problem. Also, it leaves the key in your web history, which probably gets uploaded to your Google account.
It’s a decent step, but it leaves most of the really hard problems to the users.
The real problem, it seems to me, is that the .gov is actively making real encryption illegal to the best of its ability. I have 100% faith that a lot of you encryption folks could come up with near-unbreakable methods. I'm a bit in awe of those of you who figure out how to break encryption, honestly. I also have 100% faith that that is the biggest fear of our elite. They're scanning our emails, phone calls, etc., and our politicians', as we speak. They're going to go to the mat to keep that power.
It's kind of a cat's-out-of-the-bag situation, since you can self-host this. Unless they make it illegal by knocking on everyone's door, I don't see enforcement being realistic, even if they do pass some end-to-end encryption bill in the US.
Personally, I'd love to see this made completely peer-to-peer, including the room creation, so that there is no way to shut it down.
The truth is, if the .gov keeps getting more and more corrupt, there are far too many programmers out here for them to stop us from releasing unbreakable encryption into the wild. I'm a progressive, and I'm disgusted by both parties turning us into a surveillance state, to be clear. And you're right: release freeware anonymously, along with the source, everywhere, and they can't stop us. Especially given the quality of the average .gov programmer.
This looks like just the trivial part of e2ee video: encrypting/decrypting a stream is not complicated.
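To make that concrete, here's a toy sketch of per-frame symmetric encryption in the spirit of what Insertable Streams enables. The hash-based keystream is purely illustrative (a real client would use AES-GCM or ChaCha20-Poly1305); the key and frame data are made up:

```python
import hashlib

def keystream(key: bytes, frame_index: int, length: int) -> bytes:
    # Hash-counter keystream, purely for illustration.
    out = b""
    block = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + frame_index.to_bytes(8, "big") + block.to_bytes(8, "big")
        ).digest()
        block += 1
    return out[:length]

def encrypt_frame(key: bytes, frame_index: int, frame: bytes) -> bytes:
    # XOR each frame with a per-frame keystream; XOR is its own inverse,
    # so the same function decrypts.
    ks = keystream(key, frame_index, len(frame))
    return bytes(x ^ y for x, y in zip(frame, ks))

key = b"shared-conference-key"       # made-up key
frame = b"fake video frame payload"  # made-up frame data
ct = encrypt_frame(key, 0, frame)
assert ct != frame
assert encrypt_frame(key, 0, ct) == frame   # round-trips
```

The hard part isn't this loop; it's agreeing on `key` in the first place without the server learning it.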
Is my impression correct that this means there's no downscaling performed or anything? Did I miss something?
I.e. if I have 4 people in a conference and they all have 1920x1080 streams, each should take up only a quarter of a 1920x1080 screen. Without encryption you'd just have the bridge downscaling the streams and mixing them into a single 1920x1080 stream, thus getting linear bandwidth consumption (one 1920x1080 stream down per participant), instead of quadratic (n 1920x1080 streams down per participant, downscaled and mixed locally). I'm curious what the devs plan for this.
All participants use simulcast, which means they send their video in 3 different resolutions (on meet.jit.si clients send 720p, 360p and 180p) and all of them are encrypted. So if one participant has downlink issues, we just forward them a lower-quality stream (other participants can continue to receive the highest quality of the sender they can take).
I think each client sends a valid video stream containing a scrambled 1080/n frame, and the server joins them as if they were normal videos. The client then decodes the stream, gets the full 1080p scrambled frame, and descrambles it before sending it to the screen.
Oh, it's been solved already over and over again. It is just that people still seem to believe that they can somehow have secure communications without verifying that they are actually talking to who they think they are talking to. There is nothing to solve here other than expectations.
What Jitsi is doing now where you send the key over another channel is better than most things that claim E2EE in that the server can't trivially MITM everything. You need to do that on the wire.
Exactly. "Here is how we did the trivial part of e2e. Our next step is to figure out something nobody has a great solution to and which is crucial to get any user adoption".
(I'm aware of Matrix and others doing interesting things with key exchange. But I don't consider that a great solution, especially if you want this to be used by non-technical people.)
Yeah, I wonder if Jitsi Meet can be run without Jitsi Videobridge...
Although from what I understand for such a scenario the solution would be to just run Videobridge on your own system locally since it'd make no difference to the end result. (And it's easier to keep the software components neatly distinct that way.)
Reminds me of when PGP was classed as a munition and couldn’t be exported - but a book of the source code was protected under the 1st amendment so was sent overseas, OCRd and compiled in Europe to get around the ban.
[quote]Meanwhile police and other government agencies are busy encrypting their radio communications.[/quote]
The government is not totally opposing end-to-end encryption. End-to-end is perfectly fine, so long as the government can decrypt the data along with the sender and recipient(s).
There's no way Alice and Bob should be able to talk in secret without Uncle Sam knowing everything that's going on.
Since law enforcement are able to decrypt law enforcement communications... They, unlike the rest of us, are perfectly fine using strong encryption without the need for a second key.
There are two unrelated things here. If the government had the CAs' private keys beforehand, they could intercept with a man-in-the-middle attack to capture the traffic. Many corporate spy-on-employees setups do this (except they don't have a CA's private keys; they just act as their own CA, which they make your browser accept). Anti-virus software also often does this. Gaining the CA's private keys after the fact doesn't really help them.
The other is what is known as forward secrecy. Some cipher suites used by TLS (the encrypted transport underlying HTTPS) have forward secrecy, which means that even if the observing party gets access to everyone's private keys after the fact, it still doesn't allow them to decrypt the communication. Browsers and web servers are moving toward these protocols, but a lot of web traffic is still done without them. If the government captured the communication and forced the owner of the server (not the CA, just the server) to hand over its private key, they could then decrypt this traffic.
Looking at my connection to reddit, I see it is using `TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256`. In this case, the second E in ECDHE means the key exchange uses an ephemeral key, so this connection has forward secrecy.
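For anyone curious what the ephemeral part buys you, here's a toy Diffie-Hellman exchange with deliberately tiny parameters (real TLS uses X25519 or 2048-bit+ groups; `p = 23, g = 5` are textbook demo values):

```python
import secrets

# Toy-sized parameters; textbook demo values only.
p, g = 23, 5

# Each side generates a fresh (ephemeral) private value for this session.
a = secrets.randbelow(p - 2) + 1    # Alice's ephemeral secret
b = secrets.randbelow(p - 2) + 1    # Bob's ephemeral secret

A = pow(g, a, p)                    # exchanged in the clear
B = pow(g, b, p)                    # exchanged in the clear

shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # both ends derive the same session key
```

Forward secrecy falls out of the fact that `a` and `b` are deleted when the session ends, so seizing the server's long-term key later reveals nothing about this particular secret.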
> If the government had the private keys to the CAs beforehand, they could intercept with a man-in-the-middle attack to capture the traffic.
> Gaining the private keys to the CA doesn't really help them.
What? Isn't that the opposite of the statement above?
> The other is what is known as forward secrecy. ...even if after the fact the observing party gets access to everyone's private key, it still doesn't allow them to decrypt the communication.
The implication is that they're MITM **while** the traffic is happening, so forward secrecy isn't relevant, as the session is still active, and they know the session secret since they were there in the middle during the handshake.
> forced the owner of the server (not the CA, just the server) to give them its private key, they could then decrypt this traffic.
They don't need to get the server owner's key; with a CA key they can just make their own certificate and MITM live traffic.
> > Gaining the private keys to the CA doesn't really help them.
> What? Isn't that the opposite of the statement above?
I guess what I meant was that if they didn't have the private key before your TLS session, gaining it afterwards won't help them.
I think this is why the out-of-band tracking of which certs are actually valid for a given site is quite useful. If someone gets hold of a CA private key and makes a new cert for the site, it should get flagged, at least at first. Of course, the government would be just the entity to intercept early and consistently.
Forward secrecy helps against the case where they get the private key for the web server, although if they have access to the server during the communication session, they could also capture the traffic internally on the server.
> make sure that self signing doesn't become a thing.
But... self-signing has been a thing for a while, and is usually used for testing purposes because those certificates aren't ever marked as secure to begin with. I can create a self-signed certificate right now using PowerShell, but it wouldn't be marked as secure, and no one's going to trust my website.
I assume what you mean is that CAs can just give their keys to the government, which could then essentially do MITM attacks without you knowing. But if that's the case, then he's not really outlawing encryption, is he?
> But... Self signing has been a thing for awhile
Not if you define "a thing" as "working for real users in real applications on the current relevant platforms; browsers, Android and iOS". Then it doesn't work at all.
> CA's can just give the keys to the government so that they could essentially do MITM
I'm 100% sure they already have.
> But if that's the case then he's not really outlawing encryption then is he?
What this clown does isn't really relevant when there are 10,000x more politicians doing basically the same thing, but smarter. He's basically just making noise so we ignore the real end of encryption happening in the background.
> Not if you define "a thing" as "working for real users in real applications on the current relevant platforms browsers, Android and iOS". Then it doesn't work at all.
"But... Self signing has been a thing for awhile and is usually used for testing purposes **because they aren't ever marked as secure to begin with.**"
> I'm 100% sure they already have.
UK does it already
>He's basically just making noise
Every politician in a nutshell
It would actually be quite trivial to do. It can be outlawed, packet inspection systems can be installed at ISPs and users can be prosecuted for sending encrypted communications, et cetera. Politically, I doubt that's feasible, but technically it is very feasible.
> It can be outlawed, packet inspection systems can be installed at ISPs and users can be prosecuted for sending encrypted communications,
Not sure how well that sort of legislation would fare, at least under scrutiny of whether 'encrypted communications' can be defined precisely enough to keep misclassification to a minimum. Not to mention that encryption is used for things like online shopping and banking, which can increasingly be done at home and via cell phones and other devices; unilaterally outlawing "encryption" would cause a host of problems there.
> users can be prosecuted for sending encrypted communications
Depends on your exact definition of 'encrypted communications'. Can I be prosecuted for buying something from eBay or sending a message over Facebook? It's hard to outlaw something that's pretty standard.
> It's hard to outlaw something that's pretty standard.
No it isn't. It's *easy* to outlaw something that's pretty standard, and it's been done many times. True, they can't arrest *everyone*, but they can do something better, which is arrest *anyone they want*. That's what "prosecutorial discretion" means in a world where everyone is a criminal. Piss off those in power, and they will find something to charge you with.
I'm not saying it's politically feasible in the current political environment, just that there is no technical issue with doing that. Encrypted data is easily distinguishable from unencrypted data.
Besides, with Facebook or eBay the government can simply get records directly from those companies, there is no need to disallow HTTPS. The proposed regulations are targeting companies that use end-to-end encryption such that they do not have access to the content being transmitted using their service. There is no technical issue with doing that either.
> just that there is no technical issue with doing that. Encrypted data is easily distinguishable from unencrypted data.
How so? Or rather: are you sure this can be done with enough accuracy to be meaningful? Is encrypted data really so uniformly easy to discern?
...but probably not *end-to-end* encrypting them.
Or, if they are, it's without using a central routing service. Because the bill they're pushing through isn't about making end-to-end illegal for you or me, it's about making it impractical for, say, Whatsapp to keep doing e2e.
Practically, this would mean Jitsi-the-open-source-project is probably fine, but anyone trying to offer Jitsi-as-a-website would have problems.
I guess for a lot of people that's a distinction without a difference, but the thing is subtle and insidious. They've learned their lessons, they aren't going to push through something as dumb as actually _banning_ encryption.
If the copies are end-to-end encrypted, they can't meaningfully be audited. (Traditionally, the "ends" in "end-to-end" are the devices of the people actually involved in the communication; otherwise, we'd say something like "client-server encryption", I guess?)
They've been talking about it since 2016. I think in today's day and age, it's basically irrelevant. Connections are metered more often than not, and you are not going to be using your 5GB mobile allowance for peer-to-peer streaming, especially when a cloud instance costs next to nothing and delivers a much better user experience.
It's coming. Decentralized internet is the future. Free from the government's sticky fingers.
I could very well see a type of Facebook in which you host your own "profile page" plus maybe 5-50 other profile pages you visit frequently. Everyone does this and shares the burden of the "website".
I think that would make the problems with fake news and bots significantly worse. Facebook, Twitter, etc spend a significant amount of money on fact checking and automated and manual review of posts and yet even with that it is a problem. Imagine how bad it would be without that work happening.
> our animalistic, dopamine driven, hard wired social reality feedback circuits facing unconstrained information transfer
Nice string of words. I'd say "sentence", but I didn't include your entire sentence in that quote.
Okay, here's the thing I'm thinking about. Every new generation wants their "own" thing. It doesn't have to be good or bad; it just has to be theirs, and theirs alone. Kids aged 6-16 don't want to use the same social platform their parents are using. You can draw parallels here to music taste: kids want something that's theirs. And it's been that way since the dawn of time. So when even their grandparents started using Facebook, kids nowadays moved on; they're using other platforms like TikTok/Snapchat and whatnot.
And the next generation will use a different service. Just because of this "need" for something that's theirs and theirs alone.
What I'm trying to say, in so many words, is that the hegemony of Facebook, Instagram, and the like is not going to last forever.
All we need is a new site the kids start using with the right backend: one that is democratic and insulated from "sticky fingers". Sadly, money makes the world go round, and there's no money in a decentralized site, so you don't have the funding to market it and it fails, like people have pointed out (Mastodon/fediverse/Diaspora/ActivityPub). Maybe such a site needs to be tied to something with money behind it, so there's an incentive to spread the word about it.
In the end, the change starts with you: for you to press delete on your Facebook/Instagram/Snapchat/TikTok/Google/Microsoft (and also Reddit!) accounts and change sites. You might think Reddit is not a social media site, but oh yes it is.
The funny thing is, it was much easier to run your own mail server 20 years ago. Today, you are virtually guaranteed to end up in spam filters 95% of the time unless you jump through 1000 hoops, which is why services like Sendgrid are doing so well.
We already had that. It was called net news.
I find it hard to believe that in an era of bittorrent and blockchain we couldn't manage to re-decentralize something that started out as decentralized in the first place.
Curious thought. What if someone invented a cycleGAN (image generating neural net) that spit out child pornography in copious amounts. No real child was harmed in the process, no child was taken advantage of.
Would this service be praised or hated? It's inherently good in that it saves a ton of children from actual harm. But the stigma of the entire thing, and the cultural and hateful overlay of this disease that is pedophilia, make it very hard to discuss with rationality and any emotion except hate.
Everything must be decentralized, and fast. Power generation and the internet are the two that we need to get on the fastest. Water and food are pretty close behind. Maybe even bump food to the top in the age of coronavirus.
> I could very well see a type of facebook in which you host your own "profile page" and maybe 5-50 other profile pages you visit frequently. Everyone does this and share the burden of the "website".
This has _never_ been done before and would _never_ fail.
I'd like to see it of course, but we've seen a couple of attempts at that so far.
Because you know your data is in good hands? I stick to using a VoIP server a close friend of mine runs from his house. Unlike with Discord or some such, you actually know who is running the server, and "server" actually means a real server, not a glorified chatroom.
Discord had a malicious stroke of genius in calling their glorified chatrooms "servers" to obfuscate what it means to actually "run your own server".
I remember people being very pissed when internet upload speeds began to plummet in favor of downloads, since they figured this would be inevitable as well. But you can't really host something on 2.5 Mbps upload, and thus we became reliant on our corporate overlords for content.
Any household's upload speed alone isn't enough to host a YouTube competitor, but thousands in aggregate could start to come closer. There are several P2P schemes that do a pretty good job aggregating lots of small pipes into a veritable *torrent* of data. There are also lots of cheap VPS services on much fatter pipes that can easily augment residential servers.
Even if you're just hosting something on your residential connection having several Mbps of bandwidth is plenty for serving lots of services. It doesn't take much bandwidth to host a blog.
> But you cant really host something on 2.5mbps upload
You act like 10 years ago 2.5mbps was the norm. :-) We had decentralized stuff long before Google was around, and now we have bittorrent and blockchain, so it's just a matter of making things convenient enough that people will use them.
Also, you don't have to serve stuff from your house. You just need lots of people competing. Payment processing isn't a shit-show, because lots of people host payment processing, even though it's an expensive thing to do.
They have pretty good uploads if you are the only person uploading to the cell tower. Mobile networks are like gyms -- they only work when 5% or fewer of their users are doing anything with their phone at any given time. Actually, the same is true of residential connections -- a typical residential ISP is oversubscribed by a factor of at least several hundred. In other words, there are 5000 people with "gigabit" internet all connected to one 10 gig port.
It really has very little to do with politics, and a lot to do with physics. It's a lot easier to create a 100 Gb link between 2 racks or between 2 buildings than to do so between 1000 customers spread out in a suburban area. That's also why countries where people live in dense apartment blocks tend to have very cheap and very fast Internet relative to places like the US.
I have 500/50 internet. Most people here in Stockholm can afford it easily ($45 instead of $20). And you could easily service around 10 people with that upload. Heck you could even stream 1080p to 10 people.
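Back-of-the-envelope check, assuming a ~5 Mbps ballpark for a 1080p H.264 stream (that bitrate is my assumption, not a measurement):

```python
# Rough check of "stream 1080p to 10 people on a 50 Mbps uplink",
# assuming ~5 Mbps per 1080p viewer.
upload_mbps = 50
per_viewer_mbps = 5
max_viewers = upload_mbps // per_viewer_mbps
print(max_viewers)  # 10
```

So a 50 Mbps uplink does land right around ten 1080p viewers, with no headroom to spare.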
The "moderate demand" is the issue
Once something has a number of years behind it and is forgotten enough, it will be lost. It happens all the time. Neither fully decentralized nor fully centralized is good enough on its own.
Sure, one person can host, but that doesn't mean that one person will always be there. There still needs to be a decentralized effort to preserve everything, even what we don't actually care about, and make it redundant.
Decentralized Internet has been right around the corner for two decades and the current technology shift is making that more and more unlikely.
Connectivity is definitively trending toward centralization, not away from it. It's just not going to happen.
Dumb question - does decentralized internet mean decentralized services and content found on the internet? Is there a way we could access the internet without ISPs? Would that basically boil down to building a giant mesh network?
There’s other reasons people pursue decentralization, I am going to address from the context of your question though.
A commodity item (stay with me) is an item that is not seen as fundamentally different from one provider to the next.
Purely in terms of what you have access to, the internet is a commodity service. That is, you, me and everyone else basically has access to all the same websites and batch of connected services no matter which ISP we have.
For the context of your question, decentralization is in part about ensuring that the internet remains a commodity item. No service should be enabled to get so large as to be able to fundamentally destroy service providers with back room deals that make your internet different from mine because I have a different ISP.
> Turns out the much acclaimed cloud is just a cluster of linux servers stored in a building somewhere.
No it isn't, it's the package of software services that runs on those totally managed linux servers in a building somewhere.
"The cloud" is just all the infrastructure work you'd need to do served up as more expensive individual components that are easier to use.
Have you ever thought about why only big players can afford to have their own clouds? That's because of the massive economies of scale. In other words, small operators have much higher costs that make them uncompetitive. The same economics spells doom for a P2P solution. Whether it's blockchain or anything else, all P2P networks eventually become centralized, because a centralized approach is simply more efficient. P2P is only useful when a centralized approach isn't feasible due to e.g. legal constraints, and the users are willing to pay the additional cost.
> This is because the web is built on top of a centralized architecture.
Actually, everything about the Internet is decentralized. It was originally designed as a highly-resilient network architecture for military purposes. To the extent it is centralized, the centralization has arisen spontaneously due to economic reasons.
> and it works much better than the centralized approach
P2P is basically dead in the era of metered internet. Everyone these days uses a seedbox, most of which are hosted in one datacenter in France. A perfect example of spontaneously arising centralization.
> Because they have the income to afford and maintain a massive centralized server farm?
So you are saying that companies who use Amazon or Google clouds are stupid and would be better off setting up their own datacenters, like they used to back in the early 00s?
That's the whole point of the shift to cloud: massive centralized server farms are far cheaper per-unit than smaller, less-centralized ones. It's the same thing with other utilities: a big natural gas power plant is far cheaper per-kilowatt than running a small generator in your backyard.
Depends, AWS is definitely not cheap for anything that doesn't need to scale dynamically.
I can get a massive server with 256 GB of RAM, 2x 12-core Xeons, and a couple of GeForces from my local provider for 100-200 bucks a month; try doing that in the cloud and it's going to cost you thousands.
Also simply because it's not able to provide the same features as a centralized architecture right now.
Mobile device constraints, constrained networks with NATs and firewalls, not being able to store messages on a server and deliver them later, authentication, etc. are all problems that are not yet solved very well in p2p networks.
The internet might very well need to be replaced to enable truly p2p applications. This might sound impossible, but for example [Gnunet](https://gnunet.org) can actually work on top of the current internet even though it is designed to be independent of it.
That's fair; it's something I've worried about too. I don't know how likely it is, though; it would at least require setting up numerous US-owned nodes in foreign countries (possible), or cooperation with foreign powers setting up their nodes locally.
I don't know enough about the precautions the Tor project itself takes against this sort of thing to say to what degree it's an actual threat.
Probably because if everything was end-to-end encrypted, it would be very difficult for law enforcement to do their jobs. The vast majority of investigations involving terror groups or organized crime involve wiretaps. If it's impossible to perform wiretaps from the service provider side, investigators would have to physically bug premises to collect evidence, which is obviously much more difficult, invasive, and dangerous.
I don't know why Lindsey is trying to; I know the feds want to so the NSA can more easily continue warrantless searches of Americans' communications with foreigners.
Under the Patriot Act they can spy on any conversation provided at least one of the parties is outside the United States. That becomes difficult when everyone, from my grandma talking to her friends on her small island in Greece to actual ISIS members, uses end-to-end encrypted messaging apps. Their solution is not to find a better way to investigate terror groups, but rather to say that every conversation, including my grandma's, needs to have the decryption key available to them on request from whoever runs the service.
> Funny how politicians can't seem to work together unless it's to illegally spy on US citizens.
This is why I was *so* bummed when Russ Feingold lost his Senate seat. The PATRIOT Act passed the Senate in 2001 by a vote of 98-1... three guesses as to who the "-1" was.
> Amendment IV
> The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
This is cool! At first I thought Emil was entering the key via a URL query parameter (?e2eekey=foo), which would be sent to the server, allowing Jitsi to decrypt the call. But in fact he uses the URL hash (#e2eekey=foo), so the key is never sent to the server and all encryption and decryption happens client-side.
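You can see that distinction with any URL parser: the fragment is split off before the request is built and never leaves the client. A quick sketch (room name and key are made up):

```python
from urllib.parse import urlsplit

# The fragment (everything after '#') is stripped by the browser before the
# HTTP request is built, unlike query parameters, which are sent to the server.
url = "https://meet.jit.si/SomeRoom#e2eekey=foo"   # made-up room and key
parts = urlsplit(url)

request_target = parts.path + ("?" + parts.query if parts.query else "")
print(request_target)   # /SomeRoom  <- all the server ever sees
print(parts.fragment)   # e2eekey=foo  <- stays in the browser
```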
The hard part now is how callers (clients) come up with the same secret key without leaking it to the server or the public. Perhaps something like the Diffie-Hellman exchange in TLS?
Just having the hash in the URL is great because it splits the key across services. Sure, if Slack and your ISP work together to specifically nail you, they could. But really, all you need to route around that is a pre-existing channel of communication you can trust: think Telegram, think WhatsApp, all currently existing channels.
Just having basic zero knowledge end to end encryption is a great improvement.
If you're interested in that, have a look at the pastebin called 0bin: https://0bin.net/
The encryption key for your paste is included in the URL hash, and calculated only locally in JS. It's never sent to them, so they don't know what your paste says.
Sure, it might not be in the Nginx/Apache/Whatever logs by default, but they could definitely access it if they wanted to.
Well if you don't trust the server running the bridge, you can run the software yourself, just like you can now with standard Jitsi.
It's good to have extra layers of protection, and you can make informed decisions based on how important security is for you.
I get your point, but you can audit the JS that is being executed
Browser extension idea: you "pin" the JS of a website at a given moment, after auditing it. If it ever changes, you receive a warning, and you can review a diff between the previous and the current version (using git as a backend I guess).
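The core of that extension idea is just content hashing, essentially what Subresource Integrity already does for individual script tags. A minimal sketch of pin-and-check (function names are made up):

```python
import hashlib

def pin_script(script: bytes) -> str:
    # Record a digest of the script at audit time ("pinning" it).
    return hashlib.sha256(script).hexdigest()

def script_unchanged(script: bytes, pinned_digest: str) -> bool:
    # On later visits, warn if the served script no longer matches the pin;
    # a real extension would then show a diff of the two versions.
    return hashlib.sha256(script).hexdigest() == pinned_digest

audited = b"console.log('hello');"
pinned = pin_script(audited)
assert script_unchanged(audited, pinned)
assert not script_unchanged(b"console.log('hello, v2');", pinned)
```

The hard part in practice is that sites legitimately redeploy their JS constantly, so users would drown in warnings; that's probably why nothing mainstream does this yet.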
They talk about this in the post:
> As we already pointed out, passing keys as URL parameters is a demo thing only. Aside from being impractical it also carries risks given that URL params are stored in browser history.
> Our next step is therefore to work out exactly how key management and exchange would work. We expect we will be using The Double Ratchet Algorithm through libolm but the details are still to be ironed out.
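For a flavor of what the Double Ratchet does, here's a minimal sketch of just its symmetric KDF chain: each step derives a fresh message key and replaces the chain key. This is the general idea only, not libolm's actual construction or wire format:

```python
import hmac
import hashlib

def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    # Derive the next chain key and a one-time message key from the current
    # chain key, using distinct constant labels (as in the Double Ratchet's
    # symmetric-key ratchet). The old chain key is then discarded.
    next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    return next_chain_key, message_key

ck0 = b"\x00" * 32                 # pretend this came from a DH exchange
ck1, mk1 = ratchet_step(ck0)
ck2, mk2 = ratchet_step(ck1)
assert mk1 != mk2                  # every message gets a fresh key
# HMAC is one-way, so leaking ck2 reveals nothing about mk1 or ck0.
```

The full protocol also mixes in fresh DH outputs periodically (the second "ratchet"), which is what heals a session after a key compromise.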
> The hard thing now is how do callers (clients) come up with the same secret key without leaking it to the server or the public. Perhaps something like Diffie Hellman in TLS?
Well, it's just a URL; it can be sent over any other channel the participants trust. IIRC it does have some Matrix/Riot integration, which also does E2E.
You could think of this as an additional layer, because the password is something you provide to the server, and you don't want the e2ee keys to be passed to the server.
One possibility would be to derive both the password and the key from another password, so you could still just use one password without the server learning the e2ee key.
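That derivation could look something like this: one master password, two independent secrets via a KDF with distinct labels. Parameters (salt label, iteration count) are illustrative assumptions:

```python
import hashlib

def derive(master: str, label: bytes) -> bytes:
    # Derive independent 32-byte secrets from one master password by mixing
    # a distinct label into the salt. The label scheme and iteration count
    # are illustrative; use a per-user random salt in a real deployment.
    return hashlib.pbkdf2_hmac("sha256", master.encode(), b"demo-salt|" + label, 100_000)

master = "correct horse battery staple"
room_password = derive(master, b"room-password")  # safe to send to the server
e2ee_key = derive(master, b"e2ee-key")            # never leaves the client
assert room_password != e2ee_key
```

Since PBKDF2 is one-way, the server can't recover the e2ee key from the room password it receives.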
It is easiest to use this in conjunction with other services. For example Matrix supports group chats with e2ee, so in that case the person that begins the session could just generate the key and share it on the encrypted channel. An email invite to a meeting could include the e2ee secret (exactly like in the demo).
But, of course, email usually isn't e2ee, so maybe that's not the way 🤔.
> But, of course, email usually isn't e2ee, so maybe that's not the way 🤔.
It is sad that we have had the code and standards to do that for a good 20 years (GPG/PGP), yet it is stuck in that limbo where you *can* do it, but it is too inconvenient for the typical user, so it just never got popular.
Sad that this depends on an extension to WebRTC that's (currently) Chrome/Chromium-only. It's still a proposed API, and I don't see it being supported by other browsers in the foreseeable future.