Updating Crypto to use WebCrypto API and to replace RSA with ECC #446
Conversation
This upgrades our pkgs.nix commit to the latest master commit in Nixpkgs. This implies some changes to the rest of the Nix-derived dependencies, and we have to update them accordingly. We'll see if there are other issues.
Node has native implementations of Ed25519 (for signing and verification) and X25519 (for encryption/decryption). However there's also a pure JS implementation of this. It's time to do some benchmarking between the two to know which one is best to replace node-forge.
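For reference, a rough sketch of how such a benchmark could be set up, assuming @noble/ed25519's promise-based API and Node's built-in Ed25519 via generateKeyPairSync/sign (the loop counts here are arbitrary):

import { generateKeyPairSync, sign as nodeSign } from 'crypto';
import * as ed from '@noble/ed25519';

async function main() {
  const message = Buffer.from('benchmark message '.repeat(100));
  // Node's native Ed25519 (sign is synchronous when no callback is given)
  const { privateKey } = generateKeyPairSync('ed25519');
  console.time('node ed25519 sign x1000');
  for (let i = 0; i < 1000; i++) nodeSign(null, message, privateKey);
  console.timeEnd('node ed25519 sign x1000');
  // Noble's pure JS Ed25519
  const noblePrivateKey = ed.utils.randomPrivateKey();
  console.time('noble ed25519 sign x1000');
  for (let i = 0; i < 1000; i++) await ed.sign(message, noblePrivateKey);
  console.timeEnd('noble ed25519 sign x1000');
}

void main();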
Relevant reading:
Notes from https://github.com/paulmillr/noble-ed25519

Random Bytes

Random source is hardcoded in the library to use either web crypto or node's crypto (but not node's web crypto):

randomBytes: (bytesLength: number = 32): Uint8Array => {
if (crypto.web) {
return crypto.web.getRandomValues(new Uint8Array(bytesLength));
} else if (crypto.node) {
const { randomBytes } = crypto.node;
return new Uint8Array(randomBytes(bytesLength).buffer);
} else {
throw new Error("The environment doesn't have randomBytes function");
}
},

However it is possible to override this by directly monkey patching the library like:

import * as ed from '@noble/ed25519';
ed.utils.randomBytes = (bytesLength: number = 32): Uint8Array => {
// Bring your own random byte generator...
};

Note that this function is assumed to be synchronous. The reason for this is that web crypto's random byte generator (getRandomValues) is synchronous. According to node, it is recommended to use the async version, however this is only a performance issue when generating large amounts of random data. This means, even when synchronous, it would be easy to make it streamable by making it a generator:

async function* generateRandomBytes(length, chunk): AsyncGenerator<Uint8Array> {
// partition length to chunk, and yield chunks
// while sleeping to allow the interpreter to proceed
yield randomBytes(chunk);
await sleep(0);
}

As a side note, have a look at https://nodejs.org/api/cli.html#uv_threadpool_sizesize to increase the number of libuv threadpool threads.

Ed25519 Private & Public Keys

Ed25519 is a digital signature scheme; it doesn't bundle in encryption, unlike RSA. Generating a private key is really easy, it's just a matter of generating 32 random bytes.

import * as ed from '@noble/ed25519';
const privateKey: Uint8Array = ed.utils.randomPrivateKey();
// This is the equivalent of the above
const privateKeyAlso: Uint8Array = ed.utils.randomBytes(32);

Doing this is really fast... much faster than RSA. Note that in web crypto, it says that if you want to generate keys, you want to actually use generateKey. The resulting key is just a Uint8Array.

Now in order to get the public key, it's a simple math operation:

const publicKey: Uint8Array = await ed.getPublicKey(privateKey);

Also the size of both the private and public keys is always 32 bytes.

Ok so the basic signing and verification works like this:

const message = ed.utils.randomBytes(1000);
// A signature is always 64 bytes
const signature = await ed.sign(message, privateKey);
(await ed.verify(signature, message, publicKey)) === true;

That's basically it.

X25519

Now how do we do encryption and decryption? It's quite different from RSA, which has bundled its own encryption utility in it. Basically this is the Diffie-Hellman key exchange. This is a key agreement protocol that allows two parties to derive a shared secret key without ever having to exchange it. The shared secret key is what is used for symmetric encryption. The key point is that this shared secret is never transmitted... it's not communicated over any channel. It's sufficient for the 2 parties to share only their public keys, and both parties will end up creating the same shared secret without any communication. To do this using the ed25519 keys:

import * as ed from '@noble/ed25519';
const privateKey: Uint8Array = ed.utils.randomPrivateKey();
const publicKey: Uint8Array = await ed.getPublicKey(privateKey);
// This is also 32 bytes long, and it is deterministic
const sharedSecret: Uint8Array = await ed.getSharedSecret(privateKey, publicKey);

The shared secret is an X25519 shared key. Here's a demonstration:

import * as ed from '@noble/ed25519';
async function main() {
const alicePrivateKey = ed.utils.randomPrivateKey();
const alicePublicKey = await ed.getPublicKey(alicePrivateKey);
const alice = {
private: alicePrivateKey,
public: alicePublicKey,
};
const bobPrivateKey = ed.utils.randomPrivateKey();
const bobPublicKey = await ed.getPublicKey(bobPrivateKey);
const bob = {
private: bobPrivateKey,
public: bobPublicKey,
};
// Imagine Alice and Bob exchange public keys
const aliceSharedSecret = await ed.getSharedSecret(
alice.private,
bob.public
);
const bobSharedSecret = await ed.getSharedSecret(
bob.private,
alice.public
);
for (let i = 0; i < aliceSharedSecret.byteLength; i++) {
if (aliceSharedSecret[i] !== bobSharedSecret[i]) {
console.log('Shared secrets are not equal');
}
}
// The secrets are the same!
}
void main();

The shared secret is always 32 bytes. Now to actually do the encryption, we need a symmetric cipher system. The noble library does not handle this. At this point I believe that the shared secret is not directly used; instead it is put through a key derivation step. The derived secret (nonce) and the original shared secret are what is used for encryption and decryption. The encryption/decryption should still work with AES-256-GCM. The X25519 just means we used the DH exchange.
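To make the "derive then encrypt" idea concrete, here's a minimal sketch assuming Node's webcrypto (the salt, info string, and function name are placeholders) of deriving an AES-256-GCM key from the X25519 shared secret instead of using it directly:

import { webcrypto } from 'crypto';

async function deriveEncryptionKey(sharedSecret: Uint8Array) {
  // The raw X25519 shared secret becomes the HKDF input keying material
  const ikm = await webcrypto.subtle.importKey('raw', sharedSecret, 'HKDF', false, ['deriveKey']);
  // Derive an AES-256-GCM key from it, rather than using the shared secret directly
  return await webcrypto.subtle.deriveKey(
    {
      name: 'HKDF',
      hash: 'SHA-256',
      salt: new Uint8Array(0),
      info: new TextEncoder().encode('placeholder-info-tag'),
    },
    ikm,
    { name: 'AES-GCM', length: 256 },
    true,
    ['encrypt', 'decrypt'],
  );
}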
So even if we were to use the noble library, we'd still have to use the webcrypto library in nodejs anyway. Which means that, since ed25519 is already available inside node, we may not really need to bring in the extra library.
We should also test the TLS system's support for Ed25519 or X25519.
One other issue is that webcrypto doesn't deal with x509 certs. So it seems we'd need at least 3 libraries:
I've dived into how to get Ed25519 into our X.509 PKI: Some notes:
Ok there's another issue: the webcrypto standard doesn't yet support X25519 or Ed25519. NodeJS added them before they've been standardised by the web. There's a proposal for it here: https://github.com/tQsW/webcrypto-curve25519/blob/master/explainer.md This means the webcrypto implementation in nodejs isn't standardised, which could mean that it would be worth continuing to use the noble version of Ed25519.

At the same time, as I illustrated above, if we want to use the TLS certificate for web browsers, we would need to derive another key based on a NIST curve, which would then be used for the web TLS. This https://crypto.stackexchange.com/questions/50249/converting-a-c25519-curve-into-a-nist-supported-curve-for-fips-crypto demonstrates that we would use HKDF on the private key to generate an appropriate P256 key, which is then presented as the TLS certificate to the browser. However between 2 PK nodes, they should just use the Ed25519 X.509 certificate directly.

Now since the browser won't be trusting the Ed25519 keys, they'd be trusting the derived keys directly. One could present in the certificate that the new P256 key is signed (trusted) by the root Ed25519 key... that would then produce a "certificate chain". And this would be used until browsers have native support for Ed25519 X.509 certs. At that point, we discard the P256 certificate.
Let's go through the bootstrapping process.
import { pkcs5, md } from 'node-forge';

const seedByteString = pkcs5.pbkdf2(
'fan rocket alarm yellow jeans please reunion eye dumb prepare party wreck timber nasty during nature timber pond goddess border slam flower tuition success',
'mnemonic',
2048,
64,
md.sha512.create(),
);
// Always 64 bytes
const seedBuffer = Buffer.from(seedByteString, 'binary');

Compare with: https://iancoleman.io/bip39/
Ok this takes care of the bootstrapping and root key. What about general encryption/decryption? This is where webcrypto is introduced as a replacement. As for TLS, to avoid node-specific dependencies here, it would be worthwhile to check out https://www.npmjs.com/package/@peculiar/x509. This gives us a library for general purpose usage. JOSE can still be used, but we may require something more general later.
Ok now I'm going to try out the webcrypto in nodejs to see how it compares to noble.
Ok I've worked out the webcrypto API. It is very clunky... and not very flexible. One of the first things I noticed is that you cannot generate an Ed25519 key directly with deterministic random values. That means our bootstrap process above cannot be done at all, since we are going from recovery code to root key. There are only 2 ways to get an Ed25519 private key:

import { webcrypto } from 'crypto';
async function main() {
const keyPair = await webcrypto.subtle.generateKey(
{ name: 'Ed25519' },
true,
[ 'sign', 'verify' ]
) as CryptoKeyPair;
// The API does not allow us to export the private key as raw bytes
// It does however allow us to export as JWK
// Only the public key can be exported as raw bytes
// JWK is the modern format, better than pkcs8 or spki
const privateKeyJWK = await webcrypto.subtle.exportKey(
'jwk',
keyPair.privateKey
);
// Since we cannot export the raw bytes, we also cannot import it as raw bytes
// I tried actually, and it doesn't allow it, so we can only import via jwk
// or the pkcs8
const privateKeyAgain = await webcrypto.subtle.importKey(
'jwk',
privateKeyJWK,
'Ed25519',
true,
['sign']
);
}
void main();

Basically this already means that webcrypto is out, and we cannot use it in this way. Furthermore, the webcrypto API is very restrictive... probably to avoid footguns. Notice that when generating the key, it is only allowed to be used for the usages specified (here 'sign' and 'verify').

It has a concept of key wrapping, which is basically encrypting a key with another key. This is useful for example when we want to store the private root key on disk and need to encrypt it with the password. This is basically a key derivation function plus encryption. So to do this:

const aeskey = await webcrypto.subtle.generateKey({
name: 'AES-KW',
length: 256,
}, true, ['wrapKey', 'unwrapKey']);
const aeskey2 = await webcrypto.subtle.generateKey({
name: 'AES-GCM',
length: 256,
}, true, ['wrapKey', 'unwrapKey', 'encrypt', 'decrypt']);
// This basically exports it as JWK using the first 2 parameters
// then uses the second 2 parameters to "encrypt it"
// thus giving us an ArrayBuffer
// is this known as a JWE?
const wrappedPrivate = await webcrypto.subtle.wrapKey(
'jwk',
keyPair.privateKey,
aeskey,
'AES-KW'
);
const randomBytes = webcrypto.getRandomValues(new Uint8Array(12));
const wrappedPrivate2 = await webcrypto.subtle.wrapKey(
'jwk',
keyPair.privateKey,
aeskey2,
{
name: 'AES-GCM',
iv: randomBytes
}
);
const enc = new TextEncoder();
const wrappedPrivate3 = await webcrypto.subtle.encrypt(
{
name: 'AES-GCM',
iv: randomBytes
},
aeskey2,
enc.encode(JSON.stringify(privateKeyJWK))
);
const d2 = await webcrypto.subtle.decrypt(
{
name: 'AES-GCM',
iv: randomBytes,
},
aeskey2,
wrappedPrivate2
);
const d3 = await webcrypto.subtle.decrypt(
{
name: 'AES-GCM',
iv: randomBytes,
},
aeskey2,
wrappedPrivate3
);

So basically wrapKey is just export plus encrypt. In any case, it does show that we can easily use it to do AES-GCM encryption. For cross platform compatibility, it seems webcrypto itself is quite restrictive. Many things should work without relying on node:
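Going back to the "key derivation function plus encryption" point above, here's a small sketch assuming Node's webcrypto (the function name, iteration count and hash choice are arbitrary) of deriving an AES-KW wrapping key from a password with PBKDF2:

import { webcrypto } from 'crypto';

async function wrappingKeyFromPassword(password: string, salt: Uint8Array) {
  // The password becomes PBKDF2 key material
  const material = await webcrypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(password),
    'PBKDF2',
    false,
    ['deriveKey'],
  );
  // Derive an AES-KW key that can wrap/unwrap other keys (e.g. the root private key)
  return await webcrypto.subtle.deriveKey(
    { name: 'PBKDF2', hash: 'SHA-512', salt, iterations: 100000 },
    material,
    { name: 'AES-KW', length: 256 },
    false,
    ['wrapKey', 'unwrapKey'],
  );
}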
We can start prototyping the bootstrapping method as stated above using all the things we have worked out. While also exploring the X509 library, we will need the ability to create custom extensions, as that's what we are using right now for our custom root certificate chain mechanism. PeculiarVentures also has PKI.js which seems to be able to do the same thing. Not sure about the difference.
While reworking the bip39, I found https://github.com/paulmillr/scure-bip39. It can replace the bip39 library we are downloading, and minimise the amount of dependencies we are using. Furthermore, our bip39 usage isn't entirely correct because it's not using the same normalisation technique. So I'm thinking we just use scure/bip39 instead. Originally we couldn't just use the seed directly, because RSA key generation requires a seeded PRNG. But with ed25519, it's simple to just use the first 32 bytes of the returned seed as the private key of Ed25519. Also note that we don't bother with the optional passphrase, as per https://vault12.com/securemycrypto/crypto-security-basics/bip39/what-is-a-bip39-passphrase Plus it's just a bit confusing.
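For reference, a minimal sketch of what using scure/bip39 could look like, assuming its generateMnemonic/mnemonicToSeed API and the english wordlist entry point:

import * as bip39 from '@scure/bip39';
import { wordlist } from '@scure/bip39/wordlists/english';

async function main() {
  // 24-word recovery code (256 bits of entropy)
  const recoveryCode = bip39.generateMnemonic(wordlist, 256);
  // BIP39 seed is always 64 bytes; no optional passphrase is used here
  const seed = await bip39.mnemonicToSeed(recoveryCode);
  console.log(recoveryCode, seed.byteLength); // 64
}

void main();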
Just a note that the optional passphrase can be used as the "25th" seed word. That can be used to create hierarchical key pairs. This concept is actually standardised under BIP32, and is used to create different wallets, and probably used again for BIP44. See #352 for more information. We'll leave this to a later date and keep our usage of BIP39 without any additional passphrase.
So we are replacing:

async function generateDeterministicKeyPair(
bits: number,
recoveryCode: string,
): Promise<KeyPair> {
const prng = random.createInstance();
prng.seedFileSync = (needed: number) => {
// Using bip39 seed generation parameters
// no passphrase is considered here
return pkcs5.pbkdf2(
recoveryCode,
'mnemonic',
2048,
needed,
md.sha512.create(),
);
};
const generateKeyPair = promisify(pki.rsa.generateKeyPair).bind(pki.rsa);
return await generateKeyPair({ bits, prng });
}

With:

import * as bip39 from '@scure/bip39';
import * as nobleEd from '@noble/ed25519';

async function generateDeterministicKeyPair(recoveryCode: RecoveryCode) {
// This uses BIP39 standard, the result is 64 byte seed
// This is deterministic, and does not use any random source
const recoverySeed = await bip39.mnemonicToSeed(recoveryCode);
// Slice it to 32 bytes, as ed25519 private key is only 32 bytes
const privateKey = recoverySeed.slice(0, 32);
const publicKey = await nobleEd.getPublicKey(privateKey);
return {
publicKey,
privateKey
};
}

In this case we are keeping to using Uint8Array. The result is:
To override both the random sources, it has to be done globally for the scure and noble libraries:

// @ts-ignore - this overrides the random source used by @noble and @scure libraries
utils.randomBytes = (size: number = 32) => getRandomBytesSync(size);
nobleEd.utils.randomBytes = (size: number = 32) => getRandomBytesSync(size);
// Note that NodeJS Buffer is also Uint8Array
function getRandomBytesSync(size: number): Uint8Array {
console.log('CUSTOM CALLED');
const randomArray = webcrypto.getRandomValues(new Uint8Array(size));
return randomArray;
// return Buffer.from(randomArray, randomArray.byteOffset, randomArray.byteLength);
}

That means, as long as the keys utils is imported, it will be overridden. But this makes sense, so it's always important to use the keys utils.
One issue is that panva/jose doesn't make it easy to override the random source. The random source is in an internal module, and the compiled code could be monkey patched, but this is not as easy as overriding the above, as it is exported as a default export, plus jose has some restrictions on the way it exports things. Cisco's node-jose is a little more flexible... but it seems we'd want to avoid doing anything non-deterministic in jose if we can't override the random source.
Creating the JWK itself does not actually involve JOSE. You can just do this:

const d = base64.base64url.baseEncode(rootKeyPair.privateKey);
const x = base64.base64url.baseEncode(rootKeyPair.publicKey);
const privateKey = {
alg: 'EdDSA',
kty: 'OKP', // Octet key pair
crv: 'Ed25519', // Curve
d: d, // Private key
x: x, // Public key
ext: true, // Extractable (always true in nodejs)
key_ops: ['sign', 'verify'], // Key operations
};
// Note that if you don't pass `d`, then it becomes a public key, rather than a private key
console.log(JSON.stringify(privateKey));

This produces a JSON string with what we need. It's only if we want to use JOSE operations that we would then use jose.importJWK. Ok but let's see how we can encrypt this with the root password, and whether this means turning it into a JWE (and if so, does it end up using the random source in jose).
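If we do later want JOSE operations over it, importing the JWK would look something like this (a sketch assuming panva/jose's importJWK; the function name is made up):

import * as jose from 'jose';

// `privateKeyJwk` is the plain JWK object constructed above
async function toJoseKey(privateKeyJwk: jose.JWK) {
  // For OKP/Ed25519 keys the JOSE algorithm identifier is 'EdDSA'
  return await jose.importJWK(privateKeyJwk, 'EdDSA');
}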
So JOSE has its own random source which cannot be overridden, and also its own crypto algos. In the future we should swap out panva/jose with our own jose implementation. We cannot use node-jose because that isn't sufficient either. In any case, after reading the JW* specs, it seems easy to do. Now about the encrypted JWK. As discussed earlier, this is a JWK:
An encrypted JWK according to the spec is a JWE with some additional metadata. This is a JWE that we will want to create:
To be precise there are 3 kinds of JWE serialisations: compact, flattened, and general. The above is the flattened serialisation. Compact is just one entire encoded string. The flattened is still JSON. General is used if you want to encrypt something for multiple recipients. In order to produce that JWE from the JWK we just do this:

import * as jose from 'jose';

const privateKeyString = JSON.stringify(privateKey);
const jwe = new jose.FlattenedEncrypt(Buffer.from(privateKeyString));
jwe.setProtectedHeader({
alg: 'PBES2-HS512+A256KW', // PBES2-HS512 is a scheme using PBKDF2 with HMAC512, then A256KW is used for key wrapping
enc: 'A256GCM', // this is the symmetric encryption cipher
cty: 'jwk+json' // this is the "type" of JWE, in this case, telling us that this is an encrypted JWK as a JSON object
});
// Now according to the JWK RFC, the internal steps are quite complicated
// But what happens is:
// 1. A random CEK is generated
// 2. The random CEK uses A256GCM to encrypt the plaintext (the JSON string of JWK)
// 3. The root password is passed into PBKDF2 to derive a key
// 4. This derived key encrypts the CEK with A256KW
const rootPassword = Buffer.from('some password');
// Here jwe.encrypt is doing all of those steps under the hood, it is also randomly generating all the parameters, like `iv` for A256GCM, and also the count and salt for PBKDF2
const encryptedJWK = await jwe.encrypt(rootPassword);
const decryptedJWK = await jose.flattenedDecrypt(encryptedJWK, rootPassword);
const privateKeyAgain = JSON.parse(decryptedJWK.plaintext.toString());
// Example of outputting the protected header information (it's not encrypted, only protected integrity through authenticated encryption, this is facilitated by the `encryptedJWK.tag` property)
console.log(jose.decodeProtectedHeader(encryptedJWK));

Note that JOSE is setting up certain parameters automatically. The PBKDF2 count is defaulted to 2048, and the PBKDF2 salt is randomly generated and also put into the resulting protected header.
Some questions that might be asked:
Note that JWE does support a direct encryption mode, where there's no intermediate CEK. See also: https://security.stackexchange.com/questions/80966/what-is-the-point-of-aes-key-wrap-with-json-web-encryption

Ok so anyway, there's no designated file extension for this. We could just use something generic, or we could also just start a trend here. Still need to explore what format to store the certificate in. It may still be a good idea to continue storing the original PKCS8 and SPKI formats for cross compatibility... but maybe it's time to move to the future as well. Also see what they suggest at https://smallstep.com/docs/step-cli/reference/crypto/jwk
Ok now we can move to dealing with the encryption key or database key. We have a root key that is Ed25519, which atm cannot be used to derive new keys, do key wrapping or any kind of encryption. In a way, it represents identity. However our database is meant to be encrypted with symmetric encryption, requiring a symmetric key. We can call this the "data encryption key". It functions similarly to the CEK discussed above in the JWE.

Atm, this data encryption key is randomly generated. It does not need to be derived from the root key (and by derivation we mean using something like HKDF, not PBKDF2). The reason it does not need to be derived is that if the root key were to change, then the data encryption key would also change, and thus require re-encryption of the entire database. And if we do change the root key, then we would not be able to derive the same data encryption key. Therefore the link between the root key and the data encryption key is kept separate.

So the data encryption key is randomly generated, and will be saved to disk along with the database. This does mean that without the data encryption key you will not be able to decrypt the database. This is basically how it is done already, so there's no change here.

With the new ed25519 root key, we now need to use it to encrypt the data encryption key (the DEK).
Now here I'm confused by the specifics. There is some debate about whether this is secure:
The paper seems to argue that this is fine. It seems we would do a Diffie-Hellman exchange between the root private and root public key (because we are our own recipient) to get a shared secret, then apply an HKDF-Extract-like KDF. Then the resulting key can be used to encrypt a randomly generated symmetric key.

Note that HashiCorp Vault has this same concept, known as the encryption key. Unlike Vault, we don't do data encryption key rotation yet. Their rotation of the encryption key is chained, maintained in a keyring that keeps all the symmetric keys around, including the old ones. Then they encrypt new values with the new key, while still being able to decrypt the old values. I'm guessing if old values get accessed, they get re-encrypted to the new key: https://www.vaultproject.io/docs/internals/rotation
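For concreteness, here's a minimal sketch of that flow, assuming @noble/ed25519's getSharedSecret and Node's built-in hkdfSync (the function name and the 'dek-wrapping' info tag are made up for illustration). The derived key would then be the one that wraps the randomly generated data encryption key:

import * as ed from '@noble/ed25519';
import { hkdfSync } from 'crypto';

async function deriveWrappingKey(rootPrivateKey: Uint8Array, rootPublicKey: Uint8Array): Promise<Uint8Array> {
  // DH exchange with ourselves: both keys belong to the same root key pair
  const sharedSecret = await ed.getSharedSecret(rootPrivateKey, rootPublicKey);
  // HKDF extract+expand: no salt, an info tag to bind the derived key to its purpose
  const derived = hkdfSync('sha512', sharedSecret, new Uint8Array(0), 'dek-wrapping', 32);
  return new Uint8Array(derived);
}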
Ok I've figured out several ways of managing the data encryption key for the database. The end result we're looking for is a data encryption key that is stored safely on disk. To do this, we have to first generate a random 32 byte symmetric key for AES256GCM, then basically use it for encryption/decryption:

const DEK = getRandomBytesSync(32);
const DEKJWK = {
alg: "A256GCM",
kty: "oct",
k: base64.base64url.baseEncode(DEK),
ext: true,
key_ops: ["encrypt", "decrypt"],
};
// This imports as a Uint8Array, but if we need webcrypto encrypt/decrypt, we need to use CryptoKey
// const DEKImported = await jose.importJWK(DEKJWK) as Uint8Array;
const DEKImported = await webcrypto.subtle.importKey(
'jwk',
DEKJWK,
'AES-GCM',
true,
['encrypt', 'decrypt']
);
const iv = getRandomBytesSync(16);
const cipherText = await webcrypto.subtle.encrypt(
{
name: 'AES-GCM',
iv,
tagLength: 128,
},
DEKImported,
Buffer.from('hello world')
);
const combinedText = new Uint8Array(iv.length + cipherText.byteLength);
const cipherArray = new Uint8Array(cipherText, 0, cipherText.byteLength);
combinedText.set(iv);
combinedText.set(cipherArray, iv.length);
const iv_ = combinedText.subarray(0, iv.length);
const cipherText_ = combinedText.subarray(iv.length);
const plainText = await webcrypto.subtle.decrypt(
{
name: 'AES-GCM',
iv: iv_,
tagLength: 128
},
DEKImported,
  // decrypt the ciphertext portion recovered from the combined buffer
  cipherText_
);

Ok great, so now how do we store this symmetric key on the disk safely and securely? Well we can imagine that we start with an initial keyring (which is basically files on disk, not a database). And we now have several options.
So it's a choice between option 1 and option 2. According to https://neilmadden.blog/2021/02/16/when-a-kem-is-not-enough/, JOSE's ECDH-ES approach has caveats. Furthermore this gist https://gist.github.com/codedust/b69ebb3be60490e8ddc4f1cabf76ec90 says:
So it seems we should just go with option 1.
In option 1, we hit a question about what the salt should be. Well, according to https://soatok.blog/2021/11/17/understanding-hkdf/, we don't really need to apply a salt in the HKDF-Extract step. But we will apply an info tag at the HKDF-Expand step. And that pretty much covers it.
The exploration of option 2 is interesting and may be useful in the future, because we can certainly convert the Ed25519 keypair to an X25519 keypair. The Ed25519 public key maps to exactly 1 X25519 public key, and the X25519 private key can be derived from the Ed25519 private key. This can be useful in the future if we need to apply it to further kinds of encryption/decryption tasks. One of the things I'm concerned about is that option 1 covers our usecase of KEM for the database key, and potentially other state keys, but the Ed25519 key put into TLS is a separate construct, and I believe OpenSSL will do its own thing. But as mentioned above, browsers don't understand this yet (and you cannot use X25519 as the certificate keypair either). But if we want to use Ed25519 to eventually produce encrypted signed messages, we have to follow the KEM concepts in https://neilmadden.blog/2021/02/16/when-a-kem-is-not-enough/. Or better yet, at that point we may just follow an established construction. I believe this is it:
This comes from:
The above could be useful for notification messages that we want to secure beyond the initial TLS connection. Or the exporting of vaults.
Investigating the https://github.com/PeculiarVentures/x509 library, I have come across an incompatibility between peculiar's webcrypto polyfill and node's webcrypto library. Details here: PeculiarVentures/webcrypto#55 and PeculiarVentures/PKI.js#89 and PeculiarVentures/webcrypto#25

So it seems that if I want to use ed25519 with https://github.com/PeculiarVentures/x509, I have to use peculiar's webcrypto and not node's webcrypto. Now peculiar's webcrypto might be a good idea anyway; I'd have to compare the performance. The main reason is that the webcrypto API has not actually standardised how ed25519 keys are to be used. Node's implementation is just an experiment. This is different from node-forge, which allows one to just use raw buffers.

What are the advantages of using peculiar's webcrypto? Well it's a bit more portable; we aren't limited to a specific node version. So if we do end up using peculiar's webcrypto... we then have to polyfill all the uses of it... otherwise libraries like jose and noble will end up using the native crypto. Would that just be a matter of setting the global property globalThis.crypto?
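If so, the monkey patching might look something like this (a sketch assuming @peculiar/webcrypto's exported Crypto class, not necessarily how it will end up being wired in):

import { Crypto } from '@peculiar/webcrypto';

const peculiarCrypto = new Crypto();

// Point the global webcrypto at peculiar's implementation so that anything
// resolving `globalThis.crypto` ends up on the same backend
Object.defineProperty(globalThis, 'crypto', {
  value: peculiarCrypto,
  writable: true,
  configurable: true,
});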
At this point, no WASM is involved here at all. It won't be needed (can investigate later for #422). Instead this is likely what will happen... New libraries:
Remove libraries:
Library that we keep using:

What will happen is:
The end result is mainly:
- refactoring `claimNode` process, the handler logic was moved into `NodeManager` and updated to use the token/claim changes
- refactored identities claim
- claiming nodes and identities adds the link to the gestalt graph
- index claim read
- fixed claiming identities

[ci skip]
- fixing notifications tokens
- fixing notification tokens
- permissions and notifications for claiming nodes
- notifications using `parse` and `generate` functions now

[ci skip]
- fixed linting
- removing unnecessary test files
- adding seek tests for `getClaims`
- agent adds self to gestalt graph on start up
- updated worker test
- fixing tests
- fixed bin tests
- fixing bin tests
- fixing agent tests
- fixing client tests
- fixing vaults tests
- fixing notification tests
- fixes to `Discovery`
- updating logger and DB dependencies
- gestalts model testing
- fixing discovery tests
- updating nodes tests
- updated identities tests to use fast-check
- fixing identities tests
The problem with the errors has been fixed in 1.1.7: MatrixAI/js-errors@3b269ac#diff-7ae45ad102eab3b6d7e7896acd08c427a9b25b346470d7bc6507b6481575d519 However the fix still has to propagate here.
TS 4.9 enables new language features. Update the editor configuration accordingly. Also bring in 1.1.7 of js-errors when possible.
After this merges, you should regenerate the docs. And we are still pending a benchmark re-run.
After updating the dependencies we're seeing 2 new build errors. Here it seems that if we extend an error with a specified error code, TS is now taking that code as a number literal type. The quick fix is to explicitly type the code.
- fixed some type errors that cropped up
This should be good to merge except for one test failure in CI.
I can't recreate this locally, so it will be a little tricky to debug. I won't be able to fix this today. We could merge now and address it in staging.
I think maybe the exceptions got changed?
Could also be an ordering problem. Maybe the multi-connection doesn't work in CI? Maybe DNS fails? This is something we should check.
I'm disabling the failing test for now. It's only a problem in CI, and the underlying connection logic is subject to change with the current quic/rpc update.
Disabled for now, failing in CI and the core network logic is subject to change with the QUIC changes
The JSON schemas for claims and notifications have been removed. No need to copy them during build anymore.
Description
This PR focuses on updating the crypto utilities used by PK. We've been hitting problems using RSA and node-forge utilities, and we should start using the standardised WebCrypto API. This won't fully solve cross platform cryptography because that will need to wait until we hit mobile platforms and deal with it by using WASM or other utilities.
There are some new features coming into this PR:
- `NodeId` is now finally the public key. This means you no longer have to acquire the public key separately from the `NodeId`. Once you have the `NodeId` you can use it for public key verification, and for encryption.
- `@peculiar/webcrypto` is being monkey patched to `globalThis.crypto`. This ensures that every library is using the same webcrypto backend, and this includes CSPRNG and encryption/decryption facilities.
- Massive performance improvements in all areas:
Old performance (node-forge):
New performance (web crypto):
Newer performance (libsodium):
Issues Fixed
--root-key-file
#433tokens
domain and specialise tokens for Sigchain, Notifications, Identities and Sessions #481KeyRing
,CertificateManager
#472NodeID
has changed. #386Tasks
ED25519
as the root key@scure/bip39
for generating recovery code and deterministically generating the ED25519 root key* Note that
panva/jose
does not allow randomness to be standardised, this means the JOSE library will need to be replaced in the future* For now all libraries will mostly end up using node's native randomness generation because we are running in node runtime
* The
panva/jose
library can be replaced... it's just mostly implementing JOSE RFCs that's the issue, but we are only using a limited set, otherwise can fork the library to provide an alternative implementation. Alternatively we would need to monkey patch a global webcrypto runtime@peculiar/x509
CertManager
and plug this into the TLS configuration.KeyRing
class which extracts all root key pair and KEM mechanism out ofKeyManager
.KeyRing
by extracting out tests fromKeyManager.test.ts
.CertificateManager
to extract out root certificate functionality out ofKeyManager
. It must take theKeyRing
andDB
as dependencies.CertificateManager
by extracting out tests fromKeyManager.test.ts
.KeyManager
withKeyRing
if they only require theNodeId
.CommandStart
andCommandBootstrap
need's it's configs updated.Observable
ofKeyRing
, continue using theEventBus
for theKeyRing
.* This requires changing
KeyManagerChangeData
toCertificateManagerChangeData
for now, as that's where the origin of renewing identity will come from.KeyManager
for now, and plan a new issue for a newKeyManager
intended for secure computation usage and the management of arbitrary subkeys.[ ] 22. Sigchain needs to use the- Sigchain is being refactored, see Replace JOSE with our ownCryptoKey
by usingkeysUtils.importKey
tokens
domain and specialise tokens for Sigchain, Notifications, Identities and Sessions #481src/claims/utils.ts:36
createClaim
takes the private key as the PEM format, this needs to be updated to take a private key directly.CertManager
may have an expired current certificate occurring due not starting theCertManager
for a while. This means we need to immediately renew the certificate uponstart
. Right now this is not guaranteed. Need to add in some renewal logic that occurs automatically if the current certificate is now expired.pk agent status
commandwrapWithPassword
when outputting the private key to show the key pairpk keys root
, these should be showing the JWK for public key, and JWE for the private key, use dictionary formatting as well during human format, and JSON otherwisepk keys private
,pk keys public
,pk keys keypair
. Theprivate
andkeypair
commands should be taking a password for wrapping. This should take--password-path
or take from input prompt. The--format json
should produce a useful JSON dictionary. Forkeypair
it should be{ publicKey: JWK, privateKey: JWKEncrypted }
.pk agent status
produces a recursive dictionary output for public key JWK.pk agent start
andpk agent bootstrap
can use all new key ring configuration and cert manager configuration.randomSource
into whereverIdSortable
andIdRandom
is being constructed. This ensures thatjs-id
is usingkeys/utils/random.ts
instead of its own provided randomness.NodeId
.src/tokens
domain replacing JOSE JWSsrc/tokens
src/claims
domain specialisingsrc/tokens
src/claims
tokens
using parsing functionsclaims
using parsing functionsSigchain
to use the new claims and tokensSigchain
with the new claims and tokens structureSigchain
for faster access for link identity and link nodeidentities
to use the newtokens
andclaims
IdentityInfo
andNodeInfo
intogestalts
gestalts
to record indexed information acquired from discovery - Gestalt Link Schema Refactoring - Derived from JOSE replacement #492discovery
to use thetokens
andclaims
, in particular claim links and verifying claim links - Discovery Refactoring - Derived from JOSE replacement #493notifications
to use the newtokens
General.json
,VaultShare.json
, andGestaltInvite.json
to useparse/generate
utilities instead of JSON schema. These utilities should go into thenotifications/utils.ts
. It can reference thevalidation/errors.ts
.tokens
domain, and they should be "signed tokens"sessions
to use the newtokens
createSessionToken
andverifySessionToken
with calls to thetokens
domain. The session token should then be aSignedToken
. The signature is being signed by a symmetric key. Not the private key.src/keys/utils/hashing.ts
. (Replace node forge uses with these). Note that multiformats hashing may require a "webcrypto" polyfill, but we don't know for suregestaltGraph.setNode()
for the agent's own node at the startup of the agent. - Updating Crypto to use WebCrypto API and to replace RSA with ECC #446 (comment)Sigchain.getClaims
andSigchain.getSignedClaims
pagination testing problems: Sigchain Class API should provide paginated ordered claims by returning Array-POJO and indexed access #327 (comment)Testing
Tests must start using fast check arbitraries and where suitable model based testing:
Minimal tests for networking, grpc, nodes because we are likely to change it quite a bit in our next major rework of the networking with QUIC and RPC with JSONRPC.
Final checklist