Fix Failing Build Tests on Windows & MacOS #14
Comments
Full output from failing tests:
Fixing the failures mentioned in this issue needs to be done before working on the integration tests for these platforms, since there are likely underlying issues we aren't aware of that would make it difficult to work on integration tests. The failures can be grouped into a few different categories:
I've found additional problems on Windows with respect to console output. There are 3 dimensions to dealing with Windows for console programs: the shell (PowerShell vs CMD), line endings (LF vs CRLF), and character encoding.
I've found that PowerShell and CMD also behave differently. I've written this test script:

```js
const os = require('os');
const process = require('process');
const fs = require('fs');
const { Buffer } = require('buffer');

async function main(argv = process.argv) {
  argv = argv.slice(2);
  if (argv[0] === '0') {
    console.log('Hello 盘');
  } else if (argv[0] === '1') {
    process.stdout.write(`HELLO${os.EOL}盘${os.EOL}`);
  } else if (argv[0] === '2') {
    // LF
    process.stdout.write('HELLO\n盘\n');
  } else if (argv[0] === '3') {
    // CRLF
    process.stdout.write('HELLO\r\n盘\r\n');
  } else if (argv[0] === '4') {
    // WITH setting the utf-8 encoding
    process.stdout.setDefaultEncoding('utf-8');
    // process.stdout.setEncoding('utf-8');
    process.stdout.write('HELLO\n盘\n');
  } else if (argv[0] === '5') {
    // WITH setting the utf-8 encoding
    process.stdout.setDefaultEncoding('utf-8');
    process.stdout.write('HELLO\r\n盘\r\n');
  } else if (argv[0] === '6') {
    // Writing buffers directly
    process.stdout.write(Buffer.from('hello\n盘\n', 'utf-8'));
  } else if (argv[0] === '7') {
    // Writing buffers directly
    process.stdout.write(Buffer.from('hello\r\n盘\r\n', 'utf-8'));
  } else if (argv[0] === '8') {
    // Setting encoding AND writing buffers
    process.stdout.setDefaultEncoding('utf-8');
    // process.stdout.setEncoding('utf-8');
    process.stdout.write(Buffer.from('hello\n盘\n', 'utf-8'));
  } else if (argv[0] === '9') {
    // Setting encoding AND writing buffers
    process.stdout.setDefaultEncoding('utf-8');
    process.stdout.write(Buffer.from('hello\r\n盘\r\n', 'utf-8'));
  } else if (argv[0] === '10') {
    process.stdout.write(
      Buffer.from('hello\n盘\n', 'utf-8'),
      'utf-8'
    );
  } else if (argv[0] === '11') {
    // Writing with fs
    fs.writeFileSync(1, 'hello\n盘\n');
  } else if (argv[0] === '12') {
    // Writing with fs
    fs.writeFileSync(1, 'hello\r\n盘\r\n');
  } else if (argv[0] === '13') {
    // Writing with fs and specifying encoding
    fs.writeFileSync(1, 'hello\n盘\n', 'utf-8');
  } else if (argv[0] === '14') {
    // Writing with fs and specifying encoding
    fs.writeFileSync(1, 'hello\r\n盘\r\n', 'utf-8');
  } else if (argv[0] === '15') {
    // Writing with fs and buffers
    fs.writeFileSync(1, Buffer.from('hello\n盘\n', 'utf-8'));
  } else if (argv[0] === '16') {
    // Writing with fs and buffers
    fs.writeFileSync(1, Buffer.from('hello\r\n盘\r\n', 'utf-8'));
  } else if (argv[0] === '17') {
    // Writing to a file
    fs.writeFileSync('./lf.txt', 'hello\n盘\n');
    fs.writeFileSync('./crlf.txt', 'hello\r\n盘\r\n');
  }
}

void main();
```

Then I ran it for each argument; in PowerShell this can be done with:

```powershell
for (($i = 0); $i -lt 17; $i++) { node ./test.js $i > ./tmp/$i.txt }
node ./test.js 17
```

The results show this: PowerShell ALWAYS converts everything to UTF-16 LE with CRLF, no exceptions. The only case where this doesn't occur is with 17, where Node.js writes directly to files.
CMD doesn't actually render Unicode characters, so the Chinese character becomes just a question mark box. But what about LF/CRLF and encoding? Well, it turns out that CMD preserves the UTF-8 encoding during piping: it does not convert it to UTF-16 LE. It also preserves line endings; it does not auto-convert LF to CRLF. Basically it does what Linux does, and it is far more obvious about the behaviour.
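To double-check these claims, one can inspect the raw bytes of the redirected output files. A minimal sketch, assuming the PowerShell loop above has been run so that `./tmp/0.txt` exists:

```js
const fs = require('fs');

// Read the redirected output as raw bytes so no decoding hides the details
const buf = fs.readFileSync('./tmp/0.txt');

// Files written by Windows PowerShell redirection start with the UTF-16 LE
// byte order mark: FF FE
console.log('UTF-16 LE BOM:', buf[0] === 0xff && buf[1] === 0xfe);

// CRLF appears as the byte sequence 0D 0A
console.log('contains CRLF:', buf.includes(Buffer.from('\r\n')));

// Dump the first bytes for manual inspection
console.log(buf.subarray(0, 16));
```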
Let's figure out a principle of least surprise here. At first, I was thinking that we should use `os.EOL`. However, if someone is working across platforms, it can be quite surprising if a text file that they output/pipe contains CRLF while normal file writes from PK create LF line endings. At the same time, it appears Windows CMD and PowerShell understand LF line endings. Therefore we would preserve whatever line endings the user uses when they write the file to PK, as illustrated in the sketch below.
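A hypothetical illustration of that "preserve what the user wrote" policy, assuming only that file writes go through `fs`, which never translates line endings on any platform:

```js
const fs = require('fs');

// Whatever bytes the user supplies are stored verbatim; fs performs no
// LF/CRLF translation, so mixed line endings survive a round trip
const input = Buffer.from('hello\r\nworld\n', 'utf-8');
fs.writeFileSync('./mixed-endings.txt', input);

const output = fs.readFileSync('./mixed-endings.txt');
console.log(input.equals(output)); // true: line endings preserved exactly
```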
However we still have a problem:
Now according to https://nodejs.org/api/fs.html#fswritefd-string-position-encoding-callback, on Windows, when the file descriptor is connected to the console, strings containing non-ASCII characters will not be rendered properly by default, regardless of the encoding used.
But of course this would be an instruction to end users to set this up. Let me check.
These two Q&As show how complex it is to set Windows up to properly use UTF-8:
Changing the code page (e.g. with `chcp 65001`) is sufficient to switch the console to UTF-8. At the same time, one must choose a correct font: Lucida Console or the default Consolas is fine for displaying some Unicode, but only SimSun-ExtB can render Chinese characters. The font selection doesn't appear to affect SSHing into PowerShell; that probably relies on whatever font the SSH client uses. THIS however does not fix the problem of piping: piping into any file still converts LF to CRLF and converts to UTF-16 LE.
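Since the code page matters here, an illustrative sketch (not something this issue prescribes): a Node program could at least warn when the active console code page isn't UTF-8. `chcp` with no arguments prints the current code page.

```js
const os = require('os');
const { execSync } = require('child_process');

// Returns the active console code page on Windows, or null elsewhere
function consoleCodePage() {
  if (os.platform() !== 'win32') return null;
  // `chcp` prints something like: "Active code page: 437"
  const out = execSync('chcp', { encoding: 'utf-8' });
  const match = out.match(/(\d+)/);
  return match ? Number(match[1]) : null;
}

const cp = consoleCodePage();
if (cp !== null && cp !== 65001) {
  // 65001 is the UTF-8 code page
  console.error(`Console code page is ${cp}, not UTF-8 (65001); Unicode output may not render`);
}
```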
Alternatively, users can be recommended the new Windows Terminal program, or ConHost or ConEmu, as alternatives. It just seems that, at least on Windows, terminal programs are just not first class at the moment.
This Q&A https://stackoverflow.com/questions/40098771/changing-powershells-default-output-encoding-to-utf-8 explains how to actually change the default output encoding. PowerShell Core (v6+), which is not installed by default on Windows at the moment, does default to BOM-less UTF-8. So we can also point users to downloading and installing PowerShell v6 (PowerShell Core) instead of the original PowerShell. I suspect the LF-to-CRLF conversion is a similar issue; we may just have to wait for a later version of PowerShell.
The end result is that:
@tegefaulkes @emmacasolin I've left this change https://github.com/MatrixAI/js-polykey/issues/401#issuecomment-1186819981 on by default in matrix-win-1. Note that SSH does not get affected by font changes; it does get affected by the code page though. GitLab SaaS should also make the relevant changes, but I don't know if they did.
Some additional resources here: https://shapeshed.com/writing-cross-platform-node/.
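As a small illustration of the kind of cross-platform concern that article covers (and which some of the failing tests below hit with / vs \), path construction should go through the `path` module rather than hardcoded separators:

```js
const path = require('path');

// path.join uses the platform separator:
// 'tests\\bin\\utils.test.ts' on Windows, 'tests/bin/utils.test.ts' on POSIX
console.log(path.join('tests', 'bin', 'utils.test.ts'));

// path.sep exposes the separator directly: '\\' on Windows, '/' on POSIX
console.log(path.sep);
```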
Regarding async vs sync: as I've been trying to debug some test issues, having asynchronous console output by default is making this difficult. So let's standardise stdout (fd 1) as well:
```js
const os = require('os');

// Default behaviour on Node.js:
//   Files: synchronous on Windows and POSIX
//   TTYs (terminals): asynchronous on Windows, synchronous on POSIX
//   Pipes (and sockets): synchronous on Windows, asynchronous on POSIX
// In order to align Windows with POSIX behaviour:
if (process.stdout.isTTY) {
  process.stdout._handle.setBlocking(true);
} else if (os.platform() === 'win32' && !process.stdout.isTTY) {
  process.stdout._handle.setBlocking(false);
}
```

The relevant issue is nodejs/node#11568. The above might need to be done in PK, as it is a "global" application setting, rather than in js-logger.
However if we want this to be the case for tests, which use just the js-logger directly, they will need to set those settings in the js-logger instead.
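For what it's worth, a minimal sketch of one alternative for tests, assuming we only need ordered debug output rather than changing the stream's blocking mode: write straight to the stdout file descriptor with `fs.writeSync`, which blocks until the bytes are flushed.

```js
const fs = require('fs');

// A hypothetical helper (not the js-logger API): synchronous logging that
// sidesteps the asynchronous stdout behaviour described above
function logSync(message) {
  // fs.writeSync blocks until the data is written to fd 1 (stdout), so
  // output ordering stays deterministic even where stdout is async
  fs.writeSync(1, `${message}\n`);
}

logSync('flushed before the next statement runs');
```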
I think this needs a PR on the js-logger for the blocking setting. This has been created: MatrixAI/js-logger#22
These build tests run on stage build, not stage integration. So these have to be done before any further integration testing on Windows or MacOS as specified in #10.
This can only be done after MatrixAI/Polykey#419 is merged. In MatrixAI/Polykey#419, I want to integrate
I'm trying to gauge whether this is done. I've recently fixed the Mac and Windows builds. There has been a bunch of manual testing on the Mac build, so we can safely say that is working. Windows should be working, but it has its own idiosyncrasies, so it needs some of its own manual testing. The point is, these builds are working. There are CI integration jobs for all builds, and they are running and passing, but the Windows and Mac jobs just run the help text. So minimally they're checking runtime loading of all dependencies. The platform-specific integration tests need to be expanded, but I'm not sure if that's a requirement of this issue.
I think this issue would evolve into a high-level issue targeting all sorts of Windows-related compatibility. We haven't done a lot of work on Windows yet, as we still have problems to fix in general. So fixing up platform-specific issues for Windows has to be put off till later, along with identifying which tests are relevant.
The above comments are all relevant to various Windows-related issues. We can group these all together under a Windows compatibility project.
Specification
This is not about integration testing, this is about our unit tests.

Not all of our tests are passing on Windows/Mac, both in the CI/CD and on matrix-win-1/matrix-mac-1. Some of these are due to obvious platform differences (e.g. the usage of / vs \), but others may be more difficult to debug.

Failing tests on Windows:
tests/bin/vaults/vaults.test.ts
tests/bin/agent/start.test.ts
tests/bin/agent/stop.test.ts
tests/bin/agent/status.test.ts
tests/bin/secrets/secrets.test.ts
tests/bin/bootstrap.test.ts
tests/bin/utils.test.ts
tests/keys/KeyManager.test.ts
tests/network/Proxy.test.ts
tests/nodes/NodeConnection.test.ts
tests/vaults/VaultInternal.test.ts
tests/vaults/VaultManager.test.ts
tests/vaults/utils.test.ts
tests/vaults/VaultOps.test.ts
Failing tests on Mac (all timeouts, even when only a single test is run):
tests/network/proxy.test.ts
tests/bin/agent/start.test.ts
Additional context
Tasks