add Squeak4.6 and Squeak5.0 to ConfigurationOfMetacelloPreview #stable release #365

Closed
dalehenrich opened this issue Aug 21, 2015 · 29 comments

@dalehenrich
Member

Basically slipstream the definitions onto the 1.0.0-beta.32.17 release
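
(Very roughly, the slipstream means extending the existing 1.0.0-beta.32.17 version spec in ConfigurationOfMetacelloPreview with sections for the new platform attributes; the sketch below is illustrative only -- the selector and package versions are placeholders, not the actual spec:)

    versionBeta3217: spec
        <version: '1.0.0-beta.32.17'>
        spec for: #'common' do: [
            spec blessing: #'release' ].
        "illustrative: give the added Squeak platforms the same package definitions"
        spec for: #'squeak4.6' do: [
            spec package: 'Metacello-Core' with: 'Metacello-Core-dkh.???' ].
        spec for: #'squeak5.0' do: [
            spec package: 'Metacello-Core' with: 'Metacello-Core-dkh.???' ]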

@dalehenrich
Member Author

@pdebruic
Collaborator

@dalehenrich Anything I can do on this to nudge it along so we can release the updated ConfigurationOfMetacelloPreview so Squeak 4.6 & 5.0 get a good version of Metacello?

@dalehenrich
Member Author

the tests are failing on travis and while I think the changes could be causing the failures, I am also concerned about the volume of failures and the fact that I haven't seen 4.6 or 5.0 pass yet ...

@pdebruic
Collaborator

While I agree that there's no need to merge until the tests pass, it seems like your triggering Travis is a bottleneck on this issue.

Is there a way for me or @krono to keep fixing this without your intervention? I'd guess: check out the commit, make another issue, and open a PR. Does that seem right to you?

I was looking at the massive failures and they seemed to be from the github api rate limiting bug. Is builderCI using basic auth, oauth, or unauthenticated calls to the github API? Seems like it would have to be basic or oauth, but I wanted to check with you before delving into that.
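
(A quick way to see which limit a test image is actually running against is to ask the GitHub rate-limit endpoint from the image; a minimal sketch, assuming a Pharo image with Zinc available -- unauthenticated callers get 60 requests/hour, authenticated ones 5000/hour:)

    "Ask GitHub which rate limit applies to this client; the JSON reply
     reports the 'limit' and 'remaining' counts."
    ZnClient new
        get: 'https://api.github.com/rate_limit'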

@dalehenrich
Member Author

@krono and @pdebruic I am seeing these tests fail constantly for GemStone (not likely to be an API rate limit problem) and I am concerned that we have introduced a regression since the last release of the preview configuration. If it is indeed a regression then all of a sudden "every Metacello load in the universe will fail" ... so I have to be cautious, but of course it takes more time to be meticulous and up to now I haven't had the luxury ...

I will eat dinner and see what I can find out tonight ...

**************************************************************************************
    Results for anArray( 'BaselineOfMetacello') Test Suite
911 run, 900 passes, 8 expected defects, 2 failures, 1 errors, 0 unexpected passes
**************************************************************************************
*** FAILURES *******************
    MetacelloGithubIssue277TestCase debug: #'testGithubRepositoryPatternMatchingA'.
    MetacelloScriptingSuccessTestCase debug: #'testGitHubTagPatternLoad3'.
*** ERRORS *******************
    MetacelloScriptingSuccessTestCase debug: #'testGitHubTagPatternLoad3'.
**************************************************************************************

@dalehenrich
Member Author

... yeah those tests are failing on all platforms (almost) with the new configuration ...

@dalehenrich
Member Author

Did you find evidence? I looked for the explicit error messages and didn't see any ... also if I trigger just one test (like an hour ago) I only get those two test failures ... rate limit should show up in more places ...

AFAIK I can't run tests on travis-ci using authentication (if you can figure out how to do it ... I'm all for it) ... in later versions of Metacello I've added better caching to get tests to pass more frequently ... the last time the preview was tested I didn't get massive failures ...

So I have to look closer on the off chance that we have a regression...

Dale

@dalehenrich
Member Author

Test failure reproduces locally so I'm debugging now ...

@dalehenrich
Member Author

Side effect of an External project bugfix[1] ... with a fix for a Metacello bug[2] ...

I'll update the test on the branch then update the SHA ... we should see green tests ...

Dale

[1] dalehenrich/external#2
[2] https://github.com/dalehenrich/metacello-work/issues/330

@dalehenrich
Member Author

I've added you (@pdebruic) and @krono as collaborators with push access ... please don't break the Metacello universe:)

dalehenrich added a commit that referenced this issue Aug 31, 2015
dalehenrich added a commit that referenced this issue Aug 31, 2015
…y to reduce sensitivity of test to new Seaside versions
dalehenrich added a commit that referenced this issue Aug 31, 2015
…d so try to reduce sensitivity of test to new Seaside versions
@dalehenrich
Member Author

@pdebruic and @krono at this point it looks like the only failures are Squeak5.0 and Pharo2.0 (occasionally) .... before going further I would like to know that Squeak5.0 is not going to require additional work --- I don't look forward to having to publish another SHA for Squeak5.0 (which requires a new version ... new tags ... new deploy) ...

@krono
Collaborator

krono commented Aug 31, 2015

I see.

It seems to be a problem in the update procedure for Squeak 5.0, so nothing to do with metacello-work per se.
I'll try to fix it on the Squeak side (again :/)

@krono
Collaborator

krono commented Aug 31, 2015

So, after fixing on the Squeak side, the MNU is gone and we are down to one error: MetacelloGithubIssue175TestCase>>testIssue175 (#175 is about GitHub caching stuff, I'll see to that)
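
(For reference, re-running just that test in a workspace follows the same debug: pattern as the suite output earlier in this thread; assumes the Metacello test packages are loaded:)

    "Re-run only the remaining failing test."
    MetacelloGithubIssue175TestCase debug: #testIssue175.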

@krono
Collaborator

krono commented Aug 31, 2015

Looking at the failure of #175, we had a clean slate for the whole github-cache stuff as of #345, but I don't see anything of that loaded. In my test image, replaying the Travis commands, I get Metacello-GitHub-dkh.29 as the loaded version, whereas in #345 it is at least topa.49.
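
(For anyone replaying this, the loaded Monticello version can be checked with the standard Monticello API; a sketch, not taken from the Travis scripts:)

    "Answer the name of the loaded Metacello-GitHub version, e.g. 'Metacello-GitHub-dkh.29'."
    | wc |
    wc := MCWorkingCopy allManagers
        detect: [ :each | each package name = 'Metacello-GitHub' ].
    wc ancestry ancestors first name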

Also, the error is a Heisenbug for me. When I run all tests for the first time, the error is present. When I try to debug or run it a second time, the error just disappears and everything works nicely.

I am unsure about the origin of the error. Can we ignore this one? I think the cache dir switching issue is less important for the bootstrap and slipstreaming

@dalehenrich
Member Author

regarding topa.49 ... this is the ConfigurationOfMetacelloPreview and it is an intermediate stepping stone to being able to load directly from github ... it is such a pain to maintain this whole bootstrapping cha cha that I minimize the changes that are back ported ... passing the tests at this point in time means that the latest version can be loaded ...

Yes, I have seen the random errors and have been working on the master branch to eliminate them ... typically the problem has shown up because a particular package doesn't get unloaded correctly, which then upsets the image state for a follow-on test ... random failures can be ignored ...

I've restarted the Squeak5.0 runs and if they pass I will move forward with the checklist ... I'm concerned about the Pharo2.0 failures, but I don't have the time to look into them ... I will cross my fingers :)

Thanks @krono (and @pdebruic) for your help and patience ...

@dalehenrich
Member Author

Looks like the same error is hitting a slightly different test -- previewBootstrap.st being triggered in a CompiledMethodWithNode class>>generateMethodFromNode:trailer: call ...

@dalehenrich
Member Author

@krono .... left you off previous comment

@krono
Collaborator

krono commented Aug 31, 2015

@dalehenrich Where did you get the trigger message from? I can't find it in the log…

@dalehenrich
Member Author

Hmmmm, there are too many things going on ... I swear that I re-ran the tests and then looked at the log file and saw that it was the same error (that's when I copied the CompiledMethodWithNode class>>generateMethodFromNode:trailer: error), so I stopped looking. But you are right, now that link gives me a failed test with test errors ... I have had persistent problems with Safari, Chrome and Firefox not updating the travis site correctly, so I think this must be yet another case ...

Apparently we're down to test failures? Perhaps you could double-check the errors, since I distrust the ability of my browsers to show me the proper results from travis ... then we can call this fixed and I can move on to the next step ...

I've got other things that I've got to get done, so I will move on when I have a few uninterrupted moments :)

@krono
Collaborator

krono commented Aug 31, 2015

Take your time.
I've just confirmed the #175-related error (not a failure, alas).
I've seen no failures before, but in your link (build 1475.6) there's the other one (1475.23 travisPreviewCI.st), with

*** FAILURES *******************
    MetacelloIssue188TestCase debug: #testOrderedVersions.
*** ERRORS *******************
    MetacelloScriptingDocumentationIssue196TestCase debug: #testLockCommandReference2.
    MetacelloScriptingIssuesTestCase debug: #testIssue234a.

So 2 Errors and 2 Failures

dalehenrich added a commit that referenced this issue Sep 11, 2015
==== Startup Error: Error: Class category name
'Metacello-TestsPharo20MC' for the class 'MetacelloTestsPackageSet' is
inconsistent with the package name 'Metacello-TestsCommonMC.pharo20'
@dalehenrich
Member Author

Okay ... I've fixed the Pharo2.0 (and Pharo5.0) failure due to the package mismatch ... I've applied Paul's patch for Issue #369 on the issue_365 branch. The two errors do not reproduce locally, so they are probably due to random failures (historically this is because a package was not cleaned up correctly by Squeak in one test, causing a failure in a subsequent test) ... but random test failures won't prevent me from pushing this puppy ... assuming the tests will be green I'll need to update the Preview config again ... we'll see if I get to this tomorrow :)

@dalehenrich
Member Author

@krono
Collaborator

krono commented Sep 12, 2015

yay :)

dalehenrich added a commit that referenced this issue Sep 12, 2015
Merge Issue #365 work into configuration branch
@dalehenrich
Member Author

check off step 3

@dalehenrich
Member Author

check off step 4

dalehenrich self-assigned this Sep 13, 2015
@dalehenrich
Member Author

check off step 5:

http://seaside.gemtalksystems.com/ss/metacello/ConfigurationOfMetacelloPreview-dkh.60.mcz
http://seaside.gemtalksystems.com/ss/MetacelloRepository/ConfigurationOfMetacelloPreview-dkh.60.mcz
http://www.squeaksource.com/MetacelloRepository/ConfigurationOfMetacelloPreview-dkh.60.mcz
http://smalltalkhub.com/mc/dkh/metacello/main/ConfigurationOfMetacelloPreview-dkh.60.mcz
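
(For anyone following along, loading the published configuration from one of these repositories looks roughly like this; a sketch of the usual Gofer/Metacello bootstrap, not a copy of an official install snippet:)

    "Fetch ConfigurationOfMetacelloPreview, then load its #stable version."
    Gofer new
        url: 'http://smalltalkhub.com/mc/dkh/metacello/main';
        package: 'ConfigurationOfMetacelloPreview';
        load.
    ((Smalltalk at: #ConfigurationOfMetacelloPreview) project version: #stable) load.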

@dalehenrich
Member Author

need PR #364 merged in to truly support Squeak4.6 and Squeak5.0

@dalehenrich
Member Author

@krono ... did you figure out what happened here? @pdebruic and I were on parallel paths solving the same problem:)
