jnsereko
(Joshua Nsereko)
May 26, 2022, 5:28am
1
Tired of the shame of pushing dozens of commits before my workflow passes… here is the workflow
I have installed nektos/act; however, when running it locally, it’s like it needs 1000 years to complete.
Is there any other way of testing workflows locally, other than pushing and waiting on GitHub?
cc @dkayiwa @ibacher @kdaud
kdaud
(Kakumirizi Daud)
May 26, 2022, 8:06am
2
I have not done this before, but nektos/act seems to be the tool that can do the work.
The slowness could be related to memory management!
jnsereko
(Joshua Nsereko)
May 26, 2022, 8:21am
3
I selected the medium image during its installation because I couldn’t watch the whole 20GB go just like that.
jnsereko
(Joshua Nsereko)
May 26, 2022, 8:28am
4
I thought there was some other way you do it.
Do you just wait for the results on GitHub while the workflow executes,
like you just stay optimistic?
kdaud
(Kakumirizi Daud)
May 26, 2022, 10:14am
5
That has been the usual way! Push changes to the branch until Actions is happy, and then squash them into a single commit.
running locally, its like it needs 1000 years to complete
That’s weird! It would mean running it twice requires 2000 years.
i selected medium on its installation because i couldn’t see the whole 20GB go just like that
How about selecting the other options during installation and finding out how they perform?
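For what it’s worth, the image choice isn’t locked in at installation: act reads default flags from `~/.actrc`, one flag per line, so you can point it at a different runner image later without reinstalling. A sketch using the image tags from act’s README (the exact tags may have changed since):

```
-P ubuntu-latest=catthehacker/ubuntu:act-latest
--pull=false
```

The first line maps `ubuntu-latest` jobs to the “medium” image; the second reuses already-downloaded images instead of re-pulling them on every run.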
kdaud
(Kakumirizi Daud)
May 26, 2022, 2:19pm
6
@jnsereko An alternative would be to make sure the workflow is well written, and then publish it to GitHub Actions and let it run there.
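On that note, a minimal well-formed workflow for reference, so there’s less to get wrong on the first push. This is a generic sketch, not specific to any particular repo; the file name and step contents are made up:

```yaml
# .github/workflows/ci.yml (hypothetical)
name: CI
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Run build
        run: echo "project build/test commands go here"
```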
sharif
(Sharif Magembe)
May 26, 2022, 7:50pm
7
Another way could be:
Avoid writing functionality directly in the .yml files for GitHub Actions.
Write as much as possible in a CI-agnostic way, for example using Bash scripts or a Dockerfile, so the same steps can be run locally.
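As a sketch of that idea: the workflow .yml only calls a script, and the script itself runs the same anywhere. The file name and commands below are made up placeholders:

```shell
#!/usr/bin/env bash
# scripts/ci-check.sh (hypothetical): one CI-agnostic entry point that
# runs identically on a laptop, in GitHub Actions, or in any other CI.
set -euo pipefail

run_checks() {
  # Stand-ins for the project's real commands (e.g. mvn test, npm test).
  echo "lint: ok"
  echo "tests: ok"
}

run_checks
```

The workflow step then reduces to `run: ./scripts/ci-check.sh`, and debugging happens locally with plain `bash` instead of push-and-wait.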
ibacher
(Ian Bacher)
June 2, 2022, 4:03pm
8
Don’t be ashamed of making a lot of commits! Go for it! We can always squash the commits down on merge, and sometimes it takes… many, many tries to get things right. The limitation of act is that it’s all Docker-based, so while it runs locally, it first needs to download the associated Docker images, check out repos, etc., so it can still be a bit of a bandwidth hog.
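On the squash-on-merge point, for anyone who prefers doing it from the command line, here is a minimal sketch in a throwaway repo (branch, file, and commit names are made up):

```shell
# Demo: several WIP commits on a branch become one commit on main
# via `git merge --squash`.
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main
git config user.email demo@example.com
git config user.name demo

echo base > notes.txt
git add notes.txt
git commit -qm "base"

git checkout -qb fix-workflow
echo try1 >> notes.txt; git commit -qam "wip: try 1"
echo try2 >> notes.txt; git commit -qam "wip: try 2"

git checkout -q main
git merge --squash -q fix-workflow   # stages the branch's combined changes
git commit -qm "fix workflow (squashed)"

git log --oneline   # main now has two commits: base + the squashed one
```

GitHub’s “Squash and merge” button does the equivalent for a pull request, so the WIP commits never reach the target branch’s history.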
2 Likes