melezhik

joined 2 years ago
[–] melezhik@programming.dev 1 points 1 day ago* (last edited 1 day ago)

In what environment is the CI?

By default this is the alpine:latest Docker container; however, one can use custom Docker images (to be documented, but let me know if you are interested). On the demo server, an Ubuntu image is used.

[–] melezhik@programming.dev 1 points 1 day ago* (last edited 1 day ago) (1 children)

run_task comes as a part of the dsci SDK for Python. What do you mean by build-time dependencies? You are free to put anything into the Python / Bash tasks that are called by the run_task function in a job file … if you point me at your repo on the demo server, I can help you with that …
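Roughly, a job file is just Python calling run_task for each task directory under .dsci/ - a simplified sketch only; the exact import path and run_task signature shown here are my assumptions for illustration, not the documented API:

```python
# job file sketch - simplified; the exact import path of run_task
# in the dsci SDK may differ from what is shown here
from dsci import run_task

# every task is a directory under .dsci/ with a Python or Bash script;
# put anything you need there - installing deps, building, testing
run_task("build")
run_task("test")
```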

[–] melezhik@programming.dev 2 points 2 days ago

Update for new users: just create a repo and add http://127.0.0.1:4000/forgejo_hook as a repo webhook, then create a dsci pipeline and it will get triggered.
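If you prefer scripting it over clicking through the web UI, the same hook can be created via the Forgejo REST API (Forgejo keeps Gitea's API surface) - a sketch, with OWNER/REPO/TOKEN as placeholders:

```python
# register the webhook via the Forgejo/Gitea API;
# OWNER, REPO and TOKEN are placeholders - adjust for your instance
import requests

API = "http://forgejo.sparrowhub.io/api/v1"
payload = {
    "type": "gitea",            # Forgejo accepts Gitea-compatible hook types
    "active": True,
    "events": ["push"],
    "config": {
        "url": "http://127.0.0.1:4000/forgejo_hook",
        "content_type": "json",
    },
}
r = requests.post(
    f"{API}/repos/OWNER/REPO/hooks",
    json=payload,
    headers={"Authorization": "token TOKEN"},
)
r.raise_for_status()
```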

[–] melezhik@programming.dev 2 points 2 days ago

Actually I just enabled self-registration, so no need to ask me, but I will still be happy to see anyone in Discord )

13
Dead Simple CI - looking for beta testers (deadsimpleci.sparrowhub.io)
submitted 2 days ago* (last edited 2 days ago) by melezhik@programming.dev to c/show_and_tell@programming.dev
 

Hey! I am building a brand new CI on top of forgejo/gitea - the idea is to use general-purpose programming languages instead of YAML for pipelines. So I have launched a forgejo instance with the DSCI runner integrated, where you can find some example pipelines for demo projects - http://forgejo.sparrowhub.io/root

So I am looking for beta testers. Anyone who wants to try out dsci - please let me know and I will create an account for you (you may find the link to the Discord channel on the web site), and you can start creating and running pipelines for projects you like.

[–] melezhik@programming.dev 1 points 5 days ago

It uses YAML only for the configuration part, but the pipeline itself is far more than that. Not sure what you mean by “middle ground” - could you please elaborate? Thanks

[–] melezhik@programming.dev 2 points 5 days ago* (last edited 5 days ago)

Just added the feature of running jobs on localhost for debugging:

cd .dsci/job_one; docker run -it -v $PWD:/opt/job --entrypoint /bin/bash dsci -c "cd /opt/job/; s6 --task-run ."

[–] melezhik@programming.dev 3 points 6 days ago

Feedback is welcome, the project is at a very early stage …

20
Dead Simple CI (deadsimpleci.sparrowhub.io)
submitted 6 days ago* (last edited 6 days ago) by melezhik@programming.dev to c/programming@programming.dev
 

Dead Simple CI - http://deadsimpleci.sparrowhub.io/ - could be thought of as an extension to any modern CI system - GitHub/Gitea/GitLab/Forgejo/you name it - adding to the default pipeline mechanism (usually YAML-based) the use of general programming languages, which is convenient for programmers. It uses webhooks and the commit statuses API to report results back to the native CI.
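For instance, reporting a result back to Gitea/Forgejo is a single call to the commit status endpoint - a sketch, with OWNER/REPO/SHA/TOKEN as placeholders:

```python
# post a commit status back to Gitea/Forgejo;
# GitHub's endpoint is /repos/{owner}/{repo}/statuses/{sha}, without /api/v1
import requests

API = "http://forgejo.sparrowhub.io/api/v1"
status = {
    "state": "success",            # or "pending" / "failure" / "error"
    "context": "dsci",             # the check name shown next to the commit
    "description": "pipeline passed",
    "target_url": "http://deadsimpleci.sparrowhub.io/",  # link to the build
}
requests.post(
    f"{API}/repos/OWNER/REPO/statuses/SHA",
    json=status,
    headers={"Authorization": "token TOKEN"},
).raise_for_status()
```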

 

Double TAP is a lightweight testing framework where users write black-box tests as rules checking output from tested "boxes". Boxes could be anything from an HTTP client or a web server to messages in syslog. This universal approach allows one to test anything by just dropping in text rules describing system behavior in a black-box manner.

Rules are written in a formal DSL and can be extended in many programming languages.

The tool aims to help with infrastructure audit and testing, as well as with validating development environments.
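To illustrate the idea only (this is not Double TAP's actual rule syntax), a black-box check boils down to "run the box, match rules against its output":

```python
# the idea of black-box rule checks in plain Python -
# NOT Double TAP's DSL, just an illustration of the approach
import re
import subprocess

rules = [
    r"HTTP/1\.[01] 200",   # the web server answers OK
    r"Server:",            # and identifies itself
]

# the "box" here is a curl call; it could equally be syslog output, etc.
output = subprocess.run(
    ["curl", "-sI", "http://127.0.0.1"],
    capture_output=True, text=True,
).stdout

for rule in rules:
    assert re.search(rule, output), f"rule failed: {rule}"
print("all rules passed")
```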

 

Nice screenshots are attached )

 

Tomtit is a CLI task runner for when you need to run repetitive tasks / tools around your project. It's similar to make, but with a more generic approach, not tied to build tasks only.

 

After one Rakulang community member and bioinformatics developer mentioned the Nextflow data pipeline framework, I was surprised to find that the Sparky and Sparrow6 ecosystem could be a good fit for that type of task ...

 

Just create a .env/vars.env file which is not kept in source control, define some Bash variables with sensitive data there, and then run the sparrowdo CLI referencing those variables in a safe way:

--tags password=.env[PASSWORD],token=.env[TOKEN]
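The file itself is just plain Bash variable assignments - a hypothetical example with placeholder values:

```bash
# .env/vars.env - not committed to source control; values are placeholders
PASSWORD=s3cr3t
TOKEN=t0ken
```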

  • variables are not exposed in Bash history
  • they are not seen via ps aux
  • the variables file gets transferred to the remote host over scp
  • the file with variables is not kept on the remote host (it is immediately removed after being exported to the Sparrowdo scenario)
  • host-specific vs default env variables are allowed

Safe and simple

[–] melezhik@programming.dev 2 points 8 months ago

TL;DR:

Sparky is a distributed jobs framework that allows orchestration of remote tasks on a cluster of nodes. It’s simple to set up and easy to use. This post is a brief overview of Sparky's architecture design.

Sparky's target audience is:

  • cloud providers managing underlying multi-host infrastructure
  • data scientists processing data in a distributed manner (aka data pipelines)
  • software engineers and devops doing any task of a distributed nature

[–] melezhik@programming.dev 1 points 9 months ago* (last edited 9 months ago)

You may try out https://github.com/melezhik/sparky which is a local / remote task runner with a nice front end, and scripts can be written in many languages.

 

This URL provides a collection (17 recipes) of Raku/Sparrow snippets to munge your data. Raku provides a powerful regex mechanism to search text, and Sparrow adds some high-level blocks to make it even easier; together they allow the user to apply Raku in daily data processing tasks as an alternative to well-known solutions like sed/grep/awk/perl.

Every recipe is an example of how to solve a real user task (a Stack Overflow question), which you may compare with the other (non-Raku) solutions at the same link and form your own opinion. Don’t forget to give the authors some credit by voting up on Stack Overflow if you like them ))

PS: all recipes are tested by myself; I appreciate any suggestions, improvements, and bug reports.

[–] melezhik@programming.dev 1 points 10 months ago

Yep. Fancy devs watching me coding some Rakulang in nano 😂
