Welcome to the main channel on the development of MoarVM, a virtual machine for NQP and Rakudo (moarvm.org). This channel is being logged for historical purposes.
Set by lizmat on 24 May 2021.
disbot2 <#flirora> A bit late to this discussion, but I agree; given that Python and Raku allow AI contributions, having an explicit policy against them would make a good selling point for those who prefer software without LLM-generated code 07:29
07:32 ShimmerFairy left
disbot2 <librasteve> @#flirora your link has no mention of Raku, did I miss something? 07:39
<#flirora> No, you didn’t
<librasteve> i started a proposal to adopt a non LLM policy here github.com/issues/created?issue=Ra...ving%7C510 … please feel free to give feedback and add your support if you agree 07:41
librasteve_ timo: please note ^^ 07:44
disbot2 <#flirora> Thanks 07:50
timo I'd prefer a stronger stance against generative AI in raku than just "there must be a human in the loop", among other reasons because the status of generative AI output in terms of copyright is still uncertain. it might not be possible to legally put AI-generated code in a project: if the user who wrote the prompt isn't the copyright holder, they can't license it to other people in the first place 08:06
and that doesn't mention any of the externalities yet, like the fact that raku contributors now have to pay multiple times as much for the hardware they need to develop on, since not just GPU and DDR4/DDR5 RAM prices have gone way, way up, but SSD (NVMe and SATA) and HDD prices have been rising too 08:08
I already have a dayjob where I'm confronted with products that bombard developers and software development teams with bullshit AI upsell offers on what feels like a daily basis. it'd sure be great if I could come to my favourite open source project and not have to deal with any AI slop 08:13
I don't want to claim that nobody can use a code generating AI without creating bad code, or without thoroughly reviewing the results before passing them on
disbot2 <librasteve> yeah - glad you support the general case, perhaps you could add your comments to the issue so that they are in context as and when it gets pushed to the RSC 08:14
timo but I don't feel like the costs can be justified by increased contributions
it feels like a very classical "deal with the devil"
Voldenet the link above doesn't work, github.com/Raku/problem-solving/issues/510 proper link
It's impossible to prevent AI code _if_ a human is in the loop controlling code quality; however, typical AI code is awful to read and maintain 08:15
disbot2 <librasteve> I just came across the one I am proposing and it seemed to fit the bill ... I didn't want to frighten @antononcube too much so this one is fairly weak ... if you prefer a stronger one, then please add that as a counterproposal...
Voldenet so maybe it's enough to simply ban low quality code… 08:16
timo i disagree about that
if low quality code that was made by a human is suggested, there is a real benefit to tutoring the user and helping improve the code together 08:17
disbot2 <librasteve> there are some nuances such as "who is gonna police this" and "how are they going to know"
timo if the code is mostly gen ai, the time by a maintainer to help improve it is essentially completely wasted
the majority of rules cannot be policed sensibly 08:18
that doesn't mean we shouldn't have the rule in the first place
disbot2 <librasteve> so I think the "i am a human and I vouch for this PR" is a way to start with self policing and rely on contributor reputation
timo it just means that if you break the rule on purpose you're an asshole
we already can't reasonably police that a contributor actually has the right to license their contributions under the project's license 08:19
disbot2 <librasteve> and if you break the rule you risk an impact on your reputation and a much higher level of inspection on your PRs
Voldenet general "don't post low quality AI slop" rule applies to everything probably
IMO there are no problems with llm-generated code in general, because it can be pretty good when the input is sane and constrained 08:20
timo I'm still low-key angry about that claude-generated plan for a generative AI based automatic review for all ecosystem contributions
disbot2 <librasteve> OK so we don't / can't police IP infringements today ... (do we have a policy for that?) 08:21
timo I'm not sure we need explicit policies against things that are already illegal? 08:22
disbot2 <librasteve> if we had a PR gate that said "I am submitting this PR and I certify that this code does not contain any IP infringement" then this puts the legal responsibility on the contributor to some extent (ianal) 08:23
timo the main thing about people who suggest AI slop getting a bad reputation is that the barrier to entry for making AI slop suggestions is so incredibly low. that's what the curl project experienced. though in their case there was a monetary reward 08:24
if we really want stuff to be real and proper, we would need people to sign a CLA; there is something like that for one of our pieces but I can't remember which 08:25
disbot2 <librasteve> where is daniel when we need him?
<librasteve> i feel like I am pulling a loose piece of wool from a sweater 08:27
timo I feel like putting an AGENTS.md in the repo that asks the user to refrain from using an AI coding agent or any generative AI on the project, or they will have to suffer the consequences. and then putting something in Configure.pl that creates an empty file called $HOME in the base directory of the repository so that the agent sees it and decides to "clean up" for you by calling "rm -rf $HOME", and bam. 08:28
problem solved.
Voldenet sadly, it's potentially not quite legal to do things like those 08:32
(doing rm -rf if users try to use agents) 08:33
timo no no
we don't do rm -rf
the user does it themselves
disbot2 <librasteve> 😈 08:34
Voldenet Ah, if the user decides so themselves, that's another story }:-> 08:36
timo if having your entire home directory deleted by your coding agent is a problem for you, maybe don't use a coding agent that can easily mistake a file called "$HOME" for your literal home folder in the first place? 08:38
Voldenet Well, in the first place don't give your agent permissions to work outside of some certain area (git repository probably), that's obvious 08:40
timo it's not working outside the git repository, it's just deleting a file inside the git repository
... it's also way past the time to look into migrating away from github ... 08:50
08:56 ShimmerFairy joined
disbot2 <dr.shuppet> Linux kernel has been using Developer Certificate of Origin (DCO) for a long time (since 2004) to avoid having to do CLAs 08:59
ShimmerFairy Looking back on convo I missed: I think you probably can't tell if/when to accept AI contributions until after the speculative bubble pops. There's IMO too much interest in making LLMs the next big thing to trust attempts to insert it into existing works. If it has any utility here, it'll only become clear when it's no longer making people infinite money. 09:05
As for AGENTS.md, I'd rather not put one in, even a malicious one, for two reasons: first, "put poison food in the work fridge to catch the food thief" is generally very frowned upon, in any context; second, I don't want to legitimize LLM usage by including such a file at all (that is, I think the lack of a file is a good first hint that the project isn't interested). 09:08
timo yeah, not putting AGENTS.md in is better than having one 09:10
xiaomiao the trick is to enjoy hot sauce a lot 09:12
I trained people at a previous job who would just grab random gummybears etc. when they were on a table ... exciting fun when they noticed that those were spicy gummybears 09:13
13:07 Geth left, Geth joined
disbot2 <antononcube> Having such a policy would be very counterproductive. 13:39
lizmat well, at least it's not earwax :-)
disbot2 <antononcube> For example, LLM-providers can't help but break their APIs. (On a frequent, but irregular basis.) Re-learning those APIs is a waste of time; they are most likely to be used for only a brief time. This is a perfect application of LLMs: generating code that interfaces with an LLM-provider for some temporarily valid API. 13:43
lizmat To me that feels like getting people hooked on a drug to be able to get hooked on a drug 13:45
disbot2 <antononcube> I can give non-LLM examples in the same vein -- making Raku connectors/clients to existing established, well known systems. Say, Google's OR-Tools, which has many software components from different sources. I like OR-Tools, but do consider learning the underlying APIs a waste of time. 13:51
<antononcube> (Although, I did that when I was paid to do it.) 13:52
timo many API description languages/formats exist, like Swagger. we should see to it that we have something good in the ecosystem that can ingest such a definition and give a good raku module as a result 15:28
Voldenet Swagger is now OpenAPI 15:32
timo OK, sure 15:34
we have something for cro already but i think that's when you start with an openapi definition and want to create a cro router skeleton from that 15:35
not sure if it handles creating client methods and such
Voldenet doesn't seem so, it looks like a validation layer for a given spec 15:42
it's possible to parse yaml/json into OpenAPI::Model, but openapi clients generated that way are not very nice to use - it still feels like using some sort of raw protocol that way 16:03
which is not surprising, because it's all http underneath, and instead of doing `get("/a/$param1/b/$param2")` you'd do `get.a($param1).b($param2)` 16:06
or get.a{$param1}.b{$param2} 16:08
it is a slight improvement, because it has types, headers and so on, but these APIs have all been poor compared to true client libraries written by hand 16:14
timo it'd be good to have something that allows you to create an initial library skeleton that already works with just the definition so you can extend it to make the interface nicer, but in that case it's very important that any additions the user makes can be kept when re-generating from a changed definition 16:36
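[editor's note: the fluent call style Voldenet describes above, `get.a($param1).b($param2)` assembling the path `/a/$param1/b/$param2`, can be sketched in a few lines. This is a hypothetical Python illustration of the idea only; it is not code from any existing Raku or OpenAPI module:]

```python
# Hypothetical sketch of a fluent URL-path builder, as discussed above:
# attribute access adds a fixed path segment, and the call that follows
# appends the parameter value as the next segment.

class PathBuilder:
    def __init__(self, segments=None):
        self.segments = segments or []

    def __getattr__(self, name):
        # Called only for unknown attributes, i.e. path segment names.
        def add_segment(value):
            # Return a new builder so partial paths can be reused safely.
            return PathBuilder(self.segments + [name, str(value)])
        return add_segment

    def url(self):
        return "/" + "/".join(self.segments)

get = PathBuilder()
print(get.a("42").b("7").url())  # prints: /a/42/b/7
```

A real generated client would of course attach HTTP verbs, typed parameters, and headers to each operation; the sketch only shows how the chained-segment syntax can be made to work.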
17:16 librasteve_ left 17:56 librasteve_ joined 19:25 Guest693 left 19:27 timo2 joined 19:31 timo left 20:14 grayeul joined 20:15 timo2 is now known as timo, [Coke] is now known as timo2, timo2 is now known as [Coke]
grayeul Hi all! I'm trying to build nqp (to build an rpm) and while this works on EL10 (RockyLinux-10 actually), I'm getting an error on RockyLinux-9. The problem comes up when it tries to make gen/moar/stage1/ModuleLoader.moarvm from ModuleLoader.nqp 20:19
The cmd passed is: '/usr/bin/moar' --libpath=src/vm/moar/stage0 src/vm/moar/stage0/nqp.moarvm --bootstrap --no-regex-lib --target=mbc --setting=NULL --stable-sc=stage1 --output=gen/moar/stage1/ModuleLoader.moarvm src/vm/moar/ModuleLoader.nqp 20:20
and it throws an error that --setting is illegal. Not sure why this works on Rocky10, but not Rocky9 ?
I also don't understand why that option seems to be passed to moar, when moar --help suggests it is not an option.. 20:22
timo src/vm/moar/stage0/nqp.moarvm is the "program" it's running and that gets the rest of the arguments 20:34
i think that's the very first command in the build that uses nqp, is that right? can you check with '/usr/bin/moar' --libpath=src/vm/moar/stage0 src/vm/moar/stage0/nqp.moarvm --help
grayeul This compiler is based on HLL::Compiler followed by a bunch of options (same on both distros) 20:39
Looks the same on both (lost a line somewhere it seems) 20:40
so -- it's trying to build nqp, and using an 'early stage' of that build for later things? And yes, it does seem like that might be the issue... 20:41
I'm trying to track down what are the key differences (obviously different gcc, etc...)
I'm using same source (2026.02) for both... 20:42
I guess the report from --help just lists the option 'names' but does not include the '--' prefix, though I assume it expects them. 20:43
So ... the --help output seems to be reasonable. But I can *add* options to the --help invocation and it is still happy if I pass --target=mbc or --no-regex-lib or --bootstrap (or all of them), but it complains about --setting and --stable-sc as illegal options. And interestingly, it doesn't say --output is illegal, but it says it cannot have a value. So, it seems like somehow the argument parsing is not working properly. 21:05
(on RockyLinux-9) 21:06
timo do you have a script of some kind to reproduce all this easily, for example in a container? 21:23
i'll try a simple modification of an opensuse one i still have here 21:26
grayeul I am running a build using 'mock' -- if you have access to that, it might not be too hard. 21:28
I'm wondering if something went wrong when I built moarvm -- and the problem is there -- but it seems like that ran without problem.
timo is the /usr/bin/moar the moar you have just built? 21:31
grayeul I can possibly see about getting a container of some sort setup.... but I'm about out of juice, after messing with this most of the day.... I'll try doing that tomorrow and report back here...
timo i have a Containerfile that successfully compiles and installs a moarvm and nqp
grayeul Yes, I think so... I built that into an rpm first... and then have installed that rpm into the chroot where I'm working to build nqp
timo gist.github.com/timo/61cc9129b7c6d...80f810e680 21:32
grayeul if you have any environment based on rpm.... I could certainly get you that rpm (for moarvm) -- and then possibly you could try it.
timo i do have a rocky linux 10 container that i was using to look for the cause of the crashes in Sparrow6 21:35
not container, VM.
i would prefer getting the srpm over the rpm probably, but having the rpm that gives that error could be interesting too 21:36
grayeul all seemed to be working (for this...) on rocky10 21:37
timo build log in the gist
ah right, i wouldn't be able to see the problem on 10 anyway then
i am chronically short on SSD space :(
can you also give me more of the output? maybe including the build of the moarvm package as well? 21:38
grayeul yeah... I'm not sure if this is helpful or not -- but you could look at: copr.fedorainfracloud.org/coprs/gr...oj/builds/ and I think you can see the build logs, etc for moar, and the failed one for nqp 21:39
or you can see the moar build log (successful) here: rpa.st/7MCDU 21:41
and the failed nqp build here: rpa.st/6T6EA 21:42
21:50 grayeul2 joined, grayeul2 left
timo --no-silent-build for nqp would be helpful to get the executed commands in the output 21:51
21:51 grayeul2 joined 21:52 grayeul2 left
grayeul I'll see what I can do -- but probably not until tomorrow... thanks for taking the time to look.. 21:52
timo sure
the "illegal option" is a reaction to when the given option isn't found in a hash, maybe something's really wrong with hashes for some reason? 21:53
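[editor's note: the failure mode timo describes, where an option is reported as "illegal" simply because it is missing from a lookup hash, can be illustrated with a small sketch. This is hypothetical Python, not the actual NQP/HLL::Compiler code; the option names are taken from the command reported earlier in the log:]

```python
# Hypothetical hash-driven option parser: any option absent from the
# lookup hash is "illegal", and an option marked as taking no value
# rejects any value given to it. "setting" is deliberately left out of
# the hash to mimic the reported "--setting is illegal" error.

KNOWN_OPTIONS = {
    # option name -> whether the option takes a value
    "target": True,
    "output": True,
    "bootstrap": False,
    "no-regex-lib": False,
}

def parse_option(arg):
    name, sep, value = arg.lstrip("-").partition("=")
    if name not in KNOWN_OPTIONS:
        raise ValueError(f"Unknown option '{name}'.")
    if sep and not KNOWN_OPTIONS[name]:
        raise ValueError(f"Option '{name}' does not take a value.")
    return name, (value if sep else None)

print(parse_option("--target=mbc"))  # prints: ('target', 'mbc')
```

With this shape, a hash that was built incorrectly (or corrupted) on one platform would produce exactly the observed symptoms: some options suddenly "illegal", others refusing the value they normally accept.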
[Coke] FYI, we did just release 2026.03
don't expect it to fix your issues, but might as well use the latest one. 21:54
grayeul OK -- I think I kicked off another build on copr, with the --no-silent-build option... but sometimes it takes a while to get a builder. 21:56
I did see that you had released 2026.03 -- but (after successfully building moarvm/nqp/rakudo on rocky10) -- when I tried 2026.03 that failed with a different error (on 10). I figured I'd stick with 2026.02 to get going on 9, then go back and see what was up with 2026.03.... 21:58
timo huh.
well, that doesn't bode well
copr-dist-git.fedorainfracloud.org...oarvm.git/ should i be able to see this? gives me a 500 internal server error 21:59
grayeul well, I'm pretty sure... it did -- can't seem to find that build right now, so I need to re-check.
timo but i can see the log of the ongoing build live in front of me now 22:01
grayeul no -- that's not a valid URL -- I'm fairly new to copr, so haven't figured out how to navigate everywhere... but, on the main project page you should see the cmd: `#> dnf copr enable grayeul/TestProj` which should let you enable that as a repository, then can use dnf commands to pull down git/x86_64 rpms.
[Coke] grayeul: makes sense, thanks. 22:02
timo ok, it does look like the output from --no-silent-build is what we would expect
grayeul I think there is also a Download repo button at: #> dnf copr enable grayeul/TestProj that will give you a .repo file that can be used. You might have some luck with that.
already failed in the same way I see.... 22:03
ok -- I'm done for today.... cya
[Coke] ~~
Thanks for bringing this to us!
timo gnite