🦋 Welcome to the MAIN() IRC channel of the Raku Programming Language (raku.org). Log available at irclogs.raku.org/raku/live.html . If you're a beginner, you can also check out the #raku-beginner channel! Set by lizmat on 6 September 2022. |
guifa just realized that his semi-joke idea of having an inline C block get compiled in the background and added with NativeCall was.... already done! | 01:53 | ||
Just with Go instead | |||
Voldenet | I did that already, it's not that much of a joke actually | 01:56 | |
simplifies writing wrappers with libs requiring structs | |||
guifa | Voldenet: link? | 01:58 | |
Voldenet | the shortest example I have on hand ix.io/4Aoy | ||
guifa is preparing a history of slangs | |||
Voldenet | definitely not a slang though | ||
guifa | Remind me of that link in a week or so | 01:59 | |
And I'll try to make it one haha | |||
(would just need to write a C grammar et voilà) | |||
tbrowder__ | .ask lizmat is there a public way to search the weekly archives? | 02:03 | |
tellable6 | tbrowder__, I'll pass your message to lizmat | ||
Voldenet | the reason I didn't do a slang is that C's grammar is impressively large and I simply needed to work on C code iteratively: www.quut.com/c/ANSI-C-grammar-l-2011.html | 02:08 |
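(A minimal sketch of the approach being discussed: compile an inline C snippet at run time and bind it with NativeCall. This is not Voldenet's actual code; the file names, the add() function, and the availability of gcc on $PATH are assumptions for illustration.)

    use NativeCall;

    # Hypothetical inline C snippet we want to call from Raku.
    my $c-source = q:to/C/;
    int add(int a, int b) { return a + b; }
    C

    # Write it out and compile it into a shared library with gcc.
    my $c-file = $*TMPDIR.add('inline-demo.c');
    my $lib    = $*TMPDIR.add('libinline-demo.so');
    $c-file.spurt($c-source);
    run 'gcc', '-shared', '-fPIC', '-o', ~$lib, ~$c-file;

    # The Callable form of `is native` defers resolving the library path
    # until the sub is first called, i.e. after gcc has actually run.
    sub add(int32, int32 --> int32) is native(sub { $lib.Str }) { * }

    say add(20, 22);   # 42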
guifa | Voldenet: yeah. But if I could get the grammar to at least validate, I could hand the string en masse to the compiler | 02:16 | |
the catch is the preprocessing and Pragma handling I think | 02:17 | ||
Voldenet | hmm, I think the slang would need to do two passes for that | 02:22 | |
with preprocessor grammar being separate from C grammar | 02:23 | ||
expanding macros in a raku grammar definitely sounds like a fun adventure | 02:27 | ||
guifa | Voldenet the catch of course is how to preprocess when you're looking for a non-EOF terminal token | 03:46 | |
Voldenet | doesn't the preprocessor stop when it finds EOF, though? | 03:47 |
guifa | right, but if I do something like | 03:48 | |
c-sub foo { ... } | |||
my EOF is now actually } | 03:49 | ||
Voldenet | but you still need to expand macros, they could conditionally expand into } or not | 03:52 | |
one solution would be to simply get the longest parsable output of gcc -E by iterating from the end of the file | 03:59 |
since the Raku code will most likely not be parsable by gcc… | |||
practically I bet that `consume anything, devour literals and preprocessor directives, count {}s in real code` would cover all useful code | 04:05 | ||
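(A rough Raku sketch of the brace-counting heuristic described above: swallow string/char literals and comments whole so braces inside them are not counted, and recurse on nested braces. The grammar name is made up and preprocessor directives are deliberately ignored; it is nowhere near a real C parser.)

    grammar CBlockSkipper {
        token TOP      { <balanced> }
        # A balanced {...} region: literals and comments are consumed whole,
        # nested braces recurse, everything else is taken one character at a time.
        token balanced { '{' [ <str> | <chr> | <comment> | <balanced> | <-[ { } ' " ]> ]* '}' }
        token str      { '"' [ '\\' . | <-[ " \\ ]> ]* '"' }
        token chr      { "'" [ '\\' . | <-[ ' \\ ]> ]* "'" }
        token comment  { '/*' .*? '*/' | '//' \N* }
    }

    # Braces inside the string literal and the comment are not miscounted.
    say so CBlockSkipper.parse(q<{ char *s = "}"; /* } */ return 0; }>);   # True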
lizmat | . | 09:04 | |
tellable6 | 2023-07-12T02:03:12Z #raku <tbrowder__> lizmat is there a public way to search the weekly archives? | ||
lizmat | .tell tbrowder_ on google, include "rakudoweekly.blog" in your search, and it will effectively do that for you | 09:07 | |
tellable6 | lizmat, I'll pass your message to tbrowder__ | ||
Geth | docker: 7cae2b1fce | (Daniel Mita)++ | 5 files | Bump to 2023.06 | 10:29 |
docker: 40594a6003 | Altai-man++ (committed using GitHub Web editor) | 5 files | Merge pull request #55 from Raku/2023.06 (Bump to 2023.06) |
[Coke] | Do we have any concurrency examples that show running an external program, getting output and *also* the exitcode? Most of the samples I have at hand (and in the concurrency page in the docs) show one or the other, but not both. | 13:15 |
(that is, it's either a react block to deal with input and output, or a shell that checks the exitcode, but not both) | |||
[Coke] | ah, docs.raku.org/type/Proc/Async does have a giant react block with an .exitcode, but could use better highlighting (and a bug fix on the output there). | 13:18 | |
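(For the record, a condensed sketch of that pattern: one react block that both streams the output and reports the exit code. The 'raku', '-v' command is just a placeholder.)

    my $proc = Proc::Async.new('raku', '-v');

    react {
        # Stream stdout/stderr as they arrive.
        whenever $proc.stdout.lines -> $line { say  "OUT: $line" }
        whenever $proc.stderr.lines -> $line { note "ERR: $line" }

        # .start returns a Promise that is kept with a Proc once the program
        # exits; that Proc carries the exit code.
        whenever $proc.start -> $result {
            say "exit code: { $result.exitcode }";
            done;
        }
    }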
tbrowder__ | japhb: i am adding issues to my pending pr to | 13:44 | |
tellable6 | 2023-07-12T09:07:42Z #raku <lizmat> tbrowder_ on google, include "rakudoweekly.blog" in your search, and it will effectively do that for you | 13:45 | |
tbrowder__ | show my intended changes. what do you think about creating a separate module to create a test json input file? or make App::* a class and create such in the TWEAK? | 13:48 | |
lizmat: search worked great, thnx | 13:49 | ||
[Coke]: for us timid proc users, examples consistently showing how to get final results are really appreciated. | 13:56 | ||
melezhik | . | 14:17 | |
is there a way in raku to dynamically generate a list of functions?
say I have a @list and I want to iterate through it and generate a function named $I for every item in the list | 14:18 |
where $I is an item in @list | 14:19 | ||
m: my @list = <a b c>; for @list -> $item { our sub $item () {} } | 14:20 | ||
camelia | ===SORRY!=== Error while compiling <tmp> Missing block at <tmp>:1 ------> = <a b c>; for @list -> $item { our sub⏏ $item () {} } expecting any of: new name to be defined |
||
melezhik | m: my @list = <a b c>; for @list -> $item { say $item } | 14:21 | |
camelia | a b c |
||
melezhik | m: my @list = <a b c>; for @list -> $item { eval "sub $item () \{ \}" } | ||
camelia | ===SORRY!=== Error while compiling <tmp> Undeclared routine: eval used at line 1. Did you mean 'EVAL', 'val'? |
||
melezhik | m: my @list = <a b c>; for @list -> $item { EVAL "sub $item () \{ \}" } | ||
camelia | ===SORRY!=== Error while compiling <tmp> EVAL is a very dangerous function!!! (use the MONKEY-SEE-NO-EVAL pragma to override this error but only if you're VERY sure your data contains no injection attacks). at <tmp>:1 ------> -> $i… |
||
melezhik | m: use MONKEY-SEE-NO-EVAL; my @list = <a b c>; for @list -> $item { EVAL "sub $item () \{ \}" } | 14:22 | |
camelia | ( no output ) | ||
melezhik | m: use MONKEY-SEE-NO-EVAL; my @list = <a b c>; for @list -> $item { EVAL "sub $item () \{ \}" }; a(); | ||
camelia | ===SORRY!=== Error while compiling <tmp> Undeclared routine: a used at line 1 |
||
melezhik | maybe better try my luck with closures | 14:24 | |
[Coke] | m: { sub a() {} }; a() | 14:57 | |
camelia | ===SORRY!=== Error while compiling <tmp> Undeclared routine: a used at line 1 |
||
[Coke] | m: sub a() {}; a() | ||
camelia | ( no output ) | ||
japhb | tbrowder__: Didn't realize you could PR *issues* on GitHub, that's kindof cool in an odd sort of way. | 15:42 | |
I'm totally fine with having a module that creates test files and data sets. The reason I originally chose an externally-created JSON file is that I wanted real-world data shapes and didn't want to accidentally design the data set to favor particular codecs. | 15:43 | ||
But as long as we keep at least one (usefully large and complex) real world JSON data file, I'm happy to have generated ones as well that test particular facets of performance or fidelity or what have you. | 15:44 | ||
tbrowder__: Did I answer all your questions? :-) | |||
tbrowder__ | weird, i can't find the issues now, but yes. anyhoo, one thing i really need from you in the README is a short intro on how a new raku person should set up and run the app. how to get the big json file, etc. make it suitable for a person looking to move from python to raku. | 16:03 |
and not familiar with all the ecosystem bits | 16:04 | ||
tbrowder__ | ahem, and to help me :-) | 16:13 | |
guifa | First talk down. Now to finish writing the next one haha | 16:20 | |
japhb | tbrowder__: Ah, meaning, you'd like me to do that *first* it sounds like, since it's not obvious how to get set up properly. OK, will put that on my list for today (which is kinda packed, so it will be a bit before I can do so). | 16:44 | |
tbrowder__ | no rush, thnx | 16:45 | |
this is just nice-to-have stuff for me | |||
japhb | gotcha | 16:46 | |
antononcube | @guifa "(would just need to write a C grammar et voilà)" -- Probably not that hard -- or hard, but easier -- using ANTLR C grammars and DrForr's conversion packages. | 16:53 | |
Or using (E)BNF grammars for C and my conversion packages. 🙂 | 16:55 | ||
guifa | The catch is that C has a multi phase compile process with macro expansions in some of those | 17:16 | |
Ack, some of the RakuAST stuff changed | 17:18 | ||
antononcube | @guifa Yeah, and the so called "lexer hack." | ||
nemokosch | the C preprocessor is banally simple compared to a usual PL parser though | 17:24 | |
antononcube | What is "PL parser" ? | 17:29 | |
guifa | programming language | ||
antononcube | 🙂 Agh | ||
guifa | nemokosch: yeah, there are single pass C compilers so it can be done, will just be complex | ||
antononcube | There are existing efforts: raku.land/github:andydude/C::Parser | 17:30 | |
nemokosch | I took this mannerism from raiph | 17:31 | |
tbrowder__ | i'm trying to use JSON::Fast to output sorted keys using ":sorted" keys. it works fine for "text" keys, but my hash has "Numeric" keys. for such hashes, this works: "%myhash.keys.sort({.Numeric})". the README says to give a Callable but so far i've had no success. any help, pls? | 17:53 |
tbrowder__ | m: my %h = set <100 90 8>; say %h.keys.sort; say %h.keys.sort({.Numeric}); | 17:56 | |
camelia | (100 8 90) (8 90 100) |
||
tbrowder__ | i'm trying to get the same kind of results with JSON::Fast but haven't cracked the correct syntax yet. | 17:58 |
sorry, JSON::Fast uses ":sorted-keys" as named arg | 17:59 | ||
nemokosch | does JSON even interpret numeric keys I wonder | 18:27 | |
japhb | Nope. But CBOR does. | ||
(ObCBORIsBetterComment) | 18:28 | ||
nemokosch | when I read binary, I always think: does this mean we re-enter byte-order hell? | 18:30 | |
actually I have a funny issue at work, they cast it as if it was simply a byte-order portability upgrade but the currently generated binary is clearly significantly different from the expected binary, the length is the same but the histogram of bytes is visibly completely different :\ | 18:31 | ||
tbrowder__ | m: my %h = set <"100" "90" "8">; say %h.keys.sort; say %h.keys.sort({.Numeric}); | 18:37 | |
camelia | ("100" "8" "90") Cannot convert string to number: base-10 number must begin with valid digits or '.' in '⏏\"100\"' (indicated by ⏏) in block <unit> at <tmp> line 1 |
||
tbrowder__ | m: my %h = set <"100" "8" "90">; say %h.keys.IntStr.sort{.Numeric}); | 18:40 | |
camelia | ===SORRY!=== Error while compiling <tmp> Unexpected closing bracket at <tmp>:1 ------> "90">; say %h.keys.IntStr.sort{.Numeric}⏏); |
||
tbrowder__ | m: my %h = set <"100" "8">; say %h.keys.IntStr.sort({.Numeric}); | 18:42 | |
camelia | No such method 'IntStr' for invocant of type 'Seq' in block <unit> at <tmp> line 1 |
||
tbrowder__ | m: my $a = "100"; sat $a.IntStr | 18:44 | |
camelia | ===SORRY!=== Error while compiling <tmp> Undeclared routine: sat used at line 1. Did you mean 'set', 'say'? |
||
[Coke] | . | 18:45 | |
tbrowder__ | m: my %h = set <"100" "8">; say %h.keys.Int.sort({.Numeric}); | 18:47 | |
camelia | (2) | ||
[Coke] | do you mean >>.IntStr ? | ||
calling Int on a list == calling .elems | |||
tbrowder__ | wrong order... | 18:48 | |
maybe needs a .map | 18:54 | ||
[Coke] | sure, >>.foo is like .map(*.foo) | 18:56 | |
tbrowder__ | ok, trying offline... | 19:00 | |
never was very good at golf! | 19:07 | ||
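(For the record, a couple of spellings that do sort string keys numerically; sort with a one-argument callable uses it as a key extractor, so no two-argument comparator is needed:)

    my %h = set <100 90 8>;
    say %h.keys.sort(*.Int);     # (8 90 100)
    say %h.keys».Int.sort;       # (8 90 100)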
xinming_ | Is there any dynamic var builtin for a loop? | 19:15 | |
Something like, $?LOOP-COUNT etc.. | |||
nemokosch | what would that do? | 19:29 | |
[Coke] | m: my @a = 'a'..'q'; for @a.kv -> ($k,$v) { say $k, $v } | 19:44 | |
camelia | Cannot unpack or Capture `0`. To create a Capture, add parentheses: \(...) If unpacking in a signature, perhaps you needlessly used parentheses? -> ($x) {} vs. -> $x {} or missed `:` in signature unpacking? -> &c:(Int) {} in block <unit> at … |
||
[Coke] | m: my @a = 'a'..'q'; for @a.kv -> $k,$v { say $k, $v } | 19:45 | |
camelia | 0a 1b 2c 3d 4e 5f 6g 7h 8i 9j 10k 11l 12m 13n 14o 15p 16q |
||
[Coke] | that's not quite what you're asking | 19:46 | |
m: my @a = 'a'..'q'; for @a -> $v { FIRST state $k; say $k++, $v # this either | 19:47 | ||
camelia | ===SORRY!=== Error while compiling <tmp> Missing block at <tmp>:1 ------> RST state $k; say $k++, $v # this either⏏<EOL> expecting any of: postfix statement end statement modifier stat… |
||
[Coke] | m: my @a = 'a'..'q'; for @a -> $v { FIRST state $k; say $k++, $v } # this either | ||
camelia | 0a 1b 2c 3d 4e 5f 6g 7h 8i 9j 10k 11l 12m 13n 14o 15p 16q |
||
xinming_ | m: sub a (:&t:(Array)) { my @a = <a b>; t(@a); }; sub x (@array) { @array.raku.say; }; a(:t(&x)) | ||
camelia | Signature constraint check failed in binding to parameter '&t'; expected :(Array $) but got :(@array) in sub a at <tmp> line 1 in block <unit> at <tmp> line 1 |
||
[Coke] | (the FIRST is useless there, nevermind) | ||
xinming_ | m: sub a (:&t:(Array)) { my \a = <a b>; t(\a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)) | 19:48 | |
camelia | Signature constraint check failed in binding to parameter '&t'; expected :(Array $) but got :(@a) in sub a at <tmp> line 1 in block <unit> at <tmp> line 1 |
||
xinming_ | in this case, how do we enforce an Array type in sub-signatures, please? | 19:49 |
m: sub a (:&t:(Array)) { my @a = <a b>; t(2a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)) | |||
camelia | ===SORRY!=== Error while compiling <tmp> Unable to parse expression in argument list; couldn't find final ')' (corresponding starter was at line 1) at <tmp>:1 ------> sub a (:&t:(Array)) { my @a = <a b>; t(2⏏a); }; sub x (@a) { @a.… |
||
xinming_ | m: sub a (:&t:(Array)) { my @a = <a b>; t(@a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)) | 19:50 | |
camelia | Signature constraint check failed in binding to parameter '&t'; expected :(Array $) but got :(@a) in sub a at <tmp> line 1 in block <unit> at <tmp> line 1 |
||
xinming_ | m: sub a (:&t:(Array)) { my @a = <a b>; t($ = @a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)); # adding $ = @a in arg call also raises the error | 19:52 | |
camelia | Signature constraint check failed in binding to parameter '&t'; expected :(Array $) but got :(@a) in sub a at <tmp> line 1 in block <unit> at <tmp> line 1 |
||
tbrowder__ | ref sorting JSON::Fast: looking at the code, lizmat implemented the sorting, and it looks to me like it only permits the $^a cmp $^b style.. | 20:01 |
[Coke] | looking at the code, I'm not sure what's expecting a pair, but yah, I can't get it to work with just .Int either. | 20:20 | |
I'd write a sub that does what you want and pass that, so you don't need it all in the invocation of to-json | 20:21 |
[Coke] | m: sub inter($a, $b) {$a.key.Int cmp $b.key.Int}; my %a = (80=>1, 100=>2, 10=>30); use JSON::Fast; my $b = to-json(%a, :pretty, :sorted-keys(&inter)); say $ | 20:25 | |
camelia | ===SORRY!=== Error while compiling <tmp> Could not find JSON::Fast in: /home/camelia/.raku /home/camelia/rakudo-m-inst-2/share/perl6/site /home/camelia/rakudo-m-inst-2/share/perl6/vendor /home/camelia/rakudo-m-inst-2/sh… |
||
[Coke] | here, this generates { "10": 30, "80": 1, "100": 2 } but pretty | 20:26 | |
tbrowder__ | hm, thnx, looks good, but not working for me yet | 21:14 |
still trying... | 21:15 |
Voldenet | you can't sort keys in json | 21:20 | |
you can attempt, but almost no json parser makes guarantees of keys being sorted | |||
m: my %a = (80=>1, 100=>2, 10=>30); use JSON::Fast; my $b = to-json(%a.List.sort(*.key <=> *.key).map(*.kv), :pretty); say $b | 21:23 | ||
camelia | ===SORRY!=== Error while compiling <tmp> Could not find JSON::Fast in: /home/camelia/.raku /home/camelia/rakudo-m-inst-2/share/perl6/site /home/camelia/rakudo-m-inst-2/share/perl6/vendor /home/camelia/rakudo-m-inst-2/sh… |
||
tbrowder__ | erg, i have a mixed bag, mostly IntStrs, but a few Str only. trying to fancify my sub | ||
ok, got it working! thanks for the help, [Coke]! | 21:37 | ||
i seem to remember this dance now: my sub does an elsif on ~~ Int or Str, and then either uses cmp or <=> on two same-type keys, or just orders them by my preference (all Str before all Int) | 21:41 |
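(A rough reconstruction of that comparator, written in the two-Pair style of [Coke]'s earlier example; by-key and the sample hash are made up, and the exact argument JSON::Fast hands to the :sorted-keys Callable is assumed to match that example.)

    use JSON::Fast;

    # Compare two Pairs by key: integer-looking keys numerically, plain string
    # keys lexically, and all plain strings ahead of all integer-like keys.
    sub by-key($a, $b) {
        my $an = so $a.key ~~ / ^ \d+ $ /;
        my $bn = so $b.key ~~ / ^ \d+ $ /;
        return $a.key.Int <=> $b.key.Int if  $an &&  $bn;
        return $a.key     cmp $b.key     if !$an && !$bn;
        return $an ?? More !! Less;
    }

    my %h = '80' => 1, '100' => 2, '10' => 30, 'name' => 'x';
    say to-json(%h, :pretty, :sorted-keys(&by-key));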
Voldenet: ye of little faith, Raku rules! | 21:43 | ||
Voldenet | it's not about raku strictly, but about json | 21:44 | |
either way, arrays are a better way of expressing anything in json | 21:45 |
arrays are usually faster to parse, accept more datatypes as 'keys' | |||
and must preserve ordering | 21:46 |
that last property is the most useful, since you can have a discriminator field as the first field in the array | |||
so '{ "type": "SomeBoringDto", "field1": 1, "field2": 2 }' -> '["SomeBoringDto", "field1", 1, "field2", 2]' | 21:47 | ||
tbrowder__ | yes, i think i see, but, for my use case, the sorted hash is just what i want. it would help a bunch if you could put some good examples in the docs or the JSON::Fast module. | 21:59 | |
lizmat | PRs welcome :-) | 22:00 | |
tbrowder__ | yepper! | ||
xinming_ | m: sub a (:&t:(Array)) { my @a = <a b>; t(@a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)) | 22:39 | |
camelia | Signature constraint check failed in binding to parameter '&t'; expected :(Array $) but got :(@a) in sub a at <tmp> line 1 in block <unit> at <tmp> line 1 |
||
xinming_ | m: sub t (Array $a) { $a.raku.say; }; my @a = <a b>; t(@a); | ||
camelia | $["a", "b"] | ||
xinming_ | Does anyone here know why the latter worked, but the :&t:(Array) version doesn't work as expected? | 22:40 |
How can I make the former example work | |||
m: sub a (:&t:(Array)) { my @a = <a b>; t(@a); }; sub x (Array \a) { a.raku.say; }; a(:t(&x)) | 22:43 | ||
camelia | ["a", "b"] | ||
xinming_ | I figured out a version which seems to work, but it's quite confusing | ||
m: sub a (:&t:(Array)) { my @a = <a b>; t(@a); }; sub x (Array $a) { $a.raku.say; }; a(:t(&x)) | 22:44 | ||
camelia | $["a", "b"] | ||
xinming_ | m: sub a (:&t:(@x)) { my @a = <a b>; t(@a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)) | 22:51 | |
camelia | ["a", "b"] | ||
xinming_ | Now, I'm quite confused by @x vs Array | 22:52 | |
nemokosch | well there might be a reason people don't like to talk about this stuff | 23:01 | |
I suspect if the community ever took the type system seriously, that would have led to the impression that @ really does more harm than good | 23:02 | ||
needless to say, that's only my 2 cents, although I have a reason to think that | 23:03 | ||
xinming_ | I think I can understand the reason now: Array $a is not the same as @a | 23:04 |
with Array in the signature, it expects a scalar container around the Array, whereas @a means the Array itself | 23:05 |
the docs do mention that Array/List was the most difficult part to design, and I believe it. | 23:06 |
Voldenet | @a means the array container around the Array | 23:07 |
m: my $s := []; my @a := $s; $s.push(42); dd @a | 23:08 | ||
camelia | [42] | ||
nemokosch | it's debated whether that can be considered a container or not. From a certain point of view it is (it can supply assignment semantics via the STORE method), from another it isn't (its VAR method reports the same value back, same for decontainerisation) | 23:09 |
Voldenet | this is enormously confusing and a source of errors, so I usually use scalars everywhere out of p5 habit | ||
nemokosch | I tend to forget how it is for function parameters, iirc it's not quite like for variables - but one thing is sure | 23:10 | |
Scalar is an extra feature, a $variable is net more capable than a @variable | 23:11 | ||
Voldenet | btw, this example has tons of problems: `sub a (:&t:(Array)) { my @a = <a b>; t(@a); }; sub x (@a) { @a.raku.say; }; a(:t(&x))` | 23:12 | |
s/tons of/two/ | 23:13 | ||
xinming_ | Voldenet: Well, It's just for testing purpose. But glad to hear the corrections | ||
Voldenet | `<a b>` produces a List, and `x` consumes anything | ||
xinming_ | That's why I assign <a b> to @a | 23:14 | |
nemokosch | @a will be an Array containing two strings, I see no problem with that | ||
xinming_ | m: sub x (@a) { @a.raku.say }; x(1,2,3); | ||
camelia | ===SORRY!=== Error while compiling <tmp> Calling x(Int, Int, Int) will never work with declared signature (@a) at <tmp>:1 ------> sub x (@a) { @a.raku.say }; ⏏x(1,2,3); |
||
xinming_ | m: sub x (@a) { @a.raku.say }; x((1,2,3),); | 23:15 | |
camelia | (1, 2, 3) | ||
Voldenet | makes sense | ||
nemokosch | pretty sure assignment to @vars itself is subject to the "single argument rule" | ||
xinming_ | I know that in these 2 days my mental model of Array/List in raku has been refreshed. :-) | ||
nemokosch | so no nesting happens, instead, the content of the List is unwrapped straight into the Array @a declared | ||
Voldenet | m: sub a (:&t:(@)) { my @a = <a b>; t(@a); }; sub x (@a) { @a.raku.say; }; a(:t(&x)) | 23:16 | |
camelia | ["a", "b"] | ||
xinming_ | I think explaining that @ is not the same as Array in a signature may be a headache for both the explainer and new raku users. | 23:17 |
Voldenet | simply using $ everywhere makes explaining and code simpler | 23:18 | |
nemokosch | I agree, honestly | ||
Voldenet | m: sub a (:&t:(Array)) { my $a = <a b>.Array; t($a); }; sub x (Array $a) { $a.raku.say; }; a(:t(&x)) | ||
camelia | $["a", "b"] | ||
Voldenet | it's so obvious | ||
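(Collecting the two working variants camelia showed above, purely as a recap; same code, just renamed.)

    # Constrain on the sigil: :(@) matches a callback declared with an @-parameter.
    sub a1 (:&t:(@))     { my @a = <a b>; t(@a) }
    sub x1 (@a)          { @a.raku.say }
    a1(:t(&x1));                          # ["a", "b"]

    # Constrain on Array and give the callback a $-sigiled Array parameter,
    # so that its signature really is :(Array $).
    sub a2 (:&t:(Array)) { my @a = <a b>; t(@a) }
    sub x2 (Array $a)    { $a.raku.say }
    a2(:t(&x2));                          # $["a", "b"]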
nemokosch | especially if you are keen on typing | ||
if you are keen on typing, I wouldn't even be trying with other sigils in your place, maybe the "unsigil" every now and then, that can be handy | 23:19 | ||
xinming_ | well it's perl, I like @ and % | ||
nemokosch | from what I heard, they were even more evil in Perl so maybe you just like to suffer 😛 | 23:20 | |
Voldenet | passing @ in perl was pure suffering | ||
that's where you learn to use scalars everywhere | |||
nemokosch | anyway, they are absolutely not like type info or interface annotation in Raku either | 23:21 | |
Voldenet | Well, it's understandable when you know that passing @ actually passes contents of the array and not the ref | ||
nemokosch | in Raku, the sigils are like metadata about how much of a variable the symbol should be - that, aaaand a convenient mixture of default value and interface annotation... | 23:22 | |
Voldenet | …leads you to this madness | ||
m: my $x = [1, 2]; for $x { .say } | 23:23 | ||
camelia | [1 2] | ||
Voldenet | that's the most "perl5" thing in raku | 23:26 | |
nemokosch | well well, I think that's a surprisingly apparent mistake... the concept of "you can assign a single value to this" and "it is an atomic value" got mixed up | 23:26 | |
I wonder if "itemization" is ever useful but even if it is, I wonder what urged somebody to mistake it for "scalarization", in the mutability sense | 23:26 | ||
leont isn't sure they ever truly grokked itemization | 23:27 | ||
japhb | m: my \x = [1, 2]; for x { .say } | 23:28 | |
camelia | 1 2 |
||
japhb | If you want to avoid itemization, you need merely say so. :-) | ||
nemokosch | but that's the thing - the Scalar container does two things at once | 23:29 | |
you have to opt out from something that you thought merely provided mutability | |||
Voldenet | mutability should be opt-in in the first place | ||
nemokosch | and being able to downright store an Array or whatever into one thing can be useful | 23:30 | |
so you might have valid reasons to use $var to hold a complex data type | |||
it's rather obscure that you need to do this "decontainerization dance" when you merely want to iterate on it | 23:31 | ||
tbrowder__ | Voldenet: interesting about the json array: is it faster than just reading a formatted flat file? | ||
nemokosch | ... especially since most (but again, not all...) methods don't care about containerization at all, so it would be convenient to settle on for $x { ... } and $x.map({ ... }) reliably doing the same thing | 23:32 | |
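(A few equivalent ways, all standard Raku, to iterate the contents of an itemized array held in a $ variable, for comparison with the sigilless binding above; each of these prints 1 then 2.)

    my $x = [1, 2];

    for @$x     { .say }   # coerce with the @ sigil
    for $x<>    { .say }   # explicit decontainerization
    for $x.list { .say }   # ask for the list explicitly
    $x.map(*.say);         # most methods ignore the item container anyway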
Voldenet | tbrowder__: probably no, because flat file can be optimized further | 23:33 | |
Voldenet | json needs to use different code for unescaping strings and parsing floats and numbers, but integers can be simple numbers with separator | 23:36 | |
otoh writing .slurp-rest.split(",").map(*.Int) is probably going to be slower | 23:38 | ||
since a json deserializer can stream files in a wiser way | |||
tbrowder__ | ref flat file: yeah, that's kinda been my experience. but i think the json hash suits the data size i'm looking at. esp. for read, update, write. | ||
and search | 23:39 | ||
Voldenet | if I needed read, update and write, I'd just use sqlite | ||
it's orders of magnitude more flexible and faster | |||
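(A minimal sketch of that SQLite route, assuming the DBIish module is installed; the database file, table and values are made up.)

    use DBIish;

    # Open (or create) a local SQLite database file.
    my $dbh = DBIish.connect('SQLite', :database<store.sqlite3>);

    $dbh.prepare('CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)').execute;

    # Upsert a value, then read it back.
    my $put = $dbh.prepare('INSERT OR REPLACE INTO kv (k, v) VALUES (?, ?)');
    $put.execute('answer', '42');

    my $get = $dbh.prepare('SELECT v FROM kv WHERE k = ?');
    $get.execute('answer');
    say $get.row;    # (42)

    $dbh.dispose;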
tbrowder__ | i've thought about that (and used it in the past) but for me that will be a later optimization | 23:40 |
Voldenet | (btw, that full string deserialization is a problem that happens in some json C deserializers that simply do strlen which is O(n) in C…) | ||
Voldenet | in fact, if performance is important, then json is not the best format | 23:42 | |
japhb | Formats that have length-specified strings and arrays don't (naturally at least) suffer from that problem, so e.g. protobuf and CBOR don't face the O(n) skipping issue. | ||
Voldenet | yeah, I was going to suggest protobuf which even has raku module | ||
japhb | And avoiding escaping and unescaping is really nice too. | ||
Voldenet | capnp and flatbuffers are going to be even faster, theoretically they can avoid deserializing and mmap into huge files | 23:44 | |
japhb | Oh, as a side note: automagic JSON and CBOR handling are already part of Cro. So if you are using that, well, there you go. | ||
tbrowder__ | i'm looking at a need for a single-user data store, so i'm happy with the perf i've seen so far. | ||
Voldenet | though the path I'd use is json -> sqlite | 23:45 | |
japhb | Makes sense | ||
tbrowder__ | ok, thnx | ||
Voldenet | sqlite if full json parse/write takes too much time | ||
japhb | ... or PostgreSQL if you want to do the same thing with some JSON- or JSONB-valued individual fields | 23:46 | |
But that's a much bigger sledgehammer. | 23:47 | ||
Voldenet | I've used mysql/postgres in some projects and I regret maintenance costs | 23:48 | |
usually where I should've used flat file to begin with | |||
japhb | MySQL I agree, having used that since Ye Olden Dayes. PostgreSQL I've had less trouble with. | 23:49 | |
Voldenet | (not mysql, but mariadb now) | ||
(still, not much difference in practice) | |||
japhb | Nodnod | ||