MasterDuke | Zoffix: nqp::dd()++ | 00:32 | |
m: say Hash.^attributes[0].perl | 00:58 | ||
camelia | rakudo-moar 13f479: OUTPUT«No such method 'perl' for invocant of type 'BOOTSTRAPATTR' in block <unit> at <tmp> line 1» | ||
MasterDuke | m: say Hash.^attributes[1].perl | ||
camelia | rakudo-moar 13f479: OUTPUT«Attribute.new» | ||
MasterDuke | what would be a good value for the .perl of a BOOTSTRAPATTR? | 00:59 | |
timotimo | we don't necessarily want to give bootstrapattr new methods | 01:02 | |
m: say Hash.^attributes>>.name | |||
camelia | rakudo-moar 13f479: OUTPUT«($!descriptor $!storage)» | ||
timotimo | and you can always get that ^ | ||
MasterDuke | why not? | ||
timotimo | not really sure; i believe we had this discussion before (also, before #perl6-dev came to be) | 01:03 | |
MasterDuke | i'm looking at RT #77070 | 01:04 | |
synopsebot6 | Link: rt.perl.org/rt3//Public/Bug/Displa...l?id=77070 | ||
MasterDuke | i've added a perl() and gist() locally, pretty sure it didn't break any spectests | 01:05 | |
timotimo | wow, that original one is from 2010 :) | ||
MasterDuke | yeah, behavior changed a couple different times | 01:07 | |
but it does seem LTA that you can .gist/.perl some attributes, but not all | 01:08 | ||
m: .perl.say for Hash.^attributes.reverse | 01:09 | ||
camelia | rakudo-moar 13f479: OUTPUT«Attribute.newNo such method 'perl' for invocant of type 'BOOTSTRAPATTR' in block <unit> at <tmp> line 1» | ||
timotimo | ideally, bootstrapattr would be derived from Any, but bootstrapping issues prevent that | 01:10 | |
you can not only not .gist or .perl BootstrapAttr, you can also not .Str, .say, .print, .note, .elems, .Int, .Bool, ... them | 01:11 | ||
Zoffix | m: multi foo (Int $x, int $y) {say "Int + native" }; foo 2, 2; | 01:12 | |
camelia | rakudo-moar 13f479: OUTPUT«Cannot resolve caller foo(Int, Int); none of these signatures match: (Int $x, int $y) in block <unit> at <tmp> line 1» | ||
Zoffix | m: sub foo (Int $x, int $y) {say "Int + native" }; foo 2, 2; | ||
camelia | rakudo-moar 13f479: OUTPUT«Int + native» | ||
Zoffix | I wrote down the rules for how multies with natives will be figured out. gist.github.com/zoffixznet/4a63358...82730c6486 | 01:13 | |
timotimo | anyway, i'm now going to bed | ||
Zoffix | This is needed to fix RT#128655 in Routine.analyze_dispatch that messes things up during optimization, but it looks like just regular multi dispatch is LTA too, so I'll try to fix that too | 01:14 | |
synopsebot6 | Link: rt.perl.org/rt3//Public/Bug/Displa...?id=128655 | ||
MasterDuke | .tell jnthn do you have any thoughts on adding missing methods to BOOTSTRAPATTR? irclog.perlgeek.de/perl6-dev/2016-...i_13368379 for reference | 01:20 | |
yoleaux2 | MasterDuke: I'll pass your message to jnthn. | ||
Zoffix | m: say :⒗<a> | 01:29 | |
camelia | rakudo-moar 13f479: OUTPUT«===SORRY!===Argument to "say" seems to be malformedat <tmp>:1------> say⏏ :⒗<a>Confusedat <tmp>:1------> say :⏏⒗<a> expecting any of: colon pairOther potential difficulties: Unsupported…» | ||
MasterDuke | rt.perl.org/Ticket/Display.html?id=129319 | 01:31 | |
dalek | kudo/nom: 2d3ff66 | (Zoffix Znet)++ | src/core/Exception.pm: Stringify objects in Exceptions::JSON Some exceptions (like X::CompUnit::UnsatisfiedDependency) have attributes with objects and not just strings. When we give that to the JSON encoder, it chokes when trying to serialize those objects. Change those to Str, to avoid the issue. Fixes RT#129810: rt.perl.org/Public/Bug/Display.html?id=129810 |
01:59 | |
synopsebot6 | Link: rt.perl.org/rt3//Public/Bug/Displa...?id=129810 | ||
Zoffix | hmmm | 02:01 | |
dafuq | 02:11 | ||
m: say now.Date, Date.today | |||
camelia | rakudo-moar 2d3ff6: OUTPUT«2016-10-102016-10-10» | ||
Zoffix | Gives me 2016-10-102016-10-09 in a local build :/ | 02:12 | |
geekosaur | timezone fun? | 02:13 | |
Zoffix | prolly | ||
Yup | 02:18 | ||
Date.today uses local TZ and now uses UTC | |||
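A minimal sketch of the mismatch being discussed, using only the standard Date/DateTime API (output depends on the system timezone):

    say now.Date;          # derived from an Instant, effectively the UTC date
    say Date.today;        # the date in the system's local timezone
    say DateTime.now.Date; # also local, so this one agrees with Date.today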
dalek | ast: 8a2147f | (Zoffix Znet)++ | S32-temporal/DateTime-Instant-Duration.t: Fix broken coverage test The test incorrectly assumes Date.today uses the same timezone as `now`. However, `now` uses UTC, while `Date.today` uses system timezone, leading to this test breaking when the difference in timezones causes days to be different. |
02:22 | |
kudo/nom: e39229d | (Zoffix Znet)++ | t/spectest.data: Add S04-exceptions/exceptions-json.t to list of tests to run |
02:23 | ||
kudo/nom: c57a26e | (Zoffix Znet)++ | src/core/Exception.pm: Do not stringify values that JSON can handle Preserve null, boolean, and numerics, since those can be represented in JSON. Stringify all else. |
02:26 | ||
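A hypothetical helper illustrating the idea behind that commit (not the actual Exception.pm code): keep values JSON can represent natively and stringify everything else.

    sub maybe-stringify($value) {
        return $value unless $value.defined;        # type objects / Nil end up as JSON null
        return $value if $value ~~ Bool | Numeric;  # JSON booleans and numbers pass through
        $value.Str                                  # everything else gets stringified
    }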
Zoffix | m: %*ENV<RAKUDO_EXCEPTIONS_HANDLER>="JSON"; "dasdsad".EVAL | 02:37 | |
camelia | rakudo-moar c57a26: OUTPUT«Unhandled exception: This representation (P6int) cannot unbox to a native string (for type BOOTInt) at gen/moar/m-CORE.setting:17361 (/home/camelia/rakudo-m-inst-2/share/perl6/runtime/CORE.setting.moarvm:) from gen/moar/m-CORE.setting:17356 (/hom…» | ||
Zoffix | tsk tsk. I suck. | ||
dalek | kudo/nom: 565b528 | (Zoffix Znet)++ | src/core/Exception.pm: Do not crash when handling exceptions with no `message` method |
02:40 | |
ast: ff86ab8 | (Zoffix Znet)++ | S04-exceptions/exceptions-json.t: Test Exceptions::JSON can handle exceptions with no `message` |
02:57 | ||
Zoffix | >_< | ||
dalek | kudo/nom: 3b5ef07 | (Zoffix Znet)++ | src/core/Exception.pm: Fix stringification with Exceptions::JSON Old method still crashed on certain exceptions. Watch for things Rakudo::Internals.to-json can handle and pass them as is, the rest, stringify. |
03:02 | |
ast: 412539a | (Zoffix Znet)++ | S04-exceptions/exceptions-json.t: Add more Exceptions::JSON tests |
|||
Zoffix | Oh.. how did I get this distracted. I was going through commits and new tickets to prep for release next week :P | 03:06 | |
MasterDuke, so the rt.perl.org/Ticket/Display.html?id=129319 can be closed? I recall there was some sort of conversation with TimToady that it wasn't worth it trying to detect such usages to give better errors. | 03:43 | ||
This seems to be the revert for it: github.com/rakudo/rakudo/commit/31...efdcc31c9b | 03:52 | |
NeuralAnomaly, stats | 04:18 | ||
NeuralAnomaly | Zoffix, [✔] Next release will be in 5 days. Since last release, there are 39 new still-open tickets (0 unreviewed and 0 blockers) and 0 unreviewed commits. See perl6.fail/release/stats for details | ||
Zoffix | \o/ | ||
dalek | kudo/nom: d034599 | (Zoffix Znet)++ | docs/ChangeLog: Add all changes to date Documents commits: c4fd9f5 8fb9ec9 dad57b0 9b6f2eb c78f5dc 08ead04 7e35062 4b1864b 96df2d7 77a2ff1 3789a07 e9409cc fef3655 4bcd7e0 553cedb e12ebb9 b3c92ba 539a7d1 3623490 b4a4b60 e4e8238 8f2279b 9dcde75 3448c71 c2455ca 04f4b76 3d2a919 3aa7254 ca93ac9 6ef4cdf ff12748 2cad3d2 f6524e6 5f91031 2673ca8 84b7ebd 28bf874 2dd6230 b77d2b7 4abc28c 6aab641 6c07321 1e6c465 2a2f26c 7a33c2c 7a50c30 f72cc62 798c2e2 8f14219 2d3ff66 |
||
Zoffix | 🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺 | 04:20 | |
REMINDER: Rakudo 2016.10 will be released next Saturday (Oct. 15). Please review the Changelog, to ensure your work has been correctly entered. | |||
🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺 | |||
psch, bartolin I added JVM stuff as "Many fixes and additions improving JVM backend support". If you feel more detailed log is needed, please add it. | 04:21 | ||
[Tux] | This is Rakudo version 2016.09-158-gd034599 built on MoarVM version 2016.09-39-g688796b | 06:12 | |
csv-ip5xs 3.252 | |||
test 17.304 | |||
test-t 7.908 | |||
csv-parser 18.530 | |||
This is Rakudo version 2016.09-158-gd034599 built on MoarVM version 2016.09-39-g688796b | 06:16 | ||
csv-ip5xs 3.481 | |||
test 17.581 | |||
test-t 7.441 | |||
csv-parser 19.776 | |||
as I thought 7.9 to be too high | |||
just ran it another two times | |||
And after restarting some very memory-eager windows | 06:34 | ||
This is Rakudo version 2016.09-158-gd034599 built on MoarVM version 2016.09-39-g688796b | |||
csv-ip5xs 3.439 | |||
test 16.893 | |||
test-t 7.090 | |||
csv-parser 17.944 | |||
which is kinda interesting: IP5 is less influenced than pure-perl6 | 06:35 | ||
Zoffix++ | 06:41 | ||
RabidGravy | FROGGS++ nice one on fixing the native call for the RPi | 10:47 | |
lizmat | Files=1146, Tests=53276, 213 wallclock secs (12.97 usr 4.13 sys + 1306.76 cusr 122.25 csys = 1446.11 CPU) | 11:10 | |
loks like a few % less than yesterday | 11:11 | |
*looks | |||
:-) | |||
[Tux]: pure perl 6 needs more memory, most likely, and thus is more affected by other applications that use memory? | 11:12 | ||
|Tux| | sounds legit | 11:13 | |
psch | r: my \x := gather do for ^3 { .take; LAST { .say } }; say +(x) | 12:06 | |
camelia | rakudo-jvm 2a1605, rakudo-moar d03459: OUTPUT«223» | ||
psch | all of that gather/phasers/CX stuff feels really weirdly broken? | ||
m: my \x := gather do for ^3 { .take; LAST { .say }; last }; say +(x) | |||
camelia | rakudo-moar d03459: OUTPUT«01» | ||
psch | explicit &last only fires the phasers once, but a phaser without a &last call fires twice..? | 12:07 | |
DrForr | [Tux]: Where can I find the latest benchmarks? | 12:09 | |
cygx | o/ | 13:16 | |
Zoffix | \o | 13:17 | |
cygx | jnthn: not sure if you've seen it, but as a result of yesterday's discussion with cowens, I wrote up another one of my Possibly Bad Ideas, cf gist.github.com/cygx/b545c206a0f7c...4b26afccf6 | 13:26 | |
jnthn | Feels odd that strict would die on completely valid utf-8 that just happens to not be in NFC. And warning on input that ain't NFC is noisy too. | 13:29 | |
yoleaux2 | 01:20Z <MasterDuke> jnthn: do you have any thoughts on adding missing methods to BOOTSTRAPATTR? irclog.perlgeek.de/perl6-dev/2016-...i_13368379 for reference | ||
jnthn | I'm not sure sticking the burden on every encoding is a good way to go | ||
cowens | I think I like that, but the big question (and the unresolved one from S15) is what to do when these different string types mix via operators | ||
cygx | jnthn: it would not warn on denormal input, just input that | ||
jnthn | If the user wants to work with codepoints instead of graphemes they should declare that. | 13:30 | |
cygx | ... is invalid according to a strict UTF-8 decoder | ||
dalek | kudo/js: 3cc214b | (Pawel Murias)++ | src/vm/js/ (2 files): [js] Implement nqp::p6bindassert. |
||
kudo/js: 9bbab73 | (Pawel Murias)++ | src/core/Failure.pm: Update JVM workaround to take into account there is another backend besides moar and jvm. |
|||
kudo/nom: 6283124 | lizmat++ | src/core/Str.pm: Scrape about 3% off Str.match It's not a lot, but Str.match appears high in basically all profiles that have any regexen in them |
|||
jnthn | Ah, that meaning :) | ||
cygx | eg we would warn for CESU-8 and modified UTF-8 | ||
dalek | p: 88892b0 | (Pawel Murias)++ | src/vm/js/Operations.nqp: [js] Make unbox_s decont it's argument. |
13:32 | |
p: 8a74d75 | (Pawel Murias)++ | t/nqp/067-container.t: Test that nqp::unbox_s works on containers. |
|||
travis-ci | NQP build failed. Pawel Murias 'Test that nqp::unbox_s works on containers.' | 13:37 | |
travis-ci.org/perl6/nqp/builds/166427877 github.com/perl6/nqp/compare/af751...74d75d03c2 | |||
cygx | I've been working on a toy model of the decoder, github.com/cygx/p6-newio/blob/master/decoder.p6 | 13:40 | |
I'll have to do some thinking how to incorporate the new proposal... | |||
jnthn | Well, in reality for the most common decodings we'll just want to use the existing VM-backed decoders for speed. | 13:45 | |
Though I guess this is more prototyping/API exploration? | 13:47 | ||
cygx | yes | ||
I've some vague ideas on how I want it to look, I'm trying to figure out if that's feasible | 13:48 | ||
jnthn | *nod* | ||
cygx | turns out my idea about having streams where you can freely intermix reading bytes, codes and graphs would work out, with a 2 slight caveats: | 13:52 | |
1. reading a fixed number of graphs is sub-optimal (it will decode more codes than necessary and needs to count graphemes in a separate step) | 13:53 | ||
2. ... | 13:54 | ||
ah, I remember: line separators with ambiguous representation would be problematic | |||
cf github.com/cygx/p6-newio/blob/c4dc...ing.pm#L46 | 13:56 | ||
dalek | kudo/nom: 831f437 | MasterDuke17++ | src/Perl6/Grammar.nqp: Simplify rad_number token a bit Pull out some copied code into their own tokens. |
14:00 | |
kudo/nom: 6531ef7 | lizmat++ | src/Perl6/Grammar.nqp: Merge pull request #900 from MasterDuke17/rad_number_cleanup Simplify rad_number token a bit |
|||
lizmat | jnthn: is method !cursor_more in nqp/src/QRegex/Cursor.nqp only intended for external usage, or also internal? | 14:01 | |
looks like it's only external, but maybe I'm not looking well enough ? | |||
jnthn | Well, on 1, by definition you need to read one more code in order to know whether you've reached a grapheme boundary. | 14:02 | |
lizmat: iirc it's only used by things implementing :g and similar | 14:04 | ||
lizmat: Pretty sure it's never called by code-gen | |||
lizmat | ok, so I could change the interface if need be ? | ||
jnthn | I'd think so | ||
lizmat | oki | ||
jnthn | I guess NQP has code that calls it too | ||
In its subst impl | ||
cygx | jnthn: to clarify, as it stands now, it would decode all available codes and would need to re-calculate boundaries the next time you request more graphemes | 14:06 | |
jnthn | "all available"? | 14:07 | |
And I guess you mean all available bytes into codes? | 14:08 | ||
cygx | yes | ||
jnthn | Note that problem 2 goes away if handles function at a particular level (you can mandate, for example, that the separators are unambiguous if the normalization mode is Uni; in all other cases you know how to interpret them) | 14:09 | |
cygx is afk for a bit | 14:10 | ||
jnthn | (where Uni means "leave it just as it is") | ||
MasterDuke | is there any reason the :package isn't Stash here? github.com/rakudo/rakudo/blob/nom/....nqp#L2817 | 14:26 | |
jnthn | MasterDuke: No, that looks like a thinko | 14:36 | |
MasterDuke | i thoughto | 14:37 | |
Zoffix | MasterDuke, [23:44:16] <Zoffix> MasterDuke, so the rt.perl.org/Ticket/Display.html?id=129319 can be closed? I recall there was some sort of conversation with TimToady that it wasn't worth it trying to detect such usages to give better errors. | 14:51 | |
MasterDuke | right, i was just looking for the last conversation i with TimToady about it to add as a comment before i close | 14:53 | |
close as reject i assume? | 14:55 | ||
Zoffix | Yeah | 14:56 | |
MasterDuke | done | 14:59 | |
Zoffix | Thanks. | 15:02 | |
dalek | kudo/nom: a15fe9c | lizmat++ | src/core/Cursor.pm: Port !cursor_more and !cursor_next from nqp Wanted to change the interface to !cursor_more specifically in nqp so that it wouldn't need to take a hash and do lookups. Especially since it only checked for <ex> to call !cursor_next and <ov> to see whether an overlap needed to be handled. The new Cursor.CURSOR_MORE now only checks for overlap: it is expected that the new .CURSOR_NEXT is called directly if exhaustive is set. |
15:12 | |
rakudo/nom: 92c0921 | lizmat++ | src/core/Str.pm: | |||
rakudo/nom: Use the new .CURSOR_MORE/CURSOR_NEXT in Str.match | |||
lizmat | This seems to improve "foofoofoofoofoo".match(/foo/,:g) by 10%. | ||
Again, not a lot but in a very hot path generally. | |||
review: github.com/rakudo/rakudo/commit/92...8583dad95f | 15:13 | ||
afk& | |||
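For reference, a tiny example of the hot path in question (plain Str.match with the :g adverb):

    my @matches = "foofoofoofoofoo".match(/foo/, :g);
    say @matches.elems;   # 5 non-overlapping matches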
|Tux| | DrForr, tux.nl/Talks/CSV6/speed4.html - sorry for the delay | 15:23 | |
top frame has a link to the timing log | 15:24 | ||
pmurias checks out travis failure | 15:29 | ||
dalek | p: 16e2df7 | (Pawel Murias)++ | src/vm/js/nqp-runtime/ (2 files): [js] Start evaling more specialized accessors to speed things up a bit. |
15:30 | |
p: 875d088 | (Pawel Murias)++ | t/nqp/067-container.t: Fix test count. |
|||
travis-ci | NQP build passed. Pawel Murias 'Fix test count.' | 15:34 | |
travis-ci.org/perl6/nqp/builds/166461998 github.com/perl6/nqp/compare/8a74d...5d0888ab0f | |||
cygx | jnthn: the problems I mentioned are specific to the modeless design because I want to keep zero-cost for the lower levels as well as manageable complexity | 15:47 | |
what I envision are line separators defined in terms of bytes (it's not implemented in the toy model nor in my API mockup) | 15:48 | ||
so any of lines(:bin), .lines(:uni) and lines can reuse the same implementation without overhead | 15:49 | ||
pmurias | jnthn: accessing/binding native attributes does no autovivification or other side effects (besides storing the native value if binding) | ||
cygx | jnthn: I'll keep experimenting, and will give an update once I get to more conclusive results... | 15:50 | |
(assuming you do not want to tackle the issue yourself Right Now) | 15:51 | ||
jnthn | pmurias: Correct (if you were looking for confirmation? :)) | 15:54 | |
pmurias | yes, forgot the ? | ||
I want to compile them to 'foo.attr$7' instead of foo.$$getattr$7() and foo.$$bindattr$7(value) | 15:57 | ||
cygx | jnthn: could you update the comment at the top of MoarVM's src/6model/reprs/MVMString.h if it is indeed out of date? | 16:08 | |
I'm assuming that at the very least, the someday-NFG buffer nowadays is NFG | |||
jnthn | cygx: Will take a look soon; bit tied up with $dayjob-task for the next 10-15 mins | ||
cygx | no worries, it can wait | 16:10 | |
it's just that the comment was apparently written pre-NFG and also references things that are supposed to happen 'later' | 16:11 | |
dalek | kudo/nom: e250a84 | MasterDuke17++ | src/Perl6/Metamodel/BOOTSTRAP.nqp: Correct a thinko |
16:21 | |
kudo/nom: e8c8af6 | MasterDuke17++ | src/Perl6/Metamodel/BOOTSTRAP.nqp: Convert some BOOTSTRAPATTRs to Attributes This is a first pass through BOOTSTRAP.nqp only changing attributes where all the attributes explicitly added to a class can be converted (i.e., it may still have some BOOTSTRAPATTRs from a parent class). This allows more attributes to be more easily worked with at the Perl 6 level. |
|||
rakudo/nom: f117a61 | (Zoffix Znet)++ | src/Perl6/Metamodel/BOOTSTRAP.nqp: | |||
timotimo | i'm a bit surprised we'll have overflow detection for infix:<*>(int, int) | 16:25 | |
Zoffix | Why? | 16:26 | |
timotimo | there has been arguments against that in the past, if i remember correctly | ||
Zoffix | Well, without the fix I made, it was returning 0 on overflow. | 16:27 | |
timotimo | mhm | ||
i hope it'll still inline everywhere | |||
Zoffix | Ah. That's why :| | 16:32 | |
timotimo | also, our sub can now return not only an int, but also an object | 16:33 | |
Zoffix | Yeah, and there's a bug with it too: rt.perl.org/Ticket/Display.html?id...et-history | 16:34 | |
m: sub (--> int) { Failure.new }() | |||
camelia | rakudo-moar f117a6: OUTPUT«This type cannot unbox to a native integer: P6opaque, Failure in sub at <tmp> line 1 in block <unit> at <tmp> line 1» | ||
Zoffix | m: sub (--> int) { return Failure.new }() | ||
camelia | rakudo-moar f117a6: OUTPUT«Failed in block <unit> at <tmp> line 1Actually thrown at: in block <unit> at <tmp> line 1» | ||
timotimo | mhm | ||
jnthn | infix:<*>(int, int) is not meant to upgrade to Int ever | 16:39 | |
And thus may overflow | |||
Note it can only ever be called if at least one of the arguments is *declared* as an int | ||
(Either both must be, or one must be a sufficiently small literal) | |||
Zoffix | jnthn, how is that overflow signaled | 16:40 | |
jnthn | It isn't | ||
int is *native* | |||
And unchecked | |||
Zoffix | jnthn, so 0 is correct answer on overflow? | 16:41 | |
jnthn | Could easily happen, yes | ||
If you wrote the same program in C with int (or long) you could see the same kind of behavior | |||
Zoffix | OK, then I'll revert my "fix" and undo another one that I stole it from. | ||
jnthn | (And a bunch of other languages) | 16:42 | |
Zoffix | The other one is infix:<**>(int $a, int $b) github.com/rakudo/rakudo/blob/nom/...nt.pm#L293 | 16:43 | |
jnthn | cygx: Updated that comment; didn't spot anything else out of date in the vicinity of it | 16:46 | |
cygx: On "stuff I was going to work on" - I've certainly made a start on encoding stuff, by exposing the decode stream API and using it to fix up IO::Socket::Async | 16:50 | ||
cygx: My plan from there was to do similar with Proc::Async | 16:51 | ||
cygx: Followed by dropping the char-level async I/O ops | |||
cygx: (At the VM level) | |||
Thus fixing some robustness problems and allowing the use of different encodings (the ones we already support) for Proc::Async and IO::Socket::Async | 16:52 | ||
Beyond that I hadn't really been planning to do other I/O refactors Really Soon because there's just so many other problems competing for my time | 16:53 | ||
But solving the ones I just mentioned would eliminate some of the nastiest things | ||
FROGGS | o/ | 16:54 | |
jnthn | I don't think that especially conflicts with much that you're doing | 16:55 | |
I'm relatively happy with the encoder/decoder API that I proposed, but was planning to keep any code implementing that under Rakudo::Internals for the moment | |||
So we'll still have the freedom to tweak it further | |||
cygx | jnthn: so not much chance of unnecessary or duplicated work if we eventually do go with a more extensive refactor | 17:03 | |
somewhat related, grapheme boundary detection does not seem to be exposed right now | 17:04 | ||
it might be nice to have an op for that... | |||
jnthn | Well, synthetics are never meant to leak out to user-space | 17:05 | |
Both codepoints -> grapheme string and vice versa are exposed. | 17:06 | ||
(Otherwise we shouldn't Str.Uni or Uni.Str) | |||
I guess there maybe are other uses for the op | |||
cygx | well, some people (read: me ;) ) might want to do their own chunking of codepoints into grapheme clusters | 17:07 | |
jnthn | All the data needed to do the calculation is exposed | 17:08 | |
At least, I can't think of anything that isn't | |||
DrForr | |Tux|: No worries, I was just curious. | ||
jnthn | If you're doing this for prototyping purposes, though, I'd just implement one or two of the rules | ||
Also note that .comb>>.NFD for example would get you a list of list of codepoints that make up each grapheme | 17:10 | ||
So you can piggy-back on the existing impl that way | 17:11 | ||
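For instance (using only .comb and .NFD, both of which are already exposed):

    .say for "éü!".comb>>.NFD;   # one list of decomposed codepoints per grapheme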
On separators, I'm a bit hesitant to try and deal with them at byte level | 17:12 | ||
It's possible that a given grapheme might expode to a whole range of possible codepoint forms | |||
*explode | |||
e.g. something with 3 combiners on could easily be represented in 6 different ways | 17:14 | |
cygx | .oO(or just expose should_break from normalize.c as an op) |
17:15 | |
I just thought that would be easier, but if you're opposed to providing that, I can deal | |||
MasterDuke | jnthn: i'm now trying some of the attributes that didn't work with just s/BOOTSTRAPATTR/Attribute/. if i change Hash's Mu $!descriptor, rakudo compiles, but attempting to use moar to run even -e '' gives: X::TypeCheck::Assignment exception produced no message | 17:16 | |
jnthn | cygx: Well, trying to keep the number of ops from getting too huge, is all... | 17:17 | |
cygx: I'm also somewhat concerned about exposing should_break because as of Unicode 9 it seems you perhaps cannot determine it just by looking at the two codepoints on either side :/ | 17:19 | ||
travis-ci | Rakudo build failed. Zoffix Znet 'Merge pull request #901 from MasterDuke17/remove_BOOTSTRAPATTR_where_possible | 17:20 | |
travis-ci.org/rakudo/rakudo/builds/166476243 github.com/rakudo/rakudo/compare/9...17a61595fa | |||
buggable | [travis build above] ☠ Did not recognize some failures. Check results manually | ||
Zoffix | oh shit | 17:21 | |
Seems that busted JVM | |||
MasterDuke | oops, my bad | ||
Zoffix | :) | ||
jnthn | See www.unicode.org/reports/tr29/#Graph...dary_Rules for details. Note how it has rules like "That is, do not break between regional indicator (RI) symbols if there is an odd number of RI characters before the break point." | ||
But if you've just got two codepoints, you can't know if there's more than one RI before :/ | 17:22 | ||
Despite that the note below says "Grapheme cluster boundaries can be easily tested by looking at immediately adjacent characters." | |||
Which I'd really like to believe but can't see how is true with the RI rules | 17:23 | ||
timotimo | right :| | 17:24 | |
cygx | I'm not quite sure if I even understand the intention | 17:29 | |
the regex version just has it as Regional_Indicator+ | |||
but according to the abstract description, I think you'd at the very least have to remember if there's a break right before our pair of codes | 17:30 | ||
from looking at wikipedia, the idea is apparently that regional indicators always should come in pairs | 17:32 | ||
the simplified rules: at the beginning of a new grapheme cluster, do not break between a pair of regional indicators; otherwise, do | 17:38 | ||
timotimo | but how do you tell if you're inside a pair of regional indicators or between two pairs? | ||
well, if you carry a little flag that tells you if you're %% 2 away from the start marker of a regional indicator perhaps? is there such a "start of RI" thing? | 17:40 | ||
jnthn | dinner time, bbl | 17:45 | |
cygx | yeah, I thought that was unnecessary - but the problem with that is prepend characters... | ||
timotimo | the what now? | 17:46 | |
cygx | Prepend: Indic_Syllabic_Category = Consonant_Preceding_Repha, or Indic_Syllabic_Category = Consonant_Prefixed, or Prepended_Concatenation_Mark = Yes | 17:50 | |
timotimo | i don't know anything about those :| | 17:51 | |
cygx | the point is, the first regional indicator always starts a new cluster unless it follows a prepend character | ||
(or another regional indicator, of course) | 17:52 | ||
timotimo | so it's really legal to have a prepend character in front of a regional indicator? | 17:53 | |
cygx | I don't see why it should not be | ||
Prepended_Concatenation_Mark: A small class of visible format controls, which precede and then span a sequence of other characters, usually digits. These have also been known as "subtending marks", because most of them take a form which visually extends underneath the sequence of following digits. | 17:54 | ||
timotimo | hmm | ||
and is it allowed to have combined marks on regional indicator letters, too? | |||
like a regional indicator ÄÄ | 17:55 | ||
cygx | no, that would break between the combinator and the 2nd indicator | ||
timotimo | hmm | 17:56 | |
cygx | you can have stuff like [prepend indicator indicator extend] and [prepend indicator indicator extend] as single clusters, but no extend between indicators | 17:58 | |
up... the second one should only have one indicator within | |||
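A rough Perl 6 sketch of the regional-indicator part of those rules (GB12/GB13), written only to illustrate why two adjacent codepoints are not enough context; the sub names and the True-means-break convention are invented for this example:

    sub is-ri($code) { 0x1F1E6 <= $code <= 0x1F1FF }  # REGIONAL INDICATOR SYMBOL LETTER A..Z

    # should we break between @codes[$i - 1] and @codes[$i]?
    sub break-here(@codes, $i) {
        return True unless is-ri(@codes[$i - 1]) && is-ri(@codes[$i]);
        my $run = 0;                                   # length of the RI run ending at $i - 1
        $run++ while $i - 1 - $run >= 0 && is-ri(@codes[$i - 1 - $run]);
        $run %% 2;                                     # odd run so far: keep this pair together
    }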
dalek | kudo/worry_broken_heredoc_stopper: d184fcd | timotimo++ | src/Perl6/Grammar.nqp: warn when a heredoc stopper has rubbish after it like accidentally included semicolon or something |
||
timotimo | cygx: can i put combining marks at the end of a pair of regional indicators? | 17:59 | |
like a germany flag with heavy metal umlauts on top of it? | |||
cygx | yes | ||
timotimo | tremendous | 18:00 | |
cygx | or after a single one, but it will no longer be combined with a following second indicator | ||
timotimo | .u strikethrough | 18:01 | |
yoleaux2 | U+1D7A LATIN SMALL LETTER TH WITH STRIKETHROUGH [Ll] (ᵺ) | ||
timotimo | .u strike | ||
yoleaux2 | U+1D7A LATIN SMALL LETTER TH WITH STRIKETHROUGH [Ll] (ᵺ) | ||
timotimo | ... | ||
18:02 | |||
that's the one i want, though it doesn't show | |||
MasterDuke | i was just thinking about the stack trace that gets generated with --ll-exception (or in the profile) | 18:06 | |
it mentions the lines in m-CORE.settings, which is good, but then i have to open up the settings and find the line, and then if i want to change something i have to find out what file it came from | |||
i remember lizmat did some stuff to get better correspondence between settings and the original source file (for Zoffix's Sourcery module i think, or maybe the coverage reports) | |||
timotimo | twitter.com/loltimo/status/785542004100988928 | 18:07 | |
MasterDuke | how difficult would it be to get the error message to also report the original source file? the line there would be even better, but just the file would be a start | ||
cygx | timotimo: lol, timo ;) | 18:11 | |
Zoffix already had that conversation. | 18:12 | ||
My opinion is it's fine the way it is. Because returning the line of the original source file doesn't give you the actual location of anything, whereas the line in the setting file does. | |||
Sourcery has an additional critical piece of information: the commit sha. Without it, you can't actually locate anything correctly. And that sha is obtained from $*PERL.compiler.version, and *it's actually too short*. | 18:13 | ||
I doubt I'll find it now, but I did encounter a case where the sha was too short and there was more than one commit matching it. | 18:14 | |
So even the Sourcery doesn't give you 100% accurate location of the source code, but the core location is 100% accurate. | |||
s/core/CORE-setting/ | 18:15 | ||
MasterDuke | don't know what you mean by "the line of the original source file doesn't give you the actual location of anything"? | 18:17 | |
Zoffix | MasterDuke, what code is at src/core/Int.pm at line 142? | 18:18 | |
MasterDuke | "return Nil if self == 0;", but i assume you're making some sort of point, i'm just not getting it | 18:19 | |
Zoffix | MasterDuke, no, you're wrong. The correct answer is "when int16 { Range.new( -32768, 32767 ) }" | 18:21 | |
MasterDuke, I can prove it: github.com/rakudo/rakudo/blob/ec38...nt.pm#L142 | |||
MasterDuke, but my point is that there's no answer to that question, because it requires the third component to locate the actual code: the commit sha | ||
So if you run your code in 2015.12, you get `when int16...` at that line, but if you run today's HEAD you get the `return Nil...` | 18:22 | ||
MasterDuke | but that isn't that hard to get | ||
Zoffix | How would you get it? | ||
MasterDuke | somebody with a monthly release, we know the sha | ||
somebody who's building from source, they should know enough to be able to get it themselves | 18:23 | ||
and even if $*PERL.compiler.version had a conflict, i don't think there are so many we couldn't easily figure out which one was correct | 18:24 | ||
Zoffix | And that's my argument against: the current system gives you exact location you can look at immediately. Your proposal has "they should know enough" in it. | ||
And the current system can be used by a machine. Yours isn't accurate enough to do so. | 18:25 | ||
MasterDuke | well, i did say just giving the file would be good | ||
Zoffix | A simple wrapper script around perl 6 executable can figure out the right location. | 18:26 | |
I think I even mentioned something like that in sourcery article | |||
MasterDuke | and no reason we couldn't continue to give the CORE line # also | ||
Zoffix | perl6.party/post/Perl-6-Core-Hackin...#doitforme | ||
I dunno. To me, this is a case of merging development tools that are useful only for developing the compiler into our final product. | 18:27 | |
m: sub foo {}; dd &foo.line # like this logic would have to watch for whether the thing is a core location or not, for example. | 18:28 | ||
camelia | rakudo-moar f117a6: OUTPUT«1» | ||
Zoffix | And, yes, I realize we have RAKUDO_MODULE_DEBUG and RAKUDO_OPTIMIZER_DEBUG, but in those cases there aren't any simple alternatives. | 18:29 | |
MasterDuke | well, you could argue that descending into *any* non-user files in an error message is merging compiler development information | 18:32 | |
(not saying it's a strong argument though) | |||
i mean look at this | 18:34 | ||
m: say oh no! | |||
camelia | rakudo-moar f117a6: OUTPUT«===SORRY!=== Error while compiling <tmp>Bogus postfixat <tmp>:1------> say oh no⏏! expecting any of: infix infix stopper postfix statement end statement modifier stat…» | ||
MasterDuke | compared with the same thing and --ll-exception | ||
where there's a mix of original and generated source files listed | 18:36 | ||
but anyway, i'm not trying to propose a fundamental restructuring of error messages | 18:37 | ||
but something that yes, is probably of most usefulness to a developer working on Perl 6 itself | 18:38 | ||
maybe a rakudo environment variable? | |||
Zoffix | Right, but why can't it be a dozen-line wrapper script and we have exactly zero performance impact on production code and much smaller maintenance burden? | 18:41 | |
Personally, I don't think I'd find that addition useful. I do use sourcery bot to find where to look, but can't say I'm dying to have exceptions give me locations of actual code, TBH. | ||
MasterDuke | ok, how about adding the wrapper script to the repo? doesn't have to be made available in os packages | 18:43 | |
Zoffix | I'm perfectly fine with that. | 18:44 | |
MasterDuke | (and i'm not just making up use cases, i actually do run something, get a CORE line, open up CORE, and page up until i find the original file, very frequently | 18:45 | |
) | |||
but hey, maybe there's a simpler way to improve my workflow | 18:46 | ||
and not just errors, i do it a lot when looking at profiles | 18:48 | ||
pull-one (gen/moar/m-CORE.settings:###) | 18:49 | ||
there are a lot of pull-one's (just as an example), i have to go look to see exactly which one it is | 18:50 | ||
lizmat | MasterDuke: pull-one is the workhorse of any iterator | 19:21 | |
MasterDuke | lizmat: i just meant that there are a lot of them, so i don't know exactly which source file to look in. a counter-example is something like chomp, i have a much better chance of guessing that correctly | 20:04 | |
lizmat | m: say "abcdefg".match(/./,:2nd) # works as expected | 20:10 | |
camelia | rakudo-moar f117a6: OUTPUT«「b」» | ||
lizmat | m: say "abcdefg".match(/./,:3x) # works as expected | ||
camelia | rakudo-moar f117a6: OUTPUT«(「a」 「b」 「c」)» | ||
lizmat | m: say "abcdefg".match(/./,:2nd, :3x) # expected b d e | ||
camelia | rakudo-moar f117a6: OUTPUT«()» | ||
lizmat | is this a bug? ^^^ | 20:11 | |
Zoffix | I would think :2nd and :3x are mutually exclusive. | 20:13 | |
jnthn | m: say "ab".match(/./,:3x) | ||
camelia | rakudo-moar f117a6: OUTPUT«()» | ||
jnthn | So :3x appears to mean "only return if you can match 3 times" | ||
m: say "abcdefg".match(/./,:2nd, :1x) | 20:14 | ||
camelia | rakudo-moar f117a6: OUTPUT«(「b」)» | ||
Zoffix | m: say "1234567890".match(/./,:3x) | ||
camelia | rakudo-moar f117a6: OUTPUT«(「1」 「2」 「3」)» | ||
jnthn | Seems consistent at least :) | ||
lizmat | jnthn Zoffix: suppose we want the second match but 3 x (aka match 2, 4, 6) | 20:15 | |
jnthn | That's not the 2nd match, that's every 2nd match | ||
lizmat | m: say "abcdefg".match(/./,:2nd, :g) | ||
camelia | rakudo-moar f117a6: OUTPUT«「b」» | ||
jnthn | m: say "1234567890".match(/./,:x(1, 3, ... *)) | ||
camelia | rakudo-moar f117a6: OUTPUT«Potential difficulties: Comma found before apparent sequence operator; please remove comma (or put parens around the ... call, or use 'fail' instead of ...) at <tmp>:1 ------> say "1234567890".match(/./,:x(1, 3,⏏ ... *))* …» | ||
jnthn | m: say "1234567890".match(/./,:x(1, 3 ... *)) | 20:16 | |
camelia | rakudo-moar f117a6: OUTPUT«in Str.match, got invalid value of type Seq for :x, must be Int or Range in block <unit> at <tmp> line 1Actually thrown at: in block <unit> at <tmp> line 1» | ||
jnthn | Hm, I thought you could do something like that :) | ||
lizmat | jnthn: yeah, I guess I could do something like that | ||
jnthn | oh wait | ||
I did it wrong | |||
lizmat | I'm overhauling Str.match completely atm | ||
jnthn | m: say "1234567890".match(/./,:nd(1, 3 ... *)) | ||
camelia | rakudo-moar f117a6: OUTPUT«「1」» | ||
Zoffix | m: say "1234567890".match(/./,:nd(1, 3 ... *), :g) | ||
camelia | rakudo-moar f117a6: OUTPUT«「1」» | ||
Zoffix | ¯\_(ツ)_/¯ | 20:17 | |
jnthn | m: say "1234567890".match(/./,:nd(1, 3, 5)) | ||
camelia | rakudo-moar f117a6: OUTPUT«(「1」 「3」 「5」)» | ||
jnthn | m: say "1234567890".match(/./,:nd(eager 1, 3 ... *)) | ||
oh heh ;) | |||
Zoffix | :D | ||
m: say "1234567890".match(/./,:nd(eager 1, 3 ... * > 10)) | |||
camelia | rakudo-moar f117a6: OUTPUT«(timeout)» | ||
rakudo-moar f117a6: OUTPUT«()» | |||
lizmat | m: m: say "1234567890".match(/./,:st(1, 3, 5)) | 20:18 | |
camelia | rakudo-moar f117a6: OUTPUT«(「1」 「3」 「5」)» | ||
jnthn | m: say "1234567890".match(/./,:nd(list 1, 3 ... *)) | ||
camelia | rakudo-moar f117a6: OUTPUT«「1」» | ||
jnthn | Curious | ||
lizmat | well, fwiw, I can fix that probably | ||
jnthn | m: say "1234567890".match(/./,:nd(list(1, 3 ... *))) | ||
camelia | rakudo-moar f117a6: OUTPUT«「1」» | ||
jnthn | Anyway, I think :3x, :2nd not working is reasonable | ||
But if :nd(1,3,5) works then it's bothersome that :nd(1, 3 ... 7) doesn't | 20:19 | ||
m: say "1234567890".match(/./,:nd(list 1, 3 ... 7)) | |||
camelia | rakudo-moar f117a6: OUTPUT«「1」» | ||
lizmat | yeah, ok, noted | ||
jnthn | m: say "1234567890".match(/./,:nd(list 1, 3, 5, 7)) | ||
camelia | rakudo-moar f117a6: OUTPUT«(「1」 「3」 「5」 「7」)» | ||
lizmat | will look into that | ||
cygx_ | jnthn, timotimo: p6 implementation of the clustering algorithm, potentially buggy: github.com/cygx/p6-newio/blob/master/cluster.p6 | ||
jnthn | Yeah, that way should work, I think | ||
Hm, table is a neat way to do it | 20:21 | ||
lizmat | jnthn: so :$x is incompatible with :nd (and friends) and should also not silently be ignored | ||
jnthn | Well, :nd(2) and :x(1) was fine...not sure if raising an error is too harsh | 20:23 | |
MasterDuke | still playing around with BOOTSTRAP. if i change Array's $!descriptor to an Attribute, rakudo compiles, but won't install and a bunch of things give this message: X::TypeCheck::Assignment exception produced no message | 20:46 | |
m: my Int @a; @a[1] = "a" | |||
camelia | rakudo-moar f117a6: OUTPUT«Type check failed in assignment to @a; expected Int but got Str ("a") in block <unit> at <tmp> line 1» | ||
MasterDuke | m: my Int @a; @a[1] = $*ERR | 20:47 | |
camelia | rakudo-moar f117a6: OUTPUT«Type check failed in assignment to @a; expected Int but got IO::Handle (IO::Handle.new(:path(...) in block <unit> at <tmp> line 1» | ||
MasterDuke | locally, the first one works fine, but for the second i get: Type check failed in assignment to @a; expected Int but got IO::Handle (?) | ||
jnthn | If you make that an Attribute, it may set up some vivification on it | ||
In a case where $!descriptor is meant to be null | 20:48 | ||
MasterDuke | it looks like that's what scalar_attr() does, but just a plain Attribute.new will also? | 20:49 | |
jnthn | Think so | 20:54 | |
That may be one of the cases where we need BOOTSTRAPATTR | |||
Or we need to configure Attribute differently (and maybe make it support that) | 20:55 | ||
The important bit is what happens at repr_compose time | |||
MasterDuke | in Attribute.new: my $scalar := nqp::create(Scalar); ...; nqp::bindattr($attr, Attribute, '$!auto_viv_container', $scalar); | 20:59 | |
jnthn | Yup | 21:01 | |
That's the bit. I guess a version of scalar_attr that doesn't set $!auto_viv_container would do it. | |||
MasterDuke | scalar_attr or a new Attribute.new? | 21:02 | |
jnthn | No, leave Attribute.new alone | 21:04 | |
That's user-facing | |||
We want something internal | 21:05 | ||
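Something along these lines might do it (a hedged sketch only; the sub name is invented, and the real BOOTSTRAP code may prefer to construct the Attribute differently rather than clear the container afterwards):

    use nqp;
    sub attr-without-auto-viv(Str $name, Mu $type, Mu $package) {
        my $attr := Attribute.new(:$name, :type($type), :package($package));
        # drop the auto-viv container so a null $!descriptor stays genuinely null
        nqp::bindattr($attr, Attribute, '$!auto_viv_container', nqp::null());
        $attr
    }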
MasterDuke | oh hey, my scalar_attr2 is working so far | 21:15 | |
jnthn | :) | 21:20 | |
MasterDuke | spectest passed. now a somewhat cleaner implementation than just adding a scalar_attr2 | 21:21 | |
jnthn | MasterDuke++ | ||
MasterDuke | oh, while you're here. changing to Attribute.new for Int/Num/Str all give: "This type cannot unbox to a native integer: P6opaque, Int/Num/Str" | 21:26 | |
i haven't started looking at those yet, but any quick suggestions? | 21:27 | ||
jnthn | Probably setting the wrong type | 21:29 | |
These all have native attributes within them (or a P6bigint in the case of Int) | |||
lizmat | and another Perl 6 Weekly hits the Net: p6weekly.wordpress.com/2016/10/10/...r-is-near/ | 21:45 | |
AlexDaniel | ah yeah, and bisectable has fully recovered from all of the problems :) | 22:07 | |
MasterDuke++ | 22:08 | ||
lizmat | MasterDuke++ indeed | 22:14 | |
good night, #perl6-dev! | |||
Zoffix | lizmat++ good weekly | 23:05 | |
cognominal | lizmat++ | 23:52 |