zlacker

[parent] [thread] 2 comments
1. praecl+(OP)[view] [source] 2023-05-20 01:39:23
Those are niceties and can be implemented with some small hacks. Most big nets do very little slicing. Lots of dimension permutations (transpose, reshape, and friends) but less slicing. I personally use a lot of slicing so will do my best to support a clean syntax.
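To make the distinction concrete, here's a quick sketch of "dimension permutations" versus slicing. NumPy is used for illustration; PyTorch tensors support the same indexing.

```python
import numpy as np

x = np.arange(24).reshape(2, 3, 4)   # a small 3-D array

# dimension permutations: transpose, reshape, "and friends"
t = x.transpose(2, 0, 1)             # move the last axis to the front -> (4, 2, 3)
r = x.reshape(6, 4)                  # collapse the first two axes    -> (6, 4)

# slicing, by contrast, selects sub-ranges along axes
s = x[:, 1:, ::2]                    # keep all of axis 0, drop row 0, every other column
print(t.shape, r.shape, s.shape)     # (4, 2, 3) (6, 4) (2, 2, 2)
```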
replies(2): >>tysam_+fk >>whimsi+In1
2. tysam_+fk[view] [source] 2023-05-20 06:48:48
>>praecl+(OP)
I've come to believe over the last few years that slicing is one of the most critical parts of a good ML array framework, and I've used it heavily. PyTorch, if I understand correctly, still doesn't get some forms of slice assignment and the handling of slice objects quite right (please correct me if I'm wrong), though it is leagues better than tensorflow was.

I've written a lot of dataloader and similar code over the years, and the slicing was probably the most important (and most hair-pulling) part for me. I've seriously debated writing my own wrapper at some point (if it is indeed worth the effort) just to keep my sanity, even if it comes at the expense of some speed.
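Slice assignment is the pattern that comes up constantly in dataloader code: filling a preallocated batch buffer in place rather than concatenating. A minimal NumPy sketch (PyTorch tensors behave the same way for basic slices):

```python
import numpy as np

batch = np.zeros((4, 3))             # preallocated batch buffer
samples = np.ones((2, 3))            # two incoming samples

batch[1:3] = samples                 # slice assignment writes in place, no copy of `batch`
batch[:, 0] = -1.0                   # overwrite one column across all rows
```

The subtleties the parent alludes to tend to show up with advanced (integer/boolean) indexing, where assignment semantics differ from basic slicing.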

3. whimsi+In1[view] [source] 2023-05-20 17:44:41
>>praecl+(OP)
I disagree with this; slice notation is powerful and I use it quite a bit in DL.

Even just the [:, None] trick replacing unsqueeze is super useful for me.
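The `[:, None]` trick inserts a new axis of size 1, which is what makes broadcasting explicit. A quick NumPy illustration (the `None`/`np.newaxis` indexing is equivalent to PyTorch's `unsqueeze`):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])        # shape (3,)

col = v[:, None]                     # shape (3, 1) -- like torch's v.unsqueeze(1)
row = v[None, :]                     # shape (1, 3) -- like v.unsqueeze(0)

# the inserted axes drive broadcasting, e.g. an outer product:
outer = col * row                    # shape (3, 3), outer[i, j] == v[i] * v[j]
```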
