The issue is that it was a bit outdated in its choice of _which_ things to bless as the one Go way. People expect a map/filter method rather than a loop with off-by-one risks, a type system with the smartness of TypeScript (if less featured and more strictly enforced), error handling that isn't a chore, and so on.
I get that it’s tough to implement some of those features without opening the way to a lot of “creativity” in the bad sense. But I feel like Go is sometimes a hard sell for this reason, especially for young devs whose mother tongue is JavaScript and not C.
Do they? After too many functional battles I started practicing what I'm jokingly calling "Debugging-Driven Development": just like TDD keeps design decisions in mind to allow for testability from the get-go, this makes me write code that will be trivially easy to debug (especially with printf-guided debugging and step-by-step execution in a debugger).
Like being able to add a printf in the middle of a for loop without even needing to understand the logic of the loop: just open a new line and write a printf. I grew tired of all those tight chains of code that iterate beautifully but, when you're in a hurry at 3am on a Sunday, are hell to decompose and debug.
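A hedged sketch of what I mean, in Go (made-up data and names): the loop gives you an obvious seam to drop a print into, with zero understanding of the surrounding logic required.

    package main

    import "fmt"

    type User struct {
        ID   int
        Name string
    }

    func main() {
        users := []User{{1, "ana"}, {2, "bo"}, {3, "cy"}}

        ids := make([]int, 0, len(users))
        for _, u := range users {
            // The debugging seam: open a new line, drop a printf, done.
            fmt.Printf("visiting %+v\n", u)
            ids = append(ids, u.ID)
        }
        fmt.Println(ids) // [1 2 3]
    }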
I think it's a bad trade-off, and most languages out there are moving away from it.
It's just that a ridiculous number of steps in real-world problems can be summarised as 'reshape this data', 'give me a subset of this set', or 'aggregate this data by this field'.
Loops are, IMO, very bad at expressing those common concepts briefly and clearly. They take a lot of screen space and usually accessory variables, and it isn't immediately clear from just seeing a for block what you're about to do - "I'm about to iterate" isn't useful information to me as a reader. Are you transforming data, selecting it, aggregating it?
The consequence is that you usually end up with tons of lines like
userIds = getIdsFromUsers(users);
where the function is just burying a loop. Compare to:
userIds = users.pluck('id')
and you spare yourself the utility function buried somewhere else.
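To put that in Go terms (a sketch with hypothetical names): the first helper is a one-off that buries the loop, while the second is written once, generically, so every call site reads like the pluck version.

    package main

    import "fmt"

    type User struct{ ID int }

    // The buried loop: a one-off helper you have to go hunt down and read.
    func getIdsFromUsers(users []User) []int {
        ids := make([]int, 0, len(users))
        for _, u := range users {
            ids = append(ids, u.ID)
        }
        return ids
    }

    // The same loop written once, generically (Go 1.18+), reused everywhere.
    func Pluck[T, F any](items []T, get func(T) F) []F {
        out := make([]F, 0, len(items))
        for _, it := range items {
            out = append(out, get(it))
        }
        return out
    }

    func main() {
        users := []User{{1}, {2}, {3}}
        fmt.Println(getIdsFromUsers(users))                         // [1 2 3]
        fmt.Println(Pluck(users, func(u User) int { return u.ID })) // [1 2 3]
    }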
So for a large input, code like
for i, value := range source { result[i] = value*2 + 1 }
would be roughly 2x faster than a pair of loops like
for i, value := range source { intermediate[i] = value * 2 }
for i, value := range intermediate { result[i] = value + 1 }
For example, Rust iterators are lazily evaluated with early exits (when filtering data), so you get your first form, but as optimized as possible. OTOH, Python's map/filter/etc. may very well return a full list each time, like your intermediate. [EDIT] Python returns generators, so it's sane.
I would say that any sane language allowing functional-style data manipulation will make it as fast as manual for loops. (That's why Rust bugs you with .iter()/.collect().)
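Go's standard library doesn't ship Map/Filter over iterators yet, but as a rough sketch (my own helpers over iter.Seq, assuming Go 1.23+ for range-over-func), a lazy single-pass pipeline with early exit looks like this:

    package main

    import (
        "fmt"
        "iter"
        "slices"
    )

    // Map lazily transforms elements; no intermediate slice is allocated.
    func Map[A, B any](s iter.Seq[A], f func(A) B) iter.Seq[B] {
        return func(yield func(B) bool) {
            for a := range s {
                if !yield(f(a)) {
                    return // early exit propagates upstream
                }
            }
        }
    }

    // Filter lazily drops elements that fail the predicate.
    func Filter[V any](s iter.Seq[V], keep func(V) bool) iter.Seq[V] {
        return func(yield func(V) bool) {
            for v := range s {
                if keep(v) && !yield(v) {
                    return
                }
            }
        }
    }

    func main() {
        src := slices.Values([]int{1, 2, 3, 4, 5, 6})
        pipeline := Map(Filter(src, func(v int) bool { return v%2 == 0 }),
            func(v int) int { return v*2 + 1 })
        for v := range pipeline {
            fmt.Println(v) // prints 5, then 9
            if v > 8 {
                break // stops the whole chain; 5 and 6 are never pulled
            }
        }
    }

The whole chain runs in one pass with no intermediate slices, which is exactly the fused form from the comment above.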
I pretty much only encounter these upsides once every few years, when preparing for leetcode interviews, where this kind of optimization is needed to achieve acceptable results.
In daily life, however, most of these chunks of data to transform fall into one of these categories:
- small size, where readability and maintainability matters much more than performance
- living in a db, and being filtered/reshaped by the query rather than code
- being chunked for atomic processing in a queue or similar (usually when importing a big chunk of data).
- the operation itself is a standard algorithm that you just consume from a standard library that handles the loop internally.
Much like trees and recursion, most of us don’t flex that muscle often. Your mileage may vary depending on domain, of course.
I agree with this. I feel like Go made the very smart choice of creating a new language that's easy and practical with great tooling, rather than experimental or super ambitious in any particular direction, trusting only established programming patterns. It's just weird that they missed some things that had been pretty well hashed out by 2009.
Map/filter/etc. are a perfect example. I remember around 2000 the average programmer thought map and filter were pointlessly weird and exotic. Why not use a for loop like a normal human? Ten years later the average programmer was like, for loops are hard to read and are perfect hiding places for bugs, I can't believe we used to use them even for simple things like map, filter, and foreach.
By 2010, even Java had decided that it needed to add a "stream API" and lambda functions (which eventually shipped in Java 8), because no matter how awkward they looked when bolted onto Java, they were still an improvement in clarity and simplicity.
Somehow Go missed this step forward the industry had taken and decided to double down on "for." Go's different flavors of for are a significant improvement over the C/C++/Java for loop, but I think it would have been more in line with the conservative, pragmatic philosophy of Go to adopt the proven solution that the industry was converging on.
I assume, anyway. Maybe the Go debugger is kind of shitty, I don't know. But in PHP with Xdebug you just use all the fancy array_* methods and then step through your closures or callables with the debugger.
After Go added generics in version 1.18, you can just import someone else's generic implementations of whichever of these functions you want, use them throughout your code, and never think about it. It's no longer a problem.
colors := items.Filter(_.age > 20).Map(_.color)
Instead of colors := items.Filter(func(x Item) bool { return x.age > 20 }).Map(func(x Item) string { return x.color })
which as best as I can tell is how you'd express the same thing in Go if you had a container type with Map and Filter defined.
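For what it's worth, here's a hedged sketch of such a container (hypothetical names, lowercase fields just for illustration). One real wrinkle: Go methods can't introduce new type parameters, so Filter works fine as a method, but a type-changing Map has to be a top-level function.

    package main

    import "fmt"

    type Item struct {
        age   int
        color string
    }

    // Slice is a generic container we can hang methods off.
    type Slice[T any] []T

    // Filter can be a method because the element type doesn't change.
    func (s Slice[T]) Filter(keep func(T) bool) Slice[T] {
        out := make(Slice[T], 0, len(s))
        for _, v := range s {
            if keep(v) {
                out = append(out, v)
            }
        }
        return out
    }

    // Map can't be a method returning Slice[U]: methods can't add type
    // parameters in Go, so a type-changing Map must be a free function.
    func Map[T, U any](s Slice[T], f func(T) U) Slice[U] {
        out := make(Slice[U], 0, len(s))
        for _, v := range s {
            out = append(out, f(v))
        }
        return out
    }

    func main() {
        items := Slice[Item]{{25, "red"}, {15, "blue"}, {40, "green"}}
        colors := Map(items.Filter(func(x Item) bool { return x.age > 20 }),
            func(x Item) string { return x.color })
        fmt.Println(colors) // [red green]
    }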