But I can't help but agree with a lot of points in this article. Go was designed by some old-school folks who maybe stuck a bit too rigidly to their principles, losing sight of practical conveniences. That said, it's a _feeling_ I have, and maybe Go would be much worse if it had solved all these quirks. To be fair, I see more leniency in fixing quirks in the last few years: at some point I didn't think we'd ever see generics, or custom iterators, etc.
The points about RAM and portability seem mostly like personal grievances though. If it was better, that would be nice, of course. But the GC in Go is very unlikely to cause issues in most programs even at very large scale, and it's not that hard to debug. And Go runs on most platforms anyone could ever wish to ship their software on.
But yeah the whole error / nil situation still bothers me. I find myself wishing for Result[Ok, Err] and Optional[T] quite often.
I'd say it's entirely the other way around: they stuck to the practical convenience of solving the problem they had in front of them, quickly, instead of analyzing it from first principles and solving it correctly (or using a solution that was Not Invented Here).
Go's filesystem API is the perfect example. You need to open files? Great, we'll create
func Open(name string) (*File, error)
function, you can open files now, done. What if the file name is not valid UTF-8, though? Who cares, that hasn't happened to me in the first 5 years I used Go.

Score another for Rust's Safety Culture. It would be convenient to just have &str as an alias for &[u8], but if that mistake had been allowed, all the safety checking that Rust now does centrally would have to be owned by every single user forever. Instead of a few dozen checks overseen by experts, there'd be myriad sprinkled across every project, always ready to bite you.
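To make the contrast concrete, here's a minimal sketch of my own (not from the thread) of how Rust's standard library keeps the two concerns apart: file names are OsStr/Path, and turning one into &str is a fallible conversion you have to handle. It's Unix-only because it builds a file name from raw, non-UTF-8 bytes:

use std::ffi::OsStr;
use std::os::unix::ffi::OsStrExt; // Unix-only: lets us build an OsStr from raw bytes
use std::path::Path;

fn main() {
    // "foo" followed by a byte that is not valid UTF-8.
    let raw: &[u8] = &[b'f', b'o', b'o', 0xFF];
    let name: &OsStr = OsStr::from_bytes(raw);
    let path = Path::new(name);

    // `path` can still be passed to File::open and friends,
    // but getting a &str out of it forces you to handle the failure case.
    match path.to_str() {
        Some(s) => println!("valid UTF-8 file name: {s}"),
        None => println!("not valid UTF-8, kept as raw bytes: {:?}", path),
    }
}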
Off the top of my head, in order of likely difficulty to calculate: byte length, number of code points, number of graphemes/characters, height/width to display.
Maybe it would be best for Str not to have len at all. It could have bytes, code_points, graphemes. And every use would be precise.
FWIW the docs indicate that working with grapheme clusters will never end up in the standard library.
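As a rough illustration of how far apart those counts get, here's a small sketch of my own; the grapheme count leans on the external unicode-segmentation crate, since (as noted) grapheme clusters aren't covered by the standard library, and display width would need yet another crate again:

// Grapheme handling needs the external `unicode-segmentation` crate.
use unicode_segmentation::UnicodeSegmentation;

fn main() {
    // 'e' followed by a combining acute accent, then "llo".
    let s = "e\u{301}llo";
    println!("bytes:       {}", s.len());                   // 6
    println!("code points: {}", s.chars().count());         // 5
    println!("graphemes:   {}", s.graphemes(true).count()); // 4
}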
If your API takes &str, and tries to do byte-based indexing, it should almost certainly be taking &[u8] instead.
> If your API takes &str, and tries to do byte-based indexing, it should almost certainly be taking &[u8] instead.
str is indexed by bytes. That's the issue.

let s = "asd";
println!("{}", s[0]);
You will get a compiler error telling you that you cannot index into &str.

This:

fn main() {
    let s = "12345";
    println!("{}", &s[0..1]);
}
compiles and prints out "1". This:

fn main() {
    let s = "\u{1234}2345";
    println!("{}", &s[0..1]);
}
compiles and panics with the following error: byte index 1 is not a char boundary; it is inside 'ሴ' (bytes 0..3) of `ሴ2345`
To get the nth char (scalar codepoint):

fn main() {
    let s = "\u{1234}2345";
    println!("{}", s.chars().nth(1).unwrap());
}
To get a substring:

fn main() {
    let s = "\u{1234}2345";
    println!("{}", s.chars().skip(0).take(1).collect::<String>());
}
To actually get the bytes you'd have to call as_bytes(), which works with scalar and range indices, e.g.:

fn main() {
    let s = "\u{1234}2345";
    println!("{:02X?}", &s.as_bytes()[0..1]);
    println!("{:02X}", &s.as_bytes()[0]);
}
IMO it's less intuitive than it should be but still less bad than e.g. Go's two types of nil because it will fail in a visible manner.
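And if you'd rather get an Option back than a panic, the checked counterparts exist; a small sketch (my own addition) of the non-panicking versions of the examples above:

fn main() {
    let s = "\u{1234}2345";
    // str::get returns None instead of panicking on a non-boundary slice.
    println!("{:?}", s.get(0..1)); // None (byte 1 is inside 'ሴ')
    println!("{:?}", s.get(0..3)); // Some("ሴ")
    // Iterator::nth already hands back an Option.
    println!("{:?}", s.chars().nth(1)); // Some('2')
}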