That's pretty damning. Facebook execs knew that extremist groups were using their platform, and that Facebook's own tooling was catalyzing their growth, and yet they did nothing about it.
A recommendation engine is just an algorithm that maximizes an objective function; in this case, matching users with content they enjoy and engage with. The algorithm has no built-in notion of political extremism. It's almost certainly the case that people with radical opinions prefer to consume media that matches their views. If Bob is a Three Percenter, it's highly unlikely he'd prefer to read the latest center-left think piece from The Atlantic.
Unless you're willing to ban recommendation engines entirely, the only alternative I can see is for Facebook to intentionally tip the scales. Extremist political opinions would have to be explicitly penalized in the objective function.
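To make the two regimes concrete, here's a toy sketch in Python. This is not Facebook's actual ranking code; predicted_engagement, extremism_score, and the penalty weight are all hypothetical stand-ins.

    # Toy sketch of the two ranking regimes, not a real system.
    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: str
        predicted_engagement: float  # hypothetical model output: P(user engages)
        extremism_score: float       # hypothetical classifier output in [0, 1]

    def rank_neutral(posts: list[Post]) -> list[Post]:
        # The "neutral" objective: show whatever the user is most likely to engage with.
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

    def rank_with_penalty(posts: list[Post], penalty_weight: float = 0.5) -> list[Post]:
        # Tipping the scales: someone has to choose the classifier and the weight,
        # and that choice is the editorial act described below.
        return sorted(
            posts,
            key=lambda p: p.predicted_engagement - penalty_weight * p.extremism_score,
            reverse=True,
        )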
But now you've turned Facebook from a neutral platform into an explicit arbiter of political opinion. It means some humans at Facebook are intentionally deciding what people should and should not read, watch, and listen to. Remember, Facebook as an organization is not terribly representative of the country as a whole. Fewer than 5% of Facebook employees vote Republican, compared to about 50% of the country. Virtually no one is over 50. Men are over-represented relative to women. Black and Hispanic employees are heavily under-represented. And that doesn't even get into international markets, where the Facebook org is even less representative.
The cure sounds worse than the disease. I really think it's a bad idea to pressure Facebook into the game of explicitly picking political winners and losers. A social media platform powerful enough to give you everything you want is strong enough to destroy everything you value.
The same thing social networks did before.
If I subscribed to 1000 people, show me whatever the hell they wrote, all of it, in chronological order.
Don't show me what my friends wrote on other pages; if they think it's important or interesting, they will link or share it manually.
Limit shares/retweets. Limit group sizes. Surface more information as topics/tags/whatever so that users can do more sophisticated filtering themselves or collaboratively. I want to mute my uncle when he talks about politics, not all the time. Facebook already does more sophisticated analyses than just extracting topic information (I know because I work there and I can see the pipelines). Show those results to me so I can use them as I see fit. That's how you make things better. Chronological vs. algorithmic order? Pfft. In fact, I do want the most interesting things out of that set shown first. I just want to have more control over what's in the set.
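A rough sketch of what that kind of user-side control could look like. The fields, tags, and function names here are made up for illustration; the platform's real topic pipelines would be the source of the tags.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        created_at: datetime
        text: str
        topics: set[str] = field(default_factory=set)  # tags surfaced by the platform

    def build_feed(posts, following, muted_topics_by_author):
        # Chronological feed of people I follow, minus the topics I've muted per author.
        visible = [
            p for p in posts
            if p.author in following
            and not (p.topics & muted_topics_by_author.get(p.author, set()))
        ]
        return sorted(visible, key=lambda p: p.created_at, reverse=True)

    # e.g. mute my uncle only when he talks about politics:
    # feed = build_feed(posts, following={"uncle", "alice"},
    #                   muted_topics_by_author={"uncle": {"politics"}})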
Sorting something to the 1000th page is censorship. Legally it probably isn't, since the content is still available if you click page down a thousand times, but IANAL and don't care.
I don't want algorithms doing any filtering. If someone shares crap every 10 minutes, I can always unfollow them. Still, I like your idea about manual filtering with tags.