Non-traditional media, much more than traditional media, allows people to pick and choose what they want to hear. To a large extent, in fact, the providers help out by algorithmically "learning" what you want to see, in one way or another, and then biasing what you see to align with that.
So, you get echo chambers, amplified by what are almost certainly ridiculously simple algorithms.
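To give a sense of how little machinery it takes, here is a toy sketch of such a feed ranker in Python. Everything in it (the post fields, the click history, the scoring rule) is invented for illustration; real recommenders are proprietary and more elaborate, but the engagement-feedback loop can be roughly this crude:

    from collections import defaultdict

    def rank_feed(posts, clicks):
        # Count how often the user has clicked each topic before.
        affinity = defaultdict(int)
        for topic in clicks:
            affinity[topic] += 1
        # Rank posts by that affinity: the feed simply serves more of
        # whatever you already engaged with.
        return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)

    feed = rank_feed(
        posts=[{"topic": "politics"}, {"topic": "gardening"}],
        clicks=["politics", "politics", "gardening"],
    )
    # feed now leads with politics; every click tilts the next feed further.

Every click strengthens the affinity, which biases the next feed, which invites more of the same clicks. That is the whole amplifier.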
Even absent algorithms, though, we can see how communities self-select.
The LuLa Coffee Corner has selected a little cadre of like-minded people who do most of the talking, sharing their links and information with one another. Sure, there are some naysayers, but the cadre has learned to mostly ignore them. The naysayers have, for the most part, conceded the ground to the cadre because, ultimately, who cares?
Facebook is more nefarious, because everyone lives inside their own little curated cadre on Facebook, by design. It looks like LuLa, except that instead of the naysayers leaving, they are simply eased off into their own little cadre, and you never see them, and they never see you. This is what the algorithm adds.
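A toy sketch of that easing-off, again with invented names and a made-up interaction count: imagine the feed simply drops posts from anyone you rarely engage with.

    def visible_posts(viewer, posts, interactions, threshold=1):
        # interactions[(a, b)]: count of positive interactions between a and b.
        # Posts from people below the threshold never appear; both cadres
        # keep posting, and neither ever sees the other.
        return [p for p in posts
                if interactions.get((viewer, p["author"]), 0) >= threshold]

    posts = [{"author": "alice", "text": "..."},
             {"author": "bob", "text": "..."}]
    interactions = {("you", "alice"): 5}  # you engage with alice, ignore bob
    print(visible_posts("you", posts, interactions))  # only alice survives

Nobody is banned and nothing is censored outright; the naysayers just slip below the threshold and quietly stop existing, from your side of the feed.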
Does one blame the non-traditional media and its algorithms, then, given that this is what people will tend to do themselves anyway? The algorithm merely expedites and amplifies what was already there. Is there some moral obligation on the part of Facebook (or LuLa) to force alternative viewpoints, or viewpoints determined by editorial edict to be Correct, down the throats of the people?
Certainly this is what traditional media did, and do. Feedback was very limited, a few letters to the editor from cranks, so in general editorial oversight was driven internally, not by "what do the people want to see." They simply did their best and, from time to time, simply told the truth as they saw it, there being no obviously more profitable alternative.