Something similar to rootkits in technical systems also occurs in social communities. Through administrative measures concealed by censorship, facts about the community are hidden from its members. This can go so far that a completely different, false picture of the community becomes the immersive reality.
The pattern becomes visible mostly in border cases at the sub-community level (e.g. a malicious hacker tricks a person into a false belief that leads to behavior the hacker can exploit, while the victim acts under the impression of a different situation -- most people would call this "social engineering") and at the meta-community level (e.g. a meta-community spanning multiple communities, such as a hosting company, a search engine, or a legal jurisdiction, dislikes certain forms of criticism and fights them through deletion or lawsuits, up to the point where the majority believes there never was anything to criticize at all). At the community level it is probably hardest to detect: the community is big enough that the manipulating party will try to keep the false impression durable, while the number of people involved is small enough not to attract public attention. At this order of magnitude it is probably mostly employed by commercial communities that would be penalized if customers or competitors could deduce the exact manner of their commercial activity from the online community interaction.
Isn't this what the "Matrix" movie was all about?
As it stands, this is rather a ShallowPage, and needs work. A good start may be a better name. Something along the lines of Ministry of Truth sounds good ;)
Also, examples of this phenomenon would be most helpful. -- ChrisPurcell
Admittedly, the term "SocialRootKit" has a quite negative connotation. It still describes the situation of manipulating a community and hiding the traces perfectly well. The rootkit term probably fits better for the case of an attack on an existing community, while the ministry term sounds more like a community that is built upon the lie.
(Note that marking this page as deleted is not bad as such! It just means that if the page is not edited in a couple of weeks, it will be removed. Think of this as activating BrainstormMode.) -- ChrisPurcell
Chris, I think you're jumping the gun a little. Just ask questions to flesh out the idea, and then later, when we have a better sense of what the page represents, we can fix it. I've undeleted the page. -- SunirShah
Alrighty. Was going to do that myself when I saw the original 3 sentences had been fleshed out a bit, but you beat me to it. -- ChrisPurcell
I'm not sure what is being 'rooted'? And what does 'root' mean in a social environment? There is the comparable concept of the CryptoCracy, perhaps, but that is a caricature rather than a reality. The whole issue with a RootKit is that a computer system relies on HardSecurity. The very rigidity of HardSecurity means that once it is compromised, its very mechanisms of control become available to the attacker. For instance, particularly nasty root kits will lock out other potential roots and patch holes so others cannot compromise the system to regain control. Similarly, once an army takes a fort, it can use that fort to defend and attack against its original occupiers. Essentially, there is no way to negotiate with or mitigate a rigid power structure, because the power is embedded in structures external to the social situation. So, I think a SocialRootKit is an oxymoron. -- SunirShah
Donald Knuth was perhaps the first person to suggest that computer programming should be seen as just another form of communication between human beings. Viewed that way, the distinction between computer systems and social systems begins to blur. On the one hand, fascism has shown that HardSecurity can also be applied to social environments. On the other hand, one might argue that HardSecurity in computer systems was a bad idea in the first place. For a static system it might work, and an attacker might be able to keep it compromised forever. But viewed globally, the means of computer interaction evolve, and a system cannot remain static, as it has to be upgraded. Through the upgrades, new avenues of attack get in. Though some parts of the computer industry try to address this problem by distributing cryptographically signed upgrades, they merely move the "root" point of attack somewhere else, where it is even more dangerous to be compromised. Realizing this, it seems logical to think about applying more SoftSecurity to computer systems, so that the damage rootkits can do is contained. This is a bit apart from the topic of this wiki, but maybe it shows that the power of rootkits can be equally strong or weak in both social and computer-based systems, while the mechanism of action holds quite a bit of similarity. --anon.
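The point above about signed upgrades merely relocating the root of trust can be sketched in code. This is a minimal illustration, not any real update system: HMAC stands in for a proper public-key signature scheme, and all names here are hypothetical. The verification logic is sound, but notice that everything now hinges on one secret.

```python
import hashlib
import hmac

# Hypothetical vendor signing key: the new single point of failure.
# Whoever holds this key controls what every client will install.
VENDOR_KEY = b"vendor-signing-key"

def sign_update(payload: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """Vendor side: produce an authentication tag for an update payload."""
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify_update(payload: bytes, signature: bytes,
                  key: bytes = VENDOR_KEY) -> bool:
    """Client side: install the update only if the tag checks out."""
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

update = b"patch v2.1: fix privilege escalation"
sig = sign_update(update)

assert verify_update(update, sig)                      # legitimate update accepted
assert not verify_update(b"backdoored payload", sig)   # tampered payload rejected
# But if VENDOR_KEY leaks, an attacker signs anything it likes, and the
# verification machinery itself becomes the distribution channel for the rootkit.
```

The sketch shows why signing does not eliminate the attack surface: it concentrates it. Compromising one signing key is now worth more than compromising any single machine.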
Actually, that is right on topic for this wiki, as our primary interest has historically been building people-focused, SoftSecurity-based Internet solutions. I think the distinction between computer systems and social systems has always been fluid, as computer systems have always been created for human purposes. Timesharing systems made this particularly apparent as they made computing social. I am inspired by the IncompatibleTimesharingSystem's design goal to maintain an actual operational system's reliability using SoftSecurity. -- SunirShah
I agree that the term is rather confusing and that a better one can probably be found, but I don't think it is an oxymoron or that the phenomenon it describes is impossible. It is possible for an "attacker" to manipulate information to such a degree that a community's social mores (its SoftSecurity structures) can be turned against the attacker's opponents. I don't know too much about it, but the current situation in North Korea seems similar to this. I've heard that many North Korean citizens believe that, for instance, chronic power outages are caused by American sabotage, and conversely are unaware of the relatively lavish way of life and some of the harsh methods used by their government leaders. To the extent that this causes patriotism, and to the extent that dissidents are socially sanctioned by their peers as a result, and if you accept that in discouraging dissent North Koreans are acting against their own interests, this would be an example of some people (the government leaders) manipulating information in such a way as to turn the community's social "defence" mechanisms against itself.
If you disagree with the particulars there, you could still imagine a dystopian scenario where a small group of people takes over a nation (with evil intent) or the world, and rewrites history and controls education so as to make the next generation believe that the rulers are in fact good for society and that dissidents are bad for it; then, in that generation, the majority of the people would socially sanction dissidents. This would be acting against their own interests but would serve to protect the conquerors. -- BayleShanks
I think the problem is that I can certainly imagine many things, but they don't happen in reality. Kafka's The Trial or More's Utopia are fictional, and moreover they are so powerful because they are so cartoony. Your description of North Korea might be correct, but it's hard to be certain here in North America. Moreover, all of the real world cases that correspond to what you're describing are backed up with a serious threat of violence. -- SunirShah
Yes, that's true. I have not given a real example in which the perpetrators of the SocialRootKit did not also hold a threat of violence, so I haven't convincingly argued that the SocialRootKit idea is more than theoretical. My intuition says there is some example somewhere, though; I'll keep trying to think of one. Perhaps cults? Perhaps some instance of media manipulation by some famous person or corporation? I'll keep trying to think of something specific. -- BayleShanks
DemaGoguery. Fomenting fear in the population is the general method of disabling SoftSecurity. Fear of someone leads to hatred of that someone, which motivates violence towards that someone. This is why great leaders work hardest to AlleviateInsecurity, not foment it.
"The only thing we have to fear is fear itself." -- Franklin D. Roosevelt
However, the caveat to demagoguery is that it is only sustainable if it is used to secure a threat of violence. Otherwise, your message can and will be overpowered in time.
“You may fool all the people some of the time, you can even fool some of the people all of the time, but you cannot fool all of the people all the time.” -- Abraham Lincoln
But this is why demagogues are so feared and loathed: great demagogues always act with a plan to secure militias. And to keep the cost of maintaining those supporting groups low, they generally have an active plan to maintain GroupThink, such as planting ringers to reiterate the message and threatening CommunityExile for those who go off message. The consequence of not having such a plan is being overwhelmed by your own out-of-control mob. -- SunirShah