Some CSCW researchers investigate trust within and between teams of people, and how it can be used to determine how information is actually shared between members.
Topics of interest include (but are not limited to):
- repute: "information and person credibility assessment and usage"
- deference relations: "trusted information routing discovery mechanisms"
- factions: "the use of trust as a tool for decision making regarding information sharing"
"In general, the disclosure of information, how it propagates through networks of people and machines, and how trust can play a valuable part in, amongst other things, what is disclosed, to whom, when, and for what duration." - Stephen Marsh of the National Research Council of Canada.
Trust for our purposes can be divided into found trust, built trust, and grown trust. It can be measured as "social capital", but not very reliably: a faction usually gets involved in deciding what behaviour is seen as admirable, reliable, or even predictable, and a faction is certainly required to create reputation.
Trolls tend to challenge prevailing ideas about trust. On large public wikis they very often succeed in reversing people's beliefs about reputation and trustworthiness by applying a sort of scientific method of baiting sysops. This works quite reliably and may bring about regime change.
In response to 18th-century trolls, who created the French Revolution and the American Revolution, governance organizations began to use distrust more explicitly, to prevent trust from becoming too centralized. This is probably what the Consumerium Governance Organization should do on day one, rather than repeating all of political evolution as Wikipedia is doomed to do: wasting years trying to deny that politics as usual is inevitable, and that factionalism is probably good too, when it is correctly supported and each element of the political spectrum has its own role and faction to protect common interests.
See w:trust for a more general discussion of these ideas of trust.