Conference at UCL entitled "Trust and Triviality: Where is the Internet Going"
Medium doesn’t matter (I trust the other person I am speaking to; I don’t worry about the telephone). RP: what about e-voting?
Trade-off between safety and access to information. First it must be acknowledged that there is a trade-off. This trade-off can be improved but will always remain. Example of schools.
Quality, truth, trust and elitism.
- Regulation central stuff from the ex-ITV guy. Duty of truth??
- Top-down allows control and regulation.
- Bottom up allows for free speech - but he claims a very dubious free speech.
- does the internet narrow or widen points of view?
- Craig’s list.
- Comment on Wikipedia. Democratization of the production of information, but is it reliable?
- RP: What scares me is this assumption (made by e.g. ITV guy) that the internet is uncontrollable. Unfortunately it is all too censorable.
Ed Richards, Senior Partner, Ofcom, Strategy and Market Developments
- The next debate over the Communications Act will need to involve public debate.
- (!!) Need effective DRM to create trust in the online world
- (!!) Bad: 4.7 million UK users have knowingly downloaded illegal content
- Mentioned EUCD etc. No particular emphasis but simply as a fact of life
- Users don’t want to be criminalized; they want to use iTunes ….
- The younger generation needs to be focused on to address this idea that music should be free. This culture of freeness is pervasive among them.
- Twin track: creative commons alongside a DRM commercial IPR environment. Creative Commons embraced.
- What could undermine this: spam, child protection issues,
RP: wonderful to hear this support and interest in a creative commons [ed: as concept and as group]. One thing I am very interested in is a direct participation by the Government in nurturing this Commons - a role they have obviously long taken in the traditional academic and artistic spheres.
- In response: we might want to ask that if content is generated by the govt why should it NOT be released in a creative commons manner. Now I can see there might be issues with this approach, for example if content is sourced from independent producers, but still I think you start from a position of why not.
- raised issue of sustainability
- RP: didn’t really know about non-commercial licenses
Guy from Internet Watchdog: We don’t want to close down the internet or regulate it formally, but we want the US to see that the internet is not the Wild West and that something needs to be done.
Red-headed woman: Lessig + issues with very strong IP protection. Aren’t there major problems with over-strong IP?
‘For my money the DMCA goes far too far’ … ‘It makes neither economic nor cultural sense’ …
RP: they all know about Lessig
DRM and stopping consumers thinking they should get stuff for free
Earl of Selbourne: Cyber Trust and Information Security
RP: Data Collection in Government:
- Value of information and collection costs.
- Data acquisition costs often not evaluated.
- If information is not reliable, what value does it have?
- As what? Are they a certification or more than that?
- Clearly more than that. Often not about information at all but about creating an image
- For me there is no silver bullet to solve the trust and reliability problem. Particularly as this can be in direct tension with a desire to have a very open/free approach to dissemination.
- The recursion argument. Suppose I receive a piece of information X. How do I know that this piece of information is reliable/true/correct? I have the following options:
- I have my own knowledge that allows me to check validity.
- Find an expert whom I already trust who can tell me whether it is valid. But how did I find and verify the expert? At this point we have to recurse. Verification knowledge may be of two kinds:
- Direct verification. I actually know the area and can judge from my own direct knowledge.
- I know something about the characteristics of the information provider (e.g. they provide footnotes, they have a prestigious reputation for honesty, they have a right of reply that is administered honestly). This is indirect verification.
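The recursion argument above can be sketched in a few lines of Python. This is my own illustrative model (the agent names, data structures, and `max_depth` cut-off are all hypothetical, not from the talk): a claim is verified either directly from one's own knowledge, indirectly from the provider's characteristics, or by deferring to an expert — whose own reliability must then be verified, which is where the recursion appears.

```python
# Hypothetical sketch of the trust-recursion argument. An agent judges a
# claim by (a) direct verification from its own knowledge, (b) indirect
# verification from the provider's characteristics (footnotes, reputation,
# right of reply), or (c) deferring to a trusted expert -- and the expert's
# reliability raises the same question again, hence the recursion.

def is_reliable(content, provider, agent, agents, max_depth=5):
    if max_depth == 0:
        return False                        # the regress never bottomed out
    me = agents[agent]
    if content in me["direct_knowledge"]:
        return True                         # direct verification
    if provider in me["reputable_providers"]:
        return True                         # indirect verification
    expert = me["expert"]
    if expert is None:
        return False
    # Defer to an expert: my verification of the expert is now the question.
    return is_reliable(content, provider, expert, agents, max_depth - 1)

agents = {
    "me":    {"direct_knowledge": set(), "reputable_providers": set(), "expert": "alice"},
    "alice": {"direct_knowledge": {"X"}, "reputable_providers": set(), "expert": None},
}
print(is_reliable("X", "some-blog", "me", agents))   # True: one recursion to alice
print(is_reliable("Y", "some-blog", "me", agents))   # False: nobody can verify Y
```

The `max_depth` cut-off makes explicit the point of the argument: if the chain of deferrals never reaches direct or indirect verification, there is no independent ground for trust.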
Relation to information bandwidth. One can often work out whether an article is for or against a particular position with minimal processing. Levels of processing. One can take a principal-components approach (e.g. presence of footnotes) as a way of reducing the amount of information to process.
Can consider reliability of information in a social/truth sense as similar to information reliability issues in traditional information theory. I receive a signal. The problem is that the simple statistical mechanisms for analysing corruption of signals over noisy channels are not appropriate to social reliability (are the facts true or not?).
Is chinese whispers statistically like the degradation of signals in traditional information-theory settings?
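One way to make the analogy concrete is a toy simulation (my own sketch, not from the conference): model each retelling as a pass through a binary symmetric channel that flips each bit independently with probability p. Classical theory gives an effective flip probability of (1 − (1 − 2p)^n)/2 after n hops, so the message decays towards pure noise — though, as noted above, this captures only statistical corruption, not whether the underlying facts are true.

```python
import random

# Toy "chinese whispers" model: each retelling is one pass through a
# binary symmetric channel with flip probability p.

def bsc(bits, p, rng):
    """Flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def chinese_whispers(bits, p, hops, rng):
    """Relay the message through `hops` successive noisy retellings."""
    for _ in range(hops):
        bits = bsc(bits, p, rng)
    return bits

msg = [1, 0, 1, 1, 0, 0, 1, 0] * 32          # a 256-bit message
error_rate = {}
for hops in (1, 5, 20):
    out = chinese_whispers(msg, 0.05, hops, random.Random(0))
    error_rate[hops] = sum(a != b for a, b in zip(msg, out)) / len(msg)
    print(hops, error_rate[hops])
```

With p = 0.05 the expected error rate grows from about 0.05 after one hop towards 0.5 (pure noise) as hops increase, which is the sense in which chinese whispers does resemble classical signal degradation.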