Musings
Last post I talked about how it was possible to understand the web as a cognitively constructed object which in some ways abided by the laws of celestial mechanics. Further to this I now want to expand on the observational approach which is adopted in astrophysics and combine this with the idea of an information spectrum.
To start with, a simple comparison between the astronomer's telescope and the web user's device (computer, tablet, phone – whatever it may be) will elaborate on the theme of observing a phenomenon or 'thing'. At first this comparison might seem arbitrary, but upon closer inspection we can see important similarities between the two observational apparatuses. Seaborn (1998) points out that 'nearly all of the information astronomers have received about the universe beyond our solar system has come from the careful study of the light emitted by stars, galaxies and interstellar clouds of gas and dust' (Seaborn 1998). The telescope is the astronomer's most valuable tool for understanding the physical world, and 'our modern understanding of the universe has been made possible by the quantitative measurement of the intensity and polarization of light in every part of the electromagnetic spectrum' (http://en.wikipedia.org/wiki/Electromagnetic_spectrum).
As I mentioned in the last post, astrophysics/physics is a holistic approach which emphasises the objective perspective as opposed to the subjective. 'Although observational astronomy now covers the entire range of the electromagnetic spectrum, along with many areas of particle physics, the most familiar part of the field remains in the optical regime of the human eye' (Seaborn 1998). As humans we better understand the phenomena which we can see. Once astronomers were able to identify light and its properties, they were then able to understand more obscure concepts which were harder to detect. It was the subjective approach adopted by humans, and their understanding of the world around them, which led them to the objective study of the universe. That is, the subject was removed.
Just as light is the most easily recognised feature of the electromagnetic spectrum, the most identifiable feature of the web is arguably words. Web users observe the web by using computers and other such devices to access web pages. On one level the web consists of language and protocols which humans can consume and understand. Websites are made of varying markup languages. Words, then, are the most familiar part of the field in what I term the information spectrum. Breaking down the idea of language, we recede along the spectrum towards code, metadata and finally binary code. Just as 'a wave is a disturbance that travels through a medium', information is a series of disturbances (ultimately 1s and 0s) which travel through the air and are picked up by our devices. In the information spectrum, code is the unit with the smallest wavelength.
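The claim that information ultimately reduces to a stream of 1s and 0s can be made concrete with a small sketch (a hypothetical illustration, not part of the original argument), tracing a single word down the proposed spectrum from human-readable text to binary:

```python
# Illustrative sketch: a word descending from readable text to bytes to
# binary digits. The "information spectrum" is this post's own construct;
# the encoding steps themselves are standard Python.
word = "web"
encoded = word.encode("utf-8")                      # text -> bytes
bits = "".join(f"{byte:08b}" for byte in encoded)   # bytes -> 1s and 0s

print(encoded)  # b'web'
print(bits)     # 011101110110010101100010
```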
Now then, using psychology, the science that seeks to understand behaviour and mental processes, we can build an information spectrum that incorporates cognitive mechanisms. The result is a profound objective spectrum in the holistic sense, which also incorporates the subject (that is, humans) into it. Words are the basis of ideas, and as ideas are internalised by humans we grow towards collective thinking. On the internet we can observe communities who share ideas, and on the information spectrum these collectives would occupy the other extreme. This social psychological approach, where the 'primary emphasis is on discovering and explaining the causes of behaviour' (Carlson et al. 2007), allows us to explain how behaviour is driven by ideas. The Arab Spring, for example, emphasises the role of the web, and particularly social media, in communicating information. Ideas were circulated and disseminated by users, and 'social cognition involves our perception and interpretation of information about our social environment and our behaviour in response to that environment' (Carlson et al. 2007). Information is the central phenomenon which is internalised by individuals at one level, and by communities at a higher level. When people share the same idea, we tend towards the psychological concept known as groupthink. The idea that 'the people want to bring down the regime' was the abstract thought which was internalised by the Arab Spring protestors.
We can see then that people are attracted to information in a similar way to how masses are gravitationally attracted to each other in the universe. Can we then compare galaxies to the relationship between people and ideas? The idea or collective thought is like the sun, the largest object in our solar system, which everything else is attracted to. People or groups are closer to the idea depending on how much they agree with it. Could two polarised beliefs about an idea be representative of two different galaxies which exist in the same system? After all, 'a galaxy is a massive, gravitationally bound system consisting of stars, stellar remnants and interstellar medium of gas and dust, and an important but poorly understood component called dark matter'.
Is the web, then, a massive, information-bound system consisting of individual thinkers and collectives who have internalised ideas – and an important but poorly understood component called the deep web? Some complications to this idea are the physical concepts of time and space – two concepts which are increasingly complicated when applied to the web. Significantly, the web is an object with no dimensions and is decentralised. Does the web have any shape? Also, while astronomers were able to observe the universe from within – that is, from the vantage point of earth they were able to determine what was around us – the way in which we observe the web is more complicated. When we use an observational device such as a laptop, are we observing the web from the inside or the outside? Or, even more profound, is there one web, or is it a multi-web? Our view of the web becomes increasingly multifarious as personalisation takes effect: what one user observes is different from what another observes. Again this tends towards the dichotomy of subjective and objective approaches adopted by psychology and astrophysics respectively.
Is it possible to create an information spectrum which incorporates ideas and how we perceive them? Human interpretability becomes an important issue in such a conceptual spectrum, and the perplexing idea of incorporating a subjective approach into an objective view prevails.
Philosophy – meta-ethics
Last week I spoke about Kant's theories of how to determine right and wrong. This week I have been reading a few different perspectives on morals. It turns out that the field of ethics is split into three parts: normative ethics, which is an attempt to define rules for what is right and wrong; applied ethics, in which real-life problems such as abortion are analysed; and meta-ethics, in which questions about the nature of ethics are asked. Kant's theory belongs to the field of normative ethics in that he attempts to set rules for how to lead an ethical life. A wider view is taken in meta-ethics; in this field of research, it is asked what it means to be morally right or wrong.
One meta-ethical theory that particularly interested me was moral relativism: a standpoint based on the assumption that all societies have different customs and values. This assumption is so well supported by anthropologists and psychologists that it can be considered uncontroversially true. Building on this assumption, certain logical steps can be made (this appears to be a common way of building philosophical theories). It can be assumed that, due to their different value systems, different societies can have different ideas of what is right and wrong. It is therefore reasonable to conclude that there can be no universally applicable moral rules, as moral systems are socially constructed. This view makes morality simply a description of the moral values of a particular society at a particular point in time.
If this approach were taken towards making judgements about the morality of online behaviours (e.g. digital piracy), it would be impossible to define a set of moral rules for online behaviour, because the web is almost universal.
Computer Hacking and the Moral Standpoint: For Saints or Sinners?
Initial thoughts on philosophy constitute the subject as a means to question, analyse and assemble thoughts and conceptions of the universe (Nagel, 1987). Furthermore, it has been noted that these questions often cannot be answered with the technology available at any particular time. And so early constructions of the 'heavens' and earth, from various religious eras, seemed contradictory to the later findings of scientific studies. From this, though, further questions can be built to create new philosophies of the universe. This suggests that the foundations of philosophy stem from an innate human motivation to learn about the world around us and question our very purpose within it – including whether an act of communications hacking is relevant, or even significant, to the purposes of human beings on planet earth and within the universe.
Not only does philosophy stem from an ideology of 'nothing is certain', it also strives to suggest that there are implications and ramifications for actions associated with what is humanly and technologically possible. Brian Harvey (1985) analyses the ethical consequences of hacking and argues that the person responsible will, over time, become desensitised to the ethical implications of their actions. This seems to depend on whether they are hacking for the greater good or whether it is simply an act of breaking down security systems in order to alter the intended message to its audience.
Furthermore, Harvey (1985) notes that the ethical understanding of a human being is something that is learned, something that is driven by our interaction with the environment and with society; ethical understanding and awareness are social phenomena that are altered according to variables such as gender, race, religious belief, culture and status. In the case of communications hacking, the ethical implications are based on the judgement of the end-user or the audience of the message, and not of the hackers themselves. This raises the question of whether empathy is the emotion on the flipside of perceived ethical discrepancies.
In a news article, the Vatican stated that hackers are associated with a greater good, that they are driving us away from the restrictions and securities of western society, and that freedom of speech and the right to know such information are part of society (Discover Magazine, 2011). In reference to one of my previous posts, this article continues down the route that hacktivism is the ideology associated with the freedom of information.
It is clear that there is a crossover point between political action and philosophical theory. In my next entry I shall be researching how philosophical theories are represented as political ideologies and how this may affect society's perceptions of communications hacking. Does political reasoning give the hacker an excuse to perform such actions?
Markets
In which market structures do web-only firms operate? And what are the implications?
There are many market structures in which firms trade[1]: perfect competition in which many firms sell an identical product; monopolistic competition in which a large number of firms compete with slightly different products, leading to differentiation; oligopoly where a small number of firms compete; and monopoly in which one firm produces a unique good or service, e.g. utility suppliers.
Perfect competition
Perfect competition gives rise to a situation in which economic profit induces entry into the market by new firms, which in turn eliminates profit; economic loss induces exit, which in turn eliminates the loss. When profit and loss have been eliminated and entry and exit have stopped, a competitive market is in long-term equilibrium. But this is a rare state to maintain.
Monopolistic competition
Monopolistic competition results in many product innovations aimed at achieving differentiation, which are cost-efficient to produce and so not individually significant. It differs from perfect competition in that there is excess capacity and prices are higher.
Oligopoly
An oligopoly has a small number of interdependent firms, resulting from natural barriers to entry. It is distinguished from monopolistic competition by measuring the market share of the 5 largest firms compared to the next 10 largest, with roughly 60% market share held by the largest firms indicating an oligopoly. It is studied using game theory.
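The five-firm measure described above can be sketched as a short calculation. The market shares below are invented for illustration, and the 60% threshold is the one quoted in this post:

```python
# Five-firm concentration ratio: the combined market share of the five
# largest firms. Per this post, a ratio above ~60% suggests an oligopoly.
# All market shares here are hypothetical.
def five_firm_concentration(shares):
    """shares: market shares in percent, in any order."""
    return sum(sorted(shares, reverse=True)[:5])

shares = [25, 18, 10, 8, 6, 5, 4, 3, 2, 1]  # hypothetical market
cr5 = five_firm_concentration(shares)
print(cr5)  # 67 -> above the 60% threshold, suggesting an oligopoly
```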
Monopoly
A monopoly has 2 key features: there is no close substitute, and there are barriers to entry which deter potential competitors. The 3 types of barrier are: natural, in which economies of scale enable one firm to supply the entire market at the lowest cost; ownership, if one firm owns the major portion of a resource; and legal, if a firm is granted a monopoly franchise, government licence, patent or copyright.
Web-only firms operate in a mix of all 4 market types. There is the monopoly in search services and SEO advertising held by Google; the oligopoly-duopoly of Google and Facebook in platforms for user-generated social content and dependent applications; the monopolistic competition of other topic- or media-based social networks (e.g. SoundCloud, Myspace, YouTube, Vimeo, Goodreads etc.); and the perfect competition of free knowledge sites with professionally created or user-generated content (e.g. Wikipedia, online periodicals, Quora, Stack Overflow etc.).
Are the Freemium business model permutations, including indirect revenue streams, the most economically viable models? Are they supported or undermined by the mixed market environment? The next instalment follows…
[1] Parkin, Michael. Economics.
Anthropology & global issues
This week I have done some more reading on anthropology's methods, complementing the findings I wrote about last week. The more I found out about anthropology, the more I wondered how, as a discipline, it would tackle global issues. From reading the introductory texts, I got the sense that anthropology (of the socio-cultural kind) was concerned with the study of humankind. A priori this doesn't seem to pose a problem in terms of the globality of the subject matter, but in its approach and even epistemology, anthropology is firmly based on the notion of classification. Indeed its ontologies are cultures, peoples, societies, etc., and its methods are primarily descriptive and comparative, assuming the existence of different 'things' to compare. As mentioned in previous posts, an anthropologist looks at a society/community/social group which he/she investigates by doing fieldwork, conducting interviews, historical research, etc. But what happens when the group in question is the entire world population, as is often the case with so-called global issues? How would such a discipline tackle questions that seem to contradict its own epistemological foundations?
Trying to look at the digital divide through an anthropological lens, I have hit what might be the crux of the issue in this assignment: how to let go of my previous assumptions about the world, shaped in large part by my training in International Relations, and, instead of re-phrasing in anthropological terminology the 'problématique' I immediately see with the global digital divide, attempt to 'discover' the problem and 'frame' it as an anthropologist would. In order to try to do that, and while I did find some answers in the introductory readings, I decided nevertheless to look for some more targeted articles on the issue.
An article by Kearney (1995), 'The Local and the Global: The Anthropology of Globalization and Transnationalism' in the Annual Review of Anthropology, was particularly helpful. The answers and thinking I considered fall into two broad categories: theoretical and practical.
On the practical side, Peoples and Bailey (2000, p. 5) assert that for global issues, which have admittedly gained importance in the past two decades, anthropologists are often called to consult on specific projects – an emerging sub-field of the discipline referred to as applied anthropology. The idea here is that solutions to global problems often require local knowledge, provided by traditional anthropological research, which is therefore increasingly useful in the field.
On the theoretical front, Kearney recognises that new thinking is required in 'anthropological theory and forms of representation that are responses to such nonlocal contexts and influences' (1995, p. 547). He sees global issues (and globalisation) as having 'implication for [anthropology's] theory and methods', as research which is limited to local units of analysis 'yield[s] incomplete understandings of the local' (1995, p. 548). He sees the redefinition of space-time into a multidimensional global space with fluid boundaries and sub-spaces as the most important disruption to anthropological epistemology. He also notes that the notions of 'progress' and 'development' assumed in the discipline are, and need to be, questioned in the context of globalisation; that is to say, there is no inevitability in the course of global history. Moreover, with the 'deterritorialisation' of culture, the focus of anthropological study is shifting towards 'identity'. Underpinning these changes is the fundamental reframing of the concept of classification, no longer considered 'an invariant subject of investigation in anthropology, but taken instead as a historically contingent world-view category' (1995, p. 557).
This has given me some interesting avenues to explore so I will conclude my introductory reading on anthropology here. Next week I will start looking at management as a discipline.
References
Kearney, M. (1995) 'The Local and the Global: The Anthropology of Globalization and Transnationalism', Annual Review of Anthropology, Vol. 24, pp. 547-565
Peoples, J. and Bailey, G. (2000) Humanity: An Introduction to Cultural Anthropology, 5th ed., Belmont: Wadsworth/Thomson Learning
Economics, Models and Data
Economic theories are constructed using models and data. Models can be described as frameworks which organise how economists think about a problem; they create a simplified, easier-to-manage reality with which to test theories. Data is the set of facts with which the model interacts, so it needs to be relevant.
Data can be:
- Time series – shows how a variable has changed over time; usually represented graphically.
- Cross-sectional – shows, at a fixed point in time, how a variable differs between groups or individuals.
Data is represented as:
- Index numbers – these allow the comparison of data without units, showing any change relative to a base number. Indexes can also be expressed as averages.
- Nominal or real variables – nominal values show the price of things in current terms, whereas real values take into account factors which may influence the price. For instance, a nominal value may have increased, but the real value might show the increase was due to rising labour costs and that there was no real increase at all.
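Both representations can be sketched numerically. The wage and price figures below are invented, and the deflation step assumes a simple price index with base 100:

```python
# Index numbers: express a series relative to a base value of 100.
def to_index(series, base_pos=0):
    base = series[base_pos]
    return [100 * v / base for v in series]

# Real values: deflate nominal values by a price index (base 100).
def to_real(nominal, price_index):
    return [100 * n / p for n, p in zip(nominal, price_index)]

nominal_wages = [200, 210, 220]  # hypothetical nominal series
prices        = [100, 105, 110]  # hypothetical price index

print(to_index(nominal_wages))         # [100.0, 105.0, 110.0]
print(to_real(nominal_wages, prices))  # [200.0, 200.0, 200.0]
```

Here the nominal series rises by 10%, but deflating by the price index shows no real increase at all, which is exactly the distinction the bullet point above describes.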
Economic models use empirical research to examine the relationship of interest.
Therefore economists:
- Construct a theory
- Develop a model to test the theory
- Test the theory with data
EW IV: Philosophy and Law
This week I had a look at David Bainbridge's Introduction to Computer Law and Godwin's Cyber Rights. The Bainbridge is terrifically dull – he's a professor of law and business and it really comes across in the text. I did find this vaguely useful, in the sense that now I realise why I will never be a lawyer. Anyway, he does make some useful commentary on the issue of freedom of expression, which is what I think I am going to be approximately focusing on for my coursework. He goes through a few case studies of real trials and discusses the outcomes, which might be interpreted as being problematic in different ways. I think what Bainbridge is saying is that there isn't really much precedent for questions of freedom on the internet yet – only a limited number of real-life trials have happened, and the results aren't necessarily consistent.
The other book, Godwin's Cyber Rights: Defending Free Speech in the Digital Age, is much better, and I would recommend it. As the title suggests, it is advocacy-oriented right from the start. Godwin thinks that freedom on the web is something that should be defended, and that we should be much more worried about the consequences of restricting people than about the consequences of not restricting them. Godwin is himself a lawyer, and discusses a large number of case studies on the issue of rights on the internet, particularly as related to free speech. He also argues that the web is really quite different from the communication inventions that came before. On page 75 he says:
“The constitutional justification for special regulation of broadcast content – which covers radio, television, and cable and includes regulations like time-based restrictions (such as limiting material for mature audiences to distribution at certain times) – has been twofold. First is the concept of scarcity of resources. There is a notion that broadcasting frequencies are so scarce that the government is the only institution with a global enough perspective to step in, allocate them, and govern their use for the public good. Second is the notion that broadcasting is pervasive in some fashion – that it creeps into the home in a way that makes it unique. Regardless of whether you accept these justifications for content control over the airwaves, the fact is that the internet is nothing like broadcasting in either way. Internet communication is not scarce. Every time you add a computer node to the internet, you’ve expanded its size. It is not pervasive because (with the arguable exception of spam…) you don’t have people pushing content into your home; you have people logging on and pulling content from all over the world… It is a fundamentally choice-driven medium for communication…” – p75
Wittgenstein (Philosophy of Semantics)
Augustine describes the process of learning language and human behaviour as a child: by seeing the words and motions used in the proper place and at the proper time, he learned to use them properly himself. Speech software can do something a bit like this. Basically, soft AI can learn to manipulate speech, though it has no conscious desires outside of what has been programmed. In mimicking physical human behaviour, though, we might stray into the uncanny valley.
'Every word has a meaning.' – p2
Wittgenstein draws up an analogy for the use of language as mental object retrieval, in which a shopkeeper is given the instruction to retrieve five red apples. The 'apples' are matched to a catalogue, the colour 'red' is compared to a colour sample, and the cardinal numbers up to 'five' are listed. For each number, the shopkeeper retrieves one apple of the chosen colour. Following this protocol, the shopkeeper fulfils the instructions and may return to a position of readiness. – p3
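The shopkeeper's protocol reads almost like pseudocode, so a literal (and deliberately naive) sketch may help make the point about mechanical rule-following. The catalogue and colour samples here are invented stand-ins, not anything from Wittgenstein's text:

```python
# A naive, literal rendering of the shopkeeper's protocol: match the item
# against a catalogue, match the colour against a sample, then count out
# one item per cardinal number. All data structures are invented stand-ins.
catalogue = {"apples": "bin 7"}       # where each stocked item is kept
colour_samples = {"red", "green"}     # swatches to compare against

def fulfil(item, colour, count):
    if item not in catalogue or colour not in colour_samples:
        raise ValueError("no matching catalogue entry or colour sample")
    basket = []
    for _ in range(count):            # recite the cardinal numbers
        basket.append((colour, item)) # retrieve one item per number
    return basket

order = fulfil("apples", "red", 5)
print(len(order))  # 5
```

The point of the simplification survives translation: nothing in the procedure requires the shopkeeper (or the program) to 'understand' what an apple is; following the training suffices.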
The simplification in this example helps to draw aside some of the murkiness which 'surrounds the working of language' (p4), and highlights the fact that in the earlier stages of language learning – that is, learning the functions of words – it is not explanation that is imparted, but training.
'In the practice of the use of language one party calls out the words, the other acts on [responds to] them.' – p5
'Naming something is like attaching a label to a thing.' – p7
'What are the simple constituent parts of which reality is composed?' Our conception of things (chairs, trees) is made up of parts, but what is the simplest (i.e. not composite) form of these parts? The elements? The atoms? We infer a great deal from looking at a wooden chair: the wood, and all that it implies (trees, branches, forests, saws, varnish, factories); the paint; how comfortable it may be. This complex web of background knowledge is completely natural in humans but really hard for computers.
'A name signifies only what is an element of reality.' – p29
And as an aside, from Donna Haraway: 'Microelectronics mediates the translations of … mind into artificial intelligence and decision procedures.' – p304
Demographic World View: Act One Scene Two
Etymologically, demography comes from the Greek words demos (population) and graphia (description or writing). Stated informally, demography tries to answer the following questions:
– How many people of what kind are where?
– How did the number of people come about?
– What is the implication of the number derived?
Formally, demography is the scientific study of human population and its dynamics.
Demography deals with aggregates of individuals; it describes the characteristics of populations. Most demographic studies employ quantitative and statistical methods; features of a population are often measured by counting people in the whole population or in sub-populations and comparing the counts.
Population size is a number with both absolute and relative connotations. In the absolute sense, human population size quantifies the number of people in a country, region or space. Beyond the numerical quantity is the concern for distribution both within and among countries, regions, or spaces; this accounts for the relative connotation. Resulting from the concepts of population size and distribution is population density, which is the relationship between population size, distribution, and the space that contains them.
Population density is consequential to the well-being of the population. Notably, population density helps explain the viral spread of disease, knowledge, and ideas: epidemics are more likely to occur in a densely populated space, just as knowledge and ideas diffuse more easily there.
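Density, as defined above, is simply population size over the area that contains it. A minimal sketch, with hypothetical figures:

```python
# Population density: people per unit area. The figures are invented,
# roughly the scale of a large, dense city.
def population_density(population, area_km2):
    return population / area_km2

print(population_density(8_000_000, 1_600))  # 5000.0 people per square km
```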
Population study is concerned with the size and distribution of identifiable subgroups within populations. This concern yields information on the structure and composition of a population. The characterisation (categorisation or classification) of a population relies on an endless list of traits: age, gender, education, religion, income, occupation, language, race, ethnicity, etc. However, some traits are more useful than others – those that change less frequently or have a predictable pattern of change. Age and gender are the most basic and influential characteristics in demographic processes, hence they are known as demographic characteristics.
The dynamics of population are rooted in the basic demographic processes of birth, death, and migration. Basically, population changes can be associated with leaving or entering: to leave means dying or emigrating, and to enter means being born or immigrating. This fact can be depicted in the basic demographic equation that follows:
P(t+1) = P(t) + B(t,t+1) − D(t,t+1) + I(t,t+1) − E(t,t+1)
where P(t) is the number of persons at time t and P(t+1) the number of persons one year later; B(t,t+1) and D(t,t+1) are the numbers of births and deaths that occur between times t and t+1 respectively; and I(t,t+1) and E(t,t+1) represent the numbers of immigrants to and emigrants from the population between times t and t+1.
The difference between B(t,t+1) and D(t,t+1) is referred to as natural increase (or decrease when the difference is negative), while the difference between I(t,t+1) and E(t,t+1) is known as positive net international migration when the difference is positive and negative net international migration otherwise.
Growth, in demographic parlance, refers to change in population size. From the demographic equation above, growth means the difference between P(t+1) and P(t), even when this difference is negative. The interplay of demographic processes results in population growth as well as compositional changes in population.
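The balancing equation can be rendered directly as a short function. The component values below are invented purely for illustration:

```python
# Basic demographic balancing equation:
#   P(t+1) = P(t) + B - D + I - E
def project_population(p_t, births, deaths, immigrants, emigrants):
    return p_t + births - deaths + immigrants - emigrants

births, deaths = 15_000, 9_000        # hypothetical vital events
immigrants, emigrants = 4_000, 6_000  # hypothetical migration flows

natural_increase = births - deaths       # 6000: a natural increase
net_migration = immigrants - emigrants   # -2000: negative net migration
p_next = project_population(1_000_000, births, deaths, immigrants, emigrants)
print(p_next)  # 1004000
```

Note that the population still grows here despite negative net migration, because the natural increase outweighs it; with the signs reversed, the same function would show decline.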
Readings
David Yaukey and Douglas L. Anderton, Demography: The Study of Human Population 2nd ed., 2001
Dudley L. Poston, JR. and Leon F. Bouvier, Population and Society: An Introduction to Demography, 2010
Current disciplinary debates in Political Science
Political Science
The book I have been reading this week contains an overview of how political science has evolved as a discipline in recent decades. Entitled Making Political Science Matter (Schram & Caterino, 2006), this edited book builds on a debate sparked by Flyvbjerg (2001) focused on the limitations of the then-current methodologies in social inquiry. One of the book's claims is that methodological diversity in this field is somewhat constrained by the pluralism of post-positivism. In other words, positivism in political science emulates the natural sciences in dividing the discipline into subfields that become isolated from one another, each with its own methodologies. Owing to this division, or constrained pluralism, a need for 'trading zones', or common understanding between disciplines, has been identified.
Also, all the essays in the book are highly critical of the application of 'hard science' – in which quantitative methods are included – to political analysis, as this approach seems too distant from the object of study, which in this case is society, composed in turn of people, not objects. This is why hard science cannot fully explain or provide a complete understanding of social phenomena. This limitation is leading to a revolutionary period in which a movement called Perestroika is challenging the current paradigm in social science. Together with Flyvbjerg, Perestroika aims to include – not to switch to – phronesis in the study of politics. Phronesis is a key term in the Flyvbjergian debate, meaning that intuition and practical wisdom are critical to the study of social phenomena.
In short, from this book it seems that political science is distancing itself from the paradigms of the natural sciences, moving towards an approach in which social and political phenomena are examined from a more humanist perspective, in which personal experience gains significance. This shift might need to be considered by other disciplines, such as computer science, when looking for common ground – a 'trading zone' in which to have fluid communication.