Locational State Theory (LST) is based on the analysis of datasets of object properties, taking into account divergences (variance) and convergences (equivalence) of property values associated with the specific location of objects in space-time. The term locational-state pairs a location in space-time with the state, that is, the corresponding value of a property at that location. LST is concerned with a general canonical form: a natural or unique representation of an object that includes the space-time dimension.
This advancing domain developed from an initial concern with the exact specification and communication of information requirements, as well as with validating that information sent in response to a requirement is what had been requested.
This work was initiated by Hector McNeill1, a Senior Scientific Officer with the Information Technology & Telecommunications Task Force (ITTTF) of the European Commission in Brussels, in 1985.
McNeill was concerned with systems strategies associated with the evolution of a global network. This was a decade before the World Wide Web became a reality, but it was already clear at that time that the Internet would become the foundation network for this transition. McNeill foresaw the current issues related to the circulation of poor-quality information, as well as intentionally biased information, making decision-making less effective, erroneous and therefore risky. On the side of risk, the antagonism that can arise from misleading information, and the recriminations arising from poor decisions by policy-makers affecting millions of constituents, are states of affairs that should and can be avoided. At the extreme, misleading information can lead to conflict2.
However, Locational State Theory did not start out as the foundation for a theory; it was only intended to provide a simple way to establish unambiguous specifications of the datasets required to satisfy specific information requirements. In fact, this process started out to answer a simple question: "How can we safeguard against misrepresentation in information received over a global network, in order to be sure that we can base decisions on that information?"
The ability to specify information requirements precisely, and the capability to assess how well the information provided matches the original requirement, can only be judged to the degree that the person asking the question and receiving the answers has some notion of the probability that the information provided corresponds to precisely what was requested. Those demanding information for a specific purpose do not always know how to specify their requirement in terms of accuracy and representation. As a result, they are not always aware of the degree to which the information received corresponds to their original request. The concept of complete and incomplete datasets is often ill-understood. Consequently, decisions can be taken using poor-quality information, resulting in disappointing outcomes. Since many policy and business decisions carry significant economic and social implications when they go wrong, the issue of data quality specifications is of importance3. This problem of defining information requirements precisely is related to the multi-disciplinary nature of most issues; single individuals usually do not possess the breadth of knowledge required to undertake the appropriate level of decision analysis. However, locational-state provides an important foundation for the establishment of due diligence procedures, helping both the information requester and the supplier to end up with a more coherent question-response relationship. Such due diligence procedures can help define what needs to be taken into account when specifying information of importance to the requester. Locational state can help the questioner identify cross-related additional questions with which to cross-check replies. To do that, another dimension of information is required, related to causality and the determinants of outcomes, which flags other associated data requirements with which to cross-check the validity of a received dataset.
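The idea of validating a received dataset against a precisely stated requirement can be illustrated with a minimal sketch. The field names, ranges and the validate_record function here are hypothetical illustrations, not part of any formal LST specification:

```python
# Hypothetical sketch: checking a received record against a requirement
# specification, to flag incomplete or out-of-range data before it is
# used in decision analysis.
REQUIRED_FIELDS = {"object_id", "property", "value",
                   "latitude", "longitude", "altitude", "timestamp"}

def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record
    matches the (illustrative) requirement specification."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if "latitude" in record and not -90 <= record["latitude"] <= 90:
        problems.append("latitude out of range")
    if "longitude" in record and not -180 <= record["longitude"] <= 180:
        problems.append("longitude out of range")
    return problems

record = {"object_id": "obj-1", "property": "temperature", "value": 21.5,
          "latitude": 51.5, "longitude": -0.12, "altitude": 35.0,
          "timestamp": "2017-11-01T12:00:00Z"}
print(validate_record(record))  # → []
```

A due diligence procedure of the kind described above would, in effect, agree such a specification between requester and supplier before any data are exchanged.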
This proposition and its explanation first appeared in an internal brief prepared by McNeill in 1985 for economic sector stakeholder panels. The establishment of stakeholder/user panels was pioneered by McNeill at the ITTTF in Brussels to assist in the development of online learning/innovation systems across a range of sector applications.
Locational state, then, concerns the precise specification of data elements that are object properties.
This approach has important implications for decision analysis by creating a close relationship between determinant decision analysis models and the current state of knowledge on relevant cause-and-effect relationships, probabilities of events and information quality.
The word Locational is not an English word in common usage, but it acts as an adjective with the syntactic role of qualifying the nature of the properties that describe an object, or its State, as being dependent on space-time coordinates expressed as geographic (longitude, latitude and altitude) and time coordinates.
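A property value bound to space-time coordinates can be sketched as a simple record type. The names LocationalState and values_converge below are hypothetical illustrations of the divergence/convergence idea, not formal LST notation:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class LocationalState:
    """Hypothetical record binding an object property (the state)
    to space-time coordinates (the location)."""
    object_id: str
    prop: str          # property name, e.g. "temperature"
    value: float       # the state: the property's value at this location
    latitude: float    # degrees
    longitude: float   # degrees
    altitude: float    # metres
    timestamp: datetime

def values_converge(a: LocationalState, b: LocationalState,
                    tol: float = 1e-6) -> bool:
    """Two observations of the same property converge (are equivalent)
    when their values agree within a tolerance; otherwise they diverge."""
    return a.prop == b.prop and abs(a.value - b.value) <= tol

# Two observations of the same property at the same place and time
r1 = LocationalState("obj-1", "temperature", 21.5,
                     51.5, -0.12, 35.0, datetime(2017, 11, 1, 12, 0))
r2 = LocationalState("obj-1", "temperature", 21.5,
                     51.5, -0.12, 35.0, datetime(2017, 11, 1, 12, 0))
print(values_converge(r1, r2))  # → True
```

Comparing many such records that share coordinates is one way of operationalizing the variance (divergence) and equivalence (convergence) of property values across locations described above.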
1 Hector McNeill is director of SEEL-Systems Engineering Economics Lab. SEEL was established by McNeill in 1983 and it is now a specialized unit of the George Boole Foundation Limited. He is a graduate of Cambridge University and completed post-graduate studies at Cambridge and Stanford Universities.
In his own words, McNeill explained that, "My work at the time was concerned with the development of initiatives for information applications. My own brief was to identify ways to establish broad civilian benefits from a wider use of a global communications network. The basic theme was the convergence of all then-current analog and digital applications into a single digital standard. I was also asked to develop 'learning systems' initiatives. However, I am afraid that the concept generally held was that 'learning systems' consisted of course preparation modules and online university tuition like the Open University model in the UK. However, by that time (1985) I had been developing the Real Incomes Approach to economics for more than a decade. This approach to macroeconomics is designed to promote technological change and the development of human technique, learning, the accumulation of tacit knowledge and improvement in the quality of explicit knowledge. This is the very human foundation for the dissemination of innovation. These human traits account for 60% of economic growth and the improvement of human welfare. Therefore, for me, a learning system is the organization of society so as to maximize the acquisition and use of advancing knowledge by all people at any life stage. This learning system is an ongoing organism that supports us throughout our life."
"This approach, however, faces many challenges. We all know that part of the process of learning is trial and error, but better-quality information and sharing of experience not only caution us on what to avoid but also help us deploy decision analysis leading to more successful outcomes. If we receive defective information, the process of human advance is debased and corrupted. Indeed, in many transactions, the art of lying by misrepresentation or omission of key facts is assumed, by some, to be part of the way they gain advantage over others. This creates problems for those who, for philosophical or religious reasons, believe that a better state to be in is one where all can take decisions based on an ability to trust that supplied information is factual. This, of course, would lead to a better condition for mankind, supported by mutual consent, to be established within constitutional frameworks that are designed to uphold this condition by protecting society from the abuse of misrepresentation and imposing sanctions on those whose behavior prejudices others."
2 Today this has become more obvious in the political sphere and in the so-called "Fake News" battle that erupted (2016-2017) between social, alternative and "mainstream" media, all of which have significant problems with information quality and reliability.
3 The recent (November 2017) US Congressional hearings concerning the use of social media services such as Google, Facebook and Twitter by "foreign agents" are a case study of a completely structureless interaction: wholly deficient questions from Congress members, reflecting preconceived ideas of what was happening and a lack of understanding of how such systems work, combined with deficient responses from these organizations.