"Prediction markets could be used as a tool to aggregate and evaluate information, a kind of "weak crystal ball" that could help lawmakers make more informed decisions.
...
The idea is that rather than taking a survey to get the average opinion and relying on the wisdom of crowds, the prediction market identifies the wisdom in crowds, because the market only attracts participants who feel confident enough in their predictions to put money on the line. Prediction markets give participants a financial incentive to get things right."
http://www.miller-mccune.com/politics/government-prediction-markets-3656/
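A minimal numerical sketch of the incentive logic described above, with hypothetical numbers: a risk-neutral trader who believes an event has probability q faces a market price p for a contract paying $1 if the event occurs, so the expected profit is q - p per contract, and only traders whose beliefs diverge from the current price have any reason to trade.

```python
# Sketch (hypothetical numbers): why prediction markets attract only
# confident participants. A contract pays $1 if the event occurs and
# trades at price p; a risk-neutral trader with subjective probability q
# expects q - p per contract bought, or p - q per contract sold.

def expected_profit(q: float, p: float, side: str = "buy") -> float:
    """Expected profit per $1 contract for a risk-neutral trader."""
    return q - p if side == "buy" else p - q

price = 0.60  # current market price of the $1 contract
for q in (0.55, 0.60, 0.75):
    side = "buy" if q > price else "sell"
    print(f"belief={q:.2f}: {side}, expected profit "
          f"{expected_profit(q, price, side):+.2f} per contract")
# A trader whose belief equals the price expects zero profit and stays
# out; only those confident the price is wrong put money on the line.
```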
Jacques Fresco on the problem of superstitious ignorance
http://www.youtube.com/watch?v=EXWnFeG5EyA
"There are four types of knowledge services: generate content, develop products, provide assistance, and share solutions. Knowledge services are modeled as a circular value chain comprising nine stages that embed, advance, or extract value from knowledge-based products and services. The stages are: generate, transform, manage, use internally, transfer, enhance, use professionally, use personally, and evaluate. (Simard, 2007) described a rich to reach service delivery spectrum that is segmented into categories of recipients, with associated levels of distribution, interactions, content complexity, and channels. The categories, from rich to reach, are: unique (once only), complex (science), technical (engineering), specialized (professional), simplified (popular), and mandatory (everyone).
From the perspective of knowledge markets, McGee and Prusak (1993) note that people barter for information, use it as an instrument of power, or trade it for information of greater value. Davenport and Prusak (1998) used a knowledge marketplace analogy to describe the exchange of knowledge among individuals and groups. However, Shapiro and Varian (1999) indicate that information markets will not resemble textbook competitive markets with many suppliers offering similar products but lacking the ability to influence prices. Simard (2006) described knowledge markets as a group of related circular knowledge-service value chains that function collectively as a sector, to embed, advance, and extract value to yield sector outcomes and individual benefits."
http://en.wikipedia.org/wiki/Knowledge_market#Knowledge_services
"In large research organizations there is a tendency for new research projects to originate in knowledge silos. Applied research faces growing challenges of how to consider the needs of the customers, scientific knowledge and societally relevant questions in research projects. Service science is an emerging research interface, in which participation from different disciplines, such as design, business needed. In front of these challenges collaboration across the silos, hierarchical levels, disciplines and different actors is indispensable. We claim that multifaceted collaboration does not emerge without specific efforts. Cross-disciplinary research requires network processes for initiating learning, synergy and collaboration. We analyse a method aiming at co-creation in a multidisciplinary research network. We focus especially on how this interactive and cocreative process promotes the crossing of borders of knowledge silos."
http://www.cba.neu.edu/uploadedFiles/Site_Sections/OLKC_2010/Program_Overview/Parallel_Sessions/124_Halonen_Full%20Paper_326_Crossing%20the%20borders%20of%20knowledge%20silos%20in%20Service%20Science%20anf%20Business%20network.pdf
"At the most basic level, a job is essentially a set of incentives. As a person acts according to those incentives, he or she performs work that is currently required in order to produce products and services. In the economy of the future, if that work is no longer required, we will need to create “virtual” jobs. In other words, people will continue to earn income by acting in accordance with incentives, but their actions will not necessarily result in “work” in the
traditional sense."
http://www.thelightsinthetunnel.com/
"In The Future of Work, renowned organizational theorist Thomas W. Malone, codirector of MIT’s landmark initiative “Inventing the Organizations of the 21st Century,” shows where these things are already happening today and how—if we choose—they can happen much more in the future. Malone argues that a convergence of technological and economic factors—particularly the rapidly falling cost of communication—is enabling a change in business organizations as profound as the shift to democracy in governments. For the first time in history, says Malone, it will be possible to have the best of both worlds—the economic and scale efficiencies of large organizations, and the human benefits of small ones: freedom, motivation, and flexibility."
http://ccs.mit.edu/futureofwork/
"Coordination may be defined as the process of managing dependencies between activities (Malone & Crowstone, 1994). The need for coordination arises from the fact that literally all organizations are a complex aggregation of diverse systems, which need to work or be operated in concert to produce desired outcomes. To simplify the picture, one could decompose an organization into three broad components of actors, goals and resources. The actors, comprising of entities such as management, employees, customers, suppliers and other stakeholders perform interdependent activities aimed at achieving certain goals. To perform these activities, the actors require various types of inputs or resources. As explained later in the paper the inputs may themselves be interdependent in the ways that they are acquired, created or used. The goals to which the actors aspire are also diverse in nature. Some of them will be personal while others are corporate. Even where the goals are corporate, they address different sets of stakeholders and may be in conflict.
Calls for coordination are evident in situations where a) temporality is a factor, such that effects of delays or of future consequences of today’s decisions are not immediately apparent, b) there is a large number of actors, c) there is a large number of interactions between actors or tasks in the system, or d) combinations or occurrences in the system involve an aspect of probability (stochastic variability). In summary, the more complex the system (and organizations are complex aggregations), the more coordination is necessary.
Multiple actors and interactions, resources and goals need to be coordinated if common desired outcomes are to be achieved. Viewed from the need to maintain perspective and solve problems that might arise from these multiplicities, coordination links hand in glove with the concept of systems thinking."
http://en.wikibooks.org/wiki/Systems_Theory/Coordination
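Malone and Crowston analyze coordination by classifying dependencies (shared resources, producer/consumer or "flow" relationships, simultaneity constraints) and matching them with coordination mechanisms. As a minimal sketch of managing one such dependency, the hypothetical example below orders interdependent activities so that no activity starts before the activities whose outputs it consumes have finished; all task names are invented for illustration.

```python
# Sketch: managing producer/consumer ("flow") dependencies between
# activities by computing an order in which they can be performed.
# Task names and dependencies are hypothetical.
from graphlib import TopologicalSorter

# activity -> set of activities whose outputs it consumes
dependencies = {
    "design": set(),
    "procure_parts": {"design"},
    "assemble": {"procure_parts", "design"},
    "test": {"assemble"},
    "package": {"assemble"},
    "ship": {"test", "package"},
}

order = list(TopologicalSorter(dependencies).static_order())
print(order)
# e.g. ['design', 'procure_parts', 'assemble', 'test', 'package', 'ship']
# A circular dependency raises graphlib.CycleError, a signal that the
# dependency structure itself must be redesigned, not just rescheduled.
```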
What is Coordination Theory and How Can It Help Design Cooperative Work Systems:
http://crowston.syr.edu/system/files/10.1.1.92.4445.pdf
A Coordination Theory Approach to Organizational Process Design:
http://orgsci.journal.informs.org/content/8/2/157.abstract
Perspectives on Mechanism Design in Economic Theory:
http://www.nobelprize.org/nobel_prizes/economics/laureates/2007/myerson_lecture.pdf
Coordinating Mechanisms in Care Provider Groups: Relational Coordination as a Mediator and Input Uncertainty as a Moderator of Performance Effects:
http://www.jstor.org/pss/822615
Mechanism Design for Automated Negotiation and its Application to Task Oriented Domains:
http://www.dia.fi.upm.es/~phernan/AgentesInteligentes/referencias/zlotkin96.pdf
Electronic Markets and Electronic Hierarchies:
http://is.esade.edu/faculty/wareham/Teaching/StratNetComp/Readings/Electronic%20Markets%20and%20electronic%20Hierarchies.pdf
The Promise of Prediction Markets:
http://www.arlingtoneconomics.com/studies/promise-of-prediction-markets.pdf
Market Engineering:
http://www.econbiz.de/archiv/ka/uka/information/market_engineering.pdf
"Mechanism design, an important tool in microeconomics, has recently found widespread applications in modeling and solving decentralized design problems in many branches of engineering, notably computer science, electronic commerce, supply chain management, network economics, and services science and engineering. Mechanism design is concerned with settings where a social planner faces the problem of aggregating the announced preferences of multiple agents into a collective decision when the agents exhibit strategic behavior."
http://lcm.csa.iisc.ernet.in/iisc-ibm-workshop/tutorials.html
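The canonical small example of such a mechanism (standard textbook material, not taken from the tutorial linked above) is the Vickrey second-price auction: agents announce bids, the highest bidder wins, and the price is the second-highest bid, which makes truthful announcement a dominant strategy. A minimal sketch with hypothetical bidders:

```python
# Sketch: Vickrey (second-price, sealed-bid) auction.
# The highest bidder wins but pays the second-highest bid, so the
# winner's payment does not depend on their own announcement.

def vickrey_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price) for a second-price sealed-bid auction."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest announced bid
    return winner, price

bids = {"alice": 120.0, "bob": 95.0, "carol": 110.0}  # hypothetical values
winner, price = vickrey_auction(bids)
print(winner, price)  # alice wins and pays 110.0, not her own 120.0
# Overbidding risks winning at a price above one's true value; underbidding
# risks losing a profitable trade; so truthful bidding is optimal.
```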
"MS&E 181: Issues in Technology and Work for a Postindustrial Economy
How changes in technology and organization are altering work and lives. Approaches to studying and designing work. How understanding work and work practices can assist engineers in designing better technologies and organizations. Topics include job design, distributed and virtual organizations, the blurring of boundaries between work and family life, computer supported cooperative work, trends in skill requirements and occupational structures, monitoring and surveillance in the workplace, downsizing and its effects on work systems, project work and project-based lifestyles, the growth of contingent employment, telecommuting, electronic commerce, and the changing nature of labor relations.
...
MS&E 201: Dynamic Systems
Goal is to think dynamically in decision making, and recognize and analyze dynamic phenomena in diverse situations. Concepts: formulation and analysis; state-space formulation; solutions of linear dynamic systems, equilibria, dynamic diagrams; eigenvalues and eigenvectors of linear systems, the concept of feedback; nonlinear dynamics, phase plane analysis, linearized analysis, Liapunov functions, catastrophe theory. Examples: grabber-holder dynamics, technology innovation dynamics, creation of new game dynamics in business competition, ecosystem dynamics, social dynamics, and stochastic exchange dynamics.
...
MS&E 236H: Game Theory with Engineering Applications
Advanced and mathematically more rigorous version of MS&E 236. Strategic interactions among multiple decision makers emphasizing applications to engineering systems. Topics: efficiency and fairness; collective decision making and cooperative games; static and dynamic noncooperative games; and complete and incomplete information models. Competition: efficient markets; Bertrand, Cournot, and Stackelberg models. Mechanism design: auctions, contracts. Examples from engineering problems."
...
MS&E 248: Economics of Natural Resources
Intertemporal economic analysis of natural resource use, particularly energy, and including air, water, and other depletable mineral and biological resources. Emphasis is on an integrating theory for depletable and renewable resources. Stock-flow relationships; optimal choices over time; short- and long-run equilibrium conditions; depletion/extinction conditions; market failure mechanisms (common-property, public goods, discount rate distortions, rule-of-capture); policy options.
...
MS&E 299: Voluntary Social Systems
Ethical theory, feasibility, and desirability of a social order in which coercion by individuals and government is minimized and people pursue ends on a voluntary basis. Topics: efficacy and ethics; use rights for property; contracts and torts; spontaneous order and free markets; crime and punishment based on restitution; guardian-ward theory for dealing with incompetents; the effects of state action-hypothesis of reverse results; applications to help the needy, armed intervention, victimless crimes, and environmental protection; transition strategies to a voluntary society.
...
MS&E 302: Optimal Dynamic Systems
Controllability and observability, stabilizing feedback. Optimal control theory and the Pontryagin maximum principle; problems with inequality constraints, transversality condition, discounting cost, infinite horizon problem; the Hamilton-Jacobi-Bellman equation; stochastic control. Applications: optimal economic growth, control of predator/prey systems, spread of product innovation.
...
MS&E 336: Topics in Queuing Networks
Advanced efficient control and high-performance design of queuing systems involving job scheduling and resource (server) allocation. Dynamic and stochastic scheduling. Resource allocation in random environments. Real-time scheduling algorithms. Efficient control of queuing networks (routing, admission, flow control, etc.). Performance evaluation of complex queuing structures; identification of performance bottlenecks and techniques for alleviating them. General principles and methodology of high-performance design. Case studies and applications to the design of communication networks, high-speed switching, computer systems, flexible manufacturing systems, service systems, parallel and distributed processing networks, etc.
...
MS&E 336: Topics in Game Theory with Engineering Applications
Seminar. Recent research applying economic methods to engineering problems. Recent topics include: incentives in networked systems; mechanism design in engineered systems; and dynamics and learning in games.
...
MS&E 343: Optimal Control Theory with Applications in Economics
Classical and nonclassical optimal control applications in economics. Necessary and sufficient optimality conditions: maximum principle and HJB equation. Applications: single-person decision problems such as dynamic pricing, investment, marketing, and harvesting of renewable resources; multi-agent games such as dynamic oligopolies with open and closed-loop equilibria, capital accumulation, and dynamic pricing; and design of economic mechanisms such as screening contracts, regulation, and auctions.
...
MS&E 344: Applied Information Economics
The strategic acquisition, pricing, transfer, and use of information. Theoretical findings applied to real-world settings. Topics: optimal risk bearing, adverse selection, signaling, screening, nonlinear and state-contingent pricing, design of contests, incentives and organizations, strategic information transmission, long-run relationships, negative information value, research and invention, leakage and espionage, imperfect competition, information sharing, search and advertising, learning, and real-option exercise games."
http://explorecourses.stanford.edu/CourseSearch/m_search?page=0&q=MS%26E&filter-catalognumber-MS%26E=on
"We’re drowning in data. Bits are faster than atoms. Our jungle-surplus wetware can’t keep up. At least, not without Boyd’s help. In a society where every person, tethered to their smartphone, is both a sensor and an end node, we need better ways to observe and orient, whether we’re at home or at work, solving the world’s problems or planning a play date. And we need to be constantly deciding, acting, and experimenting, feeding what we learn back into future behavior.
We’re entering a feedback economy."
http://www.forbes.com/sites/oreillymedia/2012/01/05/goodbye-information-economy-hello-feedback-economy/
Humanist Community Forum (2011-10-23): Economic Issues from a Humanist Perspective (Hamid Javanbakht)
Hamid Javanbakht, a student studying service systems engineering and post-industrial economics, will be discussing how incentive-altruism and selfishness need not always work at cross-purposes. The field of mechanism design seeks to set up compatible incentives which take into account the various preferences of local agents so that their actions not only profit themselves individually but also extend beyond their own interest to serve the global good. He will also be questioning whether morality is always effective at producing desirable behavior in cases where perverse incentives are built into the system, such as finance and the military.
http://vimeo.com/32349025
Innovation in Large Service Systems in the Interest of Society
International Working Group on Services in HealthCare, Environment, Energy and Transportation
Executive Speaker Series
In the coming years, we anticipate a diverse and rich set of technologies and service systems that can sense and respond to the needs of the society in areas such as HealthCare, Environment, Energy, Transportation, Green Cars and others. We envision that such service systems will not only transform the way we live, but also dramatically change the underlying services economy.
This working group will ask a fundamental question: what does it take to design reliable, cost-efficient and manageable large service systems that can address the complex needs of the society. Examples of large service systems include healthcare service systems (e.g. remote home monitoring and tele-health), global supply chains, large environmental sensor systems (e.g. water management services), road traffic systems and large enterprises.
...
The working group intends to identify the technical, social and service systems deployment challenges by combining expertise from design, management, engineering, computer science, social sciences and operations research. It intends a focused effort using real-world case-studies that illustrate the interaction between technology, business processes, and human factors to solve problems related to improving patient care, quality of life, environment and more.
http://www.cmu.edu/silicon-valley/ilss/index.html
A Group Decision Making Method for Integrating Outcome Preferences in Hypergame Situations:
http://www.dss.dpem.tuc.gr/pdf/A%20Group%20Decision%20Making%20Method%20for%20Integrating.pdf
"The need to aggregate preferences occurring in many different disciplines: in welfare economics, where one attempts to find an economic outcome which would be acceptable and stable; in decision theory, where a person has to make a rational choice based on several criteria; and most naturally in voting systems, which are mechanisms for extracting a decision from a multitude of voters' preferences. The framework for Arrow's theorem assumes that we need to extract a preference order on a given set of options (outcomes). Each individual in the society (or equivalently, each decision criterion) gives a particular order of preferences on the set of outcomes. We are searching for a preferential voting system, called a social welfare function (preference aggregation rule), which transforms the set of preferences (profile of preferences) into a single global societal preference order. The theorem considers the following properties, assumed to be reasonable requirements of a fair voting method:"
http://en.wikipedia.org/wiki/Arrow's_impossibility_theorem
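The standard concrete illustration of why such a social welfare function is hard to construct is the Condorcet paradox: three voters with individually transitive rankings whose pairwise-majority aggregate is cyclic. A minimal sketch (the voter profiles are the textbook example):

```python
# Sketch: the Condorcet paradox behind Arrow's theorem.
# Each voter submits a transitive ranking (best first), yet pairwise
# majority voting produces a cycle, so no societal order exists.
from itertools import combinations

voters = [
    ["A", "B", "C"],  # voter 1: A > B > C
    ["B", "C", "A"],  # voter 2: B > C > A
    ["C", "A", "B"],  # voter 3: C > A > B
]

def majority_prefers(x: str, y: str) -> bool:
    """True if a majority of voters rank x above y."""
    wins = sum(r.index(x) < r.index(y) for r in voters)
    return wins > len(voters) / 2

for x, y in combinations("ABC", 2):
    a, b = (x, y) if majority_prefers(x, y) else (y, x)
    print(f"{a} beats {b}")
# Output: A beats B, C beats A, B beats C.
# A > B and B > C, yet C > A: the aggregate is intransitive.
```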
Advances in Hypergame Theory:
http://www.sci.brooklyn.cuny.edu/~parsons/events/gtdt/gtdt06/vane.pdf
Hypergame: A two-player game in which player 1 chooses any finite game and player 2 moves first. A pseudoparadox then arises as to whether the hypergame is itself a finite game. - Mathworld
On modeling value constellations to understand complex service system interactions:
http://www.cambridgeservicealliance.org/uploads/downloadfiles/serviceweekslides/2A%20-%20pmaglio-.pdf
"Maglio and Spohrer (2008) call for a multi-disciplinary approach – Service Science – to
understanding value co-creation in socio-technical systems. Service system (Spohrer et al., 2008) is a value co-creation configuration of resources that can be dynamically configured and connected to other service system’s resources. The service system is by nature complex and dynamic, involving people, technology, shared information, and value propositions connecting internal and external service systems (Maglio and Spohrer 2008). It is centered on provider–consumer interactions and is capable of improving its own state and the one of another system through acquiring, sharing or applying resources, with the aim of creating a basis for systematic service production and innovation. These resources can be competencies, knowledge, shared information, technology, people, and organizations."
http://www.requisiteremedy.com/docs/value-co-creation.pdf
"From service science perspective, value co-creation based on mutual understanding between customer and provider is one of fundamental importance. Service-dominant (S-D) logic is tied to the value-in-use meaning of value. The roles of providers and consumers are not distinct, meaning that value is co-created, jointly and reciprocally, also mutually beneficial relationship. However, at crucial points of interaction between customer and provider, where the co-creation experience occurs and where value is co-created, misunderstandings and service breakdowns can destroy the relationship. In this paper, we analyze formally how customer and provider are sharing internal model in the first phase of value co-creation model of service innovation, i.e., co-experience and co-definition. In co-experience, customer and provider perceived the value of each value proposition differently. Customer have an own internal model and so provider is, therefore co-experience is the most crucial feature of service system. Symbiotic hypergame analysis, in general explicitly assumes that the players involved possess subjective internal model of the environment including the counterparts. These assumptions convince us that it is the most adequate and convenient for describing value co-creation process by customer and provider. First, we categorizing customer and provider into the several types based on customer expectation and provider ability. Then, analyze formally using symbiotic hypergame analysis, how mutual understanding can be achieved between customer and provider. From the analysis, mutual understanding can be achieved as long as customer and provider have same interpretation, customer who has high expectation believes that provider is innovative and vice versa. It has been proven by analyzing Hyper Nash equilibrium in each scenario for pair of each type based on symbiotic hypergame analysis."
http://journals.isss.org/index.php/proceedings54th/article/viewFile/1378/483
"CouchSurfing is committed to facilitating intercultural understanding, personal development, and inspiring social experiences amongst the members of its global community. While other online networks help people connect to the friends they already have, CouchSurfing connects people to the new friends they would like to have, and even creates new connections between entire social groups. The community, which has millions of members and is growing fast, consists of people of all walks of life who are interested in hosting travelers in their homes, staying with locals while they travel, and meeting new people at events and social outings.
We became a B Corp because it provides the support for our mission that we would expect to find in the non-profit sector, while allowing us the freedom to innovate that more traditional for-profit companies enjoy.
Europe, North America, and South America are, as of August 2011, the regions with the largest numbers of CouchSurfing members. However, the community continues to reach more and more people in all areas of the world. In the year 2010, South America, Central Asia, Southeast Asia and the Middle East all experienced more than 90% growth rates in sign ups.
The Change We Seek™
CouchSurfing’s goal is to create inspiring experiences for their members that help them to explore the world, whether on the road or at home, and connect with one another. They believe that real social change starts with the individual, and that lasting education comes through personal experience. When people have the ability to reach out of their normal social interactions and spend time with people from different cultures and different backgrounds, they come to demolish their prejudices, from large to small.
The true power of their mission is that people can enrich their lives and the world around them through things that they already enjoy: fun, social contact, and the chance to have experiences that they’ll never forget.
They are committed to helping the world become more understanding, more connected, and more generous across all borders, be they physical or psychological."
http://bcorporation.net/couchsurfing
We believe university-level education can be both high quality and low cost. Using the economics of the Internet, we've connected some of the greatest teachers to hundreds of thousands of students all over the world.
http://www.udacity.com/
In The Coming Education Revolution I discussed Sebastian Thrun and Peter Norvig’s online AI class from Stanford that ended up enrolling 160,000 students. Felix Salmon has the remarkable update:
…there were more students in [Thrun's] course from Lithuania alone than there are students at Stanford altogether. There were students in Afghanistan, exfiltrating war zones to grab an hour of connectivity to finish the homework assignments. There were single mothers keeping the faith and staying with the course even as their families were being hit by tragedy. And when it finished, thousands of students around the world were educated and inspired. Some 248 of them, in total, got a perfect score: they never got a single question wrong, over the entire course of the class. All 248 took the course online; not one was enrolled at Stanford.
Thrun was eloquent on the subject of how he realized that he had been running “weeder” classes, designed to be tough and make students fail and make himself, the professor, look good. Going forwards, he said, he wanted to learn from Khan Academy and build courses designed to make as many students as possible succeed — by revisiting classes and tests as many times as necessary until they really master the material.
And I loved as well his story of the physical class at Stanford, which dwindled from 200 students to 30 students because the online course was more intimate and better at teaching than the real-world course on which it was based.
So what I was expecting was an announcement from Thrun that he was helping to reinvent university education: that he was moving all his Stanford courses online, that the physical class would be a space for students to get more personalized help. No more lecturing: instead, the classes would be taken on the students’ own time, and the job of the real-world professor would be to answer questions from kids paying $30,000 for their education.
But that’s not the announcement that Thrun gave. Instead, he said, he concluded that “I can’t teach at Stanford again.” He’s given up his tenure at Stanford, and he’s started a new online university called Udacity. He wants to enroll 500,000 students for his first course, on how to build a search engine — and of course it’s all going to be free.
http://marginalrevolution.com/marginalrevolution/2012/01/udacity.html
Regional Coordination for Reduced Military Spending: Potential and Design:
http://users.ox.ac.uk/~econpco/research/pdfs/RegionalCoordforReducedMilitary.pdf
Intertemporal choice is the study of the relative value people assign to two or more payoffs at different points in time. This relationship is usually simplified to today and some future date. Intertemporal choice was introduced by John Rae in 1834 in the "Sociological Theory of Capital". Later, Eugen von Böhm-Bawerk in 1889 and Irving Fisher in 1930 elaborated on the model.
http://en.wikipedia.org/wiki/Intertemporal_choice
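A minimal sketch of the standard exponential-discounting model behind this literature (the discount rate and payoffs below are hypothetical): a payoff x received t periods from now is valued at x / (1 + r)^t today, and comparing present values decides the intertemporal choice.

```python
# Sketch: intertemporal choice under exponential discounting.
# present_value = payoff / (1 + r) ** t  for a payoff t periods away.

def present_value(payoff: float, t: float, r: float) -> float:
    return payoff / (1 + r) ** t

r = 0.08  # hypothetical annual discount rate
now = present_value(100, t=0, r=r)    # $100 today      -> 100.00
later = present_value(110, t=1, r=r)  # $110 in a year  -> ~101.85
print(f"today: {now:.2f}, next year: {later:.2f}")
# At r = 8% the delayed $110 is worth more; at any r above 10% the
# ranking flips. The discount rate encodes relative value across time.
```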
This book combines economic theory and design to create tools that economists can apply in social, political and institutional settings. It seeks to provide the stepping stones needed to facilitate the diffusion and adoption of this powerful tool for studying incentive structures in economics. The book presents a number of examples, both theoretical and real-life, and includes a chapter sampling the literature that tests mechanisms away from the blackboard, in laboratories and the real world.
This book provides readers (students and applied economists) with the tools to design the rules of economics to harness the power of incentives.
http://books.google.com/books?id=dCRzDTT4HD4C&dq=mechanism+design+military+spending&output=html_text&source=gbs_navlinks_s
Banking: A Mechanism Design Approach
The authors study banking using the tools of mechanism design, without a priori assumptions about what banks are, who they are, or what they do. Given preferences, technologies, and certain frictions - including limited commitment and imperfect monitoring - they describe the set of incentive feasible allocations and interpret the outcomes in terms of institutions that resemble banks. The bankers in the authors' model endogenously accept deposits, and their liabilities help others in making payments. This activity is essential: if it were ruled out the set of feasible allocations would be inferior. The authors discuss how many and which agents play the role of bankers. For example, they show agents who are more connected to the market are better suited for this role since they have more to lose by reneging on obligations. The authors discuss some banking history and compare it with the predictions of their theory.
http://www.ssc.upenn.edu/~rwright/papers/RVW.pdf
"Too big to fail" is a colloquial term in regulation and public policy that refers to businesses dealing with market complications related to moral hazard, macroeconomics, economic specialization, and monetary theory.
According to this theory, certain financial institutions are so large and so interconnected that their failure will be disastrous to an economy. Proponents of this theory believe that these institutions should become recipients of beneficial financial and economic policies from governments or central banks to keep them alive. It is thought that companies that fall into this category take positions that are high-risk, as they are able to leverage these risks based on the policy preference they receive. The term has emerged as prominent in public discourse since the 2007–2010 global financial crisis.
Some critics see the policy as counterproductive and argue that large banks or other institutions should be left to fail if their risk management is not effective. Moreover, some assert that the "too big to fail" policy has been explicitly refuted in the People's Republic of China, with the insolvency of Guangdong International Trust & Investment Corporation in 1998.
Some economists, such as Nobel Laureate Paul Krugman, hold that economies of scale in banks, as in other businesses, are worth preserving, so long as they are well regulated in proportion to their economic clout, and therefore that "too big to fail" status can be acceptable. The global economic system must also deal with sovereign states being too big to fail. Others, such as Alan Greenspan, disagree: “If they’re too big to fail, they’re too big”.
http://en.wikipedia.org/wiki/Too_big_to_fail
"The liturgical system in Classical Athens (479–322 BCE) privately provided public goods, including naval defense. I use it to evaluate mechanism design policies and to address uncertainties in the historical record by adding predictive economic theory to research by ancient historians. I evaluate the system's success at meeting the conflicting goals of efficiency, feasibility, and budget balance by analyzing the Athenian citizens' incentives within a game of asymmetric information. In the game, multiple equilibria occur; citizens may or may not volunteer for duty or avoid it. I relate the game theoretic findings to historical events."
http://journals.cambridge.org/action/displayAbstract;jsessionid=3C85BAB6548B86F2D4E27D3B3864C219.journals?fromPage=online&aid=1031588
"Dean Gates’ current research focuses on game theory and mechanism design applied to both military manpower and acquisition. In military manpower, this research focuses on designing auctions to set retention and voluntary separation bonuses for military personnel using purely monetary incentives or individualized combinations of monetary and non-monetary incentives. This research has also developed a mechanism for setting assignment incentive pay to attract service members to hard-to-fill billets. In acquisition, this research focuses on incentive contracts, procurement auctions, contractor protests and technology transfer. Past research has involved policy analysis, cost-benefit analysis, burden-sharing in defense alliances, and government R&D and technology policy."
http://www.nps.edu/Administration/Deans/deans_GSPBB.html
Too Big to Know is about what happens to knowledge and expertise now that we are faced with the fact that there is way way way more to know than can be known by any individual. Its hypothesis is that knowledge and expertise are becoming networks, and are taking on the properties of networks.
http://www.toobigtoknow.com/
Market Engineering comprises the structured, systematic and theoretically founded procedure of analyzing, designing, introducing and also quality assuring electronic market platforms as well as their legal framework, regarding simultaneously their market mechanisms and trading rules, systems, platforms and media, and their business models. Market Engineering borrows concepts and methods from Economics, particularly Game Theory and Mechanism Design, but also borrows concepts from Finance, Information Systems and Operations Research.
http://en.wikipedia.org/wiki/Market_engineering
The difficulty in designing and implementing electronic markets is oftentimes the interdependence of technical and economic objectives (Weinhardt et al. 2006). From an economic viewpoint, an electronic market must encompass common economic performance desiderata such as allocative efficiency. Relying on existing market mechanisms known from other contexts when constructing new markets may, however, result in poor efficiency (Lai 2005). The mechanism designer also has to account for the technical conditions of the target domain. To give an example, in the case of a market for allocating Grid computing resources, these conditions comprise the underlying environment in terms of Grid middleware and the requirements of potential Grid users and applications. The market should act as a resource allocation manager, hence fulfilling general requirements upon such a manager. This allows the introduction of the precondition that a market apt for the Grid has to be realized as an electronic market. Otherwise, the market cannot fulfill automated resource allocation as required by a Grid resource management system.
http://www.econbiz.de/archiv/ka/uka/information/market_engineering.pdf
Over the last couple of years, interest in prediction markets as a forecasting method has continuously increased in the scientific world and in industry. Markets provide incentives for information revelation and can be used as a mechanism for aggregating information. So far, prediction markets have done well in every known comparison with other forecasting methods. Whereas information aggregation is only a byproduct of most traditional markets, prediction markets are set up with the explicit purpose of soliciting information. Engineered carefully, prediction markets can directly guide decision making. This paper describes the fundamentals of prediction markets as well as their key design elements. We thereby aim at giving insights into design decisions which have to be made by prediction market operators. Moreover, we contribute to the literature by giving an extensive overview on fields of application of prediction markets which have been discussed in academic literature.
http://www.ksri.kit.edu/Upload/Publications/c0f28bf7-9b5c-4151-bce1-57f6095d5821.pdf
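One concrete design element from this literature is Hanson's logarithmic market scoring rule (LMSR), a standard automated market maker for prediction markets. A minimal sketch; the liquidity parameter and trade size below are hypothetical:

```python
# Sketch: Hanson's logarithmic market scoring rule (LMSR).
# Cost function C(q) = b * log(sum_i exp(q_i / b)); a trader moving the
# outstanding shares from q to q' pays C(q') - C(q). Prices are the
# gradient of C and always sum to 1, so they read as probabilities.
import math

def cost(q: list[float], b: float) -> float:
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def prices(q: list[float], b: float) -> list[float]:
    z = sum(math.exp(qi / b) for qi in q)
    return [math.exp(qi / b) / z for qi in q]

b = 100.0            # liquidity parameter (hypothetical)
q = [0.0, 0.0]       # shares outstanding for YES / NO
print(prices(q, b))  # [0.5, 0.5] before any trading

q_new = [30.0, 0.0]  # a trader buys 30 YES shares
charge = cost(q_new, b) - cost(q, b)
print(round(charge, 2), [round(p, 3) for p in prices(q_new, b)])
# The purchase costs ~16.12 and moves the YES price from 0.50 to ~0.574:
# trades reveal beliefs by shifting prices toward the trader's estimate.
```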
The research group Information & Market Engineering (IM) focuses on the challenges imposed by electronic markets and on the approaches used to meet these challenges. Our vision is to apply a well-defined and structured engineering process to create new markets and to re-engineer existing markets. The methodology of Market Engineering is a very promising way to design and implement new possible market structures combining the disciplines of economics, informatics and law. The basis for a deeper understanding of the research field Market Engineering is established through theories, methods and tools from these underlying disciplines.
http://www.ksri.kit.edu/Default.aspx?PageId=302&lang=en
http://www.im.uni-karlsruhe.de/
Prediction Markets versus Alternative Methods: Empirical Tests of Accuracy and Acceptability
http://www.itas.fzk.de/deu/lit/2009/grae09a.pdf
Information Management and Market Engineering, Volume 1
http://books.google.com/books?id=vCywfD09AMwC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false
Duality arises in linear and nonlinear optimization techniques in a wide variety of applications.
Current flows and voltage differences are the primal and dual variables that arise when optimization and equilibrium models are used to analyze electrical networks. In economic markets, the primal variables are production and consumption levels and the dual variables are prices of goods and services. In structural design methods, tensions on the beams and modal displacements are the respective primal and dual variables.
http://dualityscience.com/yahoo_site_admin/assets/docs/cdgo2007_PDF.21362207.pdf
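For reference, the pairing the passage describes can be written as the standard primal/dual linear programming pair, with quantities as primal variables and prices as dual variables:

```latex
% Standard-form LP duality: x are primal quantities (e.g. production
% levels), y are dual prices on the resource constraints b.
\begin{aligned}
\text{(P)}\quad & \max_{x \ge 0} \; c^{\top}x \quad \text{s.t.}\; Ax \le b,\\
\text{(D)}\quad & \min_{y \ge 0} \; b^{\top}y \quad \text{s.t.}\; A^{\top}y \ge c.
\end{aligned}
% Weak duality: any feasible pair satisfies c^T x <= y^T A x <= b^T y.
% Strong duality: at the optimum the two objectives coincide, and each
% dual variable y_i is the marginal value (shadow price) of resource b_i.
```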
Minkowski Space-Time and Thermodynamics
http://philsci-archive.pitt.edu/4278/1/Minkowski_Spacetime%26Thermodynamics.pdf
Design Theory
http://tinyurl.com/DesignTheory
Iso-: equal in quantity or quality, by extension; same, similar, alike, as much, agree together, consistent + telesis 1) bound and/or unbound intentionality. 2) the common component of state/content/space and syntax/law/time. Co-Agentive: conjunctive agency. Intra-Extensional: locally-denotative, globally-connotative. Constraint-Satisfaction: a constraint is satisfiable if it is possible to find an interpretation (model) that makes it true.
Model-Theoretic Isomorphism, Ehrenfeucht–Fraïssé Games, Utility-Maximizing Languages, Confluent-Semantic Competence, Superrationally-Bound Attractors
"Mainstream reality theory counts among its hotter foci the interpretation of quantum theory and its reconciliation with classical physics, the study of subjective consciousness and its relationship to objective material reality, the reconciliation of science and mathematics, complexity theory, cosmology, and related branches of science, mathematics, philosophy and theology. But in an integrated sense, it is currently in an exploratory mode, being occupied with the search for a general conceptual framework in which to develop a more specific theory and model of reality capable of resolving the paradoxes and conceptual inconsistencies plaguing its various fields of interest (where a model is technically defined as a valid interpretation of a theory in its universe of reference). Because of the universal scope of reality theory, it is subject to unique if seldom-recognized demands; for example, since it is by definition a universal theory of everything that is real, it must by definition contain its rules of real-world interpretation. That is, reality theory must contain its own model and effect its own self-interpretative mapping thereto, and it must conform to the implications of this requirement. This “self-modeling” capacity is a primary criterion of the required framework.
...
According to its mandate, the true description of reality must possess two novel features not found in any dominant paradigm: (1) global structural and dynamical reflexivity or “self-excited circuitry”, with perception an integral part of the self-recognition function of reality; (2) matter-information equivalence, an identification (up to isomorphism) of concrete physical reality with information, the abstract currency of perception. Together, these features constitute a cosmological extension of cybernetics, or equivalently, a metacybernetic extension of cosmology.
...
When a set of observations is explained with a likely set of equations interpreted therein, the adhesion between explanandum and explanation might as well be provided by rubber cement. I.e., scientific explanations and interpretations glue observations and equations together in a very poorly understood way. It often works like a charm…but why? One of the main purposes of reality theory is to answer this question.
The first thing to notice about this question is that it involves the process of attribution, and that the rules of attribution are set forth in stages by mathematical logic. The first stage is called sentential logic and contains the rules for ascribing the attributes true or false, respectively denoting inclusion or non-inclusion in arbitrary cognitive-perceptual systems, to hypothetical relationships in which predicates are linked by the logical functors not, and, or, implies, and if and only if. Sentential logic defines these functors as truth functions assigning truth values to such expressions irrespective of the contents (but not the truth values) of their predicates, thus effecting a circular definition of functors on truth values and truth values on functors. The next stage of attribution, predicate logic, ascribes specific properties to objects using quantifiers. And the final stage, model theory, comprises the rules for attributing complex relations of predicates to complex relations of objects, i.e. theories to universes. In addition, the form of attribution called definition is explicated in a theory-centric branch of logic called formalized theories, and the mechanics of functional attribution is treated in recursion theory.
...
The fact that most such theories, e.g. theories of physics, point to the fundamental status of something “objective” and “independent of language”, e.g. matter and/or energy, is quite irrelevant, for the very act of pointing invokes an isomorphism between theory and objective reality…an isomorphism that is subject to the Reality Principle, and which could not exist unless reality shared the linguistic structure of the theory itself.
Perhaps the meaning of this principle can be most concisely expressed through a generalization of the aphorism “whereof one cannot speak, one must be silent”: whereof that which cannot be linguistically described, one cannot perceive or conceive. So for the observational and theoretical purposes of science and reality theory, that which is nonisomorphic to language is beyond consideration as a component of reality.
Diagram 7: In this syndiffeonic diagram, the assertion “Language differs from reality” is laid out along an extended line segment representing the supposed difference between the relands. Just as in the generic diagram above, both relands possess the attribute “inclusion in the relational syntactic medium (Language ∪ Reality)”. Because they are both manifestations of the same underlying medium, their difference cannot be absolute; on a fundamental level, reality and language share common aspects. This is consistent with the nature of the “difference” relationship, which is actually supposed to represent a semantic and model-theoretic isomorphism."
http://www.ctmu.org/
COGNITION AS INTERACTION
"Many cognitive activities are irreducibly social, involving interaction between several different agents. We look at some examples of this in linguistic communication and games, and show how logical methods provide exact models for the relevant information flow and world change. Finally, we discuss possible connections in this arena between logico-computational approaches and experimental cognitive science."
http://www.illc.uva.nl/Publications/ResearchReports/PP-2005-10.text.pdf
http://staff.science.uva.nl/~johan/research.html
"Preference is a key area where analytic philosophy meets philosophical logic. I start with two related issues: reasons for preference, and changes in preference, first mentioned in von Wright’s book The Logic of Preference but not thoroughly explored there. I show how these two issues can be handled together in one dynamic logical framework, working with structured two-level models, and I investigate the resulting dynamics of reason-based preference in some detail. Next, I study the foundational issue of entanglement between preference and beliefs, and relate the resulting richer logics to belief revision theory and decision theory."
http://www.springerlink.com/content/aw4w76p772007g47/
Utility and Value of Information in Cognitive Science, Biology and Quantum Theory
A generalisation of the concepts utility and information value is given for the non-commutative case. In particular, states and utility operators are considered in dual linear spaces equipped with pre-orders, generated by a wedge of utility operators. It is shown that solutions to the information value problem give rise to an isotone Galois connection between the pre-ordered spaces. A particular form of this connection depends on the choice of a functional representing information resource. The properties of information resource are discussed from the point of information value theory, and an example is presented that generalises several known forms of classical and quantum information. Potential areas of application of information value in cognitive science, biology and quantum theory are discussed.
http://tinyurl.com/quantumbioinformaticsIII
"What is space computing, simulation, or understanding? Converging from several sources, this seems to be something more primitive than what is usually meant by computation, something that was along with us since antiquity (the word "choros", "chora", denotes "space" or "place" and is seemingly the most mysterious notion from Plato, described in Timaeus 48e - 53c) which has to do with cybernetics and with the understanding of the front end visual system. It may have some unexpected applications, also. Here, inspired by Bateson (see Supplementary Material), I explore from the mathematical side the point of view that there is no difference between the map and the territory, but instead the transformation of one into another can be understood by using a formalism of tangle diagrams."
http://arxiv.org/PS_cache/arxiv/pdf/1103/1103.6007v2.pdf
"In ref. [7], S. Majid presents the following `thesis' : ``(roughly speaking) physics polarises down the middle into two parts, one which represents the other, but that the latter equally represents the former, i.e. the two should be treated on an equal footing. The starting point is that Nature after all does not know or care what mathematics is already in textbooks. Therefore the quest for the ultimate theory may well entail, probably does entail, inventing entirely new mathematics in the process. In other words, at least at some intuitive level, a theoretical physicist also has to be a pure mathematician. Then one can phrase the question `what is the ultimate theory of physics ?' in the form `in the tableau of all mathematical concepts past present and future, is there some constrained surface or subset which is called physics ?' Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be found it would be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to do with what is different about the way a physicist looks at the world compared to a mathematician...We can then try to elevate the idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of physics is incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the (supposed) structure of fundamental laws of nature among all mathematical structures by this self-duality condition. Such duality considerations are certainly evident in some form in the context of quantum theory and gravity. The situation is summarised to the left in the following diagram. For example, Lie groups provide the simplest examples of Riemannian geometry, while the representations of similar Lie groups provide the quantum numbers of elementary particles in quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a self-dual picture. Hopf algebras (quantum groups) precisely serve to unify these mutually dual structures.''
http://planetmath.org/encyclopedia/IHESOnTheFusionOfMathematicsAndTheoreticalPhysics2.html
"In the first of three articles, we review the philosophical foundations of an approach to quantum gravity based on a principle of representation-theoretic duality and a vaguely Kantian-Buddist perspective on the nature of physical reality which I have called `relative realism'. Central to this is a novel answer to the Plato's cave problem in which both the world outside the cave and the `set of possible shadow patterns' in the cave have equal status. We explain the notion of constructions and `co'constructions in this context and how quantum groups arise naturally as a microcosm for the unification of quantum theory and gravity. More generally, reality is `created' by choices made and forgotten that constrain our thinking much as mathematical structures have a reality created by a choice of axioms, but the possible choices are not arbitary and are themselves elements of a higher-level of reality. In this way the factual `hardness' of science is not lost while at the same time the observer is an equal partner in the process. We argue that the `ultimate laws' of physics are then no more than the rules of looking at the world in a certain self-dual way, or conversely that going to deeper theories of physics is a matter of letting go of more and more assumptions. We show how this new philosophical foundation for quantum gravity leads to a self-dual and fractal like structure that informs and motivates the concrete research reviewed in parts II,III. Our position also provides a kind of explanation of why things are quantized and why there is gravity in the first place, and possibly why there is a cosmological constant.
Keywords: quantum gravity, Plato's cave, Kant, Buddism, physical reality, quantum logic, quantum group, monoidal category, T-duality, Fourier transform, child development"
http://philsci-archive.pitt.edu/3345/
"In 1993 the famous Dutch theoretical physicist G.’t Hooft put forward a bold proposal. This proposal, which is known as the Holographic Principle, consists of two basic assertions:
“Assertion 1: The first assertion of the Holographic Principle is that all of the information contained in some region of space can be represented as a 'Hologram' - a theory that 'lives' on the boundary of that region. For example, if the region of space in question is a coffee shop, then the holographic principle asserts that all of the physics which takes place in the coffee shop can be represented by a theory defined on the walls of that coffee shop.
Assertion 2: The second assertion of the Holographic Principle is that the theory on the boundary of the region of space in question should contain at most one degree of freedom per Planck area.18
Before, I have assumed that the information in space-time, in its entirety, is reflected and registered in singularity. To make it objective, holographic theorists convert the whole ordeal to spatial form again, but with one dimension less, and present it to us. In the context of the proposed model, we can ignore the inverse Fourier transform and imagine that information remains in a spectral state while in singularity. We do not have to pass to the spatial phase (do not have to conjugate) and look at the shadow on the wall to realize that information is out there. Or we may do that for objectivity reasons, but at least we'd better appreciate and recognize the spectral state of the information. This is similar to mind function. According to holographic brain theory, information remains in spectral form in the brain. That is what I am trying to convey about the singularity as well. In holography, we stay in the spatial boundaries to demonstrate the experiment. In this model, however, I surpass all spatial dimensions and introduce a geometrical point to accommodate information.
We have enough information to dare pass beyond the spatial boundaries. Our imagination can help us build theories and present them for speculation and investigation. Meanwhile, if we establish a sound theory of mind function, we can use mind activities as an analogy to explore beyond the finite world. Holographic theory says that all the information can be present in space with one dimension less. M theorists (representing different superstring theories) found that the answers to major paradoxes could not be found in our 4-dimensional space-time. To find solutions they had to look outside 4-dimensional space. My question is: why did they have to travel to assumed spaces with different dimensions to create a basis for solving the paradoxes? Why couldn't we untie and free ourselves from space boundaries? We know from Einstein’s Special Theory of Relativity that time and space are not absolute.
At this point, let me add this beautiful piece from the University of Cambridge DAMTP web page.18
Holography through the Ages
To them, I said,
the truth would be literally nothing
but the shadows of the images.
Plato, The Republic (Book VII)
Plato, the great Greek philosopher, wrote a series of ‘Dialogues’ which summarized many of the things he had learned from his teacher, the philosopher Socrates.
One of the most famous of these Dialogues is the ‘Allegory of the Cave.’ In this allegory, people are chained in a cave so that they can only see the shadows, which are cast on the walls of the cave by a fire. To these people, the shadows represent the totality of their existence - it is impossible for them to imagine a reality, which consists of anything other than the fuzzy shadows on the wall. However, some prisoners may escape from the cave; they may go out into the light of the sun and behold true reality. When they try to go back into the cave and tell the other captives the truth, they are mocked as madmen. Of course, to Plato this story was just meant to symbolize mankind's struggle to reach enlightenment and understanding through reasoning and open-mindedness. We are all initially prisoners and the tangible world is our cave. Just as some prisoners may escape out into the sun, so may some people amass knowledge and ascend into the light of true reality. What is equally interesting is the literal interpretation of Plato's tale: The idea that reality could be represented completely as `shadows' on the walls.18
Holonomic Brain
Numerous studies in neuro-physiology suggest that memories in the brain are not stored in a specific location; rather, they are dispersed over the entire brain. The conventional view is that the brain is a computational device. There is a growing body of literature, though, that shows there are severe limitations to computation (Penrose, 1994; Rosen, 1991; Kampis, 1991; Pattee, 1995). For instance, Dr. Jeff Prideaux writes:
Penrose uses a variation of the "halting problem" to show that the mind cannot be an algorithmic process. Rosen argues that computation (or simulation) is an inaccurate representation of the natural causes that are in place in nature. Kampis shows that the informational content of an algorithmic process is fixed at the beginning and no "new" information can be brought forward. Pattee argues that the complete separation of initial conditions and equations of motion necessary in a computation may only be a special case in nature. Pattee argues that systems that can make their own measuring devices can affect what they see and have ‘semantic closure’. 19
Experiments show that selective damage to certain areas of brain tissue will not erase the specific related memories. This further suggests that memories are stored as frequencies. The experiment performed by Bernstein is worth mentioning. Here is a summary of his experiment and the follow-up work by Karl Pribram, Professor Emeritus at Stanford University, and his associate:
“Bernstein dressed people in black leotards and had them perform simple tasks such as running or hammering nails against a black background. The leotard had been decorated with white dots over each joint. Bernstein took cinematographic films of these activities. On his films he therefore had a record of the movements of the dots, which described a series of waveforms. When he analyzed the records according to a Fourier procedure he was able to accurately predict the next movement in the sequence. What we needed was direct proof that cells in the motor cortex were responsive to wave forms. So Amad Sharafat, an engineering student, and I devised an apparatus, which moved a cat’s paw up and down at different frequencies. We recorded from motor cortical cells and found many that were tuned to the frequencies with which the paw was moved.” 57
He then explains:
“What the data suggest is that there exists in the cortex, a multidimensional holographic-like process serving as an attractor or set point toward which muscular contractions operate to achieve a specified environmental result.
The specification has to be based on prior experience (of the species or the individual) and stored in holographic-like form. Activation of the store involves patterns of muscular contractions (guided by basal ganglia, cerebellar, brain stem and spinal cord) whose sequential operations need only to satisfy the 'target' encoded in the image of achievement” [57]
http://www.universaltheory.org/html/consciousness/holonomic_brain/holonomic_brain5.htm
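Bernstein's Fourier procedure above can be sketched computationally. Below is a toy illustration (my own, with made-up data; it is not Bernstein's or Pribram's actual method or apparatus): a periodic movement record is decomposed with a discrete Fourier transform, and the resulting series is evaluated one step past the end of the record to "predict" the next sample.

import numpy as np

# Toy movement record: a sum of two oscillations, standing in for the waveform
# traced by a white dot on a joint (illustrative data only).
N = 64
t = np.arange(N)
signal = np.sin(2 * np.pi * t / 16) + 0.3 * np.sin(2 * np.pi * t / 8)

coeffs = np.fft.rfft(signal)

def fourier_eval(t_next):
    # Evaluate the real Fourier series of the record at an arbitrary time,
    # which extrapolates the (assumed periodic) waveform beyond the record.
    total = coeffs[0].real / N                      # DC term
    for k in range(1, len(coeffs)):
        scale = 1.0 if (N % 2 == 0 and k == N // 2) else 2.0
        angle = 2 * np.pi * k * t_next / N
        total += scale / N * (coeffs[k].real * np.cos(angle)
                              - coeffs[k].imag * np.sin(angle))
    return total

predicted = fourier_eval(N)                         # one step past the record
actual = np.sin(2 * np.pi * N / 16) + 0.3 * np.sin(2 * np.pi * N / 8)
print(round(predicted, 6), round(actual, 6))        # both ~0.0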
"The basic meaning of ešrāq (Illumination) is “rising,” more precisely “rising of the sun” (Lane, Arabic English Lexicon I, pp. 1539-41). The term is used extensively in Arabic and Persian philosophical texts, signifying a special intuitive mode of cognition with no temporal extension (i.e., a-temporal), spatially coordinated “in” (fi) the knowing, self-conscious subject (Ar. al-mawżuʿ al-modrek bi’l-ḏāt; Pers.
man-e dānanda/ḵod-āgāh). In other words, it applies to the relation between the “apprehending subject” (al-mawżuʿ al-modrek) and “apprehensible object” (al-modrak). The term ešrāq is also widely used in popular discourse. In its general, non-technical usage in ordinary language, it signifies the “mystical” as well as the range of extraordinary types of knowledge, including personal inspiration (elhām)."
http://www.iranica.com/articles/illuminationism
"The Sun of Reality is one Sun but it has different dawning-places, just as the phenomenal sun is one although it appears at various points of the horizon. During the time of spring the luminary of the physical world rises far to the north of the equinoctial; in summer it dawns midway and in winter it appears in the most southerly point of its zodiacal journey. These day springs or dawning-points differ widely but the sun is ever the same sun whether it be the phenomenal or spiritual luminary. Souls who focus their vision upon the Sun of Reality will be the recipients of light no matter from what point it rises, but those who are fettered by adoration of the dawning-point are deprived when it appears in a different station upon the spiritual horizon."
http://bcca.org/bahaivision/BWF/0612thesunofreality.html
"Model theory studies structures (which are usually models of some formal language): their construction, classification, and relations between them. Given that databases, graphs, and mathematical constructs studied in theoretical computer science (categories, domains, Chu spaces) can all be seen as (relational) structures, model theory provides a set of tools potentially useful for any computer scientist.
...
Below is the preliminary schedule of lectures:
Lecture 1: Introduction. Language and models of first order logic, definition of truth and entailment. Handout for the first lecture. Answers to the exercises.
Lecture 2: Maps and formulas they preserve (isomorphism, homomorphism, embeddings, substructures, direct products).
Lecture 3: Ehrenfeucht-Fraisse games.
Lecture 4: Language and models of modal logic.
Lecture 5: Bisimulation."
http://www.cs.nott.ac.uk/~nza/MGS/MGS99/index.html
"In the mathematical discipline of model theory, the Ehrenfeucht–Fraïssé game (also called back-and-forth games) is a technique for determining whether two structures are elementarily equivalent.
...
The main idea behind the game is that we have two structures, and two players (defined below). One of the players wants to show that the two structures are different, whereas the other player wants to show that they are somewhat similar (according to first-order logic). The game is played in turns and rounds; a round proceeds as follows: first the first player (Spoiler) chooses any element from one of the structures, and the other player chooses an element from the other structure. The other player's task is to always pick an element that is "similar" to the one that Spoiler chose. The second player (Duplicator) wins if there exists an isomorphism between the elements chosen in the two different structures.
The game lasts for a fixed amount of steps (γ) (an ordinal, but usually a finite number or ω)."
http://en.wikipedia.org/wiki/Ehrenfeucht–Fraïssé_game
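To make the rules concrete, here is a small brute-force sketch (my own illustration, not from the article): it decides who wins the n-round game on two finite linear orders by checking that Duplicator can always maintain a partial isomorphism with respect to <. The classic theorem that linear orders of sizes m and n are k-round equivalent iff m = n or both are at least 2^k - 1 shows up in the output: sizes 3 and 4 are indistinguishable in 2 rounds, but Spoiler wins with 3.

from itertools import product

def partial_iso(sa, sb):
    # The pairing of picked elements must preserve order and equality both ways.
    return all(((a1 < a2) == (b1 < b2)) and ((a1 == a2) == (b1 == b2))
               for (a1, b1), (a2, b2) in product(zip(sa, sb), repeat=2))

def duplicator_wins(A, B, rounds, sa=(), sb=()):
    if not partial_iso(sa, sb):
        return False
    if rounds == 0:
        return True
    # Spoiler may move in either structure; Duplicator needs a reply to every move.
    spoiler_in_A = all(any(duplicator_wins(A, B, rounds - 1, sa + (a,), sb + (b,))
                           for b in B) for a in A)
    spoiler_in_B = all(any(duplicator_wins(A, B, rounds - 1, sa + (a,), sb + (b,))
                           for a in A) for b in B)
    return spoiler_in_A and spoiler_in_B

print(duplicator_wins(range(3), range(4), 2))   # True: 2 rounds can't tell them apart
print(duplicator_wins(range(3), range(4), 3))   # False: Spoiler wins in 3 rounds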
"Notions of bisimulation play a central role in the theory of transition systems. As
different kinds of system are encountered, different notions of bisimulation arise,
but the same questions are posed: Is there a fixed-point characterization for the
maximal bisimulation, bisimilarity? Is there a minimal system, where bisimilar
states are equated? And is there a procedure for constructing a minimal system,
or for verifying bisimilarity?
The theory of coalgebras provides a setting in which different notions of transition
system can be understood at a general level. In this paper we investigate
notions of bisimulation at this general level, and determine how and when these
questions can be answered."
http://www.cl.cam.ac.uk/~ss368/calco09.pdf
Symmetry of subject and predicate
http://chu.stanford.edu/
Furthering the idea of the use of Chu Spaces for consciousness problems and especially the binding problem
http://ttjohn.blogspot.com/2005/11/use-of-chu-spaces-for-consciousness.html
Big Toy Models: Representing Physical Systems As Chu Spaces
"We pursue a model-oriented rather than axiomatic approach to the foundations of Quantum Mechanics, with the idea that new models can often suggest new axioms. This approach has often been fruitful in Logic and Theoretical Computer Science. Rather than seeking to construct a simplified toy model, we aim for a `big toy model', in which both quantum and classical systems can be faithfully represented - as well as, possibly, more exotic kinds of systems.
To this end, we show how Chu spaces can be used to represent physical systems of various kinds. In particular, we show how quantum systems can be represented as Chu spaces over the unit interval in such a way that the Chu morphisms correspond exactly to the physically meaningful symmetries of the systems - the unitaries and antiunitaries. In this way we obtain a full and faithful functor from the groupoid of Hilbert spaces and their symmetries to Chu spaces. We also consider whether it is possible to use a finite value set rather than the unit interval; we show that three values suffice, while the two standard possibilistic reductions to two values both fail to preserve fullness. "
http://arxiv.org/abs/0910.2393
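As a rough illustration of the representation the abstract describes (my own toy, radically simplified: a finite handful of qubit states rather than the paper's full construction over the groupoid of Hilbert spaces): points are state vectors, "states" are the same vectors read as measurement questions, and the Chu matrix entry over the unit interval is the transition probability |<phi|psi>|^2.

import numpy as np

# Three qubit states: |0>, |1>, and |+> (an illustrative finite fragment).
kets = {'0': np.array([1.0, 0.0]),
        '1': np.array([0.0, 1.0]),
        '+': np.array([1.0, 1.0]) / np.sqrt(2)}

# Chu matrix over [0,1]: entry (psi, phi) is the transition probability.
chu = {(a, b): abs(np.vdot(u, v)) ** 2
       for a, u in kets.items() for b, v in kets.items()}

print(chu[('0', '1')])   # 0.0  (orthogonal states never transition)
print(chu[('0', '+')])   # 0.5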
"A good theory is like a good map. If we plan our itinerary with a good map it gets us to where we want to go. Why? Because the logical structure of the map parallels -- or, as we will say, is isomorphic to -- the spacial structure of the roads we are travelling.
Consider the following:
a. A theory is defined by a set of variables and the set of relations among them. A map generally indicates route intersections, the connections among them and the distances between them.
b. A program is a well-specified procedure. It is defined by a set of operations and the set of relations among them. A program like an itinerary, for example, may specify operations like, "drive north on route 363 for 4.5 miles to route 611." Such operations will generally only work if they are done in a specific sequence.
c. Informally, we can think of an isomorphism as a relationship of perfect correspondence of parts. A program Px is isomorphic to a theory Tz, if and only if for each operation in Px, there corresponds only one variable in Tz. Also, for each relation among operations in Px, there corresponds one and only one relation among variables in Tz. Our itinerary will be isomorphic with our map if and only if for each critical intersection on our trip, there corresponds an operation which takes us to it.
d. To the extent that a program is isomorphic to a theory, those who pursue the program may be said to be using that theory. Mere allusions to a theory will lack many correspondences between operations and variables, and the relationships among them. A person might well, for example, get from one place to another from habit or chance without following a map. Or a teacher may say that he is "reinforcing" student "responses" without actually following operant conditioning theory, as, for example, he writes an "A" on a report card at a time and place long removed from the presence of the student's behavior.
e. Programs pursue goals. An "adequate" theory is one whose isomorphically related program achieves its goal."
http://home.comcast.net/~erozycki/Isomorphism.html
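The correspondence in (c) is mechanical enough to check by machine. Here is a small sketch (my own illustration of the quoted definition; the operation and variable names are hypothetical): a pairing of program operations with theory variables counts as an isomorphism only if it is one-to-one and carries the relations among operations exactly onto the relations among variables.

def is_isomorphism(pairing, prog_rel, theory_rel):
    # pairing: list of (operation, variable) pairs.
    ops, vars_ = zip(*pairing)
    if len(set(ops)) != len(ops) or len(set(vars_)) != len(vars_):
        return False                      # not a one-to-one correspondence
    to_var = dict(pairing)
    # Each relation among operations must map to exactly one relation among variables.
    mapped = {(to_var[a], to_var[b]) for (a, b) in prog_rel}
    return mapped == set(theory_rel)

# Itinerary steps vs map intersections; "followed by" vs "connected to".
pairing = [('drive_363', 'A'), ('turn_611', 'B'), ('arrive', 'C')]
prog_rel = {('drive_363', 'turn_611'), ('turn_611', 'arrive')}
theory_rel = {('A', 'B'), ('B', 'C')}
print(is_isomorphism(pairing, prog_rel, theory_rel))   # True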
"In computer science, confluence is a property of rewriting systems, describing that terms in this system can be rewritten in more than one way, to yield the same result. This article describes the properties in the most abstract setting of an abstract rewriting system."
...
Strong confluence is another variation on local confluence that allows us to conclude that a rewriting system is globally confluent. An element a ∈ S is said to be strongly confluent if for all b, c ∈ S with a → b and a → c there exists d ∈ S with b →* d and either c → d or c = d; if every a ∈ S is strongly confluent, we say that → is strongly confluent.
A strongly confluent element need not be confluent, but a strongly confluent rewriting system is necessarily confluent."
http://en.wikipedia.org/wiki/Confluence_(abstract_rewriting)
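A hedged sketch (my own, directly transcribing the quoted condition rather than any library algorithm): for a finite rewriting system given as a step relation, check that for every peak a → b, a → c there is some d with b →* d and either c → d or c = d.

def reachable(step, x):
    # Reflexive-transitive closure ->*: everything reachable from x, including x.
    seen, todo = {x}, [x]
    while todo:
        y = todo.pop()
        for z in step.get(y, ()):
            if z not in seen:
                seen.add(z)
                todo.append(z)
    return seen

def strongly_confluent(step, elems):
    # Quoted condition: for all a -> b and a -> c there exists d with
    # b ->* d and (c -> d or c = d).
    for a in elems:
        for b in step.get(a, ()):
            for c in step.get(a, ()):
                if not any(d in step.get(c, ()) or d == c
                           for d in reachable(step, b)):
                    return False
    return True

# The classic diamond a -> b, a -> c, b -> d, c -> d is strongly confluent.
system = {'a': {'b', 'c'}, 'b': {'d'}, 'c': {'d'}}
print(strongly_confluent(system, 'abcd'))   # True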
The Communication of Meaning in Anticipatory Systems: A Simulation Study of the Dynamics of Intentionality in Social Interactions
"Psychological and social systems provide us with a natural domain for the study of anticipations because these
systems are based on and operate in terms of intentionality. Psychological systems can be expected to contain a model of
themselves and their environments; social systems can be strongly anticipatory and therefore co-construct their
environments, for example, in techno-economic (co-)evolutions. Using Dubois’ hyper-incursive and incursive
formulations of the logistic equation, these two types of systems and their couplings can be simulated. In addition to their
structural coupling, psychological and social systems are also coupled by providing meaning reflexively to each other’s
meaning-processing. Luhmann’s distinctions among (1) interactions between intentions at the micro-level, (2)
organization at the meso-level, and (3) self-organization of the fluxes of meaningful communication at the global level
can be modeled and simulated using three hyper-incursive equations. The global level of self-organizing interactions
among fluxes of communication is retained at the meso-level of organization. In a knowledge-based economy, these two
levels of anticipatory structuration can be expected to propel each other at the supra-individual level.
Keywords: anticipation, social system, meaning, communication, incursion, double contingency"
http://www.leydesdorff.net/casys07/casys07.pdf
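For readers who want to see what an "incursive" formulation looks like numerically, here is a minimal sketch (my own, following the general form of Dubois' incursive logistic equation as I understand it from this literature; treat the details as an assumption rather than the paper's actual model). The incursive equation x(t) = a·x(t-1)·(1 - x(t)) refers to the current state on both sides, but can be solved algebraically for x(t):

def incursive_step(x_prev, a):
    # Solve x = a * x_prev * (1 - x) for x:
    #   x + a*x_prev*x = a*x_prev  =>  x = a*x_prev / (1 + a*x_prev)
    return a * x_prev / (1 + a * x_prev)

x, a = 0.1, 3.0
for t in range(1, 6):
    x = incursive_step(x, a)
    print(t, round(x, 4))
# The incursive version converges smoothly toward the fixed point (a - 1) / a,
# here 2/3, because the "anticipated" state enters its own determination.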
"There are two different approaches in decision theory: evidential decision theory and causal decision theory. Evidential decision theory seeks to maximize the utility of a choice, taking into account what the choice tells you about yourself, and therefore, about other parts of the world that may correlate with your own behavior. A justification for evidential decision theory is given. This first involves a scenario intended to suggest evidential decision theory as an approach. Some objections to evidential decision theory being used in Newcomb’s paradox are that it seems to imply reverse causation, but it is shown that this issue is raised by any decision anyway. Light cones are used to give a simplified view of events, in which it is shown that there is no profound transition involved in going from an event causally following from a choice to one related to it less directly. The view of an outside observer is taken to show how decisions should be approached with no assumption of them having any special status as a result of being “owned” by the decider. Making a special case of your own decisions violates the Copernican principle. It is argued that, even if we try to view our decisions causally, correlation between other parts of reality will mean that choices tend to “contaminate” much of the description of reality in non-causal, indirect ways. Evidential decision theory can be justified by considering identical players in a game, and then considering almost identical players. The term “meta-causation” is proposed. A choice meta-causes an event if it corresponds to that event irrespective of whether or not the event causally follows it. Evidential decision theory is correct, but has little practical significance in many everyday situations in which we have a lot of knowledge. The next article will discuss situations where it could be relevant."
http://www.paul-almond.com/Correlation1.pdf
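As a concrete worked example of the evidential approach (the numbers are the standard Newcomb payoffs plus a predictor accuracy I chose myself; none of this is from the paper): evidential decision theory computes expected utility conditioning on what the choice reveals about the predictor, so a highly accurate predictor makes one-boxing come out ahead.

ACCURACY = 0.99                      # hypothetical predictor accuracy
FULL_BOX, SMALL_BOX = 1_000_000, 1_000

# EDT conditions on the action: P(big box is full | you one-box) = ACCURACY,
# and P(big box is full | you two-box) = 1 - ACCURACY.
ev_one_box = ACCURACY * FULL_BOX
ev_two_box = (1 - ACCURACY) * FULL_BOX + SMALL_BOX

print(f"one-box: {ev_one_box:,.0f}")   # 990,000
print(f"two-box: {ev_two_box:,.0f}")   # 11,000 -> EDT one-boxes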
Converging towards what? Pragmatic and Semantic Competence
The paper tries to build a bridge between results in commonsense reasoning and inferential theories of meaning. We focus on the problem of communication and the contrast between two views of communication, the “expressive” view and the “convergence” view. According to the convergence view (and local holism which supports it) the meaning of a sentence is the set of inferences to which speakers converge in a discourse context. The problem is that we have no idea about the strategy of this convergence, even if it is apparent that the convergence of inferences depends on contextual clues and pragmatic factors. We claim that in order to accept the convergence view we need to supplement the idea of meaning as inference with recent results in multi-context theories. Our solution to the problem is based on a distinction between semantic competence and contextual competence, defined as rule-governed pragmatic competence.
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.77.6404
"Although standard game theory assumes common knowledge of rationality, it does so in a different way. The game theoretic analysis maximizes payoffs by allowing each player to change strategies independently of the others, even though in the end, it assumes that the answer in a symmetric game will be the same for all. This is the definition of a game theoretic
Nash equilibrium, which defines a stable strategy as one where no player can improve the payoffs by unilaterally changing course. The superrational equilibrium is one which maximizes payoffs where all the players strategies are forced to be the same before the maximization step.
Some argue that superrationality implies a kind of magical thinking in which each player supposes that his decision to cooperate will cause the other player to cooperate, despite the fact that there is no communication. Hofstadter points out that the concept of "choice" doesn't apply when the player's goal is to figure something out, and that the decision does not cause the other player to cooperate, but rather same logic leads to same answer independent of communication or cause and effect. This debate is over whether it is reasonable for human beings to act in a superrational manner, not over what superrationality means."
http://en.wikipedia.org/wiki/Superrationality
"There are several variants of the “binding problem,” which asks how a massively parallel system can achieve coherence. The most striking examples involve subjective experience and therefore remain intractable to experimentation. For example, we know that visual processing involves dozens of separate brain areas, yet we perceive the world as a coherent whole. Even leaving subjective experience aside, there are still compelling technical problems in understanding how a neural network can perform crucial computational tasks, such as
those that arise in reasoning about and acting in the world.
A basic problem, and the one that we will focus on, is the “variable binding” problem. As a first example, consider your ability to pick up objects. Depending on the object, its current position, and your goals, you have a very wide range of ways of grasping and manipulating
the object, all realized by the network of neurons that is your brain. This is an instance of the variable binding problem because your choice of values for the three variables object, position, and goal has consequences throughout your brain on how the action is carried out.
In conventional computing, we assume that different program modules all have access to the values of (global) variables and can modify their behavior appropriately. Any theory of neural computation needs some mechanism for achieving this kind of global effect."
http://leon.barrettnexus.com/papers/barrett-2008-nc-binding.pdf
Saturday, April 02, 2011
Unisective-Fractal Cosmogony, Autoplectic-Random Choices, Constitutive-Causal Relations, Motivic-Priority Mechanisms, Transductive-Syntactic Coverings
Image: http://emergent-culture.com/science-of-synchronicity-self-organizing-systems-from-atoms-to-humanity-pt-2/
Not all theists from the monotheistic tradition think the universe was created from "nothing"...rather, from unity.
The Beginning that Hath No Beginning: Bahá'í Cosmogony
...
A different reading of this paragraph is presented by Moojan Momen in an article in which he argues for cognitive relativism vis-à-vis questions of Bahá’í metaphysics. In it, he describes the issue as “that of whether the world of creation is coeternal with God or created in time.” In line with the development and application to Bahá’í metaphysics of the idea of cognitive relativism, which is the wider context in which this statement occurs, the author suggests that both of these positions are equally valid, but neither of them are “true” in anything like an absolute sense, for they concern realities or processes about which no intelligible concept can be considered adequate.
http://irfancolloquia.org/pdf/lights3_brown.pdf
"`The beginning of all things is the knowledge of God ...':[1] with this epigrammatic statement Bahá'u'lláh indicates in God the centre of human life. In `the knowledge of God' is `the beginning of all things', such as knowing, being aware, acting, working, educating, governing, making art. Therefore Bahá'í scholars or would-be philosophers must necessarily move from this `beginning' in their efforts to relate the pregnant concepts of `divine philosophy' enshrined in the Bahá'í texts with the great discoveries made by human intellect during this century described by `Abdu'l-Bahá as `a century of the revelation of reality', `the century of science, inventions, discoveries and universal laws'.[2]
God is unknowable
`... man cannot grasp the Essence of Divinity ...': this is the first statement Bahá'í scholars or would-be philosophers are bound to utter. Similar statements are numerous in the Bahá'í texts. Herein follow some of the explanations set forth in the Bahá'í texts to justify such human incapacity:
Differentiation of stages. Bahá'u'lláh writes: `Whatsoever in the contingent world can either be expressed or apprehended can never transgress the limits which, by its inherent nature, have been imposed upon it';[3] and moreover: `Every attempt which, from the beginning that has no beginning, hath been made to visualize and know God is limited by the exigencies of His own creation ...'[4] And `Abdu'l-Bahá explains that `... differentiation of stages in the contingent world is an obstacle to understanding. Every superior stage comprehendeth that which is inferior and discovereth the reality thereof, but the inferior one is unaware of that which is superior and cannot comprehend it. Thus man cannot grasp the Essence of Divinity ...'[5]
God's all-inclusiveness. `... the Divine Essence surrounds all things. Verily, that which surrounds is greater than the surrounded, and the surrounded cannot contain that by which it is surrounded, nor comprehend its reality'.[6]
Human limitations. `... whatsoever can be conceived by man is a reality that hath limitations and is not unlimited; it is circumscribed, not all-embracing. It can be comprehended by man, and is controlled by him.'[7]
Human incapacity to know the essence of things. `As our knowledge of things, even of created and limited things, is knowledge of their qualities and not of their essence, how is it possible to comprehend in its essence the Divine Reality, which is unlimited?'[8]
Limitations of human understanding. `It is evident that the human understanding is a quality of the existence of man, and that man is a sign of God: how can the quality of the sign surround the creator of the sign? that is to say, how can the understanding, which is a quality of the existence of man, comprehend God?'[9]
The same idea is set forth also in other words: `These people, all of them, have pictured a God in the realm of the mind, and worship that image which they have made for themselves. And yet the image is comprehended, the human mind being the comprehender thereof, and certainly the comprehender is greater than that which lieth within its grasp; for imagination is but the branch, while mind is the root; and certainly the root is greater than the branch.'[10]
***
To the question `How shall we know God?', `Abdu'l-Bahá answers: `We know Him by His attributes. We know Him by His signs. We know Him by His names'.[11] Man can know God `... by his reasoning power, by observation, by his intuitive faculties and the revealing power of his faith': he will be thus enabled to `believe in God, discover the bounty of His Grace ... become[th] certain that ... conclusive spiritual proofs assert the existence of that unseen reality'.[12] This is the true `science of Divinity', a set of `intellectual proofs ... based upon observation and evidence', `logically proving the reality of Divinity, the effulgence of mercy, the certainty of inspiration and immortality of the spirit'.[13]
Therefore, though God is inaccessible in His Essence, man is able nevertheless to understand that He exists. He can achieve this understanding by treading a threefold path:
the path of his reasoning power, through which he can formulate theoretical, rational proofs of His existence;
the path of observation, through which he can discover His traces throughout the universe and in human history;
the path of his insight and faith, through which he can obtain a spiritual perception of His existence and confirm the results achieved through reason and observation.
Rational proofs of Divinity
Rational or `intellectual proofs of Divinity'[14] abundantly set forth in `Abdu'l-Bahá's Writings and recorded talks[15] can be divided into two groups: cosmological and teleological.
Cosmological proofs[16]
On the grounds of movement and the principle of efficient cause. Bahá'u'lláh writes: `All that is created, however, is preceded by a cause. This fact, in itself, establisheth, beyond the shadow of a doubt, the unity of the Creator';[17] and `Abdu'l-Bahá explains: `... we observe that motion without a motive force, and an effect without a cause are both impossible: that every being hath come to exist under numerous influences and continually undergoeth reaction. These influences, too, are formed under the action of still other influences ... Such process of causation goes on, and to maintain that this process goes on indefinitely is manifestly absurd. Thus such a chain of causation must of necessity lead eventually to Him Who is the Ever-Living, the All-Powerful, Who is Self-Dependent and the Ultimate Cause.'[18]
On the grounds of the different degrees of perfection. `... limitation itself proves the existence of the unlimited, for the unlimited is known through the limited, just as weakness itself proveth the existence of power, ignorance the existence of knowledge, poverty the existence of wealth';[19] `... our need is an indication of supply and wealth. Were it not for wealth, this need would not exist ... In other words, demand and supply is the law and undoubtedly all virtues have a centre and a source. That source is God, from Whom all these bounties emanate'.[20]
Teleological proofs[21]
`... every arrangement and formation that is not perfect in its order we designate as accidental, and that which is orderly, regular, perfect in its relations and every part of which is in its proper place and is an essential requisite of the other constituent parts, this we call a composition formed through will and knowledge ...'[22]
The universe is a `Great Workshop'; `though (its) infinite realities are diverse in their character, yet they are in the utmost harmony and closely connected together'. `Thus to connect and harmonize these diverse and infinite realities an all-unifying Power is necessary ...' In other words, `... interaction, co-operation and interrelation amongst beings are under the direction and will of a motive Power which is the origin, the motive force and the pivot of all interactions in the universe'.[23]
`... when you look at nature itself, you see that it has no intelligence, no will ...';[24] `Inasmuch as we find all phenomena subject to an exact order and under control of universal law, the question is whether this is due to nature or to divine and omnipotent rule.'[25] `... from the premises advanced by naturalists,[26] the conclusions are drawn that nature is the ruler and governor of existence and that all virtues and perfections are natural exigencies and outcome'.
`... man is but a part or member of that whereof nature is the whole'.
`Man possesses certain virtues of which nature is deprived.'
`Man, the creature, has volition and certain virtues. Is it possible that his Creator is deprived of these?'
`... the Creator of man must be endowed with superlative intelligence and power in all points that creation involves and implies'.[25]
`... formation is of three kinds and of three kinds only: accidental, necessary and voluntary. The coming together of the various constituent elements of being cannot be compulsory, for then the formation must be an inherent property of the constituent parts and the inherent property of a thing can in no wise be dissociated from it, such as light that is the revealer of things, heat that causes the expansion of elements, and the solar rays which are the essential property of the sun. Thus under such circumstances the decomposition of any formation is impossible, for the inherent properties of a thing cannot be separated from it. The third formation remaineth and that is the voluntary one, that is, an unseen force described as the Ancient Power, causeth these elements to come together, every formation giving rise to a distinct being.'[27]
The rational proofs of God's existence set forth by `Abdu'l-Bahá are not, evidently, new in the context of Western and Islamic philosophy. In this respect, it should be noted that `Abdu'l-Bahá's authoritative exposition of the Bahá'í teachings -- set forth in His Writings and recorded talks -- is often worded in a Western, mostly Aristotelian and Plotinian, philosophical language. He uses this language -- as Bahá'u'lláh said addressing a Sufi audience in a Sufi philosophical language -- `out of deference to the wont of men and after the manners of the friends':[28] in other words He is willing to adapt His language to the understanding and culture of the audience He is addressing.[29]
The perception of the indwelling Spirit
Though `Abdu'l-Bahá says that these rational proofs are `a decisive argument',[30] nevertheless He does not present them as an irreplaceable demonstration of God's existence, nor does He say that they may alone inspire an atheist with faith in God. `These obvious arguments', He states, `are adduced for the weak souls; but if the inner perception is open, a hundred thousand clear proofs become visible. Thus, when man feels the indwelling spirit, he is in no need of arguments for its existence; but for those who are deprived of the bounty of the spirit, it is necessary to establish external arguments.'[31]
He wrote however: `... apply thyself to rational and authoritative arguments. For arguments are a guide to the path and by this the heart will be turned unto the Sun of Truth. And when the heart is turned unto the Sun, then the eye will be opened and will recognize the Sun through the Sun itself. Then (man) will be in no need of arguments (or proofs) for the Sun is altogether independent ...'[32]
In other words, these rational proofs, as promoters of faith in God, are only relatively effective. Inasmuch as `... the reality of Divinity is evidenced by virtue of its outpourings and bestowals',[33] rational proofs should be confirmed through the other two above mentioned paths (i.e. observation, and insight and faith) which -- because they can lead to the recognition of God's traces throughout the universe -- open `the inner perception'[34] to His existence and are therefore a more effective path towards a strong faith in Him.
Bahá'u'lláh writes: `Every created thing in the whole universe is but a door leading unto His knowledge, a sign of His sovereignty, a revelation of His names, a symbol of His majesty, a token of His power, a means of admittance into His straight path ...'[35]
http://bahai-library.org/books/quest/quest.02.html
"For Plotinus, discursive thinking always presupposes intuitive thinking, even if it seems temporally impossible to discursive thought. As discursive thinking inspects the world about it, it observes that there are beings that differ from one another. Further each of these beings has distinguishable parts. For Plotinus, even the soul which has no spatial extension still has such parts "in the form of various powers such as reasoning, perceiving and desiring. The essential thing is that the constituents of a particular being are brought into a characteristic kind of unity, for without this it would not be one being of the sort it is. Detached fragments of matter do not constitute a body; dissociated powers do not constitute a soul; there must be unified in appropriate ways if such beings are to exist. A flock of geese, for example, has considerable unity, but not as much unity as the components of an individual goose; and so we may say of the goose that she is more of a being than the flock" (Jordan 255-56).
But no being that we know is in itself perfectly unified. Perfect unity would exclude all parts distinguishable from one another, all multiplicity and diversity. "Unity pure and simple cannot coexist with any plurality of aspects or parts." The existents that we know through discursive thinking, then, are imperfectly unified. But an imperfect unity would fail to be a unity unless it was kept together through the power of a higher principle of unity. Thus, all the distinct beings that surround us presuppose in their imperfect unity a perfected unity in which the imperfect being must somehow participate. Ultimately, for Plotinus, the source of this perfect unity is the One, which is in its utter simplicity and non-diversity beyond being. For Plotinus, unity precedes and is distinguished from being. Or in other words, all beings are the emanation of a unity that is utterly transcendent to the diversity uncovered in beings. "As the last word in oneness, Unity must be the last word in reality, responsible for Being but standing beyond Being--beyond the diversity within particular beings, beyond the diversity of the whole cosmos of particular beings." (Jordan)
"For Plotinus as for Philo, God transcends the world completely and far surpasses human comprehension. God is beyond description, for to describe anything is to specify the predicates that belong to some subject; but in Unity--in absolute, unqualified Oneness--there is no diversity whatever and therefore no distinction between subject and predicate. In saying that God is Unity, Plotinus does not mean that Unity is a predicate or characteristic of God; he means that "God" and "Unity" (or the "One") are interchangeable names for precisely the same thing. God does not have characteristics and is Himself above and beyond them all."
But even as the one precedes and is distinguished from beings, it serves as their source. "The One is all things and not a single one of them: for the Source of all is not all things; yet It is all things, for they all, so to speak run back to It: or rather, in It they are not yet but will be....In order that being may exist, the One is not being but the Generator of being....The One, perfect because It seeks nothing, has nothing, and needs nothing overflows as it were, and Its superabundance makes something other than Itself" (Armstrong, 51). Plotinus argues that the Hypostasai comes into being because "the One, perfect because It seeks nothing, has nothing and needs nothing, overflows, as it were, and Its superabundance makes something other than Itself. Its halt and turning towards the One constitutes being, its gaze upon the One, Nous. Since it halts and turns towards the One that it may see, it becomes at once Nous and being. Resembling the One thus Nous produces in the same way, pouring forth a multiple power. Just as That, Which was before it, poured forth its likeness, so what Nous produces is a likeness of itself. This activity springing from being is Soul, which comes into being while Nous abides unchanged: for as a necessary consequence of its own existence: and the whole order of things is eternal: the lower world of becoming was not created at a particular moment but it is eternally being generated: it is always there as a whole, and particular things in it only perish so that others may come into being" (Enneads, V.2.1).
We can note here that Augustine will take issue with Plotinus's notion that God is beyond being and will argue instead that God is perfected Being. Further, rather than arguing the world exists from a series of emanations that are occurring eternally from out of a certain logical necessity, Augustine will emphasize the world is the outcome of an act of creation ex nihilo, from out of nothingness. Thus, Augustine's God is given a radical responsibility for the existence of a world that might not have been and one that is certainly not generated from out of eternity.
The Plotinian overflow of the one into being is termed a hypostasis by Plotinus (literally that which "stands under") and sets in motion a series of further hypostasai or emanations, each lower than the last, each more distant from the One and thus possessing a unity that grows more and more discursive, imperfect, non-immediate, non-intuitive.
Schematic Diagram of the Hypostasai (Emanations)
THE GOOD: To Hen (The One)
A: BEING (What Can Be Integrated; What Returns to the One)
I. Nous (Mind)
II. Psyché (Soul)
IIa. Nature
III. Body
B: NON-BEING (Beyond What Can Be Integrated: The Indeterminate)
EVIL: Hulé (Matter)
Nature has been added to Plotinus's own listing, since it is implied in the discussion of the generation of bodies. The One is not a hypostasis but the source of all the others. Neither is Hulé a hypostasis. Rather it is a "formless darkness on which form is merely superimposed." For this reason it is considered without being and thus evil, since it resists or negates the overflowing of the one into Being. This equation of the negative with evil will also be appropriated by Augustine. For Plotinus, Logos (Word) names the formative force proceeding from a higher principle which expresses or represents that principle in a lower plane of Being. Thus Logos holds the key to the unity and continuity of the various levels of Being emanating from The One. For Augustine, Logos will be appropriated as that aspect of the Trinity involved in the incarnation of God as Jesus Christ. The Logos's Plotinian role is signified in the diagram by a series of arrows indicating the generation of each lower plane from out of the reality of the higher one. This generation is itself the outcome of a pure overflow of reality from one level to the next.
But one must not conclude from this schema that the One is spatially "outside" of our world. Rather, the One is "intimately present in the centre of our souls; or rather we in him, for Plotinus prefers to speak of the lower as in the higher" (Armstrong, 30). This inness of the One is an inadequate spatial metaphor but provides us with the sense that we are not removed from the one, but it is already totally with us, sustaining our being. This notion of interiority is exploited by Augustine, especially in Chapter ten of his Confessions."
http://faculty.salisbury.edu/~jdhatley/plotinus.htm
"Characteristic (from the Greek word for a property or attribute (= trait/measurement) of an entity)"
Any attribute we assign to God is limited by cognitive rules of recognition and transformation. Information is a difference which makes a difference. For example, because one of the deepest "truths" we experience is love, God is described as having the attribute of love. To say God has no characteristics is to say that God cannot be characterized or constrained by anything: informational/cognitive binding rules are inadequate to the task. When we do characterize, the characterization is embodied in terms we recognize and understand.
Since an attribute is a difference relation, the medium which supports its distinctions must have enough connectivity and consistency for the relation to be made. I basically think of it as a fraction, 1/x: we may divide the whole number 1 indefinitely, but the values we give to x are never enough to know the whole from within a part; the whole is known only by becoming "one" with it, though at that point there is no informational distinction left.
Lawrence Krauss mentions how, if we created a baby universe, it would appear from our perspective to be contracting rather than expanding; who is to say our universe isn't similar, internally appearing to expand while externally appearing to contract? Christopher Langan has called this duality "conspansion": the size of space can be held constant while matter contracts at an accelerating pace, or the size of matter can be held constant while space expands at an accelerating pace; expanding and contracting simultaneously and equally gives (1/x)(x/1)=1, or (y/x)(x/y)=1.
There is more than one way to 'mean the same thing' --> 'isotelesis'. It's especially apparent to me, having been raised a Christian though my parents were Muslim: it becomes a problem when people start referring to it as your God or my God, as if there were ultimately a measurable difference. That is why I'm interested in ideas of God that refer to it only as 'our' --> 'koinotely'...even for sentient life in another galaxy.
"The Mandala Project provides a creative visual and experiential demonstration of unity with diversity. Through art, the project brings people together to create something larger than themselves while honoring the uniqueness of the individual and celebrating the benefits and gifts of a collective experience.
Recognizing what we have in common, while respecting our differences, increases our capacity for creating peace. Truth, beauty, and goodness are values honored by all cultures: through the pursuit of knowledge we discover truth; in nature we see beauty; from acts of kindness we experience goodness. Combined with art, these values are the foundation on which we build connections that cultivate peace."
http://www.mandalaproject.org/About/Index.html
"`Abdu'l-Baha explains that when we speak of God as having certain attributes, what we are saying is that God does not lack perfection.(11) For example, if we say that God is merciful, we are saying that he does not in any way lack the perfection of mercy. If we say God is loving, then he does not lack any aspect of the perfection of love. Therefore, when we attribute qualities to God, we are not asserting that we can comprehend those qualities. Rather, we acknowledge that those attributes are beyond our comprehension.
'He is a true believer in Divine unity who, far from confusing duality with oneness, refuseth to allow any notion of multiplicity to becloud his conception of the singleness of God, who will regard the Divine Being as One Who, by His very nature, transcendeth the limitations of numbers. (G:LXXXIV, 166)'
Another quality we cannot use to describe God is that of "oneness". We cannot say that God is "one" because God is exalted above the concept of number. From God, all numbers emerge, but God never becomes the number one and cannot be described by it. He remains sanctified above the idea of number and also the idea of "multiplicity", which number carries with it. Number is a part of creation and therefore limited; it is not something that can be ascribed to God."
http://www.whoisbahaullah.com/Alison/unity.html
The Inflation Debate
Is the theory at the heart of modern cosmology deeply flawed?
"An alternative to inflationary cosmology that my colleagues and I have proposed, known as the cyclic theory, has just this property. According to this picture, the big bang is not the beginning of space and time [see “The Myth of the Beginning of Time,” by Gabriele Veneziano; Scientific American, May 2004] but rather a “bounce” from a preceding phase of contraction to a new phase of expansion, accompanied by the creation of matter and radiation. The theory is cyclic because, after a trillion years, the expansion devolves into contraction and a new bounce to expansion again. The key point is that the smoothing of the universe occurs before the bang, during the period of contraction. Any procrastinating rogue regions continue to contract while well-behaved regions bounce on time and begin expanding, so the rogue regions remain comparatively small and negligible."
http://www.scientificamerican.com/article.cfm?id=the-inflation-summer&page=5
"Histories in which the universe eternally inflates, therefore, hardly contribute to the'no boundary amplitudes we measure. Thus the global structure of the universe that eternal inflation predicts, differs from the global structure predicted by top down cosmology. Essentially this is because eternal inflation is again based on the classical idea of a unique history of the universe, whereas the top down approach is based on the quantum sum over histories. The key difference between both cosmologies is that in the proposal based on eternal inflation there is thought to be only one universe."
http://arxiv.org/pdf/hep-th/0602091
"I can think of no more striking example of isotely, the attainment of the same end by different methods in different groups, than these manifold methods of producing a single color."
http://tinyurl.com/4u6p6kd
Isotelic (adj.): Referring to factors that produce, or tend to produce, the same effect
http://tinyurl.com/4s42e2a
"Another indicator of differences between the qualitative and quantitative traditions is the importance or lack thereof attributed to the concept of ‘‘equifinality’’ (George and Bennett 2005). Also referred to as ‘‘multiple, conjunctural causation’’ or just ‘‘multiple causation,’’ the concept of equifinality is strongly associated with the qualitative comparative analysis approach developed by Ragin (1987), and it plays a key role in how many qualitative scholars think about causal relationships. In contrast, discussions of equifinality are absent in quantitative work. If one were to read only large-N quantitative work, the word ‘‘equifinality’’ (or its synonyms) would not be part of one’s methodological vocabulary."
http://www.jamesmahoney.org/articles/A%20Tale%20of%20Two%20Cultures%20Proofs.pdf
"Equifinality means that any given phenomenon may be explained in two or more ways...Ludwig von Bertalanffy seems responsible for naming this principle: "the same final state may be reached from different initial conditions and in different ways. This is what is called equifinality"...The physicist John Barrow's image is clear. "If you stir a barrel of thick oil it will rapidly settle down to the same placid state, no matter how it was first stirred"...And Richard Feynman, inventor of a way to "calculate the probability of [a quantum] event that can happen in alternative ways", says, "it is possible to start from many apparently different starting points, and yet come to the same thing" In the social sciences, Weber is a leading exponent of equifinality...Emile Durkheim has the same idea: "to arrive at the same goal, many different routes can be, and in reality are, followed."
The term equi-initiality is used here to designate the converse of equifinality; it means that more than one consequence may be produced from any given("initial") phenomenon. We speak of coming to a "turning point," a "cross-roads," or a "crisis," from which alternative consequences can be expected to flow -- depending, usually, on some ultimate infinitesimal factor ("the Butterfly Effect," "For want of a nail"). Equi-initiality is what Barrow has in mind when he argues for the "non-uniqueness of the future states of a system following the prescription of a definite starting state".
The mark of the social: discovery or invention?
"In the most general sense, equifinality is the case where different conditions lead to similar effects. ... In response to the problems associated with equifinality, there have been many calls for simpler models (e.g. Beven, 1996a,b; Kirkby, 1996; Young et al., 1996). Parsimony in science has been a virtue for some time (e.g. William of Occam advocated it in the fourteenth century). However, some systems are not simple (e.g. linear) enough to justify simple models for all applications. A model cannot adequately capture processes and process interactions that are not included in its formulation. For example, black (or any colour) box models cannot provide process-based diagnostics within the box. Our perception of the best use of physics-based hydrologic-response simulation was lucidly characterized by Kirkby (1996):
Models are thought experiments which help refine our understanding of the dominant processes acting. . .While most simulation models may be used in a forecasting mode, the most important role of models is as a qualitative thought experiment, testing whether we have a sufficient and consistent theoretical explanation of physical processes. The best model can only provide a possible explanation which is more consistent with known data than its current rivals. Every field observation, and especially the more qualitative or anecdotal ones, provides an opportunity to refute, or in some cases overturn, existing models and the theories which lie behind them."
http://pangea.stanford.edu/~keith/119.pdf
The goal is model-theoretic symmetry.
Universe:
Origin:
...late Middle English: from Old French univers or Latin universum, neuter of universus 'combined into one, whole', from uni- 'one' + versus 'turned' (past participle of vertere)
Diverse:
Origin:
Middle English: via Old French from Latin diversus 'diverse', from divertere 'turn in separate ways' (see divert)
Participatory:
Origin:
late 15th century: from Latin participat- 'shared in', from the verb participare, based on pars, part- 'part' + capere 'take'
Anticipatory:
Origin:
mid 16th century (in the senses ‘to take something into consideration’, ‘mention something before the proper time’): from Latin anticipat- 'acted in advance', from anticipare, based on ante- 'before' + capere 'take'
We don't just live in a 'part'-icipatory 'uni'-(di-)verse; it is 'anti'-cipatory as well.
Antiset:
A set which transforms via converse functions. Antisets usually arise in the context of Chu spaces.
http://mathworld.wolfram.com/Antiset.html
A converse, transpose, or inverse relation:
http://en.wikipedia.org/wiki/Inverse_relation
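A tiny sketch of the converse operation (my own illustration): a binary relation as a set of pairs, its converse by swapping coordinates, and, since antisets are said to arise in the context of Chu spaces, a Chu-style matrix whose dual is simply the transpose.

def converse(relation):
    # Converse/transpose/inverse relation: swap the coordinates of every pair.
    return {(y, x) for (x, y) in relation}

def chu_dual(matrix):
    # For a Chu space written as a matrix (rows = points, columns = states),
    # the dual space swaps the two roles: the transpose.
    return [list(col) for col in zip(*matrix)]

R = {(1, 'a'), (2, 'b')}
print(converse(R))            # {('a', 1), ('b', 2)}

K = [[0, 1, 1],
     [1, 0, 1]]               # 2 points, 3 states
print(chu_dual(K))            # 3 points, 2 states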
"By analogy with the extension of a type as the set of individuals of that type, we define the extension of an attribute as the set of states of an idealized observer of that attribute, observing concurrently with observers of other attributes. The attribute-theoretic counterpart of an operation mapping individuals of one type to individuals of another is a dependency mapping states of one attribute to states of another. We integrate attributes with types via a symmetric but not self-dual framework of dipolar algebras or disheaves amounting to a type-theoretic notion of Chu space over a family of sets of qualia doubly indexed by type and attribute, for example the set of possible colors of a ball or heights of buildings."
http://conconto.stanford.edu/conconto.pdf
"The intended models of a theory T with product sorts are those that respect products, namely those such that the product sorts are interpreted as sets of tuples containing κ = 1 copy of each tuple.
...
In this section we exploit the full generality of the homfunctor in a notion of commune (the suggested name in algebraic contexts) or disheaf (as its categorical counterpart making the connection with presheaf), as a common generalization of the notions of presheaf and Chu space [1, 2] (a construct that subsumes point set topology) that is particularly easy to define in terms of a notion of didense extension.
In place of a single small category J as the base we take two small categories J and L. Elements of an object D (for disheaf) are represented as morphisms a : j → D as before, but now we also allow dual elements or states of D, represented as morphisms x : D → l for objects l of L.
...
Defining presheaves as colimits makes the proposition that ˜F is a representation an honest representation theorem instead of a mere definition. However abstraction promises both generality and simplicity. Whether or not the colimit approach can be considered a general characterization of presheaves and their morphisms, one would have to be thoroughly wedded to the categorical point of view to call it a simple one. The following approach to density is developed in terms solely of categories and their extensions, with the latter understood as a more elementary notion than functor. The approach can be understood as simply a translation into more elementary language of the semantic definition of density."
http://boole.stanford.edu/pub/yon.pdf
"In model theory and related areas of mathematics, a type is a set of first-order formulas in a language L with free variables x1, x2,…, xn which are true of a sequence of elements of an L-structure . Loosely speaking, types describe possible elements of a mathematical structure. Depending on the context, types can be complete or partial and they may use a fixed set of constants, A, from the structure . The question of which types represent actual elements of leads to the ideas of saturated models and omitting types."
http://en.wikipedia.org/wiki/Type_(model_theory)
I'm just following hunches; my ideas aren't yet coherent, and again, I'm no expert, just tracking clues. I don't know what 'God' is; that seems to follow by definition: it is the highest unknowable, the boundary of everything, and the boundary of that..."nothing".
"We take this need for a self-dual overall picture as a fundamental postulate for physics, which we called [12] the principle of representation-theoretic self-duality:
(Postulate) a fundamental theory of physics is incomplete unless
self-dual in the sense that such a role-reversal is possible. If a
phenomenon is physically possible then so is its observer-observed
reversed one.
One can also say this more dynamically: as physics improves its structures tend to become self dual in this sense. This has in my view the same status as the second law of thermodynamics: it happens tautologically because of the way we think about things. In the case of thermodynamics it is the very concept of probability which builds in a time asymmetry (what we can predict given what we know now) and the way that we categorise states that causes entropy to increase (when we consider many outcomes ‘the same’ then that situation has a higher entropy by definition and is also more likely). In the case of the self-duality principle the reason is that in physics one again has the idea that something exists and one is representing it by experiments. But experimenters tend to think that the set ˆX of experiments is the ‘real’ thing and that a theoretical concept is ultimately nothing but a representation of the experimental outcomes. The two points of view are forever in conflict until they agree that both exist and one represents the other."
http://philsci-archive.pitt.edu/3345/1/qg1.pdf
"In mathematics, triality is a relationship between three vector spaces, analogous to the duality relation between dual vector spaces. Most commonly, it describes those special features of the Dynkin diagram D4 and the associated Lie group Spin(8), the double cover of 8-dimensional rotation group SO(8), arising because the group has an outer automorphism of order three. There is a geometrical version of triality, analogous to duality in projective geometry."
http://en.wikipedia.org/wiki/Triality
"Cover. A collection of subsets of a space is a cover (or covering) of that space if the union of the collection is the whole space.
...
Partition of unity. A partition of unity of a space X is a set of continuous functions from X to [0, 1] such that any point has a neighbourhood where all but a finite number of the functions are identically zero, and the sum of all the functions on the entire space is identically 1."
http://www.wordiq.com/definition/Partition_of_unity
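A minimal numerical illustration (my own example): on X = [0, 1], the two continuous functions f1(x) = x and f2(x) = 1 - x form a partition of unity (finite, hence trivially locally finite), since they sum to 1 at every point.

def f1(x):
    return x

def f2(x):
    return 1.0 - x

# Check f1 + f2 == 1 on a sample grid of [0, 1].
grid = [i / 100 for i in range(101)]
assert all(abs(f1(x) + f2(x) - 1.0) < 1e-12 for x in grid)
print("f1 + f2 is identically 1 on the sample grid")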
"In abstract algebra, a cover is one instance of some mathematical structure mapping onto another instance, such as a group (trivially) covering a subgroup. This should not be confused with the concept of a cover in topology.
When some object X is said to cover another object Y, the cover is given by some surjective and structure-preserving map f : X → Y. The precise meaning of "structure-preserving" depends on the kind of mathematical structure of which X and Y are instances. In order to be interesting, the cover is usually endowed with additional properties, which are highly dependent on the context."
http://en.wikipedia.org/wiki/Cover_(algebra)
"In mathematics, an embedding (or imbedding) is one instance of some mathematical structure contained within another instance, such as a group that is a subgroup.
When some object X is said to be embedded in another object Y, the embedding is given by some injective and structure-preserving map f : X → Y. The precise meaning of "structure-preserving" depends on the kind of mathematical structure of which X and Y are instances. In the terminology of category theory, a structure-preserving map is called a morphism.
The fact that a map f : X → Y is an embedding is often indicated by the use of a "hooked arrow", thus: f : X ↪ Y. On the other hand, this notation is sometimes reserved for inclusion maps.
Given X and Y, several different embeddings of X in Y may be possible. In many cases of interest there is a standard (or "canonical") embedding, like those of the natural numbers in the integers, the integers in the rational numbers, the rational numbers in the real numbers, and the real numbers in the complex numbers. In such cases it is common to identify the domain X with its image f(X) contained in Y, so that then X ⊆ Y."
http://en.wikipedia.org/wiki/Embedding
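As a quick sanity check of the definition (my own sketch, using Python's Fraction type for the rationals): the canonical embedding f(n) = n/1 of the integers in the rationals is injective and preserves addition and multiplication on a finite sample.

from fractions import Fraction
from itertools import product

def f(n):
    # Canonical embedding Z -> Q.
    return Fraction(n, 1)

sample = list(range(-5, 6))
assert len({f(n) for n in sample}) == len(sample)               # injective on the sample
assert all(f(m + n) == f(m) + f(n) and f(m * n) == f(m) * f(n)
           for m, n in product(sample, repeat=2))               # preserves + and *
print("f : Z -> Q is a structure-preserving injection on the sample")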
"Gödel's ontological proof is a formal argument for God's existence by the mathematician Kurt Gödel. It is in a line of development that goes back to Anselm of Canterbury. St. Anselm's ontological argument, in its most succinct form, is as follows: "God, by definition, is that for which no greater can be conceived. God exists in the understanding. If God exists in the understanding, we could imagine Him to be greater by existing in reality. Therefore, God must exist." A more elaborate version was given by Gottfried Leibniz; this is the version that Gödel studied and attempted to clarify with his ontological argument."
http://en.wikipedia.org/wiki/Gödel's_ontological_proof
"Relativity pursues the goal of explanatory self-containment up to a point; spacetime contains matter and energy that cause spacetime fluctuations that cause changes in matter and energy. The CTMU, on the other hand, pursues the goal of self-containment all the way up to cosmogenesis. And while neither GR nor QM does anything to resolve the fatal paradoxes of ex nihilo creation and quantum nonlocality, the CTMU dissolves such paradoxes with a degree of logical rectitude to which science seldom aspires."
http://www.megafoundation.org/CTMU/Articles/Supernova.html
"The CTMU says that by its self-generative, self-selective nature, which follows directly from the analytic requirement of self-containment, reality is its own “designer”. Other features of the generative grammar of reality imply that reality possesses certain logical properties traditionally regarded as theological or spiritual, and that to this extent, the self-designing aspect of reality is open to a theological or spiritual interpretation. The CTMU, being a logical theory, does not attempt to force such an interpretation down anyone’s throat; not all semantic permutations need affect theoretical structure. What it does do, however, is render any anti-theological interpretation a priori false, and ensures that whatever interpretation one chooses accommodates the existence of an “intelligent designer”…namely, reality itself. In light of the CTMU, this is now a matter more of logic than of taste. In any case, it should be clear that the CTMU yields new ways of looking at both evolution and teleology. Just as it is distinguished from other theories of cosmic evolution by its level of self-containment, particularly with regard to its preference for self-determinacy rather than external determinacy or indeterminacy, so for its approach to biological evolution. Unlike other theories, the CTMU places evolutionary biology squarely in the context of a fundamental, selfcontained model of reality, thus furnishing it with an explanation and foundation of its own instead of irresponsibly passing the explanatory buck to some future reduction; instead of counting it sufficient to model its evolutionary implications in the biological world, the CTMU establishes model-theoretic symmetry by providing a seamless blend of theory and universe in which the biological world can itself be “modeled” by physical embedment. This alone entitles it to a place in the evolutionary debate."
http://www.scribd.com/doc/24820585/Cheating-the-Millennium-The-Mounting-Explanatory-Debts-of-Scientific-Naturalism
"The name literally says it all. The phrase “Cognitive-Theoretic Model of the Universe” contains three main ingredients: cognitive theory, model, and universe. Cognitive theory refers to a general language of cognition (the structural and transitional rules of cognition); universe refers to the content of that language, or that to which the language refers; and model refers to the mapping which carries the content into the language, thus creating information. The way in which the title brings these three ingredients together, or “contracts” their relationship to the point of merging, reflects their perfect coincidence in that to which the title implicitly refers, i.e., reality (the physical universe plus all that is required to support its perception and existence). Thus, the CTMU is a theory which says that reality is a self-modeling universal language, or if one prefers, that the universe is a self-modeling language.
The operation of combining language, universe, and model to create a perfectly self-contained metalanguage results in SCSPL, short for Self-Configuring Self-Processing Language. This language is “self-similar” in the sense that it is generated within a formal identity to which every part of it is mapped as content; its initial form, or grammatical “start symbol”, everywhere describes it on all scales. My use of grammatical terminology is intentional; in the CTMU, the conventional notion of physical causality is superseded by “telic causation”, which resembles generative grammar and approaches teleology as a natural limit. In telic causation, ordinary events are predicated on the generation of closed causal loops distributing over time and space. This loop-structure reflects the fact that time, and the spatial expansion of the cosmos as a function of time, flow in both directions – forward and backward, outward and inward – in a dual formulation of causality characterizing a new conceptualization of nature embodied in a new kind of medium or “manifold”.
That’s as simple as I can make it without getting more technical. Everything was transparently explained in the 56-page 2002 paper I published on the CTMU, which has been downloaded hundreds of thousands of times. But just in case this still doesn’t qualify as “plain English”, there’s an even easier way to understand it that is available to any reader familiar with the Bible, one of the most widely read and best-understood books ever written.
In the New Testament, John 1 begins as follows: “In the beginning was the Word, and the Word was with God, and the Word was God” (my italics). Much controversy has centered on this passage, as it seems to be saying that God is literally equivalent to logos, meaning “word”, “wisdom”, “reason”, or “truth”. Insofar as these meanings all refer to constructs or ingredients of language or to language itself, this amounts to the seemingly imponderable assertion that God, of Whom believers usually conceive as an all-powerful Entity or Being, somehow consists of language. The CTMU is precisely what it takes to validate this assertion while preserving the intuitive conception of God as the all-knowing Creator – or in non-theological terms, the “identity” or “generator” – of reality. Nothing but the CTMU can fully express this biblical “word-being duality” in a consistent logico-mathematical setting.
...
Some of these projects relate to a book I’ve been writing on mathematically proving the existence of God. Surprising as it may seem, this can certainly be done. In fact, save for a few crucial ingredients, it was nearly accomplished by (e.g.) Anselm of Canterbury in the 11th century AD. (Sadly, neither Anselm nor his various followers and modern analysts were able to pin down all of the logic and ontology required to fill out and support his essential argument.)
Some people, reasoning from past failures, regard such a proof as impossible. But then again, many people had considered it impossible to solve the venerable chicken-or-egg problem, for which I presented a concise case-by-case solution around a decade ago. The chicken-or-egg problem and the existence of God both relate to the general issue of circular dependency, a connection to be explored in the book.
I would hope that in time – if we still have the time – my work along these lines could revolutionize theology. Some will no doubt warm to this prospect; others will not, including committed atheists, uneducable agnostics, and theists who insist on ascribing illogical “divine properties” to God on dogmatic grounds ultimately having little to do with core scripture. But no matter what anyone may say, truth, logic, and God are equivalent concepts. To cheat one of them is to cheat all of them."
http://www.superscholar.org/interviews/christopher-michael-langan/
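The claim that reality is a "self-modeling language" is metaphysical, but self-description within a formal language has a well-known concrete toy: a quine, a program whose output is exactly its own source. This is offered only as a loose illustration of a linguistic form that contains its own description, not as anything from Langan's papers:

    # A quine: a program that prints its own source code -- a toy
    # instance of a string in a language that describes itself.
    s = 's = %r\nprint(s %% s)'
    print(s % s)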
"OK…perhaps there’s yet another loose end. Asking which of two things came first implies that time flows in a straight line from past to future (those are the “loose ends”). But what if time were to flow in either direction, or even to loop around, flowing in what amounts to a circle? No more loose ends. In fact, loops have no ends at all! But in this case, the answer depends on whether we’re on the forward or reverse side of the loop, heading towards the future or the past. Another way to formulate this question: does the cause lead to the effect, or is there a sense in which the effect leads to the cause? Suffice it to say that no matter which way we choose to go, the original answers to the four versions (1, 2a, 2b and 2c) of the chicken-or-egg question are all affected the same way. They are either all unchanged or all reversed, with no additional ambiguity save that pertaining to the direction of time (not a problem for most non-physicists and non-cosmologists)."
http://www.megafoundation.org/CTMU/Articles/Which.html
"As readers of Noesis will recall, this crucial redefinition begins with a mutual, recursive interdefinition of information and cognition within a "reified tautology" called a quantum transducer. The quantum transducer, being paradoxiform by direct analogy with tautologically-based inference, models the way subjectively-tautological cognitive syntaxes transduce information in time. The universality of this model allows reality to be reduced to it, and thus to (cognitive) information. "Information" is the objective aspect of the quantum transducer for itself and for all others; it is cognition-for-cognition, equating generalistically to a cognitive identity relation on that part of reality to which it corresponds (i.e., the part containing all the transducers playing active and passive roles in it)."Langan, 1992, Noesis 76
"Because cognition and generic information transduction are identical up to isomorphism – after all, cognition is just the specific form of information processing that occurs in a mind – information processing can be described as “generalized cognition”, and the coincidence of information and processor can be referred to as infocognition. Reality thus consists of a single “substance”, infocognition, with two aspects corresponding to transduction and being transduced. Describing reality as infocognition thus amounts to (infocognitive) dual aspect monism. Where infocognition equals the distributed generalized self-perception and self-cognition of reality, infocognitive monism implies a stratified form of “panpsychism” in which at least three levels of self-cognition can be distinguished with respect to scope, power and coherence: global, agentive and subordinate.
[...]
Retooling the information concept consists of three steps. First, it must be equipped with the means of its own transduction or transformative processing. Where information transduction is (cognitively) recognized as generalized cognition, this amounts to replacing it with a dual-aspect quantum of reflexivity, infocognition, which embodies telic feedback. Second, its bit structure, a simplistic and rather uninspired blend of 2-valued propositional logic and probability theory, must be extended to accommodate logic as a whole, including (1) predicate logic, (2) model theory and (3) language theory, broadly including the theories of mathematical languages, metalanguages and generative grammars. After all, since information does nothing but attribute linguistically-organized predicates to objects in the context of models, its meaning involves the mathematics of predicates, languages and models. And third, it must be generalized to an ultimate ancestral medium, telesis, from which cognitive syntax and its informational content arise by specificative feedback as part of a unified complex…a recursive coupling of information and metainformation, or transductive syntax."
http://ctmucommunity.org/wiki/Infocognition
Processes and hyperuniverses
"We show how to define domains of processes, which arise in the denotational semantics of concurrent languages, using hypersets, i.e. non-wellfounded sets. In particular we discuss how to solve recursive equations involving set-theoretic operators within hyperuniverses with atoms. Hyperuniverses are transitive sets which carry a uniform topological structure and include as a clopen subset their exponential space (i.e. the set of their closed subsets) with the exponential uniformity. This approach allows to solve many recursive domain equations of processes which cannot be even expressed in standard Zermelo-Fraenkel Set Theory, e.g. when the functors involved have negative occurrences of the argument. Such equations arise in the semantics of concurrrent programs in connection with function spaces and higher order assignment. Finally, we briefly compare our results to those which make use of complete metric spaces, due to de Bakker, America and Rutten."
http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=6A0F669F20F01CF07D5A712808179788?doi=10.1.1.34.6540&rep=rep1&type=pdf
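Concretely, a hyperset can be pictured as a pointed directed graph (nodes as sets, edges as membership), with equality given by bisimulation rather than well-founded recursion. A minimal sketch, assuming graphs as adjacency dicts; under the anti-foundation axiom, the one-node self-loop and the two-node cycle below depict the same set Ω = {Ω}:

    def bisimilar(g1, p1, g2, p2):
        """Naive greatest-fixed-point bisimulation check between two
        pointed graphs (dict: node -> list of 'member' nodes)."""
        # Start with all pairs related, then repeatedly discard pairs
        # whose members cannot be matched on both sides.
        rel = {(a, b) for a in g1 for b in g2}
        changed = True
        while changed:
            changed = False
            for (a, b) in list(rel):
                ok = (all(any((x, y) in rel for y in g2[b]) for x in g1[a])
                      and all(any((x, y) in rel for x in g1[a]) for y in g2[b]))
                if not ok:
                    rel.discard((a, b))
                    changed = True
        return (p1, p2) in rel

    omega1 = {'a': ['a']}              # Omega = {Omega} as a self-loop
    omega2 = {'u': ['v'], 'v': ['u']}  # the same hyperset as a 2-cycle
    print(bisimilar(omega1, 'a', omega2, 'u'))  # True: both solve X = {X}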
Hyperset Models of Self, Will and Reflective Consciousness
"A novel theory of reflective consciousness, will and self is presented, based on modeling each of these entities using self-referential mathematical structures called hypersets. Pattern theory is used to argue that these exotic mathematical structures may meaningfully be considered as parts of the minds of physical systems, even finite computational systems. The hyperset models presented are hypothesized to occur as patterns within the ”moving bubble of attention” of the human brain and any brainlike AI system. They appear to be compatible with both panpsychist and materialist views of consciousness, and probably other views as well."
http://goertzel.org/consciousness/consciousness_paper.pdf
The Tessellattice of Mother-Space as a Source and Generator of Matter and Physical Laws:
"Many researchers are involved in the search for a theory of everything (TOE). However, do we yet have a “theory of something”? The problem was studied by Bounias (2000) on the basis of pure mathematical principles. He firmly believed the ultimate theory might be some mathematical principle. Following Bounias (2000), and Bounias and Krasnoholovets (2002, 2003), we can explore the problem of the constitution of space in terms of topology, set theory and fractal geometry."
http://www.inerton.kiev.ua/Einstein_Poincare.pdf
Generation of fractals from incursive automata, digital diffusion and wave equation systems.
"This paper describes modelling tools for formal systems design in the fields of information and physical systems. The concept and method of incursion and hyperincursion are first applied to the fractal machine, an hyperincursive cellular automata with sequential computations with exclusive or where time plays a central role. Simulations show the generation of fractal patterns. The computation is incursive, for inclusive recursion, in the sense that an automaton is computed at future time t + 1 as a function of its neighbouring automata at the present and/or past time steps but also at future time t + 1. The hyperincursion is an incursion when several values can be generated for each time step. External incursive inputs cannot be transformed to recursion. This is really a practical example of the final cause of Aristotle. Internal incursive inputs defined at the future time can be transformed to recursive inputs by self-reference defining then a self-referential system. A particular case of self-reference with the fractal machine shows a non deterministic hyperincursive field. The concepts of incursion and hyperincursion can be related to the theory of hypersets where a set includes itself. Secondly, the incursion is applied to generate fractals with different scaling symmetries. This is used to generate the same fractal at different scales like the box counting method for computing a fractal dimension. The simulation of fractals with an initial condition given by pictures is shown to be a process similar to a hologram. Interference of the pictures with some symmetry gives rise to complex patterns. This method is also used to generate fractal interlacing. Thirdly, it is shown that fractals can also be generated from digital diffusion and wave equations, that is to say from the modulo N of their finite difference equations with integer coefficients."
Keywords: Computer Simulation, Fractals, Information Systems, Mathematics, Models, Biological, Philosophy, Physics"
http://pubget.com/paper/9231908
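The non-incursive core of Dubois's "fractal machine" is easy to reproduce: a one-dimensional automaton whose next cell is the exclusive-or of its two neighbours generates the Pascal-triangle-mod-2 (Sierpinski) fractal. A minimal sketch of that recursive base case only; the paper's incursive version would additionally feed t + 1 values back into the update:

    # 1-D XOR automaton: cell(t+1, i) = cell(t, i-1) XOR cell(t, i+1).
    # Iterated from a single seed cell, it prints the Sierpinski fractal.
    WIDTH, STEPS = 63, 32
    row = [0] * WIDTH
    row[WIDTH // 2] = 1
    for _ in range(STEPS):
        print(''.join('#' if c else ' ' for c in row))
        row = [row[i - 1] ^ row[(i + 1) % WIDTH] for i in range(WIDTH)]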
Teleology as Higher-Order Causation: A Situation-Theoretic Account. (Minds and Machines)
"Situation theory, as developed by Barwise and his collaborators, is used to demonstrate the possibility of defining teleology (and related notions, like that of proper or biological function) in terms of higher order causation, along the lines suggested by Taylor and Wright. This definition avoids the excessive narrowness that results from trying to define teleology in terms of evolutionary history or the effects of natural selection. By legitimating the concept of teleology, this definition also provides promising new avenues for solving long standing problems in the philosophy of mind, such as the problems of intentionality and mental causation."
http://philpapers.org/rec/RRA
Formal Semantics of Acknowledgements, Agreements and Disagreements:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.20.3316&rep=rep1&type=pdf
Late Neoplatonism
"Departing from Plotinus' position, the later Neoplatonists tend toward an increasingly complicated articulation of the emanation of the sensible world out of the One. Perhaps complication is inevitable given a form of explanation that generates diversity and motion by hierarchical descent from static unity. If different principles of division are assigned, any dividing point in the hierarchy can be construed as the point of emergence of a new level."
http://tinyurl.com/3zvdshk
"To describe the essential nature of things, and the 'flow' or 'procession' from unity to diversity, Iamblicus turned to Neo-Pythagorean metaphysics of number. He believed that things are organized by number and relate to each other in mathematical proportion."
http://tinyurl.com/3eeuwen
"Anything existing after The First must necessarily arise from that First, whether immediately or as tracing back to it through intervenients; there must be an order of secondaries and tertiaries, in which any second is to be referred to The First, any third to the second.
Standing before all things, there must exist a Simplex, differing from all its sequel, self-gathered not inter-blended with the forms that rise from it, and yet able in some mode of its own to be present to those others: it must be authentically a unity, not merely something elaborated into unity and so in reality no more than unity's counterfeit; it will debar all telling and knowing except that it may be described as transcending Being -- for if there were nothing outside all alliance and compromise, nothing authentically one, there would be no Source. Untouched by multiplicity, it will be wholly self-sufficing, an absolute First, whereas any not-first demands its earlier, and any non-simplex needs the simplicities within itself as the very foundations of its composite existence.
There can be only one such being: if there were another, the two [as indiscernible] would resolve into one, for we are not dealing with two corporal entities.
Our One-First is not a body: a body is not simplex and, as a thing of process cannot be a First, the Source cannot be a thing of generation: only a principle outside of body, and utterly untouched by multiplicity, could be The First.
Any unity, then, later than The First must be no longer simplex; it can be no more than a unity in diversity."
http://thriceholy.net/Texts/Plotinus4.html
"It is this variety that the Baha'i faith seeks alongside unity. The Baha'is believe in the term "unity with diversity." The strength of their belief in this idea is openly expressed through their conscious lack of established ritualistic actions in their religious practices since each individual, each culture, and each group is free to bring their own "slant" to the way the religion is celebrated in an area. The idea "pulls people together, but it appreciates the diversity of the cultures from which they come" (Quinn Interview). In my interview with David Quinn, a Baha'i practitioner, David compared the diversity of mankind to a flower garden whose colors are many, saying that if a garden has only a single kind and colored flower, it "becomes a bit boring." Mankind, too, would be "a bit boring" if it was made up of identical people.
At first I mistook the Baha'i belief in both unity and diversity as a paradox, thinking that unity means monotony and homogeneity, especially in the area of religion. However, in absorbing the idea more fully, I realized that it is unrealistic to believe that unity can ever be achieved without the recognition and acceptance that diversity exists. To ignore the presence of diversity would be to create conflict and to destroy the possibility of unity. Perhaps it is the Baha'i belief in creating unity within the diversity that already exists in the world that has attracted such a diversity of followers – the Baha'i path to unity is a realistic one."
http://www.warren-wilson.edu/~religion/newifo/religions/alternative/index/bahai/essay2.shtml
"Syntactic Coherence and Consistency: The Multiplex Unity Principle (MU)
The universe topologically contains that which descriptively contains the universe. MU, the minimum and most general informational configuration of reality, defines the relationship holding between unity and multiplicity, the universe and its variegated contents. Through its structure, the universe and its contents are mutually inclusive, providing each other with a medium.
In other words, we can equivalently characterize the contents of the universe as being topologically “inside” it (topological inclusion), or characterize the universe as being descriptively “inside” its contents, occupying their internal syntaxes as acquired state (descriptive inclusion). The universe generically includes its contents by serving as their syntactic unisect, while the contents contain the universe in a more specific sense involving specific event histories that become “entangled” by interaction. From the first viewpoint, the syntactic coherence of the overall medium enforces mutual consistency of contents, while from the second viewpoint, the coherent syntaxes of its contents contain and consistently recognize and transform the medium. Thus, the universe enforces its own consistency through dual self-containment.
Diagram 10: In the syndiffeonic diagram [Diagram 6], we can plainly see the containment of objects by the medium, but we cannot see the containment of the medium by the objects. Bearing in mind that the terms syntax and content are to some extent relative designations, the upper node in Diagram 10 corresponds to the global medium (global syntactic unisect or “metasyntax” of reality), while the lower node corresponds to the objects therein (syntactic operators contained in the medium); each is a multiplex unity. Coherence flows from global syntax into local content by way of global topological containment, thereby enforcing unity across diverse locales, and back to global syntax in multiple entangled streams generated by cross-transduction of content. Syntax becomes state, and state becomes syntax (where “syntax” is understood to encompass an “ectosyntactic” distribution of syntactic operators). The universe thus remains coherent and consistent in the course of evolution.
MU expresses syndiffeonic symmetry of syntax and content on the spatiotemporal level of reality. Just as syndiffeonesis can be regarded as a paradox identifying difference with sameness, MU can be regarded as an ultimate form of paradox identifying spatiotemporal multiplicity and unity (the MU diagram is an explosion of the syndiffeonic relation diagram in which the stratification dimension is split into descriptive and topological strands or “temporal dimensions”). MU structure resolves the MU paradox in situ by dual stratification, providing closure as the open-ended informational stratification of type theory cannot.
...
Every syndiffeonic relation has synetic and diffeonic phases respectively exhibiting synesis and diffeonesis (sameness and difference, or distributivity and parametric locality), and displays two forms of containment, topological and descriptive. The medium is associated with the synetic phase, while the difference relation is associated with the diffeonic phase (because the rules of state and transformation of the medium are distributed over it, the medium is homogeneous, intrinsically possessing only relative extension by virtue of the difference relationships it contains). Because diffeonic relands are related to their common expressive medium and its distributive syntax in a way that combines aspects of union and intersection, the operation producing the medium from the relands is called unisection ( ). The synetic medium represents diffeonic potential of which the difference relationship is an actualization."
http://ctmu.net/
The Relationship between Topology and Logic
"•Three Objectives
–1) The idea of category as ‘universe of mathematical discourse’
–2) The category of locales as a context for topological space theory
–3) The idea of a Topos as a special category that is good enough for set theory.
•From (3) we will define Geometric Logic"
http://www.christophertownsend.org/Documents/TopLog.ppt
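For reference, the algebraic object behind objective (2) can be stated in one line (standard definitions, not taken from the slides): a locale is given by its frame of "opens", a complete lattice in which finite meets distribute over arbitrary joins, and geometric logic is precisely the fragment built from finite conjunctions and arbitrary disjunctions.

    % A frame (the lattice of opens of a locale): a complete lattice with
    \[
    a \wedge \bigvee_{i \in I} b_i \;=\; \bigvee_{i \in I} (a \wedge b_i)
    \]
    % Geometric logic mirrors this law: finite \wedge, arbitrary \vee --
    % the connectives preserved by inverse images of continuous maps.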
Topological Foundations of Cognitive Science
http://ontology.buffalo.edu/smith/articles/topo.html
"The theory of motives was originally conjectured as an attempt to unify a rapidly multiplying array of cohomology theories, including Betti cohomology, de Rham cohomology, l-adic cohomology, and crystalline cohomology. The general hope is that equations like
[point]
[projective line] = [line] + [point]
[projective plane] = [plane] + [line] + [point]
can be put on increasingly solid mathematical footing with a deep meaning. Of course, the above equations are already known to be true in many senses, such as in the sense of CW-complex where "+" corresponds to attaching cells, and in the sense of various cohomology theories, where "+" corresponds to the direct sum.
From another viewpoint, motives continue the sequence of generalizations from rational functions on varieties to divisors on varieties to Chow groups of varieties. The generalization happens in more than one direction, since motives can be considered with respect to more types of equivalence than rational equivalence. The admissible equivalences are given by the definition of an adequate equivalence relation."
http://en.m.wikipedia.org/wiki/Motive_(algebraic_geometry)
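One setting where such equations already carry exact meaning (a standard fact, added here for orientation) is the Grothendieck ring of varieties, with the Lefschetz class L = [A^1]:

    % In the Grothendieck ring K_0(Var_k), writing L = [\mathbb{A}^1]:
    \[
    [\mathbb{P}^n] \;=\; 1 + \mathbb{L} + \mathbb{L}^2 + \cdots + \mathbb{L}^n,
    \]
    \[
    \text{so } [\mathbb{P}^2] = [\mathbb{A}^2] + [\mathbb{A}^1] + [\mathrm{pt}],
    \]
    % the precise form of "[projective plane] = [plane] + [line] + [point]".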
"If we could imagine a time when no beings existed, this imagination would be the denial of the Divinity of God. Moreover, absolute nonexistence cannot become existence. If the beings were absolutely nonexistent, existence would not have come into being. Therefore, as the Essence of Unity (that is, the existence of God) is everlasting and eternal--that is to say, it has neither beginning nor end--it is certain that this world of existence, this endless universe, has neither beginning nor end.
...
It is necessary, therefore, that we should know what each of the important existences was in the beginning-- for there is no doubt that in the beginning the origin was one: the origin of all numbers is one and not two. Then it is evident that in the beginning matter was one, and that one matter appeared in different aspects in each element. Thus various forms were produced, and these various aspects as they were produced became permanent, and each element was specialized. But this permanence was not definite, and did not attain realization and perfect existence until after a very long time. Then these elements became composed, and organized and combined in infinite forms; or rather from the composition and combination of these elements innumerable beings appeared."
http://www.ibiblio.org/Bahai/Texts/EN/SAQ/SAQ-47.htm
"Attributional logic is the logic that deals exclusively with properties of objects. Example:
“------ is green” attributes or assigns the property of greenness to any object whose name is substituted for the blank.
With the appearance of Najat (salvation) by the great Muslim philosopher Avicenna (980-1037) comes the first use of relational logic as a basis of a proof of God’s existence. Avicenna thereby avoids any appeal to Aristotle’s infinite regression principle.
Relational logic includes attributional logic but goes beyond the latter by also treating relations or links between two existents. Example:
“----- is a brother of ____”
...
II. The Modern Period: the advent of relational logic.
The first systematic treatment of relational logic was in Begriffsschrift (1879), by G. Frege. Begriffsschrift means “concept writing”.
Frege’s basic idea was that written language was twice removed from its content, being a transcription of the phonemes of speech, which in turn, represent ideas. Frege originated the notion of a formal language in which each symbol represents exactly one logical idea.
...
The successors to Frege were B. Russell, E. Zermelo, and finally J. von Neumann in his doctoral thesis in 1925 which, in the opinion of many, carried relational logic to its most refined form."
http://william.hatcher.org/wp-content/uploads/2008/08/logical_proof_presentation_200309.pdf
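In modern first-order notation the contrast is simply arity (my gloss on the handout): attributional logic stops at one-place predicates, while relational logic admits predicates of two or more places, which is what dependency-style arguments need.

    % Attributional: a one-place predicate assigns a property.
    \[
    G(x) \quad \text{``$x$ is green''}
    \]
    % Relational: a two-place predicate links two existents, e.g.
    \[
    B(x,y) \quad \text{``$x$ is a brother of $y$''}, \qquad
    \forall x\,\bigl(C(x) \to \exists y\, D(y,x)\bigr)
    \]
    % "every contingent thing depends on something" -- a schema that
    % cannot even be written with one-place predicates alone.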
More Than Life Itself: A Synthetic Continuation in Relational Biology
http://www.ontoslink.com/index.php?page=shop.getfile&file_id=432&product_id=288&option=com_virtuemart&Itemid=64&lang=en
"A. H. Louie’s More Than Life Itself is an exploratory journey in relational biology, a study of life in terms of the organization of entailment relations in living systems. This book represents a synergy of the mathematical theories of categories, lattices, and modelling, and the result is a synthetic biology that provides a characterization of life. Biology extends physics. Life is not a specialization of mechanism, but an expansive generalization of it. Organisms and machines share some common features, but organisms are not machines. Life is defined by a relational closure that places it beyond the reach of physicochemical and mechanistic dogma, outside the reductionistic universe, and into the realm of impredicativity. Function dictates structure. Complexity brings forth living beings."
http://www.complex.vcu.edu/
ROBERT ROSEN AND GEORGE LAKOFF: THE ROLE OF CAUSALITY IN COMPLEX SYSTEMS
http://www.slideshare.net/isotelesis/robert-rosen-and-george-lakoffthe-role-of-causality-in-complex-systems
Complex Systems from the Perspective of Category Theory: II. Covering Systems and Sheaves
"Using the concept of adjunctive correspondence, for the comprehension of the structure of a complex system, developed in Part I, we introduce the notion of covering systems consisting of partially or locally defined adequately understood objects. This notion incorporates the necessary and sufficient conditions for a sheaf theoretical representation of the informational content included in the structure of a complex system in terms of localization systems. Furthermore, it accommodates a formulation of an invariance property of information communication concerning the analysis of a complex system."
http://philsci-archive.pitt.edu/1237/1/axiomath2.pdf
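The "local sections glue to a global one" idea that sheaf conditions formalize has a very small toy version (plain sets standing in for open sets; nothing here is from the paper itself):

    # Toy sheaf-style gluing: local data on a cover that agree on the
    # overlap determine a unique global datum.
    U, V = {1, 2}, {2, 3}          # two "opens" covering the space {1, 2, 3}
    s_U = {1: 'a', 2: 'b'}         # local section over U
    s_V = {2: 'b', 3: 'c'}         # local section over V
    assert all(s_U[x] == s_V[x] for x in U & V)  # compatibility on overlap
    glued = {**s_U, **s_V}         # the unique glued section over U | V
    print(glued)                   # {1: 'a', 2: 'b', 3: 'c'}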
"I presume most of my readers would believe there is no common ground between the concepts of "metaphysical naturalism" and "intelligent design." In fact, one dictionary definition of naturalism goes so far as to exclude any teleological facts from the domain of naturalism. But in my mind, that goes too far.
If naturalism has a true antonym, it is supernaturalism: the belief that some sort of "higher power" has the ability to create, destroy, ignore, or break the physical laws of nature "at will." The essence of my assertions herein is that "intelligent design" can occur without violating the bedrock principles of "metaphysical naturalism." In other words, you can have our universe be the product of "intelligent design" and yet never require any supernatural phenomena to effect the "intelligent design" of our universe.
Because I think most of my readers would disbelieve the essence of the previous paragraph, I ask you disbelievers to please willingly suspend your disbelief and read on through this essay while at least entertaining the possibility that I could be correct in this regard. I intend to relate a story of the possible here; a story which cannot be disproved by anything currently known by both science and the believers in "intelligent design." It is true there is no direct evidence in favor of what I propose herein, unless you consider all that we know to be that "direct evidence." My essential premise is this:
What if both the believers in "metaphysical naturalism" and the believers in "intelligent design" are each totally correct? What would it take for that to be true, and what are the philosophical consequences?"
http://www.infidels.org/library/modern/bill_schultz/crsc.html
"In agreeing to write this essay, I have promised to explain why I find Darwinism unconvincing. In order to keep this promise, I will be compelled to acknowledge the apparently paradoxical fact that I find it convincing as well. I find it convincing because it is in certain respects correct, and in fact tautologically so in the logical sense; I find it unconvincing because it is based on a weak and superficial understanding of causality and is therefore incomplete. Explaining why this is so will require a rather deep investigation of the nature of causality. It will also require not only that a direction of progress be indicated, but that a new synthesis embracing the seemingly antithetical notions of teleology and natural selection be outlined."
http://www.scribd.com/doc/24820585/Cheating-the-Millennium-The-Mounting-Explanatory-Debts-of-Scientific-Naturalism
"Along with Zurek’s related theory of envariance, quantum Darwinism explains how the classical world emerges from the quantum world and proposes to answer the quantum measurement problem, the main interpretational challenge for quantum theory. The measurement problem arises because the quantum state vector, the source of all knowledge concerning quantum systems, evolves according to the Schrödinger equation into a linear superposition of different states, predicting paradoxical situations such as “Schrödinger's cat”; situations never experienced in our classical world. Quantum theory has traditionally treated this problem as being resolved by a non-unitary transformation of the state vector at the time of measurement into a definite state. It provides an extremely accurate means of predicting the value of the definite state that will be measured in the form of a probability for each possible measurement value. The physical nature of the transition from the quantum superposition of states to the definite classical state measured is not explained by the traditional theory but is usually assumed as an axiom and was at the basis of the debate between Bohr and Einstein concerning the completeness of quantum theory.
Quantum Darwinism explains the transition of quantum systems from the vast potentiality of superposed states to the greatly reduced set of pointer states as a selection process, einselection, imposed on the quantum system through its continuous interactions with the environment. All quantum interactions, including measurements, but much more typically interactions with the environment such as with the sea of photons in which all quantum systems are immersed, lead to decoherence or the manifestation of the quantum system in a particular basis dictated by the nature of the interaction in which the quantum system is involved. In the case of interactions with its environment Zurek and his collaborators have shown that a preferred basis into which a quantum system will decohere is the pointer basis underlying predictable classical states. It is in this sense that the pointer states of classical reality are selected from quantum reality and exist in the macroscopic realm in a state able to undergo further evolution.
As a quantum system’s interactions with its environment results in the recording of many redundant copies of information regarding its pointer states, this information is available to numerous observers able to achieve consensual agreement concerning their information of the quantum state. This aspect of einselection, called by Zurek ‘Environment as a Witness’, results in the potential for objective knowledge."
http://en.m.wikipedia.org/wiki/Quantum_Darwinism
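The redundancy-plus-decoherence story has a minimal numerical toy: a system qubit copies its pointer bit into N environment qubits, after which the system's reduced state is diagonal (coherence gone) and every environment qubit holds a redundant record. A sketch of the idea only, assuming NumPy, and not pretending to Zurek's full formalism:

    import numpy as np

    N = 4                              # environment qubits
    a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
    # Joint state after system->environment CNOTs: a|0>|00..0> + b|1>|11..1>
    dim = 2 ** (N + 1)                 # system qubit is the leading bit
    psi = np.zeros(dim)
    psi[0] = a                         # |0>|00..0>
    psi[dim - 1] = b                   # |1>|11..1>
    rho = np.outer(psi, psi.conj())
    # Partial trace over the environment -> 2x2 system density matrix
    rho_sys = np.zeros((2, 2), dtype=complex)
    for e in range(2 ** N):
        for s1 in range(2):
            for s2 in range(2):
                rho_sys[s1, s2] += rho[s1 * 2 ** N + e, s2 * 2 ** N + e]
    print(np.round(rho_sys, 3))  # diag(0.5, 0.5): off-diagonal terms vanish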
"In ref. [7], S. Majid presents the following `thesis' : ``(roughly speaking) physics polarises down the middle into two parts, one which represents the other, but that the latter equally represents the former, i.e. the two should be treated on an equal footing. The starting point is that Nature after all does not know or care what mathematics is already in textbooks. Therefore the quest for the ultimate theory may well entail, probably does entail, inventing entirely new mathematics in the process. In other words, at least at some intuitive level, a theoretical physicist also has to be a pure mathematician. Then one can phrase the question `what is the ultimate theory of physics ?' in the form `in the tableau of all mathematical concepts past present and future, is there some constrained surface or subset which is called physics ?' Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be found it would be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to do with what is different about the way a physicist looks at the world compared to a mathematician...We can then try to elevate the idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of physics is incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the (supposed) structure of fundamental laws of nature among all mathematical structures by this self-duality condition. Such duality considerations are certainly evident in some form in the context of quantum theory and gravity. The situation is summarised to the left in the following diagram. For example, Lie groups provide the simplest examples of Riemannian geometry, while the representations of similar Lie groups provide the quantum numbers of elementary particles in quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a self-dual picture. Hopf algebras (quantum groups) precisely serve to unify these mutually dual structures."
http://planetmath.org/encyclopedia/IHESOnTheFusionOfMathematicsAndTheoreticalPhysics2.html
"You mention Bill Dembski’s 3-way distinction between determinacy, nondeterminacy (chance) and design. In the CTMU, this distinction comes down to the 3-way distinction between determinacy, nondeterminacy and self-determinacy, the last being associated with telic recursion and the others being secondarily defined with respect to it. Telic recursion is just another term for "metacausation"; instead of simply outputting the next state of a system, it outputs higher-order relationships between state and law (or state and syntax).
Regarding the distinction between origins and evolution, not too many people are clear on it. This distinction is based on the standard view of causality, in which there seems to be a clean distinction between the origin and application of causal principles, specifically first-order Markovian laws of nature. In the CTMU, origins distribute over causes in a new kind of structure called a conspansive manifold, and are therefore not cleanly distinguishable from causality. Both are products of a higher-order process, telic recursion. To put it in simpler terms, evolution consists of events which originate in causes which originate in (teleological) metacauses. So in the CTMU, to talk about evolution is to talk about metacausal origins by ontogenic transitivity."
http://www.iscid.org/boards/ubb-get_topic-f-6-t-000351-p-2.html
“597. This is the argument of the materialists. On the other hand those who are informed of divine philosophy answer in the following terms:
Composition is of three kinds.
1. Accidental composition.
2. Involuntary composition.
3. Voluntary composition.
There is no fourth kind of composition. Composition is restricted to these three categories.
If we say that composition is accidental, this is philosophically a false theory, because then we have to believe in an effect without a cause, and philosophically, no effect is conceivable without a cause. We cannot think of an effect without some primal cause, and composition being an effect, there must naturally be a cause behind it.
As to the second composition, i.e., the involuntary composition. Involuntary composition means that each element has within it, as an inherent function, this power of composition. For example, certain elements have flowed towards each other, and as an inherent necessity of their being they are composed. That is, it is the imminent need of these elements to enter into composition.
For example, the inherent quality of fire is burning or heat. Heat is an original property of fire.
Humidity is the inherent nature of water. You cannot conceive of H2O, which is the chemical form of water, without having humidity connected, for that is its inherent quality, inseparable and indivisible.
Now, as long as it is the inherent necessity of these elements to be composed, there should not be any decomposition. While we observe that after each composite organism there is a process of decomposition, we learn that the composition of the organisms of life is neither accidental nor involuntary. Then what have we as a form of composition? It is the third, that is, the voluntary composition. And that means that the infinite forms of organisms are composed through a superior will, the eternal will, the will of the living and self-subsistent Lord.
This is a rational proof, that the Will of the Creator is effected through the process of composition."
http://bahai-library.com/compilations/bahai.scriptures/7.html
"This, of course, raises a question: how are their next states actually determined? What is the source of the extra tie-breaking measure of determinacy required to select their next events (“collapse their wave functions”)? The answer is not, as some might suppose, “randomness”; randomness amounts to acausality, or alternatively, to informational incompressibility with respect to any distributed causal template or ingredient of causal syntax. Thus, it is either no explanation at all, or it implies the existence of a “cause” exceeding the representative capacity of distributed laws of causality. But the former is both absurd and unscientific, and the latter requires that some explicit allowance be made for higher orders of causation…more of an allowance than may readily be discerned in a simple, magical invocation of “randomness”.
The superposition principle, like other aspects of quantum mechanics, is based on the assumption of physical Markovianism. (Footnote 40: A Markoff process is a stochastic process with no memory. That is, it is a process meeting two criteria: (1) state transitions are constrained or influenced by the present state, but not by the particular sequence of steps leading to the present state; (2) state transition contains an element of chance. Physical processes are generally assumed to meet these criteria; the laws of physics are defined in accordance with 1, and because they ultimately function on the quantum level but do not fully determine quantum state transitions, an element of chance is superficially present. It is in this sense that the distributed laws of physics may be referred to as “Markovian”. However, criterion 2 opens the possibility that hidden influences may be active.) It refers to mixed states between adjacent events, ignoring the possibility of nonrandom temporally-extensive relationships not wholly attributable to distributed laws. By putting temporally remote events in extended descriptive contact with each other, the Extended Superposition Principle enables coherent cross-temporal telic feedback and thus plays a necessary role in cosmic self-configuration. Among the higher-order determinant relationships in which events and objects can thus be implicated are utile state-syntax relationships called telons, telic attractors capable of guiding cosmic and biological evolution.
Given that quantum theory does not seem irrevocably attached to Markovianism, why has the possibility of higher-order causal relationships not been seriously entertained? One reason is spacetime geometry, which appears to confine objects to one-dimensional “worldlines” in which their state-transition events are separated by intervening segments that prevent them from “mixing” in any globally meaningful way. It is for this reason that superposition is usually applied only to individual state transitions, at least by those subscribing to conservative interpretations of quantum mechanics.
Conspansive duality, which incorporates TD (Topological-Descriptive Duality) and CF (Constructive-Filtrative Duality) components, removes this restriction by placing state transition events in direct descriptive contact. Because the geometric intervals between events are generated and selected by descriptive processing, they no longer have separative force. Yet, since worldlines accurately reflect the distributed laws in terms of which state transitions are expressed, they are not reduced to the status of interpolated artifacts with no dynamical reality; their separative qualities are merely overridden by the state-syntax dynamic of their conspansive dual representation.
In extending the superposition concept to include nontrivial higher-order relationships, the Extended Superposition Principle opens the door to meaning and design. Because it also supports distribution relationships among states, events and syntactic strata, it makes cosmogony a distributed, coherent, ongoing event rather than a spent and discarded moment from the ancient history of the cosmos. Indeed, the usual justification for observer participation – that an observer in the present can perceptually collapse the wave functions of ancient (photon-emission) events – can be regarded as a consequence of this logical relationship.
...
Standard recursion is “Markovian” in that when a recursive function is executed, each successive recursion is applied to the result of the preceding one. Telic recursion is more than Markovian; it self-actualizatively coordinates events in light of higher-order relationships or telons that are invariant with respect to overall identity, but may display some degree of polymorphism on lower orders. Once one of these relationships is nucleated by an opportunity for telic recursion, it can become an ingredient of syntax in one or more telic-recursive (global or agent-level) operators or telors and be “carried outward” by inner expansion, i.e. sustained within the operator as it engages in mutual absorption with other operators. Two features of conspansive spacetime, the atemporal homogeneity of IEDs (operator strata) and the possibility of extended superposition, then permit the telon to self-actualize by “intelligently”, i.e. telic-recursively, coordinating events in such a way as to bring about its own emergence (subject to various more or less subtle restrictions involving available freedom, noise and competitive interference from other telons). In any self-contained, self-determinative system, telic recursion is integral to the cosmic, teleo-biological and volitional levels of evolution.
...
Where space and time respectively correspond to information and a combination of generalized cognition and telic recursion, one may therefore conclude that the conspansive evolution of spacetime is an alternation of teleo-cognitive and informational phases cross-refined by telic recursion involving extended, trans-Markovian telonic relationships."
http://www.ctmu.net/
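Criterion (1) of the footnote, memorylessness, has a compact operational reading: the next state is drawn from a distribution indexed by the current state alone. A minimal generic sketch (a two-state chain of my own devising, nothing CTMU-specific); a "trans-Markovian" process would be exactly one where dependence on the recorded history mattered:

    import random

    # A Markoff (Markov) process in the footnote's sense: transitions
    # depend on the present state plus chance -- never on the path taken.
    P = {'A': [('A', 0.9), ('B', 0.1)],   # P(next | current = 'A')
         'B': [('A', 0.5), ('B', 0.5)]}   # P(next | current = 'B')

    def step(state):
        r, acc = random.random(), 0.0
        for nxt, p in P[state]:
            acc += p
            if r < acc:
                return nxt
        return nxt                         # guard against float round-off

    state, history = 'A', []
    for _ in range(10):
        history.append(state)
        state = step(state)                # uses `state` only, never `history`
    print(history)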
22. Determinateness, Hyper-Determinateness, and Super-Determinateness
"This chapter is about whether we can make sense of a notion of hyper-determinacy that will transcend the hierarchy of iterations of the determinacy operators and restore paradox. To investigate this, we need to be more rigorous about how the transfinite iterations of the operator are defined. It turns out that one can in a sense define operators that `iterate beyond the allowable hierarchies', but that they don't lead to new paradoxes because they don't behave like we would expect an iteration of a determinacy operator to behave. This is connected to König's paradox. It is argued that the use of higher order resources would not affect the conclusion."
http://www.ingentaconnect.com/content/oso/2490704/2008/00000001/00000001/art00027
"Generalized cognition" is not limited to the specific form of self-processing information found in neurons, check out globally unitary self-dual algebraic structures. The study of "intelligence" in nonbiological systems is quite common these days.
Mirror Neurons, Mirrorhouses, and the Algebraic Structure of the Self:
...
And finally, we may take this conceptual vision one more natural step. The mirrorhouse inside an individual person’s mind is just one small portion of the overall social mirrorworld. What we really have is a collection of interlocking mirrorhouses. If one face of the tetrahedron comprising my internal mirrorhouse at a certain moment corresponds to one of your currently active subselves, then we may view our two selves at that moment as two adjacent tetrahedra. We thus arrive at a view of a community of interacting individuals as a tiling of part of space using tetrahedra, a vision that would have pleased Buckminster Fuller very much indeed.
Phenomenology of Mirroring
But what does all this abstraction mean in terms of individual subjective experience? Lohmar (2006) has explored the experiential significance of mirror neurons using the language of phenomenology. In this vein he has proposed several theses:
Thesis 1: Maximality. We can co-experience all dimensions of experiencing in other persons.
Thesis 2: Weakness. In co-experiencing the experiences of other persons we always deal with an experience that is dimmed or weakened in a characteristic way.
Thesis 3: Phantasmata. Co-experienced sensations are “phantasma” of sensations. A phantasma of a sensation is “something like” a sensation, i.e., it is given to us in the medium of a sensation; but it is not, however, a real sensation, because phantasmata take place in the absence of that which normally evokes the appropriate sensation. The phantasmata, which make our co-sensing possible, do not appear deliberately but rather unwillingly. But the fact that they occur unwillingly does not imply that they occur automatically in all cases. In Husserlian language, it may be said that phantasmata have both a sense-bearing and sense-fulfilling function at the same time (Husserl 1970, Section 9).
Thesis 4: Sense-bearing intentions. Phantasmata with which we co-experience the sensations, feelings, volition and bodily actions of others have a precise sense. They are specific intentions-of-something, i.e., they are sense-bearing intentions.
Lohmar rephrases his fourth thesis in terms of the idea of “co-willing with others” – the idea that we may experience the doing of something when someone else does it. While this may appear problematic, the issues go away when one delves into the neuropsychology of experienced “free will,” which is well-documented to largely consist of post-facto explanations of unconsciously-determined actions (Freeman et al, 2000). If ordinary cases of will are largely “illusory” in this sense, there is no reason why instances of co-willing can’t have the same phenomenological and neurophysiological status as instances of individual willing.
In Lohmar’s terminology, we may say that the various observers inside an individual’s mental mirrorhouse are recursively experiencing each others’ phantasmal sensations as higher-order phantasmata – we have phantasmata of phantasmata of phantasmata ...and sometimes there is a real sensation in there too, getting reflected around and around; but there need not necessarily be. One may also have phantasmata that merely reflect other phantasmata, in a bottomless non-well-founded hierarchy. This reminds one of Baudrillard’s (1983) notion of simulation, as a process that in itself need not be simulating anything real. One of the purposes of reflection is simulation; and one of the main uses of simulation is to simulate physically real phenomena; but this is not the only possible use."
http://www.goertzel.org/dynapsyc/2007/mirrorself.pdf
"In the first of three articles, we review the philosophical foundations of an approach to quantum gravity based on a principle of representation-theoretic duality and a vaguely Kantian-Buddist perspective on the nature of physical reality which I have called `relative realism'. Central to this is a novel answer to the Plato's cave problem in which both the world outside the cave and the `set of possible shadow patterns' in the cave have equal status. We explain the notion of constructions and `co'constructions in this context and how quantum groups arise naturally as a microcosm for the unification of quantum theory and gravity. More generally, reality is `created' by choices made and forgotten that constrain our thinking much as mathematical structures have a reality created by a choice of axioms, but the possible choices are not arbitary and are themselves elements of a higher-level of reality. In this way the factual `hardness' of science is not lost while at the same time the observer is an equal partner in the process. We argue that the `ultimate laws' of physics are then no more than the rules of looking at the world in a certain self-dual way, or conversely that going to deeper theories of physics is a matter of letting go of more and more assumptions. We show how this new philosophical foundation for quantum gravity leads to a self-dual and fractal like structure that informs and motivates the concrete research reviewed in parts II,III. Our position also provides a kind of explanation of why things are quantized and why there is gravity in the first place, and possibly why there is a cosmological constant."
http://philsci-archive.pitt.edu/3345/
Chu Spaces: Automata with Quantum Aspects
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.83.7742&rep=rep1&type=pdf
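The basic object is small enough to write down: a (two-valued) Chu space is a matrix r : A × X → {0,1} between "points" and "states", and its dual simply transposes the matrix, swapping the two roles. A minimal sketch under those assumptions (the cited paper builds far more structure on top):

    # A two-valued Chu space: points A, states X, and a matrix r
    # with r[a][x] saying whether point a satisfies state x.
    class Chu:
        def __init__(self, points, states, r):
            self.points, self.states, self.r = points, states, r

        def dual(self):
            # Duality: exchange points and states, transpose the matrix.
            rT = {x: {a: self.r[a][x] for a in self.points}
                  for x in self.states}
            return Chu(self.states, self.points, rT)

    space = Chu(['p', 'q'], ['s0', 's1', 's2'],
                {'p': {'s0': 0, 's1': 1, 's2': 1},
                 'q': {'s0': 1, 's1': 0, 's2': 1}})
    print(space.dual().r['s1'])   # {'p': 1, 'q': 0}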
The Duality of the Universe:
http://philsci-archive.pitt.edu/4039/1/Dualiverse.pdf
"Majid’s achievements in the domain of quantum groups and related topics are remarkable. There are reasons to believe that they were inspired, at least partially, by his philosophical ideas and, vice versa, that his mathematical constructions opened before him broad philosophical horizons. They are indeed broader than the ones I have been able to present in this paper. He deals extensively, among others, with “an intrinsic dualism between observer and observed”, with the relationship between mathematics and physics, and he develops some general ideas in the spirit of Kantian and Hegelian philosophy. I postpone the analysis of this aspects of his views to another occasion."
http://www.springerlink.com/content/pv74037m67326r7q/
"Hegel is another German philosopher whose dialectical system has been called idealistic. In his Science of Logic (1812–1814) Hegel argued that finite qualities are not fully "real," because they depend on other finite qualities to determine them. Qualitative infinity, on the other hand, would be more self-determining, and hence would have a better claim to be called fully real. Similarly, finite natural things are less "real"—because they are less self-determining—than spiritual things like morally responsible people, ethical communities, and God."
http://en.m.wikipedia.org/wiki/Idealism
Categorical Ontology of Complex Spacetime Structures: The Emergence of Life and Human Consciousness
...
"Our essay also introduces a novel higher-dimensional algebra approach to space/time ontology that is uniquely characteristic to the human brain and the mind. The human brain is perhaps one of the most complex systems—a part of the human
organism which has evolved earlier than 2 million years ago forming a separate species from those of earlier hominins/hominides. Linked to this apparently unique evolutionary step—the evolution of the H. sapiens species—human consciousness emerged and co-evolved through social interactions, elaborate speech, symbolic communication/language somewhere between the last 2.2 million and 60,000 years ago. The term ultra-complexity level is here proposed to stand for the mind, or the mental level, that is a certain dynamic pattern of layered processes emerging to the most complex level of reality based upon super-complex activities and higher-level processes in special, super-complex systems of the human brain coupled through certain synergistic and/or mimetic interactions in human societies. In this sense, we are proposing a non-reductionist, categorical ontology that possesses both universal attributes and a top level of complexity encompassed by the human consciousness,
The focus in this essay is therefore on the emergence of highly complex systems as categorical, universal and dynamic patterns/structures in spacetime, followed by the even more complex—and also harder to understand, or precisely represent—the emergence of the unique human consciousness. The claim is defended here that the emergence of ultra-complexity requires the occurrence of ‘symmetry breaking’ at several levels of underlying organization, thus leading to the asymmetry of the human brain—both functional and anatomical; such recurring symmetry breaking may also require a sharp complexity increase in our representations of mathematical-relational structure of the human brain and also human consciousness.
Arguably, such repeated symmetry breaking does result in layered complexity dynamic patterns (Baianu and Poli 2008; Poli 2006c) in the human mind that appear to be organized in a hierarchical manner. Thus, ‘conscious planes’ and the focus of
attention in the human mind are linked to an emergent context-dependent variable topology of the human brain, which is most evident during the brain’s developmental stages guided by environmental stimuli such as human/social interactions; the earliest stages of a child’s brain development would be thus greatly influenced by its mother.
The human mind is then represented for the first time in this essay as an ultracomplex ‘system of processes’ based on, but not necessarily reducible to, the human brain’s highly complex activities enabling and entailing the emergence of mind’s own consciousness; thus, an attempt is made here to both define and represent in categorical ontology terms the human consciousness as an emergent/global, ultracomplex process of mental activities as distinct from—but correlated with—a
multitude of integrated local super-complex processes that occur in the human brain. Following a more detailed analysis, the claim is defended that the human mind is more like a ‘multiverse with a horizon, or horizons’ rather than merely a
‘super-complex system with a finite boundary’. The mind has thus freed itself of the real constraints of spacetime by separating, and also ‘evading’, through virtual constructs the concepts of time and space that are being divided in order to be
conquered by the human free will. Among such powerful, ‘virtual’ constructs of the human mind(s) are: symbolic representations, the infinity concept, continuity, evolution, multi-dimensional spaces, universal objects, mathematical categories and abstract structures of relations among relations, to still higher dimensions, many-valued logics, local-to-global procedures, colimits/limits, Fourier transforms, and so on, it would appear without end.
On the other hand, alternative, Eastern philosophical ontology approaches are not based on a duality of concepts such as: mind and body, system and environment, objective and subjective, etc. In this essay, we shall follow the Western philosophy
‘tradition’ and recognize such dual concepts as essentially distinctive items. The possible impact of Eastern philosophies on psychological theories—alongside the Western philosophy of the mind—is then also considered in the concluding sections."
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.9486&rep=rep1&type=pdf
It's not just a question of whether mind can operate independently of brain; the reverse also matters. The thesis of Illuminationism, in my interpretation, is that we are adjunct local mirrors in a state of 'isotelesis' reflecting a shared global light of 'koinotely'; the relationship between the two phases appears 'polytelic', as in a multiverse of branching possibilities. Accurate representations depend on polished mirrors, but the light itself doesn't originate in the reflection: it is a latent property whose manifestation depends on levels of observer complexity.
Measurement processes and cosmological emergence
"As a geometrical entity, directed time traditionally follows a one dimensional path of pointlike elements, obeying the classical logic of set theory. Modern mathematics allows more general forms of logic, but time is often still considered a classical variable, which interpolates the geometry of initial and final states of a physical system. How can cosmological time be viewed as an emergent concept in geometrical logic? Using quantum logic as a clue, we investigate local time steps in algebras of finite collections of n measurement outcomes. A local concept of observer time is implicit in the measurement question, but
cosmic time becomes a Hegelian measure of complexity of the observer."
http://www.fqxi.org/data/essay-contest-files/Sheppeard_TimeMDS08.pdf?phpMyAdmin=0c371ccdae9b5ff3071bae814fb4f9e9
"Illuminationism is a doctrine in theology according to which the process of human thought needs to be aided by God. It is the oldest and most influential alternative to naturalism in the theory of mind and epistemology. It was an important feature of ancient Greek philosophy, Neoplatonism, medieval philosophy, and in particular, the Illuminationist school of Persian Islamic philosophy."
http://en.m.wikipedia.org/wiki/Illuminationism
"The virtues of humanity are many but science is the most noble of them all. The distinction which man enjoys above and beyond the station of the animal is due to this paramount virtue. It is a bestowal of God; it is not material, it is divine. Science is an effulgence of the Sun of Reality, the power of investigating and discovering the verities of the universe, the means by which man finds a pathway to God. All the powers and attributes of man are human and hereditary in origin, outcomes of nature's processes, except the intellect, which is supernatural. Through intellectual and intelligent inquiry science is the discoverer of all things. It unites present and past, reveals the history of bygone nations and events, and confers upon man today the essence of all human knowledge and attainment throughout the ages. By intellectual processes and logical deductions of reason, this super-power in man can penetrate the mysteries of the future and anticipate its happenings."
http://info.bahai.org/article-1-5-3-1.html
"What becomes of the soul after its separation from the body? The question concerns that which has a place and that which is placeless. The human body is in space; the soul has no place in space. Space is a quality of material things and that which is not material does not partake of space. The soul, like the intellect, is an abstraction. Intelligence does not partake of the quality of space, though it is related to man's brain. The intellect resides there, but not materially. Search in the brain you will not find the intellect. In the same way though the soul is a resident of the body it is not to be found in the body.
When man dies, his relation with the body ceases. The sun is reflected in the mirror; the mirror reflects the light and brilliancy of the sun, but the sun does not reside in the mirror. It does not enter nor come out of the mirror, nevertheless one sees it in the mirror, so the soul reflects itself in the body. If the mirror be broken the sun does not die. The body is the temporary mirror; the spiritual soul suffers no change, no more than the sun does remaining eternally in its own station. Even as in the world of dreams when all the physical faculties are in abeyance and the soul travels in all realms seeing, hearing, speaking, so when the physical body decomposes, the soul is not affected."
http://bahai-library.com/abdulbaha_divine_philosophy.html&chapter=3
In the Baha'i view, there is no such thing as eternal damnation, original sin, or evil.
"The epitome of this discourse is that it is possible that one thing in relation to another may be evil, and at the same time within the limits of its proper being it may not be evil. Then it is proved that there is no evil in existence; all that God created He created good. This evil is nothingness; so death is the absence of life. When man no longer receives life, he dies. Darkness is the absence of light: when there is no light, there is darkness."
http://reference.bahai.org/en/t/ab/SAQ/saq-75.html
"Luhmann alternates between this meta-biological model using concepts of functional differentiation and structural coupling for the explanation, and a meta-theological one where meaning seems to be given transcendentally as a substance analogous to life (Luhmann, 1986). The meta-theological metaphor is pursued by Luhmann by grounding his theory on the operation of the distinction (that is, on a paradox). The first distinction is then Lucifer’s breaking away from God as the devil (Luhmann, 1990, at pp. 118 ff.) or, in other words, the problem of the Theodicy, that is, the origin of evil in the world (Leibniz, [1710] 1962)."
http://www.scribd.com/doc/20186514/Luhman-s-Theory-Specif-genomena-of-Husserl
"His published work emerged as part of a generation of French thinkers including Gilles Deleuze, Jean-François Lyotard, Michel Foucault, Jacques Derrida and Jacques Lacan who all shared an interest in semiotics, and he is often seen as a part of the poststructuralist philosophical school. In common with many poststructuralists, his arguments consistently draw upon the notion that signification and meaning are both only understandable in terms of how particular words or "signs" interrelate. Baudrillard thought, as many post-structuralists, that meaning is brought about through systems of signs working together. Following on from the structuralist linguist Ferdinand de Saussure, Baudrillard argued that meaning (value) is created through difference - through what something is not (so "dog" means "dog" because it is not-"cat", not-"goat", not-"tree", etc.). In fact, he viewed meaning as near enough self-referential: objects, images of objects, words and signs are situated in a web of meaning; one object's meaning is only understandable through its relation to the meaning of other objects; in other words, one thing's prestige relates to another's mundanity.
From this starting point Baudrillard constructed broad theories of human society based upon this kind of self-referentiality. His pictures of society portray societies always searching for a sense of meaning — or a "total" understanding of the world — that remains consistently elusive. In contrast to poststructuralists such as Foucault, for whom the formations of knowledge emerge only as the result of relations of power, Baudrillard developed theories in which the excessive, fruitless search for total knowledge leads almost inevitably to a kind of delusion. In Baudrillard's view, the (human) subject may try to understand the (non-human) object, but because the object can only be understood according to what it signifies (and because the process of signification immediately involves a web of other signs from which it is distinguished) this never produces the desired results. The subject, rather, becomes seduced (in the original Latin sense, seducere, to lead away) by the object. He therefore argued that, in the last analysis, a complete understanding of the minutiae of human life is impossible, and when people are seduced into thinking otherwise they become drawn toward a "simulated" version of reality, or, to use one of his neologisms, a state of "hyperreality." This is not to say that the world becomes unreal, but rather that the faster and more comprehensively societies begin to bring reality together into one supposedly coherent picture, the more insecure and unstable it looks and the more fearful societies become. Reality, in this sense, "dies out.""
http://en.m.wikipedia.org/wiki/Jean_Baudrillard
Jean Baudrillard: Symbolic Exchange and Death
http://www.scribd.com/doc/39207194/Professor-Jean-Baudrillard-Symbolic-Exchange-and-Death-0803983999
Jean Baudrillard: Selected Writings
http://www.humanities.uci.edu/mposter/books/Baudrillard,%20Jean%20-%20Selected%20Writings_ok.pdf
The Illusion of The Beginning: A Theory of Drawing and Animation
"It may be that universal history is the history of a handful of metaphors. The purpose of this note will be to sketch a chapter of this history."
http://www.thefreelibrary.com/The+Illusion+of+The+Beginning%3A+A+Theory+of+Drawing+and+Animation.-a064263079
Orders of Value: Probing the Theoretical Terms of Archival Practice
...
The French sociologist Jean Baudrillard has placed the contemporary drive to documentation within a larger phenomenal context, which he calls the "fatal strategy" of modern society. This strategy excludes the benign constraints intrinsic in the "dialectical mode," such as reconciliation, synthesis and equilibrium, in favour of radical antagonisms, an "ascent to extremes" which are noticeable in the incidences of "infinite proliferation" symptomatic of our "hyperdeterminacy" and "hyperfunctionality." More darkly, Baudrillard finds in cancer, a disease of (cell) overproduction, a suitable symbol for the "hyperactivity" of modern society.
This is the true behaviour of the cancerous cell (hypervitality in a single direction), of the hyperspecialization of objects and people, of the operationalism of the smallest detail, and of the hypersignification of the slightest sign: the leitmotif of our daily lives. But this is also the chancroid secret of every obese and cancerous system: those of communication, of information, of production, of destruction.'
The bloated significance of signs, of communication and information, Baudrillard contends, is traceable to the obsession with determining causes, with locating origins, which therefore results in the obliteration of finalities. And it is the obliteration of finalities that produces the documentary mentality: for every document found, there are always others that empower their discoverers either to undermine or to engulf the earlier one in new causes: one document always needs, points the way to, or preemptively explains others. And so goes the historian's quest for the document that will reveal the undiscovered cause - Marc Bloch somewhere referred to "la hantise des origines" (the obsession with origins) - and which will undermine a previous account of genesis. Reflecting this situation, Baudrillard says, is
'the hypertrophying of historical research, the delirium of explaining everything, of ascribing everything, of referencing everything . . . All this becomes a fantastic burden - references living off one another and at the other's expense. Here again we have an excrescent interpretive system developing without any relation to its objective. All this is a consequence of a forward flight in the face of the haemorrhaging of objective causes.'
http://journals.sfu.ca/archivar/index.php/archivaria/article/download/11761/12711
Which is more general, consciousness or intelligence? I think what we really should be asking is whether consciousness is limited to organic brains. I am not trying to convince you to see it "my way"; there is more than one way to see it. I'm just offering information from various sources for those interested in forming their own conclusions.
intelligent:
1) having the capacity for thought and reason especially to a high degree
2) endowed with the capacity to reason
conscious:
1) intentionally conceived
2) knowing and perceiving; having awareness of surroundings and sensations and thoughts
A Sheaf Theoretic Approach to Consciousness
A new fundamental mathematical model of consciousness based on category theory is presented. The model is based on two philosophical-theological assumptions: a) the universe is a sea of consciousness, and b) time is multi-dimensional and non-linear.
http://works.bepress.com/gkato/16/
Category Theory and Consciousness:
http://books.google.com/books?hl=en&lr&id=eRkooap_j-YC&oi=fnd&pg=PA357&dq=category+theory+and+consciousness+kato&ots=1FWA8LAocQ&sig=_sDKnE-7tJnoQurjmNERmtE4vO8
Category Theory as the Language of Consciousness
http://www.mindspring.com/~r.amoroso/Amoroso24.pdf
Causation is self-contained, but the principle of causal closure acknowledges only physical causes. Information, however, is more than physical (matter/energy); it is also mental. Descartes separated the physical from the mental to please the Church, yet there must be interaction between the two if reality is reducible to a common substance. I believe the representational and the physical undergo a form of causal feedback. "Cybersemiotics" is another attempt to extend the information concept. Downward, or top-down, causation is also something to consider.
"Biosemiotics sees the evolution of life and the evolution of semiotic systems as two aspects of the same process. The scientific approach to the origin and evolution of life has, in part due to the success of molecular biology, given us highly valuable accounts of the outer aspects of the whole process, but has overlooked the inner qualitative aspects of sign action, leading to a reduced picture of causality. Complex self-organized living systems are also governed by formal and final causality —- formal in the sense of the downward causation from a whole structure (such as the organism) to its individual molecules, constraining their action but also endowing them with functional meanings in relation to the whole metabolism; and final in the sense of the tendency to take habits and to generate future interpretants of the present sign actions. Here, biosemiotics draws also upon the insights of fields like systems theory, theoretical biology and the study of complex self-organized systems.
...
It may help to resolve some forms of Cartesian dualism that are still haunting the philosophy of mind. By describing the continuity between body and mind, biosemiotics may also help us to understand how human "mindedness" may naturalistically emerge from more primitive processes of embodied animal "knowing.""
http://en.m.wikipedia.org/wiki/Biosemiotics
I am against the politicization of science and/or religion; that is my problem with the ID movement. I do have an interest in alternative explanations for evolutionary results, but from the perspective of anticipatory modeling in setting up the conditions for emergence, or cybernetic "mechanism design", to borrow a term from game theory.
"Consider a dynamical system whose behavior appears random or chaotic. There are two ways in which an apparent randomness can occur: (1) external noise, so that if the evolution of the system is unstable, external perturbations amplify exponentially with time -such systems are called homoplectic; (2) internal mechanisms, so that the randomness is generated purely by the dynamics itself and does not depend on any external sources or require that randomness be present in the initial conditions -such systems are called autoplectic systems. An example of an autoplectic system is the one-dimensional, two-state, two neighbor Cellular Automaton rule-30, starting from a single non-zero site. The temporal sequence of binary values starting from that single non-zero initial seed are completely random, despite the fact that the evolution is strictly deterministic and the initial state is ordered."
http://www.iscid.org/encyclopedia/Autoplectic_Systems
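To make the autoplectic case concrete, here is a minimal Python sketch of rule-30 evolution from a single non-zero seed; the center-column bits it prints form the "completely random" temporal sequence the quote refers to (the width and step count are arbitrary choices here, not anything the source specifies):

```python
# Rule 30: new cell = left XOR (center OR right); strictly deterministic.
def rule30_step(cells):
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

steps = 60
cells = [0] * (2 * steps + 3)   # wide enough that the edges never matter
cells[len(cells) // 2] = 1      # the single non-zero initial seed

center = []
for _ in range(steps):
    center.append(cells[len(cells) // 2])
    cells = rule30_step(cells)

# Ordered rule, ordered seed, no external noise -- yet the column looks random.
print("".join(map(str, center)))
```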
Stephen Wolfram has described three mechanisms for randomness:
http://www.wolframscience.com/nksonline/section-10.3
The first two treat randomness as already present and external to the system, while in the third it is generated internally:
1. An external environment continuously injects randomness into the rules of a system
2. The initial conditions are chosen at random, and the subsequent evolution follows definite rules sensitive to those initial conditions.
3. No random input is given at all, yet randomness is generated by the dynamics of the system itself.
I think the randomness we see in nature is largely the result of the third form, while the first two are used in mathematical practice.
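For contrast with the rule-30 sketch above (mechanism 3), here is a hedged illustration of the first two mechanisms in the same style; the particular maps chosen (a doubling map with injected noise, and the chaotic logistic map seeded randomly) are placeholder examples of mine, not anything Wolfram specifies:

```python
import random

# Mechanism 1: the environment injects fresh randomness at every step.
def noisy_orbit(steps, x=0.3, eps=0.05):
    out = []
    for _ in range(steps):
        x = (2.0 * x + random.uniform(-eps, eps)) % 1.0  # perturbed doubling map
        out.append(x)
    return out

# Mechanism 2: randomness enters only through the initial condition; the
# evolution itself is a fixed deterministic rule sensitive to that condition.
def logistic_orbit(steps):
    x = random.random()          # the only random input
    out = []
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)  # chaotic logistic map
        out.append(x)
    return out

print(noisy_orbit(5))
print(logistic_orbit(5))
```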
There is plenty of evidence that human beings developed from earlier species of animals; I believe the issue has to do with nonrandom priority mechanisms which order selective action.
The variation is random, the selection process doesn't have to be.
The source of randomness is also relevant: is it external or internal to the system? Do the initial or boundary conditions of an evolving system have to be random?
It's not clear what is meant by volition anyway; I think of it as a type of constitutive-causal or whole-part composition relation, similar to "top-down causation".
"In our view, the phrase ‘top-down causation’ is often used to describe a perfectly coherent and familiar relationship between the activities of wholes and the behaviors of their components, but the relationship is not a causal relationship. Likewise, the phrase ‘bottom-up causation’ does not, properly speaking, pick out a causal relationship. Rather, in unobjectionable cases both phrases describe mechanistically mediated effects. Mechanistically mediated effects are hybrids of constitutive and causal relations in a mechanism, where the constitutive relations are interlevel, and the causal relations are exclusively intralevel. Appeal to top-down causation seems spooky or incoherent when it cannot be explicated in terms of mechanistically mediated effects."
http://philosophyfaculty.ucsd.edu/faculty/pschurchland/classes/cs200/topdown.pdf
Evolution in four dimensions: genetic, epigenetic, behavioral, and symbolic variation in the history of life
"Ideas about heredity and evolution are undergoing a revolutionary change. New findings in molecular biology challenge the gene-centered version of Darwinian theory according to which adaptation occurs only through natural selection of chance DNA variations. In Evolution in Four Dimensions, Eva Jablonka and Marion Lamb argue that there is more to heredity than genes. They trace four "dimensions" in evolution—four inheritance systems that play a role in evolution: genetic, epigenetic (or non-DNA cellular transmission of traits), behavioral, and symbolic (transmission through language and other forms of symbolic communication). These systems, they argue, can all provide variations on which natural selection can act."
...
"One of the things that Barbara McClintock discovered many years ago was that stress conditions lead to a massive movement of mobile genetic elements in the genomes of plants. She regarded this as an adaptive response, which provided an important source of new variation.
...
People can and do argue whether 'induced global' mutation is an evolved adaptive response, or something pathological which may incidentally have beneficial effects, but there is no doubt that our second kind of nonrandom mutation process -- 'local hypermutation' -- is an adaptation. With induced global mutation, the mutations produced are nonrandom because they occur at a 'time' when they are likely to be useful; with local hypermutation, changes are produced at a genomic 'place' where they are useful."
http://tinyurl.com/6dpqkbn
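A hedged way to see the distinction drawn above (random variation, nonrandom selection, and a stress-raised "global" mutation rate) is a toy evolutionary loop; everything below, from the bit-string genome to the stagnation test standing in for "stress", is an illustrative invention of mine rather than a model of McClintock's plants:

```python
import random

GENOME = 20
TARGET = [1] * GENOME                      # an arbitrary fixed "environment"

def fitness(g):
    return sum(a == b for a, b in zip(g, TARGET))

def evolve(pop, base_rate=0.01, stress_rate=0.2, generations=200):
    best_prev = -1
    for _ in range(generations):
        best = max(fitness(g) for g in pop)
        # "Induced global mutation": raise the whole-genome rate under
        # stress, modelled here as stagnating best fitness.
        rate = stress_rate if best <= best_prev else base_rate
        best_prev = best
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]     # selection: deterministic, nonrandom
        pop = [[b ^ (random.random() < rate) for b in random.choice(parents)]
               for _ in range(len(pop))]   # variation: random bit flips
    return pop

pop = [[random.randint(0, 1) for _ in range(GENOME)] for _ in range(30)]
print(max(fitness(g) for g in evolve(pop)))
```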
"Diagram 2: The upper diagram illustrates ordinary cybernetic feedback between two information transducers exchanging and acting on information reflecting their internal states. The structure and behavior of each transducer conforms to a syntax, or set of structural and functional rules which determine how it behaves on a given input. To the extent that each transducer is either deterministic or nondeterministic (within the bounds of syntactic constraint), the system is either deterministic or “random up to determinacy”; there is no provision for self-causation below the systemic level. The lower diagram, which applies to coherent self-designing systems, illustrates a situation in which syntax and state are instead determined in tandem according to a generalized utility function assigning differential but intrinsically-scaled values to various possible syntax-state relationships. A combination of these two scenarios is partially illustrated in the upper diagram by the gray shadows within each transducer.
...
Because states express topologically while the syntactic structures of their underlying operators express descriptively, attributive duality is sometimes called state-syntax duality. As information requires syntactic organization, it amounts to a valuation of cognitive/perceptual syntax; conversely, recognition consists of a subtractive restriction of informational potential through an additive acquisition of information. TD duality thus relates information to the informational potential bounded by syntax, and perception (cognitive state acquisition) to cognition.
In a Venn diagram, the contents of circles reflect the structure of their boundaries; the boundaries are the primary descriptors. The interior of a circle is simply an “interiorization” or self-distribution of its syntactic “boundary constraint”. Thus, nested circles corresponding to identical objects display a descriptive form of containment corresponding to syntactic layering, with underlying levels corresponding to syntactic coverings.
This leads to a related form of duality, constructive-filtrative duality.
...
That is, the circular boundaries of the Venn circles can be construed as those of “potentialized” objects in the process of absorbing their spatiotemporal neighborhoods. Since the event potentials and object potentials coincide, potential instantiations of law can be said to reside “inside” the objects, and can thus be regarded as functions of their internal rules or “object syntaxes”.
Objects thus become syntactic operators, and events become intersections of nomological syntax in the common value of an observable state parameter, position.
The circle corresponding to the new event represents an attribute consisting of all associated nomological relationships appropriate to the nature of the interaction including conserved aggregates, and forms a pointwise (statewise) “syntactic covering” for all subsequent potentials.
...
The telic-recursive cross-refinement of syntax and content is implicit in the “seed” of Γ-grammar, the MU form, which embodies the potential for perfect complementarity of syntax and state, law and matter.
Since this potential can only be specifically realized through the infocognitive binding of telesis, and localized telic binding is freely and independently effected by localized, mutually decoherent telic operators, deviations from perfect complementarity are ubiquitous. SCSPL evolution, which can be viewed as an attempt to help this complementarity emerge from its potential status in MU, incorporates a global (syntactic) invariant that works to minimize the total deviation from perfect complementarity of syntax and state as syntactic operators freely and independently bind telesis. This primary SCSPL invariant, the Telic Principle, takes the form of a selection function with a quantitative parameter, generalized utility, related to the deviation. The Telic Principle can be regarded as the primary component of SCSPL syntax…the spatiotemporally distributed self-selective “choice to exist” coinciding with MU.
...
SCSPL incorporates the concepts of syntactic stratification and syntactic distribution. For example, because the laws of mathematics everywhere apply with respect to the laws of physics, the former distribute over the latter in the syntactic sense. Thus, where the laws of mathematics and physics are denoted by S1=LMS and S2 respectively, S1 distributes over S2, i.e. forms a syntactic covering for S2.
Essentially, this means that the laws of mathematics are everywhere a required syntactic component of the language of physics. With S2 is associated an SCSPL “sublanguage” called LO (with a letter O subscript). LO constitutes the world of perception, the classical objective universe of sense data traditionally studied by science. LO is contained in the telic-recursive, pre-informational phase of SCSPL, LS, which encompasses the cross-refinement of LO syntax and LO content from the pre-infocognitive aspect of SCSPL. The part of SCSPL grammar confined to LO incorporates certain restrictions to which LS is not subject; e.g., the grammatical portion of LO (S2) is fixed, distributed and supposedly continuous, while that of LS can also be mutable, local and discrete…in a word, telic.
Γ grammar is the generative grammar of SCSPL = (LS⊃LO). Γ grammar is unlike an ordinary grammar in that its processors, products and productions coincide and are mutually formed by telic recursion. Syntax and state, loosely analogous to form and content (or productions and products), are mutually refined from telesis through telic recursion by infocognitive processors. Production rules include the Telic Principle, distributed elements of syntax formed in the primary phase of telic recursion, and more or less polymorphic telons formed by agent-level telors. The corresponding modes of production are global telic recursion, informational recursion by distributed syntax, and local telic recursion.
The “words” produced by Γ grammar are not strings of symbols, but LO spatial relationships among parallel processors that can read and write to each other’s states. In effect, the states of its processors are roughly analogous to the symbols and strings of an ordinary language. The processors of Γ grammar thus function not only as transducers but as symbolic placeholders for observables and values, while their external states correspond to products and their state transitions realize the productions of the grammar. In other words, the states and state transitions of the processors of Γ grammar comprise a representation of Γ grammar, rendering SCSPL a dynamic self-modeling language or “interactive self-simulation”.
...
Γ grammar generates SCSPL according to the utility of its sentient processors, including the self-utility of Γ and the utility of its LO relations to telors in A. Γ and A generate telons on the global and local level respectively; thus, they must be capable of recognizing and maximizing the selection parameter υ (in the case of human telors, for example, this requires the QPS (qualio-perceptual syntax) and ETS (emo-telic syntax) components of the HCS (Human Cognitive-Perceptual Syntax)). As such, they are responsible for telic recursion and may be regarded as the “generators” of Γ grammar, while the set Q of elementary physical objects are freely and competitively acquired by telons and thus occupy an ontologically secondary position.
Γ grammar is conspansive. Non-global processors alternate between the generation and selective actualization of possible productions, and thus between the generative and selective (inner expansive and requantizative) phases of conspansion. The selective phase of an operator coincides with interactive mutual-acquisition events, while the generative phase coincides with the generation and selective actualization of possible productions through hological multiplexing. In conjunction with extended spatiotemporal superposition, conspansion provides the means of local (telic and informational) recursion.
...
It is instructive to experiment with the various constructions that may be placed on LS and LO. For example, one can think of LS as “L-sim”, reflecting its self-simulative, telic-recursive aspect, and of LO as “L-out”, the output of this self-simulation. One can associate LO with observable states and distributed-deterministic state-transition syntax, and LS with the metasyntactic Telic Principle. One can even think of LS and LO as respectively internal and (partially) external to SCSPL syntactic operators, and thus as loosely correspondent to the subjective and objective aspects of reality. Where LS and LO are associated with the coherent inner expansion and decoherent requantization phases of conspansion, so then are subjective and objective reality, simulation and output, “wave and particle”. In other words, the subjective-objective distinction, along with complementarity, can be viewed as functions of conspansive duality.
...
Where space and time correspond to information and generalized cognition respectively, and where information and cognition are logically entwined in infocognitive SCSPL syntactic operators intersecting in states and state-transition events, space and time are entwined in a conspansive event-lattice connected by syntax and evolving through mutual absorption events among syntactic operators, symmetric instances of generalized observation influenced by telic recursion. Thus, time is not “fed into” the explanation of existence, but is a function of conspansive, telic-recursive SCSPL grammar.
...
To see how information can be beneficially reduced when all but information is uninformative by definition, one need merely recognize that information is not a stand-alone proposition; it is never found apart from syntax. Indeed, it is only by syntactic acquisition that anything is ever “found” at all. That which is recognizable only as syntactic content requires syntactic containment, becoming meaningful only as acquired by a syntactic operator able to sustain its relational structure; without attributive transduction, a bit of information has nothing to quantify. This implies that information can be generalized in terms of “what it has in common with syntax”, namely the syndiffeonic relationship between information and syntax."
http://www.megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf
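The "ordinary cybernetic feedback" scenario in the Diagram 2 passage above (two transducers, each acting on the other's visible state under a fixed syntax) can at least be rendered as a toy program; the transition rules below are arbitrary placeholders of mine, chosen only to show that a fixed syntax plus deterministic coupling leaves no room for self-causation below the systemic level:

```python
class Transducer:
    """A state machine whose 'syntax' (transition rule) is fixed from outside."""
    def __init__(self, state, rule):
        self.state = state  # externally readable state
        self.rule = rule    # fixed syntax: (own state, observed state) -> new state

    def step(self, observed):
        self.state = self.rule(self.state, observed)

# Two coupled transducers reading and acting on each other's states.
a = Transducer(0, lambda s, o: (s + o + 1) % 5)   # placeholder rule
b = Transducer(1, lambda s, o: (2 * s + o) % 5)   # placeholder rule

for _ in range(8):
    sa, sb = a.state, b.state  # each reads the other's current state
    a.step(sb)
    b.step(sa)
    print(a.state, b.state)
```

With fixed rules the joint trajectory is fully determined by the initial pair of states; the lower diagram's self-designing case would require the rules themselves to be revised in tandem with the states, which this sketch deliberately does not attempt.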
"The paper uses the tools of mereotopology (the theory of parts, wholes and boundaries) to work out the implications of certain analogies between the 'ecological psychology' of J. J Gibson and the phenomenology of Edmund Husserl. It presents an ontological theory of spatial boundaries and of spatially extended entities. By reference to examples from the geographical sphere it is shown that both boundaries and extended entities fall into two broad categories: those which exist independently of our cognitive acts (for example, the planet Earth, its exterior surface); and those which exist only in virtue of such acts (for example: the equator, the North Sea). The visual field, too, can be conceived as an example of an extended entity that is dependent in the sense at issue. The paper suggests extending this analogy by postulating entities which would stand to true judgments as the visual field stands to acts of visual perception. The judgment field is defined more precisely as that complex extended entity which comprehends all entities which are relevant to the truth of a given (true) judgment. The work of cognitive linguists such as Talmy and Langacker, when properly interpreted, can be shown to yield a detailed account of the structures of the judgment fields corresponding to sentences of different sorts. A new sort of correspondence-theoretic definition of truth for sentences of natural language can then be formulated on this basis.
...
Are fiat boundaries, and the fiat objects they circumclude, discovered or created? The former view has in its favour the virtue of ontological parsimony: only one sort of boundary needs to be admitted into our ontology, where on the latter view we should have to admit in addition to purely geometrical boundaries also certain historically determined boundaries which coincide with these. An argument in favour of the existence of historically created boundaries can however be formulated as follows. We note, first of all, that 'Hamburg' is an ambiguous term, standing on the one hand for a certain city (Hamburg-Stadt) and on the other hand for a certain administrative entity (Hamburg-Land), which is one of the constituent."
http://ontology.buffalo.edu/smith/articles/tvf.html
"The concept of measure is intimately involved with the notion of number. Modeling, a sophisticated form of abstract description, using mathematics and computation, both tied to the concept of number, and their advantages and disadvantages are exquisitely detailed by Robert Rosen in Life Itself, Anticipatory Systems, and Fundamentals of Measurement. One would have hoped that mathematics or computer simulations would reduce the need for word descriptions in scientific models. Unfortunately for scientific modeling, one cannot do as David Hilbert or Alonzo Church proposed: divorce semantics (e.g., symbolic words: referents to objects in reality) from syntax (e.g., symbolic numbers: referents to a part of a formal system of computation or entailment). One cannot do this, even in mathematics without things becoming trivial (ala Kurt Godel). It suffices to say that number theory (e.g., calculus), category theory, hypersets, and cellular automata, to mention few, all have their limited uses. The integration between all of these formalisms will be necessary plus rigorous attachment of words and numbers to show the depth and shallowness of the formal models. These rigorous attachments of words are ambiguous to a precise degree without the surrounding contexts. Relating precisely with these ambiguous words to these simple models will constitute an integration of a reasonable set of formalisms to help characterize reality."
http://edgeoforder.org/pofdisstruct.html
"Transfinite induction is an extension of mathematical induction to well-ordered sets, for instance to sets of ordinals or cardinals."
http://en.m.wikipedia.org/wiki/Transfinite_induction
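For reference, the principle can be stated in one line of LaTeX: if a property propagates to each ordinal from all smaller ordinals (a hypothesis which covers the zero, successor, and limit cases at once), then it holds of every ordinal:

\forall \alpha \,\bigl[\,(\forall \beta < \alpha \; P(\beta)) \rightarrow P(\alpha)\,\bigr] \;\Longrightarrow\; \forall \alpha \; P(\alpha)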
Mathematical undecidability and quantum randomness
"Another very cool result from this week. Anton Zeilinger and co-conspirators claim a link between mathematical undecidability and quantum physics. Specifically, given appropriate coding, “whenever a mathematical proposition is undecidable within the axioms encoded in the state, the measurement associated with the proposition gives random outcomes.” Thus, they claim that “quantum randomness is irreducible and a manifestation of mathematical undecidability”."
http://arxiv.org/abs/0811.
"In logic, statistical inference, and supervised learning, transduction or transductive inference is reasoning from observed, specific (training) cases to specific (test) cases. In contrast, induction is reasoning from observed training cases to general rules, which are then applied to the test cases. The distinction is most interesting in cases where the predictions of the transductive model are not achievable by any inductive model. Note that this is caused by transductive inference on different test sets producing mutually inconsistent predictions."
http://en.wikipedia.org/wiki/Transduction_(machine_learning)
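A minimal sketch of the contrast, assuming scikit-learn is available: the inductive learner compresses the labeled cases into a general rule and then applies it, while LabelPropagation reasons transductively, exploiting the positions of the unlabeled test points themselves when assigning their labels (the dataset and the 20-point labeled split are arbitrary choices here):

```python
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import LabelPropagation

X, y = make_moons(n_samples=200, noise=0.1, random_state=0)
y_partial = y.copy()
y_partial[20:] = -1  # scikit-learn convention: -1 marks unlabeled points

# Induction: observed (training) cases -> general rule -> test cases.
inductive = LogisticRegression().fit(X[:20], y[:20])
pred_ind = inductive.predict(X[20:])

# Transduction: observed cases -> specific test cases directly, using the
# geometry of all points (labeled and unlabeled) during inference.
transductive = LabelPropagation().fit(X, y_partial)
pred_trans = transductive.transduction_[20:]

print("inductive accuracy:   ", (pred_ind == y[20:]).mean())
print("transductive accuracy:", (pred_trans == y[20:]).mean())
```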