
Info-Computational Philosophy of Nature: An Informational Universe with Computational Dynamics

Gordana Dodig-Crnkovic
Mälardalen University, Sweden, http://www.idt.mdh.se/~gdc

The Cybersemiotics Critique of the Existing Practice of Wissenschaft

In his article "Cybersemiotics: An Evolutionary World View Going Beyond Entropy and Information into the Question of Meaning" (Brier 2010), Søren Brier rightly criticizes the present state of scientific understanding of nature (life, consciousness and cultural meaning all as a part of nature and evolution, including humans and other living beings), specifically:

1. The physico-chemical scientific paradigm based on third-person objective empirical knowledge and mathematical theory, but with no conceptions of experiential life, meaning and first-person embodied consciousness, and therefore no meaningful linguistic intersubjectivity;

2. The biological and natural-historical science approach, understood as the combination of genetic evolutionary theory with an ecological and thermodynamic view based on the evolution of experiential living systems as the ground fact and engaged in a search for empirical truth, yet doing so without a theory of meaning and first-person embodied consciousness, and thereby without linguistic meaningful intersubjectivity;

3. The linguistic-cultural-social structuralist constructivism that sees all knowledge as constructions of meaning produced by the intersubjective web of language, cultural mentality and power, but with no concept of empirical truth, life, evolution or ecology, and a very weak concept of subjective embodied first-person consciousness, even while taking conscious intersubjective communication and knowledge processes as the basic fact to study (the linguistic turn);

4. Any approach which takes the qualitative distinction between subject and object as the ground fact on which all meaningful knowledge is based, considering all results of the sciences, including linguistics and the embodiment of consciousness, as secondary knowledge, as opposed to a phenomenological (Husserl) or actually phaneroscopic (Peirce) first-person point of view considering conscious meaningful experiences in advance of the subject/object distinction.

This is the view Brier argues for in more detail in his book Why Information Is Not Enough (Brier 2008), where he also proposes the "Cybersemiotic star", a diagram showing how knowledge understood as Wissenschaft arises in a naturalist framework in four different approaches to cognition, communication, meaning and consciousness: the exact natural sciences, the life sciences, the phenomenological-hermeneutic interpretational humanities, and the sociological-discursive-linguistic approach, which are all considered to be equally important and have to be united in a transdisciplinary theory of information, semiotics, first-person consciousness and an intersubjective cultural-social communicative approach:

"The semiotic star in cybersemiotics claims that the internal subjective, the intersubjective linguistic, our living bodies, and nature are irreducible and equally necessary as epistemological prerequisites for knowing. The viable reality of any of them cannot be denied without self-refuting paradoxes. There is an obvious connectedness between the four worlds, which Peirce called synechism. It also points to Peirce's conclusion that logic and rationality are part of the process of semiosis, and that meaning in the form of semiosis is a fundamental aspect of reality, not just a construction in our heads." (Brier 2009)

Cybersemiotics is an ambitious and important project. It correctly identifies problems in the established traditional ideas of knowledge and science and proposes possible new directions of development.

In the present article I will argue that Info-computationalism (ICON), built on different grounds and adopting a scientific third-person account (Dodig-Crnkovic 2006-2011), covers the entire list from (Brier 2010) except for qualia as experienced in a first-person mode. Qualia, like other natural phenomena, are accounted for in a third-person perspective.

Info-Computationalism is a purely scientific, naturalistic framework. It approaches the complex world of natural phenomena with info-computational tools and models based on research results from a range of sciences, from mathematics to computing, biology and ecology. ICON is constructed as what Wolfram (2002) calls "a new kind of science". Among other modeling tools, it uses newly discovered generative models of complex systems which, starting from often very simple rules, generate complex behaviors such as the self-organization of an insect swarm or an immune system. This type of approach is known as agent-based modeling (ABM) (Axelrod 1997) (Epstein 2007). Being a truly new scientific modeling approach, it is under intensive development, and we can already anticipate future insights into the generative mechanisms of emergent behaviors of a variety of complex systems. It provides a framework that cuts across all four of the domains of Brier's semiotic star.

So even though science(s), as typically taught at universities today in Philosophy of Science/Theory of Science courses, consist of disparate fields of study which more or less ignore each other, scientific practice is changing rapidly, and interdisciplinary projects are becoming increasingly important and mainstream. From the research grassroots, very much thanks to information and communication technologies providing smooth means of communication, cross-disciplinary scientific practices arise, bringing the necessity of understanding across research field boundaries. Generative models are excellent tools
which help bridge gaps across research fields (Dodig-Crnkovic 2003). Denning (2007) declares: "Computing is a natural science", and ICON provides plenty of evidence for this claim. Biologists such as Kurakin (2009) also add to information-based naturalism:

"When reconceptualized in equivalent terms of self-organizing adaptive networks of energy/matter/information exchanges, complex systems of different scales appear to exhibit universal scale-invariant patterns in their organization and dynamics, suggesting the self-similarity of spatiotemporal scales and fractal organization of the living matter continuum."

Info-computationalism and Cybersemiotics

ICON is based on two principles: information (structure) and computation (dynamics) (Dodig-Crnkovic 2006, 2009). The fundamental nature of reality is informational (one might say: in ICON, being is informational), with a generalized understanding of computation as its dynamics (in ICON, becoming is computational). In Kant's vocabulary, the thing-in-itself (das Ding an sich), unknowable while certainly existing, is understood as proto-information, equally existing and unknowable, which for an agent through interaction becomes information that will be used for all sorts of agency in the world, sensorimotor as well as language-related. Being a contemporary version of Natural Philosophy, ICON of course includes evolution as well as all scientific results from complexity research and the rest of physics, biology, ecology, sociology and other third-person scientific accounts.

Cybersemiotics builds on Peirce's synechism as a connectedness between matter and mind, nature and culture. This continuity and connectedness is often opposed to a computational universe based on discrete models, such as Ed Fredkin's Digital Physics. However, it is important to emphasize that natural computationalism is a general idea of the existing physical universe, with all its many organizational levels, seen as a computational network where computation goes on from the quantum-mechanical level up to the cosmic level and back. Computing exists as both discrete (the dominant model of computing today) and continuous (such as found in some analog computers which were developed in the early computer era and later superseded by digital ones).

Natural computationalism is thus not essentially dependent on the assumption of the discreteness of the universe. If Nature computes, its computation is both continuous and discrete. Historically there have been both types of frameworks, geometry & calculus (continuous) and algebra & algorithms (discrete), and those two ways of thinking are applicable for different purposes. The discrete Platonic world based on integers ("God made the integers; all else is the work of man", Kronecker) is according to Chaitin more beautiful and more comprehensible than the world of real numbers with its uncountable infinities. In any case, what makes the computational universe attractive is its constructive nature:
we construct a model which we have control over, and compare the behavior of the model with real-world phenomena:

"So I said: let us design a world with pure thought that is computational, with computation as a building block. It's like playing God, but it would be a world we can understand, no? If we invent it we can understand it, whereas if we try to figure out how this world works it turns into metaphysics again." Chaitin in (Zenil 2011), p. 355.

Cybersemiotics adopts Peirce's trichotomy of Firstness, Secondness and Thirdness. Peirce defines Firstness as being independent of anything else; Secondness as being relative to, the interaction with, something else; and Thirdness as mediation, through which a Firstness and a Secondness are brought into relation. Similarly, Peirce describes the sign relation as the triad of icon, index and symbol: an icon represents an object by its inherent form, which resembles the object; an index represents the object through some causal relationship; and a symbol represents an object by a convention within a community of practice.

According to Sowa's interpretation of Peirce, in every living being, from bacteria to humans and perhaps beyond, semiosis is the crucial Thirdness that enables the organism to respond to signs by taking actions that serve to further its goals of getting food, avoiding harm, and reproducing its kind. For most life forms, those goals are unconscious, and most of them are built into their genes. But there is no difference in principle between the evolutionary learning that is encoded in genes and the individual learning that is encoded in neurons. Understanding life at every level and in every kind of organization, from colonies of bacteria to human businesses and governments, requires an understanding of signs, goals, communication, cooperation, and competition, all of which involve aspects of Thirdness.

If we are to establish a mapping between Peirce's approach and the info-computational one, Firstness would correspond to proto-information (the mode of being of that which is without reference to any subject or object), Secondness to interaction (the mode of being of that which is itself in referring to a second subject, any type of information exchange), and Thirdness would correspond to (intentional) agency (the mode of being of that which is itself in bringing a second and a third subject into relation with each other).

Info-computationalism has strong connections not only with physics, biology and cognitive science, but also with artificial intelligence and robotics. It is therefore a framework which is suitable for generalization from biological to artificial agents. Cognitive agents within this framework are not mechanical deterministic machines, but adaptive, learning and anticipative beings increasingly capable of adequate and intelligent behavior.
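The agent-based modeling idea invoked earlier, in which simple local rules generate global self-organization, can be illustrated with a minimal sketch (my own illustrative example, not taken from the cited ABM literature): each agent in a one-dimensional "swarm" repeatedly moves toward the mean position of its nearby neighbours, and clustering emerges without any global coordination.

```python
import random

def step(positions, radius=2.0, rate=0.5):
    """One synchronous update: each agent moves a fraction `rate` of the
    way toward the mean position of neighbours within `radius`.
    The rule is purely local; no agent sees the whole swarm."""
    updated = []
    for x in positions:
        neighbours = [y for y in positions if abs(y - x) <= radius]
        local_mean = sum(neighbours) / len(neighbours)  # includes the agent itself
        updated.append(x + rate * (local_mean - x))
    return updated

def spread(positions):
    """Diameter of the swarm: a simple global order parameter."""
    return max(positions) - min(positions)

random.seed(1)
swarm = [random.uniform(0.0, 10.0) for _ in range(30)]
initial_spread = spread(swarm)
for _ in range(50):
    swarm = step(swarm)
final_spread = spread(swarm)
# The global pattern (contraction into clusters) emerges from the
# local rule alone; no agent was programmed to "cluster".
```

Richer agent-based models of the kind Axelrod and Epstein describe follow the same pattern: a local update rule iterated over a population, with global structure read off afterwards.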

For the sciences today, an intelligent agent is something that can be biological but also artificial. ICON is constructed as a generalization in this respect.

Peirce's semiosis is situated in Thirdness, while Info-computationalism extends through all three domains: Firstness (proto-information), Secondness (interaction) and Thirdness (intentional agency).

Every sign is information, but not all information is a sign in the sense of Peirce:

"A sign, or representamen, is something which stands to somebody for something in some respect or capacity. It addresses somebody, that is, creates in the mind of that person an equivalent sign, or perhaps a more developed sign. That sign which it creates I call the interpretant of the first sign. The sign stands for something, its object. It stands for that object, not in all respects, but in reference to a sort of idea, which I have sometimes called the ground of the representamen." (CP 2.228)

The aim of this article is of course not to claim the superiority of any specific approach, but rather to elucidate their domains of applicability and focus. For further comparisons between Cybersemiotics and Info-computationalism, see (Dodig-Crnkovic 2010). As Whitehead cogently noticed:

"Human knowledge is a process of approximation. In the focus of experience, there is comparative clarity. But the discrimination of this clarity leads into the penumbral background. There are always questions left over. The problem is to discriminate exactly what we know vaguely." Whitehead (1937)

From the historical lessons learned we may conclude that information (and computation) will not be enough to provide the ultimate worldview. There has never been a thought system in history that could withstand changes in human civilization. Many systems have greatly contributed to the development of humanity, often in combination, and more frequently in opposition to each other. They have nonetheless broadened the horizons and helped human imagination in constructing new and ever more complex thought systems, theories and applications.

In retrospect, looking at the previous paradigm of the Clockwork Universe, we can conclude: mechanics was not enough. Nevertheless we learned a lot, and mechanics was an extremely fruitful conceptual device. As Whitehead (1933) points out, each specific method or approach is at length exhausted. Initially a system may be a success, but developed to the limits of what it can support, it collapses and finally presents an obstacle for new systems to come. What makes a framework worth the effort is its fruitfulness as a generator of new ideas and knowledge (Dodig-Crnkovic & Müller 2009). It is important to be able to recognize this potential in a new paradigm.

Here is a summary of what makes info-computationalist naturalism a promising research programme (Dodig-Crnkovic & Müller 2009):

Unlike mechanicism, info-computationalist naturalism has the ability to tackle both fundamental physical structures and life phenomena within the same conceptual framework. The observer is an integral part of the info-computational universe.

Integration of the scientific understanding of the structures and processes of life with the rest of the natural world will help to achieve the "unreasonable effectiveness" of mathematics (or computing in general) even for the complex phenomena of biology that today lack mathematical effectiveness (Gelfand), in sharp contrast to physics (Wigner).

Info-computationalism (which presupposes computationalism and informationalism) presents a unifying framework for common knowledge production in many up to now unrelated research fields. Present-day specialization into various isolated research fields has led to an alarming impoverishment of the common worldview.

Our existing computing devices are a subset of the set of possible physical computing machines, and the Turing Machine model is a subset of envisaged more general natural computational models. Advancement of our computing methods beyond the Turing-Church paradigm will result in computing capable of handling complex phenomena such as living organisms and processes of life, social dynamics, and the communication and control of large interacting networks, as addressed in organic computing and other kinds of unconventional computing.

Understanding of the semantics of information as a part of the data-information-knowledge-wisdom sequence, in which more and more complex relational structures are created by computational processing of information. An evolutionary naturalist view of the semantics of information in living organisms is given, based on the interaction/information exchange of an organism with its environment.

Discrete and analogue computing are both needed in physics, and so in physical computing, which can help us to a deeper understanding of their relationship.

Relating the phenomena of information and computation understood in an interactive paradigm will enable investigations into the logical pluralism of information produced as a result of interactive computation. Of special interest are open systems in communication with the environment, and the related logical pluralism including paraconsistent logic.

Of all manifestations of life, mind seems to be information-theoretically and philosophically the most interesting. Info-computationalist naturalism (computationalism + informationalism) has the potential to support, by means of
models and simulations, our effort in learning about mind and developing artifactual (artificial) intelligence in the direction of organic computing.

In a situation of paradigm shift, where many different approaches coexist, Sowa's notion of "knowledge soup" is useful, as it stands for "the fluid, dynamically changing nature of the information that people learn, reason about, act upon, and communicate." (Sowa 2000)

Universe as Informational Structure

The universe is "nothing but processes in structural patterns all the way down" (Ladyman et al. 2007, p. 228). "From the metaphysical point of view, what exist are just real patterns" (p. 121). Understanding patterns as information, one may infer that information is a fundamental ontological category. The ontology is scale-relative. What we know about the universe is what we get from the sciences, as "special sciences track real patterns" (p. 242). "Our realism consists in our claim that successful scientific practice warrants networks of mappings as identified above between the formal and the material" (p. 121).

This idea of an informational universe coincides with Floridi's Informational Structural Realism (Floridi 2008) (Floridi 2009). We know as much of the world as we explore and cognitively digest:

"Since we wish to devise an intelligible conceptual environment for ourselves, we do so not by trying to picture or photocopy whatever is in the room (mimetic epistemology), but by interacting with it as a resource for our semantic tasks, interrogating it through experience, tests and experiments. Reality in itself is not a source but a resource for knowledge. Structural objects (clusters of data as relational entities) work epistemologically like constraining affordances: they allow or invite certain constructs (they are affordances for the information system that elaborates them) and resist or impede some others (they are constraints for the same system), depending on the interaction with, and the nature of, the information system that processes them. They are exploitable by a theory, at a given Level of Abstraction, as input of adequate queries to produce information (the model) as output." (Floridi 2008)

What info-computationalist naturalism aims at is understanding that dynamical interaction of informational structures as a computational process. It includes digital and analogue, continuous and discrete, as phenomena existing in the physical world on different levels of description, and digital computing is a subset of a more general natural computing. Wolfram finds equivalence between the two descriptions, matter and information:

"[M]atter is merely our way of representing to ourselves things that are in fact some pattern of information, but we can also say that matter is the primary thing and that information is our representation of that. It makes little difference, I don't think there's a big distinction, if one is right that there's an ultimate model
for the representation of the universe in terms of computation." Wolfram in (Zenil 2011), p. 389.

A more detailed discussion of various questions of the informational universe and natural info-computationalism, including cognition, meaning, intelligent agency and other similar topics, is given in (Dodig-Crnkovic and Hofkirchner 2011).

In what follows I will focus on explaining the new idea of computation, which is essentially different from the notion of performing a procedure given in advance in a deterministic mechanical way. This new concept of computation, natural computation (sometimes called unconventional computation in order to emphasize its difference from the computational models we are used to), allows for nondeterministic complex computational systems with self-* properties. Here self-* stands for self-organization, self-configuration, self-optimization, self-healing, self-protection, self-explanation, and self/context-awareness applied to information processing systems. Scheutz (2002) argues that this new kind of computationalism applied to the theory of mind is able to explain the nature of intentionality and the origin of language.

Info-Computationalism as Natural Philosophy

Ever since Turing proposed his computation model, which identifies computation with the execution of an algorithm, a predefined (discrete, finite) procedure, there have been questions about how widely the Turing Machine model is applicable. The Church-Turing Thesis establishes the equivalence between a Turing Machine and an algorithm, interpreted to imply that all computation must be algorithmic. However, with the advent of computer networks, the model of a computer in isolation, represented by a Turing Machine, has become insufficient. Today's computer systems have become large, consisting of massive numbers of autonomous and parallel elements across multiple scales. At the nanoscale they approach programmable matter; at the macro scale, huge numbers of cores compute in clusters, grids or clouds; while at the planetary scale, sensor networks connect environmental and satellite data to track climate and other global-scale phenomena. What is common to these modern computing systems is that they are ensemble-like (as they form one whole in which the parts act in concert to achieve a common goal, the way an organism is an ensemble of its cells) and physical (as ensembles act in the physical world and interact with their environment through sensors and actuators).

The solution to the problems of extreme complexity of modern computational networks is sought in Natural computing, a new paradigm of computing which deals with computability in the physical world. It has brought a fundamentally new understanding of computation and presents a promising new approach to the complex world of autonomous, intelligent, adaptive, and networked computing that has successively emerged in recent years. Significant for Natural computing is its bidirectional research (Rozenberg and Kari 2008): as the natural
sciences are rapidly absorbing ideas of information processing, computing concurrently assimilates ideas from the natural sciences.

The Definition of Computation and the Turing Machine Model

The definition of computation is still under debate, and at the moment the closest to common acceptance is the view of computation as information processing, found in different mathematical accounts of computing as well as in Cognitive Science and Neuroscience; see (Burgin 2005). Basically, for a process to be a computation, a model must exist, such as an algorithm, a network topology, a physical process, or in general any mechanism which ensures the definability of its behavior (Dodig-Crnkovic 2011).

The characterization of computing can be made in several dimensions by classification into orthogonal types: digital/analog, symbolic/sub-symbolic, interactive/batch and sequential/parallel. Nowadays digital computers are used to simulate all sorts of natural processes, including those that in physics are understood as continuous. However, it is important to distinguish between the mechanism of computation and the simulation model.

No matter whether the data constitute symbols, computation is a process of change of data structures. Symbols appear on a high level of organization and complexity, and always in relation to living organisms. Symbols represent something for a living organism; they have a function as carriers of meaning.

The notion of computation as formal (mechanical) symbol manipulation originates from discussions in mathematics in the early twentieth century. The most influential program for formalization was initiated by Hilbert, who treated formalized reasoning as a symbol game in which the rules of derivation are expressed in terms of the syntactic properties of the symbols. As a result of Hilbert's program, large areas of mathematics have been formalized.
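The "symbol game" picture can be made concrete with a toy formal system (an illustrative sketch of mine, using Hofstadter's well-known MIU system rather than anything from Hilbert's actual program): derivation is nothing but syntactic string rewriting, with no reference to what the symbols mean.

```python
def rewrite_all(s, rules):
    """All strings reachable in one step by replacing a single
    occurrence of a left-hand side with its right-hand side.
    The operation is purely syntactic: only symbol shapes matter."""
    successors = set()
    for lhs, rhs in rules:
        start = s.find(lhs)
        while start != -1:
            successors.add(s[:start] + rhs + s[start + len(lhs):])
            start = s.find(lhs, start + 1)
    return successors

def miu_step(s):
    """One derivation step in Hofstadter's MIU system."""
    successors = rewrite_all(s, [("III", "U"), ("UU", "")])
    if s.endswith("I"):
        successors.add(s + "U")          # rule: xI -> xIU
    if s.startswith("M"):
        successors.add("M" + s[1:] * 2)  # rule: Mx -> Mxx
    return successors

print(sorted(miu_step("MII")))  # every string derivable from MII in one step
```

Iterating `miu_step` from an axiom enumerates theorems; a machine can carry out this pattern recognition and replacement entirely mechanically, which is the point of the Hilbert-style formalization described above.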
Formalization implies the establishment of a basic language which is used to formulate the system of axioms, with derivation rules defined such that the important semantic relationships are preserved by inferences determined only by the syntactic form of the expressions. Hilbert's Grundlagen der Mathematik and Whitehead and Russell's Principia Mathematica are examples of such formalization. However, there are limits to what can be formalized, as demonstrated by Gödel's incompleteness theorems.

A second important issue after the formalization of mathematics was to determine the class of functions that are computable in the sense of being decidable by the application of a mechanical procedure or an algorithm. Not all mathematical functions are computable in this sense. It was Alan Turing who first developed a general method to define the class of computable functions. He proposed the "logical computing machine", which is a description of a procedure that processes symbols written on a tape/paper in a way analogous to what a human does when computing a function by application of a mechanical rule. According to Turing, the class of computable functions was equivalent to the class of
functions that could be evaluated in a finite number of steps by a logical computing machine (Turing machine). The basic idea was that any operations that are sensitive only to syntax can be simulated mechanically. What the human following a formal algorithm does by recognition of syntactic patterns, a machine can be made to do by purely mechanical means. Formalization and computation are closely related and together entail that reasoning which can be formalized can also be simulated by the Turing machine. Turing assumed that a machine operating in this way would actually be doing the same thing as the human performing the computation.

Some critics have suggested that what the computer does is merely an imitation or simulation of what the human does, even though it might be at some level isomorphic to the human activity, but not in all relevant respects. I would add an obvious remark: the Turing machine is supposed to be given from the outset its logic, its physical resources, and the meanings ascribed to its actions. The Turing Machine presupposes a human as a part of the system; the human is the one who poses the questions, provides the material resources and interprets the answers.

The Church-Turing thesis states that any kind of computation corresponds to an equivalent computation performed by the Turing machine. In its original formulation (Church 1935, 1936), the thesis says that real-world calculation can be performed using the lambda calculus, which is equivalent to using general recursive functions. Actually, the Church-Turing thesis is used as a definition of computation. There has never been a proof, but the evidence for its validity comes from the equivalence of computational models such as cellular automata, register machines, and substitution systems.
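The Turing machine model described above can be made concrete with a minimal simulator (a sketch of mine, not a definitive formulation): a transition table maps the current state and the symbol under the head to a symbol to write, a head movement, and a next state. The example table computes the successor of a binary number by the familiar carry rule.

```python
def run_turing_machine(tape, rules, state="inc", head=None, blank="_", max_steps=1000):
    """Minimal Turing machine: `rules` maps (state, symbol) to
    (new_symbol, move, new_state); reaching state 'halt' stops it."""
    cells = dict(enumerate(tape))              # the tape as a sparse dict
    head = len(cells) - 1 if head is None else head
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += {"L": -1, "R": 1, "N": 0}[move]
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# Binary increment: scan right to left, turning trailing 1s into 0s
# until the first 0 (or a blank past the left end) becomes a 1.
INC_RULES = {
    ("inc", "1"): ("0", "L", "inc"),
    ("inc", "0"): ("1", "N", "halt"),
    ("inc", "_"): ("1", "N", "halt"),
}

print(run_turing_machine("1011", INC_RULES))  # 1011 + 1 = 1100
```

Every step is determined by the syntactic shape of the current configuration alone, which is exactly the "purely mechanical means" the text attributes to Turing's machine.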
The Church-Turing thesis has been extended to a proposition about processes in the natural world by Stephen Wolfram in his Principle of Computational Equivalence (Wolfram 2002), in which he claims that there are only a small number of intermediate levels of computing before a system is universal, and that most natural systems can be described as universal computational mechanisms. However, a number of computing specialists and philosophers of computing (Hava Siegelmann, Mark Burgin, Jack Copeland, and representatives of natural computing) question the claim that all computational phenomena in all relevant aspects are equivalent to the Turing Machine.

George Kampis, for example, in his book Self-Modifying Systems in Biology and Cognitive Science (1991), claims that the Church-Turing thesis applies only to simple systems. According to Kampis, complex biological systems must be modeled as self-referential, self-organizing systems he calls "component systems" (self-generating systems), whose behavior, though computational in a generalized sense, goes far beyond the simple Turing machine model.

"A component system is a computer which, when executing its operations (software) builds a new hardware. ... [W]e have a computer that rewires itself in
a hardware-software interplay: the hardware defines the software and the software defines new hardware. Then the circle starts again." (Kampis, p. 223)

Goertzel (1994) suggests that stochastic and quantum computing models would be more suitable for component systems. Molecular computers are even more obvious candidates.

The Computing Universe: Naturalist Computationalism

Konrad Zuse was the first to suggest (in 1967) that the physical behavior of the entire universe is being computed on a basic level, possibly on cellular automata, by the universe itself, which he referred to as "Rechnender Raum" or Computing Space/Cosmos. Consequently, Zuse was the first pancomputationalist (natural computationalist). Here is Chaitin's account:

"And how about the entire universe, can it be considered to be a computer? Yes, it certainly can, it is constantly computing its future state from its current state, it's constantly computing its own time evolution! And as I believe Tom Toffoli pointed out, actual computers like your PC just hitch a ride on this universal computation!" (Chaitin 2007)

Even Wolfram in his A New Kind of Science advocates a pancomputationalist view, a new dynamic kind of reductionism in which the complexity of behaviors and structures found in nature is derived (generated) from a few basic mechanisms. Natural phenomena are thus the products of computation processes. In a computational universe, new and unpredictable phenomena emerge as a result of simple algorithms operating on simple computing elements such as cellular automata, and complexity originates from bottom-up emergent processes. Cellular automata are equivalent to a universal Turing Machine (Wolfram's Rule 110). Wolfram's critics remark, however, that cellular automata do not evolve beyond a certain level of complexity. The mechanisms involved do not necessarily produce evolutionary development. The actual physical mechanisms at work in the physical universe appear to be quite different from simple cellular automata.
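An elementary cellular automaton of the kind Wolfram studies is easy to state precisely (a minimal sketch of mine): each cell's next state depends only on its own state and its two immediate neighbours, and the 8-entry lookup table is encoded in the bits of the rule number, 110 below.

```python
def eca_step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton.
    The neighbourhood (left, centre, right) forms a 3-bit number v,
    and bit v of `rule` gives the cell's next state (periodic boundary)."""
    n = len(cells)
    return [
        (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and print a few rows of the pattern.
row = [0] * 31
row[15] = 1
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = eca_step(row)
```

Despite the triviality of the update rule, Rule 110 is known to be Turing-universal (the result the text cites), which is what licenses the claim that very simple generative mechanisms suffice for arbitrarily complex behavior.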
Critics also claim that it is unclear whether cellular automata are to be thought of as a metaphor, or whether real systems are supposed to use the same mechanism on some level of abstraction. Wolfram meets this criticism by pointing out that cellular automata are models, and as such surprisingly successful ones.

Fredkin in his Digital Philosophy (Fredkin 1992) suggests that particle physics can emerge from cellular automata. The universe is digital; time and space are not continuous but discrete. He goes a step beyond the usual computational-universe picture: even humans are software running on a universal computer.

Wolfram and Fredkin assume that the universe is on a fundamental level a discrete system, and so a suitable basis for an all-encompassing digital computer. Actually, the hypothesis about the discreteness of the physical world is not decisive for pancomputationalism (natural computationalism). As already
mentioned, there are digital as well as analogue computers. On the quantum-mechanical level, the universe performs computation (Lloyd 2006) on characteristically dual wave-particle objects.

There are interesting philosophical connections between digital and analog processes. For example, the DNA code (digital) is closely related to protein folding (analog) for its functioning in biological systems. Moreover, even if on some level of representation computation were purely digital (and thus conformed to the Pythagorean ideal of number as a principle of the world), computation in the universe is performed on many different levels of organization, such as bio-computing, membrane computing, spatial computing, etc.

Information Processing Beyond the Turing Limit

Computation is nowadays performed by computer systems connected in global networks of multitasking, often mobile, interacting devices. The classical understanding of computation as syntactic mechanical symbol manipulation is being replaced by information processing, with both syntactic and semantic aspects being expressed. According to Burgin (2005), information processing in practice includes the following:

1. Preserving information (protecting information from change)
2. Changing information or its representation
3. Changing the location of information in the physical world

In the above list, (3) can actually be understood as changing the representation, and is therefore a special case of (2). Moreover, preserving information (1) can be described as change performed by the identity operation. What it boils down to is that computation is in general a change of information or, as usually expressed, information processing.

Searching for further generalization, it can be noted that the mechanisms of both computation and communication imply the transformation and preservation of information.
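The reduction just described, with all three of Burgin's operation types collapsing into "change of information", can be sketched as transformations over a toy information state (an illustrative sketch of mine; the field names and encodings are invented):

```python
# A toy information state: content plus its representation and location.
state = {"content": "42", "encoding": "decimal-string", "location": "node-A"}

def preserve(s):
    """(1) Preservation: the identity transformation on the state."""
    return dict(s)

def re_represent(s):
    """(2) Change of representation: same content, new encoding."""
    return {**s, "encoding": "binary-string",
            "content": format(int(s["content"]), "b")}

def relocate(s, where):
    """(3) Change of location, read as a change of the part of the
    representation that records where the information sits."""
    return {**s, "location": where}

# All three are functions from states to states, i.e. transformations;
# composing them is information processing in Burgin's general sense.
final = relocate(re_represent(preserve(state)), "node-B")
```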
Bohan Broderick (2004) compares the notions of communication and computation and arrives at the conclusion that computation and communication are often conceptually indistinguishable. He argues that the difference between computation and communication lies only in the domain: computation is limited to a process within a system, and communication is an interaction between a system and its environment. An interesting problem of distinction arises when the computer is conceived as an open system in communication with the environment, the boundary of which is dynamic, as in biological computing.

Burgin identifies three distinct components of information processing systems: hardware (physical devices), software (programs that regulate its functioning), and infoware, which represents the information processing performed by the system. Infoware is a shell built around the software-hardware core, which is the traditional domain of automata and algorithm theory. The Semantic Web is an example of infoware.

Natural Computation

The classical mathematical theory of computation, devised long before global computer networks, is based on the theory of algorithms. Ideal, classical theoretical computers are mathematical objects, equivalent to algorithms, abstract automata (Turing machines), effective procedures, recursive functions, or formal languages.

Compared with the new computing paradigms, Turing machines form a proper subset of the set of information processing devices, in much the same way that Newton's theory of gravitation is a special case of Einstein's theory, or Euclidean geometry is a limit case of non-Euclidean geometries.

For implementations of computationalism, interactive computing (such as, among others, agent-based computing) is the most appropriate model, as it naturally suits the purpose of modeling a network of mutually communicating processes. See (Dodig-Crnkovic 2006-2011).

Among the new paradigms of computing, Natural computation has a prominent place. It is the study of computational systems including the following:

1. Computing techniques that take inspiration from nature for the development of novel problem-solving methods;
2. Use of computers to simulate natural phenomena; and
3. Computing with natural materials (e.g., molecules, atoms).

Natural computation is well suited for dealing with large, complex, and dynamic problems. It is an emerging interdisciplinary area closely related to artificial intelligence and cognitive science, vision and image processing, neuroscience, systems biology, and bioinformatics, to mention but a few.

Fields of research within Natural computing include, among others, biological computing/organic computing, artificial neural networks, swarm intelligence, artificial immune systems, computing on continuous data, membrane computing, artificial life, DNA computing, quantum computing, neural computation, evolutionary computation, evolvable hardware, self-organising systems, emergent behaviours, machine perception and systems biology.
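The claim that interactive, agent-based computing naturally models a network of mutually communicating processes can be illustrated with a minimal sketch: two toy agents whose states evolve only through the messages they exchange, so the computation exists in the ongoing interaction rather than in any isolated algorithm (the `Agent` class and its update rule are invented for illustration, not drawn from the cited works):

```python
# Minimal sketch of interactive, agent-based computation: each agent's
# next state depends on the message it receives from its neighbour.
class Agent:
    def __init__(self, state: int):
        self.state = state

    def receive(self, message: int) -> int:
        # Toy update rule: move halfway toward the received value.
        self.state = (self.state + message) // 2
        return self.state  # the reply becomes the other agent's input

a, b = Agent(0), Agent(100)
for _ in range(10):        # an interaction history, not a final halt state
    msg = a.receive(b.state)
    b.receive(msg)

print(a.state, b.state)    # the two agents converge through interaction
```

Note that there is no "result" over and above the interaction history: what the Turing model treats as a function from input to output here appears as an open-ended exchange between communicating processes.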
Computational paradigms studied by natural computing are abstracted from natural phenomena such as the self-X attributes of living (organic) systems (including replication, repair, definition and assembly), the functioning of the brain, evolution, the immune system, cell membranes, and morphogenesis. These computational paradigms can be implemented not only in electronic hardware, but also in materials such as biomolecules (DNA, RNA) or quantum computing systems (physical computing).

According to pancomputationalism (natural computationalism) (Dodig-Crnkovic 2006-2011), one can view the time development (dynamics) in nature as information processing, and learn about its computational characteristics. Such processes include self-assembly, developmental processes, gene regulation networks, gene assembly in unicellular organisms, protein-protein interaction networks, biological transport networks, and similar.

Natural computing has specific criteria for the success of a computation. Unlike in the Turing model, the halting problem is not a central issue; instead, the adequacy of the computational response is. An organic computing system, for example, adapts dynamically to the current conditions of its environment by self-organization, self-configuration, self-optimization, self-healing, self-protection and context awareness. In many areas, we have to computationally model emergence that is not algorithmic (Barry Cooper, Aaron Sloman), which makes it interesting to investigate the computational characteristics of non-algorithmic natural computation (sub-symbolic, analog).

An "Organic Computing System" is a technical system which adapts dynamically to the current conditions of its environment. It is characterised by the self-X properties: self-organization and self-configuration (auto-configuration); self-optimisation (automated optimization); self-protection (automated computer security) and self-healing; self-explaining; and context awareness. The ideas of Organic Computing and its fundamental concepts arose independently in different research areas like Neuroscience, Molecular Biology, and Computer Engineering. Self-organising systems have been studied for quite some time by mathematicians, sociologists, physicists, economists, and computer scientists, but so far almost exclusively on the basis of strongly simplified artificial models. Central aspects of Organic Computing systems have been and will be inspired by an analysis of information processing in biological systems. (http://www.organic-computing.org)

In sum, solutions are being sought in natural systems with evolutionarily developed strategies for handling complexity, in order to improve complex networks of massively parallel autonomous engineered computational systems.
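How complex global behaviour emerges from purely local rules, the hallmark of the self-organising, emergent systems discussed above, can be illustrated with an elementary cellular automaton in the style of Wolfram (2002); the rule number and grid dimensions below are arbitrary illustrative choices:

```python
# Elementary cellular automaton: each cell's next state depends only on
# itself and its two neighbours, yet complex global patterns emerge.
RULE = 110  # a rule known for complex, non-repetitive behaviour
WIDTH, STEPS = 31, 15

def step(row):
    # Look up each cell's new state from its 3-bit neighbourhood,
    # encoded as an index into the bits of RULE (wrap-around edges).
    return [
        (RULE >> (row[(i - 1) % WIDTH] * 4
                  + row[i] * 2
                  + row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]

cells = [0] * WIDTH
cells[WIDTH // 2] = 1              # a single seed cell
for _ in range(STEPS):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Nothing in the three-cell update rule mentions the triangular, nested structures that appear in the printed history; they arise only at the level of the whole system, which is what makes such models a natural test bed for questions about emergence and sub-symbolic computation.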
Research into the theoretical foundations of Natural computing is needed to improve understanding, at the fundamental level, of computation as information processing, which underlies all computing in nature.

Much like research in other disciplines of Computing, such as AI, Software Engineering, and Robotics, Natural computing is interdisciplinary research with a synthetic approach, unifying knowledge from a variety of related fields. Research questions, theories, methods and approaches are drawn from Computer Science (such as the theory of automata and formal languages, and interactive computing), Information Science (e.g. Shannon's theory of communication), ICT studies, Mathematics (such as randomness and the algorithmic theory of information), Logic (e.g. pluralist logic, game logic), Epistemology (especially naturalized epistemologies), evolution, and Cognitive Science (mechanisms of information processing in living organisms), in order to investigate foundational and conceptual issues of Natural computation and information processing in nature.

"In these times brimming with excitement, our task is nothing less than to discover a new, broader, notion of computation, and to understand the world around us in terms of information processing." (Rozenberg and Kari 2008)

This development necessitates what Cooper, Löwe and Sorbi (2007) call taking computational research beyond the constraints of normal science.

Conclusion

Starting with Brier's Cybersemiotics critique of the existing practice of Wissenschaft, this article develops the argument for an alternative approach to the problem of a naturalized framework for knowledge production: the framework of natural info-computationalism (ICON), based on the concepts of information (structure) and computation (process). It maps Peirce's Firstness, Secondness and Thirdness onto proto-information, interaction and intentional agency. In this approach, which is a synthesis of informationalism (the view that nature is informational) and computationalism (the view that nature computes its own time development), computation is in general not substrate-independent, disembodied symbol manipulation. Based on the informational character of nature, where matter and informational structure are equivalent, information processing is in general embodied and substrate-specific. The Turing Machine model of abstract discrete symbol manipulation is a subset of the Natural computing model. With this generalized idea of Natural computing and Informational Structural Realism, Info-computationalism (ICON), adopting a scientific third-person account (Dodig-Crnkovic 2006-2011), covers the entire list of requirements for a naturalist knowledge-production framework from (Brier 2010), except for qualia as experienced in the first-person mode. In ICON, qualia, as well as other natural phenomena, are accounted for from a third-person perspective.

References

Axelrod, R. (1997) The Complexity of Cooperation: Agent-Based Models of Competition and Collaboration. Princeton: Princeton University Press.

Bohan Broderick, P. (2004) On Communication and Computation. Minds and Machines, 14(1), 1-19.
Brier, S. (2008) Cybersemiotics: Why Information is not Enough! University of Toronto Press: Toronto, Canada.

Brier, S. (2009) Cybersemiotic pragmaticism and constructivism. Constructivist Foundations, 5, 19-38.

Brier, S. (2010) Cybersemiotics: An Evolutionary World View Going Beyond Entropy and Information into the Question of Meaning. Entropy, 12(8), 1902-1920.

Burgin, M. (2005) Super-Recursive Algorithms. Springer Monographs in Computer Science.

Chaitin, G. (2007) Epistemology as Information Theory: From Leibniz to Ω. In (Dodig-Crnkovic and Stuart 2007).

Church, A. (1935) Abstract No. 204. Bull. Amer. Math. Soc., 41, 332-333.

Church, A. (1936) An Unsolvable Problem of Elementary Number Theory. Amer. J. Math., 58, 345-363.

Cooper, S. B., Löwe, B., & Sorbi, A. (Eds.) (2007) New Computational Paradigms: Changing Conceptions of What is Computable. Springer Mathematics of Computing series, XIII.

CP 2.228: Collected Papers of Charles Sanders Peirce, 8 volumes, vols. 1-6, eds. Charles Hartshorne and Paul Weiss; vols. 7-8, ed. Arthur W. Burks. Cambridge, Mass.: Harvard University Press, 1931-1958.

Denning, P. (2007) Computing is a natural science. Communications of the ACM, 50(7), 13-18. http://cs.gmu.edu/cne/pjd/PUBS/CACMcols/cacmJul07.pdf

Dodig-Crnkovic, G. & Burgin, M. (2010) Information and Computation. World Scientific Pub Co Inc: Singapore.

Dodig-Crnkovic, G. (2006) Investigations into Information Semantics and Ethics of Computing (pp. 1-133). Västerås, Sweden: Mälardalen University Press.

Dodig-Crnkovic, G. (2009) Information and Computation Nets: Investigations into Info-computational World. Information and Computation (pp. 1-96). Saarbrücken: VDM Verlag.

Dodig-Crnkovic, G. (2010) The Cybersemiotics and Info-Computationalist Research Programmes as Platforms for Knowledge Production in Organisms and Machines. Entropy, 12(4), 878-901.

Dodig-Crnkovic, G. (2010a) Biological Information and Natural Computation. In J. Vallverdú (Ed.), Thinking Machines and the Philosophy of Computer Science: Concepts and Principles. Hershey, PA: Information Science Reference.

Dodig-Crnkovic, G. (2011) Significance of Models of Computation, from Turing Model to Natural Computation. Minds and Machines, (R. Turner and A. Eden, guest eds.), 21(2), 301. http://www.springerlink.com/openurl.asp?genre=article&id=doi:10.1007/s11023-011-9235-1

Dodig-Crnkovic, G. & Hofkirchner, W. (2011) Floridi's Open Problems in Philosophy of Information, Ten Years After. Forthcoming.

Dodig-Crnkovic, G., & Mueller, V. (2009) A Dialogue Concerning Two World Systems: Info-Computational vs. Mechanistic. In (Dodig-Crnkovic & Burgin 2011), pp. 149-184.

Dodig-Crnkovic, G., & Stuart, S. (2007) Computation, Information, Cognition: The Nexus and the Liminal. Cambridge Scholars Pub: Newcastle, UK.

Dodig-Crnkovic, G. (2003) Shifting the paradigm of the philosophy of science: The philosophy of information and a new renaissance. Minds and Machines, 13(4), 521-536. http://www.springerlink.com/content/g14t483510156726/fulltext.pdf

Epstein, J. M. (2007) Generative Social Science: Studies in Agent-Based Computational Modeling. Princeton University Press.

Floridi, L. (2008) A defence of informational structural realism. Synthese, 161(2), 219-253.

Floridi, L. (2009) Against digital ontology. Synthese, 168(1), 151-178.

Fredkin, E. (1992) Finite Nature. Proceedings of the XXVIIth Rencontre de Moriond.

Goertzel, B. (1994) Chaotic Logic. Plenum Press.

Kampis, G. (1991) Self-Modifying Systems in Biology and Cognitive Science: A New Framework for Dynamics, Information, and Complexity. Pergamon Press.

Kurakin, A. (2009) Scale-free flow of life: on the biology, economics, and physics of the cell. Theoretical Biology & Medical Modelling, 6.

Ladyman, J., Ross, D., Spurrett, D., and Collier, J. (2007) Every Thing Must Go: Metaphysics Naturalized, pp. 1-368. Clarendon Press, Oxford.

Lloyd, S. (2006) Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos. Knopf: New York.

Rozenberg, G. and Kari, L. (2008) The many facets of natural computing. Communications of the ACM, 51, 72-83.

Scheutz, M. (2002) Computationalism: New Directions, pp. 1-223. MIT Press, Cambridge, Mass.

Sowa, J. F. (2000) Knowledge Representation: Logical, Philosophical, and Computational Foundations. Brooks/Cole Publishing Co., Pacific Grove, CA.

Wolfram, S. (2002) A New Kind of Science. Wolfram Science.

Whitehead, A. N. (1937) Essays in Science and Philosophy. Philosophical Library, New York.

Whitehead, A. N. (1933) Adventures of Ideas. Macmillan, New York.

Zenil, H. (2011) Randomness Through Computation: Some Answers, More Questions. World Scientific Publishing Co.: Singapore.
