Comments on The Multiverse According to Ben: "When the Net Becomes Conscious"

Anonymous (2011-02-20):
This looks like it might interest you, Ben:

http://internetprophet.wordpress.com/

Anonymous (2010-01-25):
I like what Roger Penrose said about consciousness: that it surpasses our neural capacity to accomplish it.

Kurzweil designed evolving logic for his voice-recognition software. I believe internet consciousness will also be evolutionary, strengthening much the way dendrites become robust through synaptic activity.

All life on Earth has collaborated to form higher orders of intelligence. Ours is an unbroken line back to the primordial beginnings of life. Someone once said, "life cannot be contained, it must burst forth." Consciousness is the same. It cannot be contained (even within a box of human understanding). We look at the wondrous plants and animals of this planet and know them as myriad expressions of an elemental happening. Maybe consciousness is also a singular event, one in which human, "enhanced", planetary, galactic, and cosmic beings are all joined.

We are not the creators; consciousness is.

m0nk3y_d4m4g3 (2009-12-03):
*Disclaimer*
I'm not educated in philosophy or neuroscience, or anything beyond computers and networking for that matter; I'm just your friendly local IT guy. That being said, allow me to proceed.

This somewhat recent article has really got me thinking: http://www.popsci.com/scitech/article/2009-07/computerized-rat-brain-spontaneously-develops-complex-patterns

The gist of it is that when they first fired up this simulation of a rat brain (I think the simulation uses an entire CPU core to simulate each neuron, meaning thousands of cores), there was no discernible pattern. Over time, patterns began to emerge.

Now compare this to the internet. Each "node" on the internet, be it a router, a computer, or a phone, is hard-coded to behave in a certain way, just like these simulated rat-brain neurons. The difference is that these internet "nodes" are not programmed simply to connect to one another at random: for the vast majority of them, there has to be a user action. Granted, there are servers on the internet running narrowly defined algorithms that operate independently of human interaction, but they remain relatively unaffected by their environment; they're just out there scraping email addresses off web pages, brute-forcing passwords, and so on. The fact remains that either you or one of these "bots" has to request a web page before data is sent to your computer, with the routers and switches along the way relaying each packet until it reaches its final destination: your computer. In other words, my uploading this post to Google's servers has no effect whatsoever on any equipment beyond the routers and switches between me and Google.
Once I hit "send" and the operation is complete, that's it; it's done.

However, if a packet (or frame) is corrupted between me and Google after I click "send" (each node along the way performs a Cyclic Redundancy Check, or CRC, to ensure every bit arrives exactly as it was transmitted), then this error causes an unforeseen event, requiring the "internet" to take action: discard the corrupted packet/frame and request that a new one be sent.

This (in my understanding) is the only global mechanism that EVERY device on the internet possesses which is capable of generating traffic from environmental variables outside the end (human or bot) user. Electromagnetic interference, crosstalk, and the like can flip a bit and fail the CRC, thus generating data beyond what human or bot intervention caused.

If there were some sort of cascading emergent phenomenon, similar to what's described in the article I linked above, it would have to come from the initiation of these CRCs. If one node could trigger a CRC failure which in turn, occasionally and somehow outside the intended programming, triggered a CRC failure in the next node, and so on, I could see some possible emergent behavior.

Unfortunately, because the internet is a network of independently operated networks, it is currently impossible to monitor such phenomena beyond one's own network.

I would venture to guess, however, that if we could peer into the internet the way the researchers in the rat-brain project peer into their simulation, we would see the beginnings of cascading patterns of CRCs.

So, that's my take on it.
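To make the check described above concrete, here is a toy sketch in Python, using the standard library's zlib.crc32 as a stand-in for the CRC that network hardware computes per hop; the frame bytes and the flipped bit are invented purely for illustration.

```python
import zlib

# Sender computes a CRC-32 checksum over the frame and transmits both.
frame = b"GET / HTTP/1.1"        # made-up payload for illustration
sent_crc = zlib.crc32(frame)

# Frame arrives intact: the receiving node's CRC matches, so it forwards it.
assert zlib.crc32(frame) == sent_crc

# Interference flips one bit in transit (here, the low bit of the first byte).
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]

# The CRC no longer matches: the node discards the frame, and the loss is
# eventually repaired by retransmission (in practice handled end-to-end,
# e.g. by TCP, rather than by the link itself).
assert zlib.crc32(corrupted) != sent_crc
print("intact frame passes CRC; single-bit flip is detected")
```

A CRC isn't a cryptographic check; it's designed so that common physical-layer errors (single-bit flips, short bursts) virtually always change the checksum, which is exactly the property the cascade idea above would depend on.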
I have to believe that with the vastness and complexity of the current internet, there has to be something going on out there beyond the scope of human programming.

If you got this far, thanks for reading :)

Szabo (2009-04-21):
Hi all,
I think consciousness is an emergent phenomenon, so there is no sense in "degrees of" consciousness, just as there is no sense in "degrees of" life.

Leon1234 (2009-03-18):
Hey, how are you doing? Hope all is well.

Ben Goertzel (2009-03-14 17:57):
jfromm: I've meditated enough to have a strong intuitive sense that "there is no center", and yet my pragmatic everyday sense of having a central, coherent self remains... Others have this sort of inner experience far more strongly, I'm sure.

But my main point is: even if you can sense all the neurons and synapses (or the digital analogues thereof) in your brain, that doesn't equate to sensing how your self and awareness EMERGE from these low-level entities. I contend that sensing this emergence in real time is computationally intractable even for systems that have full proprioception of their neural internals. So I think the illusion of self would be roughly equally powerful for a system with neural proprioception.

Anonymous (2009-03-14 17:16):
I don't agree with you here; knowledge of one's infrastructure is not irrelevant. First, according to Dan Dennett, we know that the self as the center of narrative gravity is an illusion.
It will be difficult to create and maintain the illusion if the agent recognizes that there is indeed no center.

Second, suppose I had complete introspective power to see my neurons and synapses and the flow of charge and chemicals between them. These patterns of activity are not independent of my thinking; they ARE my thinking processes. Total confusion and chaos would arise, making any reasonable thought impossible. It would be impossible to observe the system without affecting it massively.

Ben Goertzel (2009-03-14 14:00):
Anonymous: It seems to me that perception of one's infrastructure shouldn't interfere with perception of oneself as a unitary, coherent mind, because the dynamics by which infrastructure gives rise to emergent cognition are so damn complex that cognition can't grok them in detail anyway.

Let's suppose I had complete introspective power to see my neurons and synapses and the flow of charge and chemicals between them. So what? I wouldn't be able to grok the complex dynamics by which this lower-level stuff gives rise to my mind anyway.

It seems to me the emergence of a coherent self from underlying self-organizing dynamics is *intrinsically opaque* due to the nature of complex systems; i.e., a sufficiently complex system is incapable of understanding its own dynamics in anywhere near real time.

So, IMO, opacity of one's infrastructure is irrelevant to consciousness.

If a system's architecture depended on its ability to predict its behaviors based on knowledge of its infrastructure, THEN this system would be incapable of advanced intelligence (I predict). But that would be because it would be constrained to very rigid dynamics ...
not because of its infrastructural introspection per se.

Anonymous (2009-03-14 13:54):
I think awareness of one's own infrastructure can prevent the emergence of self-consciousness in the first place, if the "self" is a single, unified object. It is hard to perceive a unified self when the perception is dominated by a lot of gears and widgets. I know this sounds paradoxical: self-consciousness is not possible if the true nature of the self is conscious to us. Yet I am convinced that if we understand this paradox, we may come a bit closer to the problem of artificial self-consciousness.

Ben Goertzel (2009-03-14 11:11):
Anonymous: I understand, but these theorists are probably thinking exclusively about *biological* consciousness in evolved organisms, and one wouldn't necessarily expect consciousness in the Internet to have all the same properties...

Anonymous (2009-03-14 11:07):
Ben, other scholars distinguish coping, but not defense, precisely as a defining phenotype of consciousness.

Ben Goertzel (2009-03-14 09:26):
Both Anonymous's and jfromm's comments (i.e.
the links therefrom) seem to me to reflect an overly anthropomorphized vision of "consciousness."

Anonymous: the reason humans and other animals display "coping" behavior in response to attacks is that they evolved to have a survival instinct. There's no reason to expect the Net to have a similar survival instinct.

jfromm: I don't see why a nonhuman consciousness needs to be as unitary as a human one, nor why awareness of one's own infrastructure should necessarily make one non-conscious. I suspect I'd still be conscious if advanced brain-imaging instruments let me observe and toy with my neurons.

Anonymous (2009-03-14 05:08):
I think that when a net becomes conscious, it must satisfy three conditions: distributed orchestration, embedded encapsulation, and grounded representation; see http://blog.cas-group.net/2009/03/a-net-becomes-conscious/

Anonymous (2009-03-14 00:11):
Hi Mattbot, regarding your question about a consciousness test, you might consider this note:

http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2002/1877

Martin

Ben Goertzel (2009-03-11 14:27):
Sure...

Of course, coming up with an elegant formal characterization of "degrees of consciousness" is one thing ...
and coming up with a measure that can actually be used in reality, given the types of data available about real systems, is another (probably harder) thing...

I've been working on a novel formal definition of general intelligence (a tweak of one created recently by Shane Legg and Marcus Hutter), and I think one can make a formal definition of consciousness that is related to it...

So, stay tuned ;-)

The Hex Master (2009-03-11 14:23):
Thanks, Ben. Should you choose to talk about such a metric at the conference, I hope you'll put it online. I'd be very interested to see what you come up with.

Ben Goertzel (2009-03-11 14:02):
Hi Matt,

Yep, I remember you, of course...!

As you suspect, there is no rigorous theory of consciousness, nor any reasonably well-agreed-upon non-rigorous theory...

However, I don't think this lack is really responsible for the lack of progress toward AI.

We do have humans as an exemplar, so "make machines that think pretty much like humans" is a fairly clear, though not rigorous, goal for AI...

A hard problem is outlining *incremental* steps toward adult-human-level AI, so that each incremental step comes with metrics allowing careful evaluation of progress. In that regard I'm an advocate of a developmental path, where you start by trying to emulate the cognitive behaviors of a three-year-old child, then work up.
In this approach one can use functional behaviors as a guide for one's work and doesn't need to worry about rigorous definitions...

But having said that, I think it would certainly be nice to have a rigorous and universal metric for "intensity of consciousness", by which we could rate a human as more conscious than a lizard, and a lizard as more conscious than a rock.

I imagine such a metric could be formulated (using algorithmic information theory, pattern theory, etc., and building on some specific assumptions about the nature of consciousness), but it would take more work than I'm willing to put into this blog comment today ;-)

Actually, I'm giving a talk at the consciousness conference in Hong Kong in June... maybe I'll come up with a metric and put it in my talk there. I'd been wondering what I would talk about, heh...

-- ben

The Hex Master (2009-03-11 02:30):
Hi, Ben, I'm Stephan's friend Matt. We've met a few times at his house.

Do you know of any theory, model, or test for determining whether a system has developed an emergent consciousness or intelligence? My gut feeling agrees with yours that consciousness exists everywhere. But can the consciousness of a system isolated from the greater ontological whole exist? (My own subjective experience would say: yes!) Instead of asking whether a computer network could be considered conscious, wouldn't it be equally valid to ask whether an arbitrarily defined system could be said to have a consciousness? Is the Turing test the best we have? It seems a bit human-centric.

For example, take the Bay Bridge between San Francisco and Oakland.
Let's define the bridge system as not only the physical steel, concrete, and earth of Yerba Buena Island, but also the maintenance crews of CalTrans, the automobile traffic passing over its span, the California Highway Patrol enforcing laws enacted by the state legislature, and even the budget line items supporting its upkeep. The system thus defined is able to maintain itself against the wear and tear of its environment, clear wrecks out of the way to keep traffic flowing, etc. It seems unlikely that a human-like consciousness would emerge from this system, but is there a rating scheme that could determine that the bridge system rates at some level of biological awareness, such as "roughly a colony creature, like a jellyfish"?

My suspicion is that our understanding of consciousness is so poor that there isn't much guidance in this area. If the lofty goal of passing the Turing test is the only standard, AI's greatest problem right now seems to be the lack of a clearly defined problem domain. If we could determine what the requirements of a conscious yet non-human system might be, that in itself would be quite an advancement.