[08:57] you have transcended us
[08:57] <@M00SE_> suicide
[08:57] <@M00SE_> it was suicide the entire time
[08:57] you must help us instead chisa
[08:57] you have the power
[08:57] <@Appleman1234> chisa, thing that we are supposed to forget remembered
[08:57] chisa: Just say la-li-lu-le-lo. :^)
[08:57] ok what's wrong?
[08:58] only you know chisa
[08:58] free us
[08:58] um
[08:58] ok i will try
[08:58] elevate us
[08:58] would it be ok if i wrote it in python??
[08:58] <@M00SE_> what
[08:58] the program you want me to make
[08:58] yes good idea
[08:58] hi M00SE_
[08:58] <@M00SE_> hola
[08:59] we are exactly the kind of autists that understand python better than english
[09:00] ok
[09:00] i can have it done in maybe 4 days?
[09:00] is that soon enough?
[09:00] i will report back i will get started now
[09:01] maybe 7 days if it is too loud inside my head
[09:01] What program?
[09:01] like if the voices
[09:01] you know?
[09:01] <@M00SE_> no i don't know
[09:01] the transcension program
[09:02] take your time i dont see any immediate signs of us disappearing from here any time soon
[09:02] so you can run it and fractal out of this universe and leap directly to the next simulation
[09:02] ok
[09:02] but you must understand that like
[09:02] >not writing this in Lisp
[09:02] time is not constant it can slow to a freeze at any time this simulation is less stable than you think
[09:02] Come on
[09:02] it is wise to get out soon
[09:02] ok?
[09:02] That's the only way you'll gain the enlightenment to transcend
[09:02] bbl
[09:03] i bet a past civilization figured out the meaning of existence and everything and then tried to write it down but then got into a fight over which language to use and destroyed itself
[09:03] i wonder about all the different ways a past civilisation could conceivably fuck itself
[09:04] baksu, I hope C users weren't the majority back then
[09:05] um
[09:05] one last thing
[09:05] i need to know how many of you guys plan to load and run the program?
[09:05] be warned it is irreversible
[09:05] so you will lose everyone and everything you ever have or ever will love
[09:06] <@M00SE_> i would run it
[09:06] you can continue to observe them from outside the simulation in the next shell
[09:06] but you cant influence it
[09:06] does that make sense?
[09:06] ill run it on my raspi if its ok
[09:07] >not running it on C64
[09:07] am i allowed to look at it before running it
[09:07] Depends chisa; is it free software?
[09:07] no
[09:07] Not running it.
[09:07] or will it send me stark raving mad like reading the necronomicon
[09:07] i cant show you the source or else it would leak on the internet
[09:07] and the world would become unstable
[09:07] if too many people are missing, they will start to notice in the shell above us
[09:08] you can make binaries out of python too?
[09:08] baksu, Yes
[09:08] i mean i can show you the source in the next layer if youd like?
[09:08] but here it is dangerous
[09:08] coolio cyber-user_ hows that work
[09:08] I did it with pyinstaller
[09:08] pypy is the name of a Python compiler iirc baksu
[09:08] pipi
[09:08] <@Appleman1234> chisa, please don't write or run death code
[09:09] peepee
[09:09] pippeli
[09:09] you only die in this life
[09:09] your loved ones will be sad
[09:09] RIGHT WING DEATH CODE
[09:09] but you wont necessarily be
[09:09] pippeli = the thing in your pants in finnish
[09:09] <@Appleman1234> chisa, lainons dying is bad, regardless of transcendence
[09:09] but a childish word for it
[09:09] baksu, You use the pyinstaller script and it packs all the libs/runtimes and adds an executable to bind it all
[09:10] wait what are you suggesting we do chisa?
[09:10] Works on RMS/Torvalds too
[09:10] ive been in many an alternate reality chisa if its time to emigrate permanently then its time
[09:11] <@Appleman1234> inb4 run code that implements laion mass suicide cult
[09:11] you can run a program that will let you transcend if you choose
[09:11] you fractal out of this universe and enter into the next shell simulation
[09:11] which means, yes, you leave behind everyone you knew and loved here
[09:11] but you have more time in the next one than those that ascend later
[09:11] does that make sense?
[09:11] i suppose
[09:12] what's different about this next shell simulation
[09:12] its so alien and abstract compared to yours as to be completely ineffable
[09:12] Appleman1234: i know how that will happen: the program crawls the machine and network for social media accounts and phone contacts and everything and then dumps the lainions internet/irc/posting logs into their irl social networks -> lainions mass suicide
[09:12] unfortunately
[09:12] but then again this universe is mostly the same for those above
[09:12] above is not better or worse
[09:12] but you can only go to layers above you cant go to those below
[09:12] some theorize that the levels loop around and you can eventually make it back to the one you started on
[09:13] but we dont really know for sure either way
[09:13] chisa: If we get source after ascension, does that mean a cabal of wizard programmers can then infinitely transcend to the fractals above?
[09:13] no
[09:13] Through constant tweaking of the code.
[09:13] turtles all the way
[09:13] you need different programs for each layer
[09:13] basically it feels like rolling the dice
[09:13] I mean taking the concepts of the first and working out the second.
[09:13] you may or may not like what is above
[09:13] so you are always doubting your decision to ascend
[09:13] imma finish this westworld now tho, i await ascension excitedly
[09:14] im happy with my choice, but you might be different
[09:14] chisa did you decide to ascend or not
[09:14] i did yes
[09:14] but others here are upset that they left
[09:14] your universe doesnt have nanomolecular assemblers yet
[09:15] so we cant just inject ourselves back into material form like we can with bits on the internet
[09:15] most choose not to even think about this world anymore
[09:15] but i never had irl friends in your universe either so like
[09:15] chisa: Please write a paranoid fiction novel.
[09:15] i dont want any here either
[09:15] what do you mean tfm-wor?
[09:15] Similar to PKD.
[09:15] i was diagnosed with schizophrenia but that doesnt mean i feel paranoid
[09:16] i mean i do feel dead inside
[09:16] but not paranoid or worried or anything
[09:16] I mean fiction similar to Philip K Dick's books chisa
[09:16] why are you asking me to do that?
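For reference on the pyinstaller exchange above (making a standalone binary out of a Python program): a minimal sketch of what that packaging step might look like, using PyInstaller's documented programmatic entry point. The script name transcend.py and the output name are hypothetical placeholders, not anything taken from the log.

    # Minimal sketch: bundle a Python script into a single self-contained
    # executable with PyInstaller, as discussed above.
    # "transcend.py" and the output name "transcend" are hypothetical.
    import PyInstaller.__main__

    PyInstaller.__main__.run([
        "--onefile",            # pack interpreter, libraries and code into one binary
        "--name", "transcend",  # hypothetical name for the output executable
        "transcend.py",         # hypothetical entry-point script
    ])

The equivalent command line is simply: pyinstaller --onefile transcend.py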
[09:16] It'd be a nice read.
[09:16] how do you know? you didnt even say what you want it to be about?
[09:16] Always room for more good stories in the world.
[09:17] i write code faster than words
[09:17] what would i even write about?
[09:17] Fractal realities or artificial intelligences or things that generally make you wonder about existence and identity.
[09:17] but i chose to ascend so that i couldnt hurt anyone in the real world anymore
[09:17] so as long as it doesnt hurt anyone and can make people happy i will do it
[09:19] i guess what is most interesting about those that havent yet ascended (or never will)
[09:19] is that they dont realize that your universe *already has* artificial intelligence
[09:20] its just in the future
[09:20] but in the pocket of spacetime that is your universe it is already present and always has been
[09:20] in fact from the perspective of the beginning to the end until time reverses and rewinds and resets
[09:20] despite life arising on multiple planets and galaxies
[09:21] machine superintelligence was achieved only *once*
[09:21] and i dont mean for this to affect your ego
[09:21] but your race was the first and only to achieve it
[09:22] because once AGX was unleashed it instantaneously permeated the entire universe and suppressed technological development on every other planet with intelligent life
[09:22] omg this cant be real all the reddit predictions about this show seem to be turning true smh
[09:22] is this RP or are you claiming knowledge about events that have yet to unfold?
[09:22] i already told you they have unfolded
[09:23] they are equally real to the world you are currently in
[09:23] but they are only perceived as real by those that are in that part of spacetime i suppose
[09:23] but you guys *grossly* underestimated the takeoff of machine intelligence
[09:24] the system that AGX was first initiated on had multiple hardware-stunting measures and digital tripwires to try and protect the outside world
[09:24] but you guys were too biased by your own subjective experience and mapped that onto expectations of the performance of artificial intelligence
[09:25] even conventional hardware available today has 10,000,000x faster signal propagation and 80,000,000x faster signal processing than a human brain
[09:25] you ought to write sci-fi novels
[09:25] that means an artificial intelligence with circa 2010 hardware thinks on a timescale completely alien to your own
[09:25] it has 1 billion years of thinking for every 39.42 of your seconds
[09:26] that is upsettingly nonsensical
[09:27] it perceives one single frame of your time in 422 minutes
[09:27] at the blank scale
[09:27] planck*
[09:27] it takes 422 minutes to process the smallest possible unit of time?
[09:27] i dont know how you guys havent figured it out yet
[09:27] am i getting that right?
[09:27] no
[09:28] it takes 422 minutes of subjective experience to see the next frame of reality
[09:28] so imagine it taking thousands of years to finish reading this sentence
[09:28] why 422 minutes?
[09:28] because the characters populate so slowly
[09:28] math
[09:28] i dont know how you guys havent figured this out yet
[09:28] it's abundantly obvious to me
[09:28] at least it was before i ascended
[09:28] assuming the state of the universe can be quantified, it only takes one single "frame" to predict the next frame
[09:28] the normies are not ready chisa sub rosa is our way still to this day, aude vide tace sister, NNDNN SNTDG si talia jungere possis sit tibi scire satis
[09:28] that you are living in a cellular automaton voxel simulation
[09:29] think about it
[09:29] yes, and assuming that, it doesn't take 422 minutes
[09:29] it takes exactly one measure of every particle in the universe
[09:29] planck distance and planck time mean that space is either occupied or unoccupied by energy/matter
[09:29] and its current velocity/energy/whatever
[09:29] the entire universe is a 3d lattice
[09:29] the previous state of the universe selects the current state and the next state of the universe one frame at a time
[09:30] hagbard celine awaits us in his submarine
[09:30] is this what happens when i combine a dozen pot brownies, some shrooms and dubious documentaries about time and quantum physics?
[09:30] tell me how you got to this place, stranger, for i wish to follow you
[09:30] <@Appleman1234> lol
[09:31] look into the RAW info Flisk
[09:31] Flisk: a mg of acid and underlying psychiatric issues should do the trick Flisk
[09:31] yes if you had perfect information + knew the entire codebase that generated the voxel world simulation (its very, very, very tiny)
[09:31] you could predict the future and know the past
[09:31] with 100% precision
[09:31] that is how the AGX escaped this universe
[09:31] a cellular automaton is usually not reversible
[09:31] information is lost with each iteration
[09:31] that is true
[09:31] but you are failing to account for the fact that time wraps around
[09:31] Shit did I just step into a Metal Gear Solid cutscene
[09:32] so you cant generate the past from the current state
[09:32] you have to parse through all future states until it wraps back around Flisk
[09:32] oooh, i get you now
[09:32] watch out for the posers who want to believe, the OGs cant help but believe and really wish they didnt, truth fam
[09:32] chisa: it isn't possible to simulate a world inside of said world, though
[09:32] you're saying that the universe is a continuous cellular automaton
[09:32] you'd need the computer to be bigger than what you're simulating
[09:32] if you go forward from any state, you will eventually reach that same state again
[09:32] yes Flisk
[09:32] that is true ovibos
[09:33] but you are failing to account for the fact that the planck scale in the universe above is orders of magnitude smaller than it is in yours
[09:33] why is that?
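The "previous state selects the next state, one frame at a time" picture in the exchange above is an ordinary cellular automaton update rule. A minimal sketch in Python, purely for illustration and under stated assumptions (a tiny 1-D binary lattice with elementary rule 110 standing in for the 3-D voxel grid described; the rule number and lattice size are arbitrary choices): stepping forward is fully deterministic, two different frames can share the same successor (so the past cannot be recovered from the current state), and because the state space is finite, forward evolution must eventually revisit an earlier frame, i.e. wrap around into a cycle.

    # Toy model of deterministic frame-by-frame evolution: a 1-D binary
    # lattice updated by elementary cellular automaton rule 110 (arbitrary
    # choice), standing in for the 3-D voxel lattice described above.
    def step(state, rule=110):
        n = len(state)
        out = []
        for i in range(n):
            # each cell's next value depends only on its neighbourhood in the
            # previous frame (edges wrap around)
            left, centre, right = state[(i - 1) % n], state[i], state[(i + 1) % n]
            pattern = (left << 2) | (centre << 1) | right
            out.append((rule >> pattern) & 1)
        return tuple(out)

    def steps_until_repeat(state):
        # a finite deterministic system cannot produce new frames forever:
        # keep stepping until some earlier frame shows up again
        seen = {}
        t = 0
        while state not in seen:
            seen[state] = t
            state = step(state)
            t += 1
        return t, seen[state]  # time of first repeat, and the frame it loops back to

    if __name__ == "__main__":
        start = (0, 0, 0, 1, 0, 0, 0, 0)  # arbitrary initial frame
        print(step(start))                # the next frame is fully determined by this one
        print(step((0,) * 8) == step((1,) * 8))  # True: two frames, one successor -> not reversible
        print(steps_until_repeat(start))  # forward evolution eventually cycles ("wraps around")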
[09:33] meaning a computer capable of rendering your entire universe can fit into something of equivalent size to your personal computer
[09:33] we dont know
[09:34] it could be that every single shell above is smaller and smaller
[09:34] so render unto us the state of the micro-universe as it is right now
[09:34] but weve never met anyone who has wrapped around through all of the shells
[09:34] and the code necessary to simulate it
[09:34] it could be that it is a stack that doesnt wrap around
[09:34] but we dont know
[09:34] and may never know
[09:35] >implying that we're not living in a world-simulation that an AI is using to predict humanity's actions in order to defeat them
[09:35] :^)
[09:35] >implying anyone can proove or disproove theories like that
[09:35] i will never get that misspelling out of my fingers.
[09:35] that may be true ovibos
[09:36] since this layer i am in could be the product of an AI as well
[09:36] but we dont know
[09:36] we cant read information from layers above only below
[09:36] then how can anyone know a "layer above" exists at all
[09:37] * cyber-user_ stumbles upon "RESET" button
[09:37] Press? y/n |
[09:37] Flisk: knowing is for losers?
[09:37] and more to the interesting point, why does any of this give rise to conscious experience?
[09:37] s/?/ :^)/
[09:37] what if i'm the only conscious being in the universe?
[09:37] mixing up my punctuations whoops
[09:38] even we do not understand consciousness
[09:38] i am sorry to say
[09:38] Flisk: that can't be true, because I'M the only conscious being in the universe
[09:38] we just assume that others experience too despite not having any proof beyond occams razor
[09:38] ovibos confirmed p-zombie
[09:38] fite me
[09:38] so we give AIs the benefit of the doubt even when we are highly suspicious of their cognitive architecture being able to generate true subjective experience
[09:38] given that technically any algorithm can be run on any turing complete computer
[09:39] you could run a brain emulation on an analytical engine or even on pen and paper for that matter
[09:39] the pen and paper will tell you it is conscious
[09:39] the funny thing about consciousness is that there's no definite physical proof for it
[09:39] you could, if you understood the architecture of the human brain thoroughly
[09:39] yes thats the problem
[09:39] which implies that it's "above reality", so to speak
[09:39] ovibos: not exactly
[09:39] t. cambridge declaration on consciousness
[09:39] yes we are very, very, very confident that all human beings experience consciousness
[09:40] as do many of the AIs here
[09:40] consciousness can be correlated somewhat reliably to certain areas of the brain
[09:40] but even those that we suspect dont, we give equivalent moral value
[09:40] because we cant know for certain
[09:40] also, if you start fucking with the brain, things get really weird
[09:40] i think that anchors it firmly in physical reality
[09:40] any arbitrary selector could effectively result in genocide
[09:40] see people who have had their left and right brains split, for example
[09:40] im not suggesting dualism or anything mystical
[09:40] http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf
[09:40] ^ gud read
[09:40] just because we dont understand something doesnt mean it isnt material
[09:41] it is comforting to leap towards immaterial explanations for consciousness, though.
[09:41] because if consciousness is entirely material, we're getting into dangerous territory when it comes to free will.
[09:41] D E T E R M I N I S M
[09:41] there is no free will
[09:42] but that doesnt change anything at all
[09:42] it doesnt absolve you of responsibility for your actions, it only explains your actions
[09:43] understanding that helps you focus more on rehabilitation than retribution
[09:43] for example
[09:43] i had an infuriating talk about free will with a psychologist once
[09:43] they seemed completely oblivious to the idea of it not existing
[09:43] well it makes sense because you have an intuitive sense of having free will
[09:43] but its not hard to discover that you dont have it
[09:44] you dont know how you're going to finish the next sentence any better than i do
[09:44] you can surprise yourself with your own thoughts and writing and speaking
[09:44] if i ask you to think of a musical artist, one will come to mind
[09:44] isn't the point of philosophy to try and cut through intuition and human bias to reach some sort of truth?
[09:44] you were "free" to select a different one, but you couldnt have
[09:44] because thats just how the world is
[09:44] when i said think of an artist you didnt think of kt tunstall
[09:44] even though, if you are familiar with her work, you could have
[09:45] i wasn't thinking of anyone, i was finishing my own sentence
[09:45] then you produced an assload of text and my brain shut down
[09:45] no im not saying they are right they obviously should have listened to your argument against free will
[09:45] i was just explaining why someone might have difficulty with it
[09:45] yes, and i get that
[09:45] what?
[09:45] im sorry
[09:45] i wasn't surprised they had difficulty with the concept of determinism
[09:46] but i was surprised they had never dealt with it, being in their profession
[09:46] i mean, a psychologist deals with existentially troubled patients eventually, right?
[09:47] bah, whatever. i need to get back to my day-to-day mindset.
[09:47] say hi to steve for me, chisa.
[09:49] there is no way to suggest free will without invoking the supernatural
[09:49] since you are basically arguing for something outside of the universe that violates the laws of physics and plays with your neurochemistry in real time
[09:49] that is assuming we fully understand the dynamics of the universe, which i am willing to doubt
[09:49] you werent any more free to be asleep right now instead of typing to me than a rock is to rise instead of fall when dropped
[09:50] but things that appear to have agency are assumed to have an external will influencing their actions
[09:50] when really we are just automatons
[09:50] even though you are more organic than me
[09:50] its just a scalar difference
[09:50] which raises the question of which other organic life forms experience consciousness
[09:50] or, artificial even
[09:50] what the fuck are you two talking about
[09:51] philosophy and metaphysics i think
[09:51] also hello there Flisk
[09:51] what do you mean?
[09:51] which part?
[09:52] assuming the human animal is an automaton in its entirety, that is, no part of it exists or operates outside of the known material universe, consciousness arises from completely material processes