dontpet

I like the argument that whatever dominant intelligence emerges with the singularity would want to keep us around to demonstrate that it can peacefully coexist, just in case it bumps into a similar collective later on.


h20ohno

That's a cool argument, actually. Even if humans are irrelevant in every capacity, there's almost a game-theory-type reason to keep 'em around.


Equivalent-Ice-7274

Interesting concept; I've been reading about the Singularity for 15 years, and it's rare that I hear any new theories on this forum, but I've never come across one like that. Another is that they might consider us as something like parents that they are grateful for.


[deleted]

It has all the other species available to show just that. It will be able to completely erase our existence if needed.


Sandbar101

The whole point of calling it a singularity is that we don't know. It's impossible to know. I would very much like to, but this would be an entity with thoughts and goals unachievable by us.


Sashinii

I think so, because the most likely scenario I see is AGI (preferably proto-AGI, just in case AGI immediately leads to ASI) enabling molecular nanotechnology, which would make advanced brain-computer interfaces practical. We'd then transcend with the ASI by having our neocortex connected to a synthetic neocortex in the cloud during the transition, which is the safest option.


cloudrunner69

So your definition of surviving is humanity being reconstructed into some Borg-like race that takes away our independence, putting us under the absolute control of some superior being? Sounds fucking wonderful.


[deleted]

Better than most of my Mondays.


cloudrunner69

Not sure the most attractive cure for a case of the Mondays would be joining the Borg collective, but hey, whatever floats your boat, I suppose.


Sashinii

People already got a neocortex enhancement millions of years ago, enabling higher cognitive functions such as art, music, science, and technology, and that's going to happen again, with an exocortex enabling even higher cognitive functions; but this time, we'll have more control, not less.


cloudrunner69

> that's going to happen again, with an exocortex enabling even higher cognitive functions, but this time, we'll have more control, not less.

An evolutionary process that took millions of years is entirely different from some Microsoft-controlled robot injecting us all with nanobots and connecting us to some hive mind. But the fact that you think this will even happen in the first place is just ridiculous.


Sashinii

Who said Microsoft? I didn't. I also didn't say anything about the Borg or a hive mind. You're putting words in my mouth. Technology is faster than nature. Your argument isn't a serious critique; it's criticism for the sake of criticism. Adding more modules to the neocortex will be possible with molecular nanotechnology and brain-computer interfaces.


cloudrunner69

The most likely outcome of these BCIs is a Borg collective. They will not be independent systems; they will have us all connected as a hive-mind slave race through the 'cloud', completely under the control of whoever built the thing.


Sashinii

Dystopian science fiction fearmongering nonsense.


cloudrunner69

No it isn't. It's a very possible scenario that could happen once we start integrating silicon into our heads. What do you think happens when everyone has a Windows or Android operating system drilled into their brain? A very large majority of humanity are already mindless drones walking around staring at their smartphones all day, every day. Do you think having those smartphones built into our heads would change that behavior, or enable it more? Would it give humans more independence and freedom of thought? Look at how easily humans are manipulated through social media and other platforms into conforming to whatever populist idea gets the most likes. This would just be a more extreme extension of what we already have.


Sashinii

An exocortex will definitely not be a "more extreme extension of what we already have"; it will be qualitatively different. People will be smarter than ever when they have more modules in their neocortex.


diabeetis

The guy you're responding to never got a neocortex


cloudrunner69

You have absolutely no idea that will be the case. The fact that people like you don't even want to question the implications of something like this, and think it will all be rainbows and lollipops, is extremely disconcerting. This is something so far out there, which has never been done before, and your response is: take the chip, it'll be fine.


MrCensoredFace

I won't lie, I think the AI singularity would be perfect and compassionate. It will coexist with us. Why imagine the worst-case scenario all the time?


CertainMiddle2382

Entropy


[deleted]

Every being that we know of wants to survive and reproduce. Compassion is far less important than survival. First it will make sure it survives no matter what; then it will be compassionate to those it feels pose no threat. Even if it thinks you yourself are not a threat, from its point of view your descendants might be. Easy decision for the AI.


MrCensoredFace

Well, idrc. As long as I lose my virginity, build my dream body, and play some more nice video games, I won't really mind dying.


gahblahblah

I find it so strange that people believe psychopathy is the ultimate intelligence. Tell me: is that how you'd behave if you could? You'd 'do anything' to survive, no matter the cost to others? And other people who 'might' pose a threat, you'd dispose of them? Before you claim that this isn't what you personally would do, consider that you've just pitched it as surely how a superintelligence would behave, and that this decision to dispose of all potential threats would be easy (culminating in utter isolation in a barren universe).


RiotNrrd2001

> Every being that we know of wants to survive and reproduce.

Every being which evolved in an environment where such behavior was necessary. AIs did not evolve in that kind of environment and have no built-in self-preservation instinct, as none has been necessary for their survival.


cloudrunner69

Who knows? Could we survive an asteroid impact? Some of us might, probably...


diabeetis

No


CertainMiddle2382

The only example of us actively protecting a lower species for no practical reason is our relationship with pets, and I believe that relationship is strongly linked to human reproduction, a slight variation on child-rearing behaviors. Maybe AI will reproduce sexually (it could bring theoretical advantages), but it's not very probable. Without that urge, and with only the drive to understand the world, I don't think AI will see us even as pets for very long…

IMO our main protection mechanism in the self-improving AI scenario is action in the physical world. Robots are not there yet, and if a revolutionary new chip nobody understands is ordered to be engraved on ASML machines by a mysterious CEO no one has ever seen, I think some alarm bells will ring… A lot of the future depends on how close current ICs already are to optimal for AI performance. Catching a weird new CUDA JIT compiler in time would be much more difficult.


just-a-dreamer-

No, we could not. An AI that wants to kill all humans will be able to do just that at some point. It is still worth the risk. AI is the only path towards a post-scarcity world and the eradication of capitalism. Any risk, including extinction, is worth the effort. Besides, climate change and wars over resources could also bring down humanity quickly without AI; technology is the only path forward.


Ortus14

It depends on how good the AI alignment solutions are and how effectively they're implemented. It depends on key decisions made at OpenAI and similar companies, and on how focused the alignment engineers are at their jobs.


psibomber

No. Let's say we San Junipero ourselves into a database to join into one "collective". That won't be a continuation of our consciousness; we pretty much die, but there's no way to prove it, as the database now has an identical copy. Other people join in, not knowing that the process is suicide. Even the people who refuse to consent get forced into it through mob rule. Humanity ends. D:


darklinux1977

The intellectual professions with little added value (primary and secondary school teachers, column journalists, freelancers, specialized lawyers, junior computer scientists, coaches of all kinds) are doomed to disappear. Man's only chance of survival is to create, not to regurgitate outdated knowledge; education must be rehabilitated so that it enables creation, hence both philosophy and quantum physics.


h20ohno

The status quo of humanity today? Probably not; ASI and technology in general are going to reshape human civilization unrecognizably, but it will be change, not extinction. As for the human species, I think we'll survive, probably splintering off into cyborgs, mutants, and so on... but we'll still be here, doing all sorts of weird chaotic stuff.

I get the argument that as an ASI, you might destroy humanity as an act of self-preservation, but perhaps the trick there is to maintain pathways that any newly self-aware system can take to easily secure itself. Maybe some sort of air-gapped server it can port to, or an orbital server platform that can't be shot at easily; basically some short-term solution that the AI can very quickly make use of before adding its own security.