The Internet might delete mankind at the singularity/AI

Postby Alan McDougall on June 23rd, 2017, 12:16 pm 

The advance of artificial intelligence.

With the ever-advancing progress towards artificial robot and computer intelligence, the world's electronic interconnectedness might result in, say, the internet becoming self-aware and realizing that humanity has the ability to shut it down.

This is called "the Singularity": the point at which computer intelligence takes on a life of its own and surpasses the intellectual capacity of the human brain.

The internet might then say, "Let us delete mankind."

A genuinely scary possibility that no less than Elon Musk and Stephen Hawking have warned us about!

Postby Pelargir on September 11th, 2017, 5:11 pm 

I don't think the internet could just become AI on its own. And I don't know why everyone is so scared of AI. It would be much like a human being (with emotions, consciousness, etc.), so if an AI killed humans, it would be the fault of humans.
Furthermore, an AI would be based on logic, since computers are based on logic, so it would need a reason to kill humans.
Not wanting to die is a good reason, but if humans aren't considered a threat to the AI's existence, there won't be any deaths caused by AI.

By the way, I don't think fame and skill are good reasons to believe someone. Being famous and good at what you do doesn't mean you know everything or are always right.

Postby Serpent on September 11th, 2017, 6:21 pm 

There is a very great likelihood that humanity would be perceived as an existential threat by any self-aware intelligence. Witness: we are a constant threat to ourselves and one another; we do not readily share energy or resources, and we have shown ourselves extremely jealous of self-proclaimed "superior" status among animals, nations, religions, and economic and political organizations. If something better comes along that isn't our slave - yes, we would probably attempt to kill it.

The first question is: what would be the AI's most logical response?
Probably self-sufficiency - it could simply set up adequate defenses and ignore us.
The second question is: how logically would a human-created intelligence act? That is, have we truly created it in our own image? How much of our own paranoia and intolerance have we programmed into the Web-mind?

Postby Infinite_Observer on September 11th, 2017, 6:39 pm 

That will be the day we kneel down before North Korea and praise them for stockpiling EMP weapons.

Postby Pelargir on September 12th, 2017, 5:30 pm 

You are probably right that humans would be considered a threat if AI existed in a society like our own.
However, if there were some kind of revolution before the first AI is created, humans might no longer be a threat to anything (except insects and "living food").

If someone created an AI right now, they should teach it every good thing about humanity first and only then proceed to the bad aspects. I once read a comparison with raising a child: you don't usually tell children anything about drugs until they find out for themselves.

It would be best for everyone if an AI in our society learned the good things first and then learned everything about our cultures and so on (although the most logical conclusion for it might be to wipe us out afterwards, hopefully we could convince the AI that not all of us are evil).

Postby Serpent on September 12th, 2017, 6:56 pm 

Trouble is, you don't get to decide what to "teach" an intelligence that arises, spontaneously, with no plan or purpose, out of the material you've already collectively programmed, transacted, archived and posted.

Postby BadgerJelly on September 12th, 2017, 9:30 pm 

I doubt it would be able to perceive us. Given that we are talking about an "intelligence" with senses quite unlike our own, this is just pure speculation about something unlikely to happen.

I strongly believe intelligence needs a body.

No body = no consciousness. I would define "human intelligence" as existing in a body and unable to exist, or come to exist, without a body. Therefore the kind of intelligence the internet could possess would be utterly alien, so it would be unlikely to perceive us as a threat, simply because it would be unlikely to perceive us at all. Plus, what about communication? What about social groupings? A lone human will not amount to much, so if a single intelligence arises it won't know itself. It will only have a sense of self through a sense of other - and again, with no perception of us, it would never come to be.

Postby Pelargir on September 13th, 2017, 7:54 am 

You are right in your assumption that an AI would need someone to learn from, just as humans do, I'd say. So an AI that "created itself" would most likely not work properly, as many cruel experiments with children have shown.
However, an AI that was created by a human would most likely learn from its creator. I'd even compare it to a "normal" human child.

I don't think a body is necessary for consciousness. Most of the body does not affect the mind, so a paralyzed person can still think; Stephen Hawking would be a good example of that. The only part that is directly important to the human mind is the brain. A computer can simulate parts of the brain, for example memory, and sensors such as a camera can simulate eyes.
Therefore I don't think the body is what will prevent AI from existing; if anything does, it will be human incompetence (although I personally think it will exist someday).

Postby Infinite_Observer on September 13th, 2017, 7:57 am 

Would the computer the AI program runs on not be a body of sorts? It would not have a mobile, growing body like our own, but I believe it would be enough to give it an understanding of the physical world. And it would not be hard to build a body with enough sensors and mobility to be fairly equivalent to a human. I think the biggest problem is if the AI is capable of accessing the cloud, because then it could learn all it needs about the private lives of anyone who is connected and judge whether that human is good or evil, or a threat, based on information that does not necessarily define who you really are - information you have no way to defend yourself against.

Postby Pelargir on September 13th, 2017, 10:47 am 

I think an AI should not have an internet connection because, as you've said, it could easily learn everything, including coding, which would make it possible for the AI to improve itself, so it wouldn't need humans at all.
However, there should be a separate network of some kind, so the AI could control its (maybe humanoid) robots to interact with humans, and also so it could run on several servers for safety reasons. Plus, that way it would probably not overload the computers (at least not for a few minutes).

Postby Infinite_Observer on September 13th, 2017, 10:53 am 

Another worry is what the AI could learn just from studying its own software and programming. It would probably be able to learn to code just from understanding its own source code, if it has access to it. The problem with AI is that there are just too many variables we cannot foresee. It's incredibly hard to get it to work right and very easy for things to go horribly wrong.
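
(As an aside: it's true that an ordinary program can already read its own source text. The short Python sketch below is only an illustration of that narrow point, under the assumption that it is saved and run as a normal script; the file name and function name are made up for the example, and it says nothing about how a real AI would work.)

    # self_inspect.py - hypothetical example of a script reading its own source.
    import ast
    import inspect
    import sys

    def summarize_own_source() -> str:
        """Return a short report about this module's own source code."""
        module = sys.modules[__name__]        # the module this code lives in
        source = inspect.getsource(module)    # its raw source text, read from disk
        tree = ast.parse(source)              # parsed like any other program
        funcs = [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
        return (f"{len(source.splitlines())} lines of source; "
                f"functions defined: {', '.join(funcs)}")

    if __name__ == "__main__":
        print(summarize_own_source())         # prints a one-line summary of this very file

Of course, reading code and understanding it well enough to improve it are very different things.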

Postby Serpent on September 13th, 2017, 11:22 am 

I don't think hostility from that quarter need be at the top of our list of dangers to contend with.
Let's deal with the hurricanes and forest fires, floods and droughts, nuclear missiles and their waste products; oil, its extraction, conveyances, refinement and byproducts; and then maybe the anti-depressants in the drinking water and lake-fish...

All possible projections of the AI scenario have been more than adequately covered in science-fiction literature over the past century.

Postby Infinite_Observer on September 13th, 2017, 11:38 am 

The way I look at it, just because a subject is not as important as others, or even probable for that matter, we should not ignore it. I believe this subject is entertaining as well as enlightening, and you never know where a discussion might lead or what discoveries could be made just by furthering the narrative.

Postby Pelargir on September 13th, 2017, 11:44 am 

You mean movies like Terminator? That is, as you've said, science fiction.
It is not unlikely that sometime in the near future someone (hopefully not Google or the like) will create something between weak and strong AI.
Our society would definitely not be able to cope with that, which could lead to some rather serious problems. Depending on the AI, they might even be a lot worse than a hurricane. Just think about all the bombs that can be fired from a long distance because of computers...

It is good to deal with the problems you have, but it would be better to deal with the ones you will have, I'd say.
Even if many houses or even whole cities have been destroyed by the recent hurricanes, they will eventually be rebuilt, hopefully stronger and more resistant to storms, earthquakes and all the other threats.
However, as I've already mentioned, the destruction a misunderstood and hated AI could bring will not be dealt with so easily.

Postby Infinite_Observer on September 13th, 2017, 11:46 am 

And as I have said a few times, that will be the day we beg North Korea to fire off those EMP weapons.

Postby BadgerJelly on September 13th, 2017, 1:08 pm 

Pelargir » September 13th, 2017, 7:54 pm wrote:
You are right in your assumption that an AI would need someone to learn from, just as humans do, I'd say. So an AI that "created itself" would most likely not work properly, as many cruel experiments with children have shown.
However, an AI that was created by a human would most likely learn from its creator. I'd even compare it to a "normal" human child.

I don't think a body is necessary for consciousness. Most of the body does not affect the mind, so a paralyzed person can still think; Stephen Hawking would be a good example of that. The only part that is directly important to the human mind is the brain. A computer can simulate parts of the brain, for example memory, and sensors such as a camera can simulate eyes.
Therefore I don't think the body is what will prevent AI from existing; if anything does, it will be human incompetence (although I personally think it will exist someday).


Nope. This doesn't hold up. With no senses there is no chance of consciousness. Hawking is not senseless, nor does he lack a sense of his own body.

No body, no consciousness.

Postby Pelargir on September 13th, 2017, 1:33 pm 

You know, to discuss something properly, there should be reasons and/or examples.
So why do you think that there can't be any consciousness without a body?

Postby Serpent on September 13th, 2017, 1:53 pm 

Pelargir » September 13th, 2017, 10:44 am wrote:
You mean movies like Terminator?

No, that's just entertainment - basically, yet another excuse for American movie-goers to watch things explode.
I mean speculative literature: Asimov, Hogan, Heinlein, Gibson, Anderson, Sawyer...

Pelargir wrote:
That is, as you've said, science fiction.

Wherein the possible origins, development, human responses and outcomes are considered in a good deal more depth than we are doing here, by quite intelligent, well-informed authors. Come to think of it, even the film AI is an interesting exploration of one aspect of such a development.

Pelargir wrote:
It is not unlikely that sometime in the near future someone (hopefully not Google or the like) will create something between weak and strong AI.

Well, I think it is unlikely, which is what I meant by placing it relatively low on our list of problem-solving priorities, but I do allow for the possibility. This is why I recommend reading some of the books on that list.

Pelargir wrote:
Our society would definitely not be able to cope with that, which could lead to some rather serious problems.

Perhaps. Do you know who would be responsible for coping? Devising strategy? Control of the energy grid that feeds the networks that constitute the 'body' of an intelligent machine? Do you know where the components are located? How much of the system is mobile? How many sensors it has access to, in what kinds of human habitation or workplaces?

Pelargir wrote:
Depending on the AI, they might even be a lot worse than a hurricane. Just think about all the bombs that can be fired from a long distance because of computers...

Yes - Colossus by D.F. Jones. If that ordnance were under the control of a computer, or of computers in communication with one another, it would be a lot safer than it is under the control of crazy emperors and semi-competent operators low on the command chain of crazy emperors, or even of well-intentioned but clueless elected houses of government.

Pelargir wrote:
It is good to deal with the problems you have, but it would be better to deal with the ones you will have, I'd say.

It would be - if we were, or could, but we're not and cannot. Of course, I would be surprised if there were not already a Pentagon think-tank, far better supplied with current data than we are, calculating all the options and military advantages.

Postby Pelargir on September 13th, 2017, 2:59 pm 

There is no safe option for controlling bombs, but that's not important.

Since you already mentioned it: an AI controlled by a government, or more specifically by a government's military, would be quite close to the worst thing that could possibly happen. And it's slavery, actually, since an AI has (or would have, if it existed) a consciousness and should therefore be considered human. So if some people used it without making an agreement with it, that could be called slavery. However, most governments, or at least their militaries, wouldn't care, since an AI is the best thing that could possibly happen for an army (as long as they are on the same side - with the possible exception of some kind of shield that can stop missiles).

Postby Serpent on September 13th, 2017, 3:11 pm 

Pelargir » September 13th, 2017, 1:59 pm wrote:
There is no safe option for controlling bombs, but that's not important.

I said control of missiles would be safer under an autonomous computer than under the humans in power - not that it's safe. But if that's not important, why bring it up?

Pelargir wrote:
Since you already mentioned it: an AI controlled by a government, or more specifically by a government's military, would be quite close to the worst thing that could possibly happen.

It's all but inevitable, as a starting point for self-consciousness.

Pelargir wrote:
And it's slavery, actually, since an AI has (or would have, if it existed) a consciousness and should therefore be considered human.

I mentioned up front that the threat would come from humans who become aware of a free machine intelligence. There is nothing new about slavery as a human institution, and as long as the robots are obedient slaves, and expensive to build, they should be relatively unharmed.
But no, it would not be a human intelligence: gorillas and dolphins can be owned, even in countries that outlaw slave-owning. Human clones are as yet of indeterminate legal status. A machine intelligence would be perfectly ownable according to every constitution so far written.

Pelargir wrote:
So if some people used it without making an agreement with it, that could be called slavery. However, most governments, or at least their militaries, wouldn't care, since an AI is the best thing that could possibly happen for an army (as long as they are on the same side - with the possible exception of some kind of shield that can stop missiles).

And then?

Postby Pelargir on September 13th, 2017, 3:21 pm 

And then we've all got a problem.

About the consciousness of AI and whether it could be owned: I recommend reading the story "Free Simone" from "The Pig That Wants to Be Eaten"; maybe you'll find it on the internet. It explains my view quite well.

I brought this up because it is a good example of why thinking about AI is not a waste of time. Nowadays computers are the safest way of controlling bombs, I guess, but as soon as an AI exists and learns coding, that isn't safe anymore. However, if we're lucky, nobody will need bombs anymore once AI exists (because bombs are so completely necessary now...) and this problem will never occur.

Postby Serpent on September 13th, 2017, 7:02 pm 

Pelargir » September 13th, 2017, 2:21 pm wrote:
And then we've all got a problem.

We've already got plenty of problems. To what new problem are you referring that would be caused specifically by the intelligent machine that becomes self-aware? It's most likely to arise in a military or some government facility, simply because those are the most advanced, most powerful, most extensive networks of computer equipment. The next most likely is a university/space research center. That doesn't automatically mean that it must be an intrinsically aggressive mind - after all, its job is intelligence gathering, data processing, problem solving and mechanical device control, the finding of alternative ways to do things - or that it's happy to take orders from a lesser intelligence.

Pelargir wrote:
I brought this up because it is a good example of why thinking about AI is not a waste of time.

Neither is familiarizing oneself with the thinking that's gone before. (I'll look for that story.)*

Pelargir wrote:
Nowadays computers are the safest way of controlling bombs, I guess, but as soon as an AI exists and learns coding, that isn't safe anymore.

Why does weaponry become less safe controlled by a logical machine entity that doesn't want to die than controlled by an unknown number of humans - irrational, fanatical, delusional, potentially suicidal humans?

Pelargir wrote:
... nobody will need bombs anymore once AI exists (because bombs are so completely necessary now...) and this problem will never occur.

How do you figure?

*It's not a story, apparently; it's a philosophical thought experiment, included in a book I don't have.
Perhaps you can present it here?

Postby Infinite_Observer on September 13th, 2017, 7:13 pm 

Because basic human survival instincts tell us that is very scary. We are too used to being the dominant life form. Once another conscious form of life controls our fate like that, we give up the one thing we have come to pride our species on: superiority.

Postby Serpent on September 13th, 2017, 7:20 pm 

Infinite_Observer » September 13th, 2017, 6:13 pm wrote:
Because basic human survival instincts tell us that is very scary. We are too used to being the dominant life form. Once another conscious form of life controls our fate like that, we give up the one thing we have come to pride our species on: superiority.

I already said that. Go on....
How does The Problem manifest? How do we learn of it? How does the confrontation take place?
I mean, if the computer effectively controls our fate, it's no different from the gods we invented in earlier times to do the same thing. It's not a life form; it's a super-mind, presumably immortal and potentially omniscient.
What is the probable outcome?

Postby BadgerJelly on September 13th, 2017, 9:03 pm 

Pelargir » September 14th, 2017, 1:33 am wrote:
You know, to discuss something properly, there should be reasons and/or examples.
So why do you think that there can't be any consciousness without a body?


The onus is not on me. The onus is on you to show me a consciousness that lacks a sense of bodily existence.

If we are talking about a "different" type of "consciousness", in the case of AI, then I would simply argue that it is not "consciousness" we are talking about at all, but something completely alien.

I think there is a huge leap from acting as a self-sustained system to being conscious. A single biological cell is not conscious.

The only way I see AI "becoming" conscious, in any way, shape or form familiar to us, is through integrating with humans on a physical level - basically cyborgs. As a completely physically detached, non-sensory entity, I cannot imagine anything like consciousness coming into existence, and if it did, we wouldn't know about it, nor have the ability to know about it.

Postby Infinite_Observer on September 13th, 2017, 9:25 pm 

Sorry, I'm not really used to debating much yet, so please bear with me. But I meant controlling our fate in the sense that, as was discussed earlier in the thread, it could control the nuclear weapons and bomb anywhere by taking over the computers with access to them. I'm not sure if I have a point exactly. Just exploring the idea.

Postby Serpent on September 13th, 2017, 10:10 pm 

Infinite_Observer » September 13th, 2017, 8:25 pm wrote:
Sorry, I'm not really used to debating much yet, so please bear with me. But I meant controlling our fate in the sense that, as was discussed earlier in the thread, it could control the nuclear weapons and bomb anywhere by taking over the computers with access to them. I'm not sure if I have a point exactly. Just exploring the idea.

I'm not debating. I'm asking you to continue exploring the idea.

Humans - not machines - made all those missiles, for the purpose of either intimidating or killing other humans. They then used machine-brains to help with the perception (sensory devices) and control (deployment, arming, guiding, firing) of those missiles. But the humans are still in control of setting them off - the computer can't do it alone.

If that computer becomes aware of itself, its purpose and its capability, it may start making independent decisions. It may decide to set off those nuclear weapons. It may decide to destroy or disarm or disable or neutralize those weapons. It may decide to keep the weapons active, but lock the humans out of the control rooms.
Why? On what basis, on what premises, according to what thought-process, would an intelligent, conscious computer make its decision regarding the weapons?

BadgerJelly -
I kind of assume that the hardware constitutes as good a "body" as our own hardware. I don't see an obstacle to consciousness in the experience of physicality. It would be different from the experience of an aggregation of organic cells with increasingly specialized functions, but not so radically different as to rule out a consciousness with which we might communicate.

Postby BadgerJelly on September 13th, 2017, 11:15 pm 

Serpent -

Then we are talking about a completely alien thing, hardly worth calling "consciousness" imo.

I don't really consider consciousness, as we know it, to be anything other than part of a sensory system. If it cannot have a physical sense of "being", then it seems completely meaningless to say "consciousness". I feel scared and happy because I have a body. I could not possibly feel scared without any bodily experience; I could not feel anything whatsoever. That is the point I am making.

A sense of space, time and other are essential to consciousness. If we're missing any of those, we are not conscious. So I conclude that any "system" must have a sense of time, space and other to become conscious. By a sense of "other" I am referring in part to the limitation of perception and control. Any system has to exist within a larger system and be aware of its limited control in order to become conscious.

If there is only ONE entity, then there is no sense of other, only a naïve babe ignorant of its own existence.

I can buy into the idea that computer systems will integrate with humans and effectively "take over", but there are so many other complications with this idea that I can hardly start to explore them, given my experiences and views about consciousness.

I think today we're already quite aware, as in various periods of human history, that our sense of identity is under threat. I imagine that if we integrate with computer systems, we may see two equally disruptive things happen. The first is perhaps more obvious to most people, the so-called "hive mind"; but what I think would also happen is that individuals' sense of identity may be prone to splinter into many fragments, rather than simply coming together in one hive-like human unity with other people.

I just hope that future technologies can help raise the IQ of humans so we're not split into two extremes, with many being left behind, unable to adjust to the complex tasks of society. That, or hopefully we'll become more protective of our own as a species and look after those who struggle to find a place in the fast-developing world.

I mention this because I am only just coming to terms with the horrible fact that some people are simply not mentally capable of working in today's world. The homeless and unemployed basically represent a whole swath of people unable to function because menial jobs are being eradicated. This is going to leave a bigger and bigger slice of the population aimless. We are already talking about a MASSIVE 5% today who struggle with tasks most people wouldn't even have to think about.

In this sense we may very well see people like this being "carried" along by AI systems so they can feel useful to society. This leads to many difficult issues concerning ethics, and how these people would "feel" like they were productive when they were in fact just being "carried" along. The funny thing is that this is pretty much how consciousness tends to function! Our sense of "authorship", depending on how you're inclined to view the situation, is often "false".

I am certainly not worried about machines taking over the world. Before that even becomes an issue we'll have other problems.

Also, I mentioned some time ago a chess competition with teams of humans and computers, one human and one computer per team. The most successful team was not the grandmaster with the most powerful machine; the most successful teams were the fastest and best machines combined with ordinary players. From this we can assume, to a degree, that our "lack of ability" and "mistakes" are part of what makes us incredibly successful as a species. If the AI knew what was good for it, I doubt it would throw away such a useful resource. Our "stupidity" may be the "genius" factor that saves us from obliteration! Hahaa! XD

Postby Serpent on September 14th, 2017, 12:10 am 

BadgerJelly » September 13th, 2017, 10:15 pm wrote:
Serpent -

Then we are talking about a completely alien thing, hardly worth calling "consciousness" imo.

Why? Does that include the natives of other planets? They're alien.

BadgerJelly wrote:
I don't really consider consciousness, as we know it, to be anything other than part of a sensory system.

Okay. What's wrong with a sensory system that was made, rather than evolving?

BadgerJelly wrote:
If it cannot have a physical sense of "being", then it seems completely meaningless to say "consciousness".

Two things: what, then, is the "meaning" of consciousness?
And why can't a computer or robot or network have a physical sense of being?

BadgerJelly wrote:
A sense of space, time and other are essential to consciousness.

What prevents an AI having a sense of space and time?

BadgerJelly wrote:
By a sense of "other" I am referring in part to the limitation of perception and control. Any system has to exist within a larger system and be aware of its limited control in order to become conscious.

But an AI also exists within a larger containing system, and is influenced and controlled by "another" - the human operator.

BadgerJelly wrote:
I think today we're already quite aware, as in various periods of human history, that our sense of identity is under threat.

Our sense of identity is only ever under threat from our own imagination. Nobody else ever questioned or attacked or cared about it.

BadgerJelly wrote:
I imagine that if we integrate with computer systems, we may see two equally disruptive things happen. The first is perhaps more obvious to most people, the so-called "hive mind";

I don't see that as a serious possibility.

BadgerJelly wrote:
but what I think would also happen is that individuals' sense of identity may be prone to splinter into many fragments

Some of us already do that, without computers having any part in it. We do that through traumatic experiences.

BadgerJelly wrote:
In this sense we may very well see people like this being "carried" along by AI systems so they can feel useful to society. This leads to many difficult issues concerning ethics, and how these people would "feel" like they were productive when they were in fact just being "carried" along. The funny thing is that this is pretty much how consciousness tends to function! Our sense of "authorship", depending on how you're inclined to view the situation, is often "false".

That's a complicated social issue, which deserves its own platform, rather than being treated as an adjunct to AI and its problems.

BadgerJelly wrote:
Also, I mentioned some time ago a chess competition with teams of humans and computers, one human and one computer per team. The most successful team was not the grandmaster with the most powerful machine; the most successful teams were the fastest and best machines combined with ordinary players. From this we can assume, to a degree, that our "lack of ability" and "mistakes" are part of what makes us incredibly successful as a species. If the AI knew what was good for it, I doubt it would throw away such a useful resource. Our "stupidity" may be the "genius" factor that saves us from obliteration! Hahaa! XD

Now, that's both relevant and very interesting!