Some thoughts about why we actually need to experience pain and what a clever computer might do along the same lines.
In animals with enough of a nervous system to have some intelligent control over their place in the world, it is clearly adaptive to feel pain. Pain is unpleasant by design and forces one to pay attention to it and to its probable cause, even if the decision is to ignore it.
And while pain can arise from almost anywhere in the body and take all kinds of forms, we find it convenient to group all these experiences under the rubric ‘pain’. They have enough in common for this grouping to be helpful in dealing with them.
Note that we also have itch, another unpleasant sensation which, while running on largely parallel paths, is also intimately connected to pain. But it is not relevant to the present discussion.
It seems likely that pain or discomfort are among the first things that a human baby is conscious of, perhaps linked to the motor activities of kicking and waving one’s arms and legs about, grimacing, moaning, whining and screaming. Not to mention all kinds of other, internal activities. And if we include the pleasure of relief from pain or discomfort, this is perhaps the ultimate source of many of our feelings and emotions.
Note that pain is not perfect. Its intensity is not very well modulated. It can, for example, be intense when one cannot do anything about it, which is not very helpful. On the other hand, pain can be absent when something inside is going very wrong, which is not very helpful either. So there is clearly room for improvement.
So now let us suppose that our computer is complicated and sophisticated enough for there to be some point in it having pain. We then speculate below about how this might be done.
Let us suppose that there is an item of data in the computer called P, for pain, made up of a dozen or so binary bits which are interpreted as a non-negative real number. A number which varies with time: zero for no pain, high values for lots of pain.
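By way of illustration only, a minimal sketch in Python of what such an item of data might look like. The twelve bits follow the dozen or so mentioned above, but the 0 to 100 scale and the names are made up for the purposes of the example; nothing here is claimed about how a brain actually does it.

```python
# A sketch of the pain register P: a dozen bits interpreted as a
# non-negative number. Bit width and scale are illustrative choices.
class PainRegister:
    BITS = 12                    # "a dozen or so binary bits"
    MAX_RAW = (1 << BITS) - 1    # the largest value the bits can hold
    SCALE = 100.0 / MAX_RAW      # map the raw bits onto a 0..100 pain scale

    def __init__(self):
        self.raw = 0             # zero for no pain

    @property
    def value(self) -> float:
        return self.raw * self.SCALE

    def set_raw(self, raw: int) -> None:
        # clamp so that P can neither go negative nor overflow its bits
        self.raw = max(0, min(self.MAX_RAW, int(raw)))
```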
Note that the brain may store information more by way of a process than by way of data, which is how most computers do it. But I do not think that this distinction is important in this context.
When something happens in the computer which we want to treat as pain, something which the computer cannot just deal with locally, the relevant process is allowed to increment the value of P. There will be lots of such processes, as there are lots of things in the computer which can go wrong or be damaged.
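In a conventional, multi-threaded computer there would be many such processes writing to P at much the same time, so the increments would need to be serialised. A sketch, with the 0 to 100 scale, the names and the example callers all made up for the occasion:

```python
import threading

# A shared pain value P which any number of processes (threads here) may
# increment when they hit something they cannot deal with locally.
P_LOCK = threading.Lock()
P = 0.0

def report_pain(amount: float) -> None:
    """Called by any sub-system that detects damage it cannot fix itself."""
    global P
    with P_LOCK:
        P = min(P + max(0.0, amount), 100.0)   # clamp to the assumed 0..100 scale

# For example, a (made-up) monitor of motor temperature might call
# report_pain(5.0) for a reading slightly out of range, while a failing
# power supply might warrant report_pain(60.0).
```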
A special process decays the value of P, in this way ensuring that the signalling of pain needs to be sustained in order to count.
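Perhaps something along the lines of the following, reusing the shared P from the previous sketch; the half-life and the tick rate are arbitrary choices of mine.

```python
import threading
import time

P_LOCK = threading.Lock()
P = 0.0

def decay_pain(half_life_s: float = 10.0, tick_s: float = 0.1) -> None:
    """Background process which decays P, so that pain only counts if
    the signalling is sustained. Half-life and tick rate are arbitrary."""
    global P
    factor = 0.5 ** (tick_s / half_life_s)   # per-tick exponential decay
    while True:
        with P_LOCK:
            P *= factor
        time.sleep(tick_s)

# threading.Thread(target=decay_pain, daemon=True).start()
```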
Another special process protects the value of P, in this way ensuring that this value is available, even when the computer has been badly damaged. There might be an element of push, with the value of P being broadcast to all the various parts of the computer. There might also be a raft of supporting data about the pain and where it comes from.
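The push might look something like the following sketch, in which the list of sources stands in for the supporting data and delivery carries on even when a recipient is damaged. The names are my own inventions.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class PainReport:
    level: float        # the current value of P
    sources: List[str]  # supporting data, e.g. ["left gripper motor", "battery 2"]

# every part of the computer that wants the broadcast registers a callback here
subscribers: List[Callable[[PainReport], None]] = []

def broadcast_pain(report: PainReport) -> None:
    for deliver in subscribers:
        try:
            deliver(report)
        except Exception:
            pass            # a damaged recipient must not block the broadcast
```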
These broadcasts might be wrapped up with something like the other contents of consciousness, perhaps including a sanitised and tidied up version of the busy street scene captured by the computer’s cameras. Pulses of consciousness being broadcast around the system at a rate of very roughly one a second (1 Hz). Broadcasts which help to ensure that all the various systems and sub-systems are working from the same song sheet.
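Perhaps along these lines, with the stand-ins for the scene and the broadcast machinery being assumptions; only the roughly one-a-second rate comes from above.

```python
import time

def consciousness_loop(get_pain, get_scene, broadcast, rate_hz: float = 1.0) -> None:
    """Roughly once a second, bundle the pain level with the other contents
    of consciousness (here just a stand-in for the tidied-up street scene)
    and push the bundle out so that everyone works from the same song sheet."""
    while True:
        pulse = {
            "pain": get_pain(),      # the current value of P
            "scene": get_scene(),    # sanitised version of the camera feed
            "timestamp": time.time(),
        }
        broadcast(pulse)
        time.sleep(1.0 / rate_hz)
```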
Although it is not clear that brains have central systems in the way that computers can, we suppose that our computer does have central, supervisory systems, running pretty much all the time. We are not expecting it to manage without. So, when pain is detected by central systems, various processes are initiated to think about possible ways to deal with the pain – or reasons why the pain should be ignored – and yet another special process arbitrates, deciding what, if anything, to do. This special process will have special powers, being able to withdraw whatever thinking or action resources might be necessary from other parts of the system. So the computer might have to put some routine maintenance activity – for example, rebuilding some index, taking some update from Microsoft or a conference call with Sergey Brin – on hold for a bit.
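The arbitration might, in caricature, look something like this; the threshold, the scoring of candidate responses and the example tasks are all made up for the purposes of illustration.

```python
def arbitrate(pain_level, candidate_responses, maintenance_tasks,
              attention_threshold: float = 20.0):
    """candidate_responses: (description, estimated_benefit) pairs proposed
    by the various thinking processes. maintenance_tasks: functions which
    pause routine work. Threshold and scoring are illustrative only."""
    if pain_level < attention_threshold or not candidate_responses:
        return None                            # ignore the pain, carry on
    for pause_task in maintenance_tasks:
        pause_task()                           # withdraw resources from routine work
    return max(candidate_responses, key=lambda r: r[1])

# e.g. arbitrate(45.0, [("rest the damaged gripper", 0.8), ("press on", 0.2)],
#                [pause_index_rebuild, pause_software_update])
```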
The design of the computer will need to strike a sensible balance between registering pain and dealing with it – and carrying on regardless. Being at either extreme – either fussing about the slightest pain or carrying on with the picnic up the mountain without dealing with the badly infected blister on a big toe – will not be adaptive.
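One simple way of striking such a balance would be a pair of thresholds with some hysteresis, so that the system neither fusses over every twinge nor ignores the blister. The numbers are arbitrary.

```python
def update_attention(pain_level: float, attending: bool,
                     raise_at: float = 30.0, settle_at: float = 10.0) -> bool:
    """Start attending to pain only when it crosses an upper threshold and
    go back to business as usual only once it has fallen well below a
    lower one. Both thresholds are arbitrary, illustrative values."""
    if not attending and pain_level >= raise_at:
        return True      # time to stop and look at that toe
    if attending and pain_level <= settle_at:
        return False     # dealt with, or faded: carry on with the picnic
    return attending
```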
Supposing that our computer is very expensive, reliability in this matter will be important – although bearing in mind the qualification that for success out there in the jungle, it is enough to be good enough; an individual does not need to be perfect for the species to thrive. And there might be trade-offs between doing a good job on pain and other desiderata. That apart, it will be important that the processing of pain is not compromised, even when there is a lot wrong with the rest of the computer. Part of this will be ensuring that component processes, including the pain-processing processes themselves, do not stop processing interrupts and do not become non-responsive – in the way that Windows applications like Word can become non-responsive when they get into trouble, or in the way that people do when they are overwhelmed by pain. Such a process might run away with all the resources, in particular the central resources, thus blocking the proper and orderly processing of pain. Another part will be the placement of the various special processes in special, safe places, deep inside the system, the computer’s version of the brain stem.
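The non-responsiveness part of this might be handled by something like a watchdog, with each critical process, the pain-processing processes included, checking in regularly and being restarted if it goes quiet. The timings and the restart mechanism here are assumptions.

```python
import time

last_heartbeat: dict = {}        # process name -> time of last check-in
HEARTBEAT_TIMEOUT_S = 2.0        # an arbitrary, illustrative timeout

def heartbeat(process_name: str) -> None:
    """Called regularly by each critical process to show it is still responsive."""
    last_heartbeat[process_name] = time.time()

def watchdog(restart) -> None:
    """Runs in a safe place, deep inside the system: restarts any process
    which has stopped checking in, rather than letting it hog resources."""
    while True:
        now = time.time()
        for name, seen in list(last_heartbeat.items()):
            if now - seen > HEARTBEAT_TIMEOUT_S:
                restart(name)    # e.g. kill and relaunch the stuck process
        time.sleep(0.5)
```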
We could build a computer or a robot which did this sort of thing right now, one which did better in some respects than natural pain, and we would be quite clear that the computer or robot was not conscious, that it was not actually feeling the pain, even though it was behaving as if it did. So the question of present interest is: what is the value-add of actually feeling the pain? Why did evolution go to the bother of inventing feelings when it seems that a modern computer can manage pretty well without?
One answer is that feelings are a by-product. They don’t do anything, they just happen. A variation would be that feelings and sensations in general are useful, even if those of pain are not. They just come with the package as a whole. Our present assumption is that these are not the right answers.
Hopefully, to be continued.