AI ≠ Sentience. That pretty much says it all, but the dialog about AI almost always turns out to be not about AI itself, but about AI reaching Sentience. The killer-robot syndrome.
But the future will entail many and varied AIs, few of which will need to be Sentient AIs. This doesn't mean that an AI will not converse with you in the manner most Humans do; it will just fall short of choosing to kill you, just because you cast aspersions on its parentage.
That will open the courts to trying to determine which AIs are Sentient enough to have legal rights, which ones merely need to be reprogrammed, and whether an AI may choose to receive system updates or not.
Scary? Maybe, but we will be interacting with AIs very soon, sooner than most will believe possible. And we will get used to it, and its familiarity will naturally lead to Sentience being the norm.
The diversity of humans and other creatures is often a cause of wonder, and this diversity is often reflected in our understanding of Artificial Intelligence (AI). Huge strides have been made in this field, but somehow the fundamental differences among Humans obscure the commonality of the 'Human' experience.
One of these factors is Sight, the act of seeing. We overlook it day to day, but for an AI, or any robotic device, intelligent or not, vision is fundamental. Vision, our ability to see what other humans see, is a basic element of language and communication. Try describing the Color Red to a blind person, and you will quickly see the issue. No artificial 'Eye', sensor, or camera in the AI/Robotic world 'sees' like we do, nor do any two AI or robotic devices share the same 'vision' devices.
Explaining 'Red' to the blind is the same problem as one AI trying to explain 'Red' to another AI. Complex is not a big enough word for it.
The solution is problematic: the Technology of Seeing needs to become a common denominator within the AI community. Current vision systems are at best a mixed bag, and require an upgrade and a standardization that is currently lacking. And while the vision information obtained from the Human Eye and a Robotic replacement might attain equality, they may never contain the same data, due to differences in the technology. What must happen is that common robotic vision devices (eyes) become good enough, and interchangeable enough, that different AIs can resolve the color 'Red' the same way, paving the way for a common communication interchange regarding the external world.
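A minimal sketch of the idea above: two hypothetical sensors report different raw RGB readings for the same red surface, and a per-device calibration (here, simple illustrative channel gains, not any real device's profile) maps both into one shared canonical color space so that two AIs can agree on 'Red'.

```python
def calibrate(raw, gains):
    """Map a device-specific raw RGB triple into a shared canonical space."""
    return tuple(round(c * g) for c, g in zip(raw, gains))

# Hypothetical raw readings of the same red surface by two different sensors.
sensor_a_raw = (200, 30, 20)
sensor_b_raw = (160, 24, 16)

# Hypothetical per-device gains, as if each sensor had been calibrated
# against the same reference target (values are illustrative only).
gains_a = (1.0, 1.0, 1.0)
gains_b = (1.25, 1.25, 1.25)

canonical_a = calibrate(sensor_a_raw, gains_a)
canonical_b = calibrate(sensor_b_raw, gains_b)
print(canonical_a == canonical_b)  # True: both devices now agree on 'Red'
```

Real color calibration is far messier (spectral response, white balance, gamma), but the shape of the fix is the same: agreement comes from a shared standard, not from identical eyes.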
Yesterday, while talking with a colleague, I was trying to get across the idea that most 'programmers' don't understand what goes on inside a computer. His response was, "Does it matter any more?" and while it took me aback, I had to respond, "No!" After sleeping on it, I came to a revelation of sorts.
Current IT is equivalent to being a Hot-dog vendor on the street.
And while we IT/CS folk might try to elevate our profession to demigod status, we are merely vendors of what the computer can DO! We don't create the computer; we splash condiments on the hot-dog and sell it as computing. We don't even make the condiments anymore. Call them libraries: functions written by gnomes in dark caves. And don't even mention the buns, the dressing, the wrapping. Beyond us.
In the early days of computing, the common question was, "What do I use my computer for?" And the first answer was often that you could put your cooking recipes in it, creating the first cookbook you needed to plug in. The computer is still the same; the cookbook has just gotten more sophisticated.
I have harped for years that the 'hardware' of computing has crippled real advances in computing. More and more systems opt for the generic Hot-dog, choosing instead to dress it up with ever more intriguing spices and toppings, things like AI and Neural Networks. While these are more sophisticated and sexy, they are more or less toppings on the same Hot-dog.
While working on a TCP/IP problem today, I was finally struck by the fact that we have, for all intents and purposes, exhausted the entire IPv4 addressing space. I knew it was coming, years ago, but now, while testing IP addresses, it dawned on me.
You can now pick any arbitrary set of numbers nnn.nnn.nnn.nnn and expect a response. Ping them, probe them: something will be there, or the address is being held. All gone. This is the equivalent of spitting into the middle of the ocean while swimming; you are going to hit ocean.
4,294,967,296 (2³²) addresses gone, 4 billion addresses in use…
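The arithmetic checks out directly, and Python's standard `ipaddress` module will confirm the size of the whole IPv4 space:

```python
import ipaddress

# An IPv4 address is a 32-bit unsigned integer, so the whole space is 2**32.
total = 2 ** 32
print(total)  # 4294967296

# The all-encompassing network 0.0.0.0/0 covers every IPv4 address.
net = ipaddress.ip_network("0.0.0.0/0")
print(net.num_addresses)  # 4294967296
```

(IPv6, by contrast, has 2¹²⁸ addresses, which is why it was created to relieve exactly this exhaustion.)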
I was reminded this Christmas Holiday season that computers do not 'know' any human language, only binary, and that it takes humans to provide the translation from the machine into something human-readable. And while most computer programming languages are 'English'-like, they need not be in the English Language. It's just what happened first, and could be changed to another language at any time.
This came to me in an inspired way, by listening to Carols, where non-native speakers were singing in Latin, and other non-English speakers were singing in English, or German, or French. You can sing in a language without knowing how to speak it.
I suspect that is the same method by which most non-English speakers program computers in 'English-like' programming languages: by layering another translation over the programming. Or, as in singing, which uses a part of the brain different from the part that provides language skills, another part of the brain is used to converse with computers. Thus making the point that people who program do think with altered brains.
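The point that 'English-like' keywords are arbitrary surface tokens can be sketched in a few lines. This toy preprocessor (hypothetical keyword choices, not any real language) maps French-flavored keywords onto Python before the interpreter ever sees them; the machine only ever executes bytecode, never 'English':

```python
# Hypothetical French-to-Python keyword table for illustration only.
KEYWORDS = {"si": "if", "sinon": "else", "définir": "def", "retourner": "return"}

def translate(source):
    """Swap whitespace-separated keywords; everything else passes through."""
    return " ".join(KEYWORDS.get(token, token) for token in source.split())

french_line = "si True : resultat = 1"
python_line = translate(french_line)
print(python_line)  # if True : resultat = 1

namespace = {}
exec(python_line, namespace)
print(namespace["resultat"])  # 1
```

A real localized language would need a proper tokenizer, but the principle holds: the binary that finally runs is identical regardless of which human words the source borrowed.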
Note to self: the computer is not built to do anything other than execute instructions. Hardware advances over the years have only improved the CPU's ability to gather instructions; it does not make decisions about what to execute, or in what order to execute it. That organization comes from the basic boot loader, in combination with the operating system loaded.
There are no elements of artificial intelligence built into the hardware; it has no ability to reprogram itself or to change its wiring. External forces must be applied to force change, either by altering microcode in the core of the CPU (should that be possible) or by execution of programs within the confines of the operating system: instructions provided by the boot loader, or via the loaded operating system and its executing programs. It is those processes that constitute what a computer does with what it 'sees'.
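The fetch-and-execute loop described above can be sketched in miniature. This toy simulator (an invented four-instruction set, purely illustrative) shows that the 'machine' is just a loop blindly running whatever the program counter points at; all apparent decision-making lives in the instruction stream it was handed:

```python
def run(program):
    """Fetch-decode-execute loop over a toy instruction set.

    Each instruction is an (opcode, argument) pair; the single
    accumulator register holds the running result.
    """
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]          # fetch whatever pc points at
        if op == "LOAD":               # decode + execute, blindly
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "JMPZ":             # branch: jump if accumulator is zero
            if acc == 0:
                pc = arg
                continue
        elif op == "HALT":
            break
        pc += 1                        # otherwise advance to next instruction
    return acc

# The hardware never 'chooses'; it runs what it is given.
print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))  # 5
```

Even the `JMPZ` branch is not a decision by the machine in any meaningful sense: it is a fixed rule applied to state that the program itself produced.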
Any hope of producing the next generation of computing must therefore be a revolution in how the CPU is instructed to perform its instructions, what is done with the output, and any associated hardware connected to the system to perform 'tasks' assigned by that process. The argument over whether Windows, Linux/Unix, or any other operating system is better than another DOES create opportunities and restrictions unique to any new programming or computing Paradigm.
Anything like artificial intelligence will have to be preceded by a new suite of hardware, with a new way of 'booting' the system and/or an entirely new operating system tailored to artificial intelligence operations. Current hardware/software standardization is at once the primary blockage to any future advance in computing.
UPDATE #1: IBM Creates Custom-Made Brain-Like Chip