Yeah, probably off topic a bit. But we would have to agree on what could be achieved with AI in order to decide on how it could be used in space. Maybe someday AIs will be discussing how we can be used in space applications. I live in fear of their conclusion.
Quote from: ppnl on 11/23/2022 12:56 am
Yeah, probably off topic a bit. But we would have to agree on what could be achieved with AI in order to decide on how it could be used in space. Maybe someday AIs will be discussing how we can be used in space applications. I live in fear of their conclusion.

Nothing based on any existing technology has any hope of ever evolving into something like what you're imagining. An effort to develop human intelligence using current AI technology is like trying to get to the moon using ancient Babylonian technology (i.e. building a tower tall enough to reach the moon). You simply cannot scale it up that far. Nowhere near.
Quote from: Greg Hullender on 11/23/2022 01:18 am
Nothing based on any existing technology has any hope of ever evolving into something like what you're imagining. An effort to develop human intelligence using current AI technology is like trying to get to the moon using ancient Babylonian technology...

The Babylonian technology that matters is agriculture. That lets you expand the population, so you have enough one-in-a-million geniuses (or a million non-geniuses) that you can refine copper, steel, silicon, ... . Five millennia later you're on the Moon.

Time scales may vary.
Whatever the disadvantages... machines don't automatically need food, water, and oxygen. But how can we bootstrap such applications here on Earth, in order to eventually extend them into space?
Quote from: Greg Hullender on 11/23/2022 01:18 am
. . . An effort to develop human intelligence using current AI technology is like trying to get to the moon using ancient Babylonian technology (i.e. building a tower tall enough to reach the moon). You simply cannot scale it up that far. Nowhere near.

Until maybe 5-10 years ago I'd have agreed, but the recent 'Deep Learning' boom has shown that techniques long disregarded as woefully inefficient and inelegant can be perfectly adequate at performing real-world tasks if you can throw enough hardware at the problem. Human consciousness arises from a massively parallel array of incredibly stupid individual neurons working kinda-sorta in tandem and sometimes producing the 'correct' result, so I rather suspect any human-like AI will arise from a similarly messy pile of hacks and excessive use of compute resources rather than some elegant new simulation technique.
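To make the "incredibly stupid individual neurons" point concrete, here's a minimal sketch (mine, not from any post in the thread) of the kind of single artificial neuron that deep-learning systems stack by the millions; the inputs, weights, and bias are made-up numbers:

```python
import math

def neuron(inputs, weights, bias):
    # One artificial neuron: a weighted sum of inputs followed by a squashing
    # nonlinearity. Individually trivial -- deep-learning systems get their
    # results from stacking millions or billions of these and brute-force
    # tuning the weights during training.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # logistic (sigmoid) activation

# Made-up example: three inputs, arbitrary weights and bias.
print(neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))
```

Each unit does nothing clever on its own; whatever capability emerges comes from the sheer number of them and the compute thrown at training, which is exactly the "messy pile of hacks" picture above.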
Where I disagree is with the idea that intelligence will "arise" from a sufficiently large system. I see that as no more likely than new life arising because a kid mixed all the chemicals in his chemistry set together. I know the notion of "emergent behavior" in software systems is popular with a lot of people, but I pretty much reject it outright.
Quote from: sanman on 12/01/2022 05:51 am
Whatever the disadvantages... machines don't automatically need food, water, and oxygen. But how can we bootstrap such applications here on Earth, in order to eventually extend them into space?

Pretty sure they "automatically" need electricity.

As to "bootstrapping", that implies sentience.
Quote from: Greg Hullender on 12/02/2022 03:01 pm
Where I disagree is with the idea that intelligence will "arise" from a sufficiently large system.

But do you have an argument for such an opinion?
A neuron is just a physical object that obeys physical laws. The Church-Turing thesis suggests that a computer program should be able to emulate the function of a neuron. While emulating a hundred billion neurons with a thousand trillion interconnections is challenging, there is no new physics here as far as we can tell. Therefore it seems to be mostly an engineering problem.

The transistor was invented in 1947. It was around the size of a matchbox. In the span of one lifetime we have learned how to put over five billion of them on a little chip of silicon. Before 1947 such a thing would have consumed a large fraction of the electrical output of the entire US, and the waste heat would have torched an entire city. I have a dozen of these things operating in my house now. I see no reason that A.I. cannot proceed similarly. A.I. will be embedded in just about every piece of technology we produce. Asking how it can be used in space is like asking how transistors can be used in space.
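As a rough sanity check on the scale ppnl describes, here's a back-of-the-envelope estimate. The neuron and synapse counts come from the post above; the per-synapse update rate and per-update cost are my own assumptions, so treat the result as an order-of-magnitude illustration only:

```python
# Order-of-magnitude estimate of the raw arithmetic needed to emulate a brain.
# From the post: ~1e11 neurons with ~1e15 interconnections (synapses).
# Assumed, not from the post: ~100 synaptic updates per second on average,
# at roughly one floating-point operation per update.
synapses = 1e15
updates_per_second = 100.0
flops_per_update = 1.0

required_flops = synapses * updates_per_second * flops_per_update
print(f"~{required_flops:.0e} FLOP/s needed")  # ~1e17 FLOP/s, i.e. ~100 petaFLOP/s
```

Under those (very rough) assumptions the raw throughput lands in the same ballpark as today's largest supercomputers, which is consistent with the "engineering problem" framing -- though, as the replies below argue, raw operations per second may not be the piece that's actually missing.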
Quote from: JohnFornaro on 12/01/2022 10:40 am
Quote from: sanman on 12/01/2022 05:51 am
Whatever the disadvantages... machines don't automatically need food, water, and oxygen. But how can we bootstrap such applications here on Earth, in order to eventually extend them into space?
Pretty sure they "automatically" need electricity.

Sure they need electricity, and there's plenty to be had ...
But wait. There's also plenty of "food, water, and oxygen" to be had. What point were you making?
Quote from: ppnl on 12/02/2022 10:19 pm
But do you have an argument for such an opinion? A neuron is just a physical object that obeys physical laws. The Church-Turing thesis suggests that a computer program should be able to emulate the function of a neuron. While emulating a hundred billion neurons with a thousand trillion interconnections is challenging there is no new physics here as far as we can tell. Therefore it seems to be mostly an engineering problem....

I'm not arguing that intelligence is supernatural--just that we don't have the foggiest idea how to engineer such a thing. Nor is it reasonable to suppose that if we just make our computing systems bigger they'll somehow magically become intelligent.
Trust me on this--lack of computing power is not what stops current (or foreseeable) AI technology from having human-level intelligence.
It's easy to call it an engineering problem, but it's one where something critical is clearly missing from our current systems, and we have no idea what that is.
"Trust me on this" -- While you may assert that we [or you?] know the full extent of human intelligence, and the computing power and complexity of the human brain, you and we do not.
I don't believe you have to know "the full extent of human intelligence" to see this. You just need a broad understanding of the current state of the art in AI.