Quote from: sanman on 01/26/2025 11:09 am
Quote from: Star One on 01/25/2025 08:06 am
No wonder they were able to build it so quickly and cheaply then.
Yes, but it now exposes an Achilles' heel in the business model of these Big AI companies (OpenAI, Google, Meta, etc). The cheapsters can now rip off and further improve upon the big models that had cost billions to create.

a.k.a. "We Have No Moat"

The models released by DeepSeek show significant improvements over OpenAI's GPT-4o1, while having been made very cheaply. Furthermore, not only were they released under an open source license, this applies to commercial use as well. Even businesses can use those models for free.

The feedback loop of AI improving AI was going to happen anyway. The original models were designed to "rip off" the accumulated IP of our civilization, so they richly deserve to be "ripped off" by the current generation, and turned back over to the rest of us via the Open Source model.
Also, is this a sign the whole AI bubble is about to burst?

What's more impressive still is that it was built on relatively low-power chips, because of the chip ban on China, which calls into question the constant quest for ever more powerful hardware using ever more power. TBH this last point seems like good news for things like spaceflight, if you can run powerful AI applications on older hardware with lower power requirements.
Quote
Investors punished global tech stocks on Monday after the emergence of a Chinese chatbot competitor to OpenAI’s ChatGPT, DeepSeek, raised doubts about the sustainability of the US artificial intelligence boom.

The tech-heavy Nasdaq index in New York opened lower after investors digested the implications of the latest AI model developed by the startup DeepSeek.
https://www.theguardian.com/business/2025/jan/27/tech-shares-asia-europe-fall-china-ai-deepseek
Power companies also got a soaking on the markets.

Quote
Power companies that are most exposed to the tech sector’s data center boom plunged Monday, as the debut of China’s DeepSeek open-source AI laboratory led investors to question how much energy artificial intelligence applications will actually consume.
https://www.cnbc.com/2025/01/27/power-stocks-plunge-as-ai-energy-need

I’m glad a disruptive element has come along, as the sector was already looking complacent. It also brings into question the US approach of throwing more and more money, hardware and power at the issue to try and get progress.
Quote from: InterestedEngineer on 01/27/2025 07:39 am
Does that mean that energy is to the fourth power of time decrease?

Yes, if you hold mass ratio constant then the energy scales as the inverse fourth power of travel time.

So if you want to use the same propellant to get you there in half the time, you need 16x as much energy (and of course, 2^5 = 32x as much power).
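A minimal arithmetic sketch of the step from the energy claim to the power claim, writing E(t) for the trip energy and P(t) for the average power over a trip of duration t, and taking the inverse-fourth-power scaling asserted above as the premise rather than re-deriving it. The only step added here is dividing by the travel time:

E(t) \propto t^{-4}, \qquad P(t) = \frac{E(t)}{t} \propto t^{-5}

\frac{E(t/2)}{E(t)} = 2^{4} = 16, \qquad \frac{P(t/2)}{P(t)} = 2^{5} = 32

Halving the travel time therefore multiplies the required energy by 16 and the average power by 32, matching the figures quoted above.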
Cross posting

Quote from: Twark_Main on 01/28/2025 11:28 pm
Quote from: InterestedEngineer on 01/27/2025 07:39 am
Does that mean that energy is to the fourth power of time decrease?
Yes, if you hold mass ratio constant then the energy scales as the inverse fourth power of travel time.
So if you want to use the same propellant to get you there in half the time, you need 16x as much energy (and of course, 2^5 = 32x as much power).

The 2020s version of TL;DR: here's the prompt you feed ChatGPT to derive the 5th power:

Quote
Given a constant mass ratio of a rocket, show me how the energy rate scales with the 5th power of travel time between two points when using a conventional rocket

Alas it's not possible to paste the result here. Try it yourself. It appears to be a valid derivation.

Note it's the inverse of the 5th power of the travel time. Even the wrong prompt will get you the right answer.

I need to make a "let me ChatGPT that for you" website.
Who knew that our glorious spaceflight future wasn't Star Wars or Star Trek, but Dark Star...
Quote from: InterestedEngineer on 01/29/2025 06:30 pm
Cross posting
Quote from: Twark_Main on 01/28/2025 11:28 pm
Quote from: InterestedEngineer on 01/27/2025 07:39 am
Does that mean that energy is to the fourth power of time decrease?
Yes, if you hold mass ratio constant then the energy scales as the inverse fourth power of travel time.
So if you want to use the same propellant to get you there in half the time, you need 16x as much energy (and of course, 2^5 = 32x as much power).
The 2020s version of TL;DR: here's the prompt you feed ChatGPT to derive the 5th power:
Quote
Given a constant mass ratio of a rocket, show me how the energy rate scales with the 5th power of travel time between two points when using a conventional rocket
Alas it's not possible to paste the result here. Try it yourself. It appears to be a valid derivation.
Note it's the inverse of the 5th power of the travel time. Even the wrong prompt will get you the right answer.
I need to make a "let me ChatGPT that for you" website.

Cross snarking:

Wait, do we really need ChatGPT now to know that 16x as much energy in half the time is 32x as much power?? It's worse than I thought....

Edit: also, as far as historical availability later on, "just try it yourself with the latest Gen AI" ages amazingly poorly. Most of the early models aren't even accessible anymore. If you want to evaporate multiple cups of coffee on elementary physics, the least you could do is share with the class.

"Alas paste is impossible" is pretty weak, especially since you're apparently so proud of this you're reposting it. What, you can't... take a screenshot??

(without ChatGPT reminding you, I mean)
Quote from: Twark_Main on 01/31/2025 09:37 am
The concern is germane. If this is the answer to the matter of "this is your brain on ChatGPT," maybe the question we should be more concerned with is, "what is left for brains to be used for in Space Applications, apart from slavishly and unquestioningly following AI instructions?" Who is the real user now, the man or the machine?

Our relationship may become symbiotic. We'll continue to endow and entrust AI with more and more capabilities, so long as it advances our collective interests.

That will continue until the Zeroth Law comes into effect. After that we'll become like fleas clinging to a dog's back.
Also, seeing the formal derivation is far different from waving one's hand and saying "yep, it's the 5th power." We expect students to show their work. Note most of the work is proving the energy part goes as the 4th power.
Quote from: InterestedEngineer on 01/31/2025 06:26 pm
Quote from: Interested Engineer
Given a constant mass ratio of a rocket, show me how the energy rate scales with the 5th power of travel time between two points when using a conventional rocket
Alas it's not possible to paste the result here. Try it yourself. It appears to be a valid derivation.
Note it's the inverse of the 5th power of the travel time. Even the wrong prompt will get you the right answer.

I just axed Grok. Conclusion:

Quote from: Grok
The energy rate required for a rocket to travel between two fixed points scales with the 5th power of the travel time when considering a constant mass ratio and maintaining the same velocity change.

Thanks, I said, but that long explanation is way over my head!

Quote from: Grok
Thanks, but that's way over my head!
Got it, let's simplify:
When you use a rocket to travel from one point to another:
- If you want to take longer to get there (more travel time), you need less speed but for a longer time.
- The power (energy per second) needed to keep you going for that longer time goes up a lot because:
  - You're moving slower, so you need to be pushing for much longer.
  - The engine has to work harder over this longer period to get you to the same place.
In simple terms, if you double your travel time, the energy you need each second doesn't just double; it goes up by a factor of 32 (2 to the 5th power roughly). So, longer trips require much, much more energy each second than shorter ones.
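A quick direction check on that simplified answer, again writing P(t) for the average power and taking the thread's inverse-fifth-power claim as given rather than re-deriving it: doubling the travel time divides the required power by 32 rather than multiplying it, consistent with the earlier caveat that the scaling is the inverse of the 5th power.

\frac{P(2t)}{P(t)} = \frac{(2t)^{-5}}{t^{-5}} = 2^{-5} = \frac{1}{32}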
TL;DR: in Mars/space applications we really want more than just an LLM that can answer the right question. We want a system that can 'understand' our goal and call attention to important things we may not even think to ask about.
Quote from: JohnFornaro on 01/31/2025 07:45 pm
... that long explanation is way over my head!

As far as I can see, nothing in this interaction shows that Grok "knows" that 5th power scaling is the correct answer. It could just be regurgitating the input.

If you give it the exact same prompt with other scaling exponents, does it correct the prompt?
Quote from: Twark_Main on 02/01/2025 06:49 pm
Quote from: JohnFornaro on 01/31/2025 07:45 pm
... that long explanation is way over my head!
As far as I can see, nothing in this interaction shows that Grok "knows" that 5th power scaling is the correct answer. It could just be regurgitating the input.
If you give it the exact same prompt with other scaling exponents, does it correct the prompt?

My bold. As it turns out, even the short explanation stumps me, especially now, since "regurgitation" is on the table.
Quote from: JohnFornaro on 02/04/2025 01:38 pm
Quote from: Twark_Main on 02/01/2025 06:49 pm
As far as I can see, nothing in this interaction shows that Grok "knows" that 5th power scaling is the correct answer. It could just be regurgitating the input.
If you give it the exact same prompt with other scaling exponents, does it correct the prompt?
Even the short explanation stumps me, especially now, since "regurgitation" is on the table.

I mean, try "axing" Grok or ChatGPT ... (etc., trying a bunch of different exponents other than "5th")

By "regurgitating," I'm suggesting that the LLMs haven't shown that they actually know the fifth power scaling is correct, since they might just be parroting the prompt back at you.
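A minimal sketch of that exponent-swapping test as a plain script. It only generates prompt variants built from the wording used earlier in the thread; the exponent list is an arbitrary choice for illustration, and you paste each variant into Grok, ChatGPT, or whichever model you are probing and note whether it corrects the exponent or simply "derives" whatever number it was handed.

# Generate leading-question prompt variants for the regurgitation test
# described above. No model API is called here; paste the output into the
# chatbot of your choice by hand and compare its answers across exponents.

TEMPLATE = (
    "Given a constant mass ratio of a rocket, show me how the energy rate "
    "scales with the {nth} power of travel time between two points when "
    "using a conventional rocket"
)

def ordinal(n: int) -> str:
    """Return the English ordinal string for n, e.g. 3 -> '3rd', 5 -> '5th'."""
    if 10 <= n % 100 <= 20:
        suffix = "th"
    else:
        suffix = {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
    return f"{n}{suffix}"

def prompt_variants(exponents=(3, 4, 5, 6, 7)):
    """Yield (exponent, prompt) pairs, one leading prompt per exponent."""
    for n in exponents:
        yield n, TEMPLATE.format(nth=ordinal(n))

if __name__ == "__main__":
    for n, prompt in prompt_variants():
        print(f"--- exponent: {ordinal(n)} ---")
        print(prompt)
        print()

If the model pushes back only on some exponents, or cheerfully derives all of them, that says more about prompt-following than about the physics.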