They do have a website: https://www.orbitai.global/ It claims a "Genesis" mission November 29 (with a countdown clock). Farther down it says "Launch partner: Galactic Energy" and "First satellite with onboard blockchain wallet performing the world's first in-orbit blockchain transaction signature." This feels like a pile of techy buzzwords. The DeStarAI section also has some bizarre claims like "vacuum provides natural cooling" and "Compute Capacity: ∞ TFLOPS".
Quote from: Twark_Main on 12/08/2025 01:10 am
Quote from: Vultur on 12/07/2025 07:32 pm
Using AI just for the sake of using AI isn't valuable. It only makes sense if it's cheaper than doing the same thing without AI, or gives real benefits that aren't otherwise attainable.

"Otherwise attainable" at what price? Current AI is cripplingly flawed, a technology in its infancy. But I suspect it won't be that way for too much longer.

"At what price" is exactly the question. Right now, given the amount of investment money going toward AI, end users are generally not paying the 'real' cost of it. If vastly more energy/hardware-efficient means are found, of course that changes. But vastly less energy use means the purpose of moving it to orbit goes away.
Quote from: Vultur on 12/07/2025 07:32 pm
Using AI just for the sake of using AI isn't valuable. It only makes sense if it's cheaper than doing the same thing without AI, or gives real benefits that aren't otherwise attainable.

"Otherwise attainable" at what price? Current AI is cripplingly flawed, a technology in its infancy. But I suspect it won't be that way for too much longer.
Quote from: Twark_Main on 12/08/2025 01:10 am
I think "railgun" is being used here as a catch-all for any electric linear accelerator tech, as opposed to rocket launches, which when you pencil it out clearly wouldn't work.

Coil guns have major problems too... Is there any linear accelerator that would clearly be practical at multiple km/s for long enough lifetimes to compete with (say) $10-20/kg Earth launch costs?
Quote from: Vultur on 12/08/2025 04:03 am
Quote from: Twark_Main on 12/08/2025 01:10 am
Quote from: Vultur on 12/07/2025 07:32 pm
Using AI just for the sake of using AI isn't valuable. It only makes sense if it's cheaper than doing the same thing without AI, or gives real benefits that aren't otherwise attainable.

"Otherwise attainable" at what price? Current AI is cripplingly flawed, a technology in its infancy. But I suspect it won't be that way for too much longer.

"At what price" is exactly the question. Right now, given the amount of investment money going toward AI, end users are generally not paying the 'real' cost of it. If vastly more energy/hardware-efficient means are found, of course that changes. But vastly less energy use means the purpose of moving it to orbit goes away.

I think that depends on whether "vastly less" means 4x or 100x. I suspect it's closer to the former. Steam engines were around 5-10% efficient before we developed a theory of heat engines, whereas the absolute limit is closer to 40-60%. That's not 100x.
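[As a rough sanity check on that steam-engine ceiling: the "absolute limit" for a heat engine is the Carnot bound, 1 - Tc/Th. The temperatures below are illustrative assumptions of mine, not figures from the thread.]

```python
# Illustrative Carnot-limit check (assumed temperatures, not from the post).
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum possible heat-engine efficiency between two reservoirs (Kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# A superheated steam plant at ~550 C (823 K) exhausting near ~30 C (303 K):
eta = carnot_efficiency(t_hot_k=823.0, t_cold_k=303.0)
print(f"Carnot limit: {eta:.0%}")  # ~63%, consistent with a 40-60% practical ceiling
```

So the gap between early 5-10% engines and the theoretical bound was roughly an order of magnitude, not two.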
If only we had a company with demonstrated success in blue-sky engineering space technologies that have never existed before...
Quote from: Twark_Main on 12/08/2025 02:24 pm
Quote from: Vultur on 12/08/2025 04:03 am
Quote from: Twark_Main on 12/08/2025 01:10 am
Quote from: Vultur on 12/07/2025 07:32 pm
Using AI just for the sake of using AI isn't valuable. It only makes sense if it's cheaper than doing the same thing without AI, or gives real benefits that aren't otherwise attainable.

"Otherwise attainable" at what price? Current AI is cripplingly flawed, a technology in its infancy. But I suspect it won't be that way for too much longer.

"At what price" is exactly the question. Right now, given the amount of investment money going toward AI, end users are generally not paying the 'real' cost of it. If vastly more energy/hardware-efficient means are found, of course that changes. But vastly less energy use means the purpose of moving it to orbit goes away.

I think that depends on whether "vastly less" means 4x or 100x. I suspect it's closer to the former. Steam engines were around 5-10% efficient before we developed a theory of heat engines, whereas the absolute limit is closer to 40-60%. That's not 100x.

I don't think this is like engines at all, though. A human brain is something like 20W (~20-25% of a basal metabolic rate of 80-100W). So "intelligence" doesn't have a physical requirement to be crazy power-hungry.
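[The brain-power figure above checks out arithmetically. A quick sketch using only the ranges given in the post:]

```python
# Quick arithmetic check of the brain-power figure (ranges from the post).
bmr_watts = (80.0, 100.0)        # basal metabolic rate range, watts
brain_fraction = (0.20, 0.25)    # share consumed by the brain, per the post

low = bmr_watts[0] * brain_fraction[0]   # 16 W
high = bmr_watts[1] * brain_fraction[1]  # 25 W
print(f"brain power: {low:.0f}-{high:.0f} W")  # 16-25 W, so ~20 W is a fair midpoint
```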
Quote from: Twark_Main on 12/08/2025 02:24 pm
If only we had a company with demonstrated success in blue-sky engineering space technologies that have never existed before...

There's a difference though. SpaceX has very efficiently implemented and scaled things that were impractical before (reusability, subcooled propellant, large-scale low-latency satellite internet) to achieve the world's highest launch rate. People questioned whether these things were economical, but I don't think anyone seriously thought in, say, 2002 that they weren't physically possible, just that the market wouldn't support the cost. Some of these things had been attempted in the 90s but failed, partly due to the dotcom bubble.
The issues with coilguns may be qualitatively worse, in the sense that "the materials you need just may not exist".
If you have a reason to believe that (as Elon says) "success is not one of the possible outcomes," then perhaps you could articulate it?
Today, Aetherflux announced a Q1 2027 target for its first orbital data center satellite, which leverages solar power in space to address the massive energy needs of artificial intelligence. The project, dubbed "Galactic Brain", offers a bypass to the current five-to-eight-year time horizon for data centers to be built on Earth.

Access to energy is one of the primary bottlenecks for scaling artificial intelligence. This problem is driven by infrastructure timelines: securing real estate, establishing utility connections, and constructing new data centers can take more than half a decade.

"The race for artificial general intelligence is fundamentally a race for compute capacity, and by extension, energy. The elephant in the room is that our current energy plans simply won't get us there fast enough," said Baiju Bhatt, founder and CEO of Aetherflux and co-founder of Robinhood. "Galactic Brain puts the sunlight next to the silicon and skips the power grid entirely."

Aetherflux's first data center node for commercial use is targeted for Q1 2027; subsequent satellite launches will build a constellation of nodes to scale capacity.
Quote
"The race for artificial general intelligence is fundamentally a race for compute capacity, and by extension, energy. The elephant in the room is that our current energy plans simply won't get us there fast enough," said Baiju Bhatt, founder and CEO of Aetherflux, and co-founder of Robinhood.
Quote from: Eric Hedman on 12/08/2025 05:28 am
Everyone seems to be jumping on the bandwagon.

This is certainly true today... The question is cost. The real cost of LLM queries (energy, hardware depreciation, etc.) is currently not really being charged to users - otherwise you couldn't get Google AI Overviews, free ChatGPT use, Gemini on Android phones, etc. I think the key question is how many uses would/will survive if/when real costs have to be charged to end users.

(Or whether dramatically more energy-efficient [non-LLM?] AI technologies will be developed, which might remove the entire issue but also remove the desirability of putting AI in orbit.)
Quote from: Vultur on 12/08/2025 02:18 pm
Quote from: Eric Hedman on 12/08/2025 05:28 am
Everyone seems to be jumping on the bandwagon.

This is certainly true today... The question is cost. The real cost of LLM queries (energy, hardware depreciation, etc.) is currently not really being charged to users - otherwise you couldn't get Google AI Overviews, free ChatGPT use, Gemini on Android phones, etc. I think the key question is how many uses would/will survive if/when real costs have to be charged to end users.

(Or whether dramatically more energy-efficient [non-LLM?] AI technologies will be developed, which might remove the entire issue but also remove the desirability of putting AI in orbit.)

But we don't pay for the cost of today's "regular" data centers either, and they're not cheap. The cost is borne by the entire business model of Google. So once everyone is used to making conversational queries and getting complex results, there's no going back.
Quote from: meekGee on 12/09/2025 08:30 pm
Quote from: Vultur on 12/08/2025 02:18 pm
Quote from: Eric Hedman on 12/08/2025 05:28 am
Everyone seems to be jumping on the bandwagon.

This is certainly true today... The question is cost. The real cost of LLM queries (energy, hardware depreciation, etc.) is currently not really being charged to users - otherwise you couldn't get Google AI Overviews, free ChatGPT use, Gemini on Android phones, etc. I think the key question is how many uses would/will survive if/when real costs have to be charged to end users.

(Or whether dramatically more energy-efficient [non-LLM?] AI technologies will be developed, which might remove the entire issue but also remove the desirability of putting AI in orbit.)

But we don't pay for the cost of today's "regular" data centers either, and they're not cheap. The cost is borne by the entire business model of Google. So once everyone is used to making conversational queries and getting complex results, there's no going back.

But it's not paid for by Google's business model today; it's paid for by massive, unsustainable infusions of investment capital in hopes of returns which are very unlikely to actually materialize. (And Google results today are notably worse than a few years ago.)

It doesn't ultimately matter whether advertisers or end users are paying for it if neither can afford to pay for it. It doesn't (IMO) make any sense for the majority of resources in an economy, or at least a large fraction, to be used for glorified search. Especially if that glorified search is questionably any better than a mid-2010s search that used vastly fewer resources.

But eh. I'm not sure this line of argument is worth pursuing much further. In a few years the answer should be clear.
Quote from: Vultur on 12/09/2025 09:55 pm
Quote from: meekGee on 12/09/2025 08:30 pm
Quote from: Vultur on 12/08/2025 02:18 pm
Quote from: Eric Hedman on 12/08/2025 05:28 am
Everyone seems to be jumping on the bandwagon.

This is certainly true today... The question is cost. The real cost of LLM queries (energy, hardware depreciation, etc.) is currently not really being charged to users - otherwise you couldn't get Google AI Overviews, free ChatGPT use, Gemini on Android phones, etc. I think the key question is how many uses would/will survive if/when real costs have to be charged to end users.

(Or whether dramatically more energy-efficient [non-LLM?] AI technologies will be developed, which might remove the entire issue but also remove the desirability of putting AI in orbit.)

But we don't pay for the cost of today's "regular" data centers either, and they're not cheap. The cost is borne by the entire business model of Google. So once everyone is used to making conversational queries and getting complex results, there's no going back.

But it's not paid for by Google's business model today; it's paid for by massive, unsustainable infusions of investment capital in hopes of returns which are very unlikely to actually materialize. (And Google results today are notably worse than a few years ago.)

It doesn't ultimately matter whether advertisers or end users are paying for it if neither can afford to pay for it. It doesn't (IMO) make any sense for the majority of resources in an economy, or at least a large fraction, to be used for glorified search. Especially if that glorified search is questionably any better than a mid-2010s search that used vastly fewer resources.

But eh. I'm not sure this line of argument is worth pursuing much further. In a few years the answer should be clear.

Initial regular data centers were investment-funded too, but now they are priced in.
This is not the case for any 'AI' company thus far: operating costs exceed revenue, let alone if you then have to amortise the spending on the silicon packed into them (for perspective, just to recoup the $1.6tn invested in 'AI' thus far, every single US citizen would need to subscribe to some sort of AI service to the tune of ~$40 per month for the next decade).

LLMs are in the 'underpants gnome' phase: everyone assumes there will be some profit at some point, but not a single company has any idea what their revenue source will be, let alone has actually achieved it.
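[The ~$40/month figure above is easy to verify. A back-of-the-envelope sketch; the US population figure is my assumption, not from the post:]

```python
# Back-of-the-envelope check of the $40/month claim (population assumed).
invested = 1.6e12          # dollars invested in 'AI' so far, per the post
us_population = 333e6      # assumed US population, all subscribing
months = 10 * 12           # one decade of monthly payments

monthly_fee = invested / (us_population * months)
print(f"required fee: ${monthly_fee:.2f}/month")  # ~$40/month, matching the post
```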
Interesting tweet:
Quote from: CJHandmer
inference Starlink (Star Thought?) satellites
Quote from: Tywin on 12/08/2025 05:21 pm
Interesting tweet:
Quote from: CJHandmer
inference Starlink (Star Thought?) satellites

"StarThink" was right there.
Quote from: Vultur on 11/21/2025 03:56 am
They do have a website: https://www.orbitai.global/ It claims a "Genesis" mission November 29 (with a countdown clock). Farther down it says "Launch partner: Galactic Energy" and "First satellite with onboard blockchain wallet performing the world's first in-orbit blockchain transaction signature." This feels like a pile of techy buzzwords. The DeStarAI section also has some bizarre claims like "vacuum provides natural cooling" and "Compute Capacity: ∞ TFLOPS".

The countdown clock is now to December 10. Nextspaceflight.com doesn't show a Galactic Energy launch on December 10... though there is a CAS Space Kinetica 1 with unknown payload. *Shrug*
Quote from: edzieba on 12/10/2025 04:49 pm
This is not the case for any 'AI' company thus far: operating costs exceed revenue, let alone if you then have to amortise the spending on the silicon packed into them (for perspective, just to recoup the $1.6tn invested in 'AI' thus far, every single US citizen would need to subscribe to some sort of AI service to the tune of ~$40 per month for the next decade).

LLMs are in the 'underpants gnome' phase: everyone assumes there will be some profit at some point, but not a single company has any idea what their revenue source will be, let alone has actually achieved it.

Yeah, exactly. There is no clear path to any revenue sources of the appropriate scale.
Quote from: Vultur on 12/11/2025 03:18 am
Quote from: edzieba on 12/10/2025 04:49 pm
This is not the case for any 'AI' company thus far: operating costs exceed revenue, let alone if you then have to amortise the spending on the silicon packed into them (for perspective, just to recoup the $1.6tn invested in 'AI' thus far, every single US citizen would need to subscribe to some sort of AI service to the tune of ~$40 per month for the next decade).

LLMs are in the 'underpants gnome' phase: everyone assumes there will be some profit at some point, but not a single company has any idea what their revenue source will be, let alone has actually achieved it.

Yeah, exactly. There is no clear path to any revenue sources of the appropriate scale.

I mean, people said the same thing about the Internet. And now pretty much everyone does everything online. "Oh, sure, we'll spend $280 per month on broadband, cable TV, Netflix, Disney+, mobile data plans, etc. (source: https://www.reviews.org/internet-service/cost-of-internet-streaming-and-cell-phone-bills/), but $40 on self-driving cars, delivery robots, house cleaning, yard service, etc., is too much." Plus companies can benefit to the tune of potentially thousands of dollars per month in increased productivity per employee from doing the same, eventually.

The global e-commerce market is currently around $7 trillion in revenue per year. Americans alone drive their automobiles 3 trillion miles per year, and Americans are just 1/20th of the world population. Globally, the Internet is worth, I dunno, maybe $30 trillion or so?