Author Topic: Starlink fundamental cost per GB equation (and comparison to competition)  (Read 27155 times)

Offline r8ix

  • Full Member
  • ***
  • Posts: 306
  • Liked: 297
  • Likes Given: 94
In the urban core with multistory buildings, though, I kind of agree that Starlink won’t be more than a niche competitor.
We don't (yet) have real competition among ISPs in the US, generally, but in my last apartment in central Russia (2017) I had 120 Mbps symmetrical fiber for about $7/month ($3 in June, July, and August). I also had at least 17 providers to choose from…
Similar situation with cell service, too.
Most of the US is served by functional monopolies or duopolies when it comes to internet service, and Starlink will be very welcome as an additional option.
Adding to the lack of ISP competition is the traditional control over permits and right-of-way leases in the US, which are held by local governments. A provider often obtains an exclusive lease for telephone or cable-TV service to be strung or buried in the rights of way, and either of the two leaseholders can provide internet; in my location that is the telephone provider, AT&T, and the cable-TV provider, Mediacom. It is mostly a case of existing long-term leases and a slow, continuous technology evolution of both from wired to fiber transmission, so that both now offer 1 Gbps internet.

For another provider to come into the area, it would have to somehow obtain the permits and leases as well as string the fiber/cable. The permit and long-term leasing processes are slow and are a major roadblock for competitors putting in service. Basically, for a long time these were effectively local-government-controlled monopolies on telephone and cable-TV service. That has only recently disappeared, and what remains is a high up-front cost barrier to other providers offering service, mainly because in the US the providers own all of the infrastructure they use instead of leasing data capacity over state-owned infrastructure. Under that second model, being an ISP just requires a connection to the state-owned infrastructure plus a connection from your data center (with your DNS and other servers) to the Internet backbone.

Yes, in economics, it's known as the "last mile problem", and it is the justification for regulatory control over delivery of utilities and similar services. Turns out, however, that it mostly traces back to some sponsored research from ~150 years ago, which a company used to argue for being given the local monopoly. It became accepted wisdom, and has rarely been challenged since.

Technology has given us some workarounds, so now in many places we have tv providers that offer phone service and vice versa. Hence duopolies instead of monopolies. Adding Starlink to that mix will be a big improvement, and force some of these companies to get off their butts and start improving their offerings. Cable companies, e.g., have some of the worst customer service in the country, according to surveys of such things.

Online ccdengr

  • Full Member
  • ****
  • Posts: 713
  • Liked: 520
  • Likes Given: 81
Technology has given us some workarounds...
That must explain why cellular telephone companies, which have few or no monopolistic advantages, are so greatly beloved by their customers.  ::)

Offline macpacheco

  • Full Member
  • ****
  • Posts: 892
  • Vitoria-ES-Brazil
  • Liked: 368
  • Likes Given: 3041
It helps not to have to dig cables in the ground, even in suburban areas. So even if your backhaul is cheaper with fiber (which depends on a lot of assumptions that may not last too long), your last mile will remain fairly expensive with cable in the suburbs.

And Starlink can compete just by being a hungrier business: they can overcome the inherent cost difference by operating a lot leaner. Comcast isn’t known for its efficiency.

In the urban core with multistory buildings, though, I kind of agree that Starlink won’t be more than a niche competitor.
Unless Starlink can handle 100k users in the same location, Starlink simply won't have the spectrum to make a dent in the Comcast/Time Warner/Verizon/CenturyLink business.
I understand and relate to your hatred of such companies, and I understand how you want (at all costs) to believe Starlink will change that. But it's simply unrealistic.
At the same time, they will still be able to handle a few customers everywhere those customers have a clean view of the sky. So the few who want to vote with their wallets should be able to get Starlink anywhere that doesn't look like Manhattan or downtown LA/Chicago.
The world has ~7 billion people, call it ~2 billion homes/businesses; a mere 0.1% of that is 2 million customers. And I'm certain they will service closer to 1% than 0.1%. They will succeed by a wide margin. But they simply can't beat the virtually unlimited spectrum fiber optics can deliver.
The big telcos roll 144- or 288-strand fiber for their long-haul routes. Today a DWDM system can easily get 2 Tbps per pair of strands, or 144 Tbps per 144-strand cable. And this will keep growing. I think in a decade or two we should be able to get 1,000 Tbps per fiber cable with a 100GE signal on each lambda.
We're approaching the point where the Internet simply doesn't need to get faster. 80% of the content comes from local caches (Google, Netflix, Facebook, Akamai, ...), and with fiber already rolled out there's little in the way of delivering Gbps to everybody as competition intensifies.
What could demand more bandwidth than 4K 3D video? It's been shown that 8K simply doesn't produce a better viewing experience unless you're standing at an 80" TV. And even 100 Mbps is fast enough for a few 4K 3D streams.
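A quick sanity check of the arithmetic above, using the post's own figures (strand count, per-pair DWDM rate, homes count), not measurements:

```python
# Fiber cable capacity: strands are lit in transmit/receive pairs.
strands = 144
cable_tbps = (strands // 2) * 2.0   # 2 Tbps per pair, per the post
print(cable_tbps)                   # 144.0 Tbps per cable

# Market-size floor: 0.1% of ~2 billion homes/businesses.
homes = 2_000_000_000
floor_customers = homes * 0.001
print(floor_customers)              # 2 million customers
```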
Looking for companies doing great things for much more than money

Offline Robotbeat

  • Senior Member
  • *****
  • Posts: 39364
  • Minnesota
  • Liked: 25393
  • Likes Given: 12165
Spatial multiplexing is key. If there are ~60,000 satellites in orbit at 500km altitude, there will be about 100 satellites in the sky at any one time above any point (more if orbits are optimized). That means 100x frequency reuse. Plus the ability to increase gain on both ends using bigger (and/or higher frequency) phased arrays. Yes, I do think one satellite will be able to service more than 1000 people at a time, so I don’t doubt that SpaceX could eventually serve 100,000 customers in a single area.
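A rough sketch of that in-view count, assuming a uniform spherical shell; the 40-degree elevation mask is my assumption (a lower mask puts even more satellites in view):

```python
import math

# Count of satellites in view of one ground point for a uniform shell.
R = 6371.0            # Earth radius, km
h = 500.0             # shell altitude, km
n_sats = 60_000
elev = math.radians(40.0)  # assumed elevation mask

# Earth-central angle out to a satellite sitting at the elevation mask
lam = math.acos(R / (R + h) * math.cos(elev)) - elev
frac = (1.0 - math.cos(lam)) / 2.0   # fraction of the shell in that cap
visible = round(n_sats * frac)
print(visible)        # roughly 100 satellites overhead at once
```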
Chris  Whoever loves correction loves knowledge, but he who hates reproof is stupid.

To the maximum extent practicable, the Federal Government shall plan missions to accommodate the space transportation services capabilities of United States commercial providers. US law http://goo.gl/YZYNt0

Offline matthewkantar

  • Senior Member
  • *****
  • Posts: 2191
  • Liked: 2647
  • Likes Given: 2314
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.

Offline Asteroza

  • Senior Member
  • *****
  • Posts: 2911
  • Liked: 1127
  • Likes Given: 33
They could still broadcast like a regular cable or satellite company does. That way if 1000 people in the beam are watching the Super Bowl or whatever, they don't need 1000x the bandwidth. But streaming video (like youtube) uses normal bandwidth.

I would bet that streaming video is already the majority of data usage for broadband. I have regularly wondered if it might eventually make sense to keep a nearline Netflix cache on Starlink satellites once the satellites get larger.

Livestreams would then need to be something multicast-like, and that adds a lot of headaches... Designing for that edge case seems rough.

Online Barley

  • Full Member
  • ****
  • Posts: 1075
  • Liked: 739
  • Likes Given: 409
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.
Not so.  The limits are economic.

If you have unserved customers in the least dense regions, adding capacity lets you serve a great many additional customers all over the world. If the only unserved customers are in the most dense regions, the same added capacity only gains a few new customers. At some point it becomes uneconomic to add global capacity to serve concentrated people; leave that market to local providers.

Online oldAtlas_Eguy

  • Senior Member
  • *****
  • Posts: 5308
  • Florida
  • Liked: 5010
  • Likes Given: 1511
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.
Not so.  The limits are economic.

If you have unserved customers in the least dense regions, adding capacity lets you serve a great many additional customers all over the world. If the only unserved customers are in the most dense regions, the same added capacity only gains a few new customers. At some point it becomes uneconomic to add global capacity to serve concentrated people; leave that market to local providers.
Here is a simple solution for high-density population areas: use the high-bandwidth option (the 10 Gbps mentioned by SpaceX) to serve a complete building. The result is a mini-ISP just for the building, operated by a maintenance company of the kind that used to do installation and maintenance/repair for regular cable providers. A single 10 Gbps UT on the roof could serve a few hundred actual users at 100 Mbps access each. The mini-ISP operator collects funds from his subscribers in the building(s), and the difference between that and his costs (manpower, hardware, and the Starlink 10 Gbps UT subscription) becomes his company's profit. SpaceX (Starlink) would only deal with the mini-ISP operator.

A metropolitan city might then have only a few hundred 10 Gbps UTs yet serve tens of thousands of end users. Most of the problem in the efficient use of bandwidth in a free-air system is contention among the UTs in a beam spot: past a certain point, more UTs means lower efficiency. There is a sweet spot of UT density per sat channel, and on either side of it the efficiency of usage of the available bandwidth falls off.

NOTE: many buildings already operate in a similar way with cable providers.
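A quick sketch of that rooftop-terminal arithmetic; the 5:1 contention ratio is my assumption (typical for residential service), not a Starlink figure:

```python
# Shared rooftop UT: how many subscribers one 10 Gbps terminal covers.
ut_gbps = 10.0
plan_mbps = 100.0
contention = 5   # assumed residential oversubscription ratio

full_rate_users = ut_gbps * 1000.0 / plan_mbps  # users at a full 100 Mbps
subscribers = full_rate_users * contention      # "a few hundred" users
print(full_rate_users, subscribers)             # 100.0 500.0
```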

Online envy887

  • Senior Member
  • *****
  • Posts: 8166
  • Liked: 6836
  • Likes Given: 2972
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.
Not so.  The limits are economic.

If you have unserved customers in the least dense regions, adding capacity lets you serve a great many additional customers all over the world. If the only unserved customers are in the most dense regions, the same added capacity only gains a few new customers. At some point it becomes uneconomic to add global capacity to serve concentrated people; leave that market to local providers.

There's a point where the marginal cost of adding capacity is not worthwhile, but it's likely well above the point of average density saturation because there are widely separated dense areas all over the world. The same additional capacity can increase service density in New York, London, Rio, and Sydney, all at the same time.

Offline AC in NC

  • Senior Member
  • *****
  • Posts: 2484
  • Raleigh NC
  • Liked: 3630
  • Likes Given: 1950
And just to note, the reason the situation ended up as it did has a lot to do with the fact that American municipalities generally have horrible revenue problems.

That is pretty off-topic, and the error is magnified by the fact that very little of your characterization of American municipalities is accurate.  Want me to tell you how Finnish municipalities work?
« Last Edit: 02/07/2021 01:55 pm by AC in NC »

Online oldAtlas_Eguy

  • Senior Member
  • *****
  • Posts: 5308
  • Florida
  • Liked: 5010
  • Likes Given: 1511
Currently Starlink is using, I believe, 64QAM, which sends 6 bits of data per Hz of spectrum used. Thus a 1 Gbps data channel uses 167 MHz of spectrum. By increasing to 256QAM (adding 2 more bits per symbol) you gain an additional 33% of data rate per Hz, so each channel jumps to a 1.333 Gbps data rate with no additional spectrum. The downside is that the antennas need more gain to recover the signal margin, because the denser constellation requires resolving smaller differences in signal level.

The other problem is that this change is not compatible with existing UTs, so its general use would have to wait for the old UTs to be replaced; the new ones would handle either the old 64QAM or 256QAM. If SpaceX was smart, they added a contingency in the UT design so that only a software upgrade is needed to enable 256QAM in existing UTs, although that would eat into the signal margins and increase the occurrence of lost packets.

A new UT design with 6 dB more antenna gain would enable going as high as 4096QAM, or 12 bits per Hz. That doubles the data rate to 2 Gbps per channel for the same spectrum usage: instead of 100 UT connections per channel there are 200, and that 20 Gbps sat becomes a 40 Gbps sat with hardly any volume or mass changes. Also, to increase the signal margins, the sats would almost double their phased-array diameters. This does two things: it increases the gain for both transmit and receive, and it produces spots illuminating half the diameter on the Earth's surface, meaning as many as 4 individual spots over the same area. So now instead of a 40 Gbps sat it is a 160 Gbps sat. Multiply by 100 sats in the sky, and any one point sees a 2X bandwidth increase, while a larger area sees the number of simultaneous UT connections increase by a factor of 8X. So a large urban spread could support 8X the number of UTs it now can, raising the possible 100,000 to 800,000 UT connections!

Added: NOTE that in this scenario, all you need for that 10 Gbps connection capability is a connection with 5 simultaneous sats: 2 Gbps per channel × 5 sats = 10 Gbps. A Gateway with 20 sats could then have a data rate of at least 40 Gbps, and it would not be surprising to see Gateways grow to handle as much as 200 Gbps of connectivity into the Internet. Now add V band, and that 160 Gbps sat becomes a 320 Gbps sat, with Gateways easily increasing to as much as 400 Gbps; because of satellite movement and frequency-channel overlap, the Gateways could go as high as 600 Gbps. That also means a temporary remote military base with a Gateway installation of several trailers could have on the order of 180 Gbps (30 connections to 20 sats using the Ku, Ka, and V bands at 2 Gbps per channel with 50% frequency spot-area overlap usage) of data connectivity to anywhere else in the world.

Added: to tie this back to the thread: with little cost difference for either sats or UTs (advancing tech gives more capability for the same cost in the electronics, their mass, and their power efficiency), the cost per bit to the users could easily drop to between 1/2 and 1/8 of its current value while the data rate available to a customer rises by 2X to 8X (from 100 Mbps to 200-800 Mbps), effectively becoming close to a gigabit connection for practically all users. Only fiber direct to the house could offer higher bit rates. Coax runs into a lot of problems when upping signal levels to greatly increase capacity (a factor of 10 in signal level is a 20 dB increase), such as generating a lot of RFI, which the FCC would never allow.
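As a sanity check on the modulation arithmetic above, a rough sketch: it treats "bits per Hz" as bits per symbol at one symbol per Hz, ignoring the FEC overhead and filter roll-off that real links pay:

```python
import math

# Bits per symbol for square QAM, and the per-channel rates discussed
# above for a 167 MHz channel (the width implied by 1 Gbps at 6 bits/Hz).
def bits_per_symbol(qam_order: int) -> int:
    return int(math.log2(qam_order))

bw_hz = 167e6
for order in (64, 256, 4096):
    rate_gbps = bw_hz * bits_per_symbol(order) / 1e9
    print(f"{order}-QAM: {bits_per_symbol(order)} bits/Hz -> {rate_gbps:.2f} Gbps")
# 64-QAM ~1.0 Gbps, 256-QAM ~1.34 Gbps, 4096-QAM ~2.0 Gbps
```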
« Last Edit: 02/07/2021 10:27 pm by oldAtlas_Eguy »

Online Barley

  • Full Member
  • ****
  • Posts: 1075
  • Liked: 739
  • Likes Given: 409
Currently Starlink is using, I believe, 64QAM, which sends 6 bits of data per Hz of spectrum used.

...

A new UT design with 6 dB more antenna gain would enable going as high as 4096QAM, or 12 bits per Hz.

There is a difference in signal strength depending on the angle of the satellite from the zenith.  The difference between a satellite 25 degrees above the horizon and one at the zenith will be at least 10 dB (more than double the path length, plus the cosine effect at the UT and possibly at the satellite).  If they are using QAM64 at the 25-degree limit, the same RF hardware should have the margin to support QAM4096 (or better) near the zenith.

This suggests:
1) Variable encoding would be useful.  If it is implemented, then:
2) As the constellation grows, all links should be closer to vertical in rural areas.  So it might be possible to switch encodings without changing the RF hardware, and capacity could grow super-linearly with the number of satellites.
3) In very dense areas you gain less than expected from being able to use all satellites in view, as those close to the horizon have worse signal strength and bandwidth.

It would surprise me if the UTs and satellites did not already have some flexibility in encoding.  Most of the encoders I have seen support multiple encodings (it's in software, even if it's software burned into an ASIC).  Whether they can change encodings without resetting the device may be a different matter, and whether QAM64 is the best or worst encoding supported is unknown.
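The slant-range geometry above can be checked numerically (spherical Earth, 500 km shell; free-space loss only, so the cosine/projected-aperture effects at the two ends are not included):

```python
import math

# Slant range to a 500 km satellite at a given elevation angle,
# and the extra free-space path loss at low elevation vs. the zenith.
R, h = 6371.0, 500.0  # km

def slant_range_km(elev_deg):
    e = math.radians(elev_deg)
    return math.sqrt((R + h) ** 2 - (R * math.cos(e)) ** 2) - R * math.sin(e)

d_low = slant_range_km(25.0)     # ~1030 km: more than double the zenith path
d_zenith = slant_range_km(90.0)  # 500 km
extra_db = 20.0 * math.log10(d_low / d_zenith)
print(f"{d_low:.0f} km vs {d_zenith:.0f} km -> {extra_db:.1f} dB extra")
# ~6.3 dB from distance alone; the cosine losses at the terminal (and
# possibly the satellite) make up the rest of the >=10 dB difference.
```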

Offline vsatman

Currently Starlink is using I believe 64QAM which allows the sending of 6 bits of data per Hz of frequency used.

64QAM is in theory (modulator capabilities). In reality, it is most likely used only on the Ka-band link between the satellite and the gateway. For the Ku-band link between the satellite and the terminal, SNR measurements (signal-to-noise ratio, Eb/No) show about 10 dB. That corresponds to 8PSK and 3 bits/Hz.

Online envy887

  • Senior Member
  • *****
  • Posts: 8166
  • Liked: 6836
  • Likes Given: 2972
Currently Starlink is using, I believe, 64QAM, which sends 6 bits of data per Hz of spectrum used.

...

A new UT design with 6 dB more antenna gain would enable going as high as 4096QAM, or 12 bits per Hz.

There is a difference in signal strength depending on the angle of the satellite from the zenith.  The difference between a satellite 25 degrees above the horizon and one at the zenith will be at least 10 dB (more than double the path length, plus the cosine effect at the UT and possibly at the satellite).  If they are using QAM64 at the 25-degree limit, the same RF hardware should have the margin to support QAM4096 (or better) near the zenith.

This suggests:
1) Variable encoding would be useful.  If it is implemented, then:
2) As the constellation grows, all links should be closer to vertical in rural areas.  So it might be possible to switch encodings without changing the RF hardware, and capacity could grow super-linearly with the number of satellites.
3) In very dense areas you gain less than expected from being able to use all satellites in view, as those close to the horizon have worse signal strength and bandwidth.

It would surprise me if the UTs and satellites did not already have some flexibility in encoding.  Most of the encoders I have seen support multiple encodings (it's in software, even if it's software burned into an ASIC).  Whether they can change encodings without resetting the device may be a different matter, and whether QAM64 is the best or worst encoding supported is unknown.
This effect is partly mitigated by running the off-boresight and high slant angle beams at higher power, per the FCC filing.

Offline RedLineTrain

  • Senior Member
  • *****
  • Posts: 2599
  • Liked: 2507
  • Likes Given: 10527
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.

This is related to the argument that Viasat is making.  In a podcast today, Dankberg stated that the limit is how much space there is in LEO and that scaling in LEO will be hampered because of the potential creation of orbital debris.

This seems like a weak argument, especially in self-cleaning LEO below 650 km.  SpaceX thinks that it can scale to at least 35,000 satellites, if the FCC will let it.  And the large majority of those satellites will be in inclinations that can serve the most profitable concentrations in the US and Europe (33°, 38°, 43°, 46°, and 53° in the proposed second constellation) with many beams.

In any event, SpaceX is placing a sophisticated phased array antenna in every home that they service.  Those can probably communicate with GEO or MEO satellites, if SpaceX were to wish to do that for some reason.

Discussion starts at 31:43, with the meat of the discussion at 38:30.

« Last Edit: 09/28/2021 07:53 pm by RedLineTrain »

Offline Asteroza

  • Senior Member
  • *****
  • Posts: 2911
  • Liked: 1127
  • Likes Given: 33
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.

This is related to the argument that Viasat is making.  In a podcast today, Dankberg stated that the limit is how much space there is in LEO and that scaling in LEO will be hampered because of the potential creation of orbital debris.

The fundamental upper limit on sats is the angular resolution of the phased-array antennas (primarily the user terminal) needed to differentiate sats in the same plane on the same frequencies (well, maybe not just the same plane, but generally in the field of view). As I understand it, this is ostensibly the reasoning behind GEO "slots": so your antenna can reasonably avoid receiving signals from other sats. Anyone with experience twisting a home satellite-TV dish to find a sat will sort of understand. It's an angular-resolution issue, so while GEO slots have a fair amount of spacing between individual slots/sats, lower orbits would have tighter spacing (though the number of slots per plane doesn't change, since the resolution angle is essentially fixed). If a sat doesn't overlap frequency-wise, it could in theory co-occupy a given slot position, but that includes all frequencies (internet/payload as well as telemetry and control).

But that is sort of derived from parabolic-antenna design and pointing, in that off-axis signals beyond a specific offset angle won't collect at the antenna focus/horn and so effectively get passively ignored. How does that work with a fixed phased array, which sees a much wider "view" and doesn't have a physical off-axis filter? Anybody with RF knowledge care to expand on that?

Offline dondar

  • Full Member
  • ****
  • Posts: 441
  • the Netherlands
  • Liked: 299
  • Likes Given: 267
If it's profitable to serve scattered people, it will be profitable to serve concentrated people. The limit is how much space there is in orbit. There is lots and lots and lots of space in orbit.

This is related to the argument that Viasat is making.  In a podcast today, Dankberg stated that the limit is how much space there is in LEO and that scaling in LEO will be hampered because of the potential creation of orbital debris.

The fundamental upper limit on sats is the angular resolution of the phased-array antennas (primarily the user terminal) needed to differentiate sats in the same plane on the same frequencies (well, maybe not just the same plane, but generally in the field of view). As I understand it, this is ostensibly the reasoning behind GEO "slots": so your antenna can reasonably avoid receiving signals from other sats. Anyone with experience twisting a home satellite-TV dish to find a sat will sort of understand. It's an angular-resolution issue, so while GEO slots have a fair amount of spacing between individual slots/sats, lower orbits would have tighter spacing (though the number of slots per plane doesn't change, since the resolution angle is essentially fixed). If a sat doesn't overlap frequency-wise, it could in theory co-occupy a given slot position, but that includes all frequencies (internet/payload as well as telemetry and control).

But that is sort of derived from parabolic-antenna design and pointing, in that off-axis signals beyond a specific offset angle won't collect at the antenna focus/horn and so effectively get passively ignored. How does that work with a fixed phased array, which sees a much wider "view" and doesn't have a physical off-axis filter? Anybody with RF knowledge care to expand on that?
You are talking about antenna directivity. A phased array does the same signal search you had to do physically, but "digitally"; generally the system knows where to look, so it does the signal acquisition reasonably quickly.

Because you use multiple antennas (hence "array"), signal encoding is of critical importance, and the choice of MIMO scheme determines the level of directivity (some arrays also have programmable selectivity, which adds choices).
You can trade the level of directivity (and the resistance to noise) for useful bandwidth; fully developed active arrays provide choices unthinkable otherwise (see MIMO theory if curious). But the practical physical limitations remain, and you still have to point the antenna generally in the direction of your source (something like ±60° for most commercially available systems).

TL;DR: You can mitigate signal interference coming from closely positioned satellites or dishies using specific encoding choices, at the cost of some bandwidth.

Offline vsatman

It's an angular resolution issue so while GEO slots have a fair amount of spacing between individual slots/sats,

 How does that work with a fixed phased array which sees a much wider "view", as it doesn't have a physical form of off-axis filter? Anybody with RF knowledge care to expand on that?

The minimal distance between sats on GEO is 2 degrees...

The Starlink user terminal has a half-power beamwidth from 2.8 degrees (at boresight) to 4.5 degrees (at slant).
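Those beamwidths are consistent with the classic 70·λ/D rule of thumb; the ~0.5 m aperture and exact downlink frequency here are my assumptions:

```python
import math

# Half-power beamwidth estimate for a flat phased array: ~70 * lambda / D
# at boresight, widening by 1/cos(scan) as the beam steers off boresight.
c = 3.0e8
f_hz = 11.7e9              # assumed Ku-band downlink frequency
aperture_m = 0.5           # assumed effective aperture of the terminal
lam = c / f_hz

hpbw_boresight = 70.0 * lam / aperture_m
hpbw_scanned = hpbw_boresight / math.cos(math.radians(50.0))
print(f"{hpbw_boresight:.1f} deg boresight, {hpbw_scanned:.1f} deg at 50 deg scan")
# same ballpark as the 2.8-4.5 degree figures above
```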
