We’re talking about 5G vs. satellite, not wired.

There’s a 4-orders-of-magnitude difference in density between high-density areas and low-density ones. So no, you don’t need millimeter wave everywhere. You can increase bandwidth per tower, but you can also increase the number of cell sites.

Further, every frequency band you add offloads users from the other bands. I.e. at 10 miles out you can only use a subset of frequencies, but those frequencies don’t need to serve people 100m from the cell tower, because those users are on the short-range (mm-wave) 5G bands.

Thus doubling the number of cell sites means there’s an extra circle of people on mm-wave frequencies around each new tower, and the long-range spectrum is left to be shared among fewer users per tower. So you more than double effective bandwidth in low-density areas when you double the number of towers.
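To make that concrete, here’s a rough back-of-envelope toy model (the area, mm-wave radius, user count, and per-site capacity are all just assumed numbers, not real figures): users are spread uniformly over a fixed area, anyone inside a tower’s small mm-wave bubble gets offloaded there, and everyone else shares the low-band spectrum.

    import math

    AREA_KM2 = 100.0            # fixed rural service area (assumed)
    USERS = 10_000              # uniformly spread users (assumed)
    MMWAVE_RADIUS_KM = 0.3      # mm-wave reach around each tower (assumed)
    LOWBAND_GBPS_PER_SITE = 1.0 # low/mid-band capacity per site (assumed)

    def lowband_rate_per_user(n_towers):
        # Users close enough to some tower get offloaded to its mm-wave bubble.
        bubble_area = n_towers * math.pi * MMWAVE_RADIUS_KM ** 2
        offloaded = USERS * min(bubble_area / AREA_KM2, 1.0)
        remaining = USERS - offloaded
        # Everyone left on low band shares the aggregate low-band capacity.
        return n_towers * LOWBAND_GBPS_PER_SITE / remaining

    for n in (10, 20):
        mbps = lowband_rate_per_user(n) * 1000  # Gbps -> Mbps
        print(f"{n} towers -> {mbps:.2f} Mbps per low-band user")
    # Doubling towers doubles low-band capacity AND shrinks the set of users
    # sharing it, so the per-user rate more than doubles (~1.03 -> ~2.12 here).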

Meanwhile the reverse happens with satellites. For a given number of satellites there are some areas where you already have sufficient capacity for the local density. Suppose you have enough satellites for ships and aircraft over the ocean; add new satellites to handle higher-density areas, and the time those satellites spend over the ocean isn’t getting you new customers. I.e. the percentage of time the average satellite is at 90+% capacity drops when you add more satellites.
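Same kind of toy sketch for the satellite side (again, every number is an assumption, and average utilization is used as a simpler proxy for “% of time above 90% capacity”): each satellite spends most of its orbit over the ocean, where demand is small and fixed, so the more satellites you launch, the lower the average utilization per satellite.

    OCEAN_FRACTION = 0.8        # share of orbit time over low-demand ocean (assumed)
    SAT_CAPACITY_GBPS = 20.0    # per-satellite capacity (assumed)
    OCEAN_DEMAND_GBPS = 100.0   # ships + aircraft demand, fixed (assumed)
    LAND_DEMAND_GBPS = 5_000.0  # demand over populated areas (assumed)

    def avg_utilization(n_sats):
        # Capacity over each region grows with the constellation,
        # but ocean demand doesn't grow when you launch more satellites.
        ocean_cap = n_sats * OCEAN_FRACTION * SAT_CAPACITY_GBPS
        land_cap = n_sats * (1 - OCEAN_FRACTION) * SAT_CAPACITY_GBPS
        served = min(ocean_cap, OCEAN_DEMAND_GBPS) + min(land_cap, LAND_DEMAND_GBPS)
        return served / (n_sats * SAT_CAPACITY_GBPS)

    for n in (100, 200, 400):
        print(n, "satellites -> average utilization", avg_utilization(n))
    # More satellites soak up the land hot spots, but the ocean passes are
    # increasingly idle, so average utilization per satellite keeps falling.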


