I actually work in telecommunications, specifically on embedded devices.
Turning off 2G will give us a lot of headaches. It’s not only smart meters that use it, but many other smart devices; think of sensors in remote critical infrastructure too. Having said that, 2G is implemented mostly in older legacy devices, so the time to replace them will come anyway.
With that being said, most IoT solutions nowadays go for LTE. There was also an attempt with NB-IoT, but its very limited data bandwidth is a big bottleneck, especially if you look at the latest legislation requiring OTA software security patches (which might even include kernel updates: those are huge, and need lots of bandwidth).
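To put rough numbers on that bottleneck, here’s a back-of-envelope sketch in Python. The 40 MB update size and the 30 kbps sustained NB-IoT throughput are illustrative assumptions (real NB-IoT rates depend on device category and radio conditions), as is the 10 Mbit/s LTE figure:

```python
# Rough OTA-update-time estimate; all figures here are assumptions.
UPDATE_MB = 40      # hypothetical kernel/security update size
NBIOT_KBPS = 30     # assumed sustained NB-IoT throughput
LTE_MBPS = 10       # assumed modest LTE link

nbiot_s = UPDATE_MB * 8 * 1000 / NBIOT_KBPS   # megabytes -> kilobits
lte_s = UPDATE_MB * 8 / LTE_MBPS

print(f"NB-IoT: ~{nbiot_s / 3600:.1f} h per device")  # ~3.0 h
print(f"LTE:    ~{lte_s:.0f} s per device")           # ~32 s
```

Hours per device versus seconds per device is the gap that pushes fleet-wide OTA patching toward LTE.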
I thought they weren’t turning off 2G. What’s the benefit, other than forcing the transition sooner? Most places that use 2G still get exceptional coverage from it.
Coverage comes from the frequency, not the generation of signal encoding.
The benefit is you can reuse the frequency bands for something better, like 5G. That’s what they did in my country, among others. So, now we get 5G on 3 different frequency ranges. High speed and long range.
I was under the impression that higher-bandwidth wireless networks required higher frequency bands for that data. Like, a specific frequency should have a theoretical maximum data transfer rate, and the only way to get around that would be some kind of fancy compression algorithm.
That is correct.
However, the lowest GSM frequency was 300 MHz, so there is still quite a lot of bandwidth there (if I’m not mistaken, up to a theoretical maximum of 600 Mbit/s for a 2-level signal, though in practice quite a lot less, as these are radio waves rather than signals in circuit lines, so encoding schemes have to be designed for a lot more noise and other problems).
Anyway, the point being that the right encoding scheme can extract some Mbit/s even from the 300 MHz band.
Frequency isn’t that relevant; it’s frequency bandwidth. The bit rate is n/T, with n being bits per symbol and T the symbol duration, which itself is 1/B, with B being the frequency bandwidth. If you want to increase the bit rate, you can either increase the number of bits per symbol or increase the frequency bandwidth. 5G allows bandwidths up to 400 MHz per channel; there isn’t enough space in the lower frequency ranges for such large bandwidths, so you go up.
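A quick sketch tying the two comments together: an ideal baseband channel allows 2 symbols per Hz (which is where the earlier 600 Mbit/s figure for 300 MHz comes from), while a passband channel gives roughly 1 symbol per Hz, as in the T = 1/B formula above. The 256-QAM line is purely an illustrative assumption:

```python
# Ideal (noise-free) bit rates from R = n / T = n * symbols_per_hz * B.
# Real links deliver much less: noise, guard bands, coding overhead.
def bit_rate(bandwidth_hz, bits_per_symbol, symbols_per_hz=1):
    return bits_per_symbol * symbols_per_hz * bandwidth_hz

# 2-level signal over 300 MHz at the baseband Nyquist limit (2 symbols/Hz):
print(bit_rate(300e6, 1, symbols_per_hz=2) / 1e6)  # 600.0 Mbit/s

# 400 MHz 5G channel with assumed 256-QAM (8 bits/symbol), 1 symbol/Hz:
print(bit_rate(400e6, 8) / 1e6)                    # 3200.0 Mbit/s
```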
Isn’t the infrastructure for 2G also a factor? Over here, for example, we have lots of towers in remote mountain regions, and it’s rather complicated to upgrade all of them. It can be done, but it will take a while.
Not complicated at all. For the most part, all they do is swap one box - the transmitter. That’s it.
(However, that doesn’t consider other things, like improvements in redundancy and safety, or construction standards that didn’t exist back then.)
But really, all that needs to be done is pull out one box, and slide in a new box. Not complicated at all.
I’ve done this as a project to go from 3G to LTE for a network of a few hundred devices.
3G and LTE (4G) used almost identical AT commands. The motherboards were built so the modems were swappable. It wasn’t too bad. I’m told the field techs had to drive 5 hours across the Australian outback to access some of them.
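That overlap is because most cellular modules implement the standard 3GPP AT command set, so the basic health checks look the same across generations. A minimal sketch with pyserial; the port path and baud rate are assumptions and vary by module:

```python
# Basic modem sanity check using standard 3GPP AT commands; these
# work on most 3G and LTE modules alike. Port and baud are assumptions.
import serial

def at(port, cmd):
    port.write((cmd + "\r").encode())
    return port.read(256).decode(errors="replace").strip()

with serial.Serial("/dev/ttyUSB2", 115200, timeout=2) as port:
    print(at(port, "AT"))         # attention check, expect "OK"
    print(at(port, "AT+CSQ"))     # signal quality (RSSI)
    print(at(port, "AT+CREG?"))   # network registration status
    print(at(port, "AT+CGATT?"))  # packet-domain (data) attach status
```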
After rolling out 3G router failover for pokies, lotto, and wagering in Oz, I’m sure the money they saved from no longer having any downtime can pay for 4G, 5G, and Starlink redundancy.
5 hours of driving across Oz? Wouldn’t even make Carnarvon Gorge, much less Mount Isa.
Beautiful country to drive across tho.
One of those places was Mt. Isa. It was equipment for mining.
I work on the national electrical grid, and there are a bunch of remote sensors in all of the substations. Some of them essentially amount to remotely controlled circuit breakers. I think they trip automatically if they lose connection, because they assume something bad has happened. So that’ll be fun.
I work on the software side of things, not as an electrical engineer, so I have no idea if they’re actually changing them over yet, but there are still thousands of them on the network at the moment.
I work for the grid too and we also have these. Usually only for bigger substations to transmit measurements and switching states, maybe a bit of telemetry like a tripped fuse.
I hope to dear God that you are remembering wrong and none of them trip when losing connection. Whoever thought of that should be fired immediately.
A loss of connection from a single device should never trip a circuit breaker (no idea what the bigger equivalent is called in English), especially if it’s connected wirelessly.
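To illustrate the design point with a deliberately hypothetical sketch (the class name and the 60-second threshold are made up): a stale link should raise an alarm for an operator, never command a trip on its own.

```python
# Hypothetical supervisory pattern: comms loss -> alarm, never a trip.
import time

HEARTBEAT_TIMEOUT_S = 60  # made-up threshold

class BreakerTelemetry:
    def __init__(self, name):
        self.name = name
        self.last_seen = time.monotonic()

    def on_heartbeat(self):
        # Called whenever a telemetry packet arrives from the device.
        self.last_seen = time.monotonic()

    def check(self):
        if time.monotonic() - self.last_seen > HEARTBEAT_TIMEOUT_S:
            # Lost comms: flag it for a human; do NOT open the breaker.
            print(f"ALARM: lost telemetry from {self.name}")
```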
The software that controls them is absolutely terrible, so I wouldn’t be surprised if that is how they work.
But thinking about it, that does seem really stupid for no reason, so maybe the real concern is that if they do trip after 2G is turned off, there’s no way for us to know.
I was wondering how this would affect car manufacturers and their TCUs (telematics control units) or what have you. Curious to know if cars will be bricked when the tech goes obsolete and they turn off 2G and 3G.
The Sync 2 software in my 2013 Focus that does system checks shut down like 6 years ago, and I periodically get a message that I need to do a diagnostic check and send it in to Ford, but then it errors out.
It’s annoying as fuck.
Is it one of the ones where you can update the software from a USB stick with files downloaded off the internet, or does it have to do with it looking for a software update and then timing out because the network isn’t there?
I was asking my previous question because I know Mozilla just put out a report on how, if you own a newer car, your automaker is tracking you via the modems built into the cars.
It is actually dumber than that.
It wants to use my phone to make a call and transmit the data like an old-school modem. There’s no built-in modem in the car as far as I can tell. It’s a 2013 Focus Titanium.
And it can’t because… new phones don’t use 3G? Cell providers shut down 3G services in 2022? No: because whatever back-end service Ford used, Ford shut down.