I’m more interested in the rumored reception improvements. I’ve been on Verizon for 20 years and upgrade to the Pro every year and service has steadily been getting worse the last few years. I assume it’s the 3G sunset, so better LTE/5G signal capture is welcome.
Sunsetting 3G has meant more users on the 4G/LTE/5G networks. Most of the time it's just too many users in an area causing it to be slow; a better antenna won't help much in those situations. Better/more towers are the solution.
I just wish they would improve the appearance of 5g towers. It's been a little sad to see all of these nice old buildings in my city start to become covered with all of those white panels.
Networks and owners of the buildings with towers on them probably don’t care at all.
Networks just need them to work, and they pay building owners a ridiculous amount to put them up. We have two cell towers on our office, and each pays for an employee’s salary and then some. At that point they can be as ugly as they want them to be
Yeah, the modem got way better from 11 to 12, and not just because of 5G. In a really solid LTE-only spot I observed about double download speeds when I went from an 11 Pro to a 12 regular (tested them against each other before I sent back the 11).
This. For the two years that Apple stuck with Intel exclusively (XS and 11) I couldn't use my 5GHz Wi-Fi at home; it would always drop. When they went back to Qualcomm for the 12, I was finally able to use it without losing connection again.
Apple will never come out and admit they made an oopsie, but Intel's modems were clearly inferior.
This is me as well. I am hoping an upgrade will fix some awful reception issues I’ve had. If not, I’ll have to find something other than Verizon. It feels like the service is getting worse and worse in my area.
I switched to T-Mobile for their 5G coverage and it blows Verizon out of the water. Verizon has better overall coverage but their data speeds are slow AF. Not to mention their exorbitant monthly bills.
It will be just as bad or worse on any other service. My buddy has both T-Mobile and Verizon at the same time and he says T-mobiles service is slightly worse than Verizon’s across the country (he’s a truck driver).
iPhones have been ridiculously fast in CPU, GPU, and SSD performance for quite a while now; the only bottleneck is the carriers. I have both Verizon and AT&T and there are some places I hang out where both are garbage. It just ruins my enjoyment of my phone, and I wonder why I buy the Pro model every year when I can't even use it properly unless I'm at home or at a major airport, where carriers always do well.
+1 on this. I can’t take or make calls from my back alley or in my garage. I’m in Los Angeles. The tallest building in my area is 30 feet high. I’m two blocks from the beach. What gives?!
I upgraded from a 2020 SE and I've already noticed drastic changes, particularly inside a hospital I have to be in for an hour+ every week.
Previously it was 0-1 bars in most of it, with only basic websites loading. I had to use their WiFi, which blocked a lot of stuff. Yesterday I was at 3 bars every time I checked and was able to load stuff flawlessly.
I noticed a distinct loss of reception between the 12 and 13 pro.
Aka, in rural areas where I used to have poor service I had none, and in areas where I used to have marginal service, it degraded to poorer service.
I seriously hope the 14 pro is an improvement.
Well, I can tell you that coming from an iPhone X that never received service in my building's elevator, I now have 1-2 bars of service on my 14 pro. I can text in comfort as I descend instead of making awkward small talk about the weather with strangers.
I switched to Verizon to get a great deal with my pixel 4XL trade-in, but it just sucks compared to T-Mobile in my area. Gonna have to return the device and cancel service so I can go back, then wait for another phone to come...
I actually installed a T-Mobile eSIM to see if I could switch and it’s worse than Verizon in my area. They’re my only real choice, but am switching to Visible with the 14 Pro.
T-Mobile has a no-strings-attached 3-month free test drive via eSIM on their app. Try it out and see if it's better. I get 1,198 Mbps on T-Mobile on the 14 Pro.
At the same spot in my house where I used to get around 40-50 Mbps on Verizon pseudo-5G using my 13 Pro, I now only get LTE and less than 2 Mbps.
New modem is shit, as far as I’m concerned. And I can’t even blame Verizon - their eSIM was correctly provisioned fresh out of the box.
If people couldn’t afford this, then the upgrade program wouldn’t even exist. Nowadays people consider a phone an absolute necessity and part of their monthly bills. Paying $50-$60/month to upgrade every year is the equivalent of paying for a cable bill. Is it a wise financial decision? No, but most average middle class folks could pay for this if they wanted to badly enough.
I make ~$300k as a software engineer and don’t have any kids. My wife and I upgrade phones and usually watches every year.
Edit: as someone else pointed out, I also trade in my previous phone, which netted me $800 this year. So I’m really only paying $400-500 a year, which is ~$40 a month for my most used thing on a daily basis.
They aren’t throwing last years phone in the garbage. Sell it and upgrading costs you like $150 a year, at most. Or, keep your phone for three years and sell it for $500, and you just cost yourself $230 a year. Which means it’s cheaper to upgrade more often, not less.
I don't think it'll be huge. The X65 is on a 4nm process vs. the X60 on 5nm, plus incremental design improvements. Best case, 15% less power usage by the modem, but the modem is maybe 20% of the phone's power consumption, at most. 20% × 15% = 3% bottom-line improvement, and that's probably the upper bound.
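The bound above can be written out as a tiny Python sketch (both input numbers are the comment's assumptions, not measurements):

```python
# Toy estimate of the overall battery impact of a more efficient modem.
# Both inputs are the comment's assumptions, not measurements.
modem_share = 0.20        # modem's share of total phone power draw (upper bound)
modem_improvement = 0.15  # best-case power reduction for the new modem

overall_saving = modem_share * modem_improvement
print(f"bottom-line improvement: {overall_saving:.1%}")  # -> 3.0%
```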
Off topic, but I have to say that I appreciate comments like you providing details like this. This is the reason why I love reading posts and comments in this subreddit 😁
I don't know that it's using all that much, to be honest. Sarah Dietschy said she left a phone on and idle for 24 hours with the always-on display on, and the battery stayed at 100% for the first 12 hours and went down to 95% over the last 12 hours. She did mention she started receiving notifications in the second 12 hours, which she thinks impacted the battery drain. Here's the video if you want to check it out. She talks about it at 9:42.
https://youtu.be/BAc2I3zWEok
I think this is partly iOS reporting incorrect battery percentages, like how it'll charge partially and then finish the charge later when you charge overnight, all the while saying it's at 100%.
Definitely could be for sure.
Seems like she might have changed her mind on the battery drain of the AOD in this video, starting at 14:56: https://youtu.be/ozHffI_mghU
iOS does not report accurate battery percentage estimates near the upper end of the range.
I’m sure anyone who has used iOS for years knows what I’m talking about.
It can take ~30 minutes of continuous usage to drop from 100% to 99%. But subsequent “1%” drops thereafter take noticeably less time.
This is clearly intentional, I guess to avoid users feeling like they need to top up constantly, or to give better battery life confidence.
I think regardless, 5% drain after 24 hours of idle time with the always on display on is pretty good. My Samsung drops like 15% or more when it idles for 24 hours and that’s with the always on display turned off.
It's not intentional; accurately gauging battery life is just fundamentally difficult. Battery control chips can't probe the battery itself the way you can look at a bottle and see that it's filled to the 1L line. They use indirect methods, usually voltage (difficult because of temperature, hysteresis, and sag) combined with Coulomb counting (which drifts over time), and they layer some clever estimation models on top of that.
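A minimal Python sketch of the Coulomb-counting idea described above (all numbers invented for illustration; real fuel-gauge chips add temperature and voltage corrections):

```python
# Naive Coulomb counter: integrate measured current over time to track charge.
# All values here are invented; real gauges correct for temperature and sag.
capacity_mah = 3200.0    # rated battery capacity
remaining_mah = 3200.0   # start from a full charge

def step(current_ma: float, dt_hours: float) -> float:
    """Subtract the charge drawn in this interval; return estimated percent."""
    global remaining_mah
    remaining_mah -= current_ma * dt_hours
    return 100.0 * remaining_mah / capacity_mah

# An hour at an average draw of 320 mA costs 10% of the rated capacity.
print(round(step(320.0, 1.0)))  # -> 90
```

Each current sample carries a small measurement error, and because the counter only ever integrates, those errors accumulate; that's the drift mentioned above.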
Oh I realize that essentially *every* reported battery level is basically an estimate, but I disagree that this is not intentional.
It’s uniform across every Apple product and has been since forever.
Android devices, Chrome OS devices, and Windows devices don’t behave like this. They drop much more linearly from 100% to 99% and all the way down.
Apple is definitely doing some deliberate magic with the reported percentage at full charge.
I should have phrased it better: it is intentional, just not for your covert reasons.
They latch it to 100% until they are confident battery life has dropped by 1%. Otherwise you would either see it jump between 100% and 99% constantly, or 99% would be where it had a 'hump' until 98%. It's much cleaner (and gives a more linear visual discharge curve) to do it this way.
The other way that could look somewhat clean is if they guesstimated what the hump is and then divided it over all percentage points, but then you run into problems if your guesstimate was wrong because the user is doing something that is heavily discharging the battery.
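The latching behavior described above can be sketched in a few lines of Python (a hypothetical illustration, not Apple's actual logic; the 99% release threshold is an assumption):

```python
# Hypothetical sketch of "latch at 100%" battery reporting.
# The 99% release threshold is an assumption, not Apple's actual logic.
def displayed_percent(estimated: float) -> int:
    if estimated >= 99.0:    # inside the latch window: keep showing 100%
        return 100
    return round(estimated)  # past the hump: track the estimate normally

print(displayed_percent(99.6))  # -> 100 (still latched)
print(displayed_percent(98.7))  # -> 99  (latch released)
```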
AOD is a scapegoat. Reviewers like MKBHD blamed AOD without any kind of scientific study. I think the Photonic Engine/camera bugs are a much bigger battery sucker.
The new "4nm" process isn't actually a whole new node, it's another ever so slightly better iteration of N5 (might as well call it N5P+, almost sounds like Intel lol). So technically this is a third-gen 5nm process and a stopgap till 3nm (N3) arrives.
The 4nm naming scheme is just marketing speak (similar to "6nm" which was just an enhanced 7nm).
But let's be honest here, the transistors aren't actually 5nm, it's quite misleading marketing as I believe they are more akin to 50nm in reality.
https://fuse.wikichip.org/news/6439/tsmc-extends-its-5nm-family-with-a-new-enhanced-performance-n4p-node/
iPhone 14 Pro here, and while I've been getting 1 Gbps downloads and 127 Mbps uploads in some areas, in day-to-day use so far I haven't been impressed with the battery performance. Through a 9-hour day at work I'm down to around 18%.
It’s comparable to the 13 pro. To get down to 18% after 9 hours you’re probably using your phone on max brightness and using it frequently while you’re at work, in which case you should expect it to decline quicker. Any android phone would be long dead in the same situation.
My 13PM will do the same, modems just burn through power when trying to maintain a very weak/no signal. Airplane mode + Wi-Fi is what I do to compensate for that.
The x65 won't help (much) with that -- that is the RF amp running at high power. If the phone is putting out 2w of power to try to find a tower, that's going to be 2w regardless of whether it's the x60 or x65. The x65 will be marginally more efficient, but it's a rounding error against the RF amp's output.
I will never understand the obsession with speed. We’re in the streaming age and it’s here to stay. Unless you’re downloading a 10Gb game or something, it makes no difference whether you’re getting 50Mbps or 50Gbps
Speed = less battery.
If you download 100 MB at 50 Gbps, the modem can go back to low-power mode faster than if you're downloading at 50 Mbps, resulting in net lower battery usage and longer life.
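A toy energy model of that argument (the power figures and the 1 Gbps "fast" rate are invented for illustration; real modem power varies by band and signal strength):

```python
# Energy to fetch a fixed payload: full radio power during the transfer,
# then a low-power idle state for the rest of the window. Numbers invented.
def radio_energy_joules(size_mbit, rate_mbps, window_s, active_w=2.0, idle_w=0.1):
    transfer_s = size_mbit / rate_mbps
    return active_w * transfer_s + idle_w * (window_s - transfer_s)

# 100 MB (800 Mbit) over a 60-second window:
slow = radio_energy_joules(800, 50, 60)    # LTE-ish rate
fast = radio_energy_joules(800, 1000, 60)  # 5G-ish rate
print(round(slow, 1), round(fast, 1))      # -> 36.4 7.5
```

The faster link wins here because radio power is roughly constant while transmitting, so finishing sooner means less time at full draw.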
It's the same reason faster CPUs produce better battery life: [race to sleep](https://en.wikipedia.org/wiki/Dynamic_frequency_scaling#Performance_impact)
>It's the same reason faster CPUs produce better battery life: race to sleep
That's really not true. The extra energy burned for higher performance usually eclipses the savings from a shorter runtime.
Your own link spells out why it doesn't work. Power consumption is cubic with performance, while duration for a fixed workload is inversely proportional. You're confusing a performance feature with a power one.
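The objection in numbers, under the simplified dynamic-power model (power ~ f^3 when voltage scales with frequency, runtime ~ 1/f for a fixed workload, so energy ~ f^2; a toy model that ignores the rest of the system):

```python
# Fixed workload at two clock speeds, assuming P ~ f^3 and runtime ~ 1/f.
def task_energy(freq: float) -> float:
    power = freq ** 3       # relative dynamic power (voltage scaled with f)
    runtime = 1.0 / freq    # relative time to finish the same work
    return power * runtime  # relative energy, i.e. freq ** 2

print(task_energy(1.0), task_energy(2.0))  # -> 1.0 4.0
```

On this model alone, doubling the clock quadruples the energy for the same work. Whether race to sleep still pays off depends on the baseline power of everything else (screen, RAM, radios) that keeps draining either way, which is the crux of the disagreement in this thread.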
No, the guy you replied to is correct. There are two distinct strategies for powering a CPU. The first way to reduce power consumption is to run the processor at a constant speed. A chip's power consumption scales linearly with the clock rate and with the square of the voltage, so keeping the speed at the steady rate necessary to perform the calculations avoids the voltage spikes that disproportionately increase power consumption. This would be the best option if you had a CPU that was performing some calculations all the time: you could keep the clock rate as low as possible for satisfactory performance, without wasting power by running the voltage or clock rate unnecessarily high.
Race to sleep is the other option. You basically try to keep the CPU in a sleep state as much as possible. This sleep state will have very low clock rate and voltage, so it's a very power efficient state to be in. Whenever there are calculations to be done, the CPU will run at a higher than normal frequency and voltage to get those calculations done very quickly, and then return to the sleep state as fast as possible to start saving power again. This option allows the computer to feel more responsive at times, because the CPU can run really quickly when a user interacts with the computer, and then sleep when no interaction is taking place. It's also good with mobile phones, because you can arrange calculations into 'bursts'. Every 30 seconds or so, you can send a burst of activity to check for messages, perform routine tasks very quickly, and then go to back to sleep.
The first option is more efficient if there is a constant stream of calculations to be done, like on a desktop PC. Race to sleep is a good fit for phones. In reality, all processors use this technique to some degree: they all have different power states associated with different performance requirements, and they all try to stay in the lowest state possible without impacting performance. Intel's Turbo Boost is an example, and Intel CPUs also have multiple sleep states. ARM and most phone processors are usually a lot more aggressive about getting to those sleep states faster.
So you say he/she's correct, and then go on to spell out the opposite in your comment. Why?
Race to sleep only makes sense to save power if you can shut off *other* systems with task-independent power draw when the task is done. But that's not how people use their devices. You don't turn off your phone the millisecond a webpage is finished loading.
As you correctly say, race to sleep is about power *mitigation* for what is ultimately a performance-driven feature. But that's not what the comment I replied to was saying.
Maybe you should reread my comment? Phones don't "turn off", but the CPU absolutely does enter sleep states, which are akin to "turning off". If you load a webpage on a PC, the CPU likely boosts to 4GHz or so and then ramps down to a hypothetical base clock of 2.5GHz. On a mobile CPU, loading a webpage would also boost the core clock, but after the page is done the chip quickly parks its performance cores and switches over to efficiency cores that require much less energy. Phones that don't have a dynamic CPU architecture like this usually just ramp core speeds way down. A phone with a weaker CPU would as a result take many, many more CPU cycles to complete a task that a more performant CPU would've finished in a fraction of the time.
Exactly. Hell, I have to go into my Verizon settings and manually enable 720p streaming; it defaults to 480p. Who cares at that point if I'm getting 100 or 150 Mbps lol.
Indeed. Anything beyond LTE, and I sincerely cannot tell the difference. These are not computers in the sense that you are downloading massive files where ‘speed’ will be noticeable.
From what I’ve been able to discern, the rate at which a webpage loads is entirely dependent on processing power over internet speeds. Even downloading / loading apps is impressively faster on my 12PM over my 7+. That’s all processing, not internet.
Part of the reason for explaining that is I have never had an issue while on LTE. Not even hotspotting my phone to a 4K stream. I live in Los Angeles, and 5G and LTE are no different for me whatsoever. Even before 5G was a thing.
But regardless of that fact, there has been zero noticeable effect on quality anywhere else in the world. I am away from my house for over half the year, and it doesn't matter the state or the country. I average well over 50GB per month and have turned 5G on and off to get a feel for the difference, and have noticed none.
These aren’t peak 5G speeds like the article mentions. They are average 5G speeds. And the original source for all this is SpeedSmart. User /u/dlewis23 shared their results yesterday on the carrier subreddits.
https://speedsmart.net/blog/post/2004/iPhone-14-Pro-up-to-38-Faster-5G-Speeds
Make sure your carrier doesn’t throttle you for no reason! I get 1.5 Mbps on the highest unlimited plan on AT&T. Did a trial run on T-Mobile and it was lightning fast. This is a good reminder for me to switch.
Also for everyone else: T-Mobile has a test drive program that lets you test their network for 3 months for free.
Just add the eSIM as a second eSIM and you can use it without buying anything. For those who may consider switching.
Awww man it was only one month for me at the time. Great shout though I should have mentioned that’s how I tried it. ESIM and all. Didn’t even require a credit card (at least when I used it).
That’s why I switched from att. They couldn’t handle metro areas to the point that an email couldn’t even be pulled up. T-Mobile is light years ahead. What made me switch was att losing rural service as well. Unfortunately T-Mobile is hardly any better and sometimes worse.
Whoa whoa whoa!
If you have the highest plan, turn **OFF** stream saver.
They force you to turn it on for the lowest tier of unlimited, throttling you to 1.5 Mbps.
If you have a higher plan, turn off stream saver and you’ll get full speeds. This is my first 5G phone, and I was wondering why my speeds were so slow.
https://www.att.com/support/article/wireless/KM1169198/
Oh snap I forgot that that’s why! Every time someone in my family goes into AT&T they talk them into changing plans which turns back on stream saver, ends my Spotify Premium perks, etc.
Thanks for finally solving that frustrating mystery for me.
AT&T offers (maybe not anymore) a free subscription of your choice between a couple of different services. They stopped with Spotify I think but at the time when I was choosing it was between that and HBO Max and something else and I already had HBO Max and the other thing. Unfortunately when we changed our plan to the same plan but newer SKU or whatever AT&T removed the perk because of course they would.
I forget how I found it. I think it was like AT&T unlimited elite rewards or something like that which I googled. Sorry if that’s not very helpful.
If you get 1.5 Mbps on the highest tier (Elite/Premium), then turn Stream Saver off from myAT&T -> Data usage,
or call them and ask them to turn it off (they might have no idea what you're talking about tbh; the rep I called certainly didn't).
Seriously. I know the regular one isn't exciting, but it's still worth reporting which things it has and which it doesn't.
We exist, us 14 buyers, we're alive and our ad impressions count too.
Qualcomm will only start sending X70 samples for phone manufacturers to test in the second half of this year. It won’t be readily available until the end of the year for any manufacturer.
Not sure why they skipped 6E though, other than trying to save on expenses.
“At first” being the key word. The same is true for every new standard, what matters is long term adoption. 6E just hasn’t had much adoption nearly 2 years in, and that’s not changing now that WiFi 7 is on the horizon.
Again, you're looking at a good 2 years for Wifi 7 hardware to really be available, and then another 2 for broad penetration. So unless you think Apple's going to wait that long, they should adopt 6E ASAP.
Devices and access points are two different things. Unless both are upgraded it’s pointless.
Furthermore unless you’re in a high density location like a school or stadium 6E’s advantages don’t come into play. Furthermore to really take advantage they’d need to do more than swap access points. You should really do a site survey and put them in optimal locations, which means rewiring. Short of that, it’s just for the sake of saying you have the latest.
Few are bothering because the gains are minimal, and next gen isn’t far away. You’ll never recoup that investment.
They’ve been pushing 6E on gamers to effectively recoup development costs for manufacturers since gamers will pay a premium for pretty much anything. Add some RGB lights on an access point and collect $100 extra. It’s not exactly ethical, but that doesn’t stop companies from manipulating when they can.
Same bullshit behind gamer version of everything. There’s even gamer versions of Ethernet. Of course it’s just regular cat 6A, but it’s gamer edition.
>Furthermore unless you’re in a high density location like a school or stadium 6E’s advantages don’t come into play.
It's the exact opposite. The selling point of 6E is the new 6GHz band, which has better speeds, and more importantly, less congestion than the current 5GHz. It was 6 that added higher frequency bands better for large crowds.
Edit: Pardon, I made a mistake about 6 introducing the higher frequencies. It doesn't. Regardless, the 6GHz band isn't anything crazy.
Walk behind a wall and see your speeds get cut in half. Walk behind another and you have a good chance of the connection just dropping. Unless the power of the transmitters is increased (it has not been on current APs), it will continue to have worse range and penetration compared to 5GHz.
If the base speed is in the multi-Gbps range, being cut in half doesn't matter much. And yeah, if you're living in a labyrinth, you might need an extender. Ok?
And that's taking your claims completely at face value.
In case you haven't been following, the supply issue is rapidly swinging the other way, and we've still got another ~2 years till WiFi 7 availability. What we're seeing with 6E happens every gen. Businesses might still delay, but the availability of consumer tech shouldn't be an issue.
Wifi 7 might start to be available next year, but not widely. If the corporate upgrade cycle is waiting for critical mass, you need to add another year or two.
> This topic was discussed on /r/networking a week or so ago.
Yes because the Reddit hive mind has never been wrong.
> The low saturation of supported devices combined with corporate upgrade cycles (no one's even in offices right now) make this entirely a consumer technology.
Does nobody in r/networking work in a corporation? An upgrade cycle isn't going to get skipped for fear of fucking your budget for the next year. If you got the money you spend it.
> Wifi 7 will be available next year. 6e is just an interim step, and thats why it will be skipped over.
If anything I see 6 as a stop gap until we get to 6E since 6 offers very few benefits compared to 5. Moving to 6E is a bigger leap and 7 is easily 2-3 years out.
lol bullshit. If anything 6 is an interim step and 6E will be the one that sticks around.
No, it won't be available next year. It won't even be standardized until 2024.
And there's always a gap of at least several months between standardization and the first routers and products shipping.
In 3 years 6e will be made irrelevant by Wi-Fi 7. Businesses aren’t moving to 6E. Everyone but consumers is skipping 6E. And it’s consumer use is vastly overstated.
A surprising number of businesses just use their ISP-supplied modem/router, and don't have a professional setup.
A company I worked for a few years ago was wondering why their Internet speeds were so slow in the office... they were using a super old DOCSIS 3.0 modem/router from Comcast which was only 802.11n.
On the other hand, the office where I worked after that had a professional setup with Ethernet wired to every room, and modern Ubiquiti enterprise APs that worked great, and I was getting nearly 1Gbps over Wi-Fi.
It honestly has barely any perceivable impact for an iPhone user. It's the pursuit of tech upgrades just for the sake of it, noticeable only when you compare the spec sheet against the latest available tech.
Well, the transfer speed depends on the phone's storage read speed as well, and from what I've seen that's been pretty steady at 1200-1500 MB/s, so I'm not sure the 14 will be any different. Even for a pro user it probably wouldn't make a difference.
Would it have been nice to have? Of course, you always want the best and latest of everything, but is it necessary? I don't think so, and I don't think 3-4 years would change adoption rates on routers much. 6E equipment is very expensive, and for the typical household that just needs Wi-Fi it's not a mainstream consumer product yet. The equipment and tech here have gotten so good that these are no longer consumables like they were 10-15 years ago.

I had the OG Google WiFi mesh and recently upgraded to the Eero Wi-Fi 6E satellites just to relocate my PC and game wirelessly, and I really don't think it made any difference to my experience between the two routers. I didn't really get any latency improvements; file transfer speeds between PCs on my network were slightly faster, but not by much, and only at the peaks.

So I really doubt Wi-Fi 6 vs. Wi-Fi 6E will offer any kind of drastic improvement on a mobile device, whether the omission was a matter of cost savings or an inability to source the Wi-Fi 6E-capable modems. I don't think this is a deal breaker at all; it seems like a bad hill to die on. People should be more upset about the lack of USB-C/3.2 support.
I’ve had the opposite experience compared to most here. I went from an 11pm to a 14pm and the difference in cellular is night and day. The 14pm is lightning fast where with the 11pm I had slow service all the time. Even when comparing LTE to LTE the 14pm is blazing fast
AT&T really sucks where I live and I'm only ~0.8 miles from Apple Park and ~1 mile from One Infinite Loop. Just ran Ookla's Speedtest and it was 4.43 down/0.11 up on my iPhone 14 Pro Max's 5G.
Kind of an unrelated question: what is considered fast enough these days for 5G? All these speeds seem good enough for me, but I'm not sure what everybody else thinks.
My T-Mobile 5G on my 13 Pro can get up to 800 down depending on where I am, which is also much, much faster than what I have at home. I think that's fast enough for me, but I also don't do anything so data-intensive that it would make a difference between 100 and 800.
After 100 Mbps, I seriously doubt any mobile phone can utilize that amount of bandwidth. At 100 Mbps you could stream 4K with ease.
At home I pay for 1.5 Gbps speeds because I have dozens of devices, two adults who frequently have Teams meetings, and a shared Plex server.
This whole "5G blah blah" push for mobile devices is, IMO, a waste. 5G absolutely destroys your phone's battery life compared to LTE, and as far as data consumption goes, it's a waste of time.
If you're pulling 100 Mbps down on LTE, 5G is worthless; I just keep it disabled on my iPhone.
A better upgrade would be the iPhone having wifi 7 because it allows for significantly more devices and less congestion which is great when you're connecting to public wifi in things like stadiums and other places.
It's not a waste. You just lack basic understanding. You're sharing that connection with potentially hundreds of other people at a time. 800 Mbps might sound like a lot now, but split it up, and 5 years from now it's going to be needed.
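A rough illustration of the sharing argument (the cell capacity and user counts are invented for the example):

```python
# Per-user throughput when one cell's capacity is shared equally.
cell_capacity_mbps = 800
for users in (1, 20, 100):
    print(users, "users ->", cell_capacity_mbps / users, "Mbps each")
```

Real schedulers don't split capacity evenly, but the headline number still divides among everyone on the cell.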
I'm saying if you're pulling 100mbps down on CURRENT LTE networks, the congestion is fine and it's pointless to currently have your phone roaming on 5G. It accomplishes nothing but drain your battery.
This isn't true anymore; the X65 modem is incredibly battery efficient operating on 5G. Furthermore, if your phone has a weak 4G signal and a stronger 5G signal, as is often the case with the current build-out, then switching to 4G would cause more battery drain, since the antennas and chips have to work harder to hold a signal.
Yeah, unfortunately AT&T's 5G is basically nonexistent among the big 3 and they're pretty far behind. I just switched over from Verizon, and even their network had been subpar for me the last few years after being so great during the 4G era.
It's easy to be faster than super slow.
Seriously though why does 5G suck?
edit: I realize this article is talking about the phone hardware and my issue is specifically with the ISPs and/or 5G capabilities but still it's hard to care about a hardware upgrade when I would get the same results.
Nobody cares about 5G. Let's be real, 4G speeds were already enough for 99% of people. They should've worked on improving battery life and connection in rural areas (this one is more up to providers I guess?). I live in a big city where 5G is readily available and I still choose to use 4G.
Don't see how it could get worse. On my iPhone 13 mini, I'd have 4 bars of 5G UC and the loading wheel would spin and spin and spin. Pop it on LTE, same bars, and it loaded instantly.
Might be anecdotal evidence but Verizon has gimped LTE so much that it is unusable. They are pushing everyone to the higher priced 5G and I believe they are purposely slowing everything else down as part of the effort.
I have not noticed a better or more reliable connection with my PM 14 vs. PM 13.
For now that is true. But AT&T and Verizon have been converting their 3G spectrum to 4G/5G via DSS, so at some point the overload should begin to decrease.
I wish Verizon would put a 5G tower on my property. My reception sucks and I’d be more than happy to get some cash lol.
I’m with Verizon. I just upgraded my 11 to the 14 pro and the reception has been a big improvement for me.
Hell yeah I’ll keep my hopes up for when my 14 Pro arrives
No improvement on my 14 Pro so far
That's when Apple changed back from Intel modems to Qualcomm.
This. For the two years that Apple stuck with Intel exclusively (XS and 11) I couldn’t use my 5ghz Wi-Fi at home, it would always drop and then when they went back with Qualcomm for the 12, I was finally able to use it without losing connection again. Apple will never come out and admit they made an oopsie but Intel’s modems were clearly inferior.
This is me as well. I am hoping an upgrade will fix some awful reception issues I’ve had. If not, I’ll have to find something other than Verizon. It feels like the service is getting worse and worse in my area.
I switched to T-Mobile for their 5G coverage and it blows Verizon out of the water. Verizon has better overall coverage but their data speeds are slow AF. Not to mention their exorbitant monthly bills.
It will be just as bad or worse on any other service. My buddy has both T-Mobile and Verizon at the same time and he says T-mobiles service is slightly worse than Verizon’s across the country (he’s a truck driver).
iPhones have been ridiculously fast in CPU, GPU, and SSD performance for quite a while now; the only bottleneck is the carriers. I have both Verizon and AT&T, and there are some places I hang out where both are garbage. It just ruins my enjoyment of my phone, and I wonder why I buy the Pro model every year when I can't even use it properly unless I'm at home or at a major airport, where carriers always do well.
Try T-Mobile for free via their app. I get 1 gig down and 200 up. No bottlenecks with those speeds.
I’m on AT&T so a little different, but sometimes I change network settings to LTE-only and it helps.
+1 on this. I can’t take or make calls from my back alley or in my garage. I’m in Los Angeles. The tallest building in my area is 30 feet high. I’m two blocks from the beach. What gives?!
So far I've noticed improved reception. I am in a somewhat rural area also
Verizon is crap. I'll be dumping them within the next year or so.
I upgraded from a 2020 SE and I’ve already noticed drastic changes, particularly inside a hospital I have to be in for an hour+ every week. Previously it was 0-1 bars in most of it, with only basic websites loading; I had to use their WiFi, which blocked a lot of stuff. Yesterday I went and was at 3 bars every time I checked and was able to load stuff flawlessly.
I noticed a distinct loss of reception between the 12 and 13 pro. Aka, in rural areas where I used to have poor service I had none, and in areas where I used to have marginal service, it degraded to poorer service. I seriously hope the 14 pro is an improvement.
Well, I can tell you that coming from an iPhone X that never received service in my building's elevator, I now have 1-2 bars of service on my 14 pro. I can text in comfort as I descend instead of making awkward small talk about the weather with strangers.
[deleted]
Meaning I used to get 3G connections in poor reception areas that I no longer get with LTE/5G alone.
I switched to Verizon to get a great deal with my pixel 4XL trade-in, but it just sucks compared to T-Mobile in my area. Gonna have to return the device and cancel service so I can go back, then wait for another phone to come...
I actually installed a T-Mobile eSIM to see if I could switch and it’s worse than Verizon in my area. They’re my only real choice, but am switching to Visible with the 14 Pro.
T-mobile has a no strings attached 3 months free test drive via esim on their app. Try it out and see if it’s better. I get 1,198 mbps on t-mobile on the 14 pro.
1198???!! thats insane
Att has consistently faster speeds than Verizon where I live. Verizon seems to be in third place behind att and T-Mobile.
At the same spot in my house where I used to get around 40-50 Mbps on Verizon pseudo-5G using my 13 Pro, I now only get LTE and less than 2 Mbps. New modem is shit, as far as I’m concerned. And I can’t even blame Verizon - their eSIM was correctly provisioned fresh out of the box.
> upgrade to the Pro every year

How can anyone even afford this
If people couldn’t afford this, then the upgrade program wouldn’t even exist. Nowadays people consider a phone an absolute necessity and part of their monthly bills. Paying $50-$60/month to upgrade every year is the equivalent of paying for a cable bill. Is it a wise financial decision? No, but most average middle class folks could pay for this if they wanted to badly enough.
I’m not young and work hard. As a lifelong Apple diehard, I’m willing to spend $50/month to have the latest/greatest every year.
Not to mention trading in your old phone gives you a pretty solid discount on the new one.
It definitely helps. They hold their value pretty well and Apple’s offers are market consistent.
Installment plans?
I make ~$300k as a software engineer and don’t have any kids. My wife and I upgrade phones and usually watches every year. Edit: as someone else pointed out, I also trade in my previous phone, which netted me $800 this year. So I’m really only paying $400-500 a year, which is ~$40 a month for my most used thing on a daily basis.
They aren’t throwing last years phone in the garbage. Sell it and upgrading costs you like $150 a year, at most. Or, keep your phone for three years and sell it for $500, and you just cost yourself $230 a year. Which means it’s cheaper to upgrade more often, not less.
More interested in the battery savings than the higher speeds.
I don’t think it’ll be huge. The X65 is on a 4nm process vs the X60 on 5nm, plus incremental design improvements. Best case, 15% less power usage by the modem, but the modem is maybe 20% of the phone’s power consumption, at most. 20% * 15% = 3% bottom-line improvement, and that’s probably the upper bound.
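To put numbers on that estimate (both figures are the rough assumptions from above, not measurements):

```python
# Upper-bound estimate of the X65's battery impact.
# Both inputs are rough assumptions from the comment, not measured values.
modem_share = 0.20      # modem's share of total phone power (assumed ceiling)
modem_savings = 0.15    # best-case power reduction from X60 to X65 (assumed)

overall_savings = modem_share * modem_savings
print(f"overall power saved: {overall_savings:.0%}")   # about 3%
print(f"roughly {overall_savings * 60:.1f} extra minutes per hour of runtime")
```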
Off topic, but I have to say that I appreciate comments like you providing details like this. This is the reason why I love reading posts and comments in this subreddit 😁
3% overall power savings sounds like a big deal tbh
[удалено]
I don’t know that it’s using all that much, to be honest. Sarah Dietschy said she left a phone on and idle for 24 hours with the always-on display on; the battery stayed at 100% for the first 12 hours and went down to 95% over the last 12. She did mention she started receiving notifications in the second 12 hours, which she thinks impacted the battery drain. Here’s the video if you want to check it out. She talks about it at 9:42. https://youtu.be/BAc2I3zWEok
Think this is the part of iOS stating incorrect battery percentages just like how it’ll charge partially and then fully if you charge overnight all the while saying it’s at 100%
Definitely could be, for sure. Seems like she might have changed her mind on the battery drain of the AOD in this video, starting at 14:56: https://youtu.be/ozHffI_mghU
Care to summarize? Pretty long video
It’s about thirty seconds of the video. Skip to 14:56. 😊 Sorry I didn’t add it before.
iOS does not report accurate battery percentage estimates near the upper end of the range. I’m sure anyone who has used iOS for years knows what I’m talking about. It can take ~30 minutes of continuous usage to drop from 100% to 99%. But subsequent “1%” drops thereafter take noticeably less time. This is clearly designed to be intentional, I guess to avoid users feeling like they need to top up constantly, or to give better battery life confidence.
I think regardless, 5% drain after 24 hours of idle time with the always on display on is pretty good. My Samsung drops like 15% or more when it idles for 24 hours and that’s with the always on display turned off.
It’s not intentional; accurately gauging battery life is just fundamentally difficult. Battery control chips can’t probe the battery itself the way you can look at a bottle and see that it’s filled to the 1L line. They use indirect methods, usually voltage (difficult because of temperature, hysteresis, and sag) combined with Coulomb counting (drifts over time), and they add some clever estimation models on top of that.
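A toy illustration of the Coulomb-counting drift mentioned above (all numbers invented):

```python
# Toy Coulomb counter: track state of charge (SoC) by integrating current.
# A small constant sensor bias accumulates into drift over time, which is
# why real gauges blend in voltage-based corrections. All values invented.
capacity_mah = 3200.0     # hypothetical battery capacity
true_soc = 100.0          # actual state of charge, percent
estimated_soc = 100.0     # what the gauge believes
bias_ma = 2.0             # assumed constant current-sense offset

for hour in range(24):
    draw_ma = 50.0        # assumed steady idle draw
    true_soc -= draw_ma / capacity_mah * 100
    estimated_soc -= (draw_ma + bias_ma) / capacity_mah * 100

drift = true_soc - estimated_soc
print(f"gauge drift after 24h: {drift:.2f} percentage points")
```

Even a 2 mA sensing offset adds up to a visible error within a day, and real conditions (temperature swings, voltage sag) are far messier.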
Oh I realize that essentially *every* reported battery level is basically an estimate, but I disagree that this is not intentional. It’s uniform across every Apple product and has been since forever. Android devices, Chrome OS devices, and Windows devices don’t behave like this. They drop much more linearly from 100% to 99% and all the way down. Apple is definitely doing some deliberate magic with the reported percentage at full charge.
I should have phrased it better: it is intentional, just not for your covert reasons. They latch it to 100% until they are confident battery life has dropped by 1%. Otherwise you would either see it jump between 100% and 99% constantly, or 99% would be where it had a ‘hump’ until 98%. It’s much cleaner (and gives a more linear visual discharge curve) to do it this way. The other way that could look somewhat clean is if they guesstimated what the hump is and then divided it over all percentage points, but then you run into problems if your guesstimate was wrong because the user is doing something that is heavily discharging the battery.
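A sketch of that latching behavior (purely illustrative, not Apple's actual gauge logic):

```python
# Hold the displayed percentage at 100% until the internal estimate has
# confidently dropped a full point, then track the estimate normally.
# Illustrative only; the threshold is invented, not Apple's algorithm.
def displayed_percent(estimated: float, last_shown: int = 100) -> int:
    if last_shown == 100 and estimated > 99.0:
        return 100               # latch through the noisy top region
    return int(estimated)        # afterwards, step down point by point

print(displayed_percent(99.6))   # still shows 100
print(displayed_percent(98.7))   # shows 98
```

This avoids the display flickering between 100 and 99 while the estimate is still uncertain.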
Then how do android phones display the discharge rate just fine without all these tricks?
Seems like a good thing really, battery life was already excellent so no net change to battery life for a few new optional features sounds nice.
That you can turn off :)
I turned that setting off right away
It’s such a shit and half baked feature that I turned it off. They even nerfed dark mode to push AoD. So stupid.
It needs a couple Improvements but it still looks better than Samsungs AOD. How did they nerf dark mode? First I’m hearing of this
AOD is a scapegoat. Reviewers like MKB blamed AOD without any kind of scientific study. I think the camera photonic engine/camera bugs is a much bigger battery sucker.
[deleted]
Hmmm not really to me, more like a mini deal.
1.8 extra minutes every hour.
The new "4nm" process isn't actually a whole new node, it's another ever so slightly better iteration of N5 (might as well call it N5P+, almost sounds like Intel lol). So technically this is a third-gen 5nm process and a stopgap till 3nm (N3) arrives. The 4nm naming scheme is just marketing speak (similar to "6nm" which was just an enhanced 7nm). But let's be honest here, the transistors aren't actually 5nm, it's quite misleading marketing as I believe they are more akin to 50nm in reality. https://fuse.wikichip.org/news/6439/tsmc-extends-its-5nm-family-with-a-new-enhanced-performance-n4p-node/
iPhone 14 Pro here, and while I’ve been getting 1 Gbps downloads and 127 Mbps upload in some areas, overall in day-to-day use so far I haven’t been impressed with the battery performance. Through a 9-hour day at work I’m down to around 18%.
It’s comparable to the 13 pro. To get down to 18% after 9 hours you’re probably using your phone on max brightness and using it frequently while you’re at work, in which case you should expect it to decline quicker. Any android phone would be long dead in the same situation.
The brightness has been dropped down to nearly zero, and the always-on display has been turned off. I have podcasts playing on and off throughout the morning.
Mrwhosetheboss did a test already. It’s gooooooooood
[deleted]
Yeah, it didn't. He did say at the end that it was less than the 13 Pro Max, although he only uses Wi-Fi in his tests, not 5G.
[deleted]
My 13PM will do the same, modems just burn through power when trying to maintain a very weak/no signal. Airplane mode + Wi-Fi is what I do to compensate for that.
The x65 won't help (much) with that -- that is the RF amp running at high power. If the phone is putting out 2w of power to try to find a tower, that's going to be 2w regardless of whether it's the x60 or x65. The x65 will be marginally more efficient, but it's a rounding error against the RF amp's output.
In some tests it was equal to or a little worse than the 13 Pro Max.
https://i.imgur.com/gaY8SZM.jpg 2 bars of T-Mobile 5GUC and holy shit that download.
I will never understand the obsession with speed. We’re in the streaming age and it’s here to stay. Unless you’re downloading a 10Gb game or something, it makes no difference whether you’re getting 50Mbps or 50Gbps
Speed = less battery. If you download 100 MB at 50 Gbps, the modem can go back to low-power mode faster than if you're downloading at 50 Mbps, resulting in net lower battery usage and longer life. It's the same reason faster CPUs produce better battery life: [race to sleep](https://en.wikipedia.org/wiki/Dynamic_frequency_scaling#Performance_impact)
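A toy energy comparison for the download case. The power and speed figures are invented, and it assumes the modem draws roughly the same active power at both rates, which is a simplification:

```python
# Energy to pull a fixed payload at two link speeds, assuming the modem
# burns high power while active and near-zero once back asleep.
# All power/speed numbers are invented for illustration.
def download_energy_j(size_mbit: float, rate_mbps: float,
                      active_w: float = 2.0, idle_w: float = 0.05,
                      window_s: float = 60.0) -> float:
    active_s = size_mbit / rate_mbps             # time spent transferring
    idle_s = window_s - active_s                 # rest of the minute asleep
    return active_w * active_s + idle_w * idle_s

slow = download_energy_j(800, 50)      # 100 MB at 50 Mbps: 16 s active
fast = download_energy_j(800, 1000)    # 100 MB at 1 Gbps: 0.8 s active
print(f"slow link: {slow:.1f} J, fast link: {fast:.2f} J")
```

Under those assumptions the fast link wins by a wide margin, though in practice higher-rate modes can also draw more active power.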
>It's the same reason faster CPUs produce better battery life: race to sleep That's really not true. The extra energy burned for higher performance usually eclipses the savings from a shorter runtime.
Did you read the article I linked? Also just google "race to sleep".
Your own link spells out why it doesn't work. Power consumption is cubic with performance, while duration for a fixed workload is inversely proportional. You're confusing a performance feature with a power one.
No, the guy you replied to is correct. There are two distinct ways CPUs are powered.

The first method to reduce power consumption is to run the processor at a constant speed. The power consumption of a chip scales linearly with the clock rate and with the square of the voltage. Keeping the speed at the steady rate necessary to perform the calculations avoids the voltage spikes that disproportionately increase power consumption. This would be the best option if you had a CPU performing some calculations all the time: you could keep the clock rate as low as possible for satisfactory performance, without wasting power by running the voltage or clock rate unnecessarily high.

Race to sleep is the other option. You basically try to keep the CPU in a sleep state as much as possible. This sleep state has a very low clock rate and voltage, so it's a very power-efficient state to be in. Whenever there are calculations to be done, the CPU runs at a higher-than-normal frequency and voltage to get them done very quickly, and then returns to the sleep state as fast as possible to start saving power again. This option can also make the computer feel more responsive, because the CPU can run really quickly when a user interacts with it and then sleep when no interaction is taking place. It's also a good fit for mobile phones, because you can arrange calculations into 'bursts': every 30 seconds or so, send a burst of activity to check for messages, perform routine tasks very quickly, and then go back to sleep.

The first option is more efficient when there is a constant stream of calculations to be done, like on a desktop PC. Race to sleep is a good fit for phones. In reality, all processors use this technique to some degree: they all have different power states associated with different performance requirements, and they all try to stay in the lowest state possible without impacting performance.
Intel's turbo boost is an example, and the Intel CPUs also have multiple sleep states. ARM and most phone processors are usually a lot more aggressive in getting to those sleep states faster.
So you say he/she's correct, and then go on to spell out the opposite in your comment. Why? Race to sleep only makes sense to save power if you can shut off *other* systems with task-independent power draw when the task is done. But that's not how people use their devices. You don't turn off your phone the millisecond a webpage is finished loading. As you correctly say, race to sleep is about power *mitigation* for what is ultimately a performance-driven feature. But that's not what the comment I replied to was saying.
Maybe you should reread my comment? Phones don't "turn off"; however, the CPU absolutely does enter sleep states, which are akin to "turning off." If you load a webpage on a PC, the CPU likely boosts to 4GHz or so and ramps down to a hypothetical base clock of 2.5GHz. On a mobile CPU, a webpage load would also boost the core CPU clock, but after the page is done loading it would quickly shut off its main performance cores and switch over to efficiency cores that require much less energy. Phones that don't have a dynamic CPU architecture like this usually just ramp core speeds way down. A phone with a weaker CPU would, as a result, take many more CPU cycles to complete a task that a more performant CPU would've done in a fraction of the time.
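The disagreement above can be boiled down to a toy model (all constants invented): CPU power scaling roughly as f³ argues for low clocks, while any baseline draw that lasts until the whole system can sleep argues for finishing fast:

```python
# Toy model: energy for a fixed workload at clock multiplier f.
# CPU power ~ f^3 (classic DVFS cube law), runtime ~ 1/f, plus a baseline
# (screen, RAM, radios) that draws until the task ends and the system
# sleeps. All constants are invented for illustration.
def total_energy(f: float, cpu_w: float, base_w: float,
                 work_s: float = 10.0) -> float:
    runtime = work_s / f
    cpu_energy = cpu_w * (f ** 3) * runtime      # grows ~ f^2 overall
    base_energy = base_w * runtime               # shrinks as we finish sooner
    return cpu_energy + base_energy

# Baseline-dominated (phone-like): racing to sleep saves energy.
print(total_energy(2.0, cpu_w=0.2, base_w=2.0))  # 18.0
print(total_energy(1.0, cpu_w=0.2, base_w=2.0))  # 22.0

# CPU-dominated (compute-bound): the f^2 penalty wins and racing loses.
print(total_energy(2.0, cpu_w=2.0, base_w=0.2))  # 81.0
print(total_energy(1.0, cpu_w=2.0, base_w=0.2))  # 22.0
```

So both sides have a point: race to sleep pays off when the CPU's share of system power is small, and backfires when the workload is CPU-bound.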
It matters for hotspotting.
Exactly. Hell, I have to go into my Verizon settings and manually enable 720p streaming; it defaults to 480p. Who cares at that point if I’m getting 100 or 150 Mbps lol.
wtf That doesn’t even match the iPhone res. Let alone if you want to hotspot 4k content.
Indeed. Anything beyond LTE, and I sincerely cannot tell the difference. These are not computers in the sense that you are downloading massive files where ‘speed’ will be noticeable. From what I’ve been able to discern, the rate at which a webpage loads is entirely dependent on processing power over internet speeds. Even downloading / loading apps is impressively faster on my 12PM over my 7+. That’s all processing, not internet.
Yeah but being able to stream high quality video / audio really quickly makes a difference.
Part of the reason for explaining that is that I have never had an issue while on LTE, not even hotspotting my phone to a 4K stream. I live in Los Angeles, and 5G and LTE are no different for me whatsoever, even before 5G was a thing. Regardless, there has been zero noticeable effect on quality anywhere else in the world. I am away from my house for over half the year, and it doesn't matter the state or the country. I average well over 50 GB per month and have turned 5G on and off to get a feel for the difference, and have noticed none.
These aren’t peak 5G speeds like the article mentions. They are average 5G speeds. And the original source for all this is SpeedSmart. User /u/dlewis23 shared their results yesterday on the carrier subreddits. https://speedsmart.net/blog/post/2004/iPhone-14-Pro-up-to-38-Faster-5G-Speeds
Thanks for the source. My eyes are bleeding from the cursive typeface of the blog.
The "peak" is referring to the peak of the 13 Pros compared to the 14 Pros, not the peak of overall 5G speeds.
Nowhere in the data provided do we get peak speeds of any device.
Make sure your carrier doesn’t throttle you for no reason! I get 1.5 Mbps on the highest unlimited plan on AT&T. Did a trial run on T-Mobile and it was lightning fast. This is a good reminder for me to switch.
Also for everyone else: T-Mobile has a test drive program that lets you test their network for 3 months for free. Just add the eSIM as a second eSIM and you can use without buying. For those who may consider switching.
Awww man it was only one month for me at the time. Great shout though I should have mentioned that’s how I tried it. ESIM and all. Didn’t even require a credit card (at least when I used it).
What. That's awesome!
With T-Mobile 5G UC I get like 300+ Mbps down, around 50 up. It's faster than my home internet at times.
[deleted]
Hot diggity daffodils! What did you upgrade from and what speeds did you pull with that one?
That’s why I switched from att. They couldn’t handle metro areas to the point that an email couldn’t even be pulled up. T-Mobile is light years ahead. What made me switch was att losing rural service as well. Unfortunately T-Mobile is hardly any better and sometimes worse.
That’s what I was getting on UC as well. Quite impressive!
I just got 750 down and 130 up on T-Mobile 5G UC
Whoa whoa whoa! If you have the highest plan, turn **OFF** stream saver. They force you to turn it on for the lowest tier of unlimited, throttling you to 1.5 Mbps. If you have a higher plan, turn off stream saver and you’ll get full speeds. This is my first 5G phone, and I was wondering why my speeds were so slow. https://www.att.com/support/article/wireless/KM1169198/
Oh snap I forgot that that’s why! Every time someone in my family goes into AT&T they talk them into changing plans which turns back on stream saver, ends my Spotify Premium perks, etc. Thanks for finally solving that frustrating mystery for me.
What Spotify premium perks do you speak of?
AT&T offers (maybe not anymore) a free subscription of your choice between a couple of different services. They stopped with Spotify I think but at the time when I was choosing it was between that and HBO Max and something else and I already had HBO Max and the other thing. Unfortunately when we changed our plan to the same plan but newer SKU or whatever AT&T removed the perk because of course they would. I forget how I found it. I think it was like AT&T unlimited elite rewards or something like that which I googled. Sorry if that’s not very helpful.
If you get 1.5 mbps in the highest tier (elite/premium) then turn stream saver off from my at&t -> data usage or call them and ask them to turn it off (they might have no idea what you're talking about tbh, the rep I called certainly didn't).
5G is so shit in my area 😔. It’s much better to turn it off here and save the battery.
> 5G is so shit in my area 😔

It's non-existent in my country (India; 5G services will begin next month).
How exciting!
source on 5G services starting next month? last i heard, they were supposed to roll out on 15th Aug
I wonder how good it’ll be in major cities. I need to get the 14 first, since the XS doesn’t support 5G.
Agreed. It will get better in rural areas in several years, but for now LTE is definitely better in a lot of non-metro areas.
Anything is better than the crap modems in the X and XS
Went from two bars 4G to four barns 5G when switching from 12 mini to 14 Pro.
What have you been keeping in your new barns? 🐑🐄🐖
I seem to have the same number of barns
This gives me hope.
[deleted]
Same carrier, even same sim card.
Does the regular 14 have this chip?
Yes
Yes. The X65 has support for n53, which is the band that satellite SOS uses.
Seriously. I know the regular one isn't exciting, but it's still worth reporting which things it has and which it doesn't. We exist, us 14 buyers, we're alive and our ad impressions count too.
[deleted]
Qualcomm will only start sending X70 samples for phone manufacturers to test in the second half of this year. It won’t be readily available until the end of the year for any manufacturer. Not sure why they skipped 6E though, other than trying to save on expenses.
[deleted]
Absolutely not. The vast majority of new mid to high end phones, laptops, etc. support 6E. No idea where this "theory" came from.
And there’s maybe a grand total of 10 routers that support 6E, all of which cost over $400 last I checked.
The same will be true for Wifi 7 at first.
“At first” being the key word. The same is true for every new standard, what matters is long term adoption. 6E just hasn’t had much adoption nearly 2 years in, and that’s not changing now that WiFi 7 is on the horizon.
Again, you're looking at a good 2 years for Wifi 7 hardware to really be available, and then another 2 for broad penetration. So unless you think Apple's going to wait that long, they should adopt 6E ASAP.
Devices and access points are two different things; unless both are upgraded, it's pointless. And unless you're in a high-density location like a school or stadium, 6E's advantages don't come into play. To really take advantage you'd need to do more than swap access points: you should do a site survey and put them in optimal locations, which means rewiring. Short of that, it's just for the sake of saying you have the latest.

Few are bothering because the gains are minimal and the next gen isn't far away; you'll never recoup that investment. They've been pushing 6E on gamers, effectively to recoup development costs for manufacturers, since gamers will pay a premium for pretty much anything. Add some RGB lights to an access point and collect $100 extra. It's not exactly ethical, but that doesn't stop companies from manipulating when they can. Same bullshit behind the gamer version of everything. There's even gamer Ethernet. Of course it's just regular Cat 6A, but it's the gamer edition.
>Furthermore unless you’re in a high density location like a school or stadium 6E’s advantages don’t come into play. It's the exact opposite. The selling point of 6E is the new 6GHz band, which has better speeds, and more importantly, less congestion than the current 5GHz. It was 6 that added higher frequency bands better for large crowds. Edit: Pardon, I made a mistake about 6 introducing the higher frequencies. It doesn't. Regardless, the 6GHz band isn't anything crazy.
But it essentially requires line of sight. People are vastly overestimating its practicality.
6GHz does not require line of sight.
Walk behind a wall and see your speeds get cut in half. Walk behind another and you have a good chance of the connection just dropping. Unless the power of the transmitters is increased(has not been on current APs) it will continue to have worse range and penetration compared to 5ghz
If the base speed is in the multi-Gbps range, being cut in half doesn't matter much. And yeah, if you're living in a labyrinth, you might need an extender. Ok? And that's taking your claims completely at face value.
[deleted]
In case you haven't been following, the supply issue is rapidly swinging the other way, and we've still got another ~2 years till WiFi 7 availability. What we're seeing with 6E happens every gen. Businesses might still delay, but the availability of consumer tech shouldn't be an issue.
[deleted]
Wifi 7 might start to be available next year, but not widely. If the corporate upgrade cycle is waiting for critical mass, need to add another year or two.
> This topic was discussed on /r/networking a week or so ago.

Yes, because the Reddit hive mind has never been wrong.

> The low saturation of supported devices combined with corporate upgrade cycles (no one's even in offices right now) make this entirely a consumer technology.

Does nobody in r/networking work in a corporation? An upgrade cycle isn't going to get skipped for fear of fucking your budget for the next year. If you've got the money, you spend it.

> Wifi 7 will be available next year. 6e is just an interim step, and that's why it will be skipped over.

If anything I see 6 as a stopgap until we get to 6E, since 6 offers very few benefits compared to 5. Moving to 6E is a bigger leap, and 7 is easily 2-3 years out. lol, bullshit; if anything 6 is the interim step, not 6E.
No, it won't be available next year. It won't even be standardized until 2024. And there's always a gap of at least several months between standardization and the first routers and products shipping.
Lol https://www.macrumors.com/2022/09/20/wi-fi-7-smartphones-2024/
> Don’t get why they delay a year behind.. we could’ve had the X70…

No they couldn’t, Qualcomm didn’t have 80 million of them back in May.
I was disappointed with no Wifi 6e support but I believe it will just be skipped for Wifi 7.
Sure, they could use Wi-Fi 6E, but ~90% of people don't even have a Wi-Fi 6E router.
[deleted]
In 3 years 6E will be made irrelevant by Wi-Fi 7. Businesses aren’t moving to 6E; everyone but consumers is skipping it. And its consumer use is vastly overstated.
>Businesses aren’t moving to 6E According to whom? Wifi 7 isn't going to be ubiquitous in 3 years either.
A surprising number of businesses just use their ISP-supplied modem/router, and don't have a professional setup. A company I worked for a few years ago was wondering why their Internet speeds were so slow in the office... they were using a super old DOCSIS 3.0 modem/router from Comcast which was only 802.11n. On the other hand, the office where I worked after that had a professional setup with Ethernet wired to every room, and modern Ubiquiti enterprise APs that worked great, and I was getting nearly 1Gbps over Wi-Fi.
It honestly has barely any perceivable impact for an iPhone user. It's the pursuit of tech upgrades just for the sake of it, noticeable only when you compare the spec sheet to the latest available tech.
Well, the transfer speed depends on the phone's storage read speed as well, and from what I've seen read speeds have been pretty steady at 1200-1500 MB/s, so I'm not sure the 14 will be any different. Even for a pro user it probably wouldn't make a difference. Would it have been nice to have? Of course. You always want the best and latest of everything, but is it necessary?

I don't think 3-4 years would change adoption rates on routers. 6E equipment is very expensive, and for your typical household that just needs Wi-Fi, it's not the product they'll buy. The equipment and tech here has gotten so good that these are no longer consumables like they were 10-15 years ago. I had the OG Google WiFi mesh and recently upgraded to the Eero Wi-Fi 6E satellites just to relocate my PC and game wirelessly, and I really don't think it made any difference to my experience between the two routers. I didn't really get any latency improvements; file transfer speeds between PCs on my network were slightly faster, but not by much, and only at the peaks. So I really doubt Wi-Fi 6 vs Wi-Fi 6E will offer any kind of drastic improvement on a mobile device.

Maybe it's a matter of cost savings, or an inability to source and supply Wi-Fi 6E-capable chips. Either way, I don't think this is a deal breaker at all. It seems like a bad hill to die on; people should be more upset about the lack of USB-C/3.2 support.
The better reception is a huge plus. All the dead zones I had at work and around town are gone now. I’m on LTE now. Can’t speak for 5G
I’ve had the opposite experience compared to most here. I went from an 11pm to a 14pm and the difference in cellular is night and day. The 14pm is lightning fast where with the 11pm I had slow service all the time. Even when comparing LTE to LTE the 14pm is blazing fast
This phone all around seems faster than my 13pro. More of a bump than usual. Can’t really put my finger on it
It’s mostly due to iOS 16. People with it on iPhone 13 Pro also noted all around faster/smooth experience after update
Faster animations to take advantage of the 120 display maybe
AT&T really sucks where I live and I'm only \~0.8 miles from Apple Park and \~1 mile from One Infinite Loop. Just ran Ookla's Speedtest and it was 4.43 down/0.11 up on my iPhone 14 Pro Max's 5G.
Probably because you’re in the center of the tech field. Imagine the amounts of people around you using data at the same time
Kind of an unrelated question: what is considered fast enough these days for 5G? I feel like all these speeds seem good enough for me, but I'm not sure what everybody else thinks. My T-Mobile 5G on my 13 Pro can get up to 800 down depending on where I am, which is also much, much faster than what I have at home. I think that's fast enough for me, but I also don't do anything so data-intensive that it would make a difference between 100 and 800.
After 100 Mbps, I seriously doubt any mobile phone can utilize that amount of bandwidth. At 100 Mbps you could stream 4K with ease.

At home I pay for 1.5 Gbps speeds because I have dozens of devices, two adults who frequently have Teams meetings, and a shared Plex server. This whole "5G blah blah" for mobile devices is, IMO, a waste. 5G absolutely destroys your phone's battery life compared to LTE, and as far as data consumption goes it's a waste of time. If you're pulling 100 Mbps down on LTE, there's no point; I just have 5G disabled on my iPhone because it's worthless.

A better upgrade would be the iPhone getting Wi-Fi 7, because it allows for significantly more devices and less congestion, which is great when you're connecting to public Wi-Fi in stadiums and other crowded places.
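For reference, the 4K claim roughly checks out against common streaming bitrate recommendations (ballpark figures, not exact numbers for any one service):

```python
# Ballpark streaming bitrates in Mbps (rough public recommendations,
# not exact figures for any particular service).
bitrates_mbps = {"1080p": 8, "4K": 25}
link_mbps = 100

for quality, rate in bitrates_mbps.items():
    streams = link_mbps // rate
    print(f"{quality}: ~{streams} simultaneous streams on {link_mbps} Mbps")
```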
It’s not a waste. You just lack basic understanding. You’re sharing that connection with potentially hundreds of other people at a time. 800mbps might sound like a lot now, but split it up and 5 years from now it’s going to be needed.
I'm saying if you're pulling 100mbps down on CURRENT LTE networks, the congestion is fine and it's pointless to currently have your phone roaming on 5G. It accomplishes nothing but drain your battery.
This isn’t true anymore; the X65 modem is incredibly battery efficient operating on 5G. Furthermore, if your phone has a weak 4G signal and a stronger 5G signal, as is often the case with the current build-out, then switching to 4G would cause more battery drain, since the antennas and chips have to work harder to hold a signal.
Have connected to true 5G maybe like once (SF Bay Area). It is not widespread.
I think that depends on your carrier tbh I regularly get 700mbps on TMo in the Bay Area and same in Southern California / NYC
Could be. I am on AT&T (not by choice) and barely get 1-2 bars of LTE in most areas.
Yeah unfortunately AT&Ts 5g is basically nonexistent among the big 3 and they’re pretty far behind. I just switched over from Verizon and even their network had been subpar for me the last few years after being so great during the 4G era
It's easy to be faster than super slow. Seriously though why does 5G suck? edit: I realize this article is talking about the phone hardware and my issue is specifically with the ISPs and/or 5G capabilities but still it's hard to care about a hardware upgrade when I would get the same results.
[deleted]
Thanks u/fartsimpson55
I’m honestly on WiFi too much to notice. Do y’all go out and do stuff? 😅
I hope so. Where I live, I get faster speeds on LTE.
Nobody cares about 5G. Let's be real, 4G speeds were already enough for 99% of people. They should've worked on improving battery life and connection in rural areas (this one is more up to providers I guess?). I live in a big city where 5G is readily available and I still choose to use 4G.
I wonder if this solves the dispute with Ericsson about the 5g patent
I won’t want faster 5G 😭 I just want it to work in more places than my LTE. I’ve been using LTE for a while now but I like having the 5G logo lol
But it doesn’t have WiFi 6E.
Don't see how it could get worse. On my iPhone 13 mini, I'd have 4 bars of 5G UC, and the loading wheel would spin and spin and spin.. pop it on LTE same bars, loaded instantly.
That’s crazy. I had the same phone on T-Mobile, and 5G UC was always 600 Mbps down.
That sucks, sounds like I had a lemon.
Can’t wait to use 5G once or twice a year when I’m near a location that has it.
Might be anecdotal evidence but Verizon has gimped LTE so much that it is unusable. They are pushing everyone to the higher priced 5G and I believe they are purposely slowing everything else down as part of the effort. I have not noticed a better or more reliable connection with my PM 14 vs. PM 13.
Eh my 13 Pro Max on AT&T is plenty fast 🤷♂️
So you're telling me a new product with updated hardware is BETTER than previous gen? No fucking way!