Yeah, no. This is not about chargers or batteries or phones or cars. This study is about improved charge/discharge rates for supercapacitors.
Supercaps have very high power density (they can move charge in and out extremely fast), but extremely low energy capacity. Put them in a phone or a car and it would run very fast for five minutes. Supercaps are useful, don't get me wrong, but they're not batteries.
Very cool research from CU Boulder, but the journalism leans way too far into clickbait.
Increasing capacitance (how much charge is stored to reach a certain voltage) or the voltage it is charged to would indeed increase the capacity. Putting several in parallel would work, as would making a bigger capacitor. The main problem as far as I can tell is that the energy density of even supercapacitors is low, so you'd need a much larger volume to have the same capacity (and thus a much thicker phone).
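To put rough numbers on that: capacitor energy is E = ½CV², so doubling capacitance (e.g. two cells in parallel) doubles stored energy. A quick sketch, using a typical commercial 3000 F / 2.7 V supercap cell as an illustrative example (not a figure from the study):

```python
def cap_energy_wh(capacitance_f, voltage_v):
    """Energy stored in a capacitor, E = 1/2 * C * V^2, converted to watt-hours."""
    joules = 0.5 * capacitance_f * voltage_v ** 2
    return joules / 3600.0

# One large 3000 F supercap cell rated for 2.7 V:
single = cap_energy_wh(3000, 2.7)        # ~3 Wh

# Two identical cells in parallel double the capacitance, hence the energy:
parallel = cap_energy_wh(2 * 3000, 2.7)  # ~6 Wh

# Compare with a ~12 Wh phone battery: you'd need several of these cells,
# each of which is already far bulkier than the phone battery itself.
print(single, parallel)
```

That volume penalty is exactly the "much thicker phone" problem.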
As of 2010, the best ultracapacitors could store only about 5% of the energy that lithium-ion rechargeable batteries can, limiting them to a couple of miles per charge.
A supercapacitor buffer will cost around twice as much and deliver around 1/10th the watt-hours of a similarly-sized lead acid battery. And lead acid isn't exactly great to begin with.
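Combining those two rough figures (2x the cost, 1/10th the watt-hours) gives about a 20x worse cost per watt-hour. A sketch with made-up example prices, just to show the arithmetic:

```python
# Hypothetical lead-acid baseline (numbers are illustrative, not quoted prices):
lead_acid_cost, lead_acid_wh = 100.0, 600.0   # $ and Wh

# Per the rough figures above: ~2x cost, ~1/10th the storage for the same size.
supercap_cost = 2 * lead_acid_cost
supercap_wh = lead_acid_wh / 10

print(lead_acid_cost / lead_acid_wh)   # lead-acid $/Wh
print(supercap_cost / supercap_wh)     # supercap $/Wh, ~20x the battery
```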
Capacitors are useful, but only in applications where the total amount of energy stored is more-or-less unimportant.
Solid point. A laptop battery is around 60Wh, and charging that in 1 minute would pull 3.6kW from the outlet, or roughly double what a US residential outlet can deliver.
Supercaps stay pretty cool under high current charging/discharging, but your laptop would have to be the size of a mini fridge.
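The outlet math from the comment above, spelled out:

```python
# Charging a 60 Wh laptop battery in one minute:
battery_wh = 60.0
minutes = 1.0
power_w = battery_wh * 60.0 / minutes   # Wh per minute -> watts
print(power_w)                          # 3600.0 W, i.e. 3.6 kW

# A standard US residential outlet (120 V, 15 A breaker) tops out around:
outlet_w = 120 * 15                     # 1800 W
print(power_w / outlet_w)               # 2.0 -> double what the outlet can deliver
```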
The research paper itself was only talking about using the tech for wearable electronics, which tend to be tiny. The article probably made the cars-and-phones connection for SEO. Good tech, bad journalism.
Stories like this have been posted every so often to reddit. I'll believe this is possible when I see it available in consumer electronics (and not just lab conditions).
No, never. Current charging rates already get close to thermal constraints. Hitting one-minute charging rates would require either accepting much lower energy density or using way more metal per cell. This research might inform design changes that improve charging rates, but we'll never see high-capacity batteries charging in a minute.
The researchers know this and only mention wearable and IoT applications. The article author erroneously makes the leap to high energy density devices.
If you don't care about energy density at all, ceramic capacitors can already charge and discharge in microseconds.
The researchers who wrote the paper only mention applying the tech to very small things like wearables and IoT devices, where a large capacitor might be relevant. It's the journalist summarizing it who makes the wild claims about phones and cars, which don't tend to use capacitors for energy storage for a bunch of reasons, not least of which is that a capacitor tends to be physically about twenty times larger than a battery of the same capacity.
If people were able to deal with batteries anywhere near that large, then I'd imagine most of them would choose twenty times the battery life/range over being able to charge fast enough to overload a wall outlet / small power plant.
More of an actual comment, good. Capacitors that are more efficient in both speed and heat would certainly help with charging devices of all sizes. Of course it wouldn't mean charging large batteries in seconds, but that doesn't mean no improvement.
No, it doesn't affect devices of all sizes, only devices that might use this specific bulky capacitor; all other devices will show exactly zero improvement, because there is no real point in mixing supercapacitors in with a large battery for bulk storage. Being able to quickly regain three minutes of charge for every whole hour of battery capacity you replace with capacitors just isn't that useful: you might as well stay plugged in for an extra few minutes and get the same charge, plus keep that extra hour before needing to find a charger at all.
As for EVs, the case is even weaker: being able to drive half a mile down the street from a charger massive enough to output a small power plant's worth of electricity is too specialized a use case to be worth the loss of range and the greater degradation of the rest of the battery.
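A sanity check on that "three minutes per hour" figure, assuming the rough 20x volume ratio between capacitors and batteries mentioned above:

```python
# If supercaps are ~20x bulkier per watt-hour than a battery (rough ratio
# cited above), replacing the volume that held one hour of battery runtime
# with supercaps yields:
hours_replaced = 1.0
size_penalty = 20.0                            # cap-vs-battery volume ratio
cap_runtime_min = hours_replaced * 60.0 / size_penalty
print(cap_runtime_min)                         # 3.0 minutes of runtime
```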
Almost every electrical system on the planet uses capacitors, especially high-power systems. EVs included.
"No real point in mixing capacitors in with a large battery" ?? That's done literally all the time for both filtering and for intermittent high power output. Like when I say almost every electrical system uses caps, I mean almost every electrical system.
Obviously nearly every electrical circuit board uses capacitors in some respect, especially for filtering and smoothing, but it is extremely rare for them to be used for bulk energy storage outside of things like adjusting power factor.
Given we are talking about charging times, which are limited primarily by the battery's charge-current-vs-degradation curve and not at all by the various small capacitors in the charger's electronics, there is fundamentally no effect on charge times unless you are replacing the energy storage medium itself with supercapacitors.
We can already supply enough DC power to charge an EV battery at its maximum designed curve via DC fast charging stations, which involve some contactors and shunts but actually don't involve capacitors of any size at all in the car itself on the HV side.
Or, you know, reducing thermal load by using broadly more efficient capacitors, allowing you to shove more current into the car. Or by meeting grid-scale requirements for car charging by smoothing out the grid impact of a bunch of cars charging at once. Or any number of benefits.
Ultimately this certainly benefits car charging. It benefits all electronics. No you won't be getting two second car charges with this.
Again, there are no capacitors on the car side producing thermal load during DC fast charging in the first place, and that thermal load is not the primary barrier to how much current can go into the battery without degradation anyway. After all, if it were, we would just upscale the car's heat pump and be charged in five minutes.
Car charging is not coordinated to the point where everyone plugs in within a few seconds, and if it were, a few-second randomizer on when each charge timer actually starts would accomplish the exact same effect without hundreds of millions to billions of dollars in new grid-scale capacitors and inverters.
This is also unlikely to become a significant problem, because much of the grid is moving to battery-backed solar and wind, where the limit is price per megawatt-hour, and those batteries can provide far more current than the grid could consume. You might be limited by inverter capacity, but storage capacitors are also fundamentally a DC technology, so you would need inverters anyway.
This may turn out to have benefits for electronics that already rely on specialized supercapacitors, but by definition it cannot have any impact on processes that are not currently limited in any way by capacitor technology, like battery bulk charge current, the thing that actually limits how fast a car can fast charge.
Not really trying to argue, just trying to explain the high-current DC battery systems I have experience with to someone who seems to have some conceptual understanding of what individual components do, but not how and why they are used or where the limitations come from.
Them being confidently incorrect doesn’t help of course. :)
That being said, you haven't really given me much to work with as to where these misconceptions are coming from, beyond a journalist confusing applications for batteries and capacitors, and this really seems to be going nowhere, so bye.
So an electric car might hold 100 kWh. To charge that in 1 minute you would need 6,000 kW of power, or 6 MW. Typical "rapid" chargers today do 350 kW, and those are the kind that are difficult to find. A nuclear plant makes around 1,000 MW, so around 167 cars charging at once would consume its entire output.
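The arithmetic behind those figures:

```python
# Delivering a 100 kWh pack's worth of energy in one minute:
pack_kwh = 100.0
charge_minutes = 1.0
power_mw = pack_kwh * 60.0 / charge_minutes / 1000.0  # kWh/min -> MW
print(power_mw)                  # 6.0 MW per car

# Against a ~1,000 MW nuclear plant:
plant_mw = 1000.0
print(plant_mw / power_mw)       # ~166.7 -> about 167 cars saturate the plant
```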
Battery degradation is ridiculously unpredictable. I've seen 100-cycle batteries at 75% and 2500-cycle batteries over 90%. I only dealt with like 5 MacBook models.
Depending on what battery protection modes are in play, many have smart charging or other features designed to prolong life. Also a fair few batteries come out with greater than design capacity from the factory. It's called a design capacity and not an absolute capacity for a reason. A phone battery that left the factory at 110% could conceivably still be at or above 100%.
FYI, it's not overnight charging that's the issue either, it's charging to 100%. What one device considers 100% varies, and devices will essentially lie to you about it. 4.2V is normally considered 100% full for lithium cobalt oxide batteries, yet some devices push higher than this while others skirt under it, to pad capacity and cycle life respectively. It's about tradeoffs.
Consider that the energy output of a 12-gauge shotgun is approximately 4500 newton-meters and, from personal experience, can rotate a first-gen iPad at an extrapolated 240 rpm (extrapolated, as this proved difficult to sustain). That gives us an equivalent of 113 kW! A modern iPad would only need about 13 kW to charge in one second.
So, one shotgun shell could easily charge your iPad and those of 7 friends instantaneously, although the results are difficult to appreciate.
The claim that it’s equivalent to 16GB is incorrect, probably closer to 10-12GB on a Windows computer. Linux however is different, and in some cases better than MacOS at memory usage.
It would be nice if someone built a spin-gravity space station with lunar- and Martian-gravity levels. We could see how people fare after several months in those environments before setting off to build permanent bases.
I've maintained this idea for a while as well. It's really only after Pichai took over that Google and Android both started scrapping useful programs/apps/services, made needless changes that make products worse, and in general just haven't really innovated much at all. At least when compared to how things were run when Larry Page and Eric Schmidt were in charge.
Any recommendations from the community for models? I use ChatGPT for light work like touching up a draft I wrote, etc. I also use it for data related tasks like reorganization, identification etc.
I would also recommend faraday.dev as a way to try out different models locally using either CPU or GPU. I believe they have a build for every desktop OS.
bgr.com