this post was submitted on 01 Aug 2023
323 points (97.1% liked)
Technology
Can someone explain in lay terms why this is relevant? I read somewhere that it was considered as important as the invention of the transistor but didn't catch why.
Superconductors have no electrical resistance, and resistance is basically electricity's friction.
Superconducting materials make the strongest electromagnets, they have big applications in quantum stuff (which I don't understand well enough to explain), and they're used in the tokamak, a specific kind of fusion reactor. They're useful in anything where electrical resistance is bad. When electricity is resisted, we lose some of it as heat, so a superconducting wire would lose none and never heat up from resistance.
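To put numbers on the "electricity's friction" idea, here's a quick sketch of Joule heating, P = I²R. The current and resistance values below are made up for illustration, not taken from the article:

```python
# Joule heating in a conventional wire vs. a superconductor.
# P_loss = I^2 * R -- zero resistance means zero loss and zero heat.

def joule_loss_watts(current_amps: float, resistance_ohms: float) -> float:
    """Power dissipated as heat in a resistive conductor."""
    return current_amps ** 2 * resistance_ohms

current = 100.0          # amps through the wire (illustrative)
copper = 0.5             # ohms for some length of copper cable (illustrative)
superconductor = 0.0     # ohms, by definition

print(joule_loss_watts(current, copper))          # 5000.0 W wasted as heat
print(joule_loss_watts(current, superconductor))  # 0.0 W
```

Every watt in that first number is energy you paid for that just warms up the cable.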
Superconductors traditionally have to be super freaking cold. A lot of these applications can only be done with liquid nitrogen or even colder things, keeping them superconducting. You can do some things with pressure to help out with that, but the point is it's not easy to keep a material superconducting. This effort translates to costs, often prohibitive ones, as you need to actively keep these materials from collecting any heat.
If this research pans out, though, this kind of superconductor will just work at standard temperature and pressure. These could go into standard circuits, they can sit around without bleeding money on upkeep, they're very cool.
People are comparing them to transistors in part because before transistors we had vacuum tubes. Vacuum tubes do the same thing as a transistor, but they're effectively a lightbulb. They burned out, they produced heat, and they couldn't be miniaturized. Transistors were magic at the time because we could do so much more with them than with vacuum tubes, and a room-temperature superconductor would be the same kind of leap.
Thanks for the explanation. So, this means we are another step closer to quantum computers for example?
I'm trying to grasp this concept and how we could see it in our daily lives. Better batteries? I thought of that because they get hot when charging, but I'm not sure if that's because of resistance. Does going into standard circuits mean we'll have better SoCs? Better integrated circuits? Faster computers or phones?
I'm trying to think of a daily-life application, but maybe it won't have a direct impact on that area; maybe it's more about facilitating research that will eventually turn into daily-life stuff?
A conductor with no resistance is a big deal for many electrical applications. Electrical resistance is often a big part of design. Removing that aspect changes things significantly. Electrical power losses and the size of conductors can be greatly reduced.
I've read lots of unsubstantiated claims about superconductors. A solution has to be producible in quantity at a reasonable cost; otherwise it's not going to be a breakthrough. I mean, we currently have expensive and bulky superconductor solutions, but they're limited to applications where that's reasonable, such as MRI machines and particle accelerators.
An inexpensive room temperature superconductor would make the most difference in tech sectors such as power transmission, electromechanical systems, and power electronics. These are areas where power loss due to circuit resistance is a big part of design. The impact would be minimal for computing and logic. There may be areas where power loss can be reduced, but logic relies on semiconductors, which must have resistance to function; it's in the name, the "semi" implies resistance.
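The power-transmission case is easy to sketch: for power P delivered at voltage V, the line current is I = P/V, so the lost fraction is P·R/V² — which is also why grids use high voltages in the first place. The figures below are illustrative, not real grid numbers:

```python
# Fraction of delivered power lost to transmission-line resistance.
# For power P delivered at voltage V, the line current is I = P / V,
# so the loss is I^2 * R and the lost fraction works out to P * R / V^2.

def loss_fraction(power_w: float, voltage_v: float, resistance_ohms: float) -> float:
    """Share of transmitted power dissipated in the line."""
    current = power_w / voltage_v
    return (current ** 2 * resistance_ohms) / power_w

p = 100e6    # 100 MW delivered (illustrative)
r = 10.0     # ohms of total line resistance (illustrative)

print(loss_fraction(p, 400_000, r))    # conventional 400 kV line: ~0.6% lost
print(loss_fraction(p, 400_000, 0.0))  # superconducting line: 0.0, no loss
```

Fractions of a percent sound small until you multiply them by an entire grid's throughput, around the clock.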
Would this potential superconductor work in devices like phones and laptops? Would it lead to more efficient operation?
If inexpensive, it could be used in power components for consumer electronics like phones and laptops, but it wouldn't make a huge difference, since most of the power consumption occurs in chips and displays where superconductors wouldn't apply. It could still lead to some reduction in size and better efficiency. Battery-operated devices are considered low power; high-power applications are where superconductors offer the most benefit.
Better batteries, yeah; that's down the line. Devices would still generate heat during actual use, just less of it.
It could also enable the most efficient commercial energy storage, though I expect the cost will be prohibitive. Transmitting electricity always incurs losses, but not through a superconductor, so these will have a lot of uses at power generation sites, both reducing heat and storing energy losslessly (until it enters the traditional grid).
It won't directly translate to faster tech or anything like that, but by helping quantum computing it could do so indirectly.
Definitely it's more of a facilitating-research kind of thing. You can't play with superconductors in a lab in a cost-efficient way right now, but this could change that.
Also, maglevs and MRIs already use superconductors directly, so that's a direct use: lower-cost MRIs and incredibly fast trains.
Heat is a huge barrier to increasing clock speeds, so a room temperature and pressure superconductor would actually fairly directly translate to major performance gains in computing.
While true, that'd only be for a superconducting CPU. I doubt this material can both superconduct and act as a transistor, and even if it can, I highly doubt you could pack in anywhere near the amount we have in standard CPUs. So while we might replace a standard power supply with a superconducting one, and reduce heat that way, I don't see any direct computing boosts from this. We could superconduct everything around a CPU, have superconducting wires, but the heat from a CPU is generated in the silicon.
It'll be pretty nice to have near-100%-efficient PSUs, though. Definitely some gains there, just not the same revolutionary ones seen elsewhere.
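The reason the silicon itself stays the bottleneck: switching (dynamic) power in CMOS logic scales roughly as P ≈ C·V²·f, which depends on transistor capacitance, supply voltage, and clock frequency — not on the resistance of the wiring around the chip. A rough sketch, where the capacitance and voltage figures are order-of-magnitude guesses for illustration:

```python
# Dynamic (switching) power of CMOS logic: P ~= C * V^2 * f.
# A superconducting wire can't remove this heat source, because it comes
# from charging and discharging transistor gates, not from wire resistance.

def dynamic_power_watts(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    """Approximate switching power of a CMOS chip."""
    return capacitance_f * voltage_v ** 2 * freq_hz

c = 20e-9    # ~20 nF effective switched capacitance (illustrative guess)
v = 1.2      # core voltage in volts (illustrative)
f = 4e9      # 4 GHz clock

print(dynamic_power_watts(c, v, f))  # ~115 W of heat from the silicon itself
```

That's why swapping every wire and power stage for a superconductor still leaves a hot CPU in the middle.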
This is where my mind went. I wondered if the reduction in heat would allow further overclocking, or higher default clocks, on both CPUs and GPUs. I don't know that much about the actual hardware and how it works, though.
Not really. First, standard equipment is limited by cost, not technology. Nothing is stopping some power user from cooling a desktop with liquid nitrogen; it's just costly. Superconductor tech would be bleeding edge, and it wouldn't cost any less for a long time. Supercomputing, on the other hand, has had access to more esoteric cooling systems and can already use them, including the extreme-cold superconductors that already exist.
The real issue there is the CPU makes the heat, but this tech isn't a transistor. We can't replace the silicon chips with superconducting ones, at least not in a form dense enough to be a CPU. There's lots of small improvements around the CPU we can make, but those aren't at the "wow, this will revolutionize technology" level. They're cool but it's the other stuff that's gonna get the focus.
Managing heat is a large part of circuit design. Superconductors could fundamentally change everything about it, meaning far smaller, much faster, and more capable designs in every way. As an example, 95%+ of the bulk of a modern CPU or GPU assembly is cooling hardware; the actual chip is tiny compared to the whole component.
Got it. So it'll eventually lead to developing or improving everyday stuff. I hope this material becomes a reality.
I think it'll be first used in energy transmission.
If it's actually a thing.
One thing you'd definitely notice in daily life is the development of fusion reactors. They're significantly safer than regular nuclear reactors (which run on fission), and also, theoretically, a lot cheaper. The current downside of fusion reactors is that, so far, it usually takes more energy to run one than it produces; in other words, it doesn't generate enough energy to be worth building. A big share of that energy goes into keeping the reactor's superconducting magnets cold enough to function.

Since room-temperature superconductors wouldn't need cooling, a lot of the energy spent keeping the magnets cold would become unnecessary. That would significantly speed up the development of fusion reactors, to the point where we might even see fusion on our energy grid in our lifetimes. Basically, if this claim is true, you can expect energy costs to become virtually negligible and the world to run almost completely on clean, abundant energy.
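The energy-balance problem above is usually boiled down to one number, the gain factor Q = P_out / P_in: below 1 the reactor is a net consumer, and a power plant needs Q well above 1. The power figures below are illustrative, not measurements from any real reactor:

```python
# Fusion viability in one number: the gain factor Q = P_out / P_in.
# Cheaper, uncooled magnets don't raise fusion output, but they shrink
# the input power, which is what pushes Q past break-even.

def gain_factor(fusion_power_mw: float, input_power_mw: float) -> float:
    """Ratio of fusion power produced to power consumed running the reactor."""
    return fusion_power_mw / input_power_mw

# Today: a big chunk of the input power goes to cryogenics for the magnets.
print(gain_factor(fusion_power_mw=500, input_power_mw=600))  # Q < 1: net loss

# With room-temperature magnets, the same output needs far less input power.
print(gain_factor(fusion_power_mw=500, input_power_mw=300))  # Q > 1: net gain
```

Same reactor output in both cases; only the overhead changes, and that's enough to flip the sign of the whole enterprise.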
Well I really hope this is real then and more importantly it translates to cheap, clean energy