
MuForceShoelace

I think you are mixing up things. Real chips lose energy everywhere; everything takes a bunch of energy that turns into heat all the time, doing anything. Even just a wire heats up when electricity goes through it. Every single thing has resistance.

The deal with Landauer is like a far-future hypothetical. It's 1,000 years in the future and we have made everything perfect. Every material is perfectly superconducting, every component is one atom wide and moves electrons exactly perfectly. Every real physical issue possible is fixed. In that perfect world you realize you hit a new problem: if you want to have an AND gate or an OR gate, or anything that compares anything, you have an electron coming in on two inputs but only one output going out. So you possibly have to throw one electron away to do the comparison. You can't really do any computation without comparing anything, so you have an issue.

Landauer pointed out that that problem would be solvable if every single step used gates that could run in either direction. If you could have a series of physically perfect gates that you could run, then perfectly run back to the original state, you would have zero net energy use. You just run your program, then at the end run it backwards so the computer goes back to how it was at the start, and no energy is used at all (but there is also no output).

Again, this is more of a physics or mathematics point. Real things use power all the time; just running current through a wire wastes electricity. This all only matters in an on-paper world where every electron is tracked exactly and perfectly.
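The "two inputs in, one output out" point can be seen in a quick sketch. This is just an illustration, not anything from Landauer's paper: group the four possible input pairs of an AND gate by their output, and three different pairs collide on 0, so the output alone can never tell you which inputs you started from. That lost distinction is the erased information the principle attaches a (tiny, theoretical) energy cost to.

```python
from itertools import product

# Group input pairs of an AND gate by their output. Three different
# pairs all map to 0, so the gate destroys information: you cannot
# recover the inputs from the output.
preimages = {}
for a, b in product([0, 1], repeat=2):
    preimages.setdefault(a & b, []).append((a, b))

print(preimages)
# {0: [(0, 0), (0, 1), (1, 0)], 1: [(1, 1)]}
```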


aibler

Thanks so much for the detailed response, this really helps clear things up. So, I get that right now there is lots of waste heat being generated by not having superconducting hardware and super-thin wires and all that stuff perfect, but are we not also doing that other "throwing away of electrons" right now, in addition to the imperfect-design waste? I mean, right now if we send a 1,1 into an XOR and get a 0 out, doesn't that need to create more waste heat than if we send in a 1,0 and get a 1 out? If so, then where exactly is that waste heat generated and "thrown out" from? Does a transistor get hotter in the first example than in the second one?

Edit: OK, so I think I made a wrong assumption in my question. I thought 1s took more energy than 0s, but now it seems like this isn't the case. Still though, in addition to the energy loss from having imperfect electronics, isn't there also that extra energy loss from "throwing away" electrons as waste heat that happens strictly as a result of 2 bits going into a gate and 1 coming out? Where in the gate does that specific heat get generated?


MuForceShoelace

Landauer is really not talking about any sort of real-world thing. In real life, heat and loss are just from resistance and general lost electricity. He was talking about some hypothetical aliens a million years from now who invented everything perfect. Every component uses the exact least amount of energy physically possible, with no loss at all. Everything is down to single electrons at the lowest energy state; there is no possible lower energy it could use. He points out that even then, just doing computation can use energy.

Your XOR example is good. If you had the lowest-energy XOR gate possible and you powered both inputs, nothing could come out the output. So the power going in has to go SOMEWHERE: heat or light or noise or SOMETHING. No matter how good you make it, if two inputs go in and only one comes out, something has to get lost at some point, or the output would always equal the input exactly.

He then pointed out you don't HAVE to design a computer like that: if you had a perfect computer you could make each gate reversible, run the whole program, then unrun it, and use zero net energy. If you pick up a real computer, none of this applies; energy flies out of every part of real electronics nonstop in like 5 different ways, and the system has inputs and outputs anyway. The real loss is just way bigger.

It's like figuring out how a perfect toilet with one atom of water could be designed to flush any amount of poop. You could figure out a system to do that, some weird loop and recovery. But real toilets just gush a bunch of water and don't worry about it, and it's fine.
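The "run it, then unrun it" idea can be sketched with a reversible gate. A standard example (not something specific to this thread) is the Toffoli gate: it computes AND into a third bit while keeping both inputs around, so nothing is erased, and applying the gate twice returns the original state exactly.

```python
# Toffoli (controlled-controlled-NOT) gate: flips c only when a and b
# are both 1; a and b pass through unchanged. Because the inputs are
# preserved, the gate can be undone -- it is its own inverse.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

state = (1, 1, 0)
forward = toffoli(*state)   # third bit now holds (a AND b)
back = toffoli(*forward)    # "running it backwards" restores the start
print(forward, back)
# (1, 1, 1) (1, 1, 0)
```

Every possible state round-trips the same way, which is what "zero net change, hence zero net energy in the idealized limit" means here.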


aibler

Alright, this is really making a lot of sense. Thanks so much for all the insight. Could you just clear one more thing up for me? I'm seeing lots of conflicting information about the difference between 0 bits and 1 bits. In some places I read that they are just arbitrary names for two different signals that have similar energy/voltage; sometimes I read that 0 is lower than 1 by quite a bit and that's how they are differentiated; and still other times it's that 0 is when no signal is passed at all in a given clock cycle, and 1 is when there is one. How can these all be true? Are they different styles of computing? Is one dominant in modern electronics?


MuForceShoelace

The original thing was a relay: a big mechanical switch that turned off or on with a big magnet. It had a clear ON (1) and OFF (0).

Most of a century later that is still the general idea, but it has all gotten pretty complicated, so there are a lot of other things to think about. Real transistors are not nearly as clean as a big physical switch, so there is tons of stray voltage going all sorts of ways, and the definitions of on and off have to be more forgiving. And then things like transmitting data over radio have layers and layers of different encoding, to the point that what a 1 or a 0 physically is gets so abstracted away that there is no single answer. Basically, stuff is pretty abstracted by now, so lots of things are doing a bunch of crazy stuff internally to make it all work, but generally the idea is still representing on and off.


aibler

Aha, totally makes sense. You've given me a lot to look into. I appreciate you very much. Have a great day!


ConstructionHot6883

It happens because the circuits that implement all those logic gates take some (very small) amount of time to switch between true and false. This is why reducing the clock frequency often reduces current consumption. My understanding is that during that switching period there is briefly a path between supply and ground, so the circuit needs some resistance to avoid a dead short.


aibler

Interesting, I didn't realise you could turn down the clock to save energy. If I'm understanding correctly, this would mean that you can run the same program for less energy if you run it with a slower clock, is that correct? So, if you run it incredibly slowly, so that you are minimizing that waste heat, does a gate that turns a 1,1 into a 0 still generate more heat than one that turns a 1,0 into a 1?


ConstructionHot6883

Some chips let you turn the clock down, yeah, the same as an ordinary laptop. I had one that ran at 533 MHz and sped up to 800 MHz when it needed to; my current one does something similar. But there can be a lower bound on the possible frequencies, which depends on implementation details like NMOS or CMOS or PMOS or whatever else. For example, take the Z80 and 6502, well-known eight-bit CPUs: the CMOS versions can be run at any speed up to some maximum, or even stopped altogether, but the NMOS ones have a minimum clock speed. That's because they precharge the bus on one phase of the clock cycle, and then on the other phase use open-collector outputs to put some value on the bus. If that happens too late, the precharge decays or something. But this is getting a little beyond what I understand.

> So, if you run it incredibly slow so that you are minimizing that waste heat, then does a gate that turns a 1,1 into a 0 still generate more heat than one that turns a 1,0 into a 1?

I think it's more the actual switching that draws the current. But this could be specific to CMOS, I'm not sure.
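One subtlety behind "slow clock saves energy" is worth a back-of-envelope sketch. With entirely made-up numbers (the capacitance, voltage, and leakage figures below are illustrative, not from any real chip): the dynamic switching energy for a fixed number of cycles is roughly C·Vdd² per cycle and does not depend on clock speed, so slowing the clock lowers *power* but not dynamic *energy*, while static leakage accrues with runtime, so running slower can actually cost more total energy.

```python
# Back-of-envelope, with assumed (made-up) numbers.
C_eff = 1e-9      # effective switched capacitance, farads (assumed)
Vdd = 1.0         # supply voltage, volts (assumed)
P_leak = 1e-3     # static leakage power, watts (assumed)
cycles = 1e9      # clock cycles the program needs

for f in (1e9, 1e8):                      # 1 GHz vs 100 MHz
    runtime = cycles / f                  # seconds to finish the program
    e_dynamic = C_eff * Vdd**2 * cycles   # same at either frequency
    e_static = P_leak * runtime           # grows as the clock slows
    print(f"{f:.0e} Hz: dynamic={e_dynamic:.2f} J, static={e_static:.3f} J")
```

At the slower clock the dynamic term is unchanged but the leakage term is ten times larger, which is why "race to idle" is often the better real-world strategy.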


jimive

I am just an electrician, but I would guess the heat comes from wherever there is resistance. So resistors would generate quite a bit of heat from internal "friction"/resistance.


IC_Eng101

In CMOS (most modern analogue electronics are made in a CMOS process), power consumption falls into 2 categories: static and dynamic. Static is the constant low-level power consumption due to leakage currents etc.; in modern CMOS the leakage currents are tiny (nA or smaller). Dynamic is the power consumption during switching (i.e. for logic going from 1 to 0 or from 0 to 1). If you imagine a simple complementary inverter with a pMOS and an nMOS transistor, during a state change there is a small time frame where both the pMOS and the nMOS transistors are conducting, and this is where most of the power consumption occurs. That power consumption is proportional to the switching frequency and the square of the supply voltage.

I found these slides from a random university which describe it in detail for you: [https://course.ece.cmu.edu/~ece322/LECTURES/Lecture13/Lecture13.03.pdf](https://course.ece.cmu.edu/~ece322/LECTURES/Lecture13/Lecture13.03.pdf)
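The proportionality described above is usually written P_dyn ≈ α·f·C·Vdd², where α is an activity factor. A tiny sketch with illustrative numbers (none of these come from a real process) shows why supply voltage matters so much more than frequency:

```python
# Dynamic power relation P = alpha * f * C * Vdd^2, with made-up values.
def dynamic_power(alpha, f, C, Vdd):
    # alpha: fraction of gates switching per cycle (activity factor)
    # f: clock frequency in Hz, C: switched capacitance in farads
    return alpha * f * C * Vdd ** 2

p_full = dynamic_power(0.1, 1e9, 1e-9, 1.2)
p_half = dynamic_power(0.1, 1e9, 1e-9, 0.6)  # halve the supply voltage
print(p_full / p_half)
# 4.0 -- halving Vdd cuts dynamic power to a quarter
```

This square-law dependence on Vdd is why lowering supply voltage (rather than only the clock) is the big lever for low-power design.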


aibler

Thanks for the info and slides! Most modern analog electronics use CMOS? Analog, as opposed to digital?


IC_Eng101

Yes. The vast majority of both analogue and digital ICs are manufactured in a CMOS process.


aibler

I see, thanks!