The 2nd Law of Thermodynamics -- A Probabilistic Law

In our lead-in discussion to why we need a 2nd law, we point out that energy conservation (the 1st law of thermodynamics) suffices to rule out many thermal phenomena that don't happen, like things getting warmer without any source of warmth. But there are many thermal phenomena that don't happen yet are perfectly consistent with the 1st law, like thermal energy flowing from a cold object to a hotter object. In order to codify and elaborate our understanding of these results, we turn to the ideas of probability to understand how energy tends to be distributed.

A probabilistic law

That seems a bit strange. What does a discussion about probabilities have to do with a physical law? Physical laws are always true, aren't they? And isn't probability really about things that are only sometimes true?

In many ways, molecules in physics are like multi-sided dice, and the likelihood that a particle will be located in a particular location in space (or have a particular energy) is analogous to the likelihood that a multi-sided die will land on a particular side. There are many different ways for the molecules to move, and the details of why they move in one way or another are very sensitive to exactly where they are and how they are moving, and are very much out of our control. A "random motion model" for molecules is much more useful than a model that tries to calculate the motion of every molecule.

The likelihood that all the smoke particles in a smoke-filled room will move, as a result of their chaotic motions, into one corner of the room is analogous to the likelihood that nearly all the coins in a set of 10^23 tosses will land on heads. It's very, VERY unlikely! If you tossed that many coins over and over again for the lifetime of the universe (14 billion years), the odds that you would see all heads are still minuscule, totally ignorable. This extremely low probability is what transforms a "probability statement" into a "physical law."

The reason you will never see the smoke particles accumulate in one small corner is that there are many, many more ways for the smoke particles to distribute themselves uniformly throughout the room than there are ways for the particles to all be located in just one corner of the room. That said, just as it is not impossible for all 10^23 tosses to land on heads, it is not impossible that all the smoke particles will spontaneously move to one corner of the room ... just don't hold your breath waiting for it to happen.
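The coin-flip argument is easy to check numerically. Here is a minimal Python sketch (standard library only; the sample sizes are illustrative) showing that the fraction of heads clings to 1/2 while the all-heads probability collapses:

```python
import random
from fractions import Fraction

random.seed(0)

def fraction_heads(n_flips):
    """Simulate n_flips fair coin tosses; return the fraction landing heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

def p_all_heads(n_flips):
    """Exact probability that every one of n_flips tosses lands heads."""
    return Fraction(1, 2) ** n_flips

# With many tosses, the observed fraction of heads hugs 1/2, just as
# smoke spreads uniformly through a room rather than piling into a corner.
print(fraction_heads(100_000))        # close to 0.5

# All-heads is already hopeless for a few hundred coins, never mind 10^23.
print(float(p_all_heads(100)))        # about 8e-31
```

Even 100 coins put the all-heads outcome far beyond anything you could wait for; the 10^23 case in the text is unimaginably smaller still.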

Microstates and macrostates

More generally we can say that when the number of atoms or molecules in a system is large, the system will most likely move toward a thermodynamic state for which there are many possible microscopic "arrangements" of the energy. (And it will be very unlikely to move toward a thermodynamic state for which there are very few possible microscopic arrangements.) If this seems mysterious, go back to the discussion of coin tosses; it's a pretty good analogy. The H/T ratio (say, 5/5), which we refer to as a macrostate of the system, is analogous to a thermodynamic state of a system, where only the pressure, temperature, and density of the various molecules are specified. A particular sequence that realizes that H/T ratio (say, HTTHTTHHHT), which we refer to as a microstate, is analogous to the specification of the spatial and energetic arrangement of each of the atoms/molecules that compose a particular thermodynamic state.

As we saw in the coin toss discussion, if one only looks at the macrostate description, one is much more likely to get a H/T result that corresponds to a large number of possible arrangements. Likewise, one is much more likely to get an atom/molecule distribution that corresponds to a large number of arrangements.
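The counting behind this claim can be sketched in a few lines of Python for the 10-toss example from the coin discussion; each macrostate is a number of heads, and the binomial coefficient counts its microstates:

```python
from math import comb

n = 10  # number of coin tosses

# Each macrostate is a total number of heads k; comb(n, k) counts the
# microstates (distinct H/T sequences) that realize it.
microstates = {k: comb(n, k) for k in range(n + 1)}

total = sum(microstates.values())              # 2**10 = 1024 sequences
most_likely = max(microstates, key=microstates.get)

print(microstates[0], microstates[5])          # 1 252
print(most_likely)                             # 5: the even split wins
print(round(microstates[5] / total, 3))        # 0.246
```

The all-heads macrostate has exactly one microstate; the 5/5 macrostate has 252, so a random toss sequence is overwhelmingly more likely to land near the even split.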

The second law

The Second Law of Thermodynamics can now be stated in this qualitative way: 

When a system is composed of a large number of particles, the system is exceedingly likely to spontaneously move toward the thermodynamic (macro)state that corresponds to the largest possible number of particle arrangements (microstates).

There are a few really important words to take note of in this definition.

First, the system must have a LARGE number of particles. If the system has just a few particles, it is not exceedingly likely that the particles will be in one state rather than another; only when the number of particles is large do the statistics become overwhelming. If one tosses a coin just twice, there is a reasonable chance (namely, 25%) of obtaining all heads.

Second, the system is EXCEEDINGLY LIKELY, but not guaranteed, to move toward the state for which there are the most particle arrangements. The larger the number of particles, the more likely this is, but it is never a guarantee.

Third, this law does not specify the specific nature of these "arrangements." It may be that we are only interested in spatial location, in which case an arrangement corresponds to the spatial location of each particle in the system. More arrangements would then correspond to more ways of positioning the particles in space. In other contexts we may be interested in energy, and arrangements would then correspond to the set of energies carried by the system's constituents. In either case, the most likely thermodynamic state is the one for which there are the most microscopic arrangements.

Biological implications

The Second Law of Thermodynamics is a statistical law of large numbers. But we have to be careful. Although biological systems almost always consist of a huge number of atoms and molecules, in some critical cases there are a very small number of molecules that make a big difference.

For example, a cell may contain only a single copy of a DNA molecule containing a particular gene. Yet that single molecule may be critical to the production of protein molecules that are critical to the survival of the cell. For some processes a small number of molecules in a cell (fewer than 10!) can make a big difference. On the other hand, a cubic micron of fluid in an organism typically contains on the order of 10^14 molecules! The second law of thermodynamics is indispensable in analyzing biological systems in countless contexts, but it is essential to understand it well, not just to use it mindlessly. (See the associated problem to estimate some molecules in a cell that might not behave according to our probabilistic laws.)

Just as the distribution of the number of heads in our coin flips got narrower as the number of flips got larger, the probability that our results are those predicted by statistical mechanics (the most probable macrostates) gets sharper and sharper. The variation around that perfect probability (corresponding to an infinite number of flips or particles) is called a fluctuation. The scale of fluctuations can be estimated crudely as about 1/(square root of the number of particles). So for 10^14 molecules, our corrections due to fluctuations are about 1 part in 10^7; whereas, if we have only 100 = 10^2 molecules, our fluctuations are expected to be about 1 part in 10^1, or 10%. But again, fluctuations may play a crucial role in the processes of a living cell. Learning to estimate when the standard rules of thermodynamics may safely be applied can be very valuable!
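The 1/(square root of N) estimate can be sketched directly. For fair coin flips it follows from the binomial mean n/2 and standard deviation sqrt(n)/2 (a standard result, assumed here rather than derived in the text):

```python
from math import sqrt

def relative_fluctuation(n):
    """For n fair coin flips: mean heads = n/2, std = sqrt(n)/2,
    so the relative spread is 1/sqrt(n)."""
    return (sqrt(n) / 2) / (n / 2)

print(relative_fluctuation(100))        # 0.1, i.e. 10%
print(relative_fluctuation(10**14))     # 1e-07, i.e. 1 part in 10^7
```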

Entropy

Since the number of microstates corresponding to a particular macrostate plays a critical role, we need a way to count them in order to quantify what's going on with the probabilities. The number of arrangements is so large that it turns out to be convenient to work with a smaller number: the log of the number of microstates. This is just like counting the powers of 10 in a large number rather than writing out all the zeros. For a very large N, the number 10^N is considerably larger than N itself! And it turns out that working with the log of the number of microstates is very much more convenient.

Essentially what is happening is that when you put two systems together (imagine combining two boxes of gases into one), the number of microstates of the combination is basically the product of the number of microstates in each. (If we flip a coin 10 times, the number of microstates is 2^10. If we flip it another 10 times, the new number of microstates is 2^20, the product of 2^10 with 2^10.) If we take the log of the number of microstates, then when we combine two systems, the logs of their numbers of microstates add to give the log of the total number. This turns out to be both easier to work with and to lead to a number of nice ways of expressing things mathematically.
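The multiply-becomes-add point in the coin example can be verified in a couple of lines of Python:

```python
from math import log2

w1 = 2 ** 10          # microstates for 10 coin flips
w2 = 2 ** 10          # microstates for another 10 flips
w_combined = w1 * w2  # combining systems multiplies the counts: 2**20

# Taking logs turns that product into a sum.
print(log2(w1), log2(w2), log2(w_combined))   # 10.0 10.0 20.0
assert log2(w_combined) == log2(w1) + log2(w2)
```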

The log of the number of distributions of the energy that correspond to the thermodynamic state of a system is termed the "entropy" of the system, and is given the symbol $S$. Another way of stating the Second Law, therefore, is to say that systems are exceedingly likely to spontaneously move toward the state having the highest entropy $S$. Using the symbol $W$ to represent the number of arrangements of the energy that correspond to a particular thermodynamic state, we can write an expression for entropy as follows:

$$S = k_B \ln{W}$$

The constant $k_B$ is called Boltzmann's constant, and its value is $1.38 \times 10^{-23} \mathrm{J/K}$.  (Yes, it's the same constant we ran into in our discussion of kinetic theory of gases — chemistry's gas constant $R$ divided by Avogadro's number, $N_A$.) The important thing to take from this equation is that the entropy $S$ is a measure of the number of arrangements $W$. As $W$ goes, so goes the entropy. 
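A small sketch of $S = k_B \ln{W}$ in Python (the $W$ values here are illustrative, borrowed from the coin example):

```python
from math import log

K_B = 1.38e-23  # Boltzmann's constant, J/K

def entropy(w):
    """Boltzmann entropy S = k_B ln W for a state with W microstates."""
    return K_B * log(w)

# Doubling the number of microstates raises S by exactly k_B ln 2,
# no matter how large W already is.
s_small = entropy(2 ** 10)
s_double = entropy(2 ** 11)
print(s_double - s_small)   # k_B * ln 2, about 9.6e-24 J/K
```

The tiny size of $k_B$ is what tames the astronomical values of $W$ into the everyday entropy values of thermodynamics.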

But of course the number $W$ is usually a HUGE number, and counting up arrangements to arrive at its value would usually take you forever. Fortunately, it is very rarely the case that we actually need to do the counting. Rather, we usually need only to compare two thermodynamic states and to decide which one is consistent with the greatest number of microscopic arrangements. That is the state to which the system will evolve. 

Systems

When discussing the Second Law of Thermodynamics, it is crucial to be very careful about defining the system that one is considering. While it is always the case that the entropy of the universe is overwhelmingly likely to increase in any spontaneous process, it is not necessarily the case that a particular sub-system of the universe will experience an increase in entropy.

If the system being studied is isolated, i.e., if no matter or energy is allowed to enter or leave the system, then the system's entropy will increase in any spontaneous process.  But, if the system is NOT isolated, it is entirely possible its entropy will decrease.

Stated more generally, it is entirely possible that one part of the universe will exhibit an entropy decrease during a spontaneous process while the rest of the universe exhibits a larger increase in entropy, such that the overall entropy in the universe has increased. All of this is just to say that it is of utmost importance to be clear about the system to which the Second Law of Thermodynamics is being applied.  

It is not obvious at this stage that the statement of the Second Law of Thermodynamics presented here will be practically useful in understanding which processes in nature are spontaneous and which ones are not. What, for example, does any of this have to do with the fact that heat spontaneously transfers from hot objects to cold objects and not the other way around? What does this have to do with chairs sliding across a room? And what does it have to do with the electrostatic potential across a biological membrane? As it turns out, the Second Law of Thermodynamics as defined above can in fact explain those examples.

Ben Geller 11/8/11 and Joe Redish 12/8/11

Source: https://www.compadre.org/nexusph/course/The_2nd_Law_of_Thermodynamics_--_A_Probabilistic_Law

What Is the First Law of Thermodynamics?

The First Law of Thermodynamics states that heat is a form of energy, and thermodynamic processes are therefore subject to the principle of conservation of energy. This means that heat energy cannot be created or destroyed. It can, however, be transferred from one location to another and converted to and from other forms of energy. 

Thermodynamics is the branch of physics that deals with the relationships between heat and other forms of energy. In particular, it describes how thermal energy is converted to and from other forms of energy and how it affects matter. The fundamental principles of thermodynamics are expressed in four laws.

“The First Law says that the internal energy of a system has to be equal to the work that is being done on the system, plus or minus the heat that flows in or out of the system and any other work that is done on the system," said Saibal Mitra, a professor of physics at Missouri State University. "So, it’s a restatement of conservation of energy." 

Mitra continued, "The change in internal energy of a system is the sum of all the energy inputs and outputs to and from the system similarly to how all the deposits and withdrawals you make determine the changes in your bank balance.” This is expressed mathematically as: ΔU = Q – W, where ΔU is the change in the internal energy, Q is the heat added to the system, and W is the work done by the system. 
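The bank-balance analogy translates directly into code. A minimal sketch of ΔU = Q − W, with the sign conventions stated above (Q is heat added to the system, W is work done by the system; the numbers are made up):

```python
def delta_u(q, w):
    """First law: Delta U = Q - W.
    q: heat added TO the system; w: work done BY the system (same units)."""
    return q - w

# 500 J of heat flows in while the system does 200 J of work on its
# surroundings: internal energy rises by 300 J.
print(delta_u(500, 200))    # 300

# A system that does work while receiving no heat loses internal energy.
print(delta_u(0, 100))      # -100
```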

History

Scientists in the late 18th and early 19th centuries adhered to caloric theory, first proposed by Antoine Lavoisier in 1783, and further bolstered by the work of Sadi Carnot in 1824, according to the American Physical Society. Caloric theory treated heat as a kind of fluid that naturally flowed from hot to cold regions, much as water flows from high to low places. When this caloric fluid flowed from a hot to a cold region, it could be converted to kinetic energy and made to do work much as falling water could drive a water wheel. It wasn't until Rudolf Clausius published "The Mechanical Theory of Heat" in 1879 that caloric theory was finally put to rest.

Thermodynamic systems

Energy transfer can be divided into two parts, according to David McKee, a professor of physics at Missouri Southern State University. One is the human-scale, macroscopic contribution, such as a piston moving and pushing on a system of gas. The other involves things that happen at a very tiny scale, where we can't keep track of the individual contributions.

McKee explains, “When I put two samples of metal up against each other, and the atoms are rattling around at the boundary, and two atoms bounce into each other, and one of them comes off faster than the other, I can’t keep track of it. It happens on a very small time scale and a very small distance, and it happens many, many times per second. So, we just divide all energy transfer into two groups: the stuff we’re going to keep track of, and the stuff we’re not going to keep track of. The latter of these is what we call heat.”

Thermodynamic systems are generally regarded as being open, closed or isolated. According to the University of California, Davis, an open system freely exchanges energy and matter with its surroundings; a closed system exchanges energy but not matter with its surroundings; and an isolated system does not exchange energy or matter with its surroundings. For example, a pot of boiling soup receives energy from the stove, radiates heat from the pan, and emits matter in the form of steam, which also carries away heat energy. This would be an open system. If we put a tight lid on the pot, it would still radiate heat energy, but it would no longer emit matter in the form of steam. This would be a closed system. However, if we were to pour the soup into a perfectly insulated thermos bottle and seal the lid, there would be no energy or matter going into or out of the system. This would be an isolated system. 

In practice, however, perfectly isolated systems cannot exist. All systems transfer energy to their environment through radiation no matter how well insulated they are. The soup in the thermos will only stay hot for a few hours and will reach room temperature by the following day. In another example, white dwarf stars, the hot remnants of burned-out stars that no longer produce energy, can be insulated by light-years of near perfect vacuum in interstellar space, yet they will eventually cool down from several tens of thousands of degrees to near absolute zero due to energy loss through radiation. Although this process takes longer than the present age of the universe, there’s no stopping it.

Heat engines

The most common practical application of the First Law is the heat engine. Heat engines convert thermal energy into mechanical energy and vice versa. Most heat engines fall into the category of open systems. The basic principle of a heat engine exploits the relationships among heat, volume and pressure of a working fluid. This fluid is typically a gas, but in some cases it may undergo phase changes from gas to liquid and back to a gas during a cycle. 

When gas is heated, it expands; however, when that gas is confined, it increases in pressure. If the bottom wall of the confinement chamber is the top of a movable piston, this pressure exerts a force on the surface of the piston causing it to move downward. This movement can then be harnessed to do work equal to the total force applied to the top of the piston times the distance that the piston moves. 
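The work-from-pressure argument above can be sketched numerically. The pressure, piston area, and stroke below are illustrative numbers, not from the text; the point is that force times distance and pressure times volume change give the same answer:

```python
# Work on a piston: W = F * d with F = P * A, identical to W = P * dV.
def piston_work(pressure_pa, area_m2, distance_m):
    force_n = pressure_pa * area_m2     # newtons
    return force_n * distance_m         # joules

P = 2.0e5   # 200 kPa of gas pressure
A = 0.01    # 100 cm^2 piston face
d = 0.05    # the piston moves 5 cm

w_force = piston_work(P, A, d)
w_pdv = P * (A * d)                 # same number via P * (volume change)
print(w_force, w_pdv)               # 100.0 100.0 (joules)
```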

There are numerous variations on the basic heat engine. For instance, steam engines rely on external combustion to heat a boiler tank containing the working fluid, typically water. The water is converted to steam, and the pressure is then used to drive a piston that converts heat energy to mechanical energy. Automobile engines, however, use internal combustion, where liquid fuel is vaporized, mixed with air and ignited inside a cylinder above a movable piston driving it downward. 

Refrigerators, air conditioners and heat pumps

Refrigerators and heat pumps are heat engines run in reverse: they use mechanical energy to move heat. Most of these fall into the category of closed systems. When a gas is compressed, its temperature increases. This hot gas can then transfer heat to its surrounding environment. Then, when the compressed gas is allowed to expand, its temperature becomes colder than it was before it was compressed, because some of its heat energy was removed during the hot cycle. This cold gas can then absorb heat energy from its environment. This is the working principle behind an air conditioner. Air conditioners don’t actually produce cold; they remove heat. The working fluid is transferred outdoors by a mechanical pump, where it is heated by compression. Next, it transfers that heat to the outdoor environment, usually through an air-cooled heat exchanger. Then, it is brought back indoors, where it is allowed to expand and cool so it can absorb heat from the indoor air through another heat exchanger.

A heat pump is simply an air conditioner run in reverse. The heat from the compressed working fluid is used to warm the building. It is then transferred outside where it expands and becomes cold, thereby allowing it to absorb heat from the outside air, which even in winter is usually warmer than the cold working fluid. 

Geothermal or ground-source air conditioning and heat pump systems use long U-shaped tubes in deep wells or an array of horizontal tubes buried in a large area through which the working fluid is circulated, and heat is transferred to or from the earth. Other systems use rivers or ocean water to heat or cool the working fluid. 


Jim Lucas is a contributing writer for Live Science. He covers physics, astronomy and engineering. Jim graduated from Missouri State University, where he earned a bachelor of science degree in physics with minors in astronomy and technical writing. After graduation he worked at Los Alamos National Laboratory as a network systems administrator, a technical writer-editor and a nuclear security specialist. In addition to writing, he edits scientific journal articles in a variety of topical areas.
Source: https://www.livescience.com/50881-first-law-thermodynamics.html
Thermodynamics I
The universe consists of matter and energy. Energy, regardless of its form, has two properties: intensity and capacity. The table below lists examples of this. Thermodynamics is the study of the relationships between heat and other forms of energy. Thermodynamics has three basic "laws". These laws can be represented mathematically and are useful in understanding the way the universe behaves.

Intensity and Capacity Factors of Energy
 

Energy Form      Intensity Property             Capacity (Extensive) Property   Common Energy Unit (Work)
Heat (Thermal)   Temperature (degrees)          Entropy change (cal/deg)        calories
Expansion        Pressure (dynes/cm^2)          Volume change (cm^3)            ergs
Surface          Surface tension (dynes/cm)     Area change (cm^2)              ergs
Chemical         Chemical potential (cal/mole)  Number of moles                 calories

As an example, consider the mechanical work done by a gas on its surroundings: W = P dV

Thermodynamics can also be applied to the surface of a liquid.
As an example, the surface free energy equals the surface tension times the change in surface area, or F_s = gamma × dA

THE FIRST LAW OF THERMODYNAMICS

The First Law of Thermodynamics states that energy can neither be created nor destroyed.

Energy can change forms or move from one body to another, but the total energy of the system cannot change. Einstein developed a relationship between matter and energy which, simply put, equated the two. The energy content of even a small amount of matter, if converted to energy, is so large compared to the energies of pharmaceutical systems that E = mc^2 is only included here for completeness. The first law can be written mathematically as:

ΔE = Q - W
or
dE = q - w
where E is the internal energy of the system, q is the heat absorbed by the system, and w is the work done by the system.

Other words that must be defined in thermodynamic terms include

Closed vs Open System - A closed system is one that does not exchange matter with the outside while an open system does exchange matter with the outside.

Isothermal Process - is one which is carried out without a change in temperature

Adiabatic Process - is one that is carried out without heat exchange between the system and its surroundings (q = 0)

Boiling water is an isothermal process, as the temperature of the water stays at 100 °C even though you add heat to the system; the extra heat escapes the system as steam. A reaction carried out inside a Dewar flask is adiabatic, as no heat can enter or leave the flask.
In an adiabatic system the equation becomes dE = -w, since q = 0.

The simplest system to use to understand these processes is the expansion of a gas.

As you heat a gas in a container with at least one movable wall, like a piston, the heat adds energy to the gas, which expands against the pressure pushing down on the piston. The movement of the piston does work: W = P dV.

In a steam locomotive we actually convert water to steam. The volume is greatly expanded as a result of this process, and the expanding water vapor moves the piston. If we perform this change of state at the boiling point of water, the heat absorbed by the water is called the heat of vaporization (Hv) and is about 9720 calories/mole. The work of expansion against atmospheric pressure is a function of the change of volume when one mole of liquid water (18 ml) expands to 30.6 liters of water vapor.

How did I know the volume of one mole of water vapor? I used the ideal gas law to calculate the volume of one mole of water vapor at 100 °C:

V = nRT/P, where n = 1, R = 0.082 liter atm/(mole degree), and T = 373 K

Work is the product of the pressure times the change in volume:

W = 1 atm × (30.6 - 0.018) liters ≈ 30.6 liter atm/mole = 741 calories

ΔE = 9720 calories - 741 calories = 8979 cal/mole
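The worked steam example is easy to re-check in Python. Note that the accepted heat of vaporization of water at its boiling point is about 9720 cal/mole, which is the value that makes the final subtraction come out near 8979 cal/mole (the small differences below come from rounding the unit conversion):

```python
R = 0.082             # liter*atm / (mole*K)
CAL_PER_L_ATM = 24.2  # conversion: 1 liter*atm is about 24.2 calories
T = 373               # K, boiling point of water
P = 1                 # atm

v_vapor = 1 * R * T / P   # ideal gas volume of one mole of steam, liters
v_liquid = 0.018          # liters of liquid water per mole (18 ml)

work = P * (v_vapor - v_liquid) * CAL_PER_L_ATM  # expansion work, cal/mole
h_vap = 9720                                     # cal/mole, heat of vaporization
delta_e = h_vap - work

print(round(v_vapor, 1))   # 30.6
print(round(work))         # 740
print(round(delta_e))      # 8980
```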

In class we will also discuss heat content and define Enthalpy in mathematical terms.

Heat Capacity is the amount of heat that must be absorbed by one mole of a substance to raise its temperature one degree, often written as C = q/dT.
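A minimal sketch of q = nC dT, using water's approximate molar heat capacity of 18 cal/(mole degree) (that is, 1 cal/(g deg) times 18 g/mole) as an assumed value:

```python
def heat_required(molar_c, n_moles, delta_t):
    """q = n * C * dT for molar heat capacity C."""
    return n_moles * molar_c * delta_t

# Warming 2 moles of liquid water (C ~ 18 cal/(mole*degree)) by 10 degrees:
print(heat_required(18, 2, 10))   # 360 calories
```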
Source: https://web.wilkes.edu/arthur.kibbe/Thermo1.htm

1.3 The first two laws of thermodynamics

The natural laws which govern the environment and which are, therefore, of interest to us are the first two laws of thermodynamics. These relate to closed systems. Strictly speaking, the earth is not a closed system as it receives energy from the sun, but it is almost a closed system.

First law of thermodynamics

The first law states that whenever energy is converted in form, its total quantity remains unchanged. In other words, energy (or matter) can be neither created nor destroyed.

Common and Stagl (2005) use the example of a coal-fired electricity generating plant. The coal is burned, producing heat that is used to generate electricity. A by-product of this process is waste heat that is carried away by cooling water or gases. In addition, various waste gases are emitted into the atmosphere, which cause pollution such as acid rain.

Second law of thermodynamics

This law states that in a closed system, entropy does not decrease.

Entropy could be described as a measure of the 'disorderedness' of energy. For instance, ordered energy is useful and an example of this is the energy stored in a battery. However, disordered energy is not useful, and an example is the energy dispersed into the environment by a fire.

Entropy is a thermodynamic property of matter and is related to the amount of energy that can be transferred from one system to another in the form of work. For a given system with a fixed amount of energy, the value of the entropy ranges from zero to a maximum. If the entropy is at its maximum, then the amount of work that can be transferred is equal to zero, and if the entropy is at zero, then the amount of work that can be transferred is equal to the energy of the system.

During an irreversible process the entropy of a system always increases.

The key points to remember from the above are that, because of these natural laws:

  • increased extraction of minerals by the production process leads to an increase in wastes
  • there is a limit on the substitutability of inputs
  • since production and consumption lead to the dissipation of matter, scarce energy is needed for recycling

The importance of these two laws relates to the use, re-use, and recycling of the environment after interactions with the economy.

Let us look more closely at the subject of recycling, as this would seem to offer a chance for the economy to retain the use of scarce resources.

Recycling

There is a hierarchy of resource use that includes recycling. This is referred to as the 3R's - reduce, re-use, and recycle. The final and least appealing option after resource use is to dispose of any remaining waste.

There are now many materials which are routinely recycled and re-used. For example, glass bottles have been collected and re-used by a number of drinks companies for many years. In various countries this practice is encouraged by the use of deposit-refund schemes. Other examples include paper, metal, glass, plastic, textiles, and garden waste.

For instance, in the Netherlands, household waste that can be composted is collected separately from other household waste and is composted by the local authorities. To encourage citizens to participate in this scheme, householders received some free compost soon after the scheme was set up. However, there are clearly costs involved in such a scheme:

  • separate waste bins were provided for the compostable waste
  • information was provided to householders
  • householders use time to separate their waste
  • costs of separate collection and of dealing with the compost

In the Netherlands, chemical household waste is also collected separately, with similar costs involved. There are numerous examples of different economic instruments used to deal with waste at both the household level and industry.

There are clearly limits to what resources can be re-used and recycled. These limits are not only dictated by the laws of thermodynamics but also the costs associated with re-using and recycling many items.

Source: https://www.soas.ac.uk/cedep-demos/000_P570_IEEP_K3736-Demo/unit1/page_09.htm

What is the second law of thermodynamics?

Thermodynamics is the study of heat and energy. At its heart are laws that describe how energy moves around within a system, whether an atom, a hurricane or a black hole. The first law describes how energy cannot be created or destroyed, merely transformed from one kind to another. The second law, however, is probably better known and even more profound because it describes the limits of what the universe can do. This law is about inefficiency, degeneration and decay. It tells us that all we do is inherently wasteful and that there are irreversible processes in the universe. It gives us an arrow for time and tells us that our universe has an inescapably bleak, desolate fate.

Despite these somewhat deflating ideas, the ideas of thermodynamics were formulated in a time of great technological optimism – the Industrial Revolution. In the mid-19th century, physicists and engineers were building steam engines to mechanise work and transport and were trying to work out how to make them more powerful and efficient.

Many scientists and engineers – including Rudolf Clausius, James Joule and Lord Kelvin – contributed to the development of thermodynamics, but the father of the discipline was the French physicist Sadi Carnot. In 1824 he published Reflections on the Motive Power of Fire, which laid down the basic principles, gleaned from observations of how energy moved around engines and how wasted heat and useful work were related.

The second law can be expressed in several ways, the simplest being that heat will naturally flow from a hotter to a colder body. At its heart is a property of thermodynamic systems called entropy, represented by the symbol S: in loose terms, a measure of the amount of disorder within a system. This can be represented in many ways, for example in the arrangement of the molecules: water molecules in an ice cube are more ordered than the same molecules after they have been heated into a gas. Whereas the water molecules were in a well-defined lattice in the ice cube, they float unpredictably in the gas. The entropy of the ice cube is, therefore, lower than that of the gas. Similarly, the entropy of a plate is higher when it is in pieces on the floor compared with when it is in one piece in the sink.

A more formal definition for entropy as heat moves around a system is dS = δQ/T: the infinitesimal change in entropy of a system (dS) is calculated by measuring how much heat has entered a closed system (δQ) divided by the common temperature (T) at the point where the heat transfer took place.

The second law itself can be expressed in terms of entropy as ΔS ≥ 0: the entropy of an isolated natural system will always tend to stay the same or increase; in other words, the energy in the universe is gradually moving towards disorder. Our original statement of the second law emerges from this inequality: heat cannot spontaneously flow from a cold object to a hot object in a closed system because doing so would decrease the total entropy. (Refrigerators seemingly break this rule since they can freeze things to much lower temperatures than the air around them. But they don't violate the second law because they are not isolated systems, requiring a continual input of electrical energy to pump heat out of their interior. The fridge heats up the room around it and, if unplugged, would naturally return to thermal equilibrium with the room.)
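The entropy bookkeeping for heat flowing from hot to cold can be sketched numerically (the reservoir temperatures and the 1000 J transferred are illustrative numbers):

```python
def entropy_change(q, t):
    """dS = (heat entering) / (absolute temperature at the transfer point)."""
    return q / t

# 1000 J of heat leaves a hot body at 373 K and enters a cold body at 273 K.
q = 1000.0
ds_hot = entropy_change(-q, 373)    # the hot body's entropy falls...
ds_cold = entropy_change(+q, 273)   # ...but the cold body gains more
ds_total = ds_hot + ds_cold

print(round(ds_total, 3))   # 0.982 J/K > 0: the second law is satisfied

# Running the transfer the other way (cold to hot) just flips the signs,
# giving ds_total < 0, which the second law forbids for an isolated system.
```

Because the same heat is divided by a smaller temperature on the cold side, the total entropy change of any hot-to-cold transfer is positive.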

This formula also imposes a direction on to time; whereas every other physical law we know of would work the same whether time was going forwards or backwards, this is not true for the second law of thermodynamics. However long you leave it, a boiling pan of water is unlikely to ever become a block of ice. A smashed plate could never reassemble itself, as this would reduce the entropy of the system in defiance of the second law of thermodynamics. Some processes, Carnot observed, are irreversible.

Carnot examined steam engines, which work by burning fuel to heat up a cylinder containing steam, which expands and pushes on a piston to then do something useful. The portion of the fuel's energy that is extracted and made to do something useful is called work, while the remainder is the wasted (and disordered) energy we call heat. Carnot showed that you could predict the theoretical maximum efficiency of a steam engine by measuring the difference in temperatures of the steam inside the cylinder and that of the air around it, known in thermodynamic terms as the hot and cold reservoirs of a system respectively.

Heat engines work because heat naturally flows from hot to cold places. If there was no cold reservoir towards which it could move there would be no heat flow and the engine would not work. Because the cold reservoir is always above absolute zero, no heat engine can be 100% efficient.
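Carnot's limit can be sketched with a short calculation. The formula η = 1 − T_cold/T_hot is the standard Carnot result; the reservoir temperatures below are made-up illustrative values, not figures from the article.

```python
# Carnot efficiency: the theoretical maximum fraction of heat that a
# heat engine can convert to work, given the hot and cold reservoir
# temperatures in kelvin. Illustrative sketch; temperatures are made up.

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum possible efficiency of a heat engine between two reservoirs."""
    if t_cold <= 0 or t_hot <= t_cold:
        raise ValueError("require 0 < t_cold < t_hot (kelvin)")
    return 1.0 - t_cold / t_hot

# Steam at 500 C (773 K) exhausting to room-temperature air (293 K):
print(carnot_efficiency(773.0, 293.0))  # ~0.62: no real engine between
                                        # these reservoirs can do better
```

Note that the only way to push the limit toward 100% is to raise the hot reservoir temperature or lower the cold one; since the cold reservoir is always above absolute zero, the limit is never reached.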

The best-designed engines, therefore, heat up steam (or other gas) to the highest possible temperature and then release the exhaust at the lowest possible temperature. The most modern steam engines can reach around 60% efficiency, and diesel engines in cars around 50%. Petrol-based internal combustion engines are much more wasteful of their fuel's energy.

The inefficiencies are built into any system using energy and can be described thermodynamically. This wasted energy means that the overall disorder of the universe – its entropy – will increase over time but at some point reach a maximum. At this moment in some unimaginably distant future, the energy in the universe will be evenly distributed and so, for all macroscopic purposes, will be useless. Cosmologists call this the "heat death" of the universe, an inevitable consequence of the unstoppable march of entropy.

Source: https://www.theguardian.com/science/2013/dec/01/what-is-the-second-law-of-thermodynamics

Entropy and the Laws of Thermodynamics

The principal energy laws that govern every organization are derived from two famous laws of thermodynamics. The second law, known as Carnot's principle, is controlled by the concept of entropy.

Today the word entropy is as much a part of the language of the physical sciences as it is of the human sciences. Unfortunately, physicists, engineers, and sociologists indiscriminately use a number of terms that they take to be synonymous with entropy, such as disorder, probability, noise, random mixture, heat; or they use terms they consider synonymous with antientropy, such as information, negentropy, complexity, organization, order, improbability.

There are at least three ways of defining entropy:

  • in terms of thermodynamics (the science of heat), where the names of Mayer, Joule, Carnot, and Clausius (1865) are important;
  • in terms of statistical theory, which establishes the equivalence of entropy and disorder -- a result of the work of Maxwell, Gibbs, and Boltzmann (1875); and
  • in terms of information theory, which demonstrates the equivalence of negentropy (the opposite of entropy) and information -- a result of the work of Szilard, Gabor, Rothstein, and Brillouin (1940-1950).
The two principal laws of thermodynamics apply only to closed systems, that is, entities with which there can be no exchange of energy, information, or material. The universe in its totality might be considered a closed system of this type; this would allow the two laws to be applied to it.

The first law of thermodynamics says that the total quantity of energy in the universe remains constant. This is the principle of the conservation of energy. The second law of thermodynamics states that the quality of this energy is degraded irreversibly. This is the principle of the degradation of energy.

The first principle establishes the equivalence of the different forms of energy (radiant, chemical, physical, electrical, and thermal), the possibility of transformation from one form to another, and the laws that govern these transformations. This first principle considers heat and energy as two magnitudes of the same physical nature.

About 1850 the studies by Lord Kelvin, Carnot, and Clausius of the exchanges of energy in thermal machines revealed that there is a hierarchy among the various forms of energy and an imbalance in their transformations. This hierarchy and this imbalance are the basis of the formulation of the second principle.

In fact physical, chemical, and electrical energy can be completely changed into heat. But the reverse (heat into physical energy, for example) cannot be fully accomplished without outside help or without an inevitable loss of energy in the form of irretrievable heat. This does not mean that the energy is destroyed; it means that it becomes unavailable for producing work. The irreversible increase of this nondisposable energy in the universe is measured by the abstract dimension that Clausius in 1865 called entropy (from the Greek entrope, change).

The concept of entropy is particularly abstract and by the same token difficult to present. Yet some scientists consider it intuitively; they need only refer mentally to actual states such as disorder, waste, and the loss of time or information. But how can degraded energy, or its hierarchy, or the process of degradation be truly represented?

There seems to be a contradiction between the first and second principles. One says that heat and energy are two dimensions of the same nature; the other says they are not, since potential energy is degraded irreversibly to an inferior, less noble, lower-quality form -- heat. Statistical theory provides the answer. Heat is energy; it is kinetic energy that results from the movement of molecules in a gas or the vibration of atoms in a solid. In the form of heat this energy is reduced to a state of maximum disorder in which each individual movement is neutralized by statistical laws.

Potential energy, then, is organized energy; heat is disorganized energy. And maximum disorder is entropy. The mass movement of molecules (in a gas, for example) will produce work (drive a piston). But where motion is ineffective on the spot and headed in all directions at the same time, energy will be present but ineffective. One might say that the sum of all the quantities of heat lost in the course of all the activities that have taken place in the universe measures the accumulation of entropy.

One can generalise further. Thanks to the mathematical relation between disorder and probability, it is possible to speak of evolution toward an increase in entropy by using one or the other of two statements: "left to itself, an isolated system tends toward a state of maximum disorder" or "left to itself, an isolated system tends toward a state of higher probability." (This is illustrated most simply by the example of the box with two compartments.) These equivalent expressions can be summarized:

  • Potential energy -> entropy
  • Ordered energy -> disorganized energy (heat)
  • High-quality energy -> heat (low-grade energy)
  • Order -> disorder
  • Improbability -> probability
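The box-with-two-compartments example mentioned above can be made concrete. Assuming each of N molecules independently ends up in either half of the box, the chance that all of them sit in one chosen half is (1/2)^N, a sketch:

```python
# Probability that all N molecules of a gas occupy one chosen half of a
# box, assuming each molecule independently has probability 1/2 of being
# in either half. Illustrative sketch of the two-compartment example.

def prob_all_in_one_half(n_molecules: int) -> float:
    """Chance that every molecule is found in the same chosen half."""
    return 0.5 ** n_molecules

for n in (2, 10, 100):
    print(n, prob_all_in_one_half(n))
# Already at N = 100 the probability is ~8e-31; for a mole of molecules
# it is unimaginably smaller, which is why the evenly mixed, "higher
# probability" state is the one we always observe.
```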

The concepts of entropy and irreversibility, derived from the second principle, have had a tremendous impact on our view of the universe. In breaking the vicious circle of repetitiveness in which the ancients were trapped, and in being confronted with biological evolution generating order and organization, the concept of entropy indirectly opens the way to a philosophy of progress and development (see: the direction of evolution). At the same time it introduces the complementarity between the "two great drifts of the universe" described in the works of Bergson and Teilhard de Chardin.

The image of the inexorable death of the universe, as suggested by the second principle, has profoundly influenced our philosophy, our ethics, our vision of the world, and even our art. The thought that by the very nature of entropy the ultimate and only possible future for man is annihilation has infiltrated our culture like a paralysis. This consideration led Leon Brillouin to ask, "How is it possible to understand life when the entire world is ordered by a law such as the second principle of thermodynamics, which points to death and annihilation?"


Source: http://pespmc1.vub.ac.be/ENTRTHER.html

First law of Thermodynamics and the definition of Internal Energy

You should carefully identify the range of validity of each statement. A more general statement contains its special cases, but it is not possible to obtain the more general statement from special cases.

The first law of thermodynamics is actually $$ \Delta U = q + w, $$ where $q$ and $w$ are the heat and work exchanged by the system with the environment.

For the special case of a reversible transformation, $$ dU = T\,dS - P\,dV. \tag{1} $$ In general, $U$ depends on $S$ and $V$, and such a dependence can be transformed into a dependence on $T$ and $V$. It is only for the special case of a perfect gas that $U$ depends on $T$ only. Therefore, from the special case of the isothermal behavior of a perfect gas it is impossible to obtain a relation like $(1)$, which is valid for all phases and for all systems.
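The claim that $U$ depends on $T$ only for a perfect gas can be made explicit with a standard identity (a textbook result, not part of the original answer). Rewriting the reversible relation $dU = T\,dS - P\,dV$ with $T$ and $V$ as independent variables, and using the Maxwell relation $(\partial S/\partial V)_T = (\partial P/\partial T)_V$, gives

$$ dU = C_V\,dT + \left[\,T\left(\frac{\partial P}{\partial T}\right)_V - P\,\right]dV. $$

For a perfect gas $P = nRT/V$, so $T(\partial P/\partial T)_V = P$, the bracket vanishes, and $dU = C_V\,dT$: the internal energy depends on $T$ alone.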

answered Aug 17 '20 at 7:57 by GiorgioP

Source: https://physics.stackexchange.com/questions/573800/first-law-of-thermodynamics-and-the-definition-of-internal-energy


1.3 The first two laws of thermodynamics

The natural laws which govern the environment and which are, therefore, of interest to us are the first two laws of thermodynamics. These relate to closed systems. Strictly speaking, the earth is not a closed system as it receives energy from the sun, but it is almost a closed system.

First law of thermodynamics

The first law states that whenever energy is converted in form, its total quantity remains unchanged. In other words, energy (or matter) can be neither created nor destroyed.

Common and Stagl (2005) use the example of a coal-fired electricity generating plant. The coal is burned, and the heat released is used to generate electricity. A by-product of this process is waste heat that is carried away by cooling water or gases. In addition, various waste gases are emitted into the atmosphere, which cause pollution, such as acid rain.

Second law of thermodynamics

This law states that in a closed system, entropy does not decrease.

Entropy could be described as a measure of the 'disorderedness' of energy. For instance, ordered energy is useful and an example of this is the energy stored in a battery. However, disordered energy is not useful, and an example is the energy dispersed into the environment by a fire.

Entropy is a thermodynamic property of matter and is related to the amount of energy that can be transferred from one system to another in the form of work. For a given system with a fixed amount of energy, the value of the entropy ranges from zero to a maximum. If the entropy is at its maximum, then the amount of work that can be transferred is equal to zero, and if the entropy is at zero, then the amount of work that can be transferred is equal to the energy of the system.

During an irreversible process the entropy of a system always increases.

The key points to remember from the above are that, because of these natural laws:

  • increased extraction of minerals by the production process leads to an increase in wastes
  • there is a limit on the substitutability of inputs
  • since production and consumption lead to the dissipation of matter, scarce energy is needed for recycling

The importance of these two laws relates to the use, re-use, and recycling of the environment after interactions with the economy.

Let us look more closely at the subject of recycling, as this would seem to offer a chance for the economy to retain the use of scarce resources.

Recycling

There is a hierarchy of resource use that includes recycling. This is referred to as the 3R's - reduce, re-use, and recycle. The final and least appealing option after resource use is to dispose of any remaining waste.

There are now many materials which are routinely recycled and re-used. For example, glass bottles have been collected and re-used by a number of drinks companies for many years. In various countries this practice is encouraged by the use of deposit-refund schemes. Other examples include paper, metal, glass, plastic, textiles, and garden waste.

For instance, in the Netherlands, household waste that can be composted is collected separately from other household waste and is composted by the local authorities. To encourage citizens to participate in this scheme, householders received some free compost soon after the scheme was set up. However, there are clearly costs involved in such a scheme.

  • separate waste bins were provided for the compostable waste
  • information was provided to householders
  • householders use time to separate their waste
  • costs of separate collection and of dealing with the compost

In the Netherlands, chemical household waste is also collected separately, with similar costs involved. There are numerous examples of different economic instruments used to deal with waste at both the household level and industry.

There are clearly limits to what resources can be re-used and recycled. These limits are not only dictated by the laws of thermodynamics but also the costs associated with re-using and recycling many items.

Source: https://www.soas.ac.uk/cedep-demos/000_P570_IEEP_K3736-Demo/unit1/page_09.htm

The 2nd Law of Thermodynamics -- A Probabilistic Law

In our lead-in discussion of why we need a 2nd law, we point out that energy conservation — the 1st law of thermodynamics — suffices to rule out many thermal events that don't happen, like things getting warmer without any source of warmth. But there are many thermal events that don't happen yet are perfectly consistent with the 1st law, like thermal energy flowing from a cold object to a hotter object. To codify and elaborate our understanding of these results, we turn to the ideas of probability to understand how energy tends to be distributed.

A probabilistic law

That seems a bit strange. What does a discussion about probabilities have to do with a physical law? Physical laws are always true, aren't they? And isn't probability really about things that are only sometimes true?

In many ways, molecules in physics are like multi-sided dice, and the likelihood that a particle will be located in a particular location in space (or have a particular energy) is analogous to the likelihood that a multi-sided die will land on a particular side. There are many different ways for the molecules to move, and the details of why they move in one way or another is very sensitive to exactly where they are and how they are moving — and is very much out of our control. A "random motion model" for molecules is much more useful than a model that tries to calculate the motion of every molecule.

The likelihood that all the smoke particles in a smoke-filled room will move as a result of their chaotic motions into one corner of the room is analogous to the likelihood that nearly all the coins in a set of $10^{23}$ tosses will land on heads. It's very, VERY unlikely! If you tossed that many coins over and over again for the lifetime of the universe (14 billion years), the odds that you would see all heads are still minuscule — totally ignorable. This extremely low probability is what transforms a "probability statement" into a "physical law."

The reason you will never see the smoke particles accumulate in one small corner is that there are many, many more ways for the smoke particles to distribute themselves uniformly throughout the room than there are ways for the particles to all be located in just one corner of the room. That said, just as it is not impossible for all $10^{23}$ tosses to land on heads, it is not impossible that all the smoke particles will spontaneously move to one corner of the room ... just don't hold your breath waiting for it to happen.
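The "very, VERY unlikely" above can be quantified with a back-of-envelope sketch (the figures are illustrative estimates, not from the article). Working in log base 10 avoids floating-point underflow:

```python
# How unlikely is "all heads" in 1e23 coin tosses? The probability is
# 0.5**1e23, far too small to represent directly, so work with its
# base-10 logarithm instead. Back-of-envelope sketch.
import math

n_tosses = 1e23
log10_p_all_heads = n_tosses * math.log10(0.5)   # about -3.0e22

# Even retrying every nanosecond for the age of the universe (~14 Gyr)
# gives only ~4.4e26 attempts, which adds about 27 to the exponent --
# utterly negligible against an exponent of -3e22.
attempts = 14e9 * 365.25 * 24 * 3600 / 1e-9
print(log10_p_all_heads, math.log10(attempts))
```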

Microstates and macrostates

More generally we can say that when the number of atoms or molecules in a system is large, the system will most likely move toward a thermodynamic state for which there are many possible microscopic "arrangements" of the energy.  (And they will be very unlikely to move toward a thermodynamic state for which there are very few possible microscopic arrangements.) If this seems mysterious, go back to the discussion of coin tosses — it's a pretty good analogy.  The H/T ratio (say, 5/5) — which we refer to as a macrostate of the system — is analogous to a thermodynamic state of a system, where only the pressure, temperature, and density of the various molecules are specified. A particular way in which that H/T ratio can be obtained (say, HTTHTTHHHT) — which we refer to as a microstate — is analogous to the specification of the spatial and energetic arrangement of each of the atoms/molecules that compose a particular thermodynamic state.

As we saw in the coin toss discussion, if one only looks at the macrostate description, one is much more likely to get a H/T result that corresponds to a large number of possible arrangements. Likewise, one is much more likely to get an atom/molecule distribution that corresponds to a large number of arrangements.
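The coin-toss counting can be done exactly with binomial coefficients. A sketch, using 10 tosses for concreteness:

```python
# Number of microstates (specific toss sequences) for each macrostate
# (total number of heads) in 10 coin tosses. The 5-heads macrostate is
# the most likely because it has the most microstates.
import math

n = 10
for heads in range(n + 1):
    microstates = math.comb(n, heads)  # ways to choose which tosses are heads
    print(f"{heads:2d} heads: {microstates:4d} microstates")

# 5 heads accounts for 252 of the 1024 equally likely sequences;
# 0 heads and 10 heads have just 1 sequence each.
```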

The second law

The Second Law of Thermodynamics can now be stated in this qualitative way: 

When a system is composed of a large number of particles, the system is exceedingly likely to spontaneously move toward the thermodynamic (macro)state that corresponds to the largest possible number of particle arrangements (microstates).

There are a few really important words to make note of in this definition.

Firstly, the system must have a LARGE number of particles. If the system has just a few particles, it is not exceedingly likely that the particles will be in one state rather than another. Only when the number of particles is large do the statistics become overwhelming. If one tosses a coin just twice, there is a reasonable chance (namely, 25%) that one will obtain all heads.

Secondly, the system is EXCEEDINGLY LIKELY, but not guaranteed, to move toward a state for which there are the most particle arrangements. The larger the number of particles, the more likely it is, but it is never a guarantee.

Thirdly, this law does not specify the specific nature of these "arrangements." It may be that we are only interested in spatial location, in which case an arrangement corresponds to the spatial location of each particle in the system. More arrangements would then correspond to more ways of positioning the particles in space. In other contexts we may be interested in energy, and arrangements would then correspond to the set of energies corresponding to the system's constituents.  In either case, the most likely thermodynamic state is the one for which there are the most microscopic arrangements.

Biological implications

The Second Law of Thermodynamics is a statistical law of large numbers. But we have to be careful. Although biological systems almost always consist of a huge number of atoms and molecules, in some critical cases there are a very small number of molecules that make a big difference.

For example, a cell may contain only a single copy of a DNA molecule containing a particular gene. Yet that single molecule may be critical to the production of protein molecules that are critical to the survival of the cell. For some processes a small number of molecules in a cell (fewer than 10!) can make a big difference. On the other hand, a cubic micron of a fluid in an organism typically contains on the order of $10^{14}$ molecules! The second law of thermodynamics is a law that is indispensable in analyzing biological systems in countless contexts; but it is essential to understand it well — not to just use it mindlessly. (See the associated problem to estimate some molecules in a cell that might not behave according to our probabilistic laws.)

Just as the probability distribution for the number of heads in our coin flips got narrower as the number of flips got larger, the probability that our results are those predicted by statistical mechanics (most probable macrostates) gets sharper and sharper. The variation around that perfect probability (corresponding to an infinite number of flips or particles) is called fluctuations. The scale of fluctuations can be estimated crudely as about 1/(square root of the number). So for $10^{14}$ molecules, our corrections due to fluctuations are about 1 part in $10^7$. Whereas, if we only have $100 = 10^2$ molecules, our fluctuations are expected to be about 1 part in $10^1$, or 10%.  But again, fluctuations may play a crucial role in the processes of a living cell. Learning to estimate when the standard rules of thermodynamics may safely be applied can be very valuable!
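The crude 1/sqrt(N) estimate above can be tabulated directly, a sketch:

```python
# Crude fluctuation scale ~ 1/sqrt(N): relative fluctuations shrink as
# the particle count grows. Sketch of the estimate quoted in the text.

for n in (1e2, 1e6, 1e14):
    rel_fluctuation = n ** -0.5
    print(f"N = {n:.0e}: relative fluctuations ~ {rel_fluctuation:.0e}")
# N = 1e2 gives ~1e-1 (10%); N = 1e14 gives ~1e-7 (1 part in 10 million).
```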

Entropy

Since the number of microstates corresponding to a particular macrostate plays a critical role, we need a way to count them in order to quantify what's going on with the probabilities. The number of arrangements is so large that it turns out to be convenient to work with a smaller number — the log of the number of microstates. This is just like counting the powers of 10 in a large number rather than writing out all the zeros. For a very large $N$, the number $10^N$ is considerably larger than $N$. And it turns out that working with the log of the number of microstates is very much more convenient.

Essentially what is happening is that when you put two systems together (imagine combining two boxes of gases into one), the number of microstates of the combination is basically the product of the numbers of microstates of each. (If we flip a coin 10 times, the number of microstates is $2^{10}$. If we flip it another 10 times, the new number of microstates is $2^{20}$ — the product of $2^{10}$ with $2^{10}$.) If we take the log of the number of microstates, then when we combine two systems, the logs of their numbers of microstates add to give the log of the total number. This turns out to be both easier to work with and to lead to a number of nice ways of expressing things mathematically.

The log of the number of distributions of the energy that correspond to the thermodynamic state of a system is termed the "entropy" of the system, and is given the symbol $S$. Another way of stating the Second Law, therefore, is to say that systems are exceedingly likely to spontaneously move toward the state having the highest entropy $S$. Using the symbol $W$ to represent the number of arrangements of the energy that correspond to a particular thermodynamics state, we can write an expression for entropy as follows:

$$S = k_B \ln{W}$$

The constant $k_B$ is called Boltzmann's constant, and its value is $1.38 \times 10^{-23} \mathrm{J/K}$.  (Yes, it's the same constant we ran into in our discussion of kinetic theory of gases — chemistry's gas constant $R$ divided by Avogadro's number, $N_A$.) The important thing to take from this equation is that the entropy $S$ is a measure of the number of arrangements $W$. As $W$ goes, so goes the entropy. 
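The formula and the additivity argument can be sketched together (the microstate counts are the coin-flip numbers from above, used purely as an illustration):

```python
# Boltzmann entropy S = k_B ln W, and why logs make combined systems
# additive: W_total = W1 * W2, so S_total = S1 + S2. Illustrative sketch.
import math

K_B = 1.38e-23  # Boltzmann's constant, J/K

def entropy(w: float) -> float:
    """Entropy of a macrostate with w microstates."""
    return K_B * math.log(w)

w1 = 2 ** 10  # microstates of 10 coin flips
w2 = 2 ** 10  # microstates of another 10 flips
s_combined = entropy(w1 * w2)
print(math.isclose(s_combined, entropy(w1) + entropy(w2)))  # True
```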

But of course the number $W$ is usually a HUGE number, and counting up arrangements to arrive at its value would usually take you forever. Fortunately, it is very rarely the case that we actually need to do the counting. Rather, we usually need only to compare two thermodynamic states and to decide which one is consistent with the greatest number of microscopic arrangements. That is the state to which the system will evolve. 

Systems

When discussing the Second Law of Thermodynamics, it is crucial to be very careful about defining the system that one is considering. While it is always the case that the entropy of the universe is overwhelmingly likely to increase in any spontaneous process, it is not necessarily the case that a particular sub-system of the universe will experience an increase in entropy.

If the system being studied is isolated, i.e., if no matter or energy is allowed to enter or leave the system, then the system's entropy will increase in any spontaneous process.  But, if the system is NOT isolated, it is entirely possible its entropy will decrease.

Stated more generally, it is entirely possible that one part of the universe will exhibit an entropy decrease during a spontaneous process while the rest of the universe exhibits a larger increase in entropy, such that the overall entropy in the universe has increased. All of this is just to say that it is of utmost importance to be clear about the system to which the Second Law of Thermodynamics is being applied.  

It is not obvious at this stage that the statement of the Second Law of Thermodynamics presented here will be practically useful in understanding which processes in nature are spontaneous and which ones are not. What, for example, does any of this have to do with the fact that heat spontaneously transfers from hot objects to cold objects and not the other way around? What does this have to do with chairs sliding across a room? And what does it have to do with the electrostatic potential across a biological membrane? As it turns out, the Second Law of Thermodynamics as defined above can in fact explain those examples.

Ben Geller 11/8/11 and Joe Redish 12/8/11

Source: https://www.compadre.org/nexusph/course/The_2nd_Law_of_Thermodynamics_--_A_Probabilistic_Law

What Is the First Law of Thermodynamics?

The First Law of Thermodynamics states that heat is a form of energy, and thermodynamic processes are therefore subject to the principle of conservation of energy. This means that heat energy cannot be created or destroyed. It can, however, be transferred from one location to another and converted to and from other forms of energy. 

Thermodynamics is the branch of physics that deals with the relationships between heat and other forms of energy. In particular, it describes how thermal energy is converted to and from other forms of energy and how it affects matter. The fundamental principles of thermodynamics are expressed in four laws.

“The First Law says that the internal energy of a system has to be equal to the work that is being done on the system, plus or minus the heat that flows in or out of the system and any other work that is done on the system," said Saibal Mitra, a professor of physics at Missouri State University. "So, it’s a restatement of conservation of energy." 

Mitra continued, "The change in internal energy of a system is the sum of all the energy inputs and outputs to and from the system similarly to how all the deposits and withdrawals you make determine the changes in your bank balance.” This is expressed mathematically as: ΔU = Q – W, where ΔU is the change in the internal energy, Q is the heat added to the system, and W is the work done by the system. 
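Mitra's bank-balance analogy maps directly onto the formula. A minimal sketch with made-up numbers:

```python
# First law of thermodynamics: the change in internal energy equals the
# heat added to the system minus the work done BY the system,
# Delta U = Q - W. Sketch with made-up illustrative numbers.

def delta_internal_energy(heat_added: float, work_done_by_system: float) -> float:
    """Change in internal energy, Delta U = Q - W (joules)."""
    return heat_added - work_done_by_system

# Add 500 J of heat to a gas that does 200 J of work pushing a piston:
print(delta_internal_energy(500.0, 200.0))  # 300.0 J stays in the gas
```

(Note the sign convention: here W is work done by the system, so it is subtracted; some texts instead write ΔU = q + w with w the work done on the system, as in the StackExchange excerpt above. The two forms agree.)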

History

Scientists in the late 18th and early 19th centuries adhered to caloric theory, first proposed by Antoine Lavoisier in 1783, and further bolstered by the work of Sadi Carnot in 1824, according to the American Physical Society. Caloric theory treated heat as a kind of fluid that naturally flowed from hot to cold regions, much as water flows from high to low places. When this caloric fluid flowed from a hot to a cold region, it could be converted to kinetic energy and made to do work much as falling water could drive a water wheel. It wasn’t until Rudolph Clausius published "The Mechanical Theory of Heat" in 1879 that caloric theory was finally put to rest. 

Thermodynamic systems

Energy can be divided into two parts, according to David McKee, a professor of physics at Missouri Southern State University. One is our human-scale macroscopic contribution, such as a piston moving and pushing on a system of gas. Conversely, things happen at a very tiny scale where we can’t keep track of the individual contributions. 

McKee explains, “When I put two samples of metal up against each other, and the atoms are rattling around at the boundary, and two atoms bounce into each other, and one of them comes off faster than the other, I can’t keep track of it. It happens on a very small time scale and a very small distance, and it happens many, many times per second. So, we just divide all energy transfer into two groups: the stuff we’re going to keep track of, and the stuff we’re not going to keep track of. The latter of these is what we call heat.”

Thermodynamic systems are generally regarded as being open, closed or isolated. According to the University of California, Davis, an open system freely exchanges energy and matter with its surroundings; a closed system exchanges energy but not matter with its surroundings; and an isolated system does not exchange energy or matter with its surroundings. For example, a pot of boiling soup receives energy from the stove, radiates heat from the pan, and emits matter in the form of steam, which also carries away heat energy. This would be an open system. If we put a tight lid on the pot, it would still radiate heat energy, but it would no longer emit matter in the form of steam. This would be a closed system. However, if we were to pour the soup into a perfectly insulated thermos bottle and seal the lid, there would be no energy or matter going into or out of the system. This would be an isolated system. 

In practice, however, perfectly isolated systems cannot exist. All systems transfer energy to their environment through radiation no matter how well insulated they are. The soup in the thermos will only stay hot for a few hours and will reach room temperature by the following day. In another example, white dwarf stars, the hot remnants of burned-out stars that no longer produce energy, can be insulated by light-years of near perfect vacuum in interstellar space, yet they will eventually cool down from several tens of thousands of degrees to near absolute zero due to energy loss through radiation. Although this process takes longer than the present age of the universe, there’s no stopping it.

Heat engines

The most common practical application of the First Law is the heat engine. Heat engines convert thermal energy into mechanical energy and vice versa. Most heat engines fall into the category of open systems. The basic principle of a heat engine exploits the relationships among heat, volume and pressure of a working fluid. This fluid is typically a gas, but in some cases it may undergo phase changes from gas to liquid and back to a gas during a cycle. 

When gas is heated, it expands; however, when that gas is confined, it increases in pressure. If the bottom wall of the confinement chamber is the top of a movable piston, this pressure exerts a force on the surface of the piston causing it to move downward. This movement can then be harnessed to do work equal to the total force applied to the top of the piston times the distance that the piston moves. 
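The force-times-distance account above is the same thing as pressure times volume change. A sketch with made-up illustrative numbers:

```python
# Work done by a gas on a piston at constant pressure:
# W = F * d = (P * A) * d = P * (A * d) = P * DeltaV.
# Illustrative sketch; the numbers are made up.

def piston_work(pressure_pa: float, area_m2: float, distance_m: float) -> float:
    """Work (J) done by gas at pressure P on a piston of area A moving distance d."""
    force = pressure_pa * area_m2  # F = P * A
    return force * distance_m      # W = F * d, equal to P * DeltaV

# 2 atm (~2.0e5 Pa) acting on a 0.01 m^2 piston that moves 0.1 m:
print(piston_work(2.0e5, 0.01, 0.1))  # 200.0 J
```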

There are numerous variations on the basic heat engine. For instance, steam engines rely on external combustion to heat a boiler tank containing the working fluid, typically water. The water is converted to steam, and the pressure is then used to drive a piston that converts heat energy to mechanical energy. Automobile engines, however, use internal combustion, where liquid fuel is vaporized, mixed with air and ignited inside a cylinder above a movable piston driving it downward. 

Refrigerators, air conditioners and heat pumps

Refrigerators and heat pumps are heat engines run in reverse: they use mechanical energy to move heat from a colder place to a warmer one. Most of these fall into the category of closed systems. When a gas is compressed, its temperature increases. This hot gas can then transfer heat to its surrounding environment. Then, when the compressed gas is allowed to expand, its temperature becomes colder than it was before it was compressed because some of its heat energy was removed during the hot cycle. This cold gas can then absorb heat energy from its environment. This is the working principle behind an air conditioner. Air conditioners don’t actually produce cold; they remove heat. The working fluid is transferred outdoors by a mechanical pump where it is heated by compression. Next, it transfers that heat to the outdoor environment, usually through an air-cooled heat exchanger. Then, it is brought back indoors, where it is allowed to expand and cool so it can absorb heat from the indoor air through another heat exchanger.

A heat pump is simply an air conditioner run in reverse. The heat from the compressed working fluid is used to warm the building. It is then transferred outside where it expands and becomes cold, thereby allowing it to absorb heat from the outside air, which even in winter is usually warmer than the cold working fluid. 
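The energy accounting for a heat pump follows directly from the First Law: the heat delivered indoors equals the heat absorbed outdoors plus the work driving the compressor. A sketch with illustrative, made-up numbers (the Carnot limit shown at the end depends only on the two reservoir temperatures):

```python
# First-law energy balance for a heat pump (illustrative numbers).
# Heat delivered indoors = heat absorbed outdoors + work input.
work_input = 1000.0      # J of electrical work driving the compressor
heat_absorbed = 2500.0   # J pulled from the cold outdoor air

heat_delivered = heat_absorbed + work_input   # J delivered indoors
cop = heat_delivered / work_input             # coefficient of performance

# The ideal (Carnot) limit depends only on reservoir temperatures:
t_indoor = 293.0   # K, about 20 C
t_outdoor = 273.0  # K, about 0 C
cop_carnot = t_indoor / (t_indoor - t_outdoor)

print(heat_delivered)  # 3500.0
print(cop)             # 3.5
print(cop_carnot)      # 14.65
```

A COP of 3.5 means each joule of electricity delivers 3.5 joules of heat indoors, which is why heat pumps beat direct electric heating.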

Geothermal or ground-source air conditioning and heat pump systems use long U-shaped tubes in deep wells or an array of horizontal tubes buried in a large area through which the working fluid is circulated, and heat is transferred to or from the earth. Other systems use rivers or ocean water to heat or cool the working fluid. 

Jim Lucas is a contributing writer for Live Science. He covers physics, astronomy and engineering. Jim graduated from Missouri State University, where he earned a bachelor of science degree in physics with minors in astronomy and technical writing. After graduation he worked at Los Alamos National Laboratory as a network systems administrator, a technical writer-editor and a nuclear security specialist. In addition to writing, he edits scientific journal articles in a variety of topical areas.
Source: https://www.livescience.com/50881-first-law-thermodynamics.html

What is the second law of thermodynamics?

Thermodynamics is the study of heat and energy. At its heart are laws that describe how energy moves around within a system, whether an atom, a hurricane or a black hole. The first law describes how energy cannot be created or destroyed, merely transformed from one kind to another. The second law, however, is probably better known and even more profound because it describes the limits of what the universe can do. This law is about inefficiency, degeneration and decay. It tells us all we do is inherently wasteful and that there are irreversible processes in the universe. It gives us an arrow for time and tells us that our universe has an inescapably bleak, desolate fate.

Despite these somewhat deflating ideas, the ideas of thermodynamics were formulated in a time of great technological optimism – the Industrial Revolution. In the mid-19th century, physicists and engineers were building steam engines to mechanise work and transport and were trying to work out how to make them more powerful and efficient.

Many scientists and engineers – including Rudolf Clausius, James Joule and Lord Kelvin – contributed to the development of thermodynamics, but the father of the discipline was the French physicist Sadi Carnot. In 1824 he published Reflections on the Motive Power of Fire, which laid down the basic principles, gleaned from observations of how energy moved around engines and how wasted heat and useful work were related.

The second law can be expressed in several ways, the simplest being that heat will naturally flow from a hotter to a colder body. At its heart is a property of thermodynamic systems called entropy – conventionally written "S" – in loose terms, a measure of the amount of disorder within a system. This can be represented in many ways, for example in the arrangement of the molecules – water molecules in an ice cube are more ordered than the same molecules after they have been heated into a gas. Whereas the water molecules were in a well-defined lattice in the ice cube, they float unpredictably in the gas. The entropy of the ice cube is, therefore, lower than that of the gas. Similarly, the entropy of a plate is higher when it is in pieces on the floor compared with when it is in one piece in the sink.

A more formal definition describes how entropy changes as heat moves around a system: the infinitesimal change in entropy of a system (dS) is the amount of heat that has entered a closed system (δQ) divided by the common temperature (T) at the point where the heat transfer took place, dS = δQ/T.

The second law itself can be written in terms of entropy: for an isolated system, dS ≥ 0. In other words, the entropy of an isolated natural system will always tend to stay the same or increase – the energy in the universe is gradually moving towards disorder. Our original statement of the second law emerges from this inequality: heat cannot spontaneously flow from a cold object to a hot object, because that would decrease the total entropy. (Refrigerators seemingly break this rule since they can freeze things to much lower temperatures than the air around them. But they don't violate the second law because they are not isolated systems, requiring a continual input of electrical energy to pump heat out of their interior. The fridge heats up the room around it and, if unplugged, would naturally return to thermal equilibrium with the room.)
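The definition dS = δQ/T makes the one-way direction of heat flow easy to check numerically. A sketch, assuming reservoirs large enough that their temperatures stay fixed during the transfer (the temperatures and heat amount are illustrative):

```python
# Entropy change when heat Q flows from a hot to a cold reservoir.
# dS = dQ / T for each reservoir (temperatures assumed constant).
q = 100.0        # J of heat transferred
t_hot = 400.0    # K
t_cold = 300.0   # K

ds_hot = -q / t_hot    # hot reservoir loses heat, so its entropy falls
ds_cold = +q / t_cold  # cold reservoir gains heat, so its entropy rises
ds_total = ds_hot + ds_cold

print(ds_total > 0)  # True: total entropy rises, so hot -> cold is allowed
# Reversing the flow (cold -> hot) would flip both signs and make
# ds_total negative, which the second law forbids.
```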

This formula also imposes a direction on to time; whereas every other physical law we know of would work the same whether time was going forwards or backwards, this is not true for the second law of thermodynamics. However long you leave it, a boiling pan of water is unlikely to ever become a block of ice. A smashed plate could never reassemble itself, as this would reduce the entropy of the system in defiance of the second law of thermodynamics. Some processes, Carnot observed, are irreversible.

Carnot examined steam engines, which work by burning fuel to heat up a cylinder containing steam, which expands and pushes on a piston to then do something useful. The portion of the fuel's energy that is extracted and made to do something useful is called work, while the remainder is the wasted (and disordered) energy we call heat. Carnot showed that you could predict the theoretical maximum efficiency of a steam engine by measuring the difference in temperatures of the steam inside the cylinder and that of the air around it, known in thermodynamic terms as the hot and cold reservoirs of a system respectively.

Heat engines work because heat naturally flows from hot to cold places. If there was no cold reservoir towards which it could move there would be no heat flow and the engine would not work. Because the cold reservoir is always above absolute zero, no heat engine can be 100% efficient.
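Carnot's theoretical maximum efficiency follows directly from the two reservoir temperatures. A minimal sketch, with illustrative temperatures:

```python
# Carnot efficiency: the theoretical maximum fraction of heat input
# that any engine can turn into work, set by the two reservoirs.
def carnot_efficiency(t_hot, t_cold):
    """Temperatures in kelvin; result is a fraction between 0 and 1."""
    return 1.0 - t_cold / t_hot

# Illustrative values: steam at 500 C exhausting to 25 C air.
eta = carnot_efficiency(t_hot=773.15, t_cold=298.15)
print(round(eta, 3))  # 0.614 -- about 61% even for a perfect engine
```

Because t_cold is always above absolute zero, the ratio t_cold / t_hot never reaches zero, which is why 100% efficiency is impossible.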

The best-designed engines, therefore, heat up steam (or another gas) to the highest possible temperature and then release the exhaust at the lowest possible temperature. The most modern steam engines can reach around 60% efficiency, and diesel engines in cars around 50%. Petrol-based internal combustion engines are much more wasteful of their fuel's energy.

The inefficiencies are built into any system using energy and can be described thermodynamically. This wasted energy means that the overall disorder of the universe – its entropy – will increase over time but at some point reach a maximum. At this moment in some unimaginably distant future, the energy in the universe will be evenly distributed and so, for all macroscopic purposes, will be useless. Cosmologists call this the "heat death" of the universe, an inevitable consequence of the unstoppable march of entropy.

Source: https://www.theguardian.com/science/2013/dec/01/what-is-the-second-law-of-thermodynamics

Thermodynamic Laws that Explain Systems


A thermodynamic system is one that interacts and exchanges energy with the area around it. The exchange and transfer need to happen in at least two ways. At least one way must be the transfer of heat. If the thermodynamic system is "in equilibrium," it can't change its state or status without interacting with its environment. Simply put, if you're in equilibrium, you're a "happy system," just minding your own business. You can't really do anything. If you do, you have to interact with the world around you.

The zeroth law of thermodynamics will be our starting point. We're not really sure why this law is the zeroth. We think scientists had "first" and "second" for a long time, but this new one was so important it should come before the others. And voila! Law Number Zero! Here's what it says: When two systems are sitting in equilibrium with a third system, they are also in thermal equilibrium with each other.

In English: systems "One" and "Two" are each in equilibrium with "Three." That means they are each at the same temperature as "Three". But if THAT’S true, then the temperature of "Three" matches those of both "One" and "Two". It’s obvious, then, that the temperatures of "One" and "Two" must ALSO match. This means that "One" and "Two" have to be in equilibrium with each other.

The first law of thermodynamics is a little simpler. The first law states that when heat is added to a system, some of that energy stays in the system and some leaves the system. The energy that leaves does work on the area around it. Energy that stays in the system creates an increase in the internal energy of the system.

In English: you have a pot of water at room temperature. You add some heat to the system. First, the temperature and energy of the water increases. Second, the system releases some energy and it works on the environment (maybe heating the air around the water, making the air rise).
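The bookkeeping in that pot-of-water story is the equation ΔU = Q − W: heat added minus work done by the system equals the change in internal energy. A sketch with made-up numbers:

```python
# First law of thermodynamics: delta_U = Q - W
# Q = heat added TO the system, W = work done BY the system.
heat_added = 500.0   # J added to the pot of water (illustrative value)
work_done = 120.0    # J the system does on its surroundings

delta_internal_energy = heat_added - work_done
print(delta_internal_energy)  # 380.0 J stays in the system
```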

The big finish! The second law of thermodynamics explains that it is impossible to have a cyclic (repeating) process that converts heat completely into work. It is also impossible to have a process that transfers heat from cool objects to warm objects without using work.

In English: that first part of the law says no cyclic process is 100% efficient. Some amount of energy is always lost as heat. A system cannot convert all of its heat energy into working energy.

The second part of the law is more obvious. A cold body can't heat up a warm body. Heat naturally wants to flow from warmer to cooler areas. Heat wants to flow and spread out to areas with less heat. If heat is going to move from cooler to warmer areas, it is going against what is “natural”, so the system must put in some work for it to happen.


Source: http://www.physics4kids.com/files/thermo_laws.html