

A humidifier is a device that increases humidity (moisture) in a single room or an entire building. In the home, point-of-use humidifiers are commonly used to humidify a single room, while whole-house or furnace humidifiers, which connect to a home’s HVAC system, provide humidity to the entire house. Medical ventilators often include humidifiers for increased patient comfort. Large humidifiers are used in commercial, institutional, or industrial contexts, often as part of a larger HVAC system.

Humidifier in an art museum in Augsburg, Germany

Low humidity may occur in hot, dry desert climates, or indoors in artificially heated spaces. In winter, especially when cold outside air is heated indoors, the humidity may drop as low as 10–20%. This low humidity can cause adverse health effects by drying out mucous membranes such as the lining of the nose and throat, and can cause respiratory distress.[1] Low humidity also affects wooden furniture, causing shrinkage, loose joints, or cracking of pieces.[2] Books, papers, and artworks may shrink or warp and become brittle in very low humidity.[3]

In addition, static electricity may become a problem in conditions of low humidity, damaging semiconductor devices, causing static cling of textiles, and making dust and small particles stick stubbornly to electrically charged surfaces.[4]

Overuse of a humidifier can raise the relative humidity to excessive levels, promoting the growth of dust mites and mold, and can also cause hypersensitivity pneumonitis (humidifier lung).[5] A relative humidity of 30% to 50% is recommended for most homes.[6] A properly installed and located hygrostat should be used to monitor and control humidity levels automatically; failing that, a well-informed operator must check humidity levels regularly.
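The hygrostat regulation described above amounts to simple on/off control with a dead band. A minimal sketch, assuming the 30–50% band quoted above; the function name and arguments are illustrative, not from any particular product:

```python
# Illustrative sketch of a hygrostat's on/off (bang-bang) control with
# hysteresis, keeping relative humidity in the recommended 30-50% band.
# All names and thresholds here are hypothetical.

def hygrostat_step(rh_percent: float, humidifier_on: bool,
                   low: float = 30.0, high: float = 50.0) -> bool:
    """Return the humidifier's next on/off state given the current RH."""
    if rh_percent < low:
        return True        # too dry: start humidifying
    if rh_percent > high:
        return False       # too humid: stop (avoids the mold/dust-mite range)
    return humidifier_on   # inside the band: keep the current state
```

Keeping the current state inside the band is what prevents the humidifier from rapidly cycling on and off around a single set point.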

Industrial humidifiers

Industrial humidifiers are used when a specific humidity level must be maintained to prevent static electricity buildup, preserve material properties, and ensure a comfortable and healthy environment for workers or residents.

Static problems are prevalent in industries such as packaging, printing, paper, plastics, textiles, electronics, automotive manufacturing and pharmaceuticals. Friction can produce static buildup and sparks when humidity is below 45% relative humidity (RH). Between 45% and 55% RH, static builds up at reduced levels, while humidity above 55% RH ensures that static will never build up.[7] The American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) has traditionally recommended a range of 45–55% RH in data centers to prevent sparks that can damage IT equipment.[8] Humidifiers are also used by manufacturers of semiconductors and in hospital operating rooms.
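The RH thresholds quoted above (below 45%, between 45% and 55%, above 55%) can be captured in a small lookup; the function name and category labels are our own, for illustration only:

```python
# Illustrative mapping of relative humidity to static-buildup risk,
# following the 45% / 55% RH thresholds cited in the text.

def static_risk(rh_percent: float) -> str:
    if rh_percent < 45.0:
        return "high"      # friction readily produces static and sparks
    if rh_percent <= 55.0:
        return "reduced"   # static builds up at reduced levels
    return "none"          # above 55% RH, static does not build up
```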

Printers and paper manufacturers use humidifiers to prevent shrinkage and paper curl. Humidifiers are needed in cold storage rooms to preserve the freshness of food against the dryness caused by cold temperatures. Art museums use humidifiers to protect sensitive works of art, especially in exhibition galleries, where they combat the dryness caused by heating for the comfort of visitors during winter.[9]

Portable humidifiers

A “portable” humidifier may range in size from a small tabletop appliance to a large floor-mounted unit. The water is usually supplied by manually filling the unit on a periodic basis.

Evaporative humidifiers

The most common portable humidifier, an “evaporative”, “cool moisture”, or “wick humidifier”, consists of just a few basic parts: a reservoir, wick and fan.

The wick is made of a porous material that absorbs water from the reservoir and provides a larger surface area for it to evaporate from. The fan is adjacent to the wick and blows air onto the wick to aid in the evaporation of the water. Evaporation from the wick is dependent on relative humidity. A room with low humidity will have a higher evaporation rate compared to a room with high humidity. Therefore, this type of humidifier is partially self-regulating; as the humidity of the room increases, the water vapor output naturally decreases.
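The self-regulation described above can be illustrated with a toy model in which the wick's output is proportional to (100 − RH) and the room loses moisture at a rate proportional to RH; all constants are arbitrary, chosen only for illustration:

```python
# Toy model of a self-regulating evaporative humidifier: the wick's
# output falls as the room humidifies, so RH settles at an equilibrium
# (here k*100/(k + leak)) instead of climbing indefinitely.

def simulate_rh(rh0: float, k: float = 0.05, leak: float = 0.05,
                steps: int = 500) -> float:
    """March the room RH forward in time; returns the final RH."""
    rh = rh0
    for _ in range(steps):
        evap = k * (100.0 - rh)   # wick output: high when dry, low when humid
        loss = leak * rh          # moisture lost to infiltration/absorption
        rh += evap - loss
    return rh
```

With these constants the room converges to 50% RH whether it starts dry or humid, mirroring the partially self-regulating behavior described above.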

These wicks become moldy if they are not dried out completely between fillings, and become saturated with mineral deposits over time. They regularly need rinsing or replacement; if this does not happen, air cannot pass through them, the humidifier stops humidifying, and the water level in the tank stops dropping.

Natural humidifiers

Vaporizer (steam humidifier, warm mist humidifier)

Impeller humidifier NW-5 (Poland, 1977)

One type of evaporative humidifier makes use of just a reservoir and wick. Sometimes called a “natural humidifier”, these are usually non-commercial devices that can be assembled at little or no cost. One version of a natural humidifier uses a stainless steel bowl, partially filled with water, covered by a towel. A waterproof weight is used to sink the towel in the center of the bowl. There is no need for a fan, because the water spreads through the towel by capillary action and the towel surface area is large enough to provide for rapid evaporation. The stainless steel bowl is much easier to clean than typical humidifier water tanks. This, in combination with daily or every other day replacement of the towel and periodic laundering, can control the problem of mold and bacteria.

Houseplants may also be used as natural humidifiers, since they evaporate water into the air through transpiration. Care must still be taken to prevent bacteria or mold in the soil from growing to excessive levels, or from dispersing into the air.

Natural humidifiers with non-electric, hydropneumatic control


Evaporator with non-electric hydropneumatic control

Functions:
  • The hydropneumatic control makes it possible to humidify any room, independently of a heat source or a fan.
  • The humidification process is reversed: the moisture slides downward into the room, which gives the humidity a longer dwell time in the room.
  • The filling of the evaporation surface also runs from top to bottom. This makes it possible to fill the surface with water by flooding, and prevents contamination, because the water tank has no direct contact with the evaporation surface and allows fresh water to flow. Here the evaporation surface does not act as a filter for lime; it serves to distribute the water evenly.


Explanation: Water runs from the top over the surface (1), and the hydropneumatic controller takes over control of the water flow. The water that runs over the surface is collected in a catch basin (2). For fresh water to run out of the closed tank (5), air must enter it; the flowing water draws this air from the catch basin via a hose (3). Dripping water can close the hose, and the resulting under-pressure in the water tank (5) stops water from running out of it. The filling process stops immediately and restarts in relation to the humidity of the air. Contamination is further suppressed because dripped water immediately evaporates in the lime separator and, if necessary, in the humidification poster.


Humidification posters:

In humidification posters, the evaporation surface is spread over a large volume of air. Air absorbs moisture according to its existing moisture content. The air cools slightly as it humidifies, becomes denser, and starts to sink. The air recirculation produced by this technique is thus adapted to the humidity of the air: it runs quickly with dry air and slowly with moist air, and is self-regulating.

Process: The evaporation surface (1) draws water out of the capillary basin (6), sets the capillary pump (8) in action for the flooding process, and quickly fills the surface from top to bottom. The excess then drips into the catch basin (2). The water evaporates and sets the air circulation in motion. The drained water closes the hose (3) below, interrupting the air that flows through it into the water tank (5). The water in the tank can no longer flow out because under-pressure develops. The residual water in the capillary basin (6), at the top, is evaporated by the evaporation surface (1). The residual water in the catch basin (2), at the bottom, is drawn out by the lime separator (7) and evaporated. As soon as the catch basin is empty, air flows again through the hose (3); water from the tank flows back to the capillary basin (6), and the process begins again. The alternating filling levels of the catch and capillary basins suppress the growth of germs.


A vaporizer (steam humidifier, warm mist humidifier) heats or boils water, releasing steam and moisture into the air. A medicated inhalant can also be added to the steam vapor to help reduce coughs. Vaporizers may be more healthful than cool mist types of humidifiers because steam is less likely to convey mineral impurities or microorganisms from the standing water in the reservoir.[10] However, boiling water requires significantly more energy than other techniques. The heat source in poorly designed humidifiers can overheat, causing the product to melt, leak, and start fires.[11]

Impeller humidifiers

An impeller humidifier (cool mist humidifier) uses a rotating disc to fling water at a diffuser, which breaks the water into fine droplets that float into the air. The water supply must be kept scrupulously clean, or there is a risk of spreading bacteria or mold into the air. These types of humidifiers are usually noisier than others.

Ultrasonic humidifiers

Ultrasonic humidifier

An ultrasonic humidifier uses a ceramic diaphragm vibrating at an ultrasonic frequency to create water droplets that silently exit the humidifier in the form of cool fog. Usually the mist is forced out by a tiny fan, though some miniature models have no fan and are intended mainly for personal use. Ultrasonic humidifiers use a piezoelectric transducer to create a high-frequency mechanical oscillation in a film of water. This forms an extremely fine mist of droplets about one micron in diameter, which is quickly evaporated into the air flow.

Unlike the humidifiers that boil water, these water droplets will contain any impurities that are in the reservoir, including minerals from hard water (which then forms a difficult-to-remove sticky white dust on nearby objects and furniture). Any pathogens growing in the stagnant tank will also be dispersed in the air. Ultrasonic humidifiers should be cleaned regularly to prevent bacterial contamination from being spread throughout the air.

The amount of minerals and other materials can be greatly reduced by using distilled water. Special disposable demineralization cartridges may also reduce the amount of airborne material, but the EPA warns, “the ability of these devices to remove minerals may vary widely.”[10] The mineral dust may have negative health effects[citation needed]. Wick humidifiers trap the mineral deposits in the wick; vaporizer types tend to collect minerals on or around the heating element and require regular cleaning with vinegar or citric acid to control buildup.

Forced-air humidifiers

For buildings with a forced-air furnace, a humidifier may be installed into the furnace. They can also protect wooden objects, antiques and other furnishings which may be sensitive to damage from overly dry air. In colder months, they may provide modest energy savings, since as humidity increases, occupants may feel warm at a lower temperature.[citation needed]

Bypass humidifiers are connected between the heated and cold air return ducts, using the pressure difference between these ducts to cause some heated air to make a bypass through the humidifier and return to the furnace.

The humidifier should usually be disabled during the summer months if air conditioning is used; air conditioners partially function by reducing indoor humidity.


Drum style (bypass) uses a pipe to bring water directly to a reservoir (a pan) attached to the furnace. The water level in the pan is controlled by a float valve, similar to a small toilet tank float. The wick is typically a foam pad mounted on a drum and attached to a small motor; hot air enters the drum at one end and is forced to leave through the sides of the drum. When the hygrostat calls for humidity, the motor is turned on causing the drum to rotate slowly through the pan of water and preventing the foam pad from drying out.

Advantages include:

  • Low cost
  • Inexpensive maintenance (drum-style pads are cheap and readily available)[citation needed]
Disadvantages include:

  • Requirement for frequent (approximately monthly) inspections of cleanliness and pad condition
  • Water evaporation even when humidification is not required (due to the pan of water which remains exposed to a high velocity air stream)
  • Mold growth in the pan full of water (this problem is exacerbated by the large quantity of air, inevitably carrying mold spores, passing through the humidifier whether in use or not).

For the latter reason especially, drum-style humidifiers should always be turned off at the water supply during summer (air conditioning) months, and should always be used with high-quality furnace air filters (with MERV ratings as high as possible, to minimize the number of mold spores reaching the humidifier pan) when the water supply is turned on.

Disc wheels

A disc wheel style (bypass) is very similar in design to the drum style humidifier; this type of furnace humidifier replaces the foam drum pad with a number of plastic discs with small grooves on both sides. This allows for a very large evaporative surface area without requiring a great deal of space. Unlike the drum style humidifiers, the disc wheel does not need regular replacement.

Advantages include:

  • Very low maintenance (basin of humidifier should be cleaned out periodically, unless an automatic flushing device is installed)
  • No regular replacement of parts necessary
  • Higher output due to large evaporative surface area
  • Can be installed in hard water situations
  • Maintains efficiency throughout its lifespan
Disadvantages include:

  • Higher price
  • Water evaporation even when humidification is not required (due to the pan of water which remains exposed to a high velocity air stream)

Bypass flow-through

Bypass flow-through style (bypass – also known as “biscuit style” or many other, similar variant names) uses a pipe to bring water directly to an electrically controlled valve at the top of the humidifier. Air passes through an aluminum “biscuit” (often called a pad; the term “biscuit” emphasizes the solid rather than foamy form) which is similar to a piece of extremely coarse steel wool. The biscuit has a coating of a matte ceramic, resulting in an extremely large surface area within a small space. When the hygrostat calls for humidity, the valve is opened and causes a spray of water onto the biscuit. Hot air is passed through the biscuit, causing the water to evaporate from the pad and be carried into the building.

Advantages include:

  • Reduced maintenance (new biscuit is needed only when clogged with dust or mineral deposits, typically once per year)
  • Lack of a pan of potentially stagnant water to serve as a breeding ground for mold as with a drum-style humidifier
  • No incidental humidification caused by a constantly replenished pan of water in a high velocity air stream
  • Reduced requirement for expensive air filters
  • Uses little electricity
Disadvantages include:

  • A somewhat higher purchase price
  • Manufacturer and model-specific replacement biscuits (versus the relatively generic drum-style pads) may be more expensive and difficult to find
  • For most models, a portion of the water supplied to the unit is not evaporated. This can generate a considerable amount of waste water containing residual minerals, which does require connection to a drain. There is a limited selection of drainless models that recirculate water, but mineral buildup must then be removed manually on a periodic basis.

Spray mist

Spray mist type uses a pipe, usually a small plastic tube, to bring water directly to an electrically controlled valve in the humidifier. Water mist is sprayed directly into the supply air, and the mist is carried into the premises by the air flow.

Advantages include:

  • Simpler than bypass types to install, requiring a single cut hole for installation, no additional ducting.
  • Uses little electricity.
  • Small, compact unit which fits where other types cannot. (Approximately 6 inches (15 cm) square.)
  • Because it does not require bypass ducting it does not undermine the pressure separation (and therefore, blower efficiency) of the return and supply ducts.
  • Does not require use of moisture pads (on-going expense).
  • Highly efficient usage of water. Does not generate waste water, and does not require separate connection to a drain.
  • Requires little maintenance. Periodic cleaning of nozzle may be required in hard water environments.
  • Lack of a pan of potentially stagnant water to serve as a breeding ground for mold as with a drum-style humidifier.
Disadvantages include:

  • Spray nozzle can become clogged in hard water situations, necessitating the use of water filter, periodic cleaning of nozzle, or nozzle replacement.
  • Disperses any minerals in the water into the airstream.

Additional types

Additional types include non-bypass flow-through (fan augmented), steam, impeller or centrifugal atomizer, and under duct designs.

Problems and health risks

The USEPA provides detailed information about health risks as well as recommended maintenance procedures.[10] If the tap water contains a lot of minerals (also known as “hard water”) then the ultrasonic or impeller humidifiers will produce a “white dust” (calcium is the most common mineral in tap water), which usually spreads over furniture, and is attracted to static electricity generating devices such as CRT monitors. The white dust can be prevented by using distilled water or a demineralization cartridge in ultrasonic humidifiers.

In addition, a stuck or malfunctioning water supply valve can deliver large amounts of water, causing extensive water damage if undetected for any period of time. A water alarm, possibly with an automatic water shutoff, can help prevent this malfunction from causing major problems.

From Wikipedia


A toaster, or toast maker, is a small electric appliance designed to brown sliced bread by exposing it to radiant heat, thus converting it into toast. Toasters can toast multiple types of sliced bread products. Invented in Scotland in 1893, the toaster was developed over the years; the introduction in 1919 of an automatic mechanism that stops the toasting and pops the slices up, the “pop-up toaster”, was a significant development. The most common household toasting appliances in the 2010s are the pop-up toaster and the toaster oven. Bread slices are inserted into slots in the top of a pop-up toaster, which makes toast from bread in one to three minutes using electric heating elements. Toasters have a control to adjust how much the appliance toasts the bread. Since the 2000s, pop-up toasters with wider slots have been manufactured, enabling them to toast bagels cut in half or thick-sliced “Texas toast”. Another trend of the 2000s is the increasing availability of four-slice toasters.

Toaster ovens have a hinged door in the front that opens to allow food items to be placed on a rack, which has heat elements above and below the grilling area. Toaster ovens function the same as a small-scale conventional oven. Toaster ovens typically have settings to toast bread and a temperature control for use of the appliance as an oven. Most are large enough to heat up a slice of pizza or a burrito and some larger models can be used to bake a small casserole.


The word “toaster” was formed from “toast”, meaning “sliced bread singed by heat”,[1] and “-er”, a suffix that turns a verb into a noun denoting the thing that performs the action. The term toast comes “…from the Latin torrere, ‘to burn’”.[1] The first reference to toast “…in print is in a recipe for…Oyle Soppys (flavoured onions stewed in a gallon of stale beer and a pint of oil) that dates from 1430.”[2] In the 1400s and 1500s, toast “…was discarded rather than eaten after it was used as a flavouring for drinks.”[2] In the 1600s, toast was still thought of as something to be put into drinks. Shakespeare gave this line to Falstaff in The Merry Wives of Windsor: “Go, fetch me a quart of Sacke [sherry], put a tost in ‘t.”[2] By the 1700s, there were references to toast as a gesture of respect: “Ay, Madam, it has been your Life’s whole Pride of late to be the Common Toast of every Publick Table.”[2]


Toaster before the use of electricity

Toaster with an Edison screw fitting, c. 1909

General Electric Model D-12 toaster, from 1910s

Before the development of the electric toaster, sliced bread was toasted by placing it in a metal frame or on a long-handled toasting-fork[3] and holding it near a fire or over a kitchen grill. Simple utensils for toasting bread over open flames appeared in the early 19th century.

The first electric bread toaster was invented by Alan MacMasters in Edinburgh, Scotland in 1893.[4]

Development of the heating element

The primary technical problem at the time was the development of a heating element which would be able to sustain repeated heating to red-hot temperatures without either breaking or becoming too brittle.[citation needed] A similar technical challenge had recently been surmounted with the invention of the first successful incandescent lightbulbs by Joseph Swan and Thomas Edison. However, the light bulb took advantage of the presence of a vacuum, something that couldn’t be used with the toaster.

MacMasters’ toaster was commercialized by Crompton, Stephen J. Cook & Company of the UK as a toasting appliance called the Eclipse. Early attempts at producing electrical appliances using iron wiring were unsuccessful, because the wiring melted easily and posed a serious fire hazard. Meanwhile, electricity was not readily available, and when it was, often only at night.[citation needed]

The problem of the heating element was solved in 1905 by a young engineer named Albert Marsh who designed an alloy of nickel and chromium, which came to be known as Nichrome.[5][6][7][8]

The first US patent application for an electric toaster was filed by George Schneider of the American Electrical Heater Company of Detroit in collaboration with Marsh.[6] One of the first applications the Hoskins company had considered for chromel was toasters, but eventually abandoned such efforts to focus on making just the wire itself.[7]

The first commercially successful electric toaster, the General Electric model D-12, was introduced in 1909.[6][9]

Dual-side toasting and automated pop-up technologies

In 1913, Lloyd Groff Copeman and his wife Hazel Berger Copeman applied for various toaster patents and in that same year the Copeman Electric Stove Company introduced the toaster with automatic bread turner.[10] The company also produced the “toaster that turns toast.” Before this, electric toasters cooked bread on one side and then it was flipped by hand to toast the other side. Copeman’s toaster turned the bread around without having to touch it.[11]

The automatic pop-up toaster, which ejects the toast after toasting it, was first patented by Charles Strite in 1921.[12] In 1925, using a redesigned version of Strite’s toaster, the Waters Genter Company introduced the Model 1-A-1 Toastmaster,[13] the first automatic pop-up, household toaster that could brown bread on both sides simultaneously, set the heating element on a timer, and eject the toast when finished.[citation needed]

Toasting technology after the 1940s

By the middle of the 20th century, some high-end U.S. toasters featured automatic toast lowering and raising, with no levers to operate: simply dropping the slices into the machine commenced the toasting procedure. A notable example was the Sunbeam T-20, T-35 and T-50 models (identical except for details such as control positioning) made from the late 1940s through the 1960s. These used the mechanically multiplied thermal expansion of the resistance wire in the center element assembly to lower the bread; the inserted slice of bread tripped a lever to switch on the power, which caused the heating element to begin expanding and thus lower the bread.

When the toast was done, as determined by a small bimetallic sensor actuated by the heat passing through the toast, the heaters were shut off and the pull-down mechanism returned to its room-temperature position, slowly raising the finished toast. Because the sensor responded to heat passing through the toast, the bread would always be toasted to the same degree regardless of its color (white or wholemeal) and its initial temperature (even frozen). If a piece of toast was re-inserted into the toaster, it would be only reheated.[citation needed]
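The effect described above, where doneness is governed by heat passing through the toast rather than by a fixed time, can be sketched with a toy model: the toaster runs until a hypothetical sensor reaches a trip temperature, so colder bread simply toasts longer while reaching the same final state. All names and numbers are illustrative:

```python
# Toy model of a heat-sensing shutoff: the cycle ends when the sensed
# temperature reaches a trip point, not after a fixed time, so frozen
# bread gets a longer cycle but the same final "doneness".

def toasting_time(bread_temp_c: float, trip_c: float = 90.0,
                  heat_rate: float = 2.0) -> float:
    """Minutes until the sensor trips; heat_rate is degrees C gained
    per 0.1-minute step (an arbitrary constant)."""
    t = 0.0
    temp = bread_temp_c
    while temp < trip_c:
        temp += heat_rate   # sensor warms as heat passes through the toast
        t += 0.1
    return round(t, 1)
```

Room-temperature bread (20 °C) trips the sensor sooner than frozen bread (−10 °C), but both end at the same trip temperature, which is the point the original design exploited.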

Newer additions to toaster technology include wider toasting slots for bagels and thick breads, the ability to toast frozen breads, and the option to heat a single side or slot. Most toasters can also be used to toast other foods such as teacakes, Pop Tarts, potato waffles and crumpets, though the addition of melted butter or sugar to the interior components of automatic electric toasters often contributes to eventual failure. In rare cases, some hobbyists modify toasters to print images and logos on bread slices.[citation needed]


Untoasted slice of brown bread
The same slice of bread, now toasted

Modern toasters are typically one of three varieties: pop-up toasters, toaster ovens, and conveyor belt toasters. For home use, consumers typically choose a toaster type based on their intended use. Pop-up toasters are better than toaster ovens for making evenly toasted toast, but toaster ovens can bake and broil while pop-up toasters cannot.

Conveyor belt toasters are mostly used in restaurants or other industrial catering environments where toast needs to be made quickly and in larger quantities.

Toasters are designed to look at home in any kitchen. Designers have offered more aesthetic variations of pop-up toasters than of other types, and consumers may choose a toaster by its appearance.


Features which distinguish various types of toasters include the following:[14]

For all toasters
  • Consistency of toasting – The ideal toaster can provide even toasting over the area of the bread, and reproduce this throughout the lifetime of the machine.
  • Choice of toastiness – The user should be able to choose the darkness of the toasting.
  • Toast output – Various toasters can process bread into toast at different capacities.
  • Ease of operation – The toaster’s controls should be labelled to permit easy use and predictable results.
  • Removability of crumb tray – Toasters with a permanently attached crumb tray will be more difficult to clean than those with a removable tray.
  • Cord placement – There can be variation on the placement of a cord as well as retraction functionality.
For pop-up toasters only
  • One-sided toasting – Toasters may optionally toast only one side of the bread, perhaps for toasting one side of a bagel.
  • One-slot toasting – The ability to toast an individual slot, if a single item is desired.
  • Slot depth – People desiring toasted oblong bread should seek a deep slotted toaster.
  • Slot width – People desiring toasted fat bread should seek a wide slotted toaster, as for bagels.
  • Safety features – Most contemporary pop-up toasters have automatic shutoff in case of toast displacement and burning.
  • Bread lifter – Beyond the pop-up, some toasters may incorporate a bread lifter to further expel toast products.
For toaster ovens only
  • Broil options – If the upper heating element can be operated alone, the toaster oven can be used for broiling.
  • Compact shape – Appropriately sized toaster ovens will serve the user’s requirements but not occupy more counter space than necessary.
  • Design for cleaning – A nonstick interior such as that made from porcelain makes oven interiors easier to clean.
  • Interior lighting – A light inside the oven permits observation of cooking food.
  • Multiple shelf racks – Having options for positioning the oven shelf gives more control over distance between food and the heating element.

Pop-up toasters

A classically styled chrome two-slot automatic electric toaster

Glowing filaments of a modern 2-slice toaster

In pop-up or automatic toasters, bread slices are inserted vertically into the slots (generally only large enough to admit a single slice of bread) on the top of the toaster. A lever on the side of the toaster is pressed, activating the toaster. When an internal device determines that the toasting cycle is complete, the toaster turns off and the toast pops up out of the slots. The heating elements of a pop-up toaster are usually oriented vertically, parallel to the bread slice – although there are some variations.[citation needed]

In early models, the completion of the toasting operation was determined by a mechanical clockwork timer; the user could adjust the running time of the timer to determine the degree of “doneness” of the toast, but the first cycle produced less-toasted toast than subsequent cycles because the toaster was not yet warmed up. Toasters made since the 1930s frequently use a thermal sensor, such as a bimetallic strip, located close to the toast. This allows the first cycle to run longer than subsequent cycles. The thermal device is also slightly responsive to the actual temperature of the toast itself. Like the timer, it can be adjusted by the user to determine the “doneness” of the toast.[citation needed]

The most commonly used methods to adjust heat supplied to the toast are either variable time or a heat sensor.

Among pop-up toasters, those that toast two slices of bread outsell those that can toast four.[14] Pop-up toasters can have a range of appearances beyond just a square box, and may have an exterior finish of chrome, copper, brushed metal, or any color of plastic.[14] The marketing and price of toasters may not be an indication of quality for producing good toast.[14] A typical modern two-slice pop-up toaster can draw from 600 to 1200 watts.[citation needed]
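For the wattage range quoted above, a back-of-envelope calculation gives the energy used per toasting cycle; the two- and three-minute cycle times follow the one-to-three-minute figure mentioned earlier, and everything here is illustrative arithmetic:

```python
# Back-of-envelope energy per toasting cycle for the 600-1200 W range
# quoted in the text; cycle times are illustrative.

def cycle_energy_kwh(watts: float, minutes: float) -> float:
    """Convert an average power draw over a cycle into kWh."""
    return watts * (minutes / 60.0) / 1000.0

low = cycle_energy_kwh(600, 2)    # lightest case: ~0.02 kWh per cycle
high = cycle_energy_kwh(1200, 3)  # heaviest case: ~0.06 kWh per cycle
```

Even at the high end, a single toasting cycle uses a few hundredths of a kilowatt-hour; the high instantaneous draw, not the total energy, is what matters for circuit sizing.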

In 2012 in the United States, a typical market price for a pop-up toaster was US$15.[14]

Toaster ovens

Toaster oven (Japan)

Toaster ovens are small electric ovens with a front door, wire rack and removable baking pan. To toast bread with a toaster oven, slices of bread are placed horizontally on the rack. When the toast is done, the toaster turns off, but in most cases the door must be opened manually. Most toaster ovens are significantly larger than toasters, but are capable of performing most of the functions of electric ovens, albeit on a much smaller scale. They can be used to cook toast with toppings, like garlic bread or cheese, though they tend to produce drier toast since their heating elements are located farther from the toast (to allow larger items to be cooked).[citation needed] They take 4–6 minutes to make toast as compared to 2–3 minutes in pop-up toasters.[14] Since the toast lies on bars in a toaster oven, the toast will have untoasted stripes on one side.[14] The evidence from product testing does not indicate that convection toaster ovens perform better than regular toaster ovens.[14] People wishing to make large amounts of toast in a toaster oven should check the size before purchase, as even seemingly large toaster ovens may not fit six standard-size pieces of bread.[14]

Toaster ovens require countertop space ranging from 16 by 8 inches (41 cm × 20 cm) to 20 by 10 inches (51 cm × 25 cm).[14] In 2012 in the United States, a typical market price for a good toaster oven was US$70–80.[14]

Conveyor toasters[edit]

A conveyor toaster can make several hundred pieces of toast in an hour

Conveyor toasters are designed to make many slices of toast and are generally used in the catering industry, in cafeterias, diners and institutional cooking facilities, as they are suitable for large-scale use. Bread is toasted at a rate of 350–900 slices an hour, making conveyor toasters ideal for a large restaurant that is consistently busy. Such devices have occasionally been produced for home use as far back as 1938, when the Toast-O-Lator went into limited production.[citation needed]

Technological innovations[edit]

A hot dog toaster

A number of projects have added advanced technology to toasters. In 1990, Simon Hackett and John Romkey created The Internet Toaster, a toaster which could be controlled from the Internet.[15] In 2001, Robin Southgate from Brunel University in England created a toaster that could toast a graphic of the weather prediction (limited to sunny or cloudy) onto a piece of bread.[16] The toaster dials a pre-coded phone number to get the weather forecast.[17]

In 2005, Technologic Systems, a vendor of embedded systems hardware, designed a toaster running the NetBSD Unix-like operating system as a sales demonstration system.[18] In 2012, Basheer Tome, a student at Georgia Tech, designed a toaster using color sensors to toast bread to the exact shade of brown specified by a user.[19]

A toaster which used Twitter was cited as an early example of an application of the Internet of Things.[20][21] Toasters have been used as advertising devices for online marketing.[22]

With permanent modifications, a toaster oven can be used as a reflow oven for the purpose of soldering electronic components to circuit boards.[23][24]

A hot dog toaster is a variation on the toaster design; it will cook hot dogs without use of microwaves or stoves. The appliance looks similar to a regular toaster, except that there are two slots in the middle for hot dogs, and two slots on the outside for toasting the buns.

In popular culture[edit]

The slang idiom “you’re toast”, “I’m toast” or “we’re toast” is used to express a state of being “outcast”, “finished”, “burned, scorched, wiped out, [or] demolished” (without even the consolation of being remembered, as with the slang term “you’re history”).[1] “Hey, dude. You’re toast, man”, which appeared in The St. Petersburg Times of October 1, 1987, is the earliest citation that the Oxford English Dictionary research staff has of this usage.[1]

The other popular idiom associated with the word “toast” is the expression “to toast someone’s health”, which is typically done by one or more persons at a gathering by raising a glass in salute to the individual. This meaning is derived from the early meaning of toast, which from the 1400s to the 1600s meant warmed bread that was placed in a drink. By the 1700s, there were references to the drink in which toast was dunked being used in a gesture that indicates respect: “Ay, Madam, it has been your Life’s whole Pride of late to be the Common Toast of every Publick Table.”[2]

In the 1960s, Kellogg’s advertised its Pop Tart pastries, which were warmed in a toaster, with an animated, anthropomorphic toaster character named Milton. The snack became so popular that Kellogg could not keep up with demand.[25]

In 1989, Berkeley Systems introduced a computer screensaver software called After Dark for the Apple Macintosh and in 1991 for Microsoft Windows that included animated 1940s-style chrome toasters sporting bird-like wings (also known as flying toasters). The toasters were depicted flying across the screen with pieces of toast.

Additionally, the phrase “all toasters toast toast” originated in the CD-i video game Hotel Mario. In one cutscene, Mario discovers the source of a hotel’s rolling blackouts: a room full of toasters loaded with Bowser’s Sourpuss Bread. To stop them, he has to pull the plug on the toasters, allowing the mechanisms to release the toast, at which point the phrase is spoken.[26]

Toasters cause nearly 800 deaths annually due to electrocution and fires.[27] In 2013, the London Fire Brigade released a campaign titled “Fifty Shades of Red”, discouraging young men from performing sexual acts with toasters after the brigade received numerous calls resulting from such acts.[28]


In the United States, some major marketers of toasters include the brands Black & Decker, Cuisinart, General Electric, Hamilton Beach Brands, KitchenAid, Sunbeam Products, T-Fal, and Toastmaster.[14]

from wikipedia


A refrigerator (colloquially fridge) is a popular household appliance that consists of a thermally insulated compartment and a heat pump (mechanical, electronic or chemical) that transfers heat from the inside of the fridge to its external environment so that the inside of the fridge is cooled to a temperature below the ambient temperature of the room. Refrigeration is an essential food storage technique in developed countries. The lower temperature lowers the reproduction rate of bacteria, so the refrigerator reduces the rate of spoilage. A refrigerator maintains a temperature a few degrees above the freezing point of water. Optimum temperature range for perishable food storage is 3 to 5 °C (37 to 41 °F).[1] A similar device that maintains a temperature below the freezing point of water is called a freezer. The refrigerator replaced the icebox, which had been a common household appliance for almost a century and a half. For this reason, a refrigerator is sometimes referred to as an icebox in American usage.

The first cooling systems for food involved using ice. Artificial refrigeration began in the mid-1750s, and developed in the early 1800s. In 1834, the first working vapor-compression refrigeration system was built. The first commercial ice-making machine was invented in 1854. In 1913, refrigerators for home use were invented. In 1923 Frigidaire introduced the first self-contained unit. The introduction of Freon in the 1920s expanded the refrigerator market during the 1930s. Home freezers as separate compartments (larger than necessary just for ice cubes) were introduced in 1940. Frozen foods, previously a luxury item, became commonplace.

Freezer units are used in households and in industry and commerce. Commercial refrigerator and freezer units were in use for almost 40 years prior to the common home models. Most households[citation needed] use the freezer-on-top-and-refrigerator-on-bottom style, which has been the basic style since the 1940s. A vapor compression cycle is used in most household refrigerators, refrigerator–freezers and freezers. Newer refrigerators may include automatic defrosting, chilled water and ice from a dispenser in the door.

Domestic refrigerators and freezers for food storage are made in a range of sizes. Among the smallest is a 4 L Peltier refrigerator advertised as being able to hold 6 cans of beer. A large domestic refrigerator stands as tall as a person and may be about 1 m wide with a capacity of 600 L. Refrigerators and freezers may be free-standing, or built into a kitchen. The refrigerator allows the modern family to keep food fresh for longer than before. Freezers allow people to buy food in bulk and eat it at leisure, and bulk purchases save money.


Refrigeration technology[edit]

Before the invention of the refrigerator, icehouses were used to provide cool storage for most of the year. Placed near freshwater lakes or packed with snow and ice during the winter, they were once very common. Natural means are still used to cool foods today. On mountainsides, runoff from melting snow is a convenient way to cool drinks, and during the winter one can keep milk fresh much longer just by keeping it outdoors. The word “refrigeratory” was used at least as early as the 17th century.[2]

The history of artificial refrigeration began when Scottish professor William Cullen designed a small refrigerating machine in 1755. Cullen used a pump to create a partial vacuum over a container of diethyl ether, which then boiled, absorbing heat from the surrounding air.[3] The experiment even created a small amount of ice, but had no practical application at that time.

Schematic of Dr. John Gorrie’s 1841 mechanical ice machine.

In 1805, American inventor Oliver Evans described a closed vapor-compression refrigeration cycle for the production of ice by ether under vacuum. In 1820, the British scientist Michael Faraday liquefied ammonia and other gases by using high pressures and low temperatures, and in 1834, an American expatriate to Great Britain, Jacob Perkins, built the first working vapor-compression refrigeration system. It was a closed-cycle device that could operate continuously.[4] A similar attempt was made in 1842 by the American physician John Gorrie,[5] who built a working prototype, but it was a commercial failure. American engineer Alexander Twining took out a British patent in 1850 for a vapor compression system that used ether.

The first practical vapor compression refrigeration system was built by James Harrison, a British journalist who had emigrated to Australia. His 1856 patent was for a vapor compression system using ether, alcohol or ammonia. He built a mechanical ice-making machine in 1851 on the banks of the Barwon River at Rocky Point in Geelong, Victoria, and his first commercial ice-making machine followed in 1854. Harrison also introduced commercial vapor-compression refrigeration to breweries and meat packing houses, and by 1861, a dozen of his systems were in operation.

Ferdinand Carré‘s ice-making device

The first gas absorption refrigeration system using gaseous ammonia dissolved in water (referred to as “aqua ammonia”) was developed by Ferdinand Carré of France in 1859 and patented in 1860. Carl von Linde, an engineering professor at the Technological University Munich in Germany, patented an improved method of liquefying gases in 1876. His new process made possible the use of gases such as ammonia, sulfur dioxide (SO2) and methyl chloride (CH3Cl) as refrigerants and they were widely used for that purpose until the late 1920s.

Domestic refrigerator[edit]

McCray pre-electric home refrigerator ad (1905) This company, founded in 1887, is still in business.

In 1913, refrigerators for home and domestic use were invented by Fred W. Wolf of Fort Wayne, Indiana with models consisting of a unit that was mounted on top of an ice box.[6] In 1914, engineer Nathaniel B. Wales of Detroit, Michigan, introduced an idea for a practical electric refrigeration unit, which later became the basis for the Kelvinator. A self-contained refrigerator, with a compressor on the bottom of the cabinet was invented by Alfred Mellowes in 1916. Mellowes produced this refrigerator commercially but was bought out by William C. Durant in 1918, who started the Frigidaire Company to mass-produce refrigerators. In 1918, Kelvinator Company introduced the first refrigerator with any type of automatic control. The absorption refrigerator was invented by Baltzar von Platen and Carl Munters from Sweden in 1922, while they were still students at the Royal Institute of Technology in Stockholm. It became a worldwide success and was commercialized by Electrolux. Other pioneers included Charles Tellier, David Boyle, and Raoul Pictet. Carl von Linde was the first to patent and make a practical and compact refrigerator.

These home units usually required the installation of the mechanical parts, motor and compressor, in the basement or an adjacent room while the cold box was located in the kitchen. There was a 1922 model that consisted of a wooden cold box, water-cooled compressor, an ice cube tray and a 9-cubic-foot (0.25 m3) compartment, and cost $714. (A 1922 Model-T Ford cost about $450.) By 1923, Kelvinator held 80 percent of the market for electric refrigerators. Also in 1923 Frigidaire introduced the first self-contained unit. About this same time porcelain-covered metal cabinets began to appear. Ice cube trays were introduced more and more during the 1920s; up to this time freezing was not an auxiliary function of the modern refrigerator.

General Electric “Monitor-Top” refrigerator, introduced in 1927.

The first refrigerator to see widespread use was the General Electric “Monitor-Top” refrigerator introduced in 1927, so-called because of its resemblance to the gun turret on the ironclad warship USS Monitor of the 1860s. The compressor assembly, which emitted a great deal of heat, was placed above the cabinet and enclosed by a decorative ring. Over a million units were produced. As the refrigerating medium, these refrigerators used either sulfur dioxide, which is corrosive to the eyes and may cause loss of vision, painful skin burns and lesions, or methyl formate, which is highly flammable, harmful to the eyes, and toxic if inhaled or ingested. Many of these units are still functional today, having required little more service than a replacement start relay or thermostat, if that. These cooling systems cannot legally be recharged with the hazardous original refrigerants if they leak or break down.

The introduction of Freon in the 1920s expanded the refrigerator market during the 1930s and provided a safer, low-toxicity alternative to previously used refrigerants. Separate freezers became common during the 1940s; the popular term at the time for the unit was a deep freeze. These devices, or appliances, did not go into mass production for use in the home until after World War II. The 1950s and 1960s saw technical advances like automatic defrosting and automatic ice making. More efficient refrigerators were developed in the 1970s and 1980s, even though environmental issues led to the banning of very effective (Freon) refrigerants. Early refrigerator models (from 1916) had a cold compartment for ice cube trays. From the late 1920s fresh vegetables were successfully processed through freezing by the Postum Company (the forerunner of General Foods), which had acquired the technology when it bought the rights to Clarence Birdseye‘s successful fresh freezing methods.

The first successful application of frozen foods occurred when General Foods heiress Marjorie Merriweather Post (then wife of Joseph E. Davies, United States Ambassador to the Soviet Union) deployed commercial-grade freezers in Spaso House, the US Embassy in Moscow, in advance of the Davies’ arrival. Post, fearful of the USSR’s food processing safety standards, fully stocked the freezers with products from General Foods’ Birdseye unit. The frozen food stores allowed the Davies to entertain lavishly and serve fresh frozen foods that would otherwise be out of season. Upon returning from Moscow, Post (who resumed her maiden name after divorcing Davies) directed General Foods to market frozen product to upscale restaurants.

Home freezers as separate compartments (larger than necessary just for ice cubes), or as separate units, were introduced in the United States in 1940. Frozen foods, previously a luxury item, became commonplace.


Freezer units are used in households and in industry and commerce. Food stored at or below −18 °C (0 °F) is safe indefinitely.[7] Most household freezers maintain temperatures from −23 to −18 °C (−9 to 0 °F), although some freezer-only units can achieve −34 °C (−29 °F) and lower. Refrigerators generally do not achieve lower than −23 °C (−9 °F), since the same coolant loop serves both compartments: Lowering the freezer compartment temperature excessively causes difficulties in maintaining above-freezing temperature in the refrigerator compartment. Domestic freezers can be included as a separate compartment in a refrigerator, or can be a separate appliance. Domestic freezers are generally upright units resembling refrigerators or chests (upright units laid on their backs). Many modern upright freezers come with an ice dispenser built into their door. Some upscale models include thermostat displays and controls, and sometimes flatscreen televisions as well.

Commercial and domestic refrigerators[edit]

Commercial refrigerator and freezer units, which go by many other names, were in use for almost 40 years prior to the common home models. They used gas systems such as ammonia (R-717) or sulfur dioxide (R-764), which occasionally leaked, making them unsafe for home use. Practical household refrigerators were introduced in 1915 and gained wider acceptance in the United States in the 1930s as prices fell and non-toxic, non-flammable synthetic refrigerants such as Freon-12 (R-12) were introduced. However, R-12 damaged the ozone layer, causing governments to issue a ban on its use in new refrigerators and air-conditioning systems in 1994. The less harmful replacement for R-12, R-134a (tetrafluoroethane), has been in common use since 1990, but R-12 is still found in many old systems today.

A common commercial refrigerator is the glass-fronted beverage cooler. These types of appliances are typically designed for specific reload conditions, meaning that they generally have a larger cooling system. This ensures that they can cope with a high throughput of drinks and frequent door opening. As a result, it is common for such commercial refrigerators to have an energy consumption of more than 4 kWh/day.[citation needed]

Styles of refrigerators[edit]

Frigidaire Imperial “Frost Proof” model FPI-16BC-63, top refrigerator/bottom freezer with brushed chrome door finish made by General Motors Canada in 1963

In the early 1950s most refrigerators were white, but from the mid-1950s to the present day designers and manufacturers have put color onto refrigerators. In the late 1950s and early 1960s, pastel colors like turquoise and pink became popular, and brushed chrome plating (similar to a stainless finish) was available on some models from different brands. In the late 1960s and throughout the 1970s, earth-tone colors were popular, including Harvest Gold, Avocado Green and almond. In the 1980s, black became fashionable. In the late 1990s stainless steel came into vogue, and in 2009 one manufacturer introduced multi-color designs.

Production by country[edit]

General technical explanation[edit]

Basic functioning of a refrigerator


Process and Components of a Conventional Refrigerator

Vapor Compression Cycle – A: hot compartment (kitchen), B: cold compartment (refrigerator box), I: insulation, 1: Condenser, 2: Expansion valve, 3: Evaporator unit, 4: Compressor

An Embraco compressor and fan-assisted condenser coil

A vapor compression cycle is used in most household refrigerators, refrigerator–freezers and freezers. In this cycle, a circulating refrigerant such as R134a enters a compressor as low-pressure vapor at or slightly below the temperature of the refrigerator interior. The vapor is compressed and exits the compressor as high-pressure superheated vapor. The superheated vapor travels under pressure through coils or tubes that make up the condenser; the coils or tubes are passively cooled by exposure to air in the room. The condenser cools the vapor, which liquefies. As the refrigerant leaves the condenser, it is still under pressure but is now only slightly above room temperature. This liquid refrigerant is forced through a metering or throttling device, also known as an expansion valve (essentially a pin-hole sized constriction in the tubing) to an area of much lower pressure. The sudden decrease in pressure results in explosive-like flash evaporation of a portion (typically about half) of the liquid. The latent heat absorbed by this flash evaporation is drawn mostly from adjacent still-liquid refrigerant, a phenomenon known as auto-refrigeration. This cold and partially vaporized refrigerant continues through the coils or tubes of the evaporator unit. A fan blows air from the refrigerator or freezer compartment (“box air”) across these coils or tubes and the refrigerant completely vaporizes, drawing further latent heat from the box air. This cooled air is returned to the refrigerator or freezer compartment, and so keeps the box air cold. Note that the cool air in the refrigerator or freezer is still warmer than the refrigerant in the evaporator. Refrigerant leaves the evaporator, now fully vaporized and slightly heated, and returns to the compressor inlet to continue the cycle.
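The efficiency ceiling of the cycle described above is set by the temperatures of the evaporator and condenser. As a rough illustration (not a figure from the text), the ideal Carnot coefficient of performance can be computed for assumed typical household temperatures:

```python
# Hypothetical illustration: the thermodynamic upper bound (Carnot COP) for a
# vapor-compression refrigeration cycle. The temperatures below are assumed
# typical values, not figures from the article; real cycles fall well short
# of this bound due to compressor losses and finite temperature differences.

def carnot_cop(t_evap_c: float, t_cond_c: float) -> float:
    """Ideal coefficient of performance for a cycle that moves heat from an
    evaporator at t_evap_c to a condenser at t_cond_c (both in Celsius)."""
    t_evap_k = t_evap_c + 273.15   # convert to absolute (kelvin) temperatures
    t_cond_k = t_cond_c + 273.15
    return t_evap_k / (t_cond_k - t_evap_k)

# Assumed: evaporator coil at -20 C, room-side condenser coil at 35 C
cop = carnot_cop(-20.0, 35.0)
print(f"Ideal COP: {cop:.1f}")
```

Note how strongly the bound depends on the temperature lift: shrinking the gap between evaporator and condenser raises the achievable efficiency, which is one reason condenser coils are given generous surface area for passive cooling.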

Domestic refrigerators are extremely reliable because the moving parts and fluids are sealed from the atmosphere for life, with no possibility of leakage or contamination. In comparison, mechanically-driven refrigeration compressors, such as those in automobile air conditioning, inevitably leak fluid and lubricant past the shaft seals. This leads to a requirement for periodic recharging and, if ignored, possible compressor failure.

An absorption refrigerator works differently from a compressor refrigerator, using a source of heat, such as combustion of liquefied petroleum gas, solar thermal energy or an electric heating element. These heat sources are much quieter than the compressor motor in a typical refrigerator. A fan or pump might be the only mechanical moving parts; reliance on convection is considered impractical.

The Peltier effect uses electricity to pump heat directly; refrigerators employing this system are sometimes used for camping, or in situations where noise is not acceptable. They can be totally silent (if a fan for air circulation is not fitted) but are less energy-efficient than other methods.

Other uses of an absorption refrigerator (or “chiller”) include large systems used in office buildings or complexes such as hospitals and universities. These large systems are used to chill a brine solution that is circulated through the building.

Many modern refrigerator/freezers have the freezer on top and the refrigerator on the bottom. Most refrigerator-freezers—except for manual defrost models or cheaper units—use what appears to be two thermostats. Only the refrigerator compartment is properly temperature controlled. When the refrigerator gets too warm, the thermostat starts the cooling process and a fan circulates the air around the freezer. During this time, the refrigerator also gets colder. The freezer control knob only controls the amount of air that flows into the refrigerator via a damper system.[9] Changing the refrigerator temperature will inadvertently change the freezer temperature in the opposite direction. Changing the freezer temperature will have no effect on the refrigerator temperature. The freezer control may also be adjusted to compensate for any refrigerator adjustment.

This means the refrigerator may become too warm. However, because only enough air is diverted to the refrigerator compartment, the freezer usually re-acquires the set temperature quickly, unless the door is opened. When a door is opened, either in the refrigerator or the freezer, the fan in some units stops immediately to prevent excessive frost build up on the freezer’s evaporator coil, because this coil is cooling two areas. When the freezer reaches temperature, the unit cycles off, no matter what the refrigerator temperature is. Modern computerized refrigerators do not use the damper system. The computer manages fan speed for both compartments, although air is still blown from the freezer.
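The single-compressor, damper-based scheme described above can be caricatured in a few lines. This is an illustrative simplification under assumed setpoints and names, not any manufacturer's actual control logic:

```python
# Illustrative sketch (assumed setpoints and function names) of the
# single-compressor control described above: only the refrigerator
# compartment is thermostatically controlled, while the "freezer" knob
# merely sets a damper deciding how much freezer air reaches the fridge.

FRIDGE_SETPOINT_C = 4.0  # assumed fresh-food target temperature

def compressor_should_run(fridge_temp_c: float,
                          setpoint_c: float = FRIDGE_SETPOINT_C,
                          hysteresis_c: float = 1.0) -> bool:
    """Start the compressor and fan when the refrigerator compartment
    drifts above its setpoint (plus a small hysteresis band)."""
    return fridge_temp_c > setpoint_c + hysteresis_c

def air_fraction_to_fridge(freezer_knob: float) -> float:
    """The freezer knob (0 = warmest, 1 = coldest) only positions the
    damper: a colder freezer setting diverts less cold air to the
    refrigerator, so the fridge runs slightly warmer."""
    return max(0.0, min(1.0, 1.0 - freezer_knob))

print(compressor_should_run(6.2))       # warm fridge -> cooling starts
print(air_fraction_to_fridge(0.75))     # cold freezer setting -> little air diverted
```

The sketch makes the coupling described in the text visible: the only lever the freezer knob has is the damper fraction, so adjusting one compartment necessarily disturbs the other.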

A few manufacturers offer dual compressor models. These models have separate freezer and refrigerator compartments that operate independently of each other, sometimes mounted within a single cabinet. Each has its own separate compressor, condenser and evaporator coils, insulation, thermostat, and door. Typically, the compressors and condenser coils are mounted at the top of the cabinet, with a single fan to cool them both.

This design, where no air passes between the two compartments, provides for more appropriate humidity levels and much tighter temperature control in each compartment. It also requires much less energy to operate, since each compressor and coolant system can be optimized for a specific temperature range. Further, opening the door of one compartment does not affect the temperature of the air or humidity level in the other compartment. Thus, it avoids many of the disadvantages of the much more common single-compressor designs described above, although at a higher initial cost and increased system noise.[citation needed] Manufacturers of such designs argue that the increased cost is compensated over time by reduced energy use and less food waste due to reduced spoilage.

Several alternatives to the vapor-compression cycle exist but are not in current use.


The inside of a home refrigerator containing a large variety of everyday food items.

Newer refrigerators may include:

  • Automatic defrosting
  • A power failure warning that alerts the user by flashing a temperature display. It may display the maximum temperature reached during the power failure, and whether frozen food has defrosted or may contain harmful bacteria.
  • Chilled water and ice from a dispenser in the door. Water and ice dispensing became available in the 1970s. In some refrigerators, the process of making ice is built-in so the user doesn’t have to manually use ice trays. Some refrigerators have water chillers and water filtration systems.
  • Cabinet rollers that let the refrigerator roll out for easier cleaning
  • Adjustable shelves and trays
  • A status indicator that notifies when it is time to change the water filter
  • An in-door ice caddy, which relocates the ice-maker storage to the freezer door and saves approximately 60 litres (2 cu ft) of usable freezer space. It is also removable, and helps to prevent ice-maker clogging.
  • A cooling zone in the refrigerator door shelves. Air from the freezer section is diverted to the refrigerator door, to cool milk or juice stored in the door shelf.
  • A drop down door built into the refrigerator main door, giving easy access to frequently used items such as milk, thus saving energy by not having to open the main door.
  • A Fast Freeze function to rapidly cool foods by running the compressor for a predetermined amount of time and thus temporarily lowering the freezer temperature below normal operating levels. It is recommended to use this feature several hours before adding more than 1 kg of unfrozen food to the freezer. For freezers without this feature, lowering the temperature setting to the coldest will have the same effect.

Early freezer units accumulated ice crystals around the freezing units. This was a result of humidity, introduced into the units when the doors to the freezer were opened, condensing on the cold parts and then freezing. This frost buildup required periodic thawing (“defrosting”) of the units to maintain their efficiency. Manual-defrost (referred to as cyclic) units are still available. Automatic defrosting, which eliminates the thawing task, was introduced in the 1950s, but it is not universal, due to energy performance and cost. These units used a counter that defrosted the freezer compartment only after a specific number of door openings had been made: a small timer combined with an electrical heater wire that heated the freezer’s walls for a short time to remove all traces of frost. Also, early units featured freezer compartments located within the larger refrigerator, accessed by opening the refrigerator door and then the smaller internal freezer door; units featuring an entirely separate freezer compartment were introduced in the early 1960s, becoming the industry standard by the middle of that decade. These older freezer compartments were the main cooling body of the refrigerator and maintained a temperature of only around −6 °C (21 °F), which is suitable for keeping food for a week.

In the early 1950s, a patent for the butter conditioner was filed and published by the inventor Alfred E. Nave. The feature was supposed to “provide a new and improved food storage receptacle for storing butter or the like which may quickly and easily be removed from the refrigerator cabinet for the purpose of cleaning.”[11] Because of high interest in the invention, companies in the UK, New Zealand, and Australia began to include the feature in mass-produced fridges, and it soon became a fixture of the local culture. However, not long afterward it was removed from production; according to the manufacturers, this was the only way for them to meet new ecological regulations, and they found it inefficient to have a heat-generating device inside a fridge.

Later advances included automatic ice-making units and self-compartmentalized freezing units.

An increasingly important environmental concern is the disposal of old refrigerators: initially because their Freon coolant damages the ozone layer, but, as older-generation refrigerators wear out, also because destruction of their CFC-bearing insulation releases harmful compounds. Modern refrigerators usually use a refrigerant called HFC-134a (1,1,1,2-tetrafluoroethane), which does not deplete the ozone layer, instead of Freon. R-134a is now becoming very uncommon in Europe, where newer refrigerants are used instead. The main refrigerant now used there is R-600a, or isobutane, which occurs naturally and has a smaller effect on the atmosphere if released. There have been reports of refrigerators exploding when leaking refrigerant gas met a spark.

Disposal of discarded refrigerators is regulated, often mandating the removal of doors; children playing hide-and-seek have been asphyxiated while hiding inside discarded refrigerators, particularly older models with latching doors. Since 2 August 1956, under U.S. federal law, refrigerator doors may no longer latch in a way that prevents them from being opened from the inside.[12] Modern units use a magnetic door gasket that holds the door sealed but allows it to be pushed open from the inside.[13] This gasket was invented, developed and manufactured by Max Baermann (1903–1984) of Bergisch Gladbach, Germany.[14]

Types of domestic refrigerators[edit]

Domestic refrigerators and freezers for food storage are made in a range of sizes. Among the smallest is a 4 L Peltier refrigerator advertised as being able to hold 6 cans of beer. A large domestic refrigerator stands as tall as a person and may be about 1 m wide with a capacity of 600 L. Some models for small households fit under kitchen work surfaces, usually about 86 cm high. Refrigerators may be combined with freezers, either stacked with refrigerator or freezer above, below, or side by side. A refrigerator without a frozen food storage compartment may have a small section just to make ice cubes. Freezers may have drawers to store food in, or they may have no divisions (chest freezers).

Refrigerators and freezers may be free-standing, or built into a kitchen.

  • Compressor refrigerators are by far the most common type; they make a noticeable noise.
  • Absorption refrigerators or thermo-electric Peltier units are used where quiet running is required; Peltier coolers are used in the smallest refrigerators as they have no bulky mechanism.
  • Compressor and Peltier refrigerators are powered by electricity. Absorption units can be designed to get power from any heat source. A noticeable difference between the two types is the absence of refrigerant with Peltier coolers (these use a different method of cooling). But Peltier coolers use more electricity because they are thermodynamically inefficient.
  • Oil, gas (natural gas or propane) and dual-power gas/electricity units are also available (typically found in RVs).
  • Solar refrigerators and thermal mass refrigerators are designed to reduce electrical consumption. Solar refrigerators have the added advantage that they do not use refrigerants that are harmful to the environment or flammable. Typical solar designs are absorption refrigerators that use ammonia as the working gas, and employ large mirrors to concentrate sufficient sunlight to reach the temperature required to free gaseous ammonia from the solvent.[15][16] Most thermal mass refrigerators are designed to use electricity intermittently. As these units are heavily insulated, the cooling load is limited primarily to heat introduced by new items to be refrigerated and ambient air transfer when the unit is open. Very little power is therefore required if the unit is opened infrequently. Refrigeration units for commercial and industrial applications can be made in various sizes, shapes and styles to fit customer needs.

Other specialised cooling mechanisms exist, but have not been applied to domestic refrigerators.

  • Magnetic refrigerators are refrigerators that work on the magnetocaloric effect. The cooling effect is triggered by placing a metal alloy in a magnetic field.[17]
  • Acoustic refrigerators use resonant linear reciprocating motors/alternators to generate a sound wave in compressed helium gas; the acoustic cycle pumps heat from one side of the device to the other, the heat is discarded, and the cold side is routed to the refrigerator.

Energy efficiency[edit]

European energy label for a fridge.

Modern refrigerators

In a house without air conditioning (space heating and/or cooling), the refrigerator consumes more energy than any other home device.[18] In the early 1990s a competition was held among the major manufacturers to encourage energy efficiency.[19] Current US models that are Energy Star qualified use 50% less energy than the average models made in 1974.[20] The most energy-efficient unit made in the US consumes about half a kilowatt-hour per day (equivalent to about 20 W continuously).[21] But even ordinary units are quite efficient; some smaller units use less than 0.2 kWh per day (equivalent to about 8 W continuously). Larger units, especially those with large freezers and icemakers, may use as much as 4 kW·h per day (equivalent to about 170 W continuously). The European Union uses a mandatory letter-based energy-efficiency rating label instead of the Energy Star; thus EU refrigerators are labelled at the point of sale according to how energy-efficient they are.
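The equivalence between daily energy use and continuous power quoted above is a simple unit conversion; a minimal sketch (the figures are those cited in the paragraph, which the text rounds slightly):

```python
# Convert a refrigerator's daily energy use (kWh/day) into the
# equivalent continuous power draw in watts: 1 kWh/day = 1000 Wh over 24 h.
def kwh_per_day_to_watts(kwh_per_day: float) -> float:
    return kwh_per_day * 1000 / 24

print(round(kwh_per_day_to_watts(0.5)))  # most efficient US unit: ~21 W
print(round(kwh_per_day_to_watts(0.2)))  # small units: ~8 W
print(round(kwh_per_day_to_watts(4.0)))  # large units: ~167 W
```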

For US refrigerators, the Consortium on Energy Efficiency (CEE) further differentiates between Energy Star qualified refrigerators. Tier 1 refrigerators are those that are 20% to 24.9% more efficient than the Federal minimum standards set by the National Appliance Energy Conservation Act (NAECA). Tier 2 are those that are 25% to 29.9% more efficient. Tier 3 is the highest qualification, for those refrigerators that are at least 30% more efficient than Federal standards.[22] About 82% of the Energy Star qualified refrigerators are Tier 1, with 13% qualifying as Tier 2, and just 5% at Tier 3.[23]
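The CEE tier boundaries described above amount to a simple threshold classification; a sketch (the function name is illustrative, the thresholds are those stated in the paragraph):

```python
def cee_tier(percent_above_federal_standard: float):
    """Classify an Energy Star refrigerator into a CEE tier, using the
    NAECA-relative efficiency thresholds described in the text."""
    p = percent_above_federal_standard
    if p >= 30:
        return "Tier 3"   # at least 30% more efficient than Federal standards
    if p >= 25:
        return "Tier 2"   # 25% to 29.9% more efficient
    if p >= 20:
        return "Tier 1"   # 20% to 24.9% more efficient
    return None           # below 20%: not CEE-tiered

print(cee_tier(22.5))  # Tier 1
print(cee_tier(35.0))  # Tier 3
```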

Besides the standard style of compressor refrigeration used in normal household refrigerators and freezers, there are technologies such as absorption refrigeration and magnetic refrigeration. Although these designs generally use a much larger amount of energy compared to compressor refrigeration, other qualities such as silent operation or the ability to use gas can favor these refrigeration units in small enclosures, a mobile environment or in environments where unit failure would lead to devastating consequences.

Many refrigerators made in the 1930s and 1940s were far more efficient than most that were made later. This is partly attributable to the addition of new features, such as auto-defrost, that reduced efficiency. Additionally, after World War II, refrigerator style became more important than efficiency. This was especially true in the 1970s, when side-by-side models with ice dispensers and water chillers became popular. However, the reduction in efficiency also arose partly from a reduction in the amount of insulation to cut costs. Because of the introduction of new energy-efficiency standards, refrigerators made today are much more efficient than those made in the 1930s; they consume the same amount of energy while being three times as large.[24][25]

The efficiency of older refrigerators can be improved by defrosting (if the unit is manual defrost) and cleaning them regularly, replacing old and worn door seals with new ones, adjusting the thermostat to accommodate the actual contents (a refrigerator needn’t be colder than 4 °C (39 °F) to store drinks and non-perishable items) and also replacing insulation, where applicable. Some sources recommend cleaning the condenser coils every month or so on units with rear-mounted coils, although this has been shown to do little to improve efficiency.[citation needed] In any case, the unit should be able to “breathe”, with adequate space around the front, back, sides and above the unit. If the refrigerator uses a fan to keep the condenser cool, it must be cleaned at least yearly.[citation needed]

Frost-free refrigerators or freezers use electric fans to cool the appropriate compartment, and could be called “fan-forced” refrigerators, whereas manual-defrost units rely on colder air settling at the bottom and warmer air rising to the top to achieve adequate cooling. The air is drawn in through an inlet duct and cooled as it passes over the evaporator; it is then circulated throughout the cabinet via a series of ducts and vents. Because the air passing over the evaporator is relatively warm and moist, frost begins to form on the evaporator (especially on a freezer’s evaporator). In cheaper and/or older models, the defrost cycle is controlled by a mechanical timer, which shuts off the compressor and fan and energizes a heating element located near or around the evaporator for about 15 to 30 minutes every 6 to 12 hours. This melts any frost or ice build-up and allows the refrigerator to work normally once more. Frost-free units are believed to have a lower tolerance for frost, due to their air-conditioner-like evaporator coils; therefore, if a door is accidentally left open (especially the freezer door), the defrost system may not remove all the frost, in which case the freezer (or refrigerator) must be defrosted manually.[citation needed]

If the defrosting system melts all the ice before the timed defrosting period ends, then a small device (called a defrost limiter) acts like a thermostat and shuts off the heating element to prevent too large a temperature fluctuation; it also prevents hot blasts of air when the system starts again, should it finish defrosting early. On some early frost-free models, the defrost limiter also sends a signal to the defrost timer to start the compressor and fan as soon as it shuts off the heating element, before the timed defrost cycle ends. When the defrost cycle is completed, the compressor and fan are allowed to cycle back on.[citation needed]
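The timer-plus-limiter interaction described above can be sketched as a small decision rule. This is a hypothetical illustration, not any manufacturer's actual control logic; the duration and cut-off temperature are assumed values within the ranges the text gives:

```python
DEFROST_MINUTES = 25   # assumed timed period, within the 15-30 min range above
LIMIT_TEMP_C = 10      # assumed limiter cut-off temperature at the evaporator

def heater_on(minutes_into_defrost: float, evaporator_temp_c: float) -> bool:
    """Return True while the defrost heater should stay energized."""
    if minutes_into_defrost >= DEFROST_MINUTES:
        return False   # timed period over; compressor and fan may restart
    if evaporator_temp_c >= LIMIT_TEMP_C:
        return False   # limiter trips: frost is melted, avoid overheating
    return True

print(heater_on(5, -2.0))    # mid-cycle, evaporator still frosty: True
print(heater_on(12, 11.0))   # limiter ends defrost early: False
```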

Frost-free refrigerators, including some early frost free refrigerator/freezers that used a cold plate in their refrigerator section instead of airflow from the freezer section, generally don’t shut off their refrigerator fans during defrosting. This allows consumers to leave food in the main refrigerator compartment uncovered, and also helps keep vegetables moist. This method also helps reduce energy consumption, because the refrigerator is above freeze point and can pass the warmer-than-freezing air through the evaporator or cold plate to aid the defrosting cycle.

Regarding total life-cycle costs, many governments offer incentives to encourage recycling of old refrigerators. One example is the Phoenix refrigerator program launched in Australia. This government incentive picked up old refrigerators, paying their owners for “donating” the refrigerator. The refrigerator was then refurbished, with new door seals, a thorough cleaning and the removal of items, such as the cover that is strapped to the back of many older units. The resulting refrigerators, now over 10% more efficient, were then distributed to low income families.[citation needed]

Effect on lifestyle[edit]

The refrigerator allows the modern family to keep food fresh for longer than before. The most notable improvement is for meat and other highly perishable wares, which previously had to be cured, salted, or otherwise processed to gain anything resembling shelf life.[citation needed] (On the other hand, refrigerators and freezers can also be stocked with processed, quick-cook foods that are less healthy.) Refrigeration in transit makes it possible to enjoy foodstuffs from distant places.

Dairy products, meats, fish, poultry and vegetables can be kept refrigerated in the same space within the kitchen (although raw meat should be kept separate from other foodstuffs for reasons of hygiene).

Freezers allow people to buy food in bulk and eat it at leisure, and bulk purchases save money. Ice cream, a popular commodity of the 20th century, could previously only be obtained by traveling to where the product was made and eating it on the spot. Now it is a common food item. Ice on demand not only adds to the enjoyment of cold drinks, but is useful for first-aid, and for cold packs that can be kept frozen for picnics or in case of emergency.

Temperature zones and ratings[edit]


Commercial for electric refrigerators in Pittsburgh, Pennsylvania, 1926

Some refrigerators are now divided into four zones to store different types of food:

  • −18 °C (0 °F) (freezer)
  • 0 °C (32 °F) (meat zone)
  • 5 °C (41 °F) (cooling zone)
  • 10 °C (50 °F) (crisper)

The capacity of a refrigerator is measured in either liters or cubic feet. Typically, the volume of a combined refrigerator-freezer is split with one-quarter to one-third of the volume allocated to the freezer, although these proportions vary widely.

Temperature settings for refrigerator and freezer compartments are often given arbitrary numbers by manufacturers (for example, 1 through 9, warmest to coldest), but generally 3 to 5 °C (37 to 41 °F)[1] is ideal for the refrigerator compartment and −18 °C (0 °F) for the freezer. Some refrigerators must be within certain external temperature parameters to run properly. This can be an issue when placing units in an unfinished area, such as a garage. European freezers, and refrigerators with a freezer compartment, have a four star rating system to grade freezers.[citation needed]

  • [∗]  : min temperature = −6 °C (21 °F). Maximum storage time for (pre-frozen) food is 1 week
  • [∗∗]  : min temperature = −12 °C (10 °F). Maximum storage time for (pre-frozen) food is 1 month
  • [∗∗∗]  : min temperature = −18 °C (0 °F). Maximum storage time for (pre-frozen) food is between 3 and 12 months depending on type (meat, vegetables, fish, etc.)
  • [∗∗∗∗] : min temperature = −18 °C (0 °F). Maximum storage time for pre-frozen or frozen-from-fresh food is between 3 and 12 months

Although both the three and four star ratings specify the same storage times and same minimum temperature of −18 °C (0 °F), only a four star freezer is intended for freezing fresh food, and may include a “fast freeze” function (runs the compressor continually, down to as low as −26 °C (−15 °F)) to facilitate this. Three (or fewer) stars are used for frozen food compartments that are only suitable for storing frozen food; introducing fresh food into such a compartment is likely to result in unacceptable temperature rises. This difference in categorisation is shown in the design of the 4-star logo, where the “standard” three stars are displayed in a box using “positive” colours, denoting the same normal operation as a 3-star freezer, and the fourth star showing the additional fresh food/fast freeze function is prefixed to the box in “negative” colours or with other distinct formatting.[citation needed]
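The star-rating scheme above is effectively a small lookup table; a sketch of it as data (the field names are illustrative):

```python
# European freezer star ratings, as listed above: minimum temperature,
# maximum storage time for frozen food, and whether the compartment is
# intended for freezing fresh food (4-star only).
STAR_RATINGS = {
    1: {"min_temp_c": -6,  "max_storage": "1 week",      "freezes_fresh": False},
    2: {"min_temp_c": -12, "max_storage": "1 month",     "freezes_fresh": False},
    3: {"min_temp_c": -18, "max_storage": "3-12 months", "freezes_fresh": False},
    4: {"min_temp_c": -18, "max_storage": "3-12 months", "freezes_fresh": True},
}

# Three and four stars share the same temperature and storage times;
# only the fresh-food freezing capability differs.
print(STAR_RATINGS[3]["min_temp_c"] == STAR_RATINGS[4]["min_temp_c"])  # True
print(STAR_RATINGS[4]["freezes_fresh"])                                # True
```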

Most European refrigerators include a moist-cold refrigerator section (which requires automatic defrosting at irregular intervals) and a freezer section that is rarely frost-free.

from Wikipedia

KitchenAid

KitchenAid is an American home appliance brand owned by Whirlpool Corporation. The company was started in 1919 by The Hobart Corporation to produce stand mixers; the “H-5” was the first model introduced. The company faced stiff competition as rivals moved into this emerging market, and introduced its trademarked silhouette in the 1930s with the model “K”, the work of designer Egmont Arens. The brand’s stand mixers have changed little in design since, and attachments from the model “K” onwards are compatible with the modern machines. Dishwashers were the second product line to be introduced, in 1949. A late 1980s promotional campaign on the back of an expansion by retailer Williams-Sonoma saw brand awareness double in three years.


A KitchenAid Model A “Kaidette” prototype stand mixer, produced in the 1930s

The idea of a stand mixer was formulated by Herbert Johnston, an engineer working at the Hobart Corporation. He had been inspired after seeing a baker mix dough, and thought that there must be a better way of doing the task. In 1914, development began, and soon the model “H” mixer was launched for industrial work. The U.S. Navy ordered mixers for two new Tennessee-class battleships, California and Tennessee, as well as the U.S. Navy’s first dreadnought battleship, South Carolina. In 1917, Hobart stand mixers became standard equipment on all U.S. Navy ships, prompting development to begin on the first home models.[1]

A range of modern KitchenAid stand mixers

The first machine to carry the KitchenAid name was the ten-quart C-10 model, introduced in 1918 and built at Hobart’s Troy Metal Products subsidiary in Springfield, Ohio.[2] Prototype models were given to the wives of factory executives, and the product was named when one stated “I don’t care what you call it, but I know it’s the best kitchen aid I’ve ever had!” They were initially marketed to the farmhouse kitchen and were available in hardware stores.[3] But owing to the difficulty in convincing retailers to take up the product, the company recruited a mostly female sales force, which sold the mixers door-to-door.[1] The C-10 machine was also marketed heavily toward soda fountains and small commercial kitchens, and was also sold under the FountainAid and BakersAid model names.[4]

In 1922, KitchenAid introduced the H-5 mixer as its new home-use offering.[5] The H-5 mixer was smaller and lighter than the C-10, and had a more manageable five-quart bowl. The model “G” mixer, about half the weight of the “H-5”, was released in August 1928.[6] In the 1920s, several other companies introduced similar mixers, and the Sunbeam Mixmaster became the most popular among consumers until the 1950s.[7]

KitchenAid mixers remained popular, and in the late 1930s, the factory would completely sell out its products each Christmas. The factory was closed for the duration of World War II. After the war, production started up again in 1946 when the factory moved to Greenville, Ohio, to expand capacity.[1]

Model “K”, which introduced the trademarked KitchenAid silhouette

The product range expanded beyond stand mixers for the first time in 1949, when dishwashers were introduced.[3]

In 1985, the company purchased the Chambers Company to incorporate its range of cookers into the KitchenAid brand.[1] In January 1986, a Federal appeals court cleared Whirlpool Corporation to purchase KitchenAid, dismissing initial complaints regarding competition from dishwasher manufacturers White Consolidated Industries and Magic Chef.[8] Refrigerators were added to the product line later in 1986.[1] The company used the popularity of celebrity chefs during the late 1980s to seize the chance to expand its customer range. In 1988, retailer Williams-Sonoma was opening new stores across the United States and released a cobalt blue stand mixer for the company. Although the retailer had been carrying KitchenAid products since 1959, the new stores introduced the mixers to a wider range of home cooks. This, combined with a change in marketing strategy for KitchenAid, resulted in a doubling of brand awareness over the course of the following three years.

KitchenAid began manufacturing blenders and other small appliances in the mid-1990s. The brand was further promoted by sponsoring the PBS show Home Cooking, and by introducing the mixers to television chefs such as Julia Child and Martha Stewart. Following the success with Williams-Sonoma, specific points of purchase were set up in department stores such as Kohl’s and Macy’s. Special-color mixers were released for specific retailers or to benefit charities, such as a pink mixer released to raise funds for breast cancer research, or mixers sold at Target stores in that company’s signature shade of red. The ProLine range of appliances was launched in 2003 with an initial six-month exclusivity agreement with Williams-Sonoma.[9]

Design and manufacturing[edit]

Advertisement for a KitchenAid mixer

KitchenAid stand mixers at Australian department store MYER

Egmont Arens was hired in the 1930s to design a low-cost series of mixers. This resulted in the production of the KitchenAid Model “K” which showed streamlined lines for the first time, and the KitchenAid standard design has remained relatively unchanged since then.[10] The silhouette has since been made a registered trademark with the U.S. Patent and Trademark Office.[10] In 1997 the San Francisco Museum of Modern Art selected the KitchenAid stand mixer as an icon of American design. There is an attachment hub on the front of each mixer. Every KitchenAid mixer since the introduction of the Model “K” has allowed for cross-generational attachment compatibility, meaning that attachments from the 1930s can be used on modern mixers, and vice versa. Note that this cross-generational compatibility extends only to attachments powered through the hub. Other accessories (beaters, bowls, etc.) are not necessarily compatible even across similar models in production at the same time (for example, not all current production six-quart bowl-lift mixers use the same accessories).[11] Initially the mixers were only available in white; a range of four colors was introduced in 1955.[10]

Today, some KitchenAid products are manufactured in Ohio, South Carolina, Mississippi, Indiana, Arkansas, Ontario, and Quebec while others are manufactured in China,[citation needed] and its appliances are distributed throughout North America.[12] All KitchenAid stand mixers are assembled in its factory in Greenville, Ohio. The die-cast parts of the machines come from various manufacturing plants around the world and are hand worked to remove imperfections on the metal cases. A factory tour, known as the “KitchenAid Experience” is conducted by the assembly line workers.[13]

from Wikipedia

Microwave oven

A microwave oven (commonly referred to as a microwave) is a kitchen appliance that heats and cooks food by exposing it to microwave radiation in the electromagnetic spectrum. This induces polar molecules in the food to rotate and produce thermal energy in a process known as dielectric heating. Microwave ovens heat foods quickly and efficiently because excitation is fairly uniform in the outer 25–38 mm (1–1.5 inches) of a homogeneous, high water content food item; food is more evenly heated throughout (except in heterogeneous, dense objects) than generally occurs in other cooking techniques.

Percy Spencer is generally credited with inventing the modern microwave oven after World War II from radar technology developed during the war. Named the “Radarange”, it was first sold in 1946. Raytheon later licensed its patents for a home-use microwave oven that was first introduced by Tappan in 1955, but these units were still too large and expensive for general home use. The countertop microwave oven was first introduced in 1967 by the Amana Corporation, and their use has spread into commercial and residential kitchens around the world.

Microwave ovens are popular for reheating previously cooked foods and cooking a variety of foods. They are also useful for rapid heating of otherwise slowly prepared cooking items, such as hot butter, fats, and chocolate. Unlike conventional ovens, microwave ovens usually do not directly brown or caramelize food, since they rarely attain the necessary temperatures to produce Maillard reactions. Exceptions occur in rare cases where the oven is used to heat frying-oil and other very oily items (such as bacon), which attain far higher temperatures than that of boiling water. Microwave ovens have a limited role in professional cooking,[1] because the boiling-range temperatures produced in especially hydrous foods impede flavors produced by the higher temperatures of frying, browning, or baking. However, additional heat sources can be added to microwave ovens, or into combination microwave ovens, to produce these other heating effects, and microwave heating may cut the overall time needed to prepare such dishes. Some modern microwave ovens are part of over-the-range units with built-in extractor hoods.


Early developments

The exploitation of high-frequency radio waves for heating substances was made possible by the development of vacuum tube radio transmitters around 1920. By 1930 the application of short waves to heat human tissue had developed into the medical therapy of diathermy. At the 1933 Chicago World’s Fair, Westinghouse demonstrated the cooking of foods between two metal plates attached to a 10 kW, 60 MHz shortwave transmitter.[2] The Westinghouse team, led by I. F. Mouromtseff, found that foods like steaks and potatoes could be cooked in minutes.

The 1937 United States patent application by Bell Laboratories states:[3]

“This invention relates to heating systems for dielectric materials and the object of the invention is to heat such materials uniformly and substantially simultaneously throughout their mass. … It has been proposed therefore to heat such materials simultaneously throughout their mass by means of the dielectric loss produced in them when they are subjected to a high voltage, high frequency field.”

However, lower-frequency dielectric heating, as described in the aforementioned patent, is (like induction heating) an electromagnetic heating effect, the result of the so-called near-field effects that exist in an electromagnetic cavity that is small compared with the wavelength of the electromagnetic field. This patent proposed radio frequency heating, at 10 to 20 megahertz (wavelength 15 to 30 meters).[4] Heating from microwaves that have a wavelength that is small relative to the cavity (as in a modern microwave oven) is due to “far-field” effects that are due to classical electromagnetic radiation that describes freely propagating light and microwaves suitably far from their source. Nevertheless, the primary heating effect of all types of electromagnetic fields at both radio and microwave frequencies occurs via the dielectric heating effect, as polarized molecules are affected by a rapidly alternating electric field.

Cavity magnetron

Microwave ovens, several from the 1980s

The invention of the cavity magnetron made possible the production of electromagnetic waves of a small enough wavelength (microwaves). The magnetron was originally a crucial component in the development of short-wavelength radar during World War II.[5] In 1937–1940, a multi-cavity magnetron was built by the British physicist Sir John Turton Randall, FRSE, together with a team of British coworkers, for the British and American military radar installations in World War II. A higher-powered microwave generator that worked at shorter wavelengths was needed, and in 1940, at the University of Birmingham, John Randall and Harry Boot produced a working prototype.[6]

Sir Henry Tizard travelled to the U.S. in late September 1940 to offer the magnetron in exchange for their financial and industrial help (see Tizard Mission). An early 6 kW version, built in England by the General Electric Company Research Laboratories, Wembley, London, was given to the U.S. government in September 1940. Contracts were awarded to Raytheon and other companies for mass production of the magnetron.


In 1945, the specific heating effect of a high-power microwave beam was accidentally discovered by Percy Spencer, an American self-taught engineer from Howland, Maine. Employed by Raytheon at the time, he noticed that microwaves from an active radar set he was working on started to melt a candy bar he had in his pocket. The first food deliberately cooked with Spencer’s microwave was popcorn, and the second was an egg, which exploded in the face of one of the experimenters.[7][8] To verify his finding, Spencer created a high density electromagnetic field by feeding microwave power from a magnetron into a metal box from which it had no way to escape. When food was placed in the box with the microwave energy, the temperature of the food rose rapidly.

On 8 October 1945,[9] Raytheon filed a United States patent application for Spencer’s microwave cooking process, and an oven that heated food using microwave energy from a magnetron was soon placed in a Boston restaurant for testing. The first time the public was able to use a microwave oven was in January 1947, when the Speedy Weeny vending machine was placed in Grand Central Terminal to dispense “sizzling delicious” hot dogs. Among those on the development team was robotics pioneer George Devol, who had spent the last part of the war developing radar countermeasures.

Commercial availability

Raytheon RadaRange aboard the NS Savannah nuclear-powered cargo ship, installed circa 1961

In 1947, Raytheon built the “Radarange”, the first commercially available microwave oven.[10] It was almost 1.8 metres (5 ft 11 in) tall, weighed 340 kilograms (750 lb) and cost about US$5,000 ($54,000 in 2017 dollars) each. It consumed 3 kilowatts, about three times as much as today’s microwave ovens, and was water-cooled. An early Radarange was installed (and remains) in the galley of the nuclear-powered passenger/cargo ship NS Savannah. An early commercial model introduced in 1954 consumed 1.6 kilowatts and sold for US$2,000 to US$3,000 ($18,000 to $27,000 in 2017 dollars). Raytheon licensed its technology to the Tappan Stove company of Mansfield, Ohio in 1952.[11] They tried to market a large 220 volt wall unit as a home microwave oven in 1955 for a price of US$1,295 ($12,000 in 2017 dollars), but it did not sell well. In 1965, Raytheon acquired Amana. In 1967, they introduced the first popular home model, the countertop Radarange, at a price of US$495 ($4,000 in 2017 dollars).

In the 1960s,[specify] Litton bought Studebaker‘s Franklin Manufacturing assets, which had been manufacturing magnetrons and building and selling microwave ovens similar to the Radarange. Litton then developed a new configuration of the microwave: the short, wide shape that is now common. The magnetron feed was also unique. This resulted in an oven that could survive a no-load condition: an empty microwave oven where there is nothing to absorb the microwaves. The new oven was shown at a trade show in Chicago,[citation needed] and helped begin a rapid growth of the market for home microwave ovens. Sales volume of 40,000 units for the U.S. industry in 1970 grew to one million by 1975. Market penetration was faster in Japan, due to a re-engineered magnetron allowing for less expensive units. Several other companies joined in the market, and for a time most systems were built by defense contractors, who were most familiar with the magnetron. Litton was particularly well known in the restaurant business.

Residential use

1971 Radar Range RR-4

By the late 1970s, technological advances led to rapidly falling prices. Often called “electronic ovens” in the 1960s, they later came to be known as “microwave ovens”, and are now informally called “microwaves”.

Formerly found only in large industrial applications, microwave ovens increasingly became a standard fixture of residential kitchens in developed countries. By 1986, roughly 25% of households in the U.S. owned a microwave oven, up from only about 1% in 1971;[12] the U.S. Bureau of Labor Statistics reported that over 90% of American households owned a microwave oven in 1997.[12][13] In Australia, a 2008 market research study found that 95% of kitchens contained a microwave oven and that 83% of them were used daily.[14] In Canada, fewer than 5% of households had a microwave oven in 1979, but more than 88% of households owned one by 1998.[15] In France, 40% of households owned a microwave oven in 1994, but that number had increased to 65% by 2004.[16]

Adoption has been slower in less-developed countries, as households with disposable income concentrate on more important household appliances like refrigerators and ovens. In India, for example, only about 5% of households owned a microwave in 2013, well behind refrigerators at 31% ownership.[17] However, microwave ovens are gaining popularity. In Russia, for example, the number of households with a microwave grew from almost 24% in 2002 to almost 40% in 2008.[18] Almost twice as many households in South Africa owned microwaves in 2008 (38.7%) than in 2002 (19.8%).[18] Microwave ownership in Vietnam was at 16% of households in 2008—versus 30% ownership of refrigerators—but this rate was up significantly from 6.7% microwave ownership in 2002—and 14% for refrigerators.[18]


For more details on this topic, see dielectric heating.

A microwave oven, c.2005.

Simulation of the electric field inside a microwave oven for the first 8 ns of operation.

A microwave oven heats food by passing microwave radiation through it. Microwaves are a form of non-ionizing electromagnetic radiation with a frequency higher than ordinary radio waves but lower than infrared light. Microwave ovens use frequencies in one of the ISM (industrial, scientific, medical) bands, which are reserved for this use, so they do not interfere with other vital radio services. Consumer ovens usually use 2.45 gigahertz (GHz)—a wavelength of 12.2 centimetres (4.80 in)—while large industrial/commercial ovens often use 915 megahertz (MHz)—32.8 centimetres (12.9 in).[19] Water, fat, and other substances in the food absorb energy from the microwaves in a process called dielectric heating. Many molecules (such as those of water) are electric dipoles, meaning that they have a partial positive charge at one end and a partial negative charge at the other, and therefore rotate as they try to align themselves with the alternating electric field of the microwaves. Rotating molecules hit other molecules and put them into motion, thus dispersing energy. This energy, when dispersed as molecular vibration in solids and liquids (i.e. as both potential energy and kinetic energy of atoms), is heat. Sometimes, microwave heating is explained as a resonance of water molecules, but this is incorrect;[20] such resonances occur only above 1 terahertz (THz).[21] Rather, the heating arises from the lag in the response of the polar water molecule to the applied electromagnetic wave. This type of dielectric loss mechanism is referred to as dipole interaction.[citation needed]
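The wavelengths quoted above follow directly from the frequencies via λ = c/f; a quick check:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_cm(freq_hz: float) -> float:
    """Free-space wavelength in centimetres for a given frequency."""
    return C / freq_hz * 100

print(round(wavelength_cm(2.45e9), 1))  # consumer ovens (2.45 GHz): 12.2 cm
print(round(wavelength_cm(915e6), 1))   # industrial ovens (915 MHz): 32.8 cm
```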

Microwave heating is more efficient on liquid water than on frozen water, where the movement of molecules is more restricted. Dielectric heating of liquid water is also temperature-dependent: At 0 °C, dielectric loss is greatest at a field frequency of about 10 GHz, and for higher water temperatures at higher field frequencies.[22]

Compared to liquid water, microwave heating is less efficient on fats and sugars (which have a smaller molecular dipole moment).[23] Sugars and triglycerides (fats and oils) absorb microwaves due to the dipole moments of their hydroxyl groups or ester groups. However, due to the lower specific heat capacity of fats and oils and their higher vaporization temperature, they often attain much higher temperatures inside microwave ovens.[22] This can induce temperatures in oil or very fatty foods like bacon far above the boiling point of water, and high enough to induce some browning reactions, much in the manner of conventional broiling (UK: grilling), braising, or deep-fat frying. Foods high in water content and with little oil rarely exceed the boiling temperature of water.

Microwave heating can cause localized thermal runaways in some materials with low thermal conductivity which also have dielectric constants that increase with temperature. An example is glass, which can exhibit thermal runaway in a microwave to the point of melting if preheated. Additionally, microwaves can melt certain types of rocks, producing small quantities of synthetic lava.[citation needed] Some ceramics can also be melted, and may even become clear upon cooling. Thermal runaway is more typical of electrically conductive liquids such as salty water.

A common misconception is that microwave ovens cook food “from the inside out”, meaning from the center of the entire mass of food outwards. This idea arises from heating behavior seen if an absorbent layer of water lies beneath a less absorbent drier layer at the surface of a food; in this case, the deposition of heat energy inside a food can exceed that on its surface. This can also occur if the inner layer has a lower heat capacity than the outer layer, causing it to reach a higher temperature, or if the inner layer is more thermally conductive than the outer layer, making it feel hotter despite having a lower temperature. In most cases, however, with a uniformly structured or reasonably homogeneous food item, microwaves are absorbed in the outer layers of the item at a similar level to that of the inner layers. Depending on water content, the depth of initial heat deposition may be several centimetres or more with microwave ovens, in contrast to broiling/grilling (infrared) or convection heating—methods which deposit heat thinly at the food surface. Penetration depth of microwaves depends on food composition and frequency, with lower microwave frequencies (longer wavelengths) penetrating further.
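The "several centimetres" of penetration can be estimated with the standard low-loss approximation d_p ≈ (λ₀/2π)·√ε′/ε″; the permittivity values below (ε′ ≈ 78, ε″ ≈ 12 for liquid water at room temperature and 2.45 GHz) are illustrative assumptions, not figures from this article:

```python
import math

# Low-loss approximation for the power penetration depth of a dielectric:
#   d_p ~= (lambda0 / (2*pi)) * sqrt(eps_real) / eps_loss
# The permittivity values used are rough illustrative figures for
# liquid water at 2.45 GHz and room temperature.
C = 299_792_458  # speed of light in m/s

def penetration_depth_cm(freq_hz: float, eps_real: float, eps_loss: float) -> float:
    lam0 = C / freq_hz
    return lam0 / (2 * math.pi) * math.sqrt(eps_real) / eps_loss * 100

print(round(penetration_depth_cm(2.45e9, 78.0, 12.0), 1))  # ~1.4 cm for water
```

Drier or fattier foods have a much smaller loss factor ε″, which is why their penetration depth can reach several centimetres.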

Heating efficiency[edit]

A microwave oven converts only part of its electrical input into microwave energy. An average consumer microwave oven consumes 1100 W of electricity in producing 700 W of microwave power[citation needed], an efficiency of 64%. The other 400 W are dissipated as heat, mostly in the magnetron tube. Such wasted heat, along with heat from the product being microwaved, is exhausted as warm air through cooling vents. Additional power is used to operate the lamps, AC power transformer, magnetron cooling fan, food turntable motor and the control circuits, although the power consumed by the electronic control circuits of a modern microwave oven is negligible (< 1% of the input power) during cooking.
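The arithmetic behind the quoted 64% figure is a simple power budget; a minimal check of the numbers given above:

```python
# Power budget quoted above: 1100 W of electrical input, 700 W of microwave output.
input_w = 1100
microwave_w = 700

efficiency = microwave_w / input_w      # fraction converted to microwave energy
waste_heat_w = input_w - microwave_w    # dissipated as heat, mostly in the magnetron

print(f"{efficiency:.0%}")  # 64%
print(waste_heat_w)         # 400
```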

For cooking or reheating small amounts of food, the microwave oven may use less energy than a cook stove. Although microwave ovens are touted as the most efficient appliance,[24][not in citation given] the energy savings are largely due to the reduced heat mass of the food’s container.[25] The amount of energy used to heat food is generally small compared to total energy usage in typical residences in the United States.[26]


Modern microwave ovens use either an analog dial-type timer or a digital control panel for operation. Control panels feature an LED, liquid crystal or vacuum fluorescent display; numeric buttons for entering the cook time; a power level selector; and other possible functions such as a defrost setting and pre-programmed settings for different food types, such as meat, fish, poultry, vegetables, frozen vegetables, frozen dinners, and popcorn. In most ovens, the magnetron is driven by a linear transformer which can only feasibly be switched completely on or off. As such, the choice of power level does not affect the intensity of the microwave radiation; instead, the magnetron is cycled on and off every few seconds, thus altering the large-scale duty cycle. Newer models have inverter power supplies that use pulse-width modulation to provide effectively continuous heating at reduced power, so that foods are heated more evenly at a given power level and can be heated more quickly without being damaged by uneven heating.

A microwave oven consists of:

  • a high voltage power source, commonly a simple transformer or an electronic power converter, which passes energy to the magnetron
  • a high voltage capacitor connected to the magnetron, transformer and via a diode to the chassis
  • a cavity magnetron, which converts high-voltage electric energy to microwave radiation
  • a magnetron control circuit (usually with a microcontroller)
  • a short waveguide (to couple microwave power from the magnetron into the cooking chamber)
  • a metal cooking chamber
  • a turntable or a metal waveguide stirring fan
  • a digital / manual control panel
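The duty-cycle behaviour of a transformer-driven magnetron described above can be sketched as a simple average-power calculation (the 700 W output figure is illustrative, echoing the efficiency example quoted earlier):

```python
FULL_POWER_W = 700  # illustrative rated microwave output

def average_power(duty_cycle: float) -> float:
    """A transformer-driven magnetron is either fully on or fully off;
    the selected 'power level' only sets the fraction of time it is on."""
    return FULL_POWER_W * duty_cycle

# The 50% setting cycles 700 W on and off in equal intervals,
# delivering 350 W on average; an inverter supply would instead
# run continuously at the reduced power.
print(average_power(0.5))  # 350.0
```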

The microwave frequencies used in microwave ovens are chosen based on regulatory and cost constraints. The first is that they should be in one of the industrial, scientific, and medical (ISM) frequency bands set aside for non-communication purposes. For household purposes, 2.45 GHz has the advantage over 915 MHz in that 915 MHz is only an ISM band in the ITU Region 2 while 2.45 GHz is available worldwide.[citation needed][vague] Three additional ISM bands exist in the microwave frequencies, but are not used for microwave cooking. Two of them are centered on 5.8 GHz and 24.125 GHz, but are not used for microwave cooking because of the very high cost of power generation at these frequencies. The third, centered on 433.92 MHz, is a narrow band that would require expensive equipment to generate sufficient power without creating interference outside the band, and is only available in some countries.

The cooking chamber is similar to a Faraday cage to prevent the waves from coming out of the oven. Even though there is no continuous metal-to-metal contact around the rim of the door, choke connections on the door edges act like metal-to-metal contact, at the frequency of the microwaves, to prevent leakage. The oven door usually has a window for easy viewing, with a layer of conductive mesh some distance from the outer panel to maintain the shielding. Because the size of the perforations in the mesh is much less than the microwaves’ wavelength (12.2 cm for the usual 2.45 GHz), most of the microwave radiation cannot pass through the door, while visible light (with its much shorter wavelength) can.
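A rough scale comparison shows why the mesh blocks microwaves but passes light; the ~1 mm hole size below is an illustrative assumption, not a figure from this article:

```python
microwave_wavelength_m = 0.122  # 2.45 GHz, from the text
visible_light_m = 550e-9        # mid-visible (green) light
mesh_hole_m = 1e-3              # illustrative perforation size

# Microwaves see holes ~100x smaller than their wavelength: reflected.
print(round(microwave_wavelength_m / mesh_hole_m))  # 122
# Visible light sees holes ~1800x larger than its wavelength: transmitted.
print(round(mesh_hole_m / visible_light_m))         # 1818
```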

Variants and accessories[edit]

A microwave oven with convection feature

A variant of the conventional microwave is the convection microwave. A convection microwave oven is a combination of a standard microwave and a convection oven. It allows food to be cooked quickly, yet come out browned or crisped, as from a convection oven. Convection microwaves are more expensive than conventional microwave ovens. Some convection microwaves—those with exposed heating elements—can produce smoke and burning odors as food spatter from earlier microwave-only use is burned off the heating elements.

In 2000,[27] some manufacturers began adding high-power quartz halogen bulbs to their convection microwave models, marketing them under names such as “Speedcook”, “Advantium”, “Lightwave” and “Optimawave” to emphasize their ability to cook food rapidly and with good browning. The bulbs heat the food’s surface with infrared (IR) radiation, browning surfaces as in a conventional oven. The food browns while also being heated by the microwave radiation and heated through conduction through contact with heated air. The IR energy delivered to the outer surface of food by the lamps is sufficient to initiate browning reactions: caramelization in foods primarily made up of carbohydrates, and Maillard reactions in foods primarily made up of protein. These reactions produce a texture and taste similar to that typically expected of conventional oven cooking rather than the bland boiled and steamed taste that microwave-only cooking tends to create.

In order to aid browning, sometimes an accessory browning tray is used, usually composed of glass or porcelain. It makes food crisp by oxidizing the top layer until it turns brown. Ordinary plastic cookware is unsuitable for this purpose because it could melt.

Frozen dinners, pies, and microwave popcorn bags often contain a susceptor made from thin aluminium film in the packaging or included on a small paper tray. The metal film absorbs microwave energy efficiently and consequently becomes extremely hot and radiates in the infrared, concentrating the heating of oil for popcorn or even browning surfaces of frozen foods. Heating packages or trays containing susceptors are designed for single use and are discarded as waste.

Microwave-safe plastics[edit]

Some current plastic containers and food wraps are specifically designed to resist radiation from microwaves. Products may use the term “microwave safe”, may carry a microwave symbol (three lines of waves, one above the other) or simply provide instructions for proper microwave use. Any of these is an indication that a product is suitable for microwaving when used in accordance with the directions provided.[28]

Benefits and safety features[edit]

All microwave ovens use a timer; at the end of the set cooking time, the oven switches itself off.

Microwave ovens heat food without getting hot themselves. Taking a pot off a stove, unless it is an induction cooktop, leaves a potentially dangerous heating element or trivet that will stay hot for some time. Likewise, when taking a casserole out of a conventional oven, one’s arms are exposed to the very hot walls of the oven. A microwave oven does not pose this problem.

Food and cookware taken out of a microwave oven are rarely much hotter than 100 °C (212 °F). Cookware used in a microwave oven is often much cooler than the food because the cookware is transparent to microwaves; the microwaves heat the food directly and the cookware is indirectly heated by the food. Food and cookware from a conventional oven, on the other hand, are the same temperature as the rest of the oven; a typical cooking temperature is 180 °C (356 °F). That means that conventional stoves and ovens can cause more serious burns.

The lower temperature of cooking (the boiling point of water) is a significant safety benefit compared to baking in the oven or frying, because it eliminates the formation of tars and char, which are carcinogenic.[29] Microwave radiation also penetrates deeper than direct heat, so that the food is heated by its own internal water content. In contrast, direct heat can burn the surface while the inside is still cold. Pre-heating the food in a microwave oven before putting it into the grill or pan reduces the time needed to heat up the food and reduces the formation of carcinogenic char. Unlike frying and baking, microwaving does not produce acrylamide in potatoes;[30] however, unlike deep-frying, it is of only limited effectiveness in reducing glycoalkaloid (i.e. solanine) levels.[31] Acrylamide has been found in other microwaved products like popcorn.

Heating characteristics[edit]

Microwave ovens are frequently used for reheating leftover food, and bacterial contamination may not be repressed if the safe temperature is not reached, resulting in foodborne illness, as with all inadequate reheating methods.

Uneven heating in microwaved food can be partly due to the uneven distribution of microwave energy inside the oven, and partly due to the different rates of energy absorption in different parts of the food. The first problem is reduced by a stirrer, a type of fan that reflects microwave energy to different parts of the oven as it rotates, or by a turntable or carousel that turns the food; turntables, however, may still leave spots, such as the center of the oven, which receive uneven energy distribution. The location of dead spots and hot spots in a microwave can be mapped out by placing a damp piece of thermal paper in the oven. When the water-saturated paper is subjected to the microwave radiation, it becomes hot enough to release the dye, providing a visual representation of the microwave field. If multiple layers of paper are stacked in the oven with sufficient distance between them, a three-dimensional map can be created. Many store receipts are printed on thermal paper, which allows this to be easily done at home.[32]
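Hot spots mapped this way correspond to the antinodes of standing waves in the cavity, which are spaced half a wavelength apart; a quick estimate at the usual oven frequency:

```python
C = 299_792_458  # speed of light in m/s

def hot_spot_spacing_cm(freq_hz: float) -> float:
    """Antinodes of a standing wave are separated by half a wavelength."""
    return C / freq_hz / 2 * 100

# At 2.45 GHz, hot spots sit roughly 6.1 cm apart:
print(round(hot_spot_spacing_cm(2.45e9), 1))  # 6.1
```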

The second problem is due to food composition and geometry, and must be addressed by the cook, by arranging the food so that it absorbs energy evenly, and periodically testing and shielding any parts of the food that overheat. In some materials with low thermal conductivity, where dielectric constant increases with temperature, microwave heating can cause localized thermal runaway. Under certain conditions, glass can exhibit thermal runaway in a microwave to the point of melting.[33]

Due to this phenomenon, microwave ovens set at too-high power levels may even start to cook the edges of frozen food while the inside of the food remains frozen. Another case of uneven heating can be observed in baked goods containing berries. In these items, the berries absorb more energy than the drier surrounding bread and cannot dissipate the heat due to the low thermal conductivity of the bread. Often this results in overheating the berries relative to the rest of the food. “Defrost” oven settings use low power levels designed to allow time for heat to be conducted within frozen foods from areas that absorb heat more readily to those which heat more slowly. In turntable-equipped ovens, more even heating will take place by placing food off-centre on the turntable tray instead of exactly in the centre.

Microwave heating can be deliberately uneven by design. Some microwavable packages (notably pies) may include materials that contain ceramic or aluminium flakes, which are designed to absorb microwaves and heat up, which aids in baking or crust preparation by depositing more energy shallowly in these areas. Such ceramic patches affixed to cardboard are positioned next to the food, and are typically smoky blue or gray in colour, usually making them easily identifiable; the cardboard sleeves included with Hot Pockets, which have a silver surface on the inside, are a good example of such packaging. Microwavable cardboard packaging may also contain overhead ceramic patches which function in the same way. The technical term for such a microwave-absorbing patch is a susceptor.[34]

Effects on food and nutrients[edit]

Raisins when overcooked in a microwave produce considerable smoke.

Comparative cooking method studies generally find that, if properly used, microwave cooking does not affect the nutrient content of foods to a larger extent than conventional heating, and that there is a tendency towards greater retention of many micronutrients with microwaving, probably due to the reduced preparation time.[35] Microwaving human milk at high temperatures is contraindicated, due to a marked decrease in activity of anti-infective factors.[36]

Any form of cooking will destroy some nutrients in food, but the key variables are how much water is used in the cooking, how long the food is cooked, and at what temperature.[37] Nutrients are primarily lost by leaching into cooking water, which tends to make microwave cooking healthier, given the shorter cooking times it requires.[38] Like other heating methods, microwaving converts vitamin B12 from an active to inactive form; the amount of inactivation depends on the temperature reached, as well as the cooking time. Boiled food reaches a maximum of 100 °C (212 °F) (the boiling point of water), whereas microwaved food can get locally hotter than this, leading to faster breakdown of vitamin B12. The higher rate of loss is partially offset by the shorter cooking times required.[39] A single study indicated that microwaving broccoli loses 74% or more of phenolic compounds (97% of flavonoids), while boiling loses 66% of flavonoids, and high-pressure boiling loses 47%,[40] though the study has been contradicted by other studies.[41] To minimize phenolic losses in potatoes, microwaving should be done at 500 W.[42]

Spinach retains nearly all its folate when cooked in a microwave; in comparison, it loses about 77% when boiled, as nutrients leach into the cooking water. Bacon cooked by microwave has significantly lower levels of carcinogenic nitrosamines than conventionally cooked bacon.[37] Steamed vegetables tend to maintain more nutrients when microwaved than when cooked on a stovetop.[37] Microwave blanching is 3–4 times more effective than boiling-water blanching at retaining the water-soluble vitamins folic acid, thiamin and riboflavin, with the exception of ascorbic acid, of which 28.8% is lost (vs. 16% with boiling-water blanching).[43]

Use in cleaning kitchen sponges[edit]

Studies have investigated the use of the microwave to clean non-metallic domestic sponges which have been thoroughly wetted. A 2006 study found that microwaving wet sponges for two minutes (at 1000 watt power) removed 99% of coliforms, E. coli and MS2 phages. Bacillus cereus spores were killed at 4 minutes of microwaving.[44]


High temperatures[edit]

Homogeneous liquids can superheat[45][46] when heated in a microwave oven in a container with a smooth surface. That is, the liquid reaches a temperature slightly above its normal boiling point without bubbles of vapour forming inside the liquid. The boiling process can start explosively when the liquid is disturbed, such as when the user takes hold of the container to remove it from the oven or while adding solid ingredients such as powdered creamer or sugar. This can result in spontaneous boiling (nucleation) which may be violent enough to eject the boiling liquid from the container and cause severe scalding.[47]

Closed containers, such as eggs, can explode when heated in a microwave oven due to the increased pressure from steam. Insulating plastic foams of all types generally contain closed air pockets, and are generally not recommended for use in a microwave, as the air pockets explode and the foam (which can be toxic if consumed) may melt. Not all plastics are microwave-safe, and some plastics absorb microwaves to the point that they may become dangerously hot.

Products that are heated for too long can catch fire. Though this is inherent to any form of cooking, the rapid cooking and unattended nature of the use of microwave ovens results in additional hazard.

Metal objects[edit]

Any metal or conductive object placed into the microwave will act as an antenna to some degree, resulting in an electric current. This causes the object to act as a heating element. This effect varies with the object’s shape and composition, and is sometimes utilized for cooking.

Any object containing pointed metal can create an electric arc (sparks) when microwaved. This includes cutlery, crumpled aluminium foil (though some foil used in microwaves is safe, see below), twist-ties containing metal wire, the metal wire carry-handles in paper Chinese take-out food containers, or almost any metal formed into a poorly conductive foil, thin wire, or pointed shape.[48] Forks are a good example: the tines of the fork respond to the electric field by producing high concentrations of electric charge at the tips. This has the effect of exceeding the dielectric breakdown of air, about 3 megavolts per meter (3×106 V/m). The air forms a conductive plasma, which is visible as a spark. The plasma and the tines may then form a conductive loop, which may be a more effective antenna, resulting in a longer-lived spark. When dielectric breakdown occurs in air, some ozone and nitrogen oxides are formed, both of which are unhealthy in large quantities.
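The quoted ~3 MV/m breakdown field makes it easy to estimate the voltage needed across a small gap for an arc; the 0.5 mm gap below is a hypothetical example, not a figure from this article:

```python
BREAKDOWN_V_PER_M = 3e6  # dielectric breakdown of air, from the text

def spark_voltage(gap_m: float) -> float:
    """Voltage across a gap at which air breaks down (uniform-field idealization)."""
    return BREAKDOWN_V_PER_M * gap_m

# A hypothetical 0.5 mm gap between fork tines arcs at roughly 1.5 kV,
# well within reach of the field concentration at sharp metal points.
print(spark_voltage(0.5e-3))  # 1500.0
```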

A microwave oven with a metal shelf

It is possible for metal objects to be microwave-oven compatible, although experimentation by users is not encouraged. Microwaving an individual smooth metal object without pointed ends, for example, a spoon or shallow metal pan, usually does not produce sparking. Thick metal wire racks can be part of the interior design in microwave ovens (see illustration). In a similar way, the interior wall plates with perforating holes which allow light and air into the oven, and allow interior-viewing through the oven door, are all made of conductive metal formed in a safe shape.

A microwaved DVD-R disc showing the effects of electrical discharge through its metal film

The effect of microwaving thin metal films can be seen clearly on a Compact Disc or DVD (particularly the factory pressed type). The microwaves induce electric currents in the metal film, which heats up, melting the plastic in the disc and leaving a visible pattern of concentric and radial scars. Similarly, porcelain with thin metal films can also be destroyed or damaged by microwaving. Aluminium foil is thick enough to be used in microwave ovens as a shield against heating parts of food items, if the foil is not badly warped. When wrinkled, aluminium foil is generally unsafe in microwaves, as manipulation of the foil causes sharp bends and gaps that invite sparking. The USDA recommends that aluminium foil used as a partial food shield in microwave cooking cover no more than one quarter of a food object, and be carefully smoothed to eliminate sparking hazards.[49]

Another hazard is the resonance of the magnetron tube itself. If the microwave is run without an object to absorb the radiation, a standing wave will form. The energy is reflected back and forth between the tube and the cooking chamber. This may cause the tube to overload and burn out. For the same reason, dehydrated food, or food wrapped in metal which does not arc, is problematic for overload reasons, without necessarily being a fire hazard.

Certain foods such as grapes, if properly arranged, can produce an electric arc.[50] Prolonged arcing from food carries similar risks to arcing from other sources as noted above.

Some other objects that may spark are thermoses with plastic/holographic print (such as Starbucks novelty cups) or cups with metal lining. If any bit of the metal is exposed, the outer shell may burst off the object or melt.[citation needed]

The high electric fields generated inside a microwave oven can often be demonstrated by placing a radiometer or neon glow-bulb inside the cooking chamber, creating glowing plasma inside the low-pressure bulb of the device.

Direct microwave exposure[edit]

Direct microwave exposure is not generally possible, as microwaves emitted by the source in a microwave oven are confined in the oven by the material out of which the oven is constructed. Furthermore, ovens are equipped with redundant safety interlocks, which remove power from the magnetron if the door is opened. This safety mechanism is required by United States federal regulations.[51] Tests have shown confinement of the microwaves in commercially available ovens to be so nearly universal as to make routine testing unnecessary.[52] According to the United States Food and Drug Administration‘s Center for Devices and Radiological Health, a U.S. Federal Standard limits the amount of microwaves that can leak from an oven throughout its lifetime to 5 milliwatts of microwave radiation per square centimeter at approximately 5 cm (2 in) from the surface of the oven.[53] This is far below the exposure level currently considered to be harmful to human health.[54]

The radiation produced by a microwave oven is non-ionizing. It therefore does not have the cancer risks associated with ionizing radiation such as X-rays and high-energy particles. Long-term rodent studies to assess cancer risk have so far failed to identify any carcinogenicity from 2.45 GHz microwave radiation even with chronic exposure levels (i.e. large fraction of life span) far larger than humans are likely to encounter from any leaking ovens.[55][56] However, with the oven door open, the radiation may cause damage by heating. Every microwave oven sold has a protective interlock so that it cannot be run when the door is open or improperly latched.

Microwaves generated in microwave ovens cease to exist once the electrical power is turned off. They do not remain in the food when the power is turned off, any more than light from an electric lamp remains in the walls and furnishings of a room when the lamp is turned off. They do not make the food or the oven radioactive. There is some evidence that nutritional content of some foods may be altered differently by cooking in a microwave oven, compared to conventional cooking, but there is no indication of detrimental health issues associated with microwaved food.[57]

There are, however, a few cases where people have been exposed to direct microwave radiation, either from appliance malfunction or deliberate action.[58][59] The general effect of this exposure will be physical burns to the body, as human tissue, particularly the outer fat and muscle layers, has similar composition to some foods that are typically cooked in microwave ovens and so experiences similar dielectric heating effects when exposed to microwave electromagnetic radiation.

Chemical exposure[edit]

Some magnetrons have ceramic insulators with beryllium oxide (beryllia) added. The beryllium in such oxides is a serious chemical hazard if crushed and ingested (for example, by inhaling dust). In addition, beryllia is listed as a confirmed human carcinogen by the IARC[citation needed]; therefore, broken ceramic insulators or magnetrons should not be handled. This is obviously a danger only if the microwave oven becomes physically damaged, such as if the insulator cracks, or when the magnetron is opened and handled directly, and as such should not be a concern during normal usage.

from wikipedia

Home appliances

Home appliances are electrical/mechanical machines which accomplish some household functions,[1] such as cooking or cleaning. Home appliances can be classified into three categories: major appliances (“white goods”), small appliances, and consumer electronics (“brown goods”).

This division is also noticeable in the maintenance and repair of these kinds of products. Brown goods usually require high technical knowledge and skills (which get more complex with time, such as going from a soldering iron to a hot-air soldering station), while white goods may need more practical skills and “brute force” to manipulate the devices and heavy tools required to repair them.


Given its broad usage, the domestic application attached to “home appliance” is tied to the definition of appliance as “an instrument or device designed for a particular use or function”.[4] More specifically, Collins dictionary defines “home appliance” as “devices or machines, usually electrical, that are in your home and which you use to do jobs such as cleaning or cooking”.[5] This broad usage allows nearly any device intended for domestic use to count as a home appliance, from consumer electronics, stoves,[6] refrigerators, toasters[6] and air conditioners to light bulbs and water well pumps.[7][8]


Early 20th century electric toaster

While many appliances have existed for centuries, the self-contained electric or gas powered appliances are a uniquely American innovation that emerged in the twentieth century. The development of these appliances is tied to the disappearance of full-time domestic servants and the desire to reduce time-consuming household activities in pursuit of more recreational time. In the early 1900s, electric and gas appliances included washing machines, water heaters, refrigerators and sewing machines. The invention of Earl Richardson’s small electric clothes iron in 1903 gave a small initial boost to the home appliance industry. In the Post–World War II economic expansion, the domestic use of dishwashers and clothes dryers was part of a shift toward convenience. Increasing discretionary income was reflected by a rise in miscellaneous home appliances.[9][10]

In America during the 1980s, the industry shipped $1.5 billion worth of goods each year and employed over 14,000 workers, with revenues doubling between 1982 and 1990 to $3.3 billion. Throughout this period companies merged and acquired one another to reduce research and production costs and eliminate competitors, resulting in anti-trust legislation.

The United States Department of Energy reviews compliance with the National Appliance Energy Conservation Act of 1987, which required manufacturers to reduce the energy consumption of the appliances by 25% every five years.[9]
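Taking the stated rule at face value, a 25% reduction every five years compounds geometrically; a minimal illustration of the arithmetic:

```python
def remaining_fraction(periods: int, cut: float = 0.25) -> float:
    """Fraction of original energy use left after repeated percentage cuts."""
    return (1 - cut) ** periods

# After two five-year cycles, an appliance meeting the stated target
# would use 56.25% of its original energy:
print(round(remaining_fraction(2) * 100, 2))  # 56.25
```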

In the 1990s, the appliance industry was very consolidated, with over 90% of the products being sold by just five companies. For example, in 1991, the dishwasher manufacturing market was split between General Electric with 40%, Whirlpool with 31%, Electrolux with 20%, Maytag with 7% and Thermador with just 2%.[9]

Major appliances[edit]

Swedish washing machine, 1950s

Main article: Major appliance

Major appliances, also known as white goods, comprise major household appliances and may include: air conditioners,[11] dishwashers,[11] clothes dryers, drying cabinets, freezers, refrigerators,[11] kitchen stoves, water heaters,[11] washing machines,[11] trash compactors, microwave ovens[6] and induction cookers. White goods were typically painted or enameled white, and many of them still are.[12]

Small appliances[edit]

Main article: Small appliance

Small appliances are typically small household electrical machines, easily carried and installed. Some are classified with white goods and relate to heating and cooling, such as fans[6] and window-mounted air conditioners, and heaters such as space heaters, ceramic heaters, gas heaters, kerosene heaters, and fan heaters. Another category is used in the kitchen, including juicers, electric mixers,[13] meat grinders, coffee grinders, deep fryers,[13] herb grinders, food processors,[13][14] electric kettles, waffle irons, coffee makers, blenders[14] and dough blenders, rice cookers,[6] toasters and exhaust hoods.

Entertainment and information appliances such as: home electronics,[11] TV sets,[6] CD, VCRs and DVD players,[6] camcorders, still cameras, clocks, alarm clocks, video game consoles, HiFi and home cinema, telephones and answering machines are classified as “brown goods”. Some such appliances were traditionally finished with genuine or imitation wood. This has become rare but the name has stuck, even for goods that are unlikely ever to have had a wooden case (e.g. camcorders).



A dishwasher is a mechanical device for cleaning dishware and cutlery. Unlike manual dishwashing, which relies largely on physical scrubbing to remove soiling, the mechanical dishwasher cleans by spraying hot water, typically between 45 and 75 °C (110 and 170 °F), at the dishes, with lower temperatures used for delicate items.[1]
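The Fahrenheit range given is a rounded conversion of the Celsius one; the exact conversion F = C × 9/5 + 32 gives:

```python
def c_to_f(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(45))  # 113.0 (the article rounds the range down to 110 °F)
print(c_to_f(75))  # 167.0 (rounded up to 170 °F)
```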

A mix of water and detergent is pumped to one or more rotating spray arms, which blast the dishes with the cleaning mixture. Once the wash is finished, the water is drained, more hot water is pumped in and a rinse cycle begins. After the rinse cycle finishes and the water is drained, the dishes are dried using one of several drying methods. Typically a rinse aid is used to eliminate water spots for streak-free dishes and glassware resulting from hard water or other reasons.[2]

In addition to domestic units, industrial dishwashers are available for use in commercial establishments such as hotels and restaurants, where a large number of dishes must be cleaned. Washing is conducted at temperatures of 65–71 °C (149–160 °F), and sanitation is achieved either by a booster heater that provides an 82 °C (180 °F) “final rinse” temperature or through the use of a chemical sanitizer.


A hand-powered dishwasher and an early electric dishwasher both from about 1917.

The first reports of a mechanical dishwashing device are of an 1850 patent in the United States by Joel Houghton for a hand-powered device made of wood, which was cranked by hand while water sprayed onto the dishes. The device was both slow and unreliable. Another patent was granted to L.A. Alexander in 1865 that was similar to the first but featured a hand-cranked rack system. Neither device was practical or widely accepted.

The first reliable hand-powered dishwasher was invented in 1887 by Josephine Cochrane with the help of George Butters and was unveiled at the 1893 World’s Fair in Chicago, Illinois. Cochrane’s inspiration was her frustration at the damage to her good china that occurred when her servants handled it during cleaning.[3]

Advertisement in an 1896 issue of McClure’s for The Faultless Quaker Dishwasher.

Europe’s first domestic dishwasher with an electric motor was invented and manufactured by Miele in 1929.[4][5]

In the United Kingdom, William Howard Livens invented a small, non-electric dishwasher suitable for domestic use in 1924. It was the first dishwasher that incorporated most of the design elements that are featured in the models of today;[6] it included a front door for loading, a wire rack to hold the dirty crockery and a rotating sprayer. Drying elements were even added to his design in 1940. It was the first machine suitable for domestic use, and it came at a time when permanent plumbing and running water in the house was becoming increasingly common.[7][8]

Despite this, Livens’s design did not become a commercial success, and dishwashers were only successfully sold as domestic utilities in the postwar boom of the 1950s, and then only to the wealthy. Initially dishwashers were sold as standalone or portable devices, but with the development of the wall-to-wall countertop and standardized-height cabinets, dishwashers began to be marketed in standardized sizes and shapes, integrated beneath the kitchen countertop as a modular unit alongside other kitchen appliances.

By the 1970s dishwashers had become commonplace in domestic residences in North America and Western Europe. By 2012, over 75 percent of homes in the US and Germany had dishwashers.[9]


Size and capacity[edit]

North American counter-top dishwasher

Dishwashers that are installed into standard kitchen cabinets have a standard width and depth of 60 cm (Europe) or 24 inches (US), and most dishwashers must be installed into a hole a minimum of 86 cm (Europe) or 34 inches (US) tall. Portable dishwashers exist in 45 and 60 cm (Europe) or 18 and 24 inch (US) widths, with casters and attached countertops. Dishwashers may come in standard or tall tub designs; standard tub dishwashers have a service kickplate beneath the dishwasher door that allows for simpler maintenance and installation, but tall tub dishwashers have approximately 20% more capacity and better sound dampening from having a continuous front door.

The international standard for the capacity of a dishwasher is expressed in standard place settings. Commercial dishwashers are rated in plates per hour, based on plates of a standard size; commercial glasswashers are rated similarly, based on standard glasses, normally pint glasses.


Present-day machines feature a drop-down front panel door, allowing access to the interior, which usually contains two or sometimes three pull-out racks; racks can also be referred to as “baskets”. In older U.S. models from the 1950s, the entire tub rolled out when the machine latch was opened, and loading/removing washable items was from the top, with the user reaching deep into the compartment for some items. Youngstown Kitchens, which manufactured entire kitchen cabinets and sinks, offered a tub-style dishwasher, which was coupled to a conventional kitchen sink as one unit.

Today, “dish drawer” models mimic this style, while the half-depth design eliminates the inconvenience of the long reach that was necessary with older full-depth models. “Cutlery baskets” are also common. A drawer dishwasher, first introduced by Fisher & Paykel in 1997, is a variant of the dishwasher in which the baskets slide out with the door in the same manner as a drawer filing cabinet, with each drawer in a double-drawer model being able to operate independently of the other.

The inside of a dishwasher in the North American market is either stainless steel or plastic. Stainless steel tubs resist hard water, provide better sound damping, and preserve heat to dry dishes more quickly, but they come at a premium price. Older models used baked enamel on steel, which is prone to chipping and erosion; chips in a baked enamel finish must be cleaned of all dirt and corrosion and then patched with a special compound or a good-quality two-part epoxy. All European-made dishwashers feature a stainless steel interior as standard, even on low-end models, and the same is true of a built-in water softener.

Washing elements[edit]


European dishwashers almost universally use two or three spray arms, fed from the bottom and back wall of the dishwasher, leaving both racks unimpeded. Such models also tend to use inline water heaters, removing the need for exposed elements in the base of the machine that can melt nearby plastic items. Many North American dishwashers use more basic, older water-distribution designs with exposed elements in the base. Some North American machines use a large cone or similar structure in the bottom dish rack to prevent placement of dishes in the center of the rack; the dishwasher directs water from the bottom of the machine up through this structure to the upper wash arm, which sprays the top rack. Some dishwashers, including many models from Whirlpool and KitchenAid, instead use a tube attached to the top rack that connects to a water source at the back of the dishwasher, allowing full use of the bottom rack. Late-model Frigidaire dishwashers shoot a jet of water from the top of the washer down into the upper wash arm, again allowing full use of the bottom rack (but requiring that a small funnel on the top rack be kept clear).



Clear model of a running dishwasher

Mid-to-higher-end North American dishwashers often come with hard food disposal units, which behave like miniature garbage (waste) disposal units and eliminate large pieces of food waste from the wash water. One manufacturer known for omitting hard food disposals is Bosch, a German brand, which does so to reduce noise. If larger items of food waste are removed before the dishes are loaded, pre-rinsing is unnecessary even without an integrated waste disposal unit.

Many new dishwashers feature microprocessor-controlled, sensor-assisted wash cycles that adjust the wash duration to the quantity of dirty dishes (sensed by changes in water temperature) or the amount of dirt in the rinse water (sensed chemically or optically). This can save water and energy if the user runs a partial load. In such dishwashers the electromechanical rotary switch often used to control the washing cycle is replaced by a microprocessor, but most sensors and valves are still required. However, pressure switches (some dishwashers use a pressure switch and flow meter) are not required in most microprocessor-controlled dishwashers: these use the motor, and sometimes a rotational position sensor, to sense the resistance of the water, and when no cavitation is detected the machine knows it has the optimal amount of water. A bimetal switch or wax motor opens the detergent door during the wash cycle.
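The cavitation-based fill logic described above can be sketched as a simple control loop. This is an illustrative simulation only, not any manufacturer's firmware; the valve and sensor classes are hypothetical stand-ins for real hardware:

```python
import time

class FakeValve:
    """Hypothetical stand-in for the water inlet valve."""
    def __init__(self): self.is_open = False
    def open(self): self.is_open = True
    def close(self): self.is_open = False

class FakeCavitationSensor:
    """Simulates a circulation pump that stops cavitating
    once about three seconds' worth of water has been admitted."""
    def __init__(self): self.seconds_filled = 0
    def detects_cavitation(self):
        self.seconds_filled += 1
        return self.seconds_filled < 4

def fill_to_optimal_level(valve, sensor, max_seconds=120):
    """Admit water in short steps until the pump stops cavitating,
    i.e. it is moving a solid column of water, then shut the valve."""
    valve.open()
    for _ in range(max_seconds):
        if not sensor.detects_cavitation():
            break                # pump primed: enough water admitted
        time.sleep(0.01)         # stand-in for a one-second fill step
    valve.close()
    return sensor.seconds_filled

valve = FakeValve()
filled = fill_to_optimal_level(valve, FakeCavitationSensor())
print(filled, valve.is_open)     # -> 4 False
```

The `max_seconds` cap mirrors the timeout a real controller would use so a stuck sensor cannot flood the machine.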

Some dishwashers include a child-lockout feature to prevent accidental starting or stopping of the wash cycle by children. A child lock may also prevent young children from opening the door during a wash cycle, avoiding accidents with hot water and the strong detergents used during washing.



Drying[edit]

The heat inside the dishwasher dries the contents after the final hot rinse; the final rinse adds a small amount of rinse aid to the hot water, as this improves drying significantly. Plastic and non-stick items may not dry properly compared to china and glass, which hold the heat better. Some dishwashers incorporate a fan to improve drying. Older dishwashers with a visible heating element (at the bottom of the wash cabinet, below the bottom basket) may use the heating element to improve drying; however, this uses more energy.

North American dishwashers tend to use heat-assisted drying via an exposed element, while European machines and some high-end North American machines use passive methods: a stainless steel interior helps this process, and some models use heat exchange between the inner and outer skins of the machine to cool the interior walls and speed up drying. Most dishwashers feature a drying sensor, and a dish-washing cycle is considered complete when a drying indicator (usually an illuminated “end” light, or on more modern models a digital display or audible sound) signals to the operator that the washing and drying cycle is over.

Governmental agencies often recommend air-drying dishes by either disabling or stopping the drying cycle to save energy.[10]

Cleaning agents[edit]


A detergent tablet

Different kinds of dishwashing detergent contain different combinations of ingredients. Common ingredients include alkaline salts, water-softening builders such as complex phosphates, silicates (which also act as corrosion inhibitors), enzymes, and surfactants.

Dishwashing detergent may also contain[citation needed]:

  • Anti-foaming agents
    • Foam interferes with the washing action.
  • Additives to slow down the removal of glaze and patterns from glazed ceramics
  • Perfumes
  • Anti-caking agents (in granular detergent)
  • Starches (in tablet-based detergents)
  • Gelling agents (in liquid/gel-based detergents)
  • Sand (in inexpensive powdered detergents)

Dishwasher detergents are strongly alkaline (basic).

Inexpensive powders may contain sand, which can be verified by dissolving the powder in boiling water and then passing the solution through a coffee filter. Such detergents may harm the dishes and the dishwasher. Powdered detergents are more likely to cause fading on china patterns.[12]

Biodegradable detergents also exist for dishwashers; these may be more environmentally friendly than conventional detergents.

Hand-washing dish detergent (washing-up liquid) should not be used: it creates large amounts of foam, which will leak from the dishwasher.

Rinse aid[edit]

Rinse aid (sometimes called rinse agent) contains surfactants and uses Marangoni stress to prevent droplet formation, so that water drains from the surfaces in thin sheets, rather than forming droplets.[citation needed]

The benefits of using it are that it prevents “spotting” on glassware (caused by droplets of water drying and leaving behind dissolved limescale minerals), and can also improve drying performance as there is less water remaining to be dried. A thinner sheet of water also has a much larger surface-area than a droplet of the same volume, which increases the likelihood of water molecules evaporating.[13]
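The surface-area argument can be checked with a rough calculation. The figures below are illustrative assumptions (a 0.05 mL droplet and a 0.05 mm-thick sheet), not measurements from the cited source:

```python
import math

# Compare the exposed surface of the same volume of water left as a
# hemispherical droplet versus spread into a thin sheet by a rinse aid.
volume_mm3 = 50.0                                # 0.05 mL = 50 mm^3 (assumed)

# Hemispherical droplet: V = (2/3)*pi*r^3, exposed (curved) area = 2*pi*r^2
r = (3 * volume_mm3 / (2 * math.pi)) ** (1 / 3)
droplet_area = 2 * math.pi * r ** 2

# Thin sheet 0.05 mm thick: exposed top area = volume / thickness
sheet_area = volume_mm3 / 0.05

print(round(droplet_area), round(sheet_area))    # -> 52 1000
```

Even with these rough numbers the sheet exposes roughly twenty times the surface of the droplet, which is why sheeting water evaporates so much faster.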

Dishwasher salt[edit]

Main article: Dishwasher salt

In some countries, especially in Europe, dishwashers include a built-in water softener that removes calcium and magnesium ions from the water. Dishwasher salt, which is coarse-grained sodium chloride (table salt), is used to recharge the resin in the built-in ion-exchange system. The coarse grains prevent it from clogging the softener unit; unlike certain types of salt used for culinary purposes, it does not contain added insoluble anticaking agents or magnesium salts. Magnesium salts would defeat the softener’s purpose of removing magnesium from the water, and anticaking agents may lead to clogging or may themselves contain magnesium. Table salt may contain added iodine in the form of sodium iodide or potassium iodide; these compounds do not affect the ion-exchange system, but the other additives mean that adding table salt to the dishwasher’s water-softening unit can damage it.

If a dishwasher has a built-in water softener, there will be a special compartment inside the dishwasher where the salt is added when needed. This salt compartment is separate from the detergent compartment and is generally located at the bottom of the wash cabinet, below the bottom basket. On most dishwashers, an automatic sensing system notifies the user when more dishwasher salt is required.

If the dishwasher has run out of the salt that recharges the ion exchange resin that softens the water, and the water supply is “hard”, limescale deposits can appear on all items, but are especially visible on glassware.

Effects on crockery[edit]


Glassware washed by dishwashing machines can develop a white haze on the surface over time. This may be caused by any or all of the processes below, only one of which is reversible:

Silicate filming, etching, and accelerated crack corrosion
This film starts as an iridescence or “oil-film” effect on glassware, and progresses into a “milky” or “cloudy” appearance (which is not a deposit) that cannot be polished off or removed like limescale. It is formed because the detergent is strongly alkaline (basic) and glass dissolves slowly in alkaline aqueous solution. It becomes less soluble in the presence of silicates in the water (added as anti-metal-corrosion agents in the dishwasher detergent). Since the cloudy appearance is due to nonuniform glass dissolution, it is (somewhat paradoxically) less marked if dissolution is higher, i.e. if a silicate-free detergent is used; also, in certain cases, the etching will primarily be seen in areas that have microscopic surface cracks as a result of the items’ manufacturing.[14][15] Limitation of this undesirable reaction is possible by controlling water hardness, detergent load and temperature. The type of glass is an important factor in determining if this effect is a problem. Some dishwashers can reduce this etching effect by automatically dispensing the correct amount of detergent throughout the wash cycle based on the level of water hardness programmed.
Components found in dishwasher detergents can chemically scour the glass, causing tiny crystals, which can precipitate further crystal growth that can turn entire glasses cloudy.[16]

Unsuitable items[edit]

Lead crystal should not be cleaned in a dishwasher, as dishwasher detergent is highly corrosive to this type of glass and will quickly make it ‘cloudy’. In addition, the lead in crystal glass can be converted into a soluble form, which could endanger the health of subsequent users.[17] Some items can be damaged if washed in a dishwasher because of the effects of the chemicals and hot water. Aluminium items will discolour. Saucepan manufacturers often recommend hand-washing because of the harsh effects of the chemicals on pan coatings. Valuable items, such as antiques or hand-painted items, should only be washed manually, as they may be dulled or damaged and detergents will gradually fade the glazing and print. Sterling silver and pewter will oxidize and discolour from the heat; furthermore, pewter has a low melting point and may warp in some dishwashers.

Items soiled by wax, cigarette ash or anything that might contaminate the rest of the wash load (such as poisons or mineral oils) should be washed by hand. Objects contaminated by solvents may explode in a dishwasher. Glued items, such as some cutlery handles or wooden cutting boards, may be melted or softened if put in a dishwasher, especially on a hot wash cycle, when temperatures can reach 75 °C (167 °F); these high temperatures can also damage plastic items designated as hand-wash only. Some plastic items can be distorted or melted if placed in the bottom rack too close to an exposed heating element, so most dishwasher-safe plastic items are recommended for the top rack only (many newer dishwashers have a concealed heating element away from the bottom rack entirely). Squeezing plastic items into small spaces may cause the plastic to distort.

Dishwashers should only be used to wash normal household items, such as plates, cutlery, cups, mugs, kitchenware etc. Items such as paintbrushes, tools, furnace filters etc. should not be put into a dishwasher as this will cause the subsequent washes to become contaminated and may cause damage to the appliance.

Knives and other cooking tools made of carbon steel, semi-stainless steels such as D2, or specialized, highly hardened cutlery steels such as ZDP189 should also not be placed in a dishwasher, as these steels are either not corrosion resistant or far less corrosion resistant than the austenitic stainless steels used for cookware. Very sharp edges can also be dulled or damaged by collisions with other items or by thermal stress from the washing cycles, and can pose an injury hazard to a person unloading the dishwasher who does not expect such items to be there. Ceramic edges are very brittle and can be damaged by collision with dishwasher parts or other contents.

Cast iron cookware is normally seasoned with oil or grease and heat, which causes the oil or grease to be absorbed into the pores of the cookware, thereby giving a smooth relatively non-stick cooking surface. Such cookware should not be washed in a dishwasher as the combination of alkali based detergent and hot water will strip off this cooking surface, requiring reseasoning before the item may once again be used.


Energy use[edit]

In the European Union, the energy consumption of a dishwasher for a standard usage is shown on a European Union energy label. In the United States, the energy consumption of a dishwasher is defined using the energy factor.

Most consumer dishwashers use a 75 °C (167 °F) thermostat in the sanitizing process. During the final rinse cycle, the heating element and wash pump are turned on, and the cycle timer (electronic or electromechanical) is stopped until the thermostat is tripped. At this point, the cycle timer resumes and will generally trigger a drain cycle within a few timer increments.

Most consumer dishwashers use 75 °C (167 °F) rather than 83 °C (181 °F) because of the risk of burns, energy and water consumption, total cycle time, and possible damage to plastic items placed inside the dishwasher. With new advances in detergents, lower water temperatures (50–55 °C / 122–131 °F) are needed to prevent premature decay of the enzymes that break down grease and other deposits on the dishes.

In the US, residential dishwashers can be certified to a NSF International testing protocol which confirms the cleaning and sanitation performance of the unit.[18]

A 2009 study showed that the microwave and the dishwasher were both effective ways to clean domestic sponges.[19]


Commercial use[edit]

A commercial dishwasher

A Hobart commercial dishwasher

Large heavy-duty dishwashers are available for use in commercial establishments (e.g. hotels, restaurants) where a large number of dishes must be cleaned.

Unlike a residential dishwasher, a commercial dishwasher does not use a drying cycle (drying is achieved by the heated ware meeting open air once the wash/rinse/sanitation cycles have been completed) and is thus significantly faster than its residential counterpart. Washing is conducted at 65–71 °C / 150–160 °F, and sanitation is achieved either by a booster heater that provides the machine with an 82 °C / 180 °F “final rinse” temperature or by a chemical sanitizer. This distinction labels the machines as either “high-temp” or “low-temp”.

Some commercial dishwashers work similarly to a commercial car wash, with a pulley system that pulls the rack through a small chamber (widely known as “rack conveyor” systems). Single-rack washers require an operator to push the rack into the washer, close the doors, start the cycle, and then open the doors to pull out the cleaned rack, possibly through a second opening into an unloading area.

In the UK, the British Standards Institution sets standards for dishwashers. In the US, NSF International (an independent not-for-profit organization) sets the standards for wash and rinse times along with minimum water temperatures for chemical or hot-water sanitizing methods.[20] There are many types of commercial dishwashers, including undercounter, single-tank, conveyor, flight-type, and carousel machines.

Commercial dishwashers often have significantly different plumbing and operations than a home unit, in that there are often separate spray arms for washing and rinsing/sanitizing. The wash water is heated with an in-tank electric heat element and mixed with a cleaning solution, and is used repeatedly from one load to the next. The wash tank usually has a large strainer basket to collect food debris, and the strainer may not be emptied until the end of the day’s kitchen operations.

Water used for rinsing and sanitizing is generally delivered directly through building water supply, and is not reusable. The used rinse water falls into the wash tank reservoir, which dilutes some of the used wash water and causes a small amount to drain out through an overflow tube. The system may first rinse with pure water only, and then sanitize with an additive solution that is left on the dishes as they leave the washer to dry.

Additional soap is periodically added to the main wash water tank, from either large soap concentrate tanks or dissolved from a large solid soap block, to maintain wash water cleaning effectiveness.

Environmental impact[edit]

Comparing the efficiency of automatic dishwashers and hand-washing of dishes is difficult because hand-washing techniques vary drastically by individual. According to a peer-reviewed study in 2003, hand washing and drying of an amount of dishes equivalent to a fully loaded automatic dishwasher (no cookware or bakeware) could use between 20–300 litres (5.3–79.3 US gal) of water and between 0.1 and 8 kWh of energy, while the numbers for energy-efficient automatic dishwashers were 15–22 litres (4.0–5.8 US gal) and 1 to 2 kWh, respectively. The study concluded that fully loaded dishwashers use less energy, water, and detergent than the average European hand-washer.[21][22] For the automatic dishwasher results, the dishes were not rinsed before being loaded. The study does not address costs associated with the manufacture and disposal of dishwashers, the cost of possible accelerated wear of dishes from the chemical harshness of dishwasher detergent, the comparison for cleaning cookware, or the value of labour saved; hand washers needed between 65 and 106 minutes. Several points of criticism of this study have been raised.[23] For example, kilowatt-hours of electricity were compared against energy used for heating hot water without taking possible inefficiencies into account. Also, inefficient human washers were compared against optimal use of a fully loaded dishwasher, with no allowance for manual pre-rinsing, which can itself use up to 100 litres (26 US gal) of water.[24]
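The study's figures can be reduced to a rough per-place-setting comparison. The load size of 12 place settings and the use of range midpoints are assumptions made here for illustration, not values taken from the study itself:

```python
# Rough per-setting comparison from the ranges quoted above.
# Assumptions: a full load of 12 place settings; midpoints of each range.
settings = 12

hand_water_l    = (20 + 300) / 2     # 160 L, midpoint of hand-wash range
hand_energy_kwh = (0.1 + 8) / 2      # ~4.05 kWh midpoint
dw_water_l      = (15 + 22) / 2      # 18.5 L, efficient machines
dw_energy_kwh   = (1 + 2) / 2        # 1.5 kWh midpoint

print(round(hand_water_l / settings, 1),
      round(dw_water_l / settings, 1))   # -> 13.3 1.5  (litres per setting)
```

On these assumptions, hand washing uses nearly ten times more water per place setting, consistent with the study's conclusion for fully loaded machines.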

Most dishwasher detergents contain complex phosphates, as they have several properties that aid effective cleaning. However, the same chemicals have been removed from laundry detergents in many countries because of concerns about the increase in algal blooms in waterways caused by rising phosphate levels (see eutrophication). Seventeen US states have partial or full bans on the use of phosphates in dish detergent,[25] and two US states, Maryland and New York, ban phosphates in commercial dishwashing. Detergent companies claimed it was not cost-effective to make separate batches of detergent for the states with phosphate bans (although detergents are typically formulated for local markets), and so most have voluntarily removed phosphates from all dishwasher detergents.[26]

In addition, rinse aids have contained nonylphenol and nonylphenol ethoxylates. These have been banned in the European Union by EU Directive 76/769/EEC.

From Wikipedia, the free encyclopedia