Energy is expensive now, and its cost keeps going up. It seems logical that anything done to save energy will also save money. And since insulation reduces energy loss, the general assumption today seems to be: the more insulation in a structure, the more energy, and money, saved. But is this assumption always correct? Or does the law of diminishing returns take effect at some point, so that the cost of additional insulation exceeds the cost of the energy it saves? What is the optimum amount of insulation?

Researchers at the Portland Cement Association have been using a computer simulation to try to answer this question. They studied the heating and cooling costs an apartment building would incur if built in each of various locations with varying R values of insulation. They concluded that the practical, cost-effective R value for concrete walls enclosing residential space is much lower than that proposed by most codes and standards: some insulation is advisable, but high R values cannot be justified.

Given annual heating and cooling loads, the cost of fuel, inflation rates, the cost of money, and the cost of insulation, the study showed that insulating the concrete walls of an apartment building to an R value of less than 8 is most cost effective in most locations.

But this is only part of the story. Energy is used for cooling as well as heating, and adding insulation to the walls of the building studied raised the cooling load: with the added insulation, internally generated and solar heat gains cannot escape as easily when outdoor temperatures drop. As a result, raising the R value lowers the combined heating and cooling load very little.
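The diminishing-returns argument comes down to simple arithmetic: steady-state conductive heat loss through a wall is inversely proportional to its R value, so each doubling of R saves only half as much energy as the previous doubling. The sketch below illustrates this with hypothetical wall-area and climate figures; it is not the method or data of the PCA study.

```python
# Toy model of diminishing returns on wall insulation.
# All figures are hypothetical, for illustration only.

def annual_heat_loss(r_value, area_ft2=1000.0, degree_hours=100000.0):
    """Conductive loss through a wall, in BTU/yr: Q = A * degree_hours / R."""
    return area_ft2 * degree_hours / r_value

# Each step doubles the R value, but saves less energy than the last.
for r in (2, 4, 8, 16, 32):
    print(f"R-{r:<2}  loss = {annual_heat_loss(r):>12,.0f} BTU/yr")
```

Because the savings halve with each doubling of R while the cost of added insulation grows roughly in proportion to its thickness, there is some R value beyond which the extra material never pays for itself, which is the crossover the study set out to locate.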