
Why Johnny can't work: the causes of unemployment.

Whenever high unemployment persists for several years, it is because the costs of labor are rising more rapidly than its productivity. This was true in the 1930s, when the labor policies of Herbert Hoover and Franklin Roosevelt helped turn a temporary downturn into a decade-long Depression. It was true from 1979 to 1982, when the rapid slashing of inflation led to a sharp increase in the real wages of those who remained working. And it is true in 1992, as the minimum wage and unemployment compensation policies of President Bush and the Democratic majority in Congress have priced millions of Americans out of the job market.

Labor can become too expensive for employers in three ways. Most obviously, money wages (including fringe benefits) and/or payroll taxes can increase, without compensating improvements in productivity. Second, prices might fall, increasing the real purchasing power of wages but lowering the monetary value of each worker's output. Third, the productivity of labor may fall, raising unit labor costs as each worker produces less.

Healthy growth in real wages is compatible with full employment--if labor productivity rises sufficiently. Indeed, our nation's economic history is one of generally rising real wages and employment levels accompanied by rising labor productivity. The interludes of high unemployment have coincided with rises in the "adjusted real wage"--the costs of labor minus its productivity.
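The "adjusted real wage" is the concept the rest of the argument turns on, so a toy calculation may help. The sketch below is a hypothetical illustration, not the authors' actual series: it treats the adjusted real wage as the real wage relative to productivity (equivalently, real labor cost per unit of output), using invented index numbers.

```python
# Hypothetical illustration of the "adjusted real wage": labor cost
# relative to labor productivity. The index values are invented for
# illustration and are NOT the authors' actual data series.

def adjusted_real_wage(money_wage, price_level, productivity):
    """Real wage per unit of output: (money wage / prices) / productivity."""
    real_wage = money_wage / price_level
    return real_wage / productivity

# Stylized early-Depression pattern: money wages fall a little, prices
# fall a lot, productivity is flat -- so labor becomes MORE expensive
# per unit of output even though money wages declined.
base = adjusted_real_wage(money_wage=100, price_level=100, productivity=100)
later = adjusted_real_wage(money_wage=92, price_level=80, productivity=100)

print(round(later / base, 2))  # 1.15 -> labor ~15% costlier per unit of output
```

On this arithmetic, unemployment pressure comes not from the level of money wages but from the relation of real wages to productivity, which is the article's recurring theme.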

The Depression That Ended Quickly

Consider, for example, the difference between the 1920-1922 and 1929-1941 depressions. By most measures, the 1920-1922 downturn was initially a more severe contraction than the 1929 one. Industrial production fell by over 30 percent five quarters after the downturn began in 1920, compared with 28 percent over the first five quarters of the Great Depression. Twelve quarters after the 1920-1922 downturn began, however, industrial production was healthily above the 1920 high. In the Great Depression, by contrast, industrial production 12 quarters into the downturn had declined more than 50 percent. In the Great Depression, unemployment rose continuously for four years, and recovery to normal levels took another nine years.

Why? In the 1920-1922 downturn, the federal government did not interfere with the working of market forces. The sharp inflation associated with World War I turned abruptly to deflation in early 1920. Prices fell sharply, faster than money wages, pushing real wages up. Some labor was priced out of the market, and unemployment rose sharply. But by late 1921, wages were tumbling faster than prices, and the decline in real wages prompted a rise in employment and an end to the recession.

The failure of the government to act in 1920 partly reflected the classical faith in the self-correcting market mechanism. It also reflected the lack of strong direction in Washington, since President Wilson was seriously ill. His innate activism was tempered by his poor health. The new president in March 1921, Warren G. Harding, was opposed to governmental involvement, but in any case market forces started relieving the problem shortly after he took office.

Hoover's Biggest Mistake

In 1929, by contrast, America was led by a president, Herbert Hoover, who believed that high wages regardless of productivity were the key to prosperity, since high wages increased worker purchasing power and sales. Hoover was not alone in advocating an "enlightened" approach to business prosperity, a proto-Keynesian view that stressed the importance of high wages to maintain purchasing power. The nation's most prominent industrialist, the legendary Henry Ford, constantly preached the need for higher wages. Famous retailer Edward Filene and General Electric president Gerald Swope made similar arguments.

When the nation moved into a downturn in the fall of 1929 after the stock market crash, Hoover behaved in an activist fashion, calling business leaders to meetings at the White House. On November 21 he "jawboned" the DuPonts, Mellons, and other members of America's industrial elite on the importance of keeping wages high. Ford was so impressed that he announced he was going to raise wages, saying "Wages must not come down, they must not even stay on their present level; they must go up." Most business leaders and other Americans accepted the high wage policy; not until well into the Depression was there much negative comment about its efficacy. Even as late as 1931, U.S. Steel president James A. Farrell proclaimed that "Those who advocate wage reductions have not stopped to weigh the implications."

A second administration-supported effort, designed in part to protect domestic wages, was the Smoot-Hawley tariff, which took effect in 1930. Trade restrictions violated a basic principle of economics--the efficiency associated with the international division of labor. Higher tariffs also reduced international labor competition, leading to higher wages for American workers. We estimate that this wage effect of higher tariffs caused perhaps 20 percent of the rise in unemployment observed between 1929 and 1933.

Hoover's high wage lobbying worked. Wages fell, but they remained 8 percent higher than they would have if they had followed the pattern of recessions earlier in the century. Prices actually fell more than wages, so real wages rose significantly in 1930 and early 1931. Increasing numbers of workers were priced out of labor markets, and unemployment rose steadily from 3 percent in 1929 to a still-moderate level of 7 percent in mid-1930 to 28 percent in March 1933.

The high wage policy meant firms were paying far more than normal in labor costs, leading to a flood of red ink. By mid-1930, a majority of firms were not earning enough to cover dividend obligations. The ravaging of business balance sheets set the stage for the next phase of the downturn, converting a recession into a depression. As profits fell, firms found it increasingly difficult to meet their debt obligations to banks. Real wages rose even more with the banking crisis that began with the flight of deposits in fall 1930. The failure of banks lowered the money supply and the level of prices, and raised real wages further in 1931 and into 1932. Higher real wages, in turn, aggravated the rising unemployment. The failure of the Federal Reserve to stem the ensuing fiasco by serving as a "lender of last resort" to faltering banks added to the public policy failures of the era, as did a major "rob the rich" increase in the federal income tax in 1932.

The Depression, then, resulted not from a failure of markets to adjust to changing demand and supply conditions, but from Hoover's activism in maintaining high wages, accompanied by other public policy failures. The contrast to 1920-1922 is vivid: with a do-nothing government, a severe recession was over in slightly over two years; with an activist government, the country endured nearly four years of ever-greater unemployment.

FDR Compounds Hoover's Errors

The mistakes of Herbert Hoover were compounded by his successor Franklin Roosevelt, who was inaugurated in March 1933. At first, FDR's reassuring rhetoric and the 1933 bank holiday gave an important short-term psychological boost to the economy. Unemployment fell by five points (from 28.3 to 23.3 percent) between March and July 1933; it appeared that recovery was finally getting under way. Yet the promising early recovery stalled, and unemployment in mid-1935 was only slightly lower than in the summer of 1933.

The reason for this reversal is that New Deal policies intended to raise wages offset the positive gains from rising depositor confidence, labor productivity, and prices. Using a broad-based measure of wages, we estimate that real (inflation-adjusted) wages rose over 2 percent a year from 1932 to 1940--a larger increase than the long-term average increase in American history resulting from productivity growth. Blue-collar wage increases were significantly larger. Despite high unemployment, wages were rising at historically high rates.

The National Industrial Recovery Act, passed in June 1933, effectively dictated a 40-cent-an-hour minimum wage for most firms at a time when the average wage in the private sector was about 45 cents. Following a Hoover-like strategy of trying to boost purchasing power, the NIRA caused factory wages to rise by over 20 percent in barely six months, putting a big damper on the employment growth that had begun in March 1933.

In 1935, the Supreme Court found the NIRA unconstitutional, ending a major upward bias in wages. The benefits of this development were eventually offset by the National Labor Relations (Wagner) Act of 1935, but for nearly two years that legislation was inoperative pending a review of its constitutionality. With the shackles of the 1933 minimum wage legislation removed, and the pro-union Wagner Act in abeyance, unemployment fell steadily. By the spring of 1937 unemployment had fallen to 13 percent, down from 21 percent in late 1935. At last, it seemed the Depression was ending; in Europe it was already over in most countries.

In April 1937 the Supreme Court declared the Wagner Act constitutional, unleashing a flood of organizational activity in major industries. Many steel and auto companies capitulated almost instantly to trade unions. Amid double digit unemployment, average wages rose 11.6 percent in 1937, with larger increases in industries where unions signed labor contracts. Not coincidentally, the economy stopped growing, and even went into another downturn, sending the unemployment rate to above 20 percent by the spring of 1938.

The major New Deal wage shocks had occurred by late 1938, and the market successfully began digesting them. The new wage-enhancing Fair Labor Standards Act, another potential wage shock, mercifully was being phased in over several years. Unemployment fell steadily as wage increases moderated and were overcome by productivity advances and some increase in prices. Conventional wisdom states that World War II ended the Great Depression, but most of the reduction in unemployment between 1938 and 1944 occurred before Pearl Harbor.

The New Deal left other legislative legacies that greatly affected wages in the longer run. The Social Security Act forced employers to finance a payroll tax that added to labor costs, and the Fair Labor Standards Act established a federal minimum wage law. The unemployment compensation system raised the reservation (minimally acceptable) wage of the unemployed, imparting an upward bias to wages.

Without increased unionization, and the new unemployment compensation and Social Security systems, we estimate that the 1937 unemployment rate would have been about 8 percent instead of the 14.3 percent recorded; by 1940, the unemployment rate would have fallen to about 6 percent (instead of 14.6 percent).

The Postwar Downturn That Never Came

World War II seemed to prove the validity of demand-side, Keynesian policies. The government stimulated aggregate demand as never before, with the budget deficit reaching about 20 percent of national output while the unemployment rate fell below 2 percent for three consecutive years, a first in American history.

Leaders of the new Keynesian orthodoxy were terrified that the end of the war would mean a massive decline in government spending, a shift from fiscal stimulus to a contractionary fiscal policy, and consequently massive unemployment. In 1945, Keynesian economist Robert Nathan predicted double-digit unemployment as demobilization proceeded, and many other economists similarly expected unemployment to return to the 10-14 percent range.

Alvin Hansen, the dean of American Keynesians, said that "the government cannot just disband the Army, close down munitions factories, stop building ships and remove all economic controls." But that is exactly what happened. The nation cut federal employment by 10 million in 12 months, moving from a massive budget deficit to a huge budget surplus. What happened to the unemployment rate? It rose, but to under 4 percent. The postwar depression never came.

Again, the explanation is that the real wage, adjusted for productivity change, actually fell, largely offsetting the potential rise in unemployment arising from soldiers coming home and munitions factories shutting down. The after-the-fact Keynesian explanation is that "pent-up" demand prevented a big downturn. While the nation indeed badly wanted consumer goods, the resumption of full domestic production came long after the labor market completed the huge postwar transition, creating, for example, millions of new civilian jobs between mid-1945 and mid-1946.

The postwar recovery was aided by a removal of war-time wage and price controls. Wages rose, but less rapidly than prices, and the adjusted real wage fell. Profits of American business rose, and the large amounts of funds available for investment kept interest rates low, stimulating a private investment spending boom. No major government public works or jobs programs were begun--the postwar transition was a great tribute to the powers of the labor market.

Illusions of the Sixties

The heyday of Keynesian orthodoxy in the United States was in the first generation following World War II. In no year did the unemployment rate average as much as 7 percent, and over the whole generation it averaged less than 5 percent. Economists began to think that they had eliminated business cycles. As is so often the case with that profession, they were wrong.

While Keynesian rhetoric captured the hearts and minds of professional economists, by some measures policy-makers were less activist in the late 1940s and 1950s than in the preceding two decades. Harry Truman and Dwight Eisenhower were the last American presidents to reduce the per-capita public debt. The period from 1947 to 1961 was not an era of Keynesian-style fiscal policy activism. There was even some retreat from the intensive labor market interventionism of the previous decades. The Taft-Hartley Act of 1947 (and to a lesser extent the Landrum-Griffin Act of 1959) reduced the power of labor unions somewhat. Although it was not recognized at the time, by the late 1950s labor union membership was falling relative to the total labor force. Federal jawboning of employers to keep wages high had ceased, and markets had adjusted to the wage-enhancing bias of earlier New Deal legislation.

The 1960s were hailed at the time as the successful demonstration of the effectiveness of Keynesian demand-management policies. Unemployment fell consistently during the decade. Economists began urging policy-makers to move up the newly discovered Phillips curve, using inflationary fiscal stimulus to stimulate employment. Lyndon Johnson led America roughly simultaneously into a major tax reduction, Great Society spending programs, and the Vietnam war. Fiscal stimulus was extremely strong, it seemed to work, and the 1968 unemployment rate stood at half the recessionary level recorded at the beginning of the decade. Increases in the rate of inflation approached the growth in money wages, so real wage growth was moderate in this era. After adjusting for robust productivity growth, real unit labor costs fell, stimulating employment.

However, the successes of the 1960s laid the groundwork for 1970s stagflation. In the 1960s, Americans were fooled by the unexpectedly high inflation. After all, deliberate policy to raise prices was new in American economic history. Workers were afflicted with "money illusion," accepting lower real wages than they normally would have. By the late 1960s, however, workers had shaken off their money illusion, and demanded larger wage settlements. Hourly compensation costs rose over 7 percent a year, compared with just under 4 percent in 1963, the year Lyndon Johnson took office.

Stagflation in the Seventies

The expansionary monetary and fiscal policies of the Great Society era led directly to the stagflation of the 1970s. During that decade, the average unemployment rate rose to over 6.2 percent, from under 4.8 percent in the 1960s. At the same time inflation increased, and for the first decade in American history the government ran budget deficits every single year, accompanied by unprecedented monetary expansion. In three consecutive years, 1975-1977, the stock of money (M2) grew more than 10 percent a year, a peacetime record. The macro stimulus did not work in achieving growth and full employment.

The Keynesian economics establishment used the OPEC-imposed oil price shocks of 1973 and 1979 to explain the stagflation of the era. There is no doubt that higher oil prices hurt the American economy. Yet there were clear signs of stagflation beginning well before the first oil shock; the "misery index" (the sum of the unemployment and inflation rates) was 10 in 1972, compared with 6.8 just five years earlier; both the unemployment and inflation components were rising long before the Arab oil boycott of 1973.
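The misery index mentioned above is simple arithmetic, and a one-liner makes the definition concrete. The input split below is hypothetical, chosen only to sum to the 1972 value the text cites; it is not official 1972 data.

```python
# The "misery index": unemployment rate plus inflation rate,
# both expressed in percentage points.

def misery_index(unemployment_pct, inflation_pct):
    return unemployment_pct + inflation_pct

# Hypothetical split (not official 1972 statistics) that sums to the
# article's 1972 figure of 10:
print(misery_index(unemployment_pct=5.5, inflation_pct=4.5))  # 10.0
```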

Moreover, labor costs were escalating because of two interrelated phenomena: higher taxes and regulatory expenses, and a decline in labor productivity growth. The Johnson-Nixon era of huge entitlement programs raised the reservation wage of low-income Americans. They became more picky about taking jobs, given the increasingly generous package of welfare benefits, including such new programs as Medicaid and Food Stamps. Rising Social Security taxes and minimum wages (including expanded coverage) likewise boosted employer labor costs.

The inflation also raised effective tax rates, owing to the progressivity of federal income taxes. Taxes on capital rose to near-confiscatory levels (particularly capital gains taxes, which were levied on largely fictitious inflation-induced "gains"), so capital formation slowed, lowering productivity growth. Although that growth slowed for other reasons as well (e.g., the influx of new, inexperienced workers and the effect of the oil price shock on the usability of energy-intensive capital resources), government policy had to share in the blame for the rise in the adjusted real wage.

Surrounded by Keynesian advisers who were ignorant or contemptuous of the earlier classical tradition, Jimmy Carter continued to battle stagflation with policies of demand stimulus. The government increased spending substantially, and deficits were heavily monetized by a compliant Federal Reserve that purchased newly issued debt, expanding the supply of money but not the supply of goods and fueling inflation. Inflationary expectations soared among workers and lenders, who demanded and received big wage increases and high interest rates. The nadir for the economy came in 1979-1980, when inflation reached double digits for the first time in peacetime American history, while unemployment exceeded 7 percent. The misery index, which had typically hovered around 5 in the heyday of classical economic policies (1900-1929), and had still been in the single digits during the Kennedy-Johnson era, now approached 20.

The Reagan Boom

Enter Ronald Reagan and Paul Volcker. At the time of their arrival on the national policy stage, conventional economic wisdom held that there was a "core inflation" rate of perhaps 8 percent a year that could not be reduced any time soon. An industrial policy was also increasingly advocated by those who felt the national economic malaise stemmed from a lack of central direction.

Volcker's move to a more restrictive monetary policy was somewhat erratic, but by focusing on controlling the stock of money the Fed contributed to a significant cooling in the rate of inflation. The annual inflation rate, which averaged around 12 percent in 1979 and 1980, averaged only slightly over 6 percent in 1981 and 1982. Yet the inflationary expectations of workers and unions lagged behind the actual decrease in the inflation rate. In order to protect themselves against what was incorrectly perceived to be nearly double digit inflation, workers demanded (and got) large money wage increases in 1981 and into 1982. Money wages began to rise faster than prices, pushing real wages up. For example, in the first quarter of 1982, hourly labor costs were rising at an annual rate exceeding 7 percent, while the annual inflation rate had fallen to under 2 percent. This caused the 1982 recession.

Market forces ended the recession about a year after it began. Wage increases shrank, and by the second half of 1982 real wages began to fall, increasing labor's appeal, and beginning the largest recorded peacetime expansion in American history. During the remainder of the 1980s, money wage increases typically were only roughly equal to the rate of inflation, while labor productivity rose. The result was that labor costs of employers fell slowly but steadily per dollar of sales.

Why? After the high unemployment of the 1982 recession, American workers were more willing to sacrifice large real wage increases in order to improve job security. Labor unions became weaker and less militant, and public opinion was distinctly less pro-labor than in earlier decades. President Reagan's symbolically important 1981 firing of air traffic controllers after an illegal strike would have caused a public uproar in the 1940s or 1960s, but it was actually well received by a majority of the public. Unions represented 20.1 percent of employed wage and salary workers at the beginning of the economic expansion beginning in 1983, but by the end of the 1980s, the proportion had fallen steadily to 16.4 percent.

The 1981 tax cut may have also contributed to the moderation in wage demands by American labor. With tax reductions, the disposable income associated with any given wage rose. Workers whose before-tax real wage remained constant over time nonetheless received some increase in remuneration for their work effort because of the tax cut.

Deregulation in some cases worked to keep wages from rising. Until the late 1970s, for example, air fares and airline employee wages were relatively high because of government-enforced monopoly practices. The introduction of competition through fare and route deregulation lowered ticket prices, leading some airlines to demand, and win, wage concessions from workers in order to keep labor costs under control.

The decline in the adjusted real wage in the 1980s led to the creation of 20 million jobs between late 1982 and the spring of 1990, in marked contrast to Europe and Japan, where job creation was comparatively anemic. The hands-off approach of the federal government during the Reagan years reduced policy-induced shifts in the adjusted real wage and the resultant unemployment instability. The decline in the real minimum wage in the 1980s (the hourly minimum wage stayed constant at $3.35 while prices rose) was only the most obvious of several policy decisions that reduced upward pressures on money wage increases.

The Current Recession

According to conventional wisdom, the 1990 recession began with Saddam Hussein's invasion of Kuwait in August 1990. While the surge in oil prices accompanying Saddam's adventure hurt the American economy, the seeds of the recession were sown earlier with a rise in money wages. Hourly wage costs, including benefits, rose less than 4 percent annually in the 1987-1989 period. In the first quarter of 1990, however, they increased at a 5.8-percent annual rate, followed by an extraordinary 9.2-percent annual growth in the second quarter--before the Persian Gulf War began.

What caused this sudden surge in money wages? An important factor was the increase in the minimum wage. After holding steady at $3.35 an hour for over nine years (the longest period in the history of that wage without an increase), on April 1, 1990, the rate was increased by 13.4 percent, to $3.80 an hour. The rise in the minimum also had a shock effect on the rest of the labor market, as wages rose among employers who felt obliged to pay more than the minimum. Real gross domestic product, which grew at a 2.8-percent annual rate in the first quarter of 1990, actually fell at a 1.6-percent annual rate by the third quarter.

The recession caused by higher wages was worsened by the rise in oil prices. Markets soon began to adjust, however, offering promise for recovery. Money wage growth slowed sharply; by the first quarter of 1991, wages were growing 2.7 percent on an annual basis. Yet recovery was at least partially aborted by a second wage shock, when the minimum wage went up again, this time by 11.8 percent to $4.25 per hour, pushing wage growth in the second quarter of 1991 to nearly 5 percent on an annual basis. The new minimum wage in April 1991 was higher than what a majority of teenage workers were making in 1989. The legally mandated wage explosion among unskilled workers helped create the 1990 recession, contributing to the stagnation in the economy over the past two years.
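The percentage figures for the two minimum-wage hikes can be checked directly; the snippet below simply reproduces the arithmetic from the statutory wage levels given in the text.

```python
# Verify the minimum-wage increases cited in the text:
# $3.35 -> $3.80 (April 1990) and $3.80 -> $4.25 (April 1991).

def pct_increase(old, new):
    """Percentage increase from old to new."""
    return 100 * (new - old) / old

print(round(pct_increase(3.35, 3.80), 1))  # 13.4
print(round(pct_increase(3.80, 4.25), 1))  # 11.8
```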

Aggravating the wage explosion was the decision of the Congress and President Bush to extend unemployment insurance benefits on three occasions. When prolonged benefits are provided, the "reservation wage" of the unemployed rises--the minimum wage at which they will accept a job and go back to work. With extended jobless benefits, the unemployed become finicky about jobs they are willing to take, increasing the average duration of unemployment and thus the overall unemployment rate.

The 1990 recession could have been prevented, or at least ended more quickly, had productivity risen in line with the wage increases. Yet labor productivity languished. Whereas output per hour worked rose at a respectable 1.66-percent average annual rate from 1982 to 1988, it was essentially unchanged from 1988 to 1991--in part because of the growing regulatory burden of the Bush years.

The Unhappy Legacy of Federal Intervention

Early in the 20th century, the median annual unemployment rate was lower than 5 percent. This was still largely an era of laissez faire policy, when the federal government in most years spent less than 3 percent of gross national product, and interference in labor markets was small. In the decades of the 1970s and 1980s, by contrast, the median annual unemployment rate was 6.6 percent, nearly one-half higher than the median rate in the earlier period. Yet the latter period saw substantial government efforts to reduce unemployment. The increased modern unemployment was not caused by a lack of demand stimulus. On the fiscal policy side, every year had a budget deficit, and government spending rose as a proportion of national output. Regarding monetary policy, in nearly two-thirds of the years, the stock of money (M2) grew more than 8 percent, high by historical standards.

What happened? The normal rate of unemployment went up, in part because of the "Law of Unintended Consequences." Unemployment insurance programs reduced the intensity of job search and raised the reservation wage among the unemployed, increasing unemployment. The expansion of coverage under the minimum wage laws priced some unskilled labor out of the labor market.

Between 1977 and 1992, Social Security and related federal payroll taxes rose from 11.7 percent to 15.3 percent of wages, creating more upward pressure on the adjusted real wage. Similarly, workers' compensation costs as a percent of covered payroll nearly doubled between 1970 and 1990.

A host of new federal regulations added to what economists call structural unemployment. Environmental legislation forced coal mines to close. Safety legislation in some cases reduced productivity, forcing up the adjusted real wage. Soaring fringe-benefit costs reflected perversities in the tax laws, as well as the impact of governmentally financed health care on demand, and thus on medical costs. Well-intentioned policies had unforeseen results.

Boosting Productivity

In some respects, the 1990s to date resemble the 1930s, albeit on a much milder scale. In the 1930s, whenever markets started to eliminate unemployment, government-induced wage shocks prevented recovery. Similarly, the minimum wage shocks of the early 1990s, along with other wage-increasing or productivity-reducing laws, have led to a sub-par economic performance. Given both the past record of George Bush and some of the campaign rhetoric of Bill Clinton--support for indexing of minimum wages and large mandated fringe-benefit increases in the health area--we are not sanguine about a return to the employment growth of the Reagan era, a period relatively devoid of government-induced wage shocks.

Employment stagnation is particularly acute in periods of low productivity growth. Workers expect and want some increase in their standard of living over time. In a stagnant economy without productivity growth, higher real wages are possible only by pricing abnormally large numbers of workers out of the market. A trade-off emerges: higher wages or higher employment. The key to avoiding this trade-off, improving living standards and job opportunities simultaneously, is increasing the productivity of American labor.

History and the weight of economic theory suggest that we need to do three things to our human, capital, and natural resources to enhance productivity: enlarge them, improve them, and permit them to move. Increases in the ratio of capital to labor are fundamental to economic progress--giving workers more tools to work with (including skills and knowledge). Just as fundamental, technological change provides new and better ways of combining resources to obtain more output. Finally, history shows that a productive nation is one in which resources are able to move in response to better opportunities.

During America's rise to world economic preeminence early in this century, more than 20 percent of national output was devoted to capital formation, compared with about 16 percent today. The decline in net capital formation, which allows for depreciation, has been even more dramatic. Our investment levels are below those of Europe and industrial Asia, in large part because incentives to save and invest have been eroded by punitive tax policies and regulatory takings. Our educational expenditures have risen but true per capita human knowledge has probably fallen, reflecting the inefficiencies of a socialistic educational delivery system.

America's perversely high capital gains taxes dramatically reduce incentives to innovate as the rate of return on capitalized human ingenuity is reduced. (The recent reversal of the Reagan-era marginal income tax rate reductions also dampens entrepreneurial incentives.) Restrictions on mobility in the form of plant-closing notification laws, protectionist trade restrictions, and limits on immigration have prevented resources from flowing to their best use. The goal of simultaneously raising living standards and job growth requires a growth-oriented public policy that increases incentives to form, improve, and move capital and human resources. Attempts to raise wages by legislative fiat reduce job opportunities and ultimately do nothing to raise the American standard of living. The lesson of 20th-century economic history is that higher labor productivity is the only way to raise real wages without destroying jobs.
COPYRIGHT 1992 Hoover Institution Press
No portion of this article can be reproduced without the express written permission from the copyright holder.

Author: Vedder, Richard; Gallaway, Lowell
Publication: Policy Review
Date: Sep 22, 1992
