The keys to welfare reform, in the view of conservative policy analysts, are to give more authority to local governments, end the programs' entitlement status, expand the role of private charities, reconsider orphanages as a way to save children from the ravages of poverty and restrict benefits in order to encourage self-sufficiency.
"Our three principles--work, personal responsibility and state control--are the keys to unlocking the welfare prison that has kept our fellow citizens trapped," said Rep. E. Clay Shaw Jr. (R-Fla.), chairman of the House subcommittee overseeing welfare reform.
Could such an approach work?
Clues to the answer may lie in the nation's not-so-distant past. Today's conservative prescription for welfare reform is in some ways strikingly similar to what actually existed in the United States for much of its history. Until the coming of Franklin D. Roosevelt's New Deal in the 1930s and Lyndon B. Johnson's Great Society in the 1960s, there was no federally administered national welfare system.
Instead, towns and counties doled out assistance to families that had come upon hard times and operated "poor farms," "county homes" and other such institutions for those with no place else to turn. Churches and fraternal organizations funded orphanages. And, especially in rural areas, individual families often took in poor relations or provided sustenance in exchange for labor.
For many people, the system had its advantages: It provided a measure of protection against outright suffering. It did not spawn cumbersome bureaucracies. And the rampant drug addiction, violence, fatherless families and other nightmares that plague today's welfare culture were relatively rare.
At the same time, the quality of many local institutions was considered scandalous, even by the standards of the time. Disease and exploitation were commonplace, and families were sometimes torn apart. Local resources were periodically overwhelmed by waves of immigration and such calamities as drought and recession.
And then as now, costs were considered too high.
More important, the local systems were plagued by a stigma that undercut their public acceptance: the corrosive suspicion that, while some clearly deserved help, too many undeserving individuals were riding along free and making a spectacle of their idleness.
Indeed, long before the coming of the New Deal and the Great Society, the perceived failings and inadequacies of the old approach had given rise to calls for reform--some as far back as the mid-19th Century.
"There was no golden age," historian Michael Katz said.
Nonetheless, Katz and many other experts agree, elements of the past--what worked in the old system and what did not--offer valuable guidance for those seeking a better system for the future.
"All debates on poverty and welfare have raged around the same series of questions: How can we discriminate between the able-bodied and the non-able-bodied and the 'worthy' and 'non-worthy' poor?" Katz said. "What is the impact of welfare on people's motivation to work? What are our obligations to each other? You can see these questions being played out with people on all sides of the issue, debating them in similar ways over a very long period of time."
Basically, the strategy for dealing with poverty that evolved from Colonial times--much of it derived, in turn, from the "poor law" of Elizabethan England--had two parts: direct material aid for those who could, with modest assistance, keep body and soul together in the regular world, and institutional care for those who could not--the old, the homeless, the very young, the sick and the disabled.
The former was widely known as "outdoor relief" and is regarded as the forerunner of today's welfare programs because it involved giving out cash and commodities. Local communities gave direct aid in the form of money and goods to citizens in need. More often than not, those who needed help were friends and neighbors.
"It was not always cash. It could be coal for fuel, or they could pay your rent for you, or they could give you food or castoff clothing," said Linda Gordon, professor of American history at the University of Wisconsin and author of a recent book about single mothers and welfare. "These were close communities where the poor were your neighbors."
That sense of solidarity was an essential underpinning of the system. People received help because the community felt obligated to take care of its own, not because there was any broad individual "entitlement."
The community-based nature of the system was enshrined in law as well as custom. "One of the features for many years with poor laws was 'settlement' laws--an obligation to support those who belonged to the community," said Katz, a University of Pennsylvania history professor and author of the book "Improving Poor People."
That was both a strength and a source of problems. The community's obligation was clear. What was not clear, and what led to controversy, was the question of who was a member of the community. Had applicants truly settled in the town or were they transients?
"They went to great lengths to determine who was settled and who wasn't," he said. Those deemed "not settled" might be denied help or even compelled to move on.
Outdoor relief was prevalent everywhere outside the South, but in the Northeast it was particularly ingrained. And despite the widespread belief today that welfare is a uniquely modern problem, substantial portions of the population were given this help. Twenty out of every 1,000 people in Massachusetts received it in 1890, for example. At one time in neighboring New Hampshire, the rate exceeded 25 for every 1,000.
By way of comparison, in New Hampshire today the rate of mothers on Aid to Families With Dependent Children is 20 for every 1,000 residents. In Massachusetts, the AFDC rate is 50 for every 1,000.
Gradually, however, the traditional system that emphasized helping only those who belonged to the community became harder and harder to sustain. The westward expansion created problems. So did immigration and the Industrial Revolution. New communities sprang up in the West. Established communities began to receive influxes of migrants and immigrants--strangers, not friends and relations. Urban industrial centers with their impersonal ways and fluid populations became more common.
On such landscapes, who was "settled" and who was not?
Although desperately in need, many of the newcomers were treated with hostility. The potential for widespread suffering and strife began to mount. At the same time, even for those who were recognized as wards of the community, suspicions festered about who in the growing ranks were truly worthy of help and who might be taking advantage.
In sentiments not unlike those expressed by critics of the system today, critics of outdoor relief portrayed it as a cause of willful idleness or "pauperism," experts say. Many began to believe that it undermined the recipient's desire to work.
"As in England, there was a clear difference between being poor and being a pauper," according to UCLA law professor Joel Handler, who has studied the historical roots of welfare. "The latter was a moral issue. The idle able-bodied were viewed as criminals, as threats to themselves as well as the community."
Reformers in the 19th Century began campaigning to crack down on the dole and those getting it. One influential effort in Brooklyn, led by Seth Low, an ambitious politician, managed to shut down the local system entirely.
Under his leadership, Brooklyn became the first of 10 of the nation's 40 largest cities to abolish public outdoor relief between the 1870s and the 1890s. Others reduced the amount they provided.
While support for outdoor relief was eroding--although some form of it continued for many years--"indoor relief," the idea of institutionalizing needy people, was gaining popularity. The movement included hospitals, mental institutions and orphanages in addition to poorhouses.
The world of institutional care reflected a new belief in the power of institutions to solve social problems through their influence on behavior and character.
Communities bought into the concept eagerly. For example, in Massachusetts the number of poorhouses increased from 83 to 219 between 1824 and 1860.
But if the institutions helped address one problem--public confidence--they created another.
Conditions in these facilities were horrible--often deliberately so--in order to motivate inmates to leave and find work as soon as possible. "Everyone feared that if you were too nice to people, they wouldn't want to work. So you put them in an institution and fed them but didn't give them any money," Gordon said.
Most poorhouses were "wretched places, unsanitary, spreading disease, often effectively run by ungovernable inmates who entered and left at will, undercut by corrupt suppliers and managers and unsupported by a hostile public," Katz said.
In addition to being mistreated, families in them were often shattered, experts say. Resident parents wound up sending their children to orphanages to escape the harsh conditions.
Furthermore, communities were disappointed to learn that the misery did not seem to push people into the work force. Once in the poorhouse, residents had a hard time finding a way out, and that in turn fueled new suspicions about freeloading and dependency.
"There was no rehabilitation," said Handler, also the author of a forthcoming book on welfare.
The failings of these approaches paved the way for the modern system and provide a cautionary lesson for reformers today. No matter how distasteful paying money directly to the poor might seem, "institutional care is always more expensive," Gordon said.
For example, Connecticut spent $1.2 million on indoor relief between 1898 and 1927 for a resident population that ranged from 2,000 to 5,000. Costs for outdoor relief were half that, even though the number of people served was sometimes ten times greater.
Another lesson from this time, experts say, is that if getting people into paying jobs is a priority, more than just strong discipline is needed.
As local and private institutions were discredited, government at a higher level gingerly began to step in.
By World War I, governments had established foster care to get youths out of almshouses, juvenile courts were created to keep them out of adult prisons and compulsory education and child labor restrictions were enacted to give them a chance for a better life.
Also, what became known as the "mothers' pensions" movement arose to stop the damage being done to the family structure. It helped push laws through the legislatures of 40 states to keep poor mothers and their children together, in what became a precursor to AFDC.
"Back then, a middle-class woman could fall into poverty quite suddenly if her husband died," said Theda Skocpol, professor of sociology and government at Harvard University. In 1931, mothers of 200,000 children drew government stipends so that they could stay at home with their children rather than go to work full time.
In 1921, Congress passed the Sheppard-Towner Act, which established a system of preventive maternal and child health services to help destitute women in isolated rural areas.
It "drastically reduced the infant mortality rate in a number of states," Gordon said. "And there were wonderfully heroic instances of nurses traveling on horseback to reach these people."
Five years later, the law was repealed after doctors complained it was unfair competition. But the direction of public policy was clear.
The Depression, however, with a resounding blow, ended the half-measures and ushered in a new era.
While "America has always had vibrant charitable traditions, they just couldn't handle the need," Skocpol said, explaining that when economic downturns occurred--as they did once a decade--private charities could not cope.
With unemployment surging to about 15 million people by 1932--10 times more than only three years before--and thousands of poor migrating around the nation, begging for work and living in squatters' camps, newly elected President Roosevelt put the federal government directly into the business of relief, or public assistance, for the first time.
Government spending for social welfare increased massively--from $208 million in 1932 to $4.9 billion in 1939.
The New Deal and related efforts saw creation of the National Recovery Administration, the Civilian Conservation Corps, the Public Works Administration, Social Security, unemployment insurance, AFDC, the National Youth Administration and the Works Progress Administration, among others, which clearly staked out a broad government responsibility for caring for the poor.
Spurred by the civil rights movement and horror stories of urban and rural poverty, the Great Society followed three decades later and expanded the welfare state, adding Medicare, Medicaid and federal aid for education.
But now that the comprehensive government approach has in its turn come to grief--also, tellingly, the victim of costs, negative social consequences and public anger--experts say the answer is not a headlong return to the past.
Rather, they say, a safer course would be a mix of roles for private organizations, local government and central government, emphasizing ways of building a bond between givers and receivers, providing discipline against abuse and funneling recipients directly toward work.
"There is a myth about welfare that, once upon a time, private citizens and voluntary associations took care of needy people in America and government had nothing to do with it," Katz said. "That is totally a myth."
Added Skocpol: "When the government does things for the poor, that stimulates and often works in partnership with what private groups are doing. These things don't rise and fall in opposition to each other, but together. There has never been a time where purely private activity has been sufficient to care for the poor."