A History of Social Work

The American social work profession was established in the late 19th century to help immigrants and other vulnerable people gain the tools and skills to escape economic and social poverty. Social workers support people in their personal and interpersonal lives, working to improve individual well-being and pursuing social change that benefits a wide variety of individuals, families, and communities.