Employer-sponsored health insurance plans dramatically expanded as a direct result of wage controls imposed by the federal government during World War II.[20] The labor market was tight because of the increased demand for goods and decreased supply of workers during the war. Federally imposed wage and price controls prohibited manufacturers and other employers from raising wages enough to attract workers. When the War Labor Board declared that fringe benefits, such as sick leave and health insurance, did not count as wages for the purpose of wage controls, employers responded by significantly increasing their offers of fringe benefits, especially health care coverage, to attract workers.[20]
Hospital indemnity insurance provides a fixed daily, weekly, or monthly benefit while the insured is confined in a hospital. The payment does not depend on actual hospital charges and is most commonly expressed as a flat dollar amount: a policy paying $100 per day, for example, would pay $300 for a three-day stay regardless of what the hospital billed. Hospital indemnity benefits are paid in addition to any other benefits that may be available, and are typically used to pay out-of-pocket and non-covered expenses associated with the primary medical plan, and to help with additional expenses (e.g., child care) incurred while in the hospital.[19][95]
Early hospital and medical plans offered by insurance companies paid either a fixed amount for specific diseases or medical procedures (schedule benefits) or a percentage of the provider's fee. The relationship between the patient and the medical provider was not changed. The patient received medical care and was responsible for paying the provider. If the service was covered by the policy, the insurance company was responsible for reimbursing or indemnifying the patient based on the provisions of the insurance contract ("reimbursement benefits"). Health insurance plans that are not based on a network of contracted providers, or that base payments on a percentage of provider charges, are still described as indemnity or fee-for-service plans.[19]

According to some experts, such as Uwe Reinhardt,[120] Sherry Glied, Megan Laugensen,[121] Michael Porter, and Elizabeth Teisberg,[122] this pricing system is highly inefficient and is a major cause of rising health care costs. Health care costs in the United States vary enormously between plans and geographical regions, even when input costs are fairly similar, and have risen faster than economic growth at least since the 1970s. Public health insurance programs typically have greater bargaining power as a result of their larger size and typically pay less for medical services than private plans, leading to slower cost growth, but the overall trend in health care prices has led public programs' costs to grow at a rapid pace as well.
Through the 1990s, managed care insurance schemes, including health maintenance organizations (HMOs), preferred provider organizations (PPOs), and point-of-service (POS) plans, grew from covering about 25% of US employees with employer-sponsored coverage to covering the vast majority.[69] With managed care, insurers use various techniques to address costs and improve quality, including negotiation of prices with "in-network" providers, utilization management, and quality-assurance requirements such as accreditation by bodies such as the Joint Commission and the American Accreditation Healthcare Commission.[70]
Massachusetts passed healthcare reform in 2006 in order to reduce the uninsured rate among its residents. The federal Patient Protection and Affordable Care Act (colloquially known as "Obamacare") is largely based on the Massachusetts reform.[39] By analogy with that colloquialism, the Massachusetts reform has been nicknamed "Romneycare" after then-Governor Mitt Romney.[40]
Hospital and medical expense policies were introduced during the first half of the 20th century. During the 1920s, individual hospitals began offering services to individuals on a pre-paid basis, eventually leading to the development of Blue Cross organizations.[65] The predecessors of today's Health Maintenance Organizations (HMOs) originated in 1929 and developed through the 1930s and World War II.[67][68]

Nearly one in three patients receiving NHS hospital treatment is privately insured and could have the cost paid for by their insurer. Some private schemes provide cash payments to patients who opt for NHS treatment, to deter use of private facilities. A November 2012 report by private health analysts Laing and Buisson estimated that more than 250,000 operations were performed each year on patients with private medical insurance, at a cost of £359 million. In addition, £609 million was spent on emergency medical or surgical treatment. Private medical insurance does not normally cover emergency treatment, but subsequent recovery could be paid for if the patient were moved into a private patient unit.[44]
Public programs provide the primary source of coverage for most seniors, as well as for low-income children and families who meet certain eligibility requirements. The primary public programs are Medicare, a federal social insurance program for seniors (generally persons aged 65 and over) and certain disabled individuals; Medicaid, funded jointly by the federal government and the states but administered at the state level, which covers certain very low-income children and their families; and CHIP, also a federal-state partnership, which serves certain children and families who do not qualify for Medicaid but cannot afford private coverage. Other public programs include military health benefits provided through TRICARE and the Veterans Health Administration and benefits provided through the Indian Health Service. Some states have additional programs for low-income individuals.[43] In 2011, approximately 60 percent of hospital stays were billed to Medicare and Medicaid, up from 52 percent in 1997.[44]

In the late 1990s and early 2000s, health advocacy companies began to appear to help patients deal with the complexities of the healthcare system, which has created a variety of problems for the American public. One study found that 62 percent of persons declaring bankruptcy in 2007 had unpaid medical expenses of $1,000 or more, and in 92 percent of these cases the medical debts exceeded $5,000; nearly 80 percent of those who filed for bankruptcy had health insurance.[59] The Medicare and Medicaid programs were estimated to soon account for 50 percent of all national health spending.[60] These factors and many others fueled interest in an overhaul of the health care system in the United States. In 2010, President Obama signed into law the Patient Protection and Affordable Care Act. The Act includes an "individual mandate" requiring every American to have medical insurance or pay a fine. Health policy experts such as David Cutler and Jonathan Gruber, as well as the American medical insurance lobby group America's Health Insurance Plans, argued that this provision was required in order to provide "guaranteed issue" and "community rating," which address unpopular features of America's health insurance system such as premium weightings, exclusions for pre-existing conditions, and the pre-screening of insurance applicants. On 26–28 March 2012, the Supreme Court heard arguments regarding the validity of the Act, and on 28 June 2012 it upheld the Act as constitutional, ruling that Congress had the authority to apply the individual mandate within its taxing powers.[61]
The share of Americans without health insurance has been cut in half since 2013. Many of the reforms instituted by the Affordable Care Act of 2010 were designed to extend health care coverage to those without it; however, high cost growth continues unabated.[3] National health expenditures are projected to grow 4.7% per person per year from 2016 to 2025. Public healthcare spending was 29% of federally mandated spending in 1990, 35% in 2000, and is projected to be roughly half of it in 2025.[4]