“Three Myths of Senior Living Communities” is an article written by Dwayne J. Clark, founder and chief executive officer of Aegis Living.  He was nice enough to allow us to share it with our readers.

It’s difficult to overcome stereotypes of senior living communities. Despite the fact that the level of available care and amenities, and the choice and type of facilities, have evolved significantly over the past several decades, people still tend to think of senior housing as the “old folks’ homes” of the past: antiseptic, white-walled, linoleum-lined institutions with cold nurses, hot temperatures, and nasty food. It’s no wonder then that the majority of people continue to buy into three myths about senior living institutions that are not only flat-out wrong but can actually be detrimental to the well-being of their aging loved ones. The three myths of senior living communities are:

1. All senior housing options are the same. The reality is that today’s senior living industry is similar to the hotel industry, with a range of choices for every lifestyle, need and budget. You can find low-end chains that offer only the very basics in care and amenities, similar to a Motel 6. There are family-run operations, set up in residential homes, not unlike bed-and-breakfasts. And then there are high-end luxury options, comparable to a Four Seasons hotel. Too often, family members and seniors avoid even considering senior living options out of fear of the unknown and a misunderstanding of what present-day senior communities are all about. They are, unfortunately, relying on outdated childhood memories of when a grandparent or a great-aunt went off to a nursing home and never came back.

This does not have to be the case. At the higher end, senior living communities can provide lifestyle activity coordinators instead of program directors, and employ chefs instead of dieticians. They can offer on-site spas and appropriately equipped gyms, massage therapy services, manicures and pedicures, movie theaters, outdoor gardens, and gourmet dinners with wine on the menu. One new site even has a “man cave,” complete with pool tables and beer taps.

2. Entering a senior living community actually hastens the end of someone’s life. Assuming that a senior is better off “aging at home” can result in unnecessary suffering and even tragedy. Many seniors who could benefit from just a little added care are often found living alone, far away from family, largely isolated and devoid of much human interaction, and typically at high risk of physical falls, malnourishment, and depression. These seniors are perfect candidates for an assisted living community because, once they are living in a place where they have access to medical care, personal assistance, medication management, good nutrition, opportunities for mental and physical activity, and a chance to make friends and socialize, they truly thrive. In fact, several new studies show that not only does a move to an assisted living community not hasten a resident’s demise but, in fact, it can actually ensure a greater quantity—and a better quality—of life.

At many senior living communities there are residents who have renewed their childhood hobbies, or taken up new ones like writing, painting or billiards. There are residents who always have a dinner or coffee companion. They can enjoy on-site book groups and religious services. They can play checkers or Wii. Residents often enjoy unexpected romances and, in some cases, marriages. Family members, freed from the worry and guilt of seeing their loved ones in less-than-ideal circumstances, tend to visit more often, strengthening long-worn family ties through new opportunities for quality time and stress-free activities.

3. Only the very wealthy, and the very poor, can afford to live in a senior living community. The fact is that retirement and assisted living communities have been consciously created by senior housing developers to be very affordable for middle-class consumers. The monthly cost of assisted living varies, but the average for a more upscale residence is between $4,200 and $6,200 a month. At first glance, that sounds like a lot of money, and many a family member immediately thinks, “There is no way my mother can afford that.”

But the cost of assisted living needs to be carefully compared with the total cost of living at home. Ongoing expenses of seniors staying in their houses might include rent or mortgage payments; property taxes and homeowners insurance; utilities, such as electricity, heating oil or propane, water, trash pickup, cable, phone and Internet service; home maintenance costs, including lawn care, snow removal, tree care; routine and major repairs to the home (and appliances and other needed home equipment like an air conditioner or furnace); car maintenance; and food and cleaning supplies. Additionally, as a parent or sibling ages, there are likely to be new costs including outside help with laundry, housekeeping, home upkeep and meal preparation; real-time monitoring devices and medical equipment; home health care; and transportation for medical appointments and other necessities. Those expenses, when taken in their entirety, are likely to be almost as much as or equal to the flat-fee monthly cost of an assisted living community. And most people are surprised when they realize that not only can their parents afford to live at one of these communities, but they actually have leftover funds.
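As a rough illustration of the comparison the article suggests, here is a short sketch in Python. All of the line items and dollar amounts below are hypothetical examples, not figures from the article; every family's numbers will differ.

```python
# Hypothetical monthly costs for a senior staying at home (illustrative only).
home_costs = {
    "mortgage_or_rent": 1200,
    "property_tax_and_insurance": 450,
    "utilities_and_services": 600,       # electricity, heat, water, cable, phone
    "maintenance_and_repairs": 350,      # lawn care, snow removal, appliances
    "car_and_transportation": 400,
    "food_and_supplies": 550,
    "outside_help_and_home_care": 1400,  # housekeeping, meals, home health aides
}

total_home = sum(home_costs.values())
assisted_living_fee = 5200  # flat monthly fee, mid-range of the $4,200-$6,200 cited above

print(f"Estimated cost of staying at home: ${total_home:,}/month")
print(f"Assisted living flat fee:          ${assisted_living_fee:,}/month")
print(f"Difference: ${assisted_living_fee - total_home:,}/month")
```

With these assumed numbers, the at-home total comes within a few hundred dollars of the flat assisted-living fee, which is the article's point: the gap is often far smaller than families expect once every home expense is counted.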

Some seniors, of course, won’t have quite enough monthly income to pay the total or to pay for incidentals and will have to begin to tap their financial assets, whether that means selling their home, pulling funds out of an IRA or 401K or beginning to pay down their life savings. In other cases, children or siblings will help pay for the difference. And there are other options as well. Couples can share a unit, making for a discounted overall rate. Many communities offer smaller studio apartments and two residents can share a two-bedroom suite, which helps cut the monthly cost.

What most aging seniors need is some oversight by professionals who understand their unique needs. They need to be treated with kindness and dignity, like any other person whether they’re still sharp or are prone to forgetfulness, and whether they remain physically strong or are in need of a walker. Seniors will find all of that in abundance at today’s retirement and assisted living communities. For new residents, living away from the life they’ve always known is an adjustment, but—more often than not—they quickly realize that it’s a change for the better. And their family members and other loved ones soon realize that the three myths about senior living communities are just that.


Dwayne J. Clark is the founder and CEO of Aegis Living, currently with 28 senior living communities in Washington, California, and Nevada, and the author of “My Mother, My Son.” Visit him online at www.mymothermyson.com

Celina Jacobson of the website Masters in Healthcare sent us an interesting article to share called 10 Common Medical Myths That Are Completely False.  We thought it was worth sharing:

For hundreds of years, humans have been programmed to believe things that were downright wrong. The same goes for medicine. Medicine has been, and continues to be, poorly understood, which has given way to several medical myths. A lack of knowledge and motivation to conduct further research has allowed people to come up with their own theories about the human body and how it works. We may not know everything about the body or completely understand its functions, but we do know that these 10 common medical myths are completely false.

1. The Flu Shot Can Give You the Flu: Despite many people’s beliefs, the flu shot does not infect you with the virus. In fact, the influenza viruses in a flu shot are inactivated, or killed, and they cannot cause an infection. Some people experience soreness or redness near the injection site after vaccination, but the shot itself does not cause flu illness. If someone does get flu-like symptoms after being vaccinated, it’s generally for one of two reasons. First, they may have been exposed to one of the influenza viruses before getting vaccinated or before the vaccine took effect. Second, they may have been infected by a different, non-flu virus, such as rhinovirus or another respiratory illness, that the vaccine does not protect against. Flu-like symptoms after vaccination are most common among the elderly and people with weakened immune systems.

2. Swallowed Gum Stays in Your Stomach for Seven Years: The truth can be hard to swallow, but you’ll be happy to know that your childhood friends were wrong about gum staying in your stomach for seven years. We should all know better than to underestimate the power of the digestive system. When gum is swallowed, the body breaks it down just like every other food you ingest. During the digestion process, the body extracts materials it can use and sends the rest out in the stool. Although the body cannot digest the synthetic ingredients of gum, the gum still passes normally through the stomach and small intestine and into the colon. Even though chewing gum is meant to be chewed, it’s perfectly fine to swallow it if need be.

3. Chocolate and Greasy Foods Cause Acne: Contrary to popular belief, chocolate and greasy foods do not cause acne. Acne is caused by three main factors: overproduction of oil, also called sebum; irregular shedding of dead skin cells, which irritates the hair follicles; and a buildup of bacteria. Although no one should go overboard on chocolate or greasy meals, there is no scientific link between diet and acne.

4. Cracking Your Knuckles Causes Arthritis: When doctors took a crack at disproving this myth, they found little to no truth behind it. The act of cracking your knuckles may sound bad to the ears, but it does not cause arthritis. Every time you crack your knuckles, you stretch the capsule that covers each joint and lower the pressure inside it, creating a vacuum effect: gasses previously dissolved in the capsule fluid form a bubble that pops. While cracking your knuckles does not cause arthritis, it has been linked to ligament injury, discoloration of the tendons and reduced grip strength.

5. Cold Weather Can Give You a Cold: This one couldn’t be more false. Cold weather alone does not cause people to catch colds – you have to contract the virus from an infected person to get one. Colds are more common during the winter months because people are generally indoors during this time and the viruses can spread more easily. In fact, cold viruses tend to survive better in the spring, summer and early fall months because humidity levels are high. So until you swap germs with a sick person, you won’t get a cold from wet hair, cold temperatures or going hatless outside.

6. You Have to Wait 30 Minutes After Eating Before You Can Swim: Despite your mother’s warning, there is no scientific proof that swimming right after a meal is bad for you. It was commonly believed that people should refrain from swimming for 30 minutes after eating because blood flows to the digestive tract, limiting the blood available to move your arm and leg muscles while swimming. Scientists have dispelled this myth, noting that while the body does use extra blood during digestion, it does not use enough to prevent your arms and legs from functioning properly. If anything, you may experience slight abdominal cramping if you swim right after eating.

7. You Lose Most of Your Body Heat Through Your Head: The common belief that people lose most body heat through their head is all in their head. Even though you do lose about 10 percent of your body heat through your head, it is not the main exit. Body heat is lost through any and all parts of the body that are uncovered in cold temperatures. The myth likely goes back to a flawed military study from the 1950s that tested the loss of body heat when soldiers were exposed to extremely cold conditions. The volunteers experienced rapid heat loss from their heads, but the experiment was flawed because the head was the only part left unclothed. Regardless of the myth, it’s a good idea to keep your head and most body parts covered in cold temperatures to stay warm, but know that you aren’t going to turn into an icicle without a hat.

8. Eating Turkey Makes You Sleepy: Just because you’re ready for a nap after a large Thanksgiving feast doesn’t mean the turkey is the culprit. The myth claims that turkey makes you sleepy because it contains tryptophan, a nutrient the body uses to make sleep-related serotonin. Tryptophan is a naturally occurring amino acid obtained from food protein, but turkey is just one of many sources of this essential amino acid, and of the several amino acids in turkey, tryptophan is among the less abundant. It also takes a while for these amino acids to circulate through the bloodstream and increase serotonin in the brain. A more plausible cause of post-turkey-dinner drowsiness is that you’ve eaten more in one sitting than you’re used to, and it takes a great deal of energy to digest all that food, making you sleepier.

9. We Only Use 10 Percent of Our Brains: Whoever believes that people only use 10 percent of their brains may not have a brain. I’m only kidding, but the truth is humans actually use every part of the brain, and it’s always active. Even when you’re resting or thinking, at least 10 percent of the brain is in use. Every part of the brain has a specific function, and multiple portions are used at the same time to perform daily activities, such as breathing, making dinner and driving a car. Even if all parts of the brain aren’t firing neurons and communicating at the exact same time, you can be certain that the human brain is being worked 24/7.

10. Eating Late at Night Makes You Gain Weight: Fear not, late-night snackers: the myth about gaining weight from eating late at night is just plain false. Sure, you probably aren’t going to run off that bowl of ice cream or bag of popcorn before bedtime, but it’s not going to make you balloon up overnight either. Over the years, scientists have conducted several studies to dispel this myth, and results show that eating late at night does not increase one’s chance of gaining weight more than eating at any other time of the day. However, it’s important to note that late-night snacking after you’ve already consumed your normal caloric intake for the day can cause weight gain and should be avoided.

The Canadian Medical Association Journal released a report disproving one of the many myths used by tort "reform" advocates to push their agenda of protecting insurance companies and nursing home profits.

After years of warnings from former United States president George Bush that "frivolous" medical malpractice lawsuits were driving doctors out of practice and inflating the cost of US health care, the weight of evidence now points to preventable errors — not misguided lawsuits — as the real source of the concerns.

In 6 consecutive State of the Union addresses, beginning in 2003, Bush urged the US Congress to pass what he called medical liability reform. He justified that reform, which urged the capping of pain-and-suffering awards at $250,000, by touting the need to ensure access to health care and to control rising costs.


The reform campaign was conducted against a backdrop of rising insurance premiums for US doctors. Despite the fact that volatile premiums have largely been found to be products of the insurance underwriting cycle (a cycle of gains and losses within the insurance industry), Bush, some Republicans, medical societies, hospitals and insurers exploited the "crisis," pushing lawmakers to make it more difficult for injured patients to sue doctors. In fact, there is no evidence that doctors were hit with increasing numbers of malpractice claims during 2001-2004. Over the past 15 years, data from states that require insurers to file reports on malpractice claims indicate that rates have remained flat, or have even declined, relative to economic growth and population increases.

The real problem, says Tom Baker, a law professor at the University of Pennsylvania, is "not too much litigation, but too much malpractice. … The idea that Americans are suit-happy, litigation-crazy, and ready to rumble in the courts is one of the more amazing myths of our time."

In his 2005 book The Medical Malpractice Myth, Baker claims doctors, patients, legislators and voters have been misdirected and should be seeking ways to prevent malpractice. "It’s not pretty to say, but doctors and nurses make preventable mistakes that kill more people in the United States every year than workplace and automobile accidents combined."

The best-available research supports Baker’s position. Most Americans injured by medical malpractice do not sue. Most lawsuits are not frivolous, and courts efficiently weed out weak claims. Jury awards have not spiralled out of control, and lawsuits have not reduced access to doctors.

In a landmark study, the Institute of Medicine of the National Academy of Sciences estimated that medical errors kill up to 98,000 US hospital patients each year (Kohn LT, Corrigan JM, Donaldson MS, editors. To Err is Human: Building a Safer Health System. Washington, DC; 2000). In 2004, Healthgrades, an independent health care ratings company, reported nearly double that figure. Its examination of 37 million patient records from all 50 states, representing 45% of all US hospital admissions, found 195,000 hospital deaths from preventable medical errors annually between 2000 and 2002 (www.healthgrades.com).

"It’s really an epidemic," says Joanne Doroshow, who heads the New York-based Center for Justice and Democracy, a nonprofit, nonpartisan consumer rights organization. "It’s a terrible problem we have in this country, and I imagine around the world. Hospitals are dangerous places."

Evidence that medical malpractice in the US greatly exceeds malpractice lawsuits has been available since 1974, when California’s medical and hospital associations sponsored a study intended to buttress their efforts to get lawmakers to pass tort reform. Instead, it found that doctors and hospitals negligently injured 0.8% of hospital patients (Mills DH, editor. Report on the Medical Insurance Feasibility Study. Sacramento: California Medical Association and California Hospital Association; 1977). A later analysis of the data found that, at most, only 1 in 75 of those injured were compensated (Danzon, Patricia A. Medical Malpractice: Theory, evidence and public policy. Cambridge: Harvard University Press; 1985).

Recent research has confirmed that malpractice is rampant and few medical errors result in legal claims. In 1990, Harvard researchers examined more than 30,000 randomly selected records from New York hospitals. They concluded that 1% of patients were negligently injured, while only 4% of those who were injured sued (Patients, doctors and lawyers: Medical injury, malpractice litigation, and patient compensation in New York. Cambridge: Harvard University Press; 1990).

The notion that frivolous lawsuits abound is also unsubstantiated. A 2007 study by Public Citizen showed the court system was "on the whole, a rational one that provides money for valid claims and dismisses invalid ones" (www.citizen.org). Using data from the US government’s National Practitioner Data Bank, the consumer nonprofit group concluded that complaints by "the business and medical lobbies are exaggerated and unsupported by the facts."

Harvard researchers reached a similar conclusion when they examined files from 1452 malpractice claims (NEJM 2006;354[19]:2024-33). Almost three-quarters had outcomes consistent with their merit. Only 10% of patients received payouts in the absence of error, while 16% received no payout despite the presence of error. "Portraits of a malpractice system that is stricken with frivolous litigation are overblown," the researchers concluded. The system performs "reasonably well" in dismissing such lawsuits and in compensating the injured.

In addition, there is evidence that jury awards are simply keeping up with the costs of medical care, rather than being out of line. In 2005, Dartmouth College economists studied payments made to patients between 1991 and 2003. Actual payments, not jury awards, grew an average of 4% annually — slowing to 1.6% a year since 2000 — or 52% since 1991, roughly equivalent to increases in health care costs (Health Aff January-June 2005; suppl Web exclusives:W5-240-W5-249). A 2004 RAND study examining 40 years of jury verdicts concluded that average payouts grew by less than real income, with more costly medical care responsible for more than half the growth in jury awards.

In 2007, Americans for Insurance Reform used the insurance industry’s own data to show that higher insurance premiums between 2001 and 2004 were not the result of sudden increases in claims and payouts. Instead, payouts per doctor were stable, or fell, with premium increases unconnected to actual payouts. Malpractice insurers "vastly" and "unnecessarily" increased reserves for future claims, the study found (www.centerjd.org/air/StableLosses2007.pdf).

Even if caps and other limits on torts are imposed, they do not decrease malpractice premiums, according to the Center for Justice and Democracy. In 2002, it compared malpractice premiums to the amount of state-level tort "reform." Premiums did not decrease as tort law was restricted. Some states that resisted enacting changes to malpractice lawsuits had low premium increases; some states that made major changes had high increases. "Laws that restrict the rights of injured consumers to go to court do not produce lower insurance costs or rates," the report concluded. "And insurance companies that claim they do are severely misleading this country’s lawmakers" (www.centerjd.org/archives/issues-facts/ANGOFFReport.pdf).

Overall, malpractice insurance and claims account for, at most, 2% of US health care spending, according to the US General Accounting Office, the investigative arm of Congress.

Allegations that the threat of lawsuits and high premiums were driving doctors out of business were also unfounded, according to an extensive investigation by the General Accounting Office into anecdotal stories from 5 "crisis" states, so classified by the American Medical Association. The investigation concluded that access to health care was not widely affected, and that reported numbers of physician departures were sometimes inaccurate.

The problem of volatile premiums won’t be solved without reform of the insurance industry, says Doroshow. In most states, insurance companies can raise rates without government oversight. Requiring companies to justify rate hikes in regulatory hearings could control fluctuations, she says. And forcing malpractice insurance companies to open their books would increase competition in the industry.

The political debate has begun to refocus, a reflection that the real malpractice problem concerns the number of injured patients who don’t receive compensation, says Baker. "The political rhetoric has shifted pretty dramatically in that direction."

As a senator, US President Barack Obama recognized the fallacy of the tort-reform remedy. In 2005, Obama and then-Senator Hillary Clinton cosponsored legislation aimed at reducing malpractice suits by reducing the number of patients medical malpractice killed or injured. During his campaign, Obama’s health platform called for doctors and hospitals to be required to report preventable errors. He also promised support to providers to create guidelines and technology to prevent future errors.

In the years ahead, as Obama and the Democrats focus on health care reform, US anesthesiologists are likely to serve as the model for patient-safety improvements. Anesthesiologists were once sued more than any other specialty and once paid some of the highest malpractice premiums in the country. In the 1980s, the American Society of Anesthesiologists scoured every claim filed against its members to identify unsafe practices and developed new guidelines to reduce errors. The anesthesiologists are now among the safest practitioners, and their insurance rates have fallen. Similarly, some US hospitals have recently examined malpractice claims made against them to find ways to make procedures safer, resulting in fewer lawsuits and lower litigation costs.


Matthew M. Wallace is an attorney and CPA with the law firm of Matthew M. Wallace, PC, in Port Huron. Mr. Wallace wrote a great article about the myths of Medicaid in the Times Herald. He can be reached at (810) 985-4320. Below is a summary of the article.

Planning Matters: Busting myths about Medicaid

There are many misconceptions about Medicaid and Medicaid eligibility.  Medicaid laws are complex and confusing. I do not recommend that you try to plan for Medicaid by yourself. One mistake may cost you thousands of dollars and may result in months of Medicaid ineligibility. It is important to get good legal counsel from a knowledgeable legal specialist.

Misconception 1:
"I don’t need Medicaid; I have Medicare."

The truth: Medicare is a federal catastrophic major medical insurance program primarily for hospitalization. Medicare does not pay for long-term custodial care.

Medicaid is a state and federal funded and state-run assistance program.

For seniors, Medicaid is primarily for long-term care in Medicaid qualifying nursing homes, and in certain circumstances, long-term care outside of a nursing home.

Misconception 2:
"If I or my spouse go into a nursing home, the state will take my home away."

The truth: Your home is an exempt asset if it is owned by you or you and your spouse and can stay an exempt asset during your entire nursing home stay.

The home must be used and titled properly. A home that is not titled or used properly is not exempt and is available for nursing home expenses. There are other exempt assets in addition to the home, including one automobile, certain pre-paid funeral arrangements and certain life insurance policies.

Misconception 3: "If I give assets away, I have to wait 60 months to qualify for Medicaid."

The truth: The Department of Human Services looks back 60 months for transfers that are "divestments." If your transfer is not a divestment, it is ignored, even if it is made the day before you apply for Medicaid, and even if it is thousands of dollars.

To determine the number of months your divestment disqualifies you from Medicaid benefits after your Medicaid application is approved, divide the amount of the divestment by the penalty divisor, which is $6,191 in 2008.

For example, a $20,000 divestment will disqualify you from receiving benefits for about 3.2 months after your application is approved.
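The penalty calculation above is simple division, and it can be sketched in a few lines of Python. The $6,191 divisor is the 2008 figure cited in the article; the function name is ours, for illustration only.

```python
def divestment_penalty_months(divestment: float, penalty_divisor: float) -> float:
    """Months of Medicaid ineligibility caused by a divestment:
    the amount given away divided by the state's penalty divisor."""
    return divestment / penalty_divisor

# The article's example: a $20,000 divestment with the 2008 divisor of $6,191.
months = divestment_penalty_months(20_000, 6_191)
print(f"{months:.1f} months")  # about 3.2 months of ineligibility
```

Note that the divisor is updated periodically, so the penalty for the same gift changes from year to year; this is one reason the article recommends consulting a specialist rather than planning alone.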
